Development and Initial Testing of the Tiltrotor Test Rig
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.; Sheikman, A. L.
2018-01-01
The NASA Tiltrotor Test Rig (TTR) is a new, large-scale proprotor test system, developed jointly with the U.S. Army and Air Force, for the National Full-Scale Aerodynamics Complex (NFAC). The TTR is designed to test advanced proprotors up to 26 feet in diameter at speeds up to 300 knots, and even larger rotors at lower airspeeds. This combination of size and speed is unprecedented and is necessary for research into 21st-century tiltrotors and other advanced rotorcraft concepts. The TTR will provide critical data for validation of state-of-the-art design and analysis tools.
State of the Art in Large-Scale Soil Moisture Monitoring
NASA Technical Reports Server (NTRS)
Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.;
2013-01-01
Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.
Advances and trends in computational structural mechanics
NASA Technical Reports Server (NTRS)
Noor, A. K.
1986-01-01
Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.
Neural data science: accelerating the experiment-analysis-theory cycle in large-scale neuroscience.
Paninski, L; Cunningham, J P
2018-06-01
Modern large-scale multineuronal recording methodologies, including multielectrode arrays, calcium imaging, and optogenetic techniques, produce single-neuron resolution data of a magnitude and precision that were the realm of science fiction twenty years ago. The major bottlenecks in systems and circuit neuroscience no longer lie in simply collecting data from large neural populations, but rather in understanding these data: developing novel scientific questions, with corresponding analysis techniques and experimental designs to fully harness these new capabilities and meaningfully interrogate these questions. Advances in methods for signal processing, network analysis, dimensionality reduction, and optimal control, developed in lockstep with advances in experimental neurotechnology, promise major breakthroughs in multiple fundamental neuroscience problems. These trends are clear in a broad array of subfields of modern neuroscience; this review focuses on recent advances in methods for analyzing neural time-series data with single-neuronal precision. Copyright © 2018 Elsevier Ltd. All rights reserved.
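To make this class of methods concrete, the following is a minimal sketch (not drawn from the review itself) of one common dimensionality-reduction workflow for large-scale neural time-series data: PCA applied to a simulated population recording. The simulation parameters and variable names are illustrative assumptions.

```python
# Minimal sketch (not from the review): PCA-style dimensionality reduction
# of a simulated population recording, illustrating one class of methods
# used to analyze large-scale neural time-series data.
import numpy as np

rng = np.random.default_rng(0)

# Simulate 200 neurons whose activity is driven by 3 shared latent signals.
n_neurons, n_timebins, n_latents = 200, 1000, 3
latents = rng.standard_normal((n_timebins, n_latents))      # latent dynamics
loading = rng.standard_normal((n_latents, n_neurons))       # mixing weights
rates = latents @ loading + 0.5 * rng.standard_normal((n_timebins, n_neurons))

# PCA via SVD of the mean-centered data matrix (time x neurons).
centered = rates - rates.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S**2 / np.sum(S**2)

print("variance explained by first 5 PCs:", np.round(explained[:5], 3))
# The first ~3 components should capture most of the variance, recovering the
# low-dimensional structure embedded in the high-dimensional recording.
```

In practice, low-dimensional projections of this kind are a typical first step before the network-analysis and control-theoretic methods the review surveys.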
ERIC Educational Resources Information Center
Winthrop, Rebecca; Simons, Kate Anderson
2013-01-01
In recent years, the global community has developed a range of initiatives to inform the post-2015 global development agenda. In the education community, International Large-Scale Assessments (ILSAs) have an important role to play in advancing a global shift in focus to access plus learning. However, there are a number of other assessment tools…
Advanced Connectivity Analysis (ACA): a Large Scale Functional Connectivity Data Mining Environment.
Chen, Rong; Nixon, Erika; Herskovits, Edward
2016-04-01
Using resting-state functional magnetic resonance imaging (rs-fMRI) to study functional connectivity is of great importance to understand normal development and function as well as a host of neurological and psychiatric disorders. Seed-based analysis is one of the most widely used rs-fMRI analysis methods. Here we describe a freely available large scale functional connectivity data mining software package called Advanced Connectivity Analysis (ACA). ACA enables large-scale seed-based analysis and brain-behavior analysis. It can seamlessly examine a large number of seed regions with minimal user input. ACA has a brain-behavior analysis component to delineate associations among imaging biomarkers and one or more behavioral variables. We demonstrate applications of ACA to rs-fMRI data sets from a study of autism.
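For readers unfamiliar with seed-based analysis, the sketch below illustrates the generic computation (correlating a seed region's mean time course with every voxel, then Fisher z-transforming the map); it is a toy example under assumed array shapes, not ACA's actual implementation or API.

```python
# Minimal sketch of generic seed-based functional connectivity analysis
# (not ACA's actual implementation): correlate a seed region's average
# rs-fMRI time series with every voxel, then Fisher z-transform the map.
import numpy as np

def seed_connectivity(data, seed_mask):
    """data: (n_voxels, n_timepoints) array; seed_mask: boolean (n_voxels,)."""
    seed_ts = data[seed_mask].mean(axis=0)                 # seed time course
    data_c = data - data.mean(axis=1, keepdims=True)
    seed_c = seed_ts - seed_ts.mean()
    num = data_c @ seed_c
    denom = np.linalg.norm(data_c, axis=1) * np.linalg.norm(seed_c)
    r = num / np.maximum(denom, 1e-12)                     # Pearson r per voxel
    return np.arctanh(np.clip(r, -0.999999, 0.999999))     # Fisher z map

# Toy example: 5000 voxels, 200 time points, seed = first 50 voxels.
rng = np.random.default_rng(1)
data = rng.standard_normal((5000, 200))
zmap = seed_connectivity(data, np.arange(5000) < 50)
print(zmap.shape)  # (5000,) -> one connectivity value per voxel
```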
ERIC Educational Resources Information Center
Bakah, Marie Afua Baah; Voogt, Joke M.; Pieters, Jules M.
2012-01-01
Polytechnic staff perspectives are sought on the sustainability and large-scale implementation of design teams (DT), as a means for collaborative curriculum design and teacher professional development in Ghana's polytechnics, months after implementation. Data indicates that teachers still collaborate in DTs for curriculum design and professional…
Mems: Platform for Large-Scale Integrated Vacuum Electronic Circuits
2017-03-20
The objective of the LIVEC advanced study project was to develop a platform for large-scale integrated vacuum electronic circuits. Final Report: MEMS Platform for Large-Scale Integrated Vacuum Electronic Circuits (LIVEC), Contract No. W911NF-14-C-0093, reporting period 1 July 2014 to 30 June 2015; distribution unlimited.
Development of a metal-clad advanced composite shear web design concept
NASA Technical Reports Server (NTRS)
Laakso, J. H.
1974-01-01
An advanced composite web concept was developed for potential application to the Space Shuttle Orbiter main engine thrust structure. The program consisted of design synthesis, analysis, detail design, element testing, and large scale component testing. A concept was sought that offered significant weight saving by the use of Boron/Epoxy (B/E) reinforced titanium plate structure. The desired concept was one that was practical and that utilized metal to efficiently improve structural reliability. The resulting development of a unique titanium-clad B/E shear web design concept is described. Three large scale components, each a titanium-clad plus or minus 45 deg B/E web laminate stiffened with vertical B/E reinforced aluminum stiffeners, were fabricated and tested to demonstrate the performance of the concept.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2018-01-23
Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.
Linear Scaling Density Functional Calculations with Gaussian Orbitals
NASA Technical Reports Server (NTRS)
Scuseria, Gustavo E.
1999-01-01
Recent advances in linear scaling algorithms that circumvent the computational bottlenecks of large-scale electronic structure simulations make it possible to carry out density functional calculations with Gaussian orbitals on molecules containing more than 1000 atoms and 15000 basis functions using current workstations and personal computers. This paper discusses the recent theoretical developments that have led to these advances and demonstrates in a series of benchmark calculations the present capabilities of state-of-the-art computational quantum chemistry programs for the prediction of molecular structure and properties.
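As a rough illustration of why linear scaling matters (the exponents below are the standard characterization of conventional versus linear-scaling electronic structure methods, not figures taken from this paper), compare how the cost of a cubic-scaling step and a linear-scaling step grow when the basis set becomes ten times larger:

```latex
% Illustrative scaling comparison (standard characterization, assumed here,
% not values reproduced from the paper): cost ratio when the basis grows
% from N to 10N basis functions.
\[
\frac{t_{\mathrm{cubic}}(10N)}{t_{\mathrm{cubic}}(N)} = 10^{3} = 1000,
\qquad
\frac{t_{\mathrm{linear}}(10N)}{t_{\mathrm{linear}}(N)} = 10 .
\]
% A conventional cubic-scaling step (e.g. matrix diagonalization) becomes
% roughly a thousand times more expensive, while a linear-scaling
% reformulation grows only tenfold, which is what makes calculations with
% thousands of atoms feasible on workstations.
```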
Advances in multi-scale modeling of solidification and casting processes
NASA Astrophysics Data System (ADS)
Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang
2011-04-01
The development of the aviation, energy and automobile industries requires advanced integrated product/process R&D systems which can optimize both the product and the process design. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, simulated using phase field and cellular automaton methods, mathematical models of segregation in large steel ingots, and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry, are discussed.
Self-Reacting Friction Stir Welding for Aluminum Complex Curvature Applications
NASA Technical Reports Server (NTRS)
Brown, Randy J.; Martin, W.; Schneider, J.; Hartley, P. J.; Russell, Carolyn; Lawless, Kirby; Jones, Chip
2003-01-01
This viewgraph presentation provides an overview of successful research conducted by Lockheed Martin and NASA to develop an advanced self-reacting friction stir technology for complex curvature aluminum alloys. The research included weld process development for 0.320 inch Al 2219, successful transfer from the 'lab' scale to the production scale tool, and weld quality exceeding strength goals. This process will enable development and implementation of large scale complex geometry hardware fabrication. Topics covered include: weld process development, weld process transfer, and intermediate hardware fabrication.
Advanced Computing Tools and Models for Accelerator Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryne, Robert; Ryne, Robert D.
2008-06-11
This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.
Establishment of a National Wind Energy Center at University of Houston
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Su Su
The DOE-supported project objectives are to establish a national wind energy center (NWEC) at the University of Houston and conduct research to address critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) through multi-disciplinary research, introduce technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve the goals of the project, the following technical tasks were planned and executed during the period from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems, (2) applied research on innovative wind turbine rotors for large offshore wind energy systems, (3) integration of offshore wind-turbine design, advanced materials and manufacturing technologies, (4) integrity and reliability of large offshore wind turbine blades and scaled model testing, (5) education and training of graduate and undergraduate students and post-doctoral researchers, and (6) development of a national offshore wind turbine blade research facility. The research program addresses both basic science and engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation. The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced design, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses and research, and through participation in various projects in the center's large multi-disciplinary research. These students and researchers are now employed by the wind industry, national labs and universities to support the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.
Cost-Driven Design of a Large Scale X-Plane
NASA Technical Reports Server (NTRS)
Welstead, Jason R.; Frederic, Peter C.; Frederick, Michael A.; Jacobson, Steven R.; Berton, Jeffrey J.
2017-01-01
A conceptual design process focused on the development of a low-cost, large scale X-plane was developed as part of an internal research and development effort. One of the concepts considered for this process was the double-bubble configuration recently developed as an advanced single-aisle class commercial transport similar in size to a Boeing 737-800 or Airbus A320. The study objective was to reduce the contractor cost from contract award to first test flight to less than $100 million, with the first flight within three years of contract award. Methods and strategies for reduced cost are discussed.
A multidisciplinary approach to the development of low-cost high-performance lightwave networks
NASA Technical Reports Server (NTRS)
Maitan, Jacek; Harwit, Alex
1991-01-01
Our research focuses on high-speed distributed systems. We anticipate that our results will allow the fabrication of low-cost networks employing multi-gigabit-per-second data links for space and military applications. The recent development of high-speed low-cost photonic components and new generations of microprocessors creates an opportunity to develop advanced large-scale distributed information systems. These systems currently involve hundreds of thousands of nodes and are made up of components and communications links that may fail during operation. In order to realize these systems, research is needed into technologies that foster adaptability and scalability. Self-organizing mechanisms are needed to integrate a working fabric of large-scale distributed systems. The challenge is to fuse theory, technology, and development methodologies to construct a cost-effective, efficient, large-scale system.
Toward exascale production of recombinant adeno-associated virus for gene transfer applications.
Cecchini, S; Negrete, A; Kotin, R M
2008-06-01
To gain acceptance as a medical treatment, adeno-associated virus (AAV) vectors require a scalable and economical production method. Recent developments indicate that recombinant AAV (rAAV) production in insect cells is compatible with current good manufacturing practice production on an industrial scale. This platform can fully support development of rAAV therapeutics from tissue culture to small animal models, to large animal models, to toxicology studies, to Phase I clinical trials and beyond. Efforts to characterize, optimize and develop insect cell-based rAAV production have culminated in successful bioreactor-scale production of rAAV, with total yields potentially capable of approaching the exa- (10^18) scale. These advances in large-scale AAV production will allow us to address specific catastrophic, intractable human diseases such as Duchenne muscular dystrophy, for which large amounts of recombinant vector are essential for successful outcome.
NASA advanced turboprop research and concept validation program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlow, J.B. Jr.; Sievers, G.K.
1988-01-01
NASA has determined by experimental and analytical effort that use of advanced turboprop propulsion instead of the conventional turbofans in the older narrow-body airline fleet could reduce fuel consumption for this type of aircraft by up to 50 percent. In cooperation with industry, NASA has defined and implemented an Advanced Turboprop (ATP) program to develop and validate the technology required for these new high-speed, multibladed, thin, swept propeller concepts. This paper presents an overview of the analysis, model-scale test, and large-scale flight test elements of the program together with preliminary test results, as available.
Networking at Conferences: Developing Your Professional Support System
ERIC Educational Resources Information Center
Kowalsky, Michelle
2012-01-01
The complexity and scale of any large library, education, or technology conference can sometimes be overwhelming. Therefore, spending time reviewing the conference program and perusing the workshop offerings in advance can help you stay organized and make the most of your time at the event. Planning in advance will help you manage potential time…
USDA-ARS?s Scientific Manuscript database
The rapid advancement in high-throughput SNP genotyping technologies along with next generation sequencing (NGS) platforms has decreased the cost, improved the quality of large-scale genome surveys, and allowed specialty crops with limited genomic resources such as carrot (Daucus carota) to access t...
ERIC Educational Resources Information Center
Roschelle, Jeremy; Shechtman, Nicole; Tatar, Deborah; Hegedus, Stephen; Hopkins, Bill; Empson, Susan; Knudsen, Jennifer; Gallagher, Lawrence P.
2010-01-01
The authors present three studies (two randomized controlled experiments and one embedded quasi-experiment) designed to evaluate the impact of replacement units targeting student learning of advanced middle school mathematics. The studies evaluated the SimCalc approach, which integrates an interactive representational technology, paper curriculum,…
The Air Force Advanced Instructional System (AIS): An Overview.
ERIC Educational Resources Information Center
Yasutake, Joseph Y.; Stobie, William H.
The Air Force Advanced Instructional System (AIS) is a prototype computer-based multimedia system for the administration and management of individualized technical training on a large scale. The paper provides an overview of the AIS: (1) its purposes and goals, (2) the background and rationale for the development approach, (3) a basic description…
Evaluating New Approaches to Teaching of Sight-Reading Skills to Advanced Pianists
ERIC Educational Resources Information Center
Zhukov, Katie
2014-01-01
This paper evaluates three teaching approaches to improving sight-reading skills against a control in a large-scale study of advanced pianists. One hundred pianists in four equal groups participated in newly developed training programmes (accompanying, rhythm, musical style and control), with pre- and post-sight-reading tests analysed using…
Large eddy simulations in 2030 and beyond
Piomelli, U
2014-01-01
Since its introduction, in the early 1970s, large eddy simulations (LES) have advanced considerably, and their application is transitioning from the academic environment to industry. Several landmark developments can be identified over the past 40 years, such as the wall-resolved simulations of wall-bounded flows, the development of advanced models for the unresolved scales that adapt to the local flow conditions and the hybridization of LES with the solution of the Reynolds-averaged Navier–Stokes equations. Thanks to these advancements, LES is now in widespread use in the academic community and is an option available in most commercial flow-solvers. This paper will try to predict what algorithmic and modelling advancements are needed to make it even more robust and inexpensive, and which areas show the most promise. PMID:25024415
NASA Technical Reports Server (NTRS)
Liu, J. T. C.
1986-01-01
Advances in the mechanics of boundary-layer flow are reported. The physical problem of large scale coherent structures in real, developing free turbulent shear flows is addressed from the nonlinear perspective of hydrodynamic stability. Whether fine grained turbulence is present or absent, the problem lacks a small parameter. The problem is therefore formulated from conservation principles, with the dynamics directed toward extracting the most physical information; however, it is emphasized that approximations must still be involved.
Evaluation of advanced microelectronics for inclusion in MIL-STD-975
NASA Technical Reports Server (NTRS)
Scott, W. Richard
1991-01-01
The approach taken by NASA and JPL (Jet Propulsion Laboratory) in the development of a MIL-STD-975 section which contains advanced technology such as Large Scale Integration and Very Large Scale Integration (LSI/VLSI) microelectronic devices is described. The parts listed in this section are recommended as satisfactory for NASA flight applications, in the absence of alternate qualified devices, based on satisfactory results of a vendor capability audit, the availability of sufficient characterization and reliability data from the manufacturers and users and negotiated detail procurement specifications. The criteria used in the selection and evaluation of the vendors and candidate parts, the preparation of procurement specifications, and the status of this activity are discussed.
Building to Scale: An Analysis of Web-Based Services in CIC (Big Ten) Libraries.
ERIC Educational Resources Information Center
Dewey, Barbara I.
Advancing library services in large universities requires creative approaches for "building to scale." This is the case for CIC, Committee on Institutional Cooperation (Big Ten), libraries whose home institutions serve thousands of students, faculty, staff, and others. Developing virtual Web-based services is an increasingly viable…
Bioinspired Wood Nanotechnology for Functional Materials.
Berglund, Lars A; Burgert, Ingo
2018-05-01
It is a challenging task to realize the vision of hierarchically structured nanomaterials for large-scale applications. Herein, the biomaterial wood as a large-scale biotemplate for functionalization at multiple scales is discussed, to provide an increased property range to this renewable and CO2-storing bioresource, which is available at low cost and in large quantities. The Progress Report reviews the emerging field of functional wood materials in view of the specific features of the structural template and novel nanotechnological approaches for the development of wood-polymer composites and wood-mineral hybrids for advanced property profiles and new functions. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Machine Learning, deep learning and optimization in computer vision
NASA Astrophysics Data System (ADS)
Canu, Stéphane
2017-03-01
As quoted in the Large Scale Computer Vision Systems NIPS workshop, computer vision is a mature field with a long tradition of research, but recent advances in machine learning, deep learning, representation learning and optimization have provided models with new capabilities to better understand visual content. The presentation will go through these new developments in machine learning, covering basic motivations, ideas, models and optimization in deep learning for computer vision, and identifying challenges and opportunities. It will focus on issues related to large-scale learning, namely high-dimensional features, a large variety of visual classes, and a large number of examples.
Development of the US3D Code for Advanced Compressible and Reacting Flow Simulations
NASA Technical Reports Server (NTRS)
Candler, Graham V.; Johnson, Heath B.; Nompelis, Ioannis; Subbareddy, Pramod K.; Drayna, Travis W.; Gidzak, Vladimyr; Barnhardt, Michael D.
2015-01-01
Aerothermodynamics and hypersonic flows involve complex multi-disciplinary physics, including finite-rate gas-phase kinetics, finite-rate internal energy relaxation, gas-surface interactions with finite-rate oxidation and sublimation, transition to turbulence, large-scale unsteadiness, shock-boundary layer interactions, fluid-structure interactions, and thermal protection system ablation and thermal response. Many of these flows have a large range of length and time scales, requiring large computational grids, implicit time integration, and long solution run times. The University of Minnesota NASA US3D code was designed for the simulation of these complex, highly-coupled flows. It has many of the features of the well-established DPLR code, but uses unstructured grids and has many advanced numerical capabilities and physical models for multi-physics problems. The main capabilities of the code are described, the physical modeling approaches are discussed, the different types of numerical flux functions and time integration approaches are outlined, and the parallelization strategy is overviewed. Comparisons between US3D and the NASA DPLR code are presented, and several advanced simulations are presented to illustrate some of the novel features of the code.
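The abstract does not specify US3D's actual flux functions, so the sketch below shows a generic first-order Rusanov (local Lax-Friedrichs) flux for the 1D Euler equations, simply to illustrate the kind of numerical flux function compressible-flow solvers of this class provide; the variable names and the calorically perfect gas with gamma = 1.4 are illustrative assumptions.

```python
# Illustrative sketch (not US3D's actual scheme): a first-order Rusanov
# (local Lax-Friedrichs) numerical flux for the 1D Euler equations.
import numpy as np

GAMMA = 1.4  # assumed ratio of specific heats (calorically perfect gas)

def euler_flux(U):
    """Physical flux F(U) for conserved variables U = [rho, rho*u, E]."""
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
    return np.array([mom, mom * u + p, (E + p) * u])

def rusanov_flux(UL, UR):
    """Numerical flux at an interface between left/right states UL, UR."""
    def max_wave_speed(U):
        rho, mom, E = U
        u = mom / rho
        p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
        c = np.sqrt(GAMMA * p / rho)   # speed of sound
        return abs(u) + c
    smax = max(max_wave_speed(UL), max_wave_speed(UR))
    return 0.5 * (euler_flux(UL) + euler_flux(UR)) - 0.5 * smax * (UR - UL)

# Sod-like interface: high-pressure gas on the left, low-pressure on the right.
UL = np.array([1.0, 0.0, 1.0 / (GAMMA - 1.0)])     # rho=1, u=0, p=1
UR = np.array([0.125, 0.0, 0.1 / (GAMMA - 1.0)])   # rho=0.125, u=0, p=0.1
print(rusanov_flux(UL, UR))
```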
ORNL Pre-test Analyses of A Large-scale Experiment in STYLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Paul T; Yin, Shengjun; Klasky, Hilda B
Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to also assess the level of confidence that can be placed in the best-estimate finite-element solutions.
Towards large-scale, human-based, mesoscopic neurotechnologies.
Chang, Edward F
2015-04-08
Direct human brain recordings have transformed the scope of neuroscience in the past decade. Progress has relied upon currently available neurophysiological approaches in the context of patients undergoing neurosurgical procedures for medical treatment. While this setting has provided precious opportunities for scientific research, it also has presented significant constraints on the development of new neurotechnologies. A major challenge now is how to achieve high-resolution spatiotemporal neural recordings at a large scale. By narrowing the gap between current approaches, new directions tailored to the mesoscopic (intermediate) scale of resolution may overcome the barriers towards safe and reliable human-based neurotechnology development, with major implications for advancing both basic research and clinical translation. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Samsonov, S. V.; Feng, W.
2017-12-01
InSAR-based mapping of surface deformation (displacement) has proven valuable to a variety of geoscience applications within NRCan. Conventional approaches to InSAR analysis require significant expert intervention to separate useful signal from noise and are not suited to address the opportunities and challenges presented by the large multi-temporal SAR datasets provided by future radar constellations. The Canada Centre for Mapping and Earth Observation (CCMEO) is developing, in support of NRCan and Government of Canada priorities, a framework for automatic generation of standard and advanced deformation products based on Interferometric Synthetic Aperture Radar (InSAR) technology from RADARSAT Constellation Mission (RCM) Synthetic Aperture Radar data. We utilize existing processing algorithms that are currently used for processing RADARSAT-2 data and adapt them to RCM specifications. In addition, we develop novel advanced processing algorithms that address the large data sets made possible by the satellites' rapid revisit cycle and expand InSAR functionality to regional and national scales across a wide range of time scales. Through automation, the system makes it possible to extend the mapping of surface deformation to non-SAR experts. The architecture is scalable and expandable to serve a large number of clients and simultaneously address multiple application areas including: natural and anthropogenic hazards, natural resource development, permafrost and glacier monitoring, coastal and environmental change, and wetlands mapping.
A Navy Shore Activity Manpower Planning System for Civilians. Technical Report No. 24.
ERIC Educational Resources Information Center
Niehaus, R. J.; Sholtz, D.
This report describes the U.S. Navy Shore Activity Manpower Planning System (SAMPS) advanced development research project. This effort is aimed at large-scale feasibility tests of manpower models for large Naval installations. These local planning systems are integrated with Navy-wide information systems on a data-communications network accessible…
Manufacturing Process Developments for Regeneratively-Cooled Channel Wall Rocket Nozzles
NASA Technical Reports Server (NTRS)
Gradl, Paul; Brandsmeier, Will
2016-01-01
Regeneratively cooled channel wall nozzles incorporate a series of integral coolant channels to contain the coolant, maintain adequate wall temperatures, and expand hot gas to provide engine thrust and specific impulse. NASA has been evaluating manufacturing techniques targeting large scale channel wall nozzles to support affordability of current and future liquid rocket engine nozzles and thrust chamber assemblies. The development of these large scale manufacturing techniques focuses on the liner formation, channel slotting with advanced abrasive water-jet milling techniques, and closeout of the coolant channels to replace or augment other cost reduction techniques being evaluated for nozzles. NASA is developing a series of channel closeout techniques including large scale additive manufacturing laser deposition and explosively bonded closeouts. A series of subscale nozzles were completed evaluating these processes. Fabrication of mechanical test and metallography samples, in addition to subscale hardware, has focused on Inconel 625, 300 series stainless, and aluminum alloys, as well as other candidate materials. Evaluations of these techniques are demonstrating potential for significant cost reductions for large scale nozzles and chambers. Hot fire testing is planned using these techniques in the future.
Chronic, Wireless Recordings of Large Scale Brain Activity in Freely Moving Rhesus Monkeys
Schwarz, David A.; Lebedev, Mikhail A.; Hanson, Timothy L.; Dimitrov, Dragan F.; Lehew, Gary; Meloy, Jim; Rajangam, Sankaranarayani; Subramanian, Vivek; Ifft, Peter J.; Li, Zheng; Ramakrishnan, Arjun; Tate, Andrew; Zhuang, Katie; Nicolelis, Miguel A.L.
2014-01-01
Advances in techniques for recording large-scale brain activity contribute to both the elucidation of neurophysiological principles and the development of brain-machine interfaces (BMIs). Here we describe a neurophysiological paradigm for performing tethered and wireless large-scale recordings based on movable volumetric three-dimensional (3D) multielectrode implants. This approach allowed us to isolate up to 1,800 units per animal and simultaneously record the extracellular activity of close to 500 cortical neurons, distributed across multiple cortical areas, in freely behaving rhesus monkeys. The method is expandable, in principle, to thousands of simultaneously recorded channels. It also allows increased recording longevity (5 consecutive years), and recording of a broad range of behaviors, e.g. social interactions, and BMI paradigms in freely moving primates. We propose that wireless large-scale recordings could have a profound impact on basic primate neurophysiology research, while providing a framework for the development and testing of clinically relevant neuroprostheses. PMID:24776634
NASA Technical Reports Server (NTRS)
Aanstoos, J. V.; Snyder, W. E.
1981-01-01
Anticipated major advances in integrated circuit technology in the near future are described as well as their impact on satellite onboard signal processing systems. Dramatic improvements in chip density, speed, power consumption, and system reliability are expected from very large scale integration. These improvements enable more intelligence to be placed on remote sensing platforms in space, meeting the goals of NASA's information adaptive system concept, a major component of the NASA End-to-End Data System program. A forecast of VLSI technological advances is presented, including a description of the Defense Department's very high speed integrated circuit program, a seven-year research and development effort.
Multidimensional quantum entanglement with large-scale integrated optics.
Wang, Jianwei; Paesani, Stefano; Ding, Yunhong; Santagati, Raffaele; Skrzypczyk, Paul; Salavrakos, Alexia; Tura, Jordi; Augusiak, Remigiusz; Mančinska, Laura; Bacco, Davide; Bonneau, Damien; Silverstone, Joshua W; Gong, Qihuang; Acín, Antonio; Rottwitt, Karsten; Oxenløwe, Leif K; O'Brien, Jeremy L; Laing, Anthony; Thompson, Mark G
2018-04-20
The ability to control multidimensional quantum systems is central to the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control, and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimensions up to 15 × 15 on a large-scale silicon photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality, and controllability of our multidimensional technology, and further exploit these abilities to demonstrate previously unexplored quantum applications, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
Mechanisation of large-scale agricultural fields in developing countries - a review.
Onwude, Daniel I; Abdulstter, Rafia; Gomes, Chandima; Hashim, Norhashila
2016-09-01
Mechanisation of large-scale agricultural fields often requires the application of modern technologies such as mechanical power, automation, control and robotics. These technologies are generally associated with relatively well developed economies. The application of these technologies in some developing countries in Africa and Asia is limited by factors such as technology compatibility with the environment, availability of resources to facilitate the technology adoption, cost of technology purchase, government policies, adequacy of technology and appropriateness in addressing the needs of the population. As a result, many of the available resources have been used inadequately by farmers, who continue to rely mostly on conventional means of agricultural production, using traditional tools and equipment in most cases. This has led to low productivity and high cost of production, among other problems. Therefore this paper attempts to evaluate the application of present day technology and its limitations to the advancement of large-scale mechanisation in developing countries of Africa and Asia. Particular emphasis is given to a general understanding of the various levels of mechanisation, present day technology, its management and application to large-scale agricultural fields. This review also emphasizes a future outlook that will enable gradual, evolutionary and sustainable technological change. The study concludes that large-scale agricultural farm mechanisation for sustainable food production in Africa and Asia must be anchored on a coherent strategy based on the actual needs and priorities of the large-scale farmers. © 2016 Society of Chemical Industry.
Heat pipes for wing leading edges of hypersonic vehicles
NASA Technical Reports Server (NTRS)
Boman, B. L.; Citrin, K. M.; Garner, E. C.; Stone, J. E.
1990-01-01
Wing leading edge heat pipes were conceptually designed for three types of vehicle: an entry research vehicle, aero-space plane, and advanced shuttle. A full scale, internally instrumented sodium/Hastelloy X heat pipe was successfully designed and fabricated for the advanced shuttle application. The 69.4 inch long heat pipe reduces peak leading edge temperatures from 3500 F to 1800 F. It is internally instrumented with thermocouples and pressure transducers to measure sodium vapor qualities. Large thermal gradients and consequently large thermal stresses, which have the potential of limiting heat pipe life, were predicted to occur during startup. A test stand and test plan were developed for subsequent testing of this heat pipe. Heat pipe manufacturing technology was advanced during this program, including the development of an innovative technique for wick installation.
The Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
Computational aerodynamics development and outlook /Dryden Lecture in Research for 1979/
NASA Technical Reports Server (NTRS)
Chapman, D. R.
1979-01-01
Some past developments and current examples of computational aerodynamics are briefly reviewed. An assessment is made of the requirements on future computer memory and speed imposed by advanced numerical simulations, giving emphasis to the Reynolds averaged Navier-Stokes equations and to turbulent eddy simulations. Experimental scales of turbulence structure are used to determine the mesh spacings required to adequately resolve turbulent energy and shear. Assessment also is made of the changing market environment for developing future large computers, and of the projections of micro-electronics memory and logic technology that affect future computer capability. From the two assessments, estimates are formed of the future time scale in which various advanced types of aerodynamic flow simulations could become feasible. Areas of research judged especially relevant to future developments are noted.
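To give a sense of the resolution requirements referred to, the commonly quoted grid-point scalings below (stated here as standard assumptions rather than values reproduced from the lecture) show how mesh demands grow with Reynolds number for direct numerical simulation (DNS) and wall-resolved large eddy simulation (LES):

```latex
% Commonly cited grid-point estimates of the kind discussed (standard
% scalings quoted as assumptions, not figures reproduced from the paper),
% where N is the required number of grid points and Re the Reynolds number:
\[
N_{\mathrm{DNS}} \sim Re^{9/4}, \qquad N_{\mathrm{LES}} \sim Re^{1.8},
\]
% so a tenfold increase in Reynolds number multiplies the required grid
% points by roughly 10^{2.25} \approx 180 for DNS and 10^{1.8} \approx 63 for
% wall-resolved LES, which is why resolving turbulent energy and shear
% drives the computer memory and speed requirements assessed in the paper.
```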
Towards multiscale modeling of influenza infection
Murillo, Lisa N.; Murillo, Michael S.; Perelson, Alan S.
2013-01-01
Aided by recent advances in computational power, algorithms, and higher fidelity data, increasingly detailed theoretical models of infection with influenza A virus are being developed. We review single scale models as they describe influenza infection from intracellular to global scales, and, in particular, we consider those models that capture details specific to influenza and can be used to link different scales. We discuss the few multiscale models of influenza infection that have been developed in this emerging field. In addition to discussing modeling approaches, we also survey biological data on influenza infection and transmission that is relevant for constructing influenza infection models. We envision that, in the future, multiscale models that capitalize on technical advances in experimental biology and high performance computing could be used to describe the large spatial scale epidemiology of influenza infection, evolution of the virus, and transmission between hosts more accurately. PMID:23608630
Kranz, Christine
2014-01-21
In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique, from high-resolution electrochemical imaging via nanoscale probes to large scale mapping using arrays of microelectrodes. A major driving force in advancing the SECM methodology is the development of more sophisticated probes beyond conventional micro-disc electrodes, which are usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes, particularly those enabling combinations of SECM with other analytical measurement techniques to provide information beyond exclusively measuring electrochemical sample properties. Consequently, this critical review will focus on recent progress and new developments towards multifunctional imaging.
Fuel savings potential of the NASA Advanced Turboprop Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlow, J.B. Jr.; Sievers, G.K.
1984-01-01
The NASA Advanced Turboprop (ATP) Program is directed at developing new technology for highly loaded, multibladed propellers for use at Mach 0.65 to 0.85 and at altitudes compatible with the air transport system requirements. Advanced turboprop engines offer the potential of 15 to 30 percent savings in aircraft block fuel relative to advanced turbofan engines (50 to 60 percent savings over today's turbofan fleet). The concept, propulsive efficiency gains, block fuel savings and other benefits, and the program objectives through a systems approach are described. Current program status and major accomplishments in both single rotation and counter rotation propeller technology are addressed. The overall program from scale model wind tunnel tests to large scale flight tests on testbed aircraft is discussed.
Large-Scale Brain Systems in ADHD: Beyond the Prefrontal-Striatal Model
Castellanos, F. Xavier; Proal, Erika
2012-01-01
Attention-deficit/hyperactivity disorder (ADHD) has long been thought to reflect dysfunction of prefrontal-striatal circuitry, with involvement of other circuits largely ignored. Recent advances in systems neuroscience-based approaches to brain dysfunction enable the development of models of ADHD pathophysiology that encompass a number of different large-scale “resting state” networks. Here we review progress in delineating large-scale neural systems and illustrate their relevance to ADHD. We relate frontoparietal, dorsal attentional, motor, visual, and default networks to the ADHD functional and structural literature. Insights emerging from mapping intrinsic brain connectivity networks provide a potentially mechanistic framework for understanding aspects of ADHD, such as neuropsychological and behavioral inconsistency, and the possible role of primary visual cortex in attentional dysfunction in the disorder. PMID:22169776
Fire Detection Organizing Questions
NASA Technical Reports Server (NTRS)
2004-01-01
Verified models of fire precursor transport in low and partial gravity: a. Development of models for large-scale transport in reduced gravity. b. Validated CFD simulations of transport of fire precursors. c. Evaluation of the effect of scale on transport and reduced gravity fires. Advanced fire detection system for gaseous and particulate pre-fire and fire signatures: a. Quantification of pre-fire pyrolysis products in microgravity. b. Suite of gas and particulate sensors. c. Reduced gravity evaluation of candidate detector technologies. d. Reduced gravity verification of advanced fire detection system. e. Validated database of fire and pre-fire signatures in low and partial gravity.
High Temperature Electrolysis 4 kW Experiment Design, Operation, and Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.E. O'Brien; X. Zhang; K. DeWall
2012-09-01
This report provides results of long-term stack testing completed in the new high-temperature steam electrolysis multi-kW test facility recently developed at INL. The report includes detailed descriptions of the piping layout, steam generation and delivery system, test fixture, heat recuperation system, hot zone, instrumentation, and operating conditions. This facility has provided a demonstration of high-temperature steam electrolysis operation at the 4 kW scale with advanced cell and stack technology. This successful large-scale demonstration of high-temperature steam electrolysis will help to advance the technology toward near-term commercialization.
NASA Astrophysics Data System (ADS)
Bonne, François; Alamir, Mazen; Bonnay, Patrick
2014-01-01
In this paper, a physical method to obtain control-oriented dynamical models of large scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace classical approaches designed from user experience, usually based on many independent PI controllers. This is particularly useful in the case where cryoplants are submitted to large pulsed thermal loads, expected to take place in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes lead to better perturbation immunity and rejection, offering a safer utilization of cryoplants. The paper gives details on how basic components used in the field of large scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of controllable subsystems of the refrigerator (the controllable subsystems are namely the Joule-Thompson Cycle, the Brayton Cycle, the Liquid Nitrogen Precooling Unit and the Warm Compression Station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data, and the optimal control of both the Joule-Thompson valve and the turbine valve is proposed to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
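To clarify the classical baseline that such model-based schemes are meant to replace, the sketch below implements a single generic discrete-time PI control loop with actuator saturation and simple anti-windup; the gains, sample time, toy plant, and setpoint are illustrative assumptions and do not represent the CEA-Grenoble or JT-60SA control implementation.

```python
# Minimal sketch of the classical baseline described above: one generic
# discrete-time PI control loop (gains, sample time and variable names are
# assumptions, not the actual cryoplant control implementation).
class DiscretePI:
    def __init__(self, kp, ki, dt, u_min=0.0, u_max=1.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.u_min, self.u_max = u_min, u_max   # e.g. valve opening limits
        self.integral = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        u = self.kp * error + self.ki * self.integral
        # Clamp to actuator limits and apply simple anti-windup.
        u_clamped = min(max(u, self.u_min), self.u_max)
        if u != u_clamped:
            self.integral -= error * self.dt    # undo integration when saturated
        return u_clamped

# Example: regulate a toy first-order temperature-like process.
pi = DiscretePI(kp=0.8, ki=0.2, dt=1.0)
x = 5.0                      # initial "temperature" (arbitrary units)
for _ in range(50):
    u = pi.update(setpoint=4.4, measurement=x)
    x += 0.1 * (4.0 + 2.0 * u - x)   # toy plant: relaxes toward 4 + 2u
print(round(x, 2))           # settles near the 4.4 setpoint
```

A large cryoplant controlled this way uses many such loops tuned independently, which is the limitation the model-based multivariable schemes in the paper address.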
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonne, François; Bonnay, Patrick; Alamir, Mazen
2014-01-29
In this paper, a physical method to obtain control-oriented dynamical models of large scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace classical approaches designed from user experience, usually based on many independent PI controllers. This is particularly useful in the case where cryoplants are submitted to large pulsed thermal loads, expected to take place in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes lead to better perturbation immunity and rejection, offering a safer utilization of cryoplants. The paper gives details on how basic components used in the field of large scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of controllable subsystems of the refrigerator (the controllable subsystems are namely the Joule-Thompson Cycle, the Brayton Cycle, the Liquid Nitrogen Precooling Unit and the Warm Compression Station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data, and the optimal control of both the Joule-Thompson valve and the turbine valve is proposed to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
Using Large-Scale Databases in Evaluation: Advances, Opportunities, and Challenges
ERIC Educational Resources Information Center
Penuel, William R.; Means, Barbara
2011-01-01
Major advances in the number, capabilities, and quality of state, national, and transnational databases have opened up new opportunities for evaluators. Both large-scale data sets collected for administrative purposes and those collected by other researchers can provide data for a variety of evaluation-related activities. These include (a)…
NASA Astrophysics Data System (ADS)
Sotiropoulos, Fotis; Khosronejad, Ali
2016-02-01
Sand waves arise in subaqueous and Aeolian environments as the result of the complex interaction between turbulent flows and mobile sand beds. They occur across a wide range of spatial scales, evolve at temporal scales much slower than the integral scale of the transporting turbulent flow, dominate river morphodynamics, undermine streambank stability and infrastructure during flooding, and sculpt terrestrial and extraterrestrial landscapes. In this paper, we present the vision for our work over the last ten years, which has sought to develop computational tools capable of simulating the coupled interactions of sand waves with turbulence across the broad range of relevant scales: from small-scale ripples in laboratory flumes to mega-dunes in large rivers. We review the computational advances that have enabled us to simulate the genesis and long-term evolution of arbitrarily large and complex sand dunes in turbulent flows using large-eddy simulation and summarize numerous novel physical insights derived from our simulations. Our findings explain the role of turbulent sweeps in the near-bed region as the primary mechanism for destabilizing the sand bed, show that the seeds of the emergent structure in dune fields lie in the heterogeneity of the turbulence and bed shear stress fluctuations over the initially flat bed, and elucidate how large dunes at equilibrium give rise to energetic coherent structures and modify the spectra of turbulence. We also discuss future challenges and our vision for advancing a data-driven simulation-based engineering science approach for site-specific simulations of river flooding.
A large scale software system for simulation and design optimization of mechanical systems
NASA Technical Reports Server (NTRS)
Dopker, Bernhard; Haug, Edward J.
1989-01-01
The concept of an advanced integrated, networked simulation and design system is outlined. Such an advanced system can be developed utilizing existing codes without compromising the integrity and functionality of the system. An example has been used to demonstrate the applicability of the concept of the integrated system outlined here. The development of an integrated system can be done incrementally. Initial capabilities can be developed and implemented without having a detailed design of the global system. Only a conceptual global system must exist. For a fully integrated, user friendly design system, further research is needed in the areas of engineering data bases, distributed data bases, and advanced user interface design.
NASA Astrophysics Data System (ADS)
Frumin, Kim; Dede, Chris; Fischer, Christian; Foster, Brandon; Lawrenz, Frances; Eisenkraft, Arthur; Fishman, Barry; Jurist Levy, Abigail; McCoy, Ayana
2018-03-01
Over the past decade, the field of teacher professional learning has coalesced around core characteristics of high quality professional development experiences (e.g. Borko, Jacobs, & Koellner, 2010. Contemporary approaches to teacher professional development. In P. L. Peterson, E. Baker, & B. McGaw (Eds.), International encyclopedia of education (Vol. 7, pp. 548-556). Oxford: Elsevier.; Darling-Hammond, Hyler, & Gardner, 2017. Effective teacher professional development. Palo Alto, CA: Learning Policy Institute). Many countries have found these advances of great interest because of a desire to build teacher capacity in science education and across the full curriculum. This paper continues this progress by examining the role and impact of an online professional development community within the top-down, large-scale curriculum and assessment revision of Advanced Placement (AP) Biology, Chemistry, and Physics. This paper is part of a five-year, longitudinal, U.S. National Science Foundation-funded project to study the relative effectiveness of various types of professional development in enabling teachers to adapt to the revised AP course goals and exams. Of the many forms of professional development our research has examined, preliminary analyses indicated that participation in the College Board's online AP Teacher Community (APTC) - where teachers can discuss teaching strategies, share resources, and connect with each other - had positive, direct, and statistically significant association with teacher self-reported shifts in practice and with gains in student AP scores (Fishman et al., 2014). This study explored how usage of the online APTC might be useful to teachers and examined a more robust estimate of these effects. Findings from the experience of AP teachers may be valuable in supporting other large-scale curriculum changes, such as the U.S. Next Generation Science Standards or Common Core Standards, as well as parallel curricular shifts in other countries.
NASA Technical Reports Server (NTRS)
Denning, P. J.; Adams, G. B., III; Brown, R. L.; Kanerva, P.; Leiner, B. M.; Raugh, M. R.
1986-01-01
Large, complex computer systems require many years of development. It is recognized that large scale systems are unlikely to be delivered in useful condition unless users are intimately involved throughout the design process. A mechanism is described that will involve users in the design of advanced computing systems and will accelerate the insertion of new systems into scientific research. This mechanism is embodied in a facility called the Center for Advanced Architectures (CAA). CAA would be a division of RIACS (Research Institute for Advanced Computer Science) and would receive its technical direction from a Scientific Advisory Board established by RIACS. The CAA described here is a possible implementation of a center envisaged in a proposed cooperation between NASA and DARPA.
Survey of decentralized control methods. [for large scale dynamic systems
NASA Technical Reports Server (NTRS)
Athans, M.
1975-01-01
An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.
Creating a biopower agenda through grassroots organizing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hauter, W.
1995-11-01
Biomass electricity provides both opportunities for strengthening the rural economy and advancing environmental goals. However, while large scale biomass development can be done in a manner that both furthers economic development and helps prevent environmental degradation, its commercialization requires a complex coordination of activities between utilities and farmers. Inherent problems exist in creating parallel development of a resource base and technological advancements. In fact, an understanding of the anthropology of biopower is necessary in order to advance it on a large scale. The Union of Concerned Scientists (UCS) published a report on renewable electricity, released in March 1992, that has been used as a foundation for state-based work promoting renewables. In several Midwestern states, such as Nebraska, Minnesota, and Wisconsin, we have used classic grassroots organizing skills to educate the public and key constituencies about the benefits of biomass. Besides working directly with utilities to promote biomass development, we also have a legislative agenda that helps create a climate favorable to biopower. This paper will focus on the grassroots aspect of our campaigns. It will also include an overview of some anthropological work that the author has done in communities with farmers. The main tool for this has been focus groups. We have found that people can be organized around biomass issues and that a grassroots base furthers biomass development.
2017-01-01
Phase relations between specific scales in a turbulent boundary layer are studied here by highlighting the associated nonlinear scale interactions in the flow. This is achieved through an experimental technique that allows for targeted forcing of the flow through the use of a dynamic wall perturbation. Two distinct large-scale modes with well-defined spatial and temporal wavenumbers were simultaneously forced in the boundary layer, and the resulting nonlinear response from their direct interactions was isolated from the turbulence signal for the study. This approach advances the traditional studies of large- and small-scale interactions in wall turbulence by focusing on the direct interactions between scales with triadic wavenumber consistency. The results are discussed in the context of modelling high Reynolds number wall turbulence. This article is part of the themed issue ‘Toward the development of high-fidelity models of wall turbulence at large Reynolds number’. PMID:28167576
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaidle, Joshua A.; Habas, Susan E.; Baddour, Frederick G.
Catalyst design, from idea to commercialization, requires multi-disciplinary scientific and engineering research and development over 10-20 year time periods. Historically, the identification of new or improved catalyst materials has largely been an empirical trial-and-error process. However, advances in computational capabilities (new tools and increased processing power) coupled with new synthetic techniques have started to yield rationally-designed catalysts with controlled nano-structures and tailored properties. This technological advancement represents an opportunity to accelerate the catalyst development timeline and to deliver new materials that outperform existing industrial catalysts or enable new applications, once a number of unique challenges associated with the scale-up of nano-structured materials are overcome.
Bello, Mustapha Mohammed; Abdul Raman, Abdul Aziz
2017-08-01
Palm oil processing is a multi-stage operation which generates large amounts of effluent. On average, palm oil mill effluent (POME) may contain up to 51,000 mg/L COD, 25,000 mg/L BOD, 40,000 mg/L TS and 6000 mg/L oil and grease. Due to its potential to cause environmental pollution, palm oil mills are required to treat the effluent prior to discharge. Biological treatments using open ponding systems are widely used for POME treatment. Although these processes are capable of reducing the pollutant concentrations, they require long hydraulic retention times and large space, with the effluent frequently failing to satisfy the discharge regulation. Due to more stringent environmental regulations, research interest has recently shifted to the development of polishing technologies for the biologically-treated POME. Various technologies such as advanced oxidation processes, membrane technology, adsorption and coagulation have been investigated. Among these, advanced oxidation processes have shown potential as polishing technologies for POME. This paper offers an overview of POME polishing technologies, with particular emphasis on advanced oxidation processes and their prospects for large scale applications. Although there are some challenges in large scale applications of these technologies, this review offers some perspectives that could help in overcoming these challenges. Copyright © 2017 Elsevier Ltd. All rights reserved.
Microprocessor Seminar, phase 2
NASA Technical Reports Server (NTRS)
Scott, W. R.
1977-01-01
Workshop sessions and papers were devoted to various aspects of microprocessor and large scale integrated circuit technology. Presentations were made on advanced LSI developments for high reliability military and NASA applications. Microprocessor testing techniques were discussed, and test data were presented. High reliability procurement specifications were also discussed.
ERIC Educational Resources Information Center
Frumin, Kim; Dede, Chris; Fischer, Christian; Foster, Brandon; Lawrenz, Frances; Eisenkraft, Arthur; Fishman, Barry; Jurist Levy, Abigail; McCoy, Ayana
2018-01-01
Over the past decade, the field of teacher professional learning has coalesced around core characteristics of high quality professional development experiences (e.g. Borko, Jacobs, & Koellner, 2010. Contemporary approaches to teacher professional development. In P. L. Peterson, E. Baker, & B. McGaw (Eds.), "International encyclopedia…
Advancing effects analysis for integrated, large-scale wildfire risk assessment
Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager
2011-01-01
In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schell, Daniel J
The goal of this work is to use the large fermentation vessels in the National Renewable Energy Laboratory's (NREL) Integrated Biorefinery Research Facility (IBRF) to scale up Lygos' biologically based process for producing malonic acid and to generate performance data. Initially, work at the 1 L scale validated successful transfer of Lygos' fermentation protocols to NREL using a glucose substrate. Outside of the scope of the CRADA with NREL, Lygos tested their process on lignocellulosic sugars produced by NREL at Lawrence Berkeley National Laboratory's (LBNL) Advanced Biofuels Process Development Unit (ABPDU). NREL produced these cellulosic sugar solutions from corn stover using a separate cellulose/hemicellulose process configuration. Finally, NREL performed fermentations using glucose in large fermentors (1,500- and 9,000-L vessels) to produce intermediate product and to demonstrate successful performance of Lygos' technology at larger scales.
Materials Integration and Doping of Carbon Nanotube-based Logic Circuits
NASA Astrophysics Data System (ADS)
Geier, Michael
Over the last 20 years, extensive research into the structure and properties of single-walled carbon nanotubes (SWCNTs) has elucidated many of the exceptional qualities possessed by SWCNTs, including record-setting tensile strength, excellent chemical stability, distinctive optoelectronic features, and outstanding electronic transport characteristics. In order to exploit these remarkable qualities, many application-specific hurdles must be overcome before the material can be implemented in commercial products. For electronic applications, recent advances in sorting SWCNTs by electronic type have enabled significant progress towards SWCNT-based integrated circuits. Despite these advances, demonstrations of SWCNT-based devices with suitable characteristics for large-scale integrated circuits have been limited. The processing methodologies, materials integration, and mechanistic understanding of electronic properties developed in this dissertation have enabled unprecedented scales of SWCNT-based transistor fabrication and integrated circuit demonstrations. Innovative materials selection and processing methods are at the core of this work and these advances have led to transistors with the necessary transport properties required for modern circuit integration. First, extensive collaborations with other research groups allowed for the exploration of SWCNT thin-film transistors (TFTs) using a wide variety of materials and processing methods such as new dielectric materials, hybrid semiconductor materials systems, and solution-based printing of SWCNT TFTs. These materials were integrated into circuit demonstrations such as NOR and NAND logic gates, voltage-controlled ring oscillators, and D-flip-flops using both rigid and flexible substrates. This dissertation explores strategies for implementing complementary SWCNT-based circuits, which were developed by using local metal gate structures that achieve enhancement-mode p-type and n-type SWCNT TFTs with widely separated and symmetric threshold voltages. Additionally, a novel n-type doping procedure for SWCNT TFTs was also developed utilizing a solution-processed organometallic small molecule to demonstrate the first network top-gated n-type SWCNT TFTs. Lastly, new doping and encapsulation layers were incorporated to stabilize both p-type and n-type SWCNT TFT electronic properties, which enabled the fabrication of large-scale memory circuits. Employing these materials and processing advances has addressed many application-specific barriers to commercialization. For instance, the first thin-film SWCNT complementary metal-oxide-semiconductor (CMOS) logic devices are demonstrated with sub-nanowatt static power consumption and full rail-to-rail voltage transfer characteristics. With the introduction of a new n-type Rh-based molecular dopant, the first SWCNT TFTs are fabricated in top-gate geometries over large areas with high yield. Then by utilizing robust encapsulation methods, stable and uniform electronic performance of both p-type and n-type SWCNT TFTs has been achieved. Based on these complementary SWCNT TFTs, it is possible to simulate, design, and fabricate arrays of low-power static random access memory (SRAM) circuits, achieving large-scale integration for the first time based on solution-processed semiconductors. Together, this work provides a direct pathway for solution-processable, large-scale, power-efficient advanced integrated logic circuits and systems.
New technology of underground structures the framework of restrained urban conditions
NASA Astrophysics Data System (ADS)
Pleshko, Mikhail; Pankratenko, Alexander; Revyakin, Alexey; Shchekina, Ekaterina; Kholodova, Svetlana
2018-03-01
The paper highlights the importance of large-scale underground space development and high-rise construction in Russian cities. The basic elements of an effective technology for constructing transport facilities without traffic restrictions are developed. Unlike well-known solutions, it includes an advanced lining in the structure that strengthens the soil mass. The fundamental principles of methods for determining stress in the advanced support and for monitoring underground construction, based on the use of pressure sensors, strain sensors and displacement sensors, are considered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, Lizhi
Advanced Ultra Supercritical (AUSC) boilers require materials that can operate in a corrosive environment at temperatures and pressures as high as 760°C (1400°F) and 5000 psi, respectively, while maintaining good ductility at low temperature. We develop automated simulation software tools to enable fast, large-scale screening studies of candidate designs. While direct evaluation of creep rupture strength and ductility is currently not feasible, properties such as energy, elastic constants, surface energy, interface energy, and stacking fault energy can be used to assess relative ductility and creep strength. We implemented software to automate the complex calculations to minimize human input in the tedious screening studies, which involve model structure generation, settings for first-principles calculations, results analysis and reporting. The software developed in the project, and the library of computed mechanical properties of phases found in ferritic steels, many of which are complex solid solutions estimated for the first time, will certainly help the development of low-cost ferritic steel for AUSC.
Systems-Level Synthetic Biology for Advanced Biofuel Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruffing, Anne; Jensen, Travis J.; Strickland, Lucas Marshall
2015-03-01
Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.
Natural Language Processing: Toward Large-Scale, Robust Systems.
ERIC Educational Resources Information Center
Haas, Stephanie W.
1996-01-01
Natural language processing (NLP) is concerned with getting computers to do useful things with natural language. Major applications include machine translation, text generation, information retrieval, and natural language interfaces. Reviews important developments since 1987 that have led to advances in NLP; current NLP applications; and problems…
Advanced Carbon Fabric/Phenolics for Thermal Protection Applications.
1982-02-01
…structural properties are lower than rayon-based carbon fabric analogues, they appear to be adequate for most ablative heat-shielding applications… "Development of Ablative Nozzles. Part II: Ablative Nozzle Concept, Scaling Law, and Test Results," IAS Mtg. on Large Rockets, Sacramento, CA, Oct. 30.
Thinking big: linking rivers to landscapes
Joan O’Callaghan; Ashley E. Steel; Kelly M. Burnett
2012-01-01
Exploring relationships between landscape characteristics and rivers is an emerging field, enabled by the proliferation of satellite data, advances in statistical analysis, and increased emphasis on large-scale monitoring. Landscape features such as road networks, underlying geology, and human developments determine the characteristics of the rivers flowing through...
Large-scale Advanced Prop-fan (LAP) technology assessment report
NASA Technical Reports Server (NTRS)
Degeorge, C. L.
1988-01-01
The technologically significant findings and accomplishments of the Large Scale Advanced Prop-Fan (LAP) program in the areas of aerodynamics, aeroelasticity, acoustics and materials and fabrication are described. The extent to which the program goals related to these disciplines were achieved is discussed, and recommendations for additional research are presented. The LAP program consisted of the design, manufacture and testing of a near full-scale Prop-Fan or advanced turboprop capable of operating efficiently at speeds to Mach .8. An aeroelastically scaled model of the LAP was also designed and fabricated. The goal of the program was to acquire data on Prop-Fan performance that would indicate the technology readiness of Prop-Fans for practical applications in commercial and military aviation.
Advanced bulk processing of lightweight materials for utilization in the transportation sector
NASA Astrophysics Data System (ADS)
Milner, Justin L.
The overall objective of this research is to develop the microstructure of metallic lightweight materials via multiple advanced processing techniques with potential for large-scale industrial utilization to meet the demands of the aerospace and automotive sectors. This work focused on (i) refining the grain structure to increase strength, (ii) controlling the texture to increase formability, and (iii) directly reducing the processing/production cost of lightweight material components. Advanced processing is conducted on a bulk scale by several severe plastic deformation techniques, including accumulative roll bonding, isolated shear rolling and friction stir processing, to achieve the multiple targets of this research. Development and validation of the processing techniques is achieved through wide-ranging experiments along with detailed mechanical and microstructural examination of the processed material. On a broad level, this research will make advancements in the processing of bulk lightweight materials, facilitating industrial-scale implementation. Accumulative roll bonding and isolated shear rolling, both currently feasible on an industrial scale, produce bulk sheet materials capable of replacing more expensive grades of alloys and enabling low-temperature and high-strain-rate formability. Furthermore, friction stir processing to manufacture lightweight tubes made from magnesium alloys has the potential to increase the utilization of these materials in the automotive and aerospace sectors for high-strength, high-formability applications. Increased utilization of these advanced processing techniques will significantly reduce the cost associated with lightweight materials for many applications in the transportation sector.
Gary M. Tabor; Anne Carlson; Travis Belote
2014-01-01
The Yellowstone to Yukon Conservation Initiative (Y2Y) was established over 20 years ago as an experiment in large landscape conservation. Initially, Y2Y emerged as a response to large scale habitat fragmentation by advancing ecological connectivity. It also laid the foundation for large scale multi-stakeholder conservation collaboration with almost 200 non-...
Large-scale neuromorphic computing systems
NASA Astrophysics Data System (ADS)
Furber, Steve
2016-10-01
Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.
Large-Scale Demonstration of Liquid Hydrogen Storage with Zero Boiloff for In-Space Applications
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Bryant, C. B.; Flachbart, R. H.; Holt, K. A.; Johnson, E.; Hedayat, A.; Hipp, B.; Plachta, D. W.
2010-01-01
Cryocooler and passive insulation technology advances have substantially improved prospects for zero-boiloff cryogenic storage. Therefore, a cooperative effort by NASA's Ames Research Center, Glenn Research Center, and Marshall Space Flight Center (MSFC) was implemented to develop zero-boiloff concepts for in-space cryogenic storage. Described herein is one program element - a large-scale, zero-boiloff demonstration using the MSFC multipurpose hydrogen test bed (MHTB). A commercial cryocooler was interfaced with an existing MHTB spray bar mixer and insulation system in a manner that enabled a balance between incoming and extracted thermal energy.
Random access in large-scale DNA data storage.
Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin
2018-03-01
Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data on a large scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data) in more than 13 million DNA oligonucleotides, and show that we can recover each file individually and with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.
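The published system combines this basic idea with error-correcting codes, randomization, and PCR primers for random access; the sketch below is a hypothetical illustration only (not the authors' codec) of the underlying mapping of digital data onto the four-letter DNA alphabet at two bits per base.

```python
# Minimal sketch: map bytes to DNA bases (2 bits per base) and back.
BITS_TO_BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BASE_TO_BITS = {b: v for v, b in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA string, most-significant bit pair first."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(dna: str) -> bytes:
    """Invert encode(); assumes len(dna) is a multiple of 4."""
    out = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BASE_TO_BITS[base]
        out.append(byte)
    return bytes(out)

if __name__ == "__main__":
    payload = b"large-scale DNA storage"
    strand = encode(payload)
    assert decode(strand) == payload
    print(strand[:24], "...", len(strand), "bases")
```

In practice, redundancy, randomization, and addressing primers are layered on top of any such base mapping so that individual files can be amplified and decoded without sequencing the whole pool.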
Computational methods to extract meaning from text and advance theories of human cognition.
McNamara, Danielle S
2011-01-01
Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA. Copyright © 2010 Cognitive Science Society, Inc.
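As a point of reference for readers unfamiliar with LSA, the sketch below shows the core computation in miniature: a term-document count matrix factored by a truncated SVD, with document similarity measured in the reduced semantic space. The toy corpus and the choice of two latent dimensions are illustrative assumptions, not taken from the work reviewed above.

```python
# Minimal LSA-style sketch on a toy corpus.
import numpy as np

docs = [
    "soil moisture monitoring network",
    "large scale soil moisture data",
    "turbulent boundary layer experiment",
    "wall turbulence at large Reynolds number",
]
vocab = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(w) for d in docs] for w in vocab], float)

# Truncated SVD: keep k latent "semantic" dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T          # documents in latent space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

print(cosine(doc_vecs[0], doc_vecs[1]))   # two soil-moisture documents
print(cosine(doc_vecs[0], doc_vecs[2]))   # soil moisture vs. turbulence
```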
Schultz, Simon R.; Copeland, Caroline S.; Foust, Amanda J.; Quicke, Peter; Schuck, Renaud
2017-01-01
Recent years have seen substantial developments in technology for imaging neural circuits, raising the prospect of large scale imaging studies of neural populations involved in information processing, with the potential to lead to step changes in our understanding of brain function and dysfunction. In this article we will review some key recent advances: improved fluorophores for single cell resolution functional neuroimaging using a two photon microscope; improved approaches to the problem of scanning active circuits; and the prospect of scanless microscopes which overcome some of the bandwidth limitations of current imaging techniques. These advances in technology for experimental neuroscience have in themselves led to technical challenges, such as the need for the development of novel signal processing and data analysis tools in order to make the most of the new experimental tools. We review recent work in some active topics, such as region of interest segmentation algorithms capable of demixing overlapping signals, and new highly accurate algorithms for calcium transient detection. These advances motivate the development of new data analysis tools capable of dealing with spatial or spatiotemporal patterns of neural activity, that scale well with pattern size. PMID:28757657
A large-scale computer facility for computational aerodynamics
NASA Technical Reports Server (NTRS)
Bailey, F. R.; Ballhaus, W. F., Jr.
1985-01-01
As a result of advances related to the combination of computer system technology and numerical modeling, computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. NASA has, therefore, initiated the Numerical Aerodynamic Simulation (NAS) Program with the objective to provide a basis for further advances in the modeling of aerodynamic flowfields. The Program is concerned with the development of a leading-edge, large-scale computer facility. This facility is to be made available to Government agencies, industry, and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. Attention is given to the requirements for computational aerodynamics, the principal specific goals of the NAS Program, the high-speed processor subsystem, the workstation subsystem, the support processing subsystem, the graphics subsystem, the mass storage subsystem, the long-haul communication subsystem, the high-speed data-network subsystem, and software.
Ying, Gui-shuang; Maguire, Maureen G; Alexander, Judith; Martin, Revell W; Antoszyk, Andrew N
2009-09-01
To describe characteristics of the Age-Related Eye Disease Study (AREDS) 9-step severity scale applied to participants in the Complications of Age-related Macular Degeneration Prevention Trial (CAPT). Eligibility criteria for CAPT required 10 or more large (≥125 μm) drusen in each eye. Readers graded baseline photographs from all participants and all follow-up photographs from 402 untreated eyes. Drusen and pigment characteristics were used to assign the AREDS scale score. Choroidal neovascularization was identified from fluorescein angiograms. Geographic atrophy involving the macular center was identified from color photographs. Among 1001 untreated eyes, 90% were at steps 5 to 7 at baseline. The 5-year incidence of advanced age-related macular degeneration (AMD) increased with each step from 8% (step 4) to 40% (steps 8 and 9 combined). These rates were similar to those reported in AREDS. Among 261 eyes with all 5 annual photograph gradings available and without progression to advanced AMD, 55% of eyes had scores that indicated improvement at least once. Before progression to advanced AMD, only 32% of 141 eyes either went through step 8 or 9 or had an increase of 2 or more steps from baseline. The AREDS 9-step severity scale was predictive of development of advanced AMD. The AREDS scale has deficiencies as a surrogate outcome for progression to advanced AMD.
Jiao, Jialong; Ren, Huilong; Adenya, Christiaan Adika; Chen, Chaohe
2017-01-01
Wave-induced motion and load responses are important criteria for ship performance evaluation. Physical experiments have long been an indispensable tool in the prediction of a ship's navigation state, speed, motions, accelerations, sectional loads and wave impact pressure. Currently, the majority of experiments are conducted in a laboratory tank environment, where the wave environments differ from realistic sea waves. In this paper, a laboratory tank testing system for ship motion and load measurement is reviewed and reported first. Then, a novel large-scale model measurement technique is developed based on the laboratory testing foundations to obtain accurate motion and load responses of ships in realistic sea conditions. For this purpose, a suite of advanced remote-control and telemetry experimental systems was developed in-house to allow for the implementation of large-scale model seakeeping measurements at sea. The experimental system includes a series of sensors, e.g., the Global Position System/Inertial Navigation System (GPS/INS) module, course top, optical fiber sensors, strain gauges, pressure sensors and accelerometers. The developed measurement system was tested by field experiments in coastal seas, which indicate that the proposed large-scale model testing scheme is capable and feasible. Meaningful data including ocean environment parameters, ship navigation state, motions and loads were obtained through the sea trial campaign. PMID:29109379
Advances in Field Deployable Instrumented Particles for the Study of Alluvial Transport Mechanisms
NASA Astrophysics Data System (ADS)
Dillon, B.; Strom, K.
2017-12-01
Advances in microelectromechanical systems (MEMS) in the past decade have led to the development of various instrumented or "smart" particles for use in the study of alluvial transport. The goal of many of these devices is to collect data on the interaction between hydrodynamic turbulence and individual sediment particles. Studying this interaction provides a basis to better understand entrainment and deposition processes, which leads to better predictive morphologic and transport models. In collecting data on these processes, researchers seek to capture the time history of the forces incident on the particle and the particle's reaction. Many methods have been employed to capture these data - miniaturized pressure traps, accelerometers, gyroscopes, MEMS pressure transducers, and cantilevered load cells. However, no system to date has been able to capture the pressure forces incident on the particle and its reaction while remaining mobile and of a size and density comparable to most gravels. Advances in the development, deployment, and use of waterproofed laboratory instrumentation have led our research group to develop such a particle. This particle has been used in both laboratory settings and large-scale fluvial environments (coupled with a field-deployable PIV system) to capture data on turbulent erosion processes. This system advances the practice in several ways: 1) It is, at present, the smallest (⌀ 19 mm) instrumented erodible particle reported in the literature. 2) It contains novel developments in pressure sensing technology which allow the inclusion of six pressure ports, a 3-axis accelerometer, and a 1-axis gyroscope - all of which can be recorded simultaneously. 3) It expands the researcher's ability to gather data on phenomena that, previously, have mandated the use of a laboratory scale model. The use of this system has generated observations of the so-called very large scale motions (VLSMs) in a reach of the Virginia section of the New River. Their effects on erosional processes are presented.
The Vortex Lattice Method for the Rotor-Vortex Interaction Problem
NASA Technical Reports Server (NTRS)
Padakannaya, R.
1974-01-01
The rotor blade-vortex interaction problem and the resulting impulsive airloads which generate undesirable noise levels are discussed. A numerical lifting surface method to predict unsteady aerodynamic forces induced on a finite aspect ratio rectangular wing by a straight, free vortex placed at an arbitrary angle in a subsonic incompressible free stream is developed first. Using a rigid wake assumption, the wake vortices are assumed to move downstream with the free-stream velocity. Unsteady load distributions are obtained which compare favorably with the results of planar lifting surface theory. The vortex lattice method has been extended to a single-bladed rotor operating at high advance ratios and encountering a free vortex from a fixed wing upstream of the rotor. The predicted unsteady load distributions on the model rotor blade are generally in agreement with the experimental results. This method has also been extended to full-scale rotor flight cases in which vortex-induced loads near the tip of a rotor blade were indicated. In both the model and the full-scale rotor blade airload calculations, a flat planar wake was assumed, which is a good approximation at large advance ratios because the downwash is small in comparison to the free stream. The large fluctuations in the measured airloads near the tip of the rotor blade on the advancing side are predicted closely by the vortex lattice method.
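For readers unfamiliar with the machinery, the basic building block of any vortex lattice method is the Biot-Savart velocity induced by a straight vortex filament. The sketch below (illustrative geometry and unit circulation, not the report's rotor configuration) evaluates that influence at a single control point.

```python
# Biot-Savart induced velocity from a straight finite vortex segment.
import numpy as np

def segment_induced_velocity(p, a, b, gamma, core=1e-10):
    """Velocity induced at point p by a straight vortex filament a->b
    with circulation gamma (standard Katz & Plotkin style formula)."""
    r1, r2 = p - a, p - b
    r0 = b - a
    cross = np.cross(r1, r2)
    denom = np.dot(cross, cross)
    if denom < core:           # point lies (nearly) on the filament axis
        return np.zeros(3)
    k = gamma / (4.0 * np.pi * denom)
    dot = np.dot(r0, r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2))
    return k * dot * cross

# Velocity induced at an assumed control point by one free-vortex segment.
p = np.array([0.5, 0.25, 0.0])
a, b = np.array([0.0, -5.0, 0.1]), np.array([0.0, 5.0, 0.1])
print(segment_induced_velocity(p, a, b, gamma=1.0))
```

A full vortex lattice solution assembles these influence coefficients for every panel and wake segment and solves the resulting linear system for the bound circulations that satisfy the flow-tangency condition.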
NASA Technical Reports Server (NTRS)
Christensen, Elmer
1985-01-01
The objectives were to develop the flat-plate photovoltaic (PV) array technologies required for large-scale terrestrial use late in the 1980s and in the 1990s; advance crystalline silicon PV technologies; develop the technologies required to convert thin-film PV research results into viable module and array technology; and stimulate transfer of knowledge of advanced PV materials, solar cells, modules, and arrays to the PV community. Progress toward attaining these goals, along with future recommendations, is discussed.
Molecular Dynamics Studies of Proton Transport in Hydrogenase and Hydrogenase Mimics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ginovska-Pangovska, Bojana; Raugei, Simone; Shaw, Wendy J.
2016-08-02
Protons are used throughout the biological world for a number of functions, from charge balance to energy carriers. Metalloenzymes use protons as energy carriers and control proton movement both temporally and spatially. Despite the interest and need for controlled proton movement in other systems, the scientific community has not been able to develop extensive general rules for developing synthetic proton pathways. In part this is due to the challenging nature of studying these large and complex molecules experimentally, although experiments have gleaned extensive critical insight. While computational methods are also challenging because of the large size of the molecules, they have been critical in advancing our knowledge of proton movement through pathways, but even further, they have advanced our knowledge of how protonation and proton movement are correlated with large- and small-scale molecular motions and electron movement. These studies often complement experimental studies but provide insight and depth simply not possible in many cases in the absence of theory. In this chapter, we will discuss advances and methods used in understanding proton movement in hydrogenases.
Duvvuri, Subrahmanyam; McKeon, Beverley
2017-03-13
Phase relations between specific scales in a turbulent boundary layer are studied here by highlighting the associated nonlinear scale interactions in the flow. This is achieved through an experimental technique that allows for targeted forcing of the flow through the use of a dynamic wall perturbation. Two distinct large-scale modes with well-defined spatial and temporal wavenumbers were simultaneously forced in the boundary layer, and the resulting nonlinear response from their direct interactions was isolated from the turbulence signal for the study. This approach advances the traditional studies of large- and small-scale interactions in wall turbulence by focusing on the direct interactions between scales with triadic wavenumber consistency. The results are discussed in the context of modelling high Reynolds number wall turbulence. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).
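The essence of triadic wavenumber consistency can be seen in a toy calculation: two forced modes passed through a quadratic nonlinearity produce energy only at their sum and difference wavenumbers (and harmonics). The sketch below is purely illustrative and uses assumed wavenumbers, not the experimental forcing described above.

```python
# Toy illustration of triadic interactions from a quadratic nonlinearity.
import numpy as np

n = 1024
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
k1, k2 = 5, 8                      # assumed forcing wavenumbers
u = np.cos(k1 * x) + np.cos(k2 * x)

response = u**2                    # simplest quadratic interaction
spectrum = np.abs(np.fft.rfft(response)) / n

peaks = [k for k in range(1, 40) if spectrum[k] > 0.1]
print(peaks)                       # expect k2-k1=3, 2*k1=10, k1+k2=13, 2*k2=16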
NASA Astrophysics Data System (ADS)
Pérez, Lara F.; Nielsen, Tove; Knutz, Paul C.; Kuijpers, Antoon; Damm, Volkmar
2018-04-01
The continental shelf of central-east Greenland is shaped by several glacially carved transverse troughs that form the oceanward extension of the major fjord systems. The evolution of these troughs through time, and their relation with the large-scale glaciation of the Northern Hemisphere, is poorly understood. In this study seismostratigraphic analyses have been carried out to determine the morphological and structural development of this important sector of the East Greenland glaciated margin. The age of major stratigraphic discontinuities has been constrained by a direct tie to ODP site 987 drilled in the Greenland Sea basin plain off Scoresby Sund fan system. The areal distribution and internal facies of the identified seismic units reveal the large-scale depositional pattern formed by ice-streams draining a major part of the central-east Greenland ice sheet. Initial sedimentation along the margin was, however, mainly controlled by tectonic processes related to the margin construction, continental uplift, and fluvial processes. From late Miocene to present, progradational and erosional patterns point to repeated glacial advances across the shelf. The evolution of depo-centres suggests that ice sheet advances over the continental shelf have occurred since late Miocene, about 2 Myr earlier than previously assumed. This cross-shelf glaciation is more pronounced during late Miocene and early Pliocene along Blosseville Kyst and around the Pliocene/Pleistocene boundary off Scoresby Sund; indicating a northward migration of the glacial advance. The two main periods of glaciation were separated by a major retreat of the ice sheet to an inland position during middle Pliocene. Mounded-wavy deposits interpreted as current-related deposits suggest the presence of changing along-slope current dynamics in concert with the development of the modern North Atlantic oceanographic pattern.
Discovering and understanding oncogenic gene fusions through data intensive computational approaches
Latysheva, Natasha S.; Babu, M. Madan
2016-01-01
Although gene fusions have been recognized as important drivers of cancer for decades, our understanding of the prevalence and function of gene fusions has been revolutionized by the rise of next-generation sequencing, advances in bioinformatics theory and an increasing capacity for large-scale computational biology. The computational work on gene fusions has been highly diverse, and the present state of the literature is fragmented. It will be fruitful to merge three camps of gene fusion bioinformatics that appear to rarely cross over: (i) data-intensive computational work characterizing the molecular biology of gene fusions; (ii) development research on fusion detection tools, candidate fusion prioritization algorithms and dedicated fusion databases and (iii) clinical research that seeks to either therapeutically target fusion transcripts and proteins or leverages advances in detection tools to perform large-scale surveys of gene fusion landscapes in specific cancer types. In this review, we unify these different, yet highly complementary and symbiotic, approaches with the view that increased synergy will catalyze advancements in gene fusion identification, characterization and significance evaluation. PMID:27105842
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
Advancement of proprotor technology. Task 2: Wind-tunnel test results
NASA Technical Reports Server (NTRS)
1971-01-01
An advanced-design 25-foot-diameter flightworthy proprotor was tested in the NASA-Ames Large-Scale Wind Tunnel. These tests have verified and confirmed the theory and design solutions developed as part of the Army Composite Aircraft Program. This report presents the test results and compares them with theoretical predictions. During performance tests, the results met or exceeded predictions. Hover thrust 15 percent greater than the predicted maximum was measured. In airplane mode, propulsive efficiencies (some of which exceeded 90 percent) agreed with theory.
Gao, Tia; Kim, Matthew I.; White, David; Alm, Alexander M.
2006-01-01
We have developed a system for real-time patient monitoring during large-scale disasters. Our system is designed with scalable algorithms to monitor large numbers of patients, an intuitive interface to support the overwhelmed responders, and ad-hoc mesh networking capabilities to maintain connectivity to patients in the chaotic settings. This paper describes an iterative approach to user-centered design adopted to guide development of our system. This system is a part of the Advanced Health and Disaster Aid Network (AID-N) architecture. PMID:17238348
Large scale in vivo recordings to study neuronal biophysics.
Giocomo, Lisa M
2015-06-01
Over the last several years, technological advances have enabled researchers to more readily observe single-cell membrane biophysics in awake, behaving animals. Studies utilizing these technologies have provided important insights into the mechanisms generating functional neural codes in both sensory and non-sensory cortical circuits. Crucial for a deeper understanding of how membrane biophysics control circuit dynamics, however, is a continued effort to move toward large-scale studies of membrane biophysics, in terms of the numbers of neurons and ion channels examined. Future work faces a number of theoretical and technical challenges on this front, but recent technological developments hold great promise for a larger-scale understanding of how membrane biophysics contribute to circuit coding and computation. Copyright © 2014 Elsevier Ltd. All rights reserved.
Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping
NASA Astrophysics Data System (ADS)
Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.
2017-12-01
Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive combined with other satellite, climate, and weather data is creating never imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
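As a rough guide to what the per-pixel SSEBop computation involves, the sketch below evaluates a simplified form of the model: actual ET as a fraction of reference ET, with the fraction set by where land-surface temperature falls between pixel-specific cold and hot reference temperatures. All numbers are illustrative assumptions, not values from the nationwide run.

```python
# Simplified SSEBop-style ET fraction calculation (illustrative values).
import numpy as np

def ssebop_eta(lst_k, t_cold_k, dt_k, eto_mm, k_factor=1.0):
    """Actual ET (mm) from land-surface temperature, a cold reference
    temperature, the hot/cold temperature difference, and reference ET."""
    t_hot_k = t_cold_k + dt_k
    etf = np.clip((t_hot_k - lst_k) / dt_k, 0.0, 1.05)   # ET fraction
    return etf * k_factor * eto_mm

lst = np.array([295.0, 305.0, 312.0])       # thermal-band temperatures (K), assumed
print(ssebop_eta(lst, t_cold_k=294.0, dt_k=20.0, eto_mm=6.0))
```

Cooler pixels (well-watered vegetation) approach the full reference ET, while pixels near the hot reference return little or no ET; scaling this arithmetic to thousands of Landsat scenes is what the cloud platform makes practical.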
ERIC Educational Resources Information Center
Xiong, Yao; Suen, Hoi K.
2018-01-01
The development of massive open online courses (MOOCs) has launched an era of large-scale interactive participation in education. While massive open enrolment and the advances of learning technology are creating exciting potentials for lifelong learning in formal and informal ways, the implementation of efficient and effective assessment is still…
Space Station: Key to the Future.
ERIC Educational Resources Information Center
National Aeronautics and Space Administration, Washington, DC.
The possible applications, advantages and features of an advanced space station to be developed are considered in a non-technical manner in this booklet. Some of the areas of application considered include the following: the detection of large scale dynamic earth processes such as changes in snow pack, crops, and air pollution levels; the…
Software Tools | Office of Cancer Clinical Proteomics Research
The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer made from large-scale proteogenomic datasets, and advance them toward precision medicine. Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.
On the Quality Assessment of Advanced E-Learning Services
ERIC Educational Resources Information Center
Stefani, Antonia; Vassiliadis, Bill; Xenos, Michalis
2006-01-01
Distance learning has been widely researched the past few years, nevertheless the focus has been more on its technological dimension. Designing, developing and supporting a large scale e-learning application for Higher Education is still a challenging task in many ways. E-learning is data-intensive, user-driven, and has increasing needs for…
Hybrid Wing Body Configuration Scaling Study
NASA Technical Reports Server (NTRS)
Nickol, Craig L.
2012-01-01
The Hybrid Wing Body (HWB) configuration is a subsonic transport aircraft concept with the potential to simultaneously reduce fuel burn, noise and emissions compared to conventional concepts. Initial studies focused on very large applications with capacities for up to 800 passengers. More recent studies have focused on the large, twin-aisle class with passenger capacities in the 300-450 range. Efficiently scaling this concept down to the single-aisle or smaller size is challenging due to geometric constraints, potentially reducing the desirability of this concept for applications in the 100-200 passenger capacity range or less. In order to quantify this scaling challenge, five advanced conventional (tube-and-wing layout) concepts were developed, along with equivalent (payload/range/technology) HWB concepts, and their fuel burn performance compared. The comparison showed that the HWB concepts have fuel burn advantages over advanced tube-and-wing concepts in the larger payload/range classes (roughly 767-sized and larger). Although noise performance was not quantified in this study, the HWB concept has distinct noise advantages over the conventional tube-and-wing configuration due to the inherent noise shielding features of the HWB. NASA's Environmentally Responsible Aviation (ERA) project will continue to investigate advanced configurations, such as the HWB, due to their potential to simultaneously reduce fuel burn, noise and emissions.
NASA Astrophysics Data System (ADS)
Block, P. J.; Alexander, S.; WU, S.
2017-12-01
Skillful season-ahead predictions conditioned on local and large-scale hydro-climate variables can provide valuable knowledge to farmers and reservoir operators, enabling informed water resource allocation and management decisions. In Ethiopia, the potential for advancing agriculture and hydropower management, and subsequently economic growth, is substantial, yet evidence suggests a weak adoption of prediction information by sectoral audiences. To address common critiques, including skill, scale, and uncertainty, probabilistic forecasts are developed at various scales - temporally and spatially - for the Finchaa hydropower dam and the Koga agricultural scheme in an attempt to promote uptake and application. Significant prediction skill is evident across scales, particularly for statistical models. This raises questions regarding other potential barriers to forecast utilization at community scales, which are also addressed.
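In its simplest statistical form, a season-ahead forecast of this kind regresses seasonal inflow on a preceding large-scale climate index and converts the regression into tercile probabilities. The sketch below uses synthetic numbers and an assumed predictor, not the Ethiopian data or the models evaluated in the study.

```python
# Toy season-ahead forecast: regression on a climate index, tercile probabilities.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
sst_index = rng.normal(size=30)                  # preceding-season climate index (synthetic)
inflow = 10.0 - 2.0 * sst_index + rng.normal(scale=1.5, size=30)   # seasonal inflow (synthetic)

# Fit the predictor-to-inflow regression and its residual spread.
slope, intercept = np.polyfit(sst_index, inflow, 1)
resid_std = np.std(inflow - (intercept + slope * sst_index), ddof=2)

# Issue this season's forecast as below/near/above-normal probabilities.
x_new = 1.2                                      # current predictor value (assumed)
forecast = intercept + slope * x_new
lower, upper = np.percentile(inflow, [100 / 3, 200 / 3])
p_below = norm.cdf(lower, loc=forecast, scale=resid_std)
p_above = 1.0 - norm.cdf(upper, loc=forecast, scale=resid_std)
print(forecast, [p_below, 1.0 - p_below - p_above, p_above])
```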
Simulation of all-scale atmospheric dynamics on unstructured meshes
NASA Astrophysics Data System (ADS)
Smolarkiewicz, Piotr K.; Szmelter, Joanna; Xiao, Feng
2016-10-01
The advance of massively parallel computing in the nineteen nineties and beyond encouraged finer grid intervals in numerical weather-prediction models. This has improved resolution of weather systems and enhanced the accuracy of forecasts, while setting the trend for development of unified all-scale atmospheric models. This paper first outlines the historical background to a wide range of numerical methods advanced in the process. Next, the trend is illustrated with a technical review of a versatile nonoscillatory forward-in-time finite-volume (NFTFV) approach, proven effective in simulations of atmospheric flows from small-scale dynamics to global circulations and climate. The outlined approach exploits the synergy of two specific ingredients: the MPDATA methods for the simulation of fluid flows based on the sign-preserving properties of upstream differencing; and the flexible finite-volume median-dual unstructured-mesh discretisation of the spatial differential operators comprising PDEs of atmospheric dynamics. The paper consolidates the concepts leading to a family of generalised nonhydrostatic NFTFV flow solvers that include soundproof PDEs of incompressible Boussinesq, anelastic and pseudo-incompressible systems, common in large-eddy simulation of small- and meso-scale dynamics, as well as all-scale compressible Euler equations. Such a framework naturally extends predictive skills of large-eddy simulation to the global atmosphere, providing a bottom-up alternative to the reverse approach pursued in the weather-prediction models. Theoretical considerations are substantiated by calculations attesting to the versatility and efficacy of the NFTFV approach. Some prospective developments are also discussed.
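To make the sign-preserving upstream idea behind MPDATA concrete, the sketch below implements the scheme in its simplest one-dimensional, constant-velocity, periodic form: a donor-cell pass followed by a single antidiffusive corrective pass. It is a minimal illustration under those assumptions, not the NFTFV solver described above.

```python
# Minimal 1-D MPDATA sketch: donor-cell pass plus one antidiffusive pass.
import numpy as np

def upwind_flux(psi_l, psi_r, c):
    # Donor-cell flux through the interface between left/right cells.
    return np.maximum(c, 0.0) * psi_l + np.minimum(c, 0.0) * psi_r

def upwind_step(psi, c):
    # c may be a scalar or a per-interface array; interface i sits between
    # cell i and cell i+1 (periodic wrap via np.roll).
    flux = upwind_flux(psi, np.roll(psi, -1), c)
    return psi - (flux - np.roll(flux, 1))

def mpdata_step(psi, courant, eps=1e-15):
    psi1 = upwind_step(psi, courant)                    # first-order upwind pass
    psi_r = np.roll(psi1, -1)
    c_anti = (np.abs(courant) - courant**2) * (psi_r - psi1) / (psi_r + psi1 + eps)
    return upwind_step(psi1, c_anti)                    # antidiffusive corrective pass

n, courant = 200, 0.4
x = np.arange(n)
psi = np.exp(-0.5 * ((x - 50) / 8.0) ** 2)              # smooth positive pulse
for _ in range(250):
    psi = mpdata_step(psi, courant)
print(float(psi.min()), float(psi.max()))               # solution stays non-negative
```

The corrective pass reuses the same upwind operator with an "antidiffusive" pseudo-velocity, which is the property that keeps the scheme sign-preserving and is exploited throughout the NFTFV framework discussed above.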
Large-Scale Astrophysical Visualization on Smartphones
NASA Astrophysics Data System (ADS)
Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.
2011-07-01
Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.
Microwave Remote Sensing and the Cold Land Processes Field Experiment
NASA Technical Reports Server (NTRS)
Kim, Edward J.; Cline, Don; Davis, Bert; Hildebrand, Peter H. (Technical Monitor)
2001-01-01
The Cold Land Processes Field Experiment (CLPX) has been designed to advance our understanding of the terrestrial cryosphere. Developing a more complete understanding of fluxes, storage, and transformations of water and energy in cold land areas is a critical focus of the NASA Earth Science Enterprise Research Strategy, the NASA Global Water and Energy Cycle (GWEC) Initiative, the Global Energy and Water Cycle Experiment (GEWEX), and the GEWEX Americas Prediction Project (GAPP). The movement of water and energy through cold regions in turn plays a large role in ecological activity and biogeochemical cycles. Quantitative understanding of cold land processes over large areas will require synergistic advancements in 1) understanding how cold land processes, most comprehensively understood at local or hillslope scales, extend to larger scales, 2) improved representation of cold land processes in coupled and uncoupled land-surface models, and 3) a breakthrough in large-scale observation of hydrologic properties, including snow characteristics, soil moisture, the extent of frozen soils, and the transition between frozen and thawed soil conditions. The CLPX Plan has been developed through the efforts of over 60 interested scientists that have participated in the NASA Cold Land Processes Working Group (CLPWG). This group is charged with the task of assessing, planning and implementing the required background science, technology, and application infrastructure to support successful land surface hydrology remote sensing space missions. A major product of the experiment will be a comprehensive, legacy data set that will energize many aspects of cold land processes research. The CLPX will focus on developing the quantitative understanding, models, and measurements necessary to extend our local-scale understanding of water fluxes, storage, and transformations to regional and global scales. The experiment will particularly emphasize developing a strong synergism between process-oriented understanding, land surface models and microwave remote sensing. The experimental design is a multi-sensor, multi-scale (1 ha to 160,000 km²) approach to providing the comprehensive data set necessary to address several experiment objectives. A description focusing on the microwave remote sensing components (ground, airborne, and spaceborne) of the experiment will be presented.
Composites for Exploration Upper Stage
NASA Technical Reports Server (NTRS)
Fikes, J. C.; Jackson, J. R.; Richardson, S. W.; Thomas, A. D.; Mann, T. O.; Miller, S. G.
2016-01-01
The Composites for Exploration Upper Stage (CEUS) was a 3-year, level III project within the Technology Demonstration Missions program of the NASA Space Technology Mission Directorate. Studies have shown that composites provide important programmatic enhancements, including reduced weight to increase capability and accelerated expansion of exploration and science mission objectives. The CEUS project was focused on technologies that best advanced innovation, infusion, and broad applications for the inclusion of composites on future large human-rated launch vehicles and spacecraft. The benefits included near- and far-term opportunities for infusion (NASA, industry/commercial, Department of Defense), demonstrated critical technologies and technically implementable evolvable innovations, and sustained Agency experience. The initial scope of the project was to advance technologies for large composite structures applicable to the Space Launch System (SLS) Exploration Upper Stage (EUS) by focusing on the affordability and technical performance of the EUS forward and aft skirts. The project was tasked to develop and demonstrate critical composite technologies with a focus on full-scale materials, design, manufacturing, and test using NASA in-house capabilities. This would have demonstrated a major advancement in confidence and matured the large-scale composite technology to a Technology Readiness Level 6. This project would, therefore, have bridged the gap for providing composite application to SLS upgrades, enabling future exploration missions.
Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...
2008-01-01
Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.
Large-Scale Traffic Microsimulation From An MPO Perspective
DOT National Transportation Integrated Search
1997-01-01
One potential advancement of the four-step travel model process is the forecasting and simulation of individual activities and travel. A common concern with such an approach is that the data and computational requirements for a large-scale, regional ...
Gaussian processes for personalized e-health monitoring with wearable sensors.
Clifton, Lei; Clifton, David A; Pimentel, Marco A F; Watkinson, Peter J; Tarassenko, Lionel
2013-01-01
Advances in wearable sensing and communications infrastructure have allowed the widespread development of prototype medical devices for patient monitoring. However, such devices have not penetrated clinical practice, primarily due to a lack of research into "intelligent" analysis methods that are sufficiently robust to support large-scale deployment. Existing systems are typically plagued by high false-alarm rates and an inability to cope with sensor artifact in a principled manner. This paper has two aims: 1) proposal of a novel, patient-personalized system for analysis and inference in the presence of data uncertainty, typically caused by sensor artifact and data incompleteness; 2) demonstration of the method using a large-scale clinical study in which 200 patients have been monitored using the proposed system. The latter provides much-needed evidence that personalized e-health monitoring is feasible within an actual clinical environment, at scale, and that the method is capable of improving patient outcomes via personalized healthcare.
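To make the modeling idea concrete, the following is a minimal Gaussian process regression sketch for a personalized vital-sign baseline, with a white-noise kernel term standing in for sensor artifact; the synthetic heart-rate data, kernel choices, and alerting threshold below are illustrative assumptions and are not the parameters or method used in the study.

```python
# Illustrative sketch: Gaussian process regression over a noisy vital-sign
# time series, with a white-noise kernel term absorbing sensor artifact.
# Synthetic data and hyperparameters are assumptions chosen for illustration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
t = np.linspace(0, 24, 120)[:, None]                     # hours of monitoring
hr_true = 70 + 8 * np.sin(2 * np.pi * t.ravel() / 24)    # smooth personal trend
hr_obs = hr_true + rng.normal(0, 3.0, t.shape[0])        # sensor noise / artifact

# RBF captures the smooth patient-specific trend; WhiteKernel models noise.
kernel = RBF(length_scale=4.0) + WhiteKernel(noise_level=9.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, hr_obs)

# Flag observations far outside the personalized predictive envelope.
pred_mean, pred_std = gp.predict(t, return_std=True)
alerts = np.abs(hr_obs - pred_mean) > 3 * pred_std
print(f"{alerts.sum()} of {len(hr_obs)} samples flagged as potential artifact or alert")
```

In this toy setting, observations falling well outside the personalized predictive envelope are flagged, mirroring the idea of principled inference under data uncertainty.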
An interactive display system for large-scale 3D models
NASA Astrophysics Data System (ADS)
Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman
2018-04-01
With the improvement of 3D reconstruction theory and the rapid development of computer hardware technology, reconstructed 3D models are growing in scale and complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, common 3D display software such as MeshLab has difficulty achieving real-time display of and interaction with large-scale 3D models. In this paper, we propose a display system for large-scale 3D scene models. We construct a levels-of-detail (LOD) model of the reconstructed 3D scene in advance, and then use an out-of-core, view-dependent multi-resolution rendering scheme to realize real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming through the reconstructed scene, and 3D camera poses can also be displayed. Furthermore, memory consumption can be significantly decreased via an internal-external memory exchange mechanism, so that it is possible to display a large-scale reconstructed scene with over millions of 3D points or triangular meshes on a regular PC with only 4 GB of RAM.
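As a rough illustration of the view-dependent selection at the heart of such LOD rendering, the sketch below refines a node whenever its projected screen-space error exceeds a pixel tolerance; the node layout, camera model, and error metric are simplified assumptions rather than the system described above.

```python
# Illustrative view-dependent LOD selection: refine a node whenever its
# projected geometric error exceeds a pixel tolerance. Node metadata and
# the perspective error estimate are simplified assumptions.
from dataclasses import dataclass, field
from typing import List
import math

@dataclass
class LODNode:
    center: tuple           # world-space center of the node's bounding region
    geometric_error: float  # max simplification error at this level (world units)
    children: List["LODNode"] = field(default_factory=list)

def screen_space_error(node, cam_pos, fov_y_rad, viewport_h_px):
    """Approximate projected error in pixels for a perspective camera."""
    d = max(1e-6, math.dist(node.center, cam_pos))
    return node.geometric_error * viewport_h_px / (2.0 * d * math.tan(fov_y_rad / 2.0))

def select_lod(node, cam_pos, fov_y_rad, viewport_h_px, tol_px, out):
    """Collect the coarsest nodes whose projected error is below tolerance."""
    if not node.children or screen_space_error(node, cam_pos, fov_y_rad, viewport_h_px) <= tol_px:
        out.append(node)   # render this level; finer data can stay out of core
        return
    for child in node.children:
        select_lod(child, cam_pos, fov_y_rad, viewport_h_px, tol_px, out)

# Example: a two-level hierarchy viewed from 50 m away with a 2-pixel tolerance.
root = LODNode((0.0, 0.0, 0.0), geometric_error=0.5,
               children=[LODNode((0.0, 0.0, 0.0), geometric_error=0.05)])
visible = []
select_lod(root, cam_pos=(0.0, 0.0, 50.0), fov_y_rad=math.radians(60),
           viewport_h_px=1080, tol_px=2.0, out=visible)
print(len(visible), "node(s) selected for rendering")
```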
Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S
2016-11-01
There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M
2013-06-01
Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process-scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large-scale operation. The process conditions identified are finally translated to microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions, and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse, and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A new resource for developing and strengthening large-scale community health worker programs.
Perry, Henry; Crigler, Lauren; Lewin, Simon; Glenton, Claire; LeBan, Karen; Hodgins, Steve
2017-01-12
Large-scale community health worker programs are now growing in importance around the world in response to the resurgence of interest and growing evidence of the importance of community-based primary health care for improving the health of populations in resource-constrained, high-mortality settings. These programs, because of their scale and operational challenges, merit special consideration by the global health community, national policy-makers, and program implementers. A new online resource is now available to assist in that effort: Developing and Strengthening Community Health Worker Programs at Scale: A Reference Guide and Case Studies for Program Managers and Policymakers (http://www.mchip.net/CHWReferenceGuide). This CHW Reference Guide is the product of 27 different collaborators who, collectively, have a formidable breadth and depth of experience and knowledge about CHW programming around the world. It provides a thoughtful discussion about the many operational issues that large-scale CHW programs need to address as they undergo the process of development, expansion, or strengthening. Detailed case studies of 12 national CHW programs are included in the Appendix; as a group, these are the most current and complete case studies currently available. Future articles in this journal will highlight many of the themes in the CHW Reference Guide and provide an update of recent advances and experiences. These articles will serve, we hope, to (1) increase awareness about the CHW Reference Guide and its usefulness and (2) connect a broader audience to the critical importance of strengthening large-scale CHW programs for the health benefits that they can bring to underserved populations around the world.
Development of the Large-Scale Forcing Data to Support MC3E Cloud Modeling Studies
NASA Astrophysics Data System (ADS)
Xie, S.; Zhang, Y.
2011-12-01
The large-scale forcing fields (e.g., vertical velocity and advective tendencies) are required to run single-column and cloud-resolving models (SCMs/CRMs), which are the two key modeling frameworks widely used to link field data to climate model developments. In this study, we use an advanced objective analysis approach to derive the required forcing data from the soundings collected by the Midlatitude Continental Convective Cloud Experiment (MC3E) in support of its cloud modeling studies. MC3E is the latest major field campaign, conducted during the period 22 April 2011 to 6 June 2011 in south-central Oklahoma through a joint effort between the DOE ARM program and the NASA Global Precipitation Measurement Program. One of its primary goals is to provide a comprehensive dataset that can be used to describe the large-scale environment of convective cloud systems and evaluate model cumulus parameterizations. The objective analysis used in this study is the constrained variational analysis method. A unique feature of this approach is the use of domain-averaged surface and top-of-the-atmosphere (TOA) observations (e.g., precipitation and radiative and turbulent fluxes) as constraints to adjust atmospheric state variables from soundings by the smallest possible amount to conserve column-integrated mass, moisture, and static energy, so that the final analysis data are dynamically and thermodynamically consistent. To address potential uncertainties in the surface observations, an ensemble forcing dataset will be developed. Multi-scale forcing will also be created for simulating convective systems at various scales. At the meeting, we will provide more details about the forcing development and present some preliminary analysis of the characteristics of the large-scale forcing structures for several selected convective systems observed during MC3E.
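In schematic form, the constrained variational analysis adjusts the sounding-derived state by the smallest amount consistent with the column budgets; a simplified statement of that minimization, with notation assumed here rather than taken from the abstract, is:

```latex
\min_{x}\; (x - x_{o})^{\mathsf T}\, W^{-1}\, (x - x_{o})
\quad \text{subject to} \quad g_{k}(x) = 0,\qquad k = 1,\dots,K ,
```

where x_o is the first-guess state from the soundings, W its error covariance (weighting) matrix, and each g_k a column-integrated budget residual (mass, moisture, or static energy) evaluated with the domain-averaged surface and TOA constraints.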
Tian, Yang; Liu, Zhilin; Li, Xiaoqian; Zhang, Lihua; Li, Ruiqing; Jiang, Ripeng; Dong, Fang
2018-05-01
Ultrasonic sonotrodes play an essential role in transmitting power ultrasound into large-scale metallic castings. However, cavitation erosion considerably impairs the in-service performance of ultrasonic sonotrodes, leading to marginal microstructural refinement. In this work, the cavitation erosion behaviour of ultrasonic sonotrodes in large-scale castings was explored using industry-scale experiments on Al alloy cylindrical ingots (630 mm in diameter and 6000 mm in length). When power ultrasound was introduced, severe cavitation erosion was found to reproducibly occur at some specific positions on the ultrasonic sonotrodes. In contrast, no cavitation erosion was present on sonotrodes that were not driven by the electric generator. Vibratory examination showed that cavitation erosion depended on the vibration state of the ultrasonic sonotrodes. Moreover, a finite element (FE) model was developed to simulate the evolution and distribution of acoustic pressure in the 3-D solidification volume. FE simulation results confirmed that significant dynamic interaction between sonotrodes and melts occurred only at the specific positions corresponding to severe cavitation erosion. This work will allow for developing more advanced ultrasonic sonotrodes with better cavitation erosion resistance, in particular for large-scale castings, from the perspectives of ultrasonic physics and mechanical design. Copyright © 2018 Elsevier B.V. All rights reserved.
Dwarshuis, Nate J; Parratt, Kirsten; Santiago-Miranda, Adriana; Roy, Krishnendu
2017-05-15
Therapeutic cells hold tremendous promise in treating currently incurable, chronic diseases since they perform multiple, integrated, complex functions in vivo compared to traditional small-molecule drugs or biologics. However, they also pose significant challenges as therapeutic products because (a) their complex mechanisms of action are difficult to understand and (b) low-cost bioprocesses for large-scale, reproducible manufacturing of cells have yet to be developed. Immunotherapies using T cells and dendritic cells (DCs) have already shown great promise in treating several types of cancers, and human mesenchymal stromal cells (hMSCs) are now being extensively evaluated in clinical trials as immune-modulatory cells. Despite these exciting developments, the full potential of cell-based therapeutics cannot be realized unless new engineering technologies enable cost-effective, consistent manufacturing of high-quality therapeutic cells at large scale. Here we review cell-based immunotherapy concepts focused on the state of the art in manufacturing processes, including cell sourcing, isolation, expansion, modification, quality control (QC), and culture media requirements. We also offer insights into how current technologies could be significantly improved and augmented by new technologies, and how disciplines must converge to meet the long-term needs for large-scale production of cell-based immunotherapies. Copyright © 2017 Elsevier B.V. All rights reserved.
Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.
1989-01-01
The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
Challenges of forest landscape modeling - simulating large landscapes and validating results
Hong S. He; Jian Yang; Stephen R. Shifley; Frank R. Thompson
2011-01-01
Over the last 20 years, we have seen a rapid development in the field of forest landscape modeling, fueled by both technological and theoretical advances. Two fundamental challenges have persisted since the inception of forest landscape models (FLMs): (1) balancing realistic simulation of ecological processes at broad spatial and temporal scales with computing capacity, and (2) validating...
Preliminary logging analysis system (PLANS): overview.
R.H. Twito; S.E. Reutebuch; R.J. McGaughey; C.N. Mann
1987-01-01
The paper previews a computer-aided design system, PLANS, that is useful for developing timber harvest and road network plans on large-scale topographic maps. Earlier planning techniques are reviewed, and the advantages are explained of using advanced planning systems like PLANS. There is a brief summary of the input, output, and function of each program in the PLANS...
Advances in Thermal Spray Coatings for Gas Turbines and Energy Generation: A Review
NASA Astrophysics Data System (ADS)
Hardwicke, Canan U.; Lau, Yuk-Chiu
2013-06-01
Functional coatings are widely used in energy generation equipment in industries such as renewables, oil and gas, propulsion engines, and gas turbines. Intelligent thermal spray processing is vital in many of these areas for efficient manufacturing. Advanced thermal spray coating applications include thermal management, wear, oxidation, corrosion resistance, sealing systems, vibration and sound absorbance, and component repair. This paper reviews the current status of materials, equipment, processing, and properties' aspects for key coatings in the energy industry, especially the developments in large-scale gas turbines. In addition to the most recent industrial advances in thermal spray technologies, future technical needs are also highlighted.
Los Alamos NEP research in advanced plasma thrusters
NASA Technical Reports Server (NTRS)
Schoenberg, Kurt; Gerwin, Richard
1991-01-01
Research was initiated in advanced plasma thrusters, capitalizing on laboratory capabilities in plasma science and technology. The goal of the program was to examine the scaling issues of magnetoplasmadynamic (MPD) thruster performance in support of NASA's MPD thruster development program. The objective was to address multi-megawatt, large-scale, quasi-steady-state MPD thruster performance. Results to date include a new quasi-steady-state operating regime, obtained at Space Exploration Initiative-relevant power levels, that enables direct coaxial gun-MPD comparisons of thruster physics and performance. The radiative losses are negligible. Operation with an applied axial magnetic field shows the same operational stability and exhaust plume uniformity benefits seen in MPD thrusters. Observed gun impedance is in close agreement with the magnetic Bernoulli model predictions. Spatial and temporal measurements of magnetic field, electric field, plasma density, electron temperature, and ion/neutral energy distribution are underway. Model applications to advanced mission logistics are also underway.
A comprehensive study on urban true orthorectification
Zhou, G.; Chen, W.; Kelmelis, J.A.; Zhang, Dongxiao
2005-01-01
To provide some advanced technical bases (algorithms and procedures) and experience needed for national large-scale digital orthophoto generation and revision of the Standards for National Large-Scale City Digital Orthophoto in the National Digital Orthophoto Program (NDOP), this paper presents a comprehensive study on theories, algorithms, and methods of large-scale urban orthoimage generation. The procedures of orthorectification for digital terrain model (DTM)-based and digital building model (DBM)-based orthoimage generation, and their merging for true orthoimage generation, are discussed in detail. A method of compensating for building occlusions using photogrammetric geometry is developed. The data structure needed to model urban buildings for accurately generating urban orthoimages is presented. Shadow detection and removal, seamline optimization for automatic mosaicking, and radiometric balancing of neighboring images are discussed. Street visibility analysis, including the relationship between flight height, building height, street width, and the relative location of the street to the imaging center, is analyzed for complete true orthoimage generation. The experimental results demonstrated that our method can effectively and correctly orthorectify the displacements caused by terrain and buildings in urban large-scale aerial images. © 2005 IEEE.
Assessment of the State-of-the-Art in the Design and Manufacturing of Large Composite Structure
NASA Technical Reports Server (NTRS)
Harris, C. E.
2001-01-01
This viewgraph presentation gives an assessment of the state of the art in the design and manufacturing of large composite structures, including details on the use of continuous fiber reinforced polymer matrix composites (CFRP) in commercial and military aircraft and in space launch vehicles. Project risk mitigation plans must include a building-block test approach to structural design development, manufacturing process scale-up development tests, and pre-flight ground tests to verify structural integrity. The potential benefits of composite structures justify NASA's investment in developing the technology. Advanced composite structures technology is an enabler for virtually every Aero-Space Technology Enterprise Goal.
Mathematical and Computational Challenges in Population Biology and Ecosystems Science
NASA Technical Reports Server (NTRS)
Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.
1997-01-01
Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues-understanding how detail at one scale makes its signature felt at other scales, and how to relate phenomena across scales-cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.
NASA Astrophysics Data System (ADS)
Qu, Xingtian; Li, Jinlai; Yin, Zhifu
2018-04-01
Micro- and nanofluidic chips are of increasing significance for biological and medical applications. Future advances in micro- and nanofluidics and their utilization in commercial applications depend on the development and fabrication of low-cost, high-fidelity, large-scale plastic micro- and nanofluidic chips. However, the majority of present fabrication methods suffer from a low bonding rate of the chip during the thermal bonding process due to air trapping between the substrate and the cover plate. In the present work, a novel bonding technique based on Ar plasma and water treatment was proposed to fully bond large-scale micro- and nanofluidic chips. The influence of Ar plasma parameters on the water contact angle and the effect of bonding conditions on the bonding rate and the bonding strength of the chip were studied. The fluorescence tests demonstrate that a 5 × 5 cm² poly(methyl methacrylate) chip with 180 nm wide and 180 nm deep nanochannels can be fabricated without any blockage or leakage by our newly developed method.
Advances in Parallelization for Large Scale Oct-Tree Mesh Generation
NASA Technical Reports Server (NTRS)
O'Connell, Matthew; Karman, Steve L.
2015-01-01
Despite great advancements in the parallelization of numerical simulation codes over the last 20 years, it is still common to perform grid generation in serial. Generating large scale grids in serial often requires using special "grid generation" compute machines that can have more than ten times the memory of average machines. While some parallel mesh generation techniques have been proposed, generating very large meshes for LES or aeroacoustic simulations is still a challenging problem. An automated method for the parallel generation of very large scale off-body hierarchical meshes is presented here. This work enables large scale parallel generation of off-body meshes by using a novel combination of parallel grid generation techniques and a hybrid "top down" and "bottom up" oct-tree method. Meshes are generated using hardware commonly found in parallel compute clusters. The capability to generate very large meshes is demonstrated by the generation of off-body meshes surrounding complex aerospace geometries. Results are shown including a one billion cell mesh generated around a Predator Unmanned Aerial Vehicle geometry, which was generated on 64 processors in under 45 minutes.
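As a rough sketch of the "top down" portion of such an approach, the fragment below recursively splits octree cells near a body until a target spacing is reached; the proximity test, sizing rule, and data layout are illustrative assumptions and omit the parallel and "bottom up" aspects of the method described above.

```python
# Illustrative top-down octree refinement: split cells that lie near the body
# until they reach a target spacing. The spherical proximity test and target
# spacing are simplified stand-ins for a real off-body sizing function.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Cell:
    origin: tuple                 # min corner (x, y, z)
    size: float                   # edge length of the cube
    children: List["Cell"] = field(default_factory=list)

def near_body(cell, body_center=(0.0, 0.0, 0.0), body_radius=1.0):
    """Crude proximity test: cell center within a few cell sizes of a spherical body."""
    cx = [o + cell.size / 2.0 for o in cell.origin]
    d = sum((a - b) ** 2 for a, b in zip(cx, body_center)) ** 0.5
    return d < body_radius + 2.0 * cell.size

def refine(cell, target_size):
    """Split a cell into 8 children while it is near the body and still too coarse."""
    if cell.size <= target_size or not near_body(cell):
        return
    h = cell.size / 2.0
    for i in range(8):
        off = (cell.origin[0] + h * (i & 1),
               cell.origin[1] + h * ((i >> 1) & 1),
               cell.origin[2] + h * ((i >> 2) & 1))
        child = Cell(off, h)
        cell.children.append(child)
        refine(child, target_size)

def count_leaves(cell):
    return 1 if not cell.children else sum(count_leaves(c) for c in cell.children)

root = Cell((-8.0, -8.0, -8.0), 16.0)
refine(root, target_size=0.5)
print("leaf cells:", count_leaves(root))
```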
Park, Jun-Sang; Okasinski, John; Chatterjee, Kamalika; ...
2017-05-30
High energy X-rays can penetrate large components and samples made from engineering alloys. Brilliant synchrotron sources like the Advanced Photon Source (APS) combined with unique experimental setups are increasingly allowing scientists and engineers to non-destructively characterize the state of materials across a range of length scales. In this article, some of the new developments at the APS, namely the high energy diffraction microscopy technique for grain-by-grain maps and aperture-based techniques for aggregate maps, are described.
Advanced computations in plasma physics
NASA Astrophysics Data System (ADS)
Tang, W. M.
2002-05-01
Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.
Advanced Computation in Plasma Physics
NASA Astrophysics Data System (ADS)
Tang, William
2001-10-01
Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop MPP's to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.
Progress in the Development of a Global Quasi-3-D Multiscale Modeling Framework
NASA Astrophysics Data System (ADS)
Jung, J.; Konor, C. S.; Randall, D. A.
2017-12-01
The Quasi-3-D Multiscale Modeling Framework (Q3D MMF) is a second-generation MMF, which has the following advances over the first-generation MMF: 1) The cloud-resolving models (CRMs) that replace conventional parameterizations are not confined to the large-scale dynamical-core grid cells, and are seamlessly connected to each other, 2) The CRMs sense the three-dimensional large- and cloud-scale environment, 3) Two perpendicular sets of CRM channels are used, and 4) The CRMs can resolve the steep surface topography along the channel direction. The basic design of the Q3D MMF has been developed and successfully tested in a limited-area modeling framework. Currently, global versions of the Q3D MMF are being developed for both weather and climate applications. The dynamical cores governing the large-scale circulation in the global Q3D MMF are selected from two cube-based global atmospheric models. The CRM used in the model is the 3-D nonhydrostatic anelastic Vector-Vorticity Model (VVM), which has been tested with the limited-area version for its suitability for this framework. As a first step of the development, the VVM has been reconstructed on the cubed-sphere grid so that it can be applied to global channel domains and also easily fitted to the large-scale dynamical cores. We have successfully tested the new VVM by advecting a bell-shaped passive tracer and simulating the evolution of waves resulting from idealized barotropic and baroclinic instabilities. For improvement of the model, we also modified the tracer advection scheme to yield positive-definite results and plan to implement a new physics package that includes double-moment microphysics and aerosol physics. The interface for coupling the large-scale dynamical core and the VVM is under development. In this presentation, we shall describe the recent progress in the development and show some test results.
ABLE project: Development of an advanced lead-acid storage system for autonomous PV installations
NASA Astrophysics Data System (ADS)
Lemaire-Potteau, Elisabeth; Vallvé, Xavier; Pavlov, Detchko; Papazov, G.; Borg, Nico Van der; Sarrau, Jean-François
In the advanced battery for low-cost renewable energy (ABLE) project, the partners have developed an advanced storage system for small and medium-size PV systems. It is composed of an innovative valve-regulated lead-acid (VRLA) battery, optimised for reliability and manufacturing cost, and an integrated regulator, for optimal battery management and anti-fraudulent use. The ABLE battery's performance is comparable to that of flooded tubular batteries, which are the reference in medium-size PV systems. The ABLE regulator has several innovative features regarding energy management and modular series/parallel association. The storage system has been validated by indoor, outdoor, and field tests, and it is expected that this concept could be a major improvement for large-scale implementation of PV within the framework of national rural electrification schemes.
Scientific Discovery through Advanced Computing in Plasma Science
NASA Astrophysics Data System (ADS)
Tang, William
2005-03-01
Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alan Black; Arnis Judzis
2003-10-01
This document details the progress to date on the OPTIMIZATION OF DEEP DRILLING PERFORMANCE--DEVELOPMENT AND BENCHMARK TESTING OF ADVANCED DIAMOND PRODUCT DRILL BITS AND HP/HT FLUIDS TO SIGNIFICANTLY IMPROVE RATES OF PENETRATION contract for the year from October 2002 through September 2003. The industry cost-shared program aims to benchmark drilling rates of penetration in selected simulated deep formations and to significantly improve ROP through a team development of aggressive diamond product drill bit--fluid system technologies. Overall the objectives are as follows: Phase 1--Benchmark "best in class" diamond and other product drilling bits and fluids and develop concepts for a next level of deep drilling performance; Phase 2--Develop advanced smart bit--fluid prototypes and test at large scale; and Phase 3--Field trial smart bit--fluid concepts, modify as necessary and commercialize products. Accomplishments to date include the following: 4Q 2002--Project started; Industry Team was assembled; Kick-off meeting was held at DOE Morgantown; 1Q 2003--Engineering meeting was held at Hughes Christensen, The Woodlands, Texas, to prepare preliminary plans for development and testing and review equipment needs; Operators started sending information regarding their needs for deep drilling challenges and priorities for the large-scale testing experimental matrix; Aramco joined the Industry Team as DEA 148 objectives paralleled the DOE project; 2Q 2003--Engineering and planning for high pressure drilling at TerraTek commenced; 3Q 2003--Continuation of engineering and design work for high pressure drilling at TerraTek; Baker Hughes INTEQ Drilling Fluids and Hughes Christensen commence planning for Phase 1 testing--recommendations for bits and fluids.
[Advances in the research of application of artificial intelligence in burn field].
Li, H H; Bao, Z X; Liu, X B; Zhu, S H
2018-04-20
Artificial intelligence has been able to automatically learn and judge large-scale data to some extent. Based on a database of a large amount of burn data and deep learning, artificial intelligence can assist burn surgeons in evaluating burn surface area, diagnosing burn depth, guiding fluid supply during the shock stage, and predicting prognosis with high accuracy. With the development of technology, artificial intelligence can provide more accurate information for burn surgeons in making clinical diagnoses and treatment strategies.
Electroluminescence in SrTiO3:Cr single-crystal nonvolatile memory cells
NASA Astrophysics Data System (ADS)
Alvarado, S. F.; La Mattina, F.; Bednorz, J. G.
2007-10-01
Materials chemistry has emerged as one of the most consistent fabrication tools for the rational delivery of high purity functional nanomaterials, engineered from molecular to microscopic scale at low cost and large scale. An overview of the major achievements and latest advances of a recently developed growth concept and low temperature aqueous synthesis method, for the fabrication of purpose-built large bandgap metal oxide semiconductor materials and oriented nano-arrays is presented. Important insight of direct relevance for semiconductor technology, optoelectronics, photovoltaics and photocatalysis for solar hydrogen generation, are revealed by in-depth investigations of the electronic structure of metal oxide nanostructures with new morphology and architecture, carried out at synchrotron radiation facilities.
1987-11-01
Design guidelines have been developed that can be used by circuit engineers to extract the maximum performance from the devices on various board technologies, including multilayer ceramic. Topics covered include attenuation and dispersion effects and the skin effect.
CrossTalk. The Journal of Defense Software Engineering. Volume 13, Number 6, June 2000
2000-06-01
This issue includes "Techniques for Efficiently Generating and Testing Software" by Keith R. Wegner, which presents a proven process that uses advanced tools to design, develop, and test optimal software, and "Large Software Systems—Back to Basics", which observes that development methods that work on small problems seem not to scale well to large systems.
Advanced computational simulations of water waves interacting with wave energy converters
NASA Astrophysics Data System (ADS)
Pathak, Ashish; Freniere, Cole; Raessi, Mehdi
2017-03-01
Wave energy converter (WEC) devices harness renewable ocean wave energy and convert it into useful forms of energy, e.g., mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting device performance. To enable large-scale simulations with fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: a bottom-hinged cylinder and a flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.
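The Froude scaling examined in those additional simulations can be illustrated with the standard similitude factors; the sketch below converts model-scale measurements to full-scale estimates under an assumed geometric scale factor, and the specific numbers are invented for illustration only.

```python
# Illustrative Froude scaling of model-test quantities to full scale.
# With geometric scale factor lam = L_full / L_model, equal fluid density and
# gravity, standard Froude similitude gives: time ~ lam**0.5, force ~ lam**3,
# power ~ lam**3.5.
def froude_scale(lam, period_s, force_N, power_W):
    return {
        "period_s": period_s * lam ** 0.5,
        "force_N": force_N * lam ** 3,
        "power_W": power_W * lam ** 3.5,
    }

# Example: a hypothetical 1:25 scale flap model with a 1.6 s wave period,
# 40 N peak load, and 2 W mean power.
print(froude_scale(lam=25, period_s=1.6, force_N=40.0, power_W=2.0))
# -> period 8 s, force 625 kN, power about 156 kW at full scale
```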
Advanced Nanostructured Anode Materials for Sodium-Ion Batteries.
Wang, Qidi; Zhao, Chenglong; Lu, Yaxiang; Li, Yunming; Zheng, Yuheng; Qi, Yuruo; Rong, Xiaohui; Jiang, Liwei; Qi, Xinguo; Shao, Yuanjun; Pan, Du; Li, Baohua; Hu, Yong-Sheng; Chen, Liquan
2017-11-01
Sodium-ion batteries (NIBs), owing to their low cost and relatively high safety, have attracted widespread attention worldwide, making them a promising candidate for large-scale energy storage systems. However, their inherently lower energy density compared with lithium-ion batteries is an issue that requires further investigation and optimization. For grid-level energy storage applications, designing and discovering appropriate anode materials for NIBs are of great concern. Although many improvements and innovations have been achieved, several challenges still limit large-scale application, including low energy/power densities, moderate cycle performance, and low initial Coulombic efficiency. Advanced nanostructuring strategies for anode materials can significantly improve ion and electron transport kinetics, enhancing the electrochemical properties of battery systems. Herein, this Review provides a comprehensive summary of progress on nanostructured anode materials for NIBs, with representative examples and the corresponding storage mechanisms discussed. Meanwhile, potential directions for obtaining high-performance anode materials for NIBs are also proposed, providing references for the further development of advanced anode materials for NIBs. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Final Project Report. Scalable fault tolerance runtime technology for petascale computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnamoorthy, Sriram; Sadayappan, P
With the massive number of components comprising the forthcoming petascale computer systems, hardware failures will be routinely encountered during execution of large-scale applications. Due to the multidisciplinary, multiresolution, and multiscale nature of scientific problems that drive the demand for high-end systems, applications place increasingly differing demands on the system resources: disk, network, memory, and CPU. In addition to MPI, future applications are expected to use advanced programming models such as those developed under the DARPA HPCS program as well as existing global address space programming models such as Global Arrays, UPC, and Co-Array Fortran. While there has been a considerable amount of work in fault-tolerant MPI, with a number of strategies and extensions for fault tolerance proposed, virtually none of the advanced models proposed for emerging petascale systems is currently fault aware. To achieve fault tolerance, development of underlying runtime and OS technologies able to scale to the petascale level is needed. This project has evaluated a range of runtime techniques for fault tolerance for advanced programming models.
Information Management for a Large Multidisciplinary Project
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Randall, Donald P.; Cronin, Catherine K.
1992-01-01
In 1989, NASA's Langley Research Center (LaRC) initiated the High-Speed Airframe Integration Research (HiSAIR) Program to develop and demonstrate an integrated environment for high-speed aircraft design using advanced multidisciplinary analysis and optimization procedures. The major goals of this program were to evolve the interactions among disciplines and promote sharing of information, to provide a timely exchange of information among aeronautical disciplines, and to increase the awareness of the effects each discipline has upon other disciplines. LaRC historically has emphasized the advancement of analysis techniques. HiSAIR was founded to synthesize these advanced methods into a multidisciplinary design process emphasizing information feedback among disciplines and optimization. Crucial to the development of such an environment are the definition of the required data exchanges and the methodology for both recording the information and providing the exchanges in a timely manner. These requirements demand extensive use of data management techniques, graphic visualization, and interactive computing. HiSAIR represents the first attempt at LaRC to promote interdisciplinary information exchange on a large scale using advanced data management methodologies combined with state-of-the-art, scientific visualization techniques on graphics workstations in a distributed computing environment. The subject of this paper is the development of the data management system for HiSAIR.
Battery technologies for large-scale stationary energy storage.
Soloveichik, Grigorii L
2011-01-01
In recent years, with the deployment of renewable energy sources, advances in electrified transportation, and development in smart grids, the markets for large-scale stationary energy storage have grown rapidly. Electrochemical energy storage methods are strong candidate solutions due to their high energy density, flexibility, and scalability. This review provides an overview of mature and emerging technologies for secondary and redox flow batteries. New developments in the chemistry of secondary and flow batteries as well as regenerative fuel cells are also considered. Advantages and disadvantages of current and prospective electrochemical energy storage options are discussed. The most promising technologies in the short term are high-temperature sodium batteries with β″-alumina electrolyte, lithium-ion batteries, and flow batteries. Regenerative fuel cells and lithium metal batteries with high energy density require further research to become practical.
CSM Testbed Development and Large-Scale Structural Applications
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.
1989-01-01
A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
Promoting Student Progressions in Science Classrooms: A Video Study
ERIC Educational Resources Information Center
Jin, Hui; Johnson, Michele E.; Shin, Hyo Jeong; Anderson, Charles W.
2017-01-01
This study was conducted in a large-scale environmental literacy project. In the project, we developed a Learning Progression Framework (LPF) for matter and energy in social-ecological systems; the LPF contains four achievement levels. Based on the LPF, we designed a Plant Unit to help Levels 2 and 3 students advance to Level 4 of the LPF. In the…
ERIC Educational Resources Information Center
Bejar, Isaac I.
2010-01-01
The foregoing articles constitute what I consider a comprehensive and clear description of the redesign process of a major assessment. The articles serve to illustrate the problems that will need to be addressed by large-scale assessments in the twenty-first century. Primary among them is how to organize the development of such assessments to meet…
The Emerging Role of the Data Base Manager. Report No. R-1253-PR.
ERIC Educational Resources Information Center
Sawtelle, Thomas K.
The Air Force Logistics Command (AFLC) is revising and enhancing its data-processing capabilities with the development of a large-scale, multi-site, on-line, integrated data base information system known as the Advanced Logistics System (ALS). A data integrity program is to be built around a Data Base Manager (DBM), an individual or a group of…
Using Monoclonal Antibodies to Prevent Mucosal Transmission of Epidemic Infectious Diseases
Zeitlin, Larry; Cone, Richard A.
1999-01-01
Passive immunization with antibodies has been shown to prevent a wide variety of diseases. Recent advances in monoclonal antibody technology are enabling the development of new methods for passive immunization of mucosal surfaces. Human monoclonal antibodies, produced rapidly, inexpensively, and in large quantities, may help prevent respiratory, diarrheal, and sexually transmitted diseases on a public health scale. PMID:10081672
ERIC Educational Resources Information Center
Domyancich, John M.
2014-01-01
Multiple-choice questions are an important part of large-scale summative assessments, such as the advanced placement (AP) chemistry exam. However, past AP chemistry exam items often lacked the ability to test conceptual understanding and higher-order cognitive skills. The redesigned AP chemistry exam shows a distinctive shift in item types toward…
VLSI Microsystem for Rapid Bioinformatic Pattern Recognition
NASA Technical Reports Server (NTRS)
Fang, Wai-Chi; Lue, Jaw-Chyng
2009-01-01
A system comprising very-large-scale integrated (VLSI) circuits is being developed as a means of bioinformatics-oriented analysis and recognition of patterns of fluorescence generated in a microarray in an advanced, highly miniaturized, portable genetic-expression-assay instrument. Such an instrument implements an on-chip combination of polymerase chain reactions and electrochemical transduction for amplification and detection of deoxyribonucleic acid (DNA).
Neuroscience thinks big (and collaboratively).
Kandel, Eric R; Markram, Henry; Matthews, Paul M; Yuste, Rafael; Koch, Christof
2013-09-01
Despite cash-strapped times for research, several ambitious collaborative neuroscience projects have attracted large amounts of funding and media attention. In Europe, the Human Brain Project aims to develop a large-scale computer simulation of the brain, whereas in the United States, the Brain Activity Map is working towards establishing a functional connectome of the entire brain, and the Allen Institute for Brain Science has embarked upon a 10-year project to understand the mouse visual cortex (the MindScope project). US President Barack Obama's announcement of the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies Initiative) in April 2013 highlights the political commitment to neuroscience and is expected to further foster interdisciplinary collaborations, accelerate the development of new technologies and thus fuel much needed medical advances. In this Viewpoint article, five prominent neuroscientists explain the aims of the projects and how they are addressing some of the questions (and criticisms) that have arisen.
Advancing solar energy forecasting through the underlying physics
NASA Astrophysics Data System (ADS)
Yang, H.; Ghonima, M. S.; Zhong, X.; Ozge, B.; Kurtz, B.; Wu, E.; Mejia, F. A.; Zamora, M.; Wang, G.; Clemesha, R.; Norris, J. R.; Heus, T.; Kleissl, J. P.
2017-12-01
As solar power comprises an increasingly large portion of the energy generation mix, the ability to accurately forecast solar photovoltaic generation becomes increasingly important. Due to the variability of solar power caused by cloud cover, knowledge of both the magnitude and timing of expected solar power production ahead of time facilitates the integration of solar power onto the electric grid by reducing electricity generation from traditional ancillary generators such as gas and oil power plants, as well as decreasing the ramping of all generators, reducing start and shutdown costs, and minimizing solar power curtailment, thereby providing annual economic value. The time scales involved in both the energy markets and solar variability range from intra-hour to several days ahead. This wide range of time horizons led to the development of a multitude of techniques, with each offering unique advantages in specific applications. For example, sky imagery provides site-specific forecasts on the minute-scale. Statistical techniques including machine learning algorithms are commonly used in the intra-day forecast horizon for regional applications, while numerical weather prediction models can provide mesoscale forecasts on both the intra-day and days-ahead time scale. This talk will provide an overview of the challenges unique to each technique and highlight the advances in their ongoing development which come alongside advances in the fundamental physics underneath.
NASA Astrophysics Data System (ADS)
Zhang, Pengsong; Jiang, Shanping; Yang, Linhua; Zhang, Bolun
2018-01-01
To meet the requirement of high-precision thermal distortion measurement for a Φ4.2 m deployable mesh satellite antenna in a vacuum and cryogenic environment, a large-scale antenna distortion measurement system for vacuum and cryogenic environments is developed in this paper, based on digital close-range photogrammetry and spacecraft space environment test technology. The Antenna Distortion Measurement System (ADMS) is the first domestically and independently developed thermal distortion measurement system for large antennas, solving the problem of non-contact, high-precision distortion measurement of large spacecraft structures under vacuum and cryogenic conditions. The measurement accuracy of the ADMS is better than 50 μm over 5 m, reaching an internationally advanced level. The experimental results show that the measurement system offers great advantages for large-scale structural measurement of spacecraft and also has broad application prospects in space and other related fields.
Large-eddy simulation of a boundary layer with concave streamwise curvature
NASA Technical Reports Server (NTRS)
Lund, Thomas S.
1994-01-01
Turbulence modeling continues to be one of the most difficult problems in fluid mechanics. Existing prediction methods are well developed for certain classes of simple equilibrium flows, but are still not entirely satisfactory for a large category of complex non-equilibrium flows found in engineering practice. Direct and large-eddy simulation (LES) approaches have long been believed to have great potential for the accurate prediction of difficult turbulent flows, but the associated computational cost has been prohibitive for practical problems. This remains true for direct simulation but is no longer clear for large-eddy simulation. Advances in computer hardware, numerical methods, and subgrid-scale modeling have made it possible to conduct LES for flows of practical interest at Reynolds numbers in the range of laboratory experiments. The objective of this work is to apply LES and the dynamic subgrid-scale model to the flow of a boundary layer over a concave surface.
A multi-scale network method for two-phase flow in porous media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khayrat, Karim, E-mail: khayratk@ifd.mavt.ethz.ch; Jenny, Patrick
Pore-network models of porous media are useful in the study of pore-scale flow in porous media. In order to extract macroscopic properties from flow simulations in pore networks, it is crucial that the networks be large enough to be considered representative elementary volumes. However, existing two-phase network flow solvers are limited to relatively small domains. To address this limitation, a multi-scale pore-network (MSPN) method, which takes into account flow-rate effects and can simulate larger domains compared to existing methods, was developed. In our solution algorithm, a large pore network is partitioned into several smaller sub-networks. The algorithm to advance the fluid interfaces within each subnetwork consists of three steps. First, a global pressure problem on the network is solved approximately using the multiscale finite volume (MSFV) method. Next, the fluxes across the subnetworks are computed. Lastly, using fluxes as boundary conditions, a dynamic two-phase flow solver is used to advance the solution in time. Simulation results of drainage scenarios at different capillary numbers and unfavourable viscosity ratios are presented and used to validate the MSPN method against solutions obtained by an existing dynamic network flow solver.
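As a rough, self-contained illustration of the three-step splitting described above (approximate global pressure solve, interface fluxes, local sub-network advance), the following Python toy works on a 1D chain of pores partitioned into two sub-networks. Every name, number, and simplification in it is invented for illustration; the actual MSPN method operates on 3D pore networks with a multiscale finite volume pressure solve and a dynamic two-phase solver.

    # Toy illustration of the MSPN splitting (all values are invented).
    import numpy as np

    n_pores = 8
    conductance = np.ones(n_pores - 1)        # throat conductances (uniform toy values)
    saturation = np.zeros(n_pores)            # invading-phase saturation per pore
    p_inlet, p_outlet = 1.0, 0.0              # fixed boundary pressures

    def global_pressure():
        """Step 1: approximate global pressure solve (stand-in for MSFV)."""
        A = np.zeros((n_pores, n_pores))
        b = np.zeros(n_pores)
        A[0, 0] = A[-1, -1] = 1.0
        b[0], b[-1] = p_inlet, p_outlet
        for i in range(1, n_pores - 1):
            A[i, i - 1] = -conductance[i - 1]
            A[i, i + 1] = -conductance[i]
            A[i, i] = conductance[i - 1] + conductance[i]
        return np.linalg.solve(A, b)

    def interface_flux(p, i):
        """Step 2: flux across the throat between pores i and i+1."""
        return conductance[i] * (p[i] - p[i + 1])

    def advance_subnetwork(pores, inflow, dt):
        """Step 3: trivial local update standing in for the dynamic two-phase solver."""
        saturation[pores] = np.clip(saturation[pores] + dt * inflow, 0.0, 1.0)

    subnetworks = [np.arange(0, 4), np.arange(4, 8)]  # partition of the chain
    for step in range(10):
        p = global_pressure()
        q = interface_flux(p, 3)                      # flux across the partition boundary
        advance_subnetwork(subnetworks[0], q, dt=0.1)
        advance_subnetwork(subnetworks[1], q, dt=0.1)

    print(saturation)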
Large-Scale Advanced Prop-Fan (LAP) pitch change actuator and control design report
NASA Technical Reports Server (NTRS)
Schwartz, R. A.; Carvalho, P.; Cutler, M. J.
1986-01-01
In recent years, considerable attention has been directed toward improving aircraft fuel consumption. Studies have shown that the high inherent efficiency previously demonstrated by low speed turboprop propulsion systems may now be extended to today's higher speed aircraft if advanced high-speed propeller blades having thin airfoils and aerodynamic sweep are utilized. Hamilton Standard has designed a 9-foot diameter single-rotation Large-Scale Advanced Prop-Fan (LAP) which will be tested on a static test stand, in a high speed wind tunnel and on a research aircraft. The major objective of this testing is to establish the structural integrity of large-scale Prop-Fans of advanced construction in addition to the evaluation of aerodynamic performance and aeroacoustic design. This report describes the operation, design features and actual hardware of the (LAP) Prop-Fan pitch control system. The pitch control system which controls blade angle and propeller speed consists of two separate assemblies. The first is the control unit which provides the hydraulic supply, speed governing and feather function for the system. The second unit is the hydro-mechanical pitch change actuator which directly changes blade angle (pitch) as scheduled by the control.
Resonant soft X-ray scattering for polymer materials
Liu, Feng; Brady, Michael A.; Wang, Cheng
2016-04-16
Resonant Soft X-ray Scattering (RSoXS) was developed within the last few years, and the first dedicated resonant soft X-ray scattering beamline for soft materials was constructed at the Advanced Light Source, LBNL. RSoXS combines soft X-ray spectroscopy with X-ray scattering and thus offers statistical information for 3D chemical morphology over a large length scale range from nanometers to micrometers. Using RSoXS to characterize multi-length scale soft materials with heterogeneous chemical structures, we have demonstrated that soft X-ray scattering is a unique complementary technique to conventional hard X-ray and neutron scattering. Its unique chemical sensitivity, large accessible size scale, molecular bond orientation sensitivity with polarized X-rays, and high coherence have shown great potential for chemically specific structural characterization for many classes of materials.
Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA)
NASA Technical Reports Server (NTRS)
Lichtwardt, Jonathan; Paciano, Eric; Jameson, Tina; Fong, Robert; Marshall, David
2012-01-01
With the very recent advent of NASA's Environmentally Responsible Aviation Project (ERA), which is dedicated to designing aircraft that will reduce the impact of aviation on the environment, there is a need for research and development of methodologies to minimize fuel burn and emissions and to reduce the community noise produced by regional airliners. ERA tackles airframe technology, propulsion technology, and vehicle systems integration to meet performance objectives in a time frame for the aircraft to be at a Technology Readiness Level (TRL) of 4-6 by the year 2020 (deemed N+2). The preceding project that pursued goals similar to ERA was NASA's Subsonic Fixed Wing (SFW) project. SFW focused on conducting research to improve prediction methods and technologies that will produce lower-noise, lower-emission, and higher-performing subsonic aircraft for the Next Generation Air Transportation System. The work described here was performed under NASA Research Announcement (NRA) contract #NNL07AA55C, funded by Subsonic Fixed Wing. The project started in 2007 with the specific goal of conducting a large-scale wind tunnel test along with developing new and improved predictive codes for advanced powered-lift concepts. Many of the predictive codes were used to refine the wind tunnel model outer mold line design. The goal of the large-scale wind tunnel test was to investigate powered-lift technologies and provide an experimental database to validate current and future modeling techniques. The powered-lift concept investigated was a Circulation Control (CC) wing in conjunction with over-the-wing mounted engines, which entrain the exhaust to further increase the lift generated by CC technologies alone. The NRA was a five-year effort: during the first year the objective was to select and refine CESTOL concepts and then to complete a preliminary design of a large-scale wind tunnel model for the large-scale test; during the second, third, and fourth years the large-scale wind tunnel model was designed, manufactured, and calibrated; and during the fifth year the large-scale wind tunnel test was conducted. This technical memo describes all phases of the Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA) project and provides a brief summary of the background and modeling efforts involved in the NRA. The conceptual designs considered for this project and the decision process for the configuration adapted for a wind tunnel model are briefly discussed. The internal configuration of AMELIA is described, along with the internal measurements chosen to provide a database of experimental data for future computational model validation. The external experimental techniques employed during the test, along with the large-scale wind tunnel test facility, are covered in great detail. Experimental measurements in the database include forces and moments, surface pressure distributions, local skin friction measurements, boundary and shear layer velocity profiles, far-field acoustic data, and noise signatures from turbofan propulsion simulators. Results for the circulation control performance, the over-the-wing mounted engines, and the combined configuration are also discussed in great detail.
Plant-made vaccine antigens and biopharmaceuticals
Daniell, Henry; Singh, Nameirakpam D.; Mason, Hugh; Streatfield, Stephen J.
2009-01-01
Plant cells are ideal bioreactors for the production and oral delivery of vaccines and biopharmaceuticals, eliminating the need for expensive fermentation, purification, cold storage, transportation and sterile delivery. Plant-made vaccines have been developed for two decades but none has advanced beyond Phase I. However, two plant-made biopharmaceuticals are now advancing through Phase II and Phase III human clinical trials. In this review, we evaluate the advantages and disadvantages of different plant expression systems (stable nuclear and chloroplast or transient viral) and their current limitations or challenges. We provide suggestions for advancing this valuable concept for clinical applications and conclude that greater research emphasis is needed on large scale production, purification, functional characterization, oral delivery and preclinical evaluation. PMID:19836291
[Research advances in dendrochronology].
Fang, Ke-Yan; Chen, Qiu-Yan; Liu, Chang-Zhi; Cao, Chun-Fu; Chen, Ya-Jun; Zhou, Fei-Fei
2014-07-01
Tree-ring studies in China have achieved great advances since the 1990s, particularly the dendroclimatological studies, which have gained influence around the world. However, because of this uneven development, limited attention has so far been paid to the other branches of dendrochronology. Here we briefly compare the advances of dendrochronology in China with those worldwide and present suggestions for future dendrochronological studies. Large-scale tree-ring based climate reconstructions in China are highly needed; these should employ appropriate mathematical methods and a high-quality tree-ring network of ring width, density, stable isotopes and wood anatomy. Tree-ring based field climate reconstructions also offer the potential to explore the climate forcings at work during the reconstructed periods via climate diagnosis and process simulation.
Challenges and opportunities in synthetic biology for chemical engineers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, YZ; Lee, JK; Zhao, HM
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement. (C) 2012 Elsevier Ltd. All rights reserved.
Challenges and opportunities in synthetic biology for chemical engineers
Luo, Yunzi; Lee, Jung-Kul; Zhao, Huimin
2012-01-01
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement. PMID:24222925
Challenges and opportunities in synthetic biology for chemical engineers.
Luo, Yunzi; Lee, Jung-Kul; Zhao, Huimin
2013-11-15
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement.
Parallel Visualization Co-Processing of Overnight CFD Propulsion Applications
NASA Technical Reports Server (NTRS)
Edwards, David E.; Haimes, Robert
1999-01-01
An interactive visualization system pV3 is being developed for the investigation of advanced computational methodologies employing visualization and parallel processing for the extraction of information contained in large-scale transient engineering simulations. Visual techniques for extracting information from the data in terms of cutting planes, iso-surfaces, particle tracing and vector fields are included in this system. This paper discusses improvements to the pV3 system developed under NASA's Affordable High Performance Computing project.
TOPICAL REVIEW: Advances and challenges in computational plasma science
NASA Astrophysics Data System (ADS)
Tang, W. M.; Chan, V. S.
2005-02-01
Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.
Advances and challenges in computational plasma science
NASA Astrophysics Data System (ADS)
Tang, W. M.
2005-02-01
Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.
Taking Stock: Existing Resources for Assessing a New Vision of Science Learning
ERIC Educational Resources Information Center
Alonzo, Alicia C.; Ke, Li
2016-01-01
A new vision of science learning described in the "Next Generation Science Standards"--particularly the science and engineering practices and their integration with content--poses significant challenges for large-scale assessment. This article explores what might be learned from advances in large-scale science assessment and…
Mission and Objectives for the X-1 Advanced Radiation Source*
NASA Astrophysics Data System (ADS)
Rochau, Gary E.; Ramirez, Juan J.; Raglin, Paul S.
1998-11-01
The X-1 Advanced Radiation Source represents a next step in providing the U.S. Department of Energy's Stockpile Stewardship Program with the high-energy, large-volume, laboratory x-ray source needed for the Radiation Effects Science and Simulation, Inertial Confinement Fusion, and Weapon Physics Programs. Advances in fast pulsed power technology and in z-pinch hohlraums on Sandia National Laboratories' Z Accelerator provide a sufficient basis for pursuing the development of X-1. The X-1 plan follows a strategy based on scaling the 2 MJ x-ray output on Z via a 3-fold increase in z-pinch load current. The large-volume (>5 cm³), high-temperature (>150 eV), temporally long (>10 ns) hohlraums are unique outside of underground nuclear weapon testing. Analytical scaling arguments and hydrodynamic simulations indicate that these hohlraums at temperatures of 230-300 eV will ignite thermonuclear fuel and drive the reaction to a yield of 200 to 1,200 MJ in the laboratory. Non-ignition sources will provide cold x-ray environments (<15 keV), and high-yield fusion burn sources will provide high-fidelity warm x-ray environments (15 keV-80 keV). This paper will introduce the X-1 Advanced Radiation Source Facility Project and describe the project mission, objectives, and preliminary schedule.
Cruise noise of the 2/9th scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A
NASA Technical Reports Server (NTRS)
Dittmar, James H.; Stang, David B.
1987-01-01
Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis Research Center 8 x 6 foot Wind Tunnel. The maximum blade passing tone noise first rises with increasing helical tip Mach number to a peak level, then remains the same or decreases from its peak level when going to higher helical tip Mach numbers. This trend was observed for operation at both constant advance ratio and approximately equal thrust. This noise reduction, or leveling out, at high helical tip Mach numbers points to the use of higher propeller tip speeds as a possible method to limit airplane cabin noise while maintaining high flight speed and efficiency. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with predictions. The prediction method is found to be somewhat conservative in that it slightly overpredicts the projected model data at the peak.
Cruise noise of the 2/9 scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A
NASA Technical Reports Server (NTRS)
Dittmar, James H.; Stang, David B.
1987-01-01
Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis Research Center 8 x 6 foot Wind Tunnel. The maximum blade passing tone noise first rises with increasing helical tip Mach number to a peak level, then remains the same or decreases from its peak level when going to higher helical tip Mach numbers. This trend was observed for operation at both constant advance ratio and approximately equal thrust. This noise reduction, or leveling out, at high helical tip Mach numbers points to the use of higher propeller tip speeds as a possible method to limit airplane cabin noise while maintaining high flight speed and efficiency. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with predictions. The prediction method is found to be somewhat conservative in that it slightly overpredicts the projected model data at the peak.
Advancing Development and Greenhouse Gas Reductions in Vietnam's Wind Sector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bilello, D.; Katz, J.; Esterly, S.
2014-09-01
Clean energy development is a key component of Vietnam's Green Growth Strategy, which establishes a target to reduce greenhouse gas (GHG) emissions from domestic energy activities by 20-30 percent by 2030 relative to a business-as-usual scenario. Vietnam has significant wind energy resources, which, if developed, could help the country reach this target while providing ancillary economic, social, and environmental benefits. Given Vietnam's ambitious clean energy goals and the relatively nascent state of wind energy development in the country, this paper seeks to fulfill two primary objectives: to distill timely and useful information for provincial-level planners, analysts, and project developers as they evaluate opportunities to develop local wind resources; and to provide insights to policymakers on how coordinated efforts may help advance large-scale wind development, deliver near-term GHG emission reductions, and promote national objectives in the context of a low emission development framework.
Advances in cell culture: anchorage dependence
Merten, Otto-Wilhelm
2015-01-01
Anchorage-dependent cells are of great interest for various biotechnological applications. (i) They represent a formidable production means of viruses for vaccination purposes at very large scales (in 1000–6000 l reactors) using microcarriers, and in the last decade many more novel viral vaccines have been developed using this production technology. (ii) With the advent of stem cells and their use/potential use in clinics for cell therapy and regenerative medicine purposes, the development of novel culture devices and technologies for adherent cells has accelerated greatly with a view to the large-scale expansion of these cells. Presently, the really scalable systems—microcarrier/microcarrier-clump cultures using stirred-tank reactors—for the expansion of stem cells are still in their infancy. Only laboratory scale reactors of maximally 2.5 l working volume have been evaluated because thorough knowledge and basic understanding of critical issues with respect to cell expansion while retaining pluripotency and differentiation potential, and the impact of the culture environment on stem cell fate, etc., are still lacking and require further studies. This article gives an overview on critical issues common to all cell culture systems for adherent cells as well as specifics for different types of stem cells in view of small- and large-scale cell expansion and production processes. PMID:25533097
A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.
Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C
2011-11-27
Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.
A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project
Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.
2011-01-01
Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969
Large-scale semidefinite programming for many-electron quantum mechanics.
Mazziotti, David A
2011-02-25
The energy of a many-electron quantum system can be approximated by a constrained optimization of the two-electron reduced density matrix (2-RDM) that is solvable in polynomial time by semidefinite programming (SDP). Here we develop a SDP method for computing strongly correlated 2-RDMs that is 10-20 times faster than previous methods [D. A. Mazziotti, Phys. Rev. Lett. 93, 213001 (2004)]. We illustrate with (i) the dissociation of N2 and (ii) the metal-to-insulator transition of H50. For H50 the SDP problem has 9.4×10^6 variables. This advance also expands the feasibility of large-scale applications in quantum information, control, statistics, and economics. © 2011 American Physical Society
Large-Scale Semidefinite Programming for Many-Electron Quantum Mechanics
NASA Astrophysics Data System (ADS)
Mazziotti, David A.
2011-02-01
The energy of a many-electron quantum system can be approximated by a constrained optimization of the two-electron reduced density matrix (2-RDM) that is solvable in polynomial time by semidefinite programming (SDP). Here we develop a SDP method for computing strongly correlated 2-RDMs that is 10-20 times faster than previous methods [D. A. Mazziotti, Phys. Rev. Lett. 93, 213001 (2004)]. We illustrate with (i) the dissociation of N2 and (ii) the metal-to-insulator transition of H50. For H50 the SDP problem has 9.4×10^6 variables. This advance also expands the feasibility of large-scale applications in quantum information, control, statistics, and economics.
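For readers less familiar with the variational 2-RDM approach, the optimization being solved has, schematically, the following semidefinite form; this is a standard statement with the usual D, Q, G positivity conditions, and the exact constraint set and normalization used in the paper may differ:

\[
E_{\mathrm{gs}} \;\approx\; \min_{{}^{2}D}\ \operatorname{Tr}\!\left({}^{2}K\,{}^{2}D\right)
\quad\text{subject to}\quad
{}^{2}D \succeq 0,\ \ {}^{2}Q \succeq 0,\ \ {}^{2}G \succeq 0,\ \ \operatorname{Tr}\,{}^{2}D = \tfrac{N(N-1)}{2},
\]

where \({}^{2}K\) is the two-electron reduced Hamiltonian, \(N\) is the number of electrons, and \({}^{2}Q\) and \({}^{2}G\) are linear transforms of \({}^{2}D\); the positive semidefinite constraints are what make the problem an SDP.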
Advances in the manufacture of MIP nanoparticles.
Poma, Alessandro; Turner, Anthony P F; Piletsky, Sergey A
2010-12-01
Molecularly imprinted polymers (MIPs) are prepared by creating a three-dimensional polymeric matrix around a template molecule. After the matrix is removed, complementary cavities with respect to shape and functional groups remain. MIPs have been produced for applications in in vitro diagnostics, therapeutics and separations. However, this promising technology still lacks widespread application because of issues related to large-scale production and optimization of the synthesis. Recent developments in the area of MIP nanoparticles might offer solutions to several problems associated with performance and application. This review discusses various approaches used in the preparation of MIP nanoparticles, focusing in particular on the issues associated with large-scale manufacture and implications for the performance of synthesized nanomaterials. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Rolla, L. Barrera; Rice, H. J.
2006-09-01
In this paper a "forward-advancing" field discretization method suitable for solving the Helmholtz equation in large-scale problems is proposed. The forward wave expansion method (FWEM) is derived from a highly efficient discretization procedure based on interpolation of wave functions known as the wave expansion method (WEM). The FWEM computes the propagated sound field by means of an exclusively forward-advancing solution, neglecting the backscattered field. It is thus analogous to methods such as the (one-way) parabolic equation method (PEM), usually discretized using standard finite difference or finite element methods. These techniques do not require the inversion of large system matrices and thus enable the solution of large-scale acoustic problems where backscatter is not of interest. Calculations using the FWEM are presented for two propagation problems and compared with analytical and theoretical solutions, showing the forward approximation to be highly accurate. Examples of sound propagation over a screen in upwind and downwind refracting atmospheric conditions at low nodal spacings (0.2 per wavelength in the propagation direction) are also included to demonstrate the flexibility and efficiency of the method.
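As background (not drawn from the paper itself), the governing equation and the interpolation idea behind the WEM can be summarized as follows. The acoustic pressure satisfies the Helmholtz equation, and at each node the field is approximated by a small set of plane waves, which is what permits the coarse nodal spacings quoted above:

\[
\nabla^{2} p + k^{2} p = 0, \qquad k = \omega/c,
\qquad
p(\mathbf{x}) \;\approx\; \sum_{j=1}^{m} A_{j}\, e^{\,\mathrm{i} k\, \hat{\mathbf{n}}_{j} \cdot \mathbf{x}} .
\]

In the forward-advancing variant, only wave directions \(\hat{\mathbf{n}}_{j}\) with a positive component along the propagation direction are retained, so the field can be marched plane by plane without assembling and inverting a global system; backscatter is neglected by construction.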
Scaling of data communications for an advanced supercomputer network
NASA Technical Reports Server (NTRS)
Levin, E.; Eaton, C. K.; Young, Bruce
1986-01-01
The goal of NASA's Numerical Aerodynamic Simulation (NAS) Program is to provide a powerful computational environment for advanced research and development in aeronautics and related disciplines. The present NAS system consists of a Cray 2 supercomputer connected by a data network to a large mass storage system, to sophisticated local graphics workstations and by remote communication to researchers throughout the United States. The program plan is to continue acquiring the most powerful supercomputers as they become available. The implications of a projected 20-fold increase in processing power on the data communications requirements are described.
ERIC Educational Resources Information Center
Chow, Christina M.
2011-01-01
Maintaining a competitive edge within the 21st century is dependent on the cultivation of human capital, producing qualified and innovative employees capable of competing within the new global marketplace. Technological advancements in communications technology, as well as large-scale infrastructure development, have led to a leveled playing field…
NASA Astrophysics Data System (ADS)
Kariniotakis, G.; Anemos Team
2003-04-01
Objectives: Accurate forecasting of the wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project supported by the European Commission that brings together research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations like complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models gives emphasis to techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy. Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits from the use of satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information and Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable on a large scale: at single wind farm, regional or national level, and for both interconnected and island systems. A major milestone is the on-line operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will support increased wind integration at two levels: at the operational level, through better management of wind farms, and by contributing to increased installed wind farm capacity, because accurate prediction of the resource reduces the risk to wind farm developers, who are then more willing to undertake new installations, especially in a liberalized electricity market environment.
NASA Astrophysics Data System (ADS)
Wagener, T.
2017-12-01
Current societal problems and questions demand that we increasingly build hydrologic models for regional or even continental scale assessment of global change impacts. Such models offer new opportunities for scientific advancement, for example by enabling comparative hydrology or connectivity studies, and for improved support of water management decisions, since we might better understand regional impacts on water resources from large-scale phenomena such as droughts. On the other hand, we are faced with epistemic uncertainties when we move up in scale. The term epistemic uncertainty describes those uncertainties that are not well determined by historical observations. This lack of determination can be because the future is not like the past (e.g. due to climate change), because the historical data are unreliable (e.g. because they are imperfectly recorded from proxies or missing), or because they are scarce (either because measurements are not available at the right scale or there is no observation network available at all). In this talk I will explore: (1) how we might build a bridge between what we have learned about catchment-scale processes and hydrologic model development and evaluation at larger scales; (2) how we can understand the impact of epistemic uncertainty in large-scale hydrologic models; and (3) how we might utilize large-scale hydrologic predictions to understand climate change impacts, e.g. on infectious disease risk.
Scalability improvements to NRLMOL for DFT calculations of large molecules
NASA Astrophysics Data System (ADS)
Diaz, Carlos Manuel
Advances in high performance computing (HPC) have provided a way to treat large, computationally demanding tasks using thousands of processors. With the development of more powerful HPC architectures, the need to create efficient and scalable code has grown more important. Electronic structure calculations are valuable in understanding experimental observations and are routinely used for new materials predictions. For electronic structure calculations, the memory and computation time grow with the number of atoms; memory requirements scale as N², where N is the number of atoms. While recent advances in HPC offer platforms with large numbers of cores, the limited amount of memory available on a given node and the poor scalability of the electronic structure code hinder efficient usage of these platforms. This thesis presents developments to overcome these bottlenecks in order to study large systems. These developments, which are implemented in the NRLMOL electronic structure code, involve the use of sparse matrix storage formats and linear algebra on sparse and distributed matrices. These developments, along with other related work, now allow ground-state density functional calculations using up to 25,000 basis functions and excited-state calculations using up to 17,000 basis functions while utilizing all cores on a node. An example involving a light-harvesting triad molecule is described. Finally, future plans to further improve the scalability are presented.
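To make the storage argument concrete: a dense N-by-N double-precision matrix needs 8N^2 bytes, which for 25,000 basis functions is roughly 5 GB per matrix, whereas a compressed sparse row (CSR) format stores only the nonzero entries plus two index arrays. The short Python/SciPy sketch below is illustrative only; NRLMOL itself is not written in Python, and the matrix size and sparsity used here are invented for the example.

    # Dense vs. CSR storage for a large, mostly-zero matrix (illustrative numbers).
    import numpy as np
    from scipy import sparse

    n = 20_000        # number of basis functions (illustrative)
    density = 1e-4    # fraction of nonzero entries (illustrative)

    # Random sparse matrix standing in for a Hamiltonian or overlap block.
    h = sparse.random(n, n, density=density, format="csr", dtype=np.float64)

    dense_bytes = n * n * 8
    sparse_bytes = h.data.nbytes + h.indices.nbytes + h.indptr.nbytes
    print(f"dense: {dense_bytes / 1e9:.1f} GB, CSR: {sparse_bytes / 1e6:.1f} MB")

    # Matrix-vector products, the core kernel of iterative eigensolvers,
    # operate directly on the CSR representation.
    v = np.ones(n)
    w = h @ v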
Ubiquitinated Proteome: Ready for Global?*
Shi, Yi; Xu, Ping; Qin, Jun
2011-01-01
Ubiquitin (Ub) is a small and highly conserved protein that can covalently modify protein substrates. Ubiquitination is one of the major post-translational modifications that regulate a broad spectrum of cellular functions. The advancement of mass spectrometers as well as the development of new affinity purification tools has greatly expedited proteome-wide analysis of several post-translational modifications (e.g. phosphorylation, glycosylation, and acetylation). In contrast, large-scale profiling of lysine ubiquitination remains a challenge. Most recently, new Ub affinity reagents such as Ub remnant antibody and tandem Ub binding domains have been developed, allowing for relatively large-scale detection of several hundreds of lysine ubiquitination events in human cells. Here we review different strategies for the identification of ubiquitination site and discuss several issues associated with data analysis. We suggest that careful interpretation and orthogonal confirmation of MS spectra is necessary to minimize false positive assignments by automatic searching algorithms. PMID:21339389
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramamurthy, Byravamurthy
2014-05-05
In this project, we developed scheduling frameworks and algorithms for dynamic bandwidth demands from large-scale science applications. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search and Genetic Algorithm heuristics, we utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We have disseminated our work through conference paper presentations, journal papers and a book chapter. In this project we also addressed the problem of scheduling lightpaths over optical wavelength division multiplexed (WDM) networks, publishing several conference and journal papers on this topic. We further addressed the problem of joint allocation of computing, storage and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.
Visual Systems for Interactive Exploration and Mining of Large-Scale Neuroimaging Data Archives
Bowman, Ian; Joshi, Shantanu H.; Van Horn, John D.
2012-01-01
While technological advancements in neuroimaging scanner engineering have improved the efficiency of data acquisition, electronic data capture methods will likewise significantly expedite the populating of large-scale neuroimaging databases. As these archives grow in size, a particular challenge lies in examining and interacting with the information that these resources contain through the development of compelling, user-driven approaches for data exploration and mining. In this article, we introduce the informatics visualization for neuroimaging (INVIZIAN) framework for the graphical rendering of, and dynamic interaction with, the contents of large-scale neuroimaging data sets. We describe the rationale behind INVIZIAN, detail its development, and demonstrate its usage in examining a collection of over 900 T1-anatomical magnetic resonance imaging (MRI) image volumes from across a diverse set of clinical neuroimaging studies drawn from a leading neuroimaging database. Using a collection of cortical surface metrics and means for examining brain similarity, INVIZIAN graphically displays brain surfaces as points in a coordinate space, enabling classification of clusters of neuroanatomically similar MRI images and data mining. As an initial step toward addressing the need for such user-friendly tools, INVIZIAN provides a unique means to interact with large quantities of electronic brain imaging archives in ways suitable for hypothesis generation and data mining. PMID:22536181
A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations
NASA Astrophysics Data System (ADS)
Demir, I.; Agliamzanov, R.
2014-12-01
Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them towards running large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes as work units of small spatial and computational size. A relational database system is utilized for managing data connections and the task queue for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
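The framework itself is JavaScript-based and backed by a relational database; purely to illustrate the queue-based task distribution pattern described above, here is a minimal in-memory Python sketch of a server handing out small work units and collecting results. All names and structures are hypothetical and are not the authors' API.

    # Minimal in-memory sketch of a volunteer-computing task queue (illustrative only).
    from collections import deque
    from uuid import uuid4

    class TaskQueue:
        def __init__(self, work_units):
            self.pending = deque(work_units)   # work units not yet handed out
            self.in_progress = {}              # task_id -> work unit
            self.results = {}                  # task_id -> result

        def checkout(self):
            """Hand the next small work unit to a volunteer node."""
            if not self.pending:
                return None
            task_id = str(uuid4())
            self.in_progress[task_id] = self.pending.popleft()
            return task_id, self.in_progress[task_id]

        def submit(self, task_id, result):
            """Store a volunteer's result and retire the work unit."""
            self.results[task_id] = result
            self.in_progress.pop(task_id, None)

    # Example: split a domain into small spatial chunks and "run" them.
    queue = TaskQueue([{"cell_range": (i, i + 100)} for i in range(0, 1000, 100)])
    while (task := queue.checkout()) is not None:
        task_id, unit = task
        queue.submit(task_id, sum(unit["cell_range"]))  # stand-in for a model run
    print(len(queue.results), "work units completed")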
Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin
2016-04-19
Recent advances in mass spectrometers, which have yielded higher resolution and faster scanning speeds, have expanded their application in metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled, parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. The quantification performance of the PRM assay on the Q-Exactive was compared with that of MS1-based quantification, and also with the MRM assay on the QTRAP 6500. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification, and greater flexibility in post-acquisition assay refinement than the MRM assay on the QTRAP 6500. We present a workflow for convenient PRM data processing using Skyline software, which is free of charge. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for large-scale targeted metabolomics, which provides a new choice for basic and clinical metabolomics studies.
Drive to miniaturization: integrated optical networks on mobile platforms
NASA Astrophysics Data System (ADS)
Salour, Michael M.; Batayneh, Marwan; Figueroa, Luis
2011-11-01
With the rapid growth of the Internet, bandwidth demand for data traffic continues to explode. In addition, emerging and future applications are becoming more and more network-centric. With the proliferation of data communication platforms, data-intensive applications (e.g. cloud computing), high-bandwidth content such as video clips dominating Internet traffic, and social networking tools, a networking technology that can scale the Internet's capability (particularly its bandwidth) by two to three orders of magnitude is very desirable. As the limits of Moore's law are approached, optical mesh networks based on wavelength-division multiplexing (WDM) have the ability to satisfy the large- and scalable-bandwidth requirements of our future backbone telecommunication networks. This trend is also affecting other special-purpose systems in applications such as mobile platforms, automobiles, aircraft, ships, tanks, and micro unmanned air vehicles (UAVs), which are becoming independent systems roaming the sky while sensing data, processing, making decisions, and even communicating and networking with other heterogeneous systems. Recently, WDM optical technologies have seen advances in transmission speeds, switching technologies, routing protocols, and control systems. Such advances have made WDM optical technology an appealing choice for the design of future Internet architectures. Along these lines, scientists across the entire spectrum of network architectures, from the physical layer to applications, have been working on developing devices and communication protocols that can take full advantage of the rapid advances in WDM technology. Nevertheless, the focus has always been on large-scale telecommunication networks that span hundreds and even thousands of miles. Given these advances, we investigate the vision and applicability of integrating traditionally large-scale WDM optical networks into miniaturized mobile platforms such as UAVs. We explain the benefits of WDM optical technology for these applications. We also describe some of the limitations of WDM optical networks as the size of a vehicle gets smaller, such as in micro-UAVs, and study the miniaturization and communication system limitations in such environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kitanidis, Peter
As large-scale, commercial storage projects become operational, the problem of utilizing information from diverse sources becomes more critically important. In this project, we developed, tested, and applied an advanced joint data inversion system for CO2 storage modeling with large data sets for use in site characterization and real-time monitoring. Emphasis was on the development of advanced and efficient computational algorithms for joint inversion of hydro-geophysical data, coupled with state-of-the-art forward process simulations. The developed system consists of (1) inversion tools using characterization data, such as 3D seismic survey (amplitude images), borehole log and core data, as well as hydraulic, tracer and thermal tests before CO2 injection, (2) joint inversion tools for updating the geologic model with the distribution of rock properties, thus reducing uncertainty, using hydro-geophysical monitoring data, and (3) highly efficient algorithms for directly solving the dense or sparse linear algebra systems derived from the joint inversion. The system combines methods from stochastic analysis, fast linear algebra, and high performance computing. The developed joint inversion tools have been tested through synthetic CO2 storage examples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anh Bui; Nam Dinh; Brian Williams
In addition to the validation data plan, the development of advanced techniques for calibration and validation of complex multiscale, multiphysics nuclear reactor simulation codes is a main objective of the CASL VUQ plan. Advanced modeling of LWR systems normally involves a range of physico-chemical models describing multiple interacting phenomena, such as thermal hydraulics, reactor physics, coolant chemistry, etc., which occur over a wide range of spatial and temporal scales. To a large extent, the accuracy of (and uncertainty in) overall model predictions is determined by the correctness of various sub-models, which are not conservation-law based but empirically derived from measurement data. Such sub-models normally require extensive calibration before the models can be applied to analysis of real reactor problems. This work demonstrates a case study of calibration of a common model of subcooled flow boiling, which is an important multiscale, multiphysics phenomenon in LWR thermal hydraulics. The calibration process is based on a new strategy of model-data integration, in which all sub-models are simultaneously analyzed and calibrated using multiple sets of data of different types. Specifically, both data on large-scale distributions of void fraction and fluid temperature and data on small-scale physics of wall evaporation were used simultaneously in this work's calibration. In a departure from traditional (or common-sense) practice of tuning/calibrating complex models, a modern calibration technique based on statistical modeling and Bayesian inference was employed, which allowed simultaneous calibration of multiple sub-models (and related parameters) using different datasets. Quality of data (relevancy, scalability, and uncertainty) could be taken into consideration in the calibration process. This work presents a step forward in the development and realization of the "CIPS Validation Data Plan" at the Consortium for Advanced Simulation of LWRs to enable quantitative assessment of the CASL modeling of the Crud-Induced Power Shift (CIPS) phenomenon, in particular, and the CASL advanced predictive capabilities, in general. This report is prepared for the Department of Energy's Consortium for Advanced Simulation of LWRs program's VUQ Focus Area.
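As a schematic of what simultaneous calibration of multiple sub-models against multiple data types means in practice, the Python toy below uses a random-walk Metropolis sampler to infer two parameters from two synthetic datasets at once. The model forms, priors, noise levels, and data are all invented for illustration and bear no relation to the actual CASL sub-models or datasets.

    # Toy Bayesian calibration of two parameters against two data types at once.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "measurements": dataset A (think large-scale void fraction profile)
    # and dataset B (think small-scale wall evaporation rate), from a known truth.
    theta_true = np.array([1.5, 0.3])
    x = np.linspace(0.0, 1.0, 20)
    data_a = theta_true[0] * x + rng.normal(0, 0.05, x.size)
    data_b = theta_true[1] + rng.normal(0, 0.02, 5)

    def log_posterior(theta):
        if np.any(theta <= 0):                   # simple positivity prior
            return -np.inf
        resid_a = data_a - theta[0] * x
        resid_b = data_b - theta[1]
        return -0.5 * (np.sum((resid_a / 0.05) ** 2) + np.sum((resid_b / 0.02) ** 2))

    theta = np.array([1.0, 1.0])
    logp = log_posterior(theta)
    samples = []
    for _ in range(5000):
        proposal = theta + rng.normal(0, 0.05, 2)
        logp_new = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_new - logp:   # Metropolis accept/reject
            theta, logp = proposal, logp_new
        samples.append(theta.copy())

    print("posterior means:", np.mean(samples[1000:], axis=0))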
DOT National Transportation Integrated Search
1994-10-01
A shake test was performed on the Large Scale Dynamic Rig in the 40- by 80-Foot Wind Tunnel in support of the McDonnell Douglas Advanced Rotor Technology (MDART) Test Program. The shake test identifies the hub modes and the dynamic calibration matrix...
An Overview of NASA Efforts on Zero Boiloff Storage of Cryogenic Propellants
NASA Technical Reports Server (NTRS)
Hastings, Leon J.; Plachta, D. W.; Salerno, L.; Kittel, P.; Haynes, Davy (Technical Monitor)
2001-01-01
Future mission planning within NASA has increasingly motivated consideration of cryogenic propellant storage durations on the order of years as opposed to a few weeks or months. Furthermore, the advancement of cryocooler and passive insulation technologies in recent years has substantially improved the prospects for zero boiloff storage of cryogenics. Accordingly, a cooperative effort by NASA's Ames Research Center (ARC), Glenn Research Center (GRC), and Marshall Space Flight Center (MSFC) has been implemented to develop and demonstrate "zero boiloff" concepts for in-space storage of cryogenic propellants, particularly liquid hydrogen and oxygen. ARC is leading the development of flight-type cryocoolers, GRC the subsystem development and small scale testing, and MSFC the large scale and integrated system level testing. Thermal and fluid modeling involves a combined effort by the three Centers. Recent accomplishments include: 1) development of "zero boiloff" analytical modeling techniques for sizing the storage tankage, passive insulation, cryocooler, power source mass, and radiators; 2) an early subscale demonstration with liquid hydrogen; 3) procurement of a flight-type 10 watt, 95 K pulse tube cryocooler for liquid oxygen storage; and 4) assembly of a large-scale test article for an early demonstration of the integrated operation of passive insulation, destratification/pressure control, and cryocooler (commercial unit) subsystems to achieve zero boiloff storage of liquid hydrogen. Near term plans include the large-scale integrated system demonstration testing this summer, subsystem testing of the flight-type pulse-tube cryocooler with liquid nitrogen (oxygen simulant), and continued development of a flight-type liquid hydrogen pulse tube cryocooler.
New Composite Thermoelectric Materials for Macro-size Applications
Dresselhaus, Mildred [MIT, Cambridge, Massachusetts, United States]
2017-12-09
A review will be given of several important recent advances in both thermoelectrics research and industrial thermoelectric applications, which have attracted much attention, increasing incentives for developing advanced materials appropriate for large-scale applications of thermoelectric devices. One promising strategy is the development of materials with a dense packing of random nanostructures as a route to the scale-up of thermoelectric applications. The concepts involved in designing composite materials containing nanostructures for thermoelectric applications will be discussed in general terms. Specific application is made to the Bi2Te3 nanocomposite system for use in power generation. Also emphasized are the scientific advantages of the nanocomposite approach for the simultaneous increase in the power factor and decrease of the thermal conductivity, along with the practical advantages of having bulk samples for property measurements and device applications. A straightforward path is identified for the scale-up of thermoelectric materials synthesis containing nanostructured constituents for use in thermoelectric applications. We end with some vision of where the field of thermoelectrics is now heading.
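The reason that raising the power factor while lowering the thermal conductivity matters is captured by the dimensionless thermoelectric figure of merit; the definition below is standard background and is not specific to this talk:

\[
ZT = \frac{S^{2}\sigma}{\kappa}\,T, \qquad \kappa = \kappa_{\mathrm{lattice}} + \kappa_{\mathrm{electronic}},
\]

where \(S\) is the Seebeck coefficient, \(\sigma\) the electrical conductivity, \(T\) the absolute temperature, and \(S^{2}\sigma\) the power factor. Nanostructured composites aim to scatter phonons (reducing \(\kappa_{\mathrm{lattice}}\)) more strongly than they scatter charge carriers, so that \(ZT\) rises.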
Tran, Duy Phu; Pham, Thuy Thi Thanh; Wolfrum, Bernhard; Offenhäusser, Andreas; Thierry, Benjamin
2018-05-11
Owing to their two-dimensional confinement, silicon nanowires display remarkable optical, magnetic, and electronic properties. Of special interest has been the development of advanced biosensing approaches based on the field effect associated with silicon nanowires (SiNWs). Recent advancements in top-down fabrication technologies have paved the way to large-scale production of high-density, high-quality arrays of SiNW field-effect transistors (FETs), a critical step towards their integration in real-life biosensing applications. A key requirement toward the fulfilment of SiNW FETs' promises in the bioanalytical field is their efficient integration within functional devices. Aiming to provide a comprehensive roadmap for the development of SiNW FET based sensing platforms, we critically review and discuss the key design and fabrication aspects relevant to their development and integration within complementary metal-oxide-semiconductor (CMOS) technology.
Photovoltaic Subcontract Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Surek, Thomas; Catalano, Anthony
1993-03-01
This report summarizes the fiscal year (FY) 1992 progress of the subcontracted photovoltaic (PV) research and development (R&D) performed under the Photovoltaic Advanced Research and Development Project at the National Renewable Energy Laboratory (NREL), formerly the Solar Energy Research Institute (SERI). The mission of the national PV program is to develop PV technology for large-scale generation of economically competitive electric power in the United States. The technical sections of the report cover the main areas of the subcontract program: the Crystalline Materials and Advanced Concepts project, the Polycrystalline Thin Films project, the Amorphous Silicon Research project, the Photovoltaic Manufacturing Technology (PVMaT) project, the PV Module and System Performance and Engineering project, and the PV Analysis and Applications Development project. Technical summaries of each of the subcontracted programs provide a discussion of approaches, major accomplishments in FY 1992, and future research directions.
Simulating chemical reactions in ionic liquids using QM/MM methodology.
Acevedo, Orlando
2014-12-18
The use of ionic liquids as a reaction medium for chemical reactions has dramatically increased in recent years due in large part to the numerous reported advances in catalysis and organic synthesis. In some extreme cases, ionic liquids have been shown to induce mechanistic changes relative to conventional solvents. Despite the large interest in the solvents, a clear understanding of the molecular factors behind their chemical impact is largely unknown. This feature article reviews our efforts developing and applying mixed quantum and molecular mechanical (QM/MM) methodology to elucidate the microscopic details of how these solvents operate to enhance rates and alter mechanisms for industrially and academically important reactions, e.g., Diels-Alder, Kemp eliminations, nucleophilic aromatic substitutions, and β-eliminations. Explicit solvent representation provided the medium dependence of the activation barriers and atomic-level characterization of the solute-solvent interactions responsible for the experimentally observed "ionic liquid effects". Technical advances are also discussed, including a linear-scaling pairwise electrostatic interaction alternative to Ewald sums, an efficient polynomial fitting method for modeling proton transfers, and the development of a custom ionic liquid OPLS-AA force field.
NASA Astrophysics Data System (ADS)
Duro, Javier; Iglesias, Rubén; Blanco, Pablo; Albiol, David; Koudogbo, Fifamè
2015-04-01
The Wide Area Product (WAP) is a new interferometric product developed to provide measurements over large regions. Persistent Scatterer Interferometry (PSI) has largely proved its robust and precise performance in measuring ground surface deformation in different application domains. In this context, however, accurate displacement estimation over large-scale areas (more than 10,000 km2) characterized by low-magnitude motion gradients (3-5 mm/year), such as those induced by inter-seismic or Earth tidal effects, still remains an open issue. The main reason is the inclusion of low-quality and more distant persistent scatterers in order to bridge low-quality areas, such as water bodies, crop areas and forested regions. This leads to spatial propagation errors in the PSI integration process, poor estimation and compensation of the Atmospheric Phase Screen (APS), and difficulty in handling residual long-wavelength phase patterns originating from orbit state vector inaccuracies. Research work on generating a Wide Area Product of ground motion in preparation for the Sentinel-1 mission has been conducted in the last stages of Terrafirma as well as in other research programs. These developments propose technological updates for maintaining precision over large-scale PSI analysis. Some of the updates are based on the use of external information, such as meteorological models, and the employment of GNSS data for an improved calibration of large-scale measurements. Covering wide regions usually implies processing over areas whose land use is chiefly livestock, horticulture, urbanization and forest. This represents an important challenge for providing continuous InSAR measurements and requires the application of advanced phase filtering strategies to enhance coherence. The advanced PSI processing has been performed over several areas, allowing a large-scale analysis of tectonic patterns and of motion caused by multiple hazards such as volcanic activity, landslides and floods. Several examples of the application of the PSI WAP to wide regions for measuring ground displacements related to different types of hazards, natural and human-induced, will be presented. The InSAR processing approach to measure accurate movements at local and large scales for multi-hazard interpretation studies will also be discussed. The test areas will show deformations related to active fault systems, landslides on mountain slopes, ground compaction over underlying aquifers, and movements in volcanic areas.
CImbinator: a web-based tool for drug synergy analysis in small- and large-scale datasets.
Flobak, Åsmund; Vazquez, Miguel; Lægreid, Astrid; Valencia, Alfonso
2017-08-01
Drug synergies are sought to identify particularly beneficial combinations of drugs. User-friendly software solutions that can assist analysis of large-scale datasets are required. CImbinator is a web-service that can aid in batch-wise and in-depth analyses of data from small-scale and large-scale drug combination screens. CImbinator offers to quantify drug combination effects, using both the commonly employed median effect equation and advanced experimental mathematical models describing dose-response relationships. CImbinator is written in Ruby and R. It uses the R package drc for advanced drug response modeling. CImbinator is available at http://cimbinator.bioinfo.cnio.es , the source code is open and available at https://github.com/Rbbt-Workflows/combination_index . A Docker image is also available at https://hub.docker.com/r/mikisvaz/rbbt-ci_mbinator/ . asmund.flobak@ntnu.no or miguel.vazquez@cnio.es. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
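CImbinator's own implementation is in Ruby and R (via the drc package); purely as an illustration of the median-effect equation and combination index it quantifies, the following is a minimal Python sketch. The function names and the two-drug form of the index are our choices for illustration, not the tool's API.

```python
import numpy as np

def fit_median_effect(doses, fraction_affected):
    """Fit the median-effect equation fa/fu = (D/Dm)^m by linear regression on
    log(fa/fu) = m*log(D) - m*log(Dm); returns the slope m and median-effect dose Dm."""
    x = np.log(doses)
    y = np.log(fraction_affected / (1.0 - fraction_affected))
    m, intercept = np.polyfit(x, y, 1)
    dm = np.exp(-intercept / m)
    return m, dm

def combination_index(fa, d1, d2, m1, dm1, m2, dm2):
    """Two-drug combination index at fraction affected fa for combined doses (d1, d2);
    CI < 1 suggests synergy, CI = 1 additivity, CI > 1 antagonism."""
    dx1 = dm1 * (fa / (1.0 - fa)) ** (1.0 / m1)   # dose of drug 1 alone giving effect fa
    dx2 = dm2 * (fa / (1.0 - fa)) ** (1.0 / m2)   # dose of drug 2 alone giving effect fa
    return d1 / dx1 + d2 / dx2

# Hypothetical single-agent dose-response data for two drugs.
m1, dm1 = fit_median_effect(np.array([0.1, 0.3, 1.0, 3.0]), np.array([0.10, 0.25, 0.50, 0.80]))
m2, dm2 = fit_median_effect(np.array([1.0, 3.0, 10.0, 30.0]), np.array([0.15, 0.35, 0.55, 0.85]))
print(combination_index(fa=0.5, d1=0.4, d2=4.0, m1=m1, dm1=dm1, m2=m2, dm2=dm2))
```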
NASA Technical Reports Server (NTRS)
Bushnell, P.; Gruber, M.; Parzych, D.
1988-01-01
Unsteady blade surface pressure data are provided for the Large-Scale Advanced Prop-Fan (LAP) blade operating with angular inflow, wake inflow and uniform flow over a range of inflow Mach numbers from 0.02 to 0.70. The data are presented as Fourier coefficients for the first 35 harmonics of shaft rotational frequency. Also presented is a brief discussion of the unsteady blade response observed at takeoff and cruise conditions with angular and wake inflow.
The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering
NASA Technical Reports Server (NTRS)
Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen
2006-01-01
This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered "to identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains five sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-Based Robotics Manufacture and Servicing Models.
Public-Private Partnerships in Cloud-Computing Services in the Context of Genomic Research.
Granados Moreno, Palmira; Joly, Yann; Knoppers, Bartha Maria
2017-01-01
Public-private partnerships (PPPs) have been increasingly used to spur and facilitate innovation in a number of fields. In healthcare, the purpose of using a PPP is commonly to develop and/or provide vaccines and drugs against communicable diseases, mainly in developing or underdeveloped countries. With the advancement of technology and of the area of genomics, these partnerships also focus on large-scale genomic research projects that aim to advance the understanding of diseases that have a genetic component and to develop personalized treatments. This new focus has created new forms of PPPs that involve information technology companies, which provide computing infrastructure and services to store, analyze, and share the massive amounts of data genomic-related projects produce. In this article, we explore models of PPPs proposed to handle, protect, and share the genomic data collected and to further develop genomic-based medical products. We also identify the reasons that make these models suitable and the challenges they have yet to overcome. To achieve this, we describe the details and complexities of MSSNG, International Cancer Genome Consortium, and 100,000 Genomes Project, the three PPPs that focus on large-scale genomic research to better understand the genetic components of autism, cancer, rare diseases, and infectious diseases with the intention to find appropriate treatments. Organized as PPP and employing cloud-computing services, the three projects have advanced quickly and are likely to be important sources of research and development for future personalized medicine. However, there still are unresolved matters relating to conflicts of interest, commercialization, and data control. Learning from the challenges encountered by past PPPs allowed us to establish that developing guidelines to adequately manage personal health information stored in clouds and ensuring the protection of data integrity and privacy would be critical steps in the development of future PPPs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alan Black; Arnis Judzis
2005-09-30
This document details progress on the OPTIMIZATION OF DEEP DRILLING PERFORMANCE--DEVELOPMENT AND BENCHMARK TESTING OF ADVANCED DIAMOND PRODUCT DRILL BITS AND HP/HT FLUIDS TO SIGNIFICANTLY IMPROVE RATES OF PENETRATION contract for the year from October 2004 through September 2005. The industry cost-shared program aims to benchmark drilling rates of penetration in selected simulated deep formations and to significantly improve ROP through a team development of aggressive diamond product drill bit-fluid system technologies. The overall objectives are as follows: Phase 1--Benchmark "best in class" diamond and other product drilling bits and fluids and develop concepts for a next level of deep drilling performance; Phase 2--Develop advanced smart bit-fluid prototypes and test at large scale; and Phase 3--Field trial smart bit-fluid concepts, modify as necessary and commercialize products. As of the report date, TerraTek has concluded all Phase 1 testing and is planning Phase 2 development.
The Classification, Natural History and Treatment of the Limb Girdle Muscular Dystrophies.
Murphy, Alexander Peter; Straub, Volker
2015-07-22
Over sixty years ago John Walton and Frederick Nattrass defined limb girdle muscular dystrophy (LGMD) as a separate entity from the X-linked dystrophinopathies such as Duchenne and Becker muscular dystrophies. LGMD is a highly heterogeneous group of very rare neuromuscular disorders whose common factor is their autosomal inheritance. Sixty years later, with the development of increasingly advanced molecular genetic investigations, a more precise classification and understanding of the pathogenesis is possible. To date, over 30 distinct subtypes of LGMD have been identified, most of them inherited in an autosomal recessive fashion. There are significant differences in the frequency of subtypes of LGMD between different ethnic populations, providing evidence of founder mutations. Clinically there is phenotypic heterogeneity between subtypes of LGMD with varying severity and age of onset of symptoms. The first natural history studies into subtypes of LGMD are in process, but large scale longitudinal data have been lacking due to the rare nature of these diseases. Following natural history data collection, the next challenge is to develop more effective, disease specific treatments. Current management is focussed on symptomatic and supportive treatments. Advances in the application of new omics technologies and the generation of large-scale biomedical data will help to better understand disease mechanisms in LGMD and should ultimately help to accelerate the development of novel and more effective therapeutic approaches.
NASA Marshall Space Flight Center Controls Systems Design and Analysis Branch
NASA Technical Reports Server (NTRS)
Gilligan, Eric
2014-01-01
Marshall Space Flight Center maintains a critical national capability in the analysis of launch vehicle flight dynamics and flight certification of GN&C algorithms. MSFC analysts are domain experts in the areas of flexible-body dynamics and control-structure interaction, thrust vector control, sloshing propellant dynamics, and advanced statistical methods. Marshall's modeling and simulation expertise has supported manned spaceflight for over 50 years. Marshall's unparalleled capability in launch vehicle guidance, navigation, and control technology stems from its rich heritage in developing, integrating, and testing launch vehicle GN&C systems dating to the early Mercury-Redstone and Saturn vehicles. The Marshall team is continuously developing novel methods for design, including advanced techniques for large-scale optimization and analysis.
[Prospect of the Advanced Life Support Program Breadboard Project at Kennedy Space Center in USA].
Guo, S S; Ai, W D
2001-04-01
The Breadboard Project at NASA's Kennedy Space Center in the USA was focused on the development of bioregenerative life support components: crop plants for water, air, and food production, and bioreactors for recycling of wastes. The keystone of the Breadboard Project was the Biomass Production Chamber (BPC), which was supported by 15 environmentally controlled chambers and several laboratory facilities with a total area of 2150 m2. In support of the Advanced Life Support Program (ALS Program), the Project uses these facilities for large-scale testing of components and development of technologies required for human-rated test-beds at NASA's Johnson Space Center, ultimately to enable lunar and Mars missions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Linfeng
A literature survey has been conducted to collect information on international R&D activities in the extraction of uranium from seawater for the period from the 1960s through 2010. The reported activities, covering both laboratory-scale bench experiments and large-scale marine experiments, are summarized by country/region in this report. Among all countries where such activities have been reported, Japan has carried out the most advanced large-scale marine experiments with the amidoxime-based system, achieving a collection efficiency (1.5 g-U/kg-adsorbent for 30 days of soaking in the ocean) that could justify the development of industrial-scale marine systems to produce uranium from seawater at a price competitive with conventional uranium resources. R&D opportunities are discussed for improving system performance (selectivity for uranium, loading capacity, chemical stability and mechanical durability in the sorption-elution cycle, and sorption kinetics) and making the collection of uranium from seawater more economically competitive.
Large-scale magnetic fields at high Reynolds numbers in magnetohydrodynamic simulations.
Hotta, H; Rempel, M; Yokoyama, T
2016-03-25
The 11-year solar magnetic cycle shows a high degree of coherence in spite of the turbulent nature of the solar convection zone. It has been found in recent high-resolution magnetohydrodynamics simulations that the maintenance of a large-scale coherent magnetic field is difficult with small viscosity and magnetic diffusivity (≲10^12 square centimeters per second). We reproduced previous findings that indicate a reduction of the energy in the large-scale magnetic field for lower diffusivities and demonstrate the recovery of the global-scale magnetic field using unprecedentedly high resolution. We found an efficient small-scale dynamo that suppresses small-scale flows, which mimics the properties of large diffusivity. As a result, the global-scale magnetic field is maintained even in the regime of small diffusivities, that is, large Reynolds numbers. Copyright © 2016, American Association for the Advancement of Science.
An investigation of small scales of turbulence in a boundary layer at high Reynolds numbers
NASA Technical Reports Server (NTRS)
Wallace, James M.; Ong, L.; Balint, J.-L.
1993-01-01
The assumption that turbulence at large wave-numbers is isotropic and has universal spectral characteristics which are independent of the flow geometry, at least for high Reynolds numbers, has been a cornerstone of closure theories as well as of the most promising recent development in the effort to predict turbulent flows, viz. large eddy simulations. This hypothesis was first advanced by Kolmogorov based on the supposition that turbulent kinetic energy cascades down the scales (up the wave-numbers) of turbulence and that, if the number of these cascade steps is sufficiently large (i.e. the wave-number range is large), then the effects of anisotropies at the large scales are lost in the energy transfer process. Experimental attempts were repeatedly made to verify this fundamental assumption. However, Van Atta has recently suggested that an examination of the scalar and velocity gradient fields is necessary to definitively verify this hypothesis or prove it to be unfounded. Of course, this must be carried out in a flow with a sufficiently high Reynolds number to provide the necessary separation of scales in order unambiguously to provide the possibility of local isotropy at large wave-numbers. An opportunity to use our 12-sensor hot-wire probe to address this issue directly was made available at the 80'x120' wind tunnel at the NASA Ames Research Center, which is normally used for full-scale aircraft tests. An initial report on this high Reynolds number experiment and progress toward its evaluation is presented.
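For context, the universal-equilibrium hypothesis invoked here is usually expressed through Kolmogorov's inertial-range energy spectrum; the statement below is standard background, not a result of this report:

```latex
E(k) = C_K \, \varepsilon^{2/3} \, k^{-5/3}, \qquad \frac{1}{L} \ll k \ll \frac{1}{\eta},
```

where \(\varepsilon\) is the mean dissipation rate, \(C_K\) the Kolmogorov constant, \(L\) the integral (energy-containing) scale, and \(\eta\) the Kolmogorov dissipation scale. Local isotropy at large wave-numbers can only be expected when the separation \(L/\eta\) is large, i.e. at sufficiently high Reynolds number, which is why the full-scale 80'x120' wind tunnel was attractive for this experiment.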
Human cardiomyocyte generation from pluripotent stem cells: A state-of-art.
Talkhabi, Mahmood; Aghdami, Nasser; Baharvand, Hossein
2016-01-15
The human heart is considered a non-regenerative organ. Worldwide, cardiovascular diseases continue to be the leading cause of death. Despite advances in cardiac treatment, myocardial repair remains severely limited by the lack of an appropriate source of viable cardiomyocytes (CMs) to replace damaged tissue. Human pluripotent stem cells (hPSCs), embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs) can efficiently be differentiated into functional CMs necessary for cell replacement therapy and other potential applications. The number of protocols that derive CMs from hPSCs has increased exponentially over the past decade following observation of the first human beating CMs. A number of highly efficient, chemical based protocols have been developed to generate human CMs (hCMs) in small-scale and large-scale suspension systems. To reduce the heterogeneity of hPSC-derived CMs, the differentiation protocols were modulated to exclusively generate atrial-, ventricular-, and nodal-like CM subtypes. Recently, remarkable advances have been achieved in hCM generation including chemical-based cardiac differentiation, cardiac subtype specification, large-scale suspension culture differentiation, and development of chemically defined culture conditions. These hCMs could be useful particularly in the context of in vitro disease modeling, pharmaceutical screening and in cellular replacement therapies once the safety issues are overcome. Herein we review recent progress in the in vitro generation of CMs and cardiac subtypes from hPSCs and discuss their potential applications and current limitations. Copyright © 2015 Elsevier Inc. All rights reserved.
Turboprop Model in the 8- by 6-Foot Supersonic Wind Tunnel
1976-08-21
National Aeronautics and Space Administration (NASA) engineer Robert Jeracki prepares a Hamilton Standard SR-1 turboprop model in the test section of the 8- by 6-Foot Supersonic Wind Tunnel at the Lewis Research Center. Lewis researchers were analyzing a series of eight-bladed propellers in their wind tunnels to determine their operating characteristics at speeds up to Mach 0.8. The program, which became the Advanced Turboprop, was part of a NASA-wide Aircraft Energy Efficiency Program which was designed to reduce aircraft fuel costs by 50 percent. The ATP concept was different from the turboprops in use in the 1950s. The modern versions had at least eight blades and were swept back for better performance. After Lewis researchers developed the advanced turboprop theory and established its potential performance capabilities, they commenced an almost decade-long partnership with Hamilton Standard to develop, verify, and improve the concept. A series of 24-inch scale models of the SR-1 with different blade shapes and angles were tested in Lewis’ wind tunnels. A formal program was established in 1978 to examine associated noise levels, aerodynamics, and the drive system. The testing of the large-scale propfan was done on test rigs, in large wind tunnels, and, eventually, on aircraft.
Development of advanced materials composites for use as insulations for LH2 tanks
NASA Technical Reports Server (NTRS)
Lemons, C. R.; Watts, C. R.; Salmassy, O. K.
1972-01-01
A study of internal insulation materials and fabrication processes for space shuttle LH2 tanks is reported. Emphasis was placed on an insulation system capable of reentry and multiple reuse in the Shuttle environment. Results are given on the optimization and manufacturing process scale-up of a 3D fiber-reinforced foam insulation, BX-251-3D, derived from the Saturn S-4B internal insulation. It is shown that BX-251-3D can be satisfactorily installed in large-scale tanks under conditions that will permit a significant cost saving over the existing S-4B technology.
Current challenges in quantifying preferential flow through the vadose zone
NASA Astrophysics Data System (ADS)
Koestel, John; Larsbo, Mats; Jarvis, Nick
2017-04-01
In this presentation, we give an overview of current challenges in quantifying preferential flow through the vadose zone. A review of the literature suggests that current generation models do not fully reflect the present state of process understanding and empirical knowledge of preferential flow. We believe that the development of improved models will be stimulated by the increasingly widespread application of novel imaging technologies as well as future advances in computational power and numerical techniques. One of the main challenges in this respect is to bridge the large gap between the scales at which preferential flow occurs (pore to Darcy scales) and the scale of interest for management (fields, catchments, regions). Studies at the pore scale are being supported by the development of 3-D non-invasive imaging and numerical simulation techniques. These studies are leading to a better understanding of how macropore network topology and initial/boundary conditions control key state variables like matric potential and thus the strength of preferential flow. Extrapolation of this knowledge to larger scales would require support from theoretical frameworks such as key concepts from percolation and network theory, since we lack measurement technologies to quantify macropore networks at these large scales. Linked hydro-geophysical measurement techniques that produce highly spatially and temporally resolved data enable investigation of the larger-scale heterogeneities that can generate preferential flow patterns at pedon, hillslope and field scales. At larger regional and global scales, improved methods of data-mining and analyses of large datasets (machine learning) may help in parameterizing models as well as lead to new insights into the relationships between soil susceptibility to preferential flow and site attributes (climate, land uses, soil types).
Aquatic ecosystem protection and restoration: Advances in methods for assessment and evaluation
Bain, M.B.; Harig, A.L.; Loucks, D.P.; Goforth, R.R.; Mills, K.E.
2000-01-01
Many methods and criteria are available to assess aquatic ecosystems, and this review focuses on a set that demonstrates advancements from community analyses to methods spanning large spatial and temporal scales. Basic methods have been extended by incorporating taxa sensitivity to different forms of stress, adding measures linked to system function, synthesizing multiple faunal groups, integrating biological and physical attributes, spanning large spatial scales, and enabling simulations through time. These tools can be customized to meet the needs of a particular assessment and ecosystem. Two case studies are presented to show how new methods were applied at the ecosystem scale for achieving practical management goals. One case used an assessment of biotic structure to demonstrate how enhanced river flows can improve habitat conditions and restore a diverse fish fauna reflective of a healthy riverine ecosystem. In the second case, multitaxonomic integrity indicators were successful in distinguishing lake ecosystems that were disturbed, healthy, and in the process of restoration. Most methods strive to address the concept of biological integrity and assessment effectiveness often can be impeded by the lack of more specific ecosystem management objectives. Scientific and policy explorations are needed to define new ways for designating a healthy system so as to allow specification of precise quality criteria that will promote further development of ecosystem analysis tools.
Constructing Neuronal Network Models in Massively Parallel Environments.
Ippen, Tammo; Eppler, Jochen M; Plesser, Hans E; Diesmann, Markus
2017-01-01
Recent advances in the development of data structures to represent spiking neuron network models enable us to exploit the complete memory of petascale computers for a single brain-scale network simulation. In this work, we investigate how well we can exploit the computing power of such supercomputers for the creation of neuronal networks. Using an established benchmark, we divide the runtime of simulation code into the phase of network construction and the phase during which the dynamical state is advanced in time. We find that on multi-core compute nodes network creation scales well with process-parallel code but exhibits a prohibitively large memory consumption. Thread-parallel network creation, in contrast, exhibits speedup only up to a small number of threads but has little overhead in terms of memory. We further observe that the algorithms creating instances of model neurons and their connections scale well for networks of ten thousand neurons, but do not show the same speedup for networks of millions of neurons. Our work uncovers that the lack of scaling of thread-parallel network creation is due to inadequate memory allocation strategies and demonstrates that thread-optimized memory allocators recover excellent scaling. An analysis of the loop order used for network construction reveals that more complex tests on the locality of operations significantly improve scaling and reduce runtime by allowing construction algorithms to step through large networks more efficiently than in existing code. The combination of these techniques increases performance by an order of magnitude and harnesses the increasingly parallel compute power of the compute nodes in high-performance clusters and supercomputers.
Iversen, Marianne; Fauchald, Per; Langeland, Knut; Ims, Rolf A; Yoccoz, Nigel G; Bråthen, Kari Anne
2014-01-01
The spatial and temporal distribution of forage quality is among the most central factors affecting herbivore habitat selection. Yet, for high latitude areas, forage quantity has been found to be more important than quality. Studies on large ungulate foraging patterns are faced with methodological challenges in both assessing animal movements at the scale of forage distribution, and in assessing forage quality with relevant metrics. Here we use first-passage time analyses to assess how reindeer movements relate to forage quality and quantity measured as the phenology and cover of growth forms along reindeer tracks. The study was conducted in a high latitude ecosystem dominated by low-palatable growth forms. We found that the scale of reindeer movement was season dependent, with more extensive area use as the summer season advanced. Small-scale movement in the early season was related to selection for younger stages of phenology and for higher abundances of generally phenologically advanced palatable growth forms (grasses and deciduous shrubs). Also there was a clear selection for later phenological stages of the most dominant, yet generally phenologically slow and low-palatable growth form (evergreen shrubs). As the summer season advanced only quantity was important, with selection for higher quantities of one palatable growth form and avoidance of a low palatable growth form. We conclude that both forage quality and quantity are significant predictors to habitat selection by a large herbivore at high latitude. The early season selectivity reflected that among dominating low palatability growth forms there were palatable phenological stages and palatable growth forms available, causing herbivores to be selective in their habitat use. The diminishing selectivity and the increasing scale of movement as the season developed suggest a response by reindeer to homogenized forage availability of low quality.
NASA Astrophysics Data System (ADS)
Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.
2012-12-01
Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing for quantifying and bounding uncertainties. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact on both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL, focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms and (c) use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "curse of high dimensionality"; and development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
Frongillo, Edward A; Nguyen, Phuong H; Saha, Kuntal K; Sanghvi, Tina; Afsana, Kaosar; Haque, Raisul; Baker, Jean; Ruel, Marie T; Rawat, Rahul; Menon, Purnima
2017-02-01
Promoting adequate nutrition through interventions to improve infant and young child feeding (IYCF) has the potential to contribute to child development. We examined whether an intensive intervention package that was aimed at improving IYCF at scale through the Alive & Thrive initiative in Bangladesh also advanced language and gross motor development, and whether advancements in language and gross motor development were explained through improved complementary feeding. A cluster-randomized design compared 2 intervention packages: intensive interpersonal counseling on IYCF, mass media campaign, and community mobilization (intensive) versus usual nutrition counseling and mass media campaign (nonintensive). Twenty subdistricts were randomly assigned to receive either the intensive or the nonintensive intervention. Household surveys were conducted at baseline (2010) and at endline (2014) in the same communities (n = ∼4000 children aged 0-47.9 mo for each round). Child development was measured by asking mothers if their child had reached each of multiple milestones, with some observed. Linear regression accounting for clustering was used to derive difference-in-differences (DID) impact estimates, and path analysis was used to examine developmental advancement through indicators of improved IYCF and other factors. The DID in language development between intensive and nonintensive groups was 1.05 milestones (P = 0.001) among children aged 6-23.9 mo and 0.76 milestones (P = 0.038) among children aged 24-47.9 mo. For gross motor development, the DID was 0.85 milestones (P = 0.035) among children aged 6-23.9 mo. The differences observed corresponded to age- and sex-adjusted effect sizes of 0.35 for language and 0.23 for gross motor development. Developmental advancement at 6-23.9 mo was partially explained through improved minimum dietary diversity and the consumption of iron-rich food. Intensive IYCF intervention differentially advanced language and gross motor development, which was partially explained through improved complementary feeding. Measuring a diverse set of child outcomes, including functional outcomes such as child development, is important when evaluating integrated nutrition programs. This trial was registered at clinicaltrials.gov as NCT01678716. © 2017 American Society for Nutrition.
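The impact estimate described above is a standard difference-in-differences regression with cluster-robust inference; as a minimal sketch (hypothetical file and column names, not the study's actual analysis code), it could be computed as follows:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format survey data: one row per child, with survey period
# (0 = baseline 2010, 1 = endline 2014), arm (0 = nonintensive, 1 = intensive),
# the count of language milestones reached, and the subdistrict (cluster) id.
df = pd.read_csv("child_milestones.csv")

# The coefficient on the period:arm interaction is the difference-in-differences
# (DID) impact estimate; standard errors are clustered on subdistrict.
fit = smf.ols("milestones ~ period * arm", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["subdistrict"]})
print(fit.params["period:arm"], fit.pvalues["period:arm"])
```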
Advanced spacecraft: What will they look like and why
NASA Technical Reports Server (NTRS)
Price, Humphrey W.
1990-01-01
The next century of spaceflight will witness an expansion in the physical scale of spacecraft, from the extreme of the microspacecraft to the very large megaspacecraft. This will respectively spawn advances in highly integrated and miniaturized components, and also advances in lightweight structures, space fabrication, and exotic control systems. Challenges are also presented by the advent of advanced propulsion systems, many of which require controlling and directing hot plasma, dissipating large amounts of waste heat, and handling very high radiation sources. Vehicle configuration studies for a number of these types of advanced spacecraft were performed, and some of them are presented along with the rationale for their physical layouts.
Spasojevic, Marko J; Bahlai, Christie A; Bradley, Bethany A; Butterfield, Bradley J; Tuanmu, Mao-Ning; Sistla, Seeta; Wiederholt, Ruscena; Suding, Katharine N
2016-04-01
Understanding the mechanisms underlying ecosystem resilience - why some systems have an irreversible response to disturbances while others recover - is critical for conserving biodiversity and ecosystem function in the face of global change. Despite the widespread acceptance of a positive relationship between biodiversity and resilience, empirical evidence for this relationship remains fairly limited in scope and localized in scale. Assessing resilience at the large landscape and regional scales most relevant to land management and conservation practices has been limited by the ability to measure both diversity and resilience over large spatial scales. Here, we combined tools used in large-scale studies of biodiversity (remote sensing and trait databases) with theoretical advances developed from small-scale experiments to ask whether the functional diversity within a range of woodland and forest ecosystems influences the recovery of productivity after wildfires across the four-corner region of the United States. We additionally asked how environmental variation (topography, macroclimate) across this geographic region influences such resilience, either directly or indirectly via changes in functional diversity. Using path analysis, we found that functional diversity in regeneration traits (fire tolerance, fire resistance, resprout ability) was a stronger predictor of the recovery of productivity after wildfire than the functional diversity of seed mass or species richness. Moreover, slope, elevation, and aspect either directly or indirectly influenced the recovery of productivity, likely via their effect on microclimate, while macroclimate had no direct or indirect effects. Our study provides some of the first direct empirical evidence for functional diversity increasing resilience at large spatial scales. Our approach highlights the power of combining theory based on local-scale studies with tools used in studies at large spatial scales and trait databases to understand pressing environmental issues. © 2015 John Wiley & Sons Ltd.
Distributed state machine supervision for long-baseline gravitational-wave detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rollins, Jameson Graef, E-mail: jameson.rollins@ligo.org
The Laser Interferometer Gravitational-wave Observatory (LIGO) consists of two identical yet independent, widely separated, long-baseline gravitational-wave detectors. Each Advanced LIGO detector consists of complex optical-mechanical systems isolated from the ground by multiple layers of active seismic isolation, all controlled by hundreds of fast, digital, feedback control systems. This article describes a novel state machine-based automation platform developed to handle the automation and supervisory control challenges of these detectors. The platform, called Guardian, consists of distributed, independent, state machine automaton nodes organized hierarchically for full detector control. User code is written in standard Python and the platform is designed to facilitate the fast-paced development process associated with commissioning the complicated Advanced LIGO instruments. While developed specifically for the Advanced LIGO detectors, Guardian is a generic state machine automation platform that is useful for experimental control at all levels, from simple table-top setups to large-scale multi-million dollar facilities.
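As a purely conceptual sketch of what such a hierarchy of state-machine automaton nodes looks like (the class, state, and attribute names below are invented for illustration and are not Guardian's actual Python API):

```python
import time

class Node:
    """A toy automaton: each state is a callable that receives the requested state
    and returns the name of the next state (or None to stay in the current state)."""
    def __init__(self, states, initial, requested):
        self.states = states          # mapping of state name -> callable
        self.current = initial
        self.requested = requested    # what an operator or a parent node asks for

    def step(self):
        nxt = self.states[self.current](self.requested)
        if nxt is not None:
            self.current = nxt

def down(requested):
    # Safe idle state; leave only when something else is requested.
    return "ACQUIRING" if requested != "DOWN" else None

def acquiring(requested):
    # Stand-in for the real acquisition sequence (engaging feedback loops, etc.).
    time.sleep(0.01)
    return "LOCKED"

def locked(requested):
    # Hold lock until asked to drop back down.
    return "DOWN" if requested == "DOWN" else None

node = Node({"DOWN": down, "ACQUIRING": acquiring, "LOCKED": locked},
            initial="DOWN", requested="LOCKED")
for _ in range(3):
    node.step()
print(node.current)   # -> LOCKED
```

In a platform of this kind, many such nodes run concurrently and higher-level nodes issue requests to lower-level ones, which is what provides the hierarchical, full-detector supervision described above.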
2'-modified nucleosides for site-specific labeling of oligonucleotides
NASA Technical Reports Server (NTRS)
Krider, Elizabeth S.; Miller, Jeremiah E.; Meade, Thomas J.
2002-01-01
We report the synthesis of 2'-modified nucleosides designed specifically for incorporating labels into oligonucleotides. Conversion of these nucleosides to phosphoramidite and solid support-bound derivatives proceeds in good yield. Large-scale synthesis of 11-mer oligonucleotides possessing the 2'-modified nucleosides is achieved using these derivatives. Thermal denaturation studies indicate that the presence of 2'-modified nucleosides in 11-mer duplexes has minimal destabilizing effects on the duplex structure when the nucleosides are placed at the duplex termini. The powerful combination of phosphoramidite and support-bound derivatives of 2'-modified nucleosides affords the large-scale preparation of an entirely new class of oligonucleotides. The ability to synthesize oligonucleotides containing label attachment sites at 3', intervening, and 5' locations of a duplex is a significant advance in the development of oligonucleotide conjugates.
A functional model for characterizing long-distance movement behaviour
Buderman, Frances E.; Hooten, Mevin B.; Ivan, Jacob S.; Shenk, Tanya M.
2016-01-01
Advancements in wildlife telemetry techniques have made it possible to collect large data sets of highly accurate animal locations at a fine temporal resolution. These data sets have prompted the development of a number of statistical methodologies for modelling animal movement. Telemetry data sets are often collected for purposes other than fine-scale movement analysis. These data sets may differ substantially from those that are collected with technologies suitable for fine-scale movement modelling and may consist of locations that are irregular in time, are temporally coarse or have large measurement error. These data sets are time-consuming and costly to collect but may still provide valuable information about movement behaviour. We developed a Bayesian movement model that accounts for error from multiple data sources as well as movement behaviour at different temporal scales. The Bayesian framework allows us to calculate derived quantities that describe temporally varying movement behaviour, such as residence time, speed and persistence in direction. The model is flexible, easy to implement and computationally efficient. We apply this model to data from Colorado Canada lynx (Lynx canadensis) and use derived quantities to identify changes in movement behaviour.
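For illustration only, the kinds of derived quantities named above (speed and persistence in direction) can be computed from any estimated track; the short sketch below uses a synthetic path and is not the authors' Bayesian model:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 10.0, 1.0)                            # observation times (e.g. hours)
xy = np.column_stack([np.cumsum(rng.normal(size=10)),
                      np.cumsum(rng.normal(size=10))])   # estimated positions

steps = np.diff(xy, axis=0)                               # displacements between fixes
speed = np.hypot(steps[:, 0], steps[:, 1]) / np.diff(t)

headings = np.arctan2(steps[:, 1], steps[:, 0])
persistence = np.cos(np.diff(headings))                   # ~1 = directed, ~-1 = reversals

print("mean speed:", speed.mean())
print("mean directional persistence:", persistence.mean())
```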
Research and Development of Large Capacity CFB Boilers in TPRI
NASA Astrophysics Data System (ADS)
Xianbin, Sun; Minhua, Jiang
This paper presents an overview of advancements in circulating fluidized bed (CFB) technology at the Thermal Power Research Institute (TPRI), including key technologies, boiler configurations and progress in scaling up. To develop large CFB boilers, CFB combustion test facilities have been established, the key technologies of large-capacity CFB boilers have been researched systematically, and 100 MW to 330 MW CFB boilers have been developed and manufactured. The first domestically designed 100 MW and 210 MW CFB boilers have been put into commercial operation and show good operating performance. The domestic 330 MW CFB boiler demonstration project, an H-type CFB boiler with a compact heat exchanger, has also been put into commercial operation. This boiler is China's largest CFB boiler. The technical plan for a domestic 600 MW supercritical CFB boiler is also briefly introduced.
Dudley, Joel T; Listgarten, Jennifer; Stegle, Oliver; Brenner, Steven E; Parts, Leopold
2015-01-01
Advances in molecular profiling and sensor technologies are expanding the scope of personalized medicine beyond genotypes, providing new opportunities for developing richer and more dynamic multi-scale models of individual health. Recent studies demonstrate the value of scoring high-dimensional microbiome, immune, and metabolic traits from individuals to inform personalized medicine. Efforts to integrate multiple dimensions of clinical and molecular data towards predictive multi-scale models of individual health and wellness are already underway. Improved methods for mining and discovery of clinical phenotypes from electronic medical records and technological developments in wearable sensor technologies present new opportunities for mapping and exploring the critical yet poorly characterized "phenome" and "envirome" dimensions of personalized medicine. There are ambitious new projects underway to collect multi-scale molecular, sensor, clinical, behavioral, and environmental data streams from large population cohorts longitudinally to enable more comprehensive and dynamic models of individual biology and personalized health. Personalized medicine stands to benefit from inclusion of rich new sources and dimensions of data. However, realizing these improvements in care relies upon novel informatics methodologies, tools, and systems to make full use of these data to advance both the science and translational applications of personalized medicine.
Fabrication of the HIAD Large-Scale Demonstration Assembly and Upcoming Mission Applications
NASA Technical Reports Server (NTRS)
Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; Dinonno, J. M.; Cheatwood, F M.
2017-01-01
Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6 m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12 m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15 m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12 m Mars Human-Scale Pathfinder HIAD conceptual design that was constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61 m cross-section) and six subscale tori (0.25 m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner. In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12 m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15 m HIAD applications will also be discussed.
Cloud-enabled large-scale land surface model simulations with the NASA Land Information System
NASA Astrophysics Data System (ADS)
Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.
2017-12-01
Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that previously took weeks or months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.
Povey, Jane F; O'Malley, Christopher J; Root, Tracy; Martin, Elaine B; Montague, Gary A; Feary, Marc; Trim, Carol; Lang, Dietmar A; Alldread, Richard; Racher, Andrew J; Smales, C Mark
2014-08-20
Despite many advances in the generation of high producing recombinant mammalian cell lines over the last few decades, cell line selection and development is often slowed by the inability to predict a cell line's phenotypic characteristics (e.g. growth or recombinant protein productivity) at larger scale (large volume bioreactors) using data from early cell line construction at small culture scale. Here we describe the development of an intact cell MALDI-ToF mass spectrometry fingerprinting method for mammalian cells early in the cell line construction process whereby the resulting mass spectrometry data are used to predict the phenotype of mammalian cell lines at larger culture scale using a Partial Least Squares Discriminant Analysis (PLS-DA) model. Using MALDI-ToF mass spectrometry, a library of mass spectrometry fingerprints was generated for individual cell lines at the 96 deep well plate stage of cell line development. The growth and productivity of these cell lines were evaluated in a 10L bioreactor model of Lonza's large-scale (up to 20,000L) fed-batch cell culture processes. Using the mass spectrometry information at the 96 deep well plate stage and phenotype information at the 10L bioreactor scale a PLS-DA model was developed to predict the productivity of unknown cell lines at the 10L scale based upon their MALDI-ToF fingerprint at the 96 deep well plate scale. This approach provides the basis for the very early prediction of cell lines' performance in cGMP manufacturing-scale bioreactors and the foundation for methods and models for predicting other mammalian cell phenotypes from rapid, intact-cell mass spectrometry based measurements.
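The PLS-DA step described above can be illustrated with a minimal sketch using scikit-learn's PLSRegression on synthetic data; the array sizes, the two-class split, and the 0.5 decision threshold are illustrative assumptions, not values or methods taken from the study.

```python
# Hedged sketch: PLS-DA classification of cell lines from mass-spectral
# fingerprints. Data shapes and the two-class split are illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_lines, n_mz_bins = 60, 500                        # assumed: 60 cell lines, 500 m/z bins
X = rng.normal(size=(n_lines, n_mz_bins))           # stand-in MALDI-ToF fingerprints
y = (rng.random(n_lines) > 0.5).astype(int)         # 1 = high producer at bioreactor scale

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# PLS-DA = PLS regression against a dummy-coded class variable,
# followed by thresholding the predicted response.
pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
y_pred = (pls.predict(X_test).ravel() > 0.5).astype(int)
print(f"held-out accuracy: {(y_pred == y_test).mean():.2f}")
```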
NASA Applications of Molecular Nanotechnology
NASA Technical Reports Server (NTRS)
Globus, Al; Bailey, David; Han, Jie; Jaffe, Richard; Levit, Creon; Merkle, Ralph; Srivastava, Deepak
1998-01-01
Laboratories throughout the world are rapidly gaining atomically precise control over matter. As this control extends to an ever wider variety of materials, processes and devices, opportunities for applications relevant to NASA's missions will be created. This document surveys a number of future molecular nanotechnology capabilities of aerospace interest. Computer applications, launch vehicle improvements, and active materials appear to be of particular interest. We also list a number of applications for each of NASA's enterprises. If advanced molecular nanotechnology can be developed, almost all of NASA's endeavors will be radically improved. In particular, a sufficiently advanced molecular nanotechnology can arguably bring large scale space colonization within our grasp.
Advances in bioartificial liver assist devices.
Patzer, J F
2001-11-01
Rapid advances in development of bioartificial liver assist devices (BLADs) are exciting clinical interest in the application of BLAD technology for support of patients with acute liver failure. Four devices (Circe Biomedical HepatAssist, Vitagen ELAD, Gerlach BELS, and Excorp Medical BLSS) that rely on hepatocytes cultured in hollow-fiber membrane technology are currently in various stages of clinical evaluation. Several alternative approaches for culture and perfusion of hepatocytes have been evaluated in preclinical, large animal models of liver failure, or at a laboratory scale. Engineering design issues with respect to xenotransplantation, BLAD perfusion, hepatocyte functionality and culture maintenance, and ultimate distribution of a BLAD to a clinical site are delineated.
Wakefield Computations for the CLIC PETS using the Parallel Finite Element Time-Domain Code T3P
DOE Office of Scientific and Technical Information (OSTI.GOV)
Candel, A; Kabel, A.; Lee, L.
In recent years, SLAC's Advanced Computations Department (ACD) has developed the high-performance parallel 3D electromagnetic time-domain code, T3P, for simulations of wakefields and transients in complex accelerator structures. T3P is based on advanced higher-order Finite Element methods on unstructured grids with quadratic surface approximation. Optimized for large-scale parallel processing on leadership supercomputing facilities, T3P allows simulations of realistic 3D structures with unprecedented accuracy, aiding the design of the next generation of accelerator facilities. Applications to the Compact Linear Collider (CLIC) Power Extraction and Transfer Structure (PETS) are presented.
Recent advances on Fe- and Mn-based cathode materials for lithium and sodium ion batteries
NASA Astrophysics Data System (ADS)
Zhu, Xiaobo; Lin, Tongen; Manning, Eric; Zhang, Yuancheng; Yu, Mengmeng; Zuo, Bin; Wang, Lianzhou
2018-06-01
The ever-growing market for electrochemical energy storage drives advances in cost-effective and environmentally friendly battery chemistries. Lithium-ion batteries (LIBs) are currently the most critical energy storage devices for a variety of applications, while sodium-ion batteries (SIBs) are expected to complement LIBs in large-scale applications. Among their constituent components, the cathode is the most significant contributor to weight fraction and cost. Therefore, the development of cathode materials based on Earth-abundant elements (Fe and Mn) largely determines the prospects of these batteries. Herein, we offer a comprehensive review of recent advances in Fe- and Mn-based cathode materials for LIBs and SIBs, highlighting some promising candidates, such as Li- and Mn-rich layered oxides, LiNi0.5Mn1.5O4, LiFe1-xMnxPO4, NaxFeyMn1-yO2, Na4MnFe2(PO4)(P2O7), and Prussian blue analogs. Also, challenges and prospects are discussed to direct the possible development of cost-effective and high-performance cathode materials for future rechargeable batteries.
Next-Generation Proteomics and Its Application to Clinical Breast Cancer Research.
Mardamshina, Mariya; Geiger, Tamar
2017-10-01
Proteomics technology aims to map the protein landscapes of biological samples, and it can be applied to a variety of samples, including cells, tissues, and body fluids. Because proteins are the main functional molecules in cells, their levels reflect the cellular phenotype and its regulatory processes much more accurately than gene levels, mutations, and even mRNA levels. With advances in the technology, it is now possible to obtain comprehensive views of biological systems and to study large patient cohorts in a streamlined manner. In this review we discuss the technological advancements in mass spectrometry-based proteomics, which allow analysis of breast cancer tissue samples, leading to the first large-scale breast cancer proteomics studies. Furthermore, we discuss the technological developments in blood-based biomarker discovery, which provide the basis for future development of assays for routine clinical use. Although these are only the first steps in implementation of proteomics into the clinic, extensive collaborative work between these worlds will undoubtedly lead to major discoveries and advances in clinical practice.
Heavy hydrocarbon main injector technology
NASA Technical Reports Server (NTRS)
Fisher, S. C.; Arbit, H. A.
1988-01-01
One of the key components of the Advanced Launch System (ALS) is a large liquid-rocket booster engine. To keep the overall vehicle size and cost down, this engine will probably use liquid oxygen (LOX) and a heavy hydrocarbon, such as RP-1, as propellants and operate at relatively high chamber pressures to increase overall performance. A technology program (Heavy Hydrocarbon Main Injector Technology) is being conducted to address this need. The main objective of this effort is to develop a logic plan and supporting experimental data base to reduce the risk of developing a large-scale (approximately 750,000 lb thrust), high-performance main injector system. The overall approach and program plan, from initial analyses to large-scale, two-dimensional combustor design and test, and the current status of the program are discussed. Progress includes performance and stability analyses, cold flow tests of an injector model, design and fabrication of subscale injectors and calorimeter combustors for performance, heat transfer, and dynamic stability tests, and preparation of hot fire test plans. Related, current, high-pressure LOX/RP-1 injector technology efforts are also briefly discussed.
Wolfrum, Bernhard; Thierry, Benjamin
2018-01-01
Owing to their two-dimensional confinement, silicon nanowires display remarkable optical, magnetic, and electronic properties. Of special interest has been the development of advanced biosensing approaches based on the field effect associated with silicon nanowires (SiNWs). Recent advancements in top-down fabrication technologies have paved the way to large-scale production of high-density, high-quality arrays of SiNW field-effect transistors (FETs), a critical step towards their integration in real-life biosensing applications. A key requirement toward the fulfilment of SiNW FETs' promises in the bioanalytical field is their efficient integration within functional devices. Aiming to provide a comprehensive roadmap for the development of SiNW FET based sensing platforms, we critically review and discuss the key design and fabrication aspects relevant to their development and integration within complementary metal-oxide-semiconductor (CMOS) technology.
Photovoltaic Subcontract Program. Annual report, FY 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-03-01
This report summarizes the fiscal year (FY) 1992 progress of the subcontracted photovoltaic (PV) research and development (R&D) performed under the Photovoltaic Advanced Research and Development Project at the National Renewable Energy Laboratory (NREL), formerly the Solar Energy Research Institute (SERI). The mission of the national PV program is to develop PV technology for large-scale generation of economically competitive electric power in the United States. The technical sections of the report cover the main areas of the subcontract program: the Crystalline Materials and Advanced Concepts project, the Polycrystalline Thin Films project, the Amorphous Silicon Research project, the Photovoltaic Manufacturing Technology (PVMaT) project, the PV Module and System Performance and Engineering project, and the PV Analysis and Applications Development project. Technical summaries of each of the subcontracted programs provide a discussion of approaches, major accomplishments in FY 1992, and future research directions.
NASA Astrophysics Data System (ADS)
Arrigo, J. S.; Famiglietti, J. S.; Murdoch, L. C.; Lakshmi, V.; Hooper, R. P.
2012-12-01
The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) continues a major effort towards supporting Community Hydrologic Modeling. From 2009 to 2011, the Community Hydrologic Modeling Platform (CHyMP) initiative held three workshops, the ultimate goal of which was to produce recommendations and an implementation plan to establish a community modeling program that enables comprehensive simulation of water anywhere on the North American continent. Such an effort would include connections to and advances in global climate models, biogeochemistry, and efforts of other disciplines that require an understanding of water patterns and processes in the environment. To achieve such a vision will require substantial investment in human and cyber-infrastructure and significant advances in the science of hydrologic modeling and spatial scaling. CHyMP concluded with a final workshop, held March 2011, and produced several recommendations. CUAHSI and the university community continue to advance community modeling and implement these recommendations through several related and follow-on efforts. Key results from the final 2011 workshop included agreement among participants that the community is ready to move forward with implementation. It is recognized that initial implementation of this larger effort can begin with simulation capabilities that currently exist, or that can be easily developed. CHyMP identified four key activities in support of community modeling: benchmarking, dataset evaluation and development, platform evaluation, and developing a national water model framework. Key findings included: 1) The community supported the idea of a National Water Model framework; a community effort is needed to explore what the ultimate implementation of a National Water Model is. A true community modeling effort would support the modeling of "water anywhere" and would include all relevant scales and processes. 2) Implementation of a community modeling program could initially focus on continental scale modeling of water quantity (rather than quality). The goal of this initial model is the comprehensive description of water stores and fluxes in such a way as to permit linkage to GCMs, biogeochemical, ecological, and geomorphic models. This continental scale focus allows systematic evaluation of our current state of knowledge and data, leverages existing efforts done by large scale modelers, contributes to scientific discovery that informs globally and societally relevant questions, and provides an initial framework to evaluate hydrologic information relevant to other disciplines and a structure into which to incorporate other classes of hydrologic models. 3) Dataset development will be a key aspect of any successful national water model implementation. Our current knowledge of the subsurface is limiting our ability to truly integrate soil and groundwater into large scale models, and to answer critical science questions with societal relevance (e.g., groundwater's influence on climate). 4) The CHyMP workshops and efforts to date have achieved collaboration between university scientists, government agencies and the private sector that must be maintained. Follow-on efforts in community modeling should aim at leveraging and maintaining this collaboration for maximum scientific and societal benefit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendon, Vrushali V.; Taylor, Zachary T.
Recent advances in residential building energy efficiency and codes have resulted in increased interest in detailed residential building energy models using the latest energy simulation software. One of the challenges of developing residential building models to characterize new residential building stock is to allow for flexibility to address variability in house features such as geometry, configuration, and HVAC systems. Researchers solved this problem in a novel way by creating a simulation structure capable of generating fully functional EnergyPlus batch runs using a completely scalable residential EnergyPlus template system. This system was used to create a set of thirty-two residential prototype building models covering single- and multifamily buildings, four common foundation types, and four common heating system types found in the United States (US). A weighting scheme with detailed state-wise and national weighting factors was designed to supplement the residential prototype models. The complete set is designed to represent a majority of new residential construction stock. The entire structure consists of a system of utility programs developed around the core EnergyPlus simulation engine to automate the creation and management of large-scale simulation studies with minimal human effort. The simulation structure and the residential prototype building models have been used for numerous large-scale studies, one of which is briefly discussed in this paper.
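The thirty-two-model matrix described above is the cross product of the building, foundation, and heating-system categories (2 x 4 x 4). A minimal sketch of how such a batch could be enumerated for a template-driven simulation driver follows; the category labels and the commented-out run_simulation call are hypothetical stand-ins, not the actual tooling described in the paper.

```python
# Hedged sketch: enumerate the 2 x 4 x 4 = 32 prototype combinations and
# dispatch each to a (hypothetical) EnergyPlus batch-run helper.
from itertools import product

building_types = ["single-family", "multifamily"]
foundations = ["slab", "crawlspace", "heated-basement", "unheated-basement"]      # assumed labels
heating_systems = ["gas-furnace", "electric-resistance", "heat-pump", "oil-furnace"]  # assumed labels

prototypes = list(product(building_types, foundations, heating_systems))
assert len(prototypes) == 32

for btype, foundation, heating in prototypes:
    run_name = f"{btype}_{foundation}_{heating}"
    # run_simulation() stands in for the template-driven EnergyPlus batch call:
    # run_simulation(template="residential.idf", name=run_name)
    print(run_name)
```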
NASA Astrophysics Data System (ADS)
Massei, Nicolas; Labat, David; Jourde, Hervé; Lecoq, Nicolas; Mazzilli, Naomi
2017-04-01
The French karst observatory network SNO KARST is a national initiative of the National Institute for Earth Sciences and Astronomy (INSU) of the National Center for Scientific Research (CNRS). It is also part of the new French research infrastructure for the observation of the critical zone, OZCAR. SNO KARST is composed of several karst sites distributed over conterminous France, located in different physiographic and climatic contexts (Mediterranean, Pyrenean, Jura mountains, western and northwestern shores near the Atlantic or the English Channel). This allows the scientific community to develop advanced research and experiments dedicated to improving understanding of the hydrological functioning of karst catchments. Here we used several sites of SNO KARST to assess the hydrological response of karst catchments to long-term variation of large-scale atmospheric circulation. Using NCEP reanalysis products and karst discharge, we analyzed the links between large-scale circulation and karst water resources variability. As karst hydrosystems are highly heterogeneous media, they behave differently across different time scales: we explore the large-scale/local-scale relationships according to time scale using a wavelet multiresolution approach applied to both karst hydrological variables and large-scale climate fields such as sea level pressure (SLP). The different wavelet components of karst discharge in response to the corresponding wavelet components of climate fields are either 1) compared to physico-chemical/geochemical responses at karst springs, or 2) interpreted in terms of hydrological functioning by comparing discharge wavelet components to internal components obtained from precipitation/discharge models using the KARSTMOD conceptual modeling platform of SNO KARST.
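A minimal sketch of the scale-by-scale comparison described above, using PyWavelets to split a discharge series and a single climate series into wavelet detail components and correlating them component by component; the synthetic series, wavelet choice, and decomposition level are assumptions, not SNO KARST data or the KARSTMOD workflow.

```python
# Hedged sketch: scale-by-scale comparison of karst spring discharge and a
# large-scale climate index (e.g. one SLP grid point) via a simple wavelet MRA.
import numpy as np
import pywt

rng = np.random.default_rng(1)
n = 2048                                    # assumed daily record length
discharge = np.cumsum(rng.normal(size=n))   # stand-in discharge series
slp = np.cumsum(rng.normal(size=n))         # stand-in sea-level-pressure series

def mra(signal, wavelet="db4", level=6):
    """Return per-scale detail reconstructions (a basic multiresolution analysis)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    components = []
    for i in range(1, len(coeffs)):
        keep = [np.zeros_like(c) for c in coeffs]
        keep[i] = coeffs[i]                 # keep only one detail band at a time
        components.append(pywt.waverec(keep, wavelet)[: len(signal)])
    return components

for k, (dq, dp) in enumerate(zip(mra(discharge), mra(slp)), start=1):
    r = np.corrcoef(dq, dp)[0, 1]
    print(f"detail component {k} (coarse to fine): correlation {r:+.2f}")
```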
Wen, X.; Datta, A.; Traverso, L. M.; Pan, L.; Xu, X.; Moon, E. E.
2015-01-01
Optical lithography, the enabling process for defining features, has been widely used in the semiconductor industry and many other nanotechnology applications. Advances in nanotechnology require the development of high-throughput optical lithography capabilities to overcome the optical diffraction limit and meet ever-decreasing device dimensions. We report our recent experimental advances in scaling up diffraction-unlimited optical lithography massively, using the near-field nanolithography capabilities of bowtie apertures. A record number of near-field optical elements, an array of 1,024 bowtie antenna apertures, are simultaneously employed to generate a large number of patterns by carefully controlling their working distances over the entire array using an optical gap metrology system. Our experimental results reiterated the ability of using massively parallel near-field devices to achieve high-throughput optical nanolithography, which can be promising for many important nanotechnology applications such as computation, data storage, communication, and energy.
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
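A minimal sketch of the RESTful pattern the framework relies on: a small HTTP endpoint that performs a server-side reduction and returns JSON to a browser client. The endpoint path, the stand-in random field, and the use of Flask are illustrative assumptions, not the framework's actual API.

```python
# Hedged sketch: a tiny REST endpoint of the kind a web visual-analytics
# client could call, so that data reduction happens server-side near the data.
from flask import Flask, jsonify, request
import numpy as np

app = Flask(__name__)

@app.route("/api/v1/zonal_mean")
def zonal_mean():
    # A real deployment would read CLM/CAM output; a random field stands in
    # here so the sketch is self-contained.
    variable = request.args.get("variable", "TSA")
    field = np.random.default_rng(0).normal(size=(192, 288))   # lat x lon
    return jsonify({"variable": variable,
                    "zonal_mean": field.mean(axis=1).tolist()})

if __name__ == "__main__":
    app.run(port=8080)
```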
NASA Astrophysics Data System (ADS)
Guiquan, Xi; Lin, Cong; Xuehui, Jin
2018-05-01
As an important platform for scientific and technological development, large-scale scientific facilities are a cornerstone of technological innovation and a guarantee of economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology, and key national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation, and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.
Household Energy Consumption Segmentation Using Hourly Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwac, J; Flora, J; Rajagopal, R
2014-01-01
The increasing US deployment of residential advanced metering infrastructure (AMI) has made hourly energy consumption data widely available. Using CA smart meter data, we investigate a household electricity segmentation methodology that uses an encoding system with a pre-processed load shape dictionary. Structured approaches using features derived from the encoded data drive five sample program and policy relevant energy lifestyle segmentation strategies. We also ensure that the methodologies developed scale to large data sets.
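A minimal sketch of the encoding idea described above: cluster normalized daily load shapes into a small dictionary with k-means and represent each household-day by its nearest dictionary entry. The synthetic consumption data, cluster count, and normalization are assumptions, not the CA smart-meter pipeline itself.

```python
# Hedged sketch: build a load-shape dictionary from hourly smart-meter data
# and encode each household-day as the index of its closest dictionary shape.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
n_days, n_hours = 5000, 24
loads = rng.gamma(shape=2.0, scale=1.0, size=(n_days, n_hours))   # kWh per hour

# Normalize each day to unit total so clustering sees shape, not magnitude.
shapes = loads / loads.sum(axis=1, keepdims=True)

dictionary = KMeans(n_clusters=16, n_init=10, random_state=0).fit(shapes)
codes = dictionary.predict(shapes)        # one dictionary index per household-day

# Per-household features (e.g. code frequencies) would then drive segmentation.
print("dictionary shape:", dictionary.cluster_centers_.shape)    # (16, 24)
print("first ten encoded days:", codes[:10])
```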
Integrated analysis of remote sensing products from basic geological surveys. [Brazil
NASA Technical Reports Server (NTRS)
Dasilvafagundesfilho, E. (Principal Investigator)
1984-01-01
Recent advances in remote sensing have led to the development of several techniques for obtaining image information. These techniques are analyzed as effective tools in geological mapping. A strategy for optimizing the use of the images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing large areas to be surveyed in a fast, precise, and low-cost manner.
ERIC Educational Resources Information Center
Steiner-Khamsi, Gita; Appleton, Margaret; Vellani, Shezleen
2018-01-01
The media analysis is situated in the larger body of studies that explore the varied reasons why different policy actors advocate for international large-scale student assessments (ILSAs) and adds to the research on the fast advance of the global education industry. The analysis of "The Economist," "Financial Times," and…
A Brief Description of the Kokkos implementation of the SNAP potential in ExaMiniMD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Aidan P.; Trott, Christian Robert
2017-11-01
Within the EXAALT project, the SNAP [1] approach is being used to develop high accuracy potentials for use in large-scale long-time molecular dynamics simulations of materials behavior. In particular, we have developed a new SNAP potential that is suitable for describing the interplay between helium atoms and vacancies in high-temperature tungsten [2]. This model is now being used to study plasma-surface interactions in nuclear fusion reactors for energy production. The high accuracy of SNAP potentials comes at the price of increased computational cost per atom and increased computational complexity. The increased cost is mitigated by improvements in strong scaling that can be achieved using advanced algorithms [3].
NASA Technical Reports Server (NTRS)
Shivers, J. P.; Mclemore, H. C.; Coe, P. L., Jr.
1976-01-01
Tests have been conducted in a full scale tunnel to determine the low speed aerodynamic characteristics of a large scale advanced arrow wing supersonic transport configuration with engines mounted above the wing for upper surface blowing. Tests were made over an angle of attack range of -10 deg to 32 deg, sideslip angles of + or - 5 deg, and a Reynolds number range of 3,530,000 to 7,330,000. Configuration variables included trailing edge flap deflection, engine jet nozzle angle, engine thrust coefficient, engine out operation, and asymmetrical trailing edge boundary layer control for providing roll trim. Downwash measurements at the tail were obtained for different thrust coefficients, tail heights, and at two fuselage stations.
A dynamical systems approach to studying midlatitude weather extremes
NASA Astrophysics Data System (ADS)
Messori, Gabriele; Caballero, Rodrigo; Faranda, Davide
2017-04-01
Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. The ability to predict these events is therefore a topic of crucial importance. Here we propose a novel predictability pathway for extreme events, by building upon recent advances in dynamical systems theory. We show that simple dynamical systems metrics can be used to identify sets of large-scale atmospheric flow patterns with similar spatial structure and temporal evolution on time scales of several days to a week. In regions where these patterns favor extreme weather, they afford a particularly good predictability of the extremes. We specifically test this technique on the atmospheric circulation in the North Atlantic region, where it provides predictability of large-scale wintertime surface temperature extremes in Europe up to 1 week in advance.
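A simplified stand-in for the kind of recurrence-based diagnostic described above: count, for each day, how many other days show a closely matching large-scale pattern. This is not the extreme-value-theory estimator used in the study; the synthetic SLP fields, grid size, and analogue radius are illustrative assumptions.

```python
# Hedged sketch: a simple recurrence count for atmospheric flow patterns.
# Days whose pattern has many close analogues are candidates for the enhanced
# predictability discussed above.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(3)
n_days, n_grid = 1000, 400                   # assumed: daily SLP maps on a 20x20 grid
slp = rng.normal(size=(n_days, n_grid))      # stand-in flattened SLP fields

dists = cdist(slp, slp)                      # pairwise distances between daily patterns
np.fill_diagonal(dists, np.inf)              # ignore self-matches

radius = np.quantile(dists[np.isfinite(dists)], 0.02)   # assumed analogue radius
n_analogues = (dists < radius).sum(axis=1)
print("most recurrent day index:", int(n_analogues.argmax()))
```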
Advanced Hypervelocity Aerophysics Facility Workshop
NASA Technical Reports Server (NTRS)
Witcofski, Robert D. (Compiler); Scallion, William I. (Compiler)
1989-01-01
The primary objective of the workshop was to obtain a critical assessment of a concept for a large, advanced hypervelocity ballistic range test facility powered by an electromagnetic launcher, which was proposed by the Langley Research Center. It was concluded that the subject large-scale facility was feasible and would provide the required ground-based capability for performing tests at entry flight conditions (velocity and density) on large, complex, instrumented models. It was also concluded that advances in remote measurement techniques and particularly onboard model instrumentation, light-weight model construction techniques, and model electromagnetic launcher (EML) systems must be made before any commitment for the construction of such a facility can be made.
Winfree, Seth; Dagher, Pierre C; Dunn, Kenneth W; Eadon, Michael T; Ferkowicz, Michael; Barwinska, Daria; Kelly, Katherine J; Sutton, Timothy A; El-Achkar, Tarek M
2018-06-05
Kidney biopsy remains the gold standard for uncovering the pathogenesis of acute and chronic kidney diseases. However, the ability to perform high resolution, quantitative, molecular and cellular interrogation of this precious tissue is still at a developing stage compared to other fields such as oncology. Here, we discuss recent advances in performing large-scale, three-dimensional (3D), multi-fluorescence imaging of kidney biopsies and quantitative analysis referred to as 3D tissue cytometry. This approach allows the accurate measurement of specific cell types and their spatial distribution in a thick section spanning the entire length of the biopsy. By uncovering specific disease signatures, including rare occurrences, and linking them to the biology in situ, this approach will enhance our understanding of disease pathogenesis. Furthermore, by providing accurate quantitation of cellular events, 3D cytometry may improve the accuracy of prognosticating the clinical course and response to therapy. Therefore, large-scale 3D imaging and cytometry of kidney biopsy is poised to become a bridge towards personalized medicine for patients with kidney disease.
Advanced Aerospace Materials by Design
NASA Technical Reports Server (NTRS)
Srivastava, Deepak; Djomehri, Jahed; Wei, Chen-Yu
2004-01-01
Advances in the emerging field of nanophase thermal and structural composite materials; materials with embedded sensors and actuators for morphing structures; light-weight composite materials for energy and power storage; and large surface area materials for in-situ resource generation and waste recycling are expected to revolutionize the capabilities of virtually every system comprising future robotic and human Moon and Mars exploration missions. A high-performance multiscale simulation platform, including the computational capabilities and resources of Columbia, the new supercomputer, is being developed to discover, validate, and prototype the next generation of such advanced materials. This exhibit will describe the porting and scaling of multiscale physics-based core computer simulation codes for discovering and designing carbon nanotube-polymer composite materials for lightweight load-bearing structural and thermal protection applications.
Interleaved arrays antenna technology development
NASA Technical Reports Server (NTRS)
1986-01-01
Phases one and two of a program to further develop and investigate advanced graphite-epoxy waveguides, radiators, and components with application to space antennas are discussed. The objectives of the two phases were to demonstrate the mechanical integrity of a small panel of radiators and parts procured under a previous contract and to develop alternate designs and applications of the technology. Most of the emphasis was on the assembly and test of a 5 x 5 element module. This effort was supported by evaluation of adhesives and waveguide joint configurations. The evaluation and final assembly considered not only mechanical performance but also producibility at large scale.
NASA Astrophysics Data System (ADS)
Guo, Jie; Zhu, Chang'an
2016-01-01
The development of optics and computer technologies enables the application of vision-based techniques that use digital cameras to the displacement measurement of large-scale structures. Compared with traditional contact measurements, the vision-based technique allows for remote measurement, is non-intrusive, and introduces no additional mass to the structure. In this study, a high-speed camera system is developed to complete the displacement measurement in real time. The system consists of a high-speed camera and a notebook computer. The high-speed camera can capture images at a speed of hundreds of frames per second. To process the captured images on the computer, the Lucas-Kanade template tracking algorithm from the field of computer vision is introduced. Additionally, a modified inverse compositional algorithm is proposed to reduce the computing time of the original algorithm and further improve efficiency. The modified algorithm can accomplish one displacement extraction within 1 ms without having to install any pre-designed target panel on the structure in advance. The accuracy and efficiency of the system in the remote measurement of dynamic displacement are demonstrated in experiments on a motion platform and on a sound barrier on a suspension viaduct. Experimental results show that the proposed algorithm can extract accurate displacement signals and accomplish the vibration measurement of large-scale structures.
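A minimal sketch of vision-based displacement extraction using the standard pyramidal Lucas-Kanade tracker in OpenCV (not the modified inverse compositional algorithm proposed in the paper); the synthetic frames, tracked corner, and window parameters are illustrative assumptions.

```python
# Hedged sketch: track one point through synthetic frames with pyramidal
# Lucas-Kanade optical flow and record its vertical displacement in pixels.
# The synthetic "structure" (a bright square oscillating vertically) stands in
# for camera frames; a calibrated scale factor would convert pixels to mm.
import cv2
import numpy as np

def make_frame(t, size=240):
    """Synthetic grayscale frame: a bright square whose centre oscillates vertically."""
    frame = np.zeros((size, size), dtype=np.uint8)
    cy = int(size / 2 + 20 * np.sin(0.2 * t))
    frame[cy - 10:cy + 10, 110:130] = 255
    return frame

prev = make_frame(0)
point = np.array([[[110.0, 110.0]]], dtype=np.float32)   # track the square's top-left corner
y0 = point[0, 0, 1]
displacement = []

for t in range(1, 100):
    frame = make_frame(t)
    new_point, status, _ = cv2.calcOpticalFlowPyrLK(
        prev, frame, point, None, winSize=(31, 31), maxLevel=3)
    if status[0, 0] == 1:
        displacement.append(float(new_point[0, 0, 1] - y0))
        prev, point = frame, new_point

print("frames tracked:", len(displacement))
if displacement:
    print("peak-to-peak displacement (px):", max(displacement) - min(displacement))
```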
Post-Cold War Science and Technology at Los Alamos
NASA Astrophysics Data System (ADS)
Browne, John C.
2002-04-01
Los Alamos National Laboratory serves the nation through the development and application of leading-edge science and technology in support of national security. Our mission supports national security by: ensuring the safety, security, and reliability of the U.S. nuclear stockpile; reducing the threat of weapons of mass destruction in support of counterterrorism and homeland defense; and solving national energy, environment, infrastructure, and health security problems. We require crosscutting fundamental and advanced science and technology research to accomplish our mission. The Stockpile Stewardship Program develops and applies advanced experimental science, computational simulation, and technology to ensure the safety and reliability of U.S. nuclear weapons in the absence of nuclear testing. This effort in itself is a grand challenge. However, the terrorist attack of September 11, 2001, reminded us of the importance of robust and vibrant research and development capabilities to meet new and evolving threats to our national security. Today, through rapid prototyping, we are applying new, innovative science and technology for homeland defense to address the threats of nuclear, chemical, and biological weapons globally. Synergistically with the capabilities that we require for our core mission, we contribute in many other areas of scientific endeavor. For example, our Laboratory has been part of the NASA effort on mapping water on the Moon and NSF/DOE projects studying high-energy astrophysical phenomena, understanding fundamental scaling phenomena of life, exploring high-temperature superconductors, investigating quantum information systems, applying neutrons to condensed-matter and nuclear physics research, developing large-scale modeling and simulations to understand complex phenomena, and exploring nanoscience that bridges the atomic to macroscopic scales. In this presentation, I will highlight some of these post-Cold War science and technology advances, including our national security contributions, and discuss some of the challenges for Los Alamos in the future.
Christensen, A. J.; Srinivasan, V.; Hart, J. C.; ...
2018-03-17
Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in “big data” analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. Lastly, this survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields.
NASA Technical Reports Server (NTRS)
Tri, Terry O.
1999-01-01
As a key component in its ground test bed capability, NASA's Advanced Life Support Program has been developing a large-scale advanced life support test facility capable of supporting long-duration evaluations of integrated bioregenerative life support systems with human test crews. This facility, targeted for evaluation of hypogravity-compatible life support systems to be developed for use on planetary surfaces such as Mars or the Moon, is called the Bioregenerative Planetary Life Support Systems Test Complex (BIO-Plex) and is currently under development at the Johnson Space Center. This test bed comprises a set of interconnected chambers with a sealed internal environment, outfitted with systems capable of supporting test crews of four individuals for periods exceeding one year. The advanced technology systems to be tested will consist of both biological and physicochemical components and will perform all required crew life support functions. This presentation provides a description of the proposed test "missions" to be supported by the BIO-Plex and the planned development strategy for the facility.
Potential Collaborative Research topics with Korea’s Agency for Defense Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrar, Charles R.; Todd, Michael D.
2012-08-23
This presentation provides a high-level summary of current research activities at the Engineering Institute, a collaboration between Los Alamos National Laboratory (LANL) and the University of California San Diego (UCSD) Jacobs School of Engineering, that will be presented at Korea's Agency for Defense Development (ADD). These research activities are at the basic engineering science level, with levels of maturity ranging from initial concepts to field proof-of-concept demonstrations. We believe that all of these activities are appropriate for collaborative research with ADD, subject to approval by each institution. All the activities summarized herein have the common theme that they are multi-disciplinary in nature and typically involve the integration of high-fidelity predictive modeling, advanced sensing technologies, and new developments in information technology. These activities include: Wireless Sensor Systems, Swarming Robot sensor systems, Advanced signal processing (compressed sensing) and pattern recognition, Model Verification and Validation, Optimal/robust sensor system design, Haptic systems for large-scale data processing, Cyber-physical security for robots, Multi-source energy harvesting, Reliability-based approaches to damage prognosis, SHMTools software development, and Cyber-physical systems advanced study institute.
Cryogenics for superconductors: Refrigeration, delivery, and preservation of the cold
NASA Astrophysics Data System (ADS)
Ganni, Venkatarao; Fesmire, James
2012-06-01
Applications in superconductivity have become widespread, enabled by advancements in cryogenic engineering. In this paper, the history of cryogenic refrigeration, its delivery, its preservation, and the important scientific and engineering advancements in these areas in the last 100 years will be reviewed, beginning with small laboratory dewars and progressing to very large-scale systems. The key technological advancements in these areas that enabled the development of superconducting applications at temperatures from 4 to 77 K are identified. Included are advancements in the components used up to the present state-of-the-art in refrigeration systems design. Viewpoints of both the equipment supplier and the end-user with regard to equipment design and operations will be presented. Some of the present and future challenges in these areas will be outlined. Most of the materials in this paper are a collection of the historical materials applicable to these areas of interest.
Basu, Sumanta; Duren, William; Evans, Charles R; Burant, Charles F; Michailidis, George; Karnovsky, Alla
2017-05-15
Recent technological advances in mass spectrometry, the development of richer mass spectral libraries, and data processing tools have enabled large-scale metabolic profiling. Biological interpretation of metabolomics studies heavily relies on knowledge-based tools that contain information about metabolic pathways. Incomplete coverage of different areas of metabolism and lack of information about non-canonical connections between metabolites limit the scope of applications of such tools. Furthermore, the presence of a large number of unknown features, which cannot be readily identified but nonetheless can represent bona fide compounds, also considerably complicates biological interpretation of the data. Leveraging recent developments in the statistical analysis of high-dimensional data, we developed a new Debiased Sparse Partial Correlation algorithm (DSPC) for estimating partial correlation networks and implemented it as a Java-based CorrelationCalculator program. We also introduce a new version of our previously developed tool Metscape that enables building and visualization of correlation networks. We demonstrate the utility of these tools by constructing biologically relevant networks and in aiding identification of unknown compounds. The tools are available at http://metscape.med.umich.edu.
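A minimal sketch of estimating a partial-correlation network from a metabolite abundance matrix, here via a graphical-lasso precision estimate converted to partial correlations; this is a generic stand-in, not the DSPC algorithm or the CorrelationCalculator/Metscape tooling, and the data shapes and threshold are assumptions.

```python
# Hedged sketch: estimate a sparse partial-correlation network from a
# metabolite abundance matrix (samples x metabolites) via the graphical lasso.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(4)
n_samples, n_metabolites = 120, 30           # assumed study size
X = rng.normal(size=(n_samples, n_metabolites))

model = GraphicalLasso(alpha=0.2).fit(X)
precision = model.precision_

# Partial correlation between metabolites i and j from the precision matrix.
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

# Keep only the stronger conditional dependencies as network edges.
edges = [(i, j, partial_corr[i, j])
         for i in range(n_metabolites) for j in range(i + 1, n_metabolites)
         if abs(partial_corr[i, j]) > 0.1]
print(f"{len(edges)} edges retained")
```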
Characterization of the Temperature Capabilities of Advanced Disk Alloy ME3
NASA Technical Reports Server (NTRS)
Gabb, Timothy P.; Telesman, Jack; Kantzos, Peter T.; O'Connor, Kenneth
2002-01-01
The successful development of an advanced powder metallurgy disk alloy, ME3, was initiated in the NASA High Speed Research/Enabling Propulsion Materials (HSR/EPM) Compressor/Turbine Disk program in cooperation with General Electric Engine Company and Pratt & Whitney Aircraft Engines. This alloy was designed using statistical screening and optimization of composition and processing variables to have extended durability at 1200 F in large disks. Disks of this alloy were produced at the conclusion of the program using a realistic scaled-up disk shape and processing to enable demonstration of these properties. The objective of the Ultra-Efficient Engine Technologies disk program was to assess the mechanical properties of these ME3 disks as functions of temperature in order to estimate the maximum temperature capabilities of this advanced alloy. These disks were sectioned, machined into specimens, and extensively tested. Additional sub-scale disks and blanks were processed and selectively tested to explore the effects of several processing variations on mechanical properties. Results indicate the baseline ME3 alloy and process can produce 1300 to 1350 F temperature capabilities, dependent on detailed disk and engine design property requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, Daniel
8-Session Symposium on STRUCTURE AND DYNAMICS IN COMPLEX CHEMICAL SYSTEMS: GAINING NEW INSIGHTS THROUGH RECENT ADVANCES IN TIME-RESOLVED SPECTROSCOPIES. The intricacy of most chemical, biochemical, and material processes and their applications is underscored by the complex nature of the environments in which they occur. Substantial challenges for building a global understanding of a heterogeneous system include (1) identifying unique signatures associated with specific structural motifs within the heterogeneous distribution, and (2) resolving the significance of each of multiple time scales involved in both small- and large-scale nuclear reorganization. This symposium focuses on the progress in our understanding of dynamics in complex systems driven by recent innovations in time-resolved spectroscopies and theoretical developments. Such advancement is critical for driving discovery at the molecular level and facilitating new applications. Broad areas of interest include: structural relaxation and the impact of structure on dynamics in liquids, interfaces, biochemical systems, materials, and other heterogeneous environments.
Advances in optical structure systems; Proceedings of the Meeting, Orlando, FL, Apr. 16-19, 1990
NASA Astrophysics Data System (ADS)
Breakwell, John; Genberg, Victor L.; Krumweide, Gary C.
Various papers on advances in optical structure systems are presented. Individual topics addressed include: beam pathlength optimization, thermal stress in glass/metal bond with PR 1578 adhesive, structural and optical properties for typical solid mirror shapes, parametric study of spinning polygon mirror deformations, simulation of small structures-optics-controls system, spatial PSDs of optical structures due to random vibration, mountings for a four-meter glass mirror, fast-steering mirrors in optical control systems, adaptive state estimation for control of flexible structures, surface control techniques for large segmented mirrors, two-time-scale control designs for large flexible structures, closed-loop dynamic shape control of a flexible beam. Also discussed are: inertially referenced pointing for body-fixed payloads, sensor blending line-of-sight stabilization, controls/optics/structures simulation development, transfer functions for piezoelectric control of a flexible beam, active control experiments for large-optics vibration alleviation, composite structures for a large-optical test bed, graphite/epoxy composite mirror for beam-steering applications, composite structures for optical-mirror applications, thin carbon-fiber prepregs for dimensionally critical structures.
How Large Scale Flows in the Solar Convection Zone may Influence Solar Activity
NASA Technical Reports Server (NTRS)
Hathaway, D. H.
2004-01-01
Large scale flows within the solar convection zone are the primary drivers of the Sun's magnetic activity cycle. Differential rotation can amplify the magnetic field and convert poloidal fields into toroidal fields. Poleward meridional flow near the surface can carry magnetic flux that reverses the magnetic poles and can convert toroidal fields into poloidal fields. The deeper, equatorward meridional flow can carry magnetic flux toward the equator where it can reconnect with oppositely directed fields in the other hemisphere. These axisymmetric flows are themselves driven by large scale convective motions. The effects of the Sun's rotation on convection produce velocity correlations that can maintain the differential rotation and meridional circulation. These convective motions can influence solar activity themselves by shaping the large-scale magnetic field pattern. While considerable theoretical advances have been made toward understanding these large scale flows, outstanding problems in matching theory to observations still remain.
Recent developments in microfluidic large scale integration.
Araci, Ismail Emre; Brisk, Philip
2014-02-01
In 2002, Thorsen et al. integrated thousands of micromechanical valves on a single microfluidic chip and demonstrated that control of the fluidic networks can be simplified through multiplexors [1]. This enabled the realization of highly parallel and automated fluidic processes with a substantial sample-economy advantage. Moreover, the fabrication of these devices by multilayer soft lithography was easy and reliable, which contributed to the power of the technology: microfluidic large scale integration (mLSI). Since then, mLSI has found use in a wide variety of applications in biology and chemistry. In the meantime, efforts to improve the technology have been ongoing. These efforts mostly focus on novel materials, components, micromechanical valve actuation methods, and chip architectures for mLSI. In this review, these technological advances are discussed and recent examples of mLSI applications are summarized. Copyright © 2013 Elsevier Ltd. All rights reserved.
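The multiplexing idea referenced above scales logarithmically: a binary multiplexer can individually address N flow channels with 2*log2(N) control lines, because each address bit needs one "bit" line and one "complement" line. The sketch below is a minimal illustration of that addressing scheme; the line names are assumptions for illustration, not taken from the review:

```python
# Minimal sketch of binary multiplexer addressing in microfluidic LSI.
# Channel counts and control-line names are illustrative assumptions.
import math

def control_lines_needed(n_flow_channels: int) -> int:
    """Control lines required to individually address n flow channels."""
    return 2 * math.ceil(math.log2(n_flow_channels))

def lines_to_pressurize(channel: int, n_flow_channels: int) -> list[str]:
    """Control lines pressurized so that only `channel` stays open.

    For each address bit, pressurize the line that closes every channel whose
    bit differs from the selected channel's bit."""
    bits = math.ceil(math.log2(n_flow_channels))
    return [f"close_bit{b}={1 - ((channel >> b) & 1)}" for b in range(bits)]

if __name__ == "__main__":
    print(control_lines_needed(1024))       # 20 control lines suffice for 1024 channels
    print(lines_to_pressurize(5, 1024))     # lines actuated to isolate channel 5
```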
Marrone, Babetta L.; Lacey, Ronald E.; Anderson, Daniel B.; ...
2017-08-07
Energy-efficient and scalable harvesting and lipid extraction processes must be developed in order for the algal biofuels and bioproducts industry to thrive. The major challenge for harvesting is the handling of large volumes of cultivation water to concentrate low amounts of biomass. For lipid extraction, the major energy and cost drivers are associated with disrupting the algae cell wall and drying the biomass before solvent extraction of the lipids. Here we review the research and development conducted by the Harvesting and Extraction Team during the 3-year National Alliance for Advanced Biofuels and Bioproducts (NAABB) algal consortium project. The harvesting and extraction team investigated five harvesting and three wet extraction technologies at lab bench scale for effectiveness, and conducted a technoeconomic study to evaluate their costs and energy efficiency compared to available baseline technologies. Based on this study, three harvesting technologies were selected for further study at larger scale. We evaluated the selected harvesting technologies (electrocoagulation, membrane filtration, and ultrasonic harvesting) in a field study at a minimum scale of 100 L/h. None of the extraction technologies were determined to be ready for scale-up; therefore, an emerging extraction technology (wet solvent extraction) was selected from industry to provide scale-up data and capabilities to produce lipid and lipid-extracted materials for the NAABB program. One specialized extraction/adsorption technology was developed that showed promise for recovering high-value co-products from lipid extracts. Overall, the NAABB Harvesting and Extraction Team improved the readiness level of several innovative, energy-efficient technologies to integrate with algae production processes and captured valuable lessons learned about scale-up challenges.
NASA Technical Reports Server (NTRS)
Fijany, Amir; Collier, James B.; Citak, Ari
1997-01-01
A team of the US Army Corps of Engineers (Omaha District and Engineering and Support Center, Huntsville), the Jet Propulsion Laboratory (JPL), Stanford Research Institute (SRI), and Montgomery Watson is currently in the process of planning and conducting the largest-ever survey at the Former Buckley Field (60,000 acres) in Colorado, using SRI's airborne, ground-penetrating, Synthetic Aperture Radar (SAR). The purpose of this survey is the detection of surface and subsurface Unexploded Ordnance (UXO) and, in a broader sense, site characterization for identification of contaminated as well as clear areas. In preparation for such a large-scale survey, JPL has been developing advanced algorithms and a high-performance testbed for processing of the massive amount of SAR data expected from this site. Two key requirements of this project are the accuracy (in terms of UXO detection) and speed of SAR data processing. The first key feature of this testbed is a large degree of automation and a minimal need for human interpretation in the processing, to achieve an acceptable processing rate of several hundred acres per day. For accurate UXO detection, novel algorithms have been developed and implemented. These algorithms analyze dual-polarized (HH and VV) SAR data. They are based on the correlation of HH and VV SAR data and involve a rather large set of parameters for accurate detection of UXO. For each specific site, this set of parameters can be optimized by using ground truth data (i.e., known surface and subsurface UXOs). In this paper, we discuss these algorithms and their successful application for detection of surface and subsurface anti-tank mines using a data set from Yuma Proving Ground, AZ, acquired by the SRI SAR.
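The abstract does not give the detection algorithm itself; the sketch below only illustrates the generic idea of flagging pixels where the dual-polarized returns are strongly correlated. The window size, threshold, and function names are assumptions, not the JPL implementation:

```python
# Illustrative sketch only: local HH/VV correlation as a detection cue.
# The actual JPL algorithms and their parameter set are not described in the abstract.
import numpy as np
from scipy.ndimage import uniform_filter

def local_correlation(hh: np.ndarray, vv: np.ndarray, win: int = 9) -> np.ndarray:
    """Sliding-window Pearson correlation between HH and VV magnitude images."""
    mean_hh = uniform_filter(hh, win)
    mean_vv = uniform_filter(vv, win)
    cov = uniform_filter(hh * vv, win) - mean_hh * mean_vv
    var_hh = uniform_filter(hh * hh, win) - mean_hh ** 2
    var_vv = uniform_filter(vv * vv, win) - mean_vv ** 2
    return cov / np.sqrt(np.clip(var_hh * var_vv, 1e-12, None))

def detect_candidates(hh: np.ndarray, vv: np.ndarray, corr_threshold: float = 0.8):
    """Boolean mask of pixels whose HH/VV returns are strongly correlated."""
    return local_correlation(hh.astype(float), vv.astype(float)) > corr_threshold
```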
Daniel C. Dey; George Hartman
2004-01-01
In 1997, The Nature Conservancy initiated a large-scale prescribed fire management study on approximately 2,500 acres of their Chilton Creek property located in Shannon and Carter counties, Missouri. Since the spring of 1998, five management units, of roughly 500 acres each, have been burned in the dormant season to simulate a range of fire regimes that vary from...
Advanced Image Processing Techniques for Maximum Information Recovery
2006-11-01
Some radio frequency and optical sensors collect large-scale sets of spatial imagery data whose content is often obscured by fog, clouds, and foliage.
Advances in the Biology and Chemistry of Sialic Acids
Chen, Xi; Varki, Ajit
2010-01-01
Sialic acids are a subset of nonulosonic acids, which are nine-carbon alpha-keto aldonic acids. Naturally occurring sialic acid-containing structures are present in different sialic acid forms, various sialyl linkages, and on diverse underlying glycans. They play important roles in biological, pathological, and immunological processes. Sialobiology has been a challenging yet attractive research area. Recent advances in chemical and chemoenzymatic synthesis, as well as large-scale E. coli cell-based production, have provided a large library of sialoside standards and derivatives in amounts sufficient for structure-activity relationship studies. Sialoglycan microarrays provide an efficient platform for quick identification of preferred ligands for sialic acid-binding proteins. Future research on sialic acid will continue to be at the interface of chemistry and biology. Research efforts will not only lead to a better understanding of the biological and pathological importance of sialic acids and their diversity, but could also lead to the development of therapeutics. PMID:20020717
A review of advances in pixel detectors for experiments with high rate and radiation
NASA Astrophysics Data System (ADS)
Garcia-Sciveres, Maurice; Wermes, Norbert
2018-06-01
The Large Hadron Collider (LHC) experiments ATLAS and CMS have established hybrid pixel detectors as the instrument of choice for particle tracking and vertexing in high-rate and high-radiation environments, as they operate close to the LHC interaction points. With the High-Luminosity LHC upgrade now in sight, for which the tracking detectors will be completely replaced, new generations of pixel detectors are being devised. They have to address enormous challenges in terms of data throughput and radiation levels, ionizing and non-ionizing, that harm the sensing and readout parts of pixel detectors alike. Advances in microelectronics and microprocessing technologies now enable large-scale detector designs with unprecedented performance in measurement precision (space and time), radiation-hard sensors and readout chips, hybridization techniques, lightweight supports, and fully monolithic approaches to meet these challenges. This paper reviews the world-wide effort on these developments.
Electronics manufacturing and assembly in Japan
NASA Technical Reports Server (NTRS)
Kukowski, John A.; Boulton, William R.
1995-01-01
In the consumer electronics industry, precision processing technology is the basis for enhancing product functions and for minimizing components and end products. Throughout Japan, manufacturing technology is seen as critical to the production and assembly of advanced products. While its population has increased less than 30 percent over twenty-five years, Japan's gross national product has increased thirtyfold; this growth has resulted in large part from rapid replacement of manual operations with innovative, high-speed, large-scale, continuously running, complex machines that process a growing number of miniaturized components. The JTEC panel found that introduction of next-generation electronics products in Japan goes hand-in-hand with introduction of new and improved production equipment. In the panel's judgment, Japan's advanced process technologies and equipment development and its highly automated factories are crucial elements of its domination of the consumer electronics marketplace - and Japan's expertise in manufacturing consumer electronics products gives it potentially unapproachable process expertise in all electronics markets.
Scientific breakthroughs necessary for the commercial success of renewable energy (Invited)
NASA Astrophysics Data System (ADS)
Sharp, J.
2010-12-01
In recent years the wind energy industry has grown at an unprecedented rate, and in certain regions has attained significant penetration into the power infrastructure. This growth has been both a result of, and a precursor to, significant advances in the science and business of wind energy. But as a result of this growth and increasing penetration, further advances and breakthroughs will become increasingly important. These advances will be required in a number of different aspects of wind energy, including: resource assessment, operations and performance analysis, forecasting, and the impacts of increased wind energy development. Resource assessment has benefited from the development of tools specifically designed for this purpose. Despite this, the atmosphere is often portrayed in an extremely simplified manner by these tools. New methodologies should rely upon more sophisticated application of the physics of fluid flows. There will need to be increasing reliance on and acceptance of improved measurement techniques (remote sensing, volume rather than point measurements, etc.), and more sophisticated and higher-resolution numerical methods for micrositing. The goals of resource assessment will have to include a better understanding of the variability and forecastability of potential sites. Operational and performance analyses are vital to quantifying how well all aspects of the business are being carried out. Operational wind farms generate large amounts of meteorological and mechanical data. Data mining and detailed analysis of these data have proven invaluable for shedding light on poorly understood aspects of the science and industry. Future analysis will need to be even more rigorous and creative. Worthy topics of study include the impact of turbine wakes upon downstream turbine performance, how to utilize operational data to improve resource assessment and forecasting, and what the impacts of large-scale wind energy development might be. Forecasting is an area in which there have been great advances, and yet even greater advances will be required in the future. Until recently, the scale of wind energy made forecasting relatively unimportant - something that could be handled by automated systems augmented with limited observations. Recently, however, the use of human forecasting teams and specialized observation networks has greatly advanced the state of the art. Further advances will need to include dense networks of observations, providing timely and reliable observations over a much deeper layer of the boundary layer. High-resolution rapid-refresh models incorporating these observations via data assimilation should advance the state of the art further. Finally, understanding the potential impacts of increasing wind energy development is an area of significant recent interest. Preliminary studies have raised concerns of possible unintended climatological consequences upon downwind areas. A policy breakthrough was the inclusion of language in SB 1462 providing for research into these concerns. Advances will also be required in the area of transmission system improvements. The generation of large amounts of wind energy will itself impact the energy infrastructure, and will require breakthroughs within all of the topics above, and thus be a breakthrough in its own right.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menapace, J A
2010-10-27
Over the last eight years we have been developing advanced MRF tools and techniques to manufacture meter-scale optics for use in Megajoule-class laser systems. These systems call for optics having unique characteristics that can complicate their fabrication using conventional polishing methods. First, exposure to the high-power nanosecond and sub-nanosecond pulsed laser environment in the infrared (>27 J/cm² at 1053 nm), visible (>18 J/cm² at 527 nm), and ultraviolet (>10 J/cm² at 351 nm) demands ultra-precise control of optical figure and finish to avoid intensity modulation and scatter that can result in damage to the optics chain or system hardware. Second, the optics must be super-polished and virtually free of surface and subsurface flaws that can limit optic lifetime through laser-induced damage initiation and growth at the flaw sites, particularly at 351 nm. Lastly, ultra-precise optics for beam conditioning are required to control laser beam quality. These optics contain customized surface topographical structures that cannot be made using traditional fabrication processes. In this review, we will present the development and implementation of large-aperture MRF tools and techniques specifically designed to meet the demanding optical performance challenges of large-aperture high-power laser systems. In particular, we will discuss the advances made by using MRF technology to expose and remove surface and subsurface flaws in optics during final polishing to yield optics with improved laser damage resistance, the novel application of MRF deterministic polishing to imprint complex topographical information and wavefront correction patterns onto optical surfaces, and our efforts to advance the technology to manufacture large-aperture damage-resistant optics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tome, Carlos N; Caro, J A; Lebensohn, R A
2010-01-01
Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model nuclear fuel systems to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.
Potential Impacts of Offshore Wind Farms on North Sea Stratification
Carpenter, Jeffrey R.; Merckelbach, Lucas; Callies, Ulrich; Clark, Suzanna; Gaslikova, Lidia; Baschek, Burkard
2016-01-01
Advances in offshore wind farm (OWF) technology have recently led to their construction in coastal waters that are deep enough to be seasonally stratified. As tidal currents move past the OWF foundation structures they generate a turbulent wake that will contribute to a mixing of the stratified water column. In this study we show that the mixing generated in this way may have a significant impact on the large-scale stratification of the German Bight region of the North Sea. This region is chosen as the focus of this study since the planning of OWFs is particularly widespread. Using a combination of idealised modelling and in situ measurements, we provide order-of-magnitude estimates of two important time scales that are key to understanding the impacts of OWFs: (i) a mixing time scale, describing how long a complete mixing of the stratification takes, and (ii) an advective time scale, quantifying for how long a water parcel is expected to undergo enhanced wind farm mixing. The results are especially sensitive to both the drag coefficient and type of foundation structure, as well as the evolution of the pycnocline under enhanced mixing conditions—both of which are not well known. With these limitations in mind, the results show that OWFs could impact the large-scale stratification, but only when they occupy extensive shelf regions. They are expected to have very little impact on large-scale stratification at the current capacity in the North Sea, but the impact could be significant in future large-scale development scenarios. PMID:27513754
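The two time scales discussed above can be illustrated with a back-of-envelope calculation; the sketch below uses generic textbook energetics and entirely assumed numbers (farm extent, current speed, drag, stratification, mixing efficiency), not values from the study:

```python
# Order-of-magnitude sketch of the advective and mixing time scales named above.
# All numbers and the simple energetics argument are illustrative assumptions;
# they are not taken from Carpenter et al.
RHO = 1025.0          # seawater density, kg/m^3

def advective_time_scale(farm_extent_m: float, mean_current_ms: float) -> float:
    """Time a water parcel spends inside the wind farm (seconds)."""
    return farm_extent_m / mean_current_ms

def foundation_mixing_power(u_tidal: float, cd: float, frontal_area_per_m2: float,
                            efficiency: float) -> float:
    """Turbulent mixing power per unit seabed area from foundation drag (W/m^2)."""
    drag_power = 0.5 * RHO * cd * frontal_area_per_m2 * u_tidal ** 3
    return efficiency * drag_power

def mixing_time_scale(pea_j_per_m2: float, mixing_power_w_per_m2: float) -> float:
    """Time to erase the stratification: potential energy anomaly / mixing power."""
    return pea_j_per_m2 / mixing_power_w_per_m2

if __name__ == "__main__":
    t_adv = advective_time_scale(farm_extent_m=20e3, mean_current_ms=0.3)
    p_mix = foundation_mixing_power(u_tidal=0.5, cd=1.0,
                                    frontal_area_per_m2=1e-3, efficiency=0.05)
    t_mix = mixing_time_scale(pea_j_per_m2=200.0, mixing_power_w_per_m2=p_mix)
    print(f"advective time scale ~ {t_adv / 86400:.1f} days")
    print(f"mixing time scale    ~ {t_mix / 86400:.1f} days")
```

Comparing the two estimates is the essence of the argument: only when the advective residence time is comparable to or longer than the mixing time can the foundations appreciably erode the stratification.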
Large-Scale Advanced Prop-Fan (LAP)
NASA Technical Reports Server (NTRS)
Degeorge, C. L.
1988-01-01
In recent years, considerable attention has been directed toward improving aircraft fuel efficiency. Analytical studies and research with wind tunnel models have demonstrated that the high inherent efficiency of low-speed turboprop propulsion systems may now be extended to the Mach 0.8 flight regime of today's commercial airliners. This can be accomplished with a propeller employing a large number of thin, highly swept blades. The term Prop-Fan has been coined to describe such a propulsion system. In 1983 the NASA Lewis Research Center contracted with Hamilton Standard to design, build, and test a near full-scale Prop-Fan, designated the Large Scale Advanced Prop-Fan (LAP). This report provides a detailed description of the LAP program. The assumptions and analytical procedures used in the design of Prop-Fan system components are discussed in detail. The manufacturing techniques used in the fabrication of the Prop-Fan are presented. Each of the tests run during the course of the program is also discussed, and the major conclusions derived from them are stated.
Advances in risk assessment for climate change adaptation policy.
Adger, W Neil; Brown, Iain; Surminski, Swenja
2018-06-13
Climate change risk assessment involves formal analysis of the consequences, likelihoods and responses to the impacts of climate change and the options for addressing these under societal constraints. Conventional approaches to risk assessment are challenged by the significant temporal and spatial dynamics of climate change; by the amplification of risks through societal preferences and values; and through the interaction of multiple risk factors. This paper introduces the theme issue by reviewing the current practice and frontiers of climate change risk assessment, with specific emphasis on the development of adaptation policy that aims to manage those risks. These frontiers include integrated assessments, dealing with climate risks across borders and scales, addressing systemic risks, and innovative co-production methods to prioritize solutions to climate challenges with decision-makers. By reviewing recent developments in the use of large-scale risk assessment for adaptation policy-making, we suggest a forward-looking research agenda to meet ongoing strategic policy requirements in local, national and international contexts. This article is part of the theme issue 'Advances in risk assessment for climate change adaptation policy'. © 2018 The Author(s).
Biochemical Conversion Processes of Lignocellulosic Biomass to Fuels and Chemicals - A Review.
Brethauer, Simone; Studer, Michael H
2015-01-01
Lignocellulosic biomass - such as wood, agricultural residues or dedicated energy crops - is a promising renewable feedstock for production of fuels and chemicals that is available at large scale at low cost without direct competition with food usage. Its biochemical conversion in a sugar platform biorefinery includes three main unit operations that are illustrated in this review: the physico-chemical pretreatment of the biomass, the enzymatic hydrolysis of the carbohydrates to a fermentable sugar stream by cellulases, and finally the fermentation of the sugars by suitable microorganisms to the target molecules. Special emphasis in this review is put on the technology, commercial status and future prospects of the production of second-generation fuel ethanol, as this process has received most research and development effort so far. Despite significant advances, high enzyme costs are still a hurdle for large-scale competitive lignocellulosic ethanol production. This could be overcome by a strategy termed 'consolidated bioprocessing' (CBP), where enzyme production, enzymatic hydrolysis and fermentation are integrated in one step - either by utilizing one genetically engineered superior microorganism or by creating an artificial co-culture. Insight is provided on both CBP strategies for the production of ethanol as well as advanced fuels and commodity chemicals.
Development of Shape Memory Alloys- Challenges and Solutions
NASA Technical Reports Server (NTRS)
Benafan, Othmane
2016-01-01
Shape memory alloys (SMAs) are a unique class of multifunctional materials that have the ability to recover large deformations or generate high stresses in response to thermal, mechanical and/or electromagnetic stimuli. These abilities have made them a viable option for actuation systems in aerospace, medical, and automotive applications, amongst others. However, despite many advantages and the fact that SMA actuators have been developed and used for many years, so far they have only found service in a limited range of applications. In order to expand their applications, further developments are needed to increase their reliability and stability and to address the processing, testing and qualification needed for large-scale commercial application of SMA actuators. In this seminar, historical inhibitors of SMA applications and current research efforts by NASA Glenn Research Center and collaborators will be discussed. Relationships between fundamental physical/scientific understanding and the direct transition to engineering and design of mechanisms using these novel materials will be highlighted. Examples will be presented related to targeted alloy development, microstructural control, and bulk-scale testing as a function of stresses, temperatures and harsh environments. The seminar will conclude with a summary of SMA applications under development and current advances.
Insights into next developments in advanced gastric cancer.
Obermannová, Radka; Lordick, Florian
2016-07-01
The purpose of this review is to delineate novel approaches to biology-based treatment in advanced gastric cancer. We reviewed the latest translational and clinical research articles and congress presentations. A new molecular classification of gastric cancer based on histology and genetic and proteomic alterations has evolved. It provides a roadmap for development of new drugs and combinations and for patient stratification. Anti-HER2 treatment, which is an effective strategy in metastatic gastric cancer, is now also being studied in the perioperative setting. However, resistance mechanisms in advanced disease are poorly understood and optimal patient selection remains challenging. Targeting angiogenesis is an emerging concept in the management of advanced gastric cancer, and ramucirumab has prolonged survival in the second line either as a monotherapy or in combination with paclitaxel. Biomarkers for selecting patients who benefit from ramucirumab are still lacking. Immune checkpoint blockade and inhibition of cancer stemness targets are other emerging directions for the medical treatment of gastric cancer. Large-scale international studies are ongoing. Promising biology-based treatment strategies are evolving, but tumor heterogeneity, an inherent feature of gastric cancer, challenges the development of molecularly targeted and personalized treatment strategies.
Kennedy, Jacob J.; Abbatiello, Susan E.; Kim, Kyunggon; Yan, Ping; Whiteaker, Jeffrey R.; Lin, Chenwei; Kim, Jun Seok; Zhang, Yuzheng; Wang, Xianlong; Ivey, Richard G.; Zhao, Lei; Min, Hophil; Lee, Youngju; Yu, Myeong-Hee; Yang, Eun Gyeong; Lee, Cheolju; Wang, Pei; Rodriguez, Henry; Kim, Youngsoo; Carr, Steven A.; Paulovich, Amanda G.
2014-01-01
The successful application of MRM in biological specimens raises the exciting possibility that assays can be configured to measure all human proteins, resulting in an assay resource that would promote advances in biomedical research. We report the results of a pilot study designed to test the feasibility of a large-scale, international effort in MRM assay generation. We have configured, validated across three laboratories, and made publicly available as a resource to the community 645 novel MRM assays representing 319 proteins expressed in human breast cancer. Assays were multiplexed in groups of >150 peptides and deployed to quantify endogenous analyte in a panel of breast cancer-related cell lines. Median assay precision was 5.4%, with high inter-laboratory correlation (R2 >0.96). Peptide measurements in breast cancer cell lines were able to discriminate amongst molecular subtypes and identify genome-driven changes in the cancer proteome. These results establish the feasibility of a scaled, international effort. PMID:24317253
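The two summary statistics quoted above (median assay precision and inter-laboratory correlation) follow directly from replicate measurements; the sketch below is a generic illustration of how such numbers are computed, with array names and shapes as assumptions rather than the study's actual pipeline:

```python
# Generic sketch of the two summary statistics quoted above:
# median assay precision (CV across replicates) and inter-laboratory R^2.
# Array shapes and names are illustrative assumptions.
import numpy as np

def median_cv_percent(replicates: np.ndarray) -> float:
    """replicates: (n_assays, n_replicates) peak-area ratios -> median CV in %."""
    cv = replicates.std(axis=1, ddof=1) / replicates.mean(axis=1)
    return float(np.median(cv) * 100.0)

def interlab_r2(lab_a: np.ndarray, lab_b: np.ndarray) -> float:
    """Squared Pearson correlation of log-transformed measurements between two labs."""
    r = np.corrcoef(np.log2(lab_a), np.log2(lab_b))[0, 1]
    return float(r ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    simulated = rng.lognormal(mean=0.0, sigma=0.05, size=(645, 4))  # 645 assays, 4 replicates
    print(f"median CV: {median_cv_percent(simulated):.1f}%")
```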
The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Lytle, John K.
1999-01-01
Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
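The component-to-system idea can be conveyed with a toy software sketch; this is purely an architectural illustration (component names, values, and the shared state dictionary are assumptions) and is not the NPSS code, data model, or API:

```python
# Toy illustration of composing component models into a system-level simulation.
# Architectural sketch only; not the NPSS design.
from typing import Callable, Dict, List

State = Dict[str, float]
Component = Callable[[State], State]

def inlet(state: State) -> State:
    state["mass_flow"] = 100.0            # kg/s, assumed
    return state

def compressor(state: State) -> State:
    state["pressure_ratio"] = 20.0        # assumed overall pressure ratio
    return state

def burner(state: State) -> State:
    state["turbine_inlet_temp"] = 1600.0  # K, assumed
    return state

def simulate(components: List[Component], state: State) -> State:
    """Run the component models in sequence, passing one shared flow state."""
    for component in components:
        state = component(state)
    return state

if __name__ == "__main__":
    print(simulate([inlet, compressor, burner], {}))
```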
Review of the cultivation program within the National Alliance for Advanced Biofuels and Bioproducts
Lammers, Peter J.; Huesemann, Michael; Boeing, Wiebke; ...
2016-12-12
The cultivation efforts within the National Alliance for Advanced Biofuels and Bioproducts (NAABB) were developed to address four major goals for the consortium: biomass production for downstream experimentation, development of new assessment tools for cultivation, development of new cultivation reactor technologies, and development of methods for robust cultivation. The NAABB consortium testbeds produced over 1500 kg of biomass for downstream processing. The biomass production included a number of model production strains, but also took into production some of the more promising strains found through the prospecting efforts of the consortium. Cultivation efforts at large scale are intensive and costly; therefore, the consortium developed tools and models to assess the productivity of strains under various environmental conditions at lab scale, and validated these against scaled outdoor production systems. Two new pond-based bioreactor designs were tested for their ability to minimize energy consumption while maintaining, and even exceeding, the productivity of algae cultivation compared to traditional systems. Also, molecular markers were developed for quality control and to facilitate detection of bacterial communities associated with cultivated algal species, including the Chlorella spp. pathogen, Vampirovibrio chlorellavorus, which was identified in at least two test site locations in Arizona and New Mexico. Finally, the consortium worked on understanding methods to utilize compromised municipal wastewater streams for cultivation. In conclusion, this review provides an overview of the cultivation methods and tools developed by the NAABB consortium to produce algae biomass, in robust low-energy systems, for biofuel production.
Beigh, Mohammad Muzafar
2016-01-01
Humans have long suspected a relationship between heredity and disease, but only at the beginning of the last century did scientists begin to discover the connections between different genes and disease phenotypes. Recent trends in next-generation sequencing (NGS) technologies have brought great momentum to biomedical research, which in turn has remarkably augmented our basic understanding of human biology and its associated diseases. State-of-the-art next-generation biotechnologies have started making huge strides in our current understanding of the mechanisms of various chronic illnesses such as cancers, metabolic disorders, and neurodegenerative anomalies. We are experiencing a renaissance in biomedical research primarily driven by next-generation biotechnologies like genomics, transcriptomics, proteomics, metabolomics, and lipidomics. Although genomic discoveries are at the forefront of next-generation omics technologies, their implementation in the clinical arena has been painstakingly slow, mainly because of high reaction costs and the unavailability of requisite computational tools for large-scale data analysis. However, rapid innovations and the steadily falling cost of sequence-based chemistries, along with the development of advanced bioinformatics tools, have lately prompted the launch and implementation of large-scale massively parallel genome sequencing programs in fields ranging from medical genetics to infectious disease biology and agricultural sciences. Recent advances in large-scale omics technologies are bringing healthcare research beyond the traditional "bench to bedside" approach to more of a continuum that will include improvements in public healthcare and will be primarily based on a predictive, preventive, personalized, and participatory (P4) medicine approach. Recent large-scale research projects in genetic and infectious disease biology have indicated that massively parallel whole-genome/whole-exome sequencing, transcriptome analysis, and other functional genomic tools can reveal large numbers of unique functional elements and/or markers that would otherwise go undetected by traditional sequencing methodologies. Therefore, the latest trends in biomedical research are giving birth to a new branch of medicine commonly referred to as personalized and/or precision medicine. Developments in the post-genomic era are expected to completely restructure the present clinical pattern of disease prevention and treatment, as well as methods of diagnosis and prognosis. The next important step toward the precision/personalized medicine approach should be its early adoption in clinics for future medical interventions. Consequently, in coming years next-generation biotechnologies will reorient medical practice more toward disease prediction and prevention rather than curing disease at later stages of its development and progression, even at the wider population level for the general public healthcare system. PMID:28930123
Jennifer C. Pierson; Fred W. Allendorf; Pierre Drapeau; Michael K. Schwartz
2013-01-01
An exciting advance in the understanding of metapopulation dynamics has been the investigation of how populations respond to ephemeral patches that go 'extinct' during the lifetime of an individual. Previous research has shown that this scenario leads to genetic homogenization across large spatial scales. However, little is known about fine-scale genetic...
Blaney, Cerissa L; Redding, Colleen A; Paiva, Andrea L; Rossi, Joseph S; Prochaska, James O; Blissmer, Bryan; Burditt, Caitlin T; Nash, Justin M; Bayley, Keri Dotson
2018-03-01
Although integrated primary care (IPC) is growing, several barriers remain. Better understanding of behavioral health professionals' (BHPs') readiness for and engagement in IPC behaviors could improve IPC research and training. This study developed measures of IPC behaviors and stage of change. The sample included 319 licensed, practicing BHPs with a range of interests and experience with IPC. Sequential measurement development procedures, with split-half cross-validation were conducted. Exploratory principal components analyses (N = 152) and confirmatory factor analyses (N = 167) yielded a 12-item scale with 2 factors: consultation/practice management (CPM) and intervention/knowledge (IK). A higher-order Integrated Primary Care Behavior Scale (IPCBS) model showed good fit to the data, and excellent internal consistencies. The multivariate analysis of variance (MANOVA) on the IPCBS demonstrated significant large-sized differences across stage and behavior groups. The IPCBS demonstrated good psychometric properties and external validation, advancing research, education, and training for IPC practice. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Thurber, Mark C; Phadke, Himani; Nagavarapu, Sriniketh; Shrimali, Gireesh; Zerriffi, Hisham
2014-04-01
Replacing traditional stoves with advanced alternatives that burn more cleanly has the potential to ameliorate major health problems associated with indoor air pollution in developing countries. With a few exceptions, large government and charitable programs to distribute advanced stoves have not had the desired impact. Commercially-based distributions that seek cost recovery and even profits might plausibly do better, both because they encourage distributors to supply and promote products that people want and because they are based around properly-incentivized supply chains that could be more scalable, sustainable, and replicable. The sale in India of over 400,000 "Oorja" stoves to households from 2006 onwards represents the largest commercially-based distribution of a gasification-type advanced biomass stove. BP's Emerging Consumer Markets (ECM) division and then successor company First Energy sold this stove and the pelletized biomass fuel on which it operates. We assess the success of this effort and the role its commercial aspect played in outcomes using a survey of 998 households in areas of Maharashtra and Karnataka where the stove was sold as well as detailed interviews with BP and First Energy staff. Statistical models based on these data indicate that Oorja purchase rates were significantly influenced by the intensity of Oorja marketing in a region as well as by the pre-existing stove mix among households. The highest rate of adoption came from LPG-using households for which Oorja's pelletized biomass fuel reduced costs. Smoke- and health-related messages from Oorja marketing did not significantly influence the purchase decision, although they did appear to affect household perceptions about smoke. By the time of our survey, only 9% of households that purchased Oorja were still using the stove, the result in large part of difficulties First Energy encountered in developing a viable supply chain around low-cost procurement of "agricultural waste" to make pellets. The business orientation of First Energy allowed the company to pivot rapidly to commercial customers when the household market encountered difficulties. The business background of managers also facilitated the initial marketing and distribution efforts that allowed the stove distribution to reach scale.
New Roles for Occupational Instructors.
ERIC Educational Resources Information Center
Campbell, Dale F.
Changes in the future role of occupational instructors which will be brought about by advances in educational technology are illustrated by the description of the Advanced Instructional System (AIS), a complex approach to occupational training which permits large-scale application of individualized instruction through the use of computer-assisted…
Yong-Ki Kim — His Life and Recent Work
NASA Astrophysics Data System (ADS)
Stone, Philip M.
2007-08-01
Dr. Kim made internationally recognized contributions in many areas of atomic physics research and applications, and was still very active when he was killed in an automobile accident. He joined NIST in 1983 after 17 years at the Argonne National Laboratory following his Ph.D. work at the University of Chicago. Much of his early work at Argonne and especially at NIST was the elucidation and detailed analysis of the structure of highly charged ions. He developed a sophisticated, fully relativistic atomic structure theory that accurately predicts atomic energy levels, transition wavelengths, lifetimes, and transition probabilities for a large number of ions. This information has been vital to model the properties of the hot interior of fusion research plasmas, where atomic ions must be described with relativistic atomic structure calculations. In recent years, Dr. Kim worked on the precise calculation of ionization and excitation cross sections of numerous atoms, ions, and molecules that are important in fusion research and in plasma processing for manufacturing semiconductor chips. Dr. Kim greatly advanced the state-of-the-art of calculations for these cross sections through development and implementation of highly innovative methods, including his Binary-Encounter-Bethe (BEB) theory and a scaled plane wave Born (scaled PWB) theory. His methods, using closed quantum mechanical formulas and no adjustable parameters, avoid tedious large-scale computations with main-frame computers. His calculations closely reproduce the results of benchmark experiments as well as large-scale calculations requiring hours of computer time. This recent work on BEB and scaled PWB is reviewed and examples of its capabilities are shown.
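The BEB model mentioned above has a simple closed form. The sketch below implements the standard published single-orbital BEB expression in its simplified (Q = 1) form and evaluates it for hydrogen 1s as an example; the constants are textbook values, and treating only one orbital is a simplifying assumption:

```python
# Binary-Encounter-Bethe (BEB) electron-impact ionization cross section
# for a single orbital, simplified (Q = 1) form.
# Constants are standard values; applying it to H(1s) is just an example.
import math

A0 = 0.529177e-10   # Bohr radius, m
RYD = 13.6057       # Rydberg energy, eV

def beb_cross_section(T: float, B: float, U: float, N: int) -> float:
    """Ionization cross section (m^2) for one orbital.

    T: incident electron energy (eV), B: orbital binding energy (eV),
    U: orbital kinetic energy (eV), N: orbital electron occupation."""
    if T <= B:
        return 0.0
    t, u = T / B, U / B
    S = 4.0 * math.pi * A0**2 * N * (RYD / B) ** 2
    return (S / (t + u + 1.0)) * (
        0.5 * math.log(t) * (1.0 - 1.0 / t**2)
        + 1.0 - 1.0 / t - math.log(t) / (t + 1.0)
    )

if __name__ == "__main__":
    # Hydrogen 1s: B = U = 13.6 eV, N = 1; the cross section peaks near 50-60 eV.
    for T in (20.0, 50.0, 100.0, 500.0):
        print(f"{T:6.1f} eV -> {beb_cross_section(T, 13.6, 13.6, 1):.2e} m^2")
```

Because the formula is closed and parameter-free once B, U, and N are known, it captures the spirit of the abstract's remark that such methods avoid tedious large-scale computations.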
2004-10-01
"Scalable Parallel Libraries for Large-Scale Concurrent Applications," Technical Report UCRL-JC-109251, Lawrence Livermore National Laboratory.
Shin-Etsu super-high-flat substrate for FPD panel photomask
NASA Astrophysics Data System (ADS)
Ishitsuka, Youkou; Harada, Daijitsu; Watabe, Atsushi; Takeuchi, Masaki
2017-07-01
Recently, high-resolution exposure machines have been developed for the production of high-definition (HD) panels, and panel makers are looking to flatter photomask substrates for FPD in order to produce HD panels. In this presentation, we introduce Shin-Etsu's advanced technique for producing super-high-flat photomask substrates. Shin-Etsu has developed surface polishing and planarization technology through its top-quality IC photomask substrates. Our most advanced IC photomask substrates have gained the highest evaluation and appreciation from our customers because of their surface quality (defect-free surfaces without sub-0.1 um size defects) and ultimate flatness (on the order of sub-0.1 um). By scaling up those IC photomask substrate technologies and developing unique large-size processing technologies, we have succeeded in creating high-flatness large substrates, at G10-photomask size as well as regular G6-G8 photomask sizes. The core technology is that the surface shape of the substrate is completely controlled by a unique method. For example, we can regularly produce a substrate with "triple 5 um" flatness: front-side flatness, back-side flatness, and total thickness variation are all less than 5 um. Furthermore, we are able to supply a "triple 3 um" substrate for the G6-photomask-size advanced grade, which is believed to be needed in the near future.
X-ray techniques for innovation in industry
Lawniczak-Jablonska, Krystyna; Cutler, Jeffrey
2014-01-01
The smart specialization declared in the European program Horizon 2020, and the increasing cooperation between research and development in companies and researchers at universities and research institutions, have created a new paradigm in which many calls for proposals require participation and funding from public and private entities. This has created a unique opportunity for large-scale facilities, such as synchrotron research laboratories, to participate in and support applied research programs. Scientific staff at synchrotron facilities have developed many advanced tools that make optimal use of the characteristics of the light generated by the storage ring. These tools have been exceptionally valuable for materials characterization, including X-ray absorption spectroscopy, diffraction, tomography and scattering, and have been key in solving many research and development issues. Progress in optics and detectors, as well as a large effort put into the improvement of data analysis codes, has resulted in the development of reliable and reproducible procedures for materials characterization. Research with photons has contributed to the development of a wide variety of products such as plastics, cosmetics, chemicals, building materials, packaging materials and pharma. In this review, a few examples are highlighted of successful cooperation leading to solutions of a variety of industrial technological problems which have been exploited by industry, including lessons learned from the Science Link project, supported by the European Commission, as a new approach to increase the number of commercial users at large-scale research infrastructures. PMID:25485139
Jo, Sunhwan; Cheng, Xi; Islam, Shahidul M; Huang, Lei; Rui, Huan; Zhu, Allen; Lee, Hui Sun; Qi, Yifei; Han, Wei; Vanommeslaeghe, Kenno; MacKerell, Alexander D; Roux, Benoît; Im, Wonpil
2014-01-01
CHARMM-GUI, http://www.charmm-gui.org, is a web-based graphical user interface used to prepare molecular simulation systems and input files and to facilitate the usage of common and advanced simulation techniques. Since it was originally developed in 2006, CHARMM-GUI has been widely adopted for various purposes and now contains a number of different modules designed to set up a broad range of simulations, including free energy calculation and large-scale coarse-grained representation. Here, we describe functionalities that have recently been integrated into CHARMM-GUI PDB Manipulator, such as ligand force field generation, incorporation of methanethiosulfonate spin labels and chemical modifiers, and substitution of amino acids with unnatural amino acids. These new features are expected to be useful in advanced biomolecular modeling and simulation of proteins. © 2014 Elsevier Inc. All rights reserved.
An assessment of General Aviation utilization of advanced avionics technology
NASA Technical Reports Server (NTRS)
Quinby, G. F.
1980-01-01
Needs of the general aviation industry for services and facilities which might be supplied by NASA were examined. In the data collection phase, twenty-one individuals from nine manufacturing companies in general aviation were interviewed against a carefully prepared meeting format. General aviation avionics manufacturers were credited with a high degree of technology transfer from the forcing industries such as television, automotive, and computers and a demonstrated ability to apply advanced technology such as large scale integration and microprocessors to avionics functions in an innovative and cost effective manner. The industry's traditional resistance to any unnecessary regimentation or standardization was confirmed. Industry's self sufficiency in applying advanced technology to avionics product development was amply demonstrated. NASA research capability could be supportive in areas of basic mechanics of turbulence in weather and alternative means for its sensing.
NASA: Assessments of Selected Large-Scale Projects
2011-03-01
...probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the...
NASA Astrophysics Data System (ADS)
Nikzad, Shouleh; Jewell, April D.; Hoenk, Michael E.; Jones, Todd J.; Hennessy, John; Goodsall, Tim; Carver, Alexander G.; Shapiro, Charles; Cheng, Samuel R.; Hamden, Erika T.; Kyne, Gillian; Martin, D. Christopher; Schiminovich, David; Scowen, Paul; France, Kevin; McCandliss, Stephan; Lupu, Roxana E.
2017-07-01
Exciting concepts are under development for flagship, probe class, explorer class, and suborbital class NASA missions in the ultraviolet/optical spectral range. These missions will depend on high-performance silicon detector arrays being delivered affordably and in high numbers. To that end, we have advanced delta-doping technology to high-throughput and high-yield wafer-scale processing, encompassing a multitude of state-of-the-art silicon-based detector formats and designs. We have embarked on a number of field observations, instrument integrations, and independent evaluations of delta-doped arrays. We present recent data and innovations from JPL's Advanced Detectors and Systems Program, including two-dimensional doping technology, JPL's end-to-end postfabrication processing of high-performance UV/optical/NIR arrays and advanced coatings for detectors. While this paper is primarily intended to provide an overview of past work, developments are identified and discussed throughout. Additionally, we present examples of past, in-progress, and planned observations and deployments of delta-doped arrays.
Planner-Based Control of Advanced Life Support Systems
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott
2005-01-01
The paper describes an approach to the integration of qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This will allow for effective management of the combinatorics due to the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity, we can use faster, simpler responses at the lowest level and predictive but more complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed-loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.
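The layered idea of a slow, predictive planner supervising a fast reactive loop can be sketched abstractly; the code below is a generic illustration (names, rates, and the toy plant are assumptions) and is not the IDEA architecture or the JSC simulation:

```python
# Generic two-layer control sketch: a slow planner sets targets over a long
# horizon, a fast reactive controller tracks them. Illustrative only.
def plan(horizon_steps: int, demand_per_step: float, reserve: float) -> list[float]:
    """Slow, predictive layer: schedule a resource setpoint for each step."""
    return [demand_per_step + reserve for _ in range(horizon_steps)]

def reactive_step(setpoint: float, measured: float, gain: float = 0.5) -> float:
    """Fast layer: simple proportional correction toward the planned setpoint."""
    return gain * (setpoint - measured)

def run(horizon_steps: int = 10) -> None:
    setpoints = plan(horizon_steps, demand_per_step=1.0, reserve=0.2)
    level = 0.0                                   # toy resource level
    for step, sp in enumerate(setpoints):
        level += reactive_step(sp, level)         # toy plant integrates the command
        print(f"step {step}: setpoint={sp:.2f} level={level:.2f}")

if __name__ == "__main__":
    run()
```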
Advances in QTL Mapping in Pigs
Rothschild, Max F.; Hu, Zhi-liang; Jiang, Zhihua
2007-01-01
Over the past 15 years, advances in the porcine genetic linkage map and the discovery of useful candidate genes have led to valuable gene and trait information being discovered. Early use of exotic breed crosses, and now commercial breed crosses, for quantitative trait loci (QTL) scans and candidate gene analyses has led to 110 publications which have identified 1,675 QTL. Additionally, these studies continue to identify genes associated with economically important traits such as growth rate, leanness, feed intake, meat quality, litter size, and disease resistance. A well-developed QTL database, PigQTLdb, now serves as a valuable tool for summarizing and pinpointing in silico regions of interest to researchers. The commercial pig industry is actively incorporating these markers in marker-assisted selection, along with traditional performance information, to improve traits of economic importance. The long-awaited sequencing efforts are also now beginning to provide sequence data for both comparative genomics and large-scale single nucleotide polymorphism (SNP) association studies. While these advances are all positive, the development of useful new trait families and the measurement of new or underlying traits still limit future discoveries. A review of these developments is presented. PMID:17384738
University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abate, Alex; Cheu, Elliott
This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier, covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed that realizes the full chain of calculations needed to run large-scale simulations accurately and efficiently.
Developments in advanced and energy saving thermal isolations for cryogenic applications
NASA Astrophysics Data System (ADS)
Shu, Q. S.; Demko, J. A.; Fesmire, J. E.
2015-12-01
The cooling power consumption in large-scale superconducting systems is huge, and cryogenic devices used in space applications often require an extremely long cryogen holding time. To economically maintain a device at its operating temperature and minimize refrigeration losses, high-performance thermal insulation is essential. Radiation from warm surrounding surfaces and conductive heat leaks through supports and penetrations are the dominant heat loads on the cold mass under vacuum conditions. Advanced developments in various cryogenic applications that successfully reduce the radiative and conductive heat loads are briefly and systematically discussed and evaluated in this review paper. These include: (1) thermal insulation for different applications (foams, perlites, glass bubbles, aerogel, and MLI), (2) sophisticated low-heat-leak supports (cryogenic tension straps, trolley bars, and posts with dedicated thermal intercepts), and (3) novel cryogenic heat switches.
NSF's Perspective on Space Weather Research for Building Forecasting Capabilities
NASA Astrophysics Data System (ADS)
Bisi, M. M.; Pulkkinen, A. A.; Webb, D. F.; Oughton, E. J.; Azeem, S. I.
2017-12-01
Space weather research at the National Science Foundation (NSF) is focused on scientific discovery and on deepening knowledge of the Sun-Geospace system. Maturation of this knowledge base is a requirement for the development of improved space weather forecast models and for the accurate assessment of potential mitigation strategies. Progress in space weather forecasting requires advancing in-depth understanding of the underlying physical processes, developing better instrumentation and measurement techniques, and capturing the advancements in understanding in large-scale physics-based models that span the entire chain of events from the Sun to the Earth. This presentation will provide an overview of current and planned programs pertaining to space weather research at NSF and discuss the recommendations of the Geospace Section portfolio review panel within the context of space weather forecasting capabilities.
NASA's Advanced Life Support Systems Human-Rated Test Facility
NASA Technical Reports Server (NTRS)
Henninger, D. L.; Tri, T. O.; Packham, N. J.
1996-01-01
Future NASA missions to explore the solar system will be long-duration missions, requiring human life support systems which must operate with very high reliability over long periods of time. Such systems must be highly regenerative, requiring minimum resupply, to enable the crews to be largely self-sufficient. These regenerative life support systems will use a combination of higher plants, microorganisms, and physicochemical processes to recycle air and water, produce food, and process wastes. A key step in the development of these systems is the establishment of a human-rated test facility specifically tailored to the evaluation of closed, regenerative life support systems--one in which long-duration, large-scale testing involving human test crews can be performed. Construction of such a facility, the Advanced Life Support Program's (ALS) Human-Rated Test Facility (HRTF), has begun at NASA's Johnson Space Center, and definition of systems and development of initial outfitting concepts for the facility are underway. This paper will provide an overview of the HRTF project plan, an explanation of baseline configurations, and descriptive illustrations of facility outfitting concepts.
Advanced Kalman Filter for Real-Time Responsiveness in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Welch, Gregory Francis; Zhang, Jinghe
2014-06-10
Complex engineering systems pose fundamental challenges in real-time operations and control because they are highly dynamic systems consisting of a large number of elements with severe nonlinearities and discontinuities. Today's tools for real-time complex system operations are mostly based on steady-state models, unable to capture the dynamic nature of these systems and too slow to prevent failures. We developed advanced Kalman filtering techniques and a dynamic state estimation formulation to capture complex system dynamics in support of real-time operations and control. In this work, we looked at complex system issues including severe nonlinearity of system equations, discontinuities caused by system controls and network switches, sparse measurements in space and time, and real-time requirements of power grid operations. We sought to bridge the disciplinary boundaries between Computer Science and Power Systems Engineering by introducing methods that leverage both existing and new techniques. While our methods were developed in the context of electrical power systems, they should generalize to other large-scale scientific and engineering applications.
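As a hedged illustration of dynamic state estimation with Kalman filtering, the sketch below is a generic textbook extended Kalman filter; the scalar toy dynamics are invented for the example and are not the power-system model developed in the report.

# Minimal extended Kalman filter sketch for dynamic state estimation.
# Generic textbook form only; the nonlinear model below is a toy example.
import numpy as np

def f(x):            # nonlinear state transition (toy dynamics)
    return np.array([x[0] + 0.1 * x[1], 0.95 * x[1] + 0.05 * np.sin(x[0])])

def F_jac(x):        # Jacobian of f
    return np.array([[1.0, 0.1],
                     [0.05 * np.cos(x[0]), 0.95]])

H = np.array([[1.0, 0.0]])          # only the first state is measured
Q = 1e-4 * np.eye(2)                # process noise covariance
R = np.array([[1e-2]])              # measurement noise covariance

def ekf_step(x, P, z):
    # Predict
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - H @ x_pred                          # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x_pred + (K @ y).ravel()
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Usage: filter a noisy measurement stream of the first state component.
rng = np.random.default_rng(0)
x_true, x_est, P = np.array([0.0, 1.0]), np.array([0.0, 0.5]), np.eye(2)
for _ in range(50):
    x_true = f(x_true)
    z = H @ x_true + rng.normal(scale=0.1, size=1)
    x_est, P = ekf_step(x_est, P, z)
print("estimated state:", x_est)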
D-He-3 spherical torus fusion reactor system study
NASA Astrophysics Data System (ADS)
Macon, William A., Jr.
1992-04-01
This system study extrapolates present physics knowledge and technology to predict the anticipated characteristics of D-He3 spherical torus fusion reactors and their sensitivity to uncertainties in important parameters. Reference cases for steady-state 1000 MWe reactors operating in H-mode in both the 1st stability regime and the 2nd stability regime were developed and assessed quantitatively. These devices would have a very small aspect ratio (A = 1.2), a major radius of about 2.0 m, an on-axis magnetic field less than 2 T, a large plasma current (80-120 MA) dominated by the bootstrap effect, and high plasma beta (greater than 0.6). The estimated cost of electricity is in the range of 60-90 mills/kW-hr, assuming the use of a direct energy conversion system. The inherent safety and environmental advantages of D-He3 fusion indicate that this reactor concept could be competitive with advanced fission breeder reactors and large-scale solar electric plants by the end of the 21st century if research and development can produce the anticipated physics and technology advances.
Recent advances in large-eddy simulation of spray and coal combustion
NASA Astrophysics Data System (ADS)
Zhou, L. X.
2013-07-01
Large-eddy simulation (LES) is developing rapidly and is recognized as a possible second generation of CFD methods used in engineering. Spray and coal combustion is widely used in power, transportation, chemical and metallurgical, iron and steel making, and aeronautical and astronautical engineering; hence, LES of spray and coal two-phase combustion is particularly important for engineering applications. LES of two-phase combustion attracts increasing attention because it can give detailed instantaneous flow and flame structures and more exact statistical results than Reynolds-averaged (RANS) modeling. One of the key problems in LES is to develop sub-grid scale (SGS) models, including SGS stress models and combustion models, and different investigators have proposed or adopted various SGS models. In this paper the present author attempts to review advances in LES of spray and coal combustion, including studies done by the present author and his colleagues. The SGS models adopted by different investigators are described, some of their main results are summarized, and finally some research needs are discussed.
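As one concrete example of the SGS closures such reviews survey, the classical Smagorinsky eddy-viscosity model can be stated in its standard textbook form (given here for illustration; it is not necessarily the specific formulation adopted by the author):

\tau_{ij} - \tfrac{1}{3}\tau_{kk}\,\delta_{ij} = -2\,\nu_t\,\bar{S}_{ij},
\qquad \nu_t = (C_s \Delta)^2\,|\bar{S}|,
\qquad |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}},
\qquad \bar{S}_{ij} = \tfrac{1}{2}\left(\partial_j \bar{u}_i + \partial_i \bar{u}_j\right),

where \Delta is the filter (grid) width, \bar{S}_{ij} is the resolved strain-rate tensor, and C_s is a model constant typically taken around 0.1-0.2.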
Balance models for equatorial planetary-scale dynamics
NASA Astrophysics Data System (ADS)
Chan, Ian Hiu-Fung
This thesis aims at advancing our understanding of large-scale dynamics in the tropics, specifically the characterization of slow planetary-scale motions through a balance theory; current balance theories in the tropics are unsatisfactory as they filter out Kelvin waves, which are an important component of variability, along with fast inertia-gravity (IG) waves. (Abstract shortened by UMI.)
NASA Astrophysics Data System (ADS)
Qin, Fangcheng; Li, Yongtang; Qi, Huiping; Ju, Li
2017-01-01
Research on compact manufacturing technology for controlling the shape and performance of metallic components can simplify the manufacturing process and raise its reliability while satisfying macro- and micro-structural requirements. It is not only a key path to improving performance, saving material and energy, and achieving green manufacturing of components used in major equipment, but also a challenging subject at the frontier of advanced plastic forming, so providing a new horizon for manufacturing these critical components is significant. Focusing on high-performance large-scale components such as bearing rings, flanges, railway wheels, and thick-walled pipes, the conventional processes and their current state of development are summarized. Existing problems, including multi-pass heating, material and energy waste, high cost, and high emissions, are discussed, and the inability of present studies to meet the demands of high-quality component manufacturing is pointed out. New techniques for casting-rolling compound precise forming of rings, compact manufacturing of duplex-metal composite rings, compact manufacturing of railway wheels, and casting-extruding continuous forming of thick-walled pipes are then introduced in detail. The corresponding research contents, such as casting the ring blank, hot ring rolling, near solid-state pressure forming, and hot extruding, are elaborated. Some findings on through-thickness microstructure evolution and mechanical properties are also presented. The components produced by the new techniques are mainly characterized by fine and homogeneous grains. Possible directions for further development of these techniques are suggested, and the key scientific problems are proposed for the first time. These results and conclusions have reference value and guiding significance for the integrated control of shape and performance in advanced compact manufacturing.
Scaling up to address data science challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, Joanne R.
2017-04-27
Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. This article explores various challenges in Data Science and highlights statistical approaches that can facilitate the analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
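To illustrate the sampling and data-reduction methods mentioned above, here is a minimal sketch of reservoir sampling, which draws a fixed-size uniform random sample from a data stream too large to hold in memory. It is a generic example, not code from the article.

import random

def reservoir_sample(stream, k, seed=0):
    """Return k items drawn uniformly at random from an iterable of unknown length."""
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = rng.randint(0, i)          # each new item kept with probability k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

# Usage: reduce a large simulated stream to a 5-element summary sample.
print(reservoir_sample(range(10**6), k=5))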
Large-scale retrieval for medical image analytics: A comprehensive review.
Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting
2018-01-01
Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning, and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
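As a hedged sketch of the retrieval pipeline's core step (feature representation followed by similarity search), the fragment below performs brute-force cosine-similarity search over precomputed image feature vectors. Real large-scale systems would replace this with the indexing structures discussed in such reviews (e.g., hashing or inverted files), and the random arrays here are placeholders.

import numpy as np

def cosine_search(query_vec, feature_db, top_k=5):
    """Return indices of the top_k database features most similar to query_vec."""
    db = feature_db / np.linalg.norm(feature_db, axis=1, keepdims=True)
    q = query_vec / np.linalg.norm(query_vec)
    scores = db @ q                       # cosine similarity against every image
    return np.argsort(-scores)[:top_k]    # best matches first

# Usage with random placeholder features (10,000 images, 128-D descriptors).
rng = np.random.default_rng(0)
feature_db = rng.normal(size=(10_000, 128))
query = rng.normal(size=128)
print(cosine_search(query, feature_db))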
NASA Technical Reports Server (NTRS)
Dittmar, J. H.
1985-01-01
Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis 8- by 6-Foot Wind Tunnel. The maximum blade passing tone decreases from its peak level as the helical tip Mach number increases further. This noise reduction points to the use of higher propeller speeds as a possible method to reduce airplane cabin noise while maintaining high flight speed and efficiency. Comparison of the SR-7A blade passing noise with the noise of the similarly designed SR-3 propeller shows good agreement, as expected. The SR-7A propeller is slightly noisier than the SR-3 model in the plane of rotation at the cruise condition. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with design predictions. The prediction method is conservative in the sense that it overpredicts the projected model data.
NASA Astrophysics Data System (ADS)
Simon, Sara Michelle
The LCDM model of the universe is supported by an abundance of astronomical observations, but it does not confirm a period of inflation in the early universe or explain the nature of dark energy and dark matter. The polarization of the cosmic microwave background (CMB) may hold the key to addressing these profound questions. If a period of inflation occurred in the early universe, it could have left a detectable odd-parity pattern called B-modes in the polarization of the CMB on large angular scales. Additionally, the CMB can be used to probe the structure of the universe on small angular scales through lensing and the detection of galaxy clusters and their motions via the Sunyaev-Zel'dovich effect, which can improve our understanding of neutrinos, dark matter, and dark energy. The Atacama B-mode Search (ABS) instrument was a cryogenic crossed-Dragone telescope located at an elevation of 5190m in the Atacama Desert in Chile that observed from February 2012 until October 2014. ABS searched on degree-angular scales for inflationary B-modes in the CMB and pioneered the use of a rapidly-rotating half-wave plate (HWP), which modulates the polarization of incoming light to permit the measurement of celestial polarization on large angular scales that would otherwise be obscured by 1/f noise from the atmosphere. Located next to ABS in the Atacama is the Atacama Cosmology Telescope (ACT), which is an off-axis Gregorian telescope. Its large 6m primary mirror facilitates measurements of the CMB on small angular scales. HWPs are baselined for use with the upgraded polarization-sensitive camera for ACT, called Advanced ACTPol, to extend observations of the polarized CMB to larger angular scales while also retaining sensitivity to small angular scales. The B-mode signal is extremely faint, and measuring it poses an instrumental challenge that requires the development of new technologies and well-characterized instruments. I will discuss the use of novel instrumentation and methods on the ABS telescope and Advanced ACTPol, the characterization of the ABS instrument, and the first two seasons of ABS data, including an overview of the data selection process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crater, Jason; Galleher, Connor; Lievense, Jeff
NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.
Trace: a high-throughput tomographic reconstruction engine for large-scale datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bicer, Tekin; Gursoy, Doga; Andrade, Vincent De; ...
2017-01-28
Here, synchrotron light source and detector technologies enable scientists to perform advanced experiments. These scientific instruments and experiments produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used data acquisition techniques at light sources is computed tomography, which can generate tens of GB/s depending on the x-ray range. A large-scale tomographic dataset, such as a mouse brain, may require hours of computation time with a medium-size workstation. In this paper, we present Trace, a data-intensive computing middleware we developed for the implementation and parallelization of iterative tomographic reconstruction algorithms. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called a replicated reconstruction object to maximize application performance. We also present the optimizations we have done on the replicated reconstruction objects and evaluate them using a shale and a mouse brain sinogram. Our experimental evaluations show that the applied optimizations and parallelization techniques can provide 158x speedup (using 32 compute nodes) over the single-core configuration, which decreases the reconstruction time of a sinogram (with 4501 projections and 22400 detector resolution) from 12.5 hours to less than 5 minutes per iteration.
Simulating Coupling Complexity in Space Plasmas: First Results from a new code
NASA Astrophysics Data System (ADS)
Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.
2005-12-01
The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) Computational fluid dynamics (hydrodynamics or magneto-hydrodynamics-MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas, and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will advance our understanding of the physics of neutral and charged gases enormously. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle. 2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.
Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.
2014-05-01
The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analysis, however, showed that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are by far unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely the NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility to efficiently compute synthetic seismograms in complex laterally heterogeneous anelastic media. In this way a set of scenarios of ground motion can be defined, both at national and local scale, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. At the scenario scale, quick parametric studies can be easily performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large-scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to HPC dedicated clusters up to Cloud computing. In such a way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates access to the NDSHA methodology and the related outputs by end-users who are interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios.
We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large-scale hazard maps of ground shaking at bedrock to the definition of local-scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.
NASA Technical Reports Server (NTRS)
Jackson, Karen E.
1990-01-01
Scale model technology represents one method of investigating the behavior of advanced, weight-efficient composite structures under a variety of loading conditions. It is necessary, however, to understand the limitations involved in testing scale model structures before the technique can be fully utilized. These limitations, or scaling effects, are characterized in the large deflection response and failure of composite beams. Scale model beams were loaded with an eccentric axial compressive load designed to produce large bending deflections and global failure. A dimensional analysis was performed on the composite beam-column loading configuration to determine a model law governing the system response. An experimental program was developed to validate the model law under both static and dynamic loading conditions. Laminate stacking sequences including unidirectional, angle ply, cross ply, and quasi-isotropic were tested to examine a diversity of composite response and failure modes. The model beams were loaded under scaled test conditions until catastrophic failure. A large deflection beam solution was developed to compare with the static experimental results and to analyze beam failure. Also, the finite element code DYCAST (DYnamic Crash Analysis of STructure) was used to model both the static and impulsive beam response. Static test results indicate that the unidirectional and cross ply beam responses scale as predicted by the model law, even under severe deformations. In general, failure modes were consistent between scale models within a laminate family; however, a significant scale effect was observed in strength. The scale effect in strength which was evident in the static tests was also observed in the dynamic tests. Scaling of load and strain time histories between the scale model beams and the prototypes was excellent for the unidirectional beams, but inconsistent results were obtained for the angle ply, cross ply, and quasi-isotropic beams. Results show that valuable information can be obtained from testing of scale model composite structures, especially in the linear elastic response region. However, due to scaling effects in the strength behavior of composite laminates, caution must be used in extrapolating data taken from a scale model test when that test involves failure of the structure.
NASA Astrophysics Data System (ADS)
Hullo, J.-F.; Thibault, G.; Boucheny, C.
2015-02-01
In a context of increased maintenance operations and workers generational renewal, a nuclear owner and operator like Electricité de France (EDF) is interested in the scaling up of tools and methods of "as-built virtual reality" for larger buildings and wider audiences. However, acquisition and sharing of as-built data on a large scale (large and complex multi-floored buildings) challenge current scientific and technical capacities. In this paper, we first present a state of the art of scanning tools and methods for industrial plants with very complex architecture. Then, we introduce the inner characteristics of the multi-sensor scanning and visualization of the interior of the most complex building of a power plant: a nuclear reactor building. We introduce several developments that made possible a first complete survey of such a large building, from acquisition, processing and fusion of multiple data sources (3D laser scans, total-station survey, RGB panoramic, 2D floor plans, 3D CAD as-built models). In addition, we present the concepts of a smart application developed for the painless exploration of the whole dataset. The goal of this application is to help professionals, unfamiliar with the manipulation of such datasets, to take into account spatial constraints induced by the building complexity while preparing maintenance operations. Finally, we discuss the main feedbacks of this large experiment, the remaining issues for the generalization of such large scale surveys and the future technical and scientific challenges in the field of industrial "virtual reality".
Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcox, Karen; Marzouk, Youssef
2013-11-12
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghattas, Omar
2013-10-15
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
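The "reduce then sample" strategy can be illustrated with a toy random-walk Metropolis sampler that evaluates a cheap surrogate of the forward model inside the chain. This is a generic illustration of the idea, not the SAGUARO codes; the one-parameter forward model, prior, and data value are invented for the example.

import numpy as np

rng = np.random.default_rng(1)

def full_forward(theta):
    """Stand-in for an expensive forward simulation (one observable)."""
    return np.sin(theta) + 0.1 * theta ** 2

# "Reduce": fit a cheap polynomial surrogate to a handful of full-model runs.
train_theta = np.linspace(-3, 3, 15)
coeffs = np.polyfit(train_theta, full_forward(train_theta), deg=5)
surrogate = lambda theta: np.polyval(coeffs, theta)

# "Then sample": random-walk Metropolis on the posterior using the surrogate.
data, noise_sd = 0.9, 0.05
def log_post(theta, model):
    return -0.5 * ((data - model(theta)) / noise_sd) ** 2 - 0.5 * (theta / 2.0) ** 2

theta, chain = 0.0, []
for _ in range(5000):
    prop = theta + 0.3 * rng.normal()
    if np.log(rng.uniform()) < log_post(prop, surrogate) - log_post(theta, surrogate):
        theta = prop                      # accept; otherwise keep the current state
    chain.append(theta)

print("posterior mean estimate:", np.mean(chain[1000:]))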
Interactions between Antarctic sea ice and large-scale atmospheric modes in CMIP5 models
NASA Astrophysics Data System (ADS)
Schroeter, Serena; Hobbs, Will; Bindoff, Nathaniel L.
2017-03-01
The response of Antarctic sea ice to large-scale patterns of atmospheric variability varies according to sea ice sector and season. In this study, interannual atmosphere-sea ice interactions were explored using observations and reanalysis data, and compared with simulated interactions by models in the Coupled Model Intercomparison Project Phase 5 (CMIP5). Simulated relationships between atmospheric variability and sea ice variability generally reproduced the observed relationships, though more closely during the season of sea ice advance than the season of sea ice retreat. Atmospheric influence on sea ice is known to be strongest during advance, and it appears that models are able to capture the dominance of the atmosphere during advance. Simulations of ocean-atmosphere-sea ice interactions during retreat, however, require further investigation. A large proportion of model ensemble members overestimated the relative importance of the Southern Annular Mode (SAM) compared with other modes of high southern latitude climate, while the influence of tropical forcing was underestimated. This result emerged particularly strongly during the season of sea ice retreat. The zonal patterns of the SAM in many models and its exaggerated influence on sea ice overwhelm the comparatively underestimated meridional influence, suggesting that simulated sea ice variability would become more zonally symmetric as a result. Across the seasons of sea ice advance and retreat, three of the five sectors did not reveal a strong relationship with a pattern of large-scale atmospheric variability in one or both seasons, indicating that sea ice in these sectors may be influenced more strongly by atmospheric variability unexplained by the major atmospheric modes, or by heat exchange in the ocean.
Large eddy simulation modelling of combustion for propulsion applications.
Fureby, C
2009-07-28
Predictive modelling of turbulent combustion is important for the development of air-breathing engines, internal combustion engines, furnaces, and for power generation. Significant advances in modelling non-reactive turbulent flows are now possible with the development of large eddy simulation (LES), in which the large energetic scales of the flow are resolved on the grid while the effects of the small scales are modelled. Here, we discuss the use of combustion LES in predictive modelling of propulsion applications such as gas turbine, ramjet and scramjet engines. The LES models used are described in some detail and are validated against laboratory data, of which results from two cases are presented. These validated LES models are then applied to an annular multi-burner gas turbine combustor and a simplified scramjet combustor, for which some additional experimental data are available. For these cases, good agreement with the available reference data is obtained, and the LES predictions are used to elucidate the flow physics in such devices to further enhance our knowledge of these propulsion systems. Particular attention is focused on the influence of the combustion chemistry, turbulence-chemistry interaction, self-ignition, flame holding, burner-to-burner interactions and combustion oscillations.
Techniques for extracting single-trial activity patterns from large-scale neural recordings
Churchland, Mark M; Yu, Byron M; Sahani, Maneesh; Shenoy, Krishna V
2008-01-01
Large, chronically implanted arrays of microelectrodes are an increasingly common tool for recording from primate cortex and can provide extracellular recordings from many (order of 100) neurons. While the desire for cortically based motor prostheses has helped drive their development, such arrays also offer great potential to advance basic neuroscience research. Here we discuss the utility of array recording for the study of neural dynamics. Neural activity often has dynamics beyond that driven directly by the stimulus. While governed by those dynamics, neural responses may nevertheless unfold differently for nominally identical trials, rendering many traditional analysis methods ineffective. We review recent studies – some employing simultaneous recording, some not – indicating that such variability is indeed present both during movement generation and during the preceding premotor computations. In such cases, large-scale simultaneous recordings have the potential to provide an unprecedented view of neural dynamics at the level of single trials. However, this enterprise will depend not only on techniques for simultaneous recording, but also on the use and further development of analysis techniques that can appropriately reduce the dimensionality of the data and allow visualization of single-trial neural behavior. PMID:18093826
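A hedged sketch of one common dimensionality-reduction step for such data: project single-trial firing-rate vectors onto their leading principal components so trial-to-trial trajectories can be visualized. This is generic PCA on synthetic data, not the particular single-trial methods discussed in the review; all array shapes below are invented for the example.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 40 trials x 30 time bins x 100 neurons of firing rates.
n_trials, n_bins, n_neurons = 40, 30, 100
latent = np.cumsum(rng.normal(size=(n_trials, n_bins, 3)), axis=1)   # shared dynamics
loading = rng.normal(size=(3, n_neurons))
rates = latent @ loading + 0.5 * rng.normal(size=(n_trials, n_bins, n_neurons))

# PCA on the pooled (trial*time, neuron) matrix.
X = rates.reshape(-1, n_neurons)
X = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
components = Vt[:3]                                   # top 3 principal axes

# Project each single trial into the low-dimensional space for visualization.
trajectories = rates @ components.T                   # shape: (trials, bins, 3)
print(trajectories.shape)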
Analysis of terrestrial conditions and dynamics
NASA Technical Reports Server (NTRS)
Goward, S. N. (Principal Investigator)
1984-01-01
Land spectral reflectance properties for selected locations, including the Goddard Space Flight Center, the Wallops Flight Facility, an MLA test site in Cambridge, Maryland, and an acid test site in Burlington, Vermont, were measured. Methods to simulate the bidirectional reflectance properties of vegetated landscapes and a data base for spatial resolution were developed. North American vegetation patterns observed with the Advanced Very High Resolution Radiometer were assessed. The data and methods needed to model large-scale vegetation activity with remotely sensed observations and climate data were compiled.
NASA Technical Reports Server (NTRS)
Andrews, E. H., Jr.; Mackley, E. A.
1976-01-01
The NASA Hypersonic Research Engine (HRE) Project was initiated for the purpose of advancing the technology of airbreathing propulsion for hypersonic flight. The project encompassed a large component (inlet, combustor, and nozzle) and structures development program. Tests of a full-scale (18 in. diameter cowl and 87 in. long) HRE concept, designated the Aerothermodynamic Integration Model (AIM), were conducted at Mach numbers of 5, 6, and 7. Computer program results for the Mach 6 component integration tests are presented.
Privacy Challenges of Genomic Big Data.
Shen, Hong; Ma, Jian
2017-01-01
With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently with low cost. However, such a massive amount of personal genomic data creates tremendous challenges for privacy, especially given the emergence of the direct-to-consumer (DTC) industry that provides genetic testing services. Here we review the recent development in genomic big data and its implications on privacy. We also discuss the current dilemmas and future challenges of genomic privacy.
High Efficiency Solar Thermochemical Reactor for Hydrogen Production.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDaniel, Anthony H.
2017-09-30
This research and development project is focused on the advancement of a technology that produces hydrogen at a cost that is competitive with fossil-based fuels for transportation. A two-step, solar-driven water-splitting (WS) thermochemical cycle is theoretically capable of achieving a solar-to-hydrogen (STH) conversion ratio that exceeds the DOE target of 26% at a scale large enough to support an industrialized economy [1]. The challenge is to transition this technology from the laboratory to the marketplace and produce hydrogen at a cost that meets or exceeds DOE targets.
NASA Technical Reports Server (NTRS)
Singh, Mrityunjay; Petko, Jeannie F.
2004-01-01
Affordable fiber-reinforced ceramic matrix composites with multifunctional properties are critically needed for high-temperature aerospace and space transportation applications. These materials have various applications in advanced high-efficiency and high-performance engines, airframe and propulsion components for next-generation launch vehicles, and components for land-based systems. A number of these applications require materials with specific functional characteristics: for example, thick components, hybrid layups for environmental durability and stress management, and self-healing and smart composite matrices. At present, with limited success and very high cost, traditional composite fabrication technologies have been utilized to manufacture some large, complex-shape components of these materials. However, many challenges still remain in developing affordable, robust, and flexible manufacturing technologies for large, complex-shape components with multifunctional properties. The prepreg and melt infiltration (PREMI) technology provides an affordable and robust manufacturing route for low-cost, large-scale production of multifunctional ceramic composite components.
Advancing the large-scale CCS database for metabolomics and lipidomics at the machine-learning era.
Zhou, Zhiwei; Tu, Jia; Zhu, Zheng-Jiang
2018-02-01
Metabolomics and lipidomics aim to comprehensively measure the dynamic changes of all metabolites and lipids present in biological systems. The use of ion mobility-mass spectrometry (IM-MS) for metabolomics and lipidomics has facilitated the separation and identification of metabolites and lipids in complex biological samples. The collision cross-section (CCS) value derived from IM-MS is a valuable physicochemical property for the unambiguous identification of metabolites and lipids. However, CCS values obtained from experimental measurement and computational modeling are of limited availability, which significantly restricts the application of IM-MS. In this review, we discuss the recently developed machine-learning based prediction approach, which can efficiently generate precise CCS databases on a large scale. We also highlight the applications of CCS databases to support metabolomics and lipidomics. Copyright © 2017 Elsevier Ltd. All rights reserved.
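A minimal sketch of the machine-learning idea: train a regressor that maps simple molecular descriptors to CCS values, then predict CCS for new metabolites. The feature columns and random data below are placeholders; published approaches use richer molecular descriptors and curated experimental training sets.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Placeholder training set: columns might be m/z, adduct code, and a few
# hypothetical structural descriptors; targets are CCS values in A^2.
X_train = rng.uniform(low=[80, 0, 0, 0], high=[1200, 3, 10, 50], size=(500, 4))
ccs_train = 0.4 * X_train[:, 0] + 5 * X_train[:, 2] + rng.normal(scale=5, size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, ccs_train)

# Predict CCS for a new (hypothetical) metabolite descriptor vector.
new_metabolite = np.array([[300.1, 1, 4.2, 12.0]])
print("predicted CCS:", model.predict(new_metabolite)[0])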
Tools for phospho- and glycoproteomics of plasma membranes.
Wiśniewski, Jacek R
2011-07-01
Analysis of plasma membrane proteins and their posttranslational modifications is considered important for the identification of disease markers and targets for drug treatment. Due to their insolubility in water, the study of plasma membrane proteins using mass spectrometry was difficult for a long time. Recent technological developments in sample preparation, together with important improvements in mass spectrometric analysis, have facilitated the analysis of these proteins and their posttranslational modifications. Now, large-scale proteomic analyses allow identification of thousands of membrane proteins from minute amounts of sample. Optimized protocols for affinity enrichment of phosphorylated and glycosylated peptides have set new dimensions in the depth of characterization of these posttranslational modifications of plasma membrane proteins. Here, I summarize recent advances in proteomic technology for the characterization of cell surface proteins and their modifications. The focus is on approaches allowing large-scale mapping rather than analytical methods suitable for studying individual proteins or non-complex mixtures.
Large-scale high-throughput computer-aided discovery of advanced materials using cloud computing
NASA Astrophysics Data System (ADS)
Bazhirov, Timur; Mohammadi, Mohammad; Ding, Kevin; Barabash, Sergey
Recent advances in cloud computing have made it possible to access large-scale computational resources completely on demand in a rapid and efficient manner. When combined with high-fidelity simulations, they serve as an alternative pathway to enable computational discovery and design of new materials through large-scale high-throughput screening. Here, we present a case study for a cloud platform implemented at Exabyte Inc. We perform calculations to screen lightweight ternary alloys for thermodynamic stability. Due to the lack of experimental data for most such systems, we rely on theoretical approaches based on first-principles pseudopotential density functional theory. We calculate the formation energies for a set of ternary compounds approximated by special quasirandom structures. During an example run we were able to scale to 10,656 CPUs within 7 minutes from the start and obtain results for 296 compounds within 38 hours. The results indicate that the ultimate formation enthalpy of ternary systems can be negative for some lightweight alloys, including Li and Mg compounds. We conclude that, compared to the traditional capital-intensive approach that requires on-premises hardware resources, cloud computing is agile and cost-effective, yet scalable and delivers similar performance.
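For reference, the thermodynamic-stability screen described above typically evaluates a formation energy of the standard form below (a textbook definition consistent with the abstract, not an equation quoted from the paper). For a ternary compound A_xB_yC_z represented by a special quasirandom structure,

\Delta H_f(\mathrm{A}_x\mathrm{B}_y\mathrm{C}_z) = \frac{E(\mathrm{A}_x\mathrm{B}_y\mathrm{C}_z) - x\,E(\mathrm{A}) - y\,E(\mathrm{B}) - z\,E(\mathrm{C})}{x + y + z},

where the E(\cdot) are DFT total energies (per formula unit for the compound, per atom for the elemental references); a negative \Delta H_f indicates stability against decomposition into the pure elements.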
NASA Astrophysics Data System (ADS)
Hua, Wei-Bo; Guo, Xiao-Dong; Zheng, Zhuo; Wang, Yan-Jie; Zhong, Ben-He; Fang, Baizeng; Wang, Jia-Zhao; Chou, Shu-Lei; Liu, Heng
2015-02-01
Developing advanced electrode materials that deliver high energy at ultra-fast charge and discharge rates is crucial to meeting the increasing large-scale market demand for high-power lithium ion batteries (LIBs). A three-dimensional (3D) nanoflower structure is successfully developed in the large-scale synthesis of LiNi1/3Co1/3Mn1/3O2 material for the first time. Fast co-precipitation is the key technique for preparing the nanoflower structure in our method. After heat treatment, the obtained LiNi1/3Co1/3Mn1/3O2 nanoflowers (NL333) present a pronounced flower-like nano-architecture and provide fast pathways for the transport of Li-ions and electrons. As a cathode material in a LIB, the prepared NL333 electrode demonstrates an outstanding high-rate capability. In particular, in a narrow voltage range of 2.7-4.3 V, the discharge capacity at an ultra-fast charge-discharge rate (20C) is up to 126 mAh g-1, which reaches 78% of that at 0.2C and is much higher than that (i.e., 44.17%) of the traditional bulk LiNi1/3Co1/3Mn1/3O2.
NASA Technical Reports Server (NTRS)
Gabb, Tim; Gayda, John; Telesman, Jack
2001-01-01
The advanced powder metallurgy disk alloy ME3 was designed, using statistical screening and optimization of composition and processing variables in the NASA HSR/EPM disk program, to have extended durability at 1150 to 1250 °F in large disks. Scaled-up disks of this alloy were produced at the conclusion of this program to demonstrate these properties in realistic disk shapes. The objective of the UEET disk program was to assess the mechanical properties of these ME3 disks as functions of temperature, in order to estimate the maximum temperature capabilities of this advanced alloy. Scaled-up disks processed in the HSR/EPM Compressor/Turbine Disk program were sectioned, machined into specimens, and tested in tensile, creep, fatigue, and fatigue crack growth tests by NASA Glenn Research Center, in cooperation with General Electric Engine Company and Pratt & Whitney Aircraft Engines. Additional sub-scale disks and blanks were processed and tested to explore the effects of several processing variations on mechanical properties. Scaled-up disks of an advanced regional disk alloy, Alloy 10, were used to evaluate dual-microstructure heat treatments. This allowed demonstration of an improved balance of properties in disks with higher strength and fatigue resistance in the bores and higher creep and dwell fatigue crack growth resistance in the rims. Results indicate the baseline ME3 alloy and process has 1300 to 1350 °F temperature capabilities, dependent on detailed disk and engine design property requirements. Chemistry and process enhancements show promise for further increasing temperature capabilities.
Sustainability of utility-scale solar energy: Critical environmental concepts
NASA Astrophysics Data System (ADS)
Hernandez, R. R.; Moore-O'Leary, K. A.; Johnston, D. S.; Abella, S.; Tanner, K.; Swanson, A.; Kreitler, J.; Lovich, J.
2017-12-01
Renewable energy development is an arena where ecological, political, and socioeconomic values collide. Advances in renewable energy will incur steep environmental costs to landscapes in which facilities are constructed and operated. Scientists - including those from academia, industry, and government agencies - have only recently begun to quantify trade-offs in this arena, often using ground-mounted, utility-scale solar energy facilities (USSE, ≥ 1 megawatt) as a model. Here, we discuss five critical ecological concepts applicable to the development of more sustainable USSE with benefits over fossil-fuel-generated energy: (1) more sustainable USSE development requires careful evaluation of trade-offs between land, energy, and ecology; (2) species responses to habitat modification by USSE vary; (3) cumulative and large-scale ecological impacts are complex and challenging to mitigate; (4) USSE development affects different types of ecosystems and requires customized design and management strategies; and (5) long-term ecological consequences associated with USSE sites must be carefully considered. These critical concepts provide a framework for reducing adverse environmental impacts, informing policy to establish and address conservation priorities, and improving energy production sustainability.
Steps Towards Understanding Large-scale Deformation of Gas Hydrate-bearing Sediments
NASA Astrophysics Data System (ADS)
Gupta, S.; Deusner, C.; Haeckel, M.; Kossel, E.
2016-12-01
Marine sediments bearing gas hydrates are typically characterized by heterogeneity in the gas hydrate distribution and anisotropy in the sediment-gas hydrate fabric properties. Gas hydrates also contribute to the strength and stiffness of the marine sediment, and any disturbance in the thermodynamic stability of the gas hydrates is likely to affect the geomechanical stability of the sediment. Understanding the mechanisms and triggers of large-strain deformation and failure of marine gas hydrate-bearing sediments is an area of extensive research, particularly in the context of marine slope stability and industrial gas production. The ultimate objective is to predict severe deformation events, such as regional-scale slope failure or excessive sand production, by using numerical simulation tools. The development of such tools essentially requires a careful analysis of the thermo-hydro-chemo-mechanical behavior of gas hydrate-bearing sediments at lab scale, and its stepwise integration into reservoir-scale simulators through the definition of effective variables, the use of suitable constitutive relations, and the application of scaling laws. One of the focus areas of our research is to understand the bulk coupled behavior of marine gas hydrate systems, with contributions from micro-scale characteristics, transport-reaction dynamics, and structural heterogeneity, through experimental flow-through studies using high-pressure triaxial test systems and advanced tomographic tools (CT, ERT, MRI). We combine these studies to develop mathematical models and numerical simulation tools that can be used to predict the coupled hydro-geomechanical behavior of marine gas hydrate reservoirs in a large-strain framework. Here we will present some of our recent results from closely coordinated experimental and numerical simulation studies with the objective of capturing the large-deformation behavior relevant to different gas production scenarios. We will also report on a variety of mechanically relevant test scenarios focusing on the effects of dynamic changes in gas hydrate saturation, highly uneven gas hydrate distributions, focused fluid migration, and gas hydrate production through depressurization and CO2 injection.
Orthographic and Phonological Neighborhood Databases across Multiple Languages.
Marian, Viorica
2017-01-01
The increased globalization of science and technology and the growing number of bilinguals and multilinguals in the world have made research with multiple languages a mainstay for scholars who study human function and especially those who focus on language, cognition, and the brain. Such research can benefit from large-scale databases and online resources that describe and measure lexical, phonological, orthographic, and semantic information. The present paper discusses currently-available resources and underscores the need for tools that enable measurements both within and across multiple languages. A general review of language databases is followed by a targeted introduction to databases of orthographic and phonological neighborhoods. A specific focus on CLEARPOND illustrates how databases can be used to assess and compare neighborhood information across languages, to develop research materials, and to provide insight into broad questions about language. As an example of how using large-scale databases can answer questions about language, a closer look at neighborhood effects on lexical access reveals that not only orthographic, but also phonological neighborhoods can influence visual lexical access both within and across languages. We conclude that capitalizing upon large-scale linguistic databases can advance, refine, and accelerate scientific discoveries about the human linguistic capacity.
Weighted and directed interactions in evolving large-scale epileptic brain networks
NASA Astrophysics Data System (ADS)
Dickten, Henning; Porz, Stephan; Elger, Christian E.; Lehnertz, Klaus
2016-10-01
Epilepsy can be regarded as a network phenomenon with functionally and/or structurally aberrant connections in the brain. Over the past years, concepts and methods from network theory have contributed substantially to improving the characterization of the structure and function of these epileptic networks and thus to advancing understanding of the dynamical disease epilepsy. We extend this promising line of research and assess, with high spatial and temporal resolution and using complementary analysis approaches that capture different characteristics of the complex dynamics, both the strength and the direction of interactions in evolving large-scale epileptic brain networks of 35 patients who suffered from drug-resistant focal seizures with different anatomical onset locations. Despite this heterogeneity, we find that even during the seizure-free interval the seizure onset zone is a brain region that, when averaged over time, exerts the strongest directed influences over other brain regions that are part of a large-scale network. This crucial role, however, manifests only when averaging at the population-sample level: in more than one third of patients, the strongest directed interactions are observed between brain regions far from the seizure onset zone. This may guide new developments for individualized diagnosis, treatment and control.
Supersonic Retropropulsion Technology Development in NASA's Entry, Descent, and Landing Project
NASA Technical Reports Server (NTRS)
Edquist, Karl T.; Berry, Scott A.; Rhode, Matthew N.; Kelb, Bil; Korzun, Ashley; Dyakonov, Artem A.; Zarchi, Kerry A.; Schauerhamer, Daniel G.; Post, Ethan A.
2012-01-01
NASA's Entry, Descent, and Landing (EDL) space technology roadmap calls for new technologies to achieve human exploration of Mars in the coming decades [1]. One of those technologies, termed Supersonic Retropropulsion (SRP), involves initiation of propulsive deceleration at supersonic Mach numbers. The potential benefits afforded by SRP to improve payload mass and landing precision make the technology attractive for future EDL missions. NASA's EDL project spent two years advancing the technological maturity of SRP for Mars exploration [2-15]. This paper summarizes the technical accomplishments from the project and highlights challenges and recommendations for future SRP technology development programs. These challenges include: developing sufficiently large SRP engines for use on human-scale entry systems; testing and computationally modelling complex and unsteady SRP fluid dynamics; understanding the effects of SRP on entry vehicle stability and controllability; and demonstrating sub-scale SRP entry systems in Earth's atmosphere.
Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy
2012-11-01
Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task, ideologically, technically, economically and personally. The many obstacles that small-scale growers face and their lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
Heeres, Arjan S; Picone, Carolina S F; van der Wielen, Luuk A M; Cunha, Rosiane L; Cuellar, Maria C
2014-04-01
Isoprenoids and alkanes produced and secreted by microorganisms are emerging as an alternative biofuel for diesel and jet fuel replacements. In a similar way as for other bioprocesses comprising an organic liquid phase, the presence of microorganisms, medium composition, and process conditions may result in emulsion formation during fermentation, hindering product recovery. At the same time, a low-cost production process overcoming this challenge is required to make these advanced biofuels a feasible alternative. We review the main mechanisms and causes of emulsion formation during fermentation, because a better understanding on the microscale can give insights into how to improve large-scale processes and the process technology options that can address these challenges. Copyright © 2014 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alan Black; Arnis Judzis
2004-10-01
The industry cost shared program aims to benchmark drilling rates of penetration in selected simulated deep formations and to significantly improve ROP through a team development of aggressive diamond product drill bit--fluid system technologies. Overall the objectives are as follows: Phase 1--Benchmark ''best in class'' diamond and other product drilling bits and fluids and develop concepts for a next level of deep drilling performance; Phase 2--Develop advanced smart bit-fluid prototypes and test at large scale; and Phase 3--Field trial smart bit-fluid concepts, modify as necessary and commercialize products. As of report date, TerraTek has concluded all major preparations for the high pressure drilling campaign. Baker Hughes encountered difficulties in providing additional pumping capacity before TerraTek's scheduled relocation to another facility, thus the program was delayed further to accommodate the full testing program.
NASA Astrophysics Data System (ADS)
Dörner, Ralf; Lok, Benjamin; Broll, Wolfgang
Backed by a large consumer market, entertainment and education applications have spurred developments in the fields of real-time rendering and interactive computer graphics. Relying on Computer Graphics methodologies, Virtual Reality and Augmented Reality benefited indirectly from this; however, there is no large-scale demand for VR and AR in gaming and learning. What are the shortcomings of current VR/AR technology that prevent widespread use in these application areas? What advances in VR/AR will be necessary? And what might future “VR-enhanced” gaming and learning look like? Which role can and will Virtual Humans play? Concerning these questions, this article analyzes the current situation and provides an outlook on future developments. The focus is on social gaming and learning.
Wawrzyniak, Zbigniew M; Paczesny, Daniel; Mańczuk, Marta; Zatoński, Witold A
2011-01-01
Large-scale epidemiologic studies can assess health indicators that differentiate social groups and important health outcomes, such as the incidence and mortality of cancer, cardiovascular disease, and other conditions, to establish a solid knowledge base for preventing the causes of premature morbidity and mortality. This study presents new, advanced methods of data collection and data management, with ongoing data quality control and security, to ensure high-quality assessment of health indicators in the large epidemiologic PONS study (The Polish-Norwegian Study). The material for the experiment is the data management design of the large-scale population study in Poland (PONS), and the managed processes are applied to establishing a high-quality and solid knowledge base. The functional requirements of PONS study data collection, supported by advanced web-based IT methods, resulted in medical data of high quality; data security, data quality assessment, process control and evolution monitoring are fulfilled and shared by the IT system. Data from disparate, distributed sources of information are integrated into databases via software interfaces and archived by a multi-task secure server. The practical, implemented solution of modern advanced database technologies and a remote software/hardware structure successfully supports the research of the large PONS study project. Development and implementation of follow-up control of the consistency and quality of data analysis and of the processes of the PONS sub-databases yield excellent measurement properties, with data consistency of more than 99%. The project itself, through a tailored hardware/software application, shows the positive impact of Quality Assurance (QA) on the quality of outcome analysis results and on effective data management within a shorter time. This efficiency ensures the quality of the epidemiological data and health indicators by eliminating common errors in research questionnaires and medical measurements.
Mapping the dark space of chemical reactions with extended nanomole synthesis and MALDI-TOF MS.
Lin, Shishi; Dikler, Sergei; Blincoe, William D; Ferguson, Ronald D; Sheridan, Robert P; Peng, Zhengwei; Conway, Donald V; Zawatzky, Kerstin; Wang, Heather; Cernak, Tim; Davies, Ian W; DiRocco, Daniel A; Sheng, Huaming; Welch, Christopher J; Dreher, Spencer D
2018-05-24
Understanding the practical limitations of chemical reactions is critically important for efficiently planning the synthesis of compounds in pharmaceutical, agrochemical and specialty chemical research and development. However, literature reports of the scope of new reactions are often cursory and biased toward successful results, severely limiting the ability to predict reaction outcomes for untested substrates. We herein illustrate strategies for carrying out large scale surveys of chemical reactivity using a material-sparing nanomole-scale automated synthesis platform with greatly expanded synthetic scope combined with ultra-high throughput (uHT) matrix assisted laser desorption/ionization time of flight mass spectrometry (MALDI-TOF MS). Copyright © 2018, American Association for the Advancement of Science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babu, Sudarsanam Suresh; Love, Lonnie J.; Peter, William H.
Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of the Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Following, the extension of underlying technology to low-cost metals was proposed with the following goals: (i) High deposition rates (approaching 100 lbs/h); (ii) Low cost (<$10/lbs) for steel, iron, aluminum, nickel, as well as, higher cost titanium, (iii) large components (major axis greater than 6 ft) and (iv) compliance of property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped in different categories including, (i) CAD to PART software, (ii) selection of energy source, (iii) systems development, (iv) material feedstock, (v) process planning, (vi) residual stress & distortion, (vii) post-processing, (viii) qualification of parts, (ix) supply chain and (x) business case. Furthermore, an open innovation network methodology was proposed to accelerate the development and deployment of new large-scale metal additive manufacturing technology with the goal of creating a new generation of high deposition rate equipment, affordable feed stocks, and large metallic components to enhance America's economic competitiveness.
RECENT ADVANCES IN HIGH TEMPERATURE ELECTROLYSIS AT IDAHO NATIONAL LABORATORY: STACK TESTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
X, Zhang; J. E. O'Brien; R. C. O'Brien
2012-07-01
High temperature steam electrolysis is a promising technology for efficient sustainable large-scale hydrogen production. Solid oxide electrolysis cells (SOECs) are able to utilize high temperature heat and electric power from advanced high-temperature nuclear reactors or renewable sources to generate carbon-free hydrogen at large scale. However, long term durability of SOECs needs to be improved significantly before commercialization of this technology. A degradation rate of 1%/khr or lower is proposed as a threshold value for commercialization of this technology. Solid oxide electrolysis stack tests have been conducted at Idaho National Laboratory to demonstrate recent improvements in long-term durability of SOECs. Electrolyte-supported and electrode-supported SOEC stacks were provided by Ceramatec Inc., Materials and Systems Research Inc. (MSRI), and Saint Gobain Advanced Materials (St. Gobain), respectively for these tests. Long-term durability tests were generally operated for a duration of 1000 hours or more. Stack tests based on technology developed at Ceramatec and MSRI have shown significant improvement in durability in the electrolysis mode. Long-term degradation rates of 3.2%/khr and 4.6%/khr were observed for MSRI and Ceramatec stacks, respectively. One recent Ceramatec stack even showed negative degradation (performance improvement) over 1900 hours of operation. A three-cell short stack provided by St. Gobain, however, showed rapid degradation in the electrolysis mode. Improvements on electrode materials, interconnect coatings, and electrolyte-electrode interface microstructures contribute to better durability of SOEC stacks.
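For readers unfamiliar with the %/khr figure of merit, per-kilohour degradation rates of this kind are commonly defined as the fractional change in a performance metric, often the stack operating voltage at constant current, normalized to 1000 hours of operation. The abstract does not state which metric was used, so the expression below is a hedged convention rather than the authors' definition:

```latex
% Hedged convention for a per-kilohour degradation rate, written here in terms of
% stack voltage at constant current; the abstract does not specify the metric used.
r\,[\%/\mathrm{kh}] \;=\; \frac{V(t)-V_{0}}{V_{0}} \times \frac{1000~\mathrm{h}}{t} \times 100
```

Under this convention a negative rate, such as that seen for one Ceramatec stack over 1900 hours, corresponds to a net performance improvement.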
Varma, Rohit; Foong, Athena W.P.; Lai, Mei-Ying; Choudhury, Farzana; Klein, Ronald; Azen, Stanley P.
2011-01-01
Purpose: To estimate 4-year incidence and progression of early and advanced age-related macular degeneration (AMD). Design: Population-based cohort study. Methods: A comprehensive ophthalmologic examination including stereoscopic fundus photography was performed on adult Latinos at baseline and follow-up. Photographs were graded using a modified Wisconsin Age-Related Maculopathy Grading System. For estimations of incidence and progression of AMD, the Age-Related Eye Disease Study Scale was used. Main outcome measures are incidence and progression of early AMD (drusen type, drusen size, and retinal pigmentary abnormalities) and advanced AMD (exudative AMD and geographic atrophy). Results: 4,658/6,100 (76%) completed the follow-up examination. The 4-year incidence of early AMD was 7.5% (95% CI: 6.6, 8.4) and of advanced AMD was 0.2% (95% CI: 0.1, 0.4). Progression of any AMD occurred in 9.3% (95% CI: 8.4, 10.3) of at-risk participants. Incidence and progression increased with age. Incidence of early AMD in the second eye (10.8%) was higher than incidence in the first eye (6.9%). Baseline presence of soft indistinct large drusen ≥250 μm in diameter was more likely to predict the 4-year incidence of pigmentary abnormalities, geographic atrophy, and exudative AMD than smaller or hard or soft distinct drusen. Conclusions: Age-specific incidence and progression of AMD in Latinos are lower than in non-Hispanic whites. While incident early AMD is more often unilateral, the risk of its development in the second eye is higher than in the first eye. Older persons and those with soft indistinct large drusen had a higher risk of developing advanced AMD compared to those who were younger and did not have soft indistinct large drusen. PMID:20399926
Antibody Engineering for Pursuing a Healthier Future
Saeed, Abdullah F. U. H.; Wang, Rongzhi; Ling, Sumei; Wang, Shihua
2017-01-01
Since the development of antibody-production techniques, a number of immunoglobulins have been developed on a large scale using conventional methods. Hybridoma technology opened a new horizon in the production of antibodies against target antigens of infectious pathogens, malignant diseases including autoimmune disorders, and numerous potent toxins. However, these clinical humanized or chimeric murine antibodies have several limitations and complexities. Therefore, to overcome these difficulties, recent advances in genetic engineering techniques and phage display technique have allowed the production of highly specific recombinant antibodies. These engineered antibodies have been constructed in the hunt for novel therapeutic drugs equipped with enhanced immunoprotective abilities, such as engaging immune effector functions, effective development of fusion proteins, efficient tumor and tissue penetration, and high-affinity antibodies directed against conserved targets. Advanced antibody engineering techniques have extensive applications in the fields of immunology, biotechnology, diagnostics, and therapeutic medicines. However, there is limited knowledge regarding dynamic antibody development approaches. Therefore, this review extends beyond our understanding of conventional polyclonal and monoclonal antibodies. Furthermore, recent advances in antibody engineering techniques together with antibody fragments, display technologies, immunomodulation, and broad applications of antibodies are discussed to enhance innovative antibody production in pursuit of a healthier future for humans. PMID:28400756
NASA Technical Reports Server (NTRS)
Badescu, Mircea
2014-01-01
Subsurface penetration by coring, drilling or abrading is of great importance for a large number of space and earth applications. An Ultrasonic/Sonic Drill/Corer (USDC) has been in development at JPL's Nondestructive Evaluation and Advanced Actuators (NDEAA) lab as an adaptable tool for many of these applications. The USDC uses a novel drive mechanism to transform the high frequency ultrasonic or sonic vibrations of the tip of a horn into a lower frequency sonic hammering of a drill bit through an intermediate free-flying mass. The USDC device idea has been implemented at various scales from handheld drills to large diameter coring devices. A series of computer programs that model the function and performance of the USDC device were developed and were later integrated into an automated modeling package. The USDC has also evolved from a purely hammering drill to a rotary hammer drill as the design requirements increased from small diameter shallow drilling to large diameter deep coring. A synthesis of the Auto-Gopher development is presented in this paper.
Towards Portable Large-Scale Image Processing with High-Performance Computing.
Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A
2018-05-03
High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was natively deployed (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from the system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.
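The isolation idea described above can be pictured with a minimal sketch: each "spider" runs inside a container that carries its own software stack, so only the data directories touch the host. The image name, mount paths, and pipeline command below are hypothetical placeholders, not the actual VUIIS CCI / DAX interface:

```python
# Minimal sketch of the isolation idea: run one image-processing "spider" inside
# a container so it carries its own libraries regardless of the host HPC stack.
# The image name, mount paths, and pipeline command are hypothetical placeholders.
import subprocess

def run_spider(container_image, input_dir, output_dir, pipeline_cmd):
    """Launch one containerized pipeline with the data directories bind-mounted."""
    cmd = [
        "docker", "run", "--rm",
        "-v", f"{input_dir}:/input:ro",   # read-only input data
        "-v", f"{output_dir}:/output",    # writable output location
        container_image,
    ] + pipeline_cmd
    subprocess.run(cmd, check=True)

# Hypothetical usage:
# run_spider("example/spider:1.0", "/data/sub-01", "/results/sub-01",
#            ["process_scan", "--in", "/input", "--out", "/output"])
```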
Scan-o-matic: High-Resolution Microbial Phenomics at a Massive Scale
Zackrisson, Martin; Hallin, Johan; Ottosson, Lars-Göran; Dahl, Peter; Fernandez-Parada, Esteban; Ländström, Erik; Fernandez-Ricaud, Luciano; Kaferle, Petra; Skyman, Andreas; Stenberg, Simon; Omholt, Stig; Petrovič, Uroš; Warringer, Jonas; Blomberg, Anders
2016-01-01
The capacity to map traits over large cohorts of individuals—phenomics—lags far behind the explosive development in genomics. For microbes, the estimation of growth is the key phenotype because of its link to fitness. We introduce an automated microbial phenomics framework that delivers accurate, precise, and highly resolved growth phenotypes at an unprecedented scale. Advancements were achieved through the introduction of transmissive scanning hardware and software technology, frequent acquisition of exact colony population size measurements, extraction of population growth rates from growth curves, and removal of spatial bias by reference-surface normalization. Our prototype arrangement automatically records and analyzes close to 100,000 growth curves in parallel. We demonstrate the power of the approach by extending and nuancing the known salt-defense biology in baker’s yeast. The introduced framework represents a major advance in microbial phenomics by providing high-quality data for extensive cohorts of individuals and generating well-populated and standardized phenomics databases. PMID:27371952
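To illustrate the general notion of extracting a population growth rate from a colony growth curve (a generic exponential-fit sketch, not Scan-o-matic's actual algorithm):

```python
# Generic estimate of a maximum specific growth rate from a colony growth curve:
# fit the slope of ln(population size) versus time in a sliding window and keep
# the steepest slope. A standard approach, not Scan-o-matic's exact method.
import numpy as np

def max_growth_rate(times_h, pop_sizes, window=5):
    """Return the steepest local slope of ln(population) vs. time (per hour)."""
    t = np.asarray(times_h, dtype=float)
    log_n = np.log(np.asarray(pop_sizes, dtype=float))
    slopes = [
        np.polyfit(t[i:i + window], log_n[i:i + window], 1)[0]
        for i in range(len(t) - window + 1)
    ]
    return max(slopes)

# Synthetic example: early exponential growth at ~0.35 per hour, saturating later.
t = np.arange(0, 24, 0.5)
n = 50 * np.exp(0.35 * t) / (1 + 50 * np.exp(0.35 * t) / 1e6)
print(round(max_growth_rate(t, n), 2))  # approximately 0.35
```

Scan-o-matic additionally removes spatial bias with reference-surface normalization before rates are extracted, which this sketch omits.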
Images as drivers of progress in cardiac computational modelling
Lamata, Pablo; Casero, Ramón; Carapella, Valentina; Niederer, Steve A.; Bishop, Martin J.; Schneider, Jürgen E.; Kohl, Peter; Grau, Vicente
2014-01-01
Computational models have become a fundamental tool in cardiac research. Models are evolving to cover multiple scales and physical mechanisms. They are moving towards mechanistic descriptions of personalised structure and function, including effects of natural variability. These developments are underpinned to a large extent by advances in imaging technologies. This article reviews how novel imaging technologies, or the innovative use and extension of established ones, integrate with computational models and drive novel insights into cardiac biophysics. In terms of structural characterization, we discuss how imaging is allowing a wide range of scales to be considered, from cellular levels to whole organs. We analyse how the evolution from structural to functional imaging is opening new avenues for computational models, and in this respect we review methods for measurement of electrical activity, mechanics and flow. Finally, we consider ways in which combined imaging and modelling research is likely to continue advancing cardiac research, and identify some of the main challenges that remain to be solved. PMID:25117497
NASA Technical Reports Server (NTRS)
Pokhrel, Yadu N.; Hanasaki, Naota; Wada, Yoshihide; Kim, Hyungjun
2016-01-01
The global water cycle has been profoundly affected by human land-water management. As the changes in the water cycle on land can affect the functioning of a wide range of biophysical and biogeochemical processes of the Earth system, it is essential to represent human land-water management in Earth system models (ESMs). During the recent past, noteworthy progress has been made in large-scale modeling of human impacts on the water cycle but sufficient advancements have not yet been made in integrating the newly developed schemes into ESMs. This study reviews the progress made in incorporating human factors in large-scale hydrological models and their integration into ESMs. The study focuses primarily on the recent advancements and existing challenges in incorporating human impacts in global land surface models (LSMs) as a way forward to the development of ESMs with humans as integral components, but a brief review of global hydrological models (GHMs) is also provided. The study begins with a general overview of human impacts on the water cycle. Then, the algorithms currently employed to represent irrigation, reservoir operation, and groundwater pumping are discussed. Next, methodological deficiencies in current modeling approaches and existing challenges are identified. Furthermore, light is shed on the sources of uncertainties associated with model parameterizations, grid resolution, and datasets used for forcing and validation. Finally, representing human land-water management in LSMs is highlighted as an important research direction toward developing integrated models using ESM frameworks for the holistic study of human-water interactions within the Earth system.
NASA Technical Reports Server (NTRS)
1979-01-01
The preliminary design for a prototype small (20 kWe) solar thermal electric generating unit was completed, consisting of several subsystems. The concentrator and the receiver collect solar energy and a thermal buffer storage with a transport system is used to provide a partially smoothed heat input to the Stirling engine. A fossil-fuel combustor is included in the receiver designs to permit operation with partial or no solar insolation (hybrid). The engine converts the heat input into mechanical action that powers a generator. To obtain electric power on a large scale, multiple solar modules will be required to operate in parallel. The small solar electric power plant used as a baseline design will provide electricity at remote sites and small communities.
Real-time micro-modelling of city evacuations
NASA Astrophysics Data System (ADS)
Löhner, Rainald; Haug, Eberhard; Zinggerling, Claudio; Oñate, Eugenio
2018-01-01
A methodology to integrate geographical information system (GIS) data with large-scale pedestrian simulations has been developed. Advances in automatic data acquisition and archiving from GIS databases, automatic input for pedestrian simulations, as well as scalable pedestrian simulation tools have made it possible to simulate pedestrians at the individual level for complete cities in real time. An example that simulates the evacuation of the city of Barcelona demonstrates that this is now possible. This is the first step towards a fully integrated crowd prediction and management tool that takes into account not only data gathered in real time from cameras, cell phones or other sensors, but also merges these with advanced simulation tools to predict the future state of the crowd.
Prediction of Indian Summer-Monsoon Onset Variability: A Season in Advance.
Pradhan, Maheswar; Rao, A Suryachandra; Srivastava, Ankur; Dakate, Ashish; Salunke, Kiran; Shameera, K S
2017-10-27
Monsoon onset is an inherent transient phenomenon of the Indian Summer Monsoon, and it was never envisaged that this transience could be predicted at long lead times. Though onset is precipitous, its variability exhibits strong teleconnections with large-scale forcing such as ENSO and IOD and hence may be predictable. Despite the tremendous skill achieved by state-of-the-art models in predicting such large-scale processes, the prediction of monsoon onset variability by the models is still limited to just 2-3 weeks in advance. Using an objective definition of onset in a global coupled ocean-atmosphere model, it is shown that skillful prediction of onset variability is feasible under a seasonal prediction framework. The better representation/simulation of not only the large-scale processes but also the synoptic and intraseasonal features during the evolution of monsoon onset underlies the skillful simulation of monsoon onset variability. The changes observed in convection, tropospheric circulation and moisture availability prior to and after the onset are evidenced in the model simulations, which results in a high hit rate for early/delayed monsoon onset in the high-resolution model.
Multi-scale Material Parameter Identification Using LS-DYNA® and LS-OPT®
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stander, Nielen; Basudhar, Anirban; Basu, Ushnish
2015-09-14
Ever-tightening regulations on fuel economy, and the likely future regulation of carbon emissions, demand persistent innovation in vehicle design to reduce vehicle mass. Classical methods for computational mass reduction include sizing, shape and topology optimization. One of the few remaining options for weight reduction can be found in materials engineering and material design optimization. Apart from considering different types of materials, by adding material diversity and composite materials, an appealing option in automotive design is to engineer steel alloys for the purpose of reducing plate thickness while retaining sufficient strength and ductility required for durability and safety. A project to develop computational material models for advanced high strength steel is currently being executed under the auspices of the United States Automotive Materials Partnership (USAMP) funded by the US Department of Energy. Under this program, new Third Generation Advanced High Strength Steels (i.e., 3GAHSS) are being designed, tested and integrated with the remaining design variables of a benchmark vehicle Finite Element model. The objectives of the project are to integrate atomistic, microstructural, forming and performance models to create an integrated computational materials engineering (ICME) toolkit for 3GAHSS. The mechanical properties of Advanced High Strength Steels (AHSS) are controlled by many factors, including phase composition and distribution in the overall microstructure, volume fraction, size and morphology of phase constituents as well as stability of the metastable retained austenite phase. The complex phase transformation and deformation mechanisms in these steels make the well-established traditional techniques obsolete, and a multi-scale microstructure-based modeling approach following the ICME strategy was therefore chosen in this project. Multi-scale modeling as a major area of research and development is an outgrowth of the Comprehensive Test Ban Treaty of 1996 which banned surface testing of nuclear devices [1]. This had the effect that experimental work was reduced from large scale tests to multiscale experiments to provide material models with validation at different length scales. In the subsequent years industry realized that multi-scale modeling and simulation-based design were transferable to the design optimization of any structural system. Horstemeyer [1] lists a number of advantages of the use of multiscale modeling. Among these are: the reduction of product development time by alleviating costly trial-and-error iterations as well as the reduction of product costs through innovations in material, product and process designs. Multi-scale modeling can reduce the number of costly large scale experiments and can increase product quality by providing more accurate predictions. Research tends to be focused on each particular length scale, which enhances accuracy in the long term. This paper serves as an introduction to the LS-OPT and LS-DYNA methodology for multi-scale modeling. It mainly focuses on an approach to integrate material identification using material models of different length scales. As an example, a multi-scale material identification strategy, consisting of a Crystal Plasticity (CP) material model and a homogenized State Variable (SV) model, is discussed and the parameter identification of the individual material models of different length scales is demonstrated.
The paper concludes with thoughts on integrating the multi-scale methodology into the overall vehicle design.
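The core of such a parameter identification step is an optimization loop that adjusts material parameters until the simulated response matches the measured one. The sketch below illustrates that loop with an assumed Swift-type hardening law and synthetic data; it is a plain SciPy illustration, not the LS-OPT/LS-DYNA workflow or the crystal-plasticity model used in the paper:

```python
# Generic flavor of the calibration loop behind material parameter identification:
# adjust model parameters until the simulated stress-strain curve matches the
# measured one in a least-squares sense. Swift-type hardening law and synthetic
# "experimental" data are assumptions made purely for this illustration.
import numpy as np
from scipy.optimize import least_squares

def swift_hardening(params, strain):
    k, eps0, n = params               # strength coefficient (MPa), offset strain, exponent
    return k * (eps0 + strain) ** n   # flow stress, MPa

np.random.seed(0)
strain = np.linspace(0.0, 0.2, 50)
stress_meas = swift_hardening([1200.0, 0.002, 0.18], strain)  # synthetic "experiment"
stress_meas += np.random.normal(0.0, 5.0, strain.size)        # measurement noise

residuals = lambda p: swift_hardening(p, strain) - stress_meas
fit = least_squares(residuals, x0=[1000.0, 0.01, 0.10],
                    bounds=([1.0, 1e-4, 0.01], [5000.0, 0.1, 1.0]))
print("Identified parameters:", np.round(fit.x, 3))
```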
Collaborative visual analytics of radio surveys in the Big Data era
NASA Astrophysics Data System (ADS)
Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.
2017-06-01
Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered with a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, parallel data query, along with data management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform - allowing the research process to continue wherever you are.
NASA Astrophysics Data System (ADS)
Savina, Irina N.; Ingavle, Ganesh C.; Cundy, Andrew B.; Mikhalovsky, Sergey V.
2016-02-01
The development of bulk, three-dimensional (3D), macroporous polymers with high permeability, large surface area and large volume is highly desirable for a range of applications in the biomedical, biotechnological and environmental areas. The experimental techniques currently used are limited to the production of small size and volume cryogel material. In this work we propose a novel, versatile, simple and reproducible method for the synthesis of large volume porous polymer hydrogels by cryogelation. By controlling the freezing process of the reagent/polymer solution, large-scale 3D macroporous gels with wide interconnected pores (up to 200 μm in diameter) and large accessible surface area have been synthesized. For the first time, macroporous gels (of up to 400 ml bulk volume) with controlled porous structure were manufactured, with potential for scale up to much larger gel dimensions. This method can be used for production of novel 3D multi-component macroporous composite materials with a uniform distribution of embedded particles. The proposed method provides better control of freezing conditions and thus overcomes existing drawbacks limiting production of large gel-based devices and matrices. The proposed method could serve as a new design concept for functional 3D macroporous gels and composites preparation for biomedical, biotechnological and environmental applications.
Precision machining of advanced materials with waterjets
NASA Astrophysics Data System (ADS)
Liu, H. T.
2017-01-01
Recent advances in abrasive waterjet technology have elevated it to the state that it often competes on equal footing with lasers and EDM for precision machining. Under the support of a National Science Foundation SBIR Phase II grant, OMAX has developed and commercialized micro abrasive waterjet technology that is incorporated into a MicroMAX® JetMachining® Center. Waterjet technology, combining both abrasive waterjet and micro abrasive waterjet technology, is capable of machining most materials from macro to micro scales for a wide range of part sizes and thicknesses. Waterjet technology has technological and manufacturing merits that cannot be matched by most existing tools. As a cold cutting tool that creates no heat-affected zone, for example, waterjet cuts much faster than wire EDM and laser when measures to minimize a heat-affected zone are taken into account. In addition, waterjet is material independent; it cuts materials that cannot be cut or are difficult to cut otherwise. The versatility of waterjet has also been demonstrated by machining simulated nanomaterials with large gradients of material properties from metal, nonmetal, to anything in between. This paper presents waterjet-machined samples made of a wide range of advanced materials from macro to micro scales.
The requirements for a new full scale subsonic wind tunnel
NASA Technical Reports Server (NTRS)
Kelly, M. W.; Mckinney, M. O.; Luidens, R. W.
1972-01-01
Justification and requirements are presented for a large subsonic wind tunnel capable of testing full scale aircraft, rotor systems, and advanced V/STOL propulsion systems. The design considerations and constraints for such a facility are reviewed, and the trades between facility test capability and costs are discussed.
NASA Astrophysics Data System (ADS)
Tijerina, D.; Gochis, D.; Condon, L. E.; Maxwell, R. M.
2017-12-01
Development of integrated hydrology modeling systems that couple atmospheric, land surface, and subsurface flow is a growing trend in hydrologic modeling. Using an integrated modeling framework, subsurface hydrologic processes, such as lateral flow and soil moisture redistribution, are represented in a single cohesive framework with surface processes like overland flow and evapotranspiration. There is a need for these more intricate models in comprehensive hydrologic forecasting and water management over large spatial areas, specifically the Continental US (CONUS). Currently, two high-resolution, coupled hydrologic modeling applications have been developed for this domain: CONUS-ParFlow, built using the integrated hydrologic model ParFlow, and the National Water Model, which uses the NCAR Weather Research and Forecasting hydrological extension package (WRF-Hydro). Both ParFlow and WRF-Hydro include land surface models and overland flow, and take advantage of parallelization and high-performance computing (HPC) capabilities; however, they have different approaches to overland and subsurface flow and to groundwater-surface water interactions. Accurately representing large domains remains a challenge considering the difficult task of representing complex hydrologic processes, computational expense, and extensive data needs; both models have accomplished this, but have differences in approach and continue to be difficult to validate. A further exploration of effective methodology to accurately represent large-scale hydrology with integrated models is needed to advance this growing field. Here we compare the outputs of CONUS-ParFlow and the National Water Model to each other and with observations to study the performance of hyper-resolution models over large domains. Models were compared over a range of scales for major watersheds within the CONUS with a specific focus on the Mississippi, Ohio, and Colorado River basins. We use a novel set of approaches and analyses for this comparison to better understand differences in process representation and bias. This intercomparison is a step toward better understanding of how much water we have and of the interactions between the surface and subsurface. Our goal is to advance our understanding and simulation of the hydrologic system and ultimately improve hydrologic forecasts.
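Model-to-observation comparisons of this kind typically report standard streamflow skill scores. The snippet below shows two common ones, Nash-Sutcliffe efficiency and percent bias, as a generic illustration; the abstract does not specify which metrics the authors' novel analyses actually use:

```python
# Two standard skill metrics often used when comparing simulated and observed
# streamflow. Generic illustration only; not necessarily the authors' metrics.
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect match, values below 0 are worse than the mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(sim, obs):
    """Percent bias of total flow volume; positive values indicate overestimation."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

obs = [120.0, 95.0, 80.0, 150.0, 210.0]   # observed daily flow, m^3/s (made-up values)
sim = [110.0, 100.0, 90.0, 160.0, 190.0]  # one model's simulated flow (made-up values)
print(f"NSE = {nse(sim, obs):.2f}, PBIAS = {percent_bias(sim, obs):.1f}%")
```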
MODIS algorithm development and data visualization using ACTS
NASA Technical Reports Server (NTRS)
Abbott, Mark R.
1992-01-01
The study of the Earth as a system will require the merger of scientific and data resources on a much larger scale than has been done in the past. New methods of scientific research, particularly in the development of geographically dispersed, interdisciplinary teams, are necessary if we are to understand the complexity of the Earth system. Even the planned satellite missions themselves, such as the Earth Observing System, will require much more interaction between researchers and engineers if they are to produce scientifically useful data products. A key component in these activities is the development of flexible, high bandwidth data networks that can be used to move large amounts of data as well as allow researchers to communicate in new ways, such as through video. The capabilities of the Advanced Communications Technology Satellite (ACTS) will allow the development of such networks. The Pathfinder global AVHRR data set and the upcoming SeaWiFS Earthprobe mission would serve as a testbed in which to develop the tools to share data and information among geographically distributed researchers. Our goal is to develop a 'Distributed Research Environment' that can be used as a model for scientific collaboration in the EOS era. The challenge is to unite the advances in telecommunications with the parallel advances in computing and networking.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This presentation will examine process systems engineering R&D needs for application to advanced fossil energy (FE) systems and highlight ongoing research activities at the National Energy Technology Laboratory (NETL) under the auspices of a recently launched Collaboratory for Process & Dynamic Systems Research. The three current technology focus areas include: 1) High-fidelity systems with NETL's award-winning Advanced Process Engineering Co-Simulator (APECS) technology for integrating process simulation with computational fluid dynamics (CFD) and virtual engineering concepts, 2) Dynamic systems with R&D on plant-wide IGCC dynamic simulation, control, and real-time training applications, and 3) Systems optimization including large-scale process optimization, stochastic simulation for risk/uncertainty analysis, and cost estimation. Continued R&D aimed at these and other key process systems engineering models, methods, and tools will accelerate the development of advanced gasification-based FE systems and produce increasingly valuable outcomes for DOE and the Nation.
Electrokinetic decontamination of concrete. Final report, August 3, 1993--September 15, 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-12-31
The ELECTROSORB® "C" process is an electrokinetic process for decontaminating concrete. ELECTROSORB® "C" uses a carpet-like extraction pad which is placed on the contaminated concrete surface. An electrolyte solution is circulated from a supporting module. This module keeps the electrolyte solution clean. The work is advancing through the engineering development stage with steady progress toward a full-scale demonstration unit which will be ready for incorporation in the DOE Large Scale Demonstration Program by Summer 1997. A demonstration was carried out at the Mound Facility in Miamisburg, Ohio, in June 1996. Third-party verification by EG&G confirmed the effectiveness of the process. Results of this work and the development work that preceded it are described herein.
Genetics of the dentofacial variation in human malocclusion
Moreno Uribe, L. M.; Miller, S. F.
2015-01-01
Malocclusions affect individuals worldwide, resulting in compromised function and esthetics. Understanding the etiological factors contributing to the variation in dentofacial morphology associated with malocclusions is the key to develop novel treatment approaches. Advances in dentofacial phenotyping, which is the comprehensive characterization of hard and soft tissue variation in the craniofacial complex, together with the acquisition of large-scale genomic data have started to unravel genetic mechanisms underlying facial variation. Knowledge on the genetics of human malocclusion is limited even though results attained thus far are encouraging, with promising opportunities for future research. This review summarizes the most common dentofacial variations associated with malocclusions and reviews the current knowledge of the roles of genes in the development of malocclusions. Lastly, this review will describe ways to advance malocclusion research, following examples from the expanding fields of phenomics and genomic medicine, which aim to better patient outcomes. PMID:25865537
Status of the Combustion Devices Injector Technology Program at the NASA MSFC
NASA Technical Reports Server (NTRS)
Jones, Gregg; Protz, Christopher; Trinh, Huu; Tucker, Kevin; Nesman, Tomas; Hulka, James
2005-01-01
To support the NASA Space Exploration Mission, an in-house program called Combustion Devices Injector Technology (CDIT) is being conducted at the NASA Marshall Space Flight Center (MSFC) for the fiscal year 2005. CDIT is focused on developing combustor technology and analysis tools to improve reliability and durability of upper-stage and in-space liquid propellant rocket engines. The three areas of focus include injector/chamber thermal compatibility, ignition, and combustion stability. In the compatibility and ignition areas, small-scale single- and multi-element hardware experiments will be conducted to demonstrate advanced technological concepts as well as to provide experimental data for validation of computational analysis tools. In addition, advanced analysis tools will be developed to eventually include 3-dimensional and multi-element effects and improve capability and validity to analyze heat transfer and ignition in large, multi-element injectors.
NASDA's Advanced On-Line System (ADOLIS)
NASA Technical Reports Server (NTRS)
Yamamoto, Yoshikatsu; Hara, Hideo; Yamada, Shigeo; Hirata, Nobuyuki; Komatsu, Shigenori; Nishihata, Seiji; Oniyama, Akio
1993-01-01
Spacecraft operations, including ground system operations, are generally carried out through various large- or small-scale group work done by operators, engineers, managers, users and others, whose positions are in many cases geographically distributed. In face-to-face work environments, it is easy for them to understand each other. However, in distributed work environments that rely on communication media, if only audio is used, they become estranged from each other and lose interest in and continuity of the work. This is an obstacle to smooth operation of spacecraft. NASDA has developed an experimental model of a new real-time operation control system called 'ADOLIS' (ADvanced On-Line System), adapted to such a distributed environment, using a multi-media system dealing with character, figure, image, handwriting, video and audio information and accommodating a wide range of operation systems, including spacecraft and ground systems. This paper describes the results of the development of the experimental model.
NASA Astrophysics Data System (ADS)
Michalak, D. J.; Bruno, A.; Caudillo, R.; Elsherbini, A. A.; Falcon, J. A.; Nam, Y. S.; Poletto, S.; Roberts, J.; Thomas, N. K.; Yoscovits, Z. R.; Dicarlo, L.; Clarke, J. S.
Experimental quantum computing is rapidly approaching the integration of sufficient numbers of quantum bits for interesting applications, but many challenges still remain. These challenges include: realization of an extensible design for large array scale up, sufficient material process control, and discovery of integration schemes compatible with industrial 300 mm fabrication. We present recent developments in extensible circuits with vertical delivery. Toward the goal of developing a high-volume manufacturing process, we will present recent results on a new Josephson junction process that is compatible with current tooling. We will then present the improvements in NbTiN material uniformity that typical 300 mm fabrication tooling can provide. While initial results on few-qubit systems are encouraging, advanced processing control is expected to deliver the improvements in qubit uniformity, coherence time, and control required for larger systems. Research funded by Intel Corporation.
Climbing the Corporate Ladder.
ERIC Educational Resources Information Center
Smith, Christopher
The employment records of a large northeastern manufacturing plant were analyzed to test the opportunity for career advancement within a large-scale industrial establishment. The employment records analyzed covered the years 1921 through 1937 and more than 28,000 different employees (male and female). The company was selected as being…
Numerical prediction of the Mid-Atlantic states cyclone of 18-19 February 1979
NASA Technical Reports Server (NTRS)
Atlas, R.; Rosenberg, R.
1982-01-01
A series of forecast experiments was conducted to assess the accuracy of the GLAS model, and to determine the importance of large scale dynamical processes and diabatic heating to the cyclogenesis. The GLAS model correctly predicted intense coastal cyclogenesis and heavy precipitation. When the forecast was repeated without surface heat and moisture fluxes, the model failed to predict any cyclone development. An extended range forecast, a forecast from the NMC analysis interpolated to the GLAS grid, and a forecast from the GLAS analysis with the surface moisture flux excluded each predicted weak coastal low development. Diabatic heating resulting from oceanic fluxes significantly contributed to the generation of low level cyclonic vorticity and the intensification and slow rate of movement of an upper level ridge over the western Atlantic. As an upper level short wave trough approached this ridge, diabatic heating associated with the release of latent heat intensified, and the gradient of vorticity, vorticity advection and upper level divergence in advance of the trough were greatly increased, providing strong large scale forcing for the surface cyclogenesis.
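The vorticity advection invoked here is the standard absolute-vorticity advection of synoptic meteorology; as a hedged notational reminder (conventional form, not reproduced from the paper), positive advection ahead of an upper-level trough favors upper-level divergence and surface development:

```latex
% Absolute-vorticity advection (conventional synoptic notation, assumed here).
\mathrm{AVA} \;=\; -\,\mathbf{V}\cdot\nabla\!\left(\zeta + f\right)
```

Here V is the horizontal wind, ζ the relative vorticity, and f the Coriolis parameter.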
Large-scale adenovirus and poxvirus-vectored vaccine manufacturing to enable clinical trials.
Kallel, Héla; Kamen, Amine A
2015-05-01
Efforts to make vaccines against infectious diseases and immunotherapies for cancer have evolved to utilize a variety of heterologous expression systems such as viral vectors. These vectors are often attenuated or engineered to safely deliver genes encoding antigens of different pathogens. Adenovirus and poxvirus vectors are among the viral vectors that are most frequently used to develop prophylactic vaccines against infectious diseases as well as therapeutic cancer vaccines. This mini-review describes the trends and processes in large-scale production of adenovirus and poxvirus vectors to meet the needs of clinical applications. We briefly describe the general principles for the production and purification of adenovirus and poxvirus viral vectors. Currently, adenovirus and poxvirus vector manufacturing methods rely on well-established cell culture technologies. Several improvements have been evaluated to increase the yield and to reduce the overall manufacturing cost, such as cultivation at high cell densities and continuous downstream processing. Additionally, advancements in vector characterization will greatly facilitate the development of novel vectored vaccine candidates. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Human Mars EDL Pathfinder Study: Assessment of Technology Development Gaps and Mitigations
NASA Technical Reports Server (NTRS)
Lillard, Randolph; Olejniczak, Joe; Polsgrove, Tara; Cianciolo, Alice Dwyer; Munk, Michelle; Whetsel, Charles; Drake, Bret
2017-01-01
This paper presents the results of a NASA-initiated, Agency-wide assessment to better characterize the risks, and potential mitigation approaches, associated with landing human-class Entry, Descent, and Landing (EDL) systems on Mars. Due to the criticality and long-lead nature of advancing EDL techniques, it is necessary to determine an appropriate strategy to improve the capability to land large payloads. A key focus of this study was to understand the principal EDL risks and to determine what "must" be tested at Mars. This process identified the various risks and potential risk mitigation strategies, along with the key near-term technology development efforts required and the environments in which those technology demonstrations are best suited. The study identified key risks along with the advantages of each entry technology. In addition, it found that, provided the EDL concept of operations (con ops) minimizes large-scale transition events, there is no technology requirement for a Mars precursor demonstration. Instead, NASA should take a direct path to a human-scale lander.
NASA Technical Reports Server (NTRS)
Erickson, G. E.; Gilbert, W. P.
1983-01-01
An experimental investigation was conducted to assess the vortex flow-field interactions on an advanced, twin-jet fighter aircraft configuration at high angles of attack. Flow-field surveys were conducted on a small-scale model in the Northrop 0.41 - by 0.60-meter water tunnel and, where appropriate, the qualitative observations were correlated with low-speed wind tunnel data trends obtained on a large-scale model of the advanced fighter in the NASA Langley Research Center 30- by 60-foot (9.1- by 18.3-meter) facility. Emphasis was placed on understanding the interactions of the forebody and LEX-wing vortical flows, defining the effects on rolling moment variation with sideslip, and identifying modifications to control or regulate the vortex interactions at high angles of attack. The water tunnel flow visualization results and wind tunnel data trend analysis revealed the potential for strong interactions between the forebody and LEX vortices at high angles of attack. In particular, the forebody flow development near the nose could be controlled by means of carefully-positioned radome strakes. The resultant strake-induced flow-field changes were amplified downstream by the more powerful LEX vortical motions with subsequent large effects on wing flow separation characteristics.
Commercial-scale biotherapeutics manufacturing facility for plant-made pharmaceuticals.
Holtz, Barry R; Berquist, Brian R; Bennett, Lindsay D; Kommineni, Vally J M; Munigunti, Ranjith K; White, Earl L; Wilkerson, Don C; Wong, Kah-Yat I; Ly, Lan H; Marcel, Sylvain
2015-10-01
Rapid, large-scale manufacture of medical countermeasures can be uniquely met by the plant-made-pharmaceutical platform technology. As a participant in the Defense Advanced Research Projects Agency (DARPA) Blue Angel project, the Caliber Biotherapeutics facility was designed, constructed, commissioned and released a therapeutic target (H1N1 influenza subunit vaccine) in <18 months from groundbreaking. As of 2015, this facility was one of the world's largest plant-based manufacturing facilities, with the capacity to process over 3500 kg of plant biomass per week in an automated multilevel growing environment using proprietary LED lighting. The facility can commission additional plant grow rooms that are already built to double this capacity. In addition to the commercial-scale manufacturing facility, a pilot production facility was designed based on the large-scale manufacturing specifications as a way to integrate product development and technology transfer. The primary research, development and manufacturing system employs vacuum-infiltrated Nicotiana benthamiana plants grown in a fully contained, hydroponic system for transient expression of recombinant proteins. This expression platform has been linked to a downstream process system, analytical characterization, and assessment of biological activity. This integrated approach has demonstrated rapid, high-quality production of therapeutic monoclonal antibody targets, including a panel of rituximab biosimilar/biobetter molecules and antiviral antibodies against influenza and dengue fever. © 2015 Society for Experimental Biology, Association of Applied Biologists and John Wiley & Sons Ltd.
Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferré, T.P.A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin
2015-01-01
Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each 330 m2 surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed are possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.
Scale in Remote Sensing and GIS: An Advancement in Methods Towards a Science of Scale
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.
1998-01-01
The term "scale", both in space and time, is central to remote sensing and geographic information systems (GIS). The emergence and widespread use of GIS technologies, including remote sensing, has generated significant interest in addressing scale as a generic topic, and in the development and implementation of techniques for dealing explicitly with the vicissitudes of scale as a multidisciplinary issue. As science becomes more complex and utilizes databases that are capable of performing complex space-time data analyses, it becomes paramount that we develop the tools and techniques needed to operate at multiple scales, to work with data whose scales are not necessarily ideal, and to produce results that can be aggregated or disaggregated in ways that suit the decision-making process. Contemporary science is constantly coping with compromises, and the data available for a particular study rarely fit perfectly with the scales at which the processes being investigated operate, or the scales that policy-makers require to make sound, rational decisions. This presentation discusses some of the problems associated with scale as related to remote sensing and GIS, and describes some of the questions that need to be addressed in approaching the development of a multidisciplinary "science of scale". Techniques for dealing with multiple scaled data that have been developed or explored recently are described as a means for recognizing scale as a generic issue, along with associated theory and tools that can be of simultaneous value to a large number of disciplines. These can be used to seek answers to a host of interrelated questions in the interest of providing a formal structure for the management and manipulation of scale and its universality as a key concept from a multidisciplinary perspective.
NASA Astrophysics Data System (ADS)
Menapace, Joseph A.
2010-11-01
Over the last eight years we have been developing advanced MRF tools and techniques to manufacture meter-scale optics for use in megajoule-class laser systems. These systems call for optics having unique characteristics that can complicate their fabrication using conventional polishing methods. First, exposure to the high-power nanosecond and sub-nanosecond pulsed laser environment in the infrared (>27 J/cm2 at 1053 nm), visible (>18 J/cm2 at 527 nm), and ultraviolet (>10 J/cm2 at 351 nm) demands ultra-precise control of optical figure and finish to avoid intensity modulation and scatter that can result in damage to the optics chain or system hardware. Second, the optics must be super-polished and virtually free of surface and subsurface flaws that can limit optic lifetime through laser-induced damage initiation and growth at the flaw sites, particularly at 351 nm. Lastly, ultra-precise optics for beam conditioning are required to control laser beam quality. These optics contain customized surface topographical structures that cannot be made using traditional fabrication processes. In this review, we present the development and implementation of large-aperture MRF tools and techniques specifically designed to meet the demanding optical performance challenges posed by large-aperture high-power laser systems. In particular, we discuss the advances made by using MRF technology to expose and remove surface and subsurface flaws in optics during final polishing to yield optics with improved laser damage resistance, the novel application of MRF deterministic polishing to imprint complex topographical information and wavefront correction patterns onto optical surfaces, and our efforts to advance the technology to manufacture large-aperture, damage-resistant optics.
Show me the data: advances in multi-model benchmarking, assimilation, and forecasting
NASA Astrophysics Data System (ADS)
Dietze, M.; Raiho, A.; Fer, I.; Cowdery, E.; Kooper, R.; Kelly, R.; Shiklomanov, A. N.; Desai, A. R.; Simkins, J.; Gardella, A.; Serbin, S.
2016-12-01
Researchers want their data to inform carbon cycle predictions, but there are considerable bottlenecks between data collection and the use of data to calibrate and validate earth system models and inform predictions. This talk highlights recent advancements in the PEcAn project aimed at making it easier for individual researchers to confront models with their own data: (1) the development of an easily extensible site-scale benchmarking system aimed at ensuring that models capture process rather than just reproducing pattern; (2) efficient emulator-based Bayesian parameter data assimilation to constrain model parameters; (3) a novel, generalized approach to ensemble data assimilation to estimate carbon pools and fluxes and quantify process error; (4) automated processing and downscaling of CMIP climate scenarios to support forecasts that include driver uncertainty; (5) a large expansion in the number of models supported, with new tools for conducting multi-model and multi-site analyses; and (6) a network-based architecture that allows analyses to be shared with model developers and other collaborators. Application of these methods is illustrated with data across a wide range of time scales, from eddy-covariance to forest inventories to tree rings to paleoecological pollen proxies.
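As context for item (3), the sketch below shows a generic perturbed-observation ensemble Kalman filter analysis step of the kind commonly used to adjust modeled carbon pools toward an observation. It is an illustrative assumption, not PEcAn's actual generalized algorithm; the ensemble size, observation error, and state variables are invented for the example.

```python
import numpy as np

# Hedged sketch of an ensemble Kalman filter analysis step for carbon pool
# assimilation (not PEcAn's actual generalized algorithm): the ensemble of
# modeled states is pulled toward an observation in proportion to the
# ensemble covariance, so unobserved variables are updated through their
# correlation with the observed one.
rng = np.random.default_rng(42)

n_ens = 100
# state[:, 0] = aboveground biomass (Mg C/ha), state[:, 1] = annual NPP
state = rng.normal(loc=[120.0, 6.0], scale=[15.0, 1.0], size=(n_ens, 2))

obs = 100.0                 # observed biomass, e.g. from a forest inventory
obs_err = 8.0               # observation standard deviation
H = np.array([1.0, 0.0])    # observation operator: biomass only is observed

x_mean = state.mean(axis=0)
X = state - x_mean
P = X.T @ X / (n_ens - 1)                   # ensemble covariance
K = P @ H / (H @ P @ H + obs_err ** 2)      # Kalman gain (length-2 vector)

# Perturbed-observation update of every ensemble member
perturbed_obs = obs + rng.normal(0.0, obs_err, size=n_ens)
innovation = perturbed_obs - state @ H
analysis = state + np.outer(innovation, K)

print("prior mean:", x_mean, " posterior mean:", analysis.mean(axis=0))
```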
Zamami, Yoshito; Niimura, Takahiro; Takechi, Kenshi; Imanishi, Masaki; Koyama, Toshihiro; Ishizawa, Keisuke
2017-01-01
Approximately 100000 people suffer cardiopulmonary arrest in Japan every year, and the aging of society means that this number is expected to increase. Worldwide, approximately 100 million develop cardiac arrest annually, making it an international issue. Although survival has improved thanks to advances in cardiopulmonary resuscitation, there is a high rate of postresuscitation encephalopathy after the return of spontaneous circulation, and the proportion of patients who can return to normal life is extremely low. Treatment for postresuscitation encephalopathy is long term, and if sequelae persist then nursing care is required, causing immeasurable economic burdens as a result of ballooning medical costs. As at present there is no drug treatment to improve postresuscitation encephalopathy as a complication of cardiopulmonary arrest, the development of novel drug treatments is desirable. In recent years, new efficacy for existing drugs used in the clinical setting has been discovered, and drug repositioning has been proposed as a strategy for developing those drugs as therapeutic agents for different diseases. This review describes a large-scale database study carried out following a discovery strategy for drug repositioning with the objective of improving survival rates after cardiopulmonary arrest and discusses future repositioning prospects.
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
New generation of elastic network models.
López-Blanco, José Ramón; Chacón, Pablo
2016-04-01
The intrinsic flexibility of proteins and nucleic acids can be grasped from remarkably simple mechanical models of particles connected by springs. In recent decades, Elastic Network Models (ENMs) combined with Normal Mode Analysis widely confirmed their ability to predict biologically relevant motions of biomolecules and soon became a popular methodology to reveal large-scale dynamics in multiple structural biology scenarios. The simplicity, robustness, low computational cost, and relatively high accuracy are the reasons behind the success of ENMs. This review focuses on recent advances in the development and application of ENMs, paying particular attention to combinations with experimental data. Successful application scenarios include large macromolecular machines, structural refinement, docking, and evolutionary conservation. Copyright © 2015 Elsevier Ltd. All rights reserved.
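The review gives no formulas, but the core construction behind most ENMs is simple enough to sketch: connect nearby residues with harmonic springs, assemble the resulting Hessian, and diagonalize it to obtain the low-frequency modes. The snippet below is a minimal anisotropic-network-model sketch under assumed parameter values (15 Å cutoff, uniform spring constant, random toy coordinates); it is not code from the reviewed work.

```python
import numpy as np

def anm_modes(coords, cutoff=15.0, gamma=1.0, n_modes=10):
    """Anisotropic network model sketch: build the 3N x 3N Hessian from
    pairwise contacts within `cutoff` (Angstrom) and return the lowest
    non-trivial normal modes."""
    n = len(coords)
    hessian = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r2 = d @ d
            if r2 > cutoff ** 2:
                continue
            # Off-diagonal super-element: -gamma * outer(d, d) / |d|^2
            block = -gamma * np.outer(d, d) / r2
            hessian[3*i:3*i+3, 3*j:3*j+3] += block
            hessian[3*j:3*j+3, 3*i:3*i+3] += block
            hessian[3*i:3*i+3, 3*i:3*i+3] -= block
            hessian[3*j:3*j+3, 3*j:3*j+3] -= block
    evals, evecs = np.linalg.eigh(hessian)
    # Skip the six zero modes (rigid-body translation and rotation)
    return evals[6:6 + n_modes], evecs[:, 6:6 + n_modes]

# Toy usage with random stand-in "C-alpha" coordinates
rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 30.0, size=(50, 3))
freqs_sq, modes = anm_modes(coords)
```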
SEAPAK user's guide, version 2.0. Volume 2: Descriptions of programs
NASA Technical Reports Server (NTRS)
Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.
1991-01-01
The SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and the NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made since version 1.0, and the ancillary environmental data analysis module was greatly expanded. The package continues to be user friendly and user interactive. Also, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing for large quantities of data to be ingested and analyzed.
Finite element modeling and analysis of tires
NASA Technical Reports Server (NTRS)
Noor, A. K.; Andersen, C. M.
1983-01-01
Predicting the response of tires under various loading conditions using finite element technology is addressed. Some of the recent advances in finite element technology that have high potential for application to tire modeling problems are reviewed, and the analysis and modeling needs for tires are identified. Topics covered include: reduction methods for large-scale nonlinear analysis, with particular emphasis on the treatment of combined loads, displacement-dependent and nonconservative loadings; development of simple and efficient mixed finite element models for shell analysis, identification of equivalent mixed and purely displacement models, and determination of the advantages of using mixed models; and effective computational models for large-rotation nonlinear problems, based on a total Lagrangian description of the deformation.
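For readers unfamiliar with the reduction methods mentioned above, the following sketch illustrates the general Galerkin-projection idea: a large stiffness system is projected onto a handful of basis vectors, reducing thousands of unknowns to a few generalized coordinates. The matrix sizes and the Krylov-type basis (standing in for the path derivatives used in practice) are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# Generic reduced-basis (Rayleigh-Ritz / Galerkin) projection sketch, shown
# only to illustrate the reduction idea, not the paper's specific method.
rng = np.random.default_rng(3)
n = 2000
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)          # symmetric positive-definite "stiffness"
f = rng.standard_normal(n)           # load vector

# Basis: a few Krylov vectors of (K, f), orthonormalized; real reduction
# methods typically use previous solutions and their path derivatives.
vecs = [f]
for _ in range(5):
    vecs.append(K @ vecs[-1])
basis, _ = np.linalg.qr(np.column_stack(vecs))

K_r = basis.T @ K @ basis            # 6 x 6 reduced stiffness
f_r = basis.T @ f
q = np.linalg.solve(K_r, f_r)        # generalized coordinates
u_approx = basis @ q                 # approximate full-order displacement

residual = np.linalg.norm(K @ u_approx - f) / np.linalg.norm(f)
print(f"relative residual of reduced solution: {residual:.2e}")
```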
Large scale isolation and purification of soluble RAGE from lung tissue.
Englert, Judson M; Ramsgaard, Lasse; Valnickova, Zuzana; Enghild, Jan J; Oury, Tim D
2008-09-01
The receptor for advanced glycation end-products (RAGE) has been implicated in numerous disease processes including atherosclerosis, diabetic nephropathy, impaired wound healing, and neuropathy, to name a few. Treatment of animals with a soluble isoform of the receptor (sRAGE) has been shown to prevent and even reverse many disease processes. The difficulty of isolating large quantities of pure sRAGE for in vitro and in vivo studies has hindered its development as a therapeutic strategy in other RAGE-mediated diseases that require long-term therapy. This article provides an improvement in both yield and detail over a previously published method, yielding 10 mg of pure, endotoxin-free sRAGE from 65 g of lung tissue.
SEAPAK user's guide, version 2.0. Volume 1: System description
NASA Technical Reports Server (NTRS)
Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.
1991-01-01
The SEAPAK is a user interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and the NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made to version 1.0 of the guide, and the ancillary environmental data analysis module was expanded. The package continues to emphasize user friendliness and user interactive data analyses. Additionally, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing large quantities of data to be ingested and analyzed in background.
High performance cellular level agent-based simulation with FLAME for the GPU.
Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela
2010-05-01
Driven by the availability of experimental data and ability to simulate a biological scale which is of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template driven framework for agent-based modelling (ABM) on parallel architectures ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has reported massive improvement in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.
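FLAME models are written as formal agent specifications rather than hand-coded loops; the toy sketch below only illustrates the output-message/input-message update style of cellular agent-based models in plain NumPy and does not use FLAME's actual specification format or GPU kernels. All parameters (agent count, interaction radius, time step) are invented.

```python
import numpy as np

# Minimal illustration of message-passing cellular agent-based modelling:
# each cell "publishes" its position, "reads" neighbours' messages, and
# updates its state (here, a simple contact-inhibition style repulsion).
N, RADIUS, DT = 200, 1.5, 0.1
rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 20.0, size=(N, 2))   # cell positions
age = np.zeros(N)                           # per-cell internal state

def step(pos, age):
    # Output-message phase: every agent publishes its position.
    messages = pos.copy()
    # Input-message phase: each agent finds neighbours within RADIUS and
    # moves away from the local crowd.
    diff = pos[:, None, :] - messages[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    crowded = (dist < RADIUS) & (dist > 0.0)
    repulsion = np.where(crowded[..., None], diff, 0.0).sum(axis=1)
    return pos + DT * repulsion, age + DT

for _ in range(100):
    pos, age = step(pos, age)
```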
The Multi-Scale Network Landscape of Collaboration.
Bae, Arram; Park, Doheum; Ahn, Yong-Yeol; Park, Juyong
2016-01-01
Propelled by the increasing availability of large-scale, high-quality data, advanced data modeling and analysis techniques are enabling novel and significant scientific insights into a wide range of complex social, natural, and technological systems. These developments also provide opportunities for studying cultural systems and phenomena--which can be said to refer to all products of human creativity and way of life. An important characteristic of a cultural product is that it does not exist in isolation from others, but forms an intricate web of connections on many levels. In the creation and dissemination of cultural products and artworks in particular, collaboration and communication of ideas play an essential role, which can be captured in the heterogeneous network of the creators and practitioners of art. In this paper we propose novel methods to analyze and uncover meaningful patterns from such a network, using the network of western classical musicians constructed from a large-scale, comprehensive Compact Disc recordings dataset. We characterize the complex patterns in the network landscape of collaboration between musicians across multiple scales, ranging from the macroscopic to the mesoscopic and microscopic, that represent the diversity of cultural styles and the individuality of the artists.
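As a rough illustration of the kind of co-appearance network the paper builds (the actual CD-recordings dataset and multi-scale measures are not reproduced here), the snippet below links musicians who appear on the same recording and applies one off-the-shelf community detection step as a mesoscopic view. The recordings listed are toy data.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy co-appearance data: recording -> credited artists (illustrative only).
recordings = {
    "CD1": ["Karajan", "Berlin Phil", "Mutter"],
    "CD2": ["Karajan", "Berlin Phil"],
    "CD3": ["Abbado", "Berlin Phil", "Pollini"],
    "CD4": ["Abbado", "Lucerne Festival Orch"],
}

G = nx.Graph()
for cd, artists in recordings.items():
    for i, a in enumerate(artists):
        for b in artists[i + 1:]:
            # Accumulate co-appearance counts as edge weights.
            w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
            G.add_edge(a, b, weight=w + 1)

# One mesoscopic view: weighted modularity-based communities.
communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])
```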
Stearns, Leigh A.; Hamilton, Gordon S.; van der Veen, C. J.; Finnegan, D. C.; O'Neel, Shad; Scheick, J. B.; Lawson, D. E.
2015-01-01
Hubbard Glacier, located in southeast Alaska, is the world's largest non-polar tidewater glacier. It has been steadily advancing since it was first mapped in 1895; occasionally, the advance creates an ice or sediment dam that blocks a tributary fjord (Russell Fiord). The sustained advance raises the probability of long-term closure in the near-future, which will strongly impact the ecosystem of Russell Fiord and the nearby community of Yakutat. Here, we examine a 43-year record of flow speeds and terminus position to understand the large-scale dynamics of Hubbard Glacier. Our long-term record shows that the rate of terminus advance has increased slightly since 1895, with the exception of a slowed advance between approximately 1972 and 1984. The short-lived closure events in 1986 and 2002 were not initiated by perturbations in ice velocity or environmental forcings, but were likely due to fluctuations in sedimentation patterns at the terminus. This study points to the significance of a coupled system where short-term velocity fluctuations and morainal shoal development control tidewater glacier terminus position.
Numerical Propulsion System Simulation (NPSS) 1999 Industry Review
NASA Technical Reports Server (NTRS)
Lytle, John; Follen, Greg; Naiman, Cynthia; Evans, Austin
2000-01-01
The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. In addition, the paper contains a summary of the feedback received from industry partners in the development effort and the actions taken over the past year to respond to that feedback. The NPSS development was supported in FY99 by the High Performance Computing and Communications Program.
NASA Technical Reports Server (NTRS)
1993-01-01
The purpose of the STME Main Injector Program was to enhance the technology base for the large-scale main injector-combustor system of oxygen-hydrogen booster engines in the areas of combustion efficiency, chamber heating rates, and combustion stability. The initial task of the Main Injector Program, focused on analysis and theoretical predictions using existing models, was complemented by the design, fabrication, and test at MSFC of a subscale calorimetric, 40,000-pound thrust class, axisymmetric thrust chamber operating at approximately 2,250 psi and a 7:1 expansion ratio. Test results were used to further define combustion stability bounds, combustion efficiency, and heating rates using a large injector scale similar to the Pratt & Whitney (P&W) STME main injector design configuration, including the tangential entry swirl coaxial injection elements. The subscale combustion data was used to verify and refine analytical modeling simulations and extend the database range to guide the design of the large-scale system main injector. The subscale injector design incorporated fuel and oxidizer flow area control features which could be varied; this allowed testing of several design points so that the STME conditions could be bracketed. The subscale injector design also incorporated high-reliability and low-cost fabrication techniques such as a one-piece electrical discharge machined (EDMed) interpropellant plate. Both subscale and large-scale injectors incorporated outer row injector elements with scarfed tip features to allow evaluation of reduced heating rates to the combustion chamber.
Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.
Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T
2017-01-01
Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.
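A minimal sketch of the replicated-reconstruction-object idea follows, assuming a simplified stand-in kernel rather than Trace's actual back-projection code: each worker accumulates its subset of projections into a private copy of the update grid, and the copies are summed once per iteration, so no fine-grained locking on a shared reconstruction is needed.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

# Illustrative sketch (assumed, not Trace's code) of replicated reconstruction
# objects: worker-private accumulation followed by a single reduction.
GRID = (256, 256)

def partial_update(angles):
    replica = np.zeros(GRID)                 # worker-private replica
    for theta in angles:
        # Stand-in for the per-projection back-projection kernel.
        replica += np.full(GRID, np.cos(theta) ** 2 / len(angles))
    return replica

def one_iteration(all_angles, n_workers=4):
    chunks = np.array_split(all_angles, n_workers)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        replicas = list(pool.map(partial_update, chunks))
    return np.sum(replicas, axis=0)          # reduce the replicated objects

if __name__ == "__main__":
    update = one_iteration(np.linspace(0.0, np.pi, 360))
    print(update.shape, update.mean())
```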
2017-04-01
Advanced Visualization and Interactive Display Rapid Innovation and Discovery Evaluation Research (VISRIDER) Program, Task 6: Point Cloud... (reporting period October 2013 - September 2014). The task evaluated various point cloud visualization techniques for viewing large-scale LiDAR datasets and assessed their potential use for thick-client desktop platforms.
NASA Technical Reports Server (NTRS)
1989-01-01
Important and fundamental scientific progress can be attained through space observations in the wavelengths longward of 1 micron. The formation of galaxies, stars, and planets, the origin of quasars and the nature of active galactic nuclei, the large scale structure of the Universe, and the problem of the missing mass, are among the major scientific issues that can be addressed by these observations. Significant advances in many areas of astrophysics can be made over the next 20 years by implementing the outlined program. This program combines large observatories with smaller projects to create an overall scheme that emphasized complementarity and synergy, advanced technology, community support and development, and the training of the next generation of scientists. Key aspects of the program include: the Space Infrared Telescope Facility; the Stratospheric Observatory for Infrared Astronomy; a robust program of small missions; and the creation of the technology base for future major observatories.
Air pollution and mortality: A history
NASA Astrophysics Data System (ADS)
Anderson, H. R.
Mortality is the most important health effect of ambient air pollution and has been studied the longest. The earliest evidence relates to fog episodes but with the development of more precise methods of investigation it is still possible to discern short-term temporal associations with daily mortality at the historically low levels of air pollution that now exist in most developed countries. Another early observation was that mortality was higher in more polluted areas. This has been confirmed by modern cohort studies that account for other potential explanations for such associations. There does not appear to be a threshold of effect within the ambient range of concentrations. Advances in the understanding of air pollution and mortality have been driven by the combined development of methods and biomedical concepts. The most influential methodological developments have been in time-series techniques and the establishment of large cohort studies, both of which are underpinned by advances in data processing and statistical analysis. On the biomedical side two important developments can be identified. One has been the application of the concept of multifactorial disease causation to explaining how air pollution may affect mortality at low levels and why thresholds are not obvious at the population level. The other has been an increasing understanding of how air pollution may plausibly have pathophysiological effects that are remote from the lung interface with ambient air. Together, these advances have had a profound influence on policies to protect public health. Throughout the history of air pollution epidemiology, mortality studies have been central and this will continue because of the widespread availability of mortality data on a large population scale and the weight that mortality carries in estimating impacts for policy development.
Experimental and analytical studies of advanced air cushion landing systems
NASA Technical Reports Server (NTRS)
Lee, E. G. S.; Boghani, A. B.; Captain, K. M.; Rutishauser, H. J.; Farley, H. L.; Fish, R. B.; Jeffcoat, R. L.
1981-01-01
Several concepts were developed for air cushion landing systems (ACLS) with the potential for improving performance characteristics (roll stiffness, heave damping, and trunk flutter) and reducing fabrication cost and complexity. After an initial screening, the following five concepts were evaluated in detail: damped trunk, filled trunk, compartmented trunk, segmented trunk, and roll feedback control. The evaluation was based on tests performed on scale models. An ACLS dynamic simulation developed earlier was updated so that it can be used to predict the performance of full-scale ACLS incorporating these refinements; the simulation was validated through scale-model tests. A full-scale ACLS based on the segmented trunk concept was fabricated and installed on the NASA ACLS test vehicle, where it is used to support advanced system development. A geometrically scaled model (one-third full scale) of the NASA test vehicle was fabricated and tested. This model, evaluated by means of a series of static and dynamic tests, is used to investigate scaling relationships between reduced- and full-scale models. The analytical model developed earlier was applied to simulate both the one-third-scale and the full-scale response.
Rocket Propulsion (RP) 21 Steering Committee Meeting - NASA Spacecraft Propulsion Update
NASA Technical Reports Server (NTRS)
Klem, Mark
2016-01-01
Lander Tech comprises three separate but synergistic efforts. Lunar CATALYST (Lunar Cargo Transportation and Landing by Soft Touchdown): support U.S. industry-led robotic lunar lander development via three public-private partnerships, and infuse or transfer landing technologies into those partnerships. Advanced Exploration Systems - Automated Propellant Loading (APL) - Integrated Ground Operations: demonstrate LH2 zero-loss storage, loading, and transfer operations via testing at large scale in a relevant launch vehicle servicing environment (KSC, GRC). Game Changing Technology - 20 Kelvin, 20 Watt Cryocooler: development of a Reverse Turbo-Brayton cryocooler operating at 20 Kelvin with 20 Watts of refrigeration lift.
NASA Technical Reports Server (NTRS)
Chu, Robert L.; Bayha, Tom D.; Davis, HU; Ingram, J. ED; Shukla, Jay G.
1992-01-01
Composite wing and fuselage structural design/manufacturing concepts have been developed and evaluated. Trade studies were performed to determine how well the concepts satisfy the program goals of 25 percent cost savings, 40 percent weight savings with aircraft resizing, and 50 percent part count reduction as compared to the aluminum Lockheed L-1011 baseline. Concepts developed using emerging technologies such as large-scale resin transfer molding (RTM), automated tow placement (ATP), braiding, and out-of-autoclave and automated manufacturing processes for both thermoset and thermoplastic materials were evaluated for possible application in the design concepts. Trade studies were used to determine which concepts would be carried into the detailed design development subtask.
NASA Advanced Supercomputing Facility Expansion
NASA Technical Reports Server (NTRS)
Thigpen, William W.
2017-01-01
The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.
Supporting Source Code Comprehension during Software Evolution and Maintenance
ERIC Educational Resources Information Center
Alhindawi, Nouh
2013-01-01
This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…
Impact of spectral nudging on the downscaling of tropical cyclones in regional climate simulations
NASA Astrophysics Data System (ADS)
Choi, Suk-Jin; Lee, Dong-Kyou
2016-06-01
This study investigated simulations of three months of seasonal tropical cyclone (TC) activity over the western North Pacific using the Advanced Research WRF Model. In the control experiment (CTL), the TC frequency was considerably overestimated. Additionally, the tracks of some TCs tended to have larger radii of curvature and were shifted eastward. The large-scale environments of westerly monsoon flows and subtropical Pacific highs were unrealistically simulated. The overestimated frequency of TC formation was attributed to a strengthened westerly wind field in the southern quadrants of the TC center. In comparison with the experiment using the spectral nudging method, the strengthened wind speed was mainly modulated by large-scale flow at scales greater than approximately 1000 km in the model domain. The spurious formation and undesirable tracks of TCs in the CTL were considerably improved by reproducing realistic large-scale atmospheric monsoon circulation, with substantial adjustment between the large-scale flow in the model domain and the large-scale boundary forcing imposed by the spectral nudging method. The realistic monsoon circulation played a vital role in simulating realistic TCs. These results indicate that, in downscaling from large-scale fields for regional climate simulations, the scale interaction between model-generated regional features and the forcing large-scale fields should be considered, and spectral nudging is a desirable way to achieve this.
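The essence of spectral nudging can be sketched in one dimension: only Fourier components longer than a cutoff wavelength are relaxed toward the driving field, leaving smaller scales free for the regional model to develop. The snippet below is an illustrative assumption (periodic 1-D field, fixed nudging coefficient), not the WRF implementation, which nudges 2-D wavenumbers with height- and time-dependent coefficients.

```python
import numpy as np

def spectral_nudge(model_field, driving_field, dx_km, cutoff_km=1000.0, alpha=0.1):
    """Relax only the large-scale part of a periodic 1-D model field toward
    the driving (boundary-forcing) field; wavelengths shorter than
    `cutoff_km` are left untouched. Illustrative sketch only."""
    n = model_field.size
    k = np.fft.rfftfreq(n, d=dx_km)           # spatial frequency, cycles per km
    large_scale = k < 1.0 / cutoff_km         # keep wavelengths > cutoff
    fm = np.fft.rfft(model_field)
    fd = np.fft.rfft(driving_field)
    fm[large_scale] += alpha * (fd[large_scale] - fm[large_scale])
    return np.fft.irfft(fm, n)

# Toy usage: a model field that has drifted from the driver at large scales
x = np.linspace(0.0, 6000.0, 600)             # 6000 km domain, 10 km grid
driver = np.sin(2 * np.pi * x / 3000.0)
model = 0.5 * np.sin(2 * np.pi * x / 3000.0) + 0.3 * np.sin(2 * np.pi * x / 200.0)
nudged = spectral_nudge(model, driver, dx_km=10.0)
```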
Findings and Challenges in Fine-Resolution Large-Scale Hydrological Modeling
NASA Astrophysics Data System (ADS)
Her, Y. G.
2017-12-01
Fine-resolution large-scale (FL) modeling can provide the overall picture of the hydrological cycle and transport while taking into account unique local conditions in the simulation. It can also help develop water resources management plans consistent across spatial scales by describing the spatial consequences of decisions and hydrological events extensively. FL modeling is expected to be common in the near future as global-scale remotely sensed data are emerging and computing resources have advanced rapidly. There are several spatially distributed models available for hydrological analyses. Some of them describe two-dimensional overland processes with numerical methods such as finite difference/element methods (FDM/FEM), which require either excessive computing resources to manipulate large matrices (implicit schemes) or small simulation time intervals to maintain the stability of the solution (explicit schemes). Others make unrealistic assumptions, such as constant overland flow velocity, to reduce the computational load of the simulation. Thus, simulation efficiency often comes at the expense of precision and reliability in FL modeling. Here, we introduce a new FL continuous hydrological model and its application to four watersheds in different landscapes and sizes, from 3.5 km2 to 2,800 km2, at 30 m spatial resolution on an hourly basis. The model provided acceptable accuracy statistics in reproducing hydrological observations made in the watersheds. The modeling outputs, including maps of simulated travel time, runoff depth, soil water content, and groundwater recharge, were animated, visualizing the dynamics of hydrological processes occurring in the watersheds during and between storm events. Findings and challenges are discussed in the context of modeling efficiency, accuracy, and reproducibility; we found that these can be improved, respectively, by employing advanced computing techniques and hydrological understanding, by using remotely sensed hydrological observations such as soil moisture and radar rainfall depth, and by sharing the model and its code in the public domain.
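One reason explicit schemes demand small time intervals, as noted above, is the Courant-Friedrichs-Lewy (CFL) stability limit. The short calculation below estimates the maximum stable time step for kinematic-wave overland flow on a 30 m grid; the roughness, slope, depth, and Courant number are assumed values chosen only for illustration.

```python
import numpy as np

# CFL-limited time step for explicit kinematic-wave overland flow routing
# at 30 m resolution (all parameter values are illustrative assumptions).
dx = 30.0                                      # grid spacing (m)
n_manning = 0.05                               # Manning roughness
slope = 0.02                                   # representative surface slope
depth = np.array([0.005, 0.02, 0.05, 0.10])    # candidate flow depths (m)

velocity = depth ** (2.0 / 3.0) * np.sqrt(slope) / n_manning   # Manning velocity
celerity = (5.0 / 3.0) * velocity                              # kinematic-wave celerity
dt_max = 0.7 * dx / celerity                                   # step at Courant number 0.7

for h, c, dt in zip(depth, celerity, dt_max):
    print(f"depth {h:5.3f} m -> celerity {c:5.2f} m/s -> max dt {dt:7.1f} s")
```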
An Integrative Bioinformatics Approach for Knowledge Discovery
NASA Astrophysics Data System (ADS)
Peña-Castillo, Lourdes; Phan, Sieu; Famili, Fazel
The vast amount of data being generated by large scale omics projects and the computational approaches developed to deal with this data have the potential to accelerate the advancement of our understanding of the molecular basis of genetic diseases. This better understanding may have profound clinical implications and transform the medical practice; for instance, therapeutic management could be prescribed based on the patient’s genetic profile instead of being based on aggregate data. Current efforts have established the feasibility and utility of integrating and analysing heterogeneous genomic data to identify molecular associations to pathogenesis. However, since these initiatives are data-centric, they either restrict the research community to specific data sets or to a certain application domain, or force researchers to develop their own analysis tools. To fully exploit the potential of omics technologies, robust computational approaches need to be developed and made available to the community. This research addresses such challenge and proposes an integrative approach to facilitate knowledge discovery from diverse datasets and contribute to the advancement of genomic medicine.
Challenges of NDE Simulation Tool
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.
2015-01-01
Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided-wave-based structural health monitoring (SHM) systems. The current state of the art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large-scale, complex-geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper will also discuss examples of the use of simulation tools at NASA to develop new damage characterization methods, and associated challenges of validating those methods.
Emplacement of the Rocche Rosse rhyolite lava flow (Lipari, Aeolian Islands)
NASA Astrophysics Data System (ADS)
Bullock, Liam A.; Gertisser, Ralf; O'Driscoll, Brian
2018-05-01
The Rocche Rosse lava flow marks the most recent rhyolitic extrusion on Lipari island (Italy), and preserves evidence for a multi-stage emplacement history. Due to the viscous nature of the advancing lava (10^8 to 10^10 Pa s), indicators of complex emplacement processes are preserved in the final flow. This study focuses on structural mapping of the flow to highlight the interplay of cooling, crust formation and underlying slope in the development of rhyolitic lavas. The flow is made up of two prominent lobes, small (< 0.2 m) to large (> 0.2 m) scale folding and a channelled geometry. Foliations dip at 2-4° over the flatter topography close to the vent, and up to 30-50° over steeper mid-flow topography. Brittle faults, tension gashes and conjugate fractures are also evident across the flow. Heterogeneous deformation is evident through increasing fold asymmetry away from the vent due to downflow cooling and stagnation. A steeper underlying topography mid-flow led to development of a channelled morphology, and compression at topographic breaks resulted in fold superimposition in the channel. We propose an emplacement history that evolved through five stages, each associated with the following flow regimes: (1) initial extrusion, crustal development and small-scale folding; (2) extensional strain, stretching lineations and channel development over steeper topography; (3) compression at topographic break, autobrecciation, lobe development and medium-scale folding; (4) progressive deformation with stagnation, large-scale folding and re-folding; and (5) brittle deformation following flow termination. The complex array of structural elements observed within the Rocche Rosse lava flow facilitates comparison with actively deforming rhyolitic lava flows at the Chilean volcanoes of Chaitén and Cordón Caulle, offering a fluid dynamic and structural framework within which to evaluate our data.
An Efficient and Versatile Means for Assembling and Manufacturing Systems in Space
NASA Technical Reports Server (NTRS)
Dorsey, John T.; Doggett, William R.; Hafley, Robert A.; Komendera, Erik; Correll, Nikolaus; King, Bruce
2012-01-01
Within NASA Space Science, Exploration, and the Office of the Chief Technologist, there are Grand Challenges and advanced future exploration, science, and commercial mission applications that could benefit significantly from large-span and large-area structural systems. Of particular and persistent interest to the space science community is the desire for large space telescopes (in the 10-50 meter range for main aperture diameter) that would revolutionize space astronomy. Achieving these systems will likely require on-orbit assembly, but previous approaches for assembling large-scale telescope truss structures and systems in space have been perceived as very costly because they require high-precision and custom components. These components rely on a large number of mechanical connections and supporting infrastructure that are unique to each application. In this paper, a new assembly paradigm that mitigates these concerns is proposed and described. A new assembly approach, developed to implement the paradigm, incorporates Intelligent Precision Jigging Robots, electron-beam welding, robotic handling/manipulation, operations assembly sequence and path planning, and low-precision weldable structural elements. Key advantages of the new assembly paradigm, as well as concept descriptions and ongoing research and technology development efforts for each of the major elements, are summarized.
IP-Based Video Modem Extender Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierson, L G; Boorman, T M; Howe, R E
2003-12-16
Visualization is one of the keys to understanding large complex data sets such as those generated by the large computing resources purchased and developed by the Advanced Simulation and Computing program (aka ASCI). In order to be convenient to researchers, visualization data must be distributed to offices and large complex visualization theaters. Currently, local distribution of the visual data is accomplished by distance-limited modems and RGB switches that simply do not scale to hundreds of users across the local, metropolitan, and WAN distances without incurring large costs in fiber plant installation and maintenance. Wide area application over the DOE Complex is infeasible using these limited-distance RGB extenders. On the other hand, Internet Protocols (IP) over Ethernet is a scalable, well-proven technology that can distribute large volumes of data over these distances. Visual data has been distributed at lower resolutions over IP in industrial applications. This document describes requirements of the ASCI program in visual signal distribution for the purpose of identifying industrial partners willing to develop products to meet ASCI's needs.
Development of Supersonic Retro-Propulsion for Future Mars Entry, Descent, and Landing Systems
NASA Technical Reports Server (NTRS)
Edquist, Karl T.; Dyakonov, Artem A.; Shidner, Jeremy D.; Studak, Joseph W.; Tiggers, Michael A.; Kipp, Devin M.; Prakash, Ravi; Trumble, Kerry A.; Dupzyk, Ian C.; Korzun, Ashley M.
2010-01-01
Recent studies have concluded that Viking-era entry system technologies are reaching their practical limits and must be succeeded by new methods capable of delivering large payloads (greater than 10 metric tons) required for human exploration of Mars. One such technology, termed Supersonic Retro-Propulsion, has been proposed as an enabling deceleration technique. However, in order to be considered for future NASA flight projects, this technology will require significant maturation beyond its current state. This paper proposes a roadmap for advancing the component technologies to a point where Supersonic Retro-Propulsion can be reliably used on future Mars missions to land much larger payloads than are currently possible using Viking-based systems. The development roadmap includes technology gates that are achieved through testing and/or analysis, culminating with subscale flight tests in Earth atmosphere that demonstrate stable and controlled flight. The component technologies requiring advancement include large engines capable of throttling, computational models for entry vehicle aerodynamic/propulsive force and moment interactions, aerothermodynamic environments modeling, entry vehicle stability and control methods, integrated systems engineering and analyses, and high-fidelity six degree-of-freedom trajectory simulations. Quantifiable metrics are also proposed as a means to gage the technical progress of Supersonic Retro-Propulsion. Finally, an aggressive schedule is proposed for advancing the technology through sub-scale flight tests at Earth by 2016.
NASA Technical Reports Server (NTRS)
Gradl, Paul; Valentine, Peter; Crisanti, Matthew; Greene, Sandy Elam
2016-01-01
Upper stage and in-space liquid rocket engines are optimized for performance through the use of high area ratio nozzles to fully expand combustion gases to low exit pressures increasing exhaust velocities. Due to the large size of such nozzles and the related engine performance requirements, carbon-carbon (C/C) composite nozzle extensions are being considered for use in order to reduce weight impacts. NASA and industry partner Carbon-Carbon Advanced Technologies (C-CAT) are working towards advancing the technology readiness level of large-scale, domestically-fabricated, C/C nozzle extensions. These C/C extensions have the ability to reduce the overall costs of extensions relative to heritage metallic and composite extensions and to decrease weight by 50%. Material process and coating developments have advanced over the last several years, but hot fire testing to fully evaluate C/C nozzle extensions in relevant environments has been very limited. NASA and C-CAT have designed, fabricated and hot fire tested multiple subscale nozzle extension test articles of various C/C material systems, with the goal of assessing and advancing the manufacturability of these domestically producible materials as well as characterizing their performance when subjected to the typical environments found in a variety of liquid rocket and scramjet engines. Testing at the MSFC Test Stand 115 evaluated heritage and state-of-the-art C/C materials and coatings, demonstrating the capabilities of the high temperature materials and their fabrication methods. This paper discusses the design and fabrication of the 1.2k-lbf sized carbon-carbon nozzle extensions, provides an overview of the test campaign, presents results of the hot fire testing, and discusses potential follow-on development work.
Recursive renormalization group theory based subgrid modeling
NASA Technical Reports Server (NTRS)
Zhou, YE
1991-01-01
This effort addresses advancing the knowledge and understanding of turbulence theory. Specific problems include studies of subgrid models to understand the effects of unresolved small-scale dynamics on the large-scale motion; if successful, such models might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulations.
2002 Computing and Interdisciplinary Systems Office Review and Planning Meeting
NASA Technical Reports Server (NTRS)
Lytle, John; Follen, Gregory; Lopez, Isaac; Veres, Joseph; Lavelle, Thomas; Sehra, Arun; Freeh, Josh; Hah, Chunill
2003-01-01
The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with NASA Glenn's Propulsion program, NASA Ames, industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This year's review meeting describes the current status of the NPSS and the Object Oriented Development Kit with specific emphasis on the progress made over the past year on air-breathing propulsion applications for aeronautics and space transportation. Major accomplishments include the first 3-D simulation of the primary flow path of a large turbofan engine in less than 15 hours, and the formal release of the NPSS Version 1.5 that includes elements of rocket engine systems and a visual-based syntax layer. NPSS and the Development Kit are managed by the Computing and Interdisciplinary Systems Office (CISO) at the NASA Glenn Research Center and financially supported in fiscal year 2002 by the Computing, Networking and Information Systems (CNIS) project managed at NASA Ames, the Glenn Aerospace Propulsion and Power Program and the Advanced Space Transportation Program.
Fortunato, Santo; Bergstrom, Carl T; Börner, Katy; Evans, James A; Helbing, Dirk; Milojević, Staša; Petersen, Alexander M; Radicchi, Filippo; Sinatra, Roberta; Uzzi, Brian; Vespignani, Alessandro; Waltman, Ludo; Wang, Dashun; Barabási, Albert-László
2018-03-02
Identifying fundamental drivers of science and developing predictive models to capture its evolution are instrumental for the design of policies that can improve the scientific enterprise, for example through enhanced career paths for scientists, better performance evaluation for organizations hosting research, discovery of novel effective funding vehicles, and even identification of promising regions along the scientific frontier. The science of science uses large-scale data on the production of science to search for universal and domain-specific patterns. Here, we review recent developments in this transdisciplinary field. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
Zou, Yun; Hu, Li; Tremp, Mathias; Jin, Yunbo; Chen, Hui; Ma, Gang; Lin, Xiaoxi
2018-02-23
The aim of this study was to repair large periorbital cutaneous defects with good functional and aesthetic outcomes using an innovative technique called PEPSI (periorbital elevation and positioning with secret incisions). In this retrospective study, unilateral periorbital cutaneous defects in 15 patients were repaired by the PEPSI technique. The ages of patients ranged from 3 to 46 years (average, 19 years). The outcome evaluations included scars (Vancouver Scar Scale and visual analog scale score), function and aesthetic appearance of eyelids, and patient satisfaction. The repair size was measured by the maximum advancement distance of the skin flap during the operation. All patients achieved an effective repair with a mean follow-up of 18.3 months. Except for one patient with a small (approximately 0.3 cm) area of necrosis, all patients healed without complication. The mean Vancouver Scar Scale and visual analog scale scores were 2.1 ± 1.7 and 8.5 ± 1.2, respectively. Ideal cosmetic and functional outcomes were achieved in 14 patients (93.3%). All patients achieved complete satisfaction except 1 patient with partial satisfaction. The mean maximum advancement distance of the skin flap was 20.2 mm (range, 8-50 mm). This study demonstrated that the PEPSI technique is an effective method to repair large periorbital cutaneous defects with acceptable functional and aesthetic outcomes.
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1989-01-01
Report reviews history of tau ranging and advocates use of advanced electronic circuitry to revive this composite-code-uplink spacecraft-ranging technique. Very-large-scale integration gives new life to abandoned distance-measuring technique.
From a meso- to micro-scale connectome: array tomography and mGRASP
Rah, Jong-Cheol; Feng, Linqing; Druckmann, Shaul; Lee, Hojin; Kim, Jinhyun
2015-01-01
Mapping mammalian synaptic connectivity has long been an important goal of neuroscience because knowing how neurons and brain areas are connected underpins an understanding of brain function. Meeting this goal requires advanced techniques with single synapse resolution and large-scale capacity, especially at multiple scales tethering the meso- and micro-scale connectome. Among several advanced LM-based connectome technologies, Array Tomography (AT) and mammalian GFP-Reconstitution Across Synaptic Partners (mGRASP) can provide relatively high-throughput mapping of synaptic connectivity at multiple scales. AT- and mGRASP-assisted circuit mapping (ATing and mGRASPing), combined with techniques such as retrograde viruses, brain clearing, and activity indicators, will help unlock the secrets of complex neural circuits. Here, we discuss these useful new tools to enable mapping of brain circuits at multiple scales, some functional implications of spatial synaptic distribution, and future challenges and directions of these endeavors. PMID:26089781
Testing of a Stitched Composite Large-Scale Multi-Bay Pressure Box
NASA Technical Reports Server (NTRS)
Jegley, Dawn; Rouse, Marshall; Przekop, Adam; Lovejoy, Andrew
2016-01-01
NASA has created the Environmentally Responsible Aviation (ERA) Project to develop technologies to reduce aviation's impact on the environment. A critical aspect of this pursuit is the development of a lighter, more robust airframe to enable the introduction of unconventional aircraft configurations. NASA and The Boeing Company have worked together to develop a structural concept that is lightweight and an advancement beyond state-of-the-art composite structures. The Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) is an integrally stiffened panel design where elements are stitched together. The PRSEUS concept is designed to maintain residual load carrying capabilities under a variety of damage scenarios. A series of building block tests were evaluated to explore the fundamental assumptions related to the capability and advantages of PRSEUS panels. The final step in the building block series is an 80%-scale pressure box representing a portion of the center section of a Hybrid Wing Body (HWB) transport aircraft. The testing of this article under maneuver load and internal pressure load conditions is the subject of this paper. The experimental evaluation of this article, along with the other building block tests and the accompanying analyses, has demonstrated the viability of a PRSEUS center body for the HWB vehicle. Additionally, much of the development effort is also applicable to traditional tube-and-wing aircraft, advanced aircraft configurations, and other structures where weight and through-the-thickness strength are design considerations.
Dodge, Somayeh; Bohrer, Gil; Weinzierl, Rolf P.; Davidson, Sarah C.; Kays, Roland; Douglas, David C.; Cruz, Sebastian; Han, J.; Brandes, David; Wikelski, Martin
2013-01-01
The movement of animals is strongly influenced by external factors in their surrounding environment such as weather, habitat types, and human land use. With advances in positioning and sensor technologies, it is now possible to capture animal locations at high spatial and temporal granularities. Likewise, scientists have an increasing access to large volumes of environmental data. Environmental data are heterogeneous in source and format, and are usually obtained at different spatiotemporal scales than movement data. Indeed, there remain scientific and technical challenges in developing linkages between the growing collections of animal movement data and the large repositories of heterogeneous remote sensing observations, as well as in the developments of new statistical and computational methods for the analysis of movement in its environmental context. These challenges include retrieval, indexing, efficient storage, data integration, and analytical techniques.
NASA Automated Fiber Placement Capabilities: Similar Systems, Complementary Purposes
NASA Technical Reports Server (NTRS)
Wu, K. Chauncey; Jackson, Justin R.; Pelham, Larry I.; Stewart, Brian K.
2015-01-01
New automated fiber placement systems at the NASA Langley Research Center and NASA Marshall Space Flight Center provide state-of-the-art composites capabilities to these organizations. These systems support basic and applied research at Langley, complementing large-scale manufacturing and technology development at Marshall. These systems each consist of a multi-degree of freedom mobility platform including a commercial robot, a commercial tool changer mechanism, a bespoke automated fiber placement end effector, a linear track, and a rotational tool support structure. In addition, new end effectors with advanced capabilities may be either bought or developed with partners in industry and academia to extend the functionality of these systems. These systems will be used to build large and small composite parts in support of the ongoing NASA Composites for Exploration Upper Stage Project later this year.
Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)
2000-01-01
HiMAP is a three-level parallel middleware that can be interfaced to a large scale global design environment for code independent, multidisciplinary analysis using high fidelity equations. Aerospace technology needs are rapidly changing. Computational tools compatible with the requirements of national programs such as space transportation are needed. Conventional computational tools are inadequate for modern aerospace design needs. Advanced, modular computational tools are needed, such as those that incorporate the technology of massively parallel processors (MPP).
A stakeholder-driven agenda for advancing the science and practice of scale-up and spread in health.
Norton, Wynne E; McCannon, C Joseph; Schall, Marie W; Mittman, Brian S
2012-12-06
Although significant advances have been made in implementation science, comparatively less attention has been paid to broader scale-up and spread of effective health programs at the regional, national, or international level. To address this gap in research, practice and policy attention, representatives from key stakeholder groups launched an initiative to identify gaps and stimulate additional interest and activity in scale-up and spread of effective health programs. We describe the background and motivation for this initiative and the content, process, and outcomes of two main phases comprising the core of the initiative: a state-of-the-art conference to develop recommendations for advancing scale-up and spread and a follow-up activity to operationalize and prioritize the recommendations. The conference was held in Washington, D.C. during July 2010 and attended by 100 representatives from research, practice, policy, public health, healthcare, and international health communities; the follow-up activity was conducted remotely the following year. Conference attendees identified and prioritized five recommendations (and corresponding sub-recommendations) for advancing scale-up and spread in health: increase awareness, facilitate information exchange, develop new methods, apply new approaches for evaluation, and expand capacity. In the follow-up activity, 'develop new methods' was rated as the most important recommendation; expanding capacity was rated as the least important, although differences were relatively minor. Based on the results of these efforts, we discuss priority activities that are needed to advance research, practice and policy to accelerate the scale-up and spread of effective health programs.
Irizarry, Kristopher J L; Downs, Eileen; Bryden, Randall; Clark, Jory; Griggs, Lisa; Kopulos, Renee; Boettger, Cynthia M; Carr, Thomas J; Keeler, Calvin L; Collisson, Ellen; Drechsler, Yvonne
2017-01-01
Discovering genetic biomarkers associated with disease resistance and enhanced immunity is critical to developing advanced strategies for controlling viral and bacterial infections in different species. Macrophages, important cells of innate immunity, are directly involved in cellular interactions with pathogens, the release of cytokines activating other immune cells and antigen presentation to cells of the adaptive immune response. IFNγ is a potent activator of macrophages and increased production has been associated with disease resistance in several species. This study characterizes the molecular basis for dramatically different nitric oxide production and immune function between the B2 and the B19 haplotype chicken macrophages. A large-scale RNA sequencing approach was employed to sequence the RNA of purified macrophages from each haplotype group (B2 vs. B19) during differentiation and after stimulation. Our results demonstrate that a large number of genes exhibit divergent expression between B2 and B19 haplotype cells both prior to and after stimulation. These differences in gene expression appear to be regulated by complex epigenetic mechanisms that need further investigation.
Large-scale wind turbine structures
NASA Technical Reports Server (NTRS)
Spera, David A.
1988-01-01
The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.
Systematic Benchmarking of Diagnostic Technologies for an Electrical Power System
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Jensen, David; Poll, Scott
2009-01-01
Automated health management is a critical functionality for complex aerospace systems. A wide variety of diagnostic algorithms have been developed to address this technical challenge. Unfortunately, the lack of support to perform large-scale V&V (verification and validation) of diagnostic technologies continues to create barriers to effective development and deployment of such algorithms for aerospace vehicles. In this paper, we describe a formal framework developed for benchmarking of diagnostic technologies. The diagnosed system is the Advanced Diagnostics and Prognostics Testbed (ADAPT), a real-world electrical power system (EPS), developed and maintained at the NASA Ames Research Center. The benchmarking approach provides a systematic, empirical basis to the testing of diagnostic software and is used to provide performance assessment for different diagnostic algorithms.
Machine learning and computer vision approaches for phenotypic profiling.
Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J
2017-01-02
With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.
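For illustration, the following is a minimal sketch of the generic pipeline this review describes (cell segmentation, per-cell feature extraction, and phenotype classification). The choice of scikit-image and scikit-learn, the Otsu threshold, and the random forest classifier are illustrative assumptions, not methods attributed to the authors.

```python
# Minimal sketch of an image-based phenotypic profiling pipeline:
# segment cells, extract per-cell features, then classify phenotypes.
import numpy as np
from skimage import filters, measure
from sklearn.ensemble import RandomForestClassifier

def segment_cells(image):
    """Threshold the image and label connected components as candidate cells."""
    mask = image > filters.threshold_otsu(image)
    return measure.label(mask)

def extract_features(image, labels):
    """Simple morphology/intensity features for each segmented cell."""
    props = measure.regionprops(labels, intensity_image=image)
    return np.array([[p.area, p.eccentricity, p.mean_intensity] for p in props])

# Hypothetical usage with annotated training images:
# X_train, y_train = ..., ...  # features and phenotype labels from curated data
# clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
# phenotypes = clf.predict(extract_features(new_image, segment_cells(new_image)))
```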
1984-07-01
aerosols and sub-pixel-sized clouds all tend to increase Channel 1 with respect to Channel 2 and reduce the computed VIN. Further, the Guide states that... computation of the VIN. Large-scale cloud contamination of pixels, while difficult to correct for, can at least be monitored and affected pixels... techniques have been developed for computer cloud screening. See, for example, Horvath et al. (1982), Gray and McCrary (1981a) and Nixon et al. (1983
Machine learning and computer vision approaches for phenotypic profiling
Morris, Quaid
2017-01-01
With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. PMID:27940887
Peroxisome Biogenesis and Function
Kaur, Navneet; Reumann, Sigrun; Hu, Jianping
2009-01-01
Peroxisomes are small and single membrane-delimited organelles that execute numerous metabolic reactions and have pivotal roles in plant growth and development. In recent years, forward and reverse genetic studies along with biochemical and cell biological analyses in Arabidopsis have enabled researchers to identify many peroxisome proteins and elucidate their functions. This review focuses on the advances in our understanding of peroxisome biogenesis and metabolism, and further explores the contribution of large-scale analysis, such as in silico predictions and proteomics, in augmenting our knowledge of peroxisome function in Arabidopsis. PMID:22303249
Field Guide to Plant Model Systems
Chang, Caren; Bowman, John L.; Meyerowitz, Elliot M.
2016-01-01
For the past several decades, advances in plant development, physiology, cell biology, and genetics have relied heavily on the model (or reference) plant Arabidopsis thaliana. Arabidopsis resembles other plants, including crop plants, in many but by no means all respects. Study of Arabidopsis alone provides little information on the evolutionary history of plants, evolutionary differences between species, plants that survive in different environments, or plants that access nutrients and photosynthesize differently. Empowered by the availability of large-scale sequencing and new technologies for investigating gene function, many new plant models are being proposed and studied. PMID:27716506
Cole, P.D.; Calder, E.S.; Druitt, T.H.; Hoblitt, R.; Robertson, R.; Sparks, R.S.J.; Young, S.R.
1998-01-01
Numerous pyroclastic flows were produced during 1996-97 by collapse of the growing andesitic lava dome at Soufriere Hills Volcano, Montserrat. Measured deposit volumes from these flows range from 0.2 to 9 × 10⁶ m³. Flows range from discrete, single pulse events to sustained large scale dome collapse events. Flows entered the sea on the eastern and southern coasts, depositing large fans of material at the coast. Small runout distance (<1 km) flows had average flow front velocities in the order of 3-10 m/s while flow fronts of the larger runout distance flows (up to 6.5 km) advanced in the order of 15-30 m/s. Many flows were locally highly erosive. Field relations show that development of the fine grained ash cloud surge component was enhanced during the larger sustained events. Periods of elevated pyroclastic flow productivity and sustained dome collapse events are linked to pulses of high magma extrusion rates.
A review of recent developments in rechargeable lithium-sulfur batteries.
Kang, Weimin; Deng, Nanping; Ju, Jingge; Li, Quanxiang; Wu, Dayong; Ma, Xiaomin; Li, Lei; Naebe, Minoo; Cheng, Bowen
2016-09-22
The research and development of advanced energy-storage systems must meet a large number of requirements, including high energy density, natural abundance of the raw material, low cost and environmental friendliness, and particularly reasonable safety. As the demands of high-performance batteries are continuously increasing, with large-scale energy storage systems and electric mobility equipment, lithium-sulfur batteries have become an attractive candidate for the new generation of high-performance batteries due to their high theoretical capacity (1675 mA h g⁻¹) and energy density (2600 Wh kg⁻¹). However, rapid capacity attenuation with poor cycle and rate performances make the batteries far from ideal with respect to real commercial applications. Outstanding breakthroughs and achievements have been made to alleviate these problems in the past ten years. This paper presents an overview of recent advances in lithium-sulfur battery research. We cover the research and development to date on various components of lithium-sulfur batteries, including cathodes, binders, separators, electrolytes, anodes, collectors, and some novel cell configurations. The current trends in materials selection for batteries are reviewed and various choices of cathode, binder, electrolyte, separator, anode, and collector materials are discussed. The current challenges associated with the use of batteries and their materials selection are listed and future perspectives for this class of battery are also discussed.
Development of Large-Scale Spacecraft Fire Safety Experiments
NASA Technical Reports Server (NTRS)
Ruff, Gary A.; Urban, David; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Cowlard, Adam J.;
2013-01-01
The status is presented of a spacecraft fire safety research project that is under development to reduce the uncertainty and risk in the design of spacecraft fire safety systems by testing at nearly full scale in low-gravity. Future crewed missions are expected to be more complex and longer in duration than previous exploration missions outside of low-earth orbit. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low-gravity, the need for realistic scale testing at reduced gravity has been demonstrated. To address this gap in knowledge, a project has been established under the NASA Advanced Exploration Systems Program under the Human Exploration and Operations Mission directorate with the goal of substantially advancing our understanding of the spacecraft fire safety risk. Associated with the project is an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The experiments are under development to be conducted in an Orbital Science Corporation Cygnus vehicle after it has undocked from the ISS. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. The tests will be fully automated with the data downlinked at the conclusion of the test before the Cygnus vehicle reenters the atmosphere. A computer modeling effort will complement the experimental effort. The international topical team is collaborating with the NASA team in the definition of the experiment requirements and performing supporting analysis, experimentation and technology development. The status of the overall experiment and the associated international technology development efforts are summarized.
ROADNET: A Real-time Data Aware System for Earth, Oceanographic, and Environmental Applications
NASA Astrophysics Data System (ADS)
Vernon, F.; Hansen, T.; Lindquist, K.; Ludascher, B.; Orcutt, J.; Rajasekar, A.
2003-12-01
The Real-time Observatories, Application, and Data management Network (ROADNet) Program aims to develop an integrated, seamless, and transparent environmental information network that will deliver geophysical, oceanographic, hydrological, ecological, and physical data to a variety of users in real-time. ROADNet is a multidisciplinary, multinational partnership of researchers, policymakers, natural resource managers, educators, and students who aim to use the data to advance our understanding and management of coastal, ocean, riparian, and terrestrial Earth systems in Southern California, Mexico, and well offshore. To date, project activity and funding have focused on the design and deployment of network linkages and on the exploratory development of the real-time data management system. We are currently adapting powerful "Data Grid" technologies to the unique challenges associated with the management and manipulation of real-time data. Current "Grid" projects deal with static data files, and significant technical innovation is required to address fundamental problems of real-time data processing, integration, and distribution. The technologies developed through this research will create a system that dynamically adapts downstream processing, cataloging, and data access interfaces when sensors are added or removed from the system; provides for real-time processing and monitoring of data streams, detecting events and triggering computations, sensor and logger modifications, and other actions; integrates heterogeneous data from multiple (signal) domains; and provides for large-scale archival and querying of "consolidated" data. The software tools which must be developed do not exist, although limited prototype systems are available. This research has implications for the success of large-scale NSF initiatives in the Earth sciences (EarthScope), ocean sciences (OOI - Ocean Observatories Initiative), biological sciences (NEON - National Ecological Observatory Network) and civil engineering (NEES - Network for Earthquake Engineering Simulation). Each of these large scale initiatives aims to collect real-time data from thousands of sensors, and each will require new technologies to process, manage, and communicate real-time multidisciplinary environmental data on regional, national, and global scales.
Yarkoni, Tal
2012-01-01
Traditional pre-publication peer review of scientific output is a slow, inefficient, and unreliable process. Efforts to replace or supplement traditional evaluation models with open evaluation platforms that leverage advances in information technology are slowly gaining traction, but remain in the early stages of design and implementation. Here I discuss a number of considerations relevant to the development of such platforms. I focus particular attention on three core elements that next-generation evaluation platforms should strive to emphasize, including (1) open and transparent access to accumulated evaluation data, (2) personalized and highly customizable performance metrics, and (3) appropriate short-term incentivization of the userbase. Because all of these elements have already been successfully implemented on a large scale in hundreds of existing social web applications, I argue that development of new scientific evaluation platforms should proceed largely by adapting existing techniques rather than engineering entirely new evaluation mechanisms. Successful implementation of open evaluation platforms has the potential to substantially advance both the pace and the quality of scientific publication and evaluation, and the scientific community has a vested interest in shifting toward such models as soon as possible. PMID:23060783
NASA Technical Reports Server (NTRS)
Ormsby, J. P.
1982-01-01
An examination of the possibilities of using Landsat data to simulate NOAA-6 Advanced Very High Resolution Radiometer (AVHRR) data on two channels, as well as using actual NOAA-6 imagery, for large-scale hydrological studies is presented. A running average of 18 consecutive pixels was obtained to approximate 1 km resolution; the data taken by the Landsat scanners were scaled up to 8-bit values and investigated for different gray levels. AVHRR data comprising five channels of 10-bit, band-interleaved information covering 10 deg latitude were analyzed and a suitable pixel grid was chosen for comparison with the Landsat data in a supervised classification format, an unsupervised mode, and with ground truth. Landcover delineation was explored by removing snow, water, and cloud features from the cluster analysis, and resulted in less than 10% difference. Low resolution large-scale data were determined to be useful for characterizing some landcover features if weekly and/or monthly updates are maintained.
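As an illustration of the resolution-degradation step described above, the following sketch applies a running average over 18 consecutive pixels along a scan line and rescales the result to 8-bit gray levels. The window length is taken from the abstract; the input values and scaling are hypothetical.

```python
# Sketch: approximate AVHRR-like ~1 km resolution from a Landsat scan line by a
# running average over 18 consecutive pixels, then rescale to 8-bit gray levels.
import numpy as np

def running_average(scan_line, window=18):
    """Running mean over `window` consecutive pixels of a 1-D scan line."""
    kernel = np.ones(window) / window
    return np.convolve(scan_line, kernel, mode="valid")

line = np.random.randint(0, 64, size=3000).astype(float)   # hypothetical scanner counts
coarse = running_average(line)
coarse_8bit = np.clip(np.round(coarse * 255.0 / coarse.max()), 0, 255).astype(np.uint8)
```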
NASA Astrophysics Data System (ADS)
Hostache, Renaud; Rains, Dominik; Chini, Marco; Lievens, Hans; Verhoest, Niko E. C.; Matgen, Patrick
2017-04-01
Motivated by climate change and its impact on the scarcity or excess of water in many parts of the world, several agencies and research institutions have taken initiatives in monitoring and predicting the hydrologic cycle at a global scale. Such a monitoring/prediction effort is important for understanding the vulnerability to extreme hydrological events and for providing early warnings. This can be based on an optimal combination of hydro-meteorological models and remote sensing, in which satellite measurements can be used as forcing or calibration data or for regularly updating the model states or parameters. Many advances have been made in these domains and the near future will bring new opportunities with respect to remote sensing as a result of the increasing number of spaceborne sensors enabling the large scale monitoring of water resources. Besides these advances, there is currently a tendency to refine and further complicate physically-based hydrologic models to better capture the hydrologic processes at hand. However, this may not necessarily be beneficial for large-scale hydrology, as computational efforts are therefore increasing significantly. As a matter of fact, a novel thematic science question that is to be investigated is whether a flexible conceptual model can match the performance of a complex physically-based model for hydrologic simulations at large scale. In this context, the main objective of this study is to investigate how innovative techniques that allow for the estimation of soil moisture from satellite data can help in reducing errors and uncertainties in large scale conceptual hydro-meteorological modelling. A spatially distributed conceptual hydrologic model has been set up based on recent developments of the SUPERFLEX modelling framework. As it requires limited computational efforts, this model enables early warnings for large areas. Using as forcings the ERA-Interim public dataset and coupled with the CMEM radiative transfer model, SUPERFLEX is capable of predicting runoff, soil moisture, and SMOS-like brightness temperature time series. Such a model is traditionally calibrated using only discharge measurements. In this study we designed a multi-objective calibration procedure based on both discharge measurements and SMOS-derived brightness temperature observations in order to evaluate the added value of remotely sensed soil moisture data in the calibration process. As a test case we set up the SUPERFLEX model for the large scale Murray-Darling catchment in Australia (approximately 1 million km2). When compared to in situ soil moisture time series, model predictions show good agreement resulting in correlation coefficients exceeding 70% and Root Mean Squared Errors below 1%. When benchmarked with the physically based land surface model CLM, SUPERFLEX exhibits similar performance levels. By adapting the runoff routing function within the SUPERFLEX model, the predicted discharge results in a Nash-Sutcliffe Efficiency exceeding 0.7 over both the calibration and the validation periods.
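As a minimal sketch of the multi-objective calibration idea described above, a single cost function can weigh a discharge-fit term against a brightness-temperature-fit term. The equal weighting, the choice of Nash-Sutcliffe efficiency and RMSE as metrics, and the model interface are illustrative assumptions, not the study's exact formulation.

```python
# Sketch of a multi-objective calibration cost combining discharge and
# SMOS-like brightness-temperature terms (weights and metrics are assumptions).
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency of simulated vs. observed discharge."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(sim, obs):
    return float(np.sqrt(np.mean((np.asarray(sim) - np.asarray(obs)) ** 2)))

def calibration_cost(params, model, q_obs, tb_obs, w_q=0.5, w_tb=0.5):
    """Cost to minimize: trade off discharge misfit (1 - NSE) against Tb RMSE."""
    q_sim, tb_sim = model(params)   # hypothetical model returning both time series
    return w_q * (1.0 - nse(q_sim, q_obs)) + w_tb * rmse(tb_sim, tb_obs)

# A global optimizer (e.g., scipy.optimize.differential_evolution) could then
# search the parameter space for the minimum of calibration_cost.
```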
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Supinski, B R; Miller, B P; Liblit, B
2011-09-13
Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two years of the project, we have successfully extended STAT to determine the relative progress of different MPI processes. We have shown that STAT, which is now included in the debugging tools distributed by Cray with their large-scale systems, substantially reduces the scale at which traditional debugging techniques are applied. We have extended CBI to large-scale systems and developed new compiler based analyses that reduce its instrumentation overhead. Our results demonstrate that CBI can identify the source of errors in large-scale applications. Finally, we have developed MPIecho, a new technique that will reduce the time required to perform key correctness analyses, such as the detection of writes to unallocated memory. Overall, our research results are the foundations for new debugging paradigms that will improve application scientist productivity by reducing the time to determine which package or module contains the root cause of a problem that arises at all scales of our high end systems. While we have made substantial progress in the first two years of CoPS research, significant work remains. While STAT provides scalable debugging assistance for incorrect application runs, we could apply its techniques to assertions in order to observe deviations from expected behavior. Further, we must continue to refine STAT's techniques to represent behavioral equivalence classes efficiently as we expect systems with millions of threads in the next year.
We are exploring new CBI techniques that can assess the likelihood that execution deviations from past behavior are the source of erroneous execution. Finally, we must develop usable correctness analyses that apply the MPIecho parallelization strategy in order to locate coding errors. We expect to make substantial progress on these directions in the next year but anticipate that significant work will remain to provide usable, scalable debugging paradigms.
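The following is a conceptual sketch of the behavior-equivalence-class idea behind stack-trace-based tools such as STAT: ranks with identical call stacks are grouped together, and small or divergent groups point the developer toward suspect behavior. It is illustrative only and does not reflect the tool's actual implementation or interfaces.

```python
# Conceptual sketch: group MPI ranks into behavior-equivalence classes by their
# call stacks; divergent classes are the first place to look for an error.
from collections import defaultdict

def group_by_stack(stacks):
    """stacks: dict mapping rank -> tuple of frames (outermost to innermost)."""
    classes = defaultdict(list)
    for rank, trace in stacks.items():
        classes[trace].append(rank)
    return classes

# Hypothetical snapshot of four ranks; rank 2 shows divergent behavior.
stacks = {
    0: ("main", "solver", "MPI_Allreduce"),
    1: ("main", "solver", "MPI_Allreduce"),
    2: ("main", "io_write"),
    3: ("main", "solver", "MPI_Allreduce"),
}
for trace, ranks in group_by_stack(stacks).items():
    print(len(ranks), "rank(s) at", " > ".join(trace))
```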
A critical assessment of boron target compounds for boron neutron capture therapy.
Hawthorne, M Frederick; Lee, Mark W
2003-01-01
Boron neutron capture therapy (BNCT) has undergone dramatic developments since its inception by Locher in 1936 and the development of nuclear energy during World War II. The ensuing Cold War spawned the entirely new field of polyhedral borane chemistry, rapid advances in nuclear reactor technology and a corresponding increase in the number of reactors potentially available for BNCT. This effort has been largely oriented toward the eradication of glioblastoma multiforme (GBM) and melanoma with reduced interest in other types of malignancies. The design and synthesis of boron-10 target compounds needed for BNCT was not channeled to those types of compounds specifically required for GBM or melanoma. Consequently, a number of potentially useful boron agents are known which have not been biologically evaluated beyond a cursory examination and only three boron-10 enriched target species are approved for human use following their Investigational New Drug classification by the US Food and Drug Administration: BSH, BPA and GB-10. All ongoing clinical trials with GBM and melanoma are necessarily conducted with one of these three species and most often with BPA. The further development of BNCT is presently stalled by the absence of strong support for advanced compound evaluation and compound discovery driven by recent advances in biology and chemistry. A rigorous demonstration of BNCT efficacy surpassing that of currently available protocols has yet to be achieved. This article discusses the past history of compound development, contemporary problems such as compound classification and those problems which impede future advances. The latter include means for biological evaluation of new (and existing) boron target candidates at all stages of their development and the large-scale synthesis of boron target species for clinical trials and beyond. The future of BNCT is bright if latitude is given to the choice of clinical disease to be treated and if a recognized study demonstrating improved efficacy is completed. Eventually, BNCT in some form will be commercialized.
Hydrothermal crystal growth of oxides for optical applications
NASA Astrophysics Data System (ADS)
McMillen, Colin David
2007-12-01
The manipulation of light has proven to be an integral part of today's technology-based society. In particular, there is great interest in obtaining coherent radiation in all regions of the optical spectrum to advance technology in military, medical, industrial, scientific and consumer fields. Exploring new crystal growth techniques as well as the growth of new optical materials is critical in the advancement of solid state optics. Surprisingly, the academic world devotes little attention to the growth of large crystals. This shortcoming has left gaps in the optical spectrum inaccessible by solid state devices. This dissertation explores the hydrothermal crystal growth of materials that could fill two such gaps. The first gap exists in the deep-UV region, particularly below 200 nm. Some materials such as LiB3O5 and beta-BaB2O4 can generate coherent light at wavelengths as low as 205 nm. The growth of these materials was explored to investigate the feasibility of the hydrothermal method as a new technique for growing these crystals. Particular attention was paid to the descriptive chemistry surrounding these systems, and several novel structures were elucidated. The study was also extended to the growth of materials that could be used for the generation of coherent light as low as 155 nm. Novel synthetic schemes for Sr2Be2B2O7 and KBe2BO3F2 were developed and the growth of large crystals was explored. An extensive study of the structures, properties and crystal growth of related compounds, RbBe2BO3F2 and CsBe2BO3F2, was also undertaken. Optimization of a number of parameters within this family of compounds led to the hydrothermal growth of large, high quality single crystals at rates suitable for large-scale growth. The second gap in technology is in the area of high average power solid state lasers emitting in the 1 μm and eye-safe (>1.5 μm) regions. A hydrothermal technique was developed to grow high quality crystals of Sc2O3 and Sc2O3 doped with suitable lanthanide activator ions. Preliminary spectroscopic studies were performed and large crystals were again grown at rates suitable for commercial production. The synthesis of ultra-high purity Ln2O3 (Ln = Sc, Y, La-Lu) nanoparticles was also explored to advance the development of ceramic-based solid state lasers. Crystal growth is a complex task involving a great number of intricacies that must be understood and balanced. This dissertation has advanced the art and science of growing crystals, and documented the development of large, high quality crystals of advanced optical materials. The materials and hydrothermal crystal growth techniques developed over the course of this work represent important progress toward controlling the optical spectrum.
Advanced ACTPol Multichroic Polarimeter Array Fabrication Process for 150 mm Wafers
NASA Astrophysics Data System (ADS)
Duff, S. M.; Austermann, J.; Beall, J. A.; Becker, D.; Datta, R.; Gallardo, P. A.; Henderson, S. W.; Hilton, G. C.; Ho, S. P.; Hubmayr, J.; Koopman, B. J.; Li, D.; McMahon, J.; Nati, F.; Niemack, M. D.; Pappas, C. G.; Salatino, M.; Schmitt, B. L.; Simon, S. M.; Staggs, S. T.; Stevens, J. R.; Van Lanen, J.; Vavagiakis, E. M.; Ward, J. T.; Wollack, E. J.
2016-08-01
Advanced ACTPol (AdvACT) is a third-generation cosmic microwave background receiver to be deployed in 2016 on the Atacama Cosmology Telescope (ACT). Spanning five frequency bands from 25 to 280 GHz and having just over 5600 transition-edge sensor (TES) bolometers, this receiver will exhibit increased sensitivity and mapping speed compared to previously fielded ACT instruments. This paper presents the fabrication processes developed by NIST to scale to large arrays of feedhorn-coupled multichroic AlMn-based TES polarimeters on 150-mm diameter wafers. In addition to describing the streamlined fabrication process which enables high yields of densely packed detectors across larger wafers, we report the details of process improvements for sensor (AlMn) and insulator (SiN_x) materials and microwave structures, and the resulting performance improvements.
Advanced ACTPol Multichroic Polarimeter Array Fabrication Process for 150 mm Wafers
NASA Technical Reports Server (NTRS)
Duff, S. M.; Austermann, J.; Beall, J. A.; Becker, D.; Datta, R.; Gallardo, P. A.; Henderson, S. W.; Hilton, G. C.; Ho, S. P.; Hubmayr, J.;
2016-01-01
Advanced ACTPol (AdvACT) is a third-generation cosmic microwave background receiver to be deployed in 2016 on the Atacama Cosmology Telescope (ACT). Spanning five frequency bands from 25 to 280 GHz and having just over 5600 transition-edge sensor (TES) bolometers, this receiver will exhibit increased sensitivity and mapping speed compared to previously fielded ACT instruments. This paper presents the fabrication processes developed by NIST to scale to large arrays of feedhorn-coupled multichroic AlMn-based TES polarimeters on 150-mm diameter wafers. In addition to describing the streamlined fabrication process which enables high yields of densely packed detectors across larger wafers, we report the details of process improvements for sensor (AlMn) and insulator (SiN(sub x)) materials and microwave structures, and the resulting performance improvements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srivastava, V.; Fannin, K.F.; Biljetina, R.
1986-07-01
The Institute of Gas Technology (IGT) conducted a comprehensive laboratory-scale research program to develop and optimize the anaerobic digestion process for producing methane from water hyacinth and sludge blends. This study focused on digester design and operating techniques, which gave improved methane yields and production rates over those observed using conventional digesters. The final digester concept and the operating experience were utilized to design and operate a large-scale experimental test unit (ETU) at Walt Disney World, Florida. This paper describes the novel digester design, operating techniques, and the results obtained in the laboratory. The paper also discusses a kinetic model which predicts methane yield, methane production rate, and digester effluent solids as a function of retention time. This model was successfully utilized to predict the performance of the ETU. 15 refs., 6 figs., 6 tabs.
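To illustrate the kind of relationship such a kinetic model captures, the sketch below uses a generic first-order saturation of methane yield with hydraulic retention time. The functional form and the parameter values are stand-in assumptions for illustration only, not the IGT/ETU model from the paper.

```python
# Generic first-order kinetic sketch relating methane yield to retention time;
# a stand-in for illustration, not the specific model discussed in the paper.
import numpy as np

def methane_yield(hrt_days, y_ultimate=0.35, k=0.15):
    """Yield (e.g., m^3 CH4 per kg volatile solids fed) vs. hydraulic retention time.

    y_ultimate and k are hypothetical parameters; in practice they would be
    fit to laboratory digester data.
    """
    hrt = np.asarray(hrt_days, dtype=float)
    return y_ultimate * (1.0 - np.exp(-k * hrt))

def methane_production_rate(hrt_days, loading_vs=2.0, **kw):
    """Volumetric rate = yield * organic loading rate (kg VS per m^3 per day)."""
    return methane_yield(hrt_days, **kw) * loading_vs

print(methane_yield([5, 10, 20, 40]))   # yield increases and saturates with HRT
```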
Progress on the Development of the hPIC Particle-in-Cell Code
NASA Astrophysics Data System (ADS)
Dart, Cameron; Hayes, Alyssa; Khaziev, Rinat; Marcinko, Stephen; Curreli, Davide; Laboratory of Computational Plasma Physics Team
2017-10-01
Advancements were made in the development of the kinetic-kinetic electrostatic Particle-in-Cell code, hPIC, designed for large-scale simulation of the Plasma-Material Interface. hPIC achieved a weak scaling efficiency of 87% using the Algebraic Multigrid Solver BoomerAMG from the PETSc library on more than 64,000 cores of the Blue Waters supercomputer at the University of Illinois at Urbana-Champaign. The code successfully simulates two-stream instability and a volume of plasma over several square centimeters of surface extending out to the presheath in kinetic-kinetic mode. Results from a parametric study of the plasma sheath in strongly magnetized conditions will be presented, as well as a detailed analysis of the plasma sheath structure at grazing magnetic angles. The distribution function and its moments will be reported for plasma species in the simulation domain and at the material surface for plasma sheath simulations.
Compact Multimedia Systems in Multi-chip Module Technology
NASA Technical Reports Server (NTRS)
Fang, Wai-Chi; Alkalaj, Leon
1995-01-01
This tutorial paper shows advanced multimedia system designs based on multi-chip module (MCM) technologies that provide essential computing, compression, communication, and storage capabilities for various large scale information highway applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muchero, Wellington; Labbe, Jessy L; Priya, Ranjan
2014-01-01
To date, Populus ranks among a few plant species with a complete genome sequence and other highly developed genomic resources. With the first genome sequence among all tree species, Populus has been adopted as a suitable model organism for genomic studies in trees. However, far from being just a model species, Populus is a key renewable economic resource that plays a significant role in providing raw materials for the biofuel and pulp and paper industries. Therefore, aside from leading frontiers of basic tree molecular biology and ecological research, Populus leads frontiers in addressing global economic challenges related to fuel and fiber production. The latter fact suggests that research aimed at improving quality and quantity of Populus as a raw material will likely drive the pursuit of more targeted and deeper research in order to unlock the economic potential tied up in the molecular biology processes that drive this tree species. Advances in genome sequence-driven technologies, such as resequencing individual genotypes, which in turn facilitate large-scale SNP discovery and identification of large-scale polymorphisms, are key determinants of future success in these initiatives. In this treatise we discuss implications of genome sequence-enabled technologies on Populus genomic and genetic studies of complex and specialized traits.
Combining points and lines in rectifying satellite images
NASA Astrophysics Data System (ADS)
Elaksher, Ahmed F.
2017-09-01
The rapid advance in remote sensing technologies has established the potential to gather accurate and reliable information about the Earth's surface using high resolution satellite images. Remote sensing satellite images of less than one-meter pixel size are currently used in large-scale mapping. Rigorous photogrammetric equations are usually used to describe the relationship between the image coordinates and ground coordinates. These equations require the knowledge of the exterior and interior orientation parameters of the image that might not be available. On the other hand, the parallel projection transformation could be used to represent the mathematical relationship between the image-space and object-space coordinate systems and provides the required accuracy for large-scale mapping using fewer ground control features. This article investigates the differences between point-based and line-based parallel projection transformation models in rectifying satellite images with different resolutions. The point-based parallel projection transformation model and its extended form are presented and the corresponding line-based forms are developed. Results showed that the RMS values computed using the point- or line-based transformation models are equivalent and satisfy the requirement for large-scale mapping. The differences between the transformation parameters computed using the point- and line-based transformation models are insignificant. The results showed high correlation between the differences in the ground elevation and the RMS.
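For orientation, the basic point-based parallel projection can be written as a linear mapping from ground to image coordinates, x = a1*X + a2*Y + a3*Z + a4 and y = a5*X + a6*Y + a7*Z + a8, whose eight parameters are recoverable from ground control points by least squares. The sketch below shows this standard point-based form only; the paper's extended and line-based variants are not reproduced, and the interface is an illustrative assumption.

```python
# Sketch: fit the 8-parameter point-based parallel projection by linear least
# squares over ground control points, then apply it to new ground coordinates.
import numpy as np

def fit_parallel_projection(ground_xyz, image_xy):
    """ground_xyz: (n,3) control point coords; image_xy: (n,2) image coords."""
    n = ground_xyz.shape[0]
    A = np.hstack([ground_xyz, np.ones((n, 1))])          # design matrix, shape (n,4)
    coeffs_x, *_ = np.linalg.lstsq(A, image_xy[:, 0], rcond=None)
    coeffs_y, *_ = np.linalg.lstsq(A, image_xy[:, 1], rcond=None)
    return coeffs_x, coeffs_y                               # (a1..a4), (a5..a8)

def apply_parallel_projection(ground_xyz, coeffs_x, coeffs_y):
    A = np.hstack([ground_xyz, np.ones((ground_xyz.shape[0], 1))])
    return np.column_stack([A @ coeffs_x, A @ coeffs_y])
```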
Du, Hui; Chen, Xiaobo; Xi, Juntong; Yu, Chengyi; Zhao, Bao
2017-12-12
Large-scale surfaces are prevalent in advanced manufacturing industries, and 3D profilometry of these surfaces plays a pivotal role in quality control. This paper proposes a novel and flexible large-scale 3D scanning system assembled by combining a robot, a binocular structured light scanner and a laser tracker. The measurement principle and system construction of the integrated system are introduced. A mathematical model is established for the global data fusion. Subsequently, a robust method is introduced for the establishment of the end coordinate system. As for hand-eye calibration, the calibration ball is observed by the scanner and the laser tracker simultaneously. With this data, the hand-eye relationship is solved, and then an algorithm is built to get the transformation matrix between the end coordinate system and the world coordinate system. A validation experiment is designed to verify the proposed algorithms. First, a hand-eye calibration experiment is performed and the transformation matrix is computed. Then a car body rear is measured 22 times in order to verify the global data fusion algorithm. The 3D shape of the rear is reconstructed successfully. To evaluate the precision of the proposed method, a metric tool is built and the results are presented.
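The essence of the global data fusion described above is chaining homogeneous transformations so that points measured in the scanner frame are expressed in the world (laser tracker) frame. The sketch below shows that chaining only; the matrix names are illustrative placeholders for the calibrated transforms, not the paper's notation.

```python
# Sketch: map scanner-frame points into the world frame by chaining 4x4
# homogeneous transforms, p_world = T_world_end @ T_end_scanner @ p_scanner.
import numpy as np

def to_homogeneous(points):
    """(n,3) points -> (n,4) homogeneous coordinates."""
    return np.hstack([points, np.ones((points.shape[0], 1))])

def scanner_to_world(points_scanner, T_world_end, T_end_scanner):
    """Apply the chained transform to each scanner-frame point."""
    T = T_world_end @ T_end_scanner                  # both 4x4 homogeneous matrices
    return (to_homogeneous(points_scanner) @ T.T)[:, :3]

# Example with identity transforms standing in for the calibrated matrices:
pts = np.array([[0.1, 0.2, 1.5]])
print(scanner_to_world(pts, np.eye(4), np.eye(4)))
```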
Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases
Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.
2012-01-01
Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673
Martin P. Schilling; Paul G. Wolf; Aaron M. Duffy; Hardeep S. Rai; Carol A. Rowe; Bryce A. Richardson; Karen E. Mock
2014-01-01
Continuing advances in nucleotide sequencing technology are inspiring a suite of genomic approaches in studies of natural populations. Researchers are faced with data management and analytical scales that are increasing by orders of magnitude. With such dramatic advances comes a need to understand biases and error rates, which can be propagated and magnified in large-...
Thibault, J. C.; Roe, D. R.; Eilbeck, K.; Cheatham, T. E.; Facelli, J. C.
2015-01-01
Biomolecular simulations aim to simulate structure, dynamics, interactions, and energetics of complex biomolecular systems. With the recent advances in hardware, it is now possible to use more complex and accurate models, but also reach time scales that are biologically significant. Molecular simulations have become a standard tool for toxicology and pharmacology research, but organizing and sharing data – both within the same organization and among different ones – remains a substantial challenge. In this paper we review our recent work leading to the development of a comprehensive informatics infrastructure to facilitate the organization and exchange of biomolecular simulations data. Our efforts include the design of data models and dictionary tools that allow the standardization of the metadata used to describe the biomedical simulations, the development of a thesaurus and ontology for computational reasoning when searching for biomolecular simulations in distributed environments, and the development of systems based on these models to manage and share the data at a large scale (iBIOMES), and within smaller groups of researchers at laboratory scale (iBIOMES Lite), that take advantage of the standardization of the meta data used to describe biomolecular simulations. PMID:26387907
Solar Cell and Array Technology Development for NASA Solar Electric Propulsion Missions
NASA Technical Reports Server (NTRS)
Piszczor, Michael; McNatt, Jeremiah; Mercer, Carolyn; Kerslake, Tom; Pappa, Richard
2012-01-01
NASA is currently developing advanced solar cell and solar array technologies to support future exploration activities. These advanced photovoltaic technology development efforts are needed to enable very large (multi-hundred kilowatt) power systems that must be compatible with solar electric propulsion (SEP) missions. The technology being developed must address a wide variety of requirements and cover the necessary advances in solar cell, blanket integration, and large solar array structures that are needed for this class of missions. This paper will summarize NASA's plans for high power SEP missions, initial mission studies and power system requirements, plans for advanced photovoltaic technology development, and the status of specific cell and array technology development and testing that have already been conducted.
NASA Astrophysics Data System (ADS)
Kennedy, Scott Warren
A steady decline in the cost of wind turbines and increased experience in their successful operation have brought this technology to the forefront of viable alternatives for large-scale power generation. Methodologies for understanding the costs and benefits of large-scale wind power development, however, are currently limited. In this thesis, a new and widely applicable technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic modeling techniques to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. A method for including the spatial smoothing effect of geographically dispersed wind farms is also introduced. The model has been used to analyze potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle (NGCC) and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on natural gas and coal prices is also discussed. In power systems with a high penetration of wind generated electricity, the intermittent availability of wind power may influence hourly spot prices. A price responsive electricity demand model is introduced that shows a small increase in wind power value when consumers react to hourly spot prices. The effectiveness of this mechanism depends heavily on estimates of the own- and cross-price elasticities of aggregate electricity demand. This work makes a valuable contribution by synthesizing information from research in power market economics, power system reliability, and environmental impact assessment, to develop a comprehensive methodology for analyzing wind power in the context of long-term energy planning.
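The flavor of the calculation can be shown with a toy Monte Carlo sketch. Everything below is a placeholder assumption (the demand and wind distributions, the displaced-fuel cost, the emission factor, and the damage value), and capacity value, dispatch constraints, and spatial smoothing, which the thesis models carefully, are omitted.

```python
import numpy as np

def annual_social_benefit(n_hours=8760, seed=0,
                          fuel_cost_per_mwh=45.0,     # displaced gas generation, $/MWh
                          co2_t_per_mwh=0.37,         # emission factor of displaced plant
                          damage_per_t_co2=30.0):     # assumed environmental damage, $/t
    """Hedged, illustrative sketch: value wind output by the fuel cost and CO2
    damages it avoids when it displaces gas-fired generation."""
    rng = np.random.default_rng(seed)
    demand_mw = 900.0 + 150.0 * rng.standard_normal(n_hours)   # toy hourly demand
    wind_mw = 300.0 * rng.beta(1.5, 3.0, n_hours)               # toy hourly wind output
    displaced_mwh = np.minimum(wind_mw, demand_mw)              # wind backs down gas
    benefit = displaced_mwh * (fuel_cost_per_mwh
                               + co2_t_per_mwh * damage_per_t_co2)
    return benefit.sum()

print(f"illustrative annual social benefit: ${annual_social_benefit():,.0f}")
```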
Using Syntactic Patterns to Enhance Text Analytics
ERIC Educational Resources Information Center
Meyer, Bradley B.
2017-01-01
Large-scale product and service reviews proliferate and are commonly found across the web. The ability to harvest, digest and analyze a large corpus of reviews from online websites remains, however, a difficult problem. This problem is referred to as "opinion mining." Opinion mining is an important area of research as advances in the…
Targeted enrichment strategies for next-generation plant biology
Richard Cronn; Brian J. Knaus; Aaron Liston; Peter J. Maughan; Matthew Parks; John V. Syring; Joshua Udall
2012-01-01
The dramatic advances offered by modern DNA sequencers continue to redefine the limits of what can be accomplished in comparative plant biology. Even with recent achievements, however, plant genomes present obstacles that can make it difficult to execute large-scale population and phylogenetic studies on next-generation sequencing platforms. Factors like large genome...
MGIS: Managing banana (Musa spp.) genetic resources information and high-throughput genotyping data
USDA-ARS?s Scientific Manuscript database
Unraveling genetic diversity held in genebanks on a large scale is underway, due to the advances in Next-generation sequence-based technologies that produce high-density genetic markers for a large number of samples at low cost. Genebank users should be in a position to identify and select germplasm...
NASA Technical Reports Server (NTRS)
Valinia, Azita; Moe, Rud; Seery, Bernard D.; Mankins, John C.
2013-01-01
We present a concept for an ISS-based optical system assembly demonstration designed to advance technologies related to future large in-space optical facilities deployment, including space solar power collectors and large-aperture astronomy telescopes. The large solar power collector problem is not unlike the large astronomical telescope problem, but it should, at least conceptually, be easier given the tolerances involved. We strive in this application to leverage heavily the work done on the NASA Optical Testbed Integration on ISS Experiment (OpTIIX) effort to erect a 1.5 m imaging telescope on the International Space Station (ISS). Specifically, we examine a robotic assembly sequence for constructing a large (meter diameter) slightly aspheric or spherical primary reflector, composed of hexagonal mirror segments affixed to a lightweight rigidizing backplane structure. This approach, together with a structured robot assembler, will be shown to be scalable to the area and areal densities required for large-scale solar concentrator arrays.
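As a rough, purely geometric illustration of the segmented-primary concept (not the OpTIIX or solar-concentrator design), the sketch below lays segment centers on a regular hexagonal grid and keeps those whose hexagons fit entirely inside a circular aperture; the aperture and segment sizes are arbitrary placeholders.

```python
import numpy as np

def hex_segment_centers(aperture_m=1.0, flat_to_flat_m=0.15):
    """Centers of hexagonal mirror segments (flat-to-flat width s) tiled on a
    hex grid and entirely contained in a circular aperture of given diameter."""
    s = flat_to_flat_m
    circumradius = s / np.sqrt(3.0)          # corner-to-center distance of one segment
    centers = []
    for row in range(-10, 11):
        for col in range(-10, 11):
            x = col * s + (row % 2) * s / 2.0        # every other row is offset
            y = row * s * np.sqrt(3.0) / 2.0
            if np.hypot(x, y) + circumradius <= aperture_m / 2.0:
                centers.append((x, y))
    return np.array(centers)

print(len(hex_segment_centers()), "segments fit inside a 1 m aperture")
```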
Dynamical systems proxies of atmospheric predictability and mid-latitude extremes
NASA Astrophysics Data System (ADS)
Messori, Gabriele; Faranda, Davide; Caballero, Rodrigo; Yiou, Pascal
2017-04-01
Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. Many extremes (e.g., storms, heatwaves, cold spells, heavy precipitation) are tied to specific patterns of midlatitude atmospheric circulation. The ability to identify these patterns and use them to enhance the predictability of the extremes is therefore a topic of crucial societal and economic value. We propose a novel predictability pathway for extreme events, by building upon recent advances in dynamical systems theory. We use two simple dynamical systems metrics - local dimension and persistence - to identify sets of similar large-scale atmospheric flow patterns which present a coherent temporal evolution. When these patterns correspond to weather extremes, they therefore afford a particularly good forward predictability. We specifically test this technique on European winter temperatures, whose variability largely depends on the atmospheric circulation in the North Atlantic region. We find that our dynamical systems approach provides predictability of large-scale temperature extremes up to one week in advance.
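A minimal sketch of the two metrics, written here in the usual recurrence-based extreme-value formulation, is given below. It is not the authors' code: the threshold quantile and the persistence proxy (the mean length of consecutive exceedance runs) are simplifying assumptions.

```python
import numpy as np

def dynamical_indicators(fields, q=0.98):
    """Local dimension and a crude persistence proxy for each daily map.
    `fields` has shape (n_days, n_gridpoints), e.g. sea-level pressure maps."""
    n = fields.shape[0]
    d = np.empty(n)
    persistence = np.empty(n)
    for i in range(n):
        dist = np.linalg.norm(fields - fields[i], axis=1)
        g = -np.log(dist[np.arange(n) != i])        # logarithmic return observable
        u = np.quantile(g, q)                        # high recurrence threshold
        d[i] = 1.0 / (g[g > u] - u).mean()           # local dimension
        above = (g > u).astype(int)                  # persistence proxy:
        edges = np.flatnonzero(np.diff(np.r_[0, above, 0]))
        persistence[i] = (edges[1::2] - edges[0::2]).mean()  # mean exceedance run length
    return d, persistence
```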
Advances in DNA sequencing technologies for high resolution HLA typing.
Cereb, Nezih; Kim, Hwa Ran; Ryu, Jaejun; Yang, Soo Young
2015-12-01
This communication describes our experience in large-scale G group-level high resolution HLA typing using three different DNA sequencing platforms - ABI 3730 xl, Illumina MiSeq and PacBio RS II. Recent advances in DNA sequencing technologies, so-called next generation sequencing (NGS), have brought breakthroughs in deciphering the genetic information in all living species at a large scale and at an affordable level. The NGS DNA indexing system allows sequencing multiple genes for a large number of individuals in a single run. Our laboratory has adopted and used these technologies for HLA molecular testing services. We found that each sequencing technology has its own strengths and weaknesses, and their sequencing performances complement each other. HLA genes are highly complex and genotyping them is quite challenging. Using these three sequencing platforms, we were able to meet all requirements for G group-level high resolution and high volume HLA typing. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
A Functional Model for Management of Large Scale Assessments.
ERIC Educational Resources Information Center
Banta, Trudy W.; And Others
This functional model for managing large-scale program evaluations was developed and validated in connection with the assessment of Tennessee's Nutrition Education and Training Program. Management of such a large-scale assessment requires the development of a structure for the organization; distribution and recovery of large quantities of…
Advanced Grid-Friendly Controls Demonstration Project for Utility-Scale PV Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gevorgian, Vahan; O'Neill, Barbara
A typical photovoltaic (PV) power plant consists of multiple power electronic inverters and can contribute to grid stability and reliability through sophisticated 'grid-friendly' controls. The availability and dissemination of actual test data showing the viability of advanced utility-scale PV controls among all industry stakeholders can leverage PV's value from being simply an energy resource to providing additional ancillary services that range from variability smoothing and frequency regulation to power quality. Strategically partnering with a selected utility and/or PV power plant operator is a key condition for a successful demonstration project. The U.S. Department of Energy's (DOE's) Solar Energy Technologies Office selected the National Renewable Energy Laboratory (NREL) to be a principal investigator in a two-year project with goals to (1) identify a potential partner(s), (2) develop a detailed scope of work and test plan for a field project to demonstrate the grid-friendly capabilities of utility-scale PV power plants, (3) facilitate conducting actual demonstration tests, and (4) disseminate test results among industry stakeholders via a joint NREL/DOE publication and participation in relevant technical conferences. The project implementation took place in FY 2014 and FY 2015. In FY14, NREL established collaborations with AES and First Solar Electric, LLC, to conduct demonstration testing on their utility-scale PV power plants in Puerto Rico and Texas, respectively, and developed test plans for each partner. Both Puerto Rico Electric Power Authority and the Electric Reliability Council of Texas expressed interest in this project because of the importance of such advanced controls for the reliable operation of their power systems under high penetration levels of variable renewable generation. During FY15, testing was completed on both plants, and a large amount of test data was produced and analyzed that demonstrates the ability of PV power plants to provide various types of new grid-friendly controls.
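One of the ancillary services named above, frequency regulation from a deliberately curtailed PV plant, reduces to a simple droop characteristic. The sketch below is illustrative only; the 5% droop, the deadband, and the 10% headroom are placeholder defaults, not the settings used at the partner plants.

```python
def pv_droop_setpoint(f_hz, p_avail_mw, p_rated_mw,
                      f_nom_hz=60.0, droop=0.05, deadband_hz=0.036, headroom=0.10):
    """Active-power setpoint of a curtailed PV plant responding to grid frequency."""
    p_sched = (1.0 - headroom) * p_avail_mw          # hold back headroom for up-regulation
    df = f_hz - f_nom_hz
    if abs(df) <= deadband_hz:                       # no response inside the deadband
        return p_sched
    dp = -(df / f_nom_hz) / droop * p_rated_mw       # droop characteristic
    return min(max(p_sched + dp, 0.0), p_avail_mw)   # cannot exceed available irradiance

# Under-frequency event: the plant releases part of its reserved headroom.
print(pv_droop_setpoint(f_hz=59.90, p_avail_mw=20.0, p_rated_mw=25.0))
```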
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1989-01-01
Deep Space Network advanced systems, very large scale integration architecture for decoders, radar interface and control units, microwave time delays, microwave antenna holography, and a radio frequency interference survey are among the topics discussed.
National Laboratory for Advanced Scientific Visualization at UNAM - Mexico
NASA Astrophysics Data System (ADS)
Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo
2016-04-01
In 2015, the National Autonomous University of Mexico (UNAM) joined the family of Universities and Research Centers where advanced visualization and computing play a key role in promoting and advancing missions in research, education, community outreach, as well as business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services that spans a variety of areas related to scientific visualization, among which are: neuroanatomy, embryonic development, genome related studies, geosciences, geography, physics and mathematics related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the 3D fully immersive display system Cave, the high resolution parallel visualization system Powerwall, and the high resolution spherical display Earth Simulator. The entire visualization infrastructure is interconnected to a high-performance-computing cluster (HPCC) called ADA in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large, 3.6 m wide room with projected images on the front, left and right walls, as well as the floor. Specialized crystal eyes LCD-shutter glasses provide a strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6x4) high-resolution ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for global-scale data visualization of geophysical, meteorological, climate and ecology data. The HPCC-ADA is a 1000+ computing-core system, which offers parallel computing resources to applications that require large quantities of memory as well as large and fast parallel storage systems. The entire system temperature is controlled by an energy and space efficient cooling solution, based on large rear-door liquid-cooled heat exchangers. This state-of-the-art infrastructure will boost research activities in the region, offer a powerful scientific tool for teaching at undergraduate and graduate levels, and enhance association and cooperation with business-oriented organizations.
ERIC Educational Resources Information Center
Craig, Heather
2007-01-01
Fishing industries around the world are currently undergoing a process of industrialization and commercialization. A similar story is unfolding in many fishing communities: large-scale industrial fishers who possess enormous capital and advanced technologies are threatening the lives of small-scale fisherfolk. The fishing industry in Lake Victoria…
Managing landscapes at multiple scales for sustainability of ecosystem functions (Preface)
R.A. Birdsey; R. Lucas; Y. Pan; G. Sun; E.J. Gustafson; A.H. Perera
2010-01-01
The science of landscape ecology is a rapidly evolving academic field with an emphasis on studying large-scale spatial heterogeneity created by natural influences and human activities. These advances have important implications for managing and conserving natural resources. At a September 2008 IUFRO conference in Chengdu, Sichuan, P.R. China, we highlighted both the...
Kelley, Shana O.; Mirkin, Chad A.; Walt, David R.; Ismagilov, Rustem F.; Toner, Mehmet; Sargent, Edward H.
2015-01-01
Rapid progress in identifying disease biomarkers has increased the importance of creating high-performance detection technologies. Over the last decade, the design of many detection platforms has focused on either the nano or micro length scale. Here, we review recent strategies that combine nano- and microscale materials and devices to produce large improvements in detection sensitivity, speed and accuracy, allowing previously undetectable biomarkers to be identified in clinical samples. Microsensors that incorporate nanoscale features can now rapidly detect disease-related nucleic acids expressed in patient samples. New microdevices that separate large clinical samples into nanocompartments allow precise quantitation of analytes, and microfluidic systems that utilize nanoscale binding events can detect rare cancer cells in the bloodstream more accurately than before. These advances will lead to faster and more reliable clinical diagnostic devices. PMID:25466541
Study and design of cryogenic propellant acquisition systems. Volume 1: Design studies
NASA Technical Reports Server (NTRS)
Burge, G. W.; Blackmon, J. B.
1973-01-01
An in-depth study and selection of practical propellant surface tension acquisition system designs for two specific future cryogenic space vehicles (an advanced cryogenic space shuttle auxiliary propulsion system and an advanced space propulsion module) is reported. A supporting laboratory-scale experimental program was also conducted to provide design information critical to concept finalization and selection. Designs using localized, pressure-isolated surface tension screen devices were selected for each application and preliminary designs were generated. Based on these designs, large-scale acquisition prototype hardware was designed and fabricated to be compatible with available NASA-MSFC feed system hardware.
Overview of NASA Iodine Hall Thruster Propulsion System Development
NASA Technical Reports Server (NTRS)
Smith, Timothy D.; Kamhawi, Hani; Hickman, Tyler; Haag, Thomas; Dankanich, John; Polzin, Kurt; Byrne, Lawrence; Szabo, James
2016-01-01
NASA is continuing to invest in advancing Hall thruster technologies for implementation in commercial and government missions. The most recent focus has been on increasing the power level for large-scale exploration applications. However, there has also been a similar push to examine applications of electric propulsion for small spacecraft in the range of 300 kg or less. There have been several recent iodine Hall propulsion system development activities performed by the team of the NASA Glenn Research Center, the NASA Marshall Space Flight Center, and Busek Co. Inc. In particular, the work focused on qualification of the Busek 200-W BHT-200-I and development of the 600-W BHT-600-I systems. This paper discusses the current status of iodine Hall propulsion system developments along with supporting technology development efforts.
Savina, Irina N.; Ingavle, Ganesh C.; Cundy, Andrew B.; Mikhalovsky, Sergey V.
2016-01-01
The development of bulk, three-dimensional (3D), macroporous polymers with high permeability, large surface area and large volume is highly desirable for a range of applications in the biomedical, biotechnological and environmental areas. The experimental techniques currently used are limited to the production of small size and volume cryogel material. In this work we propose a novel, versatile, simple and reproducible method for the synthesis of large volume porous polymer hydrogels by cryogelation. By controlling the freezing process of the reagent/polymer solution, large-scale 3D macroporous gels with wide interconnected pores (up to 200 μm in diameter) and large accessible surface area have been synthesized. For the first time, macroporous gels (of up to 400 ml bulk volume) with controlled porous structure were manufactured, with potential for scale up to much larger gel dimensions. This method can be used for production of novel 3D multi-component macroporous composite materials with a uniform distribution of embedded particles. The proposed method provides better control of freezing conditions and thus overcomes existing drawbacks limiting production of large gel-based devices and matrices. The proposed method could serve as a new design concept for functional 3D macroporous gels and composites preparation for biomedical, biotechnological and environmental applications. PMID:26883390
Current Challenges in Plant Eco-Metabolomics
Peters, Kristian; Worrich, Anja; Alka, Oliver; Balcke, Gerd; Bruelheide, Helge; Dietz, Sophie; Dührkop, Kai; Heinig, Uwe; Kücklich, Marlen; Müller, Caroline; Poeschl, Yvonne; Pohnert, Georg; Ruttkies, Christoph; Schweiger, Rabea; Shahaf, Nir; Tortosa, Maria; Ueberschaar, Nico; Velasco, Pablo; Weiß, Brigitte M.; van Dam, Nicole M.
2018-01-01
The relatively new research discipline of Eco-Metabolomics is the application of metabolomics techniques to ecology with the aim to characterise biochemical interactions of organisms across different spatial and temporal scales. Metabolomics is an untargeted biochemical approach to measure many thousands of metabolites in different species, including plants and animals. Changes in metabolite concentrations can provide mechanistic evidence for biochemical processes that are relevant at ecological scales. These include physiological, phenotypic and morphological responses of plants and communities to environmental changes and also interactions with other organisms. Traditionally, research in biochemistry and ecology comes from two different directions and is performed at distinct spatiotemporal scales. Biochemical studies most often focus on intrinsic processes in individuals at physiological and cellular scales. Generally, they take a bottom-up approach scaling up cellular processes from spatiotemporally fine to coarser scales. Ecological studies usually focus on extrinsic processes acting upon organisms at population and community scales and typically study top-down and bottom-up processes in combination. Eco-Metabolomics is a transdisciplinary research discipline that links biochemistry and ecology and connects the distinct spatiotemporal scales. In this review, we focus on approaches to study chemical and biochemical interactions of plants at various ecological levels, mainly plant–organismal interactions, and discuss related examples from other domains. We present recent developments and highlight advancements in Eco-Metabolomics over the last decade from various angles. We further address the five key challenges: (1) complex experimental designs and large variation of metabolite profiles; (2) feature extraction; (3) metabolite identification; (4) statistical analyses; and (5) bioinformatics software tools and workflows. The presented solutions to these challenges will advance connecting the distinct spatiotemporal scales and bridging biochemistry and ecology. PMID:29734799
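To make challenge (2) concrete, the sketch below pulls chromatographic peaks out of a single toy extracted-ion trace with a generic peak finder. Real eco-metabolomics pipelines do this across thousands of m/z traces with purpose-built tools (e.g. XCMS- or OpenMS-style software), so this is only a schematic stand-in with invented thresholds.

```python
import numpy as np
from scipy.signal import find_peaks

def extract_features(intensity, rt_min, min_height=1e4):
    """One 'feature' per detected peak: (retention time, apex intensity)."""
    peaks, _ = find_peaks(intensity, height=min_height, prominence=min_height / 2)
    return [(rt_min[p], intensity[p]) for p in peaks]

# Toy trace: two Gaussian peaks on a weak noisy baseline.
rt = np.linspace(0.0, 10.0, 2000)
trace = (5e4 * np.exp(-((rt - 3.1) / 0.05) ** 2)
         + 2e4 * np.exp(-((rt - 7.4) / 0.08) ** 2)
         + 1e3 * np.random.default_rng(1).random(rt.size))
print(extract_features(trace, rt))
```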
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-08-21
Recent advancements in technology scaling have shown a trend towards greater integration, with large-scale chips containing thousands of processors connected to memories and other I/O devices using non-trivial network topologies. Software simulation proves insufficient to study the tradeoffs in such complex systems due to slow execution time, whereas hardware RTL development is too time-consuming. We present OpenSoC Fabric, an on-chip network generation infrastructure which aims to provide a parameterizable and powerful on-chip network generator for evaluating future high performance computing architectures based on SoC technology. OpenSoC Fabric leverages a new hardware DSL, Chisel, which contains powerful abstractions provided by its base language, Scala, and generates both software (C++) and hardware (Verilog) models from a single code base. The OpenSoC Fabric infrastructure is modeled after existing state-of-the-art simulators, offers large and powerful collections of configuration options, and follows object-oriented design and functional programming to make functionality extension as easy as possible.
Vibration and flutter characteristics of the SR7L large-scale propfan
NASA Technical Reports Server (NTRS)
August, Richard; Kaza, Krishna Rao V.
1988-01-01
An investigation of the vibration characteristics and aeroelastic stability of the SR7L Large-Scale Advanced Propfan was performed using a finite element blade model and an improved aeroelasticity code. Analyses were conducted for different blade pitch angles, blade support conditions, number of blades, rotational speeds, and freestream Mach numbers. A finite element model of the blade was used to determine the blade's vibration behavior and sensitivity to support stiffness. The calculated frequencies and mode shape obtained with this model agreed well with the published experimental data. A computer code recently developed at NASA Lewis Research Center and based on three-dimensional, unsteady, lifting surface aerodynamic theory was used for the aeroelastic analysis to examine the blade's stability at a cruise condition of Mach 0.8 at 1700 rpm. The results showed that the blade is stable for that operating point. However, a flutter condition was predicted if the cruise Mach number was increased to 0.9.
Wearable Large-Scale Perovskite Solar-Power Source via Nanocellular Scaffold.
Hu, Xiaotian; Huang, Zengqi; Zhou, Xue; Li, Pengwei; Wang, Yang; Huang, Zhandong; Su, Meng; Ren, Wanjie; Li, Fengyu; Li, Mingzhu; Chen, Yiwang; Song, Yanlin
2017-11-01
Dramatic advances in perovskite solar cells (PSCs) and the blossoming of wearable electronics have triggered tremendous demands for flexible solar-power sources. However, the fracturing of functional crystalline films and transmittance wastage from flexible substrates are critical challenges in achieving high-performance PSCs with flexural endurance. In this work, a nanocellular scaffold is introduced to architect a mechanics buffer layer and optics resonant cavity. The nanocellular scaffold releases mechanical stresses during flexural experiences and significantly improves the crystalline quality of the perovskite films. The nanocellular optics resonant cavity optimizes light harvesting and charge transportation of devices. More importantly, these flexible PSCs, which demonstrate excellent performance and mechanical stability, are practically fabricated in modules as a wearable solar-power source. A power conversion efficiency of 12.32% for a flexible large-scale device (polyethylene terephthalate substrate, indium tin oxide-free, 1.01 cm²) is achieved. This ingenious flexible structure will enable a new approach for development of wearable electronics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Material design and engineering of next-generation flow-battery technologies
NASA Astrophysics Data System (ADS)
Park, Minjoon; Ryu, Jaechan; Wang, Wei; Cho, Jaephil
2017-01-01
Spatial separation of the electrolyte and electrode is the main characteristic of flow-battery technologies, which liberates them from the constraints of overall energy content and the energy/power ratio. The concept of a flowing electrolyte not only presents a cost-effective approach for large-scale energy storage, but has also recently been used to develop a wide range of new hybrid energy storage and conversion systems. The advent of flow-based lithium-ion, organic redox-active materials, metal-air cells and photoelectrochemical batteries promises new opportunities for advanced electrical energy-storage technologies. In this Review, we present a critical overview of recent progress in conventional aqueous redox-flow batteries and next-generation flow batteries, highlighting the latest innovative alternative materials. We outline their technical feasibility for use in long-term and large-scale electrical energy-storage devices, as well as the limitations that need to be overcome, providing our view of promising future research directions in the field of redox-flow batteries.
Heidari, Zahra; Roe, Daniel R; Galindo-Murillo, Rodrigo; Ghasemi, Jahan B; Cheatham, Thomas E
2016-07-25
Long time scale molecular dynamics (MD) simulations of biological systems are becoming increasingly commonplace due to the availability of both large-scale computational resources and significant advances in the underlying simulation methodologies. Therefore, it is useful to investigate and develop data mining and analysis techniques to quickly and efficiently extract the biologically relevant information from the incredible amount of generated data. Wavelet analysis (WA) is a technique that can quickly reveal significant motions during an MD simulation. Here, the application of WA on well-converged long time scale (tens of μs) simulations of a DNA helix is described. We show how WA combined with a simple clustering method can be used to identify both the physical and temporal locations of events with significant motion in MD trajectories. We also show that WA can not only distinguish and quantify the locations and time scales of significant motions, but also that, by changing the maximum time scale of the WA, a more complete characterization of these motions can be obtained. This allows motions of different time scales to be identified or ignored as desired.
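A minimal sketch of the idea, assuming a single per-frame observable (for example, one base-pair or groove-width distance) rather than the authors' full trajectory analysis, is shown below; the wavelet choice, threshold quantile, and sampling interval are placeholder assumptions.

```python
import numpy as np
import pywt

def wavelet_events(series, dt_ns=0.01, max_scale=256, q=0.99):
    """Flag the times and approximate periods of strong motions in one MD
    observable using a continuous wavelet transform."""
    scales = np.arange(1, max_scale)
    coefs, freqs = pywt.cwt(series - series.mean(), scales, "morl",
                            sampling_period=dt_ns)
    power = np.abs(coefs) ** 2
    threshold = np.quantile(power, q)               # keep only the strongest responses
    scale_idx, time_idx = np.nonzero(power > threshold)
    # one event per flagged coefficient: (time in ns, approximate period in ns)
    return np.column_stack([time_idx * dt_ns, 1.0 / freqs[scale_idx]])
```

Clustering the returned (time, period) pairs, for example with a standard k-means routine, then groups the flagged coefficients into the discrete events the abstract refers to.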
Remote sensing of vegetation structure using computer vision
NASA Astrophysics Data System (ADS)
Dandois, Jonathan P.
High-spatial resolution measurements of vegetation structure are needed for improving understanding of ecosystem carbon, water and nutrient dynamics, the response of ecosystems to a changing climate, and for biodiversity mapping and conservation, among many research areas. Our ability to make such measurements has been greatly enhanced by continuing developments in remote sensing technology, which allow researchers to measure numerous forest traits at varying spatial and temporal scales and over large spatial extents with minimal to no field work, which is costly for large spatial areas or logistically difficult in some locations. Despite these advances, there remain several research challenges related to the methods by which three-dimensional (3D) and spectral datasets are joined (remote sensing fusion) and to the availability and portability of systems for frequent data collection at small-scale sampling locations. Recent advances in the areas of computer vision structure from motion (SFM) and consumer unmanned aerial systems (UAS) offer the potential to address these challenges by enabling repeatable measurements of vegetation structural and spectral traits at the scale of individual trees. However, the potential advances offered by computer vision remote sensing also present unique challenges and questions that need to be addressed before this approach can be used to improve understanding of forest ecosystems. For computer vision remote sensing to be a valuable tool for studying forests, bounding information about the characteristics of the data produced by the system will help researchers understand and interpret results in the context of the forest being studied and of other remote sensing techniques. This research advances understanding of how forest canopy and tree 3D structure and color are accurately measured by a relatively low-cost and portable computer vision personal remote sensing system: 'Ecosynth'. Recommendations are made for optimal conditions under which forest structure measurements should be obtained with UAS-SFM remote sensing. Ultimately, remote sensing of vegetation by computer vision offers the potential to provide an 'ecologist's eye view', capturing not only canopy 3D and spectral properties, but also seeing the trees in the forest and the leaves on the trees.
NASA Astrophysics Data System (ADS)
Lin, Y.; O'Malley, D.; Vesselinov, V. V.
2015-12-01
Inverse modeling seeks model parameters given a set of observed state variables. However, for many practical problems, because the observed data sets are often large and the model parameters numerous, conventional methods for solving the inverse problem can be computationally expensive. We have developed a new, computationally efficient Levenberg-Marquardt method for solving large-scale inverse modeling problems. Levenberg-Marquardt methods require the solution of a dense linear system of equations which can be prohibitively expensive to compute for large-scale inverse problems. Our novel method projects the original large-scale linear problem down to a Krylov subspace, such that the dimensionality of the measurements can be significantly reduced. Furthermore, instead of solving the linear system for every Levenberg-Marquardt damping parameter, we store the Krylov subspace computed when solving for the first damping parameter and recycle it for all the following damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved by using these computational techniques. We apply this new inverse modeling method to invert for a random transmissivity field. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) at each computational node in the model domain. The inversion is also aided by the use of regularization techniques. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programming language that allows for efficient memory management and utilization of high-performance computational resources. By comparing with a Levenberg-Marquardt method using standard linear inversion techniques, our Levenberg-Marquardt method yields a speed-up ratio of 15 in a multi-core computational environment and a speed-up ratio of 45 in a single-core computational environment. Therefore, our new inverse modeling method is a powerful tool for large-scale applications.
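The central trick described above, building one Krylov subspace and recycling it for every damping parameter, can be sketched as follows. This is a simplified, dense-matrix illustration in Python rather than the authors' Julia/MADS implementation: a Golub-Kahan bidiagonalization of the Jacobian is computed once, after which each damping parameter only requires a small projected least-squares solve.

```python
import numpy as np

def golub_kahan(J, r, k):
    """k steps of Golub-Kahan bidiagonalization of the Jacobian J, started
    from the residual r; the factors are reused for every damping parameter."""
    m, n = J.shape
    U = np.zeros((m, k + 1)); V = np.zeros((n, k))
    alphas = np.zeros(k); betas = np.zeros(k + 1)
    betas[0] = np.linalg.norm(r)
    U[:, 0] = r / betas[0]
    for i in range(k):
        v = J.T @ U[:, i] - (betas[i] * V[:, i - 1] if i > 0 else 0.0)
        alphas[i] = np.linalg.norm(v); V[:, i] = v / alphas[i]
        u = J @ V[:, i] - alphas[i] * U[:, i]
        betas[i + 1] = np.linalg.norm(u); U[:, i + 1] = u / betas[i + 1]
    B = np.zeros((k + 1, k))
    B[np.arange(k), np.arange(k)] = alphas              # diagonal
    B[np.arange(1, k + 1), np.arange(k)] = betas[1:]    # subdiagonal
    return U, B, V, betas[0]

def lm_steps(J, r, dampings, k=20):
    """Candidate Levenberg-Marquardt steps for several damping parameters,
    all computed from the same recycled Krylov subspace."""
    U, B, V, beta0 = golub_kahan(J, r, k)
    e1 = np.zeros(k + 1); e1[0] = beta0
    steps = []
    for lam in dampings:
        A = np.vstack([B, np.sqrt(lam) * np.eye(k)])    # small damped problem
        b = np.concatenate([-e1, np.zeros(k)])
        y, *_ = np.linalg.lstsq(A, b, rcond=None)
        steps.append(V @ y)                              # lift back to parameter space
    return steps
```

Each additional damping parameter then costs only a small k-dimensional solve instead of another factorization involving the full Jacobian, which is consistent with the speed-ups reported above.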
Achieving large linear elasticity and high strength in bulk nanocomposite via synergistic effect
Hao, Shijie; Cui, Lishan; Guo, Fangmin; ...
2015-03-09
Elastic strain in bulk metallic materials is usually limited to only a fraction of 1%. Developing bulk metallic materials showing large linear elasticity and high strength has proven to be difficult. Here, based on the synergistic effect between nanowires and orientated martensite NiTi shape memory alloy, we developed an in-situ Nb nanowires-orientated martensitic NiTi matrix composite showing an ultra-large linear elastic strain of 4% and an ultrahigh yield strength of 1.8 GPa. This material also has a high mechanical energy storage efficiency of 96% and a high energy storage density of 36 J/cm³ that is almost one order of magnitude larger than that of spring steel. It is demonstrated that the synergistic effect allows the exceptional mechanical properties of nanowires to be harvested at macro scale and the mechanical properties of the matrix to be greatly improved, resulting in these superior properties. This research provides new avenues for developing advanced composites with superior properties by using the effective synergistic effect between components.
Toolboxes for cyanobacteria: Recent advances and future direction.
Sun, Tao; Li, Shubin; Song, Xinyu; Diao, Jinjin; Chen, Lei; Zhang, Weiwen
2018-05-03
Photosynthetic cyanobacteria are important primary producers and model organisms for studying photosynthesis and elements cycling on earth. Due to the ability to absorb sunlight and utilize carbon dioxide, cyanobacteria have also been proposed as renewable chassis for carbon-neutral "microbial cell factories". Recent progress in cyanobacterial synthetic biology has led to the successful production of more than two dozen fuels and fine chemicals directly from CO2, demonstrating their potential for scale-up application in the future. However, compared with popular heterotrophic chassis like Escherichia coli and Saccharomyces cerevisiae, where abundant genetic tools are available for manipulations at levels from single gene, pathway to whole genome, limited genetic tools are accessible to cyanobacteria. Consequently, this significant technical hurdle restricts both basic biological research and further development and application of these renewable systems. Though still lagging behind the heterotrophic chassis, the vital roles of genetic tools in tuning of gene expression, carbon flux re-direction as well as genome-wide manipulations have been increasingly recognized in cyanobacteria. In recent years, significant progress on developing and introducing new and efficient genetic tools has been made for cyanobacteria, including promoters, riboswitches, ribosome binding site engineering, clustered regularly interspaced short palindromic repeats/CRISPR-associated nuclease (CRISPR/Cas) systems, small RNA regulatory tools and genome-scale modeling strategies. In this review, we critically summarize recent advances on development and applications as well as technical limitations and future directions of the genetic tools in cyanobacteria. In addition, toolboxes feasible for use in large-scale cultivation are also briefly discussed. Copyright © 2018 Elsevier Inc. All rights reserved.
Towards Personal Exposures: How Technology Is Changing Air Pollution and Health Research.
Larkin, A; Hystad, P
2017-12-01
We present a review of emerging technologies and how these can transform personal air pollution exposure assessment and subsequent health research. Estimating personal air pollution exposures is currently split broadly into methods for modeling exposures for large populations versus measuring exposures for small populations. Air pollution sensors, smartphones, and air pollution models capitalizing on big/new data sources offer tremendous opportunity for unifying these approaches and improving long-term personal exposure prediction at scales needed for population-based research. A multi-disciplinary approach is needed to combine these technologies to not only estimate personal exposures for epidemiological research but also determine drivers of these exposures and new prevention opportunities. While available technologies can revolutionize air pollution exposure research, ethical, privacy, logistical, and data science challenges must be met before widespread implementations occur. Available technologies and related advances in data science can improve long-term personal air pollution exposure estimates at scales needed for population-based research. This will advance our ability to evaluate the impacts of air pollution on human health and develop effective prevention strategies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radchenko, I.; Tippabhotla, S. K.; Tamura, N.
2016-10-21
Synchrotron x-ray microdiffraction (μXRD) allows characterization of a crystalline material in small, localized volumes. Phase composition, crystal orientation and strain can all be probed on time scales of a few seconds. Crystalline changes over large areas can also be probed in a reasonable amount of time with submicron spatial resolution. However, despite all the listed capabilities, μXRD is mostly used to study pure materials, and its application in actual device characterization is rather limited. This article will explore the recent developments of the μXRD technique, illustrated with its advanced applications in microelectronic devices and solar photovoltaic systems. Application of μXRD in microelectronics will be illustrated by studying stress and microstructure evolution in Cu TSV (through silicon via) during and after annealing. Here, an approach allowing study of the microstructural evolution in the solder joints of crystalline Si solar cells due to thermal cycling will also be demonstrated.