Sample records for large-scale engineering analysis

  1. Symposium on Parallel Computational Methods for Large-scale Structural Analysis and Design, 2nd, Norfolk, VA, US

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O. (Editor); Housner, Jerrold M. (Editor)

    1993-01-01

    Computing speed is leaping forward by several orders of magnitude each decade. Engineers and scientists gathered at a NASA Langley symposium to discuss these exciting trends as they apply to parallel computational methods for large-scale structural analysis and design. Among the topics discussed were: large-scale static analysis; dynamic, transient, and thermal analysis; domain decomposition (substructuring); and nonlinear and numerical methods.

  2. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    NASA Astrophysics Data System (ADS)

    Ackerman, T. P.

    2017-12-01

Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB differs substantially from SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime (weeks, as opposed to years for SAI). We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  3. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition, whether functional, physical, or discipline-based, that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  4. Multi-source Geospatial Data Analysis with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
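
    A minimal sketch of the kind of multi-source analysis described above, written against the Earth Engine Python client and assuming an authenticated account; the dataset IDs, band names, and region of interest are illustrative choices, not taken from the record.

      # Multi-source Earth Engine sketch: combine a Landsat composite with a
      # MODIS vegetation index; Earth Engine reprojects/resamples on the fly.
      # Assumes the `earthengine-api` package is installed and authenticated.
      import ee

      ee.Initialize()

      # Landsat 8 surface reflectance, one year, reduced to a median composite.
      landsat = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
                 .filterDate('2020-01-01', '2021-01-01')
                 .median())

      # MODIS vegetation index product for the same period.
      modis_ndvi = (ee.ImageCollection('MODIS/006/MOD13A2')
                    .filterDate('2020-01-01', '2021-01-01')
                    .select('NDVI')
                    .mean())

      # Stack the two sources over an illustrative region of interest.
      roi = ee.Geometry.Rectangle([-122.6, 37.2, -121.8, 37.9])
      stack = landsat.addBands(modis_ndvi).clip(roi)

      # Computation is deferred; requesting a value triggers evaluation.
      print(stack.bandNames().getInfo())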

  5. Similitude design for the vibration problems of plates and shells: A review

    NASA Astrophysics Data System (ADS)

    Zhu, Yunpeng; Wang, You; Luo, Zhong; Han, Qingkai; Wang, Deyou

    2017-06-01

    Similitude design plays a vital role in the analysis of vibration and shock problems encountered in large engineering equipment. Similitude design, which encompasses both dimensional analysis and the governing-equation method, is founded on dynamic similitude theory. This study reviews the application of similitude design methods in engineering practice and summarizes the major achievements of dynamic similitude theory in structural vibration and shock problems in different fields, including marine structures, civil engineering structures, and large power equipment. This study also reviews the dynamic similitude design methods for thin-walled and composite-material plates and shells, including the most recent work published by the authors. Structural sensitivity analysis is used to evaluate the scaling factors needed to attain accurate distorted scaling laws. Finally, this study discusses the existing problems and the potential of dynamic similitude theory for the analysis of vibration and shock problems of structures.
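
    As a hedged illustration of the governing-equation route mentioned above, classical Kirchhoff plate theory yields a complete-similitude scaling law for natural frequencies; the derivation below is textbook material, not the distorted-similitude results of the reviewed papers.

      % Governing equation of a thin (Kirchhoff) plate and the resulting
      % scale-factor relation for natural frequencies (complete similitude,
      % equal Poisson's ratio in model and prototype assumed).
      \[
        D\,\nabla^{4} w + \rho h\,\frac{\partial^{2} w}{\partial t^{2}} = 0,
        \qquad D = \frac{E h^{3}}{12\,(1-\nu^{2})},
      \]
      \[
        \omega \propto \frac{h}{L^{2}}\sqrt{\frac{E}{\rho\,(1-\nu^{2})}}
        \quad\Longrightarrow\quad
        \lambda_{\omega} = \frac{\lambda_{h}}{\lambda_{L}^{2}}
        \sqrt{\frac{\lambda_{E}}{\lambda_{\rho}}},
        \qquad \lambda_{x} = \frac{x_{\mathrm{model}}}{x_{\mathrm{prototype}}}.
      \]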

  6. Similarity spectra analysis of high-performance jet aircraft noise.

    PubMed

    Neilsen, Tracianne B; Gee, Kent L; Wall, Alan T; James, Michael M

    2013-04-01

    Noise measured in the vicinity of an F-22A Raptor has been compared to similarity spectra found previously to represent mixing noise from large-scale and fine-scale turbulent structures in laboratory-scale jet plumes. Comparisons have been made for three engine conditions using ground-based sideline microphones, which covered a large angular aperture. Even though the nozzle geometry is complex and the jet is nonideally expanded, the similarity spectra do agree with large portions of the measured spectra. Toward the sideline, the fine-scale similarity spectrum is used, while the large-scale similarity spectrum provides a good fit to the area of maximum radiation. Combinations of the two similarity spectra are shown to match the data in between those regions. Surprisingly, a combination of the two is also shown to match the data at the farthest aft angle. However, at high frequencies the degree of congruity between the similarity and the measured spectra changes with engine condition and angle. At the higher engine conditions, there is a systematically shallower measured high-frequency slope, with the largest discrepancy occurring in the regions of maximum radiation.

  7. Analysis of BJ493 diesel engine lubrication system properties

    NASA Astrophysics Data System (ADS)

    Liu, F.

    2017-12-01

    The BJ493ZLQ4A diesel engine design is based on the earlier BJ493ZLQ3 model, whose exhaust emissions level is upgraded to the national GB5 standard through an improved design of the combustion and injection systems. Given the resulting changes to the diesel lubrication system, its properties are analyzed in this paper. Based on the structures, technical parameters, and indices of the lubrication system, a model of the BJ493ZLQ4A diesel engine lubrication system was constructed using the Flowmaster flow simulation software. The properties of the diesel engine lubrication system, such as the oil flow rate and pressure at different rotational speeds, were analyzed for schemes involving large- and small-scale oil filters. The calculated values of the main oil channel pressure are in good agreement with the experimental results, which verifies the feasibility of the proposed model. The calculation results show that the main oil channel pressure and maximum oil flow rate for the large-scale oil filter scheme satisfy the design requirements, while the small-scale scheme yields a main oil channel pressure that is too low. Therefore, application of small-scale oil filters is hazardous, and the large-scale scheme is recommended.

  8. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  9. Adjoint Sensitivity Analysis for Scale-Resolving Turbulent Flow Solvers

    NASA Astrophysics Data System (ADS)

    Blonigan, Patrick; Garai, Anirban; Diosady, Laslo; Murman, Scott

    2017-11-01

    Adjoint-based sensitivity analysis methods are powerful design tools for engineers who use computational fluid dynamics. In recent years, these engineers have started to use scale-resolving simulations like large-eddy simulations (LES) and direct numerical simulations (DNS), which resolve more scales in complex flows with unsteady separation and jets than the widely-used Reynolds-averaged Navier-Stokes (RANS) methods. However, the conventional adjoint method computes large, unusable sensitivities for scale-resolving simulations, which unlike RANS simulations exhibit the chaotic dynamics inherent in turbulent flows. Sensitivity analysis based on least-squares shadowing (LSS) avoids the issues encountered by conventional adjoint methods, but has a high computational cost even for relatively small simulations. The following talk discusses a more computationally efficient formulation of LSS, "non-intrusive" LSS, and its application to turbulent flows simulated with a discontinuous-Galerkin spectral-element-method LES/DNS solver. Results are presented for the minimal flow unit, a turbulent channel flow with a limited streamwise and spanwise domain.
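
    For readers unfamiliar with LSS, a hedged sketch of its standard continuous-time formulation follows (as published in the LSS literature; the non-intrusive discretization discussed in the talk may differ). For a chaotic system du/dt = f(u, s) with long-time-averaged objective J-bar:

      % Least-squares shadowing (LSS): find the bounded tangent perturbation v
      % and time-dilation term eta that track a nearby shadowing trajectory.
      \[
        \min_{v,\,\eta}\;\frac{1}{2T}\int_{0}^{T}
          \left(\lVert v\rVert^{2} + \alpha^{2}\eta^{2}\right)dt
        \quad\text{s.t.}\quad
        \frac{dv}{dt} = \frac{\partial f}{\partial u}\,v
          + \frac{\partial f}{\partial s} + \eta\, f(u,s),
      \]
      \[
        \frac{d\overline{J}}{ds} \;\approx\; \frac{1}{T}\int_{0}^{T}
          \left(\frac{\partial J}{\partial u}\,v
          + \frac{\partial J}{\partial s}
          + \eta\,\bigl(J - \overline{J}\bigr)\right)dt .
      \]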

  10. Engineering science and mechanics; Proceedings of the International Symposium, Tainan, Republic of China, December 29-31, 1981. Parts 1 & 2

    NASA Astrophysics Data System (ADS)

    Hsia, H.-M.; Chou, Y.-L.; Longman, R. W.

    1983-07-01

    The topics considered are related to measurements and controls in physical systems, the control of large scale and distributed parameter systems, chemical engineering systems, aerospace science and technology, thermodynamics and fluid mechanics, and computer applications. Subjects in structural dynamics are discussed, taking into account finite element approximations in transient analysis, buckling finite element analysis of flat plates, dynamic analysis of viscoelastic structures, the transient analysis of large frame structures by simple models, large amplitude vibration of an initially stressed thick plate, nonlinear aeroelasticity, a sensitivity analysis of a combined beam-spring-mass structure, and the optimal design and aeroelastic investigation of segmented windmill rotor blades. Attention is also given to dynamics and control of mechanical and civil engineering systems, composites, and topics in materials. For individual items see A83-44002 to A83-44061

  11. Reverse engineering and analysis of large genome-scale gene networks

    PubMed Central

    Aluru, Maneesha; Zola, Jaroslaw; Nettleton, Dan; Aluru, Srinivas

    2013-01-01

    Reverse engineering the whole-genome networks of complex multicellular organisms remains a challenge. While simpler models easily scale to large numbers of genes and gene expression datasets, more accurate models are compute intensive, limiting their scale of applicability. To enable fast and accurate reconstruction of large networks, we developed Tool for Inferring Network of Genes (TINGe), a parallel mutual information (MI)-based program. The novel features of our approach include: (i) a B-spline-based formulation for linear-time computation of MI, (ii) a novel algorithm for direct permutation testing, and (iii) development of parallel algorithms to reduce run-time and facilitate construction of large networks. We assess the quality of our method by comparison with ARACNe (Algorithm for the Reconstruction of Accurate Cellular Networks) and GeneNet and demonstrate its unique capability by reverse engineering the whole-genome network of Arabidopsis thaliana from 3137 Affymetrix ATH1 GeneChips in just 9 min on a 1024-core cluster. We further report on the development of a new software tool, Gene Network Analyzer (GeNA), for extracting context-specific subnetworks from a given set of seed genes. Using TINGe and GeNA, we performed analysis of 241 Arabidopsis AraCyc 8.0 pathways, and the results are made available through the web. PMID:23042249
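
    A toy sketch of the pairwise-MI network idea described above; TINGe itself uses a B-spline MI estimator and parallel permutation tests, whereas this illustration substitutes a plain 2-D histogram estimator and synthetic data.

      # Toy illustration of mutual-information-based network inference.
      # Not TINGe's estimator: a simple histogram MI stands in for the idea.
      import numpy as np

      def mutual_information(x, y, bins=10):
          """Histogram estimate of MI (nats) between two expression profiles."""
          joint, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nonzero = pxy > 0
          return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))

      rng = np.random.default_rng(0)
      expr = rng.normal(size=(5, 200))          # 5 genes x 200 samples (synthetic)
      expr[1] = 0.8 * expr[0] + 0.2 * expr[1]   # make genes 0 and 1 co-expressed

      # Pairwise MI matrix; in a real pipeline an edge would be kept only if
      # its MI exceeds a permutation-based significance cutoff.
      n = expr.shape[0]
      mi = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              mi[i, j] = mi[j, i] = mutual_information(expr[i], expr[j])
      print(np.round(mi, 3))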

  12. Aerodynamic characteristics of a large-scale lift-engine fighter model with external swiveling lift-engines

    NASA Technical Reports Server (NTRS)

    Barrack, J. P.; Kirk, J. V.

    1972-01-01

    The aerodynamic characteristics of a six-engine (four lift, two lift-cruise) lift-engine model obtained in the Ames 40- by 80-foot wind tunnel are presented. The model was an approximate one-half scale representation of a lift-engine VTOL fighter aircraft with a variable-sweep wing. The four lift-engines were housed in the aft fuselage with the inlets located above the wing. Longitudinal and lateral-directional force and moment data are presented for a range of exhaust gas momentum ratios (thrust coefficients). Wind tunnel forward speed was varied from 0 to 140 knots corresponding to a maximum Reynolds number of 6.7 million. The data are presented without analysis.

  13. Combined heat and power supply using Carnot engines

    NASA Astrophysics Data System (ADS)

    Horlock, J. H.

    The Marshall Report on the thermodynamic and economic feasibility of introducing large-scale combined heat and electrical power generation (CHP) into the United Kingdom is summarized. Combinations of reversible power plants (Carnot engines) that meet a given demand for power and heat production are analyzed. The Marshall Report states that fairly large-scale CHP plants are an attractive energy-saving option for areas of high heat-load density. Analysis shows that for given requirements, the total heat supply and utilization factor are functions of heat output, reservoir supply temperature, temperature of heat rejected to the reservoir, and an intermediate temperature for district heating.
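
    A minimal sketch of the reversible-engine bookkeeping described above; the temperatures and heat input are illustrative placeholders, not the Marshall Report's figures.

      # Reversible (Carnot) CHP bookkeeping: a Carnot engine takes heat Q_in at
      # supply temperature T_s and rejects heat for district heating at an
      # intermediate temperature T_h. All values are illustrative.
      T_s = 800.0   # heat-supply (reservoir) temperature, K
      T_h = 350.0   # district-heating rejection temperature, K
      Q_in = 100.0  # heat supplied to the engine, MW

      W = Q_in * (1.0 - T_h / T_s)    # work output of the reversible engine
      Q_h = Q_in - W                  # heat delivered to the district-heating load

      # Energy utilization factor; equals 1 when all rejected heat is used.
      utilization = (W + Q_h) / Q_in
      print(f"W = {W:.1f} MW, Q_h = {Q_h:.1f} MW, utilization = {utilization:.2f}")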

  14. Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2013-12-01

    Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.

  15. GIGGLE: a search engine for large-scale integrated genome analysis.

    PubMed

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-02-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation.
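
    The sketch below only illustrates the query pattern behind interval-search engines of this kind, counting shared loci between a query set and each indexed file; it is not GIGGLE's index structure or scoring, and the file and interval names are made up. GIGGLE additionally ranks files by statistical enrichment rather than raw counts.

      # Count overlaps between a query interval set and each indexed file,
      # then rank files by shared loci. Illustrative only; not GIGGLE's algorithm.
      from collections import defaultdict

      def overlaps(a, b):
          """True if half-open genomic intervals (chrom, start, end) overlap."""
          return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

      database = {
          "enhancers.bed": [("chr1", 100, 200), ("chr1", 500, 650)],
          "tf_peaks.bed":  [("chr1", 180, 220), ("chr2", 10, 90)],
      }
      query = [("chr1", 150, 210), ("chr2", 50, 60)]

      counts = defaultdict(int)
      for name, intervals in database.items():
          for q in query:
              counts[name] += sum(overlaps(q, iv) for iv in intervals)

      for name, c in sorted(counts.items(), key=lambda kv: -kv[1]):
          print(name, c)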

  16. GIGGLE: a search engine for large-scale integrated genome analysis

    PubMed Central

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-01-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation. PMID:29309061

  17. Energy Efficient Engine acoustic supporting technology report

    NASA Technical Reports Server (NTRS)

    Lavin, S. P.; Ho, P. Y.

    1985-01-01

    The acoustic development of the Energy Efficient Engine combined testing and analysis using scale model rigs and an integrated Core/Low Spool demonstration engine. The scale model tests show that a cut-on blade/vane ratio fan with a large spacing (S/C = 2.3) is as quiet as a cut-off blade/vane ratio fan with a tighter spacing (S/C = 1.27). Scale model mixer tests show that separate flow nozzles are the noisiest and conic nozzles the quietest, with forced mixers in between. Based on projections of ICLS data, the Energy Efficient Engine (E3) has FAR 36 margins of 3.7 EPNdB at approach, 4.5 EPNdB at full power takeoff, and 7.2 EPNdB at sideline conditions.

  18. Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.

    PubMed

    Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T

    2017-01-01

    Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.
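
    A serial toy sketch of the inner update that iterative reconstruction engines like Trace parallelize; this SIRT-style iteration on a small dense system is only illustrative and does not reflect Trace's replicated reconstruction objects or its production algorithms.

      # Toy SIRT-style iterative reconstruction for A x = b on synthetic data.
      import numpy as np

      rng = np.random.default_rng(1)
      A = rng.random((60, 25))          # toy projection matrix (rays x voxels)
      x_true = rng.random(25)           # "object" to recover
      b = A @ x_true                    # simulated sinogram measurements

      x = np.zeros(25)
      row_sums = A.sum(axis=1)          # per-ray normalization
      col_sums = A.sum(axis=0)          # per-voxel normalization
      for it in range(200):
          residual = (b - A @ x) / row_sums
          x += (A.T @ residual) / col_sums   # the update Trace distributes
      print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))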

  19. Tony Jimenez | NREL

    Science.gov Websites

    Staff profile excerpt: work includes pre-feasibility analysis, wind data analysis, the small wind turbine certification process, and analysis of the potential economic impact of large-scale MHK deployment. Tony is an engineer officer in the Army Reserve and has deployed twice.

  20. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

    The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improved system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.

  1. Large-Scale 3D Printing: The Way Forward

    NASA Astrophysics Data System (ADS)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has evolved rapidly, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields, including structural engineering, materials science, mechatronics, software engineering, artificial intelligence, and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e., the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  2. Implementing Assessment Engineering in the Uniform Certified Public Accountant (CPA) Examination

    ERIC Educational Resources Information Center

    Burke, Matthew; Devore, Richard; Stopek, Josh

    2013-01-01

    This paper describes efforts to bring principled assessment design to a large-scale, high-stakes licensure examination by employing the frameworks of Assessment Engineering (AE), the Revised Bloom's Taxonomy (RBT), and Cognitive Task Analysis (CTA). The Uniform CPA Examination is practice-oriented and focuses on the skills of accounting. In…

  3. 78 FR 7464 - Large Scale Networking (LSN) ; Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination...://www.nitrd.gov/nitrdgroups/index.php?title=Joint_Engineering_Team_(JET)#title. SUMMARY: The JET...

  4. Gas-Centered Swirl Coaxial Liquid Injector Evaluations

    NASA Technical Reports Server (NTRS)

    Cohn, A. K.; Strakey, P. A.; Talley, D. G.

    2005-01-01

    Development of liquid rocket engines (LRE) is expensive: extensive testing at large scale is usually required, verifying engine lifetime requires a large number of tests, and limited resources are available for development. Sub-scale cold-flow and hot-fire testing is extremely cost effective; it may be a necessary (but not sufficient) condition for demonstrating long engine lifetime, and it reduces the overall cost and risk of large-scale testing. The goal is to determine what knowledge can be gained from sub-scale cold-flow and hot-fire evaluations of LRE injectors and to determine the relationships between cold-flow and hot-fire data.

  5. 77 FR 58415 - Large Scale Networking (LSN); Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_(JET). SUMMARY: The JET, established in 1997, provides for information sharing among Federal...

  6. 78 FR 70076 - Large Scale Networking (LSN)-Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_(JET)#title. SUMMARY: The JET, established in 1997, provides for information sharing among...

  7. Discovering Beaten Paths in Collaborative Ontology-Engineering Projects using Markov Chains

    PubMed Central

    Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A.; Noy, Natalya F.

    2014-01-01

    Biomedical taxonomies, thesauri and ontologies in the form of the International Classification of Diseases as a taxonomy or the National Cancer Institute Thesaurus as an OWL-based ontology, play a critical role in acquiring, representing and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization, contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in terms of size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific given ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development. From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. PMID:24953242
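
    A minimal sketch of the basic ingredient of the analysis described above: estimating a first-order Markov transition matrix from ordered edit-log actions. The sessions and property names below are made-up placeholders, not data from the studied projects.

      # Estimate P(next action | current action) from ordered edit-log sessions.
      from collections import Counter, defaultdict

      sessions = [
          ["create_class", "edit_label", "edit_definition", "edit_synonym"],
          ["edit_label", "edit_definition", "edit_definition", "edit_parent"],
          ["create_class", "edit_label", "edit_parent"],
      ]

      counts = defaultdict(Counter)
      for actions in sessions:
          for current, nxt in zip(actions, actions[1:]):
              counts[current][nxt] += 1

      # Row-normalize counts into transition probabilities.
      transition = {
          state: {nxt: c / sum(row.values()) for nxt, c in row.items()}
          for state, row in counts.items()
      }
      for state, row in transition.items():
          print(state, "->", row)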

  8. Discovering beaten paths in collaborative ontology-engineering projects using Markov chains.

    PubMed

    Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A; Noy, Natalya F

    2014-10-01

    Biomedical taxonomies, thesauri and ontologies in the form of the International Classification of Diseases as a taxonomy or the National Cancer Institute Thesaurus as an OWL-based ontology, play a critical role in acquiring, representing and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in terms of size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific given ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development. From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Space shuttle main engine anomaly data and inductive knowledge based systems: Automated corporate expertise

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1987-01-01

    Progress is reported on the development of SCOTTY, an expert knowledge-based system to automate the analysis procedure following test firings of the Space Shuttle Main Engine (SSME). The integration of a large-scale relational data base system, a computer graphics interface for experts and end-user engineers, potential extension of the system to flight engines, application of the system for training of newly-hired engineers, technology transfer to other engines, and the essential qualities of good software engineering practices for building expert knowledge-based systems are among the topics discussed.

  10. Data engineering systems: Computerized modeling and data bank capabilities for engineering analysis

    NASA Technical Reports Server (NTRS)

    Kopp, H.; Trettau, R.; Zolotar, B.

    1984-01-01

    The Data Engineering System (DES) is a computer-based system that organizes technical data and provides automated mechanisms for storage, retrieval, and engineering analysis. The DES combines the benefits of a structured data base system with automated links to large-scale analysis codes. While the DES provides the user with many of the capabilities of a computer-aided design (CAD) system, the systems are actually quite different in several respects. A typical CAD system emphasizes interactive graphics capabilities and organizes data in a manner that optimizes these graphics. On the other hand, the DES is a computer-aided engineering system intended for the engineer who must operationally understand an existing or planned design or who desires to carry out additional technical analysis based on a particular design. The DES emphasizes data retrieval in a form that not only provides the engineer access to search and display the data but also links the data automatically with the computer analysis codes.

  11. Workforce Development Analysis | Energy Analysis | NREL

    Science.gov Websites

    Website excerpt on the education, training, and experience that will enable continued large-scale deployment of wind and solar technologies, including survey results from firms on customer service, construction, and electrical projects; the needs for engineers and project managers; and standardized education and training at all levels, from primary school onward.

  12. Semiconductor nanocrystal quantum dot synthesis approaches towards large-scale industrial production for energy applications

    DOE PAGES

    Hu, Michael Z.; Zhu, Ting

    2015-12-04

    This study reviews the experimental synthesis and engineering developments focused on various green approaches and large-scale production routes for quantum dots, illustrating fundamental process engineering principles. In contrast to the small-scale hot-injection method, the discussion focuses on non-injection routes that could be scaled up with engineered stirred-tank reactors. In addition, applications that demand quantum dots as "commodity" chemicals are discussed, including solar cells and solid-state lighting.

  13. Repurposing Mass-produced Internal Combustion Engines Quantifying the Value and Use of Low-cost Internal Combustion Piston Engines for Modular Applications in Energy and Chemical Engineering Industries

    NASA Astrophysics Data System (ADS)

    L'Heureux, Zara E.

    This thesis proposes that internal combustion piston engines can help clear the way for a transformation in the energy, chemical, and refining industries that is akin to the transition computer technology experienced with the shift from large mainframes to small personal computers and large farms of individually small, modular processing units. This thesis provides a mathematical foundation, multi-dimensional optimizations, experimental results, an engine model, and a techno-economic assessment, all working towards quantifying the value of repurposing internal combustion piston engines for new applications in modular, small-scale technologies, particularly for energy and chemical engineering systems. Many chemical engineering and power generation industries have focused on increasing individual unit sizes and centralizing production. This "bigger is better" concept makes it difficult to evolve and incorporate change. Large systems are often designed with long lifetimes, incorporate innovation slowly, and necessitate high upfront investment costs. Breaking away from this cycle is essential for promoting change, especially change happening quickly in the energy and chemical engineering industries. The ability to evolve during a system's lifetime provides a competitive advantage in a field dominated by large and often very old equipment that cannot respond to technology change. This thesis specifically highlights the value of small, mass-manufactured internal combustion piston engines retrofitted to participate in non-automotive system designs. The applications are unconventional and stem first from the observation that, when normalized by power output, internal combustion engines are one hundred times less expensive than conventional, large power plants. This cost disparity motivated a look at scaling laws to determine if scaling across both individual unit size and number of units produced would predict the two-order-of-magnitude difference seen here. For the first time, this thesis provides a mathematical analysis of scaling with a combination of both changing individual unit size and varying the total number of units produced. Different paths to meet a particular cumulative capacity are analyzed and show that total costs are path dependent and vary as a function of the unit size and number of units produced. The path dependence identified is fairly weak, however, and for all practical applications, the underlying scaling laws seem unaffected. This analysis continues to support the interest in pursuing designs built around small, modular infrastructure. Building on the observation that internal combustion engines are an inexpensive power-producing unit, the first optimization in this thesis focuses on quantifying the value of engine capacity committing to deliver power in the day-ahead electricity and reserve markets, specifically based on pricing from the New York Independent System Operator (NYISO). An optimization was written in Python to determine, based on engine cost, fuel cost, engine wear, engine lifetime, and electricity prices, when and how much of an engine's power should be committed to a particular energy market. The optimization aimed to maximize profit for the engine and generator (engine genset) system acting as a price-taker. The result is an annual profit on the order of $30 per kilowatt. Most of the value in the engine genset lies in its commitments to the spinning reserve market, where power is often committed but not always called on to deliver.
This analysis highlights the benefits of modularity in energy generation and provides one example where the system is so inexpensive and short-lived, that the optimization views the engine replacement cost as a consumable operating expense rather than a capital cost. Having the opportunity to incorporate incremental technological improvements in a system's infrastructure throughout its lifetime allows introduction of new technology with higher efficiencies and better designs. An alternative to traditionally large infrastructure that locks in a design and today's state-of-the-art technology for the next 50 - 70 years, is a system designed to incorporate new technology in a modular fashion. The modular engine genset system used for power generation is one example of how this works in practice. The largest single component of this thesis is modeling, designing, retrofitting, and testing a reciprocating piston engine used as a compressor. Motivated again by the low cost of an internal combustion engine, this work looks at how an engine (which is, in its conventional form, essentially a reciprocating compressor) can be cost-effectively retrofitted to perform as a small-scale gas compressor. In the laboratory, an engine compressor was built by retrofitting a one-cylinder, 79 cc engine. Various retrofitting techniques were incorporated into the system design, and the engine compressor performance was quantified in each iteration. Because the retrofitted engine is now a power consumer rather than a power-producing unit, the engine compressor is driven in the laboratory with an electric motor. Experimentally, compressed air engine exhaust (starting at elevated inlet pressures) surpassed 650 psia (about 45 bar), which makes this system very attractive for many applications in chemical engineering and refining industries. A model of the engine compressor system was written in Python and incorporates experimentally-derived parameters to quantify gas leakage, engine friction, and flow (including backflow) through valves. The model as a whole was calibrated and verified with experimental data and is used to explore engine retrofits beyond what was tested in the laboratory. Along with the experimental and modeling work, a techno-economic assessment is included to compare the engine compressor system with state-of-the-art, commercially-available compressors. Included in the financial analysis is a case study where an engine compressor system is modeled to achieve specific compression needs. The result of the assessment is that, indeed, the low engine cost, even with the necessary retrofits, provides a cost advantage over incumbent compression technologies. Lastly, this thesis provides an algorithm and case study for another application of small-scale units in energy infrastructure, specifically in energy storage. This study focuses on quantifying the value of small-scale, onsite energy storage in shaving peak power demands. This case study focuses on university-level power demands. The analysis finds that, because peak power is so costly, even small amounts of energy storage, when dispatched optimally, can provide significant cost reductions. This provides another example of the value of small-scale implementations, particularly in energy infrastructure. While the study focuses on flywheels and batteries as the energy storage medium, engine gensets could also be used to deliver power and shave peak power demands. 
The overarching goal of this thesis is to introduce small-scale, modular infrastructure, with a particular focus on the opportunity to retrofit and repurpose inexpensive, mass-manufactured internal combustion engines in new and unconventional applications. The modeling and experimental work presented in this dissertation show very compelling results for engines incorporated into both energy generation infrastructure and chemical engineering industries via compression technologies. The low engine cost provides an opportunity to add retrofits whilst remaining cost competitive with the incumbent technology. This work supports the claim that modular infrastructure, built on the indivisible unit of an internal combustion engine, can revolutionize many industries by providing a low-cost mechanism for rapid change and promoting small-scale designs.
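
    A hedged illustration of the scaling analysis described above, combining an economy-of-scale exponent (unit size) with a learning-curve exponent (number of units produced); the reference cost and both exponents are placeholders, not the thesis's fitted values, and either path can come out cheaper depending on their values.

      # Compare two paths to the same cumulative capacity under a combined
      # size-scaling and learning-curve cost model (placeholder parameters).
      def unit_cost(size_kw, n_units, c_ref=50_000.0, size_ref=100.0,
                    scale_exp=0.7, learning_exp=0.15):
          """Cost ($) of a unit of `size_kw` when it is the n-th unit built."""
          economy_of_scale = (size_kw / size_ref) ** scale_exp   # size effect
          learning = n_units ** (-learning_exp)                  # volume effect
          return c_ref * economy_of_scale * learning

      # Two paths to a cumulative capacity of 10 MW:
      few_large = sum(unit_cost(1000.0, n) for n in range(1, 11))     # 10 x 1 MW
      many_small = sum(unit_cost(100.0, n) for n in range(1, 101))    # 100 x 100 kW
      print(f"10 x 1 MW   : ${few_large:,.0f}")
      print(f"100 x 100 kW: ${many_small:,.0f}")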

  14. Operation of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The ICASE research program is described in detail; it consists of four major categories: (1) efficient use of vector and parallel computers, with particular emphasis on the CDC STAR-100; (2) numerical analysis, with particular emphasis on the development and analysis of basic numerical algorithms; (3) analysis and planning of large-scale software systems; and (4) computational research in engineering and the natural sciences, with particular emphasis on fluid dynamics. The work in each of these areas is described in detail; other activities are discussed, and a prognosis of future activities is included.

  15. Measuring System Value in the Ares 1 Rocket Using an Uncertainty-Based Coupling Analysis Approach

    NASA Astrophysics Data System (ADS)

    Wenger, Christopher

    Coupling of physics in large-scale complex engineering systems must be correctly accounted for during the systems engineering process to ensure that no unanticipated behaviors or unintended consequences arise in the system during operation. Structural vibration of large segmented solid rocket motors, known as thrust oscillation, is a well-documented problem that can affect the health and safety of any crew onboard. Within the Ares 1 rocket, larger-than-anticipated vibrations that propagated from the engine chamber to the Orion crew module were recorded during late-stage flight. Upon investigation, engineers found the root cause to be feedback from the rocket's structure onto the fluid flow within the engine. The goal of this paper is to showcase a coupling strength analysis from the field of Multidisciplinary Design Optimization to identify the major interactions that caused the thrust oscillation event in the Ares 1. Once these interactions are identified, an uncertainty analysis of the coupled system, based on an uncertainty-based optimization technique, is used to quantify the likelihood that these strong or weak interactions occur.

  16. Energy and technology review: Engineering modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabayan, H.S.; Goudreau, G.L.; Ziolkowski, R.W.

    1986-10-01

    This report presents information concerning: Modeling Canonical Problems in Electromagnetic Coupling Through Apertures; Finite-Element Codes for Computing Electrostatic Fields; Finite-Element Modeling of Electromagnetic Phenomena; Modeling Microwave-Pulse Compression in a Resonant Cavity; Lagrangian Finite-Element Analysis of Penetration Mechanics; Crashworthiness Engineering; Computer Modeling of Metal-Forming Processes; Thermal-Mechanical Modeling of Tungsten Arc Welding; Modeling Air Breakdown Induced by Electromagnetic Fields; Iterative Techniques for Solving Boltzmann's Equations for p-Type Semiconductors; Semiconductor Modeling; and Improved Numerical-Solution Techniques in Large-Scale Stress Analysis.

  17. Aquatic Plant Control Research Program. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 5. Synthesis Report.

    DTIC Science & Technology

    1984-06-01

    Report A-78-2: Aquatic Plant Control Research Program, Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants; Report 5, Synthesis Report, by Andrew... U.S. Army Engineer Waterways Experiment Station, Vicksburg, MS; Corps of Engineers, Washington, DC 20314.

  18. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, Noel

    2013-04-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based data sets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geo-spatial data. All computation is performed lazily; nothing is computed until it's required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.

  19. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, N.

    2012-12-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based data sets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geo-spatial data. All computation is performed lazily; nothing is computed until it's required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.

  20. A Framework of Working Across Disciplines in Early Design and R&D of Large Complex Engineered Systems

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.

    2015-01-01

    This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.

  1. Solution of matrix equations using sparse techniques

    NASA Technical Reports Server (NTRS)

    Baddourah, Majdi

    1994-01-01

    The solution of large systems of matrix equations is key to a large number of scientific and engineering problems. This talk describes the sparse matrix solver developed at Langley, which can routinely solve in excess of 263,000 equations in 40 seconds on one Cray C-90 processor. It appears that for large-scale structural analysis applications, sparse matrix methods have a significant performance advantage over other methods.
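
    A small sketch of the standard sparse-direct workflow, shown here with SciPy on a banded "stiffness-like" system; it is unrelated to the Langley solver or the Cray C-90 timings quoted above and is only meant to illustrate why exploiting sparsity pays off.

      # Sparse-direct solve of a symmetric positive-definite tridiagonal system.
      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      n = 100_000
      main = 2.0 * np.ones(n)
      off = -1.0 * np.ones(n - 1)
      K = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")

      f = np.ones(n)                 # load vector
      u = spla.spsolve(K, f)         # sparse factorization and solve
      print("residual norm:", np.linalg.norm(K @ u - f))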

  2. LLMapReduce: Multi-Level Map-Reduce for High Performance Data Analysis

    DTIC Science & Technology

    2016-05-23

    LLMapReduce works with several schedulers such as SLURM, Grid Engine, and LSF. Keywords: LLMapReduce; map-reduce; performance; scheduler; Grid Engine; SLURM; LSF. From the introduction: large-scale computing is currently dominated by four ecosystems: supercomputing, database, enterprise, and big data [1]; high-performance math libraries (e.g., BLAS [7, 8], LAPACK [9], ScaLAPACK [10]) are designed to exploit special processing hardware.

  3. Tools for Large-Scale Mobile Malware Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bierma, Michael

    Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems) [1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem, and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.

  4. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used to satisfy the information needs of software maintainers. Especially in the case of maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools offer various kinds of capabilities for delivering the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++, or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  5. A PRACTICAL ONTOLOGY FOR THE LARGE-SCALE MODELING OF SCHOLARLY ARTIFACTS AND THEIR USAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RODRIGUEZ, MARKO A.; BOLLEN, JOHAN; VAN DE SOMPEL, HERBERT

    2007-01-30

    The large-scale analysis of scholarly artifact usage is constrained primarily by current practices in usage data archiving, privacy issues concerned with the dissemination of usage data, and the lack of a practical ontology for modeling the usage domain. As a remedy to the third constraint, this article presents a scholarly ontology that was engineered to represent those classes for which large-scale bibliographic and usage data exists, supports usage research, and whose instantiation is scalable to the order of 50 million articles along with their associated artifacts (e.g. authors and journals) and an accompanying 1 billion usage events. The real-world instantiation of the presented abstract ontology is a semantic network model of the scholarly community which lends the scholarly process to statistical analysis and computational support. The authors present the ontology, discuss its instantiation, and provide some example inference rules for calculating various scholarly artifact metrics.
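    The ontology itself is not reproduced in this record, but the kind of usage-derived metric it is meant to support can be illustrated with a toy example. The usage events, session structure, and co-usage metric below are hypothetical and are not the authors' inference rules:

      from collections import Counter
      from itertools import combinations

      # Hypothetical usage events: (user_session, article_id) pairs.
      events = [("s1", "A"), ("s1", "B"), ("s2", "A"), ("s2", "C"),
                ("s3", "A"), ("s3", "B"), ("s3", "C")]

      usage = Counter(article for _, article in events)   # per-article usage counts

      sessions = {}
      for session, article in events:
          sessions.setdefault(session, set()).add(article)

      co_usage = Counter()                                # articles used in the same session
      for articles in sessions.values():
          for pair in combinations(sorted(articles), 2):
              co_usage[pair] += 1

      print(usage.most_common())
      print(co_usage.most_common())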

  6. Globus | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.

  7. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  8. A Matter of Time: Faster Percolator Analysis via Efficient SVM Learning for Large-Scale Proteomics.

    PubMed

    Halloran, John T; Rocke, David M

    2018-05-04

    Percolator is an important tool for greatly improving the results of a database search and subsequent downstream analysis. Using support vector machines (SVMs), Percolator recalibrates peptide-spectrum matches based on the learned decision boundary between targets and decoys. To improve analysis time for large-scale data sets, we update Percolator's SVM learning engine through software and algorithmic optimizations rather than heuristic approaches that necessitate the careful study of their impact on learned parameters across different search settings and data sets. We show that by optimizing Percolator's original learning algorithm, l2-SVM-MFN, large-scale SVM learning requires only about a third of the original runtime. Furthermore, we show that by employing the widely used Trust Region Newton (TRON) algorithm instead of l2-SVM-MFN, large-scale Percolator SVM learning is reduced to only about a fifth of the original runtime. Importantly, these speedups only affect the speed at which Percolator converges to a global solution and do not alter recalibration performance. The upgraded versions of both l2-SVM-MFN and TRON are optimized within the Percolator codebase for multithreaded and single-thread use and are available under Apache license at bitbucket.org/jthalloran/percolator_upgrade.
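    Percolator's semi-supervised workflow is not reproduced here, but the core step of learning a linear decision boundary between target and decoy peptide-spectrum matches (PSMs) and rescoring them can be sketched with a generic linear SVM. In this sketch scikit-learn's LinearSVC stands in for the l2-SVM-MFN/TRON solvers discussed in the paper, and the two PSM features and their distributions are hypothetical:

      import numpy as np
      from sklearn.svm import LinearSVC

      rng = np.random.default_rng(0)

      # Hypothetical PSM features, e.g. (search-engine score, mass error).
      targets = rng.normal(loc=[1.0, 0.0], scale=0.5, size=(200, 2))
      decoys = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2))

      X = np.vstack([targets, decoys])
      y = np.hstack([np.ones(200), -np.ones(200)])   # +1 = target, -1 = decoy

      svm = LinearSVC(C=1.0, dual=False).fit(X, y)   # learn the target/decoy boundary
      rescored = svm.decision_function(X)            # recalibrated PSM scores
      print(rescored[:5].round(3))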

  9. Mach 5 to 7 RBCC Propulsion System Testing at NASA-LeRC HTF

    NASA Technical Reports Server (NTRS)

    Perkins, H. Douglas; Thomas, Scott R.; Pack, William D.

    1996-01-01

    A series of Mach 5 to 7 freejet tests of a Rocket Based Combined Cycle (RBCC) engine was conducted at the NASA Lewis Research Center (LeRC) Hypersonic Tunnel Facility (HTF). This paper describes the configuration and operation of the HTF and the RBCC engine during these tests. A number of facility support systems that were added or modified to enhance the HTF test capability for conducting this experiment are described. The unfueled aerodynamic performance of the RBCC engine flowpath is also presented and compared to sub-scale test results previously obtained in the NASA LeRC 1 x 1 Supersonic Wind Tunnel (SWT) and to Computational Fluid Dynamics (CFD) analysis results. This test program demonstrated a successful configuration of the HTF for facility starting and operation with a generic RBCC-type engine and an increased range of facility operating conditions. The ability of sub-scale testing and CFD analysis to predict flowpath performance was also shown. The HTF is a freejet, blowdown propulsion test facility that can simulate up to Mach 7 flight conditions with true air composition. Mach 5, 6, and 7 facility nozzles are available, each with an exit diameter of 42 in. This combination of clean air, large scale, and Mach 7 capability is unique to the HTF. This RBCC engine study is the first engine test program conducted at the HTF since 1974.

  10. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.

  11. The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering

    NASA Technical Reports Server (NTRS)

    Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen

    2006-01-01

    This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered to "identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains 5 sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-Based Robotics Manufacture and Servicing Models.

  12. How much does a tokamak reactor cost?

    NASA Astrophysics Data System (ADS)

    Freidberg, J.; Cerfon, A.; Ballinger, S.; Barber, J.; Dogra, A.; McCarthy, W.; Milanese, L.; Mouratidis, T.; Redman, W.; Sandberg, A.; Segal, D.; Simpson, R.; Sorensen, C.; Zhou, M.

    2017-10-01

    The cost of a fusion reactor is of critical importance to its ultimate acceptability as a commercial source of electricity. While there are general rules of thumb for scaling both overnight cost and levelized cost of electricity, the corresponding relations are not very accurate or universally agreed upon. We have carried out a series of scaling studies of tokamak reactor costs based on reasonably sophisticated plasma and engineering models. The analysis is largely analytic, requiring only a simple numerical code, thus allowing a very large number of designs. Importantly, the studies are aimed at plasma physicists rather than fusion engineers. The goals are to assess the pros and cons of steady-state burning plasma experiments and reactors. One specific set of results discusses the benefits of higher magnetic fields, now possible because of the recent development of high-Tc rare-earth (REBCO) superconductors; with this goal in mind, we calculate quantitative expressions, including both scaling and multiplicative constants, for cost and major radius as a function of central magnetic field.

  13. Application of multivariate analysis and mass transfer principles for refinement of a 3-L bioreactor scale-down model--when shake flasks mimic 15,000-L bioreactors better.

    PubMed

    Ahuja, Sanjeev; Jain, Shilpa; Ram, Kripa

    2015-01-01

    Characterization of manufacturing processes is key to understanding the effects of process parameters on process performance and product quality. These studies are generally conducted using small-scale model systems. Because of the importance of the results derived from these studies, the small-scale model should be predictive of large scale. Typically, small-scale bioreactors, which are considered superior to shake flasks in simulating large-scale bioreactors, are used as the scale-down models for characterizing mammalian cell culture processes. In this article, we describe a case study where a cell culture unit operation in bioreactors using one-sided pH control and their satellites (small-scale runs conducted using the same post-inoculation cultures and nutrient feeds) in 3-L bioreactors and shake flasks indicated that shake flasks mimicked the large-scale performance better than 3-L bioreactors. We detail here how multivariate analysis was used to make the pertinent assessment and to generate the hypothesis for refining the existing 3-L scale-down model. Relevant statistical techniques such as principal component analysis, partial least square, orthogonal partial least square, and discriminant analysis were used to identify the outliers and to determine the discriminatory variables responsible for performance differences at different scales. The resulting analysis, in combination with mass transfer principles, led to the hypothesis that observed similarities between 15,000-L and shake flask runs, and differences between 15,000-L and 3-L runs, were due to pCO2 and pH values. This hypothesis was confirmed by changing the aeration strategy at 3-L scale. By reducing the initial sparge rate in 3-L bioreactor, process performance and product quality data moved closer to that of large scale. © 2015 American Institute of Chemical Engineers.
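    A compressed sketch of the kind of multivariate comparison described above is given below: principal component analysis of process variables pooled across scales, to see which small-scale system clusters with the manufacturing-scale runs. The variables, values, and group sizes are hypothetical, and the actual study also used PLS, OPLS, and discriminant analysis:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)

      # Hypothetical runs: columns are (titer, viability, pCO2, pH).
      large_scale = rng.normal([5.0, 95.0, 60.0, 7.0], [0.2, 1.0, 5.0, 0.05], size=(6, 4))
      shake_flask = rng.normal([4.9, 94.0, 58.0, 7.0], [0.2, 1.0, 5.0, 0.05], size=(6, 4))
      bench_3L = rng.normal([4.5, 93.0, 35.0, 7.1], [0.2, 1.0, 5.0, 0.05], size=(6, 4))

      X = np.vstack([large_scale, shake_flask, bench_3L])
      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

      # Runs whose scores cluster with the large-scale group are "similar" in this projection.
      print(scores.round(2))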

  14. Effect of buoyancy on fuel containment in an open-cycle gas-core nuclear rocket engine.

    NASA Technical Reports Server (NTRS)

    Putre, H. A.

    1971-01-01

    An analysis aimed at determining the scaling laws for the buoyancy effect on fuel containment in an open-cycle gas-core nuclear rocket engine is conducted so that experimental conditions can be related to engine conditions. The fuel volume fraction in a short coaxial-flow cavity is calculated with a programmed numerical solution of the steady Navier-Stokes equations for isothermal, variable-density fluid mixing. A dimensionless parameter B, called the Buoyancy number, was found to correlate the fuel volume fraction for large accelerations and various density ratios. This parameter has the value B = 0 for zero acceleration and B = 350 for typical engine conditions.

  15. Non-negative Tensor Factorization for Robust Exploratory Big-Data Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexandrov, Boian; Vesselinov, Velimir Valentinov; Djidjev, Hristo Nikolov

    Currently, large multidimensional datasets are being accumulated in almost every field. Data are: (1) collected by distributed sensor networks in real time all over the globe, (2) produced by large-scale experimental measurements or engineering activities, (3) generated by high-performance simulations, and (4) gathered by electronic communications and social-network activities, etc. Simultaneous analysis of these ultra-large heterogeneous multidimensional datasets is often critical for scientific discoveries, decision-making, emergency response, and national and global security. The importance of such analyses mandates the development of the next generation of robust machine learning (ML) methods and tools for big-data exploratory analysis.
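    As a concrete, greatly simplified illustration of the non-negative factorization family referred to above, the following sketch factors a non-negative matrix with classic Lee-Seung multiplicative updates; real tensor factorizations generalize this to more than two modes and would normally use a dedicated library, and the data here are random placeholders:

      import numpy as np

      rng = np.random.default_rng(2)
      X = np.abs(rng.normal(size=(50, 30)))        # non-negative data matrix (placeholder)

      r = 4                                        # number of latent components
      W = np.abs(rng.normal(size=(50, r)))
      H = np.abs(rng.normal(size=(r, 30)))

      eps = 1e-9
      for _ in range(200):
          # Multiplicative updates for the Frobenius-norm objective keep W and H non-negative.
          H *= (W.T @ X) / (W.T @ W @ H + eps)
          W *= (X @ H.T) / (W @ H @ H.T + eps)

      print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))   # relative reconstruction error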

  16. Socio-Technical Perspective on Interdisciplinary Interactions During the Development of Complex Engineered Systems

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria R.; Daly, Shanna; Baker, Wayne; Papalambros, Panos; Seifert, Colleen

    2013-01-01

    This study investigates interdisciplinary interactions that take place during the research, development, and early conceptual design phases in the design of large-scale complex engineered systems (LaCES) such as aerospace vehicles. These interactions, which take place throughout a large engineering development organization, become the initial conditions of the systems engineering process that ultimately leads to the development of a viable system. This paper summarizes some of the challenges and opportunities regarding social and organizational issues that emerged from a qualitative study using ethnographic and survey data. The analysis reveals several socio-technical couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their benefits to the engineered system, as well as substantial challenges in interdisciplinary interactions. Noted benefits included enhanced knowledge and problem mitigation, and noted obstacles centered on organizational and human dynamics. Findings suggest that addressing the social challenges may be a critical need in enabling interdisciplinary interactions.

  17. Low-speed wind-tunnel investigation of a large scale advanced arrow-wing supersonic transport configuration with engines mounted above wing for upper-surface blowing

    NASA Technical Reports Server (NTRS)

    Shivers, J. P.; Mclemore, H. C.; Coe, P. L., Jr.

    1976-01-01

    Tests have been conducted in a full scale tunnel to determine the low speed aerodynamic characteristics of a large scale advanced arrow wing supersonic transport configuration with engines mounted above the wing for upper surface blowing. Tests were made over an angle of attack range of -10 deg to 32 deg, sideslip angles of + or - 5 deg, and a Reynolds number range of 3,530,000 to 7,330,000. Configuration variables included trailing edge flap deflection, engine jet nozzle angle, engine thrust coefficient, engine out operation, and asymmetrical trailing edge boundary layer control for providing roll trim. Downwash measurements at the tail were obtained for different thrust coefficients, tail heights, and at two fuselage stations.

  18. Formal and heuristic system decomposition methods in multidisciplinary synthesis. Ph.D. Thesis, 1991

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.

    1991-01-01

    The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work was to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. A general decomposition approach for optimization of large engineering systems is presented. The method is particularly applicable for multidisciplinary design problems which are characterized by closely coupled interactions among discipline analyses. The advantage of subsystem modularity allows for implementation of specialized methods for analysis and optimization, computational efficiency, and the ability to incorporate human intervention and decision making in the form of an expert systems capability. The resulting approach is not a method applicable to only a specific situation, but rather, a methodology which can be used for a large class of engineering design problems in which the system is non-hierarchic in nature.

  19. Studies on combined model based on functional objectives of large scale complex engineering

    NASA Astrophysics Data System (ADS)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

    As large-scale complex engineering encompasses various functions, and each function is realized through the completion of one or more projects, the combined projects that affect each function need to be identified. Based on the types of project portfolio, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio-project techniques based on functional objectives were introduced, and the principles of such techniques were then studied and proposed. In addition, the processes for combining projects were also constructed. With the help of portfolio-project techniques based on the functional objectives of projects, these findings lay a good foundation for the portfolio management of large-scale complex engineering.

  20. A Coupling Analysis Approach to Capture Unexpected Behaviors in Ares 1

    NASA Astrophysics Data System (ADS)

    Kis, David

    Coupling of physics in large-scale complex engineering systems must be correctly accounted for during the systems engineering process; accounting for these couplings early helps ensure that no unanticipated behaviors arise during operation. Structural vibration of large segmented solid rocket motors, known as thrust oscillation, is a well-documented problem that can affect solid rocket motors in adverse ways. Within the Ares 1 rocket, unexpected vibrations deemed potentially harmful to future crew were recorded during late-stage flight that propagated from the engine chamber to the Orion crew module. This research proposes the use of a coupling strength analysis during the design and development phase to identify potential unanticipated behaviors such as thrust oscillation. Once these behaviors and couplings are identified, a value function, based on research in Value Driven Design, is proposed to evaluate mitigation strategies and their impact on system value. The results from this study showcase a strong coupling interaction from structural displacement back onto the fluid flow of the Ares 1 that was previously deemed inconsequential. These findings show that the use of a coupling strength analysis can aid engineers and managers in identifying unanticipated behaviors and then rank-ordering their importance based on the impact they have on value.
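    The study's specific coupling metric is not given in this record; the sketch below only illustrates the general idea of tabulating normalized inter-subsystem sensitivities in a coupling matrix and rank-ordering the off-diagonal couplings. The disciplines and numbers are hypothetical:

      import numpy as np

      labels = ["fluid flow", "structural displacement"]

      # Hypothetical coupling matrix: S[i, j] is the normalized influence of
      # subsystem j's output on subsystem i.
      S = np.array([
          [0.00, 0.62],   # influences on fluid flow
          [0.15, 0.00],   # influences on structural displacement
      ])

      couplings = [(labels[j], labels[i], S[i, j])
                   for i in range(len(labels))
                   for j in range(len(labels)) if i != j]

      # Rank-order couplings by strength, strongest first.
      for src, dst, strength in sorted(couplings, key=lambda c: -c[2]):
          print(f"{src} -> {dst}: {strength:.2f}")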

  1. Systems metabolic engineering of microorganisms to achieve large-scale production of flavonoid scaffolds.

    PubMed

    Wu, Junjun; Du, Guocheng; Zhou, Jingwen; Chen, Jian

    2014-10-20

    Flavonoids possess pharmaceutical potential due to their health-promoting activities. The complex structures of these products make extraction from plants difficult, and chemical synthesis is limited because of the use of many toxic solvents. Microbial production offers an alternate way to produce these compounds on an industrial scale in a more economical and environment-friendly manner. However, at present microbial production has been achieved only on a laboratory scale and improvements and scale-up of these processes remain challenging. Naringenin and pinocembrin, which are flavonoid scaffolds and precursors for most of the flavonoids, are the model molecules that are key to solving the current issues restricting industrial production of these chemicals. The emergence of systems metabolic engineering, which combines systems biology with synthetic biology and evolutionary engineering at the systems level, offers new perspectives on strain and process optimization. In this review, current challenges in large-scale fermentation processes involving flavonoid scaffolds and the strategies and tools of systems metabolic engineering used to overcome these challenges are summarized. This will offer insights into overcoming the limitations and challenges of large-scale microbial production of these important pharmaceutical compounds. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Control methods for aiding a pilot during STOL engine failure transients

    NASA Technical Reports Server (NTRS)

    Nelson, E. R.; Debra, D. B.

    1976-01-01

    Candidate autopilot control laws for controlling engine-failure transient sink rates were defined, demonstrating the engineering application of modern state-variable control theory. The results of approximate modal analysis were compared to those derived from full-state analyses provided by computer design solutions. The aircraft was described, and a state-variable model of its longitudinal dynamic motion due to engine and control variations was defined. The classical fast and slow modes were assumed to be sufficiently different to define reduced-order approximations of the aircraft motion amenable to hand-analysis control definition methods. The original state equations of motion were also applied to a large-scale state-variable control design program, in particular OPTSYS. The resulting control laws were compared with respect to their relative responses, ease of application, and meeting the desired performance objectives.

  3. A Survey and Analysis of Access Control Architectures for XML Data

    DTIC Science & Technology

    2006-03-01

    The report surveys access control architectures for XML data; its contents include a section on XML query engines. Information protection is likened to a castle and the drawbridge over the moat; extending beyond the visual analogy, there are many key components to the protection of information and technology. While XML's original intent was to enable large-scale electronic publishing over the internet, its functionality is firmly rooted in its ...

  4. Ontology-Driven Provenance Management in eScience: An Application in Parasite Research

    NASA Astrophysics Data System (ADS)

    Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.

    Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.
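    The Parasite Experiment ontology and the PMS query operators are not shown in this record; the following is only a generic sketch of an ontology-backed provenance query using rdflib, with a hypothetical namespace and hypothetical predicates:

      from rdflib import Graph, Literal, Namespace

      EX = Namespace("http://example.org/provenance#")   # hypothetical namespace

      g = Graph()
      sample = EX["sample42"]
      g.add((sample, EX.generatedBy, EX.knockoutExperiment1))
      g.add((EX.knockoutExperiment1, EX.performedBy, Literal("research group A")))

      # Provenance query: which agent performed the experiment that generated each sample?
      q = """
      PREFIX ex: <http://example.org/provenance#>
      SELECT ?sample ?agent WHERE {
          ?sample ex:generatedBy ?exp .
          ?exp ex:performedBy ?agent .
      }
      """
      for row in g.query(q):
          print(row.sample, row.agent)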

  5. Free Global Dsm Assessment on Large Scale Areas Exploiting the Potentialities of the Innovative Google Earth Engine Platform

    NASA Astrophysics Data System (ADS)

    Nascetti, A.; Di Rita, M.; Ravanelli, R.; Amicuzi, M.; Esposito, S.; Crespi, M.

    2017-05-01

    The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data. In this work, the geometric accuracy of the two most widely used nearly global free DSMs (SRTM and ASTER) has been evaluated over the territories of four American states (Colorado, Michigan, Nevada, Utah) and one Italian region (Trentino Alto-Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphology, land cover, and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The DSM accuracy has been evaluated through computation of standard statistical parameters, both at the global scale (considering the whole state/region) and as a function of terrain morphology using several slope classes. In terms of standard deviation and NMAD, the geometric accuracy for SRTM ranges from 2-3 meters in the first slope class to about 45 meters in the last one, whereas for ASTER the values range from 5-6 to 30 meters. In general, the analysis shows better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the potential of GEE to perform DSM assessment on a global scale.
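    A compressed sketch of the kind of comparison the platform enables is given below, using the Earth Engine Python API. The SRTM and NED asset IDs are the publicly documented ones, but the region, reducer choice, and scale are illustrative assumptions rather than the authors' actual workflow:

      import ee

      ee.Initialize()

      srtm = ee.Image("USGS/SRTMGL1_003").select("elevation")   # ~30 m SRTM DSM
      ned = ee.Image("USGS/NED").select("elevation")             # US reference DEM

      region = ee.Geometry.Rectangle([-106.0, 39.0, -105.0, 40.0])  # hypothetical Colorado tile

      diff = srtm.subtract(ned)
      stats = diff.reduceRegion(
          reducer=ee.Reducer.mean().combine(ee.Reducer.stdDev(), sharedInputs=True),
          geometry=region,
          scale=30,
          maxPixels=1e9,
      )
      print(stats.getInfo())   # mean and standard deviation of SRTM - NED over the tile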

  6. Monitoring Urban Heat Island Through Google Earth Engine: Potentialities and Difficulties in Different Cities of the United States

    NASA Astrophysics Data System (ADS)

    Ravanelli, R.; Nascetti, A.; Cirigliano, R. V.; Di Rico, C.; Monti, P.; Crespi, M.

    2018-04-01

    The aim of this work is to exploit the large-scale analysis capabilities of the innovative Google Earth Engine platform in order to investigate the temporal variations of the Urban Heat Island (UHI) phenomenon as a whole. An intuitive methodology implementing a large-scale correlation analysis between Land Surface Temperature (LST) and land cover alterations was thus developed. The results obtained for the Phoenix metropolitan area are promising and show how urbanization heavily affects the magnitude of the UHI effects, with significant increases in LST. The proposed methodology is therefore able to efficiently monitor the UHI phenomenon.

  7. The prospect of modern thermomechanics in structural integrity calculations of large-scale pressure vessels

    NASA Astrophysics Data System (ADS)

    Fekete, Tamás

    2018-05-01

    Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific engineering paradigm, structural integrity, has been developing that is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments, and numerics. Although the application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is because existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially based on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research has been initiated at MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity. The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and ultimately to find a theoretical framework that could serve as a well-grounded foundation for a new modeling framework of structural integrity. This paper presents the first findings of the research project.

  8. Tests and analysis of a vented D thrust deflecting nozzle on a turbofan engine. [conducted at the outdoor aerodynamic research facility of the Ames Research Center

    NASA Technical Reports Server (NTRS)

    Roseberg, E. W.

    1982-01-01

    The objectives were to: obtain nozzle performance characteristics in and out of ground effects; demonstrate the compatibility of the nozzle with a turbofan engine; obtain pressure and temperature distributions on the surface of the D vented nozzle; and establish a correlation of the nozzle performance between small scale and large scale models. The test nozzle was a boilerplate model of the MCAIR D vented nozzle configured for operation with a General Electric YTF-34-F5 turbofan engine. The nozzle was configured to provide: a thrust vectoring range of 0 to 115 deg; a yaw vectoring range of 0 to 10 deg; variable nozzle area control; and variable spacing between the core exit and nozzle entrance station. Compatibility between the YTF-34-T5 turbofan engine and the D vented nozzle was demonstrated. Velocity coefficients of 0.96 and greater were obtained for 90 deg of thrust vectoring. The nozzle walls remained cool during all test conditions.

  9. Some Thoughts About Water Analysis in Shipboard Steam Propulsion Systems for Marine Engineering Students.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.; And Others

    Information is presented about the problems involved in using sea water in the steam propulsion systems of large, modern ships. Discussions supply background chemical information concerning the problems of corrosion, scale buildup, and sludge production. Suggestions are given for ways to maintain a good water treatment program to effectively deal…

  10. Testing of the NASA Hypersonics Project Combined Cycle Engine Large Scale Inlet Mode Transition Experiment (CCE LIMX)

    NASA Technical Reports Server (NTRS)

    Saunders, J. D.; Stueber, T. J.; Thomas, S. R.; Suder, K. L.; Weir, L. J.; Sanders, B. W.

    2012-01-01

    Status on an effort to develop Turbine Based Combined Cycle (TBCC) propulsion is described. This propulsion technology can enable reliable and reusable space launch systems. TBCC propulsion offers improved performance and safety over rocket propulsion. The potential to realize aircraft-like operations and reduced maintenance are additional benefits. Among the most critical TBCC enabling technologies are: 1) mode transition from turbine to scramjet propulsion, 2) high Mach turbine engines, and 3) TBCC integration. To address these TBCC challenges, the effort is centered on a propulsion mode transition experiment and includes analytical research. The test program, the Combined-Cycle Engine Large Scale Inlet Mode Transition Experiment (CCE LIMX), was conceived to integrate TBCC propulsion with proposed hypersonic vehicles. The goals address: (1) dual inlet operability and performance, (2) mode-transition sequences enabling a switch between turbine and scramjet flow paths, and (3) turbine engine transients during transition. Four test phases are planned from which a database can be used to both validate design and analysis codes and characterize operability and integration issues for TBCC propulsion. In this paper we discuss the research objectives, features of the CCE hardware and test plans, and status of the parametric inlet characterization testing which began in 2011. This effort is sponsored by the NASA Fundamental Aeronautics Hypersonics project.

  11. Parametric analysis of a down-scaled turbo jet engine suitable for drone and UAV propulsion

    NASA Astrophysics Data System (ADS)

    Wessley, G. Jims John; Chauhan, Swati

    2018-04-01

    This paper presents a detailed study on the need for downscaling gas turbine engines for UAV and drone propulsion, along with the downscaling procedure and a parametric analysis of a downscaled engine using the Gas Turbine Simulation Program software GSP 11. The need to identify a micro gas turbine engine in the thrust range of 0.13 to 4.45 kN to power UAVs and drones weighing 4.5 to 25 kg is considered, and to meet this requirement a parametric analysis of the scaled-down Allison J33-A-35 turbojet engine is performed. It is evident from the analysis that the thrust developed by the scaled engine and the thrust-specific fuel consumption (TSFC) depend on pressure ratio, air mass flow rate, and Mach number. A scaling factor of 0.195, corresponding to an air mass flow rate of 7.69 kg/s, produces a thrust in the range of 4.57 to 5.6 kN while operating at a Mach number of 0.3 within the altitude range of 5000 to 9000 m. The thermal and overall efficiencies of the scaled engine are found to be 67% and 75%, respectively, for a pressure ratio of 2. The outcomes of this analysis form a strong base for further analysis, design, and fabrication of micro gas turbine engines to propel future UAVs and drones.
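    The following rough sketch only illustrates the first-order scaling logic implied above: thrust taken as proportional to air mass flow at a fixed specific thrust, with fuel flow given by TSFC times thrust. The baseline numbers are placeholders chosen to echo the abstract and are not outputs of GSP 11:

      def scaled_performance(scale_factor, baseline_mdot, specific_thrust, tsfc):
          """First-order scaling of a turbojet by reducing air mass flow.

          Assumes specific thrust (N per kg/s of air) and TSFC (kg/s of fuel
          per N of thrust) are roughly preserved at the smaller scale.
          """
          mdot = scale_factor * baseline_mdot      # air mass flow at the smaller scale, kg/s
          thrust = specific_thrust * mdot          # N
          fuel_flow = tsfc * thrust                # kg/s
          return mdot, thrust, fuel_flow

      # Placeholder baseline loosely consistent with the abstract's figures.
      mdot, thrust, fuel = scaled_performance(
          scale_factor=0.195, baseline_mdot=39.4, specific_thrust=650.0, tsfc=3.0e-5
      )
      print(f"mdot = {mdot:.2f} kg/s, thrust = {thrust / 1000:.2f} kN, fuel = {fuel * 3600:.0f} kg/h")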

  12. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high-technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  13. Apollo experience report: Guidance and control systems. Engineering simulation program

    NASA Technical Reports Server (NTRS)

    Gilbert, D. W.

    1973-01-01

    The Apollo Program experience from early 1962 to July 1969 with respect to the engineering-simulation support and the problems encountered is summarized in this report. Engineering simulation in support of the Apollo guidance and control system is discussed in terms of design analysis and verification, certification of hardware in closed-loop operation, verification of hardware/software compatibility, and verification of both software and procedures for each mission. The magnitude, time, and cost of the engineering simulations are described with respect to hardware availability, NASA and contractor facilities (for verification of the command module, the lunar module, and the primary guidance, navigation, and control system), and scheduling and planning considerations. Recommendations are made regarding implementation of similar, large-scale simulations for future programs.

  14. Retaining large and adjustable elastic strains of kilogram-scale Nb nanowires [Better Superconductor by Elastic Strain Engineering: Kilogram-scale Free-Standing Niobium Metal Composite with Large Retained Elastic Strains

    DOE PAGES

    Hao, Shijie; Cui, Lishan; Wang, Hua; ...

    2016-02-10

    Crystals held at ultrahigh elastic strains and stresses may exhibit exceptional physical and chemical properties. Individual metallic nanowires can sustain ultra-large elastic strains of 4-7%. However, retaining elastic strains of such magnitude in kilogram-scale nanowires is challenging. Here, we find that under active load, ~5.6% elastic strain can be achieved in Nb nanowires in a composite material. Moreover, large tensile (2.8%) and compressive (-2.4%) elastic strains can be retained in kilogram-scale Nb nanowires when the composite is unloaded to a free-standing condition. It is then demonstrated that the retained tensile elastic strains of Nb nanowires significantly increase their superconducting transition temperature and critical magnetic fields, corroborating ab initio calculations based on BCS theory. This free-standing nanocomposite design paradigm opens new avenues for retaining ultra-large elastic strains in great quantities of nanowires and elastic-strain engineering at industrial scale.

  15. Versatile synchronized real-time MEG hardware controller for large-scale fast data acquisition.

    PubMed

    Sun, Limin; Han, Menglai; Pratt, Kevin; Paulson, Douglas; Dinh, Christoph; Esch, Lorenz; Okada, Yoshio; Hämäläinen, Matti

    2017-05-01

    Versatile controllers for accurate, fast, and real-time synchronized acquisition of large-scale data are useful in many areas of science, engineering, and technology. Here, we describe the development of a controller software based on a technique called queued state machine for controlling the data acquisition (DAQ) hardware, continuously acquiring a large amount of data synchronized across a large number of channels (>400) at a fast rate (up to 20 kHz/channel) in real time, and interfacing with applications for real-time data analysis and display of electrophysiological data. This DAQ controller was developed specifically for a 384-channel pediatric whole-head magnetoencephalography (MEG) system, but its architecture is useful for wide applications. This controller running in a LabVIEW environment interfaces with microprocessors in the MEG sensor electronics to control their real-time operation. It also interfaces with a real-time MEG analysis software via transmission control protocol/internet protocol, to control the synchronous acquisition and transfer of the data in real time from >400 channels to acquisition and analysis workstations. The successful implementation of this controller for an MEG system with a large number of channels demonstrates the feasibility of employing the present architecture in several other applications.
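    The LabVIEW implementation cannot be reproduced here, but the queued state machine pattern the controller is built on is language-agnostic. The following minimal Python sketch uses a hypothetical command set and state names, not the MEG controller's actual states:

      import queue

      def run_controller(commands):
          """Queued state machine: commands are processed in the order enqueued."""
          q = queue.Queue()
          for c in commands:
              q.put(c)

          state = "idle"
          while not q.empty():
              cmd = q.get()
              if state == "idle" and cmd == "configure":
                  state = "configured"        # set up DAQ hardware
              elif state == "configured" and cmd == "start":
                  state = "acquiring"         # begin synchronized acquisition
              elif state == "acquiring" and cmd == "read":
                  pass                        # acquire one buffer; stay in 'acquiring'
              elif cmd == "stop":
                  state = "idle"              # stop acquisition from any state
              print(f"command={cmd!r} -> state={state!r}")
          return state

      run_controller(["configure", "start", "read", "read", "stop"])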

  16. A modular approach to creating large engineered cartilage surfaces.

    PubMed

    Ford, Audrey C; Chui, Wan Fung; Zeng, Anne Y; Nandy, Aditya; Liebenberg, Ellen; Carraro, Carlo; Kazakia, Galateia; Alliston, Tamara; O'Connell, Grace D

    2018-01-23

    Native articular cartilage has limited capacity to repair itself from focal defects or osteoarthritis. Tissue engineering has provided a promising biological treatment strategy that is currently being evaluated in clinical trials. However, translating these techniques to the development of large engineered tissues remains a significant challenge. In this study, we present a method for developing large-scale engineered cartilage surfaces through modular fabrication. Modular Engineered Tissue Surfaces (METS) uses the well-known, but largely under-utilized, self-adhesion properties of de novo tissue to create large scaffolds with nutrient channels. Compressive mechanical properties were evaluated throughout METS specimens, and the tensile mechanical strength of the bonds between attached constructs was evaluated over time. Raman spectroscopy, biochemical assays, and histology were performed to investigate matrix distribution. Results showed that by Day 14, stable connections had formed between the constructs in the METS samples. By Day 21, bonds were robust enough to form a rigid sheet and continued to increase in size and strength over time. Compressive mechanical properties and glycosaminoglycan (GAG) content of METS and individual constructs increased significantly over time. The METS technique builds on established tissue engineering accomplishments of developing constructs with GAG composition and compressive properties approaching native cartilage. This study demonstrated that modular fabrication is a viable technique for creating large-scale engineered cartilage, which can be broadly applied to many tissue engineering applications and construct geometries. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Assembling Large, Multi-Sensor Climate Datasets Using the SciFlo Grid Workflow System

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Fetzer, E.

    2008-12-01

    NASA's Earth Observing System (EOS) is the world's most ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the A-Train platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the cloud scenes from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time matchups between instruments swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, and assemble merged datasets for further scientific and statistical analysis. To meet these large-scale challenges, we are utilizing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data query, access, subsetting, co-registration, mining, fusion, and advanced statistical analysis. SciFlo is a semantically-enabled ("smart") Grid Workflow system that ties together a peer-to-peer network of computers into an efficient engine for distributed computation. The SciFlo workflow engine enables scientists to do multi-instrument Earth Science by assembling remotely-invokable Web Services (SOAP or http GET URLs), native executables, command-line scripts, and Python codes into a distributed computing flow. A scientist visually authors the graph of operation in the VizFlow GUI, or uses a text editor to modify the simple XML workflow documents. The SciFlo client & server engines optimize the execution of such distributed workflows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The engine transparently moves data to the operators, and moves operators to the data (on the dozen trusted SciFlo nodes). SciFlo also deploys a variety of Data Grid services to: query datasets in space and time, locate & retrieve on-line data granules, provide on-the-fly variable and spatial subsetting, and perform pairwise instrument matchups for A-Train datasets. These services are combined into efficient workflows to assemble the desired large-scale, merged climate datasets. SciFlo is currently being applied in several large climate studies: comparisons of aerosol optical depth between MODIS, MISR, AERONET ground network, and U. Michigan's IMPACT aerosol transport model; characterization of long-term biases in microwave and infrared instruments (AIRS, MLS) by comparisons to GPS temperature retrievals accurate to 0.1 degrees Kelvin; and construction of a decade-long, multi-sensor water vapor climatology stratified by classified cloud scene by bringing together datasets from AIRS/AMSU, AMSR-E, MLS, MODIS, and CloudSat (NASA MEASUREs grant, Fetzer PI). The presentation will discuss the SciFlo technologies, their application in these distributed workflows, and the many challenges encountered in assembling and analyzing these massive datasets.

  18. Assembling Large, Multi-Sensor Climate Datasets Using the SciFlo Grid Workflow System

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Manipon, G.; Xing, Z.; Fetzer, E.

    2009-04-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instruments swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To meet these large-scale challenges, we are utilizing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data query, access, subsetting, co-registration, mining, fusion, and advanced statistical analysis. SciFlo is a semantically-enabled ("smart") Grid Workflow system that ties together a peer-to-peer network of computers into an efficient engine for distributed computation. The SciFlo workflow engine enables scientists to do multi-instrument Earth Science by assembling remotely-invokable Web Services (SOAP or http GET URLs), native executables, command-line scripts, and Python codes into a distributed computing flow. A scientist visually authors the graph of operation in the VizFlow GUI, or uses a text editor to modify the simple XML workflow documents. The SciFlo client & server engines optimize the execution of such distributed workflows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The engine transparently moves data to the operators, and moves operators to the data (on the dozen trusted SciFlo nodes). SciFlo also deploys a variety of Data Grid services to: query datasets in space and time, locate & retrieve on-line data granules, provide on-the-fly variable and spatial subsetting, perform pairwise instrument matchups for A-Train datasets, and compute fused products. These services are combined into efficient workflows to assemble the desired large-scale, merged climate datasets. SciFlo is currently being applied in several large climate studies: comparisons of aerosol optical depth between MODIS, MISR, AERONET ground network, and U. Michigan's IMPACT aerosol transport model; characterization of long-term biases in microwave and infrared instruments (AIRS, MLS) by comparisons to GPS temperature retrievals accurate to 0.1 degrees Kelvin; and construction of a decade-long, multi-sensor water vapor climatology stratified by classified cloud scene by bringing together datasets from AIRS/AMSU, AMSR-E, MLS, MODIS, and CloudSat (NASA MEASUREs grant, Fetzer PI). 
The presentation will discuss the SciFlo technologies, their application in these distributed workflows, and the many challenges encountered in assembling and analyzing these massive datasets.

  19. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  1. Adaptation of MSC/NASTRAN to a supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gloudeman, J.F.; Hodge, J.C.

    1982-01-01

    MSC/NASTRAN is a large-scale general-purpose digital computer program which solves a wide variety of engineering analysis problems by the finite element method. The program capabilities include static and dynamic structural analysis (linear and nonlinear), heat transfer, acoustics, electromagnetism, and other types of field problems. It is used worldwide by large and small companies in such diverse fields as automotive, aerospace, civil engineering, shipbuilding, offshore oil, industrial equipment, chemical engineering, biomedical research, optics, and government research. The paper presents the significant aspects of the adaptation of MSC/NASTRAN to the Cray-1. First, the general architecture and predominant functional use of MSC/NASTRAN are discussed to help explain the imperatives and the challenges of this undertaking. The key characteristics of the Cray-1 which influenced the decision to undertake this effort are then reviewed to help identify performance targets. An overview of the MSC/NASTRAN adaptation effort is then given to help define the scope of the project. Finally, some measures of MSC/NASTRAN's operational performance on the Cray-1 are given, along with a few guidelines to help avoid improper interpretation. 17 references.

  2. The role of thermodynamics in biochemical engineering

    NASA Astrophysics Data System (ADS)

    von Stockar, Urs

    2013-09-01

    This article is an adapted version of the introductory chapter of a book whose publication is imminent. It bears the title "Biothermodynamics - The role of thermodynamics in biochemical engineering." The aim of the paper is to give a very short overview of the state of biothermodynamics in an engineering context as reflected in this book. Seen from this perspective, biothermodynamics may be subdivided according to the scale used to formalize the description of the biological system into three large areas: (i) biomolecular thermodynamics (most fundamental scale), (ii) thermodynamics of metabolism (intermediary scale), and (iii) whole-cell thermodynamics ("black-box" description of living entities). In each of these subareas, the main available theoretical approaches and the current and the potential applications are discussed. Biomolecular thermodynamics (i) is especially well developed and is obviously highly pertinent for the development of downstream processing. Its use ought to be encouraged as much as possible. The subarea of thermodynamics of live cells (iii), although scarcely applied in practice, is also expected to enhance bioprocess research and development, particularly in predicting culture performances, for understanding the driving forces for cellular growth, and in developing, monitoring, and controlling cellular cultures. Finally, there is no question that thermodynamic analysis of cellular metabolism (ii) is a promising tool for systems biology and for many other applications, but quite a large research effort is still needed before it may be put to practical use.

  3. Simultaneous Study of Intake and In-Cylinder IC Engine Flow Fields to Provide an Insight into Intake Induced Cyclic Variations

    NASA Astrophysics Data System (ADS)

    Justham, T.; Jarvis, S.; Clarke, A.; Garner, C. P.; Hargrave, G. K.; Halliwell, N. A.

    2006-07-01

    Simultaneous intake and in-cylinder digital particle image velocimetry (DPIV) experimental data are presented for a motored spark ignition (SI) optical internal combustion (IC) engine. Two individual DPIV systems were employed to study the inter-relationship between the intake and in-cylinder flow fields at an engine speed of 1500 rpm. Results for the intake runner velocity field at the time of maximum intake valve lift are compared to in-cylinder velocity fields later in the same engine cycle. Relationships between flow structures within the runner and cylinder were seen to be strong during the intake stroke but less significant during compression. Cyclic variations within the intake runner were seen to affect the large scale bulk flow motion. The subsequent decay of the large scale motions into smaller scale turbulent structures during the compression stroke appears to reduce the relationship with the intake flow variations.

  4. Automotive Stirling engine Market and Industrial Readiness Program (MIRP), phase 1

    NASA Astrophysics Data System (ADS)

    1982-05-01

    A program, begun in 1978, has the goal of transferring Stirling engine technology from United Stirling of Sweden to the US and, then, following design, fabrication, and prototype testing, to secure US manufacturers for the engine. The ultimate objective is the large-scale commercial use of the Automotive Stirling Engine (ASE) by the year 2000. The first phase of the Market and Industrial Readiness Program for the ASE was concerned with defining the market, product, economic, and technical factors that must be addressed to assure a reasonable chance of ultimate commercial acceptance. Program results for this first phase are reported and discussed. These results pertain to licensing strategy development, economic analysis, market factors, product planning, market growth, cost studies, and engine performance as measured by fuel economy using conventional fuels and by vehicle speed and acceleration characteristics.

  5. Energy: the microfluidic frontier.

    PubMed

    Sinton, David

    2014-09-07

    Global energy is largely a fluids problem. It is also large-scale, in stark contrast to microchannels. Microfluidic energy technologies must offer either massive scalability or direct relevance to energy processes already operating at scale. We have to pick our fights. Highlighted here are the exceptional opportunities I see, including some recent successes and areas where much more attention is needed. The most promising directions are those that leverage high surface-to-volume ratios, rapid diffusive transport, capacity for high temperature and high pressure experiments, and length scales characteristic of microbes and fluids (hydrocarbons, CO2) underground. The most immediate areas of application are where information is the product; either fluid sample analysis (e.g. oil analysis); or informing operations (e.g. CO2 transport in microporous media). I'll close with aspects that differentiate energy from traditional microfluidics applications, the uniquely important role of engineering in energy, and some thoughts for the research community forming at the nexus of lab-on-a-chip and energy--a microfluidic frontier.

  6. Methodological Problems of Nanotechnoscience

    NASA Astrophysics Data System (ADS)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods used to solve scientific and engineering problems. This makes it possible to change from one explanation and scientific world view to another without difficulty. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering as the analysis and design of large-scale, complex, man/machine systems, but applied to micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. This design orientation changes the priorities of the complex research and the relation to knowledge, not only to “knowledge about something” but also to knowledge as a means of activity: from the beginning, control and restructuring of matter at the nanoscale is a necessary element of nanoscience.

  7. Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems

    NASA Astrophysics Data System (ADS)

    Sikkandar Basha, Nazareen

    The design and development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams, numerous levels of the organization, and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements are used to capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interaction are required to elicit the requirements of the system within the organization. Since LSCES involve people and interactions between teams and interdisciplinary departments, they are socio-technical in nature. The elicitation of requirements for most large-scale system projects is subject to creep in time and cost due to the uncertainty and ambiguity of requirements during design and development. In an organization structure, the cost and time overrun can occur at any level and iterate back and forth, thus increasing cost and time. Past research has shown that rigorous approaches such as value-based design can be used to control such creep, but before these approaches can be applied, the decision maker should have a proper understanding of requirements creep and the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when the creep occurs and to provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework, in the design and development of LSCES. It can aid in understanding the system and in decision making to minimize the value gap due to requirements creep by eliminating the ambiguity which arises during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep in terms of cost and time using the Cynefin framework. These trials are repeated for different requirements and at different sub-system levels. The results obtained show that the Cynefin framework can be used to improve the value of the system and can be used for predictive analysis. Decision makers can use these findings, together with rigorous approaches, to improve the design of Large-Scale Complex Engineered Systems.

  8. Overview of the NCC

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey

    2001-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between then NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Glenn Research Center (LeRC), and Pratt & Whitney (P&W). The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration. The development of the NCC beta version was essentially completed in June 1998. Technical details of the NCC elements are given in the Reference List. Elements such as the baseline flow solver, turbulence module, and the chemistry module, have been extensively validated; and their parallel performance on large-scale parallel systems has been evaluated and optimized. However the scalar PDF module and the Spray module, as well as their coupling with the baseline flow solver, were developed in a small-scale distributed computing environment. As a result, the validation of the NCC beta version as a whole was quite limited. Current effort has been focused on the validation of the integrated code and the evaluation/optimization of its overall performance on large-scale parallel systems.

  9. Toward systems metabolic engineering of Aspergillus and Pichia species for the production of chemicals and biofuels.

    PubMed

    Caspeta, Luis; Nielsen, Jens

    2013-05-01

    Recently genome sequence data have become available for Aspergillus and Pichia species of industrial interest. This has stimulated the use of systems biology approaches for large-scale analysis of the molecular and metabolic responses of Aspergillus and Pichia under defined conditions, which has resulted in much new biological information. Case-specific contextualization of this information has been performed using comparative and functional genomic tools. Genomics data are also the basis for constructing genome-scale metabolic models, and these models have helped in the contextualization of knowledge on the fundamental biology of Aspergillus and Pichia species. Furthermore, with the availability of these models, the engineering of Aspergillus and Pichia is moving from traditional approaches, such as random mutagenesis, to a systems metabolic engineering approach. Here we review the recent trends in systems biology of Aspergillus and Pichia species, highlighting the relevance of these developments for systems metabolic engineering of these organisms for the production of hydrolytic enzymes, biofuels and chemicals from biomass. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Development of a Pebble-Bed Liquid-Nitrogen Evaporator/Superheater for the BRL 1/6th Scale Large Blast/Thermal Simulator Test Bed. Phase 1. Prototype Design and Analysis

    DTIC Science & Technology

    1991-08-01

    specifications are taken primarily from the 1983 version of the ASME Boiler and Pressure Vessel Code. Other design requirements were developed from standard safe... rules and practices of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code to provide a safe and reliable system.

  11. Learning from Our Global Competitors: A Comparative Analysis of Science, Technology, Engineering and Mathematics (STEM) Education Pipelines in the United States, Mainland China and Taiwan

    ERIC Educational Resources Information Center

    Chow, Christina M.

    2011-01-01

    Maintaining a competitive edge within the 21st century is dependent on the cultivation of human capital, producing qualified and innovative employees capable of competing within the new global marketplace. Technological advancements in communications technology as well as large scale, infrastructure development has led to a leveled playing field…

  12. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.
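    ROSS's actual instrumentation API is not reproduced in the abstract; the toy sketch below only illustrates the kind of per-logical-process rollback bookkeeping such instrumentation collects, from which an efficiency metric for tuning can be derived. All class and method names are hypothetical.

```python
from collections import defaultdict

class RollbackStats:
    """Toy per-logical-process counters of the kind an instrumented optimistic
    PDES engine (e.g. Time Warp) might collect at runtime."""
    def __init__(self):
        self.processed = defaultdict(int)    # events processed per LP
        self.rolled_back = defaultdict(int)  # events later undone per LP

    def record_event(self, lp_id):
        self.processed[lp_id] += 1

    def record_rollback(self, lp_id, n_events):
        self.rolled_back[lp_id] += n_events

    def efficiency(self, lp_id):
        """Fraction of processed events that were never rolled back."""
        done = self.processed[lp_id]
        undone = self.rolled_back[lp_id]
        return (done - undone) / done if done else 1.0

stats = RollbackStats()
for _ in range(1000):
    stats.record_event("lp-0")
stats.record_rollback("lp-0", 150)   # a straggler message forced 150 events to be undone
print(f"lp-0 rollback efficiency: {stats.efficiency('lp-0'):.2%}")
```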

  13. What scaling means in wind engineering: Complementary role of the reduced scale approach in a BLWT and the full scale testing in a large climatic wind tunnel

    NASA Astrophysics Data System (ADS)

    Flamand, Olivier

    2017-12-01

    Wind engineering problems are commonly studied by wind tunnel experiments at a reduced scale. This introduces several limitations and calls for a careful planning of the tests and the interpretation of the experimental results. The talk first revisits the similitude laws and discusses how they are actually applied in wind engineering. It will also remind readers why different scaling laws govern in different wind engineering problems. Secondly, the paper focuses on the ways to simplify a detailed structure (bridge, building, platform) when fabricating the downscaled models for the tests. This will be illustrated by several examples from recent engineering projects. Finally, under the most severe weather conditions, manmade structures and equipment should remain operational. What “recreating the climate” means and aims to achieve will be illustrated through common practice in climatic wind tunnel modelling.
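    As a concrete illustration of why different scaling laws govern different problems, the short sketch below compares the wind speed a 1:50 model would need for Reynolds similarity with the speed needed for Froude similarity; the numbers are illustrative and are not taken from the talk.

```python
# Reduced-scale similitude: matching Reynolds vs. Froude numbers (illustrative values).
g = 9.81            # gravitational acceleration, m/s^2
scale = 1.0 / 50.0  # geometric scale of the wind-tunnel model

U_full = 30.0       # full-scale design wind speed, m/s
L_full = 50.0       # full-scale characteristic length, m
L_model = L_full * scale

# Reynolds similarity (same fluid): U_m * L_m = U_f * L_f  =>  U_m = U_f / scale
U_reynolds = U_full * (L_full / L_model)

# Froude similarity: U_m / sqrt(g*L_m) = U_f / sqrt(g*L_f)  =>  U_m = U_f * sqrt(scale)
U_froude = U_full * (L_model / L_full) ** 0.5

print(f"model speed for Reynolds similarity: {U_reynolds:7.1f} m/s (impractical)")
print(f"model speed for Froude similarity:   {U_froude:7.1f} m/s")
```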

  14. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  15. PIV measurements of in-cylinder, large-scale structures in a water-analogue Diesel engine

    NASA Astrophysics Data System (ADS)

    Kalpakli Vester, A.; Nishio, Y.; Alfredsson, P. H.

    2016-11-01

    Swirl and tumble are large-scale structures that develop in an engine cylinder during the intake stroke. Their structure and strength depend on the design of the inlet ports and valves, but also on the valve lift history. Engine manufacturers make their design to obtain a specific flow structure that is assumed to give the best engine performance. Despite many efforts, there are still open questions, such as how swirl and tumble depend on the dynamics of the valves/piston as well as how cycle-to-cycle variations should be minimized. In collaboration with Swedish vehicle industry we perform PIV measurements of the flow dynamics during the intake stroke inside a cylinder of a water-analogue engine model having the same geometrical characteristics as a typical truck Diesel engine. Water can be used since during the intake stroke the flow is nearly incompressible. The flow from the valves moves radially outwards, hits the vertical walls of the cylinder, entrains surrounding fluid, moves along the cylinder walls and creates a central backflow, i.e. a tumble motion. Depending on the port and valve design and orientation none, low, or high swirl can be established. For the first time, the effect of the dynamic motion of the piston/valves on the large-scale structures is captured. Supported by the Swedish Energy Agency, Scania CV AB and Volvo GTT, through the FFI program.

  16. ENGINES: exploring single nucleotide variation in entire human genomes.

    PubMed

    Amigo, Jorge; Salas, Antonio; Phillips, Christopher

    2011-04-19

    Next generation ultra-sequencing technologies are starting to produce extensive quantities of data from entire human genome or exome sequences, and therefore new software is needed to present and analyse this vast amount of information. The 1000 Genomes project has recently released raw data for 629 complete genomes representing several human populations through their Phase I interim analysis and, although there are certain public tools available that allow exploration of these genomes, to date there is no tool that permits comprehensive population analysis of the variation catalogued by such data. We have developed a genetic variant site explorer able to retrieve data for Single Nucleotide Variation (SNVs), population by population, from entire genomes without compromising future scalability and agility. ENGINES (ENtire Genome INterface for Exploring SNVs) uses data from the 1000 Genomes Phase I to demonstrate its capacity to handle large amounts of genetic variation (>7.3 billion genotypes and 28 million SNVs), as well as deriving summary statistics of interest for medical and population genetics applications. The whole dataset is pre-processed and summarized into a data mart accessible through a web interface. The query system allows the combination and comparison of each available population sample, while searching by rs-number list, chromosome region, or genes of interest. Frequency and FST filters are available to further refine queries, while results can be visually compared with other large-scale Single Nucleotide Polymorphism (SNP) repositories such as HapMap or Perlegen. ENGINES is capable of accessing large-scale variation data repositories in a fast and comprehensive manner. It allows quick browsing of whole genome variation, while providing statistical information for each variant site such as allele frequency, heterozygosity or FST values for genetic differentiation. Access to the data mart generating scripts and to the web interface is granted from http://spsmart.cesga.es/engines.php. © 2011 Amigo et al; licensee BioMed Central Ltd.
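    The per-site statistics named in the abstract (allele frequency, heterozygosity, FST) are standard population-genetic quantities; the sketch below shows one minimal way to derive them from diploid genotype counts, using Wright's FST = (HT - HS)/HT. It is independent of the ENGINES code base.

```python
def allele_freq(genotypes):
    """Alternate-allele frequency from diploid genotypes coded 0/1/2
    (count of alternate alleles per individual)."""
    return sum(genotypes) / (2 * len(genotypes))

def expected_heterozygosity(p):
    """Hardy-Weinberg expected heterozygosity for a biallelic site."""
    return 2.0 * p * (1.0 - p)

def fst(pop_genotypes):
    """Wright's FST = (HT - HS) / HT across populations at one site."""
    freqs = [allele_freq(g) for g in pop_genotypes]
    sizes = [len(g) for g in pop_genotypes]
    total = sum(sizes)
    hs = sum(n * expected_heterozygosity(p) for n, p in zip(sizes, freqs)) / total
    p_bar = sum(n * p for n, p in zip(sizes, freqs)) / total
    ht = expected_heterozygosity(p_bar)
    return 0.0 if ht == 0 else (ht - hs) / ht

pop_a = [0, 0, 1, 1, 2, 2, 2, 2]   # genotypes sampled in population A
pop_b = [0, 0, 0, 0, 1, 1, 0, 0]   # genotypes sampled in population B
print("freq A:", allele_freq(pop_a), "freq B:", allele_freq(pop_b))
print("FST:", round(fst([pop_a, pop_b]), 3))
```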

  17. Application and theoretical analysis of the flamelet model for supersonic turbulent combustion flows in the scramjet engine

    NASA Astrophysics Data System (ADS)

    Gao, Zhenxun; Wang, Jingying; Jiang, Chongwen; Lee, Chunhian

    2014-11-01

    In the framework of Reynolds-averaged Navier-Stokes simulation, supersonic turbulent combustion flows at the German Aerospace Centre (DLR) combustor and Japan Aerospace Exploration Agency (JAXA) integrated scramjet engine are numerically simulated using the flamelet model. Based on the DLR combustor case, theoretical analysis and numerical experiments conclude that: the finite rate model only implicitly considers the large-scale turbulent effect and, due to the lack of the small-scale non-equilibrium effect, it would overshoot the peak temperature compared to the flamelet model in general. Furthermore, high-Mach-number compressibility affects the flamelet model mainly in two ways: the spatial pressure variation and the static enthalpy variation due to the kinetic energy. In the flamelet library, the mass fractions of the intermediate species, e.g. OH, are more sensitive to the above two effects than the main species such as H2O. Additionally, in the combustion flowfield where the pressure is larger than the value adopted in the generation of the flamelet library or the conversion from the static enthalpy to the kinetic energy occurs, the temperature obtained by the flamelet model without taking compressibility effects into account would be undershot, and vice versa. The static enthalpy variation effect has only a small influence on the temperature simulation of the flamelet model, while the effect of the spatial pressure variation may cause relatively large errors. From the JAXA case, it is found that the flamelet model cannot in general be used for an integrated scramjet engine. The existence of the inlet together with the transverse injection scheme could cause large spatial variations of pressure, so the pressure value adopted for the generation of a flamelet library should be fine-tuned according to a pre-simulation of pure mixing.

  18. Development of a metal-clad advanced composite shear web design concept

    NASA Technical Reports Server (NTRS)

    Laakso, J. H.

    1974-01-01

    An advanced composite web concept was developed for potential application to the Space Shuttle Orbiter main engine thrust structure. The program consisted of design synthesis, analysis, detail design, element testing, and large scale component testing. A concept was sought that offered significant weight saving by the use of Boron/Epoxy (B/E) reinforced titanium plate structure. The desired concept was one that was practical and that utilized metal to efficiently improve structural reliability. The resulting development of a unique titanium-clad B/E shear web design concept is described. Three large scale components were fabricated and tested to demonstrate the performance of the concept: a titanium-clad plus or minus 45 deg B/E web laminate stiffened with vertical B/E reinforced aluminum stiffeners.

  19. Submarine pipeline on-bottom stability. Volume 2: Software and manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-12-01

    The state-of-the-art in pipeline stability design has been changing very rapidly in recent years. The physics governing on-bottom stability is much better understood now than it was eight years ago, due largely to research and large-scale model tests sponsored by PRCI. Analysis tools utilizing this new knowledge have been developed. These tools provide the design engineer with a rational approach to weight coating design, which can be used with confidence because the tools have been developed based on full scale and near full scale model tests. These tools represent the state-of-the-art in stability design and model the complex behavior of pipes subjected to both wave and current loads. This includes: hydrodynamic forces which account for the effect of the wake (generated by flow over the pipe) washing back and forth over the pipe in oscillatory flow; and the embedment (digging) which occurs as a pipe resting on the seabed is exposed to oscillatory loadings and small oscillatory deflections. This report has been developed as a reference handbook for use in on-bottom pipeline stability analysis. It consists of two volumes. Volume one is devoted to descriptions of the various aspects of the problem: the pipeline design process; ocean physics, wave mechanics, hydrodynamic forces, and meteorological data determination; geotechnical data collection and soil mechanics; and stability design procedures. Volume two describes, lists, and illustrates the analysis software. Diskettes containing the software and examples of the software are also included in Volume two.
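    The PRCI analysis tools themselves are not reproduced here; as a heavily simplified illustration of the force balance behind on-bottom stability, the sketch below combines Morison-type drag and lift with a Coulomb friction check. Coefficients and inputs are placeholders; the real tools add the wake and embedment effects described above.

```python
RHO_SEA = 1025.0   # seawater density, kg/m^3

def on_bottom_stability(diameter, submerged_weight, u_wave, u_current,
                        c_d=0.7, c_l=0.9, friction=0.6):
    """Very simplified static check of lateral on-bottom pipeline stability.

    Morison-type drag and lift from the near-bed velocity are balanced against
    Coulomb friction on the net submerged weight. Returns the ratio of resisting
    to destabilizing force per unit length (>1 suggests the pipe stays put).
    """
    u = u_wave + u_current                    # combined near-bed velocity, m/s
    q = 0.5 * RHO_SEA * u * abs(u)            # dynamic pressure term, N/m^2
    drag = c_d * diameter * q                 # horizontal force per unit length, N/m
    lift = c_l * diameter * abs(q)            # upward force per unit length, N/m
    resistance = friction * max(submerged_weight - lift, 0.0)
    return resistance / abs(drag) if drag else float("inf")

# Illustrative 0.5 m pipe with 2 kN/m submerged weight in combined wave + current flow.
ratio = on_bottom_stability(diameter=0.5, submerged_weight=2000.0,
                            u_wave=0.8, u_current=0.4)
print(f"stability ratio: {ratio:.2f} ({'stable' if ratio >= 1 else 'unstable'})")
```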

  20. Scale-Up of GRCop: From Laboratory to Rocket Engines

    NASA Technical Reports Server (NTRS)

    Ellis, David L.

    2016-01-01

    GRCop is a high temperature, high thermal conductivity copper-based series of alloys designed primarily for use in regeneratively cooled rocket engine liners. It began with laboratory-level production of a few grams of ribbon produced by chill block melt spinning and has grown to commercial-scale production of large-scale rocket engine liners. Along the way, a variety of methods of consolidating and working the alloy were examined, a database of properties was developed and a variety of commercial and government applications were considered. This talk will briefly address the basic material properties used for selection of compositions to scale up, the methods used to go from simple ribbon to rocket engines, the need to develop a suitable database, and the issues related to getting the alloy into a rocket engine or other application.

  1. Digital Image Correlation Techniques Applied to Large Scale Rocket Engine Testing

    NASA Technical Reports Server (NTRS)

    Gradl, Paul R.

    2016-01-01

    Rocket engine hot-fire ground testing is necessary to understand component performance, reliability and engine system interactions during development. The J-2X upper stage engine completed a series of developmental hot-fire tests that derived performance of the engine and components, validated analytical models and provided the necessary data to identify where design changes, process improvements and technology development were needed. The J-2X development engines were heavily instrumented to provide the data necessary to support these activities which enabled the team to investigate any anomalies experienced during the test program. This paper describes the development of an optical digital image correlation technique to augment the data provided by traditional strain gauges which are prone to debonding at elevated temperatures and limited to localized measurements. The feasibility of this optical measurement system was demonstrated during full scale hot-fire testing of J-2X, during which a digital image correlation system, incorporating a pair of high speed cameras to measure three-dimensional, real-time displacements and strains was installed and operated under the extreme environments present on the test stand. The camera and facility setup, pre-test calibrations, data collection, hot-fire test data collection and post-test analysis and results are presented in this paper.
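    The flight measurement system described above is hardware- and vendor-specific; the underlying idea of digital image correlation, tracking a pixel subset between a reference and a deformed image by maximizing a normalized cross-correlation, can be sketched as below. This is an integer-pixel toy version; practical DIC adds sub-pixel interpolation, stereo calibration, and strain fields.

```python
import numpy as np

def dic_displacement(ref, deformed, center, half_window=8, search=5):
    """Estimate the integer-pixel displacement of a subset centered at `center`
    by maximizing zero-normalized cross-correlation over a small search area."""
    r, c = center
    hw = half_window
    subset = ref[r - hw:r + hw + 1, c - hw:c + hw + 1].astype(float)
    subset -= subset.mean()
    best, best_uv = -np.inf, (0, 0)
    for du in range(-search, search + 1):
        for dv in range(-search, search + 1):
            cand = deformed[r + du - hw:r + du + hw + 1,
                            c + dv - hw:c + dv + hw + 1].astype(float)
            cand -= cand.mean()
            denom = np.linalg.norm(subset) * np.linalg.norm(cand)
            score = (subset * cand).sum() / denom if denom else -np.inf
            if score > best:
                best, best_uv = score, (du, dv)
    return best_uv

# Synthetic test: shift a random speckle pattern by (3, -2) pixels and recover it.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
deformed = np.roll(np.roll(ref, 3, axis=0), -2, axis=1)
print("recovered displacement:", dic_displacement(ref, deformed, center=(32, 32)))
```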

  2. Engineering of Baeyer-Villiger monooxygenase-based Escherichia coli biocatalyst for large scale biotransformation of ricinoleic acid into (Z)-11-(heptanoyloxy)undec-9-enoic acid

    PubMed Central

    Seo, Joo-Hyun; Kim, Hwan-Hee; Jeon, Eun-Yeong; Song, Young-Ha; Shin, Chul-Soo; Park, Jin-Byung

    2016-01-01

    Baeyer-Villiger monooxygenases (BVMOs) are able to catalyze regiospecific Baeyer-Villiger oxygenation of a variety of cyclic and linear ketones to generate the corresponding lactones and esters, respectively. However, the enzymes are usually difficult to express in a functional form in microbial cells and are rather unstable under process conditions, hindering their large-scale application. We therefore investigated engineering of the BVMO from Pseudomonas putida KT2440 and the gene expression system to improve its activity and stability for large-scale biotransformation of ricinoleic acid (1) into the ester (i.e., (Z)-11-(heptanoyloxy)undec-9-enoic acid) (3), which can be hydrolyzed into 11-hydroxyundec-9-enoic acid (5) (i.e., a precursor of polyamide-11) and n-heptanoic acid (4). The polyionic tag-based fusion engineering of the BVMO and the use of a synthetic promoter for constitutive enzyme expression allowed the recombinant Escherichia coli expressing the BVMO and the secondary alcohol dehydrogenase of Micrococcus luteus to produce the ester (3) to 85 mM (26.6 g/L) within 5 h. The 5 L scale biotransformation process was then successfully scaled up to a 70 L bioreactor; 3 was produced to over 70 mM (21.9 g/L) in the culture medium 6 h after biotransformation. This study demonstrated that the BVMO-based whole-cell reactions can be applied for large-scale biotransformations. PMID:27311560

  3. McrEngine: A Scalable Checkpointing System Using Data-Aware Aggregation and Compression

    DOE PAGES

    Islam, Tanzima Zerin; Mohror, Kathryn; Bagchi, Saurabh; ...

    2013-01-01

    High performance computing (HPC) systems use checkpoint-restart to tolerate failures. Typically, applications store their states in checkpoints on a parallel file system (PFS). As applications scale up, checkpoint-restart incurs high overheads due to contention for PFS resources. The high overheads force large-scale applications to reduce checkpoint frequency, which means more compute time is lost in the event of failure. We alleviate this problem through a scalable checkpoint-restart system, mcrEngine. McrEngine aggregates checkpoints from multiple application processes with knowledge of the data semantics available through widely-used I/O libraries, e.g., HDF5 and netCDF, and compresses them. Our novel scheme improves compressibility of checkpoints by up to 115% over simple concatenation and compression. Our evaluation with large-scale application checkpoints shows that mcrEngine reduces checkpointing overhead by up to 87% and restart overhead by up to 62% over a baseline with no aggregation or compression.
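    mcrEngine's internals are not shown in the abstract; the toy sketch below only illustrates why data-aware aggregation can help: grouping like-named variables from different process checkpoints next to each other lets the compressor exploit their similarity, whereas naive process-order concatenation may not. The variable names, sizes, and the use of zlib are illustrative choices, not the paper's implementation.

```python
import zlib
import numpy as np

def compressed_size(buffers):
    """Concatenate byte buffers in the given order and zlib-compress the result."""
    return len(zlib.compress(b"".join(buffers), 6))

# Two fake process checkpoints. Each holds a smooth "temperature" field that is
# identical across processes and an uncorrelated "particles" array (stand-ins
# for variables an I/O library such as HDF5/netCDF exposes by name).
rng = np.random.default_rng(1)
def make_checkpoint():
    temperature = np.linspace(290.0, 310.0, 2_000)   # same field on every process
    particles = rng.random(4_000)                    # process-specific noise
    return {"temperature": temperature.tobytes(), "particles": particles.tobytes()}

ckpt_a, ckpt_b = make_checkpoint(), make_checkpoint()

# Naive aggregation: whole checkpoints concatenated in process order.
naive = compressed_size([ckpt_a["temperature"], ckpt_a["particles"],
                         ckpt_b["temperature"], ckpt_b["particles"]])

# Data-aware aggregation: like-named variables grouped across processes, so the
# compressor's window sees the similar temperature arrays back to back.
aware = compressed_size([ckpt_a["temperature"], ckpt_b["temperature"],
                         ckpt_a["particles"], ckpt_b["particles"]])

print(f"naive concatenation: {naive} bytes, data-aware grouping: {aware} bytes")
```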

  4. Education for Professional Engineering Practice

    ERIC Educational Resources Information Center

    Bramhall, Mike D.; Short, Chris

    2014-01-01

    This paper reports on a funded collaborative large-scale curriculum innovation and enhancement project undertaken as part of a UK National Higher Education Science, Technology Engineering and Mathematics (STEM) programme. Its aim was to develop undergraduate curricula to teach appropriate skills for professional engineering practice more…

  5. An Illustrative Guide to the Minerva Framework

    NASA Astrophysics Data System (ADS)

    Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration

    2017-10-01

    Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user level code calls a node, it is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.
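    The Minerva API itself is not given in the abstract; the sketch below shows, with hypothetical class and method names, the general dependency-tracked caching behavior described: a node recomputes and bumps its version only when one of its dependencies has changed since it was last evaluated.

```python
class Node:
    """Toy dependency-tracked node: recomputes only when a dependency's version
    has advanced since the last evaluation (all names are hypothetical)."""
    def __init__(self, name, compute, deps=()):
        self.name, self._compute, self.deps = name, compute, list(deps)
        self.version = 0
        self._seen = {}          # dependency name -> version used last time
        self._value = None

    def set_value(self, value):  # for leaf "engineering parameter" nodes
        self._value, self.version = value, self.version + 1

    def get(self):
        current = {d.name: d.version for d in self.deps}
        if self._value is None or current != self._seen:
            self._value = self._compute(*(d.get() for d in self.deps))
            self._seen = current
            self.version += 1
        return self._value

# Leaf nodes holding parameters, and a derived "data source" node.
gain = Node("gain", compute=None)
gain.set_value(2.0)
raw = Node("raw_signal", compute=None)
raw.set_value([1.0, 2.0, 3.0])
calibrated = Node("calibrated", lambda g, xs: [g * x for x in xs], deps=[gain, raw])

print(calibrated.get())   # computed once
gain.set_value(2.5)       # a dependency changes...
print(calibrated.get())   # ...so the node recomputes and returns version-specific data
```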

  6. Shape Memory Alloys for Vibration Isolation and Damping of Large-Scale Space Structures

    DTIC Science & Technology

    2010-08-04

    ...comparison of martensitic SMA with steel in a sine upsweep; dwell test comparison with sine sweep results (International Conference on Experimental Vibration Analysis for Civil Engineering Structures (EVACES), Porto, Portugal, 2007)... a unique jump in amplitude during a sine sweep if sufficient pre-stretch is applied. These results were significant, but investigation of more...

  7. Scaling earthquake ground motions for performance-based assessment of buildings

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.

    2011-01-01

    The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed. © 2011 American Society of Civil Engineers.
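    Of the four procedures, first-mode-period scaling is the simplest to state: each record is multiplied by the factor that makes its damped spectral acceleration at the structure's first-mode period equal the target value. The sketch below is a minimal stand-alone illustration (Newmark average-acceleration SDOF integration of a synthetic record); it is not the authors' implementation.

```python
import numpy as np

def spectral_accel(accel_g, dt, period, damping=0.05):
    """Pseudo-spectral acceleration Sa(T) of an elastic SDOF oscillator,
    integrated with the Newmark average-acceleration method."""
    wn = 2.0 * np.pi / period
    m, c, k = 1.0, 2.0 * damping * wn, wn ** 2
    beta, gamma = 0.25, 0.5
    a1 = m / (beta * dt ** 2) + gamma * c / (beta * dt)
    a2 = m / (beta * dt) + (gamma / beta - 1.0) * c
    a3 = (0.5 / beta - 1.0) * m + dt * (0.5 * gamma / beta - 1.0) * c
    k_hat = k + a1
    u = v = 0.0
    acc = -accel_g[0]                       # initial acceleration (u = v = 0)
    u_max = 0.0
    for ag in accel_g[1:]:
        p_hat = -m * ag + a1 * u + a2 * v + a3 * acc
        u_new = p_hat / k_hat
        v_new = gamma / (beta * dt) * (u_new - u) + (1 - gamma / beta) * v \
                + dt * (1 - 0.5 * gamma / beta) * acc
        acc = (u_new - u) / (beta * dt ** 2) - v / (beta * dt) - (0.5 / beta - 1) * acc
        u, v = u_new, v_new
        u_max = max(u_max, abs(u))
    return u_max * wn ** 2                  # pseudo-acceleration Sa = wn^2 * Sd

def first_mode_scale_factor(accel_g, dt, t1, target_sa):
    """Factor that scales the record so Sa(T1) equals the target value."""
    return target_sa / spectral_accel(accel_g, dt, t1)

# Synthetic white-noise stand-in for a recorded ground motion, T1 = 1.0 s.
rng = np.random.default_rng(0)
record, dt = 0.5 * rng.standard_normal(2000), 0.01
f = first_mode_scale_factor(record, dt, t1=1.0, target_sa=3.0)
print(f"scale factor {f:.3f}, scaled Sa(T1) = {spectral_accel(f * record, dt, 1.0):.3f}")
```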

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo

    Traditional engineering methods do not make provision for the integration of human considerations, while traditional human factors methods do not scale well to the complexity of large-scale nuclear power plant projects. Although the need for up-to-date human factors engineering processes and tools is recognised widely in industry, so far no formal framework has been developed. This article proposes such a framework.

  9. Gene targeting by TALEN-induced homologous recombination in goats directs production of β-lactoglobulin-free, high-human lactoferrin milk

    PubMed Central

    Cui, Chenchen; Song, Yujie; Liu, Jun; Ge, Hengtao; Li, Qian; Huang, Hui; Hu, Linyong; Zhu, Hongmei; Jin, Yaping; Zhang, Yong

    2015-01-01

    β-Lactoglobulin (BLG) is a major goat’s milk allergen that is absent in human milk. Engineered endonucleases, including transcription activator-like effector nucleases (TALENs) and zinc-finger nucleases, enable targeted genetic modification in livestock. In this study, TALEN-mediated gene knockout followed by gene knock-in were used to generate BLG knockout goats as mammary gland bioreactors for large-scale production of human lactoferrin (hLF). We introduced precise genetic modifications in the goat genome at frequencies of approximately 13.6% and 6.09% for the first and second sequential targeting, respectively, by using targeting vectors that underwent TALEN-induced homologous recombination (HR). Analysis of milk from the cloned goats revealed large-scale hLF expression or/and decreased BLG levels in milk from heterozygous goats as well as the absence of BLG in milk from homozygous goats. Furthermore, the TALEN-mediated targeting events in somatic cells can be transmitted through the germline after SCNT. Our result suggests that gene targeting via TALEN-induced HR may expedite the production of genetically engineered livestock for agriculture and biomedicine. PMID:25994151

  10. Gene targeting by TALEN-induced homologous recombination in goats directs production of β-lactoglobulin-free, high-human lactoferrin milk.

    PubMed

    Cui, Chenchen; Song, Yujie; Liu, Jun; Ge, Hengtao; Li, Qian; Huang, Hui; Hu, Linyong; Zhu, Hongmei; Jin, Yaping; Zhang, Yong

    2015-05-21

    β-Lactoglobulin (BLG) is a major goat's milk allergen that is absent in human milk. Engineered endonucleases, including transcription activator-like effector nucleases (TALENs) and zinc-finger nucleases, enable targeted genetic modification in livestock. In this study, TALEN-mediated gene knockout followed by gene knock-in were used to generate BLG knockout goats as mammary gland bioreactors for large-scale production of human lactoferrin (hLF). We introduced precise genetic modifications in the goat genome at frequencies of approximately 13.6% and 6.09% for the first and second sequential targeting, respectively, by using targeting vectors that underwent TALEN-induced homologous recombination (HR). Analysis of milk from the cloned goats revealed large-scale hLF expression or/and decreased BLG levels in milk from heterozygous goats as well as the absence of BLG in milk from homozygous goats. Furthermore, the TALEN-mediated targeting events in somatic cells can be transmitted through the germline after SCNT. Our result suggests that gene targeting via TALEN-induced HR may expedite the production of genetically engineered livestock for agriculture and biomedicine.

  11. Structural integrity of engineering composite materials: a cracking good yarn.

    PubMed

    Beaumont, Peter W R; Soutis, Costas

    2016-07-13

    Predicting precisely where a crack will develop in a material under stress and exactly when in time catastrophic fracture of the component will occur is one of the oldest unsolved mysteries in the design and building of large-scale engineering structures. Where human life depends upon engineering ingenuity, the burden of testing to prove a 'fracture safe design' is immense. Fitness considerations for long-life implementation of large composite structures include understanding phenomena such as impact, fatigue, creep and stress corrosion cracking that affect reliability, life expectancy and durability of the structure. Structural integrity analysis treats the design, the materials used, and figures out how best components and parts can be joined, and takes service duty into account. However, there are conflicting aims in the complete design process of designing simultaneously for high efficiency and safety assurance throughout an economically viable lifetime with an acceptable level of risk. This article is part of the themed issue 'Multiscale modelling of the structural integrity of composite materials'. © 2016 The Author(s).

  12. Design of a V/STOL propulsion system for a large-scale fighter model

    NASA Technical Reports Server (NTRS)

    Willis, W. S.

    1981-01-01

    Modifications were made to the existing Large-Scale STOL fighter model to simulate a V/STOL configuration. Modifications include the substitution of two-dimensional lift/cruise exhaust nozzles in the nacelles, and the addition of a third J97 engine in the fuselage to supply a remote exhaust nozzle simulating a Remote Augmented Lift System. A preliminary design of the inlet and exhaust ducting for the third engine was developed, and a detailed design of the hot exhaust ducting and remote nozzle was completed.

  13. Implementation of the Large-Scale Operations Management Test in the State of Washington.

    DTIC Science & Technology

    1982-12-01

    During FY 79, the U.S. Army Engineer Waterways Experiment Station (WES), Vicksburg, Miss., completed the first phase of its 3-year Large-Scale Operations Management Test (LSOMT). The LSOMT was designed to develop an operational plan to identify methodologies that can be implemented by the U.S. Army Engineer District, Seattle (NPS), to prevent the exotic aquatic macrophyte Eurasian watermilfoil (Myrophyllum spicatum L.) from reaching problem-level proportions in water bodies in the state of Washington. The WES developed specific plans as integral elements

  14. Probabilistic Multi-Scale, Multi-Level, Multi-Disciplinary Analysis and Optimization of Engine Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2000-01-01

    Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely and engines for new aircraft are progressively required to operate under more demanding technological and environmental requirements. Designs to effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multidisciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior, metal or composite; multi-level refers to integration of participating disciplines to describe a structural response at the scale of interest; multidisciplinary refers to an open-ended set of existing and yet-to-be-developed discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat-transfer and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP). The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission and coupled structural/thermal, various composite property simulators and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of the proposed paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine and aircraft type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent, reduces noise by 15 percent, and yields an order of magnitude improvement in reliability. Composite designs exist to increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.
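    The EST/BEST and MP/ESTOP codes are not shown in this abstract; the probabilistic part can nevertheless be illustrated generically: sample scatter in design parameters, propagate each sample through a response model, and report the resulting scatter range. The cantilever-blade frequency formula and the distributions below are illustrative stand-ins, not the code's models.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Illustrative stand-in response: first bending frequency of a uniform cantilever
# blade, f = (1.875^2 / (2*pi)) * sqrt(E * (I/A) / (rho * L^4)).
E   = rng.normal(1.10e11, 0.05e11, N)   # modulus with manufacturing scatter, Pa
rho = rng.normal(4.43e3, 0.10e3, N)     # density scatter, kg/m^3
L   = rng.normal(0.30, 0.003, N)        # blade length scatter, m
I_A = 1.0e-6                            # area moment / area ratio (held fixed), m^2

freq = (1.875 ** 2 / (2 * np.pi)) * np.sqrt(E * I_A / (rho * L ** 4))

lo, hi = np.percentile(freq, [0.135, 99.865])   # roughly a +/- 3-sigma scatter range
print(f"mean frequency {freq.mean():.1f} Hz, scatter range [{lo:.1f}, {hi:.1f}] Hz")
```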

  15. Disturbance Frequency Determines Morphology and Community Development in Multi-Species Biofilm at the Landscape Scale

    PubMed Central

    Milferstedt, Kim; Santa-Catalina, Gaëlle; Godon, Jean-Jacques; Escudié, Renaud; Bernet, Nicolas

    2013-01-01

    Many natural and engineered biofilm systems periodically face disturbances. Here we present how the recovery time of a biofilm between disturbances (expressed as disturbance frequency) shapes the development of morphology and community structure in a multi-species biofilm at the landscape scale. It was hypothesized that a high disturbance frequency favors the development of a stable adapted biofilm system while a low disturbance frequency promotes a dynamic biofilm response. Biofilms were grown in laboratory-scale reactors over a period of 55-70 days and exposed to the biocide monochloramine at two frequencies: daily or weekly pulse injections. One untreated reactor served as control. Biofilm morphology and community structure were followed on comparably large biofilm areas at the landscape scale using automated image analysis (spatial gray level dependence matrices) and community fingerprinting (single-strand conformation polymorphisms). We demonstrated that a weekly disturbed biofilm developed a resilient morphology and community structure. Immediately after the disturbance, the biofilm simplified but recovered its initial complex morphology and community structure between two biocide pulses. In the daily treated reactor, one organism largely dominated a morphologically simple and stable biofilm. Disturbances primarily affected the abundance distribution of already present bacterial taxa but did not promote growth of previously undetected organisms. Our work indicates that disturbances can be used as lever to engineer biofilms by maintaining a biofilm between two developmental states. PMID:24303024
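    Spatial gray level dependence matrices are better known as gray-level co-occurrence matrices; the sketch below builds one for a single pixel offset and derives the Haralick contrast feature, which is low for smooth fields and high for spatially uncorrelated ones. This is a generic illustration, not the authors' image-analysis pipeline.

```python
import numpy as np

def glcm(image, levels=8, offset=(0, 1)):
    """Gray-level co-occurrence (spatial gray level dependence) matrix for one
    nonnegative pixel offset, normalized to a joint probability distribution."""
    q = np.floor(image / image.max() * (levels - 1e-9)).astype(int)  # quantize
    dr, dc = offset
    ref = q[:q.shape[0] - dr, :q.shape[1] - dc]   # reference pixels
    nbr = q[dr:, dc:]                             # neighbors at the given offset
    P = np.zeros((levels, levels))
    np.add.at(P, (ref.ravel(), nbr.ravel()), 1.0)
    return P / P.sum()

def contrast(P):
    """Haralick contrast: sum over (i - j)^2 * P(i, j)."""
    i, j = np.indices(P.shape)
    return float(((i - j) ** 2 * P).sum())

rng = np.random.default_rng(3)
smooth = np.cumsum(rng.random((64, 64)), axis=1)   # smooth horizontal gradient
noisy = rng.random((64, 64))                       # spatially uncorrelated field
print("contrast, smooth field:", round(contrast(glcm(smooth)), 3))
print("contrast, noisy field: ", round(contrast(glcm(noisy)), 3))
```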

  16. Rocket University at KSC

    NASA Technical Reports Server (NTRS)

    Sullivan, Steven J.

    2014-01-01

    "Rocket University" is an exciting new initiative at Kennedy Space Center led by NASA's Engineering and Technology Directorate. This hands-on experience has been established to develop, refine & maintain targeted flight engineering skills to enable the Agency and KSC strategic goals. Through "RocketU", KSC is developing a nimble, rapid flight engineering life cycle systems knowledge base. Ongoing activities in RocketU develop and test new technologies and potential customer systems through small scale vehicles, build and maintain flight experience through balloon and small-scale rocket missions, and enable a revolving fresh perspective of engineers with hands on expertise back into the large scale NASA programs, providing a more experienced multi-disciplined set of systems engineers. This overview will define the Program, highlight aspects of the training curriculum, and identify recent accomplishments and activities.

  17. Taking Stock: Existing Resources for Assessing a New Vision of Science Learning

    ERIC Educational Resources Information Center

    Alonzo, Alicia C.; Ke, Li

    2016-01-01

    A new vision of science learning described in the "Next Generation Science Standards"--particularly the science and engineering practices and their integration with content--pose significant challenges for large-scale assessment. This article explores what might be learned from advances in large-scale science assessment and…

  18. The bungling giant: Atomic Energy Canada Limited and next-generation nuclear technology, 1980--1994

    NASA Astrophysics Data System (ADS)

    Slater, Ian James

    From 1980 to 1994, Atomic Energy Canada Limited (AECL), the Crown Corporation responsible for the development of nuclear technology in Canada, ventured into the market for small-scale, decentralized power systems with the Slowpoke Energy System (SES), a 10MW nuclear reactor for space heating in urban and remote areas. The SES was designed to be "passively" or "inherently" safe, such that even the most catastrophic failure of the system would not result in a serious accident (e.g. a meltdown or an explosion). This Canadian initiative, a beneficiary of the National Energy Program, was the first and by far the most successful attempt at a passively safe, decentralized nuclear power system anywhere in the world. Part one uses archival documentation and interviews with project leaders to reconstruct the history of the SES. The standard explanations for the failure of the project, cheap oil, public resistance to the technology, and lack of commercial expertise, are rejected. Part two presents an alternative explanation for the failure of AECL to commercialize the SES. In short, technological momentum towards large-scale nuclear designs led to structural restrictions for the SES project. These restrictions manifested themselves internally to the company (e.g., marginalization of the SES) and externally to the company (e.g., licensing). In part three, the historical lessons of the SES are used to refine one of the central tenets of Popper's political philosophy, "piecemeal social engineering." Popper's presentation of the idea is lacking in detail; the analysis of the SES provides some empirical grounding for the concept. I argue that the institutions surrounding traditional nuclear power represent a form of utopian social engineering, leading to consequences such as the suspension of civil liberties to guarantee security of the technology. The SES project was an example of a move from the utopian social engineering of large-scale centralized nuclear technology to the piecemeal social engineering of small-scale, safer and simpler decentralized nuclear heating.

  19. Integrating Delta Building Physics & Economics: Optimizing the Scale of Engineered Avulsions in the Mississippi River Delta

    NASA Astrophysics Data System (ADS)

    Kenney, M. A.; Mohrig, D.; Hobbs, B. F.; Parker, G.

    2011-12-01

    Land loss in the Mississippi River Delta caused by subsidence and erosion has resulted in habitat loss, interference with human activities, and increased exposure of New Orleans and other settled areas to storm surge risks. Prior to dam and levee building and oil and gas production in the 20th century, the long term rates of land building roughly balanced land loss through subsidence. Now, however, sediment is being deposited at dramatically lower rates in shallow areas in and adjacent to the Delta, with much of the remaining sediment borne by the Mississippi being lost to the deep areas of the Gulf of Mexico. A few projects have been built in order to divert sediment from the river to areas where land can be built, and many more are under consideration as part of State of Louisiana and Federal planning processes. Most are small scale, although there have been some proposals for large engineered avulsions that would divert a significant fraction of the remaining available sediment (W. Kim, et al. 2009, EOS). However, there is debate over whether small or large diversions are the economically optimal and socially most acceptable size of such land building projects. From an economic point of view, the optimal size involves tradeoffs between scale economies in civil work construction, the relationship between depth of diversion and sediment concentration in river water, effects on navigation, and possible diminishing returns to land building at a single location as the edge of built land progresses into deeper waters. Because land building efforts could potentially involve billions of dollars of investment, it is important to gain as much benefit as possible from those expenditures. We present the result of a general analysis of scale economies in land building from engineered avulsions. The analysis addresses the question: how many projects of what size should be built at what time in order to maximize the amount of land built by a particular time? The analysis integrates three models: 1. coarse sediment diversion as a function of the width, depth, and timing of water diversions (using our field measurements of sediment concentration as a function of depth), 2. land building as a function of the location, water, and amount of sediment diverted, accounting for bathymetry, subsidence, and other factors, and 3. cost of building and operating the necessary civil works. Our statistical analysis of past diversions indicates the existence of scale economies in width and diseconomies of scale in depth. The analysis explores general relationships between size, cost, and land building, and does not consider specific actual project proposals or locations. Sensitivity to assumptions about fine sediment capture, accumulation rates for organic material, and other inputs will be discussed.
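    No numerical model is given in the abstract; purely to illustrate the trade-off it poses (scale economies favoring a few large diversions versus diminishing land-building returns favoring many small ones), the toy enumeration below finds the number of equal-sized diversions that maximizes land built for a fixed budget. Every functional form and coefficient here is hypothetical, not from the study.

```python
# Toy trade-off: n equal diversions share a fixed budget. A fixed cost per project
# creates scale economies (fewer, larger diversions amortize it), while land built
# per diversion shows diminishing returns in diversion width.
BUDGET = 2.0e9                      # total capital available, $ (hypothetical)
FIXED, UNIT_COST = 2.0e8, 1.0e6     # cost per diversion = FIXED + UNIT_COST * width
LAND_COEF, LAND_EXP = 0.05, 0.6     # land (km^2) per diversion = LAND_COEF * width^LAND_EXP

def width_for_budget(n):
    """Width (m) each of n equal diversions can afford from the shared budget."""
    return (BUDGET / n - FIXED) / UNIT_COST

def total_land(n):
    w = width_for_budget(n)
    return n * LAND_COEF * w ** LAND_EXP if w > 0 else 0.0

candidates = range(1, 10)
best = max(candidates, key=total_land)
for n in candidates:
    print(f"{n} diversions -> {total_land(n):5.2f} km^2")
print("best number of equal-sized diversions:", best)   # interior optimum
```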

  20. Wind Tunnel Tests of Large- and Small-Scale Rotor Hubs and Pylons

    DTIC Science & Technology

    1981-04-01

    ...formed by the engine nacelle and the nose gearbox. A fairing was wrapped around the nose gearbox and blended into the top and bottom of the engine... The variation is due to the FABs installation, which acts like a small winglet. The large excursion ahead of Station 200 is due to the wing; the flow...

  1. Transportation Big Data: Unbiased Analysis and Tools to Inform Sustainable Transportation Decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Today, transportation operation and energy systems data are generated at an unprecedented scale. The U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is the go-to source for expertise in providing data and analysis to inform industry and government transportation decision making. The lab's teams of data experts and engineers are mining and analyzing large sets of complex data -- or 'big data' -- to develop solutions that support the research, development, and deployment of market-ready technologies that reduce fuel consumption and greenhouse gas emissions.

  2. Large space telescope engineering scale model optical design

    NASA Technical Reports Server (NTRS)

    Facey, T. A.

    1973-01-01

    The objective is to develop the detailed design and tolerance data for the LST engineering scale model optical system. This will enable MSFC to move forward to the optical element procurement phase and also to evaluate tolerances, manufacturing requirements, assembly/checkout procedures, reliability, operational complexity, stability requirements of the structure and thermal system, and the flexibility to change and grow.

  3. Study of an engine flow diverter system for a large scale ejector powered aircraft model

    NASA Technical Reports Server (NTRS)

    Springer, R. J.; Langley, B.; Plant, T.; Hunter, L.; Brock, O.

    1981-01-01

    Requirements were established for a conceptual design study to analyze and design an engine flow diverter system and to include accommodations for an ejector system in an existing 3/4 scale fighter model equipped with YJ-79 engines. Model constraints were identified and cost-effective limited modification was proposed to accept the ejectors, ducting and flow diverter valves. Complete system performance was calculated and a versatile computer program capable of analyzing any ejector system was developed.

  4. Flat-plate solar array project. Volume 6: Engineering sciences and reliability

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.; Smokler, M. I.

    1986-01-01

    The Flat-Plate Solar Array (FSA) Project activities directed at developing the engineering technology base required to achieve modules that meet the functional, safety, and reliability requirements of large-scale terrestrial photovoltaic systems applications are reported. These activities included: (1) development of functional, safety, and reliability requirements for such applications; (2) development of the engineering analytical approaches, test techniques, and design solutions required to meet the requirements; (3) synthesis and procurement of candidate designs for test and evaluation; and (4) performance of extensive testing, evaluation, and failure analysis to define design shortfalls and, thus, areas requiring additional research and development. A summary of the approach and technical outcome of these activities is provided along with a complete bibliography of the published documentation covering the detailed accomplishments and technologies developed.

  5. An ecological compass for planetary engineering.

    PubMed

    Haqq-Misra, Jacob

    2012-10-01

    Proposals to address present-day global warming through the large-scale application of technology to the climate system, known as geoengineering, raise questions of environmental ethics relevant to the broader issue of planetary engineering. These questions have also arisen in the scientific literature as discussions of how to terraform a planet such as Mars or Venus in order to make it more Earth-like and habitable. Here we draw on insights from terraforming and environmental ethics to develop a two-axis comparative tool for ethical frameworks that considers the intrinsic or instrumental value placed upon organisms, environments, planetary systems, or space. We apply this analysis to the realm of planetary engineering, such as terraforming on Mars or geoengineering on present-day Earth, as well as to questions of planetary protection and space exploration.

  6. Heuristic decomposition for non-hierarchic systems

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.; Hajela, P.

    1991-01-01

    Design and optimization is substantially more complex in multidisciplinary and large-scale engineering applications due to the existing inherently coupled interactions. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable for nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.

  7. Fast Bound Methods for Large Scale Simulation with Application for Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Patera, Anthony T.; Peraire, Jaime; Zang, Thomas A. (Technical Monitor)

    2002-01-01

    In this work, we have focused on fast bound methods for large scale simulation with application for engineering optimization. The emphasis is on the development of techniques that provide both very fast turnaround and a certificate of fidelity; these attributes ensure that the results are indeed relevant to - and trustworthy within - the engineering context. The bound methodology which underlies this work has many different instantiations: finite element approximation; iterative solution techniques; and reduced-basis (parameter) approximation. In this grant we have, in fact, treated all three, but most of our effort has been concentrated on the first and third. We describe these briefly below, with a pointer to an Appendix which describes, in some detail, the current "state of the art."

  8. Mobility Data Analytics Center.

    DOT National Transportation Integrated Search

    2016-01-01

    Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate large-scale data for smart decision making. Integrating and learning the massive data are the key to the data engine. The ultimate goal of underst...

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.

    This presentation will examine process systems engineering R&D needs for application to advanced fossil energy (FE) systems and highlight ongoing research activities at the National Energy Technology Laboratory (NETL) under the auspices of a recently launched Collaboratory for Process & Dynamic Systems Research. The three current technology focus areas include: 1) High-fidelity systems with NETL's award-winning Advanced Process Engineering Co-Simulator (APECS) technology for integrating process simulation with computational fluid dynamics (CFD) and virtual engineering concepts, 2) Dynamic systems with R&D on plant-wide IGCC dynamic simulation, control, and real-time training applications, and 3) Systems optimization including large-scale process optimization, stochastic simulation for risk/uncertainty analysis, and cost estimation. Continued R&D aimed at these and other key process systems engineering models, methods, and tools will accelerate the development of advanced gasification-based FE systems and produce increasingly valuable outcomes for DOE and the Nation.

  10. Reducing aeration energy consumption in a large-scale membrane bioreactor: Process simulation and engineering application.

    PubMed

    Sun, Jianyu; Liang, Peng; Yan, Xiaoxu; Zuo, Kuichang; Xiao, Kang; Xia, Junlin; Qiu, Yong; Wu, Qing; Wu, Shijia; Huang, Xia; Qi, Meng; Wen, Xianghua

    2016-04-15

    Reducing the energy consumption of membrane bioreactors (MBRs) is highly important for their wider application in wastewater treatment engineering. Of particular significance is reducing aeration in aerobic tanks to reduce the overall energy consumption. This study proposed an in situ ammonia-N-based feedback control strategy for aeration in aerobic tanks; this was tested via model simulation and through a large-scale (50,000 m³/d) engineering application. A full-scale MBR model was developed based on the activated sludge model (ASM) and was calibrated to the actual MBR. The aeration control strategy took the form of a two-step cascaded proportional-integral (PI) feedback algorithm. Algorithmic parameters were optimized via model simulation. The strategy achieved real-time adjustment of aeration amounts based on feedback from effluent quality (i.e., ammonia-N). The effectiveness of the strategy was evaluated through both the model platform and the full-scale engineering application. In the former, the aeration flow rate was reduced by 15-20%. In the engineering application, the aeration flow rate was reduced by 20%, and the overall specific energy consumption was correspondingly reduced by 4% to 0.45 kWh/m³ of effluent, using the present practice of regulating the angle of guide vanes of fixed-frequency blowers. Potential energy savings are expected to be higher for MBRs with variable-frequency blowers. This study indicated that the ammonia-N-based aeration control strategy holds promise for application in full-scale MBRs. Copyright © 2016 Elsevier Ltd. All rights reserved.
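
    The two-step cascaded PI strategy lends itself to a compact illustration. The sketch below assumes an outer loop that converts the ammonia-N error into a dissolved-oxygen (DO) setpoint and an inner loop that converts the DO error into an air flow command; all gains, limits, and units are invented for illustration and are not the calibrated values from the study.

```python
class PI:
    """Clamped proportional-integral controller."""
    def __init__(self, kp, ki, out_min, out_max):
        self.kp, self.ki = kp, ki
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def step(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        out = self.kp * error + self.ki * self.integral
        return min(max(out, self.out_min), self.out_max)   # clamp to actuator range

# Outer loop: ammonia-N (mg/L) -> DO setpoint (mg/L); negative gains because
# ammonia-N above its target should raise the DO setpoint.
outer = PI(kp=-0.5, ki=-0.01, out_min=0.5, out_max=3.0)
# Inner loop: DO (mg/L) -> blower air flow (m3/h).
inner = PI(kp=5000.0, ki=50.0, out_min=2000.0, out_max=20000.0)

def aeration_command(nh4_target, nh4_measured, do_measured, dt=60.0):
    do_setpoint = outer.step(nh4_target, nh4_measured, dt)
    return inner.step(do_setpoint, do_measured, dt)

# Effluent ammonia-N above a 1.0 mg/L target drives the air flow upward.
print(aeration_command(nh4_target=1.0, nh4_measured=2.5, do_measured=1.2))
```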

  11. Manufacturing Process Developments for Regeneratively-Cooled Channel Wall Rocket Nozzles

    NASA Technical Reports Server (NTRS)

    Gradl, Paul; Brandsmeier, Will

    2016-01-01

    Regeneratively cooled channel wall nozzles incorporate a series of integral coolant channels to contain the coolant, maintain adequate wall temperatures, and expand hot gas to provide engine thrust and specific impulse. NASA has been evaluating manufacturing techniques targeting large scale channel wall nozzles to support affordability of current and future liquid rocket engine nozzles and thrust chamber assemblies. The development of these large scale manufacturing techniques focuses on liner formation, channel slotting with advanced abrasive water-jet milling techniques, and closeout of the coolant channels to replace or augment other cost reduction techniques being evaluated for nozzles. NASA is developing a series of channel closeout techniques including large scale additive manufacturing laser deposition and explosively bonded closeouts. A series of subscale nozzles was completed to evaluate these processes. Fabrication of mechanical test and metallography samples, in addition to subscale hardware, has focused on Inconel 625, 300-series stainless steel, and aluminum alloys, as well as other candidate materials. Evaluations of these techniques are demonstrating potential for significant cost reductions for large scale nozzles and chambers. Hot fire testing using these techniques is planned for the future.

  12. Large-Scale Wind-Tunnel Tests of Exhaust Ingestion Due to Thrust Reversal on a Four-Engine Jet Transport during Ground Roll

    NASA Technical Reports Server (NTRS)

    Tolhurst, William H., Jr.; Hickey, David H.; Aoyagi, Kiyoshi

    1961-01-01

    Wind-tunnel tests have been conducted on a large-scale model of a swept-wing jet transport type airplane to study the factors affecting exhaust gas ingestion into the engine inlets when thrust reversal is used during ground roll. The model was equipped with four small jet engines mounted in nacelles beneath the wing. The tests included studies of both cascade and target type reversers. The data obtained included the free-stream velocity at the occurrence of exhaust gas ingestion in the outboard engine and the increment of drag due to thrust reversal for various modifications of thrust reverser configuration. Motion picture films of smoke flow studies were also obtained to supplement the data. The results show that the free-stream velocity at which ingestion occurred in the outboard engines could be reduced considerably, by simple modifications to the reversers, without reducing the effective drag due to reversed thrust.

  13. Large-Scale Operations Management Test of Use of The White Amur for Control of Problem Aquatic Plants. Report 1. Baseline Studies. Volume V. The Herpetofauna of Lake Conway, Florida.

    DTIC Science & Technology

    1981-06-01

    University of South Florida, Tampa, Department of Biology, for the U.S. Army Engineer Waterways Experiment Station, P.O. Box 631, Vicksburg, Miss. Report 1 of a series: Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants.

  14. Large Pilot-Scale Carbon Dioxide (CO2) Capture Project Using Aminosilicone Solvent.Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hancu, Dan

    GE Global Research has developed, over the last 8 years, a platform of cost-effective CO2 capture technologies based on a non-aqueous aminosilicone solvent (GAP-1m). As demonstrated in previously funded DOE projects (DE-FE0007502 and DE-FE0013755), the GAP-1m solvent has a higher CO2 working capacity and lower volatility and corrosivity than the benchmark aqueous amine technology. Performance of the GAP-1m solvent was recently demonstrated in a 0.5 MWe pilot at the National Carbon Capture Center, AL, with real flue gas for over 500 hours of operation using a Steam Stripper Column (SSC). The pilot-scale PSTU engineering data were used to (i) update the techno-economic analysis and EH&S assessment, (ii) perform a technology gap analysis, and (iii) conduct the solvent manufacturability and scale-up study.

  15. Genome scale engineering techniques for metabolic engineering.

    PubMed

    Liu, Rongming; Bassalo, Marcelo C; Zeitoun, Ramsey I; Gill, Ryan T

    2015-11-01

    Metabolic engineering has expanded from a focus on designs requiring a small number of genetic modifications to increasingly complex designs driven by advances in genome-scale engineering technologies. Metabolic engineering has been generally defined by the use of iterative cycles of rational genome modifications, strain analysis and characterization, and a synthesis step that fuels additional hypothesis generation. This cycle mirrors the Design-Build-Test-Learn cycle followed throughout various engineering fields that has recently become a defining aspect of synthetic biology. This review will attempt to summarize recent genome-scale design, build, test, and learn technologies and relate their use to a range of metabolic engineering applications. Copyright © 2015 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  16. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  17. Multi-Scale Validation of a Nanodiamond Drug Delivery System and Multi-Scale Engineering Education

    ERIC Educational Resources Information Center

    Schwalbe, Michelle Kristin

    2010-01-01

    This dissertation has two primary concerns: (i) evaluating the uncertainty and prediction capabilities of a nanodiamond drug delivery model using Bayesian calibration and bias correction, and (ii) determining conceptual difficulties of multi-scale analysis from an engineering education perspective. A Bayesian uncertainty quantification scheme…

  18. Space transportation booster engine thrust chamber technology, large scale injector

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.

    1993-01-01

    The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.

  19. Acoustic characteristics of a large-scale wind tunnel model of an upper-surface blown flap transport having two engines

    NASA Technical Reports Server (NTRS)

    Falarski, M. D.; Aoyagi, K.; Koenig, D. G.

    1973-01-01

    The upper-surface blown (USB) flap as a powered-lift concept has evolved because of the potential acoustic shielding provided when turbofan engines are installed on a wing upper surface. The results from a wind tunnel investigation of a large-scale USB model powered by two JT15D-1 turbofan engines are presented. The effects of coanda flap extent and deflection, forward speed, and exhaust nozzle configuration were investigated. To determine the wing shielding, the acoustics of a single engine nacelle removed from the model were also measured. Effective shielding occurred in the aft underwing quadrant. In the forward quadrant the shielding of the high frequency noise was counteracted by an increase in the lower frequency wing-exhaust interaction noise. The fuselage provided shielding of the opposite engine noise such that the difference between single and double engine operation was 1.5 PNdB under the wing. The effects of coanda flap deflection and extent, angle of attack, and forward speed were small. Forward speed reduced the perceived noise level (PNL) by reducing the wing-exhaust interaction noise.

  20. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  1. NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity.

    PubMed

    Al-Awami, Ali K; Beyer, Johanna; Strobelt, Hendrik; Kasthuri, Narayanan; Lichtman, Jeff W; Pfister, Hanspeter; Hadwiger, Markus

    2014-12-01

    We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.

  2. Multidisciplinary propulsion simulation using the numerical propulsion system simulator (NPSS)

    NASA Technical Reports Server (NTRS)

    Claus, Russel W.

    1994-01-01

    Implementing new technology in aerospace propulsion systems is becoming prohibitively expensive. One of the major contributors to the high cost is the need to perform many large scale system tests. The traditional design analysis procedure decomposes the engine into isolated components and focuses attention on each single physical discipline (e.g., fluid or structural dynamics). Consequently, the interactions that naturally occur between components and disciplines can be masked by the limited interactions that occur between the individuals or teams doing the design, and must be uncovered during expensive engine testing. This overview will discuss a cooperative effort of NASA, industry, and universities to integrate disciplines, components, and high performance computing into a Numerical Propulsion System Simulator (NPSS).

  3. Decoupling local mechanics from large-scale structure in modular metamaterials.

    PubMed

    Yang, Nan; Silverberg, Jesse L

    2017-04-04

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such "inverse design" is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module's design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  4. Decoupling local mechanics from large-scale structure in modular metamaterials

    NASA Astrophysics Data System (ADS)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  5. Challenges and opportunities in synthetic biology for chemical engineers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, YZ; Lee, JK; Zhao, HM

    Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement. (C) 2012 Elsevier Ltd. All rights reserved.

  6. Challenges and opportunities in synthetic biology for chemical engineers

    PubMed Central

    Luo, Yunzi; Lee, Jung-Kul; Zhao, Huimin

    2012-01-01

    Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement. PMID:24222925

  7. Challenges and opportunities in synthetic biology for chemical engineers.

    PubMed

    Luo, Yunzi; Lee, Jung-Kul; Zhao, Huimin

    2013-11-15

    Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement.

  8. A hierarchical-multiobjective framework for risk management

    NASA Technical Reports Server (NTRS)

    Haimes, Yacov Y.; Li, Duan

    1991-01-01

    A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. The framework unites the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components essentially consist of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.

  9. Rational function representation of flap noise spectra including correction for reflection effects. [acoustic properties of engine exhaust jets deflected for externally blown flaps

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1974-01-01

    A rational function is presented for the acoustic spectra generated by deflection of engine exhaust jets for under-the-wing and over-the-wing versions of externally blown flaps. The functional representation is intended to provide a means for compact storage of data and for data analysis. The expressions are based on Fourier transform functions for the Strouhal normalized pressure spectral density, and on a correction for reflection effects based on the N-independent-source model of P. Thomas extended by use of a reflected ray transfer function. Curve fit comparisons are presented for blown flap data taken from turbofan engine tests and from large scale cold-flow model tests. Application of the rational function to scrubbing noise theory is also indicated.
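
    As a rough illustration of storing a measured spectrum compactly as a rational function of Strouhal number, the sketch below fits a low-order rational form to synthetic data with scipy. The functional form, the coefficients, and the synthetic "data" are placeholders, not the specific Fourier-transform-based expressions or reflection correction of the report.

```python
import numpy as np
from scipy.optimize import curve_fit

def rational_spectrum(st, a0, a1, b1, b2):
    """Low-order rational function of Strouhal number st."""
    return (a0 + a1 * st) / (1.0 + b1 * st + b2 * st**2)

st = np.logspace(-1, 1, 40)                         # Strouhal numbers 0.1 .. 10
rng = np.random.default_rng(0)
data = rational_spectrum(st, 1.0, 0.3, 0.8, 0.5) + 0.02 * rng.normal(size=st.size)

coeffs, _ = curve_fit(rational_spectrum, st, data, p0=[1.0, 0.0, 1.0, 1.0])
print("fitted coefficients:", coeffs.round(3))      # whole spectrum stored as 4 numbers
```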

  10. Airframe-Jet Engine Integration Noise

    NASA Technical Reports Server (NTRS)

    Tam, Christopher; Antcliff, Richard R. (Technical Monitor)

    2003-01-01

    It has been found experimentally that the noise radiated by a jet mounted under the wing of an aircraft exceeds that of the same jet in a stand-alone environment. The increase in noise is referred to as jet engine airframe integration noise. The objectives of the present investigation are: (1) to obtain a better understanding of the physical mechanisms responsible for jet engine airframe integration noise, or installation noise; and (2) to develop a prediction model for jet engine airframe integration noise. It is known that jet mixing noise consists of two principal components: the noise from the large turbulence structures of the jet flow and the noise from the fine scale turbulence. In this investigation, only the effect of jet engine airframe interaction on the fine scale turbulence noise of a jet is studied. The fine scale turbulence noise is the dominant noise component in the sideline direction. Thus we limit our consideration primarily to the sideline.

  11. Molecular Dynamics Visualization (MDV): Stereoscopic 3D Display of Biomolecular Structure and Interactions Using the Unity Game Engine.

    PubMed

    Wiebrands, Michael; Malajczuk, Chris J; Woods, Andrew J; Rohl, Andrew L; Mancera, Ricardo L

    2018-06-21

    Molecular graphics systems are visualization tools which, upon integration into a 3D immersive environment, provide a unique virtual reality experience for research and teaching of biomolecular structure, function and interactions. We have developed a molecular structure and dynamics application, the Molecular Dynamics Visualization tool, that uses the Unity game engine combined with large scale, multi-user, stereoscopic visualization systems to deliver an immersive display experience, particularly with a large cylindrical projection display. The application is structured to separate the biomolecular modeling and visualization systems. The biomolecular model loading and analysis system was developed as a stand-alone C# library and provides the foundation for the custom visualization system built in Unity. All visual models displayed within the tool are generated using Unity-based procedural mesh building routines. A 3D user interface was built to allow seamless dynamic interaction with the model while being viewed in 3D space. Biomolecular structure analysis and display capabilities are exemplified with a range of complex systems involving cell membranes, protein folding and lipid droplets.

  12. Punctuated Evolution of Influenza Virus Neuraminidase (A/H1N1) under Opposing Migration and Vaccination Pressures

    PubMed Central

    Phillips, J. C.

    2014-01-01

    Influenza virus contains two highly variable envelope glycoproteins, hemagglutinin (HA) and neuraminidase (NA). The structure and properties of HA, which is responsible for binding the virus to the cell that is being infected, change significantly when the virus is transmitted from avian or swine species to humans. Here we focus first on the simpler problem of the much smaller human individual evolutionary amino acid mutational changes in NA, which cleaves sialic acid groups and is required for influenza virus replication. Our thermodynamic panorama shows that very small amino acid changes can be monitored very accurately across many historic (1945–2011) Uniprot and NCBI strains using hydropathicity scales to quantify the roughness of water film packages. Quantitative sequential analysis is most effective with the fractal differential hydropathicity scale based on protein self-organized criticality (SOC). Our analysis shows that large-scale vaccination programs have been responsible for a very large convergent reduction in common influenza severity in the last century. Hydropathic analysis is capable of interpreting and even predicting trends of functional changes in mutation prolific viruses directly from amino acid sequences alone. An engineered strain of NA1 is described which could well be significantly less virulent than current circulating strains. PMID:25143953
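
    The sequence-only style of analysis described here can be sketched as a sliding-window hydropathicity profile. The standard Kyte-Doolittle scale and a variance-based roughness proxy are used below purely as stand-ins; the paper's analysis relies on a different, SOC-based fractal scale, and the sequence fragment is illustrative rather than a real strain.

```python
# Kyte-Doolittle hydropathicity values (stand-in scale for this sketch).
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def hydropathy_profile(seq, window=9):
    """Mean hydropathicity in a sliding window along the sequence."""
    scores = [KD[aa] for aa in seq]
    half = window // 2
    return [sum(scores[i - half:i + half + 1]) / window
            for i in range(half, len(scores) - half)]

def roughness(profile):
    """Crude roughness proxy: variance of the windowed profile."""
    mean = sum(profile) / len(profile)
    return sum((p - mean) ** 2 for p in profile) / len(profile)

fragment = "MNPNQKIITIGSVSLTIATVCFLMQIAILVTTVTLHF"   # illustrative fragment only
print(round(roughness(hydropathy_profile(fragment)), 3))
```

    Comparing such a roughness value across dated strains is the kind of year-by-year trend the abstract describes, although the published analysis uses its own scale and normalization.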

  13. Degree program changes and curricular flexibility: Addressing long held beliefs about student progression

    NASA Astrophysics Data System (ADS)

    Ricco, George Dante

    In higher education, and in engineering education in particular, changing majors is generally considered a negative event - or at least an event with negative consequences. An emergent field of study within engineering education revolves around understanding the factors and processes driving student changes of major. Of key importance to furthering change-of-major research is a grasp of large-scale phenomena occurring throughout multiple systems, knowledge of previous attempts at describing such issues, and the adoption of metrics to probe them effectively. The problem posed is exacerbated by the drive in higher education institutions and among state legislatures to understand and reduce time-to-degree and student attrition. With these factors in mind, insights into large-scale processes that affect student progression are essential to evaluating the success or failure of programs. The goals of this work include describing the current educational research on switchers, identifying core concepts and stumbling blocks in my treatment of switchers, and using the Multiple-Institution Database for Investigating Engineering Longitudinal Development (MIDFIELD) to explore how those who change majors perform as a function of large-scale academic pathways within and without the engineering context. To accomplish these goals, it was first necessary to delve into the recent history of the treatment of switchers within the literature and categorize the approaches taken. While three categories of papers exist in the literature concerning change of major, all three may or may not be applicable to a given database of students or even a single institution. Furthermore, while the term has been coined in the literature, no portable metric for discussing large-scale navigational flexibility exists in engineering education. What such a metric would look like is discussed, as well as the delimitations involved. The results and subsequent discussion include a description of changes of major and how they may or may not have a deleterious effect on one's academic pathway, the special context of changes of major in the pathways of students within first-year engineering programs and of students labeled as undecided, an exploration of curricular flexibility through the construction of a novel metric, and proposed future work.

  14. Women in Engineering in Turkey--A Large Scale Quantitative and Qualitative Examination

    ERIC Educational Resources Information Center

    Smith, Alice E.; Dengiz, Berna

    2010-01-01

    The underrepresentation of women in engineering is well known and unresolved. However, Turkey has witnessed a shift in trend from virtually no female participation in engineering to across-the-board proportions that dominate other industrialised countries within the 76 years of the founding of the Turkish Republic. This paper describes the largest…

  15. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume IV. Nitrogen and Phosphorus Dynamics of the Lake Conway Ecosystem: Loading Budgets and a Dynamic Hydrologic Phosphorus Model.

    DTIC Science & Technology

    1982-08-01

    University of Florida, Gainesville, Department of Environmental Engineering. ...Conway ecosystem and is part of the Large-Scale Operations Management Test (LSOMT) of the Aquatic Plant Control Research Program (APCRP) at the WES...should be cited as follows: Blancher, E. C., II, and Fellows, C. R. 1982. "Large-Scale Operations Management Test of Use of the White Amur for Control

  16. Predicting Performance in a First Engineering Calculus Course: Implications for Interventions

    ERIC Educational Resources Information Center

    Hieb, Jeffrey L.; Lyle, Keith B.; Ralston, Patricia A. S.; Chariker, Julia

    2015-01-01

    At the University of Louisville, a large, urban institution in the south-east United States, undergraduate engineering students take their mathematics courses from the school of engineering. In the fall of their freshman year, engineering students take "Engineering Analysis I," a calculus-based engineering analysis course. After the…

  17. Aerodynamic Design of a Dual-Flow Mach 7 Hypersonic Inlet System for a Turbine-Based Combined-Cycle Hypersonic Propulsion System

    NASA Technical Reports Server (NTRS)

    Sanders, Bobby W.; Weir, Lois J.

    2008-01-01

    A new hypersonic inlet for a turbine-based combined-cycle (TBCC) engine has been designed. This split-flow inlet is designed to provide flow to an over-under propulsion system with turbofan and dual-mode scramjet engines for flight from takeoff to Mach 7. It utilizes a variable-geometry ramp, high-speed cowl lip rotation, and a rotating low-speed cowl that serves as a splitter to divide the flow between the low-speed turbofan and the high-speed scramjet and to isolate the turbofan at high Mach numbers. The low-speed inlet was designed for Mach 4, the maximum mode transition Mach number. Integration of the Mach 4 inlet into the Mach 7 inlet imposed significant constraints on the low-speed inlet design, including a large amount of internal compression. The inlet design was used to develop mechanical designs for two inlet mode transition test models: small-scale (IMX) and large-scale (LIMX) research models. The large-scale model is designed to facilitate multi-phase testing including inlet mode transition and inlet performance assessment, controls development, and integrated systems testing with turbofan and scramjet engines.

  18. A controls engineering approach for analyzing airplane input-output characteristics

    NASA Technical Reports Server (NTRS)

    Arbuckle, P. Douglas

    1991-01-01

    An engineering approach for analyzing airplane control and output characteristics is presented. State-space matrix equations describing the linear perturbation dynamics are transformed from physical coordinates into scaled coordinates. The scaling is accomplished by applying various transformations to the system to employ prior engineering knowledge of the airplane physics. Two different analysis techniques are then explained. Modal analysis techniques calculate the influence of each system input on each fundamental mode of motion and the distribution of each mode among the system outputs. The optimal steady state response technique computes the blending of steady state control inputs that optimize the steady state response of selected system outputs. Analysis of an example airplane model is presented to demonstrate the described engineering approach.
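
    A minimal numerical sketch of the modal-analysis step: diagonalize the state matrix of a linear perturbation model and tabulate how strongly each input excites each mode and how each mode is distributed over the outputs. The short-period-like A, B, and C matrices below are invented for illustration, not taken from the paper, and the scaling transformations discussed above are omitted.

```python
import numpy as np

A = np.array([[-0.7,  1.0],     # state: [angle of attack, pitch rate] (illustrative)
              [-4.0, -1.2]])
B = np.array([[0.0],
              [-6.0]])          # input: elevator deflection (illustrative)
C = np.eye(2)                   # outputs = states

eigvals, V = np.linalg.eig(A)   # fundamental modes of the perturbation dynamics
Vinv = np.linalg.inv(V)

input_influence = np.abs(Vinv @ B)    # how strongly the input excites each mode
output_distribution = np.abs(C @ V)   # how each mode appears in each output

for k, lam in enumerate(eigvals):
    print(f"mode {k}: eigenvalue {lam:.3f}, input influence {input_influence[k, 0]:.3f}, "
          f"output weights {output_distribution[:, k].round(3)}")
```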

  19. Computer-aided design of large-scale integrated circuits - A concept

    NASA Technical Reports Server (NTRS)

    Schansman, T. T.

    1971-01-01

    Circuit design and mask development sequence are improved by using general purpose computer with interactive graphics capability establishing efficient two way communications link between design engineer and system. Interactive graphics capability places design engineer in direct control of circuit development.

  20. Flat-plate solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The process technology for the manufacture of semiconductor-grade silicon in a large commercial plant by 1986, at a price less than $14 per kilogram of silicon (in 1975 dollars), is discussed. The engineering design, installation, checkout, and operation of an Experimental Process System Development unit are discussed. Quality control during scale-up of the process and an economic analysis of product and production costs are also discussed.

  1. Disaster Response Contracting in a Post-Katrina World: Analyzing Current Disaster Response Strategies and Exploring Alternatives to Improve Processes for Rapid Reaction to Large Scale Disasters within the United States

    DTIC Science & Technology

    2006-12-01

    could benefit tremendously from pre-positioning within the Corps of Engineers' ID/IQ contracts or catalogs for the essential services and commodities...even advisable? 5. Telework, An In-Depth Cost Benefit Analysis: Proactively managed telecommuting programs have been heralded as a cost saving...NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA, MBA PROFESSIONAL REPORT: Disaster Response Contracting in a Post-Katrina World

  2. 23 CFR 940.11 - Project implementation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...

  3. 23 CFR 940.11 - Project implementation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...

  4. 23 CFR 940.11 - Project implementation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...

  5. 23 CFR 940.11 - Project implementation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...

  6. 23 CFR 940.11 - Project implementation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...

  7. Theoretical and Empirical Comparison of Big Data Image Processing with Apache Hadoop and Sun Grid Engine.

    PubMed

    Bao, Shunxing; Weitendorf, Frederick D; Plassard, Andrew J; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A

    2017-02-11

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., "short" processing times and/or "large" datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply "large scale" processing transitions into "big data" and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and non-relevant for medical imaging.
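
    The flavor of the wall-clock comparison can be sketched with two toy timing models: a shared-storage cluster whose NFS link is the serial bottleneck versus a data-local layout that trades bulk transfer for per-job overhead. All constants below are invented for illustration and are not the validated parameters from the paper.

```python
def cluster_wall_clock(n_jobs, cores, compute_s, data_mb, net_mb_per_s):
    """Shared-storage cluster: every job pulls its data over one NFS link."""
    transfer_s = n_jobs * data_mb / net_mb_per_s        # serial network bottleneck
    compute_total_s = -(-n_jobs // cores) * compute_s   # ceil(n_jobs / cores) waves
    return max(transfer_s, compute_total_s)

def hadoop_wall_clock(n_jobs, cores, compute_s, overhead_s=5.0):
    """Data-local layout: no bulk transfer, small per-job scheduling overhead."""
    return -(-n_jobs // cores) * (compute_s + overhead_s)

for n in (100, 1000, 10000):     # job counts spanning "large scale" to "big data"
    trad = cluster_wall_clock(n, cores=209, compute_s=60, data_mb=300, net_mb_per_s=100)
    hdp = hadoop_wall_clock(n, cores=209, compute_s=60)
    print(f"{n:6d} jobs: cluster {trad:8.0f} s   data-local {hdp:8.0f} s")
```

    With short per-job compute times and large per-job data volumes, the shared link saturates and the data-local model wins; with long compute times the traditional cluster remains competitive, which is the crossover the paper's models are built to locate.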

  8. Theoretical and empirical comparison of big data image processing with Apache Hadoop and Sun Grid Engine

    NASA Astrophysics Data System (ADS)

    Bao, Shunxing; Weitendorf, Frederick D.; Plassard, Andrew J.; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A.

    2017-03-01

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., "short" processing times and/or "large" datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply "large scale" processing transitions into "big data" and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and nonrelevant for medical imaging.

  9. Engineering research, development and technology FY99

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langland, R T

    The growth of computer power and connectivity, together with advances in wireless sensing and communication technologies, is transforming the field of complex distributed systems. The ability to deploy large numbers of sensors with a rapid, broadband communication system will enable high-fidelity, near real-time monitoring of complex systems. These technological developments will provide unprecedented insight into the actual performance of engineered and natural environment systems, enable the evolution of many new types of engineered systems for monitoring and detection, and enhance our ability to perform improved and validated large-scale simulations of complex systems. One of the challenges facing engineering is to develop methodologies to exploit the emerging information technologies. Particularly important will be the ability to assimilate measured data into the simulation process in a way which is much more sophisticated than current, primarily ad hoc procedures. The reports contained in this section on the Center for Complex Distributed Systems describe activities related to the integrated engineering of large complex systems. The first three papers describe recent developments for each link of the integrated engineering process for large structural systems. These include (1) the development of model-based signal processing algorithms which will formalize the process of coupling measurements and simulation and provide a rigorous methodology for validation and update of computational models; (2) collaborative efforts with faculty at the University of California at Berkeley on the development of massive simulation models for the earth and large bridge structures; and (3) the development of wireless data acquisition systems which provide a practical means of monitoring large systems like the National Ignition Facility (NIF) optical support structures. These successful developments are coming to a confluence in the next year with applications to NIF structural characterizations and analysis of large bridge structures for the State of California. Initial feasibility investigations into the development of monitoring and detection systems are described in the papers on imaging of underground structures with ground-penetrating radar, and the use of live insects as sensor platforms. These efforts are establishing the basic performance characteristics essential to the decision process for future development of sensor arrays for information gathering related to national security.

  10. Using Relational Reasoning to Learn about Scientific Phenomena at Unfamiliar Scales

    ERIC Educational Resources Information Center

    Resnick, Ilyse; Davatzes, Alexandra; Newcombe, Nora S.; Shipley, Thomas F.

    2016-01-01

    Many scientific theories and discoveries involve reasoning about extreme scales, removed from human experience, such as time in geology, size in nanoscience. Thus, understanding scale is central to science, technology, engineering, and mathematics. Unfortunately, novices have trouble understanding and comparing sizes of unfamiliar large and small…

  11. Using Relational Reasoning to Learn about Scientific Phenomena at Unfamiliar Scales

    ERIC Educational Resources Information Center

    Resnick, Ilyse; Davatzes, Alexandra; Newcombe, Nora S.; Shipley, Thomas F.

    2017-01-01

    Many scientific theories and discoveries involve reasoning about extreme scales, removed from human experience, such as time in geology and size in nanoscience. Thus, understanding scale is central to science, technology, engineering, and mathematics. Unfortunately, novices have trouble understanding and comparing sizes of unfamiliar large and…

  12. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area, field-scale ET mapping. Cloud computing of the Landsat image archive combined with other satellite, climate, and weather data is creating never-imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
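
    A hedged skeleton of this workflow pattern, using the Earth Engine Python API to reduce a year of Landsat Collection 2 surface-temperature imagery to an annual 30 m composite, is sketched below. The "ET fraction" band math is a crude hot/cold scaling that stands in for, and does not reproduce, the SSEBop model; the study region is hypothetical.

```python
import ee

ee.Initialize()   # assumes an authenticated Earth Engine account

region = ee.Geometry.Rectangle([-119.0, 36.0, -118.0, 37.0])   # hypothetical study area

landsat = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
           .filterDate('2017-01-01', '2018-01-01')
           .filterBounds(region))

def et_fraction(img):
    # Placeholder band math: a crude hot/cold scaling of land surface
    # temperature; it is NOT the SSEBop formulation.
    lst = img.select('ST_B10').multiply(0.00341802).add(149.0)  # LST in kelvin
    return ee.Image(325).subtract(lst).divide(25).clamp(0, 1).rename('etf')

annual_etf = landsat.map(et_fraction).mean().clip(region)       # annual 30 m composite

stats = annual_etf.reduceRegion(reducer=ee.Reducer.mean(),
                                geometry=region, scale=30, maxPixels=1e9)
print(stats.getInfo())
```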

  13. Parallel block schemes for large scale least squares computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golub, G.H.; Plemmons, R.J.; Sameh, A.

    1986-04-01

    Large scale least squares computations arise in a variety of scientific and engineering problems, including geodetic adjustments and surveys, medical image analysis, molecular structures, partial differential equations, and substructuring methods in structural engineering. In each of these problems, matrices often arise which possess a block structure which reflects the local connection nature of the underlying physical problem. For example, such super-large nonlinear least squares computations arise in geodesy. Here the coordinates of positions are calculated by iteratively solving overdetermined systems of nonlinear equations by the Gauss-Newton method. The US National Geodetic Survey will complete this year (1986) the readjustment of the North American Datum, a problem which involves over 540 thousand unknowns and over 6.5 million observations (equations). The observation matrix for these least squares computations has a block angular form with 161 diagonal blocks, each containing 3 to 4 thousand unknowns. In this paper parallel schemes are suggested for the orthogonal factorization of matrices in block angular form and for the associated backsubstitution phase of the least squares computations. In addition, a parallel scheme for the calculation of certain elements of the covariance matrix for such problems is described. It is shown that these algorithms are ideally suited for multiprocessors with three levels of parallelism such as the Cedar system at the University of Illinois. 20 refs., 7 figs.
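
    The block-angular structure admits an embarrassingly parallel factorization: each block's local columns are QR-factored independently, the leftover rows form a small coupled problem for the shared unknowns, and the block unknowns are recovered by back-substitution. The sketch below shows the serial logic of that scheme on random data; it is an illustration of the general idea, not the Cedar implementation described in the paper.

```python
import numpy as np

def solve_block_angular(blocks):
    """Least squares for a block-angular system in which block i contributes
    rows [A_i  0 ... 0  C_i | b_i]: A_i holds block-local columns, C_i the
    shared (coupling) columns. Step 1 QR-factors each block independently
    (the parallelizable part); step 2 solves a small coupled problem for the
    shared unknowns y; step 3 back-substitutes the block unknowns x_i."""
    reduced_rows, reduced_rhs, saved = [], [], []
    for A, C, b in blocks:                                # step 1: per-block QR
        n = A.shape[1]
        Q, R = np.linalg.qr(A, mode='complete')
        QtC, Qtb = Q.T @ C, Q.T @ b
        saved.append((R[:n, :], QtC[:n, :], Qtb[:n]))
        reduced_rows.append(QtC[n:, :])
        reduced_rhs.append(Qtb[n:])
    T = np.vstack(reduced_rows)                           # step 2: coupled problem
    d = np.concatenate(reduced_rhs)
    y, *_ = np.linalg.lstsq(T, d, rcond=None)
    xs = [np.linalg.solve(R, c - S @ y) for R, S, c in saved]   # step 3
    return xs, y

rng = np.random.default_rng(1)
shared = 2                                                # number of coupling unknowns
blocks = [(rng.normal(size=(8, 3)), rng.normal(size=(8, shared)), rng.normal(size=8))
          for _ in range(3)]
xs, y = solve_block_angular(blocks)
print("shared unknowns:", y.round(3))
```

    Because the orthogonal transformations in step 1 touch only one block's rows, each block can be handled by a separate processor, which is the source of the parallelism exploited in schemes of this kind.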

  14. Engineering Education for Agricultural and Rural Development in Africa

    ERIC Educational Resources Information Center

    Adewumi, B. A.

    2008-01-01

    Agricultural Engineering has transformed agricultural practices from subsistence level to medium and large-scale production via mechanisation in the developed nations. This has reduced the labour force requirements in agriculture; increased production levels and efficiency, product shelf life and product quality; and resulted in…

  15. Impact Testing and Analysis of Composites for Aircraft Engine Fan Cases

    NASA Technical Reports Server (NTRS)

    Roberts, Gary D.; Revilock, Duane M.; Binienda, Wieslaw K.; Nie, Walter Z.; Mackenzie, S. Ben; Todd, Kevin B.

    2002-01-01

    The fan case in a jet engine is a heavy structure because of its size and because of the requirement that it contain a blade released during engine operation. Composite materials offer the potential for reducing the weight of the case. Efficient design, test, and analysis methods are needed to efficiently evaluate the large number of potential composite materials and design concepts. The type of damage expected in a composite case under blade-out conditions was evaluated using a subscale test in which a glass/epoxy composite half-ring target was impacted with a wedge-shaped titanium projectile. Fiber shearing occurred near points of contact between the projectile and target. Delamination and tearing occurred on a larger scale. These damage modes were reproduced in a simpler test in which flat glass/epoxy composites were impacted with a blunt cylindrical projectile. A surface layer of ceramic eliminated fiber shear fracture but did not reduce delamination. Tests on 3D woven carbon/epoxy composites indicated that transverse reinforcement is effective in reducing delamination. A 91 cm (36 in.) diameter full-ring sub-component was proposed for larger scale testing of these and other composite concepts. Explicit, transient, finite element analyses indicated that a full-ring test is needed to simulate complete impact dynamics, but simpler tests using smaller ring sections are adequate when evaluation of initial impact damage is the primary concern.

  16. Feasibility analysis of large length-scale thermocapillary flow experiment for the International Space Station

    NASA Astrophysics Data System (ADS)

    Alberts, Samantha J.

    The investigation of microgravity fluid dynamics emerged out of necessity with the advent of space exploration. In particular, capillary research took a leap forward in the 1960s with regard to liquid settling and interfacial dynamics. Due to inherent temperature variations in large spacecraft liquid systems, such as fuel tanks, forces develop on gas-liquid interfaces which induce thermocapillary flows. To date, thermocapillary flows have been studied in small, idealized research geometries usually under terrestrial conditions. The 1 to 3 m lengths in current and future large tanks and hardware are designed based on hardware rather than research, which leaves spaceflight systems designers without the technological tools to effectively create safe and efficient designs. This thesis focused on the design and feasibility of a large length-scale thermocapillary flow experiment, which utilizes temperature variations to drive a flow. The design of a helical channel geometry ranging from 1 to 2.5 m in length permits a large length-scale thermocapillary flow experiment to fit in a seemingly small International Space Station (ISS) facility such as the Fluids Integrated Rack (FIR). An initial investigation determined the proposed experiment produced measurable data while adhering to the FIR facility limitations. The computational portion of this thesis focused on the investigation of functional geometries of fuel tanks and depots using Surface Evolver. This work outlines the design of a large length-scale thermocapillary flow experiment for the ISS FIR. The results from this work improve the understanding of thermocapillary flows and thus the technological tools for predicting heat and mass transfer in large length-scale thermocapillary flows. Without the tools to understand the thermocapillary flows in these systems, engineers are forced to design larger, heavier vehicles to assure safety and mission success.

  17. Defense Acquisitions Acronyms and Terms

    DTIC Science & Technology

    2012-12-01

    Computer-Aided Design CADD Computer-Aided Design and Drafting CAE Component Acquisition Executive; Computer-Aided Engineering CAIV Cost As an...Radiation to Ordnance HFE Human Factors Engineering HHA Health Hazard Assessment HNA Host-Nation Approval HNS Host-Nation Support HOL High-Order...Engineering Change Proposal VHSIC Very High Speed Integrated Circuit VLSI Very Large Scale Integration VOC Volatile Organic Compound W WAN Wide

  18. Probabilistic Analysis of Large-Scale Composite Structures Using the IPACS Code

    NASA Technical Reports Server (NTRS)

    Lemonds, Jeffrey; Kumar, Virendra

    1995-01-01

    An investigation was performed to ascertain the feasibility of using IPACS (Integrated Probabilistic Assessment of Composite Structures) for probabilistic analysis of a composite fan blade, the development of which is being pursued by various industries for the next generation of aircraft engines. A model representative of the class of fan blades used in the GE90 engine has been chosen as the structural component to be analyzed with IPACS. In this study, typical uncertainties are assumed in the level, and structural responses for ply stresses and frequencies are evaluated in the form of cumulative probability density functions. Because of the geometric complexity of the blade, the number of plies varies from several hundred at the root to about a hundred at the tip. This represents an extremely complex composites application for the IPACS code. A sensitivity study with respect to various random variables is also performed.
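
    For readers unfamiliar with this style of probabilistic assessment, the sketch below shows the generic idea of propagating assumed ply-level uncertainties into a cumulative distribution function of a structural response. It is not the IPACS code: the response model is a placeholder and all distributions are invented.

```python
# Hedged Monte Carlo sketch: propagate assumed input uncertainties to an
# empirical CDF of a structural response, plus a crude sensitivity measure.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 10_000

# Hypothetical random inputs: ply modulus (GPa) and ply thickness (mm).
modulus = rng.normal(140.0, 7.0, n_samples)
thickness = rng.normal(0.125, 0.005, n_samples)

# Placeholder response model standing in for the finite-element analysis.
response = 1.0e3 / (modulus * thickness)

# Empirical CDF of the response.
sorted_resp = np.sort(response)
cdf = np.arange(1, n_samples + 1) / n_samples

# Crude sensitivity measure: correlation of each input with the response.
for name, x in [("modulus", modulus), ("thickness", thickness)]:
    print(name, np.corrcoef(x, response)[0, 1])
```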

  19. A Method for Capturing and Reconciling Stakeholder Intentions Based on the Formal Concept Analysis

    NASA Astrophysics Data System (ADS)

    Aoyama, Mikio

    Information systems are ubiquitous in our daily life. Thus, information systems need to work appropriately anywhere at any time for everybody. Conventional information systems engineering tends to engineer systems from the viewpoint of systems functionality. However, the diversity of the usage context requires a fundamental change in our current thinking about information systems: from the functionality the systems provide to the goals the systems should achieve. The intentional approach embraces the goals and related aspects of the information systems. This chapter presents a method for capturing, structuring and reconciling diverse goals of multiple stakeholders. The heart of the method lies in the hierarchical structuring of goals into a goal lattice based on formal concept analysis, a semantic extension of lattice theory. We illustrate the effectiveness of the presented method through its application to self-checkout systems for large-scale supermarkets.
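
    The sketch below illustrates formal concept analysis on a tiny, invented context: stakeholder goals are the objects, the properties they mention are the attributes, and every formal concept (extent, intent) becomes a node of the goal lattice. It is only a brute-force illustration of the underlying idea, not the chapter's method.

```python
# Hedged sketch: enumerate formal concepts of a small goal/attribute context.
from itertools import combinations

goals = {
    "fast_checkout": {"throughput", "usability"},
    "fewer_clerks":  {"throughput", "cost"},
    "easy_to_learn": {"usability"},
}
attributes = set().union(*goals.values())

def common_attrs(goal_subset):
    return set.intersection(*(goals[g] for g in goal_subset)) if goal_subset else set(attributes)

def goals_with(attr_subset):
    return {g for g, a in goals.items() if attr_subset <= a}

# Close every subset of goals to a formal concept (fine for tiny contexts).
concepts = set()
names = list(goals)
for r in range(len(names) + 1):
    for subset in combinations(names, r):
        intent = frozenset(common_attrs(set(subset)))
        extent = frozenset(goals_with(intent))
        concepts.add((extent, intent))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), "share", sorted(intent))
```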

  20. Challenges in engineering large customized bone constructs.

    PubMed

    Forrestal, David P; Klein, Travis J; Woodruff, Maria A

    2017-06-01

    The ability to treat large tissue defects with customized, patient-specific scaffolds is one of the most exciting applications in the tissue engineering field. While an increasing number of modestly sized tissue engineering solutions are making the transition to clinical use, successfully scaling up to large scaffolds with customized geometry is proving to be a considerable challenge. Managing often conflicting requirements of cell placement, structural integrity, and a hydrodynamic environment supportive of cell culture throughout the entire thickness of the scaffold has driven the continued development of many techniques used in the production, culturing, and characterization of these scaffolds. This review explores a range of technologies and methods relevant to the design and manufacture of large, anatomically accurate tissue-engineered scaffolds with a focus on the interaction of manufactured scaffolds with the dynamic tissue culture fluid environment. Biotechnol. Bioeng. 2017;114: 1129-1139. © 2016 Wiley Periodicals, Inc.

  1. Bottom-up production of meta-atoms for optical magnetism in visible and NIR light

    NASA Astrophysics Data System (ADS)

    Barois, Philippe; Ponsinet, Virginie; Baron, Alexandre; Richetti, Philippe

    2018-02-01

    Many unusual optical properties of metamaterials arise from the magnetic response of engineered structures of sub-wavelength size (meta-atoms) exposed to light. The top-down approach, whereby engineered nanostructures of well-defined morphology are engraved on a surface, has proved successful for the generation of strong optical magnetism. It faces, however, the limitations of high cost and small active area in visible light, where nanometre resolution is needed. The bottom-up approach, whereby the fabrication of metamaterials of large volume or large area results from the combination of nanochemistry and self-assembly techniques, may constitute a cost-effective alternative. This approach nevertheless requires the large-scale production of functional building blocks (meta-atoms) bearing a strong magnetic optical response. We propose in this paper a few tracks that lead to the large-scale synthesis of magnetic metamaterials operating in visible or near IR light.

  2. Rational functional representation of flap noise spectra including correction for reflection effects

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1974-01-01

    A rational function is presented for the acoustic spectra generated by deflection of engine exhaust jets for under-the-wing and over-the-wing versions of externally blown flaps. The functional representation is intended to provide a means for compact storage of data and for data analysis. The expressions are based on Fourier transform functions for the Strouhal normalized pressure spectral density, and on a correction for reflection effects based on Thomas' (1969) N-independent-source model extended by use of a reflected ray transfer function. Curve fit comparisons are presented for blown-flap data taken from turbofan engine tests and from large-scale cold-flow model tests. Application of the rational function to scrubbing noise theory is also indicated.

  3. Regulatory impact analysis and regulatory support document: Control of air pollution; determination of significance for nonroad sources and emission standards for new nonroad compression-ignition engines at or above 37 kilowatts (50 horsepower). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trimble, T.; North, D.R.; Green, K.A.H.

    1994-05-27

    The regulatory impact analysis and support document provides additional information in support of the Final Rulemaking (FRM). This FRM will regulate all new nonroad compression-ignition engines greater than or equal to 37 kilowatts (50 hp), except engines which propel or are used on marine vessels, aircraft engines, engines which propel locomotives, and engines regulated by the Mine Safety and Health Administration. The regulated engines are hereafter referred to as nonroad large CI engines. The goal of this regulation is to substantially reduce NOx emissions and smoke from nonroad large CI engines beginning in the 1996 model year.

  4. Evolving from bioinformatics in-the-small to bioinformatics in-the-large.

    PubMed

    Parker, D Stott; Gorlick, Michael M; Lee, Christopher J

    2003-01-01

    We argue the significance of a fundamental shift in bioinformatics, from in-the-small to in-the-large. Adopting a large-scale perspective is a way to manage the problems endemic to the world of the small: constellations of incompatible tools for which the effort required to assemble an integrated system exceeds the perceived benefit of the integration. Where bioinformatics in-the-small is about data and tools, bioinformatics in-the-large is about metadata and dependencies. Dependencies represent the complexities of large-scale integration, including the requirements and assumptions governing the composition of tools. The popular make utility is a very effective system for defining and maintaining simple dependencies, and it offers a number of insights about the essence of bioinformatics in-the-large. Keeping an in-the-large perspective has been very useful to us in large bioinformatics projects. We give two fairly different examples, and extract lessons from them showing how it has helped. These examples both suggest the benefit of explicitly defining and managing knowledge flows and knowledge maps (which represent metadata regarding types, flows, and dependencies), and also suggest approaches for developing bioinformatics database systems. Generally, we argue that large-scale engineering principles can be successfully adapted from disciplines such as software engineering and data management, and that having an in-the-large perspective will be a key advantage in the next phase of bioinformatics development.
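
    Since the abstract leans on make as an exemplar for managing dependencies, here is a toy rendering of that idea in Python: each target lists its prerequisites and a rebuild action, and a target is rebuilt when it is missing or older than a prerequisite. The file names and actions are invented; real pipelines would use make itself or a workflow manager.

```python
# Toy make-style dependency resolution (illustrative only, not the paper's code).
import os

rules = {
    # target          (prerequisites,        action)
    "alignments.out": (["sequences.fasta"],  lambda: print("run aligner")),
    "report.html":    (["alignments.out"],   lambda: print("render report")),
}

def mtime(path):
    return os.path.getmtime(path) if os.path.exists(path) else 0.0

def build(target):
    prereqs, action = rules.get(target, ([], None))
    for p in prereqs:
        if p in rules:
            build(p)                      # depth-first: prerequisites first
    stale = not os.path.exists(target) or any(mtime(p) > mtime(target) for p in prereqs)
    if action and stale:
        action()                          # missing or out-of-date target: rebuild

build("report.html")
```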

  5. Results of an Advanced Fan Stage Operating Over a Wide Range of Speed and Bypass Ratio. Part 2; Comparison of CFD and Experimental Results

    NASA Technical Reports Server (NTRS)

    Celestina, Mark L.; Suder, Kenneth L.; Kulkarni, Sameer

    2010-01-01

    NASA and GE teamed to design and build a 57 percent engine scaled fan stage for a Mach 4 variable cycle turbofan/ramjet engine for access to space with multipoint operations. This fan stage was tested in NASA's transonic compressor facility. The objectives of this test were to assess the aerodynamic and aero mechanic performance and operability characteristics of the fan stage over the entire range of engine operation including: 1) sea level static take-off; 2) transition over large swings in fan bypass ratio; 3) transition from turbofan to ramjet; and 4) fan wind-milling operation at high Mach flight conditions. This paper will focus on an assessment of APNASA, a multistage turbomachinery analysis code developed by NASA, to predict the fan stage performance and operability over a wide range of speeds (37 to 100 percent) and bypass ratios.

  6. Large-Scale Wind-Tunnel Tests and Evaluation of the Low-Speed Performance of a 35 deg Sweptback Wing Jet Transport Model Equipped with a Blowing Boundary-Layer-Control Flap and Leading-Edge Slat

    NASA Technical Reports Server (NTRS)

    Hickey, David H.; Aoyagi, Kiyoshi

    1960-01-01

    A wind-tunnel investigation was conducted to determine the effect of trailing-edge flaps with blowing-type boundary-layer control and leading-edge slats on the low-speed performance of a large-scale jet transport model with four engines and a 35 deg. sweptback wing of aspect ratio 7. Two spanwise extents and several deflections of the trailing-edge flap were tested. Results were obtained with a normal leading-edge and with full-span leading-edge slats. Three-component longitudinal force and moment data and boundary-layer-control flow requirements are presented. The test results are analyzed in terms of possible improvements in low-speed performance. The effect on performance of the source of boundary-layer-control air flow is considered in the analysis.

  7. Analysis of detection performance of multi band laser beam analyzer

    NASA Astrophysics Data System (ADS)

    Du, Baolin; Chen, Xiaomei; Hu, Leili

    2017-10-01

    Compared with microwave radar, laser radar has high resolution, strong anti-interference ability, and good concealment, so it has become a focus of laser technology engineering applications. A large-scale laser radar cross section (LRCS) measurement system is designed and experimentally tested. First, the boundary conditions are measured and the long-range laser echo power is estimated according to the actual requirements. The estimation results show that the echo power is greater than the detector's response power. Second, a large-scale LRCS measurement system is designed according to the demonstration and estimation. The system mainly consists of laser shaping, a beam emitting device, a laser echo receiving device, and an integrated control device. Finally, using the designed lidar cross section measurement system, the scattering cross section of the target is simulated and tested. The simulation results are basically the same as the test results, which confirms the correctness of the system.
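
    To make the echo-power estimation step concrete, the sketch below evaluates one commonly used form of the laser radar range equation for a point target that scatters isotropically; the result would then be compared against the detector sensitivity. The equation form and every number here are assumptions for illustration, not values from the paper.

```python
# Illustrative echo-power estimate (not the authors' model): one common form of
# the laser radar range equation for an isotropically scattering point target,
#   P_r = P_t * eta * sigma * D^2 / (4 * pi * theta_t^2 * R^4),
# where sigma is the laser radar cross section (LRCS). All values are made up.
import math

P_t = 10.0          # transmitted peak power, W
theta_t = 1e-3      # full transmit beam divergence, rad
D = 0.2             # receiver aperture diameter, m
eta = 0.5 * 0.8**2  # combined system efficiency and two-way atmospheric loss
sigma = 1.0         # target LRCS, m^2
R = 5.0e3           # range, m

P_r = P_t * eta * sigma * D**2 / (4 * math.pi * theta_t**2 * R**4)
print(f"estimated echo power: {P_r:.3e} W")   # compare against detector sensitivity
```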

  8. Scalable parallel distance field construction for large-scale applications

    DOE PAGES

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan -Liu; ...

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial location. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. In conclusion, our work greatly extends the usability of distance fields for demanding applications.
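
    As a single-node illustration of what the method computes (not the parallel distance tree itself), the sketch below builds a small synthetic volume, marks the voxels on an iso-surface of interest, and uses SciPy's Euclidean distance transform to obtain the distance from every voxel to that surface.

```python
# Hedged sketch: a distance field for a synthetic 3D volume, single node only.
import numpy as np
from scipy import ndimage

# Synthetic 64^3 scalar volume; the "surface of interest" is the iso level 0.25.
x, y, z = np.mgrid[0:64, 0:64, 0:64] / 63.0
scalar = np.sqrt((x - 0.5)**2 + (y - 0.5)**2 + (z - 0.5)**2)
inside = scalar < 0.25

# Voxels on the boundary of the iso-surface region.
surface = inside ^ ndimage.binary_erosion(inside)

# Distance (in voxel units) from every voxel to the nearest surface voxel.
distance_field = ndimage.distance_transform_edt(~surface)
print(distance_field.shape, distance_field.max())
```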

  9. Scalable Parallel Distance Field Construction for Large-Scale Applications.

    PubMed

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan-Liu; Kolla, Hemanth; Chen, Jacqueline H

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial locations. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. Our work greatly extends the usability of distance fields for demanding applications.

  10. Intelligent Engine Systems: Acoustics

    NASA Technical Reports Server (NTRS)

    Wojno, John; Martens, Steve; Simpson, Benjamin

    2008-01-01

    An extensive study of new fan exhaust nozzle technologies was performed. Three new uniform chevron nozzles were designed, based on extensive CFD analysis. Two new azimuthally varying variants were defined. All five were tested, along with two existing nozzles, on a representative model-scale, medium BPR exhaust nozzle. Substantial acoustic benefits were obtained from the uniform chevron nozzle designs, the best benefit being provided by an existing design. However, one of the azimuthally varying nozzle designs exhibited even better performance than any of the uniform chevron nozzles. In addition to the fan chevron nozzles, a new technology was demonstrated, using devices that enhance mixing when applied to an exhaust nozzle. The acoustic benefits from these devices applied to medium BPR nozzles were similar, and in some cases superior to, those obtained from conventional uniform chevron nozzles. However, none of the low noise technologies provided equivalent acoustic benefits on a model-scale high BPR exhaust nozzle, similar to current large commercial applications. New technologies must be identified to improve the acoustics of state-of-the-art high BPR jet engines.

  11. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of large amount of heterogeneous satellite imagery acquired by various sensors that consequently leads to a “Big Data” problem. The main objective of this study is to explore efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery with potential to apply the platform for a larger scale (e.g. country level) and multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory ( 28,100 km2 and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address efficiency of the GEE platform to effectively execute complex workflows of satellite data processing required with large scale applications such as crop mapping. The study discusses strengths and weaknesses of classifiers, assesses accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to the benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to the remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed support vector machine (SVM), decision tree and random forest classifiers available in GEE.
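
    The comparison described (random forest, SVM, decision tree, and a neural-network benchmark) can be illustrated generically with scikit-learn on synthetic multi-temporal pixel features, as sketched below; this is not the GEE API or the study's data, just the shape of such an experiment.

```python
# Generic illustration (not Google Earth Engine): comparing per-pixel classifiers
# on stacked multi-temporal band features; all data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_pixels, n_dates, n_bands = 5000, 6, 4                 # illustrative sizes
X = rng.normal(size=(n_pixels, n_dates * n_bands))      # stacked multi-temporal bands
y = (X[:, 0] + 0.5 * X[:, 5] + rng.normal(scale=0.5, size=n_pixels) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("random forest", RandomForestClassifier(n_estimators=100, random_state=0)),
                  ("SVM", SVC(kernel="rbf")),
                  ("neural network", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```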

  12. Urban Elementary STEM Initiative

    ERIC Educational Resources Information Center

    Parker, Carolyn; Abel, Yolanda; Denisova, Ekaterina

    2015-01-01

    The new standards for K-12 science education suggest that student learning should be more integrated and should focus on crosscutting concepts and core ideas from the areas of physical science, life science, Earth/space science, and engineering/technology. This paper describes large-scale, urban elementary-focused science, technology, engineering,…

  13. Newly invented biobased materials from low-carbon, diverted waste fibers: research methods, testing, and full-scale application in a case study structure

    Treesearch

    Julee A Herdt; John Hunt; Kellen Schauermann

    2016-01-01

    This project demonstrates newly invented, biobased construction materials developed by applying low-carbon, biomass waste sources through the authors’ engineered fiber processes and technology. If manufactured and applied at large scale, the project inventions can divert large volumes of cellulose waste into high-performance, low embodied energy, environmental construction...

  14. A puzzle assembly strategy for fabrication of large engineered cartilage tissue constructs.

    PubMed

    Nover, Adam B; Jones, Brian K; Yu, William T; Donovan, Daniel S; Podolnick, Jeremy D; Cook, James L; Ateshian, Gerard A; Hung, Clark T

    2016-03-21

    Engineering of large articular cartilage tissue constructs remains a challenge as tissue growth is limited by nutrient diffusion. Here, a novel strategy is investigated, generating large constructs through the assembly of individually cultured, interlocking, smaller puzzle-shaped subunits. These constructs can be engineered consistently with more desirable mechanical and biochemical properties than larger constructs (~4-fold greater Young's modulus). A failure testing technique was developed to evaluate the physiologic functionality of constructs, which were cultured as individual subunits for 28 days, then assembled and cultured for an additional 21-35 days. Assembled puzzle constructs withstood large deformations (40-50% compressive strain) prior to failure. Their ability to withstand physiologic loads may be enhanced by increases in subunit strength and assembled culture time. A nude mouse model was utilized to show biocompatibility and fusion of assembled puzzle pieces in vivo. Overall, the technique offers a novel, effective approach to scaling up engineered tissues and may be combined with other techniques and/or applied to the engineering of other tissues. Future studies will aim to optimize this system in an effort to engineer and integrate robust subunits to fill large defects. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. A Puzzle Assembly Strategy for Fabrication of Large Engineered Cartilage Tissue Constructs

    PubMed Central

    Nover, Adam B.; Jones, Brian K.; Yu, William T.; Donovan, Daniel S.; Podolnick, Jeremy D.; Cook, James L.; Ateshian, Gerard A.; Hung, Clark T.

    2016-01-01

    Engineering of large articular cartilage tissue constructs remains a challenge as tissue growth is limited by nutrient diffusion. Here, a novel strategy is investigated, generating large constructs through the assembly of individually cultured, interlocking, smaller puzzle-shaped subunits. These constructs can be engineered consistently with more desirable mechanical and biochemical properties than larger constructs (~4-fold greater Young's modulus). A failure testing technique was developed to evaluate the physiologic functionality of constructs, which were cultured as individual subunits for 28 days, then assembled and cultured for an additional 21-35 days. Assembled puzzle constructs withstood large deformations (40-50% compressive strain) prior to failure. Their ability to withstand physiologic loads may be enhanced by increases in subunit strength and assembled culture time. A nude mouse model was utilized to show biocompatibility and fusion of assembled puzzle pieces in vivo. Overall, the technique offers a novel, effective approach to scaling up engineered tissues and may be combined with other techniques and/or applied to the engineering of other tissues. Future studies will aim to optimize this system in an effort to engineer and integrate robust subunits to fill large defects. PMID:26895780

  16. Interactive Exploration and Analysis of Large-Scale Simulations Using Topology-Based Data Segmentation.

    PubMed

    Bremer, Peer-Timo; Weber, Gunther; Tierny, Julien; Pascucci, Valerio; Day, Marcus S; Bell, John B

    2011-09-01

    Large-scale simulations are increasingly being used to study complex scientific and engineering phenomena. As a result, advanced visualization and data analysis are also becoming an integral part of the scientific process. Often, a key step in extracting insight from these large simulations involves the definition, extraction, and evaluation of features in the space and time coordinates of the solution. However, in many applications, these features involve a range of parameters and decisions that will affect the quality and direction of the analysis. Examples include particular level sets of a specific scalar field, or local inequalities between derived quantities. A critical step in the analysis is to understand how these arbitrary parameters/decisions impact the statistical properties of the features, since such a characterization will help to evaluate the conclusions of the analysis as a whole. We present a new topological framework that in a single pass extracts and encodes entire families of possible feature definitions as well as their statistical properties. For each time step we construct a hierarchical merge tree, a highly compact yet flexible feature representation. While this data structure is more than two orders of magnitude smaller than the raw simulation data, it allows us to extract a set of features for any given parameter selection in a postprocessing step. Furthermore, we augment the trees with additional attributes, making it possible to gather a large number of useful global, local, as well as conditional statistics that would otherwise be extremely difficult to compile. We also use this representation to create tracking graphs that describe the temporal evolution of the features over time. Our system provides a linked-view interface to explore the time-evolution of the graph interactively alongside the segmentation, thus making it possible to perform extensive data analysis in a very efficient manner. We demonstrate our framework by extracting and analyzing burning cells from a large-scale turbulent combustion simulation. In particular, we show how the statistical analysis enabled by our techniques provides new insight into the combustion process.
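
    A minimal flavour of the merge-tree idea is sketched below for a 1-D scalar field: sweeping the threshold downward, superlevel-set components are born at local maxima and merge at saddles, tracked here with a tiny union-find structure. This is only a toy analogue of the paper's hierarchical merge trees and augmented attributes.

```python
# Hedged sketch: birth/merge events of superlevel-set components of a 1-D field.
import numpy as np

values = np.array([0.2, 0.9, 0.4, 0.8, 0.1, 0.7, 0.3])
parent = {}

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path compression
        i = parent[i]
    return i

events = []                             # (threshold, kind, vertices involved)
for i in sorted(range(len(values)), key=lambda i: -values[i]):
    parent[i] = i
    events.append((values[i], "birth", i))
    for j in (i - 1, i + 1):            # 1-D neighbours
        if j in parent and find(j) != find(i):
            events.append((values[i], "merge", (find(j), find(i))))
            parent[find(j)] = find(i)

for threshold, kind, info in events:
    print(f"{threshold:.1f}  {kind}  {info}")
```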

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    The Second SIAM Conference on Computational Science and Engineering was held in San Diego from February 10-12, 2003. Total conference attendance was 553. This is a 23% increase in attendance over the first conference. The focus of this conference was to draw attention to the tremendous range of major computational efforts on large problems in science and engineering, to promote the interdisciplinary culture required to meet these large-scale challenges, and to encourage the training of the next generation of computational scientists. Computational Science & Engineering (CS&E) is now widely accepted, along with theory and experiment, as a crucial third mode of scientific investigation and engineering design. Aerospace, automotive, biological, chemical, semiconductor, and other industrial sectors now rely on simulation for technical decision support. For federal agencies also, CS&E has become an essential support for decisions on resources, transportation, and defense. CS&E is, by nature, interdisciplinary. It grows out of physical applications and it depends on computer architecture, but at its heart are powerful numerical algorithms and sophisticated computer science techniques. From an applied mathematics perspective, much of CS&E has involved analysis, but the future surely includes optimization and design, especially in the presence of uncertainty. Another mathematical frontier is the assimilation of very large data sets through such techniques as adaptive multi-resolution, automated feature search, and low-dimensional parameterization. The themes of the 2003 conference included, but were not limited to: Advanced Discretization Methods; Computational Biology and Bioinformatics; Computational Chemistry and Chemical Engineering; Computational Earth and Atmospheric Sciences; Computational Electromagnetics; Computational Fluid Dynamics; Computational Medicine and Bioengineering; Computational Physics and Astrophysics; Computational Solid Mechanics and Materials; CS&E Education; Meshing and Adaptivity; Multiscale and Multiphysics Problems; Numerical Algorithms for CS&E; Discrete and Combinatorial Algorithms for CS&E; Inverse Problems; Optimal Design, Optimal Control, and Inverse Problems; Parallel and Distributed Computing; Problem-Solving Environments; Software and Middleware Systems; Uncertainty Estimation and Sensitivity Analysis; and Visualization and Computer Graphics.

  18. A conceptual design of shock-eliminating clover combustor for large scale scramjet engine

    NASA Astrophysics Data System (ADS)

    Sun, Ming-bo; Zhao, Yu-xin; Zhao, Guo-yan; Liu, Yuan

    2017-01-01

    A new shock-eliminating clover combustor concept is proposed for large-scale scramjet engines to fulfill the requirements of fuel penetration, total pressure recovery, and cooling. To generate the circular-to-clover transition shape of the combustor, the streamline tracing technique is used, based on an axisymmetric expansion parent flowfield calculated using the method of characteristics. The combustor is examined using inviscid and viscous numerical simulations, and a pure circular shape is calculated for comparison. The results showed that the combustor avoids shock wave generation and produces low total pressure losses over a wide range of flight conditions and Mach numbers. The flameholding device for this combustor is briefly discussed.

  19. Fast time- and frequency-domain finite-element methods for electromagnetic analysis

    NASA Astrophysics Data System (ADS)

    Lee, Woochan

    Fast electromagnetic analysis in time and frequency domain is of critical importance to the design of integrated circuits (IC) and other advanced engineering products and systems. Many IC structures constitute a very large scale problem in modeling and simulation, the size of which also continuously grows with the advancement of the processing technology. This results in numerical problems beyond the reach of existing most powerful computational resources. Different from many other engineering problems, the structure of most ICs is special in the sense that its geometry is of Manhattan type and its dielectrics are layered. Hence, it is important to develop structure-aware algorithms that take advantage of the structure specialties to speed up the computation. In addition, among existing time-domain methods, explicit methods can avoid solving a matrix equation. However, their time step is traditionally restricted by the space step for ensuring the stability of a time-domain simulation. Therefore, making explicit time-domain methods unconditionally stable is important to accelerate the computation. In addition to time-domain methods, frequency-domain methods have suffered from an indefinite system that makes an iterative solution difficult to converge fast. The first contribution of this work is a fast time-domain finite-element algorithm for the analysis and design of very large-scale on-chip circuits. The structure specialty of on-chip circuits such as Manhattan geometry and layered permittivity is preserved in the proposed algorithm. As a result, the large-scale matrix solution encountered in the 3-D circuit analysis is turned into a simple scaling of the solution of a small 1-D matrix, which can be obtained in linear (optimal) complexity with negligible cost. Furthermore, the time step size is not sacrificed, and the total number of time steps to be simulated is also significantly reduced, thus achieving a total cost reduction in CPU time. The second contribution is a new method for making an explicit time-domain finite-element method (TDFEM) unconditionally stable for general electromagnetic analysis. In this method, for a given time step, we find the unstable modes that are the root cause of instability, and deduct them directly from the system matrix resulting from a TDFEM based analysis. As a result, an explicit TDFEM simulation is made stable for an arbitrarily large time step irrespective of the space step. The third contribution is a new method for full-wave applications from low to very high frequencies in a TDFEM based on matrix exponential. In this method, we directly deduct the eigenmodes having large eigenvalues from the system matrix, thus achieving a significantly increased time step in the matrix exponential based TDFEM. The fourth contribution is a new method for transforming the indefinite system matrix of a frequency-domain FEM to a symmetric positive definite one. We deduct non-positive definite component directly from the system matrix resulting from a frequency-domain FEM-based analysis. The resulting new representation of the finite-element operator ensures an iterative solution to converge in a small number of iterations. We then add back the non-positive definite component to synthesize the original solution with negligible cost.
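
    The second contribution (removing the unstable modes for a chosen time step) can be pictured with the generic linear-algebra sketch below: eigenmodes of a symmetric system matrix whose eigenvalues exceed an illustrative stability bound for the chosen step are deducted from the matrix. The matrix, the bound, and the step size are all invented for illustration; this is not the thesis implementation.

```python
# Hedged sketch: deduct unstable eigenmodes from a symmetric system matrix so
# that an explicit update remains stable for a chosen (illustrative) time step.
import numpy as np

rng = np.random.default_rng(1)
B = rng.normal(size=(50, 50))
A = B @ B.T                                # stand-in symmetric "system matrix"

dt = 0.3                                   # illustrative time step
lam_max_allowed = 4.0 / dt**2              # illustrative explicit stability bound

lam, V = np.linalg.eigh(A)
unstable = lam > lam_max_allowed
# A_deflated = A - sum over unstable modes of lam_i * v_i v_i^T
A_deflated = A - (V[:, unstable] * lam[unstable]) @ V[:, unstable].T

print("modes removed:", int(unstable.sum()))
print("largest remaining eigenvalue:", np.linalg.eigvalsh(A_deflated).max())
```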

  20. Advances in multi-scale modeling of solidification and casting processes

    NASA Astrophysics Data System (ADS)

    Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang

    2011-04-01

    The development of the aviation, energy, and automobile industries requires advanced, integrated product/process R&D systems that can optimize both the product and the process design. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, modeled using phase field and cellular automaton methods, mathematical models of segregation in large steel ingots, and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry, are discussed.

  1. Homogenizing bacterial cell factories: Analysis and engineering of phenotypic heterogeneity.

    PubMed

    Binder, Dennis; Drepper, Thomas; Jaeger, Karl-Erich; Delvigne, Frank; Wiechert, Wolfgang; Kohlheyer, Dietrich; Grünberger, Alexander

    2017-07-01

    In natural habitats, microbes form multispecies communities that commonly face rapidly changing and highly competitive environments. Thus, phenotypic heterogeneity has evolved as an innate and important survival strategy to gain an overall fitness advantage over cohabiting competitors. However, in defined artificial environments such as monocultures in small- to large-scale bioreactors, cell-to-cell variations are presumed to cause reduced production yields as well as process instability. Hence, engineering microbial production toward phenotypic homogeneity is a highly promising approach for synthetic biology and bioprocess optimization. In this review, we discuss recent studies that have unraveled the cell-to-cell heterogeneity observed during bacterial gene expression and metabolite production as well as the molecular mechanisms involved. In addition, current single-cell technologies are briefly reviewed with respect to their applicability in exploring cell-to-cell variations. We highlight emerging strategies and tools to reduce phenotypic heterogeneity in biotechnological expression setups. Here, strain or inducer modifications are combined with cell physiology manipulations to achieve the ultimate goal of equalizing bacterial populations. In this way, the majority of cells can be forced into high productivity, thus reducing less productive subpopulations that tend to consume valuable resources during production. Modifications in uptake systems, inducer molecules or nutrients represent valuable tools for diminishing heterogeneity. Finally, we address the challenge of transferring homogeneously responding cells into large-scale bioprocesses. Environmental heterogeneity originating from extrinsic factors such as stirring speed and pH, oxygen, temperature or nutrient distribution can significantly influence cellular physiology. We conclude that engineering microbial populations toward phenotypic homogeneity is an increasingly important task to take biotechnological productions to the next level of control. Copyright © 2017 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  2. Computers in Electrical Engineering Education at Virginia Polytechnic Institute.

    ERIC Educational Resources Information Center

    Bennett, A. Wayne

    1982-01-01

    Discusses use of computers in Electrical Engineering (EE) at Virginia Polytechnic Institute. Topics include: departmental background, level of computing power using large scale systems, mini and microcomputers, use of digital logic trainers and analog/hybrid computers, comments on integrating computers into EE curricula, and computer use in…

  3. Large Scale IR Evaluation

    ERIC Educational Resources Information Center

    Pavlu, Virgil

    2008-01-01

    Today, search engines are embedded into all aspects of digital world: in addition to Internet search, all operating systems have integrated search engines that respond even as you type, even over the network, even on cell phones; therefore the importance of their efficacy and efficiency cannot be overstated. There are many open possibilities for…

  4. Large-Scale NASA Science Applications on the Columbia Supercluster

    NASA Technical Reports Server (NTRS)

    Brooks, Walter

    2005-01-01

    Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.

  5. Collaborative-Large scale Engineering Assessment Networks for Environmental Research: The Overview

    NASA Astrophysics Data System (ADS)

    Moo-Young, H.

    2004-05-01

    A networked infrastructure for engineering solutions and policy alternatives is necessary to assess, manage, and protect complex, anthropogenically stressed environmental resources effectively. Reductionist and discrete disciplinary methodologies are no longer adequate to evaluate and model complex environmental systems and anthropogenic stresses. While the reductionist approach provides important information regarding individual mechanisms, it cannot provide complete information about how multiple processes are related. Therefore, it is not possible to make accurate predictions about system responses to engineering interventions and the effectiveness of policy options. For example, experts cannot agree on best management strategies for contaminated sediments in riverine and estuarine systems. This is due, in part, to the fact that existing models do not accurately capture integrated system dynamics. In addition, infrastructure is not available for investigators to exchange and archive data, to collaborate on new investigative methods, and to synthesize these results to develop engineering solutions and policy alternatives. Our vision for the future is to create a network comprising field facilities and a collaboration of engineers, scientists, policy makers, and community groups. This will allow integration across disciplines, across different temporal and spatial scales, surface and subsurface geographies, and airsheds and watersheds. Benefits include fast response to changes in system health, real-time decision making, and continuous data collection that can be used to anticipate future problems, and to develop sound engineering solutions and management decisions. CLEANER encompasses four general aspects: 1) A Network of environmental field facilities instrumented for the acquisition and analysis of environmental data; 2) A Virtual Repository of Data and information technology for engineering modeling, analysis and visualization of data, i.e. an environmental cyber-infrastructure; 3) A Mechanism for multidisciplinary research and education activities designed to exploit the output of the instrumented sites and networked information technology, to formulate engineering and policy options directed toward the protection, remediation, and restoration of stressed environments and sustainability of environmental resources; and 4) A Collaboration among engineers, natural and social scientists, educators, policy makers, industry, non-governmental organizations, the public, and other stakeholders.

  6. Machine learning for fab automated diagnostics

    NASA Astrophysics Data System (ADS)

    Giollo, Manuel; Lam, Auguste; Gkorou, Dimitra; Liu, Xing Lan; van Haren, Richard

    2017-06-01

    Process optimization depends largely on field engineers' knowledge and expertise. However, this practice is becoming less sustainable as fab complexity continues to increase in order to support the extreme miniaturization of integrated circuits. On the one hand, process optimization and root cause analysis of tools are necessary for smooth fab operation. On the other hand, the growth in the number of wafer processing steps adds a considerable new source of noise that may have a significant impact at the nanometer scale. This paper explores the ability of historical process data and machine learning to support field engineers in production analysis and monitoring. We implement an automated workflow in order to analyze a large volume of information and build a predictive model of overlay variation. The proposed workflow addresses significant problems that are typical in fab production, such as missing measurements, small sample sizes, confounding effects due to heterogeneity of data, and subpopulation effects. We evaluate the proposed workflow on a real use case and show that it is able to predict overlay excursions observed in integrated circuit manufacturing. The chosen design focuses on linear and interpretable models of the wafer history, which highlight the process steps that are causing defective products. This is a fundamental feature for diagnostics, as it supports process engineers in the continuous improvement of the production line.
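
    A generic sketch of such an interpretable, linear wafer-history model is shown below: synthetic per-step features (with some deliberately missing values) are imputed, standardized, and fed to a sparse linear regression whose largest coefficients point at candidate root-cause steps. Feature meanings, sizes, and the overlay proxy are all invented.

```python
# Hedged sketch: sparse, interpretable linear model of wafer history for overlay.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LassoCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_wafers, n_steps = 300, 40
X = rng.normal(size=(n_wafers, n_steps))      # per-step context features (synthetic)
y = 2.0 * X[:, 3] - 1.5 * X[:, 17] + rng.normal(scale=0.5, size=n_wafers)  # overlay proxy
X[rng.random(X.shape) < 0.05] = np.nan        # missing measurements are common in fabs

model = make_pipeline(SimpleImputer(strategy="median"),
                      StandardScaler(),
                      LassoCV(cv=5))
model.fit(X, y)

coefs = model.named_steps["lassocv"].coef_
suspects = np.argsort(-np.abs(coefs))[:5]
print("process steps most associated with overlay variation:", suspects)
```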

  7. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2010-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...research, we designed a pilot study utilizing large-scale parallel Grid computing harnessing nationwide infrastructure for medical image analysis. Also

  8. Implementation of the Australian Water Observations from Space (WOfS) Algorithm in Africa and South America using the CEOS Data Cube

    NASA Astrophysics Data System (ADS)

    Purss, M. B. J.; Mueller, N. R.; Killough, B.; Oliver, S. A.

    2016-12-01

    In 2014 Geoscience Australia launched Water Observations from Space (WOfS) providing a continental-scale water product that shows how often surface water has been observed across Australia by the Landsat satellites since 1987. WOfS is a 23-step band-based decision tree that classifies pixels as water or non-water with 97% overall accuracy. The enabling infrastructure for WOfS is the Australian Geoscience Data Cube (AGDC), a high performance computing system organising Australian earth observation data into a systematic, consistently corrected analysis engine. The Committee on Earth Observation Satellites (CEOS) has adopted the AGDC methodology to create a series of international Data Cubes to provide the same capability to areas that would otherwise not be able to undertake time series analysis of the environment at these scales. The CEOS Systems Engineering Office (SEO) recently completed testing of WOfS using Data Cubes based on the AGDC version 2 over Kenya and Colombia. The results show how Data Cubes can provide water management information at large scales, and provide information in remote locations where other sources of water information are unavailable. The results also show an improvement in water detection capability over the Landsat CFmask. This water management product provides critical insight into the behavior of surface water over time and in particular, the extent of flooding.
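
    A toy illustration of band-based water detection is given below; it is emphatically not the 23-step WOfS decision tree, just a single NDWI threshold applied to synthetic green and near-infrared reflectances to show what per-pixel water classification (and, over a time series, water frequency) looks like.

```python
# Toy band-based water test (NOT the WOfS decision tree); data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
green = rng.uniform(0.02, 0.4, size=(100, 100))   # surface reflectance, green band
nir = rng.uniform(0.02, 0.5, size=(100, 100))     # near-infrared band

ndwi = (green - nir) / (green + nir + 1e-9)        # normalized difference water index
water = ndwi > 0.0                                 # illustrative threshold

# Per-pixel frequency over a time series would then be the fraction of clear
# observations classified as water, which is what a WOfS-style summary reports.
print("water fraction in this synthetic scene:", water.mean())
```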

  9. Large-Scale Simulations and Detailed Flow Field Measurements for Turbomachinery Aeroacoustics

    NASA Technical Reports Server (NTRS)

    VanZante, Dale

    2008-01-01

    The presentation is a review of recent work in highly loaded compressors, turbine aeroacoustics and cooling fan noise. The specific topics are: the importance of correct numerical modeling to capture blade row interactions in the Ultra Efficient Engine Technology Proof-of-Concept Compressor, the attenuation of a detonation pressure wave by an aircraft axial turbine stage, current work on noise sources and acoustic attenuation in turbines, and technology development work on cooling fans for spaceflight applications. The topic areas were related to each other by certain themes such as the advantage of an experimentalist's viewpoint when analyzing numerical simulations and the need to improve analysis methods for very large numerical datasets.

  10. Technoeconomic analysis of large scale production of pre-emergent Pseudomonas fluorescens microbial bioherbicide in Canada.

    PubMed

    Mupondwa, Edmund; Li, Xue; Boyetchko, Susan; Hynes, Russell; Geissler, Jon

    2015-01-01

    The study presents an ex ante technoeconomic analysis of commercial production of Pseudomonas fluorescens BRG100 bioherbicide in Canada. An engineering economic model is designed in SuperPro Designer® to investigate capital investment scaling and profitability. Total capital investment for a stand-alone BRG100 fermentation plant at baseline capacity (two 33,000 L fermenters; 3602 tonnes annum(-1)) is $17.55 million. Total annual operating cost is $14.76 million. Raw materials account for 50% of operating cost. The fermentation plant is profitable over a wide range of operating scales, evaluated over a range of BRG100 prices and costs of capital. Smaller plants require higher NPV breakeven prices. However, larger plants are more sensitive to changes in the cost of capital. Unit production costs decrease as plant capacity increases, indicating scale economies. A plant operating for less than one year approaches positive NPV for periods as low as 2 months. These findings can support bioherbicide R&D investment and commercialization strategies. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
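
    The profitability logic can be illustrated with a bare-bones discounted-cash-flow calculation using the capacity and cost figures quoted above; the selling price, project life, and cost of capital below are assumptions for illustration only, not values from the study's SuperPro model.

```python
# Minimal NPV sketch (illustrative assumptions, not the study's model).
capital = 17.55e6          # total capital investment, $ (from the abstract)
operating = 14.76e6        # annual operating cost, $ (from the abstract)
output_t = 3602.0          # annual output, tonnes (from the abstract)
price_per_t = 4600.0       # assumed selling price, $/tonne (illustrative)
rate = 0.10                # assumed cost of capital
years = 15                 # assumed project life

cash_flow = output_t * price_per_t - operating
npv = -capital + sum(cash_flow / (1 + rate)**t for t in range(1, years + 1))
print(f"NPV over {years} years: ${npv/1e6:.2f} million")
```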

  11. Hierarchical Engine for Large-scale Infrastructure Co-Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-04-24

    HELICS is designed to support very-large-scale (100,000+ federates) cosimulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross platform operating system support, the integration of both event driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical model convergence at each time step.

  12. Tritium

    DTIC Science & Technology

    2011-11-01

    fusion energy-production processes of the particular type of reactor using a lithium (Li) blanket or related alloys such as the Pb-17Li eutectic. As such, tritium breeding is intimately connected with energy production, thermal management, radioactivity management, materials properties, and mechanical structures of any plausible future large-scale fusion power reactor. JASON is asked to examine the current state of scientific knowledge and engineering practice on the physical and chemical bases for large-scale tritium

  13. Interdisciplinary Interactions During R&D and Early Design of Large Engineered Systems

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria Rivas

    2014-01-01

    Designing Large-Scale Complex Engineered Systems (LaCES) such as aircraft and submarines requires the input of thousands of engineers and scientists whose work is proximate in neither time nor space. Comprehensive knowledge of the system is dispersed among specialists whose expertise is in typically one system component or discipline. This study examined the interactive work practices among such specialists seeking to improve engineering practice through a rigorous and theoretical understanding of current practice. This research explored current interdisciplinary practices and perspectives during R&D and early LaCES design and identified why these practices and perspectives prevail and persist. The research design consisted of a three-fold, integrative approach that combined an open-ended survey, semi-structured interviews, and ethnography. Significant empirical data from experienced engineers and scientists in a large engineering organization were obtained and integrated with theories from organization science and engineering. Qualitative analysis was used to obtain a holistic, contextualized understanding. The over-arching finding is that issues related to cognition, organization, and social interrelations mostly dominate interactions across disciplines. Engineering issues, such as the integration of hardware or physics-based models, are not as significant. For example, organization culture is an important underlying factor that guided researchers more toward individual sovereignty over cross-disciplinarity. The organization structure and the engineered system architecture also serve as constraints to the engineering work. Many differences in work practices were observed, including frequency and depth of interactions, definition or co-construction of requirements, clarity or creation of the system architecture, work group proximity, and cognitive challenges. Practitioners are often unaware of these differences resulting in confusion and incorrect assumptions regarding work expectations. Cognitively, the enactment and coconstruction of knowledge are the fundamental tasks of the interdisciplinary interactions. Distributed and collective cognition represent most of the efforts. Argument, ignorance, learning, and creativity are interrelated aspects of the interactions that cause discomfort but yield benefits such as problem mitigation, broader understanding, and improved system design and performance. The quality and quantity of social interrelations are central to all work across disciplines with reciprocity, respectful engagement, and heedful interrelations being significant to the effectiveness of the engineering and scientific work.

  14. Quantifying the Metrics That Characterize Safety Culture of Three Engineered Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, Julie; Ernesti, Mary; Tokuhiro, Akira

    2002-07-01

    With potential energy shortages and increasing electricity demand, the nuclear energy option is being reconsidered in the United States. Public opinion will have a considerable voice in policy decisions that will 'road-map' the future of nuclear energy in this country. This report is an extension of the last author's work on the 'safety culture' associated with three engineered systems (automobiles, commercial airplanes, and nuclear power plants) in Japan and the United States. Safety culture, in brief, is defined as a specifically developed culture based on societal and individual interpretations of the balance of real, perceived, and imagined risks versus the benefits drawn from utilizing a given engineered system. The method of analysis is a modified scale analysis, with two fundamental Eigen-metrics, time- (t) and number-scales (N), that describe both engineered systems and human factors. The scale analysis approach is appropriate because human perception of risk, perception of benefit, and level of (technological) acceptance are inherently subjective, therefore 'fuzzy' and rarely quantifiable in exact magnitude. Perception of risk, expressed in terms of the psychometric factors 'dread risk' and 'unknown risk', contains both time- and number-scale elements. Various engineering system accidents with fatalities, reported by mass media, are characterized by t and N, and are presented in this work using the scale analysis method. We contend that level of acceptance implies a perception of benefit at least two orders of magnitude larger than the perception of risk. The 'amplification' influence of mass media is also deduced as being 100- to 1000-fold the actual number of fatalities/serious injuries in a nuclear-related accident. (authors)

  15. Engineering the earth system

    NASA Astrophysics Data System (ADS)

    Keith, D. W.

    2005-12-01

    The post-war growth of the earth sciences has been fueled, in part, by a drive to quantify environmental insults in order to support arguments for their reduction, yet paradoxically the knowledge gained grants us ever greater capability to deliberately engineer environmental processes on a planetary scale. Increased capability can arise through seemingly unconnected scientific advances. Improvements in numerical weather prediction, such as the use of adjoint models in analysis/forecast systems, for example, mean that weather modification can be accomplished with smaller control inputs. Purely technological constraints on our ability to engineer earth systems arise from our limited ability to measure and predict system responses and from limits on our ability to manage large engineering projects. Trends in all three constraints suggest a rapid growth in our ability to engineer the planet. What are the implications of our growing ability to geoengineer? Will we see a reemergence of proposals to engineer our way out of the climate problem? How can we avoid the moral hazard posed by the knowledge that geoengineering might provide a backstop to climate damages? I will speculate about these issues, and suggest some institutional factors that may provide a stronger constraint on the use of geoengineering than is provided by any purely technological limit.

  16. An algorithm to estimate aircraft cruise black carbon emissions for use in developing a cruise emissions inventory.

    PubMed

    Peck, Jay; Oluwole, Oluwayemisi O; Wong, Hsi-Wu; Miake-Lye, Richard C

    2013-03-01

    To provide accurate input parameters to the large-scale global climate simulation models, an algorithm was developed to estimate the black carbon (BC) mass emission index for engines in the commercial fleet at cruise. Using a high-dimensional model representation (HDMR) global sensitivity analysis, relevant engine specification/operation parameters were ranked, and the most important parameters were selected. Simple algebraic formulas were then constructed based on those important parameters. The algorithm takes the cruise power (alternatively, fuel flow rate), altitude, and Mach number as inputs, and calculates BC emission index for a given engine/airframe combination using the engine property parameters, such as the smoke number, available in the International Civil Aviation Organization (ICAO) engine certification databank. The algorithm can be interfaced with state-of-the-art aircraft emissions inventory development tools, and will greatly improve the global climate simulations that currently use a single fleet average value for all airplanes. An algorithm to estimate the cruise condition black carbon emission index for commercial aircraft engines was developed. Using the ICAO certification data, the algorithm can evaluate the black carbon emission at given cruise altitude and speed.
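
    The abstract names the algorithm's inputs and data sources but not its fitted correlation. Purely as an interface sketch, the hypothetical Python function below mirrors those inputs (fuel flow as a power proxy, altitude, Mach number, and the ICAO-databank smoke number); every constant and functional form in it is a placeholder assumption, not the published method.

    ```python
    def cruise_bc_ei(fuel_flow_kg_s, altitude_m, mach, smoke_number,
                     ff_ref_kg_s=1.0, alt_ref_m=10668.0, mach_ref=0.8):
        """Hypothetical sketch of a cruise black-carbon emission-index estimator
        (mg BC per kg fuel).  Only the interface follows the abstract; the
        correlation below is a placeholder, NOT the published fit."""
        ei_ref = 0.1 * smoke_number                            # placeholder smoke-number correlation
        power_factor = (fuel_flow_kg_s / ff_ref_kg_s) ** 0.5   # placeholder power dependence
        altitude_factor = 1.0 + 0.05 * (altitude_m - alt_ref_m) / alt_ref_m
        mach_factor = 1.0 + 0.1 * (mach - mach_ref)
        return ei_ref * power_factor * altitude_factor * mach_factor
    ```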

  17. CLEANER-Hydrologic Observatory Joint Science Plan

    NASA Astrophysics Data System (ADS)

    Welty, C.; Dressler, K.; Hooper, R.

    2005-12-01

    The CLEANER-Hydrologic Observatory* initiative is a distributed network for research on complex environmental systems that focuses on the intersecting water-related issues of both the CUAHSI and CLEANER communities. It emphasizes research on the nation's water resources related to human-dominated natural and built environments. The network will comprise: interacting field sites with an integrated cyberinfrastructure; a centralized technical resource staff and management infrastructure to support interdisciplinary research through data collection from advanced sensor systems, and data mining and aggregation from multiple sources and databases; and cyber-tools for analysis, visualization, and predictive multi-scale modeling that is dynamically driven. As such, the network will transform 21st century workforce development in the water-related intersection of environmental science and engineering, as well as enable substantial educational and engagement opportunities for all age levels. The scientific goal and strategic intent of the CLEANER-Hydrologic Observatory Network is to transform our understanding of the earth's water cycle and associated biogeochemical cycles across spatial and temporal scales, enabling quantitative forecasts of critical water-related processes, especially those that affect and are affected by human activities. This strategy will develop scientific and engineering tools that will enable more effective adaptive approaches for resource management. The need for the network is based on three critical deficiencies in current abilities to understand large-scale environmental processes and thereby develop more effective management strategies. First, we lack basic data and the infrastructure to collect them at the needed resolution. Second, we lack the means to integrate data across scales from different media (paper records, electronic worksheets, web-based) and sources (observations, experiments, simulations). Third, we lack sufficiently accurate modeling and decision-support tools to predict the underlying processes or subsequently forecast the effects of different management strategies. Water is a critical driver for the functioning of all ecosystems and the development of human society, and it is a key ingredient for the success of industry, agriculture, and the national economy. CLEANER-Hydrologic Observatories will foster cutting-edge science and engineering research that addresses major national needs (public and governmental) related to water, including, for example: (i) water resource problems, such as impaired surface waters, contaminated ground water, water availability for human use and ecosystem needs, floods and floodplain management, urban storm water, agricultural runoff, and coastal hypoxia; (ii) understanding environmental impacts on public health; (iii) achieving a balance of economic and environmental sustainability; (iv) reversing environmental degradation; and (v) protecting against chemical and biological threats. CLEANER (Collaborative Large-scale Engineering Analysis Network for Environmental Research) is an ENG initiative; the Hydrologic Observatory Network is a GEO initiative through CUAHSI (Consortium of Universities for the Advancement of Hydrologic Science, Inc.). The two initiatives were merged into a joint, bi-directorate program in December 2004.

  18. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Paul T; Yin, Shengjun; Klasky, Hilda B

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large-scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management of non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and lifetime management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and applied to evaluate several experiment designs, taking into account the capabilities of the test facility while satisfying the test objectives. These advanced fracture mechanics models will then be utilized to simulate the crack growth in the large-scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to also assess the level of confidence that can be placed in the best-estimate finite-element solutions.

  19. Performance/price estimates for cortex-scale hardware: a design space exploration.

    PubMed

    Zaveri, Mazad S; Hammerstrom, Dan

    2011-04-01

    In this paper, we revisit the concept of virtualization. Virtualization is useful for understanding and investigating the performance/price and other trade-offs related to the hardware design space. Moreover, it is perhaps the most important aspect of a hardware design space exploration. Such a design space exploration is a necessary part of the study of hardware architectures for large-scale computational models for intelligent computing, including AI, Bayesian, bio-inspired and neural models. A methodical exploration is needed to identify potentially interesting regions in the design space, and to assess the relative performance/price points of these implementations. As an example, in this paper we investigate the performance/price of (digital and mixed-signal) CMOS and hypothetical CMOL (nanogrid) technology based hardware implementations of human cortex-scale spiking neural systems. Through this analysis, and the resulting performance/price points, we demonstrate, in general, the importance of virtualization, and of doing these kinds of design space explorations. The specific results suggest that hybrid nanotechnology such as CMOL is a promising candidate to implement very large-scale spiking neural systems, providing a more efficient utilization of the density and storage benefits of emerging nano-scale technologies. In general, we believe that the study of such hypothetical designs/architectures will guide the neuromorphic hardware community towards building large-scale systems, and help guide research trends in intelligent computing, and computer engineering. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. Computational study of culture conditions and nutrient supply in a hollow membrane sheet bioreactor for large-scale bone tissue engineering.

    PubMed

    Khademi, Ramin; Mohebbi-Kalhori, Davod; Hadjizadeh, Afra

    2014-03-01

    Successful bone tissue culture in a large implant is still a challenge. We have previously developed a porous hollow membrane sheet (HMSh) for tissue engineering applications (Afra Hadjizadeh and Davod Mohebbi-Kalhori, J Biomed. Mater. Res. Part A [2]). This study aims to investigate culture conditions and nutrient supply in a bioreactor made of HMSh. For this purpose, hydrodynamic and mass transport behavior in the newly proposed hollow membrane sheet bioreactor, including a lumen region and a porous membrane (scaffold) for supporting and feeding cells, with a grooved section for accommodating the gel-cell matrix, was numerically studied. A finite element method was used for solving the governing equations in both homogeneous and porous media. Furthermore, cell resistance and waste production have been included in a 3D mathematical model. The influences of different bioreactor design parameters, the scaffold properties that determine the HMSh bioreactor performance, and various operating conditions are discussed in detail. The obtained results illustrate that the novel scaffold can be employed in large-scale applications in bone tissue engineering.

  1. Turbulent pipe flow at extreme Reynolds numbers.

    PubMed

    Hultmark, M; Vallikivi, M; Bailey, S C C; Smits, A J

    2012-03-02

    Both the inherent intractability and complex beauty of turbulence reside in its large range of physical and temporal scales. This range of scales is captured by the Reynolds number, which in nature and in many engineering applications can be as large as 10^5-10^6. Here, we report turbulence measurements over an unprecedented range of Reynolds numbers using a unique combination of a high-pressure air facility and a new nanoscale anemometry probe. The results reveal previously unknown universal scaling behavior for the turbulent velocity fluctuations, which is remarkably similar to the well-known scaling behavior of the mean velocity distribution.

  2. Self-reconfigurable ship fluid-network modeling for simulation-based design

    NASA Astrophysics Data System (ADS)

    Moon, Kyungjin

    Our world is filled with large-scale engineering systems, which provide various services and conveniences in our daily life. A distinctive trend in the development of today's large-scale engineering systems is the extensive and aggressive adoption of automation and autonomy that enable the significant improvement of systems' robustness, efficiency, and performance, with considerably reduced manning and maintenance costs, and the U.S. Navy's DD(X), the next-generation destroyer program, is considered as an extreme example of such a trend. This thesis pursues a modeling solution for performing simulation-based analysis in the conceptual or preliminary design stage of an intelligent, self-reconfigurable ship fluid system, which is one of the concepts of DD(X) engineering plant development. Through the investigations on the Navy's approach for designing a more survivable ship system, it is found that the current naval simulation-based analysis environment is limited by the capability gaps in damage modeling, dynamic model reconfiguration, and simulation speed of the domain specific models, especially fluid network models. As enablers of filling these gaps, two essential elements were identified in the formulation of the modeling method. The first one is the graph-based topological modeling method, which will be employed for rapid model reconstruction and damage modeling, and the second one is the recurrent neural network-based, component-level surrogate modeling method, which will be used to improve the affordability and efficiency of the modeling and simulation (M&S) computations. The integration of the two methods can deliver computationally efficient, flexible, and automation-friendly M&S which will create an environment for more rigorous damage analysis and exploration of design alternatives. As a demonstration for evaluating the developed method, a simulation model of a notional ship fluid system was created, and a damage analysis was performed. Next, the models representing different design configurations of the fluid system were created, and damage analyses were performed with them in order to find an optimal design configuration for system survivability. Finally, the benefits and drawbacks of the developed method were discussed based on the result of the demonstration.
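
    As a loose illustration of the graph-based topological modeling idea (not the thesis's actual model), the sketch below represents a notional ship fluid network with networkx and re-evaluates which service loads remain fed after a damage event removes a junction; all node names and the network layout are hypothetical.

    ```python
    import networkx as nx

    # Notional fluid network: two pumps feed two loads through cross-connected junctions.
    G = nx.Graph()
    G.add_edges_from([
        ("pump_fwd", "junction_1"), ("pump_aft", "junction_2"),
        ("junction_1", "junction_2"),          # cross-connect enabling reconfiguration
        ("junction_1", "load_A"), ("junction_2", "load_B"),
    ])

    def serviceable_loads(graph, sources=("pump_fwd", "pump_aft")):
        """Loads that can still be supplied from at least one surviving pump."""
        loads = [n for n in graph if n.startswith("load")]
        live = [s for s in sources if s in graph]
        return {l for l in loads if any(nx.has_path(graph, s, l) for s in live)}

    print(serviceable_loads(G))            # {'load_A', 'load_B'}

    # Damage event: junction_1 is lost, and the topological model is simply rebuilt.
    G.remove_node("junction_1")
    print(serviceable_loads(G))            # {'load_B'}
    ```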

  3. Analysis of crack initiation and growth in the high level vibration test at Tadotsu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kassir, M.K.; Park, Y.J.; Hofmayer, C.H.

    1993-08-01

    The High Level Vibration Test data are used to assess the accuracy and usefulness of current engineering methodologies for predicting crack initiation and growth in a cast stainless steel pipe elbow under complex, large amplitude loading. The data were obtained by testing at room temperature a large-scale modified model of one loop of a PWR primary coolant system at the Tadotsu Engineering Laboratory in Japan. Fatigue crack initiation time is reasonably predicted by applying a modified local strain approach (Coffin-Manson-Goodman equation) in conjunction with Miner's rule of cumulative damage. Three fracture mechanics methodologies are applied to investigate the crack growth behavior observed in the hot leg of the model. These are: the ΔK methodology (Paris law), ΔJ concepts, and a recently developed limit load stress-range criterion. The report includes a discussion of the pros and cons of the analysis involved in each of the methods, the role played by the key parameters influencing the formulation, and a comparison of the results with the actual crack growth behavior observed in the vibration test program. Some conclusions and recommendations for improvement of the methodologies are also provided.
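
    For reference, the standard textbook forms of the relations named above (not the report's calibrated versions) are:

    ```latex
    % Coffin-Manson(-Basquin) local strain-life relation used for crack initiation:
    \frac{\Delta\varepsilon}{2} \;=\; \frac{\sigma_f'}{E}\,(2N_f)^{b} \;+\; \varepsilon_f'\,(2N_f)^{c}
    % Miner's rule of linear cumulative damage (initiation when D reaches 1):
    D \;=\; \sum_i \frac{n_i}{N_{f,i}}
    % Paris law for fatigue crack growth under a stress-intensity-factor range:
    \frac{da}{dN} \;=\; C\,(\Delta K)^{m}
    ```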

  4. In silico metabolic engineering of Clostridium ljungdahlii for synthesis gas fermentation.

    PubMed

    Chen, Jin; Henson, Michael A

    2016-11-01

    Synthesis gas fermentation is one of the most promising routes to convert synthesis gas (syngas; mainly comprised of H2 and CO) to renewable liquid fuels and chemicals by specialized bacteria. The most commonly studied syngas fermenting bacterium is Clostridium ljungdahlii, which produces acetate and ethanol as its primary metabolic byproducts. Engineering of C. ljungdahlii metabolism to overproduce ethanol, enhance the synthesis of the native byproducts lactate and 2,3-butanediol, and introduce the synthesis of non-native products such as butanol and butyrate has substantial commercial value. We performed in silico metabolic engineering studies using a genome-scale reconstruction of C. ljungdahlii metabolism and the OptKnock computational framework to identify gene knockouts that were predicted to enhance the synthesis of these native products and of non-native products introduced through insertion of the necessary heterologous pathways. The OptKnock-derived strategies were often difficult to assess because increased product synthesis was invariably accompanied by decreased growth. Therefore, the OptKnock strategies were further evaluated using a spatiotemporal metabolic model of a syngas bubble column reactor, a popular technology for large-scale gas fermentation. Unlike flux balance analysis, the bubble column model accounted for the complex tradeoffs between increased product synthesis and reduced growth rates of engineered mutants within the spatially varying column environment. The two-stage methodology for deriving and evaluating metabolic engineering strategies was shown to yield new C. ljungdahlii gene targets that offer the potential for increased product synthesis under realistic syngas fermentation conditions. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
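
    The OptKnock layer builds on flux balance analysis (FBA). As a minimal, self-contained illustration of FBA itself (a toy two-metabolite network, not the genome-scale C. ljungdahlii reconstruction), the sketch below maximizes a product flux subject to steady-state mass balance and flux bounds:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network (columns = reactions): substrate uptake -> A, A -> biomass,
    # A -> P, P -> (export).  Rows are the internal metabolites A and P.
    S = np.array([
        [1.0, -1.0, -1.0,  0.0],   # metabolite A
        [0.0,  0.0,  1.0, -1.0],   # product P
    ])
    bounds = [(0, 10),    # substrate uptake capped at 10 (arbitrary units)
              (1, None),  # require a minimum biomass (growth) flux of 1
              (0, None),
              (0, None)]

    # linprog minimizes, so negate the product-export flux to maximize it.
    c = np.array([0.0, 0.0, 0.0, -1.0])
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("fluxes:", res.x, "-> max product flux:", res.x[3])   # expect 9.0
    ```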

  5. Engineering-Scale Demonstration of DuraLith and Ceramicrete Waste Forms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Josephson, Gary B.; Westsik, Joseph H.; Pires, Richard P.

    2011-09-23

    To support the selection of a waste form for the liquid secondary wastes from the Hanford Waste Immobilization and Treatment Plant, Washington River Protection Solutions (WRPS) has initiated secondary waste form testing on four candidate waste forms. Two of the candidate waste forms have not been developed to the same scale as the more mature waste forms. This work describes engineering-scale demonstrations conducted on the Ceramicrete and DuraLith candidate waste forms. Both candidate waste forms were successfully demonstrated at an engineering scale. A preliminary conceptual design could be prepared for full-scale production of the candidate waste forms. However, both waste forms are still too immature to support a detailed design. Formulations for each candidate waste form need to be developed so that the material has a longer working time after mixing the liquid and solid constituents together. Formulations optimized based on previous lab studies did not have sufficient working time to support large-scale testing. The engineering-scale testing was successfully completed using modified formulations. Further lab development and parametric studies are needed to optimize formulations with adequate working time and to assess the effects of changes in raw materials and process parameters on the final product performance. Studies on the effects of mixing intensity on the initial set time of the waste forms are also needed.

  6. Metabolic engineering of biosynthetic pathway for production of renewable biofuels.

    PubMed

    Singh, Vijai; Mani, Indra; Chaudhary, Dharmendra Kumar; Dhar, Pawan Kumar

    2014-02-01

    Metabolic engineering is an important area of research that involves editing genetic networks to overproduce a certain substance by the cells. Using a combination of genetic, metabolic, and modeling methods, useful substances have been synthesized in the past at industrial scale and in a cost-effective manner. Currently, metabolic engineering is being used to produce sufficient, economical, and eco-friendly biofuels. In the recent past, a number of efforts have been made towards engineering biosynthetic pathways for large scale and efficient production of biofuels from biomass. Given the adoption of metabolic engineering approaches by the biofuel industry, this paper reviews various approaches towards the production and enhancement of renewable biofuels such as ethanol, butanol, isopropanol, hydrogen, and biodiesel. We have also identified specific areas where more work needs to be done in the future.

  7. Aerodynamic characteristics of a large-scale hybrid upper surface blown flap model having four engines

    NASA Technical Reports Server (NTRS)

    Carros, R. J.; Boissevain, A. G.; Aoyagi, K.

    1975-01-01

    Data are presented from an investigation of the aerodynamic characteristics of a large-scale wind tunnel aircraft model that utilized a hybrid upper surface blown flap to augment lift. The hybrid concept of this investigation used a portion of the turbofan exhaust air for blowing over the trailing edge flap to provide boundary layer control. The model, tested in the Ames 40- by 80-foot Wind Tunnel, had a 27.5 deg swept wing of aspect ratio 8 and 4 turbofan engines mounted on the upper surface of the wing. The lift of the model was augmented by turbofan exhaust impingement on the wing upper surface and flap system. Results were obtained for three flap deflections, for some variation of engine nozzle configuration, and for jet thrust coefficients from 0 to 3.0. Six-component longitudinal and lateral data are presented with four-engine operation and with the critical engine out. In addition, a limited number of cross-plots of the data are presented. All of the tests were made with a downwash rake installed instead of a horizontal tail. Some of these downwash data are also presented.

  8. Sustainability Metrics of a Small Scale Turbojet Engine

    NASA Astrophysics Data System (ADS)

    Ekici, Selcuk; Sohret, Yasin; Coban, Kahraman; Altuntas, Onder; Karakoc, T. Hikmet

    2018-05-01

    Over the last decade, sustainable energy consumption has attracted the attention of scientists and researchers. The current paper presents sustainability indicators of a small scale turbojet engine, operated on micro-aerial vehicles, for discussion of the sustainable development of the aviation industry from a different perspective. Experimental data was obtained from an engine at full power load and utilized to conduct an exergy-based sustainability analysis. Exergy efficiency, waste exergy ratio, recoverable exergy ratio, environmental effect factor, exergy destruction factor and exergetic sustainability index are evaluated as exergetic sustainability indicators of the turbojet engine under investigation in the current study. The exergy efficiency of the small scale turbojet engine is calculated as 27.25 % whereas the waste exergy ratio, the exergy destruction factor and the sustainability index of the engine are found to be 0.9756, 0.5466 and 0.2793, respectively.
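
    The reported indicators are mutually consistent under the definitions commonly used in exergy-based sustainability analysis (an assumption here, since the abstract does not restate them):

    ```latex
    r_{\mathrm{eef}} \;=\; \frac{\text{waste exergy ratio}}{\text{exergy efficiency}}
                    \;=\; \frac{0.9756}{0.2725} \;\approx\; 3.58,
    \qquad
    \Theta_{\mathrm{esi}} \;=\; \frac{1}{r_{\mathrm{eef}}} \;\approx\; 0.279,
    ```

    which reproduces the reported exergetic sustainability index of 0.2793 to within rounding.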

  9. Turbofan engine demonstration of sensor failure detection

    NASA Technical Reports Server (NTRS)

    Merrill, Walter C.; Delaat, John C.; Abdelwahab, Mahmood

    1991-01-01

    In this paper, the results of a full-scale engine demonstration of a sensor failure detection algorithm are presented. The algorithm detects, isolates, and accommodates sensor failures using analytical redundancy. The experimental hardware, including the F100 engine, is described. Demonstration results were obtained over a large portion of a typical flight envelope for the F100 engine. They include both subsonic and supersonic conditions at both medium and full, non-afterburning, power. Estimated accuracy, minimum detectable levels of sensor failures, and failure accommodation performance for an F100 turbofan engine control system are discussed.

  10. Computer code for the prediction of nozzle admittance

    NASA Technical Reports Server (NTRS)

    Nguyen, Thong V.

    1988-01-01

    A procedure which can accurately characterize injector designs for large thrust (0.5 to 1.5 million pounds), high pressure (500 to 3000 psia) LOX/hydrocarbon engines is currently under development. In this procedure, a rectangular cross-section combustion chamber is to be used to simulate the lower transverse frequency modes of the large-scale chamber. The chamber will be sized so that the first width mode of the rectangular chamber corresponds to the first tangential mode of the full-scale chamber. Test data to be obtained from the rectangular chamber will be used to assess the full-scale engine stability. This requires the development of combustion stability models for rectangular chambers. As part of the combustion stability model development, a computer code, NOAD, based on existing theory, was developed to calculate the nozzle admittances for both rectangular and axisymmetric nozzles. This code is described in detail.
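
    The width-mode/tangential-mode matching described above follows from elementary duct acoustics. Assuming hard walls and the usual closed-form mode frequencies (an assumption on my part, not a statement of the report's procedure), the required rectangular chamber width is a fixed fraction of the full-scale chamber diameter:

    ```latex
    f_{1W} \;=\; \frac{c}{2W}, \qquad
    f_{1T} \;=\; \frac{1.8412\,c}{\pi D}
    \quad\Longrightarrow\quad
    f_{1W} = f_{1T} \;\Leftrightarrow\; W \;=\; \frac{\pi D}{2\,(1.8412)} \;\approx\; 0.853\,D .
    ```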

  11. The development of a solar-powered residential heating and cooling system

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Efforts to demonstrate the engineering feasibility of utilizing solar power for residential heating and cooling are described. These efforts were concentrated on the analysis, design, and test of a full-scale demonstration system which is currently under construction at the National Aeronautics and Space Administration, Marshall Space Flight Center, Huntsville, Alabama. The basic solar heating and cooling system under development utilizes a flat plate solar energy collector, a large water tank for thermal energy storage, heat exchangers for space heating and water heating, and an absorption cycle air conditioner for space cooling.

  12. Distributed sensor networks: a cellular nonlinear network perspective.

    PubMed

    Haenggi, Martin

    2003-12-01

    Large-scale networks of integrated wireless sensors are becoming increasingly tractable. Advances in hardware technology and engineering design have led to dramatic reductions in size, power consumption, and cost for digital circuitry and wireless communications. Networking, self-organization, and distributed operation are crucial ingredients to harness the sensing, computing, and communication capabilities of the nodes into a complete system. This article shows that such networks can be considered cellular nonlinear networks (CNNs), and that their analysis and design may greatly benefit from the rich theoretical results available for CNNs.

  13. Performance Characterization of Global Address Space Applications: A Case Study with NWChem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Jeffrey R.; Krishnamoorthy, Sriram; Shende, Sameer

    The use of global address space languages and one-sided communication for complex applications is gaining attention in the parallel computing community. However, the lack of good evaluative methods to observe multiple levels of performance makes it difficult to isolate the cause of performance deficiencies and to understand the fundamental limitations of system and application design for future improvement. NWChem is a popular computational chemistry package which depends on the Global Arrays/ARMCI suite for partitioned global address space functionality to deliver high-end molecular modeling capabilities. A workload characterization methodology was developed to support NWChem performance engineering on large-scale parallel platforms. The research involved both the integration of performance instrumentation and measurement in the NWChem software, as well as the analysis of one-sided communication performance in the context of NWChem workloads. Scaling studies were conducted for NWChem on Blue Gene/P and on two large-scale clusters using different-generation InfiniBand interconnects and x86 processors. The performance analysis and results show how subtle changes in the runtime parameters related to the communication subsystem can have a significant impact on performance behavior. The tool has successfully identified several algorithmic bottlenecks which are already being tackled by computational chemists to improve NWChem performance.

  14. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume III. The Plankton and Benthos of Lake Conway, Florida,

    DTIC Science & Technology

    1981-11-01

    Report 2 of a series (in 7 volumes) on the large-scale operations management test of use of the white amur for control of problem aquatic plants, prepared by the University of Florida, Gainesville, Department of Environmental Engineering; this volume presents first-year poststocking results for the plankton and benthos of Lake Conway, Florida.

  15. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    NASA Astrophysics Data System (ADS)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.
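
    For a flavor of the programming framework described above, the snippet below uses the present-day Earth Engine Python client to build a "best-pixel" (median) composite and a spectral index server-side; the collection ID, band names, region, and dates are illustrative assumptions rather than details taken from the abstract.

    ```python
    import ee

    ee.Initialize()   # assumes prior one-time authentication via ee.Authenticate()

    # Hypothetical area of interest and one-year window.
    region = ee.Geometry.Point(-122.3, 37.9).buffer(50000)
    collection = (ee.ImageCollection("LANDSAT/LT05/C02/T1_L2")   # assumed collection ID
                  .filterDate("2010-01-01", "2010-12-31")
                  .filterBounds(region))

    # "Best-pixel" composite (median over the year) and an NDVI spectral index.
    composite = collection.median()
    ndvi = composite.normalizedDifference(["SR_B4", "SR_B3"]).rename("NDVI")
    print(ndvi.getInfo()["bands"])
    ```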

  16. Impurity engineering of Czochralski silicon used for ultra large-scaled-integrated circuits

    NASA Astrophysics Data System (ADS)

    Yang, Deren; Chen, Jiahe; Ma, Xiangyang; Que, Duanlin

    2009-01-01

    Impurities in Czochralski silicon (Cz-Si) used for ultra-large-scale-integrated (ULSI) circuits have been believed to deteriorate the performance of devices. In this paper, a review of recent progress from our investigation of internal gettering in Cz-Si wafers doped with nitrogen, germanium, and/or a high content of carbon is presented. It has been suggested that these impurities enhance oxygen precipitation and create both denser bulk microdefects and an adequate denuded zone of the desired width, which benefits the internal gettering of metal contamination. Based on the experimental facts, a potential mechanism for the effect of impurity doping on the internal gettering structure is interpreted, and a new concept of 'impurity engineering' for Cz-Si used for ULSI is proposed.

  17. Shaping carbon nanostructures by controlling the synthesis process

    NASA Astrophysics Data System (ADS)

    Merkulov, Vladimir I.; Guillorn, Michael A.; Lowndes, Douglas H.; Simpson, Michael L.; Voelkl, Edgar

    2001-08-01

    The ability to control the nanoscale shape of nanostructures in a large-scale synthesis process is an essential and elusive goal of nanotechnology research. Here, we report significant progress toward that goal. We have developed a technique that enables controlled synthesis of nanoscale carbon structures with conical and cylinder-on-cone shapes and provides the capability to dynamically change the nanostructure shape during the synthesis process. In addition, we present a phenomenological model that explains the formation of these nanostructures and provides insight into methods for precisely engineering their shape. Since the growth process we report is highly deterministic in allowing large-scale synthesis of precisely engineered nanoscale components at defined locations, our approach provides an important tool for a practical nanotechnology.

  18. Data Management Practices and Perspectives of Atmospheric Scientists and Engineering Faculty

    ERIC Educational Resources Information Center

    Wiley, Christie; Mischo, William H.

    2016-01-01

    This article analyzes 21 in-depth interviews of engineering and atmospheric science faculty at the University of Illinois Urbana-Champaign (UIUC) to determine faculty data management practices and needs within the context of their research activities. A detailed literature review of previous large-scale and institutional surveys and interviews…

  19. Can Models Capture the Complexity of the Systems Engineering Process?

    NASA Astrophysics Data System (ADS)

    Boppana, Krishna; Chow, Sam; de Weck, Olivier L.; Lafon, Christian; Lekkakos, Spyridon D.; Lyneis, James; Rinaldi, Matthew; Wang, Zhiyong; Wheeler, Paul; Zborovskiy, Marat; Wojcik, Leonard A.

    Many large-scale, complex systems engineering (SE) programs have been problematic; a few examples are listed below (Bar-Yam, 2003 and Cullen, 2004), and many others have been late, well over budget, or have failed: Hilton/Marriott/American Airlines system for hotel reservations and flights; 1988-1992; 125 million; "scrapped"

  20. Incorporating Learning Theory into Existing Systems Engineering Models

    DTIC Science & Technology

    2013-09-01

    Master's thesis (September 2013) by Valentine Leo; thesis advisor Gary O. Langford. The available excerpt contains only table-of-contents and reference fragments, including a classification of learning theories into behaviorism, cognitivism, constructivism, and connectivism.

  1. Influence of small-scale disturbances by kangaroo rats on Chihuahuan Desert ants

    Treesearch

    R. L. Schooley; B. T. Bestelmeyer; J. F. Kelly

    2000-01-01

    Banner-tailed kangaroo rats (Dipodomys spectabilis) are prominent ecosystem engineers that build large mounds that influence the spatial structuring of fungi, plants, and some ground-dwelling animals. Ants are diverse and functionally important components of arid ecosystems; some species are also ecosystem engineers. We investigated the effects of...

  2. Engineering microbial cell factories for the production of plant natural products: from design principles to industrial-scale production.

    PubMed

    Liu, Xiaonan; Ding, Wentao; Jiang, Huifeng

    2017-07-19

    Plant natural products (PNPs) are widely used as pharmaceuticals, nutraceuticals, seasonings, pigments, etc., with a huge commercial value on the global market. However, most of these PNPs are still being extracted from plants. A resource-conserving and environment-friendly synthesis route for PNPs that utilizes microbial cell factories has attracted increasing attention since the 1940s. However, at present only a handful of PNPs are being produced by microbial cell factories at an industrial scale, and there are still many challenges in their large-scale application. One of the challenges is that most biosynthetic pathways of PNPs are still unknown, which largely limits the number of candidate PNPs for heterologous microbial production. Another challenge is that the metabolic fluxes toward the target products in microbial hosts are often hindered by poor precursor supply, low catalytic activity of enzymes, and obstructed product transport. Consequently, despite intensive studies on the metabolic engineering of microbial hosts, the fermentation costs of most heterologously produced PNPs are still too high for industrial-scale production. In this paper, we review several aspects of PNP production in microbial cell factories, including important design principles and recent progress in pathway mining and metabolic engineering. In addition, implemented cases of industrial-scale production of PNPs in microbial cell factories are also highlighted.

  3. Engineering large cartilage tissues using dynamic bioreactor culture at defined oxygen conditions.

    PubMed

    Daly, Andrew C; Sathy, Binulal N; Kelly, Daniel J

    2018-01-01

    Mesenchymal stem cells maintained in appropriate culture conditions are capable of producing robust cartilage tissue. However, gradients in nutrient availability that arise during three-dimensional culture can result in the development of spatially inhomogeneous cartilage tissues with core regions devoid of matrix. Previous attempts at developing dynamic culture systems to overcome these limitations have reported suppression of mesenchymal stem cell chondrogenesis compared to static conditions. We hypothesize that by modulating oxygen availability during bioreactor culture, it is possible to engineer cartilage tissues of scale. The objective of this study was to determine whether dynamic bioreactor culture, at defined oxygen conditions, could facilitate the development of large, spatially homogeneous cartilage tissues using mesenchymal stem cell-laden hydrogels. A dynamic culture regime was directly compared to static conditions for its capacity to support chondrogenesis of mesenchymal stem cells in both small and large alginate hydrogels. The influence of external oxygen tension on the response to the dynamic culture conditions was explored by performing the experiment at 20% O2 and 3% O2. At 20% O2, dynamic culture significantly suppressed chondrogenesis in engineered tissues of all sizes. In contrast, at 3% O2 dynamic culture significantly enhanced the distribution and amount of cartilage matrix components (sulphated glycosaminoglycan and collagen II) in larger constructs compared to static conditions. Taken together, these results demonstrate that dynamic culture regimes that provide adequate nutrient availability and a low oxygen environment can be employed to engineer large homogeneous cartilage tissues. Such culture systems could facilitate the scaling up of cartilage tissue engineering strategies towards clinically relevant dimensions.

  4. Use of randomized sampling for analysis of metabolic networks.

    PubMed

    Schellenberger, Jan; Palsson, Bernhard Ø

    2009-02-27

    Genome-scale metabolic network reconstructions in microorganisms have been formulated and studied for about 8 years. The constraint-based approach has shown great promise in analyzing the systemic properties of these network reconstructions. Notably, constraint-based models have been used successfully to predict the phenotypic effects of knock-outs and for metabolic engineering. The inherent uncertainty in both parameters and variables of large-scale models is significant and is well suited to study by Monte Carlo sampling of the solution space. These techniques have been applied extensively to the reaction rate (flux) space of networks, with more recent work focusing on dynamic/kinetic properties. Monte Carlo sampling as an analysis tool has many advantages, including the ability to work with missing data, the ability to apply post-processing techniques, and the ability to quantify uncertainty and to optimize experiments to reduce uncertainty. We present an overview of this emerging area of research in systems biology.
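
    As a minimal sketch of one widely used sampler for the flux solution space, the function below performs hit-and-run sampling inside the polytope {v : S v = 0, lb <= v <= ub}; the caller must supply a feasible interior starting point, and production tools use more elaborate variants (warm-up, artificial centering, and so on).

    ```python
    import numpy as np
    from scipy.linalg import null_space

    def hit_and_run(S, lb, ub, v0, n_samples=1000, seed=0):
        """Hit-and-run sampling of {v : S v = 0, lb <= v <= ub} from interior point v0."""
        rng = np.random.default_rng(seed)
        N = null_space(S)                      # directions that preserve S v = 0
        v, out = np.asarray(v0, dtype=float).copy(), []
        for _ in range(n_samples):
            d = N @ rng.standard_normal(N.shape[1])
            d /= np.linalg.norm(d)
            # Admissible step range [t_lo, t_hi] keeping lb <= v + t*d <= ub.
            with np.errstate(divide="ignore", invalid="ignore"):
                hi = np.where(d > 0, (ub - v) / d, np.where(d < 0, (lb - v) / d, np.inf))
                lo = np.where(d > 0, (lb - v) / d, np.where(d < 0, (ub - v) / d, -np.inf))
            v = v + rng.uniform(np.max(lo), np.min(hi)) * d
            out.append(v.copy())
        return np.array(out)
    ```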

  5. Ship Production Symposium Held in Seattle, Washington on August 24-26, 1988 (The National Shipbuilding Research Program)

    DTIC Science & Technology

    1988-08-01

    Proceedings of the Ship Production Symposium (National Shipbuilding Research Program) held in Seattle, Washington, on 24-26 August 1988. The available excerpt contains only fragmentary passages, touching on shipyard engineering organization, work packages that straddle a bulkhead during hot work, and the extension of zone-oriented concepts to large-scale repair work.

  6. Quality Function Deployment for Large Systems

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1992-01-01

    Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.

  7. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Allcock, William; Beggio, Chris

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  8. Using Computing and Data Grids for Large-Scale Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2001-01-01

    We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.

  9. Ultra Efficient Engine Technology Systems Integration and Environmental Assessment

    NASA Technical Reports Server (NTRS)

    Daggett, David L.; Geiselhart, Karl A. (Technical Monitor)

    2002-01-01

    This study documents the design and analysis of four types of advanced technology commercial transport airplane configurations (small, medium, large, and very large) with an assumed technology readiness date of 2010. These airplane configurations were used as a platform to evaluate the design concept and installed performance of advanced technology engines being developed under the NASA Ultra Efficient Engine Technology (UEET) program. Upon installation of the UEET engines onto the UEET advanced technology airframes, the small and medium airplanes both achieved an additional 16% increase in fuel efficiency when using GE advanced turbofan engines. The large airplane achieved an 18% increase in fuel efficiency when using the P&W geared fan engine. The very large airplane (i.e., the BWB), also using P&W geared fan engines, achieved only an additional 16%, which was attributed to a non-optimized airplane/engine combination.

  10. The SCALE-UP Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2015-03-01

    The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also in practicing important 21st century skills like problem solving, communication, and teamsmanship. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  11. Large scale static tests of a tilt-nacelle V/STOL propulsion/attitude control system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The concept of a combined V/STOL propulsion and aircraft attitude control system was subjected to large scale engine tests. The tilt nacelle/attitude control vane package consisted of the T55 powered Hamilton Standard Q-Fan demonstrator. Vane forces, moments, thermal and acoustic characteristics as well as the effects on propulsion system performance were measured under conditions simulating hover in and out of ground effect.

  12. Modal Testing of the NPSAT1 Engineering Development Unit

    DTIC Science & Technology

    2012-07-01

    Master's thesis on modal testing of the NPSAT1 engineering development unit. The available excerpt contains only fragments: a standard German-language declaration of independent work, a note that natural frequencies appear as peaks of the first CMIF when plotted on a logarithmic scale and that multiple modes can be detected from it, and Ewins's caution that no large-scale modal test should be permitted to proceed until some preliminary SDOF analyses have been performed.

  13. Rotordynamic Feasibility of a Conceptual Variable-Speed Power Turbine Propulsion System for Large Civil Tilt-Rotor Applications

    NASA Technical Reports Server (NTRS)

    Howard, Samuel

    2012-01-01

    A variable-speed power turbine concept is analyzed for rotordynamic feasibility in a Large Civil Tilt-Rotor (LCTR) class engine. Implementation of a variable-speed power turbine in a rotorcraft engine would enable high efficiency propulsion at the high forward velocities anticipated of large tilt-rotor vehicles. Therefore, rotordynamics is a critical issue for this engine concept. A preliminary feasibility study is presented herein to address this concern and identify if variable-speed is possible in a conceptual engine sized for the LCTR. The analysis considers critical speed placement in the operating speed envelope, stability analysis up to the maximum anticipated operating speed, and potential unbalance response amplitudes to determine that a variable-speed power turbine is likely to be challenging, but not impossible to achieve in a tilt-rotor propulsion engine.

  14. Analysis of temporal-longitudinal-latitudinal characteristics in the global ionosphere based on tensor rank-1 decomposition

    NASA Astrophysics Data System (ADS)

    Lu, Shikun; Zhang, Hao; Li, Xihai; Li, Yihong; Niu, Chao; Yang, Xiaoyun; Liu, Daizhi

    2018-03-01

    Combining analyses of spatial and temporal characteristics of the ionosphere is of great significance for scientific research and engineering applications. Tensor decomposition is performed to explore the temporal-longitudinal-latitudinal characteristics in the ionosphere. Three-dimensional tensors are established based on the time series of ionospheric vertical total electron content maps obtained from the Centre for Orbit Determination in Europe. To obtain large-scale characteristics of the ionosphere, rank-1 decomposition is used to obtain U^{(1)}, U^{(2)}, and U^{(3)}, which are the resulting vectors for the time, longitude, and latitude modes, respectively. Our initial finding is that the correspondence between the frequency spectrum of U^{(1)} and solar variation indicates that rank-1 decomposition primarily describes large-scale temporal variations in the global ionosphere caused by the Sun. Furthermore, the time lags between the maxima of the ionospheric U^{(2)} and solar irradiation range from 1 to 3.7 h without seasonal dependence. The differences in time lags may indicate different interactions between processes in the magnetosphere-ionosphere-thermosphere system. Based on the dataset displayed in the geomagnetic coordinates, the position of the barycenter of U^{(3)} provides evidence for north-south asymmetry (NSA) in the large-scale ionospheric variations. The daily variation in such asymmetry indicates the influences of solar ionization. The diurnal geomagnetic coordinate variations in U^{(3)} show that the large-scale EIA (equatorial ionization anomaly) variations during the day and night have similar characteristics. Considering the influences of geomagnetic disturbance on ionospheric behavior, we select the geomagnetic quiet GIMs to construct the ionospheric tensor. The results indicate that the geomagnetic disturbances have little effect on large-scale ionospheric characteristics.
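
    A compact way to reproduce a rank-1 decomposition like the one described above on a time × longitude × latitude stack of TEC maps is alternating least squares; the plain-numpy sketch below (with a random array standing in for the CODE GIM data) returns the three mode vectors, analogous to U^{(1)}, U^{(2)}, and U^{(3)}, up to a common scale factor.

    ```python
    import numpy as np

    def rank1_cp(T, n_iter=200, seed=0):
        """Rank-1 canonical polyadic approximation of a 3-way tensor by alternating
        least squares: T ~= lam * outer(u1, u2, u3) with unit-norm factor vectors."""
        rng = np.random.default_rng(seed)
        _, J, K = T.shape
        u2 = rng.standard_normal(J); u2 /= np.linalg.norm(u2)
        u3 = rng.standard_normal(K); u3 /= np.linalg.norm(u3)
        for _ in range(n_iter):
            u1 = np.einsum("ijk,j,k->i", T, u2, u3); u1 /= np.linalg.norm(u1)
            u2 = np.einsum("ijk,i,k->j", T, u1, u3); u2 /= np.linalg.norm(u2)
            u3 = np.einsum("ijk,i,j->k", T, u1, u2); u3 /= np.linalg.norm(u3)
        lam = np.einsum("ijk,i,j,k->", T, u1, u2, u3)
        return lam, u1, u2, u3

    # Stand-in for a stack of global ionosphere maps: 24 epochs x 73 longitudes x 71 latitudes.
    tec = np.random.default_rng(1).random((24, 73, 71))
    lam, u_time, u_lon, u_lat = rank1_cp(tec)
    ```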

  15. Maximizing the sensitivity and reliability of peptide identification in large-scale proteomic experiments by harnessing multiple search engines.

    PubMed

    Yu, Wen; Taylor, J Alex; Davis, Michael T; Bonilla, Leo E; Lee, Kimberly A; Auger, Paul L; Farnsworth, Chris C; Welcher, Andrew A; Patterson, Scott D

    2010-03-01

    Despite recent advances in qualitative proteomics, the automatic identification of peptides with optimal sensitivity and accuracy remains a difficult goal. To address this deficiency, a novel algorithm, Multiple Search Engines, Normalization and Consensus is described. The method employs six search engines and a re-scoring engine to search MS/MS spectra against protein and decoy sequences. After the peptide hits from each engine are normalized to error rates estimated from the decoy hits, peptide assignments are then deduced using a minimum consensus model. These assignments are produced in a series of progressively relaxed false-discovery rates, thus enabling a comprehensive interpretation of the data set. Additionally, the estimated false-discovery rate was found to have good concordance with the observed false-positive rate calculated from known identities. Benchmarking against standard proteins data sets (ISBv1, sPRG2006) and their published analysis, demonstrated that the Multiple Search Engines, Normalization and Consensus algorithm consistently achieved significantly higher sensitivity in peptide identifications, which led to increased or more robust protein identifications in all data sets compared with prior methods. The sensitivity and the false-positive rate of peptide identification exhibit an inverse-proportional and linear relationship with the number of participating search engines.
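
    The decoy-based normalization and minimum-consensus steps can be caricatured in a few lines (the data layout, scoring convention, and thresholds below are assumptions for illustration, not the published algorithm; assume one best hit per peptide per engine):

    ```python
    from collections import defaultdict

    def score_cutoff(hits, target_fdr=0.01):
        """hits: (peptide, score, is_decoy) tuples from one engine, higher score = better.
        Returns the lowest score at which the running decoy/target ratio stays within target_fdr."""
        targets = decoys = 0
        cutoff = float("inf")
        for pep, score, is_decoy in sorted(hits, key=lambda h: h[1], reverse=True):
            decoys += is_decoy
            targets += not is_decoy
            if targets and decoys / targets <= target_fdr:
                cutoff = score
        return cutoff

    def consensus_peptides(per_engine_hits, target_fdr=0.01, min_engines=2):
        """Accept target peptides passing the decoy-estimated FDR cutoff in >= min_engines engines."""
        votes = defaultdict(int)
        for hits in per_engine_hits:
            cutoff = score_cutoff(hits, target_fdr)
            for pep, score, is_decoy in hits:
                if not is_decoy and score >= cutoff:
                    votes[pep] += 1
        return {pep for pep, n in votes.items() if n >= min_engines}
    ```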

  16. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    NASA Astrophysics Data System (ADS)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis, (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method developed and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  17. STE thrust chamber technology: Main injector technology program and nozzle Advanced Development Program (ADP)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The purpose of the STME Main Injector Program was to enhance the technology base for the large-scale main injector-combustor system of oxygen-hydrogen booster engines in the areas of combustion efficiency, chamber heating rates, and combustion stability. The initial task of the Main Injector Program, focused on analysis and theoretical predictions using existing models, was complemented by the design, fabrication, and test at MSFC of a subscale calorimetric, 40,000-pound thrust class, axisymmetric thrust chamber operating at approximately 2,250 psi and a 7:1 expansion ratio. Test results were used to further define combustion stability bounds, combustion efficiency, and heating rates using a large injector scale similar to the Pratt & Whitney (P&W) STME main injector design configuration including the tangential entry swirl coaxial injection elements. The subscale combustion data was used to verify and refine analytical modeling simulation and extend the database range to guide the design of the large-scale system main injector. The subscale injector design incorporated fuel and oxidizer flow area control features which could be varied; this allowed testing of several design points so that the STME conditions could be bracketed. The subscale injector design also incorporated high-reliability and low-cost fabrication techniques such as a one-piece electrical discharged machined (EDMed) interpropellant plate. Both subscale and large-scale injectors incorporated outer row injector elements with scarfed tip features to allow evaluation of reduced heating rates to the combustion chamber.

  18. Structural Element Testing in Support of the Design of the NASA Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Kellas, Sotiris; Jackson, Wade C.; Thesken, John C.; Schleicher, Eric; Wagner, Perry; Kirsch, Michael T.

    2012-01-01

    In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). For the design and manufacturing of the CCM, the team adopted the building block approach, in which design and manufacturing risks were mitigated through manufacturing trials and structural testing at various levels of complexity. Following NASA's Structural Design Verification Requirements, a further objective was the verification of design analysis methods and the provision of design data for critical structural features. Test articles increasing in complexity, from basic material characterization coupons through structural feature elements and large structural components to full-scale structures, were evaluated. This paper discusses only four element tests, three of which include joints and one of which includes a tapering honeycomb core detail. For each test series, specimen details, instrumentation, test results, a brief analysis description, test-analysis correlation, and conclusions are included.

  19. Cloud Computing: A model Construct of Real-Time Monitoring for Big Dataset Analytics Using Apache Spark

    NASA Astrophysics Data System (ADS)

    Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer

    2018-01-01

    The volume of data being collected, analyzed, and stored has exploded in recent years, particularly in relation to activity on cloud computing platforms. While large-scale data processing, analysis, and storage platforms such as cloud computing are increasingly widespread, the major challenge today is how to monitor and control these massive amounts of data and perform analysis in real time at scale. Traditional methods and model systems are unable to cope with these quantities of data in real time. Here we present a new methodology for constructing a model that optimizes the performance of real-time monitoring of big datasets, combining machine learning algorithms with Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair. As a case study, we use the failure of Virtual Machines (VMs) to start up. The methodology ensures that the most sensible action is carried out during fine-grained monitoring and yields the most effective and cost-saving fault repair through three control steps: (I) data collection, (II) an analysis engine, and (III) a decision engine. We found that running this novel methodology can save a considerable amount of time compared with the Hadoop model, without sacrificing classification accuracy or performance. The accuracy of the proposed method (92.13%) is an improvement on traditional approaches.
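
    As a rough illustration of the streaming-analysis stage, the sketch below uses PySpark Structured Streaming to count hypothetical VM start-up failures per minute. It is not the authors' implementation; the field names (vm_id, status, latency_ms, event_time) and the input path are assumptions made for the example.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F
        from pyspark.sql.types import (StructType, StructField, StringType,
                                       DoubleType, TimestampType)

        spark = SparkSession.builder.appName("vm-startup-monitor").getOrCreate()

        schema = StructType([
            StructField("vm_id", StringType()),
            StructField("status", StringType()),        # e.g. "started" or "failed"
            StructField("latency_ms", DoubleType()),
            StructField("event_time", TimestampType()),
        ])

        # Read newline-delimited JSON events as an unbounded stream.
        events = spark.readStream.schema(schema).json("/data/vm_events/")

        # Count failures per one-minute window: a crude stand-in for the
        # fine-grained fault-diagnosis stage described in the abstract.
        failures = (events
                    .filter(F.col("status") == "failed")
                    .groupBy(F.window("event_time", "1 minute"))
                    .count())

        query = (failures.writeStream
                 .outputMode("complete")
                 .format("console")
                 .start())
        query.awaitTermination()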

  20. A Validity and Reliability Study of the Basic Electronics Skills Self-Efficacy Scale (BESS)

    ERIC Educational Resources Information Center

    Korkmaz, Ö.; Korkmaz, M. K.

    2016-01-01

    The aim of this study is to improve a measurement tool to evaluate the self-efficacy of Electrical-Electronics Engineering students through their basic electronics skills. The sample group is composed of 124 Electrical-Electronics engineering students. The validity of the scale is analyzed with two different methods through factor analysis and…

  1. New tools for non-invasive exploration of collagen network in cartilaginous tissue-engineered substitute.

    PubMed

    Henrionnet, Christel; Dumas, Dominique; Hupont, Sébastien; Stoltz, Jean François; Mainard, Didier; Gillet, Pierre; Pinzano, Astrid

    2017-01-01

    In tissue engineering approaches, the quality of the substitute is a key element in determining its ability to treat cartilage defects. In clinical practice, however, evaluation of tissue-engineered cartilage substitute quality is not possible because the current standard procedure, histology, is invasive. The aim of this work was to validate an innovative system, based on two-photon excitation lasers adapted to an optical macroscope, to evaluate at the macroscopic scale the collagen network in cartilage tissue-engineered substitutes, in comparison with gold-standard histologic and immunohistochemical techniques for visualizing type II collagen. The system made it possible to differentiate the quality of the collagen network between ITS and TGF-β1 treatments. Multiscale large-field imaging combined with multimodal approaches (SHG-TCSPC) at the macroscopic scale represents an innovative and non-invasive technique to monitor the quality of the collagen network in cartilage tissue-engineered substitutes before in vivo implantation.

  2. Engineering a Large Scale Indium Nanodot Array for Refractive Index Sensing.

    PubMed

    Xu, Xiaoqing; Hu, Xiaolin; Chen, Xiaoshu; Kang, Yangsen; Zhang, Zhiping; B Parizi, Kokab; Wong, H-S Philip

    2016-11-23

    In this work, we developed a simple method to fabricate 12 × 4 mm² large-scale nanostructure arrays and investigated the feasibility of indium nanodot (ND) arrays with different diameters and periods for refractive index sensing. Absorption resonances at multiple wavelengths from the visible to the near-infrared range were observed for various incident angles in a variety of media. By engineering the ND array with a centered square lattice, we successfully enhanced the sensitivity by 60% and improved the figure of merit (FOM) by 190%. The evolution of the resonance dips in the reflection spectra of the square and centered square lattices, from air to water, matches well with the results of Lumerical FDTD simulations. The improvement in sensitivity is due to the enhancement of the local electromagnetic field (E-field) near the NDs with the centered square lattice, as revealed by E-field simulation at the resonance wavelengths. The E-field is enhanced due to coupling between the two square ND arrays with [Formula: see text]x period at phase matching. This work illustrates an effective way to engineer and fabricate a refractive index sensor at a large scale. It is the first experimental demonstration of a poor-metal (indium) nanostructure array for refractive index sensing, and it demonstrates a centered square lattice for higher sensitivity and as a better basic platform for more complex sensor designs.

  3. Small-scale test program to develop a more efficient swivel nozzle thrust deflector for V/STOL lift/cruise engines

    NASA Technical Reports Server (NTRS)

    Schlundt, D. W.

    1976-01-01

    The installed performance degradation of a swivel nozzle thrust deflector system observed at increased vectoring angles during a large-scale test program was investigated and improved. Small-scale models were used to generate performance data for analyzing selected swivel nozzle configurations. A single-swivel nozzle design model with five different nozzle configurations and a twin-swivel nozzle design model, scaled to 0.15 of the size of the large-scale test hardware, were statically tested at low exhaust pressure ratios of 1.4, 1.3, 1.2, and 1.1 and vectored at four nozzle positions, from the 0 deg cruise position through the 90 deg vertical position used for the VTOL mode.

  4. Establishment of a National Wind Energy Center at University of Houston

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Su Su

    The DOE-supported project objectives are to establish a national wind energy center (NWEC) at the University of Houston and to conduct research addressing critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific and technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) through multi-disciplinary research, introduce technology innovations in advanced wind-turbine materials, processing and manufacturing technology, design and simulation, and testing and reliability assessment methods for future wind turbine systems aimed at cost-effective production of offshore wind energy. To achieve these goals, the following technical tasks were planned and executed during the period from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems; (2) applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) integration of offshore wind-turbine design, advanced materials, and manufacturing technologies; (4) integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) education and training of graduate and undergraduate students and post-doctoral researchers; and (6) development of a national offshore wind turbine blade research facility. The research program addresses both basic science and engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation. The results advance current understanding of many important scientific issues and provide technical information for the design of future large wind turbines with advanced composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses, research, and participation in the center's large multi-disciplinary projects. These students and researchers are now employed by the wind industry, national laboratories, and universities, supporting the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks of the program and is a national asset available for use by domestic and international researchers in the wind energy arena.

  5. Usage of Parameterized Fatigue Spectra and Physics-Based Systems Engineering Models for Wind Turbine Component Sizing: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsons, Taylor; Guo, Yi; Veers, Paul

    Software models that use design-level input variables and physics-based engineering analysis for estimating the mass and geometrical properties of components in large-scale machinery can be very useful for analyzing design trade-offs in complex systems. This study uses DriveSE, an OpenMDAO-based drivetrain model that uses stress and deflection criteria to size drivetrain components within a geared, upwind wind turbine. Because a full lifetime fatigue load spectrum can only be defined using computationally-expensive simulations in programs such as FAST, a parameterized fatigue loads spectrum that depends on wind conditions, rotor diameter, and turbine design life has been implemented. The parameterized fatigue spectrum is only used in this paper to demonstrate the proposed fatigue analysis approach. This paper details a three-part investigation of the parameterized approach and a comparison of the DriveSE model with and without fatigue analysis on the main shaft system. It compares loads from three turbines of varying size and determines if and when fatigue governs drivetrain sizing compared to extreme load-driven design. It also investigates the model's sensitivity to shaft material parameters. The intent of this paper is to demonstrate how fatigue considerations in addition to extreme loads can be brought into a system engineering optimization.
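
    A fatigue load spectrum of the kind described here is typically collapsed into a damage-equivalent load (DEL) with Miner's rule before it is used for sizing. The sketch below shows that calculation in plain Python; the load amplitudes, cycle counts, and Woehler exponent are placeholders, not DriveSE values.

        import numpy as np

        def damage_equivalent_load(amplitudes, cycles, m=4.0, n_eq=1.0e7):
            """DEL = (sum(n_i * L_i^m) / N_eq)^(1/m) for Woehler exponent m."""
            amplitudes = np.asarray(amplitudes, dtype=float)
            cycles = np.asarray(cycles, dtype=float)
            return (np.sum(cycles * amplitudes**m) / n_eq) ** (1.0 / m)

        # Hypothetical 20-year main-shaft bending-moment spectrum (kN*m, cycle counts).
        amps = [500.0, 1200.0, 2500.0, 4000.0]
        counts = [5.0e8, 5.0e7, 1.0e6, 1.0e4]
        print(f"DEL ~ {damage_equivalent_load(amps, counts):.1f} kN*m")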

  6. On the use of metabolic control analysis in the optimization of cyanobacterial biosolar cell factories.

    PubMed

    Angermayr, S Andreas; Hellingwerf, Klaas J

    2013-09-26

    Oxygenic photosynthesis will have a key role in a sustainable future. It is therefore significant that this process can be engineered in organisms such as cyanobacteria to construct cell factories that catalyze the (sun)light-driven conversion of CO2 and water into products like ethanol, butanol, or other biofuels or lactic acid, a bioplastic precursor, and oxygen as a byproduct. It is of key importance to optimize such cell factories to maximal efficiency. This holds for their light-harvesting capabilities under, for example, circadian illumination in large-scale photobioreactors. However, this also holds for the "dark" reactions of photosynthesis, that is, the conversion of CO2, NADPH, and ATP into a product. Here, we present an analysis, based on metabolic control theory, to estimate the optimal capacity for product formation with which such cyanobacterial cell factories have to be equipped. Engineered l-lactic acid producing Synechocystis sp. PCC6803 strains are used to identify the relation between production rate and enzymatic capacity. The analysis shows that the engineered cell factories for l-lactic acid are fully limited by the metabolic capacity of the product-forming pathway. We attribute this to the fact that currently available promoter systems in cyanobacteria lack the genetic capacity to provide sufficient expression in single-gene doses.
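
    The quantity at the heart of this kind of analysis is the flux control coefficient, C = (dJ/J)/(dE/E). The toy model below is an assumption for illustration only, not the authors' kinetic data: it shows how the control exerted by the product-forming step approaches 1 when its capacity V2 is small relative to the photosynthetic supply, which is the limiting situation the abstract describes.

        from scipy.optimize import brentq

        V1, KI, K2 = 10.0, 1.0, 0.5   # arbitrary units: supply capacity, inhibition, affinity

        def steady_flux(V2):
            # Solve v1(X) = v2(X) for the intermediate concentration X, then return J.
            f = lambda X: V1 / (1.0 + X / KI) - V2 * X / (K2 + X)
            X = brentq(f, 1e-9, 1e6)
            return V2 * X / (K2 + X)

        def control_coefficient(V2, eps=1e-4):
            J0, J1 = steady_flux(V2), steady_flux(V2 * (1.0 + eps))
            return (J1 - J0) / J0 / eps

        for V2 in (0.5, 2.0, 20.0):
            print(f"V2 = {V2:5.1f}  J = {steady_flux(V2):6.3f}  "
                  f"C2 = {control_coefficient(V2):.3f}")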

  7. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    ERIC Educational Resources Information Center

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  8. Can Man Control His Biological Evolution? A Symposium on Genetic Engineering. Probabilities and Practicalities

    ERIC Educational Resources Information Center

    Djerassi, Carl

    1972-01-01

    Manipulation of genes in human beings on a large scale is not possible under present conditions because it lacks economic potential and other attractions for industry. However, "preventive" genetic engineering may be a field for vast research in the future and will perhaps be approved by governments, parishes, people and industry. (PS)

  9. Heart valve scaffold fabrication: Bioinspired control of macro-scale morphology, mechanics and micro-structure.

    PubMed

    D'Amore, Antonio; Luketich, Samuel K; Raffa, Giuseppe M; Olia, Salim; Menallo, Giorgio; Mazzola, Antonino; D'Accardi, Flavio; Grunberg, Tamir; Gu, Xinzhu; Pilato, Michele; Kameneva, Marina V; Badhwar, Vinay; Wagner, William R

    2018-01-01

    Valvular heart disease is currently treated with mechanical valves, which benefit from longevity, but are burdened by chronic anticoagulation therapy, or with bioprosthetic valves, which have reduced thromboembolic risk, but limited durability. Tissue engineered heart valves have been proposed to resolve these issues by implanting a scaffold that is replaced by endogenous growth, leaving autologous, functional leaflets that would putatively eliminate the need for anticoagulation and avoid calcification. Despite the diversity in fabrication strategies and encouraging results in large animal models, control over engineered valve structure-function remains at best partial. This study aimed to overcome these limitations by introducing double component deposition (DCD), an electrodeposition technique that employs multi-phase electrodes to dictate valve macro and microstructure and resultant function. Results in this report demonstrate the capacity of the DCD method to simultaneously control scaffold macro-scale morphology, mechanics and microstructure while producing fully assembled stent-less multi-leaflet valves composed of microscopic fibers. DCD engineered valve characterization included: leaflet thickness, biaxial properties, bending properties, and quantitative structural analysis of multi-photon and scanning electron micrographs. Quasi-static ex-vivo valve coaptation testing and dynamic organ level functional assessment in a pressure pulse duplicating device demonstrated appropriate acute valve functionality. Copyright © 2017. Published by Elsevier Ltd.

  10. The role of Natural Flood Management in managing floods in large scale basins during extreme events

    NASA Astrophysics Data System (ADS)

    Quinn, Paul; Owen, Gareth; ODonnell, Greg; Nicholson, Alex; Hetherington, David

    2016-04-01

    There is a strong evidence base showing the negative impacts of land use intensification and soil degradation in NW European river basins on hydrological response and on flood impact downstream. However, the ability to target zones of high runoff production, and the extent to which we can manage flood risk using nature-based flood management solutions, are less well known. A move to planting more trees and to less intensively farmed landscapes is part of natural flood management (NFM) solutions, and these methods suggest that flood risk can be managed in alternative and more holistic ways. So which local NFM methods should be used, where in a large-scale basin should they be deployed, and how does the flow propagate to any point downstream? More generally, how much intervention is needed, and will it compromise food production systems? If we are observing record levels of rainfall and flow, for example during Storm Desmond in December 2015 in the North West of England, what other flood management options are needed to complement our traditional defences in large basins in the future? In this paper we will show examples of NFM interventions in the UK that have had an impact at local-scale sites. We will demonstrate the impact of interventions at local, sub-catchment (meso) and finally large scales. The tools include observations, process-based models and more generalised flood impact models. Issues of synchronisation and the design level of protection will be debated. By reworking observed rainfall and discharge (runoff) for observed extreme events in the River Eden and River Tyne during Storm Desmond, we will show how much flood protection is needed in large-scale basins. The research will thus pose a number of key questions as to how floods may have to be managed in large-scale basins in the future. We will seek to support a method of catchment systems engineering that holds water back across the whole landscape as a major opportunity to manage water in large-scale basins in the future. The broader benefits of engineering landscapes to hold water for pollution control, sediment loss and drought minimisation will also be shown.

  11. Gene and process level modulation to overcome the bottlenecks of recombinant proteins expression in Pichia pastoris.

    PubMed

    Prabhu, Ashish A; Boro, Bibari; Bharali, Biju; Chakraborty, Shuchishloka; Dasu, Veeranki V

    2017-01-01

    Process development involving systems metabolic engineering and bioprocess engineering has become one of the major thrusts in the development of therapeutic proteins and enzymes. Pichia pastoris has emerged as a prominent host for the production of therapeutic proteins and enzymes. Despite producing high protein titers, various cellular and process-level bottlenecks restrict the expression of recombinant proteins in P. pastoris. In the present review, we have summarized the recent developments in the expression of foreign proteins in P. pastoris. Further, we have discussed various cellular engineering strategies, including codon optimization, pathway engineering, signal peptide processing, development of protease-deficient strains, and glyco-engineered strains, for high-yield secretion of recombinant proteins. Bioprocess development for recombinant proteins in large-scale bioreactors, including medium optimization, optimal feeding strategies, and co-substrate feeding in fed-batch as well as continuous cultivation, is also described. The recent advances in systems and synthetic biology studies, including metabolic flux analysis for understanding the phenotypic characteristics of recombinant Pichia and genome editing with the CRISPR-Cas system, have also been summarized. Copyright© Bentham Science Publishers.

  12. Application of Open Source Technologies for Oceanographic Data Analysis

    NASA Astrophysics Data System (ADS)

    Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.

    2015-12-01

    NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project that uses NEXUS as the core of an oceanographic anomaly detection service and web portal, which we call OceanXtremes.
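
    The core data-management idea described above is to break a gridded time/lat/lon array into tiles that can be stored and summarized independently. The sketch below illustrates that chunking step in plain numpy; it is not the NEXUS implementation, and the tile size and dataset shape are arbitrary choices.

        import numpy as np

        def make_tiles(data, tile_shape=(1, 60, 60)):
            """Yield ((t0, y0, x0), tile) pairs covering a (time, lat, lon) array."""
            nt, ny, nx = data.shape
            dt, dy, dx = tile_shape
            for t in range(0, nt, dt):
                for y in range(0, ny, dy):
                    for x in range(0, nx, dx):
                        yield (t, y, x), data[t:t + dt, y:y + dy, x:x + dx]

        # Fake sea-surface-temperature field: 10 time steps on a 0.5-degree grid.
        sst = np.random.rand(10, 360, 720).astype(np.float32)
        summaries = {origin: float(tile.mean()) for origin, tile in make_tiles(sst)}
        print(len(summaries), "tiles; example mean:", next(iter(summaries.values())))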

  13. Identification of metabolic engineering targets for the enhancement of 1,4-butanediol production in recombinant E. coli using large-scale kinetic models.

    PubMed

    Andreozzi, Stefano; Chakrabarti, Anirikh; Soh, Keng Cher; Burgard, Anthony; Yang, Tae Hoon; Van Dien, Stephen; Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2016-05-01

    Rational metabolic engineering methods are increasingly employed in designing commercially viable processes for the production of chemicals relevant to the pharmaceutical, biotechnology, and food and beverage industries. With the growing availability of omics data and of methodologies capable of integrating the available data into models, mathematical modeling and computational analysis are becoming important in designing recombinant cellular organisms and optimizing cell performance with respect to desired criteria. In this contribution, we used the computational framework ORACLE (Optimization and Risk Analysis of Complex Living Entities) to analyze the physiology of recombinant Escherichia coli producing 1,4-butanediol (BDO) and to identify potential strategies for improved production of BDO. The framework allowed us to integrate data across multiple levels and to construct a population of large-scale kinetic models despite the lack of available information about the kinetic properties of every enzyme in the metabolic pathways. We analyzed these models and found that the enzymes that primarily control the fluxes leading to BDO production are part of central glycolysis, the lower branch of the tricarboxylic acid (TCA) cycle, and the novel BDO production route. Interestingly, among the enzymes between glucose uptake and the BDO pathway, the enzymes belonging to the lower branch of the TCA cycle were identified as the most important for improving BDO production and yield. We also quantified the effects of changes of the target enzymes on other intracellular states such as energy charge, cofactor levels, redox state, cellular growth, and byproduct formation. Independent earlier experiments on this strain confirmed that the computationally obtained conclusions are consistent with the experimentally tested designs, and the findings of the present studies can provide guidance for future work on strain improvement. Overall, these studies demonstrate the potential and effectiveness of ORACLE for the accelerated design of microbial cell factories. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  14. Physical modeling in geomorphology: are boundary conditions necessary?

    NASA Astrophysics Data System (ADS)

    Cantelli, A.

    2012-12-01

    In physical experimental design for geomorphology, boundary conditions are key elements that determine the quality of the results and therefore the development of the study. For years engineers have modeled structures, such as dams and bridges, with high precision and excellent results. Until the last decade, a great part of the physical experimental work in geomorphology was developed with an engineering-like approach, requiring an accurate scaling analysis to determine inflow parameters and initial geometrical conditions. During the last decade, however, the way we approach physical experiments has changed significantly. In particular, boundary conditions and initial conditions are considered unknown factors that need to be discovered during the experiment. This new philosophy leads to a more demanding data acquisition process but relaxes the obligation to know the appropriate input and initial conditions a priori, providing the flexibility to discover those data. Here I present some practical examples of this experimental approach in deepwater geomorphology: some questions about the scaling of turbidity currents and a new large experimental facility built at the Universidade Federal do Rio Grande do Sul, Brazil.

  15. The BOEING 777 - concurrent engineering and digital pre-assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, B.

    The processes created on the 777 for checking designs were called "digital pre-assembly." Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1500 models (15,000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the needs of the 777 for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results: the 777 has had far fewer assembly and systems problems compared to previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is today being integrated with knowledge-based engineering geometry generation tools.

  16. System level airworthiness tool: A comprehensive approach to small unmanned aircraft system airworthiness

    NASA Astrophysics Data System (ADS)

    Burke, David A.

    One of the pillars of aviation safety is assuring sound engineering practices through airworthiness certification. As Unmanned Aircraft Systems (UAS) grow in popularity, the need for airworthiness standards and verification methods tailored for UAS becomes critical. While airworthiness practices for large UAS may be similar to those for manned aircraft, it is clear that small UAS require a paradigm shift from the airworthiness practices of manned aircraft. Although small in comparison to manned aircraft, these aircraft are not merely remote-controlled toys. Small UAS may be complex aircraft flying in the National Airspace System (NAS) over populated areas for extended durations and beyond line of sight of the operators. A comprehensive systems engineering framework for certifying small UAS at the system level is needed. This work presents a point-based tool that evaluates small UAS by rewarding good engineering practices in design, analysis, and testing. The airworthiness requirements scale with vehicle size and operational area, while allowing flexibility for new technologies and unique configurations.

  17. Solid rocket booster performance evaluation model. Volume 1: Engineering description

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The space shuttle solid rocket booster performance evaluation model (SRB-II) is made up of analytical and functional simulation techniques linked together so that a single pass through the model will predict the performance of the propulsion elements of a space shuttle solid rocket booster. The available options allow the user to predict static test performance, predict nominal and off nominal flight performance, and reconstruct actual flight and static test performance. Options selected by the user are dependent on the data available. These can include data derived from theoretical analysis, small scale motor test data, large motor test data and motor configuration data. The user has several options for output format that include print, cards, tape and plots. Output includes all major performance parameters (Isp, thrust, flowrate, mass accounting and operating pressures) as a function of time as well as calculated single point performance data. The engineering description of SRB-II discusses the engineering and programming fundamentals used, the function of each module, and the limitations of each module.

  18. Market-implied spread for earthquake CAT bonds: financial implications of engineering decisions.

    PubMed

    Damnjanovic, Ivan; Aslan, Zafer; Mander, John

    2010-12-01

    In the event of natural and man-made disasters, owners of large-scale infrastructure facilities (assets) need contingency plans to effectively restore operations within acceptable timescales. Traditionally, the insurance sector provides coverage against potential losses. However, there are many problems associated with this traditional approach to risk transfer, including counterparty risk and litigation. Recently, a number of innovative risk mitigation methods, termed alternative risk transfer (ART) methods, have been introduced to address these problems. One of the most important ART methods is catastrophe (CAT) bonds. The objective of this article is to develop an integrative model that links engineering design parameters with financial indicators, including spread and bond rating. The developed framework is based on a four-step structural loss model and a transformed survival model to determine expected excess returns. We illustrate the framework for a seismically designed bridge using two unique CAT bond contracts. The results show a nonlinear relationship between engineering design parameters and market-implied spread. © 2010 Society for Risk Analysis.
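
    The pricing intuition behind such a model can be illustrated with a few lines of arithmetic: damage-state probabilities implied by a structural loss model give an expected annual loss of principal, which is then marked up into a spread. The numbers and the simple loss multiplier below are invented for illustration and are not taken from the article.

        # Annual exceedance probabilities for three damage states of a hypothetical
        # seismically designed bridge, and the fraction of principal lost in each.
        annual_exceedance_prob = [0.020, 0.008, 0.002]   # slight / moderate / collapse
        principal_loss_fraction = [0.10, 0.50, 1.00]

        # Convert exceedance probabilities into probabilities of each damage state.
        state_prob = [p - q for p, q in zip(annual_exceedance_prob,
                                            annual_exceedance_prob[1:] + [0.0])]
        expected_loss = sum(p * f for p, f in zip(state_prob, principal_loss_fraction))

        loss_multiplier = 2.0    # crude stand-in for the market risk premium
        spread_bps = 1.0e4 * loss_multiplier * expected_loss
        print(f"expected annual loss = {expected_loss:.4%}, spread ~ {spread_bps:.0f} bps")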

  19. Analysis of performance and emissions of diesel engine using sunflower biodiesel

    NASA Astrophysics Data System (ADS)

    Tutunea, Dragos; Dumitru, Ilie

    2017-10-01

    The world consumption of fossil fuels is increasing rapidly, and it affects the environment through greenhouse gases that cause health hazards. Biodiesel is emerging as a promising alternative energy resource that can be used to reduce, or even replace, the usage of petroleum. Since it is mainly derived from vegetable oils or animal fats, it can be produced at large scale by local farmers, offering an attractive option. However, the extensive utilization of biofuels can lead to shortages in the food chain. This paper analyzed sunflower methyl ester (SFME) and its blends as an alternative fuel for diesel engines. Biodiesel was prepared from sunflower oil in the laboratory in a small biodiesel installation (30 L) by base-catalyzed transesterification. A 4-cylinder Deutz F4L912 diesel engine was used to perform the tests on various blends of sunflower biodiesel. The emissions of CO and HC were lower than those of diesel fuel for all blends tested. The NOx emissions were higher due to the high volatility and high viscosity of biodiesel.

  20. A Historical Perspective on Dynamics Testing at the Langley Research Center

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Kvaternik, Raymond G.

    2000-01-01

    The history of structural dynamics testing research over the past four decades at the Langley Research Center of the National Aeronautics and Space Administration is reviewed. Beginning in the early sixties, Langley investigated several scale model and full-scale spacecraft including the NIMBUS and various concepts for Apollo and Viking landers. Langley engineers pioneered the use of scaled models to study the dynamics of launch vehicles including Saturn I, Saturn V, and Titan III. In the seventies, work emphasized the Space Shuttle and advanced test and data analysis methods. In the eighties, the possibility of delivering large structures to orbit by the Space Shuttle shifted focus towards understanding the interaction of flexible space structures with attitude control systems. Although Langley has maintained a tradition of laboratory-based research, some flight experiments were supported. This review emphasizes work that, in some way, advanced the state of knowledge at the time.

  1. CasCADe: A Novel 4D Visualization System for Virtual Construction Planning.

    PubMed

    Ivson, Paulo; Nascimento, Daniel; Celes, Waldemar; Barbosa, Simone Dj

    2018-01-01

    Building Information Modeling (BIM) provides an integrated 3D environment to manage large-scale engineering projects. The Architecture, Engineering and Construction (AEC) industry explores 4D visualizations over these datasets for virtual construction planning. However, existing solutions lack adequate visual mechanisms to inspect the underlying schedule and make inconsistencies readily apparent. The goal of this paper is to apply best practices of information visualization to improve 4D analysis of construction plans. We first present a review of previous work that identifies common use cases and limitations. We then consulted with AEC professionals to specify the main design requirements for such applications. These guided the development of CasCADe, a novel 4D visualization system where task sequencing and spatio-temporal simultaneity are immediately apparent. This unique framework enables the combination of diverse analytical features to create an information-rich analysis environment. We also describe how engineering collaborators used CasCADe to review the real-world construction plans of an Oil & Gas process plant. The system made evident schedule uncertainties, identified work-space conflicts and helped analyze other constructability issues. The results and contributions of this paper suggest new avenues for future research in information visualization for the AEC industry.

  2. Aeration costs in stirred-tank and bubble column bioreactors

    DOE PAGES

    Humbird, D.; Davis, R.; McMillan, J. D.

    2017-08-10

    To overcome knowledge gaps in the economics of large-scale aeration for production of commodity products, Aspen Plus is used to simulate steady-state oxygen delivery in both stirred-tank and bubble column bioreactors, using published engineering correlations for oxygen mass transfer as a function of aeration rate and power input, coupled with new equipment cost estimates developed in Aspen Capital Cost Estimator and validated against vendor quotations. Here, these simulations describe the cost efficiency of oxygen delivery as a function of oxygen uptake rate and vessel size, and show that capital and operating costs for oxygen delivery drop considerably moving from standard-size (200 m³) to world-class size (500 m³) reactors, but only marginally in further scaling up to hypothetically large (1000 m³) reactors. Finally, this analysis suggests bubble-column reactor systems can reduce overall costs for oxygen delivery by 10-20% relative to stirred tanks at low to moderate oxygen transfer rates up to 150 mmol/L-h.
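
    The engineering correlations referred to above relate the volumetric mass-transfer coefficient kLa to gassed power per volume and superficial gas velocity. The sketch below inverts a Van't-Riet-style correlation to estimate the power needed for a target oxygen transfer rate; all constants are textbook-style assumptions, not the values used in the paper's Aspen Plus model.

        # kLa = a * (P/V)^alpha * vs^beta  and  OTR = kLa * (C* - C)
        a, alpha, beta = 0.002, 0.7, 0.2   # non-coalescing-broth constants (assumed)
        vs = 0.06                          # superficial gas velocity, m/s
        c_star, c_liq = 0.25, 0.05         # O2 saturation and operating conc., mol/m^3

        target_otr = 150.0 / 3600.0        # 150 mmol/L-h expressed as mol/m^3/s
        kla_needed = target_otr / (c_star - c_liq)                         # 1/s
        power_per_volume = (kla_needed / (a * vs**beta)) ** (1.0 / alpha)  # W/m^3

        print(f"kLa needed ~ {kla_needed:.3f} 1/s, P/V ~ {power_per_volume:,.0f} W/m^3")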

  3. Aeration costs in stirred-tank and bubble column bioreactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humbird, D.; Davis, R.; McMillan, J. D.

    To overcome knowledge gaps in the economics of large-scale aeration for production of commodity products, Aspen Plus is used to simulate steady-state oxygen delivery in both stirred-tank and bubble column bioreactors, using published engineering correlations for oxygen mass transfer as a function of aeration rate and power input, coupled with new equipment cost estimates developed in Aspen Capital Cost Estimator and validated against vendor quotations. Here, these simulations describe the cost efficiency of oxygen delivery as a function of oxygen uptake rate and vessel size, and show that capital and operating costs for oxygen delivery drop considerably moving from standard-size (200 m³) to world-class size (500 m³) reactors, but only marginally in further scaling up to hypothetically large (1000 m³) reactors. Finally, this analysis suggests bubble-column reactor systems can reduce overall costs for oxygen delivery by 10-20% relative to stirred tanks at low to moderate oxygen transfer rates up to 150 mmol/L-h.

  4. META II: Formal Co-Verification of Correctness of Large-Scale Cyber-Physical Systems during Design. Volume 1

    DTIC Science & Technology

    2011-08-01

    design space is large. His research contributions are to the field of Decision-based Design, specifically in linking consumer preferences and...Integrating Consumer Preferences into Engineering Design, to be published in 2012. He received his PhD from Northwestern University in Mechanical

  5. Supporting Source Code Comprehension during Software Evolution and Maintenance

    ERIC Educational Resources Information Center

    Alhindawi, Nouh

    2013-01-01

    This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…

  6. Large eddy simulation modelling of combustion for propulsion applications.

    PubMed

    Fureby, C

    2009-07-28

    Predictive modelling of turbulent combustion is important for the development of air-breathing engines, internal combustion engines, furnaces and for power generation. Significant advances in modelling non-reactive turbulent flows are now possible with the development of large eddy simulation (LES), in which the large energetic scales of the flow are resolved on the grid while modelling the effects of the small scales. Here, we discuss the use of combustion LES in predictive modelling of propulsion applications such as gas turbine, ramjet and scramjet engines. The LES models used are described in some detail and are validated against laboratory data-of which results from two cases are presented. These validated LES models are then applied to an annular multi-burner gas turbine combustor and a simplified scramjet combustor, for which some additional experimental data are available. For these cases, good agreement with the available reference data is obtained, and the LES predictions are used to elucidate the flow physics in such devices to further enhance our knowledge of these propulsion systems. Particular attention is focused on the influence of the combustion chemistry, turbulence-chemistry interaction, self-ignition, flame holding burner-to-burner interactions and combustion oscillations.

  7. Development, Implementation, and Analysis of Desktop-Scale Model Industrial Equipment and a Critical Thinking Rubric for Use in Chemical Engineering Education

    ERIC Educational Resources Information Center

    Golter, Paul B.

    2011-01-01

    In order to address some of the challenges facing engineering education, namely the demand that students be better prepared to practice professional as well as technical skills, we have developed an intervention consisting of equipment, assessments and a novel pedagogy. The equipment consists of desktop-scale replicas of common industrial…

  8. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) tools for constructing collaborative, application-oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) a comprehensive and consistent set of location-independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; and (4) operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location-independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information-based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3) coupling large-scale computing and data systems to scientific and engineering instruments (e.g., real-time interaction with experiments through real-time data analysis and interpretation presented to the experimentalist in ways that allow direct interaction with the experiment, instead of just with instrument control); (4) highly interactive, augmented reality and virtual reality remote collaborations (e.g., an Ames / Boeing Remote Help Desk providing field maintenance use of coupled video and NDI to a remote, on-line airframe structures expert who uses this data to index into detailed design databases and returns 3D internal aircraft geometry to the field); and (5) single computational problems too large for any single system (e.g., the rotorcraft reference calculation). Grids also have the potential to provide pools of resources that could be called on in extraordinary / rapid-response situations (such as disaster response), because they can provide common interfaces and access mechanisms, standardized management, and uniform user authentication and authorization for large collections of distributed resources (whether or not they normally function in concert). IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users: the scientist / design engineer, whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks; and the tool designer, the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. The results of the analysis of the needs of these two types of users provide a broad set of requirements that gives rise to a general set of required capabilities. The IPG project is intended to address all of these requirements. In some cases the required computing technology exists, and in some cases it must be researched and developed. The project is using available technology to provide a prototype set of capabilities in a persistent distributed computing testbed. Beyond this, there are required capabilities that are not immediately available, and whose development spans the range from near-term engineering development (one to two years) to much longer term R&D (three to six years). Additional information is contained in the original.

  9. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2009-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI Breast

  10. Space Launch System Base Heating Test: Sub-Scale Rocket Engine/Motor Design, Development & Performance Analysis

    NASA Technical Reports Server (NTRS)

    Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan

    2014-01-01

    The ATA-002 Technical Team has successfully designed, developed, tested and assessed the SLS Pathfinder propulsion systems for the Main Base Heating Test Program. Major outcomes of the Pathfinder test program: reached 90% of full-scale chamber pressure; achieved all engine/motor design parameter requirements; reached steady plume flow behavior in less than 35 msec; maintained steady chamber pressure for 60 to 100 msec during engine/motor operation; demonstrated model engine/motor performance similar to the full-scale SLS system; mitigated nozzle throat and combustor thermal erosion; and showed good agreement between test data and numerical prediction codes. The next phase of the ATA-002 test program includes design and development of the SLS OML for the Main Base Heating Test, tweaking the BSRM design to optimize performance, and tweaking the CS-REM design to increase robustness. MSFC Aerosciences and CUBRC have the capability to develop sub-scale propulsion systems to meet desired performance requirements for short-duration testing.

  11. Stress distribution retrieval in granular materials: A multi-scale model and digital image correlation measurements

    NASA Astrophysics Data System (ADS)

    Bruno, Luigi; Decuzzi, Paolo; Gentile, Francesco

    2016-01-01

    The promise of nanotechnology lies in the possibility of engineering matter at the nanoscale and creating technological interfaces that, because of their small scales, may directly interact with biological objects, creating new strategies for the treatment of pathologies that are otherwise beyond the reach of conventional medicine. Nanotechnology is inherently a multiscale, multiphenomena challenge. Fundamental understanding and highly accurate predictive methods are critical to successful manufacturing of nanostructured materials, bio/mechanical devices and systems. In biomedical engineering, and in the mechanical analysis of biological tissues, classical continuum approaches are routinely utilized, even though they disregard the discrete nature of tissues, which are an interpenetrating network of a matrix (the extracellular matrix, ECM) and a generally large but finite number of cells with sizes falling in the micrometer range. Here, we introduce a nano-mechanical theory that accounts for the non-continuum nature of biological and other discrete systems. This discrete field theory, doublet mechanics (DM), is a technique to model the mechanical behavior of materials over multiple scales, ranging from some millimeters down to a few nanometers. In the paper, we use this theory to predict the response of a granular material to an externally applied load. Such a representation is extremely attractive for modeling biological tissues, which may be considered as a spatial set of a large number of particulates (cells) dispersed in an extracellular matrix. Perhaps more importantly, using digital image correlation (DIC) optical methods, we provide an experimental verification of the model.

  12. State of the Art Methodology for the Design and Analysis of Future Large Scale Evaluations: A Selective Examination.

    ERIC Educational Resources Information Center

    Burstein, Leigh

    Two specific methods of analysis in large-scale evaluations are considered: structural equation modeling and selection modeling/analysis of non-equivalent control group designs. Their utility in large-scale educational program evaluation is discussed. The examination of these methodological developments indicates how people (evaluators,…

  13. RICIS research

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.

    1987-01-01

    The principal focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large for the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission- and safety-critical requirements at run time. This focus is extremely important because of the contribution of the scaling direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifecycle issues of three environments: host, integration, and target.

  14. Development of a parallel FE simulator for modeling the whole trans-scale failure process of rock from meso- to engineering-scale

    NASA Astrophysics Data System (ADS)

    Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao

    2017-01-01

    Multi-scale high-resolution modeling of the rock failure process is a powerful means in modern rock mechanics studies to reveal complex failure mechanisms and to evaluate engineering risks. However, multi-scale continuous modeling of rock, from deformation through damage to failure, places high demands on the design, implementation scheme and computation capacity of the numerical software system. This study is aimed at developing a parallel finite element procedure, a parallel rock failure process analysis (RFPA) simulator, that is capable of modeling the whole trans-scale failure process of rock. Based on the statistical meso-damage mechanical method, the RFPA simulator is able to construct heterogeneous rock models with multiple mechanical properties and to represent the trans-scale propagation of cracks, in which the stress and strain fields are solved for the damage evolution analysis of representative volume elements by the parallel finite element method (FEM) solver. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows - Linux interactive platform. A numerical model is built to test the parallel performance of the FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, and on field-scale net fracture spacing and engineering-scale rock slope examples, respectively. The simulation results indicate that relatively high speedup and computation efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In the laboratory-scale simulation, well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, can be reproduced. In the field-scale simulation, the formation process of net fracture spacing, from initiation and propagation to saturation, can be revealed completely. In the engineering-scale simulation, the whole progressive failure process of the rock slope can be well modeled. It is shown that the parallel FE simulator developed in this study is an efficient tool for modeling the whole trans-scale failure process of rock from meso- to engineering-scale.
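
    The statistical meso-damage approach mentioned above typically assigns element strengths from a Weibull distribution whose shape parameter acts as a homogeneity index. The sketch below illustrates that sampling step; the parameters are illustrative assumptions, not values from the RFPA simulator.

        import numpy as np

        rng = np.random.default_rng(0)

        def weibull_strengths(n_elements, mean_strength=60.0, m=3.0):
            """Sample element peak strengths (MPa); smaller m = more heterogeneous."""
            raw = rng.weibull(m, size=n_elements)
            return mean_strength * raw / raw.mean()   # rescale to the target mean

        for m in (1.5, 3.0, 6.0):
            s = weibull_strengths(100_000, m=m)
            print(f"m = {m:3.1f}  mean = {s.mean():5.1f} MPa  std = {s.std():5.1f} MPa")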

  15. Authentic Research Experience and “Big Data” Analysis in the Classroom: Maize Response to Abiotic Stress

    PubMed Central

    Makarevitch, Irina; Frechette, Cameo; Wiatros, Natalia

    2015-01-01

    Integration of inquiry-based approaches into curriculum is transforming the way science is taught and studied in undergraduate classrooms. Incorporating quantitative reasoning and mathematical skills into authentic biology undergraduate research projects has been shown to benefit students in developing various skills necessary for future scientists and to attract students to science, technology, engineering, and mathematics disciplines. While large-scale data analysis became an essential part of modern biological research, students have few opportunities to engage in analysis of large biological data sets. RNA-seq analysis, a tool that allows precise measurement of the level of gene expression for all genes in a genome, revolutionized molecular biology and provides ample opportunities for engaging students in authentic research. We developed, implemented, and assessed a series of authentic research laboratory exercises incorporating a large data RNA-seq analysis into an introductory undergraduate classroom. Our laboratory series is focused on analyzing gene expression changes in response to abiotic stress in maize seedlings; however, it could be easily adapted to the analysis of any other biological system with available RNA-seq data. Objective and subjective assessment of student learning demonstrated gains in understanding important biological concepts and in skills related to the process of science. PMID:26163561
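
    A classroom exercise of the kind described often starts from raw read counts, normalizes them to counts per million, and computes per-gene log2 fold changes between conditions. The sketch below shows that calculation with made-up counts; it is not the maize dataset or the authors' pipeline.

        import numpy as np

        genes = ["geneA", "geneB", "geneC"]
        control_counts = np.array([150.0, 30.0, 900.0])
        stress_counts = np.array([40.0, 95.0, 880.0])

        def cpm(counts):
            """Normalize raw read counts to counts per million."""
            return counts / counts.sum() * 1.0e6

        log2_fc = np.log2((cpm(stress_counts) + 1.0) / (cpm(control_counts) + 1.0))
        for gene, fc in zip(genes, log2_fc):
            print(f"{gene}: log2 fold change (stress vs control) = {fc:+.2f}")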

  16. Study of Plume Impingement Effects in the Lunar Lander Environment

    NASA Technical Reports Server (NTRS)

    Marichalar, Jeremiah; Prisbell, A.; Lumpkin, F.; LeBeau, G.

    2010-01-01

    Plume impingement effects from the descent and ascent engine firings of the Lunar Lander were analyzed in support of the Lunar Architecture Team under the Constellation Program. The descent stage analysis was performed to obtain shear and pressure forces on the lunar surface as well as velocity and density profiles in the flow field in an effort to understand lunar soil erosion and ejected soil impact damage which was analyzed as part of a separate study. A CFD/DSMC decoupled methodology was used with the Bird continuum breakdown parameter to distinguish the continuum flow from the rarefied flow. The ascent stage analysis was performed to ascertain the forces and moments acting on the Lunar Lander Ascent Module due to the firing of the main engine on take-off. The Reacting and Multiphase Program (RAMP) method of characteristics (MOC) code was used to model the continuum region of the nozzle plume, and the Direct Simulation Monte Carlo (DSMC) Analysis Code (DAC) was used to model the impingement results in the rarefied region. The ascent module (AM) was analyzed for various pitch and yaw rotations and for various heights in relation to the descent module (DM). For the ascent stage analysis, the plume inflow boundary was located near the nozzle exit plane in a region where the flow number density was large enough to make the DSMC solution computationally expensive. Therefore, a scaling coefficient was used to make the DSMC solution more computationally manageable. An analysis of the effectiveness of this scaling technique was performed by investigating various scaling parameters for a single height and rotation of the AM. Because the inflow boundary was near the nozzle exit plane, another analysis was performed investigating three different inflow contours to determine the effects of the flow expansion around the nozzle lip on the final plume impingement results.
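
    The distinction between continuum and rarefied regions, and the effect of scaling the inflow density, both come down to the mean free path relative to a characteristic length. The sketch below uses generic hard-sphere kinetic-theory relations rather than the RAMP/DAC setup; the molecular diameter, density, and length scale are illustrative assumptions.

        import math

        def mean_free_path(n, d=4.17e-10):
            """Hard-sphere mean free path (m) for number density n (1/m^3); d ~ N2."""
            return 1.0 / (math.sqrt(2.0) * math.pi * d**2 * n)

        L_ref = 0.5                  # characteristic impingement length scale, m
        n_exit = 1.0e22              # plume number density near the inflow, 1/m^3

        for s in (1.0, 0.1, 0.01):   # hypothetical density scaling coefficients
            lam = mean_free_path(s * n_exit)
            print(f"s = {s:5.2f}  lambda = {lam:.3e} m  Kn = {lam / L_ref:.3e}")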

  17. Replication of engine block cylinder bridge microstructure and mechanical properties with lab scale 319 Al alloy billet castings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lombardi, A., E-mail: a2lombar@ryerson.ca; D'Elia, F.; Ravindran, C.

    2014-01-15

    In recent years, aluminum alloy gasoline engine blocks have in large part successfully replaced nodular cast iron engine blocks, resulting in improved vehicle fuel efficiency. However, because of the inadequate wear resistance properties of hypoeutectic Al–Si alloys, gray iron cylinder liners are required. These liners cause the development of large tensile residual stress along the cylinder bores and necessitate the maximization of mechanical properties in this region to prevent premature engine failure. The aim of this study was to replicate the engine cylinder bridge microstructure and mechanical properties following TSR treatment (which removes the sand binder to enable easy casting retrieval) using lab scale billet castings of the same alloy composition with varying cooling rates. Comparisons in microstructure between the engine block and the billet castings were carried out using optical and scanning electron microscopy, while mechanical properties were assessed using tensile testing. The results suggest that the microstructure at the top and middle of the engine block cylinder bridge was successfully replicated by the billet castings. However, the microstructure at the bottom of the cylinder was not completely replicated due to variations in secondary phase morphology and distribution. The successful replication of engine block microstructure will enable the future optimization of heat treatment parameters. - Highlights: • A method to replicate engine block microstructure was developed. • Billet castings will allow cost effective optimization of heat treatment process. • The replication of microstructure in the cylinder region was mostly successful. • Porosity was more clustered in the billet castings compared to the engine block. • Mechanical properties were lower in billet castings due to porosity and inclusions.

  18. Defense Acquisitions: Future Aerostat and Airship Investment Decisions Drive Oversight and Coordination Needs

    DTIC Science & Technology

    2012-10-01

    ...earlier, LEMV experienced schedule delays of at least 10 months, largely rooted in technical, design, and engineering problems in scaling up the airship... had informal coordination with the Blue Devil Block 2 effort in the past. For example, originally both airships had several diesel engine...

  19. Large space antennas: A systems analysis case history

    NASA Technical Reports Server (NTRS)

    Keafer, Lloyd S. (Compiler); Lovelace, U. M. (Compiler)

    1987-01-01

    The value of systems analysis and engineering is aptly demonstrated by the work on Large Space Antennas (LSA) by the NASA Langley Spacecraft Analysis Branch. This work was accomplished over the last half-decade by augmenting traditional system engineering, analysis, and design techniques with computer-aided engineering (CAE) techniques using the Langley-developed Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system. This report chronicles the research highlights and special systems analyses that focused the LSA work on deployable truss antennas. It notes developmental trends toward greater use of CAE techniques in their design and analysis. A look to the future envisions the application of improved systems analysis capabilities to advanced space systems such as an advanced space station or to lunar and Martian missions and human habitats.

  20. New Parallel Algorithms for Structural Analysis and Design of Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.

    1998-01-01

    Subspace and Lanczos iterations have been developed, well documented, and widely accepted as efficient methods for obtaining the p lowest eigenpairs of large-scale, practical engineering problems. The focus of this paper is to incorporate recent developments in vectorized sparse technologies in conjunction with Subspace and Lanczos iterative algorithms for computational enhancements. Numerical performance, in terms of accuracy and efficiency of the proposed sparse strategies for the Subspace and Lanczos algorithms, is demonstrated by solving for the lowest frequencies and mode shapes of structural problems on the IBM-R6000/590 and SunSparc 20 workstations.
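
    As a concrete picture of the Lanczos-based modal extraction described above, the sketch below uses SciPy's shift-invert Lanczos driver on a toy generalized eigenproblem K x = λ M x to recover the p lowest natural frequencies and mode shapes; the spring-mass chain, its stiffness and mass values, and p = 4 are assumptions for illustration, not the paper's test problems or sparse kernels.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Toy structural model: a fixed-fixed chain of n masses and springs (illustrative only).
n, k, m = 500, 1.0e6, 2.0
main = 2.0 * k * np.ones(n)
off = -k * np.ones(n - 1)
K = sp.diags([off, main, off], [-1, 0, 1], format="csc")   # stiffness matrix
M = sp.diags(m * np.ones(n), 0, format="csc")              # lumped mass matrix

# Shift-invert Lanczos: eigenvalues of K x = lam M x nearest sigma = 0,
# i.e. the p lowest structural modes.
p = 4
lam, modes = eigsh(K, k=p, M=M, sigma=0.0, which="LM")
freqs_hz = np.sqrt(lam) / (2.0 * np.pi)
print("lowest natural frequencies [Hz]:", np.round(freqs_hz, 3))
```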

  1. Factors influencing efficient structure of fuel and energy complex

    NASA Astrophysics Data System (ADS)

    Sidorova, N. G.; Novikova, S. A.

    2017-10-01

    The development of the Russian fuel and energy complex is a priority of national economic policy, and the Far East is a link between Russia and the Asia-Pacific region. Large-scale engineering development of the Far East's abundant resources will drive industrial growth, raise living standards, and strengthen Russia's position in the global energy market. Identifying the factors that influence a rational structure for the fuel and energy complex is therefore highly relevant. Through an in-depth analysis of the complex's development trends and problems, the authors show ways to improve its efficiency.

  2. Safe Life Propulsion Design Technologies (3rd Generation Propulsion Research and Technology)

    NASA Technical Reports Server (NTRS)

    Ellis, Rod

    2000-01-01

    The tasks outlined in this viewgraph presentation on safe life propulsion design technologies (third generation propulsion research and technology) include the following: (1) Ceramic matrix composite (CMC) life prediction methods; (2) Life prediction methods for ultra high temperature polymer matrix composites for reusable launch vehicle (RLV) airframe and engine application; (3) Enabling design and life prediction technology for cost effective large-scale utilization of MMCs and innovative metallic material concepts; (4) Probabilistic analysis methods for brittle materials and structures; (5) Damage assessment in CMC propulsion components using nondestructive characterization techniques; and (6) High temperature structural seals for RLV applications.

  3. Aquatic Plant Control Research Program. Large-Scale Operations Management Test (LSOMT) of Insects and Pathogens for Control of Waterhyacinth in Louisiana. Volume 1. Results for 1979-1981.

    DTIC Science & Technology

    1985-01-01

    Report documentation residue (accession no. AD-A156 759): Aquatic Plant Control Research Program, Large-Scale Operations Management Test, performed by the Army Engineer Waterways Experiment Station, PO Box 631, Vicksburg, Mississippi 39180-0631, with the University of Tennessee-Chattanooga. Key words: aquatic plant control, biological control, Louisiana.

  4. Scale-Up: Improving Large Enrollment Physics Courses

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    1999-11-01

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project is working to establish a learning environment that will promote increased conceptual understanding, improved problem-solving performance, and greater student satisfaction, while still maintaining class sizes of approximately 100. We are also addressing the new ABET engineering accreditation requirements for inquiry-based learning along with communication and team-oriented skills development. Results of studies of our latest classroom design, plans for future classroom space, and the current iteration of instructional materials will be discussed.

  5. Formation and representation: Critical analyses of identity, supply, and demand in science, technology, engineering, and mathematics

    NASA Astrophysics Data System (ADS)

    Mandayam Doddamane, Prabha

    2011-12-01

    Considerable research, policy, and programmatic efforts have been dedicated to addressing the participation of particular populations in STEM for decades. Each of these efforts claims equity-related goals; yet, they heavily frame the problem, through pervasive STEM pipeline model discourse, in terms of national needs, workforce supply, and competitiveness. This particular framing of the problem may, indeed, be counter to equity goals, especially when paired with policy that largely relies on statistical significance and broad aggregation of data over exploring the identities and experiences of the populations targeted for equitable outcomes in that policy. In this study, I used the mixed-methods approach of critical discourse and critical quantitative analyses to understand how the pipeline model ideology has become embedded within academic discourse, research, and data surrounding STEM education and work and to provide alternatives for quantitative analysis. Using critical theory as a lens, I first conducted a critical discourse analysis of contemporary STEM workforce studies with a particular eye to pipeline ideology. Next, I used that analysis to inform logistic regression analyses of the 2006 SESTAT data. This quantitative analysis compared and contrasted different ways of thinking about identity and retention. Overall, the findings of this study show that many subjective choices are made in the construction of the large-scale datasets used to inform much national science and engineering policy and that these choices greatly influence likelihood of retention outcomes.

  6. Grand challenges for biological engineering

    PubMed Central

    Yoon, Jeong-Yeol; Riley, Mark R

    2009-01-01

    Biological engineering will play a significant role in solving many of the world's problems in medicine, agriculture, and the environment. Recently the U.S. National Academy of Engineering (NAE) released a document, "Grand Challenges in Engineering," covering broad realms of human concern, from sustainability and health to vulnerability and the joy of living. Biological engineers, having tools and techniques at the interface between living and non-living entities, will play a prominent role in forging a better future. The 2010 Institute of Biological Engineering (IBE) conference in Cambridge, MA, USA will address, in part, the roles of biological engineering in solving the challenges presented by the NAE. This letter presents a brief outline of how biological engineers are working to solve these large-scale, integrated problems of our society. PMID:19772647

  7. ACTIVIS: Visual Exploration of Industry-Scale Deep Neural Network Models.

    PubMed

    Kahng, Minsuk; Andrews, Pierre Y; Kalro, Aditya; Polo Chau, Duen Horng

    2017-08-30

    While deep learning models have achieved state-of-the-art accuracies for many prediction tasks, understanding these models remains a challenge. Despite the recent interest in developing visual tools to help users interpret deep learning models, the complexity and wide variety of models deployed in industry, and the large-scale datasets that they used, pose unique design challenges that are inadequately addressed by existing work. Through participatory design sessions with over 15 researchers and engineers at Facebook, we have developed, deployed, and iteratively improved ACTIVIS, an interactive visualization system for interpreting large-scale deep learning models and results. By tightly integrating multiple coordinated views, such as a computation graph overview of the model architecture, and a neuron activation view for pattern discovery and comparison, users can explore complex deep neural network models at both the instance- and subset-level. ACTIVIS has been deployed on Facebook's machine learning platform. We present case studies with Facebook researchers and engineers, and usage scenarios of how ACTIVIS may work with different models.

  8. Hybrid multiphoton volumetric functional imaging of large-scale bioengineered neuronal networks

    NASA Astrophysics Data System (ADS)

    Dana, Hod; Marom, Anat; Paluch, Shir; Dvorkin, Roman; Brosh, Inbar; Shoham, Shy

    2014-06-01

    Planar neural networks and interfaces serve as versatile in vitro models of central nervous system physiology, but adaptations of related methods to three dimensions (3D) have met with limited success. Here, we demonstrate for the first time volumetric functional imaging in a bioengineered neural tissue growing in a transparent hydrogel with cortical cellular and synaptic densities, by introducing complementary new developments in nonlinear microscopy and neural tissue engineering. Our system uses a novel hybrid multiphoton microscope design combining a 3D scanning-line temporal-focusing subsystem and a conventional laser-scanning multiphoton microscope to provide functional and structural volumetric imaging capabilities: dense microscopic 3D sampling at tens of volumes per second of structures with mm-scale dimensions containing a network of over 1,000 developing cells with complex spontaneous activity patterns. These developments open new opportunities for large-scale neuronal interfacing and for applications of 3D engineered networks ranging from basic neuroscience to the screening of neuroactive substances.

  9. The second law of thermodynamics and quantum heat engines: Is the law strictly enforced?

    NASA Astrophysics Data System (ADS)

    Keefe, Peter D.

    2010-01-01

    A quantum heat engine is a construct having a working medium which is cyclically processed through a pair of control variables of state involving a Bose-Einstein condensation (BEC) in which a heat input is converted into a work output. Of interest is a first species of quantum heat engine in which the working medium is macroscopic in the sense the size scale is sufficiently large that the BEC is not volumetrically coherent. In this first species of quantum heat engine, near Carnot efficiencies may be possible. Of particular interest is a second species of quantum heat engine in which the working medium is mesoscopic in the sense that the size scale is sufficiently small that the BEC is volumetrically coherent. In this second species of quantum heat engine, the resulting in-process non-equilibrium condition affects the finally arrived at control variables of state such that Carnot efficiencies and beyond may be possible. A Type I superconductor is used to model the first and second species of quantum heat engine.

  10. Ozone Depletion Caused by Rocket Engine Emissions: A Fundamental Limit on the Scale and Viability of Space-Based Geoengineering Schemes

    NASA Astrophysics Data System (ADS)

    Ross, M. N.; Toohey, D.

    2008-12-01

    Emissions from solid and liquid propellant rocket engines reduce global stratospheric ozone levels. Currently ~ one kiloton of payloads is launched into earth orbit annually by the global space industry. Stratospheric ozone depletion from present-day launches is a small fraction of the ~ 4% globally averaged ozone loss caused by halogen gases. Thus rocket engine emissions are currently considered a minor, if poorly understood, contributor to ozone depletion. Proposed space-based geoengineering projects designed to mitigate climate change would require order-of-magnitude increases in the amount of material launched into earth orbit. The increased launches would result in comparable increases in the global ozone depletion caused by rocket emissions. We estimate global ozone loss caused by three space-based geoengineering proposals to mitigate climate change: (1) mirrors, (2) sunshade, and (3) space-based solar power (SSP). The SSP concept does not directly engineer climate, but is touted as a mitigation strategy in that SSP would reduce CO2 emissions. We show that launching the mirrors or sunshade would cause global ozone loss between 2% and 20%. Ozone loss associated with an economically viable SSP system would be at least 0.4% and possibly as large as 3%. It is not clear which, if any, of these levels of ozone loss would be acceptable under the Montreal Protocol. The large uncertainties are mainly caused by a lack of data or validated models regarding liquid propellant rocket engine emissions. Our results offer four main conclusions. (1) The viability of space-based geoengineering schemes could well be undermined by the relatively large ozone depletion that would be caused by the required rocket launches. (2) Analysis of space-based geoengineering schemes should include the difficult tradeoff between the gain of long-term (~ decades) climate control and the loss of short-term (~ years) deep ozone loss. (3) The trade can be properly evaluated only if our understanding of the stratospheric impact of rocket emissions is significantly improved. (4) Such an improved understanding requires a concerted effort of research including new in situ measurements in a variety of rocket plumes and a multi-scale modeling program similar in scope to the effort required to address the climate and ozone impacts of aircraft emissions.

  11. Micrometer scale guidance of mesenchymal stem cells to form structurally oriented large-scale tissue engineered cartilage.

    PubMed

    Chou, Chih-Ling; Rivera, Alexander L; Williams, Valencia; Welter, Jean F; Mansour, Joseph M; Drazba, Judith A; Sakai, Takao; Baskaran, Harihara

    2017-09-15

    Current clinical methods to treat articular cartilage lesions provide temporary relief of the symptoms but fail to permanently restore the damaged tissue. Tissue engineering, using mesenchymal stem cells (MSCs) combined with scaffolds and bioactive factors, is viewed as a promising method for repairing cartilage injuries. However, current tissue engineered constructs display inferior mechanical properties compared to native articular cartilage, which could be attributed to the lack of structural organization of the extracellular matrix (ECM) of these engineered constructs in comparison to the highly oriented structure of articular cartilage ECM. We previously showed that we can guide MSCs undergoing chondrogenesis to align using microscale guidance channels on the surface of a two-dimensional (2-D) collagen scaffold, which resulted in the deposition of aligned ECM within the channels and enhanced mechanical properties of the constructs. In this study, we developed a technique to roll 2-D collagen scaffolds containing MSCs within guidance channels in order to produce large-scale, three-dimensional (3-D) tissue engineered cartilage constructs with enhanced mechanical properties compared to current constructs. After rolling the MSC-scaffold constructs into a 3-D cylindrical structure, the constructs were cultured for 21 days under chondrogenic culture conditions. The microstructure architecture and mechanical properties of the constructs were evaluated using imaging and compressive testing. Histology and immunohistochemistry of the constructs showed extensive glycosaminoglycan (GAG) and collagen type II deposition. Second harmonic generation imaging and Picrosirius red staining indicated alignment of neo-collagen fibers within the guidance channels of the constructs. Mechanical testing indicated that constructs containing the guidance channels displayed enhanced compressive properties compared to control constructs without these channels. In conclusion, using a novel roll-up method, we have developed large-scale, MSC-based tissue-engineered cartilage that shows microscale structural organization and enhanced compressive properties compared to current tissue engineered constructs. Tissue engineered cartilage constructs made with human mesenchymal stem cells (hMSCs), scaffolds and bioactive factors are a promising solution to treat cartilage defects. A major disadvantage of these constructs is their inferior mechanical properties compared to the native tissue, which is likely due to the lack of structural organization of the extracellular matrix of the engineered constructs. In this study, we developed three-dimensional (3-D) cartilage constructs from rectangular scaffold sheets containing hMSCs in micro-guidance channels and characterized their mechanical properties and metabolic requirements. The work led to a novel roll-up method to embed 2-D microscale structures in 3-D constructs. Further, micro-guidance channels incorporated within the 3-D cartilage constructs led to the production of aligned cell-produced matrix and enhanced mechanical function. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  12. Combined effects of multiple large-scale hydraulic engineering on water stages in the middle Yangtze River

    NASA Astrophysics Data System (ADS)

    Han, Jianqiao; Sun, Zhaohua; Li, Yitian; Yang, Yunping

    2017-12-01

    Investigation of water stages influenced by human projects provides a better understanding of riverine geomorphological processes and river management. Based on hydrological data collected over 60 years, an extreme stage-extreme discharge analysis and a specific-gauge analysis were performed to research the individual and combined effects of multiple engineering projects on a long-term time series of water stages in the middle Yangtze River. Conclusions are as follows. (1) In accordance with the operation years of the Jingjiang cutoff (CF), the Gezhouba Dam (GD), and the Three Gorges Dam (TGD), the time series (1955-2012) was divided into periods of P1 (1955-1970), P2 (1971-1980), P3 (1981-2002), and P4 (2003-2012). Water stage changes during P1-P2, P2-P3, and P3-P4 are varied because of the differences in the types and scales of these projects. The stage decreased at Shashi and increased at Luoshan owing to the operation of the CF. Additionally, after the GD was constructed, the low-flow stage decreased in the upstream reach of Chenglingji and increased in its downstream reach, whereas the flood stage merely decreased at Yichang. Moreover, the TGD resulted in an overall decrease in low-flow stages and a limited increase in flood stages because of the differential adjustments of river geometry and resistance between the low-flow channel and flood channel. (2) Although differences existed in the scouring mechanisms between streamwise erosion associated with dams and headward erosion associated with cutoffs, particular bed textures in the gravel reach led to a similar adjustment, in that the stage reduction at Shashi was the greatest of all stations, which caused the flow slope and sediment transport capacity to decrease in the sandy reach. (3) These engineering projects caused changes in average low-flow and flood stages that varied between Yichang (-1.58 and -0.08 m, respectively), Shashi (-3.54 and -0.12 m), and Luoshan (1.15 and 0.97 m) from P1 to P4. However, less influence was observed at Hankou owing to its remote location and the short impoundment time of the TGD. (4) Potentially detrimental decreases in low-flow stages and increases in flood stages should be monitored and managed in the future. Our results are of practical significance for river management and the evaluation of the influences of large-scale anthropogenic activities on the hydrological regimes of large rivers.
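
    The specific-gauge analysis mentioned above compares, period by period, the stage corresponding to a fixed reference discharge at a gauge. A minimal sketch of that idea follows: it fits a simple polynomial stage-discharge rating for each period and evaluates it at a chosen reference discharge. The synthetic data, the quadratic rating form, the period labels, and the reference discharge are assumptions for illustration, not the study's hydrological records.

```python
import numpy as np

def stage_at_reference(discharge, stage, q_ref, degree=2):
    """Fit a polynomial stage-discharge rating curve and evaluate it at q_ref."""
    coeffs = np.polyfit(discharge, stage, degree)
    return np.polyval(coeffs, q_ref)

rng = np.random.default_rng(0)
q_ref = 10_000.0                                   # m^3/s, assumed reference discharge

def synthetic_period(n_years, datum):
    """Generate synthetic annual (discharge, stage) pairs for one period."""
    q = rng.uniform(5_000, 30_000, n_years)
    h = datum + 2.5e-4 * q - 3.0e-9 * q**2 + rng.normal(0.0, 0.05, n_years)
    return q, h

# Compare the stage at the reference discharge before and after an assumed project.
for name, n_years, datum in [("P1 (pre-project)", 16, 30.0), ("P4 (post-project)", 10, 28.5)]:
    q, h = synthetic_period(n_years, datum)
    print(f"{name}: stage at {q_ref:.0f} m^3/s = {stage_at_reference(q, h, q_ref):.2f} m")
```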

  13. VLSI (Very Large Scale Integrated) Design of a 16 Bit Very Fast Pipelined Carry Look Ahead Adder.

    DTIC Science & Technology

    1983-09-01

    ...the ability for systems engineers to custom design digital integrated circuits. Until recently, the design of integrated circuits has been... traditionally carried out by a select group of logic designers working in semiconductor laboratories. Systems engineers had to "make do" or "fit in" the... products of these labs to realize their designs. The systems engineers had little participation in the actual design of the chip. The Mead and Conway design...

  14. Visualization of flows in a motored rotary combustion engine using holographic interferometry

    NASA Technical Reports Server (NTRS)

    Hicks, Y. R.; Schock, H. J.; Craig, J. E.; Umstatter, H. L.; Lee, D. Y.

    1986-01-01

    The use of holographic interferometry to view the small- and large-scale flow field structures in the combustion chamber of a motored Wankel engine assembly is described. In order that the flow patterns of interest could be observed, small quantities of helium were injected with the intake air. Variation of the air flow patterns with engine speed, helium flow rate, and rotor position is described. The air flow at two locations within the combustion chamber was examined using this technique.

  15. Large-Scale Bi-Level Strain Design Approaches and Mixed-Integer Programming Solution Techniques

    PubMed Central

    Kim, Joonhoon; Reed, Jennifer L.; Maravelias, Christos T.

    2011-01-01

    The use of computational models in metabolic engineering has been increasing as more genome-scale metabolic models and computational approaches become available. Various computational approaches have been developed to predict how genetic perturbations affect metabolic behavior at a systems level, and have been successfully used to engineer microbial strains with improved primary or secondary metabolite production. However, identification of metabolic engineering strategies involving a large number of perturbations is currently limited by computational resources due to the size of genome-scale models and the combinatorial nature of the problem. In this study, we present (i) two new bi-level strain design approaches using mixed-integer programming (MIP), and (ii) general solution techniques that improve the performance of MIP-based bi-level approaches. The first approach (SimOptStrain) simultaneously considers gene deletion and non-native reaction addition, while the second approach (BiMOMA) uses minimization of metabolic adjustment to predict knockout behavior in a MIP-based bi-level problem for the first time. Our general MIP solution techniques significantly reduced the CPU times needed to find optimal strategies when applied to an existing strain design approach (OptORF) (e.g., from ∼10 days to ∼5 minutes for metabolic engineering strategies with 4 gene deletions), and identified strategies for producing compounds where previous studies could not (e.g., malate and serine). Additionally, we found novel strategies using SimOptStrain with higher predicted production levels (for succinate and glycerol) than could have been found using an existing approach that considers network additions and deletions in sequential steps rather than simultaneously. Finally, using BiMOMA we found novel strategies involving large numbers of modifications (for pyruvate and glutamate), which sequential search and genetic algorithms were unable to find. The approaches and solution techniques developed here will facilitate the strain design process and extend the scope of its application to metabolic engineering. PMID:21949695

  16. Large-scale bi-level strain design approaches and mixed-integer programming solution techniques.

    PubMed

    Kim, Joonhoon; Reed, Jennifer L; Maravelias, Christos T

    2011-01-01

    The use of computational models in metabolic engineering has been increasing as more genome-scale metabolic models and computational approaches become available. Various computational approaches have been developed to predict how genetic perturbations affect metabolic behavior at a systems level, and have been successfully used to engineer microbial strains with improved primary or secondary metabolite production. However, identification of metabolic engineering strategies involving a large number of perturbations is currently limited by computational resources due to the size of genome-scale models and the combinatorial nature of the problem. In this study, we present (i) two new bi-level strain design approaches using mixed-integer programming (MIP), and (ii) general solution techniques that improve the performance of MIP-based bi-level approaches. The first approach (SimOptStrain) simultaneously considers gene deletion and non-native reaction addition, while the second approach (BiMOMA) uses minimization of metabolic adjustment to predict knockout behavior in a MIP-based bi-level problem for the first time. Our general MIP solution techniques significantly reduced the CPU times needed to find optimal strategies when applied to an existing strain design approach (OptORF) (e.g., from ∼10 days to ∼5 minutes for metabolic engineering strategies with 4 gene deletions), and identified strategies for producing compounds where previous studies could not (e.g., malate and serine). Additionally, we found novel strategies using SimOptStrain with higher predicted production levels (for succinate and glycerol) than could have been found using an existing approach that considers network additions and deletions in sequential steps rather than simultaneously. Finally, using BiMOMA we found novel strategies involving large numbers of modifications (for pyruvate and glutamate), which sequential search and genetic algorithms were unable to find. The approaches and solution techniques developed here will facilitate the strain design process and extend the scope of its application to metabolic engineering.

  17. A NARROW SHORT-DURATION GRB JET FROM A WIDE CENTRAL ENGINE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duffell, Paul C.; Quataert, Eliot; MacFadyen, Andrew I., E-mail: duffell@berkeley.edu

    2015-11-01

    We use two-dimensional relativistic hydrodynamic numerical calculations to show that highly collimated relativistic jets can be produced in neutron star merger models of short-duration gamma-ray bursts (GRBs) without the need for a highly directed engine or a large net magnetic flux. Even a hydrodynamic engine generating a very wide sustained outflow on small scales can, in principle, produce a highly collimated relativistic jet, facilitated by a dense surrounding medium that provides a cocoon surrounding the jet core. An oblate geometry to the surrounding gas significantly enhances the collimation process. Previous numerical simulations have shown that the merger of two neutron stars produces an oblate, expanding cloud of dynamical ejecta. We show that this gas can efficiently collimate the central engine power much like the surrounding star does in long-duration GRB models. For typical short-duration GRB central engine parameters, we find jets with opening angles of order 10° in which a large fraction of the total outflow power of the central engine resides in highly relativistic material. These results predict large differences in the opening angles of outflows from binary neutron star mergers versus neutron star–black hole mergers.

  18. 40 CFR 1048.101 - What exhaust emission standards must my engines meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... engineering analysis of information equivalent to such in-use data, such as data from research engines or... my engines meet? 1048.101 Section 1048.101 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW, LARGE NONROAD SPARK-IGNITION ENGINES...

  19. 40 CFR 1048.101 - What exhaust emission standards must my engines meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... engineering analysis of information equivalent to such in-use data, such as data from research engines or... my engines meet? 1048.101 Section 1048.101 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW, LARGE NONROAD SPARK-IGNITION ENGINES...

  20. 40 CFR 1048.101 - What exhaust emission standards must my engines meet?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... engineering analysis of information equivalent to such in-use data, such as data from research engines or... my engines meet? 1048.101 Section 1048.101 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW, LARGE NONROAD SPARK-IGNITION ENGINES...

  1. 40 CFR 1048.101 - What exhaust emission standards must my engines meet?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... engineering analysis of information equivalent to such in-use data, such as data from research engines or... my engines meet? 1048.101 Section 1048.101 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW, LARGE NONROAD SPARK-IGNITION ENGINES...

  2. A General-Purpose Optimization Engine for Multi-Disciplinary Design Applications

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.; Berke, Laszlo

    1996-01-01

    A general purpose optimization tool for multidisciplinary applications, which in the literature is known as COMETBOARDS, is being developed at NASA Lewis Research Center. The modular organization of COMETBOARDS includes several analyzers and state-of-the-art optimization algorithms along with their cascading strategy. The code structure allows quick integration of new analyzers and optimizers. The COMETBOARDS code reads input information from a number of data files, formulates a design as a set of multidisciplinary nonlinear programming problems, and then solves the resulting problems. COMETBOARDS can be used to solve a large problem which can be defined through multiple disciplines, each of which can be further broken down into several subproblems. Alternatively, a small portion of a large problem can be optimized in an effort to improve an existing system. Some of the other unique features of COMETBOARDS include design variable formulation, constraint formulation, subproblem coupling strategy, global scaling technique, analysis approximation, use of either sequential or parallel computational modes, and so forth. The special features and unique strengths of COMETBOARDS assist convergence and reduce the amount of CPU time used to solve the difficult optimization problems of aerospace industries. COMETBOARDS has been successfully used to solve a number of problems, including structural design of space station components, design of nozzle components of an air-breathing engine, configuration design of subsonic and supersonic aircraft, mixed flow turbofan engines, wave rotor topped engines, and so forth. This paper introduces the COMETBOARDS design tool and its versatility, which is illustrated by citing examples from structures, aircraft design, and air-breathing propulsion engine design.
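
    The cascading strategy described above can be pictured as running several optimization algorithms in sequence on the same design problem, each warm-started from the best point found by its predecessor. The sketch below illustrates that idea with SciPy optimizers on a toy two-variable constrained problem; the objective, the constraint, and the method sequence are assumptions for illustration and are not COMETBOARDS code.

```python
import numpy as np
from scipy.optimize import minimize

# Toy design problem: minimize a weight-like objective subject to a stress-like limit.
objective = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2 + 0.5 * x[0] * x[1]
ineq = {"type": "ineq", "fun": lambda x: 4.0 - (x[0] ** 2 + x[1] ** 2)}  # g(x) >= 0

def cascade(x0, methods=("Nelder-Mead", "COBYLA", "SLSQP")):
    """Run optimizers in sequence, warm-starting each from the previous best point.
    The derivative-free methods explore first; SLSQP polishes with the constraint."""
    x = np.asarray(x0, dtype=float)
    for method in methods:
        kwargs = {"constraints": [ineq]} if method in ("COBYLA", "SLSQP") else {}
        res = minimize(objective, x, method=method, **kwargs)
        x = res.x
        print(f"{method:12s} -> f = {res.fun:.6f}, x = {np.round(x, 4)}")
    return x

cascade([5.0, 5.0])
```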

  3. Exploring the relationship between the engineering and physical sciences and the health and life sciences by advanced bibliometric methods.

    PubMed

    Waltman, Ludo; van Raan, Anthony F J; Smart, Sue

    2014-01-01

    We investigate the extent to which advances in the health and life sciences (HLS) are dependent on research in the engineering and physical sciences (EPS), particularly physics, chemistry, mathematics, and engineering. The analysis combines two different bibliometric approaches. The first approach to analyze the 'EPS-HLS interface' is based on term map visualizations of HLS research fields. We consider 16 clinical fields and five life science fields. On the basis of expert judgment, EPS research in these fields is studied by identifying EPS-related terms in the term maps. In the second approach, a large-scale citation-based network analysis is applied to publications from all fields of science. We work with about 22,000 clusters of publications, each representing a topic in the scientific literature. Citation relations are used to identify topics at the EPS-HLS interface. The two approaches complement each other. The advantages of working with textual data compensate for the limitations of working with citation relations and the other way around. An important advantage of working with textual data is in the in-depth qualitative insights it provides. Working with citation relations, on the other hand, yields many relevant quantitative statistics. We find that EPS research contributes to HLS developments mainly in the following five ways: new materials and their properties; chemical methods for analysis and molecular synthesis; imaging of parts of the body as well as of biomaterial surfaces; medical engineering mainly related to imaging, radiation therapy, signal processing technology, and other medical instrumentation; mathematical and statistical methods for data analysis. In our analysis, about 10% of all EPS and HLS publications are classified as being at the EPS-HLS interface. This percentage has remained more or less constant during the past decade.
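
    The citation-based clustering used in the second approach can be pictured on a toy scale: build a citation graph, detect communities (topics), and flag communities that contain publications from both domains as candidates for the EPS-HLS interface. The sketch below does this with NetworkX; the graph, the domain tags, and the use of greedy modularity clustering are assumptions for illustration, not the authors' actual clustering of the ~22,000 topics.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy citation network: nodes are publications tagged by broad domain.
edges = [("eps1", "eps2"), ("eps2", "eps3"), ("eps3", "hls4"),
         ("hls4", "hls5"), ("hls5", "hls6"), ("eps3", "hls5"),
         ("eps7", "eps8"), ("hls9", "hls10")]
G = nx.Graph(edges)                                   # citation links, direction ignored
domain = {n: ("EPS" if n.startswith("eps") else "HLS") for n in G.nodes}

# Cluster publications into topics and flag mixed-domain topics as interface candidates.
for i, community in enumerate(greedy_modularity_communities(G)):
    domains = {domain[n] for n in community}
    tag = "EPS-HLS interface" if domains == {"EPS", "HLS"} else "single-domain"
    print(f"topic {i}: {sorted(community)} -> {tag}")
```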

  4. Exploring the Relationship between the Engineering and Physical Sciences and the Health and Life Sciences by Advanced Bibliometric Methods

    PubMed Central

    Waltman, Ludo; van Raan, Anthony F. J.; Smart, Sue

    2014-01-01

    We investigate the extent to which advances in the health and life sciences (HLS) are dependent on research in the engineering and physical sciences (EPS), particularly physics, chemistry, mathematics, and engineering. The analysis combines two different bibliometric approaches. The first approach to analyze the ‘EPS-HLS interface’ is based on term map visualizations of HLS research fields. We consider 16 clinical fields and five life science fields. On the basis of expert judgment, EPS research in these fields is studied by identifying EPS-related terms in the term maps. In the second approach, a large-scale citation-based network analysis is applied to publications from all fields of science. We work with about 22,000 clusters of publications, each representing a topic in the scientific literature. Citation relations are used to identify topics at the EPS-HLS interface. The two approaches complement each other. The advantages of working with textual data compensate for the limitations of working with citation relations and the other way around. An important advantage of working with textual data is in the in-depth qualitative insights it provides. Working with citation relations, on the other hand, yields many relevant quantitative statistics. We find that EPS research contributes to HLS developments mainly in the following five ways: new materials and their properties; chemical methods for analysis and molecular synthesis; imaging of parts of the body as well as of biomaterial surfaces; medical engineering mainly related to imaging, radiation therapy, signal processing technology, and other medical instrumentation; mathematical and statistical methods for data analysis. In our analysis, about 10% of all EPS and HLS publications are classified as being at the EPS-HLS interface. This percentage has remained more or less constant during the past decade. PMID:25360616

  5. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    ...a temporary team is assigned to prepare layers and frameworks (presentation layer, domain layer, data access layer) for future feature teams...

  6. Multi-agent based control of large-scale complex systems employing distributed dynamic inference engine

    NASA Astrophysics Data System (ADS)

    Zhang, Daili

    Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system level requirements: robustness, flexibility, reusability, and scalability. Corresponding to the four system level requirements, there arise four major challenges. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method as an implementation of distributed intelligent control has been the focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent to agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with the focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining hierarchical control architecture and module control architecture with logical replication rings. First, it decomposes a complex system hierarchically; second, it combines the components in the same level as a module, and then designs common interfaces for all of the components in the same module; third, replications are made for critical agents and are organized into logical rings. This architecture maintains clear guidelines for complexity decomposition and also increases the robustness of the whole system. Multiple Sectioned Dynamic Bayesian Networks (MSDBNs) as a distributed dynamic probabilistic inference engine, can be embedded into the control architecture to handle uncertainties of general large-scale complex systems. MSDBNs decomposes a large knowledge-based system into many agents. Each agent holds its partial perspective of a large problem domain by representing its knowledge as a Dynamic Bayesian Network (DBN). Each agent accesses local evidence from its corresponding local sensors and communicates with other agents through finite message passing. If the distributed agents can be organized into a tree structure, satisfying the running intersection property and d-sep set requirements, globally consistent inferences are achievable in a distributed way. By using different frequencies for local DBN agent belief updating and global system belief updating, it balances the communication cost with the global consistency of inferences. In this dissertation, a fully factorized Boyen-Koller (BK) approximation algorithm is used for local DBN agent belief updating, and the static Junction Forest Linkage Tree (JFLT) algorithm is used for global system belief updating. MSDBNs assume a static structure and a stable communication network for the whole system. 
However, for a real system, sub-Bayesian networks as nodes could be lost, and the communication network could be shut down due to partial damage in the system. Therefore, on-line and automatic MSDBNs structure formation is necessary for making robust state estimations and increasing survivability of the whole system. A Distributed Spanning Tree Optimization (DSTO) algorithm, a Distributed D-Sep Set Satisfaction (DDSSS) algorithm, and a Distributed Running Intersection Satisfaction (DRIS) algorithm are proposed in this dissertation. Combining these three distributed algorithms and a Distributed Belief Propagation (DBP) algorithm in MSDBNs makes state estimations robust to partial damage in the whole system. Combining the distributed control architecture design and the distributed inference engine design leads to a process of control system design for a general large-scale complex system. As applications of the proposed methodology, the control system design of a simplified ship chilled water system and a notional ship chilled water system have been demonstrated step by step. Simulation results not only show that the proposed methodology gives a clear guideline for control system design for general large-scale complex systems with dynamic and uncertain environment, but also indicate that the combination of MSDBNs and HyMABC can provide excellent performance for controlling general large-scale complex systems.
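
    The fully factorized Boyen-Koller update mentioned above can be pictured on a tiny two-variable DBN: propagate the factored belief exactly through one time step, weight by local evidence, then project the resulting joint back onto independent marginals. The sketch below does exactly that with NumPy; the transition model and the evidence likelihoods are assumptions for illustration, not the dissertation's ship chilled water system models.

```python
import numpy as np

# Two binary state variables X1, X2; factored belief b = (b1, b2).
b1 = np.array([0.8, 0.2])
b2 = np.array([0.6, 0.4])

# Assumed transition model P(x1', x2' | x1, x2), shape (2, 2, 2, 2).
T = np.zeros((2, 2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        stay = 0.9 if x1 == x2 else 0.7                  # arbitrary illustrative dynamics
        T[x1, x2] = np.array([[stay, 1 - stay], [1 - stay, stay]]) / 2.0
        T[x1, x2] /= T[x1, x2].sum()                     # guard: keep each slice a distribution

def bk_step(b1, b2, T, like1, like2):
    """One fully factorized Boyen-Koller step: exact one-step propagation of the
    factored prior, local evidence weighting, then projection onto marginals."""
    joint_prior = np.einsum("i,j->ij", b1, b2)           # independence (factored) assumption
    joint_pred = np.einsum("ij,ijkl->kl", joint_prior, T)  # exact one-step prediction
    joint_post = joint_pred * np.outer(like1, like2)     # local sensor likelihoods
    joint_post /= joint_post.sum()
    return joint_post.sum(axis=1), joint_post.sum(axis=0)  # project back to marginals

b1, b2 = bk_step(b1, b2, T, like1=np.array([0.9, 0.1]), like2=np.array([0.5, 0.5]))
print("updated marginals:", np.round(b1, 3), np.round(b2, 3))
```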

  7. The induction of water to the inlet air as a means of internal cooling in aircraft-engine cylinders

    NASA Technical Reports Server (NTRS)

    Rothrock, Addison M; Krsek, Alois, Jr; Jones, Anthony W

    1943-01-01

    Report presents the results of investigations conducted on a full-scale air-cooled aircraft-engine cylinder of 202-cubic inch displacement to determine the effects of internal cooling by water induction on the maximum permissible power and output of an internal-combustion engine. For a range of fuel-air and water-fuel ratios, the engine inlet pressure was increased until knock was detected aurally, the power was then decreased 7 percent holding the ratios constant. The data indicated that water was a very effective internal coolant, permitting large increases in engine power as limited by either knock or by cylinder temperatures.

  8. Processing Shotgun Proteomics Data on the Amazon Cloud with the Trans-Proteomic Pipeline*

    PubMed Central

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W.; Moritz, Robert L.

    2015-01-01

    Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of cloud enabled Trans-Proteomic Pipeline by performing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. PMID:25418363

  9. Processing shotgun proteomics data on the Amazon cloud with the trans-proteomic pipeline.

    PubMed

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W; Moritz, Robert L

    2015-02-01

    Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of cloud enabled Trans-Proteomic Pipeline by performing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
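
    The resource-and-cost estimation mentioned above reduces to simple arithmetic once a per-file runtime and an instance price are assumed. The sketch below is an illustrative estimator only; the per-file runtime, fleet sizes, and hourly prices are placeholder assumptions, not measured Trans-Proteomic Pipeline timings or current Amazon EC2 rates.

```python
import math

def estimate_cost(n_jobs, minutes_per_job, n_instances, price_per_hour):
    """Rough wall-clock and cost estimate for a batch search fanned out over instances."""
    total_minutes = n_jobs * minutes_per_job
    wall_hours = math.ceil(total_minutes / n_instances / 60.0)   # billed per instance-hour
    return wall_hours, wall_hours * n_instances * price_per_hour

# Placeholder scenario: 1100 MS runs, each searched by 4 engines, 2 min per run per engine.
for name, n_inst, price in [("small fleet", 10, 0.10), ("large fleet", 50, 0.10)]:
    hours, cost = estimate_cost(1100 * 4, 2.0, n_inst, price)
    print(f"{name}: ~{hours} h wall clock, ~${cost:.2f} total")
```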

  10. RAID-2: Design and implementation of a large scale disk array controller

    NASA Technical Reports Server (NTRS)

    Katz, R. H.; Chen, P. M.; Drapeau, A. L.; Lee, E. K.; Lutz, K.; Miller, E. L.; Seshan, S.; Patterson, D. A.

    1992-01-01

    We describe the implementation of a large scale disk array controller and subsystem incorporating over 100 high performance 3.5 inch disk drives. It is designed to provide 40 MB/s sustained performance and 40 GB capacity in three 19 inch racks. The array controller forms an integral part of a file server that attaches to a Gb/s local area network. The controller implements a high bandwidth interconnect between an interleaved memory, an XOR calculation engine, the network interface (HIPPI), and the disk interfaces (SCSI). The system is now functionally operational, and we are tuning its performance. We review the design decisions, history, and lessons learned from this three-year university implementation effort to construct a truly large-scale system assembly.
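
    The XOR calculation engine mentioned above implements the parity arithmetic at the heart of a RAID array: the parity block is the bytewise XOR of the data blocks, so any single lost block can be rebuilt by XOR-ing the survivors with the parity. A minimal sketch of that property follows; the stripe width and block size are arbitrary illustrations, not the RAID-2 hardware design.

```python
import os
from functools import reduce

def xor_blocks(blocks):
    """Bytewise XOR of equally sized blocks (the RAID parity operation)."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

# Stripe of four data blocks plus one parity block.
data = [os.urandom(4096) for _ in range(4)]
parity = xor_blocks(data)

# Simulate losing block 2 and rebuilding it from the survivors plus parity.
survivors = [blk for i, blk in enumerate(data) if i != 2]
rebuilt = xor_blocks(survivors + [parity])
assert rebuilt == data[2], "reconstruction failed"
print("lost block rebuilt from parity:", rebuilt == data[2])
```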

  11. Recent developments in large-scale ozone generation with dielectric barrier discharges

    NASA Astrophysics Data System (ADS)

    Lopez, Jose L.

    2014-10-01

    Large-scale ozone generation for industrial applications has been based entirely on the creation of microplasmas or microdischarges using dielectric barrier discharge (DBD) reactors. Although versions of DBD-generated ozone have been in continuous use for over a hundred years, especially in water treatment, recent changes in environmental awareness and sustainability have led to a surge of ozone-generating facilities throughout the world. As a result of this enhanced global usage of this environmental cleaning application, various new discoveries have emerged in the science and technology of ozone generation. This presentation will describe some of the most recent breakthrough developments in large-scale ozone generation while further addressing some of the current scientific and engineering challenges of this technology.

  12. PGen: large-scale genomic variations analysis workflow and browser in SoyKB.

    PubMed

    Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti

    2016-10-06

    With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version in GitHub ( https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow ) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB), ( http://soykb.org/Pegasus/index.php ). Using PGen, we identified 10,218,140 single-nucleotide polymorphisms (SNPs) and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage. 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects adding to 500+ soybean germplasm lines in total have been integrated. These SNPs are being utilized for trait improvement using genotype to phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser ( http://soykb.org/NGS_Resequence/NGS_index.php ) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. PGen workflow has been optimized for the most efficient analysis of soybean data using thorough testing and validation. This research serves as an example of best practices for development of genomics data analysis workflows by integrating remote HPC resources and efficient data management with ease of use for biological users. PGen workflow can also be easily customized for analysis of data in other species.
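
    Downstream of a workflow like the one above, classifying called variants reduces to inspecting REF/ALT allele lengths in the output, assuming the calls are exported as a VCF. The sketch below is an illustrative pass over such a file with pysam; the file path is a placeholder and this is not part of the PGen code base (symbolic and multi-base alleles are lumped with indels for simplicity).

```python
from pysam import VariantFile

def count_variants(vcf_path):
    """Count SNPs (1-bp REF and ALT) versus other variants in a VCF from a variant caller."""
    snps = indels = 0
    vcf = VariantFile(vcf_path)
    for rec in vcf:
        for alt in rec.alts or ():
            if len(rec.ref) == 1 and len(alt) == 1:
                snps += 1
            else:
                indels += 1          # indels, MNPs, and symbolic alleles, for simplicity
    vcf.close()
    return snps, indels

snps, indels = count_variants("soy_lines.vcf.gz")   # placeholder path
print(f"SNPs: {snps:,}  indels/other: {indels:,}")
```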

  13. Digital Rocks Portal: a sustainable platform for imaged dataset sharing, translation and automated analysis

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Hanlon, M.; Nanda, G.; Agarwal, P.

    2015-12-01

    Recent advances in imaging have provided a wealth of 3D datasets that reveal pore space microstructure (nm to cm length scale) and allow investigation of nonlinear flow and mechanical phenomena from first principles using numerical approaches. This framework has popularly been called "digital rock physics". Researchers, however, have trouble storing and sharing the datasets both due to their size and the lack of standardized image types and associated metadata for volumetric datasets. This impedes scientific cross-validation of the numerical approaches that characterize large-scale porous media properties, as well as development of multiscale approaches required for correct upscaling. A single research group typically specializes in an imaging modality and/or related modeling on a single length scale, and lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geoscience or engineering researchers not necessarily trained in computer science or data analysis. Once widely accepted, the repository will jumpstart productivity and enable scientific inquiry and engineering decisions founded on a data-driven basis. This is the first repository of its kind. We show initial results on incorporating essential software tools and pipelines that make it easier for researchers to store and reuse data, and for educators to quickly visualize and illustrate concepts to a wide audience. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7 maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative.

  14. Analysis of variation in oil pressure in lubricating system

    NASA Astrophysics Data System (ADS)

    Sharma, Sumit; Upreti, Mritunjay; Sharma, Bharat; Poddar, Keshav

    2018-05-01

    Automotive maintenance of an engine contributes to its reliability, energy efficiency, and repair cost reduction. Modeling of engine performance and fault detection requires large amounts of data, which are usually obtained on test benches. This report presents a methodical study of the variation in the lubrication systems of various medium-speed engines. The study is limited to the influence of engine oil pressure on frictional losses, torque analysis at various oil pressures, and an analytical analysis of the engine lubrication system. The data collected from the engines under diagnosis are represented graphically. Finally, the results were used as a viable source for detecting and troubleshooting faults in the lubrication system of a regular passenger vehicle.

  15. A new hybrid meta-heuristic algorithm for optimal design of large-scale dome structures

    NASA Astrophysics Data System (ADS)

    Kaveh, A.; Ilchi Ghazaan, M.

    2018-02-01

    In this article a hybrid algorithm based on a vibrating particles system (VPS) algorithm, multi-design variable configuration (Multi-DVC) cascade optimization, and an upper bound strategy (UBS) is presented for global optimization of large-scale dome truss structures. The new algorithm is called MDVC-UVPS in which the VPS algorithm acts as the main engine of the algorithm. The VPS algorithm is one of the most recent multi-agent meta-heuristic algorithms mimicking the mechanisms of damped free vibration of single degree of freedom systems. In order to handle a large number of variables, cascade sizing optimization utilizing a series of DVCs is used. Moreover, the UBS is utilized to reduce the computational time. Various dome truss examples are studied to demonstrate the effectiveness and robustness of the proposed method, as compared to some existing structural optimization techniques. The results indicate that the MDVC-UVPS technique is a powerful search and optimization method for optimizing structural engineering problems.

  16. a Model Study of Small-Scale World Map Generalization

    NASA Astrophysics Data System (ADS)

    Cheng, Y.; Yin, Y.; Li, C. M.; Wu, W.; Guo, P. P.; Ma, X. L.; Hu, F. M.

    2018-04-01

    With globalization and rapid development, every field is taking an increasing interest in physical geography and human economics, and there is a surging worldwide demand for small-scale world maps in large formats. Further study of automated mapping technology, especially the realization of small-scale map production at the global level, is therefore a key problem for the cartographic field. In light of this, this paper adopts an improved, map-and-data-separated model for map generalization, which separates geographic data from mapping data; its main components are a cross-platform symbol library and an automatic map-making knowledge engine. In the cross-platform symbol library, the cartographic symbol and the physical symbol in the geographic information are configured at all scale levels. The automatic map-making knowledge engine consists of 97 types, 1086 subtypes, 21,845 basic algorithms, and over 2500 related functional modules. To evaluate the accuracy and visual effect of the model for topographic and thematic maps, small-scale world map generalization is taken as an example. After the generalization process, combining and simplifying the scattered islands makes the map clearer at the 1:2.1 billion scale, and the map features are more complete and accurate. The model not only significantly enhances map generalization at various scales but also achieves integration among map products at different scales, suggesting that it provides a reference for cartographic generalization across scales.

  17. Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems

    NASA Technical Reports Server (NTRS)

    Koch, Patrick N.

    1997-01-01

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation, these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration, for the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis; statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and noise modeling techniques for implementing robust preliminary design when approximate models are employed. Hierarchical partitioning and modeling techniques, including intermediate responses, linking variables, and compatibility constraints, are incorporated within a hierarchical compromise decision support problem formulation for synthesizing subproblem solutions for a partitioned system. Experimentation and approximation techniques are employed for concurrent investigations and modeling of partitioned subproblems. A modified composite experiment is introduced for fitting better predictive models across the ranges of the factors, and an approach for constructing partitioned response surfaces is developed to reduce the computational expense of experimentation for fitting models in a large number of factors. Noise modeling techniques are compared and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. This method and the associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system. The case study is developed in collaboration with Allison Engine Company (Rolls-Royce Aerospace) and is based on the existing Allison AE3007 engine designed for midsize commercial regional and business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified. The method and solutions identified are verified by comparison with the AE3007 engine. The solutions obtained are similar to the AE3007 cycle and configuration, but are better with respect to many of the requirements.
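
    As a small illustration of the response-surface step described above, the sketch below fits a full quadratic surface to a three-level design over two normalized factors using ordinary least squares. This is a generic sketch, not the dissertation's actual models: the factor layout and the placeholder response function are invented for illustration.

        # Minimal quadratic response-surface sketch (hypothetical factors and response).
        import itertools
        import numpy as np

        def design_matrix(X):
            """Columns for a full quadratic model: 1, x_i, x_i*x_j, x_i^2."""
            n, k = X.shape
            cols = [np.ones(n)]
            cols += [X[:, i] for i in range(k)]
            cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(k), 2)]
            cols += [X[:, i] ** 2 for i in range(k)]
            return np.column_stack(cols)

        # Three-level design over two normalized factors in [-1, 1].
        levels = [-1.0, 0.0, 1.0]
        X = np.array(list(itertools.product(levels, levels)))

        # Placeholder "expensive analysis" standing in for a subsystem simulation.
        y = 1.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 0] * X[:, 1] + 0.2 * X[:, 0] ** 2

        beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

        # The fitted surface can now stand in for the subsystem model during exploration.
        x_new = np.array([[0.25, -0.4]])
        print(design_matrix(x_new) @ beta)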

  18. Strategic Alliances in Education: The Knowledge Engineering Web

    ERIC Educational Resources Information Center

    Westera, Wim; van den Herik, Jaap; van de Vrie, Evert

    2004-01-01

    The field of higher education shows a jumble of alliances between fellow institutes. The alliances are strategic in kind and serve an economy-of-scales concept. A large scale is a prerequisite for allocating the budgets for new educational methods and technologies in order to keep the educational services up-to-date. All too often, however,…

  19. Electrical innovations, authority and consulting expertise in late Victorian Britain

    PubMed Central

    Arapostathis, Stathis

    2013-01-01

    In this article I examine the practices of electrical engineering experts, with special reference to their role in the implementation of innovations in late Victorian electrical networks. I focus on the consulting work of two leading figures in the scientific and engineering world of the period, Alexander Kennedy and William Preece. Both were Fellows of the Royal Society and both developed large-scale consulting activities in the emerging electrical industry of light and power. At the core of the study I place the issues of trust and authority, and the bearing of these on the engineering expertise of consultants in late Victorian Britain. I argue that the ascription of expertise to these engineers and the trust placed in their advice were products of power relations on the local scale. The study seeks to unravel both the technical and the social reasons for authoritative patterns of consulting expertise. PMID:24686584

  20. Lakeside: Merging Urban Design with Scientific Analysis

    ScienceCinema

    Guzowski, Leah; Catlett, Charlie; Woodbury, Ed

    2018-01-16

    Researchers at the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago are developing tools that merge urban design with scientific analysis to improve the decision-making process associated with large-scale urban developments. One such tool, called LakeSim, has been prototyped with an initial focus on consumer-driven energy and transportation demand, through a partnership with the Chicago-based architectural and engineering design firm Skidmore, Owings & Merrill, Clean Energy Trust and developer McCaffery Interests. LakeSim began with the need to answer practical questions about urban design and planning, requiring a better understanding about the long-term impact of design decisions on energy and transportation demand for a 600-acre development project on Chicago's South Side - the Chicago Lakeside Development project.

  1. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  2. Engineered Living Materials: Prospects and Challenges for Using Biological Systems to Direct the Assembly of Smart Materials.

    PubMed

    Nguyen, Peter Q; Courchesne, Noémie-Manuelle Dorval; Duraj-Thatte, Anna; Praveschotinunt, Pichet; Joshi, Neel S

    2018-05-01

    Vast potential exists for the development of novel, engineered platforms that manipulate biology for the production of programmed advanced materials. Such systems would possess the autonomous, adaptive, and self-healing characteristics of living organisms, but would be engineered with the goal of assembling bulk materials with designer physicochemical or mechanical properties, across multiple length scales. Early efforts toward such engineered living materials (ELMs) are reviewed here, with an emphasis on engineered bacterial systems, living composite materials which integrate inorganic components, successful examples of large-scale implementation, and production methods. In addition, a conceptual exploration of the fundamental criteria of ELM technology and its future challenges is presented. Cradled within the rich intersection of synthetic biology and self-assembling materials, the development of ELM technologies allows the power of biology to be leveraged to grow complex structures and objects using a palette of bio-nanomaterials. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Next-Gen Gene Synthesis Enables Large-Scale Engineering in Biological Systems: Recent advances in synthetic biology are making this field more promising than ever.

    PubMed

    Leake, Devin

    2015-01-01

    As scientists make strides toward the goal of developing a form of biological engineering that's as predictive and reliable as chemical engineering is for chemistry, one technology component has become absolutely critical: gene synthesis. Gene synthesis is the process of building stretches of deoxyribonucleic acid (DNA) to order--some stretches based on DNA that exists already in nature, some based on novel designs intended to accomplish new functions. This process is the foundation of synthetic biology, which is rapidly becoming the engineering counterpart to biology.

  4. Predicting cell viability within tissue scaffolds under equiaxial strain: multi-scale finite element model of collagen-cardiomyocytes constructs.

    PubMed

    Elsaadany, Mostafa; Yan, Karen Chang; Yildirim-Ayan, Eda

    2017-06-01

    Successful tissue engineering and regenerative therapy necessitate extensive knowledge of the mechanical milieu in engineered tissues and the resident cells. In this study, we have merged two powerful analysis tools, namely finite element analysis and stochastic analysis, to understand the mechanical strain within the tissue scaffold and residing cells and to predict the cell viability upon applying mechanical strains. A continuum-based multi-length scale finite element model (FEM) was created to simulate the physiologically relevant equiaxial strain exposure on cell-embedded tissue scaffold and to calculate strain transferred to the tissue scaffold (macro-scale) and residing cells (micro-scale) upon various equiaxial strains. The data from FEM were used to predict cell viability under various equiaxial strain magnitudes using stochastic damage criterion analysis. The model validation was conducted through mechanically straining the cardiomyocyte-encapsulated collagen constructs using a custom-built mechanical loading platform (EQUicycler). FEM quantified the strain gradients over the radial and longitudinal direction of the scaffolds and the cells residing in different areas of interest. With the use of the experimental viability data, stochastic damage criterion, and the average cellular strains obtained from multi-length scale models, cellular viability was predicted and successfully validated. This methodology can provide a great tool to characterize the mechanical stimulation of bioreactors used in tissue engineering applications by providing quantification of mechanical strain and predicting cellular viability variations due to applied mechanical strain.
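
    The coupling of a finite element strain estimate to a stochastic damage criterion can be illustrated with a toy Monte Carlo sketch: a cell is assumed viable when its local strain stays below a critical strain drawn from a normal distribution. The threshold parameters and strain values below are invented placeholders, not the calibrated values from this study.

        # Toy stochastic damage-criterion sketch: cells are assumed viable when their
        # local strain stays below a critical strain drawn from a normal distribution.
        # The mean/std of the critical strain are illustrative placeholders only.
        import numpy as np

        rng = np.random.default_rng(0)

        def predicted_viability(cell_strain, crit_mean=0.15, crit_std=0.03, n_samples=100_000):
            """Monte Carlo estimate of P(critical strain > applied cellular strain)."""
            critical = rng.normal(crit_mean, crit_std, n_samples)
            return np.mean(critical > cell_strain)

        # Hypothetical average cellular strains taken from a multi-scale FE model
        # at increasing applied equiaxial strain magnitudes.
        for applied, cell_strain in [(0.02, 0.03), (0.06, 0.08), (0.10, 0.14)]:
            print(f"applied strain {applied:.2f}: predicted viability {predicted_viability(cell_strain):.2f}")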

  5. Mathematical analysis of frontal affinity chromatography in particle and membrane configurations.

    PubMed

    Tejeda-Mansir, A; Montesinos, R M; Guzmán, R

    2001-10-30

    The scaleup and optimization of large-scale affinity-chromatographic operations in the recovery, separation and purification of biochemical components is of major industrial importance. The development of mathematical models to describe affinity-chromatographic processes, and the use of these models in computer programs to predict column performance is an engineering approach that can help to attain these bioprocess engineering tasks successfully. Most affinity-chromatographic separations are operated in the frontal mode, using fixed-bed columns. Purely diffusive and perfusion particles and membrane-based affinity chromatography are among the main commercially available technologies for these separations. For a particular application, a basic understanding of the main similarities and differences between particle and membrane frontal affinity chromatography and how these characteristics are reflected in the transport models is of fundamental relevance. This review presents the basic theoretical considerations used in the development of particle and membrane affinity chromatography models that can be applied in the design and operation of large-scale affinity separations in fixed-bed columns. A transport model for column affinity chromatography that considers column dispersion, particle internal convection, external film resistance, finite kinetic rate, plus macropore and micropore resistances is analyzed as a framework for exploring further the mathematical analysis. Such models provide a general realistic description of almost all practical systems. Specific mathematical models that take into account geometric considerations and transport effects have been developed for both particle and membrane affinity chromatography systems. Some of the most common simplified models, based on linear driving-force (LDF) and equilibrium assumptions, are emphasized. Analytical solutions of the corresponding simplified dimensionless affinity models are presented. Particular methods for estimating the parameters that characterize the mass-transfer and adsorption mechanisms in affinity systems are described.
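
    As a toy illustration of the simplified linear driving-force (LDF) models discussed above, the sketch below integrates a frontal-mode breakthrough curve for a fixed-bed affinity column, discretized as well-mixed stages with Langmuir equilibrium. All parameters are arbitrary illustrative values, not fitted to any real adsorbent.

        # Toy frontal-mode breakthrough curve for a fixed-bed affinity column using a
        # linear driving-force (LDF) rate law with Langmuir equilibrium, discretized
        # as tanks in series. All parameters are arbitrary illustrative values.
        import numpy as np
        from scipy.integrate import solve_ivp

        n = 20            # number of well-mixed stages
        Q = 1.0           # flow rate, mL/min
        V_liq = 0.5       # liquid volume per stage, mL
        V_sol = 0.5       # adsorbent volume per stage, mL
        c_feed = 1.0      # feed concentration, mg/mL
        q_max, K_d = 10.0, 0.1   # Langmuir capacity and dissociation constant
        k_ldf = 5.0       # LDF mass-transfer coefficient, 1/min

        def rhs(t, y):
            c, q = y[:n], y[n:]
            q_star = q_max * c / (K_d + c)          # local equilibrium loading
            dq = k_ldf * (q_star - q)               # LDF uptake rate
            c_in = np.concatenate(([c_feed], c[:-1]))
            dc = (Q * (c_in - c) - V_sol * dq) / V_liq
            return np.concatenate((dc, dq))

        sol = solve_ivp(rhs, (0.0, 200.0), np.zeros(2 * n), method="BDF", max_step=0.5)
        breakthrough = sol.y[n - 1] / c_feed        # normalized outlet concentration
        print(f"outlet c/c_feed at t = 200 min: {breakthrough[-1]:.2f}")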

  6. RNAslider: a faster engine for consecutive windows folding and its application to the analysis of genomic folding asymmetry.

    PubMed

    Horesh, Yair; Wexler, Ydo; Lebenthal, Ilana; Ziv-Ukelson, Michal; Unger, Ron

    2009-03-04

    Scanning large genomes with a sliding window in search of locally stable RNA structures is a well-motivated problem in bioinformatics. Given a predefined window size L and an RNA sequence S of size N (L < N), the consecutive windows folding problem is to compute the minimal free energy (MFE) for the folding of each of the L-sized substrings of S. The consecutive windows folding problem can be naively solved in O(NL³) by applying any of the classical cubic-time RNA folding algorithms to each of the N-L windows of size L. Recently an O(NL²) solution for this problem has been described. Here, we describe and implement an O(NLψ(L)) engine for the consecutive windows folding problem, where ψ(L) is shown to converge to O(1) under the assumption of a standard probabilistic polymer folding model, yielding an O(L) speedup, which is experimentally confirmed. Using this tool, we note an intriguing directionality (5'-3' vs. 3'-5') folding bias, i.e. that the minimal free energy (MFE) of folding is higher in the native direction of the DNA than in the reverse direction of various genomic regions in several organisms including regions of the genomes that do not encode proteins or ncRNA. This bias largely emerges from the genomic dinucleotide bias which affects the MFE; however, we see some variations in the folding bias in the different genomic regions when normalized to the dinucleotide bias. We also present results from calculating the MFE landscape of mouse chromosome 1, characterizing the MFE of the long ncRNA molecules that reside in this chromosome. The efficient consecutive windows folding engine described in this paper allows for genome-wide scans for ncRNA molecules as well as large-scale statistics. This is implemented here as a software tool, called RNAslider, and applied to the scanning of long chromosomes, leading to the observation of features that are visible only on a large scale.
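
    For reference, the naive O(NL³) formulation of the consecutive windows folding problem looks like the sketch below, where a placeholder mfe() stands in for a cubic-time folding routine; RNAslider's contribution is avoiding this per-window recomputation, which the sketch does not attempt to reproduce.

        # Naive formulation of the consecutive-windows folding problem: slide a window
        # of size L along sequence S and fold each window independently. The mfe()
        # stub stands in for a real cubic-time folding routine; RNAslider avoids this
        # recomputation to reach O(N*L*psi(L)).
        def mfe(window: str) -> float:
            """Placeholder for a real minimum-free-energy folding computation."""
            return -0.1 * (window.count("G") + window.count("C"))  # toy proxy only

        def consecutive_windows_mfe(S: str, L: int):
            return [mfe(S[i:i + L]) for i in range(len(S) - L + 1)]

        print(consecutive_windows_mfe("GCAUGCGGAUACGCUAGC", 8))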

  7. 7th Annual Systems Biology Symposium: Systems Biology and Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galitski, Timothy P.

    2008-04-01

    Systems biology recognizes the complex multi-scale organization of biological systems, from molecules to ecosystems. The International Symposium on Systems Biology has been hosted by the Institute for Systems Biology in Seattle, Washington, since 2002. The annual two-day event gathers the most influential researchers transforming biology into an integrative discipline investigating complex systems. Engineering and application of new technology is a central element of systems biology. Genome-scale, or very small-scale, biological questions drive the engineering of new technologies, which enable new modes of experimentation and computational analysis, leading to new biological insights and questions. Concepts and analytical methods in engineering are now finding direct applications in biology. Therefore, the 2008 Symposium, funded in partnership with the Department of Energy, featured global leaders in "Systems Biology and Engineering."

  8. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    A temporary team is assigned to prepare layers and frameworks for future feature teams (presentation layer, domain layer, data access layer, framework). Source: Software Engineering Institute, Carnegie Mellon University.

  9. Defaults, context, and knowledge: alternatives for OWL-indexed knowledge bases.

    PubMed

    Rector, A

    2004-01-01

    The new Web Ontology Language (OWL) and its Description Logic compatible sublanguage (OWL-DL) explicitly exclude defaults and exceptions, as do all logic based formalisms for ontologies. However, many biomedical applications appear to require default reasoning, at least if they are to be engineered in a maintainable way. Default reasoning has always been one of the great strengths of Frame systems such as Protégé. Resolving this conflict requires analysis of the different uses for defaults and exceptions. In some cases, alternatives can be provided within the OWL framework; in others, it appears that hybrid reasoning about a knowledge base of contingent facts built around the core ontology is necessary. Trade-offs include both human factors and the scaling of computational performance. The analysis presented here is based on the OpenGALEN experience with large scale ontologies using a formalism, GRAIL, which explicitly incorporates constructs for hybrid reasoning, numerous experiments with OWL, and initial work on combining OWL and Protégé.

  10. A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON

    PubMed Central

    King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix

    2008-01-01

    As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup, simulation, interaction with virtual environments to analysis and visualizations. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597
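
    The "replace the main integration loop" idea can be sketched with NEURON's Python interface: the model is described with the simulator's native API, but time stepping is driven by an external loop so that monitoring or analysis components can be plugged in between steps. This is a minimal sketch assuming the neuron Python package is available; the monitor function is a hypothetical stand-in for the framework's pluggable components, not part of the published tool.

        # Minimal sketch: build a model with NEURON's native API, then drive the time
        # stepping from our own loop instead of the built-in run() so that pluggable
        # components can execute between integration steps.
        from neuron import h

        soma = h.Section(name="soma")
        soma.insert("hh")                    # standard Hodgkin-Huxley membrane mechanism

        def monitor(t, v):
            """Hypothetical pluggable component run between integration steps."""
            print(f"t = {t:6.2f} ms, Vm = {v:7.2f} mV")

        h.finitialize(-65.0)
        step = 0
        while h.t < 20.0:                    # custom main loop in place of NEURON's run()
            h.fadvance()                     # advance the built-in integrator one step
            step += 1
            if step % 100 == 0:              # plug-in point for monitoring/analysis
                monitor(h.t, soma(0.5).v)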

  11. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  12. Space vehicle engine and heat shield environment review. Volume 1: Engineering analysis

    NASA Technical Reports Server (NTRS)

    Mcanelly, W. B.; Young, C. T. K.

    1973-01-01

    Methods for predicting the base heating characteristics of a multiple rocket engine installation are discussed. The environmental data are applied to the design of an adequate protection system for the engine components. The methods for predicting the base region thermal environment are categorized as: (1) scale model testing, (2) extrapolation of previous and related flight test results, and (3) semiempirical analytical techniques.

  13. VALIDATION OF ANSYS FINITE ELEMENT ANALYSIS SOFTWARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HAMM, E.R.

    2003-06-27

    This document provides a record of the verification and validation of the ANSYS Version 7.0 software that is installed on selected CH2M HILL computers. The issues addressed include: software verification, installation, validation, configuration management and error reporting. The ANSYS® computer program is a large scale multi-purpose finite element program which may be used for solving several classes of engineering analysis. The analysis capabilities of ANSYS Full Mechanical Version 7.0 installed on selected CH2M Hill Hanford Group (CH2M HILL) Intel processor based computers include the ability to solve static and dynamic structural analyses, steady-state and transient heat transfer problems, mode-frequency and buckling eigenvalue problems, static or time-varying magnetic analyses and various types of field and coupled-field applications. The program contains many special features which allow nonlinearities or secondary effects to be included in the solution, such as plasticity, large strain, hyperelasticity, creep, swelling, large deflections, contact, stress stiffening, temperature dependency, material anisotropy, and thermal radiation. The ANSYS program has been in commercial use since 1970, and has been used extensively in the aerospace, automotive, construction, electronic, energy services, manufacturing, nuclear, plastics, oil and steel industries.

  14. Development and application of theoretical models for Rotating Detonation Engine flowfields

    NASA Astrophysics Data System (ADS)

    Fievisohn, Robert

    As turbine and rocket engine technology matures, performance increases between successive generations of engine development are becoming smaller. One means of accomplishing significant gains in thermodynamic performance and power density is to use detonation-based heat release instead of deflagration. This work is focused on developing and applying theoretical models to aid in the design and understanding of Rotating Detonation Engines (RDEs). In an RDE, a detonation wave travels circumferentially along the bottom of an annular chamber where continuous injection of fresh reactants sustains the detonation wave. RDEs are currently being designed, tested, and studied as a viable option for developing a new generation of turbine and rocket engines that make use of detonation heat release. One of the main challenges in the development of RDEs is to understand the complex flowfield inside the annular chamber. While simplified models are desirable for obtaining timely performance estimates for design analysis, one-dimensional models may not be adequate as they do not provide flow structure information. In this work, a two-dimensional physics-based model is developed, which is capable of modeling the curved oblique shock wave, exit swirl, counter-flow, detonation inclination, and varying pressure along the inflow boundary. This is accomplished by using a combination of shock-expansion theory, Chapman-Jouguet detonation theory, the Method of Characteristics (MOC), and other compressible flow equations to create a shock-fitted numerical algorithm and generate an RDE flowfield. This novel approach provides a numerically efficient model that can provide performance estimates as well as details of the large-scale flow structures in seconds on a personal computer. Results from this model are validated against high-fidelity numerical simulations that may require a high-performance computing framework to provide similar performance estimates. This work provides a designer a new tool to conduct large-scale parametric studies to optimize a design space before conducting computationally-intensive, high-fidelity simulations that may be used to examine additional effects. The work presented in this thesis not only bridges the gap between simple one-dimensional models and high-fidelity full numerical simulations, but it also provides an effective tool for understanding and exploring RDE flow processes.

  15. Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark L.; Vice, Jason

    2011-01-01

    NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.

  16. The Effect of Reduction Gearing on Propeller-body Interference as Shown by Full-Scale Wind-Tunnel Tests

    NASA Technical Reports Server (NTRS)

    Weick, Fred E

    1931-01-01

    This report presents the results of full-scale tests made on a 10-foot 5-inch propeller on a geared J-5 engine and also on a similar 8-foot 11-inch propeller on a direct-drive J-5 engine. Each propeller was tested at two different pitch settings, and with a large and a small fuselage. The investigation was made in such a manner that the propeller-body interference factors were isolated, and it was found that, considering this interference only, the geared propellers had an appreciable advantage in propulsive efficiency, partially due to the larger diameter of the propellers with respect to the bodies, and partially because the geared propellers were located farther ahead of the engines and bodies.

  17. Review of the Need for a Large-scale Test Facility for Research on the Effects of Extreme Winds on Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. G. Little

    1999-03-01

    The Idaho National Engineering and Environmental Laboratory (INEEL), through the US Department of Energy (DOE), has proposed that a large-scale wind test facility (LSWTF) be constructed to study, at full scale, the behavior of low-rise structures under simulated extreme wind conditions. To determine the need for, and potential benefits of, such a facility, the Idaho Operations Office of the DOE requested that the National Research Council (NRC) perform an independent assessment of the role and potential value of an LSWTF in the overall context of wind engineering research. The NRC established the Committee to Review the Need for a Large-scale Test Facility for Research on the Effects of Extreme Winds on Structures, under the auspices of the Board on Infrastructure and the Constructed Environment, to perform this assessment. This report conveys the results of the committee's deliberations as well as its findings and recommendations. Data developed at large scale would enhance the understanding of how structures, particularly light-frame structures, are affected by extreme winds (e.g., hurricanes, tornadoes, severe thunderstorms, and other events). With a large-scale wind test facility, full-sized structures, such as site-built or manufactured housing and small commercial or industrial buildings, could be tested under a range of wind conditions in a controlled, repeatable environment. At this time, the US has no facility specifically constructed for this purpose. During the course of this study, the committee was confronted by three difficult questions: (1) does the lack of a facility equate to a need for the facility? (2) is need alone sufficient justification for the construction of a facility? and (3) would the benefits derived from information produced in an LSWTF justify the costs of producing that information? The committee's evaluation of the need and justification for an LSWTF was shaped by these realities.

  18. Engineered human skin substitutes undergo large-scale genomic reprogramming and normal skin-like maturation after transplantation to athymic mice.

    PubMed

    Klingenberg, Jennifer M; McFarland, Kevin L; Friedman, Aaron J; Boyce, Steven T; Aronow, Bruce J; Supp, Dorothy M

    2010-02-01

    Bioengineered skin substitutes can facilitate wound closure in severely burned patients, but deficiencies limit their outcomes compared with native skin autografts. To identify gene programs associated with their in vivo capabilities and limitations, we extended previous gene expression profile analyses to now compare engineered skin after in vivo grafting with both in vitro maturation and normal human skin. Cultured skin substitutes were grafted on full-thickness wounds in athymic mice, and biopsy samples for microarray analyses were collected at multiple in vitro and in vivo time points. Over 10,000 transcripts exhibited large-scale expression pattern differences during in vitro and in vivo maturation. Using hierarchical clustering, 11 different expression profile clusters were partitioned on the basis of differential sample type and temporal stage-specific activation or repression. Analyses show that the wound environment exerts a massive influence on gene expression in skin substitutes. For example, in vivo-healed skin substitutes gained the expression of many native skin-expressed genes, including those associated with epidermal barrier and multiple categories of cell-cell and cell-basement membrane adhesion. In contrast, immunological, trichogenic, and endothelial gene programs were largely lacking. These analyses suggest important areas for guiding further improvement of engineered skin for both increased homology with native skin and enhanced wound healing.
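
    The partitioning of expression profiles into clusters mentioned above can be illustrated generically with hierarchical clustering; the sketch below uses random data in place of the microarray time course and simply forces eleven clusters, so the numbers are illustrative only.

        # Generic sketch of partitioning expression profiles with hierarchical
        # clustering into stage-specific clusters. Random data stands in for the
        # real microarray time course.
        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        rng = np.random.default_rng(1)
        profiles = rng.normal(size=(500, 8))     # 500 transcripts x 8 time points

        Z = linkage(profiles, method="average", metric="correlation")
        clusters = fcluster(Z, t=11, criterion="maxclust")   # force 11 clusters

        for c in np.unique(clusters):
            print(f"cluster {c}: {np.sum(clusters == c)} transcripts")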

  19. A design-build-test cycle using modeling and experiments reveals interdependencies between upper glycolysis and xylose uptake in recombinant S. cerevisiae and improves predictive capabilities of large-scale kinetic models.

    PubMed

    Miskovic, Ljubisa; Alff-Tuomala, Susanne; Soh, Keng Cher; Barth, Dorothee; Salusjärvi, Laura; Pitkänen, Juha-Pekka; Ruohonen, Laura; Penttilä, Merja; Hatzimanikatis, Vassily

    2017-01-01

    Recent advancements in omics measurement technologies have led to an ever-increasing amount of available experimental data that necessitate systems-oriented methodologies for efficient and systematic integration of data into consistent large-scale kinetic models. These models can help us to uncover new insights into cellular physiology and also to assist in the rational design of bioreactor or fermentation processes. The Optimization and Risk Analysis of Complex Living Entities (ORACLE) framework for the construction of large-scale kinetic models can be used as guidance for formulating alternative metabolic engineering strategies. We used ORACLE in a metabolic engineering problem: improvement of the xylose uptake rate during mixed glucose-xylose consumption in a recombinant Saccharomyces cerevisiae strain. Using the data from bioreactor fermentations, we characterized network flux and concentration profiles representing possible physiological states of the analyzed strain. We then identified enzymes that could lead to improved flux through xylose transporters (XTR). For some of the identified enzymes, including hexokinase (HXK), we could not deduce if their control over XTR was positive or negative. We thus performed a follow-up experiment, and we found out that HXK2 deletion improves the xylose uptake rate. The data from the performed experiments were then used to prune the kinetic models, and the predictions of the pruned population of kinetic models were in agreement with the experimental data collected on the HXK2-deficient S. cerevisiae strain. We present a design-build-test cycle composed of modeling efforts and experiments with a glucose-xylose co-utilizing recombinant S. cerevisiae and its HXK2-deficient mutant that allowed us to uncover interdependencies between upper glycolysis and the xylose uptake pathway. Through this cycle, we also obtained kinetic models with improved prediction capabilities. The present study demonstrates the potential of integrated "modeling and experiments" systems biology approaches that can be applied for diverse applications ranging from biotechnology to drug discovery.

  20. Metabolic Network Modeling of Microbial Communities

    PubMed Central

    Biggs, Matthew B.; Medlock, Gregory L.; Kolling, Glynis L.

    2015-01-01

    Genome-scale metabolic network reconstructions and constraint-based analysis are powerful methods that have the potential to make functional predictions about microbial communities. Current use of genome-scale metabolic networks to characterize the metabolic functions of microbial communities includes species compartmentalization, separating species-level and community-level objectives, dynamic analysis, the “enzyme-soup” approach, multi-scale modeling, and others. There are many challenges inherent to the field, including a need for tools that accurately assign high-level omics signals to individual community members, new automated reconstruction methods that rival manual curation, and novel algorithms for integrating omics data and engineering communities. As technologies and modeling frameworks improve, we expect that there will be proportional advances in the fields of ecology, health science, and microbial community engineering. PMID:26109480
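
    Constraint-based analysis of a metabolic network reduces, in its simplest form, to a linear program: maximize an objective flux subject to steady-state mass balances and flux bounds. The sketch below runs this flux balance analysis on a toy three-reaction network; it illustrates the idea only and is not a community-scale model.

        # Toy flux balance analysis (constraint-based analysis): maximize a "biomass"
        # flux subject to the steady-state mass balance S v = 0 and flux bounds. The
        # three-reaction network is purely illustrative.
        import numpy as np
        from scipy.optimize import linprog

        # Reactions: v0 uptake (-> A), v1 conversion (A -> B), v2 biomass (B ->)
        S = np.array([[1, -1, 0],    # metabolite A
                      [0, 1, -1]])   # metabolite B
        bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10

        # linprog minimizes, so negate the biomass objective to maximize it.
        res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print("optimal fluxes:", res.x, "biomass flux:", -res.fun)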

  1. Development of Residential Prototype Building Models and Analysis System for Large-Scale Energy Efficiency Studies Using EnergyPlus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendon, Vrushali V.; Taylor, Zachary T.

    Recent advances in residential building energy efficiency and codes have resulted in increased interest in detailed residential building energy models using the latest energy simulation software. One of the challenges of developing residential building models to characterize new residential building stock is to allow for flexibility to address variability in house features like geometry, configuration, HVAC systems, etc. Researchers solved this problem in a novel way by creating a simulation structure capable of creating fully-functional EnergyPlus batch runs using a completely scalable residential EnergyPlus template system. This system was used to create a set of thirty-two residential prototype building models covering single- and multifamily buildings, four common foundation types and four common heating system types found in the United States (US). A weighting scheme with detailed state-wise and national weighting factors was designed to supplement the residential prototype models. The complete set is designed to represent a majority of new residential construction stock. The entire structure consists of a system of utility programs developed around the core EnergyPlus simulation engine to automate the creation and management of large-scale simulation studies with minimal human effort. The simulation structure and the residential prototype building models have been used for numerous large-scale studies, one of which is briefly discussed in this paper.
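
    The batch-automation idea described above can be sketched as a small driver that fills a template input file per prototype case and invokes the simulation engine; the energyplus command-line flags, the template placeholders, and the file names below are assumptions for illustration and will differ from the actual PNNL tooling.

        # Sketch of automating a batch of EnergyPlus-style runs from a template IDF.
        # Assumes an `energyplus` executable on PATH; the -w/-d flags, the weather
        # file, and the template fields are illustrative placeholders.
        import subprocess
        from pathlib import Path
        from string import Template

        template = Template(
            "! hypothetical prototype template\n"
            "Foundation,$foundation;\n"
            "HeatingSystem,$heating;\n"
        )
        cases = [
            {"name": "sf_slab_gas", "foundation": "SlabOnGrade", "heating": "GasFurnace"},
            {"name": "sf_bsmt_hp", "foundation": "HeatedBasement", "heating": "HeatPump"},
        ]

        for case in cases:
            out_dir = Path("runs") / case["name"]
            out_dir.mkdir(parents=True, exist_ok=True)
            idf = out_dir / "in.idf"
            idf.write_text(template.substitute(case))   # fill the template for this case
            subprocess.run(
                ["energyplus", "-w", "weather.epw", "-d", str(out_dir), str(idf)],
                check=True,
            )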

  2. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1987-01-01

    The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  3. Development of Large-Eddy Interaction Model for inhomogeneous turbulent flows

    NASA Technical Reports Server (NTRS)

    Hong, S. K.; Payne, F. R.

    1987-01-01

    The objective of this paper is to demonstrate the applicability of a currently proposed model, with minimum empiricism, for calculation of the Reynolds stresses and other turbulence structural quantities in a channel. The current Large-Eddy Interaction Model not only yields Reynolds stresses but also presents an opportunity to illuminate typical characteristic motions of large-scale turbulence and the phenomenological aspects of engineering models for two Reynolds numbers.

  4. The scaling of performance and losses in miniature internal combustion engines

    NASA Astrophysics Data System (ADS)

    Menon, Shyam Kumar

    Miniature glow ignition internal combustion (IC) piston engines are an off-the-shelf technology that could dramatically increase the endurance of miniature electric power supplies and the range and endurance of small unmanned air vehicles provided their overall thermodynamic efficiencies can be increased to 15% or better. This thesis presents the first comprehensive analysis of small (<500 g) piston engine performance. A unique dynamometer system is developed that is capable of making reliable measurements of engine performance and losses in these small engines. Methodologies are also developed for measuring volumetric, heat transfer, exhaust, mechanical, and combustion losses. These instruments and techniques are used to investigate the performance of seven single-cylinder, two-stroke, glow fueled engines ranging in size from 15 to 450 g (0.16 to 7.5 cm³ displacement). Scaling rules for power output, overall efficiency, and normalized power are developed from the data. These will be useful to developers of micro-air vehicles and miniature power systems. The data show that the minimum length scale of a thermodynamically viable piston engine based on present technology is approximately 3 mm. Incomplete combustion is the most important challenge as it accounts for 60-70% of total energy losses. Combustion losses are followed in order of importance by heat transfer, sensible enthalpy, and friction. A net heat release analysis based on in-cylinder pressure measurements suggests that a two-stage combustion process occurs at low engine speeds and equivalence ratios close to 1. Different theories based on burning mode and reaction kinetics are proposed to explain the observed results. High speed imaging of the combustion chamber suggests that a turbulent premixed flame with its origin in the vicinity of the glow plug is the primary driver of combustion. Placing miniature IC engines on a turbulent combustion regime diagram shows that they operate in the 'flamelet in eddy' regime whereas conventional-scale engines operate mostly in the 'wrinkled laminar flame sheet' regime. Taken together, the results show that the combustion process is the key obstacle to realizing the potential of small IC engines. Overcoming this obstacle will require new diagnostic techniques, measurements, combustion models, and high temperature materials.
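
    The net heat-release analysis mentioned above is commonly written, in single-zone form, as dQ/dθ = γ/(γ-1)·p·dV/dθ + 1/(γ-1)·V·dp/dθ. The sketch below applies this to synthetic pressure and volume traces; the traces and the value of γ are placeholders, since a real analysis uses measured in-cylinder data.

        # Single-zone net heat-release sketch:
        #   dQ/dtheta = gamma/(gamma-1) * p * dV/dtheta + 1/(gamma-1) * V * dp/dtheta
        # Synthetic p(theta), V(theta) traces stand in for measured in-cylinder data,
        # and gamma = 1.3 is a typical placeholder value.
        import numpy as np

        gamma = 1.3
        theta = np.linspace(-180.0, 180.0, 721)                  # crank angle, deg
        V = 1.0 + 0.5 * (1.0 - np.cos(np.radians(theta)))        # cm^3, toy volume trace
        p = 2.0 + 30.0 * np.exp(-((theta - 10.0) / 25.0) ** 2)   # bar, toy pressure trace

        dV = np.gradient(V, theta)
        dp = np.gradient(p, theta)
        dQ = gamma / (gamma - 1.0) * p * dV + 1.0 / (gamma - 1.0) * V * dp  # per degree

        cumulative = np.sum(dQ) * (theta[1] - theta[0])
        print(f"cumulative net heat release (arbitrary units): {cumulative:.1f}")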

  5. The effect of primary sedimentation on full-scale WWTP nutrient removal performance.

    PubMed

    Puig, S; van Loosdrecht, M C M; Flameling, A G; Colprim, J; Meijer, S C F

    2010-06-01

    Traditionally, the performance of full-scale wastewater treatment plants (WWTPs) is measured based on influent and/or effluent and waste sludge flows and concentrations. Full-scale WWTP data typically have a high variance and often contain (large) measurement errors. A good process engineering evaluation of the WWTP performance is therefore difficult. This also makes it difficult to evaluate the effect of process changes in a plant or to compare plants with each other. In this paper we used a case study of a full-scale nutrient-removing WWTP. The plant normally uses presettled wastewater; as a means to increase nutrient removal, the plant was operated for a period by-passing raw wastewater (27% of the influent flow). The effect of raw wastewater addition has been evaluated by different approaches: (i) influent characteristics, (ii) design retrofit, (iii) effluent quality, (iv) removal efficiencies, (v) activated sludge characteristics, (vi) microbial activity tests and FISH analysis and, (vii) performance assessment based on mass balance evaluation. This paper demonstrates that the mass balance evaluation approach helps WWTP engineers to distinguish between and quantify different strategies where other approaches could not. In the studied case, by-passing raw wastewater (27% of the influent flow) directly to the biological reactor did not improve the effluent quality or the nutrient removal efficiency of the WWTP. The increase of the influent C/N and C/P ratios was associated with particulate compounds with a low COD/VSS ratio and a high non-biodegradable COD fraction. Copyright 2010 Elsevier Ltd. All rights reserved.
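
    A minimal example of the mass-balance style of performance assessment used above is a conservative-species balance, such as total phosphorus, over the whole plant: the influent load should equal the effluent load plus the load wasted with sludge, and the closure error flags inconsistent data. The flows and concentrations below are invented illustrative numbers, not data from this plant.

        # Simple conservative-species (total phosphorus) balance over a WWTP, of the
        # kind used in mass-balance-based performance assessment. All flows and
        # concentrations are invented illustrative numbers.
        influent = {"flow_m3_d": 20000.0, "P_mg_L": 7.0}
        effluent = {"flow_m3_d": 19850.0, "P_mg_L": 0.8}
        waste_sludge = {"flow_m3_d": 600.0, "TSS_kg_m3": 8.0, "P_g_kgTSS": 25.0}

        load_in = influent["flow_m3_d"] * influent["P_mg_L"] / 1000.0          # kg P/d
        load_eff = effluent["flow_m3_d"] * effluent["P_mg_L"] / 1000.0         # kg P/d
        load_sludge = (waste_sludge["flow_m3_d"] * waste_sludge["TSS_kg_m3"]
                       * waste_sludge["P_g_kgTSS"] / 1000.0)                   # kg P/d

        gap = load_in - load_eff - load_sludge
        print(f"P balance closure error: {100.0 * gap / load_in:.1f}% of influent load")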

  6. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    PubMed

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-04

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded-style searches can only be made accessible if data analysis becomes not only unified but also, most importantly, scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed in the Python scripting language with a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem.
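
    The kind of scriptable, multi-engine workflow described above can be sketched in a few lines of plain Python. Note that this is a hypothetical stand-in and not Ursgal's actual API: run_search() and combine_results() are invented names illustrating the pattern of running several search engines on the same input and post-processing their combined output.

        # Hypothetical sketch of a scripted multi-engine workflow (not Ursgal's API):
        # run several search engines on the same spectra, then hand all results to a
        # combiner step. Both functions are stand-ins for real tool invocations.
        from pathlib import Path
        from typing import List

        def run_search(engine: str, mzml: Path, database: Path) -> Path:
            """Stand-in for invoking one peptide search engine; returns a result file."""
            out = mzml.parent / f"{mzml.stem}.{engine}.csv"
            out.write_text(f"results of {engine} on {mzml.name} vs {database.name}\n")
            return out

        def combine_results(result_files: List[Path]) -> Path:
            """Stand-in for a combined-FDR / combined-PEP style post-processing step."""
            combined = Path("combined_results.csv")
            combined.write_text("".join(p.read_text() for p in result_files))
            return combined

        engines = ["xtandem", "omssa", "msgfplus"]
        results = [run_search(e, Path("sample.mzML"), Path("proteome.fasta")) for e in engines]
        print("combined output:", combine_results(results))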

  7. Analysis and Testing of a Composite Fuselage Shield for Open Rotor Engine Blade-Out Protection

    NASA Technical Reports Server (NTRS)

    Pereira, J. Michael; Emmerling, William; Seng, Silvia; Frankenberger, Charles; Ruggeri, Charles R.; Revilock, Duane M.; Carney, Kelly S.

    2016-01-01

    The Federal Aviation Administration is working with the European Aviation Safety Agency to determine the certification base for proposed new engines that would not have a containment structure on large commercial aircraft. Equivalent safety to the current fleet is desired by the regulators, which means that loss of a single fan blade will not cause hazard to the Aircraft. The NASA Glenn Research Center and The Naval Air Warfare Center (NAWC), China Lake, collaborated with the FAA Aircraft Catastrophic Failure Prevention Program to design and test lightweight composite shields for protection of the aircraft passengers and critical systems from a released blade that could impact the fuselage. LS-DYNA® was used to predict the thickness of the composite shield required to prevent blade penetration. In the test, two composite blades were pyrotechnically released from a running engine, each impacting a composite shield with a different thickness. The thinner shield was penetrated by the blade and the thicker shield prevented penetration. This was consistent with pre-test LS-DYNA predictions. This paper documents the analysis conducted to predict the required thickness of a composite shield, the live fire test from the full scale rig at NAWC China Lake and describes the damage to the shields as well as instrumentation results.

  8. Large Scale GW Calculations on the Cori System

    NASA Astrophysics Data System (ADS)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  9. Flight effects on noise by the JT8D engine with inverted primary/fan flow as measured in the NASA-Ames 40 by 80 foot wind tunnel

    NASA Technical Reports Server (NTRS)

    Strout, F. G.

    1978-01-01

    A JT8D-17R engine with inverted primary and fan flows was tested under static conditions as well as in the NASA Ames 40 by 80 Foot Wind Tunnel to determine static and flight noise characteristics, and flow profile of a large scale engine. Test and analysis techniques developed by a previous model and JT8D engine test program were used to determine the in-flight noise. The engine with inverted flow was tested with a conical nozzle and with a plug nozzle, 20 lobe nozzle, and an acoustic shield. Wind tunnel results show that forward velocity causes significant reduction in peak PNL suppression relative to uninverted flow. The loss of EPNL suppression is relatively modest. The in-flight peak PNL suppression of the inverter with conical nozzle was 2.5 PNdb relative to a static value of 5.5 PNdb. The corresponding EPNL suppression was 4.0 EPNdb for flight and 5.0 EPNdb for static operation. The highest in-flight EPNL suppression was 7.5 EPNdb obtained by the inverter with 20 lobe nozzle and acoustic shield. When compared with the JT8D engine with internal mixer, the inverted flow configuration provides more EPNL suppression under both static and flight conditions.

  10. Scale model performance test investigation of exhaust system mixers for an Energy Efficient Engine /E3/ propulsion system

    NASA Technical Reports Server (NTRS)

    Kuchar, A. P.; Chamberlin, R.

    1980-01-01

    A scale model performance test was conducted as part of the NASA Energy Efficient Engine (E3) Program, to investigate the geometric variables that influence the aerodynamic design of exhaust system mixers for high-bypass, mixed-flow engines. Mixer configuration variables included lobe number, penetration and perimeter, as well as several cutback mixer geometries. Mixing effectiveness and mixer pressure loss were determined using measured thrust and nozzle exit total pressure and temperature surveys. Results provide a data base to aid the analysis and design development of the E3 mixed-flow exhaust system.

  11. Non-Destructive Characterization of Engineering Materials Using High-Energy X-rays at the Advanced Photon Source

    DOE PAGES

    Park, Jun-Sang; Okasinski, John; Chatterjee, Kamalika; ...

    2017-05-30

    High energy X-rays can penetrate large components and samples made from engineering alloys. Brilliant synchrotron sources like the Advanced Photon Source (APS) combined with unique experimental setups are increasingly allowing scientists and engineers to non-destructively characterize the state of materials across a range of length scales. In this article, some of the new developments at the APS, namely the high energy diffraction microscopy technique for grain-by-grain maps and aperture-based techniques for aggregate maps, are described.

  12. Non-Destructive Characterization of Engineering Materials Using High-Energy X-rays at the Advanced Photon Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jun-Sang; Okasinski, John; Chatterjee, Kamalika

    High energy X-rays can penetrate large components and samples made from engineering alloys. Brilliant synchrotron sources like the Advanced Photon Source (APS) combined with unique experimental setups are increasingly allowing scientists and engineers to non-destructively characterize the state of materials across a range of length scales. In this article, some of the new developments at the APS, namely the high energy diffraction microscopy technique for grain-by-grain maps and aperture-based techniques for aggregate maps, are described.

  13. Large-scale visualization projects for teaching software engineering.

    PubMed

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  14. A Technique of Two-Stage Clustering Applied to Environmental and Civil Engineering and Related Methods of Citation Analysis.

    ERIC Educational Resources Information Center

    Miyamoto, S.; Nakayama, K.

    1983-01-01

    A method of two-stage clustering of literature based on citation frequency is applied to 5,065 articles from 57 journals in environmental and civil engineering. Results of related methods of citation analysis (hierarchical graph, clustering of journals, multidimensional scaling) applied to the same set of articles are compared. Ten references are…

  15. DEM GPU studies of industrial scale particle simulations for granular flow civil engineering applications

    NASA Astrophysics Data System (ADS)

    Pizette, Patrick; Govender, Nicolin; Wilke, Daniel N.; Abriak, Nor-Edine

    2017-06-01

    The use of the Discrete Element Method (DEM) for industrial civil engineering applications is currently limited due to the computational demands when large numbers of particles are considered. The graphics processing unit (GPU) with its highly parallelized hardware architecture shows potential to enable solution of civil engineering problems using discrete granular approaches. We demonstrate in this study the practical utility of a validated GPU-enabled DEM modeling environment to simulate industrial-scale granular problems. As an illustration, the flow discharge of storage silos using 8 and 17 million particles is considered. DEM simulations have been performed to investigate the influence of particle size (equivalent size for the 20/40-mesh gravel) and induced shear stress for two hopper shapes. The preliminary results indicate that the shape of the hopper significantly influences the discharge rates for the same material. Specifically, this work shows that GPU-enabled DEM modeling environments can model industrial-scale problems on a single portable computer within a day for 30 seconds of process time.
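
    At its core, a DEM time step resolves pairwise contacts and integrates Newton's equations for each particle; the sketch below shows a one-dimensional two-particle collision with a linear spring-dashpot normal contact. The parameters are arbitrary, and a GPU code applies the same per-contact update to millions of particles in parallel.

        # Minimal 1-D DEM sketch: two spheres with a linear spring-dashpot normal
        # contact, integrated explicitly. Parameters are arbitrary illustrative values.
        import numpy as np

        k_n, c_n = 1.0e4, 5.0          # normal stiffness (N/m) and damping (N s/m)
        r, m = 0.01, 0.05              # particle radius (m) and mass (kg)
        x = np.array([0.0, 0.025])     # centre positions (m)
        v = np.array([1.0, -1.0])      # velocities (m/s), approaching each other
        dt = 1.0e-5

        for _ in range(2000):
            gap = (x[1] - x[0]) - 2.0 * r          # negative gap means overlap
            f = 0.0
            if gap < 0.0:
                # repulsive spring on the overlap plus dashpot on the closing speed
                f = -k_n * gap - c_n * (v[1] - v[0])
            a = np.array([-f, f]) / m              # equal and opposite accelerations
            v += a * dt
            x += v * dt

        print(f"post-collision velocities: {v[0]:.2f}, {v[1]:.2f} m/s")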

  16. Breast tissue engineering.

    PubMed

    Patrick, Charles W

    2004-01-01

    Tissue engineering has the potential to redefine rehabilitation for the breast cancer patient by providing a translatable strategy that restores the postmastectomy breast mound while concomitantly obviating limitations realized with contemporary reconstructive surgery procedures. The engineering design goal is to provide a sufficient volume of viable fat tissue based on a patient's own cells such that deficits in breast volume can be abrogated. To be sure, adipose tissue engineering is in its infancy, but tremendous strides have been made. Numerous studies attest to the feasibility of adipose tissue engineering. The field is now poised to challenge barriers to clinical translation that are germane to most tissue engineering applications, namely scale-up, large animal model development, and vascularization. The innovative and rapid progress of adipose engineering to date, as well as opportunities for its future growth, is presented.

  17. Genome-Wide Analysis of Transposon and Retroviral Insertions Reveals Preferential Integrations in Regions of DNA Flexibility.

    PubMed

    Vrljicak, Pavle; Tao, Shijie; Varshney, Gaurav K; Quach, Helen Ngoc Bao; Joshi, Adita; LaFave, Matthew C; Burgess, Shawn M; Sampath, Karuna

    2016-04-07

    DNA transposons and retroviruses are important transgenic tools for genome engineering. An important consideration affecting the choice of transgenic vector is their insertion site preferences. Previous large-scale analyses of Ds transposon integration sites in plants were done on the basis of reporter gene expression or germ-line transmission, making it difficult to discern vertebrate integration preferences. Here, we compare over 1300 Ds transposon integration sites in zebrafish with Tol2 transposon and retroviral integration sites. Genome-wide analysis shows that Ds integration sites in the presence or absence of marker selection are remarkably similar and distributed throughout the genome. No strict motif was found, but a preference for structural features in the target DNA associated with DNA flexibility (Twist, Tilt, Rise, Roll, Shift, and Slide) was observed. Remarkably, this feature is also found in transposon and retroviral integrations in maize and mouse cells. Our findings show that structural features influence the integration of heterologous DNA in genomes, and have implications for targeted genome engineering. Copyright © 2016 Vrljicak et al.

  18. Crowdsourcing biomedical research: leveraging communities as innovation engines

    PubMed Central

    Saez-Rodriguez, Julio; Costello, James C.; Friend, Stephen H.; Kellen, Michael R.; Mangravite, Lara; Meyer, Pablo; Norman, Thea; Stolovitzky, Gustavo

    2018-01-01

    The generation of large-scale biomedical data is creating unprecedented opportunities for basic and translational science. Typically, the data producers perform initial analyses, but it is very likely that the most informative methods may reside with other groups. Crowdsourcing the analysis of complex and massive data has emerged as a framework to find robust methodologies. When the crowdsourcing is done in the form of collaborative scientific competitions, known as Challenges, the validation of the methods is inherently addressed. Challenges also encourage open innovation, create collaborative communities to solve diverse and important biomedical problems, and foster the creation and dissemination of well-curated data repositories. PMID:27418159

  19. Crowdsourcing biomedical research: leveraging communities as innovation engines.

    PubMed

    Saez-Rodriguez, Julio; Costello, James C; Friend, Stephen H; Kellen, Michael R; Mangravite, Lara; Meyer, Pablo; Norman, Thea; Stolovitzky, Gustavo

    2016-07-15

    The generation of large-scale biomedical data is creating unprecedented opportunities for basic and translational science. Typically, the data producers perform initial analyses, but it is very likely that the most informative methods may reside with other groups. Crowdsourcing the analysis of complex and massive data has emerged as a framework to find robust methodologies. When the crowdsourcing is done in the form of collaborative scientific competitions, known as Challenges, the validation of the methods is inherently addressed. Challenges also encourage open innovation, create collaborative communities to solve diverse and important biomedical problems, and foster the creation and dissemination of well-curated data repositories.

  20. Fundamental research in artificial intelligence at NASA

    NASA Technical Reports Server (NTRS)

    Friedland, Peter

    1990-01-01

    This paper describes basic research at NASA in the field of artificial intelligence. The work is conducted at the Ames Research Center and the Jet Propulsion Laboratory, primarily under the auspices of the NASA-wide Artificial Intelligence Program in the Office of Aeronautics, Exploration and Technology. The research is aimed at solving long-term NASA problems in mission operations, spacecraft autonomy, preservation of corporate knowledge about NASA missions and vehicles, and management/analysis of scientific and engineering data. From a scientific point of view, the research is broken into the categories of: planning and scheduling; machine learning; and design of and reasoning about large-scale physical systems.

  1. Cell-free translational screening of an expression sequence tag library of Clonorchis sinensis for novel antigen discovery.

    PubMed

    Kasi, Devi; Catherine, Christy; Lee, Seung-Won; Lee, Kyung-Ho; Kim, Yu Jung; Ro Lee, Myeong; Ju, Jung Won; Kim, Dong-Myung

    2017-05-01

    The rapidly evolving cloning and sequencing technologies have enabled understanding of genomic structure of parasite genomes, opening up new ways of combatting parasite-related diseases. To make the most of the exponentially accumulating genomic data, however, it is crucial to analyze the proteins encoded by these genomic sequences. In this study, we adopted an engineered cell-free protein synthesis system for large-scale expression screening of an expression sequence tag (EST) library of Clonorchis sinensis to identify potential antigens that can be used for diagnosis and treatment of clonorchiasis. To allow high-throughput expression and identification of individual genes comprising the library, a cell-free synthesis reaction was designed such that both the template DNA and the expressed proteins were co-immobilized on the same microbeads, leading to microbead-based linkage of the genotype and phenotype. This reaction configuration allowed streamlined expression, recovery, and analysis of proteins. This approach enabled us to identify 21 antigenic proteins. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:832-837, 2017. © 2017 American Institute of Chemical Engineers.

  2. Analysis and Testing of a Composite Fuselage Shield for Open Rotor Engine Blade-Out Protection

    NASA Technical Reports Server (NTRS)

    Pereira, J. Michael; Emmerling, William; Seng, Silvia; Frankenberger, Charles; Ruggeri, Charles R.; Revilock, Duane M.; Carney, Kelly S.

    2015-01-01

    The Federal Aviation Administration is working with the European Aviation Safety Agency to determine the certification base for proposed new engines that would not have a containment structure on large commercial aircraft. Equivalent safety to the current fleet is desired by the regulators, which means that the loss of a single fan blade must not create a hazard to the aircraft. The NASA Glenn Research Center and The Naval Air Warfare Center (NAWC), China Lake, collaborated with the FAA Aircraft Catastrophic Failure Prevention Program to design and test lightweight composite shields for protection of the aircraft passengers and critical systems from a released blade that could impact the fuselage. In the test, two composite blades were pyrotechnically released from a running engine, each impacting a composite shield with a different thickness. The thinner shield was penetrated by the blade and the thicker shield prevented penetration. This was consistent with pre-test predictions. This paper documents the live fire test from the full scale rig at NAWC China Lake and describes the damage to the shields as well as instrumentation results.

  3. Connecting large-scale coastal behaviour with coastal management of the Rhône delta

    NASA Astrophysics Data System (ADS)

    Sabatier, François; Samat, Olivier; Ullmann, Albin; Suanez, Serge

    2009-06-01

    The aim of this paper is to connect the Large Scale Coastal Behaviour (LSCB) of the Rhône delta (shoreface sediment budget, river sediment input to the beaches, climatic change) with the impact and efficiency of hard engineering coastal structures. The analysis of the 1895 to 1974 bathymetric maps, as well as 2D modelling of the effect of wave blocking on longshore transport, allows us to draw up a conceptual model of the LSCB of the Rhône delta. The river sand input, settled in the mouth area (prodeltaic lobe), favours the advance of adjacent beaches. There is, however, only very weak alongshore sand feeding of the non-adjacent beaches farther from the mouth. After a mouth shift, the prodelta is eroded by aggressive waves and the sand is moved alongshore to build spits. This conceptual model suggests that there is a "timeshift" between the input of river sediments to the sea and the build-up of a beach non-adjacent to the mouth. Nowadays, as the river channels are controlled by dykes and human interventions, a river shift is not possible. It thus appears unlikely that the river sediments can supply the beaches of the Rhône delta coast. Under these conditions, we must expect that the problems of erosion will continue at Saintes-Maries-de-la-Mer and on the Faraman shore, areas of chronic erosion where the shoreline retreat was partially stopped by hard engineering practices in the 1980s. Therefore, these artificially stabilised sectors remain potentially under threat because of profile steepening and downdrift erosion, evidenced in this paper by bathymetric profile measurements. In the long term (1905 to 2003), the temporal analysis of storm surges and sea level shows very weak but reliable increasing trends. These climatic agents will thus become more aggressive on the beaches and on the coastal structures, calling their efficiency into question. We also show that the hard engineering structures were built in a favourable climatic context during the 1980s, whereas storm surges and sea-level rise have been stronger since the 1990s. Considering the LSCB of the Rhône delta and the impact of hard engineering coastal structures, we suggest that classical hard coastal protections are not the best option to protect the coast.

  4. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  5. Effects of Structural Flexibility on Aircraft-Engine Mounts

    NASA Technical Reports Server (NTRS)

    Phillips, W. H.

    1986-01-01

    Analysis extends a technique for the design of a widely used type of vibration-isolating mount for aircraft engines, in which rubber mounting pads are located in a plane behind the center of gravity of the engine-propeller combination. The new analysis treats the problem in statics. Results of this simple approach are useful in providing equations for the design of vibration-isolating mounts. The equations are applicable in the usual situation in which the engine-mount structure itself is relatively light and is placed between the large mass of the engine and other heavy components of the airplane.

  6. A MySQL Based EPICS Archiver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher Slominski

    2009-10-01

    Archiving a large fraction of the EPICS signals within the Jefferson Lab (JLAB) Accelerator control system is vital for postmortem and real-time analysis of the accelerator performance. This analysis is performed on a daily basis by scientists, operators, engineers, technicians, and software developers. Archiving poses unique challenges due to the magnitude of the control system. A MySQL Archiving system (Mya) was developed to scale to the needs of the control system; currently archiving 58,000 EPICS variables, updating at a rate of 11,000 events per second. In addition to the large collection rate, retrieval of the archived data must also be fast and robust. Archived data retrieval clients obtain data at a rate over 100,000 data points per second. Managing the data in a relational database provides a number of benefits. This paper describes an archiving solution that uses an open source database and standard off the shelf hardware to reach high performance archiving needs. Mya has been in production at Jefferson Lab since February of 2007.
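
    Mya's actual schema is not described in the abstract; as a hedged sketch of what relational channel archiving can look like, the snippet below uses Python's built-in SQLite module as a portable stand-in for MySQL, with a channel table, a timestamped history table, batched ingest, and a time-window retrieval query. All table, column, and channel names are illustrative assumptions, not Mya's design.

        import sqlite3
        import time

        # SQLite stands in for MySQL here; the SQL is deliberately generic.
        db = sqlite3.connect(":memory:")
        db.executescript("""
            CREATE TABLE channels (
                chan_id INTEGER PRIMARY KEY,
                name    TEXT UNIQUE NOT NULL
            );
            CREATE TABLE history (
                chan_id  INTEGER NOT NULL REFERENCES channels(chan_id),
                stamp_us INTEGER NOT NULL,   -- event time, microseconds since epoch
                value    REAL,
                PRIMARY KEY (chan_id, stamp_us)
            );
        """)

        # Register one (illustrative) process variable.
        db.execute("INSERT INTO channels(name) VALUES (?)", ("BPM:EXAMPLE:X",))
        chan_id = db.execute("SELECT chan_id FROM channels WHERE name = ?",
                             ("BPM:EXAMPLE:X",)).fetchone()[0]

        # Batch-insert a burst of monitor events, as an archiver ingest loop would.
        now_us = int(time.time() * 1e6)
        events = [(chan_id, now_us + i, 1.0 + 0.01 * i) for i in range(1000)]
        db.executemany("INSERT INTO history VALUES (?, ?, ?)", events)

        # Typical retrieval: one channel over a time window, in time order.
        rows = db.execute(
            "SELECT stamp_us, value FROM history"
            " WHERE chan_id = ? AND stamp_us BETWEEN ? AND ? ORDER BY stamp_us",
            (chan_id, now_us, now_us + 500)).fetchall()
        print(len(rows), "points retrieved")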

  7. Non-Axisymmetric Inflatable Pressure Structure (NAIPS) Full-Scale Pressure Test

    NASA Technical Reports Server (NTRS)

    Jones, Thomas C.; Doggett, William R.; Warren, Jerry E.; Watson, Judith J.; Shariff, Khadijah; Makino, Alberto; Yount, Bryan C.

    2017-01-01

    Inflatable space structures have the potential to significantly reduce the required launch volume for large pressure vessels required for exploration applications including habitats, airlocks and tankage. In addition, mass savings can be achieved via the use of high specific strength softgoods materials, and the reduced design penalty from launching the structure in a densely packaged state. Large inclusions however, such as hatches, induce a high mass penalty at the interfaces with the softgoods and in the added rigid structure while reducing the packaging efficiency. A novel, Non-Axisymmetric Inflatable Pressure Structure (NAIPS) was designed and recently tested at NASA Langley Research Center to demonstrate an elongated inflatable architecture that could provide areas of low stress along a principal axis in the surface. These low stress zones will allow the integration of a flexible linear seal that substantially reduces the added mass and volume of a heritage rigid hatch structure. This paper describes the test of the first full-scale engineering demonstration unit (EDU) of the NAIPS geometry and a comparison of the results to finite element analysis.

  8. Neuromorphic Hardware Architecture Using the Neural Engineering Framework for Pattern Recognition.

    PubMed

    Wang, Runchun; Thakur, Chetan Singh; Cohen, Gregory; Hamilton, Tara Julia; Tapson, Jonathan; van Schaik, Andre

    2017-06-01

    We present a hardware architecture that uses the neural engineering framework (NEF) to implement large-scale neural networks on field programmable gate arrays (FPGAs) for performing massively parallel real-time pattern recognition. NEF is a framework that is capable of synthesising large-scale cognitive systems from subnetworks and we have previously presented an FPGA implementation of the NEF that successfully performs nonlinear mathematical computations. That work was developed based on a compact digital neural core, which consists of 64 neurons that are instantiated by a single physical neuron using a time-multiplexing approach. We have now scaled this approach up to build a pattern recognition system by combining identical neural cores together. As a proof of concept, we have developed a handwritten digit recognition system using the MNIST database and achieved a recognition rate of 96.55%. The system is implemented on a state-of-the-art FPGA and can process 5.12 million digits per second. The architecture and hardware optimisations presented offer high-speed and resource-efficient means for performing high-speed, neuromorphic, and massively parallel pattern recognition and classification tasks.
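
    As a hedged, software-only sketch of the NEF principle the hardware implements (population encoding with heterogeneous tuning curves plus least-squares linear decoders), the example below encodes a scalar with 64 rectified-linear neurons and solves for regularized decoders. The neuron model, gains, biases, and regularization constant are assumptions for illustration and do not reproduce the FPGA neural core.

        import numpy as np

        rng = np.random.default_rng(0)
        n_neurons, n_samples = 64, 200       # 64 neurons, matching the size of the compact core

        # Heterogeneous tuning: random encoders (+1/-1 for a scalar), gains, and biases.
        encoders = rng.choice([-1.0, 1.0], size=n_neurons)
        gains = rng.uniform(0.5, 2.0, size=n_neurons)
        biases = rng.uniform(-1.0, 1.0, size=n_neurons)

        def rates(x):
            """Population firing rates for scalar input x (rectified-linear neurons)."""
            return np.maximum(0.0, gains * encoders * x + biases)

        # Solve for linear decoders that reconstruct x from the population activity
        # (ridge-regularised least squares over sampled inputs).
        xs = np.linspace(-1.0, 1.0, n_samples)
        A = np.array([rates(x) for x in xs])                 # (samples, neurons)
        reg = 0.1 * n_samples
        decoders = np.linalg.solve(A.T @ A + reg * np.eye(n_neurons), A.T @ xs)

        x_hat = A @ decoders
        print("RMS decode error:", np.sqrt(np.mean((x_hat - xs) ** 2)))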

  9. In the Shadow of Coal: How Large-Scale Industries Contributed to Present-Day Regional Differences in Personality and Well-Being.

    PubMed

    Obschonka, Martin; Stuetzer, Michael; Rentfrow, Peter J; Shaw-Taylor, Leigh; Satchell, Max; Silbereisen, Rainer K; Potter, Jeff; Gosling, Samuel D

    2017-11-20

    Recent research has identified regional variation of personality traits within countries but we know little about the underlying drivers of this variation. We propose that the Industrial Revolution, as a key era in the history of industrialized nations, has led to a persistent clustering of well-being outcomes and personality traits associated with psychological adversity via processes of selective migration and socialization. Analyzing data from England and Wales, we examine relationships between the historical employment share in large-scale coal-based industries (coal mining and steam-powered manufacturing industries that used this coal as fuel for their steam engines) and today's regional variation in personality and well-being. Even after controlling for possible historical confounds (historical energy supply, education, wealth, geology, climate, population density), we find that the historical local dominance of large-scale coal-based industries predicts today's markers of psychological adversity (lower Conscientiousness [and order facet scores], higher Neuroticism [and anxiety and depression facet scores], lower activity [an Extraversion facet], and lower life satisfaction and life expectancy). An instrumental variable analysis, using the historical location of coalfields, supports the causal assumption behind these effects (with the exception of life satisfaction). Further analyses focusing on mechanisms hint at the roles of selective migration and persisting economic hardship. Finally, a robustness check in the U.S. replicates the effect of the historical concentration of large-scale industries on today's levels of psychological adversity. Taken together, the results show how today's regional patterns of personality and well-being (which shape the future trajectories of these regions) may have their roots in major societal changes underway decades or centuries earlier. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
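
    As a hedged sketch of the instrumental-variable logic mentioned above (the study's instrument is the historical location of coalfields), the snippet below runs textbook two-stage least squares on simulated data in which an unobserved confounder biases ordinary least squares but not the IV estimate. All numbers are simulated for illustration and have no connection to the study's data.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 5000

        # Simulated setting: z = instrument (e.g. a coalfield-location proxy), u = unobserved
        # confounder, x = historical industry share, y = present-day outcome.
        u = rng.normal(size=n)
        z = rng.normal(size=n)
        x = 0.8 * z + 0.5 * u + rng.normal(size=n)
        y = -0.5 * x + 0.7 * u + rng.normal(size=n)          # true causal effect of x on y: -0.5

        def slope(x_reg, y_reg):
            """Slope from an ordinary least-squares fit with intercept."""
            X = np.column_stack([np.ones(len(x_reg)), x_reg])
            return np.linalg.lstsq(X, y_reg, rcond=None)[0][1]

        print("naive OLS estimate:", round(slope(x, y), 3))      # biased by the confounder

        # 2SLS: first stage predicts x from z; second stage regresses y on the prediction.
        a, b = np.linalg.lstsq(np.column_stack([np.ones(n), z]), x, rcond=None)[0]
        x_hat = a + b * z
        print("2SLS estimate:     ", round(slope(x_hat, y), 3))  # close to -0.5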

  10. Hybrid reduced order modeling for assembly calculations

    DOE PAGES

    Bang, Youngsuk; Abdel-Khalik, Hany S.; Jessee, Matthew A.; ...

    2015-08-14

    While the accuracy of assembly calculations has greatly improved due to the increase in computer power enabling more refined description of the phase space and use of more sophisticated numerical algorithms, the computational cost continues to increase, which limits the full utilization of their effectiveness for routine engineering analysis. Reduced order modeling is a mathematical vehicle that scales down the dimensionality of large-scale numerical problems to enable their repeated execution in small computing environments, often available to end users. This is done by capturing the most dominant underlying relationships between the model's inputs and outputs. Previous works demonstrated the use of reduced order modeling for a single physics code, such as a radiation transport calculation. This paper extends those works to coupled code systems as currently employed in assembly calculations. Finally, numerical tests are conducted using realistic SCALE assembly models with resonance self-shielding, neutron transport, and nuclide transmutation/depletion models representing the components of the coupled code system.
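
    As a hedged, generic illustration of the reduced order modeling idea described above (not the particular hybrid method of the paper), the sketch below builds a proper orthogonal decomposition basis from snapshots of a cheap stand-in "full-order" model and fits a linear map from input parameters to the reduced coordinates. The toy model, sample sizes, and retained rank are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(1)

        def full_model(p, n=500):
            """Stand-in for an expensive calculation: a smooth field over n cells
            that depends on two input parameters p = (p0, p1)."""
            x = np.linspace(0.0, 1.0, n)
            return p[0] * np.exp(-5.0 * x) + p[1] * np.sin(np.pi * x)

        # 1) Sample the full model to collect snapshots.
        params = rng.uniform(0.0, 1.0, size=(40, 2))
        snapshots = np.array([full_model(p) for p in params])          # (40, 500)
        mean_field = snapshots.mean(axis=0)

        # 2) POD basis from the SVD of the centred snapshot matrix; keep r modes.
        U, s, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)
        r = 3
        basis = Vt[:r]                                                 # (r, 500)

        # 3) Fit a cheap surrogate from inputs to reduced coordinates (least squares).
        coords = (snapshots - mean_field) @ basis.T                    # (40, r)
        X = np.hstack([params, np.ones((len(params), 1))])             # linear terms + intercept
        W, *_ = np.linalg.lstsq(X, coords, rcond=None)

        def reduced_model(p):
            z = np.append(p, 1.0) @ W
            return mean_field + z @ basis

        p_test = np.array([0.3, 0.7])
        err = (np.linalg.norm(reduced_model(p_test) - full_model(p_test))
               / np.linalg.norm(full_model(p_test)))
        print("relative ROM error:", err)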

  11. Coupled Finite Volume and Finite Element Method Analysis of a Complex Large-Span Roof Structure

    NASA Astrophysics Data System (ADS)

    Szafran, J.; Juszczyk, K.; Kamiński, M.

    2017-12-01

    The main goal of this paper is to present a coupled Computational Fluid Dynamics and structural analysis for the precise determination of wind impact on internal forces and deformations of structural elements of a long-span roof structure. The Finite Volume Method (FVM) serves for the solution of the fluid flow problem to model the air flow around the structure, whose results are applied in turn as the boundary tractions in the Finite Element Method structural solution for linear elastostatics with small deformations. The first part is carried out with the use of the ANSYS 15.0 computer system, whereas the FEM system Robot supports stress analysis in particular roof members. A comparison of the wind pressure distribution throughout the roof surface shows some differences with respect to that available in engineering design codes like Eurocode, which deserves separate further numerical studies. Coupling of these two separate numerical techniques appears to be promising in view of future computational models of stochastic nature in large scale structural systems due to the stochastic perturbation method.
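
    As a hedged sketch of the one-way coupling step described above (surface pressures from the flow solution handed to the structural model as boundary tractions), the snippet below lumps per-panel pressures on a triangulated surface into equivalent nodal forces. The geometry, pressures, and the equal three-way split are placeholder assumptions, not the paper's ANSYS-to-Robot transfer.

        import numpy as np

        def pressure_to_nodal_forces(nodes, tris, pressures):
            """Lump panel pressures (N/m^2, one per triangle) into nodal force vectors.

            Each triangle's force p * area * normal is split equally among its three
            nodes -- the simplest consistent load transfer for a one-way FVM -> FEM
            coupling.
            """
            forces = np.zeros_like(nodes)
            for tri, p in zip(tris, pressures):
                a, b, c = nodes[tri]
                cross = np.cross(b - a, c - a)
                area = 0.5 * np.linalg.norm(cross)
                normal = cross / np.linalg.norm(cross)
                f = p * area * normal / 3.0
                for k in tri:
                    forces[k] += f
            return forces

        # Placeholder geometry: two triangles of a flat roof patch under uniform suction.
        nodes = np.array([[0, 0, 5], [4, 0, 5], [4, 4, 5], [0, 4, 5]], dtype=float)
        tris = np.array([[0, 1, 2], [0, 2, 3]])
        pressures = np.array([-600.0, -600.0])        # N/m^2, assumed wind suction
        print(pressure_to_nodal_forces(nodes, tris, pressures))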

  12. Biogas utilization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moser, M.A.

    1996-01-01

    Options for successfully using biogas depend on project scale. Almost all biogas from anaerobic digesters must first go through a gas handling system that pressurizes, meters, and filters the biogas. Additional treatment, including hydrogen sulfide-mercaptan scrubbing, gas drying, and carbon dioxide removal may be necessary for specialized uses, but these are complex and expensive processes. Thus, they can be justified only for large-scale projects that require high-quality biogas. Small-scale projects (less than 65 cfm) generally use biogas (as produced) as a boiler fuel or for fueling internal combustion engine-generators to produce electricity. If engines or boilers are selected properly, there should be no need to remove hydrogen sulfide. Small-scale combustion turbines, steam turbines, and fuel cells are not used because of their technical complexity and high capital cost. Biogas cleanup to pipeline or transportation fuel specifications is very costly, and energy economics preclude this level of treatment.

  13. Large animal in vivo evaluation of a binary blend polymer scaffold for skeletal tissue-engineering strategies; translational issues.

    PubMed

    Smith, James O; Tayton, Edward R; Khan, Ferdous; Aarvold, Alexander; Cook, Richard B; Goodship, Allen; Bradley, Mark; Oreffo, Richard O C

    2017-04-01

    Binary blend polymers offer the opportunity to combine different desirable properties into a single scaffold, to enhance function within the field of tissue engineering. Previous in vitro and murine in vivo analysis identified a polymer blend of poly(l-lactic acid)-poly(ε-caprolactone) (PLLA:PCL 20:80) to have characteristics desirable for bone regeneration. Polymer scaffolds in combination with marrow-derived skeletal stem cells (SSCs) were implanted into mid-shaft ovine 3.5 cm tibial defects, and indices of bone regeneration were compared to groups implanted with scaffolds alone and with empty defects after 12 weeks, including micro-CT, mechanical testing and histological analysis. The critical nature of the defect was confirmed via all modalities. Both the scaffold and scaffold/SSC groups showed enhanced quantitative bone regeneration; however, this was only found to be significant in the scaffold/SSCs group (p = 0.04) and complete defect bridging was not achieved in any group. The mechanical strength was significantly less than that of contralateral control tibiae (p < 0.01) and would not be appropriate for full functional loading in a clinical setting. This study explored the hypothesis that cell therapy would enhance bone formation in a critical-sized defect compared to scaffold alone, using an external fixation construct, to bridge the scale-up gap between small animal studies and potential clinical translation. The model has proved a successful critical defect and analytical techniques have been found to be both valid and reproducible. Further work is required with both scaffold production techniques and cellular protocols in order to successfully scale up this stem cell/binary blend polymer scaffold. © 2015 The Authors. Journal of Tissue Engineering and Regenerative Medicine published by John Wiley & Sons, Ltd.

  14. Design and implementation of a distributed large-scale spatial database system based on J2EE

    NASA Astrophysics Data System (ADS)

    Gong, Jianya; Chen, Nengcheng; Zhu, Xinyan; Zhang, Xia

    2003-03-01

    With the increasing maturity of distributed object technology, CORBA, .NET and EJB are universally used in the traditional IT field. However, theories and practices of distributed spatial databases need further improvement, given the contradictions between large-scale spatial data and limited network bandwidth, and between transitory sessions and long transaction processing. Differences and trends among CORBA, .NET and EJB are discussed in detail; afterwards, the concept, architecture and characteristics of a distributed, large-scale, seamless spatial database system based on J2EE are presented, comprising a GIS client application, web server, GIS application server and spatial data server. Moreover, the design and implementation of the GIS client application components based on JavaBeans, the GIS engine based on servlets, and the GIS application server based on GIS Enterprise JavaBeans (containing session beans and entity beans) are explained. In addition, experiments on the relation between spatial data volume and response time under different conditions were conducted, which show that a distributed spatial database system based on J2EE can be used to manage, distribute and share large-scale spatial data on the Internet. Lastly, a distributed large-scale seamless image database based on the Internet is presented.

  15. Large-scale machine learning and evaluation platform for real-time traffic surveillance

    NASA Astrophysics Data System (ADS)

    Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel

    2016-09-01

    In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowd sourced ground truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% for 1/2 and about 78% for 19/20 of the time when tested on ~7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
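
    As a hedged, heavily scaled-down sketch of the training pipeline described (Haar-like features computed from integral images, boosted into a binary detector), the example below builds a handful of two-rectangle features on 24x24 windows and trains scikit-learn's AdaBoostClassifier on synthetic positives and negatives. The feature layout and data are placeholders and bear no relation to the study's 1,000,000-feature, 70,000-frame training set.

        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier

        rng = np.random.default_rng(2)

        def integral_image(img):
            return img.cumsum(axis=0).cumsum(axis=1)

        def rect_sum(ii, r0, c0, r1, c1):
            """Sum of pixels in the half-open window [r0:r1, c0:c1] via the integral image."""
            total = ii[r1 - 1, c1 - 1]
            if r0 > 0: total -= ii[r0 - 1, c1 - 1]
            if c0 > 0: total -= ii[r1 - 1, c0 - 1]
            if r0 > 0 and c0 > 0: total += ii[r0 - 1, c0 - 1]
            return total

        def haar_features(img):
            """A few two-rectangle (edge-like) Haar features on a 24x24 window."""
            ii = integral_image(img.astype(float))
            feats = []
            for r in range(0, 24, 8):
                for c in range(0, 24, 8):
                    left = rect_sum(ii, r, c, r + 8, c + 4)
                    right = rect_sum(ii, r, c + 4, r + 8, c + 8)
                    feats.append(left - right)
            return feats

        def sample(positive):
            """Synthetic stand-in: 'positives' contain a vertical edge, 'negatives' are noise."""
            img = rng.normal(0.0, 1.0, (24, 24))
            if positive:
                img[:, 12:] += 2.0
            return haar_features(img)

        X = np.array([sample(i % 2 == 0) for i in range(400)])
        y = np.array([1 if i % 2 == 0 else 0 for i in range(400)])

        clf = AdaBoostClassifier(n_estimators=50).fit(X[:300], y[:300])
        print("held-out accuracy:", clf.score(X[300:], y[300:]))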

  16. Structural Health Monitoring on Turbine Engines Using Microwave Blade Tip Clearance Sensors

    NASA Technical Reports Server (NTRS)

    Woike, Mark; Abdul-Aziz, Ali; Clem, Michelle

    2014-01-01

    The ability to monitor the structural health of the rotating components, especially in the hot sections of turbine engines, is of major interest to the aero community in improving engine safety and reliability. The use of instrumentation for these applications remains very challenging. It requires sensors and techniques that are highly accurate, are able to operate in a high temperature environment, and can detect minute changes and hidden flaws before catastrophic events occur. The National Aeronautics and Space Administration (NASA) has taken a lead role in the investigation of new sensor technologies and techniques for the in situ structural health monitoring of gas turbine engines. As part of this effort, microwave sensor technology has been investigated as a means of making high temperature non-contact blade tip clearance, blade tip timing, and blade vibration measurements for use in gas turbine engines. This paper presents a summary of key results and findings obtained from the evaluation of two different types of microwave sensors that have been investigated for possible use in structural health monitoring applications. The first is a microwave blade tip clearance sensor that has been evaluated on a large scale Axial Vane Fan, a subscale Turbofan, and more recently on sub-scale turbine engine like disks. The second is a novel microwave based blade vibration sensor that was also used in parallel with the microwave blade tip clearance sensors on the experiments with the sub-scale turbine engine disks.

  17. Structural health monitoring on turbine engines using microwave blade tip clearance sensors

    NASA Astrophysics Data System (ADS)

    Woike, Mark; Abdul-Aziz, Ali; Clem, Michelle

    2014-04-01

    The ability to monitor the structural health of the rotating components, especially in the hot sections of turbine engines, is of major interest to the aero community in improving engine safety and reliability. The use of instrumentation for these applications remains very challenging. It requires sensors and techniques that are highly accurate, are able to operate in a high temperature environment, and can detect minute changes and hidden flaws before catastrophic events occur. The National Aeronautics and Space Administration (NASA) has taken a lead role in the investigation of new sensor technologies and techniques for the in situ structural health monitoring of gas turbine engines. As part of this effort, microwave sensor technology has been investigated as a means of making high temperature non-contact blade tip clearance, blade tip timing, and blade vibration measurements for use in gas turbine engines. This paper presents a summary of key results and findings obtained from the evaluation of two different types of microwave sensors that have been investigated for possible use in structural health monitoring applications. The first is a microwave blade tip clearance sensor that has been evaluated on a large scale Axial Vane Fan, a subscale Turbofan, and more recently on sub-scale turbine engine like disks. The second is a novel microwave based blade vibration sensor that was also used in parallel with the microwave blade tip clearance sensors on the same experiments with the sub-scale turbine engine disks.

  18. [Civil engineering education at the Imperial College of Engineering in Tokyo: an analysis based on Ayahiko Ishibashi's memoirs].

    PubMed

    Wada, Masanori

    2014-01-01

    The Imperial College of Engineering (ICE or Kobu-Daigakko) in Tokyo, founded in 1873 under the auspices of the Ministry of Public Works, was one of the most prominent modern institutions of engineering education in early Meiji Japan. Previous studies have revealed that the ICE offered large scale practical training programs at enterprises of the Ministry, which sometimes lasted several months, and praised their ideal combination of theory and practice. In reality, it has been difficult to evaluate the quality of education at the ICE, mainly because of the scarcity of sources. ICE students published a collection of memoirs for alumni members, commemorating the fiftieth year of the history of the Tokyo Imperial University. Drawing on this previously neglected collection of students' memoirs, this paper appraises the civil engineering education offered by the ICE. The paper also compares this collection with other official records of the college, and confirms it as a reliable source, even though it contains some minor errors. The author particularly uses the memoirs of Ayahiko Ishibashi, one of the first graduates from its civil engineering course, who left substantial reminiscences of the education he received. This paper, as a result, illustrates that the main practical training for the students of civil engineering was limited to the design process, including surveying. Furthermore, the practical training that Ishibashi received at those enterprises often lacked a plan, and its effectiveness was questionable.

  19. The Utility-Scale Future - Continuum Magazine | NREL

    Science.gov Websites

    Continuum Magazine, Spring 2011 / Issue 1 (Clean Energy Innovation at NREL): The Utility-Scale Future. Related features in the issue include "Wind Innovation Enables Utility-Scale," "Beyond R&D: Market Impact," "NREL Analysis," and "Partnering: An Engine for Innovation."

  20. The quest for better quality-of-life - learning from large-scale shaking table tests

    NASA Astrophysics Data System (ADS)

    Nakashima, M.; Sato, E.; Nagae, T.; Kunio, F.; Takahito, I.

    2010-12-01

    Earthquake engineering has its origins in the practice of "learning from actual earthquakes and earthquake damages." That is, we recognize serious problems by witnessing the actual damage to our structures, and then we develop and apply engineering solutions to solve these problems. This tradition in earthquake engineering, i.e., "learning from actual damage," was an obvious engineering response to earthquakes and arose naturally as a practice in a civil and building engineering discipline that traditionally places more emphasis on experience than do other engineering disciplines. But with the rapid progress of urbanization, as society becomes denser, and as the many components that form our society interact with increasing complexity, the potential damage with which earthquakes threaten society also increases. In such an era, the approach of "learning from actual earthquake damages" becomes unacceptably dangerous and expensive. Among the practical alternatives to the old practice is to "learn from quasi-actual earthquake damages." One tool for experiencing earthquake damages without attendant catastrophe is the large shaking table. E-Defense, the largest one we have, was developed in Japan after the 1995 Hyogoken-Nanbu (Kobe) earthquake. Since its inauguration in 2005, E-Defense has conducted over forty full-scale or large-scale shaking table tests, applied to a variety of structural systems. The tests supply detailed data on actual behavior and collapse of the tested structures, offering the earthquake engineering community opportunities to experience and assess the actual seismic performance of the structures, and to help society prepare for earthquakes. Notably, the data were obtained without having to wait for the aftermaths of actual earthquakes. Earthquake engineering has always been about life safety, but in recent years maintaining the quality of life has also become a critical issue. Quality-of-life concerns include nonstructural damage, business continuity, public health, quickness of damage assessment, infrastructure, data and communication networks, and other issues, and not enough useful empirical data have emerged about these issues from the experiences of actual earthquakes. To provide quantitative data that can be used to reduce earthquake risk to our quality of life, E-Defense has recently been implementing two comprehensive research projects in which a base-isolated hospital and a steel high-rise building were tested using the E-Defense shaking table and their seismic performance was examined particularly in terms of nonstructural damage, damage to building contents and furniture, and operability, functionality, and business-continuity capability. The paper presents an overview of the two projects, together with major findings obtained from the projects.

  1. Defining Gas Turbine Engine Performance Requirements for the Large Civil TiltRotor (LCTR2)

    NASA Technical Reports Server (NTRS)

    Snyder, Christopher A.

    2013-01-01

    Defining specific engine requirements is a critical part of identifying technologies and operational models for potential future rotary wing vehicles. NASA's Fundamental Aeronautics Program, Subsonic Rotary Wing Project has identified the Large Civil TiltRotor (LCTR) as the configuration to best meet technology goals. This notional vehicle concept has evolved with more clearly defined mission and operational requirements to the LCTR-iteration 2 (LCTR2). This paper reports on efforts to further review and refine the LCTR2 analyses to ascertain specific engine requirements and propulsion sizing criteria. The baseline mission and other design or operational requirements are reviewed. Analysis tools are described to help understand their interactions and underlying assumptions. Various design and operational conditions are presented and explained for their contribution to defining operational and engine requirements. These identified engine requirements are discussed to suggest which are most critical to the engine sizing and operation. The most-critical engine requirements are compared to in-house NASA engine simulations to try to ascertain which operational requirements define engine requirements versus points within the available engine operational capability. Finally, results are summarized with suggestions for future efforts to improve analysis capabilities, and better define and refine mission and operational requirements.

  2. Trajectory and Mixing Scaling Laws for Confined and Unconfined Transverse Jets

    DTIC Science & Technology

    2012-05-01

    engines, issues of confinement, very large density ratio, and super/transcritical effects complicate the utility of the ... opposite wall at a streamwise position that is one-half pipe diameter downstream of the injection location (termed moderate impaction). This ... BD, and Eq. 10 scaling laws are 0.97 and 0.90, respectively. One of the primary effects of the confinement is that the

  3. Human Factors Virtual Analysis Techniques for NASA's Space Launch System Ground Support using MSFC's Virtual Environments Lab (VEL)

    NASA Technical Reports Server (NTRS)

    Searcy, Brittani

    2017-01-01

    Using virtual environments to assess complex large scale human tasks provides timely and cost effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance to human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine if a specific task is capable of being safely performed by both a 5th percentile (approx. 5 ft) female and a 95th percentile (approx. 6 ft 1 in) male. In addition, the analysis will help identify any tools or other accommodations that may be needed to help complete the task. These assessments are critical for the safety of ground support engineers and keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame by frame basis, while virtual reality gives the actor (person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis on SLS and the benefits of analyzing tasks in NASA MSFC's VEL.

  4. Space transportation booster engine configuration study. Volume 2: Design definition document and environmental analysis

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The objective of the Space Transportation Booster Engine (STBE) Configuration Study is to contribute to the Advanced Launch System (ALS) development effort by providing highly reliable, low cost booster engine concepts for both expendable and reusable rocket engines. The specific objectives of the study were: (1) to identify engine configurations which enhance vehicle performance and provide operational flexibility at low cost, and (2) to explore innovative approaches to the follow-on Full-Scale Development (FSD) phase for the STBE.

  5. Systems Engineering Provides Successful High Temperature Steam Electrolysis Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles V. Park; Emmanuel Ohene Opare, Jr.

    2011-06-01

    This paper describes two Systems Engineering Studies completed at the Idaho National Laboratory (INL) to support development of the High Temperature Steam Electrolysis (HTSE) process. HTSE produces hydrogen from water using nuclear power and was selected by the Department of Energy (DOE) for integration with the Next Generation Nuclear Plant (NGNP). The first study was a reliability, availability and maintainability (RAM) analysis to identify critical areas for technology development based on available information regarding expected component performance. An HTSE process baseline flowsheet at commercial scale was used as a basis. The NGNP project also established a process and capability to perform future RAM analyses. The analysis identified which components had the greatest impact on HTSE process availability and indicated that the HTSE process could achieve over 90% availability. The second study developed a series of life-cycle cost estimates for the various scale-ups required to demonstrate the HTSE process. Both studies were useful in identifying near- and long-term efforts necessary for successful HTSE process deployment. The size of demonstrations to support scale-up was refined, which is essential to estimate near- and long-term cost and schedule. The life-cycle funding profile, with high-level allocations, was identified as the program transitions from experiment scale R&D to engineering scale demonstration.
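
    The RAM study itself is not reproduced in the abstract; as a hedged sketch of the basic arithmetic such an analysis builds on, the snippet below composes steady-state availabilities from assumed MTBF/MTTR figures for components in series and shows how a redundant (parallel) component raises the system value. All component names and numbers are invented for illustration.

        def availability(mtbf_hours, mttr_hours):
            """Steady-state availability of a single repairable component."""
            return mtbf_hours / (mtbf_hours + mttr_hours)

        def series(avails):
            """All components must work: availabilities multiply."""
            a = 1.0
            for x in avails:
                a *= x
            return a

        def parallel(avails):
            """System works if at least one redundant component works."""
            u = 1.0
            for x in avails:
                u *= (1.0 - x)
            return 1.0 - u

        # Illustrative components of an electrolysis train (numbers are assumptions).
        stack  = availability(mtbf_hours=4000.0, mttr_hours=72.0)
        blower = availability(mtbf_hours=8000.0, mttr_hours=24.0)
        heater = availability(mtbf_hours=6000.0, mttr_hours=48.0)

        single_train = series([stack, blower, heater])
        with_spare_blower = series([stack, parallel([blower, blower]), heater])

        print(f"single train availability: {single_train:.3f}")
        print(f"with redundant blower:     {with_spare_blower:.3f}")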

  6. Additional challenges for uncertainty analysis in river engineering

    NASA Astrophysics Data System (ADS)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    The management of rivers for improving safety, shipping and environment requires conscious effort on the part of river managers. River engineers design hydraulic works to tackle various challenges, from increasing flow conveyance to ensuring minimal water depths for environmental flow and inland shipping. Last year saw the completion of such large scale river engineering in the 'Room for the River' programme for the Dutch Rhine River system, in which several dozen human interventions were built to increase flood safety. Engineering works in rivers are not completed in isolation from society. Rather, their benefits - increased safety, landscaping beauty - and their disadvantages - expropriation, hindrance - directly affect inhabitants. Therefore river managers are required to carefully defend their plans. The effect of engineering works on river dynamics is being evaluated using hydraulic river models. Two-dimensional numerical models based on the shallow water equations provide the predictions necessary to make decisions on designs and future plans. However, like all environmental models, these predictions are subject to uncertainty. In recent years progress has been made in the identification of the main sources of uncertainty for hydraulic river models. Two of the most important sources are boundary conditions and hydraulic roughness (Warmink et al. 2013). The result of these sources of uncertainty is that the identification of a single, deterministic prediction model is a non-trivial task. This is a well-understood problem in other fields as well - most notably hydrology - and is known as equifinality. However, the particular case of human intervention modelling with hydraulic river models compounds the equifinality case. The model that provides the reference baseline situation is usually identified through calibration and afterwards modified for the engineering intervention. This results in two distinct models, the evaluation of which yields the effect of the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that prediction performance loss should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which calibration no longer improves prediction. Therefore a qualification scheme for model use is required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different output or indicators) and modification (using modified models). Such use of models is expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis as well, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014
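
    As a hedged toy illustration of propagating roughness uncertainty into a predicted intervention effect (the analyses discussed above use 2D shallow-water models, not a hand formula), the sketch below samples Manning's n, computes normal-flow depth in a wide rectangular channel before and after a hypothetical widening, and reports the spread of the predicted water-level lowering. The channel geometry, discharge, slope, and roughness range are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(3)

        def normal_depth(Q, n, B, S):
            """Normal-flow depth for a wide rectangular channel (Manning, R ~ h)."""
            return (Q * n / (B * np.sqrt(S))) ** 0.6

        Q, S = 2500.0, 1.0e-4                 # design discharge (m3/s) and slope, assumed
        B_before, B_after = 350.0, 400.0      # width before/after a hypothetical widening (m)

        # Parametric uncertainty: Manning's n for the main channel, assumed range.
        n_samples = rng.uniform(0.025, 0.040, size=10_000)

        effect = (normal_depth(Q, n_samples, B_before, S)
                  - normal_depth(Q, n_samples, B_after, S))

        print(f"predicted lowering: mean {effect.mean():.3f} m, "
              f"95% interval [{np.percentile(effect, 2.5):.3f}, "
              f"{np.percentile(effect, 97.5):.3f}] m")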

  7. Advanced Packaging for VLSI/VHSIC (Very Large Scale Integrated Circuits/Very High Speed Integrated Circuits) Applications: Electrical, Thermal, and Mechanical Considerations - An IR&D Report.

    DTIC Science & Technology

    1987-11-01

    Design guidelines have been developed that can be used by circuit engineers to extract the maximum performance from the devices on various board technologies, including multilayer ceramic. Topics covered include attenuation and dispersion effects and the skin effect.

  8. Heavy hydrocarbon main injector technology

    NASA Technical Reports Server (NTRS)

    Fisher, S. C.; Arbit, H. A.

    1988-01-01

    One of the key components of the Advanced Launch System (ALS) is a large liquid-rocket booster engine. To keep the overall vehicle size and cost down, this engine will probably use liquid oxygen (LOX) and a heavy hydrocarbon, such as RP-1, as propellants and operate at relatively high chamber pressures to increase overall performance. A technology program (Heavy Hydrocarbon Main Injector Technology) is underway. The main objective of this effort is to develop a logic plan and supporting experimental data base to reduce the risk of developing a large scale (approximately 750,000 lb thrust), high performance main injector system. The overall approach and program plan, from initial analyses to large scale, two dimensional combustor design and test, and the current status of the program are discussed. Progress includes performance and stability analyses, cold flow tests of an injector model, design and fabrication of subscale injectors and calorimeter combustors for performance, heat transfer, and dynamic stability tests, and preparation of hot fire test plans. Related, current, high pressure, LOX/RP-1 injector technology efforts are also briefly discussed.

  9. Performance of the engineering analysis and data system 2 common file system

    NASA Technical Reports Server (NTRS)

    Debrunner, Linda S.

    1993-01-01

    The Engineering Analysis and Data System (EADS) was used from April 1986 to July 1993 to support large scale scientific and engineering computation (e.g. computational fluid dynamics) at Marshall Space Flight Center. The need for an updated system resulted in an RFP in June 1991, after which a contract was awarded to Cray Grumman. EADS II was installed in February 1993, and by July 1993 most users had been migrated. EADS II is a network of heterogeneous computer systems supporting scientific and engineering applications. The Common File System (CFS) is a key component of this system. The CFS provides a seamless, integrated environment to the users of EADS II including both disk and tape storage. UniTree software is used to implement this hierarchical storage management system. The performance of the CFS suffered during the early months of the production system. Several of the performance problems were traced to software bugs which have been corrected. Other problems were associated with hardware. However, the use of NFS in UniTree UCFM software limits the performance of the system. The performance issues related to the CFS have led to a need to develop a greater understanding of the CFS organization. This paper will first describe the EADS II with emphasis on the CFS. Then, a discussion of mass storage systems will be presented, and methods of measuring the performance of the Common File System will be outlined. Finally, areas for further study will be identified and conclusions will be drawn.

  10. An Analysis of Large-Scale Writing Assessments in Canada (Grades 5-8)

    ERIC Educational Resources Information Center

    Peterson, Shelley Stagg; McClay, Jill; Main, Kristin

    2011-01-01

    This paper reports on an analysis of large-scale assessments of Grades 5-8 students' writing across 10 provinces and 2 territories in Canada. Theory, classroom practice, and the contributions and constraints of large-scale writing assessment are brought together with a focus on Grades 5-8 writing in order to provide both a broad view of…

  11. Helicopter rotor and engine sizing for preliminary performance estimation

    NASA Technical Reports Server (NTRS)

    Talbot, P. D.; Bowles, J. V.; Lee, H. C.

    1986-01-01

    Methods are presented for estimating some of the more fundamental design variables of single-rotor helicopters (tip speed, blade area, disk loading, and installed power) based on design requirements (speed, weight, fuselage drag, and design hover ceiling). The well-known constraints of advancing-blade compressibility and retreating-blade stall are incorporated into the estimation process, based on an empirical interpretation of rotor performance data from large-scale wind-tunnel tests. Engine performance data are presented and correlated with a simple model usable for preliminary design. When approximate results are required quickly, these methods may be more convenient to use and provide more insight than large digital computer programs.
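
    As a hedged sketch of the first-order sizing relation such estimation methods start from (the paper's empirical corrections derived from large-scale wind-tunnel data are not reproduced here), the snippet below computes a momentum-theory hover power for an assumed gross weight, disk loading, and figure of merit; all numbers are illustrative.

        import math

        def hover_power_kw(gross_weight_kg, disk_loading_nm2, figure_of_merit=0.75, rho=1.225):
            """Momentum-theory hover power estimate at sea-level standard conditions.

            disk_loading_nm2 is thrust per unit disk area (N/m^2); the figure of merit
            folds induced and profile losses into the ideal actuator-disk result.
            """
            thrust_n = gross_weight_kg * 9.81
            disk_area = thrust_n / disk_loading_nm2
            induced_velocity = math.sqrt(thrust_n / (2.0 * rho * disk_area))
            p_ideal_w = thrust_n * induced_velocity
            return p_ideal_w / figure_of_merit / 1000.0

        # Illustrative single-rotor helicopter: numbers are assumptions, not from the paper.
        print(f"estimated hover power: {hover_power_kw(5000.0, 350.0):.0f} kW")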

  12. Software environment for implementing engineering applications on MIMD computers

    NASA Technical Reports Server (NTRS)

    Lopez, L. A.; Valimohamed, K. A.; Schiff, S.

    1990-01-01

    In this paper the concept for a software environment for developing engineering application systems for multiprocessor hardware (MIMD) is presented. The philosophy employed is to solve the largest problems possible in a reasonable amount of time, rather than solve existing problems faster. In the proposed environment most of the problems concerning parallel computation and handling of large distributed data spaces are hidden from the application program developer, thereby facilitating the development of large-scale software applications. Applications developed under the environment can be executed on a variety of MIMD hardware; it protects the application software from the effects of a rapidly changing MIMD hardware technology.

  13. Combining Flux Balance and Energy Balance Analysis for Large-Scale Metabolic Network: Biochemical Circuit Theory for Analysis of Large-Scale Metabolic Networks

    NASA Technical Reports Server (NTRS)

    Beard, Daniel A.; Liang, Shou-Dan; Qian, Hong; Biegel, Bryan (Technical Monitor)

    2001-01-01

    Predicting behavior of large-scale biochemical metabolic networks represents one of the greatest challenges of bioinformatics and computational biology. Approaches, such as flux balance analysis (FBA), that account for the known stoichiometry of the reaction network while avoiding implementation of detailed reaction kinetics are perhaps the most promising tools for the analysis of large complex networks. As a step towards building a complete theory of biochemical circuit analysis, we introduce energy balance analysis (EBA), which complements the FBA approach by introducing fundamental constraints based on the first and second laws of thermodynamics. Fluxes obtained with EBA are thermodynamically feasible and provide valuable insight into the activation and suppression of biochemical pathways.
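
    As a hedged, minimal sketch of the FBA formulation referenced above (steady-state stoichiometry S·v = 0, flux bounds, and a linear objective), the example below solves an invented three-reaction toy network with SciPy's linprog; the thermodynamic (EBA) constraints discussed in the abstract are not included, and the network is purely illustrative.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network (invented for illustration):
        #   v1: A_ext -> A   (uptake, capped at 10)
        #   v2: A     -> B
        #   v3: B     -> products (objective flux)
        # Internal metabolites A and B must be at steady state: S @ v = 0.
        S = np.array([[1, -1,  0],    # metabolite A
                      [0,  1, -1]])   # metabolite B
        b_eq = np.zeros(2)

        bounds = [(0, 10), (0, None), (0, None)]   # flux bounds per reaction
        c = np.array([0.0, 0.0, -1.0])             # maximize v3 == minimize -v3

        res = linprog(c, A_eq=S, b_eq=b_eq, bounds=bounds, method="highs")
        print("optimal fluxes:", res.x)            # expected: [10, 10, 10]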

  14. Toward server-side, high performance climate change data analytics in the Earth System Grid Federation (ESGF) eco-system

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Williams, Dean; Aloisio, Giovanni

    2016-04-01

    In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5 PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5)). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide workflow support, to enable the definition of complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background of high performance database management and OLAP systems to manage large scientific data sets. It also provides native workflow management support, to define processing chains and workflows with tens to hundreds of data analytics operators to build real scientific use cases. With regard to interoperability aspects, the talk will present the contribution provided both to the RDA Working Group on Array Databases, and the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large scale climate model intercomparison data analysis experiments, for example: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as a server-side, parallel analytics engine; and (4) applied on real CMIP5 data sets available through ESGF.

  15. Work and power fluctuations in a critical heat engine.

    PubMed

    Holubec, Viktor; Ryabov, Artem

    2017-09-01

    We investigate fluctuations of output work for a class of Stirling heat engines with working fluid composed of interacting units and compare these fluctuations to an average work output. In particular, we focus on engine performance close to a critical point where Carnot's efficiency may be attained at a finite power as reported by M. Campisi and R. Fazio [Nat. Commun. 7, 11895 (2016)2041-172310.1038/ncomms11895]. We show that the variance of work output per cycle scales with the same critical exponent as the heat capacity of the working fluid. As a consequence, the relative work fluctuation diverges unless the output work obeys a rather strict scaling condition, which would be very hard to fulfill in practice. Even under this condition, the fluctuations of work and power do not vanish in the infinite system size limit. Large fluctuations of output work thus constitute inseparable and dominant element in performance of the macroscopic heat engines close to a critical point.

  16. Work and power fluctuations in a critical heat engine

    NASA Astrophysics Data System (ADS)

    Holubec, Viktor; Ryabov, Artem

    2017-09-01

    We investigate fluctuations of output work for a class of Stirling heat engines with working fluid composed of interacting units and compare these fluctuations to an average work output. In particular, we focus on engine performance close to a critical point where Carnot's efficiency may be attained at a finite power as reported by M. Campisi and R. Fazio [Nat. Commun. 7, 11895 (2016), 10.1038/ncomms11895]. We show that the variance of work output per cycle scales with the same critical exponent as the heat capacity of the working fluid. As a consequence, the relative work fluctuation diverges unless the output work obeys a rather strict scaling condition, which would be very hard to fulfill in practice. Even under this condition, the fluctuations of work and power do not vanish in the infinite system size limit. Large fluctuations of output work thus constitute inseparable and dominant element in performance of the macroscopic heat engines close to a critical point.

  17. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provides seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high-data-rate instruments, and an exploratory grid environment.

  18. Novel Membranes and Systems for Industrial and Municipal Water Purification and Reuse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This factsheet describes a project that developed nano-engineered, high-permeance membrane materials with more than double the permeance of current reverse osmosis membranes as well as manufacturing technologies for large-scale production of the novel materials.

  19. Infrapatellar fat pad-derived stem cells maintain their chondrogenic capacity in disease and can be used to engineer cartilaginous grafts of clinically relevant dimensions.

    PubMed

    Liu, Yurong; Buckley, Conor Timothy; Almeida, Henrique V; Mulhall, Kevin J; Kelly, Daniel John

    2014-11-01

    A therapy for regenerating large cartilaginous lesions within the articular surface of osteoarthritic joints remains elusive. While tissue engineering strategies such as matrix-assisted autologous chondrocyte implantation can be used in the repair of focal cartilage defects, extending such approaches to the treatment of osteoarthritis will require a number of scientific and technical challenges to be overcome. These include the identification of an abundant source of chondroprogenitor cells that maintain their chondrogenic capacity in disease, as well as the development of novel approaches to engineer scalable cartilaginous grafts that could be used to resurface large areas of damaged joints. In this study, it is first demonstrated that infrapatellar fat pad-derived stem cells (FPSCs) isolated from osteoarthritic (OA) donors possess a comparable chondrogenic capacity to FPSCs isolated from patients undergoing ligament reconstruction. In a further validation of their functionality, we also demonstrate that FPSCs from OA donors respond to the application of physiological levels of cyclic hydrostatic pressure by increasing aggrecan gene expression and the production of sulfated glycosaminoglycans. We next explored whether cartilaginous grafts could be engineered with diseased human FPSCs using a self-assembly or scaffold-free approach. After examining a range of culture conditions, it was found that continuous supplementation with both transforming growth factor-β3 (TGF-β3) and bone morphogenic protein-6 (BMP-6) promoted the development of tissues rich in proteoglycans and type II collagen. The final phase of the study sought to scale-up this approach to engineer cartilaginous grafts of clinically relevant dimensions (≥2 cm in diameter) by assembling FPSCs onto electrospun PLLA fiber membranes. Over 6 weeks in culture, it was possible to generate robust, flexible cartilage-like grafts of scale, opening up the possibility that tissues engineered using FPSCs derived from OA patients could potentially be used to resurface large areas of joint surfaces damaged by trauma or disease.

  20. Infrapatellar Fat Pad-Derived Stem Cells Maintain Their Chondrogenic Capacity in Disease and Can be Used to Engineer Cartilaginous Grafts of Clinically Relevant Dimensions

    PubMed Central

    Liu, Yurong; Buckley, Conor Timothy; Almeida, Henrique V.; Mulhall, Kevin J.

    2014-01-01

    A therapy for regenerating large cartilaginous lesions within the articular surface of osteoarthritic joints remains elusive. While tissue engineering strategies such as matrix-assisted autologous chondrocyte implantation can be used in the repair of focal cartilage defects, extending such approaches to the treatment of osteoarthritis will require a number of scientific and technical challenges to be overcome. These include the identification of an abundant source of chondroprogenitor cells that maintain their chondrogenic capacity in disease, as well as the development of novel approaches to engineer scalable cartilaginous grafts that could be used to resurface large areas of damaged joints. In this study, it is first demonstrated that infrapatellar fat pad-derived stem cells (FPSCs) isolated from osteoarthritic (OA) donors possess a comparable chondrogenic capacity to FPSCs isolated from patients undergoing ligament reconstruction. In a further validation of their functionality, we also demonstrate that FPSCs from OA donors respond to the application of physiological levels of cyclic hydrostatic pressure by increasing aggrecan gene expression and the production of sulfated glycosaminoglycans. We next explored whether cartilaginous grafts could be engineered with diseased human FPSCs using a self-assembly or scaffold-free approach. After examining a range of culture conditions, it was found that continuous supplementation with both transforming growth factor-β3 (TGF-β3) and bone morphogenic protein-6 (BMP-6) promoted the development of tissues rich in proteoglycans and type II collagen. The final phase of the study sought to scale-up this approach to engineer cartilaginous grafts of clinically relevant dimensions (≥2 cm in diameter) by assembling FPSCs onto electrospun PLLA fiber membranes. Over 6 weeks in culture, it was possible to generate robust, flexible cartilage-like grafts of scale, opening up the possibility that tissues engineered using FPSCs derived from OA patients could potentially be used to resurface large areas of joint surfaces damaged by trauma or disease. PMID:24785365

  1. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    PubMed

    Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
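
    To make the locality-sensitive-hashing idea evaluated above concrete, the sketch below builds MinHash signatures over query token sets and bands them so that similar queries tend to land in the same bucket. It is a minimal single-machine illustration, not the paper's Hadoop-based implementation, and all parameters and queries are made up.

        # MinHash + banding LSH sketch for finding similar queries (illustrative only).
        import hashlib
        from collections import defaultdict

        NUM_HASHES = 32   # signature length
        BANDS = 16        # BANDS * ROWS == NUM_HASHES
        ROWS = NUM_HASHES // BANDS

        def token_set(query):
            return set(query.lower().split())

        def minhash_signature(tokens):
            # One salted hash per signature position; the min over tokens approximates Jaccard similarity.
            return [min(int(hashlib.md5(f"{i}:{t}".encode()).hexdigest(), 16) for t in tokens)
                    for i in range(NUM_HASHES)]

        def lsh_buckets(queries):
            buckets = defaultdict(set)
            for q in queries:
                sig = minhash_signature(token_set(q))
                for b in range(BANDS):
                    band = tuple(sig[b * ROWS:(b + 1) * ROWS])
                    buckets[(b, band)].add(q)
            return buckets

        queries = ["cheap flights to paris", "cheap flights to paris france",
                   "weather in london", "london weather forecast"]
        # With short bands, near-duplicate queries almost surely share at least one bucket.
        for key, group in lsh_buckets(queries).items():
            if len(group) > 1:
                print("candidate pair bucket:", group)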

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryu, Jun-hyung

    University education aims to supply qualified human resources for industry. In complex large-scale engineering systems such as nuclear power plants, the importance of qualified human resources cannot be overstated, and the corresponding education program should cover many topics systematically. Recently a nuclear engineering program has been initiated at Dongguk University, South Korea. The current education program focuses on undergraduate-level nuclear engineering students. Our main objective is to provide industry with fresh engineers who understand the interconnection between local parts and the entire systems of nuclear power plants and the associated systems. From this experience, there is a huge opportunity for the chemical engineering discipline to give a macroscopic overview of nuclear power plant and waste treatment management by strengthening the capability to analyze fundamental situations. (authors)

  3. Advanced Connectivity Analysis (ACA): a Large Scale Functional Connectivity Data Mining Environment.

    PubMed

    Chen, Rong; Nixon, Erika; Herskovits, Edward

    2016-04-01

    Using resting-state functional magnetic resonance imaging (rs-fMRI) to study functional connectivity is of great importance to understand normal development and function as well as a host of neurological and psychiatric disorders. Seed-based analysis is one of the most widely used rs-fMRI analysis methods. Here we describe a freely available large scale functional connectivity data mining software package called Advanced Connectivity Analysis (ACA). ACA enables large-scale seed-based analysis and brain-behavior analysis. It can seamlessly examine a large number of seed regions with minimal user input. ACA has a brain-behavior analysis component to delineate associations among imaging biomarkers and one or more behavioral variables. We demonstrate applications of ACA to rs-fMRI data sets from a study of autism.

  4. Commercial use of remote sensing in agriculture: a case study

    NASA Astrophysics Data System (ADS)

    Gnauck, Gary E.

    1999-12-01

    Over 25 years of research have clearly shown that an analysis of remote sensing imagery can provide information on agricultural crops. Most of this research has been funded by and directed toward the needs of government agencies. Commercial use of agricultural remote sensing has been limited to very small-scale operations supplying remote sensing services to a few selected customers. Datron/Transco Inc. undertook an internally funded remote sensing program directed toward the California cash crop industry (strawberries, lettuce, tomatoes, other fresh vegetables and cotton). The objectives of this program were twofold: (1) to assess the need and readiness of agricultural land managers to adopt remote sensing as a management tool, and (2) to determine what technical barriers exist to large-scale implementation of this technology on a commercial basis. The program was divided into three phases: Planning, Engineering Test and Evaluation, and Commercial Operations. Findings: Remote sensing technology can deliver high resolution multispectral imagery with rapid turnaround, providing information on crop stress, insects, disease and various soil parameters. The limiting factors to the use of remote sensing in agriculture are a lack of familiarization by the land managers, difficulty in translating 'information' into increased revenue or reduced cost for the land manager, and the large economies of scale needed to make the venture commercially viable.

  5. Space shuttle maneuvering engine reusable thrust chamber program. Task 11: Stability analyses and acoustic model testing data dump

    NASA Technical Reports Server (NTRS)

    Oberg, C. L.

    1974-01-01

    The combustion stability characteristics of engines applicable to the Space Shuttle Orbit Maneuvering System and the adequacy of acoustic cavities as a means of assuring stability in these engines were investigated. The study comprised full-scale stability rating tests, bench-scale acoustic model tests and analysis. Two series of stability rating tests were made. Acoustic model tests were made to determine the resonance characteristics and effects of acoustic cavities. Analytical studies were done to aid design of the cavity configurations to be tested and, also, to aid evaluation of the effectiveness of acoustic cavities from available test results.

  6. Linear Spectral Analysis of Plume Emissions Using an Optical Matrix Processor

    NASA Technical Reports Server (NTRS)

    Gary, C. K.

    1992-01-01

    Plume spectrometry provides a means to monitor the health of a burning rocket engine, and optical matrix processors provide a means to analyze the plume spectra in real time. By observing the spectrum of the exhaust plume of a rocket engine, researchers have detected anomalous behavior of the engine and have even determined the failure of some equipment before it would normally have been noticed. The spectrum of the plume is analyzed by isolating information in the spectrum about the various materials present to estimate what materials are being burned in the engine. Scientists at the Marshall Space Flight Center (MSFC) have implemented a high resolution spectrometer to discriminate the spectral peaks of the many species present in the plume. Researchers at the Stennis Space Center Demonstration Testbed Facility (DTF) have implemented a high resolution spectrometer observing a 1200-lb. thrust engine. At this facility, known concentrations of contaminants can be introduced into the burn, allowing for the confirmation of diagnostic algorithms. While the high resolution of the measured spectra has allowed greatly increased insight into the functioning of the engine, the large data flows generated limit the ability to perform real-time processing. The use of an optical matrix processor and the linear analysis technique described below may allow for detailed real-time analysis of the engine's health. A small optical matrix processor can perform the required mathematical analysis both more quickly and with less energy than a large electronic computer dedicated to the same spectral analysis routine.
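
    The "linear analysis" referred to above amounts to expressing the measured plume spectrum as a linear combination of known species spectra and solving for the mixing coefficients, exactly the matrix-vector arithmetic an optical matrix processor accelerates. The toy example below performs the same estimation with an ordinary least-squares solve; the reference spectra and measurement are synthetic, not MSFC or DTF data.

        # Toy linear spectral unmixing: measured spectrum ~ A @ concentrations,
        # where the columns of A are reference spectra of individual species.
        import numpy as np

        rng = np.random.default_rng(0)
        n_wavelengths, n_species = 200, 3

        # Synthetic reference spectra, one column per species.
        A = np.abs(rng.normal(size=(n_wavelengths, n_species)))

        true_conc = np.array([0.8, 0.1, 0.3])
        measured = A @ true_conc + 0.01 * rng.normal(size=n_wavelengths)  # noisy plume spectrum

        # Least-squares estimate of the species concentrations.
        est_conc, *_ = np.linalg.lstsq(A, measured, rcond=None)
        print("estimated concentrations:", np.round(est_conc, 3))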

  7. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    NASA Technical Reports Server (NTRS)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  8. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
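
    A minimal sketch of the wavelet step described above: decomposing a hemodynamic trend into a small set of coefficients that can serve as compact descriptors of its dynamics, here with the PyWavelets library. The signal, wavelet choice, and selection of coefficients are illustrative assumptions, not the authors' MySQL schema or query framework.

        # Compress a parameter trend into wavelet coefficients acting as compact descriptors.
        import numpy as np
        import pywt

        # Synthetic trend: slow oscillation plus an abrupt hemodynamic "event".
        t = np.linspace(0, 1, 512)
        trend = 80 + 5 * np.sin(2 * np.pi * t)
        trend[300:] -= 15                      # sudden drop, the event of interest

        # Multi-level discrete wavelet transform.
        coeffs = pywt.wavedec(trend, "db4", level=4)

        # Keep the approximation and coarsest detail coefficients as stored descriptors.
        descriptors = np.concatenate(coeffs[:2])
        print("original samples:", trend.size, "stored descriptors:", descriptors.size)

        # Large coarse-scale detail coefficients flag candidate events.
        print("max coarse detail magnitude:", round(float(np.abs(coeffs[1]).max()), 2))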

  9. Hydrogen Safety Issues Compared to Safety Issues with Methane and Propane

    NASA Astrophysics Data System (ADS)

    Green, M. A.

    2006-04-01

    The hydrogen economy is not possible if the safety standards currently applied to liquid hydrogen and hydrogen gas by many laboratories are applied to devices that use either liquid or gaseous hydrogen. Methane and propane are commonly used by ordinary people without special training. This report asks, "How is hydrogen different from flammable gases that are commonly being used all over the world?" This report compares the properties of hydrogen, methane and propane and how these properties may relate to safety when they are used in both the liquid and gaseous state. Through such an analysis, sensible safety standards for the large-scale (or even small-scale) use of liquid and gaseous hydrogen systems can be developed. This paper is meant to promote discussion of issues related to hydrogen safety so that engineers designing equipment can factor sensible safety standards into their designs.

  10. Hydrogen Safety Issues Compared to Safety Issues with Methane andPropane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Michael A.

    The hydrogen economy is not possible if the safety standards currently applied to liquid hydrogen and hydrogen gas by many laboratories are applied to devices that use either liquid or gaseous hydrogen. Methane and propane are commonly used by ordinary people without special training. This report asks, 'How is hydrogen different from flammable gases that are commonly being used all over the world?' This report compares the properties of hydrogen, methane and propane and how these properties may relate to safety when they are used in both the liquid and gaseous state. Through such an analysis, sensible safety standards for the large-scale (or even small-scale) use of liquid and gaseous hydrogen systems can be developed. This paper is meant to promote discussion of issues related to hydrogen safety so that engineers designing equipment can factor sensible safety standards into their designs.

  11. The mechanical behavior of nanoscale metallic multilayers: A survey

    NASA Astrophysics Data System (ADS)

    Zhou, Q.; Xie, J. Y.; Wang, F.; Huang, P.; Xu, K. W.; Lu, T. J.

    2015-06-01

    The mechanical behavior of nanoscale metallic multilayers (NMMs) has attracted much attention from both scientific and practical viewpoints. Compared with their monolithic counterparts, the large number of interfaces existing in NMMs dictates the unique behavior of this special class of structural composite materials. While there have been a number of reviews on the mechanical mechanisms of microlaminates, the rapid development of nanotechnology has brought a pressing need for an overview focusing exclusively on a property-based definition of NMMs, especially their size-dependent microstructure and mechanical performance. This article attempts to provide a comprehensive and up-to-date review on the microstructure, mechanical properties and plastic deformation physics of NMMs. We hope this review accomplishes two purposes: (1) introducing the basic concepts of scaling and dimensional analysis to scientists and engineers working on NMM systems, and (2) providing a better understanding of interface behavior and the exceptional qualities the interfaces in NMMs display at the atomic scale.

  12. Increasing Scalability of Researcher Network Extraction from the Web

    NASA Astrophysics Data System (ADS)

    Asada, Yohei; Matsuo, Yutaka; Ishizuka, Mitsuru

    Social networks, which describe relations among people or organizations as a network, have recently attracted attention. With the help of a social network, we can analyze the structure of a community and thereby promote efficient communications within it. We investigate the problem of extracting a network of researchers from the Web, to assist efficient cooperation among researchers. Our method uses a search engine to get the co-occurrences of the names of two researchers and calculates the strength of the relation between them. Then we label the relation by analyzing the Web pages in which these two names co-occur. Research on social network extraction using search engines, such as ours, is attracting attention in Japan as well as abroad. However, the former approaches issue too many queries to search engines to extract a large-scale network. In this paper, we propose a method that filters superfluous queries and facilitates the extraction of large-scale networks. With this method we are able to extract a network of around 3000 nodes. Our experimental results show that the proposed method reduces the number of queries significantly while preserving the quality of the network as compared to former methods.
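
    As a sketch of the relation-strength step described above: given search-engine hit counts for each researcher's name and for the pair of names, simple co-occurrence measures such as the Jaccard or overlap coefficient give the strength of the edge. The hit counts below are invented and the query-issuing function is a stub; reducing the number of such queries is exactly what the proposed filtering method addresses.

        # Estimate relation strength between two researchers from hit counts (illustrative).
        def hits(query):
            fake_counts = {  # made-up counts standing in for real search-engine responses
                '"A. Tanaka"': 1200,
                '"B. Suzuki"': 800,
                '"A. Tanaka" "B. Suzuki"': 90,
            }
            return fake_counts.get(query, 0)

        def relation_strength(name_a, name_b):
            na = hits(f'"{name_a}"')
            nb = hits(f'"{name_b}"')
            nab = hits(f'"{name_a}" "{name_b}"')
            if na == 0 or nb == 0:
                return 0.0, 0.0
            jaccard = nab / (na + nb - nab)      # co-occurrence relative to either name
            overlap = nab / min(na, nb)          # co-occurrence relative to the rarer name
            return jaccard, overlap

        print(relation_strength("A. Tanaka", "B. Suzuki"))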

  13. Preparing university students to lead K-12 engineering outreach programmes: a design experiment

    NASA Astrophysics Data System (ADS)

    Anthony, Anika B.; Greene, Howard; Post, Paul E.; Parkhurst, Andrew; Zhan, Xi

    2016-11-01

    This paper describes an engineering outreach programme designed to increase the interest of under-represented youth in engineering and to disseminate pre-engineering design challenge materials to K-12 educators and volunteers. Given university students' critical role as facilitators of the outreach programme, researchers conducted a two-year design experiment to examine the programme's effectiveness at preparing university students to lead pre-engineering activities. Pre- and post-surveys incorporated items from the Student Engagement sub-scale of the Teacher Sense of Efficacy Scale. Surveys were analysed using paired-samples t-test. Interview and open-ended survey data were analysed using discourse analysis and the constant comparative method. As a result of participation in the programme, university students reported a gain in efficacy to lead pre-engineering activities. The paper discusses programme features that supported efficacy gains and concludes with a set of design principles for developing learning environments that effectively prepare university students to facilitate pre-engineering outreach programmes.

  14. Modelling the effects and economics of managed realignment on the cycling and storage of nutrients, carbon and sediments in the Blackwater estuary UK

    NASA Astrophysics Data System (ADS)

    Shepherd, D.; Burgess, D.; Jickells, T.; Andrews, J.; Cave, R.; Turner, R. K.; Aldridge, J.; Parker, E. R.; Young, E.

    2007-07-01

    A hydrodynamic model is developed for the Blackwater estuary (UK) and used to estimate nitrate removal by denitrification. Using the model, sediment analysis and estimates of sedimentation rates, we estimate changes in estuarine denitrification and intertidal carbon and nutrient storage and associated value of habitat created under a scenario of extensive managed realignment. We then use this information, together with engineering and land costs, to conduct a cost benefit analysis of the managed realignment. This demonstrates that over a 50-100 year timescale the value of the habitat created and carbon buried is sufficient to make the large scale managed realignment cost effective. The analysis reveals that carbon and nutrient storage plus habitat creation represent major and quantifiable benefits of realignment. The methodology described here can be readily transferred to other coastal systems.

  15. Status of the Combustion Devices Injector Technology Program at the NASA MSFC

    NASA Technical Reports Server (NTRS)

    Jones, Gregg; Protz, Christopher; Trinh, Huu; Tucker, Kevin; Nesman, Tomas; Hulka, James

    2005-01-01

    To support the NASA Space Exploration Mission, an in-house program called Combustion Devices Injector Technology (CDIT) is being conducted at the NASA Marshall Space Flight Center (MSFC) for the fiscal year 2005. CDIT is focused on developing combustor technology and analysis tools to improve reliability and durability of upper-stage and in-space liquid propellant rocket engines. The three areas of focus include injector/chamber thermal compatibility, ignition, and combustion stability. In the compatibility and ignition areas, small-scale single- and multi-element hardware experiments will be conducted to demonstrate advanced technological concepts as well as to provide experimental data for validation of computational analysis tools. In addition, advanced analysis tools will be developed to eventually include 3-dimensional and multi- element effects and improve capability and validity to analyze heat transfer and ignition in large, multi-element injectors.

  16. Exometabolome analysis reveals hypoxia at the up-scaling of a Saccharomyces cerevisiae high-cell density fed-batch biopharmaceutical process

    PubMed Central

    2014-01-01

    Background: Scale-up to industrial production level of a fermentation process occurs after optimization at small scale, a critical transition for successful technology transfer and commercialization of a product of interest. At the large scale a number of important bioprocess engineering problems arise that should be taken into account to match the values obtained at the small scale and achieve the highest productivity and quality possible. However, the changes of the host strain’s physiological and metabolic behavior in response to the scale transition are still not clear. Results: Heterogeneity in substrate and oxygen distribution is an inherent factor at industrial scale (10,000 L) which affects the success of process up-scaling. To counteract these detrimental effects, changes in dissolved oxygen and pressure set points and addition of diluents were applied to 10,000 L scale to enable a successful process scale-up. A comprehensive semi-quantitative and time-dependent analysis of the exometabolome was performed to understand the impact of the scale-up on the metabolic/physiological behavior of the host microorganism. Intermediates from central carbon catabolism and mevalonate/ergosterol synthesis pathways were found to accumulate in both the 10 L and 10,000 L scale cultures in a time-dependent manner. Moreover, excreted metabolites analysis revealed that hypoxic conditions prevailed at the 10,000 L scale. The specific product yield increased at the 10,000 L scale, in spite of metabolic stress and catabolic-anabolic uncoupling unveiled by the decrease in biomass yield on consumed oxygen. Conclusions: An optimized S. cerevisiae fermentation process was successfully scaled-up to an industrial scale bioreactor. The oxygen uptake rate (OUR) and overall growth profiles were matched between scales. The major remaining differences between scales were wet cell weight and culture apparent viscosity. The metabolic and physiological behavior of the host microorganism at the 10,000 L scale was investigated with exometabolomics, indicating that reduced oxygen availability affected oxidative phosphorylation cascading into down- and up-stream pathways producing overflow metabolism. Our study revealed striking metabolic and physiological changes in response to hypoxia exerted by industrial bioprocess up-scaling. PMID:24593159

  17. Department of Defense Laboratory Civilian Science and Engineering Workforce - 2013

    DTIC Science & Technology

    2013-10-01

    was completed by the Institute for Defense Analysis (IDA) in 2009, with an update prepared by the Defense Laboratories Office (DLO) in 2011. By ... demographics will emerge giving decision- and policy-makers greater clarity about the impacts of budgets and macro-scale policies on this important ... [The remainder of the snippet is a spilled table of federal occupational series codes for laboratory science and engineering fields, including Physiology, Entomology, Toxicology, Environmental Engineering, Mechanical Engineering, Food Technology, and Textile Technology.]

  18. Investigations of Air-cooled Turbine Rotors for Turbojet Engines II : Mechanical Design, Stress Analysis, and Burst Test of Modified J33 Split-disk Rotor / Richard H. Kemp and Merland L. Moseson

    NASA Technical Reports Server (NTRS)

    Kemp, Richard H; Moseson, Merland L

    1952-01-01

    A full-scale J33 air-cooled split turbine rotor was designed, analyzed for stress, and spin-pit tested to destruction. Operation of the rotor in a J33 turbojet engine, however, showed that the rear disk of the rotor operated at temperatures substantially higher than the forward disk. An extension of the stress analysis to include the temperature difference between the two disks indicated that engine modifications are required to permit operation of the two disks at more nearly the same temperature level.

  19. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework

    PubMed Central

    2012-01-01

    Background: For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. Results: We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. Conclusion: The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources. PMID:23216909

  20. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework.

    PubMed

    Lewis, Steven; Csordas, Attila; Killcoyne, Sarah; Hermjakob, Henning; Hoopmann, Michael R; Moritz, Robert L; Deutsch, Eric W; Boyle, John

    2012-12-05

    For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources.
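
    The MapReduce structure that Hydra relies on can be pictured with a small local sketch: map each spectrum to candidate (peptide, score) pairs, shuffle by spectrum, then reduce to keep the best-scoring match per spectrum. The data and the scoring function below are placeholders; they stand in for, but do not reproduce, the K-score algorithm or Hydra's actual Hadoop job layout.

        # Conceptual map -> shuffle -> reduce for spectrum-to-peptide matching, run locally.
        from collections import defaultdict

        def mapper(spectrum, peptide_db):
            # Emit (spectrum_id, (peptide, score)) for every candidate within a mass tolerance.
            for peptide, mass in peptide_db:
                if abs(mass - spectrum["precursor_mass"]) < 0.5:       # naive candidate filter
                    score = -abs(mass - spectrum["precursor_mass"])    # placeholder score
                    yield spectrum["id"], (peptide, score)

        def reducer(spectrum_id, candidates):
            # Keep the best-scoring peptide for each spectrum.
            return spectrum_id, max(candidates, key=lambda c: c[1])

        peptide_db = [("PEPTIDE", 799.4), ("PROTEIN", 856.4), ("SEARCHK", 848.5)]
        spectra = [{"id": "s1", "precursor_mass": 799.6},
                   {"id": "s2", "precursor_mass": 848.3}]

        grouped = defaultdict(list)            # the "shuffle" phase: group mapper output by key
        for spectrum in spectra:
            for key, value in mapper(spectrum, peptide_db):
                grouped[key].append(value)

        for spectrum_id, candidates in grouped.items():
            print(reducer(spectrum_id, candidates))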

  1. Copernicus Big Data and Google Earth Engine for Glacier Surface Velocity Field Monitoring: Feasibility Demonstration on San Rafael and San Quintin Glaciers

    NASA Astrophysics Data System (ADS)

    Di Tullio, M.; Nocchi, F.; Camplani, A.; Emanuelli, N.; Nascetti, A.; Crespi, M.

    2018-04-01

    Glaciers are a natural global resource and one of the principal climate change indicators at global and local scale, being influenced by temperature and snow precipitation changes. Among the parameters used for glacier monitoring, the surface velocity is a key element, since it is connected to glacier changes (mass balance, hydro balance, glacier stability, landscape erosion). The leading idea of this work is to continuously retrieve glacier surface velocity using free ESA Sentinel-1 SAR imagery and exploiting the potential of the Google Earth Engine (GEE) platform. GEE has been recently released by Google as a platform for petabyte-scale scientific analysis and visualization of geospatial datasets. The SAR offset-tracking algorithm developed at the Geodesy and Geomatics Division of the University of Rome La Sapienza has been integrated into a cloud-based platform that automatically processes large stacks of Sentinel-1 data to retrieve glacier surface velocity field time series. We processed about 600 Sentinel-1 image pairs to obtain a continuous time series of velocity field measurements over 3 years, from January 2015 to January 2018, for two wide glaciers located in the Northern Patagonian Ice Field (NPIF), the San Rafael and the San Quintin glaciers. Several results related to these glaciers, also validated against already available and well-known software (i.e., ESA SNAP, CIAS) and against optical sensor measurements (i.e., LANDSAT 8), highlight the potential of Big Data analysis to automatically monitor glacier surface velocity fields at global scale, exploiting the synergy between GEE and Sentinel-1 imagery.
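
    A minimal sketch of the data-selection step in the Google Earth Engine Python API: assembling a Sentinel-1 GRD stack over a glacier for the study period. The bounding box is a rough placeholder for the Northern Patagonian Ice Field, and the offset-tracking algorithm itself (developed at La Sapienza) is not part of this snippet.

        # Select a Sentinel-1 GRD image stack over a glacier area with the Earth Engine Python API.
        import ee

        ee.Initialize()

        # Rough, placeholder bounding box over the Northern Patagonian Ice Field.
        roi = ee.Geometry.Rectangle([-73.9, -46.9, -73.2, -46.3])

        s1 = (ee.ImageCollection("COPERNICUS/S1_GRD")
              .filterBounds(roi)
              .filterDate("2015-01-01", "2018-01-01")
              .filter(ee.Filter.eq("instrumentMode", "IW"))
              .filter(ee.Filter.listContains("transmitterReceiverPolarisation", "HH"))
              .select("HH"))

        print("Number of Sentinel-1 scenes:", s1.size().getInfo())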

  2. A method to estimate weight and dimensions of large and small gas turbine engines

    NASA Technical Reports Server (NTRS)

    Onat, E.; Klees, G. W.

    1979-01-01

    A computerized method was developed to estimate the weight and envelope dimensions of large and small gas turbine engines to within ±5% to 10%. The method is based on correlations of component weight and design features of 29 data base engines. Rotating components were estimated by a preliminary design procedure which is sensitive to blade geometry, operating conditions, material properties, shaft speed, hub-tip ratio, etc. The development and justification of the method selected, and the various methods of analysis, are discussed.

  3. Laureates

    Science.gov Websites

    [Fragmentary search snippets from a laureates listing page; the recoverable references are to multi-scale observing systems deployed under challenging field conditions to document unexpectedly large soil CO2 ..., the Building Technology and Urban Systems Division's Retro-commissioning Sensor work, and synthetic biology providing novel approaches for crop engineering to support Berkeley Lab and DOE.]

  4. Computation and Theory in Large-Scale Optimization

    DTIC Science & Technology

    1993-01-13

    Personnel listed include Sang Jin Lee, Laura Morley, and Yonca A. Ozge (research assistants), Stephen M. Robinson (professor), and Hichem ... other participants. M.N. Azadez, S.J. Lee, Y.A. Ozge, and H. Sellami are continuing students in the doctoral program (in Industrial Engineering except ...

  5. Externally blown flap noise research

    NASA Technical Reports Server (NTRS)

    Dorsch, R. G.

    1974-01-01

    The Lewis Research Center cold-flow model externally blown flap (EBF) noise research test program is summarized. Both engine under-the-wing and over-the-wing EBF wing section configurations were studied. Ten large scale and nineteen small scale EBF models were tested. A limited number of forward airspeed effect and flap noise suppression tests were also run. The key results and conclusions drawn from the flap noise tests are summarized and discussed.

  6. pGlyco 2.0 enables precision N-glycoproteomics with comprehensive quality control and one-step mass spectrometry for intact glycopeptide identification.

    PubMed

    Liu, Ming-Qi; Zeng, Wen-Feng; Fang, Pan; Cao, Wei-Qian; Liu, Chao; Yan, Guo-Quan; Zhang, Yang; Peng, Chao; Wu, Jian-Qiang; Zhang, Xiao-Jin; Tu, Hui-Jun; Chi, Hao; Sun, Rui-Xiang; Cao, Yong; Dong, Meng-Qiu; Jiang, Bi-Yun; Huang, Jiang-Ming; Shen, Hua-Li; Wong, Catherine C L; He, Si-Min; Yang, Peng-Yuan

    2017-09-05

    The precise and large-scale identification of intact glycopeptides is a critical step in glycoproteomics. Owing to the complexity of glycosylation, the current overall throughput, data quality and accessibility of intact glycopeptide identification lag behind those in routine proteomic analyses. Here, we propose a workflow for the precise high-throughput identification of intact N-glycopeptides at the proteome scale using stepped-energy fragmentation and a dedicated search engine. pGlyco 2.0 conducts comprehensive quality control including false discovery rate evaluation at all three levels of matches to glycans, peptides and glycopeptides, improving the current level of accuracy of intact glycopeptide identification. The N-glycoproteome of samples metabolically labeled with 15N/13C was analyzed quantitatively and utilized to validate the glycopeptide identification, which could be used as a novel benchmark pipeline to compare different search engines. Finally, we report a large-scale glycoproteome dataset consisting of 10,009 distinct site-specific N-glycans on 1988 glycosylation sites from 955 glycoproteins in five mouse tissues. Protein glycosylation is a heterogeneous post-translational modification that generates great proteomic diversity that is difficult to analyze. Here the authors describe pGlyco 2.0, a workflow for the precise one-step identification of intact N-glycopeptides at the proteome scale.
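
    The quality control emphasized above rests on false discovery rate (FDR) estimation. A common way to estimate FDR at a score threshold is target-decoy competition, sketched generically below; this is only an illustration of the idea, not pGlyco 2.0's three-level glycan/peptide/glycopeptide FDR procedure.

        # Generic target-decoy FDR estimate: FDR at a threshold is approximated by
        # (#decoy hits above threshold) / (#target hits above threshold).
        def fdr_at_threshold(target_scores, decoy_scores, threshold):
            targets = sum(s >= threshold for s in target_scores)
            decoys = sum(s >= threshold for s in decoy_scores)
            return decoys / targets if targets else 0.0

        # Invented scores for illustration.
        target_scores = [42.1, 37.8, 35.0, 20.3, 18.9, 12.4]
        decoy_scores = [21.0, 15.2, 11.8, 9.7]

        for thr in (10, 15, 20, 30):
            print(f"threshold {thr}: estimated FDR = "
                  f"{fdr_at_threshold(target_scores, decoy_scores, thr):.2f}")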

  7. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe strike on southern Taiwan awakened public awareness of large-scale landslide disasters. Such disasters produce large quantities of sediment that adversely affect the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, establishing a database for hazard mitigation and disaster prevention is necessary. Real-time data and extensive archives of engineering data, environmental information, photos, and video not only help people make appropriate decisions, but are also of great value once processed. This study defined basic data formats and standards for the various types of data collected for these reservoirs and then provided a management platform based on those formats and standards. To ensure practicality and convenience, the large-scale landslide disaster database system was built to both provide and receive information, so that users can work with it on different types of devices. Because information technology progresses extremely quickly, even the most modern system may soon become outdated; to provide long-term service, the system therefore reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established in this study is based on HTML5 and responsive web design technology, which makes it easy for users to operate and extend the large-scale landslide disaster database system.

  8. Screening of Various Herbicide Modes of Action for Selective Control of Algae Responsible for Harmful Blooms

    DTIC Science & Technology

    2009-01-01

    that have selective activity against harmful algal blooms (HAB). The U.S. Army Corps of Engineers is responsible for managing numerous large reservoirs...systems, some of the enzyme-inhibiting herbicides may be active against algal species responsible for harmful blooms. The U.S. Army Engineer Research...small-scale flask studies to determine if these new chemistries are active against organisms responsible for HAB and if they show potential for

  9. Model-based system engineering approach for the Euclid mission to manage scientific and technical complexity

    NASA Astrophysics Data System (ADS)

    Lorenzo Alvarez, Jose; Metselaar, Harold; Amiaux, Jerome; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis M.; Salvignol, Jean-Christophe; Laureijs, René J.; Vavrek, Roland

    2016-08-01

    In recent years, the system engineering field has been coming to terms with a paradigm change in the approach to complexity management. Different strategies have been proposed to cope with highly interrelated systems; system-of-systems and collaborative system engineering approaches have been put forward, and a significant effort is being invested into standardization and ontology definition. In particular, Model Based System Engineering (MBSE) intends to introduce methodologies for systematic system definition, development, validation, deployment, operation and decommissioning, based on logical and visual relationship mapping rather than traditional 'document based' information management. The practical implementation in real large-scale projects is not uniform across fields. In space science missions, usage has been limited to subsystems or sample projects, with modeling being performed 'a posteriori' in many instances. The main hurdle for the introduction of MBSE practices in new projects is still the difficulty of demonstrating their added value to a project and whether their benefit is commensurate with the level of effort required to put them in place. In this paper we present the implemented Euclid system modeling activities, and an analysis of the benefits and limitations identified, to support in particular requirement break-down and allocation, and verification planning at mission level.

  10. Large eddy simulation of fine water sprays: comparative analysis of two models and computer codes

    NASA Astrophysics Data System (ADS)

    Tsoy, A. S.; Snegirev, A. Yu.

    2015-09-01

    The FDS model and computer code, albeit widely used in engineering practice to predict fire development, is not sufficiently validated for fire suppression by fine water sprays. In this work, the effect of the numerical resolution of large-scale turbulent pulsations on the accuracy of predicted time-averaged spray parameters is evaluated. Comparison of the simulation results obtained with the two versions of the model and code, as well as of the predicted and measured radial distributions of the liquid flow rate, revealed the need to apply monotonic and yet sufficiently accurate discrete approximations of the convective terms. Failure to do so delays jet break-up, otherwise induced by large turbulent eddies, and thereby excessively focuses the predicted flow around its axis. The effect of the pressure drop in the spray nozzle is also examined; its increase has been shown to cause only a weak increase of the evaporated fraction and vapor concentration despite the significant increase of flow velocity.

  11. Architecture independent environment for developing engineering software on MIMD computers

    NASA Technical Reports Server (NTRS)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  12. Investigation of a laser Doppler velocimeter system to measure the flow field around a large scale V/STOL aircraft in ground effect

    NASA Technical Reports Server (NTRS)

    Zalay, A. D.; Brashears, M. R.; Jordan, A. J.; Shrider, K. R.; Vought, C. D.

    1979-01-01

    The flow field measured around a hovering 70 percent scale vertical takeoff and landing (V/STOL) aircraft model is described. The velocity measurements were conducted with a ground based laser Doppler velocimeter. The remote sensing instrumentation and experimental tests of the velocity surveys are discussed. The distribution of vertical velocity in the fan jet and fountain; the radial velocity in the wall jet and the horizontal velocity along the aircraft underside are presented for different engine rpms and aircraft height above ground. Results show that it is feasible to use a mobile laser Doppler velocimeter to measure the flow field generated by a large scale V/STOL aircraft operating in ground effect.

  13. Nuclear Thermal Rocket Simulation in NPSS

    NASA Technical Reports Server (NTRS)

    Belair, Michael L.; Sarmiento, Charles J.; Lavelle, Thomas M.

    2013-01-01

    Four nuclear thermal rocket (NTR) models have been created in the Numerical Propulsion System Simulation (NPSS) framework. The models are divided into two categories. One set is based upon the ZrC-graphite composite fuel element and tie tube-style reactor developed during the Nuclear Engine for Rocket Vehicle Application (NERVA) project in the late 1960s and early 1970s. The other reactor set is based upon a W-UO2 ceramic-metallic (CERMET) fuel element. Within each category, a small and a large thrust engine are modeled. The small engine models utilize RL-10 turbomachinery performance maps and have a thrust of approximately 33.4 kN (7,500 lbf ). The large engine models utilize scaled RL-60 turbomachinery performance maps and have a thrust of approximately 111.2 kN (25,000 lbf ). Power deposition profiles for each reactor were obtained from a detailed Monte Carlo N-Particle (MCNP5) model of the reactor cores. Performance factors such as thermodynamic state points, thrust, specific impulse, reactor power level, and maximum fuel temperature are analyzed for each engine design.

  14. Nuclear Thermal Rocket Simulation in NPSS

    NASA Technical Reports Server (NTRS)

    Belair, Michael L.; Sarmiento, Charles J.; Lavelle, Thomas L.

    2013-01-01

    Four nuclear thermal rocket (NTR) models have been created in the Numerical Propulsion System Simulation (NPSS) framework. The models are divided into two categories. One set is based upon the ZrC-graphite composite fuel element and tie tube-style reactor developed during the Nuclear Engine for Rocket Vehicle Application (NERVA) project in the late 1960s and early 1970s. The other reactor set is based upon a W-UO2 ceramic-metallic (CERMET) fuel element. Within each category, a small and a large thrust engine are modeled. The small engine models utilize RL-10 turbomachinery performance maps and have a thrust of approximately 33.4 kN (7,500 lbf ). The large engine models utilize scaled RL-60 turbomachinery performance maps and have a thrust of approximately 111.2 kN (25,000 lbf ). Power deposition profiles for each reactor were obtained from a detailed Monte Carlo N-Particle (MCNP5) model of the reactor cores. Performance factors such as thermodynamic state points, thrust, specific impulse, reactor power level, and maximum fuel temperature are analyzed for each engine design.
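
    For orientation, engine-level figures such as specific impulse follow from ideal nozzle relations applied to the hot hydrogen leaving the reactor core. The short calculation below uses representative, assumed values (chamber temperature, gas properties, pressure ratio); the numbers are not outputs of the NPSS models described above.

        # Ideal-nozzle estimate of exhaust velocity and specific impulse for a
        # hydrogen-propellant nuclear thermal rocket (representative values only).
        import math

        gamma = 1.4          # ratio of specific heats for hot H2 (approximate)
        R_u = 8.314          # universal gas constant, J/(mol K)
        M = 2.016e-3         # molar mass of H2, kg/mol
        T_c = 2700.0         # chamber temperature, K (NERVA-class, assumed)
        p_ratio = 1.0 / 500  # exit-to-chamber pressure ratio (assumed)
        g0 = 9.80665         # standard gravity, m/s^2

        v_e = math.sqrt(2 * gamma / (gamma - 1) * (R_u / M) * T_c
                        * (1 - p_ratio ** ((gamma - 1) / gamma)))
        isp = v_e / g0
        print(f"exhaust velocity ~ {v_e:.0f} m/s, specific impulse ~ {isp:.0f} s")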

  15. Wave rotor-enhanced gas turbine engines

    NASA Technical Reports Server (NTRS)

    Welch, Gerard E.; Scott, Jones M.; Paxson, Daniel E.

    1995-01-01

    The benefits of wave rotor-topping in small (400 to 600 hp-class) and intermediate (3000 to 4000 hp-class) turboshaft engines, and large (80,000 to 100,000 lb(sub f)-class) high bypass ratio turbofan engines are evaluated. Wave rotor performance levels are calculated using a one-dimensional design/analysis code. Baseline and wave rotor-enhanced engine performance levels are obtained from a cycle deck in which the wave rotor is represented as a burner with pressure gain. Wave rotor-topping is shown to significantly enhance the specific fuel consumption and specific power of small and intermediate size turboshaft engines. The specific fuel consumption of the wave rotor-enhanced large turbofan engine can be reduced while operating at significantly reduced turbine inlet temperature. The wave rotor-enhanced engine is shown to behave off-design like a conventional engine. Discussion concerning the impact of the wave rotor/gas turbine engine integration identifies tenable technical challenges.

  16. Analysis and documentation of QCSEE (Quiet Clean Short-haul Experimental Engine) over-the-wing exhaust system development

    NASA Technical Reports Server (NTRS)

    Ammer, R. C.; Kutney, J. T.

    1977-01-01

    A static scale model test program was conducted in the static test area of the NASA-Langley 9.14- by 18.29 m(30- by 60-ft) Full-Scale Wind Tunnel Facility to develop an over-the-wing (OTW) nozzle and reverser configuration for the Quiet Clean Short-Haul Experimental Engine (QCSEE). Three nozzles and one basic reverser configuration were tested over the QCSEE takeoff and approach power nozzle pressure ratio range between 1.1 and 1.3. The models were scaled to 8.53% of QCSEE engine size and tested behind two 13.97-cm (5.5-in.) diameter tip-turbine-driven fan simulators coupled in tandem. An OTW nozzle and reverser configuration was identified which satisfies the QCSEE experimental engine requirements in terms of nozzle cycle area variation capability and reverse thrust level, and provides good jet flow spreading over a wing upper surface for achievement of high propulsive lift performance.

  17. Jade: using on-demand cloud analysis to give scientists back their flow

    NASA Astrophysics Data System (ADS)

    Robinson, N.; Tomlinson, J.; Hilson, A. J.; Arribas, A.; Powell, T.

    2017-12-01

    The UK's Met Office generates 400 TB of weather and climate data every day by running physical models on its top-20 supercomputer. As data volumes explode, there is a danger that analysis workflows become dominated by watching progress bars rather than thinking about science. We have been researching how distributed computing can allow analysts to process these large volumes of high-velocity data in a way that is easy, effective and cheap. Our prototype analysis stack, Jade, tries to encapsulate this. Functionality includes: an under-the-hood Dask engine which parallelises and distributes computations without the need to retrain analysts; hybrid compute clusters (AWS, Alibaba, and local compute) comprising many thousands of cores; clusters which autoscale up and down in response to calculation load using Kubernetes, with the load balanced across providers based on the current price of compute; and lazy data access from cloud storage via containerised OPeNDAP. This technology stack allows us to perform calculations many orders of magnitude faster than is possible on local workstations. It can also outperform dedicated local compute clusters, since cloud compute can, in principle, scale much further. The use of ephemeral compute resources also makes this implementation cost-efficient.
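
    To make the Dask layer concrete, the sketch below shows the pattern such a stack exposes to analysts: NumPy-like array code whose chunks a Dask scheduler spreads across whatever cluster is available. The data and cluster are placeholders (a local cluster is started when no scheduler address is given); this is not Jade's actual code.

        # Minimal Dask pattern: lazy chunked arrays plus a distributed scheduler.
        from dask.distributed import Client
        import dask.array as da

        # With no address, Client() starts a local cluster; a Kubernetes-backed
        # cluster would expose a scheduler address to connect to instead.
        client = Client()

        # A lazy (time, lat, lon) field split into chunks; placeholder random data.
        field = da.random.random((1_000, 500, 500), chunks=(100, 500, 500))

        # Operations build a task graph; compute() executes it across the workers.
        climatology = field.mean(axis=0)
        print(climatology[:2, :2].compute())

        client.close()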

  18. Evaluating the Reliability of Emergency Response Systems for Large-Scale Incident Operations

    PubMed Central

    Jackson, Brian A.; Faith, Kay Sullivan; Willis, Henry H.

    2012-01-01

    The ability to measure emergency preparedness—to predict the likely performance of emergency response systems in future events—is critical for policy analysis in homeland security. Yet it remains difficult to know how prepared a response system is to deal with large-scale incidents, whether it be a natural disaster, terrorist attack, or industrial or transportation accident. This research draws on the fields of systems analysis and engineering to apply the concept of system reliability to the evaluation of emergency response systems. The authors describe a method for modeling an emergency response system; identifying how individual parts of the system might fail; and assessing the likelihood of each failure and the severity of its effects on the overall response effort. The authors walk the reader through two applications of this method: a simplified example in which responders must deliver medical treatment to a certain number of people in a specified time window, and a more complex scenario involving the release of chlorine gas. The authors also describe an exploratory analysis in which they parsed a set of after-action reports describing real-world incidents, to demonstrate how this method can be used to quantitatively analyze data on past response performance. The authors conclude with a discussion of how this method of measuring emergency response system reliability could inform policy discussion of emergency preparedness, how system reliability might be improved, and the costs of doing so. PMID:28083267
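
    A toy version of the reliability calculation described above: model the response as a chain of functions that must all succeed, combine per-function success probabilities into an overall system reliability, and ask which function most limits it. The functions and probabilities are invented for illustration and are not taken from the study.

        # Toy series-system reliability model of an emergency response (invented numbers).
        response_functions = {
            "detect incident and alert": 0.98,
            "dispatch responders": 0.95,
            "set up treatment site": 0.90,
            "deliver treatment in time window": 0.85,
        }

        def series_reliability(success_probs):
            r = 1.0
            for p in success_probs:
                r *= p
            return r

        overall = series_reliability(response_functions.values())
        print(f"probability the overall response succeeds: {overall:.3f}")

        # Sensitivity: which single function, if made perfectly reliable, helps most?
        for name, p in response_functions.items():
            improved = overall / p          # replace p with 1.0 in the product
            print(f"  with perfect '{name}': {improved:.3f}")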

  19. Contaminant Permeation in the Ionomer-Membrane Water Processor (IWP) System

    NASA Technical Reports Server (NTRS)

    Kelsey, Laura K.; Finger, Barry W.; Pasadilla, Patrick; Perry, Jay

    2016-01-01

    The Ionomer-membrane Water Processor (IWP) is a patented membrane-distillation based urine brine water recovery system. The unique properties of the IWP membrane pair limit contaminant permeation from the brine to the recovered water and purge gas. A paper study was conducted to predict volatile trace contaminant permeation in the IWP system. Testing of a large-scale IWP Engineering Development Unit (EDU) with urine brine pretreated with the International Space Station (ISS) pretreatment formulation was then conducted to collect air and water samples for quality analysis. Distillate water quality and purge air GC-MS results are presented and compared to predictions, along with implications for the IWP brine processing system.

  20. Nanosecond laser coloration on stainless steel surface.

    PubMed

    Lu, Yan; Shi, Xinying; Huang, Zhongjia; Li, Taohai; Zhang, Meng; Czajkowski, Jakub; Fabritius, Tapio; Huttula, Marko; Cao, Wei

    2017-08-02

    In this work, we present laser coloration of 304 stainless steel using a nanosecond laser. Surface modifications are tuned by adjusting the laser parameters of scanning speed, repetition rate, and pulse width. A comprehensive study of the physical mechanism leading to the appearance is presented. Microscopic patterns are measured and employed as input to simulate light-matter interference, while chemical states and crystal structures of the surface composites are characterized to determine the intrinsic colors. Quantitative analysis clarifies that the final colors and RGB values are combinations of structural colors and intrinsic colors from the oxidized pigments, with the latter dominating. The engineering and scientific insights into nanosecond laser coloration therefore highlight the potential for large-scale use of the present route to produce colorful and resistant steels.

  1. USB environment measurements based on full-scale static engine ground tests

    NASA Technical Reports Server (NTRS)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle, and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data, and to establish a basis for future flight test comparisons.

  2. Wind-tunnel results of the aerodynamic characteristics of a 1/8-scale model of a twin engine short-haul transport. [in Langley V/STOL tunnel

    NASA Technical Reports Server (NTRS)

    Paulson, J. W., Jr.

    1977-01-01

    A wind tunnel test was conducted in the Langley V/STOL tunnel to define the aerodynamic characteristics of a 1/8-scale twin-engine short haul transport. The model was tested in both the cruise and approach configurations with various control surfaces deflected. Data were obtained out of ground effect for the cruise configuration and both in and out of ground effect for the approach configuration. These data are intended to be a reference point to begin the analysis of the flight characteristics of the NASA terminal configured vehicle (TCV) and are presented without analysis.

  3. Multidimensional assessment of strongly irregular voices such as in substitution voicing and spasmodic dysphonia: a compilation of own research.

    PubMed

    Moerman, Mieke; Martens, Jean-Pierre; Dejonckere, Philippe

    2015-04-01

    This article is a compilation of our own research performed during the European COoperation in Science and Technology (COST) Action 2103: 'Advanced Voice Function Assessment', an initiative of voice and speech processing teams consisting of physicists, engineers, and clinicians. This manuscript concerns the analysis of strongly irregular voicing types, namely substitution voicing (SV) and adductor spasmodic dysphonia (AdSD). A specific perceptual rating scale (IINFVo) was developed, and the Auditory Model Based Pitch Extractor (AMPEX), a piece of software that automatically analyses running speech and generates pitch values in background noise, was applied. The IINFVo perceptual rating scale has been shown to be useful in evaluating SV. The analysis of strongly irregular voices stimulated a modification of the European Laryngological Society's assessment protocol, which was originally designed for the common types of (less severe) dysphonia. Acoustic analysis with AMPEX demonstrates that the most informative features are, for SV, the voicing-related acoustic features and, for AdSD, the perturbation measures. Poor correlations between self-assessment and the acoustic and perceptual dimensions in the assessment of highly irregular voices argue for a multidimensional approach.

  4. Structure and modeling of turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novikov, E.A.

    The "vortex strings" scale l_s ~ L Re^(-3/10) (L: external scale, Re: Reynolds number) is suggested as a grid scale for large-eddy simulation. Various aspects of the structure of turbulence and subgrid modeling are described in terms of conditional averaging, Markov processes with dependent increments, and infinitely divisible distributions. The major request from the energy, naval, aerospace, and environmental engineering communities to the theory of turbulence is to reduce the enormous number of degrees of freedom in turbulent flows to a level manageable by computer simulations. The vast majority of these degrees of freedom is in the small-scale motion. The study of the structure of turbulence provides a basis for the subgrid-scale (SGS) models which are necessary for large-eddy simulations (LES).
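
    As a quick illustration of the quoted scaling (not taken from the report itself; the values of L and Re below are assumed for the example), the sketch evaluates the proposed grid scale and compares it with a standard Kolmogorov-type estimate:

      import math

      def vortex_string_scale(L, Re):
          """Grid scale suggested in the abstract: l_s ~ L * Re**(-3/10).

          L  : external (integral) length scale of the flow [m]
          Re : Reynolds number (dimensionless)
          The proportionality constant is taken as 1.0 here; the abstract
          gives only the scaling, not the prefactor.
          """
          return L * Re ** (-0.3)

      # Example values, assumed for illustration only:
      L, Re = 1.0, 1.0e6
      ls = vortex_string_scale(L, Re)
      eta = L * Re ** (-0.75)  # Kolmogorov-type estimate, for comparison
      print(f"l_s = {ls:.4e} m  (proposed LES grid scale)")
      print(f"eta = {eta:.4e} m  (Kolmogorov estimate L*Re^-3/4)")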

  5. Structural similitude and design of scaled down laminated models

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Rezaeepazhand, J.

    1993-01-01

    The excellent mechanical properties of laminated composite structures make them prime candidates for a wide variety of applications in aerospace, mechanical, and other branches of engineering. The enormous design flexibility of advanced composites is obtained at the cost of a large number of design parameters. Due to the complexity of these systems and the lack of complete design-based information, designers tend to be conservative in their designs. Furthermore, any new design is extensively evaluated experimentally until it achieves the necessary reliability, performance, and safety. However, the experimental evaluation of composite structures is costly and time consuming. Consequently, it is extremely useful if a full-scale structure can be replaced by a similar scaled-down model which is much easier to work with. Furthermore, a dramatic reduction in cost and time can be achieved if available experimental data for a specific structure can be used to predict the behavior of a group of similar systems. This study investigates problems associated with the design of scaled models. Such a study is important because it provides the necessary scaling laws and identifies the factors that affect the accuracy of scale models. Similitude theory is employed to develop the necessary similarity conditions (scaling laws). Scaling laws provide the relationship between a full-scale structure and its scale model, and can be used to extrapolate the experimental data of a small, inexpensive, and testable model into design information for a large prototype. Due to the large number of design parameters, identification of the principal scaling laws by the conventional method (dimensional analysis) is tedious. Similitude theory based on the governing equations of the structural system is more direct and simpler in execution. The difficulty of making completely similar scale models often leads to accepting a certain type of distortion from exact duplication of the prototype (partial similarity). Both complete and partial similarity are discussed. The procedure consists of systematically observing the effect of each parameter and the corresponding scaling laws. Acceptable intervals and limitations for these parameters and scaling laws are then discussed. In each case, a set of valid scaling factors and corresponding response scaling laws that accurately predict the response of prototypes from experimental models is introduced. The examples include rectangular laminated plates under destabilizing loads applied individually, the vibrational characteristics of the same plates, and the cylindrical bending of beam-plates.

  6. Progress with variable cycle engines

    NASA Technical Reports Server (NTRS)

    Westmoreland, J. S.

    1980-01-01

    The evaluation of components of an advanced propulsion system for a future supersonic cruise vehicle is discussed. These components, a high performance duct burner for thrust augmentation and a low jet noise coannular exhaust nozzle, are part of the variable stream control engine. An experimental test program involving both isolated component and complete engine tests was conducted for the high performance, low emissions duct burner with excellent results. Nozzle model tests were completed which substantiate the inherent jet noise benefit associated with the unique velocity profile made possible by a coannular exhaust nozzle system on a variable stream control engine. Additional nozzle model performance tests have established high thrust efficiency levels at takeoff and supersonic cruise for this nozzle system. Large-scale testing of these two critical components is conducted using an F100 engine as the testbed for simulating the variable stream control engine.

  7. To Make Archives Available Online: Transcending Boundaries or Building Walls?

    ERIC Educational Resources Information Center

    Hansen, Lars-Erik; Sundqvist, Anneli

    2012-01-01

    The development of information technology and the rise of the Internet have enabled large-scale digitization and dissemination of originally analog information objects. On the Web sites "Lararnas Historia" ("History of Teachers" www.lararhistoria.se) and "Ingenjorshistoria" ("History of Engineers"…

  8. Development of a Catalytic Combustor for Aircraft Gas Turbine Engines.

    DTIC Science & Technology

    1976-09-22

    Excerpts from the report (table of contents and text snippets): "VI. Design of 7.6 cm Diameter Combustors"; "Design and Fabrication of Combustors for Large Scale Test"; "... obtained for this program included round holes of different diameters, squares, rectangles, triangles, and other more complex hollow configurations."

  9. Enhanced Vehicle Simulation Tool Enables Wider Array of Analyses | News |

    Science.gov Websites

    "...of vehicle types, including conventional vehicles, electric-drive vehicles, and fuel cell vehicles," said NREL Senior Engineer Aaron Brooker. FASTSim facilitates large-scale evaluation of on-road performance.

  10. Micron-scale lens array having diffracting structures

    DOEpatents

    Goldberg, Kenneth A

    2013-10-29

    A novel micron-scale lens, a microlens, is engineered to concentrate light efficiently onto an area of interest, such as a small, light-sensitive detector element in an integrated electronic device. Existing microlens designs imitate the form of large-scale lenses and are less effective at small sizes. The microlenses described herein have been designed to accommodate diffraction effects, which dominate the behavior of light at small length scales. Thus a new class of light-concentrating optical elements with much higher relative performance has been created. Furthermore, the new designs are much easier to fabricate than previous designs.

  11. Software engineering risk factors in the implementation of a small electronic medical record system: the problem of scalability.

    PubMed

    Chiang, Michael F; Starren, Justin B

    2002-01-01

    The successful implementation of clinical information systems is difficult. In examining the reasons and potential solutions for this problem, the medical informatics community may benefit from the lessons of a rich body of software engineering and management literature about the failure of software projects. Based on previous studies, we present a conceptual framework for understanding the risk factors associated with large-scale projects. However, the vast majority of existing literature is based on large, enterprise-wide systems, and it is unclear whether those results may be scaled down and applied to smaller projects such as departmental medical information systems. To examine this issue, we discuss the case study of a delayed electronic medical record implementation project in a small specialty practice at Columbia-Presbyterian Medical Center. While the factors contributing to the delay of this small project share some attributes with those found in larger organizations, there are important differences. The significance of these differences for groups implementing small medical information systems is discussed.

  12. Evaluation of fuel preparation systems for lean premixing-prevaporizing combustors

    NASA Technical Reports Server (NTRS)

    Dodds, W. J.; Ekstedt, E. E.

    1985-01-01

    A series of experiments was carried out in order to produce design data for a premixing prevaporizing fuel-air mixture preparation system for aircraft gas turbine engine combustors. The fuel-air mixture uniformity of four different system design concepts was evaluated over a range of conditions representing the cruise operation of a modern commercial turbofan engine. Operating conditions including pressure, temperature, fuel-to-air ratio, and velocity, exhibited no clear effect on mixture uniformity of systems using pressure-atomizing fuel nozzles and large-scale mixing devices. However, the performance of systems using atomizing fuel nozzles and large-scale mixing devices was found to be sensitive to operating conditions. Variations in system design variables were also evaluated and correlated. Mixing uniformity was found to improve with system length, pressure drop, and the number of fuel injection points per unit area. A premixing system capable of providing mixing uniformity to within 15 percent over a typical range of cruise operating conditions is demonstrated.

  13. Radio Source Morphology: 'nature or nurture'?

    NASA Astrophysics Data System (ADS)

    Banfield, Julie; Emonts, Bjorn; O'Sullivan, Shane

    2012-10-01

    Radio sources, emanating from supermassive black-holes in the centres of active galaxies, display a large variety of morphological properties. It is a long-standing debate to what extent the differences between various types of radio sources are due to intrinsic properties of the central engine (`nature') or due to the properties of the interstellar medium that surrounds the central engine and host galaxy (`nurture'). Settling this `nature vs. nurture' debate for nearby radio galaxies, which can be studied in great detail, is vital for understanding the properties and evolution of radio galaxies throughout the Universe. We propose to observe the radio galaxy NGC 612 where previous observations have detected the presence of a large-scale HI bridge between the host galaxy and a nearby galaxy NGC 619. We request a total of 13 hrs in the 750m array-configuration to determine whether or not the 100 kpc-scale radio source morphology is directly related to the intergalactic distribution of neutral hydrogen gas.

  14. State analysis requirements database for engineering complex embedded systems

    NASA Technical Reports Server (NTRS)

    Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which provides a tool for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.

  15. Physiological Effects of Free Fatty Acid Production in Genetically Engineered Synechococcus elongatus PCC 7942

    PubMed Central

    Ruffing, Anne M.; Jones, Howland D.T.

    2012-01-01

    The direct conversion of carbon dioxide into biofuels by photosynthetic microorganisms is a promising alternative energy solution. In this study, a model cyanobacterium, Synechococcus elongatus PCC 7942, is engineered to produce free fatty acids (FFA), potential biodiesel precursors, via gene knockout of the FFA-recycling acyl-ACP synthetase and expression of a thioesterase for release of the FFA. Similar to previous efforts, the engineered strains produce and excrete FFA, but the yields are too low for large-scale production. While other efforts have applied additional metabolic engineering strategies in an attempt to boost FFA production, we focus on characterizing the engineered strains to identify the physiological effects that limit cell growth and FFA synthesis. The strains engineered for FFA-production show reduced photosynthetic yields, chlorophyll-a degradation, and changes in the cellular localization of the light-harvesting pigments, phycocyanin and allophycocyanin. Possible causes of these physiological effects are also identified. The addition of exogenous linolenic acid, a polyunsaturated FFA, to cultures of S. elongatus 7942 yielded a physiological response similar to that observed in the FFA-producing strains with only one notable difference. In addition, the lipid constituents of the cell and thylakoid membranes in the FFA-producing strains show changes in both the relative amounts of lipid components and the degree of saturation of the fatty acid side chains. These changes in lipid composition may affect membrane integrity and structure, the binding and diffusion of phycobilisomes, and the activity of membrane-bound enzymes including those involved in photosynthesis. Thus, the toxicity of unsaturated FFA and changes in membrane composition may be responsible for the physiological effects observed in FFA-producing S. elongatus 7942. These issues must be addressed to enable the high yields of FFA synthesis necessary for large-scale biofuel production. PMID:22473793

  16. Solving large-scale fixed cost integer linear programming models for grid-based location problems with heuristic techniques

    NASA Astrophysics Data System (ADS)

    Noor-E-Alam, Md.; Doucette, John

    2015-08-01

    Grid-based location problems (GBLPs) can be used to solve location problems in business, engineering, resource exploitation, and even in the field of medical sciences. To solve these decision problems, an integer linear programming (ILP) model is designed and developed to provide the optimal solution for GBLPs considering fixed cost criteria. Preliminary results show that the ILP model is efficient in solving small to moderate-sized problems. However, this ILP model becomes intractable in solving large-scale instances. Therefore, a decomposition heuristic is proposed to solve these large-scale GBLPs, which demonstrates significant reduction of solution runtimes. To benchmark the proposed heuristic, results are compared with the exact solution via ILP. The experimental results show that the proposed method significantly outperforms the exact method in runtime with minimal (and in most cases, no) loss of optimality.
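
    To make the fixed-cost structure concrete, the toy sketch below (not the authors' ILP model or decomposition heuristic, with made-up data) solves a tiny grid-based location instance by exhaustive search; the exponential growth of this search is exactly what motivates the ILP formulation and the heuristics discussed above:

      from itertools import combinations, product

      # Toy fixed-cost location problem on a grid, illustrative only.
      demand_pts = [(0, 0), (0, 4), (3, 2), (5, 5), (6, 1)]       # customers
      candidates = list(product(range(0, 7, 2), range(0, 7, 2)))  # candidate grid sites
      fixed_cost = 10.0                                           # cost of opening a site

      def assign_cost(open_sites):
          # each demand point is served by its nearest open site (Manhattan distance)
          return sum(min(abs(dx - sx) + abs(dy - sy) for sx, sy in open_sites)
                     for dx, dy in demand_pts)

      best = None
      for k in range(1, len(candidates) + 1):           # exhaustive search: exact but
          for sites in combinations(candidates, k):     # exponential in the grid size,
              cost = fixed_cost * k + assign_cost(sites)
              if best is None or cost < best[0]:
                  best = (cost, sites)

      print("optimal cost:", best[0])
      print("open sites  :", best[1])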

  17. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs

    PubMed Central

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of the data involved (e.g., users’ queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance and are suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with “vanilla” LSH, even when using the same amount of space. PMID:29346410
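
    The minimal sketch below (illustrative only; it is a basic random-hyperplane LSH for cosine similarity, not the multi-probe variants or Hadoop implementation evaluated in the paper, and the vectors are made up) shows how hashing similar vectors into shared buckets avoids all-pairs comparison:

      import random
      from collections import defaultdict

      random.seed(0)

      def random_hyperplanes(dim, n_bits):
          return [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]

      def signature(vec, planes):
          # one bit per hyperplane: which side of the plane the vector falls on
          return tuple(int(sum(p * v for p, v in zip(plane, vec)) >= 0)
                       for plane in planes)

      # Toy "query" feature vectors; illustrative only.
      queries = {
          "cheap flights paris": [1, 1, 0, 0, 1],
          "cheap flights rome":  [1, 1, 0, 0, 0.8],
          "python list sort":    [0, 0, 1, 1, 0],
      }

      planes = random_hyperplanes(dim=5, n_bits=8)
      buckets = defaultdict(list)
      for q, vec in queries.items():
          buckets[signature(vec, planes)].append(q)  # similar vectors tend to collide

      for sig, members in buckets.items():
          print(sig, "->", members)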

  18. Upper atmosphere pollution measurements (GASP)

    NASA Technical Reports Server (NTRS)

    Rudey, R. A.; Holdeman, J. D.

    1975-01-01

    The environmental effects of engine effluents from future large fleets of aircraft operating in the stratosphere are discussed. Topics discussed include: atmospheric properties, aircraft engine effluents, upper atmospheric measurements, global air sampling, and data reduction and analysis.

  19. Status of the Combined Cycle Engine Rig

    NASA Technical Reports Server (NTRS)

    Saunders, Dave; Slater, John; Dippold, Vance

    2009-01-01

    This paper provides the past year's status of the turbine-based Combined-Cycle Engine (CCE) Rig for the hypersonic project. As part of the first stage propulsion of a two-stage-to-orbit vehicle concept, this engine rig is designed with a common inlet that supplies flow to a turbine engine and a dual-mode ramjet / scramjet engine in an over/under configuration. At Mach 4 the inlet has variable geometry to switch the airflow from the turbine to the ramjet / scramjet engine. This process is known as inlet mode-transition. In addition to investigating inlet aspects of mode transition, the rig will allow testing of turbine and scramjet systems later in the test series. Fully closing the splitter cowl "cocoons" the turbine engine and increases airflow to the scramjet duct. The CCE Rig will be a testbed to investigate integrated propulsion system and controls technology objectives. Four phases of testing are planned to 1) characterize the dual inlet database, 2) collect inlet dynamics using system identification techniques, 3) implement an inlet control to demonstrate mode-transition scenarios and 4) demonstrate integrated inlet/turbine engine operation through mode-transition. Status of the test planning and preparation activities is summarized with background on the inlet design and small-scale testing, analytical CFD predictions and some details of the large-scale hardware. The final stages of fabrication are underway.

  20. Practical Techniques for Modeling Gas Turbine Engine Performance

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2016-01-01

    The cost and risk associated with the design and operation of gas turbine engine systems has led to an increasing dependence on mathematical models. In this paper, the fundamentals of engine simulation will be reviewed, an example performance analysis will be performed, and relationships useful for engine control system development will be highlighted. The focus will be on thermodynamic modeling utilizing techniques common in industry, such as: the Brayton cycle, component performance maps, map scaling, and design point criteria generation. In general, these topics will be viewed from the standpoint of an example turbojet engine model; however, demonstrated concepts may be adapted to other gas turbine systems, such as gas generators, marine engines, or high bypass aircraft engines. The purpose of this paper is to provide an example of gas turbine model generation and system performance analysis for educational uses, such as curriculum creation or student reference.
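
    As a flavor of the kind of relationship such a model is built on, the sketch below evaluates an ideal Brayton cycle for assumed design values; it is a textbook simplification, not the component-map or turbojet model described in the paper:

      # Ideal Brayton cycle sketch: thermal efficiency from the compressor pressure
      # ratio, assuming an ideal gas, constant cp, and isentropic components.
      gamma = 1.4               # ratio of specific heats for air (assumed)
      pressure_ratio = 12.0     # overall compressor pressure ratio (assumed)
      T1 = 288.15               # compressor inlet temperature, K (sea-level static)
      T3 = 1400.0               # turbine inlet temperature, K (assumed design value)

      exp = (gamma - 1.0) / gamma
      T2 = T1 * pressure_ratio ** exp              # compressor exit temperature
      eta_th = 1.0 - 1.0 / pressure_ratio ** exp   # ideal-cycle thermal efficiency

      print(f"compressor exit temperature T2 = {T2:.1f} K")
      print(f"ideal thermal efficiency       = {eta_th:.3f}")
      print(f"temperature ratio T3/T1        = {T3 / T1:.2f}")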

  1. Engine-over-the-wing noise research

    NASA Technical Reports Server (NTRS)

    Reshotko, M.; Goodykoontz, J. H.; Dorsch, R. G.

    1973-01-01

    Acoustic measurements for large model engine-over-the-wing (EOW) research configurations having both conventional and powered lift applications were taken for flap positions typical of takeoff and approach and at locations simulating flyover and sideline. The results indicate that the noise is shielded by the wing and redirected above it, making the EOW concept a prime contender for quiet aircraft. The large-scale noise data are in agreement with earlier small-model results. Below the wing, the EOW configuration is about 10 PNdB quieter than the engine-under-the-wing externally blown flap for powered lift, and up to 10 dB quieter than the nozzle alone at high frequencies for conventional lift applications.

  2. Bioprocess scale-up/down as integrative enabling technology: from fluid mechanics to systems biology and beyond.

    PubMed

    Delvigne, Frank; Takors, Ralf; Mudde, Rob; van Gulik, Walter; Noorman, Henk

    2017-09-01

    Efficient optimization of microbial processes is a critical issue for achieving a number of sustainable development goals, considering the impact of microbial biotechnology in the agrofood, environmental, biopharmaceutical and chemical industries. Many of these applications require scale-up after proof of concept. However, the behaviour of microbial systems remains (at least partially) unpredictable when shifting from laboratory-scale to industrial conditions. Robust microbial systems are thus greatly needed in this context, as is a better understanding of the interactions between fluid mechanics and cell physiology. For that purpose, a full scale-up/down computational framework is already available. This framework links computational fluid dynamics (CFD), metabolic flux analysis and agent-based modelling (ABM) for a better understanding of cell lifelines in a heterogeneous environment. Ultimately, this framework can be used for the design of scale-down simulators and/or metabolically engineered cells able to cope with the environmental fluctuations typically found in large-scale bioreactors. However, this framework still needs some refinements, such as a better integration of gas-liquid flows in CFD and taking into account intrinsic biological noise in ABM. © 2017 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  3. Particle physics and polyhedra proximity calculation for hazard simulations in large-scale industrial plants

    NASA Astrophysics Data System (ADS)

    Plebe, Alice; Grasso, Giorgio

    2016-12-01

    This paper describes a system developed for the simulation of flames inside an open-source 3D computer graphics package, Blender, with the aim of analyzing, in virtual reality, hazard scenarios in large-scale industrial plants. The advantages of Blender are that it renders the very complex structure of large industrial plants at high resolution and that it embeds a physics engine based on smoothed particle hydrodynamics. This particle system is used to evolve a simulated fire. The interaction of this fire with the components of the plant is computed using polyhedron separation distances, adopting a Voronoi-based strategy that optimizes the number of feature distance computations. Results for a real oil and gas refining plant are presented.
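
    A much simpler stand-in for the proximity step (not the Voronoi-based strategy of the paper) is to compute the separation distance between two convex polyhedra directly, as a small quadratic program over their vertex sets; the sketch below uses assumed geometry and SciPy's general-purpose SLSQP solver:

      import numpy as np
      from scipy.optimize import minimize

      def separation_distance(A, B):
          """Minimum distance between the convex hulls of vertex sets A and B.

          Minimizes ||A^T lam - B^T mu|| over the simplices lam, mu >= 0,
          sum(lam) = 1, sum(mu) = 1 (a small convex QP).
          """
          A, B = np.asarray(A, float), np.asarray(B, float)
          na, nb = len(A), len(B)

          def objective(w):
              d = A.T @ w[:na] - B.T @ w[na:]
              return float(d @ d)

          cons = [{"type": "eq", "fun": lambda w: np.sum(w[:na]) - 1.0},
                  {"type": "eq", "fun": lambda w: np.sum(w[na:]) - 1.0}]
          w0 = np.concatenate([np.full(na, 1.0 / na), np.full(nb, 1.0 / nb)])
          res = minimize(objective, w0, method="SLSQP",
                         bounds=[(0.0, 1.0)] * (na + nb), constraints=cons)
          return np.sqrt(res.fun)

      # Two small axis-aligned boxes (vertex lists); illustrative geometry only.
      box1 = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
      box2 = [(x, y, z) for x in (3, 4) for y in (0, 1) for z in (0, 1)]
      print("separation distance:", round(separation_distance(box1, box2), 3))  # ~2.0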

  4. Genome-scale metabolic modeling of Mucor circinelloides and comparative analysis with other oleaginous species.

    PubMed

    Vongsangnak, Wanwipa; Klanchui, Amornpan; Tawornsamretkit, Iyarest; Tatiyaborwornchai, Witthawin; Laoteng, Kobkul; Meechai, Asawin

    2016-06-01

    We present a novel genome-scale metabolic model iWV1213 of Mucor circinelloides, which is an oleaginous fungus for industrial applications. The model contains 1213 genes, 1413 metabolites and 1326 metabolic reactions across different compartments. We demonstrate that iWV1213 is able to accurately predict the growth rates of M. circinelloides on various nutrient sources and culture conditions using Flux Balance Analysis and Phenotypic Phase Plane analysis. Comparative analysis of three oleaginous genome-scale models, including M. circinelloides (iWV1213), Mortierella alpina (iCY1106) and Yarrowia lipolytica (iYL619_PCP), revealed that iWV1213 possesses a higher number of genes involved in carbohydrate, amino acid, and lipid metabolisms that might contribute to its versatility in nutrient utilization. Moreover, the identification of unique and common active reactions among the Zygomycetes oleaginous models using Flux Variability Analysis unveiled a set of gene/enzyme candidates as metabolic engineering targets for cellular improvement. Thus, iWV1213 offers a powerful metabolic engineering tool for multi-level omics analysis, enabling strain optimization as a cell factory platform for lipid-based production. Copyright © 2016 Elsevier B.V. All rights reserved.
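
    For readers unfamiliar with flux balance analysis, the sketch below runs FBA on a made-up three-reaction toy network (not iWV1213) by solving the underlying linear program with SciPy:

      import numpy as np
      from scipy.optimize import linprog

      # Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions).
      # R1: -> A (uptake), R2: A -> B, R3: B -> biomass (objective reaction).
      S = np.array([[ 1, -1,  0],    # metabolite A balance
                    [ 0,  1, -1]])   # metabolite B balance

      bounds = [(0, 10),    # R1 uptake limited to 10 flux units
                (0, 1000),  # R2 effectively unbounded
                (0, 1000)]  # R3 (biomass) effectively unbounded

      c = np.array([0, 0, -1.0])     # maximize R3  <=>  minimize -R3

      res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds,
                    method="highs")
      print("optimal biomass flux:", res.x[2])   # limited by the uptake bound (10)
      print("flux distribution   :", res.x)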

  5. An engineering closure for heavily under-resolved coarse-grid CFD in large applications

    NASA Astrophysics Data System (ADS)

    Class, Andreas G.; Yu, Fujiang; Jordan, Thomas

    2016-11-01

    Even though high performance computing allows very detailed description of a wide range of scales in scientific computations, engineering simulations used for design studies commonly resolve only the large scales, thereby reducing simulation time. The coarse-grid CFD (CGCFD) methodology is developed for flows with repeated flow patterns, as often observed in heat exchangers or porous structures. It is proposed to solve the inviscid Euler equations on a very coarse numerical mesh. This coarse mesh need not conform to the geometry in all details. To reinstate the physics of the smaller scales, inexpensive subgrid models are employed. The subgrid models are systematically constructed by analyzing well-resolved, generic, representative simulations. By varying the flow conditions in these simulations, correlations are obtained. These provide, for each individual coarse mesh cell, a volume force vector and a volume porosity; moreover, surface porosities are derived for all vertices. CGCFD is related to the immersed boundary method, as both exploit volume forces and non-body-conformal meshes. However, CGCFD differs with respect to the coarser mesh and the use of the Euler equations. We describe the methodology based on a simple test case and the application of the method to a 127-pin wire-wrap fuel bundle.

  6. Contribution of Road Grade to the Energy Use of Modern Automobiles Across Large Datasets of Real-World Drive Cycles: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, E.; Burton, E.; Duran, A.

    Understanding the real-world power demand of modern automobiles is of critical importance to engineers using modeling and simulation to inform the intelligent design of increasingly efficient powertrains. Increased use of global positioning system (GPS) devices has made large-scale data collection of vehicle speed (and associated power demand) a reality. While the availability of real-world GPS data has improved the industry's understanding of in-use vehicle power demand, relatively little attention has been paid to the incremental power requirements imposed by road grade. This analysis quantifies the incremental efficiency impacts of real-world road grade by appending high fidelity elevation profiles to GPS speed traces and performing a large simulation study. Employing a large real-world dataset from the National Renewable Energy Laboratory's Transportation Secure Data Center, vehicle powertrain simulations are performed with and without road grade for five vehicle models. Aggregate results of this study suggest that road grade could be responsible for 1% to 3% of fuel use in light-duty automobiles.
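
    The incremental effect of grade can be seen from the basic road-load power balance; the sketch below uses generic light-duty parameters (assumed values, not those of the NREL vehicle models or FASTSim):

      import math

      def road_load_power(v, grade, mass=1500.0, c_rr=0.009, cd_a=0.68, rho=1.2):
          """Instantaneous tractive power demand [W] at speed v [m/s].

          grade is rise/run (e.g. 0.03 for a 3% grade). Parameter values are
          generic light-duty assumptions, not those of the NREL study.
          """
          theta = math.atan(grade)
          g = 9.81
          p_roll  = c_rr * mass * g * math.cos(theta) * v   # rolling resistance
          p_aero  = 0.5 * rho * cd_a * v ** 3               # aerodynamic drag
          p_grade = mass * g * math.sin(theta) * v          # grade (hill climbing)
          return p_roll + p_aero + p_grade

      v = 27.0  # roughly 60 mph
      flat = road_load_power(v, grade=0.00)
      hill = road_load_power(v, grade=0.03)
      print(f"power on flat road: {flat/1000:.1f} kW")
      print(f"power on 3% grade : {hill/1000:.1f} kW "
            f"({100*(hill-flat)/flat:.0f}% increase)")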

  7. Investigation on Composite Throat Insert For Cryogenic Engines

    NASA Astrophysics Data System (ADS)

    Ayyappan, G.; Tiwari, S. B.; Praveen, RS; Mohankumar, L.; Jathaveda, M.; Ganesh, P.

    2017-02-01

    Injector element testing is an important step in the development and qualification of cryogenic rocket engines. For the purpose of characterising the injectors, sub-scale chambers are used. In order to assess the performance of the injectors, different configurations of the injectors are tested using a combustion chamber and a convergent-divergent nozzle. Pressure distribution along the wall of the chamber and throat insert is obtained from CFD analysis, and temperature distribution is obtained from thermal analysis. Thermo-structural analysis is carried out for the sub-scale model of the throat insert using temperature-dependent material properties. For the experiments, a sub-scale model of the thrust chamber is realised. Injector element tests are carried out for the studies. The objective of the present study is to investigate the behaviour of different throat inserts, mainly graphite, 2-D Carbon-Carbon (2D C-C), 4-D Carbon-Carbon (4D C-C) and Silica Phenolic (SP), under pressure and thermal load for repeated operation of the engine. Analytical results are compared with the test results. The paper gives the results of theoretical studies and experiments conducted with all four types of throat material. It is concluded that 2D C-C is superior, exhibiting the least throat erosion under the specified combustion environment.

  8. Steam atmosphere dryer project: System development and field test. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1999-02-01

    The objective of this project was to develop and demonstrate the use of a superheated steam atmosphere dryer as a highly improved alternative to conventional hot air-drying systems, the present industrial standard method for drying various wet feedstocks. The development program plan consisted of three major activities. The first was engineering analysis and testing of a small-scale laboratory superheated steam dryer. This dryer provided the basic engineering heat transfer data necessary to design a large-scale system. The second major activity consisted of the design, fabrication, and laboratory checkout testing of the field-site prototype superheated steam dryer system. The third major activity consisted of the installation and testing of the complete 250-lb/hr evaporation rate dryer and a 30-kW cogeneration system in conjunction with an anaerobic digester facility at the Village of Bergen, NY. Feedstock for the digester was waste residue from a nearby commercial food processing plant. The superheated steam dryer system was placed into operation in August 1996 and operated successfully through March 1997. During this period, the dryer processed all the material from the digester to a powdered consistency usable as a high-nitrogen-based fertilizer.

  9. Metabolic engineering tools in model cyanobacteria.

    PubMed

    Carroll, Austin L; Case, Anna E; Zhang, Angela; Atsumi, Shota

    2018-03-26

    Developing sustainable routes for producing chemicals and fuels is one of the most important challenges in metabolic engineering. Photoautotrophic hosts are particularly attractive because of their potential to utilize light as an energy source and CO2 as a carbon substrate through photosynthesis. Cyanobacteria are unicellular organisms capable of photosynthesis and CO2 fixation. While engineering in heterotrophs, such as Escherichia coli, has resulted in a plethora of tools for strain development and hosts capable of producing valuable chemicals efficiently, these techniques are not always directly transferable to cyanobacteria. However, recent efforts have led to an increase in the scope and scale of chemicals that cyanobacteria can produce. Adaptations of important metabolic engineering tools have also been optimized to function in photoautotrophic hosts, including Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-Cas9, 13C Metabolic Flux Analysis (MFA), and Genome-Scale Modeling (GSM). This review explores innovations in cyanobacterial metabolic engineering, and highlights how photoautotrophic metabolism has shaped their development. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  10. Conducting Automated Test Assembly Using the Premium Solver Platform Version 7.0 with Microsoft Excel and the Large-Scale LP/QP Solver Engine Add-In

    ERIC Educational Resources Information Center

    Cor, Ken; Alves, Cecilia; Gierl, Mark J.

    2008-01-01

    This review describes and evaluates a software add-in created by Frontline Systems, Inc., that can be used with Microsoft Excel 2007 to solve large, complex test assembly problems. The combination of Microsoft Excel 2007 with the Frontline Systems Premium Solver Platform is significant because Microsoft Excel is the most commonly used spreadsheet…

  11. Scaling up high throughput field phenotyping of corn and soy research plots using ground rovers

    NASA Astrophysics Data System (ADS)

    Peshlov, Boyan; Nakarmi, Akash; Baldwin, Steven; Essner, Scott; French, Jasenka

    2017-05-01

    Crop improvement programs require large and meticulous selection processes that collect and analyze data effectively and accurately in order to generate quality plant products as efficiently as possible and to develop superior cropping and/or crop improvement methods. Typically, data collection for such testing is performed by field teams using hand-held instruments or manually-controlled devices. Although steps are taken to reduce error, the data collected in such a manner can be unreliable due to human error and fatigue, which reduces the ability to make accurate selection decisions. Monsanto engineering teams have developed a high-clearance mobile platform (Rover) as a step towards high-throughput, high-accuracy phenotyping at an industrial scale. The rovers are equipped with GPS navigation, multiple cameras and sensors, and on-board computers to acquire data and compute plant vigor metrics per plot. The supporting IT systems enable automatic path planning, plot identification, image and point cloud data QA/QC, and near real-time analysis where results are streamed to enterprise databases for additional statistical analysis and product advancement decisions. Since the rover program was launched in North America in 2013, the number of research plots we can analyze in a growing season has expanded dramatically. This work describes some of the successes and challenges in scaling up the rover platform for automated phenotyping to enable science at scale.

  12. Energy Conservation Projects to Benefit the Railroad Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clifford Mirman; Promod Vohra

    The Energy Conservation Projects to benefit the railroad industry, using the Norfolk Southern Company as a model for the industry, comprise five unique tasks in areas of importance within the rail industry, specifically energy conservation. The NIU Engineering and Technology research team looked at five significant areas in which research and development work can provide unique solutions to the railroad industry in energy conservation. (1) Alternate Fuels - An examination of various blends of bio-based diesel fuels for the railroad industry, using Norfolk Southern as a model for the industry. The team determined that bio-diesel fuel is a suitable alternative to using straight diesel fuel; however, the cost and availability across the country vary to a great extent. (2) Utilization of fuel cells for locomotive power systems - While the application of the fuel cell has been successfully demonstrated in the passenger car, this is a very advanced topic for the railroad industry. There are many safety and power issues that the research team examined. (3) Thermal and emission reduction for current large-scale diesel engines - The current locomotive system generates a large amount of heat through engine cooling and heat dissipation when the traction motors are used to decelerate the train. The research team evaluated thermal management systems to efficiently deal with the large thermal loads developed by the operating engines. (4) Use of Composite and Exotic Replacement Materials - The research team redesigned various components using new materials, coatings, and processes to provide the needed protection. Through design, analysis, and testing, new parts that can withstand the hostile environments were developed. (5) Tribology Applications - Identification of tribology issues in the railroad industry, which play a significant role in the improvement of energy usage. The research team analyzed and developed solutions which resulted in friction modification to improve energy efficiency.

  13. Large strain variable stiffness composites for shear deformations with applications to morphing aircraft skins

    NASA Astrophysics Data System (ADS)

    McKnight, G. P.; Henry, C. P.

    2008-03-01

    Morphing or reconfigurable structures potentially allow for previously unattainable vehicle performance by permitting several optimized structures to be achieved using a single platform. The key to enabling this technology in applications such as aircraft wings, nozzles, and control surfaces is new engineered materials which can achieve the necessary deformations but limit losses in parasitic actuation mass and structural efficiency (stiffness/weight). These materials should exhibit precise control of deformation properties and provide high stiffness when exercised through large deformations. In this work, we build upon previous efforts in segmented-reinforcement, variable-stiffness composites employing shape memory polymers to create prototype hybrid composite materials that combine the benefits of cellular materials with those of discontinuous-reinforcement composites. These composites help overcome two key challenges for shearing wing skins: resistance to out-of-plane buckling from actuation-induced shear deformation, and resistance to membrane deflections resulting from distributed aerodynamic pressure loading. We designed, fabricated, and tested composite materials intended for shear deformation and address out-of-plane deflections in variable-area wing skins. Our designs are based on the kinematic engineering of reinforcement platelets such that the desired microstructural kinematics is achieved through prescribed boundary conditions. We achieve this kinematic control by etching sheets of metallic reinforcement into regular patterns of platelets and connecting ligaments. This kinematic engineering allows optimization of material properties for a known deformation pathway. We use mechanical analysis and full-field photogrammetry to relate local-scale kinematics and strains to global deformations for both axial tension loading and shear loading with a pinned-diamond-type fixture. The Poisson ratio of the kinematically engineered composite is ~3x higher than that of prototypical orthotropic variable-stiffness composites. This design allows us to create composite materials that have high stiffness in the cold state below the SMP Tg (4-14 GPa) and yet achieve large composite shear strains (5-20%) in the hot state (above the SMP Tg).

  14. Hypersonic research engine/aerothermodynamic integration model, experimental results. Volume 1: Mach 6 component integration

    NASA Technical Reports Server (NTRS)

    Andrews, E. H., Jr.; Mackley, E. A.

    1976-01-01

    The NASA Hypersonic Research Engine (HRE) Project was initiated for the purpose of advancing the technology of airbreathing propulsion for hypersonic flight. The project encompassed a large component (inlet, combustor, and nozzle) and structures development program, culminating in tests of a full-scale (18 in. diameter cowl, 87 in. long) HRE concept, designated the Aerothermodynamic Integration Model (AIM), at Mach numbers of 5, 6, and 7. Computer program results for the Mach 6 component integration tests are presented.

  15. Inverse problems in the design, modeling and testing of engineering systems

    NASA Technical Reports Server (NTRS)

    Alifanov, Oleg M.

    1991-01-01

    Formulations, classification, areas of application, and approaches to solving different inverse problems are considered for the design of structures, modeling, and experimental data processing. Problems in the practical implementation of theoretical-experimental methods based on solving inverse problems are analyzed in order to identify mathematical models of physical processes, aid in input data preparation for design parameter optimization, help in design parameter optimization itself, and to model experiments, large-scale tests, and real tests of engineering systems.

  16. Reconstructing the duty of water: a study of emergent norms in socio-hydrology

    NASA Astrophysics Data System (ADS)

    Wescoat, J. L., Jr.

    2013-06-01

    This paper assesses changing norms of water use known as the duty of water. It is a case study in historical socio-hydrology, a line of research useful for anticipating changing social values with respect to water. The duty of water is currently defined as the amount of water reasonably required to irrigate a substantial crop with careful management and without waste on a given tract of land. The historical section of the paper traces this concept back to late-18th century analysis of steam engine efficiencies for mine dewatering in Britain. A half-century later, British irrigation engineers fundamentally altered the concept of duty to plan large-scale canal irrigation systems in northern India at an average duty of 218 acres per cubic foot per second (cfs). They justified this extensive irrigation standard (i.e., low water application rate over large areas) with a suite of social values that linked famine prevention with revenue generation and territorial control. Several decades later irrigation engineers in the western US adapted the duty of water concept to a different socio-hydrologic system and norms, using it to establish minimum standards for water rights appropriation (e.g., only 40 to 80 acres per cfs). The final section shows that while the duty of water concept has now been eclipsed by other measures and standards of water efficiency, it may have continuing relevance for anticipating if not predicting emerging social values with respect to water.
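
    As a rough arithmetic check on the duties quoted above, the sketch below converts acres per cubic foot per second into an average application depth; the numbers are straightforward unit conversions, not values taken from the historical sources:

      ACRE_M2 = 4046.86        # square metres per acre
      CFS_M3S = 0.0283168      # cubic metres per second per cfs
      SEC_DAY = 86400.0

      def duty_to_depth_mm_per_day(acres_per_cfs):
          """Average irrigation application depth implied by a duty, in mm/day."""
          area_m2 = acres_per_cfs * ACRE_M2
          vol_m3_per_day = CFS_M3S * SEC_DAY       # volume delivered by 1 cfs in a day
          return vol_m3_per_day / area_m2 * 1000.0  # metres -> millimetres

      for duty in (218, 80, 40):   # duties quoted in the abstract, acres per cfs
          print(f"{duty:>3} acres/cfs -> {duty_to_depth_mm_per_day(duty):5.1f} mm/day")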

  17. Analysis of high load dampers

    NASA Technical Reports Server (NTRS)

    Bhat, S. T.; Buono, D. F.; Hibner, D. H.

    1981-01-01

    High load damping requirements for modern jet engines are discussed. The design of damping systems which could satisfy these requirements is also discussed. In order to evaluate high load damping requirements, engines in three major classes were studied: large transport engines, small general aviation engines, and military engines. Four damper concepts applicable to these engines were evaluated: multi-ring, cartridge, curved beam, and viscous/friction. The most promising damper concept was selected for each engine, and performance was assessed relative to conventional dampers and in light of projected damping requirements for advanced jet engines.

  18. Japan’s Nuclear Future: Policy Debate, Prospects, and U.S. Interests

    DTIC Science & Technology

    2008-05-09

    ...raised in particular over the construction of an industrial-scale reprocessing facility in Japan. Additionally, fast breeder reactors also produce more... Nuclear Fuel Cycle Engineering Laboratories. A fast breeder reactor is a fast neutron reactor that produces more plutonium than it consumes, which can... Japan Nuclear Fuel Limited (JNFL) has built and is currently running active testing on a large-scale commercial reprocessing plant at Rokkasho-mura

  19. Innovative Double Bypass Engine for Increased Performance

    NASA Astrophysics Data System (ADS)

    Manoharan, Sanjivan

    Engines continue to grow in size to meet the current thrust requirements of the civil aerospace industry. Large engines pose significant transportation problems and must be split in order to be shipped. Thus, large amounts of time have been spent researching methods to increase thrust capabilities while maintaining a reasonable engine size. Unfortunately, much of this research has focused on increasing the performance and efficiencies of individual components, while limited research has been done on innovative engine configurations. This thesis focuses on an innovative engine configuration, the High Double Bypass Engine, aimed at increasing fuel efficiency and thrust while maintaining a competitive fan diameter and engine length. A 1-D analysis was done in Excel and then compared to results from the Numerical Propulsion Simulation System (NPSS) software; the two were found to agree within 4%. Flow performance characteristics were also determined and validated against their criteria.

  20. Flux analysis and metabolomics for systematic metabolic engineering of microorganisms.

    PubMed

    Toya, Yoshihiro; Shimizu, Hiroshi

    2013-11-01

    Rational engineering of metabolism is important for bio-production using microorganisms. Metabolic design based on in silico simulations and experimental validation of the metabolic state in the engineered strain helps in accomplishing systematic metabolic engineering. Flux balance analysis (FBA) is a method for the prediction of metabolic phenotype, and many applications have been developed using FBA to design metabolic networks. Elementary mode analysis (EMA) and ensemble modeling techniques are also useful tools for in silico strain design. The metabolome and flux distribution of the metabolic pathways enable us to evaluate the metabolic state and provide useful clues to improve target productivity. Here, we reviewed several computational applications for metabolic engineering by using genome-scale metabolic models of microorganisms. We also discussed the recent progress made in the field of metabolomics and (13)C-metabolic flux analysis techniques, and reviewed these applications pertaining to bio-production development. Because these in silico or experimental approaches have their respective advantages and disadvantages, the combined usage of these methods is complementary and effective for metabolic engineering. Copyright © 2013 Elsevier Inc. All rights reserved.
