Discovering Beaten Paths in Collaborative Ontology-Engineering Projects using Markov Chains
Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A.; Noy, Natalya F.
2014-01-01
Biomedical taxonomies, thesauri, and ontologies, such as the International Classification of Diseases (a taxonomy) or the National Cancer Institute Thesaurus (an OWL-based ontology), play a critical role in acquiring, representing, and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization, contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners, and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific other ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development. 
From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. PMID:24953242
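The Markov-chain analysis the abstract describes can be sketched as a first-order transition-probability estimate over per-user edit sequences. This is a minimal illustration under stated assumptions, not the authors' actual pipeline; the property names in the hypothetical usage log below are invented.

```python
from collections import defaultdict

def transition_matrix(sessions):
    """Estimate first-order Markov transition probabilities between
    edited properties from per-user editing sessions (lists of property names)."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for prev, nxt in zip(session, session[1:]):
            counts[prev][nxt] += 1
    probs = {}
    for prev, nxts in counts.items():
        total = sum(nxts.values())
        probs[prev] = {p: c / total for p, c in nxts.items()}
    return probs

# Hypothetical usage log: each list is one contributor's sequence of edited properties.
sessions = [
    ["title", "definition", "synonym"],
    ["title", "definition", "definition"],
    ["definition", "synonym"],
]
P = transition_matrix(sessions)
```

High-probability entries in such a matrix are the "beaten paths": properties that contributors tend to edit immediately after specific others.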
Education for Professional Engineering Practice
ERIC Educational Resources Information Center
Bramhall, Mike D.; Short, Chris
2014-01-01
This paper reports on a funded collaborative large-scale curriculum innovation and enhancement project undertaken as part of a UK National Higher Education Science, Technology Engineering and Mathematics (STEM) programme. Its aim was to develop undergraduate curricula to teach appropriate skills for professional engineering practice more…
NASA Astrophysics Data System (ADS)
Beichner, Robert
2015-03-01
The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamsmanship. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicates highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).
Services supporting collaborative alignment of engineering networks
NASA Astrophysics Data System (ADS)
Jansson, Kim; Uoti, Mikko; Karvonen, Iris
2015-08-01
Large-scale facilities such as power plants, process factories, ships and communication infrastructures are often engineered and delivered through geographically distributed operations. The competencies required are usually distributed across several contributing organisations. In these complicated projects, it is of key importance that all partners work coherently towards a common goal. VTT and a number of industrial organisations in the marine sector have participated in a national collaborative research programme addressing these needs. The main output of this programme was development of the Innovation and Engineering Maturity Model for Marine-Industry Networks. The recently completed European Union Framework Programme 7 project COIN developed innovative solutions and software services for enterprise collaboration and enterprise interoperability. One area of focus in that work was services for collaborative project management. This article first addresses a number of central underlying research themes and previous research results that have influenced the development work mentioned above. This article presents two approaches for the development of services that support distributed engineering work. Experience from use of the services is analysed, and potential for development is identified. This article concludes with a proposal for consolidation of the two above-mentioned methodologies. This article outlines the characteristics and requirements of future services supporting collaborative alignment of engineering networks.
Using Collaborative Engineering to Inform Collaboration Engineering
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.
2012-01-01
Collaboration is a critical competency for modern organizations as they struggle to compete in an increasingly complex, global environment. A large body of research on collaboration in the workplace focuses both on teams, investigating how groups use teamwork to perform their task work, and on the use of information systems to support team processes ("collaboration engineering"). This research essay presents collaboration from an engineering perspective ("collaborative engineering"). It uses examples from professional and student engineering teams to illustrate key differences in collaborative versus collaboration engineering and investigates how challenges in the former can inform opportunities for the latter.
Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...
2008-01-01
Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development, which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.
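The CCA component model couples packages through well-defined "provides" and "uses" ports wired together by a framework. The sketch below illustrates that idea only in spirit, using plain Python abstract base classes; it is not the actual CCA API, and all class names are hypothetical.

```python
from abc import ABC, abstractmethod

class IntegralPort(ABC):
    """A 'provides' port: an interface a component exposes to peer components."""
    @abstractmethod
    def integrate(self, f, a, b):
        ...

class MidpointIntegrator(IntegralPort):
    """A component providing the port via the midpoint quadrature rule."""
    def integrate(self, f, a, b, n=1000):
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

class Driver:
    """A component that 'uses' the IntegralPort; a framework wires the connection."""
    def __init__(self):
        self.port = None
    def connect(self, port: IntegralPort):
        self.port = port
    def run(self):
        return self.port.integrate(lambda x: x * x, 0.0, 1.0)

d = Driver()
d.connect(MidpointIntegrator())
result = d.run()  # close to 1/3
```

Because the driver depends only on the port interface, any conforming integrator component (possibly from another package or language binding) can be swapped in without changing the driver.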
PIV measurements of in-cylinder, large-scale structures in a water-analogue Diesel engine
NASA Astrophysics Data System (ADS)
Kalpakli Vester, A.; Nishio, Y.; Alfredsson, P. H.
2016-11-01
Swirl and tumble are large-scale structures that develop in an engine cylinder during the intake stroke. Their structure and strength depend on the design of the inlet ports and valves, but also on the valve lift history. Engine manufacturers tailor their designs to obtain a specific flow structure that is assumed to give the best engine performance. Despite many efforts, there are still open questions, such as how swirl and tumble depend on the dynamics of the valves/piston as well as how cycle-to-cycle variations should be minimized. In collaboration with the Swedish vehicle industry, we perform PIV measurements of the flow dynamics during the intake stroke inside a cylinder of a water-analogue engine model having the same geometrical characteristics as a typical truck Diesel engine. Water can be used since during the intake stroke the flow is nearly incompressible. The flow from the valves moves radially outwards, hits the vertical walls of the cylinder, entrains surrounding fluid, moves along the cylinder walls and creates a central backflow, i.e. a tumble motion. Depending on the port and valve design and orientation, none, low, or high swirl can be established. For the first time, the effect of the dynamic motion of the piston/valves on the large-scale structures is captured. Supported by the Swedish Energy Agency, Scania CV AB and Volvo GTT, through the FFI program.
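Quantifying swirl from a cross-plane PIV velocity field typically reduces to a scalar swirl metric. The function below is a simplified, assumed form (the ratio of mean angular momentum to the mean radial moment of speed), not the specific metric used in the study above; for ideal solid-body rotation it evaluates to 1.

```python
import numpy as np

def swirl_number(x, y, u, v):
    """Simplified planar swirl metric from a PIV cross-section:
    sum(r * u_theta) / sum(r * |velocity|).
    x, y: coordinates relative to the cylinder axis; u, v: in-plane velocities."""
    r = np.sqrt(x**2 + y**2)
    with np.errstate(invalid="ignore", divide="ignore"):
        # Tangential velocity component (counter-clockwise positive); 0 on the axis.
        ut = np.where(r > 0, (x * v - y * u) / r, 0.0)
    speed = np.sqrt(u**2 + v**2)
    return float(np.sum(r * ut) / np.sum(r * speed))

# Synthetic check: solid-body rotation (u, v) = (-y, x) is pure swirl.
n = 41
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
s = swirl_number(x, y, -y, x)
```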
Chen, Mingyang; Stott, Amanda C; Li, Shenggang; Dixon, David A
2012-04-01
A robust metadata database called the Collaborative Chemistry Database Tool (CCDBT) for massive amounts of computational chemistry raw data has been designed and implemented. It performs data synchronization and simultaneously extracts the metadata. Computational chemistry data in various formats from different computing sources, software packages, and users can be parsed into uniform metadata for storage in a MySQL database. Parsing is performed by a parsing pyramid, including parsers written for different levels of data types and sets created by the parser loader after loading parser engines and configurations. Copyright © 2011 Elsevier Inc. All rights reserved.
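The "parsing pyramid" idea of dispatching heterogeneous raw records to format-specific parser engines that emit uniform metadata can be sketched as below. The field names, package names, and record layout are hypothetical illustrations, not CCDBT's actual schema or configuration format.

```python
def parse_gaussian(raw):
    # Hypothetical parser engine for one source format.
    return {"package": "gaussian", "energy": float(raw["E"]), "method": raw["method"].upper()}

def parse_nwchem(raw):
    # Hypothetical parser engine for a second source format.
    return {"package": "nwchem", "energy": float(raw["total energy"]), "method": raw["theory"].upper()}

# The "parser loader": configuration maps each data source to its parser engine.
PARSERS = {"gaussian": parse_gaussian, "nwchem": parse_nwchem}

def to_metadata(source, raw):
    """Normalize raw records from different packages into uniform metadata
    rows, ready for insertion into a relational store such as MySQL."""
    return PARSERS[source](raw)

rows = [
    to_metadata("gaussian", {"E": "-76.4", "method": "b3lyp"}),
    to_metadata("nwchem", {"total energy": "-76.41", "theory": "ccsd(t)"}),
]
```

Because every parser emits the same keys, downstream storage and querying can treat all sources uniformly.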
Analyzing Team Based Engineering Design Process in Computer Supported Collaborative Learning
ERIC Educational Resources Information Center
Lee, Dong-Kuk; Lee, Eun-Sang
2016-01-01
The engineering design process has been largely implemented in a collaborative project format. Recently, technological advancement has helped collaborative problem solving processes such as engineering design to have efficient implementation using computers or online technology. In this study, we investigated college students' interaction and…
Collaborative Multi-Scale 3d City and Infrastructure Modeling and Simulation
NASA Astrophysics Data System (ADS)
Breunig, M.; Borrmann, A.; Rank, E.; Hinz, S.; Kolbe, T.; Schilcher, M.; Mundani, R.-P.; Jubierre, J. R.; Flurl, M.; Thomsen, A.; Donaubauer, A.; Ji, Y.; Urban, S.; Laun, S.; Vilgertshofer, S.; Willenborg, B.; Menninghaus, M.; Steuer, H.; Wursthorn, S.; Leitloff, J.; Al-Doori, M.; Mazroobsemnani, N.
2017-09-01
Computer-aided collaborative and multi-scale 3D planning are challenges for complex railway and subway track infrastructure projects in the built environment. Many legal, economic, environmental, and structural requirements have to be taken into account. The stringent use of 3D models in the different phases of the planning process facilitates communication and collaboration between stakeholders such as civil engineers, geological engineers, and decision makers. This paper presents concepts, developments, and experiences gained by an interdisciplinary research group coming from civil engineering informatics and geo-informatics, combining the skills of both the Building Information Modeling and the 3D GIS worlds. New approaches including the development of a collaborative platform and 3D multi-scale modelling are proposed for collaborative planning and simulation to improve the digital 3D planning of subway tracks and other infrastructures. Experiences during this research and lessons learned are presented as well as an outlook on future research focusing on Building Information Modeling and 3D GIS applications for cities of the future.
Payne, Philip R.O.; Borlawsky, Tara B.; Rice, Robert; Embi, Peter J.
2010-01-01
With the growing prevalence of large-scale, team science endeavors in the biomedical and life science domains, the impetus to implement platforms capable of supporting asynchronous interaction among multidisciplinary groups of collaborators has increased commensurately. However, there is a paucity of literature describing systematic approaches to identifying the information needs of targeted end-users for such platforms, and the translation of such requirements into practicable software component design criteria. In previous studies, we have reported upon the efficacy of employing conceptual knowledge engineering (CKE) techniques to systematically address both of the preceding challenges in the context of complex biomedical applications. In this manuscript we evaluate the impact of CKE approaches relative to the design of a clinical and translational science collaboration portal, and report upon the preliminary qualitative user satisfaction reported for the resulting system. PMID:21347146
Collaborative-Large scale Engineering Assessment Networks for Environmental Research: The Overview
NASA Astrophysics Data System (ADS)
Moo-Young, H.
2004-05-01
A networked infrastructure for engineering solutions and policy alternatives is necessary to assess, manage, and protect complex, anthropogenically stressed environmental resources effectively. Reductionist and discrete disciplinary methodologies are no longer adequate to evaluate and model complex environmental systems and anthropogenic stresses. While the reductionist approach provides important information regarding individual mechanisms, it cannot provide complete information about how multiple processes are related. Therefore, it is not possible to make accurate predictions about system responses to engineering interventions and the effectiveness of policy options. For example, experts cannot agree on best management strategies for contaminated sediments in riverine and estuarine systems. This is due, in part, to the fact that existing models do not accurately capture integrated system dynamics. In addition, infrastructure is not available for investigators to exchange and archive data, to collaborate on new investigative methods, and to synthesize these results to develop engineering solutions and policy alternatives. Our vision for the future is to create a network comprising field facilities and a collaboration of engineers, scientists, policy makers, and community groups. This will allow integration across disciplines, across different temporal and spatial scales, surface and subsurface geographies, and air sheds and watersheds. Benefits include fast response to changes in system health, real-time decision making, and continuous data collection that can be used to anticipate future problems, and to develop sound engineering solutions and management decisions. CLEANER encompasses four general aspects: 1) A Network of environmental field facilities instrumented for the acquisition and analysis of environmental data; 2) A Virtual Repository of Data and information technology for engineering modeling, analysis and visualization of data, i.e. 
an environmental cyber-infrastructure; 3) A Mechanism for multidisciplinary research and education activities designed to exploit the output of the instrumented sites and networked information technology, to formulate engineering and policy options directed toward the protection, remediation, and restoration of stressed environments and sustainability of environmental resources; and 4) A Collaboration among engineers, natural and social scientists, educators, policy makers, industry, non-governmental organizations, the public, and other stakeholders.
Multi-source Geospatial Data Analysis with Google Earth Engine
NASA Astrophysics Data System (ADS)
Erickson, T.
2014-12-01
The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
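One of the chores the abstract says Earth Engine automates is resampling rasters from different sensors onto a common grid before combining them. Outside the platform, the core of that alignment step can be sketched (under the simplifying assumptions of an integer resolution ratio and no reprojection) as block averaging:

```python
import numpy as np

def downsample(grid, factor):
    """Block-average a fine raster to a coarser resolution so that bands
    from different sensors can be combined cell-by-cell on a common grid."""
    h, w = grid.shape
    g = grid[: h - h % factor, : w - w % factor]  # trim to a multiple of factor
    return g.reshape(g.shape[0] // factor, factor,
                     g.shape[1] // factor, factor).mean(axis=(1, 3))

# Hypothetical example: a 4x4 "30 m" band combined with a 2x2 "60 m" band.
fine = np.arange(16, dtype=float).reshape(4, 4)
coarse = np.ones((2, 2))
ratio = downsample(fine, 2) / coarse  # both operands are now 2x2
```

Earth Engine performs this kind of harmonization (plus reprojection and format conversion) implicitly, which is what makes multi-sensor expressions so concise on the platform.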
Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)
DOE Office of Scientific and Technical Information (OSTI.GOV)
William J. Schroeder
2011-11-13
This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling at Kitware Inc. in collaboration with Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data as illustrated in the figure below. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. 
SLAC has a computationally-intensive problem important to the nation's scientific progress as described shortly. Further, SLAC researchers routinely generate massive amounts of data, and frequently collaborate with other researchers located around the world. Thus SLAC is an ideal teammate through which to develop, test and deploy this technology. The nature of the datasets generated by simulations performed at SLAC presented unique visualization challenges, especially when dealing with higher-order elements, that were addressed during this Phase II. During this Phase II, we have developed a strong platform for collaborative visualization based on ParaView. We have developed and deployed a ParaView Web Visualization framework that can be used for effective collaboration over the Web. Collaborating and visualizing over the Web presents the community with unique opportunities for sharing and accessing visualization and HPC resources that hitherto were either inaccessible or difficult to use. The technology we developed here will alleviate both of these issues as it becomes widely deployed and adopted.
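The "work locally, then periodically rejoin and synchronize" session model described above can be illustrated with a toy state-merge sketch. This is a hypothetical, last-revision-wins merge for annotations only, far simpler than ParaView's actual session protocol:

```python
import json

class Session:
    """Minimal sketch of collaborative-session state: clients annotate locally,
    then periodically rejoin and merge their annotations with the group."""
    def __init__(self):
        self.annotations = {}  # annotation id -> (revision, text)

    def merge(self, client_annotations):
        # Keep whichever copy of each annotation carries the higher revision.
        for aid, (rev, text) in client_annotations.items():
            if aid not in self.annotations or rev > self.annotations[aid][0]:
                self.annotations[aid] = (rev, text)

    def snapshot(self):
        # Serialized state a rejoining client would download.
        return json.dumps(self.annotations, sort_keys=True)

s = Session()
s.merge({"a1": (1, "check mesh near cavity")})
s.merge({"a1": (2, "mesh fixed"), "a2": (1, "higher-order artifact?")})
```

A real implementation must also handle concurrent edits to the same revision, pipeline state, and session control, which is where most of the engineering effort lies.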
Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping
NASA Astrophysics Data System (ADS)
Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.
2017-12-01
Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive combined with other satellite, climate, and weather data, is creating never imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field-scales.
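The core of the SSEBop approach is an ET fraction that scales linearly between a cold (well-watered) reference temperature and a hot (dry) one, multiplied by reference ET. The sketch below is a simplified form of that logic with hypothetical inputs and coefficients, not the operational USGS implementation:

```python
import numpy as np

def ssebop_eta(lst, t_cold, dt, eto, k=1.0):
    """Simplified SSEBop-style actual ET estimate.
    lst:    land surface temperature (K), e.g. from Landsat thermal bands
    t_cold: cold/wet reference temperature (K)
    dt:     predefined temperature difference; hot reference = t_cold + dt
    eto:    reference ET (mm/day); k: scaling coefficient."""
    t_hot = t_cold + dt
    etf = np.clip((t_hot - lst) / dt, 0.0, 1.0)  # ET fraction in [0, 1]
    return etf * k * eto

# Hypothetical pixels: fully wet, intermediate, and fully dry surfaces.
lst = np.array([300.0, 305.0, 310.0])
eta = ssebop_eta(lst, t_cold=300.0, dt=10.0, eto=6.0)
```

Applied per pixel over the Landsat archive (as in the Earth Engine workflow above), this yields wall-to-wall field-scale ET maps.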
Aspects of Mutual Engagement: School of Engineering and Industry Collaborations
ERIC Educational Resources Information Center
Stroud, Dean; Hopkins, Andrew
2016-01-01
This paper is a case study of collaboration between a large steel company and a university's school of engineering. Our aim is to contribute to understandings of engagement between employers and higher education institutions and explore some of the complexities of such collaborations in their initiation and propagation. The analysis derives from…
A Grid Infrastructure for Supporting Space-based Science Operations
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)
2002-01-01
Emerging technologies for computational grid infrastructures have the potential for revolutionizing the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing for less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.
Crowdsourcing biomedical research: leveraging communities as innovation engines
Saez-Rodriguez, Julio; Costello, James C.; Friend, Stephen H.; Kellen, Michael R.; Mangravite, Lara; Meyer, Pablo; Norman, Thea; Stolovitzky, Gustavo
2018-01-01
The generation of large-scale biomedical data is creating unprecedented opportunities for basic and translational science. Typically, the data producers perform initial analyses, but it is very likely that the most informative methods may reside with other groups. Crowdsourcing the analysis of complex and massive data has emerged as a framework to find robust methodologies. When the crowdsourcing is done in the form of collaborative scientific competitions, known as Challenges, the validation of the methods is inherently addressed. Challenges also encourage open innovation, create collaborative communities to solve diverse and important biomedical problems, and foster the creation and dissemination of well-curated data repositories. PMID:27418159
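The Challenge mechanism described above validates methods by scoring submissions against held-out data that participants never see. A minimal sketch of that scoring step (hypothetical teams, labels, and a simple mean-squared-error metric; real Challenges use task-specific metrics):

```python
def score(predictions, truth):
    """Mean squared error of one submission against held-out validation labels."""
    return sum((p - t) ** 2 for p, t in zip(predictions, truth)) / len(truth)

def leaderboard(submissions, truth):
    """Rank teams by score, best (lowest MSE) first."""
    return sorted(((team, score(p, truth)) for team, p in submissions.items()),
                  key=lambda kv: kv[1])

# Hypothetical held-out labels and two team submissions.
truth = [1.0, 2.0, 3.0]
submissions = {"team_a": [1.1, 2.1, 2.9], "team_b": [0.0, 0.0, 0.0]}
board = leaderboard(submissions, truth)
```

Because every team is evaluated on the same unseen data, the ranking itself constitutes the unbiased validation the abstract refers to.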
Engineering large-scale agent-based systems with consensus
NASA Technical Reports Server (NTRS)
Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.
1994-01-01
The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge-based agents (KBAs) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code, thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.
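The traceability property the abstract emphasizes (every requirement maps to design components, and every component is justified by a requirement) lends itself to a mechanical check. A minimal sketch, with hypothetical requirement IDs and agent names:

```python
def trace(requirements, design):
    """Verify the direct correspondence between requirements and design:
    return requirements covered by no component, and components justified
    by no requirement."""
    covered = {r for reqs in design.values() for r in reqs}
    unmet = set(requirements) - covered
    unjustified = {c for c, reqs in design.items()
                   if not set(reqs) & set(requirements)}
    return unmet, unjustified

requirements = ["R1", "R2", "R3"]
design = {
    "PlannerAgent":   ["R1"],
    "SchedulerAgent": ["R2"],
    "LoggerAgent":    [],     # no requirement justifies this component
}
unmet, unjustified = trace(requirements, design)
```

Run as part of a build, such a check keeps the requirements-to-design mapping verifiable as the agent network grows.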
ERIC Educational Resources Information Center
Matsuba, Ryuichi; Suzuki, Yusei; Kubota, Shin-Ichiro; Miyazaki, Makoto
2015-01-01
We study tactics for writing skills development through cross-disciplinary learning in online large-scale classes, and are particularly interested in the implementation of online collaborative activities such as peer review of writing. The goal of our study is to carry out collaborative work online efficiently and effectively in large-scale…
ERIC Educational Resources Information Center
Ghosh, Jaideep; Kshitij, Avinash
2017-01-01
This article introduces a number of methods that can be useful for examining the emergence of large-scale structures in collaboration networks. The study contributes to sociological research by investigating how clusters of research collaborators evolve and sometimes percolate in a collaboration network. Typically, we find that in our networks,…
2007-01-01
Mechanical Turk: Artificial Artificial Intelligence. Retrieved May 15, 2006, from http://www.mturk.com/mturk/welcome. The Mechanical Turk site (Amazon, 2006) goes beyond volunteers and pays people to do Human Intelligence Tasks, those that are difficult for computers but relatively easy for people. The record also addresses geographically distributed scientific collaboration and the use of videogame technology for training. Address: U.S. Army Research Institute, 2511 Jefferson
Potential Collaborative Research topics with Korea’s Agency for Defense Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrar, Charles R.; Todd, Michael D.
2012-08-23
This presentation provides a high level summary of current research activities at the Los Alamos National Laboratory (LANL)-University of California Jacobs School of Engineering (UCSD) Engineering Institute that will be presented at Korea's Agency for Defense Development (ADD). These research activities are at the basic engineering science level, with different levels of maturity ranging from initial concepts to field proof-of-concept demonstrations. We believe that all of these activities are appropriate for collaborative research with ADD, subject to approval by each institution. All the activities summarized herein share the common theme that they are multi-disciplinary in nature and typically involve the integration of high-fidelity predictive modeling, advanced sensing technologies, and new developments in information technology. These activities include: Wireless Sensor Systems, Swarming Robot sensor systems, Advanced signal processing (compressed sensing) and pattern recognition, Model Verification and Validation, Optimal/robust sensor system design, Haptic systems for large-scale data processing, Cyber-physical security for robots, Multi-source energy harvesting, Reliability-based approaches to damage prognosis, SHMTools software development, and Cyber-physical systems advanced study institute.
The Aeolus project: Science outreach through art.
Drumm, Ian A; Belantara, Amanda; Dorney, Steve; Waters, Timothy P; Peris, Eulalia
2015-04-01
With a general decline in the number of people choosing to pursue science and engineering degrees, there has never been a greater need to raise awareness of lesser-known fields such as acoustics. Given this context, a large-scale public engagement project, the 'Aeolus project', was created to raise awareness of acoustics science through a major collaboration between an acclaimed artist and acoustics researchers. It centred on touring the large singing sculpture Aeolus during 2011/12, though the project also included an extensive outreach programme of talks, exhibitions, community workshops and resources for schools. Described here are the motivations behind the project and the artwork itself, the ways in which scientists and an artist collaborated, and the public engagement activities designed as part of the project. Evaluation results suggest that the project achieved its goal of inspiring interest in the discipline of acoustics through the exploration of an other-worldly work of art. © The Author(s) 2013.
Software architecture and engineering for patient records: current and future.
Weng, Chunhua; Levine, Betty A; Mun, Seong K
2009-05-01
During the "The National Forum on the Future of the Defense Health Information System," a track focusing on "Systems Architecture and Software Engineering" included eight presenters. These presenters identified three key areas of interest in this field, which include the need for open enterprise architecture and a federated database design, net centrality based on service-oriented architecture, and the need for focus on software usability and reusability. The eight panelists provided recommendations related to the suitability of service-oriented architecture and the enabling technologies of grid computing and Web 2.0 for building health services research centers and federated data warehouses to facilitate large-scale collaborative health care and research. Finally, they discussed the need to leverage industry best practices for software engineering to facilitate rapid software development, testing, and deployment.
NASA Astrophysics Data System (ADS)
Wardzinska, Aleksandra; Petit, Stephan; Bray, Rachel; Delamare, Christophe; Garcia Arza, Griselda; Krastev, Tsvetelin; Pater, Krzysztof; Suwalska, Anna; Widegren, David
2015-12-01
Large-scale, long-term projects such as the LHC require the ability to store, manage, organize and distribute large amounts of engineering information, covering a wide spectrum of fields. This information is a living material, evolving in time and following specific lifecycles. It has to reach the next generations of engineers so they understand how their predecessors designed, crafted, operated and maintained the most complex machines ever built. This is the role of CERN EDMS. The Engineering and Equipment Data Management Service has served the High Energy Physics community for over 15 years. It is CERN's official PLM (Product Lifecycle Management) system, supporting engineering communities in their collaborations inside and outside the laboratory. EDMS is integrated with the CAD (Computer-Aided Design) and CMMS (Computerized Maintenance Management System) systems used at CERN, providing tools for engineers who work in different domains and who are not PLM specialists. Over the years, human collaborations and machines grew in size and complexity, and so did EDMS: it is currently home to more than 2 million files and documents, and has over 6,000 active users. In April 2014 we released a new major version of EDMS, featuring a complete makeover of the web interface, improved responsiveness and enhanced functionality. Following the results of user surveys and building upon feedback received from key user groups, we believe we have delivered a system that is more attractive and that makes complex tasks easier to perform. In this paper we describe the main functions and the architecture of EDMS. We discuss the available integration options, which enable further evolution and automation of engineering data management. We also present our plans for the future development of EDMS.
Stahl, Olivier; Duvergey, Hugo; Guille, Arnaud; Blondin, Fanny; Vecchio, Alexandre Del; Finetti, Pascal; Granjeaud, Samuel; Vigy, Oana; Bidaut, Ghislain
2013-06-06
With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline collaborative data storage and annotation. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are fine-grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material.
2013-01-01
Background With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline collaborative data storage and annotation. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are fine-grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665
Low order climate models as a tool for cross-disciplinary collaboration
NASA Astrophysics Data System (ADS)
Newton, R.; Pfirman, S. L.; Tremblay, B.; Schlosser, P.
2014-12-01
Human impacts on climate are pervasive and significant, and future climate states cannot be projected without taking human influence into account. We recently helped convene a meeting of climatologists, policy analysts, lawyers and social scientists to discuss the dramatic loss in Arctic summer sea ice. A dialogue emerged around distinct time scales in the integrated human/natural climate system. Climate scientists tended to discuss engineering solutions as though they could be implemented immediately, whereas social scientists estimated lags of two or more decades for societal shifts, and engineers cited similar lags for deployment. Social scientists tended to project new climate states virtually overnight, while climatologists described time scales of decades to centuries for the system to respond to changes in forcing functions. For the conversation to develop, the group had to come to grips with an increasingly complex set of transient-effect time scales and lags between decisions, changes in forcing, and system outputs. We use several low-order dynamical system models to explore mismatched timescales, ranges of lags, and the effects of uncertainty in cost estimates on climate outcomes, focusing on Arctic-specific issues. In addition to lessons regarding what is and is not feasible from a policy and engineering perspective, these models provide a useful tool to concretize cross-disciplinary thinking. They are fast and easy to iterate through a large region of the problem space, while including surprising complexity in their evolution. Thus they are appropriate for investigating the implications of policy in an efficient, but not unrealistic, physical setting. (Earth System Models, by contrast, can be too resource- and time-intensive for iteratively testing "what if" scenarios in cross-disciplinary collaborations.)
Our runs indicate, for example, that the combined social, engineering and climate physics lags make it extremely unlikely that an ice-free summer ecology in the Arctic can be avoided. Further, if prospective remediation strategies are successful, a return to perennial ice conditions between one and two centuries from now is entirely likely, with interesting and large impacts on Northern economies.
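The lag-dominated dynamics this entry describes can be illustrated with a deliberately minimal sketch. All names, parameters, and timescales below are illustrative assumptions, not the authors' actual model: a single normalized anomaly relaxes toward the current forcing level, and a remediation decision only takes effect after a combined social/engineering lag.

```python
def simulate(years=300, policy_lag=25, response_time=40, decision_year=10):
    """Toy first-order model with a policy lag (illustrative parameters).

    A normalized anomaly relaxes toward the current forcing level with
    timescale `response_time`; a remediation decision made at
    `decision_year` only zeroes the forcing after `policy_lag` years.
    """
    state = 1.0  # normalized anomaly (e.g., summer ice loss)
    history = []
    for t in range(years):
        # Forcing stays high until the decision has worked through the lag.
        forcing = 0.0 if t >= decision_year + policy_lag else 1.0
        state += (forcing - state) / response_time  # first-order relaxation
        history.append(state)
    return history

trajectory = simulate()
```

Even in this toy setting, the anomaly is unchanged for decades after the decision and needs on the order of a century or two to return near zero, which is the qualitative point about stacked lags made above.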
Future Directions in Medical Physics: Models, Technology, and Translation to Medicine
NASA Astrophysics Data System (ADS)
Siewerdsen, Jeffrey
The application of physics in medicine has been integral to major advances in diagnostic and therapeutic medicine. Two primary areas represent the mainstay of medical physics research in the last century: in radiation therapy, physicists have propelled advances in conformal radiation treatment and high-precision image guidance; and in diagnostic imaging, physicists have advanced an arsenal of multi-modality imaging that includes CT, MRI, ultrasound, and PET as indispensable tools for noninvasive screening, diagnosis, and assessment of treatment response. In addition to their role in building such technologically rich fields of medicine, physicists have also become integral to daily clinical practice in these areas. The future suggests new opportunities for multi-disciplinary research bridging physics, biology, engineering, and computer science, and collaboration in medical physics carries a strong capacity for identification of significant clinical needs, access to clinical data, and translation of technologies to clinical studies. In radiation therapy, for example, the extraction of knowledge from large datasets on treatment delivery, image-based phenotypes, genomic profile, and treatment outcome will require innovation in computational modeling and connection with medical physics for the curation of large datasets. Similarly in imaging physics, the demand for new imaging technology capable of measuring physical and biological processes over orders of magnitude in scale (from molecules to whole organ systems) and exploiting new contrast mechanisms for greater sensitivity to molecular agents and subtle functional / morphological change will benefit from multi-disciplinary collaboration in physics, biology, and engineering. 
Also in surgery and interventional radiology, where needs for increased precision and patient safety meet constraints in cost and workflow, development of new technologies for imaging, image registration, and robotic assistance can leverage collaboration in physics, biomedical engineering, and computer science. In each area, there is major opportunity for multi-disciplinary collaboration with medical physics to accelerate the translation of such technologies to clinical use. Research supported by the National Institutes of Health, Siemens Healthcare, and Carestream Health.
NASA Astrophysics Data System (ADS)
Kapoglou, A.
2017-10-01
This presentation will describe how to build the foundations needed for a large scale, cross-industry collaboration to enable a sustainable and permanent return to the Moon based on system leadership, cross-sector partnership, and inclusive business.
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Evans, B. J. K.; Pugh, T.; Lescinsky, D. T.; Foster, C.; Uhlherr, A.
2014-12-01
The National Computational Infrastructure (NCI) at the Australian National University (ANU) is a partnership between CSIRO, ANU, the Bureau of Meteorology (BoM) and Geoscience Australia. Recent investments in a 1.2 PFlop supercomputer (Raijin), ~20 PB of data storage using Lustre filesystems and a 3000-core high-performance cloud have created a hybrid platform for high-performance computing and data-intensive science to enable large-scale earth and climate systems modelling and analysis. There are > 3000 users actively logging in and > 600 projects on the NCI system. Efficiently scaling and adapting data and software systems to petascale infrastructures requires the collaborative development of an architecture that is designed, programmed and operated to enable users to interactively invoke different forms of in-situ computation over complex and large-scale data collections. NCI makes available major and long-tail data collections from both the government and research sectors based on six themes: 1) weather, climate and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy and the biological and social sciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. Collections are the operational form for data management and access. Similar data types from individual custodians are managed cohesively. Use of international standards for discovery and interoperability allows complex interactions within and between the collections. This design facilitates a transdisciplinary approach to research and enables a shift from small-scale, 'stove-piped' science efforts to large-scale, collaborative systems science. This new and complex infrastructure requires a move to shared, globally trusted software frameworks that can be maintained and updated. 
Workflow engines become essential and need to integrate provenance, versioning, traceability, repeatability and publication. There are also human-resource challenges, as highly skilled HPC/HPD specialists, specialist programmers, and data scientists are required with skills that can support scaling to the new paradigm of effective and efficient data-intensive earth science analytics on petascale, and soon exascale, systems.
Engineered Plants Make Potential Precursor to Raw Material for Plastics
Shanklin, John
2018-06-12
In a first step toward achieving industrial-scale green production, scientists from BNL and collaborators at Dow AgroSciences report engineering a plant that produces industrially relevant levels of chemicals that could potentially be used to make plastics.
ERIC Educational Resources Information Center
Bakah, Marie Afua Baah; Voogt, Joke M.; Pieters, Jules M.
2012-01-01
Polytechnic staff perspectives are sought on the sustainability and large-scale implementation of design teams (DT), as a means for collaborative curriculum design and teacher professional development in Ghana's polytechnics, months after implementation. Data indicates that teachers still collaborate in DTs for curriculum design and professional…
Implementing Collaborative Learning across the Engineering Curriculum
ERIC Educational Resources Information Center
Ralston, Patricia A. S.; Tretter, Thomas R.; Kendall-Brown, Marie
2017-01-01
Active and collaborative teaching methods increase student learning, and it is broadly accepted that almost any active or collaborative approach will improve learning outcomes as compared to lecture. Yet, large numbers of faculty have not embraced these methods. Thus, the challenge to encourage evidence-based change in teaching is not only how to…
A community-based, interdisciplinary rehabilitation engineering course.
Lundy, Mary; Aceros, Juan
2016-08-01
A novel, community-based course was created through collaboration between the School of Engineering and the Physical Therapy program at the University of North Florida. This course offers a hands-on, interdisciplinary training experience for undergraduate engineering students through team-based design projects where engineering students are partnered with physical therapy students. Students learn the process of design, fabrication and testing of low-tech and high-tech rehabilitation technology for children with disabilities, and are exposed to a clinical experience under the guidance of licensed therapists. This course was taught in two consecutive years and pre-test/post-test data evaluating the impact of this interprofessional education experience on the students is presented using the Public Service Motivation Scale, Civic Actions Scale, Civic Attitudes Scale, and the Interprofessional Socialization and Valuing Scale.
Waste IPSC : Thermal-Hydrologic-Chemical-Mechanical (THCM) modeling and simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeze, Geoffrey A.; Wang, Yifeng; Arguello, Jose Guadalupe, Jr.
2010-10-01
The Waste IPSC objective is to develop an integrated suite of high-performance computing capabilities to simulate radionuclide movement through the engineered components and geosphere of a radioactive waste storage or disposal system: (1) with robust thermal-hydrologic-chemical-mechanical (THCM) coupling; (2) for a range of disposal system alternatives (concepts, waste form types, engineered designs, geologic settings); (3) for long time scales and associated large uncertainties; (4) at multiple model fidelities (sub-continuum, high-fidelity continuum, PA); and (5) in accordance with V&V and software quality requirements. THCM Modeling collaborates with: (1) other Waste IPSC activities: Sub-Continuum Processes (and FMM), Frameworks and Infrastructure (and VU, ECT, and CT); (2) the Waste Form Campaign; (3) the Used Fuel Disposition (UFD) Campaign; and (4) ASCEM.
Pettigrew, Luisa M; Kumpunen, Stephanie; Mays, Nicholas; Rosen, Rebecca; Posaner, Rachel
2018-03-01
Over the past decade, collaboration between general practices in England to form new provider networks and large-scale organisations has been driven largely by grassroots action among GPs. However, it is now being increasingly advocated for by national policymakers. Expectations of what scaling up general practice in England will achieve are significant. This study aimed to review the evidence of the impact of new forms of large-scale general practice provider collaborations in England, using a systematic review. Embase, MEDLINE, Health Management Information Consortium, and Social Sciences Citation Index were searched for studies reporting the impact on clinical processes and outcomes, patient experience, workforce satisfaction, or costs of new forms of provider collaborations between general practices in England. A total of 1782 publications were screened. Five studies met the inclusion criteria, and four examined the same general practice networks, limiting generalisability. Substantial financial investment was required to establish the networks and the associated interventions that were targeted at four clinical areas. Quality improvements were achieved through standardised processes, incentives at network level, information technology-enabled performance dashboards, and local network management. The fifth study, of a large-scale multisite general practice organisation, showed that it may be better placed to implement safety and quality processes than conventional practices. However, unintended consequences may arise, such as perceptions of disenfranchisement among staff and reductions in continuity of care. Good-quality evidence of the impacts of scaling up general practice provider organisations in England is scarce. As more general practice collaborations emerge, evaluation of their impacts will be important to understand which ones work, in which settings, how, and why. © British Journal of General Practice 2018.
NASA Astrophysics Data System (ADS)
Noor, Ahmed K.
2013-12-01
Some of the recent attempts for improving and transforming engineering education are reviewed. The attempts aim at providing the entry level engineers with the skills needed to address the challenges of future large-scale complex systems and projects. Some of the frontier sectors and future challenges for engineers are outlined. The major characteristics of the coming intelligence convergence era (the post-information age) are identified. These include the prevalence of smart devices and environments, the widespread applications of anticipatory computing and predictive / prescriptive analytics, as well as a symbiotic relationship between humans and machines. Devices and machines will be able to learn from, and with, humans in a natural collaborative way. The recent game changers in learnscapes (learning paradigms, technologies, platforms, spaces, and environments) that can significantly impact engineering education in the coming era are identified. Among these are open educational resources, knowledge-rich classrooms, immersive interactive 3D learning, augmented reality, reverse instruction / flipped classroom, gamification, robots in the classroom, and adaptive personalized learning. Significant transformative changes in, and mass customization of, learning are envisioned to emerge from the synergistic combination of the game changers and other technologies. The realization of the aforementioned vision requires the development of a new multidisciplinary framework of emergent engineering for relating innovation, complexity and cybernetics, within the future learning environments. The framework can be used to treat engineering education as a complex adaptive system, with dynamically interacting and communicating components (instructors, individual, small, and large groups of learners). The emergent behavior resulting from the interactions can produce progressively better, and continuously improving, learning environment. 
As a first step towards the realization of the vision, intelligent adaptive cyber-physical ecosystems need to be developed to facilitate collaboration between the various stakeholders of engineering education, and to accelerate the development of a skilled engineering workforce. The major components of the ecosystems include integrated knowledge discovery and exploitation facilities, blended learning and research spaces, novel ultra-intelligent software agents, multimodal and autonomous interfaces, and networked cognitive and tele-presence robots.
Programming (Tips) for Physicists & Engineers
Ozcan, Erkcan
2018-02-19
Programming for today's physicists and engineers. Work environment: today's astroparticle and accelerator experiments and the information industry rely on large collaborations. Needed more than ever: code sharing/reuse, code building and framework integration, documentation and good visualization, working remotely, and not reinventing the wheel.
Programming (Tips) for Physicists & Engineers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozcan, Erkcan
2010-07-13
Programming for today's physicists and engineers. Work environment: today's astroparticle and accelerator experiments and the information industry rely on large collaborations. Needed more than ever: code sharing/reuse, code building and framework integration, documentation and good visualization, working remotely, and not reinventing the wheel.
Improving Collaborative Learning in Online Software Engineering Education
ERIC Educational Resources Information Center
Neill, Colin J.; DeFranco, Joanna F.; Sangwan, Raghvinder S.
2017-01-01
Team projects are commonplace in software engineering education. They address a key educational objective, provide students critical experience relevant to their future careers, allow instructors to set problems of greater scale and complexity than could be tackled individually, and are a vehicle for socially constructed learning. While all…
Cloud computing for genomic data analysis and collaboration.
Langmead, Ben; Nellore, Abhinav
2018-04-01
Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.
Applying object-oriented software engineering at the BaBar collaboration
NASA Astrophysics Data System (ADS)
Jacobsen, Bob; BaBar Collaboration Reconstruction Software Group
1997-02-01
The BaBar experiment at SLAC will start taking data in 1999. We are attempting to build its reconstruction software using good software engineering practices, including the use of object-oriented technology. We summarize our experience to date with analysis and design activities, training, CASE and documentation tools, C++ programming practice and similar topics. The emphasis is on the practical issues of simultaneously introducing new techniques to a large collaboration while under a deadline for system delivery.
NASA Astrophysics Data System (ADS)
Fekete, Tamás
2018-05-01
Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific engineering paradigm, structural integrity, has been developing that is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments and numerics. Although the application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is because existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially based on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research has been initiated at MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity. 
The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and to ultimately find a theoretical framework that could serve as a well-grounded theoretical foundation for a new modeling framework of structural integrity. This paper presents the first findings of the research project.
Large-Scale 3D Printing: The Way Forward
NASA Astrophysics Data System (ADS)
Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid
2018-03-01
Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.
The Emergence of Dominant Design(s) in Large Scale Cyber-Infrastructure Systems
ERIC Educational Resources Information Center
Diamanti, Eirini Ilana
2012-01-01
Cyber-infrastructure systems are integrated large-scale IT systems designed with the goal of transforming scientific practice by enabling multi-disciplinary, cross-institutional collaboration. Their large scale and socio-technical complexity make design decisions for their underlying architecture practically irreversible. Drawing on three…
WebViz:A Web-based Collaborative Interactive Visualization System for large-Scale Data Sets
NASA Astrophysics Data System (ADS)
Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.
2010-12-01
WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota's Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily with the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general-purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data 'on the fly', wherever they may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE's custom hierarchical volume rendering software provides high-resolution visualizations on the order of 15 million pixels and has been employed for visualizing data primarily from simulations in astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. 
Features in the current version include the ability for users to (1) securely log in, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface of the web application. These features are all in addition to a full range of essential visualization functions, including 3-D camera and object orientation, position manipulation, time-stepping control, and custom color/alpha mapping.
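The shared-session model this entry describes, in which one user's change is pushed to every collaborator rather than polled, can be sketched in miniature. The class and method names below are illustrative assumptions for exposition, not WebViz's actual API:

```python
class VisSession:
    """Minimal shared-session sketch: a change made by any user is pushed
    to every subscriber (hypothetical names, not WebViz's real interface)."""

    def __init__(self):
        self.state = {"time_step": 0, "camera": (0.0, 0.0, 1.0)}
        self.subscribers = []

    def subscribe(self, callback):
        """Register a client and push the current state to it on join."""
        self.subscribers.append(callback)
        callback(dict(self.state))

    def update(self, user, **changes):
        """Apply a user's change and push the new state to all clients."""
        self.state.update(changes)
        self.state["last_editor"] = user  # track who made the change
        for push in self.subscribers:
            push(dict(self.state))


# One client joins and receives the initial state; a later update is
# pushed to it without the client having to ask.
received = []
session = VisSession()
session.subscribe(received.append)
session.update("alice", time_step=42)
```

In a real deployment the `push` callbacks would be HTTP "server push" channels to browsers rather than in-process function calls, but the control flow, state held centrally and broadcast on every change, is the same idea.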
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marrinan, Thomas; Leigh, Jason; Renambot, Luc
Mixed presence collaboration involves remote collaboration between multiple collocated groups. This paper presents the design and results of a user study that focused on mixed presence collaboration using large-scale tiled display walls. The research was conducted to compare data synchronization schemes for multi-user visualization applications. Our study compared three techniques for sharing data between display spaces with varying constraints and affordances. The results provide empirical evidence that using data sharing techniques with continuous synchronization between the sites leads to improved collaboration for a search and analysis task between remotely located groups. We have also identified aspects of synchronized sessions that result in increased remote collaborator awareness and parallel task coordination. We believe this research will lead to better utilization of large-scale tiled display walls for distributed group work.
Breaking barriers through collaboration: the example of the Cell Migration Consortium.
Horwitz, Alan Rick; Watson, Nikki; Parsons, J Thomas
2002-10-15
Understanding complex, integrated biological processes, such as cell migration, requires interdisciplinary approaches. The Cell Migration Consortium, funded by a Large-Scale Collaborative Project Award from the National Institute of General Medical Sciences, develops and disseminates new technologies, data, reagents, and shared information to a wide audience. The development and operation of this Consortium may provide useful insights for those who plan similarly large-scale, interdisciplinary approaches.
NASA's MERBoard: An Interactive Collaborative Workspace Platform. Chapter 4
NASA Technical Reports Server (NTRS)
Trimble, Jay; Wales, Roxana; Gossweiler, Rich
2003-01-01
This chapter describes the ongoing process by which a multidisciplinary group at NASA's Ames Research Center is designing and implementing a large interactive work surface called the MERBoard Collaborative Workspace. A MERBoard system involves several distributed, large, touch-enabled plasma display systems with custom MERBoard software; a centralized server and database back the system. We are continually tuning MERBoard to support over two hundred scientists and engineers during the surface operations of the Mars Exploration Rover missions. These scientists and engineers come from various disciplines and are working in both small and large groups over a span of space and time. We describe the multidisciplinary, human-centered process by which this MERBoard system is being designed, the usage patterns and social interactions that we have observed, and the issues we are currently facing.
78 FR 7464 - Large Scale Networking (LSN) ; Joint Engineering Team (JET)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN) ; Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination...://www.nitrd.gov/nitrdgroups/index.php?title=Joint_Engineering_Team_ (JET)#title. SUMMARY: The JET...
Gas-Centered Swirl Coaxial Liquid Injector Evaluations
NASA Technical Reports Server (NTRS)
Cohn, A. K.; Strakey, P. A.; Talley, D. G.
2005-01-01
Development of liquid rocket engines is expensive: extensive testing at large scale is usually required, verifying engine lifetime demands a large number of tests, and the resources available for development are limited. Sub-scale cold-flow and hot-fire testing is extremely cost-effective and could be a necessary (but not sufficient) condition for long engine lifetime, reducing the overall cost and risk of large-scale testing. Goal: determine what knowledge can be gained from sub-scale cold-flow and hot-fire evaluations of LRE injectors, and determine relationships between cold-flow and hot-fire data.
77 FR 58415 - Large Scale Networking (LSN); Joint Engineering Team (JET)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-20
... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET). SUMMARY: The JET, established in 1997, provides for information sharing among Federal...
78 FR 70076 - Large Scale Networking (LSN)-Joint Engineering Team (JET)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-22
... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET)#title. SUMMARY: The JET, established in 1997, provides for information sharing among...
Gary M. Tabor; Anne Carlson; Travis Belote
2014-01-01
The Yellowstone to Yukon Conservation Initiative (Y2Y) was established over 20 years ago as an experiment in large landscape conservation. Y2Y initially emerged as a response to large-scale habitat fragmentation, advancing ecological connectivity. It also laid the foundation for large-scale multi-stakeholder conservation collaboration with almost 200 non-...
NASA Technical Reports Server (NTRS)
Simmons, J.; Erlich, D.; Shockey, D.
2009-01-01
A team consisting of Arizona State University; Honeywell Engines, Systems & Services; the National Aeronautics and Space Administration Glenn Research Center; and SRI International collaborated to develop computational models and verification testing for designing and evaluating turbine engine fan blade fabric containment structures. This research was conducted under the Federal Aviation Administration Airworthiness Assurance Center of Excellence and was sponsored by the Aircraft Catastrophic Failure Prevention Program. The research was directed toward improving the modeling of a turbine engine fabric containment structure for an engine blade-out containment demonstration test required for certification of aircraft engines. The research conducted in Phase II established a new level of capability to design and develop fan blade containment systems for turbine engines. Significant progress was made in three areas: (1) further development of the ballistic fabric model to increase confidence and robustness in the Kevlar(TradeName) and Zylon(TradeName) material models developed in Phase I, (2) improved capability for finite element modeling of multiple layers of fabric using multiple layers of shell elements, and (3) performance of large-scale simulations. This report concentrates on the material model development and simulations of the impact tests.
International Collaboration Activities on Engineered Barrier Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jove-Colon, Carlos F.
The Used Fuel Disposition Campaign (UFDC) within the DOE Fuel Cycle Technologies (FCT) program has been engaging in international collaborations between repository R&D programs for high-level waste (HLW) disposal to leverage gathered knowledge and laboratory/field data of near- and far-field processes from experiments at underground research laboratories (URLs). Heater test experiments at URLs provide a unique opportunity to mimetically study the thermal effects of heat-generating nuclear waste in subsurface repository environments. Various configurations of these experiments have been carried out at various URLs according to the disposal design concepts of the hosting country's repository program. The FEBEX (Full-scale Engineered Barrier Experiment in Crystalline Host Rock) project is a large-scale heater test experiment originated by the Spanish radioactive waste management agency (Empresa Nacional de Residuos Radiactivos S.A., ENRESA) at the Grimsel Test Site (GTS) URL in Switzerland; the project was subsequently managed by CIEMAT. FEBEX-DP is a concerted effort of various international partners working on the evaluation of sensor data and the characterization of samples obtained during the course of this field test and its subsequent dismantling. The main purpose of these field-scale experiments is to evaluate the feasibility of creating an engineered barrier system (EBS) with a horizontal configuration according to the Spanish concept of deep geological disposal of high-level radioactive waste in crystalline rock. Another key aspect of this project is to improve the knowledge of coupled processes, such as thermal-hydro-mechanical (THM) and thermal-hydro-chemical (THC) processes, operating in the near-field environment. The focus is on model development and validation of predictions through model implementation in computational tools that simulate coupled THM and THC processes.
Improving PHENIX search with Solr, Nutch and Drupal.
NASA Astrophysics Data System (ADS)
Morrison, Dave; Sourikova, Irina
2012-12-01
During its 20 years of R&D, construction and operation, the PHENIX experiment at the Relativistic Heavy Ion Collider (RHIC) has accumulated large amounts of proprietary collaboration data that is hosted on many servers around the world and is not open to commercial search engines for indexing and searching. The legacy search infrastructure did not scale well with the fast-growing PHENIX document base and produced results inadequate in both precision and recall. After considering the possible alternatives that would provide an aggregated, fast, full-text search of a variety of data sources and file formats, we decided to use Nutch [1] as a web crawler and Solr [2] as a search engine. To present XML-based Solr search results in a user-friendly format, we use Drupal [3] as a web interface to Solr. We describe the experience of building a federated search for a heterogeneous collection of 10 million PHENIX documents with Nutch, Solr and Drupal.
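The Nutch-crawl, Solr-search, Drupal-front-end architecture described above can be illustrated with a small sketch. The snippet below builds the kind of Solr `select` query URL a Drupal front end might issue against the search engine; the base URL, field names (`text`, `source`), and source labels are hypothetical illustrations, not the actual PHENIX schema.

```python
from urllib.parse import urlencode

def build_solr_query(base_url, text, sources=None, rows=10):
    """Construct a Solr 'select' URL for a full-text search.

    base_url and the field names are assumptions for illustration;
    a real deployment would use its own schema and filter fields.
    """
    params = {"q": f"text:({text})", "wt": "json", "rows": rows}
    if sources:
        # Restrict the federated search to specific crawled sources.
        params["fq"] = "source:(" + " OR ".join(sources) + ")"
    return f"{base_url}/select?{urlencode(params)}"

url = build_solr_query("http://solr.example.org/solr/phenix",
                       "heavy ion collisions", sources=["wiki", "docdb"])
print(url)
```

The front end would fetch this URL, receive JSON-formatted results, and render them through Drupal's theming layer.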
Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.; Vice, Jason
2011-01-01
NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention of building upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost-effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large-scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.
Improving collaborative learning in online software engineering education
NASA Astrophysics Data System (ADS)
Neill, Colin J.; DeFranco, Joanna F.; Sangwan, Raghvinder S.
2017-11-01
Team projects are commonplace in software engineering education. They address a key educational objective, provide students with critical experience relevant to their future careers, allow instructors to set problems of greater scale and complexity than could be tackled individually, and are a vehicle for socially constructed learning. While all student teams experience challenges, those in fully online programmes must also deal with remote working, asynchronous coordination, and computer-mediated communications, all of which contribute to greater social distance between team members. We have developed a facilitation framework to aid team collaboration and have demonstrated its efficacy, in prior research, with respect to team performance and outcomes. Those studies indicated, however, that despite experiencing improved project outcomes, students working in effective software engineering teams did not experience significantly improved individual achievement. To address this deficiency, we implemented theoretically grounded refinements to the collaboration model based upon peer-tutoring research. Our results indicate a modest, but statistically significant (p = .08), improvement in individual achievement using this refined model.
Brawer, Peter A; Martielli, Richard; Pye, Patrice L; Manwaring, Jamie; Tierney, Anna
2010-06-01
The primary care health setting is in crisis. Increasing demand for services, with dwindling numbers of providers, has resulted in decreased access and decreased satisfaction for both patients and providers. Moreover, the overwhelming majority of primary care visits are for behavioral and mental health concerns rather than issues of a purely medical etiology. Integrated-collaborative models of health care delivery offer possible solutions to this crisis. The purpose of this article is to review the data available after two years of the St. Louis Initiative for Integrated Care Excellence (SLI(2)CE), an example of integrated-collaborative care implemented at large scale within a regional Veterans Affairs Health Care System. There is clear evidence that the SLI(2)CE initiative dramatically increased access to health care and modified primary care practitioners' willingness to address mental health issues within the primary care setting. In addition, the data suggest strong fidelity to a model of integrated-collaborative care that has been successful in the past. Integrated-collaborative care offers unique advantages over the traditional view and practice of medical care. Through careful implementation and practice, success is possible at large scale.
ERIC Educational Resources Information Center
Reynolds, Arthur J.; Hayakawa, Momoko; Ou, Suh-Ruu; Mondi, Christina F.; Englund, Michelle M.; Candee, Allyson J.; Smerillo, Nicole E.
2017-01-01
We describe the development, implementation, and evaluation of a comprehensive preschool to third grade prevention program for the goals of sustaining services at a large scale. The Midwest Child-Parent Center (CPC) Expansion is a multilevel collaborative school reform model designed to improve school achievement and parental involvement from ages…
NASA Astrophysics Data System (ADS)
Mirvis, E.; Iredell, M.
2015-12-01
The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, this suite utilizes a unique collection of 20+ in-house shared libraries (NCEPLIBS), specific versions of third-party libraries (such as netcdf, HDF, ESMF, jasper, and xml), and an HPC workflow tool within a dedicated (sometimes vendor-customized) homogeneous HPC system environment. This domain- and site-specific setup, combined with NCEP's product-driven, large-scale real-time data operations, complicates NCEP collaborative development tremendously by reducing the chances of replicating the OPS environment anywhere else. The mission of NOAA/NCEP's Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has recently taken an innovative approach to improving the flexibility of the HPC environment by building the elements and a foundation for an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC developed and deployed several project subset standards that have already paved the way toward NCEP OPS implementation standards. In this topic we will discuss the EMC FEE for O2R requirements and approaches in collaborative standardization, including NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints.
We will share NCEP/EMC's experience and potential in refactoring EMC development processes and legacy codes, and in securing model source code quality standards by using the Eclipse IDE integrated with reverse-engineering tools/APIs. We will also report on collaborative efforts in restructuring the NOAA Environmental Modeling System (NEMS), the multi-model coupling framework, and on transitioning the FEE verification methodology.
Design of Scalable and Effective Earth Science Collaboration Tool
NASA Astrophysics Data System (ADS)
Maskey, M.; Ramachandran, R.; Kuo, K. S.; Lynnes, C.; Niamsuwan, N.; Chidambaram, C.
2014-12-01
Collaborative research is growing rapidly. Many tools, including IDEs, are now beginning to incorporate new collaborative features. Software engineering research has shown the effectiveness of collaborative programming and analysis; in particular, it has highlighted drastic reductions in software development time and, therefore, cost. Recently, we have witnessed the rise of applications that allow users to share their content, most of which scale such collaboration using cloud technologies. Earth science research needs to adopt collaboration technologies to reduce redundancy, cut cost, expand its knowledge base, and scale research experiments. To address these needs, we developed the Earth science Collaboration Workbench (CWB). CWB provides researchers with various collaboration features by augmenting their existing analysis tools, minimizing the learning curve. During the development of the CWB, we learned that Earth science collaboration tasks are varied, and we concluded that it is not possible to design a tool that serves all collaboration purposes. We adopted a mix of synchronous and asynchronous sharing methods that can be used to collaborate across both time and location. We have used cloud technology to scale the collaboration; the cloud has been a highly utilized and valuable tool for Earth science researchers. Among other uses, the cloud is used for sharing research results, Earth science data, and virtual machine images, allowing CWB to create and maintain research environments and networks that enhance collaboration between researchers. Furthermore, the collaborative versioning tool Git is integrated into CWB for versioning of science artifacts. In this paper, we present our experience in designing and implementing the CWB.
We will also discuss the integration of collaborative code development use cases for data search and discovery using NASA DAAC and simulation of satellite observations using NASA Earth Observing System Simulation Suite (NEOS3).
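Versioning science artifacts with Git, as the CWB does, rests on Git's content-addressable storage: every file is stored as a blob named by a SHA-1 hash of its content, so unchanged artifacts deduplicate and any edit yields a new object ID. The sketch below reproduces Git's documented blob-hashing scheme; it is a general illustration of that mechanism, not CWB-specific code.

```python
import hashlib

def git_blob_sha1(content: bytes) -> str:
    """Compute the Git blob object ID for raw file content.

    Git hashes the header "blob <size>\\0" followed by the content
    with SHA-1; two artifacts with identical bytes always share an ID.
    """
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches `echo "hello" | git hash-object --stdin`
print(git_blob_sha1(b"hello\n"))  # → ce013625030ba8dba906f756967f9e9ca394464a
```

Because the ID depends only on content, a collaboration tool can detect whether a shared artifact changed between sessions by comparing hashes rather than transferring files.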
Template Interfaces for Agile Parallel Data-Intensive Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilberto Z.
Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating data sets large enough that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
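As an illustration of the "template" idea, the sketch below implements two common computation patterns, a sequence and a parallel map, in plain Python. This is a minimal stand-in for the concept, not the actual Tigres API: the function names and the toy analysis are assumptions made for the example.

```python
from concurrent.futures import ThreadPoolExecutor

def sequence(data, tasks):
    """Sequence template: feed each task's output to the next task."""
    for task in tasks:
        data = task(data)
    return data

def parallel(items, task, workers=4):
    """Parallel template: apply one task to many items concurrently,
    preserving input order in the results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(task, items))

# Toy analysis: scale readings in parallel, then reduce in sequence.
readings = [3.0, 1.0, 2.0]
scaled = parallel(readings, lambda x: x * 10)
total = sequence(scaled, [sum])
print(scaled, total)
```

The appeal of templates is that the scientist supplies only the per-task functions; scheduling, ordering, and parallelism live inside the reusable pattern.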
ERIC Educational Resources Information Center
Gates, Alexander E.
2017-01-01
A simulated physical model of volcanic processes using a glass art studio greatly enhanced enthusiasm and learning among urban, middle- to high-school aged, largely underrepresented minority students in Newark, New Jersey. The collaboration of a geoscience department with a glass art studio to create a science, technology, engineering, arts, and…
Hu, Michael Z.; Zhu, Ting
2015-12-04
This study reviews the experimental synthesis and engineering developments focused on various green approaches and large-scale process production routes for quantum dots. Fundamental process engineering principles are illustrated. In contrast to the small-scale hot-injection method, our discussion focuses on non-injection routes that can be scaled up with engineered stirred-tank reactors. In addition, applications that demand quantum dots as "commodity" chemicals are discussed, including solar cells and solid-state lighting.
Engineering research, development and technology FY99
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langland, R T
The growth of computer power and connectivity, together with advances in wireless sensing and communication technologies, is transforming the field of complex distributed systems. The ability to deploy large numbers of sensors with a rapid, broadband communication system will enable high-fidelity, near real-time monitoring of complex systems. These technological developments will provide unprecedented insight into the actual performance of engineered and natural environment systems, enable the evolution of many new types of engineered systems for monitoring and detection, and enhance our ability to perform improved and validated large-scale simulations of complex systems. One of the challenges facing engineering is to develop methodologies to exploit the emerging information technologies. Particularly important will be the ability to assimilate measured data into the simulation process in a way which is much more sophisticated than current, primarily ad hoc procedures. The reports contained in this section on the Center for Complex Distributed Systems describe activities related to the integrated engineering of large complex systems. The first three papers describe recent developments for each link of the integrated engineering process for large structural systems. These include (1) the development of model-based signal processing algorithms which will formalize the process of coupling measurements and simulation and provide a rigorous methodology for validation and update of computational models; (2) collaborative efforts with faculty at the University of California at Berkeley on the development of massive simulation models for the earth and large bridge structures; and (3) the development of wireless data acquisition systems which provide a practical means of monitoring large systems like the National Ignition Facility (NIF) optical support structures.
These successful developments are coming to a confluence in the next year with applications to NIF structural characterizations and analysis of large bridge structures for the State of California. Initial feasibility investigations into the development of monitoring and detection systems are described in the papers on imaging of underground structures with ground-penetrating radar, and the use of live insects as sensor platforms. These efforts are establishing the basic performance characteristics essential to the decision process for future development of sensor arrays for information gathering related to national security.
Examining What We Mean by "Collaboration" in Collaborative Action Research: A Cross-Case Analysis
ERIC Educational Resources Information Center
Bruce, Catherine D.; Flynn, Tara; Stagg-Peterson, Shelley
2011-01-01
The purpose of this paper is to report on the nature of collaboration in a multi-year, large-scale collaborative action research project in which a teachers' federation (in Ontario, Canada), university researchers and teachers partnered to investigate teacher-selected topics for inquiry. Over two years, 14 case studies were generated involving six…
Practices and Strategies of Distributed Knowledge Collaboration
ERIC Educational Resources Information Center
Kudaravalli, Srinivas
2010-01-01
Information Technology is enabling large-scale, distributed collaboration across many different kinds of boundaries. Researchers have used the label new organizational forms to describe such collaborations and suggested that they are better able to meet the demands of flexibility, speed and adaptability that characterize the knowledge economy.…
CRP: Collaborative Research Project (A Mathematical Research Experience for Undergraduates)
ERIC Educational Resources Information Center
Parsley, Jason; Rusinko, Joseph
2017-01-01
The "Collaborative Research Project" ("CRP")--a mathematics research experience for undergraduates--offers a large-scale collaborative experience in research for undergraduate students. CRP seeks to widen the audience of students who participate in undergraduate research in mathematics. In 2015, the inaugural CRP had 100…
Coordinating the Commons: Diversity & Dynamics in Open Collaborations
ERIC Educational Resources Information Center
Morgan, Jonathan T.
2013-01-01
The success of Wikipedia demonstrates that open collaboration can be an effective model for organizing geographically-distributed volunteers to perform complex, sustained work at a massive scale. However, Wikipedia's history also demonstrates some of the challenges that large, long-term open collaborations face: the core community of Wikipedia…
Preschool Children, Painting and Palimpsest: Collaboration as Pedagogy, Practice and Learning
ERIC Educational Resources Information Center
Cutcher, Alexandra; Boyd, Wendy
2018-01-01
This article describes a small, collaborative, arts-based research project conducted in two rural early childhood centres in regional Australia, where the children made large-scale collaborative paintings in partnership with teachers and researchers. Observation of young children's artistic practices, in order to inform the development of…
1984-06-01
Aquatic Plant Control Research Program: Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 5: Synthesis Report. Technical Report A-78-2, Army Engineer Waterways Experiment Station, Vicksburg, MS; U.S. Army Corps of Engineers, Washington, DC 20314.
Secure web book to store structural genomics research data.
Manjasetty, Babu A; Höppner, Klaus; Mueller, Uwe; Heinemann, Udo
2003-01-01
Recently established collaborative structural genomics programs aim at significantly accelerating the crystal structure analysis of proteins. These large-scale projects require efficient data management systems to ensure seamless collaboration between different groups of scientists working towards the same goal. Within the Berlin-based Protein Structure Factory, the synchrotron X-ray data collection and the subsequent crystal structure analysis tasks are located at BESSY, a third-generation synchrotron source. To organize file-based communication and data transfer at the BESSY site of the Protein Structure Factory, we have developed the web-based BCLIMS, the BESSY Crystallography Laboratory Information Management System. BCLIMS is a relational data management system that is powered by MySQL as the database engine and the Apache HTTP server as the web server. The database interface routines are written in the Python programming language. The software is freely available to academic users. Here we describe the storage, retrieval and manipulation of laboratory information, mainly pertaining to the synchrotron X-ray diffraction experiments and the subsequent protein structure analysis, using BCLIMS.
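At its core, a relational LIMS of this kind reduces to a schema plus insert and query routines behind a web interface. The sketch below uses Python's built-in sqlite3 in place of MySQL so it is self-contained; the table and column names are hypothetical illustrations, not BCLIMS's actual schema.

```python
import sqlite3

# In-memory stand-in for the MySQL database behind the LIMS.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE xray_dataset (
    id INTEGER PRIMARY KEY,
    crystal_id TEXT NOT NULL,    -- which crystal was exposed
    beamline TEXT,               -- e.g. a BESSY beamline name
    resolution_a REAL,           -- diffraction limit in angstroms
    collected_on TEXT)""")

# Record one (hypothetical) synchrotron data collection run.
conn.execute(
    "INSERT INTO xray_dataset (crystal_id, beamline, resolution_a,"
    " collected_on) VALUES (?, ?, ?, ?)",
    ("PSF-0042", "BL14.1", 1.9, "2003-01-15"))

# Retrieve all data sets good enough for structure solution.
rows = conn.execute(
    "SELECT crystal_id, resolution_a FROM xray_dataset"
    " WHERE resolution_a < 2.5").fetchall()
print(rows)  # → [('PSF-0042', 1.9)]
```

A web layer (Apache plus Python CGI or a framework) would wrap exactly these insert and select routines with authentication and HTML forms.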
NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity.
Al-Awami, Ali K; Beyer, Johanna; Strobelt, Hendrik; Kasthuri, Narayanan; Lichtman, Jeff W; Pfister, Hanspeter; Hadwiger, Markus
2014-12-01
We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.
NASA Technical Reports Server (NTRS)
Johnston, William E.; Gannon, Dennis; Nitzberg, Bill
2000-01-01
We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems.
Examples of these problems include: (1) coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topographical, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3) coupling large-scale computing and data systems to scientific and engineering instruments (e.g., real-time interaction with experiments through real-time data analysis and interpretation presented to the experimentalist in ways that allow direct interaction with the experiment, instead of just with instrument control); (4) highly interactive, augmented reality and virtual reality remote collaborations (e.g., an Ames / Boeing remote help desk providing field maintenance use of coupled video and NDI to a remote, on-line airframe structures expert, who uses this data to index into detailed design databases and returns 3D internal aircraft geometry to the field); (5) single computational problems too large for any single system (e.g., the rotorcraft reference calculation). Grids also have the potential to provide pools of resources that could be called on in extraordinary / rapid-response situations (such as disaster response) because they can provide common interfaces and access mechanisms, standardized management, and uniform user authentication and authorization for large collections of distributed resources (whether or not they normally function in concert). IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g.
determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. The results of the analysis of the needs of these two types of users provide a broad set of requirements that gives rise to a general set of required capabilities. The IPG project is intended to address all of these requirements. In some cases the required computing technology exists, and in some cases it must be researched and developed. The project is using available technology to provide a prototype set of capabilities in a persistent distributed computing testbed. Beyond this, there are required capabilities that are not immediately available, and whose development spans the range from near-term engineering development (one to two years) to much longer-term R&D (three to six years). Additional information is contained in the original.
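The location-independent resource access described above can be illustrated with a small sketch: a broker that picks a suitable compute resource from a heterogeneous, geographically dispersed pool without the user ever naming a site. All class, resource, and site names here are invented for illustration; IPG's actual middleware is far richer than this.

```python
from dataclasses import dataclass

# Hypothetical model of a Grid resource pool. Real Grid middleware adds
# authentication, accounting, and scheduling; this only sketches the
# idea of location-independent resource selection.

@dataclass
class Resource:
    name: str
    site: str          # geographically dispersed sites
    cpus: int
    storage_tb: float

@dataclass
class Job:
    name: str
    cpus_needed: int
    storage_needed_tb: float

def select_resource(pool, job):
    """Pick the smallest adequate resource that satisfies the job,
    without the user naming a specific site."""
    candidates = [r for r in pool
                  if r.cpus >= job.cpus_needed
                  and r.storage_tb >= job.storage_needed_tb]
    if not candidates:
        raise RuntimeError("no resource in the Grid satisfies this job")
    return min(candidates, key=lambda r: r.cpus)  # tightest fit, least waste

pool = [
    Resource("cluster-a", "site-1", cpus=512, storage_tb=40.0),
    Resource("cluster-b", "site-2", cpus=128, storage_tb=10.0),
    Resource("archive-c", "site-3", cpus=16, storage_tb=500.0),
]
job = Job("turbomachine-sim", cpus_needed=100, storage_needed_tb=5.0)
chosen = select_resource(pool, job)
print(chosen.name, chosen.site)   # cluster-b site-2
```

The point of the sketch is the interface: the job states requirements, not locations, which is what lets a Grid aggregate resources "whether or not they normally function in concert."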
Supply Chain Engineering and the Use of a Supporting Knowledge Management Application
NASA Astrophysics Data System (ADS)
Laakmann, Frank
Future competition in markets will take place between logistics networks rather than between individual enterprises. A new approach for supporting the engineering of logistics networks is developed by this research as part of the Collaborative Research Centre (SFB) 559, "Modeling of Large Networks in Logistics," at the University of Dortmund, together with the Fraunhofer-Institute of Material Flow and Logistics, funded by the Deutsche Forschungsgemeinschaft (DFG). Based on a reference model for logistics processes, the process chain model, a guideline for logistics engineers is developed to manage the different types of design tasks for logistics networks. The technical background of this solution is a collaborative knowledge management application. This paper will introduce how new Internet-based technologies support supply chain design projects.
NASA Astrophysics Data System (ADS)
Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.
2008-07-01
In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.
SCALE(ing)-UP Teaching: A Case Study of Student Motivation in an Undergraduate Course
ERIC Educational Resources Information Center
Chittum, Jessica R.; McConnell, Kathryne Drezek; Sible, Jill
2017-01-01
Teaching large classes is increasingly common; thus, demand for effective large-class pedagogy is rising. One method, titled "SCALE-UP" (Student-Centered Active Learning Environment for Undergraduate Programs), is intended for large classes and involves collaborative, active learning in a technology-rich and student-centered environment.…
Analysis and Testing of a Composite Fuselage Shield for Open Rotor Engine Blade-Out Protection
NASA Technical Reports Server (NTRS)
Pereira, J. Michael; Emmerling, William; Seng, Silvia; Frankenberger, Charles; Ruggeri, Charles R.; Revilock, Duane M.; Carney, Kelly S.
2015-01-01
The Federal Aviation Administration is working with the European Aviation Safety Agency to determine the certification basis for proposed new engines that would not have a containment structure on large commercial aircraft. The regulators desire equivalent safety to the current fleet, which means that loss of a single fan blade must not create a hazard to the aircraft. The NASA Glenn Research Center and the Naval Air Warfare Center (NAWC), China Lake, collaborated with the FAA Aircraft Catastrophic Failure Prevention Program to design and test lightweight composite shields for protection of the aircraft passengers and critical systems from a released blade that could impact the fuselage. In the test, two composite blades were pyrotechnically released from a running engine, each impacting a composite shield of a different thickness. The thinner shield was penetrated by the blade and the thicker shield prevented penetration, consistent with pre-test predictions. This paper documents the live-fire test on the full-scale rig at NAWC China Lake and describes the damage to the shields as well as the instrumentation results.
Critical assembly: A technical history of Los Alamos during the Oppenheimer years, 1943--1945
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoddeson, L.; Henriksen, P.W.; Meade, R.A.
1993-11-01
This volume treats the technical research that led to the first atomic bombs. The authors explore how the "critical assembly" of scientists, engineers, and military personnel at Los Alamos collaborated during World War II, blending their traditions to create a new approach to large-scale research. The research was characterized by strong mission orientation, multidisciplinary teamwork, expansion of the scientists' traditional methodology with engineering techniques, and a trial-and-error methodology responding to wartime deadlines. The book opens with an introduction laying out major themes. After a synopsis of the prehistory of the bomb project, from the discovery of nuclear fission to the start of the Manhattan Engineer District, and an overview of the early materials program, the book examines the establishment of the Los Alamos Laboratory, the implosion and gun assembly programs, nuclear physics research, chemistry and metallurgy, explosives, uranium and plutonium development, confirmation of spontaneous fission in pile-produced plutonium, the thermonuclear bomb, critical assemblies, the Trinity test, and delivery of the combat weapons.
ERIC Educational Resources Information Center
Jones, Brett D.; Epler, Cory M.; Mokri, Parastou; Bryant, Lauren H.; Paretti, Marie C.
2013-01-01
We identified and examined how the instructional elements of problem-based learning capstone engineering courses affected students' motivation to engage in the courses. We employed a two-phase, sequential, explanatory, mixed methods research design. For the quantitative phase, 47 undergraduate students at a large public university completed a…
Voices of Women in a Software Engineering Course: Reflections on Collaboration
ERIC Educational Resources Information Center
Berenson, Sarah B.; Slaten, Kelli M.; Williams, Laurie; Ho, Chih-Wei
2004-01-01
Those science, mathematics, and engineering faculty who are serious about making the education they offer as available to their daughters as to their sons are, we posit, facing the prospect of dismantling a large part of its traditional pedagogical structure, along with the assumptions and practice which support it. [Seymour and Hewett 1997]. Prior…
NASA Astrophysics Data System (ADS)
Saleh, R.
2017-12-01
For a challenge as complex and far-reaching as sea level rise and improving shoreline resiliency, strong partnerships between scientists, elected officials, decision-makers, and the general public are the only way that effective solutions can be developed. The San Francisco Bay, like many similar sheltered water coastal environments (for example, Galveston Bay, Tampa Bay, or the Venetian Lagoon), offers a unique opportunity for multiple jurisdictions to collaborate to address sea level rise on a regional basis. For the San Francisco Bay, significant scientific progress has been made in building a real-time simulation model for riverine and Bay hydrodynamics. Other major scientific initiatives, such as morphology mapping, shoreline mapping, and a sediment budget, are also underway. In 2014, leaders from the Bay Area science, engineering, planning, policy, elected, and regulatory communities representing jurisdictions around the Bay joined together to address sea level rise. The group includes people from local, regional, state, and federal agencies and organizations. Together, CHARG (Coastal Hazards Adaptation Resiliency Group) established a collective vision and approach to implementing regional solutions. Decision-makers within many Bay Area jurisdictions are motivated to show demonstrable progress toward addressing sea level rise. However, the cost to implement shoreline resiliency solutions will be very large, and must be founded on strong science. CHARG is now tackling several key technical challenges. One is to develop science-based guidelines for local jurisdictions to determine when a project is local, sub-regional, or regional. Concurrently, several organizations are planning or implementing pilot shoreline resiliency projects and other programs. Many creative regional solutions are possible in a sheltered water environment that simply would not be feasible along the open coast. By definition, these solutions cannot be undertaken by one entity alone. 
Large-scale regional solutions are only possible through the hard work and collaboration of many. This paper will offer insights into the process of collaboration, initiated by the scientific and engineering communities, to influence and help direct major decisions about shoreline resiliency.
Toward visual user interfaces supporting collaborative multimedia content management
NASA Astrophysics Data System (ADS)
Husein, Fathi; Leissler, Martin; Hemmje, Matthias
2000-12-01
Supporting collaborative multimedia content management activities, such as image and video acquisition, exploration, and access dialogues between naive users and multimedia information systems, is a non-trivial task. Although a wide variety of experimental and prototypical multimedia storage technologies, as well as corresponding indexing and retrieval engines, are available, most of them lack appropriate support for collaborative, end-user-oriented user interface front ends. The development of advanced, user-adaptable interfaces is necessary for building collaborative multimedia information-space presentations based on advanced tools for information browsing, searching, filtering, and brokering, applied to potentially very large and highly dynamic multimedia collections with large numbers of users and user groups. Therefore, the development of advanced, adaptable, and collaborative graphical information presentation schemes that make it easy to apply adequate visual metaphors for defined target user stereotypes has to become a key focus of ongoing research activities that aim to support collaborative information work with multimedia collections.
Insights into the Coral Microbiome: Underpinning the Health and Resilience of Reef Ecosystems.
Bourne, David G; Morrow, Kathleen M; Webster, Nicole S
2016-09-08
Corals are fundamental ecosystem engineers, creating large, intricate reefs that support diverse and abundant marine life. At the core of a healthy coral animal is a dynamic relationship with microorganisms, including a mutually beneficial symbiosis with photosynthetic dinoflagellates (Symbiodinium spp.) and enduring partnerships with an array of bacterial, archaeal, fungal, protistan, and viral associates, collectively termed the coral holobiont. The combined genomes of this coral holobiont form a coral hologenome, and genomic interactions within the hologenome ultimately define the coral phenotype. Here we integrate contemporary scientific knowledge regarding the ecological, host-specific, and environmental forces shaping the diversity, specificity, and distribution of microbial symbionts within the coral holobiont, explore physiological pathways that contribute to holobiont fitness, and describe potential mechanisms for holobiont homeostasis. Understanding the role of the microbiome in coral resilience, acclimation, and environmental adaptation is a new frontier in reef science that will require large-scale collaborative research efforts.
Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.
2006-12-01
The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. 
In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. Once an analysis has been specified for a chunk or day of data, it can be easily repeated with different control parameters or over months of data. Recently, the Earth Science Information Partners (ESIP) Federation sponsored a collaborative activity in which several ESIP members advertised their respective WMS/WCS and SOAP services, developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. For several scenarios, the same collaborative workflow was executed in three ways: using hand-coded scripts, by executing a SciFlo document, and by executing a BPEL workflow document. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, and further collaborations that are being pursued.
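The "tree of operators" idea behind such dataflow engines can be sketched in a few lines: children are evaluated before their parent, so a flow runs leaves-to-root, which is exactly the structure an engine like SciFlo can optimize and parallelize. The operator names and the tuple-based tree encoding below are invented for illustration and are not SciFlo's actual XML dataflow format.

```python
# Toy dataflow evaluator: a node is ("op", [child_or_literal, ...]);
# anything that is not an operator node passes through as a literal.

OPERATORS = {
    "subset": lambda data, lo, hi: [x for x in data if lo <= x <= hi],
    "regrid": lambda data, step: data[::step],
    "mean":   lambda data: sum(data) / len(data),
}

def evaluate(node):
    """Recursively evaluate a dataflow tree: children first, then the
    operator at this node. Independent subtrees could run in parallel."""
    if not (isinstance(node, tuple) and node[0] in OPERATORS):
        return node                      # literal input (dataset, parameter)
    op, args = node
    return OPERATORS[op](*[evaluate(a) for a in args])

# Equivalent to mean(regrid(subset(raw, 0, 50), step=2))
raw = list(range(100))
flow = ("mean", [("regrid", [("subset", [raw, 0, 50]), 2])])
print(evaluate(flow))   # 25.0
```

In a real engine each operator would be a remote SOAP/REST call or a native executable rather than a lambda, but the evaluation order over the tree is the same.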
Wyborn, Carina; Bixler, R Patrick
2013-07-15
The problem of fit between social institutions and ecological systems is an enduring challenge in natural resource management and conservation. Developments in the science of conservation biology encourage the management of landscapes at increasingly larger scales. In contrast, sociological approaches to conservation emphasize the importance of ownership, collaboration and stewardship at scales relevant to the individual or local community. Despite the proliferation of initiatives seeking to work with local communities to undertake conservation across large landscapes, there is an inherent tension between these scales of operation. Consequently, questions about the changing nature of effective conservation across scales abound. Through an analysis of three nested cases working in a semiautonomous fashion in the Northern Rocky Mountains in North America, this paper makes an empirical contribution to the literature on nested governance, collaboration and communication across scales. Despite different scales of operation, constituencies and scale frames, we demonstrate a surprising similarity in organizational structure and an implicit dependency between these initiatives. This paper examines the different capacities and capabilities of collaborative conservation from the local to regional to supra regional. We draw on the underexplored concept of 'scale-dependent comparative advantage' (Cash and Moser, 2000), to gain insight into what activities take place at which scale and what those activities contribute to nested governance and collaborative conservation. The comparison of these semiautonomous cases provides fruitful territory to draw lessons for understanding the roles and relationships of organizations operating at different scales in more connected networks of nested governance. Copyright © 2013 Elsevier Ltd. All rights reserved.
Engineering and Language Discourse Collaboration: Practice Realities
ERIC Educational Resources Information Center
Harran, Marcelle
2011-01-01
This article describes a situated engineering project at a South African HE institution which is underpinned by collaboration between Applied Language Studies (DALS) and Mechanical Engineering. The collaboration requires language practitioners and engineering experts to negotiate and collaborate on academic literacies practices, discourse…
Aarons, Gregory A; Fettes, Danielle L; Hurlburt, Michael S; Palinkas, Lawrence A; Gunderson, Lara; Willging, Cathleen E; Chaffin, Mark J
2014-01-01
Implementation and scale-up of evidence-based practices (EBPs) is often portrayed as involving multiple stakeholders collaborating harmoniously in the service of a shared vision. In practice, however, collaboration is a more complex process that may involve shared and competing interests and agendas, and negotiation. The present study examined the scale-up of an EBP across an entire service system using the Interagency Collaborative Team approach. Participants were key stakeholders in a large-scale county-wide implementation of an EBP to reduce child neglect, SafeCare. Semistructured interviews and/or focus groups were conducted with 54 individuals representing diverse constituents in the service system, followed by an iterative approach to coding and analysis of transcripts. The study was conceptualized using the Exploration, Preparation, Implementation, and Sustainment framework. Although community stakeholders eventually coalesced around implementation of SafeCare, several challenges affected the implementation process. These challenges included differing organizational cultures, strategies, and approaches to collaboration; competing priorities across levels of leadership; power struggles; and role ambiguity. Each of the factors identified influenced how stakeholders approached the EBP implementation process. System-wide scale-up of EBPs involves multiple stakeholders operating in a nexus of differing agendas, priorities, leadership styles, and negotiation strategies. The term collaboration may oversimplify the multifaceted nature of the scale-up process. Implementation efforts should openly acknowledge and consider this nexus when individual stakeholders and organizations enter into EBP implementation through collaborative processes.
NASA Astrophysics Data System (ADS)
Bonner, J.; Brezonik, P.; Clesceri, N.; Gouldman, C.; Jamail, R.; Zilkoski, D.
2006-12-01
The Integrated Ocean Observing System (IOOS), established through the efforts of the National Office for Integrated and Sustained Ocean Observations (Oceans.US) provides quality controlled data and information on a routine and continuous basis regarding current and future states of the oceans and Great Lakes at scales from global ocean basins to coastal ecosystems. The seven societal goals of IOOS are outlined in this paper. The Engineering and Geosciences Directorates at the National Science Foundation (NSF) are collaborating in planning the WATERS (WATer Environmental Research System) Network, an outgrowth of earlier, separate initiatives of the two directorates: CLEANER (Collaborative Large-scale Engineering Analysis Network for Environmental Research) and Hydrologic Observatories. WATERS Network is being developed by engineers and scientists in the academic community who recognize the need for an observation and research network to enable better understanding of human-dominated water-environments, their stressors, and the links between them. The WATERS Network model is based on a research framework anchored in a distributed, cyber-based network supporting: 1) data collection; 2) data aggregation; 3) analytical and exploratory tools; and 4) a computational environment supporting predictive modeling and policy analysis on water resource systems. Within IOOS, the U.S. coastal margin is divided into Regional Associations (RAs), organizational units that are conceptually linked through planned data collection and analysis activities for resolving fundamental coastal margin ecosystem questions and addressing RA concerns. Under the WATERS Network scheme, a Coastal Margin Regional Environmental System (RES) for coastal areas would be defined conceptually based on geomorphologic considerations of four major water bodies; Atlantic and Pacific Oceans, Gulf of Mexico, and Laurentian Great Lakes. 
Within this framework, each coastal margin would operate one or more local environmental field facilities (or observatories). Mutual coordination and collaboration would exist among these coasts through RES interactions based on a cyberinfrastructure supporting all aspects of quantitative analysis. Because the U.S. Ocean Action Plan refers to the creation of a National Water Quality Monitoring Network, a close liaison between IOOS and WATERS Network could be mutually advantageous considering the shared visions, goals and objectives. A focus on activities and initiatives involving sensor and sensor networks for coastal margin observation and assessment would be a specific instance of this liaison, leveraging the infrastructural base of both organizations to maximize resource allocation. This coordinated venture with intelligent environmental systems would include new specialized coastal monitoring networks, and management of near-real-time data, including data assimilation models. An ongoing NSF planning grant aimed at environmental observatory design for coastal margins is a component of the broader WATERS Network planning for collaborative research to support adaptive and sustainable environmental management. We propose a collaborative framework between IOOS and WATERS Network wherein collaborative research will be enabled by cybernetworks to support adaptive and sustainable management of the coastal regions.
Teaching Cellular Automation Concepts through Interdisciplinary Collaborative Learning.
ERIC Educational Resources Information Center
Biernacki, Joseph J.; Ayers, Jerry B.
2000-01-01
Reports on the experiences of 12 students--three senior undergraduates majoring in chemical engineering, five master-level, and four doctoral students--in a course titled "Interdisciplinary Studies in Multi-Scale Simulation of Concrete Materials". Course objectives focused on incorporating team-oriented interdisciplinary experiences into the…
NASA Astrophysics Data System (ADS)
Montgomery, J. L.; Minsker, B. S.; Schnoor, J.; Haas, C.; Bonner, J.; Driscoll, C.; Eschenbach, E.; Finholt, T.; Glass, J.; Harmon, T.; Johnson, J.; Krupnik, A.; Reible, D.; Sanderson, A.; Small, M.; van Briesen, J.
2006-05-01
With increasing population and urban development, societies grow more and more concerned over balancing the need to maintain adequate water supplies with that of ensuring the quality of surface and groundwater resources. For example, multiple stressors such as overfishing, runoff of nutrients from agricultural fields and confined animal feeding lots, and pathogens in urban stormwater can often overwhelm a single water body. Mitigating just one of these problems often depends on understanding how it relates to others and how stressors can vary in temporal and spatial scales. Researchers are now in a position to answer questions about multiscale, spatiotemporally distributed hydrologic and environmental phenomena through the use of remote and embedded networked sensing technologies. It is now possible for data streaming from sensor networks to be integrated by a rich cyberinfrastructure encompassing the innovative computing, visualization, and information archiving strategies needed to cope with the anticipated onslaught of data, and to turn that data around in the form of real-time water quantity and quality forecasting. Recognizing this potential, NSF awarded $2 million to a coalition of 12 institutions in July 2005 to establish the CLEANER Project Office (Collaborative Large-Scale Engineering Analysis Network for Environmental Research; http://cleaner.ncsa.uiuc.edu). 
Over the next two years the project office, in coordination with CUAHSI (Consortium of Universities for the Advancement of Hydrologic Science, Inc.; http://www.cuahsi.org), will work together to develop a plan for a WATer and Environmental Research Systems Network (WATERS Network), which is envisioned to be a collaborative scientific exploration and engineering analysis network, using high performance tools and infrastructure, to transform our scientific understanding of how water quantity, quality, and related earth system processes are affected by natural and human-induced changes to the environment. This presentation will give an overview of the draft CLEANER program plans for the WATERS Network and next steps.
An Integrated Cyberenvironment for Event-Driven Environmental Observatory Research and Education
NASA Astrophysics Data System (ADS)
Myers, J.; Minsker, B.; Butler, R.
2006-12-01
National environmental observatories will soon provide large-scale data from diverse sensor networks and community models. While much attention is focused on piping data from sensors to archives and users, truly integrating these resources into the everyday research activities of scientists and engineers across the community, and enabling their results and innovations to be brought back into the observatory, is often neglected, even though both are critical to the long-term success of the observatories. This talk will give an overview of the Environmental Cyberinfrastructure Demonstrator (ECID) Cyberenvironment for observatory-centric environmental research and education, under development at the National Center for Supercomputing Applications (NCSA), which is designed to address these issues. Cyberenvironments incorporate collaboratory and grid technologies, web services, and other cyberinfrastructure into an overall framework that balances the need for efficient coordination with the ability to innovate. They are designed to support the full scientific lifecycle, both in terms of individual experiments moving from data to workflows to publication, and at the macro level, where new discoveries lead to additional data, models, tools, and conceptual frameworks that augment and evolve community-scale systems such as observatories. The ECID cyberenvironment currently integrates five major components - a collaborative portal, a workflow engine, an event manager, a metadata repository, and social network personalization capabilities - that have novel features inspired by the Cyberenvironment concept and that enable powerful environmental research scenarios. A summary of these components and the overall cyberenvironment will be given in this talk, while other posters will give details on several of the components. 
The summary will be presented within the context of environmental use case scenarios created in collaboration with researchers from the WATERS (WATer and Environmental Research Systems) Network, a joint National Science Foundation-funded initiative of the hydrology and environmental engineering communities. The use case scenarios include identifying sensor anomalies in point- and streaming sensor data and notifying data managers in near-real time; and referring users of data or data products (e.g., workflows, publications) to related data or data products.
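The first use case scenario, flagging anomalies in streaming sensor data in near-real time, can be sketched with a simple trailing-window z-score detector. The window size, threshold, and sample data below are illustrative choices, not values from ECID or WATERS.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=20, threshold=4.0):
    """Yield (index, value) for readings more than `threshold` sample
    standard deviations from the mean of a trailing window. In an
    event-driven cyberenvironment, each yield would fire an event that
    notifies the data manager."""
    recent = deque(maxlen=window)
    for i, value in enumerate(stream):
        if len(recent) >= 5:                      # need a minimal baseline
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield i, value                    # anomaly event
                continue                          # keep outliers out of the baseline
        recent.append(value)

readings = [10.1, 10.0, 9.9, 10.2, 10.0, 10.1, 55.0, 10.0, 9.8, 10.1]
for i, v in detect_anomalies(readings):
    print(f"anomaly at sample {i}: {v}")          # anomaly at sample 6: 55.0
```

The design point worth noting is the `continue`: excluding flagged readings from the baseline keeps a stuck or spiking sensor from inflating its own tolerance band.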
ERIC Educational Resources Information Center
Halatchliyski, Iassen; Moskaliuk, Johannes; Kimmerle, Joachim; Cress, Ulrike
2014-01-01
This article discusses the relevance of large-scale mass collaboration for computer-supported collaborative learning (CSCL) research, adhering to a theoretical perspective that views collective knowledge both as substance and as participatory activity. In an empirical study using the German Wikipedia as a data source, we explored collective…
ERIC Educational Resources Information Center
Nolin, Anna P.
2014-01-01
This study explored the role of professional learning communities for district leadership implementing large-scale technology initiatives such as 1:1 implementations (one computing device for every student). The existing literature regarding technology leadership is limited, as is literature on how districts use existing collaborative structures…
Nonlinear Dynamics and Control of Flexible Structures
1991-03-01
of which might be used for space applications. This project was a collaborative one involving structural, electrical and mechanical engineers and...methods for vibration analysis and new models to analyze chaotic dynamics in nonlinear structures with large deformations and friction forces. Finally... electrical and mechanical engineers and resulted in nine doctoral dissertations and two masters theses wholly or partially supported by this grant
The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications
NASA Technical Reports Server (NTRS)
Johnston, William E.
2002-01-01
With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources, like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.
NASA Astrophysics Data System (ADS)
Dodani, Sheel C.; Kiss, Gert; Cahn, Jackson K. B.; Su, Ye; Pande, Vijay S.; Arnold, Frances H.
2016-05-01
The dynamic motions of protein structural elements, particularly flexible loops, are intimately linked with diverse aspects of enzyme catalysis. Engineering of these loop regions can alter protein stability, substrate binding and even dramatically impact enzyme function. When these flexible regions are unresolvable structurally, computational reconstruction in combination with large-scale molecular dynamics simulations can be used to guide the engineering strategy. Here we present a collaborative approach that consists of both experiment and computation and led to the discovery of a single mutation in the F/G loop of the nitrating cytochrome P450 TxtE that simultaneously controls loop dynamics and completely shifts the enzyme's regioselectivity from the C4 to the C5 position of L-tryptophan. Furthermore, we find that this loop mutation is naturally present in a subset of homologous nitrating P450s and confirm that these uncharacterized enzymes exclusively produce 5-nitro-L-tryptophan, a previously unknown biosynthetic intermediate.
ERIC Educational Resources Information Center
Raczynski, Kevin R.; Cohen, Allan S.; Engelhard, George, Jr.; Lu, Zhenqiu
2015-01-01
There is a large body of research on the effectiveness of rater training methods in the industrial and organizational psychology literature. Less has been reported in the measurement literature on large-scale writing assessments. This study compared the effectiveness of two widely used rater training methods--self-paced and collaborative…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Blarigan, P.
A hydrogen fueled engine is being developed specifically for the auxiliary power unit (APU) in a series type hybrid vehicle. Hydrogen is different from other internal combustion (IC) engine fuels, and hybrid vehicle IC engine requirements are different from those of other IC vehicle engines. Together these differences will allow a new engine design based on first principles that will maximize thermal efficiency while minimizing principal emissions. The experimental program is proceeding in four steps: (1) Demonstration of the emissions and the indicated thermal efficiency capability of a standard CLR research engine modified for higher compression ratios and hydrogen fueled operation. (2) Design and test a new combustion chamber geometry for an existing single cylinder research engine, in an attempt to improve on the baseline indicated thermal efficiency of the CLR engine. (3) Design and build, in conjunction with an industrial collaborator, a new full scale research engine designed to maximize brake thermal efficiency. Include a full complement of combustion diagnostics. (4) Incorporate all of the knowledge thus obtained in the design and fabrication, by an industrial collaborator, of the hydrogen fueled engine for the hybrid vehicle power train illustrator. Results of the CLR baseline engine testing are presented, as well as preliminary data from the new combustion chamber engine. The CLR data confirm the low NOx produced by lean operation. The preliminary indicated thermal efficiency data from the new combustion chamber design engine show an improvement relative to the CLR engine. Comparison with previous high compression engine results shows reasonable agreement.
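The connection between higher compression ratio and indicated thermal efficiency that motivates step (1) can be illustrated with the ideal Otto-cycle relation. This is a textbook idealization under stated assumptions (air-standard cycle, gamma = 1.4), not the project's actual engine model, and the compression ratios below are chosen purely for illustration:

```python
# Ideal Otto-cycle thermal efficiency: eta = 1 - r**(1 - gamma),
# where r is the compression ratio and gamma the heat-capacity ratio.
def otto_efficiency(r, gamma=1.4):
    return 1.0 - r ** (1.0 - gamma)

# Raising the compression ratio raises the ideal efficiency noticeably,
# which is why hydrogen's tolerance for lean, high-compression operation
# is attractive for an APU engine.
for r in (8, 11, 14):
    print(f"r = {r:2d}:1  ->  ideal eta = {otto_efficiency(r):.3f}")
```

Real indicated efficiencies fall below this ideal, but the monotonic trend with compression ratio is the point of the baseline CLR tests.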
NASA Astrophysics Data System (ADS)
Moore, R. T.; Hansen, M. C.
2011-12-01
Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. 
Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.
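The "best-pixel" compositing and spectral-index computations described above can be sketched with plain NumPy. The arrays below are synthetic stand-ins for cloud-masked Landsat red/NIR bands (the real Earth Engine API is a hosted, intrinsically-parallel service and looks quite different):

```python
import numpy as np

# Three hypothetical acquisitions of the same 2x2 tile, shape (time, y, x).
# NaN marks pixels lost to cloud or sensor gaps.
red = np.array([[[0.10, np.nan], [0.12, 0.11]],
                [[0.09, 0.30],   [np.nan, 0.10]],
                [[0.11, 0.28],   [0.13, np.nan]]])
nir = np.array([[[0.40, np.nan], [0.42, 0.45]],
                [[0.38, 0.35],   [np.nan, 0.44]],
                [[0.41, 0.33],   [0.43, np.nan]]])

# "Best-pixel" composite: per-pixel median over time, ignoring gaps,
# which removes clouds and fills holes where any clear view exists.
red_c = np.nanmedian(red, axis=0)
nir_c = np.nanmedian(nir, axis=0)

# NDVI, a standard spectral index: (NIR - red) / (NIR + red).
ndvi = (nir_c - red_c) / (nir_c + red_c)
print(ndvi)
```

Scaled up, the same per-pixel reduction is what makes the compositing embarrassingly parallel across Google's data centers.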
Graduate engineering research participation in aeronautics
NASA Technical Reports Server (NTRS)
Roberts, A. S., Jr.
1986-01-01
The Aeronautics Graduate Research Program commenced in 1971, with the primary goal of engaging students who qualified for regular admission to the Graduate School of Engineering at Old Dominion University in a graduate engineering research and study program in collaboration with NASA Langley Research Center, Hampton, Virginia. The format and purposes of this program are discussed. Student selection and program statistics are summarized. Abstracts are presented in the following areas: aircraft design, aerodynamics, lift/drag characteristics; avionics; fluid mechanics; solid mechanics; instrumentation and measurement techniques; thermophysical properties experiments; large space structures; earth orbital dynamics; and environmental engineering.
Jeffery B. Cannon; Kevin J. Barrett; Benjamin M. Gannon; Robert N. Addington; Mike A. Battaglia; Paula J. Fornwalt; Gregory H. Aplet; Antony S. Cheng; Jeffrey L. Underhill; Jennifer S. Briggs; Peter M. Brown
2018-01-01
In response to large, severe wildfires in historically fire-adapted forests in the western US, policy initiatives, such as the USDA Forest Service's Collaborative Forest Landscape Restoration Program (CFLRP), seek to increase the pace and scale of ecological restoration. One required component of this program is collaborative adaptive management, in which monitoring...
High-Lift Engine Aeroacoustics Technology (HEAT) Test Program Overview
NASA Technical Reports Server (NTRS)
Zuniga, Fanny A.; Smith, Brian E.
1999-01-01
The NASA High-Speed Research program developed the High-Lift Engine Aeroacoustics Technology (HEAT) program to demonstrate satisfactory interaction between the jet noise suppressor and high-lift system of a High-Speed Civil Transport (HSCT) configuration at takeoff, climb, approach and landing conditions. One scheme for reducing jet exhaust noise generated by an HSCT is the use of a mixer-ejector system which would entrain large quantities of ambient air into the nozzle exhaust flow through secondary inlets in order to cool and slow the jet exhaust before it exits the nozzle. The effectiveness of such a noise suppression device must be evaluated in the presence of an HSCT wing high-lift system before definitive assessments can be made concerning its acoustic performance. In addition, these noise suppressors must provide the required acoustic attenuation while not degrading the thrust efficiency of the propulsion system or the aerodynamic performance of the high-lift devices on the wing. Therefore, the main objective of the HEAT program is to demonstrate these technologies and understand their interactions on a large-scale HSCT model. The HEAT program is a collaborative effort between NASA-Ames, Boeing Commercial Airplane Group, Douglas Aircraft Corp., Lockheed-Georgia, General Electric and NASA - Lewis. The suppressor nozzles used in the tests were Generation 1 2-D mixer-ejector nozzles made by General Electric. The model used was a 13.5%-scale semi-span model of a Boeing Reference H configuration.
Chesler, Naomi C; Ruis, A R; Collier, Wesley; Swiecki, Zachari; Arastoopour, Golnaz; Williamson Shaffer, David
2015-02-01
Engineering virtual internships are a novel paradigm for providing authentic engineering experiences in the first-year curriculum. They are both individualized and accommodate large numbers of students. As we describe in this report, this approach can (a) enable students to solve complex engineering problems in a mentored, collaborative environment; (b) allow educators to assess engineering thinking; and (c) provide an introductory experience that students enjoy and find valuable. Furthermore, engineering virtual internships have been shown to increase students' (and especially women's) interest in and motivation to pursue engineering degrees. When implemented in first-year engineering curricula more broadly, the potential impact of engineering virtual internships on the size and diversity of the engineering workforce could be dramatic.
Reconsidering Replication: New Perspectives on Large-Scale School Improvement
ERIC Educational Resources Information Center
Peurach, Donald J.; Glazer, Joshua L.
2012-01-01
The purpose of this analysis is to reconsider organizational replication as a strategy for large-scale school improvement: a strategy that features a "hub" organization collaborating with "outlet" schools to enact school-wide designs for improvement. To do so, we synthesize a leading line of research on commercial replication to construct a…
Collaborative mining and interpretation of large-scale data for biomedical research insights.
Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis
2014-01-01
Biomedical research becomes increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions, by displaying the aggregated information according to their needs, while also exploiting the associated human intelligence.
NASA Astrophysics Data System (ADS)
Lorenzo Alvarez, Jose; Metselaar, Harold; Amiaux, Jerome; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis M.; Salvignol, Jean-Christophe; Laureijs, René J.; Vavrek, Roland
2016-08-01
In recent years, the systems engineering field has been coming to terms with a paradigm change in the approach to complexity management. Different strategies have been proposed to cope with highly interrelated systems, systems of systems, and collaborative system engineering, and a significant effort is being invested in standardization and ontology definition. In particular, Model Based System Engineering (MBSE) intends to introduce methodologies for systematic system definition, development, validation, deployment, operation and decommissioning, based on logical and visual relationship mapping rather than traditional 'document based' information management. The practical implementation in real large-scale projects is not uniform across fields. In space science missions, usage has been limited to subsystems or sample projects, with modeling being performed 'a-posteriori' in many instances. The main hurdle for the introduction of MBSE practices in new projects is still the difficulty of demonstrating their added value to a project and whether their benefit is commensurate with the level of effort required to put them in place. In this paper we present the implemented Euclid system modeling activities, and an analysis of the benefits and limitations identified, to support in particular requirement break-down and allocation, and verification planning at mission level.
Memory Transmission in Small Groups and Large Networks: An Agent-Based Model.
Luhmann, Christian C; Rajaram, Suparna
2015-12-01
The spread of social influence in large social networks has long been an interest of social scientists. In the domain of memory, collaborative memory experiments have illuminated cognitive mechanisms that allow information to be transmitted between interacting individuals, but these experiments have focused on small-scale social contexts. In the current study, we took a computational approach, circumventing the practical constraints of laboratory paradigms and providing novel results at scales unreachable by laboratory methodologies. Our model embodied theoretical knowledge derived from small-group experiments and replicated foundational results regarding collaborative inhibition and memory convergence in small groups. Ultimately, we investigated large-scale, realistic social networks and found that agents are influenced by the agents with which they interact, but we also found that agents are influenced by nonneighbors (i.e., the neighbors of their neighbors). The similarity between these results and the reports of behavioral transmission in large networks offers a major theoretical insight by linking behavioral transmission to the spread of information. © The Author(s) 2015.
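A toy version of such an agent-based model (my own minimal sketch, not the authors' implementation) conveys the core mechanism: agents on a ring network exchange memory items only with direct neighbors, yet shared content can still propagate to non-neighbors through repeated local interactions:

```python
import random

random.seed(7)
N, ITEMS, STEPS = 20, 50, 200

# Each agent starts with a small private set of "memories" drawn from a
# shared pool of possible items.
memories = [set(random.sample(range(ITEMS), 5)) for _ in range(N)]

def overlap(a, b):
    # Number of memory items two agents hold in common.
    return len(memories[a] & memories[b])

before = overlap(0, 10)  # agents 0 and 10 are not neighbors on the ring

for _ in range(STEPS):
    a = random.randrange(N)
    b = (a + 1) % N                      # interact with a ring neighbor
    item = random.choice(sorted(memories[a]))
    memories[b].add(item)                # social transmission of one item

after = overlap(0, 10)
print(f"shared items between non-neighbors 0 and 10: {before} -> {after}")
```

Because items spread hop by hop, overlap grows even between agents who never interact directly, echoing the paper's finding that agents are influenced by the neighbors of their neighbors.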
NASA Technical Reports Server (NTRS)
Braun, R. D.; Kroo, I. M.
1995-01-01
Collaborative optimization is a design architecture applicable in any multidisciplinary analysis environment but specifically intended for large-scale distributed analysis applications. In this approach, a complex problem is hierarchically decomposed along disciplinary boundaries into a number of subproblems which are brought into multidisciplinary agreement by a system-level coordination process. When applied to problems in a multidisciplinary design environment, this scheme has several advantages over traditional solution strategies. These advantageous features include a reduction in the amount of information transferred between disciplines, the removal of large iteration loops, the ability to use different subspace optimizers among the various analysis groups, an analysis framework which is easily parallelized and can operate on heterogeneous equipment, and a structural framework that is well-suited to conventional disciplinary organizations. In this article, the collaborative architecture is developed and its mathematical foundation is presented. An example application is also presented which highlights the potential of this method for use in large-scale design applications.
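The coordination idea can be sketched on a deliberately tiny problem (an illustration of the decomposition pattern, not the paper's formulation): two "disciplines" share one design variable, each subproblem balances its own objective against a system-level target, and the system level updates the target until the disciplines agree.

```python
# Two disciplinary subproblems share one design variable x.
# Discipline A prefers x near 3, discipline B prefers x near 5.
# Each subproblem minimizes (x - preferred)**2 + w*(x - z)**2, i.e. its
# own objective plus a penalty for deviating from the system target z.
def subproblem(preferred, z, w=1.0):
    # Closed-form minimizer of the quadratic subproblem.
    return (preferred + w * z) / (1.0 + w)

z = 0.0
for _ in range(100):
    xa = subproblem(3.0, z)
    xb = subproblem(5.0, z)
    z = 0.5 * (xa + xb)      # system-level coordination step

# The iteration converges to z = 4, the compromise between disciplines.
print(f"agreed design variable: z = {z:.4f}")
```

Note how the disciplines never exchange information directly: all agreement flows through the system-level target, which is what removes the large iteration loops between analysis groups.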
Experimental Studies of Instability Development in Magnetically Driven Systems
Awe, Thomas James
2015-03-01
The author highlights results from a variety of experiments on the Z Machine, for which he served as the lead experimentalist. All experiments on Z take dedicated effort from a large collaboration of scientists, engineers, and technicians.
NASA Technical Reports Server (NTRS)
Shivers, J. P.; Mclemore, H. C.; Coe, P. L., Jr.
1976-01-01
Tests have been conducted in a full scale tunnel to determine the low speed aerodynamic characteristics of a large scale advanced arrow wing supersonic transport configuration with engines mounted above the wing for upper surface blowing. Tests were made over an angle of attack range of -10 deg to 32 deg, sideslip angles of + or - 5 deg, and a Reynolds number range of 3,530,000 to 7,330,000. Configuration variables included trailing edge flap deflection, engine jet nozzle angle, engine thrust coefficient, engine out operation, and asymmetrical trailing edge boundary layer control for providing roll trim. Downwash measurements at the tail were obtained for different thrust coefficients, tail heights, and at two fuselage stations.
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Xing, Z.
2007-12-01
The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources.
In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from ftp sites; invoke remote analysis operators available as SOAP services (interface described by a WSDL document); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDFEOS and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.
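The "tree of operators" execution model can be sketched in a few lines of Python (a generic illustration of dataflow-tree evaluation, not SciFlo's actual engine or API): each node names an operator, its children supply its inputs, and evaluating a node first evaluates its children, just as a workflow engine resolves operator inputs before invoking a service.

```python
# A dataflow is a tree: (operator_name, [child_trees or literal inputs]).
# The operator names and data here are hypothetical stand-ins for the
# remote services and data granules a real workflow would invoke.
OPERATORS = {
    "merge":  lambda *parts: [x for part in parts for x in part],
    "subset": lambda data, lo, hi: [x for x in data if lo <= x <= hi],
    "mean":   lambda data: sum(data) / len(data),
}

def execute(node):
    if not isinstance(node, tuple):      # a literal input (e.g. a granule)
        return node
    name, args = node
    # Recursively evaluate children, then apply the named operator.
    return OPERATORS[name](*(execute(a) for a in args))

# Merge two "granules", subset to the range [3, 9], then take the mean.
flow = ("mean", [("subset", [("merge", [[1, 5, 9], [3, 7, 11]]), 3, 9])])
print(execute(flow))
```

A real engine adds what this sketch omits: parallel evaluation of independent subtrees, remote invocation, and optimization of where each operator runs.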
Content validation of an interprofessional learning video peer assessment tool.
Nisbet, Gillian; Jorm, Christine; Roberts, Chris; Gordon, Christopher J; Chen, Timothy F
2017-12-16
Large-scale models of interprofessional learning (IPL) where outcomes are assessed are rare within health professional curricula. To date, there is sparse research describing robust assessment strategies to support such activities. We describe the development of an IPL assessment task based on peer rating of a student generated video evidencing collaborative interprofessional practice. We provide content validation evidence of an assessment rubric in the context of large-scale IPL. Two established approaches to scale development in an educational setting were combined. A literature review was undertaken to develop a conceptual model of the relevant domains and issues pertaining to assessment of student generated videos within IPL. Starting with a prototype rubric developed from the literature, a series of staff and student workshops were undertaken to integrate expert opinion and user perspectives. Participants assessed five-minute videos produced in a prior pilot IPL activity. Outcomes from each workshop informed the next version of the rubric until agreement was reached on anchoring statements and criteria. At this point the rubric was declared fit to be used in the upcoming mandatory large-scale IPL activity. The assessment rubric consisted of four domains: patient issues; interprofessional negotiation; interprofessional management plan in action; and effective use of the video medium to engage the audience. The first three domains reflected topic content relevant to the underlying construct of interprofessional collaborative practice. The fourth domain was consistent with the broader video assessment literature calling for greater emphasis on creativity in education. We have provided evidence for the content validity of a video-based peer assessment task portraying interprofessional collaborative practice in the context of large-scale IPL activities for healthcare professional students. Further research is needed to establish the reliability of such a scale.
CasCADe: A Novel 4D Visualization System for Virtual Construction Planning.
Ivson, Paulo; Nascimento, Daniel; Celes, Waldemar; Barbosa, Simone Dj
2018-01-01
Building Information Modeling (BIM) provides an integrated 3D environment to manage large-scale engineering projects. The Architecture, Engineering and Construction (AEC) industry explores 4D visualizations over these datasets for virtual construction planning. However, existing solutions lack adequate visual mechanisms to inspect the underlying schedule and make inconsistencies readily apparent. The goal of this paper is to apply best practices of information visualization to improve 4D analysis of construction plans. We first present a review of previous work that identifies common use cases and limitations. We then consulted with AEC professionals to specify the main design requirements for such applications. These guided the development of CasCADe, a novel 4D visualization system where task sequencing and spatio-temporal simultaneity are immediately apparent. This unique framework enables the combination of diverse analytical features to create an information-rich analysis environment. We also describe how engineering collaborators used CasCADe to review the real-world construction plans of an Oil & Gas process plant. The system made evident schedule uncertainties, identified work-space conflicts and helped analyze other constructability issues. The results and contributions of this paper suggest new avenues for future research in information visualization for the AEC industry.
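The work-space conflict check at the heart of such 4D analysis reduces to finding tasks that overlap in both space and time. A minimal sketch (hypothetical task data and a flat representation, not CasCADe's data model or BIM schema):

```python
# A task occupies a named work space over a time interval (in, say, days);
# two tasks conflict when they share a space and their intervals overlap.
tasks = [
    ("erect scaffolding", "area-A", (0, 5)),
    ("install piping",    "area-A", (4, 9)),   # overlaps scaffolding in area-A
    ("pour foundation",   "area-B", (2, 6)),   # different space: no conflict
]

def conflicts(tasks):
    found = []
    for i, (name_a, space_a, (a0, a1)) in enumerate(tasks):
        for name_b, space_b, (b0, b1) in tasks[i + 1:]:
            # Half-open interval overlap test: a0 < b1 and b0 < a1.
            if space_a == space_b and a0 < b1 and b0 < a1:
                found.append((name_a, name_b))
    return found

print(conflicts(tasks))
```

In a real 4D system the "space" comparison is a 3D geometric intersection over BIM elements rather than a label match, but the spatio-temporal simultaneity logic is the same.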
NASA Astrophysics Data System (ADS)
Aliseda, Alberto; Bourgoin, Mickael; Eswirp Collaboration
2014-11-01
We present preliminary results from a recent grid turbulence experiment conducted at the ONERA wind tunnel in Modane, France. The ESWIRP Collaboration was conceived to probe the smallest scales of a canonical turbulent flow at very high Reynolds numbers. To achieve this, the largest scales of the turbulence need to be extremely big so that, even with the large separation of scales, the smallest scales would be well above the spatial and temporal resolution of the instruments. The ONERA wind tunnel in Modane (8-m-diameter test section) was chosen as a limit of the biggest large scales achievable in a laboratory setting. A giant inflatable grid (M = 0.8 m) was conceived to induce slowly-decaying homogeneous isotropic turbulence in a large region of the test section, with minimal structural risk. An international team of researchers collected hot wire anemometry, ultrasound anemometry, resonant cantilever anemometry, fast pitot tube anemometry, cold wire thermometry and high-speed particle tracking data of this canonical turbulent flow. While analysis of this large database, which will become publicly available over the next 2 years, has only started, the Taylor-scale Reynolds number is estimated to be between 400 and 800, with Kolmogorov scales as large as a few mm. The ESWIRP Collaboration is an international team of scientists formed to investigate experimentally the smallest scales of turbulence. It was funded by the European Union to take advantage of the largest wind tunnel in Europe for fundamental research.
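The scale-separation argument above can be made concrete with standard isotropic-turbulence relations (textbook scalings, not the collaboration's analysis; the integral scale L = 4 m below is an illustrative assumption, a few times the grid mesh M = 0.8 m):

```python
# Why a huge tunnel helps: with the energy-containing scale L fixed by
# the facility, the Kolmogorov scale eta = L * Re_L**(-3/4) stays
# measurably large even at high Reynolds number. The large-scale
# Reynolds number is related to the Taylor-scale one by
# Re_L ~ (3/20) * Re_lambda**2 (standard isotropic-turbulence estimate).
L = 4.0  # integral length scale in m - an assumed, illustrative value
for re_lambda in (400, 600, 800):
    re_L = (3.0 / 20.0) * re_lambda ** 2
    eta = L * re_L ** -0.75
    print(f"Re_lambda = {re_lambda}: eta ~ {eta * 1e3:.1f} mm")
```

For the quoted range of Taylor-scale Reynolds numbers this estimate gives Kolmogorov scales of order a millimeter, consistent with the abstract's "as large as a few mm" and large enough for hot-wire and cantilever probes to resolve.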
Software Engineering Research/Developer Collaborations (C104)
NASA Technical Reports Server (NTRS)
Shell, Elaine; Shull, Forrest
2005-01-01
The goal of this collaboration was to produce Flight Software Branch (FSB) process standards for software inspections which could be used across three new missions within the FSB. The standard was developed by Dr. Forrest Shull (Fraunhofer Center for Experimental Software Engineering, Maryland) using the Perspective-Based Inspection approach (PBI research has been funded by SARP), then tested on a pilot Branch project. Because the short time scale of the collaboration ruled out a quantitative evaluation, it would be decided whether the standard was suitable for roll-out to other Branch projects based on a qualitative measure: whether the standard received high ratings from Branch personnel as to usability and overall satisfaction. The project used for piloting the Perspective-Based Inspection approach was a multi-mission framework designed for reuse. This was a good choice because key representatives from the three new missions would be involved in the inspections. The perspective-based approach was applied to produce inspection procedures tailored for the specific quality needs of the branch. The technical information to do so was largely drawn through a series of interviews with Branch personnel. The framework team used the procedures to review requirements. The inspections were useful for indicating that a restructuring of the requirements document was needed, which led to changes in the development project plan. The standard was sent out to other Branch personnel for review. Branch personnel were very positive. However, important changes were identified because the perspective of Attitude Control System (ACS) developers had not been adequately represented, a result of the specific personnel interviewed. The net result is that with some further work to incorporate the ACS perspective, and in synchrony with the roll out of independent Branch standards, the PBI approach will be implemented in the FSB.
Also, the project intends to continue its collaboration with the technology provider (Dr. Forrest Shull) past the end of the grant, to allow a more rigorous quantitative evaluation.
NASA Astrophysics Data System (ADS)
Moore, S. L.; Kar, A.; Gomez, R.
2015-12-01
A partnership between Fort Valley State University (FVSU), the Jackson School of Geosciences at The University of Texas (UT) at Austin, and the Texas Advanced Computing Center (TACC) is engaging computational geoscience faculty and researchers with academically talented underrepresented minority (URM) students, training these next-generation computational geoscientists to solve some of the world's most challenging geoscience grand challenges, which require data-intensive, large-scale modeling and simulation on high-performance computers. UT Austin's geoscience outreach program GeoFORCE, recently awarded the Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring, contributes to the collaborative best practices for engaging researchers with URM students. Collaborative efforts over the past decade are providing data demonstrating that integrative pipeline programs with mentoring and paid internship opportunities, multi-year scholarships, computational training, and communication-skills development are having an impact on URMs developing middle skills for geoscience careers. Since 1997, the Cooperative Developmental Energy Program at FVSU and its collaborating universities have graduated 87 engineers, 33 geoscientists, and eight health physicists. Recruited as early as high school, students enroll for three years at FVSU, majoring in mathematics, chemistry or biology, and then transfer to UT Austin or other partner institutions to complete a second STEM degree, including geosciences. A partnership with the Integrative Computational Education and Research Traineeship (ICERT), a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at TACC, provides students with a 10-week summer research experience at UT Austin. 
Mentored by TACC researchers, students with no previous background in computational science learn to use some of the world's most powerful high performance computing resources to address a grand geosciences problem. Students increase their ability to understand and explain the societal impact of their research and communicate the research to multidisciplinary and lay audiences via near-peer mentoring, poster presentations, and publication opportunities.
Studies on combined model based on functional objectives of large scale complex engineering
NASA Astrophysics Data System (ADS)
Yuting, Wang; Jingchun, Feng; Jiabao, Sun
2018-03-01
Because large-scale complex engineering encompasses various functions, and each function is realized through the completion of one or more projects, the combined projects affecting each function must be identified. Based on the types of project portfolio, we analyzed the relationship between projects and their functional objectives. On that premise, we introduced portfolio-project techniques based on functional objectives, and then studied and proposed the principles governing such techniques. In addition, the processes of combined projects were also constructed. With the help of portfolio-project techniques based on the functional objectives of projects, our research findings lay a good foundation for portfolio management of large-scale complex engineering.
Wu, Junjun; Du, Guocheng; Zhou, Jingwen; Chen, Jian
2014-10-20
Flavonoids possess pharmaceutical potential due to their health-promoting activities. The complex structures of these products make extraction from plants difficult, and chemical synthesis is limited because of the use of many toxic solvents. Microbial production offers an alternate way to produce these compounds on an industrial scale in a more economical and environment-friendly manner. However, at present microbial production has been achieved only on a laboratory scale and improvements and scale-up of these processes remain challenging. Naringenin and pinocembrin, which are flavonoid scaffolds and precursors for most of the flavonoids, are the model molecules that are key to solving the current issues restricting industrial production of these chemicals. The emergence of systems metabolic engineering, which combines systems biology with synthetic biology and evolutionary engineering at the systems level, offers new perspectives on strain and process optimization. In this review, current challenges in large-scale fermentation processes involving flavonoid scaffolds and the strategies and tools of systems metabolic engineering used to overcome these challenges are summarized. This will offer insights into overcoming the limitations and challenges of large-scale microbial production of these important pharmaceutical compounds. Copyright © 2014 Elsevier B.V. All rights reserved.
Operational aspects of CASA UNO '88-The first large scale international GPS geodetic network
NASA Technical Reports Server (NTRS)
Neilan, Ruth E.; Dixon, T. H.; Meehan, Thomas K.; Melbourne, William G.; Scheid, John A.; Kellogg, J. N.; Stowell, J. L.
1989-01-01
For three weeks, from January 18 to February 5, 1988, scientists and engineers from 13 countries and 30 international agencies and institutions cooperated in the most extensive GPS (Global Positioning System) field campaign, and the largest geodynamics experiment, in the world to date. This collaborative experiment concentrated GPS receivers in Central and South America. The predicted rates of motions are on the order of 5-10 cm/yr. Global coverage of GPS observations spanned 220 deg of longitude and 125 deg of latitude using a total of 43 GPS receivers. The experiment was the first civilian effort at implementing an extended international GPS satellite tracking network. Covariance analyses incorporating the extended tracking network predicted significant improvement in precise orbit determination, allowing accurate long-baseline geodesy in the science areas.
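The long-baseline geodesy mentioned above ultimately reduces to differencing precise station positions. As a simple illustration, a baseline length can be computed as the chord between two stations' Earth-centered Earth-fixed (ECEF) coordinates on the WGS-84 ellipsoid; the station coordinates below are hypothetical, not actual CASA UNO sites:

```python
import math

def geodetic_to_ecef(lat_deg, lon_deg, h=0.0):
    """Convert geodetic latitude/longitude/height to ECEF coordinates
    (meters) on the WGS-84 ellipsoid."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1.0 / 298.257223563     # WGS-84 flattening
    e2 = f * (2.0 - f)          # first eccentricity squared
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = a / math.sqrt(1.0 - e2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - e2) + h) * math.sin(lat)
    return x, y, z

def baseline_m(p1, p2):
    """Chord distance (meters) between two geodetic (lat, lon) points."""
    return math.dist(geodetic_to_ecef(*p1), geodetic_to_ecef(*p2))

# Hypothetical stations in Colombia and Ecuador (roughly a 700 km baseline)
b = baseline_m((4.6, -74.1), (-0.2, -78.5))
```

Repeated campaigns then estimate how such baselines change over time, which is how the 5-10 cm/yr plate motions would be detected.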
Development of Shape Memory Alloys- Challenges and Solutions
NASA Technical Reports Server (NTRS)
Benafan, Othmane
2016-01-01
Shape memory alloys (SMAs) are a unique class of multifunctional materials that have the ability to recover large deformations or generate high stresses in response to thermal, mechanical and/or electromagnetic stimuli. These abilities have made them a viable option for actuation systems in aerospace, medical, and automotive applications, amongst others. However, despite many advantages and the fact that SMA actuators have been developed and used for many years, so far they have only found service in a limited range of applications. In order to expand their applications, further developments are needed to increase their reliability and stability and to address the processing, testing and qualification needed for large-scale commercial application of SMA actuators. In this seminar, historical inhibitors of SMA applications and current research efforts by NASA Glenn Research Center and collaborators will be discussed. Relationships between fundamental physical/scientific understanding and the direct transition to engineering and design of mechanisms using these novel materials will be highlighted. Examples will be presented related to targeted alloy development, microstructural control, and bulk-scale testing as a function of stresses, temperatures and harsh environments. The seminar will conclude with a summary of SMA applications under development and current advances.
Engaging high school students as plasma science outreach ambassadors
NASA Astrophysics Data System (ADS)
Wendt, Amy; Boffard, John
2017-10-01
Exposure to plasma science among future scientists and engineers is haphazard. In the U.S., plasma science is rare (or absent) in mainstream high school and introductory college physics curricula. As a result, talented students may be drawn to other careers simply due to a lack of awareness of the stimulating science and wide array of fulfilling career opportunities involving plasmas. In the interest of enabling informed decisions about career options, we have initiated an outreach collaboration with the Madison West High School Rocket Club. Rocket Club members regularly exhibit their activities at public venues, including large-scale expos that draw large audiences of all ages. Building on their historical emphasis on small scale rockets with chemical motors, we worked with the group to add a new feature to their exhibit that highlights plasma-based spacecraft propulsion for interplanetary probes. This new exhibit includes a model satellite with a working (low power) plasma thruster. The participating high school students led the development process, to be described, and enthusiastically learned to articulate concepts related to plasma thruster operation and to compare the relative advantages of chemical vs. plasma/electrical propulsion systems for different scenarios. Supported by NSF Grant PHY-1617602.
The Ceiling to Coproduction in University-Industry Research Collaboration
ERIC Educational Resources Information Center
McCabe, Angela; Parker, Rachel; Cox, Stephen
2016-01-01
The purpose of this paper is to provide insight into government attempts at bridging the divide between theory and practice through university-industry research collaboration modelled under engaged scholarship. The findings are based on data sourced from interviews with 47 academic and industry project leaders from 23 large-scale research…
Understanding life together: A brief history of collaboration in biology
Vermeulen, Niki; Parker, John N.; Penders, Bart
2013-01-01
The history of science shows a shift from single-investigator ‘little science’ to increasingly large, expensive, multinational, interdisciplinary and interdependent ‘big science’. In physics and allied fields this shift has been well documented, but the rise of collaboration in the life sciences and its effect on scientific work and knowledge has received little attention. Research in biology exhibits different historical trajectories and organisation of collaboration in field and laboratory – differences still visible in contemporary collaborations such as the Census of Marine Life and the Human Genome Project. We employ these case studies as strategic exemplars, supplemented with existing research on collaboration in biology, to expose the different motives, organisational forms and social dynamics underpinning contemporary large-scale collaborations in biology and their relations to historical patterns of collaboration in the life sciences. We find that the interaction between research subject, research approach and research organisation influences collaboration patterns and the work of scientists. PMID:23578694
Hydroentangled High Quality (HQ) Cotton Developments: Cosmetic Pads and Greige Cotton Bed Sheets
USDA-ARS?s Scientific Manuscript database
The hydroentangled development work (at plant scale) was carried out in 2004 in collaboration with Hollingsworth on Wheels, Greenville, SC, and Fleissener, Germany. This work was published as two papers in the Journal of Engineered Fibers and Fabrics in 2006 and 2007. Early this year physical test...
NASA Astrophysics Data System (ADS)
Cała, Marek; Borowski, Marek
2018-03-01
The AGH University of Science and Technology collaborates closely with other universities, economic units, and governmental and local administrative bodies. International cooperation plays a very important role in academic research, and AGH UST has signed many collaboration agreements aimed at multidimensional cooperation in the fields of education and academic research. AGH UST has always focused on collaboration with business and industry. In recent years, the global economy has been undergoing massive transformations, which creates new challenges for companies and for the educational institutions that cater to the needs of industry. The expansion of business enterprises is largely dependent on their employees' expertise, skills and levels of competence, and certified engineers are provided by universities. Therefore, the qualifications of graduates are determined by the curriculum and teaching methods, as well as by the available educational and research facilities. Of equal importance is qualified academic staff. Human activities in the field of engineering require finding solutions to problems of various nature and magnitude. An engineer's work consists of the design, construction, modification and maintenance of useful devices, processes and systems, using scientific and technical knowledge. In order to design complex engineering solutions, an engineer uses imagination, experience, analytical skills and logical reasoning, and makes conscious use of knowledge. At the Faculty of Mining and Geoengineering of the AGH University of Science and Technology in Cracow, 15 engineers from Vietnam are studying Mining and Geology in second-cycle studies (specialization: mine ventilation). The solutions proposed for these engineers' education ensure that foreign students gain both engineering knowledge and problem-solving skills. Accordingly, the study programme was complemented by a series of practical aspects.
A Collaboration in Support of LBA Science and Data Exchange: Beija-flor and EOS-WEBSTER
NASA Astrophysics Data System (ADS)
Schloss, A. L.; Gentry, M. J.; Keller, M.; Rhyne, T.; Moore, B.
2001-12-01
The University of New Hampshire (UNH) has developed a Web-based tool that makes data, information, products, and services concerning terrestrial ecological and hydrological processes available to the Earth Science community. Our WEB-based System for Terrestrial Ecosystem Research (EOS-WEBSTER) provides a GIS-oriented interface to select, subset, reformat and download three main types of data: selected NASA Earth Observing System (EOS) remotely sensed data products, results from a suite of ecosystem and hydrological models, and geographic reference data. The Large Scale Biosphere-Atmosphere Experiment in Amazonia Project (LBA) has implemented a search engine, Beija-flor, that provides a centralized access point to data sets acquired for and produced by LBA researchers. The metadata in the Beija-flor index describe the content of the data sets and contain links to data distributed around the world. The query system returns a list of data sets that meet the search criteria of the user. A common problem when a user of a system like Beija-flor wants data products located within another system is that the user is required to re-specify information, such as spatial coordinates, in the other system. This poster describes the methodology by which Beija-flor generates a unique URL containing the requested search parameters and passes the information to EOS-WEBSTER, thus making the interactive services and large diverse data holdings in EOS-WEBSTER directly available to Beija-flor users. This "Calling Card" is used by EOS-WEBSTER to generate on-demand custom products tailored to each Beija-flor request. Through a collaborative effort, we have demonstrated the ability to integrate project-specific search engines such as Beija-flor with the products and services of large data systems such as EOS-WEBSTER, to provide very specific information products with a minimal amount of additional programming. 
This methodology has the potential to greatly facilitate research data exchange by enhancing the interoperability of diverse data systems beyond the two described here.
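The "Calling Card" handoff described above amounts to serializing a search as a query string that the receiving system can parse back into the same parameters, sparing the user from re-entering them. A minimal sketch in Python, with the endpoint URL and parameter names purely hypothetical (the actual Beija-flor/EOS-WEBSTER interface is not specified here):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def make_calling_card(base_url, bbox, start_date, end_date, keywords):
    """Encode a search (bounding box, date range, keywords) as a URL
    that a second data system can parse to rebuild the same query."""
    params = {
        "west": bbox[0], "south": bbox[1],
        "east": bbox[2], "north": bbox[3],
        "start": start_date, "end": end_date,
        "keywords": ",".join(keywords),
    }
    return base_url + "?" + urlencode(params)

card = make_calling_card(
    "https://example.edu/webster/search",   # placeholder endpoint
    bbox=(-70.0, -10.0, -50.0, 5.0),        # lon/lat bounding box
    start_date="1999-01-01", end_date="2001-12-31",
    keywords=["LBA", "biomass"],
)

# The receiving system recovers the parameters from the query string,
# then generates its custom product without further user input:
received = {k: v[0] for k, v in parse_qs(urlparse(card).query).items()}
```

The design choice worth noting is that all state lives in the URL itself, so no shared session or prior coordination between the two systems is needed beyond agreeing on parameter names.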
Similarity spectra analysis of high-performance jet aircraft noise.
Neilsen, Tracianne B; Gee, Kent L; Wall, Alan T; James, Michael M
2013-04-01
Noise measured in the vicinity of an F-22A Raptor has been compared to similarity spectra found previously to represent mixing noise from large-scale and fine-scale turbulent structures in laboratory-scale jet plumes. Comparisons have been made for three engine conditions using ground-based sideline microphones, which covered a large angular aperture. Even though the nozzle geometry is complex and the jet is nonideally expanded, the similarity spectra do agree with large portions of the measured spectra. Toward the sideline, the fine-scale similarity spectrum is used, while the large-scale similarity spectrum provides a good fit to the area of maximum radiation. Combinations of the two similarity spectra are shown to match the data in between those regions. Surprisingly, a combination of the two is also shown to match the data at the farthest aft angle. However, at high frequencies the degree of congruity between the similarity spectra and the measured spectra changes with engine condition and angle. At the higher engine conditions, there is a systematically shallower measured high-frequency slope, with the largest discrepancy occurring in the regions of maximum radiation.
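The combination fits mentioned above can be framed as a least-squares problem: find the weights on the large-scale and fine-scale similarity spectra whose sum best matches a measured spectrum. A toy sketch of that fit follows; the spectrum shapes below are illustrative stand-ins, not the actual similarity functions from the jet-noise literature, and the fit is done on linear power spectra rather than dB levels:

```python
import numpy as np

def fit_spectrum_combination(ls_spec, fs_spec, measured):
    """Least-squares weights (a, b) such that a*LS + b*FS best matches
    the measured spectrum. All three arrays share a frequency grid."""
    A = np.column_stack([ls_spec, fs_spec])
    (a, b), *_ = np.linalg.lstsq(A, measured, rcond=None)
    return a, b

# Toy example: a "measured" spectrum built from a known mixture
f = np.linspace(0.1, 10.0, 200)                 # Strouhal-like axis
large_scale = 1.0 / (1.0 + np.log(f) ** 2)      # stand-in shape only
fine_scale = np.exp(-0.5 * (np.log(f) - 1.0) ** 2)
measured = 0.7 * large_scale + 0.3 * fine_scale

a, b = fit_spectrum_combination(large_scale, fine_scale, measured)
```

Sweeping such fits across microphone angle would reproduce, in miniature, the trend reported above: the large-scale weight dominating near the maximum-radiation direction and the fine-scale weight dominating toward the sideline.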
NASA Technical Reports Server (NTRS)
Brown, I. Foster; Moreira, Adriana
1997-01-01
Success of the Large-Scale Biosphere-Atmospheric Experiment in Amazonia (LBA) program depends on several critical factors, the most important being the effective participation of Amazonian researchers and institutions. Without host-country counterparts, particularly in Amazonia, many important studies cannot be undertaken, due either to lack of qualified persons or to legal constraints. No less important, the acceptance of the LBA program in Amazonia is also dependent on what LBA can do to improve scientific expertise in Amazonia. Gaining the active investment of Amazonian scientists in a comprehensive research program is not a trivial task. Potential collaborators are few, particularly where much of the research was originally to be focused: the southern arc of Brazilian Amazonia. The mid-term goals of the LBA Committee on Training and Education are to increase the number of collaborators and to demonstrate that LBA will be of benefit to the region.
Unfolding large-scale online collaborative human dynamics
Zha, Yilong; Zhou, Tao; Zhou, Changsong
2016-01-01
Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of regular patterns and mechanisms is very challenging and still rare. Self-organized online collaborative activities with a precise record of event timing provide an unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to previous actions with a power-law waiting time, and (iii) population growth due to the increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approaches reveal “simplicity” beyond complex interacting human activities. PMID:27911766
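The first two modules of the model above can be sketched as a simple generative simulation: each update is either a Poissonian initiation (exponential inter-arrival time) or a cascading response to the previous update after a power-law waiting time. This is a toy rendition under stated assumptions, with all parameter values illustrative and module (iii), population growth, omitted for brevity:

```python
import random

def simulate_updates(n_events=5000, init_rate=0.05, response_prob=0.8,
                     alpha=2.0, seed=42):
    """Generate event times where each event is a cascading response
    (power-law waiting time, pdf ~ t^-alpha) with probability
    response_prob, else a Poissonian initiation (rate init_rate)."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n_events):
        if rng.random() < response_prob:
            # Power-law waiting time via inverse-transform sampling
            dt = (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
        else:
            # Poissonian initiation: exponential inter-arrival time
            dt = rng.expovariate(init_rate)
        t += dt
        times.append(t)
    return times

times = simulate_updates()
gaps = [b - a for a, b in zip(times, times[1:])]
# Cascades produce many short gaps; initiations add a heavy tail,
# so the mean inter-event time far exceeds the median.
median_gap = sorted(gaps)[len(gaps) // 2]
mean_gap = sum(gaps) / len(gaps)
```

A histogram of `gaps` on log-log axes would show the two regimes whose superposition motivates the double power-law reported in the paper.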
The Multi-Scale Network Landscape of Collaboration.
Bae, Arram; Park, Doheum; Ahn, Yong-Yeol; Park, Juyong
2016-01-01
Propelled by the increasing availability of large-scale, high-quality data, advanced data modeling and analysis techniques are enabling significant new scientific understanding of a wide range of complex social, natural, and technological systems. These developments also provide opportunities for studying cultural systems and phenomena--which can be said to refer to all products of human creativity and way of life. An important characteristic of a cultural product is that it does not exist in isolation from others, but forms an intricate web of connections on many levels. In the creation and dissemination of cultural products and artworks in particular, collaboration and communication of ideas play an essential role, which can be captured in the heterogeneous network of the creators and practitioners of art. In this paper we propose novel methods to analyze and uncover meaningful patterns from such a network, using the network of western classical musicians constructed from a large-scale, comprehensive data set of Compact Disc recordings. We characterize the complex patterns in the network landscape of collaboration between musicians across multiple scales, ranging from the macroscopic to the mesoscopic and microscopic, that represent the diversity of cultural styles and the individuality of the artists.
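A collaboration network of the kind described can be built by connecting musicians who appear on the same recording, with edge weights counting shared recordings. A minimal sketch, with hypothetical credit lists for illustration only (the paper's actual data set and network construction may differ):

```python
from itertools import combinations
from collections import Counter

def collaboration_network(recordings):
    """Build a weighted co-performance network: every pair of
    musicians credited on the same recording gets an edge, and the
    edge weight counts how many recordings the pair shares."""
    edges = Counter()
    for musicians in recordings:
        # sorted() canonicalizes each pair; set() drops duplicate credits
        for pair in combinations(sorted(set(musicians)), 2):
            edges[pair] += 1
    return edges

recordings = [  # hypothetical credits, for illustration only
    ["Karajan", "Berlin Philharmonic", "Mutter"],
    ["Karajan", "Berlin Philharmonic"],
    ["Mutter", "Orkis"],
]
net = collaboration_network(recordings)
```

The resulting weighted edge list is the raw material for the multi-scale analysis: community detection at the mesoscopic level and centrality measures at the microscopic level can both be computed directly from it.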
34 CFR 350.31 - What collaboration must a Rehabilitation Engineering Research Center engage in?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 34 Education 2 2012-07-01 2012-07-01 false What collaboration must a Rehabilitation Engineering... DISABILITY AND REHABILITATION RESEARCH PROJECTS AND CENTERS PROGRAM What Rehabilitation Engineering Research Centers Does the Secretary Assist? § 350.31 What collaboration must a Rehabilitation Engineering Research...
34 CFR 350.31 - What collaboration must a Rehabilitation Engineering Research Center engage in?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 34 Education 2 2013-07-01 2013-07-01 false What collaboration must a Rehabilitation Engineering... DISABILITY AND REHABILITATION RESEARCH PROJECTS AND CENTERS PROGRAM What Rehabilitation Engineering Research Centers Does the Secretary Assist? § 350.31 What collaboration must a Rehabilitation Engineering Research...
34 CFR 350.31 - What collaboration must a Rehabilitation Engineering Research Center engage in?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 2 2010-07-01 2010-07-01 false What collaboration must a Rehabilitation Engineering... DISABILITY AND REHABILITATION RESEARCH PROJECTS AND CENTERS PROGRAM What Rehabilitation Engineering Research Centers Does the Secretary Assist? § 350.31 What collaboration must a Rehabilitation Engineering Research...
34 CFR 350.31 - What collaboration must a Rehabilitation Engineering Research Center engage in?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 34 Education 2 2014-07-01 2013-07-01 true What collaboration must a Rehabilitation Engineering... DISABILITY AND REHABILITATION RESEARCH PROJECTS AND CENTERS PROGRAM What Rehabilitation Engineering Research Centers Does the Secretary Assist? § 350.31 What collaboration must a Rehabilitation Engineering Research...
34 CFR 350.31 - What collaboration must a Rehabilitation Engineering Research Center engage in?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 34 Education 2 2011-07-01 2010-07-01 true What collaboration must a Rehabilitation Engineering... DISABILITY AND REHABILITATION RESEARCH PROJECTS AND CENTERS PROGRAM What Rehabilitation Engineering Research Centers Does the Secretary Assist? § 350.31 What collaboration must a Rehabilitation Engineering Research...
Success in large high-technology projects: What really works?
NASA Astrophysics Data System (ADS)
Crosby, P.
2014-08-01
Despite a plethora of tools, technologies and management systems, successful execution of big science and engineering projects remains problematic. The sheer scale of globally funded projects such as the Large Hadron Collider and the Square Kilometre Array telescope means that lack of project success can impact both national budgets and collaborative reputations. In this paper, I explore data from contemporary literature alongside field research from several current high-technology projects in Europe and Australia, and reveal common `pressure points' that are shown to be key influencers of project control and success. I discuss how mega-science projects sit between being merely complicated and being chaotic, and explain the importance of understanding multiple dimensions of project complexity. Project manager/leader traits are briefly discussed, including capability to govern and control such enterprises. Project structures are examined, including the challenge of collaborations. I show that early attention to building project resilience, curbing optimism, and risk alertness can help prepare large high-tech projects against threats, and why project managers need to understand aspects of `the silent power of time'. Mission assurance is advanced as a critical success function, alongside the deployment of task forces and new combinations of contingency plans. I argue for increased project control through industrial-style project reviews, and show how post-project reviews are an under-used, yet invaluable avenue of personal and organisational improvement. Lastly, I discuss the avoidance of project amnesia through effective capture of project knowledge, and transfer of lessons-learned to subsequent programs and projects.
Remote experimental site concept development
NASA Astrophysics Data System (ADS)
Casper, Thomas A.; Meyer, William; Butner, David
1995-01-01
Scientific research is now often conducted on large and expensive experiments that utilize collaborative efforts on a national or international scale to explore physics and engineering issues. This is particularly true for the current US magnetic fusion energy program, where collaboration on existing facilities has increased in importance and will form the basis for future efforts. As fusion energy research approaches reactor conditions, the trend is towards fewer large and expensive experimental facilities, leaving many major institutions without local experiments. Since the expertise of various groups is a valuable resource, it is important to integrate these teams into an overall scientific program. To sustain continued involvement in experiments, scientists are now often required to travel frequently, or to move their families, to the new large facilities. This problem is common to many other fields of scientific research. The next-generation tokamaks, such as the Tokamak Physics Experiment (TPX) or the International Thermonuclear Experimental Reactor (ITER), will operate in steady-state or long-pulse mode and produce fluxes of fusion reaction products sufficient to activate the surrounding structures. As a direct consequence, remote operation requiring robotics and video monitoring will become necessary, with only brief and limited access to the vessel area allowed. Even the on-site control room, data acquisition facilities, and work areas will be remotely located from the experiment, isolated by large biological barriers, and connected with fiber-optics. Current planning for the ITER experiment includes a network of control room facilities to be located in the countries of the four major international partners: the USA, the Russian Federation, Japan, and the European Community.
The NCI Cohort Consortium is an extramural-intramural partnership formed by the National Cancer Institute to address the need for large-scale collaborations to pool the large quantity of data and biospecimens necessary to conduct a wide range of cancer studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaoliang; Stauffer, Philip H.
This effort is designed to expedite learnings from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.
NASA Astrophysics Data System (ADS)
Govoni, Marco; Galli, Giulia
Green's function based many-body perturbation theory (MBPT) methods are well-established approaches to compute quasiparticle energies and electronic lifetimes. However, their application to large systems - for instance to heterogeneous, nanostructured, disordered, and defective materials - has been hindered by high computational costs. We will discuss recent MBPT methodological developments that lead to an efficient formulation of electron-electron and electron-phonon interactions and that can be applied to systems with thousands of electrons. We will present results using a formulation that requires neither the explicit calculation of virtual states nor the storage and inversion of large dielectric matrices. We will discuss data collections obtained using the WEST code, the advantages of the algorithms used in WEST over standard techniques, and their parallel performance. Work done in collaboration with I. Hamada, R. McAvoy, P. Scherpelz, and H. Zheng. This work was supported by MICCoM, as part of the Computational Materials Sciences Program funded by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, and by ANL.
Interagency Collaborative Team Model for Capacity Building to Scale-Up Evidence-Based Practice
Hurlburt, Michael; Aarons, Gregory A; Fettes, Danielle; Willging, Cathleen; Gunderson, Lara; Chaffin, Mark J
2015-01-01
Background: System-wide scale up of evidence-based practice (EBP) is a complex process. Yet, few strategic approaches exist to support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence-based child neglect intervention (i.e., SafeCare®) within a large children’s service system. The ICT model emphasizes the role of local agency collaborations in creating structural supports for successful implementation. Methods: We describe the ICT model and present preliminary qualitative results from use of the implementation model in one large-scale EBP implementation. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor collaboration and capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Results: Six notable issues relating to implementation process emerged from participant interviews, including: (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Conclusions: Use of the ICT model led to sustained and widespread use of SafeCare in one large county. Although some aspects of the implementation model may benefit from enhancement, qualitative findings suggest that the ICT process generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual variations can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit. PMID:27512239
Interagency Collaborative Team Model for Capacity Building to Scale-Up Evidence-Based Practice.
Hurlburt, Michael; Aarons, Gregory A; Fettes, Danielle; Willging, Cathleen; Gunderson, Lara; Chaffin, Mark J
2014-04-01
System-wide scale up of evidence-based practice (EBP) is a complex process. Yet, few strategic approaches exist to support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence-based child neglect intervention (i.e., SafeCare®) within a large children's service system. The ICT model emphasizes the role of local agency collaborations in creating structural supports for successful implementation. We describe the ICT model and present preliminary qualitative results from use of the implementation model in one large scale EBP implementation. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor collaboration and capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Six notable issues relating to implementation process emerged from participant interviews, including: (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Use of the ICT model led to sustained and widespread use of SafeCare in one large county. Although some aspects of the implementation model may benefit from enhancement, qualitative findings suggest that the ICT process generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual variations can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit.
Analysis of BJ493 diesel engine lubrication system properties
NASA Astrophysics Data System (ADS)
Liu, F.
2017-12-01
The BJ493ZLQ4A diesel engine design is based on the primary model BJ493ZLQ3, whose exhaust emissions level is upgraded to the National GB5 standard through improved combustion- and injection-system designs. Given these changes, the improved properties of the diesel lubrication system are analyzed in this paper. Based on the structures, technical parameters, and indices of the lubrication system, a lubrication system model of the BJ493ZLQ4A diesel engine was constructed using the Flowmaster flow simulation software. The properties of the lubrication system, such as oil flow rate and pressure at different rotational speeds, were analyzed for schemes involving large- and small-scale oil filters. The calculated values of the main oil channel pressure are in good agreement with the experimental results, which verifies the feasibility of the proposed model. The calculations show that the main oil channel pressure and maximum oil flow rate for the large-scale oil filter scheme satisfy the design requirements, whereas the small-scale scheme yields a main oil channel pressure that is too low. Therefore, the application of small-scale oil filters is hazardous, and the large-scale scheme is recommended.
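The filter-size trade-off described in this abstract can be sketched with a lumped-parameter model: pump delivery grows with speed, and a smaller filter imposes a larger (roughly quadratic) pressure loss upstream of the main oil channel. This is an illustrative sketch only; the relief-valve setting, delivery rate, loss coefficients, and pressure limits below are hypothetical placeholders, not BJ493ZLQ4A data or the paper's Flowmaster model.

```python
# Illustrative lumped-parameter sketch of main-oil-channel pressure versus
# filter size. All constants are hypothetical assumptions for this sketch;
# the paper's actual analysis used a Flowmaster model of the real system.

RELIEF_BAR = 5.0   # assumed relief-valve (pump outlet) pressure, bar
MIN_BAR = 3.5      # assumed minimum main-channel pressure at rated speed, bar

def gallery_pressure(rpm, k_filter, q_per_rpm=0.02):
    """Main-channel pressure after a quadratic filter loss.

    q_per_rpm : assumed pump delivery, L/min per rpm (positive-displacement
                pumps deliver roughly in proportion to speed)
    k_filter  : assumed filter loss coefficient, bar/(L/min)^2
                (a smaller filter presents a larger flow resistance)
    """
    q = q_per_rpm * rpm                      # oil flow, L/min
    return max(0.0, RELIEF_BAR - k_filter * q ** 2)

# At an assumed rated speed of 3600 rpm, the low-resistance (large) filter
# keeps the channel above the minimum, while the high-resistance (small)
# filter drops it below the requirement:
large_filter = gallery_pressure(3600, k_filter=2e-4)
small_filter = gallery_pressure(3600, k_filter=6e-4)
print(large_filter >= MIN_BAR, small_filter >= MIN_BAR)  # True False
```

The quadratic loss term is the standard turbulent-restriction assumption; a real analysis would also model the pump map, bearing flows, and oil viscosity with temperature.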
Cyberhubs: Virtual Research Environments for Astronomy
NASA Astrophysics Data System (ADS)
Herwig, Falk; Andrassy, Robert; Annau, Nic; Clarkson, Ondrea; Côté, Benoit; D’Sa, Aaron; Jones, Sam; Moa, Belaid; O’Connell, Jericho; Porter, David; Ritter, Christian; Woodward, Paul
2018-05-01
Collaborations in astronomy and astrophysics are faced with numerous cyber-infrastructure challenges, such as large data sets, the need to combine heterogeneous data sets, and the challenge to effectively collaborate on those large, heterogeneous data sets with significant processing requirements and complex science software tools. The cyberhubs system is an easy-to-deploy package for small- to medium-sized collaborations based on the Jupyter and Docker technology, which allows web-browser-enabled, remote, interactive analytic access to shared data. It offers an initial step to address these challenges. The features and deployment steps of the system are described, as well as the requirements collection through an account of the different approaches to data structuring, handling, and available analytic tools for the NuGrid and PPMstar collaborations. NuGrid is an international collaboration that creates stellar evolution and explosion physics and nucleosynthesis simulation data. The PPMstar collaboration performs large-scale 3D stellar hydrodynamics simulations of interior convection in the late phases of stellar evolution. Examples of science that is currently performed on cyberhubs, in the areas of 3D stellar hydrodynamic simulations, stellar evolution and nucleosynthesis, and Galactic chemical evolution, are presented.
SNMG: a social-level norm-based methodology for macro-governing service collaboration processes
NASA Astrophysics Data System (ADS)
Gao, Ji; Lv, Hexin; Jin, Zhiyong; Xu, Ping
2017-08-01
To adapt to the increasingly open nature of collaborations between enterprises, this paper proposes a Social-level Norm-based methodology for Macro-Governing service collaboration processes, called SNMG, to regulate and control the socially visible macro-behaviors of the individuals participating in collaborations. SNMG not only effectively removes the uncontrollability that hinders open social activities, but also enables collaborations across management domains to be implemented by uniting the centralized controls that social individuals exercise over their respective social activities. This paper thereby provides a new system-construction mode to promote the development and large-scale deployment of service collaborations.
The future of fish passage science, engineering, and practice
Silva, Ana T.; Lucas, Martyn C.; Castro-Santos, Theodore R.; Katopodis, Christos; Baumgartner, Lee J.; Thiem, Jason D.; Aarestrup, Kim; Pompeu, Paulo S.; O'Brien, Gordon C.; Braun, Douglas C.; Burnett, Nicholas J.; Zhu, David Z.; Fjeldstad, Hans-Petter; Forseth, Torbjorn; Rajarathnam, Nallamuthu; Williams, John G.; Cooke, Steven J.
2018-01-01
Much effort has been devoted to developing, constructing and refining fish passage facilities to enable target species to pass barriers on fluvial systems, and yet, fishway science, engineering and practice remain imperfect. In this review, 17 experts from different fish passage research fields (i.e., biology, ecology, physiology, ecohydraulics, engineering) and from different continents (i.e., North and South America, Europe, Africa, Australia) identified knowledge gaps and provided a roadmap for research priorities and technical developments. Once dominated by an engineering‐focused approach, fishway science today involves a wide range of disciplines from fish behaviour to socioeconomics to complex modelling of passage prioritization options in river networks. River barrier impacts on fish migration and dispersal are currently better understood than historically, but basic ecological knowledge underpinning the need for effective fish passage in many regions of the world, including in biodiversity hotspots (e.g., equatorial Africa, South‐East Asia), remains largely unknown. Designing efficient fishways, with minimal passage delay and post‐passage impacts, requires adaptive management and continued innovation. While the use of fishways in river restoration demands a transition towards fish passage at the community scale, advances in selective fishways are also needed to manage invasive fish colonization. Because of the erroneous view in some literature and communities of practice that fish passage is largely a proven technology, improved international collaboration, information sharing, method standardization and multidisciplinary training are needed. Further development of regional expertise is needed in South America, Asia and Africa where hydropower dams are currently being planned and constructed.
Collaboration in the Humanities, Arts and Social Sciences in Australia
ERIC Educational Resources Information Center
Haddow, Gaby; Xia, Jianhong; Willson, Michele
2017-01-01
This paper reports on the first large-scale quantitative investigation into collaboration, demonstrated in co-authorship, by Australian humanities, arts and social sciences (HASS) researchers. Web of Science data were extracted for Australian HASS publications, with a focus on the softer social sciences, over the period 2004-2013. The findings…
2010-03-01
are turning out to be counterproductive because they are culturally anathema (Wollman, 2009). A consideration of psychological tenets by Sigmund ... Freud suggests a principle aspect of dysfunction in collaboration. He reasoned that An Ego governed by social convention and a Superego governed by
2011-09-30
and easy to apply in large-scale physical-biogeochemical simulations. We also collaborate with Dr. Curt Mobley at Sequoia Scientific for the second...we are collaborating with Dr. Curtis Mobley of Sequoia Scientific on improving the link between the radiative transfer model (EcoLight) within the
Transforming Power Systems Through Global Collaboration
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-06-01
Ambitious and integrated policy and regulatory frameworks are crucial to achieve power system transformation. The 21st Century Power Partnership -- a multilateral initiative of the Clean Energy Ministerial -- serves as a platform for public-private collaboration to advance integrated solutions for the large-scale deployment of renewable energy in combination with energy efficiency and grid modernization.
Interdisciplinary Collaboration in Launching a Large-Scale Research Study in Schools
ERIC Educational Resources Information Center
DeLoach, Kendra P.; Dvorsky, Melissa; George, Mellissa R. W.; Miller, Elaine; Weist, Mark D.; Kern, Lee
2012-01-01
Interdisciplinary collaboration (IC) is a critically important theme generally, and of particular significance in school mental health (SMH), given the range of people from different disciplines who work in schools and the various systems in place. Reflecting the move to a true shared school-family-community system agenda, the collaborative…
Reconciling Rigour and Impact by Collaborative Research Design: Study of Teacher Agency
ERIC Educational Resources Information Center
Pantic, Nataša
2017-01-01
This paper illustrates a new way of working collaboratively on the development of a methodology for studying teacher agency for social justice. Increasing emphasis of impact on change as a purpose of social research raises questions about appropriate research designs. Large-scale quantitative research framed within externally set parameters has…
A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering
NASA Astrophysics Data System (ADS)
Ackerman, T. P.
2017-12-01
Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB has substantial differences to SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.
Virtual Collaborative Environments for System of Systems Engineering and Applications for ISAT
NASA Technical Reports Server (NTRS)
Dryer, David A.
2002-01-01
This paper describes a system-of-systems (metasystems) approach and models developed to help prepare engineering organizations for distributed engineering environments. The changes facing engineering enterprises include competition in increasingly global environments, new partnering opportunities created by advances in information and communication technologies, and virtual-collaboration issues associated with dispersed teams. To help address the challenges and needs of this environment, a framework is proposed that can be customized and adapted for NASA to improve engineering activities conducted in distributed, enhanced engineering environments. The approach is designed to prepare engineers for such distributed collaborative environments by learning and applying e-engineering methods and tools in a real-world engineering development scenario. It consists of two phases: an e-engineering basics phase, which addresses the skills required for e-engineering, and an e-engineering application phase, which applies those skills to system development projects in a distributed collaborative environment.
Sustainable water management under future uncertainty with eco-engineering decision scaling
NASA Astrophysics Data System (ADS)
Poff, N. Leroy; Brown, Casey M.; Grantham, Theodore E.; Matthews, John H.; Palmer, Margaret A.; Spence, Caitlin M.; Wilby, Robert L.; Haasnoot, Marjolijn; Mendoza, Guillermo F.; Dominique, Kathleen C.; Baeza, Andres
2016-01-01
Managing freshwater resources sustainably under future climatic and hydrological uncertainty poses novel challenges. Rehabilitation of ageing infrastructure and construction of new dams are widely viewed as solutions to diminish climate risk, but attaining the broad goal of freshwater sustainability will require expansion of the prevailing water resources management paradigm beyond narrow economic criteria to include socially valued ecosystem functions and services. We introduce a new decision framework, eco-engineering decision scaling (EEDS), that explicitly and quantitatively explores trade-offs in stakeholder-defined engineering and ecological performance metrics across a range of possible management actions under unknown future hydrological and climate states. We illustrate its potential application through a hypothetical case study of the Iowa River, USA. EEDS holds promise as a powerful framework for operationalizing freshwater sustainability under future hydrological uncertainty by fostering collaboration across historically conflicting perspectives of water resource engineering and river conservation ecology to design and operate water infrastructure for social and environmental benefits.
Complex Behavior of Contaminant Flux and the Ecology of the Lower Mississippi River
NASA Astrophysics Data System (ADS)
Barton, C. C.; Manheim, F. T.; De Cola, L.; Bollinger, J. E.; Jenkins, J. A.
2001-12-01
This presentation is an overview of a collaborative NSF/USGS/Tulane-funded multi-scale study of the Lower Mississippi River system. The study examines the system in three major dimensional realms: space, time, and complexity (systems and their hierarchies). Researchers at Tulane University and the U.S. Geological Survey have initiated a collaborative effort to study the interacting elements that directly or indirectly affect the water quality, ecology, and physical condition of the Mississippi River. These researchers include experts in the fields of water-quality chemistry, geochemistry, hydrologic modeling, bioengineering, biology, fish ecology, statistics, complexity analysis, epidemiology, and computer science. Underlying this research are large databases that permit quantitative analysis of the system over the past 40 years. Results to date show that the variation in discharge and the contaminant flux, which scale independently, both exhibit fractal scaling, the signature geometry of nonlinear dynamical and complex systems. Public perception is that the Lower Mississippi River is a health hazard, but for the past decade traditional water-quality measurements have shown that contaminants are within current regulatory guidelines for human consumption. This difference between public perception and scientific reality represents a complex scientific and social issue. The connections and feedback between the ecological system and the Mississippi River are few because engineering structures isolate the lower river from its surroundings. Investigation of the connections and feedback between human health and the ecological health of the river and the surrounding region, as well as perceptions of these states of health, holds promise for explaining epidemiological patterns of human disease.
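The fractal-scaling finding mentioned above can be illustrated with a simple fluctuation-scaling estimate: for a self-affine series, the standard deviation of increments over lag m grows as m**H, and the slope of a log-log fit estimates the scaling exponent H. The sketch below is a generic textbook method applied to synthetic data; it is not the study's actual analysis or its discharge records.

```python
import math
import random

def scaling_exponent(x, lags=(1, 2, 4, 8, 16, 32, 64, 128)):
    """Estimate a self-affine scaling exponent H from std(x[t+m]-x[t]) ~ m**H."""
    log_m, log_s = [], []
    for m in lags:
        diffs = [x[i + m] - x[i] for i in range(len(x) - m)]
        mean = sum(diffs) / len(diffs)
        var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
        log_m.append(math.log(m))
        log_s.append(0.5 * math.log(var))   # log of the standard deviation
    # least-squares slope of log(std) against log(lag)
    mx = sum(log_m) / len(log_m)
    my = sum(log_s) / len(log_s)
    num = sum((a - mx) * (b - my) for a, b in zip(log_m, log_s))
    den = sum((a - mx) ** 2 for a in log_m)
    return num / den

# Synthetic "discharge-like" series: an uncorrelated random walk, whose
# theoretical exponent is H = 0.5; persistent (long-memory) records give H > 0.5.
random.seed(0)
walk, level = [], 0.0
for _ in range(20000):
    level += random.gauss(0.0, 1.0)
    walk.append(level)

H = scaling_exponent(walk)   # close to 0.5 for this synthetic series
```

A straight line in this log-log plot, rather than the value of H itself, is what signals fractal (power-law) scaling of the kind the abstract reports.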
NREL-Statoil Collaborate to Make the First Multi-Turbine Floating Offshore
investigated four design load cases: power production, power production plus occurrence of fault, parked -thereby lowering the cost of energy by increasing power production. Senu Sirnivas, a principal engineer at design and analysis, turbine size-up scaling, the mooring system, instrumentation, data acquisition, and
ERIC Educational Resources Information Center
Medina-Dominguez, Fuensanta; Sanchez-Segura, Maria-Isabel; Mora-Soto, Arturo; Amescua, Antonio
2010-01-01
The development of collaborative Web applications does not follow a software engineering methodology. This is because when university students study Web applications in general, and collaborative Web portals in particular, they are not being trained in the use of software engineering techniques to develop collaborative Web portals. This paper…
magHD: a new approach to multi-dimensional data storage, analysis, display and exploitation
NASA Astrophysics Data System (ADS)
Angleraud, Christophe
2014-06-01
The ever-increasing amount of data and processing capability, which follows the well-known Moore's law, is challenging the way scientists and engineers currently exploit large datasets. Scientific visualization tools, although quite powerful, are often too generic and provide abstract views of phenomena, thus preventing cross-discipline fertilization. On the other hand, geographic information systems allow nice, visually appealing maps to be built, but they often become very cluttered as more layers are added. Moreover, the introduction of time as a fourth analysis dimension, which allows the analysis of time-dependent phenomena such as meteorological or climate models, is encouraging real-time data-exploration techniques in which spatial-temporal points of interest are detected as the human brain integrates moving images. Magellium has been involved in high-performance image-processing chains for satellite imagery, as well as scientific signal analysis and geographic information management, since its creation in 2003. We believe that recent work on big data, GPUs, and peer-to-peer collaborative processing can enable a breakthrough in data analysis and display that will serve many new applications in collaborative scientific computing, environment mapping, and understanding. The magHD (for Magellium Hyper-Dimension) project aims at developing software solutions that bring highly interactive tools for complex dataset analysis and exploration to commodity hardware, targeting small- to medium-scale clusters with expansion capabilities to large cloud-based clusters.
Current Approaches to Bone Tissue Engineering: The Interface between Biology and Engineering.
Li, Jiao Jiao; Ebied, Mohamed; Xu, Jen; Zreiqat, Hala
2018-03-01
The successful regeneration of bone tissue to replace areas of bone loss in large defects or at load-bearing sites remains a significant clinical challenge. Over the past few decades, major progress has been made in the field of bone tissue engineering to provide alternative therapies, particularly through approaches at the interface of biology and engineering. To satisfy the diverse regenerative requirements of bone tissue, the field is moving toward highly integrated approaches that incorporate knowledge and techniques from multiple disciplines and typically involve biomaterials as an essential element for supporting or inducing bone regeneration. This review summarizes the types of approaches currently used in bone tissue engineering, beginning with those based primarily on biology or engineering and moving on to integrated approaches in the areas of biomaterial development, biomimetic design, and scalable methods for treating large or load-bearing bone defects, while highlighting potential areas for collaboration and providing an outlook on future developments. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Cen A Radio Optical Gamma Composite
2017-12-08
NASA release April 1, 2010 It takes the addition of radio data (orange) to fully appreciate the scale of Cen A's giant radio-emitting lobes, which stretch more than 1.4 million light-years. Gamma-rays from Fermi's Large Area Telescope (purple) and an image of the galaxy in visible light are also included in this composite. Credit: NASA/DOE/Fermi LAT Collaboration, Capella Observatory, and Ilana Feain, Tim Cornwell, and Ron Ekers (CSIRO/ATNF), R. Morganti (ASTRON), and N. Junkes (MPIfR) To learn more about these images go to: www.nasa.gov/mission_pages/GLAST/news/smokestack-plumes.html NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.
Collaborative Working for Large Digitisation Projects
ERIC Educational Resources Information Center
Yeates, Robin; Guy, Damon
2006-01-01
Purpose: To explore the effectiveness of large-scale consortia for disseminating local heritage via the web. To describe the creation of a large geographically based cultural heritage consortium in the South East of England and management lessons resulting from a major web site digitisation project. To encourage the improved sharing of experience…
Collaborative Early Systems Engineering: Strategic Information Management Review
2010-09-02
Table of contents excerpt: Executive Summary; Center for Systems Engineering (CSE); Collaborative Early Systems Engineering; Development Planning
IFMIF: overview of the validation activities
NASA Astrophysics Data System (ADS)
Knaster, J.; Arbeiter, F.; Cara, P.; Favuzza, P.; Furukawa, T.; Groeschel, F.; Heidinger, R.; Ibarra, A.; Matsumoto, H.; Mosnier, A.; Serizawa, H.; Sugimoto, M.; Suzuki, H.; Wakai, E.
2013-11-01
The Engineering Validation and Engineering Design Activities (EVEDA) for the International Fusion Materials Irradiation Facility (IFMIF), an international collaboration under the Broader Approach Agreement between the Government of Japan and EURATOM, aim at enabling a rapid construction phase of IFMIF in due time, with an understanding of the costs involved. The three main facilities of IFMIF, (1) the Accelerator Facility, (2) the Target Facility and (3) the Test Facility, are the subject of validation activities that include the construction of either full-scale prototypes or smartly devised scaled-down facilities that allow a straightforward extrapolation to IFMIF needs. By July 2013, the engineering design activities of IFMIF had matured with the delivery of an Intermediate IFMIF Engineering Design Report (IIEDR) supported by experimental results. The installation of a Linac of 1.125 MW (125 mA and 9 MeV) of deuterons started in March 2013 in Rokkasho (Japan). The world's largest liquid Li test loop is running in Oarai (Japan) with an ambitious experimental programme for the years ahead. A full-scale high-flux test module that will house ∼1000 small specimens, developed jointly in Europe and Japan for the fusion programme, has been constructed by KIT (Karlsruhe) together with its He gas cooling loop. A full-scale medium-flux test module to carry out on-line creep measurement has been validated by CRPP (Villigen).
Engineering management of large scale systems
NASA Technical Reports Server (NTRS)
Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.
1989-01-01
The organization of high-technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with systems theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity through a systematic and organized approach; efficiency and cost effectiveness are thus the driving forces behind the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making) are not emphasized but are considered.
Hao, Shijie; Cui, Lishan; Wang, Hua; ...
2016-02-10
Crystals held at ultrahigh elastic strains and stresses may exhibit exceptional physical and chemical properties. Individual metallic nanowires can sustain ultra-large elastic strains of 4-7%. However, retaining elastic strains of such magnitude in kilogram-scale nanowires is challenging. Here, we find that under active load, ~5.6% elastic strain can be achieved in Nb nanowires in a composite material. Moreover, large tensile (2.8%) and compressive (-2.4%) elastic strains can be retained in kilogram-scale Nb nanowires when the composite is unloaded to a free-standing condition. It is then demonstrated that the retained tensile elastic strains of Nb nanowires significantly increase their superconducting transition temperature and critical magnetic fields, corroborating ab initio calculations based on BCS theory. This free-standing nanocomposite design paradigm opens new avenues for retaining ultra-large elastic strains in great quantities of nanowires and for elastic-strain engineering at the industrial scale.
Designs for Operationalizing Collaborative Problem Solving for Automated Assessment
ERIC Educational Resources Information Center
Scoular, Claire; Care, Esther; Hesse, Friedrich W.
2017-01-01
Collaborative problem solving is a complex skill set that draws on social and cognitive factors. The construct remains in its infancy due to lack of empirical evidence that can be drawn upon for validation. The differences and similarities between two large-scale initiatives that reflect this state of the art, in terms of underlying assumptions…
Development of an Internet Collaborative Learning Behavior Scale--Preliminary Results.
ERIC Educational Resources Information Center
Hsu, Ti; Wang, Hsiu Fei
It is well known that math phobia is a common problem among young school children. It becomes a challenge to educational practitioners and academic researchers to figure out ways to overcome the problem. Collaborative team learning has been proposed as one of the alternatives. This study was part of a large and ongoing research project designed to…
WikiTextbooks: Designing Your Course around a Collaborative Writing Project
ERIC Educational Resources Information Center
Katz, Brian P.; Thoren, Elizabeth
2014-01-01
We have used wiki technology to support large-scale, collaborative writing projects in which the students build reference texts (called WikiTextbooks). The goal of this paper is to prepare readers to adapt this idea for their own courses. We give examples of the implementation of WikiTextbooks in a variety of courses, including lecture and…
NASA Astrophysics Data System (ADS)
Weltzin, J. F.; Scully, R. A.; Bayer, J.
2016-12-01
Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues, and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiencies in monitoring design, implementation, analysis, and data sharing. We anticipate that these monitoring programs will have similar requirements (e.g., survey design, standardization of protocols and methods, information management and delivery) that could be met by enterprise tools to promote sustainability, efficiency, and interoperability of information across geopolitical boundaries and organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an online suite of enterprise tools focused on aquatic systems in the Pacific Northwest region of the United States. We will leverage and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for developing enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.
An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.
2012-11-01
Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.
Value-focused framework for defining landscape-scale conservation targets
Romañach, Stephanie; Benscoter, Allison M.; Brandt, Laura A.
2016-01-01
Conservation of natural resources can be challenging in a rapidly changing world and require collaborative efforts for success. Conservation planning is the process of deciding how to protect, conserve, and enhance or minimize loss of natural and cultural resources. Establishing conservation targets (also called indicators or endpoints), the measurable expressions of desired resource conditions, can help with site-specific up to landscape-scale conservation planning. Using conservation targets and tracking them through time can deliver benefits such as insight into ecosystem health and providing early warnings about undesirable trends. We describe an approach using value-focused thinking to develop statewide conservation targets for Florida. Using such an approach allowed us to first identify stakeholder objectives and then define conservation targets to meet those objectives. Stakeholders were able to see how their shared efforts fit into the broader conservation context, and also anticipate the benefits of multi-agency and -organization collaboration. We developed an iterative process for large-scale conservation planning that included defining a shared framework for the process, defining the conservation targets themselves, as well as developing management and monitoring strategies for evaluation of their effectiveness. The process we describe is applicable to other geographies where multiple parties are seeking to implement collaborative, large-scale biological planning.
Space Weather Research at the National Science Foundation
NASA Astrophysics Data System (ADS)
Moretto, T.
2015-12-01
There is growing recognition that the space environment can have substantial, deleterious impacts on society. Consequently, research enabling specification and forecasting of hazardous space effects has become of great importance and urgency. This research requires studying the entire Sun-Earth system to understand the coupling of regions all the way from the source of disturbances in the solar atmosphere to the Earth's upper atmosphere. The traditional, region-based structure of research programs in Solar and Space physics is ill suited to fully support the change in research directions that the problem of space weather dictates. On the observational side, dense, distributed networks of observations are required to capture the full large-scale dynamics of the space environment. However, the cost of implementing these is typically prohibitive, especially for measurements in space. Thus, by necessity, the implementation of such new capabilities needs to build on creative and unconventional solutions. A particularly powerful idea is the utilization of new developments in data engineering and informatics research (big data). These new technologies make it possible to build systems that can collect and process huge amounts of noisy and inaccurate data and extract from them useful information. The shift in emphasis towards system-level science for geospace also necessitates the development of large-scale and multi-scale models. The development of large-scale models capable of capturing the global dynamics of the Earth's space environment requires investment in research team efforts that go beyond what can typically be funded under the traditional grants programs. This calls for effective interdisciplinary collaboration and efficient leveraging of resources both nationally and internationally. This presentation will provide an overview of current and planned initiatives, programs, and activities at the National Science Foundation pertaining to space weather research.
Analysis and Testing of a Composite Fuselage Shield for Open Rotor Engine Blade-Out Protection
NASA Technical Reports Server (NTRS)
Pereira, J. Michael; Emmerling, William; Seng, Silvia; Frankenberger, Charles; Ruggeri, Charles R.; Revilock, Duane M.; Carney, Kelly S.
2016-01-01
The Federal Aviation Administration is working with the European Aviation Safety Agency to determine the certification base for proposed new engines that would not have a containment structure on large commercial aircraft. Equivalent safety to the current fleet is desired by the regulators, which means that the loss of a single fan blade must not pose a hazard to the aircraft. The NASA Glenn Research Center and the Naval Air Warfare Center (NAWC), China Lake, collaborated with the FAA Aircraft Catastrophic Failure Prevention Program to design and test lightweight composite shields for protection of the aircraft passengers and critical systems from a released blade that could impact the fuselage. LS-DYNA® was used to predict the thickness of the composite shield required to prevent blade penetration. In the test, two composite blades were pyrotechnically released from a running engine, each impacting a composite shield with a different thickness. The thinner shield was penetrated by the blade and the thicker shield prevented penetration. This was consistent with pre-test LS-DYNA predictions. This paper documents the analysis conducted to predict the required thickness of a composite shield and the live-fire test on the full-scale rig at NAWC China Lake, and describes the damage to the shields as well as the instrumentation results.
Application of a Novel Collaboration Engineering Method for Learning Design: A Case Study
ERIC Educational Resources Information Center
Cheng, Xusen; Li, Yuanyuan; Sun, Jianshan; Huang, Jianqing
2016-01-01
Collaborative case studies and computer-supported collaborative learning (CSCL) play an important role in the modern education environment. A number of researchers have given significant attention to learning design in order to improve the satisfaction of collaborative learning. Although collaboration engineering (CE) is a mature method widely…
Interweaving Objects, Gestures, and Talk in Context
ERIC Educational Resources Information Center
Brassac, Christian; Fixmer, Pierre; Mondada, Lorenza; Vinck, Dominique
2008-01-01
In a large French hospital, a group of professional experts (including physicians and software engineers) are working on the computerization of a blood-transfusion traceability device. By focusing on a particular moment in this slow process of design, we analyze their collaborative practices during a work session. The analysis takes a…
Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems
NASA Technical Reports Server (NTRS)
Koch, Patrick N.
1997-01-01
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation, these issues are addressed through the development of a method for hierarchical robust preliminary design exploration that facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis; statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and noise modeling techniques for implementing robust preliminary design when approximate models are employed. Hierarchical partitioning and modeling techniques including intermediate responses, linking variables, and compatibility constraints are incorporated within a hierarchical compromise decision support problem formulation for synthesizing subproblem solutions for a partitioned system. Experimentation and approximation techniques are employed for concurrent investigations and modeling of partitioned subproblems.
A modified composite experiment is introduced for fitting better predictive models across the ranges of the factors, and an approach for constructing partitioned response surfaces is developed to reduce the computational expense of experimentation for fitting models in a large number of factors. Noise modeling techniques are compared and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. This method and the associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system. The case study is developed in collaboration with Allison Engine Company, Rolls Royce Aerospace, and is based on the existing Allison AE3007 engine designed for midsize commercial, regional business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified. The method and solutions identified are verified by comparison with the AE3007 engine. The solutions obtained are similar to the AE3007 cycle and configuration, but are better with respect to many of the requirements.
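The response-surface idea in the abstract above — sample an expensive subsystem analysis at a few design points, fit a cheap polynomial surrogate, and explore the design space on the surrogate — can be illustrated with a minimal sketch. The stand-in analysis function, the sample points, and a quadratic surrogate form are illustrative assumptions, not the dissertation's actual models.

```python
def expensive_analysis(x):
    """Stand-in for a costly subsystem simulation (illustrative only)."""
    return 2.0 + 3.0 * x - 0.5 * x * x

def fit_quadratic(xs, ys):
    """Least-squares fit of y = c0 + c1*x + c2*x^2 via the normal equations."""
    # Moment sums S_k = sum(x^k) and right-hand sides T_k = sum(y * x^k).
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    b = T[:]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    n = 3
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (b[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, n))) / A[r][r]
    return coeffs

# Run the "expensive" model at a handful of design points...
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [expensive_analysis(x) for x in xs]
c0, c1, c2 = fit_quadratic(xs, ys)

# ...then query the cheap surrogate anywhere in the design range.
surrogate = lambda x: c0 + c1 * x + c2 * x * x
```

Once fitted, the surrogate can be evaluated thousands of times during design exploration at negligible cost, which is the motivation for the experimentation and approximation techniques the dissertation combines.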
A modular approach to creating large engineered cartilage surfaces.
Ford, Audrey C; Chui, Wan Fung; Zeng, Anne Y; Nandy, Aditya; Liebenberg, Ellen; Carraro, Carlo; Kazakia, Galateia; Alliston, Tamara; O'Connell, Grace D
2018-01-23
Native articular cartilage has limited capacity to repair itself from focal defects or osteoarthritis. Tissue engineering has provided a promising biological treatment strategy that is currently being evaluated in clinical trials. However, translating these techniques to large engineered tissues remains a significant challenge. In this study, we present a method for developing large-scale engineered cartilage surfaces through modular fabrication. Modular Engineered Tissue Surfaces (METS) uses the well-known but largely under-utilized self-adhesion properties of de novo tissue to create large scaffolds with nutrient channels. Compressive mechanical properties were evaluated throughout METS specimens, and the tensile mechanical strength of the bonds between attached constructs was evaluated over time. Raman spectroscopy, biochemical assays, and histology were performed to investigate matrix distribution. Results showed that by Day 14, stable connections had formed between the constructs in the METS samples. By Day 21, bonds were robust enough to form a rigid sheet and continued to increase in size and strength over time. Compressive mechanical properties and glycosaminoglycan (GAG) content of METS and individual constructs increased significantly over time. The METS technique builds on established tissue engineering accomplishments of developing constructs with GAG composition and compressive properties approaching native cartilage. This study demonstrated that modular fabrication is a viable technique for creating large-scale engineered cartilage, which can be broadly applied to many tissue engineering applications and construct geometries. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Koszalka, Tiffany A.; Wu, Yiyan
2010-01-01
Changes in engineering practices have spawned changes in engineering education and prompted the use of distributed learning environments. A distributed collaborative engineering design (CED) course was designed to engage engineering students in learning about and solving engineering design problems. The CED incorporated an advanced interactive…
FEASIBILITY OF LARGE-SCALE OCEAN CO2 SEQUESTRATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. Peter Brewer; Dr. James Barry
2002-09-30
We have continued to carry out creative small-scale experiments in the deep ocean to investigate the science underlying questions of possible future large-scale deep-ocean CO{sub 2} sequestration as a means of ameliorating greenhouse gas growth rates in the atmosphere. This project is closely linked to additional research funded by the DoE Office of Science, and to support from the Monterey Bay Aquarium Research Institute. The listing of project achievements here over the past year reflects these combined resources. Within the last project year we have: (1) Published a significant workshop report (58 pages) entitled ''Direct Ocean Sequestration Expert's Workshop'', based upon a meeting held at MBARI in 2001. The report is available both in hard copy, and on the NETL web site. (2) Carried out three major, deep ocean, (3600m) cruises to examine the physical chemistry, and biological consequences, of several liter quantities released on the ocean floor. (3) Carried out two successful short cruises in collaboration with Dr. Izuo Aya and colleagues (NMRI, Osaka, Japan) to examine the fate of cold (-55 C) CO{sub 2} released at relatively shallow ocean depth. (4) Carried out two short cruises in collaboration with Dr. Costas Tsouris, ORNL, to field test an injection nozzle designed to transform liquid CO{sub 2} into a hydrate slurry at {approx}1000m depth. (5) In collaboration with Prof. Jill Pasteris (Washington University) we have successfully accomplished the first field test of a deep ocean laser Raman spectrometer for probing in situ the physical chemistry of the CO{sub 2} system. (6) Submitted the first major paper on biological impacts as determined from our field studies. (7) Submitted a paper on our measurements of the fate of a rising stream of liquid CO{sub 2} droplets to Environmental Science & Technology. (8) Have had accepted for publication in Eos the first brief account of the laser Raman spectrometer success.
(9) Have had two papers submitted for the Greenhouse Gas Technology--6 Conference (Kyoto) accepted. (10) Been nominated by the U.S. Dept. of State to attend the Nov. 2002 IPCC Workshop on Carbon Capture and Storage. (11) Given presentations at national meetings, including the AGU Ocean Sciences Meeting, the American Chemical Society, the Minerals, Materials, and Metals Society, the National Academy of Engineering, and given numerous invited lectures.
High Fidelity Simulations of Large-Scale Wireless Networks (Plus-Up)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onunkwo, Uzoma
Sandia has built a strong reputation in scalable network simulation and emulation for cyber security studies to protect our nation's critical information infrastructures. Georgia Tech has a preeminent reputation in academia for excellence in scalable discrete event simulations, with a strong emphasis on simulating cyber networks. Many of the experts in this field, such as Dr. Richard Fujimoto, Dr. George Riley, and Dr. Chris Carothers, have strong affiliations with Georgia Tech. The collaborative relationship that we intend to immediately pursue is in high fidelity simulations of practical large-scale wireless networks using the ns-3 simulator via Dr. George Riley. This project will have mutual benefits in bolstering both institutions' expertise and reputation in the field of scalable simulation for cyber-security studies. This project promises to address high fidelity simulations of large-scale wireless networks. This proposed collaboration is directly in line with Georgia Tech's goals for developing and expanding the Communications Systems Center, the Georgia Tech Broadband Institute, and the Georgia Tech Information Security Center, along with its yearly Emerging Cyber Threats Report. At Sandia, this work benefits the defense systems and assessment area with promise for large-scale assessment of cyber security needs and vulnerabilities of our nation's critical cyber infrastructures exposed to wireless communications.
Lifelong Learning: The Value of an Industrial Internship for a Graduate Student Education
ERIC Educational Resources Information Center
Honda, Gregory S.; Pazmino, Jorge H.; Hickman, Daniel A.; Varma, Arvind
2015-01-01
A chemical engineering PhD student from Purdue University completed an internship at The Dow Chemical Company, evaluating the effect of scale on the hydrodynamics of a trickle bed reactor. A unique aspect of this work was that it arose from an ongoing collaboration, so that the project was within the scope of the graduate student's thesis. This…
Implementation and Performance Issues in Collaborative Optimization
NASA Technical Reports Server (NTRS)
Braun, Robert; Gage, Peter; Kroo, Ilan; Sobieski, Ian
1996-01-01
Collaborative optimization is a multidisciplinary design architecture that is well-suited to large-scale multidisciplinary optimization problems. This paper compares this approach with other architectures, examines the details of the formulation, and evaluates some aspects of its performance. A particular version of the architecture is proposed to better accommodate the occurrence of multiple feasible regions. The use of system-level inequality constraints is shown to increase the convergence rate. A series of simple test problems, demonstrated to challenge related optimization architectures, is successfully solved with collaborative optimization.
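The coordination idea behind collaborative optimization can be sketched in miniature: a system level posts a target for a shared design variable, each discipline solves its own subproblem penalized for deviating from that target, and the system level updates the target from the disciplines' responses. This is a hedged toy illustration, not the paper's exact formulation; the quadratic objectives, penalty weight `w`, and mean-based update rule are assumptions chosen so every step has a closed form.

```python
def subproblem(local_pref, target, w):
    """Discipline i minimizes (x - local_pref)^2 + w*(x - target)^2.
    This quadratic has the closed-form minimizer returned below."""
    return (local_pref + w * target) / (1.0 + w)

def coordinate(prefs, w=1.0, iters=50):
    """System level: repeatedly set the shared target to the mean of the
    disciplines' responses; the iteration is a contraction for any w > 0."""
    target = 0.0
    responses = list(prefs)
    for _ in range(iters):
        responses = [subproblem(p, target, w) for p in prefs]
        target = sum(responses) / len(responses)
    return target, responses

# Three disciplines with conflicting preferred values for one shared variable.
target, responses = coordinate([1.0, 2.0, 6.0])
```

With this update rule the target converges to the compromise value (here the mean of the local preferences), and a larger penalty weight `w` forces the disciplinary responses to track the target more tightly, mirroring how the system level in collaborative optimization drives interdisciplinary compatibility.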
Ice Accretion Measurements on an Airfoil and Wedge in Mixed-Phase Conditions
NASA Technical Reports Server (NTRS)
Struk, Peter; Bartkus, Tadas; Tsao, Jen-Ching; Currie, Tom; Fuleki, Dan
2015-01-01
This paper describes ice accretion measurements from experiments conducted at the National Research Council (NRC) of Canada's Research Altitude Test Facility during 2012. Due to numerous engine power loss events associated with high altitude convective weather, potential ice accretion within an engine due to ice crystal ingestion is being investigated collaboratively by NASA and NRC. These investigations examine the physical mechanisms of ice accretion on surfaces exposed to ice crystal and mixed phase conditions, similar to those believed to exist in core compressor regions of jet engines. A further objective of these tests is to examine scaling effects since altitude appears to play a key role in this icing process.
Jorm, Christine; Roberts, Chris; Lim, Renee; Roper, Josephine; Skinner, Clare; Robertson, Jeremy; Gentilcore, Stacey; Osomanski, Adam
2016-03-08
There is little research on large-scale complex health care simulations designed to facilitate student learning of non-technical skills in a team-working environment. We evaluated the acceptability and effectiveness of a novel natural disaster simulation that enabled medical students to demonstrate their achievement of the non-technical skills of collaboration, negotiation and communication. In a mixed methods approach, survey data were available from 117 students and a thematic analysis was undertaken of both student qualitative comments and tutor observer participation data. Ninety-three percent of students found the activity engaging for their learning. Three themes emerged from the qualitative data: the impact of fidelity on student learning, reflexivity on the importance of non-technical skills in clinical care, and opportunities for collaborative teamwork. Physical fidelity was sufficient for good levels of student engagement, as was sociological fidelity. We demonstrated the effectiveness of the simulation in allowing students to reflect upon and evidence their acquisition of skills in collaboration, negotiation and communication, as well as situational awareness and attending to their emotions. Students readily identified emerging learning opportunities through critical reflection. The scenarios challenged students to work together collaboratively to solve clinical problems, using a range of resources including interacting with clinical experts. A large class teaching activity, framed as a simulation of a natural disaster, is an acceptable and effective activity for medical students to develop the non-technical skills of collaboration, negotiation and communication, which are essential to team working. The design could be of value in medical schools in disaster prone areas, including within low resource countries, and as a feasible intervention for learning the non-technical skills that are needed for patient safety.
NASA Technical Reports Server (NTRS)
Scholl, R. E. (Editor)
1979-01-01
Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.
A state-based national network for effective wildlife conservation
Meretsky, Vicky J.; Maguire, Lynn A.; Davis, Frank W.; Stoms, David M.; Scott, J. Michael; Figg, Dennis; Goble, Dale D.; Griffith, Brad; Henke, Scott E.; Vaughn, Jacqueline; Yaffee, Steven L.
2012-01-01
State wildlife conservation programs provide a strong foundation for biodiversity conservation in the United States, building on state wildlife action plans. However, states may miss the species that are at the most risk at rangewide scales, and threats such as novel diseases and climate change increasingly act at regional and national levels. Regional collaborations among states and their partners have had impressive successes, and several federal programs now incorporate state priorities. However, regional collaborations are uneven across the country, and no national counterpart exists to support efforts at that scale. A national conservation-support program could fill this gap and could work across the conservation community to identify large-scale conservation needs and support efforts to meet them. By providing important information-sharing and capacity-building services, such a program would advance collaborative conservation among the states and their partners, thus increasing both the effectiveness and the efficiency of conservation in the United States.
NASA Astrophysics Data System (ADS)
Justham, T.; Jarvis, S.; Clarke, A.; Garner, C. P.; Hargrave, G. K.; Halliwell, N. A.
2006-07-01
Simultaneous intake and in-cylinder digital particle image velocimetry (DPIV) experimental data are presented for a motored spark ignition (SI) optical internal combustion (IC) engine. Two individual DPIV systems were employed to study the inter-relationship between the intake and in-cylinder flow fields at an engine speed of 1500 rpm. Results for the intake runner velocity field at the time of maximum intake valve lift are compared to in-cylinder velocity fields later in the same engine cycle. Relationships between flow structures within the runner and cylinder were seen to be strong during the intake stroke but less significant during compression. Cyclic variations within the intake runner were seen to affect the large-scale bulk flow motion. The subsequent decay of the large-scale motions into smaller-scale turbulent structures during the compression stroke appears to reduce the relationship with the intake flow variations.
Castillo-Cagigal, Manuel; Matallanas, Eduardo; Gutiérrez, Alvaro; Monasterio-Huelin, Félix; Caamaño-Martín, Estefaná; Masa-Bote, Daniel; Jiménez-Leube, Javier
2011-01-01
In this paper we present a heterogeneous collaborative sensor network for electrical management in the residential sector. Improving demand-side management is very important in distributed energy generation applications. Sensing and control are the foundations of the "Smart Grid" which is the future of large-scale energy management. The system presented in this paper has been developed on a self-sufficient solar house called "MagicBox" equipped with grid connection, PV generation, lead-acid batteries, controllable appliances and smart metering. Therefore, there is a large number of energy variables to be monitored that allow us to precisely manage the energy performance of the house by means of collaborative sensors. The experimental results, performed on a real house, demonstrate the feasibility of the proposed collaborative system to reduce the consumption of electrical power and to increase energy efficiency.
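The demand-side management idea the abstract describes — using monitored energy variables to decide when loads should run — can be sketched with a toy scheduler that shifts a deferrable appliance into the hours with the largest forecast PV generation. The hourly forecast values, the load duration, and the greedy selection rule below are illustrative assumptions, not the actual control logic of the "MagicBox" house.

```python
def schedule_deferrable_load(pv_forecast, hours_needed):
    """Pick the hours with the largest forecast PV output for a load that
    must run a total of `hours_needed` (not necessarily contiguous) hours."""
    ranked = sorted(range(len(pv_forecast)),
                    key=lambda h: pv_forecast[h], reverse=True)
    return sorted(ranked[:hours_needed])

# Illustrative 24-hour PV generation forecast in kW (zero at night).
pv = [0, 0, 0, 0, 0, 0, 0.2, 0.8, 1.5, 2.4, 3.1, 3.5,
      3.6, 3.4, 2.9, 2.1, 1.2, 0.5, 0.1, 0, 0, 0, 0, 0]

# Schedule a hypothetical 3-hour appliance cycle (e.g., a washing machine).
run_hours = schedule_deferrable_load(pv, 3)
```

Even this simple rule captures the core benefit measured in the paper: moving consumption into hours of local generation reduces the electrical power drawn from the grid.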
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.
2012-01-01
The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in the research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improved system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.
Harrington, Brian A.; Brown, S.; Corven, James; Bart, Jonathan
2002-01-01
Shorebirds are among the most highly migratory creatures on earth. Both the study of their ecology and ongoing efforts to conserve their populations must reflect this central aspect of their biology. Many species of shorebirds use migration and staging sites scattered throughout the hemisphere to complete their annual migrations between breeding areas and nonbreeding habitats (Morrison 1984). The vast distances between habitats they use pose significant challenges for studying their migration ecology. At the same time, the large number of political boundaries shorebirds cross during their epic migrations create parallel challenges for organizations working on their management and conservation.Nebel et al. (2002) represent a collaborative effort to understand the conservation implications of Western Sandpiper (Calidris mauri) migration ecology on a scale worthy of this highly migratory species. The data sets involved in the analysis come from four U.S. states, two Canadian provinces, and a total of five nations. Only by collaborating on this historic scale were the authors able to assemble the information necessary to understand important aspects of the migration ecology of this species, and the implications for conservation of the patterns they discovered.Collaborative approaches to shorebird migration ecology developed slowly over several decades. The same period also saw the creation of large-scale efforts to monitor and conserve shorebirds. This overview first traces the history of the study of migration ecology of shorebirds during that fertile period, and then describes the monitoring and protection efforts that have been developed in an attempt to address the enormous issues of scale posed by shorebird migration ecology and conservation.
CLEANER-Hydrologic Observatory Joint Science Plan
NASA Astrophysics Data System (ADS)
Welty, C.; Dressler, K.; Hooper, R.
2005-12-01
The CLEANER-Hydrologic Observatory* initiative is a distributed network for research on complex environmental systems that focuses on the intersecting water-related issues of both the CUAHSI and CLEANER communities. It emphasizes research on the nation's water resources related to human-dominated natural and built environments. The network will be composed of: interacting field sites with an integrated cyberinfrastructure; a centralized technical resource staff and management infrastructure to support interdisciplinary research through data collection from advanced sensor systems, data mining and aggregation from multiple sources and databases; and cyber-tools for analysis, visualization, and predictive multi-scale modeling that is dynamically driven. As such, the network will transform 21st century workforce development in the water-related intersection of environmental science and engineering, as well as enable substantial educational and engagement opportunities for all age levels. The scientific goal and strategic intent of the CLEANER-Hydrologic Observatory Network is to transform our understanding of the earth's water cycle and associated biogeochemical cycles across spatial and temporal scales, enabling quantitative forecasts of critical water-related processes, especially those that affect and are affected by human activities. This strategy will develop scientific and engineering tools that will enable more effective adaptive approaches for resource management. The need for the network is based on three critical deficiencies in current abilities to understand large-scale environmental processes and thereby develop more effective management strategies. First, we lack basic data and the infrastructure to collect them at the needed resolution. Second, we lack the means to integrate data across scales from different media (paper records, electronic worksheets, web-based) and sources (observations, experiments, simulations).
Third, we lack sufficiently accurate modeling and decision-support tools to predict the underlying processes or subsequently forecast the effects of different management strategies. Water is a critical driver for the functioning of all ecosystems and the development of human society, and it is a key ingredient for the success of industry, agriculture, and the national economy. CLEANER-Hydrologic Observatories will foster cutting-edge science and engineering research that addresses major national needs (public and governmental) related to water, including, for example: (i) water resource problems, such as impaired surface waters, contaminated ground water, water availability for human use and ecosystem needs, floods and floodplain management, urban storm water, agricultural runoff, and coastal hypoxia; (ii) understanding environmental impacts on public health; (iii) achieving a balance of economic and environmental sustainability; (iv) reversing environmental degradation; and (v) protecting against chemical and biological threats. CLEANER (Collaborative Large-scale Engineering Analysis Network for Environmental Research) is an ENG initiative; the Hydrologic Observatory Network is a GEO initiative through CUAHSI (Consortium of Universities for the Advancement of Hydrologic Science, Inc.). The two initiatives were merged into a joint, bi-directorate program in December 2004.
Placeless Organizations: Collaborating for Transformation
ERIC Educational Resources Information Center
Nardi, Bonnie A.
2007-01-01
This article defines and discusses placeless organizations as sites and generators of learning on a large scale. The emphasis is on how placeless organizations structure themselves to carry out social transformation--necessarily involving intensive learning--on a national or global scale. The argument is made that place is not a necessary…
Distributed Collaborative Homework Activities in a Problem-Based Usability Engineering Course
ERIC Educational Resources Information Center
Carroll, John M.; Jiang, Hao; Borge, Marcela
2015-01-01
Teams of students in an upper-division undergraduate Usability Engineering course used a collaborative environment to carry out a series of three distributed collaborative homework assignments. Assignments were case-based analyses structured using a jigsaw design; students were provided a collaborative software environment and introduced to a…
Promoting Collaborative Problem-Solving Skills in a Course on Engineering Grand Challenges
ERIC Educational Resources Information Center
Zou, Tracy X. P.; Mickleborough, Neil C.
2015-01-01
The ability to solve problems with people of diverse backgrounds is essential for engineering graduates. A course on engineering grand challenges was designed to promote collaborative problem-solving (CPS) skills. One unique component is that students need to work both within their own team and collaborate with the other team to tackle engineering…
ERIC Educational Resources Information Center
Gimenez, J.; Thondhlana, J.
2012-01-01
In engineering, like in many other disciplines, collaborative writing (CW) has been identified as a central practice in both the academy and industry. A number of studies have shown that both students and professionals in this field write most discipline-specific genres collaboratively. Despite its centrality, CW in engineering is still an…
Evolving the NCSA CyberCollaboratory for Distributed Environmental Observatory Networks
NASA Astrophysics Data System (ADS)
Myers, J.; Liu, Y.; Minsker, B.; Futrelle, J.; Downey, S.; Kim, I.; Rantanen, E.
2007-12-01
Since 2004, NCSA's Cybercollaboratory, which is built on top of the open source Liferay portal framework, has been evolving as part of NCSA's efforts to build national cyberinfrastructure to support collaborative research in environmental engineering and hydrological sciences and to allow users to efficiently share content (sensors, data, models, documents, etc.) in a context-sensitive way (e.g., providing different tools/data based on group affiliation and geospatial contexts). During this period, we provided the CyberCollaboratory to users in the CLEANER (Collaborative Large-scale Engineering Analysis Network for Environmental Research, now WATer and Environmental Research Systems (WATERS) network) Project Office and several CLEANER/WATERS testbed projects. Preliminary statistics show that one in four of the more than 400 registered users contributed content, with many others reading or accessing that content (messages, documents, wikis). During the course of this use, and in evaluations by others, including representatives from the CUAHSI (Consortium of Universities for the Advancement of Hydrologic Science) community, we have received significant feedback on issues of usability and suitability for the various communities involved in environmental observatories. Much of this feedback applies to collaborative portals in general, and some reflects a comparison of portals with newer Web 2.0-style social-networking sites. For example, users working in multiple groups found it difficult to get an overview of all of their activities and found differences in group layouts confusing. Users also found the standard account creation and group management processes cumbersome compared to inviting people to be friends on social sites, and they wanted a better sense of presence and social networks within the portal. The fragmentation of group documents among local stores, the portal document repository, and email, along with issues of "lost updates", was another significant concern.
This poster reviews the usability feedback, identifies key issues that hinder traditional portal-based collaboration environments, and presents design changes made to the Cybercollaboratory to address them. Feedback on the effectiveness of the new design from hydrologists and environmental researchers and preliminary results from a formal usability study will also be presented.
Automatic Tools for Enhancing the Collaborative Experience in Large Projects
NASA Astrophysics Data System (ADS)
Bourilkov, D.; Rodriquez, J. L.
2014-06-01
With the explosion of big data in many fields, the efficient management of knowledge about all aspects of data analysis gains in importance. A key feature of collaboration in large-scale projects is keeping a log of what is being done and how - for private use and reuse, and for sharing selected parts with collaborators and peers, who are often distributed geographically on an increasingly global scale. Better still, the log can be created automatically on the fly while the scientist or software developer works in a habitual way, without the need for extra effort. This saves time and enables a team to do more with the same resources. The CODESH (COllaborative DEvelopment SHell) and CAVES (Collaborative Analysis Versioning Environment System) projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach to any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for the analysis of petascale volumes of data, as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.
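The automatic-logbook idea can be sketched in a few lines (a toy illustration only, not the actual CODESH implementation; the class and file names are invented): a thin wrapper runs each command normally and, as a side effect, appends the command and its captured output to a persistent log.

```python
import datetime
import os
import subprocess
import tempfile

class LoggedShell:
    """Toy automatic session logbook (invented names, not the real
    CODESH code): each command and its captured output are appended
    to a persistent log file as a side effect of running it."""

    def __init__(self, logfile):
        self.logfile = logfile

    def run(self, cmd):
        # Execute the command, capture its output, and log both.
        result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        stamp = datetime.datetime.now().isoformat(timespec="seconds")
        with open(self.logfile, "a") as log:
            log.write(f"{stamp} $ {cmd}\n{result.stdout}")
        return result.stdout

fd, logpath = tempfile.mkstemp(suffix=".log")
os.close(fd)
sh = LoggedShell(logpath)
sh.run("echo hello from the logbook")
with open(logpath) as log:
    print(log.read(), end="")
```

A real system would also version the log and record the environment (working directory, shell variables) so a session can be replayed, which is the role the virtual states and transitions play in CODESH.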
Leveraging premalignant biology for immune-based cancer prevention.
Spira, Avrum; Disis, Mary L; Schiller, John T; Vilar, Eduardo; Rebbeck, Timothy R; Bejar, Rafael; Ideker, Trey; Arts, Janine; Yurgelun, Matthew B; Mesirov, Jill P; Rao, Anjana; Garber, Judy; Jaffee, Elizabeth M; Lippman, Scott M
2016-09-27
Prevention is an essential component of cancer eradication. Next-generation sequencing of cancer genomes and epigenomes has defined large numbers of driver mutations and molecular subgroups, leading to therapeutic advances. By comparison, there is a relative paucity of such knowledge in premalignant neoplasia, which inherently limits the potential to develop precision prevention strategies. Studies on the interplay between germ-line and somatic events have elucidated genetic processes underlying premalignant progression and preventive targets. Emerging data hint at the immune system's ability to intercept premalignancy and prevent cancer. Genetically engineered mouse models have identified mechanisms by which genetic drivers and other somatic alterations recruit inflammatory cells and induce changes in normal cells to create and interact with the premalignant tumor microenvironment to promote oncogenesis and immune evasion. These studies are currently limited to only a few lesion types and patients. In this Perspective, we advocate a large-scale collaborative effort to systematically map the biology of premalignancy and the surrounding cellular response. By bringing together scientists from diverse disciplines (e.g., biochemistry, omics, and computational biology; microbiology, immunology, and medical genetics; engineering, imaging, and synthetic chemistry; and implementation science), we can drive a concerted effort focused on cancer vaccines to reprogram the immune response to prevent, detect, and reject premalignancy. Lynch syndrome, clonal hematopoiesis, and cervical intraepithelial neoplasia, which also serve as models for inherited syndromes, blood, and viral premalignancies, are ideal scenarios in which to launch this initiative.
ERIC Educational Resources Information Center
Anderson, Greg; And Others
1996-01-01
Describes the Computer Science Technical Report Project, one of the earliest investigations into the system engineering of digital libraries which pioneered multiinstitutional collaborative research into technical, social, and legal issues related to the development and implementation of a large, heterogeneous, distributed digital library. (LRW)
Palinkas, Lawrence A; Fuentes, Dahlia; Finno, Megan; Garcia, Antonio R; Holloway, Ian W; Chamberlain, Patricia
2014-01-01
This study examined the role of inter-organizational collaboration in implementing new evidence-based practices for addressing problem behaviors in at-risk youth. Semi-structured interviews were conducted with 38 systems leaders of probation, mental health, and child welfare departments of 12 California counties participating in a large randomized controlled trial to scale-up the use of Multidimensional Treatment Foster Care. Three sets of collaboration characteristics were identified: (1) characteristics of collaboration process, (2) characteristics of the external environment, and (3) characteristics of participating organizations and individuals. Inter-organizational collaboration enables an exchange of information and advice and a pooling of resources individual agencies may require for successful implementation.
Big data analytics as a service infrastructure: challenges, desired properties and solutions
NASA Astrophysics Data System (ADS)
Martín-Márquez, Manuel
2015-12-01
CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated by control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has limited the necessary collaboration and, more relevantly, cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient day-to-day accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.
NASA Astrophysics Data System (ADS)
Flamand, Olivier
2017-12-01
Wind engineering problems are commonly studied by wind tunnel experiments at a reduced scale. This introduces several limitations and calls for careful planning of the tests and interpretation of the experimental results. The talk first revisits the similitude laws and discusses how they are actually applied in wind engineering. It will also remind readers why different scaling laws govern different wind engineering problems. Secondly, the paper focuses on the ways to simplify a detailed structure (bridge, building, platform) when fabricating the downscaled models for the tests. This will be illustrated by several examples from recent engineering projects. Finally, under the most severe weather conditions, manmade structures and equipment should remain operational. What “recreating the climate” means and aims to achieve will be illustrated through common practice in climatic wind tunnel modelling.
eScience for molecular-scale simulations and the eMinerals project.
Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H
2009-03-13
We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include the integration of compute and data systems, the development of collaborative frameworks, and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting and metadata management. A number of diverse science applications will illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data and collaboration is essential.
Castillo-Cagigal, Manuel; Matallanas, Eduardo; Gutiérrez, Álvaro; Monasterio-Huelin, Félix; Caamaño-Martín, Estefaná; Masa-Bote, Daniel; Jiménez-Leube, Javier
2011-01-01
In this paper we present a heterogeneous collaborative sensor network for electrical management in the residential sector. Improving demand-side management is very important in distributed energy generation applications. Sensing and control are the foundations of the “Smart Grid” which is the future of large-scale energy management. The system presented in this paper has been developed on a self-sufficient solar house called “MagicBox” equipped with grid connection, PV generation, lead-acid batteries, controllable appliances and smart metering. Therefore, there is a large number of energy variables to be monitored that allow us to precisely manage the energy performance of the house by means of collaborative sensors. The experimental results, performed on a real house, demonstrate the feasibility of the proposed collaborative system to reduce the consumption of electrical power and to increase energy efficiency. PMID:22247680
The Quality of Textbooks: A Basis for European Collaboration.
ERIC Educational Resources Information Center
Hooghoff, Hans
The enhancement of the European dimension in the national curriculum is a large scale educational innovation that affects many European countries. This report puts forward the proposition that broad scale educational innovation has more success if the aims and objectives find their way into textbooks. The reasons why a European dimension in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George
1999-01-11
A workshop on collaborative problem-solving environments (CPSEs) was held June 29 through July 1, 1999, in San Diego, California. The workshop was sponsored by the U.S. Department of Energy and the High Performance Network Applications Team of the Large Scale Networking Working Group. The workshop brought together researchers and developers from industry, academia, and government to identify, define, and discuss future directions in collaboration and problem-solving technologies in support of scientific research.
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O. (Editor); Housner, Jerrold M. (Editor)
1993-01-01
Computing speed is leaping forward by several orders of magnitude each decade. Engineers and scientists gathered at a NASA Langley symposium to discuss these exciting trends as they apply to parallel computational methods for large-scale structural analysis and design. Among the topics discussed were: large-scale static analysis; dynamic, transient, and thermal analysis; domain decomposition (substructuring); and nonlinear and numerical methods.
Scale-Up of GRCop: From Laboratory to Rocket Engines
NASA Technical Reports Server (NTRS)
Ellis, David L.
2016-01-01
GRCop is a high temperature, high thermal conductivity copper-based series of alloys designed primarily for use in regeneratively cooled rocket engine liners. It began with laboratory-level production of a few grams of ribbon produced by chill block melt spinning and has grown to commercial-scale production of large-scale rocket engine liners. Along the way, a variety of methods of consolidating and working the alloy were examined, a database of properties was developed and a variety of commercial and government applications were considered. This talk will briefly address the basic material properties used for selection of compositions to scale up, the methods used to go from simple ribbon to rocket engines, the need to develop a suitable database, and the issues related to getting the alloy into a rocket engine or other application.
2009-01-01
Background Insertional mutagenesis is an effective method for functional genomic studies in various organisms. It can rapidly generate easily tractable mutations. A large-scale insertional mutagenesis with the piggyBac (PB) transposon is currently being performed in mice at the Institute of Developmental Biology and Molecular Medicine (IDM), Fudan University in Shanghai, China. This project is carried out via collaborations among multiple groups overseeing interconnected experimental steps and generates a large volume of experimental data continuously. Therefore, the project calls for an efficient database system for recording, management, statistical analysis, and information exchange. Results This paper presents a database application called MP-PBmice (insertional mutation mapping system of PB Mutagenesis Information Center), which was developed to serve the on-going large-scale PB insertional mutagenesis project. A lightweight enterprise-level development framework, Struts-Spring-Hibernate, is used here to ensure constructive and flexible support to the application. The MP-PBmice database system has three major features: strict access control, efficient workflow control, and good expandability. It supports the collaboration among different groups that enter data and exchange information on a daily basis, and it is capable of providing real-time progress reports for the whole project. MP-PBmice can be easily adapted for other large-scale insertional mutation mapping projects, and the source code of this software is freely available at http://www.idmshanghai.cn/PBmice. Conclusion MP-PBmice is a web-based application for large-scale insertional mutation mapping onto the mouse genome, implemented with the widely used framework Struts-Spring-Hibernate. This system is already in use by the on-going genome-wide PB insertional mutation mapping project at IDM, Fudan University. PMID:19958505
Software for Collaborative Engineering of Launch Rockets
NASA Technical Reports Server (NTRS)
Stanley, Thomas Troy
2003-01-01
The Rocket Evaluation and Cost Integration for Propulsion and Engineering (RECIPE) software enables collaborative computing with automated exchange of information in the design and analysis of launch rockets and other complex systems. RECIPE can interact with and incorporate a variety of programs, including legacy codes, that model aspects of a system from the perspectives of different technological disciplines (e.g., aerodynamics, structures, propulsion, trajectory, aeroheating, controls, and operations) and that are used by different engineers on different computers running different operating systems. RECIPE consists mainly of (1) ISCRM, a file-transfer subprogram that makes it possible for legacy codes executed in their original operating systems on their original computers to exchange data, and (2) CONES, an easy-to-use file-wrapper subprogram that enables the integration of legacy codes. RECIPE provides a tightly integrated conceptual framework that emphasizes connectivity among the programs used by the collaborators, linking these programs in a manner that provides some configuration control while facilitating collaborative engineering tradeoff studies, including design-to-cost studies. In comparison with prior collaborative-engineering schemes, one based on the use of RECIPE enables fewer engineers to do more in less time.
A flipped mode teaching approach for large and advanced electrical engineering courses
NASA Astrophysics Data System (ADS)
Ravishankar, Jayashri; Epps, Julien; Ambikairajah, Eliathamby
2018-05-01
A fully flipped mode teaching approach is challenging for students in advanced engineering courses because of the demanding pre-class preparation load that results from the complex and analytical nature of the topics. When applied to large classes, it brings additional complexity in terms of promoting the intended active learning. This paper presents a novel selective flipped mode teaching approach designed for large and advanced courses that has two aspects: (i) it provides selective flipping of a few topics, while delivering others in traditional face-to-face teaching, to provide an effective trade-off between the two approaches according to the demands of individual topics; and (ii) it introduces technology-enabled live in-class quizzes to obtain instant feedback and facilitate collaborative problem-solving exercises. The proposed approach was implemented for a large fourth-year course in electrical power engineering over three successive years, and the criteria for selecting between the flipped and traditional teaching modes are outlined. Results confirmed that the proposed approach improved both students' academic achievements and their engagement in the course, without overloading them during the teaching period.
The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies
NASA Technical Reports Server (NTRS)
Mulqueen, Jack; Jones, David; Hopkins, Randy
2011-01-01
This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies, including human space exploration missions, space transportation system studies, and in-space science missions. The paper will describe the design team structure and the specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition, and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis, and subsystem analyses, leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, will be addressed. The paper will also describe the applicability of the collaborative engineering process to an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.
Towards a muon radiography of the Puy de Dôme
NASA Astrophysics Data System (ADS)
Cârloganu, C.; Niess, V.; Béné, S.; Busato, E.; Dupieux, P.; Fehr, F.; Gay, P.; Miallier, D.; Vulpescu, B.; Boivin, P.; Combaret, C.; Labazuy, P.; Laktineh, I.; Lénat, J.-F.; Mirabito, L.; Portal, A.
2013-02-01
High-energy (above a few hundred GeV) atmospheric muons are a natural probe for geophysical studies. They can travel through kilometres of rock allowing for a radiography of the density distribution within large structures, like mountains or volcanoes. A collaboration between volcanologists, astroparticle and particle physicists, Tomuvol was formed in 2009 to study tomographic muon imaging of volcanoes with high-resolution, large-scale tracking detectors. We report on two campaigns of measurements at the flank of the Puy de Dôme using glass resistive plate chambers (GRPCs) developed for particle physics, within the CALICE collaboration.
Towards a muon radiography of the Puy de Dôme
NASA Astrophysics Data System (ADS)
Cârloganu, C.; Niess, V.; Béné, S.; Busato, E.; Dupieux, P.; Fehr, F.; Gay, P.; Miallier, D.; Vulpescu, B.; Boivin, P.; Combaret, C.; Labazuy, P.; Laktineh, I.; Lénat, J.-F.; Mirabito, L.; Portal, A.
2012-09-01
High energy (above 100 GeV) atmospheric muons are a natural probe for geophysical studies. They can travel through kilometres of rock allowing for a radiography of the density distribution within large structures, like mountains or volcanoes. A collaboration between volcanologists, astroparticle and particle physicists, TOMUVOL, was formed in 2009 to study tomographic muon imaging of volcanoes with high resolution, large scale tracking detectors. We report on two campaigns of measurements at the flank of the Puy de Dôme using Glass Resistive Plate Chambers (GRPCs) developed for Particle Physics, within the CALICE collaboration.
Europe Report, Science and Technology.
1986-06-06
the question, of course, whether the combined treatment will cause chromosome damage. In the case of chromosomes from in vitro cultivated human ...latest acquisition, this division collaborates on major engine programs, in particular with Snecma (Atar, CFM 56) and Turbomeca (ARTM 405). 9294 CSO ...microbiology, enzymology, molecular and cell biology, organic chemistry, etc. Several large public research organizations are involved: CNRS [National
Seo, Joo-Hyun; Kim, Hwan-Hee; Jeon, Eun-Yeong; Song, Young-Ha; Shin, Chul-Soo; Park, Jin-Byung
2016-01-01
Baeyer-Villiger monooxygenases (BVMOs) are able to catalyze regiospecific Baeyer-Villiger oxygenation of a variety of cyclic and linear ketones to generate the corresponding lactones and esters, respectively. However, the enzymes are usually difficult to express in a functional form in microbial cells and are rather unstable under process conditions, hindering their large-scale application. We therefore investigated engineering of the BVMO from Pseudomonas putida KT2440 and the gene expression system to improve its activity and stability for large-scale biotransformation of ricinoleic acid (1) into the ester (i.e., (Z)-11-(heptanoyloxy)undec-9-enoic acid) (3), which can be hydrolyzed into 11-hydroxyundec-9-enoic acid (5) (i.e., a precursor of polyamide-11) and n-heptanoic acid (4). The polyionic tag-based fusion engineering of the BVMO and the use of a synthetic promoter for constitutive enzyme expression allowed the recombinant Escherichia coli expressing the BVMO and the secondary alcohol dehydrogenase of Micrococcus luteus to produce the ester (3) at up to 85 mM (26.6 g/L) within 5 h. The 5 L scale biotransformation process was then successfully scaled up to a 70 L bioreactor; 3 was produced at over 70 mM (21.9 g/L) in the culture medium 6 h after biotransformation. This study demonstrated that BVMO-based whole-cell reactions can be applied for large-scale biotransformations. PMID:27311560
Advanced Information Technology in Simulation Based Life Cycle Design
NASA Technical Reports Server (NTRS)
Renaud, John E.
2003-01-01
In this research, a Collaborative Optimization (CO) approach for multidisciplinary systems design is used to develop a decision based design framework for non-deterministic optimization. To date, CO strategies have been developed for application to deterministic systems design problems. In this research, the decision based design (DBD) framework proposed by Hazelrigg is modified for use in a collaborative optimization framework. The Hazelrigg framework as originally proposed provides a single-level strategy that combines engineering decisions with business decisions in one optimization. By transforming this framework for use in collaborative optimization, one can decompose the business and engineering decision-making processes. In the new multilevel framework of Decision Based Collaborative Optimization (DBCO), business decisions are made at the system level. These business decisions result in a set of engineering performance targets that disciplinary engineering design teams seek to satisfy as part of subspace optimizations. The Decision Based Collaborative Optimization framework more accurately models the existing relationship between business and engineering in multidisciplinary systems design.
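The two-level structure described above can be fixed with a minimal numerical sketch (illustrative only; every model, coefficient, and variable name here is invented and is not Hazelrigg's or the authors' actual formulation): the system level chooses a performance target that maximizes a business objective, and a disciplinary subspace then finds the design that meets that target at least engineering effort.

```python
# Toy two-level sketch of Decision Based Collaborative Optimization.
# System level (business): pick the performance target maximizing
# revenue minus the cost the engineering subspace reports back.
# Subspace level (engineering): choose a design variable that hits
# the target as closely as possible at minimum effort.

def subspace_optimize(target):
    """Engineering team: grid search for design x minimizing the
    target mismatch plus effort. perf(x) = 2*x and effort = 0.1*x**2
    are invented models."""
    best_x, best_penalty = 0.0, float("inf")
    for i in range(1001):
        x = i / 100.0
        penalty = (2 * x - target) ** 2 + 0.1 * x ** 2
        if penalty < best_penalty:
            best_x, best_penalty = x, penalty
    return best_x, 0.1 * best_x ** 2        # chosen design and its cost

def system_optimize():
    """Business level: maximize an invented profit model,
    revenue 10*t - t**2 minus the subspace-reported cost."""
    best = max(
        (10 * t - t ** 2 - subspace_optimize(t)[1], t)
        for t in (i / 10.0 for i in range(101))
    )
    return best[1]                          # the chosen target

target = system_optimize()
design, cost = subspace_optimize(target)
print(f"target={target:.1f} design={design:.2f} cost={cost:.3f}")
```

The key point the sketch mirrors is the decomposition: the system level never sees the design variable, only the target and the cost the subspace reports, just as DBCO separates business from engineering decision making.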
(Re)inventing Government-Industry R and D Collaboration
NASA Technical Reports Server (NTRS)
Holmes, Bruce J.
1996-01-01
This paper describes the lessons learned in developing and operating a large-scale strategic alliance whose organization and coordination is U.S. Government-led using new means for R&D collaboration. Consortia in the United States counter a century of governmental and legal policy based on the 1890 Sherman Anti-Trust Act and a longstanding business tradition of unfettered competition. Success in public-private collaboration in America requires compelling vision and motivation by both partners to reinvent our ways of doing business. The foundations for reinventing government and alliance building were laid in 1994 with Vice President Al Gore's mandates for Federal Lab Reviews and other examinations of the roles and missions of the nation's more than 700 government labs. In addition, the 1984 National Cooperative Research Act (NCRA) set in motion the ability for U.S. companies to collaborate in pre-competitive technology development. The budget realities of the 1990s for NASA and other government agencies demand that government discover the means to accomplish its mission by leveraging resources through streamlining as well as alliances. Federal R&D investments can be significantly leveraged for greater national benefit through strategic alliances with industry and university partners. This paper presents early results from one of NASA's first large-scale public/private joint R&D ventures.
Collaborative visual analytics of radio surveys in the Big Data era
NASA Astrophysics Data System (ADS)
Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.
2017-06-01
Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered by a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, and parallel data query and management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform and allowing the research process to continue wherever you are.
McrEngine: A Scalable Checkpointing System Using Data-Aware Aggregation and Compression
Islam, Tanzima Zerin; Mohror, Kathryn; Bagchi, Saurabh; ...
2013-01-01
High performance computing (HPC) systems use checkpoint-restart to tolerate failures. Typically, applications store their states in checkpoints on a parallel file system (PFS). As applications scale up, checkpoint-restart incurs high overheads due to contention for PFS resources. The high overheads force large-scale applications to reduce checkpoint frequency, which means more compute time is lost in the event of failure. We alleviate this problem through a scalable checkpoint-restart system, mcrEngine. McrEngine aggregates checkpoints from multiple application processes with knowledge of the data semantics available through widely-used I/O libraries, e.g., HDF5 and netCDF, and compresses them. Our novel scheme improves compressibility of checkpoints up to 115% over simple concatenation and compression. Our evaluation with large-scale application checkpoints shows that mcrEngine reduces checkpointing overhead by up to 87% and restart overhead by up to 62% over a baseline with no aggregation or compression.
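The aggregation idea can be illustrated with a toy sketch (hypothetical data layout, not mcrEngine's actual format or its HDF5/netCDF-aware merging): grouping like-named variables from several checkpoints before compression places similar bytes within the compressor's window, whereas naive concatenation can leave them too far apart to be matched.

```python
import os
import zlib

# Three hypothetical process checkpoints: each has a large, unique
# 'mesh' section and a 'field' variable that is similar across ranks
# (identical here, for simplicity).
shared_field = os.urandom(8_000)
checkpoints = [{"mesh": os.urandom(40_000), "field": shared_field}
               for _ in range(3)]

# Naive scheme: concatenate whole checkpoints, then compress.
# The 'field' copies end up ~48 KB apart, beyond DEFLATE's 32 KB
# sliding window, so the redundancy cannot be exploited.
concatenated = b"".join(ck["mesh"] + ck["field"] for ck in checkpoints)
naive = zlib.compress(concatenated, 9)

# Data-aware scheme: group like-named variables together first, so the
# similar 'field' copies sit adjacent and compress to back-references.
grouped = (b"".join(ck["mesh"] for ck in checkpoints)
           + b"".join(ck["field"] for ck in checkpoints))
aware = zlib.compress(grouped, 9)

# len(aware) < len(naive): same bytes, much better compressibility.
```

The gain comes purely from layout; both schemes store exactly the same bytes.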
Software engineering and automatic continuous verification of scientific software
NASA Astrophysics Data System (ADS)
Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.
2011-12-01
Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering among scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employs best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications, from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code, and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased.
Much of the code verification is done via the "gold standard" of comparisons to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development and advocate similar procedures for other scientific code applications.
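The method of manufactured solutions can be sketched in a few lines (a generic illustration, not Fluidity's actual test harness): choose an exact solution, derive the matching source term, and confirm the discretisation converges at its theoretical order.

```python
import math

def solve_poisson(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, by second-order central
    differences, against the manufactured solution u(x) = sin(pi x),
    which implies the source term f(x) = pi^2 sin(pi x).
    Returns the max-norm error at the n interior grid points."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]
    d = [h * h * math.pi**2 * math.sin(math.pi * xi) for xi in x]
    # Tridiagonal system (diagonal 2, off-diagonals -1): Thomas algorithm.
    a, b, c = [-1.0] * n, [2.0] * n, [-1.0] * n
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return max(abs(ui - math.sin(math.pi * xi)) for ui, xi in zip(u, x))

# Halving h should divide the error by ~4 for a second-order scheme.
e1, e2 = solve_poisson(32), solve_poisson(64)
order = math.log(e1 / e2, 2)  # observed convergence order, ~2
```

Run as part of a test suite, the assertion on `order` turns verification into a continuously checked property rather than a one-off exercise.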
Parallelization of Rocket Engine Simulator Software (PRESS)
NASA Technical Reports Server (NTRS)
Cezzar, Ruknet
1997-01-01
Parallelization of Rocket Engine Simulator Software (PRESS) is a project that is part of a collaborative effort with Southern University at Baton Rouge (SUBR), University of West Florida (UWF), and Jackson State University (JSU). The second-year funding, which supports two graduate students enrolled in our new Master's program in Computer Science at Hampton University and the principal investigator, has been obtained for the period from October 19, 1996 through October 18, 1997. The key part of the interim report was new directions for the second-year funding. This came about from discussions during the Rocket Engine Numeric Simulator (RENS) project meeting in Pensacola on January 17-18, 1997. At that time, a software agreement between Hampton University and NASA Lewis Research Center had already been concluded. That agreement concerns off-NASA-site experimentation with the PUMPDES/TURBDES software. Before this agreement, during the first year of the project, another large-scale FORTRAN-based software package, Two-Dimensional Kinetics (TDK), was being used for translation to an object-oriented language and parallelization experiments. However, that package proved to be too complex and lacking in sufficient documentation for an effective translation to object-oriented C++ source code. The focus, this time with the better documented and more manageable PUMPDES/TURBDES package, was still on translation to C++ with design improvements. At the RENS meeting, however, the new impetus for the RENS projects in general, and PRESS in particular, shifted in two important ways. One was closer alignment with the work on the Numerical Propulsion System Simulator (NPSS) through cooperation and collaboration with the LERC ACLU organization. The other was to see whether and how NASA's various rocket design software can be run over local networks and intranets without any radical efforts for redesign and translation into object-oriented source code. 
There were also suggestions that the Fortran-based code be encapsulated in C++ code, thereby facilitating reuse without undue development effort. The details are covered in the aforementioned section of the interim report filed on April 28, 1997.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo
Traditional engineering methods do not make provision for the integration of human considerations, while traditional human factors methods do not scale well to the complexity of large-scale nuclear power plant projects. Although the need for up-to-date human factors engineering processes and tools is recognised widely in industry, so far no formal guidance has been developed. This article proposes such a framework.
sbtools: A package connecting R to cloud-based data for collaborative online research
Winslow, Luke; Chamberlain, Scott; Appling, Alison P.; Read, Jordan S.
2016-01-01
The adoption of high-quality tools for collaboration and reproducible research such as R and Github is becoming more common in many research fields. While Github and other version management systems are excellent resources, they were originally designed to handle code and scale poorly to large text-based or binary datasets. A number of scientific data repositories are coming online and are often focused on dataset archival and publication. To handle collaborative workflows using large scientific datasets, there is increasing need to connect cloud-based online data storage to R. In this article, we describe how the new R package sbtools enables direct access to the advanced online data functionality provided by ScienceBase, the U.S. Geological Survey’s online scientific data storage platform.
Boise Hydrogeophysical Research Site: Control Volume/Test Cell and Community Research Asset
NASA Astrophysics Data System (ADS)
Barrash, W.; Bradford, J.; Malama, B.
2008-12-01
The Boise Hydrogeophysical Research Site (BHRS) is a research wellfield or field-scale test facility developed in a shallow, coarse, fluvial aquifer with the objectives of supporting: (a) development of cost-effective, non- or minimally-invasive quantitative characterization and imaging methods in heterogeneous aquifers using hydrologic and geophysical techniques; (b) examination of fundamental relationships and processes at multiple scales; (c) testing of theories and models for groundwater flow and solute transport; and (d) education and training of students in multidisciplinary subsurface science and engineering. The design of the wells and the wellfield supports modular use and reoccupation of wells for a wide range of single-well, cross-hole, multiwell and multilevel hydrologic, geophysical, and combined hydrologic-geophysical experiments. Efforts to date by Boise State researchers and collaborators have largely focused on: (a) establishing the 3D distributions of geologic, hydrologic, and geophysical parameters which can then be used as the basis for jointly inverting hard and soft data to return the 3D K distribution and (b) developing subsurface measurement and imaging methods, including tomographic characterization and imaging methods. At this point the hydrostratigraphic framework of the BHRS is known to be a hierarchical multi-scale system which includes layers and lenses that are recognized with geologic, hydrologic, radar, seismic, and EM methods; details are now emerging which may allow 3D deterministic characterization of zones and/or material variations at the meter scale in the central wellfield. Also, the site design and subsurface framework have supported a variety of testing configurations for joint hydrologic and geophysical experiments. 
Going forward we recognize the opportunity to increase the R&D returns from use of the BHRS with additional infrastructure (especially for monitoring the vadose zone and surface water-groundwater interactions), more collaborative activity, and greater access to site data. Our broader goal of becoming more available as a research asset for the scientific community also supports the long-term business plan of increasing funding opportunities to maintain and operate the site.
Enhancing non-technical skills by a multidisciplinary engineering summer school
NASA Astrophysics Data System (ADS)
Larsen, Peter Gorm; Kristiansen, Erik Lasse; Bennedsen, Jens; Bjerge, Kim
2017-11-01
In general, engineering studies focus on the technical skills of their own discipline. However, in students' subsequent industrial careers, a significant portion of their time needs to be devoted to non-technical skills. In addition, in an increasingly globalised world, collaboration in teams across cultures and disciplines is paramount to the creation of new and innovative products. In order to enhance the non-technical skills of groups of engineering students, a series of innovation courses has been arranged and delivered in close collaboration with an industrial company (Bang & Olufsen). These courses have been organised as summer schools called 'Conceptual Design and Development of Innovative Products' (CD-DIP) and delivered outside the usual educational environment. In order to explore the impact of this single course, we have conducted a study among the students participating from 2007 to 2013. This has been carried out both qualitatively, using interviews with selected students, and quantitatively, using a survey. The results are outstanding in demonstrating that the non-technical skills obtained in this single course have been of high value for a large portion of the students' subsequent professional life.
Design of a V/STOL propulsion system for a large-scale fighter model
NASA Technical Reports Server (NTRS)
Willis, W. S.
1981-01-01
Modifications were made to the existing Large-Scale STOL fighter model to simulate a V/STOL configuration. Modifications included the substitution of two-dimensional lift/cruise exhaust nozzles in the nacelles, and the addition of a third J97 engine in the fuselage to supply a remote exhaust nozzle simulating a Remote Augmented Lift System. A preliminary design of the inlet and exhaust ducting for the third engine was developed, and a detailed design was completed of the hot exhaust ducting and remote nozzle.
Implementation of the Large-Scale Operations Management Test in the State of Washington.
1982-12-01
During FY 79, the U.S. Army Engineer Waterways Experiment Station (WES), Vicksburg, Miss., completed the first phase of its 3-year Large-Scale Operations Management Test (LSOMT). The LSOMT was designed to develop an operational plan to identify methodologies that can be implemented by the U.S. Army Engineer District, Seattle (NPS), to prevent the exotic aquatic macrophyte Eurasian watermilfoil (Myriophyllum spicatum L.) from reaching problem-level proportions in water bodies in the state of Washington. The WES developed specific plans as integral elements
ERIC Educational Resources Information Center
Ejiwale, James A.
2014-01-01
Collaboration plays a major role in interdisciplinary activities among Science, Technology, Engineering & Mathematics (STEM) disciplines or fields. It also affects the relationships among cluster members on the management team. Although effective collaboration does not guarantee success among STEM disciplines, its absence usually assures…
Astronomical large projects managed with MANATEE: management tool for effective engineering
NASA Astrophysics Data System (ADS)
García-Vargas, M. L.; Mujica-Alvarez, E.; Pérez-Calpena, A.
2012-09-01
This paper describes MANATEE, the project management web tool developed by FRACTAL, specifically designed for managing large astronomical projects. MANATEE facilitates management by providing an overall view of the project and the capability to control the three main project parameters: scope, schedule and budget. MANATEE is one of the three tools of the FRACTAL System & Project Suite, which also comprises GECO (System Engineering Tool) and DOCMA (Documentation Management Tool). These tools are especially suited for those consortia and teams collaborating on a multi-discipline, complex project in a geographically distributed environment. Our management view has been applied successfully in several projects and is currently being used for managing MEGARA, the next instrument for the GTC 10m telescope.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poirier, M.R.
2002-06-07
Personnel performed engineering-scale tests at the Filtration Research Engineering Demonstration (FRED) to determine crossflow filter performance with a 5.6 M sodium solution containing varying concentrations of sludge and sodium permanganate. The work represents another in a series of collaborative efforts between the University of South Carolina and the Savannah River Technology Center in support of the process development efforts for the Savannah River Site. The current tests investigated filter performance with slurry containing simulated Tank 40H Sludge and sodium permanganate at concentrations between 0.070 weight percent and 3.04 weight percent insoluble solids.
The development of a collaborative virtual environment for finite element simulation
NASA Astrophysics Data System (ADS)
Abdul-Jalil, Mohamad Kasim
Communication between geographically distributed designers has been a major hurdle in traditional engineering design. Conventional methods of communication, such as video conferencing, telephone, and email, are less efficient, especially when dealing with complex design models. Complex shapes, intricate features and hidden parts are often difficult to describe verbally or even using traditional 2-D or 3-D visual representations. Virtual Reality (VR) and Internet technologies offer substantial potential to bridge this communication barrier. VR technology allows designers to immerse themselves in a virtual environment to view and manipulate a design model just as in real life. Fast Internet connectivity has enabled fast data transfer between remote locations. Although various collaborative virtual environment (CVE) systems have been developed in the past decade, they have relied on high-end technology that is not accessible to typical designers. The objective of this dissertation is to discover and develop a new approach to increase the efficiency of the design process, particularly for large-scale applications wherein participants are geographically distributed. A multi-platform and easily accessible collaborative virtual environment (CVRoom) is developed to accomplish the stated research objective. Geographically dispersed designers can meet in a single shared virtual environment to discuss issues pertaining to the engineering design process and to make trade-off decisions more quickly than before, thereby speeding the entire process. This faster design process is achieved through the development of capabilities that better enable multidisciplinary modeling and the trade-off decisions that are so critical before launching into a formal detailed design. 
The features of the environment developed as a result of this research include the ability to view design models, use voice interaction, and link in engineering analysis modules (such as the Finite Element Analysis module demonstrated in this work). One of the major issues in developing a CVE system for engineering design purposes is obtaining pertinent simulation results in real time, which is critical so that designers can make decisions based on these results quickly. For example, in a finite element analysis, if a design model is changed or perturbed, the analysis results must be obtained in real time or near real time to make the virtual meeting environment realistic. In this research, the finite difference-based Design Sensitivity Analysis (DSA) approach is employed to approximate structural responses (e.g., stress and displacement), so as to demonstrate the applicability of CVRoom for engineering design trade-offs. This DSA approach provides fast approximation and is well suited for the virtual meeting environment, where fast response time is required. The DSA-based approach is tested on several example problems to show its applicability and limitations. This dissertation demonstrates that an increase in efficiency and a reduction in the time required for a complex design process can be accomplished using the approach developed in this dissertation research. Several implementations of CVRoom by students working on common design tasks were investigated. All participants confirmed a preference for the collaborative virtual environment developed in this work (CVRoom) over other modes of interaction. It is proposed here that CVRoom is representative of the type of collaborative virtual environment that will be used by most designers in the future to reduce the time required in a design cycle and thereby reduce the associated cost.
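The finite-difference sensitivity idea can be sketched as follows (a toy illustration with a hypothetical cantilever-beam response, not CVRoom's actual code): a perturbed design's response is extrapolated from the baseline analysis and one sensitivity evaluation, avoiding a full re-analysis during the virtual meeting.

```python
def bending_stress(h, force=1000.0, length=2.0, width=0.05):
    """Root bending stress of a rectangular cantilever: sigma = 6FL/(b h^2).
    (Hypothetical response function standing in for an FE analysis.)"""
    return 6.0 * force * length / (width * h * h)

h0, dh = 0.10, 1e-6            # baseline beam depth and FD step
sigma0 = bending_stress(h0)    # 'expensive' baseline analysis

# Finite-difference sensitivity d(sigma)/dh at the baseline design.
sensitivity = (bending_stress(h0 + dh) - sigma0) / dh

# First-order approximation for a perturbed design (h0 + 5 mm), fast
# enough for real-time feedback, compared with an exact re-analysis.
h_new = h0 + 0.005
sigma_approx = sigma0 + sensitivity * (h_new - h0)
sigma_exact = bending_stress(h_new)
rel_error = abs(sigma_approx - sigma_exact) / sigma_exact
```

For this small perturbation the first-order estimate stays within about one percent of the exact value, which is the trade-off the DSA approach exploits: slight approximation error in exchange for near-instant response.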
Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy
2014-12-01
A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transferring process. In a qualitative study 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of the normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence on effectiveness, a national infrastructure for these collaboratives and a general positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.
Aerospace Systems Design in NASA's Collaborative Engineering Environment
NASA Technical Reports Server (NTRS)
Monell, Donald W.; Piland, William M.
2000-01-01
Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operation). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to an inability to assess critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure in which the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. 
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.
NASA Technical Reports Server (NTRS)
Sullivan, Steven J.
2014-01-01
"Rocket University" is an exciting new initiative at Kennedy Space Center led by NASA's Engineering and Technology Directorate. This hands-on experience has been established to develop, refine & maintain targeted flight engineering skills to enable the Agency and KSC strategic goals. Through "RocketU", KSC is developing a nimble, rapid flight engineering life cycle systems knowledge base. Ongoing activities in RocketU develop and test new technologies and potential customer systems through small-scale vehicles, build and maintain flight experience through balloon and small-scale rocket missions, and enable a revolving fresh perspective of engineers with hands-on expertise back into the large-scale NASA programs, providing a more experienced, multi-disciplined set of systems engineers. This overview will define the Program, highlight aspects of the training curriculum, and identify recent accomplishments and activities.
Collaborative Approach in Software Engineering Education: An Interdisciplinary Case
ERIC Educational Resources Information Center
Vicente, Aileen Joan; Tan, Tiffany Adelaine; Yu, Alvin Ray
2018-01-01
Aim/Purpose: This study was aimed at enhancing students' learning of software engineering methods. A collaboration between the Computer Science, Business Management, and Product Design programs was formed to work on actual projects with real clients. This interdisciplinary form of collaboration simulates the realities of a diverse Software…
Johnson, Elizabeth O; Troupis, Theodore; Soucacos, Panayotis N
2011-03-01
Bone grafts are an important part of the orthopaedic surgeon's armamentarium. Despite well-established bone-grafting techniques, large bone defects still represent a challenge. Efforts have therefore been made to develop osteoconductive, osteoinductive, and osteogenic bone-replacement systems. The long-term clinical goal in bone tissue engineering is to reconstruct bony tissue in an anatomically functional three-dimensional morphology. Current bone tissue engineering strategies take into account that bone is known for its ability to regenerate following injury, and for its intrinsic capability to re-establish a complex hierarchical structure during regeneration. Although the tissue engineering of bone for the reconstruction of small to moderate-sized bone defects is technically feasible, the reconstruction of large defects remains a daunting challenge. The essential steps towards optimized clinical application of tissue-engineered bone are dependent upon recent advances in the area of neovascularization of the engineered construct. Despite these recent advances, however, a gap from bench to bedside remains; this may ultimately be bridged by a closer collaboration between basic scientists and reconstructive surgeons. The aim of this review is to introduce the basic principles of tissue engineering of bone, outline the relevant bone physiology, and discuss the recent concepts for the induction of vascularization in engineered bone tissue. Copyright © 2011 Wiley-Liss, Inc.
Taking Stock: Existing Resources for Assessing a New Vision of Science Learning
ERIC Educational Resources Information Center
Alonzo, Alicia C.; Ke, Li
2016-01-01
A new vision of science learning described in the "Next Generation Science Standards"--particularly the science and engineering practices and their integration with content--pose significant challenges for large-scale assessment. This article explores what might be learned from advances in large-scale science assessment and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-08-01
In this pilot project, the Building America Partnership for Improved Residential Construction and Florida Power and Light are collaborating to retrofit a large number of homes using a phased approach to both simple and deep retrofits. This project will provide the information necessary to significantly reduce energy use through larger community-scale projects in collaboration with utilities, program administrators and other market leader stakeholders.
NASA Technical Reports Server (NTRS)
Barrack, J. P.; Kirk, J. V.
1972-01-01
The aerodynamic characteristics of a six-engine (four lift, two lift-cruise) lift-engine model obtained in the Ames 40- by 80-foot wind tunnel are presented. The model was an approximate one-half scale representation of a lift-engine VTOL fighter aircraft with a variable-sweep wing. The four lift-engines were housed in the aft fuselage with the inlets located above the wing. Longitudinal and lateral-directional force and moment data are presented for a range of exhaust gas momentum ratios (thrust coefficients). Wind tunnel forward speed was varied from 0 to 140 knots corresponding to a maximum Reynolds number of 6.7 million. The data are presented without analysis.
Wind Tunnel Tests of Large- and Small-Scale Rotor Hubs and Pylons
1981-04-01
formed by the engine nacelle and the nose gearbox. A fairing was wrapped around the nose gearbox and blended into the top and bottom of the engine...the variation is due to the FABs installation, which acts like a small winglet. The large excursion ahead of Station 200 is due to the wing, the flow
Large space telescope engineering scale model optical design
NASA Technical Reports Server (NTRS)
Facey, T. A.
1973-01-01
The objective is to develop the detailed design and tolerance data for the LST engineering scale model optical system. This will enable MSFC to move forward to the optical element procurement phase and also to evaluate tolerances, manufacturing requirements, assembly/checkout procedures, reliability, operational complexity, stability requirements of the structure and thermal system, and the flexibility to change and grow.
2002-01-01
behaviors are influenced by social interactions, and to how modern IT systems should be designed to support these group technical activities. The...engineering disciplines to behavior, decision, psychology, organization, and the social sciences. "Conflict management activity in collaborative...Researchers instead began to search for an entirely new paradigm, starting from a theory in social science, to construct a conceptual framework to describe
Study of an engine flow diverter system for a large scale ejector powered aircraft model
NASA Technical Reports Server (NTRS)
Springer, R. J.; Langley, B.; Plant, T.; Hunter, L.; Brock, O.
1981-01-01
Requirements were established for a conceptual design study to analyze and design an engine flow diverter system and to include accommodations for an ejector system in an existing 3/4 scale fighter model equipped with YJ-79 engines. Model constraints were identified and cost-effective limited modification was proposed to accept the ejectors, ducting and flow diverter valves. Complete system performance was calculated and a versatile computer program capable of analyzing any ejector system was developed.
ERIC Educational Resources Information Center
Kettle, Kevin C., Ed.
This colloquium was held with the purposes of promoting cooperation and collaboration among engineering education institutions in the Mekong subregion and establishing linkage with engineering institutions in France; promoting university-industry collaboration in the field of engineering and technology education; and establishing a network of…
2006-08-01
and analytical techniques. Materials with larger grains, such as gamma titanium aluminide, can be instrumented with strain gages on each grain...scale. Materials such as Ti-15Al-33Nb (at.%) have a significantly smaller microstructure than gamma titanium aluminide; therefore, strain gages can...contact fatigue problems that arise at the blade-disk interface in aircraft engines. The stress fields can be used to predict the performance of
Synthesis of Backfunctionalized Imidazolinium Salts and NHC Carbene Complexes
2017-04-02
Presented at the American Chemical Society National Meeting, San Francisco, CA, USA (02 April 2017). Prepared in collaboration with the California Institute of...AFRL – Tenant of Edwards AFB since late ‘50s – Full-scale testing of the Atlas rockets (Gemini missions) – Initial testing of the F-1 engine (Apollo...Force has an interest in NHC carbene precursors for a variety of applications – Ionic liquid propellants and additives – Ligands for Supercritical Chemical
NASA Technical Reports Server (NTRS)
Fujiwara, Gustavo; Bragg, Mike; Triphahn, Chris; Wiberg, Brock; Woodard, Brian; Loth, Eric; Malone, Adam; Paul, Bernard; Pitera, David; Wilcox, Pete;
2017-01-01
This report presents the key results from the first two years of a program to develop experimental icing simulation capabilities for full-scale swept wings. This investigation was undertaken as part of a larger collaborative research effort on ice accretion and aerodynamics for large-scale swept wings. Ice accretion and the resulting aerodynamic effect on large-scale swept wings present a significant airplane design and certification challenge to airframe manufacturers, certification authorities, and research organizations alike. While the effect of ice accretion on straight wings has been studied in detail for many years, the available data on swept-wing icing are much more limited, especially for larger scales.
NASA Astrophysics Data System (ADS)
Henkel, Daniela; Eisenhauer, Anton
2017-04-01
During the last decades, the number of large research projects has increased, and with it the requirement for multidisciplinary, multisectoral collaboration. Such complex, large-scale projects demand new competencies to form, manage, and use large, diverse teams as a competitive advantage. For complex projects the effort is magnified: multiple large international research consortia involving academic and non-academic partners, including big industries, NGOs, and private and public bodies, all with cultural differences, individually discrepant expectations of teamwork, and differences in collaboration between national and multi-national administrations and research organisations, challenge the organisation and management of such multi-partner research consortia. How many partners are needed to establish and conduct collaboration with a multidisciplinary and multisectoral approach? How much personnel effort and what kinds of management techniques are required for such projects? This presentation identifies advantages and challenges of large research projects based on the experiences made in the context of an Innovative Training Network (ITN) project within the Marie Skłodowska-Curie Actions of the European HORIZON 2020 program. Possible strategies are discussed to circumvent and avoid conflicts already at the beginning of the project.
Fast Bound Methods for Large Scale Simulation with Application for Engineering Optimization
NASA Technical Reports Server (NTRS)
Patera, Anthony T.; Peraire, Jaime; Zang, Thomas A. (Technical Monitor)
2002-01-01
In this work, we have focused on fast bound methods for large scale simulation with application for engineering optimization. The emphasis is on the development of techniques that provide both very fast turnaround and a certificate of fidelity; these attributes ensure that the results are indeed relevant to - and trustworthy within - the engineering context. The bound methodology which underlies this work has many different instantiations: finite element approximation; iterative solution techniques; and reduced-basis (parameter) approximation. In this grant we have, in fact, treated all three, but most of our effort has been concentrated on the first and third. We describe these below briefly - but with a pointer to an Appendix which describes, in some detail, the current "state of the art."
NASA Technical Reports Server (NTRS)
Gradl, Paul
2016-01-01
NASA Marshall Space Flight Center (MSFC) has been advancing dynamic optical measurement systems, primarily Digital Image Correlation, for extreme-environment rocket engine test applications. The Digital Image Correlation (DIC) technology is used to track local and full-field deformations, displacement vectors, and local and global strain measurements. This technology has been evaluated at MSFC from lab testing through full-scale hot-fire engine testing of the J-2X upper-stage engine at Stennis Space Center. It has been shown to provide reliable measurement data and has replaced many traditional measurement techniques for NASA applications. NASA and AMRDEC have recently signed agreements for NASA to train and transition the technology to applications for missile and helicopter testing. This presentation will provide an overview and progression of the technology, various testing applications at NASA MSFC, an overview of Army-NASA test collaborations, and application lessons learned about Digital Image Correlation.
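The core operation behind DIC can be illustrated in a few lines: a small pixel subset from a reference image is located in the deformed image by maximizing a normalized cross-correlation score, and the offset of the best match is the local displacement vector. The sketch below is a minimal, integer-pixel illustration of that idea only; function names and parameters are our own, and production DIC systems add subpixel interpolation, subset shape functions, and full-field meshes.

```python
# Minimal sketch of subset tracking via zero-normalized cross-correlation
# (ZNCC), the matching criterion commonly used in DIC. Illustrative only.
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation of two equal-shaped patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def track_subset(ref, cur, top, left, size, search):
    """Integer-pixel displacement of a size x size subset of `ref` in `cur`."""
    subset = ref[top:top + size, left:left + size]
    best, best_dv, best_du = -2.0, 0, 0
    for dv in range(-search, search + 1):          # vertical search offsets
        for du in range(-search, search + 1):      # horizontal search offsets
            t, l = top + dv, left + du
            if t < 0 or l < 0 or t + size > cur.shape[0] or l + size > cur.shape[1]:
                continue                           # candidate falls off the image
            score = zncc(subset, cur[t:t + size, l:l + size])
            if score > best:
                best, best_dv, best_du = score, dv, du
    return best_dv, best_du   # (vertical, horizontal) displacement in pixels
```

Repeating this for a grid of subsets yields the full-field displacement map from which strains are differentiated.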
Mobility Data Analytics Center.
DOT National Transportation Integrated Search
2016-01-01
Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate large-scale data for smart decision making. Integrating and learning from the massive data are the key to the data engine. The ultimate goal of underst...
Aerospace Systems Design in NASA's Collaborative Engineering Environment
NASA Technical Reports Server (NTRS)
Monell, Donald W.; Piland, William M.
1999-01-01
Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment.
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.
Aerospace Systems Design in NASA's Collaborative Engineering Environment
NASA Astrophysics Data System (ADS)
Monell, Donald W.; Piland, William M.
2000-07-01
Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. 
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.
ERIC Educational Resources Information Center
Berthoud, L.; Gliddon, J.
2018-01-01
In today's global Aerospace industry, virtual workspaces are commonly used for collaboration between geographically distributed multidisciplinary teams. This study investigated the use of wikis to look at communication, collaboration and engagement in 'Capstone' team design projects at the end of an engineering degree. Wikis were set up for teams…
ERIC Educational Resources Information Center
Haruna, Umar Ibrahim
2015-01-01
Collaboration plays a major role in interdisciplinary activities among Science, Technology, Engineering & Mathematics (STEM) disciplines or fields. It also affects the relationships among cluster members on the management team. Although effective collaboration does not guarantee success among STEM disciplines, its absence usually assures…
Human Systems Engineering: A Leadership Model for Collaboration and Change.
ERIC Educational Resources Information Center
Clark, Karen L.
Human systems engineering (HSE) was created to introduce a new way of viewing collaboration. HSE emphasizes the role of leaders who welcome risk, commit to achieving positive change, and help others achieve change. The principles of HSE and its successful application to the collaborative process were illustrated through a case study representing a…
NASA Astrophysics Data System (ADS)
Mease, L.; Gibbs, T.; Adiseshan, T.
2014-12-01
The 2010 Deepwater Horizon disaster required unprecedented engagement and collaboration with scientists from multiple disciplines across government, academia, and industry. Although this spurred the rapid advancement of valuable new scientific knowledge and tools, it also exposed weaknesses in the system of information dissemination and exchange among the scientists from those three sectors. Limited government communication with the broader scientific community complicated the rapid mobilization of the scientific community to assist with spill response, evaluation of impact, and public perceptions of the crisis. The lessons and new laws produced from prior spills such as Exxon Valdez were helpful, but ultimately did not lead to the actions necessary to prepare a suitable infrastructure that would support collaboration with non-governmental scientists. As oil demand pushes drilling into increasingly extreme environments, addressing the challenge of effective, science-based disaster response is an imperative. Our study employs a user-centered design process to 1) understand the obstacles to and opportunity spaces for effective scientific collaboration during environmental crises such as large oil spills, 2) identify possible tools and strategies to enable rapid information exchange between government responders and non-governmental scientists from multiple relevant disciplines, and 3) build a network of key influencers to secure sufficient buy-in for scaled implementation of appropriate tools and strategies. Our methods include user ethnography, complex system mapping, individual and system behavioral analysis, and large-scale system design to identify and prototype a solution to this crisis collaboration challenge. 
In this talk, we will present our insights gleaned from existing analogs of successful scientific collaboration during crises and our initial findings from the 60 targeted interviews we conducted that highlight key collaboration challenges that government agencies, academic research institutions, and industry scientists face during oil spill crises. We will also present a synthesis of leverage points in the system that may amplify the impact of an improved collaboration strategy among scientific stakeholders.
Business and public health collaboration for emergency preparedness in Georgia: a case study.
Buehler, James W; Whitney, Ellen A; Berkelman, Ruth L
2006-11-20
Governments may be overwhelmed by a large-scale public health emergency, such as a massive bioterrorist attack or natural disaster, requiring collaboration with businesses and other community partners to respond effectively. In Georgia, public health officials and members of the Business Executives for National Security have successfully collaborated to develop and test procedures for dispensing medications from the Strategic National Stockpile. Lessons learned from this collaboration should be useful to other public health and business leaders interested in developing similar partnerships. The authors conducted a case study based on interviews with 26 government, business, and academic participants in this collaboration. The partnership is based on shared objectives to protect public health and assure community cohesion in the wake of a large-scale disaster, on the recognition that acting alone neither public health agencies nor businesses are likely to manage such a response successfully, and on the realization that business and community continuity are intertwined. The partnership has required participants to acknowledge and address multiple challenges, including differences in business and government cultures and operational constraints, such as concerns about the confidentiality of shared information, liability, and the limits of volunteerism. The partnership has been facilitated by a business model based on defining shared objectives, identifying mutual needs and vulnerabilities, developing carefully-defined projects, and evaluating proposed project methods through exercise testing. Through collaborative engagement in progressively more complex projects, increasing trust and understanding have enabled the partners to make significant progress in addressing these challenges. 
As a result of this partnership, essential relationships have been established, substantial private resources and capabilities have been engaged in government preparedness programs, and a model for collaborative, emergency mass dispensing of pharmaceuticals has been developed, tested, and slated for expansion. The lessons learned from this collaboration in Georgia should be considered by other government and business leaders seeking to develop similar partnerships.
Business and public health collaboration for emergency preparedness in Georgia: a case study
Buehler, James W; Whitney, Ellen A; Berkelman, Ruth L
2006-01-01
Background Governments may be overwhelmed by a large-scale public health emergency, such as a massive bioterrorist attack or natural disaster, requiring collaboration with businesses and other community partners to respond effectively. In Georgia, public health officials and members of the Business Executives for National Security have successfully collaborated to develop and test procedures for dispensing medications from the Strategic National Stockpile. Lessons learned from this collaboration should be useful to other public health and business leaders interested in developing similar partnerships. Methods The authors conducted a case study based on interviews with 26 government, business, and academic participants in this collaboration. Results The partnership is based on shared objectives to protect public health and assure community cohesion in the wake of a large-scale disaster, on the recognition that acting alone neither public health agencies nor businesses are likely to manage such a response successfully, and on the realization that business and community continuity are intertwined. The partnership has required participants to acknowledge and address multiple challenges, including differences in business and government cultures and operational constraints, such as concerns about the confidentiality of shared information, liability, and the limits of volunteerism. The partnership has been facilitated by a business model based on defining shared objectives, identifying mutual needs and vulnerabilities, developing carefully-defined projects, and evaluating proposed project methods through exercise testing. Through collaborative engagement in progressively more complex projects, increasing trust and understanding have enabled the partners to make significant progress in addressing these challenges. 
Conclusion As a result of this partnership, essential relationships have been established, substantial private resources and capabilities have been engaged in government preparedness programs, and a model for collaborative, emergency mass dispensing of pharmaceuticals has been developed, tested, and slated for expansion. The lessons learned from this collaboration in Georgia should be considered by other government and business leaders seeking to develop similar partnerships. PMID:17116256
Energy Efficient Engine acoustic supporting technology report
NASA Technical Reports Server (NTRS)
Lavin, S. P.; Ho, P. Y.
1985-01-01
The acoustic development of the Energy Efficient Engine combined testing and analysis using scale model rigs and an integrated Core/Low Spool (ICLS) demonstration engine. The scale model tests show that a cut-on blade/vane ratio fan with a large spacing (S/C = 2.3) is as quiet as a cut-off blade/vane ratio fan with a tighter spacing (S/C = 1.27). Scale model mixer tests show that separate flow nozzles are the noisiest and conic nozzles the quietest, with forced mixers in between. Based on projections of ICLS data, the Energy Efficient Engine (E3) has FAR 36 margins of 3.7 EPNdB at approach, 4.5 EPNdB at full power takeoff, and 7.2 EPNdB at sideline conditions.
Analysis of Decision Making Skills for Large Scale Disaster Response
2015-08-21
Capability to influence and collaborate Compassion Teamwork Communication Leadership Provide vision of outcome / set priorities Confidence, courage to make...project evaluates the viability of expanding the use of serious games to augment classroom training, tabletop and full-scale exercises, and actual...training, evaluation, analysis, and technology exploration. Those techniques have found successful niches, but their wider applicability faces
Lessons Learned from the Everglades Collaborative Adaptive Management Program
Recent technical papers explore whether adaptive management (AM) is useful for environmental management and restoration efforts and discuss the many challenges to overcome for successful implementation, especially for large-scale restoration programs (McLain and Lee 1996; Levine ...
Sun, Jianyu; Liang, Peng; Yan, Xiaoxu; Zuo, Kuichang; Xiao, Kang; Xia, Junlin; Qiu, Yong; Wu, Qing; Wu, Shijia; Huang, Xia; Qi, Meng; Wen, Xianghua
2016-04-15
Reducing the energy consumption of membrane bioreactors (MBRs) is highly important for their wider application in wastewater treatment engineering. Of particular significance is reducing aeration in aerobic tanks to reduce the overall energy consumption. This study proposed an in situ ammonia-N-based feedback control strategy for aeration in aerobic tanks; this was tested via model simulation and through a large-scale (50,000 m³/d) engineering application. A full-scale MBR model was developed based on the activated sludge model (ASM) and was calibrated to the actual MBR. The aeration control strategy took the form of a two-step cascaded proportional-integral (PI) feedback algorithm. Algorithmic parameters were optimized via model simulation. The strategy achieved real-time adjustment of aeration amounts based on feedback from effluent quality (i.e., ammonia-N). The effectiveness of the strategy was evaluated through both the model platform and the full-scale engineering application. In the former, the aeration flow rate was reduced by 15-20%. In the engineering application, the aeration flow rate was reduced by 20%, and overall specific energy consumption was correspondingly reduced by 4%, to 0.45 kWh/m³ effluent, using the present practice of regulating the angle of guide vanes of fixed-frequency blowers. Potential energy savings are expected to be higher for MBRs with variable-frequency blowers. This study indicated that the ammonia-N-based aeration control strategy holds promise for application in full-scale MBRs.
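The abstract describes the controller only qualitatively; a minimal sketch of a two-step cascaded PI loop of this general shape might look as follows, where the outer loop converts the effluent ammonia-N error into a dissolved-oxygen (DO) setpoint and the inner loop converts the DO error into an aeration air-flow command. All gains, setpoints, and limits below are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch of a two-step cascaded PI aeration controller.
# Gains, setpoints, and limits are illustrative, not from the study.

def make_pi(kp, ki, out_min, out_max):
    """Discrete PI controller with simple anti-windup clamping."""
    integral = 0.0
    def step(error, dt):
        nonlocal integral
        integral += error * dt
        out = kp * error + ki * integral
        if out > out_max:            # clamp and undo integration (anti-windup)
            integral -= error * dt
            out = out_max
        elif out < out_min:
            integral -= error * dt
            out = out_min
        return out
    return step

# Outer loop: ammonia-N above its setpoint -> raise the DO setpoint (mg O2/L).
nh4_to_do = make_pi(kp=0.5, ki=0.05, out_min=0.5, out_max=3.0)
# Inner loop: DO below its setpoint -> raise the aeration flow rate (m3/h).
do_to_air = make_pi(kp=400.0, ki=40.0, out_min=500.0, out_max=5000.0)

def control_step(nh4_setpoint, nh4_measured, do_measured, dt=1.0):
    """One control interval (dt in minutes): returns an air-flow command."""
    do_setpoint = nh4_to_do(nh4_measured - nh4_setpoint, dt)
    return do_to_air(do_setpoint - do_measured, dt)
```

The output clamps keep the DO setpoint and blower command within plant limits, which also bounds the integral term; a real implementation would additionally filter the ammonia probe signal and rate-limit the blowers.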
Manufacturing Process Developments for Regeneratively-Cooled Channel Wall Rocket Nozzles
NASA Technical Reports Server (NTRS)
Gradl, Paul; Brandsmeier, Will
2016-01-01
Regeneratively cooled channel wall nozzles incorporate a series of integral coolant channels to contain the coolant, maintain adequate wall temperatures, and expand hot gas, providing engine thrust and specific impulse. NASA has been evaluating manufacturing techniques targeting large-scale channel wall nozzles to support affordability of current and future liquid rocket engine nozzles and thrust chamber assemblies. The development of these large-scale manufacturing techniques focuses on liner formation, channel slotting with advanced abrasive water-jet milling techniques, and closeout of the coolant channels to replace or augment other cost reduction techniques being evaluated for nozzles. NASA is developing a series of channel closeout techniques, including large-scale additive manufacturing laser deposition and explosively bonded closeouts. A series of subscale nozzles was completed to evaluate these processes. Fabrication of mechanical test and metallography samples, in addition to subscale hardware, has focused on Inconel 625, 300-series stainless, and aluminum alloys, as well as other candidate materials. Evaluations of these techniques are demonstrating potential for significant cost reductions for large-scale nozzles and chambers. Hot-fire testing using these techniques is planned for the future.
Skate Genome Project: Cyber-Enabled Bioinformatics Collaboration
Vincent, J.
2011-01-01
The Skate Genome Project, a pilot project of the North East Cyberinfrastructure Consortium (NECC), aims to produce a draft genome sequence of Leucoraja erinacea, the Little Skate. The pilot project was also designed to develop expertise in large-scale collaborations across the NECC region. An overview of the bioinformatics and infrastructure challenges faced during the first year of the project will be presented. Results to date and lessons learned from the perspective of a bioinformatics core will be highlighted.
Software Engineering and Swarm-Based Systems
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Sterritt, Roy; Pena, Joaquin; Rouff, Christopher A.
2006-01-01
We discuss two software engineering aspects in the development of complex swarm-based systems. NASA researchers have been investigating various possible concept missions that would greatly advance future space exploration capabilities. The concept mission that we have focused on exploits the principles of autonomic computing as well as being based on the use of intelligent swarms, whereby a (potentially large) number of similar spacecraft collaborate to achieve mission goals. The intent is that such systems not only can be sent to explore remote and harsh environments but also are endowed with greater degrees of protection and longevity to achieve mission goals.
Politis, Christopher E; Mowat, David L; Keen, Deb
2017-06-16
The Canadian Partnership Against Cancer funded 12 large-scale knowledge to action cancer and chronic disease prevention projects between 2009 and 2016 through the Coalitions Linking Action and Science for Prevention (CLASP) initiative. Two projects, Healthy Canada by Design (HCBD) and Children's Mobility, Health and Happiness (CMHH), developed policies to address physical activity and the built environment through a multisectoral approach. A qualitative analysis involving a review of 183 knowledge products and 8 key informant interviews was conducted to understand what policy changes occurred, and the underlying critical success factors, through these projects. Both projects worked at the local level to change physical activity and built environment policy in 203 sites, including municipalities and schools. Both projects brought multisectoral expertise (e.g., public health, land use planning, transportation engineering, education, etc.) together to inform the development of local healthy public policy in the areas of land use, transportation and school travel planning. Through the qualitative analysis of the knowledge products and key informant interviews, 163 policies were attributed to HCBD and CMHH work. Fourteen "pathways to policy" were identified as critical success factors facilitating and accelerating the development and implementation of physical activity and built environment policy. Of the 14 pathways to policy, 8 had a focus on multisectoral collaboration. The lessons learned from the CLASP experience could support enhanced multisectoral collaborations to accelerate the development and implementation of physical activity and built environment policy in new jurisdictions across Canada and internationally.
Combined heat and power supply using Carnot engines
NASA Astrophysics Data System (ADS)
Horlock, J. H.
The Marshall Report on the thermodynamic and economic feasibility of introducing large-scale combined heat and electrical power generation (CHP) into the United Kingdom is summarized. Combinations of reversible power plants (Carnot engines) that meet a given demand for power and heat production are analyzed. The Marshall Report states that fairly large-scale CHP plants are an attractive energy-saving option for areas of high heat-load densities. Analysis shows that, for given requirements, the total heat supply and utilization factor are functions of heat output, reservoir supply temperature, temperature of heat rejected to the reservoir, and an intermediate temperature for district heating.
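The kind of reversible-plant analysis summarized above can be sketched as follows; the symbols are our own assumptions, not taken from the paper. A Carnot engine draws heat $Q_1$ at source temperature $T_1$ and rejects heat at the district-heating supply temperature $T_s$ rather than at ambient $T_0$, trading electrical work for useful heat:

```latex
% Reversible (Carnot) CHP sketch: heat Q_1 drawn at T_1, rejected at the
% district-heating supply temperature T_s > T_0 (ambient). Illustrative only.
W = Q_1\left(1 - \frac{T_s}{T_1}\right), \qquad
Q_H = Q_1\,\frac{T_s}{T_1}
% Heat-to-power ratio of the CHP plant:
\lambda = \frac{Q_H}{W} = \frac{T_s}{T_1 - T_s}
% Work forgone relative to a power-only Carnot plant rejecting at T_0:
\Delta W = Q_1\,\frac{T_s - T_0}{T_1}
```

Raising $T_s$ increases the heat-to-power ratio but reduces work output, which is the trade-off underlying the utilization-factor analysis described in the summary.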
NASA Technical Reports Server (NTRS)
Tolhurst, William H., Jr.; Hickey, David H.; Aoyagi, Kiyoshi
1961-01-01
Wind-tunnel tests have been conducted on a large-scale model of a swept-wing jet transport type airplane to study the factors affecting exhaust gas ingestion into the engine inlets when thrust reversal is used during ground roll. The model was equipped with four small jet engines mounted in nacelles beneath the wing. The tests included studies of both cascade and target type reversers. The data obtained included the free-stream velocity at the occurrence of exhaust gas ingestion in the outboard engine and the increment of drag due to thrust reversal for various modifications of thrust reverser configuration. Motion picture films of smoke flow studies were also obtained to supplement the data. The results show that the free-stream velocity at which ingestion occurred in the outboard engines could be reduced considerably, by simple modifications to the reversers, without reducing the effective drag due to reversed thrust.
NASA Earth Science Education Collaborative
NASA Astrophysics Data System (ADS)
Schwerin, T. G.; Callery, S.; Chambers, L. H.; Riebeek Kohl, H.; Taylor, J.; Martin, A. M.; Ferrell, T.
2016-12-01
The NASA Earth Science Education Collaborative (NESEC) is led by the Institute for Global Environmental Strategies with partners at three NASA Earth science Centers: Goddard Space Flight Center, Jet Propulsion Laboratory, and Langley Research Center. This cross-organization team enables the project to draw from the diverse skills, strengths, and expertise of each partner to develop fresh and innovative approaches for building pathways from NASA's Earth-related STEM assets to large, diverse audiences in order to enhance STEM teaching, learning, and opportunities for learners throughout their lifetimes. These STEM assets include subject matter experts (scientists, engineers, and education specialists), science and engineering content, and authentic participatory and experiential opportunities. Specific project activities include authentic STEM experiences through NASA Earth science themed field campaigns and citizen science as part of the international GLOBE program (for elementary and secondary school audiences) and GLOBE Observer (non-school audiences of all ages); direct connections to learners through innovative collaborations with partners like Odyssey of the Mind, an international creative problem-solving and design competition; and organizing thematic core content and strategically working with external partners and collaborators to adapt and disseminate core content to support the needs of education audiences (e.g., libraries and maker spaces, student research projects, etc.). A scaffolded evaluation is being conducted that 1) assesses processes and implementation, 2) answers formative evaluation questions in order to continuously improve the project, 3) monitors progress, and 4) measures outcomes.
1981-06-01
Large-Scale Operations Management Test of Use of the White Amur for... Report 1 of a series. University of South Florida, Tampa, Dept. of Biology; U.S. Army Engineer Waterways Experiment Station, P.O. Box 631, Vicksburg, Miss.
NASA Technical Reports Server (NTRS)
Topousis, Daria E.; Murphy, Keri; Robinson, Greg
2008-01-01
In 2004, NASA faced major knowledge sharing challenges due to geographically isolated field centers that inhibited personnel from sharing experiences and ideas. Mission failures and new directions for the agency demanded better collaborative tools. In addition, with the push to send astronauts back to the moon and to Mars, NASA recognized that systems engineering would have to improve across the agency. Of the ten field centers, seven had not built a spacecraft in over 30 years and had lost systems engineering expertise. The Systems Engineering Community of Practice came together to capture the knowledge of its members using the suite of collaborative tools provided by the NASA Engineering Network (NEN). The NEN provided a secure collaboration space for over 60 practitioners across the agency to assemble and review a NASA systems engineering handbook. Once the handbook was complete, they used the open community area to disseminate it. This case study explores both the technology and the social networking that made the community possible, describes technological approaches that facilitated rapid setup and low maintenance, provides best practices that other organizations could adopt, and discusses the vision for how this community will continue to collaborate across the field centers to benefit the agency as it continues exploring the solar system.
Research Activities at Fermilab for Big Data Movement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mhashilkar, Parag; Wu, Wenji; Kim, Hyun W
2013-01-01
Adopting 100GE networking infrastructure is the next step toward management of big data. Being the US Tier-1 Center for the Large Hadron Collider's (LHC) Compact Muon Solenoid (CMS) experiment and the central data center for several other large-scale research collaborations, Fermilab has to constantly deal with the scaling and wide-area distribution challenges of big data. In this paper, we will describe some of the challenges involved in the movement of big data over 100GE infrastructure and the research activities at Fermilab to address these challenges.
NASA Astrophysics Data System (ADS)
Benaben, Frederick; Mu, Wenxin; Boissel-Dallier, Nicolas; Barthe-Delanoe, Anne-Marie; Zribi, Sarah; Pingaud, Herve
2015-08-01
The Mediation Information System Engineering project is currently finishing its second iteration (MISE 2.0). The main objective of this scientific project is to provide any emerging collaborative situation with methods and tools to deploy a Mediation Information System (MIS). MISE 2.0 aims at defining and designing a service-based platform, dedicated to initiating and supporting the interoperability of collaborative situations among potential partners. This MISE 2.0 platform implements a model-driven engineering approach to the design of a service-oriented MIS dedicated to supporting the collaborative situation. This approach is structured in three layers, each providing its own key innovations: (i) the gathering of individual and collaborative knowledge to provide appropriate collaborative business behaviour (key point: knowledge management, including semantics, exploitation and capitalisation); (ii) deployment of a mediation information system able to computerise the previously deduced collaborative processes (key point: the automatic generation of collaborative workflows, including connection with existing devices or services); and (iii) the management of the agility of the obtained collaborative network of organisations (key point: supervision of collaborative situations and relevant exploitation of the gathered data). MISE covers business issues (through BPM), technical issues (through an SOA) and agility issues of collaborative situations (through EDA).
The answers are out there! Developing an inclusive approach to collaboration.
Hogg, David R
2016-01-01
Professional isolation is a recurring issue in the delivery of rural and remote health care. However, collaboration is now more feasible with developments in technology and connectivity. At an international scale, collaboration offers clear opportunities for good ideas and great work to be shared across distances and boundaries that previously precluded this. This article reflects a presentation given to the Rethinking Remote conference in Inverness (Scotland) in May 2016. A number of factors with regard to infrastructure and engagement are considered, along with ways in which the opportunities of collaboration between individuals and large centres can be optimised. Social media and increased connectivity pave the way for easier access to great practice across international sites that share similar challenges.
Strohmaier, Markus; Walk, Simon; Pöschko, Jan; Lamprecht, Daniel; Tudorache, Tania; Nyulas, Csongor; Musen, Mark A; Noy, Natalya F
2013-05-01
Traditionally, evaluation methods in the field of semantic technologies have focused on the end result of ontology engineering efforts, mainly, on evaluating ontologies and their corresponding qualities and characteristics. This focus has led to the development of a whole arsenal of ontology-evaluation techniques that investigate the quality of ontologies as a product. In this paper, we aim to shed light on the process of ontology engineering construction by introducing and applying a set of measures to analyze hidden social dynamics. We argue that especially for ontologies which are constructed collaboratively, understanding the social processes that have led to their construction is critical not only in understanding but consequently also in evaluating the ontology. With the work presented in this paper, we aim to expose the texture of collaborative ontology engineering processes that is otherwise left invisible. Using historical change-log data, we unveil qualitative differences and commonalities between different collaborative ontology engineering projects. Explaining and understanding these differences will help us to better comprehend the role and importance of social factors in collaborative ontology engineering projects. We hope that our analysis will spur a new line of evaluation techniques that view ontologies not as the static result of deliberations among domain experts, but as a dynamic, collaborative and iterative process that needs to be understood, evaluated and managed in itself. We believe that advances in this direction would help our community to expand the existing arsenal of ontology evaluation techniques towards more holistic approaches.
Strohmaier, Markus; Walk, Simon; Pöschko, Jan; Lamprecht, Daniel; Tudorache, Tania; Nyulas, Csongor; Musen, Mark A.; Noy, Natalya F.
2013-01-01
Traditionally, evaluation methods in the field of semantic technologies have focused on the end result of ontology engineering efforts, mainly, on evaluating ontologies and their corresponding qualities and characteristics. This focus has led to the development of a whole arsenal of ontology-evaluation techniques that investigate the quality of ontologies as a product. In this paper, we aim to shed light on the process of ontology engineering construction by introducing and applying a set of measures to analyze hidden social dynamics. We argue that especially for ontologies which are constructed collaboratively, understanding the social processes that have led to their construction is critical not only in understanding but consequently also in evaluating the ontology. With the work presented in this paper, we aim to expose the texture of collaborative ontology engineering processes that is otherwise left invisible. Using historical change-log data, we unveil qualitative differences and commonalities between different collaborative ontology engineering projects. Explaining and understanding these differences will help us to better comprehend the role and importance of social factors in collaborative ontology engineering projects. We hope that our analysis will spur a new line of evaluation techniques that view ontologies not as the static result of deliberations among domain experts, but as a dynamic, collaborative and iterative process that needs to be understood, evaluated and managed in itself. We believe that advances in this direction would help our community to expand the existing arsenal of ontology evaluation techniques towards more holistic approaches. PMID:24311994
Engineering uses of physics-based ground motion simulations
Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.
2014-01-01
This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, along with testing and rating methodologies that allow simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities relate both to advancing the science and computational infrastructure needed to produce ground motion simulations and to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.
Austin, Melissa A.; Hair, Marilyn S.; Fullerton, Stephanie M.
2012-01-01
Scientific research has shifted from studies conducted by single investigators to the creation of large consortia. Genetic epidemiologists, for example, now collaborate extensively for genome-wide association studies (GWAS). The effect has been a stream of confirmed disease-gene associations. However, effects on human subjects oversight, data-sharing, publication and authorship practices, research organization and productivity, and intellectual property remain to be examined. The aim of this analysis was to identify all research consortia that had published the results of a GWAS analysis since 2005, characterize them, determine which have publicly accessible guidelines for research practices, and summarize the policies in these guidelines. A review of the National Human Genome Research Institute's Catalog of Published Genome-Wide Association Studies identified 55 GWAS consortia as of April 1, 2011. These consortia comprised individual investigators, research centers, studies, or other consortia and studied 48 different diseases or traits. Only 14 (25%) were found to have publicly accessible research guidelines on consortia websites. The available guidelines provide information on organization, governance, and research protocols; half address institutional review board approval. Details of publication, authorship, data-sharing, and intellectual property vary considerably. Wider access to consortia guidelines is needed to establish appropriate research standards with broad applicability to emerging forms of large-scale collaboration. PMID:22491085
Space transportation booster engine thrust chamber technology, large scale injector
NASA Technical Reports Server (NTRS)
Schneider, J. A.
1993-01-01
The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.
The Kamusi Project Edit Engine: A Tool for Collaborative Lexicography.
ERIC Educational Resources Information Center
Benjamin, Martin; Biersteker, Ann
2001-01-01
Discusses the design and implementation of the Kamusi Project Edit Engine, a Web-based software system uniquely suited to the needs of Swahili collaborative lexicography. Describes the edit engine, including organization of the lexicon and the mechanics by which participants use the system, discusses philosophical issues confronted in the design,…
Collaborating for Success: Team Teaching the Engineering Technical Thesis
ERIC Educational Resources Information Center
Keating, Terrence; Long, Mike
2012-01-01
This paper will examine the collaborative teaching process undertaken at College of the North Atlantic-Qatar (CNA-Q) by Engineering and the Communication faculties to improve the overall quality of engineering students' capstone projects known as the Technical Thesis. The Technical Thesis is divided into two separate components: a proposal stage…
Advanced engineering environment collaboration project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamph, Jane Ann; Pomplun, Alan R.; Kiba, Grant W.
2008-12-01
The Advanced Engineering Environment (AEE) is a model for an engineering design and communications system that will enhance project collaboration throughout the nuclear weapons complex (NWC). Sandia National Laboratories and Parametric Technology Corporation (PTC) worked together on a prototype project to evaluate the suitability of a portion of PTC's Windchill 9.0 suite of data management, design and collaboration tools as the basis for an AEE. The AEE project team implemented Windchill 9.0 development servers in both classified and unclassified domains and used them to test and evaluate the Windchill tool suite relative to the needs of the NWC using weapons project use cases. A primary deliverable was the development of a new real time collaborative desktop design and engineering process using PDMLink (data management tool), Pro/Engineer (mechanical computer aided design tool) and ProductView Lite (visualization tool). Additional project activities included evaluations of PTC's electrical computer aided design, visualization, and engineering calculations applications. This report documents the AEE project work to share information and lessons learned with other NWC sites. It also provides PTC with recommendations for improving their products for NWC applications.
NASA Technical Reports Server (NTRS)
Falarski, M. D.; Aoyagi, K.; Koenig, D. G.
1973-01-01
The upper-surface blown (USB) flap as a powered-lift concept has evolved because of the potential acoustic shielding provided when turbofan engines are installed on a wing upper surface. The results from a wind tunnel investigation of a large-scale USB model powered by two JT15D-1 turbofan engines are presented. The effects of coanda flap extent and deflection, forward speed, and exhaust nozzle configuration were investigated. To determine the wing shielding, the acoustics of a single engine nacelle removed from the model were also measured. Effective shielding occurred in the aft underwing quadrant. In the forward quadrant the shielding of the high frequency noise was counteracted by an increase in the lower frequency wing-exhaust interaction noise. The fuselage provided shielding of the opposite engine noise such that the difference between single and double engine operation was 1.5 PNdB under the wing. The effects of coanda flap deflection and extent, angle of attack, and forward speed were small. Forward speed reduced the perceived noise level (PNL) by reducing the wing-exhaust interaction noise.
A unifying framework for systems modeling, control systems design, and system operation
NASA Technical Reports Server (NTRS)
Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.
2005-01-01
Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition, whether functional, physical, or discipline-based, that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.
NASA's Information Power Grid: Large Scale Distributed Computing and Data Management
NASA Technical Reports Server (NTRS)
Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)
2001-01-01
Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.
A quality assessment tool for markup-based clinical guidelines.
Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan
2008-11-06
We introduce a tool for quality assessment of procedural and declarative knowledge. We developed this tool for evaluating the specification of mark-up-based clinical guidelines (GLs). Using this graphical tool, the expert physician and knowledge engineer collaborate to score, using a pre-defined scoring scale, each of the knowledge roles of the mark-ups against a gold standard. The tool enables different users at different locations to score the mark-ups simultaneously.
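The gold-standard comparison described in this abstract can be sketched roughly as follows. This is a hypothetical illustration only: the knowledge-role names and the 0-2 agreement scale are assumptions for the example, not the tool's actual interface or scoring scheme.

```python
# Hypothetical sketch of scoring guideline mark-ups against a gold standard.
# Knowledge roles and the 0-2 scale are illustrative assumptions.

def score_markup(markup, gold, scale=(0, 1, 2)):
    """Score each knowledge role of a mark-up against a gold standard.

    Returns a dict mapping knowledge role -> score on the predefined scale:
    scale[2] = exact match, scale[1] = partial overlap, scale[0] = no overlap.
    """
    scores = {}
    for role, gold_terms in gold.items():
        marked = set(markup.get(role, []))
        gold_set = set(gold_terms)
        if marked == gold_set:
            scores[role] = scale[2]   # full agreement with gold standard
        elif marked & gold_set:
            scores[role] = scale[1]   # partial agreement
        else:
            scores[role] = scale[0]   # no agreement
    return scores

gold = {"condition": ["fever", "age>65"], "action": ["administer antibiotics"]}
markup = {"condition": ["fever", "age>65"], "action": ["order imaging"]}
print(score_markup(markup, gold))  # {'condition': 2, 'action': 0}
```

Per-role scores from several distributed raters could then be aggregated to measure agreement with the gold standard, in the spirit of the multi-site scoring the abstract describes.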
NASA Technical Reports Server (NTRS)
Kamhawi, Hilmi N.
2013-01-01
This report documents the work performed during the period from May 2011 - October 2012 on the Integrated Design and Engineering Analysis (IDEA) environment. IDEA is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML). This report will focus on describing the work done in the areas of: (1) Integrating propulsion data (turbines, rockets, and scramjets) in the system, and using the data to perform trajectory analysis; (2) Developing a parametric packaging strategy for a hypersonic air breathing vehicles allowing for tank resizing when multiple fuels and/or oxidizer are part of the configuration; and (3) Vehicle scaling and closure strategies.
Status and Prospects for Indirect Dark Matter Searches with the Fermi Large Area Telescope
NASA Astrophysics Data System (ADS)
Charles, Eric; Fermi-LAT Collaboration
2014-01-01
During the first five years of operation of the Fermi Large Area Telescope (LAT) the LAT collaboration has performed numerous searches for signatures of Dark Matter interactions in both gamma-ray and cosmic-ray data. These searches feature many different target types, including dwarf spheroidal galaxies, galaxy clusters, the Milky Way halo and inner Galaxy and unassociated LAT sources. They make use of a variety of techniques, and have been performed in both the spatial and spectral domains, as well as via less conventional strategies such as examining the potential Dark Matter contribution to both large scale and small scale anisotropies. To date no clear gamma-ray or cosmic-ray signal from dark matter annihilation or decay has been observed, and the deepest current limits for annihilation exclude many Dark Matter particle models with the canonical thermal relic cross section and masses up to 30 GeV. In this contribution we will briefly review the status of each of the searches by the LAT collaboration. We will also discuss the limiting factors for the various search strategies and examine the prospects for the future.
Becker, Christian M.; Laufer, Marc R.; Stratton, Pamela; Hummelshoj, Lone; Missmer, Stacey A.; Zondervan, Krina T.; Adamson, G. David; Adamson, G.D.; Allaire, C.; Anchan, R.; Becker, C.M.; Bedaiwy, M.A.; Buck Louis, G.M.; Calhaz-Jorge, C.; Chwalisz, K.; D'Hooghe, T.M.; Fassbender, A.; Faustmann, T.; Fazleabas, A.T.; Flores, I.; Forman, A.; Fraser, I.; Giudice, L.C.; Gotte, M.; Gregersen, P.; Guo, S.-W.; Harada, T.; Hartwell, D.; Horne, A.W.; Hull, M.L.; Hummelshoj, L.; Ibrahim, M.G.; Kiesel, L.; Laufer, M.R.; Machens, K.; Mechsner, S.; Missmer, S.A.; Montgomery, G.W.; Nap, A.; Nyegaard, M.; Osteen, K.G.; Petta, C.A.; Rahmioglu, N.; Renner, S.P.; Riedlinger, J.; Roehrich, S.; Rogers, P.A.; Rombauts, L.; Salumets, A.; Saridogan, E.; Seckin, T.; Stratton, P.; Sharpe-Timms, K.L.; Tworoger, S.; Vigano, P.; Vincent, K.; Vitonis, A.F.; Wienhues-Thelen, U.-H.; Yeung, P.P.; Yong, P.; Zondervan, K.T.
2014-01-01
Objective To standardize the recording of surgical phenotypic information on endometriosis and related sample collections obtained at laparoscopy, allowing large-scale collaborative research into the condition. Design An international collaboration involving 34 clinical/academic centers and three industry collaborators from 16 countries. Setting Two workshops were conducted in 2013, bringing together 54 clinical, academic, and industry leaders in endometriosis research and management worldwide. Patient(s) None. Intervention(s) A postsurgical scoring sheet containing general and gynecological patient and procedural information, extent of disease, the location and type of endometriotic lesion, and any other findings was developed during several rounds of review. Comments and any systematic surgical data collection tools used in the reviewers' centers were incorporated. Main Outcome Measure(s) The development of a standard recommended (SSF) and minimum required (MSF) form to collect data on the surgical phenotype of endometriosis. Result(s) SSF and MSF include detailed descriptions of lesions, modes of procedures and sample collection, comorbidities, and potential residual disease at the end of surgery, along with previously published instruments such as the revised American Society for Reproductive Medicine and Endometriosis Fertility Index classification tools for comparison and validation. Conclusion(s) This is the first multicenter, international collaboration between academic centers and industry addressing standardization of phenotypic data collection for a specific disease. The Endometriosis Phenome and Biobanking Harmonisation Project SSF and MSF are essential tools to increase our understanding of the pathogenesis of endometriosis by allowing large-scale collaborative research into the condition. PMID:25150390
Decoupling local mechanics from large-scale structure in modular metamaterials.
Yang, Nan; Silverberg, Jesse L
2017-04-04
A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such "inverse design" is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module's design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.
Decoupling local mechanics from large-scale structure in modular metamaterials
NASA Astrophysics Data System (ADS)
Yang, Nan; Silverberg, Jesse L.
2017-04-01
A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.
Dynamic Open-Rotor Composite Shield Impact Test Report
NASA Technical Reports Server (NTRS)
Seng, Silvia; Frankenberger, Charles; Ruggeri, Charles R.; Revilock, Duane M.; Pereira, J. Michael; Carney, Kelly S.; Emmerling, William C.
2015-01-01
The Federal Aviation Administration (FAA) is working with the European Aviation Safety Agency to determine the certification base for proposed new engines that would not have a containment structure on large commercial aircraft. Equivalent safety to the current fleet is desired by the regulators, which means that loss of a single fan blade will not cause hazard to the aircraft. NASA Glenn and Naval Air Warfare Center (NAWC) China Lake collaborated with the FAA Aircraft Catastrophic Failure Prevention Program to design and test a shield that would protect the aircraft passengers and critical systems from a released blade that could impact the fuselage. This report documents the live-fire test from a full-scale rig at NAWC China Lake. NASA provided manpower and photogrammetry expertise to document the impact and damage to the shields. The test was successful: the blade was stopped from penetrating the shield, which validates the design analysis method and the parameters used in the analysis. Additional work is required to implement the shielding into the aircraft.
Technology transfer and commercialization initiatives at TRI/Austin: Resources and examples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matzkanin, G.A.; Dingus, M.L.
1995-12-31
Located at TRI/Austin, and operated under a Department of Defense contract, is the Nondestructive Testing Information Analysis Center (NTIAC). This is a full service Information Analysis Center sponsored by the Defense Technical Information Center (DTIC), although services of NTIAC are available to other government agencies, government contractors, industry and academia. The principal objective of NTIAC is to help increase the productivity of the nation's scientists, engineers, and technical managers involved in, or requiring, nondestructive testing by providing broad information analysis services of technical excellence. TRI/Austin is actively pursuing commercialization of several products based on results from outside funded R&D programs. As a small business, TRI/Austin has limited capabilities for large scale fabrication, production, marketing or distribution. Thus, part of a successful commercialization process involves making appropriate collaboration arrangements with other organizations to augment TRI/Austin's capabilities. Brief descriptions are given here of two recent commercialization efforts at TRI/Austin.
Challenges and opportunities in synthetic biology for chemical engineers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, YZ; Lee, JK; Zhao, HM
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement.
Challenges and opportunities in synthetic biology for chemical engineers
Luo, Yunzi; Lee, Jung-Kul; Zhao, Huimin
2012-01-01
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement. PMID:24222925
Challenges and opportunities in synthetic biology for chemical engineers.
Luo, Yunzi; Lee, Jung-Kul; Zhao, Huimin
2013-11-15
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement.
The Snowmastodon Project: cutting-edge science on the blade of a bulldozer
Pigati, Jeffery S.; Miller, Ian M.; Johnson, Kirk R.
2015-01-01
Cutting-edge science happens at a variety of scales, from the individual and intimate to the large-scale and collaborative. The publication of a special issue of Quaternary Research in Nov. 2014 dedicated to the scientific findings of the “Snowmastodon Project” highlights what can be done when natural history museums, governmental agencies, and academic institutions work toward a common goal.
Remote access laboratories in Australia and Europe
NASA Astrophysics Data System (ADS)
Ku, H.; Ahfock, T.; Yusaf, T.
2011-06-01
Remote access laboratories (RALs) were first developed in 1994 in Australia and Switzerland. The main purposes of developing them are to enable students to do their experiments at their own pace, time and location, and to enable students and teaching staff to access facilities beyond their institutions. Currently, most of the experiments carried out through RALs in Australia are heavily biased towards the electrical, electronic and computer engineering disciplines. The experiments carried out through RALs in Europe, however, had more variety: in addition to the traditional electrical, electronic and computer engineering disciplines, there were experiments in the mechanical and mechatronic disciplines. It was found that RALs are now being developed aggressively in Australia and Europe, and it can be argued that RALs will develop further and faster in the future with improving Internet technology. The rising costs of real experimental equipment will also speed up their development, because making the equipment remotely accessible allows the cost to be shared by more universities or institutions, improving cost-effectiveness. Their development should be particularly rapid in large countries with small populations, such as Australia, Canada and Russia, because of economies of scale. Reusability of software, interoperability in software implementation, computer-supported collaborative learning and convergence with learning management systems are the required developments for future RALs.
Almuneef, Maha A; Qayad, Mohamed; Noor, Ismail K; Al-Eissa, Majid A; Albuhairan, Fadia S; Inam, Sarah; Mikton, Christopher
2014-03-01
There has been increased awareness of child maltreatment (CM) in Saudi Arabia recently. This study assessed the readiness for implementing large-scale, evidence-based child maltreatment prevention programs in Saudi Arabia. Key informants, who were key decision makers and senior managers in the field of child maltreatment, were invited to participate in the study. A multidimensional tool, developed by WHO and collaborators from several middle- and low-income countries, was used to assess 10 dimensions of readiness. A group of experts also gave an objective assessment of the 10 dimensions, and the key informants' and experts' scores were compared. On a scale of 100, the key informants gave a readiness score of 43% for Saudi Arabia to implement large-scale, evidence-based CM prevention programs, and the experts gave an overall readiness score of 40%. Both the key informants and experts agreed that four of the dimensions (attitudes toward child maltreatment prevention; institutional links and resources; material resources; and human and technical resources) had low readiness scores (<5 each) and that three dimensions (knowledge of child maltreatment prevention; scientific data on child maltreatment prevention; and will to address the child maltreatment problem) had high readiness scores (≥5 each). There was significant disagreement between key informants and experts on the remaining three dimensions. Overall, Saudi Arabia has a moderate/fair readiness to implement large-scale child maltreatment prevention programs. Capacity building, strengthening of material resources, and improving institutional links, collaborations, and attitudes toward the child maltreatment problem are required to improve the country's readiness to implement such programs. Copyright © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo
2004-06-01
In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.
Developing A Large-Scale, Collaborative, Productive Geoscience Education Network
NASA Astrophysics Data System (ADS)
Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.
2012-12-01
Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources.
Building Strong Geoscience Departments sought to create, for departments, the same type of shared information base that was supporting individual faculty. The Teach the Earth portal and its underlying web development tools were used by NSF-funded projects in education to disseminate their results. Leveraging these funded efforts, the Climate Literacy Network has expanded this geoscience education community to include individuals broadly interested in fostering climate literacy. Most recently, the InTeGrate project is implementing inter-institutional collaborative authoring, testing and evaluation of curricular materials. While these projects represent only a fraction of the activity in geoscience education, they are important drivers in the development of a large, national, coherent geoscience education network with the ability to collaborate and disseminate information effectively. Importantly, the community is open and defined by active participation. Key mechanisms for engagement have included alignment of project activities with participants' needs and goals; productive face-to-face and virtual workshops, events, and series; stipends for completion of large products; and strong supporting staff to keep projects moving and assist with product production. One measure of its success is the adoption and adaptation of resources and models by emerging projects, which results in the continued growth of the network.
Reverse engineering and analysis of large genome-scale gene networks
Aluru, Maneesha; Zola, Jaroslaw; Nettleton, Dan; Aluru, Srinivas
2013-01-01
Reverse engineering the whole-genome networks of complex multicellular organisms continues to remain a challenge. While simpler models easily scale to large numbers of genes and gene expression datasets, more accurate models are compute intensive, limiting their scale of applicability. To enable fast and accurate reconstruction of large networks, we developed Tool for Inferring Network of Genes (TINGe), a parallel mutual information (MI)-based program. The novel features of our approach include: (i) a B-spline-based formulation for linear-time computation of MI, (ii) a novel algorithm for direct permutation testing and (iii) development of parallel algorithms to reduce run-time and facilitate construction of large networks. We assess the quality of our method by comparison with ARACNe (Algorithm for the Reconstruction of Accurate Cellular Networks) and GeneNet and demonstrate its unique capability by reverse engineering the whole-genome network of Arabidopsis thaliana from 3137 Affymetrix ATH1 GeneChips in just 9 min on a 1024-core cluster. We further report on the development of a new software tool, Gene Network Analyzer (GeNA), for extracting context-specific subnetworks from a given set of seed genes. Using TINGe and GeNA, we performed analysis of 241 Arabidopsis AraCyc 8.0 pathways, and the results are made available through the web. PMID:23042249
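The mutual-information approach summarized in this abstract can be illustrated at toy scale. The sketch below uses a plain histogram MI estimator with a fixed threshold rather than the paper's B-spline formulation and permutation testing; the function names and threshold value are illustrative assumptions, not part of TINGe:

```python
# Minimal sketch of MI-based gene network inference, in the spirit of
# tools like TINGe/ARACNe. Histogram MI estimator; names are hypothetical.
import numpy as np

def mutual_information(x, y, bins=8):
    """Estimate MI (in nats) between two expression profiles via a 2D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginal of x
    py = pxy.sum(axis=0, keepdims=True)            # marginal of y
    nz = pxy > 0                                   # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def infer_network(expr, threshold=0.5, bins=8):
    """expr: (genes, samples) matrix. Returns edges (i, j) whose MI exceeds threshold."""
    n = expr.shape[0]
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if mutual_information(expr[i], expr[j], bins) > threshold:
                edges.add((i, j))
    return edges
```

In practice an edge is kept only if its MI is significant under permutation testing, and the all-pairs loop is what the paper parallelizes across cores.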
High Thrust-to-Power Annular Engine Technology
NASA Technical Reports Server (NTRS)
Patterson, Michael J.; Thomas, Robert E.; Crofton, Mark W.; Young, Jason A.; Foster, John E.
2015-01-01
Gridded ion engines have the highest efficiency and total impulse of any mature electric propulsion technology, and have been successfully implemented for primary propulsion in both geocentric and heliocentric environments with excellent ground/in-space correlation of performance. However, they have not been optimized to maximize thrust-to-power, an important parameter for Earth orbit transfer applications. This publication discusses technology development work intended to maximize this parameter. These activities include investigating the capabilities of a non-conventional design approach, the annular engine, which has the potential of exceeding the thrust-to-power of other EP technologies. This publication discusses the status of this work, including the fabrication and initial tests of a large-area annular engine. This work is being conducted in collaboration among NASA Glenn Research Center, The Aerospace Corporation, and the University of Michigan.
Airframe-Jet Engine Integration Noise
NASA Technical Reports Server (NTRS)
Tam, Christopher; Antcliff, Richard R. (Technical Monitor)
2003-01-01
It has been found experimentally that the noise radiated by a jet mounted under the wing of an aircraft exceeds that of the same jet in a stand-alone environment. The increase in noise is referred to as jet engine airframe integration noise. The objectives of the present investigation are: (1) to obtain a better understanding of the physical mechanisms responsible for jet engine airframe integration noise, or installation noise; and (2) to develop a prediction model for jet engine airframe integration noise. It is known that jet mixing noise consists of two principal components: the noise from the large turbulence structures of the jet flow and the noise from the fine-scale turbulence. In this investigation, only the effect of jet engine airframe interaction on the fine-scale turbulence noise of a jet is studied. The fine-scale turbulence noise is the dominant noise component in the sideline direction. Thus we limit our consideration primarily to the sideline.
Neuroscience thinks big (and collaboratively).
Kandel, Eric R; Markram, Henry; Matthews, Paul M; Yuste, Rafael; Koch, Christof
2013-09-01
Despite cash-strapped times for research, several ambitious collaborative neuroscience projects have attracted large amounts of funding and media attention. In Europe, the Human Brain Project aims to develop a large-scale computer simulation of the brain, whereas in the United States, the Brain Activity Map is working towards establishing a functional connectome of the entire brain, and the Allen Institute for Brain Science has embarked upon a 10-year project to understand the mouse visual cortex (the MindScope project). US President Barack Obama's announcement of the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies Initiative) in April 2013 highlights the political commitment to neuroscience and is expected to further foster interdisciplinary collaborations, accelerate the development of new technologies and thus fuel much needed medical advances. In this Viewpoint article, five prominent neuroscientists explain the aims of the projects and how they are addressing some of the questions (and criticisms) that have arisen.
Social Work and Engineering Collaboration: Forging Innovative Global Community Development Education
ERIC Educational Resources Information Center
Gilbert, Dorie J.
2014-01-01
Interdisciplinary programs in schools of social work are growing in scope and number. This article reports on collaboration between a school of social work and a school of engineering, which is forging a new area of interdisciplinary education. The program engages social work students working alongside engineering students in a team approach to…
ERIC Educational Resources Information Center
Celedón-Pattichis, Sylvia; LópezLeiva, Carlos Alfonso; Pattichis, Marios S.; Llamocca, Daniel
2013-01-01
There is a strong need in the United States to increase the number of students from underrepresented groups who pursue careers in Science, Technology, Engineering, and Mathematics. Drawing from sociocultural theory, we present approaches to establishing collaborations between computer engineering and mathematics/bilingual education faculty to…
A MANAGEMENT SUPPORT SYSTEM FOR GREAT LAKES COASTAL WETLANDS
The Great Lakes National Program Office in conjunction with the Great Lakes Commission and other researchers is leading a large scale collaborative effort that will yield, in unprecedented detail, a management support system for Great Lakes coastal wetlands. This entails the dev...
NASA Astrophysics Data System (ADS)
Berthoud, L.; Gliddon, J.
2018-03-01
In today's global aerospace industry, virtual workspaces are commonly used for collaboration between geographically distributed multidisciplinary teams. This study investigated the use of wikis to look at communication, collaboration and engagement in 'Capstone' team design projects at the end of an engineering degree. Wikis were set up for teams of engineering students from different disciplinary backgrounds and years. The students' perceptions of the usefulness of the tool were surveyed, and the user contribution statistics and content categorisation were analysed for a case study wiki. Recommendations and lessons learned for the deployment of wikis are provided for interested academic staff from other institutions. Wikis were found to be of limited use for investigating levels of communication and collaboration in this study, but may be of interest in other contexts. Wikis were considered a potentially useful tool to track engagement in Capstone design projects in engineering subjects.
Project-based learning with international collaboration for training biomedical engineers.
Krishnan, Shankar
2011-01-01
Training biomedical engineers while effectively keeping up with fast-paced scientific breakthroughs and the growth in technical innovations poses arduous challenges for educators. Traditional pedagogical methods are employed to cope with the increasing demands of biomedical engineering (BME) training, and continuous improvements have been attempted with some success. Project-based learning (PBL) is an academic approach that challenges students by having them carry out interdisciplinary projects aimed at accomplishing a wide range of student learning outcomes. PBL has been shown to be effective in the medical field and has been adopted by other fields, including engineering. The impact of globalization on healthcare appears to be steadily increasing, which necessitates including awareness of relevant international activities in the curriculum. Numerous difficulties are encountered in forming a collaborative team, and additional difficulties occur as the collaboration is extended to international partners. Understanding and agreement of responsibilities becomes somewhat complex, and hence the collaborative project has to be planned and executed with clear understanding by all partners and participants. A model for training BME students by adopting PBL with international collaboration is proposed. The results of previous BME project work with international collaboration fit partially into the model. There were many logistic issues and constraints; however, the collaborative projects themselves greatly enhanced the student learning outcomes. This PBL type of learning experience tends to promote long-term retention of multidisciplinary material and foster higher-order cognitive activities such as analysis, synthesis and evaluation. In addition to introducing students to experiences encountered in the real-life workforce, the proposed approach enhances the development of professional contacts and global networking.
In conclusion, despite initial challenges, adopting project-based learning with international collaboration has strong potential to be valuable in the training of biomedical engineering students.
NASA Astrophysics Data System (ADS)
Xu, Chuanfu; Deng, Xiaogang; Zhang, Lilun; Fang, Jianbin; Wang, Guangxue; Jiang, Yi; Cao, Wei; Che, Yonggang; Wang, Yongxian; Wang, Zhenghua; Liu, Wei; Cheng, Xinghua
2014-12-01
Programming and optimizing complex, real-world CFD codes on current many-core accelerated HPC systems is very challenging, especially when CPUs and accelerators must collaborate to fully tap the potential of heterogeneous systems. In this paper, with a tri-level hybrid and heterogeneous programming model using MPI + OpenMP + CUDA, we port and optimize our high-order multi-block structured CFD software HOSTA on the GPU-accelerated TianHe-1A supercomputer. HOSTA adopts two self-developed high-order compact finite difference schemes, WCNS and HDCS, that can simulate flows with complex geometries. We present a dual-level parallelization scheme for efficient multi-block computation on GPUs and perform particular kernel optimizations for high-order CFD schemes. The GPU-only approach achieves a speedup of about 1.3 when comparing one Tesla M2050 GPU with two Xeon X5670 CPUs. To achieve a greater speedup, we use the CPU and GPU collaboratively for HOSTA instead of using a naive GPU-only approach. We present a novel scheme to balance the loads between the store-poor GPU and the store-rich CPU. Taking CPU and GPU load balance into account, we improve the maximum simulation problem size per TianHe-1A node for HOSTA by 2.3×, while the collaborative approach improves performance by around 45% compared to the GPU-only approach. Further, to scale HOSTA on TianHe-1A, we propose a gather/scatter optimization to minimize PCI-e data transfer times for ghost and singularity data of 3D grid blocks, and overlap the collaborative computation and communication as far as possible using some advanced CUDA and MPI features. Scalability tests show that HOSTA can achieve a parallel efficiency of above 60% on 1024 TianHe-1A nodes. With our method, we have successfully simulated an EET high-lift airfoil configuration containing 800M cells and China's large civil airplane configuration containing 150M cells.
To the best of our knowledge, these are the largest-scale CPU-GPU collaborative simulations that solve realistic CFD problems with both complex configurations and high-order schemes.
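The CPU-GPU load-balancing idea described in this abstract can be sketched in miniature. The function below splits grid cells between devices in proportion to their relative throughput, spilling any overflow beyond GPU memory capacity to the memory-rich CPU, which is what lets a node hold a larger problem; the interface and numbers are hypothetical, not taken from HOSTA:

```python
# Illustrative static load partitioning for heterogeneous CPU+GPU execution.
# Throughput figures and memory limits here are hypothetical, not measured values.

def partition_cells(total_cells, gpu_tput, cpu_tput, gpu_mem_cells):
    """Split a cell count between a fast, memory-poor GPU and a slower, memory-rich CPU.

    Cells are first divided in proportion to device throughput; if the GPU
    share would exceed its memory capacity, the excess spills to the CPU.
    Returns (gpu_cells, cpu_cells).
    """
    gpu_share = int(total_cells * gpu_tput / (gpu_tput + cpu_tput))
    gpu_cells = min(gpu_share, gpu_mem_cells)  # cap at GPU memory capacity
    cpu_cells = total_cells - gpu_cells        # remainder runs on the CPU
    return gpu_cells, cpu_cells
```

A production scheme would rebalance from measured per-step timings rather than fixed throughput ratios, but the capacity-capped proportional split captures the core idea.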
NASA Astrophysics Data System (ADS)
Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.
2014-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. 
Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
NASA Astrophysics Data System (ADS)
Ricco, George Dante
In higher education and in engineering education in particular, changing majors is generally considered a negative event - or at least an event with negative consequences. An emergent field of study within engineering education revolves around understanding the factors and processes driving student changes of major. Of key importance to further the field of change of major research is a grasp of large scale phenomena occurring throughout multiple systems, knowledge of previous attempts at describing such issues, and the adoption of metrics to probe them effectively. The problem posed is exacerbated by the drive in higher education institutions and among state legislatures to understand and reduce time-to-degree and student attrition. With these factors in mind, insights into large-scale processes that affect student progression are essential to evaluating the success or failure of programs. The goals of this work include describing the current educational research on switchers, identifying core concepts and stumbling blocks in my treatment of switchers, and using the Multiple Institutional Database for Investigating Engineering Longitudinal Development (MIDFIELD) to explore how those who change majors perform as a function of large-scale academic pathways within and without the engineering context. To accomplish these goals, it was first necessary to delve into a recent history of the treatment of switchers within the literature and categorize their approach. While three categories of papers exist in the literature concerning change of major, all three may or may not be applicable to a given database of students or even a single institution. Furthermore, while the term has been coined in the literature, no portable metric for discussing large-scale navigational flexibility exists in engineering education. What such a metric would look like will be discussed as well as the delimitations involved. 
The results and subsequent discussion include a description of changes of major and how they may or may not have a deleterious effect on a student's academic pathway; the special context of changes of major in the pathways of students within first-year engineering programs and of students labeled as undecided; an exploration of curricular flexibility through the construction of a novel metric; and proposed future work.
Women in Engineering in Turkey--A Large Scale Quantitative and Qualitative Examination
ERIC Educational Resources Information Center
Smith, Alice E.; Dengiz, Berna
2010-01-01
The underrepresentation of women in engineering is well known and unresolved. However, Turkey has witnessed a shift in trend from virtually no female participation in engineering to across-the-board proportions that dominate other industrialised countries within the 76 years of the founding of the Turkish Republic. This paper describes the largest…
1982-08-01
Large-Scale Operations Management Test of Use of the White Amur. University of Florida, Gainesville, Department of Environmental Engineering. The study concerns the Conway ecosystem and is part of the Large-Scale Operations Management Test (LSOMT) of the Aquatic Plant Control Research Program (APCRP) at the WES. The report should be cited as follows: Blancher, E. C., II, and Fellows, C. R. 1982. "Large-Scale Operations Management Test of Use of the White Amur for Control...
Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.
Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T
2017-01-01
Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.
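The iterative reconstruction kernels that an engine like Trace parallelizes across threads and nodes can be illustrated at toy scale by a Landweber-style update, a simplified relative of SIRT. The sketch below operates on a small dense system rather than a real sinogram, and it omits Trace's replicated reconstruction objects and parallelization entirely; the function name and parameters are illustrative assumptions:

```python
# Toy sketch of an iterative (Landweber/SIRT-style) reconstruction loop.
# A is a stand-in for the tomographic projection operator, b for measured data.
import numpy as np

def sirt(A, b, iters=2000, relax=0.05):
    """Iteratively solve A x = b by repeated back-projection of the residual."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + relax * A.T @ (b - A @ x)  # step toward reducing ||A x - b||
    return x
```

Each iteration is one forward projection and one back projection; in a real engine those two products over a large sinogram are exactly the work distributed across cores and nodes.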
NASA Technical Reports Server (NTRS)
Sanders, Bobby W.; Weir, Lois J.
2008-01-01
A new hypersonic inlet for a turbine-based combined-cycle (TBCC) engine has been designed. This split-flow inlet is designed to provide flow to an over-under propulsion system with turbofan and dual-mode scramjet engines for flight from takeoff to Mach 7. It utilizes a variable-geometry ramp, high-speed cowl lip rotation, and a rotating low-speed cowl that serves as a splitter to divide the flow between the low-speed turbofan and the high-speed scramjet and to isolate the turbofan at high Mach numbers. The low-speed inlet was designed for Mach 4, the maximum mode transition Mach number. Integration of the Mach 4 inlet into the Mach 7 inlet imposed significant constraints on the low-speed inlet design, including a large amount of internal compression. The inlet design was used to develop mechanical designs for two inlet mode transition test models: small-scale (IMX) and large-scale (LIMX) research models. The large-scale model is designed to facilitate multi-phase testing including inlet mode transition and inlet performance assessment, controls development, and integrated systems testing with turbofan and scramjet engines.
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1980-01-01
To respond to national needs for improved productivity in engineering design and manufacturing, a NASA-supported joint industry/government project, Integrated Programs for Aerospace-Vehicle Design (IPAD), is underway. The objective is to improve engineering productivity through better use of computer technology. The project focuses on the development of technology and associated software for integrated, company-wide management of engineering information. It has been underway since 1976 under the guidance of an Industry Technical Advisory Board (ITAB), composed of representatives of major engineering and computer companies, and in close collaboration with the Air Force Integrated Computer-Aided Manufacturing (ICAM) program. Results to date include in-depth documentation of a representative design process for a large engineering project, the definition and design of computer-aided design software needed to support that process, and the release of prototype software to integrate selected design functions. Ongoing work concentrates on the development of prototype software to manage engineering information, and initial software is nearing release.
Mitragotri, S
2013-01-01
Transdermal drug delivery continues to provide an advantageous route of drug administration over injections. While the number of drugs delivered by passive transdermal patches has increased over the years, no macromolecule is currently delivered by the transdermal route. Substantial research efforts have been dedicated by a large number of researchers representing varied disciplines including biology, chemistry, pharmaceutics and engineering to understand, model and overcome the skin's barrier properties. This article focuses on engineering contributions to the field of transdermal drug delivery. The article pays tribute to Prof. Robert Langer, who pioneered the engineering approach towards transdermal drug delivery. Over a period spanning nearly 25 years since his first publication in the field of transdermal drug delivery, Bob Langer has deeply impacted the field by quantitative analysis and innovative engineering. At the same time, he has inspired several generations of engineers by collaborations and mentorship. His scientific insights, innovative technologies, translational efforts and dedicated mentorship have transformed the field. © 2013 S. Karger AG, Basel.
ERIC Educational Resources Information Center
Ingram, Sandra; Parker, Anne
2002-01-01
Profiles two women from student engineering teams who participated in a study on collaboration and the role of gender. Shows that men and women alike displayed both gender-linked and non-gender-linked behavior, and that successful collaboration was influenced less by gender and more by such factors as a strong work ethic, team commitment, and…
NASA Technical Reports Server (NTRS)
Chattopadhyay, Debarati; Hihn, Jairus; Warfield, Keith
2011-01-01
As aerospace missions grow larger and more technically complex in the face of ever tighter budgets, it will become increasingly important to use concurrent engineering methods in the development of early conceptual designs because of their ability to facilitate rapid assessments and trades in a cost-efficient manner. To successfully accomplish these complex missions with limited funding, it is also essential to effectively leverage the strengths of individuals and teams across government, industry, academia, and international agencies by increased cooperation between organizations. As a result, the existing concurrent engineering teams will need to increasingly engage in distributed collaborative concurrent design. This paper is an extension of a recent white paper written by the Concurrent Engineering Working Group, which details the unique challenges of distributed collaborative concurrent engineering. This paper includes a short history of aerospace concurrent engineering, and defines the terms 'concurrent', 'collaborative' and 'distributed' in the context of aerospace concurrent engineering. In addition, a model for the levels of complexity of concurrent engineering teams is presented to provide a way to conceptualize information and data flow within these types of teams.
Volk, Carol J; Lucero, Yasmin; Barnas, Katie
2014-05-01
Increasingly, research and management in natural resource science rely on very large datasets compiled from multiple sources. While it is generally good to have more data, utilizing large, complex datasets has introduced challenges in data sharing, especially for collaborating researchers in disparate locations ("distributed research teams"). We surveyed natural resource scientists about common data-sharing problems. The major issue identified by our survey respondents (n = 118) when providing data was lack of clarity in the data request (including the format of the data requested). When receiving data, survey respondents reported various insufficiencies in the documentation describing the data (e.g., no description of data collection, no protocol, or data aggregated or summarized without explanation). Since metadata, or "information about the data," is a central obstacle to efficient data handling, we suggest documenting metadata through data dictionaries, protocols, read-me files, explicit null-value documentation, and process metadata as essential to any large-scale research program. We advocate that all researchers, but especially those involved in distributed teams, alleviate these problems with several readily available communication strategies, including organizational charts to define roles, data-flow diagrams to outline procedures and timelines, and data-update cycles to guide data-handling expectations. In particular, we argue that distributed research teams magnify data-sharing challenges, making data management training even more crucial for natural resource scientists. If natural resource scientists fail to overcome communication and metadata-documentation issues, then negative data-sharing experiences will likely continue to undermine the success of many large-scale collaborative projects.
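One of the practices suggested above, a data dictionary with explicitly documented null values, can be sketched in a few lines of Python (the field names, units, and -999 sentinel are illustrative only, not taken from the survey):

```python
# Hypothetical data dictionary for a stream-monitoring dataset;
# every field documents its type, units, and null sentinel.
DATA_DICTIONARY = {
    "site_id":  {"type": str,   "units": None,                  "null": ""},
    "temp_c":   {"type": float, "units": "degrees Celsius",     "null": -999.0},
    "flow_cms": {"type": float, "units": "cubic meters/second", "null": -999.0},
}

def validate_record(record):
    """Check a record against the data dictionary and map documented
    null sentinels to Python None, so a placeholder can never be
    mistaken for a real measurement."""
    clean = {}
    for field, spec in DATA_DICTIONARY.items():
        value = record[field]
        if value == spec["null"]:
            clean[field] = None            # explicit, documented null
        elif not isinstance(value, spec["type"]):
            raise TypeError(f"{field}: expected {spec['type'].__name__}")
        else:
            clean[field] = value
    return clean
```

Converting sentinels to None on ingest prevents a -999 placeholder from ever being averaged into a summary statistic, which is exactly the kind of undocumented-null problem the survey respondents reported.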
Computer-aided design of large-scale integrated circuits - A concept
NASA Technical Reports Server (NTRS)
Schansman, T. T.
1971-01-01
The circuit-design and mask-development sequence is improved by using a general-purpose computer with interactive graphics capability, establishing an efficient two-way communications link between the design engineer and the system. The interactive graphics capability places the design engineer in direct control of circuit development.
State of the Art in Large-Scale Soil Moisture Monitoring
NASA Technical Reports Server (NTRS)
Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.;
2013-01-01
Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.
PREPping Students for Authentic Science
ERIC Educational Resources Information Center
Dolan, Erin L.; Lally, David J.; Brooks, Eric; Tax, Frans E.
2008-01-01
In this article, the authors describe a large-scale research collaboration, the Partnership for Research and Education in Plants (PREP), which has capitalized on publicly available databases that contain massive amounts of biological information; stock centers that house and distribute inexpensive organisms with different genotypes; and the…
Transforming Power Systems; 21st Century Power Partnership
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2015-05-20
The 21st Century Power Partnership - a multilateral effort of the Clean Energy Ministerial - serves as a platform for public-private collaboration to advance integrated solutions for the large-scale deployment of renewable energy in combination with deep energy efficiency and smart grid solutions.
ERIC Educational Resources Information Center
Zhang, Li
2018-01-01
This article investigates citation and research collaboration habits of faculty in four engineering departments. The analysis focuses on similarities and differences among the engineering disciplines. Main differences exist in the use of conference papers and technical reports. The age of cited materials varies by discipline and by format.…
Takala, A; Korhonen-Yrjänheikki, K
2013-12-01
The key stakeholders of Finnish engineering education collaborated during 2006-09 to reform the system of education, to face the challenges of the changing business environment, and to create a national strategy for Finnish engineering education. The work was carried out using participatory work methods. The impacts of sustainable development (SD) on engineering education were analysed in one of the subprojects. In addition to participatory workshops, the core of the work on SD consisted of a research effort comprising more than 60 interviews and an extensive literature survey. This paper discusses the results of the research and the work process of the Collaboration Group in the SD subproject. It is suggested that enhancing systematic dialogue among key stakeholders using participatory work methods is crucial for increasing motivation and commitment to incorporating SD in engineering education. Developing the context of learning is essential for improving the skills of engineering graduates in some of the key abilities related to SD: systemic and life-cycle thinking, ethical understanding, collaborative learning, and critical reflection skills. This requires changing the educational paradigm from teacher-centred to learner-centred, applying problem- and project-oriented active learning methods.
Brazile, Tiffany; Hostetter Shoop, Glenda; McDonough, Christine M; Van Citters, Douglas W
2018-01-30
Addressing current healthcare challenges requires innovation and collaboration. The current literature provides limited guidance on promoting these skills in medical school. One approach involves transdisciplinary training, in which students from different disciplines work together toward a shared goal. We assessed the need for such a curriculum at Dartmouth College. We surveyed medical and engineering students' educational values; learning experiences; professional goals; and interest in transdisciplinary education and innovation. Data were analyzed using descriptive statistics. Shared values among student groups included leadership development, innovation, collaboration, and resource sharing. Medical students felt their curriculum inadequately addressed creativity and innovation relative to their engineering counterparts (p < 0.05). Medical students felt less prepared for entrepreneurial activities (p < 0.05), while engineering students indicated a need for basic medical knowledge and patient-oriented design factors. Despite strong interest, actual collaboration occurred at less than 50% of the level of interest indicated. Medical and engineering students share an interest in the innovation process and need a shared curriculum to facilitate collaboration. A transdisciplinary course that familiarizes students with this process has the potential to develop physicians and engineers into leaders and innovators who can work effectively across industry lines. A transdisciplinary course was piloted in Spring 2017.
Track reconstruction at LHC as a collaborative data challenge use case with RAMP
NASA Astrophysics Data System (ADS)
Amrouche, Sabrina; Braun, Nils; Calafiura, Paolo; Farrell, Steven; Gemmler, Jochen; Germain, Cécile; Gligorov, Vladimir Vava; Golling, Tobias; Gray, Heather; Guyon, Isabelle; Hushchyn, Mikhail; Innocente, Vincenzo; Kégl, Balázs; Neuhaus, Sara; Rousseau, David; Salzburger, Andreas; Ustyuzhanin, Andrei; Vlimant, Jean-Roch; Wessel, Christian; Yilmaz, Yetkin
2017-08-01
Charged particle track reconstruction is a major component of data-processing in high-energy physics experiments such as those at the Large Hadron Collider (LHC), and is foreseen to become more and more challenging with higher collision rates. A simplified two-dimensional version of the track reconstruction problem is set up on a collaborative platform, RAMP, in order for the developers to prototype and test new ideas. A small-scale competition was held during the Connecting The Dots / Intelligent Trackers 2017 (CTDWIT 2017) workshop. Despite the short time scale, a number of different approaches have been developed and compared along a single score metric, which was kept generic enough to accommodate a summarized performance in terms of both efficiency and fake rates.
GIGGLE: a search engine for large-scale integrated genome analysis.
Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R
2018-02-01
GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation.
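The core operation GIGGLE accelerates, counting how many indexed intervals overlap a query region, can be sketched with sorted endpoints and binary search (a toy stand-in for illustration only; GIGGLE's actual index is a different data structure built over thousands of files and is far faster than this):

```python
from bisect import bisect_left, bisect_right

class IntervalIndex:
    """Toy per-chromosome index over half-open intervals [start, end)."""
    def __init__(self, intervals):
        self.n = len(intervals)
        self.starts = sorted(s for s, _ in intervals)
        self.ends = sorted(e for _, e in intervals)

    def count_overlaps(self, qstart, qend):
        # [s, e) overlaps [qstart, qend) iff s < qend and e > qstart,
        # so subtract intervals that start too late or end too early.
        start_too_late = self.n - bisect_left(self.starts, qend)
        end_too_early = bisect_right(self.ends, qstart)
        return self.n - start_too_late - end_too_early
```

Two binary searches give an O(log n) overlap count per query; enrichment statistics of the kind GIGGLE reports are built on top of such counts across many interval files.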
Collaboration and decision making tools for mobile groups
NASA Astrophysics Data System (ADS)
Abrahamyan, Suren; Balyan, Serob; Ter-Minasyan, Harutyun; Degtyarev, Alexander
2017-12-01
Distributed collaboration tools are now widespread in many areas of human activity, but lack of mobility and dependence on specific equipment create difficulties and slow the development and integration of such technologies. Mobile technologies allow individuals to interact with each other without the need for traditional office spaces and regardless of location. Hence, realizing special infrastructures on mobile platforms, with the help of ad-hoc wireless local networks, could eliminate hardware attachment and also be useful from a scientific standpoint. Implementations of tools based on mobile infrastructures range from basic internet messengers to complex software for online collaboration in large-scale workgroups. Despite the growth of mobile infrastructures, applied distributed solutions for group decision-making and e-collaboration are not common. In this article we propose a software complex for real-time collaboration and decision-making based on mobile devices, describe its architecture, and evaluate its performance.
NASA Technical Reports Server (NTRS)
Phfarr, Barbara B.; So, Maria M.; Lamb, Caroline Twomey; Rhodes, Donna H.
2009-01-01
Experienced systems engineers are adept at more than implementing systems engineering processes: they use systems thinking to solve complex engineering problems. Within the space industry, demographics and economic pressures are reducing the number of experienced systems engineers that will be available in the future. Collaborative systems thinking within systems engineering teams is proposed as a way to integrate systems engineers of various experience levels to handle complex systems engineering challenges. This paper uses the GOES-R Program Systems Engineering team to illustrate the enablers of and barriers to team-level systems thinking and to identify ways in which performance could be improved. Ways in which NASA could expand its engineering training to promote team-level systems thinking are proposed.
NASA Astrophysics Data System (ADS)
L'Heureux, Zara E.
This thesis proposes that internal combustion piston engines can help clear the way for a transformation in the energy, chemical, and refining industries that is akin to the transition computer technology experienced with the shift from large mainframes to small personal computers and large farms of individually small, modular processing units. This thesis provides a mathematical foundation, multi-dimensional optimizations, experimental results, an engine model, and a techno-economic assessment, all working towards quantifying the value of repurposing internal combustion piston engines for new applications in modular, small-scale technologies, particularly for energy and chemical engineering systems. Many chemical engineering and power generation industries have focused on increasing individual unit sizes and centralizing production. This "bigger is better" concept makes it difficult to evolve and incorporate change. Large systems are often designed with long lifetimes, incorporate innovation slowly, and necessitate high upfront investment costs. Breaking away from this cycle is essential for promoting change, especially change happening quickly in the energy and chemical engineering industries. The ability to evolve during a system's lifetime provides a competitive advantage in a field dominated by large and often very old equipment that cannot respond to technology change. This thesis specifically highlights the value of small, mass-manufactured internal combustion piston engines retrofitted to participate in non-automotive system designs. The applications are unconventional and stem first from the observation that, when normalized by power output, internal combustion engines are one hundred times less expensive than conventional, large power plants. This cost disparity motivated a look at scaling laws to determine if scaling across both individual unit size and number of units produced would predict the two order of magnitude difference seen here. 
For the first time, this thesis provides a mathematical analysis of scaling with a combination of both changing individual unit size and varying the total number of units produced. Different paths to meet a particular cumulative capacity are analyzed and show that total costs are path dependent and vary as a function of the unit size and number of units produced. The path dependence identified is fairly weak, however, and for all practical applications the underlying scaling laws seem unaffected. This analysis continues to support the interest in pursuing designs built around small, modular infrastructure. Building on the observation that internal combustion engines are an inexpensive power-producing unit, the first optimization in this thesis focuses on quantifying the value of engine capacity committed to deliver power in the day-ahead electricity and reserve markets, specifically based on pricing from the New York Independent System Operator (NYISO). An optimization was written in Python to determine, based on engine cost, fuel cost, engine wear, engine lifetime, and electricity prices, when and how much of an engine's power should be committed to a particular energy market. The optimization aims to maximize profit for the engine and generator (engine genset) system acting as a price-taker. The result is an annual profit on the order of $30 per kilowatt. Most of the value of the engine genset lies in its commitments to the spinning reserve market, where power is often committed but not always called on to deliver. This analysis highlights the benefits of modularity in energy generation and provides one example where the system is so inexpensive and short-lived that the optimization treats the engine replacement cost as a consumable operating expense rather than a capital cost.
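The hourly commitment decision can be sketched as a greedy rule for a price-taking unit (a simplified illustration with made-up prices, not the thesis's actual optimization or NYISO data; real reserve commitments also carry a probability of being called on to deliver):

```python
def commit_engine(hours, energy_price, reserve_price, fuel_cost, wear_cost):
    """Greedy hourly commitment for a 1-kW price-taking engine genset.

    Prices and costs are in $/kWh. Energy margin per hour is
    price - fuel - wear, where wear treats engine replacement as an
    operating expense. Reserve pays the reserve price with no fuel
    burn (this sketch assumes the unit is never actually called)."""
    schedule, profit = [], 0.0
    for h in range(hours):
        energy_margin = energy_price[h] - fuel_cost - wear_cost
        reserve_margin = reserve_price[h]
        best = max(0.0, energy_margin, reserve_margin)
        if best == 0.0:
            schedule.append("idle")       # no profitable commitment
        elif best == energy_margin:
            schedule.append("energy")
        else:
            schedule.append("reserve")
        profit += best
    return schedule, profit
```

Even this toy rule shows the qualitative result above: in hours when the energy margin is negative, the spinning reserve market is where the genset earns its keep.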
Having the opportunity to incorporate incremental technological improvements in a system's infrastructure throughout its lifetime allows introduction of new technology with higher efficiencies and better designs. An alternative to traditionally large infrastructure that locks in a design and today's state-of-the-art technology for the next 50 - 70 years, is a system designed to incorporate new technology in a modular fashion. The modular engine genset system used for power generation is one example of how this works in practice. The largest single component of this thesis is modeling, designing, retrofitting, and testing a reciprocating piston engine used as a compressor. Motivated again by the low cost of an internal combustion engine, this work looks at how an engine (which is, in its conventional form, essentially a reciprocating compressor) can be cost-effectively retrofitted to perform as a small-scale gas compressor. In the laboratory, an engine compressor was built by retrofitting a one-cylinder, 79 cc engine. Various retrofitting techniques were incorporated into the system design, and the engine compressor performance was quantified in each iteration. Because the retrofitted engine is now a power consumer rather than a power-producing unit, the engine compressor is driven in the laboratory with an electric motor. Experimentally, compressed air engine exhaust (starting at elevated inlet pressures) surpassed 650 psia (about 45 bar), which makes this system very attractive for many applications in chemical engineering and refining industries. A model of the engine compressor system was written in Python and incorporates experimentally-derived parameters to quantify gas leakage, engine friction, and flow (including backflow) through valves. The model as a whole was calibrated and verified with experimental data and is used to explore engine retrofits beyond what was tested in the laboratory. 
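As a rough sanity check on compression figures like 45 bar, the ideal single-stage adiabatic relations give the outlet temperature and specific work for a given pressure ratio; a real engine compressor with leakage, friction, and valve losses, as modeled in the thesis, departs from these (the gas properties below are generic air values, not the thesis's calibrated parameters):

```python
def adiabatic_compression(t_in_k, p_ratio, gamma=1.4, cp=1005.0):
    """Ideal (reversible, adiabatic) single-stage compression of air.

    t_in_k  -- inlet temperature [K]
    p_ratio -- outlet/inlet pressure ratio
    gamma   -- ratio of specific heats (1.4 for air)
    cp      -- specific heat at constant pressure [J/(kg K)]
    Returns (outlet temperature [K], specific work [J/kg])."""
    t_out = t_in_k * p_ratio ** ((gamma - 1.0) / gamma)  # T2 = T1 * r^((g-1)/g)
    work = cp * (t_out - t_in_k)                         # w = cp * (T2 - T1)
    return t_out, work
```

A 45:1 ratio from 300 K gives an ideal outlet temperature near 890 K, which is why high-ratio compression is normally staged with intercooling, and why elevated inlet pressures (as in the experiments) help reach high outlet pressures in a single stage.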
Along with the experimental and modeling work, a techno-economic assessment is included to compare the engine compressor system with state-of-the-art, commercially-available compressors. Included in the financial analysis is a case study where an engine compressor system is modeled to achieve specific compression needs. The result of the assessment is that, indeed, the low engine cost, even with the necessary retrofits, provides a cost advantage over incumbent compression technologies. Lastly, this thesis provides an algorithm and case study for another application of small-scale units in energy infrastructure, specifically in energy storage. This study focuses on quantifying the value of small-scale, onsite energy storage in shaving peak power demands. This case study focuses on university-level power demands. The analysis finds that, because peak power is so costly, even small amounts of energy storage, when dispatched optimally, can provide significant cost reductions. This provides another example of the value of small-scale implementations, particularly in energy infrastructure. While the study focuses on flywheels and batteries as the energy storage medium, engine gensets could also be used to deliver power and shave peak power demands. The overarching goal of this thesis is to introduce small-scale, modular infrastructure, with a particular focus on the opportunity to retrofit and repurpose inexpensive, mass-manufactured internal combustion engines in new and unconventional applications. The modeling and experimental work presented in this dissertation show very compelling results for engines incorporated into both energy generation infrastructure and chemical engineering industries via compression technologies. The low engine cost provides an opportunity to add retrofits whilst remaining cost competitive with the incumbent technology. 
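The storage dispatch for peak shaving can be sketched as a binary search over the lowest achievable peak in one billing window (a simplified illustration assuming hourly steps, a store that begins the window full, and no recharging within it; the thesis's algorithm and case study are more detailed):

```python
def min_peak(demand, energy_kwh, power_kw, tol=1e-6):
    """Lowest achievable peak (kW) over one billing window.

    A candidate peak is feasible if the total energy shaved fits
    within the stored energy and the largest excursion above the
    peak fits within the discharge power limit."""
    def feasible(peak):
        shaved = sum(max(0.0, d - peak) for d in demand)  # kWh, 1-h steps
        worst = max(d - peak for d in demand)             # kW
        return shaved <= energy_kwh and worst <= power_kw

    lo, hi = 0.0, max(demand)   # hi = do-nothing peak, always feasible
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if feasible(mid):
            hi = mid
        else:
            lo = mid
    return hi
```

Because demand charges are billed on the single highest interval, even a small store shaving a brief spike, as in the example below, cuts the billed peak noticeably, which is the effect the case study quantifies.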
This work supports the claim that modular infrastructure, built on the indivisible unit of an internal combustion engine, can revolutionize many industries by providing a low-cost mechanism for rapid change and promoting small-scale designs.
Developing a diverse and inclusive workforce in astronomy
NASA Astrophysics Data System (ADS)
Hunter, Lisa; McConnell, Nicholas; Seagroves, Scott; Barnes, Austin; Smith, Sonya; Palomino, Rafael
2018-06-01
Workforce development -- the preparation and advancement of a diverse and effective workforce -- in astronomy demands attention to a range of different career pathways, such as scientific users, telescope operations, and instrument builders. We will discuss the resources, expertise, and leadership needed to address workforce development challenges in astronomy, and the potential of one or more white papers to be prepared for the 2020 Decadal Survey. Potential white paper topics include (1) mentoring, training, and workplace practices to support diversity and inclusion; (2) enabling the next generation of astronomy faculty to teach effectively and inclusively; (3) supporting telescopes’ needs for local engineering and technologist talent, while telescope collaborations grow in scale and global extent; and (4) equipping early-career astronomers and instrumentalists with strategies and tools that are necessary for collaborating effectively on international teams.
Using Relational Reasoning to Learn about Scientific Phenomena at Unfamiliar Scales
ERIC Educational Resources Information Center
Resnick, Ilyse; Davatzes, Alexandra; Newcombe, Nora S.; Shipley, Thomas F.
2016-01-01
Many scientific theories and discoveries involve reasoning about extreme scales, removed from human experience, such as time in geology, size in nanoscience. Thus, understanding scale is central to science, technology, engineering, and mathematics. Unfortunately, novices have trouble understanding and comparing sizes of unfamiliar large and small…
Using Relational Reasoning to Learn about Scientific Phenomena at Unfamiliar Scales
ERIC Educational Resources Information Center
Resnick, Ilyse; Davatzes, Alexandra; Newcombe, Nora S.; Shipley, Thomas F.
2017-01-01
Many scientific theories and discoveries involve reasoning about extreme scales, removed from human experience, such as time in geology and size in nanoscience. Thus, understanding scale is central to science, technology, engineering, and mathematics. Unfortunately, novices have trouble understanding and comparing sizes of unfamiliar large and…
UCSD's Institute of Engineering in Medicine: fostering collaboration through research and education.
Chien, Shu
2012-07-01
The University of California, San Diego (UCSD) was established in 1961 as a new research university that emphasizes innovation, excellence, and interdisciplinary research and education. It has a School of Medicine (SOM) and the Jacobs School of Engineering (JSOE) in close proximity, and both schools have national rankings among the top 15. In 1991, with the support of the Whitaker Foundation, the Whitaker Institute of Biomedical Engineering was formed to foster collaborations in research and education. In 2008, the university extended the collaboration further by establishing the Institute of Engineering in Medicine (IEM), with the mission of accelerating the discoveries of novel science and technology to enhance health care through teamwork between engineering and medicine, and facilitating the translation of innovative technologies for delivery to the public through clinical application and commercialization.
Moving research to practice through partnership: a case study in Asphalt Paving.
Chang, Charlotte; Nixon, Laura; Baker, Robin
2015-08-01
Multi-stakeholder partnerships play a critical role in dissemination and implementation in health and safety. To better document and understand construction partnerships that have successfully scaled up effective interventions to protect workers, this case study focused on the collaborative processes of the Asphalt Paving Partnership. In the 1990s, this partnership developed, evaluated, disseminated, and achieved near universal, voluntary adoption of paver engineering controls to reduce exposure to asphalt fumes. We used in-depth interviews (n = 15) and document review in the case study. We describe contextual factors that both facilitated and challenged the formation of the collaboration, central themes and group processes, and research to practice (r2p) outcomes. The Asphalt Paving Partnership offers insight into how multi-stakeholder partnerships in construction can draw upon the strengths of diverse members to improve the dissemination and adoption of health and safety innovations and build a collaborative infrastructure to sustain momentum over time. © 2015 Wiley Periodicals, Inc.
The Student-Centered Active Learning Environment for Undergraduate Programs (SCALE-UP) Project
NASA Astrophysics Data System (ADS)
Beichner, Robert J.
2011-04-01
How do you keep a classroom of 100 undergraduates actively learning? Can students practice communication and teamwork skills in a large class? How do you boost the performance of underrepresented groups? The Student-Centered Active Learning Environment for Undergraduate Programs (SCALE-UP) Project has addressed these concerns. Because of their inclusion in a leading introductory physics textbook, project materials are used by more than 1/3 of all science, math, and engineering majors nationwide. The room design and pedagogy have been adopted at more than 100 leading institutions across the country. Physics, chemistry, math, astronomy, biology, engineering, earth sciences, and even literature classes are currently being taught this way. Educational research indicates that students should collaborate on interesting tasks and be deeply involved with the material they are studying. We promote active learning in a redesigned classroom for 100 students or more. (Of course, smaller classes can also benefit.) Class time is spent primarily on "tangibles" and "ponderables"--hands-on activities, simulations, and interesting questions. Nine students sit in three teams at round tables. Instructors circulate and engage in Socratic dialogues. The setting looks like a banquet hall, with lively interactions nearly all the time. Hundreds of hours of classroom video and audio recordings, transcripts of numerous interviews and focus groups, data from conceptual learning assessments (using widely-recognized instruments in a pretest/posttest protocol), and collected portfolios of student work are part of our rigorous assessment effort. Our findings (based on data from over 16,000 students collected over five years as well as replications at adopting sites) can be summarized as the following: 1) Female failure rate is 1/5 of previous levels, even though more is demanded of students. 2) Minority failure rate is 1/4 that seen in traditionally taught courses. 
3) At-risk students are more successful in later engineering courses. 4) Top students gain the most, although students at all levels benefit. 5) Conceptual learning and problem solving are significantly improved, with same content coverage. In this talk I will discuss the need for reform, the SCALE-UP classroom environment, and examine the findings of studies of learning.
NASA Astrophysics Data System (ADS)
Jones, Michael; Chodas, Mark; Smith, Matthew J.; Masterson, Rebecca A.
2014-07-01
OSIRIS-REx is a NASA New Frontiers mission scheduled for launch in 2016 that will travel to the asteroid Bennu and return a pristine sample of the asteroid to Earth. The REgolith X-ray Imaging Spectrometer (REXIS) is a student collaboration instrument on-board the OSIRIS-REx spacecraft. REXIS is a NASA risk Class D instrument, and its design and development is largely student led. The engineering team consists of MIT graduate and undergraduate students and staff at the MIT Space Systems Laboratory. The primary goal of REXIS is the education of science and engineering students through participation in the development of flight hardware. In flight, REXIS will contribute to the mission by providing an elemental abundance map of the asteroid and by characterizing Bennu among the known meteorite groups. REXIS is sensitive to X-rays between 0.5 and 7 keV, and uses coded aperture imaging to map the distribution of iron with 50 m spatial resolution. This paper describes the science goals, concept of operations, and overall engineering design of the REXIS instrument. Each subsystem of the instrument is addressed with a high-level description of the design. Critical design elements such as the Thermal Isolation Layer (TIL), radiation cover, coded-aperture mask, and Detector Assembly Mount (DAM) are discussed in further detail.
Engineering Education for Agricultural and Rural Development in Africa
ERIC Educational Resources Information Center
Adewumi, B. A.
2008-01-01
Agricultural Engineering has transformed agricultural practices from subsistence level to medium and large-scale production via mechanisation in the developed nations. This has reduced the labour force requirements in agriculture; increased production levels and efficiency, product shelf life and product quality; and resulted into…
Lindau, Stacy Tessler; Makelarski, Jennifer A.; Chin, Marshall H.; Desautels, Shane; Johnson, Daniel; Johnson, Waldo E.; Miller, Doriane; Peters, Susan; Robinson, Connie; Schneider, John; Thicklin, Florence; Watson, Natalie P.; Wolfe, Marcus; Whitaker, Eric
2011-01-01
Objective To describe the roles community members can and should play in, and an asset-based strategy used by Chicago’s South Side Health and Vitality Studies for, building sustainable, large-scale community health research infrastructure. The Studies are a family of research efforts aiming to produce actionable knowledge to inform health policy, programming, and investments for the region. Methods Community and university collaborators, using a consensus-based approach, developed shared theoretical perspectives, guiding principles, and a model for collaboration in 2008, which were used to inform an asset-based operational strategy. Ongoing community engagement and relationship-building support the infrastructure and research activities of the Studies. Results Key steps in the asset-based strategy include: 1) continuous community engagement and relationship building, 2) identifying community priorities, 3) identifying community assets, 4) leveraging assets, 5) conducting research, 6) sharing knowledge and 7) informing action. Examples of community member roles, and how these are informed by the Studies’ guiding principles, are provided. Conclusions Community and university collaborators, with shared vision and principles, can effectively work together to plan innovative, large-scale community-based research that serves community needs and priorities. Sustainable, effective models are needed to realize NIH’s mandate for meaningful translation of biomedical discovery into improved population health. PMID:21236295
Midway, Stephen R.; Wagner, Tyler; Zydlewski, Joseph D.; Irwin, Brian J.; Paukert, Craig P.
2016-01-01
Managing inland fisheries in the 21st century presents several obstacles, including the need to view fisheries from multiple spatial and temporal scales, which usually involves populations and resources spanning sociopolitical boundaries. Though collaboration is not new to fisheries science, inland aquatic systems have historically been managed at local scales and present different challenges than in marine or large freshwater systems like the Laurentian Great Lakes. Therefore, we outline a flexible strategy that highlights organization, cooperation, analytics, and implementation as building blocks toward effectively addressing transboundary fisheries issues. Additionally, we discuss the use of Bayesian hierarchical models (within the analytical stage), due to their flexibility in dealing with the variability present in data from multiple scales. With growing recognition of both ecological drivers that span spatial and temporal scales and the subsequent need for collaboration to effectively manage heterogeneous resources, we expect implementation of transboundary approaches to become increasingly critical for effective inland fisheries management.
iClimate: a climate data and analysis portal
NASA Astrophysics Data System (ADS)
Goodman, P. J.; Russell, J. L.; Merchant, N.; Miller, S. J.; Juneja, A.
2015-12-01
We will describe a new climate data and analysis portal called iClimate that facilitates direct comparisons between available climate observations and climate simulations. Modeled after the successful iPlant Collaborative Discovery Environment (www.iplantcollaborative.org) that allows plant scientists to trade and share environmental, physiological and genetic data and analyses, iClimate provides an easy-to-use platform for large-scale climate research, including the storage, sharing, automated preprocessing, analysis and high-end visualization of large and often disparate observational and model datasets. iClimate will promote data exploration and scientific discovery by providing: efficient and high-speed transfer of data from nodes around the globe (e.g. PCMDI and NASA); standardized and customized data/model metrics; efficient subsampling of datasets based on temporal period, geographical region or variable; and collaboration tools for sharing data, workflows, analysis results, and data visualizations with collaborators or with the community at large. We will present iClimate's capabilities, and demonstrate how it will simplify and enhance the ability to do basic or cutting-edge climate research by professionals, laypeople and students.
Defense Acquisitions Acronyms and Terms
2012-12-01
Computer-Aided Design CADD Computer-Aided Design and Drafting CAE Component Acquisition Executive; Computer-Aided Engineering CAIV Cost As an...Radiation to Ordnance HFE Human Factors Engineering HHA Health Hazard Assessment HNA Host-Nation Approval HNS Host-Nation Support HOL High-Order...Engineering Change Proposal VHSIC Very High Speed Integrated Circuit VLSI Very Large Scale Integration VOC Volatile Organic Compound W WAN Wide
Chatzidionysiou, Katerina; Hetland, Merete Lund; Frisell, Thomas; Di Giuseppe, Daniela; Hellgren, Karin; Glintborg, Bente; Nordström, Dan; Aaltonen, Kalle; Törmänen, Minna RK; Klami Kristianslund, Eirik; Kvien, Tore K; Provan, Sella A; Guðbjörnsson, Bjorn; Dreyer, Lene; Kristensen, Lars Erik; Jørgensen, Tanja Schjødt; Jacobsson, Lennart; Askling, Johan
2018-01-01
There are increasing needs for detailed real-world data on rheumatic diseases and their treatments. Clinical register data are essential sources of information that can be enriched through linkage to additional data sources such as national health data registers. Detailed analyses call for international collaborative observational research to increase the number of patients and the statistical power. Such linkages and collaborations come with legal, logistic and methodological challenges. In collaboration between registers of inflammatory arthritides in Sweden, Denmark, Norway, Finland and Iceland, we plan to enrich, harmonise and standardise individual data repositories to investigate analytical approaches to multisource data, to assess the viability of different logistical approaches to data protection and sharing and to perform collaborative studies on treatment effectiveness, safety and health-economic outcomes. This narrative review summarises the needs and potentials and the challenges that remain to be overcome in order to enable large-scale international collaborative research based on clinical and other types of data. PMID:29682328
Use of the Collaborative Optimization Architecture for Launch Vehicle Design
NASA Technical Reports Server (NTRS)
Braun, R. D.; Moore, A. A.; Kroo, I. M.
1996-01-01
Collaborative optimization is a new design architecture specifically created for large-scale distributed-analysis applications. In this approach, the problem is decomposed into a user-defined number of subspace optimization problems that are driven towards interdisciplinary compatibility and the appropriate solution by a system-level coordination process. This decentralized design strategy allows domain-specific issues to be accommodated by disciplinary analysts, while requiring interdisciplinary decisions to be reached by consensus. The present investigation focuses on application of the collaborative optimization architecture to the multidisciplinary design of a single-stage-to-orbit launch vehicle. Vehicle design, trajectory, and cost issues are directly modeled. Posed to suit the collaborative architecture, the design problem is characterized by 5 design variables and 16 constraints. Numerous collaborative solutions are obtained. Comparison of these solutions demonstrates the influence which an a priori ascent-abort criterion has on development cost. Similarly, objective-function selection is discussed, demonstrating the difference between minimum weight and minimum cost concepts. The operational advantages of the collaborative optimization
Aircraft Engine Technology for Green Aviation to Reduce Fuel Burn
NASA Technical Reports Server (NTRS)
Hughes, Christopher E.; VanZante, Dale E.; Heidmann, James D.
2013-01-01
The NASA Fundamental Aeronautics Program Subsonic Fixed Wing Project and Integrated Systems Research Program Environmentally Responsible Aviation Project in the Aeronautics Research Mission Directorate are conducting research on advanced aircraft technology to address the environmental goals of reducing fuel burn, noise and NOx emissions for aircraft in 2020 and beyond. Both Projects, in collaborative partnerships with U.S. Industry, Academia, and other Government Agencies, have made significant progress toward reaching the N+2 (2020) and N+3 (beyond 2025) installed fuel burn goals by fundamental aircraft engine technology development, subscale component experimental investigations, full scale integrated systems validation testing, and development and validation of state-of-the-art computational design and analysis codes. Specific areas of propulsion technology research are discussed and progress to date.
Challenges in engineering large customized bone constructs.
Forrestal, David P; Klein, Travis J; Woodruff, Maria A
2017-06-01
The ability to treat large tissue defects with customized, patient-specific scaffolds is one of the most exciting applications in the tissue engineering field. While an increasing number of modestly sized tissue engineering solutions are making the transition to clinical use, successfully scaling up to large scaffolds with customized geometry is proving to be a considerable challenge. Managing often conflicting requirements of cell placement, structural integrity, and a hydrodynamic environment supportive of cell culture throughout the entire thickness of the scaffold has driven the continued development of many techniques used in the production, culturing, and characterization of these scaffolds. This review explores a range of technologies and methods relevant to the design and manufacture of large, anatomically accurate tissue-engineered scaffolds with a focus on the interaction of manufactured scaffolds with the dynamic tissue culture fluid environment. Biotechnol. Bioeng. 2017;114: 1129-1139. © 2016 Wiley Periodicals, Inc.
ATLAS Large Scale Thin Gap Chambers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soha, Aria
This is a technical scope of work (TSW) between the Fermi National Accelerator Laboratory (Fermilab) and the experimenters of the ATLAS sTGC New Small Wheel collaboration who have committed to participate in beam tests to be carried out during the FY2014 Fermilab Test Beam Facility program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abu-Zayyad, T.; et al.
2013-10-02
Joint contributions of the Pierre Auger and Telescope Array Collaborations to the 33rd International Cosmic Ray Conference, Rio de Janeiro, Brazil, July 2013: cross-calibration of the fluorescence telescopes, large scale anisotropies and mass composition.
The National Near-Road Mobile Source Air Toxics Study: Las Vegas
EPA, in collaboration with FHWA, has been involved in a large-scale monitoring research study in an effort to characterize highway vehicle emissions in a near-road environment. The pollutants of interest include particulate matter with aerodynamic diameter less than 2.5 microns ...
A web-based online collaboration platform for formulating engineering design projects
NASA Astrophysics Data System (ADS)
Varikuti, Sainath
Effective communication and collaboration among students, faculty and industrial sponsors play a vital role while formulating and solving engineering design projects. With the advent of web technology, online platforms and systems have been proposed to facilitate interactions and collaboration among different stakeholders in the context of senior design projects. However, there are noticeable gaps in the literature with respect to understanding the effects of online collaboration platforms for formulating engineering design projects. Most of the existing literature is focused on exploring the utility of online platforms on activities after the problem is defined and teams are formed. Also, there is a lack of mechanisms and tools to guide the project formation phase in senior design projects, which makes it challenging for students and faculty to collaboratively develop and refine project ideas and to establish appropriate teams. In this thesis a web-based online collaboration platform is designed and implemented to share, discuss and obtain feedback on project ideas and to facilitate collaboration among students and faculty prior to the start of the semester. The goal of this thesis is to understand the impact of an online collaboration platform for formulating engineering design projects, and how a web-based online collaboration platform affects the amount of interactions among stakeholders during the early phases of design process. A survey measuring the amount of interactions among students and faculty is administered. Initial findings show a marked improvement in the students' ability to share project ideas and form teams with other students and faculty. Students found the online platform simple to use. The suggestions for improving the tool generally included features that were not necessarily design specific, indicating that the underlying concept of this collaborative platform provides a strong basis and can be extended for future online platforms. 
Although the platform was designed to promote collaboration, adoption of the collaborative platform by students and faculty has been slow. While the platform appears to be very useful for collaboration, more time is required for it to be widely used by all the stakeholders and to fully convert from email communication to the use of the online collaboration platform.
Bottom-up production of meta-atoms for optical magnetism in visible and NIR light
NASA Astrophysics Data System (ADS)
Barois, Philippe; Ponsinet, Virginie; Baron, Alexandre; Richetti, Philippe
2018-02-01
Many unusual optical properties of metamaterials arise from the magnetic response of engineered structures of sub-wavelength size (meta-atoms) exposed to light. The top-down approach whereby engineered nanostructures of well-defined morphology are engraved on a surface proved to be successful for the generation of strong optical magnetism. It faces, however, the limitations of high cost and small active area in visible light where nanometre resolution is needed. The bottom-up approach whereby the fabrication of metamaterials of large volume or large area results from the combination of nanochemistry and self-assembly techniques may constitute a cost-effective alternative. This approach nevertheless requires the large-scale production of functional building-blocks (meta-atoms) bearing a strong magnetic optical response. We propose in this paper a few tracks that lead to the large scale synthesis of magnetic metamaterials operating in visible or near IR light.
An open source web interface for linking models to infrastructure system databases
NASA Astrophysics Data System (ADS)
Knox, S.; Mohamed, K.; Harou, J. J.; Rheinheimer, D. E.; Medellin-Azuara, J.; Meier, P.; Tilmant, A.; Rosenberg, D. E.
2016-12-01
Models of networked engineered resource systems such as water or energy systems are often built collaboratively with developers from different domains working at different locations. These models can be linked to large scale real world databases, and they are constantly being improved and extended. As the development and application of these models becomes more sophisticated, and the computing power required for simulations and/or optimisations increases, so has the need for online services and tools which enable the efficient development and deployment of these models. Hydra Platform is an open source, web-based data management system, which allows modellers of network-based models to remotely store network topology and associated data in a generalised manner, allowing it to serve multiple disciplines. Hydra Platform exposes a JSON web API that allows external programs (referred to as `Apps') to interact with its stored networks and perform actions such as importing data, running models, or exporting the networks to different formats. Hydra Platform supports multiple users accessing the same network and has a suite of functions for managing users and data. We present ongoing development in Hydra Platform, the Hydra Web User Interface, through which users can collaboratively manage network data and models in a web browser. The web interface allows multiple users to graphically access, edit and share their networks, run apps and view results. Through apps, which are located on the server, the web interface can give users access to external data sources and models without the need to install or configure any software. This also ensures model results can be reproduced by removing platform or version dependence. Managing data and deploying models via the web interface provides a way for multiple modellers to collaboratively manage data, deploy and monitor model runs and analyse results.
NASA Astrophysics Data System (ADS)
Harris, A. T.; Ramachandran, R.; Maskey, M.
2013-12-01
The Exelis-developed IDL and ENVI software are ubiquitous tools in Earth science research environments. The IDL Workbench is used by the Earth science community for programming custom data analysis and visualization modules. ENVI is a software solution for processing and analyzing geospatial imagery that combines support for multiple Earth observation scientific data types (optical, thermal, multi-spectral, hyperspectral, SAR, LiDAR) with advanced image processing and analysis algorithms. The ENVI & IDL Services Engine (ESE) is an Earth science data processing engine that allows researchers to use open standards to rapidly create, publish and deploy advanced Earth science data analytics within any existing enterprise infrastructure. Although powerful in many ways, the tools lack collaborative features out-of-box. Thus, as part of the NASA funded project, Collaborative Workbench to Accelerate Science Algorithm Development, researchers at the University of Alabama in Huntsville and Exelis have developed plugins that allow seamless research collaboration from within IDL workbench. Such additional features within IDL workbench are possible because IDL workbench is built using the Eclipse Rich Client Platform (RCP). RCP applications allow custom plugins to be dropped in for extended functionalities. Specific functionalities of the plugins include creating complex workflows based on IDL application source code, submitting workflows to be executed by ESE in the cloud, and sharing and cloning of workflows among collaborators. All these functionalities are available to scientists without leaving their IDL workbench. Because ESE can interoperate with any middleware, scientific programmers can readily string together IDL processing tasks (or tasks written in other languages like C++, Java or Python) to create complex workflows for deployment within their current enterprise architecture (e.g. ArcGIS Server, GeoServer, Apache ODE or SciFlo from JPL). 
Using the collaborative IDL Workbench, coupled with ESE for execution in the cloud, asynchronous workflows could be executed in batch mode on large data in the cloud. We envision that a scientist will initially develop a scientific workflow locally on a small set of data. Once tested, the scientist will deploy the workflow to the cloud for execution. Depending on the results, the scientist may share the workflow and results, allowing them to be stored in a community catalog and instantly loaded into the IDL Workbench of other scientists. Thereupon, scientists can clone and modify or execute the workflow with different input parameters. The Collaborative Workbench will provide a platform for collaboration in the cloud, helping Earth scientists solve big-data problems in the Earth and planetary sciences.
Evolving from bioinformatics in-the-small to bioinformatics in-the-large.
Parker, D Stott; Gorlick, Michael M; Lee, Christopher J
2003-01-01
We argue the significance of a fundamental shift in bioinformatics, from in-the-small to in-the-large. Adopting a large-scale perspective is a way to manage the problems endemic to the world of the small: constellations of incompatible tools for which the effort required to assemble an integrated system exceeds the perceived benefit of the integration. Where bioinformatics in-the-small is about data and tools, bioinformatics in-the-large is about metadata and dependencies. Dependencies represent the complexities of large-scale integration, including the requirements and assumptions governing the composition of tools. The popular make utility is a very effective system for defining and maintaining simple dependencies, and it offers a number of insights about the essence of bioinformatics in-the-large. Keeping an in-the-large perspective has been very useful to us in large bioinformatics projects. We give two fairly different examples, and extract lessons from them showing how it has helped. These examples both suggest the benefit of explicitly defining and managing knowledge flows and knowledge maps (which represent metadata regarding types, flows, and dependencies), and also suggest approaches for developing bioinformatics database systems. Generally, we argue that large-scale engineering principles can be successfully adapted from disciplines such as software engineering and data management, and that having an in-the-large perspective will be a key advantage in the next phase of bioinformatics development.
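The make-style dependency model this abstract credits can be sketched in a few lines: each target declares its prerequisites, and a target is built only after all of them. The target names and no-op build actions below are hypothetical, purely to illustrate how declared dependencies drive execution order.

```python
# Minimal make-style dependency resolver (illustrative sketch).
# rules maps each target to (list of prerequisites, build action).
def build(target, rules, built=None):
    """Depth-first resolution, as make does over a dependency graph."""
    if built is None:
        built = []
    deps, action = rules[target]
    for dep in deps:
        if dep not in built:
            build(dep, rules, built)
    if target not in built:
        action()          # run the build step for this target
        built.append(target)
    return built

# Hypothetical bioinformatics pipeline targets (illustration only).
rules = {
    "raw_reads": ([], lambda: None),
    "alignment": (["raw_reads"], lambda: None),
    "variants":  (["alignment"], lambda: None),
    "report":    (["variants", "alignment"], lambda: None),
}

order = build("report", rules)
print(order)  # prerequisites always precede the targets that need them
```

Real make adds timestamp checks so unchanged targets are skipped; this sketch shows only the ordering that the abstract identifies as the essence of managing in-the-large dependencies.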
NASA Astrophysics Data System (ADS)
Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii
2017-02-01
Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of large amount of heterogeneous satellite imagery acquired by various sensors that consequently leads to a “Big Data” problem. The main objective of this study is to explore efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery with potential to apply the platform for a larger scale (e.g. country level) and multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (28,100 km2 and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address efficiency of the GEE platform to effectively execute complex workflows of satellite data processing required with large scale applications such as crop mapping. The study discusses strengths and weaknesses of classifiers, assesses accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to the benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that GEE provides very good performance in terms of enabling access to the remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed support vector machine (SVM), decision tree and random forest classifiers available in GEE.
Urban Elementary STEM Initiative
ERIC Educational Resources Information Center
Parker, Carolyn; Abel, Yolanda; Denisova, Ekaterina
2015-01-01
The new standards for K-12 science education suggest that student learning should be more integrated and should focus on crosscutting concepts and core ideas from the areas of physical science, life science, Earth/space science, and engineering/technology. This paper describes large-scale, urban elementary-focused science, technology, engineering,…
Collaborative Learning in Engineering Design.
ERIC Educational Resources Information Center
Newell, Sigrin
1990-01-01
Described is a capstone experience for undergraduate biomedical engineering students in which student teams work with children and adults with cerebral palsy to produce devices that make their lives easier or more enjoyable. The collaborative approach, benefits to the clients, and evaluation of the projects are discussed. (CW)
Coastal Aerosol Distribution by Data Assimilation
2006-09-30
useful for forecasts of dust storms in areas downwind of the large deserts of the world: Arabian Gulf, Sea of Japan, China Sea, Mediterranean Sea ...and the Tropical Atlantic Ocean. NAAPS also accurately predicts the fate of large-scale smoke and pollution plumes. With its global and continuous...The collaboration with Scripps Institute of Oceanography and the University of Warsaw has led to the addition of a sea salt component to NAAPS. The
Julee A Herdt; John Hunt; Kellen Schauermann
2016-01-01
This project demonstrates newly invented, biobased construction materials developed by applying low-carbon biomass waste sources through the Authors' engineered fiber processes and technology. If manufactured and applied large-scale, the project inventions can divert large volumes of cellulose waste into high-performance, low embodied energy, environmental construction...
Raising awareness of the importance of funding for tuberculosis small-molecule research.
Riccardi, Giovanna; Old, Iain G; Ekins, Sean
2017-03-01
Tuberculosis (TB) drug discovery research is hampered by several factors, but as in many research areas, the available funding is insufficient to support the needs of research and development. Recent years have seen various large collaborative efforts involving public-private partnerships, mimicking the situation during the golden age of antibiotic drug discovery during the 1950s and 1960s. The large-scale collaborative efforts funded by the European Union (EU) are now subject to diminishing financial support. As a result, TB researchers are increasingly looking for novel forms of funding, such as crowdfunding, to fill this gap. Any potential solution will require a careful reassessment of the incentives to encourage additional organizations to provide funding. Copyright © 2016 Elsevier Ltd. All rights reserved.
A puzzle assembly strategy for fabrication of large engineered cartilage tissue constructs.
Nover, Adam B; Jones, Brian K; Yu, William T; Donovan, Daniel S; Podolnick, Jeremy D; Cook, James L; Ateshian, Gerard A; Hung, Clark T
2016-03-21
Engineering of large articular cartilage tissue constructs remains a challenge as tissue growth is limited by nutrient diffusion. Here, a novel strategy is investigated, generating large constructs through the assembly of individually cultured, interlocking, smaller puzzle-shaped subunits. These constructs can be engineered consistently with more desirable mechanical and biochemical properties than larger constructs (~4-fold greater Young's modulus). A failure testing technique was developed to evaluate the physiologic functionality of constructs, which were cultured as individual subunits for 28 days, then assembled and cultured for an additional 21-35 days. Assembled puzzle constructs withstood large deformations (40-50% compressive strain) prior to failure. Their ability to withstand physiologic loads may be enhanced by increases in subunit strength and assembled culture time. A nude mouse model was utilized to show biocompatibility and fusion of assembled puzzle pieces in vivo. Overall, the technique offers a novel, effective approach to scaling up engineered tissues and may be combined with other techniques and/or applied to the engineering of other tissues. Future studies will aim to optimize this system in an effort to engineer and integrate robust subunits to fill large defects. Copyright © 2016 Elsevier Ltd. All rights reserved.
Fluvial geomorphology and river engineering: future roles utilizing a fluvial hydrosystems framework
NASA Astrophysics Data System (ADS)
Gilvear, David J.
1999-12-01
River engineering is coming under increasing public scrutiny given failures to prevent flood hazards and economic and environmental concerns. This paper reviews the contribution that fluvial geomorphology can make in the future to river engineering. In particular, it highlights the need for fluvial geomorphology to be an integral part in engineering projects, that is, to be integral to the planning, implementation, and post-project appraisal stages of engineering projects. It should be proactive rather than reactive. Areas in which geomorphologists will increasingly be able to complement engineers in river management include risk and environmental impact assessment, floodplain planning, river audits, determination of instream flow needs, river restoration, and design of ecologically acceptable channels and structures. There are four key contributions that fluvial geomorphology can make to the engineering profession with regard to river and floodplain management: to promote recognition of lateral, vertical, and downstream connectivity in the fluvial system and the inter-relationships between river planform, profile, and cross-section; to stress the importance of understanding fluvial history and chronology over a range of time scales, and recognizing the significance of both palaeo and active landforms and deposits as indicators of levels of landscape stability; to highlight the sensitivity of geomorphic systems to environmental disturbances and change, especially when close to geomorphic thresholds, and the dynamics of the natural systems; and to demonstrate the importance of landforms and processes in controlling and defining fluvial biotopes and to thus promote ecologically acceptable engineering. 
Challenges facing fluvial geomorphology include: gaining full acceptance by the engineering profession; widespread utilization of new technologies including GPS, GIS, image analysis of satellite and airborne remote sensing data, computer-based hydraulic modeling and geophysical techniques; dovetailing engineering approaches to the study of river channels which emphasize reach-scale flow resistance, shear stresses, and material strength with catchment scale geomorphic approaches, empirical predictions, bed and bank processes, landform evolution, and magnitude-frequency concepts; producing accepted river channel typologies; fundamental research aimed at producing more reliable deterministic equations for prediction of bed and bank stability and bedload transport; and collaboration with aquatic biologists to determine the role and importance of geomorphologically and hydraulically defined habitats.
An efficient quantum scheme for Private Set Intersection
NASA Astrophysics Data System (ADS)
Shi, Run-hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2016-01-01
Private Set Intersection allows a client to privately compute set intersection with the collaboration of the server, which is one of the most fundamental and key problems within the multiparty collaborative computation of protecting the privacy of the parties. In this paper, we first present a cheat-sensitive quantum scheme for Private Set Intersection. Compared with classical schemes, our scheme has lower communication complexity, which is independent of the size of the server's set. Therefore, it is very suitable for big data services in Cloud or large-scale client-server networks.
A conceptual design of shock-eliminating clover combustor for large scale scramjet engine
NASA Astrophysics Data System (ADS)
Sun, Ming-bo; Zhao, Yu-xin; Zhao, Guo-yan; Liu, Yuan
2017-01-01
A new concept of shock-eliminating clover combustor is proposed for a large scale scramjet engine to fulfill the requirements of fuel penetration, total pressure recovery and cooling. To generate the circular-to-clover transition shape of the combustor, the streamline tracing technique is used based on an axisymmetric expansion parent flowfield calculated using the method of characteristics. The combustor is examined using inviscid and viscous numerical simulations, and a pure circular shape is calculated for comparison. The results showed that the combustor avoids shock wave generation and produces low total pressure losses over a wide range of flight conditions at various Mach numbers. The flameholding device for this combustor is briefly discussed.
Barriers and facilitators experienced in collaborative prospective research in orthopaedic oncology: A qualitative study.
Rendon, J S; Swinton, M; Bernthal, N; Boffano, M; Damron, T; Evaniew, N; Ferguson, P; Galli Serra, M; Hettwer, W; McKay, P; Miller, B; Nystrom, L; Parizzia, W; Schneider, P; Spiguel, A; Vélez, R; Weiss, K; Zumárraga, J P; Ghert, M
2017-05-01
As tumours of bone and soft tissue are rare, multicentre prospective collaboration is essential for meaningful research and evidence-based advances in patient care. The aim of this study was to identify barriers and facilitators encountered in large-scale collaborative research by orthopaedic oncological surgeons involved or interested in prospective multicentre collaboration. All surgeons who were involved, or had expressed an interest, in the ongoing Prophylactic Antibiotic Regimens in Tumour Surgery (PARITY) trial were invited to participate in a focus group to discuss their experiences with collaborative research in this area. The discussion was digitally recorded, transcribed and anonymised. The transcript was analysed qualitatively, using an analytic approach which aims to organise the data in the language of the participants with little theoretical interpretation. The 13 surgeons who participated in the discussion represented orthopaedic oncology practices from seven countries (Argentina, Brazil, Italy, Spain, Denmark, United States and Canada). Four categories and associated themes emerged from the discussion: the need for collaboration in the field of orthopaedic oncology due to the rarity of the tumours and the need for high level evidence to guide treatment; motivational factors for participating in collaborative research including establishing proof of principle, learning opportunity, answering a relevant research question and being part of a collaborative research community; barriers to participation including funding, personal barriers, institutional barriers, trial barriers, and administrative barriers and facilitators for participation including institutional facilitators, leadership, authorship, trial set-up, and the support of centralised study coordination. Orthopaedic surgeons involved in an ongoing international randomised controlled trial (RCT) were motivated by many factors to participate. 
There were a number of barriers to and facilitators for their participation. There was a collective sense of fatigue experienced in overcoming these barriers, which was mirrored by a strong collective sense of the importance of, and need for, collaborative research in this field. The experiences were described as essential educational first steps to advance collaborative studies in this area. Knowledge gained from this study will inform the development of future large-scale collaborative research projects in orthopaedic oncology. Cite this article: J. S. Rendon, M. Swinton, N. Bernthal, M. Boffano, T. Damron, N. Evaniew, P. Ferguson, M. Galli Serra, W. Hettwer, P. McKay, B. Miller, L. Nystrom, W. Parizzia, P. Schneider, A. Spiguel, R. Vélez, K. Weiss, J. P. Zumárraga, M. Ghert. Barriers and facilitators experienced in collaborative prospective research in orthopaedic oncology: A qualitative study. Bone Joint Res 2017;6:-314. DOI: 10.1302/2046-3758.65.BJR-2016-0192.R1. © 2017 Ghert et al.
Software Engineering Research/Developer Collaborations in 2005
NASA Technical Reports Server (NTRS)
Pressburger, Tom
2006-01-01
In CY 2005, three collaborations between software engineering technology providers and NASA software development personnel deployed three software engineering technologies on NASA development projects (a different technology on each project). The main purposes were to benefit the projects, to infuse the technologies into NASA practice if they proved beneficial, and to give feedback to the technology providers to improve the technologies. Each collaboration project produced a final report. Section 2 of this report summarizes each project, drawing from the final reports and communications with the software developers and technology providers. Section 3 indicates paths to further infusion of the technologies into NASA practice. Section 4 summarizes some technology transfer lessons learned. Also included is an acronym list.
Overview of NASA MSFC IEC Federated Engineering Collaboration Capability
NASA Technical Reports Server (NTRS)
Moushon, Brian; McDuffee, Patrick
2005-01-01
The MSFC IEC federated engineering framework is currently developing a single collaborative engineering framework across independent NASA centers. The federated approach allows NASA centers to maintain diversity and uniqueness while providing interoperability. These systems are integrated together in a federated framework without compromising individual center capabilities. MSFC IEC's Federation Framework will have a direct effect on how engineering data is managed across the Agency. The approach is a direct response to the Columbia Accident Investigation Board (CAIB) finding F7.4-11, which states that the Space Shuttle Program has a wealth of data tucked away in multiple databases without a convenient way to integrate and use the data for management, engineering, or safety decisions. IEC's federated capability is further supported by OneNASA recommendation 6, which identifies the need to enhance cross-Agency collaboration by putting in place common engineering and collaborative tools and databases, processes, and knowledge-sharing structures. MSFC's IEC Federated Framework is loosely connected to other engineering applications that can provide users with the integration needed to achieve an Agency view of the entire product definition and development process, while allowing work to be distributed across NASA Centers and contractors. The IEC DDMS federation framework eliminates the need to develop a single, enterprise-wide data model, where the goal of having a common data model shared between NASA centers and contractors is very difficult to achieve.
High Speed Civil Transport Design Using Collaborative Optimization and Approximate Models
NASA Technical Reports Server (NTRS)
Manning, Valerie Michelle
1999-01-01
The design of supersonic aircraft requires complex analysis in multiple disciplines, posing a challenge for optimization methods. In this thesis, collaborative optimization, a design architecture developed to solve large-scale multidisciplinary design problems, is applied to the design of supersonic transport concepts. Collaborative optimization takes advantage of natural disciplinary segmentation to facilitate parallel execution of design tasks. Discipline-specific design optimization proceeds while a coordinating mechanism ensures progress toward an optimum and compatibility between disciplinary designs. Two concepts for supersonic aircraft are investigated: a conventional delta-wing design and a natural laminar flow concept that achieves improved performance by exploiting properties of supersonic flow to delay boundary layer transition. The work involves the development of aerodynamics and structural analyses, and integration within a collaborative optimization framework. It represents the most extensive application of the method to date.
Advances in multi-scale modeling of solidification and casting processes
NASA Astrophysics Data System (ADS)
Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang
2011-04-01
The development of the aviation, energy and automobile industries requires advanced integrated product/process R&D systems that can optimize both the product and the process design. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, studied using phase-field and cellular automaton methods; mathematical models of segregation in large steel ingots; and microstructure models of unidirectionally solidified turbine blade castings are discussed. In addition, some engineering case studies, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry, are discussed.
Similitude design for the vibration problems of plates and shells: A review
NASA Astrophysics Data System (ADS)
Zhu, Yunpeng; Wang, You; Luo, Zhong; Han, Qingkai; Wang, Deyou
2017-06-01
Similitude design plays a vital role in the analysis of vibration and shock problems encountered in large engineering equipment. Similitude design, including dimensional analysis and governing equation method, is founded on the dynamic similitude theory. This study reviews the application of similitude design methods in engineering practice and summarizes the major achievements of the dynamic similitude theory in structural vibration and shock problems in different fields, including marine structures, civil engineering structures, and large power equipment. This study also reviews the dynamic similitude design methods for thin-walled and composite material plates and shells, including the most recent work published by the authors. Structure sensitivity analysis is used to evaluate the scaling factors to attain accurate distorted scaling laws. Finally, this study discusses the existing problems and the potential of the dynamic similitude theory for the analysis of vibration and shock problems of structures.
NREL Supercomputer Tackles Grid Challenges | News | NREL
Turning of COGS moves forward findings for hormonally mediated cancers.
Sakoda, Lori C; Jorgenson, Eric; Witte, John S
2013-04-01
The large-scale Collaborative Oncological Gene-environment Study (COGS) presents new findings that further characterize the genetic bases of breast, ovarian and prostate cancers. We summarize and provide insights into this collection of papers from COGS and discuss the implications of the results and future directions for such efforts.
Experimental Evaluation of Instructional Consultation Teams on Teacher Beliefs and Practices
ERIC Educational Resources Information Center
Vu, Phuong; Shanahan, Katherine Bruckman; Rosenfield, Sylvia; Gravois, Todd; Koehler, Jessica; Kaiser, Lauren; Berger, Jill; Vaganek, Megan; Gottfredson, Gary D.; Nelson, Deborah
2013-01-01
Instructional Consultation Teams (IC Teams) are an early intervention service intended to support teachers in working with struggling students. This is a large-scale experimental trial investigating the effects of IC Teams on teacher efficacy, instructional practices, collaboration, and job satisfaction. Public elementary schools (N = 34) were…
A genome-wide association study platform built on iPlant cyber-infrastructure
USDA-ARS?s Scientific Manuscript database
We demonstrated a flexible Genome-Wide Association (GWA) Study (GWAS) platform built upon the iPlant Collaborative Cyber-infrastructure. The platform supports big data management, sharing, and large scale study of both genotype and phenotype data on clusters. End users can add their own analysis too...
Discovering and Mitigating Software Vulnerabilities through Large-Scale Collaboration
ERIC Educational Resources Information Center
Zhao, Mingyi
2016-01-01
In today's rapidly digitizing society, people place their trust in a wide range of digital services and systems that deliver latest news, process financial transactions, store sensitive information, etc. However, this trust does not have a solid foundation, because software code that supports this digital world has security vulnerabilities. These…
A Systemically Collaborative Approach to Achieving Equity in Higher Education
ERIC Educational Resources Information Center
Prystowsky, Richard J.
2018-01-01
Colleges and universities have long recognized the need to address inequities affecting students from underrepresented or underserved groups. Despite efforts undertaken by dedicated individuals, large-scale, national change in this area has not been realized. In this article, we address two major factors underlying this disappointing result (the…
Faculty Collaboration on Multidisciplinary Web-Based Education.
ERIC Educational Resources Information Center
Saad, Ashraf; Uskov, Vladimir L.; Cedercreutz, Kettil; Geonetta, Sam; Spille, Jack; Abel, Dick
In 1998, faculty members at the University of Cincinnati started a project as an interdepartmental collaboration to investigate the use of World Wide Web-based instructional (WBI) tools. The project team included representatives from various areas such as information engineering technology, mechanical engineering technology, chemical technology,…
Decentralized asset management for collaborative sensing
NASA Astrophysics Data System (ADS)
Malhotra, Raj P.; Pribilski, Michael J.; Toole, Patrick A.; Agate, Craig
2017-05-01
There has been increased impetus to leverage Small Unmanned Aerial Systems (SUAS) for collaborative sensing applications in which many platforms work together to provide critical situation awareness in dynamic environments. Such applications require critical sensor observations to be made at the right place and time to facilitate the detection, tracking, and classification of ground-based objects. This further requires rapid response to real-world events and the balancing of multiple, competing mission objectives. In this context, human operators become overwhelmed with the management of many platforms. Further, current automated planning paradigms tend to be centralized and do not scale up well to many collaborating platforms. We introduce a decentralized approach based upon information theory and distributed fusion, which enables us to scale up to large numbers of collaborating SUAS platforms. This is exercised against a military application involving the autonomous detection, tracking, and classification of critical mobile targets. We further show that, based upon Monte Carlo simulation results, our decentralized approach outperforms more static management strategies employed by human operators and achieves similar results to a centralized approach while being scalable and robust to degradation of communication. Finally, we describe the limitations of our approach and future directions for our research.
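The information-theoretic tasking idea described in the abstract above can be illustrated with a minimal sketch (not the authors' system): each candidate sensor is scored by its expected reduction in Shannon entropy over a discrete target state, and a greedy planner picks the highest-scoring one. All names, priors, and likelihoods below are hypothetical.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def expected_entropy_after(prior, likelihoods):
    """Expected posterior entropy after one observation.

    likelihoods[z][s] = P(measurement z | state s).
    """
    h = 0.0
    for lik in likelihoods:
        # Unnormalized posterior and evidence for this measurement outcome.
        joint = [l * p for l, p in zip(lik, prior)]
        pz = sum(joint)
        if pz > 0:
            h += pz * entropy([j / pz for j in joint])
    return h

def greedy_task(prior, sensors):
    """Pick the sensor with the largest expected information gain."""
    h0 = entropy(prior)
    gains = {name: h0 - expected_entropy_after(prior, lik)
             for name, lik in sensors.items()}
    return max(gains, key=gains.get)

# Two hypothetical sensors observing a binary target state.
prior = [0.5, 0.5]
sensors = {
    "noisy":    [[0.6, 0.4], [0.4, 0.6]],   # weakly informative
    "accurate": [[0.9, 0.1], [0.1, 0.9]],   # strongly informative
}
print(greedy_task(prior, sensors))  # the more informative sensor wins
```

In a decentralized setting each platform would run this scoring locally on its fused belief; the sketch only shows the single-step selection criterion.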
Agapakis, Christina M
2014-03-21
Synthetic biology is frequently defined as the application of engineering design principles to biology. Such principles are intended to streamline the practice of biological engineering, to shorten the time required to design, build, and test synthetic gene networks. This streamlining of iterative design cycles can facilitate the future construction of biological systems for a range of applications in the production of fuels, foods, materials, and medicines. The promise of these potential applications as well as the emphasis on design has prompted critical reflection on synthetic biology from design theorists and practicing designers from many fields, who can bring valuable perspectives to the discipline. While interdisciplinary connections between biologists and engineers have built synthetic biology via the science and the technology of biology, interdisciplinary collaboration with artists, designers, and social theorists can provide insight on the connections between technology and society. Such collaborations can open up new avenues and new principles for research and design, as well as shed new light on the challenging context-dependence, both biological and social, that faces living technologies at many scales. This review is inspired by the session titled "Design and Synthetic Biology: Connecting People and Technology" at Synthetic Biology 6.0 and covers a range of literature on design practice in synthetic biology and beyond. Critical engagement with how design is used to shape the discipline opens up new possibilities for how we might design the future of synthetic biology.
Computers in Electrical Engineering Education at Virginia Polytechnic Institute.
ERIC Educational Resources Information Center
Bennett, A. Wayne
1982-01-01
Discusses use of computers in Electrical Engineering (EE) at Virginia Polytechnic Institute. Topics include: departmental background, level of computing power using large scale systems, mini and microcomputers, use of digital logic trainers and analog/hybrid computers, comments on integrating computers into EE curricula, and computer use in…
ERIC Educational Resources Information Center
Pavlu, Virgil
2008-01-01
Today, search engines are embedded into all aspects of digital world: in addition to Internet search, all operating systems have integrated search engines that respond even as you type, even over the network, even on cell phones; therefore the importance of their efficacy and efficiency cannot be overstated. There are many open possibilities for…
Collaboration in Global Software Engineering Based on Process Description Integration
NASA Astrophysics Data System (ADS)
Klein, Harald; Rausch, Andreas; Fischer, Edward
Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.
The Study on Collaborative Manufacturing Platform Based on Agent
NASA Astrophysics Data System (ADS)
Zhang, Xiao-yan; Qu, Zheng-geng
To address the knowledge-intensive trend in collaborative manufacturing development, we describe a multi-agent architecture supporting a knowledge-based collaborative manufacturing development platform. By virtue of the wrapper services and communication capabilities that agents provide, the proposed architecture facilitates the organization and collaboration of multi-disciplinary individuals and tools. By effectively supporting the formal representation, capture, retrieval and reuse of manufacturing knowledge, a generalized knowledge repository based on an ontology library enables engineers to meaningfully exchange information and pass knowledge across boundaries. Intelligent agent technology increases the efficiency and interoperability of traditional KBE systems and provides comprehensive design environments for engineers.
Dynamics of a Global Zoonotic Research Network Over 33 Years (1980-2012).
Hossain, Liaquat; Karimi, Faezeh; Wigand, Rolf T
2015-10-01
The increasing rate of outbreaks in humans of zoonotic diseases requires detailed examination of the education, research, and practice of animal health and its connection to human health. This study investigated the collaboration network of different fields engaged in conducting zoonotic research from a transdisciplinary perspective. Examination of the dynamics of this network for a 33-year period from 1980 to 2012 is presented through the development of a large scientometric database from Scopus. In our analyses we compared several properties of these networks, including density, clustering coefficient, giant component, and centrality measures over time. We also elicited patterns in different fields of study collaborating with various other fields for zoonotic research. We discovered that the strongest collaborations across disciplines are formed among the fields of medicine; biochemistry, genetics, and molecular biology; immunology and microbiology; veterinary; agricultural and biological sciences; and social sciences. Furthermore, the affiliation network is growing overall in terms of collaborative research among different fields of study such that more than two-thirds of all possible collaboration links among disciplines have already been formed. Our findings indicate that zoonotic research scientists in different fields (human or animal health, social science, earth and environmental sciences, engineering) have been actively collaborating with each other over the past 11 years.
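The network properties named in the abstract above (density, clustering coefficient, giant component, centrality) can all be computed from a simple adjacency structure. The sketch below uses a hypothetical toy co-authorship graph among fields, not the study's Scopus data.

```python
from collections import deque

# Toy collaboration graph among fields (hypothetical labels):
# an edge means the two fields co-appear on zoonotic papers.
graph = {
    "medicine": {"immunology", "veterinary", "social science"},
    "immunology": {"medicine", "veterinary"},
    "veterinary": {"medicine", "immunology", "agriculture"},
    "agriculture": {"veterinary"},
    "social science": {"medicine"},
    "engineering": set(),  # isolated field in this toy example
}

def density(g):
    """Fraction of possible edges that are present."""
    n = len(g)
    m = sum(len(nbrs) for nbrs in g.values()) // 2  # each edge counted twice
    return 2 * m / (n * (n - 1)) if n > 1 else 0.0

def local_clustering(g, v):
    """Fraction of v's neighbor pairs that are themselves connected."""
    nbrs = list(g[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in g[nbrs[i]])
    return 2 * links / (k * (k - 1))

def giant_component(g):
    """Size of the largest connected component (breadth-first search)."""
    seen, best = set(), 0
    for start in g:
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        while queue:
            for nxt in g[queue.popleft()]:
                if nxt not in comp:
                    comp.add(nxt)
                    queue.append(nxt)
        seen |= comp
        best = max(best, len(comp))
    return best

# Degree centrality: degree normalized by the maximum possible degree.
degree_centrality = {v: len(nbrs) / (len(graph) - 1)
                     for v, nbrs in graph.items()}
print(density(graph), giant_component(graph), degree_centrality["medicine"])
```

Tracking these quantities per time window, as the study does over 1980-2012, shows whether the collaboration network is densifying and whether any field is becoming more central.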
Enforcing compatibility and constraint conditions and information retrieval at the design action
NASA Technical Reports Server (NTRS)
Woodruff, George W.
1990-01-01
The design of complex entities is a multidisciplinary process involving several interacting groups and disciplines. There is a need to integrate the data in such environments to enhance the collaboration between these groups and to enforce compatibility between dependent data entities. This paper discusses the implementation of a workstation-based CAD system that is integrated with a DBMS and an expert system, CLIPS (both implemented on a minicomputer), to provide such collaborative and compatibility enforcement capabilities. The current implementation allows for a three-way link between the CAD system, the DBMS and CLIPS. The engineering design process associated with the design and fabrication of sheet metal housing for computers in a large computer manufacturing facility provides the basis for this prototype system.
NASA Technical Reports Server (NTRS)
Modesitt, Kenneth L.
1987-01-01
Progress is reported on the development of SCOTTY, an expert knowledge-based system to automate the analysis procedure following test firings of the Space Shuttle Main Engine (SSME). The integration of a large-scale relational data base system, a computer graphics interface for experts and end-user engineers, potential extension of the system to flight engines, application of the system for training of newly-hired engineers, technology transfer to other engines, and the essential qualities of good software engineering practices for building expert knowledge-based systems are among the topics discussed.
A concept ideation framework for medical device design.
Hagedorn, Thomas J; Grosse, Ian R; Krishnamurty, Sundar
2015-06-01
Medical device design is a challenging process, often requiring collaboration between medical and engineering domain experts. This collaboration can be best institutionalized through systematic knowledge transfer between the two domains coupled with effective knowledge management throughout the design innovation process. Toward this goal, we present the development of a semantic framework for medical device design that unifies a large medical ontology with detailed engineering functional models along with the repository of design innovation information contained in the US Patent Database. As part of our development, existing medical, engineering, and patent document ontologies were modified and interlinked to create a comprehensive medical device innovation and design tool with appropriate properties and semantic relations to facilitate knowledge capture, enrich existing knowledge, and enable effective knowledge reuse for different scenarios. The result is a Concept Ideation Framework for Medical Device Design (CIFMeDD). Key features of the resulting framework include function-based searching and automated inter-domain reasoning to uniquely enable identification of functionally similar procedures, tools, and inventions from multiple domains based on simple semantic searches. The significance and usefulness of the resulting framework for aiding in conceptual design and innovation in the medical realm are explored via two case studies examining medical device design problems. Copyright © 2015 Elsevier Inc. All rights reserved.
Truelove, Emily; Kellogg, Katherine C.
2016-01-01
This 12-month ethnographic study of an early entrant into the U.S. car-sharing industry demonstrates that when an organization shifts its focus from developing radical new technology to incrementally improving this technology, the shift may spark an internal power struggle between the dominant engineering group and a challenger occupational group such as the marketing group. Analyzing 42 projects in two time periods that required collaboration between engineering and marketing during such a shift, we show how cross-occupational collaboration under these conditions can be facilitated by a radical flank threat, through which the bargaining power of moderates is strengthened by the presence of a more-radical group. In the face of a strong threat by radical members of a challenger occupational group, moderate members of the dominant engineering group may change their perceptions of their power to resist challengers’ demands and begin to distinguish between the goals of radical versus more-moderate challengers. To maintain as much power as possible and prevent the more-dramatic change in engineering occupational goals demanded by radical challengers, moderate engineers may build a coalition with moderate challengers and collaborate for incremental technology development. PMID:28424533
Active and Collaborative Learning in an Introductory Electrical and Computer Engineering Course
ERIC Educational Resources Information Center
Kotru, Sushma; Burkett, Susan L.; Jackson, David Jeff
2010-01-01
Active and collaborative learning instruments were introduced into an introductory electrical and computer engineering course. These instruments were designed to assess specific learning objectives and program outcomes. Results show that students developed an understanding comparable to that of more advanced students assessed later in the…
Innovative University-Industry-Government Collaboration. Six Case Studies from the USA.
ERIC Educational Resources Information Center
Dryden, R. D.; Erzurumlu, H. C. M.
1996-01-01
University-industry-government collaborations face challenges that necessitate a new culture or mindset. Six U.S. case examples demonstrate ways to create a win-win-win scenario and sustain partnerships: Oregon Joint Graduate Schools of Engineering; Network for Engineering and Research in Oregon; Blacksburg Electronic Village; research…
Hierarchical Engine for Large-scale Infrastructure Co-Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-04-24
HELICS is designed to support very-large-scale (100,000+ federates) co-simulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross-platform operating-system support, the integration of both event-driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical-model convergence at each time step.
2011-11-01
…fusion energy-production processes of the particular type of reactor using a lithium (Li) blanket or related alloys such as the Pb-17Li eutectic. As such, tritium breeding is intimately connected with energy production, thermal management, radioactivity management, materials properties, and mechanical structures of any plausible future large-scale fusion power reactor. JASON is asked to examine the current state of scientific knowledge and engineering practice on the physical and chemical bases for large-scale tritium…
Zhang, Lening; Messner, Steven F; Lu, Jianhong
2007-02-01
This article discusses research experience gained from a large-scale survey of criminal victimization recently conducted in Tianjin, China. The authors review some of the more important challenges that arose in the research, their responses to these challenges, and lessons learned that might be beneficial to other scholars who are interested in conducting criminological research in China. Their experience underscores the importance of understanding the Chinese political, cultural, and academic context, and the utility of collaborating with experienced and knowledgeable colleagues "on site." Although there are some special difficulties and barriers, their project demonstrates the feasibility of original criminological data collection in China.
Multi-Center Implementation of NPR 7123.1A: A Collaborative Effort
NASA Technical Reports Server (NTRS)
Hall, Phillip B.; McNelis, Nancy B.
2011-01-01
Collaboration efforts between MSFC and GRC Engineering Directorates to implement the NASA Systems Engineering (SE) Engine have expanded over the past year to include other NASA Centers. Sharing information on designing, developing, and deploying SE processes has sparked further interest based on the realization that there is relative consistency in implementing SE processes at the institutional level. This presentation will provide a status on the ongoing multi-center collaboration and provide insight into how these NPR 7123.1A SE-aligned directives are being implemented and managed to better support the needs of NASA programs and projects. NPR 7123.1A, NASA Systems Engineering Processes and Requirements, was released on March 26, 2007 to clearly articulate and establish the requirements on the implementing organization for performing, supporting, and evaluating SE activities. In early 2009, MSFC and GRC Engineering Directorates undertook a collaborative opportunity to share their research and work associated with developing, updating and revising their SE process policy to comply and align with NPR 7123.1A. The goal is to develop instructions, checklists, templates, and procedures for each of the 17 SE process requirements so that systems engineers will be in a position to define work that is process-driven. Greater efficiency and more effective technical management will be achieved due to the consistency and repeatability of SE process implementation across the NASA centers. An added benefit will be to encourage NASA centers to pursue and collaborate on joint projects as a result of using common or similar processes, methods, tools, and techniques.
Engineering processes for the African VLBI network
NASA Astrophysics Data System (ADS)
Thondikulam, Venkatasubramani L.; Loots, Anita; Gaylard, Michael
2013-04-01
The African VLBI Network (AVN) is an initiative by SKA-SA and HartRAO, business units of the National Research Foundation (NRF), Department of Science and Technology (DST), South Africa. The aim is to fill the existing gap in Very Long Baseline Interferometry (VLBI)-capable radio telescopes on the African continent through a combination of new builds and the conversion of large redundant telecommunication antennas, under an inter-governmental collaborative programme in science and technology. Human capital development on the continent in the techniques of radio astronomy engineering and science is a strong driving force for the project and is expected to contribute significantly to the success of the Square Kilometre Array (SKA) on the continent.
NASA Astrophysics Data System (ADS)
Günther, Andreas; Aziz Patwary, Mohammad Abdul; Bahls, Rebecca; Asaduzzaman, Atm; Ludwig, Rüdiger; Ashraful Kamal, Mohammad; Nahar Faruqa, Nurun; Jabeen, Sarwat
2016-04-01
Dhaka Metropolitan City (including Dhaka and five adjacent municipal areas) is one of the fastest-developing urban regions in the world. Densely built-up areas in the developed metropolitan area of Dhaka City are subject to extensive restructuring as common six- or lower-storied buildings are replaced by taller and heavier constructions. Additional stories are built on existing houses, frequently exceeding the allowable bearing pressure on the subsoil as supported by the foundations. In turn, newly developing city areas are planned in marshy terrains modified by extensive, largely unengineered landfills. In most areas, these terrains bear unfavorable building-ground conditions within 30 meters. Within a collaborative technical cooperation project between Bangladesh and Germany, BGR supports GSB in the provision of geo-information for the Capital Development Authority (RAJUK). For general urban planning, RAJUK successively develops a detailed area plan (DAP) at a scale of 1:50,000 for the whole Dhaka Metropolitan City area (approx. 1700 km2). Geo-information has not been considered in the present DAP. Within the project, geospatial information in the form of a geomorphic map, a digital terrain model and a 3-D subsurface model covering the whole city area has been generated at a scale of 1:50,000. An extensive engineering-geological database consisting of more than 2200 borehole records with associated Standard Penetration Testing (SPT) and lab data has been compiled. With the field-testing (SPT) and engineering-geological lab data, the 3-D subsurface model can be parameterized to derive important spatial subsurface information for urban planning, such as bearing-capacity evaluations for different foundation designs or soil-liquefaction potential assessments for specific earthquake scenarios.
In conjunction with inundation potential evaluations for different flooding scenarios, comprehensive building ground suitability information can be derived to support risk-sensitive urban planning in Dhaka Metropolitan City area at the DAP scale
Khademi, Ramin; Mohebbi-Kalhori, Davod; Hadjizadeh, Afra
2014-03-01
Successful bone tissue culture in a large implant is still a challenge. We have previously developed a porous hollow membrane sheet (HMSh) for tissue engineering applications (Afra Hadjizadeh and Davod Mohebbi-Kalhori, J Biomed. Mater. Res. Part A [2]). This study aims to investigate culture conditions and nutrient supply in a bioreactor made of HMSh. For this purpose, the hydrodynamic and mass-transport behavior in the newly proposed hollow membrane sheet bioreactor, including a lumen region and a porous membrane (scaffold) for supporting and feeding cells with a grooved section for accommodating the gel-cell matrix, was numerically studied. A finite element method was used for solving the governing equations in both homogeneous and porous media. Furthermore, cell resistance and waste production have been included in a 3D mathematical model. The influences of the different bioreactor design parameters and scaffold properties that determine HMSh bioreactor performance, and of various operating conditions, are discussed in detail. The obtained results illustrate that the novel scaffold can be employed in large-scale applications in bone tissue engineering.
All for one and one for all: The value of grassroots collaboration in clinical research.
Al Wattar, Bassel H; Tamblyn, Jennifer
2017-08-01
Collaboration in health research is common in current practice. Engaging grassroots clinicians in the evidence synthesis and research process can deliver impactful results and reduce research wastage. The UKARCOG is a group of specialty trainees in obstetrics and gynaecology in the UK aiming to promote women's health research by delivering high-quality impactful research and national audit projects. The collaborative enables trainees to develop essential academic skills and roll out multicentre research projects at high cost-effectiveness. Collective research work can face a number of challenges, such as establishing a joint authorship style, gaining institutional support and acquiring funds to boost networking and deliver large-scale studies. Copyright © 2017 Elsevier B.V. All rights reserved.
Nanobiocatalyst advancements and bioprocessing applications
Misson, Mailin; Zhang, Hu; Jin, Bo
2015-01-01
The nanobiocatalyst (NBC) is an emerging innovation that synergistically integrates advanced nanotechnology with biotechnology and promises exciting advantages for improving enzyme activity, stability, capability and engineering performances in bioprocessing applications. NBCs are fabricated by immobilizing enzymes with functional nanomaterials as enzyme carriers or containers. In this paper, we review the recent developments of novel nanocarriers/nanocontainers with advanced hierarchical porous structures for retaining enzymes, such as nanofibres (NFs), mesoporous nanocarriers and nanocages. Strategies for immobilizing enzymes onto nanocarriers made from polymers, silicas, carbons and metals by physical adsorption, covalent binding, cross-linking or specific ligand spacers are discussed. The resulting NBCs are critically evaluated in terms of their bioprocessing performances. Excellent performances are demonstrated through enhanced NBC catalytic activity and stability due to conformational changes upon immobilization and localized nanoenvironments, and NBC reutilization by assembling magnetic nanoparticles into NBCs to defray the high operational costs associated with enzyme production and nanocarrier synthesis. We also highlight several challenges associated with the NBC-driven bioprocess applications, including the maturation of large-scale nanocarrier synthesis, design and development of bioreactors to accommodate NBCs, and long-term operations of NBCs. We suggest these challenges are to be addressed through joint collaboration of chemists, engineers and material scientists. Finally, we have demonstrated the great potential of NBCs in manufacturing bioprocesses in the near future through successful laboratory trials of NBCs in carbohydrate hydrolysis, biofuel production and biotransformation. PMID:25392397
Mavronicolas, Heather A; Laraque, Fabienne; Shankar, Arti; Campbell, Claudia
2017-05-01
Care coordination programmes are an important aspect of HIV management whose success depends largely on HIV primary care provider (PCP) and case manager collaboration. Factors influencing collaboration among HIV PCPs and case managers remain to be studied. The study objective was to test an existing theoretical model of interprofessional collaborative practice and determine which factors play the most important role in facilitating collaboration. A self-administered, anonymous mail survey was sent to HIV PCPs and case managers in New York City. An adapted survey instrument elicited information on demographic, contextual, and perceived social exchange (trustworthiness, role specification, and relationship initiation) characteristics. The dependent variable, perceived interprofessional practice, was constructed from a validated scale. A sequential block-wise regression model specifying variable entry order examined the relative importance of each group of factors and of individual variables. The analysis showed that social exchange factors were the dominant drivers of collaboration. Relationship initiation was the most important predictor of interprofessional collaboration. Additional influential factors included organisational leadership support of collaboration, practice settings, and frequency of interprofessional meetings. Addressing factors influencing collaboration among providers will help public health programmes optimally design their structural, hiring, and training strategies to foster effective social exchanges and promote collaborative working relationships.
Adjoint Sensitivity Analysis for Scale-Resolving Turbulent Flow Solvers
NASA Astrophysics Data System (ADS)
Blonigan, Patrick; Garai, Anirban; Diosady, Laslo; Murman, Scott
2017-11-01
Adjoint-based sensitivity analysis methods are powerful design tools for engineers who use computational fluid dynamics. In recent years, these engineers have started to use scale-resolving simulations such as large-eddy simulations (LES) and direct numerical simulations (DNS), which resolve more scales in complex flows with unsteady separation and jets than the widely used Reynolds-averaged Navier-Stokes (RANS) methods. However, the conventional adjoint method computes large, unusable sensitivities for scale-resolving simulations, which, unlike RANS simulations, exhibit the chaotic dynamics inherent in turbulent flows. Sensitivity analysis based on least-squares shadowing (LSS) avoids the issues encountered by conventional adjoint methods, but has a high computational cost even for relatively small simulations. The following talk discusses a more computationally efficient formulation of LSS, "non-intrusive" LSS, and its application to turbulent flows simulated with a discontinuous-Galerkin spectral-element-method LES/DNS solver. Results are presented for the minimal flow unit, a turbulent channel flow with a limited streamwise and spanwise domain.
Turbulent pipe flow at extreme Reynolds numbers.
Hultmark, M; Vallikivi, M; Bailey, S C C; Smits, A J
2012-03-02
Both the inherent intractability and complex beauty of turbulence reside in its large range of physical and temporal scales. This range of scales is captured by the Reynolds number, which in nature and in many engineering applications can be as large as 10^5-10^6. Here, we report turbulence measurements over an unprecedented range of Reynolds numbers using a unique combination of a high-pressure air facility and a new nanoscale anemometry probe. The results reveal previously unknown universal scaling behavior for the turbulent velocity fluctuations, which is remarkably similar to the well-known scaling behavior of the mean velocity distribution.
A review of sensing technologies for small and large-scale touch panels
NASA Astrophysics Data System (ADS)
Akhtar, Humza; Kemao, Qian; Kakarala, Ramakrishna
2017-06-01
A touch panel is an input device for human-computer interaction. It consists of a network of sensors, a sampling circuit and a microcontroller for detecting and locating a touch input. Touch input can come from either a finger or a stylus, depending on the type of touch technology. These touch panels provide an intuitive and collaborative workspace so that people can perform various tasks with their fingers instead of traditional input devices such as the keyboard and mouse. Touch-sensing technology is not new. At the time of this writing, various technologies are available in the market, and this paper reviews the most common ones. We review traditional designs and sensing algorithms for touch technology. We also observe that, due to its various strengths, capacitive touch will dominate the large-scale touch panel industry in years to come. In the end, we discuss the motivation for doing academic research on large-scale panels.
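As a rough illustration of the detect-and-locate step described in this abstract, the sketch below scans a sampled grid of capacitance deltas for a touch centroid. The 4x4 grid, the threshold value and the weighted-centroid rule are illustrative assumptions, not the algorithm of any specific controller reviewed in the paper:

```python
# Hypothetical 4x4 grid of capacitance deltas sampled by the controller;
# a finger touch raises the delta at and around the touched node.
grid = [
    [0, 1, 0, 0],
    [1, 9, 8, 1],
    [0, 8, 7, 0],
    [0, 1, 0, 0],
]

THRESHOLD = 5  # counts above baseline that register as a touch (assumed)

def locate_touch(grid, threshold=THRESHOLD):
    """Return the (row, col) centroid of all cells above threshold, or None."""
    hits = [(r, c, v)
            for r, row in enumerate(grid)
            for c, v in enumerate(row) if v >= threshold]
    if not hits:
        return None
    total = sum(v for _, _, v in hits)
    # A weighted centroid gives sub-cell resolution from coarse electrodes.
    row = sum(r * v for r, _, v in hits) / total
    col = sum(c * v for _, c, v in hits) / total
    return (round(row, 2), round(col, 2))

print(locate_touch(grid))
```

A real controller would add baseline tracking, noise filtering and multi-touch blob segmentation on top of this simple thresholding step.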
Engineering-Scale Demonstration of DuraLith and Ceramicrete Waste Forms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Josephson, Gary B.; Westsik, Joseph H.; Pires, Richard P.
2011-09-23
To support the selection of a waste form for the liquid secondary wastes from the Hanford Waste Immobilization and Treatment Plant, Washington River Protection Solutions (WRPS) has initiated secondary waste form testing on four candidate waste forms. Two of the candidate waste forms have not been developed to the same scale as the more mature waste forms. This work describes engineering-scale demonstrations conducted on the Ceramicrete and DuraLith candidate waste forms. Both candidate waste forms were successfully demonstrated at an engineering scale. A preliminary conceptual design could be prepared for full-scale production of the candidate waste forms. However, both waste forms are still too immature to support a detailed design. Formulations for each candidate waste form need to be developed so that the material has a longer working time after mixing the liquid and solid constituents together. Formulations optimized based on previous lab studies did not have sufficient working time to support large-scale testing. The engineering-scale testing was successfully completed using modified formulations. Further lab development and parametric studies are needed to optimize formulations with adequate working time and to assess the effects of changes in raw materials and process parameters on final product performance. Studies on the effects of mixing intensity on the initial set time of the waste forms are also needed.
Metabolic engineering of biosynthetic pathway for production of renewable biofuels.
Singh, Vijai; Mani, Indra; Chaudhary, Dharmendra Kumar; Dhar, Pawan Kumar
2014-02-01
Metabolic engineering is an important area of research that involves editing genetic networks to overproduce a certain substance by the cells. Using a combination of genetic, metabolic, and modeling methods, useful substances have been synthesized in the past at industrial scale and in a cost-effective manner. Currently, metabolic engineering is being used to produce sufficient, economical, and eco-friendly biofuels. In the recent past, a number of efforts have been made towards engineering biosynthetic pathways for large scale and efficient production of biofuels from biomass. Given the adoption of metabolic engineering approaches by the biofuel industry, this paper reviews various approaches towards the production and enhancement of renewable biofuels such as ethanol, butanol, isopropanol, hydrogen, and biodiesel. We have also identified specific areas where more work needs to be done in the future.
NASA Technical Reports Server (NTRS)
Carros, R. J.; Boissevain, A. G.; Aoyagi, K.
1975-01-01
Data are presented from an investigation of the aerodynamic characteristics of a large-scale wind-tunnel aircraft model that utilized a hybrid upper-surface-blown flap to augment lift. The hybrid concept of this investigation used a portion of the turbofan exhaust air for blowing over the trailing-edge flap to provide boundary-layer control. The model, tested in the Ames 40- by 80-foot Wind Tunnel, had a 27.5 deg swept wing of aspect ratio 8 and 4 turbofan engines mounted on the upper surface of the wing. The lift of the model was augmented by turbofan exhaust impingement on the wing upper surface and flap system. Results were obtained for three flap deflections, for some variation of engine nozzle configuration, and for jet thrust coefficients from 0 to 3.0. Six-component longitudinal and lateral data are presented with four-engine operation and with the critical engine out. In addition, a limited number of cross-plots of the data are presented. All of the tests were made with a downwash rake installed instead of a horizontal tail. Some of these downwash data are also presented.
Advanced engineering environment pilot project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwegel, Jill; Pomplun, Alan R.; Abernathy, Rusty
2006-10-01
The Advanced Engineering Environment (AEE) is a concurrent engineering concept that enables real-time process tooling design and analysis, collaborative process flow development, automated document creation, and full process traceability throughout a product's life cycle. The AEE will enable NNSA's Design and Production Agencies to collaborate through a single integrated process. Sandia National Laboratories and Parametric Technology Corporation (PTC) are working together on a prototype AEE pilot project to evaluate PTC's product collaboration tools relative to the needs of the NWC. The primary deliverable for the project is a set of validated criteria for defining a complete commercial off-the-shelf (COTS) solution to deploy the AEE across the NWC.
Deliyannides, Timothy S; Gabler, Vanessa
2012-01-01
This Publisher's Report describes the collaboration between a university library system's scholarly communication and publishing office and a federally funded research team, the Rehabilitation Engineering Research Center (RERC) on Telerehabilitation. This novel interdisciplinary collaboration engages librarians, information technologists, publishing professionals, clinicians, policy experts, and engineers and has produced a new Open Access journal, International Journal of Telerehabilitation, and a developing, interactive web-based product dedicated to disseminating information about telerehabilitation. Readership statistics are presented for March 1, 2011 - February 29, 2012.
Turbofan engine demonstration of sensor failure detection
NASA Technical Reports Server (NTRS)
Merrill, Walter C.; Delaat, John C.; Abdelwahab, Mahmood
1991-01-01
In this paper, the results of a full-scale engine demonstration of a sensor failure detection algorithm are presented. The algorithm detects, isolates, and accommodates sensor failures using analytical redundancy. The experimental hardware, including the F100 engine, is described. Demonstration results were obtained over a large portion of a typical flight envelope for the F100 engine. They include both subsonic and supersonic conditions at both medium and full (non-afterburning) power. Estimated accuracy, minimum detectable levels of sensor failures, and failure accommodation performance for an F100 turbofan engine control system are discussed.
Computer code for the prediction of nozzle admittance
NASA Technical Reports Server (NTRS)
Nguyen, Thong V.
1988-01-01
A procedure that can accurately characterize injector designs for large-thrust (0.5 to 1.5 million pounds), high-pressure (500 to 3000 psia) LOX/hydrocarbon engines is currently under development. In this procedure, a rectangular cross-section combustion chamber is to be used to simulate the lower transverse frequency modes of the large-scale chamber. The chamber will be sized so that the first width mode of the rectangular chamber corresponds to the first tangential mode of the full-scale chamber. Test data to be obtained from the rectangular chamber will be used to assess full-scale engine stability. This requires the development of combustion stability models for rectangular chambers. As part of the combustion stability model development, a computer code, NOAD, based on existing theory, was developed to calculate the nozzle admittances for both rectangular and axisymmetric nozzles. This code is described in detail.
Concurrent and Collaborative Engineering Implementation in an R and D Organization
NASA Technical Reports Server (NTRS)
DelRosario, Ruben; Davis, Jose M.; Keys, L. Ken
2003-01-01
Concurrent Engineering (CE) and Collaborative Engineering (or Collaborative Product Development, CPD) have emerged as new paradigms with significant impact on the development of new products and processes. With documented and substantiated success in the automotive and technology industries, CE and, most recently, CPD are being touted as innovative management philosophies for many other business sectors, including research and development. This paper introduces two independent research initiatives conducted at the NASA Glenn Research Center (GRC) in Cleveland, Ohio investigating the application of CE and CPD in an R&D environment. Since little research has been conducted on the use of CE and CPD in sectors other than high-mass-production manufacturing, the objective of these independent studies is to provide a systematic evaluation of the applicability of these paradigms (concurrent and collaborative) in a low/no-production, service environment, in particular R&D.
1981-11-01
Florida Univ., Gainesville, Dept. of Environmental Engineering: Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2 of a series (in 7 volumes). Report 1: Baseline Studies, Volume I: The Aquatic Macrophytes.
Impurity engineering of Czochralski silicon used for ultra large-scaled-integrated circuits
NASA Astrophysics Data System (ADS)
Yang, Deren; Chen, Jiahe; Ma, Xiangyang; Que, Duanlin
2009-01-01
Impurities in Czochralski silicon (Cz-Si) used for ultra-large-scale-integrated (ULSI) circuits have been believed to deteriorate the performance of devices. In this paper, a review of recent progress from our investigation of internal gettering in Cz-Si wafers doped with nitrogen, germanium and/or a high content of carbon is presented. It has been suggested that those impurities enhance oxygen precipitation and create both denser bulk microdefects and a denuded zone of sufficient width, which benefits the internal gettering of metal contamination. Based on the experimental facts, a potential mechanism of impurity doping on the internal gettering structure is interpreted, and a new concept of 'impurity engineering' for Cz-Si used for ULSI is proposed.
Shaping carbon nanostructures by controlling the synthesis process
NASA Astrophysics Data System (ADS)
Merkulov, Vladimir I.; Guillorn, Michael A.; Lowndes, Douglas H.; Simpson, Michael L.; Voelkl, Edgar
2001-08-01
The ability to control the nanoscale shape of nanostructures in a large-scale synthesis process is an essential and elusive goal of nanotechnology research. Here, we report significant progress toward that goal. We have developed a technique that enables controlled synthesis of nanoscale carbon structures with conical and cylinder-on-cone shapes and provides the capability to dynamically change the nanostructure shape during the synthesis process. In addition, we present a phenomenological model that explains the formation of these nanostructures and provides insight into methods for precisely engineering their shape. Since the growth process we report is highly deterministic in allowing large-scale synthesis of precisely engineered nanoscale components at defined locations, our approach provides an important tool for a practical nanotechnology.
Collaborative Distributed Scheduling Approaches for Wireless Sensor Network
Niu, Jianjun; Deng, Zhidong
2009-01-01
Energy constraints restrict the lifetime of wireless sensor networks (WSNs) with battery-powered nodes, which poses great challenges for their large-scale application. In this paper, we propose a family of collaborative distributed scheduling approaches (CDSAs) based on the Markov process to reduce the energy consumption of a WSN. The family of CDSAs comprises two approaches: a one-step collaborative distributed approach and a two-step collaborative distributed approach. The approaches enable nodes to learn the behavior information of their environment collaboratively and to integrate sleep scheduling with transmission scheduling to reduce energy consumption. We analyze the adaptability and practicality features of the CDSAs. The simulation results show that the two proposed approaches can effectively reduce nodes' energy consumption. Some other characteristics of the CDSAs, such as buffer occupation and packet delay, are also analyzed in this paper. We evaluate the CDSAs extensively on a 15-node WSN testbed. The test results show that the CDSAs conserve energy effectively and are feasible for real WSNs. PMID:22408491
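To make the Markov-based sleep-scheduling idea concrete, here is a minimal sketch in which a node learns channel-state transition probabilities from observed slots and sleeps when an idle run is predicted. The two-state model, the Laplace smoothing and the threshold are assumptions for illustration; this is not the paper's CDSA algorithm:

```python
class MarkovSleepScheduler:
    """Toy node-side scheduler built on a two-state Markov chain."""

    def __init__(self):
        # counts[s][t] = observed transitions from state s to state t,
        # Laplace-smoothed with one pseudo-count per transition.
        self.counts = {"IDLE": {"IDLE": 1, "BUSY": 1},
                       "BUSY": {"IDLE": 1, "BUSY": 1}}
        self.state = "IDLE"

    def observe(self, new_state):
        # Update the transition count for (previous state -> new state).
        self.counts[self.state][new_state] += 1
        self.state = new_state

    def p(self, s, t):
        # Estimated transition probability P(t | s).
        row = self.counts[s]
        return row[t] / (row["IDLE"] + row["BUSY"])

    def should_sleep(self, threshold=0.8):
        # Sleep when the chain predicts the channel will stay idle.
        return self.state == "IDLE" and self.p("IDLE", "IDLE") >= threshold

# Feed the scheduler a mostly idle slot trace, then query the sleep decision.
sched = MarkovSleepScheduler()
for s in ["IDLE"] * 18 + ["BUSY", "IDLE"]:
    sched.observe(s)
print(sched.should_sleep())
```

In a real WSN the learned chain would also feed the transmission schedule, so that sleep intervals and forwarding slots are chosen together, as the abstract describes.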
Coarse cluster enhancing collaborative recommendation for social network systems
NASA Astrophysics Data System (ADS)
Zhao, Yao-Dong; Cai, Shi-Min; Tang, Ming; Shang, Min-Sheng
2017-10-01
Traditional collaborative-filtering-based recommender systems for social network systems place very high demands on time complexity because they compute similarities between all pairs of users via resource usages and annotation actions, which strongly suppresses recommendation speed. In this paper, to overcome this drawback, we propose a novel approach, namely coarse clustering, which partitions similar users and associated items at high speed to enhance user-based collaborative filtering, and then develop a fast collaborative user model for social tagging systems. The experimental results based on the Delicious dataset show that the proposed model is able to reduce the processing-time cost by more than 90% and relatively improve the accuracy in comparison with ordinary user-based collaborative filtering, and that it is robust to the initial parameter. Most importantly, the proposed model can be conveniently extended by introducing more user information (e.g., profiles) and can be practically applied to large-scale social network systems to enhance recommendation speed without accuracy loss.
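The claimed speed-up comes from computing user similarities only inside a coarse partition rather than over all user pairs. The following toy sketch shows that idea; the rating data, the bucket-by-top-item partition and the cosine measure are illustrative assumptions, not the paper's exact coarse-cluster method:

```python
from math import sqrt

# Toy user-item rating matrix (outer keys: users, inner keys: items).
ratings = {
    "u1": {"a": 5, "b": 4, "c": 1},
    "u2": {"a": 5, "b": 3, "c": 1},
    "u3": {"a": 1, "b": 1, "c": 5},
    "u4": {"a": 1, "b": 2, "c": 4},
}

def cosine(r1, r2):
    """Cosine similarity over the items both users rated."""
    common = set(r1) & set(r2)
    num = sum(r1[i] * r2[i] for i in common)
    den = sqrt(sum(v * v for v in r1.values())) * \
          sqrt(sum(v * v for v in r2.values()))
    return num / den if den else 0.0

# Coarse partition: bucket users by their top-rated item. This crude rule
# only needs to group broadly similar users, so that pairwise similarity
# is computed per cluster rather than over all user pairs.
clusters = {}
for user, r in ratings.items():
    clusters.setdefault(max(r, key=r.get), []).append(user)

def neighbors(user):
    """Candidate neighbors: cluster-mates only, not the whole user base."""
    top = max(ratings[user], key=ratings[user].get)
    return [u for u in clusters[top] if u != user]

for u in neighbors("u1"):
    print(u, round(cosine(ratings["u1"], ratings[u]), 3))
```

With U users split into k roughly balanced clusters, the pairwise-similarity work drops from order U^2 comparisons to about U^2/k, which is the generic source of the kind of speed-up the abstract reports.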
2014-09-01
This work presents a decision-making framework to eliminate bias and promote effective communication, using a collaborative approach built on systems engineering and decision-making principles.
Partnering: An Engine for Innovation - Continuum Magazine | NREL
Schroeder, NREL: Collaborative research truly is an engine for innovation. The Department of Energy's (DOE) National Renewable Energy Laboratory (NREL) engages in research with public and private partners through cooperative agreements, as in the case of NREL and HP: NREL set requirements, and the lab and HP then collaborated.
ERIC Educational Resources Information Center
Chen, Chung-Yang; Teng, Kao-Chiuan
2011-01-01
This paper presents a computerized tool support, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…
Implementing Assessment Engineering in the Uniform Certified Public Accountant (CPA) Examination
ERIC Educational Resources Information Center
Burke, Matthew; Devore, Richard; Stopek, Josh
2013-01-01
This paper describes efforts to bring principled assessment design to a large-scale, high-stakes licensure examination by employing the frameworks of Assessment Engineering (AE), the Revised Bloom's Taxonomy (RBT), and Cognitive Task Analysis (CTA). The Uniform CPA Examination is practice-oriented and focuses on the skills of accounting. In…
Data Management Practices and Perspectives of Atmospheric Scientists and Engineering Faculty
ERIC Educational Resources Information Center
Wiley, Christie; Mischo, William H.
2016-01-01
This article analyzes 21 in-depth interviews of engineering and atmospheric science faculty at the University of Illinois Urbana-Champaign (UIUC) to determine faculty data management practices and needs within the context of their research activities. A detailed literature review of previous large-scale and institutional surveys and interviews…
Can Models Capture the Complexity of the Systems Engineering Process?
NASA Astrophysics Data System (ADS)
Boppana, Krishna; Chow, Sam; de Weck, Olivier L.; Lafon, Christian; Lekkakos, Spyridon D.; Lyneis, James; Rinaldi, Matthew; Wang, Zhiyong; Wheeler, Paul; Zborovskiy, Marat; Wojcik, Leonard A.
Many large-scale, complex systems engineering (SE) programs have been problematic; a few examples are listed below (Bar-Yam, 2003; Cullen, 2004), and many others have been late, well over budget, or have failed: the Hilton/Marriott/American Airlines system for hotel reservations and flights, 1988-1992, 125 million, "scrapped"
Incorporating Learning Theory into Existing Systems Engineering Models
2013-09-01
Front-matter and table-of-contents residue from the thesis; recoverable details: thesis by Valentine Leo, September 2013, Thesis Advisor Gary O. Langford, Co-Advisor listed; Table 1 classifies learning theories as behaviorism, cognitivism, constructivism, and connectivism; cited works include an introduction to the design of large-scale systems (McGraw-Hill) and Grusec, J. (1992) on social learning theory and developmental psychology.
Influence of small-scale disturbances by kangaroo rats on Chihuahuan Desert ants
R. L. Schooley; B. T. Bestelmeyer; J. F. Kelly
2000-01-01
Banner-tailed kangaroo rats (Dipodomys spectabilis) are prominent ecosystem engineers that build large mounds that influence the spatial structuring of fungi, plants, and some ground-dwelling animals. Ants are diverse and functionally important components of arid ecosystems; some species are also ecosystem engineers. We investigated the effects of...
Human Centered Hardware Modeling and Collaboration
NASA Technical Reports Server (NTRS)
Stambolian Damon; Lawrence, Brad; Stelges, Katrine; Henderson, Gena
2013-01-01
In order to collaborate on engineering designs among NASA Centers and customers, and to include hardware and human activities from multiple remote locations, live human-centered modeling and collaboration across several sites has been successfully facilitated by Kennedy Space Center. The focus of this paper includes innovative approaches to engineering design analyses and training, along with research being conducted to apply new technologies for tracking, immersing, and evaluating humans as well as rocket, vehicle, component, or facility hardware, utilizing high-resolution cameras, motion tracking, ergonomic analysis, biomedical monitoring, work instruction integration, head-mounted displays, and other innovative human-system integration modeling, simulation, and collaboration applications.
Liu, Xiaonan; Ding, Wentao; Jiang, Huifeng
2017-07-19
Plant natural products (PNPs) are widely used as pharmaceuticals, nutraceuticals, seasonings, pigments, etc., with huge commercial value on the global market. However, most of these PNPs are still being extracted from plants. A resource-conserving and environment-friendly synthesis route for PNPs that utilizes microbial cell factories has attracted increasing attention since the 1940s. However, at present only a handful of PNPs are being produced by microbial cell factories at an industrial scale, and there are still many challenges in their large-scale application. One of the challenges is that most biosynthetic pathways of PNPs are still unknown, which largely limits the number of candidate PNPs for heterologous microbial production. Another challenge is that the metabolic fluxes toward the target products in microbial hosts are often hindered by poor precursor supply, low catalytic activity of enzymes and obstructed product transport. Consequently, despite intensive studies on the metabolic engineering of microbial hosts, the fermentation costs of most heterologously produced PNPs are still too high for industrial-scale production. In this paper, we review several aspects of PNP production in microbial cell factories, including important design principles and recent progress in pathway mining and metabolic engineering. In addition, implemented cases of industrial-scale production of PNPs in microbial cell factories are also highlighted.
The innovative medicines initiative: a European response to the innovation challenge.
Goldman, M
2012-03-01
The Innovative Medicines Initiative (IMI) was launched in 2008 as a large-scale public-private partnership between the European Commission and the European Federation of Pharmaceutical Industries and Associations (EFPIA). With a total budget of €2 billion, the IMI aims to boost the development of new medicines across Europe by implementing new collaborative endeavors between large pharmaceutical companies and other key actors in the health-care ecosystem, i.e., academic institutions, small and medium enterprises, patients, and regulatory authorities. Projects conducted by IMI consortia have already delivered meaningful results, providing proof-of-concept evidence for the efficiency of this new model of collaboration. In this article we review recent achievements of the IMI consortia and discuss the growing interest in the IMI as a best-practice model to reinvigorate drug development.
ERIC Educational Resources Information Center
Awsumb, Jessica M.
2017-01-01
This study examines post-school outcomes of youth with disabilities that were served by the Illinois vocational rehabilitation (VR) agency while in Chicago Public Schools (CPS) through a mixed methodology research design. In order to understand how outcomes differ among the study population, a large-scale dataset of the employment outcomes of…
Learning a Living: First Results of the Adult Literacy and Life Skills Survey
ERIC Educational Resources Information Center
OECD Publishing (NJ1), 2005
2005-01-01
The Adult Literacy and Life Skills Survey (ALL) is a large-scale co-operative effort undertaken by governments, national statistics agencies, research institutions and multi-lateral agencies. The development and management of the study were co-ordinated by Statistics Canada and the Educational Testing Service (ETS) in collaboration with the…
School Mental Health: The Impact of State and Local Capacity-Building Training
ERIC Educational Resources Information Center
Stephan, Sharon; Paternite, Carl; Grimm, Lindsey; Hurwitz, Laura
2014-01-01
Despite a growing number of collaborative partnerships between schools and community-based organizations to expand school mental health (SMH) service capacity in the United States, there have been relatively few systematic initiatives focused on key strategies for large-scale SMH capacity building with state and local education systems. Based on a…
Student Performance and Attitudes in a Collaborative and Flipped Linear Algebra Course
ERIC Educational Resources Information Center
Murphy, Julia; Chang, Jen-Mei; Suaray, Kagba
2016-01-01
Flipped learning is gaining traction in K-12 for enhancing students' problem-solving skills at an early age; however, there is relatively little large-scale research showing its effectiveness in promoting better learning outcomes in higher education, especially in mathematics classes. In this study, we examined the data compiled from both…
Comparing Learner Community Behavior in Multiple Presentations of a Massive Open Online Course
ERIC Educational Resources Information Center
Gallagher, Silvia Elena; Savage, Timothy
2015-01-01
Massive Online Open Courses (MOOCs) can create large scale communities of learners who collaborate, interact and discuss learning materials and activities. MOOCs are often delivered multiple times with similar content to different cohorts of learners. However, research into the differences of learner communication, behavior and expectation between…
Comparing Learner Community Behavior in Multiple Presentations of a Massive Open Online Course
ERIC Educational Resources Information Center
Gallagher, Silvia Elena; Savage, Timothy
2016-01-01
Massive Online Open Courses (MOOCs) can create large scale communities of learners who collaborate, interact and discuss learning materials and activities. MOOCs are often delivered multiple times with similar content to different cohorts of learners. However, research into the differences of learner communication, behavior and expectation between…
ERIC Educational Resources Information Center
Hunter, Darryl; Gambell, Trevor; Randhawa, Bikkar
2005-01-01
Because of its centrality to school success, social status, and workplace effectiveness, oral and aural skills development has been increasingly emphasized in Canadian curricula, classrooms and, very recently, large-scale assessment. The corresponding emphasis on group processes and collaborative learning has aimed to address equity issues in…
A Strong Future for Public Library Use and Employment
ERIC Educational Resources Information Center
Griffiths, Jose-Marie; King, Donald W.
2011-01-01
The latest and most comprehensive assessment of public librarians' education and career paths to date, this important volume reports on a large-scale research project performed by authors Jose-Marie Griffiths and Donald W. King. Presented in collaboration with the Office for Research and Statistics (ORS), the book includes an examination of trends…
ERIC Educational Resources Information Center
Ee, J.; Wang, C.; Koh, C.; Tan, O.; Liu, W.
2009-01-01
In 2000, the Singapore Ministry of Education launched Project Work (PW) to encourage the application of knowledge across disciplines, and to develop thinking, communication, collaboration and metacognitive skills. This preliminary findings of a large scale study examines the role of goal orientations (achievement goals and social goals) in…
Postgraduate Coursework in Australia: Issues Emerging from University and Industry Collaboration
ERIC Educational Resources Information Center
Forsyth, H.; Laxton, R.; Moran, C.; van der werf, J.; Banks, R.; Taylor, R.
2009-01-01
Coursework masters degrees in Australia have experienced rapid, decentralised growth since deregulation at the end of the 1980s. The result is an extraordinarily high level of diversity and some confusion as to standards, strategic positioning, purpose and educational approaches. Throughout this period of growth, a sense that large-scale (often…
Engineering large cartilage tissues using dynamic bioreactor culture at defined oxygen conditions.
Daly, Andrew C; Sathy, Binulal N; Kelly, Daniel J
2018-01-01
Mesenchymal stem cells maintained in appropriate culture conditions are capable of producing robust cartilage tissue. However, gradients in nutrient availability that arise during three-dimensional culture can result in the development of spatially inhomogeneous cartilage tissues with core regions devoid of matrix. Previous attempts at developing dynamic culture systems to overcome these limitations have reported suppression of mesenchymal stem cell chondrogenesis compared to static conditions. We hypothesize that by modulating oxygen availability during bioreactor culture, it is possible to engineer cartilage tissues of scale. The objective of this study was to determine whether dynamic bioreactor culture, at defined oxygen conditions, could facilitate the development of large, spatially homogeneous cartilage tissues using mesenchymal stem cell-laden hydrogels. A dynamic culture regime was directly compared to static conditions for its capacity to support chondrogenesis of mesenchymal stem cells in both small and large alginate hydrogels. The influence of external oxygen tension on the response to the dynamic culture conditions was explored by performing the experiment at 20% O2 and 3% O2. At 20% O2, dynamic culture significantly suppressed chondrogenesis in engineered tissues of all sizes. In contrast, at 3% O2 dynamic culture significantly enhanced the distribution and amount of cartilage matrix components (sulphated glycosaminoglycan and collagen II) in larger constructs compared to static conditions. Taken together, these results demonstrate that dynamic culture regimes that provide adequate nutrient availability and a low oxygen environment can be employed to engineer large homogeneous cartilage tissues. Such culture systems could facilitate the scaling up of cartilage tissue engineering strategies towards clinically relevant dimensions.
Evaluating a collaborative IT based research and development project.
Khan, Zaheer; Ludlow, David; Caceres, Santiago
2013-10-01
In common with all projects, evaluating an Information Technology (IT) based research and development project is necessary in order to discover whether or not the outcomes of the project are successful. However, evaluating large-scale collaborative projects is especially difficult as: (i) stakeholders from different countries are involved who, almost inevitably, have diverse technological and/or application domain backgrounds and objectives; (ii) multiple and sometimes conflicting application-specific and user-defined requirements exist; and (iii) multiple and often conflicting technological research and development objectives are apparent. In this paper, we share our experiences based on a large-scale integrated research project - the HUMBOLDT project - with a project duration of 54 months, involving contributions from 27 partner organisations plus 4 sub-contractors from 14 different European countries. In the HUMBOLDT project, a specific evaluation methodology was defined and utilised for the user evaluation of the project outcomes. The user evaluation performed on the HUMBOLDT Framework and its associated nine application scenarios from various application domains not only resulted in an evaluation of the integrated project, but also revealed the benefits and disadvantages of the evaluation methodology. This paper presents the evaluation methodology, discusses in detail the process of applying it to the HUMBOLDT project and provides an in-depth analysis of the results, which can be usefully applied to other collaborative research projects in a variety of domains. Copyright © 2013 Elsevier Ltd. All rights reserved.
Overview of Low-Speed Aerodynamic Tests on a 5.75% Scale Blended-Wing-Body Twin Jet Configuration
NASA Technical Reports Server (NTRS)
Vicroy, Dan D.; Dickey, Eric; Princen, Norman; Beyar, Michael D.
2016-01-01
The NASA Environmentally Responsible Aviation (ERA) Project sponsored a series of computational and experimental investigations of the propulsion and airframe integration issues associated with Hybrid-Wing-Body (HWB) or Blended-Wing-Body (BWB) configurations. NASA collaborated with Boeing Research and Technology (BR&T) to conduct this research on a new twin-engine Boeing BWB transport configuration. The experimental investigations involved a series of wind tunnel tests with a 5.75-percent scale model conducted in two low-speed wind tunnels. This testing focused on the basic aerodynamics of the configuration and selection of the leading edge Krueger slat position for takeoff and landing. This paper reviews the results and analysis of these low-speed wind tunnel tests.
NASA Astrophysics Data System (ADS)
Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.
2015-12-01
Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources, and data, and an online platform to support collaborative efforts. As the community (currently > 6000 active users, from an estimated community of comparable size) embeds the collaboratory's tools into educational and research workflows, it became imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage and enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development and redevelopment has been twofold: first, to use best practices in software engineering and new hardware such as multi-core processors and graphics processing units; second, to enhance capabilities that support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source licensing to facilitate community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven datasets, e.g., digital elevation models of topography, satellite imagery, and field observations on deposits. These data are often maintained in private repositories and privately shared by "sneaker-net". As a partial solution to this, we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g., Pegasus) to support the complex data and computing workflows needed for usage such as uncertainty quantification for hazard analysis using physical models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheraghi, S. Hossein; Madden, Frank
The goal of this collaborative effort between Western New England University's College of Engineering and FloDesign Wind Turbine (FDWT) Corporation was to work on a novel aerodynamic concept that could potentially lead to the next generation of wind turbines. Analytical studies and early scale-model tests of FDWT's Mixer/Ejector Wind Turbine (MEWT) concept, which exploits jet-age advanced fluid dynamics, indicate that the concept has the potential to significantly reduce the cost of electricity over conventional Horizontal Axis Wind Turbines while reducing land usage. This project involved the design, fabrication, and wind tunnel testing of components of the MEWT to provide the research and engineering data necessary to validate the design iterations and optimize system performance. Based on these tests, a scale-model prototype called Briza was designed, fabricated, installed and tested on a portable tower to investigate and improve the design system in real-world conditions. The results of these scale prototype efforts were very promising and have contributed significantly to FDWT's ongoing development of a product-scale wind turbine for deployment in multiple locations around the U.S. This research was mutually beneficial to Western New England University, FDWT, and the DOE by utilizing over 30 student interns and a number of faculty in all efforts. It brought real-world wind turbine experience into the classroom to further enhance the Green Engineering Program at WNEU. It also provided on-the-job training to many students, improving their future employment opportunities, while also providing valuable information to further advance FDWT's mixer-ejector wind turbine technology, creating opportunities for future project innovation and job creation.
1988-08-01
functional area in which one of the brothers was clearly in charge was engineering. Nat was the Chief Engineer largely because John was blind from the age of...work package that straddles a bulkhead during hot work on the bulkhead, knowing full well that later in time, zones that coincide with the...take the natural step of employing these concepts in large-scale repair work. Decreasing work the Marine Industry always fans the flames of the age
Quality Function Deployment for Large Systems
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1992-01-01
Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.
Using Computing and Data Grids for Large-Scale Science and Engineering
NASA Technical Reports Server (NTRS)
Johnston, William E.
2001-01-01
We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.
Leveraging human oversight and intervention in large-scale parallel processing of open-source data
NASA Astrophysics Data System (ADS)
Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.
2015-05-01
The popularity of cloud computing, along with the increased availability of cheap storage, has led to the need to process and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error-prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is done in chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of limited resources. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
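A toy sketch of the reprocessing-minimization challenge described above (the word-count job and all helper names are hypothetical, not the paper's system): when a human asynchronously corrects one input record, only the reduce keys touched by that record need to be recomputed, rather than re-running the whole job.

```python
from collections import defaultdict

# Hypothetical map-reduce word count with incremental human correction.
def map_fn(record):
    return [(w.lower(), 1) for w in record.split()]

def run(records):
    """Full map phase: bucket (record_id, value) pairs by key."""
    buckets = defaultdict(list)
    for rid, rec in records.items():
        for k, v in map_fn(rec):
            buckets[k].append((rid, v))
    return buckets

def reduce_all(buckets):
    return {k: sum(v for _, v in pairs) for k, pairs in buckets.items()}

def apply_correction(records, buckets, rid, new_rec):
    """A human fixes one record; re-map it and report only the stale keys."""
    stale = {k for k, pairs in buckets.items() if any(r == rid for r, _ in pairs)}
    stale |= {k for k, _ in map_fn(new_rec)}
    records[rid] = new_rec
    for k in stale:                                  # drop the old contributions
        buckets[k] = [(r, v) for r, v in buckets[k] if r != rid]
    for k, v in map_fn(new_rec):                     # add the corrected ones
        buckets[k].append((rid, v))
    return stale                                     # keys needing re-reduction

records = {1: "threat detected", 2: "no threat"}
buckets = run(records)
stale = apply_correction(records, buckets, 2, "no alert")
print(sorted(stale))  # only these keys are reduced again
```

Tracking which record produced each intermediate pair is what makes the chain of reprocessing both complete (every affected key is redone) and minimal (untouched keys are not).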
Factors influencing teamwork and collaboration within a tertiary medical center
Chien, Shu Feng; Wan, Thomas TH; Chen, Yu-Chih
2012-01-01
AIM: To understand how work climate and related factors influence teamwork and collaboration in a large medical center. METHODS: A survey of 3462 employees was conducted to generate responses to Sexton's Safety Attitudes Questionnaire (SAQ) to assess perceptions of the work environment via a series of five-point, Likert-scaled questions. Path analysis was performed, using teamwork (TW) and collaboration (CO) as endogenous variables. The exogenous variables are effective communication (EC), safety culture (SC), job satisfaction (JS), work pressure (WP), and work climate (WC). The measurement instruments for the variables or summated subscales are presented, and the reliability of each subscale is calculated. Cronbach's alpha coefficients are relatively strong: TW (0.81), CO (0.76), EC (0.70), SC (0.83), JS (0.91), WP (0.85), and WC (0.78). Confirmatory factor analysis was performed for each of these constructs. RESULTS: Path analysis enables us to identify statistically significant predictors of the two endogenous variables, teamwork and intra-organizational collaboration. Significant amounts of variance in perceived teamwork (R2 = 0.59) and in collaboration (R2 = 0.75) are accounted for by the predictor variables. In the initial model, safety culture is the most important predictor of perceived teamwork, with a β weight of 0.51, and work climate is the most significant predictor of collaboration, with a β weight of 0.84. After eliminating statistically insignificant causal paths and allowing correlated predictors, the revised model shows that work climate is the only predictor positively influencing both teamwork (β = 0.26) and collaboration (β = 0.88). A relatively weak positive (β = 0.14) but statistically significant relationship exists between teamwork and collaboration when the effects of other predictors are simultaneously controlled.
CONCLUSION: Hospital executives who are interested in improving collaboration should assess the work climate to ensure that employees are operating in a setting conducive to intra-organizational collaboration. PMID:25237612
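The subscale reliabilities quoted above (e.g., TW = 0.81) are Cronbach's alpha coefficients. The standard formula is α = k/(k−1) · (1 − Σ item variances / variance of the total score); the sketch below uses made-up illustrative responses, not the study's survey data.

```python
import numpy as np

# Cronbach's alpha for a summated subscale of k Likert items.
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)       # respondents x items
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of each respondent's total
    return k / (k - 1) * (1 - item_var / total_var)

# illustrative five-point responses from 5 respondents to a 3-item subscale
scores = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]]
print(round(cronbach_alpha(scores), 2))  # → 0.92
```

Values near the study's 0.76-0.91 range indicate that the items of each subscale move together, i.e., they plausibly measure one underlying construct.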
A Word to the Wise: Advice for Scientists Engaged in Collaborative Adaptive Management
NASA Astrophysics Data System (ADS)
Hopkinson, Peter; Huber, Ann; Saah, David S.; Battles, John J.
2017-05-01
Collaborative adaptive management is a process for making decisions about the environment in the face of uncertainty and conflict. Scientists have a central role to play in these decisions. However, while scientists are well trained to reduce uncertainty by discovering new knowledge, most lack experience with the means to mitigate conflict in contested situations. To address this gap, we drew from our efforts coordinating a large collaborative adaptive management effort, the Sierra Nevada Adaptive Management Project, to offer advice to our fellow environmental scientists. Key challenges posed by collaborative adaptive management include the confusion caused by multiple institutional cultures, the need to provide information at management-relevant scales, frequent turnover in participants, fluctuations in enthusiasm among key constituencies, and diverse definitions of success among partners. Effective strategies included a dedication to consistency, a commitment to transparency, the willingness to communicate frequently via multiple forums, and the capacity for flexibility. Collaborative adaptive management represents a promising, new model for scientific engagement with the public. Learning the lessons of effective collaboration in environmental management is an essential task to achieve the shared goal of a sustainable future.
Large scale static tests of a tilt-nacelle V/STOL propulsion/attitude control system
NASA Technical Reports Server (NTRS)
1978-01-01
The concept of a combined V/STOL propulsion and aircraft attitude control system was subjected to large scale engine tests. The tilt nacelle/attitude control vane package consisted of the T55 powered Hamilton Standard Q-Fan demonstrator. Vane forces, moments, thermal and acoustic characteristics as well as the effects on propulsion system performance were measured under conditions simulating hover in and out of ground effect.
Modal Testing of the NPSAT1 Engineering Development Unit
2012-07-01
I declare that I produced this Master's thesis independently, using only the cited sources and aids...logarithmic scale. As Figure 2 shows, natural frequencies are indicated by large values of the first CMIF (peaks), and multiple modes can be detected by...structure's behavior. Ewins even states "that no large-scale modal test should be permitted to proceed until some preliminary SDOF analyses have
Measuring the quality of interprofessional collaboration in child mental health collaborative care
Rousseau, Cécile; Laurin-Lamothe, Audrey; Nadeau, Lucie; Deshaies, Suzanne; Measham, Toby
2012-01-01
Objective This pilot study examines the potential utility of the Perception of Interprofessional Collaboration Model and the shared decision-making scales in evaluating the quality of partnership in child mental health collaborative care. Methods Ninety-six primary care professionals working with children and youth responded to an internet survey which included the Perception of Interprofessional Collaboration Model scale (PINCOM-Q) and an adapted version of a shared decision-making scale (Échelle de confort décisionnel, partenaire—ECD-P). The perceptions of child mental health professionals were compared with those of other professionals working with children. Results The PINCOM-Q and the ECD-P scales had an excellent internal consistency and they were moderately correlated. Child mental health professionals’ Individual Interprofessional Collaboration scores from the PINCOM-Q individual aspects subscale were better than that of other child professionals. Conclusion These scales may be interesting instruments to measure the quality of partnership in child mental health collaborative care settings. Research needs to replicate these findings and to determine whether the quality of collaboration is a predictor of mental health outcome. PMID:22371692
NASA Astrophysics Data System (ADS)
Duan, Pengfei; Lei, Wenping
2017-11-01
A number of disciplines (mechanics, structures, thermal, and optics) are needed to design and build a space camera. Separate design models are normally constructed by each discipline using its own CAD/CAE tools. Design and analysis are conducted largely in parallel, subject to the requirements levied on each discipline, and technical interaction between the disciplines is limited and infrequent. As a result, a unified view of the space camera design across discipline boundaries is not directly possible in this approach, and generating one would require a lengthy, manual, and error-prone process. A collaborative environment built on abstract models and performance templates allows engineering data and CAD/CAE results to be shared across these discipline boundaries within a common interface, helping to attain rapid multidisciplinary design and to directly evaluate optical performance under environmental loads. A small interdisciplinary engineering team from the Beijing Institute of Space Mechanics and Electricity has recently conducted a Structural/Thermal/Optical (STOP) analysis of a space camera with this collaborative environment. A STOP analysis evaluates the changes in image quality that arise from structural deformations as the thermal environment of the camera changes throughout its orbit. STOP analyses were conducted for four different test conditions applied during final thermal vacuum (TVAC) testing of the payload on the ground. The STOP simulation process begins with importing an integrated CAD model of the camera geometry into the collaborative environment, within which: 1. Independent thermal and structural meshes are generated. 2. The thermal mesh and relevant engineering data for material properties and thermal boundary conditions are used to compute temperature distributions at nodal points in both the thermal and structural meshes through Thermal Desktop, a COTS thermal design and analysis code. 3. Thermally induced structural deformations of the camera are evaluated in Nastran, an industry-standard code for structural design and analysis. 4. Thermal and structural results are imported into SigFit, another COTS tool that computes deformation and best-fit rigid-body displacements for the optical surfaces. 5. SigFit creates a modified optical prescription that is imported into CODE V for evaluation of optical performance impacts. The integrated STOP analysis was validated using TVAC test data. For the four TVAC tests, the relative errors between simulated and measured temperatures at the measurement points were around 5%, and in some test conditions as low as 1%. For image quality (MTF), the relative error between simulation and test was 8.3% in the worst condition; all others were below 5%. The validation showed that the collaborative design and simulation environment can perform the integrated STOP analysis of a space camera efficiently. Furthermore, the collaborative environment allows an interdisciplinary analysis that formerly might have taken several months to be completed in two or three weeks, which is well suited to concept demonstration in the early stages of a project.
Compiling and using input-output frameworks through collaborative virtual laboratories.
Lenzen, Manfred; Geschke, Arne; Wiedmann, Thomas; Lane, Joe; Anderson, Neal; Baynes, Timothy; Boland, John; Daniels, Peter; Dey, Christopher; Fry, Jacob; Hadjikakou, Michalis; Kenway, Steven; Malik, Arunima; Moran, Daniel; Murray, Joy; Nettleton, Stuart; Poruschi, Lavinia; Reynolds, Christian; Rowley, Hazel; Ugon, Julien; Webb, Dean; West, James
2014-07-01
Compiling, deploying and utilising large-scale databases that integrate environmental and economic data have traditionally been labour- and cost-intensive processes, hindered by the large amount of disparate and misaligned data that must be collected and harmonised. The Australian Industrial Ecology Virtual Laboratory (IELab) is a novel, collaborative approach to compiling large-scale environmentally extended multi-region input-output (MRIO) models. The utility of the IELab product is greatly enhanced by avoiding the need to lock in an MRIO structure at the time the MRIO system is developed. The IELab advances the idea of the "mother-daughter" construction principle, whereby a regionally and sectorally very detailed "mother" table is set up, from which "daughter" tables are derived to suit specific research questions. By introducing a third tier - the "root classification" - IELab users are able to define their own mother-MRIO configuration, at no additional cost in terms of data handling. Customised mother-MRIOs can then be built, which maximise disaggregation in aspects that are useful to a family of research questions. The second innovation in the IELab system is to provide a highly automated collaborative research platform in a cloud-computing environment, greatly expediting workflows and making these computational benefits accessible to all users. Combining these two aspects realises many benefits. The collaborative nature of the IELab development project allows significant savings in resources. Timely deployment is possible by coupling automation procedures with the comprehensive input from multiple teams. User-defined MRIO tables, coupled with high performance computing, mean that MRIO analysis will be useful and accessible for a great many more research applications than would otherwise be possible. 
By ensuring that a common set of analytical tools, such as those for hybrid life-cycle assessment, is adopted, the IELab will facilitate the harmonisation of fragmented, dispersed and misaligned raw data for the benefit of all interested parties. Copyright © 2014 Elsevier B.V. All rights reserved.
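The MRIO analysis the IELab supports ultimately rests on the standard Leontief quantity model: gross output x satisfies x = (I − A)⁻¹ y for a technical-coefficient matrix A and final demand y, and an environmental extension multiplies by a direct-intensity vector f. A minimal NumPy sketch with toy numbers (the matrices and values below are illustrative assumptions, not IELab data):

```python
import numpy as np

# Toy 3-sector technical-coefficient matrix A: entry A[i, j] is the input
# from sector i required per unit of sector j's output (illustrative values).
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.20],
              [0.05, 0.10, 0.10]])
y = np.array([100.0, 50.0, 80.0])   # final demand by sector
f = np.array([0.3, 0.8, 0.1])       # direct emission intensity per unit output

L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse (I - A)^{-1}
x = L @ y                           # gross output required: x = L @ y
footprint = float(f @ x)            # total (direct + indirect) emissions
```

Because L = I + A + A² + … accumulates all indirect intermediate use, the embodied footprint f·x always exceeds the direct footprint f·y; disaggregating A and y is exactly where the "mother-daughter" table construction matters.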
Reynolds, Arthur J; Hayakawa, Momoko; Ou, Suh-Ruu; Mondi, Christina F; Englund, Michelle M; Candee, Allyson J; Smerillo, Nicole E
2017-09-01
We describe the development, implementation, and evaluation of a comprehensive preschool to third grade prevention program with the goal of sustaining services at a large scale. The Midwest Child-Parent Center (CPC) Expansion is a multilevel collaborative school reform model designed to improve school achievement and parental involvement from ages 3 to 9. By increasing the dosage, coordination, and comprehensiveness of services, the program is expected to enhance the transition to school and promote more enduring effects on well-being in multiple domains. We review and evaluate evidence from two longitudinal studies (Midwest CPC, 2012 to present; Chicago Longitudinal Study, 1983 to present) and four implementation examples of how the guiding principles of shared ownership, committed resources, and progress monitoring for improvement can promote effectiveness. The implementation system of partners and further expansion using "Pay for Success" financing shows the feasibility of scaling the program while continuing to improve effectiveness. © 2017 The Authors. Child Development published by Wiley Periodicals, Inc. on behalf of Society for Research in Child Development.
Science Diplomacy in Large International Collaborations
NASA Astrophysics Data System (ADS)
Barish, Barry C.
2011-04-01
What opportunities and challenges does the rapidly growing internationalization of science, especially large scale science and technology projects, present for US science policy? On one hand, the interchange of scientists, the sharing of technology and facilities and the working together on common scientific goals promotes better understanding and better science. On the other hand, challenges are presented, because the science cannot be divorced from government policies, and solutions must be found for issues varying from visas to making reliable international commitments.
Systems engineering implementation in the preliminary design phase of the Giant Magellan Telescope
NASA Astrophysics Data System (ADS)
Maiten, J.; Johns, M.; Trancho, G.; Sawyer, D.; Mady, P.
2012-09-01
Like many telescope projects today, the 24.5-meter Giant Magellan Telescope (GMT) is truly a complex system. The primary and secondary mirrors of the GMT are segmented and actuated to support two operating modes: natural seeing and adaptive optics. GMT is a general-purpose telescope supporting multiple science instruments operated in those modes. GMT is a large, diverse collaboration whose development involves geographically distributed teams. Implementing good systems engineering processes to manage the development of a system like GMT is therefore imperative. Managing the requirements flow-down from the science requirements to the component-level requirements is an inherently difficult task in itself. The interfaces must also be negotiated so that the interactions between subsystems and assemblies are well defined and controlled. This paper will provide an overview of the systems engineering processes and tools implemented for the GMT project during the preliminary design phase. This will include requirements management, documentation and configuration control, interface development and technical risk management. Because of the complexity of the GMT system and the distributed team, using web-accessible tools for collaboration is vital. To accomplish this, GMTO has selected three tools: Cognition Cockpit, Xerox Docushare, and Solidworks Enterprise Product Data Management (EPDM). Key to this is the use of Cockpit for managing and documenting the product tree, architecture, error budget, requirements, interfaces, and risks. Additionally, drawing management is accomplished using an EPDM vault. Docushare, a documentation and configuration management tool, is used to manage the workflow of documents and drawings for the GMT project. These tools electronically facilitate collaboration in real time, enabling the GMT team to track, trace and report on key project metrics and design parameters.
Examining the Role of Collaborative Learning in a Public Speaking Course
ERIC Educational Resources Information Center
Liao, Hsiang-Ann
2014-01-01
Collaborative learning has been found to benefit students in various disciplines. Moreover, in the science, technology, engineering, and mathematics literature, it was noted that minority students benefited the most from collaborative learning. Studies on the effects of collaborative learning in communication are limited. As a result, I examined…
NASA Astrophysics Data System (ADS)
Choomlucksana, Juthamas; Doolen, Toni L.
2017-11-01
The use of collaborative activities and simulation sessions in engineering education has been explored previously. However, few studies have investigated the relationship of these types of teaching innovations with other learner characteristics, such as self-efficacy and background knowledge. This study explored the effects of collaborative activities and simulation sessions on learning and the relationships between self-efficacy beliefs, background knowledge, and learning. Data were collected from two different terms in an upper division engineering course entitled Lean Manufacturing Systems Engineering. Findings indicated that the impact of collaborative activities and simulation sessions appears to be different, depending on the concepts being taught. Simulation sessions were found to have a significant effect on self-efficacy beliefs, and background knowledge had a mixed effect on learning. Overall the results of this study highlight the complex set of relationships between classroom innovations, learner characteristics, and learning.
NASA Astrophysics Data System (ADS)
Klug Boonstra, S.
2017-12-01
With the advent and widespread adoption of virtual connectivity, it is possible for scientists, engineers, and other STEM professionals to reach every place the youth of America learn! Arizona State University's School of Earth and Space Exploration, in planned collaboration with national STEM organizations, agencies, and education partners, is proposing a bold, collaborative, national model that will better enable STEM professionals of all disciplines to meet the needs of their audiences more effectively and efficiently. STEM subject matter experts (SMEs) can bring timely and authentic, real-world examples that engage and motivate learners in the conceptual learning journey presented through formal and informal curricula, while also providing a personal face and story of their STEM journey and experience. With over 6.2 million scientists and engineers, 55.6 million PreK-12 students, and 6.3 million community college students in the US, the possible reach, long-term impact, and benefits of the virtual, just-in-time interactions between SMEs, teachers, and students have the potential to provide the missing links of relevancy and real-world application that will engage learners and enhance STEM understanding at a higher, deeper level, while having the capacity to do this at a national scale. Providing professional development training for the SMEs will be an essential element in helping them to understand where their STEM work is relevant and appropriate within educational learning progressions. The vision for STEM Connect will be to prepare the STEM SMEs to share their expertise in a way that will show the dynamic and iterative nature of STEM research and design, helping them to bring their STEM expertise to formal and informal learners in a strategic and meaningful way.
Discussions with possible STEM Connect collaborators (e.g., national STEM member-based organizations, technology providers, federal agencies, and professional educational organizations) are underway to bring together a national design and implementation vision, start to build a collaborative team, and to look for funding mechanisms. We hope to empower this national pathway for STEM professionals to impact the way the next generation will understand and appreciate STEM's impact on our everyday lives.
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.
2015-01-01
This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.
ERIC Educational Resources Information Center
Deschênes, Jean-Sebastien; Barka, Noureddine; Michaud, Mario; Paradis, Denis; Brousseau, Jean
2013-01-01
A joint learning activity in process control is presented, in the context of a distance collaboration between engineering and technical-level students, in a similar fashion as current practices in the industry involving distance coordination and troubleshooting. The necessary infrastructure and the setup used are first detailed, followed by a…
ERIC Educational Resources Information Center
Stein, Otto R.; Schmalzbauer, Leah
2012-01-01
The Montana State University student chapter of Engineers Without Borders USA is a student-managed partnership with the people of Khwisero, Kenya. The primary mission, to bring potable water and clean sanitation facilities to 61 primary schools and the surrounding communities of Khwisero, necessitates a long-term commitment to collaboration and…
Learning about Friction: Group Dynamics in Engineering Students' Work with Free Body Diagrams
ERIC Educational Resources Information Center
Berge, Maria; Weilenmann, Alexandra
2014-01-01
In educational research, it is well-known that collaborative work on core conceptual issues in physics leads to significant improvements in students' conceptual understanding. In this paper, we explore collaborative learning in action, adding to previous research in engineering education with a specific focus on the students' use of free body…
Student Perspectives on the Flipped-Classroom Approach and Collaborative Problem-Solving Process
ERIC Educational Resources Information Center
Karabulut-Ilgu, Aliye; Yao, Suhan; Savolainen, Peter; Jahren, Charles
2018-01-01
The flipped-classroom approach has gained increasing popularity and interest in engineering education. The purpose of this study was to investigate (a) student perspectives on the flipped-classroom approach in a transportation-engineering course and (b) how students used the in-class time dedicated to collaborative problem solving. To this end,…
Software Engineering Research/Developer Collaborations in 2004 (C104)
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Markosian, Lawrance
2005-01-01
In 2004, six collaborations between software engineering technology providers and NASA software development personnel deployed a total of five software engineering technologies (for references, see Section 7.2) on the NASA projects. The main purposes were to benefit the projects, infuse the technologies if beneficial into NASA, and give feedback to the technology providers to improve the technologies. Each collaboration project produced a final report (for references, see Section 7.1). Section 2 of this report summarizes each project, drawing from the final reports and communications with the software developers and technology providers. Section 3 indicates paths to further infusion of the technologies into NASA practice. Section 4 summarizes some technology transfer lessons learned. Section 6 lists the acronyms used in this report.
Henrionnet, Christel; Dumas, Dominique; Hupont, Sébastien; Stoltz, Jean François; Mainard, Didier; Gillet, Pierre; Pinzano, Astrid
2017-01-01
In tissue engineering approaches, the quality of the substitute is a key element in determining its ability to treat cartilage defects. However, in clinical practice, evaluating the quality of a tissue-engineered cartilage substitute is not possible because the standard procedure to date, histology, is invasive. The aim of this work was to validate an innovative system, based on two-photon excitation adapted to an optical macroscope, for evaluating the collagen network in cartilage tissue-engineered substitutes at the macroscopic scale, in comparison with gold-standard histologic techniques and immunohistochemistry for visualizing type II collagen. The system made it possible to differentiate the quality of the collagen network between ITS and TGF-β1 treatments. Multiscale large-field imaging combined with multimodal approaches (SHG-TCSPC) at the macroscopic scale represents an innovative and non-invasive technique for monitoring the quality of the collagen network in cartilage tissue-engineered substitutes before in vivo implantation.
Jorm, Christine; Nisbet, Gillian; Roberts, Chris; Gordon, Christopher; Gentilcore, Stacey; Chen, Timothy F
2016-08-08
More and better interprofessional practice is predicted to be necessary to deliver good care to the patients of the future. However, universities struggle to create authentic learning activities that enable students to experience the dynamic interprofessional interactions common in healthcare and that can accommodate large interprofessional student cohorts. We investigated a large-scale mandatory interprofessional learning (IPL) activity for health professional students designed to promote social learning. A mixed methods research approach determined feasibility, acceptability and the extent to which student IPL outcomes were met. We developed an IPL activity founded in complexity theory to prepare students for future practice by engaging them in a self-directed (self-organised) learning activity with a diverse team, whose assessable products would be emergent creations. Complicated but authentic clinical cases (n = 12) were developed to challenge student teams (n = 5 or 6). Assessment consisted of a written management plan (academically marked) and a five-minute video (peer marked) designed to assess creative collaboration as well as provide evidence of integrated collective knowledge; the cohesive patient-centred management plan. All students (including the disciplines of diagnostic radiology, exercise physiology, medicine, nursing, occupational therapy, pharmacy, physiotherapy and speech pathology) completed all tasks successfully. Of the 26% of students who completed the evaluation survey, 70% agreed or strongly agreed that the IPL activity was worthwhile, and 87% agreed or strongly agreed that their case study was relevant. Thematic analysis found overarching themes of engagement and collaboration-in-action, suggesting that the IPL activity enabled students to achieve the intended learning objectives.
Students recognised the contribution of others and described negotiation, collaboration and creation of new collective knowledge after working together on the complicated patient case studies. The novel video assessment was challenging to many students and contextual issues limited engagement for some disciplines. We demonstrated the feasibility and acceptability of a large scale IPL activity where design of cases, format and assessment tasks was founded in complexity theory. This theoretically based design enabled students to achieve complex IPL outcomes relevant to future practice. Future research could establish the psychometric properties of assessments of student performance in large-scale IPL events.
Federated learning of predictive models from federated Electronic Health Records.
Brisimi, Theodora S; Chen, Ruidi; Mela, Theofanie; Olshevsky, Alex; Paschalidis, Ioannis Ch; Shi, Wei
2018-04-01
In an era of "big data," computationally efficient and privacy-aware solutions for large-scale machine learning problems become crucial, especially in the healthcare domain, where large amounts of data are stored in different locations and owned by different entities. Past research has focused on centralized algorithms, which assume the existence of a central data repository (database) that stores and can process the data from all participants. Such an architecture, however, can be impractical when data are not centrally located, does not scale well to very large datasets, and introduces single-point-of-failure risks which could compromise the integrity and privacy of the data. Given scores of data widely spread across hospitals/individuals, a decentralized, computationally scalable methodology is very much needed. We aim at solving a binary supervised classification problem to predict hospitalizations for cardiac events using a distributed algorithm. We seek to develop a general decentralized optimization framework enabling multiple data holders to collaborate and converge to a common predictive model, without explicitly exchanging raw data. We focus on the soft-margin l1-regularized sparse Support Vector Machine (sSVM) classifier. We develop an iterative cluster Primal Dual Splitting (cPDS) algorithm for solving the large-scale sSVM problem in a decentralized fashion. Such a distributed learning scheme is relevant for multi-institutional collaborations or peer-to-peer applications, allowing the data holders to collaborate while keeping every participant's data private. We test cPDS on the problem of predicting hospitalizations due to heart diseases within a calendar year based on information in the patients' Electronic Health Records prior to that year. cPDS converges faster than centralized methods at the cost of some communication between agents.
It also converges faster and with less communication overhead compared to an alternative distributed algorithm. In both cases, it achieves similar prediction accuracy measured by the Area Under the Receiver Operating Characteristic Curve (AUC) of the classifier. We extract important features discovered by the algorithm that are predictive of future hospitalizations, thus providing a way to interpret the classification results and inform prevention efforts. Copyright © 2018 Elsevier B.V. All rights reserved.
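The cPDS algorithm itself is not reproduced here; as a point of reference, the centralized objective it decentralizes, the soft-margin l1-regularized SVM, can be minimized with plain subgradient descent. A hedged NumPy sketch (the function name, hyperparameters, and toy data are illustrative assumptions, not the paper's method):

```python
import numpy as np

def ssvm_subgradient(X, y, lam=0.01, lr=0.5, n_iter=2000):
    """Minimize the l1-regularized soft-margin SVM objective
        (1/n) * sum_i max(0, 1 - y_i * (w @ x_i + b)) + lam * ||w||_1
    by subgradient descent with a diminishing step size. Labels y in {-1, +1}.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for t in range(n_iter):
        margins = y * (X @ w + b)
        viol = margins < 1.0                       # margin-violating points
        g_w = -(y[viol, None] * X[viol]).sum(axis=0) / n + lam * np.sign(w)
        g_b = -y[viol].sum() / n
        step = lr / np.sqrt(t + 1.0)               # diminishing step size
        w -= step * g_w
        b -= step * g_b
    return w, b

# Toy separable data: the label is the sign of the first feature.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = np.where(X[:, 0] > 0, 1.0, -1.0)
w, b = ssvm_subgradient(X, y)
accuracy = float(np.mean(np.sign(X @ w + b) == y))
```

The l1 penalty is what makes the classifier sparse; cPDS solves this same problem while the data rows X stay partitioned across institutions.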
Engineering a Large Scale Indium Nanodot Array for Refractive Index Sensing.
Xu, Xiaoqing; Hu, Xiaolin; Chen, Xiaoshu; Kang, Yangsen; Zhang, Zhiping; B Parizi, Kokab; Wong, H-S Philip
2016-11-23
In this work, we developed a simple method to fabricate 12 × 4 mm² large-scale nanostructure arrays and investigated the feasibility of indium nanodot (ND) arrays with different diameters and periods for refractive index sensing. Absorption resonances at multiple wavelengths from the visible to the near-infrared range were observed at various incident angles in a variety of media. By engineering the ND array with a centered square lattice, we successfully enhanced the sensitivity by 60% and improved the figure of merit (FOM) by 190%. The evolution of the resonance dips in the reflection spectra of the square and centered square lattices, from air to water, matches well with the results of Lumerical FDTD simulation. The improvement in sensitivity is due to the enhancement of the local electromagnetic field (E-field) near the NDs in the centered square lattice, as revealed by E-field simulation at the resonance wavelengths. The E-field is enhanced by coupling between the two square ND arrays with [Formula: see text]x period at phase matching. This work illustrates an effective way to engineer and fabricate a refractive index sensor at large scale. It is the first experimental demonstration of a poor-metal (indium) nanostructure array for refractive index sensing, and it demonstrates a centered square lattice giving higher sensitivity and serving as a better basic platform for more complex sensor designs.
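The sensitivity and FOM figures quoted above are, by the usual convention in refractive-index sensing (an assumption here, since the abstract does not define them), the resonance shift per unit index change and that shift normalized by the resonance linewidth:

```latex
% Standard refractive-index-sensing metrics (assumed conventions):
S = \frac{\Delta\lambda_{\mathrm{res}}}{\Delta n}\ \left[\mathrm{nm/RIU}\right],
\qquad
\mathrm{FOM} = \frac{S}{\mathrm{FWHM}}
```

Under these definitions a 60% larger S with a narrower resonance dip compounds into the reported 190% FOM improvement.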
NASA Technical Reports Server (NTRS)
Schlundt, D. W.
1976-01-01
The installed performance degradation of a swivel nozzle thrust deflector system at increased vectoring angles, observed during a large-scale test program, was investigated and reduced. Small-scale models were used to generate performance data for analyzing selected swivel nozzle configurations. A single-swivel nozzle design model with five different nozzle configurations and a twin-swivel nozzle design model, both scaled to 0.15 of the size of the large-scale test hardware, were statically tested at low exhaust pressure ratios of 1.4, 1.3, 1.2, and 1.1 and vectored at four nozzle positions from 0 deg (cruise) through 90 deg (vertical, used for the VTOL mode).
Establishment of a National Wind Energy Center at University of Houston
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Su Su
The DOE-supported project objectives are to: establish a national wind energy center (NWEC) at University of Houston and conduct research to address critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) introduce, through multi-disciplinary research, technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve the goals of the project, the following technical tasks were planned and executed during the period from April 15, 2010 to October 31, 2014 at the University of Houston: (1) Basic research on large offshore wind turbine systems; (2) Applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) Integration of offshore wind-turbine design, advanced materials and manufacturing technologies; (4) Integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) Education and training of graduate and undergraduate students and post-doctoral researchers; (6) Development of a national offshore wind turbine blade research facility. The research program addresses both basic science and engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation.
The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced design, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses and research, and through participation in various projects in the center's large multi-disciplinary research. These students and researchers are now employed by the wind industry, national labs and universities to support the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.
The influence of super-horizon scales on cosmological observables generated during inflation
NASA Astrophysics Data System (ADS)
Matarrese, Sabino; Musso, Marcello A.; Riotto, Antonio
2004-05-01
Using the techniques of out-of-equilibrium field theory, we study the influence of fluctuations corresponding today to scales much bigger than the present Hubble radius on the properties of cosmological perturbations generated during inflation on observable scales. We write the effective action for the coarse-grained inflaton perturbations, integrating out the sub-horizon modes, which manifest themselves as a coloured noise and lead to memory effects. Using the simple model of a scalar field with cubic self-interactions evolving in a fixed de Sitter background, we evaluate the two- and three-point correlation functions on observable scales. Our basic procedure shows that perturbations do preserve some memory of the super-horizon-scale dynamics, in the form of scale-dependent imprints in the statistical moments. In particular, we find a blue tilt of the power spectrum on large scales, in agreement with the recent results of the WMAP collaboration, which show a suppression of the lower multipoles in the cosmic microwave background anisotropies, and a substantial enhancement of the intrinsic non-Gaussianity on large scales.
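The link between a "blue tilt" and suppressed low multipoles follows from the standard definition of the spectral index (a convention assumed here, not spelled out in the abstract):

```latex
% Standard convention for the dimensionless power spectrum and its tilt:
\Delta^2(k) \propto k^{\,n_s - 1},
\qquad
n_s - 1 \equiv \frac{d\ln\Delta^2(k)}{d\ln k}
% A blue tilt (n_s > 1) means less power at small k (the largest scales),
% hence suppressed low CMB multipoles.
```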
ERIC Educational Resources Information Center
Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo
2014-01-01
The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…
ERIC Educational Resources Information Center
Djerassi, Carl
1972-01-01
Manipulation of genes in human beings on a large scale is not possible under present conditions because it lacks economic potential and other attractions for industry. However, "preventive" genetic engineering may be a field for vast research in the future and will perhaps be approved by governments, parishes, people and industry. (PS)
Singular value decomposition for collaborative filtering on a GPU
NASA Astrophysics Data System (ADS)
Kato, Kimikazu; Hosino, Tikara
2010-06-01
Collaborative filtering predicts customers' unknown preferences from their known preferences. In this computation, a singular value decomposition (SVD) is needed to reduce the size of a large-scale matrix so that the burden of the next computation phase is decreased. In this application, SVD means a rough approximate factorization of a given matrix into smaller matrices. Webb (a.k.a. Simon Funk) presented an effective algorithm for computing this SVD toward a solution of the open competition called the "Netflix Prize". The algorithm uses an iterative method in which the approximation error improves at each step. We give a GPU version of Webb's algorithm, implemented in CUDA and shown to be efficient by experiment.
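Webb's (Funk's) scheme is, in essence, stochastic gradient descent on the squared error of a low-rank factorization over the observed ratings only. The abstract does not give the update rule; the sketch below follows the widely known form (learning rate, regularization, and toy data are illustrative assumptions), in NumPy rather than CUDA:

```python
import numpy as np

def funk_svd(ratings, n_factors=2, lr=0.02, reg=0.02, n_epochs=500, seed=0):
    """Approximate the ratings matrix R ~= P @ Q.T by SGD over observed entries.

    ratings: iterable of (user, item, value) triples -- only the known entries.
    Returns factor matrices P (users x factors) and Q (items x factors).
    """
    rng = np.random.default_rng(seed)
    n_users = 1 + max(u for u, _, _ in ratings)
    n_items = 1 + max(i for _, i, _ in ratings)
    P = rng.normal(0.0, 0.1, (n_users, n_factors))
    Q = rng.normal(0.0, 0.1, (n_items, n_factors))
    for _ in range(n_epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]                  # residual on this entry
            # Simultaneous gradient steps with L2 shrinkage on both factors.
            P[u], Q[i] = (P[u] + lr * (err * Q[i] - reg * P[u]),
                          Q[i] + lr * (err * P[u] - reg * Q[i]))
    return P, Q

# Toy example: 3 users x 3 items, 6 observed ratings.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0),
           (1, 2, 1.0), (2, 1, 2.0), (2, 2, 5.0)]
P, Q = funk_svd(ratings)
mse = sum((P[u] @ Q[i] - r) ** 2 for u, i, r in ratings) / len(ratings)
```

The inner loop over observed triples is exactly the part the paper maps onto a GPU; the per-entry updates are small dot products, which is why the method scales to Netflix-sized matrices.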