Studies on combined model based on functional objectives of large scale complex engineering
NASA Astrophysics Data System (ADS)
Yuting, Wang; Jingchun, Feng; Jiabao, Sun
2018-03-01
As various functions were included in large scale complex engineering, and each function would be conducted with completion of one or more projects, combined projects affecting their functions should be located. Based on the types of project portfolio, the relationship of projects and their functional objectives were analyzed. On that premise, portfolio projects-technics based on their functional objectives were introduced, then we studied and raised the principles of portfolio projects-technics based on the functional objectives of projects. In addition, The processes of combined projects were also constructed. With the help of portfolio projects-technics based on the functional objectives of projects, our research findings laid a good foundation for management of large scale complex engineering portfolio management.
How Project Managers Really Manage: An Indepth Look at Some Managers of Large, Complex NASA Projects
NASA Technical Reports Server (NTRS)
Mulenburg, Gerald M.; Impaeilla, Cliff (Technical Monitor)
2000-01-01
This paper reports on a research study by the author that examined ten contemporary National Aeronautics and Space Administration (NASA) complex projects. In-depth interviews with the project managers of these projects provided qualitative data about the inner workings of the project and the methodologies used in establishing and managing the projects. The inclusion of a variety of space, aeronautics, and ground based projects from several different NASA research centers helped to reduce potential bias in the findings toward any one type of project, or technical discipline. The findings address the participants and their individual approaches. The discussion includes possible implications for project managers of other large, complex, projects.
Managing Risk and Uncertainty in Large-Scale University Research Projects
ERIC Educational Resources Information Center
Moore, Sharlissa; Shangraw, R. F., Jr.
2011-01-01
Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…
ERIC Educational Resources Information Center
Grabowski, Barbara L.; Koszalka, Tiffany A.
Combining assessment and research components on a large development and research project is a complex task. There are many descriptions of how either assessment or research should be conducted, but detailed examples illustrating integration of such strategies in complex projects are scarce. This paper provides definitions of assessment,…
Electronic construction collaboration system -- final phase : [tech transfer summary].
DOT National Transportation Integrated Search
2014-07-01
Construction projects have been growing more complex in terms of : project team composition, design aspects, and construction processes. : To help manage the shop/working drawings and requests for information : (RFIs) for its large, complex projects,...
NASA Astrophysics Data System (ADS)
Henkel, Daniela; Eisenhauer, Anton
2017-04-01
During the last decades, the number of large research projects has increased and therewith the requirement for multidisciplinary, multisectoral collaboration. Such complex and large-scale projects pose new competencies to form, manage, and use large, diverse teams as a competitive advantage. For complex projects the effort is magnified because multiple large international research consortia involving academic and non-academic partners, including big industries, NGOs, private and public bodies, all with cultural differences, individually discrepant expectations on teamwork and differences in the collaboration between national and multi-national administrations and research organisations, challenge the organisation and management of such multi-partner research consortia. How many partners are needed to establish and conduct collaboration with a multidisciplinary and multisectoral approach? How much personnel effort and what kinds of management techniques are required for such projects. This presentation identifies advantages and challenges of large research projects based on the experiences made in the context of an Innovative Training Network (ITN) project within Marie Skłodowska-Curie Actions of the European HORIZON 2020 program. Possible strategies are discussed to circumvent and avoid conflicts already at the beginning of the project.
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Malone, Linda
2007-01-01
It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
Projection rule for complex-valued associative memory with large constant terms
NASA Astrophysics Data System (ADS)
Kitahara, Michimasa; Kobayashi, Masaki
Complex-valued Associative Memory (CAM) has an inherent property of rotation invariance. Rotation invariance produces many undesirable stable states and reduces the noise robustness of CAM. Constant terms may remove rotation invariance, but if the constant terms are too small, rotation invariance does not vanish. In this paper, we eliminate rotation invariance by introducing large constant terms to complex-valued neurons. We have to make constant terms sufficiently large to improve the noise robustness. We introduce a parameter to control the amplitudes of constant terms into projection rule. The large constant terms are proved to be effective by our computer simulations.
ERIC Educational Resources Information Center
Veaner, Allen B.
Project BALLOTS is a large-scale library automation development project of the Stanford University Libraries which has demonstrated the feasibility of conducting on-line interactive searches of complex bibliographic files, with a large number of users working simultaneously in the same or different files. This report documents the continuing…
Ellinas, Christos; Allan, Neil; Durugbo, Christopher; Johansson, Anders
2015-01-01
Current societal requirements necessitate the effective delivery of complex projects that can do more while using less. Yet, recent large-scale project failures suggest that our ability to successfully deliver them is still at its infancy. Such failures can be seen to arise through various failure mechanisms; this work focuses on one such mechanism. Specifically, it examines the likelihood of a project sustaining a large-scale catastrophe, as triggered by single task failure and delivered via a cascading process. To do so, an analytical model was developed and tested on an empirical dataset by the means of numerical simulation. This paper makes three main contributions. First, it provides a methodology to identify the tasks most capable of impacting a project. In doing so, it is noted that a significant number of tasks induce no cascades, while a handful are capable of triggering surprisingly large ones. Secondly, it illustrates that crude task characteristics cannot aid in identifying them, highlighting the complexity of the underlying process and the utility of this approach. Thirdly, it draws parallels with systems encountered within the natural sciences by noting the emergence of self-organised criticality, commonly found within natural systems. These findings strengthen the need to account for structural intricacies of a project's underlying task precedence structure as they can provide the conditions upon which large-scale catastrophes materialise.
Some thoughts on the management of large, complex international space ventures
NASA Technical Reports Server (NTRS)
Lee, T. J.; Kutzer, Ants; Schneider, W. C.
1992-01-01
Management issues relevant to the development and deployment of large international space ventures are discussed with particular attention given to previous experience. Management approaches utilized in the past are labeled as either simple or complex, and signs of efficient management are examined. Simple approaches include those in which experiments and subsystems are developed for integration into spacecraft, and the Apollo-Soyuz Test Project is given as an example of a simple multinational approach. Complex approaches include those for ESA's Spacelab Project and the Space Station Freedom in which functional interfaces cross agency and political boundaries. It is concluded that individual elements of space programs should be managed by individual participating agencies, and overall configuration control is coordinated by level with a program director acting to manage overall objectives and project interfaces.
2016-04-30
also that we have started building in a domain where structural patterns matter, especially for large projects. Complex Systems Complexity has been...through minimalistic thinking and parsimony” and perceived elegance, which “hides systemic or organizational complexity from the user.” If the system
Kevin C. Vogler; Alan A. Ager; Michelle A. Day; Michael Jennings; John D. Bailey
2015-01-01
The implementation of US federal forest restoration programs on national forests is a complex process that requires balancing diverse socioecological goals with project economics. Despite both the large geographic scope and substantial investments in restoration projects, a quantitative decision support framework to locate optimal project areas and examine...
Anurag Srivastava; Joan Q. Wu; William J. Elliot; Erin S. Brooks
2015-01-01
The Water Erosion Prediction Project (WEPP) model, originally developed for hillslope and small watershed applications, simulates complex interactive processes influencing erosion. Recent incorporations to the model have improved the subsurface hydrology components for forest applications. Incorporation of channel routing has made the WEPP model well suited for large...
Feng, Cun-Fang; Xu, Xin-Jian; Wang, Sheng-Jun; Wang, Ying-Hai
2008-06-01
We study projective-anticipating, projective, and projective-lag synchronization of time-delayed chaotic systems on random networks. We relax some limitations of previous work, where projective-anticipating and projective-lag synchronization can be achieved only on two coupled chaotic systems. In this paper, we realize projective-anticipating and projective-lag synchronization on complex dynamical networks composed of a large number of interconnected components. At the same time, although previous work studied projective synchronization on complex dynamical networks, the dynamics of the nodes are coupled partially linear chaotic systems. In this paper, the dynamics of the nodes of the complex networks are time-delayed chaotic systems without the limitation of the partial linearity. Based on the Lyapunov stability theory, we suggest a generic method to achieve the projective-anticipating, projective, and projective-lag synchronization of time-delayed chaotic systems on random dynamical networks, and we find both its existence and sufficient stability conditions. The validity of the proposed method is demonstrated and verified by examining specific examples using Ikeda and Mackey-Glass systems on Erdos-Renyi networks.
Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems
NASA Astrophysics Data System (ADS)
Bourgine, P.; Johnson, J.
2009-04-01
The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible (< a few Euros). The research will create an in vivo laboratory of one to ten thousand postgraduate students studying courses in complex systems. This community is chosen because it is large and interdisciplinary and there is a known requirement for courses for thousand of students across Europe. The project involves every aspect of course production and delivery. Within this the research focused on the creation of a Socially Intelligent Resource Mining system to gather large volumes of high quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a ‘socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex System Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) the provision of new scalable ICT-based methods for providing very low cost scientific education, (ii) the creation of new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) the provision of a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.
Understanding Complex Adaptive Systems by Playing Games
ERIC Educational Resources Information Center
van Bilsen, Arthur; Bekebrede, Geertje; Mayer, Igor
2010-01-01
While educators teach their students about decision making in complex environments, managers have to deal with the complexity of large projects on a daily basis. To make better decisions it is assumed, that the latter would benefit from better understanding of complex phenomena, as do students as the professionals of the future. The goal of this…
Chapter 13 - Perspectives on LANDFIRE Prototype Project Accuracy Assessment
James Vogelmann; Zhiliang Zhu; Jay Kost; Brian Tolk; Donald Ohlen
2006-01-01
The purpose of this chapter is to provide a general overview of the many aspects of accuracy assessment pertinent to the Landscape Fire and Resource Management Planning Tools Prototype Project (LANDFIRE Prototype Project). The LANDFIRE Prototype formed a large and complex research and development project with many broad-scale data sets and products developed throughout...
Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcox, Karen; Marzouk, Youssef
2013-11-12
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of themore » SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as ``reduce then sample'' and ``sample then reduce.'' In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.« less
Accuracy assessment with complex sampling designs
Raymond L. Czaplewski
2010-01-01
A reliable accuracy assessment of remotely sensed geospatial data requires a sufficiently large probability sample of expensive reference data. Complex sampling designs reduce cost or increase precision, especially with regional, continental and global projects. The General Restriction (GR) Estimator and the Recursive Restriction (RR) Estimator separate a complex...
Diller, Kyle I; Bayden, Alexander S; Audie, Joseph; Diller, David J
2018-01-01
There is growing interest in peptide-based drug design and discovery. Due to their relatively large size, polymeric nature, and chemical complexity, the design of peptide-based drugs presents an interesting "big data" challenge. Here, we describe an interactive computational environment, PeptideNavigator, for naturally exploring the tremendous amount of information generated during a peptide drug design project. The purpose of PeptideNavigator is the presentation of large and complex experimental and computational data sets, particularly 3D data, so as to enable multidisciplinary scientists to make optimal decisions during a peptide drug discovery project. PeptideNavigator provides users with numerous viewing options, such as scatter plots, sequence views, and sequence frequency diagrams. These views allow for the collective visualization and exploration of many peptides and their properties, ultimately enabling the user to focus on a small number of peptides of interest. To drill down into the details of individual peptides, PeptideNavigator provides users with a Ramachandran plot viewer and a fully featured 3D visualization tool. Each view is linked, allowing the user to seamlessly navigate from collective views of large peptide data sets to the details of individual peptides with promising property profiles. Two case studies, based on MHC-1A activating peptides and MDM2 scaffold design, are presented to demonstrate the utility of PeptideNavigator in the context of disparate peptide-design projects. Copyright © 2017 Elsevier Ltd. All rights reserved.
Astronomical large projects managed with MANATEE: management tool for effective engineering
NASA Astrophysics Data System (ADS)
García-Vargas, M. L.; Mujica-Alvarez, E.; Pérez-Calpena, A.
2012-09-01
This paper describes MANATEE, which is the Management project web tool developed by FRACTAL, specifically designed for managing large astronomical projects. MANATEE facilitates the management by providing an overall view of the project and the capabilities to control the three main projects parameters: scope, schedule and budget. MANATEE is one of the three tools of the FRACTAL System & Project Suite, which is composed also by GECO (System Engineering Tool) and DOCMA (Documentation Management Tool). These tools are especially suited for those Consortia and teams collaborating in a multi-discipline, complex project in a geographically distributed environment. Our Management view has been applied successfully in several projects and currently is being used for Managing MEGARA, the next instrument for the GTC 10m telescope.
Geomorphic analysis of large alluvial rivers
NASA Astrophysics Data System (ADS)
Thorne, Colin R.
2002-05-01
Geomorphic analysis of a large river presents particular challenges and requires a systematic and organised approach because of the spatial scale and system complexity involved. This paper presents a framework and blueprint for geomorphic studies of large rivers developed in the course of basic, strategic and project-related investigations of a number of large rivers. The framework demonstrates the need to begin geomorphic studies early in the pre-feasibility stage of a river project and carry them through to implementation and post-project appraisal. The blueprint breaks down the multi-layered and multi-scaled complexity of a comprehensive geomorphic study into a number of well-defined and semi-independent topics, each of which can be performed separately to produce a clearly defined, deliverable product. Geomorphology increasingly plays a central role in multi-disciplinary river research and the importance of effective quality assurance makes it essential that audit trails and quality checks are hard-wired into study design. The structured approach presented here provides output products and production trails that can be rigorously audited, ensuring that the results of a geomorphic study can stand up to the closest scrutiny.
Health impact assessment of industrial development projects: a spatio-temporal visualization.
Winkler, Mirko S; Krieger, Gary R; Divall, Mark J; Singer, Burton H; Utzinger, Jürg
2012-05-01
Development and implementation of large-scale industrial projects in complex eco-epidemiological settings typically require combined environmental, social and health impact assessments. We present a generic, spatio-temporal health impact assessment (HIA) visualization, which can be readily adapted to specific projects and key stakeholders, including poorly literate communities that might be affected by consequences of a project. We illustrate how the occurrence of a variety of complex events can be utilized for stakeholder communication, awareness creation, interactive learning as well as formulating HIA research and implementation questions. Methodological features are highlighted in the context of an iron ore development in a rural part of Africa.
Office of Research and Development's Four Lab Study: Toxicological and Chemical Evaluation of Complex Mixtures of Disinfection By-Products (DBPs), and Quality Assurance Activities for a Large U.S. EPA Multilaboratoty Study
Thomas J. Hughes, Project and QA Manager, Expe...
Project management for complex ground-based instruments: MEGARA plan
NASA Astrophysics Data System (ADS)
García-Vargas, María. Luisa; Pérez-Calpena, Ana; Gil de Paz, Armando; Gallego, Jesús; Carrasco, Esperanza; Cedazo, Raquel; Iglesias, Jorge
2014-08-01
The project management of complex instruments for ground-based large telescopes is a challenge itself. A good management is a clue for project success in terms of performance, schedule and budget. Being on time has become a strict requirement for two reasons: to assure the arrival at the telescope due to the pressure on demanding new instrumentation for this first world-class telescopes and to not fall in over-costs. The budget and cash-flow is not always the expected one and has to be properly handled from different administrative departments at the funding centers worldwide distributed. The complexity of the organizations, the technological and scientific return to the Consortium partners and the participation in the project of all kind of professional centers working in astronomical instrumentation: universities, research centers, small and large private companies, workshops and providers, etc. make the project management strategy, and the tools and procedures tuned to the project needs, crucial for success. MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) is a facility instrument of the 10.4m GTC (La Palma, Spain) working at optical wavelengths that provides both Integral-Field Unit (IFU) and Multi-Object Spectrograph (MOS) capabilities at resolutions in the range R=6,000-20,000. The project is an initiative led by Universidad Complutense de Madrid (Spain) in collaboration with INAOE (Mexico), IAA-CSIC (Spain) and Universidad Politécnica de Madrid (Spain). MEGARA is being developed under contract with GRANTECAN.
Big questions, big science: meeting the challenges of global ecology.
Schimel, David; Keller, Michael
2015-04-01
Ecologists are increasingly tackling questions that require significant infrastucture, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets to be tested than can be collected by a single investigator's or s group of investigator's labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost-the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in the context, and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management but a scientific community aware and experienced in large project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware and engaged in the advocacy and governance of large ecological projects.
Embedding Agile Practices within a Plan-Driven Hierarchical Project Life Cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millard, W. David; Johnson, Daniel M.; Henderson, John M.
2014-07-28
Organizations use structured, plan-driven approaches to provide continuity, direction, and control to large, multi-year programs. Projects within these programs vary greatly in size, complexity, level of maturity, technical risk, and clarity of the development objectives. Organizations that perform exploratory research, evolutionary development, and other R&D activities can obtain the benefits of Agile practices without losing the benefits of their program’s overarching plan-driven structure. This paper describes application of Agile development methods on a large plan-driven sensor integration program. While the client employed plan-driven, requirements flow-down methodologies, tight project schedules and complex interfaces called for frequent end-to-end demonstrations to provide feedbackmore » during system development. The development process maintained the many benefits of plan-driven project execution with the rapid prototyping, integration, demonstration, and client feedback possible through Agile development methods. This paper also describes some of the tools and implementing mechanisms used to transition between and take advantage of each methodology, and presents lessons learned from the project management, system engineering, and developer’s perspectives.« less
Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities (Book)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2013-03-01
To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. The U.S. Department of Energy's Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel navigate the complexities of developing such projects and attract the necessary private capital to complete them. This guide is intended to provide a general resource that will begin to develop the Federal employee's awareness and understanding of the project developer's operating environment and the private sector's awareness and understandingmore » of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this guide has been organized to match Federal processes with typical phases of commercial project development. The main purpose of this guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project.« less
Big Science and the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Giudice, Gian Francesco
2012-03-01
The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.
Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghattas, Omar
2013-10-15
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimiza- tion) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimiza- tion and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUAROmore » Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to- observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.« less
Large Instrument Development for Radio Astronomy
NASA Astrophysics Data System (ADS)
Fisher, J. Richard; Warnick, Karl F.; Jeffs, Brian D.; Norrod, Roger D.; Lockman, Felix J.; Cordes, James M.; Giovanelli, Riccardo
2009-03-01
This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.
A numerical projection technique for large-scale eigenvalue problems
NASA Astrophysics Data System (ADS)
Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang
2011-10-01
We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique, used in strongly correlated quantum many-body systems, where first an effective approximate model of smaller complexity is constructed by projecting out high energy degrees of freedom and in turn solving the resulting model by some standard eigenvalue solver. Here we introduce a generalization of this idea, where both steps are performed numerically and which in contrast to the standard projection technique converges in principle to the exact eigenvalues. This approach is not just applicable to eigenvalue problems encountered in many-body systems but also in other areas of research that result in large-scale eigenvalue problems for matrices which have, roughly speaking, mostly a pronounced dominant diagonal part. We will present detailed studies of the approach guided by two many-body models.
Reduced order models for prediction of groundwater quality impacts from CO₂ and brine leakage
Zheng, Liange; Carroll, Susan; Bianchi, Marco; ...
2014-12-31
A careful assessment of the risk associated with geologic CO₂ storage is critical to the deployment of large-scale storage projects. A potential risk is the deterioration of groundwater quality caused by the leakage of CO₂ and brine leakage from deep subsurface reservoirs. In probabilistic risk assessment studies, numerical modeling is the primary tool employed to assess risk. However, the application of traditional numerical models to fully evaluate the impact of CO₂ leakage on groundwater can be computationally complex, demanding large processing times and resources, and involving large uncertainties. As an alternative, reduced order models (ROMs) can be used as highlymore » efficient surrogates for the complex process-based numerical models. In this study, we represent the complex hydrogeological and geochemical conditions in a heterogeneous aquifer and subsequent risk by developing and using two separate ROMs. The first ROM is derived from a model that accounts for the heterogeneous flow and transport conditions in the presence of complex leakage functions for CO₂ and brine. The second ROM is obtained from models that feature similar, but simplified flow and transport conditions, and allow for a more complex representation of all relevant geochemical reactions. To quantify possible impacts to groundwater aquifers, the basic risk metric is taken as the aquifer volume in which the water quality of the aquifer may be affected by an underlying CO₂ storage project. The integration of the two ROMs provides an estimate of the impacted aquifer volume taking into account uncertainties in flow, transport and chemical conditions. These two ROMs can be linked in a comprehensive system level model for quantitative risk assessment of the deep storage reservoir, wellbore leakage, and shallow aquifer impacts to assess the collective risk of CO₂ storage projects.« less
1984-11-01
FLOW CHART Complet List of Location/Shtes Evaluation of Past Operations at LUste SNMe * ~Potential Hazard to Healh WelareS Reglaor Agencon consolidat...siting studies were also a part of this tions, soil, groundwater sampling and large complex project. analysis, and remedial concept engi- neering. Project
Citizen science on a smartphone: Participants' motivations and learning.
Land-Zandstra, Anne M; Devilee, Jeroen L A; Snik, Frans; Buurmeijer, Franka; van den Broek, Jos M
2016-01-01
Citizen science provides researchers means to gather or analyse large datasets. At the same time, citizen science projects offer an opportunity for non-scientists to be part of and learn from the scientific process. In the Dutch iSPEX project, a large number of citizens turned their smartphones into actual measurement devices to measure aerosols. This study examined participants' motivation and perceived learning impacts of this unique project. Most respondents joined iSPEX because they wanted to contribute to the scientific goals of the project or because they were interested in the project topics (health and environmental impact of aerosols). In terms of learning impact, respondents reported a gain in knowledge about citizen science and the topics of the project. However, many respondents had an incomplete understanding of the science behind the project, possibly caused by the complexity of the measurements. © The Author(s) 2015.
ERIC Educational Resources Information Center
Polanin, Joshua R.; Wilson, Sandra Jo
2014-01-01
The purpose of this project is to demonstrate the practical methods developed to utilize a dataset consisting of both multivariate and multilevel effect size data. The context for this project is a large-scale meta-analytic review of the predictors of academic achievement. This project is guided by three primary research questions: (1) How do we…
The Cognitive Authority of Collective Intelligence
ERIC Educational Resources Information Center
Goldman, James L.
2010-01-01
Collaboration tools based on World Wide Web technologies now enable and encourage large groups of people who do not previously know one another, and who may share no other affiliation, to work together cooperatively and often anonymously on large information projects such as online encyclopedias and complex websites. Making use of information…
Geospatial Optimization of Siting Large-Scale Solar Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macknick, Jordan; Quinby, Ted; Caulfield, Emmet
2014-03-01
Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent withmore » each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.« less
NASA Astrophysics Data System (ADS)
Masters, J.; Alexov, A.; Folk, M.; Hanisch, R.; Heber, G.; Wise, M.
2012-09-01
Here we update the astronomy community on our effort to deal with the demands of ever-increasing astronomical data size and complexity, using the Hierarchical Data Format, version 5 (HDF5) format (Wise et al. 2011). NRAO, LOFAR and VAO have joined forces with The HDF Group to write an NSF grant, requesting funding to assist in the effort. This paper briefly summarizes our motivation for the proposed project, an outline of the project itself, and some of the material discussed at the ADASS Birds of a Feather (BoF) discussion. Topics of discussion included: community experiences with HDF5 and other file formats; toolsets which exist and/or can be adapted for HDF5; a call for development towards visualizing large (> 1 TB) image cubes; and, general lessons learned from working with large and complex data.
Large-scale visualization projects for teaching software engineering.
Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel
2012-01-01
The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.
NASA Technical Reports Server (NTRS)
McNeill, Justin
1995-01-01
The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS have uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.
Group Projects: More Learning? Less Fair? A Conundrum in Assessing Postgraduate Business Education
ERIC Educational Resources Information Center
Nordberg, Donald
2008-01-01
Group projects form a large and possibly growing component of the work undertaken for assessing students for postgraduate degrees in business. Yet the assessments sources, methods and purposes result in an array of combinations that the literature on assessment fails to capture in its full complexity. This paper builds on a new framework for…
An Analysis of the Second Project High Water Data
NASA Technical Reports Server (NTRS)
Woodbridge, David D.; Lasater, James A.; Fultz, Bennett M.; Clark, Richard E.; Wylie, Nancy
1963-01-01
Early in 1962 NASA established "Project High Water" to investigate the sudden release of large quantities of water into the upper atmosphere. The primary objectives of these experiments were to obtain information on the behavior of liquids released in the ionosphere and the localized effects on the ionosphere produced by the injection of large quantities of water. The data obtained in the two (2) Project High Water experiments have yielded an extensive amount of information concerning the complex phenomena associated with the sudden release of liquids in the Ionosphere. The detailed analysis of data obtained during the second Project High Water experiment (i.e., the third Saturn I vehicle test or SA-3) presented in this report demonstrates that the objectives of the Project High Water were achieved. In addition, the Project High Water has provided essential information relevant to a number of problems vital to manned explorations of space.
Success in large high-technology projects: What really works?
NASA Astrophysics Data System (ADS)
Crosby, P.
2014-08-01
Despite a plethora of tools, technologies and management systems, successful execution of big science and engineering projects remains problematic. The sheer scale of globally funded projects such as the Large Hadron Collider and the Square Kilometre Array telescope means that lack of project success can impact both on national budgets, and collaborative reputations. In this paper, I explore data from contemporary literature alongside field research from several current high-technology projects in Europe and Australia, and reveal common `pressure points' that are shown to be key influencers of project control and success. I discuss the how mega-science projects sit between being merely complicated, and chaotic, and explain the importance of understanding multiple dimensions of project complexity. Project manager/leader traits are briefly discussed, including capability to govern and control such enterprises. Project structures are examined, including the challenge of collaborations. I show that early attention to building project resilience, curbing optimism, and risk alertness can help prepare large high-tech projects against threats, and why project managers need to understand aspects of `the silent power of time'. Mission assurance is advanced as a critical success function, alongside the deployment of task forces and new combinations of contingency plans. I argue for increased project control through industrial-style project reviews, and show how post-project reviews are an under-used, yet invaluable avenue of personal and organisational improvement. Lastly, I discuss the avoidance of project amnesia through effective capture of project knowledge, and transfer of lessons-learned to subsequent programs and projects.
E-Learning in a Large Organization: A Study of the Critical Role of Information Sharing
ERIC Educational Resources Information Center
Netteland, Grete; Wasson, Barbara; Morch, Anders I
2007-01-01
Purpose: The purpose of this paper is to provide new insights into the implementation of large-scale learning projects; thereby better understanding the difficulties, frustrations, and obstacles encountered when implementing enterprise-wide e-learning as a tool for training and organization transformation in a complex organization.…
Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-03-01
To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. For the purposes of this Guide, large-scale Federal renewable energy projects are defined as renewable energy facilities larger than 10 megawatts (MW) that are sited on Federal property and lands and typically financed and owned by third parties.1 The U.S. Department of Energy’s Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel navigate the complexities of developing such projects and attract the necessarymore » private capital to complete them. This Guide is intended to provide a general resource that will begin to develop the Federal employee’s awareness and understanding of the project developer’s operating environment and the private sector’s awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this Guide has been organized to match Federal processes with typical phases of commercial project development. FEMP collaborated with the National Renewable Energy Laboratory (NREL) and professional project developers on this Guide to ensure that Federal projects have key elements recognizable to private sector developers and investors. The main purpose of this Guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project. This framework begins the translation between the Federal and private sector operating environments. When viewing the overall« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardage, B.A.; Carr, D.L.; Finley, R.J.
1995-07-01
The objectives of this project are to define undrained or incompletely drained reservoir compartments controlled primarily by depositional heterogeneity in a low-accommodation, cratonic Midcontinent depositional setting, and, afterwards, to develop and transfer to producers strategies for infield reserve growth of natural gas. Integrated geologic, geophysical, reservoir engineering, and petrophysical evaluations are described in complex difficult-to-characterize fluvial and deltaic reservoirs in Boonsville (Bend Conglomerate Gas) field, a large, mature gas field located in the Fort Worth Basin of North Texas. The purpose of this project is to demonstrate approaches to overcoming the reservoir complexity, targeting the gas resource, and doing somore » using state-of-the-art technologies being applied by a large cross section of Midcontinent operators.« less
Putting Ourselves in the Big Picture: A Sustainable Approach to Project Management for e-Learning
ERIC Educational Resources Information Center
Buchan, Janet
2010-01-01
In a case study of a large Australian university the metaphor of panarchy is used as a means of describing and understanding the complex interrelationships of multi-scale institutional projects and the influences of a variety factors on the potential success of e-learning initiatives. The concept of para-analysis is introduced as a management…
A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.
Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C
2011-11-27
Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.
A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project
Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.
2011-01-01
Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969
Wang, Yong
2017-03-25
In the last decade, synthetic biology research has been gradually transited from monocellular parts or devices toward more complex multicellular systems. The emerging plant synthetic biology is regarded as the "next chapter" of synthetic biology. The complex and diverse plant metabolism as the entry point, plant synthetic biology research not only helps us understand how real life is working, but also facilitates us to learn how to design and construct more complex artificial life. Bioactive compounds innovation and large-scale production are expected to be breakthrough with the redesigned plant metabolism as well. In this review, we discuss the research progress in plant synthetic biology and propose the new materia medica project to lift the level of traditional Chinese herbal medicine research.
Mobile Phone Voting for Participation and Engagement in a Large Compulsory Law Course
ERIC Educational Resources Information Center
Habel, Chad; Stubbs, Matthew
2014-01-01
This article reports on an action-research project designed to investigate the effect of a technological intervention on the complex interactions between student engagement, participation, attendance and preparation in a large lecture delivered as part of a compulsory first-year law course, a discipline which has not been the focus of any previous…
ERIC Educational Resources Information Center
Johanson, Kelly E.; Watt, Terry J.; McIntyre, Neil R.; Thompson, Marleesa
2013-01-01
Providing a project-based experience in an undergraduate biochemistry laboratory class can be complex with large class sizes and limited resources. We have designed a 6-week curriculum during which students purify and characterize the enzymes invertase and phosphatase from bakers yeast. Purification is performed in two stages via ethanol…
The Renewed Primary School in Belgium: Analysis of the Local Innovation Policy.
ERIC Educational Resources Information Center
Vandenberghe, Roland
The Renewed Primary School project in Belgium is analyzed in this paper in terms of organizational response to a large-scale innovation, which is characterized by its multidimensionality, by the large number of participating schools, and by a complex support structure. Section 2 of the report presents an elaborated description of these…
MSFC Optical Metrology: A National Resource
NASA Technical Reports Server (NTRS)
Burdine, Robert
1998-01-01
A national need exists for Large Diameter Optical Metrology Services. These services include the manufacture, testing, and assurance of precision and control necessary to assure the success of large optical projects. "Best Practices" are often relied on for manufacture and quality controls while optical projects are increasingly more demanding and complex. Marshall Space Flight Center (MSFC) has acquired unique optical measurement, testing and metrology capabilities through active participation in a wide variety of NASA optical programs. An overview of existing optical facilities and metrology capabilities is given with emphasis on use by other optical projects. Cost avoidance and project success is stressed through use of existing MSFC facilities and capabilities for measurement and metrology controls. Current issues in large diameter optical metrology are briefly reviewed. The need for a consistent and long duration Large Diameter Optical Metrology Service Group is presented with emphasis on the establishment of a National Large Diameter Optical Standards Laboratory. Proposals are made to develop MSFC optical standards and metrology capabilities as the primary national standards resource, providing access to MSFC Optical Core Competencies for manufacturers and researchers. Plans are presented for the development of a national lending library of precision optical standards with emphasis on cost avoidance while improving measurement assurance.
The role of ethics in data governance of large neuro-ICT projects.
Stahl, Bernd Carsten; Rainey, Stephen; Harris, Emma; Fothergill, B Tyr
2018-05-14
We describe current practices of ethics-related data governance in large neuro-ICT projects, identify gaps in current practice, and put forward recommendations on how to collaborate ethically in complex regulatory and normative contexts. We undertake a survey of published principles of data governance of large neuro-ICT projects. This grounds an approach to a normative analysis of current data governance approaches. Several ethical issues are well covered in the data governance policies of neuro-ICT projects, notably data protection and attribution of work. Projects use a set of similar policies to ensure users behave appropriately. However, many ethical issues are not covered at all. Implementation and enforcement of policies remain vague. The data governance policies we investigated indicate that the neuro-ICT research community is currently close-knit and that shared assumptions are reflected in infrastructural aspects. This explains why many ethical issues are not explicitly included in data governance policies at present. With neuro-ICT research growing in scale, scope, and international involvement, these shared assumptions should be made explicit and reflected in data governance.
NASA Astrophysics Data System (ADS)
Fisher, J. Richard; Bradley, Richard F.; Brisken, Walter F.; Cotton, William D.; Emerson, Darrel T.; Kerr, Anthony R.; Lacasse, Richard J.; Morgan, Matthew A.; Napier, Peter J.; Norrod, Roger D.; Payne, John M.; Pospieszalski, Marian W.; Symmes, Arthur; Thompson, A. Richard; Webber, John C.
2009-03-01
This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.
Model structures amplify uncertainty in predicted soil carbon responses to climate change.
Shi, Zheng; Crowell, Sean; Luo, Yiqi; Moore, Berrien
2018-06-04
Large model uncertainty in projected future soil carbon (C) dynamics has been well documented. However, our understanding of the sources of this uncertainty is limited. Here we quantify the uncertainties arising from model parameters, structures and their interactions, and how those uncertainties propagate through different models to projections of future soil carbon stocks. Both the vertically resolved model and the microbial explicit model project much greater uncertainties to climate change than the conventional soil C model, with both positive and negative C-climate feedbacks, whereas the conventional model consistently predicts positive soil C-climate feedback. Our findings suggest that diverse model structures are necessary to increase confidence in soil C projection. However, the larger uncertainty in the complex models also suggests that we need to strike a balance between model complexity and the need to include diverse model structures in order to forecast soil C dynamics with high confidence and low uncertainty.
Neural pathways mediating control of reproductive behaviour in male Japanese quail
Wild, J Martin; Balthazart, Jacques
2012-01-01
The sexually dimorphic medial preoptic nucleus (POM) in Japanese quail has for many years been the focus of intensive investigations into its role in reproductive behaviour. The present paper delineates a sequence of descending pathways that finally reach sacral levels of the spinal cord housing motor neurons innervating cloacal muscles involved in reproductive behaviour. We first retrogradely labeled the motor neurons innervating the large cloacal sphincter muscle (mSC) that forms part of the foam gland complex (Seiwert and Adkins-Regan, 1998, Brain Behav Evol 52:61–80) and then putative premotor nuclei in the brainstem, one of which was nucleus retroambigualis (RAm) in the caudal medulla. Anterograde tracing from RAm defined a bulbospinal pathway, terminations of which overlapped the distribution of mSC motor neurons and their extensive dorsally directed dendrites. Descending input to RAm arose from an extensive dorsomedial nucleus of the intercollicular complex (DM-ICo), electrical stimulation of which drove vocalizations. POM neurons were retrogradely labeled by injections of tracer into DM-ICo, but POM projections largely surrounded DM, rather than penetrated it. Thus, although a POM projection to ICo was shown, a POM projection to DM must be inferred. Nevertheless, the sequence of projections in the male quail from POM to cloacal motor neurons strongly resembles that in rats, cats and monkeys for the control of reproductive behaviour, as largely defined by Holstege and co-workers (e.g., Holstege et al., 1997, Neuroscience 80: 587–598). PMID:23225613
Chapter 11 - Post-hurricane fuel dynamics and implications for fire behavior (Project SO-EM-F-12-01)
Shanyue Guan; G. Geoff. Wang
2018-01-01
Hurricanes have long been a powerful and recurring disturbance in many coastal forest ecosystems. Intense hurricanes often produce a large amount of dead fuels in their affected forests. How the post-hurricane fuel complex changes with time, due todecomposition and management such as salvage, and its implications for fire behavior remain largely unknown....
Chen, Xuehui; Sun, Yunxiang; An, Xiongbo; Ming, Dengming
2011-10-14
Normal mode analysis of large biomolecular complexes at atomic resolution remains challenging in computational structure biology due to the requirement of large amount of memory space and central processing unit time. In this paper, we present a method called virtual interface substructure synthesis method or VISSM to calculate approximate normal modes of large biomolecular complexes at atomic resolution. VISSM introduces the subunit interfaces as independent substructures that join contacting molecules so as to keep the integrity of the system. Compared with other approximate methods, VISSM delivers atomic modes with no need of a coarse-graining-then-projection procedure. The method was examined for 54 protein-complexes with the conventional all-atom normal mode analysis using CHARMM simulation program and the overlap of the first 100 low-frequency modes is greater than 0.7 for 49 complexes, indicating its accuracy and reliability. We then applied VISSM to the satellite panicum mosaic virus (SPMV, 78,300 atoms) and to F-actin filament structures of up to 39-mer, 228,813 atoms and found that VISSM calculations capture functionally important conformational changes accessible to these structures at atomic resolution. Our results support the idea that the dynamics of a large biomolecular complex might be understood based on the motions of its component subunits and the way in which subunits bind one another. © 2011 American Institute of Physics
NASA Astrophysics Data System (ADS)
Sharf, I. V.; Chukhareva, N. V.; Kuznetsova, L. P.
2014-08-01
High social and economic importance of large-scale projects on gasification of East Siberian regions of Russia and diversifying gas exports poses the problem of complex risk analysis of the project. This article discusses the various types of risks that could significantly affect the timing of the implementation and effectiveness of the project for the construction of the first line of "Sila Sibiri", the "Chayanda-Lensk" section. Special attention is paid to financial and tax aspects of the project. Graphically presented analysis of the dynamics of financial indicators reflect certain periods of effectiveness in implementing the project. Authors also discuss the possible causes and consequences of risks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report describes results of a technical, financial and environmental assessment study for a project, which would have included a new TCS micronized coal-fired heating plant for the Produkcja I Hodowla Roslin Ogrodniczych (PHRO) Greenhouse Complex; Krzeszowice, Poland. Project site is about 20 miles west of Krakow, Poland. During the project study period, PHRO utilized 14 heavy oil-fired boilers to produce heat for its greenhouse facilities and also home heating to several adjacent apartment housing complexes. The boilers burn a high-sulfur content heavy crude oil, called mazute, The project study was conducted during a period extended from March 1996 throughmore » February 1997. For size orientation, the PHRO Greenhouse complex grows a variety of vegetables and flowers for the Southern Poland marketplace. The greenhouse area under glass is very large and equivalent to approximately 50 football fields, The new micronized coal fired boiler would have: (1) provided a significant portion of the heat for PHRO and a portion of the adjacent apartment housing complexes, (2) dramatically reduced sulfur dioxide air pollution emissions, while satisfying new Polish air regulations, and (3) provided attractive savings to PHRO, based on the quantity of displaced oil.« less
NASA Technical Reports Server (NTRS)
Hyland, D. C.; Bernstein, D. S.
1987-01-01
The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.
Adaptive Systems Engineering: A Medical Paradigm for Practicing Systems Engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Douglas Hamelin; Ron D. Klingler; Christopher Dieckmann
2011-06-01
From its inception in the defense and aerospace industries, SE has applied holistic, interdisciplinary tools and work-process to improve the design and management of 'large, complex engineering projects.' The traditional scope of engineering in general embraces the design, development, production, and operation of physical systems, and SE, as originally conceived, falls within that scope. While this 'traditional' view has expanded over the years to embrace wider, more holistic applications, much of the literature and training currently available is still directed almost entirely at addressing the large, complex, NASA and defense-sized systems wherein the 'ideal' practice of SE provides the cradle-to-gravemore » foundation for system development and deployment. Under such scenarios, systems engineers are viewed as an integral part of the system and project life-cycle from conception to decommissioning. In far less 'ideal' applications, SE principles are equally applicable to a growing number of complex systems and projects that need to be 'rescued' from overwhelming challenges that threaten imminent failure. The medical profession provides a unique analogy for this latter concept and offers a useful paradigm for tailoring our 'practice' of SE to address the unexpected dynamics of applying SE in the real world. In short, we can be much more effective as systems engineers as we change some of the paradigms under which we teach and 'practice' SE.« less
De-mystifying earned value management for ground based astronomy projects, large and small
NASA Astrophysics Data System (ADS)
Norton, Timothy; Brennan, Patricia; Mueller, Mark
2014-08-01
The scale and complexity of today's ground based astronomy projects have justifiably required Principal Investigator's and their project teams to adopt more disciplined management processes and tools in order to achieve timely and accurate quantification of the progress and relative health of their projects. Earned Value Management (EVM) is one such tool. Developed decades ago and used extensively in the defense and construction industries, and now a requirement of NASA projects greater than $20M; EVM has gained a foothold in ground-based astronomy projects. The intent of this paper is to de-mystify EVM by discussing the fundamentals of project management, explaining how EVM fits with existing principles, and describing key concepts every project can use to implement their own EVM system. This paper also discusses pitfalls to avoid during implementation and obstacles to its success. The authors report on their organization's most recent experience implementing EVM for the GMT-Consortium Large Earth Finder (G-CLEF) project. G-CLEF is a fiber-fed, optical echelle spectrograph that has been selected as a first light instrument for the Giant Magellan Telescope (GMT), planned for construction at the Las Campanas Observatory in Chile's Atacama Desert region.
Systems Engineering and Reusable Avionics
NASA Technical Reports Server (NTRS)
Conrad, James M.; Murphy, Gloria
2010-01-01
One concept for future space flights is to construct building blocks for a wide variety of avionics systems. Once a unit has served its original purpose, it can be removed from the original vehicle and reused in a similar or dissimilar function, depending on the function blocks the unit contains. For example: Once a lunar lander has reached the moon's surface, an engine controller for the Lunar Decent Module would be removed and used for a lunar rover motor control unit or for a Environmental Control Unit for a Lunar Habitat. This senior design project included the investigation of a wide range of functions of space vehicles and possible uses. Specifically, this includes: (1) Determining and specifying the basic functioning blocks of space vehicles. (2) Building and demonstrating a concept model. (3) Showing high reliability is maintained. The specific implementation of this senior design project included a large project team made up of Systems, Electrical, Computer, and Mechanical Engineers/Technologists. The efforts were made up of several sub-groups that each worked on a part of the entire project. The large size and complexity made this project one of the more difficult to manage and advise. Typical projects only have 3-4 students, but this project had 10 students from five different disciplines. This paper describes the difference of this large project compared to typical projects, and the challenges encountered. It also describes how the systems engineering approach was successfully implemented so that the students were able to meet nearly all of the project requirements.
System Modeling of a large FPGA project: the SKA Tile Processing Module
NASA Astrophysics Data System (ADS)
Belli, C.; Comoretto, G.
Large projects like the SKA have an intrinsic complexity due to their scale. In this context, the application of a management design system becomes fundamental. For this purpose the SysML language, a UML customization for engineering applications, has been applied. As far as our work is concerned, we focused on the SKA Low Telescope - Tile Processing Module, designing diagrams at different detail levels. We designed a conceptual model of the TPM, primarily focusing on the main interfaces and the major data flows between product items. Functionalities are derived from use cases and allocated to hardware modules in order to guarantee the project's internal consistency and features. This model has been used both as internal documentation and as job specification, to commit part of the design to external entities.
NASA Astrophysics Data System (ADS)
Myers, B.; Wiggins, H. V.; Turner-Bogren, E. J.; Warburton, J.
2017-12-01
Project Managers at the Arctic Research Consortium of the U.S. (ARCUS) lead initiatives to convene, communicate with, and connect the Arctic research community across challenging disciplinary, geographic, temporal, and cultural boundaries. They regularly serve as the organizing hubs, archivists and memory-keepers for collaborative projects comprised of many loosely affiliated partners. As leading organizers of large open science meetings and other outreach events, they also monitor the interdisciplinary landscape of community needs, concerns, opportunities, and emerging research directions. However, leveraging the ARCUS Project Manager role to strategically build out the intangible infrastructure necessary to advance Arctic research requires a unique set of knowledge, skills, and experience. Drawing on a range of lessons learned from past and ongoing experiences with collaborative science, education and outreach programming, this presentation will highlight a model of ARCUS project management that we believe works best to support and sustain our community in its long-term effort to conquer the complexities of Arctic research.
LIFE CYCLE MANAGEMENT OF MUNICIPAL SOLID WASTE
This is a large, complex project in which a number of different research activities are taking place concurrently to collect data, develop cost and LCI methodologies, construct a database and decision support tool, and conduct case studies with communities to support the life cyc...
Solar ultraviolet radiation in a changing climate
The projected large increases in damaging ultraviolet radiation as a result of global emissions of ozone-depleting substances have been forestalled by the success of the Montreal Protocol. New challenges are now arising in relation to climate change. We highlight the complex inte...
Modeling Watershed Mercury Response to Atmospheric Loadings: Response Time and Challenges
The relationship between sources of mercury to watersheds and its fate in surface waters is invariably complex. Large scale monitoring studies, such as the METAALICUS project, have advanced current understanding of the links between atmospheric deposition of mercury and accumulat...
A Model-Based Approach to Engineering Behavior of Complex Aerospace Systems
NASA Technical Reports Server (NTRS)
Ingham, Michel; Day, John; Donahue, Kenneth; Kadesch, Alex; Kennedy, Andrew; Khan, Mohammed Omair; Post, Ethan; Standley, Shaun
2012-01-01
One of the most challenging yet poorly defined aspects of engineering a complex aerospace system is behavior engineering, including definition, specification, design, implementation, and verification and validation of the system's behaviors. This is especially true for behaviors of highly autonomous and intelligent systems. Behavior engineering is more of an art than a science. As a process it is generally ad-hoc, poorly specified, and inconsistently applied from one project to the next. It uses largely informal representations, and results in system behavior being documented in a wide variety of disparate documents. To address this problem, JPL has undertaken a pilot project to apply its institutional capabilities in Model-Based Systems Engineering to the challenge of specifying complex spacecraft system behavior. This paper describes the results of the work in progress on this project. In particular, we discuss our approach to modeling spacecraft behavior including 1) requirements and design flowdown from system-level to subsystem-level, 2) patterns for behavior decomposition, 3) allocation of behaviors to physical elements in the system, and 4) patterns for capturing V&V activities associated with behavioral requirements. We provide examples of interesting behavior specification patterns, and discuss findings from the pilot project.
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
The Human Genome Project: big science transforms biology and medicine.
Hood, Leroy; Rowen, Lee
2013-01-01
The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.
The Human Genome Project: big science transforms biology and medicine
2013-01-01
The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called ‘big science’ - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project. PMID:24040834
Interactive exploration of coastal restoration modeling in virtual environments
NASA Astrophysics Data System (ADS)
Gerndt, Andreas; Miller, Robert; Su, Simon; Meselhe, Ehab; Cruz-Neira, Carolina
2009-02-01
Over the last decades, Louisiana has lost a substantial part of its coastal region to the Gulf of Mexico. The goal of the project depicted in this paper is to investigate the complex ecological and geophysical system not only to find solutions to reverse this development but also to protect the southern landscape of Louisiana for disastrous impacts of natural hazards like hurricanes. This paper sets a focus on the interactive data handling of the Chenier Plain which is only one scenario of the overall project. The challenge addressed is the interactive exploration of large-scale time-depending 2D simulation results and of terrain data with a high resolution that is available for this region. Besides data preparation, efficient visualization approaches optimized for the usage in virtual environments are presented. These are embedded in a complex framework for scientific visualization of time-dependent large-scale datasets. To provide a straightforward interface for rapid application development, a software layer called VRFlowVis has been developed. Several architectural aspects to encapsulate complex virtual reality aspects like multi-pipe vs. cluster-based rendering are discussed. Moreover, the distributed post-processing architecture is investigated to prove its efficiency for the geophysical domain. Runtime measurements conclude this paper.
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
Book Review: Large-Scale Ecosystem Restoration: Five Case Studies from the United States
Broad-scale ecosystem restoration efforts involve a very complex set of ecological and societal components, and the success of any ecosystem restoration project rests on an integrated approach to implementation. Editors Mary Doyle and Cynthia Drew have successfully synthesized ma...
Breaking barriers through collaboration: the example of the Cell Migration Consortium.
Horwitz, Alan Rick; Watson, Nikki; Parsons, J Thomas
2002-10-15
Understanding complex integrated biological processes, such as cell migration, requires interdisciplinary approaches. The Cell Migration Consortium, funded by a Large-Scale Collaborative Project Award from the National Institute of General Medical Science, develops and disseminates new technologies, data, reagents, and shared information to a wide audience. The development and operation of this Consortium may provide useful insights for those who plan similarly large-scale, interdisciplinary approaches.
NASA Technical Reports Server (NTRS)
Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter, III; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina;
2016-01-01
This work presents an overview of the This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNCs demonstrated wavefront sensing and control system to refine and quantify the end-to-end system performance for high-contrast starlight suppression. This pathfinder system will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes., a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNCs demonstrated wavefront sensing and control system to refine and quantify the end-to-end system performance for high-contrast starlight suppression. This pathfinder system will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.
Diaspora, faith, and science: building a Mouride hospital in Senegal.
Foley, Ellen E; Babou, Cheikh Anta
2011-01-01
This article examines a development initiative spearheaded by the members of a transnational diaspora – the creation of a medical hospital in the holy city of Touba in central Senegal. Although the construction of the hospital is decidedly a philanthropic project, Hôpital Matlaboul Fawzaini is better understood as part of the larger place-making project of the Muridiyya and the pursuit of symbolic capital by a particular Mouride "dahira". The "dahira's" project illuminates important processes of forging global connections and transnational localities, and underscores the importance of understanding the complex motivations behind diaspora development. The hospital's history reveals the delicate negotiations between state actors and diaspora organizations, and the complexities of public–private partnerships for development. In a reversal of state withdrawal in the neo-liberal era, a diaspora association was able to wrest new financial commitments from the state by completing a large infrastructure project. Despite this success, we argue that these kinds of projects, which are by nature uneven and sporadic, reflect particular historical conjunctures and do not offer a panacea for the failure of state-led development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moser, M.A.
1996-01-01
Options for successfully using biogas depend on project scale. Almost all biogas from anaerobic digesters must first go through a gas handling system that pressurizes, meters, and filters the biogas. Additional treatment, including hydrogen sulfide-mercaptan scrubbing, gas drying, and carbon dioxide removal may be necessary for specialized uses, but these are complex and expensive processes. Thus, they can be justified only for large-scale projects that require high-quality biogas. Small-scale projects (less than 65 cfm) generally use biogas (as produced) as a boiler fuel or for fueling internal combustion engine-generators to produce electricity. If engines or boilers are selected properly, theremore » should be no need to remove hydrogen sulfide. Small-scale combustion turbines, steam turbines, and fuel cells are not used because of their technical complexity and high capital cost. Biogas cleanup to pipeline or transportation fuel specifications is very costly, and energy economics preclude this level of treatment.« less
Empowering pharmacoinformatics by linked life science data
NASA Astrophysics Data System (ADS)
Goldmann, Daria; Zdrazil, Barbara; Digles, Daniela; Ecker, Gerhard F.
2017-03-01
With the public availability of large data sources such as ChEMBLdb and the Open PHACTS Discovery Platform, retrieval of data sets for certain protein targets of interest with consistent assay conditions is no longer a time consuming process. Especially the use of workflow engines such as KNIME or Pipeline Pilot allows complex queries and enables to simultaneously search for several targets. Data can then directly be used as input to various ligand- and structure-based studies. In this contribution, using in-house projects on P-gp inhibition, transporter selectivity, and TRPV1 modulation we outline how the incorporation of linked life science data in the daily execution of projects allowed to expand our approaches from conventional Hansch analysis to complex, integrated multilayer models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estep, Donald
2015-11-30
This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.
Igniting the Light Elements: The Los Alamos Thermonuclear Weapon Project, 1942-1952
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzpatrick, Anne C.
1999-07-01
The American system of nuclear weapons research and development was conceived and developed not as a result of technological determinism, but by a number of individual architects who promoted the growth of this large technologically-based complex. While some of the technological artifacts of this system, such as the fission weapons used in World War II, have been the subject of many historical studies, their technical successors--fusion (or hydrogen) devices--are representative of the largely unstudied highly secret realms of nuclear weapons science and engineering. In the postwar period a small number of Los Alamos Scientific Laboratory's staff and affiliates were responsiblemore » for theoretical work on fusion weapons, yet the program was subject to both the provisions and constraints of the US Atomic Energy Commission, of which Los Alamos was a part. The Commission leadership's struggle to establish a mission for its network of laboratories, least of all to keep them operating, affected Los Alamos's leaders' decisions as to the course of weapons design and development projects. Adapting Thomas P. Hughes's ''large technological systems'' thesis, I focus on the technical, social, political, and human problems that nuclear weapons scientists faced while pursuing the thermonuclear project, demonstrating why the early American thermonuclear bomb project was an immensely complicated scientific and technological undertaking. I concentrate mainly on Los Alamos Scientific Laboratory's Theoretical, or T, Division, and its members' attempts to complete an accurate mathematical treatment of the ''Super''--the most difficult problem in physics in the postwar period--and other fusion weapon theories. Although tackling a theoretical problem, theoreticians had to address technical and engineering issues as well. I demonstrate the relative value and importance of H-bomb research over time in the postwar era to scientific, politician, and military participants in this project. I analyze how and when participants in the H-bomb project recognized both blatant and subtle problems facing the project, how scientists solved them, and the relationship this process had to official nuclear weapons policies. Consequently, I show how the practice of nuclear weapons science in the postwar period became an extremely complex, technologically-based endeavor.« less
Heredia-Langner, Alejandro; Cort, John; Bailey, Vanessa
2018-01-16
The Fishing for Features Signature Discovery project developed a framework for discovering signature features in challenging environments involving large and complex data sets or where phenomena may be poorly characterized or understood. Researchers at PNNL have applied the framework to the optimization of biofuels blending and to discover signatures of climate change on microbial soil communities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heredia-Langner, Alejandro; Cort, John; Bailey, Vanessa
2016-07-21
The Fishing for Features Signature Discovery project developed a framework for discovering signature features in challenging environments involving large and complex data sets or where phenomena may be poorly characterized or understood. Researchers at PNNL have applied the framework to the optimization of biofuels blending and to discover signatures of climate change on microbial soil communities.
Overview of Superconductivity and Challenges in Applications
NASA Astrophysics Data System (ADS)
Flükiger, Rene
2012-01-01
Considerable progress has been achieved during the last few decades in the various fields of applied superconductivity, while the related low temperature technology has reached a high level. Magnetic resonance imaging (MRI) and nuclear magnetic resonance (NMR) are so far the most successful applications, with tens of thousands of units worldwide, but high potential can also be recognized in the energy sector, with high energy cables, transformers, motors, generators for wind turbines, fault current limiters and devices for magnetic energy storage. A large number of magnet and cable prototypes have been constructed, showing in all cases high reliability. Large projects involving the construction of magnets, solenoids as well as dipoles and quadrupoles are described in the present book. A very large project, the LHC, is currently in operation, demonstrating that superconductivity is a reliable technology, even in a device of unprecedented high complexity. A project of similar complexity is ITER, a fusion device that is presently under construction. This article starts with a brief historical introduction to superconductivity as a phenomenon, and some fundamental properties necessary for the understanding of the technical behavior of superconductors are described. The introduction of superconductivity in the industrial cycle faces many challenges, first for the properties of the base elements, e.g. the wires, tapes and thin films, then for the various applied devices, where a number of new difficulties had to be resolved. A variety of industrial applications in energy, medicine and communications are briefly presented, showing how superconductivity is now entering the market.
Statistical genetics concepts and approaches in schizophrenia and related neuropsychiatric research.
Schork, Nicholas J; Greenwood, Tiffany A; Braff, David L
2007-01-01
Statistical genetics is a research field that focuses on mathematical models and statistical inference methodologies that relate genetic variations (ie, naturally occurring human DNA sequence variations or "polymorphisms") to particular traits or diseases (phenotypes) usually from data collected on large samples of families or individuals. The ultimate goal of such analysis is the identification of genes and genetic variations that influence disease susceptibility. Although of extreme interest and importance, the fact that many genes and environmental factors contribute to neuropsychiatric diseases of public health importance (eg, schizophrenia, bipolar disorder, and depression) complicates relevant studies and suggests that very sophisticated mathematical and statistical modeling may be required. In addition, large-scale contemporary human DNA sequencing and related projects, such as the Human Genome Project and the International HapMap Project, as well as the development of high-throughput DNA sequencing and genotyping technologies have provided statistical geneticists with a great deal of very relevant and appropriate information and resources. Unfortunately, the use of these resources and their interpretation are not straightforward when applied to complex, multifactorial diseases such as schizophrenia. In this brief and largely nonmathematical review of the field of statistical genetics, we describe many of the main concepts, definitions, and issues that motivate contemporary research. We also provide a discussion of the most pressing contemporary problems that demand further research if progress is to be made in the identification of genes and genetic variations that predispose to complex neuropsychiatric diseases.
Biodegradable hybrid tissue engineering scaffolds for reconstruction of large bone defects
NASA Astrophysics Data System (ADS)
Barati, Danial
Complex skeletal injuries and large bone fractures are still a significant clinical problem in US. Approximately 1.5 million Americans (veterans, their families, and civilians) every year suffer from bone loss due to traumatic skeletal injuries, infection, and resection of primary tumors that require extensive grafting to bridge the gap. The US bone graft market is over $2.2 billion a year. Due to insufficient mechanical stability, lack of vascularity, and inadequate resorption of the graft, patients with traumatic large skeletal injuries undergo multiple costly operations followed by extensive recovery steps to maintain proper bone alignment and length. Current strategies for repairing damaged or diseased bones include autologous or allograft bone transplantations. However, limited availability of autografts and risk of disease transmission associated with allografts have necessitated the search for the development of new bone graft options and strategies. The overall goal of this project is to develop a much-needed bone-mimetic engineered graft as a substitute for current strategies providing required bone grafts for reconstruction of large bone defects. This project will use the structure of natural cortical bone as a guide to produce an engineered bone graft with balanced strength, osteogenesis, vascularization, and resorption. The outcome of this project will be a biodegradable hybrid scaffold system (similar to natural cortical bone) including a mechanically strong scaffold allowing for mechanical stability of the load-bearing defect site and a soft and highly porous structure such as a hydrogel phase which will allow for efficient cell and growth factor delivery into the defect implantation site, cell niche establishment and promotion of mineralization. Successful completion of this project will transform bone graft technology for regeneration of complex bone defects from a frozen or freeze-dried allograft to a safe, infection-free, mechanically-stable, osteoinductive, and vasculogenic graft that is ultimately displaced by the patient's own tissue.
Structural attachments for large space structures
NASA Technical Reports Server (NTRS)
Pruett, E. C.; Loughead, T. E.; Robertson, K. B., III
1980-01-01
The feasibility of fabricating beams in space and using them as components of a large, crew assembled structure, was investigated. Two projects were undertaken: (1) design and development of a ground version of an automated beam builder capable of producing triangular cross section aluminum beams; and (2) design and fabrication of lap joints to connect the beams orthogonally and centroidal end caps to connect beams end to end at any desired angle. The first project produced a beam building machine which fabricates aluminum beams suitable for neutral buoyancy evaluation. The second project produced concepts for the lap joint and end cap. However, neither of these joint concepts was suitable for use by a pressure suited crew member in a zero gravity environment. It is concluded that before the beams can be evaluated the joint designs need to be completed and sufficient joints produced to allow assembly of a complex structure.
Minimus: a fast, lightweight genome assembler.
Sommer, Daniel D; Delcher, Arthur L; Salzberg, Steven L; Pop, Mihai
2007-02-26
Genome assemblers have grown very large and complex in response to the need for algorithms to handle the challenges of large whole-genome sequencing projects. Many of the most common uses of assemblers, however, are best served by a simpler type of assembler that requires fewer software components, uses less memory, and is far easier to install and run. We have developed the Minimus assembler to address these issues, and tested it on a range of assembly problems. We show that Minimus performs well on several small assembly tasks, including the assembly of viral genomes, individual genes, and BAC clones. In addition, we evaluate Minimus' performance in assembling bacterial genomes in order to assess its suitability as a component of a larger assembly pipeline. We show that, unlike other software currently used for these tasks, Minimus produces significantly fewer assembly errors, at the cost of generating a more fragmented assembly. We find that for small genomes and other small assembly tasks, Minimus is faster and far more flexible than existing tools. Due to its small size and modular design Minimus is perfectly suited to be a component of complex assembly pipelines. Minimus is released as an open-source software project and the code is available as part of the AMOS project at Sourceforge.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cresap, D.A.; Halverson, D.S.
In the Fluorinel Dissolution Process (FDP) upgrade, excess hydrofluoric acid in the dissolver product must be complexed with aluminum nitrate (ANN) to eliminate corrosion concerns, adjusted with nitrate to facilitate extraction, and diluted with water to ensure solution stability. This is currently accomplished via batch processing in large vessels. However, to accommodate increases in projected throughput and reduce water production in a cost-effective manner, a semi-continuous system (In-line Complexing (ILC)) has been developed. The major conclusions drawn from tests demonstrating the feasibility of this concept are given in this report.
Coal conversion products industrial applications
NASA Technical Reports Server (NTRS)
Dunkin, J. H.; Warren, D.
1980-01-01
Coal-based synthetic fuels complexes under development consideration by NASA/MSFC will produce large quantities of synthetic fuels, primarily medium BTU gas, which could be sold commercially to industries located in South Central Tennessee and Northern Alabama. The complexes would be modular in construction, and subsequent modules may produce liquid fuels or fuels for electric power production. Current and projected industries in the two states which have a propensity for utilizing coal-based synthetic fuels were identified, and a data base was compiled to support MFSC activities.
2014-09-30
repeating pulse-like signals were investigated. Software prototypes were developed and integrated into distinct streams of reseach ; projects...to study complex sound archives spanning large spatial and temporal scales. A new post processing method for detection and classifcation was also...false positive rates. HK-ANN was successfully tested for a large minke whale dataset, but could easily be used on other signal types. Various
Underestimation of Project Costs
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2015-01-01
Large projects almost always exceed their budgets. Estimating cost is difficult and estimated costs are usually too low. Three different reasons are suggested: bad luck, overoptimism, and deliberate underestimation. Project management can usually point to project difficulty and complexity, technical uncertainty, stakeholder conflicts, scope changes, unforeseen events, and other not really unpredictable bad luck. Project planning is usually over-optimistic, so the likelihood and impact of bad luck is systematically underestimated. Project plans reflect optimism and hope for success in a supposedly unique new effort rather than rational expectations based on historical data. Past project problems are claimed to be irrelevant because "This time it's different." Some bad luck is inevitable and reasonable optimism is understandable, but deliberate deception must be condemned. In a competitive environment, project planners and advocates often deliberately underestimate costs to help gain project approval and funding. Project benefits, cost savings, and probability of success are exaggerated and key risks ignored. Project advocates have incentives to distort information and conceal difficulties from project approvers. One naively suggested cure is more openness, honesty, and group adherence to shared overall goals. A more realistic alternative is threatening overrun projects with cancellation. Neither approach seems to solve the problem. A better method to avoid the delusions of over-optimism and the deceptions of biased advocacy is to base the project cost estimate on the actual costs of a large group of similar projects. Over optimism and deception can continue beyond the planning phase and into project execution. Hard milestones based on verified tests and demonstrations can provide a reality check.
Integrating ecological and social knowledge: learning from CHANS research
Bruce Shindler; Thomas A. Spies; John P. Bolte; Jeffrey D. Kline
2017-01-01
Scientists are increasingly called upon to integrate across ecological and social disciplines to tackle complex coupled human and natural system (CHANS) problems. Integration of these disciplines is challenging and many scientists do not have experience with large integrated research projects. However, much can be learned about the complicated process of integration...
Interactive Exploration on Large Genomic Datasets.
Tu, Eric
2016-01-01
The prevalence of large genomics datasets has made the the need to explore this data more important. Large sequencing projects like the 1000 Genomes Project [1], which reconstructed the genomes of 2,504 individuals sampled from 26 populations, have produced over 200TB of publically available data. Meanwhile, existing genomic visualization tools have been unable to scale with the growing amount of larger, more complex data. This difficulty is acute when viewing large regions (over 1 megabase, or 1,000,000 bases of DNA), or when concurrently viewing multiple samples of data. While genomic processing pipelines have shifted towards using distributed computing techniques, such as with ADAM [4], genomic visualization tools have not. In this work we present Mango, a scalable genome browser built on top of ADAM that can run both locally and on a cluster. Mango presents a combination of different optimizations that can be combined in a single application to drive novel genomic visualization techniques over terabytes of genomic data. By building visualization on top of a distributed processing pipeline, we can perform visualization queries over large regions that are not possible with current tools, and decrease the time for viewing large data sets. Mango is part of the Big Data Genomics project at University of California-Berkeley [25] and is published under the Apache 2 license. Mango is available at https://github.com/bigdatagenomics/mango.
A Climate Information Platform for Copernicus (CLIPC): managing the data flood
NASA Astrophysics Data System (ADS)
Juckes, Martin; Swart, Rob; Bärring, Lars; Groot, Annemarie; Thysse, Peter; Som de Cerff, Wim; Costa, Luis; Lückenkötter, Johannes; Callaghan, Sarah; Bennett, Victoria
2016-04-01
The FP7 project "Climate Information Platform for Copernicus" (CLIPC) is developing a demonstration portal for the Copernicus Climate Change Service (C3S). The project confronts many problems associated with the huge diversity of underlying data, complex multi-layered uncertainties and extremely complex and evolving user requirements. The infrastructure is founded on a comprehensive approach to managing data and documentation, using global domain independent standards where possible. An extensive thesaurus of terms provides both a robust and flexible foundation for data discovery services and accessible definitions to support users. It is, of course, essential to provide information to users through an interface which reflects their expectations rather than the intricacies of abstract data models. CLIPC has reviewed user engagement activities from other collaborative European projects, conducted user polls, interviews and meetings and is now entering an evaluation phase in which users discuss new features and options in the portal design. The CLIPC portal will provide access to raw climate science data and climate impact indicators derived from that data. The portal needs the flexibility to support access to extremely large datasets as well as providing means to manipulate data and explore complex products interactively.
An efficient approach to BAC based assembly of complex genomes.
Visendi, Paul; Berkman, Paul J; Hayashi, Satomi; Golicz, Agnieszka A; Bayer, Philipp E; Ruperao, Pradeep; Hurgobin, Bhavna; Montenegro, Juan; Chan, Chon-Kit Kenneth; Staňková, Helena; Batley, Jacqueline; Šimková, Hana; Doležel, Jaroslav; Edwards, David
2016-01-01
There has been an exponential growth in the number of genome sequencing projects since the introduction of next generation DNA sequencing technologies. Genome projects have increasingly involved assembly of whole genome data which produces inferior assemblies compared to traditional Sanger sequencing of genomic fragments cloned into bacterial artificial chromosomes (BACs). While whole genome shotgun sequencing using next generation sequencing (NGS) is relatively fast and inexpensive, this method is extremely challenging for highly complex genomes, where polyploidy or high repeat content confounds accurate assembly, or where a highly accurate 'gold' reference is required. Several attempts have been made to improve genome sequencing approaches by incorporating NGS methods, to variable success. We present the application of a novel BAC sequencing approach which combines indexed pools of BACs, Illumina paired read sequencing, a sequence assembler specifically designed for complex BAC assembly, and a custom bioinformatics pipeline. We demonstrate this method by sequencing and assembling BAC cloned fragments from bread wheat and sugarcane genomes. We demonstrate that our assembly approach is accurate, robust, cost effective and scalable, with applications for complete genome sequencing in large and complex genomes.
Lopes, Manoela Gomes Reis; Vilela, Rodolfo Andrade de Gouveia; Querol, Marco Antônio Pereira
2018-02-19
Large construction projects involve the functioning of a complex activity system (AS) in network format. Anomalies such as accidents, delays, reworks, etc., can be explained by contradictions that emerge historically in the system. The aim of this study was to analyze the history of an airport construction project to understand the current contradictions and anomalies in the AS and how they emerged. A case study was conducted for this purpose, combining Collective Work Analysis, interviews, observations, and analysis of documents that provided the basis for sessions in the Change Laboratory, where a participant timeline was elaborated with the principal events during the construction project. Based on the timeline, a historical analysis of the airport's AS revealed critical historical events and contradictions that explained the anomalies that occurred during the project. The analysis showed that the airport had been planned for construction with politically determined deadlines that were insufficient and inconsistent with the project's complexity. The choice of the contract modality, which assigned responsibility to a joint venture for all of the project's phases, was another critical historical event, because it allowed launching the construction before a definitive executive project had been drafted. There were also different cultures in companies working together for the first time in the context of a project with time pressures and outsourcing of activities without the necessary coordination. Identifying these contradictions and their historical origins proved essential for understanding the current situation and efforts to prevent similar situations in the future.
The Value of Methodical Management: Optimizing Science Results
NASA Astrophysics Data System (ADS)
Saby, Linnea
2016-01-01
As science progresses, making new discoveries in radio astronomy becomes increasingly complex. Instrumentation must be incredibly fine-tuned and well-understood, scientists must consider the skills and schedules of large research teams, and inter-organizational projects sometimes require coordination between observatories around the globe. Structured and methodical management allows scientists to work more effectively in this environment and leads to optimal science output. This report outlines the principles of methodical project management in general, and describes how those principles are applied at the National Radio Astronomy Observatory (NRAO) in Charlottesville, Virginia.
Enhancing Transdisciplinary Research Through Collaborative Leadership
Gray, Barbara
2008-01-01
Transcending the well-established and familiar boundaries of disciplinary silos poses challenges for even the most interpersonally competent scientists. This paper explores the challenges inherent in leading transdisciplinary projects, detailing the critical roles that leaders play in shepherding transdisciplinary scientific endeavors. Three types of leadership tasks are considered: cognitive, structural, and processual. Distinctions are made between leading small, co-located projects and large, dispersed ones. Finally, social-network analysis is proposed as a useful tool for conducting research on leadership, and, in particular, on the role of brokers, on complex transdisciplinary teams. PMID:18619392
NASA Astrophysics Data System (ADS)
Alagba, Tonye J.
Oil and gas drilling projects are the primary means by which oil companies recover large volumes of commercially available hydrocarbons from deep reservoirs. These types of projects are complex in nature, involving management of multiple stakeholder interfaces, multidisciplinary personnel, complex contractor relationships, and turbulent environmental and market conditions, necessitating the application of proven project management best practices and critical success factors (CSFs) to achieve success. Although there is some practitioner oriented literature on project management CSFs for drilling projects, none of these is based on empirical evidence, from research. In addition, the literature has reported alarming rates of oil and gas drilling project failure, which is attributable not to technical factors, but to failure of project management. The aim of this quantitative correlational study therefore, was to discover an empirically verified list of project management CSFs, which consistent application leads to successful implementation of oil and gas drilling projects. The study collected survey data online, from a random sample of 127 oil and gas drilling personnel who were members of LinkedIn's online community "Drilling Supervisors, Managers, and Engineers". The results of the study indicated that 10 project management factors are individually related to project success of oil and gas drilling projects. These 10 CSFs are namely; Project mission, Top management support, Project schedule/plan, Client consultation, Personnel, Technical tasks, Client acceptance, Monitoring and feedback, Communication, and Troubleshooting. In addition, the study found that the relationships between the 10 CSFs and drilling project success is unaffected by participant and project demographics---role of project personnel, and project location. The significance of these findings are both practical, and theoretical. Practically, application of an empirically verified CSFs list to oil and gas drilling projects could help oil companies improve the performance of future drilling projects. Theoretically, the study's findings may help to bridge a gap in the project management CSFs literature, and add to the general project management body of knowledge.
Complex networks as an emerging property of hierarchical preferential attachment.
Hébert-Dufresne, Laurent; Laurence, Edward; Allard, Antoine; Young, Jean-Gabriel; Dubé, Louis J
2015-12-01
Real complex systems are not rigidly structured; no clear rules or blueprints exist for their construction. Yet, amidst their apparent randomness, complex structural properties universally emerge. We propose that an important class of complex systems can be modeled as an organization of many embedded levels (potentially infinite in number), all of them following the same universal growth principle known as preferential attachment. We give examples of such hierarchy in real systems, for instance, in the pyramid of production entities of the film industry. More importantly, we show how real complex networks can be interpreted as a projection of our model, from which their scale independence, their clustering, their hierarchy, their fractality, and their navigability naturally emerge. Our results suggest that complex networks, viewed as growing systems, can be quite simple, and that the apparent complexity of their structure is largely a reflection of their unobserved hierarchical nature.
Complex networks as an emerging property of hierarchical preferential attachment
NASA Astrophysics Data System (ADS)
Hébert-Dufresne, Laurent; Laurence, Edward; Allard, Antoine; Young, Jean-Gabriel; Dubé, Louis J.
2015-12-01
Real complex systems are not rigidly structured; no clear rules or blueprints exist for their construction. Yet, amidst their apparent randomness, complex structural properties universally emerge. We propose that an important class of complex systems can be modeled as an organization of many embedded levels (potentially infinite in number), all of them following the same universal growth principle known as preferential attachment. We give examples of such hierarchy in real systems, for instance, in the pyramid of production entities of the film industry. More importantly, we show how real complex networks can be interpreted as a projection of our model, from which their scale independence, their clustering, their hierarchy, their fractality, and their navigability naturally emerge. Our results suggest that complex networks, viewed as growing systems, can be quite simple, and that the apparent complexity of their structure is largely a reflection of their unobserved hierarchical nature.
Accelerating NASA GN&C Flight Software Development
NASA Technical Reports Server (NTRS)
Tamblyn, Scott; Henry, Joel; Rapp, John
2010-01-01
When the guidance, navigation, and control (GN&C) system for the Orion crew vehicle undergoes Critical Design Review (CDR), more than 90% of the flight software will already be developed - a first for NASA on a project of this scope and complexity. This achievement is due in large part to a new development approach using Model-Based Design.
Using REU Projects and Crowdsourcing to Facilitate Learning on Demand
ERIC Educational Resources Information Center
Liu, Hong P.; Klein, Jerry E.
2013-01-01
With the increasing complexity of technology and large quantities of data in our digital age, learning and training has become a major cost of employers. Employee competence depends more and more on how quickly one can acquire new knowledge and solve problems to meet pressing deadlines. This paper presents a practical method to use REU (Research…
Managing the Socioeconomic Impacts of Energy Development. A Guide for the Small Community.
ERIC Educational Resources Information Center
Armbrust, Roberta
Decisions concerning large-scale energy development projects near small communities or in predominantly rural areas are usually complex, requiring cooperation of all levels of government, as well as the general public and the private sector. It is unrealistic to expect the typical small community to develop capabilities to independently evaluate a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo
Traditional engineering methods do not make provision for the integration of human considerations, while traditional human factors methods do not scale well to the complexity of large-scale nuclear power plant projects. Although the need for up-to-date human factors engineering processes and tools is recognised widely in industry, so far no formal guidance has been developed. This article proposes such a framework.
Perceiving Permutations as Distinct Outcomes: The Accommodation of a Complex Knowledge System
ERIC Educational Resources Information Center
Kapon, Shulamit; Ron, Gila; Hershkowitz, Rina; Dreyfus, Tommy
2015-01-01
There is ample evidence that reasoning about stochastic phenomena is often subject to systematic bias even after instruction. Few studies have examined the detailed learning processes involved in learning probability. This paper examines a case study drawn from a large corpus of data collected as part of a research project that dealt with the…
Acceleration techniques for dependability simulation. M.S. Thesis
NASA Technical Reports Server (NTRS)
Barnette, James David
1995-01-01
As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
Gold nanoparticles for high-throughput genotyping of long-range haplotypes
NASA Astrophysics Data System (ADS)
Chen, Peng; Pan, Dun; Fan, Chunhai; Chen, Jianhua; Huang, Ke; Wang, Dongfang; Zhang, Honglu; Li, You; Feng, Guoyin; Liang, Peiji; He, Lin; Shi, Yongyong
2011-10-01
Completion of the Human Genome Project and the HapMap Project has led to increasing demands for mapping complex traits in humans to understand the aetiology of diseases. Identifying variations in the DNA sequence, which affect how we develop disease and respond to pathogens and drugs, is important for this purpose, but it is difficult to identify these variations in large sample sets. Here we show that through a combination of capillary sequencing and polymerase chain reaction assisted by gold nanoparticles, it is possible to identify several DNA variations that are associated with age-related macular degeneration and psoriasis on significant regions of human genomic DNA. Our method is accurate and promising for large-scale and high-throughput genetic analysis of susceptibility towards disease and drug resistance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finley, Robert; Payne, William; Kirksey, Jim
2015-06-01
The Midwest Geological Sequestration Consortium (MGSC) has partnered with Archer Daniels Midland Company (ADM) and Schlumberger Carbon Services to conduct a large-volume, saline reservoir storage project at ADM’s agricultural products processing complex in Decatur, Illinois. The Development Phase project, named the Illinois Basin Decatur Project (IBDP) involves the injection of 1 million tonnes of carbon dioxide (CO 2) into a deep saline formation of the Illinois Basin over a three-year period. This report focuses on objectives, execution, and lessons learned/unanticipated results from the site development (relating specifically to surface equipment), operations, and the site closure plan.
Modeling the Hydrologic Effects of Large-Scale Green Infrastructure Projects with GIS
NASA Astrophysics Data System (ADS)
Bado, R. A.; Fekete, B. M.; Khanbilvardi, R.
2015-12-01
Impervious surfaces in urban areas generate excess runoff, which in turn causes flooding, combined sewer overflows, and degradation of adjacent surface waters. Municipal environmental protection agencies have shown a growing interest in mitigating these effects with 'green' infrastructure practices that partially restore the perviousness and water holding capacity of urban centers. Assessment of the performance of current and future green infrastructure projects is hindered by the lack of adequate hydrological modeling tools; conventional techniques fail to account for the complex flow pathways of urban environments, and detailed analyses are difficult to prepare for the very large domains in which green infrastructure projects are implemented. Currently, no standard toolset exists that can rapidly and conveniently predict runoff, consequent inundations, and sewer overflows at a city-wide scale. We demonstrate how streamlined modeling techniques can be used with open-source GIS software to efficiently model runoff in large urban catchments. Hydraulic parameters and flow paths through city blocks, roadways, and sewer drains are automatically generated from GIS layers, and ultimately urban flow simulations can be executed for a variety of rainfall conditions. With this methodology, users can understand the implications of large-scale land use changes and green/gray storm water retention systems on hydraulic loading, peak flow rates, and runoff volumes.
The Southern Ocean in the Coupled Model Intercomparison Project phase 5
Meijers, A. J. S.
2014-01-01
The Southern Ocean is an important part of the global climate system, but its complex coupled nature makes both its present state and its response to projected future climate forcing difficult to model. Clear trends in wind, sea-ice extent and ocean properties emerged from multi-model intercomparison in the Coupled Model Intercomparison Project phase 3 (CMIP3). Here, we review recent analyses of the historical and projected wind, sea ice, circulation and bulk properties of the Southern Ocean in the updated Coupled Model Intercomparison Project phase 5 (CMIP5) ensemble. Improvements to the models include higher resolutions, more complex and better-tuned parametrizations of ocean mixing, and improved biogeochemical cycles and atmospheric chemistry. CMIP5 largely reproduces the findings of CMIP3, but with smaller inter-model spreads and biases. By the end of the twenty-first century, mid-latitude wind stresses increase and shift polewards. All water masses warm, and intermediate waters freshen, while bottom waters increase in salinity. Surface mixed layers shallow, warm and freshen, whereas sea ice decreases. The upper overturning circulation intensifies, whereas bottom water formation is reduced. Significant disagreement exists between models for the response of the Antarctic Circumpolar Current strength, for reasons that are as yet unclear. PMID:24891395
NASA Astrophysics Data System (ADS)
Pennington, D. D.; Vincent, S.
2017-12-01
The NSF-funded project "Employing Model-Based Reasoning in Socio-Environmental Synthesis (EMBeRS)" has developed a generic model for exchanging knowledge across disciplines that is based on findings from the cognitive, learning, social, and organizational sciences addressing teamwork in complex problem solving situations. Two ten-day summer workshops for PhD students from large, NSF-funded interdisciplinary projects working on a variety of water issues were conducted in 2016 and 2017, testing the model by collecting a variety of data, including surveys, interviews, audio/video recordings, material artifacts and documents, and photographs. This presentation will introduce the EMBeRS model, the design of workshop activities based on the model, and results from surveys and interviews with the participating students. Findings suggest that this approach is very effective for developing a shared, integrated research vision across disciplines, compared with activities typically provided by most large research projects, and that students believe the skills developed in the EMBeRS workshops are unique and highly desireable.
NASA Technical Reports Server (NTRS)
Lee, Ashley; Rackoczy, John; Heater, Daniel; Sanders, Devon; Tashakkor, Scott
2013-01-01
Over the past few years interest in the development and use of small satellites has rapidly gained momentum with universities, commercial, and government organizations. In a few years we may see networked clusters of dozens or even hundreds of small, cheap, easily replaceable satellites working together in place of the large, expensive and difficult-to-replace satellites now in orbit. Standards based satellite buses and deployment mechanisms, such as the CubeSat and Poly Pico-satellite Orbital Deployer (P-POD), have stimulated growth in this area. The use of small satellites is also proving to be a cost effective capability in many areas traditionally dominated by large satellites, though many challenges remain. Currently many of these small satellites undergo very little testing prior to flight. As these small satellites move from technology demonstration and student projects toward more complex operational assets, it is expected that the standards for verification and validation will increase.
NASA Astrophysics Data System (ADS)
Hawcroft, M.; Hodges, K.; Walsh, E.; Zappa, G.
2017-12-01
For the Northern Hemisphere extratropics, changes in circulation are key to determining the impacts of climate warming. The mechanisms governing these circulation changes are complex, leading to the well documented uncertainty in projections of the future location of the mid-latitude storm tracks simulated by climate models. These storms are the primary source of precipitation for North America and Europe and generate many of the large-scale precipitation extremes associated with flooding and severe economic loss. Here, we show that in spite of the uncertainty in circulation changes, by analysing the behaviour of the storms themselves, we find entirely consistent and robust projections across an ensemble of climate models. In particular, we find that projections of change in the most intensely precipitating storms (above the present day 99th percentile) in the Northern Hemisphere are substantial and consistent across models, with large increases in the frequency of both summer (June-August, +226±68%) and winter (December-February, +186±34%) extreme storms by the end of the century. Regionally, both North America (summer +202±129%, winter +232±135%) and Europe (summer +390±148%, winter +318±114%) are projected to experience large increases in the frequency of intensely precipitating storms. These changes are thermodynamic and driven by surface warming, rather than by changes in the dynamical behaviour of the storms. Such changes in storm behaviour have the potential to have major impacts on society given intensely precipitating storms are responsible for many large-scale flooding events.
Convergence in France facing Big Data era and Exascale challenges for Climate Sciences
NASA Astrophysics Data System (ADS)
Denvil, Sébastien; Dufresne, Jean-Louis; Salas, David; Meurdesoif, Yann; Valcke, Sophie; Caubel, Arnaud; Foujols, Marie-Alice; Servonnat, Jérôme; Sénési, Stéphane; Derouillat, Julien; Voury, Pascal
2014-05-01
The presentation will introduce a french national project : CONVERGENCE that has been funded for four years. This project will tackle big data and computational challenges faced by climate modeling community in HPC context. Model simulations are central to the study of complex mechanisms and feedbacks in the climate system and to provide estimates of future and past climate changes. Recent trends in climate modelling are to add more physical components in the modelled system, increasing the resolution of each individual component and the more systematic use of large suites of simulations to address many scientific questions. Climate simulations may therefore differ in their initial state, parameter values, representation of physical processes, spatial resolution, model complexity, and degree of realism or degree of idealisation. In addition, there is a strong need for evaluating, improving and monitoring the performance of climate models using a large ensemble of diagnostics and better integration of model outputs and observational data. High performance computing is currently reaching the exascale and has the potential to produce this exponential increase of size and numbers of simulations. However, post-processing, analysis, and exploration of the generated data have stalled and there is a strong need for new tools to cope with the growing size and complexity of the underlying simulations and datasets. Exascale simulations require new scalable software tools to generate, manage and mine those simulations ,and data to extract the relevant information and to take the correct decision. The primary purpose of this project is to develop a platform capable of running large ensembles of simulations with a suite of models, to handle the complex and voluminous datasets generated, to facilitate the evaluation and validation of the models and the use of higher resolution models. We propose to gather interdisciplinary skills to design, using a component-based approach, a specific programming environment for scalable scientific simulations and analytics, integrating new and efficient ways of deploying and analysing the applications on High Performance Computing (HPC) system. CONVERGENCE, gathering HPC and informatics expertise that cuts across the individual partners and the broader HPC community, will allow the national climate community to leverage information technology (IT) innovations to address its specific needs. Our methodology consists in developing an ensemble of generic elements needed to run the French climate models with different grids and different resolution, ensuring efficient and reliable execution of these models, managing large volume and number of data and allowing analysis of the results and precise evaluation of the models. These elements include data structure definition and input-output (IO), code coupling and interpolation, as well as runtime and pre/post-processing environments. A common data and metadata structure will allow transferring consistent information between the various elements. All these generic elements will be open source and publicly available. The IPSL-CM and CNRM-CM climate models will make use of these elements that will constitute a national platform for climate modelling. This platform will be used, in its entirety, to optimise and tune the next version of the IPSL-CM model and to develop a global coupled climate model with a regional grid refinement. It will also be used, at least partially, to run ensembles of the CNRM-CM model at relatively high resolution and to run a very-high resolution prototype of this model. The climate models we developed are already involved in many international projects. For instance we participate to the CMIP (Coupled Model Intercomparison Project) project that is very demanding but has a high visibility: its results are widely used and are in particular synthesised in the IPCC (Intergovernmental Panel on Climate Change) assessment reports. The CONVERGENCE project will constitute an invaluable step for the French climate community to prepare and better contribute to the next phase of the CMIP project.
Sensitivity of Precipitation in Coupled Land-Atmosphere Models
NASA Technical Reports Server (NTRS)
Neelin, David; Zeng, N.; Suarez, M.; Koster, R.
2004-01-01
The project objective was to understand mechanisms by which atmosphere-land-ocean processes impact precipitation in the mean climate and interannual variations, focusing on tropical and subtropical regions. A combination of modeling tools was used: an intermediate complexity land-atmosphere model developed at UCLA known as the QTCM and the NASA Seasonal-to-Interannual Prediction Program general circulation model (NSIPP GCM). The intermediate complexity model was used to develop hypotheses regarding the physical mechanisms and theory for the interplay of large-scale dynamics, convective heating, cloud radiative effects and land surface feedbacks. The theoretical developments were to be confronted with diagnostics from the more complex GCM to validate or modify the theory.
Demonstrating artificial intelligence for space systems - Integration and project management issues
NASA Technical Reports Server (NTRS)
Hack, Edmund C.; Difilippo, Denise M.
1990-01-01
As part of its Systems Autonomy Demonstration Project (SADP), NASA has recently demonstrated the Thermal Expert System (TEXSYS). Advanced real-time expert system and human interface technology was successfully developed and integrated with conventional controllers of prototype space hardware to provide intelligent fault detection, isolation, and recovery capability. Many specialized skills were required, and responsibility for the various phases of the project therefore spanned multiple NASA centers, internal departments and contractor organizations. The test environment required communication among many types of hardware and software as well as between many people. The integration, testing, and configuration management tools and methodologies which were applied to the TEXSYS project to assure its safe and successful completion are detailed. The project demonstrated that artificial intelligence technology, including model-based reasoning, is capable of the monitoring and control of a large, complex system in real time.
NASA Astrophysics Data System (ADS)
Leon, J.; Urban, T.; Gerard-Little, P.; Kearns, C.; Manning, S. W.; Fisher, K.; Rogers, M.
2013-12-01
The Kalavasos and Maroni Built Environments (KAMBE) Project is a multi-year investigation of the urban fabric and architectural organization of two Late Bronze Age (c. 1650-1100 BCE) polities on Cyprus. The Late Bronze Age (known also as the Late Cypriot period on Cyprus) is characterized by the emergence of a number of large, urban settlements on the island. The amalgamation of large populations at centralized sites coincides with contemporary social, economic and political changes, including a growing disparity in funerary goods, an increased emphasis on metallurgy (specifically copper mining and smelting for the production of bronze), and the construction of monumental buildings. The initial phase of the project centered on geophysical survey at two archaeological sites in adjacent river valleys in south-central Cyprus: Kalavasos-Ayios Dhimitrios and the Maroni settlement cluster [1]. These sites are thought to be two of the earliest 'urban' settlements on the island and provide a unique opportunity to explore how urban space was instrumental in the development of social and political complexity during this transformative period. The formation of these Late Bronze Age urban landscapes is, we argue, not simply the result of this emerging social complexity, but is instead an key tool in the creation and maintenance of societal boundaries. Indeed, the process of 'place-making'--the dynamic creation of socially meaningful spaces, likely by elites--may well have been one of the most effective arenas that elites used to re-enforce the growing socio-political disparity. The KAMBE Project investigates the layout and organization of these new 'urban' spaces to better understand how built-space impacted social developments. Geophysical survey methods are ideal for large-scale data collection both to identify potential areas for targeted archaeological excavation, and to provide proxy data for architectural plans that allow us to comment on the nature of the urban fabric at these settlements. Having just completed this first phase of the project, we report on the results of large-scale geophysical survey, including the identification of at least two previously unknown building complexes (one at each site). Here we focus particularly on ground-penetrating radar (GPR) data and survey methodology, in an effort to critically examine the range of approaches applied throughout the project (e.g. various antennae frequencies, data-collection densities, soil moisture/seasonality of survey, and post-collection data processing [2]), and to identify the most effective parameters for archaeological geophysical survey in the region. This paper also advocates for the role of geophysical survey within a multi-component archaeological project, not simply as a prospection tool but as an archaeological data collection method in its own right. 1]Fisher, K. D., J. Leon, S. Manning, M. Rogers, and D. Sewell. In Press. 2011-2012. 'The Kalavasos and Maroni Built Environments Project: Introduction and preliminary report on the 2008 and 2010 seasons. Report of the Department of Antiquities, Cyprus. 2] e.g. Rogers, M., J. F. Leon, K. D. Fisher, S. W. Manning and D. Sewell. 2012. 'Comparing similar ground-penetrating radar surveys under different soil moisture conditions at Kalavasos-Ayios Dhimitrios, Cyprus.' Archaeological Prospection 19 (4): 297-305.
Changing vessel routes could significantly reduce the cost of future offshore wind projects.
Samoteskul, Kateryna; Firestone, Jeremy; Corbett, James; Callahan, John
2014-08-01
With the recent emphasis on offshore wind energy Coastal and Marine Spatial Planning (CMSP) has become one of the main frameworks used to plan and manage the increasingly complex web of ocean and coastal uses. As wind development becomes more prevalent, existing users of the ocean space, such as commercial shippers, will be compelled to share their historically open-access waters with these projects. Here, we demonstrate the utility of using cost-effectiveness analysis (CEA) to support siting decisions within a CMSP framework. In this study, we assume that large-scale offshore wind development will take place in the US Mid-Atlantic within the next decades. We then evaluate whether building projects nearshore or far from shore would be more cost-effective. Building projects nearshore is assumed to require rerouting of the commercial vessel traffic traveling between the US Mid-Atlantic ports by an average of 18.5 km per trip. We focus on less than 1500 transits by large deep-draft vessels. We estimate that over 29 years of the study, commercial shippers would incur an additional $0.2 billion (in 2012$) in direct and indirect costs. Building wind projects closer to shore where vessels used to transit would generate approximately $13.4 billion (in 2012$) in savings. Considering the large cost savings, modifying areas where vessels transit needs to be included in the portfolio of policies used to support the growth of the offshore wind industry in the US. Copyright © 2014 Elsevier Ltd. All rights reserved.
Complex wet-environments in electronic-structure calculations
NASA Astrophysics Data System (ADS)
Fisicaro, Giuseppe; Genovese, Luigi; Andreussi, Oliviero; Marzari, Nicola; Goedecker, Stefan
The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of an applied electrochemical potentials, including complex electrostatic screening coming from the solvent. In the present work we present a solver to handle both the Generalized Poisson and the Poisson-Boltzmann equation. A preconditioned conjugate gradient (PCG) method has been implemented for the Generalized Poisson and the linear regime of the Poisson-Boltzmann, allowing to solve iteratively the minimization problem with some ten iterations. On the other hand, a self-consistent procedure enables us to solve the Poisson-Boltzmann problem. The algorithms take advantage of a preconditioning procedure based on the BigDFT Poisson solver for the standard Poisson equation. They exhibit very high accuracy and parallel efficiency, and allow different boundary conditions, including surfaces. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and it will be released as a independent program, suitable for integration in other codes. We present test calculations for large proteins to demonstrate efficiency and performances. This work was done within the PASC and NCCR MARVEL projects. Computer resources were provided by the Swiss National Supercomputing Centre (CSCS) under Project ID s499. LG acknowledges also support from the EXTMOS EU project.
ERIC Educational Resources Information Center
San Diego, Jonathan P.; Cox, Margaret J.; Quinn, Barry F. A.; Newton, Jonathan Tim; Banerjee, Avijit; Woolford, Mark
2012-01-01
hapTEL, an interdisciplinary project funded by two UK research councils from 2007 to 2011, involves a large interdisciplinary team (with undergraduate and post-graduate student participants) which has been developing and evaluating a virtual learning system within an HE healthcare education setting, working on three overlapping strands. Strand 1…
Sabbat, J
1997-09-01
The restoration of democracy in Poland initiated a major system transformation including reform of the health sector. The international community were quick to respond to the need for assistance. Polish proposals were supported by international experts and projects were developed together with international development agencies and donors. Donors had no experience of central and eastern Europe, these countries had never been beneficiaries of aid and neither side had experience working together. Progress and absorption of funds was slow. Comparative experience from developing countries was used to analyze the barriers encountered in project development and implementation in Poland. The conditions necessary for implementation were not satisfied. Insufficient attention was paid to the project process. Barriers originate on the side of both donors and recipients and additionally from programme characteristics. The most serious problems experience in Poland were lack of government commitment to health care reform leading to failure to provide counterpart funds and low capacity for absorption of aid. Rent seeking attitudes were important. Donor paternalistic attitudes, complex procedures and lack of innovative approach were also present. Poor coordination was a problem on both sides. Multi-lateral projects were too complex and it was not always possible to integrate project activities with routine ones. External consultants played an excessive role in project development and implementation, absorbing a large portion of funds. The barriers have been operationalised to create a checklist which requires validation elsewhere and may be useful for those working in this field.
ERIC Educational Resources Information Center
Williamson, David J.
2011-01-01
The specific problem addressed in this study was the low success rate of information technology (IT) projects in the U.S. Due to the abstract nature and inherent complexity of software development, IT projects are among the most complex projects encountered. Most existing schools of project management theory are based on the rational systems…
NASA Technical Reports Server (NTRS)
2005-01-01
[figure removed for brevity, see original site]
Our topic for the weeks of April 4 and April 11 is dunes on Mars. We will look at the north polar sand sea and at isolated dune fields at lower latitudes. Sand seas on Earth are often called 'ergs,' an Arabic name for dune field. A sand sea differs from a dune field in two ways: 1) a sand sea has a large regional extent, and 2) the individual dunes are large in size and complex in form. This VIS image shows a dune field within Nili Patera, the northern caldera of a large volcanic complex in Syrtis Major. Image information: VIS instrument. Latitude 9, Longitude 67 East (293 West). 19 meter/pixel resolution. Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time. NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.Shafer, Sarah L; Bartlein, Patrick J; Gray, Elizabeth M; Pelltier, Richard T
2015-01-01
Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0-58.0°N latitude by 136.6-103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10-minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070-2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gases emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas.
Columbus Closure Project Released without Radiological Restrictions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henderson, G.
2007-07-01
The Columbus Closure Project (CCP), a historic radiological research complex, was cleaned up for future use without radiological restriction in 2006. The CCP research and development site contributed to national defense, nuclear fuel fabrication, and the development of safe nuclear reactors in the United States until 1988 when research activities were concluded for site decommissioning. In November of 2003, the Ohio Field Office of the U.S. Department of Energy contracted ECC/E2 Closure Services, LLC (Closure Services) to complete the removal of radioactive contamination from of a 1955 era nuclear sciences area consisting of a large hot cell facility, research reactormore » building and underground piping. The project known as the Columbus Closure Project (CCP) was completed in 27 months and brought to a close 16 years of D and D in Columbus, Ohio. This paper examines the project innovations and challenges presented during the Columbus Closure Project. The examination of the CCP includes the project regulatory environment, the CS safety program, accelerated clean up innovation, project execution strategies and management of project waste issues and the regulatory approach to site release 'without radiological restrictions'. (authors)« less
Python for large-scale electrophysiology.
Spacek, Martin; Blanche, Tim; Swindale, Nicholas
2008-01-01
Electrophysiology is increasingly moving towards highly parallel recording techniques which generate large data sets. We record extracellularly in vivo in cat and rat visual cortex with 54-channel silicon polytrodes, under time-locked visual stimulation, from localized neuronal populations within a cortical column. To help deal with the complexity of generating and analysing these data, we used the Python programming language to develop three software projects: one for temporally precise visual stimulus generation ("dimstim"); one for electrophysiological waveform visualization and spike sorting ("spyke"); and one for spike train and stimulus analysis ("neuropy"). All three are open source and available for download (http://swindale.ecc.ubc.ca/code). The requirements and solutions for these projects differed greatly, yet we found Python to be well suited for all three. Here we present our software as a showcase of the extensive capabilities of Python in neuroscience.
Simplified Deployment of Health Informatics Applications by Providing Docker Images.
Löbe, Matthias; Ganslandt, Thomas; Lotzmann, Lydia; Mate, Sebastian; Christoph, Jan; Baum, Benjamin; Sariyar, Murat; Wu, Jie; Stäubert, Sebastian
2016-01-01
Due to the specific needs of biomedical researchers, in-house development of software is widespread. A common problem is to maintain and enhance software after the funded project has ended. Even if many tools are made open source, only a couple of projects manage to attract a user basis large enough to ensure sustainability. Reasons for this include complex installation and configuration of biomedical software as well as an ambiguous terminology of the features provided; all of which make evaluation of software laborious. Docker is a para-virtualization technology based on Linux containers that eases deployment of applications and facilitates evaluation. We investigated a suite of software developments funded by a large umbrella organization for networked medical research within the last 10 years and created Docker containers for a number of applications to support utilization and dissemination.
Damping characterization in large structures
NASA Technical Reports Server (NTRS)
Eke, Fidelis O.; Eke, Estelle M.
1991-01-01
This research project has as its main goal the development of methods for selecting the damping characteristics of components of a large structure or multibody system, in such a way as to produce some desired system damping characteristics. The main need for such an analytical device is in the simulation of the dynamics of multibody systems consisting, at least partially, of flexible components. The reason for this need is that all existing simulation codes for multibody systems require component-by-component characterization of complex systems, whereas requirements (including damping) often appear at the overall system level. The main goal was met in large part by the development of a method that will in fact synthesize component damping matrices from a given system damping matrix. The restrictions to the method are that the desired system damping matrix must be diagonal (which is almost always the case) and that interbody connections must be by simple hinges. In addition to the technical outcome, this project contributed positively to the educational and research infrastructure of Tuskegee University - a Historically Black Institution.
Taking it to the streets: delivering on deployment.
Carr, Dafna; Welch, Vickie; Fabik, Trish; Hirji, Nadir; O'Connor, Casey
2009-01-01
From inception to deployment, the Wait Time Information System (WTIS) project faced significant challenges associated with time, scope and complexity. It involved not only the creation and deployment of two large-scale province-wide systems (the WTIS and Ontario's Client Registry/Enterprise Master Patient Index) within aggressive time frames, but also the active engagement of 82 Ontario hospitals, scores of healthcare leaders and several thousand clinicians who would eventually be using the new technology and its data. The provincial WTIS project team (see Figure 1) also had to be able to adapt and evolve their planning in an environment that was changing day-by-day. This article looks at the factors that allowed the team to take the WTIS out to the field and shares the approach, processes and tools used to deploy this complex and ambitious information management and information technology (IM/IT) initiative.
Mobilizing Public Markets to Finance Renewable Energy Projects: Insights from Expert Stakeholders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwabe, P.; Mendelsohn, M.; Mormann, F.
Financing renewable energy projects in the United States can be a complex process. Most equity investment in new renewable power production facilities is supported by tax credits and accelerated depreciation benefits, and is constrained by the pool of potential investors that can fully use these tax benefits and are willing to engage in complex financial structures. For debt financing, non-government lending has largely been provided by foreign banks that may be under future lending constraints due to economic and regulatory conditions. To discuss renewable energy financing challenges and to identify new sources of capital to the U.S. market, two roundtablemore » discussions were held with renewable energy and financing experts in April 2012. This report summarizes the key messages of those discussions and is designed to provide insights to the U.S. market and inform the international conversation on renewable energy financing innovations.« less
More 'altruistic' punishment in larger societies.
Marlowe, Frank W; Berbesque, J Colette
2008-03-07
If individuals will cooperate with cooperators, and punish non-cooperators even at a cost to themselves, then this strong reciprocity could minimize the cheating that undermines cooperation. Based upon numerous economic experiments, some have proposed that human cooperation is explained by strong reciprocity and norm enforcement. Second-party punishment is when you punish someone who defected on you; third-party punishment is when you punish someone who defected on someone else. Third-party punishment is an effective way to enforce the norms of strong reciprocity and promote cooperation. Here we present new results that expand on a previous report from a large cross-cultural project. This project has already shown that there is considerable cross-cultural variation in punishment and cooperation. Here we test the hypothesis that population size (and complexity) predicts the level of third-party punishment. Our results show that people in larger, more complex societies engage in significantly more third-party punishment than people in small-scale societies.
Model projections of atmospheric steering of Sandy-like superstorms
Barnes, Elizabeth A.; Polvani, Lorenzo M.; Sobel, Adam H.
2013-01-01
Superstorm Sandy ravaged the eastern seaboard of the United States, costing a great number of lives and billions of dollars in damage. Whether events like Sandy will become more frequent as anthropogenic greenhouse gases continue to increase remains an open and complex question. Here we consider whether the persistent large-scale atmospheric patterns that steered Sandy onto the coast will become more frequent in the coming decades. Using the Coupled Model Intercomparison Project, phase 5 multimodel ensemble, we demonstrate that climate models consistently project a decrease in the frequency and persistence of the westward flow that led to Sandy’s unprecedented track, implying that future atmospheric conditions are less likely than at present to propel storms westward into the coast. PMID:24003129
NASA Technical Reports Server (NTRS)
Lee, Taesik; Jeziorek, Peter
2004-01-01
Large complex projects cost large sums of money throughout their life cycle for a variety of reasons and causes. For such large programs, the credible estimation of the project cost, a quick assessment of the cost of making changes, and the management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to a&e effective cost engineering. The framework is built upon Axiomatic Design process. The structure in the Axiomatic Design process provides a good foundation to closely tie engineering design and cost information together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), physical domain (DPs), cost domain (CUs), and a task/process-based model. The FR-DP map relates a system s functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions - DPs - from the cost of the physical entities in the system - CUs. The task/process model describes the iterative process ot-developing each of the CUs, and is used to estimate the cost of CUs. By linking the four domains, this framework provides a superior traceability from requirements to cost information.
Watson, Nora L; Prosperi, Christine; Driscoll, Amanda J; Higdon, Melissa M; Park, Daniel E; Sanza, Megan; DeLuca, Andrea N; Awori, Juliet O; Goswami, Doli; Hammond, Emily; Hossain, Lokman; Johnson, Catherine; Kamau, Alice; Kuwanda, Locadiah; Moore, David P; Neyzari, Omid; Onwuchekwa, Uma; Parker, David; Sapchookul, Patranuch; Seidenberg, Phil; Shamsul, Arifin; Siazeele, Kazungu; Srisaengchai, Prasong; Sylla, Mamadou; Levine, Orin S; Murdoch, David R; O'Brien, Katherine L; Wolff, Mark; Deloria Knoll, Maria
2017-06-15
The Pneumonia Etiology Research for Child Health (PERCH) study is the largest multicountry etiology study of pediatric pneumonia undertaken in the past 3 decades. The study enrolled 4232 hospitalized cases and 5325 controls over 2 years across 9 research sites in 7 countries in Africa and Asia. The volume and complexity of data collection in PERCH presented considerable logistical and technical challenges. The project chose an internet-based data entry system to allow real-time access to the data, enabling the project to monitor and clean incoming data and perform preliminary analyses throughout the study. To ensure high-quality data, the project developed comprehensive quality indicator, data query, and monitoring reports. Among the approximately 9000 cases and controls, analyzable laboratory results were available for ≥96% of core specimens collected. Selected approaches to data management in PERCH may be extended to the planning and organization of international studies of similar scope and complexity. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
Circuit Architecture of VTA Dopamine Neurons Revealed by Systematic Input-Output Mapping.
Beier, Kevin T; Steinberg, Elizabeth E; DeLoach, Katherine E; Xie, Stanley; Miyamichi, Kazunari; Schwarz, Lindsay; Gao, Xiaojing J; Kremer, Eric J; Malenka, Robert C; Luo, Liqun
2015-07-30
Dopamine (DA) neurons in the midbrain ventral tegmental area (VTA) integrate complex inputs to encode multiple signals that influence motivated behaviors via diverse projections. Here, we combine axon-initiated viral transduction with rabies-mediated trans-synaptic tracing and Cre-based cell-type-specific targeting to systematically map input-output relationships of VTA-DA neurons. We found that VTA-DA (and VTA-GABA) neurons receive excitatory, inhibitory, and modulatory input from diverse sources. VTA-DA neurons projecting to different forebrain regions exhibit specific biases in their input selection. VTA-DA neurons projecting to lateral and medial nucleus accumbens innervate largely non-overlapping striatal targets, with the latter also sending extensive extra-striatal axon collaterals. Using electrophysiology and behavior, we validated new circuits identified in our tracing studies, including a previously unappreciated top-down reinforcing circuit from anterior cortex to lateral nucleus accumbens via VTA-DA neurons. This study highlights the utility of our viral-genetic tracing strategies to elucidate the complex neural substrates that underlie motivated behaviors. Copyright © 2015 Elsevier Inc. All rights reserved.
Environmental projects. Volume 1: Polychlorinated biphenyl (PCB) abatement program
NASA Technical Reports Server (NTRS)
Kushner, L.
1987-01-01
Six large parabolic dish antennas are located at the Goldstone Deep Space Communications Complex north of Barstow, California. Some of the ancillary electrical equipment of thes Deep Space Stations, particularly transformers and power capicitors, were filled with stable, fire-retardant, dielectric fluids containing substances called polychlorobiphenyls (PCBs). Because the Environmental Protection Agency has determined that PCBs are environmental pollutants toxic to humans, all NASA centers have been asked to participate in a PCB-abatement program. Under the supervision of JPL's Office of Telecommunications and Data Acquisition, a two-year long PCB-abatement program has eliminated PCBs from the Goldstone Complex.
NASA Astrophysics Data System (ADS)
Krechowicz, Maria
2017-10-01
Nowadays, one of the characteristic features of construction industry is an increased complexity of a growing number of projects. Almost each construction project is unique, has its project-specific purpose, its own project structural complexity, owner’s expectations, ground conditions unique to a certain location, and its own dynamics. Failure costs and costs resulting from unforeseen problems in complex construction projects are very high. Project complexity drivers pose many vulnerabilities to a successful completion of a number of projects. This paper discusses the process of effective risk management in complex construction projects in which renewable energy sources were used, on the example of the realization phase of the ENERGIS teaching-laboratory building, from the point of view of DORBUD S.A., its general contractor. This paper suggests a new approach to risk management for complex construction projects in which renewable energy sources were applied. The risk management process was divided into six stages: gathering information, identification of the top, critical project risks resulting from the project complexity, construction of the fault tree for each top, critical risks, logical analysis of the fault tree, quantitative risk assessment applying fuzzy logic and development of risk response strategy. A new methodology for the qualitative and quantitative risk assessment for top, critical risks in complex construction projects was developed. Risk assessment was carried out applying Fuzzy Fault Tree analysis on the example of one top critical risk. Application of the Fuzzy sets theory to the proposed model allowed to decrease uncertainty and eliminate problems with gaining the crisp values of the basic events probability, common during expert risk assessment with the objective to give the exact risk score of each unwanted event probability.
NASA Astrophysics Data System (ADS)
Voronin, Alexander; Vasilchenko, Ann; Khoperskov, Alexander
2018-03-01
The project of small watercourses restoration in the northern part of the Volga-Akhtuba floodplain is considered together with the aim of increasing the watering of the territory during small and medium floods. The topography irregularity, the complex structure of the floodplain valley consisting of large number of small watercourses, the presence of urbanized and agricultural areas require careful preliminary analysis of the hydrological safety and efficiency of geographically distributed project activities. Using the digital terrain and watercourses structure models of the floodplain, the hydrodynamic flood model, the analysis of the hydrological safety and efficiency of several project implementation strategies has been conducted. The objective function values have been obtained from the hydrodynamic calculations of the floodplain territory flooding for virtual digital terrain models simulating alternatives for the geographically distributed project activities. The comparative efficiency of several empirical strategies for the geographically distributed project activities, as well as a two-stage exact solution method for the optimization problem has been studied.
Building and measuring a high performance network architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kramer, William T.C.; Toole, Timothy; Fisher, Chuck
2001-04-20
Once a year, the SC conferences present a unique opportunity to create and build one of the most complex and highest performance networks in the world. At SC2000, large-scale and complex local and wide area networking connections were demonstrated, including large-scale distributed applications running on different architectures. This project was designed to use the unique opportunity presented at SC2000 to create a testbed network environment and then use that network to demonstrate and evaluate high performance computational and communication applications. This testbed was designed to incorporate many interoperable systems and services and was designed for measurement from the very beginning.more » The end results were key insights into how to use novel, high performance networking technologies and to accumulate measurements that will give insights into the networks of the future.« less
ERIC Educational Resources Information Center
So, H.-J.
2009-01-01
The purpose of this study is to explore how groups decide to use asynchronous online discussion forums in a non-mandatory setting, and, after the group decision is made, how group members use online discussion forums to complete a collaborative learning project requiring complex data gathering and research processes. While a large body of research…
Steven M. Wondzell; Agnieszka Przeszlowska; Dirk Pflugmacher; Miles A. Hemstrom; Peter A. Bisson
2012-01-01
Interactions between landuse and ecosystem change are complex, especially in riparian zones. To date, few models are available to project the influence of alternative landuse practices, natural disturbance and plant succession on the likely future conditions of riparian zones and aquatic habitats across large spatial extents. A state and transition approach was used to...
[Privacy and public benefit in using large scale health databases].
Yamamoto, Ryuichi
2014-01-01
In Japan, large scale heath databases were constructed in a few years, such as National Claim insurance and health checkup database (NDB) and Japanese Sentinel project. But there are some legal issues for making adequate balance between privacy and public benefit by using such databases. NDB is carried based on the act for elderly person's health care but in this act, nothing is mentioned for using this database for general public benefit. Therefore researchers who use this database are forced to pay much concern about anonymization and information security that may disturb the research work itself. Japanese Sentinel project is a national project to detecting drug adverse reaction using large scale distributed clinical databases of large hospitals. Although patients give the future consent for general such purpose for public good, it is still under discussion using insufficiently anonymized data. Generally speaking, researchers of study for public benefit will not infringe patient's privacy, but vague and complex requirements of legislation about personal data protection may disturb the researches. Medical science does not progress without using clinical information, therefore the adequate legislation that is simple and clear for both researchers and patients is strongly required. In Japan, the specific act for balancing privacy and public benefit is now under discussion. The author recommended the researchers including the field of pharmacology should pay attention to, participate in the discussion of, and make suggestion to such act or regulations.
Steck, R; Epari, D R; Schuetz, M A
2010-07-01
The collaboration of clinicians with basic science researchers is crucial for addressing clinically relevant research questions. In order to initiate such mutually beneficial relationships, we propose a model where early career clinicians spend a designated time embedded in established basic science research groups, in order to pursue a postgraduate qualification. During this time, clinicians become integral members of the research team, fostering long term relationships and opening up opportunities for continuing collaboration. However, for these collaborations to be successful there are pitfalls to be avoided. Limited time and funding can lead to attempts to answer clinical challenges with highly complex research projects characterised by a large number of "clinical" factors being introduced in the hope that the research outcomes will be more clinically relevant. As a result, the complexity of such studies and variability of its outcomes may lead to difficulties in drawing scientifically justified and clinically useful conclusions. Consequently, we stress that it is the basic science researcher and the clinician's obligation to be mindful of the limitations and challenges of such multi-factorial research projects. A systematic step-by-step approach to address clinical research questions with limited, but highly targeted and well defined research projects provides the solid foundation which may lead to the development of a longer term research program for addressing more challenging clinical problems. Ultimately, we believe that it is such models, encouraging the vital collaboration between clinicians and researchers for the work on targeted, well defined research projects, which will result in answers to the important clinical challenges of today. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd
2009-05-01
Infrastructure management (and its associated processes) is complex to understand, perform and thus, hard to make efficient and effective informed decisions. The management involves a multi-faceted operation that requires the most robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry, local and federal government agencies. IRSV is being designed to accommodate the essential needs from the following aspects: 1) Better understanding and enforcement of complex inspection process that can bridge the gap between evidence gathering and decision making through the implementation of ontological knowledge engineering system; 2) Aggregation, representation and fusion of complex multi-layered heterogeneous data (i.e. infrared imaging, aerial photos and ground-mounted LIDAR etc.) with domain application knowledge to support machine understandable recommendation system; 3) Robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) Integration of these needs through the flexible Service-oriented Architecture (SOA) framework to compose and provide services on-demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring both periodically (annually, monthly, even daily if needed) as well as after extreme events.
The communication in industrialised building system (IBS) construction project: Virtual environment
NASA Astrophysics Data System (ADS)
Pozin, Mohd Affendi Ahmad; Nawi, Mohd Nasrun Mohd
2017-10-01
Large portion of numbers team organization in the IBS construction sector is known are being fragmented. That is contributed from a segregation of construction activity thus create team working in virtually. Virtual team are the nature when teams are working in distributed area, across culture and time. Therefore, teams can be respond to the task without relocating to the site project and settle down a problem through information and communication technology (ICT). The emergence of virtual team are carry out by advancements in communication technologies as a medium to improve project team communication in project delivery process on IBS construction. Based on literature review from previous study and data collected from interviewing, this paper aim to identified communication challenges among project team members according to current project development practices in IBS construction project. Hence, in attempt to develop effective communication through the advantages of virtual team approach for IBS construction project. In order to ensure the data is gathered comprehensively and accurately, the data was collected from project managers by using semi structured interview method. It was found that virtual team approach could be enable competitive challenges on complexity in the construction project management process.
NASA Astrophysics Data System (ADS)
Lorenzo Alvarez, Jose; Metselaar, Harold; Amiaux, Jerome; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis M.; Salvignol, Jean-Christophe; Laureijs, René J.; Vavrek, Roland
2016-08-01
In the last years, the system engineering field is coming to terms with a paradigm change in the approach for complexity management. Different strategies have been proposed to cope with highly interrelated systems, system of systems and collaborative system engineering have been proposed and a significant effort is being invested into standardization and ontology definition. In particular, Model Based System Engineering (MBSE) intends to introduce methodologies for a systematic system definition, development, validation, deployment, operation and decommission, based on logical and visual relationship mapping, rather than traditional 'document based' information management. The practical implementation in real large-scale projects is not uniform across fields. In space science missions, the usage has been limited to subsystems or sample projects with modeling being performed 'a-posteriori' in many instances. The main hurdle for the introduction of MBSE practices in new projects is still the difficulty to demonstrate their added value to a project and whether their benefit is commensurate with the level of effort required to put them in place. In this paper we present the implemented Euclid system modeling activities, and an analysis of the benefits and limitations identified to support in particular requirement break-down and allocation, and verification planning at mission level.
Python for Large-Scale Electrophysiology
Spacek, Martin; Blanche, Tim; Swindale, Nicholas
2008-01-01
Electrophysiology is increasingly moving towards highly parallel recording techniques which generate large data sets. We record extracellularly in vivo in cat and rat visual cortex with 54-channel silicon polytrodes, under time-locked visual stimulation, from localized neuronal populations within a cortical column. To help deal with the complexity of generating and analysing these data, we used the Python programming language to develop three software projects: one for temporally precise visual stimulus generation (“dimstim”); one for electrophysiological waveform visualization and spike sorting (“spyke”); and one for spike train and stimulus analysis (“neuropy”). All three are open source and available for download (http://swindale.ecc.ubc.ca/code). The requirements and solutions for these projects differed greatly, yet we found Python to be well suited for all three. Here we present our software as a showcase of the extensive capabilities of Python in neuroscience. PMID:19198646
The management of large cabling campaigns during the Long Shutdown 1 of LHC
NASA Astrophysics Data System (ADS)
Meroli, S.; Machado, S.; Formenti, F.; Frans, M.; Guillaume, J. C.; Ricci, D.
2014-03-01
The Large Hadron Collider at CERN entered into its first 18 month-long shutdown period in February 2013. During this period the entire CERN accelerator complex will undergo major consolidation and upgrade works, preparing the machines for LHC operation at nominal energy (7 TeV/beam). One of the most challenging activities concerns the cabling infrastructure (copper and optical fibre cables) serving the CERN data acquisition, networking and control systems. About 1000 kilometres of cables, distributed in different machine areas, will be installed, representing an investment of about 15 MCHF. This implies an extraordinary challenge in terms of project management, including resource and activity planning, work execution and quality control. The preparation phase of this project started well before its implementation, by defining technical solutions and setting financial plans for staff recruitment and material supply. Enhanced task coordination was further implemented by deploying selected competences to form a central support team.
Small Projects Rapid Integration and Test Environment (SPRITE): Application for Increasing Robutness
NASA Technical Reports Server (NTRS)
Lee, Ashley; Rakoczy, John; Heather, Daniel; Sanders, Devon
2013-01-01
Over the past few years interest in the development and use of small satellites has rapidly gained momentum with universities, commercial, and government organizations. In a few years we may see networked clusters of dozens or even hundreds of small, cheap, easily replaceable satellites working together in place of the large, expensive and difficult-to-replace satellites now in orbit. Standards based satellite buses and deployment mechanisms, such as the CubeSat and Poly Pico-satellite Orbital Deployer (P-POD), have stimulated growth in this area. The use of small satellites is also proving to be a cost effective capability in many areas traditionally dominated by large satellites, though many challenges remain. Currently many of these small satellites undergo very little testing prior to flight. As these small satellites move from technology demonstration and student projects toward more complex operational assets, it is expected that the standards for verification and validation will increase.
From ATLASGAL to SEDIGISM: Towards a Complete 3D View of the Dense Galactic Interstellar Medium
NASA Astrophysics Data System (ADS)
Schuller, F.; Urquhart, J.; Bronfman, L.; Csengeri, T.; Bontemps, S.; Duarte-Cabral, A.; Giannetti, A.; Ginsburg, A.; Henning, T.; Immer, K.; Leurini, S.; Mattern, M.; Menten, K.; Molinari, S.; Muller, E.; Sánchez-Monge, A.; Schisano, E.; Suri, S.; Testi, L.; Wang, K.; Wyrowski, F.; Zavagno, A.
2016-09-01
The ATLASGAL survey has provided the first unbiased view of the inner Galactic Plane at sub-millimetre wavelengths. This is the largest ground-based survey of its kind to date, covering 420 square degrees at a wavelength of 870 µm. The reduced data, consisting of images and a catalogue of > 104 compact sources, are available from the ESO Science Archive Facility through the Phase 3 infrastructure. The extremely rich statistics of this survey initiated several follow-up projects, including spectroscopic observations to explore molecular complexity and high angular resolution imaging with the Atacama Large Millimeter/submillimeter Array (ALMA), aimed at resolving individual protostars. The most extensive follow-up project is SEDIGISM, a 3D mapping of the dense interstellar medium over a large fraction of the inner Galaxy. Some notable results of these surveys are highlighted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, Panagiotis; /Fermilab; Cary, John
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessarymore » accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.« less
Remote sensing techniques in cultural resource management archaeology
NASA Astrophysics Data System (ADS)
Johnson, Jay K.; Haley, Bryan S.
2003-04-01
Cultural resource management archaeology in the United States concerns compliance with legislation set in place to protect archaeological resources from the impact of modern activities. Traditionally, surface collection, shovel testing, test excavation, and mechanical stripping are used in these projects. These methods are expensive, time consuming, and may poorly represent the features within archaeological sites. The use of remote sensing techniques in cultural resource management archaeology may provide an answer to these problems. Near-surface geophysical techniques, including magnetometry, resistivity, electromagnetics, and ground penetrating radar, have proven to be particularly successful at efficiently locating archaeological features. Research has also indicated airborne and satellite remote sensing may hold some promise in the future for large-scale archaeological survey, although this is difficult in many areas of the world where ground cover reflect archaeological features in an indirect manner. A cost simulation of a hypothetical data recovery project on a large complex site in Mississippi is presented to illustrate the potential advantages of remote sensing in a cultural resource management setting. The results indicate these techniques can save a substantial amount of time and money for these projects.
NASA Astrophysics Data System (ADS)
Sims-Waterhouse, D.; Bointon, P.; Piano, S.; Leach, R. K.
2017-06-01
In this paper we show that, by using a photogrammetry system with and without laser speckle, a large range of additive manufacturing (AM) parts with different geometries, materials and post-processing textures can be measured to high accuracy. AM test artefacts have been produced in three materials: polymer powder bed fusion (nylon-12), metal powder bed fusion (Ti-6Al-4V) and polymer material extrusion (ABS plastic). Each test artefact was then measured with the photogrammetry system in both normal and laser speckle projection modes and the resulting point clouds compared with the artefact CAD model. The results show that laser speckle projection can result in a reduction of the point cloud standard deviation from the CAD data of up to 101 μm. A complex relationship with surface texture, artefact geometry and the laser speckle projection is also observed and discussed.
NASA Astrophysics Data System (ADS)
Muller, Christian; PERICLES Consortium
2017-06-01
The FP-7 (Framework Programme 7 of the European Union) PERICLES project addresses the life-cycle of large and complex data sets to cater for the evolution of context of data sets and user communities, including groups unanticipated when the data was created. Semantics of data sets are thus also expected to evolve and the project includes elements which could address the reuse of data sets at periods where the data providers and even their institutions are not available any more. This paper presents the PERICLES science case with the example of the SOLAR (SOLAR monitoring observatory) payload on International Space Station-Columbus.
openBIS: a flexible framework for managing and analyzing complex data in biology research
2011-01-01
Background Modern data generation techniques used in distributed systems biology research projects often create datasets of enormous size and diversity. We argue that in order to overcome the challenge of managing those large quantitative datasets and maximise the biological information extracted from them, a sound information system is required. Ease of integration with data analysis pipelines and other computational tools is a key requirement for it. Results We have developed openBIS, an open source software framework for constructing user-friendly, scalable and powerful information systems for data and metadata acquired in biological experiments. openBIS enables users to collect, integrate, share, publish data and to connect to data processing pipelines. This framework can be extended and has been customized for different data types acquired by a range of technologies. Conclusions openBIS is currently being used by several SystemsX.ch and EU projects applying mass spectrometric measurements of metabolites and proteins, High Content Screening, or Next Generation Sequencing technologies. The attributes that make it interesting to a large research community involved in systems biology projects include versatility, simplicity in deployment, scalability to very large data, flexibility to handle any biological data type and extensibility to the needs of any research domain. PMID:22151573
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2012-01-01
This paper presents past and current work in dealing with indirect industry and NASA costs when providing cost estimation or analysis for NASA projects and programs. Indirect costs, when defined as those costs in a project removed from the actual hardware or software hands-on labor; makes up most of the costs of today's complex large scale NASA space/industry projects. This appears to be the case across phases from research into development into production and into the operation of the system. Space transportation is the case of interest here. Modeling and cost estimation as a process rather than a product will be emphasized. Analysis as a series of belief systems in play among decision makers and decision factors will also be emphasized to provide context.
Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Koo, Michelle; Cao, Yu
Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework, using state-ofthe-more » art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workows.« less
A decade of human genome project conclusion: Scientific diffusion about our genome knowledge.
Moraes, Fernanda; Góes, Andréa
2016-05-06
The Human Genome Project (HGP) was initiated in 1990 and completed in 2003. It aimed to sequence the whole human genome. Although it represented an advance in understanding the human genome and its complexity, many questions remained unanswered. Other projects were launched in order to unravel the mysteries of our genome, including the ENCyclopedia of DNA Elements (ENCODE). This review aims to analyze the evolution of scientific knowledge related to both the HGP and ENCODE projects. Data were retrieved from scientific articles published in 1990-2014, a period comprising the development and the 10 years following the HGP completion. The fact that only 20,000 genes are protein and RNA-coding is one of the most striking HGP results. A new concept about the organization of genome arose. The ENCODE project was initiated in 2003 and targeted to map the functional elements of the human genome. This project revealed that the human genome is pervasively transcribed. Therefore, it was determined that a large part of the non-protein coding regions are functional. Finally, a more sophisticated view of chromatin structure emerged. The mechanistic functioning of the genome has been redrafted, revealing a much more complex picture. Besides, a gene-centric conception of the organism has to be reviewed. A number of criticisms have emerged against the ENCODE project approaches, raising the question of whether non-conserved but biochemically active regions are truly functional. Thus, HGP and ENCODE projects accomplished a great map of the human genome, but the data generated still requires further in depth analysis. © 2016 by The International Union of Biochemistry and Molecular Biology, 44:215-223, 2016. © 2016 The International Union of Biochemistry and Molecular Biology.
Computer-assisted map projection research
Snyder, John Parr
1985-01-01
Computers have opened up areas of map projection research which were previously too complicated to utilize, for example, using a least-squares fit to a very large number of points. One application has been in the efficient transfer of data between maps on different projections. While the transfer of moderate amounts of data is satisfactorily accomplished using the analytical map projection formulas, polynomials are more efficient for massive transfers. Suitable coefficients for the polynomials may be determined more easily for general cases using least squares instead of Taylor series. A second area of research is in the determination of a map projection fitting an unlabeled map, so that accurate data transfer can take place. The computer can test one projection after another, and include iteration where required. A third area is in the use of least squares to fit a map projection with optimum parameters to the region being mapped, so that distortion is minimized. This can be accomplished for standard conformal, equalarea, or other types of projections. Even less distortion can result if complex transformations of conformal projections are utilized. This bulletin describes several recent applications of these principles, as well as historical usage and background.
Group Decision Support System to Aid the Process of Design and Maintenance of Large Scale Systems
1992-03-23
from a fuzzy set of user requirements. The overall objective of the project is to develop a system combining the characteristics of a compact computer... AHP ) for hierarchical prioritization. 4) Individual Evaluation and Selection of Alternatives - Allows the decision maker to individually evaluate...its concept of outranking relations. The AHP method supports complex decision problems by successively decomposing and synthesizing various elements
ERIC Educational Resources Information Center
Pitluck, Corrin
2010-01-01
Assuming the strength of small by design schools for poor urban students of color to be a settled question, this project attempts to analyze the sustainability of small by design schools in a large, complex urban district. Asking what causes small schools to converge toward or diverge from the small by design model, I analyze three sets of design…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
AvalonBay Communities, which is a large multifamily developer, was developing a three-building complex in Elmsford, New York. The buildings were planned to be certified to the ENERGY STAR® Homes Version 3 program. This plan led to AvalonBay partnering with the Advanced Residential Integrated Solutions (ARIES) collaborative, which is a U.S. Department of Energy Building America team. ARIES worked with AvalonBay to redesign the project to comply with Zero Energy Ready Home (ZERH) criteria.
A New Illuminator of Opportunity Bistatic Radar Research Project at DSTO
2009-05-01
digitally down convert each IF into 32-bit complex samples . That is, it generates 16-bit in-phase and quadrature -phase samples and saves them on a large non...cross- correlation process (see Equation 14), to produce each frame of the movies presented in Figures 30 - 36. The MATLAB code used to produce the...11 3.3.1 Terrestrial TV Configuration . . . . . . . . . . . . . . . . . . . . . 11 3.3.2 GPS Configuration
Meteor Observations as Big Data Citizen Science
NASA Astrophysics Data System (ADS)
Gritsevich, M.; Vinkovic, D.; Schwarz, G.; Nina, A.; Koschny, D.; Lyytinen, E.
2016-12-01
Meteor science represents an excellent example of the citizen science project, where progress in the field has been largely determined by amateur observations. Over the last couple of decades technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established BigSkyEarth http://bigskyearth.eu/ network.
Toward a theoretical framework for trustworthy cyber sensing
NASA Astrophysics Data System (ADS)
Xu, Shouhuai
2010-04-01
Cyberspace is an indispensable part of the economy and society, but has been "polluted" with many compromised computers that can be abused to launch further attacks against the others. Since it is likely that there always are compromised computers, it is important to be aware of the (dynamic) cyber security-related situation, which is however challenging because cyberspace is an extremely large-scale complex system. Our project aims to investigate a theoretical framework for trustworthy cyber sensing. With the perspective of treating cyberspace as a large-scale complex system, the core question we aim to address is: What would be a competent theoretical (mathematical and algorithmic) framework for designing, analyzing, deploying, managing, and adapting cyber sensor systems so as to provide trustworthy information or input to the higher layer of cyber situation-awareness management, even in the presence of sophisticated malicious attacks against the cyber sensor systems?
Krojer, Tobias; Talon, Romain; Pearce, Nicholas; Collins, Patrick; Douangamath, Alice; Brandao-Neto, Jose; Dias, Alexandre; Marsden, Brian; von Delft, Frank
2017-03-01
XChemExplorer (XCE) is a data-management and workflow tool to support large-scale simultaneous analysis of protein-ligand complexes during structure-based ligand discovery (SBLD). The user interfaces of established crystallographic software packages such as CCP4 [Winn et al. (2011), Acta Cryst. D67, 235-242] or PHENIX [Adams et al. (2010), Acta Cryst. D66, 213-221] have entrenched the paradigm that a `project' is concerned with solving one structure. This does not hold for SBLD, where many almost identical structures need to be solved and analysed quickly in one batch of work. Functionality to track progress and annotate structures is essential. XCE provides an intuitive graphical user interface which guides the user from data processing, initial map calculation, ligand identification and refinement up until data dissemination. It provides multiple entry points depending on the need of each project, enables batch processing of multiple data sets and records metadata, progress and annotations in an SQLite database. XCE is freely available and works on any Linux and Mac OS X system, and the only dependency is to have the latest version of CCP4 installed. The design and usage of this tool are described here, and its usefulness is demonstrated in the context of fragment-screening campaigns at the Diamond Light Source. It is routinely used to analyse projects comprising 1000 data sets or more, and therefore scales well to even very large ligand-design projects.
Wright, Michael T.; Fram, Miranda S.; Belitz, Kenneth
2015-01-01
Concentrations of strontium, which exists primarily in a cationic form (Sr2+), were not significantly correlated with either groundwater age or pH. Strontium concentrations showed a strong positive correlation with total dissolved solids (TDS). Dissolved constituents, such as Sr, that interact with mineral surfaces through outer-sphere complexation become increasingly soluble with increasing TDS concentrations of groundwater. Boron concentrations also showed a significant positive correlation with TDS, indicating the B may interact to a large degree with mineral surfaces through outer-sphere complexation.
A high performance cost-effective digital complex correlator for an X-band polarimetry survey.
Bergano, Miguel; Rocha, Armando; Cupido, Luís; Barbosa, Domingos; Villela, Thyrso; Boas, José Vilas; Rocha, Graça; Smoot, George F
2016-01-01
The detailed knowledge of the Milky Way radio emission is important to characterize galactic foregrounds masking extragalactic and cosmological signals. The update of the global sky models describing radio emissions over a very large spectral band requires high sensitivity experiments capable of observing large sky areas with long integration times. Here, we present the design of a new 10 GHz (X-band) polarimeter digital back-end to map the polarization components of the galactic synchrotron radiation field of the Northern Hemisphere sky. The design follows the digital processing trends in radio astronomy and implements a large bandwidth (1 GHz) digital complex cross-correlator to extract the Stokes parameters of the incoming synchrotron radiation field. The hardware constraints cover the implemented VLSI hardware description language code and the preliminary results. The implementation is based on the simultaneous digitized acquisition of the Cartesian components of the two linear receiver polarization channels. The design strategy involves a double data rate acquisition of the ADC interleaved parallel bus, and field programmable gate array device programming at the register transfer mode. The digital core of the back-end is capable of processing 32 Gbps and is built around an Altera field programmable gate array clocked at 250 MHz, 1 GSps analog to digital converters and a clock generator. The control of the field programmable gate array internal signal delays and a convenient use of its phase locked loops provide the timing requirements to achieve the target bandwidths and sensitivity. This solution is convenient for radio astronomy experiments requiring large bandwidth, high functionality, high volume availability and low cost. Of particular interest, this correlator was developed for the Galactic Emission Mapping project and is suitable for large sky area polarization continuum surveys. The solutions may also be adapted to be used at signal processing subsystem levels for large projects like the square kilometer array testbeds.
Complexity analysis of the Next Gen Air Traffic Management System: trajectory based operations.
Lyons, Rhonda
2012-01-01
According to Federal Aviation Administration traffic predictions currently our Air Traffic Management (ATM) system is operating at 150 percent capacity; forecasting that within the next two decades, the traffic with increase to a staggering 250 percent [17]. This will require a major redesign of our system. Today's ATM system is complex. It is designed to safely, economically, and efficiently provide air traffic services through the cost-effective provision of facilities and seamless services in collaboration with multiple agents however, contrary the vision, the system is loosely integrated and is suffering tremendously from antiquated equipment and saturated airways. The new Next Generation (Next Gen) ATM system is designed to transform the current system into an agile, robust and responsive set of operations that are designed to safely manage the growing needs of the projected increasingly complex, diverse set of air transportation system users and massive projected worldwide traffic rates. This new revolutionary technology-centric system is dynamically complex and is much more sophisticated than it's soon to be predecessor. ATM system failures could yield large scale catastrophic consequences as it is a safety critical system. This work will attempt to describe complexity and the complex nature of the NextGen ATM system and Trajectory Based Operational. Complex human factors interactions within Next Gen will be analyzed using a proposed dual experimental approach designed to identify hazards, gaps and elicit emergent hazards that would not be visible if conducted in isolation. Suggestions will be made along with a proposal for future human factors research in the TBO safety critical Next Gen environment.
Developing A Large-Scale, Collaborative, Productive Geoscience Education Network
NASA Astrophysics Data System (ADS)
Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.
2012-12-01
Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development . Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong Geoscience Departments sought to create the same type of shared information base that was supporting individual faculty for departments. The Teach the Earth portal and its underlying web development tools were used by NSF-funded projects in education to disseminate their results. Leveraging these funded efforts, the Climate Literacy Network has expanded this geoscience education community to include individuals broadly interested in fostering climate literacy. Most recently, the InTeGrate project is implementing inter-institutional collaborative authoring, testing and evaluation of curricular materials. While these projects represent only a fraction of the activity in geoscience education, they are important drivers in the development of a large, national, coherent geoscience education network with the ability to collaborate and disseminate information effectively. Importantly, the community is open and defined by active participation. Key mechanisms for engagement have included alignment of project activities with participants needs and goals; productive face-to-face and virtual workshops, events, and series; stipends for completion of large products; and strong supporting staff to keep projects moving and assist with product production. One measure of its success is the adoption and adaptation of resources and models by emerging projects, which results in the continued growth of the network.
Multipoint molecular recognition within a calix[6]arene funnel complex
Coquière, David; de la Lande, Aurélien; Martí, Sergio; Parisel, Olivier; Prangé, Thierry; Reinaud, Olivia
2009-01-01
A multipoint recognition system based on a calix[6]arene is described. The calixarene core is decorated on alternating aromatic subunits by 3 imidazole arms at the small rim and 3 aniline groups at the large rim. This substitution pattern projects the aniline nitrogens toward each other when Zn(II) binds at the Tris-imidazole site or when a proton binds at an aniline. The XRD structure of the monoprotonated complex having an acetonitrile molecule bound to Zn(II) in the cavity revealed a constrained geometry at the metal center reminiscent of an entatic state. Computer modeling suggests that the aniline groups behave as a tritopic monobasic site in which only 1 aniline unit is protonated and interacts with the other 2 through strong hydrogen bonding. The metal complex selectively binds a monoprotonated diamine vs. a monoamine through multipoint recognition: coordination to the metal ion at the small rim, hydrogen bonding to the calix-oxygen core, CH/π interaction within the cavity's aromatic walls, and H-bonding to the anilines at the large rim. PMID:19237564
Vecharynski, Eugene; Yang, Chao; Pask, John E.
2015-02-25
Here, we present an iterative algorithm for computing an invariant subspace associated with the algebraically smallest eigenvalues of a large sparse or structured Hermitian matrix A. We are interested in the case in which the dimension of the invariant subspace is large (e.g., over several hundreds or thousands) even though it may still be small relative to the dimension of A. These problems arise from, for example, density functional theory (DFT) based electronic structure calculations for complex materials. The key feature of our algorithm is that it performs fewer Rayleigh–Ritz calculations compared to existing algorithms such as the locally optimalmore » block preconditioned conjugate gradient or the Davidson algorithm. It is a block algorithm, and hence can take advantage of efficient BLAS3 operations and be implemented with multiple levels of concurrency. We discuss a number of practical issues that must be addressed in order to implement the algorithm efficiently on a high performance computer.« less
Integrating science into management of ecosystems in the Greater Blue Mountains.
Chapple, Rosalie S; Ramp, Daniel; Bradstock, Ross A; Kingsford, Richard T; Merson, John A; Auld, Tony D; Fleming, Peter J S; Mulley, Robert C
2011-10-01
Effective management of large protected conservation areas is challenged by political, institutional and environmental complexity and inconsistency. Knowledge generation and its uptake into management are crucial to address these challenges. We reflect on practice at the interface between science and management of the Greater Blue Mountains World Heritage Area (GBMWHA), which covers approximately 1 million hectares west of Sydney, Australia. Multiple government agencies and other stakeholders are involved in its management, and decision-making is confounded by numerous plans of management and competing values and goals, reflecting the different objectives and responsibilities of stakeholders. To highlight the complexities of the decision-making process for this large area, we draw on the outcomes of a recent collaborative research project and focus on fire regimes and wild-dog control as examples of how existing knowledge is integrated into management. The collaborative research project achieved the objectives of collating and synthesizing biological data for the region; however, transfer of the project's outcomes to management has proved problematic. Reasons attributed to this include lack of clearly defined management objectives to guide research directions and uptake, and scientific information not being made more understandable and accessible. A key role of a local bridging organisation (e.g., the Blue Mountains World Heritage Institute) in linking science and management is ensuring that research results with management significance can be effectively transmitted to agencies and that outcomes are explained for nonspecialists as well as more widely distributed. We conclude that improved links between science, policy, and management within an adaptive learning-by-doing framework for the GBMWHA would assist the usefulness and uptake of future research.
Integrating Science into Management of Ecosystems in the Greater Blue Mountains
NASA Astrophysics Data System (ADS)
Chapple, Rosalie S.; Ramp, Daniel; Bradstock, Ross A.; Kingsford, Richard T.; Merson, John A.; Auld, Tony D.; Fleming, Peter J. S.; Mulley, Robert C.
2011-10-01
Effective management of large protected conservation areas is challenged by political, institutional and environmental complexity and inconsistency. Knowledge generation and its uptake into management are crucial to address these challenges. We reflect on practice at the interface between science and management of the Greater Blue Mountains World Heritage Area (GBMWHA), which covers approximately 1 million hectares west of Sydney, Australia. Multiple government agencies and other stakeholders are involved in its management, and decision-making is confounded by numerous plans of management and competing values and goals, reflecting the different objectives and responsibilities of stakeholders. To highlight the complexities of the decision-making process for this large area, we draw on the outcomes of a recent collaborative research project and focus on fire regimes and wild-dog control as examples of how existing knowledge is integrated into management. The collaborative research project achieved the objectives of collating and synthesizing biological data for the region; however, transfer of the project's outcomes to management has proved problematic. Reasons attributed to this include lack of clearly defined management objectives to guide research directions and uptake, and scientific information not being made more understandable and accessible. A key role of a local bridging organisation (e.g., the Blue Mountains World Heritage Institute) in linking science and management is ensuring that research results with management significance can be effectively transmitted to agencies and that outcomes are explained for nonspecialists as well as more widely distributed. We conclude that improved links between science, policy, and management within an adaptive learning-by-doing framework for the GBMWHA would assist the usefulness and uptake of future research.
An Efficient Image Compressor for Charge Coupled Devices Camera
Li, Jin; Xing, Fei; You, Zheng
2014-01-01
Recently, the discrete wavelet transforms- (DWT-) based compressor, such as JPEG2000 and CCSDS-IDC, is widely seen as the state of the art compression scheme for charge coupled devices (CCD) camera. However, CCD images project on the DWT basis to produce a large number of large amplitude high-frequency coefficients because these images have a large number of complex texture and contour information, which are disadvantage for the later coding. In this paper, we proposed a low-complexity posttransform coupled with compressing sensing (PT-CS) compression approach for remote sensing image. First, the DWT is applied to the remote sensing image. Then, a pair base posttransform is applied to the DWT coefficients. The pair base are DCT base and Hadamard base, which can be used on the high and low bit-rate, respectively. The best posttransform is selected by the l p-norm-based approach. The posttransform is considered as the sparse representation stage of CS. The posttransform coefficients are resampled by sensing measurement matrix. Experimental results on on-board CCD camera images show that the proposed approach significantly outperforms the CCSDS-IDC-based coder, and its performance is comparable to that of the JPEG2000 at low bit rate and it does not have the high excessive implementation complexity of JPEG2000. PMID:25114977
Risk assessment for construction projects of transport infrastructure objects
NASA Astrophysics Data System (ADS)
Titarenko, Boris
2017-10-01
The paper analyzes and compares different methods of risk assessment for construction projects of transport objects. The management of such type of projects demands application of special probabilistic methods due to large level of uncertainty of their implementation. Risk management in the projects requires the use of probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that allow obtaining reliable risk assessments in projects. The robust approach is based on the principle of maximum likelihood and in assessing the risk allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures allows to carry out a quantitative assessment of the main risk indicators of projects when solving the tasks of managing innovation-investment projects. Calculation of damage from the onset of a risky event is possible by any competent specialist. And an assessment of the probability of occurrence of a risky event requires the involvement of special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of results. The methodology developed in the article can be used to create information technologies and their application in automated control systems for complex projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoverson, Eric D.; Amonette, Alexandra
2008-12-02
The Umatilla Anadromous Fisheries Habitat Project (UAFHP) is an ongoing effort to protect, enhance, and restore riparian and instream habitat for the natural production of anadromous salmonids in the Umatilla River Basin, Northeast Oregon. Flow quantity, water temperature, passage, and lack of in-stream channel complexity have been identified as the key limiting factors in the basin. During the 2007 Fiscal Year (FY) reporting period (February 1, 2007-January 31, 2008) primary project activities focused on improving instream and riparian habitat complexity, migrational passage, and restoring natural channel morphology and floodplain function. Eight fisheries habitat enhancement projects were implemented on Meacham Creek,more » Camp Creek, Greasewood Creek, Birch Creek, West Birch Creek, and the Umatilla River. Specific restoration actions included: (1) rectifying five fish passage barriers on four creeks, (2) planting 1,275 saplings and seeding 130 pounds of native grasses, (3) constructing two miles of riparian fencing for livestock exclusion, (4) coordinating activities related to the installation of two off-channel, solar-powered watering areas for livestock, and (5) developing eight water gap access sites to reduce impacts from livestock. Baseline and ongoing monitoring and evaluation activities were also completed on major project areas such as conducting photo point monitoring strategies activities at the Meacham Creek Large Wood Implementation Project site (FY2006) and at all existing easements and planned project sites. Fish surveys and aquatic habitat inventories were conducted at project sites prior to implementation. Monitoring plans will continue throughout the life of each project to oversee progression and inspire timely managerial actions. Twenty-seven conservation easements were maintained with 23 landowners. Permitting applications for planned project activities and biological opinions were written and approved. Project activities were based on a variety of fisheries monitoring techniques and habitat assessments used to determine existing conditions and identify factors limiting anadromous salmonid abundance. Proper selection and implementation of the most effective site-specific habitat restoration plan, taking into consideration the unique characteristics of each project site, and conducted in cooperation with landowners and project partners, was of paramount importance to ensure each project's success.« less
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact of compositional verification research conducted by the National Aeronautic and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification was described. Traditional verification techniques have two major problems: testing at the prototype stage where error discovery can be quite costly and the inability to test for all potential interactions leaving some errors undetected until used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
Williams, Holly; Downes, Elizabeth
2017-11-01
The effects of climate change are far-reaching and multifactorial, with potential impacts on food security and conflict. Large population movements, whether from the aftermath of natural disasters or resulting from conflict, can precipitate the need for humanitarian response in what can become complex humanitarian emergencies (CHEs). Nurses need to be prepared to respond to affected communities in need, whether the emergency is domestic or global. The purpose of the article is to describe a novel course for nursing students interested in practice within the confines of CHEs and natural disasters. The authors used the Sphere Humanitarian Charter and Minimum Standards as a practical framework to inform the course development. They completed a review of the literature on the interaction on climate change, conflict and health, and competencies related to working CHEs. Resettled refugees, as well as experts in the area of humanitarian response, recovery, and mitigation from the Centers for Disease Control and Prevention and nongovernmental organizations further informed the development of the course. This course prepares the nursing workforce to respond appropriately to large population movements that may arise from the aftermath of natural disasters or conflict, both of which can comprise a complex humanitarian disaster. Using The Sphere Project e-learning course, students learn about the Sphere Project, which works to ensure accountability and quality in humanitarian response and offers core minimal standards for technical assistance. These guidelines are seen globally as the gold standard for humanitarian response and address many of the competencies for disaster nursing (http://www.sphereproject.org/learning/e-learning-course/). © 2017 Sigma Theta Tau International.
Final Report of the Project "From the finite element method to the virtual element method"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manzini, Gianmarco; Gyrya, Vitaliy
The Finite Element Method (FEM) is a powerful numerical tool that is being used in a large number of engineering applications. The FEM is constructed on triangular/tetrahedral and quadrilateral/hexahedral meshes. Extending the FEM to general polygonal/polyhedral meshes in straightforward way turns out to be extremely difficult and leads to very complex and computationally expensive schemes. The reason for this failure is that the construction of the basis functions on elements with a very general shape is a non-trivial and complex task. In this project we developed a new family of numerical methods, dubbed the Virtual Element Method (VEM) for themore » numerical approximation of partial differential equations (PDE) of elliptic type suitable to polygonal and polyhedral unstructured meshes. We successfully formulated, implemented and tested these methods and studied both theoretically and numerically their stability, robustness and accuracy for diffusion problems, convection-reaction-diffusion problems, the Stokes equations and the biharmonic equations.« less
More ‘altruistic’ punishment in larger societies
Marlowe, Frank W; Berbesque, J. Colette; Barr, Abigail; Barrett, Clark; Bolyanatz, Alexander; Cardenas, Juan Camilo; Ensminger, Jean; Gurven, Michael; Gwako, Edwins; Henrich, Joseph; Henrich, Natalie; Lesorogol, Carolyn; McElreath, Richard; Tracer, David
2007-01-01
If individuals will cooperate with cooperators, and punish non-cooperators even at a cost to themselves, then this strong reciprocity could minimize the cheating that undermines cooperation. Based upon numerous economic experiments, some have proposed that human cooperation is explained by strong reciprocity and norm enforcement. Second-party punishment is when you punish someone who defected on you; third-party punishment is when you punish someone who defected on someone else. Third-party punishment is an effective way to enforce the norms of strong reciprocity and promote cooperation. Here we present new results that expand on a previous report from a large cross-cultural project. This project has already shown that there is considerable cross-cultural variation in punishment and cooperation. Here we test the hypothesis that population size (and complexity) predicts the level of third-party punishment. Our results show that people in larger, more complex societies engage in significantly more third-party punishment than people in small-scale societies. PMID:18089534
Coates, Jennifer C; Colaiezzi, Brooke A; Bell, Winnie; Charrondiere, U Ruth; Leclercq, Catherine
2017-03-16
An increasing number of low-income countries (LICs) exhibit high rates of malnutrition coincident with rising rates of overweight and obesity. Individual-level dietary data are needed to inform effective responses, yet dietary data from large-scale surveys conducted in LICs remain extremely limited. This discussion paper first seeks to highlight the barriers to collection and use of individual-level dietary data in LICs. Second, it introduces readers to new technological developments and research initiatives to remedy this situation, led by the International Dietary Data Expansion (INDDEX) Project. Constraints to conducting large-scale dietary assessments include significant costs, time burden, technical complexity, and limited investment in dietary research infrastructure, including the necessary tools and databases required to collect individual-level dietary data in large surveys. To address existing bottlenecks, the INDDEX Project is developing a dietary assessment platform for LICs, called INDDEX24, consisting of a mobile application integrated with a web database application, which is expected to facilitate seamless data collection and processing. These tools will be subject to rigorous testing including feasibility, validation, and cost studies. To scale up dietary data collection and use in LICs, the INDDEX Project will also invest in food composition databases, an individual-level dietary data dissemination platform, and capacity development activities. Although the INDDEX Project activities are expected to improve the ability of researchers and policymakers in low-income countries to collect, process, and use dietary data, the global nutrition community is urged to commit further significant investments in order to adequately address the range and scope of challenges described in this paper.
Coates, Jennifer C.; Colaiezzi, Brooke A.; Bell, Winnie; Charrondiere, U. Ruth; Leclercq, Catherine
2017-01-01
An increasing number of low-income countries (LICs) exhibit high rates of malnutrition coincident with rising rates of overweight and obesity. Individual-level dietary data are needed to inform effective responses, yet dietary data from large-scale surveys conducted in LICs remain extremely limited. This discussion paper first seeks to highlight the barriers to collection and use of individual-level dietary data in LICs. Second, it introduces readers to new technological developments and research initiatives to remedy this situation, led by the International Dietary Data Expansion (INDDEX) Project. Constraints to conducting large-scale dietary assessments include significant costs, time burden, technical complexity, and limited investment in dietary research infrastructure, including the necessary tools and databases required to collect individual-level dietary data in large surveys. To address existing bottlenecks, the INDDEX Project is developing a dietary assessment platform for LICs, called INDDEX24, consisting of a mobile application integrated with a web database application, which is expected to facilitate seamless data collection and processing. These tools will be subject to rigorous testing including feasibility, validation, and cost studies. To scale up dietary data collection and use in LICs, the INDDEX Project will also invest in food composition databases, an individual-level dietary data dissemination platform, and capacity development activities. Although the INDDEX Project activities are expected to improve the ability of researchers and policymakers in low-income countries to collect, process, and use dietary data, the global nutrition community is urged to commit further significant investments in order to adequately address the range and scope of challenges described in this paper. PMID:28300759
2004-02-01
UNCLASSIFIED − Conducted experiments to determine the usability of general-purpose anomaly detection algorithms to monitor a large, complex military...reaction and detection modules to perform tailored analysis sequences to monitor environmental conditions, health hazards and physiological states...scalability of lab proven anomaly detection techniques for intrusion detection in real world high volume environments. Narrative Title FY 2003
Fast assembling of neuron fragments in serial 3D sections.
Chen, Hanbo; Iascone, Daniel Maxim; da Costa, Nuno Maçarico; Lein, Ed S; Liu, Tianming; Peng, Hanchuan
2017-09-01
Reconstructing neurons from 3D image-stacks of serial sections of thick brain tissue is very time-consuming and often becomes a bottleneck in high-throughput brain mapping projects. We developed NeuronStitcher, a software suite for stitching non-overlapping neuron fragments reconstructed in serial 3D image sections. With its efficient algorithm and user-friendly interface, NeuronStitcher has been used successfully to reconstruct very large and complex human and mouse neurons.
NASA Technical Reports Server (NTRS)
Barclay, Rebecca O.; Pinelli, Thomas E.
1997-01-01
The large and complex aerospace industry, which employed approximately 850,000 people in 1994 (Aerospace Facts, 1994-95, p. 11), plays a vital role in the nation's economy. Although only a small percentage of those employed in aerospace are technical communicators, they perform a wide variety of communication duties in government and the private sector.
2014-05-21
simulating air-water free -surface flow, fluid-object interaction (FOI), and fluid-structure interaction (FSI) phenomena for complex geometries, and...with no limitations on the motion of the free surface, and with particular emphasis on ship hydrodynamics. The following specific research objectives...were identified for this project: 1) Development of a theoretical framework for free -surface flow, FOI and FSI that is a suitable starting point
Ohio River Environmental Assessment: Cultural Resources Reconnaissance Report, West Virginia.
1977-08-01
that Paleo-Indian populations consisted of nomadic hunting bands that ranged over large territories. Population density in the project area appears to...Complex which is present in Ohio and filters sporadically to the floodplain (Prufer and Baby 1963) during this period appears to be absent in West Virignia...riverine and estuarine resources as well as hunting and gathering (Caldwell, 1958). Sites are variable in size and density with some indications of
Research on cost control and management in high voltage transmission line construction
NASA Astrophysics Data System (ADS)
Xu, Xiaobin
2017-05-01
Enterprises. The cost control is of vital importance to the construction enterprises. It is the key to the profitability of the transmission line project, which is related to the survival and development of the electric power construction enterprises. Due to the long construction line, complex and changeable construction terrain as well as large construction costs of transmission line, it is difficult for us to take accurate and effective cost control on the project implementation of entire transmission line. Therefore, the cost control of transmission line project is a complicated and arduous task. It is of great theoretical and practical significance to study the cost control scheme of transmission line project by a more scientific and efficient way. Based on the characteristics of the construction project of the transmission line project, this paper analyzes the construction cost structure of the transmission line project and the current cost control problem of the transmission line project, and demonstrates the necessity and feasibility of studying the cost control scheme of the transmission line project more accurately. In this way, the dynamic cycle cost control process including plan, implementation, feedback, correction, modification and re-implement is achieved to realize the accurate and effective cost control of entire electric power transmission line project.
Shafer, Sarah; Bartlein, Patrick J.; Gray, Elizabeth M.; Pelltier, Richard T.
2015-01-01
Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0–58.0°N latitude by 136.6–103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10-minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070–2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gases emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas.
Shafer, Sarah L.; Bartlein, Patrick J.; Gray, Elizabeth M.; Pelltier, Richard T.
2015-01-01
Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0–58.0°N latitude by 136.6–103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10-minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070–2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gases emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas. PMID:26488750
NASA Astrophysics Data System (ADS)
Eschweiler, Joseph D.; Frank, Aaron T.; Ruotolo, Brandon T.
2017-10-01
Multiprotein complexes are central to our understanding of cellular biology, as they play critical roles in nearly every biological process. Despite many impressive advances associated with structural characterization techniques, large and highly-dynamic protein complexes are too often refractory to analysis by conventional, high-resolution approaches. To fill this gap, ion mobility-mass spectrometry (IM-MS) methods have emerged as a promising approach for characterizing the structures of challenging assemblies due in large part to the ability of these methods to characterize the composition, connectivity, and topology of large, labile complexes. In this Critical Insight, we present a series of bioinformatics studies aimed at assessing the information content of IM-MS datasets for building models of multiprotein structure. Our computational data highlights the limits of current coarse-graining approaches, and compelled us to develop an improved workflow for multiprotein topology modeling, which we benchmark against a subset of the multiprotein complexes within the PDB. This improved workflow has allowed us to ascertain both the minimal experimental restraint sets required for generation of high-confidence multiprotein topologies, and quantify the ambiguity in models where insufficient IM-MS information is available. We conclude by projecting the future of IM-MS in the context of protein quaternary structure assignment, where we predict that a more complete knowledge of the ultimate information content and ambiguity within such models will undoubtedly lead to applications for a broader array of challenging biomolecular assemblies. [Figure not available: see fulltext.
Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.
2006-01-01
This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build to the system, describe their functionality and interactions. We show the results of running the CyberShake analysis that included over 250,000 jobs using resources available through SCEC and the TeraGrid. ?? 2006 IEEE.
A Portable Computer System for Auditing Quality of Ambulatory Care
McCoy, J. Michael; Dunn, Earl V.; Borgiel, Alexander E.
1987-01-01
Prior efforts to effectively and efficiently audit quality of ambulatory care based on comprehensive process criteria have been limited largely by the complexity and cost of data abstraction and management. Over the years, several demonstration projects have generated large sets of process criteria and mapping systems for evaluating quality of care, but these paper-based approaches have been impractical to implement on a routine basis. Recognizing that portable microcomputers could solve many of the technical problems in abstracting data from medical records, we built upon previously described criteria and developed a microcomputer-based abstracting system that facilitates reliable and cost-effective data abstraction.
Supporting Knowledge Transfer in IS Deployment Projects
NASA Astrophysics Data System (ADS)
Schönström, Mikael
To deploy new information systems is an expensive and complex task, and does seldom result in successful usage where the system adds strategic value to the firm (e.g. Sharma et al. 2003). It has been argued that innovation diffusion is a knowledge integration problem (Newell et al. 2000). Knowledge about business processes, deployment processes, information systems and technology are needed in a large-scale deployment of a corporate IS. These deployments can therefore to a large extent be argued to be a knowledge management (KM) problem. An effective deployment requires that knowledge about the system is effectively transferred to the target organization (Ko et al. 2005).
Hybrid function projective synchronization in complex dynamical networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Qiang; Wang, Xing-yuan, E-mail: wangxy@dlut.edu.cn; Hu, Xiao-peng
2014-02-15
This paper investigates hybrid function projective synchronization in complex dynamical networks. When the complex dynamical networks could be synchronized up to an equilibrium or periodic orbit, a hybrid feedback controller is designed to realize the different component of vector of node could be synchronized up to different desired scaling function in complex dynamical networks with time delay. Hybrid function projective synchronization (HFPS) in complex dynamical networks with constant delay and HFPS in complex dynamical networks with time-varying coupling delay are researched, respectively. Finally, the numerical simulations show the effectiveness of theoretical analysis.
Final Report. Analysis and Reduction of Complex Networks Under Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef M.; Coles, T.; Spantini, A.
2013-09-30
The project was a collaborative effort among MIT, Sandia National Laboratories (local PI Dr. Habib Najm), the University of Southern California (local PI Prof. Roger Ghanem), and The Johns Hopkins University (local PI Prof. Omar Knio, now at Duke University). Our focus was the analysis and reduction of large-scale dynamical systems emerging from networks of interacting components. Such networks underlie myriad natural and engineered systems. Examples important to DOE include chemical models of energy conversion processes, and elements of national infrastructure—e.g., electric power grids. Time scales in chemical systems span orders of magnitude, while infrastructure networks feature both local andmore » long-distance connectivity, with associated clusters of time scales. These systems also blend continuous and discrete behavior; examples include saturation phenomena in surface chemistry and catalysis, and switching in electrical networks. Reducing size and stiffness is essential to tractable and predictive simulation of these systems. Computational singular perturbation (CSP) has been effectively used to identify and decouple dynamics at disparate time scales in chemical systems, allowing reduction of model complexity and stiffness. In realistic settings, however, model reduction must contend with uncertainties, which are often greatest in large-scale systems most in need of reduction. Uncertainty is not limited to parameters; one must also address structural uncertainties—e.g., whether a link is present in a network—and the impact of random perturbations, e.g., fluctuating loads or sources. Research under this project developed new methods for the analysis and reduction of complex multiscale networks under uncertainty, by combining computational singular perturbation (CSP) with probabilistic uncertainty quantification. CSP yields asymptotic approximations of reduceddimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty in this context raised fundamentally new issues, e.g., how is the topology of slow manifolds transformed by parametric uncertainty? How to construct dynamical models on these uncertain manifolds? To address these questions, we used stochastic spectral polynomial chaos (PC) methods to reformulate uncertain network models and analyzed them using CSP in probabilistic terms. Finding uncertain manifolds involved the solution of stochastic eigenvalue problems, facilitated by projection onto PC bases. These problems motivated us to explore the spectral properties stochastic Galerkin systems. We also introduced novel methods for rank-reduction in stochastic eigensystems—transformations of a uncertain dynamical system that lead to lower storage and solution complexity. These technical accomplishments are detailed below. This report focuses on the MIT portion of the joint project.« less
Recent progress in 3-D imaging of sea freight containers
NASA Astrophysics Data System (ADS)
Fuchs, Theobald; Schön, Tobias; Dittmann, Jonas; Sukowski, Frank; Hanke, Randolf
2015-03-01
The inspection of very large objects like sea freight containers with X-ray Computed Tomography (CT) is an emerging technology. A complete 3-D CT scan of a see-freight container takes several hours. Of course, this is too slow to apply it to a large number of containers. However, the benefits of a 3-D CT for sealed freight are obvious: detection of potential threats or illicit cargo without being confronted with legal complications or high time consumption and risks for the security personnel during a manual inspection. Recently distinct progress was made in the field of reconstruction of projections with only a relatively low number of angular positions. Instead of today's 500 to 1000 rotational steps, as needed for conventional CT reconstruction techniques, this new class of algorithms provides the potential to reduce the number of projection angles approximately by a factor of 10. The main drawback of these advanced iterative methods is the high consumption for numerical processing. But as computational power is getting steadily cheaper, there will be practical applications of these complex algorithms in a foreseeable future. In this paper, we discuss the properties of iterative image reconstruction algorithms and show results of their application to CT of extremely large objects scanning a sea-freight container. A specific test specimen is used to quantitatively evaluate the image quality in terms of spatial and contrast resolution and depending on different number of projections.
Modeling Costal Zone Responses to Sea-Level Rise Using MoCCS: A Model of Complex Coastal System
NASA Astrophysics Data System (ADS)
Dai, H.; Niedoroda, A. W.; Ye, M.; Saha, B.; Donoghue, J. F.; Kish, S.
2011-12-01
Large-scale coastal systems consisting of several morphological components (e.g. beach, surf zone, dune, inlet, shoreface, and estuary) can be expected to exhibit complex and interacting responses to changes in the rate of sea level rise and storm climate. We have developed a numerical model of complex coastal systems (MoCCS), derived from earlier morphdynamic models, to represent the large-scale time-averaged physical processes that shape each component and govern the component interactions. These control the ongoing evolution of the barrier islands, beach and dune erosion, shoal formation and sand withdrawal at tidal inlets, depth changes in the bay, and changes in storm flooding. The model has been used to study the response of an idealized coastal system with physical characteristics and storm climatology similar to Santa Rosa Island on the Florida Panhandle coast. Five SLR scenarios have been used, covering the range of recently published projections for the next century. Each scenario has been input with a constant and then a time-varying storm climate. The results indicate that substantial increases in the rate of beach erosion are largely due to increased sand transfer to inlet shoals with increased rates of sea level rise. The barrier island undergoes cycles of dune destruction and regrowth, leading to sand deposition. This largely maintains island freeboard but is progressively less effective in offsetting bayside inundation and marsh habitat loss at accelerated sea level rise rates.
Complexity analysis of the cost effectiveness of PI-led NASA science missions
NASA Astrophysics Data System (ADS)
Yoshida, J.; Cowdin, M.; Mize, T.; Kellogg, R.; Bearden, D.
For the last 20 years, NASA has allowed Principal Investigators (PIs) to manage the development of many unmanned space projects. Advocates of PI-led projects believe that a PI-led implementation can result in a project being developed at lower cost and shorter schedule than other implementation modes. This paper seeks to test this hypothesis by comparing the actual costs of NASA and other comparable projects developed under different implementation modes. The Aerospace Corporation's Complexity-Based Risk Assessment (CoBRA) analysis tool is used to normalize the projects such that the cost can be compared for equivalent project complexities. The data is examined both by complexity and by launch year. Cost growth will also be examined for any correlation with implementation mode. Defined in many NASA Announcements of Opportunity (AOs), a PI-led project is characterized by a central, single person with full responsibility for assembling a team and for the project's scientific integrity and the implementation and integrity of all other aspects of the mission, while operating under a cost cap. PIs have larger degrees of freedom to achieve the stated goals within NASA guidelines and oversight. This study leverages the definitions and results of previous National Research Council studies of PI-led projects. Aerospace has defined a complexity index, derived from mission performance, mass, power, and technology choices, to arrive at a broad representation of missions for purposes of comparison. Over a decade of research has established a correlation between mission complexity and spacecraft development cost and schedule. This complexity analysis, CoBRA, is applied to compare a PI-led set of New Frontiers, Discovery, Explorers, and Earth System Science Pathfinder missions to the overall NASA mission dataset. This reveals the complexity trends against development costs, cost growth, and development era.
Management evolution in the LSST project
NASA Astrophysics Data System (ADS)
Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine
2010-07-01
The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010; the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industry. The public private collaboration aims to complete the estimated $450 M observatory in the 2017 timeframe. During the design phase of the project from 2003 to the present the management structure has been remarkably stable. At the same time, the funding levels, staffing levels and scientific community participation have grown dramatically. The LSSTC has introduced project controls and tools required to manage the LSST's complex funding model, technical structure and distributed work force. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools for risk management, configuration control and resource-loaded schedule have been effective and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each sub-system has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager who have overall authority.
World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events
NASA Technical Reports Server (NTRS)
Elfrey, Priscilla
2010-01-01
Prior to a spacewalk during the NASA STS/129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts, "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere major projects present similar pressures. Lives are at - risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across boundaries their countries, disciplines, languages, and cultures with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go that simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short and, especially, long-term activities have a persistent need for simulation from the first insight into a possibly workable idea or answer until the final report perhaps beyond our lifetime is put in the archive. With simulation we create a common mental model, try-out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming prove more and more ominous-glaciers melting in Bolivia, floods in Saudi Arabia, the Maldives sinking and salt rising along the Nile. Fear grows about potential asteroid crashes and nightly television images raise awareness of victims of floods, hurricanes, cyclones and typhoons, fire, tornado, tsunami, bombings, landslides, and cross-boundary criminality. The Red Cross says that disasters impact 250 million people each year. That means that 700,000 people are having a very bad day today. Modeling and simulation is and must be part of the solution.
ARPA surveillance technology for detection of targets hidden in foliage
NASA Astrophysics Data System (ADS)
Hoff, Lawrence E.; Stotts, Larry B.
1994-02-01
The processing of large quantities of synthetic aperture radar data in real time is a complex problem. Even the image formation process taxes today's most advanced computers. The use of complex algorithms with multiple channels adds another dimension to the computational problem. Advanced Research Projects Agency (ARPA) is currently planning on using the Paragon parallel processor for this task. The Paragon is small enough to allow its use in a sensor aircraft. Candidate algorithms will be implemented on the Paragon for evaluation for real time processing. In this paper ARPA technology developments for detecting targets hidden in foliage are reviewed and examples of signal processing techniques on field collected data are presented.
Lidar system for air-pollution monitoring over urban areas
NASA Astrophysics Data System (ADS)
Moskalenko, Irina V.; Shcheglov, Djolinard A.; Molodtsov, Nikolai A.
1997-05-01
The atmospheric environmental situation over the urban area of a large city is determined by a complex combination of anthropogenic pollution and meteorological factors. The efficient way to provide three-dimensional mapping of gaseous pollutants over wide areas is utilization of lidar systems employing tunable narrowband transmitters. The paper presented describes activity of RRC 'Kurchatov Institute' in the field of lidar atmospheric monitoring. The project 'mobile remote sensing system based on tunable laser transmitter for environmental monitoring' is developed under financial support of International Scientific and Technology Center (Moscow). The objective of the project is design, construction and field testing of a DIAL-technique system. The lidar transmitter consists of an excimer laser pumping dye laser, BBO crystal frequency doubler, and scanning flat mirror. Sulfur dioxide and atomic mercury have been selected as pollutants for field tests of the lidar system under development. A recent large increase in Moscow traffic stimulated taking into consideration also the remote sensing of lower troposphere ozone because of the photochemical smog problem. The status of the project is briefly discussed. The current activity includes also collecting of environmental data relevant to lidar remote sensing. Main attention is paid to pollutant concentration levels over Moscow city and Moscow district areas.
Feasibility study for hydrocarbon complex in southern seaboard. Petroleum Authority of Thailand
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This study, conducted by Fluor Daniel, was funded by the U.S. Trade and Development Agency, on behalf of the Petroleum Authority of Thailand. The primary objective of the study was to investigate the economic viability of the related facilities and determine how each could help to industrialize and build up the Southern Seaboard area of Thailand. The focus of the report is in three areas including; Crude Oil Transportation System, Refinery, and Petrochemical Complex. Another objective of the study was to offer an alternative for large crude carrier traffic by proposing the completion of a crude oil pipeline. The reportmore » is divided into the following sections: (1) Executive Summary; (2) Introduction; (3) Crude Oil Transportation System; (4) Refinery Project; (5) Petrochemical Complex; (6) Key Issues & Considerations; (7) Financial Evaluations; (8) Summary & Conclusions.« less
ERIC Educational Resources Information Center
Skilton, Paul F.; Forsyth, David; White, Otis J.
2008-01-01
Building from research on learning in workplace project teams, the authors work forward from the idea that the principal condition enabling integration learning in student team projects is project complexity. Recognizing the challenges of developing and running complex student projects, the authors extend theory to propose that the experience of…
Software for project-based learning of robot motion planning
NASA Astrophysics Data System (ADS)
Moll, Mark; Bordeaux, Janice; Kavraki, Lydia E.
2013-12-01
Motion planning is a core problem in robotics concerned with finding feasible paths for a given robot. Motion planning algorithms perform a search in the high-dimensional continuous space of robot configurations and exemplify many of the core algorithmic concepts of search algorithms and associated data structures. Motion planning algorithms can be explained in a simplified two-dimensional setting, but this masks many of the subtleties and complexities of the underlying problem. We have developed software for project-based learning of motion planning that enables deep learning. The projects that we have developed allow advanced undergraduate students and graduate students to reflect on the performance of existing textbook algorithms and their own variations on such algorithms. Formative assessment has been conducted at three institutions. The core of the software used for this teaching module is also used within the Robot Operating System, a widely adopted platform by the robotics research community. This allows for transfer of knowledge and skills to robotics research projects involving a large variety robot hardware platforms.
Forouhar, Amir; Hasankhani, Mahnoosh
2018-04-01
Urban decay is the process by which a historical city center, or an old part of a city, falls into decrepitude and faces serious problems. Urban management, therefore, implements renewal mega projects with the goal of physical and functional revitalization, retrieval of socioeconomic capacities, and improving of quality of life of residents. Ignoring the complexities of these large-scale interventions in the old and historical urban fabrics may lead to undesirable consequences, including an additional decline of quality of life. Thus, the present paper aims to assess the impact of renewal mega projects on residents' subjective quality of life, in the historical religious district of the holy city of Mashhad (Samen District). A combination of quantitative and qualitative methods of impact assessment, including questionnaires, semi-structured personal interviews, and direct observation, is used in this paper. The results yield that the Samen Renewal Project has significantly reduced the resident's subjective quality of life, due to its undesirable impacts on physical, socio-cultural, and economic environments.
The Role of Lunar Development in Human Exploration of the Solar System
NASA Technical Reports Server (NTRS)
Mendell, Wendell W.
1999-01-01
Human exploration of the solar system can be said to have begun with the Apollo landings on the Moon. The Apollo Project was publicly funded with the narrow technical objective of landing human beings on the Moon. The transportation and life support systems were specialized technical designs, developed in a project management environment tailored to that objective. Most scenarios for future human exploration assume a similar long-term commitment of public funds to a narrowly focused project managed by a large, monolithic organization. Advocates of human exploration of space have not yet been successful in generating the political momentum required to initiate such a project to go to the Moon or to Mars. Alternative scenarios of exploration may relax some or all of the parameters of organizational complexity, great expense, narrow technical focus, required public funding, and control by a single organization. Development of the Moon using private investment is quite possibly a necessary condition for alternative scenarios to succeed.
Appreciating the Complexity of Project Management Execution: Using Simulation in the Classroom
ERIC Educational Resources Information Center
Hartman, Nathan S.; Watts, Charles A.; Treleven, Mark D.
2013-01-01
As the popularity and importance of project management increase, so does the need for well-prepared project managers. This article discusses our experiences using a project management simulation in undergraduate and MBA classes to help students better grasp the complexity of project management. This approach gives students hands-on experience with…
DaVIE: Database for the Visualization and Integration of Epigenetic data
Fejes, Anthony P.; Jones, Meaghan J.; Kobor, Michael S.
2014-01-01
One of the challenges in the analysis of large data sets, particularly in a population-based setting, is the ability to perform comparisons across projects. This has to be done in such a way that the integrity of each individual project is maintained, while ensuring that the data are comparable across projects. These issues are beginning to be observed in human DNA methylation studies, as the Illumina 450k platform and next generation sequencing-based assays grow in popularity and decrease in price. This increase in productivity is enabling new insights into epigenetics, but also requires the development of pipelines and software capable of handling the large volumes of data. The specific problems inherent in creating a platform for the storage, comparison, integration, and visualization of DNA methylation data include data storage, algorithm efficiency and ability to interpret the results to derive biological meaning from them. Databases provide a ready-made solution to these issues, but as yet no tools exist that that leverage these advantages while providing an intuitive user interface for interpreting results in a genomic context. We have addressed this void by integrating a database to store DNA methylation data with a web interface to query and visualize the database and a set of libraries for more complex analysis. The resulting platform is called DaVIE: Database for the Visualization and Integration of Epigenetics data. DaVIE can use data culled from a variety of sources, and the web interface includes the ability to group samples by sub-type, compare multiple projects and visualize genomic features in relation to sites of interest. We have used DaVIE to identify patterns of DNA methylation in specific projects and across different projects, identify outlier samples, and cross-check differentially methylated CpG sites identified in specific projects across large numbers of samples. A demonstration server has been setup using GEO data at http://echelon.cmmt.ubc.ca/dbaccess/, with login “guest” and password “guest.” Groups may download and install their own version of the server following the instructions on the project's wiki. PMID:25278960
Nelson, Geoffrey; Macnaughton, Eric; Goering, Paula
2015-11-01
Using the case of a large-scale, multi-site Canadian Housing First research demonstration project for homeless people with mental illness, At Home/Chez Soi, we illustrate the value of qualitative methods in a randomized controlled trial (RCT) of a complex community intervention. We argue that quantitative RCT research can neither capture the complexity nor tell the full story of a complex community intervention. We conceptualize complex community interventions as having multiple phases and dimensions that require both RCT and qualitative research components. Rather than assume that qualitative research and RCTs are incommensurate, a more pragmatic mixed methods approach was used, which included using both qualitative and quantitative methods to understand program implementation and outcomes. At the same time, qualitative research was used to examine aspects of the intervention that could not be understood through the RCT, such as its conception, planning, sustainability, and policy impacts. Through this example, we show how qualitative research can tell a more complete story about complex community interventions. Copyright © 2015 Elsevier Inc. All rights reserved.
Peters, D T J M; Verweij, S; Grêaux, K; Stronks, K; Harting, J
2017-12-01
Improving health requires changes in the social, physical, economic and political determinants of health behavior. For the realization of policies that address these environmental determinants, intersectoral policy networks are considered necessary for the pooling of resources to implement different policy instruments. However, such network diversity may increase network complexity and therefore hamper network performance. Network complexity may be reduced by network management and the provision of financial resources. This study examined whether network diversity - amidst the other conditions - is indeed needed to address environmental determinants of health behavior. We included 25 intersectoral policy networks in Dutch municipalities aimed at reducing overweight, smoking, and alcohol/drugs abuse. For our fuzzy set Qualitative Comparative Analysis we used data from three web-based surveys among (a) project leaders regarding network diversity and size (n = 38); (b) project leaders and project partners regarding management (n = 278); and (c) implementation professionals regarding types of environmental determinants addressed (n = 137). Data on budgets were retrieved from project application forms. Contrary to their intentions, most policy networks typically addressed personal determinants. If the environment was addressed too, it was mostly the social environment. To address environmental determinants of health behavior, network diversity (>50% of the actors are non-public health) was necessary in networks that were either small (<16 actors) or had small budgets (<€183,172), when both were intensively managed. Irrespective of network diversity, environmental determinants also were addressed by small networks with large budgets, and by large networks with small budgets, when both provided network management. We conclude that network diversity is important - although not necessary - for resource pooling to address environmental determinants of health behavior, but only effective in the presence of network management. Our findings may support intersectoral policy networks in improving health behaviors by addressing a variety of environmental determinants. Copyright © 2017. Published by Elsevier Ltd.
System engineering of the Atacama Large Millimeter/submillimeter Array
NASA Astrophysics Data System (ADS)
Bhatia, Ravinder; Marti, Javier; Sugimoto, Masahiro; Sramek, Richard; Miccolis, Maurizio; Morita, Koh-Ichiro; Arancibia, Demián.; Araya, Andrea; Asayama, Shin'ichiro; Barkats, Denis; Brito, Rodrigo; Brundage, William; Grammer, Wes; Haupt, Christoph; Kurlandczyk, Herve; Mizuno, Norikazu; Napier, Peter; Pizarro, Eduardo; Saini, Kamaljeet; Stahlman, Gretchen; Verzichelli, Gianluca; Whyborn, Nick; Yagoubov, Pavel
2012-09-01
The Atacama Large Millimeter/submillimeter Array (ALMA) will be composed of 66 high precision antennae located at 5000 meters altitude in northern Chile. This paper will present the methodology, tools and processes adopted to system engineer a project of high technical complexity, by system engineering teams that are remotely located and from different cultures, and in accordance with a demanding schedule and within tight financial constraints. The technical and organizational complexity of ALMA requires a disciplined approach to the definition, implementation and verification of the ALMA requirements. During the development phase, System Engineering chairs all technical reviews and facilitates the resolution of technical conflicts. We have developed analysis tools to analyze the system performance, incorporating key parameters that contribute to the ultimate performance, and are modeled using best estimates and/or measured values obtained during test campaigns. Strict tracking and control of the technical budgets ensures that the different parts of the system can operate together as a whole within ALMA boundary conditions. System Engineering is responsible for acceptances of the thousands of hardware items delivered to Chile, and also supports the software acceptance process. In addition, System Engineering leads the troubleshooting efforts during testing phases of the construction project. Finally, the team is conducting System level verification and diagnostics activities to assess the overall performance of the observatory. This paper will also share lessons learned from these system engineering and verification approaches.
Ciric, Milica; Moon, Christina D; Leahy, Sinead C; Creevey, Christopher J; Altermann, Eric; Attwood, Graeme T; Rakonjac, Jasna; Gagic, Dragana
2014-05-12
In silico, secretome proteins can be predicted from completely sequenced genomes using various available algorithms that identify membrane-targeting sequences. For metasecretome (collection of surface, secreted and transmembrane proteins from environmental microbial communities) this approach is impractical, considering that the metasecretome open reading frames (ORFs) comprise only 10% to 30% of total metagenome, and are poorly represented in the dataset due to overall low coverage of metagenomic gene pool, even in large-scale projects. By combining secretome-selective phage display and next-generation sequencing, we focused the sequence analysis of complex rumen microbial community on the metasecretome component of the metagenome. This approach achieved high enrichment (29 fold) of secreted fibrolytic enzymes from the plant-adherent microbial community of the bovine rumen. In particular, we identified hundreds of heretofore rare modules belonging to cellulosomes, cell-surface complexes specialised for recognition and degradation of the plant fibre. As a method, metasecretome phage display combined with next-generation sequencing has a power to sample the diversity of low-abundance surface and secreted proteins that would otherwise require exceptionally large metagenomic sequencing projects. As a resource, metasecretome display library backed by the dataset obtained by next-generation sequencing is ready for i) affinity selection by standard phage display methodology and ii) easy purification of displayed proteins as part of the virion for individual functional analysis.
NASA Astrophysics Data System (ADS)
Fischer, Andreas; Keller, Denise; Liniger, Mark; Rajczak, Jan; Schär, Christoph; Appenzeller, Christof
2014-05-01
Fundamental changes in the hydrological cycle are expected in a future warmer climate. This is of particular relevance for the Alpine region, as a source and reservoir of several major rivers in Europe and being prone to extreme events such as floodings. For this region, climate change assessments based on the ENSEMBLES regional climate models (RCMs) project a significant decrease in summer mean precipitation under the A1B emission scenario by the mid-to-end of this century, while winter mean precipitation is expected to slightly rise. From an impact perspective, projected changes in seasonal means, however, are often insufficient to adequately address the multifaceted challenges of climate change adaptation. In this study, we revisit the full matrix of the ENSEMBLES RCM projections regarding changes in frequency and intensity, precipitation-type (convective versus stratiform) and temporal structure (wet/dry spells and transition probabilities) over Switzerland and surroundings. As proxies for raintype changes, we rely on the model parameterized convective and large-scale precipitation components. Part of the analysis involves a Bayesian multi-model combination algorithm to infer changes from the multi-model ensemble. The analysis suggests a summer drying that evolves altitude-specific: over low-land regions it is associated with wet-day frequency decreases of convective and large-scale precipitation, while over elevated regions it is primarily associated with a decline in large-scale precipitation only. As a consequence, almost all the models project an increase in the convective fraction at elevated Alpine altitudes. The decrease in the number of wet days during summer is accompanied by decreases (increases) in multi-day wet (dry) spells. This shift in multi-day episodes also lowers the likelihood of short dry spell occurrence in all of the models. For spring and autumn the combined multi-model projections indicate higher mean precipitation intensity north of the Alps, while a similar tendency is expected for the winter season over most of Switzerland.
FPGA Based Adaptive Rate and Manifold Pattern Projection for Structured Light 3D Camera System †
Lee, Sukhan
2018-01-01
The quality of the captured point cloud and the scanning speed of a structured light 3D camera system depend upon their capability of handling the object surface of a large reflectance variation in the trade-off of the required number of patterns to be projected. In this paper, we propose and implement a flexible embedded framework that is capable of triggering the camera single or multiple times for capturing single or multiple projections within a single camera exposure setting. This allows the 3D camera system to synchronize the camera and projector even for miss-matched frame rates such that the system is capable of projecting different types of patterns for different scan speed applications. This makes the system capturing a high quality of 3D point cloud even for the surface of a large reflectance variation while achieving a high scan speed. The proposed framework is implemented on the Field Programmable Gate Array (FPGA), where the camera trigger is adaptively generated in such a way that the position and the number of triggers are automatically determined according to camera exposure settings. In other words, the projection frequency is adaptive to different scanning applications without altering the architecture. In addition, the proposed framework is unique as it does not require any external memory for storage because pattern pixels are generated in real-time, which minimizes the complexity and size of the application-specific integrated circuit (ASIC) design and implementation. PMID:29642506
Solving optimization problems on computational grids.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, S. J.; Mathematics and Computer Science
2001-05-01
Multiprocessor computing platforms, which have become more and more widely available since the mid-1980s, are now heavily used by organizations that need to solve very demanding computational problems. Parallel computing is now central to the culture of many research communities. Novel parallel approaches were developed for global optimization, network optimization, and direct-search methods for nonlinear optimization. Activity was particularly widespread in parallel branch-and-bound approaches for various problems in combinatorial and network optimization. As the cost of personal computers and low-end workstations has continued to fall, while the speed and capacity of processors and networks have increased dramatically, 'cluster' platforms havemore » become popular in many settings. A somewhat different type of parallel computing platform know as a computational grid (alternatively, metacomputer) has arisen in comparatively recent times. Broadly speaking, this term refers not to a multiprocessor with identical processing nodes but rather to a heterogeneous collection of devices that are widely distributed, possibly around the globe. The advantage of such platforms is obvious: they have the potential to deliver enormous computing power. Just as obviously, however, the complexity of grids makes them very difficult to use. The Condor team, headed by Miron Livny at the University of Wisconsin, were among the pioneers in providing infrastructure for grid computations. More recently, the Globus project has developed technologies to support computations on geographically distributed platforms consisting of high-end computers, storage and visualization devices, and other scientific instruments. In 1997, we started the metaneos project as a collaborative effort between optimization specialists and the Condor and Globus groups. Our aim was to address complex, difficult optimization problems in several areas, designing and implementing the algorithms and the software infrastructure need to solve these problems on computational grids. This article describes some of the results we have obtained during the first three years of the metaneos project. Our efforts have led to development of the runtime support library MW for implementing algorithms with master-worker control structure on Condor platforms. This work is discussed here, along with work on algorithms and codes for integer linear programming, the quadratic assignment problem, and stochastic linear programmming. Our experiences in the metaneos project have shown that cheap, powerful computational grids can be used to tackle large optimization problems of various types. In an industrial or commercial setting, the results demonstrate that one may not have to buy powerful computational servers to solve many of the large problems arising in areas such as scheduling, portfolio optimization, or logistics; the idle time on employee workstations (or, at worst, an investment in a modest cluster of PCs) may do the job. For the optimization research community, our results motivate further work on parallel, grid-enabled algorithms for solving very large problems of other types. The fact that very large problems can be solved cheaply allows researchers to better understand issues of 'practical' complexity and of the role of heuristics.« less
SU-E-T-76: A Software System to Monitor VMAT Plan Complexity in a Large Radiotherapy Centre
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arumugam, S; Xing, A; Ingham Institute, Sydney, NSW
2015-06-15
Purpose: To develop a system that analyses and reports the complexity of Volumetric Modulated Arc Therapy (VMAT) plans to aid in the decision making for streamlining patient specific dosimetric quality assurance (QA) tests. Methods: A software system, Delcheck, was developed in-house to calculate VMAT plan and delivery complexity using the treatment delivery file. Delcheck has the functionality to calculate multiple plan complexity metrics including the Li-Xing Modulation Index (LI-MI), multiplicative combination of Leaf Travel and Modulation Complexity Score (LTMCSv), Monitor Units per prescribed dose (MU/D) and the delivery complexity index (MIt) that incorporates the modulation of dose rate, leaf speedmore » and gantry speed. Delcheck includes database functionality to store and compare plan metrics for a specified treatment site. The overall plan and delivery complexity is assessed based on the 95% conformance limit of the complexity metrics as Similar, More or Less complex. The functionality of the software was tested using 42 prostate conventional, 10 prostate SBRT and 15 prostate bed VMAT plans generated for an Elekta linear accelerator. Results: The mean(σ) of LI-MI for conventional, SBRT and prostate bed plans were 1690(486), 3215.4(1294) and 3258(982) respectively. The LTMCSv of the studied categories were 0.334(0.05), 0.325(0.07) and 0.3112(0.09). The MU/D of the studied categories were 2.4(0.4), 2.7(0.7) and 2.5(0.5). The MIt of the studied categories were 21.6(3.4), 18.2(3.0) and 35.9(6.6). The values of the complexity metrics show that LI-MI appeared to resolve the plan complexity better than LTMCSv and MU/D. The MIt value increased as the delivery complexity increased. Conclusion: The developed software was shown to be working as expected. In studied treatment categories Prostate bed plans are more complex in both plan and delivery and SBRT is more complex in plan and less complex in delivery as demonstrated by LI-MI and MIt. This project was funded through a Cancer Council NSW Project Grant (RG14-11)« less
Human vocal attractiveness as signaled by body size projection.
Xu, Yi; Lee, Albert; Wu, Wing-Li; Liu, Xuan; Birkholz, Peter
2013-01-01
Voice, as a secondary sexual characteristic, is known to affect the perceived attractiveness of human individuals. But the underlying mechanism of vocal attractiveness has remained unclear. Here, we presented human listeners with acoustically altered natural sentences and fully synthetic sentences with systematically manipulated pitch, formants and voice quality based on a principle of body size projection reported for animal calls and emotional human vocal expressions. The results show that male listeners preferred a female voice that signals a small body size, with relatively high pitch, wide formant dispersion and breathy voice, while female listeners preferred a male voice that signals a large body size with low pitch and narrow formant dispersion. Interestingly, however, male vocal attractiveness was also enhanced by breathiness, which presumably softened the aggressiveness associated with a large body size. These results, together with the additional finding that the same vocal dimensions also affect emotion judgment, indicate that humans still employ a vocal interaction strategy used in animal calls despite the development of complex language.
SNPassoc: an R package to perform whole genome association studies.
González, Juan R; Armengol, Lluís; Solé, Xavier; Guinó, Elisabet; Mercader, Josep M; Estivill, Xavier; Moreno, Víctor
2007-03-01
The popularization of large-scale genotyping projects has led to the widespread adoption of genetic association studies as the tool of choice in the search for single nucleotide polymorphisms (SNPs) underlying susceptibility to complex diseases. Although the analysis of individual SNPs is a relatively trivial task, when the number is large and multiple genetic models need to be explored it becomes necessary a tool to automate the analyses. In order to address this issue, we developed SNPassoc, an R package to carry out most common analyses in whole genome association studies. These analyses include descriptive statistics and exploratory analysis of missing values, calculation of Hardy-Weinberg equilibrium, analysis of association based on generalized linear models (either for quantitative or binary traits), and analysis of multiple SNPs (haplotype and epistasis analysis). Package SNPassoc is available at CRAN from http://cran.r-project.org. A tutorial is available on Bioinformatics online and in http://davinci.crg.es/estivill_lab/snpassoc.
Ra, Kongtae; Bang, Jae-Hyun; Lee, Jung-Moo; Kim, Kyung-Tae; Kim, Eun-Soo
2011-08-01
The vertical distribution of trace metals in sediment cores was investigated to evaluate the extent and the historical record of metal pollution over 30 years in the artificial Lake Shihwa in Korea. A marked increase of trace metals after 1980 was observed due to the operation of two large industrial complexes and dike construction for a reclamation project. There was a decreasing trend of metal concentrations with the distance from the pollution source. The enrichment factor and pollution load index of the metals indicated that the metal pollution was mainly derived from Cu, Zn and Cd loads due to anthropogenic activities. The concentrations of Cr, Ni, Cu, Zn, As and Pb in the upper part of all core sediments exceeded the ERL criteria of NOAA. Our results indicate that inadequate planning and management of industrialization and a large reclamation project accomplished by dike construction have continued to strongly accelerate metal pollution in Lake Shihwa. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
Kappler, Ulrike; Rowland, Susan L; Pedwell, Rhianna K
2017-05-01
Systems biology is frequently taught with an emphasis on mathematical modeling approaches. This focus effectively excludes most biology, biochemistry, and molecular biology students, who are not mathematics majors. The mathematical focus can also present a misleading picture of systems biology, which is a multi-disciplinary pursuit requiring collaboration between biochemists, bioinformaticians, and mathematicians. This article describes an authentic large-scale undergraduate research experience (ALURE) in systems biology that incorporates proteomics, bacterial genomics, and bioinformatics in the one exercise. This project is designed to engage students who have a basic grounding in protein chemistry and metabolism and no mathematical modeling skills. The pedagogy around the research experience is designed to help students attack complex datasets and use their emergent metabolic knowledge to make meaning from large amounts of raw data. On completing the ALURE, participants reported a significant increase in their confidence around analyzing large datasets, while the majority of the cohort reported good or great gains in a variety of skills including "analysing data for patterns" and "conducting database or internet searches." An environmental scan shows that this ALURE is the only undergraduate-level system-biology research project offered on a large-scale in Australia; this speaks to the perceived difficulty of implementing such an opportunity for students. We argue however, that based on the student feedback, allowing undergraduate students to complete a systems-biology project is both feasible and desirable, even if the students are not maths and computing majors. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(3):235-248, 2017. © 2016 The International Union of Biochemistry and Molecular Biology.
The Norwegian national project for ethics support in community health and care services.
Magelssen, Morten; Gjerberg, Elisabeth; Pedersen, Reidar; Førde, Reidun; Lillemoen, Lillian
2016-11-08
Internationally, clinical ethics support has yet to be implemented systematically in community health and care services. A large-scale Norwegian project (2007-2015) attempted to increase ethical competence in community services through facilitating the implementation of ethics support activities in 241 Norwegian municipalities. The article describes the ethics project and the ethics activities that ensued. The article first gives an account of the Norwegian ethics project. Then the results of two online questionnaires are reported, characterizing the scope, activities and organization of the ethics activities in the Norwegian municipalities and the ethical topics addressed. One hundred and thirty-seven municipal contact persons answered the first survey (55 % response rate), whereas 217 ethics facilitators from 48 municipalities responded to the second (33 % response rate). The Norwegian ethics project is vast in scope, yet has focused on some institutions and professions (e.g., nursing homes, home-based care; nurses, nurses' aides, unskilled workers) whilst seldom reaching others (e.g., child and adolescent health care; physicians). Patients and next of kin were very seldom involved. Through the ethics project employees discussed many important ethical challenges, in particular related to patient autonomy, competence to consent, and cooperation with next of kin. The "ethics reflection group" was the most common venue for ethics deliberation. The Norwegian project is the first of its kind and scope, and other countries may learn from the Norwegian experiences. Professionals have discussed central ethical dilemmas, the handling of which arguably makes a difference for patients/users and service quality. The study indicates that large (national) scale implementation of CES structures for the municipal health and care services is complex, yet feasible.
NASA Technical Reports Server (NTRS)
Pradhan, Anil K.
2000-01-01
Recent advances in theoretical atomic physics have enabled large-scale calculation of atomic parameters for a variety of atomic processes with high degree of precision. The development and application of these methods is the aim of the Iron Project. At present the primary focus is on collisional processes for all ions of iron, Fe I - FeXXVI, and other iron-peak elements; new work on radiative processes has also been initiated. Varied applications of the Iron Project work to X-ray astronomy are discussed, and more general applications to other spectral ranges are pointed out. The IP work forms the basis for more specialized projects such as the RmaX Project, and the work on photoionization/recombination, and aims to provide a comprehensive and self-consistent set of accurate collisional and radiative cross sections, and transition probabilities, within the framework of relativistic close coupling formulation using the Breit-Pauli R-Matrix method. An illustrative example is presented of how the IP data may be utilized in the formation of X-ray spectra of the K alpha complex at 6.7 keV from He-like Fe XXV.
QMC Goes BOINC: Using Public Resource Computing to Perform Quantum Monte Carlo Calculations
NASA Astrophysics Data System (ADS)
Rainey, Cameron; Engelhardt, Larry; Schröder, Christian; Hilbig, Thomas
2008-10-01
Theoretical modeling of magnetic molecules traditionally involves the diagonalization of quantum Hamiltonian matrices. However, as the complexity of these molecules increases, the matrices become so large that this process becomes unusable. An additional challenge to this modeling is that many repetitive calculations must be performed, further increasing the need for computing power. Both of these obstacles can be overcome by using a quantum Monte Carlo (QMC) method and a distributed computing project. We have recently implemented a QMC method within the Spinhenge@home project, which is a Public Resource Computing (PRC) project where private citizens allow part-time usage of their PCs for scientific computing. The use of PRC for scientific computing will be described in detail, as well as how you can contribute to the project. See, e.g., L. Engelhardt, et. al., Angew. Chem. Int. Ed. 47, 924 (2008). C. Schröoder, in Distributed & Grid Computing - Science Made Transparent for Everyone. Principles, Applications and Supporting Communities. (Weber, M.H.W., ed., 2008). Project URL: http://spin.fh-bielefeld.de
Detection and Characterisation of Meteors as a Big Data Citizen Science project
NASA Astrophysics Data System (ADS)
Gritsevich, M.
2017-12-01
Out of a total around 50,000 meteorites currently known to science, the atmospheric passage was recorded instrumentally in only 30 cases with the potential to derive their atmospheric trajectories and pre-impact heliocentric orbits. Similarly, while the observations of meteors, add thousands of new entries per month to existing databases, it is extremely rare they lead to meteorite recovery. Meteor studies thus represent an excellent example of the Big Data citizen science project, where progress in the field largely depends on the prompt identification and characterisation of meteor events as well as on extensive and valuable contributions by amateur observers. Over the last couple of decades technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established EU COST BigSkyEarth http://bigskyearth.eu/ network.
System engineering and science projects: lessons from MeerKAT
NASA Astrophysics Data System (ADS)
Kapp, Francois
2016-08-01
The Square Kilometre Array (SKA) is a large science project planning to commence construction of the world's largest Radio Telescope after 2018. MeerKAT is one of the precursor projects to the SKA, based on the same site that will host the SKA Mid array in the central Karoo area of South Africa. From the perspective of signal processing hardware development, we analyse the challenges that MeerKAT encountered and extrapolate them to SKA in order to prepare the System Engineering and Project Management methods that could contribute to a successful completion of SKA. Using the MeerKAT Digitiser, Correlator/Beamformer and Time and Frequency Reference Systems as an example, we will trace the risk profile and subtle differences in engineering approaches of these systems over time and show the effects of varying levels of System Engineering rigour on the evolution of their risk profiles. It will be shown that the most rigorous application of System Engineering discipline resulted in the most substantial reduction in risk over time. Since the challenges faced by SKA are not limited to that of MeerKAT, we also look into how that translates to a system development where there is substantial complexity in both the created system as well as the creating system. Since the SKA will be designed and constructed by consortia made up from the ten member countries, there are many additional complexities to the organisation creating the system - a challenge the MeerKAT project did not encounter. Factors outside of engineering, for instance procurement models and political interests, also play a more significant role, and add to the project risks of SKA when compared to MeerKAT.
Bochorishvili, Genrieta; Stornetta, Ruth L.; Coates, Melissa B.; Guyenet, Patrice G.
2014-01-01
The retrotrapezoid nucleus (RTN) contains CO2-responsive neurons that regulate breathing frequency and amplitude. These neurons (RTN-Phox2b neurons) contain the transcription factor Phox2b, vesicular glutamate transporter 2 (VGLUT2) mRNA, and a subset contains preprogalanin mRNA. We wished to determine whether the terminals of RTN-Phox2b neurons contain galanin and VGLUT2 proteins, to identify the specific projections of the galaninergic subset, to test whether RTN-Phox2b neurons contact neurons in the pre-Bötzinger complex, and to identify the ultrastructure of these synapses. The axonal projections of RTN-Phox2b neurons were traced by using biotinylated dextran amine (BDA), and many BDA-ir boutons were found to contain galanin immunoreactivity. RTN galaninergic neurons had ipsilateral projections that were identical with those of this nucleus at large: the ventral respiratory column, the caudolateral nucleus of the solitary tract, and the pontine Köliker-Fuse, intertrigeminal region, and lateral parabrachial nucleus. For ultrastructural studies, RTN-Phox2b neurons (galaninergic and others) were transfected with a lentiviral vector that expresses mCherry almost exclusively in Phox2b-ir neurons. After spinal cord injections of a catecholamine neuron-selective toxin, there was a depletion of C1 neurons in the RTN area; thus it was determined that the mCherry-positive terminals located in the pre-Bötzinger complex originated almost exclusively from the RTN-Phox2b (non-C1) neurons. These terminals were generally VGLUT2-immunoreactive and formed numerous close appositions with neurokinin-1 receptor-ir pre-Bötzinger complex neurons. Their boutons (n = 48) formed asymmetric synapses filled with small clear vesicles. In summary, RTN-Phox2b neurons, including the galaninergic subset, selectively innervate the respiratory pattern generator plus a portion of the dorsolateral pons. RTN-Phox2b neurons establish classic excitatory glutamatergic synapses with pre-Bötzinger complex neurons presumed to generate the respiratory rhythm. PMID:21935944
Quasi-projective synchronization of fractional-order complex-valued recurrent neural networks.
Yang, Shuai; Yu, Juan; Hu, Cheng; Jiang, Haijun
2018-08-01
In this paper, without separating the complex-valued neural networks into two real-valued systems, the quasi-projective synchronization of fractional-order complex-valued neural networks is investigated. First, two new fractional-order inequalities are established by using the theory of complex functions, Laplace transform and Mittag-Leffler functions, which generalize traditional inequalities with the first-order derivative in the real domain. Additionally, different from hybrid control schemes given in the previous work concerning the projective synchronization, a simple and linear control strategy is designed in this paper and several criteria are derived to ensure quasi-projective synchronization of the complex-valued neural networks with fractional-order based on the established fractional-order inequalities and the theory of complex functions. Moreover, the error bounds of quasi-projective synchronization are estimated. Especially, some conditions are also presented for the Mittag-Leffler synchronization of the addressed neural networks. Finally, some numerical examples with simulations are provided to show the effectiveness of the derived theoretical results. Copyright © 2018 Elsevier Ltd. All rights reserved.
Organization and management of heterogeneous, dispersed data bases in nuclear engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eastman, C.M.
1986-01-01
Large, complex, multiperson engineering projects in many areas, nuclear, aerospace, electronics, and manufacturing, have inherent needs for coordination, control, and management of the related engineering data. Taken in the abstract, the notion of an integrated engineering data base (IED) for such projects is attractive. The potential capabilities of an (IED) are that all data are managed in a coordinated way, are made accessible to all users who need it, allow relations between all parts of the data to be tracked and managed, provide backup, recovery, audit trails, security and access control, and allow overall project status to be monitored andmore » managed. Common data accessing schemes and user interfaces to applications are also part of an IED. This paper describes a new software product that allows incremental realization of many of the capabilities of an IED, without the massive disruption and risk.« less
Adjoint-Based Methodology for Time-Dependent Optimal Control (AMTOC)
NASA Technical Reports Server (NTRS)
Yamaleev, Nail; Diskin, boris; Nishikawa, Hiroaki
2012-01-01
During the five years of this project, the AMTOC team developed an adjoint-based methodology for design and optimization of complex time-dependent flows, implemented AMTOC in a testbed environment, directly assisted in implementation of this methodology in the state-of-the-art NASA's unstructured CFD code FUN3D, and successfully demonstrated applications of this methodology to large-scale optimization of several supersonic and other aerodynamic systems, such as fighter jet, subsonic aircraft, rotorcraft, high-lift, wind-turbine, and flapping-wing configurations. In the course of this project, the AMTOC team has published 13 refereed journal articles, 21 refereed conference papers, and 2 NIA reports. The AMTOC team presented the results of this research at 36 international and national conferences, meeting and seminars, including International Conference on CFD, and numerous AIAA conferences and meetings. Selected publications that include the major results of the AMTOC project are enclosed in this report.
Supersonic Retropropulsion Technology Development in NASA's Entry, Descent, and Landing Project
NASA Technical Reports Server (NTRS)
Edquist, Karl T.; Berry, Scott A.; Rhode, Matthew N.; Kelb, Bil; Korzun, Ashley; Dyakonov, Artem A.; Zarchi, Kerry A.; Schauerhamer, Daniel G.; Post, Ethan A.
2012-01-01
NASA's Entry, Descent, and Landing (EDL) space technology roadmap calls for new technologies to achieve human exploration of Mars in the coming decades [1]. One of those technologies, termed Supersonic Retropropulsion (SRP), involves initiation of propulsive deceleration at supersonic Mach numbers. The potential benefits afforded by SRP to improve payload mass and landing precision make the technology attractive for future EDL missions. NASA's EDL project spent two years advancing the technological maturity of SRP for Mars exploration [2-15]. This paper summarizes the technical accomplishments from the project and highlights challenges and recommendations for future SRP technology development programs. These challenges include: developing sufficiently large SRP engines for use on human-scale entry systems; testing and computationally modelling complex and unsteady SRP fluid dynamics; understanding the effects of SRP on entry vehicle stability and controllability; and demonstrating sub-scale SRP entry systems in Earth's atmosphere.
Anticipatory Water Management in Phoenix using Advanced Scenario Planning and Analyses: WaterSim 5
NASA Astrophysics Data System (ADS)
Sampson, D. A.; Quay, R.; White, D. D.; Gober, P.; Kirkwood, C.
2013-12-01
Complexity, uncertainty, and variability are inherent properties of linked social and natural processes; sustainable resource management must somehow consider all three. Typically, a decision support tool (using scenario analyses) is used to examine management alternatives under suspected trajectories in driver variables (i.e., climate forcing's, growth or economic projections, etc.). This traditional planning focuses on a small set of envisioned scenarios whose outputs are compared against one-another in order to evaluate their differing impacts on desired metrics. Human cognition typically limits this to three to five scenarios. However, complex and highly uncertain issues may require more, often much more, than five scenarios. In this case advanced scenario analysis provides quantitative or qualitative methods that can reveal patterns and associations among scenario metrics for a large ensemble of scenarios. From this analysis, then, a smaller set of heuristics that describe the complexity and uncertainty revealed provides a basis to guide planning in an anticipatory fashion. Our water policy and management model, termed WaterSim, permits advanced scenario planning and analysis for the Phoenix Metropolitan Area. In this contribution we examine the concepts of advanced scenario analysis on a large scale ensemble of scenarios using our work with WaterSim as a case study. For this case study we created a range of possible water futures by creating scenarios that encompasses differences in water supplies (our surrogates for climate change, drought, and inherent variability in riverine flows), population growth, and per capital water consumption. We used IPCC estimates of plausible, future, alterations in riverine runoff, locally produced and vetted estimates of population growth projections, and empirical trends in per capita water consumption for metropolitan cities. This ensemble consisted of ~ 30, 700 scenarios (~575 k observations). We compared and contrasted two metropolitan communities that exhibit differing growth projections and water portfolios; moderate growth with a diverse portfolio versus high growth for a more restrictive portfolio. Results illustrate that both communities exhibited an expanding envelope of possible, future water outcomes with rational water management trajectories. However, a more diverse portfolio resulted in a broad, time-insensitive decision space for management interventions. The reverse was true for the more restrictive water portfolio with high growth projections.
Regional climate projection of the Maritime Continent using the MIT Regional Climate Model
NASA Astrophysics Data System (ADS)
IM, E. S.; Eltahir, E. A. B.
2014-12-01
Given that warming of the climate system is unequivocal (IPCC AR5), accurate assessment of future climate is essential to understand the impact of climate change due to global warming. Modelling the climate change of the Maritime Continent is particularly challenge, showing a high degree of uncertainty. Compared to other regions, model agreement of future projections in response to anthropogenic emission forcings is much less. Furthermore, the spatial and temporal behaviors of climate projections seem to vary significantly due to a complex geographical condition and a wide range of scale interactions. For the fine-scale climate information (27 km) suitable for representing the complexity of climate change over the Maritime Continent, dynamical downscaling is performed using the MIT regional climate model (MRCM) during two thirty-year period for reference (1970-1999) and future (2070-2099) climate. Initial and boundary conditions are provided by Community Earth System Model (CESM) simulations under the emission scenarios projected by MIT Integrated Global System Model (IGSM). Changes in mean climate as well as the frequency and intensity of extreme climate events are investigated at various temporal and spatial scales. Our analysis is primarily centered on the different behavior of changes in convective and large-scale precipitation over land vs. ocean during dry vs. wet season. In addition, we attempt to find the added value to downscaled results over the Maritime Continent through the comparison between MRCM and CESM projection. Acknowledgements.This research was supported by the National Research Foundation Singapore through the Singapore MIT Alliance for Research and Technology's Center for Environmental Sensing and Modeling interdisciplinary research program.
Exposing the Science in Citizen Science: Fitness to Purpose and Intentional Design.
Parrish, Julia K; Burgess, Hillary; Weltzin, Jake F; Fortson, Lucy; Wiggins, Andrea; Simmons, Brooke
2018-05-21
Citizen science is a growing phenomenon. With millions of people involved and billions of in-kind dollars contributed annually, this broad extent, fine grain approach to data collection should be garnering enthusiastic support in the mainstream science and higher education communities. However, many academic researchers demonstrate distinct biases against the use of citizen science as a source of rigorous information. To engage the public in scientific research, and the research community in the practice of citizen science, a mutual understanding is needed of accepted quality standards in science, and the corresponding specifics of project design and implementation when working with a broad public base. We define a science-based typology focused on the degree to which projects deliver the type(s) and quality of data/work needed to produce valid scientific outcomes directly useful in science and natural resource management. Where project intent includes direct contribution to science and the public is actively involved either virtually or hands-on, we examine the measures of quality assurance (methods to increase data quality during the design and implementation phases of a project) and quality control (post hoc methods to increase the quality of scientific outcomes). We suggest that high quality science can be produced with massive, largely one-off, participation if data collection is simple and quality control includes algorithm voting, statistical pruning and/or computational modeling. Small to mid-scale projects engaging participants in repeated, often complex, sampling can advance quality through expert-led training and well-designed materials, and through independent verification. Both approaches - simplification at scale and complexity with care - generate more robust science outcomes.
NASA Astrophysics Data System (ADS)
Turner, Sean W. D.; Marlow, David; Ekström, Marie; Rhodes, Bruce G.; Kularathna, Udaya; Jeffrey, Paul J.
2014-04-01
Despite a decade of research into climate change impacts on water resources, the scientific community has delivered relatively few practical methodological developments for integrating uncertainty into water resources system design. This paper presents an application of the "decision scaling" methodology for assessing climate change impacts on water resources system performance and asks how such an approach might inform planning decisions. The decision scaling method reverses the conventional ethos of climate impact assessment by first establishing the climate conditions that would compel planners to intervene. Climate model projections are introduced at the end of the process to characterize climate risk in such a way that avoids the process of propagating those projections through hydrological models. Here we simulated 1000 multisite synthetic monthly streamflow traces in a model of the Melbourne bulk supply system to test the sensitivity of system performance to variations in streamflow statistics. An empirical relation was derived to convert decision-critical flow statistics to climatic units, against which 138 alternative climate projections were plotted and compared. We defined the decision threshold in terms of a system yield metric constrained by multiple performance criteria. Our approach allows for fast and simple incorporation of demand forecast uncertainty and demonstrates the reach of the decision scaling method through successful execution in a large and complex water resources system. Scope for wider application in urban water resources planning is discussed.
On coarse projective integration for atomic deposition in amorphous systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chuang, Claire Y., E-mail: yungc@seas.upenn.edu, E-mail: meister@unm.edu, E-mail: zepedaruiz1@llnl.gov; Sinno, Talid, E-mail: talid@seas.upenn.edu; Han, Sang M., E-mail: yungc@seas.upenn.edu, E-mail: meister@unm.edu, E-mail: zepedaruiz1@llnl.gov
2015-10-07
Direct molecular dynamics simulation of atomic deposition under realistic conditions is notoriously challenging because of the wide range of time scales that must be captured. Numerous simulation approaches have been proposed to address the problem, often requiring a compromise between model fidelity, algorithmic complexity, and computational efficiency. Coarse projective integration, an example application of the “equation-free” framework, offers an attractive balance between these constraints. Here, periodically applied, short atomistic simulations are employed to compute time derivatives of slowly evolving coarse variables that are then used to numerically integrate differential equations over relatively large time intervals. A key obstacle to themore » application of this technique in realistic settings is the “lifting” operation in which a valid atomistic configuration is recreated from knowledge of the coarse variables. Using Ge deposition on amorphous SiO{sub 2} substrates as an example application, we present a scheme for lifting realistic atomistic configurations comprised of collections of Ge islands on amorphous SiO{sub 2} using only a few measures of the island size distribution. The approach is shown to provide accurate initial configurations to restart molecular dynamics simulations at arbitrary points in time, enabling the application of coarse projective integration for this morphologically complex system.« less
On Coarse Projective Integration for Atomic Deposition in Amorphous Systems
Chuang, Claire Y.; Han, Sang M.; Zepeda-Ruiz, Luis A.; ...
2015-10-02
Direct molecular dynamics simulation of atomic deposition under realistic conditions is notoriously challenging because of the wide range of timescales that must be captured. Numerous simulation approaches have been proposed to address the problem, often requiring a compromise between model fidelity, algorithmic complexity and computational efficiency. Coarse projective integration, an example application of the ‘equation-free’ framework, offers an attractive balance between these constraints. Here, periodically applied, short atomistic simulations are employed to compute gradients of slowly-evolving coarse variables that are then used to numerically integrate differential equations over relatively large time intervals. A key obstacle to the application of thismore » technique in realistic settings is the ‘lifting’ operation in which a valid atomistic configuration is recreated from knowledge of the coarse variables. Using Ge deposition on amorphous SiO 2 substrates as an example application, we present a scheme for lifting realistic atomistic configurations comprised of collections of Ge islands on amorphous SiO 2 using only a few measures of the island size distribution. In conclusion, the approach is shown to provide accurate initial configurations to restart molecular dynamics simulations at arbitrary points in time, enabling the application of coarse projective integration for this morphologically complex system.« less
Tool Use Within NASA Software Quality Assurance
NASA Technical Reports Server (NTRS)
Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel
2013-01-01
As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.
NASA Technical Reports Server (NTRS)
Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.
2012-01-01
This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed, CAD assembly; therefore, adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes, benefiting from this approach by reducing development and design cycle time, include: Creation of analysis models for the Aerodynamic discipline; Vehicle to ground interface development; Documentation development for the vehicle assembly.
Second Generation Crop Yield Models Review
NASA Technical Reports Server (NTRS)
Hodges, T. (Principal Investigator)
1982-01-01
Second generation yield models, including crop growth simulation models and plant process models, may be suitable for large area crop yield forecasting in the yield model development project. Subjective and objective criteria for model selection are defined and models which might be selected are reviewed. Models may be selected to provide submodels as input to other models; for further development and testing; or for immediate testing as forecasting tools. A plant process model may range in complexity from several dozen submodels simulating (1) energy, carbohydrates, and minerals; (2) change in biomass of various organs; and (3) initiation and development of plant organs, to a few submodels simulating key physiological processes. The most complex models cannot be used directly in large area forecasting but may provide submodels which can be simplified for inclusion into simpler plant process models. Both published and unpublished models which may be used for development or testing are reviewed. Several other models, currently under development, may become available at a later date.
PATHA: Performance Analysis Tool for HPC Applications
Yoo, Wucherl; Koo, Michelle; Cao, Yi; ...
2016-02-18
Large science projects rely on complex workflows to analyze terabytes or petabytes of data. These jobs are often running over thousands of CPU cores and simultaneously performing data accesses, data movements, and computation. It is difficult to identify bottlenecks or to debug the performance issues in these large workflows. In order to address these challenges, we have developed Performance Analysis Tool for HPC Applications (PATHA) using the state-of-art open source big data processing tools. Our framework can ingest system logs to extract key performance measures, and apply the most sophisticated statistical tools and data mining methods on the performance data.more » Furthermore, it utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of PATHA, we conduct a case study on the workflows from an astronomy project known as the Palomar Transient Factory (PTF). This study processed 1.6 TB of system logs collected on the NERSC supercomputer Edison. Using PATHA, we were able to identify performance bottlenecks, which reside in three tasks of PTF workflow with the dependency on the density of celestial objects.« less
Portable Common Execution Environment (PCEE) project review: Peer review
NASA Technical Reports Server (NTRS)
Locke, C. Douglass
1991-01-01
The purpose of the review was to conduct an independent, in-depth analysis of the PCEE project and to provide the results of said review. The review team was tasked with evaluating the potential contribution of the PCEE project to the improvement of the life cycle support of mission and safety critical (MASC) computing components for large, complex, non-stop, distributed systems similar to those planned for such NASA programs as the space station, lunar outpost, and manned missions to Mars. Some conclusions of the review team are as follow: The PCEE project was given high marks for its breath of vision on the overall problem with MASC software; Correlated with the sweeping vision, the Review Team is very skeptical that any research project can successfully attack such a broad range of problems; and several recommendations are made such as to identify the components of the broad solution envisioned, prioritizing them with respect to their impact and the likely ability of the PCEE or others to attack them successfully, and to rewrite its Concept Document differentiating the problem description, objectives, approach, and results so that the project vision becomes assessible to others.
NASA Technical Reports Server (NTRS)
Yudkin, Howard
1988-01-01
The next generation of computer systems are studied by examining the processes and methodologies. The present generation is ok for small projects, but not so good for large projects. They are not good for addressing the iterative nature of requirements, resolution, and implementation. They do not address complexity issues of requirements stabilization. They do not explicitly address reuse opportunities, and they do not help with people shortages. Therefore, there is a need to define and automate improved software engineering processes. Some help may be gained by reuse and prototyping, which are two sides of the same coin. Reuse library parts are used to generate good approximations to desired solutions, i.e., prototypes. And rapid prototype composition implies use of preexistent parts, i.e., reusable parts.
A parallel data management system for large-scale NASA datasets
NASA Technical Reports Server (NTRS)
Srivastava, Jaideep
1993-01-01
The past decade has experienced a phenomenal growth in the amount of data and resultant information generated by NASA's operations and research projects. A key application is the reprocessing problem which has been identified to require data management capabilities beyond those available today (PRAT93). The Intelligent Information Fusion (IIF) system (ROEL91) is an ongoing NASA project which has similar requirements. Deriving our understanding of NASA's future data management needs based on the above, this paper describes an approach to using parallel computer systems (processor and I/O architectures) to develop an efficient parallel database management system to address the needs. Specifically, we propose to investigate issues in low-level record organizations and management, complex query processing, and query compilation and scheduling.
The BRAMS Zoo, a citizen science project
NASA Astrophysics Data System (ADS)
Calders, S.
2015-01-01
Currently, the BRAMS network comprises around 30 receiving stations, and each station collects 24 hours of data per day. With such a large number of raw data, automatic detection of meteor echoes is mandatory. Several algorithms have been developed, using different techniques. (They are discussed in the Proceedings of IMC 2014.) This task is complicated because of the presence of parasitic signals (mostly airplane echoes) on one hand and the fact that some meteor echoes (overdense) exhibit complex shapes that are hard to recognize on the other hand. Currently, none of the algorithms can perfectly mimic the human eye which stays the best detector. Therefore we plan to collaborate with Citizen Science in order to create a "BRAMS zoo". The idea is to ask their very large community of users to draw boxes around meteor echoes in spectrograms. The results will be used to assess the accuracy of the automatic detection algorithms on a large data set. We will focus on a few selected meteor showers which are always more fascinating for the large public than the sporadic background. Moreover, during meteor showers, many more complex overdense echoes are observed for which current automatic detection methods might fail. Finally, the dataset of manually detected meteors can also be useful e.g. for IMCCE to study the dynamic evolution of cometary dust.
Ordering Unstructured Meshes for Sparse Matrix Computations on Leading Parallel Systems
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Li, Xiaoye; Heber, Gerd; Biswas, Rupak
2000-01-01
The ability of computers to solve hitherto intractable problems and simulate complex processes using mathematical models makes them an indispensable part of modern science and engineering. Computer simulations of large-scale realistic applications usually require solving a set of non-linear partial differential equations (PDES) over a finite region. For example, one thrust area in the DOE Grand Challenge projects is to design future accelerators such as the SpaHation Neutron Source (SNS). Our colleagues at SLAC need to model complex RFQ cavities with large aspect ratios. Unstructured grids are currently used to resolve the small features in a large computational domain; dynamic mesh adaptation will be added in the future for additional efficiency. The PDEs for electromagnetics are discretized by the FEM method, which leads to a generalized eigenvalue problem Kx = AMx, where K and M are the stiffness and mass matrices, and are very sparse. In a typical cavity model, the number of degrees of freedom is about one million. For such large eigenproblems, direct solution techniques quickly reach the memory limits. Instead, the most widely-used methods are Krylov subspace methods, such as Lanczos or Jacobi-Davidson. In all the Krylov-based algorithms, sparse matrix-vector multiplication (SPMV) must be performed repeatedly. Therefore, the efficiency of SPMV usually determines the eigensolver speed. SPMV is also one of the most heavily used kernels in large-scale numerical simulations.
Software technology insertion: A study of success factors
NASA Technical Reports Server (NTRS)
Lydon, Tom
1990-01-01
Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.
Stephan, Raiko; Gohl, Christina; Fleige, Astrid; Klämbt, Christian; Bogdan, Sven
2011-01-01
A tight spatial-temporal coordination of F-actin dynamics is crucial for a large variety of cellular processes that shape cells. The Abelson interactor (Abi) has a conserved role in Arp2/3-dependent actin polymerization, regulating Wiskott-Aldrich syndrome protein (WASP) and WASP family verprolin-homologous protein (WAVE). In this paper, we report that Abi exerts nonautonomous control of photoreceptor axon targeting in the Drosophila visual system through WAVE. In abi mutants, WAVE is unstable but restored by reexpression of Abi, confirming that Abi controls the integrity of the WAVE complex in vivo. Remarkably, expression of a membrane-tethered WAVE protein rescues the axonal projection defects of abi mutants in the absence of the other subunits of the WAVE complex, whereas cytoplasmic WAVE only slightly affects the abi mutant phenotype. Thus complex formation not only stabilizes WAVE, but also provides further membrane-recruiting signals, resulting in an activation of WAVE. PMID:21900504
NASA Astrophysics Data System (ADS)
Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron
2016-07-01
This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.
Otis-Green, Shirley; Sidhu, Rupinder K.; Ferraro, Catherine Del; Ferrell, Betty
2014-01-01
Lung cancer patients and their family caregivers face a wide range of potentially distressing symptoms across the four domains of quality of life. A multi-dimensional approach to addressing these complex concerns with early integration of palliative care has proven beneficial. This article highlights opportunities to integrate social work using a comprehensive quality of life model and a composite patient scenario from a large lung cancer educational intervention National Cancer Institute-funded program project grant. PMID:24797998
NASA Astrophysics Data System (ADS)
Liu, Jianming; Grant, Steven L.; Benesty, Jacob
2015-12-01
A new reweighted proportionate affine projection algorithm (RPAPA) with memory and row action projection (MRAP) is proposed in this paper. The reweighted PAPA is derived from a family of sparseness measures, which demonstrate performance similar to mu-law and the l 0 norm PAPA but with lower computational complexity. The sparseness of the channel is taken into account to improve the performance for dispersive system identification. Meanwhile, the memory of the filter's coefficients is combined with row action projections (RAP) to significantly reduce computational complexity. Simulation results demonstrate that the proposed RPAPA MRAP algorithm outperforms both the affine projection algorithm (APA) and PAPA, and has performance similar to l 0 PAPA and mu-law PAPA, in terms of convergence speed and tracking ability. Meanwhile, the proposed RPAPA MRAP has much lower computational complexity than PAPA, mu-law PAPA, and l 0 PAPA, etc., which makes it very appealing for real-time implementation.
Modern Paradigm of Star Formation in the Galaxy
NASA Astrophysics Data System (ADS)
Sobolev, A. M.
2017-06-01
Understanding by the scientific community of the star formation processes in the Galaxy undergone significant changes in recent years. This is largely due to the development of the observational basis of astronomy in the infrared and submillimeter ranges. Analysis of new observational data obtained in the course of the Herschel project, by radio interferometer ALMA and other modern facilities significantly advanced our understanding of the structure of the regions of star formation, young stellar object vicinities and provided comprehensive data on the mass function of proto-stellar objects in a number of star-forming complexes of the Galaxy. Mapping of the complexes in molecular radio lines allowed to study their spatial and kinematic structure on the spatial scales of tens and hundreds of parsecs. The next breakthrough in this field can be achieved as a result of the planned project “Spektr-MM” (Millimetron) which implies a significant improvement in angular resolution and sensitivity. The use of sensitive interferometers allowed to investigate the details of star formation processes at small spatial scales - down to the size of the solar system (with the help of the ALMA), and even the Sun (in the course of the space project “Spektr-R” = RadioAstron). Significant contribution to the study of the processes of accretion is expected as a result of the project “Spektr-UV” (WSO-UV = “World Space Observatory - Ultraviolet”). Complemented with significant theoretical achievements obtained observational data have greatly promoted our understanding of the star formation processes.
Yasui, Yutaka; McLerran, Dale; Adam, Bao-Ling; Winget, Marcy; Thornquist, Mark; Feng, Ziding
2003-01-01
Discovery of "signature" protein profiles that distinguish disease states (eg, malignant, benign, and normal) is a key step towards translating recent advancements in proteomic technologies into clinical utilities. Protein data generated from mass spectrometers are, however, large in size and have complex features due to complexities in both biological specimens and interfering biochemical/physical processes of the measurement procedure. Making sense out of such high-dimensional complex data is challenging and necessitates the use of a systematic data analytic strategy. We propose here a data processing strategy for two major issues in the analysis of such mass-spectrometry-generated proteomic data: (1) separation of protein "signals" from background "noise" in protein intensity measurements and (2) calibration of protein mass/charge measurements across samples. We illustrate the two issues and the utility of the proposed strategy using data from a prostate cancer biomarker discovery project as an example.
Study of the techniques feasible for food synthesis aboard a spacecraft
NASA Technical Reports Server (NTRS)
Weiss, A. H.
1972-01-01
Synthesis of sugars by Ca(OH)2 catalyzed formaldehyde condensation (the formose reaction) has produced branched carbohydrates that do not occur in nature. The kinetics and mechanisms of the homogeneously catalyzed autocatalytic condensation were studied and analogies between homogeneous and heterogeneous rate laws have been found. Aldol condensations proceed simultaneously with Cannizzaro and crossed-Cannizzaro reactions and Lobry de Bruyn-Van Eckenstein rearrangements. The separate steps as well as the interactions of this highly complex reaction system were elucidated. The system exhibits instabilities, competitive catalytic, mass action, and equilibrium phenomena, complexing, and parallel and consecutive reactions. Specific finding that have been made on the problem will be of interest for synthesizing sugars, both for sustained space flight and for large scale food manufacture. A contribution to methodology for studying complex catalyzed reactions and to understanding control of reaction selectivity was a broad goal of the project.
NASA Technical Reports Server (NTRS)
Mizell, Carolyn Barrett; Malone, Linda
2007-01-01
The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
NASA Technical Reports Server (NTRS)
Richstein, Alan B.; Nolte, Jerome T.; Pfarr, Barbara B.
2004-01-01
There are numerous technical reviews that occur throughout the systems engineering process life cycle. Many are well known by project managers and stakeholders such as developers and end users, an example of much is the critical design review (CDR). This major milestone for a large, complex new project may last two or more days, include an extensive agenda of topics, and entail hundreds of hours of developer time to prepare presentation materials and associated documents. Additionally, the weeks of schedule spent on review preparation is at least partly at the expense of other work. This paper suggests an approach for tailoring technical reviews, based on the project characteristics and the project manager s identification of the key stakeholders and understanding of their most important issues and considerations. With this insight the project manager can communicate to, manage expectations oc and establish formal agreement with the stakeholders as to which reviews, and at what depth, are most appropriate to achieve project success. The authors, coming from diverse organizations and backgrounds, have drawn on their personal experiences and summarized the best practices of their own organizations to create a common framework to provide guidance on the adaptation of design reviews to other system engineers.
Enhanced Capabilities for Subcritical Experiments (ECSE) Risk Management Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urban, Mary Elizabeth
Risk is a factor, element, constraint, or course of action that introduces an uncertainty of outcome that could impact project objectives. Risk is an inherent part of all activities, whether the activity is simple and small, or large and complex. Risk management is a process that identifies, evaluates, handles, and monitors risks that have the potential to affect project success. The risk management process spans the entire project, from its initiation to its successful completion and closeout, including both technical and programmatic (non-technical) risks. This Risk Management Plan (RMP) defines the process to be used for identifying, evaluating, handling, andmore » monitoring risks as part of the overall management of the Enhanced Capabilities for Subcritical Experiments (ECSE) ‘Project’. Given the changing nature of the project environment, risk management is essentially an ongoing and iterative process, which applies the best efforts of a knowledgeable project staff to a suite of focused and prioritized concerns. The risk management process itself must be continually applied throughout the project life cycle. This document was prepared in accordance with DOE O 413.3B, Program and Project Management for the Acquisition of Capital Assets, its associated guide for risk management DOE G 413.3-7, Risk Management Guide, and LANL ADPM AP-350-204, Risk and Opportunity Management.« less
Recent progress in 3-D imaging of sea freight containers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuchs, Theobald, E-mail: theobold.fuchs@iis.fraunhofer.de; Schön, Tobias, E-mail: theobold.fuchs@iis.fraunhofer.de; Sukowski, Frank
The inspection of very large objects like sea freight containers with X-ray Computed Tomography (CT) is an emerging technology. A complete 3-D CT scan of a see-freight container takes several hours. Of course, this is too slow to apply it to a large number of containers. However, the benefits of a 3-D CT for sealed freight are obvious: detection of potential threats or illicit cargo without being confronted with legal complications or high time consumption and risks for the security personnel during a manual inspection. Recently distinct progress was made in the field of reconstruction of projections with only amore » relatively low number of angular positions. Instead of today’s 500 to 1000 rotational steps, as needed for conventional CT reconstruction techniques, this new class of algorithms provides the potential to reduce the number of projection angles approximately by a factor of 10. The main drawback of these advanced iterative methods is the high consumption for numerical processing. But as computational power is getting steadily cheaper, there will be practical applications of these complex algorithms in a foreseeable future. In this paper, we discuss the properties of iterative image reconstruction algorithms and show results of their application to CT of extremely large objects scanning a sea-freight container. A specific test specimen is used to quantitatively evaluate the image quality in terms of spatial and contrast resolution and depending on different number of projections.« less
Krojer, Tobias; Talon, Romain; Pearce, Nicholas; Douangamath, Alice; Brandao-Neto, Jose; Dias, Alexandre; Marsden, Brian
2017-01-01
XChemExplorer (XCE) is a data-management and workflow tool to support large-scale simultaneous analysis of protein–ligand complexes during structure-based ligand discovery (SBLD). The user interfaces of established crystallographic software packages such as CCP4 [Winn et al. (2011 ▸), Acta Cryst. D67, 235–242] or PHENIX [Adams et al. (2010 ▸), Acta Cryst. D66, 213–221] have entrenched the paradigm that a ‘project’ is concerned with solving one structure. This does not hold for SBLD, where many almost identical structures need to be solved and analysed quickly in one batch of work. Functionality to track progress and annotate structures is essential. XCE provides an intuitive graphical user interface which guides the user from data processing, initial map calculation, ligand identification and refinement up until data dissemination. It provides multiple entry points depending on the need of each project, enables batch processing of multiple data sets and records metadata, progress and annotations in an SQLite database. XCE is freely available and works on any Linux and Mac OS X system, and the only dependency is to have the latest version of CCP4 installed. The design and usage of this tool are described here, and its usefulness is demonstrated in the context of fragment-screening campaigns at the Diamond Light Source. It is routinely used to analyse projects comprising 1000 data sets or more, and therefore scales well to even very large ligand-design projects. PMID:28291762
Palazuelos, Daniel; DaEun Im, Dana; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane
2013-01-01
Introduction Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. Objective To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and other CHW projects. Design Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the ‘5×5-SPICE chart’. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. Results The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW projects interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insights that this method may yield. Conclusions The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypothesis that will form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypothesis can be rigorously tested. PMID:23561023
Large Mine Permitting - Div. of Mining, Land, and Water
Pebble Project Pogo Mine Red Dog Mine Rock Creek Project True North Mine OPMP Canadian Large Projects Pebble Project Pogo Mine Red Dog Mine Rock Creek Project True North Mine Contact: Kyle Moselle Large Mine
Managing Programmatic Risk for Complex Space System Developments
NASA Technical Reports Server (NTRS)
Panetta, Peter V.; Hastings, Daniel; Brumfield, Mark (Technical Monitor)
2001-01-01
Risk management strategies have become a recent important research topic to many aerospace organizations as they prepare to develop the revolutionary complex space systems of the future. Future multi-disciplinary complex space systems will make it absolutely essential for organizations to practice a rigorous, comprehensive risk management process, emphasizing thorough systems engineering principles to succeed. Project managers must possess strong leadership skills to direct high quality, cross-disciplinary teams for successfully developing revolutionary space systems that are ever increasing in complexity. Proactive efforts to reduce or eliminate risk throughout a project's lifecycle ideally must be practiced by all technical members in the organization. This paper discusses some of the risk management perspectives that were collected from senior managers and project managers of aerospace and aeronautical organizations by the use of interviews and surveys. Some of the programmatic risks which drive the success or failure of projects are revealed. Key findings lead to a number of insights for organizations to consider for proactively approaching the risks which face current and future complex space systems projects.
Habitat Complexity Metrics to Guide Restoration of Large Rivers
NASA Astrophysics Data System (ADS)
Jacobson, R. B.; McElroy, B. J.; Elliott, C.; DeLonay, A.
2011-12-01
Restoration strategies on large, channelized rivers typically strive to recover lost habitat complexity, based on the assumption complexity and biophysical capacity are directly related. Although definition of links between complexity and biotic responses can be tenuous, complexity metrics have appeal because of their potential utility in quantifying habitat quality, defining reference conditions and design criteria, and measuring restoration progress. Hydroacoustic instruments provide many ways to measure complexity on large rivers, yet substantive questions remain about variables and scale of complexity that are meaningful to biota, and how complexity can be measured and monitored cost effectively. We explore these issues on the Missouri River, using the example of channel re-engineering projects that are intended to aid in recovery of the pallid sturgeon, an endangered benthic fish. We are refining understanding of what habitat complexity means for adult fish by combining hydroacoustic habitat assessments with acoustic telemetry to map locations during reproductive migrations and spawning. These data indicate that migrating sturgeon select points with relatively low velocity but adjacent to areas of high velocity (that is, with high velocity gradients); the integration of points defines pathways which minimize energy expenditures during upstream migrations of 10's to 100's of km. Complexity metrics that efficiently quantify migration potential at the reach scale are therefore directly relevant to channel restoration strategies. We are also exploring complexity as it relates to larval sturgeon dispersal. Larvae may drift for as many as 17 days (100's of km at mean velocities) before using up their yolk sac, after which they "settle" into habitats where they initiate feeding. An assumption underlying channel re-engineering is that additional channel complexity, specifically increased shallow, slow water, is necessary for early feeding and refugia. Development of complexity metrics is complicated by the fact that characteristics of channel morphology may increase complexity scores without necessarily increasing biophysical capacity for target species. For example, a cross section that samples depths and velocities across the thalweg (navigation channel) and into lentic habitat may score high on most measures of hydraulic or geomorphic complexity, but does not necessarily provide habitats beneficial to native species. Complexity measures need to be bounded by best estimates of native species requirements. In the absence of specific information, creation of habitat complexity for the sake of complexity may lead to unintended consequences, for example, lentic habitats that increase a complexity score but support invasive species. An additional practical constraint on complexity measures is the need to develop metrics that are can be deployed cost-effectively in an operational monitoring program. Design of a monitoring program requires informed choices of measurement variables, definition of reference sites, and design of sampling effort to capture spatial and temporal variability.
NASA Astrophysics Data System (ADS)
Hartin, C.; Lynch, C.; Kravitz, B.; Link, R. P.; Bond-Lamberty, B. P.
2017-12-01
Typically, uncertainty quantification of internal variability relies on large ensembles of climate model runs under multiple forcing scenarios or perturbations in a parameter space. Computationally efficient, standard pattern scaling techniques only generate one realization and do not capture the complicated dynamics of the climate system (i.e., stochastic variations with a frequency-domain structure). In this study, we generate large ensembles of climate data with spatially and temporally coherent variability across a subselection of Coupled Model Intercomparison Project Phase 5 (CMIP5) models. First, for each CMIP5 model we apply a pattern emulation approach to derive the model response to external forcing. We take all the spatial and temporal variability that isn't explained by the emulator and decompose it into non-physically based structures through use of empirical orthogonal functions (EOFs). Then, we perform a Fourier decomposition of the EOF projection coefficients to capture the input fields' temporal autocorrelation so that our new emulated patterns reproduce the proper timescales of climate response and "memory" in the climate system. Through this 3-step process, we derive computationally efficient climate projections consistent with CMIP5 model trends and modes of variability, which address a number of deficiencies inherent in the ability of pattern scaling to reproduce complex climate model behavior.
Teaching With Projections in the Geosciences: Windows to Enlightenment or Barriers to Understanding?
NASA Astrophysics Data System (ADS)
Mogk, D. W.
2009-12-01
Geoscientists are trained to represent multi-component datasets by projecting onto relatively simple diagrams on two-dimensional surfaces. These projections are used to represent a variety of phenomena ranging from spatial relations to physico-chemical processes. By using projections, it is possible to create simple diagrams as models or analogs of complex and heterogeneous natural systems using a limited number of well-defined “end-member” variables. Although projections are widely used in professional practice, the construction, use and interpretation of these diagrams often presents formidable barriers to student learning. This is largely due to the fact that diagrams that display projected data are the composite product of underlying scientific and mathematical principles, spatial relations on the diagrams may serve as proxies for physical or chemical properties or processes (thus co-mingling spatial reasoning with conceptual reasoning), there are myriad hidden or understood assumptions in the creation of the projections, and projections seek to decrease the “dimensionality” (or degrees of freedom) of multi-component (or multi-variable) systems. Additional layers of information may be superposed on projected diagrams by contouring data, using color or other symbols to distinguish discrete populations of data, imposing gradients of related variables (e.g. isotherms on composition diagrams), or using multiple projections to demonstrate time sequences that elucidate processes (e.g. before/after relations conveyed in animations). Thus, the simple forms of graphical projections may belie numerous layers of information that attempt to explain complex and sophisticated relationships in nature. In striving for simplicity in presentation, diagrams that present projected data may confound student understanding due to lack of knowledge about the inherent complexities in their development. Recall Plato’s Myth of the Cave (Republic, Book 7): the shadow on the wall is at least one step (and probably more) removed from reality. Examples of maps commonly used in the geosciences includes maps (topographic, geologic, weather), equal-area and equal-angle stereonets, phase diagrams (binary, ternary, quadrilateral, PTt, T-X, activity-activity), and other geochemical variation diagrams (mineral exchange vector diagrams, Piper diagrams). All of these projected representations of geological data provide powerful tools to analyze and explain Earth phenomena. But, the truths revealed in these diagrams are not immediately obvious to novices (colleagues "out of field", students, the interested public). In presenting projected data is worth considering: How do “Master” geoscientists derive meaning from these representations? How do we understand what is “normal” and what is “anomalous”? How do we make the jump from “signal” to interpretation? Can we articulate what we’re doing (and why) in such a way that it becomes understandable to our students? As with any powerful tool, it is important to include operating instructions such as annotations, tutorials and worked examples to convey meaning and to ensure appropriate use.
Life-Cycle Assessments of Selected NASA Ground-Based Test Facilities
NASA Technical Reports Server (NTRS)
Sydnor, George Honeycutt
2012-01-01
In the past two years, two separate facility-specific life cycle assessments (LCAs) have been performed as summer student projects. The first project focused on 13 facilities managed by NASA s Aeronautics Test Program (ATP), an organization responsible for large, high-energy ground test facilities that accomplish the nation s most advanced aerospace research. A facility inventory was created for each facility, and the operational-phase carbon footprint and environmental impact were calculated. The largest impacts stemmed from electricity and natural gas used directly at the facility and to generate support processes such as compressed air and steam. However, in specialized facilities that use unique inputs like R-134a, R-14, jet fuels, or nitrogen gas, these sometimes had a considerable effect on the facility s overall environmental impact. The second LCA project was conducted on the NASA Ames Arc Jet Complex and also involved creating a facility inventory and calculating the carbon footprint and environmental impact. In addition, operational alternatives were analyzed for their effectiveness at reducing impact. Overall, the Arc Jet Complex impact is dominated by the natural-gas fired boiler producing steam on-site, but alternatives were provided that could reduce the impact of the boiler operation, some of which are already being implemented. The data and results provided by these LCA projects are beneficial to both the individual facilities and NASA as a whole; the results have already been used in a proposal to reduce carbon footprint at Ames Research Center. To help future life cycle projects, several lessons learned have been recommended as simple and effective infrastructure improvements to NASA, including better utility metering and data recording and standardization of modeling choices and methods. These studies also increased sensitivity to and appreciation for quantifying the impact of NASA s activities.
12om Methodology: Process v1.1
2014-03-31
in support of the Applied Research Project (ARP) 12om entitled “Collaborative Understanding of Complex Situations”. The overall purpose of this...Definition ARP Applied Research Project CF Canadian Forces CFOPP Canadian Forces Operational Planning Process CIDA Canadian International... Research Project (ARP) 12om entitled “Collaborative Understanding of Complex Situations”. The overall purpose of this project is to develop a
Data management integration for biomedical core facilities
NASA Astrophysics Data System (ADS)
Zhang, Guo-Qiang; Szymanski, Jacek; Wilson, David
2007-03-01
We present the design, development, and pilot-deployment experiences of MIMI, a web-based, Multi-modality Multi-Resource Information Integration environment for biomedical core facilities. This is an easily customizable, web-based software tool that integrates scientific and administrative support for a biomedical core facility involving a common set of entities: researchers; projects; equipments and devices; support staff; services; samples and materials; experimental workflow; large and complex data. With this software, one can: register users; manage projects; schedule resources; bill services; perform site-wide search; archive, back-up, and share data. With its customizable, expandable, and scalable characteristics, MIMI not only provides a cost-effective solution to the overarching data management problem of biomedical core facilities unavailable in the market place, but also lays a foundation for data federation to facilitate and support discovery-driven research.
'Pop-Up' Governance: developing internal governance frameworks for consortia: the example of UK10K.
Kaye, Jane; Muddyman, Dawn; Smee, Carol; Kennedy, Karen; Bell, Jessica
2015-01-01
Innovations in information technologies have facilitated the development of new styles of research networks and forms of governance. This is evident in genomics where increasingly, research is carried out by large, interdisciplinary consortia focussing on a specific research endeavour. The UK10K project is an example of a human genomics consortium funded to provide insights into the genomics of rare conditions, and establish a community resource from generated sequence data. To achieve its objectives according to the agreed timetable, the UK10K project established an internal governance system to expedite the research and to deal with the complex issues that arose. The project's governance structure exemplifies a new form of network governance called 'pop-up' governance. 'Pop-up' because: it was put together quickly, existed for a specific period, was designed for a specific purpose, and was dismantled easily on project completion. In this paper, we use UK10K to describe how 'pop-up' governance works on the ground and how relational, hierarchical and contractual governance mechanisms are used in this new form of network governance.
NASA Technical Reports Server (NTRS)
1974-01-01
The significant management and technical aspects of the JPL Project to develop and implement a 64-meter-diameter antenna at the Goldstone Deep Space Communications Complex in California, which was the first of the Advanced Antenna Systems of the National Aeronautics and Space Administration/Jet Propulsion Laboratory Deep Space Network are described. The original need foreseen for a large-diameter antenna to accomplish communication and tracking support of NASA's solar system exploration program is reviewed, and the translation of those needs into the technical specification of an appropriate ground station antenna is described. The antenna project is delineated by phases to show the key technical and managerial skills and the technical facility resources involved. There is a brief engineering description of the antenna and its closely related facilities. Some difficult and interesting engineering problems, then at the state-of-the-art level, which were met in the accomplishment of the Project, are described. The key performance characteristics of the antenna, in relation to the original specifications and the methods of their determination, are stated.
Wind Technology Modeling Within the System Advisor Model (SAM) (Poster)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blair, N.; Dobos, A.; Ferguson, T.
This poster provides detail for implementation and the underlying methodology for modeling wind power generation performance in the National Renewable Energy Laboratory's (NREL's) System Advisor Model (SAM). SAM's wind power model allows users to assess projects involving one or more large or small wind turbines with any of the detailed options for residential, commercial, or utility financing. The model requires information about the wind resource, wind turbine specifications, wind farm layout (if applicable), and costs, and provides analysis to compare the absolute or relative impact of these inputs. SAM is a system performance and economic model designed to facilitate analysismore » and decision-making for project developers, financers, policymakers, and energy researchers. The user pairs a generation technology with a financing option (residential, commercial, or utility) to calculate the cost of energy over the multi-year project period. Specifically, SAM calculates the value of projects which buy and sell power at retail rates for residential and commercial systems, and also for larger-scale projects which operate through a power purchase agreement (PPA) with a utility. The financial model captures complex financing and rate structures, taxes, and incentives.« less
Dynamic system simulation of small satellite projects
NASA Astrophysics Data System (ADS)
Raif, Matthias; Walter, Ulrich; Bouwmeester, Jasper
2010-11-01
A prerequisite to accomplish a system simulation is to have a system model holding all necessary project information in a centralized repository that can be accessed and edited by all parties involved. At the Institute of Astronautics of the Technische Universitaet Muenchen a modular approach for modeling and dynamic simulation of satellite systems has been developed called dynamic system simulation (DySyS). DySyS is based on the platform independent description language SysML to model a small satellite project with respect to the system composition and dynamic behavior. A library of specific building blocks and possible relations between these blocks have been developed. From this library a system model of the satellite of interest can be created. A mapping into a C++ simulation allows the creation of an executable system model with which simulations are performed to observe the dynamic behavior of the satellite. In this paper DySyS is used to model and simulate the dynamic behavior of small satellites, because small satellite projects can act as a precursor to demonstrate the feasibility of a system model since they are less complex compared to a large scale satellite project.
Reengineering observatory operations for the time domain
NASA Astrophysics Data System (ADS)
Seaman, Robert L.; Vestrand, W. T.; Hessman, Frederic V.
2014-07-01
Observatories are complex scientific and technical institutions serving diverse users and purposes. Their telescopes, instruments, software, and human resources engage in interwoven workflows over a broad range of timescales. These workflows have been tuned to be responsive to concepts of observatory operations that were applicable when various assets were commissioned, years or decades in the past. The astronomical community is entering an era of rapid change increasingly characterized by large time domain surveys, robotic telescopes and automated infrastructures, and - most significantly - of operating modes and scientific consortia that span our individual facilities, joining them into complex network entities. Observatories must adapt and numerous initiatives are in progress that focus on redesigning individual components out of the astronomical toolkit. New instrumentation is both more capable and more complex than ever, and even simple instruments may have powerful observation scripting capabilities. Remote and queue observing modes are now widespread. Data archives are becoming ubiquitous. Virtual observatory standards and protocols and astroinformatics data-mining techniques layered on these are areas of active development. Indeed, new large-aperture ground-based telescopes may be as expensive as space missions and have similarly formal project management processes and large data management requirements. This piecewise approach is not enough. Whatever challenges of funding or politics facing the national and international astronomical communities it will be more efficient - scientifically as well as in the usual figures of merit of cost, schedule, performance, and risks - to explicitly address the systems engineering of the astronomical community as a whole.
Latorre, Mariano; Silva, Herman; Saba, Juan; Guziolowski, Carito; Vizoso, Paula; Martinez, Veronica; Maldonado, Jonathan; Morales, Andrea; Caroca, Rodrigo; Cambiazo, Veronica; Campos-Vargas, Reinaldo; Gonzalez, Mauricio; Orellana, Ariel; Retamales, Julio; Meisel, Lee A
2006-11-23
Expressed sequence tag (EST) analyses provide a rapid and economical means to identify candidate genes that may be involved in a particular biological process. These ESTs are useful in many Functional Genomics studies. However, the large quantity and complexity of the data generated during an EST sequencing project can make the analysis of this information a daunting task. In an attempt to make this task friendlier, we have developed JUICE, an open source data management system (Apache + PHP + MySQL on Linux), which enables the user to easily upload, organize, visualize and search the different types of data generated in an EST project pipeline. In contrast to other systems, the JUICE data management system allows a branched pipeline to be established, modified and expanded, during the course of an EST project. The web interfaces and tools in JUICE enable the users to visualize the information in a graphical, user-friendly manner. The user may browse or search for sequences and/or sequence information within all the branches of the pipeline. The user can search using terms associated with the sequence name, annotation or other characteristics stored in JUICE and associated with sequences or sequence groups. Groups of sequences can be created by the user, stored in a clipboard and/or downloaded for further analyses. Different user profiles restrict the access of each user depending upon their role in the project. The user may have access exclusively to visualize sequence information, access to annotate sequences and sequence information, or administrative access. JUICE is an open source data management system that has been developed to aid users in organizing and analyzing the large amount of data generated in an EST Project workflow. JUICE has been used in one of the first functional genomics projects in Chile, entitled "Functional Genomics in nectarines: Platform to potentiate the competitiveness of Chile in fruit exportation". However, due to its ability to organize and visualize data from external pipelines, JUICE is a flexible data management system that should be useful for other EST/Genome projects. The JUICE data management system is released under the Open Source GNU Lesser General Public License (LGPL). JUICE may be downloaded from http://genoma.unab.cl/juice_system/ or http://www.genomavegetal.cl/juice_system/.
Latorre, Mariano; Silva, Herman; Saba, Juan; Guziolowski, Carito; Vizoso, Paula; Martinez, Veronica; Maldonado, Jonathan; Morales, Andrea; Caroca, Rodrigo; Cambiazo, Veronica; Campos-Vargas, Reinaldo; Gonzalez, Mauricio; Orellana, Ariel; Retamales, Julio; Meisel, Lee A
2006-01-01
Background Expressed sequence tag (EST) analyses provide a rapid and economical means to identify candidate genes that may be involved in a particular biological process. These ESTs are useful in many Functional Genomics studies. However, the large quantity and complexity of the data generated during an EST sequencing project can make the analysis of this information a daunting task. Results In an attempt to make this task friendlier, we have developed JUICE, an open source data management system (Apache + PHP + MySQL on Linux), which enables the user to easily upload, organize, visualize and search the different types of data generated in an EST project pipeline. In contrast to other systems, the JUICE data management system allows a branched pipeline to be established, modified and expanded, during the course of an EST project. The web interfaces and tools in JUICE enable the users to visualize the information in a graphical, user-friendly manner. The user may browse or search for sequences and/or sequence information within all the branches of the pipeline. The user can search using terms associated with the sequence name, annotation or other characteristics stored in JUICE and associated with sequences or sequence groups. Groups of sequences can be created by the user, stored in a clipboard and/or downloaded for further analyses. Different user profiles restrict the access of each user depending upon their role in the project. The user may have access exclusively to visualize sequence information, access to annotate sequences and sequence information, or administrative access. Conclusion JUICE is an open source data management system that has been developed to aid users in organizing and analyzing the large amount of data generated in an EST Project workflow. JUICE has been used in one of the first functional genomics projects in Chile, entitled "Functional Genomics in nectarines: Platform to potentiate the competitiveness of Chile in fruit exportation". However, due to its ability to organize and visualize data from external pipelines, JUICE is a flexible data management system that should be useful for other EST/Genome projects. The JUICE data management system is released under the Open Source GNU Lesser General Public License (LGPL). JUICE may be downloaded from or . PMID:17123449
Interdisciplinary collaboration in action: tracking the signal, tracing the noise
Callard, Felicity; Fitzgerald, Des; Woods, Angela
2016-01-01
Interdisciplinarity is often framed as an unquestioned good within and beyond the academy, one to be encouraged by funders and research institutions alike. And yet there is little research on how interdisciplinary projects actually work—and do not work—in practice, particularly within and across the social sciences and humanities. This article centres on “Hubbub”, the first interdisciplinary 2-year research residency of The Hub at Wellcome Collection, which is investigating rest and its opposites in neuroscience, mental health, the arts and the everyday. The article describes how Hubbub is tracing, capturing and reflecting on practices of interdisciplinarity across its large, dispersed team of collaborators, who work across the social sciences, humanities, arts, mind and brain sciences, and public engagement. We first describe the distinctiveness of Hubbub (a project designed for a particular space, and one in which the arts are not positioned as simply illustrating or disseminating the research of the scientists), and then outline three techniques Hubbub has developed to map interdisciplinary collaboration in the making: (1) ethnographic analysis; (2) “In the Diary Room”, an aesthetics of collaboration designed to harness and capture affective dynamics within a large, complex project; and (3) the Hubbub Collaboration Questionnaire, which yields quantitative and qualitative data, as well as a social network analysis of collaborators. We conclude by considering some themes that other inter-disciplinary projects might draw on for their own logics of tracking and tracing. This article forms part of an ongoing thematic collection dedicated to interdisciplinary research. PMID:27516896
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Katherine J; Johnson, Seth R; Prokopenko, Andrey V
'ForTrilinos' is related to The Trilinos Project, which contains a large and growing collection of solver capabilities that can utilize next-generation platforms, in particular scalable multicore, manycore, accelerator and heterogeneous systems. Trilinos is primarily written in C++, including its user interfaces. While C++ is advantageous for gaining access to the latest programming environments, it limits Trilinos usage via Fortran. Sever ad hoc translation interfaces exist to enable Fortran usage of Trilinos, but none of these interfaces is general-purpose or written for reusable and sustainable external use. 'ForTrilinos' provides a seamless pathway for large and complex Fortran-based codes to access Trilinosmore » without C/C++ interface code. This access includes Fortran versions of Kokkos abstractions for code execution and data management.« less
Implementation of an Antenna Array Signal Processing Breadboard for the Deep Space Network
NASA Technical Reports Server (NTRS)
Navarro, Robert
2006-01-01
The Deep Space Network Large Array will replace/augment 34 and 70 meter antenna assets. The array will mainly be used to support NASA's deep space telemetry, radio science, and navigation requirements. The array project will deploy three complexes in the western U.S., Australia, and European longitude each with 400 12m downlink antennas and a DSN central facility at JPL. THis facility will remotely conduct all real-time monitor and control for the network. Signal processing objectives include: provide a means to evaluate the performance of the Breadboard Array's antenna subsystem; design and build prototype hardware; demonstrate and evaluate proposed signal processing techniques; and gain experience with various technologies that may be used in the Large Array. Results are summarized..
NASA Astrophysics Data System (ADS)
Krumhansl, R. A.; Foster, J.; Peach, C. L.; Busey, A.; Baker, I.
2012-12-01
The practice of science and engineering is being revolutionized by the development of cyberinfrastructure for accessing near real-time and archived observatory data. Large cyberinfrastructure projects have the potential to transform the way science is taught in high school classrooms, making enormous quantities of scientific data available, giving students opportunities to analyze and draw conclusions from many kinds of complex data, and providing students with experiences using state-of-the-art resources and techniques for scientific investigations. However, online interfaces to scientific data are built by scientists for scientists, and their design can significantly impede broad use by novices. Knowledge relevant to the design of student interfaces to complex scientific databases is broadly dispersed among disciplines ranging from cognitive science to computer science and cartography and is not easily accessible to designers of educational interfaces. To inform efforts at bridging scientific cyberinfrastructure to the high school classroom, Education Development Center, Inc. and the Scripps Institution of Oceanography conducted an NSF-funded 2-year interdisciplinary review of literature and expert opinion pertinent to making interfaces to large scientific databases accessible to and usable by precollege learners and their teachers. Project findings are grounded in the fundamentals of Cognitive Load Theory, Visual Perception, Schemata formation and Universal Design for Learning. The Knowledge Status Report (KSR) presents cross-cutting and visualization-specific guidelines that highlight how interface design features can address/ ameliorate challenges novice high school students face as they navigate complex databases to find data, and construct and look for patterns in maps, graphs, animations and other data visualizations. The guidelines present ways to make scientific databases more broadly accessible by: 1) adjusting the cognitive load imposed by the user interface and visualizations so that it doesn't exceed the amount of information the learner can actively process; 2) drawing attention to important features and patterns; and 3) enabling customization of visualizations and tools to meet the needs of diverse learners.
Introduction to the HL-LHC Project
NASA Astrophysics Data System (ADS)
Rossi, L.; Brüning, O.
The Large Hadron Collider (LHC) is one of largest scientific instruments ever built. It has been exploring the new energy frontier since 2010, gathering a global user community of 7,000 scientists. To extend its discovery potential, the LHC will need a major upgrade in the 2020s to increase its luminosity (rate of collisions) by a factor of five beyond its design value and the integrated luminosity by a factor of ten. As a highly complex and optimized machine, such an upgrade of the LHC must be carefully studied and requires about ten years to implement. The novel machine configuration, called High Luminosity LHC (HL-LHC), will rely on a number of key innovative technologies, representing exceptional technological challenges, such as cutting-edge 11-12 tesla superconducting magnets, very compact superconducting cavities for beam rotation with ultra-precise phase control, new technology for beam collimation and 300-meter-long high-power superconducting links with negligible energy dissipation. HL-LHC federates efforts and R&D of a large community in Europe, in the US and in Japan, which will facilitate the implementation of the construction phase as a global project.
Voet, T; Devolder, P; Pynoo, B; Vercruysse, J; Duyck, P
2007-11-01
This paper hopes to share the insights we experienced during designing, building, and running an indexing solution for a large set of radiological reports and images in a production environment for more than 3 years. Several technical challenges were encountered and solved in the course of this project. One hundred four million words in 1.8 million radiological reports from 1989 to the present were indexed and became instantaneously searchable in a user-friendly fashion; the median query duration is only 31 ms. Currently, our highly tuned index holds 332,088 unique words in four languages. The indexing system is feature-rich and language-independent and allows for making complex queries. For research and training purposes it certainly is a valuable and convenient addition to our radiology informatics toolbox. Extended use of open-source technology dramatically reduced both implementation time and cost. All software we developed related to the indexing project has been made available to the open-source community covered by an unrestricted Berkeley Software Distribution-style license.
A review of surrogate models and their application to groundwater modeling
NASA Astrophysics Data System (ADS)
Asher, M. J.; Croke, B. F. W.; Jakeman, A. J.; Peeters, L. J. M.
2015-08-01
The spatially and temporally variable parameters and inputs to complex groundwater models typically result in long runtimes which hinder comprehensive calibration, sensitivity, and uncertainty analysis. Surrogate modeling aims to provide a simpler, and hence faster, model which emulates the specified output of a more complex model in function of its inputs and parameters. In this review paper, we summarize surrogate modeling techniques in three categories: data-driven, projection, and hierarchical-based approaches. Data-driven surrogates approximate a groundwater model through an empirical model that captures the input-output mapping of the original model. Projection-based models reduce the dimensionality of the parameter space by projecting the governing equations onto a basis of orthonormal vectors. In hierarchical or multifidelity methods the surrogate is created by simplifying the representation of the physical system, such as by ignoring certain processes, or reducing the numerical resolution. In discussing the application to groundwater modeling of these methods, we note several imbalances in the existing literature: a large body of work on data-driven approaches seemingly ignores major drawbacks to the methods; only a fraction of the literature focuses on creating surrogates to reproduce outputs of fully distributed groundwater models, despite these being ubiquitous in practice; and a number of the more advanced surrogate modeling methods are yet to be fully applied in a groundwater modeling context.
Error Cost Escalation Through the Project Life Cycle
NASA Technical Reports Server (NTRS)
Stecklein, Jonette M.; Dabney, Jim; Dick, Brandon; Haskins, Bill; Lovell, Randy; Moroney, Gregory
2004-01-01
It is well known that the costs to fix errors increase as the project matures, but how fast do those costs build? A study was performed to determine the relative cost of fixing errors discovered during various phases of a project life cycle. This study used three approaches to determine the relative costs: the bottom-up cost method, the total cost breakdown method, and the top-down hypothetical project method. The approaches and results described in this paper presume development of a hardware/software system having project characteristics similar to those used in the development of a large, complex spacecraft, a military aircraft, or a small communications satellite. The results show the degree to which costs escalate, as errors are discovered and fixed at later and later phases in the project life cycle. If the cost of fixing a requirements error discovered during the requirements phase is defined to be 1 unit, the cost to fix that error if found during the design phase increases to 3 - 8 units; at the manufacturing/build phase, the cost to fix the error is 7 - 16 units; at the integration and test phase, the cost to fix the error becomes 21 - 78 units; and at the operations phase, the cost to fix the requirements error ranged from 29 units to more than 1500 units
Portraiture in the Large Lecture: Storying One Chemistry Professor's Practical Knowledge
NASA Astrophysics Data System (ADS)
Eddleton, Jeannine E.
Practical knowledge, as defined by Freema Elbaz (1983), is a complex, practically oriented set of understandings which teachers use to actively shape and direct their work. The goal of this study is the construction of a social science portrait that illuminates the practical knowledge of a large lecture professor of general chemistry at a public research university in the southeast. This study continues Elbaz's (1981) work on practical knowledge with the incorporation of a qualitative and intentionally interventionist methodology which "blurs the boundaries of aesthetics and empiricism in an effort to capture the complexity, dynamics, and subtlety of human experience and organizational life," (Lawrence-Lightfoot & Davis, 1997). This collection of interviews, observations, writings, and reflections is designed for an eclectic audience with the intent of initiating conversation on the topic of the large lecture and is a purposeful attempt to link research and practice. Social science portraiture is uniquely suited to this intersection of researcher and researched, the perfect combination of methodology and analysis for a project that is both product and praxis. The following research questions guide the study. • Are aspects of Elbaz's practical knowledge identifiable in the research conversations conducted with a large lecture college professor? • Is practical knowledge identifiable during observations of Patricia's large lecture? Freema Elbaz conducted research conversations with Sarah, a high school classroom and writing resource teacher who conducted much of her teaching work one on one with students. Patricia's practice differs significantly from Sarah's with respect to subject matter and to scale.
Rapid water quality change in the Elwha River estuary complex during dam removal
Foley, Melissa M.; Duda, Jeffrey J.; Beirne, Matthew M.; Paradis, Rebecca; Ritchie, Andrew; Warrick, Jonathan A.
2015-01-01
Dam removal in the United States is increasing as a result of structural concerns, sedimentation of reservoirs, and declining riverine ecosystem conditions. The removal of the 32 m Elwha and 64 m Glines Canyon dams from the Elwha River in Washington, U.S.A., was the largest dam removal project in North American history. During the 3 yr of dam removal—from September 2011 to August 2014—more than ten million cubic meters of sediment was eroded from the former reservoirs, transported downstream, and deposited throughout the lower river, river delta, and nearshore waters of the Strait of Juan de Fuca. Water quality data collected in the estuary complex at the mouth of the Elwha River document how conditions in the estuary changed as a result of sediment deposition over the 3 yr the dams were removed. Rapid and large-scale changes in estuary conditions—including salinity, depth, and turbidity—occurred 1 yr into the dam removal process. Tidal propagation into the estuary ceased following a large sediment deposition event that began in October 2013, resulting in decreased salinity, and increased depth and turbidity in the estuary complex. These changes have persisted in the system through dam removal, significantly altering the structure and functioning of the Elwha River estuary ecosystem.
2002-01-01
Discussing the ethical issues involved in topics such as cloning and stem cell research in a large introductory biology course is often difficult. Teachers may be wary of presenting material biased by personal beliefs, and students often feel inhibited speaking about moral issues in a large group. Yet, to ignore what is happening “out there” beyond the textbooks and lab work is to do a disservice to students. This essay describes a semester-long project in which upperclass students presented some of the most complex and controversial ideas imaginable to introductory students by staging a mock debate and acting as members of the then newly appointed President's Council on Bioethics. Because the upperclass students were presenting the ideas of real people who play an important role in shaping national policy, no student's personal beliefs were put on the line, and many ideas were articulated. The introductory audience could accept or reject what they were hearing and learn information important for making up their own minds on these issues. This project is presented as an example of how current events can be used to put basic cell biology into context and of how exciting it can be when students teach students. PMID:12669102
Pragmatic quality metrics for evolutionary software development models
NASA Technical Reports Server (NTRS)
Royce, Walker
1990-01-01
Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.
Boeker, Martin; Stenzhorn, Holger; Kumpf, Kai; Bijlenga, Philippe; Schulz, Stefan; Hanser, Susanne
2007-01-01
The @neurIST ontology is currently under development within the scope of the European project @neurIST intended to serve as a module in a complex architecture aiming at providing a better understanding and management of intracranial aneurysms and subarachnoid hemorrhages. Due to the integrative structure of the project the ontology needs to represent entities from various disciplines on a large spatial and temporal scale. Initial term acquisition was performed by exploiting a database scaffold, literature analysis and communications with domain experts. The ontology design is based on the DOLCE upper ontology and other existing domain ontologies were linked or partly included whenever appropriate (e.g., the FMA for anatomical entities and the UMLS for definitions and lexical information). About 2300 predominantly medical entities were represented but also a multitude of biomolecular, epidemiological, and hemodynamic entities. The usage of the ontology in the project comprises terminological control, text mining, annotation, and data mediation. PMID:18693797
NASA Astrophysics Data System (ADS)
Bader, D. C.
2015-12-01
The Accelerated Climate Modeling for Energy (ACME) Project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, modeling simulation and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in the "laboratory resources," is a large investment in computational, network and information technologies that will be utilized to both build better and more accurate climate models and broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top tier international modeling centers. In this talk, the ACME Workflow component plans to meet these future needs will be described and early implementation examples will be highlighted.
NASA Technical Reports Server (NTRS)
Tavana, Madjid
1995-01-01
The evaluation and prioritization of Engineering Support Requests (ESR's) is a particularly difficult task at the Kennedy Space Center (KSC) -- Shuttle Project Engineering Office. This difficulty is due to the complexities inherent in the evaluation process and the lack of structured information. The evaluation process must consider a multitude of relevant pieces of information concerning Safety, Supportability, O&M Cost Savings, Process Enhancement, Reliability, and Implementation. Various analytical and normative models developed over the past have helped decision makers at KSC utilize large volumes of information in the evaluation of ESR's. The purpose of this project is to build on the existing methodologies and develop a multiple criteria decision support system that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. The model utilizes the Analytic Hierarchy Process (AHP), subjective probabilities, the entropy concept, and Maximize Agreement Heuristic (MAH) to enhance the decision maker's intuition in evaluating a set of ESR's.
Application of machine learning methods in bioinformatics
NASA Astrophysics Data System (ADS)
Yang, Haoyu; An, Zheng; Zhou, Haotian; Hou, Yawen
2018-05-01
Faced with the development of bioinformatics, high-throughput genomic technology have enabled biology to enter the era of big data. [1] Bioinformatics is an interdisciplinary, including the acquisition, management, analysis, interpretation and application of biological information, etc. It derives from the Human Genome Project. The field of machine learning, which aims to develop computer algorithms that improve with experience, holds promise to enable computers to assist humans in the analysis of large, complex data sets.[2]. This paper analyzes and compares various algorithms of machine learning and their applications in bioinformatics.
Multimission Software Reuse in an Environment of Large Paradigm Shifts
NASA Technical Reports Server (NTRS)
Wilson, Robert K.
1996-01-01
The ground data systems provided for NASA space mission support are discussed. As space missions expand, the ground systems requirements become more complex. Current ground data systems provide for telemetry, command, and uplink and downlink processing capabilities. The new millennium project (NMP) technology testbed for 21st century NASA missions is discussed. The program demonstrates spacecraft and ground system technologies. The paradigm shift from detailed ground sequencing to a goal oriented planning approach is considered. The work carried out to meet this paradigm for the Deep Space-1 (DS-1) mission is outlined.
Risk Management Technique for design and operation of facilities and equipment
NASA Technical Reports Server (NTRS)
Fedor, O. H.; Parsons, W. N.; Coutinho, J. De S.
1975-01-01
The Risk Management System collects information from engineering, operating, and management personnel to identify potentially hazardous conditions. This information is used in risk analysis, problem resolution, and contingency planning. The resulting hazard accountability system enables management to monitor all identified hazards. Data from this system are examined in project reviews so that management can decide to eliminate or accept these risks. This technique is particularly effective in improving the management of risks in large, complex, high-energy facilities. These improvements are needed for increased cooperation among industry, regulatory agencies, and the public.
Ontology-Driven Information Integration
NASA Technical Reports Server (NTRS)
Tissot, Florence; Menzel, Chris
2005-01-01
Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.
Python-based geometry preparation and simulation visualization toolkits for STEPS
Chen, Weiliang; De Schutter, Erik
2014-01-01
STEPS is a stochastic reaction-diffusion simulation engine that implements a spatial extension of Gillespie's Stochastic Simulation Algorithm (SSA) in complex tetrahedral geometries. An extensive Python-based interface is provided to STEPS so that it can interact with the large number of scientific packages in Python. However, a gap existed between the interfaces of these packages and the STEPS user interface, where supporting toolkits could reduce the amount of scripting required for research projects. This paper introduces two new supporting toolkits that support geometry preparation and visualization for STEPS simulations. PMID:24782754
Psychiatric Emergencies in the Elderly.
Sikka, Veronica; Kalra, S; Galwankar, Sagar; Sagar, Galwankar
2015-11-01
With the increasing life expectancy, the geriatric population has been increasing over the past few decades. By the year 2050, it is projected to compose more than a fifth of the entire population, representing a 147% increase in this age group. There has been a steady increase in the number of medical and psychiatric disorders, and a large percentage of geriatric patients are now presenting to the emergency department with such disorders. The management of our progressively complex geriatric patient population will require an integrative team approach involving emergency medicine, psychiatry, and hospitalist medicine. Published by Elsevier Inc.
Lightning Mapping and Leader Propagation Reconstruction using LOFAR-LIM
NASA Astrophysics Data System (ADS)
Hare, B.; Ebert, U.; Rutjes, C.; Scholten, O.; Trinh, G. T. N.
2017-12-01
LOFAR (LOw Frequency ARray) is a radio telescope that consists of a large number of dual-polarized antennas spread over the northern Netherlands and beyond. The LOFAR for Lightning Imaging project (LOFAR-LIM) has successfully used LOFAR to map out lightning in the Netherlands. Since LOFAR covers a large frequency range (10-90 MHz), has antennas spread over a large area, and saves the raw trace data from the antennas, LOFAR-LIM can combine all the strongest aspects of both lightning mapping arrays and lightning interferometers. These aspects include a nanosecond resolution between pulses, nanosecond timing accuracy, and an ability to map lightning in all 3 spatial dimensions and time. LOFAR should be able to map out overhead lightning with a spatial accuracy on the order of meters. The large amount of complex data provide by LOFAR has presented new data processing challenges, such as handling the time offsets between stations with large baselines and locating as many sources as possible. New algorithms to handle these challenges have been developed and will be discussed. Since the antennas are dual-polarized, all three components of the electric field can be extracted and the structure of the R.F. pulses can be investigated at a large number of distances and angles relative to the lightning source, potentially allowing for modeling of lightning current distributions relevant to the 10 to 90 MHz frequency range. R.F. pulses due to leader propagation will be presented, which show a complex sub-structure, indicating intricate physics that could potentially be reconstructed.
Different Manhattan project: automatic statistical model generation
NASA Astrophysics Data System (ADS)
Yap, Chee Keng; Biermann, Henning; Hertzmann, Aaron; Li, Chen; Meyer, Jon; Pao, Hsing-Kuo; Paxia, Salvatore
2002-03-01
We address the automatic generation of large geometric models. This is important in visualization for several reasons. First, many applications need access to large but interesting data models. Second, we often need such data sets with particular characteristics (e.g., urban models, park and recreation landscape). Thus we need the ability to generate models with different parameters. We propose a new approach for generating such models. It is based on a top-down propagation of statistical parameters. We illustrate the method in the generation of a statistical model of Manhattan. But the method is generally applicable in the generation of models of large geographical regions. Our work is related to the literature on generating complex natural scenes (smoke, forests, etc) based on procedural descriptions. The difference in our approach stems from three characteristics: modeling with statistical parameters, integration of ground truth (actual map data), and a library-based approach for texture mapping.
Teaching Sustainability as a Large Format Environmental Science Elective
NASA Astrophysics Data System (ADS)
Davies, C.; Frisch, M.; Wagner, J.
2012-12-01
A challenge in teaching sustainability is engaging students in the global scale and immediacy of environmental impacts, and degree of societal change required to address environmental challenges. Succeeding in a large format Environmental Science elective course with a many as 100 students is an even greater challenge. ENVSC 322 Environmental Sustainability is an innovative new course integrating multiple disciplines, a wide range of external expert speakers and a hands-on community engagement project. The course, in its third year, has been highly successful and impacting for the students, community and faculty involved. The determination of success is based on student and community impacts. Students covered science topics on Earth systems, ecosystem complexity and services through readings and specialist speakers. The interconnection of society and climate was approached through global and local examples with a strong environmental justice component. Experts in a wide range of professional fields were engaged to speak with students on the role and impacts of sustainability in their particular field. Some examples are: Region VII Environmental Protection Agency Environmental Justice Director engaged students in both urban and rural aspects of environmental justice; a Principle Architect and national leader in Green architecture and redevelopment spoke with students regarding the necessity and potential for green urbanism; and industry innovators presented closed-cycle and alternative energy projects. The capstone project and highlight of the course was an individual or team community engagement project on sustainability, designed and implemented by the students. Community engagement projects completed throughout the Kansas City metro area have increased each year in number, quality and impact from 35 the first year to 70 projects this past spring. Students directly engage their communities and through this experience integrate knowledge of environmental systems with how their own society uses and impacts these systems. The direct nature of "doing" a project, not its success, can and has been transformative for many students.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, Lizhi
Advanced Ultra Supercritical Boiler (AUSC) requires materials that can operate in corrosive environment at temperature and pressure as high as 760°C (or 1400°F) and 5000psi, respectively, while at the same time maintain good ductility at low temperature. We develop automated simulation software tools to enable fast large scale screening studies of candidate designs. While direct evaluation of creep rupture strength and ductility are currently not feasible, properties such as energy, elastic constants, surface energy, interface energy, and stack fault energy can be used to assess their relative ductility and creeping strength. We implemented software to automate the complex calculations tomore » minimize human inputs in the tedious screening studies which involve model structures generation, settings for first principles calculations, results analysis and reporting. The software developed in the project and library of computed mechanical properties of phases found in ferritic steels, many are complex solid solutions estimated for the first time, will certainly help the development of low cost ferritic steel for AUSC.« less
The Design of Large Geothermally Powered Air-Conditioning Systems Using an Optimal Control Approach
NASA Astrophysics Data System (ADS)
Horowitz, F. G.; O'Bryan, L.
2010-12-01
The direct use of geothermal energy from Hot Sedimentary Aquifer (HSA) systems for large scale air-conditioning projects involves many tradeoffs. Aspects contributing towards making design decisions for such systems include: the inadequately known permeability and thermal distributions underground; the combinatorial complexity of selecting pumping and chiller systems to match the underground conditions to the air-conditioning requirements; the future price variations of the electricity market; any uncertainties in future Carbon pricing; and the applicable discount rate for evaluating the financial worth of the project. Expanding upon the previous work of Horowitz and Hornby (2007), we take an optimal control approach to the design of such systems. By building a model of the HSA system, the drilling process, the pumping process, and the chilling operations, along with a specified objective function, we can write a Hamiltonian for the system. Using the standard techniques of optimal control, we use gradients of the Hamiltonian to find the optimal design for any given set of permeabilities, thermal distributions, and the other engineering and financial parameters. By using this approach, optimal system designs could potentially evolve in response to the actual conditions encountered during drilling. Because the granularity of some current models is so coarse, we will be able to compare our optimal control approach to an exhaustive search of parameter space. We will present examples from the conditions appropriate for the Perth Basin of Western Australia, where the WA Geothermal Centre of Excellence is involved with two large air-conditioning projects using geothermal water from deep aquifers at 75 to 95 degrees C.
NASA Astrophysics Data System (ADS)
Ceccarelli, C.; Caselli, P.; Fontani, F.; Neri, R.; López-Sepulcre, A.; Codella, C.; Feng, S.; Jiménez-Serra, I.; Lefloch, B.; Pineda, J. E.; Vastel, C.; Alves, F.; Bachiller, R.; Balucani, N.; Bianchi, E.; Bizzocchi, L.; Bottinelli, S.; Caux, E.; Chacón-Tanarro, A.; Choudhury, R.; Coutens, A.; Dulieu, F.; Favre, C.; Hily-Blant, P.; Holdship, J.; Kahane, C.; Jaber Al-Edhari, A.; Laas, J.; Ospina, J.; Oya, Y.; Podio, L.; Pon, A.; Punanova, A.; Quenard, D.; Rimola, A.; Sakai, N.; Sims, I. R.; Spezzano, S.; Taquet, V.; Testi, L.; Theulé, P.; Ugliengo, P.; Vasyunin, A. I.; Viti, S.; Wiesenfeld, L.; Yamamoto, S.
2017-12-01
Complex organic molecules have been observed for decades in the interstellar medium. Some of them might be considered as small bricks of the macromolecules at the base of terrestrial life. It is hence particularly important to understand organic chemistry in Solar-like star-forming regions. In this article, we present a new observational project: Seeds Of Life In Space (SOLIS). This is a Large Project using the IRAM-NOEMA interferometer, and its scope is to image the emission of several crucial organic molecules in a sample of Solar-like star-forming regions in different evolutionary stages and environments. Here we report the first SOLIS results, obtained from analyzing the spectra of different regions of the Class 0 source NGC 1333-IRAS4A, the protocluster OMC-2 FIR4, and the shock site L1157-B1. The different regions were identified based on the images of formamide (NH2CHO) and cyanodiacetylene (HC5N) lines. We discuss the observed large diversity in the molecular and organic content, both on large (3000-10,000 au) and relatively small (300-1000 au) scales. Finally, we derive upper limits to the methoxy fractional abundance in the three observed regions of the same order of magnitude of that measured in a few cold prestellar objects, namely ˜ {10}-12-10-11 with respect to H2 molecules. Based on observations carried out under project number L15AA with the IRAM-NOEMA interferometer. IRAM is supported by INSU/CNRS (France), MPG (Germany), and IGN (Spain).
Visualizing the Complex Process for Deep Learning with an Authentic Programming Project
ERIC Educational Resources Information Center
Peng, Jun; Wang, Minhong; Sampson, Demetrios
2017-01-01
Project-based learning (PjBL) has been increasingly used to connect abstract knowledge and authentic tasks in educational practice, including computer programming education. Despite its promising effects on improving learning in multiple aspects, PjBL remains a struggle due to its complexity. Completing an authentic programming project involves a…
NASA Astrophysics Data System (ADS)
Cornillon, L.; Devilliers, C.; Behar-Lafenetre, S.; Ait-Zaid, S.; Berroth, K.; Bravo, A. C.
2017-11-01
Dealing with ceramic materials for more than two decades, Thales Alenia Space - France has identified Silicon Nitride Si3N4 as a high potential material for the manufacturing of stiff, stable and lightweight truss structure for future large telescopes. Indeed, for earth observation or astronomic observation, space mission requires more and more telescopes with high spatial resolution, which leads to the use of large primary mirrors, and a long distance between primary and secondary mirrors. Therefore current and future large space telescopes require a huge truss structure to hold and locate precisely the mirrors. Such large structure requires very strong materials with high specific stiffness and a low coefficient of thermal expansion (CTE). Based on the silicon nitride performances and on the know how of FCT Ingenieurkeramik to manufacture complex parts, Thales Alenia Space (TAS) has engaged, in cooperation with FCT, activities to develop and qualify silicon nitride parts for other applications for space projects.
NASA Astrophysics Data System (ADS)
Cornillon, L.; Devilliers, C.; Behar-Lafenetre, S.; Ait-Zaid, S.; Berroth, K.; Bravo, A. C.
2017-11-01
Dealing with ceramic materials for more than two decades, Thales Alenia Space - France has identified Silicon Nitride Si3N4 as a high potential material for the manufacturing of stiff, stable and lightweight truss structure for future large telescopes. Indeed, for earth observation or astronomic observation, space mission requires more and more telescopes with high spatial resolution, which leads to the use of large primary mirrors, and a long distance between primary and secondary mirrors. Therefore current and future large space telescopes require a huge truss structure to hold and locate precisely the mirrors. Such large structure requires very strong materials with high specific stiffness and a low coefficient of thermal expansion (CTE). Based on the silicon nitride performances and on the know how of FCT Ingenieurkeramik to manufacture complex parts, Thales Alenia Space (TAS) has engaged, in cooperation with FCT, activities to develop and qualify silicon nitride parts for other applications for space projects.
NASA Technical Reports Server (NTRS)
Caines, P. E.
1999-01-01
The work in this research project has been focused on the construction of a hierarchical hybrid control theory which is applicable to flight management systems. The motivation and underlying philosophical position for this work has been that the scale, inherent complexity and the large number of agents (aircraft) involved in an air traffic system imply that a hierarchical modelling and control methodology is required for its management and real time control. In the current work the complex discrete or continuous state space of a system with a small number of agents is aggregated in such a way that discrete (finite state machine or supervisory automaton) controlled dynamics are abstracted from the system's behaviour. High level control may then be either directly applied at this abstracted level, or, if this is in itself of significant complexity, further layers of abstractions may be created to produce a system with an acceptable degree of complexity at each level. By the nature of this construction, high level commands are necessarily realizable at lower levels in the system.
In Defense of the National Labs and Big-Budget Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodwin, J R
2008-07-29
The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tappedmore » in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure. Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider (LHC) at CERN, and the International Tokamak Experimental Reactor (ITER) in Cadarache, France, a magnetic-confinement fusion research project. The postWWII histories of particle and fusion physics contain remarkable examples of both international competition, with an emphasis on secrecy, and international cooperation, with an emphasis on shared knowledge and resources. Initiatives to share sometimes came from surprising directions. Most large-scale scientific projects have potential defense applications. NIF certainly does; it is primarily designed to create small-scale fusion explosions. Blue Gene/L operates in part in service to NIF, and in part to various defense projects. The most important defense projects include stewardship of the national nuclear weapons stockpile, and the proposed redesign and replacement of those weapons with fewer, safer, more reliable, longer-lived, and less apocalyptic warheads. Many well-meaning people will consider the optimal lifetime of a nuclear weapon to be zero, but most thoughtful people, when asked how much longer they think this nation will require them, will ask for some time to think. NIF is also designed to create exothermic small-scale fusion explosions. The malapropos 'exothermic' here is a convenience to cover a profusion of complexities, but the basic idea is that the explosions will create more recoverable energy than was used to create them. One can hope that the primary future benefits of success for NIF will be in cost-effective generation of electrical power through controlled small-scale fusion reactions, rather than in improved large-scale fusion explosions. Blue Gene/L also services climate research, genomic research, materials research, and a myriad of other computational problems that become more feasible, reliable, and precise the larger the number of computational nodes employed. Blue Gene/L has to be sited within a security complex for obvious reasons, but its value extends to the nation and the world. There is a duality here between large-scale scientific research machines and the supercomputers used to model them. An astounding example is illustrated in a graph released by EFDAJET, at Oxfordshire, UK, presently the largest operating magnetic-confinement fusion experiment. The graph shows plasma confinement times (an essential performance parameter) for all the major tokamaks in the international fusion program, over their existing lifetimes. The remarkable thing about the data is not so much confinement-time versus date or scale, but the fact that the data are given for both the computer model predictions and the actual experimental measurements, and the two are in phenomenal agreement over the extended range of scales. Supercomputer models, sometimes operating with the intricacy of Schroedinger's equation at quantum physical scales, have become a costly but enormously cost-saving tool.« less
Climate change impacts on selected global rangeland ecosystem services.
Boone, Randall B; Conant, Richard T; Sircely, Jason; Thornton, Philip K; Herrero, Mario
2018-03-01
Rangelands are Earth's dominant land cover and are important providers of ecosystem services. Reliance on rangelands is projected to grow, thus understanding the sensitivity of rangelands to future climates is essential. We used a new ecosystem model of moderate complexity that allows, for the first time, to quantify global changes expected in rangelands under future climates. The mean global annual net primary production (NPP) may decline by 10 g C m -2 year -1 in 2050 under Representative Concentration Pathway (RCP) 8.5, but herbaceous NPP is projected to increase slightly (i.e., average of 3 g C m -2 year -1 ). Responses vary substantially from place-to-place, with large increases in annual productivity projected in northern regions (e.g., a 21% increase in productivity in the US and Canada) and large declines in western Africa (-46% in sub-Saharan western Africa) and Australia (-17%). Soil organic carbon is projected to increase in Australia (9%), the Middle East (14%), and central Asia (16%) and decline in many African savannas (e.g., -18% in sub-Saharan western Africa). Livestock are projected to decline 7.5 to 9.6%, an economic loss of from $9.7 to $12.6 billion. Our results suggest that forage production in Africa is sensitive to changes in climate, which will have substantial impacts on the livelihoods of the more than 180 million people who raise livestock on those rangelands. Our approach and the simulation tool presented here offer considerable potential for forecasting future conditions, highlight regions of concern, and support analyses where costs and benefits of adaptations and policies may be quantified. Otherwise, the technical options and policy and enabling environment that are needed to facilitate widespread adaptation may be very difficult to elucidate. © 2017 John Wiley & Sons Ltd.
Economic evaluation on CO₂-EOR of onshore oil fields in China
Wei, Ning; Li, Xiaochun; Dahowski, Robert T.; ...
2015-06-01
Carbon dioxide enhanced oil recovery (CO₂-EOR) and sequestration in depleted oil reservoirs is a plausible option for utilizing anthropogenic CO₂ to increase oil production while storing CO₂ underground. Evaluation of the storage resources and cost of potential CO₂-EOR projects is an essential step before the commencement of large-scale deployment of such activities. In this paper, a hybrid techno-economic evaluation method, including a performance model and cost model for onshore CO₂-EOR projects, has been developed based on previous studies. Total 296 onshore oil fields, accounting for about 70% of total mature onshore oil fields in China, were evaluated by the techno-economicmore » method. The key findings of this study are summarized as follows: (1) deterministic analysis shows there are approximately 1.1 billion tons (7.7 billion barrels) of incremental crude oil and 2.2 billion tons CO₂ storage resource for onshore CO₂-EOR at net positive revenue within the Chinese oil fields reviewed under the given operating strategy and economic assumptions. (2) Sensitivity study highlights that the cumulative oil production and cumulative CO₂ storage resource are very sensitive to crude oil price, CO₂ cost, project lifetime, discount rate and tax policy. High oil price, short project lifetime, low discount rate, low CO₂ cost, and low tax policy can greatly increase the net income of the oil enterprise, incremental oil recovery and CO₂ storage resource. (3) From this techno-economic evaluation, the major barriers to large-scale deployment of CO₂-EOR include complex geological conditions, low API of crude oil, high tax policy, and lack of incentives for the CO₂-EOR project.« less
Economic evaluation on CO₂-EOR of onshore oil fields in China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Ning; Li, Xiaochun; Dahowski, Robert T.
Carbon dioxide enhanced oil recovery (CO₂-EOR) and sequestration in depleted oil reservoirs is a plausible option for utilizing anthropogenic CO₂ to increase oil production while storing CO₂ underground. Evaluation of the storage resources and cost of potential CO₂-EOR projects is an essential step before the commencement of large-scale deployment of such activities. In this paper, a hybrid techno-economic evaluation method, including a performance model and cost model for onshore CO₂-EOR projects, has been developed based on previous studies. Total 296 onshore oil fields, accounting for about 70% of total mature onshore oil fields in China, were evaluated by the techno-economicmore » method. The key findings of this study are summarized as follows: (1) deterministic analysis shows there are approximately 1.1 billion tons (7.7 billion barrels) of incremental crude oil and 2.2 billion tons CO₂ storage resource for onshore CO₂-EOR at net positive revenue within the Chinese oil fields reviewed under the given operating strategy and economic assumptions. (2) Sensitivity study highlights that the cumulative oil production and cumulative CO₂ storage resource are very sensitive to crude oil price, CO₂ cost, project lifetime, discount rate and tax policy. High oil price, short project lifetime, low discount rate, low CO₂ cost, and low tax policy can greatly increase the net income of the oil enterprise, incremental oil recovery and CO₂ storage resource. (3) From this techno-economic evaluation, the major barriers to large-scale deployment of CO₂-EOR include complex geological conditions, low API of crude oil, high tax policy, and lack of incentives for the CO₂-EOR project.« less
Fast alternating projection methods for constrained tomographic reconstruction
Liu, Li; Han, Yongxin
2017-01-01
The alternating projection algorithms are easy to implement and effective for large-scale complex optimization problems, such as constrained reconstruction of X-ray computed tomography (CT). A typical method is to use projection onto convex sets (POCS) for data fidelity, nonnegative constraints combined with total variation (TV) minimization (so called TV-POCS) for sparse-view CT reconstruction. However, this type of method relies on empirically selected parameters for satisfactory reconstruction and is generally slow and lack of convergence analysis. In this work, we use a convex feasibility set approach to address the problems associated with TV-POCS and propose a framework using full sequential alternating projections or POCS (FS-POCS) to find the solution in the intersection of convex constraints of bounded TV function, bounded data fidelity error and non-negativity. The rationale behind FS-POCS is that the mathematically optimal solution of the constrained objective function may not be the physically optimal solution. The breakdown of constrained reconstruction into an intersection of several feasible sets can lead to faster convergence and better quantification of reconstruction parameters in a physical meaningful way than that in an empirical way of trial-and-error. In addition, for large-scale optimization problems, first order methods are usually used. Not only is the condition for convergence of gradient-based methods derived, but also a primal-dual hybrid gradient (PDHG) method is used for fast convergence of bounded TV. The newly proposed FS-POCS is evaluated and compared with TV-POCS and another convex feasibility projection method (CPTV) using both digital phantom and pseudo-real CT data to show its superior performance on reconstruction speed, image quality and quantification. PMID:28253298
EpiCollect+: linking smartphones to web applications for complex data collection projects
Aanensen, David M.; Huntley, Derek M.; Menegazzo, Mirko; Powell, Chris I.; Spratt, Brian G.
2014-01-01
Previously, we have described the development of the generic mobile phone data gathering tool, EpiCollect, and an associated web application, providing two-way communication between multiple data gatherers and a project database. This software only allows data collection on the phone using a single questionnaire form that is tailored to the needs of the user (including a single GPS point and photo per entry), whereas many applications require a more complex structure, allowing users to link a series of forms in a linear or branching hierarchy, along with the addition of any number of media types accessible from smartphones and/or tablet devices (e.g., GPS, photos, videos, sound clips and barcode scanning). A much enhanced version of EpiCollect has been developed (EpiCollect+). The individual data collection forms in EpiCollect+ provide more design complexity than the single form used in EpiCollect, and the software allows the generation of complex data collection projects through the ability to link many forms together in a linear (or branching) hierarchy. Furthermore, EpiCollect+ allows the collection of multiple media types as well as standard text fields, increased data validation and form logic. The entire process of setting up a complex mobile phone data collection project to the specification of a user (project and form definitions) can be undertaken at the EpiCollect+ website using a simple ‘drag and drop’ procedure, with visualisation of the data gathered using Google Maps and charts at the project website. EpiCollect+ is suitable for situations where multiple users transmit complex data by mobile phone (or other Android devices) to a single project web database and is already being used for a range of field projects, particularly public health projects in sub-Saharan Africa. However, many uses can be envisaged from education, ecology and epidemiology to citizen science. PMID:25485096
EpiCollect+: linking smartphones to web applications for complex data collection projects.
Aanensen, David M; Huntley, Derek M; Menegazzo, Mirko; Powell, Chris I; Spratt, Brian G
2014-01-01
Previously, we have described the development of the generic mobile phone data gathering tool, EpiCollect, and an associated web application, providing two-way communication between multiple data gatherers and a project database. This software only allows data collection on the phone using a single questionnaire form that is tailored to the needs of the user (including a single GPS point and photo per entry), whereas many applications require a more complex structure, allowing users to link a series of forms in a linear or branching hierarchy, along with the addition of any number of media types accessible from smartphones and/or tablet devices (e.g., GPS, photos, videos, sound clips and barcode scanning). A much enhanced version of EpiCollect has been developed (EpiCollect+). The individual data collection forms in EpiCollect+ provide more design complexity than the single form used in EpiCollect, and the software allows the generation of complex data collection projects through the ability to link many forms together in a linear (or branching) hierarchy. Furthermore, EpiCollect+ allows the collection of multiple media types as well as standard text fields, increased data validation and form logic. The entire process of setting up a complex mobile phone data collection project to the specification of a user (project and form definitions) can be undertaken at the EpiCollect+ website using a simple 'drag and drop' procedure, with visualisation of the data gathered using Google Maps and charts at the project website. EpiCollect+ is suitable for situations where multiple users transmit complex data by mobile phone (or other Android devices) to a single project web database and is already being used for a range of field projects, particularly public health projects in sub-Saharan Africa. However, many uses can be envisaged from education, ecology and epidemiology to citizen science.
Distance-Based Behaviors for Low-Complexity Control in Multiagent Robotics
NASA Astrophysics Data System (ADS)
Pierpaoli, Pietro
Several biological examples show that living organisms cooperate to collectively accomplish tasks impossible for single individuals. More importantly, this coordination is often achieved with a very limited set of information. Inspired by these observations, research on autonomous systems has focused on the development of distributed control techniques for control and guidance of groups of autonomous mobile agents, or robots. From an engineering perspective, when coordination and cooperation is sought in large ensembles of robotic vehicles, a reduction in hardware and algorithms' complexity becomes mandatory from the very early stages of the project design. The research for solutions capable of lowering power consumption, cost and increasing reliability are thus worth investigating. In this work, we studied low-complexity techniques to achieve cohesion and control on swarms of autonomous robots. Starting from an inspiring example with two-agents, we introduced effects of neighbors' relative positions on control of an autonomous agent. The extension of this intuition addressed the control of large ensembles of autonomous vehicles, and was applied in the form of a herding-like technique. To this end, a low-complexity distance-based aggregation protocol was defined. We first showed that our protocol produced a cohesion aggregation among the agent while avoiding inter-agent collisions. Then, a feedback leader-follower architecture was introduced for the control of the swarm. We also described how proximity measures and probability of collisions with neighbors can also be used as source of information in highly populated environments.
15 MW HArdware-in-the-loop Grid Simulation Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rigas, Nikolaos; Fox, John Curtiss; Collins, Randy
2014-10-31
The 15MW Hardware-in-the-loop (HIL) Grid Simulator project was to (1) design, (2) construct and (3) commission a state-of-the-art grid integration testing facility for testing of multi-megawatt devices through a ‘shared facility’ model open to all innovators to promote the rapid introduction of new technology in the energy market to lower the cost of energy delivered. The 15 MW HIL Grid Simulator project now serves as the cornerstone of the Duke Energy Electric Grid Research, Innovation and Development (eGRID) Center. This project leveraged the 24 kV utility interconnection and electrical infrastructure of the US DOE EERE funded WTDTF project at themore » Clemson University Restoration Institute in North Charleston, SC. Additionally, the project has spurred interest from other technology sectors, including large PV inverter and energy storage testing and several leading edge research proposals dealing with smart grid technologies, grid modernization and grid cyber security. The key components of the project are the power amplifier units capable of providing up to 20MW of defined power to the research grid. The project has also developed a one of a kind solution to performing fault ride-through testing by combining a reactive divider network and a large power converter into a hybrid method. This unique hybrid method of performing fault ride-through analysis will allow for the research team at the eGRID Center to investigate the complex differences between the alternative methods of performing fault ride-through evaluations and will ultimately further the science behind this testing. With the final goal of being able to perform HIL experiments and demonstration projects, the eGRID team undertook a significant challenge with respect to developing a control system that is capable of communicating with several different pieces of equipment with different communication protocols in real-time. The eGRID team developed a custom fiber optical network that is based upon FPGA hardware that allows for communication between the key real-time interfaces and reduces the latency between these interfaces to acceptable levels for HIL experiments.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoverson, Eric D.; Amonette, Alexandra
The Umatilla Anadromous Fisheries Habitat Project (UAFHP) is an ongoing effort to protect, enhance, and restore riparian and instream habitat for the natural production of anadromous salmonids in the Umatilla River Basin, Northeast Oregon. Flow quantity, water temperature, passage, and lack of in-stream channel complexity have been identified as the key limiting factors in the basin. During the 2008 Fiscal Year (FY) reporting period (February 1, 2008-January 31, 2009) primary project activities focused on improving instream and riparian habitat complexity, migrational passage, and restoring natural channel morphology and floodplain function. Eight primary fisheries habitat enhancement projects were implemented on Meachammore » Creek, Birch Creek, West Birch Creek, McKay Creek, West Fork Spring Hollow, and the Umatilla River. Specific restoration actions included: (1) rectifying one fish passage barrier on West Birch Creek; (2) participating in six projects planting 10,000 trees and seeding 3225 pounds of native grasses; (3) donating 1000 ft of fencing and 1208 fence posts and associated hardware for 3.6 miles of livestock exclusion fencing projects in riparian areas of West Birch and Meacham Creek, and for tree screens to protect against beaver damage on West Fork Spring Hollow Creek; (4) using biological control (insects) to reduce noxious weeds on three treatment areas covering five acres on Meacham Creek; (5) planning activities for a levee setback project on Meacham Creek. We participated in additional secondary projects as opportunities arose. Baseline and ongoing monitoring and evaluation activities were also completed on major project areas such as conducting photo point monitoring strategies activities at the Meacham Creek Large Wood Implementation Project site (FY2006) and at additional easements and planned project sites. Fish surveys and aquatic habitat inventories were conducted at project sites prior to implementation. Proper selection and implementation of the most effective site-specific habitat restoration plan, taking into consideration the unique characteristics of each project site, and conducted in cooperation with landowners and project partners, was of paramount importance to ensure each project's success. An Aquatic Habitat Inventory was conducted from river mile 0-8 on Isquulktpe Creek and the data collected was compared with data collected in 1994. Monitoring plans will continue throughout the duration of each project to oversee progression and inspire timely managerial actions. Twenty-seven conservation easements were maintained with 23 landowners. Permitting applications for planned project activities and biological opinions were written and approved. Project activities were based on a variety of fisheries monitoring techniques and habitat assessments used to determine existing conditions and identify factors limiting anadromous salmonid abundance in accordance with the Umatilla River Subbasin Salmon and Steelhead Production Plan (NPPC 1990) and the Final Umatilla Willow Subbasin Plan (Umatilla/Willow Subbasin Planning Team 2005).« less
Sparse distributed memory overview
NASA Technical Reports Server (NTRS)
Raugh, Mike
1990-01-01
The Sparse Distributed Memory (SDM) project is investigating the theory and applications of massively parallel computing architecture, called sparse distributed memory, that will support the storage and retrieval of sensory and motor patterns characteristic of autonomous systems. The immediate objectives of the project are centered in studies of the memory itself and in the use of the memory to solve problems in speech, vision, and robotics. Investigation of methods for encoding sensory data is an important part of the research. Examples of NASA missions that may benefit from this work are Space Station, planetary rovers, and solar exploration. Sparse distributed memory offers promising technology for systems that must learn through experience and be capable of adapting to new circumstances, and for operating any large complex system requiring automatic monitoring and control. Sparse distributed memory is a massively parallel architecture motivated by efforts to understand how the human brain works. Sparse distributed memory is an associative memory, able to retrieve information from cues that only partially match patterns stored in the memory. It is able to store long temporal sequences derived from the behavior of a complex system, such as progressive records of the system's sensory data and correlated records of the system's motor controls.
NASA Technical Reports Server (NTRS)
2005-01-01
[figure removed for brevity, see original site]
Our topic for the weeks of April 4 and April 11 is dunes on Mars. We will look at the north polar sand sea and at isolated dune fields at lower latitudes. Sand seas on Earth are often called 'ergs,' an Arabic name for dune field. A sand sea differs from a dune field in two ways: 1) a sand sea has a large regional extent, and 2) the individual dunes are large in size and complex in form. This VIS image was taken at 82 degrees North latitude during Northern spring. As with yesterday's image, the dunes are still partially frost covered. This region is part of the north polar erg (sand sea), note the complexity and regional coverage of the dunes. Image information: VIS instrument. Latitude 81.2, Longitude 118.2 East (241.8 West). 19 meter/pixel resolution. Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time. NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.NASA Astrophysics Data System (ADS)
Niswonger, R. G.; Huntington, J. L.; Dettinger, M. D.; Rajagopal, S.; Gardner, M.; Morton, C. G.; Reeves, D. M.; Pohll, G. M.
2013-12-01
Water resources in the Tahoe basin are susceptible to long-term climate change and extreme events because it is a middle-altitude, snow-dominated basin that experiences large inter-annual climate variations. Lake Tahoe provides critical water supply for its basin and downstream populations, but changes in water supply are obscured by complex climatic and hydrologic gradients across the high relief, geologically complex basin. An integrated surface and groundwater model of the Lake Tahoe basin has been developed using GSFLOW to assess the effects of climate change and extreme events on surface and groundwater resources. Key hydrologic mechanisms are identified with this model that explains recent changes in water resources of the region. Critical vulnerabilities of regional water-supplies and hazards also were explored. Maintaining a balance between (a) accurate representation of spatial features (e.g., geology, streams, and topography) and hydrologic response (i.e., groundwater, stream, lake, and wetland flows and storages), and (b) computational efficiency, is a necessity for the desired model applications. Potential climatic influences on water resources are analyzed here in simulations of long-term water-availability and flood responses to selected 100-year climate-model projections. GSFLOW is also used to simulate a scenario depicting an especially extreme storm event that was constructed from a combination of two historical atmospheric-river storm events as part of the USGS MultiHazards Demonstration Project. Historical simulated groundwater levels, streamflow, wetlands, and lake levels compare well with measured values for a 30-year historical simulation period. Results are consistent for both small and large model grid cell sizes, due to the model's ability to represent water table altitude, streams, and other hydrologic features at the sub-grid scale. Simulated hydrologic responses are affected by climate change, where less groundwater resources will be available during more frequent droughts. Simulated floods for the region indicate issues related to drainage in the developed areas around Lake Tahoe, and necessary dam releases that create downstream flood risks.
Implementing Large Projects in Software Engineering Courses
ERIC Educational Resources Information Center
Coppit, David
2006-01-01
In software engineering education, large projects are widely recognized as a useful way of exposing students to the real-world difficulties of team software development. But large projects are difficult to put into practice. First, educators rarely have additional time to manage software projects. Second, classrooms have inherent limitations that…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vavrinec, John; Borde, Amy B.; Woodruff, Dana L.
Unites States Navy capital improvement projects are designed to modernize and improve mission capacity. Such capital improvement projects often result in unavoidable environmental impacts by increasing over-water structures, which results in a loss of subtidal habitat within industrial areas of Navy bases. In the Pacific Northwest, compensatory mitigation often targets alleviating impacts to Endangered Species Act-listed salmon species. The complexity of restoring large systems requires limited resources to target successful and more coordinated mitigation efforts to address habitat loss and improvements in water quality that will clearly contribute to an improvement at the site scale and can then be linkedmore » to a cumulative net ecosystem improvement.« less
NASA Astrophysics Data System (ADS)
Zhadanovsky, Boris; Sinenko, Sergey
2018-03-01
Economic indicators of construction work, particularly in high-rise construction, are directly related to the choice of optimal number of machines. The shortage of machinery makes it impossible to complete the construction & installation work on scheduled time. Rates of performance of construction & installation works and labor productivity during high-rise construction largely depend on the degree of provision of construction project with machines (level of work mechanization). During calculation of the need for machines in construction projects, it is necessary to ensure that work is completed on scheduled time, increased level of complex mechanization, increased productivity and reduction of manual work, and improved usage and maintenance of machine fleet. The selection of machines and determination of their numbers should be carried out by using formulas presented in this work.
Martinez, Elizabeth A; Chavez-Valdez, Raul; Holt, Natalie F; Grogan, Kelly L; Khalifeh, Katherine W; Slater, Tammy; Winner, Laura E; Moyer, Jennifer; Lehmann, Christoph U
2011-01-01
Although the evidence strongly supports perioperative glycemic control among cardiac surgical patients, there is scant literature to describe the practical application of such a protocol in the complex ICU environment. This paper describes the use of the Lean Six Sigma methodology to implement a perioperative insulin protocol in a cardiac surgical intensive care unit (CSICU) in a large academic hospital. A preintervention chart audit revealed that fewer than 10% of patients were admitted to the CSICU with glucose <200 mg/dL, prompting the initiation of the quality improvement project. Following protocol implementation, more than 90% of patients were admitted with a glucose <200 mg/dL. Key elements to success include barrier analysis and intervention, provider education, and broadening the project scope to address the intraoperative period.
Martinez, Elizabeth A.; Chavez-Valdez, Raul; Holt, Natalie F.; Grogan, Kelly L.; Khalifeh, Katherine W.; Slater, Tammy; Winner, Laura E.; Moyer, Jennifer; Lehmann, Christoph U.
2011-01-01
Although the evidence strongly supports perioperative glycemic control among cardiac surgical patients, there is scant literature to describe the practical application of such a protocol in the complex ICU environment. This paper describes the use of the Lean Six Sigma methodology to implement a perioperative insulin protocol in a cardiac surgical intensive care unit (CSICU) in a large academic hospital. A preintervention chart audit revealed that fewer than 10% of patients were admitted to the CSICU with glucose <200 mg/dL, prompting the initiation of the quality improvement project. Following protocol implementation, more than 90% of patients were admitted with a glucose <200 mg/dL. Key elements to success include barrier analysis and intervention, provider education, and broadening the project scope to address the intraoperative period. PMID:22091218
A model of language inflection graphs
NASA Astrophysics Data System (ADS)
Fukś, Henryk; Farzad, Babak; Cao, Yi
2014-01-01
Inflection graphs are highly complex networks representing relationships between inflectional forms of words in human languages. For so-called synthetic languages, such as Latin or Polish, they have particularly interesting structure due to the abundance of inflectional forms. We construct the simplest form of inflection graphs, namely a bipartite graph in which one group of vertices corresponds to dictionary headwords and the other group to inflected forms encountered in a given text. We, then, study projection of this graph on the set of headwords. The projection decomposes into a large number of connected components, to be called word groups. Distribution of sizes of word group exhibits some remarkable properties, resembling cluster distribution in a lattice percolation near the critical point. We propose a simple model which produces graphs of this type, reproducing the desired component distribution and other topological features.
NASA Technical Reports Server (NTRS)
Mulenburg, Gerald M.
2000-01-01
Study of characteristics and relationships of project managers of complex projects in the National Aeronautics and Space Administration. Study is based on Research Design, Data Collection, Interviews, Case Studies, and Data Analysis across varying disciplines such as biological research, space research, advanced aeronautical test facilities, aeronautic flight demonstrations, and projects at different NASA centers to ensure that findings were not endemic to one type of project management, or to one Center's management philosophies. Each project is treated as a separate case with the primary data collected during semi-structured interviews with the project manager responsible for the overall project. Results of the various efforts show some definite similarities of characteristics and relationships among the project managers in the study. A model for how the project managers formulated and managed their projects is included.
Models and observations of Arctic melt ponds
NASA Astrophysics Data System (ADS)
Golden, K. M.
2016-12-01
During the Arctic melt season, the sea ice surface undergoes a striking transformation from vast expanses of snow covered ice to complex mosaics of ice and melt ponds. Sea ice albedo, a key parameter in climate modeling, is largely determined by the complex evolution of melt pond configurations. In fact, ice-albedo feedback has played a significant role in the recent declines of the summer Arctic sea ice pack. However, understanding melt pond evolution remains a challenge to improving climate projections. It has been found that as the ponds grow and coalesce, the fractal dimension of their boundaries undergoes a transition from 1 to about 2, around a critical length scale of 100 square meters in area. As the ponds evolve they take complex, self-similar shapes with boundaries resembling space-filling curves. I will outline how mathematical models of composite materials and statistical physics, such as percolation and Ising models, are being used to describe this evolution and predict key geometrical parameters that agree very closely with observations.
Levine, Rebecca S; Peterson, A Townsend; Benedict, Mark Q
2004-02-01
The distribution of the Anopheles gambiae complex of malaria vectors in Africa is uncertain due to under-sampling of vast regions. We use ecologic niche modeling to predict the potential distribution of three members of the complex (A. gambiae, A. arabiensis, and A. quadriannulatus) and demonstrate the statistical significance of the models. Predictions correspond well to previous estimates, but provide detail regarding spatial discontinuities in the distribution of A. gambiae s.s. that are consistent with population genetic studies. Our predictions also identify large areas of Africa where the presence of A. arabiensis is predicted, but few specimens have been obtained, suggesting under-sampling of the species. Finally, we project models developed from African distribution data for the late 1900s into the past and to South America to determine retrospectively whether the deadly 1929 introduction of A. gambiae sensu lato into Brazil was more likely that of A. gambiae sensu stricto or A. arabiensis.
NASA Astrophysics Data System (ADS)
Kariniotakis, G.; Anemos Team
2003-04-01
Objectives: Accurate forecasting of the wind energy production up to two days ahead is recognized as a major contribution for reliable large-scale wind power integration. Especially, in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS, is a new 3.5 years R&D project supported by the European Commission, that resembles research organizations and end-users with an important experience on the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations like complex terrain, extreme weather conditions, as well as to offshore prediction for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models gives emphasis to techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set-up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy. Appropriate physical and statistical prediction models are also developed for offshore wind farms taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits from the use of satellite radar images for modeling local weather patterns are investigated. A next generation forecasting software, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information Communication Technology (ICT) functionality and can operate both in stand alone, or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS) systems. Contribution: The project provides an advanced technology for wind resource forecasting applicable in a large scale: at a single wind farm, regional or national level and for both interconnected and island systems. A major milestone is the on-line operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will help consistently the increase of wind integration in two levels; in an operational level due to better management of wind farms, but also, it will contribute to increasing the installed capacity of wind farms. This is because accurate prediction of the resource reduces the risk of wind farm developers, who are then more willing to undertake new wind farm installations especially in a liberalized electricity market environment.
2018-01-01
Abstract The neocortex is composed of many distinct subtypes of neurons that must form precise subtype-specific connections to enable the cortex to perform complex functions. Callosal projection neurons (CPN) are the broad population of commissural neurons that connect the cerebral hemispheres via the corpus callosum (CC). Currently, how the remarkable diversity of CPN subtypes and connectivity is specified, and how they differentiate to form highly precise and specific circuits, are largely unknown. We identify in mouse that the lipid-bound scaffolding domain protein Caveolin 1 (CAV1) is specifically expressed by a unique subpopulation of Layer V CPN that maintain dual ipsilateral frontal projections to premotor cortex. CAV1 is expressed by over 80% of these dual projecting callosal/frontal projection neurons (CPN/FPN), with expression peaking early postnatally as axonal and dendritic targets are being reached and refined. CAV1 is localized to the soma and dendrites of CPN/FPN, a unique population of neurons that shares information both between hemispheres and with premotor cortex, suggesting function during postmitotic development and refinement of these neurons, rather than in their specification. Consistent with this, we find that Cav1 function is not necessary for the early specification of CPN/FPN, or for projecting to their dual axonal targets. CPN subtype-specific expression of Cav1 identifies and characterizes a first molecular component that distinguishes this functionally unique projection neuron population, a population that expands in primates, and is prototypical of additional dual and higher-order projection neuron subtypes. PMID:29379878
Orbital debris removal and salvage system
NASA Technical Reports Server (NTRS)
1990-01-01
Four Texas A&M University projects are discussed. The first project is a design to eliminate a majority of orbital debris. The Orbital Debris and Salvage System will push the smaller particles into lower orbits where their orbits will decay at a higher rate. This will be done by momentum transfer via laser. The salvageable satellites will be delivered to the Space Station by an Orbital Transfer Vehicle. The rest of the debris will be collected by Salvage I. The second project is the design of a space based satellite system to prevent the depletion of atmospheric ozone. The focus is on ozone depletion in the Antarctic. The plan is to use an orbiting solar array system designed to transmit microwaves at a frequency of 22 GHz over the region in order to dissipate polar stratospheric clouds that form during the months beginning in August and ending in October. The third project, Project Poseidon, involves a conceptual design of a space based hurricane control system consisting of a network of 21 low-orbiting laser platforms arranged in three rings designed to heat the upper atmosphere of a developing tropical depression. Fusion power plants are proposed to provide power for the lasers. The fourth project, Project Donatello, involves a proposed Mars exploration initiative for the year 2050. The project is a conceptual design for a futuristic superfreighter that will transport large numbers of people and supplies to Mars for the construction of a full scale scientific and manufacturing complex.
Residual motion compensation in ECG-gated interventional cardiac vasculature reconstruction
NASA Astrophysics Data System (ADS)
Schwemmer, C.; Rohkohl, C.; Lauritsch, G.; Müller, K.; Hornegger, J.
2013-06-01
Three-dimensional reconstruction of cardiac vasculature from angiographic C-arm CT (rotational angiography) data is a major challenge. Motion artefacts corrupt image quality, reducing usability for diagnosis and guidance. Many state-of-the-art approaches depend on retrospective ECG-gating of projection data for image reconstruction. A trade-off has to be made regarding the size of the ECG-gating window. A large temporal window is desirable to avoid undersampling. However, residual motion will occur in a large window, causing motion artefacts. We present an algorithm to correct for residual motion. Our approach is based on a deformable 2D-2D registration between the forward projection of an initial, ECG-gated reconstruction, and the original projection data. The approach is fully automatic and does not require any complex segmentation of vasculature, or landmarks. The estimated motion is compensated for during the backprojection step of a subsequent reconstruction. We evaluated the method using the publicly available CAVAREV platform and on six human clinical datasets. We found a better visibility of structure, reduced motion artefacts, and increased sharpness of the vessels in the compensated reconstructions compared to the initial reconstructions. At the time of writing, our algorithm outperforms the leading result of the CAVAREV ranking list. For the clinical datasets, we found an average reduction of motion artefacts by 13 ± 6%. Vessel sharpness was improved by 25 ± 12% on average.
Scientific, Social, and Institutional Constraints Facing Coastal Restoration in Louisiana
NASA Astrophysics Data System (ADS)
Kleiss, B.; Shabman, L. A.; Brown, G.
2017-12-01
Due to multiple stressors, including subsidence, accelerated sea level rise, canal construction, tropical storm damages, and basin-wide river management decisions, southern Louisiana is experiencing some of the world's highest rates of coastal land loss. Although ideas abound, the solutions proposed to mitigate for land loss are often uncertain, complex, expensive, and difficult. There are significant scientific uncertainties associated with fundamental processes including the spatial distribution of rates of subsidence, the anticipated impacts of increased inundation on marsh plant species and questions about the resilience of engineered solutions. Socially and politically, there is the need to balance navigation, flood risk management and environmental restoration with the fact that the land involved is largely privately owned and includes many communities and towns. And layered within this, there are federal and state regulatory constraints which seek to follow a myriad of existing State and Federal laws, protect the benefits realized from previous federal investments, and balance the conflicting interests of a large number of stakeholders. Additionally, current practice when implementing some environmental regulations is to assess impacts against the baseline of current conditions, not projected future, non-project conditions, making it difficult to receive a permit for projects which may have a short-term detriment, but hope for a long-term benefit. The resolution (or lack thereof) of these issues will serve to inform similar future struggles in other low lying coastal areas around the globe.
A collaborative virtual reality environment for neurosurgical planning and training.
Kockro, Ralf A; Stadie, Axel; Schwandt, Eike; Reisch, Robert; Charalampaki, Cleopatra; Ng, Ivan; Yeo, Tseng Tsai; Hwang, Peter; Serra, Luis; Perneczky, Axel
2007-11-01
We have developed a highly interactive virtual environment that enables collaborative examination of stereoscopic three-dimensional (3-D) medical imaging data for planning, discussing, or teaching neurosurgical approaches and strategies. The system consists of an interactive console with which the user manipulates 3-D data using hand-held and tracked devices within a 3-D virtual workspace and a stereoscopic projection system. The projection system displays the 3-D data on a large screen while the user is working with it. This setup allows users to interact intuitively with complex 3-D data while sharing this information with a larger audience. We have been using this system on a routine clinical basis and during neurosurgical training courses to collaboratively plan and discuss neurosurgical procedures with 3-D reconstructions of patient-specific magnetic resonance and computed tomographic imaging data or with a virtual model of the temporal bone. Working collaboratively with the 3-D information of a large, interactive, stereoscopic projection provides an unambiguous way to analyze and understand the anatomic spatial relationships of different surgical corridors. In our experience, the system creates a unique forum for open and precise discussion of neurosurgical approaches. We believe the system provides a highly effective way to work with 3-D data in a group, and it significantly enhances teaching of neurosurgical anatomy and operative strategies.
Tanaka, Masashi; Eynon, Nir; Bouchard, Claude; North, Kathryn N.; Williams, Alun G.; Collins, Malcolm; Britton, Steven L.; Fuku, Noriyuki; Ashley, Euan A.; Klissouras, Vassilis; Lucia, Alejandro; Ahmetov, Ildus I.; de Geus, Eco; Alsayrafi, Mohammed
2015-01-01
Despite numerous attempts to discover genetic variants associated with elite athletic performance, injury predisposition, and elite/world-class athletic status, there has been limited progress to date. Past reliance on candidate gene studies predominantly focusing on genotyping a limited number of single nucleotide polymorphisms or the insertion/deletion variants in small, often heterogeneous cohorts (i.e., made up of athletes of quite different sport specialties) have not generated the kind of results that could offer solid opportunities to bridge the gap between basic research in exercise sciences and deliverables in biomedicine. A retrospective view of genetic association studies with complex disease traits indicates that transition to hypothesis-free genome-wide approaches will be more fruitful. In studies of complex disease, it is well recognized that the magnitude of genetic association is often smaller than initially anticipated, and, as such, large sample sizes are required to identify the gene effects robustly. A symposium was held in Athens and on the Greek island of Santorini from 14–17 May 2015 to review the main findings in exercise genetics and genomics and to explore promising trends and possibilities. The symposium also offered a forum for the development of a position stand (the Santorini Declaration). Among the participants, many were involved in ongoing collaborative studies (e.g., ELITE, GAMES, Gene SMART, GENESIS, and POWERGENE). A consensus emerged among participants that it would be advantageous to bring together all current studies and those recently launched into one new large collaborative initiative, which was subsequently named the Athlome Project Consortium. PMID:26715623
Pitsiladis, Yannis P; Tanaka, Masashi; Eynon, Nir; Bouchard, Claude; North, Kathryn N; Williams, Alun G; Collins, Malcolm; Moran, Colin N; Britton, Steven L; Fuku, Noriyuki; Ashley, Euan A; Klissouras, Vassilis; Lucia, Alejandro; Ahmetov, Ildus I; de Geus, Eco; Alsayrafi, Mohammed
2016-03-01
Despite numerous attempts to discover genetic variants associated with elite athletic performance, injury predisposition, and elite/world-class athletic status, there has been limited progress to date. Past reliance on candidate gene studies predominantly focusing on genotyping a limited number of single nucleotide polymorphisms or the insertion/deletion variants in small, often heterogeneous cohorts (i.e., made up of athletes of quite different sport specialties) have not generated the kind of results that could offer solid opportunities to bridge the gap between basic research in exercise sciences and deliverables in biomedicine. A retrospective view of genetic association studies with complex disease traits indicates that transition to hypothesis-free genome-wide approaches will be more fruitful. In studies of complex disease, it is well recognized that the magnitude of genetic association is often smaller than initially anticipated, and, as such, large sample sizes are required to identify the gene effects robustly. A symposium was held in Athens and on the Greek island of Santorini from 14-17 May 2015 to review the main findings in exercise genetics and genomics and to explore promising trends and possibilities. The symposium also offered a forum for the development of a position stand (the Santorini Declaration). Among the participants, many were involved in ongoing collaborative studies (e.g., ELITE, GAMES, Gene SMART, GENESIS, and POWERGENE). A consensus emerged among participants that it would be advantageous to bring together all current studies and those recently launched into one new large collaborative initiative, which was subsequently named the Athlome Project Consortium. Copyright © 2016 the American Physiological Society.
Molecular inversion probe assay.
Absalan, Farnaz; Ronaghi, Mostafa
2007-01-01
We have described molecular inversion probe technologies for large-scale genetic analyses. This technique provides a comprehensive and powerful tool for the analysis of genetic variation and enables affordable, large-scale studies that will help uncover the genetic basis of complex disease and explain the individual variation in response to therapeutics. Major applications of the molecular inversion probes (MIP) technologies include targeted genotyping from focused regions to whole-genome studies, and allele quantification of genomic rearrangements. The MIP technology (used in the HapMap project) provides an efficient, scalable, and affordable way to score polymorphisms in case/control populations for genetic studies. The MIP technology provides the highest commercially available multiplexing levels and assay conversion rates for targeted genotyping. This enables more informative, genome-wide studies with either the functional (direct detection) approach or the indirect detection approach.
Flux Calculation Using CARIBIC DOAS Aircraft Measurements: SO2 Emission of Norilsk
NASA Technical Reports Server (NTRS)
Walter, D.; Heue, K.-P.; Rauthe-Schoech, A.; Brenninkmeijer, C. A. M.; Lamsal, L. N.; Krotkov, N. A.; Platt, U.
2012-01-01
Based on a case-study of the nickel smelter in Norilsk (Siberia), the retrieval of trace gas fluxes using airborne remote sensing is discussed. A DOAS system onboard an Airbus 340 detected large amounts of SO2 and NO2 near Norilsk during a regular passenger flight within the CARIBIC project. The remote sensing data were combined with ECMWF wind data to estimate the SO2 output of the Norilsk industrial complex to be around 1 Mt per year, which is in agreement with independent estimates. This value is compared to results using data from satellite remote sensing (GOME, OMI). The validity of the assumptions underlying our estimate is discussed, including the adaptation of this method to other gases and sources like the NO2 emissions of large industries or cities.
NASA Astrophysics Data System (ADS)
Yucel, M.; Sueishi, T.; Inagaki, A.; Kanda, M.
2017-12-01
`Great Garuda' project is an eagle-shaped offshore structure with 17 artificial islands. This project has been designed for the coastal protection and land reclamation of Jakarta due to catastrophic flooding in the city. It offers an urban generation for 300.000 inhabitants and 600.000 workers in addition to its water safety goal. A broad coalition of Indonesian scientists has criticized the project for being negative impacts on the surrounding environment. Despite the vast research by Indonesian scientist on maritime environment, studies on wind and thermal environment over built-up area are still lacking. However, the construction of the various islands off the coast may result changes in wind patterns and thermal environment due to the alteration of the coastline and urbanization in the Jakarta Bay. Therefore, it is important to understand the airflow within the urban canopy in case of unpredictable gust events. These gust events may occur through the closely-packed high-rise buildings and pedestrians may be harmed from such gusts. Accordingly, we used numerical simulations to investigate the impact of the sea wall and the artificial islands over built-up area and, the intensity of wind gusts at the pedestrian level. Considering the fact that the size of turbulence organized structure sufficiently large computational domain is required. Therefore, a 19.2km×4.8km×1.0 km simulation domain with 2-m resolution in all directions was created to explicitly resolve the detailed shapes of buildings and the flow at the pedestrian level. This complex computation was accomplished by implementing a large-eddy simulation (LES) model. Two case studies were conducted considering the effect of realistic surface roughness and upward heat flux. Case_1 was conducted based on the current built environment and Case_2 for investigating the effect of the project on the chosen coastal region of the city. Fig.1 illustrates the schematic of the large-eddy simulation domains of two cases with and without Great Garuda Sea Wall and 17 artificial islands. 3D model of Great Garuda is shown in Fig.2. In addition to the cases mentioned above, the simulation will be generated assigning more realistic heat flux outputs from energy balance model and, inflow boundary conditions coupling with mesoscale model (Weather Research and Forecast model).
Risk management integration into complex project organizations
NASA Technical Reports Server (NTRS)
Fisher, K.; Greanias, G.; Rose, J.; Dumas, R.
2002-01-01
This paper describes the approach used in designing and adapting the SIRTF prototype, discusses some of the lessons learned in developing the SIRTF prototype, and explains the adaptability of the risk management database to varying levels project complexity.
Risk Reduction for Use of Complex Devices in Space Projects
NASA Technical Reports Server (NTRS)
Berg, Melanie; Poivey, Christian; Friendlich, Mark; Petrick, Dave; LaBel, Kenneth; Stansberry, Scott
2007-01-01
We present guidel!nes to reduce risk to an acceptable level when using complex devices in space applications. Application to Virtex 4 Field Programmable Gate Array (FPGA) on Express Logistic Carrier (ELC) project is presented.
Sligo, Judith; Gauld, Robin; Roberts, Vaughan; Villa, Luis
2017-01-01
Information technology is perceived as a potential panacea for healthcare organisations to manage pressure to improve services in the face of increased demand. However, the implementation and evaluation of health information systems (HIS) is plagued with problems and implementation shortcomings and failures are rife. HIS implementation is complex and relies on organisational, structural, technological, and human factors to be successful. It also requires reflective, nuanced, multidimensional evaluation to provide ongoing feedback to ensure success. This article provides a comprehensive review of the literature about evaluating and implementing HIS, detailing the challenges and recommendations for both evaluators and healthcare organisations. The factors that inhibit or promote successful HIS implementation are identified and effective evaluation strategies are described with the goal of informing teams evaluating complex HIS. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
1979-02-28
This mosaic of Jupiter was assembled from nine individual photos taken through an orange filter by Voyager 1 on Feb. 6, 1979, when the spacecraft was 4.7 million miles (7.8 million kilometers) from Jupiter. Distortion of the mosaic, especially where portions of the limb have been fitted together, is caused by rotation of the planet during the 96-second intervals between individual pictures. The large atmospheric feature just below and to the right of center is the Great Red Spot. The complex structure of the cloud formations seen over the entire planet gives some hint of the equally complex motions in the Voyager 1 time-lapse photography. The smallest atomospheric features seen in this view are approximately 85 miles (140 kilometers) across. Voyager project is managed and controlled by Jet Propulsion Laboratory for NASA's Office of Space Science. (JPL ref. No. P-21146)
Bias and robustness of uncertainty components estimates in transient climate projections
NASA Astrophysics Data System (ADS)
Hingray, Benoit; Blanchet, Juliette; Jean-Philippe, Vidal
2016-04-01
A critical issue in climate change studies is the estimation of uncertainties in projections along with the contribution of the different uncertainty sources, including scenario uncertainty, the different components of model uncertainty and internal variability. Quantifying the different uncertainty sources faces actually different problems. For instance and for the sake of simplicity, an estimate of model uncertainty is classically obtained from the empirical variance of the climate responses obtained for the different modeling chains. These estimates are however biased. Another difficulty arises from the limited number of members that are classically available for most modeling chains. In this case, the climate response of one given chain and the effect of its internal variability may be actually difficult if not impossible to separate. The estimate of scenario uncertainty, model uncertainty and internal variability components are thus likely to be not really robust. We explore the importance of the bias and the robustness of the estimates for two classical Analysis of Variance (ANOVA) approaches: a Single Time approach (STANOVA), based on the only data available for the considered projection lead time and a time series based approach (QEANOVA), which assumes quasi-ergodicity of climate outputs over the whole available climate simulation period (Hingray and Saïd, 2014). We explore both issues for a simple but classical configuration where uncertainties in projections are composed of two single sources: model uncertainty and internal climate variability. The bias in model uncertainty estimates is explored from theoretical expressions of unbiased estimators developed for both ANOVA approaches. The robustness of uncertainty estimates is explored for multiple synthetic ensembles of time series projections generated with MonteCarlo simulations. For both ANOVA approaches, when the empirical variance of climate responses is used to estimate model uncertainty, the bias is always positive. It can be especially high with STANOVA. In the most critical configurations, when the number of members available for each modeling chain is small (< 3) and when internal variability explains most of total uncertainty variance (75% or more), the overestimation is higher than 100% of the true model uncertainty variance. The bias can be considerably reduced with a time series ANOVA approach, owing to the multiple time steps accounted for. The longer the transient time period used for the analysis, the larger the reduction. When a quasi-ergodic ANOVA approach is applied to decadal data for the whole 1980-2100 period, the bias is reduced by a factor 2.5 to 20 depending on the projection lead time. In all cases, the bias is likely to be not negligible for a large number of climate impact studies resulting in a likely large overestimation of the contribution of model uncertainty to total variance. For both approaches, the robustness of all uncertainty estimates is higher when more members are available, when internal variability is smaller and/or the response-to-uncertainty ratio is higher. QEANOVA estimates are much more robust than STANOVA ones: QEANOVA simulated confidence intervals are roughly 3 to 5 times smaller than STANOVA ones. Excepted for STANOVA when less than 3 members is available, the robustness is rather high for total uncertainty and moderate for internal variability estimates. For model uncertainty or response-to-uncertainty ratio estimates, the robustness is conversely low for QEANOVA to very low for STANOVA. In the most critical configurations (small number of member, large internal variability), large over- or underestimation of uncertainty components is very thus likely. To propose relevant uncertainty analyses and avoid misleading interpretations, estimates of uncertainty components should be therefore bias corrected and ideally come with estimates of their robustness. This work is part of the COMPLEX Project (European Collaborative Project FP7-ENV-2012 number: 308601; http://www.complex.ac.uk/). Hingray, B., Saïd, M., 2014. Partitioning internal variability and model uncertainty components in a multimodel multireplicate ensemble of climate projections. J.Climate. doi:10.1175/JCLI-D-13-00629.1 Hingray, B., Blanchet, J. (revision) Unbiased estimators for uncertainty components in transient climate projections. J. Climate Hingray, B., Blanchet, J., Vidal, J.P. (revision) Robustness of uncertainty components estimates in climate projections. J.Climate
Community Learning Campus: It Takes a Simple Message to Build a Complex Project
ERIC Educational Resources Information Center
Pearson, George
2012-01-01
Education Canada asked Tom Thompson, president of Olds College and a prime mover behind the Community Learning Campus (CLC): What were the lessons learned from this unusually ambitious education project? Thompson mentions six lessons he learned from this complex project which include: (1) Dream big, build small, act now; (2) Keep a low profile at…
Theory-based practice in a major medical centre.
Alligood, Martha Raile
2011-11-01
This project was designed to improve care quality and nursing staff satisfaction. Nursing theory structures thought and action as demonstrated by evidence of improvement in complex health-care settings. Nursing administrators selected Modelling and Role-Modelling (MRM) for the theory-based practice goal in their strategic plan. An action research approach structured implementation of MRM in a 1-year consultation project in 2001-2002. Quality of health care improved according to national quality assessment ratings, as well as patient satisfaction and nurse satisfaction. Modelling and Role-Modelling demonstrated capacity to structure nursing thought and action in patient care in a major medical centre. Uniformity of patient care language was valued by nurses as well as by allied health providers who wished to learn the holistic MRM style of practice. The processes of MRM and action research contributed to project success. A positive health-care change project was carried out in a large medical centre with action research. Introducing MRM theory-based practice was a beneficial decision by nursing administration that improved care and nurse satisfaction. Attention to nursing practice stimulated career development among the nurses to pursue bachelors, masters, and doctoral degrees. © 2011 Blackwell Publishing Ltd.
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, E.; Sokolov, A. P.; Schlosser, C. A.; Scott, J. R.; Gao, X.
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an earth system model of intermediate complexity, with a two-dimensional zonal-mean atmosphere, to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three dimensional atmospheric model; and a statistical downscaling, where a pattern scaling algorithm uses climate-change patterns from 17 climate models. This framework allows for key sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections; climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate); natural variability; and structural uncertainty. Results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also nd that dierent initial conditions lead to dierences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider all sources of uncertainty when modeling climate impacts over Northern Eurasia.
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, Erwan; Sokolov, Andrei; Schlosser, Adam; Scott, Jeffery; Gao, Xiang
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity with a two-dimensional zonal-mean atmosphere to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model, and a statistical downscaling, where a pattern scaling algorithm uses climate change patterns from 17 climate models. This framework allows for four major sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections, climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate), natural variability, and structural uncertainty. The results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider these sources of uncertainty when modeling climate impacts over Northern Eurasia.
Dall, Timothy M; Gallo, Paul D; Chakrabarti, Ritasree; West, Terry; Semilla, April P; Storm, Michael V
2013-11-01
As the US population ages, the increasing prevalence of chronic disease and complex medical conditions will have profound implications for the future health care system. We projected future prevalence of selected diseases and health risk factors to model future demand for health care services for each person in a representative sample of the current and projected future population. Based on changing demographic characteristics and expanded medical coverage under the Affordable Care Act, we project that the demand for adult primary care services will grow by approximately 14 percent between 2013 and 2025. Vascular surgery has the highest projected demand growth (31 percent), followed by cardiology (20 percent) and neurological surgery, radiology, and general surgery (each 18 percent). Market indicators such as long wait times to obtain appointments suggest that the current supply of many specialists throughout the United States is inadequate to meet the current demand. Failure to train sufficient numbers and the correct mix of specialists could exacerbate already long wait times for appointments, reduce access to care for some of the nation's most vulnerable patients, and reduce patients' quality of life.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Hanchen, E-mail: jhc13@mails.tsinghua.edu.cn; Qiang, Maoshan, E-mail: qiangms@tsinghua.edu.cn; Lin, Peng, E-mail: celinpe@mail.tsinghua.edu.cn
Public opinion becomes increasingly salient in the ex post evaluation stage of large infrastructure projects which have significant impacts to the environment and the society. However, traditional survey methods are inefficient in collection and assessment of the public opinion due to its large quantity and diversity. Recently, Social media platforms provide a rich data source for monitoring and assessing the public opinion on controversial infrastructure projects. This paper proposes an assessment framework to transform unstructured online public opinions on large infrastructure projects into sentimental and topical indicators for enhancing practices of ex post evaluation and public participation. The framework usesmore » web crawlers to collect online comments related to a large infrastructure project and employs two natural language processing technologies, including sentiment analysis and topic modeling, with spatio-temporal analysis, to transform these comments into indicators for assessing online public opinion on the project. Based on the framework, we investigate the online public opinion of the Three Gorges Project on China's largest microblogging site, namely, Weibo. Assessment results present spatial-temporal distributions of post intensity and sentiment polarity, reveals major topics with different sentiments and summarizes managerial implications, for ex post evaluation of the world's largest hydropower project. The proposed assessment framework is expected to be widely applied as a methodological strategy to assess public opinion in the ex post evaluation stage of large infrastructure projects. - Highlights: • We developed a framework to assess online public opinion on large infrastructure projects with environmental impacts. • Indicators were built to assess post intensity, sentiment polarity and major topics of the public opinion. • We took the Three Gorges Project (TGP) as an example to demonstrate the effectiveness proposed framework. • We revealed spatial-temporal patterns of post intensity and sentiment polarity on the TGP. • We drew implications for a more in-depth understanding of the public opinion on large infrastructure projects.« less
Economic evaluation of environmental epidemiological projects in national industrial complexes.
Shin, Youngchul
2017-01-01
In this economic evaluation of environmental epidemiological monitoring projects, we analyzed the economic feasibility of these projects by determining the social cost and benefit of these projects and conducting a cost/benefit analysis. Here, the social cost was evaluated by converting annual budgets for these research and survey projects into present values. Meanwhile, the societal benefit of these projects was evaluated by using the contingent valuation method to estimate the willingness-to-pay of residents living in or near industrial complexes. In addition, the extent to which these projects reduced negative health effects (i.e., excess disease and premature death) was evaluated through expert surveys, and the analysis was conducted to reflect the unit of economic value, based on the cost of illness and benefit transfer method. The results were then used to calculate the benefit of these projects in terms of the decrease in negative health effects. For residents living near industrial complexes, the benefit/cost ratio was 1.44 in the analysis based on resident surveys and 5.17 in the analysis based on expert surveys. Thus, whichever method was used for the economic analysis, the economic feasibility of these projects was confirmed.
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1995-01-01
The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
NASA Astrophysics Data System (ADS)
Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.
2014-12-01
In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years and with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool has recently been developed to analyze this complex river system referred to as: 2D-Hydrodynamic Based Logic Modeling (2D-HBLM). This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables with balancing important limiting factors in determining the highest priority locations within the river corridor to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognizes that social, economic, and land use limiting factors may constrain restoration options (Bechie et. al. 2008). Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but rather the scientific strategy by which management needs embrace and apply in its decision framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du, Jincheng; Rimsza, Jessica; Deng, Lu
This NEUP Project aimed to generate accurate atomic structural models of nuclear waste glasses by using large-scale molecular dynamics-based computer simulations and to use these models to investigate self-diffusion behaviors, interfacial structures, and hydrated gel structures formed during dissolution of these glasses. The goal was to obtain realistic and accurate short and medium range structures of these complex oxide glasses, to provide a mechanistic understanding of the dissolution behaviors, and to generate reliable information with predictive power in designing nuclear waste glasses for long-term geological storage. Looking back of the research accomplishments of this project, most of the scientific goalsmore » initially proposed have been achieved through intensive research in the three and a half year period of the project. This project has also generated a wealth of scientific data and vibrant discussions with various groups through collaborations within and outside of this project. Throughout the project one book chapter and 14 peer reviewed journal publications have been generated (including one under review) and 16 presentations (including 8 invited talks) have been made to disseminate the results of this project in national and international conference. Furthermore, this project has trained several outstanding graduate students and young researchers for future workforce in nuclear related field, especially on nuclear waste immobilization. One postdoc and four PhD students have been fully or partially supported through the project with intensive training in the field material science and engineering with expertise on glass science and nuclear waste disposal« less
NASA Astrophysics Data System (ADS)
Murphy, K. W.; Ellis, A. W.; Skindlov, J. A.
2015-12-01
Water resource systems have provided vital support to transformative growth in the Southwest United States and the Phoenix, Arizona metropolitan area where the Salt River Project (SRP) currently satisfies 40% of the area's water demand from reservoir storage and groundwater. Large natural variability and expectations of climate changes have sensitized water management to risks posed by future periods of excess and drought. The conventional approach to impacts assessment has been downscaled climate model simulations translated through hydrologic models; but, scenario ranges enlarge as uncertainties propagate through sequential levels of modeling complexity. The research often does not reach the stage of specific impact assessments, rendering future projections frustratingly uncertain and unsuitable for complex decision-making. Alternatively, this study inverts the common approach by beginning with the threatened water system and proceeding backwards to the uncertain climate future. The methodology is built upon reservoir system response modeling to exhaustive time series of climate-driven net basin supply. A reservoir operations model, developed with SRP guidance, assesses cumulative response to inflow variability and change. Complete statistical analyses of long-term historical watershed climate and runoff data are employed for 10,000-year stochastic simulations, rendering the entire range of multi-year extremes with full probabilistic characterization. Sets of climate change projections are then translated by temperature sensitivity and precipitation elasticity into future inflow distributions that are comparatively assessed with the reservoir operations model. This approach provides specific risk assessments in pragmatic terms familiar to decision makers, interpretable within the context of long-range planning and revealing a clearer meaning of climate change projections for the region. As a transferable example achieving actionable findings, the approach can guide other communities confronting water resource planning challenges.
Systems engineering implementation in the preliminary design phase of the Giant Magellan Telescope
NASA Astrophysics Data System (ADS)
Maiten, J.; Johns, M.; Trancho, G.; Sawyer, D.; Mady, P.
2012-09-01
Like many telescope projects today, the 24.5-meter Giant Magellan Telescope (GMT) is truly a complex system. The primary and secondary mirrors of the GMT are segmented and actuated to support two operating modes: natural seeing and adaptive optics. GMT is a general-purpose telescope supporting multiple science instruments operated in those modes. GMT is a large, diverse collaboration and development includes geographically distributed teams. The need to implement good systems engineering processes for managing the development of systems like GMT becomes imperative. The management of the requirements flow down from the science requirements to the component level requirements is an inherently difficult task in itself. The interfaces must also be negotiated so that the interactions between subsystems and assemblies are well defined and controlled. This paper will provide an overview of the systems engineering processes and tools implemented for the GMT project during the preliminary design phase. This will include requirements management, documentation and configuration control, interface development and technical risk management. Because of the complexity of the GMT system and the distributed team, using web-accessible tools for collaboration is vital. To accomplish this GMTO has selected three tools: Cognition Cockpit, Xerox Docushare, and Solidworks Enterprise Product Data Management (EPDM). Key to this is the use of Cockpit for managing and documenting the product tree, architecture, error budget, requirements, interfaces, and risks. Additionally, drawing management is accomplished using an EPDM vault. Docushare, a documentation and configuration management tool is used to manage workflow of documents and drawings for the GMT project. These tools electronically facilitate collaboration in real time, enabling the GMT team to track, trace and report on key project metrics and design parameters.
AMMOS2: a web server for protein-ligand-water complexes refinement via molecular mechanics.
Labbé, Céline M; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O; Pajeva, Ilza; Miteva, Maria A
2017-07-03
AMMOS2 is an interactive web server for efficient computational refinement of protein-small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein-ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein-ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein-ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein-ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein-ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein-ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Photoinduced energy transfer in transition metal complex oligomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-04-01
The work we have done over the past three years has been directed toward the preparation, characterization and photophysical examination of mono- and bimetallic diimine complexes. The work is part of a broader project directed toward the development of stable, efficient, light harvesting arrays of transition metal complex chromophores. One focus has been the synthesis of rigid bis-bidentate and bis-tridentate bridging ligands. We have managed to make the ligand bphb in multigram quantities from inexpensive starting materials. The synthetic approach used has allowed us prepare a variety of other ligands which may have unique applications (vide infra). We have prepared,more » characterized and examined the photophysical behavior of Ru(II) and Re(I) complexes of the ligands. Energy donor/acceptor complexes of bphb have been prepared which exhibit nearly activationless energy transfer. Complexes of Ru(II) and Re(I) have also been prepared with other polyunsaturated ligands in which two different long lived ( > 50 ns) excited states exist; results of luminescence and transient absorbance measurements suggest the two states are metal-to-ligand charge transfer and ligand localized {pi}{r_arrow}{pi}* triplets. Finally, we have developed methods to prepare polymetallic complexes which are covalently bound to various surfaces. The long term objective of this work is to make light harvesting arrays for the sensitization of large band gap semiconductors. Details of this work are provided in the body of the report.« less
AMMOS2: a web server for protein–ligand–water complexes refinement via molecular mechanics
Labbé, Céline M.; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O.; Pajeva, Ilza
2017-01-01
Abstract AMMOS2 is an interactive web server for efficient computational refinement of protein–small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein–ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein–ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein–ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein–ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein–ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein–ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. PMID:28486703
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Williams, Dean; Aloisio, Giovanni
2016-04-01
In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide a workflow support, to enable the definition of complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background of high performance database management and OLAP systems to manage large scientific data sets. It also provides a native workflow management support, to define processing chains and workflows with tens to hundreds of data analytics operators to build real scientific use cases. With regard to interoperability aspects, the talk will present the contribution provided both to the RDA Working Group on Array Databases, and the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large scale climate model intercomparison data analysis experiments, for example: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as server-side, parallel analytics engine; and (4) applied on real CMIP5 data sets available through ESGF.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Glassman, Nanci A.; Affelder, Linda O.; Hecht, Laura M.; Kennedy, John M.; Barclay, Rebecca O.
1993-01-01
An exploratory study was conducted that investigated the influence of technical uncertainty and project complexity on information use by U.S. industry-affiliated aerospace engineers and scientists. The study utilized survey research in the form of a self-administered mail questionnaire. U.S. aerospace engineers and scientists on the Society of Automotive Engineers (SAE) mailing list served as the study population. The adjusted response rate was 67 percent. The survey instrument is appendix C to this report. Statistically significant relationships were found to exist between technical uncertainty, project complexity, and information use. Statistically significant relationships were found to exist between technical uncertainty, project complexity, and the use of federally funded aerospace R&D. The results of this investigation are relevant to researchers investigating information-seeking behavior of aerospace engineers. They are also relevant to R&D managers and policy planners concerned with transferring the results of federally funded aerospace R&D to the U.S. aerospace industry.
Dynamics of microbial growth and metabolic activity and their control by aeration.
Kalina, V
1993-01-01
The optimization of fermentation processes depends to a large extent on the modelling of microbial activity under complex environmental conditions where aeration is an important limiting and control factor. Simple relationships are used to establish the sensitivity of cultures to oxygen stress. Specific limitation coefficients which can be determined in laboratory reactors allow a projection to industrial operation and the definition of appropriate aeration and agitation profiles. Optimum control can be assured on the basis of directly measurable process parameters. This is shown for the case of ethanol production using S. cerevisiae at high cell dry weight concentrations.
The Contextualization of Archetypes: Clinical Template Governance.
Pedersen, Rune; Ulriksen, Gro-Hilde; Ellingsen, Gunnar
2015-01-01
This paper is a status report from a large-scale openEHR-based EPR project from the North Norway Regional Health Authority. It concerns the standardization of a regional ICT portfolio and the ongoing development of a new process oriented EPR systems encouraged by the unfolding of a national repository for openEHR archetypes. Subject of interest; the contextualization of clinical templates is governed over multiple national boundaries which is complex due to the dependency of clinical resources. From the outset of this, we are interested in how local, regional, and national organizers maneuver to standardize while applying OpenEHR technology.
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Clan Genomics and the Complex Architecture of Human Disease
Belmont, John W.; Boerwinkle, Eric
2013-01-01
Human diseases are caused by alleles that encompass the full range of variant types, from single-nucleotide changes to copy-number variants, and these variations span a broad frequency spectrum, from the very rare to the common. The picture emerging from analysis of whole-genome sequences, the 1000 Genomes Project pilot studies, and targeted genomic sequencing derived from very large sample sizes reveals an abundance of rare and private variants. One implication of this realization is that recent mutation may have a greater influence on disease susceptibility or protection than is conferred by variations that arose in distant ancestors. PMID:21962505
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.
2014-12-01
The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.
Multichannel reconfigurable measurement system for hot plasma diagnostics based on GEM-2D detector
NASA Astrophysics Data System (ADS)
Wojenski, A. J.; Kasprowicz, G.; Pozniak, K. T.; Byszuk, A.; Chernyshova, M.; Czarski, T.; Jablonski, S.; Juszczyk, B.; Zienkiewicz, P.
2015-12-01
In the future magnetically confined fusion research reactors (e.g. ITER tokamak), precise determination of the level of the soft X-ray radiation of plasma with temperature above 30 keV (around 350 mln K) will be very important in plasma parameters optimization. This paper presents the first version of a designed spectrography measurement system. The system is already installed at JET tokamak. Based on the experience gained from the project, the new generation of hardware for spectrography measurements, was designed and also described in the paper. The GEM detector readout structure was changed to 2D in order to perform measurements of i.e. laser generated plasma. The hardware structure of the system was redesigned in order to provide large number of high speed input channels. Finally, this paper also covers the issue of new control software, necessary to set-up a complete system of certain complexity and perform data acquisition. The main goal of the project was to develop a new version of the system, which includes upgraded structure and data transmission infrastructure (i.e. handling large number of measurement channels, high sampling rate).
Tissue-aware RNA-Seq processing and normalization for heterogeneous and sparse data.
Paulson, Joseph N; Chen, Cho-Yi; Lopes-Ramos, Camila M; Kuijjer, Marieke L; Platig, John; Sonawane, Abhijeet R; Fagny, Maud; Glass, Kimberly; Quackenbush, John
2017-10-03
Although ultrahigh-throughput RNA-Sequencing has become the dominant technology for genome-wide transcriptional profiling, the vast majority of RNA-Seq studies typically profile only tens of samples, and most analytical pipelines are optimized for these smaller studies. However, projects are generating ever-larger data sets comprising RNA-Seq data from hundreds or thousands of samples, often collected at multiple centers and from diverse tissues. These complex data sets present significant analytical challenges due to batch and tissue effects, but provide the opportunity to revisit the assumptions and methods that we use to preprocess, normalize, and filter RNA-Seq data - critical first steps for any subsequent analysis. We find that analysis of large RNA-Seq data sets requires both careful quality control and the need to account for sparsity due to the heterogeneity intrinsic in multi-group studies. We developed Yet Another RNA Normalization software pipeline (YARN), that includes quality control and preprocessing, gene filtering, and normalization steps designed to facilitate downstream analysis of large, heterogeneous RNA-Seq data sets and we demonstrate its use with data from the Genotype-Tissue Expression (GTEx) project. An R package instantiating YARN is available at http://bioconductor.org/packages/yarn .
Web based visualization of large climate data sets
Alder, Jay R.; Hostetler, Steven W.
2015-01-01
We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moniz, Ernest; Carr, Alan; Bethe, Hans
The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age, but with it the origin of today’s advancedmore » supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.« less
CaseWorld™: Interactive, media rich, multidisciplinary case based learning.
Gillham, David; Tucker, Katie; Parker, Steve; Wright, Victoria; Kargillis, Christina
2015-11-01
Nurse educators are challenged to keep up with highly specialised clinical practice, emerging research evidence, regulation requirements and rapidly changing information technology while teaching very large numbers of diverse students in a resource constrained environment. This complex setting provides the context for the CaseWorld project, which aims to simulate those aspects of clinical practice that can be represented by e-learning. This paper describes the development, implementation and evaluation of CaseWorld, a simulated learning environment that supports case based learning. CaseWorld provides nursing students with the opportunity to view unfolding authentic cases presented in a rich multimedia context. The first round of comprehensive summative evaluation of CaseWorld is discussed in the context of earlier formative evaluation, reference group input and strategies for integration of CaseWorld with subject content. This discussion highlights the unique approach taken in this project that involved simultaneous prototype development and large scale implementation, thereby necessitating strong emphasis on staff development, uptake and engagement. The lessons learned provide an interesting basis for further discussion of broad content sharing across disciplines and universities, and the contribution that local innovations can make to global education advancement. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
Moniz, Ernest; Carr, Alan; Bethe, Hans; Morrison, Phillip; Ramsay, Norman; Teller, Edward; Brixner, Berlyn; Archer, Bill; Agnew, Harold; Morrison, John
2018-01-16
The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age, but with it the origin of todayâs advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.
NASA Astrophysics Data System (ADS)
Guido, Z.
2017-12-01
Climate information is heralded as helping to build adaptive capacity, improve resource management, and contribute to more effective risk management. However, decision makers often find it challenging to use climate information for reasons attributed to a disconnect between technical experts who produce the information and end users. Consequently, many climate service projects are now applying an end-to-end approach that links information users and producers in the design, development, and delivery of services. This collaboration confronts obstacles that can undermine the objectives of the project. Despite this, few studies in the burgeoning field of climate services have assessed the challenges. To address this gap, I provide a reflective account and analysis of the collaborative challenges experienced in an ongoing, complex four-year project developing climate services for small-scale coffee producers in Jamaica. The project has involved diverse activities, including social data collection, research and development of information tools, periodic engagement with coffee sector representatives, and community-based trainings. Contributions to the project were made routinely by 18 individuals who represent 9 institutions located in three countries. These individuals work for academic and governmental organizations and bring expertise in anthropology, plant pathology, and climatology, among others. In spanning diverse disciplines, large geographic distances, and different cultures, the project team has navigated challenges in communication, problem framing, organizational agendas, disciplinary integration, and project management. I contextualize these experiences within research on transdisciplinary and team science, and share some perspectives on strategies to lessen their impact.
A Fast Projection-Based Algorithm for Clustering Big Data.
Wu, Yun; He, Zhiquan; Lin, Hao; Zheng, Yufei; Zhang, Jingfen; Xu, Dong
2018-06-07
With the fast development of various techniques, more and more data have been accumulated with the unique properties of large size (tall) and high dimension (wide). The era of big data is coming. How to understand and discover new knowledge from these data has attracted more and more scholars' attention and has become the most important task in data mining. As one of the most important techniques in data mining, clustering analysis, a kind of unsupervised learning, could group a set data into objectives(clusters) that are meaningful, useful, or both. Thus, the technique has played very important role in knowledge discovery in big data. However, when facing the large-sized and high-dimensional data, most of the current clustering methods exhibited poor computational efficiency and high requirement of computational source, which will prevent us from clarifying the intrinsic properties and discovering the new knowledge behind the data. Based on this consideration, we developed a powerful clustering method, called MUFOLD-CL. The principle of the method is to project the data points to the centroid, and then to measure the similarity between any two points by calculating their projections on the centroid. The proposed method could achieve linear time complexity with respect to the sample size. Comparison with K-Means method on very large data showed that our method could produce better accuracy and require less computational time, demonstrating that the MUFOLD-CL can serve as a valuable tool, at least may play a complementary role to other existing methods, for big data clustering. Further comparisons with state-of-the-art clustering methods on smaller datasets showed that our method was fastest and achieved comparable accuracy. For the convenience of most scholars, a free soft package was constructed.
NASA Technical Reports Server (NTRS)
2005-01-01
[figure removed for brevity, see original site]
Our topic for the weeks of April 4 and April 11 is dunes on Mars. We will look at the north polar sand sea and at isolated dune fields at lower latitudes. Sand seas on Earth are often called 'ergs,' an Arabic name for dune field. A sand sea differs from a dune field in two ways: 1) a sand sea has a large regional extent, and 2) the individual dunes are large in size and complex in form. A common location for dune fields on Mars is in the basin of large craters. This dune field is located in Holden Crater at 25 degrees South atitude. Image information: VIS instrument. Latitude -25.5, Longitude 326.8 East (33.2 West). 19 meter/pixel resolution. Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time. NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.NASA Astrophysics Data System (ADS)
Vardoulaki, Eleni; Faustino Jimenez Andrade, Eric; Delvecchio, Ivan; Karim, Alexander; Smolčić, Vernesa; Magnelli, Benjamin; Bertoldi, Frank; Schinnener, Eva; Sargent, Mark; Finoguenov, Alexis; VLA COSMOS Team
2018-01-01
The radio sources associated with active galactic nuclei (AGN) can exhibit a variety of radio structures, from simple to more complex, giving rise to a variety of classification schemes. The question which still remains open, given deeper surveys revealing new populations of radio sources, is whether this plethora of radio structures can be attributed to the physical properties of the host or to the environment. Here we present an analysis on the radio structure of radio-selected AGN from the VLA-COSMOS Large Project at 3 GHz (JVLA-COSMOS; Smolčić et al.) in relation to: 1) their linear projected size, 2) the Eddington ratio, and 3) the environment their hosts lie within. We classify these as FRI (jet-like) and FRII (lobe-like) based on the FR-type classification scheme, and compare them to a sample of jet-less radio AGN in JVLA-COSMOS. We measure their linear projected sizes using a semi-automatic machine learning technique. Their Eddington ratios are calculated from X-ray data available for COSMOS. As environmental probes we take the X-ray groups (hundreds kpc) and the density fields (~Mpc-scale) in COSMOS. We find that FRII radio sources are on average larger than FRIs, which agrees with literature. But contrary to past studies, we find no dichotomy in FR objects in JVLA-COSMOS given their Eddington ratios, as on average they exhibit similar values. Furthermore our results show that the large-scale environment does not explain the observed dichotomy in lobe- and jet-like FR-type objects as both types are found on similar environments, but it does affect the shape of the radio structure introducing bents for objects closer to the centre of an X-ray group.
Philipp, E E R; Kraemer, L; Mountfort, D; Schilhabel, M; Schreiber, S; Rosenstiel, P
2012-03-15
Next generation sequencing (NGS) technologies allow a rapid and cost-effective compilation of large RNA sequence datasets in model and non-model organisms. However, the storage and analysis of transcriptome information from different NGS platforms is still a significant bottleneck, leading to a delay in data dissemination and subsequent biological understanding. Especially database interfaces with transcriptome analysis modules going beyond mere read counts are missing. Here, we present the Transcriptome Analysis and Comparison Explorer (T-ACE), a tool designed for the organization and analysis of large sequence datasets, and especially suited for transcriptome projects of non-model organisms with little or no a priori sequence information. T-ACE offers a TCL-based interface, which accesses a PostgreSQL database via a php-script. Within T-ACE, information belonging to single sequences or contigs, such as annotation or read coverage, is linked to the respective sequence and immediately accessible. Sequences and assigned information can be searched via keyword- or BLAST-search. Additionally, T-ACE provides within and between transcriptome analysis modules on the level of expression, GO terms, KEGG pathways and protein domains. Results are visualized and can be easily exported for external analysis. We developed T-ACE for laboratory environments, which have only a limited amount of bioinformatics support, and for collaborative projects in which different partners work on the same dataset from different locations or platforms (Windows/Linux/MacOS). For laboratories with some experience in bioinformatics and programming, the low complexity of the database structure and open-source code provides a framework that can be customized according to the different needs of the user and transcriptome project.
ERIC Educational Resources Information Center
Curtis, Neil F.; And Others
1986-01-01
Discusses the need for student research-type chemistry projects based upon "unknown" metal complexes. Describes an experiment involving the product from the reaction between cobalt(II) chloride, ethane-1,2-diamine (en) and concentrated hydrochloric acid. Outlines the preparation of the cobalt complex, along with procedure, results and…
NASA Technical Reports Server (NTRS)
Kring, David A.; Zurcher, Lukas; Horz, Friedrich
2003-01-01
The Chicxulub Scientific Drilling Project recovered a continuous core from the Yaxcopoil-1 (YAX-1) borehole, which is approx.60-65 km from the center of the Chicxulub structure, approx.15 km beyond the limit of the estimated approx.50 km radius transient crater (excavation cavity), but within the rim of the estimated approx.90 km radius final crater. Approximately approx.100 m of melt-bearing impactites were recoverd from a depth of 794 to 895 m, above approx.600 m of underlying megablocks of Cretaceous target sediments, before bottoming at 1511 m. Compared to lithologies at impact craters like the Ries, the YAX-1 impactite sequence is incredibly rich in impact melts of unusual textural variety and complexity. The impactite sequence has also been altered by hydrothermal activity that may have largely been produced by the impact event.
The Many Faces of a Software Engineer in a Research Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marinovici, Maria C.; Kirkham, Harold
2013-10-14
The ability to gather, analyze and make decisions based on real world data is changing nearly every field of human endeavor. These changes are particularly challenging for software engineers working in a scientific community, designing and developing large, complex systems. To avoid the creation of a communications gap (almost a language barrier), the software engineers should possess an ‘adaptive’ skill. In the science and engineering research community, the software engineers must be responsible for more than creating mechanisms for storing and analyzing data. They must also develop a fundamental scientific and engineering understanding of the data. This paper looks atmore » the many faces that a software engineer should have: developer, domain expert, business analyst, security expert, project manager, tester, user experience professional, etc. Observations made during work on a power-systems scientific software development are analyzed and extended to describe more generic software development projects.« less
Coordinated scheduling for dynamic real-time systems
NASA Technical Reports Server (NTRS)
Natarajan, Swaminathan; Zhao, Wei
1994-01-01
In this project, we addressed issues in coordinated scheduling for dynamic real-time systems. In particular, we concentrated on design and implementation of a new distributed real-time system called R-Shell. The design objective of R-Shell is to provide computing support for space programs that have large, complex, fault-tolerant distributed real-time applications. In R-shell, the approach is based on the concept of scheduling agents, which reside in the application run-time environment, and are customized to provide just those resource management functions which are needed by the specific application. With this approach, we avoid the need for a sophisticated OS which provides a variety of generalized functionality, while still not burdening application programmers with heavy responsibility for resource management. In this report, we discuss the R-Shell approach, summarize the achievement of the project, and describe a preliminary prototype of R-Shell system.
The African Genome Variation Project shapes medical genetics in Africa
Gurdasani, Deepti; Carstensen, Tommy; Tekola-Ayele, Fasil; Pagani, Luca; Tachmazidou, Ioanna; Hatzikotoulas, Konstantinos; Karthikeyan, Savita; Iles, Louise; Pollard, Martin O.; Choudhury, Ananyo; Ritchie, Graham R. S.; Xue, Yali; Asimit, Jennifer; Nsubuga, Rebecca N.; Young, Elizabeth H.; Pomilla, Cristina; Kivinen, Katja; Rockett, Kirk; Kamali, Anatoli; Doumatey, Ayo P.; Asiki, Gershim; Seeley, Janet; Sisay-Joof, Fatoumatta; Jallow, Muminatou; Tollman, Stephen; Mekonnen, Ephrem; Ekong, Rosemary; Oljira, Tamiru; Bradman, Neil; Bojang, Kalifa; Ramsay, Michele; Adeyemo, Adebowale; Bekele, Endashaw; Motala, Ayesha; Norris, Shane A.; Pirie, Fraser; Kaleebu, Pontiano; Kwiatkowski, Dominic; Tyler-Smith, Chris; Rotimi, Charles; Zeggini, Eleftheria; Sandhu, Manjinder S.
2014-01-01
Given the importance of Africa to studies of human origins and disease susceptibility, detailed characterisation of African genetic diversity is needed. The African Genome Variation Project (AGVP) provides a resource to help design, implement and interpret genomic studies in sub-Saharan Africa (SSA) and worldwide. The AGVP represents dense genotypes from 1,481 and whole genome sequences (WGS) from 320 individuals across SSA. Using this resource, we find novel evidence of complex, regionally distinct hunter-gatherer and Eurasian admixture across SSA. We identify new loci under selection, including for malaria and hypertension. We show that modern imputation panels can identify association signals at highly differentiated loci across populations in SSA. Using WGS, we show further improvement in imputation accuracy supporting efforts for large-scale sequencing of diverse African haplotypes. Finally, we present an efficient genotype array design capturing common genetic variation in Africa, showing for the first time that such designs are feasible. PMID:25470054
NASA Astrophysics Data System (ADS)
Sass, J. P.; Fesmire, J. E.; Nagy, Z. F.; Sojourner, S. J.; Morris, D. L.; Augustynowicz, S. D.
2008-03-01
A technology demonstration test project was conducted by the Cryogenics Test Laboratory at the Kennedy Space Center (KSC) to provide comparative thermal performance data for glass microspheres, referred to as bubbles, and perlite insulation for liquid hydrogen tank applications. Two identical 1/15th scale versions of the 3,200,000 liter spherical liquid hydrogen tanks at Launch Complex 39 at KSC were custom designed and built to serve as test articles for this test project. Evaporative (boil-off) calorimeter test protocols, including liquid nitrogen and liquid hydrogen, were established to provide tank test conditions characteristic of the large storage tanks that support the Space Shuttle launch operations. This paper provides comparative thermal performance test results for bubbles and perlite for a wide range of conditions. Thermal performance as a function of cryogenic commodity (nitrogen and hydrogen), vacuum pressure, insulation fill level, tank liquid level, and thermal cycles will be presented.
ENT audit and research in the era of trainee collaboratives.
Smith, Matthew E; Hardman, John; Ellis, Matthew; Williams, Richard J
2018-05-26
Large surgical audits and research projects are complex and costly to deliver, but increasingly surgical trainees are delivering these projects within formal collaboratives and research networks. Surgical trainee collaboratives are now recognised as a valuable part of the research infrastructure, with many perceived benefits for both the trainees and the wider surgical speciality. In this article, we describe the activity of ENT trainee research collaboratives within the UK, and summarise how INTEGRATE, the UK National ENT Trainee Research Network, successfully delivered a national audit of epistaxis management. The prospective audit collected high-quality data from 1826 individuals, representing 94% of all cases that met the inclusion criteria at the 113 participating sites over the 30-day audit period. It is hoped that the audit has provided a template for subsequent high-quality and cost-effective national studies, and we discuss the future possibilities for ENT trainee research collaboratives.
Idea Paper: The Lifecycle of Software for Scientific Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubey, Anshu; McInnes, Lois C.
The software lifecycle is a well researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific applicationmore » software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development processes and provides needed lifecycle guidance to the scientific software community.« less
An automated performance budget estimator: a process for use in instrumentation
NASA Astrophysics Data System (ADS)
Laporte, Philippe; Schnetler, Hermine; Rees, Phil
2016-08-01
Current day astronomy projects continue to increase in size and are increasingly becoming more complex, regardless of the wavelength domain, while risks in terms of safety, cost and operability have to be reduced to ensure an affordable total cost of ownership. All of these drivers have to be considered carefully during the development process of an astronomy project at the same time as there is a big drive to shorten the development life-cycle. From the systems engineering point of view, this evolution is a significant challenge. Big instruments imply management of interfaces within large consortia and dealing with tight design phase schedules which necessitate efficient and rapid interactions between all the stakeholders to firstly ensure that the system is defined correctly and secondly that the designs will meet all the requirements. It is essential that team members respond quickly such that the time available for the design team is maximised. In this context, performance prediction tools can be very helpful during the concept phase of a project to help selecting the best design solution. In the first section of this paper we present the development of such a prediction tool that can be used by the system engineer to determine the overall performance of the system and to evaluate the impact on the science based on the proposed design. This tool can also be used in "what-if" design analysis to assess the impact on the overall performance of the system based on the simulated numbers calculated by the automated system performance prediction tool. Having such a tool available from the beginning of a project can allow firstly for a faster turn-around between the design engineers and the systems engineer and secondly, between the systems engineer and the instrument scientist. Following the first section we described the process for constructing a performance estimator tool, followed by describing three projects in which such a tool has been utilised to illustrate how such a tool have been used in astronomy projects. The three use-cases are; EAGLE, one of the European Extremely Large Telescope (E-ELT) Multi-Object Spectrograph (MOS) instruments that was studied from 2007 to 2009, the Multi-Object Optical and Near-Infrared Spectrograph (MOONS) for the European Southern Observatory's Very Large Telescope (VLT), currently under development and SST-GATE.
The African Genome Variation Project shapes medical genetics in Africa
NASA Astrophysics Data System (ADS)
Gurdasani, Deepti; Carstensen, Tommy; Tekola-Ayele, Fasil; Pagani, Luca; Tachmazidou, Ioanna; Hatzikotoulas, Konstantinos; Karthikeyan, Savita; Iles, Louise; Pollard, Martin O.; Choudhury, Ananyo; Ritchie, Graham R. S.; Xue, Yali; Asimit, Jennifer; Nsubuga, Rebecca N.; Young, Elizabeth H.; Pomilla, Cristina; Kivinen, Katja; Rockett, Kirk; Kamali, Anatoli; Doumatey, Ayo P.; Asiki, Gershim; Seeley, Janet; Sisay-Joof, Fatoumatta; Jallow, Muminatou; Tollman, Stephen; Mekonnen, Ephrem; Ekong, Rosemary; Oljira, Tamiru; Bradman, Neil; Bojang, Kalifa; Ramsay, Michele; Adeyemo, Adebowale; Bekele, Endashaw; Motala, Ayesha; Norris, Shane A.; Pirie, Fraser; Kaleebu, Pontiano; Kwiatkowski, Dominic; Tyler-Smith, Chris; Rotimi, Charles; Zeggini, Eleftheria; Sandhu, Manjinder S.
2015-01-01
Given the importance of Africa to studies of human origins and disease susceptibility, detailed characterization of African genetic diversity is needed. The African Genome Variation Project provides a resource with which to design, implement and interpret genomic studies in sub-Saharan Africa and worldwide. The African Genome Variation Project represents dense genotypes from 1,481 individuals and whole-genome sequences from 320 individuals across sub-Saharan Africa. Using this resource, we find novel evidence of complex, regionally distinct hunter-gatherer and Eurasian admixture across sub-Saharan Africa. We identify new loci under selection, including loci related to malaria susceptibility and hypertension. We show that modern imputation panels (sets of reference genotypes from which unobserved or missing genotypes in study sets can be inferred) can identify association signals at highly differentiated loci across populations in sub-Saharan Africa. Using whole-genome sequencing, we demonstrate further improvements in imputation accuracy, strengthening the case for large-scale sequencing efforts of diverse African haplotypes. Finally, we present an efficient genotype array design capturing common genetic variation in Africa.
The HEP.TrkX Project: deep neural networks for HL-LHC online and offline tracking
Farrell, Steven; Anderson, Dustin; Calafiura, Paolo; ...
2017-08-08
Particle track reconstruction in dense environments such as the detectors of the High Luminosity Large Hadron Collider (HL-LHC) is a challenging pattern recognition problem. Traditional tracking algorithms such as the combinatorial Kalman Filter have been used with great success in LHC experiments for years. However, these state-of-the-art techniques are inherently sequential and scale poorly with the expected increases in detector occupancy in the HL-LHC conditions. The HEP.TrkX project is a pilot project with the aim to identify and develop cross-experiment solutions based on machine learning algorithms for track reconstruction. Machine learning algorithms bring a lot of potential to this problemmore » thanks to their capability to model complex non-linear data dependencies, to learn effective representations of high-dimensional data through training, and to parallelize easily on high-throughput architectures such as GPUs. This contribution will describe our initial explorations into this relatively unexplored idea space. Furthermore, we will discuss the use of recurrent (LSTM) and convolutional neural networks to find and fit tracks in toy detector data.« less
The African Genome Variation Project shapes medical genetics in Africa.
Gurdasani, Deepti; Carstensen, Tommy; Tekola-Ayele, Fasil; Pagani, Luca; Tachmazidou, Ioanna; Hatzikotoulas, Konstantinos; Karthikeyan, Savita; Iles, Louise; Pollard, Martin O; Choudhury, Ananyo; Ritchie, Graham R S; Xue, Yali; Asimit, Jennifer; Nsubuga, Rebecca N; Young, Elizabeth H; Pomilla, Cristina; Kivinen, Katja; Rockett, Kirk; Kamali, Anatoli; Doumatey, Ayo P; Asiki, Gershim; Seeley, Janet; Sisay-Joof, Fatoumatta; Jallow, Muminatou; Tollman, Stephen; Mekonnen, Ephrem; Ekong, Rosemary; Oljira, Tamiru; Bradman, Neil; Bojang, Kalifa; Ramsay, Michele; Adeyemo, Adebowale; Bekele, Endashaw; Motala, Ayesha; Norris, Shane A; Pirie, Fraser; Kaleebu, Pontiano; Kwiatkowski, Dominic; Tyler-Smith, Chris; Rotimi, Charles; Zeggini, Eleftheria; Sandhu, Manjinder S
2015-01-15
Given the importance of Africa to studies of human origins and disease susceptibility, detailed characterization of African genetic diversity is needed. The African Genome Variation Project provides a resource with which to design, implement and interpret genomic studies in sub-Saharan Africa and worldwide. The African Genome Variation Project represents dense genotypes from 1,481 individuals and whole-genome sequences from 320 individuals across sub-Saharan Africa. Using this resource, we find novel evidence of complex, regionally distinct hunter-gatherer and Eurasian admixture across sub-Saharan Africa. We identify new loci under selection, including loci related to malaria susceptibility and hypertension. We show that modern imputation panels (sets of reference genotypes from which unobserved or missing genotypes in study sets can be inferred) can identify association signals at highly differentiated loci across populations in sub-Saharan Africa. Using whole-genome sequencing, we demonstrate further improvements in imputation accuracy, strengthening the case for large-scale sequencing efforts of diverse African haplotypes. Finally, we present an efficient genotype array design capturing common genetic variation in Africa.
The HEP.TrkX Project: deep neural networks for HL-LHC online and offline tracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, Steven; Anderson, Dustin; Calafiura, Paolo
Particle track reconstruction in dense environments such as the detectors of the High Luminosity Large Hadron Collider (HL-LHC) is a challenging pattern recognition problem. Traditional tracking algorithms such as the combinatorial Kalman Filter have been used with great success in LHC experiments for years. However, these state-of-the-art techniques are inherently sequential and scale poorly with the expected increases in detector occupancy in the HL-LHC conditions. The HEP.TrkX project is a pilot project with the aim to identify and develop cross-experiment solutions based on machine learning algorithms for track reconstruction. Machine learning algorithms bring a lot of potential to this problemmore » thanks to their capability to model complex non-linear data dependencies, to learn effective representations of high-dimensional data through training, and to parallelize easily on high-throughput architectures such as GPUs. This contribution will describe our initial explorations into this relatively unexplored idea space. Furthermore, we will discuss the use of recurrent (LSTM) and convolutional neural networks to find and fit tracks in toy detector data.« less
The HEP.TrkX Project: deep neural networks for HL-LHC online and offline tracking
NASA Astrophysics Data System (ADS)
Farrell, Steven; Anderson, Dustin; Calafiura, Paolo; Cerati, Giuseppe; Gray, Lindsey; Kowalkowski, Jim; Mudigonda, Mayur; Prabhat; Spentzouris, Panagiotis; Spiropoulou, Maria; Tsaris, Aristeidis; Vlimant, Jean-Roch; Zheng, Stephan
2017-08-01
Particle track reconstruction in dense environments such as the detectors of the High Luminosity Large Hadron Collider (HL-LHC) is a challenging pattern recognition problem. Traditional tracking algorithms such as the combinatorial Kalman Filter have been used with great success in LHC experiments for years. However, these state-of-the-art techniques are inherently sequential and scale poorly with the expected increases in detector occupancy in the HL-LHC conditions. The HEP.TrkX project is a pilot project with the aim to identify and develop cross-experiment solutions based on machine learning algorithms for track reconstruction. Machine learning algorithms bring a lot of potential to this problem thanks to their capability to model complex non-linear data dependencies, to learn effective representations of high-dimensional data through training, and to parallelize easily on high-throughput architectures such as GPUs. This contribution will describe our initial explorations into this relatively unexplored idea space. We will discuss the use of recurrent (LSTM) and convolutional neural networks to find and fit tracks in toy detector data.
A Case Review: Integrating Lewin’s Theory with Lean’s System Approach for Change
Wojciechowski, Elizabeth; Pearsall, Tabitha; Murphy, Patricia; French, Eileen
2016-05-31
The complexity of healthcare calls for interprofessional collaboration to improve and sustain the best outcomes for safe and high quality patient care. Historically, rehabilitation nursing has been an area that relies heavily on interprofessional relationships. Professionals from various disciplines often subscribe to different change management theories for continuous quality improvement. Through a case review, authors describe how a large, Midwestern, rehabilitation hospital used the crosswalk methodology to facilitate interprofessional collaboration and develop an intervention model for implementing and sustaining bedside shift reporting. The authors provide project background and offer a brief overview of the two common frameworks used in this project, Lewin’s Three-Step Model for Change and the Lean Systems Approach. The description of the bedside shift report project methods demonstrates that multiple disciplines are able to utilize a common framework for leading and sustaining change to support outcomes of high quality and safe care, and capitalize on the opportunities of multiple views and discipline-specific approaches. The conclusion discusses outcomes, future initiatives, and implications for nursing practice.
Lorenz, D; Armbruster, W; Vogelgesang, C; Hoffmann, H; Pattar, A; Schmidt, D; Volk, T; Kubulus, D
2016-09-01
Chief emergency physicians are regarded as an important element in the care of the injured and sick following mass casualty accidents. Their education is very theoretical; practical content in contrast often falls short. Limitations are usually the very high costs of realistic (large-scale) exercises, poor reproducibility of the scenarios, and poor corresponding results. To substantially improve the educational level because of the complexity of mass casualty accidents, modified training concepts are required that teach the not only the theoretical but above all the practical skills considerably more intensively than at present. Modern training concepts should make it possible for the learner to realistically simulate decision processes. This article examines how interactive virtual environments are applicable for the education of emergency personnel and how they could be designed. Virtual simulation and training environments offer the possibility of simulating complex situations in an adequately realistic manner. The so-called virtual reality (VR) used in this context is an interface technology that enables free interaction in addition to a stereoscopic and spatial representation of virtual large-scale emergencies in a virtual environment. Variables in scenarios such as the weather, the number wounded, and the availability of resources, can be changed at any time. The trainees are able to practice the procedures in many virtual accident scenes and act them out repeatedly, thereby testing the different variants. With the aid of the "InSitu" project, it is possible to train in a virtual reality with realistically reproduced accident situations. These integrated, interactive training environments can depict very complex situations on a scale of 1:1. Because of the highly developed interactivity, the trainees can feel as if they are a direct part of the accident scene and therefore identify much more with the virtual world than is possible with desktop systems. Interactive, identifiable, and realistic training environments based on projector systems could in future enable a repetitive exercise with changes within a decision tree, in reproducibility, and within different occupational groups. With a hard- and software environment numerous accident situations can be depicted and practiced. The main expense is the creation of the virtual accident scenes. As the appropriate city models and other three-dimensional geographical data are already available, this expenditure is very low compared with the planning costs of a large-scale exercise.
Some thoughts on the management of large, complex international space ventures
NASA Astrophysics Data System (ADS)
Lee, T. J.; Kutzer, Ants; Schneider, W. C.
The nations of the world have already collaborated on a number of joint space ventures of varying complexities. To name a few of the variations in management arrangements, the schemes have included the utilization of one nation's spacecraft to orbit another's experiment, the launch of another's spacecraft, the development of an offline article (such as the Spacelab), and the cooperative development of the Space Station Freedom (S.S. Freedom). Today, as the scope of the problems and solutions involved in establishing a permanently manned colony on the Moon and exploring the surface of Mars become clearer, the idea of a major sharing of the enormous tasks among the spacefaring nations seems more and more necessary and, indeed, required. For such a major, complex project, success depends upon the management as much as it does on the technology. If the project is not organized in a logical and workable manner with clear areas of responsibility and with an agreed-to chain of command, it is in as much jeopardy as it is if the resources are not available. It is vital that thought and analysis be put on this aspect of a "Mission from Planet Earth" early, to insure that the project is not divided into an impractical organizational structure and that agreements are not made which are unsound. As an example of the questions to be explored, the lead organization can take many forms. Clearly, there must be a recognized leader to make the many difficult programmatic decisions which will arise. The lead could be assigned to one nation; it could be assigned to a new international group; it could be assigned to a consortium; or granted to a committee. Each has implications and problems to be explored. This paper will open the discussions. It is the intent of this paper to begin the process based upon the authors' experiences in various international projects. It is to arouse interest and discussion not to select a final solution. Final solutions will depend upon capabilities, financial regards, and politics (some of the variables), and the management scheme will have to take all into account. The scheme selected must be firm, logical, and workable.
Enabling a Community to Dissect an Organism: Overview of the Neurospora Functional Genomics Project
Dunlap, Jay C.; Borkovich, Katherine A.; Henn, Matthew R.; Turner, Gloria E.; Sachs, Matthew S.; Glass, N. Louise; McCluskey, Kevin; Plamann, Michael; Galagan, James E.; Birren, Bruce W.; Weiss, Richard L.; Townsend, Jeffrey P.; Loros, Jennifer J.; Nelson, Mary Anne; Lambreghts, Randy; Colot, Hildur V.; Park, Gyungsoon; Collopy, Patrick; Ringelberg, Carol; Crew, Christopher; Litvinkova, Liubov; DeCaprio, Dave; Hood, Heather M.; Curilla, Susan; Shi, Mi; Crawford, Matthew; Koerhsen, Michael; Montgomery, Phil; Larson, Lisa; Pearson, Matthew; Kasuga, Takao; Tian, Chaoguang; Baştürkmen, Meray; Altamirano, Lorena; Xu, Junhuan
2013-01-01
A consortium of investigators is engaged in a functional genomics project centered on the filamentous fungus Neurospora, with an eye to opening up the functional genomic analysis of all the filamentous fungi. The overall goal of the four interdependent projects in this effort is to acccomplish functional genomics, annotation, and expression analyses of Neurospora crassa, a filamentous fungus that is an established model for the assemblage of over 250,000 species of nonyeast fungi. Building from the completely sequenced 43-Mb Neurospora genome, Project 1 is pursuing the systematic disruption of genes through targeted gene replacements, phenotypic analysis of mutant strains, and their distribution to the scientific community at large. Project 2, through a primary focus in Annotation and Bioinformatics, has developed a platform for electronically capturing community feedback and data about the existing annotation, while building and maintaining a database to capture and display information about phenotypes. Oligonucleotide-based microarrays created in Project 3 are being used to collect baseline expression data for the nearly 11,000 distinguishable transcripts in Neurospora under various conditions of growth and development, and eventually to begin to analyze the global effects of loss of novel genes in strains created by Project 1. cDNA libraries generated in Project 4 document the overall complexity of expressed sequences in Neurospora, including alternative splicing alternative promoters and antisense transcripts. In addition, these studies have driven the assembly of an SNP map presently populated by nearly 300 markers that will greatly accelerate the positional cloning of genes. PMID:17352902
Environmental projects. Volume 2: Underground storage tanks compliance program
NASA Technical Reports Server (NTRS)
Kushner, L.
1987-01-01
Six large parabolic dish antennas are located at the Goldstone Deep Space Communications Complex north of Barstow, California. As a large-scale facility located in a remote, isolated desert region, the GDSCC operations require numerous on-site storage facilities for gasoline, diesel and hydraulic oil. These essential fluids are stored in underground storage tanks (USTs). Because USTs may develop leaks with the resultant seepage of their hazardous contents into the surrounding soil, local, State and Federal authorities have adopted stringent regulations for the testing and maintenance of USTs. Under the supervision of JPL's Office of Telecommunications and Data Acquisition, a year-long program has brought 27 USTs at the Goldstone Complex into compliance with Federal, State of California and County of San Bernadino regulations. Of these 27 USTs, 15 are operating today, 11 have been temporary closed down, and 1 abandoned in place. In 1989, the 15 USTs now operating at the Goldstone DSCC will be replaced either by modern, double-walled USTs equipped with automatic sensors for leak detection, or by above ground storage tanks. The 11 inactivated USTs are to be excavated, removed and disposed of according to regulation.
NASA Technical Reports Server (NTRS)
Bengelsdorf, Irv
1991-01-01
The Goldstone Deep Space Communications Complex (GDSCC), located in the Mojave Desert about 40 miles north of Barstow, California, and about 160 miles northeast of Pasadena, is part of the National Aeronautics and Space Administration's (NASA's) Deep Space Network, one of the world's largest and most sensitive scientific telecommunications and radio navigation networks. Activities at the GDSCC are carried out in support of six large parabolic dish antennas. As a large-scale facility located in a remote, isolated desert region, the GDSCC operations require numerous on-site storage facilities for gasoline, diesel oil, hydraulic oil, and waste oil. These fluids are stored in underground storage tanks (USTs). This present volume describes what happened to the 26 USTs that remained at the GDSCC. Twenty-four of these USTs were constructed of carbon steel without any coating for corrosion protection, and without secondary containment or leak detection. Two remaining USTs were constructed of fiberglass-coated carbon steel but without secondary containment or leak protection. Of the 26 USTs that remained at the GDSCC, 23 were cleaned, removed from the ground, cut up, and hauled away from the GDSCC for environmentally acceptable disposal. Three USTs were permanently closed (abandoned in place).
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. H. Titus, S. Avasaralla, A.Brooks, R. Hatcher
2010-09-22
The National Spherical Torus Experiment (NSTX) project is planning upgrades to the toroidal field, plasma current and pulse length. This involves the replacement of the center-stack, including the inner legs of the TF, OH, and inner PF coils. A second neutral beam will also be added. The increased performance of the upgrade requires qualification of the remaining components including the vessel, passive plates, and divertor for higher disruption loads. The hardware needing qualification is more complex than is typically accessible by large scale electromagnetic (EM) simulations of the plasma disruptions. The usual method is to include simplified representations of componentsmore » in the large EM models and attempt to extract forces to apply to more detailed models. This paper describes a more efficient approach of combining comprehensive modeling of the plasma and tokamak conducting structures, using the 2D OPERA code, with much more detailed treatment of individual components using ANSYS electromagnetic (EM) and mechanical analysis. This capture local eddy currents and resulting loads in complex details, and allows efficient non-linear, and dynamic structural analyses.« less
Goonesekere, Nalin Cw
2009-01-01
The large numbers of protein sequences generated by whole genome sequencing projects require rapid and accurate methods of annotation. The detection of homology through computational sequence analysis is a powerful tool in determining the complex evolutionary and functional relationships that exist between proteins. Homology search algorithms employ amino acid substitution matrices to detect similarity between proteins sequences. The substitution matrices in common use today are constructed using sequences aligned without reference to protein structure. Here we present amino acid substitution matrices constructed from the alignment of a large number of protein domain structures from the structural classification of proteins (SCOP) database. We show that when incorporated into the homology search algorithms BLAST and PSI-blast, the structure-based substitution matrices enhance the efficacy of detecting remote homologs.
Problems in merging Earth sensing satellite data sets
NASA Technical Reports Server (NTRS)
Smith, Paul H.; Goldberg, Michael J.
1987-01-01
Satellite remote sensing systems provide a tremendous source of data flow to the Earth science community. These systems provide scientists with data of types and on a scale previously unattainable. Looking forward to the capabilities of Space Station and the Earth Observing System (EOS), the full realization of the potential of satellite remote sensing will be handicapped by inadequate information systems. There is a growing emphasis in Earth science research to ask questions which are multidisciplinary in nature and global in scale. Many of these research projects emphasize the interactions of the land surface, the atmosphere, and the oceans through various physical mechanisms. Conducting this research requires large and complex data sets and teams of multidisciplinary scientists, often working at remote locations. A review of the problems of merging these large volumes of data into spatially referenced and manageable data sets is presented.
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1991-01-01
The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.
Full-color, large area, transmissive holograms enabled by multi-level diffractive optics.
Mohammad, Nabil; Meem, Monjurul; Wan, Xiaowen; Menon, Rajesh
2017-07-19
We show that multi-level diffractive microstructures can enable broadband, on-axis transmissive holograms that can project complex full-color images, which are invariant to viewing angle. Compared to alternatives like metaholograms, diffractive holograms utilize much larger minimum features (>10 µm), much smaller aspect ratios (<0.2) and thereby, can be fabricated in a single lithography step over relatively large areas (>30 mm ×30 mm). We designed, fabricated and characterized holograms that encode various full-color images. Our devices demonstrate absolute transmission efficiencies of >86% across the visible spectrum from 405 nm to 633 nm (peak value of about 92%), and excellent color fidelity. Furthermore, these devices do not exhibit polarization dependence. Finally, we emphasize that our devices exhibit negligible absorption and are phase-only holograms with high diffraction efficiency.
Four experimental demonstrations of active vibration control for flexible structures
NASA Technical Reports Server (NTRS)
Phillips, Doug; Collins, Emmanuel G., Jr.
1990-01-01
Laboratory experiments designed to test prototype active-vibration-control systems under development for future flexible space structures are described, summarizing previously reported results. The control-synthesis technique employed for all four experiments was the maximum-entropy optimal-projection (MEOP) method (Bernstein and Hyland, 1988). Consideration is given to: (1) a pendulum experiment on large-amplitude LF dynamics; (2) a plate experiment on broadband vibration suppression in a two-dimensional structure; (3) a multiple-hexagon experiment combining the factors studied in (1) and (2) to simulate the complexity of a large space structure; and (4) the NASA Marshall ACES experiment on a lightweight deployable 45-foot beam. Extensive diagrams, drawings, graphs, and photographs are included. The results are shown to validate the MEOP design approach, demonstrating that good performance is achievable using relatively simple low-order decentralized controllers.
ChelomEx: Isotope-assisted discovery of metal chelates in complex media using high-resolution LC-MS.
Baars, Oliver; Morel, François M M; Perlman, David H
2014-11-18
Chelating agents can control the speciation and reactivity of trace metals in biological, environmental, and laboratory-derived media. A large number of trace metals (including Fe, Cu, Zn, Hg, and others) show characteristic isotopic fingerprints that can be exploited for the discovery of known and unknown organic metal complexes and related chelating ligands in very complex sample matrices using high-resolution liquid chromatography mass spectrometry (LC-MS). However, there is currently no free open-source software available for this purpose. We present a novel software tool, ChelomEx, which identifies isotope pattern-matched chromatographic features associated with metal complexes along with free ligands and other related adducts in high-resolution LC-MS data. High sensitivity and exclusion of false positives are achieved by evaluation of the chromatographic coherence of the isotope pattern within chromatographic features, which we demonstrate through the analysis of bacterial culture media. A built-in graphical user interface and compound library aid in identification and efficient evaluation of results. ChelomEx is implemented in MatLab. The source code, binaries for MS Windows and MAC OS X as well as test LC-MS data are available for download at SourceForge ( http://sourceforge.net/projects/chelomex ).
Task-phase-specific dynamics of basal forebrain neuronal ensembles
Tingley, David; Alexander, Andrew S.; Kolbu, Sean; de Sa, Virginia R.; Chiba, Andrea A.; Nitz, Douglas A.
2014-01-01
Cortically projecting basal forebrain neurons play a critical role in learning and attention, and their degeneration accompanies age-related impairments in cognition. Despite the impressive anatomical and cell-type complexity of this system, currently available data suggest that basal forebrain neurons lack complexity in their response fields, with activity primarily reflecting only macro-level brain states such as sleep and wake, onset of relevant stimuli and/or reward obtainment. The current study examined the spiking activity of basal forebrain neuron populations across multiple phases of a selective attention task, addressing, in particular, the issue of complexity in ensemble firing patterns across time. Clustering techniques applied to the full population revealed a large number of distinct categories of task-phase-specific activity patterns. Unique population firing-rate vectors defined each task phase and most categories of task-phase-specific firing had counterparts with opposing firing patterns. An analogous set of task-phase-specific firing patterns was also observed in a population of posterior parietal cortex neurons. Thus, consistent with the known anatomical complexity, basal forebrain population dynamics are capable of differentially modulating their cortical targets according to the unique sets of environmental stimuli, motor requirements, and cognitive processes associated with different task phases. PMID:25309352
NASA Astrophysics Data System (ADS)
Cheng, Lin; Yang, Yongqing; Li, Li; Sui, Xin
2018-06-01
This paper studies the finite-time hybrid projective synchronization of the drive-response complex networks. In the model, general transmission delays and distributed delays are also considered. By designing the adaptive intermittent controllers, the response network can achieve hybrid projective synchronization with the drive system in finite time. Based on finite-time stability theory and several differential inequalities, some simple finite-time hybrid projective synchronization criteria are derived. Two numerical examples are given to illustrate the effectiveness of the proposed method.
Support of an Active Science Project by a Large Information System: Lessons for the EOS Era
NASA Technical Reports Server (NTRS)
Angelici, Gary L.; Skiles, J. W.; Popovici, Lidia Z.
1993-01-01
The ability of large information systems to support the changing data requirements of active science projects is being tested in a NASA collaborative study. This paper briefly profiles both the active science project and the large information system involved in this effort and offers some observations about the effectiveness of the project support. This is followed by lessons that are important for those participating in large information systems that need to support active science projects or that make available the valuable data produced by these projects. We learned in this work that it is difficult for a large information system focused on long term data management to satisfy the requirements of an on-going science project. For example, in order to provide the best service, it is important for all information system staff to keep focused on the needs and constraints of the scientists in the development of appropriate services. If the lessons learned in this and other science support experiences are not applied by those involved with large information systems of the EOS (Earth Observing System) era, then the final data products produced by future science projects may not be robust or of high quality, thereby making the conduct of the project science less efficacious and reducing the value of these unique suites of data for future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, M. Hope; Truex, Mike; Freshley, Mark
Complex sites are defined as those with difficult subsurface access, deep and/or thick zones of contamination, large areal extent, subsurface heterogeneities that limit the effectiveness of remediation, or where long-term remedies are needed to address contamination (e.g., because of long-term sources or large extent). The Test Area North at the Idaho National Laboratory, developed for nuclear fuel operations and heavy metal manufacturing, is used as a case study. Liquid wastes and sludge from experimental facilities were disposed in an injection well, which contaminated the subsurface aquifer located deep within fractured basalt. The wastes included organic, inorganic, and low-level radioactive constituents,more » with the focus of this case study on trichloroethylene. The site is used as an example of a systems-based framework that provides a structured approach to regulatory processes established for remediation under existing regulations. The framework is intended to facilitate remedy decisions and implementation at complex sites where restoration may be uncertain, require long timeframes, or involve use of adaptive management approaches. The framework facilitates site, regulator, and stakeholder interactions during the remedial planning and implementation process by using a conceptual model description as a technical foundation for decisions, identifying endpoints, which are interim remediation targets or intermediate decision points on the path to an ultimate end, and maintaining protectiveness during the remediation process. At the Test Area North, using a structured approach to implementing concepts in the endpoint framework, a three-component remedy is largely functioning as intended and is projected to meet remedial action objectives by 2095 as required. The remedy approach is being adjusted as new data become available. The framework provides a structured process for evaluating and adjusting the remediation approach, allowing site owners, regulators, and stakeholders to manage contamination at complex sites where adaptive remedies are needed.« less
Evolving from bioinformatics in-the-small to bioinformatics in-the-large.
Parker, D Stott; Gorlick, Michael M; Lee, Christopher J
2003-01-01
We argue the significance of a fundamental shift in bioinformatics, from in-the-small to in-the-large. Adopting a large-scale perspective is a way to manage the problems endemic to the world of the small-constellations of incompatible tools for which the effort required to assemble an integrated system exceeds the perceived benefit of the integration. Where bioinformatics in-the-small is about data and tools, bioinformatics in-the-large is about metadata and dependencies. Dependencies represent the complexities of large-scale integration, including the requirements and assumptions governing the composition of tools. The popular make utility is a very effective system for defining and maintaining simple dependencies, and it offers a number of insights about the essence of bioinformatics in-the-large. Keeping an in-the-large perspective has been very useful to us in large bioinformatics projects. We give two fairly different examples, and extract lessons from them showing how it has helped. These examples both suggest the benefit of explicitly defining and managing knowledge flows and knowledge maps (which represent metadata regarding types, flows, and dependencies), and also suggest approaches for developing bioinformatics database systems. Generally, we argue that large-scale engineering principles can be successfully adapted from disciplines such as software engineering and data management, and that having an in-the-large perspective will be a key advantage in the next phase of bioinformatics development.
A segmentation algorithm based on image projection for complex text layout
NASA Astrophysics Data System (ADS)
Zhu, Wangsheng; Chen, Qin; Wei, Chuanyi; Li, Ziyang
2017-10-01
Segmentation algorithm is an important part of layout analysis, considering the efficiency advantage of the top-down approach and the particularity of the object, a breakdown of projection layout segmentation algorithm. Firstly, the algorithm will algorithm first partitions the text image, and divided into several columns, then for each column scanning projection, the text image is divided into several sub regions through multiple projection. The experimental results show that, this method inherits the projection itself and rapid calculation speed, but also can avoid the effect of arc image information page segmentation, and also can accurate segmentation of the text image layout is complex.
BAC sequencing using pooled methods.
Saski, Christopher A; Feltus, F Alex; Parida, Laxmi; Haiminen, Niina
2015-01-01
Shotgun sequencing and assembly of a large, complex genome can be both expensive and challenging to accurately reconstruct the true genome sequence. Repetitive DNA arrays, paralogous sequences, polyploidy, and heterozygosity are main factors that plague de novo genome sequencing projects that typically result in highly fragmented assemblies and are difficult to extract biological meaning. Targeted, sub-genomic sequencing offers complexity reduction by removing distal segments of the genome and a systematic mechanism for exploring prioritized genomic content through BAC sequencing. If one isolates and sequences the genome fraction that encodes the relevant biological information, then it is possible to reduce overall sequencing costs and efforts that target a genomic segment. This chapter describes the sub-genome assembly protocol for an organism based upon a BAC tiling path derived from a genome-scale physical map or from fine mapping using BACs to target sub-genomic regions. Methods that are described include BAC isolation and mapping, DNA sequencing, and sequence assembly.
NASA Astrophysics Data System (ADS)
Ping, Jinglei; Johnson, A. T. Charlie; A. T. Charlie Johnson Team
Conventional electrical methods for detecting charge transfer through protein pores perturb the electrostatic condition of the solution and chemical reactivity of the pore, and are not suitable to be used for complex biofluids. We developed a non-perturbative methodology ( fW input power) for quantifying trans-pore electrical current and detecting the pore status (i.e., open vs. closes) via graphene microelectrodes. Ferritin was used as a model protein featuring a large interior compartment, well-separated from the exterior solution with discrete pores as charge commuting channels. The charge flowing through the ferritin pores transfers into the graphene microelectrode and is recorded by an electrometer. In this example, our methodology enables the quantification of an inorganic nanoparticle-protein nanopore interaction in complex biofluids. The authors acknowledge the support from the Defense Advanced Research Projects Agency (DARPA) and the U.S. Army Research Office under Grant Number W911NF1010093.
Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations
NASA Technical Reports Server (NTRS)
Edwards, Laurence; Sims, Michael; Kunz, Clayton; Lees, David; Bowman, Judd
2005-01-01
Modern NASA planetary exploration missions employ complex systems of hardware and software managed by large teams of. engineers and scientists in order to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover mission. The Computational Sciences Division at NASA Ames Research Center delivered a 30 visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo-pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 30 terrain models during the primary MER mission. This paper describes Viz, the Stereo-pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.
Configuration Management at NASA
NASA Technical Reports Server (NTRS)
Doreswamy, Rajiv
2013-01-01
NASA programs are characterized by complexity, harsh environments and the fact that we usually have one chance to get it right. Programs last decades and need to accept new hardware and technology as it is developed. We have multiple suppliers and international partners Our challenges are many, our costs are high and our failures are highly visible. CM systems need to be scalable, adaptable to new technology and span the life cycle of the program (30+ years). Multiple Systems, Contractors and Countries added major levels of complexity to the ISS program and CM/DM and Requirements management systems center dot CM Systems need to be designed for long design life center dot Space Station Design started in 1984 center dot Assembly Complete in 2012 center dot Systems were developed on a task basis without an overall system perspective center dot Technology moves faster than a large project office, try to make sure you have a system that can adapt
Intelligent fault management for the Space Station active thermal control system
NASA Technical Reports Server (NTRS)
Hill, Tim; Faltisco, Robert M.
1992-01-01
The Thermal Advanced Automation Project (TAAP) approach and architecture is described for automating the Space Station Freedom (SSF) Active Thermal Control System (ATCS). The baseline functionally and advanced automation techniques for Fault Detection, Isolation, and Recovery (FDIR) will be compared and contrasted. Advanced automation techniques such as rule-based systems and model-based reasoning should be utilized to efficiently control, monitor, and diagnose this extremely complex physical system. TAAP is developing advanced FDIR software for use on the SSF thermal control system. The goal of TAAP is to join Knowledge-Based System (KBS) technology, using a combination of rules and model-based reasoning, with conventional monitoring and control software in order to maximize autonomy of the ATCS. TAAP's predecessor was NASA's Thermal Expert System (TEXSYS) project which was the first large real-time expert system to use both extensive rules and model-based reasoning to control and perform FDIR on a large, complex physical system. TEXSYS showed that a method is needed for safely and inexpensively testing all possible faults of the ATCS, particularly those potentially damaging to the hardware, in order to develop a fully capable FDIR system. TAAP therefore includes the development of a high-fidelity simulation of the thermal control system. The simulation provides realistic, dynamic ATCS behavior and fault insertion capability for software testing without hardware related risks or expense. In addition, thermal engineers will gain greater confidence in the KBS FDIR software than was possible prior to this kind of simulation testing. The TAAP KBS will initially be a ground-based extension of the baseline ATCS monitoring and control software and could be migrated on-board as additional computation resources are made available.
The European ALMA Regional Centre: a model of user support
NASA Astrophysics Data System (ADS)
Andreani, P.; Stoehr, F.; Zwaan, M.; Hatziminaoglou, E.; Biggs, A.; Diaz-Trigo, M.; Humphreys, E.; Petry, D.; Randall, S.; Stanke, T.; van Kampen, E.; Bárta, M.; Brand, J.; Gueth, F.; Hogerheijde, M.; Bertoldi, F.; Muxlow, T.; Richards, A.; Vlemmings, W.
2014-08-01
The ALMA Regional Centres (ARCs) form the interface between the ALMA observatory and the user community from the proposal preparation stage to the delivery of data and their subsequent analysis. The ARCs provide critical services to both the ALMA operations in Chile and to the user community. These services were split by the ALMA project into core and additional services. The core services are financed by the ALMA operations budget and are critical to the successful operation of ALMA. They are contractual obligations and must be delivered to the ALMA project. The additional services are not funded by the ALMA project and are not contractual obligations, but are critical to achieve ALMA full scientific potential. A distributed network of ARC nodes (with ESO being the central ARC) has been set up throughout Europe at the following seven locations: Bologna, Bonn-Cologne, Grenoble, Leiden, Manchester, Ondrejov, Onsala. These ARC nodes are working together with the central node at ESO and provide both core and additional services to the ALMA user community. This paper presents the European ARC, and how it operates in Europe to support the ALMA community. This model, although complex in nature, is turning into a very successful one, providing a service to the scientific community that has been so far highly appreciated. The ARC could become a reference support model in an age where very large collaborations are required to build large facilities, and support is needed for geographically and culturally diverse communities.
Achieving mask order processing automation, interoperability and standardization based on P10
NASA Astrophysics Data System (ADS)
Rodriguez, B.; Filies, O.; Sadran, D.; Tissier, Michel; Albin, D.; Stavroulakis, S.; Voyiatzis, E.
2007-02-01
Last year the MUSCLE (Masks through User's Supply Chain: Leadership by Excellence) project was presented. Here is the project advancement. A key process in mask supply chain management is the exchange of technical information for ordering masks. This process is large, complex, company specific and error prone, and leads to longer cycle times and higher costs due to missing or wrong inputs. Its automation and standardization could produce significant benefits. We need to agree on the standard for mandatory and optional parameters, and also a common way to describe parameters when ordering. A system was created to improve the performance in terms of Key Performance Indicators (KPIs) such as cycle time and cost of production. This tool allows us to evaluate and measure the effect of factors, as well as the effect of implementing the improvements of the complete project. Next, a benchmark study and a gap analysis were performed. These studies show the feasibility of standardization, as there is a large overlap in requirements. We see that the SEMI P10 standard needs enhancements. A format supporting the standard is required, and XML offers the ability to describe P10 in a flexible way. Beyond using XML for P10, the semantics of the mask order should also be addressed. A system design and requirements for a reference implementation for a P10 based management system are presented, covering a mechanism for the evolution and for version management and a design for P10 editing and data validation.
Mode Reduction and Upscaling of Reactive Transport Under Incomplete Mixing
NASA Astrophysics Data System (ADS)
Lester, D. R.; Bandopadhyay, A.; Dentz, M.; Le Borgne, T.
2016-12-01
Upscaling of chemical reactions in partially-mixed fluid environments is a challenging problem due to the detailed interactions between inherently nonlinear reaction kinetics and complex spatio-temporal concentration distributions under incomplete mixing. We address this challenge via the development of an order reduction method for the advection-diffusion-reaction equation (ADRE) via projection of the reaction kinetics onto a small number N of leading eigenmodes of the advection-diffusion operator (the so-called "strange eigenmodes" of the flow) as an N-by-N nonlinear system, whilst mixing dynamics only are projected onto the remaining modes. For simple kinetics and moderate Péclet and Damkhöler numbers, this approach yields analytic solutions for the concentration mean, evolving spatio-temporal distribution and PDF in terms of the well-mixed reaction kinetics and mixing dynamics. For more complex kinetics or large Péclet or Damkhöler numbers only a small number of modes are required to accurately quantify the mixing and reaction dynamics in terms of the concentration field and PDF, facilitating greatly simplified approximation and analysis of reactive transport. Approximate solutions of this low-order nonlinear system provide quantiative predictions of the evolving concentration PDF. We demonstrate application of this method to a simple random flow and various mass-action reaction kinetics.
Research at NASA's NFAC wind tunnels
NASA Technical Reports Server (NTRS)
Edenborough, H. Kipling
1990-01-01
The National Full-Scale Aerodynamics Complex (NFAC) is a unique combination of wind tunnels that allow the testing of aerodynamic and dynamic models at full or large scale. It can even accommodate actual aircraft with their engines running. Maintaining full-scale Reynolds numbers and testing with surface irregularities, protuberances, and control surface gaps that either closely match the full-scale or indeed are those of the full-scale aircraft help produce test data that accurately predict what can be expected from future flight investigations. This complex has grown from the venerable 40- by 80-ft wind tunnel that has served for over 40 years helping researchers obtain data to better understand the aerodynamics of a wide range of aircraft from helicopters to the space shuttle. A recent modification to the tunnel expanded its maximum speed capabilities, added a new 80- by 120-ft test section and provided extensive acoustic treatment. The modification is certain to make the NFAC an even more useful facility for NASA's ongoing research activities. A brief background is presented on the original facility and the kind of testing that has been accomplished using it through the years. A summary of the modification project and the measured capabilities of the two test sections is followed by a review of recent testing activities and of research projected for the future.
Semantic Web technologies for the big data in life sciences.
Wu, Hongyan; Yamaguchi, Atsuko
2014-08-01
The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed in the world. Life sciences data generated by new technologies are continuing to grow in not only size but also variety and complexity, with great speed. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources and even across disciplines is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are two principal issues mainly discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information for not only humans but also computers to semantically process large-scale data. The paper presents a survey of big data in life sciences, big data related projects and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps to understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for the big data in life sciences.
Food for Thought ... Mechanistic Validation
Hartung, Thomas; Hoffmann, Sebastian; Stephens, Martin
2013-01-01
Summary Validation of new approaches in regulatory toxicology is commonly defined as the independent assessment of the reproducibility and relevance (the scientific basis and predictive capacity) of a test for a particular purpose. In large ring trials, the emphasis to date has been mainly on reproducibility and predictive capacity (comparison to the traditional test) with less attention given to the scientific or mechanistic basis. Assessing predictive capacity is difficult for novel approaches (which are based on mechanism), such as pathways of toxicity or the complex networks within the organism (systems toxicology). This is highly relevant for implementing Toxicology for the 21st Century, either by high-throughput testing in the ToxCast/ Tox21 project or omics-based testing in the Human Toxome Project. This article explores the mostly neglected assessment of a test's scientific basis, which moves mechanism and causality to the foreground when validating/qualifying tests. Such mechanistic validation faces the problem of establishing causality in complex systems. However, pragmatic adaptations of the Bradford Hill criteria, as well as bioinformatic tools, are emerging. As critical infrastructures of the organism are perturbed by a toxic mechanism we argue that by focusing on the target of toxicity and its vulnerability, in addition to the way it is perturbed, we can anchor the identification of the mechanism and its verification. PMID:23665802
The Electrolyte Genome project: A big data approach in battery materials discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qu, Xiaohui; Jain, Anubhav; Rajput, Nav Nidhi
2015-06-01
We present a high-throughput infrastructure for the automated calculation of molecular properties with a focus on battery electrolytes. The infrastructure is largely open-source and handles both practical aspects (input file generation, output file parsing, and information management) as well as more complex problems (structure matching, salt complex generation, and failure recovery). Using this infrastructure, we have computed the ionization potential (IP) and electron affinities (EA) of 4830 molecules relevant to battery electrolytes (encompassing almost 55,000 quantum mechanics calculations) at the B3LYP/6-31+G(*) level. We describe automated workflows for computing redox potential, dissociation constant, and salt-molecule binding complex structure generation. We presentmore » routines for automatic recovery from calculation errors, which brings the failure rate from 9.2% to 0.8% for the QChem DFT code. Automated algorithms to check duplication between two arbitrary molecules and structures are described. We present benchmark data on basis sets and functionals on the G2-97 test set; one finding is that a IP/EA calculation method that combines PBE geometry optimization and B3LYP energy evaluation requires less computational cost and yields nearly identical results as compared to a full B3LYP calculation, and could be suitable for the calculation of large molecules. Our data indicates that among the 8 functionals tested, XYGJ-OS and B3LYP are the two best functionals to predict IP/EA with an RMSE of 0.12 and 0.27 eV, respectively. Application of our automated workflow to a large set of quinoxaline derivative molecules shows that functional group effect and substitution position effect can be separated for IP/EA of quinoxaline derivatives, and the most sensitive position is different for IP and EA. Published by Elsevier B.V« less
The NSF ITR Project: Framework for the National Virtual Observatory
NASA Astrophysics Data System (ADS)
Szalay, A. S.; Williams, R. D.; NVO Collaboration
2002-05-01
Technological advances in telescope and instrument design during the last ten years, coupled with the exponential increase in computer and communications capability, have caused a dramatic and irreversible change in the character of astronomical research. Large-scale surveys of the sky from space and ground are being initiated at wavelengths from radio to x-ray, thereby generating vast amounts of high quality irreplaceable data. The potential for scientific discovery afforded by these new surveys is enormous. Entirely new and unexpected scientific results of major significance will emerge from the combined use of the resulting datasets, science that would not be possible from such sets used singly. However, their large size and complexity require tools and structures to discover the complex phenomena encoded within them. We plan to build the NVO framework both through coordinating diverse efforts already in existence and providing a focus for the development of capabilities that do not yet exist. The NVO we envisage will act as an enabling and coordinating entity to foster the development of further tools, protocols, and collaborations necessary to realize the full scientific potential of large astronomical datasets in the coming decade. The NVO must be able to change and respond to the rapidly evolving world of IT technology. In spite of its underlying complex software, the NVO should be no harder to use for the average astronomer, than today's brick-and-mortar observatories and telescopes. Development of these capabilities will require close interaction and collaboration with the information technology community and other disciplines facing similar challenges. We need to ensure that the tools that we need exist or are built, but we do not duplicate efforts, and rely on relevant experience of others.
O'Connor, Brian D.; Yuen, Denis; Chung, Vincent; Duncan, Andrew G.; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent
2017-01-01
As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore ( https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH). PMID:28344774
O'Connor, Brian D; Yuen, Denis; Chung, Vincent; Duncan, Andrew G; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent
2017-01-01
As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore ( https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH).
Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems
NASA Astrophysics Data System (ADS)
Sikkandar Basha, Nazareen
The design and the development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams and numerous levels of the organization and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements are used to capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interactions are required to elicit the requirements of the system within the organization. Since LSCES involves people and interactions between the teams and interdisciplinary departments, it should be socio-technical in nature. The elicitation of the requirements of most large-scale system projects are subjected to creep in time and cost due to the uncertainty and ambiguity of requirements during the design and development. In an organization structure, the cost and time overrun can occur at any level and iterate back and forth thus increasing the cost and time. To avoid such creep past researches have shown that rigorous approaches such as value based designing can be used to control it. But before the rigorous approaches can be used, the decision maker should have a proper understanding of requirements creep and the state of the system when the creep occurs. Sensemaking is used to understand the state of system when the creep occurs and provide a guidance to decision maker. This research proposes the use of the Cynefin framework, sensemaking framework which can be used in the design and development of LSCES. It can aide in understanding the system and decision making to minimize the value gap due to requirements creep by eliminating ambiguity which occurs during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep in terms of cost and time using the Cynefin framework. These trials are continued for different requirements and at different sub-system level. The results obtained show that the Cynefin framework can be used to improve the value of the system and can be used for predictive analysis. The decision makers can use these findings and use rigorous approaches and improve the design of Large Scale Complex Engineered Systems.
Report of CEC Study Committee on Construction Management.
ERIC Educational Resources Information Center
Consulting Engineers Council of the U.S., Washington, DC.
Changing times place new demands on those involved in the implementation of construction projects. Within a relatively few years, the size and complexity of projects has grown substantially. Environmental and other public and social considerations are increasingly significant. With growing complexity, the requirements for effective project…
24 CFR 954.4 - Other Federal requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... regional or village corporation as defined or established pursuant to the Alaska Native Claims Settlement..., and affordable dwelling unit in the building/complex upon completion of the project. (2) Temporary... building/complex upon completion of the project; and (D) The provisions of paragraph (e)(2)(i) of this...
General risks for tunnelling projects: An overview
NASA Astrophysics Data System (ADS)
Siang, Lee Yong; Ghazali, Farid E. Mohamed; Zainun, Noor Yasmin; Ali, Roslinda
2017-10-01
Tunnels are indispensable when installing new infrastructure as well as when enhancing the quality of existing urban living due to their unique characteristics and potential applications. Over the past few decades, there has been a significant increase in the building of tunnels, world-wide. Tunnelling projects are complex endeavors, and risk assessment for tunnelling projects is likewise a complex process. Risk events are often interrelated. Occurrence of a technical risk usually carries cost and schedule consequences. Schedule risks typically impact cost escalation and project overhead. One must carefully consider the likelihood of a risk's occurrence and its impact in the context of a specific set of project conditions and circumstances. A project's goals, organization, and environment impacts in the context of a specific set of project conditions and circumstances. Some projects are primarily schedule driven; other projects are primarily cost or quality driven. Whether a specific risk event is perceived fundamentally as a cost risk or a schedule risk is governed by the project-specific context. Many researchers have pointed out the significance of recognition and control of the complexity, and risks of tunnelling projects. Although all general information on a project such as estimated duration, estimated cost, and stakeholders can be obtained, it is still quite difficult to accurately understand, predict and control the overall situation and development trends of the project, leading to the risks of tunnelling projects. This paper reviews all the key risks for tunnelling projects from several case studies that have been carried out by other researchers. These risks have been identified and reviewed in this paper. As a result, the current risk management plan in tunnelling projects can be enhanced by including all these reviewed risks as key information.
NASA Astrophysics Data System (ADS)
Sá, Ana C. L.; Benali, Akli; Pinto, Renata M. S.; Pereira, José M. C.; Trigo, Ricardo M.; DaCamara, Carlos C.
2014-05-01
Large wildfires are infrequent but account for the most severe environmental, ecological and socio-economic impacts. In recent years Portugal has suffered the impact of major heat waves that fuelled records of burnt area exceeding 400.000ha and 300.000ha in 2003 and 2005, respectively. According to the latest IPCC reports, the frequency and amplitude of summer heat waves over Iberia will very likely increase in the future. Therefore, most climate change studies point to an increase in the number and extent of wildfires. Thus, an increase in both wildfire impacts and fire suppression difficulties is expected. The spread of large wildfires results from a complex interaction between topography, meteorology and fuel properties. Wildfire spread models (e.g. FARSITE) are commonly used to simulate fire growth and behaviour and are an essential tool to understand their main drivers. Additionally, satellite active-fire data have been used to monitor the occurrence, extent, and spread of wildfires. Both satellite data and fire spread models provide different types of information about the spatial and temporal distribution of large wildfires and can potentially be used to support strategic decisions regarding fire suppression resource allocation. However, they have not been combined in a manner that fully exploits their potential and minimizes their limitations. A knowledge gap still exists in understanding how to minimize the impacts of large wildfires, leading to the following research question: What can we learn from past large wildfires in order to mitigate future fire impacts? FIRE-MODSAT is a one-year funded project by the Portuguese Foundation for the Science and Technology (FCT) that is founded on this research question, with the main goal of improving our understanding on the interactions between fire spread and its environmental drivers, to support fire management decisions in an operational context and generate valuable information to improve the efficiency of the fire suppression system. This project proposes to explore an innovative combination of remote sensing and fire spread models in order to 1) better understand the interactions of fire spread drivers that lead to large wildfires; 2) identify the spatio-temporal frames in which large wildfires can be suppressed more efficiently, and 3) explore the essential steps towards an operational use of both tools to assist fire suppression decisions. Preliminary results combine MODIS active-fire data and burn scar perimeters, to derive the main fire spread paths for the 10 largest wildfires that occurred in Portugal between 2001 and 2012. Fire growth and behavior simulations of some of those wildfires are assessed using the active fires data. Results are also compared with the major fire paths to understand the main drivers of fire propagation, through their interactions with topography, vegetation and meteorology. These combined results are also used for spatial and temporal identification of opportunity windows for a more efficient suppression intervention for each fire event. The approach shows promising results, providing a valuable reconstruction of the fire events and retrieval of important parameters related to the complex spread patterns of individual fire events.
NASA Astrophysics Data System (ADS)
Helbing, D.; Balietti, S.; Bishop, S.; Lukowicz, P.
2011-05-01
This contribution reflects on the comments of Peter Allen [1], Bikas K. Chakrabarti [2], Péter Érdi [3], Juval Portugali [4], Sorin Solomon [5], and Stefan Thurner [6] on three White Papers (WP) of the EU Support Action Visioneer (www.visioneer.ethz.ch). These White Papers are entitled "From Social Data Mining to Forecasting Socio-Economic Crises" (WP 1) [7], "From Social Simulation to Integrative System Design" (WP 2) [8], and "How to Create an Innovation Accelerator" (WP 3) [9]. In our reflections, the need and feasibility of a "Knowledge Accelerator" is further substantiated by fundamental considerations and recent events around the globe. newpara The Visioneer White Papers propose research to be carried out that will improve our understanding of complex techno-socio-economic systems and their interaction with the environment. Thereby, they aim to stimulate multi-disciplinary collaborations between ICT, the social sciences, and complexity science. Moreover, they suggest combining the potential of massive real-time data, theoretical models, large-scale computer simulations and participatory online platforms. By doing so, it would become possible to explore various futures and to expand the limits of human imagination when it comes to the assessment of the often counter-intuitive behavior of these complex techno-socio-economic-environmental systems. In this contribution, we also highlight the importance of a pluralistic modeling approach and, in particular, the need for a fruitful interaction between quantitative and qualitative research approaches. newpara In an appendix we briefly summarize the concept of the FuturICT flagship project, which will build on and go beyond the proposals made by the Visioneer White Papers. EU flagships are ambitious multi-disciplinary high-risk projects with a duration of at least 10 years amounting to an envisaged overall budget of 1 billion EUR [10]. The goal of the FuturICT flagship initiative is to understand and manage complex, global, socially interactive systems, with a focus on sustainability and resilience.
The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.
2015-12-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributive tasks on large set of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background on high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework into Python-based eco-systems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites - according to the datasets distribution - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed tasks orchestration, and fine grain, at the level of a single data analytics cluster instance) will be presented and discussed.
Megacities and Large Urban Complexes - WMO Role in Addressing Challenges and Opportunities
NASA Astrophysics Data System (ADS)
Terblanche, Deon; Jalkanen, Liisa
2013-04-01
Megacities and Large Urban Complexes - WMO Role in Addressing Challenges and Opportunities Deon E. Terblanche and Liisa Jalkanen dterblanche@wmo.int ljalkanen@wmo.int World Meteorological Organization, Geneva, Switzerland The 21st Century could amongst others, become known as the century in which our species has evolved from Homo sapiens to Homo urbanus. By now the urban population has surpassed the rural population and the rate of urbanization will continue at such a pace that by 2050 urban dwellers could outnumber their rural counterpart by more than two to one. Most of this growth in urban population will occur in developing countries and along coastal areas. Urbanization is to a large extent the outcome of humans seeking a better life through improved opportunities presented by high-density communities. Megacities and large urban complexes provide more job opportunities and social structures, better transport and communication links and a relative abundance of physical goods and services when compared to most rural areas. Unfortunately these urban complexes also present numerous social and environmental challenges. Urban areas differ from their surroundings by morphology, population density, and with high concentration of industrial activities, energy consumption and transport. They also pose unique challenges to atmospheric modelling and monitoring and create a multi-disciplinary spectrum of potential threats, including air pollution, which need to be addressed in an integrated way. These areas are also vulnerable to the changing climate and its implications to sea-level and extreme events, air quality and related health impacts. Many urban activities are significantly impacted by weather events that would not be considered to be of high impact in less densely populated areas. For instance, moderate precipitation events can cause flooding and landslides as modified urban catchments generally have higher run-off to rainfall ratios than their more pristine rural counterparts. The urban environment also provides numerous opportunities. One example being the better use of weather and environmental predictions to proactively optimize the functioning of the urban environment in terms of the use of energy, goods and services. Another is the providing of air quality forecasting services to benefit the health of the population. To address the challenges and opportunities facing megacities and large urban complexes, WMO has established the Global Atmosphere Watch (GAW) Urban Research Meteorology and Environment (GURME). Air pollution questions in urban areas, in particular megacities, is the main focus, building observational and modelling capabilities in developing countries through pilot projects and transfer of scientific expertise. GURME contributes to improving capabilities to handle meteorological and related features of air pollution by addressing end-to-end aspects of air quality, linking observational capabilities with the needs of chemical weather prediction, with the goal of providing high quality air quality services. Using examples from around the world but with specific reference to Africa, the unique challenges and opportunities related to megacities and large urban complexes, as perceived by the World Meteorological Organization (WMO) are highlighted.
NASA Astrophysics Data System (ADS)
Goulet, C. A.; Abrahamson, N. A.; Al Atik, L.; Atkinson, G. M.; Bozorgnia, Y.; Graves, R. W.; Kuehn, N. M.; Youngs, R. R.
2017-12-01
The Next Generation Attenuation project for Central and Eastern North America (CENA), NGA-East, is a major multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER). The project was co-sponsored by the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), the Electric Power Research Institute (EPRI) and the U.S. Geological Survey (USGS). NGA-East involved a large number of participating researchers from various organizations in academia, industry and government and was carried-out as a combination of 1) a scientific research project and 2) a model-building component following the NRC Seismic Senior Hazard Analysis Committee (SSHAC) Level 3 process. The science part of the project led to several data products and technical reports while the SSHAC component aggregated the various results into a ground motion characterization (GMC) model. The GMC model consists in a set of ground motion models (GMMs) for median and standard deviation of ground motions and their associated weights, combined into logic-trees for use in probabilistic seismic hazard analyses (PSHA). NGA-East addressed many technical challenges, most of them related to the relatively small number of earthquake recordings available for CENA. To resolve this shortcoming, the project relied on ground motion simulations to supplement the available data. Other important scientific issues were addressed through research projects on topics such as the regionalization of seismic source, path and attenuation of motions, the treatment of variability and uncertainties and on the evaluation of site effects. Seven working groups were formed to cover the complexity and breadth of topics in the NGA-East project, each focused on a specific technical area. This presentation provides an overview of the NGA-East research project and its key products.
Choice of baseline climate data impacts projected species' responses to climate change.
Baker, David J; Hartley, Andrew J; Butchart, Stuart H M; Willis, Stephen G
2016-07-01
Climate data created from historic climate observations are integral to most assessments of potential climate change impacts, and frequently comprise the baseline period used to infer species-climate relationships. They are often also central to downscaling coarse resolution climate simulations from General Circulation Models (GCMs) to project future climate scenarios at ecologically relevant spatial scales. Uncertainty in these baseline data can be large, particularly where weather observations are sparse and climate dynamics are complex (e.g. over mountainous or coastal regions). Yet, importantly, this uncertainty is almost universally overlooked when assessing potential responses of species to climate change. Here, we assessed the importance of historic baseline climate uncertainty for projections of species' responses to future climate change. We built species distribution models (SDMs) for 895 African bird species of conservation concern, using six different climate baselines. We projected these models to two future periods (2040-2069, 2070-2099), using downscaled climate projections, and calculated species turnover and changes in species-specific climate suitability. We found that the choice of baseline climate data constituted an important source of uncertainty in projections of both species turnover and species-specific climate suitability, often comparable with, or more important than, uncertainty arising from the choice of GCM. Importantly, the relative contribution of these factors to projection uncertainty varied spatially. Moreover, when projecting SDMs to sites of biodiversity importance (Important Bird and Biodiversity Areas), these uncertainties altered site-level impacts, which could affect conservation prioritization. Our results highlight that projections of species' responses to climate change are sensitive to uncertainty in the baseline climatology. We recommend that this should be considered routinely in such analyses. © 2016 John Wiley & Sons Ltd.
Wall, Martin; Hayes, Richard; Moore, Derek; Petticrew, Mark; Clow, Angela; Schmidt, Elena; Draper, Alizon; Lock, Karen; Lynch, Rebecca; Renton, Adrian
2009-01-01
Background In London and the rest of the UK, diseases associated with poor diet, inadequate physical activity and mental illness account for a large proportion of area based health inequality. There is a lack of evidence on interventions promoting healthier behaviours especially in marginalised populations, at a structural or ecological level and utilising a community development approach. The Well London project financed by the Big Lottery 'Wellbeing' Fund and implemented by a consortium of London based agencies led by the Greater London Authority and the London Health Commission is implementing a set of complex interventions across 20 deprived areas of London. The interventions focus on healthy eating, healthy physical activity and mental health and wellbeing and are designed and executed with community participation complementing existing facilities and services. Methods/Design The programme will be evaluated through a cluster randomised controlled trial. Forty areas across London were chosen based on deprivation scores. Areas were characterised by high proportion of Black and Minority Ethnic residents, worklessness, ill-health and poor physical environments. Twenty areas were randomly assigned to the intervention arm of Well London project and twenty 'matched' areas assigned as controls. Measures of physical activity, diet and mental health are collected at start and end of the project and compared to assess impact. The quantitative element will be complemented by a longitudinal qualitative study elucidating pathways of influence between intervention activities and health outcomes. A related element of the study investigates the health-related aspects of the structural and ecological characteristics of the project areas. The project 'process' will also be evaluated. Discussion The size of the project and the fact that the interventions are 'complex' in the sense that firstly, there are a number of interacting components with a wide range of groups and organisational levels targeted by the intervention, and secondly, a degree of flexibility or tailoring of the intervention, makes this trial potentially very useful in providing evidence of the types of activities that can be used to address chronic health problems in communities suffering from multiple deprivation. Trial Registration Current Controlled Trials ISRCTN68175121 PMID:19558712
Project Management Life Cycle Models to Improve Management in High-rise Construction
NASA Astrophysics Data System (ADS)
Burmistrov, Andrey; Siniavina, Maria; Iliashenko, Oksana
2018-03-01
The paper describes a possibility to improve project management in high-rise buildings construction through the use of various Project Management Life Cycle Models (PMLC models) based on traditional and agile project management approaches. Moreover, the paper describes, how the split the whole large-scale project to the "project chain" will create the factor for better manageability of the large-scale buildings project and increase the efficiency of the activities of all participants in such projects.
Graef, Frieder; Sieber, Stefan
2018-01-01
Research and development increasingly apply participatory approaches that involve both stakeholders and scientists. This article presents an evaluation of German and Tanzanian researchers' perceptions during their activities as part of a large interdisciplinary research project in Tanzania. The project focused on prioritizing and implementing food-securing upgrading strategies across the components of rural food value chains. The participants involved during the course of the project were asked to provide feedback on 10 different research steps and to evaluate eight core features related to the functioning and potential shortcomings of the project. The study discriminated among evaluation differences linked to culture, gender, and institutional status. Perceptions differed between Tanzanian and German participants depending on the type and complexity of the participatory research steps undertaken and the intensity of stakeholder participation. There were differences in perception linked to gender and hierarchical status; however, those differences were not as concise and significant as those linked to nationality. These findings indicate that participatory action research of this nature requires more targeted strategies and planning tailored to the type of activity. Such planning would result in more efficient and satisfactory communication, close collaboration, and mutual feedback to avoid conflicts and other problems. We further conclude that it would be advisable to carefully incorporate training on these aspects into future project designs.
NASA Technical Reports Server (NTRS)
Allen, B. Danette
1998-01-01
In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.
Minor Distortions with Major Consequences: Correcting Distortions in Imaging Spectrographs
Esmonde-White, Francis W. L.; Esmonde-White, Karen A.; Morris, Michael D.
2010-01-01
Projective transformation is a mathematical correction (implemented in software) used in the remote imaging field to produce distortion-free images. We present the application of projective transformation to correct minor alignment and astigmatism distortions that are inherent in dispersive spectrographs. Patterned white-light images and neon emission spectra were used to produce registration points for the transformation. Raman transects collected on microscopy and fiber-optic systems were corrected using established methods and compared with the same transects corrected using the projective transformation. Even minor distortions have a significant effect on reproducibility and apparent fluorescence background complexity. Simulated Raman spectra were used to optimize the projective transformation algorithm. We demonstrate that the projective transformation reduced the apparent fluorescent background complexity and improved reproducibility of measured parameters of Raman spectra. Distortion correction using a projective transformation provides a major advantage in reducing the background fluorescence complexity even in instrumentation where slit-image distortions and camera rotation were minimized using manual or mechanical means. We expect these advantages should be readily applicable to other spectroscopic modalities using dispersive imaging spectrographs. PMID:21211158
Data-assisted reduced-order modeling of extreme events in complex dynamical systems
Koumoutsakos, Petros
2018-01-01
The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced order model, with data-streams that are integrated though a recurrent neural network (RNN) architecture. The reduced order model has the form of projected equations into a low-dimensional subspace that still contains important dynamical information about the system and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more significant in regions associated with extreme events, where data is sparse. PMID:29795631
Data-assisted reduced-order modeling of extreme events in complex dynamical systems.
Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis
2018-01-01
The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced order model, with data-streams that are integrated though a recurrent neural network (RNN) architecture. The reduced order model has the form of projected equations into a low-dimensional subspace that still contains important dynamical information about the system and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more significant in regions associated with extreme events, where data is sparse.
Success in health information exchange projects: solving the implementation puzzle.
Sicotte, Claude; Paré, Guy
2010-04-01
Interest in health information exchange (HIE), defined as the use of information technology to support the electronic transfer of clinical information across health care organizations, continues to grow among those pursuing greater patient safety and health care accessibility and efficiency. In this paper, we present the results of a longitudinal multiple-case study of two large-scale HIE implementation projects carried out in real time over 3-year and 2-year periods in Québec, Canada. Data were primarily collected through semi-structured interviews (n=52) with key informants, namely implementation team members and targeted users. These were supplemented with non-participants observation of team meetings and by the analysis of organizational documents. The cross-case comparison was particularly relevant given that project circumstances led to contrasting outcomes: while one project failed, the other was a success. A risk management analysis was performed taking a process view in order to capture the complexity of project implementations as evolving phenomena that are affected by interdependent pre-existing and emergent risks that tend to change over time. The longitudinal case analysis clearly demonstrates that the risk factors were closely intertwined. Systematic ripple effects from one risk factor to another were observed. This risk interdependence evolved dynamically over time, with a snowball effect that rendered a change of path progressively more difficult as time passed. The results of the cross-case analysis demonstrate a direct relationship between the quality of an implementation strategy and project outcomes. Copyright 2010 Elsevier Ltd. All rights reserved.
System dynamic simulation: A new method in social impact assessment (SIA)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karami, Shobeir, E-mail: shobeirkarami@gmail.com; Karami, Ezatollah, E-mail: ekarami@shirazu.ac.ir; Buys, Laurie, E-mail: l.buys@qut.edu.au
Many complex social questions are difficult to address adequately with conventional methods and techniques, due to the complicated dynamics, and hard to quantify social processes. Despite these difficulties researchers and practitioners have attempted to use conventional methods not only in evaluative modes but also in predictive modes to inform decision making. The effectiveness of SIAs would be increased if they were used to support the project design processes. This requires deliberate use of lessons from retrospective assessments to inform predictive assessments. Social simulations may be a useful tool for developing a predictive SIA method. There have been limited attempts tomore » develop computer simulations that allow social impacts to be explored and understood before implementing development projects. In light of this argument, this paper aims to introduce system dynamic (SD) simulation as a new predictive SIA method in large development projects. We propose the potential value of the SD approach to simulate social impacts of development projects. We use data from the SIA of Gareh-Bygone floodwater spreading project to illustrate the potential of SD simulation in SIA. It was concluded that in comparison to traditional SIA methods SD simulation can integrate quantitative and qualitative inputs from different sources and methods and provides a more effective and dynamic assessment of social impacts for development projects. We recommend future research to investigate the full potential of SD in SIA in comparing different situations and scenarios.« less
Energising the WEF nexus to enhance sustainable development at local level.
Terrapon-Pfaff, Julia; Ortiz, Willington; Dienst, Carmen; Gröne, Marie-Christine
2018-06-23
The water-energy-food (WEF) nexus is increasingly recognised as a conceptual framework able to support the efficient implementation of the Sustainable Development Goals (SDGs). Despite growing attention paid to the WEF nexus, the role that renewable energies can play in addressing trade-offs and realising synergies has received limited attention. Until now, the focus of WEF nexus discussions and applications has mainly been on national or global levels, macro-level drivers, material flows and large infrastructure developments. This overlooks the fact that major nexus challenges are faced at local level. Aiming to address these knowledge gaps, the authors conduct a systematic analysis of the linkages between small-scale energy projects in developing countries and the food and water aspects of development. The analysis is based on empirical data from continuous process and impact evaluations complemented by secondary data and relevant literature. The study provides initial insights into how to identify interconnections and the potential benefits of integrating the nexus pillars into local level projects in the global south. The study identifies the complex links which exist between sustainable energy projects and the food and water sectors and highlights that these needs are currently not systematically integrated into project design or project evaluation. A more systematic approach, integrating the water and food pillars into energy planning at local level in the global south, is recommended to avoid trade-offs and enhance the development outcomes and impacts of energy projects. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.
Hansen, Matthew; O’Brien, Kerth; Meckler, Garth; Chang, Anna Marie; Guise, Jeanne-Marie
2016-01-01
Mixed methods research has significant potential to broaden the scope of emergency care and specifically emergency medical services investigation. Mixed methods studies involve the coordinated use of qualitative and quantitative research approaches to gain a fuller understanding of practice. By combining what is learnt from multiple methods, these approaches can help to characterise complex healthcare systems, identify the mechanisms of complex problems such as medical errors and understand aspects of human interaction such as communication, behaviour and team performance. Mixed methods approaches may be particularly useful for out-of-hospital care researchers because care is provided in complex systems where equipment, interpersonal interactions, societal norms, environment and other factors influence patient outcomes. The overall objectives of this paper are to (1) introduce the fundamental concepts and approaches of mixed methods research and (2) describe the interrelation and complementary features of the quantitative and qualitative components of mixed methods studies using specific examples from the Children’s Safety Initiative-Emergency Medical Services (CSI-EMS), a large National Institutes of Health-funded research project conducted in the USA. PMID:26949970
Hydrological Dynamics In High Mountain Catchment Areas of Central Norway
NASA Astrophysics Data System (ADS)
Löffler, J.; Rössler, O.
Large-scaled landscape structure is regarded as a mosaic of ecotopes where pro- cess dynamics of water and energy fluxes are analysed due to its effects on ecosys- tem functioning. The investigations have been carried out in the continental most Vågå/Oppland high mountains in central Norway since 1994 (LÖFFLER WUN- DRAM 1999, 2000, 2001). Additionally, comparable investigations started in 2000 dealing with the oceanic high mountain landscapes on same latitudes (LÖFFLER et al. 2001). The theoretical and methodological framework of the project is given by the Landscape-Ecological Complex Analysis (MOSIMANN 1984, 1985) and its variations due to technical and principle methodical challenges in this high moun- tain landscape (KÖHLER et al. 1994, LÖFFLER 1998). The aim of the project is to characterize high mountain ecosystem structure, functioning and dynamics within small catchment areas, that are chosen in two different altitudinal belts each in the eastern continental and the western oceanic region of central Norway. In the frame of this research project hydrological and meteorological measurements on ground water, percolation and soil moisture dynamics as well as on evaporation, air humidity and air-, surface- and soil-temperatures have been conducted. On the basis of large-scaled landscape-ecological mappings (LÖFFLER 1997) one basic meteorological station and several major data logger run stations have been installed in representative sites of each two catchment areas in the low and mid alpine belts of the investigation re- gions (JUNG et al. 1997, LÖFFLER WUNDRAM 1997). Moreover, spatial differ- entiations of groundwater level, soil moisture and temperature profiles have been in- vestigated by means of hand held measurements at different times of the day, during different climatic situations and different seasons. Daily and annual air-, surface- and soil-temperature dynamics are demonstrated by means of thermoisopleth-diagrams for different types of ecotopes of the different altitudinal belts. The local differences of temperature dynamics are illustrated in a map as an example of the low alpine al- titudinal belt showing a 4-dimensional characterization (in space and time) of high mountain ecosystem functioning. Hydrological aspects derived from those results are presented showing the large-scaled hydrological dynamics of high mountain catch- ment basins in central Norway. The results of the process analysis of hydrological dynamics in the central Norwegian high mountains are discussed within the frame of 1 investigations on altitudinal changes of mountain ecosystem structure and function- ing (LÖFFLER WUNDRAM [in print]). The poster illustrates the theoretical and methodological conception, methods and techniques, examples from complex data material as well as general outcomes of the project (RÖSSLER [in prep.]. 2
Photoinduced energy transfer in transition metal complex oligomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-06-01
The work done over the past three years has been directed toward the preparation, characterization and photophysical examination of mono- and bimetallic diimine complexes. The work is part of a broader project directed toward the development of stable, efficient, light harvesting arrays of transition metal complex chromophores. One focus has been the synthesis of rigid bis-bidentate and bis-tridentate bridging ligands. The authors have managed to make the ligand bphb in multigram quantities from inexpensive starting materials. The synthetic approach used has allowed them to prepare a variety of other ligands which may have unique applications (vide infra). They have prepared,more » characterized and examined the photophysical behavior of Ru(II) and Re(I) complexes of the ligands. Energy donor/acceptor complexes of bphb have been prepared which exhibit nearly activationless energy transfer. Complexes of Ru(II) and Re(I) have also been prepared with other polyunsaturated ligands in which two different long lived (> 50 ns) excited states exist; results of luminescence and transient absorbance measurements suggest the two states are metal-to-ligand charge transfer and ligand localized {pi}{r_arrow}{pi}* triplets. Finally, the authors have developed methods to prepare polymetallic complexes which are covalently bound to various surfaces. The long term objective of this work is to make light harvesting arrays for the sensitization of large band gap semiconductors. Details of this work are provided in the body of the report.« less
Big data, open science and the brain: lessons learned from genomics.
Choudhury, Suparna; Fishman, Jennifer R; McGowan, Michelle L; Juengst, Eric T
2014-01-01
The BRAIN Initiative aims to break new ground in the scale and speed of data collection in neuroscience, requiring tools to handle data in the magnitude of yottabytes (10(24)). The scale, investment and organization of it are being compared to the Human Genome Project (HGP), which has exemplified "big science" for biology. In line with the trend towards Big Data in genomic research, the promise of the BRAIN Initiative, as well as the European Human Brain Project, rests on the possibility to amass vast quantities of data to model the complex interactions between the brain and behavior and inform the diagnosis and prevention of neurological disorders and psychiatric disease. Advocates of this "data driven" paradigm in neuroscience argue that harnessing the large quantities of data generated across laboratories worldwide has numerous methodological, ethical and economic advantages, but it requires the neuroscience community to adopt a culture of data sharing and open access to benefit from them. In this article, we examine the rationale for data sharing among advocates and briefly exemplify these in terms of new "open neuroscience" projects. Then, drawing on the frequently invoked model of data sharing in genomics, we go on to demonstrate the complexities of data sharing, shedding light on the sociological and ethical challenges within the realms of institutions, researchers and participants, namely dilemmas around public/private interests in data, (lack of) motivation to share in the academic community, and potential loss of participant anonymity. Our paper serves to highlight some foreseeable tensions around data sharing relevant to the emergent "open neuroscience" movement.
What is the Thalamus in Zebrafish?
Mueller, Thomas
2012-01-01
Current research on the thalamus and related structures in the zebrafish diencephalon identifies an increasing number of both neurological structures and ontogenetic processes as evolutionary conserved between teleosts and mammals. The patterning processes, for example, which during the embryonic development of zebrafish form the thalamus proper appear largely conserved. Yet also striking differences between zebrafish and other vertebrates have been observed, particularly when we look at mature and histologically differentiated brains. A case in point is the migrated preglomerular complex of zebrafish which evolved only within the lineage of ray-finned fish and has no counterpart in mammals or tetrapod vertebrates. Based on its function as a sensory relay station with projections to pallial zones, the preglomerular complex has been compared to specific thalamic nuclei in mammals. However, no thalamic projections to the zebrafish dorsal pallium, which corresponds topologically to the mammalian isocortex, have been identified. Merely one teleostean thalamic nucleus proper, the auditory nucleus, projects to a part of the dorsal telencephalon, the pallial amygdala. Studies on patterning mechanisms identify a rostral and caudal domain in the embryonic thalamus proper. In both, teleosts and mammals, the rostral domain gives rise to GABAergic neurons, whereas glutamatergic neurons originate in the caudal domain of the zebrafish thalamus. The distribution of GABAergic derivatives in the adult zebrafish brain, furthermore, revealed previously overlooked thalamic nuclei and redefined already established ones. These findings require some reconsideration regarding the topological origin of these adult structures. In what follows, I discuss how evolutionary conserved and newly acquired features of the developing and adult zebrafish thalamus can be compared to the mammalian situation. PMID:22586363
FME Senior Project Managers: A Juggling Act of Multiple Projects | Poster
By Peggy Pearl, Contributing Writer It was not until the 1950s that organizations in the United States began to apply project management tools and techniques to complex construction and engineering projects (http://en.m.wikipedia.org/wiki/Project_life_cycle).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result ofmore » the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.« less
2016-05-26
Complex and Austere Environment Sb. GRANT NUMBER Sc. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) Sd. PROJECT NUMBER MAJ Sun Ryu Se. TASK NUMBER Sf...supports the United States Armed Forces to project combat power during hostilities. In 2014, TRADOC published the new Army Operating Concept (AOC...Sustaining the Army Organic Industrial Base in the Post- Afghanistan Conflict Era” (Civilian Research Project , US Army War College, 2014), 1. 8
The Five-Hundred Aperture Spherical Radio Telescope (fast) Project
NASA Astrophysics Data System (ADS)
Nan, Rendong; Li, Di; Jin, Chengjin; Wang, Qiming; Zhu, Lichun; Zhu, Wenbai; Zhang, Haiyan; Yue, Youling; Qian, Lei
Five-hundred-meter Aperture Spherical radio Telescope (FAST) is a Chinese mega-science project to build the largest single dish radio telescope in the world. Its innovative engineering concept and design pave a new road to realize a huge single dish in the most effective way. FAST also represents Chinese contribution in the international efforts to build the square kilometer array (SKA). Being the most sensitive single dish radio telescope, FAST will enable astronomers to jump-start many science goals, such as surveying the neutral hydrogen in the Milky Way and other galaxies, detecting faint pulsars, looking for the first shining stars, hearing the possible signals from other civilizations, etc. The idea of sitting a large spherical dish in a karst depression is rooted in Arecibo telescope. FAST is an Arecibo-type antenna with three outstanding aspects: the karst depression used as the site, which is large to host the 500-meter telescope and deep to allow a zenith angle of 40 degrees; the active main reflector correcting for spherical aberration on the ground to achieve a full polarization and a wide band without involving complex feed systems; and the light-weight feed cabin driven by cables and servomechanism plus a parallel robot as a secondary adjustable system to move with high precision. The feasibility studies for FAST have been carried out for 14 years, supported by Chinese and world astronomical communities. Funding for FAST has been approved by the National Development and Reform Commission in July of 2007 with a capital budget ~ 700 million RMB. The project time is 5.5 years from the commencement of work in March of 2011 and the first light is expected to be in 2016. This review intends to introduce the project of FAST with emphasis on the recent progress since 2006. In this paper, the subsystems of FAST are described in modest details followed by discussions of the fundamental science goals and examples of early science projects.
NASA Astrophysics Data System (ADS)
Xie, Lizhe; Hu, Yining; Chen, Yang; Shi, Luyao
2015-03-01
Projection and back-projection are the most computational consuming parts in Computed Tomography (CT) reconstruction. Parallelization strategies using GPU computing techniques have been introduced. We in this paper present a new parallelization scheme for both projection and back-projection. The proposed method is based on CUDA technology carried out by NVIDIA Corporation. Instead of build complex model, we aimed on optimizing the existing algorithm and make it suitable for CUDA implementation so as to gain fast computation speed. Besides making use of texture fetching operation which helps gain faster interpolation speed, we fixed sampling numbers in the computation of projection, to ensure the synchronization of blocks and threads, thus prevents the latency caused by inconsistent computation complexity. Experiment results have proven the computational efficiency and imaging quality of the proposed method.
Climate information for the wind energy industry in the Mediterranean Region
NASA Astrophysics Data System (ADS)
Calmanti, Sandro; Davis, Melanie; Schmidt, Peter; Dell'Aquila, Alessandro
2013-04-01
According to the World Wind Energy Association the total wind generation capacity worldwide has come close to cover 3% of the world's electricity demand in 2011. Thanks to the enormous resource potential and the relatively low costs of construction and maintenance of wind power plants, the wind energy sector will remain one of the most attractive renewable energy investment options. Studies reveal that climate variability and change pose a new challenge to the entire renewable energy sector, and in particular for wind energy. Stakeholders in the wind energy sector mainly use, if available, site-specific historical climate information to assess wind resources at a given project site. So far, this is the only source of information that investors (e.g., banks) are keen to accept for decisions concerning the financing of wind energy projects. However, one possible wind energy risk at the seasonal scale is the volatility of earnings from year to year investment. The most significant risk is therefore that not enough units of energy (or megawatt hours) can be generated from the project to capture energy sales to pay down debt in any given quarter or year. On the longer time scale the risk is that a project's energy yields fall short of their estimated levels, resulting in revenues that consistently come in below their projection, over the life of the project. The nature of the risk exposure determines considerable interest in wind scenarios, as a potential component of both the planning and operational phase of a renewable energy project. Fundamentally, by using climate projections, the assumption of stationary wind regimes can be compared to other scenarios where large scale changes in atmospheric circulation patterns may affect local wind regimes. In the framework of CLIM-RUN EU FP7 project, climate experts are exploring the potential of seasonal to decadal climate forecast techniques (time-frame 2012-2040) and regional climate scenarios (time horizon 2040+) over the Mediterranean Region as a tool for assessing the impact of changes in climate patterns on the energy output of wind power plants. Subsequently, we will give here a brief overview of these techniques as well as first results related to wind projections for different sites across the Mediterranean Region. We will highlight that regional climate models have a large potential for enhancing the quality of climate projections in the presence of complex orography and in the proximity of coastal areas.
ALCF Data Science Program: Productive Data-centric Supercomputing
NASA Astrophysics Data System (ADS)
Romero, Nichols; Vishwanath, Venkatram
The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCFs forthcoming 8.5 petaflops Intel/Cray system. The program will transition to the 200 petaflop/s Aurora supercomputing system when it becomes available. In 2016, four projects have been selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals with the next call expected in Spring 2017.http://www.alcf.anl.gov/alcf-data-science-program This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
Genome Improvement at JGI-HAGSC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimwood, Jane; Schmutz, Jeremy J.; Myers, Richard M.
Since the completion of the sequencing of the human genome, the Joint Genome Institute (JGI) has rapidly expanded its scientific goals in several DOE mission-relevant areas. At the JGI-HAGSC, we have kept pace with this rapid expansion of projects with our focus on assessing, assembling, improving and finishing eukaryotic whole genome shotgun (WGS) projects for which the shotgun sequence is generated at the Production Genomic Facility (JGI-PGF). We follow this by combining the draft WGS with genomic resources generated at JGI-HAGSC or in collaborator laboratories (including BAC end sequences, genetic maps and FLcDNA sequences) to produce an improved draft sequence.more » For eukaryotic genomes important to the DOE mission, we then add further information from directed experiments to produce reference genomic sequences that are publicly available for any scientific researcher. Also, we have continued our program for producing BAC-based finished sequence, both for adding information to JGI genome projects and for small BAC-based sequencing projects proposed through any of the JGI sequencing programs. We have now built our computational expertise in WGS assembly and analysis and have moved eukaryotic genome assembly from the JGI-PGF to JGI-HAGSC. We have concentrated our assembly development work on large plant genomes and complex fungal and algal genomes.« less
Configuration Management of an Optimization Application in a Research Environment
NASA Technical Reports Server (NTRS)
Townsend, James C.; Salas, Andrea O.; Schuler, M. Patricia
1999-01-01
Multidisciplinary design optimization (MDO) research aims to increase interdisciplinary communication and reduce design cycle time by combining system analyses (simulations) with design space search and decision making. The High Performance Computing and Communication Program's current High Speed Civil Transport application, HSCT4.0, at NASA Langley Research Center involves a highly complex analysis process with high-fidelity analyses that are more realistic than previous efforts at the Center. The multidisciplinary processes have been integrated to form a distributed application by using the Java language and Common Object Request Broker Architecture (CORBA) software techniques. HSCT4.0 is a research project in which both the application problem and the implementation strategy have evolved as the MDO and integration issues became better understood. Whereas earlier versions of the application and integrated system were developed with a simple, manual software configuration management (SCM) process, it was evident that this larger project required a more formal SCM procedure. This report briefly describes the HSCT4.0 analysis and its CORBA implementation and then discusses some SCM concepts and their application to this project. In anticipation that SCM will prove beneficial for other large research projects, the report concludes with some lessons learned in overcoming SCM implementation problems for HSCT4.0.
NASA Astrophysics Data System (ADS)
Russell, J. L.; Sarmiento, J. L.
2017-12-01
The Southern Ocean is central to the climate's response to increasing levels of atmospheric greenhouse gases as it ventilates a large fraction of the global ocean volume. Global coupled climate models and earth system models, however, vary widely in their simulations of the Southern Ocean and its role in, and response to, the ongoing anthropogenic forcing. Due to its complex water-mass structure and dynamics, Southern Ocean carbon and heat uptake depend on a combination of winds, eddies, mixing, buoyancy fluxes and topography. Understanding how the ocean carries heat and carbon into its interior and how the observed wind changes are affecting this uptake is essential to accurately projecting transient climate sensitivity. Observationally-based metrics are critical for discerning processes and mechanisms, and for validating and comparing climate models. As the community shifts toward Earth system models with explicit carbon simulations, more direct observations of important biogeochemical parameters, like those obtained from the biogeochemically-sensored floats that are part of the Southern Ocean Carbon and Climate Observations and Modeling project, are essential. One goal of future observing systems should be to create observationally-based benchmarks that will lead to reducing uncertainties in climate projections, and especially uncertainties related to oceanic heat and carbon uptake.
Hyper-Spectral Networking Concept of Operations and Future Air Traffic Management Simulations
NASA Technical Reports Server (NTRS)
Davis, Paul; Boisvert, Benjamin
2017-01-01
The NASA sponsored Hyper-Spectral Communications and Networking for Air Traffic Management (ATM) (HSCNA) project is conducting research to improve the operational efficiency of the future National Airspace System (NAS) through diverse and secure multi-band, multi-mode, and millimeter-wave (mmWave) wireless links. Worldwide growth of air transportation and the coming of unmanned aircraft systems (UAS) will increase air traffic density and complexity. Safe coordination of aircraft will require more capable technologies for communications, navigation, and surveillance (CNS). The HSCNA project will provide a foundation for technology and operational concepts to accommodate a significantly greater number of networked aircraft. This paper describes two of the HSCNA projects technical challenges. The first technical challenge is to develop a multi-band networking concept of operations (ConOps) for use in multiple phases of flight and all communication link types. This ConOps will integrate the advanced technologies explored by the HSCNA project and future operational concepts into a harmonized vision of future NAS communications and networking. The second technical challenge discussed is to conduct simulations of future ATM operations using multi-bandmulti-mode networking and technologies. Large-scale simulations will assess the impact, compared to todays system, of the new and integrated networks and technologies under future air traffic demand.
North-South health research collaboration: challenges in institutional interaction.
Maina-Ahlberg, B; Nordberg, E; Tomson, G
1997-04-01
North-South health development cooperation often includes research financed largely by external donors. The cooperation varies between projects and programmes with regard to subject area, mix of disciplines involved, research methods, training components and project management arrangements. A variety of problems is encountered, but they are rarely described and discussed in published project reports. We authors conducted a study of a small number of European health researchers collaborating with researchers from the Third World. We focused upon projects involving both biomedical and social science researchers, and apart from a literature review three methods were applied: self-administered questionnaires to European researchers, semistructured interviews with five IHCAR researchers, and written summaries by the three authors, each on one recent or ongoing collaborative project of their choice. Most collaborative projects were initiated from the North and are monodisciplinary or partly interdisciplinary in the sense that researchers did independent data collection preceded by joint planning and followed by joint analysis and write-up. There may be disagreements concerning remuneration such as allowances in relation to fieldwork and training. Socio-cultural misunderstanding and conflict was reportedly rare, and no serious problems were reported regarding authorship and publishing. It is concluded that collaborative research is a complex and poorly understood process with considerable potential and worth pursuing despite the problems. Difficulties related to logistics and finance are easily and freely discussed, while there is little evidence that transdisciplinary research is conducted or even discussed. We recommend that published and unpublished reports on collaborative research projects include more detailed accounts of the North-South collaborative arrangements and their management, ethical and financial aspects.
Implementing large projects in software engineering courses
NASA Astrophysics Data System (ADS)
Coppit, David
2006-03-01
In software engineering education, large projects are widely recognized as a useful way of exposing students to the real-world difficulties of team software development. But large projects are difficult to put into practice. First, educators rarely have additional time to manage software projects. Second, classrooms have inherent limitations that threaten the realism of large projects. Third, quantitative evaluation of individuals who work in groups is notoriously difficult. As a result, many software engineering courses compromise the project experience by reducing the team sizes, project scope, and risk. In this paper, we present an approach to teaching a one-semester software engineering course in which 20 to 30 students work together to construct a moderately sized (15KLOC) software system. The approach combines carefully coordinated lectures and homeworks, a hierarchical project management structure, modern communication technologies, and a web-based project tracking and individual assessment system. Our approach provides a more realistic project experience for the students, without incurring significant additional overhead for the instructor. We present our experiences using the approach the last 2 years for the software engineering course at The College of William and Mary. Although the approach has some weaknesses, we believe that they are strongly outweighed by the pedagogical benefits.
Elliptic complexes over C∗-algebras of compact operators
NASA Astrophysics Data System (ADS)
Krýsl, Svatopluk
2016-03-01
For a C∗-algebra A of compact operators and a compact manifold M, we prove that the Hodge theory holds for A-elliptic complexes of pseudodifferential operators acting on smooth sections of finitely generated projective A-Hilbert bundles over M. For these C∗-algebras and manifolds, we get a topological isomorphism between the cohomology groups of an A-elliptic complex and the space of harmonic elements of the complex. Consequently, the cohomology groups appear to be finitely generated projective C∗-Hilbert modules and especially, Banach spaces. We also prove that in the category of Hilbert A-modules and continuous adjointable Hilbert A-module homomorphisms, the property of a complex of being self-adjoint parametrix possessing characterizes the complexes of Hodge type.
Shi, Zhenyu; Wedd, Anthony G.; Gras, Sally L.
2013-01-01
The development of synthetic biology requires rapid batch construction of large gene networks from combinations of smaller units. Despite the availability of computational predictions for well-characterized enzymes, the optimization of most synthetic biology projects requires combinational constructions and tests. A new building-brick-style parallel DNA assembly framework for simple and flexible batch construction is presented here. It is based on robust recombination steps and allows a variety of DNA assembly techniques to be organized for complex constructions (with or without scars). The assembly of five DNA fragments into a host genome was performed as an experimental demonstration. PMID:23468883
Performance study of large area encoding readout MRPC
NASA Astrophysics Data System (ADS)
Chen, X. L.; Wang, Y.; Chen, G.; Han, D.; Wang, X.; Zeng, M.; Zeng, Z.; Zhao, Z.; Guo, B.
2018-02-01
Muon tomography system built by the 2-D readout high spatial resolution Multi-gap Resistive Plate Chamber (MRPC) detector is a project of Tsinghua University. An encoding readout method based on the fine-fine configuration has been used to minimize the number of the readout electronic channels resulting in reducing the complexity and the cost of the system. In this paper, we provide a systematic comparison of the MRPC detector performance with and without fine-fine encoding readout. Our results suggest that the application of the fine-fine encoding readout leads us to achieve a detecting system with slightly worse spatial resolution but dramatically reduce the number of electronic channels.
Remediation of the protein data bank archive.
Henrick, Kim; Feng, Zukang; Bluhm, Wolfgang F; Dimitropoulos, Dimitris; Doreleijers, Jurgen F; Dutta, Shuchismita; Flippen-Anderson, Judith L; Ionides, John; Kamada, Chisa; Krissinel, Eugene; Lawson, Catherine L; Markley, John L; Nakamura, Haruki; Newman, Richard; Shimizu, Yukiko; Swaminathan, Jawahar; Velankar, Sameer; Ory, Jeramia; Ulrich, Eldon L; Vranken, Wim; Westbrook, John; Yamashita, Reiko; Yang, Huanwang; Young, Jasmine; Yousufuddin, Muhammed; Berman, Helen M
2008-01-01
The Worldwide Protein Data Bank (wwPDB; wwpdb.org) is the international collaboration that manages the deposition, processing and distribution of the PDB archive. The online PDB archive at ftp://ftp.wwpdb.org is the repository for the coordinates and related information for more than 47 000 structures, including proteins, nucleic acids and large macromolecular complexes that have been determined using X-ray crystallography, NMR and electron microscopy techniques. The members of the wwPDB-RCSB PDB (USA), MSD-EBI (Europe), PDBj (Japan) and BMRB (USA)-have remediated this archive to address inconsistencies that have been introduced over the years. The scope and methods used in this project are presented.
Unmanned Vehicle Guidance Using Video Camera/Vehicle Model
NASA Technical Reports Server (NTRS)
Sutherland, T.
1999-01-01
A video guidance sensor (VGS) system has flown on both STS-87 and STS-95 to validate a single camera/target concept for vehicle navigation. The main part of the image algorithm was the subtraction of two consecutive images using software. For a nominal size image of 256 x 256 pixels this subtraction can take a large portion of the time between successive frames in standard rate video leaving very little time for other computations. The purpose of this project was to integrate the software subtraction into hardware to speed up the subtraction process and allow for more complex algorithms to be performed, both in hardware and software.
A novel spatial-temporal detection method of dim infrared moving small target
NASA Astrophysics Data System (ADS)
Chen, Zhong; Deng, Tao; Gao, Lei; Zhou, Heng; Luo, Song
2014-09-01
Moving small target detection under complex background in infrared image sequence is one of the major challenges of modern military in Early Warning Systems (EWS) and the use of Long-Range Strike (LRS). However, because of the low SNR and undulating background, the infrared moving small target detection is a difficult problem in a long time. To solve this problem, a novel spatial-temporal detection method based on bi-dimensional empirical mode decomposition (EMD) and time-domain difference is proposed in this paper. This method is downright self-data decomposition and do not rely on any transition kernel function, so it has a strong adaptive capacity. Firstly, we generalized the 1D EMD algorithm to the 2D case. In this process, the project has solved serial issues in 2D EMD, such as large amount of data operations, define and identify extrema in 2D case, and two-dimensional signal boundary corrosion. The EMD algorithm studied in this project can be well adapted to the automatic detection of small targets under low SNR and complex background. Secondly, considering the characteristics of moving target, we proposed an improved filtering method based on three-frame difference on basis of the original difference filtering in time-domain, which greatly improves the ability of anti-jamming algorithm. Finally, we proposed a new time-space fusion method based on a combined processing of 2D EMD and improved time-domain differential filtering. And, experimental results show that this method works well in infrared small moving target detection under low SNR and complex background.
Surface inspection system for carriage parts
NASA Astrophysics Data System (ADS)
Denkena, Berend; Acker, Wolfram
2006-04-01
Quality standards are very high in carriage manufacturing, due to the fact, that the visual quality impression is highly relevant for the purchase decision for the customer. In carriage parts even very small dents can be visible on the varnished and polished surface by observing reflections. The industrial demands are to detect these form errors on the unvarnished part. In order to meet the requirements, a stripe projection system for automatic recognition of waviness and form errors is introduced1. It bases on a modified stripe projection method using a high resolution line scan camera. Particular emphasis is put on achieving a short measuring time and a high resolution in depth, aiming at a reliable automatic recognition of dents and waviness of 10 μm on large curved surfaces of approximately 1 m width. The resulting point cloud needs to be filtered in order to detect dents. Therefore a spatial filtering technique is used. This works well on smoothly curved surfaces, if frequency parameters are well defined. On more complex parts like mudguards the method is restricted by the fact that frequencies near the define dent frequencies occur within the surface as well. To allow analysis of complex parts, the system is currently extended by including 3D CAD models into the process of inspection. For smoothly curved surfaces, the measuring speed of the prototype is mainly limited by the amount of light produced by the stripe projector. For complex surfaces the measuring speed is limited by the time consuming matching process. Currently, the development focuses on the improvement of the measuring speed.
Low order climate models as a tool for cross-disciplinary collaboration
NASA Astrophysics Data System (ADS)
Newton, R.; Pfirman, S. L.; Tremblay, B.; Schlosser, P.
2014-12-01
Human impacts on climate are pervasive and significant and project future states cannot be projected without taking human influence into account. We recently helped convene a meeting of climatologists, policy analysts, lawyers and social scientists to discuss the dramatic loss in Arctic summer sea ice. A dialogue emerged around distinct time scales in the integrated human/natural climate system. Climate scientists tended to discuss engineering solutions as though they could be implemented immediately, whereas lags of 2 or more decades were estimated by social scientists for societal shifts and similar lags were cited for deployment by the engineers. Social scientists tended to project new climate states virtually overnight, while climatologists described time scales of decades to centuries for the system to respond to changes in forcing functions. For the conversation to develop, the group had to come to grips with an increasingly complex set of transient effect time scales and lags between decisions, changes in forcing, and system outputs. We use several low-order dynamical system models to explore mismatched timescales, ranges of lags, and uncertainty in cost estimates on climate outcomes, focusing on Arctic-specific issues. In addition to lessons regarding what is/isn't feasible from a policy and engineering perspective, these models provide a useful tool to concretize cross-disciplinary thinking. They are fast and easy to iterate through a large region of the problem space, while including surprising complexity in their evolution. Thus they are appropriate for investigating the implications of policy in an efficient, but not unrealistic physical setting. (Earth System Models, by contrast, can be too resource- and time-intensive for iteratively testing "what if" scenarios in cross-disciplinary collaborations.) Our runs indicate, for example, that the combined social, engineering and climate physics lags make it extremely unlikely that an ice-free summer ecology in the Arctic can be avoided. Further, if prospective remediation strategies are successful, a return to perennial ice conditions between one and two centuries from now is entirely likely, with interesting and large impacts on Northern economies.
Leisner, Courtney P; Wood, Joshua C; Vaillancourt, Brieanne; Tang, Ying; Douches, Dave S; Robin Buell, C; Winkler, Julie A
2018-04-01
Understanding the impacts of climate change on agriculture is essential to ensure adequate future food production. Controlled growth experiments provide an effective tool for assessing the complex effects of climate change. However, a review of the use of climate projections in 57 previously published controlled growth studies found that none considered within-season variations in projected future temperature change, and few considered regional differences in future warming. A fixed, often arbitrary, temperature perturbation typically was applied for the entire growing season. This study investigates the utility of employing more complex climate change scenarios in growth chamber experiments. A case study in potato was performed using three dynamically downscaled climate change projections for the mid-twenty-first century that differ in terms of the timing during the growing season of the largest projected temperature changes. The climate projections were used in growth chamber experiments for four elite potato cultivars commonly planted in Michigan's major potato growing region. The choice of climate projection had a significant influence on the sign and magnitude of the projected changes in aboveground biomass and total tuber count, whereas all projections suggested an increase in total tuber weight and a decrease in specific gravity, a key market quality trait for potato, by mid-century. These results demonstrate that the use of more complex climate projections that extend beyond a simple incremental change can provide additional insights into the future impacts of climate change on crop production and the accompanying uncertainty.
Lesion insertion in the projection domain: Methods and initial results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Baiyu; Leng, Shuai; Yu, Lifeng
2015-12-15
Purpose: To perform task-based image quality assessment in CT, it is desirable to have a large number of realistic patient images with known diagnostic truth. One effective way of achieving this objective is to create hybrid images that combine patient images with inserted lesions. Because conventional hybrid images generated in the image domain fails to reflect the impact of scan and reconstruction parameters on lesion appearance, this study explored a projection-domain approach. Methods: Lesions were segmented from patient images and forward projected to acquire lesion projections. The forward-projection geometry was designed according to a commercial CT scanner and accommodated bothmore » axial and helical modes with various focal spot movement patterns. The energy employed by the commercial CT scanner for beam hardening correction was measured and used for the forward projection. The lesion projections were inserted into patient projections decoded from commercial CT projection data. The combined projections were formatted to match those of commercial CT raw data, loaded onto a commercial CT scanner, and reconstructed to create the hybrid images. Two validations were performed. First, to validate the accuracy of the forward-projection geometry, images were reconstructed from the forward projections of a virtual ACR phantom and compared to physically acquired ACR phantom images in terms of CT number accuracy and high-contrast resolution. Second, to validate the realism of the lesion in hybrid images, liver lesions were segmented from patient images and inserted back into the same patients, each at a new location specified by a radiologist. The inserted lesions were compared to the original lesions and visually assessed for realism by two experienced radiologists in a blinded fashion. Results: For the validation of the forward-projection geometry, the images reconstructed from the forward projections of the virtual ACR phantom were consistent with the images physically acquired for the ACR phantom in terms of Hounsfield unit and high-contrast resolution. For the validation of the lesion realism, lesions of various types were successfully inserted, including well circumscribed and invasive lesions, homogeneous and heterogeneous lesions, high-contrast and low-contrast lesions, isolated and vessel-attached lesions, and small and large lesions. The two experienced radiologists who reviewed the original and inserted lesions could not identify the lesions that were inserted. The same lesion, when inserted into the projection domain and reconstructed with different parameters, demonstrated a parameter-dependent appearance. Conclusions: A framework has been developed for projection-domain insertion of lesions into commercial CT images, which can be potentially expanded to all geometries of CT scanners. Compared to conventional image-domain methods, the authors’ method reflected the impact of scan and reconstruction parameters on lesion appearance. Compared to prior projection-domain methods, the authors’ method has the potential to achieve higher anatomical complexity by employing clinical patient projections and real patient lesions.« less
Lesion insertion in the projection domain: Methods and initial results
Chen, Baiyu; Leng, Shuai; Yu, Lifeng; Yu, Zhicong; Ma, Chi; McCollough, Cynthia
2015-01-01
Purpose: To perform task-based image quality assessment in CT, it is desirable to have a large number of realistic patient images with known diagnostic truth. One effective way of achieving this objective is to create hybrid images that combine patient images with inserted lesions. Because conventional hybrid images generated in the image domain fails to reflect the impact of scan and reconstruction parameters on lesion appearance, this study explored a projection-domain approach. Methods: Lesions were segmented from patient images and forward projected to acquire lesion projections. The forward-projection geometry was designed according to a commercial CT scanner and accommodated both axial and helical modes with various focal spot movement patterns. The energy employed by the commercial CT scanner for beam hardening correction was measured and used for the forward projection. The lesion projections were inserted into patient projections decoded from commercial CT projection data. The combined projections were formatted to match those of commercial CT raw data, loaded onto a commercial CT scanner, and reconstructed to create the hybrid images. Two validations were performed. First, to validate the accuracy of the forward-projection geometry, images were reconstructed from the forward projections of a virtual ACR phantom and compared to physically acquired ACR phantom images in terms of CT number accuracy and high-contrast resolution. Second, to validate the realism of the lesion in hybrid images, liver lesions were segmented from patient images and inserted back into the same patients, each at a new location specified by a radiologist. The inserted lesions were compared to the original lesions and visually assessed for realism by two experienced radiologists in a blinded fashion. Results: For the validation of the forward-projection geometry, the images reconstructed from the forward projections of the virtual ACR phantom were consistent with the images physically acquired for the ACR phantom in terms of Hounsfield unit and high-contrast resolution. For the validation of the lesion realism, lesions of various types were successfully inserted, including well circumscribed and invasive lesions, homogeneous and heterogeneous lesions, high-contrast and low-contrast lesions, isolated and vessel-attached lesions, and small and large lesions. The two experienced radiologists who reviewed the original and inserted lesions could not identify the lesions that were inserted. The same lesion, when inserted into the projection domain and reconstructed with different parameters, demonstrated a parameter-dependent appearance. Conclusions: A framework has been developed for projection-domain insertion of lesions into commercial CT images, which can be potentially expanded to all geometries of CT scanners. Compared to conventional image-domain methods, the authors’ method reflected the impact of scan and reconstruction parameters on lesion appearance. Compared to prior projection-domain methods, the authors’ method has the potential to achieve higher anatomical complexity by employing clinical patient projections and real patient lesions. PMID:26632058
Efficient data management in a large-scale epidemiology research project.
Meyer, Jens; Ostrzinski, Stefan; Fredrich, Daniel; Havemann, Christoph; Krafczyk, Janina; Hoffmann, Wolfgang
2012-09-01
This article describes the concept of a "Central Data Management" (CDM) and its implementation within the large-scale population-based medical research project "Personalized Medicine". The CDM can be summarized as a conjunction of data capturing, data integration, data storage, data refinement, and data transfer. A wide spectrum of reliable "Extract Transform Load" (ETL) software for automatic integration of data as well as "electronic Case Report Forms" (eCRFs) was developed, in order to integrate decentralized and heterogeneously captured data. Due to the high sensitivity of the captured data, high system resource availability, data privacy, data security and quality assurance are of utmost importance. A complex data model was developed and implemented using an Oracle database in high availability cluster mode in order to integrate different types of participant-related data. Intelligent data capturing and storage mechanisms are improving the quality of data. Data privacy is ensured by a multi-layered role/right system for access control and de-identification of identifying data. A well defined backup process prevents data loss. Over the period of one and a half year, the CDM has captured a wide variety of data in the magnitude of approximately 5terabytes without experiencing any critical incidents of system breakdown or loss of data. The aim of this article is to demonstrate one possible way of establishing a Central Data Management in large-scale medical and epidemiological studies. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Academy Sharing Knowledge (ASK). The NASA Source for Project Management Magazine. Volume 5
NASA Technical Reports Server (NTRS)
Post, Todd (Editor)
2001-01-01
How big is your project world? Is it big enough to contain other cultures, headquarters, hierarchies, and weird harpoon-like guns? Sure it is. The great American poet Walt Whitman said it best, 'I am large/I contain multitudes.' And so must you, Mr. and Ms. Project Manager. In this issue of ASK, we look outside the project box. See how several talented project managers have expanded their definition of project scope to include managing environments outside the systems and subsystems under their care. Here's a sampling of what we've put together for you this issue: In 'Three Screws Missing,' Mike Skidmore tells about his adventures at the Plesetek Cosmodrome in northern Russia. Ray Morgan in his story, 'Our Man in Kauai,' suggests we take a broader view of what's meant by 'the team.' Jenny Baer-Riedhart, the NASA program manager on the same Pathfinder solar-powered airplane, schools us in how to sell a program to Headquarters in 'Know Thyself--But Don't Forget to Learn About the Customer Too.' Scott Cameron of Proctor and Gamble talks about sharpening your hierarchical IQ in 'The Project Manager and the Hour Glass.' Mike Jansen in 'The Lawn Dart' describes how he and the 'voodoo crew' on the Space Shuttle Advanced Solid Rocket Motor program borrowed a harpoon-like gun from the Coast Guard to catch particles inside of a plume. These are just some of the stories you'll find in ASK this issue. We hope they cause you to stop and reflect on your own project's relationship to the world outside. We are also launching a new section this issue, 'There are No Mistakes, Only Lessons.' No stranger to ASK readers, Terry Little inaugurates this new section with his article 'The Don Quixote Complex.'
The ‘spiteful’ origins of human cooperation
Marlowe, Frank W.; Berbesque, J. Colette; Barrett, Clark; Bolyanatz, Alexander; Gurven, Michael; Tracer, David
2011-01-01
We analyse generosity, second-party (‘spiteful’) punishment (2PP), and third-party (‘altruistic’) punishment (3PP) in a cross-cultural experimental economics project. We show that smaller societies are less generous in the Dictator Game but no less prone to 2PP in the Ultimatum Game. We might assume people everywhere would be more willing to punish someone who hurt them directly (2PP) than someone who hurt an anonymous third person (3PP). While this is true of small societies, people in large societies are actually more likely to engage in 3PP than 2PP. Strong reciprocity, including generous offers and 3PP, exists mostly in large, complex societies that face numerous challenging collective action problems. We argue that ‘spiteful’ 2PP, motivated by the basic emotion of anger, is more universal than 3PP and sufficient to explain the origins of human cooperation. PMID:21159680
The Web Based Monitoring Project at the CMS Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopez-Perez, Juan Antonio; Badgett, William; Behrens, Ulf
The Compact Muon Solenoid is a large a complex general purpose experiment at the CERN Large Hadron Collider (LHC), built and maintained by many collaborators from around the world. Efficient operation of the detector requires widespread and timely access to a broad range of monitoring and status information. To the end the Web Based Monitoring (WBM) system was developed to present data to users located anywhere from many underlying heterogeneous sources, from real time messaging systems to relational databases. This system provides the power to combine and correlate data in both graphical and tabular formats of interest to the experimenters,more » including data such as beam conditions, luminosity, trigger rates, detector conditions, and many others, allowing for flexibility on the user’s side. This paper describes the WBM system architecture and describes how the system has been used from the beginning of data taking until now (Run1 and Run 2).« less
The web based monitoring project at the CMS experiment
NASA Astrophysics Data System (ADS)
Lopez-Perez, Juan Antonio; Badgett, William; Behrens, Ulf; Chakaberia, Irakli; Jo, Youngkwon; Maeshima, Kaori; Maruyama, Sho; Patrick, James; Rapsevicius, Valdas; Soha, Aron; Stankevicius, Mantas; Sulmanas, Balys; Toda, Sachiko; Wan, Zongru
2017-10-01
The Compact Muon Solenoid is a large a complex general purpose experiment at the CERN Large Hadron Collider (LHC), built and maintained by many collaborators from around the world. Efficient operation of the detector requires widespread and timely access to a broad range of monitoring and status information. To that end the Web Based Monitoring (WBM) system was developed to present data to users located anywhere from many underlying heterogeneous sources, from real time messaging systems to relational databases. This system provides the power to combine and correlate data in both graphical and tabular formats of interest to the experimenters, including data such as beam conditions, luminosity, trigger rates, detector conditions, and many others, allowing for flexibility on the user’s side. This paper describes the WBM system architecture and describes how the system has been used from the beginning of data taking until now (Run1 and Run 2).
Development of Telecommunications of Prao ASC Lpi RAS
NASA Astrophysics Data System (ADS)
Isaev, E. A.; Dumskiy, D. V.; Likhachev, S. F.; Shatskaya, M. V.; Pugachev, V. D.; Samodurov, V. A.
The new modern and reliable data storage system was acquired in 2010 in order to develop internal telecommunication resources of the Observatory. The system is designed for store large amounts of observation data obtained from the three radio-astronomy complexes (PT-22, DKR-1000 and BSA). The digital switching system - "Elcom" is installed in the Pushchino Radio Astronomy Observatory to ensure the observatory by phone communications. The phone communication between buildings of the observatory carried out over fiber-optic data links by using the ip-telephony. The direct optical channel from tracking station RT-22 in Pushchino to Moscow processing center has been created and put into operation to transfer large amounts of data at the final stage of the establishment of ground infrastructure for the international space project "Radioastron". A separate backup system for processing and storing data is organized in Pushchino Radio Astronomy Observatory to eliminate data loss during communication sessions with the Space Telescope.
Carbon nanotube circuit integration up to sub-20 nm channel lengths.
Shulaker, Max Marcel; Van Rethy, Jelle; Wu, Tony F; Liyanage, Luckshitha Suriyasena; Wei, Hai; Li, Zuanyi; Pop, Eric; Gielen, Georges; Wong, H-S Philip; Mitra, Subhasish
2014-04-22
Carbon nanotube (CNT) field-effect transistors (CNFETs) are a promising emerging technology projected to achieve over an order of magnitude improvement in energy-delay product, a metric of performance and energy efficiency, compared to silicon-based circuits. However, due to substantial imperfections inherent with CNTs, the promise of CNFETs has yet to be fully realized. Techniques to overcome these imperfections have yielded promising results, but thus far only at large technology nodes (1 μm device size). Here we demonstrate the first very large scale integration (VLSI)-compatible approach to realizing CNFET digital circuits at highly scaled technology nodes, with devices ranging from 90 nm to sub-20 nm channel lengths. We demonstrate inverters functioning at 1 MHz and a fully integrated CNFET infrared light sensor and interface circuit at 32 nm channel length. This demonstrates the feasibility of realizing more complex CNFET circuits at highly scaled technology nodes.
The 'spiteful' origins of human cooperation.
Marlowe, Frank W; Berbesque, J Colette; Barrett, Clark; Bolyanatz, Alexander; Gurven, Michael; Tracer, David
2011-07-22
We analyse generosity, second-party ('spiteful') punishment (2PP), and third-party ('altruistic') punishment (3PP) in a cross-cultural experimental economics project. We show that smaller societies are less generous in the Dictator Game but no less prone to 2PP in the Ultimatum Game. We might assume people everywhere would be more willing to punish someone who hurt them directly (2PP) than someone who hurt an anonymous third person (3PP). While this is true of small societies, people in large societies are actually more likely to engage in 3PP than 2PP. Strong reciprocity, including generous offers and 3PP, exists mostly in large, complex societies that face numerous challenging collective action problems. We argue that 'spiteful' 2PP, motivated by the basic emotion of anger, is more universal than 3PP and sufficient to explain the origins of human cooperation.
Namaste (counterbalancing) technique: Overcoming warping in costal cartilage
Agrawal, Kapil S.; Bachhav, Manoj; Shrotriya, Raghav
2015-01-01
Background: Indian noses are broader and lack projection as compared to other populations, hence very often need augmentation, that too by large volume. Costal cartilage remains the material of choice in large volume augmentations and repair of complex primary and secondary nasal deformities. One major disadvantage of costal cartilage grafts (CCG) which offsets all other advantages is the tendency to warp and become distorted over a period of time. We propose a simple technique to overcome this menace of warping. Materials and Methods: We present the data of 51 patients of rhinoplasty done using CCG with counterbalancing technique over a period of 4 years. Results: No evidence of warping was found in any patient up to a maximum follow-up period of 4 years. Conclusion: Counterbalancing is a useful technique to overcome the problem of warping. It gives liberty to utilize even unbalanced cartilage safely to provide desired shape and use the cartilage without any wastage. PMID:26424973
Namaste (counterbalancing) technique: Overcoming warping in costal cartilage.
Agrawal, Kapil S; Bachhav, Manoj; Shrotriya, Raghav
2015-01-01
Indian noses are broader and lack projection as compared to other populations, hence very often need augmentation, that too by large volume. Costal cartilage remains the material of choice in large volume augmentations and repair of complex primary and secondary nasal deformities. One major disadvantage of costal cartilage grafts (CCG) which offsets all other advantages is the tendency to warp and become distorted over a period of time. We propose a simple technique to overcome this menace of warping. We present the data of 51 patients of rhinoplasty done using CCG with counterbalancing technique over a period of 4 years. No evidence of warping was found in any patient up to a maximum follow-up period of 4 years. Counterbalancing is a useful technique to overcome the problem of warping. It gives liberty to utilize even unbalanced cartilage safely to provide desired shape and use the cartilage without any wastage.
3D exploitation of large urban photo archives
NASA Astrophysics Data System (ADS)
Cho, Peter; Snavely, Noah; Anderson, Ross
2010-04-01
Recent work in computer vision has demonstrated the potential to automatically recover camera and scene geometry from large collections of uncooperatively-collected photos. At the same time, aerial ladar and Geographic Information System (GIS) data are becoming more readily accessible. In this paper, we present a system for fusing these data sources in order to transfer 3D and GIS information into outdoor urban imagery. Applying this system to 1000+ pictures shot of the lower Manhattan skyline and the Statue of Liberty, we present two proof-of-concept examples of geometry-based photo enhancement which are difficult to perform via conventional image processing: feature annotation and image-based querying. In these examples, high-level knowledge projects from 3D world-space into georegistered 2D image planes and/or propagates between different photos. Such automatic capabilities lay the groundwork for future real-time labeling of imagery shot in complex city environments by mobile smart phones.
Ackerman, Joshua T.; Herzog, Mark P.; Hartman, Christopher A.; Watts, Trevor C.; Barr, Jarred R.
2014-01-01
The conversion of 50–90 percent of 15,100 acres of former salt evaporation ponds to tidal marsh habitat in the south San Francisco Bay, California, is planned as part of the South Bay Salt Pond Restoration Project. This large-scale habitat restoration may change the bioavailability of methylmercury. The South Bay already is known to have high methylmercury concentrations, with methylmercury concentrations in several waterbirds species more than known toxicity thresholds where avian reproduction is impaired. In this 2013 study, we continued monitoring bird egg mercury concentrations in response to the restoration of the Pond A8/A7/A5 Complex to a potential tidal marsh in the future. The restoration of the Pond A8/A7/A5 Complex began in autumn 2010, and the Pond A8 Notch was opened 5 feet (one of eight gates) to muted tidal action on June 1, 2011, and then closed in the winter. In autumn 2010, internal levees between Ponds A8, A7, and A5 were breached and water depths were substantially increased by flooding the Pond A8/A7/A5 Complex in February 2011. In June 2012, 15 feet (three of eight gates) of the Pond A8 Notch was opened, and then closed in December 2012. In June 2013, 15 feet of the Pond A8 Notch again was opened, and the Pond A8/A7/A5 Complex was a relatively deep and large pond with muted tidal action in the summer. This report synthesizes waterbird data from the 2013 breeding season, and combines it with our prior study’s data from 2010 and 2011.
Haspel, Nurit; Geisbrecht, Brian V; Lambris, John; Kavraki, Lydia
2010-03-01
We present a novel multi-level methodology to explore and characterize the low energy landscape and the thermodynamics of proteins. Traditional conformational search methods typically explore only a small portion of the conformational space of proteins and are hard to apply to large proteins due to the large amount of calculations required. In our multi-scale approach, we first provide an initial characterization of the equilibrium state ensemble of a protein using an efficient computational conformational sampling method. We then enrich the obtained ensemble by performing short Molecular Dynamics (MD) simulations on selected conformations from the ensembles as starting points. To facilitate the analysis of the results, we project the resulting conformations on a low-dimensional landscape to efficiently focus on important interactions and examine low energy regions. This methodology provides a more extensive sampling of the low energy landscape than an MD simulation starting from a single crystal structure as it explores multiple trajectories of the protein. This enables us to obtain a broader view of the dynamics of proteins and it can help in understanding complex binding, improving docking results and more. In this work, we apply the methodology to provide an extensive characterization of the bound complexes of the C3d fragment of human Complement component C3 and one of its powerful bacterial inhibitors, the inhibitory domain of Staphylococcus aureus extra-cellular fibrinogen-binding domain (Efb-C) and two of its mutants. We characterize several important interactions along the binding interface and define low free energy regions in the three complexes. Proteins 2010. (c) 2009 Wiley-Liss, Inc.
Transportation Infrastructure: Central Artery/Tunnel Project Faces Continued Financial Uncertainties
DOT National Transportation Integrated Search
1996-05-01
At a cost of over $1 billion a mile, the Central Artery/Tunnel project - an Interstate Highway System project in Boston, Massachusetts - is one of the largest, most complex, and most expensive highway construction projects ever undertaken. In respons...
ERIC Educational Resources Information Center
Lynch, Margaret M.
2013-01-01
This study explores the meaning project managers (PMs) make of their project environment, how they lead their teams and have incorporate complexity into their project management approach. The exploration of the PM's developmental level and meaning making offers a different angle on the project management and leadership literature. The study…
Chloroplast 2010: A Database for Large-Scale Phenotypic Screening of Arabidopsis Mutants1[W][OA
Lu, Yan; Savage, Linda J.; Larson, Matthew D.; Wilkerson, Curtis G.; Last, Robert L.
2011-01-01
Large-scale phenotypic screening presents challenges and opportunities not encountered in typical forward or reverse genetics projects. We describe a modular database and laboratory information management system that was implemented in support of the Chloroplast 2010 Project, an Arabidopsis (Arabidopsis thaliana) reverse genetics phenotypic screen of more than 5,000 mutants (http://bioinfo.bch.msu.edu/2010_LIMS; www.plastid.msu.edu). The software and laboratory work environment were designed to minimize operator error and detect systematic process errors. The database uses Ruby on Rails and Flash technologies to present complex quantitative and qualitative data and pedigree information in a flexible user interface. Examples are presented where the database was used to find opportunities for process changes that improved data quality. We also describe the use of the data-analysis tools to discover mutants defective in enzymes of leucine catabolism (heteromeric mitochondrial 3-methylcrotonyl-coenzyme A carboxylase [At1g03090 and At4g34030] and putative hydroxymethylglutaryl-coenzyme A lyase [At2g26800]) based upon a syndrome of pleiotropic seed amino acid phenotypes that resembles previously described isovaleryl coenzyme A dehydrogenase (At3g45300) mutants. In vitro assay results support the computational annotation of At2g26800 as hydroxymethylglutaryl-coenzyme A lyase. PMID:21224340