Science.gov

Sample records for agile development methodologies

  1. Towards an Understanding of the Conceptual Underpinnings of Agile Development Methodologies

    NASA Astrophysics Data System (ADS)

    Nerur, Sridhar; Cannon, Alan; Balijepally, Venugopal; Bond, Philip

    While the growing popularity of agile development methodologies is undeniable, there has been little systematic exploration of their intellectual foundations. Such an effort would be an important first step in understanding this paradigm's underlying premises. This understanding, in turn, would be invaluable in our assessment of current practices as well as in our efforts to advance the field of software engineering. Drawing on a variety of sources, both within and outside the discipline, we argue that the concepts underpinning agile development methodologies are by no means novel. In the tradition of General Systems Theory, this paper advocates a transdisciplinary examination of agile development methodologies to extend the intellectual boundaries of software development. This is particularly important as the field moves beyond instrumental processes aimed at satisfying mere technical considerations.

  2. Integrating Low-Cost Rapid Usability Testing into Agile System Development of Healthcare IT: A Methodological Perspective.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    The development of more usable and effective healthcare information systems has become a critical issue. In the software industry, methodologies such as agile and iterative development processes have emerged that lead to more effective and usable systems. These approaches highlight a focus on user needs and promote iterative and flexible development practices. Evaluation and testing of iterative agile development cycles is considered an important part of the agile methodology and of iterative processes for system design and re-design. However, the issue of how to effectively integrate usability testing methods into rapid and flexible agile design cycles has yet to be fully explored. In this paper we describe our application of an approach known as low-cost rapid usability testing as it has been applied within agile system development in healthcare. The advantages of the integrative approach are described, along with current methodological considerations.

  3. Some Findings Concerning Requirements in Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases in the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and the management of requirements dependencies.

  4. The Impacts of Agile Development Methodology Use on Project Success: A Contingency View

    ERIC Educational Resources Information Center

    Tripp, John F.

    2012-01-01

    Agile Information Systems Development Methods have emerged in the past decade as an alternative manner of managing the work and delivery of information systems development teams, with a large number of organizations reporting the adoption & use of agile methods. The practitioners of these methods make broad claims as to the benefits of their…

  5. Using the Agile Development Methodology and Applying Best Practice Project Management Processes

    DTIC Science & Technology

    2014-12-01

    detrimental to a system architecture function. Proponents of Agile argue that developers in waterfall development get trapped in analysis ...limited to very small web-based socio-technical systems (Kruchten 2010, 497). 2. Agile Team Responsibilities: So, who on the scrum team is...the use of a sprint 0, in which the system architect, heretofore referred to in the singular but reflecting an individual or team approach, will take the

  6. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of requirement changes at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  7. Teaching Agile Software Development: A Case Study

    ERIC Educational Resources Information Center

    Devedzic, V.; Milenkovic, S. R.

    2011-01-01

    This paper describes the authors' experience of teaching agile software development to students of computer science, software engineering, and other related disciplines, and comments on the implications of this and the lessons learned. It is based on the authors' eight years of experience in teaching agile software methodologies to various groups…

  8. Inserting Agility in System Development

    DTIC Science & Technology

    2012-07-01

    Agile IT Acquisition, IT Box, Scrum. Inserting Agility in System Development, Matthew R. Kennedy and Lt Col Dan Ward, USAF. With the fast-paced nature...1,700 individuals and 71 countries, found Scrum and eXtreme Programming to be the most widely followed methodologies (VersionOne, 2007). Other...University, http://www.dau.mil. Defense ARJ, July 2012, Vol. 19 No. 3: 249–264. Scrum is a framework used for project management, which is
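    The Scrum vocabulary recurring throughout these records (product backlog, sprint, user story) can be sketched as a minimal data model. All class and function names below are hypothetical illustrations, not drawn from any of the cited reports:

```python
from dataclasses import dataclass, field

@dataclass
class Story:
    """A product-backlog item (a requirement or user story)."""
    title: str
    points: int          # relative size estimate
    done: bool = False

@dataclass
class Sprint:
    """A fixed-length iteration with a goal and a committed set of stories."""
    goal: str
    stories: list = field(default_factory=list)

    def committed_points(self) -> int:
        return sum(s.points for s in self.stories)

def plan_sprint(goal: str, backlog: list, capacity: int) -> Sprint:
    """Pull stories from the priority-ordered backlog until capacity is reached."""
    sprint = Sprint(goal)
    for story in backlog:
        if sprint.committed_points() + story.points <= capacity:
            sprint.stories.append(story)
    return sprint
```

    In this sketch the backlog order encodes priority, so the team simply pulls top items that still fit the sprint's capacity, mirroring the sprint-planning step the records describe.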

  9. Agile Development of Advanced Prototypes

    DTIC Science & Technology

    2014-11-01

    genetically modified babies. A case where researchers supplemented women’s defective mitochondria with healthy mitochondria from a donor was...and immersive experience showing genetic engineering’s implications for the future of medicine. 15. SUBJECT TERMS: Agile Development, Games for...provoking perspective on genetic engineering’s implications for the future of medicine. Experiencing Living with Prostheses (Xense): During this period

  10. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies and processes.
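    The test-first practice mentioned above (automatic tests written before the small functional piece they exercise) can be illustrated with a minimal sketch. The function name and numbers here are hypothetical, not taken from the JPL work:

```python
import unittest

# Step 1: the test for the next small functional piece is written first;
# running it before step 2 fails, since complete_orbits does not yet exist.
class TestCompleteOrbits(unittest.TestCase):
    def test_counts_only_complete_orbits(self):
        self.assertEqual(complete_orbits(elapsed_min=200, period_min=90), 2)

# Step 2: only then is the minimal implementation written to make it pass.
def complete_orbits(elapsed_min: int, period_min: int) -> int:
    """Number of complete orbits in the elapsed time (partial orbits ignored)."""
    return elapsed_min // period_min
```

    The small step plus its pre-written test is then checked in, which is what makes the continuous-integration loop described in the abstract possible.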

  11. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies.

  12. Agile

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2013-01-01

    This is based on a previous talk on agile development. Methods for delivering software on a short cycle are described, including interactions with the customer, the effect on the team, and how to be more effective, streamlined and efficient.

  13. Development of an agility assessment module for preliminary fighter design

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Bauer, Brent; Biezad, Daniel; Hahn, Andrew

    1996-01-01

    A FORTRAN computer program is presented to perform agility analysis on fighter aircraft configurations. This code is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of the agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. FORTRAN programs were developed for two specific metrics, CCT (Combat Cycle Time) and PM (Pointing Margin), as part of the agility module. The validity of the code was evaluated by comparison with existing flight test data. Example trade studies using the agility module along with ACSYNT were conducted using Northrop F-20 Tigershark and McDonnell Douglas F/A-18 Hornet aircraft models. The sensitivity of the agility criteria to thrust loading and wing loading was investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements.

  14. An Agile Constructionist Mentoring Methodology for Software Projects in the High School

    ERIC Educational Resources Information Center

    Meerbaum-Salant, Orni; Hazzan, Orit

    2010-01-01

    This article describes the construction process and evaluation of the Agile Constructionist Mentoring Methodology (ACMM), a mentoring method for guiding software development projects in the high school. The need for such a methodology has arisen due to the complexity of mentoring software project development in the high school. We introduce the…

  15. Pilot users in agile development processes: motivational factors.

    PubMed

    Johannessen, Liv Karen; Gammon, Deede

    2010-01-01

    Despite a wealth of research on user participation, few studies offer insights into how to involve multi-organizational users in agile development methods. This paper is a case study of user involvement in developing a system for electronic laboratory requisitions using agile methodologies in a multi-organizational context. Building on an interpretive approach, we illuminate questions such as: How does collaboration between users and developers evolve and how might it be improved? What key motivational aspects are at play when users volunteer and continue contributing in the face of considerable added burdens? The study highlights how agile methods in themselves appear to facilitate mutually motivating collaboration between user groups and developers. Lessons learned for leveraging the advantages of agile development processes include acknowledging the substantial and ongoing contributions of users and their roles as co-designers of the system.

  16. Value Creation by Agile Projects: Methodology or Mystery?

    NASA Astrophysics Data System (ADS)

    Racheva, Zornitza; Daneva, Maya; Sikkel, Klaas

    Business value is a key concept in agile software development approaches. This paper presents results of a systematic review of literature on how business value is created by agile projects. We found that, with very few exceptions, most published studies take the concept of business value for granted and do not state what it means in general or in the specific study context. We could find no study that clearly indicates how exactly individual agile practices, or groups of them, create value and keep accumulating it over time. The key implication for research is that we have an incentive to pursue the study of value creation in agile projects by deploying empirical research methods.

  17. Development of a Computer Program for Analyzing Preliminary Aircraft Configurations in Relationship to Emerging Agility Metrics

    NASA Technical Reports Server (NTRS)

    Bauer, Brent

    1993-01-01

    This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.

  18. Exploring the possibility of modeling a genetic counseling guideline using agile methodology.

    PubMed

    Choi, Jeeyae

    2013-01-01

    Increased demand for genetic counseling services has heightened the necessity of a computerized genetic counseling decision support system. In order to develop an effective and efficient computerized system, modeling of a genetic counseling guideline is an essential step. In this pilot study, Agile methodology with the Unified Modeling Language (UML) was utilized to model a guideline. Thirteen tasks and 14 associated elements were extracted. Successfully constructed conceptual class and activity diagrams revealed that Agile methodology with UML is a suitable tool for modeling a genetic counseling guideline.

  19. Developing communications requirements for Agile Product Realization

    SciTech Connect

    Forsythe, C.; Ashby, M.R.

    1994-03-01

    Sandia National Laboratories has undertaken the Agile Product Realization for Innovative electroMEchanical Devices (A-PRIMED) pilot project to develop and implement technologies for agile design and manufacturing of electromechanical components. Emphasis on information-driven processes, concurrent engineering and multi-functional team communications makes computer-supported cooperative work critical to achieving significantly faster product development cycles. This report describes analyses conducted in developing communications requirements and a communications plan that addresses the unique communications demands of an agile enterprise.

  20. Planning, Estimating, and Monitoring Progress in Agile Systems Development Environments

    DTIC Science & Technology

    2010-04-01

    agilemanifesto.org ... NORTHROP GRUMMAN Agile Terminology (Term: Definition). Product Backlog: Requirements/User Stories to be completed. Iteration... Marion, McKelvey, & Uhl-Bien (2007). Leadership Quarterly, 18(4), 298-318. Agile Development Practices; Agile Project Management with Scrum, Ken Schwaber
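    Estimating and monitoring progress in the sense this record's title uses often reduces to simple velocity arithmetic: the remaining backlog divided by the story points historically completed per iteration, rounded up. A minimal sketch with hypothetical numbers:

```python
import math

def iterations_remaining(backlog_points: int, completed_history: list) -> int:
    """Estimate iterations left from historical velocity.

    Velocity is the mean story points completed per past iteration;
    the remaining backlog divided by velocity gives iterations left.
    """
    velocity = sum(completed_history) / len(completed_history)
    return math.ceil(backlog_points / velocity)
```

    For example, a 100-point backlog with a history of 18, 22, and 20 completed points per iteration (velocity 20) yields an estimate of 5 more iterations.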

  1. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Mainstream industry software development practice has moved from a traditional waterfall process to agile iterative development, which allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  2. Agile Manufacturing Development of Castings

    DTIC Science & Technology

    2007-11-02

    Consortium was tasked by GE Transportation Systems (GETS) with development of the IFE, a complex ductile iron casting for a commercial locomotive that... ductile iron foundry with this tooling, it was clear that castings with acceptable quality could not be made. These castings were on the GE...requirements. Therefore, the design specifies a thin-walled casting with complex structures, and the requirements demand tight dimensional tolerances and

  3. Impact of agile methodologies on team capacity in automotive radio-navigation projects

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Hutanu, A.; Volker, S.

    2017-01-01

    The development processes used in automotive radio-navigation projects are constantly under pressure to adapt. While the software development models are based on automotive production processes, the integration of peripheral components into an automotive system triggers a high number of requirement modifications. The use of traditional development models in the automotive industry pushes a team's development capacity to its boundaries. The root cause lies in the inflexibility of current processes and their limits of adaptation. This paper addresses a new project management approach for the development of radio-navigation projects. Understanding the weaknesses of currently used models helped us develop and integrate agile methodologies into the traditional development model structure. In the first part we focus on change management methods to reduce the inflow of requests for change. Established change management risk analysis processes enable project management to judge the impact of a requirement change and also give the project time to implement some changes. However, in big automotive radio-navigation projects the saved time is not enough to implement the large number of changes submitted to the project. In the second part of this paper we focus on increasing team capacity by integrating agile methodologies into the traditional model at critical project phases. The overall objective of this paper is to demonstrate the need for process adaptation in order to resolve project team capacity bottlenecks.

  4. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  5. Peridigm summary report : lessons learned in development with agile components.

    SciTech Connect

    Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John; Parks, Michael L.

    2011-09-01

    This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture which impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.

  6. Future Research in Agile Systems Development: Applying Open Innovation Principles Within the Agile Organisation

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Morgan, Lorraine

    A particular strength of agile approaches is that they move away from ‘introverted' development and intimately involve the customer in all areas of development, supposedly leading to the development of a more innovative and hence more valuable information system. However, we argue that a single customer representative is too narrow a focus to adopt and that involvement of stakeholders beyond the software development itself is still often quite weak and in some cases non-existent. In response, we argue that current thinking regarding innovation in agile development needs to be extended to include multiple stakeholders outside the business unit. This paper explores the intra-organisational applicability and implications of open innovation in agile systems development. Additionally, it argues for a different perspective of project management that includes collaboration and knowledge-sharing with other business units, customers, partners, and other relevant stakeholders pertinent to the business success of an organisation, thus embracing open innovation principles.

  7. The NERV Methodology: Non-Functional Requirements Elicitation, Reasoning and Validation in Agile Processes

    ERIC Educational Resources Information Center

    Domah, Darshan

    2013-01-01

    Agile software development has become very popular around the world in recent years, with methods such as Scrum and Extreme Programming (XP). Literature suggests that functionality is the primary focus in Agile processes while non-functional requirements (NFR) are either ignored or ill-defined. However, for software to be of good quality both…

  8. Distributed agile software development for the SKA

    NASA Astrophysics Data System (ADS)

    Wicenec, Andreas; Parsons, Rebecca; Kitaeff, Slava; Vinsen, Kevin; Wu, Chen; Nelson, Paul; Reed, David

    2012-09-01

    The SKA software will most probably be developed by many groups distributed across the globe and coming from different backgrounds, such as industry and research institutions. The SKA software subsystems will have to cover a very wide range of different areas, but they still have to react and work together like a single system to achieve the scientific goals and satisfy the challenging data flow requirements. Designing and developing such a system in a distributed fashion requires proper tools and the setup of an environment that allows for efficient detection and tracking of interface and integration issues, in particular in a timely way. Agile development can provide much faster feedback mechanisms and also much tighter collaboration between the customer (scientist) and the developer. Continuous integration and continuous deployment, on the other hand, can provide much faster feedback on integration issues from the system level to the subsystem developers. This paper describes the results obtained from trialing a potential SKA development environment based on existing science software development processes like ALMA, the expected distribution of the groups potentially involved in the SKA development, and experience gained in the development of large-scale commercial software projects.

  9. A Case Study of Coordination in Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Hole, Steinar; Moe, Nils Brede

    Global Software Development (GSD) has gained significant popularity as an emerging paradigm. Companies also show interest in applying agile approaches in distributed development to combine the advantages of both approaches. However, in their most radical forms, agile and GSD can be placed in each end of a plan-based/agile spectrum because of how work is coordinated. We describe how three GSD projects applying agile methods coordinate their work. We found that trust is needed to reduce the need of standardization and direct supervision when coordinating work in a GSD project, and that electronic chatting supports mutual adjustment. Further, co-location and modularization mitigates communication problems, enables agility in at least part of a GSD project, and renders the implementation of Scrum of Scrums possible.

  10. An agile implementation of SCRUM

    NASA Astrophysics Data System (ADS)

    Gannon, Michele

    Is Agile a way to cut corners? To some, the use of an Agile Software Development Methodology has a negative connotation: “Oh, you're just not producing any documentation.” So can a team with no experience in Agile successfully implement and use SCRUM?

  11. Agile informatics: application of agile project management to the development of a personal health application.

    PubMed

    Chung, Jeanhee; Pankey, Evan; Norris, Ryan J

    2007-10-11

    We describe the application of the Agile method (a short-iteration, user-responsive, measurable software development approach) to the project management of a modular personal health record, iHealthSpace, to be deployed to the patients and providers of a large academic primary care practice.

  12. Modern Enterprise Systems as Enablers of Agile Development

    NASA Astrophysics Data System (ADS)

    Fredriksson, Odd; Ljung, Lennart

    Traditional ES technology and traditional project management methods support and match each other. But they do not support the critical success conditions for ES development in an effective way. Although the findings from one case study of a successful modern ES change project are not strong empirical evidence, we carefully propose that the new modern ES technology supports and matches agile project management methods. In other words, it provides the required flexibility that makes it possible to put into practice the agile way of running projects, both for the system supplier and for the customer. In addition, we propose that the combination of modern ES technology and agile project management methods is more appropriate for supporting the realization of critical success conditions for ES development. The main purpose of this chapter is to compare critical success conditions for modern enterprise systems development projects with critical success conditions for agile information systems development projects.

  13. Agile Software Development in Defense Acquisition: A Mission Assurance Perspective

    DTIC Science & Technology

    2012-03-23

    Aerospace Report No. ATR-2012(9010)-2: Agile Software Development in Defense Acquisition. A Mission Assurance Perspective, March 23, 2012. Peter... Engineering and Technology Group. Approved for public release; distribution unlimited.

  14. Addressing the Barriers to Agile Development in DoD

    DTIC Science & Technology

    2015-05-01

    acquisition development. IT programs are subject to extensive documentation, reviews, and oversight that inhibit the speed and agility needed for IT. Major... Based on Program, Ops, and Technical Risk. Structuring an Agile Program; Notional: 6-month release with 4-week sprints; continual development... integration, and testing; monthly demonstration of capabilities to users; Gov’t testers, certifiers, and users involved early and often; minimizes

  15. Lean and Agile Development of the AITS Ground Software System

    NASA Astrophysics Data System (ADS)

    Richters, Mark; Dutruel, Etienne; Mecredy, Nicolas

    2013-08-01

    We present the ongoing development of a new ground software system used for integrating, testing and operating spacecraft. The Advanced Integration and Test Services (AITS) project aims at providing a solution for electrical ground support equipment and mission control systems in future Astrium Space Transportation missions. Traditionally ESA ground or flight software development projects are conducted according to a waterfall-like process as specified in the ECSS-E-40 standard promoted by ESA in the European industry. In AITS a decision was taken to adopt an agile development process. This work could serve as a reference for future ESA software projects willing to apply agile concepts.

  16. Developing a Model for Agile Supply: an Empirical Study from Iranian Pharmaceutical Supply Chain

    PubMed Central

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

    Agility is the fundamental characteristic of a supply chain needed for survival in turbulent markets, where environmental forces create additional uncertainty, resulting in higher risk in supply chain management. In addition, agility helps provide the right product, at the right time, to the consumer. The main goal of this research is therefore to improve supplier selection in the pharmaceutical industry according to the formative basic factors. Moreover, this paper shows how a company can configure its supply network to achieve an agile supply chain. The present article analyzes the supply part of the supply chain based on the SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). This methodology provides an analytical model that enables potential suppliers to be assessed against multiple criteria using both quantitative and qualitative measures. In addition, to prioritize the critical factors, the TOPSIS algorithm was used, a common multiple-attribute decision-making (MADM) technique. Finally, several factors such as delivery speed, planning and reorder segmentation, trust development, and material quantity adjustment are identified and prioritized as critical factors for agility in the supply of API. PMID:24250689
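    The TOPSIS technique named in this abstract is well defined: vector-normalize the decision matrix, weight it, measure each alternative's distance to the ideal and anti-ideal solutions, and rank by relative closeness. A minimal sketch, using hypothetical supplier data rather than the study's actual criteria or weights:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix  : one row per alternative, one column per criterion
    weights : criterion weights (summing to 1)
    benefit : True where higher is better, False for cost criteria
    Returns closeness coefficients in [0, 1]; higher is better.
    """
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal and anti-ideal values per criterion.
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - p) ** 2 for x, p in zip(row, ideal)))
        d_neg = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical suppliers scored on delivery speed (benefit)
# and cost (lower is better), equally weighted.
scores = topsis([[7, 9], [9, 7], [8, 8]], [0.5, 0.5], [True, False])
```

    In this toy example the second supplier (fastest, cheapest) dominates and receives a closeness coefficient of 1.0.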

  18. Creativity in Agile Systems Development: A Literature Review

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Wang, Xiaofeng; Fitzgerald, Brian

    Proponents of agile methods claim that enabling, fostering and driving creativity is the key motivation that differentiates agile methods from their more traditional, bureaucratic counterparts. However, there is very little rigorous research to support this claim. Like most of their predecessors, the development and promotion of these methods have been almost entirely driven by practitioners and consultants, with little objective validation from the research community. This lack of validation is particularly relevant for SMEs, given that many of their project teams typify the environment to which agile methods are most suited, i.e. small, co-located teams with diverse, blended skills in unstructured, sometimes even chaotic surroundings. This paper uses creativity theory as a lens to review the current agile method literature and establish how much we actually know about the extent to which creativity occurs in these agile environments. The study reveals many gaps and conflicts of opinion in the current body of knowledge and identifies many avenues for further research.

  19. A Capstone Course on Agile Software Development Using Scrum

    ERIC Educational Resources Information Center

    Mahnic, V.

    2012-01-01

    In this paper, an undergraduate capstone course in software engineering is described that not only exposes students to agile software development, but also makes it possible to observe the behavior of developers using Scrum for the first time. The course requires students to work as Scrum Teams, responsible for the implementation of a set of user…

  20. An Agile Methodology for Implementing Service-Oriented Architecture in Small and Medium Sized Organizations

    ERIC Educational Resources Information Center

    Laidlaw, Gregory

    2013-01-01

    The purpose of this study is to evaluate the use of Lean/Agile principles, using action research to develop and deploy new technology for Small and Medium sized enterprises. The research case was conducted at the Lapeer County Sheriff's Department and involves the initial deployment of a Service Oriented Architecture to alleviate the data…

  1. Impact of Agile Software Development Model on Software Maintainability

    ERIC Educational Resources Information Center

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  2. Agile Software Development Methods: A Comparative Review

    NASA Astrophysics Data System (ADS)

    Abrahamsson, Pekka; Oza, Nilay; Siponen, Mikko T.

    Although agile software development methods have caught the attention of software engineers and researchers worldwide, scientific research still remains quite scarce. The aim of this study is to order and make sense of the different agile approaches that have been proposed. This comparative review is performed from the standpoint of using the following features as the analytical perspectives: project management support, life-cycle coverage, type of practical guidance, adaptability in actual use, type of research objectives and existence of empirical evidence. The results show that agile software development methods cover, without offering any rationale, different phases of the software development life-cycle and that most of these methods fail to provide adequate project management support. Moreover, quite a few methods continue to offer little concrete guidance on how to use their solutions or how to adapt them in different development situations. Empirical evidence after ten years of application remains quite limited. Based on the results, new directions on agile methods are outlined.

  3. Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology

    NASA Astrophysics Data System (ADS)

    Macioł, Piotr; Michalik, Kazimierz

    2016-10-01

    Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is its high computational demand. Among other solutions, the parallelization of multiscale computations is promising. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelizing multiscale models based on the Agile Multiscale Methodology framework is discussed. A sequential, FEM-based macroscopic model has been combined with concurrently computed fine-scale models employing the MatCalc thermodynamic simulator. The main issues investigated in this work are (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the decrease in computation quality caused by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law. The problem of `delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.
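The speed-up bound the authors evaluate follows directly from Amdahl's law; a minimal sketch, where the 90% parallel fraction is just an example rather than a figure from the paper:

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Best-case speed-up by Amdahl's law: the serial fraction of the
    run time caps the benefit of adding more workers."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# If 90% of a multiscale run is fine-scale computation that parallelizes
# cleanly, 8 workers give at most ~4.7x, and even infinitely many
# workers could never exceed 10x (1 / 0.1).
speedup = amdahl_speedup(parallel_fraction=0.9, n_workers=8)
```

This is why the abstract focuses on the fine-scale sub-models: they dominate the parallelizable fraction, and the sequential macroscopic model bounds everything else.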

  4. Implementation of an agile maintenance mechanic assignment methodology

    NASA Astrophysics Data System (ADS)

    Jimenez, Jesus A.; Quintana, Rolando

    2000-10-01

    The objective of this research was to develop a decision support system (DSS) to study the impact of introducing new equipment into a medical apparel plant from a maintenance organizational structure perspective. The system enables the company to determine whether its capacity is sufficient to meet current maintenance challenges. The DSS contains two database sets that describe equipment and maintenance-resource profiles. The equipment profile specifies data such as mean time to failure, mean time to repair, and the minimum mechanic skill level required to fix each machine group. Similarly, the maintenance-resource profile reports information about the mechanic staff, such as the number and type of certifications received, education level, and experience. The DSS then uses this information to minimize machine downtime by assigning the highest-skilled mechanics to machines with higher complexity and product value. A modified version of the simplex method, the transportation problem, was used to perform the optimization. The DSS was built using the Visual Basic for Applications (VBA) language in the Microsoft Excel environment. A case study was developed from existing data. The analysis covered forty-two machine groups and six mechanic categories with ten skill levels. Results showed that only 56% of the mechanic workforce was utilized; thus, the company had resources available to meet future maintenance requirements.
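The study solves the assignment as a transportation problem with a modified simplex method; as a simplified stand-in for that optimization, the greedy rule below captures the same idea of matching the highest-skilled mechanics to the most critical machines (all machine and mechanic data here are invented):

```python
# Invented profiles: (machine group, minimum skill level, downtime cost/hour).
machines = [
    ("sewing-A", 8, 120),
    ("cutter-B", 5, 90),
    ("press-C", 3, 40),
]
mechanics = [("M1", 9), ("M2", 6), ("M3", 4)]  # (mechanic id, skill level)

# Take machines in order of downtime cost and give each the most skilled
# remaining mechanic who meets its minimum skill requirement.
assignment = {}
available = sorted(mechanics, key=lambda m: m[1], reverse=True)
for name, min_skill, _cost in sorted(machines, key=lambda m: m[2], reverse=True):
    for mech in available:
        if mech[1] >= min_skill:
            assignment[name] = mech[0]
            available.remove(mech)
            break
```

Unlike the transportation formulation, a greedy pass does not guarantee a global optimum; it only illustrates the skill-to-criticality matching the DSS automates.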

  5. An Agile Course-Delivery Approach

    ERIC Educational Resources Information Center

    Capellan, Mirkeya

    2009-01-01

    In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…

  6. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    question Which software development process are you currently using? 1. Agile software development (e.g., using Scrum, XP practices, test-driven... Scrum teams, product development teams, component teams, feature teams) spend almost all of their time fixing defects, and new capability...architectural runway provides the degree of architectural stability to support the next n iterations of development. In a Scrum project environment

  8. A Review of Agile and Lean Manufacturing as Issues in Selected International and National Research and Development Programs and Roadmaps

    ERIC Educational Resources Information Center

    Castro, Helio; Putnik, Goran D.; Shah, Vaibhav

    2012-01-01

    Purpose: The aim of this paper is to analyze international and national research and development (R&D) programs and roadmaps for the manufacturing sector, presenting how agile and lean manufacturing models are addressed in these programs. Design/methodology/approach: In this review, several manufacturing research and development programs and…

  9. Agile software development in an earned value world: a survival guide

    NASA Astrophysics Data System (ADS)

    Kantor, Jeffrey; Long, Kevin; Becla, Jacek; Economou, Frossie; Gelman, Margaret; Juric, Mario; Lambert, Ron; Krughoff, Simon; Swinbank, John D.; Wu, Xiuqin

    2016-08-01

    Agile methodologies are current best practice in software development. They are favored for, among other reasons, preventing premature optimization by taking a somewhat short-term focus, and allowing frequent replanning and reprioritization of upcoming development work based on recent results and the current backlog. At the same time, funding agencies prescribe earned value management accounting for large projects, which these days inevitably include substantial software components. Earned value approaches emphasize a more comprehensive and typically longer-range plan, and tend to characterize frequent replanning and reprioritization as indicative of problems. Here we describe the planning, execution and reporting framework used by the LSST Data Management team, which navigates these opposing tensions.
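The friction this abstract describes centers on the standard earned value indices, which a sprint-level report might compute as follows (the dollar figures are hypothetical, not LSST numbers):

```python
def evm_indices(planned_value, earned_value, actual_cost):
    """Earned value management indices: SPI = EV/PV (schedule) and
    CPI = EV/AC (cost). Values below 1.0 flag schedule or cost trouble,
    which is why frequent agile replanning can look alarming in EVM terms."""
    return earned_value / planned_value, earned_value / actual_cost

# Hypothetical sprint: work planned at $40k of value, $36k of it
# completed (earned), $38k actually spent.
spi, cpi = evm_indices(planned_value=40_000, earned_value=36_000,
                       actual_cost=38_000)
```

An agile team that deliberately reprioritizes mid-period shifts what counts as "planned value", so reconciling the backlog with the baseline plan is the core of the survival guide the paper offers.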

  10. Ground System Architectures Workshop GMSEC SERVICES SUITE (GSS): an Agile Development Story

    NASA Technical Reports Server (NTRS)

    Ly, Vuong

    2017-01-01

    The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services, along with a robust, customizable web-based portal, that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the need for rapid system development, we opted to follow the Scrum agile methodology for software development. Being one of the first few projects to implement the agile methodology at NASA GSFC, in this presentation we describe our approaches, tools, successes, and challenges in implementing it. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to separate the business logic from the GUI display in our Java-based components and to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.

  11. Development of EarthCube Governance: An Agile Approach

    NASA Astrophysics Data System (ADS)

    Pearthree, G.; Allison, M. L.; Patten, K.

    2013-12-01

    Governance of geosciences cyberinfrastructure is a complex and essential undertaking, critical in enabling distributed knowledge communities to collaborate and communicate across disciplines, distances, and cultures. Advancing science with respect to "grand challenges," such as global climate change, weather prediction, and core fundamental science, depends not just on technical cyber systems, but also on social systems for strategic planning, decision-making, project management, learning, teaching, and building a community of practice. Simply put, a robust, agile technical system depends on an equally robust and agile social system. Cyberinfrastructure development is wrapped in social, organizational and governance challenges, which may significantly impede progress. An agile development process is underway for governance of transformative investments in geosciences cyberinfrastructure through the NSF EarthCube initiative. Agile development is iterative and incremental, and promotes adaptive planning and rapid and flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness. A project Secretariat acts as the coordinating body, carrying out duties for planning, organizing, communicating, and reporting. A broad coalition of stakeholder groups comprises an Assembly (Mainstream Scientists, Cyberinfrastructure Institutions, Information Technology/Computer Sciences, NSF EarthCube Investigators, Science Communities, EarthCube End-User Workshop Organizers, Professional Societies) to serve as a preliminary venue for identifying, evaluating, and testing potential governance models. To offer opportunity for broader end-user input, a crowd-source approach will engage stakeholders not involved otherwise. An Advisory Committee from the Earth, ocean, atmosphere, social, computer and library sciences is guiding the process from a high-level policy point of view. Developmental

  12. Towards a Framework for Using Agile Approaches in Global Software Development

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Ali Babar, Muhammad; Verner, June

    As agile methods and Global Software Development (GSD) become increasingly popular, GSD project managers have been exploring the viability of using agile approaches in their development environments. Despite the expected benefits of using an agile approach on a GSD project, the mechanisms for combining the two approaches are not clearly understood. To address this challenge, we propose a conceptual framework based on the research literature. This framework is expected to aid a project manager in deciding which agile strategies are effective for a particular GSD project, taking the project context into account. We use an industry-based case study to explore the components of our conceptual framework. Our case study is planned and conducted according to specific published case study guidelines. We identify the agile practices and agile supporting practices used by a GSD project manager in our case study and conclude with future research directions.

  13. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  14. Agile development approach for the observatory control software of the DAG 4m telescope

    NASA Astrophysics Data System (ADS)

    Güçsav, B. Bülent; Çoker, Deniz; Yeşilyaprak, Cahit; Keskin, Onur; Zago, Lorenzo; Yerli, Sinan K.

    2016-08-01

    The Observatory Control Software for the upcoming 4m infrared telescope of DAG (Eastern Anatolian Observatory, from its Turkish name) is at the beginning of its lifecycle. After eliciting and validating the initial requirements, we have focused on preparing a rapid conceptual design, not only to see the big picture of the system but also to clarify the further development methodology. The existing preliminary designs for both software (including the TCS and the active optics control system) and hardware are presented here in brief to highlight the challenges the DAG software team has been facing. The potential benefits of an agile approach to the development are discussed, drawing on the published experience of the community and on the resources available to us.

  15. How Can Agile Practices Minimize Global Software Development Co-ordination Risks?

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Babar, Muhammad Ali; Verner, June

    The distribution of project stakeholders in Global Software Development (GSD) projects poses significant risks to project communication, coordination and control processes. There is growing interest in applying agile practices in GSD projects in order to leverage the advantages of both approaches. In some cases, GSD project managers use agile practices to reduce project distribution challenges. We use an existing coordination framework to identify GSD coordination problems due to temporal, geographical and socio-cultural distances. An industry-based case study is used to describe, explore and explain the use of agile practices to reduce development coordination challenges.

  16. Making Agile Work for You

    DTIC Science & Technology

    2011-07-20

    Extreme Programming (XP) • Scrum • Dynamic Systems Development Method (DSDM) • Adaptive Software Development • Crystal • Feature-Driven Development...Carnegie Mellon University Twitter #seiwebinar. Scrum is an iterative, incremental methodology for managing agile software projects. The Team

  17. Addressing the Barriers to Agile Development in the Department of Defense: Program Structure, Requirements, and Contracting

    DTIC Science & Technology

    2015-04-30

    with the IT Box requirements concept, and thus cannot take advantage of its flexibilities to enable Agile development. In addition, long contracting...place. Many DoD IT acquisition programs are unfamiliar with the IT Box requirements concept, and thus cannot take advantage of its flexibilities to...requirements, and contracting. The DoD can address these barriers by utilizing a proactively tailored Agile acquisition model, implementing an IT Box

  18. Agile Project Management for e-Learning Developments

    ERIC Educational Resources Information Center

    Doherty, Iain

    2010-01-01

    We outline the project management tactics that we developed in praxis in order to manage elearning projects and show how our tactics were enhanced through implementing project management techniques from a formal project management methodology. Two key factors have contributed to our project management success. The first is maintaining a clear…

  19. Adopting best practices: "Agility" moves from software development to healthcare project management.

    PubMed

    Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge

    2006-01-01

    It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments.

  20. The Perfect Process Storm: Integration of CMMI, Agile, and Lean Six Sigma

    DTIC Science & Technology

    2012-12-01

    projects using similar iterative methodologies including Scrum, Crystal, and Feature-driven Development leading to the meeting of the Agile...1986 Lean Six Sigma (LSS) Late 1990's Lean Production 1990 CMM 1987 - 2002 CMMI 2002V1.3 2010 Agile XP 1996 Agile Manifesto 2001 Scrum 2001...Business Process Improvement. Most recently his efforts have targeted BPI for 22 Agile SCRUM projects, deploying Project and Process Management

  1. Expert Systems Development Methodology

    DTIC Science & Technology

    1989-07-28

    two volumes. Volume 1 is the Development Methodology and Volume 2 is an Evaluation Methodology containing methods for evaluation, validation and...system are written in an English-like language which almost anyone can understand. Thus programming in rule-based systems can become "programming for...computers and others have little understanding about how computers work. The knowledge engineer must therefore be willing and able to teach the expert

  2. Agile enterprise development framework utilizing services principles for building pervasive security

    NASA Astrophysics Data System (ADS)

    Farroha, Deborah; Farroha, Bassam

    2011-06-01

    We are in an environment of continuously changing mission requirements, and therefore our information systems must adapt to accomplish new tasks more quickly and proficiently. Agility is the only way we will be able to keep up with this change. But there are subtleties that must be considered as we adopt various agile methods: secure, protect, control and authenticate are all elements needed to posture our information technology systems to counteract the real and perceived threats in today's environment. Many systems have been tasked to ingest, process and analyze different data sets than they were originally designed for, and they have to interact with multiple new systems that were unaccounted for at design time. Leveraging the tenets of security, we have devised a new framework that takes agility into a new realm: the product is built to work in a service-based environment but is developed using agile processes. Even though these two criteria promise to hone the development effort, they contradict each other in philosophy, in that services require stable interfaces while agile methods focus on being flexible and on tolerating changes until much later stages of development. This framework is focused on enabling a successful product development effort that capitalizes on both philosophies.

  3. Applying Agile Methods to Weapon/Weapon-Related Software

    SciTech Connect

    Adams, D; Armendariz, M; Blackledge, M; Campbell, F; Cloninger, M; Cox, L; Davis, J; Elliott, M; Granger, K; Hans, S; Kuhn, C; Lackner, M; Loo, P; Matthews, S; Morrell, K; Owens, C; Peercy, D; Pope, G; Quirk, R; Schilling, D; Stewart, A; Tran, A; Ward, R; Williamson, M

    2007-05-02

    This white paper provides information and guidance to the Department of Energy (DOE) sites on Agile software development methods and the impact of their application on weapon/weapon-related software development. The purpose of this white paper is to provide an overview of Agile methods, examine the accepted interpretations/uses/practices of these methodologies, and discuss the applicability of Agile methods with respect to Nuclear Weapons Complex (NWC) Technical Business Practices (TBPs). It also provides recommendations on the application of Agile methods to the development of weapon/weapon-related software.

  4. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  5. Analysis and optimization of preliminary aircraft configurations in relationship to emerging agility metrics

    NASA Technical Reports Server (NTRS)

    Sandlin, Doral R.; Bauer, Brent Alan

    1993-01-01

    This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. The paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential of different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.

  6. Agile in Large-Scale Development Workshop: Coaching, Transitioning and Practicing

    NASA Astrophysics Data System (ADS)

    Nilsson, Thomas; Larsson, Andreas

    Agile in large-scale and complex development presents its own set of problems in how to practice, transition and coach. This workshop aims at bringing people interested in this topic together to share tools, techniques and insights. The workshop will follow the increasingly popular “lightning talk + open space” format.

  7. Agile Mythbusting

    DTIC Science & Technology

    2015-01-01

    does not fit all Scrum : The most adopted Agile method Scaling Agile Methods: Going beyond the team level methods Challenges to Agile Adoption: What’s...Arsenal Lapham, Wrubel Jan 2015 © 2015 Carnegie Mellon University. Myth: You Must Choose Agile or Waterfall – you can’t do both What about “water- scrum ...used in multiple environments, including DoD programs.1 1Start with “Agile EVM in Scrum Projects” from AGILE 2006 to get started learning about Agile

  8. Coaching for Better (Software) Buying Power in an Agile World

    DTIC Science & Technology

    2013-06-01

    professionals carefully to consider incorporation of agile methodologies into the set of acquisition tools at their disposal. This transformation is not...believes that DevOps, the process of warfighters and developers working together throughout the project, is superior to volumes of detailed...Professionalism of the Total Acquisition Workforce The DoD needs to invest in training the acquisition workforce in agile methodologies to add tools that

  9. Development methodology for scientific software

    SciTech Connect

    Cort, G.; Goldstone, J.A.; Nelson, R.O.; Poore, R.V.; Miller, L.; Barrus, D.M.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed.

  10. Architecting for Large Scale Agile Software Development: A Risk-Driven Approach

    DTIC Science & Technology

    2013-05-01

    addressed aspect of scale in agile software development. Practices such as Scrum of Scrums are meant to address orchestration of multiple development...owner, Scrum master) have differing responsibilities from the roles in the existing phase-based waterfall program structures. Such differences may...Scrum. Communication with both internal and external stakeholders must be open and documentation should not be used as a substitute for communication

  11. Ramping up for agility: Development of a concurrent engineering communications infrastructure

    SciTech Connect

    Forsythe, C.; Ashby, M.R.

    1995-09-01

    A-PRIMED (Agile Product Realization for Innovative Electro MEchanical Devices) demonstrated new product development in 24 days, accompanied by improved product quality, through agility-enabling technologies. A concurrent engineering communications infrastructure was developed that provided electronic data communications, information access, enterprise integration of computers and applications, and collaborative work tools. This paper describes how A-PRIMED did it through attention to technologies, processes, and people.

  12. Agile rediscovering values: Similarities to continuous improvement strategies

    NASA Astrophysics Data System (ADS)

    Díaz de Mera, P.; Arenas, J. M.; González, C.

    2012-04-01

    Research in the late 1980s on technological companies that develop products of high-value innovation, with sufficient speed and flexibility to adapt quickly to changing market conditions, gave rise to the new set of methodologies known as the Agile Management Approach. In the current changing economic scenario, we considered it very interesting to study the similarities of these agile methodologies to other practices whose effectiveness has been amply demonstrated in both the West and Japan. Strategies such as Kaizen, Lean, World Class Manufacturing and Concurrent Engineering are analyzed to check the values they have in common with the Agile Approach.

  13. Insights into Global Health Practice from the Agile Software Development Movement

    PubMed Central

    Flood, David; Chary, Anita; Austad, Kirsten; Diaz, Anne Kraemer; García, Pablo; Martinez, Boris; Canú, Waleska López; Rohloff, Peter

    2016-01-01

    Global health practitioners may feel frustration that current models of global health research, delivery, and implementation are overly focused on specific interventions, slow to provide health services in the field, and relatively ill-equipped to adapt to local contexts. Adapting design principles from the agile software development movement, we propose an analogous approach to designing global health programs that emphasizes tight integration between research and implementation, early involvement of ground-level health workers and program beneficiaries, and rapid cycles of iterative program improvement. Using examples from our own fieldwork, we illustrate the potential of ‘agile global health’ and reflect on the limitations, trade-offs, and implications of this approach. PMID:27134081

  14. Accelerating Software Development through Agile Practices--A Case Study of a Small-Scale, Time-Intensive Web Development Project at a College-Level IT Competition

    ERIC Educational Resources Information Center

    Zhang, Xuesong; Dorn, Bradley

    2012-01-01

    Agile development has received increasing interest both in industry and academia due to its benefits in developing software quickly, meeting customer needs, and keeping pace with the rapidly changing requirements. However, agile practices and scrum in particular have been mainly tested in mid- to large-size projects. In this paper, we present…

  15. Agile Development and Software Architecture: Understanding Scale and Risk

    DTIC Science & Technology

    2011-10-24

    SEIVirtualForum Symptoms of failure  Teams (e.g., Scrum teams, product development teams, component teams, feature teams) spend almost all of...stability to support the next n iterations of development. In a Scrum project environment, the architectural runway may be established during...infrastructure Presentation Layer Common Service Common Service Common Service API APIData Access Layer Domain Layer Scrum Team A Scrum Team B Scrum Team C

  16. Agile Development and Software Architecture: Understanding Scale and Risk

    DTIC Science & Technology

    2012-04-26

    In a Scrum project environment, the architectural runway may be established during Sprint 0. Sprint 0 might have a longer duration than the rest of...architecture In its simplest instantiation, a Scrum development environment consists of: • a single co-located, cross-functional team • with skills...cause analysis: Typical problem 1 Symptom • Scrum teams spend almost all of their time fixing defects, and new feature development is continuously

  17. Empirical Agility

    DTIC Science & Technology

    2014-06-01

    documented the fact that Unmanned Aerial Vehicles (now more commonly called drones) added substantially to the quality of surveillance, resulting in better...Battalions. CACI Inc.-Federal, Arlington, Virginia. 1977. DTIC Accession Number ADA123481. Olmstead, Joseph A., B. Leon Elder, and John M...greater agility – Drones improved agility in ground and air operations – Nelson’s C2 at Trafalgar demonstrated agility. Theater Level C2 During WW II

  18. Expert System Development Methodology (ESDM)

    NASA Technical Reports Server (NTRS)

    Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.

    1990-01-01

    The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparison with the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.

  19. ISS Double-Gimbaled CMG Subsystem Simulation Using the Agile Development Method

    NASA Technical Reports Server (NTRS)

    Inampudi, Ravi

    2016-01-01

    This paper presents an evolutionary approach to simulating a cluster of four Control Moment Gyros (CMGs) on the International Space Station (ISS) using a common-sense approach (the agile development method) for concurrent mathematical modeling and simulation of the CMG subsystem. This simulation is part of the Training Systems for the 21st Century simulator, which will provide training for crew members, instructors, and flight controllers. The basic idea of how the CMGs on the space station are used for its non-propulsive attitude control is briefly explained to set up the context for simulating a CMG subsystem. Next, the different reference frames and the detailed equations of motion (EOM) for multiple double-gimbal variable-speed control moment gyroscopes (DGVs) are presented. Fixing some of the terms in the EOM yields the special-case EOM for the ISS's double-gimbaled, fixed-speed CMGs. CMG simulation development using the agile development method is presented, in which the customer's requirements and solutions evolve through iterative analysis, design, coding, unit testing, and acceptance testing. At the end of each iteration, the set of features implemented in that iteration is demonstrated to the flight controllers, thus creating a short feedback loop and helping to create adaptive development cycles. The Unified Modeling Language (UML) is used to illustrate the user stories, class designs, and sequence diagrams. This incremental approach to mathematical modeling and simulation of the CMG subsystem involved the development team and the customer early on, thus improving the quality of the working CMG system in each iteration and helping the team accurately predict the cost, schedule, and delivery of the software.

  20. Reactive Agility Performance in Handball; Development and Evaluation of a Sport-Specific Measurement Protocol.

    PubMed

    Spasic, Miodrag; Krolo, Ante; Zenic, Natasa; Delextrat, Anne; Sekulic, Damir

    2015-09-01

    No current study has examined sport-specific tests of reactive agility and change-of-direction speed (CODS) that replicate the real-sport environment in handball (team handball). This investigation evaluated the reliability and validity of two novel tests designed to assess the reactive agility and CODS of handball players. Participants were female (25.14 ± 3.71 years of age; 1.77 ± 0.09 m and 74.1 ± 6.1 kg) and male handball players (26.9 ± 4.1 years of age; 1.90 ± 0.09 m and 93.9 ± 4.6 kg). Variables included body height, body mass, body mass index, broad jump, 5-m sprint, and the CODS and reactive-agility tests. Results showed satisfactory reliability for the reactive-agility and CODS tests (ICC of 0.85-0.93, and CV of 2.4-4.8%). Reactive agility and CODS shared less than 20% of common variance. The calculated index of perceptual and reactive capacity (P&RC; the ratio between reactive-agility and CODS performance) was found to be a valid measure of true-game reactive-agility performance in handball in both genders. Therefore, the handball athletes' P&RC should be used in the evaluation of real-game reactive-agility performance. Future studies should explore other sport-specific reactive-agility tests and factors associated with such performance in sports involving agile maneuvers. Key points:
    - Reactive agility and change-of-direction speed should be observed as independent qualities, even when tested over the same course and with a similar movement template.
    - The reactive-agility performance of handball athletes involved in defensive duties is closer to their non-reactive-agility score than that of their peers who are not involved in defensive duties.
    - Handball-specific "true-game" reactive-agility performance should be evaluated as the ratio between reactive-agility and corresponding CODS performance.
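
    The P&RC index defined in this abstract is a simple ratio of two timed trials over the same course. A minimal sketch, assuming times are measured in seconds and that lower times are better; the function name and example values are illustrative, not data from the study:

```python
# Sketch of the P&RC (perceptual and reactive capacity) index: the ratio of
# an athlete's reactive-agility time to their pre-planned change-of-direction
# speed (CODS) time over the same course. Names and values are illustrative.

def perceptual_reactive_capacity(reactive_agility_s: float, cods_s: float) -> float:
    """Return P&RC = reactive-agility time / CODS time.

    A ratio near 1.0 suggests little extra time is lost to perceiving and
    reacting to the stimulus; larger ratios suggest a greater perceptual
    and reactive cost on top of pure movement speed.
    """
    if cods_s <= 0:
        raise ValueError("CODS time must be positive")
    return reactive_agility_s / cods_s

# Example: 2.10 s reactive-agility trial vs. 1.75 s pre-planned CODS trial
print(round(perceptual_reactive_capacity(2.10, 1.75), 3))  # → 1.2
```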

  1. Development of perceived competence, tactical skills, motivation, technical skills, and speed and agility in young soccer players.

    PubMed

    Forsman, Hannele; Gråstén, Arto; Blomqvist, Minna; Davids, Keith; Liukkonen, Jarmo; Konttinen, Niilo

    2016-07-01

    The objective of this 1-year, longitudinal study was to examine the development of perceived competence, tactical skills, motivation, technical skills, and speed and agility characteristics of young Finnish soccer players. We also examined associations between latent growth models of perceived competence and other recorded variables. Participants were 288 competitive male soccer players ranging from 12 to 14 years (12.7 ± 0.6) from 16 soccer clubs. Players completed the self-assessments of perceived competence, tactical skills, and motivation, and participated in technical, and speed and agility tests. Results of this study showed that players' levels of perceived competence, tactical skills, motivation, technical skills, and speed and agility characteristics remained relatively high and stable across the period of 1 year. Positive relationships were found between these levels and changes in perceived competence and motivation, and levels of perceived competence and speed and agility characteristics. Together these results illustrate the multi-dimensional nature of talent development processes in soccer. Moreover, it seems crucial in coaching to support the development of perceived competence and motivation in young soccer players and that it might be even more important in later maturing players.

  2. Development of a low-cost, low micro-vibration CMG for small agile satellite applications

    NASA Astrophysics Data System (ADS)

    Kawak, B. J.

    2017-02-01

    The agility of a spacecraft, which refers to its ability to execute fast and accurate manoeuvers within a fixed period of time, is a key satellite parameter. The spacecraft's agility is directly proportional to the output torque of its actuators. For high-torque inertial actuators (>0.5 Nm), the Control Moment Gyroscope (CMG) exhibits better performance in terms of mass and electrical power consumption than reaction wheels. However, in addition to the complex steering law required to avoid CMG singularities, one of the reasons why CMGs are not widely used is their high micro-vibration emission, which may interfere with and disrupt the spacecraft's sensitive instruments, such as optical payloads. In this paper, an innovative two-stage viscoelastic isolation system has been designed and implemented in a new low micro-vibration CMG prototype. The first stage of the damping system acts at the bearing level to attenuate possible shock vibrations, while the second stage acts at the mechanism level to attenuate structural resonances and motor noise. The developed CMG thus combines high actuator output torque with a low micro-vibration signature. The viscoelastic damping system is cost-effective, as it is a fully passive system that requires no thermal control and no electronics. Furthermore, the attenuation provided by this innovative two-stage damping system can reach a slope of up to -80 dB/dec, which gives the Mini-CMG a micro-vibration signature lower than that of similar output-torque reaction wheels not equipped with a damping system.

  3. Investigation into the impact of agility on conceptual fighter design

    NASA Technical Reports Server (NTRS)

    Engelbeck, R. M.

    1995-01-01

    The Agility Design Study was performed by the Boeing Defense and Space Group for the NASA Langley Research Center. The objective of the study was to assess the impact of agility requirements on new fighter configurations. The global trade issues investigated were the level of agility, the mission role of the aircraft (air-to-ground, multi-role, or air-to-air), and whether the customer is the Air Force, the Navy, or joint service. Mission profiles and design objectives were supplied by NASA. An extensive technology assessment was conducted to establish the technologies available to industry for the aircraft. A conceptual-level methodology is presented to assess the five NASA-supplied agility metrics. Twelve configurations were developed to address the global trade issues. Three-view drawings, inboard profiles, and performance estimates were made and are included in the report. A critical assessment and lessons learned from the study are also presented.

  4. Applying Agile Methods to the Development of a Community-Based Sea Ice Observations Database

    NASA Astrophysics Data System (ADS)

    Pulsifer, P. L.; Collins, J. A.; Kaufman, M.; Eicken, H.; Parsons, M. A.; Gearheard, S.

    2011-12-01

    Local and traditional knowledge and community-based monitoring programs are increasingly being recognized as an important part of establishing an Arctic observing network, and understanding Arctic environmental change. The Seasonal Ice Zone Observing Network (SIZONet, http://www.sizonet.org) project has implemented an integrated program for observing seasonal ice in Alaska. Observation and analysis by local sea ice experts helps track seasonal and inter-annual variability of the ice cover and its use by coastal communities. The ELOKA project (http://eloka-arctic.org) is collaborating with SIZONet on the development of a community accessible, Web-based application for collecting and distributing local observations. The SIZONet project is dealing with complicated qualitative and quantitative data collected from a growing number of observers in different communities while concurrently working to design a system that will serve a wide range of different end users including Arctic residents, scientists, educators, and other stakeholders with a need for sea ice information. The benefits of linking and integrating knowledge from communities and university-based researchers are clear, however, development of an information system in this multidisciplinary, multi-participant context is challenging. Participants are geographically distributed, have different levels of technical expertise, and have varying goals for how the system will be used. As previously reported (Pulsifer et al. 2010), new technologies have been used to deal with some of the challenges presented in this complex development context. In this paper, we report on the challenges and innovations related to working as a multi-disciplinary software development team. Specifically, we discuss how Agile software development methods have been used in defining and refining user needs, developing prototypes, and releasing a production level application. We provide an overview of the production application that

  5. The Telemetry Agile Manufacturing Effort

    SciTech Connect

    Brown, K.D.

    1995-01-01

    The Telemetry Agile Manufacturing Effort (TAME) is an agile enterprising demonstration sponsored by the US Department of Energy (DOE). The project experimented with new approaches to product realization and assessed their impacts on performance, cost, flow time, and agility. The purpose of the project was to design the electrical and mechanical features of an integrated telemetry processor, establish the manufacturing processes, and produce an initial production lot of two to six units. This paper outlines the major methodologies utilized by the TAME, describes the accomplishments that can be attributed to each methodology, and finally, examines the lessons learned and explores the opportunities for improvement associated with the overall effort. The areas for improvement are discussed relative to an ideal vision of the future for agile enterprises. By the end of the experiment, the TAME reduced production flow time by approximately 50% and life cycle cost by more than 30%. Product performance was improved compared with conventional DOE production approaches.

  6. Poster — Thur Eve — 56: Design of Quality Assurance Methodology for VMAT system on Agility System equipped with CVDR

    SciTech Connect

    Thind, K; Tolakanahalli, R

    2014-08-15

    The aim of this study was to analyze the feasibility of designing comprehensive QA plans using iComCAT for Elekta machines equipped with the Agility multileaf collimator (MLC) and continuously variable dose rate. Test plans with varying MLC speed, gantry speed, and dose rate were created and delivered in a controlled manner. A strip test was designed with three 1 cm MLC positions and delivered using dynamic, StepNShoot, and VMAT techniques. Plans were also designed to test errors in MLC position at various gantry speeds and various MLC speeds. The delivered fluence was captured using the electronic portal imaging device. Gantry speed was found to be within tolerance as per the Canadian standards. MLC positioning errors at higher MLC speeds, combined with gravity effects, do add more than 2 mm of discrepancy. More tests need to be performed to evaluate MLC performance using independent measurement systems. The treatment planning system, with the end-to-end testing necessary for commissioning, was also investigated and found to have >95% passing rates within the 3%/3mm gamma criteria. Future studies involve performing an off-axis gantry starshot pattern and repeating the tests on three matched Elekta linear accelerators.
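
    The 3%/3mm gamma criterion cited above combines a dose-difference tolerance (3% of a reference dose) with a 3 mm distance-to-agreement tolerance. A minimal one-dimensional sketch, assuming global normalization; real QA software works on 2-D/3-D dose grids with interpolation, and all profile values here are invented for illustration:

```python
import math

# Illustrative 1-D gamma analysis (global normalization). For each measured
# point, gamma is the minimum over reference points of the combined, normalized
# dose-difference and distance-to-agreement (DTA) metric; a point passes when
# gamma <= 1. Real implementations interpolate and operate on 2-D/3-D grids.

def gamma_pass_rate(xs, ref, meas, dose_pct=3.0, dta_mm=3.0):
    d_crit = dose_pct / 100.0 * max(ref)  # 3% of global max reference dose
    passed = 0
    for xm, dm in zip(xs, meas):
        gamma = min(
            math.sqrt(((xm - xr) / dta_mm) ** 2 + ((dm - dr) / d_crit) ** 2)
            for xr, dr in zip(xs, ref)
        )
        if gamma <= 1.0:
            passed += 1
    return 100.0 * passed / len(meas)

xs = [0.0, 1.0, 2.0, 3.0, 4.0]          # positions in mm
ref = [10.0, 50.0, 100.0, 50.0, 10.0]   # planned dose profile (arbitrary units)
meas = [10.5, 51.0, 99.0, 52.0, 10.2]   # measured dose profile
print(gamma_pass_rate(xs, ref, meas))   # → 100.0
```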

  7. Development of a fast, lean and agile direct pelletization process using experimental design techniques.

    PubMed

    Politis, Stavros N; Rekkas, Dimitrios M

    2017-04-01

    A novel hot melt direct pelletization method was developed, characterized and optimized, using statistical thinking and experimental design tools. Mixtures of carnauba wax (CW) and HPMC K100M were spheronized using melted gelucire 50-13 as a binding material (BM). Experimentation was performed sequentially; a fractional factorial design was set up initially to screen the factors affecting the process, namely spray rate, quantity of BM, rotor speed, type of rotor disk, lubricant-glidant presence, additional spheronization time, powder feeding rate and quantity. From the eight factors assessed, three were further studied during process optimization (spray rate, quantity of BM and powder feeding rate), at different ratios of the solid mixture of CW and HPMC K100M. The study demonstrated that the novel hot melt process is fast, efficient, reproducible and predictable. Therefore, it can be adopted in a lean and agile manufacturing setting for the production of flexible pellet dosage forms with various release rates easily customized between immediate and modified delivery.

  8. Development of telemetry for the agility flight test of a radio controlled fighter model

    NASA Astrophysics Data System (ADS)

    Gallagher, Michael J.

    1992-03-01

    Advanced design tools, control devices, and supermaneuverability concepts provide innovative solutions to traditional aircraft design trade-offs. Emerging technologies enable improved agility throughout the performance envelope. Unmanned Air Vehicles provide an excellent platform for dynamic measurements and agility research. A 1/8-scaled F-16A ducted-fan radio-controlled aircraft was instrumented with a telemetry system to acquire angle of attack, sideslip angle, control surface deflection, throttle position, and airspeed data. A portable ground station was built to record and visually present real-time telemetry data. Flight tests will be conducted to acquire baseline high angle-of-attack performance measurements, and follow-on research will evaluate agility improvements with varied control configurations.

  9. Utilization of an agility assessment module in analysis and optimization of preliminary fighter configuration

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Biezad, Daniel

    1996-01-01

    A study has been conducted to develop and analyze a FORTRAN computer code for performing agility analysis on fighter aircraft configurations. This program is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. The validity of the existing code was evaluated by comparison with existing flight test data. A FORTRAN program was developed for a specific metric, PM (Pointing Margin), as part of the agility module. Example trade studies using the agility module along with ACSYNT were conducted using a McDonnell Douglas F/A-18 Hornet aircraft model. The sensitivity of agility criteria to thrust loading, wing loading, and thrust vectoring was investigated. The module can compare the agility potential of different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements in preliminary design.

  10. Development and Evaluation of an Inverse Solution Technique for Studying Helicopter Maneuverability and Agility

    DTIC Science & Technology

    1991-07-01

    three maneuvers. For the purposes of this study, maneuverability is defined as the maximum achievable time rate of change of the velocity vector at any point in the flight envelope, and agility is defined as the maximum achievable time rate of change of the acceleration vector at any point in the flight envelope.
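
    The definitions quoted in this excerpt can be expressed numerically: maneuverability as the peak magnitude of dV/dt (acceleration) and agility as the peak magnitude of dA/dt (jerk) along a trajectory. A minimal sketch using finite differences on an assumed one-axis velocity history; all sample values are illustrative:

```python
# Finite-difference sketch of the quoted definitions: maneuverability as the
# peak |dV/dt| and agility as the peak |dA/dt| (jerk) over a sampled
# trajectory. The velocity history and sample interval are invented.

def derivative(samples, dt):
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]

def peak(samples):
    return max(abs(s) for s in samples)

dt = 0.25                                  # sample interval, s
velocity = [0.0, 1.0, 3.0, 6.0, 8.0, 9.0]  # m/s along one axis
accel = derivative(velocity, dt)           # dV/dt -> maneuverability measure
jerk = derivative(accel, dt)               # dA/dt -> agility measure

print(peak(accel), peak(jerk))  # → 12.0 16.0
```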

  11. Lean Mission Operations Systems Design - Using Agile and Lean Development Principles for Mission Operations Design and Development

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2014-01-01

    The Resource Prospector Mission seeks to rove the lunar surface with an in-situ resource utilization payload in search of volatiles at a polar region. The mission operations system (MOS) will need to perform the short-duration mission while taking advantage of the near real time control that the short one-way light time to the Moon provides. To maximize our use of limited resources for the design and development of the MOS we are utilizing agile and lean methods derived from our previous experience with applying these methods to software. By using methods such as "say it then sim it" we will spend less time in meetings and more time focused on the one outcome that counts - the effective utilization of our assets on the Moon to meet mission objectives.

  12. An Investigation of Agility Issues in Scrum Teams Using Agility Indicators

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Wang, Xiaofeng

    Agile software development methods have emerged and become increasingly popular in recent years; yet the issues encountered by software development teams that strive to achieve agility using agile methods are yet to be explored systematically. Built upon a previous study that has established a set of indicators of agility, this study investigates what issues are manifested in software development teams using agile methods. It is focussed on Scrum teams particularly. In other words, the goal of the chapter is to evaluate Scrum teams using agility indicators and therefore to further validate previously presented agility indicators within the additional cases. A multiple case study research method is employed. The findings of the study reveal that the teams using Scrum do not necessarily achieve agility in terms of team autonomy, sharing, stability and embraced uncertainty. The possible reasons include previous organizational plan-driven culture, resistance towards the Scrum roles and changing resources.

  13. Agile Metrics: Progress Monitoring of Agile Contractors

    DTIC Science & Technology

    2014-01-01

    can be tailored to leverage the iterative nature of Agile methods. Using optional contract funding lines or indefinite delivery indefinite quantity... naturally created during the execution of the Agile implementation. In the following paragraphs, we identify issues to consider in building an Agile...employing Agile methods [Hartman 2006]. Be prepared to mine and effectively use the metrics data that naturally occur in typical Agile teams. In

  14. Decision Support for Iteration Scheduling in Agile Environments

    NASA Astrophysics Data System (ADS)

    Szőke, Ákos

    Today's software business development projects must often demonstrate low-risk value to customers in order to be financed. Emerging agile processes offer shorter investment periods, faster time-to-market, and better customer satisfaction. To date, however, agile environments lack sound methodological scheduling support, in contrast to traditional plan-based approaches. To address this situation, we present an agile iteration scheduling method whose usefulness is evaluated with a post-mortem simulation. The evaluation demonstrates that the method can significantly improve the load balancing of resources (ca. 5×), produce a higher-quality, lower-risk feasible schedule, and support more informed and established decisions through optimized schedule production. Finally, the paper analyzes the benefits of and issues arising from the use of this method.
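
    The abstract does not detail the scheduling optimization itself. As a hedged illustration of what "load balancing of resources" in iteration scheduling can mean, this sketch uses a standard greedy longest-processing-time baseline, not the paper's method; the story-point estimates and resource count are assumptions:

```python
import heapq

# Greedy longest-processing-time load balancing: sort story estimates in
# descending order, then repeatedly assign the next story to the currently
# least-loaded resource. A common baseline, not the paper's actual method.

def balance_iteration(estimates, n_resources):
    heap = [(0, i, []) for i in range(n_resources)]  # (load, resource id, stories)
    heapq.heapify(heap)
    for est in sorted(estimates, reverse=True):
        load, i, stories = heapq.heappop(heap)       # least-loaded resource
        stories.append(est)
        heapq.heappush(heap, (load + est, i, stories))
    return sorted(heap)

# Story-point estimates for one iteration, balanced across 3 developers;
# here every developer ends the iteration with 8 story points.
for load, dev, stories in balance_iteration([8, 5, 5, 3, 2, 1], 3):
    print(dev, load, stories)
```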

  15. Development and evaluation of an inverse solution technique for studying helicopter maneuverability and agility

    NASA Technical Reports Server (NTRS)

    Whalley, Matthew S.

    1991-01-01

    An inverse solution technique for determining the maximum maneuvering performance of a helicopter using smooth, pilotlike control inputs is presented. Also described is a pilot simulation experiment performed to investigate the accuracy of the solution resulting from this technique. The maneuverability and agility capability of the helicopter math model was varied by varying the pitch and roll damping, the maximum pitch and roll rate, and the maximum load-factor capability. Three maneuvers were investigated: a 180-deg turn, a longitudinal pop-up, and a lateral jink. The inverse solution technique yielded accurate predictions of pilot-in-the-loop maneuvering performance for two of the three maneuvers.

  16. Agile manufacturing: The factory of the future

    NASA Technical Reports Server (NTRS)

    Loibl, Joseph M.; Bossieux, Terry A.

    1994-01-01

    The factory of the future will require an operating methodology that effectively utilizes all of the elements of product design, manufacturing, and delivery. The process must respond rapidly to changes in product demand, product mix, design changes, or changes in the raw materials. To achieve agility in a manufacturing operation, the design and development of the manufacturing processes must focus on customer satisfaction. Achieving the greatest results requires that the manufacturing process be considered from product concept through sales. This provides the best opportunity to build a quality product for the customer at a reasonable rate. The primary elements of a manufacturing system include people, equipment, materials, methods, and the environment. The most significant and most agile element in any process is the human resource. Only with a highly trained, knowledgeable workforce can the proper methods be applied to efficiently process materials with machinery that is predictable, reliable, and flexible. This paper discusses the effect of each element on the development of agile manufacturing systems.

  17. Photovoltaic module energy rating methodology development

    SciTech Connect

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L.; Whitaker, C.; Newmiller, J.

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.

  18. Investigating Agile User-Centered Design in Practice: A Grounded Theory Perspective

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    This paper investigates how the integration of agile methods and User-Centered Design (UCD) is carried out in practice. For this study, we have applied grounded theory as a suitable qualitative approach to determine what is happening in actual practice. The data was collected by semi-structured interviews with professionals who have already worked with an integrated agile UCD methodology. Further data was collected by observing these professionals in their working context, and by studying their documents, where possible. The emerging themes that the study found show that there is an increasing realization of the importance of usability in software development among agile team members. Requirements are emerging, and usability tests based on both low- and high-fidelity prototypes are widely used in agile teams. There is an appreciation of each other's work from both UCD professionals and developers, and both sides can learn from each other.

  19. Development of Methodology for Programming Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Erol, Kutluhan; Levy, Renato; Lang, Lun

    2004-01-01

    A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling a reduction in the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem addressed in the development of the methodology was how to efficiently describe the interfaces between several layers of agent composition using a language that is both familiar to engineers and descriptive enough to characterize such interfaces unambiguously.

  20. Human factors in agile manufacturing

    SciTech Connect

    Forsythe, C.

    1995-03-01

    As industries position themselves for the competitive markets of today, and the increasingly competitive global markets of the 21st century, agility, or the ability to rapidly develop and produce new products, represents a common trend. Agility manifests itself in many different forms, with the agile manufacturing paradigm proposed by the Iacocca Institute offering a generally accepted, long-term vision. In its many forms, common elements of agility or agile manufacturing include: changes in business, engineering, and production practices; seamless information flow from design through production; integration of computer and information technologies into all facets of the product development and production process; application of communications technologies to enable collaborative work between geographically dispersed product development team members; and introduction of flexible automation of production processes. Industry has rarely experienced as dramatic an infusion of new technologies or as extensive a change in culture and work practices. Human factors will not only play a vital role in accomplishing the technical and social objectives of agile manufacturing, but also have an opportunity to participate in shaping the evolution of industry paradigms for the 21st century.

  1. Methodological Issues in the Study of Development.

    ERIC Educational Resources Information Center

    Havens, A. Eugene

    The failure of development to improve the quality of life in most third world countries and in the less advantaged sectors of advanced capitalistic countries can be partially attributed, it is felt, to methodological errors made by those studying development. Some recent sociological approaches to the study of development are reviewed in this…

  2. A Methodology for Developing Diagnostic Concept Inventories

    NASA Astrophysics Data System (ADS)

    Lindell, Rebecca

    2006-12-01

    Since the development of the Force Concept Inventory, there has been a heightened interest in developing other concept inventories that not only assess whether students understand a phenomenon, but also diagnose specific alternative understandings. Unfortunately, there is no clear-cut methodology for constructing such inventories. One of the difficulties is that only some parts of test development theory are appropriate for such concept inventories, because concept inventories are distracter-driven: test-takers do not randomly choose an incorrect answer. In this poster, I will present a methodology for developing diagnostic concept inventories, which combines traditional psychometric theory with modern theories of concentration and model analysis. An example of how this methodology was utilized to develop the successful Lunar Phases Concept Inventory (LPCI) will also be given.
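
    The "theories of concentration" mentioned above can be made concrete with the concentration factor used in model analysis, which scores how strongly responses to a multiple-choice item cluster on a single answer. The sketch below is a generic illustration of that statistic, not code from the LPCI work itself:

```python
import math

def concentration_factor(counts):
    """Concentration factor C for one multiple-choice item.

    counts: list of response counts, one entry per answer choice.
    C = 0 when responses are spread evenly across choices;
    C = 1 when every student picks the same choice.
    """
    m = len(counts)          # number of answer choices
    N = sum(counts)          # number of students
    rm = math.sqrt(m)
    return rm / (rm - 1) * (math.sqrt(sum(n * n for n in counts)) / N - 1 / rm)

# Every student chooses the same distracter -> fully concentrated (C ~ 1)
print(concentration_factor([100, 0, 0, 0, 0]))
# Responses spread evenly across five choices -> no concentration (C ~ 0)
print(concentration_factor([20, 20, 20, 20, 20]))
```

A high concentration on a wrong answer flags a shared alternative conception rather than random guessing, which is what makes a distracter-driven inventory diagnostic.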

  3. Transitioning from Distributed and Traditional to Distributed and Agile: An Experience Report

    NASA Astrophysics Data System (ADS)

    Wildt, Daniel; Prikladnicki, Rafael

    Global companies that experienced extensive waterfall phased plans are trying to improve their existing processes to expedite team engagement. Agile methodologies have become an acceptable path to follow because they incorporate project management as part of their practices. Agile practices have been used with the objective of simplifying project control through simple processes, easy-to-update documentation and greater team interaction over exhaustive documentation, focusing on continuous team improvement and aiming to add value to business processes. The purpose of this chapter is to describe the experience of a global multinational company in transitioning from distributed and traditional to distributed and agile. This company has development centers across North America, South America and Asia. This chapter covers challenges faced by the project teams of two pilot projects, including strengths of using agile practices in a globally distributed environment and practical recommendations for similar endeavors.

  4. Agile Methodology - Past and Future

    DTIC Science & Technology

    2011-05-01

    • EA – DoD's Evolutionary Acquisition policy • SOA – Service-Oriented Architecture • HBR – Harvard Business Review • WS – Web Service • XP – Extreme Programming

  5. Development of an agile knowledge engineering framework in support of multi-disciplinary translational research.

    PubMed

    Borlawsky, Tara B; Dhaval, Rakesh; Hastings, Shannon L; Payne, Philip R O

    2009-03-01

    In October 2006, the National Institutes of Health launched a new national consortium, funded through Clinical and Translational Science Awards (CTSA), with the primary objective of improving the conduct and efficiency of the inherently multi-disciplinary field of translational research. To help meet this goal, the Ohio State University Center for Clinical and Translational Science has launched a knowledge management initiative that is focused on facilitating widespread semantic interoperability among administrative, basic science, clinical and research computing systems, both internally and among the translational research community at large, through the integration of domain-specific standard terminologies and ontologies with local annotations. This manuscript describes an agile framework that builds upon prevailing knowledge engineering and semantic interoperability methods, and will be implemented as part of this initiative.

  6. Agile manufacturing and constraints management: a strategic perspective

    NASA Astrophysics Data System (ADS)

    Stratton, Roy; Yusuf, Yahaya Y.

    2000-10-01

    The definition of the agile paradigm has proved elusive; it is often viewed as a panacea, in contention with more traditional approaches to operations strategy development, and lacking its own methodology and tools. The Theory of Constraints (TOC) is also poorly understood, as it is commonly associated solely with production planning and control systems and bottleneck management. This paper will demonstrate the synergy between these two approaches, together with the Theory of Inventive Problem Solving (TRIZ), and establish how the systematic elimination of trade-offs can support the agile paradigm. Whereas agility is often seen as a trade-off-free destination, both TOC and TRIZ may be considered route finders, as they comprise methodologies that focus on the identification and elimination of the trade-offs that constrain the purposeful improvement of a system, be it organizational or mechanical. This paper will also show how the TOC thinking process may be combined with the TRIZ knowledge-based approach and used to break contradictions within agile logistics.

  7. Methodology for developing educational hypermedia systems.

    PubMed

    Bearman, M; Kidd, M; Cesnik, B

    1998-01-01

    Hypermedia has the potential to greatly enhance teaching. It can, however, be difficult to develop hyperdocuments which provide medical students with the full benefit of the technology. We wish to resolve this problem by providing a methodology for creating educational hypermedia systems with specific emphasis on the medical domain, but applicable across disciplines. This paper examines some of the specific issues involved in educational hypermedia, and outlines a series of practical guidelines, drawing together the disparate disciplines necessary to the development process. We illustrate the methodology with our own experience in creating and updating the HIV Hypermedia Medical Education Software over a period of two years.

  8. CATHARE code development and assessment methodologies

    SciTech Connect

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-12-31

    The CATHARE thermal-hydraulic code has been developed jointly by the Commissariat à l'Énergie Atomique (CEA), Électricité de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate-effects tests and integral-effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation.

  9. Assessment of proposed fighter agility metrics

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.; Downing, David R.

    1990-01-01

    This paper presents the results of an analysis of proposed metrics to assess fighter aircraft agility. A novel framework for classifying these metrics is developed and applied. A set of transient metrics intended to quantify the axial and pitch agility of fighter aircraft is evaluated with a high-fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed, and sensitivity to pilot-introduced errors during flight testing is investigated. Results indicate that the power onset and power loss parameters are promising candidates for quantifying axial agility, while maximum pitch-up and pitch-down rates are promising for quantifying pitch agility.
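
    The power onset and power loss parameters referred to above are commonly defined as the time rate of change of specific excess power, Ps = V(T − D)/W. The sketch below illustrates that computation for a throttle slam; the thrust, drag, and weight numbers are purely illustrative and not from the F-18 simulation:

```python
import numpy as np

def specific_excess_power(V, T, D, W):
    """Specific excess power Ps = V*(T - D)/W, in m/s."""
    return V * (T - D) / W

# Illustrative time history of a throttle slam (hypothetical values)
t = np.linspace(0.0, 2.0, 21)              # time, s
V = np.full_like(t, 200.0)                 # airspeed, m/s (held constant here)
T = 60e3 + 60e3 * np.clip(t / 1.5, 0, 1)   # thrust spooling up over 1.5 s, N
D = np.full_like(t, 70e3)                  # drag, N
W = 160e3                                  # weight, N

Ps = specific_excess_power(V, T, D, W)
power_onset = np.max(np.gradient(Ps, t))   # peak dPs/dt, m/s^2
print(f"peak power-onset parameter: {power_onset:.1f} m/s^2")
```

With these numbers Ps ramps from −12.5 to 62.5 m/s over 1.5 s, so the power onset parameter is the 50 m/s² slope of that ramp; a throttle chop would be scored analogously by the most negative dPs/dt (power loss).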

  10. Software development methodology for high consequence systems

    SciTech Connect

    Baca, L.S.; Bouchard, J.F.; Collins, E.W.; Eisenhour, M.; Neidigk, D.D.; Shortencarier, M.J.; Trellue, P.A.

    1997-10-01

    This document describes a Software Development Methodology for High Consequence Systems. A High Consequence System is a system whose failure could lead to serious injury, loss of life, destruction of valuable resources, unauthorized use, damaged reputation or loss of credibility or compromise of protected information. This methodology can be scaled for use in projects of any size and complexity and does not prescribe any specific software engineering technology. Tasks are described that ensure software is developed in a controlled environment. The effort needed to complete the tasks will vary according to the size, complexity, and risks of the project. The emphasis of this methodology is on obtaining the desired attributes for each individual High Consequence System.

  11. The RITE Approach to Agile Acquisition

    DTIC Science & Technology

    2013-04-01

    Review (SVR), and Production Readiness Review (PRR), which were evaluated against Agile development requirements. Further analysis was conducted... Audit (FCA), PRR, Operational Test Readiness Review (OTRR), Physical Configuration Audit (PCA), Integration Readiness Review (IRR), In Service... Technology (DSB Task Force, 2009, p. 48). In the context of the primary milestone reviews (PDR, CDR, and SVR/PRR), a nominal Agile development structure was...

  12. A Metamodel for Defining Development Methodologies

    NASA Astrophysics Data System (ADS)

    Bollain, Manuel; Garbajosa, Juan

    The concept of software product is often associated with software code; process documents are, therefore, considered by-products. It is also often the case that customers demand first and foremost "results", leaving documentation in second place. Development efforts are then focused on code production at the expense of document quality and the corresponding verification activities. As discussed within this paper, one of the root problems is that documentation in the context of methodologies is often described with an insufficient level of detail. This paper presents a metamodel to address this problem. It is an extension of ISO/IEC 24744, the metamodel for development methodologies. Under this extension, documents can become the drivers of methodology activities. Documents are the artifacts on which method engineers should focus when defining a methodology, specifying their structure and constraints. Developers put their effort into filling sections of the documents as the way to progress in process execution; in turn, process execution is guided by those documents defined by the method engineers. This can form the basis for a new approach to a Document-Centric Software Engineering Environment.

  13. Agility Quotient (AQ)

    DTIC Science & Technology

    2014-06-01

    This paper seeks to answer two questions: "How can we measure a system's Agility IQ?" and "What is the requisite amount of Agility that is required?" The paper suggests a way forward and illustrates it... agility is worth our attention. AQ can be patterned after the Intelligence Quotient (IQ). IQ is a score that is associated with educational potential.

  14. Development of a high-speed wavelength-agile CO2 local oscillator for heterodyne DIAL measurements

    NASA Astrophysics Data System (ADS)

    Senft, Daniel C.; Pierrottet, Diego F.

    2002-06-01

    A high repetition rate, wavelength-agile CO2 laser has been developed at the Air Force Research Laboratory for use as a local oscillator in a heterodyne detection receiver. Fast wavelength selection is required for measurements of airborne chemical vapors using the differential absorption lidar (DIAL) technique. Acousto-optic modulators are used to tune between different wavelengths at high speeds without the need for moving mechanical parts. Other advantages obtained by the use of acousto-optic modulators are per-wavelength laser output power control and rugged packaging for field applications. The local oscillator design is described, and the results from laboratory DIAL measurements are presented. The coherent remote optical sensor system is an internal research project being conducted by the Air Force Research Laboratory Directed Energy Directorate, Active Remote Sensing Branch. The objective of the project is to develop a new long-range standoff spectral sensor that takes advantage of the enhanced performance capabilities coherent detection can provide. Emphasis of the development is on a low-cost, compact, and rugged active sensor exclusively designed for heterodyne detection using the differential absorption lidar technique. State-of-the-art technologies in waveguide laser construction and acousto-optics make feasible the next generation of lasers capable of supporting coherent lidar system requirements. Issues addressed as part of the development include optoelectronic engineering of a low-cost rugged system and fast data throughput for real-time chemical concentration measurements. All hardware used in this sensor is off-the-shelf, so only minor hardware modifications were required for the system as it stands. This paper describes a high-speed heterodyne detection CO2 DIAL system that employs a wavelength-agile, acousto-optically tuned local oscillator in the receiver. Sample experimental data collected in a controlled environment are also presented.

  15. A Comparison of Information System Development Methodologies.

    DTIC Science & Technology

    1987-12-01

    A Comparison of Information System Development Methodologies (U). Air Force Institute of Technology, Wright-Patterson AFB, OH, School of Systems and... Thesis by Steven D. Branch, Captain, USAF. AFIT/GIR/LSR/87D-1. Approved for public release.

  16. Test Methods for Robot Agility in Manufacturing

    PubMed Central

    Downs, Anthony; Harrison, William; Schlenoff, Craig

    2017-01-01

    Purpose The paper aims to define and describe test methods and metrics to assess industrial robot system agility in both simulation and in reality. Design/methodology/approach The paper describes test methods and associated quantitative and qualitative metrics for assessing robot system efficiency and effectiveness which can then be used for the assessment of system agility. Findings The paper describes how the test methods were implemented in a simulation environment and real world environment. It also shows how the metrics are measured and assessed as they would be in a future competition. Practical Implications The test methods described in this paper will push forward the state of the art in software agility for manufacturing robots, allowing small and medium manufacturers to better utilize robotic systems. Originality / value The paper fulfills the identified need for standard test methods to measure and allow for improvement in software agility for manufacturing robots. PMID:28203034
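
    One way efficiency and effectiveness metrics like those above can roll up into a single agility assessment is sketched below; the task fields, the per-task scoring rule, and the equal weighting are hypothetical illustrations, not the paper's metrics:

```python
def agility_score(tasks):
    """Composite agility score for one robot-system run (hypothetical weighting).

    tasks: list of dicts with 'completed' (bool), 'time_s' (actual time),
    and 'budget_s' (allotted time).
    Effectiveness = fraction of tasks completed.
    Efficiency    = mean of (time budget / actual time), capped at 1.0.
    """
    effectiveness = sum(t["completed"] for t in tasks) / len(tasks)
    efficiency = sum(min(1.0, t["budget_s"] / t["time_s"]) for t in tasks) / len(tasks)
    return 0.5 * effectiveness + 0.5 * efficiency

runs = [
    {"completed": True,  "time_s": 40.0,  "budget_s": 60.0},  # done, under budget
    {"completed": True,  "time_s": 90.0,  "budget_s": 60.0},  # done, but slow
    {"completed": False, "time_s": 120.0, "budget_s": 60.0},  # failed task
]
print(agility_score(runs))
```

Because the same scoring function can be applied to a simulated run and a real-world run, a gap between the two scores gives a quantitative handle on how well the simulation predicts deployed behavior.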

  17. Fighter agility metrics, research, and test

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.

    1990-01-01

    Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A completed set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation provided by the NASA Dryden Flight Research Center. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available. Simulation documentation and user instructions are provided in an appendix.

  18. Development of a flight software testing methodology

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
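
    The assertion-based testing described above can be illustrated with a small sketch; the parameter names, valid ranges, and logging behavior here are hypothetical, not from the flight software experiment:

```python
def check_assertions(state, log):
    """Executable assertions over a (hypothetical) control-law state.

    Each assertion range-checks one parameter; because parameters are
    computed from one another, a check on one also provides collateral
    testing of others. Violations are logged rather than raised,
    mimicking a watchdog task running alongside the control software.
    """
    assertions = [
        ("altitude_m",   lambda s: 0.0 <= s["altitude_m"] <= 20_000.0),
        ("pitch_deg",    lambda s: -90.0 <= s["pitch_deg"] <= 90.0),
        ("elevator_cmd", lambda s: -1.0 <= s["elevator_cmd"] <= 1.0),
    ]
    for name, ok in assertions:
        if not ok(state):
            log.append(f"assertion failed: {name}={state[name]}")

log = []
# An out-of-range actuator command (e.g. from a seeded error) is caught
check_assertions({"altitude_m": 3000.0, "pitch_deg": 5.0, "elevator_cmd": 1.7}, log)
print(log)
```

In the study's terms, a seeded error that corrupts the elevator command computation is detected here even though only the command's range is asserted, which is the indirect (collateral) coverage the analysis found most effective.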

  19. Clean, Agile Processing Technology.

    DTIC Science & Technology

    1997-12-01

    Final report: Clean, Agile Processing Technology. Contract # N00014-96-C-0139. PI: S. W. Sinton. This document is requested by the Canadian Department...

  20. Investigating the strategic antecedents of agility in humanitarian logistics.

    PubMed

    L'Hermitte, Cécile; Brooks, Benjamin; Bowles, Marcus; Tatham, Peter H

    2016-12-16

    This study investigates the strategic antecedents of operational agility in humanitarian logistics. It began by identifying the particular actions to be taken at the strategic level of a humanitarian organisation to support field-level agility. Next, quantitative data (n=59) were collected on four strategic-level capabilities (being purposeful, action-focused, collaborative, and learning-oriented) and on operational agility (field responsiveness and flexibility). Using a quantitative analysis, the study tested the relationship between organisational capacity building and operational agility and found that the four strategic-level capabilities are fundamental building blocks of agility. Collectively they account for 52 per cent of the ability of humanitarian logisticians to deal with ongoing changes and disruptions in the field. This study emphasises the need for researchers and practitioners to embrace a broader perspective of agility in humanitarian logistics. In addition, it highlights the inherently strategic nature of agility, the development of which involves focusing simultaneously on multiple drivers.
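
    The 52 per cent figure above is the kind of shared-variance (R²) statistic produced by regressing operational agility on the four strategic-level capabilities. The sketch below reproduces that computation on synthetic data; the coefficients and noise are invented, and only the sample size (n=59) and the four-predictor structure come from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the survey data: four strategic-capability scores
# (purposeful, action-focused, collaborative, learning-oriented) predicting
# an operational-agility score. Values are illustrative only.
n = 59
X = rng.normal(size=(n, 4))
y = X @ np.array([0.5, 0.4, 0.3, 0.2]) + rng.normal(scale=1.0, size=n)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(f"R^2 = {r2:.2f}")  # proportion of agility variance explained
```

R² here plays the same role as the study's 52 per cent: the share of field-level responsiveness and flexibility that the strategic-level capabilities jointly account for.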

  1. Prototype of a Graphical CONOPS (Concept of Operations) Development Environment for Agile Systems Engineering

    DTIC Science & Technology

    2012-03-23

    Development languages and physics engines supported; deployment: client-server capability; Web, PC, Mac... drawbacks (shown in red). Chief among these was their inability to deploy on the Web. A secondary consideration for this phase of our research task is the... containment) will need a consistent strategy. We need to develop an ontological schema similar to semantic webs (such as OWL), and will research this...

  2. AM-OER: An Agile Method for the Development of Open Educational Resources

    ERIC Educational Resources Information Center

    Arimoto, Maurício M.; Barroca, Leonor; Barbosa, Ellen F.

    2016-01-01

    Open Educational Resources have emerged as important elements of education in the contemporary society, promoting life-long and personalized learning that transcends social, economic and geographical barriers. To achieve the potential of OERs and bring impact on education, it is necessary to increase their development and supply. However, one of…

  3. Lean and Agile: An Epistemological Reflection

    ERIC Educational Resources Information Center

    Browaeys, Marie-Joelle; Fisser, Sandra

    2012-01-01

    Purpose: The aim of the paper is to contribute to the discussion of treating the concepts of lean and agile in isolation or combination by presenting an alternative view from complexity thinking on these concepts, considering an epistemological approach to this topic. Design/methodology/approach: The paper adopts an epistemological approach, using…

  4. Design, implementation and validation of a novel open framework for agile development of mobile health applications

    PubMed Central

    2015-01-01

    The delivery of healthcare services has experienced tremendous changes in recent years. Mobile health, or mHealth, is a key engine of advance at the forefront of this revolution. Although there is a growing development of mobile health applications, there is a lack of tools specifically devised for their implementation. This work presents mHealthDroid, an open source Android implementation of an mHealth framework designed to facilitate the rapid and easy development of mHealth and biomedical apps. The framework is particularly designed to leverage the potential of mobile devices such as smartphones or tablets, wearable sensors and portable biomedical systems. These devices are increasingly used for the monitoring and delivery of personal health care and wellbeing. The framework implements several functionalities to support resource and communication abstraction, biomedical data acquisition, health knowledge extraction, persistent data storage, adaptive visualization, system management and value-added services such as intelligent alerts, recommendations and guidelines. An exemplary application is also presented alongside this work to demonstrate the potential of mHealthDroid. This app is used to investigate the analysis of human behavior, which is considered to be one of the most prominent areas in mHealth. An accurate activity recognition model is developed and successfully validated in both offline and online conditions. PMID:26329639

  5. Design, implementation and validation of a novel open framework for agile development of mobile health applications.

    PubMed

    Banos, Oresti; Villalonga, Claudia; Garcia, Rafael; Saez, Alejandro; Damas, Miguel; Holgado-Terriza, Juan A; Lee, Sungyong; Pomares, Hector; Rojas, Ignacio

    2015-01-01

    The delivery of healthcare services has experienced tremendous changes in recent years. Mobile health, or mHealth, is a key engine of advance at the forefront of this revolution. Although there is a growing development of mobile health applications, there is a lack of tools specifically devised for their implementation. This work presents mHealthDroid, an open source Android implementation of an mHealth framework designed to facilitate the rapid and easy development of mHealth and biomedical apps. The framework is particularly designed to leverage the potential of mobile devices such as smartphones or tablets, wearable sensors and portable biomedical systems. These devices are increasingly used for the monitoring and delivery of personal health care and wellbeing. The framework implements several functionalities to support resource and communication abstraction, biomedical data acquisition, health knowledge extraction, persistent data storage, adaptive visualization, system management and value-added services such as intelligent alerts, recommendations and guidelines. An exemplary application is also presented alongside this work to demonstrate the potential of mHealthDroid. This app is used to investigate the analysis of human behavior, which is considered to be one of the most prominent areas in mHealth. An accurate activity recognition model is developed and successfully validated in both offline and online conditions.
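
    A typical first step in the kind of activity recognition model the abstract validates is segmenting the sensor stream into fixed windows and extracting per-window features. The sketch below is a generic illustration of that step; the window length, sampling rate, and feature choice are assumptions, not mHealthDroid's actual API or pipeline:

```python
import numpy as np

def window_features(signal, fs, win_s=2.0):
    """Segment a 1-D accelerometer stream into fixed-length windows and
    extract simple per-window features (mean, standard deviation).

    signal: 1-D array of accelerometer magnitudes
    fs:     sampling rate in Hz
    win_s:  window length in seconds (non-overlapping windows here)
    Returns an (n_windows, 2) feature matrix.
    """
    win = int(fs * win_s)
    n_windows = len(signal) // win
    trimmed = np.asarray(signal[: n_windows * win]).reshape(n_windows, win)
    return np.column_stack([trimmed.mean(axis=1), trimmed.std(axis=1)])

fs = 50  # Hz, a typical smartphone accelerometer rate
t = np.arange(0, 10, 1 / fs)
accel = 9.8 + 2.0 * np.sin(2 * np.pi * 2.0 * t)   # synthetic "walking" signal
feats = window_features(accel, fs)
print(feats.shape)  # five 2-second windows, two features each
```

The resulting feature matrix is what a classifier would consume; online validation, as in the abstract, means running this same windowing on the live stream as samples arrive.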

  6. An Innovative Approach to Lower the Risk of Software Intensive Development Programs

    DTIC Science & Technology

    2012-04-30

    There are several agile methodologies, and each has advantages that depend on the type of capability being developed. The scrum agile method is... constrained iterative approach. It also encourages rapid and flexible response to change. Scrum is an agile methodology used as an example that provides... capabilities. Key components of the scrum process include the following: prior to each sprint customer meeting, a prioritized list of capabilities...

  7. Focused Logistics: Putting Agility in Agile Logistics

    DTIC Science & Technology

    2011-05-19

    envisioned an agile and adaptable logistics system built around common situational understanding. The Focused Logistics concept specified the requirement to... the tracking of resources moving through the TD network. As a result, distribution centers tagged inbound...

  8. Development of Robust, Light-weight, Agile Deformable Mirrors in Carbon Fiber

    NASA Astrophysics Data System (ADS)

    Hart, M.; Ammons, S. M.; Coughenour, B.; Richardson, L.,; Romeo, R.; Martin, R.

    2012-09-01

    Carbon fiber reinforced polymer (CFRP) has recently been developed to the point that surfaces of high optical quality can be routinely replicated. Building on this advance, we are developing a new generation of deformable mirrors (DMs) for adaptive optics application that extends long-standing expertise at the University of Arizona in large, optically powered DMs for astronomy. Our existing mirrors, up to 90 cm in diameter and with aspheric deformable facesheets, are deployed on a number of large astronomical telescopes. With actuator stroke of up to 50 microns and no hysteresis, they are delivering the best imaging ever seen from an astronomical AO system. Their Zerodur glass ceramic facesheets though are not well suited to non-astronomical applications. In this paper, we describe developmental work to replace the glass components of the DMs with CFRP, an attractive material for optics fabrication because of its high stiffness-to-weight ratio, strength, and very low coefficient of thermal expansion. Surface roughness arising from fiber print-through in the CFRP facesheets is low, < 3 nm PTV across a range of temperature, and the optical figure after correction of static terms by the DM actuators is on the order of 20 nm rms. After initial investment in an optical quality mandrel, replication costs of identical units in CFRP are very low, making the technology ideal for rapid mass production.

  9. Developing collaborative environments - A Holistic software development methodology

    SciTech Connect

    PETERSEN,MARJORIE B.; MITCHINER,JOHN L.

    2000-03-08

    Sandia National Laboratories has been developing technologies to support person-to-person collaboration and the efforts of teams in the business and research communities. The technologies developed include knowledge-based design advisors, knowledge management systems, and streamlined manufacturing supply chains. These collaborative environments in which people can work together sharing information and knowledge have required a new approach to software development. The approach includes an emphasis on the requisite change in business practice that often inhibits user acceptance of collaborative technology. Leveraging the experience from this work, they have established a multidisciplinary approach for developing collaborative software environments. They call this approach "A Holistic Software Development Methodology".

  10. Agile High-Fidelity MCNP Model Development Techniques for Rapid Mechanical Design Iteration

    NASA Astrophysics Data System (ADS)

    Kulesza, Joel A.

    2009-08-01

    In order to finalize mechanical design details and perform the associated radiological analyses for the AP1000 pressurized water reactor integrated head package (IHP) in time to meet industrial obligations, a process was developed that allowed a radiological analyst to rapidly respond to changing design criteria. This process used several tools together, most of which were freely available, that enabled the analyst to rapidly re-model both geometrical and radiological details, perform a three-dimensional dose field analysis with MCNP5, examine the results, and present the results in an informative and easily understandable manner to other technical working groups. Thus far the author has used this process to study the radiological impacts of different sources due to various incore instrumentation thimble assembly (IITA) materials, different IITA shield alloys and geometrical configurations, different MP shroud thicknesses, and parameterized air duct wall thicknesses and complementary shielding. Model processing before execution will be discussed in detail. Techniques will also be described which allow for rapid spatial redistribution based on the modified source term. Post processing tools and methods will also be described that yield both qualitative and quantitative results.
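
    Rapid re-modeling of geometrical details across design iterations is often done by templating the input deck and sweeping the changing parameters. The sketch below illustrates that pattern in the simplest possible form; the cell and surface cards, material, and dimensions are hypothetical, not the AP1000 IHP model or the author's actual tooling:

```python
# Minimal parameterized input-deck template (hypothetical cards):
# a steel shell around an air duct, with two swept radii.
DECK_TEMPLATE = """\
c  parametric shield study
1  1 -7.86  -1 2      $ steel shield shell
2  0        -2        $ air duct interior
1  cz {shield_outer_radius:.2f}
2  cz {duct_radius:.2f}
"""

def render_deck(shield_outer_radius, duct_radius):
    """Fill the template for one design point (dimensions in cm)."""
    return DECK_TEMPLATE.format(shield_outer_radius=shield_outer_radius,
                                duct_radius=duct_radius)

# Sweep the shield radius and generate one deck per design point
decks = {r: render_deck(shield_outer_radius=r, duct_radius=5.0)
         for r in (10.0, 12.5, 15.0)}
print(decks[12.5].splitlines()[3])  # the surface card for that design point
```

Each rendered deck can then be submitted as a separate MCNP5 run, so a changed design dimension from the mechanical working group turns into a re-render and re-run rather than hand-editing of geometry cards.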

  11. Development and testing of a frequency-agile optical parametric oscillator system for differential absorption lidar

    NASA Astrophysics Data System (ADS)

    Weibring, P.; Smith, J. N.; Edner, H.; Svanberg, S.

    2003-10-01

    An all-solid-state fast-tuning lidar transmitter for range- and temporally resolved atmospheric gas concentration measurements has been developed and thoroughly tested. The instrument is based on a commercial optical parametric oscillator (OPO) laser system, which has been redesigned with piezoelectric transducers mounted on the wavelength-tuning mirror and on the crystal angle tuning element in the OPO. Piezoelectric transducers similarly control a frequency-mixing stage and doubling stage, which have been incorporated to extend system capabilities to the mid-IR and UV regions. The construction allows the system to be tuned to any wavelength, in any order, in the range of the piezoelectric transducers on a shot-to-shot basis. This extends the measurement capabilities far beyond the two-wavelength differential absorption lidar method and enables simultaneous measurements of several gases. The system performance in terms of wavelength, linewidth, and power stability is monitored in real time by an étalon-based wave meter and gas cells. The tests showed that the system was able to produce radiation in the 220-4300-nm-wavelength region, with an average linewidth better than 0.2 cm-1 and a shot-to-shot tunability up to 160 cm-1 within 20 ms. The utility of real-time linewidth and wavelength measurements is demonstrated by the ability to identify occasional poor quality laser shots and disregard these measurements. Also, absorption cell measurements of methane and mercury demonstrate the performance in obtaining stable wavelength and linewidth during rapid scans in the mid-IR and UV regions.
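
    The two-wavelength DIAL retrieval that this fast-tuning transmitter generalizes works by comparing on-line and off-line returns from two ranges. The sketch below implements the standard DIAL equation with illustrative signal levels and cross-section values that are not measurements from this system:

```python
import math

def dial_concentration(p_on_r1, p_on_r2, p_off_r1, p_off_r2,
                       delta_sigma_cm2, delta_r_cm):
    """Standard two-wavelength DIAL retrieval of the mean number density
    (molecules/cm^3) of the target gas in the range cell between R1 and R2.

    p_on_*, p_off_*: received powers at the on- and off-line wavelengths
    delta_sigma_cm2: differential absorption cross-section (cm^2)
    delta_r_cm:      range-cell length R2 - R1 (cm)
    """
    return (1.0 / (2.0 * delta_sigma_cm2 * delta_r_cm)
            * math.log((p_off_r2 * p_on_r1) / (p_off_r1 * p_on_r2)))

# Illustrative numbers: a 100 m range cell, 1e-19 cm^2 differential
# cross-section, on-line return attenuated more strongly than off-line
n = dial_concentration(p_on_r1=1.00, p_on_r2=0.60,
                       p_off_r1=1.00, p_off_r2=0.80,
                       delta_sigma_cm2=1.0e-19, delta_r_cm=1.0e4)
print(f"{n:.3e} molecules/cm^3")
```

Shot-to-shot tuning across many wavelength pairs, as the abstract describes, simply applies this ratio for each gas's on/off pair, enabling simultaneous multi-species retrievals.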

  12. Embedding Agile Practices within a Plan-Driven Hierarchical Project Life Cycle

    SciTech Connect

    Millard, W. David; Johnson, Daniel M.; Henderson, John M.; Lombardo, Nicholas J.; Bass, Robert B.; Smith, Jason E.

    2014-07-28

    Organizations use structured, plan-driven approaches to provide continuity, direction, and control to large, multi-year programs. Projects within these programs vary greatly in size, complexity, level of maturity, technical risk, and clarity of the development objectives. Organizations that perform exploratory research, evolutionary development, and other R&D activities can obtain the benefits of Agile practices without losing the benefits of their program’s overarching plan-driven structure. This paper describes application of Agile development methods on a large plan-driven sensor integration program. While the client employed plan-driven, requirements flow-down methodologies, tight project schedules and complex interfaces called for frequent end-to-end demonstrations to provide feedback during system development. The development process maintained the many benefits of plan-driven project execution with the rapid prototyping, integration, demonstration, and client feedback possible through Agile development methods. This paper also describes some of the tools and implementing mechanisms used to transition between and take advantage of each methodology, and presents lessons learned from the project management, system engineering, and developer’s perspectives.

  13. Methodology Development for Advocate Team Use for Input Evaluation.

    ERIC Educational Resources Information Center

    Reinhard, Diane L.

    Methodology for input evaluation, as defined by Daniel L. Stufflebeam, is relatively nonexistent. Advocate teams have recently become a popular means of generating and assessing alternative strategies for a set of objectives. This study was undertaken to develop and evaluate methodology for advocate team use in input evaluation. Steps taken…

  14. Agile manufacturing in Intelligence, Surveillance and Reconnaissance (ISR)

    NASA Astrophysics Data System (ADS)

    DiPadua, Mark; Dalton, George

    2016-05-01

    The objective of the Agile Manufacturing for Intelligence, Surveillance, and Reconnaissance (AMISR) effort is to research, develop, design and build a prototype multi-intelligence (multi-INT), reconfigurable pod demonstrating benefits of agile manufacturing and a modular open systems approach (MOSA) to make podded intelligence, surveillance, and reconnaissance (ISR) capability more affordable and operationally flexible.

  15. IT Development: Methodology Overload or Crisis?

    ERIC Educational Resources Information Center

    Korac-Boisvert, Nada; Kouzmin, Alexander

    1995-01-01

    An examination of the management techniques that underlie the information technology (IT) industry reveals a common strategy of dividing organizational functions into tasks in a "top-down" fashion. The research and development approach is recommended for design-in-action development and an open-ended IT strategy to facilitate an…

  16. Fighter agility metrics. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.

    1990-01-01

    Fighter flying qualities and combat capabilities are currently measured and compared in terms relating to vehicle energy, angular rates and sustained acceleration. Criteria based on these measurable quantities have evolved over the past several decades and are routinely used to design aircraft structures, aerodynamics, propulsion and control systems. While these criteria, or metrics, have the advantage of being well understood, easily verified and repeatable during test, they tend to measure the steady state capability of the aircraft and not its ability to transition quickly from one state to another. Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available.

  17. Enterprise Technologies Deployment for Agile Manufacturing

    SciTech Connect

    Neal, R.E.

    1992-11-01

    This report is intended for high-level technical planners who are responsible for planning future developments for their company or Department of Energy/Defense Programs (DOE/DP) facilities. On one hand, the information may be too detailed or contain too much manufacturing technology jargon for a high-level, nontechnical executive, while at the same time an expert in any of the four infrastructure fields (Product Definition/Order Entry, Planning and Scheduling, Shop Floor Management, and Intelligent Manufacturing Systems) will know more than is conveyed here. The purpose is to describe a vision of technology deployment for an agile manufacturing enterprise. According to the 21st Century Manufacturing Enterprise Strategy, the root philosophy of agile manufacturing is that "competitive advantage in the new systems will belong to agile manufacturing enterprises, capable of responding rapidly to demand for high-quality, highly customized products." Such agility will be based on flexible technologies, skilled workers, and flexible management structures which collectively will foster cooperative initiatives in and among companies. The remainder of this report is dedicated to sharpening our vision and to establishing a framework for defining specific project or pre-competitive project goals which will demonstrate agility through technology deployment.

  18. Enterprise Technologies Deployment for Agile Manufacturing

    SciTech Connect

    Neal, R.E.

    1992-11-01

    This report is intended for high-level technical planners who are responsible for planning future developments for their company or Department of Energy/Defense Programs (DOE/DP) facilities. On one hand, the information may be too detailed or contain too much manufacturing technology jargon for a high-level, nontechnical executive, while at the same time an expert in any of the four infrastructure fields (Product Definition/Order Entry, Planning and Scheduling, Shop Floor Management, and Intelligent Manufacturing Systems) will know more than is conveyed here. The purpose is to describe a vision of technology deployment for an agile manufacturing enterprise. According to the 21st Century Manufacturing Enterprise Strategy, the root philosophy of agile manufacturing is that "competitive advantage in the new systems will belong to agile manufacturing enterprises, capable of responding rapidly to demand for high-quality, highly customized products." Such agility will be based on flexible technologies, skilled workers, and flexible management structures which collectively will foster cooperative initiatives in and among companies. The remainder of this report is dedicated to sharpening our vision and to establishing a framework for defining specific project or pre-competitive project goals which will demonstrate agility through technology deployment.

  19. The measurement and improvement of the lateral agility of the F-18

    NASA Technical Reports Server (NTRS)

    Eggold, David P.; Valasek, John; Downing, David R.

    1991-01-01

    The effect of vehicle configuration and flight control system performance on the roll agility of a modern fighter aircraft has been investigated. A batch simulation of a generic F-18 Hornet was used to study roll agility as measured by the time-to-roll-through-90-deg metric. Problems discussed include the definition of agility, factors affecting the agility of a vehicle, the development of the time-to-roll-through-90-deg agility metric, and a simulation experiment. It is concluded that the integral of stability- or wind-axis roll rate should be used as the measure of roll angle traversed. The time-to-roll-through-90-deg metric is considered a good metric for measuring the transient performance aspect of agility. Roll agility of the F-18, as measured by this metric, can be improved by 10 to 30 percent. Compatible roll and rudder actuator rates can significantly affect the 90 deg agility metric.
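
The metric described, integrating the wind-axis roll rate until 90 deg of bank change is accumulated, can be sketched numerically. The roll-rate history below is invented for illustration:

```python
import numpy as np

# Time-to-roll-through-90-deg: integrate the wind-axis roll rate p_w(t)
# (trapezoid rule) and report the first time the accumulated roll angle
# reaches 90 deg.
def time_to_roll_90(t, p_w_deg_s, target_deg=90.0):
    """First time at which the integral of wind-axis roll rate reaches target."""
    angle = np.concatenate(
        ([0.0], np.cumsum(np.diff(t) * 0.5 * (p_w_deg_s[1:] + p_w_deg_s[:-1])))
    )
    idx = np.argmax(angle >= target_deg)
    if angle[idx] < target_deg:
        return None  # target never reached in this record
    return t[idx]

# Invented example: roll rate ramps to 180 deg/s over 0.5 s, then holds.
t = np.linspace(0.0, 2.0, 2001)
p = np.minimum(t / 0.5, 1.0) * 180.0
t90 = time_to_roll_90(t, p)
```

With this ramp profile the aircraft accumulates 45 deg during the ramp and reaches 90 deg at t ≈ 0.75 s, illustrating how the metric rewards fast roll onset rather than peak steady-state rate alone.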

  20. The Software First System Development Methodology

    DTIC Science & Technology

    1989-02-15

    [SCHI85] J. Schill, R. Smeaton, R. Jackman, "The Conversion of Command & Control Software to Ada: Experiences and Lessons Learned," Vol. IV... NY: John Wiley & Sons, Inc., 1984. [WILL87] Williams, T., "Real-Time Development Tools Aid Embedded Control System Design," Computer Design, October

  1. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    NASA Technical Reports Server (NTRS)

    Bolton, Albanie T.

    2016-01-01

    NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth orbit (BEO). The world's most powerful rocket, SLS will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA, using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software division provides comprehensive engineering expertise for the development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile approach to testing and verification of software. The agile software test method opens the door for regular short sprint release cycles. The basic premise of agile software development and testing is that work proceeds iteratively and incrementally: requirements and solutions evolve through collaboration between cross-functional teams. With testing and development done incrementally, releases gain features and value over time. This value can be seen throughout the T&V team processes, which are documented in work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage rather than in a later release, and team members gain increased knowledge of the system architecture by interfacing with designers. The SLS Flight Software teams intend to continue uncovering better ways of developing software in an efficient and project-beneficial manner.

  2. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  3. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion.

  4. Development of Distributed Computing Systems Software Design Methodologies.

    DTIC Science & Technology

    1982-11-05

    [OCR fragment of report documentation page] Final Report: Development of Distributed Computing Systems Software Design Methodologies. Stephen S. Yau, Northwestern Univ., Evanston, IL, Dept. of Electrical...

  5. Photogrammetry Methodology Development for Gossamer Spacecraft Structures

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; Jones, Thomas W.; Black, Jonathan T.; Walford, Alan; Robson, Stuart; Shortis, Mark R.

    2002-01-01

    Photogrammetry--the science of calculating 3D object coordinates from images--is a flexible and robust approach for measuring the static and dynamic characteristics of future ultra-lightweight and inflatable space structures (a.k.a., Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.
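
The core computation named in the abstract, calculating 3D object coordinates from images, can be sketched with standard two-view linear triangulation (the DLT method). The cameras and point below are toy values, not data from the NASA effort:

```python
import numpy as np

# Two-view linear triangulation: recover a 3D point from its pixel
# observations in two calibrated cameras with known 3x4 projection matrices.
def triangulate(P1, P2, uv1, uv2):
    """Least-squares 3D point from two projections (homogeneous DLT)."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]           # null vector of A = homogeneous 3D point
    return X[:3] / X[3]

# Toy setup: two cameras looking down +Z, second offset along X.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.3, -0.2, 5.0])
x1 = P1 @ np.append(X_true, 1.0)
x2 = P2 @ np.append(X_true, 1.0)
X_est = triangulate(P1, P2, x1[:2] / x1[2], x2[:2] / x2[2])
```

Production photogrammetry adds camera calibration, bundle adjustment over many views, and target identification, but this linear step is the geometric heart of the measurement.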

  6. Photogrammetry Methodology Development for Gossamer Spacecraft Structures

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; Jones, Thomas W.; Walford, Alan; Black, Jonathan T.; Robson, Stuart; Shortis, Mark R.

    2002-01-01

    Photogrammetry--the science of calculating 3D object coordinates from images-is a flexible and robust approach for measuring the static and dynamic characteristics of future ultralightweight and inflatable space structures (a.k.a., Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.

  7. An investigation of fighter aircraft agility

    NASA Technical Reports Server (NTRS)

    Valasek, John; Downing, David R.

    1993-01-01

    This report attempts to unify in a single document the results of a series of studies on fighter aircraft agility funded by the NASA Ames Research Center, Dryden Flight Research Facility and conducted at the University of Kansas Flight Research Laboratory during the period January 1989 through December 1993. New metrics proposed by pilots and the research community to assess fighter aircraft agility are collected and analyzed. The report develops a framework for understanding the context into which the various proposed fighter agility metrics fit in terms of application and testing. Since new metrics continue to be proposed, this report does not claim to contain every proposed fighter agility metric. Flight test procedures, test constraints, and related criteria are developed. Instrumentation required to quantify agility via flight test is considered, as is the sensitivity of the candidate metrics to deviations from nominal pilot command inputs, which is studied in detail. Instead of supplying specific, detailed conclusions about the relevance or utility of one candidate metric versus another, the authors have attempted to provide sufficient data and analyses for readers to formulate their own conclusions. Readers are therefore ultimately responsible for judging exactly which metrics are 'best' for their particular needs. Additionally, it is not the intent of the authors to suggest combat tactics or other actual operational uses of the results and data in this report. This has been left up to the user community. Twenty of the candidate agility metrics were selected for evaluation with high fidelity, nonlinear, non real-time flight simulation computer programs of the F-5A Freedom Fighter, F-16A Fighting Falcon, F-18A Hornet, and X-29A. The information and data presented on the 20 candidate metrics which were evaluated will assist interested readers in conducting their own extensive investigations. 
The report provides a definition and analysis of each metric; details

  8. Perspectives on Agile Coaching

    NASA Astrophysics Data System (ADS)

    Fraser, Steven; Lundh, Erik; Davies, Rachel; Eckstein, Jutta; Larsen, Diana; Vilkki, Kati

    There are many perspectives to agile coaching, including: growing coaching expertise; selecting the appropriate coach for your context; and evaluating value. A coach is often an itinerant who may observe, mentor, negotiate, influence, lead, and/or architect everything from team organization to system architecture. With roots in diverse fields ranging from technology to sociology, coaches have differing motivations and experience bases. This panel will bring together coaches to debate and discuss various perspectives on agile coaching. Some of the questions to be addressed include: What are the skills required for effective coaching? What should be the expectations for teams or individuals being coached? Should coaches be a corporate resource (an internal team of consultants working with multiple internal teams), an integral part of a specific team, or external contractors? How should coaches exercise influence and authority? How should management assess the value of a coaching engagement? Do you have what it takes to be a coach? This panel will bring together seasoned agile coaches to offer their experience and advice on how to be the best you can be!

  9. Prometheus Reactor I&C Software Development Methodology, for Action

    SciTech Connect

    T. Hamilton

    2005-07-30

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I&C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I&C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.

  10. CT-assisted agile manufacturing

    NASA Astrophysics Data System (ADS)

    Stanley, James H.; Yancey, Robert N.

    1996-11-01

    The next century will witness at least two great revolutions in the way goods are produced. First, workers will use the medium of virtual reality in all aspects of marketing, research, development, prototyping, manufacturing, sales and service. Second, market forces will drive manufacturing towards small-lot production and just-in-time delivery. Already, we can discern the merging of these megatrends into what some are calling agile manufacturing. Under this new paradigm, parts and processes will be designed and engineered within the mind of a computer, tooled and manufactured by the offspring of today's rapid prototyping equipment, and evaluated for performance and reliability by advanced nondestructive evaluation (NDE) techniques and sophisticated computational models. Computed tomography (CT) is the premier example of an NDE method suitable for future agile manufacturing activities. It is the only modality that provides convenient access to the full suite of engineering data that users will need to avail themselves of computer- aided design, computer-aided manufacturing, and computer- aided engineering capabilities, as well as newly emerging reverse engineering, rapid prototyping and solid freeform fabrication technologies. As such, CT is assured a central, utilitarian role in future industrial operations. An overview of this exciting future for industrial CT is presented.

  11. Operational Agility (La Maniabilite Operationnelle)

    DTIC Science & Technology

    1994-04-01

    [OCR fragment] ...describe how to go further and how to set up an analytical framework for the analysis of another fundamental property of modern combat aircraft... an analytical framework for the analysis of airframe agility and for the derivation of agility metrics. A general consensus has been found in relating agility...

  12. A Methodology for the Development of Direct Fired Flight Projectiles

    NASA Astrophysics Data System (ADS)

    Farina, Anthony P.

    This thesis addresses shortcomings in flight projectile design by describing the creation of an improved product development methodology for direct fired flight projectiles. At the outset, platform-based flight projectile design and the requirements for direct fired flight projectiles are considered. The traditional methods and tools used in flight projectile design and development are presented, and the improved methodology for the design and development of direct fired flight projectiles is introduced. This methodology improves upon the traditional design methodology for flight projectiles by addressing the difference in fidelity levels of the applicable design tools, classifying designs and components by families and their characteristics, and applying the tools of IPD in a three-phased approach across the low-, medium-, and high-fidelity models of each discipline to create an efficient design methodology for flight projectiles. This includes an evaluation of the relationship between the number of alternatives at each fidelity level and the time to evaluate each configuration. Early in the design process there may be many configurations under evaluation, so it is advantageous to use fast-running low-fidelity models to reduce the candidates to the feasible design space, to use medium-fidelity models populated with data from the low-fidelity codes as the field narrows, and to reserve the more time-consuming and computationally expensive models for the few final design candidates. This new design methodology improves upon traditional development methods through the use of models of appropriate fidelity at each stage of development, and the design process is further improved by proper and timely integration between predictive codes of varying fidelity levels.
The utilization of such a highly desirable methodology enables the efficient design of flight projectiles that meets the customer needs of increased levels of performance against new

  13. Architectural Tactics to Support Rapid and Agile Stability

    DTIC Science & Technology

    2012-05-01

    [OCR fragment] CrossTalk, May/June 2012, "Rapid and Agile Stability": Scrum teams, product development teams, component teams, or feature teams spend almost... individuals in the roles of Scrum master, developer, project manager, and architect on projects from organizations that develop embedded real... agile stability. Using Scrum, 25 teams participated in the development effort. Some of the teams were colocated...

  14. Candidate control design metrics for an agile fighter

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Bailey, Melvin L.; Ostroff, Aaron J.

    1991-01-01

    Success in the fighter combat environment of the future will certainly demand increasing capability from aircraft technology. These advanced capabilities, in the form of superagility and supermaneuverability, will require special design techniques which translate advanced air combat maneuvering requirements into design criteria. Control design metrics can provide some of these techniques for the control designer. This study presents an overview of control design metrics and investigates metrics for advanced fighter agility. The objectives of various metric users, such as airframe designers and pilots, are differentiated from the objectives of the control designer. Using an advanced fighter model, metric values are documented over a portion of the flight envelope through piloted simulation. These metric values provide a baseline against which future control system improvements can be compared and against which a control design methodology can be developed. Agility is measured for the axial, pitch, and roll axes. Axial metrics highlight acceleration and deceleration capabilities under different flight loads and include specific excess power measurements to characterize energy maneuverability. Pitch metrics cover both body-axis and wind-axis pitch rates and accelerations. Included in the pitch metrics are nose-pointing metrics which highlight displacement capability between the nose and the velocity vector. Roll metrics (or torsion metrics) focus on rotational capability about the wind axis.
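
The specific excess power measurement mentioned for the axial metrics is a standard energy-maneuverability quantity. A minimal sketch, with illustrative numbers that are not from the study:

```python
# Specific excess power: the rate of change of specific energy,
#   Ps = V * (T - D) / W   [m/s]
# Positive Ps means the aircraft can climb and/or accelerate at this condition.
def specific_excess_power(v_mps, thrust_n, drag_n, weight_n):
    """Specific excess power in m/s for airspeed V, thrust T, drag D, weight W."""
    return v_mps * (thrust_n - drag_n) / weight_n

# Invented example: 200 m/s, 40 kN thrust surplus, 160 kN weight.
ps = specific_excess_power(200.0, 100e3, 60e3, 160e3)  # → 50.0 m/s
```

Mapping Ps over the flight envelope under different load factors is what lets the axial metrics characterize acceleration and deceleration capability.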

  15. Agile Walking Robot

    NASA Technical Reports Server (NTRS)

    Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.; Waldron, Kenneth J.

    1990-01-01

    Proposed agile walking robot operates over rocky, sandy, and sloping terrain. Offers stability and climbing ability superior to other conceptual mobile robots. Equipped with six articulated legs like those of insect, continually feels ground under leg before applying weight to it. If leg sensed unexpected object or failed to make contact with ground at expected point, seeks alternative position within radius of 20 cm. Failing that, robot halts, examines area around foot in detail with laser ranging imager, and replans entire cycle of steps for all legs before proceeding.

  16. Agile Infrastructure Monitoring

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.

    2014-06-01

    At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.

  17. Achieving agility through parameter space qualification

    SciTech Connect

    Diegert, K.V.; Easterling, R.G.; Ashby, M.R.; Benavides, G.L.; Forsythe, C.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-02-01

    The A-primed (Agile Product Realization of Innovative electro-Mechanical Devices) project is defining and proving processes for agile product realization for the Department of Energy complex. Like other agile production efforts reported in the literature, A-primed uses concurrent engineering and information automation technologies to enhance information transfer. A unique aspect of our approach to agility is the qualification during development of a family of related product designs and their production processes, rather than a single design and its attendant processes. By applying engineering principles and statistical design of experiments, economies of test and analytic effort are realized for the qualification of the device family as a whole. This minimizes the need for test and analysis to qualify future devices from this family, thereby further reducing the design-to-production cycle time. As a measure of the success of the A-primed approach, the first design took 24 days to produce and operated correctly on the first attempt. A flow diagram for the qualification process is presented. Guidelines are given for implementation, based on the authors' experiences as members of the A-primed qualification team.
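
Qualifying a family of designs over a parameter space, as described above, typically starts from a designed experiment spanning that space. A minimal full-factorial sketch; the factor names and levels are invented for illustration, not from the A-primed project:

```python
import itertools

# Full-factorial design: enumerate every combination of factor levels so the
# whole parameter space of the device family is covered by qualification runs.
def full_factorial(levels):
    """All combinations of factor levels (dict: factor name -> list of values)."""
    names = list(levels)
    return [dict(zip(names, combo))
            for combo in itertools.product(*(levels[n] for n in names))]

# Hypothetical two-factor family: 3 spring rates x 2 gap sizes = 6 runs.
runs = full_factorial({"spring_rate": [0.8, 1.0, 1.2], "gap_mm": [0.1, 0.2]})
```

Fractional-factorial or response-surface designs would reduce the run count further when factor interactions are known to be limited, which is where the economies of test effort come from.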

  18. Frequency agile optical parametric oscillator

    DOEpatents

    Velsko, Stephan P.

    1998-01-01

    The frequency agile OPO device converts a fixed wavelength pump laser beam to arbitrary wavelengths within a specified range with pulse to pulse agility, at a rate limited only by the repetition rate of the pump laser. Uses of this invention include Laser radar, LIDAR, active remote sensing of effluents/pollutants, environmental monitoring, antisensor lasers, and spectroscopy.
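
As a hedged aside, the wavelength relation any OPO conversion stage must satisfy, with the pump photon's energy splitting between signal and idler, can be checked numerically. The 532 nm / 800 nm values below are illustrative only, not device parameters from the patent:

```python
# OPO energy conservation: 1/lambda_pump = 1/lambda_signal + 1/lambda_idler,
# so choosing the signal wavelength fixes the idler wavelength.
def idler_wavelength_nm(pump_nm, signal_nm):
    """Idler wavelength (nm) implied by pump and signal wavelengths (nm)."""
    return 1.0 / (1.0 / pump_nm - 1.0 / signal_nm)

# Illustrative example: a 532 nm pump with an 800 nm signal.
idler = idler_wavelength_nm(532.0, 800.0)
```

This relation is why tuning the OPO's signal output (e.g., by crystal angle) simultaneously sweeps a complementary idler wavelength deeper in the infrared.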

  19. Frequency agile optical parametric oscillator

    DOEpatents

    Velsko, S.P.

    1998-11-24

    The frequency agile OPO device converts a fixed wavelength pump laser beam to arbitrary wavelengths within a specified range with pulse to pulse agility, at a rate limited only by the repetition rate of the pump laser. Uses of this invention include Laser radar, LIDAR, active remote sensing of effluents/pollutants, environmental monitoring, antisensor lasers, and spectroscopy. 14 figs.

  20. Methodology Evaluation Framework for Component-Based System Development.

    ERIC Educational Resources Information Center

    Dahanayake, Ajantha; Sol, Henk; Stojanovic, Zoran

    2003-01-01

    Explains component-based development (CBD) for distributed information systems and presents an evaluation framework, which highlights the extent to which a methodology is component oriented. Compares prominent CBD methods, discusses ways of modeling, and suggests that this is a first step towards a components-oriented systems development…

  1. Research Methodology on Language Development from a Complex Systems Perspective

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane; Cameron, Lynne

    2008-01-01

    Changes to research methodology motivated by the adoption of a complexity theory perspective on language development are considered. The dynamic, nonlinear, and open nature of complex systems, together with their tendency toward self-organization and interaction across levels and timescales, requires changes in traditional views of the functions…

  2. Development of Management Methodology for Engineering Production Quality

    NASA Astrophysics Data System (ADS)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors propose four directions for developing a quality-management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organizational context, taking stakeholders into account; the use of risk management; management of in-house knowledge; and assessment of enterprise activity against effectiveness criteria.

  3. Prioritization Methodology for Development of Required Operational Capabilities

    DTIC Science & Technology

    2010-04-01


  4. A Methodology for Developing Learning Objects for Web Course Delivery

    ERIC Educational Resources Information Center

    Stauffer, Karen; Lin, Fuhua; Koole, Marguerite

    2008-01-01

    This article presents a methodology for developing learning objects for web-based courses using the IMS Learning Design (IMS LD) specification. We first investigated the IMS LD specification, determining how to use it with online courses and the student delivery model, and then applied this to a Unit of Learning (UOL) for online computer science…

  5. XP Workshop on Agile Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Ghanam, Yaser; Cooper, Kendra; Abrahamsson, Pekka; Maurer, Frank

    Software Product Line Engineering (SPLE) promises to lower the cost of developing individual applications because they heavily reuse existing artifacts. Besides decreasing costs, software reuse achieves faster development and higher quality. Traditionally, SPLE favors big design upfront and employs heavyweight processes. Agile methods, on the other hand, have been proposed to rapidly develop high-quality software by focusing on producing working code while reducing upfront analysis and design. Combining the two paradigms, although challenging, can yield significant improvements.

  6. Demand Activated Manufacturing Architecture (DAMA) supply chain collaboration development methodology

    SciTech Connect

    PETERSEN,MARJORIE B.; CHAPMAN,LEON D.

    2000-03-15

    The Demand Activated Manufacturing Architecture (DAMA) project, during the last five years of work with the U.S. Integrated Textile Complex (retail, apparel, textile, and fiber sectors), has developed an inter-enterprise supply chain collaboration development methodology. The goal of this methodology is to enable a supply chain to work more efficiently and competitively. The outcomes of this methodology include: (1) a definitive description and evaluation of the role of business cultures and supporting business organizational structures in either inhibiting or fostering change to a more competitive supply chain; (2) "As-Is" and proposed "To-Be" supply chain business process models focusing on information flows and decision-making; and (3) software tools that enable and support a transition to a more competitive supply chain, resulting from a business-driven rather than technology-driven approach to software design. This methodology development will continue in FY00 as DAMA engages companies in the soft-goods industry in supply chain research and implementation of supply chain collaboration.

  7. Risk-Informed Assessment Methodology Development and Application

    SciTech Connect

    Sung Goo Chi; Seok Jeong Park; Chul Jin Choi; Ritterbusch, S.E.; Jacob, M.C.

    2002-07-01

    Westinghouse Electric Company (WEC) has been working with Korea Power Engineering Company (KOPEC) on a US Department of Energy (DOE) sponsored Nuclear Energy Research Initiative (NERI) project through a collaborative agreement established for the domestic NERI program. The project deals with Risk-Informed Assessment (RIA) of regulatory and design requirements of future nuclear power plants. An objective of the RIA project is to develop a risk-informed design process which focuses on identifying and incorporating advanced features into future nuclear power plants (NPPs) that would meet risk goals in a cost-effective manner. The RIA design methodology is proposed to accomplish this objective. This paper discusses the development of this methodology and demonstrates its application in the design of plant systems for future NPPs. Advanced conceptual plant systems consisting of an advanced Emergency Core Cooling System (ECCS) and Emergency Feedwater System (EFWS) for an NPP were developed, and the risk-informed design process was exercised to demonstrate the viability and feasibility of the RIA design methodology. Best-estimate Loss-of-Coolant Accident (LOCA) analyses were performed to validate the PSA success criteria for the NPP. The results of the analyses show that the PSA success criteria can be met using the advanced conceptual systems and that the RIA design methodology is a viable and appropriate means of designing key features of risk-significant NPP systems. (authors)

  8. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of each chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques; specifically, the cost of implementing and applying them as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. For techniques that cannot be quantitatively assessed, qualitative judgments are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.

  9. Development of a methodology to assess the footprint of wastes.

    PubMed

    Herva, Marta; Hernando, Ramón; Carrasco, Eugenio F; Roca, Enrique

    2010-08-15

    The ecological footprint (EF) is a widely used indicator to assess the sustainability of people, regions, or business activities. Although this metric has grown in interest and popularity over the years, it has also been the subject of criticism and controversy. The advantages of an aggregated indicator are often overshadowed by the shortcomings of its corresponding methodology. One weakness of the EF is that it does not account for toxic or hazardous pollutants and wastes, which cannot be part of a closed biological cycle. The methodology developed in the present work estimates the EF of toxic and hazardous wastes by considering a closed cycle modeled through a plasma process, a phenomenon that occurs naturally in stars and volcanoes. Wastes from industry can be treated in a thermal plasma gasification process, and, by developing a methodology to describe this process, the EF of hazardous wastes was calculated. A value of 56.5 gha was obtained, a figure on the same order of magnitude as that obtained in a previous study in which a conventional ecological footprint methodology was applied to the same production process.

  10. Methodology to develop and evaluate a semantic representation for NLP.

    PubMed

    Irwin, Jeannie Y; Harkema, Henk; Christensen, Lee M; Schleyer, Titus; Haug, Peter J; Chapman, Wendy W

    2009-11-14

    Natural language processing applications that extract information from text rely on semantic representations. The objective of this paper is to describe a methodology for creating a semantic representation for information that will be automatically extracted from textual clinical records. We illustrate two of the four steps of the methodology in this paper using the case study of encoding information from dictated dental exams: (1) develop an initial representation from a set of training documents and (2) iteratively evaluate and evolve the representation while developing annotation guidelines. Our approach for developing and evaluating a semantic representation is based on standard principles and approaches that are not dependent on any particular domain or type of semantic representation.

  11. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. Data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline ingest, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data-provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated, to improve efficiency and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed for this system.
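    Goal (1), tracking a data set from acceptance to publication, is the kind of requirement a small state machine captures. The sketch below is a hedged illustration only; the stage names and class are hypothetical, since the abstract does not enumerate the ORNL DAAC workflow's actual states:

    ```python
    # Hypothetical stage names standing in for the ORNL DAAC workflow states.
    STAGES = ["accepted", "curated", "documented", "reviewed", "published"]

    class DataSetTracker:
        """Tracks one data set's progress from acceptance to publication."""

        def __init__(self, name):
            self.name = name
            self._index = 0  # every data set starts at "accepted"

        @property
        def stage(self):
            return STAGES[self._index]

        def advance(self):
            """Move the data set to the next stage; published is terminal."""
            if self._index == len(STAGES) - 1:
                raise ValueError(f"{self.name} is already published")
            self._index += 1
            return self.stage
    ```

    A centralized ingest system (goal 4) would persist many such trackers, giving curators a uniform view of where each of the simultaneously curated data sets stands.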

  12. Agile manufacturing concept

    NASA Astrophysics Data System (ADS)

    Goldman, Steven L.

    1994-03-01

    The initial conceptualization of agile manufacturing was the result of a 1991 study -- chaired by Lehigh Professor Roger N. Nagel and California-based entrepreneur Rick Dove, President of Paradigm Shifts, International -- of what it would take for U.S. industry to regain global manufacturing competitiveness by the early twenty-first century. This industry-led study, reviewed by senior management at over 100 companies before its release, concluded that incremental improvement of the current system of manufacturing would not be enough to be competitive in today's global marketplace. Computer-based information and production technologies that were becoming available to industry opened up the possibility of an altogether new system of manufacturing, one that would be characterized by a distinctive integration of people and technologies; of management and labor; of customers, producers, suppliers, and society.

  13. Towards Agile Ontology Maintenance

    NASA Astrophysics Data System (ADS)

    Luczak-Rösch, Markus

    Ontologies are an appropriate means to represent knowledge on the Web. Research on ontology engineering has produced practices for integrative lifecycle support. However, broader success of ontologies in Web-based information systems remains unreached, while more lightweight semantic approaches are rather successful. We assume that, paired with the emerging trend of services and microservices on the Web, new dynamic scenarios are gaining momentum in which a shared knowledge base is made available to several dynamically changing services with disparate requirements. Our work envisions a step towards such a dynamic scenario, in which an ontology adapts in an agile way to the requirements of the accessing services and applications, as well as to users' needs, and reduces the experts' involvement in ontology maintenance processes.

  14. Aircraft agility maneuvers

    NASA Technical Reports Server (NTRS)

    Cliff, Eugene M.; Thompson, Brian G.

    1992-01-01

    A new dynamic model for aircraft motions is presented. This model can be viewed as intermediate between a point-mass model, in which the body attitude angles are control-like, and a rigid-body model, in which the body-attitude angles evolve according to Newton's Laws. Specifically, consideration is given to the case of symmetric flight, and a model is constructed in which the body roll-rate and the body pitch-rate are the controls. In terms of this body-rate model a minimum-time heading change maneuver is formulated. When the bounds on the body-rates are large the results are similar to the point-mass model in that the model can very quickly change the applied forces and produce an acceleration to turn the vehicle. With finite bounds on these rates, the forces change in a smooth way. This leads to a measurable effect of agility.

  15. Development of test methodology for dynamic mechanical analysis instrumentation

    NASA Technical Reports Server (NTRS)

    Allen, V. R.

    1982-01-01

    Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the DuPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed that allowed 'relative' damping during the cure cycle to be measured for the fiberglass-supported resin. Correlating fracture-energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result, because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  16. Elements of an Art - Agile Coaching

    NASA Astrophysics Data System (ADS)

    Lundh, Erik

    This tutorial gives you a lead on becoming, or redefining yourself as, an Agile Coach. It introduces the elements and dimensions of state-of-the-art Agile Coaching: how to position the agile coach to be effective in a larger setting; making the agile transition, from a single team to thousands of people; how to support multiple teams as a coach; how to build a coaches' network in your company; and the challenges that arise when the agile coach is a consultant and the organization is large.

  17. The Development of Methodology to Support Comprehensive Approach: TMC

    DTIC Science & Technology

    2014-05-02

    methodology components, each of which supports a different spectrum of multidisciplinary teamwork. The TMC development has been an iterative process and it...challenges can be mitigated to some extent through the CA process improvement. For example, the planning stage in the CA is very important in defining how...expressed by Canadian Army in May 2009 during which they stressed the importance of: • Considering all perspectives of war (i.e. cultural political

  18. Coal resources available for development; a methodology and pilot study

    USGS Publications Warehouse

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated

  19. Compact, Automated, Frequency-Agile Microspectrofluorimeter

    NASA Technical Reports Server (NTRS)

    Fernandez, Salvador M.; Guignon, Ernest F.

    1995-01-01

    Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.

  20. Development of a statistically based access delay timeline methodology.

    SciTech Connect

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard for uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methods, taking into account small sample sizes, expert judgment, human factors, and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results to make informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
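    The report's Bayesian model is not reproduced in the abstract; as a minimal sketch of the underlying idea, the Monte Carlo fragment below propagates per-task uncertainty into a distribution of total path delay rather than a single worst-case number. The lognormal task-time parameters are hypothetical, chosen only for illustration:

    ```python
    import random

    random.seed(42)  # fixed seed so the sketch is reproducible

    # Hypothetical per-task uncertainty: (mu, sigma) of the underlying normal
    # in log-seconds for each sequential barrier/task on the adversary path.
    TASKS = [(4.0, 0.3), (3.5, 0.5), (5.0, 0.4)]

    def sample_path_delay():
        """One Monte Carlo draw of the total delay along the adversary path."""
        return sum(random.lognormvariate(mu, sigma) for mu, sigma in TASKS)

    # Build an empirical delay distribution and read off percentiles.
    draws = sorted(sample_path_delay() for _ in range(10_000))
    p10 = draws[int(0.10 * len(draws))]
    p50 = draws[int(0.50 * len(draws))]
    p90 = draws[int(0.90 * len(draws))]
    ```

    The spread between the percentiles, rather than a single discounted worst-case time, is what lets an analyst weigh delay-system benefits against risk.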

  1. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada to large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life-cycle costs and ease program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and a one-to-one correlation between object states and real-world states. The simulation design process was automated by the Problem Statement Language (PSL)/Problem Statement Analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance code efficiency, e.g., by eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness, and efficiency goals are discussed.

  2. Reliability of a Field Test of Defending and Attacking Agility in Australian Football and Relationships to Reactive Strength.

    PubMed

    Young, Warren B; Murray, Mitch P

    2017-02-01

    Young, WB and Murray, MP. Reliability of a field test of defending and attacking agility in Australian football and relationships to reactive strength. J Strength Cond Res 31(2): 509-516, 2017. Defending and attacking agility tests for Australian football do not exist, and it is unknown whether any physical qualities correlate with these types of agility. The purposes of this study were to develop new field tests of defending and attacking agility for Australian Rules football, to determine whether they were reliable, and to describe the relationship between the agility tests to determine their specificity. Because the reactive strength (RS) of the lower limb muscles has previously been correlated with change-of-direction speed, we also investigated the relationship between this quality and the agility tests. Nineteen male competitive recreational-level Australian Rules football players were assessed on the agility tests and a drop jump test to assess RS. Interday and interrater reliability were also assessed. The agility tests involved performing 10 trials of one-on-one agility tasks against 2 testers (opponents), in which the objective was to be in a position to tackle (defending) or to evade (attacking) the opponent. Both agility tests had good reliability (intraclass correlation > 0.8, %CV < 3, no significant differences between test occasions [p > 0.05]), and interrater reliability was very high (r = 0.997, p < 0.001). The common variance between the agility tests was 45%, indicating that they represented relatively independent skills. There was a large correlation between RS and defending agility (r = 0.625, p = 0.004), and a very large correlation with attacking agility (r = 0.731, p < 0.001). Defending and attacking agility have different characteristics, possibly related to the footwork, physical, and cognitive demands of each. Nonetheless, RS seems to be important for agility, especially attacking agility.
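    The study's key statistics (e.g., r = 0.731 between reactive strength and attacking agility, and the 45% common variance between the two tests) are Pearson correlations and their squares. A minimal sketch with synthetic scores, not the study's data, showing how both figures are computed (requires Python 3.10+ for `statistics.correlation`):

    ```python
    from statistics import correlation  # Pearson's r by default (Python 3.10+)

    # Synthetic illustration only: reactive-strength and attacking-agility
    # scores for a small squad; these are not the study's measurements.
    reactive_strength = [1.2, 1.5, 1.1, 1.8, 1.4, 1.6, 1.3]
    attacking_agility = [2.9, 3.4, 2.7, 3.8, 3.1, 3.5, 3.0]

    r = correlation(reactive_strength, attacking_agility)
    shared_variance = r ** 2  # proportion of variance the two measures share
    ```

    Squaring r is how the abstract's "common variance between the agility tests was 45%" figure is derived from a correlation of about 0.67 between the two tests.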

  3. Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2016-01-01

    To ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASA's critical software systems as the software industry rapidly transitions from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA), established the Agile Benchmarking Team (ABT). The Team's tasks were to: 1. Research background literature on current Agile processes; 2. Perform benchmark activities with other organizations involved in software Agile processes to determine best practices; 3. Collect information on Agile-developed systems to enable improvements to the current NASA standards and processes, enhancing their ability to support reliable software assurance of NASA Agile-developed systems; 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering, and software assurance are addressed herein.

  4. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    ERIC Educational Resources Information Center

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in a MCC-learning system as service. Components hosted by MCC are used to empower developers to create…

  5. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

    The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in the front-end process in their fabs; in fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used. But in the coming years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing, but the integration of various equipment in a production plant demands unified handling of data flow and interfaces. Only agile vision systems can reconcile these contradictions: fast, reliable, adaptable, scalable, and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computation-intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection, performed in 3D, including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment, and control tasks in IC package production and handling equipment, such as Trim&Form, Tape&Reel, and Pick&Place machines.

  6. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/+/-45/90) family of laminates using filled hole and unnotched test data.
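    The bearing-bypass terminology above refers to the standard nominal stresses at a loaded bolt hole. As a hedged sketch of the textbook definitions only (the function and numbers are illustrative and are not the paper's method; as the abstract notes, real composite allowables are mode-dependent and cannot be captured by a single continuous interaction function):

    ```python
    def joint_stresses(bearing_load_n, bypass_load_n,
                       hole_diameter_m, strip_width_m, thickness_m):
        """Nominal bearing and net-section bypass stresses (Pa) at a bolt hole.

        Textbook definitions: bearing stress is the bolt load over the
        projected hole area; bypass stress is the load passing the hole over
        the net section beside it.
        """
        bearing_stress = bearing_load_n / (hole_diameter_m * thickness_m)
        net_area = (strip_width_m - hole_diameter_m) * thickness_m
        bypass_stress = bypass_load_n / net_area
        return bearing_stress, bypass_stress

    # Illustrative numbers: 5 kN bearing and 3 kN bypass load on a 6 mm bolt
    # in a 30 mm wide, 4 mm thick laminate strip.
    br, byp = joint_stresses(5000.0, 3000.0, 0.006, 0.030, 0.004)
    ```

    The ratio of these two stresses is the "bearing-bypass ratio" whose variation drives the change of failure mode reported in the abstract.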

  7. Photovoltaic-system costing-methodology development. Final report

    SciTech Connect

    Not Available

    1982-07-01

    Presented are the results of a study to expand the use of standardized costing methodologies in the National Photovoltaics Program. The costing standards, which include SAMIS for manufacturing costs and M and D for marketing and distribution costs, have been applied to concentrator collectors and power-conditioning units. The M and D model was also computerized. Finally, a uniform construction cost-accounting structure was developed for use in photovoltaic test and application projects. The appendices contain example cases which demonstrate the use of the models.

  8. [Methodological approaches in the development of clinical guidelines].

    PubMed

    Albrecht, K

    2017-03-01

    Practical guidelines assist the clinical decision-making process in modern medicine. In rheumatology the number of practical guidelines dealing with diagnostics and therapy of rheumatic diseases is also constantly increasing. Methodological standards for guidelines ensure adequate development under consideration of precisely defined structures. Expert recommendations for action (S1) are distinguished from consensus (S2k) or evidence-based (S2e) as well as consensus and evidence-based (S3) guidelines. Levels of evidence categorize available studies by study design. Parameters for the evaluation of guidelines are summarized in the German instrument for the assessment of guidelines (DELBI).

  9. What Does an Agile Coach Do?

    NASA Astrophysics Data System (ADS)

    Davies, Rachel; Pullicino, James

    The surge in Agile adoption has created a demand for project managers who coach their teams rather than direct them. A sign of this trend is the ever-increasing number of people getting certified as scrum masters and agile leaders. Training courses that introduce agile practices are easy to find, but making the transition to coach is not as simple as understanding what agile practices are. Your challenge as an Agile Coach is to support your team in learning how to wield their new Agile tools in creating great software.

  10. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology

    PubMed Central

    Comella, Cynthia L.; Fox, Susan H.; Bhatia, Kailash P.; Perlmutter, Joel S.; Jinnah, Hyder A.; Zurowski, Mateusz; McDonald, William M.; Marsh, Laura; Rosen, Ami R.; Waliczek, Tracy; Wright, Laura J.; Galpern, Wendy R.; Stebbins, Glenn T.

    2016-01-01

    We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating Scale, or CCDRS. The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no scale currently evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS-2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of reliability and validity of the CCDRS are described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies. PMID:27088112

  12. Aggregate Building Simulator (ABS) Methodology Development, Application, and User Manual

    SciTech Connect

    Dirks, James A.; Gorrissen, Willy J.

    2011-11-30

    As the relationship between the national building stock and various global energy issues becomes a greater concern, it has been deemed necessary to develop a system of predicting the energy consumption of large groups of buildings. Ideally this system is to take advantage of the most advanced energy simulation software available, be able to execute runs quickly, and provide concise and useful results at a level of detail that meets the users needs without inundating them with data. The resulting methodology that was developed allows the user to quickly develop and execute energy simulations of many buildings simultaneously, taking advantage of parallel processing to greatly reduce total simulation times. The result of these simulations can then be rapidly condensed and presented in a useful and intuitive manner.
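    The fan-out/condense pattern the abstract describes can be sketched as follows; the simulation function and building parameters are hypothetical stand-ins, not the actual ABS or simulation-engine interface:

```python
from multiprocessing import Pool

def simulate_building(params):
    # Hypothetical stand-in for one building-energy simulation run:
    # annual energy use approximated as floor area * energy intensity.
    floor_area_m2, kwh_per_m2 = params
    return floor_area_m2 * kwh_per_m2

def simulate_stock(buildings, workers=4):
    # Execute many building simulations in parallel, then condense
    # the per-building results into a compact summary.
    with Pool(workers) as pool:
        results = pool.map(simulate_building, buildings)
    return {"total_kwh": sum(results), "mean_kwh": sum(results) / len(results)}

if __name__ == "__main__":
    stock = [(1000.0, 120.0), (2500.0, 95.0), (800.0, 150.0)]
    print(simulate_stock(stock))
```

A real aggregate run would replace `simulate_building` with a call that launches a whole-building simulation and parses its output, keeping the same parallel orchestration.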

  13. Development of Security Software: A High Assurance Methodology

    NASA Astrophysics Data System (ADS)

    Hardin, David; Hiratzka, T. Douglas; Johnson, D. Randolph; Wagner, Lucas; Whalen, Michael

    This paper reports on a project to exercise, evaluate and enhance a methodology for developing high assurance software for an embedded system controller. In this approach, researchers at the National Security Agency capture system requirements precisely and unambiguously through functional specifications in Z. Rockwell Collins then implements these requirements using an integrated, model-based software development approach. The development effort is supported by a tool chain that provides automated code generation and support for formal verification. The specific system is a prototype high speed encryption system, although the controller could be adapted for use in a variety of critical systems in which very high assurance of correctness, reliability, and security or safety properties is essential.

  14. The SIMRAND methodology - Simulation of Research and Development Projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
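    The reduction/simulation/evaluation flow can be illustrated with a toy Monte Carlo comparison of candidate task networks; the path names and cost ranges below are invented, and SIMRAND's actual models and utility functions are richer than this sketch:

```python
import random

def simulate_path(tasks, trials=10_000, seed=1):
    # Simulation phase: Monte Carlo estimate of a candidate path's
    # expected cost, each task cost drawn uniformly in [low, high].
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += sum(rng.uniform(lo, hi) for lo, hi in tasks)
    return total / trials

def best_path(candidates):
    # Evaluation phase: pick the candidate with the lowest expected cost.
    return min(candidates, key=lambda name: simulate_path(candidates[name]))

candidates = {
    "path_A": [(1.0, 3.0), (2.0, 4.0)],   # (low, high) cost per task
    "path_B": [(0.5, 6.0), (1.0, 2.0)],
}
print(best_path(candidates))
```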

  15. Intelligent System Development Using a Rough Sets Methodology

    NASA Technical Reports Server (NTRS)

    Anderson, Gray T.; Shelton, Robert O.

    1997-01-01

    The purpose of this research was to examine the potential of the rough sets technique for developing intelligent models of complex systems from limited information. Rough sets are a simple but promising technology for extracting easily understood rules from data. The rough set methodology has been shown to perform well when used with a large set of exemplars, but its performance with sparse data sets is less certain. The difficulty is that rules will be developed based on just a few examples, each of which might have a large amount of noise associated with it. The question then becomes: what is the probability of a useful rule being developed from such limited information? One nice feature of rough sets is that in unusual situations, the technique can give an answer of 'I don't know'. That is, if a case arises that is different from the cases the rough set rules were developed on, the methodology can recognize this and alert human operators. It can also be trained to do this when the desired action is unknown because conflicting examples apply to the same set of inputs. This summer's project was to look at combining rough set theory with statistical theory to develop confidence limits in rules developed by rough sets. Often it is important not to make a certain type of mistake (e.g., false positives or false negatives), so the rules must be biased toward preventing a catastrophic error, rather than giving the most likely course of action. A method to determine the best course of action in the light of such constraints was examined. The resulting technique was tested with files containing electrical power line 'signatures' from the space shuttle and with decompression sickness data.
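    One way to attach statistical confidence to a rule induced from only a handful of exemplars is a binomial interval on its observed accuracy. The sketch below uses the standard Wilson score interval, which is our assumption for illustration rather than the specific method developed in the project:

```python
import math

def rule_confidence_interval(successes, trials, z=1.96):
    # Wilson score interval for the probability that a rough-set rule,
    # induced from only `trials` matching examples, is actually correct.
    p = successes / trials
    denom = 1 + z * z / trials
    center = (p + z * z / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials
                                   + z * z / (4 * trials * trials))
    return center - half, center + half

# A rule supported by 9 of 10 examples looks 90% accurate, but the
# interval shows how little 10 noisy exemplars actually guarantee.
lo, hi = rule_confidence_interval(9, 10)
print(f"{lo:.2f}-{hi:.2f}")
```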

  16. Methodology of citrate-based biomaterial development and application

    NASA Astrophysics Data System (ADS)

    Tran, M. Richard

    Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and modes of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development, which is able to address a broad spectrum of requirements, would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the material properties of the resulting material to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture.
To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to

  17. Development of a Composite Delamination Fatigue Life Prediction Methodology

    NASA Technical Reports Server (NTRS)

    O'Brien, Thomas K.

    2009-01-01

    Delamination is one of the most significant and unique failure modes in composite structures. Because of a lack of understanding of the consequences of delamination and the inability to predict delamination onset and growth, many composite parts are unnecessarily rejected upon inspection, both immediately after manufacture and while in service. NASA Langley is leading the efforts in the U.S. to develop a fatigue life prediction methodology for composite delamination using fracture mechanics. Research being performed to this end will be reviewed. Emphasis will be placed on the development of test standards for delamination characterization, incorporation of approaches for modeling delamination in commercial finite element codes, and efforts to mature the technology for use in design handbooks and certification documents.

  18. Agile Development of Advanced Prototypes

    DTIC Science & Technology

    2012-11-01

    sound experience that emphasizes the progression of cochlear implant technology. A guest observes and listens to a virtual environment. They are able to... transition their environment through history as well as the simulated fidelity of a contemporary cochlear implant. A visual experience that... patient with a cochlear implant was interviewed. Outcomes of this research guided the design of the first prototype. The technical design was

  19. Rapid development of xylanase assay conditions using Taguchi methodology.

    PubMed

    Prasad Uday, Uma Shankar; Bandyopadhyay, Tarun Kanti; Bhunia, Biswanath

    2016-11-01

    The present investigation is mainly concerned with the rapid development of extracellular xylanase assay conditions by using Taguchi methodology. The extracellular xylanase was produced from Aspergillus niger (KP874102.1), a new strain isolated from a soil sample of the Baramura forest, Tripura West, India. Four physical parameters including temperature, pH, buffer concentration and incubation time were considered as key factors for xylanase activity and were optimized using Taguchi robust design methodology for enhanced xylanase activity. The main effect, interaction effects and optimal levels of the process factors were determined using the signal-to-noise (S/N) ratio. The Taguchi method recommends the use of the S/N ratio to measure quality characteristics. Based on analysis of the S/N ratio, optimal levels of the process factors were determined. Analysis of variance (ANOVA) was performed to evaluate statistically significant process factors. ANOVA results showed that temperature contributed the maximum impact (62.58%) on xylanase activity, followed by pH (22.69%), buffer concentration (9.55%) and incubation time (5.16%). Predicted results showed that enhanced xylanase activity (81.47%) can be achieved with pH 2, temperature 50°C, buffer concentration 50 mM and incubation time 10 min.
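    The larger-the-better S/N ratio used in such Taguchi analyses has a simple closed form; the replicate activity values below are invented for illustration:

```python
import math

def sn_larger_is_better(values):
    # Taguchi larger-the-better signal-to-noise ratio:
    # S/N = -10 * log10( mean(1 / y^2) )
    return -10 * math.log10(sum(1 / (y * y) for y in values) / len(values))

# Hypothetical xylanase activities (%) from replicate runs
# at one combination of factor levels.
runs = [78.0, 81.0, 80.0]
print(round(sn_larger_is_better(runs), 2))
```

The factor setting with the highest S/N across the orthogonal-array runs is the one the method recommends.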

  20. Onshore and Offshore Outsourcing with Agility: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Kussmaul, Clifton

    This chapter reflects on a case study of an agile distributed project that ran for approximately three years (from spring 2003 to spring 2006). The project involved (a) a customer organization with key personnel distributed across the US, developing an application with rapidly changing requirements; (b) onshore consultants with expertise in project management, development processes, offshoring, and relevant technologies; and (c) an external offsite development team in a CMM-5 organization in southern India. This chapter is based on surveys and discussions with multiple participants. The several years since the project was completed allow greater perspective on both the strengths and weaknesses, since the participants can reflect on the entire life of the project and compare it to subsequent experiences. Our findings emphasize the potential for agile project management in distributed software development, and the importance of people and interactions, taking many small steps to find and correct errors, and matching the structures of the project and product to support implementation of agility.

  1. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT) or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  2. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.

  3. In Pursuit of Agile Acquisition: Are We There Yet?

    DTIC Science & Technology

    2013-03-01

    bureaucracy in the methodology, and avoid promoting activities that would further expand regulatory guidance and oversight to improve agility. Once an... through an integrated digital system called Blue Force Tracker. Instead of the lack of situational awareness, units now use streaming video to help... to trace forensics collected at other crime scenes or events and trace the data back to specific individuals thus identifying dangerous insurgents

  4. Safety-related operator actions: methodology for developing criteria

    SciTech Connect

    Kozinsky, E.J.; Gray, L.H.; Beare, A.N.; Barks, D.B.; Gomer, F.E.

    1984-03-01

    This report presents a methodology for developing criteria for design evaluation of safety-related actions by nuclear power plant reactor operators, and identifies a supporting data base. It is the eleventh and final NUREG/CR Report on the Safety-Related Operator Actions Program, conducted by Oak Ridge National Laboratory for the US Nuclear Regulatory Commission. The operator performance data were developed from training simulator experiments involving operator responses to simulated scenarios of plant disturbances; from field data on events with similar scenarios; and from task analytic data. A conceptual model to integrate the data was developed and a computer simulation of the model was run, using the SAINT modeling language. Proposed is a quantitative predictive model of operator performance, the Operator Personnel Performance Simulation (OPPS) Model, driven by task requirements, information presentation, and system dynamics. The model output, a probability distribution of predicted time to correctly complete safety-related operator actions, provides data for objective evaluation of quantitative design criteria.
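    The shape of such a model can be sketched as a Monte Carlo sum of per-task times; the task breakdown and triangular time ranges here are hypothetical stand-ins, not the actual OPPS/SAINT model:

```python
import random

def simulate_completion_times(task_times, trials=20_000, seed=7):
    # Total time to complete a safety-related action is modeled as the
    # sum of per-task times, each drawn from a triangular distribution
    # given as (min, mode, max) seconds.
    rng = random.Random(seed)
    return [sum(rng.triangular(lo, hi, mode) for lo, mode, hi in task_times)
            for _ in range(trials)]

def prob_within(samples, limit):
    # Fraction of simulated actions completed within the time limit,
    # i.e. the kind of distributional output OPPS produces.
    return sum(t <= limit for t in samples) / len(samples)

tasks = [(5, 8, 20), (10, 15, 40), (2, 3, 6)]  # diagnose, act, verify
times = simulate_completion_times(tasks)
print(round(prob_within(times, 60), 2))
```

A design criterion could then be stated as "the action completes within the limit with probability at least p" and evaluated directly against the simulated distribution.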

  5. Development and application of proton NMR methodology to lipoprotein analysis

    NASA Astrophysics Data System (ADS)

    Korhonen, Ari Juhani

    1998-11-01

    The present thesis describes the development of 1H NMR spectroscopy and its applications to lipoprotein analysis in vitro, utilizing biochemical prior knowledge and advanced lineshape fitting analysis in the frequency domain. A method for absolute quantification of lipoprotein lipids and proteins directly from the terminal methyl-CH3 resonance region of 1H NMR spectra of human blood plasma is described. Then the use of NMR methodology in time course studies of the oxidation process of LDL particles is presented. The function of the cholesteryl ester transfer protein (CETP) in lipoprotein mixtures was also assessed by 1H NMR, which allows for dynamic follow-up of the lipid transfer reactions between VLDL, LDL, and HDL particles. The results corroborated the suggestion that neutral lipid mass transfer among lipoproteins is not an equimolar heteroexchange. A novel method for studying lipoprotein particle fusion is also demonstrated. It is shown that the progression of proteolytically (α- chymotrypsin) induced fusion of LDL particles can be followed by 1H NMR spectroscopy and, moreover, that fusion can be distinguished from aggregation. In addition, NMR methodology was used to study the changes in HDL3 particles induced by phospholipid transfer protein (PLTP) in HDL3 + PLTP mixtures. The 1H NMR study revealed a gradual production of enlarged HDL particles, which demonstrated that PLTP-mediated remodeling of HDL involves fusion of the HDL particles. These applications demonstrated that the 1H NMR approach offers several advantages both in quantification and in time course studies of lipoprotein-lipoprotein interactions and of enzyme/lipid transfer protein function.

  6. Project-Method Fit: Exploring Factors That Influence Agile Method Use

    ERIC Educational Resources Information Center

    Young, Diana K.

    2013-01-01

    While the productivity and quality implications of agile software development methods (SDMs) have been demonstrated, research concerning the project contexts where their use is most appropriate has yielded less definitive results. Most experts agree that agile SDMs are not suited for all project contexts. Several project and team factors have been…

  7. Recent developments in methodology for the direct oxyamination of olefins.

    PubMed

    Donohoe, Timothy J; Callens, Cedric K A; Flores, Aida; Lacy, Adam R; Rathi, Akshat H

    2011-01-03

    1,2-Amino alcohols are high-value, versatile functional groups that are found in scores of biologically active molecules and other interesting synthetic targets such as ligands and auxiliaries. Given their prominent position within organic compounds of import, it is no surprise to note that many routes have been developed to access this motif and there are many different starting points from which a synthetic chemist might embark on a synthesis. However, one particular approach stands out from the others, and this is the direct conversion of an alkene to a vicinal amino alcohol derivative (oxyamination). Research in this field has been particularly active in recent years and many interesting new methodologies have been reported. The purpose of this review is to give the reader a tour of the methods that have emerged in the last few years so one can appreciate the myriad of different metals and reagents that can accomplish the oxyamination of alkenes. There are still many challenges to be overcome and, herein, we also outline the areas that are ripe for further development and which bode well for the future.

  8. Lean vs Agile in the Context of Complexity Management in Organizations

    ERIC Educational Resources Information Center

    Putnik, Goran D.; Putnik, Zlata

    2012-01-01

    Purpose: The objective of this paper is to provide a deeper insight into the relationship of the issue "lean vs agile" in order to inform managers towards more coherent decisions especially in a dynamic, unpredictable, uncertain, non-linear environment. Design/methodology/approach: The methodology is an exploratory study based on secondary data…

  9. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
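    As a sketch of the loss-function step, an inverted normal loss function maps a process deviation to a monetary loss that is zero at the target and saturates at a maximum credible loss; the targets, spreads, and dollar figures below are invented for illustration:

```python
import math

def inverted_normal_loss(x, target, scale, max_loss):
    # Inverted normal loss function: zero loss at the process target,
    # rising smoothly toward max_loss as the deviation grows.
    return max_loss * (1 - math.exp(-((x - target) ** 2) / (2 * scale ** 2)))

def total_loss(deviations):
    # Integration step: sum the loss contribution of each deviation
    # that appears in the accident scenario.
    return sum(inverted_normal_loss(x, t, s, c) for x, t, s, c in deviations)

# Hypothetical scenario: one pressure and one temperature deviation,
# each with its own target, spread, and maximum credible loss ($).
scenario = [(105.0, 100.0, 5.0, 1e6),   # pressure
            (452.0, 450.0, 10.0, 5e5)]  # temperature
print(round(total_loss(scenario)))
```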

  10. SU-E-T-610: Comparison of Treatment Times Between the MLCi and Agility Multileaf Collimators

    SciTech Connect

    Ramsey, C; Bowling, J

    2014-06-01

    Purpose: The Agility is a new 160-leaf MLC developed by Elekta for use in their Infinity and Versa HD linacs. As compared to the MLCi, the Agility increased the maximum leaf speed from 2 cm/s to 3.5 cm/s, and the maximum primary collimator speed from 1.5 cm/s to 9.0 cm/s. The purpose of this study was to determine if the Agility MLC resulted in improved plan quality and/or shorter treatment times. Methods: An Elekta Infinity that was originally equipped with an 80-leaf MLCi was upgraded to a 160-leaf Agility. Treatment plan quality was evaluated using the Pinnacle planning system with SmartArc. Optimization was performed once for the MLCi and once for the Agility beam models using the same optimization parameters and the same number of iterations. Patient treatment times were measured for all IMRT, VMAT, and SBRT patients treated on the Infinity with the MLCi and Agility MLCs. Treatment times were extracted from the EMR and measured from when the patient first walked into the treatment room until exiting the treatment room. Results: 11,380 delivery times were measured for patients treated with the MLCi, and 1,827 measurements have been made for the Agility MLC. The average treatment times were 19.1 minutes for the MLCi and 20.8 minutes for the Agility. Using a t-test analysis, there was no difference between the two groups (t = 0.22). The dose differences between patients planned with the MLCi and the Agility MLC were minimal. For example, the dose differences for the PTV, GTV, and cord for a head and neck patient planned using Pinnacle were effectively equivalent. However, the dose to the parotid glands was slightly worse with the Agility MLC. Conclusion: There was no statistical difference in treatment time, or any significant dosimetric difference between the Agility MLC and the MLCi.
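    A Welch-style two-sample t statistic of the kind used for such a comparison can be computed directly; the five treatment times per group below are invented toy data, far smaller than the study's 11,380 and 1,827 measurements:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    # Welch's two-sample t statistic for groups with unequal
    # variances and sizes, as used to compare treatment times.
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

# Hypothetical per-patient treatment times (minutes)
mlci    = [19.0, 18.5, 20.0, 19.5, 18.0]
agility = [19.5, 18.0, 20.5, 19.0, 18.5]
print(round(welch_t(mlci, agility), 2))
```

A |t| this close to zero, like the study's t = 0.22, indicates no detectable difference between the two groups.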

  11. Development of Methodologies for IV and V of Neural Networks

    NASA Technical Reports Server (NTRS)

    Taylor, Brian; Darrah, Marjorie

    2003-01-01

    Non-deterministic systems often rely upon neural network (NN) technology to "learn" to manage flight systems under controlled conditions using carefully chosen training sets. How can these adaptive systems be certified to ensure that they will become increasingly efficient and behave appropriately in real-time situations? The bulk of Independent Verification and Validation (IV&V) research of non-deterministic software control systems such as Adaptive Flight Controllers (AFCs) addresses NNs in well-behaved and constrained environments such as simulations and strict process control. However, neither substantive research nor effective IV&V techniques have been found to address AFCs learning in real-time and adapting to live flight conditions. Adaptive flight control systems offer good extensibility into commercial aviation as well as military aviation and transportation. Consequently, this area of IV&V represents an area of growing interest and urgency. ISR proposes to further the current body of knowledge to meet two objectives: research the current IV&V methods and assess where these methods may be applied toward a methodology for the V&V of neural networks; and identify effective methods for IV&V of NNs that learn in real-time, including developing a prototype test bed for IV&V of AFCs. Currently, no practical method exists. ISR will meet these objectives through the tasks identified and described below. First, ISR will conduct a literature review of current IV&V technology. To do this, ISR will collect the existing body of research on IV&V of non-deterministic systems and neural networks. ISR will also develop the framework for disseminating this information through specialized training. This effort will focus on developing NASA's capability to conduct IV&V of neural network systems and to provide training to meet the increasing need for IV&V expertise in such systems.

  12. Towards a general object-oriented software development methodology

    NASA Technical Reports Server (NTRS)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    An object is an abstract software model of a problem domain entity. Objects are packages of both data and the operations on that data (Goldberg 83, Booch 83). The Ada (tm) package construct is representative of this general notion of an object. Object-oriented design is the technique of using objects as the basic unit of modularity in systems design. The Software Engineering Laboratory at the Goddard Space Flight Center is currently involved in a pilot program to develop a flight dynamics simulator in Ada (approximately 40,000 statements) using object-oriented methods. Several authors have applied object-oriented concepts to Ada (e.g., Booch 83, Cherry 85). It was found that these methodologies are limited. As a result, a more general approach was synthesized which allows a designer to apply powerful object-oriented principles to a wide range of applications and at all stages of design. An overview is provided of this approach. Further, how object-oriented design fits into the overall software life-cycle is considered.
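    The notion of an object as a package of data plus the operations on that data translates directly into most modern languages; a minimal sketch (the class name and methods are invented for illustration, standing in for an Ada package):

```python
class TelemetryBuffer:
    # An "object" in the sense described: the data (readings) is
    # packaged with the only operations allowed to touch it.
    def __init__(self):
        self._readings = []

    def record(self, value):
        self._readings.append(value)

    def mean(self):
        return sum(self._readings) / len(self._readings)

buf = TelemetryBuffer()
buf.record(3.0)
buf.record(5.0)
print(buf.mean())  # 4.0
```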

  13. Development of an Atomization Methodology for Spray Combustion

    NASA Technical Reports Server (NTRS)

    Seung, S. P.; Chen, C. P.; Chen, Y. S.

    1993-01-01

    In liquid rocket propulsion, the knowledge and understanding of liquid-gas interfacial phenomena are very important. This is important for predicting the onset of cavitation occurring in swirl injection elements used in the STME, as well as atomization processes in shear-induced (co-axial) injectors and impinging injector elements. Because all the physical processes, including droplet size distribution, droplet dispersion, mixing and combustion, are controlled by atomization processes, it is expected that the successful incorporation of the volume of fluid (VOF) method will greatly enhance the analytical capability of predicting spray combustion processes in liquid-fueled engines. In this paper, a methodology is developed to define and track interfaces between two fluids in non-orthogonal, body-fitted grids using a single fractional volume of fluid (VOF) variable to describe the distribution of the liquid phase in a gas-liquid flow field. This method was implemented in a mature CFD code MAST (Multiphase All-Speed Transient) utilizing the general PISO-C algorithm. As a preliminary study on the analysis of spray combustion and tracking of the interface between two phases, we report on the progress of simulating the instability of the liquid column: the surface wave instability and droplet breakup from the liquid surface.
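    The core VOF idea of transporting a per-cell liquid fraction can be sketched in one dimension with a first-order upwind update; the actual MAST/PISO-C implementation works on body-fitted grids with interface reconstruction, so this is only an illustrative toy:

```python
def advect_vof(f, velocity, dx, dt):
    # f[i] is the liquid volume fraction in cell i (the single VOF
    # variable); upwind fluxes move liquid with a positive velocity.
    flux = [velocity * dt / dx * f[i] for i in range(len(f))]
    new_f = f[:]
    for i in range(1, len(f)):
        new_f[i] += flux[i - 1] - flux[i]   # inflow from left, outflow right
    new_f[0] -= flux[0]                      # left boundary cell only loses
    return new_f

f = [1.0, 1.0, 0.5, 0.0, 0.0]   # interface lies between cells 2 and 3
print(advect_vof(f, velocity=1.0, dx=1.0, dt=0.2))
```

Note that the total liquid volume is conserved by construction, which is the property that makes VOF attractive for interface tracking.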

  14. Development of an aeroelastic methodology for surface morphing rotors

    NASA Astrophysics Data System (ADS)

    Cook, James R.

    Helicopter performance capabilities are limited by maximum lift characteristics and vibratory loading. In high speed forward flight, dynamic stall and transonic flow greatly increase the amplitude of vibratory loads. Experiments and computational simulations alike have indicated that a variety of active rotor control devices are capable of reducing vibratory loads. For example, periodic blade twist and flap excitation have been optimized to reduce vibratory loads in various rotors. Airfoil geometry can also be modified in order to increase lift coefficient, delay stall, or weaken transonic effects. To explore the potential benefits of active controls, computational methods are being developed for aeroelastic rotor evaluation, including coupling between computational fluid dynamics (CFD) and computational structural dynamics (CSD) solvers. In many contemporary CFD/CSD coupling methods it is assumed that the airfoil is rigid to reduce the interface by single dimension. Some methods retain the conventional one-dimensional beam model while prescribing an airfoil shape to simulate active chord deformation. However, to simulate the actual response of a compliant airfoil it is necessary to include deformations that originate not only from control devices (such as piezoelectric actuators), but also inertial forces, elastic stresses, and aerodynamic pressures. An accurate representation of the physics requires an interaction with a more complete representation of loads and geometry. A CFD/CSD coupling methodology capable of communicating three-dimensional structural deformations and a distribution of aerodynamic forces over the wetted blade surface has not yet been developed. In this research an interface is created within the Fully Unstructured Navier-Stokes (FUN3D) solver that communicates aerodynamic forces on the blade surface to University of Michigan's Nonlinear Active Beam Solver (UM/NLABS -- referred to as NLABS in this thesis). 
Interface routines are developed for

  15. Development of a Malicious Insider Composite Vulnerability Assessment Methodology

    DTIC Science & Technology

    2006-06-01

    Front-matter excerpt: Appendix A, NASA Physical Security Vulnerability Analysis Worksheet; Appendix B, NIST 800-30 Risk Assessment Methodology Flowchart; Appendix C, ITFDM...; Figure B.1, Risk Assessment Methodology Flowchart; Table 2.3, NASA Relative Value of Information Systems Assets; Table 2.4, NIST Magnitude of Impact Definition Table.

  16. Development of an in-situ soil structure characterization methodology

    NASA Astrophysics Data System (ADS)

    Debos, Endre; Kriston, Sandor

    2015-04-01

    Agricultural cultivation has several direct and indirect effects on soil properties, among which soil structure degradation is the best known and most detectable one. Soil structure degradation leads to several water and nutrient management problems, which reduce the efficiency of agricultural production. There are several innovative technological approaches aiming to reduce these negative impacts on the soil structure. The testing, validation and optimization of these methods require an adequate technology to measure the impacts on the complex soil system. This study aims to develop an in-situ soil structure and root development testing methodology, which can be used in field experiments and which allows one to follow the real-time changes in the soil structure - evolution / degradation and its quantitative characterization. The method is adapted from remote sensing image processing technology. A specifically transformed A/4 size scanner is placed into the soil at a safe depth that cannot be reached by the agrotechnical treatments. Only the scanner USB cable comes to the surface to allow image acquisition without any soil disturbance. Several images from the same place can be taken throughout the vegetation season to follow the soil consolidation and structure development after the last tillage treatment for the seedbed preparation. The scanned image of the soil profile is classified using supervised image classification, namely the maximum likelihood classification algorithm. The resulting image has two principal classes, soil matrix and pore space, and other complementary classes to cover the occurring thematic classes, such as roots and stones. The calculated data is calibrated with field-sampled porosity data. As the scanner is buried under the soil with no changes in light conditions, the image processing can be automated for better temporal comparison.
Besides the total porosity, each pore size fraction and its distribution can be calculated for
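
    The classification step described above, supervised maximum likelihood classification of the scanned soil image, can be sketched as follows. This is a generic Gaussian maximum-likelihood classifier, not the authors' implementation; the class names, two-band pixels, and covariance regularization term are illustrative assumptions.

    ```python
    import numpy as np

    def train_ml(classes):
        """Fit a Gaussian to each class's training pixels (rows = pixels, cols = bands)."""
        stats = {}
        for name, X in classes.items():
            mu = X.mean(axis=0)
            cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularize
            stats[name] = (mu, np.linalg.inv(cov), np.log(np.linalg.det(cov)))
        return stats

    def classify_ml(pixels, stats):
        """Assign each pixel to the class maximizing the Gaussian log-likelihood."""
        names = list(stats)
        scores = []
        for name in names:
            mu, icov, logdet = stats[name]
            d = pixels - mu
            scores.append(-0.5 * (logdet + np.einsum('ij,jk,ik->i', d, icov, d)))
        return [names[i] for i in np.argmax(np.vstack(scores), axis=0)]
    ```

    Porosity then follows from the pixel counts: the fraction of pixels labeled as pore space, calibrated against the field-sampled values.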

  17. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  18. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods are still lacking usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. Worldwide, 92 practitioners responded. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The most frequently used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  19. Barriers to Achieving Mentally Agile Junior Leaders

    DTIC Science & Technology

    2009-01-21

    To help answer this question, this paper will describe the operational environment the agile leader must be prepared to operate within and the...senior leadership identified their need over eight years ago? To help answer this question, this paper will describe the operational environment the agile...to the reader. BARRIERS TO ACHIEVING MENTALLY AGILE JUNIOR LEADERS Persistent conflict and change characterize the strategic environment . We have

  20. Towards a Comparative Measure of Legged Agility

    DTIC Science & Technology

    2014-06-01

    so for this paper we explore the implications of a well-cited definition within the sports science community holding that agility is “a rapid whole...systems will have negligible agility according to our metric in accor- dance with biological observations that these motions require significantly less...W. Young, “Agility literature review: Classifications, training and testing,” Journal of sports sciences, vol. 24, no. 9, pp. 919–932, 2006. 19. D. L

  1. Action Research: A Methodology for Change and Development

    ERIC Educational Resources Information Center

    Somekh, Bridget

    2005-01-01

    This book presents a fresh view of action research as a methodology uniquely suited to researching the processes of innovation and change. Drawing on twenty-five years' experience of leading or facilitating action research projects, Bridget Somekh argues that action research can be a powerful systematic intervention, which goes beyond describing,…

  2. [Methodology for the development of expert systems of viral epidemiology].

    PubMed

    Cristea, A L; Zaharia, C N

    1988-01-01

    The proposed methodology for building the knowledge base uses a tree of hierarchical entities and a simplified variant of natural language. The resolution system is based on an extension of the predicate calculus that explicitly contains entities of different natures, among them the correlations giving the rules of deduction.

  3. A Methodology to Develop Entrepreneurial Networks: The Tech Ecosystem of Six African Cities

    DTIC Science & Technology

    2014-11-01

    Technical Report 15-005 A Methodology to Develop Entrepreneurial Networks: The Tech Ecosystem of Six African Cities Daniel...NUMBER n/a A Methodology to Develop Entrepreneurial Networks: The Tech Ecosystem of Six African Cities 5b. GRANT NUMBER n/a 5c. PROGRAM...A Methodology to Develop Entrepreneurial Networks: The Tech Ecosystem of Six African Cities Daniel Evans Background Our project

  4. Development of a methodology for LES of Turbulent Cavitating Flows

    NASA Astrophysics Data System (ADS)

    Gnanaskandan, Aswin

    The objective of this dissertation is to develop a numerical methodology for large eddy simulation of multiphase cavitating flows on unstructured grids and apply it to study two cavitating flow problems. The multiphase medium is represented using a homogeneous mixture model that assumes thermal equilibrium between the liquid and vapor phases. We develop a predictor-corrector approach to solve the governing Navier-Stokes equations for the liquid/vapor mixture, together with the transport equation for the vapor mass fraction. While a non-dissipative and symmetric scheme is used in the predictor step, a novel characteristic-based filtering scheme with a second-order TVD filter is developed for the corrector step to handle shocks and material discontinuities in non-ideal gases and mixtures. Additionally, a sensor based on vapor volume fraction is proposed to localize dissipation to the vicinity of discontinuities. The scheme is first validated for one-dimensional canonical problems to verify its accuracy in predicting jump conditions across material discontinuities and shocks. It is then applied to two turbulent cavitating flow problems: over a hydrofoil and over a wedge. Our results show that the simulations are in good agreement with experimental data for the above tested cases, and that the scheme can be successfully applied to RANS, LES and DNS methodologies. We first study cavitation over a circular cylinder at two different Reynolds numbers (Re = 200 and 3900 based on cylinder diameter and free-stream velocity) and four different cavitation numbers (sigma = 2.0, 1.0, 0.7 and 0.5). Large Eddy Simulation (LES) is employed at the higher Reynolds number and Direct Numerical Simulations (DNS) at the lower Reynolds number. The unsteady characteristics of the flow are found to be altered significantly by cavitation. It is observed that the simulated cases fall into two different cavitation regimes: cyclic and transitional.
Cavitation is seen to significantly influence
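
    The sensor-localized filtering idea can be illustrated with a one-dimensional sketch. This is not the characteristic-based scheme of the dissertation, only a simplified stand-in: second-difference dissipation applied to a field q, switched on by a sensor on the vapor volume fraction alpha_v; the threshold and dissipation coefficient are assumed values.

    ```python
    import numpy as np

    def filtered_corrector(q, alpha_v, threshold=1e-3, eps=0.25):
        """Corrector-step sketch: add second-difference dissipation to q, but only
        where the vapor-volume-fraction sensor flags a material discontinuity."""
        lap = q[2:] - 2.0 * q[1:-1] + q[:-2]                     # discrete Laplacian
        sensor = (np.abs(alpha_v[2:] - alpha_v[:-2]) > threshold).astype(float)
        out = q.copy()
        out[1:-1] += eps * sensor * lap                          # localized smoothing
        return out
    ```

    Away from the interface the sensor is zero and the non-dissipative predictor solution passes through untouched, which is the point of localizing the dissipation.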

  5. On the Biomimetic Design of Agile-Robot Legs

    PubMed Central

    Garcia, Elena; Arevalo, Juan Carlos; Muñoz, Gustavo; Gonzalez-de-Santos, Pablo

    2011-01-01

    The development of functional legged robots has encountered its limits in human-made actuation technology. This paper describes research on the biomimetic design of legs for agile quadrupeds. A biomimetic leg concept that extracts key principles from horse legs which are responsible for the agile and powerful locomotion of these animals is presented. The proposed biomimetic leg model defines the effective leg length, leg kinematics, limb mass distribution, actuator power, and elastic energy recovery as determinants of agile locomotion, and values for these five key elements are given. The transfer of the extracted principles to technological instantiations is analyzed in detail, considering the availability of current materials, structures and actuators. A real leg prototype has been developed following the proposed biomimetic leg concept. The actuation system is based on the hybrid use of series elasticity and magneto-rheological dampers, which provides variable compliance for natural motion. From the experimental evaluation of this prototype, conclusions are presented on the current technological barriers to achieving functional legged robots capable of dynamic, agile locomotion. PMID:22247667

  6. On the biomimetic design of agile-robot legs.

    PubMed

    Garcia, Elena; Arevalo, Juan Carlos; Muñoz, Gustavo; Gonzalez-de-Santos, Pablo

    2011-01-01

    The development of functional legged robots has encountered its limits in human-made actuation technology. This paper describes research on the biomimetic design of legs for agile quadrupeds. A biomimetic leg concept that extracts key principles from horse legs which are responsible for the agile and powerful locomotion of these animals is presented. The proposed biomimetic leg model defines the effective leg length, leg kinematics, limb mass distribution, actuator power, and elastic energy recovery as determinants of agile locomotion, and values for these five key elements are given. The transfer of the extracted principles to technological instantiations is analyzed in detail, considering the availability of current materials, structures and actuators. A real leg prototype has been developed following the proposed biomimetic leg concept. The actuation system is based on the hybrid use of series elasticity and magneto-rheological dampers, which provides variable compliance for natural motion. From the experimental evaluation of this prototype, conclusions are presented on the current technological barriers to achieving functional legged robots capable of dynamic, agile locomotion.

  7. Development of a Design Methodology for Reconfigurable Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; McLean, C.

    2000-01-01

    A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.
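
    The inner-loop element, dynamic inversion, reduces to a one-line control law for scalar dynamics xdot = f(x) + g(x)*u. The sketch below is the textbook form with hypothetical f and g, not the paper's reduced-order implementation; the outer QFT loop would supply the pseudo-control command.

    ```python
    def dynamic_inversion(f, g, x, v_des):
        """For dynamics xdot = f(x) + g(x)*u, choose u so that xdot equals the
        commanded pseudo-control v_des supplied by the outer loop."""
        return (v_des - f(x)) / g(x)
    ```

    Substituting this u back into the dynamics gives xdot = v_des exactly, which is why model error or a failed actuator (a changed g) motivates the paper's adaptive identification of the "effective vehicle".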

  8. Agility: Agent - Ility Architecture

    DTIC Science & Technology

    2002-10-01

    Figure 2: Overview of eGents. Specific scientific and engineering subgoals were: • develop a lightweight agent system that uses email-based ...applets makes them hard to operate over corporate firewalls. The eGents e-mail based ACL bus imposes fewer requirements on agents that use it, and firewalls...do not pose a problem for an e-mail based ACL bus. While applets limit JATLite's range of applications, they also make JATLite easy to deploy

  9. A systematic review of the main factors that determine agility in sport using structural equation modeling.

    PubMed

    Hojka, Vladimir; Stastny, Petr; Rehak, Tomas; Gołas, Artur; Mostowik, Aleksandra; Zawart, Marek; Musálek, Martin

    2016-09-01

    While tests of basic motor abilities such as speed, maximum strength or endurance are well recognized, testing of complex motor functions such as agility remains unresolved in current literature. Therefore, the aim of this review was to evaluate which main factor or factor structures quantitatively determine agility. In methodological detail, this review focused on research that explained or described the relationships between latent variables in a factorial model of agility using approaches such as principal component analysis, factor analysis and structural equation modeling. Four research studies met the defined inclusion criteria. No quantitative empirical research was found that tried to verify the quality of the whole suggested model of the main factors determining agility through the use of a structural equation modeling (SEM) approach or a confirmatory factor analysis. From the whole structure of agility, only change of direction speed (CODS) and some of its subtests were appropriately analyzed. The combination of common CODS tests is reliable and useful to estimate performance in sub-elite athletes; however, for elite athletes, CODS tests must be specific to the needs of a particular sport discipline. Sprinting and jumping tests are stronger factors for CODS than explosive strength and maximum strength tests. The authors suggest the need to verify the agility factorial model by a second generation data analysis technique such as SEM.
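
    Of the latent-variable approaches the review surveys, principal component analysis is the simplest to sketch. The fragment below extracts the dominant common factor from a hypothetical athlete-by-test score matrix; it illustrates PCA only, not the SEM verification the authors call for, and the test battery is invented.

    ```python
    import numpy as np

    def main_factor(scores):
        """First principal component of a standardized athlete-by-test matrix:
        loadings of each test on the dominant common factor, and the share of
        total variance that factor explains."""
        Z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
        cov = np.cov(Z, rowvar=False)
        vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
        return vecs[:, -1], vals[-1] / vals.sum()
    ```

    A high explained-variance share would support a single general agility factor; the review's finding is that such a model has not yet been confirmed empirically.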

  10. A systematic review of the main factors that determine agility in sport using structural equation modeling

    PubMed Central

    Hojka, Vladimir; Rehak, Tomas; Gołas, Artur; Mostowik, Aleksandra; Zawart, Marek; Musálek, Martin

    2016-01-01

    Abstract While tests of basic motor abilities such as speed, maximum strength or endurance are well recognized, testing of complex motor functions such as agility remains unresolved in current literature. Therefore, the aim of this review was to evaluate which main factor or factor structures quantitatively determine agility. In methodological detail, this review focused on research that explained or described the relationships between latent variables in a factorial model of agility using approaches such as principal component analysis, factor analysis and structural equation modeling. Four research studies met the defined inclusion criteria. No quantitative empirical research was found that tried to verify the quality of the whole suggested model of the main factors determining agility through the use of a structural equation modeling (SEM) approach or a confirmatory factor analysis. From the whole structure of agility, only change of direction speed (CODS) and some of its subtests were appropriately analyzed. The combination of common CODS tests is reliable and useful to estimate performance in sub-elite athletes; however, for elite athletes, CODS tests must be specific to the needs of a particular sport discipline. Sprinting and jumping tests are stronger factors for CODS than explosive strength and maximum strength tests. The authors suggest the need to verify the agility factorial model by a second generation data analysis technique such as SEM. PMID:28149399

  11. Multiply-agile encryption in high speed communication networks

    SciTech Connect

    Pierson, L.G.; Witzke, E.L.

    1997-05-01

    Different applications have different security requirements for data privacy, data integrity, and authentication. Encryption is one technique that addresses these requirements. Encryption hardware, designed for use in high-speed communications networks, can satisfy a wide variety of security requirements if that hardware is key-agile, robustness-agile and algorithm-agile. Hence, multiply-agile encryption provides enhanced solutions to the secrecy, interoperability and quality of service issues in high-speed networks. This paper defines these three types of agile encryption. Next, implementation issues are discussed. While single-algorithm, key-agile encryptors exist, robustness-agile and algorithm-agile encryptors are still research topics.
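
    Algorithm- and key-agility amount to selecting the cipher and key per cell or packet rather than per device. The sketch below illustrates only the dispatch pattern: the algorithm registry, ID byte, and SHA-256 XOR keystream are toy assumptions, and the toy cipher is in no way suitable for real use.

    ```python
    import hashlib
    from itertools import count

    def _keystream(key, nonce):
        """Toy keystream: counter-mode SHA-256 blocks (illustrative only)."""
        for i in count():
            yield from hashlib.sha256(key + nonce + i.to_bytes(8, 'big')).digest()

    def xor_stream(key, nonce, data):
        ks = _keystream(key, nonce)
        return bytes(b ^ next(ks) for b in data)

    # algorithm-agility: the cipher is looked up per cell by its ID byte
    ALGORITHMS = {0x01: xor_stream}

    def encrypt_cell(alg_id, key, nonce, payload):
        """Key-agility comes from passing a per-connection key on each call."""
        return bytes([alg_id]) + ALGORITHMS[alg_id](key, nonce, payload)

    def decrypt_cell(key, nonce, cell):
        alg_id, body = cell[0], cell[1:]
        return ALGORITHMS[alg_id](key, nonce, body)  # XOR stream is its own inverse
    ```

    Robustness-agility would add a per-cell strength parameter (key length, rounds) alongside the algorithm ID.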

  12. Developing a Methodology for Measuring Stress Transients at Seismogenic Depth

    NASA Astrophysics Data System (ADS)

    Silver, P. G.; Niu, F.; Daley, T.; Majer, E.

    2005-05-01

    The dependence of crack properties on stress means that crustal seismic velocity exhibits stress dependence. This dependence constitutes, in principle, a powerful means of studying transient changes in stress at seismogenic depth through the repeat measurement of travel time from a controlled source. While the scientific potential of this stress dependence has been known for decades, time-dependent seismic imaging has yet to become a reliable means of measuring subsurface stress changes in fault-zone environments. This is due to 1) insufficient delay-time precision necessary to detect small changes in stress, and 2) the difficulty in establishing a reliable in-situ calibration between stress and seismic velocity. These two problems are coupled because the best sources of calibration, solid-earth tides and barometric pressure, produce weak stress perturbations of order 10^2 to 10^3 Pa that require precision in the measurement of the fractional velocity change dlnv of order 10^-6, based on laboratory experiments. We have thus focused on developing a methodology that is capable of providing this high level of precision. For example, we have shown that precision in dlnv is maximized when there are Q/pi wavelengths in the source-receiver path. This relationship provides a means of selecting an optimal geometry and/or source characteristic frequency in the planning of experiments. We have initiated a series of experiments to demonstrate the detectability of these stress-calibration signals in progressively more tectonically relevant settings. Initial tests have been completed on the smallest scale, with two boreholes 17 m deep and 3 meters apart. We have used a piezoelectric source (0.1 ms source pulse repeated every 100 ms) and a string of 24 hydrophones to record P waves with a dominant frequency of 10 kHz. Recording was conducted for 160 hours. The massive stacking of ~36,000 high-SNR traces/hr leads to delay-time precision of 6 ns (hour sampling) corresponding to dlnv
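
    The Q/pi design rule and the resulting delay sensitivity are easy to put into numbers. The sketch below uses assumed values (Q = 50, v = 2000 m/s, f = 10 kHz); it is consistent in order of magnitude with the meter-scale borehole separation and nanosecond-scale delays described above.

    ```python
    import math

    def optimal_path_length(Q, wavelength):
        """Source-receiver distance that puts Q/pi wavelengths along the path,
        where delay-time sensitivity to velocity change is maximized."""
        return (Q / math.pi) * wavelength

    def travel_time_change(path_length, velocity, dlnv):
        """dt ~ -(L/v) * dln(v) for a small fractional velocity change."""
        return -(path_length / velocity) * dlnv
    ```

    With these assumed values the wavelength is v/f = 0.2 m, the optimal path is about 3.2 m, and a dlnv of 10^-6 shifts the travel time by under 2 ns, which is why nanosecond delay precision is the target.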

  13. Network configuration management : paving the way to network agility.

    SciTech Connect

    Maestas, Joseph H.

    2007-08-01

    Sandia networks consist of nearly nine hundred routers and switches and nearly one million lines of command code, and each line ideally contributes to the capabilities of the network to convey information from one location to another. Sandia's Cyber Infrastructure Development and Deployment organizations recognize that it is therefore essential to standardize network configurations and enforce conformance to industry best business practices and documented internal configuration standards to provide a network that is agile, adaptable, and highly available. This is especially important in times of constrained budgets as members of the workforce are called upon to improve efficiency, effectiveness, and customer focus. Best business practices recommend using the standardized configurations in the enforcement process so that when root cause analysis results in recommended configuration changes, subsequent configuration auditing will improve compliance to the standard. Ultimately, this minimizes mean time to repair, maintains the network security posture, improves network availability, and enables efficient transition to new technologies. Network standardization brings improved network agility, which in turn enables enterprise agility, because the network touches all facets of corporate business. Improved network agility improves the business enterprise as a whole.
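
    Auditing a configuration against a documented standard reduces, in its simplest form, to a set comparison per device. The sketch below is a minimal illustration with assumed IOS-style syntax ('!' marks comments); real enforcement tools add context-aware parsing and per-platform templates.

    ```python
    def audit_config(device_lines, standard_lines):
        """Diff a device's running config against the standard template: report
        required lines that are missing and nonstandard extras."""
        def clean(lines):
            return {l.strip() for l in lines
                    if l.strip() and not l.lstrip().startswith('!')}
        dev, std = clean(device_lines), clean(standard_lines)
        return {'missing': sorted(std - dev), 'extra': sorted(dev - std)}
    ```

    Run across hundreds of routers and switches, a report like this is what turns root-cause recommendations into auditable, fleet-wide compliance.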

  14. Wavelength-Agile External-Cavity Diode Laser for DWDM

    NASA Technical Reports Server (NTRS)

    Pilgrim, Jeffrey S.; Bomse, David S.

    2006-01-01

    A prototype external-cavity diode laser (ECDL) has been developed for communication systems utilizing dense wavelength- division multiplexing (DWDM). This ECDL is an updated version of the ECDL reported in Wavelength-Agile External- Cavity Diode Laser (LEW-17090), NASA Tech Briefs, Vol. 25, No. 11 (November 2001), page 14a. To recapitulate: The wavelength-agile ECDL combines the stability of an external-cavity laser with the wavelength agility of a diode laser. Wavelength is modulated by modulating the injection current of the diode-laser gain element. The external cavity is a Littman-Metcalf resonator, in which the zeroth-order output from a diffraction grating is used as the laser output and the first-order-diffracted light is retro-reflected by a cavity feedback mirror, which establishes one end of the resonator. The other end of the resonator is the output surface of a Fabry-Perot resonator that constitutes the diode-laser gain element. Wavelength is selected by choosing the angle of the diffracted return beam, as determined by position of the feedback mirror. The present wavelength-agile ECDL is distinguished by design details that enable coverage of all 60 channels, separated by 100-GHz frequency intervals, that are specified in DWDM standards.
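
    The 60 channels at 100-GHz spacing follow from the ITU-T DWDM grid, f_n = 193.1 THz + n x 100 GHz with wavelength c/f_n. A minimal sketch (the 193.1-THz anchor and spacing are the standard grid values; the channel indexing convention here is an assumption):

    ```python
    C_M_PER_S = 299_792_458.0  # speed of light in vacuum

    def itu_channel(n):
        """Frequency (THz) and vacuum wavelength (nm) of 100-GHz-grid channel n,
        counted from the 193.1-THz anchor."""
        f_thz = 193.1 + 0.1 * n
        return f_thz, C_M_PER_S / (f_thz * 1e12) * 1e9
    ```

    Tuning the ECDL to a channel thus means positioning the feedback mirror for the wavelength this grid prescribes, around 1552.5 nm at the anchor frequency.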

  15. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. 
Furthermore, readability
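
    Rake's task model, which Pwrake parallelizes, is easy to sketch in miniature. The Python fragment below is an analogy, not Pwrake's Ruby implementation: a task table maps names to (dependencies, action), and dependencies run once each before their dependents, mirroring the workflow-definition phase the authors separate out.

    ```python
    def run(tasks, target, done=None):
        """Minimal Rake-style runner: tasks maps name -> (deps, action).
        Each dependency runs exactly once, before anything that needs it."""
        done = set() if done is None else done
        if target in done:
            return done
        deps, action = tasks[target]
        for d in deps:
            run(tasks, d, done)
        action()
        done.add(target)
        return done
    ```

    Pwrake's contribution is to dispatch independent tasks in such a graph in parallel across compute resources, which is what makes iterating on the parameter-adjustment phase fast.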

  16. AGILE integration into APC for high mix logic fab

    NASA Astrophysics Data System (ADS)

    Gatefait, M.; Lam, A.; Le Gratiet, B.; Mikolajczak, M.; Morin, V.; Chojnowski, N.; Kocsis, Z.; Smith, I.; Decaunes, J.; Ostrovsky, A.; Monget, C.

    2015-09-01

    mix logic Fab) in terms of product and technology portfolio. AGILE corrects for up to 120 nm of product topography error on process layers with less than 50 nm depth of focus. Based on tool functionalities delivered by ASML and on high-volume manufacturing requirements, AGILE integration is a real challenge. Regarding ST requirements, the "Automatic AGILE" functionality developed by ASML was not a turnkey solution, and a dedicated functionality was needed. A "ST homemade AGILE integration" has been fully developed and implemented within ASML and ST constraints. This paper describes this integration in our Advanced Process Control platform (APC).

  17. Methodology development for evaluation of selective-fidelity rotorcraft simulation

    NASA Technical Reports Server (NTRS)

    Lewis, William D.; Schrage, D. P.; Prasad, J. V. R.; Wolfe, Daniel

    1992-01-01

    This paper addresses the initial step toward the goal of establishing performance and handling qualities acceptance criteria for real-time rotorcraft simulators through a planned research effort to quantify the system capabilities of 'selective fidelity' simulators. Within this framework the simulator is then classified based on the required task. The simulator is evaluated by separating the various subsystems (visual, motion, etc.) and applying corresponding fidelity constants based on the specific task. This methodology not only provides an assessment technique, but also provides a technique to determine the required levels of subsystem fidelity for a specific task.

  18. The Introduction of Agility into Albania.

    ERIC Educational Resources Information Center

    Smith-Stevens, Eileen J.; Shkurti, Drita

    1998-01-01

    Describes a plan to introduce and achieve a national awareness of agility (and easy entry into the world market) for Albania through the relatively stable higher-education order. Agility's four strategic principles are enriching the customer, cooperating to enhance competitiveness, organizing to master change and uncertainty, and leveraging the…

  19. Agile Data Management with the Global Change Information System

    NASA Astrophysics Data System (ADS)

    Duggan, B.; Aulenbach, S.; Tilmes, C.; Goldstein, J.

    2013-12-01

    We describe experiences applying agile software development techniques to the realm of data management during the development of the Global Change Information System (GCIS), a web service and API for authoritative global change information under development by the US Global Change Research Program. Some of the challenges during system design and implementation have been : (1) balancing the need for a rigorous mechanism for ensuring information quality with the realities of large data sets whose contents are often in flux, (2) utilizing existing data to inform decisions about the scope and nature of new data, and (3) continuously incorporating new knowledge and concepts into a relational data model. The workflow for managing the content of the system has much in common with the development of the system itself. We examine various aspects of agile software development and discuss whether or how we have been able to use them for data curation as well as software development.

  20. Agile manufacturing from a statistical perspective

    SciTech Connect

    Easterling, R.G.

    1995-10-01

    The objective of agile manufacturing is to provide the ability to quickly realize high-quality, highly-customized, in-demand products at a cost commensurate with mass production. More broadly, agility in manufacturing, or any other endeavor, is defined as change-proficiency; the ability to thrive in an environment of unpredictable change. This report discusses the general direction of the agile manufacturing initiative, including research programs at the National Institute of Standards and Technology (NIST), the Department of Energy, and other government agencies, but focuses on agile manufacturing from a statistical perspective. The role of statistics can be important because agile manufacturing requires the collection and communication of process characterization and capability information, much of which will be data-based. The statistical community should initiate collaborative work in this important area.

  1. Agile manufacturing prototyping system (AMPS)

    SciTech Connect

    Garcia, P.

    1998-05-09

    The Agile Manufacturing Prototyping System (AMPS) is being integrated at Sandia National Laboratories. AMPS consists of state of the industry flexible manufacturing hardware and software enhanced with Sandia advancements in sensor and model based control; automated programming, assembly and task planning; flexible fixturing; and automated reconfiguration technology. AMPS is focused on the agile production of complex electromechanical parts. It currently includes 7 robots (4 Adept One, 2 Adept 505, 1 Staubli RX90), conveyance equipment, and a collection of process equipment to form a flexible production line capable of assembling a wide range of electromechanical products. This system became operational in September 1995. Additional smart manufacturing processes will be integrated in the future. An automated spray cleaning workcell capable of handling alcohol and similar solvents was added in 1996 as well as parts cleaning and encapsulation equipment, automated deburring, and automated vision inspection stations. Plans for 1997 and out years include adding manufacturing processes for the rapid prototyping of electronic components such as soldering, paste dispensing and pick-and-place hardware.

  2. Agile: From Software to Mission System

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Shirley, Mark H.; Hobart, Sarah Groves

    2016-01-01

    The Resource Prospector (RP) is an in-situ resource utilization (ISRU) technology demonstration mission, designed to search for volatiles at the Lunar South Pole. This is NASA's first near real time tele-operated rover on the Moon. The primary objective is to search for volatiles at one of the Lunar Poles. The combination of short mission duration, a solar powered rover, and the requirement to explore shadowed regions makes for an operationally challenging mission. To maximize efficiency and flexibility in Mission System design and thus to improve the performance and reliability of the resulting Mission System, we are tailoring Agile principles that we have used effectively in ground data system software development and applying those principles to the design of elements of the mission operations system.

  3. Owning the Technical Baseline - a Key Enabler: Agility as the Counterweight to Uncertainty and Change

    DTIC Science & Technology

    2015-08-01

    future. This fundamentally means we must embrace adaptability as a basic precept for how we develop, procure and sustain our weapons systems to be...effective for the warfighter over their life cycles. The underlying metric for such agility and adaptability is speed. When we can develop and field...capabilities fast, we must do so. Furthermore, agility and adaptability can be enabled by designing systems with modularity, well-designed standards and

  4. Development of a methodology for classifying software errors

    NASA Technical Reports Server (NTRS)

    Gerhart, S. L.

    1976-01-01

    A mathematical formalization of the intuition behind classification of software errors is devised and then extended to a classification discipline: every classification scheme should have an easily discernible mathematical structure, and certain properties of the scheme should be decidable (although whether or not these properties hold is relative to the intended use of the scheme). Classification of errors then becomes an iterative process of generalization from actual errors to terms defining the errors, together with adjustment of definitions according to the classification discipline. Alternatively, whenever possible, small-scale models may be built to give more substance to the definitions. The classification discipline and the difficulties of definition are illustrated by examples of classification schemes from the literature and a new study of errors observed in published papers on programming methodologies.
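    The discipline sketched in this abstract - a classification scheme with decidable properties - can be illustrated with a minimal sketch. All class names, predicates, and sample errors below are invented for illustration, not taken from Gerhart's study: each error class is a predicate, and two decidable properties of the scheme (mutual exclusivity and exhaustiveness over a sample of observed errors) are checked mechanically.

    ```python
    # Hypothetical sketch: an error-classification scheme as a set of predicates,
    # with two decidable properties (disjointness, coverage) checked over a sample.

    SCHEME = {
        "off_by_one":   lambda e: e["kind"] == "boundary" and e["delta"] == 1,
        "wrong_branch": lambda e: e["kind"] == "control",
        "bad_init":     lambda e: e["kind"] == "data" and e["phase"] == "init",
    }

    def classify(error):
        """Return the names of all classes whose predicate matches the error."""
        return [name for name, pred in SCHEME.items() if pred(error)]

    def check_scheme(errors):
        """Decidable checks: every observed error falls in exactly one class."""
        exhaustive = all(len(classify(e)) >= 1 for e in errors)
        exclusive = all(len(classify(e)) <= 1 for e in errors)
        return exhaustive, exclusive

    sample = [
        {"kind": "boundary", "delta": 1},
        {"kind": "control"},
        {"kind": "data", "phase": "init"},
    ]
    ok = check_scheme(sample)  # → (True, True) for this sample
    ```

    Adjusting the predicates until both checks pass over the observed errors mirrors the iterative generalize-and-adjust process the abstract describes.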

  5. Are Agile and Lean Manufacturing Systems Employing Sustainability, Complexity and Organizational Learning?

    ERIC Educational Resources Information Center

    Flumerfelt, Shannon; Siriban-Manalang, Anna Bella; Kahlen, Franz-Josef

    2012-01-01

    Purpose: This paper aims to peruse theories and practices of agile and lean manufacturing systems to determine whether they employ sustainability, complexity and organizational learning. Design/methodology/approach: The critical review of the comparative operational similarities and difference of the two systems was conducted while the new views…

  6. Preparing your Offshore Organization for Agility: Experiences in India

    NASA Astrophysics Data System (ADS)

    Srinivasan, Jayakanth

    Two strategies that have significantly changed the way we conventionally think about managing software development and sustainment are the family of development approaches collectively referred to as agile methods, and the distribution of development efforts on a global scale. When you combine the two strategies, organizations have to address not only the technical challenges that arise from introducing new ways of working, but more importantly have to manage the 'soft' factors that, if ignored, lead to hard challenges. Using two case studies of distributed agile software development in India, we illustrate the areas that organizations need to be aware of when transitioning work to India. The key issues that we emphasize are the need to recruit and retain personnel; the importance of teaching, mentoring and coaching; the need to manage customer expectations; the criticality of a well-articulated senior leadership vision and commitment; and the reality of operating in a heterogeneous process environment.

  7. Developing an Item Bank for Use in Testing in Africa: Theory and Methodology

    ERIC Educational Resources Information Center

    Furtuna, Daniela

    2014-01-01

    The author describes the steps taken by a research team, of which she was part, to develop a specific methodology for assessing student attainment in primary school, working with the Programme for the Analysis of Education Systems (PASEC) of the Conference of Ministers of Education of French-speaking Countries (CONFEMEN). This methodology provides…

  8. Development of a standard methodology for assessing the satiating effect of foods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    No standard methodology is currently utilized for assessing the relative satiating value of food items. Our goal was to evaluate the validity and reliability of satiety responses in order to develop a standardized methodology for determining the relative satiating capacity of specific food items. A ...

  9. I'll Txt U if I Have a Problem: How the Société Canadienne du Cancer in Quebec Applied Behavior-Change Theory, Data Mining and Agile Software Development to Help Young Adults Quit Smoking

    PubMed Central

    van Mierlo, Trevor; Fournier, Rachel; Jean-Charles, Anathalie; Hovington, Jacinthe; Ethier, Isabelle; Selby, Peter

    2014-01-01

    Introduction For many organizations, limited budgets and phased funding restrict the development of digital health tools. This problem is often exacerbated by the ever-increasing sophistication of technology and costs related to programming and maintenance. Traditional development methods tend to be costly, inflexible, and not client-centered. The purpose of this study is to analyze the use of Agile software development and outcomes of a three-phase mHealth program designed to help young adult Quebecers quit smoking. Methods In Phase I, literature reviews, focus groups, interviews, and behavior change theory were used in the adaptation and re-launch of an existing evidence-based mHealth platform. Based on analysis of user comments and utilization data from Phase I, the second phase expanded the service to allow participants to live text-chat with counselors. Phase II evaluation led to the third and current phase, in which algorithms were introduced to target pregnant smokers, substance users, students, full-time workers, those affected by mood disorders and chronic disease. Results Data collected throughout the three phases indicate that the incremental evolution of the intervention has led to increasing numbers of smokers being enrolled while making functional enhancements. In Phase I (240 days) 182 smokers registered with the service. 51% (n = 94) were male and 61.5% (n = 112) were between the ages of 18–24. In Phase II (300 days), 994 smokers registered with the service. 51% (n = 508) were male and 41% (n = 403) were between the ages of 18–24. At 174 days to date 873 smokers have registered in the third phase. 44% (n = 388) were male and 24% (n = 212) were between the ages of 18–24. Conclusions Emerging technologies in behavioral science show potential, but do not have defined best practices for application development. In phased-based projects with limited funding, Agile appears to be a viable approach to building and expanding…

  10. Toward Agile Control of a Flexible-Spine Model for Quadruped Bounding

    DTIC Science & Technology

    2015-01-01

    step reachable states. Finally, we propose new guidelines for quantifying “agility” for legged robots , providing a preliminary framework for...quantifying and improving performance of legged systems. 1. INTRODUCTION One goal in developing legged robot systems is to provide a high degree of agility...Intuitively, being agile means that future states (i.e., position and velocity variables defining snapshots of the dynamic robot as it moves) are not

  11. Agile-Lean Software Engineering (ALSE) Evaluating Kanban in Systems Engineering

    DTIC Science & Technology

    2013-03-06

    Agile-Lean Software Engineering (ALSE) Evaluating Kanban in Systems Engineering A013 - Final Technical Report SERC-2013-TR-022-2 March 6, 2013...06 MAR 2013 2. REPORT TYPE 3. DATES COVERED 00-00-2013 to 00-00-2013 4. TITLE AND SUBTITLE Agile-Lean Software Engineering (ALSE) Evaluating...engineering (SE). Such approaches have been seen to be valuable in software system development. In particular, the research focuses on SE where rapid response

  12. Development of methodology for horizontal axis wind turbine dynamic analysis

    NASA Technical Reports Server (NTRS)

    Dugundji, J.

    1982-01-01

    Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbine; (4) experiments for yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelastics; and (7) development of simple models for stability and response of wind turbines on flexible towers.

  13. Methodology for security development of an electronic prescription system.

    PubMed

    Niinimäki, J; Savolainen, M; Forsström, J J

    1998-01-01

    Data security is an essential requirement in all health care applications. Developers of medical information systems should utilize the existing security development and evaluation methods to foresee as many of the technical and human factors that may endanger data security as possible and apply appropriate precautions. Modern smart card technology facilitates the building of robust security framework for interorganizational shared care systems. In this article, we describe the way we utilized the existing security evaluation criteria in developing the security concept of our electronic prescription system.

  14. Agile Leaders, Agile Institutions: Educating Adaptive and Innovative Leaders for Today and Tomorrow

    DTIC Science & Technology

    2005-03-18

    to organizational learning , specifically for militaries at war. With these lenses and informed by observations from the CCCs, the paper advances...rapid, effective organizational learning is the essence of organizational agility. In line with this paper’s concept of individual agility...organizational agility is a metaphor for organizational learning that is faster, more flexible, and more sensitive to the speed with which individual experiential

  15. Development of Fuzzy Logic and Soft Computing Methodologies

    NASA Technical Reports Server (NTRS)

    Zadeh, L. A.; Yager, R.

    1999-01-01

    Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory - especially to probability theory, decision analysis, and control - and enhance the ability of the theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept - the concept of a perception - a concept which plays a central role in human cognition. The ability to reason with perceptions - perceptions of time, distance, force, direction, shape, intent, likelihood, truth and other attributes of physical and mental objects - underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and, ultimately, the brain to resolve detail and store information. More concretely, perceptions are both fuzzy and granular, or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression. In large measure, scientific progress has been, and continues to be…
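    The f-granularity described in this abstract - classes with unsharp boundaries - can be illustrated with a minimal sketch. The granule ("short distance", in metres) and its breakpoints are invented for illustration: a trapezoidal membership function gives degrees of membership between 0 and 1 rather than a crisp boundary.

    ```python
    def trapezoid(x, a, b, c, d):
        """Trapezoidal fuzzy membership: 0 outside (a, d), 1 on [b, c], linear in between."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        if x < b:
            return (x - a) / (b - a)   # rising edge
        return (d - x) / (d - c)       # falling edge

    # A hypothetical granule for the perception "short distance" (metres):
    # fully short up to 2 m, definitely not short beyond 5 m, fuzzy in between.
    def short(metres):
        return trapezoid(metres, -1.0, 0.0, 2.0, 5.0)
    ```

    A distance of 3.5 m is then "short" to degree 0.5 - the unsharp class boundary the abstract calls f-granular.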

  16. Agile Data Curation at a State Geological Survey

    NASA Astrophysics Data System (ADS)

    Hills, D. J.

    2015-12-01

    State agencies, including geological surveys, are often the gatekeepers for myriad data products essential for scientific research and economic development. For example, the Geological Survey of Alabama (GSA) is mandated to explore for, characterize, and report Alabama's mineral, energy, water, and biological resources in support of economic development, conservation, management, and public policy for the betterment of Alabama's citizens, communities, and businesses. As part of that mandate, the GSA has increasingly been called upon to make our data more accessible to stakeholders. Even as demand for greater data accessibility grows, budgets for such efforts are often small, meaning that agencies must do more for less. Agile software development has yielded efficient, effective products, most often at lower cost and in shorter time. Taking guidance from the agile software development model, the GSA is working towards more agile data management and curation. To date, the GSA's work has been focused primarily on data rescue. By using workflows that maximize clear communication while encouraging simplicity (e.g., maximizing the amount of work not done or that can be automated), the GSA is bringing decades of dark data into the light. Regular checks by the data rescuer with the data provider (or their proxy) provides quality control without adding an overt burden on either party. Moving forward, these workflows will also allow for more efficient and effective data management.

  17. Methodology Development for Measurement of Agent Fate in an Environmental Wind Tunnel

    DTIC Science & Technology

    2005-10-01

    METHODOLOGY DEVELOPMENT FOR MEASUREMENT OF AGENT FATE IN AN ENVIRONMENTAL WIND TUNNEL Wendel Shuely, Robert Nickol, GEO-Centers, and...managerial support by Dr. H. Durst, Mr. L. Bickford and Dr. J. Savage, ECBC, and Mr. Tim Bauer, NSWC.

  18. Integrating Agile Combat Support within Title 10 Wargames

    DTIC Science & Technology

    2015-03-26

    incorporate logistics into Air Force Title 10 wargames. More specifically, we capture Air Force Materiel Command’s (AFMC) Agile Combat Support (ACS...within an unclassified general wargame scenario. Logistics has been omitted from wargames for a multitude of reasons throughout the years. We...develop a logistics simulation model of a simplified wargame scenario designed to be run within the Logistics Composite Model (LCOM) Analysis Toolkit

  19. ROADM architectures and technologies for agile optical networks

    NASA Astrophysics Data System (ADS)

    Eldada, Louay A.

    2007-02-01

    We review the different optoelectronic component and module technologies that have been developed for use in ROADM subsystems, and describe their principles of operation, designs, features, advantages, and challenges. We also describe the various needs for reconfigurable optical add/drop switching in agile optical networks. For each network need, we present the different ROADM subsystem architecture options with their pros and cons, and describe the optoelectronic technologies supporting each architecture.

  20. External Events Analysis for LWRS/RISMC Project: Methodology Development and Early Demonstration

    SciTech Connect

    Parisi, Carlo; Prescott, Steven Ralph; Yorg, Richard Alan; Coleman, Justin Leigh; Szilard, Ronaldo Henriques

    2016-02-01

    The ultimate scope of Industrial Application #2 (IA) of the LWRS/RISMC project is a realistic simulation of natural external hazards that pose a threat to an NPP. This scope requires the development of a methodology and of a qualified set of tools able to perform advanced risk-informed safety analysis. In particular, the methodology should be able to combine results from seismic, flooding and thermal-hydraulic (TH) deterministic calculations with dynamic PRA. This summary presents the key points of the methodology being developed and a first sample application of it to a simple problem (spent fuel pool).
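    The coupling of a deterministic hazard calculation with probabilistic risk assessment can be sketched in miniature. This is a toy Monte Carlo illustration, not the RISMC toolchain: the lognormal flood-hazard distribution, the fragility median, and the beta value are all invented for the sketch.

    ```python
    import math
    import random

    random.seed(0)

    def fragility(demand, median=4.0, beta=0.3):
        """Lognormal fragility curve: P(component failure | flood-height demand).
        Parameters are invented placeholders, not plant data."""
        if demand <= 0:
            return 0.0
        z = (math.log(demand) - math.log(median)) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    def risk_estimate(n=100_000):
        """Crude Monte Carlo: sample a flood-height demand from an assumed
        hazard distribution and weight each sample by its failure probability."""
        total = 0.0
        for _ in range(n):
            demand = random.lognormvariate(math.log(2.0), 0.5)  # invented hazard
            total += fragility(demand)
        return total / n

    p_failure = risk_estimate()
    ```

    In the real methodology the "demand" samples would come from the seismic, flooding and TH deterministic codes, and the dynamic PRA would track sequences in time rather than a single fragility lookup.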

  1. Advances in Artificial Neural Networks - Methodological Development and Application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other ne...

  2. Agile manufacturing concepts and opportunities in ceramics

    SciTech Connect

    Booth, C.L.; Harmer, M.P.

    1995-08-01

    In 1991 Lehigh University facilitated seminars over a period of 8 months to define manufacturing needs for the 21st century. They concluded that the future will be characterized by rapid changes in technology advances, customer demands, and shifts in market dynamics, and coined the term "Agile Manufacturing". Agile manufacturing refers to the ability to thrive in an environment of constant unpredictable change. Market opportunities are attacked by partnering to form virtual firms to dynamically obtain the required skills for each product opportunity. This paper will describe and compare agile vs. traditional concepts of organization & structure, management policy and ethics, employee environment, product focus, information, and paradigm shift. Examples of agile manufacturing applied to ceramic materials will be presented.

  3. Software Development and Test Methodology for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we will describe the COTS tools that have been integrated into the processes and how they have provided value to the project.

  4. Development of Behavioral Toxicology Methodology for Interactive Exposure Regimens.

    DTIC Science & Technology

    1983-12-01

    degrees C. These animals appeared somewhat more active during early minutes of the exposure but by the end of the period activity appeared less than...dietary levels of maneb. Develop. Psychobiol. 5: 137-148, 1972. Sabotka, T.J., Brodie, R.E. and Cook, M.P.: Psychophysiologic effects of early lead exposure ...extrapolate to implications for behavioral changes under less extreme exposure conditions. In an early study of the effects of CO on various physiological

  5. Development of a Methodology for Assessing Daily Experiences

    DTIC Science & Technology

    1980-01-01

    distortion possibly in a way which would result in stronger correlations (Dohrenwend, Krasnoff, Askenasy & Dohrenwend, 1978). This issue is not...Daily Experience 6 ratings of events. On the other hand, the Psychiatric Epidemiological Research Interview developed by Dohrenwend, Krasnoff, Askenasy ...effects. New York: John Wiley & Sons, 1974. Dohrenwend, B.S., Krasnoff, L., Askenasy, A.R., & Dohrenwend, B.P. Exemplification of a method for scaling life

  6. Decision Science Challenges for C2 Agility

    DTIC Science & Technology

    2014-06-01

    Controlled and automatic human information processing: II. Perceptual learning , automatic attending and a general theory . Psychological Review, 84 (2...1 19 th ICCRTS “C2 Agility: Lessons Learned from Research and Operations” Decision Science Challenges for C2 Agility Topic 1 (First...systems. We have two vectors there. The first vector would be in things like man-machine interface. The second ... is in the whole area of cognition

  7. Agile Port and High Speed Ship Technologies

    DTIC Science & Technology

    2009-12-31

    Report PNW Agile Port System Demonstration Center for the Commercial Deployment of Transportation Technologies milestone agenda for accomplishing the... report summarizes the results of the remaining three projects in the FY05 program cycle, in particular the PNW Agile Port System Demonstration, a system...the accomplishment of each project and the program objectives. With the submission of this report the FY05 CCDoTT Program is complete. Bibliography

  8. Methodology Development for Assessment of Spaceport Technology Returns and Risks

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla; Zapata, Edgar

    2001-01-01

    As part of Kennedy Space Center's (KSC's) challenge to open the space frontier, new spaceport technologies must be developed, matured and successfully transitioned to operational systems. R&D investment decisions can be considered from multiple perspectives. Near-, mid- and far-term technology horizons must be understood. Because a multitude of technology investment opportunities are available, we must identify choices that promise the greatest likelihood of significant lifecycle returns. At the same time, the costs and risks of any choice must be well understood and balanced against its potential returns. The problem is not one of simply rank-ordering projects in terms of their desirability. KSC wants to determine a portfolio of projects that simultaneously satisfies multiple goals, such as getting the biggest bang for the buck, supporting projects that may be too risky for private funding, staying within annual budget cycles without foregoing the requirements of a long-term technology vision, and ensuring the development of a diversity of technologies that support the variety of operational functions involved in space transportation. This work aims to assist in the development of methods and techniques that support strategic technology investment decisions and ease the process of determining an optimal portfolio of spaceport R&D investments. Available literature on risks and returns to R&D is reviewed and the most useful pieces are brought to the attention of the Spaceport Technology Development Office (STDO). KSC's current project management procedures are reviewed. It is found that the "one size fits all" nature of KSC's existing procedures and project selection criteria is not conducive to prudent decision-making. Directions for improving KSC's procedures and criteria are outlined. With the help of a contractor, STDO is currently developing a tool, named Change Management Analysis Tool (CMAT)/Portfolio Analysis Tool (PAT), to assist KSC's R&D portfolio determination…
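    The portfolio-determination problem this abstract describes - selecting a set of projects under a budget rather than rank-ordering them - can be sketched as a 0/1 selection. This is a minimal illustration, not the CMAT/PAT tool: the project names, costs, and returns are invented, and the multiple goals the abstract lists are collapsed into a single expected-return score.

    ```python
    from itertools import combinations

    def best_portfolio(projects, budget):
        """Exhaustive 0/1 selection maximizing expected return under one budget.

        projects: list of (name, cost, expected_return) tuples.
        Exponential in len(projects); fine for a small illustrative set.
        """
        best, best_value = (), 0.0
        for r in range(len(projects) + 1):
            for combo in combinations(projects, r):
                cost = sum(p[1] for p in combo)
                value = sum(p[2] for p in combo)
                if cost <= budget and value > best_value:
                    best, best_value = combo, value
        return [p[0] for p in best], best_value

    # Invented example projects: (name, cost in $M, expected lifecycle return in $M).
    projects = [("cryo-handling", 3, 8), ("smart-umbilical", 2, 5),
                ("auto-inspection", 4, 9), ("range-safety", 1, 2)]
    picked, value = best_portfolio(projects, budget=6)
    ```

    Note how the optimum here is not the top-ranked projects by return: the costly "auto-inspection" project is dropped in favour of three cheaper ones that fit the budget together, which is exactly why rank-ordering alone is insufficient.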

  9. Development cooperation as methodology for teaching social responsibility to engineers

    NASA Astrophysics Data System (ADS)

    Lappalainen, Pia

    2011-12-01

    The role of engineering in promoting global well-being has become accentuated, turning the engineering curriculum into a means of distributing well-being more equally. The gradually strengthening calls for humanitarian engineering have resulted in the incorporation of social responsibility themes into the university curriculum. Cooperation, communication, teamwork, intercultural cooperation, sustainability, and social and global responsibility represent the socio-cultural dimensions that are becoming increasingly important as globalisation intensifies the demands for socially and globally adept engineering communities. This article describes an experiment, the Development Cooperation Project, which was conducted at Aalto University in Finland to integrate social responsibility themes into higher engineering education.

  10. SuperAGILE Services at ASDC

    SciTech Connect

    Preger, B.; Verrecchia, F.; Pittori, C.; Antonelli, L. A.; Giommi, P.; Lazzarotto, F.; Evangelista, Y.

    2008-05-22

    The Italian Space Agency Science Data Center (ASDC) is a facility with several responsibilities, including support to all the ASI scientific missions in data management and archiving, acting as the interface between ASI and the scientific community, and providing on-line access to the hosted data. In this poster we describe the services that ASDC provides for SuperAGILE, in particular the ASDC public web pages devoted to the dissemination of SuperAGILE scientific results. SuperAGILE is the X-ray imager onboard the AGILE mission, and provides the scientific community with orbit-by-orbit information on the observed sources. Crucial source information, including position and flux in chosen energy bands, will be reported on the SuperAGILE public web page at ASDC. Given their particular interest, another web page will be dedicated entirely to GRBs and other transients, where new event alerts will be notified and where users will find all available information on the GRBs detected by SuperAGILE.

  11. Methodology development to support NPR strategic planning. Final report

    SciTech Connect

    1996-04-01

    This report covers the work performed in support of the Office of New Production Reactors during the 9-month period from January through September 1990. Because of the rapid pace of program activities during this period, the emphasis of the work shifted from strategic planning toward supporting initiatives requiring more immediate consideration and response. Consequently, the work concentrated on researching and helping identify and resolve the issues considered to be of most immediate concern. Even though they are strongly interrelated, these issues can be separated into two broad categories. The first category encompasses internal program concerns: issues associated with the current demand for accelerating staff growth, satisfying the immediate need for appropriate skill and experience levels, team-building efforts necessary to assure the development of an effective operating organization, the ability of people and organizations to satisfactorily understand and execute their assigned roles and responsibilities, and the general facilitation of inter- and intra-organization communications and working relationships. The second category encompasses program execution concerns: the efforts required to develop realistic execution plans and to implement appropriate control mechanisms that provide for effective forecasting, planning, managing, and controlling of ongoing (or imminent) substantive program activities according to the master integrated schedule and budget.

  12. Developments in the Tools and Methodologies of Synthetic Biology

    PubMed Central

    Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul

    2014-01-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788

  13. Development of an artificial compressibility methodology using flux vector splitting

    NASA Astrophysics Data System (ADS)

    Pappou, Th.; Tsangaris, S.

    1997-09-01

    An implicit, upwind arithmetic scheme that is efficient for the solution of laminar, steady, incompressible, two-dimensional flow fields in a generalised co-ordinate system is presented in this paper. The developed algorithm is based on the extended flux-vector-splitting (FVS) method for solving incompressible flow fields. As in the case of compressible flows, the FVS method consists of the decomposition of the convective fluxes into positive and negative parts that transmit information from the upstream and downstream flow field, respectively. The extension of this method to the solution of incompressible flows is achieved by the method of artificial compressibility, whereby an artificial time derivative of the pressure is added to the continuity equation. In this way the incompressible equations take on a hyperbolic character, with pseudo-pressure waves propagating with finite speed. In such problems the information inside the field is transmitted along its characteristic curves, so upwind schemes can be used to represent the finite volume form of the problem's governing equations. For the representation of the problem variables at the cell faces, upwind schemes of up to third order of accuracy are used, while for the time-iterative procedure a first-order-accurate Euler backward-time difference scheme is used, with second-order central differencing for the shear stresses. The discretized Navier-Stokes equations are solved by an implicit unfactored method using Newton iterations and Gauss-Seidel relaxation. To validate the numerical results against experimental data and other numerical solutions, various laminar flows with known behaviour from the literature are examined.
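    The artificial compressibility formulation this abstract describes can be written compactly. This is the standard form of the method (with τ the pseudo-time, β the artificial compressibility parameter, and ν the kinematic viscosity), not a transcription of the paper's own equations:

    ```latex
    \begin{aligned}
    &\frac{\partial p}{\partial \tau} + \beta\,\nabla\!\cdot\mathbf{u} = 0,\\[4pt]
    &\frac{\partial \mathbf{u}}{\partial \tau} + (\mathbf{u}\cdot\nabla)\mathbf{u}
      = -\nabla p + \nu\,\nabla^{2}\mathbf{u}.
    \end{aligned}
    ```

    The added pressure term makes the system hyperbolic in pseudo-time, with pseudo-pressure waves of finite speed (of order \(\sqrt{\beta}\)), which is what permits the upwind FVS discretization; at pseudo-time convergence the \(\partial p/\partial\tau\) term vanishes and the incompressibility constraint \(\nabla\cdot\mathbf{u} = 0\) is recovered.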

  14. Agile robotic edge finishing system research

    SciTech Connect

    Powell, M.A.

    1995-07-01

    This paper describes a new project undertaken by Sandia National Laboratories to develop an agile, automated, high-precision edge finishing system. The project has a two-year duration and was initiated in October 1994. This project involves re-designing and adding additional capabilities to an existing finishing workcell at Sandia, and developing intelligent methods for automating process definition and for controlling finishing processes. The resulting system will serve as a prototype for systems that will be deployed into highly flexible automated production lines. The production systems will be used to produce a wide variety of products with limited production quantities and quick turnaround requirements. The prototype system is designed to allow programming, process definition, fixture re-configuration, and process verification to be performed off-line for new products. CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) models of the part will be used to assist with the automated process development and process control tasks. To achieve Sandia's performance goals, the system will employ advanced path planning, burr-prediction expert systems, automated process definition, statistical process models in a process database, and a two-level control scheme using hybrid position-force control and fuzzy logic control. In this paper, we discuss the progress and the planned system development under this project.

  15. Balancing Plan-Driven and Agile Methods in Software Engineering Project Courses

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Port, Dan; Winsor Brown, A.

    2002-09-01

    For the past 6 years, we have been teaching a two-semester software engineering project course. The students organize into 5-person teams and develop largely web-based electronic services projects for real USC campus clients. We have been using and evolving a method called Model-Based (System) Architecting and Software Engineering (MBASE) for use in both the course and in industrial applications. The MBASE Guidelines include a lot of documents. We teach risk-driven documentation: if it is risky to document something, and not risky to leave it out (e.g., GUI screen placements), leave it out. Even so, students tend to associate more documentation with higher grades, although our grading eventually discourages this. We are always on the lookout for ways to have students learn best practices without having to produce excessive documentation. Thus, we were very interested in analyzing the various emerging agile methods. We found that agile methods and milestone plan-driven methods are part of a “how much planning is enough?” spectrum. Both agile and plan-driven methods have home grounds of project characteristics where they clearly work best, and where the other will have difficulties. Hybrid agile/plan-driven approaches are feasible, and necessary for projects having a mix of agile and plan-driven home ground characteristics. Information technology trends are going more toward the agile methods' home ground characteristics of emergent requirements and rapid change, although there is a concurrent increase in concern with dependability. As a result, we are currently experimenting with risk-driven combinations of MBASE and agile methods, such as integrating requirements, test plans, peer reviews, and pair programming into “agile quality management.”

  16. Integrating a distributed, agile, virtual enterprise in the TEAM program

    NASA Astrophysics Data System (ADS)

    Cobb, C. K.; Gray, W. Harvey; Hewgley, Robert E.; Klages, Edward J.; Neal, Richard E.

    1997-01-01

    The technologies enabling agile manufacturing (TEAM) program enhances industrial capability by advancing and deploying manufacturing technologies that promote agility. TEAM has developed a product realization process that features the integration of product design and manufacturing groups. TEAM uses the tools it collects, develops, and integrates in support of the product realization process to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, forming, and electromechanical assembly. In order to provide a proof-of-principle, the material removal process has been addressed first and has been successfully demonstrated in an 'interconnected' mode. An internet-accessible intersite file manager (IFM) application has been deployed to allow geographically distributed TEAM participants to share and distribute information as the product realization process is executed. An automated inspection planning application has been demonstrated, importing a solid model from the IFM, generating an inspection plan and a part program to be used in the inspection process, and then distributing the part program to the inspection site via the IFM. TEAM seeks to demonstrate the material removal process in an integrated mode in June 1997 complete with an object-oriented framework and infrastructure. The current status and future plans for this project are presented here.

  17. SAR imagery using chaotic carrier frequency agility pulses

    NASA Astrophysics Data System (ADS)

    Xu, Xiaojian; Feng, Xiangzhi

    2011-06-01

    Synthetic aperture radar (SAR) systems are getting more and more applications in both civilian and military remote sensing missions. With the increasing deployment of electronic countermeasures (ECM) on modern battlefields, SAR encounters more and more interference jamming signals. The ECM jamming signals cause the SAR system to receive and process erroneous information, which results in severe degradations in the output SAR images and/or formation of phony images of nonexistent targets. As a consequence, development of the electronic counter-countermeasures (ECCM) capability becomes one of the key problems in SAR system design. This paper develops radar signaling strategies and algorithms that enhance the ability of synthetic aperture radar to image targets under conditions of electronic jamming. The concept of SAR using chaotic carrier frequency agility pulses (CCFAP-SAR) is first proposed. Then the imaging procedure for CCFAP-SAR is discussed in detail. The ECCM performance of CCFAP-SAR for both depressive noise jamming and deceptive repeat jamming is analyzed. The impact of the carrier frequency agility range on the image quality of CCFAP-SAR is also studied. Simulation results demonstrate that, with adequate agility range of the carrier frequency, the proposed CCFAP-SAR performs as well as conventional radar with a linear frequency modulation (LFM) waveform in image quality, and slightly better in anti-noise depressive jamming; it performs very well in anti-deception jamming, which cannot be rejected by LFM-SAR.
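The abstract does not specify which chaotic map drives the pulse-to-pulse carrier selection; the logistic map is a common choice in the chaotic-waveform literature, and the sketch below uses it purely to illustrate the idea of mapping a chaotic iterate onto an agility band. The band, center frequency, and map parameters are assumptions for the example.

```python
import numpy as np

def chaotic_carrier_frequencies(f0, agility_bw, n_pulses, x0=0.37, r=3.99):
    """Generate a pulse-to-pulse carrier frequency sequence by mapping
    logistic-map iterates onto the agility band [f0 - bw/2, f0 + bw/2]."""
    x = x0
    freqs = np.empty(n_pulses)
    for k in range(n_pulses):
        x = r * x * (1.0 - x)          # logistic map, chaotic for r near 4
        freqs[k] = f0 + (x - 0.5) * agility_bw
    return freqs

# e.g. a 9.6 GHz center frequency, 400 MHz agility range, 64 pulses
f = chaotic_carrier_frequencies(9.6e9, 400e6, 64)
```

Because the sequence is deterministic given the seed `x0`, the receiver can reproduce it for matched filtering, while a jammer without the seed sees an unpredictable hopping pattern; this is the property the paper's ECCM analysis relies on.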

  18. New Developments in Observer Performance Methodology in Medical Imaging

    PubMed Central

    Chakraborty, Dev P.

    2011-01-01

    A common task in medical imaging is assessing whether a new imaging system, or a variant of an existing one, is an improvement over an existing imaging technology. Imaging systems are generally quite complex, consisting of several components – e.g., image acquisition hardware, image processing and display hardware and software, and image interpretation by radiologists – each of which can affect performance. While it may appear odd to include the radiologist as a “component” of the imaging chain, since the radiologist’s decision determines subsequent patient care, the effect of the human interpretation has to be included. Physical measurements like modulation transfer function, signal-to-noise ratio, etc., are useful for characterizing the non-human parts of the imaging chain under idealized and often unrealistic conditions, such as uniform background phantoms, target objects with sharp edges, etc. Measuring the effect on performance of the entire imaging chain, including the radiologist, and using real clinical images, requires different methods that fall under the rubric of observer performance methods or “ROC analysis”. The purpose of this paper is to review recent developments in this field, particularly with respect to the free-response method. PMID:21978444

  19. The Development of a Checklist to Enhance Methodological Quality in Intervention Programs

    PubMed Central

    Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa

    2016-01-01

    The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement on the different items. Rather than simply proposing another checklist, however, it then argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances the methodological quality of intervention programs and fosters cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed. PMID:27917143
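The abstract reports inter-coder agreement without naming the statistic; Cohen's kappa is the usual choice for two coders making item-level judgments, so the sketch below uses it. The coder data are invented for illustration.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each coder's marginal label frequencies."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    ca, cb = Counter(coder_a), Counter(coder_b)
    p_exp = sum(ca[label] * cb[label] for label in set(ca) | set(cb)) / (n * n)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Two coders rating whether each of 10 studies meets a checklist item (1/0):
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
k = cohens_kappa(a, b)   # 0.8 observed agreement, kappa about 0.52
```

Kappa near or above 0.6 is conventionally read as "substantial" agreement, the level the paper claims for its items.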

  20. Towards a Better Understanding of CMMI and Agile Integration - Multiple Case Study of Four Companies

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna

    The amount of software is increasing in different domains in Europe. This provides industries in smaller countries with good opportunities to work in the international markets. Success in the global markets, however, demands the rapid production of high-quality, error-free software. Both CMMI and agile methods seem to provide a ready solution for quality and lead time improvements. There is not, however, much empirical evidence available about either 1) how the integration of these two aspects can be done in practice or 2) what it actually demands from assessors and software process improvement groups. The goal of this paper is to increase the understanding of CMMI and agile integration, in particular focusing on the research question: how to use a ‘lightweight’ style of CMMI assessment in agile contexts. This is done via four case studies in which assessments were conducted using the goals of the CMMI integrated project management and collaboration and coordination with relevant stakeholder process areas, and practices from XP and Scrum. The study shows that the use of agile practices may support the fulfilment of the goals of CMMI process areas, but there are still many challenges for the agile teams to be solved within the continuous improvement programs. It also identifies practical advice for assessors and improvement groups to consider when conducting assessments in the context of agile software development.

  1. AGILE/GRID Science Alert Monitoring System: The Workflow and the Crab Flare Case

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Conforti, V.; Parmiggiani, N.

    2013-10-01

    During the first five years of the AGILE mission we have observed many gamma-ray transients of Galactic and extragalactic origin. A fast reaction to unexpected transient events is a crucial part of the AGILE monitoring program, because the follow-up of astrophysical transients is a key point for this space mission. We present the workflow and the software developed by the AGILE Team to perform the automatic analysis for the detection of gamma-ray transients. In addition, an App for iPhone will be released enabling the Team to access the monitoring system through mobile phones. In September 2010 the science alert monitoring system presented in this paper recorded a transient phenomenon from the Crab Nebula, generating an automated alert sent via email and SMS two hours after the end of an AGILE satellite orbit, i.e. two hours after the Crab flare itself: for this discovery AGILE won the 2012 Bruno Rossi prize. The alert system is designed for maximum speed, and in this case, as in many others, AGILE has demonstrated that the reaction speed of the monitoring system is crucial for the scientific return of the mission.

  2. Methodological choices for the clinical development of medical devices.

    PubMed

    Bernard, Alain; Vaneau, Michel; Fournel, Isabelle; Galmiche, Hubert; Nony, Patrice; Dubernard, Jean Michel

    2014-01-01

    clinical development of MDs.

  3. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented. A discussion of possible future steps in this research area is given.
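The report's specific aggregation technique is not given in this abstract; a standard way to combine multiple experts' elicited distributions into one, with calibration expressed as weights, is the linear opinion pool (a weighted mixture). The sketch below assumes triangular elicited distributions and invented weights purely for illustration.

```python
import numpy as np

def linear_opinion_pool(experts, weights=None, n_draws=20_000, seed=0):
    """Aggregate expert-elicited distributions as a weighted mixture
    (linear opinion pool): each draw comes from one expert, chosen with
    probability proportional to that expert's calibration weight."""
    rng = np.random.default_rng(seed)
    k = len(experts)
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, float)
    w = w / w.sum()
    picks = rng.choice(k, size=n_draws, p=w)
    return np.array([experts[i](rng) for i in picks])

# Three experts give min/most-likely/max for some design parameter
# (values in kg, hypothetical); expert 3 is up-weighted after calibration.
experts = [
    lambda rng: rng.triangular(950, 1000, 1100),
    lambda rng: rng.triangular(900, 980, 1050),
    lambda rng: rng.triangular(940, 1010, 1080),
]
pooled = linear_opinion_pool(experts, weights=[1.0, 1.0, 2.0])
```

The pooled samples form the single probability distribution that downstream multidisciplinary risk analysis can consume.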

  4. Computational and methodological developments towards 3D full waveform inversion

    NASA Astrophysics Data System (ADS)

    Etienne, V.; Virieux, J.; Hu, G.; Jia, Y.; Operto, S.

    2010-12-01

    Full waveform inversion (FWI) is one of the most promising techniques for seismic imaging. It relies on a formalism taking into account every piece of information contained in the seismic data, as opposed to more classical techniques such as travel time tomography. As a result, FWI is a high resolution imaging process able to reach a spatial accuracy equal to half a wavelength. FWI is based on a local optimization scheme, and therefore its main limitation concerns the starting model, which has to be close enough to the real one in order to converge to the global minimum. Another drawback of FWI is the computational resources required when considering models and frequencies of interest. The task becomes even more demanding when one performs the inversion using the elastic equation instead of the acoustic approximation. This is the reason why, until recently, most studies were limited to 2D cases. In the last few years, due to the increase of the available computational power, FWI has attracted considerable interest and continuous effort towards inversion of 3D models, leading to remarkable applications up to the continental scale. We investigate the computational burden induced by FWI in 3D elastic media and propose some strategic features leading to the reduction of the numerical cost while providing great flexibility in the inversion parametrization. First, in order to reduce the memory requirements, we developed our FWI algorithm in the frequency domain and take advantage of the wave-number redundancy in the seismic data to process a reduced number of frequencies. To do so, we extract frequency solutions from time marching techniques, which are efficient for 3D structures. Moreover, this frequency approach permits a multi-resolution strategy by proceeding from low to high frequencies: the final model at one frequency is used as the starting model for the next frequency. This procedure partially overcomes the non-linear behavior of the inversion
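The low-to-high frequency continuation described above can be shown in miniature. The toy below is not the authors' 3D elastic code: it fits a two-parameter "model" to synthetic data by gradient descent on a band-limited misfit, widening the frequency band in stages so each stage's result seeds the next.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 256, endpoint=False)
m_true = np.array([1.0, 3.0])            # toy model: amplitudes of 2 Hz, 10 Hz

def forward(m):
    # toy forward operator: synthesize a trace from the model parameters
    return m[0] * np.sin(2 * np.pi * 2 * t) + m[1] * np.sin(2 * np.pi * 10 * t)

d_obs = forward(m_true)
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])

def misfit_grad(m, fmax):
    # band-limit the residual: keep Fourier components up to fmax (Hz)
    r = np.fft.rfft(forward(m) - d_obs)
    r[freqs > fmax] = 0.0
    r_t = np.fft.irfft(r, n=t.size)
    # gradient of the band-limited least-squares misfit w.r.t. each amplitude
    return np.array([np.dot(r_t, np.sin(2 * np.pi * 2 * t)),
                     np.dot(r_t, np.sin(2 * np.pi * 10 * t))])

m = np.array([0.0, 0.0])                 # poor starting model
for fmax in (3.0, 6.0, 12.0):            # low-to-high frequency continuation
    for _ in range(200):                 # each stage starts from the last
        m = m - 0.01 * misfit_grad(m, fmax)
```

At fmax = 3 Hz only the smooth (2 Hz) part of the model is updated; the 10 Hz detail enters only in the final stage, which is exactly the multi-resolution behavior the paragraph describes.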

  5. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    ERIC Educational Resources Information Center

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  6. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.

  7. A comparison of linear speed, closed-skill agility, and open-skill agility qualities between backcourt and frontcourt adult semiprofessional male basketball players.

    PubMed

    Scanlan, Aaron T; Tucker, Patrick S; Dalbo, Vincent J

    2014-05-01

    The measurement of fitness qualities relevant to playing position is necessary to inform basketball coaching and conditioning staff of role-related differences in playing groups. To date, sprinting and agility performance have not been compared between playing positions in adult male basketball players. Therefore, the purpose of this study was to describe and compare linear speed, closed-skill agility, and open-skill agility qualities between backcourt (point guard and shooting guard positions) and frontcourt (small forward, power forward, and center positions) semiprofessional basketball players. Six backcourt (mean ± SD: age, 24.3 ± 7.9 years; stature, 183.4 ± 4.0 cm; body mass, 85.5 ± 12.3 kg; VO2max, 51.9 ± 4.8 ml·kg(-1)·min(-1)) and 6 frontcourt (mean ± SD: age, 27.5 ± 5.5 years; stature, 194.4 ± 7.1 cm; body mass, 109.4 ± 8.8 kg; VO2max, 47.1 ± 5.0 ml·kg(-1)·min(-1)) adult male basketball players completed 20-m sprint, closed-skill agility, and open-skill agility performance tests. Magnitude-based inferences revealed that backcourt players (5 m, 1.048 ± 0.027 seconds; 10 m, 1.778 ± 0.048 seconds; 20 m, 3.075 ± 0.121 seconds) possessed likely quicker linear sprint times than frontcourt players (5 m, 1.095 ± 0.085 seconds; 10 m, 1.872 ± 0.127 seconds; 20 m, 3.242 ± 0.221 seconds). Conversely, frontcourt players (1.665 ± 0.096 seconds) held possible superior closed-skill agility performance than backcourt players (1.613 ± 0.111 seconds). In addition, unclear positional differences were apparent for open-skill agility qualities. These findings indicate that linear speed and change of direction speed might be differently developed across playing positions. Furthermore, position-related functions might similarly depend on the aspects of open-skill agility performance across backcourt and frontcourt players. Basketball coaching and conditioning staff should consider the development of position-targeted training drills to improve speed, agility

  8. Parallel optimization methods for agile manufacturing

    SciTech Connect

    Meza, J.C.; Moen, C.D.; Plantenga, T.D.; Spence, P.A.; Tong, C.H.; Hendrickson, B.A.; Leland, R.W.; Reese, G.M.

    1997-08-01

    The rapid and optimal design of new goods is essential for meeting national objectives in advanced manufacturing. Currently almost all manufacturing procedures involve the determination of some optimal design parameters. This process is iterative in nature and because it is usually done manually it can be expensive and time consuming. This report describes the results of an LDRD, the goal of which was to develop optimization algorithms and software tools that will enable automated design thereby allowing for agile manufacturing. Although the design processes vary across industries, many of the mathematical characteristics of the problems are the same, including large-scale, noisy, and non-differentiable functions with nonlinear constraints. This report describes the development of a common set of optimization tools using object-oriented programming techniques that can be applied to these types of problems. The authors give examples of several applications that are representative of design problems including an inverse scattering problem, a vibration isolation problem, a system identification problem for the correlation of finite element models with test data and the control of a chemical vapor deposition reactor furnace. Because the function evaluations are computationally expensive, they emphasize algorithms that can be adapted to parallel computers.

  9. Gamma-ray Astrophysics with AGILE

    SciTech Connect

    Longo, Francesco; Tavani, M.; Barbiellini, G.; Argan, A.; Basset, M.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.; Chen, A.; Costa, E.; Del Monte, E.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Feroci, M.; Fiorini, M.; Foggetta, L.; Froysland, T.; Frutti, M.

    2007-07-12

    AGILE will explore the gamma-ray Universe with a very innovative instrument combining for the first time a gamma-ray imager and a hard X-ray imager. AGILE will be operational in spring 2007 and it will provide crucial data for the study of Active Galactic Nuclei, Gamma-Ray Bursts, unidentified gamma-ray sources, Galactic compact objects, supernova remnants, TeV sources, and fundamental physics by microsecond timing. The AGILE instrument is designed to simultaneously detect and image photons in the 30 MeV - 50 GeV and 15 - 45 keV energy bands with excellent imaging and timing capabilities, and a large field of view covering {approx} 1/5 of the entire sky at energies above 30 MeV. A CsI calorimeter is capable of GRB triggering in the energy band 0.3-50 MeV. AGILE is now (March 2007) undergoing launcher integration and testing. The PSLV launch is planned in spring 2007. AGILE is then foreseen to be fully operational during the summer of 2007.

  10. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    ERIC Educational Resources Information Center

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…

  11. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Shivanand M., Handigund; Shweta, Bhat

    The Software Requirements Specification (SRS) of the organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve the software tools, the hardware devices, the manual procedures, the application programs and the communication commands. These components are appropriately ordered for achieving the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of various flow diagrams of (i) project development activities, (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though both methodologies are independent, each complements the other in validating its correctness and completeness.

  12. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
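The Weibull size effect mentioned above follows from weakest-link statistics: a larger stressed volume is more likely to contain a critical flaw, so its characteristic strength scales as (V_ref/V)^(1/m), where m is the Weibull modulus. The sketch below applies that relation with illustrative numbers (the strengths, volumes, and modulus are assumptions, not values from the report).

```python
def weibull_size_scaling(sigma_ref, v_ref, v_new, m):
    """Characteristic strength of a specimen with effective volume v_new,
    scaled from a reference specimen via the Weibull weakest-link relation:
    sigma_new / sigma_ref = (v_ref / v_new) ** (1 / m)."""
    return sigma_ref * (v_ref / v_new) ** (1.0 / m)

# Illustrative: a bulge-test film with 10x the effective stressed volume
# of the reference coupon, Weibull modulus m = 10 (assumed values).
sigma_small = 3.0e9                                  # Pa, reference strength
sigma_large = weibull_size_scaling(sigma_small,
                                   v_ref=1.0, v_new=10.0, m=10.0)
```

A bulge test confirming this downward strength trend with size is what would justify applying a CARES/Life-style probabilistic design methodology to MEMS.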

  13. Using the community of inquiry methodology in teaching bioethics: a focus on skills development.

    PubMed

    Hunter, David L

    2008-01-01

    The community of inquiry methodology was developed by Professor Matthew Lipman to enable the teaching of philosophy in schools. Lipman felt that inquiry-based learning was essential in schools because education should empower children to be thoughtful about the lives they lead, and doing philosophy is important to that goal. The community of inquiry is a powerful pedagogical tool to foster student engagement, critical thinking, and collaborative and affective skills development. As such it can be useful in the bioethics classroom. This article describes the community of inquiry methodology and how it can be a useful arrow in the quiver of a bioethics teacher.

  14. SuperAGILE and Gamma Ray Bursts

    SciTech Connect

    Pacciani, Luigi; Costa, Enrico; Del Monte, Ettore; Donnarumma, Immacolata; Evangelista, Yuri; Feroci, Marco; Frutti, Massimo; Lazzarotto, Francesco; Lapshov, Igor; Rubini, Alda; Soffitta, Paolo; Tavani, Marco; Barbiellini, Guido; Mastropietro, Marcello; Morelli, Ennio; Rapisarda, Massimo

    2006-05-19

    The solid-state hard X-ray imager of the AGILE gamma-ray mission -- SuperAGILE -- has a six arcmin on-axis angular resolution in the 15-45 keV range and a field of view in excess of 1 steradian. The instrument is very light: 5 kg only. It is equipped with on-board self-triggering logic and image deconvolution, and it is able to transmit the coordinates of a GRB to the ground in real time through the ORBCOMM constellation of satellites. Photon-by-photon scientific data are sent to the Malindi ground station at every contact. In this paper we review the performance of the SuperAGILE experiment (scheduled for launch in mid-2006), after its first on-ground calibrations, and show the perspectives for Gamma Ray Bursts.

  15. The Influence of Agility Training on Physiological and Cognitive Performance

    DTIC Science & Technology

    2010-11-01

    training, subjects completed a physical and cognitive battery of serum cortisol, VO2max, vertical jump, reaction time, Illinois Agility Test, body...strong trends toward the agility group improving more than the traditional group on VO2max (p=0.12), vertical jump (p=0.06), Illinois Agility Test...levels, maximal oxygen uptake, Illinois Agility Test, Makoto reaction time, and vertical jump. The cognitive portion of the testing sessions

  16. The National Aviation Operational Monitoring Service (NAOMS): A Documentation of the Development of a Survey Methodology

    NASA Technical Reports Server (NTRS)

    Connors, Mary M.; Mauro, Robert; Statler, Irving C.

    2012-01-01

    The National Aviation Operational Monitoring Service (NAOMS) was a research project under NASA's Aviation Safety Program during the years from 2000 to 2005. The purpose of this project was to develop a methodology for gaining reliable information on changes over time in the rates-of-occurrence of safety-related events as a means of assessing the safety of the national airspace. The approach was a scientifically designed survey of the operators of the aviation system concerning their safety-related experiences. This report presents the results of the methodology developed and a demonstration of the NAOMS concept through a survey of nearly 20,000 randomly selected air-carrier pilots. Results give evidence that the NAOMS methodology can provide a statistically sound basis for evaluating trends of incidents that could compromise safety. The approach and results are summarized in the report, and supporting documentation and complete analyses of results are presented in 14 appendices.

  17. Development and application of a safety assessment methodology for waste disposals

    SciTech Connect

    Little, R.H.; Torres, C.; Schaller, K.H.

    1996-12-31

    As part of a European Commission funded research programme, QuantiSci (formerly the Environmental Division of Intera Information Technologies) and Instituto de Medio Ambiente of the Centro de Investigaciones Energeticas Medioambientales y Tecnologicas (IMA/CIEMAT) have developed and applied a comprehensive, yet practicable, assessment methodology for post-disposal safety assessment of land-based disposal facilities. This Safety Assessment Comparison (SACO) Methodology employs a systematic approach to the collection, evaluation and use of waste and disposal system data. It can be used to assess engineered barrier performance, the attenuating properties of host geological formations, and the long term impacts of a facility on the environment and human health, as well as allowing the comparison of different disposal options for radioactive, mixed and non-radioactive wastes. This paper describes the development of the methodology and illustrates its use.

  18. Theoretical aspects of the agile mirror

    NASA Astrophysics Data System (ADS)

    Manheimer, Wallace M.; Fernsler, Richard

    1994-01-01

    A planar plasma mirror which can be oriented electronically could have the capability of providing electronic steering of a microwave beam in a radar or electronic warfare system. This system is denoted the agile mirror. A recent experiment has demonstrated such a planar plasma and the associated microwave reflection. This plasma was produced by a hollow cathode glow discharge, where the hollow cathode was a grooved metallic trench in a Lucite plate. Various theoretical aspects of this configuration of an agile mirror are examined here.

  19. Physical qualities predict change-of-direction speed but not defensive agility in Australian rules football.

    PubMed

    Young, Warren B; Miller, Ian R; Talpey, Scott W

    2015-01-01

    The purpose of this study was to determine the relationships between selected physical qualities, change-of-direction (COD) speed, and defensive agility performance in Australian Rules football players. Twenty-four male community-level players were assessed on sprint acceleration (10-m time), maximum strength (3 repetition-maximum half squat), leg power (countermovement jump), reactive strength (drop jump), and a single COD speed test and a defensive agility test. Change-of-direction speed was correlated with reactive strength (r = -0.645, p = 0.001) and sprint acceleration (r = 0.510, p = 0.011). Multiple regression indicated that the combined physical qualities explained 56.7% of the variance associated with COD speed (adjusted R² = 0.567, p ≤ 0.05). Participants were median split into faster and slower COD speed groups, and these were compared by independent t-tests. The faster group was significantly better (p ≤ 0.05) on the sprint acceleration and reactive strength tests (large effect size). The correlations between physical qualities and agility were trivial to small (r = -0.101 to 0.123, p > 0.05) and collectively explained only 14.2% of the variance associated with agility performance (adjusted R² = -0.142, p > 0.05). When faster and slower agility groups were compared, there were trivial to moderate differences (p > 0.05) in all physical qualities. It was concluded that reactive strength and sprint acceleration are important for COD speed, but the physical qualities assessed are not associated with defensive agility performance. For agility tasks similar to those in this study, sprint and resistance training should not be emphasized, and training other factors, such as the development of sport-specific technique and cognitive skill, is recommended.
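The "variance explained" figures above come from adjusted R², which penalizes R² for the number of predictors (and can go negative when predictors explain less than chance, as in the agility model). A minimal sketch of the computation, with invented toy numbers rather than the study's data:

```python
import numpy as np

def adjusted_r2(y, y_pred, n_predictors):
    """Adjusted R^2: R^2 penalized for model size,
    1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    y, y_pred = np.asarray(y, float), np.asarray(y_pred, float)
    n = y.size
    ss_res = np.sum((y - y_pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)

# Toy example: COD-speed times (s) "predicted" from two physical qualities
y      = [1.61, 1.55, 1.70, 1.66, 1.59, 1.72, 1.63, 1.68]
y_pred = [1.62, 1.57, 1.68, 1.65, 1.60, 1.70, 1.64, 1.67]
adj = adjusted_r2(y, y_pred, n_predictors=2)
```

With many predictors and few players (24 here), the adjustment matters: a raw R² can look respectable while the adjusted value, as in the agility regression, drops to zero or below.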

  20. The Backyard Human Performance Technologist: Applying the Development Research Methodology to Develop and Validate a New Instructional Design Framework

    ERIC Educational Resources Information Center

    Brock, Timothy R.

    2009-01-01

    Development research methodology (DRM) has been recommended as a viable research approach to expand the practice-to-theory/theory-to-practice literature that human performance technology (HPT) practitioners can integrate into the day-to-day work flow they already use to develop instructional products. However, little has been written about how it…

  1. Development of frequency-agile high-repetition-rate CO₂ DIAL systems for long range chemical remote sensing

    SciTech Connect

    Quick, C.R. Jr.; Fite, C.B.; Foy, B.R.; Jolin, J.; Mietz, D.E.

    1997-11-01

    Issues related to the development of direct detection, long-range CO₂ DIAL systems for chemical detection and identification are presented and discussed, including: data handling and display techniques for large, multi-λ data sets, turbulence effects, slant path propagation, and speckle averaging. Data examples from various field campaigns and CO₂ lidar platforms are used to illustrate the issues.

  2. Development of a Valid and Reliable Knee Articular Cartilage Condition–Specific Study Methodological Quality Score

    PubMed Central

    Harris, Joshua D.; Erickson, Brandon J.; Cvetanovich, Gregory L.; Abrams, Geoffrey D.; McCormick, Frank M.; Gupta, Anil K.; Verma, Nikhil N.; Bach, Bernard R.; Cole, Brian J.

    2014-01-01

    Background: Condition-specific questionnaires are important components in evaluation of outcomes of surgical interventions. No condition-specific study methodological quality questionnaire exists for evaluation of outcomes of articular cartilage surgery in the knee. Purpose: To develop a reliable and valid knee articular cartilage–specific study methodological quality questionnaire. Study Design: Cross-sectional study. Methods: A stepwise, a priori–designed framework was created for development of a novel questionnaire. Items relevant to the topic were identified and extracted from a recent systematic review of 194 investigations of knee articular cartilage surgery. In addition, relevant items from existing generic study methodological quality questionnaires were identified. Items for a preliminary questionnaire were generated. Redundant and irrelevant items were eliminated, and acceptable items modified. The instrument was pretested and items were weighted. The instrument, the MARK score (Methodological quality of ARticular cartilage studies of the Knee), was tested for validity (criterion validity) and reliability (inter- and intraobserver). Results: A 19-item, 3-domain MARK score was developed. The 100-point scale score demonstrated face validity (focus group of 8 orthopaedic surgeons) and criterion validity (strong correlation to Cochrane Quality Assessment score and Modified Coleman Methodology Score). Interobserver reliability for the overall score was good (intraclass correlation coefficient [ICC], 0.842), and for all individual items of the MARK score, acceptable to perfect (ICC, 0.70-1.000). Intraobserver reliability ICC assessed over a 3-week interval was strong for 2 reviewers (≥0.90). Conclusion: The MARK score is a valid and reliable knee articular cartilage condition–specific study methodological quality instrument. Clinical Relevance: This condition-specific questionnaire may be used to evaluate the quality of studies reporting outcomes of

  3. A Methodological Approach to Developing Bibliometric Models of Types of Humanities Scholarship.

    ERIC Educational Resources Information Center

    Wiberley, Stephen E., Jr.

    2003-01-01

    Outlines a methodological approach to developing bibliometric models of the sources used in different types of humanities scholarship. Identifies five types of scholarship: description of primary sources, editing of primary sources, historical studies, criticism, and theory. Illustrates the approach through an analysis of sources used in 54…

  4. Making Explicit the Analysis of Students' Mathematical Discourses--Revisiting a Newly Developed Methodological Framework

    ERIC Educational Resources Information Center

    Ryve, Andreas

    2006-01-01

    Sfard and Kieran [Kieran, C., Educational Studies in Mathematics 46, 2001, 187-228; Sfard, A., Educational Studies in Mathematics 46, 2001, 13-57; Sfard, A. and Kieran, C., Mind, Culture, and Activity 8, 2001, 42-76] have developed a methodological framework, which aims at characterizing the students' mathematical discourses while they are working…

  5. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  6. EPA’s AP-42 development methodology: Converting or rerating current AP-42 datasets

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In August 2013, the U.S. Environmental Protection Agency (EPA) published its new methodology for updating the Compilation of Air Pollution Emission Factors (AP-42). The “Recommended Procedures for Development of Emissions Factors and Use of the WebFIRE Database” instructs that the ratings of the...

  7. Evaluating EPA’s AP-42 development methodology using a cotton gin total PM dataset

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In August 2013, the U.S. Environmental Protection Agency (EPA) published its new methodology for updating the Compilation of Air Pollution Emission Factors (AP-42). The “Recommended Procedures for Development of Emissions Factors and Use of the WebFIRE Database” has yet to be widely used. These ...

  8. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures, which consequently necessitates accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this proposed multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures. This can be thought of as a building block approach. This strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of existing general-purpose finite element software.

  9. Towards a Trans-Disciplinary Methodology for a Game-Based Intervention Development Process

    ERIC Educational Resources Information Center

    Arnab, Sylvester; Clarke, Samantha

    2017-01-01

    The application of game-based learning adds play into educational and instructional contexts. Even though there is a lack of standard methodologies or formulaic frameworks to better inform game-based intervention development, there exist scientific and empirical studies that can serve as benchmarks for establishing scientific validity in terms of…

  10. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    SciTech Connect

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  11. Ground test and evaluation methodologies and techniques for the development of endoatmospheric interceptors

    NASA Astrophysics Data System (ADS)

    Amundson, Mark H.; Smith, D. M.

    1993-06-01

    The ground test facilities and methodologies of the Arnold Engineering Development Center are reviewed. Topics addressed include proven test techniques for aerodynamic, aerothermal, weather/erosion, and impact/lethality testing, and test support capabilities, including diagnostics and instrumentation, analysis support, and various ongoing technology programs.

  12. SURVEY OF METHODOLOGIES FOR DEVELOPING MEDIA SCREENING VALUES FOR ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    Barron, Mace G. and Steve Wharton. Submitted. Survey of Methodologies for Developing Media Screening Values for Ecological Risk Assessment. Environ. Toxicol. Chem. 44 p. (ERL,GB 1200).

    Concurrent with the increase in the number of ecological risk assessments over the past...

  13. Cognitive Sensitivity in Sibling Interactions: Development of the Construct and Comparison of Two Coding Methodologies

    ERIC Educational Resources Information Center

    Prime, Heather; Perlman, Michal; Tackett, Jennifer L.; Jenkins, Jennifer M.

    2014-01-01

    Research Findings: The goal of this study was to develop a construct of sibling cognitive sensitivity, which describes the extent to which children take their siblings' knowledge and cognitive abilities into account when working toward a joint goal. In addition, the study compared 2 coding methodologies for measuring the construct: a thin…

  14. Multicultural Career Development: A Methodological Critique of 8 Years of Research in Three Leading Career Journals.

    ERIC Educational Resources Information Center

    Koegel, Henry M.; And Others

    1995-01-01

    Conducted a methodological critique of all full-length articles highlighting American racial and ethnic minority groups or international populations appearing in the "Journal of Employment Counseling,""The Career Development Quarterly," and the "Journal of Vocational Behavior" from 1985 through 1992. (JBJ)

  15. Methodology development for the sustainability process assessment of sheet metal forming of complex-shaped products

    NASA Astrophysics Data System (ADS)

    Pankratov, D. L.; Kashapova, L. R.

    2015-06-01

    A methodology was developed for automated assessment of the reliability of the sheet metal forming process, with the aim of reducing defects in the manufacture of complex components. The article identifies the range of allowable stamping parameters that yields defect-free forming of truck spars.

  16. [Developing the methodology of examining the lower limb veins in cosmonauts for the space medicine practice].

    PubMed

    Kotovskaia, A R; Fomina, G A; Sal'nikov, A V; Iarmanova, E N

    2014-01-01

    The article centres on development of a methodology for evaluating the function of lower limb veins of cosmonauts in microgravity. The whys and wherefores of the choice of occlusive plethysmography equipment and procedure are explained. Much attention is given to arguments for the requisite body and limb positioning during venous plethysmography before launch and after return from space flight. To minimize the gravity effect on venous blood flow, the body should be in a level position and the calf aligned with the hydrodynamically indifferent point. Determining the type of test occlusion, occlusion adjustments, venous parameters of interest, and data processing procedure constitute the methodology.

  17. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    SciTech Connect

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit; Unwin, Stephen

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  18. DEVELOPMENT OF A METHODOLOGY TO ASSESS PROLIFERATION RESISTANCE AND PHYSICAL PROTECTION FOR GENERATION IV SYSTEMS

    SciTech Connect

    Nishimura, R.; Bari, R.; Peterson, P.; Roglans-Ribas, J.; Kalenchuk, D.

    2004-10-06

    Enhanced proliferation resistance and physical protection (PR&PP) is one of the technology goals for advanced nuclear concepts, such as Generation IV systems. Under the auspices of the Generation IV International Forum, the Office of Nuclear Energy, Science and Technology of the U.S. DOE, the Office of Nonproliferation Policy of the National Nuclear Security Administration, and participating organizations from six other countries are sponsoring an international working group to develop an evaluation methodology for PR&PP. This methodology will permit an objective PR&PP comparison between alternative nuclear systems (e.g., different reactor types or fuel cycles) and support design optimization to enhance robustness against proliferation, theft and sabotage. The paper summarizes the proposed assessment methodology including the assessment framework, measures used to express the PR&PP characteristics of the system, threat definition, system element and target identification, pathway identification and analysis, and estimation of the measures.

  19. Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.

    1999-01-01

    A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, ΔJ_eff, as the governing parameter. The methodology contains original and literature J and ΔJ solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.
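The abstract does not give the growth law's exact form; as a hedged illustration, a Paris-type power law in ΔJ_eff shows how a J solution translates into a growth-rate prediction. The constants C and m here are arbitrary placeholders, not values from the NASGRO modules.

```python
# Hypothetical sketch: a Paris-type crack growth law driven by Delta J_eff.
# The power-law form and the constants C, m are illustrative assumptions.

def crack_growth_rate(delta_j_eff, C=1e-6, m=1.5):
    """da/dN = C * (Delta J_eff)**m; units depend on the chosen C."""
    return C * delta_j_eff ** m

# growth per cycle at an effective J-range of 4.0 (arbitrary units)
rate = crack_growth_rate(4.0)
```

Integrating 1/(da/dN) over crack length would then give a life prediction, the other output the abstract mentions.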

  20. Development of a combustor analytical design methodology for liquid rocket engines

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Muss, Jeff

    1989-01-01

    The development of a user-friendly computerized methodology for the design and analysis of liquid propellant rocket engine combustion chambers is described. An overview of the methodology, consisting of a computer program containing an appropriate modular assembly of existing industry-wide performance and combustion stability models, is presented. These models are linked with an interactive front-end processor enabling the user to define the performance and stability traits of an existing design (point analysis) or to create the essential design features of a combustor to meet specific performance goals and combustion stability (point design). Plans for demonstration and verification of this methodology are also presented. These plans include the creation of combustor designs using the methodology, together with predictions of the performance and combustion stability for each design. A verification test program of 26 hot-fire tests with up to four designs created using this methodology is described. This testing is planned using LOX/RP-1 propellants with a thrust level of approx. 220,000 N (50,000 lbf).

  2. The Holy Grail of Agile Acquisition

    DTIC Science & Technology

    2010-04-01

    Motivation: despite Erwin's recommendation, agility seems to be a simple concept and is commonly perceived as a virtue. [Erwin 2009: Erwin, S.I., Washington Pulse: "Pentagon Brass: Stay Away From Management Bestsellers," National Defense, August 2009]

  3. DEVELOPMENT OF COMMON METHODOLOGY TO CALCULATE CARBON DIOXIDE EMISSIONS FROM CONSTRUCTION MATERIALS

    NASA Astrophysics Data System (ADS)

    Kanda, Taro; Takimoto, Masamichi; Sone, Shinri; Kishida, Hiroyuki; Hanaki, Keisuke; Fujita, Tsuyoshi

    Concerning CO2 emissions related to infrastructure development, no common calculation methodology has yet been established. A common methodology is necessary to identify effective approaches to reducing total CO2 emissions. In this study, we develop a calculation method for CO2 emissions related to infrastructure development and propose it as a common methodology. As a first step, we focus on major construction materials, because their manufacture accounts for a large part of total CO2 emissions. The calculation method should satisfy the following requirements: (1) covering all CO2 emissions, (2) based on material quantities, (3) having clear evidence, (4) categorizing materials from the perspective of those concerned with infrastructure development, (5) able to reflect site-oriented data, and (6) updated annually. The developed method combines the pile-up and input-output techniques to satisfy the requirements above. The major part of CO2 emissions is calculated from material-based quantities by applying the pile-up technique to inputs of primary materials and energy consumption. Complementary use of the input-output technique we developed covers all domestic activities, including product development, fixed capital formation, and others. An estimation using official and industry-based statistics for major construction materials such as cement and aggregate confirms that the pile-up technique captures approximately 90% of the CO2 emissions due to manufacturing activities. The method also enables ordinary CO2 emissions of construction materials to be updated annually.
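The pile-up step the abstract describes (summing material quantities times per-material emission factors) can be sketched as follows; the factors and quantities are illustrative placeholders, not values from the study.

```python
# Hedged sketch of the "pile-up" technique: CO2 from material quantities
# multiplied by per-material emission factors. All numbers are illustrative
# placeholders, not values from the paper or any official statistics.
EMISSION_FACTORS = {"cement": 0.77, "steel": 1.80, "aggregate": 0.003}  # t-CO2 per t
quantities = {"cement": 120.0, "steel": 45.0, "aggregate": 900.0}       # t of material

def pile_up_co2(quantities, factors):
    """Material-based pile-up: sum quantity * emission factor per material."""
    return sum(q * factors[m] for m, q in quantities.items())

total_t_co2 = pile_up_co2(quantities, EMISSION_FACTORS)
```

The complementary input-output step would then cover activities not captured by these material-based terms.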

  4. [Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Hermann, Robert

    1997-01-01

    The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; it has been extended to apply to ordinary differential systems of the type encountered in control, in joint work with the PI and M. Oberguggenberger. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form {dx/dt = f(x, u)} and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation (PDE) system of the type encountered in physics problems (e.g., fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as discretizations of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques familiar to topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.

  5. Using Constructivist Case Study Methodology to Understand Community Development Processes: Proposed Methodological Questions to Guide the Research Process

    ERIC Educational Resources Information Center

    Lauckner, Heidi; Paterson, Margo; Krupa, Terry

    2012-01-01

    Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…

  6. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    SciTech Connect

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  7. Wired Widgets: Agile Visualization for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Gerschefske, K.; Witmer, J.

    2012-09-01

    Continued advancement in sensors and analysis techniques has resulted in a wealth of Space Situational Awareness (SSA) data, made available via tools and Service Oriented Architectures (SOA) such as those in the Joint Space Operations Center Mission Systems (JMS) environment. Current visualization software cannot quickly adapt to rapidly changing missions and data, preventing operators and analysts from performing their jobs effectively. The value of this wealth of SSA data is not fully realized, as the operators' existing software is not built with the flexibility to consume new or changing sources of data or to rapidly customize their visualization as the mission evolves. While tools like the JMS user-defined operational picture (UDOP) have begun to fill this gap, this paper presents a further evolution, leveraging Web 2.0 technologies for maximum agility. We demonstrate a flexible Web widget framework with inter-widget data sharing, publish-subscribe eventing, and an API providing the basis for consumption of new data sources and adaptable visualization. Wired Widgets offers cross-portal widgets along with a widget communication framework and development toolkit for rapid new widget development, giving operators the ability to answer relevant questions as the mission evolves. Wired Widgets has been applied in a number of dynamic mission domains including disaster response, combat operations, and noncombatant evacuation scenarios. The variety of applications demonstrates that Wired Widgets provides a flexible, data-driven solution for visualization in changing environments. In this paper, we show how, deployed in the Ozone Widget Framework portal environment, Wired Widgets can provide an agile, web-based visualization to support the SSA mission. Furthermore, we discuss how the tenets of agile visualization can generally be applied to the SSA problem space to provide operators flexibility, potentially informing future acquisition and system development.
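The inter-widget data sharing and publish-subscribe eventing mentioned above can be illustrated with a minimal event bus; the class, topic names, and payload below are hypothetical, not the actual Wired Widgets or Ozone Widget Framework API.

```python
# Minimal publish-subscribe sketch of inter-widget data sharing.
# EventBus, the topic string, and the payload are illustrative assumptions,
# not the Wired Widgets API.
from collections import defaultdict

class EventBus:
    """Routes published payloads to every callback subscribed to a topic."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self._subscribers[topic]:
            callback(payload)

# A "track list" widget publishes a selection; a "detail" widget reacts.
bus = EventBus()
received = []
bus.subscribe("satellite.selected", received.append)
bus.publish("satellite.selected", {"norad_id": 25544})
```

Decoupling publisher from subscriber in this way is what lets new widgets consume existing data sources without modifying the widgets that produce them.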

  8. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independent of the infrastructure's process, application and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries and capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source for information to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also supported, allowing effective management of the informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  9. Supply Chain Modeling: Downstream Risk Assessment Methodology (DRAM) Summary of Development and Application

    DTIC Science & Technology

    2015-04-01

    This briefing demonstrates the operation of DRAM using the neodymium-iron-boron (NdFeB) magnet supply chain as a test case. More precise risk assessment is necessary to evaluate and support proposed risk mitigation measures; the briefing describes the specific approach taken to develop the DRAM methodology and build a prototype supply chain model for NdFeB magnets.

  10. A new methodology for the development of high-latitude ionospheric climatologies and empirical models

    NASA Astrophysics Data System (ADS)

    Chisham, G.

    2017-01-01

    Many empirical models and climatologies of high-latitude ionospheric processes, such as convection, have been developed over the last 40 years. One common feature in the development of these models is that measurements from different times are combined and averaged on fixed coordinate grids. This methodology ignores the reality that high-latitude ionospheric features are organized relative to the location of the ionospheric footprint of the boundary between open and closed geomagnetic field lines (OCB). This boundary is in continual motion, and the polar cap that it encloses is continually expanding and contracting in response to changes in the rates of magnetic reconnection at the Earth's magnetopause and in the magnetotail. As a consequence, models that are developed by combining and averaging data in fixed coordinate grids heavily smooth the variations that occur near the boundary location. Here we propose that the development of future models should consider the location of the OCB in order to more accurately model the variations in this region. We present a methodology which involves identifying the OCB from spacecraft auroral images and then organizing measurements in a grid where the bins are placed relative to the OCB location. We demonstrate the plausibility of this methodology using ionospheric vorticity measurements made by the Super Dual Auroral Radar Network radars and OCB measurements from the IMAGE spacecraft FUV auroral imagers. This demonstration shows that this new methodology results in sharpening and clarifying features of climatological maps near the OCB location. We discuss the potential impact of this methodology on space weather applications.
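The proposed binning can be sketched under assumed synthetic data: a feature organized relative to a moving boundary is smeared when averaged on a fixed latitude grid but preserved when binned by offset from the instantaneous OCB. None of the numbers below are SuperDARN or IMAGE measurements.

```python
# Synthetic illustration (not SuperDARN/IMAGE data): a feature peaked at a
# moving open-closed field line boundary (OCB) is smoothed out on a
# fixed-latitude grid but preserved when binned relative to the OCB.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
ocb_lat = 72.0 + 3.0 * rng.standard_normal(n)     # boundary latitude per sample (deg)
meas_lat = ocb_lat + rng.uniform(-10.0, 10.0, n)  # measurement latitudes (deg)
feature = np.exp(-((meas_lat - ocb_lat) / 2.0) ** 2)  # peaks at the OCB

edges = np.arange(-10.0, 11.0, 2.0)  # 2-degree bins

# Fixed grid: bin by offset from a fixed reference latitude (72 deg)
fixed_idx = np.digitize(meas_lat - 72.0, edges)
fixed_means = np.array([feature[fixed_idx == i].mean() for i in np.unique(fixed_idx)])

# OCB-relative grid: bin by offset from the boundary at measurement time
rel_idx = np.digitize(meas_lat - ocb_lat, edges)
rel_means = np.array([feature[rel_idx == i].mean() for i in np.unique(rel_idx)])
```

Comparing the peak of the two binned averages shows the sharpening effect the abstract reports: the OCB-relative average retains a pronounced maximum at zero offset, while the fixed-grid average is flattened by boundary motion.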

  11. Surreptitious, Evolving and Participative Ontology Development: An End-User Oriented Ontology Development Methodology

    ERIC Educational Resources Information Center

    Bachore, Zelalem

    2012-01-01

    Ontology not only is considered to be the backbone of the semantic web but also plays a significant role in distributed and heterogeneous information systems. However, ontology still faces limited application and adoption to date. One of the major problems is that prevailing engineering-oriented methodologies for building ontologies do not…

  12. AGILE and Gamma-Ray Bursts

    SciTech Connect

    Longo, Francesco; Tavani, M.; Barbiellini, G.; Argan, A.; Basset, M.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.; Chen, A.; Costa, E.; Del Monte, E.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Feroci, M.; Fiorini, M.; Foggetta, L.; Froysland, T.; Frutti, M.

    2006-05-19

    AGILE is a Scientific Mission dedicated to high-energy astrophysics supported by ASI with scientific participation of INAF and INFN. The AGILE instrument is designed to simultaneously detect and image photons in the 30 MeV - 50 GeV and 15 - 45 keV energy bands with excellent imaging and timing capabilities, and a large field of view covering ~ 1/5 of the entire sky at energies above 30 MeV. A CsI calorimeter is capable of GRB triggering in the energy band 0.3-50 MeV. The broadband detection of GRBs and the study of implications for particle acceleration and high-energy emission are primary goals of the mission. AGILE can image GRBs with 2-3 arcminute error boxes in the hard X-ray range, and provide broadband photon-by-photon detection in the 15-45 keV, 0.3-50 MeV, and 30 MeV-30 GeV energy ranges. Microsecond on-board photon tagging and a ~ 100 microsecond gamma-ray detection deadtime will be crucial for fast GRB timing. On-board calculated GRB coordinates and energy fluxes will be quickly transmitted to the ground by an ORBCOMM transceiver. AGILE recently (December 2005) completed its gamma-ray calibration. It is now (January 2006) undergoing satellite integration and testing. The PSLV launch is planned for early 2006. AGILE is then foreseen to be fully operational during the summer of 2006. It will be the only mission entirely dedicated to high-energy astrophysics above 30 MeV during the period mid-2006/mid-2007.

  13. Perspectives on Industrial Innovation from Agilent, HP, and Bell Labs

    NASA Astrophysics Data System (ADS)

    Hollenhorst, James

    2014-03-01

    Innovation is the lifeblood of technology companies. I will give perspectives gleaned from a career in research and development at Bell Labs, HP Labs, and Agilent Labs, from the point of view of an individual contributor and a manager. Physicists bring a unique set of skills to the corporate environment, including a desire to understand the fundamentals, a solid foundation in physical principles, expertise in applied mathematics, and most importantly, an attitude: namely, that hard problems can be solved by breaking them into manageable pieces. In my experience, hiring managers in industry seldom explicitly search for physicists, but they want people with those skills.

  14. Planning and scheduling for agile manufacturers: The Pantex Process Model

    SciTech Connect

    Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Tumquist, M.A.

    1998-02-01

    Effective use of resources that are shared among multiple products or processes is critical for agile manufacturing. This paper describes the development and implementation of a computerized model to support production planning in a complex manufacturing system at the Pantex Plant, a US Department of Energy facility. The model integrates two different production processes (nuclear weapon disposal and stockpile evaluation) that use common facilities and personnel at the plant. The two production processes are characteristic of flow-shop and job-shop operations. The model reflects the interactions of scheduling constraints, material flow constraints, and the availability of required technicians and facilities. Operational results show significant productivity increases from use of the model.
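    The central difficulty the abstract describes, jobs from two production processes competing for a shared pool of technicians, can be illustrated with a toy greedy scheduler. The job names, durations, and capacity below are invented; the actual Pantex model is far richer (material flow, facility constraints):

    ```python
    def greedy_schedule(jobs, capacity):
        """Assign each job the earliest start where technician usage stays
        within the shared capacity during every period the job occupies.

        jobs: list of (name, duration_in_periods, technicians_required).
        Returns a dict mapping job name to its start period.
        """
        usage = {}   # period -> technicians already committed
        plan = {}
        for name, duration, crew in jobs:
            start = 0
            # Slide the job later until no occupied period exceeds capacity.
            while any(usage.get(t, 0) + crew > capacity
                      for t in range(start, start + duration)):
                start += 1
            for t in range(start, start + duration):
                usage[t] = usage.get(t, 0) + crew
            plan[name] = start
        return plan

    # Two dismantlement jobs and one evaluation job share 3 technicians.
    jobs = [("dismantle-A", 2, 2), ("dismantle-B", 2, 2), ("evaluate-C", 1, 1)]
    print(greedy_schedule(jobs, capacity=3))
    ```

    The second dismantlement job is pushed back two periods because running both at once would need four technicians, while the one-person evaluation job slots into the spare capacity at period 0.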

  15. Production planning tools and techniques for agile manufacturing

    SciTech Connect

    Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Turnquist, M.A.

    1996-10-01

    Effective use of resources shared among multiple products or processes is critical for agile manufacturing. This paper describes development and implementation of a computerized model to support production planning in a complex manufacturing system at Pantex Plant. The model integrates two different production processes (nuclear weapon dismantlement and stockpile evaluation) which use common facilities and personnel, and reflects the interactions of scheduling constraints, material flow constraints, and resource availability. These two processes reflect characteristics of flow-shop and job-shop operations in a single facility. Operational results from using the model are also discussed.

  16. Impact of emerging technologies on future combat aircraft agility

    NASA Technical Reports Server (NTRS)

    Nguyen, Luat T.; Gilert, William P.

    1990-01-01

    The foreseeable character of future within-visual-range air combat entails a degree of agility which calls for the integration of high-alpha aerodynamics, thrust vectoring, intimate pilot/vehicle interfaces, and advanced weapons/avionics suites in prospective configurations. The primary technology-development programs currently contributing to these goals are presently discussed; they encompass the F-15 Short Takeoff and Landing/Maneuver Technology Demonstrator Program, the Enhanced Fighter Maneuverability Program, the High Angle-of-Attack Technology Program, and the X-29 Technology Demonstrator Program.

  17. The Southern Argentine Agile Meteor Radar (SAAMER)

    NASA Astrophysics Data System (ADS)

    Janches, Diego

    2014-11-01

    The Southern Argentina Agile Meteor Radar (SAAMER) is a new-generation system deployed in Rio Grande, Tierra del Fuego, Argentina (53 S) in May 2008. SAAMER transmits 10 times more power than regular meteor radars and uses a newly developed transmitting array that focuses power upward instead of the traditional single-antenna all-sky configuration. The system is configured such that the transmitter array can also be utilized as a receiver. The new design greatly increases the sensitivity of the radar, enabling the detection of a large number of particles at low zenith angles. The more concentrated transmitted power enables additional meteor studies beyond those typical of these systems based on the detection of specular reflections, such as routine detections of head echoes and non-specular trails, previously only possible with High Power and Large Aperture radars. In August 2010, SAAMER was upgraded to a system capable of determining meteoroid orbital parameters. This was achieved by adding two remote receiving stations approximately 10 km away from the main site in near-perpendicular directions. The upgrade significantly expands the science that can be achieved with this new radar, enabling us to study the orbital properties of the interplanetary dust environment. Because of its unique geographical location, SAAMER allows for additional inter-hemispheric comparison with measurements from the Canadian Meteor Orbit Radar, which is geographically conjugate. Initial surveys show, for example, that SAAMER observes a very strong contribution of the South Toroidal sporadic meteor source, for which limited observational data are available. In addition, SAAMER offers similarly unique capabilities for meteor shower and stream studies, given that the range of ecliptic latitudes the system covers enables detailed study of showers at high southern latitudes (e.g., the July Phoenicids or the Puppids complex). Finally, SAAMER is ideal for the deployment of complementary instrumentation in both, permanent

  18. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    PubMed

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work.

  19. Development of 3D pseudo pin-by-pin calculation methodology in ANC

    SciTech Connect

    Zhang, B.; Mayhue, L.; Huria, H.; Ivanov, B.

    2012-07-01

    Advanced core and fuel assembly designs have been developed to improve operational flexibility and economic performance and to further enhance the safety features of nuclear power plants. The simulation of these new designs, along with strongly heterogeneous fuel loading, has brought new challenges to the reactor physics methodologies currently employed in the industrial codes for core analyses. Control rod insertion during normal operation is one operational feature of the AP1000(R) plant, the Westinghouse next-generation Pressurized Water Reactor (PWR) design. This design improves operational flexibility and efficiency but significantly challenges the conventional reactor physics methods, especially in pin power calculations. The mixed loading of fuel assemblies with significantly different neutron spectra causes a strong interaction between different fuel assembly types that is not fully captured with the current core design codes. To overcome the weaknesses of the conventional methods, Westinghouse has developed a state-of-the-art 3D Pin-by-Pin Calculation Methodology (P3C) and successfully implemented it in the Westinghouse core design code ANC. The new methodology has been qualified and licensed for pin power prediction. The 3D P3C methodology along with its application and validation will be discussed in the paper. (authors)

  20. Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process

    NASA Astrophysics Data System (ADS)

    Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.

    2015-08-01

    An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model developed within the commercial finite element package ABAQUS (a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library, Scipy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
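    The abstract names Scipy as the optimization engine wrapped around the casting model. A minimal sketch of that pattern, with a toy quadratic standing in for the ABAQUS simulation (the objective, the constraint, and the target start times are all invented for illustration):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def objective(t):
        """Hypothetical surrogate for the FE model: penalty for missing the
        target cooling-channel start times (the real evaluation is an ABAQUS
        run plus a Python results-extraction step)."""
        return (t[0] - 5.0) ** 2 + (t[1] - 8.0) ** 2

    # Directional solidification expressed as an inequality constraint g(t) >= 0:
    # the rim channel must start at least 1 s after the hub channel.
    constraints = [{"type": "ineq", "fun": lambda t: t[1] - t[0] - 1.0}]

    result = minimize(objective, x0=np.array([0.0, 0.0]),
                      method="SLSQP", constraints=constraints)
    print(result.x)  # converges near [5, 8]: targets reachable within constraint
    ```

    In the paper's workflow the objective evaluation is an expensive casting simulation, but the optimizer-plus-constraints structure is the same.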

  1. A METHODOLOGY FOR DEVELOPING A ROADMAP TOWARDS LOCAL LOW-CARBON SOCIETY CONSIDERING IMPLEMENTATION COST

    NASA Astrophysics Data System (ADS)

    Gomi, Kei; Kim, Jaegyu; Matsuoka, Yuzuru

    We have developed a methodology for developing roadmaps towards a low-carbon society at the local government level. A quantification tool called the "Backcasting Tool" (BCT) was developed. BCT estimates an implementation schedule for all policies and actions considering their relationships, the financial constraints of the actors, and the co-benefits of the policies. The methodology was applied in Shiga prefecture, Japan, and a roadmap consisting of more than 240 policies was estimated considering direct costs paid by the public and private sectors. As a result, the cumulative implementation cost was 7.3 trillion yen, of which the public sector bears 17%. Cumulative emission reduction was 101 MtCO2, and the average emission reduction cost was 73 thousand yen/tCO2.

  2. Methodology for characterizing seeds under development for brachytherapy by means of radiochromic and photographic films.

    PubMed

    Meira-Belo, L C; Rodrigues, E J T; Grynberg, S E

    2013-04-01

    The development of new medical devices poses a number of challenges, including designing, constructing, and assaying prototypes. This is also true of new brachytherapy seeds. In this paper, a methodology for rapid dosimetric characterization of (125)I brachytherapy seeds during the early stages of their development is introduced. The characterization methodology is based on the joint use of radiochromic and personal-monitoring photographic films to determine, by means of isodose curves, the planar anisotropy of the radiation field produced by the seed under development. To evaluate and validate the process, isodose curves were obtained with both types of films after irradiation with a commercial (125)I brachytherapy seed.

  3. Methodology for Developing the REScheckTM Software through Version 4.2

    SciTech Connect

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Lucas, R. G.; Schultz, Robert W.; Taylor, Zachary T.; Wiberg, John D.

    2009-08-01

    This report explains the methodology used to develop Version 4.2 of the REScheck software, developed for the 1992, 1993, and 1995 editions of the MEC; the 1998, 2000, 2003, and 2006 editions of the IECC; and the 2006 edition of the International Residential Code (IRC). Although some requirements contained in these codes have changed, the methodology used to develop the REScheck software is similar across these editions. REScheck assists builders in meeting the most complicated part of the code: the building envelope Uo-, U-, and R-value requirements in Section 502 of the code. This document details the calculations and assumptions underlying the treatment of the code requirements in REScheck, with a major emphasis on the building envelope requirements.
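    The envelope Uo check the abstract mentions boils down to an area-weighted average of component U-values. A sketch of that calculation (the component list and values are made-up example numbers, not code requirements):

    ```python
    def overall_uo(components):
        """Area-weighted overall thermal transmittance of the envelope.

        components: list of (u_value, area) pairs for walls, windows, etc.
        Returns total UA divided by total area, the quantity compared against
        the code's Uo requirement.
        """
        total_ua = sum(u * a for u, a in components)
        total_area = sum(a for _, a in components)
        return total_ua / total_area

    # Hypothetical envelope: opaque wall (U=0.08, 1000 sq ft) and
    # windows (U=0.35, 150 sq ft).
    print(round(overall_uo([(0.08, 1000.0), (0.35, 150.0)]), 4))  # 0.1152
    ```

    The trade-off this enables is the point of such tools: a builder can offset leakier windows with better-insulated walls as long as the weighted average stays under the limit.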

  4. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    SciTech Connect

    Not Available

    1987-04-01

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  5. A Model-Based Methodology for Spray-Drying Process Development.

    PubMed

    Dobry, Dan E; Settell, Dana M; Baumann, John M; Ray, Rod J; Graham, Lisa J; Beyerinck, Ron A

    2009-09-01

    Solid amorphous dispersions are frequently used to improve the solubility and, thus, the bioavailability of poorly soluble active pharmaceutical ingredients (APIs). Spray-drying, a well-characterized pharmaceutical unit operation, is ideally suited to producing solid amorphous dispersions due to its rapid drying kinetics. This paper describes a novel flowchart methodology based on fundamental engineering models and state-of-the-art process characterization techniques that ensure that spray-drying process development and scale-up are efficient and require minimal time and API. This methodology offers substantive advantages over traditional process-development methods, which are often empirical and require large quantities of API and long development times. This approach is also in alignment with the current guidance on Pharmaceutical Development Q8(R1). The methodology is used from early formulation-screening activities (involving milligrams of API) through process development and scale-up for early clinical supplies (involving kilograms of API) to commercial manufacturing (involving metric tons of API). It has been used to progress numerous spray-dried dispersion formulations, increasing bioavailability of formulations at preclinical through commercial scales.

  6. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex, due to the increasing number of features as well as the high demands on safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking have become more popular. The latter examine the whole state space and, consequently, provide full coverage. Nevertheless, despite the obvious advantages, this technique is as yet rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive workflow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus; in particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
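    The exhaustive state-space exploration that gives model checking its full coverage, in contrast to testing, can be illustrated with a few lines of breadth-first search. The toy protocol and safety property below are invented; industrial checkers add temporal logics and symbolic state representations on top of this idea:

    ```python
    from collections import deque

    def reachable_states(initial, step):
        """Enumerate every state reachable from `initial` under `step`.

        This is the core of explicit-state model checking: because every
        reachable state is visited, a safety property checked over the result
        holds with full coverage, not just on tested paths.
        """
        seen = {initial}
        frontier = deque([initial])
        while frontier:
            s = frontier.popleft()
            for t in step(s):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return seen

    # Toy system: a counter that wraps modulo 4. Safety property: the counter
    # never reaches 5. Checking it over all reachable states proves it.
    states = reachable_states(0, lambda s: [(s + 1) % 4])
    assert all(s != 5 for s in states)
    print(sorted(states))  # [0, 1, 2, 3]
    ```

    Real aerospace models have vastly larger state spaces, which is why symbolic techniques and tool support matter, but the coverage argument is exactly this one.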

  7. Development and implementation of rotorcraft preliminary design methodology using multidisciplinary design optimization

    NASA Astrophysics Data System (ADS)

    Khalid, Adeel Syed

    Rotorcraft's evolution has lagged behind that of fixed-wing aircraft. One of the reasons for this gap is the absence of a formal methodology to accomplish a complete conceptual and preliminary design. Traditional rotorcraft methodologies are not only time consuming and expensive but also yield sub-optimal designs. Rotorcraft design is an excellent example of a multidisciplinary complex environment where several interdependent disciplines are involved. A formal framework is developed and implemented in this research for preliminary rotorcraft design using the Integrated Product and Process Development (IPPD) methodology. The design methodology consists of the product and process development cycles. In the product development loop, all the technical aspects of design are considered, including the vehicle engineering, dynamic analysis, stability and control, aerodynamic performance, propulsion, transmission design, weight and balance, noise analysis, and economic analysis. The design loop starts with a detailed analysis of requirements. A baseline is selected and upgrade targets are identified depending on the mission requirements. An Overall Evaluation Criterion (OEC) is developed that is used to measure the goodness of the design or to compare the design with competitors. The requirements analysis and baseline upgrade targets lead to the initial sizing and performance estimation of the new design. The digital information is then passed to disciplinary experts. This is where the detailed disciplinary analyses are performed. Information is transferred from one discipline to another as the design loop is iterated. To coordinate all the disciplines in the product development cycle, Multidisciplinary Design Optimization (MDO) techniques, e.g., All-At-Once (AAO) and Collaborative Optimization (CO), are suggested. The methodology is implemented on a Light Turbine Training Helicopter (LTTH) design. Detailed disciplinary analyses are integrated through a common platform for efficient and centralized transfer of design

  8. Development of uncertainty methodology for COBRA-TF void distribution and critical power predictions

    NASA Astrophysics Data System (ADS)

    Aydogan, Fatih

    Thermal hydraulic codes are commonly used tools in licensing processes for the evaluation of various thermal hydraulic scenarios. The uncertainty of a thermal hydraulic code prediction is quantified with uncertainty analyses. The objective of any uncertainty analysis is to determine how well a code predicts, with corresponding uncertainties. If a code has a large output uncertainty, it needs further development and/or model improvements; if it has a small output uncertainty, it needs a maintenance program to preserve that small uncertainty. Uncertainty analysis also indicates where more validation data are needed. Uncertainty analyses for BWR nominal steady-state and transient scenarios are necessary in order to develop and improve the two-phase flow models in thermal hydraulic codes. Because void distribution is the key factor in determining the flow regime and heat transfer regime, and critical power is an important factor for the safety margin, both steady-state void distribution and critical power predictions are important features of a code. An uncertainty analysis for these two phenomena provides valuable results, which can be used in the development of the thermal hydraulic codes employed for designing a BWR bundle or for licensing procedures. This dissertation includes the development of a particular uncertainty methodology for steady-state void distribution and critical power predictions. In this methodology, the PIRT element of CSAU was used to eliminate the low-ranked uncertainty parameters. The SPDF element of GRS was utilized to make the uncertainty methodology flexible in the assignment of PDFs to the uncertainty parameters.
The developed methodology includes uncertainty comparison methods to assess the code precision with the sample-averaged bias, to assess the code spreading with the sample-averaged standard deviation, and to assess the code reliability with the proportion of
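    The GRS-style propagation step the abstract borrows, sampling input PDFs, running the code, and summarizing the errors, can be sketched as follows. The surrogate "code", the measured value, and the assigned PDF are all invented stand-ins for an actual thermal hydraulic calculation:

    ```python
    import random
    import statistics

    MEASURED_VOID = 0.43   # hypothetical measured void fraction from a test

    def code_prediction(k):
        """Toy stand-in for a thermal hydraulic code run with sampled input k."""
        return 0.4 + 0.05 * k

    def uncertainty_metrics(n=100, seed=1):
        """Sample the input PDF, run the 'code' n times, and report the
        sample-averaged bias (code precision) and the sample standard
        deviation of the error (code spreading)."""
        rng = random.Random(seed)
        errors = []
        for _ in range(n):
            k = rng.gauss(0.5, 0.1)   # PDF assigned to the uncertain parameter
            errors.append(code_prediction(k) - MEASURED_VOID)
        return statistics.mean(errors), statistics.stdev(errors)

    bias, spread = uncertainty_metrics()
    print(bias, spread)  # small negative bias, small spread for this toy model
    ```

    A large spread from such a run is the signal, per the abstract, that the underlying models need further development.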

  9. A review of the Technologies Enabling Agile Manufacturing program

    SciTech Connect

    Gray, W.H.; Neal, R.E.; Cobb, C.K.

    1996-10-01

    Addressing a technical plan developed in consultation with major US manufacturers, software and hardware providers, and government representatives, the Technologies Enabling Agile Manufacturing (TEAM) program is leveraging the expertise and resources of industry, universities, and federal agencies to develop, integrate, and deploy leap-ahead manufacturing technologies. One of the TEAM program's goals is to transition products from design to production faster, more efficiently, and at less cost. TEAM's technology development strategy also provides all participants with early experience in establishing and working within an electronic enterprise that includes access to high-speed networks and high-performance computing and storage systems. The TEAM program uses the cross-cutting tools it collects, develops, and integrates to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, sheet metal forming, and electro-mechanical assembly. This paper reviews the current status of the TEAM program with emphasis upon TEAM's information infrastructure.

  10. Agile development of ontologies through conversation

    NASA Astrophysics Data System (ADS)

    Braines, Dave; Bhattal, Amardeep; Preece, Alun D.; de Mel, Geeth

    2016-05-01

    Ontologies and semantic systems are necessarily complex but offer great potential in terms of their ability to fuse information from multiple sources in support of situation awareness. Current approaches do not place ontologies directly into the hands of end users in the field but instead hide them away behind traditional applications. We have been experimenting with human-friendly ontologies and conversational interactions to enable non-technical business users to interact with and extend them dynamically. In this paper we outline our approach via a worked example, covering OWL ontologies, ITA Controlled English, sensor/mission matching, and conversational interactions between human and machine agents.

  11. Project Success in Agile Development Software Projects

    ERIC Educational Resources Information Center

    Farlik, John T.

    2016-01-01

    Project success has multiple definitions in the scholarly literature. Research has shown that some scholars and practitioners define project success as the completion of a project within schedule and within budget. Others consider a successful project as one in which the customer is satisfied with the product. This quantitative study was conducted…

  12. A multimedia approach for teaching human embryology: Development and evaluation of a methodology.

    PubMed

    Moraes, Suzana Guimarães; Pereira, Luis Antonio Violin

    2010-12-20

    Human embryology requires students to understand the simultaneous changes in embryos, but students find it difficult to grasp the concepts presented and to visualise the related processes in three dimensions. The aims of this study were to develop and evaluate new educational materials and a teaching methodology based on multimedia approaches to improve the comprehension of human development. The materials, developed at the State University of Campinas, include clinical histories, movies, animations, and ultrasound as well as autopsy images from embryos and foetuses. The series of embryology lectures was divided into two parts. The first part addressed the development of the body's structures, while in the second part, clinical histories and the corresponding materials were shown to the students, who were encouraged to discuss the malformations. The teaching materials were made available through software used by the students in class. At the end of the course, the materials and methodology were evaluated with an attitudinal instrument, interviews, and a knowledge examination. The response rate to the attitudinal instrument was 95.35%, and the response rate to the interviews was 46%. The students approved of the materials and the teaching methodology (reliability of the attitudinal instrument: 0.9057). The exams showed that most students scored above 6.0. A multimedia approach proved useful for addressing an important problem with teaching methods in many medical institutions: the lack of integration between basic sciences and clinical disciplines.

  13. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for MicroElectroMechanical Systems (MEMS) and to analyze designs that determine the stochastic properties of MEMS. This includes completion of a literature survey regarding the Weibull size effect in MEMS and strength testing techniques. Also of interest is the design of a proper test for the Weibull size effect in tensile specimens. The Weibull size effect is a consequence of the stochastic strength response predicted by the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. Another potential item of interest is the analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. Along these lines, work may also be performed on transient fatigue life prediction methodologies.
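    The Weibull size effect the abstract centers on follows directly from the weakest-link form of the Weibull failure probability: at the same stress, a larger stressed volume is more likely to contain a critical flaw. A sketch with invented parameter values (not MEMS material data):

    ```python
    import math

    def failure_probability(stress, volume, sigma0=1.0, m=10.0, v0=1.0):
        """Two-parameter Weibull weakest-link model.

        P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m), so failure probability
        grows with both applied stress and stressed volume; m is the Weibull
        modulus, sigma0 the characteristic strength (all values illustrative).
        """
        return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)

    # Same stress, 10x the volume: the larger specimen is more likely to fail.
    small = failure_probability(0.8, 1.0)
    large = failure_probability(0.8, 10.0)
    print(small, large)  # the size effect: large > small
    ```

    Confirming this volume dependence experimentally is exactly what a "proper test for the Weibull size effect in tensile specimens" would establish.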

  14. Development and exploration of a new methodology for the fitting and analysis of XAS data

    PubMed Central

    Delgado-Jaime, Mario Ulises; Kennepohl, Pierre

    2010-01-01

    A new data analysis methodology for X-ray absorption near-edge spectroscopy (XANES) is introduced and tested using several examples. The methodology has been implemented within the context of a new Matlab-based program discussed in a companion article [Delgado-Jaime et al. (2010), J. Synchrotron Rad. 17, 132-137]. The approach makes use of a Monte Carlo search method to seek appropriate starting points for a fit model, allowing for the generation of a large number of independent fits with minimal user-induced bias. The applicability of this methodology is tested using various data sets on the Cl K-edge XAS data for tetragonal CuCl4^2-, a common reference compound used for calibration and covalency estimation in M-Cl bonds. A new background model function that effectively blends background profiles with spectral features is an important component of the discussed methodology. The development of a robust evaluation function to fit multiple-edge data is discussed, and the implications regarding standard approaches to data analysis are explored within these examples. PMID:20029120
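    The Monte Carlo search for fit starting points can be sketched as below: draw many random parameter sets, score each against the data, and keep the best to seed a conventional local fit. The linear model and parameter ranges are toy stand-ins for an actual XANES peak-plus-background model:

    ```python
    import random

    def model(x, a, b):
        """Toy stand-in for the XANES fit model (edge step plus background)."""
        return a * x + b

    def sse(params, xs, ys):
        a, b = params
        return sum((model(x, a, b) - y) ** 2 for x, y in zip(xs, ys))

    def monte_carlo_start(xs, ys, n_starts=200, seed=0):
        """Draw random starting points and keep the best-scoring one; each
        such survivor would then seed a local optimizer, so many independent
        fits are generated with minimal user-induced bias."""
        rng = random.Random(seed)
        best = None
        for _ in range(n_starts):
            start = (rng.uniform(-10.0, 10.0), rng.uniform(-10.0, 10.0))
            score = sse(start, xs, ys)
            if best is None or score < best[0]:
                best = (score, start)
        return best

    xs = [0.0, 1.0, 2.0, 3.0]
    ys = [1.0, 3.0, 5.0, 7.0]          # synthetic data from y = 2x + 1
    score, start = monte_carlo_start(xs, ys)
    print(score, start)                 # best of 200 random draws
    ```

    Running many such searches from independent seeds is what lets the method compare a large population of fits rather than trusting a single hand-picked starting guess.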

  15. Development of a methodology for the detection of hospital financial outliers using information systems.

    PubMed

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers.
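    The distance-based detection described above can be sketched with mean pairwise Euclidean distances in the space of financial indices; the data and threshold below are invented for illustration, not drawn from the study's diagnosis groups:

```python
import numpy as np

def distance_outliers(cases, z_thresh=2.5):
    """Flag cases whose mean Euclidean distance to all other cases is
    unusually large (a simple stand-in for the paper's method)."""
    cases = np.asarray(cases, dtype=float)
    diff = cases[:, None, :] - cases[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    mean_dist = dist.sum(axis=1) / (len(cases) - 1)
    z = (mean_dist - mean_dist.mean()) / mean_dist.std()
    return np.where(z > z_thresh)[0]

# Hypothetical per-case indices (profitability, cost ratio); the last
# case's income structure deviates strongly from the cluster.
data = [[0.10, 0.55], [0.12, 0.53], [0.09, 0.57], [0.11, 0.54], [0.95, 0.10]]
print(distance_outliers(data, z_thresh=1.5))  # → [4]
```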

  16. Development of a standard methodology for optimizing remote visual display for nuclear-maintenance tasks

    SciTech Connect

    Clarke, M.M.; Garin, J.; Preston-Anderson, A.

    1981-01-01

    The aim of the present study is to develop a methodology for optimizing remote viewing systems for a fuel recycle facility (HEF) being designed at Oak Ridge National Laboratory (ORNL). An important feature of this design involves the Remotex concept: advanced servo-controlled master/slave manipulators, with remote television viewing, will totally replace direct human contact with the radioactive environment. Therefore, the design of optimal viewing conditions is a critical component of the overall man/machine system. A methodology has been developed for optimizing remote visual displays for nuclear maintenance tasks. The usefulness of this approach has been demonstrated by preliminary specification of optimal closed circuit TV systems for such tasks.

  17. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world which tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system development methodologies is considered: rapid application development (RAD) and the effective technical and human implementation of computer-based systems (ETHICS). We examine the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To this end, four aspects are analyzed: social versus technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed in order to examine the feasibility of co-design using these factors. The paper concludes that RAD and ETHICS are suitable for co-design and offers some suggestions for it.

  18. Discovery of new antimalarial chemotypes through chemical methodology and library development

    PubMed Central

    Brown, Lauren E.; Chih-Chien Cheng, Ken; Wei, Wan-Guo; Yuan, Pingwei; Dai, Peng; Trilles, Richard; Ni, Feng; Yuan, Jing; MacArthur, Ryan; Guha, Rajarshi; Johnson, Ronald L.; Su, Xin-zhuan; Dominguez, Melissa M.; Snyder, John K.; Beeler, Aaron B.; Schaus, Scott E.; Inglese, James; Porco, John A.

    2011-01-01

    In an effort to expand the stereochemical and structural complexity of chemical libraries used in drug discovery, the Center for Chemical Methodology and Library Development at Boston University has established an infrastructure to translate methodologies accessing diverse chemotypes into arrayed libraries for biological evaluation. In a collaborative effort, the NIH Chemical Genomics Center determined IC50’s for Plasmodium falciparum viability for each of 2,070 members of the CMLD-BU compound collection using quantitative high-throughput screening across five parasite lines of distinct geographic origin. Three compound classes displaying either differential or comprehensive antimalarial activity across the lines were identified, and the nascent structure activity relationships (SAR) from this experiment used to initiate optimization of these chemotypes for further development. PMID:21498685

  19. U.S. Geological Survey Methodology Development for Ecological Carbon Assessment and Monitoring

    USGS Publications Warehouse

    Zhu, Zhi-Liang; Stackpoole, S.M.

    2009-01-01

    Ecological carbon sequestration refers to the transfer and storage of atmospheric carbon in vegetation, soils, and aquatic environments to help offset the net increase from carbon emissions. Understanding the capacities, associated opportunities, and risks of vegetated ecosystems to sequester carbon provides science information to support the formulation of policies governing climate change mitigation, adaptation, and land-management strategies. Section 712 of the Energy Independence and Security Act (EISA) of 2007 mandates the Department of the Interior to develop a methodology and assess the capacity of our nation's ecosystems for ecological carbon sequestration and greenhouse gas (GHG) flux mitigation. The U.S. Geological Survey (USGS) LandCarbon Project is responding to the Department of the Interior's request to develop a methodology that meets specific EISA requirements.

  20. Implementation of a cooperative methodology to develop organic chemical engineering skills

    NASA Astrophysics Data System (ADS)

    Arteaga, J. F.; Díaz Blanco, M. J.; Toscano Fuentes, C.; Martín Alfonso, J. E.

    2013-08-01

    The objective of this work is to investigate how most of the competences required of engineering students may be developed through an active methodology based on cooperative learning and evaluation. Cooperative learning was employed with the University of Huelva's third-year engineering students. The teaching methodology aims to develop some of the most relevant engineering skills required today, such as the ability to cooperate in finding appropriate information, the ability to solve problems through critical and creative thinking, and the ability to make decisions and communicate effectively. The statistical study carried out supports the hypothesis that comprehensive and well-defined protocols in the development of the subject, together with the rubric and cooperative evaluation, allow students to achieve successful learning.

  1. Discovery of new antimalarial chemotypes through chemical methodology and library development.

    PubMed

    Brown, Lauren E; Chih-Chien Cheng, Ken; Wei, Wan-Guo; Yuan, Pingwei; Dai, Peng; Trilles, Richard; Ni, Feng; Yuan, Jing; MacArthur, Ryan; Guha, Rajarshi; Johnson, Ronald L; Su, Xin-zhuan; Dominguez, Melissa M; Snyder, John K; Beeler, Aaron B; Schaus, Scott E; Inglese, James; Porco, John A

    2011-04-26

    In an effort to expand the stereochemical and structural complexity of chemical libraries used in drug discovery, the Center for Chemical Methodology and Library Development at Boston University has established an infrastructure to translate methodologies accessing diverse chemotypes into arrayed libraries for biological evaluation. In a collaborative effort, the NIH Chemical Genomics Center determined IC50's for Plasmodium falciparum viability for each of 2,070 members of the CMLD-BU compound collection using quantitative high-throughput screening across five parasite lines of distinct geographic origin. Three compound classes displaying either differential or comprehensive antimalarial activity across the lines were identified, and the nascent structure activity relationships (SAR) from this experiment were used to initiate optimization of these chemotypes for further development.

  2. A process for the agile product realization of electro-mechanical devices

    SciTech Connect

    Forsythe, C.; Ashby, M.R.; Benavides, G.L.; Diegert, K.V.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-09-01

    This paper describes a product realization process developed and demonstrated at Sandia by the A-PRIMED (Agile Product Realization for Innovative Electromechanical Devices) project that integrates many of the key components of "agile manufacturing" into a complete, design-to-production process. Evidence indicates that the process has reduced the product realization cycle and assured product quality. Products included discriminators for a robotic quick-change adapter and for an electronic defense system. These discriminators, built using A-PRIMED, met random vibration requirements and had life cycles that far surpassed the performance obtained from earlier efforts.

  3. Lesson Learned from AGILE and LARES ASI Projects About MATED Data Collection and Post Analysis

    NASA Astrophysics Data System (ADS)

    Carpentiero, Rita; Marchetti, Ernesto; Natalucci, Silvia; Portelli, Claudio

    2012-07-01

    ASI has managed and collected data on the project development of two scientific all-Italian missions: AGILE and LARES. Collection of the Model And Test Effectiveness Database (MATED) data, concerning project, AIV (Assembly, Integration and Verification) and NCR (Non-Conformance Report) aspects, has been performed by the Italian Space Agency (ASI) using the available technical documentation of both the AGILE and LARES projects. In this paper some considerations on the need for 'real time' data collection are made, together with a proposal for front-end improvements to this tool. In addition, a preliminary analysis of MATED effectiveness related to the above ASI projects is presented in a bottom-up and post-verification approach.

  4. Development of a Pattern Recognition Methodology for Determining Operationally Optimal Heat Balance Instrumentation Calibration Schedules

    SciTech Connect

    Kurt Beran; John Christenson; Dragos Nica; Kenny Gross

    2002-12-15

    The goal of the project is to enable plant operators to detect, with high sensitivity and reliability, the onset of decalibration drifts in all of the instrumentation used as input to the reactor heat balance calculations. To achieve this objective, the collaborators developed and implemented at DBNPS an extension of the Multivariate State Estimation Technique (MSET) pattern recognition methodology pioneered by Argonne National Laboratory (ANL). The extension was implemented during the second phase of the project and fully achieved the project goal.
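    The core idea of similarity-based state estimation, the family MSET belongs to, can be sketched as follows. The Gaussian kernel and the synthetic sensor states are assumptions for illustration, not the ANL implementation or plant data: an observation close to a remembered state is reconstructed with a small residual, while a drifting sensor would produce a large one.

```python
import numpy as np

def similarity_estimate(D, x, h=0.3):
    """Similarity-weighted estimate of observation x from a memory matrix D
    whose columns are remembered training states. The Gaussian kernel is a
    simple stand-in for MSET's actual similarity operators."""
    w = np.exp(-np.linalg.norm(D - x[:, None], axis=0) ** 2 / (2 * h ** 2))
    return D @ (w / w.sum())

rng = np.random.default_rng(3)
D = rng.normal(size=(4, 50))            # 50 remembered 4-sensor states
x = D[:, 10] + rng.normal(0, 0.01, 4)   # new observation near a known state
residual = np.linalg.norm(similarity_estimate(D, x) - x)
print(f"residual: {residual:.4f}")       # small => no decalibration flagged
```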

  5. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems.

    PubMed

    Takecian, Pedro L; Oikawa, Marcio K; Braghetto, Kelly R; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S; Acker, Susan; Carneiro-Proietti, Anna B F; Sabino, Ester C; Custer, Brian; Busch, Michael P; Ferreira, João E

    2013-06-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development.

  6. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems

    PubMed Central

    Takecian, Pedro L.; Oikawa, Marcio K.; Braghetto, Kelly R.; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S.; Acker, Susan; Carneiro-Proietti, Anna B. F.; Sabino, Ester C.; Custer, Brian; Busch, Michael P.; Ferreira, João E.

    2013-01-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development. PMID:23729945

  7. Developing a methodology to predict PM10 concentrations in urban areas using generalized linear models.

    PubMed

    Garcia, J M; Teodoro, F; Cerdeira, R; Coelho, L M R; Kumar, Prashant; Carvalho, M G

    2016-09-01

    A methodology to predict PM10 concentrations in urban outdoor environments is developed based on generalized linear models (GLMs). The methodology is based on the relationship developed between atmospheric concentrations of air pollutants (i.e. CO, NO2, NOx, VOCs, SO2) and meteorological variables (i.e. ambient temperature, relative humidity (RH) and wind speed) for a city (Barreiro) of Portugal. The model uses air pollution and meteorological data from the Portuguese air quality monitoring station networks. The developed GLM considers PM10 concentrations as the dependent variable, and both the gaseous pollutants and meteorological variables as explanatory independent variables. A logarithmic link function was considered with a Poisson probability distribution. Particular attention was given to cases with air temperatures both below and above 25°C. The best performance of modelled results against the measured data was achieved by the model for air temperatures above 25°C, compared with the model considering all ranges of air temperature and the model considering only temperatures below 25°C. The model was also tested with similar data from another Portuguese city, Oporto, and the results were found to behave similarly. It is concluded that this model and methodology could be adopted for other cities to predict PM10 concentrations when these data are not available from measurements at air quality monitoring stations or by other acquisition means.
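    A minimal sketch of fitting a Poisson GLM with a logarithmic link, as in the methodology above, using iteratively reweighted least squares (IRLS). The predictors and coefficients are synthetic stand-ins for the pollutant and meteorological variables, not the Barreiro data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Assumed predictors standing in for CO, NO2 and air temperature.
co, no2, temp = rng.normal(size=(3, n))
X = np.column_stack([np.ones(n), co, no2, temp])
beta_true = np.array([1.5, 0.4, 0.3, -0.1])
y = rng.poisson(np.exp(X @ beta_true))  # synthetic "PM10" counts

# Fit the Poisson GLM with log link by IRLS (Fisher scoring).
beta = np.zeros(4)
for _ in range(25):
    mu = np.exp(X @ beta)          # mean under the log link
    z = X @ beta + (y - mu) / mu   # working response
    W = mu                         # IRLS weights for Poisson with log link
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

print(np.round(beta, 2))  # ≈ [1.5, 0.4, 0.3, -0.1]
```

    In practice one would use a statistics package for this fit; the loop just makes the log-link/Poisson structure the abstract mentions explicit.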

  8. Development and application of a methodology for a clean development mechanism to avoid methane emissions in closed landfills.

    PubMed

    Janke, Leandro; Lima, André O S; Millet, Maurice; Radetski, Claudemir M

    2013-01-01

    In Brazil, solid waste disposal sites have operated without consideration of environmental criteria, and these areas are characterized by methane (CH4) emissions from the anaerobic degradation of organic matter. The United Nations has made efforts to control this situation through the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol, under which projects that reduce greenhouse gas (GHG) emissions can be financially rewarded with Certified Emission Reductions (CERs) if they respect the requirements established by the Clean Development Mechanism (CDM), such as the use of methodologies approved by the CDM Executive Board (CDM-EB). Thus, a methodology was developed according to the CDM standards for the aeration, excavation and composting of closed Municipal Solid Waste (MSW) landfills. It was submitted to the CDM-EB for assessment and, after its approval, applied to a real case study in Maringá City (Brazil) with a view to avoiding negative environmental impacts due to the production of methane and leachates even after closure. This paper describes the establishment of this CDM-EB-approved methodology to determine baseline emissions, project emissions and the resultant emission reductions with the application of appropriate aeration, excavation and composting practices at closed MSW landfills. A further result obtained through applying the methodology in the landfill case study was that an ex-ante emission reduction of 74,013 tCO2 equivalent could be achieved if the proposed CDM project activity were implemented.

  9. Development of a Probabilistic Assessment Methodology for Evaluation of Carbon Dioxide Storage

    USGS Publications Warehouse

    Burruss, Robert A.; Brennan, Sean T.; Freeman, P.A.; Merrill, Matthew D.; Ruppert, Leslie F.; Becker, Mark F.; Herkelrath, William N.; Kharaka, Yousif K.; Neuzil, Christopher E.; Swanson, Sharon M.; Cook, Troy A.; Klett, Timothy R.; Nelson, Philip H.; Schenk, Christopher J.

    2009-01-01

    This report describes a probabilistic assessment methodology developed by the U.S. Geological Survey (USGS) for evaluation of the resource potential for storage of carbon dioxide (CO2) in the subsurface of the United States as authorized by the Energy Independence and Security Act (Public Law 110-140, 2007). The methodology is based on USGS assessment methodologies for oil and gas resources created and refined over the last 30 years. The resource that is evaluated is the volume of pore space in the subsurface in the depth range of 3,000 to 13,000 feet that can be described within a geologically defined storage assessment unit consisting of a storage formation and an enclosing seal formation. Storage assessment units are divided into physical traps (PTs), which in most cases are oil and gas reservoirs, and the surrounding saline formation (SF), which encompasses the remainder of the storage formation. The storage resource is determined separately for these two types of storage. Monte Carlo simulation methods are used to calculate a distribution of the potential storage size for individual PTs and the SF. To estimate the aggregate storage resource of all PTs, a second Monte Carlo simulation step is used to sample the size and number of PTs. The probability of successful storage for individual PTs or the entire SF, defined in this methodology by the likelihood that the amount of CO2 stored will be greater than a prescribed minimum, is based on an estimate of the probability of containment using present-day geologic knowledge. The report concludes with a brief discussion of needed research data that could be used to refine assessment methodologies for CO2 sequestration.
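    The two-stage Monte Carlo aggregation described above can be sketched as follows; the lognormal size distribution and the trap-count range are assumed placeholders, not USGS figures, but the structure mirrors the methodology: sample individual physical-trap storage sizes, then sample the number of traps to build the aggregate distribution.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_trap_storage(n):
    """Lognormal stand-in for one physical trap's storage size (Mt CO2)."""
    return rng.lognormal(mean=1.0, sigma=0.8, size=n)

# Second Monte Carlo step: sample the number of traps, then their sizes,
# to build the aggregate storage distribution for the assessment unit.
trials = 10_000
totals = np.empty(trials)
for i in range(trials):
    n_traps = rng.integers(5, 21)  # assumed uncertainty in trap count
    totals[i] = sample_trap_storage(n_traps).sum()

p5, p50, p95 = np.percentile(totals, [5, 50, 95])
print(f"aggregate storage (Mt): P5={p5:.0f}, P50={p50:.0f}, P95={p95:.0f}")
```

    Reporting percentiles of the simulated aggregate is what turns the per-trap uncertainty into a probabilistic resource estimate.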

  10. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    PubMed Central

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  11. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    PubMed

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  12. Using Modern Methodologies with Maintenance Software

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance, classified as either class B (mission critical) or class C (mission important). Scheduling tasks is difficult because mission needs must be addressed before any other tasks and often spring up unexpectedly. Keeping track of what everyone is working on is also difficult because each person works on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In a Scrum development environment, teams pick the tasks to be completed within a sprint based on priority, and the team specifies the sprint length, usually a month or less. Scrum is typically used for new development of a single application, and it defines three roles: a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for Scrum: it has many software applications in maintenance, team members working on disparate applications, and many users, and its work is interruptible based on mission needs, issues and requirements. Scrum therefore needed adaptation to MPS; it was chosen because it is adaptable. This paper is about the development of a process for using Scrum with a team that works on disparate, interruptible tasks across multiple software applications.
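    The priority-driven sprint selection described above can be sketched with a toy backlog; the task names, priorities and capacity are invented for illustration, not MPS's actual planning rules:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Task:
    priority: int                     # lower number = more urgent (mission needs first)
    name: str = field(compare=False)  # excluded from ordering comparisons

def plan_sprint(backlog, capacity):
    """Pick the highest-priority tasks that fit the sprint capacity.
    Illustrative only; real Scrum planning also weighs estimates and skills."""
    heap = list(backlog)
    heapq.heapify(heap)
    return [heapq.heappop(heap).name for _ in range(min(capacity, len(heap)))]

backlog = [Task(3, "refactor translator"), Task(1, "mission anomaly fix"),
           Task(2, "sequencing bug"), Task(4, "doc update")]
print(plan_sprint(backlog, capacity=2))  # → ['mission anomaly fix', 'sequencing bug']
```

    An interrupt (a new mission need) would simply be pushed onto the backlog with priority 0 and picked up at the next planning pass.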

  13. Methodology, status and plans for development and assessment of Cathare code

    SciTech Connect

    Bestion, D.; Barre, F.; Faydide, B.

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, along with the general strategy used for developing and assessing the code. Analytical experiments with separate-effect tests and component tests are used for the development and validation of closure laws. Successive revisions of constitutive laws are implemented in successive versions of the code and assessed. System tests or integral tests are used to validate the general consistency of each revision, and each delivery of a code version plus revision is fully assessed and documented. A methodology is being developed to determine the uncertainty in all constitutive laws of the code, using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future development of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), coupling with many other codes (neutronic codes, severe accident codes), and application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the 3-D model.

  14. Fall 2014 SEI Research Review Applying Agile Methods to DoD

    DTIC Science & Technology

    2014-10-28

    Briefing for the Fall 2014 SEI Research Review on applying Agile methods to the DoD (© 2014 Carnegie Mellon University). Activities of the Agile Defense Adoption Proponents Team (ADAPT) include an e-learning Agile course and multiple presentations and program committees: GSAW 2014, Agile 2014, the Contracts in Agile International Meeting, the AFEI/SEI DoD Agile Summit, and GAO working groups.

  15. Development, characterization, and optimization of protein level in date bars using response surface methodology.

    PubMed

    Nadeem, Muhammad; Salim-ur-Rehman; Muhammad Anjum, Faqir; Murtaza, Mian Anjum; Mueen-ud-Din, Ghulam

    2012-01-01

    This project was designed to produce a nourishing date bar with commercial value, especially for school-going children, to meet their body development requirements. The protein level of the date bars was optimized using response surface methodology (RSM). Economical and underutilized sources, namely whey protein concentrate and vetch protein isolates, were explored for protein supplementation. Fourteen date bar treatments were produced using a central composite design (CCD) with 2 variables and 3 levels for each variable. The date bars were then analyzed for nutritional profile. Proximate composition revealed that the addition of whey protein concentrate and vetch protein isolates improved the nutritional profile of the date bars. Protein level, texture, and taste were considerably improved by incorporating 6.05% whey protein concentrate and 4.35% vetch protein isolates in the date bar without affecting any sensory characteristics during storage. Response surface methodology proved to be an economical and effective tool to optimize ingredient levels and to discriminate the interactive effects of the independent variables.
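    The CCD-plus-quadratic-surface workflow described above can be sketched numerically. The 14 design points mirror the treatment count, but the response values below are synthetic, not the study's sensory or protein data, and the optimum is in coded units rather than ingredient percentages:

```python
import numpy as np

# Hypothetical CCD points for two coded factors: whey protein concentrate
# (x1) and vetch protein isolate (x2): 4 factorial, 4 axial, 6 center runs.
a = np.sqrt(2)
x1 = np.array([-1, 1, -1, 1, -a, a, 0, 0, 0, 0, 0, 0, 0, 0], float)
x2 = np.array([-1, -1, 1, 1, 0, 0, -a, a, 0, 0, 0, 0, 0, 0], float)
# Synthetic response (e.g. overall acceptability) with a known peak.
y = 8 - (x1 - 0.4) ** 2 - 0.5 * (x2 - 0.2) ** 2

# Fit the full second-order model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface (the predicted optimum).
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(H, -b[1:3])
print("optimum (coded units):", opt)  # ≈ [0.4, 0.2]
```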

  16. Development, Characterization, and Optimization of Protein Level in Date Bars Using Response Surface Methodology

    PubMed Central

    Nadeem, Muhammad; Salim-ur-Rehman; Muhammad Anjum, Faqir; Murtaza, Mian Anjum; Mueen-ud-Din, Ghulam

    2012-01-01

    This project was designed to produce a nourishing date bar with commercial value, especially for school-going children, to meet their body development requirements. The protein level of the date bars was optimized using response surface methodology (RSM). Economical and underutilized sources, namely whey protein concentrate and vetch protein isolates, were explored for protein supplementation. Fourteen date bar treatments were produced using a central composite design (CCD) with 2 variables and 3 levels for each variable. The date bars were then analyzed for nutritional profile. Proximate composition revealed that the addition of whey protein concentrate and vetch protein isolates improved the nutritional profile of the date bars. Protein level, texture, and taste were considerably improved by incorporating 6.05% whey protein concentrate and 4.35% vetch protein isolates in the date bar without affecting any sensory characteristics during storage. Response surface methodology proved to be an economical and effective tool to optimize ingredient levels and to discriminate the interactive effects of the independent variables. PMID:22792044

  17. Development of a methodology for assessing the safety of embedded software systems

    NASA Technical Reports Server (NTRS)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees, which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety-critical software functions.
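    The notion of a timed fault tree, a logical combination of conditions evaluated at different points in time, can be sketched with a toy two-event example. The event names are invented for illustration; DFM itself derives such trees automatically from the system model:

```python
# Toy timed fault tree: the top event "unsafe state at t" fires when a
# sensor fault at t-1 AND a software overcorrection at t occur together,
# i.e. a conjunction of static conditions at two adjacent time points.

def top_event(trace):
    """trace: list of per-timestep dicts of basic-event booleans."""
    return any(
        trace[t - 1]["sensor_fault"] and trace[t]["overcorrection"]
        for t in range(1, len(trace))
    )

trace = [
    {"sensor_fault": False, "overcorrection": False},
    {"sensor_fault": True,  "overcorrection": False},
    {"sensor_fault": False, "overcorrection": True},   # fault at t-1, overcorrection at t
]
print(top_event(trace))  # → True
```

    Enumerating the traces that make the top event true is, in miniature, how the analysis identifies unsafe execution paths to eliminate.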

  18. A methodology and a web platform for the collaborative development of context-aware systems.

    PubMed

    Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio

    2013-05-10

    Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and, finally, adapt the functionality of the system to that situation, but the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces for programmers, which complicates the involvement of domain experts in the development life-cycle. The participation of users who do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents, as its main contributions, the implementation and evaluation of a web platform and a methodology for the collaborative development of context-aware systems by programmers and domain experts.
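    The acquire, identify-situation, adapt pipeline described above can be sketched as a minimal rule engine; the context fields, situations and adaptations below are invented examples, not the article's platform:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """A situation predicate paired with an adaptation (names are assumed)."""
    situation: Callable[[dict], bool]
    adaptation: str

RULES = [
    Rule(lambda ctx: ctx["location"] == "museum" and ctx["noise_db"] > 70,
         "switch audio guide to subtitles"),
    Rule(lambda ctx: ctx["battery_pct"] < 15, "disable continuous GPS tracking"),
]

def adapt(context):
    """Identify which situations hold in the acquired context and return
    the corresponding adaptations."""
    return [r.adaptation for r in RULES if r.situation(context)]

print(adapt({"location": "museum", "noise_db": 80, "battery_pct": 50}))
# → ['switch audio guide to subtitles']
```

    Expressing rules declaratively like this is what lets non-programmer domain experts contribute: they edit the situation/adaptation pairs rather than framework code.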

  19. Curriculum Development of a Research Laboratory Methodology Course for Complementary and Integrative Medicine Students

    PubMed Central

    Vasilevsky, Nicole; Schafer, Morgan; Tibbitts, Deanne; Wright, Kirsten; Zwickey, Heather

    2015-01-01

    Training in fundamental laboratory methodologies is valuable to medical students because it enables them to understand the published literature, critically evaluate clinical studies, and make informed decisions regarding patient care. It also prepares them for research opportunities that may complement their medical practice. The National College of Natural Medicine's (NCNM) Master of Science in Integrative Medicine Research (MSiMR) program has developed an Introduction to Laboratory Methods course. The objective of the course is to train clinical students how to perform basic laboratory skills, analyze and manage data, and judiciously assess biomedical studies. Here we describe the course development and implementation as it applies to complementary and integrative medicine students. PMID:26500806

  20. Agile text mining for the 2014 i2b2/UTHealth Cardiac risk factors challenge.

    PubMed

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2015-12-01

    This paper describes the use of an agile text mining platform (Linguamatics' Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system.
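
    The F-Score quoted above is the harmonic mean of precision and recall. A quick sketch of the computation (the counts below are illustrative, since the challenge's actual confusion matrix is not given here):

```python
def f_score(tp, fp, fn):
    """F1: harmonic mean of precision (tp/(tp+fp)) and recall (tp/(tp+fn))."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative counts only, chosen to land near the reported 91.7%.
print(round(f_score(tp=917, fp=80, fn=86), 3))  # 0.917
```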

  1. Agile Text Mining for the 2014 i2b2/UTHealth Cardiac Risk Factors Challenge

    PubMed Central

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2016-01-01

    This paper describes the use of an agile text mining platform (Linguamatics’ Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 Challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system. PMID:26209007

  2. Development of an energy-use estimation methodology for the revised Navy Manual MO-303

    SciTech Connect

    Richman, E.E.; Keller, J.M.; Wood, A.G.; Dittmer, A.L.

    1995-01-01

    The U.S. Navy commissioned Pacific Northwest Laboratory (PNL) to revise and/or update the Navy Utilities Targets Manual, NAVFAC MO-303 (U.S. Navy 1972b). The purpose of the project was to produce a current, applicable, and easy-to-use version of the manual for use by energy and facility engineers and staff at all Navy Public Works Centers (PWCs), Public Works Departments (PWDs), Engineering Field Divisions (EFDs), and other related organizations. The revision of the MO-303 manual involved developing a methodology for estimating energy consumption in buildings and ships. This methodology can account for, and equitably allocate, energy consumption within Navy installations. The analyses used to develop this methodology included developing end-use intensities (EUIs) from a vast collection of Navy base metering and billing data. A statistical analysis of the metering data, weather data, and building energy-use characteristics was used to develop appropriate EUI values for use at all Navy bases. A complete Navy base energy reconciliation process was also created for use in allocating all known energy consumption. Initial attempts to use total Navy base consumption values did not produce usable results. A parallel effort using individual building consumption data provided an estimating method that incorporated weather effects. This method produced a set of building EUI values and weather adjustments for use in estimating building energy use. A method of reconciling total site energy consumption was developed based on a "zero-sum" principle. This method provides a way to account for all energy use and apportion part or all of it to buildings and other energy uses when actual consumption is not known. The entire text of the manual was also revised to present a more easily read, understood, and usable document.
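
    One simple reading of the "zero-sum" reconciliation idea is that EUI-based building estimates are scaled so they sum exactly to the known site total. A minimal sketch, with made-up EUI values and areas (the manual's actual procedure may differ):

```python
# Sketch of zero-sum reconciliation: scale estimated building consumption
# so the allocations sum exactly to the metered site total.
def reconcile(site_total, buildings):
    """buildings: {name: (eui_kwh_per_sqft, area_sqft)} -> {name: kWh}."""
    estimates = {n: eui * area for n, (eui, area) in buildings.items()}
    scale = site_total / sum(estimates.values())
    return {n: e * scale for n, e in estimates.items()}

alloc = reconcile(1_000_000, {"barracks": (10, 40_000), "office": (15, 20_000)})
print(alloc, sum(alloc.values()))  # allocations sum exactly to the site total
```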

  3. Preliminary methodology to assess the national and regional impact of U.S. wind energy development on birds and bats

    USGS Publications Warehouse

    Diffendorfer, James E.; Beston, Julie A.; Merrill, Matthew D.; Stanton, Jessica C.; Corum, Margo D.; Loss, Scott R.; Thogmartin, Wayne E.; Johnson, Douglas H.; Erickson, Richard A.; Heist, Kevin W.

    2015-01-01

    Components of the methodology are based on simplifying assumptions and require information that, for many species, may be sparse or unreliable. These assumptions are presented in the report and should be carefully considered when using output from the methodology. In addition, this methodology can be used to recommend species for more intensive demographic modeling or highlight those species that may not require any additional protection because effects of wind energy development on their populations are projected to be small.

  4. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. 1: Introduction

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. This research program has developed a viable methodology for producing small scale rural land use maps in semi-arid developing countries using imagery obtained from orbital multispectral scanners.

  5. Are They All Created Equal? A Comparison of Different Concept Inventory Development Methodologies

    NASA Astrophysics Data System (ADS)

    Lindell, Rebecca S.; Peak, Elizabeth; Foster, Thomas M.

    2007-01-01

    The creation of the Force Concept Inventory (FCI) was a seminal moment for Physics Education Research. Based on the development of the FCI, many more concept inventories have been developed. The problem with the development of all of these concept inventories is there does not seem to be a concise methodology for developing these inventories, nor is there a concise definition of what these inventories measure. By comparing the development methodologies of many common Physics and Astronomy Concept Inventories we can draw inferences about different types of concept inventories, as well as different valid conclusions that can be drawn from the administration of these inventories. Inventories compared include: Astronomy Diagnostic Test (ADT), Brief Electricity and Magnetism Assessment (BEMA), Conceptual Survey in Electricity and Magnetism (CSEM), Diagnostic Exam Electricity and Magnetism (DEEM), Determining and Interpreting Resistive Electric Circuits Concept Test (DIRECT), Energy and Motion Conceptual Survey (EMCS), Force Concept Inventory (FCI), Force and Motion Conceptual Evaluation (FMCE), Lunar Phases Concept Inventory (LPCI), Test of Understanding Graphs in Kinematics (TUG-K) and Wave Concept Inventory (WCI).

  6. Frequency-agile microwave components using ferroelectric materials

    NASA Astrophysics Data System (ADS)

    Colom-Ustariz, Jose G.; Rodriguez-Solis, Rafael; Velez, Salmir; Rodriguez-Acosta, Snaider

    2003-04-01

    The non-linear electric field dependence of ferroelectric thin films can be used to design frequency and phase agile components. Tunable components have traditionally been developed using mechanically tuned resonant structures, ferrite components, or semiconductor-based voltage controlled electronics, but they are limited by their frequency performance, high cost, high losses, and integration into larger systems. In contrast, the ferroelectric-based tunable microwave component can easily be integrated into conventional microstrip circuits, and attributes such as small size, light weight, and low loss make these components attractive for broadband and multi-frequency applications. Components that are essential elements in the design of a microwave sensor can be fabricated with ferroelectric materials to achieve tunability over a broad frequency range. It has been reported that with a thin ferroelectric film placed between the top conductor layer and the dielectric material of a microstrip structure, and the proper DC bias scheme, tunable components above the Ku band can be fabricated. Components such as phase shifters, coupled line filters, and Lange couplers have been reported in the literature using this technique. In this work, simulated results from a full wave electromagnetic simulator are obtained to show the tunability of a matching network typically used in the design of microwave amplifiers and antennas. In addition, simulated results of a multilayer Lange coupler, and a patch antenna are also presented. The results show that typical microstrip structures can be easily modified to provide frequency agile capabilities.
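
    The link between bias tunability and frequency agility can be illustrated with the textbook LC resonance formula: a resonator's frequency scales as 1/sqrt(C), so lowering the film's permittivity (and hence capacitance) with DC bias raises the resonant frequency. The component values below are illustrative, not measured film data:

```python
import math

def resonant_freq_hz(l_henry, c_farad):
    """Textbook LC resonance: f = 1 / (2*pi*sqrt(L*C))."""
    return 1 / (2 * math.pi * math.sqrt(l_henry * c_farad))

f0 = resonant_freq_hz(1e-9, 1e-12)    # zero-bias capacitance (illustrative)
f1 = resonant_freq_hz(1e-9, 0.5e-12)  # bias halves the capacitance
print(f1 / f0)  # sqrt(2) ~ 1.414: a 41% upward frequency shift
```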

  7. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review.

    PubMed

    Pollard, Beth; Johnston, Marie; Dixon, Diane

    2007-03-07

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest) and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general the patient self-report measures had good methodological development, the
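
    One standard reliability statistic used in the kind of psychometric evaluation described above is Cronbach's alpha for internal consistency. A small self-contained sketch (the item scores are invented for illustration):

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of totals).
def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per item."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three items scored by four respondents (illustrative data):
scores = [[3, 4, 5, 2], [3, 5, 4, 2], [4, 4, 5, 1]]
print(round(cronbach_alpha(scores), 2))  # 0.92
```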

  8. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    NASA Astrophysics Data System (ADS)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of a broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge for territorial management in a practical and effective point of view. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in all this process but still few researchers are investing in the development of a proper methodology. The assessment methodologies that were published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need of further research. This work aims to develop an effective methodology for the assessment of as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with sufficient degree of detail. 
It is expected to attain the proper procedures in order to assess geodiversity
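
    A common simple form of quantitative geodiversity assessment is a grid-based index: the count of distinct abiotic element classes falling in each grid cell. A minimal sketch of that GIS routine (the grid, coordinates, and element classes below are illustrative):

```python
from collections import defaultdict

def geodiversity_index(records, cell_size):
    """records: (x, y, element_class) tuples; returns index per grid cell."""
    cells = defaultdict(set)
    for x, y, element in records:
        cells[(int(x // cell_size), int(y // cell_size))].add(element)
    return {cell: len(elements) for cell, elements in cells.items()}

data = [(1, 1, "granite"), (2, 3, "fossil"), (3, 2, "podzol"), (12, 1, "granite")]
print(geodiversity_index(data, cell_size=10))  # {(0, 0): 3, (1, 0): 1}
```

    Real assessments weight and combine several such layers (lithology, landforms, soils), but the cell-counting core is the same.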

  9. Risk assessment methodology applied to counter IED research & development portfolio prioritization

    SciTech Connect

    Shevitz, Daniel W; O' Brien, David A; Zerkle, David K; Key, Brian P; Chavez, Gregory M

    2009-01-01

    In an effort to protect the United States from the ever increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T), has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the ''best answer'' for the available resources, DHS S&T would like some ''risk based'' process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new ''quad chart'' and rank it, compared to all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model using an Excel spreadsheet containing weights of the subjective merits of different technologies. The methodology produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for logic models combined with LANL software called INFTree for evidential reasoning.
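
    The weighted aggregation at the core of such a spreadsheet can be sketched in a few lines. The criteria, weights, and proposal scores below are invented for illustration, not taken from the DHS S&T model:

```python
# Hypothetical criteria and weights standing in for the subjective merits.
WEIGHTS = {"threat_reduction": 0.5, "maturity": 0.3, "cost": 0.2}

def rank(proposals):
    """proposals: {name: {criterion: score 0-10}}; returns names best-first."""
    agg = {name: sum(WEIGHTS[c] * s for c, s in scores.items())
           for name, scores in proposals.items()}
    return sorted(agg, key=agg.get, reverse=True)

print(rank({
    "detector_A": {"threat_reduction": 8, "maturity": 5, "cost": 6},
    "jammer_B":   {"threat_reduction": 6, "maturity": 9, "cost": 4},
}))  # ['detector_A', 'jammer_B']
```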

  10. Rolling and tumbling: status of the SuperAGILE experiment

    NASA Astrophysics Data System (ADS)

    Del Monte, E.; Costa, E.; di Persio, G.; Donnarumma, I.; Evangelista, Y.; Feroci, M.; Lapshov, I.; Lazzarotto, F.; Mastropietro, M.; Morelli, E.; Pacciani, L.; Rapisarda, M.; Rubini, A.; Soffitta, P.; Tavani, M.; Argan, A.; Trois, A.

    2010-07-01

    The SuperAGILE experiment is the hard X-ray monitor of the AGILE mission. It is a 2 x one-dimensional imager, with 6-arcmin angular resolution in the energy range 18 - 60 keV and a field of view in excess of 1 steradian. SuperAGILE has been operating successfully in orbit since summer 2007, providing long-term monitoring of bright sources and prompt detection and localization of gamma-ray bursts. Starting in October 2009 the AGILE mission lost its reaction wheel and the satellite attitude is no longer stabilized. The current mode of operation of the AGILE satellite is a Spinning Mode, around the Sun-pointing direction, with an angular velocity of about 0.8 degree/s (corresponding to 8 times the SuperAGILE point spread function every second). In these new conditions, SuperAGILE continuously scans a much larger fraction of the sky, with much smaller exposure to each region. In this paper we review some of the results of the first 2.5 years of "standard" operation of SuperAGILE, and show how new implementations in the data analysis software allow SuperAGILE to continue monitoring the hard X-ray sky in the new attitude conditions.
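
    The figure quoted above is simple unit arithmetic: 0.8 degree/s divided by the 6-arcmin point spread function is 8 PSF widths per second.

```python
# Spin rate in PSF widths per second: convert degrees to arcminutes first.
spin_deg_per_s = 0.8
psf_arcmin = 6
psf_per_second = spin_deg_per_s * 60 / psf_arcmin
print(psf_per_second)  # 8.0
```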

  11. Appropriate Methodology for Assessing the Economic Development Impacts of Wind Power

    SciTech Connect

    NWCC Economic Development Work Group

    2003-12-17

    OAK-B135 Interest in wind power development is growing as a means of expanding local economies. Such development holds promise as a provider of short-term employment during facility construction and long-term employment from ongoing facility operation and maintenance. It may also support some expansion of the local economy through ripple effects resulting from initial increases in jobs and income. However, there is a need for a theoretically sound method for assessing the economic impacts of wind power development. These ripple effects stem from subsequent expenditures for goods and services made possible by first-round income from the development, and are expressed in terms of a multiplier. If the local economy offers a wide range of goods and services the resulting multiplier can be substantial--as much as three or four. If not, then much of the initial income will leave the local economy to buy goods and services from elsewhere. Loss of initial income to other locales is referred to as a leakage. Northwest Economic Associates (NEA), under contract to the National Wind Coordinating Committee (NWCC), investigated three case study areas in the United States where wind power projects were recently developed. The full report, ''Assessing the Economic Development Impacts of Wind Power,'' is available at NWCC's website http://www.nationalwind.org/. The methodology used for that study is summarized here in order to provide guidance for future studies of the economic impacts of other wind power developments. The methodology used in the NEA study was specifically designed for these particular case study areas; however, it can be generally applied to other areas. Significant differences in local economic conditions and the amount of goods and services that are purchased locally as opposed to imported from outside the area will strongly influence the results obtained. 
Listed below are some of the key tasks that interested parties should undertake to develop a reasonable picture of
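
    The multiplier/leakage relationship described above can be sketched in simple Keynesian form: each dollar of first-round income is re-spent locally at some rate, with the remainder leaking out. The spending shares below are illustrative:

```python
def income_multiplier(local_share):
    """Total impact per dollar of initial income: 1/(1 - local_share)."""
    return 1 / (1 - local_share)

print(income_multiplier(0.5))   # 2.0: half of each dollar re-spent locally
print(income_multiplier(0.75))  # 4.0: diverse local economy, small leakage
```

    A local spending share of 0.67 to 0.75 reproduces the "three or four" multiplier range mentioned in the abstract; heavy leakage drives the multiplier toward 1.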

  12. Methodology for optimizing the development and operation of gas storage fields

    SciTech Connect

    Mercer, J.C.; Ammer, J.R.; Mroz, T.H.

    1995-04-01

    The Morgantown Energy Technology Center is pursuing the development of a methodology that uses geologic modeling and reservoir simulation for optimizing the development and operation of gas storage fields. Several Cooperative Research and Development Agreements (CRADAs) will serve as the vehicle to implement this product. CRADAs have been signed with National Fuel Gas and Equitrans, Inc. A geologic model is currently being developed for the Equitrans CRADA. Results from the CRADA with National Fuel Gas are discussed here. The first phase of the CRADA, based on original well data, was completed last year and reported at the 1993 Natural Gas RD&D Contractors Review Meeting. Phase 2 analysis was completed based on additional core and geophysical well log data obtained during a deepening/relogging program conducted by the storage operator. Good matches, within 10 percent, of wellhead pressure were obtained using a numerical simulator to history match 2 1/2 injection/withdrawal cycles.

  13. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1993-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles such as the National Aero-Space Plane (NASP) and the High-Speed Civil Transport (HSCT) at NASA-LaRC. These analyses require knowledge of the temperature within the structures which consequently necessitates the need for thermal property data. The initial goal of this research effort was to develop a methodology for the estimation of thermal properties of aerospace structural materials at room temperature and to develop a procedure to optimize the estimation process. The estimation procedure was implemented utilizing a general purpose finite element code. In addition, an optimization procedure was developed and implemented to determine critical experimental parameters to optimize the estimation procedure. Finally, preliminary experiments were conducted at the Aircraft Structures Branch (ASB) laboratory.
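
    The estimation idea above, choosing the thermal property value that best matches measured temperatures, can be sketched as a brute-force least-squares fit on a toy steady-state conduction model T(x) = T0 - (q/k)x. All numbers are illustrative, not from the NASA-LaRC experiments:

```python
# Pick the candidate conductivity k minimizing the sum of squared errors
# between the model prediction and the "measured" temperatures.
def estimate_conductivity(xs, temps, t0, q, candidates):
    def sse(k):
        return sum((t0 - (q / k) * x - t) ** 2 for x, t in zip(xs, temps))
    return min(candidates, key=sse)

xs = [0.01, 0.02, 0.03]                       # sensor positions, m
true_k = 50.0
temps = [100 - (5000 / true_k) * x for x in xs]  # synthetic "measurements"
print(estimate_conductivity(xs, temps, t0=100, q=5000,
                            candidates=[10, 25, 50, 100]))  # 50
```

    The optimization of experimental parameters mentioned in the abstract amounts to choosing sensor positions and heating that make this minimum as sharp as possible.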

  14. Structured system engineering methodologies used to develop a nuclear thermal propulsion engine

    NASA Technical Reports Server (NTRS)

    Corban, R.; Wagner, R.

    1993-01-01

    To facilitate the development of a space nuclear thermal propulsion engine for manned flights to Mars, requirements must be established early in the technology development cycle. The long lead times for the acquisition of the engine system and nuclear test facilities demand that the engine system size, performance and safety goals be defined at the earliest possible time. These systems are highly complex and require a large multidisciplinary systems engineering team to develop and track requirements, and to ensure that the as-built system reflects the intent of the mission. A methodology has been devised which uses sophisticated computer tools to effectively develop and interpret functional requirements, and furnish these to the specification level for implementation.

  15. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  16. Development of Rapid Earthquake Loss Assessment Methodologies for Euro-Med Region

    NASA Astrophysics Data System (ADS)

    Erdik, M.

    2009-04-01

    For almost-real time estimation of the ground shaking and losses after a major earthquake in the Euro-Mediterranean region the JRA-3 component of the EU Project entitled "Network of research Infrastructures for European Seismology, NERIES" foresees: 1. Finding of the most likely location of the source of the earthquake using regional seismotectonic data base, supported, if and when possible, by the estimation of fault rupture parameters from rapid inversion of data from on-line regional broadband stations. 2. Estimation of the spatial distribution of selected ground motion parameters at engineering bedrock through region specific ground motion attenuation relationships and/or actual physical simulation of ground motion. 3. Estimation of the spatial distribution of site-specific selected ground motion parameters using regional geology (or urban geotechnical information) data-base using appropriate amplification models. 4. Estimation of the losses and uncertainties at various orders of sophistication (buildings, casualties) Main objective of the JRA-3 work package is to develop a methodology for real time estimation of losses after a major earthquake in the Euro-Mediterranean region. The multi-level methodology being developed together with researchers from Imperial College, NORSAR and ETH-Zurich is capable of incorporating regional variabilities and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. A comprehensive methodology has been developed and the related software ELER is under preparation. The applications of the ELER software are presented in the following two accompanying papers. 1. Regional Earthquake Shaking and Loss Estimation 2. Urban Earthquake Shaking and Loss Assessment
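
    The ground motion attenuation relationship in step 2 typically has the generic form ln(PGA) = a + b*M - c*ln(R'), where R' combines epicentral distance with a depth term. A sketch with invented placeholder coefficients, not a published GMPE:

```python
import math

def ln_pga(magnitude, distance_km, a=-4.0, b=0.9, c=1.2, h=6.0):
    """Generic attenuation form: ln(PGA in g) = a + b*M - c*ln(sqrt(R^2 + h^2)).
    Coefficients are illustrative placeholders only."""
    return a + b * magnitude - c * math.log(math.hypot(distance_km, h))

# Predicted shaking decays with distance for a fixed magnitude-7 event:
print(round(math.exp(ln_pga(7.0, 10)), 3), round(math.exp(ln_pga(7.0, 100)), 3))
```

    Region-specific studies fit the coefficients a, b, c, h to local strong-motion data; the loss layers then sit on top of the resulting shaking maps.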

  17. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1987-01-01

    The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large scale knowledge bases which can be used to support multiple applications. To insure the validity of its results, this research is being pursued in the context of a real world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  18. Development of a Reliable Fuel Depletion Methodology for the HTR-10 Spent Fuel Analysis

    SciTech Connect

    Chung, Kiwhan; Beddingfield, David H.; Geist, William H.; Lee, Sang-Yoon

    2012-07-03

    A technical working group was formed in 2007 between NNSA and CAEA to develop a reliable fuel depletion method for HTR-10 based on MCNPX and to analyze the isotopic inventory and radiation source terms of the HTR-10 spent fuel. Conclusions of this presentation are: (1) Established a fuel depletion methodology and demonstrated its safeguards application; (2) Proliferation resistant at high discharge burnup ({approx}80 GWD/MtHM) - Unfavorable isotopics, high number of pebbles needed, harder to reprocess pebbles; (3) SF should remain under safeguards comparable to that of LWR; and (4) Diversion scenarios not considered, but can be performed.

  19. Developing purchasing strategy: a case study of a District Health Authority using soft systems methodology.

    PubMed

    Brown, A D

    1997-02-01

    This paper examines the attempt by a District Health Authority (DHA) to create structures (called Purchasing Strategy Groups or PSGs) to facilitate the effective development of its purchasing strategy. The paper is based on a case study design conducted using Soft Systems Methodology (SSM). The research contribution the paper makes is twofold. First, it analyses some of the fundamental management-related difficulties that a DHA can experience when attempting to come to terms with its role and responsibilities in the 1990s. Second, it provides a discussion and evaluation of the utility of SSM for qualitative research in the National Health Service (NHS) in the UK.

  20. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND Computer I Program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
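
    A toy version of the SIMRAND idea: each alternative is a set of tasks with uncertain costs, and Monte Carlo sampling plus a utility function picks the preferred alternative. The distributions and the risk-neutral utility below are illustrative, not SIMRAND's actual models:

```python
import random

def expected_utility(task_cost_ranges, utility, trials=10_000, seed=1):
    """Monte Carlo estimate of expected utility over summed task costs."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        cost = sum(rng.uniform(lo, hi) for lo, hi in task_cost_ranges)
        total += utility(cost)
    return total / trials

u = lambda cost: -cost            # risk-neutral: utility falls linearly with cost
alt_a = [(1, 3), (2, 4)]          # two cheap but uncertain tasks
alt_b = [(2.5, 3.5), (2.5, 3.5)]  # two moderate, predictable tasks
best = max([("A", alt_a), ("B", alt_b)], key=lambda p: expected_utility(p[1], u))
print(best[0])  # A: lower expected cost wins under a risk-neutral utility
```

    With a risk-averse (concave) utility the predictable alternative can overtake the cheaper, more uncertain one, which is exactly the trade-off the methodology is meant to expose.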

  1. The AIV quick look and health monitoring system of the AGILE payload

    NASA Astrophysics Data System (ADS)

    Bulgarelli, Andrea; Gianotti, Fulvio; Trifoglio, Massimo; Di Cocco, Guido; Tavani, Marco; Marisaldi, Martino

    2008-07-01

    AGILE is an ASI (Italian Space Agency) Small Scientific Mission dedicated to high-energy astrophysics which was launched on April 23, 2007 from the Satish Dhawan Space Centre, India, on a PSLV-C8 rocket. The AGILE Payload is composed of three instruments: a Tungsten-Silicon Tracker designed to detect and image photons in the 30 MeV-50 GeV energy band, an X-ray imager called SuperAGILE that works in the 18-60 keV energy band, and a Minicalorimeter that detects gamma-rays or particle energy deposits between 300 keV and 200 MeV. The instrument is surrounded by an anti-coincidence (AC) system. We have developed a set of Quick Look software tools in the framework of the Test Equipment (TE) and the Electrical Ground Support Equipment (EGSE). This software is required in order to support all the assembly, integration and verification (AIV) activities to be carried out for the AGILE mission, from data handling unit level to payload integrated level, calibration campaign, launch campaign and in-orbit commissioning. These software tools have enabled us to test the engineering performance and to perform a health check of the Payload during the various phases. We have used an incremental development approach and a common framework to rapidly adapt our software to the different requirements of the various phases.
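
    The core of a quick-look health check is comparing housekeeping telemetry against nominal limits and flagging out-of-range channels. A minimal sketch; the channel names and limits are invented, not from the AGILE EGSE:

```python
# Hypothetical nominal limits per housekeeping channel.
LIMITS = {"tracker_temp_C": (-10, 40), "hv_bias_V": (380, 420)}

def health_check(telemetry):
    """Return the names of channels whose values fall outside their limits."""
    return [name for name, value in telemetry.items()
            if not LIMITS[name][0] <= value <= LIMITS[name][1]]

print(health_check({"tracker_temp_C": 22, "hv_bias_V": 432}))  # ['hv_bias_V']
```

    Keeping the limits as data is what lets such a tool be re-configured quickly across AIV, calibration, launch, and commissioning phases.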

  2. Information Models, Data Requirements, and Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data the requirements associated with data have begun to dominate and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition the use of these data requirements during systems development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.
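
    "Configured directly from the information model" can be sketched as model-driven validation: a machine-readable model (here a plain dict standing in for an ontology) drives data checking with no product-specific code. The class and attribute names below are illustrative, not from the PDS4 IM:

```python
# A toy information model: classes, attributes, types, and required flags.
MODEL = {
    "Observation": {
        "target": {"type": str, "required": True},
        "exposure_s": {"type": float, "required": False},
    }
}

def validate(class_name, record):
    """Check a record against the model; return a list of violations."""
    errors = []
    for attr, spec in MODEL[class_name].items():
        if attr not in record:
            if spec["required"]:
                errors.append(f"missing required attribute: {attr}")
        elif not isinstance(record[attr], spec["type"]):
            errors.append(f"wrong type for {attr}")
    return errors

print(validate("Observation", {"exposure_s": 30.0}))  # flags missing 'target'
```

    Changing the model immediately changes the system's behavior, which is what makes such architectures amenable to the adaptive, iterative style described above.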

  3. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempts of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper will describe the methodology used for developing a visiting vehicle risk model.
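    The abstract does not disclose the internals of the NASA risk models. As a hedged illustration of the traffic-driven risk aggregation it describes, the sketch below combines notional per-docking collision probabilities (hypothetical placeholder values, not figures from the actual models) under a simple independence assumption:

```python
# Hypothetical sketch: aggregating per-docking collision risk across a year of
# ISS visiting-vehicle traffic, assuming independent events. The probabilities
# below are illustrative placeholders, not values from the NASA risk models.

def aggregate_collision_risk(per_docking_probs):
    """Probability of at least one collision across independent dockings:
    P = 1 - prod(1 - p_i)."""
    p_no_collision = 1.0
    for p in per_docking_probs:
        p_no_collision *= (1.0 - p)
    return 1.0 - p_no_collision

# Notional yearly traffic model: 4 dockings of one vehicle type and 6 of
# another, each with a made-up per-docking collision probability.
traffic = [1e-4] * 4 + [5e-5] * 6
risk = aggregate_collision_risk(traffic)
```

    The point of the rare-event form 1 - prod(1 - p_i) is that it stays valid even when individual probabilities are not negligible, unlike a plain sum.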

  4. Wideband Agile Digital Microwave Radiometer

    NASA Technical Reports Server (NTRS)

    Gaier, Todd C.; Brown, Shannon T.; Ruf, Christopher; Gross, Steven

    2012-01-01

    The objectives of this work were to take the initial steps needed to develop a field-programmable gate array (FPGA)-based wideband digital radiometer back end (>500 MHz bandwidth) that will enable passive microwave observations with minimal performance degradation in a radio-frequency-interference (RFI)-rich environment. As man-made RF emissions increase over time and fill more of the microwave spectrum, microwave radiometer science applications will be increasingly degraded, and the current generation of spaceborne microwave radiometers, which use broadband analog back ends, will become severely compromised or unusable over an increasing fraction of time on orbit. There is a need to develop a digital radiometer back end that, for each observation period, uses digital signal processing (DSP) algorithms to identify the maximum amount of RFI-free spectrum across the radiometer band, preserving bandwidth to minimize radiometer noise (which is inversely related to the bandwidth). Ultimately, the objective is to incorporate in the back end all processing necessary to take contaminated input spectra and produce a single output value free of man-made signals, minimizing data rates for spaceborne radiometer missions. To meet these objectives, however, several intermediate processing algorithms had to be developed, and their performance characterized relative to typical brightness temperature accuracy requirements for current and future microwave radiometer missions, including those for measuring salinity, soil moisture, and snow pack.
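    The inverse relation between bandwidth and radiometer noise mentioned above is the standard ideal total-power radiometer equation, ΔT = T_sys / √(Bτ). A minimal sketch of the bandwidth-vs-noise trade imposed by RFI excision (the system temperature, bandwidth, and excision fraction are illustrative numbers, not values for any particular mission):

```python
import math

def nedt(t_sys_K, bandwidth_Hz, integration_s):
    """Radiometric sensitivity from the ideal total-power radiometer
    equation: delta_T = T_sys / sqrt(B * tau)."""
    return t_sys_K / math.sqrt(bandwidth_Hz * integration_s)

# Illustrative case: a 500 MHz band where RFI excision keeps only 80% of the
# spectrum. Discarding bandwidth raises the noise by 1/sqrt(fraction kept).
full = nedt(500.0, 500e6, 0.1)
flagged = nedt(500.0, 0.8 * 500e6, 0.1)
penalty = flagged / full   # equals 1 / sqrt(0.8), about an 11.8% noise increase
```

    This is why the abstract emphasizes identifying the *maximum* RFI-free spectrum: every hertz of bandwidth discarded costs sensitivity.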

  5. On the development of a strength prediction methodology for fibre metal laminates in pin bearing

    NASA Astrophysics Data System (ADS)

    Krimbalis, Peter Panagiotis

    The development of Fibre Metal Laminates (FMLs) for application in aerospace structures represents a paradigm shift in airframe and material technology. By consolidating monolithic metallic alloys and fibre-reinforced composite layers, a new material structure is born, exhibiting desired qualities that emerge from its heterogeneous constituency. When mechanically fastened via pins, bolts, and rivets, these laminated materials develop damage and ultimately fail via mechanisms that were not entirely understood and that differ from those of either their metallic or composite constituents. A predictive methodology capable of characterizing how pin-fastened FMLs behave and fail would drastically reduce the amount of experimentation necessary for material qualification and be an invaluable design tool. The body of this thesis discusses the extension of the characteristic dimension approach to FMLs and the subsequent development of a new failure mechanism as part of a progressive-damage finite element (FE) modeling methodology, with yielding, delamination, and buckling representing the central tenets of the new mechanism. This yielding through delamination buckling (YDB) mechanism and the progressive FE model were investigated through multiple experimental studies. The experimental investigations required the development of a protocol with emphasis on measuring deformation on a local scheme in addition to a global one. With the extended protocol employed, complete characterization of the material response was possible, and a new definition of yield in a pin bearing configuration was developed and subsequently extended to a tensile testing configuration. The performance of this yield definition was compared directly to existing definitions and was shown to be effective in both quasi-isotropic and orthotropic materials. The results of the experiments and FE simulations demonstrated that yielding (according to the new definition), buckling and delamination

  6. Development of a new methodology to study drop shape and surface tension in electric fields.

    PubMed

    Bateni, A; Susnar, S S; Amirfazli, A; Neumann, A W

    2004-08-31

    Development of a new methodology for the study of both the shape and the surface tension of conducting drops in an electric field is presented. This methodology, called axisymmetric drop shape analysis-electric fields (ADSA-EF), generates numerical drop profiles in an electrostatic field for a given surface tension. It then calculates the true value of the surface tension by matching theoretical profiles to the shape of experimental drops, using the surface tension as an adjustable parameter. ADSA-EF can be employed to simulate and study drop shapes in an electric field and to determine the field's effect on liquid surface tension. The method can also be used to measure surface tension in microgravity, where current drop-shape techniques are not applicable. The axisymmetric shape of the drop is the only assumption made in the development of ADSA-EF. The new scheme is applicable when both gravity and electrostatic forces are present. Preliminary measurements using ADSA-EF suggest that the surface tension of water increases by about 2% when an electric field with a magnitude of 10^6 V/m is applied.
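    The profile-matching step can be sketched as a one-parameter least-squares fit. The toy profile model below is a stand-in for illustration only; the real ADSA-EF integrates the axisymmetric Laplace equation with an electrostatic term, which is omitted here:

```python
# Sketch of the ADSA-EF fitting idea: treat surface tension gamma as the
# single adjustable parameter and pick the value whose theoretical drop
# profile best matches the observed one in a least-squares sense.
# `theoretical_profile` is a hypothetical toy model, NOT the actual
# axisymmetric Laplace/electrostatic profile generator.

def theoretical_profile(gamma, zs):
    # Toy stand-in: drop radius versus height, scaled by gamma.
    return [gamma * (1.0 - z * z) for z in zs]

def fit_surface_tension(observed, zs, candidates):
    """Return the candidate gamma minimizing the sum of squared residuals
    between the theoretical and observed profiles."""
    def sse(gamma):
        model = theoretical_profile(gamma, zs)
        return sum((m - o) ** 2 for m, o in zip(model, observed))
    return min(candidates, key=sse)

zs = [i / 10.0 for i in range(10)]
observed = theoretical_profile(72.0, zs)          # synthetic "experiment"
candidates = [70.0 + 0.5 * k for k in range(10)]  # grid of 70.0 .. 74.5
best = fit_surface_tension(observed, zs, candidates)  # → 72.0
```

    In practice a continuous optimizer would replace the coarse candidate grid, but the structure is the same: the physics lives in the forward profile generator, and the surface tension is recovered by inversion.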

  7. [AIDS/education and prevention: methodologic proposal for the development of educational games].

    PubMed

    Araújo, M F; de Almeida, M I; da Silva, R M

    2000-01-01

    The development of educational resources to mediate actions in health education and AIDS prevention has been a challenge for health educators. Given the undeniable importance of this kind of material in environments favourable to learning, the authors created a methodological proposal for the elaboration of educative games with the purpose of mediating educational measures and the prevention of HIV/AIDS. As a theoretical framework, the study relied on ideas about problem solving adapted from Charles Maguerez's "arch method", which were put into practice in public schools and health institutions in the state of Ceará, having as study population a group of 180 students, aged 13 to 19 years, from 1995 to 1997. Data collection occurred during four workshops, according to a predefined scheme. The results concerning the meaning of the game, place of action, and social environment were obtained from the teenagers while they were engaged in the games and further analysed by them, resulting in the formulation of three educative games (memorAIDS, baralhAIDS and dominAIDS). The authors conclude that the proposal favours participatory action, encouraging throughout its entire process the development of intellectual and creative skills by mobilizing human capacities and exercising associations related to the AIDS epidemic. Given its clarity, the authors consider the proposal scientifically acceptable as a methodological guideline for the elaboration of educative games.

  8. Agile Machining and Inspection Non-Nuclear Report (NNR) Project

    SciTech Connect

    Lazarus, Lloyd

    2009-02-19

    This report is a high-level summary of the eight major projects funded by the Agile Machining and Inspection Non-Nuclear Readiness (NNR) project (FY06.0422.3.04.R1). The largest project of the group is the Rapid Response project, whose six major subcategories are summarized. This project focused on the operations of the machining departments that will comprise Special Applications Machining (SAM) in the Kansas City Responsive Infrastructure Manufacturing & Sourcing (KCRIMS) project. The project upgraded older machine tools, developed new inspection tools, eliminated Classified Removable Electronic Media (CREM) in the handling of classified Numerical Control (NC) programs by installing the CRONOS network, and developed methods to automatically load Coordinate-Measuring Machine (CMM) inspection data into bomb books and product score cards. Finally, the project personnel applied lean principles to the operations of some of the machine tool cells, and now have a model with which to continue this activity.

  9. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    NASA Astrophysics Data System (ADS)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility for routinely assessing the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done using anthropomorphic physical phantoms representing the human body. Because constructing representative physical phantoms is challenging, virtual calibration has been introduced: the use of computational phantoms and the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms, allowing voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty in the in vivo measurement routine, namely the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps optimize the counting measurement. Open-source packages such as MakeHuman and Blender have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, in-house software was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium.
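    The paper's in-house voxel-to-MCNPX converter is not described in detail. As a hedged sketch of the kind of transformation involved, the snippet below run-length encodes a flattened material-ID grid in the style of MCNP's "nR" repeat syntax for a lattice FILL card (a simplified illustration, not the actual converter):

```python
# Hypothetical sketch of the voxel-grid-to-MCNPX conversion step: flatten a
# grid of material IDs and run-length encode it using MCNP-style "nR" repeat
# notation (a value followed by "kR" means "repeat the value k more times").
# Simplified illustration only; the real converter also emits cell, surface,
# and lattice cards.

def rle_fill(voxels_flat):
    """Encode e.g. [1, 1, 1, 2, 2] as '1 2R 2 1R'."""
    out = []
    i = 0
    while i < len(voxels_flat):
        j = i
        while j < len(voxels_flat) and voxels_flat[j] == voxels_flat[i]:
            j += 1
        run = j - i
        out.append(str(voxels_flat[i]) if run == 1
                   else "%d %dR" % (voxels_flat[i], run - 1))
        i = j
    return " ".join(out)

grid = [0, 0, 0, 0, 1, 1, 2, 0, 0]   # toy 1D slice of a voxel phantom
card = rle_fill(grid)                 # → "0 3R 1 1R 2 0 1R"
```

    Run-length encoding matters here because whole-body voxel grids contain millions of cells, most of them contiguous runs of the same tissue or air.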

  10. ASSET: a software tool for the evaluation of manoeuvre capabilities of highly agile satellites

    NASA Astrophysics Data System (ADS)

    Barschke, Merlin F.; Levenhagen, Jens; Reggio, Domenico; Roberts, Peter C. E.

    2014-03-01

    The new generation of agile Earth observation satellites provides much higher observation capabilities than their non-agile predecessors. From a kinematic point of view, these capabilities result in more complex guidance laws for the spacecraft's attitude control system. The computation of these guidance laws is driven by a number of factors. For instance, the Earth's curved shape and its rotation, in combination with the possible scan path geometries, lead to a highly nonlinear relation between the motion of the satellite and the line-of-sight projection onto the Earth. In this paper ASSET (Agile Satellites Scenario Evaluation Tool) is presented. ASSET is a modular MATLAB command-line tool developed at Astrium GmbH, Germany, to assess the manoeuvre capabilities of agile satellites carrying time-delay integration instruments. Each scenario may consist of one or several ground scans, linked by suitable spacecraft slews. Once the entire scenario is defined, ASSET analyses whether the kinematic and dynamic constraints of a specific satellite allow the scenario to be performed and then generates the related guidance profile (angles and angular rates). The satellite's ground track, the projection of the instrument's line of sight, and the projection of the instrument's field of view onto the Earth can be plotted for visual inspection. ASSET can analyse scenarios with several different scan modes typically performed by this type of satellite.

  11. Delaying Mobility Disability in People With Parkinson Disease Using a Sensorimotor Agility Exercise Program

    PubMed Central

    King, Laurie A; Horak, Fay B

    2009-01-01

    This article introduces a new framework for therapists to develop an exercise program to delay mobility disability in people with Parkinson disease (PD). Mobility, or the ability to efficiently navigate and function in a variety of environments, requires balance, agility, and flexibility, all of which are affected by PD. This article summarizes recent research identifying how constraints on mobility specific to PD, such as rigidity, bradykinesia, freezing, poor sensory integration, inflexible program selection, and impaired cognitive processing, limit mobility in people with PD. Based on these constraints, a conceptual framework for exercises to maintain and improve mobility is presented. An example of a constraint-focused agility exercise program, incorporating movement principles from tai chi, kayaking, boxing, lunges, agility training, and Pilates exercises, is presented. This new constraint-focused agility exercise program is based on a strong scientific framework and includes progressive levels of sensorimotor, resistance, and coordination challenges that can be customized for each patient while maintaining fidelity. Principles for improving mobility presented here can be incorporated into an ongoing or long-term exercise program for people with PD. PMID:19228832

  12. Delaying mobility disability in people with Parkinson disease using a sensorimotor agility exercise program.

    PubMed

    King, Laurie A; Horak, Fay B

    2009-04-01

    This article introduces a new framework for therapists to develop an exercise program to delay mobility disability in people with Parkinson disease (PD). Mobility, or the ability to efficiently navigate and function in a variety of environments, requires balance, agility, and flexibility, all of which are affected by PD. This article summarizes recent research identifying how constraints on mobility specific to PD, such as rigidity, bradykinesia, freezing, poor sensory integration, inflexible program selection, and impaired cognitive processing, limit mobility in people with PD. Based on these constraints, a conceptual framework for exercises to maintain and improve mobility is presented. An example of a constraint-focused agility exercise program, incorporating movement principles from tai chi, kayaking, boxing, lunges, agility training, and Pilates exercises, is presented. This new constraint-focused agility exercise program is based on a strong scientific framework and includes progressive levels of sensorimotor, resistance, and coordination challenges that can be customized for each patient while maintaining fidelity. Principles for improving mobility presented here can be incorporated into an ongoing or long-term exercise program for people with PD.

  13. Supply chain network design problem for a new market opportunity in an agile manufacturing system

    NASA Astrophysics Data System (ADS)

    Babazadeh, Reza; Razmi, Jafar; Ghodsi, Reza

    2012-08-01

    The characteristics of today's competitive environment, such as the speed with which products are designed, manufactured, and distributed, and the need for higher responsiveness and lower operational cost, are forcing companies to search for innovative ways to do business. The concept of agile manufacturing has been proposed in response to these challenges. This paper addresses the strategic- and tactical-level decisions in agile supply chain network design. An efficient mixed-integer linear programming model is developed that considers the key characteristics of an agile supply chain, such as direct shipments, outsourcing, different transportation modes, discounts, alliances (process and information integration) between opened facilities, and maximum customer waiting time for deliveries. In addition, in the proposed model the capacity of facilities is treated as a decision variable, whereas it is often assumed to be fixed. Computational results illustrate that the proposed model can serve as a powerful tool in agile supply chain network design, as well as in the integration of strategic decisions with tactical ones.
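    The strategic/tactical coupling in such a model (which facilities to open vs. how to serve customers) can be illustrated with a toy uncapacitated facility-location problem. The paper solves a full mixed-integer linear program; the brute-force enumeration below, with made-up costs, just exposes the same fixed-cost-vs-shipping trade-off:

```python
# Toy illustration (hypothetical data): choose which facilities to open
# (strategic) and how to serve customers (tactical) to minimize total cost.
# Brute force over open/close decisions stands in for the paper's MILP.
from itertools import combinations

fixed_cost = {"F1": 100.0, "F2": 80.0}            # cost of opening a facility
ship = {                                           # per-customer shipping cost
    ("F1", "C1"): 10.0, ("F1", "C2"): 40.0,
    ("F2", "C1"): 30.0, ("F2", "C2"): 15.0,
}
customers = ["C1", "C2"]

def total_cost(open_facilities):
    cost = sum(fixed_cost[f] for f in open_facilities)
    for c in customers:
        # Tactical decision: serve each customer from the cheapest open site.
        cost += min(ship[(f, c)] for f in open_facilities)
    return cost

best = None
for r in range(1, len(fixed_cost) + 1):
    for subset in combinations(fixed_cost, r):
        cand = (total_cost(subset), subset)
        best = cand if best is None or cand < best else best
# best → (125.0, ('F2',)): open F2 only, 80 + 30 + 15 = 125
```

    A real MILP formulation replaces the enumeration with binary open/close variables and assignment variables, which scales to the direct-shipment, outsourcing, and transportation-mode choices the paper lists.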

  14. Recent developments in atomic/nuclear methodologies used for the study of cultural heritage objects

    NASA Astrophysics Data System (ADS)

    Appoloni, Carlos Roberto

    2013-05-01

    Archaeometry has been an established area in the international community since the 1960s, with extensive use of atomic-nuclear methods in the characterization of art, archaeological, and cultural heritage objects in general. In Brazil, however, until the early 1990s, archaeological dating was the only area employing methods from physics. Only after this period did Brazilian groups become involved in the characterization of archaeological and art objects with these methodologies. The Laboratory of Applied Nuclear Physics of the State University of Londrina (LFNA/UEL) pioneered the introduction of Archaeometry and related issues among its priority lines of research in 1994, after a member of LFNA had become involved in 1992 with the possibilities of tomography in archaeometry, as well as the analysis of ancient bronzes by EDXRF. Since then, LFNA has been working with PXRF and portable Raman in several museums in Brazil, in field studies of cave paintings, and in the laboratory with material sent by archaeologists, as well as carrying out collaborative work with new groups that followed in this area. From 2003/2004, LAMFI/DFN/IFUSP and LIN/COPPE/UFRJ began to engage in the area, with ion beam methodologies and PXRF respectively, over time incorporating other techniques; they were later followed by other groups. Due to the growing number of laboratories and of institutions, archaeologists, and conservators interested in these applications, in May 2012 a network of the available laboratories was created, based at http://www.dfn.if.usp.br/lapac. A panel of recent developments and applications of these methodologies by national groups will be presented, as well as a sampling of what has been done by leading groups abroad.

  15. Topobathymetric elevation model development using a new methodology: Coastal National Elevation Database

    USGS Publications Warehouse

    Danielson, Jeffrey J.; Poppenga, Sandra; Brock, John C.; Evans, Gayla A.; Tyler, Dean; Gesch, Dean B.; Thatcher, Cindy; Barras, John

    2016-01-01

    During the coming decades, coastlines will respond to widely predicted sea-level rise, storm surge, and coastal inundation flooding from disastrous events. Because physical processes in coastal environments are controlled by the geomorphology of over-the-land topography and underwater bathymetry, many applications of geospatial data in coastal environments require detailed knowledge of the near-shore topography and bathymetry. In this paper, an updated methodology used by the U.S. Geological Survey Coastal National Elevation Database (CoNED) Applications Project is presented for developing coastal topobathymetric elevation models (TBDEMs) from multiple topographic data sources with adjacent intertidal topobathymetric and offshore bathymetric sources to generate seamlessly integrated TBDEMs. This repeatable, updatable, and logically consistent methodology assimilates topographic data (land elevation) and bathymetry (water depth) into a seamless coastal elevation model. Within the overarching framework, vertical datum transformations are standardized in a workflow that interweaves spatially consistent interpolation (gridding) techniques with a land/water boundary mask delineation approach. Output gridded raster TBDEMs are stacked into a file storage system of mosaic datasets within an Esri ArcGIS geodatabase for efficient updating while maintaining current and updated spatially referenced metadata. Topobathymetric data provide a required seamless elevation product for several science application studies, such as shoreline delineation, coastal inundation mapping, sediment transport, sea-level rise, storm-surge modeling, and tsunami impact assessment. These detailed coastal elevation data are critical for depicting regions prone to climate change impacts and are essential to planners and managers responsible for mitigating the associated risks and costs to both human communities and ecosystems.
The CoNED methodology approach has been used to construct integrated TBDEM models
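    The core merge step described above (a land/water mask selecting between datum-aligned topography and bathymetry) can be sketched cell-wise. This is a simplified, hypothetical illustration; real CoNED processing also handles datum transformation, interpolation across gaps, and nodata propagation, which are omitted here:

```python
# Simplified sketch of the seamless topobathymetric merge: once both grids
# share a common vertical datum, a land/water mask picks topography (land
# elevation) or bathymetry (negative water depth) for each cell.
# Illustrative values only.

def merge_topobathy(topo, bathy, land_mask):
    """Cell-wise merge: land cells take topography, water cells bathymetry."""
    return [t if is_land else b for t, b, is_land in zip(topo, bathy, land_mask)]

topo  = [3.2, 1.1, -9999.0, -9999.0]   # lidar topography (m); -9999 = no data
bathy = [-9999.0, -0.5, -2.0, -7.3]    # sonar bathymetry (m, negative down)
mask  = [True, True, False, False]     # True = land per the boundary mask
merged = merge_topobathy(topo, bathy, mask)   # → [3.2, 1.1, -2.0, -7.3]
```

    The mask is what makes the product "seamless": each source contributes only where it is authoritative, so the land/water boundary carries no elevation discontinuity from mixing sensors.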

  16. Recent developments in atomic/nuclear methodologies used for the study of cultural heritage objects

    SciTech Connect

    Appoloni, Carlos Roberto

    2013-05-06

    Archaeometry is an area established in the international community since the 60s, with extensive use of atomic-nuclear methods in the characterization of art, archaeological and cultural heritage objects in general. In Brazil, however, until the early '90s, employing methods of physics, only the area of archaeological dating was implemented. It was only after this period that Brazilian groups became involved in the characterization of archaeological and art objects with these methodologies. The Laboratory of Applied Nuclear Physics, State University of Londrina (LFNA/UEL) introduced, pioneered in 1994, Archaeometry and related issues among its priority lines of research, after a member of LFNA has been involved in 1992 with the possibilities of tomography in archaeometry, as well as the analysis of ancient bronzes by EDXRF. Since then, LFNA has been working with PXRF and Portable Raman in several museums in Brazil, in field studies of cave paintings and in the laboratory with material sent by archaeologists, as well as carrying out collaborative work with new groups that followed in this area. From 2003/2004 LAMFI/DFN/IFUSP and LIN/COPPE/UFRJ began to engage in the area, respectively with methodologies using ion beams and PXRF, then over time incorporating other techniques, followed later by other groups. Due to the growing number of laboratories and institutions/archaeologists/conservators interested in these applications, in may 2012 was created a network of available laboratories, based at http://www.dfn.if.usp.br/lapac. It will be presented a panel of recent developments and applications of these methodologies by national groups, as well as a sampling of what has been done by leading groups abroad.

  17. Integrated product definition representation for agile numerical control applications

    SciTech Connect

    Simons, W.R. Jr.; Brooks, S.L.; Kirk, W.J. III; Brown, C.W.

    1994-11-01

    Realization of agile manufacturing capabilities for a virtual enterprise requires the integration of technology, management, and work force into a coordinated, interdependent system. This paper focuses on technology-enabling tools for agile manufacturing within a virtual enterprise, specifically relating to Numerical Control (N/C) manufacturing activities and the product definition requirements for these activities.

  18. Development of residential-conservation-survey methodology for the US Air Force. Interim report. Task two

    SciTech Connect

    Abrams, D. W.; Hartman, T. L.; Lau, A. S.

    1981-11-13

    A US Air Force (USAF) Residential Energy Conservation Methodology was developed to compare USAF needs and available data to the procedures of the Residential Conservation Service (RCS) program as developed for general use by utility companies serving civilian customers. Attention was given to the data implications related to group housing, climatic data requirements, life-cycle cost analysis, energy-saving modifications beyond those covered by RCS, and methods for utilizing existing energy consumption data in approaching the USAF survey program. Detailed information and summaries are given on the five subtasks of the program. Energy conservation alternatives are listed, and the basic analysis techniques to be used in evaluating their thermal performance are described. (MCW)

  19. [Methodology for the development and update of practice guidelines: current state].

    PubMed

    Barrera-Cruz, Antonio; Viniegra-Osorio, Arturo; Valenzuela-Flores, Adriana Abigail; Torres-Arreola, Laura Pilar; Dávila-Torres, Javier

    2016-01-01

    The current scenario of health services in Mexico reveals as a priority the implementation of strategies that allow us to better respond to the needs and expectations of individuals and society as a whole, through the provision of efficient and effective alternatives for the prevention, diagnosis, and treatment of diseases. In this context, clinical practice guidelines constitute an element of management in the health care system whose objective is to establish a national benchmark for encouraging clinical and management decision making, based on recommendations from the best available evidence, in order to contribute to the quality and effectiveness of health care. The purpose of this document is to show the methodology used for the development and updating of the clinical practice guidelines that the Instituto Mexicano del Seguro Social has developed in line with the sectorial model, in order to serve the users of these guidelines.

  20. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    SciTech Connect

    Hugo, Jacques

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  1. The Importance of Rapid Auditory Processing Abilities to Early Language Development: Evidence from Converging Methodologies

    PubMed Central

    Thomas, Jennifer J.; Choudhury, Naseem; Leppänen, Paavo H. T.

    2006-01-01

    The ability to process two or more rapidly presented, successive, auditory stimuli is believed to underlie successful language acquisition. Likewise, deficits in rapid auditory processing of both verbal and nonverbal stimuli are characteristic of individuals with developmental language disorders such as Specific Language Impairment. Auditory processing abilities are well developed in infancy, and thus such deficits should be detectable in infants. In the studies presented here, converging methodologies are used to examine such abilities in infants with and without a family history of language disorder. Behavioral measures, including assessments of infant information processing, and an EEG/event-related potential (ERP) paradigm are used concurrently. Results suggest that rapid auditory processing skills differ as a function of family history and are predictive of later language outcome. Further, these paradigms may prove to be sensitive tools for identifying children with poor processing skills in infancy and thus at a higher risk for developing a language disorder. PMID:11891639

  2. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains growing around this ecosystem would be a good choice; the challenge was to leverage them. The use of open-source tools brings challenges beyond merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine the challenges of adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  3. Combat Agility Management System (CAMS)

    NASA Technical Reports Server (NTRS)

    Skow, Andrew; Porada, William

    1994-01-01

    The proper management of energy becomes a complex task in fighter aircraft that have high angle of attack (AOA) capability. Maneuvers at high AOA are accompanied by high bleed rates (velocity decrease), a characteristic that is usually undesirable in a typical combat arena. Under NASA SBIR Phase 1 and NAVAIR SBIR Phase 2 contracts, Eidetics has developed a system that allows a pilot to more easily and effectively manage the trade-off of energy (airspeed or altitude) for turn rate while not imposing hard limits on the high-AOA nose-pointing capability that can be so important in certain air combat maneuver situations. This has been accomplished by incorporating a two-stage angle-of-attack limiter into the flight control laws. The first stage limits AOA to achieve a (selectable) maximum bleed rate, constraining AOA to values that depend on the aircraft attitude and dynamic pressure (or flight path, velocity, and altitude). The second stage sets an AOA limit near the AOA for maximum lift coefficient (CLmax). One of the principal benefits of such a system is that it enables a low-experience pilot to become much more proficient at managing his energy. The Phase 2 simulation work is complete, and an exploratory flight test on the F-18 HARV is planned for the fall of 1994 to demonstrate and validate the concept.
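    The two-stage limiter concept can be sketched as the minimum of two caps applied to the commanded AOA. All thresholds and the bleed-rate function below are notional placeholders, not Eidetics/F-18 HARV control-law values:

```python
# Hedged sketch of a two-stage AOA limiter: stage one caps AOA at a value
# tied to the pilot-selectable maximum bleed rate and the flight condition
# (here a made-up function of dynamic pressure); stage two caps it near the
# AOA for maximum lift coefficient. Numbers are notional, not real F-18 data.

AOA_CLMAX_DEG = 35.0   # notional stage-two limit near CL_max

def stage_one_limit(dynamic_pressure_pa, pilot_selected_bleed):
    # Notional: allowable AOA grows with the selected bleed rate (0..1) and
    # with dynamic pressure, up to a saturation.
    base = 15.0 + 10.0 * pilot_selected_bleed
    return base + min(10.0, dynamic_pressure_pa / 2000.0)

def limited_aoa(commanded_aoa_deg, dynamic_pressure_pa, bleed=0.5):
    """Command passes through unless it exceeds the lower of the two caps."""
    limit = min(stage_one_limit(dynamic_pressure_pa, bleed), AOA_CLMAX_DEG)
    return min(commanded_aoa_deg, limit)

# At low dynamic pressure a high AOA command is clipped by stage one; at high
# dynamic pressure stage two still keeps AOA below the CL_max boundary.
```

    The point of the soft first stage is that raising the selectable bleed rate relaxes the cap for aggressive nose pointing, while stage two remains a hard aerodynamic boundary.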

  4. Development of Hydrologic Characterization Methodology of Faults: Outline of the Project in Berkeley, California

    NASA Astrophysics Data System (ADS)

    Goto, J.; Miwa, T.; Tsuchi, H.; Karasaki, K.

    2009-12-01

    The Nuclear Waste Management Organization of Japan (NUMO) will start a three-staged program for selecting an HLW and TRU waste repository site once volunteer municipalities come forward. Experience from various site characterization programs around the world shows that the hydrologic properties of faults are among the most important parameters in the early stage of such a program. Numerous faults of interest can be expected in an investigation area of several tens of square kilometers. It is, however, impossible to characterize all of these faults within a limited time and budget. This raises the problem, for repository design and safety assessment, that we may have to accept unrealistic or overly conservative results by using a single model or parameter set for all the faults in the area. We therefore seek to develop an efficient and practical methodology for characterizing the hydrologic properties of faults. This project is a five-year program started in 2007, comprising the development of the basic methodology through a literature study and its verification through field investigations. The literature study attempts to classify faults by correlating their geological features with hydraulic properties, to identify the most efficient technologies for fault characterization, and to develop a work flow diagram. The field investigation starts with the selection of a site and fault(s), followed by analyses of existing site data, surface geophysics, geological mapping, trenching, water sampling, a series of borehole investigations, and modeling/analyses. Based on the results of the field investigations, we plan to develop a systematic hydrologic characterization methodology for faults.
    A classification method that correlates combinations of geological features (rock type, fault displacement, fault type, position in a fault zone, fracture zone width, damage zone width) with the widths of high-permeability zones around a fault zone was proposed through a survey on available documents of the site

  5. Associated with aerospace vehicles development of methodologies for the estimation of thermal properties

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1994-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures which consequently necessitates the need for accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy undertaken utilizes a building block approach. The idea here is to first focus on the development of property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of more and more complex systems, such as anisotropic multi-component systems. The estimation methodology utilized is a statistically based method which incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the time of the ASEE summer program. One important aspect involved the calibration of the estimation procedure for the estimation of the thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and then the thermal properties (thermal conductivity and volumetric heat capacity) were estimated at each temperature. Confidence regions for the estimated values were also determined. These results were then compared to documented values. Another set of experimental tests were conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent off-set between the estimated values and the previously determined values was found. 
Another effort

  6. Evaluation of a proposed expert system development methodology: Two case studies

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1990-01-01

    Two expert system development projects were studied to evaluate a proposed Expert Systems Development Methodology (ESDM). The ESDM was developed to provide guidance to managers and technical personnel and serve as a standard in the development of expert systems. It was agreed that the proposed ESDM must be evaluated before it could be adopted; therefore a study was planned for its evaluation. This detailed study is now underway. Before the study began, however, two ongoing projects were selected for a retrospective evaluation. They were the Ranging Equipment Diagnostic Expert System (REDEX) and the Backup Control Mode Analysis and Utility System (BCAUS). Both projects were approximately 1 year into development. Interviews of project personnel were conducted, and the resulting data was used to prepare the retrospective evaluation. Decision models of the two projects were constructed and used to evaluate the completeness and accuracy of key provisions of ESDM. A major conclusion reached from these case studies is that suitability and risk analysis should be required for all AI projects, large and small. Further, the objectives of each stage of development during a project should be selected to reduce the next largest area of risk or uncertainty on the project.

  7. An Initial Meteoroid Stream Survey in the Southern Hemisphere Using the Southern Argentina Agile Meteor Radar (SAAMER)

    NASA Technical Reports Server (NTRS)

    Janches, D.; Hormaechea, J. L.; Brunini, C.; Hocking, W.; Fritts, D. C.

    2013-01-01

    We present in this manuscript a 4 year survey of meteor shower radiants utilizing the Southern Argentina Agile Meteor Radar (SAAMER). SAAMER, which operates at the southernmost region of South America, is a new generation SKiYMET system designed with significant differences from typical meteor radars including high transmitted power and an 8-antenna transmitting array enabling large detected rates at low zenith angles. We applied the statistical methodology developed by Jones and Jones (Jones, J., Jones, W. [2006]. Month. Not. R. Astron. Soc. 367, 1050-1056) to the data collected each day and compiled the results into 1 composite representative year at 1° resolution in Solar Longitude. We then search for enhancements in the activity which last for at least 3 days and evolve temporally as is expected from a meteor shower. Using this methodology, we have identified in our data 32 shower radiants, two of which were not part of the IAU commission 22 meteor shower working list. Recently, SAAMER's capabilities were enhanced by adding two remote stations to receive meteor forward scatter signals from meteor trails and thus enable the determination of meteoroid orbital parameters. SAAMER started recording orbits in January 2012 and future surveys will focus on the search for unknown meteor streams, in particular in the southern ecliptic sky.

  8. An initial meteoroid stream survey in the southern hemisphere using the Southern Argentina Agile Meteor Radar (SAAMER)

    NASA Astrophysics Data System (ADS)

    Janches, D.; Hormaechea, J. L.; Brunini, C.; Hocking, W.; Fritts, D. C.

    2013-04-01

    We present in this manuscript a 4 year survey of meteor shower radiants utilizing the Southern Argentina Agile Meteor Radar (SAAMER). SAAMER, which operates at the southernmost region of South America, is a new generation SKiYMET system designed with significant differences from typical meteor radars including high transmitted power and an 8-antenna transmitting array enabling large detected rates at low zenith angles. We applied the statistical methodology developed by Jones and Jones (Jones, J., Jones, W. [2006]. Month. Not. R. Astron. Soc. 367, 1050-1056) to the data collected each day and compiled the results into 1 composite representative year at 1° resolution in Solar Longitude. We then search for enhancements in the activity which last for at least 3 days and evolve temporally as is expected from a meteor shower. Using this methodology, we have identified in our data 32 shower radiants, two of which were not part of the IAU commission 22 meteor shower working list. Recently, SAAMER's capabilities were enhanced by adding two remote stations to receive meteor forward scatter signals from meteor trails and thus enable the determination of meteoroid orbital parameters. SAAMER started recording orbits in January 2012 and future surveys will focus on the search for unknown meteor streams, in particular in the southern ecliptic sky.
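    The composite-year search described in these records can be illustrated with a toy run-detection pass. This is not the Jones & Jones statistic; it is a hedged stand-in that assumes activity has already been binned into 1° solar-longitude bins (roughly one per day) and simply looks for contiguous enhancements lasting at least three bins, echoing the "at least 3 days" criterion.

```python
def detect_enhancements(binned_counts, threshold, min_run=3):
    """Return (first_bin, last_bin) pairs for runs of at least `min_run`
    consecutive solar-longitude bins whose counts exceed `threshold`."""
    runs, start = [], None
    # Append a sentinel equal to the threshold so a trailing run is closed.
    for i, count in enumerate(list(binned_counts) + [threshold]):
        if count > threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_run:
                runs.append((start, i - 1))
            start = None
    return runs
```

A real shower search would additionally require the radiant position to drift smoothly with solar longitude, which this sketch ignores.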

  9. Development of Methodology to Assess the Failure Behaviour of Bamboo Single Fibre by Acoustic Emission Technique

    NASA Astrophysics Data System (ADS)

    Alam, Md. Saiful; Gulshan, Fahmida; Ahsan, Qumrul; Wevers, Martine; Pfeiffer, Helge; van Vuure, Aart-Willem; Osorio, Lina; Verpoest, Ignaas

    2017-04-01

    Acoustic emission (AE) was used as a tool for detecting, evaluating, and better understanding the damage mechanisms and failure behavior of composites during mechanical loading. A methodology was developed for the tensile testing of natural fibres (bamboo single fibre). A series of experiments was performed, and load drops (one or two) were observed in the load-versus-time graphs. From the observed AE parameters such as amplitude, energy, and duration, significant information corresponding to the load drops was found. These AE signals at the load drops originated from failures such as debonding between two elementary fibres or at the join of elementary fibres at an edge. The load at the first load drop was not consistent across the different samples (for one particular sample the value was 8 N, at a stress of 517.51 MPa). Final breaking of the fibre corresponded to the saturation amplitude of the preamplifier (99.9 dB) for all samples; therefore, it was not possible to determine the exact AE energy value for final breaking. The same methodology was used for the tensile testing of three single fibres, which gave a clear indication of a load drop before the final breaking of the first and second fibres.

  10. Development, testing and implementation of an emergency services methodology in Alberta.

    PubMed

    Eliasoph, H; Ashdown, C

    1995-01-01

    Alberta was the first province in Canada to mandate reporting of hospital-based emergency services. This reporting is based on a workload measurement system that groups emergency visits into five discrete workload levels/classes driven by ICD-9-CM diagnoses. Other related workload measurement variables are incorporated, including admissions, transfers, maintenance monitoring, nursing and non-nursing patient support activities, trips, staff replacement, and personal fatigue and delay. The methodology used to design the reporting system has been subjected to extensive testing, auditing and refinement. The results of one year of province-wide data collection yielded approximately 1.5 million emergency visits. These data reveal consistent patterns/trends of workload that vary by hospital size and type. Although this information can assist in utilization management efforts to predict and compare workload and staffing levels, the impetus for establishing this system derived from its potential for funding hospital-based emergency services. This would be the first time that such services would be funded on a systemic, system-wide basis whereby hospitals would be reimbursed in relation to workload. This proposed funding system would distribute available funding in a consistent, fair and equitable manner across all hospitals providing a similar set of services, thus achieving one of the key goals of the Alberta Acute Care Funding Plan. Ultimately, this proposed funding methodology would be integrated into a broader Ambulatory Care Funding system currently being developed in Alberta.
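    Grouping visits into workload classes by ICD-9-CM diagnosis amounts to a table lookup. The sketch below is hypothetical (the prefix table is invented for illustration; Alberta's actual grouping rules are not reproduced here) and uses longest-prefix matching so that specific codes override broader categories:

```python
def workload_level(icd9_code, class_table):
    """Map an ICD-9-CM code string to a workload level (1..5) by
    longest-prefix match against a grouping table."""
    # Try the most specific (longest) prefixes first.
    for prefix in sorted(class_table, key=len, reverse=True):
        if icd9_code.startswith(prefix):
            return class_table[prefix]
    raise KeyError("no workload class for code " + icd9_code)
```

Example: with the invented table `{"410": 5, "4": 2}`, code "410.9" (acute myocardial infarction) resolves to the specific level 5 entry rather than the catch-all.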

  11. Development of Methodology to Assess the Failure Behaviour of Bamboo Single Fibre by Acoustic Emission Technique

    NASA Astrophysics Data System (ADS)

    Alam, Md. Saiful; Gulshan, Fahmida; Ahsan, Qumrul; Wevers, Martine; Pfeiffer, Helge; van Vuure, Aart-Willem; Osorio, Lina; Verpoest, Ignaas

    2016-06-01

    Acoustic emission (AE) was used as a tool for detecting, evaluating, and better understanding the damage mechanisms and failure behavior of composites during mechanical loading. A methodology was developed for the tensile testing of natural fibres (bamboo single fibre). A series of experiments was performed, and load drops (one or two) were observed in the load-versus-time graphs. From the observed AE parameters such as amplitude, energy, and duration, significant information corresponding to the load drops was found. These AE signals at the load drops originated from failures such as debonding between two elementary fibres or at the join of elementary fibres at an edge. The load at the first load drop was not consistent across the different samples (for one particular sample the value was 8 N, at a stress of 517.51 MPa). Final breaking of the fibre corresponded to the saturation amplitude of the preamplifier (99.9 dB) for all samples; therefore, it was not possible to determine the exact AE energy value for final breaking. The same methodology was used for the tensile testing of three single fibres, which gave a clear indication of a load drop before the final breaking of the first and second fibres.

  12. Development of an Evaluation Methodology for Triple Bottom Line Reports Using International Standards on Reporting

    NASA Astrophysics Data System (ADS)

    Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis

    2009-08-01

    The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria while the generic scoring device was set from 0 to 4 points. Secondly, the proposed benchmark tool was applied to the TBL reports published by Greek companies. Results reveal major gaps in reporting practices, stressing the need for the further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool in conjunction with the Greek case study is discussed while recommendations for future research on the field of this relatively new form of reporting are suggested.
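    The 0-to-4 scoring of GRI-derived criteria can be sketched as a simple aggregation. This is a hypothetical roll-up (the criterion names and the percentage aggregation are assumptions; the paper's actual tool defines its own criteria set from the GRI topics and indicators):

```python
def score_report(criterion_scores, max_points=4):
    """Aggregate per-criterion scores (each on a 0..max_points scale)
    into a percentage of the attainable total."""
    if not criterion_scores:
        raise ValueError("no criteria scored")
    if any(not 0 <= s <= max_points for s in criterion_scores.values()):
        raise ValueError("each criterion must score between 0 and max_points")
    return 100.0 * sum(criterion_scores.values()) / (max_points * len(criterion_scores))
```

A percentage makes reports with different numbers of applicable criteria roughly comparable, which is one plausible design choice for a benchmark tool of this kind.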

  13. Development of an evaluation methodology for triple bottom line reports using international standards on reporting.

    PubMed

    Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis

    2009-08-01

    The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria while the generic scoring device was set from 0 to 4 points. Secondly, the proposed benchmark tool was applied to the TBL reports published by Greek companies. Results reveal major gaps in reporting practices, stressing the need for the further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool in conjunction with the Greek case study is discussed while recommendations for future research on the field of this relatively new form of reporting are suggested.

  14. Development of a 3D numerical methodology for fast prediction of gun blast induced loading

    NASA Astrophysics Data System (ADS)

    Costa, E.; Lagasco, F.

    2014-05-01

    In this paper, the development of a methodology, based on semi-empirical models from the literature, for 3D prediction of the pressure loading on surfaces adjacent to a weapon system during firing is presented. This loading results from the impact of the blast wave generated by the projectile exiting the muzzle bore. When it exceeds a threshold pressure level, the loading can induce unwanted damage to nearby hard structures as well as to frangible panels or electronic equipment. The implemented model can quickly predict the distribution of the blast wave parameters over three-dimensional complex geometry surfaces when the weapon design and emplacement data, as well as the propellant and projectile characteristics, are available. Given these capabilities, the proposed methodology is envisaged as desirable in the preliminary design phase of a combat system to predict adverse effects and to identify the most appropriate countermeasures. By providing a preliminary but sensitive estimate of the operative environmental loading, this numerical means represents a good alternative to more powerful but time-consuming advanced computational fluid dynamics tools, whose use can thus be limited to the final phase of the design.

  15. Development of a Lagrangian-Lagrangian methodology to predict brownout dust clouds

    NASA Astrophysics Data System (ADS)

    Syal, Monica

    A Lagrangian-Lagrangian dust cloud simulation methodology has been developed to help better understand the complicated two-phase nature of the rotorcraft brownout problem. Brownout conditions occur when rotorcraft land or take off from ground surfaces covered with loose sediment such as sand and dust, which decreases the pilot's visibility of the ground and poses a serious safety-of-flight risk. The present work involved the development of a comprehensive, computationally efficient three-dimensional sediment tracking method for dilute, low Reynolds number Stokes-type flows. The flow field generated by a helicopter rotor in ground effect operations over a mobile sediment bed was modeled by using an inviscid, incompressible, Lagrangian free-vortex method, coupled to a viscous semi-empirical approximation for the boundary layer flow near the ground. A new threshold model for the onset of sediment mobility was developed by including the effects of unsteady pressure forces that are induced in vortically dominated rotor flows, which can significantly alter the threshold conditions for particle motion. Other important aspects of particle mobility and uplift in such vortically driven dust flows were also modeled, including bombardment effects when previously suspended particles impact the bed and eject new particles. Bombardment effects were shown to be a particularly significant contributor to the mobilization and eventual suspension of large quantities of smaller-sized dust particles, which tend to remain suspended. A numerically efficient Lagrangian particle tracking methodology was developed where individual particles or clusters of particles were tracked in the flow. To this end, a multi-step, second-order accurate time-marching scheme was developed to solve the numerically stiff equations that govern the dynamics of particle motion. The stability and accuracy of this scheme was examined and matched to the characteristics of the free-vortex method. 
One-way coupling of the
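    The stiffness mentioned above comes from the short velocity-relaxation time of small particles. As a hedged illustration (this is not the thesis's actual multi-step scheme), a single particle under Stokes drag with relaxation time tau obeys dv/dt = (u - v)/tau, which admits an exact exponential velocity update that remains stable even when the time step greatly exceeds tau:

```python
import math

def advance_particle(x, v, u, tau, dt):
    """One time step for a particle under Stokes drag, dv/dt = (u - v)/tau.

    The velocity update is the exact solution over the step (stable for
    arbitrarily stiff tau); the position uses a trapezoidal (second-order)
    rule on the old and new velocities.
    """
    decay = math.exp(-dt / tau)
    v_new = u + (v - u) * decay            # exact relaxation toward the local air velocity u
    x_new = x + 0.5 * dt * (v + v_new)     # trapezoidal position update
    return x_new, v_new
```

An explicit Euler step would require dt on the order of tau for stability, which is prohibitive for micron-scale dust; schemes of this exponential or implicit flavor avoid that restriction.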

  16. Development of the CPXSD Methodology for Generation of Fine-Group Libraries for Shielding Applications

    SciTech Connect

    Alpan, F. Arzu; Haghighat, Alireza

    2005-01-15

    Multigroup cross sections are one of the major factors that cause uncertainties in the results of deterministic transport calculations. Thus, it is important to prepare effective cross-section libraries that include an appropriate group structure and are based on an appropriate spectrum. There are several multigroup cross-section libraries available for particular applications. For example, the 47-neutron, 20-gamma group BUGLE library that is derived from the 199-neutron, 42-gamma group VITAMIN-B6 library is widely used for light water reactor (LWR) shielding and pressure vessel dosimetry applications. However, there is no publicly available methodology that can construct problem-dependent libraries. Thus, the authors have developed the Contributon and Point-wise Cross Section Driven (CPXSD) methodology for constructing effective fine- and broad-group structures. In this paper, new fine-group structures were constructed using the CPXSD, and new fine-group cross-section libraries were generated. The 450-group LIB450 and 589-group LIB589 libraries were developed for neutrons sensitive to the fast and thermal energy ranges, respectively, for LWR shielding problems. As compared to a VITAMIN-B6-like library, the new fine-group library developed for fast neutron dosimetry calculations resulted in closer agreement to the continuous-energy predictions. For example, for the fast neutron cavity dosimetry, ~4% improvement was observed for the ²³⁷Np(n,f) reaction rate. For the thermal neutron ¹H(n,γ) reaction, a maximum improvement of ~14% was observed in the reaction rate at the mid-downcomer position.
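    Whatever group structure is chosen, building the broad-group library ultimately rests on flux-weighted collapsing of fine-group data, a standard technique sketched generically below (CPXSD's contributon-driven selection of the group boundaries themselves is more sophisticated and is not reproduced here):

```python
def collapse_cross_sections(fine_sigma, fine_flux, boundaries):
    """Flux-weighted collapse of fine-group cross sections into broad groups.

    `boundaries` lists fine-group indices delimiting each broad group,
    e.g. [0, 2, 5] defines two broad groups covering fine groups [0,2)
    and [2,5). Each broad-group value is the flux-weighted average of
    the fine-group cross sections it spans.
    """
    broad = []
    for lo, hi in zip(boundaries[:-1], boundaries[1:]):
        weight = sum(fine_flux[lo:hi])
        broad.append(sum(s * f for s, f in zip(fine_sigma[lo:hi],
                                               fine_flux[lo:hi])) / weight)
    return broad
```

The choice of weighting spectrum is exactly why a problem-dependent methodology matters: a library collapsed with an inappropriate spectrum biases the broad-group cross sections.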

  17. Thrust Direction Optimization: Satisfying Dawn's Attitude Agility Constraints

    NASA Technical Reports Server (NTRS)

    Whiffen, Gregory J.

    2013-01-01

    The science objective of NASA's Dawn Discovery mission is to explore the giant asteroid Vesta and the dwarf planet Ceres, the two largest members of the main asteroid belt. Dawn successfully completed its orbital mission at Vesta. The Dawn spacecraft has complex, difficult to quantify, and in some cases severe limitations on its attitude agility. The low-thrust transfers between science orbits at Vesta required very complex time varying thrust directions due to the strong and complex gravity and various science objectives. Traditional low-thrust design objectives (like minimum change in velocity or minimum transfer time) often result in thrust direction time evolutions that cannot be accommodated with the attitude control system available on Dawn. This paper presents several new optimal control objectives, collectively called thrust direction optimization that were developed and turned out to be essential to the successful navigation of Dawn at Vesta.

  18. Thrust Direction Optimization: Satisfying Dawn's Attitude Agility Constraints

    NASA Technical Reports Server (NTRS)

    Whiffen, Gregory J.

    2013-01-01

    The science objective of NASA's Dawn Discovery mission is to explore the two largest members of the main asteroid belt, the giant asteroid Vesta and the dwarf planet Ceres. Dawn successfully completed its orbital mission at Vesta. The Dawn spacecraft has complex, difficult to quantify, and in some cases severe limitations on its attitude agility. The low-thrust transfers between science orbits at Vesta required very complex time varying thrust directions due to the strong and complex gravity and various science objectives. Traditional thrust design objectives (like minimum ΔV or minimum transfer time) often result in thrust direction time evolutions that cannot be accommodated with the attitude control system available on Dawn. This paper presents several new optimal control objectives, collectively called thrust direction optimization, that were developed and proved necessary to successfully navigate Dawn through all orbital transfers at Vesta.

  19. A new algorithm for agile satellite-based acquisition operations

    NASA Astrophysics Data System (ADS)

    Bunkheila, Federico; Ortore, Emiliano; Circi, Christian

    2016-06-01

    Taking advantage of the high manoeuvrability and the accurate pointing of the so-called agile satellites, an algorithm which allows efficient management of the operations concerning optical acquisitions is described. Fundamentally, this algorithm can be subdivided into two parts: in the first, the algorithm performs a geometric classification of the areas of interest and partitions these areas into stripes that develop along the optimal scan directions; in the second, it computes the succession of time windows in which the acquisition operations over the areas of interest are feasible, taking into consideration the potential restrictions associated with these operations and with the geometric and stereoscopic constraints. The results and the performances of the proposed algorithm have been determined and discussed considering the case of the Periodic Sun-Synchronous Orbits.
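    The stripe-partitioning step in the first part of the algorithm can be illustrated with a one-dimensional toy: splitting the across-scan extent of a target area into swath-wide stripes along the chosen scan direction. The function and its parameters are illustrative assumptions, not the paper's algorithm:

```python
import math

def partition_into_stripes(area_width_km, swath_km):
    """Partition the across-scan width of an area of interest into
    stripes no wider than the sensor swath; returns (start, end) offsets
    in km, with the last stripe possibly narrower than the swath."""
    if swath_km <= 0:
        raise ValueError("swath width must be positive")
    n_stripes = math.ceil(area_width_km / swath_km)
    return [(i * swath_km, min((i + 1) * swath_km, area_width_km))
            for i in range(n_stripes)]
```

In the real problem each stripe then becomes a candidate acquisition whose feasibility windows are computed against the satellite's agility and stereo constraints.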

  20. Agile and dexterous robot for inspection and EOD operations

    NASA Astrophysics Data System (ADS)

    Handelman, David A.; Franken, Gordon H.; Komsuoglu, Haldun

    2010-04-01

    The All-Terrain Biped (ATB) robot is an unmanned ground vehicle with arms, legs and wheels designed to drive, crawl, walk and manipulate objects for inspection and explosive ordnance disposal tasks. This paper summarizes on-going development of the ATB platform. Control technology for semi-autonomous legged mobility and dual-arm dexterity is described as well as preliminary simulation and hardware test results. Performance goals include driving on flat terrain, crawling on steep terrain, walking on stairs, opening doors and grasping objects. Anticipated benefits of the adaptive mobility and dexterity of the ATB platform include increased robot agility and autonomy for EOD operations, reduced operator workload and reduced operator training and skill requirements.

  1. Dynamic tumor tracking using the Elekta Agility MLC

    SciTech Connect

    Fast, Martin F.; Nill, Simeon; Bedford, James L.; Oelfke, Uwe

    2014-11-01

    Purpose: To evaluate the performance of the Elekta Agility multileaf collimator (MLC) for dynamic real-time tumor tracking. Methods: The authors have developed new control software which interfaces to the Agility MLC to dynamically program the movement of individual leaves, the dynamic leaf guides (DLGs), and the Y collimators (“jaws”) based on the actual target trajectory. A motion platform was used to perform dynamic tracking experiments with sinusoidal trajectories. The actual target positions reported by the motion platform at 20, 30, or 40 Hz were used as shift vectors for the MLC in beams-eye-view. The system latency of the MLC (i.e., the average latency comprising target device reporting latencies and MLC adjustment latency) and the geometric tracking accuracy were extracted from a sequence of MV portal images acquired during irradiation for the following treatment scenarios: leaf-only motion, jaw + leaf motion, and DLG + leaf motion. Results: The portal imager measurements indicated a clear dependence of the system latency on the target position reporting frequency. Deducting the effect of the target frequency, the leaf adjustment latency was measured to be 38 ± 3 ms for a maximum target speed v of 13 mm/s. The jaw + leaf adjustment latency was 53 ± 3 ms at a similar speed. The system latency at a target position frequency of 30 Hz was in the range of 56–61 ms for the leaves (v ≤ 31 mm/s), 71–78 ms for the jaw + leaf motion (v ≤ 25 mm/s), and 58–72 ms for the DLG + leaf motion (v ≤ 59 mm/s). The tracking accuracy showed a similar dependency on the target position frequency and the maximum target speed. For the leaves, the root-mean-squared error (RMSE) was between 0.6–1.5 mm depending on the maximum target speed. For the jaw + leaf (DLG + leaf) motion, the RMSE was between 0.7–1.5 mm (1.9–3.4 mm). Conclusions: The authors have measured the latency and geometric accuracy of the Agility MLC, facilitating its future use for clinical
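    The latency and RMSE figures above come from comparing target and measured aperture trajectories. A minimal sketch of that kind of analysis (hypothetical helper names; the authors' portal-image pipeline is far more involved) picks the integer-sample delay that minimises the RMS tracking error and converts it to a latency via the sampling interval:

```python
def tracking_rmse(target, measured):
    """Root-mean-square error between equal-length position series (mm)."""
    if len(target) != len(measured):
        raise ValueError("series must have equal length")
    return (sum((t - m) ** 2 for t, m in zip(target, measured)) / len(target)) ** 0.5

def estimate_latency(target, measured, dt, max_shift):
    """Approximate the system latency as the integer-sample shift of
    `measured` that best aligns it with `target`, times the sampling
    interval dt (s). Only shifts 0..max_shift are considered."""
    best_shift = min(range(max_shift + 1),
                     key=lambda k: tracking_rmse(target[:len(target) - k],
                                                 measured[k:]))
    return best_shift * dt
```

This integer-shift search quantises latency to multiples of dt; sub-sample estimates would require interpolation, which is omitted here for brevity.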

  2. Frequency/phase agile microwave circuits on ferroelectric films

    NASA Astrophysics Data System (ADS)

    Romanofsky, Robert Raymond

    This work describes novel microwave circuits that can be tuned in either frequency or phase through the use of nonlinear dielectrics, specifically thin ferroelectric films. These frequency and phase agile circuits in many cases provide a new capability or offer the potential for lower cost alternatives in satellite and terrestrial communications and sensor applications. A brief introduction to nonlinear dielectrics and a summary of some of the special challenges confronting the practical insertion of ferroelectric technology into commercial systems is provided. A theoretical solution for the propagation characteristics of the multi-layer structures, with emphasis on a new type of phase shifter based on coupled microstrip lines, is developed. The quasi-TEM analysis is based on a variational solution for line capacitance and an extension of coupled transmission line theory. It is shown that the theoretical model is applicable to a broad class of multi-layer transmission lines. The critical role that ferroelectric film thickness plays in loss and phase-shift is closely examined. Experimental data for both thin film BaxSr1-xTiO3 phase shifters near room temperature and SMO3 phase shifters at cryogenic temperatures on MgO and LaAlO3 substrates is included. Some of these devices demonstrated an insertion loss of less than 5 dB at Ku-band with continuously variable phase shift in excess of 360 degrees. The performance of these devices is superior to the state-of-the-art semiconductor counterparts. Frequency and phase agile antenna prototypes including a microstrip patch that can operate at multiple microwave frequency bands and a new type of phased array antenna concept called the ferroelectric reflectarray are introduced. Modeled data for tunable microstrip patch antennas is presented for various ferroelectric film thicknesses. A prototype linear phased array, with a conventional beam-forming manifold, and an electronic controller is described. This is the first

  3. ASTATINE-211 RADIOCHEMISTRY: THE DEVELOPMENT OF METHODOLOGIES FOR HIGH ACTIVITY LEVEL RADIOSYNTHESIS

    SciTech Connect

    MICHAEL R. ZALUTSKY

    2012-08-08

    Targeted radionuclide therapy is emerging as a viable approach for cancer treatment because of its potential for delivering curative doses of radiation to malignant cell populations while sparing normal tissues. Alpha particles such as those emitted by 211At are particularly attractive for this purpose because of their short path length in tissue and high energy, making them highly effective in killing cancer cells. The current impact of targeted radiotherapy in the clinical domain remains limited despite the fact that in many cases, potentially useful molecular targets and labeled compounds have already been identified. Unfortunately, putting these concepts into practice has been impeded by limitations in radiochemistry methodologies. A critical problem is that the synthesis of therapeutic radiopharmaceuticals provides additional challenges in comparison to diagnostic reagents because of the need to perform radiosynthesis at high levels of radioactivity. This is particularly important for α-particle emitters such as 211At because they deposit large amounts of energy in a highly focal manner. The overall objective of this project is to develop convenient and reproducible radiochemical methodologies for the radiohalogenation of molecules with the α-particle emitter 211At at the radioactivity levels needed for clinical studies. Our goal is to address two problems in astatine radiochemistry: First, a well known characteristic of 211At chemistry is that yields for electrophilic astatination reactions decline as the time interval after radionuclide isolation from the cyclotron target increases. This is a critical problem that must be addressed if cyclotrons are to be able to efficiently supply 211At to remote users. And second, when the preparation of high levels of 211At-labeled compounds is attempted, the radiochemical yields can be considerably lower than those encountered at tracer dose. For these reasons, clinical evaluation of promising 211At

  4. Agile Data Curation: A conceptual framework and approach for practitioner data management

    NASA Astrophysics Data System (ADS)

    Young, J. W.; Benedict, K. K.; Lenhardt, W. C.

    2015-12-01

    Data management occurs across a range of science and related activities such as decision-support. Exemplars within the science community operate data management systems that are extensively planned before implementation, staffed with robust data management expertise, equipped with appropriate services and technologies, and often highly structured. However, this is not the only approach to data management and almost certainly not the typical experience. The other end of the spectrum is often an ad hoc practitioner team, with changing requirements, limited training in data management, and resource constrained for both equipment and human resources. Much of the existing data management literature serves the exemplar community and ignores the ad hoc practitioners. Somewhere in the middle are examples where data are repurposed for new uses thereby generating new data management challenges. This submission presents a conceptualization of an Agile Data Curation approach that provides foundational principles for data management efforts operating across the spectrum of data generation and use from large science systems to efforts with constrained resources, limited expertise, and evolving requirements. The underlying principles of Agile Data Curation are a reapplication of agile software development principles to data management. The historical reality for many data management efforts is operating in a practitioner environment, so Agile Data Curation utilizes historical and current case studies to validate the foundational principles and through comparison learn lessons for future application. This submission will provide an overview of the Agile Data Curation approach, cover its foundational principles, and introduce a framework for gathering, classifying, and applying lessons from case studies of practitioner data management.

  5. Development of improved methodology for the comparative assessment of potential repository concepts and locations

    SciTech Connect

    Tsuchi, Hiroyuki; Koike, Akihisa; Sato, Shoko; Kawamura, Hideki

    2007-07-01

    NUMO has adopted a volunteering approach to siting a geological repository for high-level radioactive waste (HLW). It is important for this process that the pros and cons of volunteers can be assessed from literature data in a clear and transparent manner, prior to the very careful selection of those sites that will be carried forward for more detailed characterisation. For this purpose, a multi-attribute analysis (MAA) methodology has been developed that allows the technical assessment of criteria to be represented as scoring models. The trickier job of weighting different criteria involves expert opinion, which can be solicited by different methods. In particular, weighting of top-level attributes involves balancing a range of technical and socio-economic issues, which can be examined by considering the viewpoint of different stakeholders. The applicability of the MAA tool and its sensitivity to stakeholder viewpoints have been examined by simple case studies. (authors)
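    The scoring-and-weighting idea behind MAA can be sketched as a simple additive weighting model. The attribute names, scores, and stakeholder weights below are illustrative, not NUMO's actual criteria:

```python
# Minimal sketch of a multi-attribute analysis (MAA) score using simple
# additive weighting. Attributes and weights are illustrative only.

def maa_score(scores, weights):
    """Weighted-sum utility for one candidate site.

    scores  -- dict of attribute -> normalized score in [0, 1]
    weights -- dict of attribute -> weight (normalized here by their sum)
    """
    total_w = sum(weights.values())
    return sum(scores[a] * weights[a] / total_w for a in scores)

# Two stakeholder viewpoints weight the same technical scores differently.
site = {"geology": 0.8, "transport": 0.6, "local_support": 0.4}
technical_view = {"geology": 0.6, "transport": 0.2, "local_support": 0.2}
social_view = {"geology": 0.2, "transport": 0.2, "local_support": 0.6}

print(round(maa_score(site, technical_view), 3))  # geology dominates
print(round(maa_score(site, social_view), 3))     # local support dominates
```

    Running the same scoring model under different weight sets, as above, is one way to expose the sensitivity to stakeholder viewpoints that the abstract mentions.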

  6. Numerical simulation methodologies for design and development of Diffuser-Augmented Wind Turbines - analysis and comparison

    NASA Astrophysics Data System (ADS)

    Michał, Lipian; Maciej, Karczewski; Jakub, Molinski; Krzysztof, Jozwik

    2016-01-01

    Different numerical computation methods used to develop a methodology for fast, efficient, reliable design and comparison of Diffuser-Augmented Wind Turbine (DAWT) geometries are presented. The demand for such methods is evident, following the multitude of geometrical parameters that influence the flow character through ducted turbines. The results of the Actuator Disk Model (ADM) simulations will be confronted with a simulation method of higher order of accuracy, i.e. the 3D Fully-resolved Rotor Model (FRM) in the rotor design point. Both will be checked for consistency with the experimental results measured in the wind tunnel at the Institute of Turbo-machinery (IMP), Lodz University of Technology (TUL). An attempt to find an efficient method (with a compromise between accuracy and design time) for the flow analysis pertinent to the DAWT is a novel approach presented in this paper.

  7. Chapter 43: Assessment of NE Greenland: Prototype for development of Circum-Arctic Resource Appraisal methodology

    USGS Publications Warehouse

    Gautier, D.L.; Stemmerik, L.; Christiansen, F.G.; Sorensen, K.; Bidstrup, T.; Bojesen-Koefoed, J. A.; Bird, K.J.; Charpentier, R.R.; Houseknecht, D.W.; Klett, T.R.; Schenk, C.J.; Tennyson, M.E.

    2011-01-01

    Geological features of NE Greenland suggest large petroleum potential, as well as high uncertainty and risk. The area was the prototype for development of methodology used in the US Geological Survey (USGS) Circum-Arctic Resource Appraisal (CARA), and was the first area evaluated. In collaboration with the Geological Survey of Denmark and Greenland (GEUS), eight "assessment units" (AU) were defined, six of which were probabilistically assessed. The most prospective areas are offshore in the Danmarkshavn Basin. This study supersedes a previous USGS assessment, from which it differs in several important respects: oil estimates are reduced and natural gas estimates are increased to reflect revised understanding of offshore geology. Despite the reduced estimates, the CARA indicates that NE Greenland may be an important future petroleum province. © 2011 The Geological Society of London.

  8. Development of the activation analysis calculational methodology for the Spallation Neutron Source (SNS)

    SciTech Connect

    Odano, N.; Johnson, J.O.; Charton, L.A.; Barnes, J.M.

    1998-03-01

    For the design of the proposed Spallation Neutron Source (SNS), activation analyses are required to determine the radioactive waste streams, on-line material processing requirements, remote handling/maintenance requirements, potential site contamination and background radiation levels. For the conceptual design of the SNS, the activation analyses were carried out using the high-energy transport code HETC96 coupled with MCNP to generate the required nuclide production rates for the ORIHET95 isotope generation code. ORIHET95 utilizes a matrix-exponential method to study the buildup and decay of activities for any system for which the nuclide production rates are known. In this paper, details of the developed methodology adopted for the activation analyses in the conceptual design of the SNS are presented along with some typical results of the analyses.
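    ORIHET95's matrix-exponential method solves the general coupled buildup-and-decay problem; for the special case of a single nuclide produced at a constant rate, the solution reduces to a closed form. The sketch below uses that reduced case with purely illustrative values (the half-life and production rate are not taken from the SNS analysis):

```python
import math

# Closed-form buildup and decay for one nuclide produced at a constant
# rate P (atoms/s) with decay constant lam (1/s). Values illustrative.

def activity_during_irradiation(P, lam, t):
    """Activity A(t) = lam * N(t) = P * (1 - exp(-lam*t)), in Bq."""
    return P * (1.0 - math.exp(-lam * t))

def activity_after_shutdown(A_end, lam, t_cool):
    """Free decay from the end-of-irradiation activity A_end."""
    return A_end * math.exp(-lam * t_cool)

half_life = 3600.0               # 1 h half-life (illustrative)
lam = math.log(2) / half_life
P = 1.0e6                        # production rate, atoms/s

A_end = activity_during_irradiation(P, lam, 10 * half_life)
print(A_end)                                           # near saturation (~P)
print(activity_after_shutdown(A_end, lam, half_life))  # halves per half-life
```

    The matrix-exponential machinery generalizes this to full decay chains, where the decay constants and branching ratios populate a matrix and the vector of nuclide inventories evolves as its exponential.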

  9. Coccolithophore Blooms at High Latitudes as Observed from Space: Developed Methodologies and Results of their Application

    NASA Astrophysics Data System (ADS)

    Kondrik, D.; Pozdnyakov, D.; Pettersson, L.

    2016-08-01

    The methodology developed for delineation of E. huxleyi bloom areas employs spaceborne data. Methods of analysis of the spectral curvature of remote sensing reflectance are applied for generating binary masks. Together with the generated RGB images, this approach permits quantitative assessment of the bloom areas at high latitudes in the Atlantic, Arctic and Pacific oceans. The assessment results are organized as time series of bloom area dynamics across the time period 1998-2013 for the target marine waters. Notable differences are revealed in the established temporal patterns in terms of both interannual variations of bloom extent and the occurrence of bloom extent peaks throughout the year. The results obtained indicate that the mechanisms driving the above specific features are different in nature.
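    As a hedged sketch of the kind of spectral-curvature test the abstract describes: for three reflectance bands ordered in wavelength, one common curvature measure is the ratio of the squared middle-band reflectance to the product of the outer bands, thresholded into a binary mask. The band values and threshold below are illustrative, not the authors' settings:

```python
# Spectral-curvature binary mask sketch. For remote-sensing reflectances
# at three wavelengths R1 < R2 < R3, one common curvature measure is
# R2**2 / (R1 * R3); pixels exceeding a threshold are flagged.
# Band values and threshold are illustrative assumptions.

def curvature(r1, r2, r3):
    return (r2 * r2) / (r1 * r3)

def bloom_mask(pixels, threshold=1.1):
    """Binary mask: True where the spectral curvature exceeds threshold."""
    return [curvature(*p) > threshold for p in pixels]

pixels = [
    (0.010, 0.014, 0.011),  # bright, convex spectrum (bloom-like)
    (0.010, 0.010, 0.010),  # flat spectrum (background water)
]
print(bloom_mask(pixels))  # [True, False]
```

    In the paper's workflow such a mask would then be summed per scene to yield the bloom-area time series.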

  10. Methodology for analyzing and developing information management infrastructure to support telerehabilitation.

    PubMed

    Saptono, Andi; Schein, Richard M; Parmanto, Bambang; Fairman, Andrea

    2009-01-01

    The proliferation of advanced technologies led researchers within the Rehabilitation Engineering Research Center on Telerehabilitation (RERC-TR) to devise an integrated infrastructure for clinical services using the University of Pittsburgh (PITT) model. This model describes five required characteristics for a telerehabilitation (TR) infrastructure: openness, extensibility, scalability, cost-effectiveness, and security. The infrastructure is to deliver clinical services over distance to improve access to health services for people living in underserved or remote areas. The methodological approach to design, develop, and employ this infrastructure is explained and detailed for the remote wheelchair prescription project, a research task within the RERC-TR. The availability of this specific clinical service and personnel outside of metropolitan areas is limited due to the lack of specialty expertise and access to resources. The infrastructure is used to deliver expertise in wheeled mobility and seating through teleconsultation to remote clinics, and has been successfully deployed to five rural clinics in Western Pennsylvania.

  11. Methodology for Analyzing and Developing Information Management Infrastructure to Support Telerehabilitation

    PubMed Central

    Saptono, Andi; Schein, Richard M.; Parmanto, Bambang; Fairman, Andrea

    2009-01-01

    The proliferation of advanced technologies led researchers within the Rehabilitation Engineering Research Center on Telerehabilitation (RERC-TR) to devise an integrated infrastructure for clinical services using the University of Pittsburgh (PITT) model. This model describes five required characteristics for a telerehabilitation (TR) infrastructure: openness, extensibility, scalability, cost-effectiveness, and security. The infrastructure is to deliver clinical services over distance to improve access to health services for people living in underserved or remote areas. The methodological approach to design, develop, and employ this infrastructure is explained and detailed for the remote wheelchair prescription project, a research task within the RERC-TR. The availability of this specific clinical service and personnel outside of metropolitan areas is limited due to the lack of specialty expertise and access to resources. The infrastructure is used to deliver expertise in wheeled mobility and seating through teleconsultation to remote clinics, and has been successfully deployed to five rural clinics in Western Pennsylvania. PMID:25945161

  12. Developing knowledge intensive ideas in engineering education: the application of camp methodology

    NASA Astrophysics Data System (ADS)

    Heidemann Lassen, Astrid; Løwe Nielsen, Suna

    2011-11-01

    Background: Globalization, technological advancement, environmental problems, etc. challenge organizations not just to consider cost-effectiveness, but also to develop new ideas in order to build competitive advantages. Hence, methods to deliberately enhance creativity and facilitate its processes of development must also play a central role in engineering education. However, so far the engineering education literature pays little attention to the important discussion of how to develop knowledge intensive ideas based on creativity methods and concepts. Purpose: The purpose of this article is to investigate how to design creative camps from which knowledge intensive ideas can unfold. Design/method/sample: A framework on integration of creativity and knowledge intensity is first developed, and then tested through the planning, execution and evaluation of a specialized creativity camp with focus on supply chain management. Detailed documentation of the learning processes of the 49 participating engineering and business students is developed through repeated interviews during the process as well as a survey. Results: The research illustrates the process of development of ideas, and how the participants through interdisciplinary collaboration, cognitive flexibility and joint ownership develop highly innovative and knowledge-intensive ideas, with direct relevance for the four companies whose problems they address. Conclusions: The article demonstrates how the creativity camp methodology holds the potential of combining advanced academic knowledge and creativity, to produce knowledge intensive ideas, when the design is based on ideas of experiential learning as well as creativity principles. This makes the method a highly relevant learning approach for engineering students in the search for skills to both develop and implement innovative ideas.

  13. Development and extraction optimization of baicalein and pinostrobin from Scutellaria violacea through response surface methodology

    PubMed Central

    Subramaniam, Shankar; Raju, Ravikumar; Palanisamy, Anbumathi; Sivasubramanian, Aravind

    2015-01-01

    Objective: To develop a process that involves optimization of the amount of baicalein and pinostrobin from the hydro-methanolic extract of the leaves of Scutellaria violacea by response surface methodology (RSM). Materials and Methods: The combinatorial influence of various extraction parameters on the extraction yield was investigated by adopting a Box–Behnken experimental design. Preliminary experiments carried out based on the traditional one-variable-at-a-time optimization revealed four operational parameters to play a crucial role by influencing the yield. These four process parameters at three levels were considered to obtain the Box–Behnken experimental design. Results: The RSM-based model fitted to the resulting experimental data suggested that 52.3% methanol/water, 12.46:1 solvent-solid ratio, 285 rpm agitation and 6.07 h of extraction time are the optimal conditions, which yielded maximum amounts of baicalein and pinostrobin of 2.9 and 4.05 mg/g DM, respectively. Analysis of variance revealed a high correlation coefficient (R2 = 0.999 for baicalein and 0.994 for pinostrobin), signifying a good fit between the regression model (second order) and the experimental observations. Conclusion: The present study reports that both metabolites have been extracted from S. violacea for the first time. Further, this study developed an optimized extraction procedure to obtain maximum yield of the metabolites, which is unique and better than conventional extraction methodology. The operational parameters under optimized conditions account for the lowest cost in the extraction process, thus providing an efficient, rapid and cost-effective method for isolation and scale up of these commercially vital flavonoids. PMID:26109758
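    The layout of a Box–Behnken design for four factors at three levels, as used above, can be sketched directly: each pair of factors takes all four ±1 corner combinations while the remaining factors sit at the centre (0), plus a few centre runs. The construction is the standard one; the number of centre points below is an illustrative choice:

```python
from itertools import combinations, product

# Standard Box-Behnken layout for k three-level factors (coded -1/0/+1):
# every factor pair visits its four (+/-1) corners while all other
# factors are held at the centre, plus replicated centre runs.

def box_behnken(k, center_points=3):
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * k
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([[0] * k for _ in range(center_points)])
    return runs

design = box_behnken(4)
print(len(design))  # 6 pairs * 4 corners + 3 centre runs = 27
```

    A second-order (quadratic) response-surface model is then fitted to the yields measured at these 27 runs, which is what the R2 values in the abstract assess.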

  14. Methodology for urban rail and construction technology research and development planning

    NASA Technical Reports Server (NTRS)

    Rubenstein, L. D.; Land, J. E.; Deshpande, G.; Dayman, B.; Warren, E. H.

    1980-01-01

    A series of transit system visits, organized by the American Public Transit Association (APTA), was conducted in which the system operators identified the most pressing development needs. These varied by property and were reformulated into a series of potential projects. To assist in the evaluation, a data base useful for estimating the present capital and operating costs of various transit system elements was generated from published data. An evaluation model was developed which considered the rate of deployment of the research and development project, potential benefits, development time and cost. An outline of an evaluation methodology that considered benefits other than capital and operating cost savings was also presented. During the course of the study, five candidate projects were selected for detailed investigation; (1) air comfort systems; (2) solid state auxiliary power conditioners; (3) door systems; (4) escalators; and (5) fare collection systems. Application of the evaluation model to these five examples showed the usefulness of modeling deployment rates and indicated a need to increase the scope of the model to quantitatively consider reliability impacts.

  15. Development of a novel spectroscopic methodology for the unique determination of bacterial spores

    NASA Astrophysics Data System (ADS)

    Alexander, Troy A.

    2003-08-01

    A novel methodology has been developed for the determination (i.e., identification and quantification) of bacterial spores that may be useful in many applications; most notably, development of detection schemes toward potentially harmful biological agents such as Bacillus anthracis. In addition, this method would be useful as an environmental warning system where sterility is of importance (i.e., food preparation areas as well as invasive and minimally-invasive medical applications). This method is based on the infrared (1500 to 4000-nm) absorption of fatty acids and peptides extracted from the spore. The absorption spectra of several bacterial spore extracts in carbon disulfide solution have been measured. Further, the groups of absorption bands in this region are unique for each spore, which implies it may be possible to use this technique for their determination. The Bacillus spores studied were chosen because they are taxonomically close to each other as well as to Bacillus anthracis. Expectedly, the measured absorption bands are heavily overlapped since the extracted analytes are similar in structure for each Bacillus spore. Additionally, this makes it impossible to use a single wavelength for the determination of any bacterial spore species. However, it may be possible to use the infrared absorption technique in conjunction with the Partial Least Squares (PLS) regression method to develop statistical models for the determination of bacterial spores. Results will be presented concerning sampling, data treatment, and development of PLS models as well as application of these models in the determination of unknown Bacillus bacterial spores.

  16. Development of a Methodology to Gather Seated Anthropometry in a Microgravity Environment

    NASA Technical Reports Server (NTRS)

    Rajulu, Sudhakar; Young, Karen; Mesloh, Miranda

    2009-01-01

    The Constellation Program's Crew Exploration Vehicle (CEV) is required to accommodate the full population range of crewmembers according to the anthropometry requirements stated in the Human-Systems Integration Requirement (HSIR) document (CxP70024). Seated height is one of many critical dimensions of importance to the CEV designers in determining the optimum seat configuration in the vehicle. Changes in seated height may have a large impact to the design, accommodation, and safety of the crewmembers. Seated height can change due to elongation of the spine when crewmembers are exposed to microgravity. Spinal elongation is the straightening of the natural curvature of the spine and the expansion of inter-vertebral disks. This straightening occurs due to fluid shifts in the body and the lack of compressive forces on the spinal vertebrae. Previous studies have shown that as the natural curvature of the spine straightens, an increase in overall height of 3% of stature occurs which has been the basis of the current HSIR requirements. However due to variations in the torso/leg ratio and impact of soft tissue, data is nonexistent as to how spinal elongation specifically affects the measurement of seated height. In order to obtain this data, an experiment was designed to collect spinal elongation data while in a seated posture in microgravity. The purpose of this study was to provide quantitative data that represents the amount of change that occurs in seated height due to spinal elongation in microgravity environments. Given the schedule and budget constraints of ISS and Shuttle missions and the uniqueness of the problem, a methodology had to be developed to ensure that the seated height measurements were accurately collected. Therefore, simulated microgravity evaluations were conducted to test the methodology and procedures of the experiment. This evaluation obtained seat pan pressure and seated height data to a) ensure that the lap restraint provided sufficient
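    The 3%-of-stature allowance cited from the HSIR amounts to trivial arithmetic, sketched below; how much of this elongation appears specifically in seated height is exactly the open question the experiment targets:

```python
# Illustrative arithmetic for the HSIR spinal-elongation allowance:
# microgravity stature is estimated as 1-g stature plus 3% of stature.

def microgravity_stature(stature_cm, elongation_fraction=0.03):
    return stature_cm * (1.0 + elongation_fraction)

print(microgravity_stature(180.0))  # 185.4 cm for a 180 cm crewmember
```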

  17. A Control Law Design Method Facilitating Control Power, Robustness, Agility, and Flying Qualities Tradeoffs: CRAFT

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Davidson, John B.

    1998-01-01

    A multi-input, multi-output control law design methodology, named "CRAFT", is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The methodology makes use of control law design metrics from each of the four design objective areas. It combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, with a graphical approach for representing the metrics that captures numerous design goals in one composite illustration. Sensitivity of the metrics to eigenspace choice is clearly displayed, enabling the designer to assess the cost of design tradeoffs. This approach enhances the designer's ability to make informed design tradeoffs and to reach effective final designs. An example of the CRAFT methodology applied to an advanced experimental fighter and discussion of associated design issues are provided.

  18. An Examination of an Information Security Framework Implementation Based on Agile Values to Achieve Health Insurance Portability and Accountability Act Security Rule Compliance in an Academic Medical Center: The Thomas Jefferson University Case Study

    ERIC Educational Resources Information Center

    Reis, David W.

    2012-01-01

    Agile project management is most often examined in relation to software development, while information security frameworks are often examined with respect to certain risk management capabilities rather than in terms of successful implementation approaches. This dissertation extended the study of both Agile project management and information…

  19. On-the-Job Training: Development and Assessment of a Methodology for Generating Task Proficiency Evaluation Instruments.

    ERIC Educational Resources Information Center

    Warm, Ronnie; And Others

    This document describes the development and assessment of a methodology for generating on-the-job-training (OJT) task proficiency assessment instruments. The Task Evaluation Form (TEF) development procedures were derived to address previously identified deficiencies in the evaluation of OJT task proficiency. The TEF development procedures allow…

  20. Reflections on Software Agility and Agile Methods: Challenges, Dilemmas, & the Way Ahead

    DTIC Science & Technology

    2005-05-11

    Research team: Richard Baskerville and Balasubramanian Ramesh, Department of Computer Information Systems, Georgia State University. References include: Agile Manifesto, http://agilemanifesto.org/; Baskerville, R., Pries-Heje, J., Levine, L., & Ramesh, B. (2005). The high speed balancing game: How software companies cope with Internet speed. Scandinavian Journal of Information Systems, 16, 11-54.

  1. Development and methodology of level 1 probability safety assessment at PUSPATI TRIGA Reactor

    SciTech Connect

    Maskin, Mazleha; Tom, Phongsakorn Prak; Lanyau, Tonny Anak; Saad, Mohamad Fauzi; Ismail, Ahmad Razali; Abu, Mohamad Puad Haji; Brayon, Fedrick Charlie Matthew; Mohamed, Faizal

    2014-02-12

    As a consequence of the accident at the Fukushima Dai-ichi Nuclear Power Plant in Japan, the safety aspects of the one and only research reactor (31 years old) in Malaysia needed to be reviewed. Based on this decision, the Malaysian Nuclear Agency, in collaboration with the Atomic Energy Licensing Board and Universiti Kebangsaan Malaysia, developed a Level-1 Probability Safety Assessment for this research reactor. This work aims to evaluate the potential risks of incidents in RTP and at the same time to identify internal and external hazards that may cause any extreme initiating events. This report documents the methodology in developing a Level 1 PSA performed for the RTP as a complementary approach to deterministic safety analysis, both in neutronics and thermal hydraulics. This Level-1 PSA work has been performed according to the procedures suggested in relevant IAEA publications, and at the same time a number of procedures have been developed as part of an Integrated Management System programme implemented in Nuclear Malaysia.

  2. Methodology to develop crash modification functions for road safety treatments with fully specified and hierarchical models.

    PubMed

    Chen, Yongsheng; Persaud, Bhagwant

    2014-09-01

    Crash modification factors (CMFs) for road safety treatments are developed as multiplicative factors that are used to reflect the expected changes in safety performance associated with changes in highway design and/or the traffic control features. However, current CMFs have methodological drawbacks. For example, variability with application circumstance is not well understood, and, as important, correlation is not addressed when several CMFs are applied multiplicatively. These issues can be addressed by developing safety performance functions (SPFs) with components of crash modification functions (CM-Functions), an approach that includes all CMF related variables, along with others, while capturing quantitative and other effects of factors and accounting for cross-factor correlations. CM-Functions can capture the safety impact of factors through a continuous and quantitative approach, avoiding the problematic categorical analysis that is often used to capture CMF variability. There are two formulations to develop such SPFs with CM-Function components - fully specified models and hierarchical models. Based on sample datasets from two Canadian cities, both approaches are investigated in this paper. While both model formulations yielded promising results and reasonable CM-Functions, the hierarchical model was found to be more suitable in retaining homogeneity of first-level SPFs, while addressing CM-Functions in sub-level modeling. In addition, hierarchical models better capture the correlations between different impact factors.
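    The contrast the abstract draws, between a multiplicative chain of fixed CMFs applied to a base SPF estimate and a continuous CM-Function of a design variable, can be sketched as follows. The base prediction, CMF values, and the exponential CM-Function coefficient are illustrative, not the paper's fitted models:

```python
import math

# (a) Conventional approach: base SPF prediction times each fixed CMF,
#     which implicitly assumes the treatments act independently.
# (b) Continuous CM-Function of a treatment variable, of the common
#     exponential form exp(beta * (x - x_ref)). Coefficients illustrative.

def predicted_crashes_cmf(spf_base, cmfs):
    """Multiplicative CMF chain applied to a base SPF estimate."""
    out = spf_base
    for cmf in cmfs:
        out *= cmf
    return out

def cm_function(lane_width_m, beta=-0.15, reference=3.6):
    """Continuous CM-Function; equals 1.0 at the reference condition."""
    return math.exp(beta * (lane_width_m - reference))

base = 12.0  # crashes/yr from the base SPF (illustrative)
print(predicted_crashes_cmf(base, [0.9, 0.85]))  # two treatments applied
print(cm_function(3.6))                          # 1.0 at the reference
```

    Embedding terms like `cm_function` directly inside the SPF, and fitting everything jointly, is what lets the authors' formulations capture cross-factor correlations that the fixed multiplicative chain ignores.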

  3. Development and Evaluation of an Improved Methodology for Assessing Adherence to Evidence-Based Drug Therapy Guidelines Using Claims Data

    PubMed Central

    Kawamoto, Kensaku; Allen LaPointe, Nancy M.; Silvey, Garry M.; Anstrom, Kevin J.; Eisenstein, Eric L.; Lobach, David F.

    2007-01-01

    Non-adherence to evidence-based pharmacotherapy is associated with increased morbidity and mortality. Claims data can be used to detect and intervene on such non-adherence, but existing claims-based approaches for measuring adherence to pharmacotherapy guidelines have significant limitations. In this manuscript, we describe a methodology for assessing adherence to pharmacotherapy guidelines that overcomes many of these limitations. To develop this methodology, we first reviewed the literature to identify prior work on potential strategies for overcoming these limitations. We then assembled a team of relevant domain experts to iteratively develop an improved methodology. This development process was informed by the use of the proposed methodology to assess adherence levels for 14 pharmacotherapy guidelines related to seven common diseases among approximately 36,000 Medicaid beneficiaries. Finally, we evaluated the ability of the methodology to overcome the targeted limitations. Based on this evaluation, we conclude that the proposed methodology overcomes many of the limitations associated with existing approaches. PMID:18693865

  4. Mapping plant species ranges in the Hawaiian Islands: developing a methodology and associated GIS layers

    USGS Publications Warehouse

    Price, Jonathan P.; Jacobi, James D.; Gon, Samuel M.; Matsuwaki, Dwight; Mehrhoff, Loyal; Wagner, Warren; Lucas, Matthew; Rowe, Barbara

    2012-01-01

    This report documents a methodology for projecting the geographic ranges of plant species in the Hawaiian Islands. The methodology consists primarily of the creation of several geographic information system (GIS) data layers depicting attributes related to the geographic ranges of plant species. The most important spatial-data layer generated here is an objectively defined classification of climate as it pertains to the distribution of plant species. By examining previous zonal-vegetation classifications in light of spatially detailed climate data, broad zones of climate relevant to contemporary concepts of vegetation in the Hawaiian Islands can be explicitly defined. Other spatial-data layers presented here include the following: substrate age, as large areas of the island of Hawai'i, in particular, are covered by very young lava flows inimical to the growth of many plant species; biogeographic regions of the larger islands that are composites of multiple volcanoes, as many of their species are restricted to a given topographically isolated mountain or a specified group of them; and human impact, which can reduce the range of many species relative to where they formerly were found. Other factors influencing the geographic ranges of species that are discussed here but not developed further, owing to limitations in rendering them spatially, include topography, soils, and disturbance. A method is described for analyzing these layers in a GIS, in conjunction with a database of species distributions, to project the ranges of plant species, which include both the potential range prior to human disturbance and the projected present range. Examples of range maps for several species are given as case studies that demonstrate different spatial characteristics of range. Several potential applications of species-range maps are discussed, including facilitating field surveys, informing restoration efforts, studying range size and rarity, studying biodiversity, managing

  5. Compact, flexible, frequency agile parametric wavelength converter

    DOEpatents

    Velsko, Stephan P.; Yang, Steven T.

    2002-01-01

    This improved Frequency Agile Optical Parametric Oscillator provides near on-axis pumping of a single QPMC with a tilted periodically poled grating to overcome the necessity to find a particular crystal that will permit collinear birefringence in order to obtain a desired tuning range. A tilted grating design and the elongation of the transverse profile of the pump beam in the angle tuning plane of the FA-OPO reduces the rate of change of the overlap between the pumped volume in the crystal and the resonated and non-resonated wave mode volumes as the pump beam angle is changed. A folded mirror set relays the pivot point for beam steering from a beam deflector to the center of the FA-OPO crystal. This reduces the footprint of the device by as much as a factor of two over that obtained when using the refractive telescope design.

  6. Developing services for climate impact and adaptation baseline information and methodologies for the Andes

    NASA Astrophysics Data System (ADS)

    Huggel, C.

    2012-04-01

    Impacts of climate change are observed and projected across a range of ecosystems and economic sectors, and mountain regions thereby rank among the hotspots of climate change. The Andes are considered particularly vulnerable to climate change, not only due to fragile ecosystems but also due to the high vulnerability of the population. Natural resources such as water systems play a critical role and are observed and projected to be seriously affected. Adaptation to climate change impacts is therefore crucial to contain the negative effects on the population. Adaptation projects require information on the climate and affected socio-environmental systems. There is, however, generally a lack of methodological guidelines on how to generate the necessary scientific information and how to communicate it to the governmental and non-governmental institutions implementing adaptation. This is particularly important in view of the international funds for adaptation such as the Green Climate Fund established and set into process at the UNFCCC Conferences of the Parties in Cancun 2010 and Durban 2011. To facilitate this process international and regional organizations (World Bank and Andean Community) and a consortium of research institutions have joined forces to develop and define comprehensive methodologies for baseline and climate change impact assessments for the Andes, with an application potential to other mountain regions (AndesPlus project). Considered are the climatological baseline of a region, and the assessment of trends based on ground meteorological stations, reanalysis data, and satellite information. A challenge is the scarcity of climate information in the Andes, and the complex climatology of the mountain terrain. A climate data platform has been developed for the southern Peruvian Andes and is a key element for climate data service and exchange. Water resources are among the key livelihood components for the Andean population, and local and national economy, in particular for

  7. Moving target detection for frequency agility radar by sparse reconstruction.

    PubMed

    Quan, Yinghui; Li, YaChao; Wu, Yaojun; Ran, Lei; Xing, Mengdao; Liu, Mengqi

    2016-09-01

    Frequency agility radar, with a randomly varied carrier frequency from pulse to pulse, exhibits superior performance against electromagnetic interference compared to conventional fixed-carrier-frequency pulse-Doppler radar. A novel moving target detection (MTD) method is proposed for estimating target velocity in frequency agility radar, based on the pulses within a coherent processing interval, using sparse reconstruction. A hardware implementation of the orthogonal matching pursuit algorithm is executed on a Xilinx Virtex-7 Field Programmable Gate Array (FPGA) to perform the sparse optimization. Finally, a series of experiments is performed to evaluate the performance of the proposed MTD method for frequency agility radar systems.
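
    The sparse-reconstruction step named in this abstract rests on orthogonal matching pursuit (OMP), a greedy algorithm that repeatedly picks the dictionary atom best correlated with the residual. The following is a minimal NumPy sketch of generic OMP, not the paper's radar signal model or FPGA implementation; the random dictionary and sparsity level are arbitrary illustrations:

```python
import numpy as np

def omp(A, y, k):
    """Greedy orthogonal matching pursuit: find a k-sparse x with y ~= A @ x."""
    residual = y.astype(float)
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # Select the dictionary column most correlated with the residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # Re-fit all selected coefficients by least squares, update residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Recover a 3-sparse vector from 64 random projections.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 128))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns
x_true = np.zeros(128)
x_true[[5, 40, 99]] = [1.0, -2.0, 0.5]
x_hat = omp(A, A @ x_true, k=3)
```

    Re-fitting the full support at every iteration (rather than only the newest coefficient) is what distinguishes OMP from plain matching pursuit.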

  8. Moving target detection for frequency agility radar by sparse reconstruction

    NASA Astrophysics Data System (ADS)

    Quan, Yinghui; Li, YaChao; Wu, Yaojun; Ran, Lei; Xing, Mengdao; Liu, Mengqi

    2016-09-01

    Frequency agility radar, with a randomly varied carrier frequency from pulse to pulse, exhibits superior performance against electromagnetic interference compared to conventional fixed-carrier-frequency pulse-Doppler radar. A novel moving target detection (MTD) method is proposed for estimating target velocity in frequency agility radar, based on the pulses within a coherent processing interval, using sparse reconstruction. A hardware implementation of the orthogonal matching pursuit algorithm is executed on a Xilinx Virtex-7 Field Programmable Gate Array (FPGA) to perform the sparse optimization. Finally, a series of experiments is performed to evaluate the performance of the proposed MTD method for frequency agility radar systems.

  9. Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development

    NASA Astrophysics Data System (ADS)

    Jasiak, M. E.; Truslove, I.; Savoie, M.

    2013-12-01

    In scientific research, software development exists fundamentally for the results it creates. The core research must take focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This system of values is frequently misaligned with creating software in a sustainable fashion; short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines, while building in the necessary support structure for the code to live far beyond its original milestones. Agile and lean software development methodologies, including Scrum, Kanban, Continuous Delivery and Test-Driven Development, have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, as well as on promoting interaction between developers and researchers. This presentation will describe NSIDC's current scientific software development model, how this model addresses the short-term versus sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile and lean-influenced model, and the current challenges faced by the organization.

  10. Optimization of spray drying process for developing seabuckthorn fruit juice powder using response surface methodology.

    PubMed

    Selvamuthukumaran, Meenakshisundaram; Khanum, Farhath

    2014-12-01

    Response surface methodology was used to optimize the spray drying process for the development of seabuckthorn fruit juice powder. The independent variables were different levels of inlet air temperature and maltodextrin concentration. The responses were moisture, solubility, dispersibility, vitamin C and overall color difference value. Statistical analysis revealed that the independent variables significantly affected all the responses. The inlet air temperature showed the maximum influence on moisture and vitamin C content, while the maltodextrin concentration showed a similar influence on solubility, dispersibility and overall color difference value. Contour plots for each response were superimposed to generate an optimum area. The seabuckthorn fruit juice powder was developed using the derived optimum processing conditions to check the validity of the second-order polynomial model. The experimental values were found to be in close agreement with the predicted values and were within the acceptable limits, indicating the suitability of the model in predicting the quality attributes of seabuckthorn fruit juice powder. The recommended optimum spray drying conditions for drying 100 g of fruit juice slurry were an inlet air temperature of 162.5 °C and a maltodextrin concentration of 25 g. The spray dried juice powder contains higher amounts of antioxidants, viz. vitamin C, vitamin E, total carotenoids, total anthocyanins and total phenols, than commercial fruit juice powders, and it is also free flowing without physical alterations such as caking, stickiness, collapse and crystallization, exhibiting a higher glass transition temperature.
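
    The second-order polynomial model underlying response surface methodology amounts to an ordinary least-squares fit of a full quadratic in the two factors (inlet air temperature and maltodextrin concentration). A minimal sketch with coded factor levels and made-up coefficients, not the study's data:

```python
import numpy as np

def fit_quadratic_surface(x1, x2, y):
    """OLS fit of y = b0 + b1*x1 + b2*x2 + b11*x1**2 + b22*x2**2 + b12*x1*x2."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Coded factor levels (-1, 0, +1) over a full 3x3 grid of runs.
levels = np.array([-1.0, 0.0, 1.0])
x1 = np.repeat(levels, 3)
x2 = np.tile(levels, 3)
# Synthetic response generated from illustrative coefficients.
b_true = np.array([5.0, 1.2, -0.8, 0.5, 0.3, -0.4])
y = (b_true[0] + b_true[1]*x1 + b_true[2]*x2
     + b_true[3]*x1**2 + b_true[4]*x2**2 + b_true[5]*x1*x2)
beta = fit_quadratic_surface(x1, x2, y)
```

    In an actual RSM study the fitted surface would then be explored (e.g. via contour plots, as in the abstract) to locate the optimum operating region.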

  11. An integrated methodology on the suitability of offshore sites for wind farm development

    NASA Astrophysics Data System (ADS)

    Patlakas, Platon; Galanis, George; Péray, Marie; Filipot, Jean-François; Kalogeri, Christina; Spyrou, Christos; Diamantis, Dimitris; Kallos, Gerorge

    2016-04-01

    During the last decades, the potential of and interest in wind energy investments has been constantly increasing in European countries. As technology changes rapidly, more and more areas can be identified as suitable for energy applications. Offshore wind farms perfectly illustrate how new technologies make it possible to build bigger, more efficient wind power plants that are resistant to extreme conditions. The current work proposes an integrated methodology to determine the suitability of an offshore marine area for the development of wind farm structures. More specifically, the region of interest is evaluated based both on the natural resources, connected to the local environmental characteristics, and on potential constraints set by anthropogenic or other activities. State-of-the-art atmospheric and wave models and a 10-year hindcast database are utilized in conjunction with local information for a number of potential constraints, leading to a 5-scale suitability index for the whole area. In this way, sub-regions are characterized, at high resolution, as poorly or highly suitable for wind farm development, providing a new tool for technical/research teams and decision makers. In addition, extreme wind and wave conditions and their 50-year return periods are analyzed and used to define the safety level of the wind farms' structural characteristics.
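
    A 5-scale suitability index of the kind described can be thought of as reclassifying a normalized resource score onto five classes and then masking out constrained cells. The scoring rule below is a hypothetical illustration only; the project's actual weighting of resources and constraints is not specified in the abstract:

```python
import numpy as np

def suitability_index(resource, constrained):
    """Reclassify a resource score in [0, 1] onto a 1-5 suitability scale;
    cells flagged by the constraint mask fall to class 1 (least suitable)."""
    idx = np.clip(np.ceil(resource * 5), 1, 5).astype(int)
    idx[constrained] = 1
    return idx

# Three grid cells: weak resource, moderate resource, strong but constrained.
score = np.array([0.05, 0.45, 0.95])
mask = np.array([False, False, True])   # e.g. a hypothetical shipping-lane exclusion
index = suitability_index(score, mask)
```
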

  12. Development of the Spanish version of the Systematized Nomenclature of Medicine: methodology and main issues.

    PubMed Central

    Reynoso, G. A.; March, A. D.; Berra, C. M.; Strobietto, R. P.; Barani, M.; Iubatti, M.; Chiaradio, M. P.; Serebrisky, D.; Kahn, A.; Vaccarezza, O. A.; Leguiza, J. L.; Ceitlin, M.; Luna, D. A.; Bernaldo de Quirós, F. G.; Otegui, M. I.; Puga, M. C.; Vallejos, M.

    2000-01-01

    This presentation features linguistic and terminology management issues related to the development of the Spanish version of the Systematized Nomenclature of Medicine (SNOMED). It aims to describe the aspects of translating and the difficulties encountered in delivering a natural and consistent medical nomenclature. Bunge's three-layered model is referenced to analyze the sequence of symbolic concept representations. It further explains how a communicative translation based on a concept-to-concept approach was used to achieve the highest level of flawlessness and naturalness for the Spanish rendition of SNOMED. Translation procedures and techniques are described and exemplified. Both the computer-aided and human translation methods are portrayed. The scientific and translation team tasks are detailed, with focus on Newmark's four-level principle for the translation process, extended with a fifth level relevant to the ontology to control the consistency of the typology of concepts. Finally, the convenience of a common methodology to develop non-English versions of SNOMED is suggested. PMID:11079973

  13. Moving to Capture Children’s Attention: Developing a Methodology for Measuring Visuomotor Attention

    PubMed Central

    Coats, Rachel O.; Mushtaq, Faisal; Williams, Justin H. G.; Aucott, Lorna S.; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child’s development. However, methodological limitations currently make large-scale assessment of children’s attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of ‘Visual Motor Attention’ (VMA)—a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method’s core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults’ attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action). PMID:27434198

  14. Design, Development and Optimization of S (-) Atenolol Floating Sustained Release Matrix Tablets Using Surface Response Methodology

    PubMed Central

    Gunjal, P. T.; Shinde, M. B.; Gharge, V. S.; Pimple, S. V.; Gurjar, M. K.; Shah, M. N.

    2015-01-01

    The objective of this investigation was to develop and formulate floating sustained release matrix tablets of S(-) atenolol using different polymer combinations and filler, to optimize them by surface response methodology for different drug release variables, and to evaluate the drug release pattern of the optimized product. Floating sustained release matrix tablets of various combinations were prepared with cellulose-based polymers: hydroxypropyl methylcellulose, sodium bicarbonate as a gas generating agent, polyvinyl pyrrolidone as a binder and lactose monohydrate as filler. A 3² full factorial design was employed to investigate the effect of formulation variables on different properties of the tablets, namely floating lag time, buoyancy time, % drug release in 1 and 6 h (D1h, D6h) and time required for 90% drug release (t90%). Significance of results was analyzed using analysis of variance, and P < 0.05 was considered statistically significant. S(-) atenolol floating sustained release matrix tablets followed Higuchi drug release kinetics, indicating that drug release follows an anomalous (non-Fickian) diffusion mechanism. The developed floating sustained release matrix tablet of improved efficacy can perform therapeutically better than a conventional tablet. PMID:26798171
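
    The Higuchi model referenced here describes cumulative drug release as proportional to the square root of time, Q(t) = k_H·√t, so the release constant can be estimated by a one-parameter least-squares fit. A minimal sketch on synthetic data; the rate constant and time points are illustrative, not the study's measurements:

```python
import numpy as np

def fit_higuchi(t, q):
    """Least-squares estimate of k_H in the Higuchi model q(t) = k_H * sqrt(t)."""
    s = np.sqrt(t)
    return float(s @ q / (s @ s))

# Synthetic cumulative-release data following the model exactly.
t = np.array([0.25, 1.0, 4.0, 9.0])   # hours
q = 12.5 * np.sqrt(t)                 # % released (illustrative k_H = 12.5)
k_h = fit_higuchi(t, q)
```

    In practice, goodness of fit against √t (versus zero-order or first-order models) is what identifies Higuchi kinetics in release data.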

  15. Methodologies in Cultural-Historical Activity Theory: The Example of School-Based Development

    ERIC Educational Resources Information Center

    Postholm, May Britt

    2015-01-01

    Background and purpose: Relatively little research has been conducted on methodology within Cultural-Historical Activity Theory (CHAT). CHAT is mainly used as a framework for developmental processes. The purpose of this article is to discuss both focuses for research and research questions within CHAT and to outline methodologies that can be used…

  16. Methodology for the Development of Antithrombotic Therapy and Prevention of Thrombosis Guidelines

    PubMed Central

    Norris, Susan L.; Schulman, Sam; Hirsh, Jack; Eckman, Mark H.; Akl, Elie A.; Crowther, Mark; Vandvik, Per Olav; Eikelboom, John W.; McDonagh, Marian S.; Lewis, Sandra Zelman; Gutterman, David D.; Cook, Deborah J.; Schünemann, Holger J.

    2012-01-01

    Background: To develop the Antithrombotic Therapy and Prevention of Thrombosis, 9th ed: ACCP Evidence-Based Clinical Practice Guidelines (AT9), the American College of Chest Physicians (ACCP) assembled a panel of clinical experts, information scientists, decision scientists, and systematic review and guideline methodologists. Methods: Clinical areas were designated as articles, and a methodologist without important intellectual or financial conflicts of interest led a panel for each article. Only panel members without significant conflicts of interest participated in making recommendations. Panelists specified the population, intervention and alternative, and outcomes for each clinical question and defined criteria for eligible studies. Panelists and an independent evidence-based practice center executed systematic searches for relevant studies and evaluated the evidence, and where resources and evidence permitted, they created standardized tables that present the quality of the evidence and key results in a transparent fashion. Results: One or more recommendations relate to each specific clinical question, and each recommendation is clearly linked to the underlying body of evidence. Judgments regarding the quality of evidence and strength of recommendations were based on approaches developed by the Grades of Recommendations, Assessment, Development, and Evaluation Working Group. Panel members constructed scenarios describing relevant health states and rated the disutility associated with these states based on an additional systematic review of evidence regarding patient values and preferences for antithrombotic therapy. These ratings guided value and preference decisions underlying the recommendations. Each topic panel identified questions in which resource allocation issues were particularly important and, for these issues, experts in economic analysis provided additional searches and guidance. Conclusions: AT9 methodology reflects the current science of evidence

  17. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT).

  18. GRB 070724B: the first Gamma Ray Burst localized by SuperAGILE

    SciTech Connect

    Del Monte, E.; Costa, E.; Donnarumma, I.; Feroci, M.; Lapshov, I.; Lazzarotto, F.; Soffitta, P.; Argan, A.; Pucella, G.; Trois, A.; Vittorini, V.; Evangelista, Y.; Rapisarda, M.; Barbiellini, G.; Longo, F.; Basset, M.; Foggetta, L.; Vallazza, E.; Bulgarelli, A.; Di Cocco, G.

    2008-05-22

    GRB 070724B is the first Gamma Ray Burst localized by the SuperAGILE instrument aboard the AGILE space mission. The SuperAGILE localization was confirmed by the afterglow observation with the XRT aboard the Swift satellite. No significant gamma ray emission above 50 MeV has been detected for this GRB. In this paper we describe the SuperAGILE capabilities in detecting Gamma Ray Bursts and the AGILE observation of GRB 070724B.

  19. Methodology for Developing the REScheckTM Software through Version 4.4.3

    SciTech Connect

    Bartlett, Rosemarie; Connell, Linda M; Gowri, Krishnan; Lucas, Robert G; Schultz, Robert W; Taylor, Zachary T; Wiberg, John D

    2012-09-01

    , MECcheck was renamed REScheck™ to better identify it as a residential code compliance tool. The “MEC” in MECcheck was outdated because it was taken from the Model Energy Code, which has been succeeded by the IECC. The “RES” in REScheck is also a better fit with the companion commercial product, COMcheck™. The easy-to-use REScheck compliance materials include a compliance and enforcement manual for all the MEC and IECC requirements and three compliance approaches for meeting the code’s thermal envelope requirements: prescriptive packages, software, and a trade-off worksheet (included in the compliance manual). The compliance materials can be used for single-family and low-rise multifamily dwellings. The materials allow building energy efficiency measures (such as insulation levels) to be “traded off” against each other, allowing a wide variety of building designs to comply with the code. This report explains the methodology used to develop Version 4.4.3 of the REScheck software developed for the 1992, 1993, and 1995 editions of the MEC; the 1998, 2000, 2003, 2006, 2007, 2009, and 2012 editions of the IECC; and the 2006 edition of the International Residential Code (IRC). Although some requirements contained in these codes have changed, the methodology used to develop the REScheck software for these editions is similar. Beginning with REScheck Version 4.4.0, support for the 1992, 1993, and 1995 MEC and the 1998 IECC is no longer included, but those sections remain in this document for reference purposes. REScheck assists builders in meeting the most complicated part of the code: the building envelope Uo-, U-, and R-value requirements in Section 502 of the code. This document details the calculations and assumptions underlying the treatment of the code requirements in REScheck, with a major emphasis on the building envelope requirements.
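
    The overall Uo requirement that such trade-off tools evaluate is, in essence, an area-weighted average of the component U-factors: Uo = Σ(U_i·A_i) / Σ(A_i). A small sketch of that calculation; the component values are illustrative, not taken from any code edition:

```python
def overall_uo(components):
    """Area-weighted overall U-factor: Uo = sum(U_i * A_i) / sum(A_i).
    `components` is a list of (u_factor, area) pairs for walls, windows, etc."""
    total_ua = sum(u * a for u, a in components)
    total_area = sum(a for _, a in components)
    return total_ua / total_area

# A wall assembly plus its windows (illustrative U-factors and areas).
envelope = [(0.05, 900.0), (0.35, 100.0)]
uo = overall_uo(envelope)
```

    This weighting is what lets a poorly insulated component be "traded off" against a better one elsewhere in the envelope, as the abstract describes.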

  20. Generic Competences in Higher Education: Studying Their Development in Undergraduate Social Science Studies by Means of a Specific Methodology

    ERIC Educational Resources Information Center

    Gallifa, Josep; Garriga, Jordi

    2010-01-01

    Research into the acquisition of generic competences was carried out with the undergraduate social science programmes offered by the Ramon Llull University, Barcelona (Spain). For these programmes an innovative methodology called "cross-course seminars" has been developed. Its focus is, amongst others, on developing generic competences.…

  1. Agile data management for curation of genomes to watershed datasets

    NASA Astrophysics Data System (ADS)

    Varadharajan, C.; Agarwal, D.; Faybishenko, B.; Versteeg, R.

    2015-12-01

    A software platform is being developed for data management and assimilation (DMA) as part of the U.S. Department of Energy's Genomes to Watershed Sustainable Systems Science Focus Area 2.0. The DMA components and capabilities are driven by the project science priorities, and the development is based on agile development techniques. The goal of the DMA software platform is to enable users to integrate and synthesize diverse and disparate field, laboratory, and simulation datasets, including geological, geochemical, geophysical, microbiological, hydrological, and meteorological data across a range of spatial and temporal scales. The DMA objectives are (a) developing an integrated interface to the datasets, (b) storing field monitoring data and laboratory analytical results of water and sediment samples in a database, (c) providing automated QA/QC analysis of data, and (d) working with data providers to modify high-priority field and laboratory data collection and reporting procedures as needed. The first three objectives are driven by user needs, while the last objective is driven by data management needs. The project needs and priorities are reassessed regularly with the users. After each user session we identify development priorities to match the identified user priorities. For instance, data QA/QC and collection activities have focused on the data and products needed for ongoing scientific analyses (e.g. water level and geochemistry). We have also developed, tested and released a broker and portal that integrates diverse datasets from two different databases used for curation of project data. The development of the user interface was based on a user-centered design process involving several user interviews and constant interaction with data providers. The initial version focuses on the most requested feature, i.e. finding the data needed for analyses through an intuitive interface. Once the data is found, the user can immediately plot and download it.

  2. Developing a methodology for identifying action zones to protect and manage groundwater well fields

    NASA Astrophysics Data System (ADS)

    Bellier, Sandra; Viennot, Pascal; Ledoux, Emmanuel; Schott, Celine

    2013-04-01

    Implementation of a long-term action plan to manage and protect well fields is a complex and very expensive process. In this context, the relevance and efficiency of such action plans on water quality should be evaluated. The objective of this study is to set up a methodology to identify relevant action zones in which environmental changes may significantly impact the quantity or quality of pumped water. In the Seine-et-Marne department (France), three sectors encompassing numerous well fields pumping in the Champigny limestone aquifer are considered a priority under French environmental law. This aquifer, located south-east of Paris, supplies more than one million people with drinking water. The catchment areas of these abstractions are very large (2000 km2), and their intrinsic vulnerability was established by a simple parametric approach that does not allow the complexity of the hydrosystem to be considered. Consequently, a methodology based on distributed modelling of the aquifer processes was developed. The basin is modeled using the hydrogeological model MODCOU, developed at MINES ParisTech since the 1980s. It simulates surface and groundwater flow in aquifer systems and makes it possible to represent the local characteristics of the hydrosystem (aquifers communicating by leakage, river infiltration, supply from sinkholes, and locally perched or dewatering aquifers). The model was calibrated by matching simulated river discharge hydrographs and piezometric heads with those observed since the 1970s. Using this modelling tool, a methodology based on the transfer of a theoretical tracer through the hydrosystem, from the ground surface to the outlets, was implemented to evaluate the spatial distribution of the contribution areas during contrasting wet or dry recharge periods. The results show that the area contributing to the supply of most catchments is smaller than 300 km2 and that the major contributory zones are located along rivers. This finding illustrates the importance of

  3. Laterality and performance of agility-trained dogs.

    PubMed

    Siniscalchi, Marcello; Bertino, Daniele; Quaranta, Angelo

    2014-01-01

    Correlations between lateralised behaviour and performance were investigated in 19 agility-trained dogs (Canis familiaris) by scoring paw preference to hold a food object and relating it to performance on typical agility obstacles (jump/A-frame and weave poles). In addition, because recent behavioural studies reported that visual stimuli of emotional valence presented to one visual hemifield at a time affect visually guided motor responses in dogs, the possibility that the position of the owner in the dog's left or right visual hemifield might be associated with the quality of performance during agility was considered. The dogs' temperament was also measured by an owner-rated questionnaire. The most relevant finding was that agility-trained dogs displayed longer latencies to complete the obstacles with the owner located in their left visual hemifield compared to the right. Interestingly, the results showed that this phenomenon was significantly linked to both the dogs' trainability and the strength of paw preference.

  4. Frequency agile OPO-based transmitters for multiwavelength DIAL

    SciTech Connect

    Velsko, S.P.; Ruggiero, A.; Herman, M.

    1996-09-01

    We describe a first generation mid-infrared transmitter with pulse-to-pulse frequency agility and both wide and narrow band capability. This transmitter was used to make multicomponent DIAL measurements in the field.

  5. Frequency agile OPO-based transmitters for multiwavelength DIAL

    SciTech Connect

    Velsko, S.P.; Ruggiero, A.; Herman, M.

    1996-09-01

    We describe a first generation mid-infrared transmitter with pulse to pulse frequency agility and both wide and narrow band capability. This transmitter was used to make multicomponent Differential Absorption LIDAR (DIAL) measurements in the field.

  6. Materials by design: methodological developments in the calculation of excited-state properties

    NASA Astrophysics Data System (ADS)

    Govoni, Marco

    Density functional theory (DFT) is one of the main tools used in first-principles simulations of materials; however, several of the current approximations of exchange and correlation functionals do not provide the level of accuracy required for predictive calculations of excited-state properties. The application of more accurate post-DFT approaches such as Many-Body Perturbation Theory (MBPT) to heterogeneous systems, for example to nanostructured, disordered, and defective materials, has been hindered by high computational costs. In this talk, recent methodological developments in MBPT calculations will be discussed, as implemented in the open source code WEST, which efficiently exploits HPC architectures. Results using a formulation that does not require the explicit calculation of virtual states, nor the storage and inversion of large dielectric matrices, will be presented; these results include quasiparticle energies for systems with thousands of electrons and encompass the electronic structure of aqueous solutions, spin defects in insulators, and benchmarks for molecules and solids containing heavy elements. Simplifications of MBPT calculations based on the use of static response properties, such as dielectric-dependent hybrid functionals, will also be discussed. Work done in collaboration with Hosung Seo, Peter Scherpelz, Ikutaro Hamada, Jonathan Skone, Alex Gaiduk, T. Anh Pham, and Giulia Galli. Supported by DOE-BES.

  7. Development of ginger based ready-to-eat appetizers by response surface methodology.

    PubMed

    Wadikar, D D; Nanjappa, C; Premavalli, K S; Bawa, A S

    2010-08-01

    Ginger is an herbaceous perennial rhizome traditionally used in cooking for its flavor and pungency. It is also used as a carminative and stimulant, and for its anti-emetic properties due to gingerols and shogaols. Appetite loss is one of the problems faced at high altitudes, and appetizers based on ginger may be useful for appetite stimulation. The fruit munch and ginger munch, based on fresh and powdered ginger respectively, were developed using response surface methodology (RSM). The sensory score, acidity and total sugars were the responses in the central composite designs of experiments with three independent variables. The ingredients raisins, dates and almonds were pre-processed by frying in stable fat, while juice was extracted from pseudolemon and lemon. The optimized composition of ingredients was processed further through concentration. The carbohydrate-rich munches had a vitamin C content in the range of 37-43 mg/100 g and a calorific value of about 90 kcal per munch. The munches packed in metalized polyester pouches had a shelf life of 8 months at ambient conditions (18-33 °C) as well as at a fixed storage temperature of 37 °C.

  8. DARE Train-the-Trainer Pedagogy Development Using 2-Round Delphi Methodology

    PubMed Central

    Kua, Phek Hui Jade; Soon, Swee Sung

    2016-01-01

    The Dispatcher-Assisted first REsponder (DARE) programme aims to equip the public with skills to perform hands-only cardiopulmonary resuscitation (CPR) and to use an automated external defibrillator (AED). By familiarising them with the instructions given by a medical dispatcher during an out-of-hospital cardiac arrest call, they will be prepared and empowered to react in an emergency. We aim to formalise the curriculum and standardise the way information is conveyed to participants. A panel of 20 experts was chosen. Using Delphi methodology, selected issues were classified into open-ended and close-ended questions. Consensus for an item was established at a 70% agreement rate within the panel. Questions that had 60%–69% agreement were edited and sent to the panel for another round of voting. After 2 rounds of voting, 70 consensus statements were agreed upon. These covered the following: focus of CPR; qualities and qualifications of trainers; recognition of agonal breathing; head-tilt-chin-lift; landmark for chest compression; performance of CPR when injuries are present; trainers' involvement in training lay people; modesty of female patients during CPR; AED usage; content of the trainer's manual; addressing of questions and answers; dissemination of updates to trainers and attendance of refresher courses. Recommendations for the pedagogy of trainers of dispatcher-assisted CPR programmes were developed. PMID:27660757
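
    The Delphi decision rules stated in the abstract (accept at a 70% agreement rate, edit and revote at 60%–69%) reduce to a simple threshold classification. A sketch, where the thresholds mirror the abstract but the function name and the label for sub-60% items are illustrative assumptions:

```python
def delphi_item_status(votes_agree, panel_size, accept=0.70, revote=0.60):
    """Classify a Delphi item by its agreement rate: accept at >= 70%,
    edit and revote at 60-69%, otherwise no consensus."""
    rate = votes_agree / panel_size
    if rate >= accept:
        return "consensus"
    if rate >= revote:
        return "edit and revote"
    return "no consensus"
```
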

  9. Advanced Raman Spectroscopy of Methylammonium Lead Iodide: Development of a Non-destructive Characterisation Methodology

    PubMed Central

    Pistor, Paul; Ruiz, Alejandro; Cabot, Andreu; Izquierdo-Roca, Victor

    2016-01-01

    In recent years, there has been impressively fast technological progress in the development of highly efficient lead halide perovskite solar cells. However, the stability of perovskite films and of the respective solar cells is still an open point of concern and calls for advanced characterization methods. In this work, we identify appropriate measurement conditions for a meaningful analysis of spin-coated absorber-grade perovskite thin films based on methylammonium (MA) lead iodide (MAPbI3) by Raman spectroscopy. The material under investigation and its derivatives are the most commonly used for high-efficiency devices in the literature and have yielded working solar cell devices with efficiencies around 10% in our laboratory. We report highly detailed Raman spectra obtained with excitation at 532 nm and 633 nm and their deconvolution, taking advantage of the simultaneous fitting of spectra obtained with varying excitation wavelengths. Finally, we propose a fast and contactless methodology based on Raman to probe composition variations and/or degradation of these perovskite thin films and discuss the potential of the presented technique as a quality-control and degradation-monitoring tool for other organic-inorganic perovskite materials and complete solar cell devices. PMID:27786250
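
    One way to illustrate the "simultaneous fitting" idea is to constrain spectra taken at different excitation wavelengths to share peak positions and widths, leaving only per-spectrum amplitudes free; with the shared parameters fixed, the amplitudes become a linear least-squares problem. The peak positions, widths, and synthetic spectra below are invented for illustration and are not the MAPbI3 modes reported in the paper.

    ```python
    import numpy as np

    # Sketch: two spectra (532 nm and 633 nm excitation) deconvolved over a
    # shared Lorentzian basis. Centres/widths are common; amplitudes differ.
    # All numbers are synthetic, not real MAPbI3 Raman parameters.

    def lorentzian(x, centre, width):
        return width**2 / ((x - centre)**2 + width**2)

    shifts = np.linspace(50, 350, 600)      # Raman shift axis, cm^-1
    centres = [95.0, 145.0, 250.0]          # shared across both excitations
    widths = [15.0, 20.0, 30.0]

    # Basis matrix: one Lorentzian column per shared peak
    B = np.column_stack([lorentzian(shifts, c, w) for c, w in zip(centres, widths)])

    true_amp = {"532nm": [1.0, 0.6, 0.3], "633nm": [0.4, 0.9, 0.5]}
    rng = np.random.default_rng(1)
    spectra = {k: B @ np.array(a) + rng.normal(0, 0.005, len(shifts))
               for k, a in true_amp.items()}

    # Deconvolution: recover per-spectrum amplitudes under the shared basis
    for name, y in spectra.items():
        amp, *_ = np.linalg.lstsq(B, y, rcond=None)
        print(name, np.round(amp, 2))
    ```

    In a full fit, the shared centres and widths would also be refined, but the shared-basis constraint is what ties the two excitation wavelengths together.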

  10. Developing a robust methodology for assessing the value of weather/climate services

    NASA Astrophysics Data System (ADS)

    Krijnen, Justin; Golding, Nicola; Buontempo, Carlo

    2016-04-01

    Increasingly, scientists involved in providing weather and climate services are expected to demonstrate the value of their work for end users in order to justify the costs of developing and delivering these services. This talk will outline different approaches that can be used to assess the socio-economic benefits of weather and climate services, including, among others, willingness to pay and avoided costs. The advantages and limitations of these methods will be discussed and relevant case studies will be used to illustrate each approach. The choice of valuation method may be influenced by different factors, such as resource and time constraints and the end purposes of the study. In addition, there are important methodological differences that will affect the value assessed. For instance, the ultimate value of a weather/climate forecast to a decision-maker will depend not only on forecast accuracy but also on other factors, such as how the forecast is communicated to and consequently interpreted by the end user. Thus, excluding these additional factors may result in inaccurate socio-economic value estimates. In order to reduce the inaccuracies in this valuation process we propose an approach that assesses how the initial weather/climate forecast information is incorporated within the value chain of a given sector, taking into account value gains and losses at each stage of the delivery process. In this way we aim to more accurately depict the socio-economic benefits of a weather/climate forecast to decision-makers.
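
    The "avoided costs" idea can be made concrete with the standard static cost-loss decision model (a common textbook illustration, not a method described in this abstract): a user pays protection cost C when warned, or suffers loss L if an adverse event occurs unprotected. All numbers below are invented.

    ```python
    # Avoided-cost valuation sketch using the static cost-loss model
    # (illustrative only; not taken from the abstract).

    def expected_cost(p_event, hit_rate, false_alarm_rate, C, L):
        """Expected cost per decision when acting on a forecast."""
        p_warn_event = p_event * hit_rate                 # warned, event occurs
        p_warn_no_event = (1 - p_event) * false_alarm_rate  # false alarm
        p_miss = p_event * (1 - hit_rate)                 # event, no warning
        return (p_warn_event + p_warn_no_event) * C + p_miss * L

    p, C, L = 0.1, 100.0, 2000.0
    never_protect = p * L                                 # baseline: no forecast
    with_forecast = expected_cost(p, hit_rate=0.8, false_alarm_rate=0.1, C=C, L=L)
    print(f"avoided cost per decision: {never_protect - with_forecast:.1f}")
    ```

    Degrading the communication or interpretation of the forecast shows up here as a lower effective hit rate or higher false-alarm rate, which is one way to represent the value losses along the delivery chain that the abstract highlights.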

  11. The Component Packaging Problem: A Vehicle for the Development of Multidisciplinary Design and Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Fadel, Georges; Bridgewood, Michael; Figliola, Richard; Greenstein, Joel; Kostreva, Michael; Nowaczyk, Ronald; Stevenson, Steve

    1999-01-01

    This report summarizes academic research which has resulted in an increased appreciation for multidisciplinary efforts among our students, colleagues and administrators. It has also generated a number of research ideas that emerged from the interaction between disciplines. Overall, 17 undergraduate students and 16 graduate students benefited directly from the NASA grant; an additional 11 graduate students were impacted and participated without financial support from NASA. The work resulted in 16 theses (with 7 to be completed in the near future) and 67 papers or reports, mostly published in 8 journals and/or presented at various conferences (a total of 83 papers, presentations and reports published based on NASA-inspired or NASA-supported work). In addition, the faculty and students presented related work at many meetings, and continuing work has been proposed to NSF, the Army, industry and other state and federal institutions to continue efforts in the direction of multidisciplinary, and recently multi-objective, design and analysis. The specific problem addressed is component packing, which was solved as a multi-objective problem using iterative genetic algorithms and decomposition. Further testing and refinement of the methodology developed is presently under investigation. Research and classes on teaming issues resulted in the publication of a web site, (http://design.eng.clemson.edu/psych4991) which provides pointers and techniques to interested parties. Specific advantages of using iterative genetic algorithms, hurdles faced and resolved, and institutional difficulties associated with multi-discipline teaming are described in some detail.
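
    A toy sketch of the multi-objective genetic-algorithm idea: arrange components in a row, trading off total layout length against mass balance. The weighted sum below is a simplification of the report's multi-objective treatment, and the components, clearance rule, and weights are all invented.

    ```python
    import random

    # Toy multi-objective GA for a 1-D packing problem (illustrative only).
    # Chromosome = a permutation of component indices; fitness = weighted sum
    # of total row length (adjacency-dependent clearances) and mass imbalance.

    random.seed(0)
    components = [  # (length, mass) of each hypothetical component
        (4.0, 2.0), (2.0, 5.0), (3.0, 1.0), (5.0, 4.0), (1.0, 3.0), (2.5, 2.5),
    ]

    def evaluate(order):
        """Weighted sum of the two objectives for one permutation."""
        x, centres, prev_mass = 0.0, [], None
        for i in order:                       # pack left to right
            length, mass = components[i]
            if prev_mass is not None:
                x += 0.5 * abs(prev_mass - mass)  # invented clearance rule
            centres.append(x + length / 2)
            x += length
            prev_mass = mass
        total_mass = sum(components[i][1] for i in order)
        centroid = sum(c * components[i][1] for c, i in zip(centres, order)) / total_mass
        return x + 10.0 * abs(centroid - x / 2)   # length + mass imbalance

    def order_crossover(a, b):
        """OX: copy a slice from parent a, fill the rest in parent b's order."""
        n = len(a)
        i, j = sorted(random.sample(range(n), 2))
        child = [None] * n
        child[i:j] = a[i:j]
        fill = [g for g in b if g not in child]
        for k in range(n):
            if child[k] is None:
                child[k] = fill.pop(0)
        return child

    def mutate(order, rate=0.2):
        order = order[:]
        if random.random() < rate:
            i, j = random.sample(range(len(order)), 2)
            order[i], order[j] = order[j], order[i]
        return order

    pop = [random.sample(range(len(components)), len(components)) for _ in range(30)]
    for _ in range(100):                      # iterate with elitism
        pop.sort(key=evaluate)
        elite = pop[:5]
        children = []
        while len(children) < len(pop) - len(elite):
            p1, p2 = random.sample(elite + pop[:15], 2)
            children.append(mutate(order_crossover(p1, p2)))
        pop = elite + children

    best = min(pop, key=evaluate)
    print(best, round(evaluate(best), 2))
    ```

    A Pareto-based selection (rather than a fixed weighted sum) would be closer to the multi-objective formulation the report describes.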

  12. WHPA delineation in Rhode Island: Development and statewide application of methodologies. [WellHead Protection Area

    SciTech Connect

    Bradley, M.D.; Kaczor-Bobiak, S.M.

    1992-01-01

    Wellhead Protection Areas (WHPAs) were delineated for all 525 public drinking water wells in Rhode Island by RI Department of Environmental Management hydrogeology staff. WHPA delineation is an element of the EPA-approved Rhode Island Wellhead Protection Program (RIWHPP), which is designed to protect areas contributing groundwater to public drinking water wells. For resource protection to proceed, legally defensible WHPAs were needed which could be quickly delineated. The authors incorporated input and feedback from a technical subcommittee in developing Rhode Island WHPA delineation methodologies. Comprehensive databases were compiled, which included well parameters and associated aquifer characteristics. More complex delineation techniques were applied to large-capacity wells (average discharge greater than 10 gpm) than to smaller wells. WHPAs for the smaller wells were limited to a 1750-foot-radius circle based on average characteristics of small bedrock wells in Rhode Island. For the large wells, WHPAs consisted of a combination of analytical modelling and hydrogeologic mapping. The Theis equation was used to map the downgradient WHPA boundary for large wells finished in bedrock. The uniform flow equation was used to calculate the downgradient portion of the WHPA for large wells finished in stratified drift. The upgradient boundary for all large wells was delineated using hydrogeologic mapping based on a technique modified from a USGS method. These WHPAs are being provided to municipalities and public water suppliers, who will use them to carry out the other elements of the RIWHPP, such as pollution source inventories, contingency planning, and management approaches.
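
    The uniform flow equation used for the downgradient WHPA boundary of large stratified-drift wells gives the stagnation-point distance and the asymptotic capture-zone half-width. A minimal sketch, with invented aquifer values (not Rhode Island data); units must simply be consistent:

    ```python
    import math

    # Uniform flow capture-zone geometry for a single pumping well in a
    # uniform regional gradient (standard forms; example values are invented).

    def uniform_flow_capture_zone(Q, K, b, i):
        """Return (downgradient stagnation distance, max capture half-width).

        Q: pumping rate [m^3/d], K: hydraulic conductivity [m/d],
        b: saturated thickness [m], i: regional hydraulic gradient [-].
        """
        x_stagnation = Q / (2 * math.pi * K * b * i)   # null point downgradient
        half_width = Q / (2 * K * b * i)               # asymptotic boundary
        return x_stagnation, half_width

    x0, ymax = uniform_flow_capture_zone(Q=500.0, K=30.0, b=20.0, i=0.005)
    print(f"stagnation point {x0:.1f} m downgradient, half-width {ymax:.1f} m")
    ```

    The upgradient boundary, as the abstract notes, comes from hydrogeologic mapping rather than from this equation.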

  13. Advanced Raman Spectroscopy of Methylammonium Lead Iodide: Development of a Non-destructive Characterisation Methodology

    NASA Astrophysics Data System (ADS)

    Pistor, Paul; Ruiz, Alejandro; Cabot, Andreu; Izquierdo-Roca, Victor

    2016-10-01

    In recent years, there has been an impressively fast technological progress in the development of highly efficient lead halide perovskite solar cells. However, the stability of perovskite films and respective solar cells is still an open point of concern and calls for advanced characterization methods. In this work, we identify appropriate measurement conditions for a meaningful analysis of spin-coated absorber-grade perovskite thin films based on methylammonium (MA) lead iodide (MAPbI3) by Raman spectroscopy. The material under investigation and its derivates is the most commonly used for high efficiency devices in the literatures and has yielded working solar cell devices with efficiencies around 10% in our laboratory. We report highly detailed Raman spectra obtained with excitation at 532 nm and 633 nm and their deconvolution taking advantage of the simultaneous fitting of spectra obtained with varying excitation wavelengths. Finally, we propose a fast and contactless methodology based on Raman to probe composition variations and/or degradation of these perovskite thin films and discuss the potential of the presented technique as quality control and degradation monitoring tool in other organic-inorganic perovskite materials and complete solar cell devices.

  14. Development of new methodologies for evaluating the energy performance of new commercial buildings

    NASA Astrophysics Data System (ADS)

    Song, Suwon

    The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V and analysis methods to measure energy savings from new buildings, which would have hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings, including: (1) The development of a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) The development of an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) The development of an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed, including: (1) A new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) A new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) An analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. Third, an analysis of the actual energy savings compared to three different energy baselines was performed, including: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) New comparisons against
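
    The Erbs correlation mentioned in calibration method (3) estimates the diffuse fraction of global horizontal irradiance from the clearness index, from which direct normal radiation can be synthesized. A sketch with the published piecewise coefficients; the irradiance inputs below are invented:

    ```python
    import math

    # Synthesize direct normal irradiance (DNI) from global horizontal
    # irradiance (GHI) via the Erbs diffuse-fraction correlation.
    # Example irradiance numbers are invented.

    def erbs_diffuse_fraction(kt):
        """Diffuse fraction of GHI vs. clearness index kt (Erbs et al., 1982)."""
        if kt <= 0.22:
            return 1.0 - 0.09 * kt
        if kt <= 0.80:
            return (0.9511 - 0.1604 * kt + 4.388 * kt**2
                    - 16.638 * kt**3 + 12.336 * kt**4)
        return 0.165

    def synth_dni(ghi, extraterrestrial_horiz, zenith_deg):
        """Split GHI into diffuse and beam, then project beam to normal incidence."""
        kt = ghi / extraterrestrial_horiz          # clearness index
        dhi = erbs_diffuse_fraction(kt) * ghi      # diffuse horizontal
        return (ghi - dhi) / math.cos(math.radians(zenith_deg))

    print(round(synth_dni(ghi=600.0, extraterrestrial_horiz=1000.0, zenith_deg=30.0), 1))
    ```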

  15. Development of an assessment methodology for hydrocarbon recovery potential using carbon dioxide and associated carbon sequestration-Workshop findings

    USGS Publications Warehouse

    Verma, Mahendra K.; Warwick, Peter D.

    2011-01-01

    The Energy Independence and Security Act of 2007 (Public Law 110-140) authorized the U.S. Geological Survey (USGS) to conduct a national assessment of geologic storage resources for carbon dioxide (CO2) and requested that the USGS estimate the "potential volumes of oil and gas recoverable by injection and sequestration of industrial carbon dioxide in potential sequestration formations" (121 Stat. 1711). The USGS developed a noneconomic, probability-based methodology to assess the Nation's technically assessable geologic storage resources available for sequestration of CO2 (Brennan and others, 2010) and is currently using the methodology to assess the Nation's CO2 geologic storage resources. Because the USGS has not developed a methodology to assess the potential volumes of technically recoverable hydrocarbons that could be produced by injection and sequestration of CO2, the Geologic Carbon Sequestration project initiated an effort in 2010 to develop a methodology for the assessment of the technically recoverable hydrocarbon potential in the sedimentary basins of the United States using enhanced oil recovery (EOR) techniques with CO2 (CO2-EOR). In collaboration with Stanford University, the USGS hosted a 2-day CO2-EOR workshop in May 2011, attended by 28 experts from academia, natural resource agencies and laboratories of the Federal Government, State and international geologic surveys, and representatives from the oil and gas industry. The geologic and the reservoir engineering and operations working groups formed during the workshop discussed various aspects of geology, reservoir engineering, and operations to make recommendations for the methodology.

  16. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and Scrum, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used with the ESA PSS-05-0 Standard. Our outcomes may, in general, be used by teams that need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  17. Development and evaluation of habitat suitability criteria for use in the instream flow incremental methodology

    USGS Publications Warehouse

    Bovee, Ken D.

    1986-01-01

    The Instream Flow Incremental Methodology (IFIM) is a habitat-based tool used to evaluate the environmental consequences of various water and land use practices. As such, knowledge about the conditions that provide favorable habitat for a species, and those that do not, is necessary for successful implementation of the methodology. In the context of IFIM, this knowledge is defined as habitat suitability criteria: characteristic behavioral traits of a species that are established as standards for comparison in the decision-making process. Habitat suitability criteria may be expressed in a variety of types and formats. The type, or category, refers to the procedure used to develop the criteria. Category I criteria are based on professional judgment, with little or no empirical data. Category II criteria have as their source microhabitat data collected at locations where target organisms are observed or collected. These are called “utilization” functions because they are based on observed locations that were used by the target organism. These functions tend to be biased by the environmental conditions that were available to the fish or invertebrates at the time they were observed. Correction of the utilization function for environmental availability creates category III, or “preference,” criteria, which tend to be much less site specific than category II criteria. There are also several ways to express habitat suitability in graphical form. The binary format establishes a suitable range for each variable as it pertains to a life stage of interest, and is presented graphically as a step function. The quality rating for a variable is 1.0 if it falls within the range of the criteria, and 0.0 if it falls outside the range. The univariate curve format establishes both the usable range and the optimum range for each variable, with conditions of intermediate usability expressed along the portion between the tails and the peak of the curve. Multivariate probability
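
    The two graphical formats described above can be sketched directly: the binary format as a step function, and the univariate curve as piecewise-linear interpolation between breakpoints. The depth breakpoints and scores below are invented, not real criteria for any species.

    ```python
    import numpy as np

    # Binary format: suitability 1.0 inside the usable range, 0.0 outside.
    # Univariate curve: intermediate suitability interpolated between the
    # curve's breakpoints. All values below are hypothetical.

    def binary_suitability(value, low, high):
        return 1.0 if low <= value <= high else 0.0

    def curve_suitability(value, breakpoints, scores):
        """Piecewise-linear suitability curve (np.interp clamps beyond the tails)."""
        return float(np.interp(value, breakpoints, scores))

    depth_bp = [0.2, 0.5, 0.9, 1.5]     # depth in metres (hypothetical)
    depth_sc = [0.0, 1.0, 1.0, 0.0]     # optimum range 0.5-0.9 m

    print(binary_suitability(0.7, 0.2, 1.5))            # inside usable range
    print(curve_suitability(0.35, depth_bp, depth_sc))  # halfway up the limb
    print(curve_suitability(2.0, depth_bp, depth_sc))   # beyond usable range
    ```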

  18. Interferometric meteor head echo observations using the Southern Argentina Agile Meteor Radar

    NASA Astrophysics Data System (ADS)

    Janches, D.; Hocking, W.; Pifko, S.; Hormaechea, J. L.; Fritts, D. C.; Brunini, C.; Michell, R.; Samara, M.

    2014-03-01

    A radar meteor echo is the radar scattering signature from the free electrons generated by the entry of extraterrestrial particles into the atmosphere. Three categories of scattering mechanisms exist: specular, nonspecular trails, and head echoes. Generally, there are two types of radars utilized to detect meteors. Traditional VHF all-sky meteor radars primarily detect the specular trails, while high-power, large-aperture (HPLA) radars efficiently detect meteor head echoes and, in some cases, nonspecular trails. The fact that head echo measurements can be performed only with HPLA radars limits these studies in several ways. HPLA radars are sensitive instruments constraining the studies to the lower masses, and these observations cannot be performed continuously because they take place at national observatories with limited allocated observing time. These drawbacks can be addressed by developing head echo observing techniques with modified all-sky meteor radars. Such systems would also permit simultaneous detection of all different scattering mechanisms using the same instrument, rather than requiring assorted different classes of radars, which can help clarify observed differences between the different methodologies. In this study, we demonstrate that such concurrent observations are now possible, enabled by the enhanced design of the Southern Argentina Agile Meteor Radar (SAAMER). The results presented here are derived from observations performed over a period of 12 days in August 2011 and include meteoroid dynamical parameter distributions, radiants, and estimated masses. Overall, the SAAMER's head echo detections appear to be produced by larger particles than those which have been studied thus far using this technique.

  19. Interferometric Meteor Head Echo Observations using the Southern Argentina Agile Meteor Radar (SAAMER)

    NASA Technical Reports Server (NTRS)

    Janches, D.; Hocking, W.; Pifko, S.; Hormaechea, J. L.; Fritts, D. C.; Brunini, C; Michell, R.; Samara, M.

    2013-01-01

    A radar meteor echo is the radar scattering signature from the free electrons in a plasma trail generated by the entry of extraterrestrial particles into the atmosphere. Three categories of scattering mechanisms exist: specular, nonspecular trails, and head-echoes. Generally, there are two types of radars utilized to detect meteors. Traditional VHF meteor radars (often called all-sky radars) primarily detect the specular reflection of meteor trails traveling perpendicular to the line of sight of the scattering trail, while High Power and Large Aperture (HPLA) radars efficiently detect meteor head-echoes and, in some cases, non-specular trails. The fact that head-echo measurements can be performed only with HPLA radars limits these studies in several ways. HPLA radars are very sensitive instruments constraining the studies to the lower masses, and these observations cannot be performed continuously because they take place at national observatories with limited allocated observing time. These drawbacks can be addressed by developing head echo observing techniques with modified all-sky meteor radars. In addition, the fact that the simultaneous detection of all different scattering mechanisms can be made with the same instrument, rather than requiring assorted different classes of radars, can help clarify observed differences between the different methodologies. In this study, we demonstrate that such concurrent observations are now possible, enabled by the enhanced design of the Southern Argentina Agile Meteor Radar (SAAMER) deployed at the Estacion Astronomica Rio Grande (EARG) in Tierra del Fuego, Argentina. The results presented here are derived from observations performed over a period of 12 days in August 2011, and include meteoroid dynamical parameter distributions, radiants and estimated masses. Overall, the SAAMER's head echo detections appear to be produced by larger particles than those which have been studied thus far using this technique.

  20. Spectroscopic Investigation of Materials for Frequency Agile Laser Systems.

    DTIC Science & Technology

    1985-01-01

    The fluorescence spectra and lifetimes of divalent Rh, Ru, Pt, and Ir ions in alkali halide crystals are measured using pulsed nitrogen laser excitation. (Spectroscopic Investigation of Materials for Frequency Agile Laser Systems; Principal Investigator: Richard C. Powell, Ph.D., Department of Physics, Oklahoma State University, Stillwater.)