Science.gov

Sample records for agile development methodologies

  1. Towards an Understanding of the Conceptual Underpinnings of Agile Development Methodologies

    NASA Astrophysics Data System (ADS)

    Nerur, Sridhar; Cannon, Alan; Balijepally, Venugopal; Bond, Philip

    While the growing popularity of agile development methodologies is undeniable, there has been little systematic exploration of their intellectual foundations. Such an effort would be an important first step in understanding this paradigm's underlying premises. This understanding, in turn, would be invaluable in our assessment of current practices as well as in our efforts to advance the field of software engineering. Drawing on a variety of sources, both within and outside the discipline, we argue that the concepts underpinning agile development methodologies are by no means novel. In the tradition of General Systems Theory, this paper advocates a transdisciplinary examination of agile development methodologies to extend the intellectual boundaries of software development. This is particularly important as the field moves beyond instrumental processes aimed at satisfying mere technical considerations.

  2. Integrating Low-Cost Rapid Usability Testing into Agile System Development of Healthcare IT: A Methodological Perspective.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    The development of more usable and effective healthcare information systems has become a critical issue. In the software industry, methodologies such as agile and iterative development have emerged to produce more effective and usable systems. These approaches emphasize user needs and promote iterative and flexible development practices. Evaluation and testing of iterative agile development cycles is considered an important part of the agile methodology and of iterative processes for system design and re-design. However, how to effectively integrate usability testing methods into rapid and flexible agile design cycles remains to be fully explored. In this paper we describe our application of an approach known as low-cost rapid usability testing as it has been applied within agile system development in healthcare. The advantages of the integrative approach are described, along with current methodological considerations. PMID:25991130

  4. Some Findings Concerning Requirements in Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases within the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this area is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and the management of requirements dependencies.

  5. The Impacts of Agile Development Methodology Use on Project Success: A Contingency View

    ERIC Educational Resources Information Center

    Tripp, John F.

    2012-01-01

    Agile Information Systems Development Methods have emerged in the past decade as an alternative manner of managing the work and delivery of information systems development teams, with a large number of organizations reporting the adoption and use of agile methods. The practitioners of these methods make broad claims as to the benefits of their…

  6. The development of a test of reactive agility for netball: a new methodology.

    PubMed

    Farrow, D; Young, W; Bruce, L

    2005-03-01

    The purpose of this study was to present a new methodology for the measurement of agility for netball that is considered more ecologically valid than previous agility tests. Specifically, the agility performance of highly skilled (n = 12), moderately skilled (n = 12) and lesser skilled players (n = 8) when responding to a life-size, interactive video display of a netball player initiating a pass was compared with a traditional, pre-planned agility movement where no external stimulus was present. The total movement times and decision times of the players were the primary dependent measures of interest. A second purpose of the research was to determine the test-retest reliability of the testing approach. Results revealed significant differences between the two test conditions, demonstrating that they were measuring different types of agility. The highly skilled group was significantly faster in both the reactive and planned test conditions relative to the lesser skilled group, while the moderately skilled group was significantly faster than the lesser skilled group in the reactive test condition. The decision time component within the reactive test condition revealed that the highly skilled players made significantly faster decisions than the lesser skilled players. It is reasoned that it is this decision-making component of reactive agility that contributes to the significant differences between the two test conditions. The testing approach was shown to have good test-retest reliability, with an intra-class correlation of r = .83. PMID:15887901
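
    The abstract reports test-retest reliability as an intra-class correlation but, naturally, does not show the computation. As a hedged illustration only, the sketch below computes ICC(2,1) (two-way random effects, absolute agreement, per Shrout and Fleiss) for invented movement-time data; the numbers are not the study's.

    ```python
    import numpy as np

    def icc_2_1(X: np.ndarray) -> float:
        """ICC(2,1): two-way random effects, absolute agreement (Shrout & Fleiss)."""
        n, k = X.shape                      # n subjects (rows) x k sessions (columns)
        grand = X.mean()
        ms_rows = k * ((X.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_cols = n * ((X.mean(axis=0) - grand) ** 2).sum() / (k - 1)
        ss_err = ((X - grand) ** 2).sum() - (n - 1) * ms_rows - (k - 1) * ms_cols
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    # Hypothetical test-retest movement times (s) for 8 players over 2 sessions
    times = np.array([[2.41, 2.38], [2.55, 2.60], [2.32, 2.35], [2.70, 2.66],
                      [2.48, 2.51], [2.60, 2.57], [2.44, 2.47], [2.52, 2.49]])
    print(f"ICC(2,1) = {icc_2_1(times):.2f}")
    ```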

  8. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of requirement changes at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  9. Teaching Agile Software Development: A Case Study

    ERIC Educational Resources Information Center

    Devedzic, V.; Milenkovic, S. R.

    2011-01-01

    This paper describes the authors' experience of teaching agile software development to students of computer science, software engineering, and other related disciplines, and comments on the implications of this and the lessons learned. It is based on the authors' eight years of experience in teaching agile software methodologies to various groups…

  10. Improving Global Development Using Agile

    NASA Astrophysics Data System (ADS)

    Avritzer, Alberto; Bronsard, Francois; Matos, Gilberto

    Global development promises important productivity and capability advantages over centralized work by optimally allocating tasks according to locality, expertise or cost. All too often, global development also introduces a different set of communication and coordination challenges that can negate all the expected benefits and even cause project failures. Most common problems have to do with building trust or quick feedback loops between distributed teams, or with the integration of globally developed components. Agile processes tend to emphasize the intensity of communication, and would seem to be negatively impacted by team distribution. In our experience, these challenges can be overcome, and agile processes can address some of the pitfalls of global development more effectively than plan-driven development. This chapter discusses how to address the difficulties faced when adapting agile processes to global development and the improvements to global development that adopting agile can produce.

  11. Contribution of Agility to Successful Distributed Software Development

    NASA Astrophysics Data System (ADS)

    Sarker, Saonee; Munson, Charles L.; Sarker, Suprateek; Chakraborty, Suranjan

    In recent times, both researchers and practitioners have touted agility as the latest innovation in distributed software development (DSD). In spite of this acknowledgement, there is little understanding of, and evidence surrounding, the effect of agility on distributed project success. This chapter reports on a study that examines practitioner views surrounding the relative importance of different sub-types of agility to DSD project success. Preliminary results indicate that practitioners view on-time completion of DSD projects and effective collaboration amongst stakeholders as the top two criteria of DSD project success, with lower emphasis on within-budget considerations. Among the many agility sub-types examined, people-based agility, communication-based agility, methodological agility, and time-based agility emerged as the most important for practitioners in terms of ensuring DSD project success.

  12. A Framework for Decomposition and Analysis of Agile Methodologies During Their Adaptation

    NASA Astrophysics Data System (ADS)

    Mikulenas, Gytenis; Kapocius, Kestutis

    In recent years there has been a steady increase of interest in Agile software development methodologies and techniques, which are often positioned as proven alternatives to the traditional plan-driven approaches. However, although there is no shortage of Agile methodologies to choose from, formal methods for actually choosing or adapting the right one are lacking. The aim of the presented research was to define a formal way of preparing Agile methodologies for adaptation and to create an adaptation process framework. We argue that Agile methodologies can be successfully broken down into individual parts that can be specified on three different levels and later analyzed with regard to problem/concern areas. The results of such decomposition can form the foundation for decisions on the adaptation of a specific Agile methodology. A case study is included in this chapter to further clarify the proposed approach.

  13. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method, and the current status of its incorporation into JPL development policies and processes.

  15. Supporting Agile Development of Authorization Rules for SME Applications

    NASA Astrophysics Data System (ADS)

    Bartsch, Steffen; Sohr, Karsten; Bormann, Carsten

    Custom SME applications for collaboration and workflow have become affordable when implemented as Web applications employing Agile methodologies. Security engineering remains difficult with Agile development, though: heavyweight processes put the improvements of Agile development at risk. We propose Agile security engineering and increased end-user involvement to improve Agile development with respect to authorization policy development. To support authorization policy development, we introduce a simple and readable authorization rules language implemented in a Ruby on Rails authorization plugin that is employed in a real-world SME collaboration and workflow application. We also report on early findings of the language’s use in authorization policy development with domain experts.
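
    The rules language described in the paper is a Ruby on Rails plugin and is not reproduced here. As a rough, hypothetical sketch of the declarative role-and-constraint style such languages take, here is a minimal Python analogue; all names (Role, permitted, the example roles and constraints) are invented for illustration.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Role:
        name: str
        # (action, resource_type) -> constraint predicate on (user, resource)
        grants: dict = field(default_factory=dict)

    RULES = {
        "employee": Role("employee", {
            ("read", "document"):
                lambda user, doc: doc["department"] == user["department"],
        }),
        "manager": Role("manager", {
            ("read", "document"): lambda user, doc: True,
            ("approve", "workflow_step"):
                lambda user, step: step["department"] == user["department"],
        }),
    }

    def permitted(user, action, resource_type, resource) -> bool:
        """Grant access if any role of the user carries a matching, satisfied rule."""
        return any(
            constraint(user, resource)
            for role in user["roles"]
            for (act, rtype), constraint in RULES[role].grants.items()
            if (act, rtype) == (action, resource_type)
        )

    alice = {"roles": ["manager"], "department": "sales"}
    print(permitted(alice, "approve", "workflow_step", {"department": "sales"}))  # True
    ```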

  16. Agile

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2013-01-01

    This presentation is based on a previous talk on agile development. Methods for delivering software on a short cycle are described, including interactions with the customer, the effect on the team, and how to be more effective, streamlined and efficient.

  17. An Agile Constructionist Mentoring Methodology for Software Projects in the High School

    ERIC Educational Resources Information Center

    Meerbaum-Salant, Orni; Hazzan, Orit

    2010-01-01

    This article describes the construction process and evaluation of the Agile Constructionist Mentoring Methodology (ACMM), a mentoring method for guiding software development projects in the high school. The need for such a methodology has arisen due to the complexity of mentoring software project development in the high school. We introduce the…

  18. Pilot users in agile development processes: motivational factors.

    PubMed

    Johannessen, Liv Karen; Gammon, Deede

    2010-01-01

    Despite a wealth of research on user participation, few studies offer insights into how to involve multi-organizational users in agile development methods. This paper is a case study of user involvement in developing a system for electronic laboratory requisitions using agile methodologies in a multi-organizational context. Building on an interpretive approach, we illuminate questions such as: How does collaboration between users and developers evolve and how might it be improved? What key motivational aspects are at play when users volunteer and continue contributing in the face of considerable added burdens? The study highlights how agile methods in themselves appear to facilitate mutually motivating collaboration between user groups and developers. Lessons learned for leveraging the advantages of agile development processes include acknowledging the substantial and ongoing contributions of users and their roles as co-designers of the system. PMID:20543366

  20. Value Creation by Agile Projects: Methodology or Mystery?

    NASA Astrophysics Data System (ADS)

    Racheva, Zornitza; Daneva, Maya; Sikkel, Klaas

    Business value is a key concept in agile software development approaches. This paper presents the results of a systematic review of literature on how business value is created by agile projects. We found that, with very few exceptions, most published studies take the concept of business value for granted and do not state what it means in general or in the specific study context. We could find no study which clearly indicates how exactly individual agile practices or groups of practices create value and keep accumulating it over time. The key implication for research is that we have an incentive to pursue the study of value creation in agile projects by deploying empirical research methods.

  1. Exploring the possibility of modeling a genetic counseling guideline using agile methodology.

    PubMed

    Choi, Jeeyae

    2013-01-01

    Increased demand for genetic counseling services has heightened the necessity of a computerized genetic counseling decision support system. In order to develop an effective and efficient computerized system, modeling of the genetic counseling guideline is an essential step. In this pilot study, Agile methodology with the Unified Modeling Language (UML) was utilized to model a guideline. Thirteen tasks and 14 associated elements were extracted. Successfully constructed conceptual class and activity diagrams revealed that Agile methodology with UML is a suitable tool for modeling a genetic counseling guideline.
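
    The paper's model is expressed as UML class and activity diagrams, which a listing cannot reproduce. As a loose, hypothetical analogue of modeling a counseling guideline as tasks with data elements and successor links, consider the Python sketch below; the task and element names are invented, not the paper's 13 tasks and 14 elements.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        elements: list      # data elements the task consumes or produces
        next: list          # names of possible successor tasks

    GUIDELINE = {
        "collect_family_history": Task("collect_family_history", ["pedigree"], ["assess_risk"]),
        "assess_risk": Task("assess_risk", ["pedigree", "risk_score"],
                            ["recommend_testing", "reassure"]),
        "recommend_testing": Task("recommend_testing", ["risk_score"], []),
        "reassure": Task("reassure", [], []),
    }

    def walk(start, choose=lambda task: task.next[:1]):
        """Traverse one path through the activity flow, printing each step."""
        name = start
        while name:
            task = GUIDELINE[name]
            print(f"{task.name}  (elements: {', '.join(task.elements) or '-'})")
            successors = choose(task)
            name = successors[0] if successors else None

    walk("collect_family_history")
    ```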

  2. Development of a Computer Program for Analyzing Preliminary Aircraft Configurations in Relationship to Emerging Agility Metrics

    NASA Technical Reports Server (NTRS)

    Bauer, Brent

    1993-01-01

    This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.
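
    The ACSYNT agility module itself is FORTRAN and not shown in the record. To make the kind of trade study described here concrete, the sketch below sweeps thrust-to-weight ratio and wing loading through one widely used agility-related measure, specific excess power; the aerodynamic constants (CD0, induced-drag factor k) and flight condition are assumed values for illustration, not ACSYNT data.

    ```python
    def specific_excess_power(tw, wing_loading, v, n=1.0,
                              rho=1.225, cd0=0.020, k=0.12):
        """Ps = V*(T - D)/W with a parabolic drag polar CD = CD0 + k*CL^2.

        tw: thrust-to-weight ratio; wing_loading: W/S in N/m^2;
        v: true airspeed in m/s; n: load factor.
        """
        q = 0.5 * rho * v * v                              # dynamic pressure (Pa)
        drag_over_weight = q * cd0 / wing_loading + k * n ** 2 * wing_loading / q
        return v * (tw - drag_over_weight)                 # m/s

    # Trade study in the spirit of the paper: agility vs. T/W and wing loading
    for tw in (0.6, 0.8, 1.0):
        for wl in (2500.0, 3500.0, 4500.0):                # N/m^2
            ps = specific_excess_power(tw, wl, v=150.0, n=5.0)
            print(f"T/W={tw:.1f}  W/S={wl:5.0f} N/m2  Ps={ps:7.1f} m/s")
    ```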

  3. Participatory Design Activities and Agile Software Development

    NASA Astrophysics Data System (ADS)

    Kautz, Karlheinz

    This paper contributes to the studies of design activities in information systems development. It provides a case study of a large agile development project and focuses on how customers and users participated in agile development and design activities in practice. The investigated project utilized the agile method eXtreme Programming. Planning games, user stories and story cards, working software, and acceptance tests structured the customer and user involvement. We found genuine customer and user involvement in the design activities in the form of both direct and indirect participation in the agile development project. The involved customer representatives played informative, consultative, and participative roles in the project. This led to their functional empowerment: the users were enabled to carry out their work to their own satisfaction and in an effective, efficient, and economical manner.

  4. Software Product Line Engineering Approach for Enhancing Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Martinez, Jabier; Diaz, Jessica; Perez, Jennifer; Garbajosa, Juan

    One of the main principles of Agile methodologies is the early and continuous delivery of valuable software in short, time-framed iterations. After each iteration, a working product is delivered according to the requirements defined at the beginning of the iteration. Testing tools facilitate the task of checking whether the system provides the expected behavior according to the specified requirements. However, since testing tools need to be adapted in order to test the new working product in each iteration, a significant effort has to be invested. This work presents a Software Product Line Engineering (SPLE) approach that allows testing tools to be flexibly adapted to the working products in an iterative way. A case study is also presented, using PLUM (Product Line Unified Modeller) as the tool suite for SPL implementation and management.

  5. Agile methods in biomedical software development: a multi-site experience report

    PubMed Central

    Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A

    2006-01-01

    Background: Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. Results: We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and included both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. Conclusion: We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods. PMID:16734914

  6. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Mainstream industry software development practice has moved from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  7. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  8. Peridigm summary report : lessons learned in development with agile components.

    SciTech Connect

    Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John; Parks, Michael L.

    2011-09-01

    This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture that impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.
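
    SimpleODE itself is not included in this listing, so the following is only a guess at its flavor: a minimal Python sketch of the component idea the report describes, in which a model is exposed through a narrow evaluator interface and a generic time integrator drives it without knowing its internals. The interface and the decay model are invented for illustration.

    ```python
    import numpy as np

    class DecayModel:
        """dy/dt = -lam * y; stands in for a component's residual/force evaluation."""
        def __init__(self, lam=0.5):
            self.lam = lam
        def initial_state(self):
            return np.array([1.0])
        def f(self, t, y):
            return -self.lam * y

    def integrate(model, t_end, dt):
        """Generic driver: talks to the model only through initial_state() and f()."""
        t, y = 0.0, model.initial_state()
        while t < t_end:
            y = y + dt * model.f(t, y)      # explicit Euler step
            t += dt
        return y

    print(integrate(DecayModel(), t_end=2.0, dt=0.01))  # ~exp(-1) = 0.368
    ```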

  9. Future Research in Agile Systems Development: Applying Open Innovation Principles Within the Agile Organisation

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Morgan, Lorraine

    A particular strength of agile approaches is that they move away from ‘introverted’ development and intimately involve the customer in all areas of development, supposedly leading to the development of a more innovative and hence more valuable information system. However, we argue that a single customer representative is too narrow a focus to adopt and that involvement of stakeholders beyond the software development itself is still often quite weak and in some cases non-existent. In response, we argue that current thinking regarding innovation in agile development needs to be extended to include multiple stakeholders outside the business unit. This paper explores the intra-organisational applicability and implications of open innovation in agile systems development. Additionally, it argues for a different perspective of project management that includes collaboration and knowledge-sharing with other business units, customers, partners, and other relevant stakeholders pertinent to the business success of an organisation, thus embracing open innovation principles.

  10. The NERV Methodology: Non-Functional Requirements Elicitation, Reasoning and Validation in Agile Processes

    ERIC Educational Resources Information Center

    Domah, Darshan

    2013-01-01

    Agile software development has become very popular around the world in recent years, with methods such as Scrum and Extreme Programming (XP). Literature suggests that functionality is the primary focus in Agile processes while non-functional requirements (NFR) are either ignored or ill-defined. However, for software to be of good quality both…

  11. Distributed agile software development for the SKA

    NASA Astrophysics Data System (ADS)

    Wicenec, Andreas; Parsons, Rebecca; Kitaeff, Slava; Vinsen, Kevin; Wu, Chen; Nelson, Paul; Reed, David

    2012-09-01

    The SKA software will most probably be developed by many groups distributed across the globe and coming from different backgrounds, such as industry and research institutions. The SKA software subsystems will have to cover a very wide range of different areas, but they still have to react and work together like a single system to achieve the scientific goals and satisfy the challenging data flow requirements. Designing and developing such a system in a distributed fashion requires proper tools and the setup of an environment that allows for efficient detection and tracking of interface and integration issues, in particular in a timely way. Agile development can provide much faster feedback mechanisms and also much tighter collaboration between the customer (scientist) and the developer. Continuous integration and continuous deployment, on the other hand, can provide much faster feedback of integration issues from the system level to the subsystem developers. This paper describes the results obtained from trialing a potential SKA development environment based on existing science software development processes like ALMA, the expected distribution of the groups potentially involved in the SKA development, and experience gained in the development of large scale commercial software projects.

  12. A Case Study of Coordination in Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Hole, Steinar; Moe, Nils Brede

    Global Software Development (GSD) has gained significant popularity as an emerging paradigm. Companies also show interest in applying agile approaches in distributed development to combine the advantages of both approaches. However, in their most radical forms, agile and GSD can be placed in each end of a plan-based/agile spectrum because of how work is coordinated. We describe how three GSD projects applying agile methods coordinate their work. We found that trust is needed to reduce the need of standardization and direct supervision when coordinating work in a GSD project, and that electronic chatting supports mutual adjustment. Further, co-location and modularization mitigates communication problems, enables agility in at least part of a GSD project, and renders the implementation of Scrum of Scrums possible.

  13. An agile implementation of SCRUM

    NASA Astrophysics Data System (ADS)

    Gannon, Michele

    Is Agile a way to cut corners? To some, the use of an Agile Software Development Methodology has a negative connotation: “Oh, you’re just not producing any documentation.” So can a team with no experience in Agile successfully implement and use SCRUM?

  14. Agile informatics: application of agile project management to the development of a personal health application.

    PubMed

    Chung, Jeanhee; Pankey, Evan; Norris, Ryan J

    2007-01-01

    We describe the application of the Agile method (a short iteration cycle, user-responsive, measurable software development approach) to the project management of a modular personal health record, iHealthSpace, to be deployed to the patients and providers of a large academic primary care practice. PMID:18694014

  16. Modern Enterprise Systems as Enablers of Agile Development

    NASA Astrophysics Data System (ADS)

    Fredriksson, Odd; Ljung, Lennart

    Traditional ES technology and traditional project management methods support and match each other, but they do not effectively support the critical success conditions for ES development. Although the findings from one case study of a successful modern ES change project are not strong empirical evidence, we carefully propose that new modern ES technology supports and matches agile project management methods. In other words, it provides the required flexibility that makes it possible to put into practice the agile way of running projects, both for the system supplier and for the customer. In addition, we propose that the combination of modern ES technology and agile project management methods is more appropriate for supporting the realization of critical success conditions for ES development. The main purpose of this chapter is to compare critical success conditions for modern enterprise systems development projects with critical success conditions for agile information systems development projects.

  17. Chaste: using agile programming techniques to develop computational biology software.

    PubMed

    Pitt-Francis, Joe; Bernabeu, Miguel O; Cooper, Jonathan; Garny, Alan; Momtahan, Lee; Osborne, James; Pathmanathan, Pras; Rodriguez, Blanca; Whiteley, Jonathan P; Gavaghan, David J

    2008-09-13

    Cardiac modelling is the area of physiome modelling where the available simulation software is perhaps most mature, and it therefore provides an excellent starting point for considering the software requirements for the wider physiome community. In this paper, we will begin by introducing some of the most advanced existing software packages for simulating cardiac electrical activity. We consider the software development methods used in producing codes of this type, and discuss their use of numerical algorithms, relative computational efficiency, usability, robustness and extensibility. We then go on to describe a class of software development methodologies known as test-driven agile methods and argue that such methods are more suitable for scientific software development than the traditional academic approaches. A case study is a project of our own, Cancer, Heart and Soft Tissue Environment, which is a library of computational biology software that began as an experiment in the use of agile programming methods. We present our experiences with a review of our progress thus far, focusing on the advantages and disadvantages of this new approach compared with the development methods used in some existing packages. We conclude by considering whether the likely wider needs of the cardiac modelling community are currently being met and suggest that, in order to respond effectively to changing requirements, it is essential that these codes should be more malleable. Such codes will allow for reliable extensions to include both detailed mathematical models (of the heart and other organs) and more efficient numerical techniques that are currently being developed by many research groups worldwide. PMID:18565813

  19. Lean and Agile Development of the AITS Ground Software System

    NASA Astrophysics Data System (ADS)

    Richters, Mark; Dutruel, Etienne; Mecredy, Nicolas

    2013-08-01

    We present the ongoing development of a new ground software system used for integrating, testing and operating spacecraft. The Advanced Integration and Test Services (AITS) project aims at providing a solution for electrical ground support equipment and mission control systems in future Astrium Space Transportation missions. Traditionally, ESA ground or flight software development projects are conducted according to a waterfall-like process as specified in the ECSS-E-40 standard promoted by ESA in the European industry. In AITS a decision was taken to adopt an agile development process. This work could serve as a reference for future ESA software projects willing to apply agile concepts.

  20. Developing a model for agile supply: an empirical study from Iranian pharmaceutical supply chain.

    PubMed

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

    Agility is a fundamental characteristic of a supply chain needed for survival in turbulent markets, where environmental forces create additional uncertainty resulting in higher risk in supply chain management. In addition, agility helps provide the right product at the right time to the consumer. The main goal of this research is therefore to improve supplier selection in the pharmaceutical industry according to the formative basic factors, so that a company can configure its supply network to achieve an agile supply chain. The present article analyzes the supply part of the supply chain based on the SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). The methodology provides an analytical model that enables potential suppliers to be assessed against multiple criteria using both quantitative and qualitative measures. In addition, to prioritize the critical factors, the TOPSIS algorithm has been used as a common multiple-attribute decision-making (MADM) technique. Finally, several factors such as delivery speed, planning and reorder segmentation, trust development and material quantity adjustment are identified and prioritized as critical factors for being agile in the supply of API. PMID:24250689
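
    The abstract names TOPSIS but, as an abstract, gives no worked detail. The sketch below shows the standard TOPSIS steps (vector normalization, weighting, distances to the ideal and anti-ideal alternatives, closeness coefficient) on invented supplier scores; the criteria, weights and numbers are hypothetical, not the study's data.

    ```python
    import numpy as np

    def topsis(matrix, weights, benefit):
        """Closeness of each alternative to the ideal solution (higher is better).

        matrix: alternatives x criteria; weights: criterion weights summing to 1;
        benefit: True where larger is better, False where smaller is better.
        """
        norm = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize columns
        v = norm * weights
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        return d_neg / (d_pos + d_neg)

    # Hypothetical API suppliers scored on delivery speed, quality, unit cost
    scores = np.array([[8.0, 7.0, 120.0],
                       [6.0, 9.0, 100.0],
                       [9.0, 6.0, 150.0]])
    weights = np.array([0.4, 0.4, 0.2])
    benefit = np.array([True, True, False])              # cost: smaller is better
    print(topsis(scores, weights, benefit))              # rank suppliers by score
    ```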

  3. Agile Development Processes: Delivering a Successful Data Management Platform Now and in the Future

    NASA Astrophysics Data System (ADS)

    Deaubl, E.; Lowry, S.

    2007-10-01

    Developing a flexible, extensible architecture for scientific data archival and management is a monumental task under older, big-design-up-front methodologies. We will describe how we are using agile development techniques in our service-oriented architecture (SOA) based platform to integrate astronomer and operator input into the development process, deliver functional software earlier, and ensure that the software is maintainable and extensible in the future.

  4. A Capstone Course on Agile Software Development Using Scrum

    ERIC Educational Resources Information Center

    Mahnic, V.

    2012-01-01

    In this paper, an undergraduate capstone course in software engineering is described that not only exposes students to agile software development, but also makes it possible to observe the behavior of developers using Scrum for the first time. The course requires students to work as Scrum Teams, responsible for the implementation of a set of user…

  5. An Agile Methodology for Implementing Service-Oriented Architecture in Small and Medium Sized Organizations

    ERIC Educational Resources Information Center

    Laidlaw, Gregory

    2013-01-01

    The purpose of this study is to evaluate the use of Lean/Agile principles, using action research to develop and deploy new technology for Small and Medium sized enterprises. The research case was conducted at the Lapeer County Sheriff's Department and involves the initial deployment of a Service Oriented Architecture to alleviate the data…

  6. Impact of Agile Software Development Model on Software Maintainability

    ERIC Educational Resources Information Center

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  7. Agile Software Development Methods: A Comparative Review

    NASA Astrophysics Data System (ADS)

    Abrahamsson, Pekka; Oza, Nilay; Siponen, Mikko T.

    Although agile software development methods have caught the attention of software engineers and researchers worldwide, scientific research on them remains quite scarce. The aim of this study is to order and make sense of the different agile approaches that have been proposed. This comparative review is performed from the standpoint of using the following features as the analytical perspectives: project management support, life-cycle coverage, type of practical guidance, adaptability in actual use, type of research objectives and existence of empirical evidence. The results show that agile software development methods cover, without offering any rationale, different phases of the software development life-cycle and that most of these methods fail to provide adequate project management support. Moreover, quite a few methods continue to offer little concrete guidance on how to use their solutions or how to adapt them in different development situations. Empirical evidence after ten years of application remains quite limited. Based on the results, new directions for agile methods are outlined.

  8. Implementation of an agile maintenance mechanic assignment methodology

    NASA Astrophysics Data System (ADS)

    Jimenez, Jesus A.; Quintana, Rolando

    2000-10-01

    The objective of this research was to develop a decision support system (DSS) to study the impact of introducing new equipment into a medical apparel plant from a maintenance organizational structure perspective. This system will enable the company to determine whether its capacity is sufficient to meet current maintenance challenges. The DSS contains two database sets that describe equipment and maintenance-resource profiles. The equipment profile specifies data such as mean time to failure, mean time to repair, and the minimum mechanic skill level required to fix each machine group. Similarly, the maintenance-resource profile reports information about the mechanic staff, such as the number and type of certifications received, education level, and experience. The DSS then uses this information to minimize machine downtime by assigning the highest skilled mechanics to machines with higher complexity and product value. A modified version of the simplex method for the transportation problem was used to perform the optimization. The DSS was built using the Visual Basic for Applications (VBA) language contained in the Microsoft Excel environment. A case study was developed from existing data. The analysis consisted of forty-two machine groups and six mechanic categories with ten skill levels. Results showed that only 56% of the mechanic workforce was utilized; thus, the company had resources available to meet future maintenance requirements.
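
    The study's cost data are not given in the abstract, so the sketch below only illustrates the optimization core under assumptions: a balanced transportation problem, solved as a linear program with scipy.optimize.linprog, that assigns mechanic-hours by skill category to machine groups at minimum expected downtime cost. The instance is downsized (3 categories and 4 groups instead of 6 and 42) and the costs are invented.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # cost[i, j]: downtime cost of assigning category-i hours to machine group j
    cost = np.array([[4.0, 6.0, 9.0, 8.0],
                     [3.0, 4.0, 6.0, 7.0],
                     [2.0, 3.0, 3.0, 5.0]])
    supply = np.array([10.0, 8.0, 6.0])        # mechanic-hours available per category
    demand = np.array([7.0, 5.0, 6.0, 6.0])    # mechanic-hours required per group

    m, n = cost.shape
    A_eq, b_eq = [], []
    for i in range(m):                          # each category's hours fully assigned
        row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1.0
        A_eq.append(row); b_eq.append(supply[i])
    for j in range(n):                          # each group's requirement exactly met
        row = np.zeros(m * n); row[j::n] = 1.0
        A_eq.append(row); b_eq.append(demand[j])

    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
    print(res.x.reshape(m, n))                  # optimal hours per (category, group)
    ```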

  9. An Agile Course-Delivery Approach

    ERIC Educational Resources Information Center

    Capellan, Mirkeya

    2009-01-01

    In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…

  10. Positioning Agility

    NASA Astrophysics Data System (ADS)

    Oza, Nilay; Abrahamsson, Pekka; Conboy, Kieran

    Agile methods are increasingly adopted by European companies, and academics are conducting numerous studies on different tenets of agile methods. Companies often take pride in marketing themselves as ‘agile’. However, the true notion of ‘being agile’ seems to have been overlooked due to a lack of positioning of oneself for agility. This raises a call for more research and interaction between academia and industry. The proposed workshop responds to this call. It will be highly relevant to participants interested in positioning their company’s agility from organizational, group or project perspectives. The positioning of agility will help companies better align their agile practices with stakeholder values. Results of the workshop will be shared among participants, who will also have the opportunity to continue their work on agile positioning in their companies. At a broader level, the work done in this workshop will contribute towards developing an Agile Positioning System.

  11. Effective speed and agility conditioning methodology for random intermittent dynamic type sports.

    PubMed

    Bloomfield, Jonathan; Polman, Remco; O'Donoghue, Peter; McNaughton, Lars

    2007-11-01

    Different coaching methods are often used to improve performance. This study compared the effectiveness of 2 methodologies for speed and agility conditioning for random, intermittent, and dynamic activity sports (e.g., soccer, tennis, hockey, basketball, rugby, and netball) and the necessity for specialized coaching equipment. Two groups were delivered either a programmed method (PC) or a random method (RC) of conditioning with a third group receiving no conditioning (NC). PC participants used the speed, agility, quickness (SAQ) conditioning method, and RC participants played supervised small-sided soccer games. PC was also subdivided into 2 groups where participants either used specialized SAQ equipment or no equipment. A total of 46 (25 males and 21 females) untrained participants received (mean ± SD) 12.2 ± 2.1 hours of physical conditioning over 6 weeks between a battery of speed and agility parameter field tests. Two-way analysis of variance results indicated that both conditioning groups showed a significant decrease in body mass and body mass index, although PC achieved significantly greater improvements on acceleration, deceleration, leg power, dynamic balance, and the overall summation of % increases when compared to RC and NC (p < 0.05). PC in the form of SAQ exercises appears to be a superior method for improving speed and agility parameters; however, this study found that specialized SAQ equipment was not a requirement to observe significant improvements. Further research is required to establish whether these benefits transfer to sport-specific tasks as well as to the underlying mechanisms resulting in improved performance.

  12. A Review of Agile and Lean Manufacturing as Issues in Selected International and National Research and Development Programs and Roadmaps

    ERIC Educational Resources Information Center

    Castro, Helio; Putnik, Goran D.; Shah, Vaibhav

    2012-01-01

    Purpose: The aim of this paper is to analyze international and national research and development (R&D) programs and roadmaps for the manufacturing sector, presenting how agile and lean manufacturing models are addressed in these programs. Design/methodology/approach: In this review, several manufacturing research and development programs and…

  13. A Roadmap for using Agile Development in a Traditional System

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas

    2006-01-01

    I. Ensemble Development Group: a) Produces activity planning software for spacecraft; b) Built on the Eclipse Rich Client Platform (open source development and runtime software); c) Funded by multiple sources including the Mars Technology Program; d) Incorporated the use of Agile Development. II. Next Generation Uplink Planning System: a) Researches the Activity Planning and Sequencing Subsystem (APSS) for the Mars Science Laboratory; b) APSS includes Ensemble, Activity Modeling, Constraint Checking, Command Editing and Sequencing tools, plus other uplink generation utilities; c) Funded by the Mars Technology Program; d) Integrates all of the tools for APSS.

  14. Development of EarthCube Governance: An Agile Approach

    NASA Astrophysics Data System (ADS)

    Pearthree, G.; Allison, M. L.; Patten, K.

    2013-12-01

    Governance of geosciences cyberinfrastructure is a complex and essential undertaking, critical in enabling distributed knowledge communities to collaborate and communicate across disciplines, distances, and cultures. Advancing science with respect to ‘grand challenges’, such as global climate change, weather prediction, and core fundamental science, depends not just on technical cyber systems, but also on social systems for strategic planning, decision-making, project management, learning, teaching, and building a community of practice. Simply put, a robust, agile technical system depends on an equally robust and agile social system. Cyberinfrastructure development is wrapped in social, organizational and governance challenges, which may significantly impede progress. An agile development process is underway for governance of transformative investments in geosciences cyberinfrastructure through the NSF EarthCube initiative. Agile development is iterative and incremental, and promotes adaptive planning and rapid and flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness. A project Secretariat acts as the coordinating body, carrying out duties for planning, organizing, communicating, and reporting. A broad coalition of stakeholder groups comprises an Assembly (Mainstream Scientists, Cyberinfrastructure Institutions, Information Technology/Computer Sciences, NSF EarthCube Investigators, Science Communities, EarthCube End-User Workshop Organizers, Professional Societies) to serve as a preliminary venue for identifying, evaluating, and testing potential governance models. To offer an opportunity for broader end-user input, a crowd-source approach will engage stakeholders not otherwise involved. An Advisory Committee from the Earth, ocean, atmosphere, social, computer and library sciences is guiding the process from a high-level policy point of view. Developmental…

  15. Towards a Framework for Using Agile Approaches in Global Software Development

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Ali Babar, Muhammad; Verner, June

    As agile methods and Global Software Development (GSD) become increasingly popular, GSD project managers have been exploring the viability of using agile approaches in their development environments. Despite the expected benefits of using an agile approach in a GSD project, the mechanisms for combining the two approaches are not clearly understood. To address this challenge, we propose a conceptual framework based on the research literature. This framework is expected to aid a project manager in deciding which agile strategies are effective for a particular GSD project, taking into account project context. We use an industry-based case study to explore the components of our conceptual framework. Our case study was planned and conducted according to specific published case study guidelines. We identify the agile practices and agile supporting practices used by a GSD project manager in our case study and conclude with future research directions.

  16. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  17. How Can Agile Practices Minimize Global Software Development Co-ordination Risks?

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Babar, Muhammad Ali; Verner, June

    The distribution of project stakeholders in Global Software Development (GSD) projects poses significant risks to project communication, coordination and control processes. There is growing interest in applying agile practices in GSD projects in order to leverage the advantages of both approaches. In some cases, GSD project managers use agile practices to reduce project distribution challenges. We use an existing coordination framework to identify GSD coordination problems due to temporal, geographical and socio-cultural distances. An industry-based case study is used to describe, explore and explain the use of agile practices to reduce development coordination challenges.

  18. Agile Project Management for e-Learning Developments

    ERIC Educational Resources Information Center

    Doherty, Iain

    2010-01-01

    We outline the project management tactics that we developed in praxis in order to manage elearning projects and show how our tactics were enhanced through implementing project management techniques from a formal project management methodology. Two key factors have contributed to our project management success. The first is maintaining a clear…

  19. Adopting best practices: "Agility" moves from software development to healthcare project management.

    PubMed

    Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge

    2006-01-01

    It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments. PMID:16554690

  1. Agile Walker.

    PubMed

    Katz, Reuven

    2015-01-01

    The goal of the Agile Walker is to improve the outdoor mobility of healthy elderly people with some mobility limitations. It is a newly developed, all-terrain walker, equipped with an electric drive system and speed control, that can assist elderly people in walking outdoors or hiking. The walker has a unique product design with an attractive look that will appeal to the "active-agers" population. This paper describes the product design requirements and the development process of the Agile Walker, its features and some preliminary testing results.

  2. Agile enterprise development framework utilizing services principles for building pervasive security

    NASA Astrophysics Data System (ADS)

    Farroha, Deborah; Farroha, Bassam

    2011-06-01

    We are in an environment of continuously changing mission requirements, and therefore our Information Systems must adapt to accomplish new tasks more quickly and proficiently. Agility is the only way we will be able to keep up with this change. But there are subtleties that must be considered as we adopt various agile methods: secure, protect, control and authenticate are all elements needed to posture our Information Technology systems to counteract the real and perceived threats in today's environment. Many systems have been tasked to ingest, process and analyze different data sets than they were originally designed for, and they have to interact with multiple new systems that were unaccounted for at design time. Leveraging the tenets of security, we have devised a new framework that takes agility into a new realm: the product is built to work in a service-based environment but is developed using agile processes. Even though these two approaches promise to hone the development effort, they contradict each other in philosophy: Services require stable interfaces, while Agile focuses on being flexible and tolerating changes up to much later stages of development. This framework is focused on enabling successful product development that capitalizes on both philosophies.
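
    To make the stable-interface-versus-flexible-implementation tension concrete, here is a minimal sketch (ours, not the authors'; all names are invented) of a service contract that stays fixed while the implementation behind it is iterated agilely:

      # Sketch: callers depend only on the stable contract; each agile
      # iteration may replace the implementation without breaking them.
      from abc import ABC, abstractmethod

      class AuthService(ABC):
          """Stable, versioned service interface (the 'Services' half)."""
          @abstractmethod
          def authenticate(self, user: str, token: str) -> bool: ...

      class AuthServiceIteration1(AuthService):
          """Iteration 1: a static allow-list, good enough to ship and test."""
          def __init__(self, allowed: dict[str, str]):
              self._allowed = allowed

          def authenticate(self, user: str, token: str) -> bool:
              return self._allowed.get(user) == token

      svc: AuthService = AuthServiceIteration1({"alice": "s3cret"})
      print(svc.authenticate("alice", "s3cret"))  # True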

  3. Applying Agile Methods to Weapon/Weapon-Related Software

    SciTech Connect

    Adams, D; Armendariz, M; Blackledge, M; Campbell, F; Cloninger, M; Cox, L; Davis, J; Elliott, M; Granger, K; Hans, S; Kuhn, C; Lackner, M; Loo, P; Matthews, S; Morrell, K; Owens, C; Peercy, D; Pope, G; Quirk, R; Schilling, D; Stewart, A; Tran, A; Ward, R; Williamson, M

    2007-05-02

    This white paper provides information and guidance to the Department of Energy (DOE) sites on Agile software development methods and the impact of their application on weapon/weapon-related software development. The purpose of this white paper is to provide an overview of Agile methods, examine the accepted interpretations/uses/practices of these methodologies, and discuss the applicability of Agile methods with respect to Nuclear Weapons Complex (NWC) Technical Business Practices (TBPs). It also provides recommendations on the application of Agile methods to the development of weapon/weapon-related software.

  4. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  5. Insights into Global Health Practice from the Agile Software Development Movement.

    PubMed

    Flood, David; Chary, Anita; Austad, Kirsten; Diaz, Anne Kraemer; García, Pablo; Martinez, Boris; Canú, Waleska López; Rohloff, Peter

    2016-01-01

    Global health practitioners may feel frustration that current models of global health research, delivery, and implementation are overly focused on specific interventions, slow to provide health services in the field, and relatively ill-equipped to adapt to local contexts. Adapting design principles from the agile software development movement, we propose an analogous approach to designing global health programs that emphasizes tight integration between research and implementation, early involvement of ground-level health workers and program beneficiaries, and rapid cycles of iterative program improvement. Using examples from our own fieldwork, we illustrate the potential of 'agile global health' and reflect on the limitations, trade-offs, and implications of this approach.

  6. Analysis and optimization of preliminary aircraft configurations in relationship to emerging agility metrics

    NASA Technical Reports Server (NTRS)

    Sandlin, Doral R.; Bauer, Brent Alan

    1993-01-01

    This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.
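
    As a flavor of the kind of trade such a module evaluates, the sketch below (our illustration in Python, not code from the FORTRAN module) computes a classic agility-related quantity, instantaneous turn rate, from wing loading via the lift-limited load factor; all aircraft numbers are invented:

      import math

      g = 9.81               # m/s^2
      V = 150.0              # true airspeed, m/s
      CL_max = 1.6           # maximum lift coefficient
      rho = 1.0              # air density, kg/m^3
      wing_loading = 3500.0  # W/S, N/m^2

      # Lift-limited load factor, then level turn rate omega = g*sqrt(n^2-1)/V.
      n = 0.5 * rho * V**2 * CL_max / wing_loading
      omega = g * math.sqrt(n**2 - 1) / V
      print(f"n = {n:.2f} g, turn rate = {math.degrees(omega):.1f} deg/s")

    Lowering the wing loading raises the attainable load factor and hence the turn rate, which is exactly the sort of sensitivity a trade study sweeps.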

  7. Agile in Large-Scale Development Workshop: Coaching, Transitioning and Practicing

    NASA Astrophysics Data System (ADS)

    Nilsson, Thomas; Larsson, Andreas

    Agile in large-scale and complex development presents its own set of problems, both how to practice, transition and coaching. This workshop aims at bringing persons interested in this topic together to share tools, techniques and insights. The workshop will follow the increasingly popular “lightning talk + open space” format.

  8. Development methodology for scientific software

    SciTech Connect

    Cort, G.; Goldstone, J.A.; Nelson, R.O.; Poore, R.V.; Miller, L.; Barrus, D.M.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed.

  9. Ramping up for agility: Development of a concurrent engineering communications infrastructure

    SciTech Connect

    Forsythe, C.; Ashby, M.R.

    1995-09-01

    A-PRIMED (Agile Product Realization for Innovative Electro MEchanical Devices) demonstrated new product development in 24 days, accompanied by improved product quality, through agility-enabling technologies. A concurrent engineering communications infrastructure was developed that provided electronic data communications, information access, enterprise integration of computers and applications, and collaborative work tools. This paper describes how A-PRIMED did it through attention to technologies, processes, and people.

  10. Collaboration, Communication and Co-ordination in Agile Software Development Practice

    NASA Astrophysics Data System (ADS)

    Robinson, Hugh; Sharp, Helen

    This chapter analyses the results of a series of observational studies of agile software development teams, identifying commonalities in collaboration, co-ordination and communication activities. Pairing and customer collaboration are focussed on to illustrate the nature of collaboration and communication, as are two simple physical artefacts that emerged through analysis as being an information-rich focal point for the co-ordination of collaboration and communication activities. The analysis shows that pairing has common characteristics across all teams, while customer collaboration differs between the teams depending on the application and organisational context of development.

  11. Agile rediscovering values: Similarities to continuous improvement strategies

    NASA Astrophysics Data System (ADS)

    Díaz de Mera, P.; Arenas, J. M.; González, C.

    2012-04-01

    Research in the late 1980s on technological companies that develop products of high innovation value, with sufficient speed and flexibility to adapt quickly to changing market conditions, gave rise to the new set of methodologies known as the Agile Management Approach. In the current changing economic scenario, we considered it very interesting to study the similarities of these Agile Methodologies with other practices whose effectiveness has been amply demonstrated in both the West and Japan. Strategies such as Kaizen, Lean, World Class Manufacturing and Concurrent Engineering are analyzed to check the values they have in common with the Agile Approach.

  13. Towards a unified approach to the design of knowledge based agile manufacturing systems: Part 1 - methodology

    SciTech Connect

    Jones, A.H.; Uzam, M.

    1996-12-31

    To date there are no general techniques available to design Knowledge Based Discrete Event Control Systems. In this paper, a new technique is proposed which solves the problem. The generality of the technique means that the method can be applied to any complex (multi-component) Discrete Event Control problem and can easily accommodate diagnostics and reconfiguration. The technique involves, firstly, defining the complex Discrete Event Control system as a colored Petri net controller; then converting the colored Petri net controller into a colored Token Passing Logic Controller via the Token Passing Logic (TPL) technique; and finally, representing the colored Token Passing Logic Controller as rules within a control knowledge base for use within a concurrent inference engine. The technique is described by considering the fundamental structures inherent in colored Petri net control design and showing how to convert these structures into a knowledge base suitable for Discrete Event Control. Moreover, a context-sensitive concurrent inference engine is also proposed to ensure the correct processing of the control knowledge base. An illustrative example of how this methodology can be applied to a complex discrete event control problem is described in Part II.
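
    A toy illustration of the central conversion, using our own invented place and transition names rather than anything from the paper: a Petri-net transition becomes an if-then rule over the current marking, which is the form in which it would sit in the control knowledge base:

      # A marking assigns token counts to places of the net.
      marking = {"part_waiting": 1, "machine_idle": 1, "machine_busy": 0}

      def fire_start_machining(m: dict) -> bool:
          # Rule form of one transition: preconditions test input places,
          # the action moves tokens; returns whether the rule fired.
          if m["part_waiting"] >= 1 and m["machine_idle"] >= 1:
              m["part_waiting"] -= 1
              m["machine_idle"] -= 1
              m["machine_busy"] += 1
              return True
          return False

      print(fire_start_machining(marking), marking)
      # -> True {'part_waiting': 0, 'machine_idle': 0, 'machine_busy': 1}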

  14. Combining Agile and Traditional: Customer Communication in Distributed Environment

    NASA Astrophysics Data System (ADS)

    Korkala, Mikko; Pikkarainen, Minna; Conboy, Kieran

    Distributed development is a rapidly growing phenomenon in modern software development environments. At the same time, traditional and agile methodologies, and combinations of the two, are being used in industry. Agile approaches place a large emphasis on customer communication; however, existing knowledge on customer communication in distributed agile development seems to be lacking. In order to shed light on this topic and provide practical guidelines for companies in distributed agile environments, a qualitative case study was conducted in a large, globally distributed software company. The key finding was that it might be difficult for an agile organization to get relevant information from a traditional type of customer organization, even though customer communication was reported to be active and carried out via multiple different communication media. Several challenges discussed in this paper amount to an "information blackout", indicating the importance of an environment fostering meaningful communication. To evaluate whether this environment can be created, a set of guidelines is proposed.

  15. Accelerating Software Development through Agile Practices--A Case Study of a Small-Scale, Time-Intensive Web Development Project at a College-Level IT Competition

    ERIC Educational Resources Information Center

    Zhang, Xuesong; Dorn, Bradley

    2012-01-01

    Agile development has received increasing interest both in industry and academia due to its benefits in developing software quickly, meeting customer needs, and keeping pace with the rapidly changing requirements. However, agile practices and scrum in particular have been mainly tested in mid- to large-size projects. In this paper, we present…

  16. Advancing cancer drug discovery towards more agile development of targeted combination therapies.

    PubMed

    Carragher, Neil O; Unciti-Broceta, Asier; Cameron, David A

    2012-01-01

    Current drug-discovery strategies are typically 'target-centric' and are based upon high-throughput screening of large chemical libraries against nominated targets and a selection of lead compounds with optimized 'on-target' potency and selectivity profiles. However, the high attrition of targeted agents in clinical development suggests that combinations of targeted agents will be most effective in treating solid tumors if the biological networks that permit cancer cells to subvert monotherapies are identified and retargeted. Conventional drug-discovery and development strategies are suboptimal for the rational design and development of novel drug combinations. In this article, we highlight a series of emerging technologies supporting a less reductionist, more agile drug-discovery and development approach for the rational design, validation, prioritization and clinical development of novel drug combinations.

  17. ISS Double-Gimbaled CMG Subsystem Simulation Using the Agile Development Method

    NASA Technical Reports Server (NTRS)

    Inampudi, Ravi

    2016-01-01

    This paper presents an evolutionary approach to simulating a cluster of 4 Control Moment Gyros (CMG) on the International Space Station (ISS), using a common sense approach (the agile development method) for concurrent mathematical modeling and simulation of the CMG subsystem. This simulation is part of the Training Systems for the 21st Century simulator, which will provide training for crew members, instructors, and flight controllers. The basic idea of how the CMGs on the space station are used for its non-propulsive attitude control is briefly explained to set up the context for simulating a CMG subsystem. Next, different reference frames and the detailed equations of motion (EOM) for multiple double-gimbal variable-speed control moment gyroscopes (DGVs) are presented. Fixing some of the terms in the EOM yields the special-case EOM for the ISS's double-gimbaled fixed-speed CMGs. CMG simulation development using the agile development method is presented, in which the customer's requirements and solutions evolve through iterative analysis, design, coding, unit testing and acceptance testing. At the end of each iteration, the set of features implemented in that iteration is demonstrated to the flight controllers, creating a short feedback loop and helping to create adaptive development cycles. The unified modeling language (UML) tool is used in illustrating the user stories, class designs and sequence diagrams. This incremental approach to mathematical modeling and simulating the CMG subsystem involved the development team and the customer early on, improving the quality of the working CMG system in each iteration and helping the team to accurately predict the cost, schedule and delivery of the software.
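
    For a feel of the modeling involved, here is a heavily simplified, illustrative integration step for a single fixed-speed gimbal; the paper's EOM cover four coupled double-gimbal units, and the momentum value below is only a roughly ISS-class placeholder:

      import numpy as np

      h = 4760.0          # rotor angular momentum, N*m*s (roughly ISS-class)
      delta = 0.0         # gimbal angle, rad
      delta_rate = 0.01   # commanded gimbal rate, rad/s
      dt = 0.1            # integration step, s

      for step in range(5):
          delta += delta_rate * dt
          # Output torque is the rate of change of the rotating momentum
          # vector: magnitude h * delta_rate, direction normal to it.
          tau = h * delta_rate * np.array([-np.sin(delta), np.cos(delta), 0.0])
          print(f"t={(step + 1) * dt:.1f} s  |tau| = {np.linalg.norm(tau):.1f} N*m")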

  18. Reactive Agility Performance in Handball; Development and Evaluation of a Sport-Specific Measurement Protocol

    PubMed Central

    Spasic, Miodrag; Krolo, Ante; Zenic, Natasa; Delextrat, Anne; Sekulic, Damir

    2015-01-01

    There is no current study that has examined sport-specific tests of reactive-agility and change-of-direction-speed (CODS) designed to replicate the real-sport environment in handball (team handball). This investigation evaluated the reliability and validity of two novel tests designed to assess reactive-agility and CODS of handball players. Participants were female (25.14 ± 3.71 years of age; 1.77 ± 0.09 m and 74.1 ± 6.1 kg) and male handball players (26.9 ± 4.1 years of age; 1.90 ± 0.09 m and 93.90 ± 4.6 kg). Variables included body height, body mass, body mass index, broad jump, 5-m sprint, CODS and reactive-agility tests. Results showed satisfactory reliability for the reactive-agility test and CODS test (ICC of 0.85-0.93, and CV of 2.4-4.8%). The reactive-agility and CODS shared less than 20% of the common variance. The calculated index of perceptual and reactive capacity (P&RC; the ratio between reactive-agility and CODS performance) is found to be a valid measure in defining true-game reactive-agility performance in handball for both genders. Therefore, the handball athletes' P&RC should be used in the evaluation of real-game reactive-agility performance. Future studies should explore other sport-specific reactive-agility tests and factors associated to such performance in sports involving agile maneuvers. Key points Reactive agility and change-of-direction-speed should be observed as independent qualities, even when tested over the same course and similar movement template The reactive-agility-performance of the handball athletes involved in defensive duties is closer to their non-reactive-agility-score than in their peers who are not involved in defensive duties The handball specific “true-game” reactive-agility-performance should be evaluated as the ratio between reactive-agility and corresponding CODS performance. PMID:26336335
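
    The P&RC index itself is simple arithmetic; a minimal sketch (function name ours, example times invented):

      def prc_index(reactive_agility_s: float, cods_s: float) -> float:
          """Reactive-agility time divided by CODS time over the same course.

          A ratio close to 1.0 means the athlete loses little time to the
          perceptual/decision-making component of the reactive task.
          """
          if cods_s <= 0:
              raise ValueError("CODS time must be positive")
          return reactive_agility_s / cods_s

      # Example: 2.10 s on the reactive course vs 1.80 s on the pre-planned
      # CODS course using the same movement template.
      print(f"P&RC = {prc_index(2.10, 1.80):.2f}")  # -> P&RC = 1.17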

  20. Development of perceived competence, tactical skills, motivation, technical skills, and speed and agility in young soccer players.

    PubMed

    Forsman, Hannele; Gråstén, Arto; Blomqvist, Minna; Davids, Keith; Liukkonen, Jarmo; Konttinen, Niilo

    2016-07-01

    The objective of this 1-year, longitudinal study was to examine the development of perceived competence, tactical skills, motivation, technical skills, and speed and agility characteristics of young Finnish soccer players. We also examined associations between latent growth models of perceived competence and other recorded variables. Participants were 288 competitive male soccer players ranging from 12 to 14 years (12.7 ± 0.6) from 16 soccer clubs. Players completed the self-assessments of perceived competence, tactical skills, and motivation, and participated in technical, and speed and agility tests. Results of this study showed that players' levels of perceived competence, tactical skills, motivation, technical skills, and speed and agility characteristics remained relatively high and stable across the period of 1 year. Positive relationships were found between these levels and changes in perceived competence and motivation, and levels of perceived competence and speed and agility characteristics. Together these results illustrate the multi-dimensional nature of talent development processes in soccer. Moreover, it seems crucial in coaching to support the development of perceived competence and motivation in young soccer players and that it might be even more important in later maturing players. PMID:26708723

  1. An Approach for Prioritizing Agile Practices for Adaptation

    NASA Astrophysics Data System (ADS)

    Mikulenas, Gytenis; Kapocius, Kestutis

    Agile software development approaches offer a strong alternative to the traditional plan-driven methodologies, which have not been able to guarantee the success of software projects. However, the move toward Agile is often hampered by the wealth of alternative practices accompanied by numerous success or failure stories. Clearly, formal methods for choosing the most suitable practices are lacking. In this chapter, we present an overview of this problem and propose an approach for prioritizing available practices according to the particular circumstances. The proposal combines ideas from the Analytic Hierarchy Process (AHP) decision-making technique, cost-value analysis, and the Rule-Description-Practice (RDP) technique. The assumption that such an approach could facilitate the Agile adaptation process was supported by a case study illustrating the process of choosing the most suitable Agile practices within a real-life project.
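
    As an illustration of the AHP ingredient only (the proposal also combines cost-value analysis and RDP, which this sketch omits; the practices and pairwise judgments below are invented): priority weights come from the principal eigenvector of a pairwise-comparison matrix.

      import numpy as np

      practices = ["pair programming", "daily stand-up", "TDD"]
      # Saaty-scale judgments: entry [i][j] = how strongly practice i is
      # preferred to practice j (reciprocal matrix).
      A = np.array([
          [1.0, 3.0, 0.5],
          [1/3, 1.0, 0.25],
          [2.0, 4.0, 1.0],
      ])
      eigvals, eigvecs = np.linalg.eig(A)
      w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      w = w / w.sum()  # normalize principal eigenvector to priority weights
      for practice, weight in sorted(zip(practices, w), key=lambda t: -t[1]):
          print(f"{practice}: {weight:.2f}")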

  2. Investigation into the impact of agility on conceptual fighter design

    NASA Technical Reports Server (NTRS)

    Engelbeck, R. M.

    1995-01-01

    The Agility Design Study was performed by the Boeing Defense and Space Group for the NASA Langley Research Center. The objective of the study was to assess the impact of agility requirements on new fighter configurations. Global trade issues investigated were the level of agility, the mission role of the aircraft (air-to-ground, multi-role, or air-to-air), and whether the customer is the Air Force, Navy, or joint service. Mission profiles and design objectives were supplied by NASA. An extensive technology assessment was conducted to establish the technologies available to industry for the aircraft. Conceptual-level methodology is presented to assess the five NASA-supplied agility metrics. Twelve configurations were developed to address the global trade issues. Three-view drawings, inboard profiles, and performance estimates were made and are included in the report. A critical assessment and lessons learned from the study are also presented.

  3. The Telemetry Agile Manufacturing Effort

    SciTech Connect

    Brown, K.D.

    1995-01-01

    The Telemetry Agile Manufacturing Effort (TAME) is an agile enterprising demonstration sponsored by the US Department of Energy (DOE). The project experimented with new approaches to product realization and assessed their impacts on performance, cost, flow time, and agility. The purpose of the project was to design the electrical and mechanical features of an integrated telemetry processor, establish the manufacturing processes, and produce an initial production lot of two to six units. This paper outlines the major methodologies utilized by the TAME, describes the accomplishments that can be attributed to each methodology, and finally, examines the lessons learned and explores the opportunities for improvement associated with the overall effort. The areas for improvement are discussed relative to an ideal vision of the future for agile enterprises. By the end of the experiment, the TAME reduced production flow time by approximately 50% and life cycle cost by more than 30%. Product performance was improved compared with conventional DOE production approaches.

  4. Poster — Thur Eve — 56: Design of Quality Assurance Methodology for VMAT system on Agility System equipped with CVDR

    SciTech Connect

    Thind, K; Tolakanahalli, R

    2014-08-15

    The aim of this study was to analyze the feasibility of designing comprehensive QA plans using iComCAT for Elekta machines equipped with the Agility multileaf collimator and continuously variable dose rate. Test plans with varying MLC speed, gantry speed, and dose rate were created and delivered in a controlled manner. A strip test was designed with three 1 cm MLC positions and delivered using dynamic, StepNShoot and VMAT techniques. Plans were also designed to test error in MLC position at various gantry speeds and various MLC speeds. The delivered fluence was captured using the electronic portal-imaging device. Gantry speed was found to be within tolerance as per the Canadian standards. MLC positioning errors at higher MLC speeds, with gravity effects, do add more than 2 mm of discrepancy. More tests need to be performed to evaluate MLC performance using independent measurement systems. The treatment planning system, with the end-to-end testing necessary for commissioning, was also investigated and found to have >95% passing rates under a 3%/3mm gamma criterion. Future studies involve performing an off-axis gantry starshot pattern and repeating the tests on three matched Elekta linear accelerators.
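
    For readers unfamiliar with the 3%/3mm criterion quoted above, here is a simplified 1-D sketch of the underlying gamma test (our illustration with global normalization and synthetic profiles, not the study's software):

      import numpy as np

      def gamma_pass_rate(ref, meas, x, dose_tol=0.03, dist_tol_mm=3.0):
          # A measured point passes if any reference point lies within the
          # combined dose-difference / distance-to-agreement ellipsoid.
          ref, meas, x = np.asarray(ref), np.asarray(meas), np.asarray(x)
          norm = ref.max()  # global dose normalization
          passed = []
          for xm, dm in zip(x, meas):
              gamma_sq = ((x - xm) / dist_tol_mm) ** 2 + \
                         ((ref - dm) / (dose_tol * norm)) ** 2
              passed.append(gamma_sq.min() <= 1.0)
          return 100.0 * np.mean(passed)

      x = np.arange(0.0, 100.0, 1.0)                    # positions, mm
      ref = np.exp(-((x - 50.0) / 20.0) ** 2)           # reference profile
      meas = 1.01 * np.exp(-((x - 50.5) / 20.0) ** 2)   # slightly scaled/shifted
      print(f"gamma pass rate: {gamma_pass_rate(ref, meas, x):.1f}%")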

  5. Utilization of an agility assessment module in analysis and optimization of preliminary fighter configuration

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Biezad, Daniel

    1996-01-01

    A study has been conducted to develop and to analyze a FORTRAN computer code for performing agility analysis on fighter aircraft configurations. This program is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. The validity of the existing code was evaluated by comparison with existing flight test data. A FORTRAN program was developed for a specific metric, PM (Pointing Margin), as part of the agility module. Example trade studies using the agility module along with ACSYNT were conducted using a McDonnell Douglas F/A-18 Hornet aircraft model. The sensitivity of agility criteria to thrust loading, wing loading, and thrust vectoring was investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements at the preliminary design stage.

  6. Lean Mission Operations Systems Design - Using Agile and Lean Development Principles for Mission Operations Design and Development

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2014-01-01

    The Resource Prospector Mission seeks to rove the lunar surface with an in-situ resource utilization payload in search of volatiles at a polar region. The mission operations system (MOS) will need to perform the short-duration mission while taking advantage of the near real time control that the short one-way light time to the Moon provides. To maximize our use of limited resources for the design and development of the MOS we are utilizing agile and lean methods derived from our previous experience with applying these methods to software. By using methods such as "say it then sim it" we will spend less time in meetings and more time focused on the one outcome that counts - the effective utilization of our assets on the Moon to meet mission objectives.

  7. An Investigation of Agility Issues in Scrum Teams Using Agility Indicators

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Wang, Xiaofeng

    Agile software development methods have emerged and become increasingly popular in recent years; yet the issues encountered by software development teams that strive to achieve agility using agile methods are yet to be explored systematically. Built upon a previous study that has established a set of indicators of agility, this study investigates what issues are manifested in software development teams using agile methods. It is focussed on Scrum teams particularly. In other words, the goal of the chapter is to evaluate Scrum teams using agility indicators and therefore to further validate previously presented agility indicators within the additional cases. A multiple case study research method is employed. The findings of the study reveal that the teams using Scrum do not necessarily achieve agility in terms of team autonomy, sharing, stability and embraced uncertainty. The possible reasons include previous organizational plan-driven culture, resistance towards the Scrum roles and changing resources.

  8. Modeling and Developing the Information System for the SuperAGILE Experiment

    NASA Astrophysics Data System (ADS)

    Lazzarotto, F.; Costa, E.; del Monte, E.; Feroci, M.

    2004-07-01

    We present a formal description of the SuperAGILE (SA) detection system data, the relationships among the data, and the operations applied to them, with the aid of instruments such as Entity-Relationship (E-R) and UML diagrams. We have implemented functions for the reception, pre-processing, archiving and analysis of SA data, making use of Object Oriented and SQL open source software instruments.

  9. Decision Support for Iteration Scheduling in Agile Environments

    NASA Astrophysics Data System (ADS)

    Szőke, Ákos

    Today's software development projects must often promise low-risk value to customers in order to be financed. Emerging agile processes offer shorter investment periods, faster time-to-market and better customer satisfaction. To date, however, agile environments lack sound methodological scheduling support, in contrast to traditional plan-based approaches. To address this situation, we present an agile iteration scheduling method whose usefulness is evaluated with post-mortem simulation. The evaluation demonstrates that the method can significantly improve the load balancing of resources (approximately 5×), produce a higher-quality, lower-risk feasible schedule, and support more informed and better-established decisions through optimized schedule production. Finally, the paper analyzes benefits and issues arising from the use of this method.
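
    The paper's method is optimization-based; as a much simpler taste of the underlying load-balancing decision, the sketch below (story data invented) greedily assigns ranked stories to the least-loaded developer and leaves whatever does not fit in the backlog:

      # Stories ranked by the product owner: (name, story points).
      stories = [("login", 5), ("search", 8), ("export", 3), ("audit", 5)]
      capacity = {"dev_a": 10, "dev_b": 10}   # points per iteration
      load = {dev: 0 for dev in capacity}
      plan = {dev: [] for dev in capacity}
      backlog = []

      for name, points in sorted(stories, key=lambda s: -s[1]):
          dev = min(load, key=load.get)       # least-loaded developer first
          if load[dev] + points <= capacity[dev]:
              load[dev] += points
              plan[dev].append(name)
          else:
              backlog.append(name)            # carried to a later iteration

      print(plan, load, "backlog:", backlog)

    Greedy assignment is not optimal, which is precisely why the paper reaches for an optimized scheduler.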

  10. Photovoltaic module energy rating methodology development

    SciTech Connect

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L.; Whitaker, C.; Newmiller, J.

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.
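
    In spirit, an energy rating integrates a module power model over representative weather profiles; a back-of-the-envelope sketch with invented numbers (not the committee's actual model):

      P_STC = 300.0    # module rating at standard test conditions, W
      GAMMA = -0.004   # power temperature coefficient, 1/degC

      # Hourly profile: (plane-of-array irradiance W/m^2, cell temp degC).
      weather = [(0, 15), (200, 18), (600, 30), (900, 42), (750, 40), (150, 22)]

      # Simple linear model: power scales with irradiance, derated by
      # cell temperature above 25 degC; summing hourly power gives Wh.
      energy_wh = sum(
          P_STC * (g / 1000.0) * (1 + GAMMA * (t_cell - 25.0))
          for g, t_cell in weather
      )
      print(f"estimated energy over profile: {energy_wh:.0f} Wh")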

  11. Development and evaluation of an inverse solution technique for studying helicopter maneuverability and agility

    NASA Technical Reports Server (NTRS)

    Whalley, Matthew S.

    1991-01-01

    An inverse solution technique for determining the maximum maneuvering performance of a helicopter using smooth, pilotlike control inputs is presented. Also described is a pilot simulation experiment performed to investigate the accuracy of the solution resulting from this technique. The maneuverability and agility capability of the helicopter math model was varied by varying the pitch and roll damping, the maximum pitch and roll rate, and the maximum load-factor capability. Three maneuvers were investigated: a 180-deg turn, a longitudinal pop-up, and a lateral jink. The inverse solution technique yielded accurate predictions of pilot-in-the-loop maneuvering performance for two of the three maneuvers.

  12. Development of Methodology for Programming Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Erol, Kutluhan; Levy, Renato; Lang, Lun

    2004-01-01

    A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of developing them. The methodology is also characterized as enabling a reduction in the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem said to be addressed in the development of the methodology was how to efficiently describe the interfaces between several layers of agent composition using a language that is both familiar to engineers and descriptive enough to describe such interfaces unambiguously.

  13. Agile manufacturing: The factory of the future

    NASA Technical Reports Server (NTRS)

    Loibl, Joseph M.; Bossieux, Terry A.

    1994-01-01

    The factory of the future will require an operating methodology which effectively utilizes all of the elements of product design, manufacturing and delivery. The process must respond rapidly to changes in product demand, product mix, design changes or changes in the raw materials. To achieve agility in a manufacturing operation, the design and development of the manufacturing processes must focus on customer satisfaction. Achieving the greatest results requires that the manufacturing process be considered from product concept through sales. This provides the best opportunity to build a quality product for the customer at a reasonable rate. The primary elements of a manufacturing system include people, equipment, materials, methods and the environment. The most significant and most agile element in any process is the human resource. Only with a highly trained, knowledgeable work force can the proper methods be applied to efficiently process materials with machinery which is predictable, reliable and flexible. This paper discusses the effect of each element on the development of agile manufacturing systems.

  14. Investigating Agile User-Centered Design in Practice: A Grounded Theory Perspective

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    This paper investigates how the integration of agile methods and User-Centered Design (UCD) is carried out in practice. For this study, we applied grounded theory as a suitable qualitative approach to determine what is happening in actual practice. The data was collected through semi-structured interviews with professionals who had already worked with an integrated agile UCD methodology. Further data was collected by observing these professionals in their working context and by studying their documents, where possible. The emerging themes show that there is an increasing realization of the importance of usability in software development among agile team members. Requirements emerge over time, and usability tests based on both low- and high-fidelity prototypes are widely used in agile teams. There is an appreciation of each other's work from both UCD professionals and developers, and both sides can learn from each other.

  15. Methodological Issues in the Study of Development.

    ERIC Educational Resources Information Center

    Havens, A. Eugene

    The failure of development to improve the quality of life in most third world countries and in the less advantaged sectors of advanced capitalistic countries can be partially attributed, it is felt, to methodological errors made by those studying development. Some recent sociological approaches to the study of development are reviewed in this…

  16. Human factors in agile manufacturing

    SciTech Connect

    Forsythe, C.

    1995-03-01

    As industries position themselves for the competitive markets of today, and the increasingly competitive global markets of the 21st century, agility, or the ability to rapidly develop and produce new products, represents a common trend. Agility manifests itself in many different forms, with the agile manufacturing paradigm proposed by the Iacocca Institute offering a generally accepted, long-term vision. In its many forms, common elements of agility or agile manufacturing include: changes in business, engineering and production practices, seamless information flow from design through production, integration of computer and information technologies into all facets of the product development and production process, application of communications technologies to enable collaborative work between geographically dispersed product development team members and introduction of flexible automation of production processes. Industry has rarely experienced as dramatic an infusion of new technologies or as extensive a change in culture and work practices. Human factors will not only play a vital role in accomplishing the technical and social objectives of agile manufacturing, but will also have an opportunity to participate in shaping the evolution of industry paradigms for the 21st century.

  17. A Methodology for Developing Diagnostic Concept Inventories

    NASA Astrophysics Data System (ADS)

    Lindell, Rebecca

    2006-12-01

    Since the development of the Force Concept Inventory, there has been heightened interest in developing other concept inventories that not only assess whether students understand a phenomenon, but also diagnose specific alternative understandings. Unfortunately, there is no clear-cut methodology for constructing such inventories. One of the difficulties is that only some parts of test development theory are appropriate for such concept inventories, because the inventories are distracter-driven: test-takers do not randomly choose an incorrect answer. In this poster, I will present a methodology for developing diagnostic concept inventories which combines traditional psychometric theory with modern theories of concentration and model analysis. An example of how this methodology was utilized to develop the successful Lunar Phases Concept Inventory (LPCI) will also be given.
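
    One of the modern tools mentioned, concentration analysis, reduces to a compact statistic; a sketch of the concentration factor as usually attributed to Bao and Redish (our implementation, offered as an illustration rather than the poster's exact procedure): C near 0 means responses are spread randomly across the m choices, C near 1 means they pile onto one choice.

      import math

      def concentration(counts: list[int]) -> float:
          m = len(counts)          # number of answer choices
          n = sum(counts)          # number of students
          rms = math.sqrt(sum(c * c for c in counts))
          return (math.sqrt(m) / (math.sqrt(m) - 1)) * (rms / n - 1 / math.sqrt(m))

      # 100 students, 5 choices: most pick the same distracter -> high C.
      print(f"{concentration([80, 5, 5, 5, 5]):.2f}")    # concentrated, ~0.65
      print(f"{concentration([20, 20, 20, 20, 20]):.2f}")  # uniform -> 0.00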

  18. Development of an agile knowledge engineering framework in support of multi-disciplinary translational research.

    PubMed

    Borlawsky, Tara B; Dhaval, Rakesh; Hastings, Shannon L; Payne, Philip R O

    2009-03-01

    In October 2006, the National Institutes of Health launched a new national consortium, funded through Clinical and Translational Science Awards (CTSA), with the primary objective of improving the conduct and efficiency of the inherently multi-disciplinary field of translational research. To help meet this goal, the Ohio State University Center for Clinical and Translational Science has launched a knowledge management initiative focused on facilitating widespread semantic interoperability among administrative, basic science, clinical and research computing systems, both internally and across the translational research community at large, through the integration of domain-specific standard terminologies and ontologies with local annotations. This manuscript describes an agile framework that builds upon prevailing knowledge engineering and semantic interoperability methods and will be implemented as part of this initiative.

  19. Transitioning from Distributed and Traditional to Distributed and Agile: An Experience Report

    NASA Astrophysics Data System (ADS)

    Wildt, Daniel; Prikladnicki, Rafael

    Global companies that experienced extensive waterfall phased plans are trying to improve their existing processes to expedite team engagement. Agile methodologies have become an acceptable path to follow because they comprise project management as part of their practices. Agile practices have been used with the objective of simplifying project control through simple processes, easy-to-update documentation and higher team interaction over exhaustive documentation, focusing on continuous team improvement and aiming to add value to business processes. The purpose of this chapter is to describe the experience of a global multinational company in transitioning from distributed and traditional to distributed and agile. This company has development centers across North America, South America and Asia. This chapter covers challenges faced by the project teams of two pilot projects, including strengths of using agile practices in a globally distributed environment and practical recommendations for similar endeavors.

  20. Agile manufacturing and constraints management: a strategic perspective

    NASA Astrophysics Data System (ADS)

    Stratton, Roy; Yusuf, Yahaya Y.

    2000-10-01

    The definition of the agile paradigm has proved elusive; it is often viewed as a panacea, in contention with more traditional approaches to operations strategy development, and lacking its own methodology and tools. The Theory of Constraints (TOC) is also poorly understood, as it is commonly associated solely with production planning and control systems and bottleneck management. This paper will demonstrate the synergy between these two approaches, together with the Theory of Inventive Problem Solving (TRIZ), and establish how the systematic elimination of trade-offs can support the agile paradigm. Whereas agility is often seen as a trade-off-free destination, both TOC and TRIZ may be considered route finders, as they comprise methodologies that focus on the identification and elimination of the trade-offs that constrain the purposeful improvement of a system, be it organizational or mechanical. This paper will also show how the TOC thinking process may be combined with the TRIZ knowledge-based approach and used in breaking contradictions within agile logistics.

  1. Software development methodology for high consequence systems

    SciTech Connect

    Baca, L.S.; Bouchard, J.F.; Collins, E.W.; Eisenhour, M.; Neidigk, D.D.; Shortencarier, M.J.; Trellue, P.A.

    1997-10-01

    This document describes a Software Development Methodology for High Consequence Systems. A High Consequence System is a system whose failure could lead to serious injury, loss of life, destruction of valuable resources, unauthorized use, damaged reputation or loss of credibility or compromise of protected information. This methodology can be scaled for use in projects of any size and complexity and does not prescribe any specific software engineering technology. Tasks are described that ensure software is developed in a controlled environment. The effort needed to complete the tasks will vary according to the size, complexity, and risks of the project. The emphasis of this methodology is on obtaining the desired attributes for each individual High Consequence System.

  2. [Adaptive clinical study methodologies in drug development].

    PubMed

    Antal, János

    2015-11-29

    The evolution of drug development in human clinical-phase studies prompts an overview of the technologies and procedures labelled as adaptive clinical trials. The most relevant procedural and operational aspects are discussed in this overview from a clinico-methodological point of view.

  3. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    This workshop will explore the intersection of agility and software development in a world of legacy code-bases and large teams. Organizations with hundreds of developers and code-bases exceeding a million or tens of millions of lines of code are seeking new ways to expedite development while retaining and attracting staff who desire to apply “agile” methods. This is a situation where specific agile practices may be embraced outside of their usual zone of applicability. Here is where practitioners must understand both what “best practices” already exist in the organization - and how they might be improved or modified by applying “agile” approaches.

  4. Control design for future agile fighters

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Davidson, John B.

    1991-01-01

    The CRAFT control design methodology is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The approach combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, and a graphical approach for representing control design metrics that captures numerous design goals in one composite illustration. The methodology makes use of control design metrics from four design objective areas, namely, control power, robustness, agility, and flying qualities. An example of the CRAFT methodology as well as associated design issues are presented.
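
    The eigenvalue-placement half of eigenspace assignment can be demonstrated with SciPy's pole-placement routine on a toy two-state model; the matrices below are illustrative, not from the paper:

      import numpy as np
      from scipy.signal import place_poles

      A = np.array([[0.0, 1.0],
                    [-2.0, -0.5]])       # toy short-period-like dynamics
      B = np.array([[0.0],
                    [1.0]])              # single control effector
      desired = np.array([-3.0, -4.0])   # target closed-loop eigenvalues

      fsf = place_poles(A, B, desired)
      K = fsf.gain_matrix                # state-feedback gains u = -K x
      print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))

    With more control effectors than shown here, the remaining freedom can shape the eigenvectors as well, which is where design metrics such as agility and flying qualities enter the trade.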

  5. Development of a flight software testing methodology

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
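
    A minimal sketch of the kind of executable assertion such an experiment injects, with hypothetical parameter names and limits (nothing here is from the actual flight code):

      def check_pitch_cmd(pitch_cmd_deg: float, prev_cmd_deg: float,
                          dt_s: float) -> None:
          # Range check on the parameter itself.
          assert -25.0 <= pitch_cmd_deg <= 25.0, "pitch command out of range"
          # Rate-of-change check; a bad gain upstream shows up here too,
          # which is the indirect (collateral) testing the study observed.
          rate = abs(pitch_cmd_deg - prev_cmd_deg) / dt_s
          assert rate <= 60.0, "pitch command rate exceeds 60 deg/s"

      prev = 0.0
      for cmd in [0.5, 1.2, 2.0]:       # one command per 50 Hz frame
          check_pitch_cmd(cmd, prev, dt_s=0.02)
          prev = cmd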

  6. Customer Communication Challenges and Solutions in Globally Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Korkala, Mikko

    Working in the globally distributed market is one of the key trends among software organizations all over the world [1-5]. Several factors have contributed to the growth of distributed software development; time-zone-independent “follow the sun” development, access to well-educated labour, maturation of the technical infrastructure and reduced costs are some of the most commonly cited benefits of distributed development [3, 6-8]. Furthermore, customers are often located in different countries because of companies' internationalization purposes or good market opportunities.

  7. Developing collaborative environments - A Holistic software development methodology

    SciTech Connect

    PETERSEN,MARJORIE B.; MITCHINER,JOHN L.

    2000-03-08

    Sandia National Laboratories has been developing technologies to support person-to-person collaboration and the efforts of teams in the business and research communities. The technologies developed include knowledge-based design advisors, knowledge management systems, and streamlined manufacturing supply chains. These collaborative environments in which people can work together sharing information and knowledge have required a new approach to software development. The approach includes an emphasis on the requisite change in business practice that often inhibits user acceptance of collaborative technology. Leveraging the experience from this work, they have established a multidisciplinary approach for developing collaborative software environments. They call this approach "A Holistic Software Development Methodology".

  8. Enabling Agile Testing through Continuous Integration

    SciTech Connect

    Stolberg, Sean E.

    2009-08-24

    A Continuous Integration system is often considered one of the key elements involved in supporting an agile software development and testing environment. As a traditional software tester transitioning to an agile development environment, it became clear to me that I would need to put this essential infrastructure in place and promote improved development practices in order to make the transition to agile testing possible. This experience report discusses a continuous integration implementation I led last year. The initial motivations for implementing continuous integration are discussed, and a pre- and post-assessment using Martin Fowler's "Practices of Continuous Integration" is provided along with the technical specifics of the implementation. Finally, I'll wrap up with a retrospective of my experiences implementing and promoting continuous integration within the context of agile testing.
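
    The mechanical core of continuous integration fits in a few lines; a skeleton (our sketch, with placeholder commands and repository layout, not details from the report) of what runs on every commit:

      import subprocess
      import sys

      def run(cmd: list[str]) -> None:
          print("+", " ".join(cmd))
          subprocess.run(cmd, check=True)   # raises on a non-zero exit code

      def ci_build() -> int:
          try:
              run(["git", "pull", "--ff-only"])        # fetch latest commit
              run(["python", "-m", "pytest", "-q"])    # the agile safety net
          except subprocess.CalledProcessError as err:
              print(f"BUILD BROKEN: {err}", file=sys.stderr)
              return 1                                  # fail loudly, fix first
          print("build green")
          return 0

      if __name__ == "__main__":
          sys.exit(ci_build())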

  10. Design, implementation and validation of a novel open framework for agile development of mobile health applications.

    PubMed

    Banos, Oresti; Villalonga, Claudia; Garcia, Rafael; Saez, Alejandro; Damas, Miguel; Holgado-Terriza, Juan A; Lee, Sungyong; Pomares, Hector; Rojas, Ignacio

    2015-01-01

    The delivery of healthcare services has experienced tremendous changes in recent years. Mobile health, or mHealth, is a key engine of advance at the forefront of this revolution. Although there is a growing number of mobile health applications, there is a lack of tools specifically devised for their implementation. This work presents mHealthDroid, an open source Android implementation of an mHealth framework designed to facilitate the rapid and easy development of mHealth and biomedical apps. The framework is particularly intended to leverage the potential of mobile devices such as smartphones or tablets, wearable sensors and portable biomedical systems. These devices are increasingly used for the monitoring and delivery of personal health care and wellbeing. The framework implements several functionalities to support resource and communication abstraction, biomedical data acquisition, health knowledge extraction, persistent data storage, adaptive visualization, system management and value-added services such as intelligent alerts, recommendations and guidelines. An exemplary application is also presented to demonstrate the potential of mHealthDroid. This app is used to investigate the analysis of human behavior, which is considered to be one of the most prominent areas in mHealth. An accurate activity recognition model is developed and successfully validated in both offline and online conditions. PMID:26329639
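
    As a hedged illustration of the value-added alert services such a framework supports, the sketch below pushes sensor samples through a simple guideline rule. mHealthDroid itself is an Android/Java framework; Python is used here only for brevity, and the sensor name and threshold are invented.

        from dataclasses import dataclass

        @dataclass
        class Sample:
            sensor: str
            value: float        # e.g. heart rate in beats per minute

        def alert_rule(sample):
            # Guideline-style rule; threshold is invented for illustration.
            if sample.sensor == "heart_rate" and sample.value > 100:
                return f"Tachycardia alert: {sample.value:.0f} bpm"
            return None

        stream = [Sample("heart_rate", v) for v in (72, 95, 118)]
        for s in stream:                      # acquisition -> rule -> alert
            msg = alert_rule(s)
            if msg:
                print(msg)                    # -> Tachycardia alert: 118 bpm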

  12. Lean and Agile: An Epistemological Reflection

    ERIC Educational Resources Information Center

    Browaeys, Marie-Joelle; Fisser, Sandra

    2012-01-01

    Purpose: The aim of the paper is to contribute to the discussion of treating the concepts of lean and agile in isolation or combination by presenting an alternative view from complexity thinking on these concepts, considering an epistemological approach to this topic. Design/methodology/approach: The paper adopts an epistemological approach, using…

  13. Development of Robust, Light-weight, Agile Deformable Mirrors in Carbon Fiber

    NASA Astrophysics Data System (ADS)

    Hart, M.; Ammons, S. M.; Coughenour, B.; Richardson, L.; Romeo, R.; Martin, R.

    2012-09-01

    Carbon fiber reinforced polymer (CFRP) has recently been developed to the point that surfaces of high optical quality can be routinely replicated. Building on this advance, we are developing a new generation of deformable mirrors (DMs) for adaptive optics application that extends long-standing expertise at the University of Arizona in large, optically powered DMs for astronomy. Our existing mirrors, up to 90 cm in diameter and with aspheric deformable facesheets, are deployed on a number of large astronomical telescopes. With actuator stroke of up to 50 microns and no hysteresis, they are delivering the best imaging ever seen from an astronomical AO system. Their Zerodur glass ceramic facesheets though are not well suited to non-astronomical applications. In this paper, we describe developmental work to replace the glass components of the DMs with CFRP, an attractive material for optics fabrication because of its high stiffness-to-weight ratio, strength, and very low coefficient of thermal expansion. Surface roughness arising from fiber print-through in the CFRP facesheets is low, < 3 nm PTV across a range of temperature, and the optical figure after correction of static terms by the DM actuators is on the order of 20 nm rms. After initial investment in an optical quality mandrel, replication costs of identical units in CFRP are very low, making the technology ideal for rapid mass production.

  14. Development and testing of a frequency-agile optical parametric oscillator system for differential absorption lidar

    NASA Astrophysics Data System (ADS)

    Weibring, P.; Smith, J. N.; Edner, H.; Svanberg, S.

    2003-10-01

    An all-solid-state fast-tuning lidar transmitter for range- and temporally resolved atmospheric gas concentration measurements has been developed and thoroughly tested. The instrument is based on a commercial optical parametric oscillator (OPO) laser system, which has been redesigned with piezoelectric transducers mounted on the wavelength-tuning mirror and on the crystal angle tuning element in the OPO. Piezoelectric transducers similarly control a frequency-mixing stage and doubling stage, which have been incorporated to extend system capabilities to the mid-IR and UV regions. The construction allows the system to be tuned to any wavelength, in any order, in the range of the piezoelectric transducers on a shot-to-shot basis. This extends the measurement capabilities far beyond the two-wavelength differential absorption lidar method and enables simultaneous measurements of several gases. The system performance in terms of wavelength, linewidth, and power stability is monitored in real time by an étalon-based wave meter and gas cells. The tests showed that the system was able to produce radiation in the 220-4300 nm wavelength region, with an average linewidth better than 0.2 cm⁻¹ and a shot-to-shot tunability up to 160 cm⁻¹ within 20 ms. The utility of real-time linewidth and wavelength measurements is demonstrated by the ability to identify occasional poor quality laser shots and disregard these measurements. Also, absorption cell measurements of methane and mercury demonstrate the performance in obtaining stable wavelength and linewidth during rapid scans in the mid-IR and UV regions.
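
    For context, the two-wavelength DIAL retrieval that this multiwavelength capability generalises is the standard expression (quoted here in its textbook form, not from the paper itself):

        N(R) = \frac{1}{2\,\Delta\sigma} \, \frac{d}{dR} \ln\!\left[ \frac{P_{\mathrm{off}}(R)}{P_{\mathrm{on}}(R)} \right]

    where N(R) is the gas number density at range R, \Delta\sigma the differential absorption cross section between the on- and off-resonance wavelengths, and P_{\mathrm{on}}, P_{\mathrm{off}} the corresponding lidar returns. Shot-to-shot wavelength agility lets a single system interleave several on/off pairs and so retrieve several gases quasi-simultaneously.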

  15. Agile interferometry: a non-traditional approach

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.; Yaqoob, Zahid

    2004-11-01

    A new approach called agile interferometry is introduced to attain interferometric information with high sensitivity and scenario-based intelligence. Compared to traditional interferometric techniques, the proposed method thrives on dynamic control of the reference signal strength and detector integration time for efficient interferometric detection with high signal-to-noise ratio and significantly improved detected signal dynamic range capabilities. Theoretical analysis is presented with the operational methodology of the new approach. A high-speed optical attenuator is required in the interferometer reference arm to implement the proposed agile interferometer.

  16. 3rd International Workshop on Designing Empirical Studies: Assessing the Effectiveness of Agile Methods (IWDES 2009)

    NASA Astrophysics Data System (ADS)

    di Penta, Massimiliano; Morasca, Sandro; Sillitti, Alberto

    Assessing the effectiveness of a development methodology is difficult and requires extensive empirical investigation. Moreover, the design of such investigations is complex, since they involve several stakeholders, and their validity can be questioned if they are not replicated in similar and different contexts. Agilists are aware that data collection is important, and the problem of designing and executing meaningful experiments is a common one. This workshop aims at creating a critical mass for the development of new and extensive investigations in the Agile world.

  17. Strategic agility for nursing leadership.

    PubMed

    Shirey, Maria R

    2015-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change. In this article, the author discusses strategic agility as an important leadership competency and offers approaches for incorporating strategic agility in healthcare systems. A strategic agility checklist and infrastructure-building approach are presented. PMID:26010278

  19. An expert system development methodology which supports verification and validation

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; Riley, Gary; Savely, Robert T.

    1987-01-01

    Expert systems have demonstrated commercial viability in a wide range of applications, but still face some obstacles to widespread use. A major stumbling block is the lack of well-defined verification and validation (V and V) techniques. The primary difficulty with expert system V and V is the use of development methodologies which do not support V and V. As with conventional code, the key to effective V and V is the development methodology. An expert system development methodology is described which is based upon a panel review approach that allows input from all parties concerned with the expert system.

  20. Embedding Agile Practices within a Plan-Driven Hierarchical Project Life Cycle

    SciTech Connect

    Millard, W. David; Johnson, Daniel M.; Henderson, John M.; Lombardo, Nicholas J.; Bass, Robert B.; Smith, Jason E.

    2014-07-28

    Organizations use structured, plan-driven approaches to provide continuity, direction, and control to large, multi-year programs. Projects within these programs vary greatly in size, complexity, level of maturity, technical risk, and clarity of the development objectives. Organizations that perform exploratory research, evolutionary development, and other R&D activities can obtain the benefits of Agile practices without losing the benefits of their program's overarching plan-driven structure. This paper describes the application of Agile development methods on a large plan-driven sensor integration program. While the client employed plan-driven, requirements flow-down methodologies, tight project schedules and complex interfaces called for frequent end-to-end demonstrations to provide feedback during system development. The development process maintained the many benefits of plan-driven project execution with the rapid prototyping, integration, demonstration, and client feedback possible through Agile development methods. This paper also describes some of the tools and implementing mechanisms used to transition between and take advantage of each methodology, and presents lessons learned from the project management, system engineering, and developer perspectives.

  1. Creating IT agility.

    PubMed

    Glaser, John

    2008-04-01

    Seven steps healthcare organizations can take to improve IT agility are: Pay attention to the capabilities of IT applications. Establish short project phases. Stage the release of capital and new IT positions. Cross-train IT staff. Adopt technology standards. Shorten IT plan time horizons. Align IT with organizational strategies and priorities.

  2. Agile manufacturing in Intelligence, Surveillance and Reconnaissance (ISR)

    NASA Astrophysics Data System (ADS)

    DiPadua, Mark; Dalton, George

    2016-05-01

    The objective of the Agile Manufacturing for Intelligence, Surveillance, and Reconnaissance (AMISR) effort is to research, develop, design and build a prototype multi-intelligence (multi-INT), reconfigurable pod demonstrating benefits of agile manufacturing and a modular open systems approach (MOSA) to make podded intelligence, surveillance, and reconnaissance (ISR) capability more affordable and operationally flexible.

  3. Fighter agility metrics. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.

    1990-01-01

    Fighter flying qualities and combat capabilities are currently measured and compared in terms relating to vehicle energy, angular rates and sustained acceleration. Criteria based on these measurable quantities have evolved over the past several decades and are routinely used to design aircraft structures, aerodynamics, propulsion and control systems. While these criteria, or metrics, have the advantage of being well understood, easily verified and repeatable during test, they tend to measure the steady state capability of the aircraft and not its ability to transition quickly from one state to another. Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available.

  4. Introduction to Stand-up Meetings in Agile Methods

    NASA Astrophysics Data System (ADS)

    Hasnain, Eisha; Hall, Tracy

    2009-05-01

    In recent years, agile methods have become more popular in the software industry. Agile methods are a new approach compared to plan-driven approaches. One of the most important shifts in adopting an agile approach is the central focus given to people in the process. This is exemplified by the independence afforded to developers in the development work they do. This work investigates practitioners' opinions about daily stand-up meetings in agile methods and the role of the developer in them. For our investigation we joined a Yahoo! group called "Extreme Programming". Our investigation suggests that although trust is an important factor in agile methods, stand-ups are not the place where it is built.

  5. Comparison of a New Test For Agility and Skill in Soccer With Other Agility Tests

    PubMed Central

    Kutlu, Mehmet; Yapıcı, Hakan; Yoncalık, Oğuzhan; Çelik, Serkan

    2012-01-01

    The purpose of this study was both to develop a novel test measuring running, shuttle-run and directional-change agility, and soccer shots on goal with decision making, and to compare it with other agility tests. Multiple comparisons and assessments were conducted, including test-retest, Illinois, Zig-Zag, 30 m, Bosco, T-drill agility, and Wingate peak power tests. A total of 113 Turkish amateur and professional soccer players and tertiary-level students participated in the study. Test-retest and inter-tester reliability measures were conducted with athletes. The correlation coefficient of the new test was .88, with no significant difference (p > 0.01) between the test results obtained in the first and second test sessions. The results of an analysis of variance revealed a significant (p < 0.01) difference between the T-drill agility and power test results for soccer players. The new agility and skill test is an acceptable and reliable test with respect to test-retest and inter-rater reliability. The findings of this study suggest that the novel soccer-specific agility and shooting test can be utilized in testing and identifying soccer players' talent. PMID:23486732
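
    The test-retest figure reported above is a correlation between first- and second-session scores; a minimal sketch of that computation (with invented trial times, not the study's data) is:

        from statistics import correlation   # Python 3.10+

        # Invented trial times (s) for the same athletes in two sessions.
        session1 = [12.4, 11.8, 13.1, 12.0, 12.7]
        session2 = [12.6, 11.9, 12.9, 12.2, 12.8]

        r = correlation(session1, session2)  # Pearson r, the test-retest statistic
        print(f"test-retest r = {r:.2f}")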

  6. Developing enterprise collaboration: a methodology to implement and improve interoperability

    NASA Astrophysics Data System (ADS)

    Daclin, Nicolas; Chen, David; Vallespir, Bruno

    2016-06-01

    The aim of this article is to present a methodology for guiding enterprises to implement and improve interoperability. This methodology is based on three components: a framework of interoperability which structures specific solutions of interoperability and is composed of abstraction levels, views and approaches dimensions; a method to measure interoperability including interoperability before (maturity) and during (operational performances) a partnership; and a structured approach defining the steps of the methodology, from the expression of an enterprise's needs to implementation of solutions. The relationship which consistently relates these components forms the methodology and enables developing interoperability in a step-by-step manner. Each component of the methodology and the way it operates is presented. The overall approach is illustrated in a case study example on the basis of a process between a given company and its dealers. Conclusions and future perspectives are given at the end of the article.

  8. Photogrammetry Methodology Development for Gossamer Spacecraft Structures

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; Jones, Thomas W.; Black, Jonathan T.; Walford, Alan; Robson, Stuart; Shortis, Mark R.

    2002-01-01

    Photogrammetry--the science of calculating 3D object coordinates from images--is a flexible and robust approach for measuring the static and dynamic characteristics of future ultra-lightweight and inflatable space structures (a.k.a., Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.
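
    The core computation photogrammetry rests on is triangulation of a 3D point from two or more calibrated images. The sketch below shows a standard linear (DLT) two-camera solve; the projection matrices are invented for illustration and are not from the NASA Langley setup.

        import numpy as np

        def triangulate(P1, P2, x1, x2):
            # Each observation (u, v) contributes two rows to A in A @ X = 0.
            A = np.array([x1[0] * P1[2] - P1[0],
                          x1[1] * P1[2] - P1[1],
                          x2[0] * P2[2] - P2[0],
                          x2[1] * P2[2] - P2[1]])
            _, _, Vt = np.linalg.svd(A)       # least-squares null vector
            X = Vt[-1]
            return X[:3] / X[3]               # homogeneous -> Euclidean

        P1 = np.hstack([np.eye(3), np.zeros((3, 1))])               # camera 1 at origin
        P2 = np.hstack([np.eye(3), np.array([[-1.], [0.], [0.]])])  # 1 m baseline
        point = np.array([0.2, 0.1, 5.0, 1.0])                      # ground-truth target
        x1 = P1 @ point; x1 = x1[:2] / x1[2]                        # simulated images
        x2 = P2 @ point; x2 = x2[:2] / x2[2]
        print(triangulate(P1, P2, x1, x2))                          # ~ [0.2 0.1 5.0]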

  11. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they were not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  12. Prometheus Reactor I&C Software Development Methodology, for Action

    SciTech Connect

    T. Hamilton

    2005-07-30

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I&C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I&C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.

  13. An investigation of fighter aircraft agility

    NASA Technical Reports Server (NTRS)

    Valasek, John; Downing, David R.

    1993-01-01

    This report attempts to unify in a single document the results of a series of studies on fighter aircraft agility funded by the NASA Ames Research Center, Dryden Flight Research Facility and conducted at the University of Kansas Flight Research Laboratory during the period January 1989 through December 1993. New metrics proposed by pilots and the research community to assess fighter aircraft agility are collected and analyzed. The report develops a framework for understanding the context into which the various proposed fighter agility metrics fit in terms of application and testing. Since new metrics continue to be proposed, this report does not claim to contain every proposed fighter agility metric. Flight test procedures, test constraints, and related criteria are developed. Instrumentation required to quantify agility via flight test is considered, as is the sensitivity of the candidate metrics to deviations from nominal pilot command inputs, which is studied in detail. Instead of supplying specific, detailed conclusions about the relevance or utility of one candidate metric versus another, the authors have attempted to provide sufficient data and analyses for readers to formulate their own conclusions. Readers are therefore ultimately responsible for judging exactly which metrics are 'best' for their particular needs. Additionally, it is not the intent of the authors to suggest combat tactics or other actual operational uses of the results and data in this report. This has been left up to the user community. Twenty of the candidate agility metrics were selected for evaluation with high-fidelity, nonlinear, non-real-time flight simulation computer programs of the F-5A Freedom Fighter, F-16A Fighting Falcon, F-18A Hornet, and X-29A. The information and data presented on the 20 candidate metrics which were evaluated will assist interested readers in conducting their own extensive investigations. The report provides a definition and analysis of each metric; details

  14. Development of nondestructive testing/evaluation methodology for MEMS

    NASA Astrophysics Data System (ADS)

    Zunino, James L., III; Skelton, Donald R.; Marinis, Ryan T.; Klempner, Adam R.; Hefti, Peter; Pryputniewicz, Ryszard J.

    2008-02-01

    Development of MEMS constitutes one of the most challenging tasks in today's micromechanics. In addition to design, analysis, and fabrication capabilities, this task also requires advanced test methodologies for determination of functional characteristics of MEMS to enable refinement and optimization of their designs as well as for demonstration of their reliability. Until recently, this characterization was hindered by lack of a readily available methodology. However, using recent advances in photonics, electronics, and computer technology, it was possible to develop a NonDestructive Testing (NDT) methodology suitable for evaluation of MEMS. In this paper, an optoelectronic methodology for NDT of MEMS is described and its application is illustrated with representative examples; this description represents work in progress and the results are preliminary. This methodology provides quantitative full-field-of-view measurements in near real-time with high spatial resolution and nanometer accuracy. By quantitatively characterizing performance of MEMS, under different vibration, thermal, and other operating conditions, specific suggestions for their improvements can be made. Then, using the methodology, we can verify the effects of these improvements. In this way, we can develop better understanding of functional characteristics of MEMS, which will ensure that they are operated at optimum performance, are durable, and are reliable.

  15. Perspectives on Agile Coaching

    NASA Astrophysics Data System (ADS)

    Fraser, Steven; Lundh, Erik; Davies, Rachel; Eckstein, Jutta; Larsen, Diana; Vilkki, Kati

    There are many perspectives on agile coaching, including growing coaching expertise, selecting the appropriate coach for your context, and evaluating value. A coach is often an itinerant who may observe, mentor, negotiate, influence, lead, and/or architect everything from team organization to system architecture. With roots in diverse fields ranging from technology to sociology, coaches have differing motivations and experience bases. This panel will bring together coaches to debate and discuss various perspectives on agile coaching. Some of the questions to be addressed will include: What are the skills required for effective coaching? What should be the expectations for teams or individuals being coached? Should coaches be a corporate resource (an internal team of consultants working with multiple internal teams), an integral part of a specific team, or external contractors? How should coaches exercise influence and authority? How should management assess the value of a coaching engagement? Do you have what it takes to be a coach? This panel will bring together seasoned agile coaches to offer their experience and advice on how to be the best you can be!

  16. CT-assisted agile manufacturing

    NASA Astrophysics Data System (ADS)

    Stanley, James H.; Yancey, Robert N.

    1996-11-01

    The next century will witness at least two great revolutions in the way goods are produced. First, workers will use the medium of virtual reality in all aspects of marketing, research, development, prototyping, manufacturing, sales and service. Second, market forces will drive manufacturing towards small-lot production and just-in-time delivery. Already, we can discern the merging of these megatrends into what some are calling agile manufacturing. Under this new paradigm, parts and processes will be designed and engineered within the mind of a computer, tooled and manufactured by the offspring of today's rapid prototyping equipment, and evaluated for performance and reliability by advanced nondestructive evaluation (NDE) techniques and sophisticated computational models. Computed tomography (CT) is the premier example of an NDE method suitable for future agile manufacturing activities. It is the only modality that provides convenient access to the full suite of engineering data that users will need to avail themselves of computer-aided design, computer-aided manufacturing, and computer-aided engineering capabilities, as well as newly emerging reverse engineering, rapid prototyping and solid freeform fabrication technologies. As such, CT is assured a central, utilitarian role in future industrial operations. An overview of this exciting future for industrial CT is presented.

  17. Improved rapid prototyping methodology for MPEG-4 IC development

    NASA Astrophysics Data System (ADS)

    Tang, Clive K. K.; Moseler, Kathy; Levi, Sami

    1998-12-01

    One important factor in deciding the success of a new consumer product or integrated circuit is minimized time-to-market. A rapid prototyping methodology that encompasses algorithm development in the hardware design phase will have great impact on reducing time-to-market. In this paper, a proven hardware design methodology and a novel top-down design methodology based on Frontier Design's DSP Station tool are described. The proven methodology was used during development of the MC149570 H.261/H.263 video codec manufactured by Motorola. This paper discusses an improvement to this method to create an integrated environment for both system and hardware development, thereby further reducing time-to-market. The software tool chosen is the DSP Station tool by Frontier Design. The rich features of the DSP Station tool will be described, and then it will be shown how these features may be useful in designing from algorithm to silicon. How this methodology may be used in the development of a new MPEG-4 video communication ASIC will be outlined. A brief comparison with a popular tool, the Signal Processing WorkSystem tool by Cadence, will also be given.

  18. Agility Meets Systems Engineering: A Catalogue of Success Factors from Industry Practice

    NASA Astrophysics Data System (ADS)

    Stelzmann, Ernst; Kreiner, Christian; Spork, Gunther; Messnarz, Richard; Koenig, Frank

    Agile software development methods are widely accepted and valued in software-dominated industries. In more complex setups, such as multidisciplinary system development, the adoption of an agile development paradigm is much less straightforward: bigger teams, longer development cycles, process and product standard compliance, and products lacking flexibility make agile behaviour more difficult to achieve. Focusing on the fundamental underlying problem of dealing with ever-ongoing change, this paper presents an agile Systems Engineering approach as a potential solution. To this end, a generic Systems Engineering action model was extended with agile principles and adapted according to practical needs discovered in an empirical study. This study was conducted among the partners of the S2QI agile workgroup, made up of experts from the automotive, logistics and electronics industries. In addition to the agile Systems Engineering action model, a list of 15 practical success factors that should be considered when using an agile Systems Engineering approach is one of the main outcomes of this survey. It was also found that agile behaviour in Systems Engineering can be supported in many different areas within companies. These areas are listed, and it is also shown how the agile action model and the agile success factors relate to them.

  19. Candidate control design metrics for an agile fighter

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Bailey, Melvin L.; Ostroff, Aaron J.

    1991-01-01

    Success in the fighter combat environment of the future will certainly demand increasing capability from aircraft technology. These advanced capabilities in the form of superagility and supermaneuverability will require special design techniques which translate advanced air combat maneuvering requirements into design criteria. Control design metrics can provide some of these techniques for the control designer. This study presents an overview of control design metrics and investigates metrics for advanced fighter agility. The objectives of various metric users, such as airframe designers and pilots, are differentiated from the objectives of the control designer. Using an advanced fighter model, metric values are documented over a portion of the flight envelope through piloted simulation. These metric values provide a baseline against which future control system improvements can be compared and against which a control design methodology can be developed. Agility is measured for the axial, pitch, and roll axes. Axial metrics highlight acceleration and deceleration capabilities under different flight loads and include specific excess power measurements to characterize energy maneuverability. Pitch metrics cover both body-axis and wind-axis pitch rates and accelerations. Included in the pitch metrics are nose-pointing metrics which highlight displacement capability between the nose and the velocity vector. Roll metrics (or torsion metrics) focus on rotational capability about the wind axis.
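
    For reference, the specific excess power quoted in the axial metrics is the standard energy-maneuverability quantity (the textbook definition, not a formula reproduced from the report):

        P_s = \frac{(T - D)\,V}{W} = \frac{d}{dt}\!\left( h + \frac{V^2}{2g} \right)

    where T is thrust, D drag, V true airspeed, W weight, and h altitude; P_s is the rate at which the aircraft can change its specific energy by climbing or accelerating.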

  20. Agile Infrastructure Monitoring

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.

    2014-06-01

    At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.
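
    As a minimal sketch of the shared-architecture idea (invented names, with an in-memory queue standing in for the messaging layer; not CERN's actual components), producers publish metrics to one common transport and any consumer reads the same stream:

        import json, queue, time

        bus = queue.Queue()     # stands in for the shared messaging layer

        def publish(source, metric, value):
            # Every data centre resource pushes its metrics to the same transport.
            bus.put(json.dumps({"ts": time.time(), "src": source,
                                "metric": metric, "value": value}))

        publish("node42", "cpu_load", 0.73)
        publish("node42", "disk_free_gb", 118.5)

        while not bus.empty():  # any consumer (alarms, dashboards) reads the stream
            print(json.loads(bus.get()))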

  1. Agile Walking Robot

    NASA Technical Reports Server (NTRS)

    Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.; Waldron, Kenneth J.

    1990-01-01

    The proposed agile walking robot operates over rocky, sandy, and sloping terrain. It offers stability and climbing ability superior to other conceptual mobile robots. Equipped with six articulated legs like those of an insect, it continually feels the ground under a leg before applying weight to it. If a leg senses an unexpected object or fails to make contact with the ground at the expected point, it seeks an alternative position within a radius of 20 cm. Failing that, the robot halts, examines the area around the foot in detail with a laser ranging imager, and replans the entire cycle of steps for all legs before proceeding.

  2. Methodology Evaluation Framework for Component-Based System Development.

    ERIC Educational Resources Information Center

    Dahanayake, Ajantha; Sol, Henk; Stojanovic, Zoran

    2003-01-01

    Explains component-based development (CBD) for distributed information systems and presents an evaluation framework, which highlights the extent to which a methodology is component oriented. Compares prominent CBD methods, discusses ways of modeling, and suggests that this is a first step towards a components-oriented systems development…

  3. Development of Management Methodology for Engineering Production Quality

    NASA Astrophysics Data System (ADS)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors of the paper propose four directions for developing a methodology for managing the quality of engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organizational context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity according to effectiveness criteria.

  4. Research Methodology on Language Development from a Complex Systems Perspective

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane; Cameron, Lynne

    2008-01-01

    Changes to research methodology motivated by the adoption of a complexity theory perspective on language development are considered. The dynamic, nonlinear, and open nature of complex systems, together with their tendency toward self-organization and interaction across levels and timescales, requires changes in traditional views of the functions…

  5. A Methodology for Developing Learning Objects for Web Course Delivery

    ERIC Educational Resources Information Center

    Stauffer, Karen; Lin, Fuhua; Koole, Marguerite

    2008-01-01

    This article presents a methodology for developing learning objects for web-based courses using the IMS Learning Design (IMS LD) specification. We first investigated the IMS LD specification, determining how to use it with online courses and the student delivery model, and then applied this to a Unit of Learning (UOL) for online computer science…

  6. Achieving agility through parameter space qualification

    SciTech Connect

    Diegert, K.V.; Easterling, R.G.; Ashby, M.R.; Benavides, G.L.; Forsythe, C.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-02-01

    The A-primed (Agile Product Realization of Innovative electro-Mechanical Devices) project is defining and proving processes for agile product realization for the Department of Energy complex. Like other agile production efforts reported in the literature, A-primed uses concurrent engineering and information automation technologies to enhance information transfer. A unique aspect of our approach to agility is the qualification during development of a family of related product designs and their production processes, rather than a single design and its attendant processes. By applying engineering principles and statistical design of experiments, economies of test and analytic effort are realized for the qualification of the device family as a whole. Thus the need is minimized for test and analysis to qualify future devices from this family, thereby further reducing the design-to-production cycle time. As a measure of the success of the A-primed approach, the first design took 24 days to produce, and operated correctly on the first attempt. A flow diagram for the qualification process is presented. Guidelines are given for implementation, based on the authors' experiences as members of the A-primed qualification team.
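
    A hedged sketch of qualifying a design family over its parameter space with a designed experiment, rather than one design at a time, might look like the following; the factors, levels, and acceptance test are invented, not A-primed's actual parameters:

        from itertools import product

        factors = {"spring_rate": [0.8, 1.0, 1.2],   # normalised levels (invented)
                   "gap_mm":      [0.05, 0.10],
                   "temp_C":      [-40, 25, 70]}

        def passes(point):
            # Placeholder for the test/analysis of one corner of the family.
            return point["spring_rate"] * point["gap_mm"] < 0.13

        runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
        qualified = all(passes(run) for run in runs)
        print(f"{len(runs)} runs, family qualified: {qualified}")

    Qualifying the full factorial once means a future device whose parameters fall inside the qualified region needs little or no additional test and analysis.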

  8. Frequency agile optical parametric oscillator

    DOEpatents

    Velsko, Stephan P.

    1998-01-01

    The frequency agile OPO device converts a fixed wavelength pump laser beam to arbitrary wavelengths within a specified range with pulse-to-pulse agility, at a rate limited only by the repetition rate of the pump laser. Uses of this invention include laser radar, LIDAR, active remote sensing of effluents/pollutants, environmental monitoring, antisensor lasers, and spectroscopy.

  9. Risk-Informed Assessment Methodology Development and Application

    SciTech Connect

    Sung Goo Chi; Seok Jeong Park; Chul Jin Choi; Ritterbusch, S.E.; Jacob, M.C.

    2002-07-01

    Westinghouse Electric Company (WEC) has been working with Korea Power Engineering Company (KOPEC) on a US Department of Energy (DOE) sponsored Nuclear Energy Research Initiative (NERI) project through a collaborative agreement established for the domestic NERI program. The project deals with Risk-Informed Assessment (RIA) of regulatory and design requirements of future nuclear power plants. An objective of the RIA project is to develop a risk-informed design process, which focuses on identifying and incorporating advanced features into future nuclear power plants (NPPs) that would meet risk goals in a cost-effective manner. The RIA design methodology is proposed to accomplish this objective. This paper discusses the development of this methodology and demonstrates its application in the design of plant systems for future NPPs. Advanced conceptual plant systems consisting of an advanced Emergency Core Cooling System (ECCS) and Emergency Feedwater System (EFWS) for a NPP were developed and the risk-informed design process was exercised to demonstrate the viability and feasibility of the RIA design methodology. Best estimate Loss-of-Coolant Accident (LOCA) analyses were performed to validate the PSA success criteria for the NPP. The results of the analyses show that the PSA success criteria can be met using the advanced conceptual systems and that the RIA design methodology is a viable and appropriate means of designing key features of risk-significant NPP systems.

  10. Demand Activated Manufacturing Architecture (DAMA) supply chain collaboration development methodology

    SciTech Connect

    PETERSEN,MARJORIE B.; CHAPMAN,LEON D.

    2000-03-15

    The Demand Activated Manufacturing Architecture (DAMA) project, during the last five years of work with the U.S. Integrated Textile Complex (retail, apparel, textile, and fiber sectors), has developed an inter-enterprise supply chain collaboration development methodology. The goal of this methodology is to enable a supply chain to work more efficiently and competitively. The outcomes of this methodology include: (1) a definitive description and evaluation of the role of business cultures and supporting business organizational structures in either inhibiting or fostering change to a more competitive supply chain; (2) "As-Is" and proposed "To-Be" supply chain business process models focusing on information flows and decision-making; and (3) software tools that enable and support a transition to a more competitive supply chain, which result from a business-driven rather than technology-driven approach to software design. This methodology development will continue in FY00 as DAMA engages companies in the soft goods industry in supply chain research and implementation of supply chain collaboration.

  11. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible are also documented.

  12. Thinking Outside the Box: Agile Business Models for CNOs

    NASA Astrophysics Data System (ADS)

    Loss, Leandro; Crave, Servane

    This paper introduces the idea of an agile Business Model for CNOs, grounded in a new model of innovation based on the effects of globalization and of the Knowledge Economy. The agile Business Model considers the resources that are spread out and available worldwide, as well as the need for each customer to receive a unique customer experience. It aims to reinforce, in the context of the Knowledge Economy, the different business model approaches developed so far. The paper also identifies the levers and barriers of Agile Business Model Innovation in CNOs.

  13. Methodology to develop and evaluate a semantic representation for NLP.

    PubMed

    Irwin, Jeannie Y; Harkema, Henk; Christensen, Lee M; Schleyer, Titus; Haug, Peter J; Chapman, Wendy W

    2009-11-14

    Natural language processing applications that extract information from text rely on semantic representations. The objective of this paper is to describe a methodology for creating a semantic representation for information that will be automatically extracted from textual clinical records. We illustrate two of the four steps of the methodology in this paper using the case study of encoding information from dictated dental exams: (1) develop an initial representation from a set of training documents and (2) iteratively evaluate and evolve the representation while developing annotation guidelines. Our approach for developing and evaluating a semantic representation is based on standard principles and approaches that are not dependent on any particular domain or type of semantic representation.
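
    As a hedged illustration of what such a representation might look like for the dental-exam case study (slot names invented, not the paper's actual schema), a single extracted finding could be encoded as:

        # One extracted finding as a frame of typed slots (names invented).
        finding = {
            "type": "caries_finding",
            "tooth": "14",               # tooth numbering scheme assumed
            "surface": "occlusal",
            "status": "present",
            "certainty": "probable",     # hedging captured explicitly
        }
        print(finding["type"], finding["tooth"], finding["surface"])

    Iterating the annotation guidelines against documents is what reveals which slots and values the representation is missing.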

  14. Development of test methodology for dynamic mechanical analysis instrumentation

    NASA Technical Reports Server (NTRS)

    Allen, V. R.

    1982-01-01

    Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperature in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiber-glass-supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result, because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.
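
    The quantities such an instrument reports are the components of the complex modulus, and the damping the abstract refers to is the loss tangent (standard definitions, assumed here rather than quoted from the report):

        E^* = E' + i\,E'' , \qquad \tan\delta = \frac{E''}{E'}

    where E' is the storage (elastic) modulus and E'' the loss modulus; peaks in tan δ as temperature is swept mark relaxation transitions such as the glassy dispersion near -80 C mentioned above.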

  15. Agile manufacturing concept

    NASA Astrophysics Data System (ADS)

    Goldman, Steven L.

    1994-03-01

    The initial conceptualization of agile manufacturing was the result of a 1991 study -- chaired by Lehigh Professor Roger N. Nagel and California-based entrepreneur Rick Dove, President of Paradigm Shifts, International -- of what it would take for U.S. industry to regain global manufacturing competitiveness by the early twenty-first century. This industry-led study, reviewed by senior management at over 100 companies before its release, concluded that incremental improvement of the current system of manufacturing would not be enough to be competitive in today's global marketplace. Computer-based information and production technologies that were becoming available to industry opened up the possibility of an altogether new system of manufacturing, one that would be characterized by a distinctive integration of people and technologies; of management and labor; of customers, producers, suppliers, and society.

  16. Aircraft agility maneuvers

    NASA Technical Reports Server (NTRS)

    Cliff, Eugene M.; Thompson, Brian G.

    1992-01-01

    A new dynamic model for aircraft motions is presented. This model can be viewed as intermediate between a point-mass model, in which the body attitude angles are control-like, and a rigid-body model, in which the body-attitude angles evolve according to Newton's Laws. Specifically, consideration is given to the case of symmetric flight, and a model is constructed in which the body roll-rate and the body pitch-rate are the controls. In terms of this body-rate model a minimum-time heading change maneuver is formulated. When the bounds on the body-rates are large the results are similar to the point-mass model in that the model can very quickly change the applied forces and produce an acceleration to turn the vehicle. With finite bounds on these rates, the forces change in a smooth way. This leads to a measurable effect of agility.
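
    One standard symmetric-flight, point-mass backbone consistent with this description (a sketch of the general form, not the paper's exact equations) treats the wind-axis roll rate and body pitch rate as controls that steer the attitude states:

        \dot{V} = \frac{T\cos\alpha - D}{m} - g\sin\gamma , \qquad
        \dot{\gamma} = \frac{(L + T\sin\alpha)\cos\mu - m g\cos\gamma}{m V} , \qquad
        \dot{\chi} = \frac{(L + T\sin\alpha)\sin\mu}{m V \cos\gamma}

    with \dot{\mu} = p_w and, in wings-level flight, \dot{\alpha} = q - \dot{\gamma}. Large bounds on p_w and q recover point-mass behaviour, with the applied forces changing almost instantaneously; finite bounds make the forces change smoothly, which is the measurable effect of agility the abstract describes.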

  17. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: 1. provide the ability to track a data set from acceptance to publication; 2. automate steps that can be automated, to improve efficiency and reduce redundancy; 3. update legacy ingest infrastructure; and 4. provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
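
    The first goal, tracking a data set from acceptance to publication, amounts to a small state machine; a hedged sketch (states and transitions are illustrative, not the DAAC's actual workflow definitions) is:

        STATES = ["accepted", "format_check", "documentation",
                  "metadata", "review", "published"]

        class Submission:
            def __init__(self, name):
                self.name, self.state = name, STATES[0]

            def advance(self):
                # Move the data set one step toward publication.
                i = STATES.index(self.state)
                if i + 1 < len(STATES):
                    self.state = STATES[i + 1]
                return self.state

        ds = Submission("example_dataset")
        while ds.state != "published":
            print(ds.name, "->", ds.advance())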

  18. Coal resources available for development; a methodology and pilot study

    USGS Publications Warehouse

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies defined the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated

  19. Elements of an Art - Agile Coaching

    NASA Astrophysics Data System (ADS)

    Lundh, Erik

    This tutorial gives you a lead on becoming or redefining yourself as an Agile Coach. It introduces the elements and dimensions of state-of-the-art Agile coaching: how to position the agile coach to be effective in a larger setting; how to make the agile transition, from a single team to thousands of people; how to support multiple teams as a coach; how to build a coaches' network in your company; and the challenges that arise when the agile coach is a consultant and the organization is large.

  20. Development of methodology to prioritise wildlife pathogens for surveillance.

    PubMed

    McKenzie, Joanna; Simpson, Helen; Langstaff, Ian

    2007-09-14

    We developed and evaluated a methodology to prioritise pathogens for a wildlife disease surveillance strategy in New Zealand. The methodology, termed 'rapid risk analysis', was based on the import risk analysis framework recommended by the Office International des Épizooties (OIE), and involved hazard identification, risk estimation, and ranking of 48 exotic and 34 endemic wildlife pathogens. The risk assessment was more rapid than a full quantitative assessment through the use of a semi-quantitative approach to score pathogens for probability of entry to New Zealand (release assessment), likelihood of spread (exposure assessment), and consequences in free-living wildlife, captive wildlife, humans, livestock, and companion animals. Risk was estimated as the product of the scores for probability of entry to New Zealand, likelihood of spread, and consequences for free-living wildlife, humans, and livestock. The rapid risk analysis methodology produced scores that were sufficiently differentiated between pathogens to be useful for ranking them on the basis of risk. Ranking pathogens on the basis of the risk estimate for each population sector provided an opportunity to identify the priorities within each sector alone, thus avoiding value-laden comparisons between sectors. Ranking pathogens across all three population sectors by summing the risk estimates for each sector provided a comparison of total risk, which may be useful for resource-allocation decisions at the national level. Ranking pathogens within each wildlife taxonomic group using the total risk estimate was most useful for developing specific surveillance strategies for each group. PMID:17482697
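
    The multiplicative scoring lends itself to a compact illustration. The sketch below assumes made-up scores on a common scale; the real study used expert-elicited semi-quantitative scores.

        # Illustrative multiplicative risk ranking; the scores are made up.
        pathogens = {
            # name: (entry score, spread score, {sector: consequence score})
            "pathogen_A": (3, 4, {"wildlife": 5, "humans": 1, "livestock": 4}),
            "pathogen_B": (2, 5, {"wildlife": 2, "humans": 4, "livestock": 1}),
        }

        def sector_risks(entry, spread, consequences):
            """Per-sector risk = entry x spread x sector consequence."""
            return {s: entry * spread * c for s, c in consequences.items()}

        for name, (entry, spread, cons) in pathogens.items():
            risks = sector_risks(entry, spread, cons)
            # per-sector values rank within a sector; the sum ranks nationally
            print(name, risks, "total:", sum(risks.values()))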

  1. Development of a statistically based access delay timeline methodology.

    SciTech Connect

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The purpose of adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard for uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample sizes, expert judgment, human factors, and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results to make informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
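
    A minimal Monte Carlo sketch conveys the shift from summed point estimates to a delay-time distribution. The lognormal task-time model and all parameters below are assumptions for illustration, not the Bayesian machinery of the actual methodology.

        # Monte Carlo sketch: total path delay as a distribution, not a sum of
        # worst-case constants. Task parameters are assumed.
        import math
        import random

        # (median seconds, lognormal sigma) per sequential barrier task
        tasks = [(60.0, 0.4), (120.0, 0.5), (45.0, 0.3)]

        def sample_path_delay():
            """One realization of total path delay: a sum of lognormal task times."""
            return sum(random.lognormvariate(math.log(m), s) for m, s in tasks)

        draws = sorted(sample_path_delay() for _ in range(100_000))
        p10 = draws[int(0.10 * len(draws))]   # conservative low-side delay
        print(f"10th-percentile total path delay: {p10:.0f} s")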

  2. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life-cycle costs and ease the program development effort. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correlation between the object states and real-world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance code efficiency by, for example, eliminating unused subroutines, and provided automated report generation as well as functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness, and efficiency goals are discussed.

  3. Development of tools, technologies, and methodologies for imaging sensor testing

    NASA Astrophysics Data System (ADS)

    Lowry, H.; Bynum, K.; Steely, S.; Nicholson, R.; Horne, H.

    2013-05-01

    Ground testing of space- and air-borne imaging sensor systems is supported by Vis-to-LWIR imaging sensor calibration and characterization, as well as by hardware-in-the-loop (HWIL) simulation with high-fidelity complex scene projection to validate sensor mission performance. Accomplishing this successfully requires the development of tools, technologies, and methodologies for use in space simulation chambers for such testing. This paper provides an overview of such efforts being investigated and implemented at the Arnold Engineering Development Complex (AEDC).

  4. Decanting geriatric institutions: development of a patient assessment methodology.

    PubMed

    Warner, M M

    1991-01-01

    Many elderly people in both developing and developed countries are institutionalized--often irrespective of whether their ability to function requires it. Increased attention is now being given to prospects for decanting geriatric institutions and planning new forms of care. However, methodologic difficulties exist: it is hard to determine how much of the institutionalized elderly population could be effectively accommodated by alternate forms of care, which require certain levels of social, physical, and mental capacity. The procedure described in this article, based on work performed in Barbados, seeks to assess the eligibility of an existing institutionalized geriatric population for alternate types of care, thereby laying the groundwork for future planning.

  5. Architecture-Centric Methods and Agile Approaches

    NASA Astrophysics Data System (ADS)

    Babar, Muhammad Ali; Abrahamsson, Pekka

    Agile software development approaches have had significant impact on industrial software development practices. Despite becoming widely popular, there is an increasing perplexity about the role and importance of a system's software architecture in agile approaches [1, 2]. Advocates of the vital role of architecture in achieving the quality goals of large-scale software-intensive systems are skeptical of the scalability of any development approach that does not pay sufficient attention to architectural issues. However, the proponents of agile approaches usually perceive the upfront design and evaluation of architecture as being of less value to the customers of a system. According to them, for example, re-factoring can help fix most of the problems. Many experiences show that large-scale re-factoring often results in significant defects, which are very costly to address later in the development cycle. Re-factoring is considered worthwhile as long as the high-level design is good enough to limit the need for large-scale re-factoring [1, 3, 4].

  6. Compact, Automated, Frequency-Agile Microspectrofluorimeter

    NASA Technical Reports Server (NTRS)

    Fernandez, Salvador M.; Guignon, Ernest F.

    1995-01-01

    Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.

  7. Development of Testing Methodologies for the Mechanical Properties of MEMS

    NASA Technical Reports Server (NTRS)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort investigates and designs testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS), as well as the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of the stochastic strength response predicted by the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be the analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This is a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
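
    The Weibull size effect the proposed test targets can be illustrated with the standard two-parameter weakest-link form, P_f = 1 - exp[-(A/A0)(sigma/sigma0)^m]. The sketch below uses assumed parameter values, not measured MEMS data.

        # Weakest-link (Weibull) size-effect sketch; all parameters assumed.
        import math

        def weibull_pof(stress, area, sigma0=1.0e9, m=10.0, area0=1.0e-8):
            """Weibull probability of failure with area scaling."""
            return 1.0 - math.exp(-(area / area0) * (stress / sigma0) ** m)

        # Hypothetical membrane areas (m^2) at the same applied stress:
        for area in (1.0e-8, 4.0e-8):
            print(area, round(weibull_pof(0.8e9, area), 3))
        # The larger membrane fails more often at equal stress -- the size
        # effect the proposed membrane test would look for.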

  8. Photovoltaic-system costing-methodology development. Final report

    SciTech Connect

    Not Available

    1982-07-01

    Presented are the results of a study to expand the use of standardized costing methodologies in the National Photovoltaics Program. The costing standards, which include SAMIS for manufacturing costs and M and D for marketing and distribution costs, have been applied to concentrator collectors and power-conditioning units. The M and D model was also computerized. Finally, a uniform construction cost-accounting structure was developed for use in photovoltaic test and application projects. The appendices contain example cases which demonstrate the use of the models.

  9. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  10. Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2016-01-01

    To ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASA's critical software (SW) systems as the software industry rapidly transitions from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA), established the Agile Benchmarking Team (ABT). The Team's tasks were: 1. Research background literature on current Agile processes; 2. Perform benchmark activities with other organizations involved in software Agile processes to determine best practices; 3. Collect information on Agile-developed systems to enable improvements to the current NASA standards and processes, enhancing their ability to support reliable software assurance on NASA Agile-developed systems; 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering, and software assurance are addressed herein.

  11. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology

    PubMed Central

    Comella, Cynthia L.; Fox, Susan H.; Bhatia, Kailash P.; Perlmutter, Joel S.; Jinnah, Hyder A.; Zurowski, Mateusz; McDonald, William M.; Marsh, Laura; Rosen, Ami R.; Waliczek, Tracy; Wright, Laura J.; Galpern, Wendy R.; Stebbins, Glenn T.

    2016-01-01

    We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia Rating Scale (CCDRS). The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified, and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no existing scale evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS-2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of the reliability and validity of the CCDRS is described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies. PMID:27088112

  12. Developing a science of land change: Challenges and methodological issues

    PubMed Central

    Rindfuss, Ronald R.; Walsh, Stephen J.; Turner, B. L.; Fox, Jefferson; Mishra, Vinod

    2004-01-01

    Land-change science has emerged as a foundational element of global environment change and sustainability science. It seeks to understand the human and environment dynamics that give rise to changed land uses and covers, not only in terms of their type and magnitude but their location as well. This focus requires the integration of social, natural, and geographical information sciences. Each of these broad research communities has developed different ways to enter the land-change problem, each with different means of treating the locational specificity of the critical variables, such as linking the land manager to the parcel being managed. The resulting integration encounters various data, methodological, and analytical problems, especially those concerning aggregation and inference, land-use pixel links, data and measurement, and remote sensing analysis. Here, these integration problems, which hinder comprehensive understanding and theory development, are addressed. Their recognition and resolution are required for the sustained development of land-change science. PMID:15383671

  13. Aggregate Building Simulator (ABS) Methodology Development, Application, and User Manual

    SciTech Connect

    Dirks, James A.; Gorrissen, Willy J.

    2011-11-30

    As the relationship between the national building stock and various global energy issues becomes a greater concern, it has been deemed necessary to develop a system for predicting the energy consumption of large groups of buildings. Ideally this system would take advantage of the most advanced energy simulation software available, execute runs quickly, and provide concise and useful results at a level of detail that meets the user's needs without inundating them with data. The resulting methodology allows the user to quickly develop and execute energy simulations of many buildings simultaneously, taking advantage of parallel processing to greatly reduce total simulation times. The results of these simulations can then be rapidly condensed and presented in a useful and intuitive manner.
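
    The parallel batch-execution idea can be sketched with a process pool mapped over building models. The simulate_building stub below is a hypothetical stand-in for a call to an external simulation engine; it is not the ABS code.

        # Sketch of running many building simulations in parallel, then
        # condensing the results to one aggregate figure.
        from concurrent.futures import ProcessPoolExecutor

        def simulate_building(model):
            """Placeholder: annual energy use (kWh) for one building model."""
            floor_area, intensity = model   # assumed inputs
            return floor_area * intensity

        if __name__ == "__main__":
            models = [(1000.0, 150.0), (2500.0, 90.0), (800.0, 210.0)]
            with ProcessPoolExecutor() as pool:
                results = list(pool.map(simulate_building, models))
            # condense: aggregate consumption instead of per-building detail
            print("aggregate consumption:", sum(results), "kWh")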

  14. The SIMRAND methodology - Simulation of Research and Development Projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
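
    A toy version of the simulate-and-rank idea: draw the total cost of each candidate task set under uncertainty and keep the best performer. The distributions and the cost metric below are assumptions, not SIMRAND's actual models.

        # Monte Carlo ranking of candidate task sets; all values assumed.
        import random

        # (mean cost, std dev) per task in each candidate set
        task_sets = {
            "set_A": [(5.0, 1.0), (3.0, 0.5)],
            "set_B": [(4.0, 2.0), (4.0, 2.0)],
        }

        def mean_total_cost(tasks, n=10_000):
            """Average total cost of a task set over n Monte Carlo trials."""
            return sum(
                sum(random.gauss(mu, sd) for mu, sd in tasks)
                for _ in range(n)
            ) / n

        best = min(task_sets, key=lambda k: mean_total_cost(task_sets[k]))
        print("preferred alternative:", best)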

  15. Intelligent System Development Using a Rough Sets Methodology

    NASA Technical Reports Server (NTRS)

    Anderson, Gray T.; Shelton, Robert O.

    1997-01-01

    The purpose of this research was to examine the potential of the rough sets technique for developing intelligent models of complex systems from limited information. Rough sets are a simple but promising technology for extracting easily understood rules from data. The rough set methodology has been shown to perform well when used with a large set of exemplars, but its performance with sparse data sets is less certain. The difficulty is that rules will be developed based on just a few examples, each of which might have a large amount of noise associated with it. The question then becomes: what is the probability of a useful rule being developed from such limited information? One nice feature of rough sets is that in unusual situations the technique can give an answer of 'I don't know'. That is, if a case arises that differs from the cases the rough set rules were developed on, the methodology can recognize this and alert human operators. It can also be trained to do this when the desired action is unknown because conflicting examples apply to the same set of inputs. This summer's project was to combine rough set theory with statistical theory to develop confidence limits for rules developed by rough sets. Often it is important not to make a certain type of mistake (e.g., false positives or false negatives), so the rules must be biased toward preventing a catastrophic error rather than giving the most likely course of action. A method to determine the best course of action in the light of such constraints was examined. The resulting technique was tested with files containing electrical power line 'signatures' from the space shuttle and with decompression sickness data.
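
    The 'I don't know' behavior falls directly out of the lower and upper approximations of rough set theory, as the minimal sketch below shows; the decision table is made up for illustration.

        # Minimal rough-set sketch: lower/upper approximations of a decision
        # class from a tiny, made-up decision table.
        from collections import defaultdict

        # rows: (condition attributes, decision)
        table = [(("high", "yes"), "fail"),
                 (("high", "yes"), "ok"),    # conflicts with the row above
                 (("low",  "no"),  "ok"),
                 (("low",  "yes"), "fail")]

        # indiscernibility classes: rows with identical condition attributes
        classes = defaultdict(list)
        for cond, dec in table:
            classes[cond].append(dec)

        target = "fail"
        lower = [c for c, ds in classes.items() if all(d == target for d in ds)]
        upper = [c for c, ds in classes.items() if any(d == target for d in ds)]
        print("certain rules from:", lower)    # lower approximation
        print("possible rules from:", upper)   # upper approximation
        # Cases in the upper but not the lower approximation are exactly
        # where the method answers "I don't know".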

  16. Methodology of citrate-based biomaterial development and application

    NASA Astrophysics Data System (ADS)

    Tran, M. Richard

    Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and modes of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development, able to address a broad spectrum of requirements, would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs Cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the material properties of the resulting material to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture. To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to

  17. What Does an Agile Coach Do?

    NASA Astrophysics Data System (ADS)

    Davies, Rachel; Pullicino, James

    The surge in Agile adoption has created a demand for project managers who coach rather than direct their teams. A sign of this trend is the ever-increasing number of people getting certified as scrum masters and agile leaders. Training courses that introduce agile practices are easy to find. But making the transition to coach is not as simple as understanding what agile practices are. Your challenge as an Agile Coach is to support your team in learning how to wield their new Agile tools in creating great software.

  18. Development of a Composite Delamination Fatigue Life Prediction Methodology

    NASA Technical Reports Server (NTRS)

    O'Brien, Thomas K.

    2009-01-01

    Delamination is one of the most significant and unique failure modes in composite structures. Because of a lack of understanding of the consequences of delamination and the inability to predict delamination onset and growth, many composite parts are unnecessarily rejected upon inspection, both immediately after manufacture and while in service. NASA Langley is leading the efforts in the U.S. to develop a fatigue life prediction methodology for composite delamination using fracture mechanics. Research being performed to this end will be reviewed. Emphasis will be placed on the development of test standards for delamination characterization, incorporation of approaches for modeling delamination in commercial finite element codes, and efforts to mature the technology for use in design handbooks and certification documents.

  19. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. In particular, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  20. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.

  1. Turbofan Engine Core Compartment Vent Aerodynamic Configuration Development Methodology

    NASA Technical Reports Server (NTRS)

    Hebert, Leonard J.

    2006-01-01

    This paper presents an overview of the design methodology used in the development of the aerodynamic configuration of the nacelle core compartment vent for a typical Boeing commercial airplane together with design challenges for future design efforts. Core compartment vents exhaust engine subsystem flows from the space contained between the engine case and the nacelle of an airplane propulsion system. These subsystem flows typically consist of precooler, oil cooler, turbine case cooling, compartment cooling and nacelle leakage air. The design of core compartment vents is challenging due to stringent design requirements, mass flow sensitivity of the system to small changes in vent exit pressure ratio, and the need to maximize overall exhaust system performance at cruise conditions.

  2. Piloted simulator assessments of agility

    NASA Technical Reports Server (NTRS)

    Schneider, Edward T.

    1990-01-01

    NASA has utilized piloted simulators for nearly two decades to study high-angle-of-attack flying qualities, agility, and air-to-air combat. These studies have included assessments of an F-16XL aircraft equipped with thrust vectoring, an assessment of the F-18 HARV maneuvering requirements to assist in thrust vectoring control system design, and an agility assessment of the F-18. The F-18 agility assessment was compared with in-flight testing. Open-loop maneuvers such as 180-deg rolls to measure roll rate showed favorable simulator/in-flight comparison. Closed-loop maneuvers such as rolls to 90 deg with precision stops or certain maximum longitudinal pitching maneuvers showed poorer performance due to reduced aggressiveness of pilot inputs in flight to remain within flight envelope limits.

  3. Organizational Culture and the Deployment of Agile Methods: The Competing Values Model View

    NASA Astrophysics Data System (ADS)

    Iivari, Juhani; Iivari, Netta

    A number of researchers have identified organizational culture as a factor that potentially affects the deployment of agile systems development methods. Inspired by the study of Iivari and Huisman (2007), which focused on the deployment of traditional systems development methods, the present paper proposes a number of hypotheses about the influence of organizational culture on the deployment of agile methods.

  4. Onshore and Offshore Outsourcing with Agility: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Kussmaul, Clifton

    This chapter reflects on a case study of an agile distributed project that ran for approximately three years (from spring 2003 to spring 2006). The project involved (a) a customer organization with key personnel distributed across the US, developing an application with rapidly changing requirements; (b) onshore consultants with expertise in project management, development processes, offshoring, and relevant technologies; and (c) an external offsite development team in a CMM-5 organization in southern India. This chapter is based on surveys and discussions with multiple participants. The several years since the project was completed allow greater perspective on both its strengths and weaknesses, since the participants can reflect on the entire life of the project and compare it to subsequent experiences. Our findings emphasize the potential for agile project management in distributed software development, and the importance of people and interactions, of taking many small steps to find and correct errors, and of matching the structures of the project and product to support implementation of agility.

  5. Methodology for developing competency standards for dietitians in Australia.

    PubMed

    Palermo, Claire; Conway, Jane; Beck, Eleanor J; Dart, Janeane; Capra, Sandra; Ash, Susan

    2016-03-01

    Competency standards document the knowledge, skills, and attitudes required for competent performance. This study develops competency standards for dietitians in order to substantiate an approach to competency standard development. Focus groups explored the current and emerging purpose, role, and function of the profession, and the findings were used to draft competency standards. Consensus was then sought using two rounds of a Delphi survey. Seven focus groups were conducted with 28 participants (15 employers/practitioners, 5 academics, 8 new graduates). Eighty-two of 110 invited experts participated in round one, and 67 experts completed round two. Four major functions of dietitians were identified: being a professional; influencing the health of individuals, groups, communities, and populations through evidence-based nutrition practice; and working collaboratively in teams. Overall there was a high level of consensus on the standards: 93% achieved agreement by participants in round one, and all revised standards achieved consensus in round two. The methodology provides a framework for other professions wishing to embark on competency standard review or development.

  6. Development and application of proton NMR methodology to lipoprotein analysis

    NASA Astrophysics Data System (ADS)

    Korhonen, Ari Juhani

    1998-11-01

    The present thesis describes the development of 1H NMR spectroscopy and its applications to lipoprotein analysis in vitro, utilizing biochemical prior knowledge and advanced lineshape fitting analysis in the frequency domain. A method for absolute quantification of lipoprotein lipids and proteins directly from the terminal methyl-CH3 resonance region of 1H NMR spectra of human blood plasma is described. Then the use of NMR methodology in time course studies of the oxidation process of LDL particles is presented. The function of the cholesteryl ester transfer protein (CETP) in lipoprotein mixtures was also assessed by 1H NMR, which allows for dynamic follow-up of the lipid transfer reactions between VLDL, LDL, and HDL particles. The results corroborated the suggestion that neutral lipid mass transfer among lipoproteins is not an equimolar heteroexchange. A novel method for studying lipoprotein particle fusion is also demonstrated. It is shown that the progression of proteolytically (α-chymotrypsin) induced fusion of LDL particles can be followed by 1H NMR spectroscopy and, moreover, that fusion can be distinguished from aggregation. In addition, NMR methodology was used to study the changes in HDL3 particles induced by phospholipid transfer protein (PLTP) in HDL3 + PLTP mixtures. The 1H NMR study revealed a gradual production of enlarged HDL particles, which demonstrated that PLTP-mediated remodeling of HDL involves fusion of the HDL particles. These applications demonstrated that the 1H NMR approach offers several advantages both in quantification and in time course studies of lipoprotein-lipoprotein interactions and of enzyme/lipid transfer protein function.

  7. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
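
    The two loss-function shapes named above differ in one key respect: quadratic (Taguchi-style) loss grows without bound with deviation from target, while an inverted-normal loss saturates at a maximum cost. A sketch with assumed constants:

        # Two loss-function shapes for mapping process deviation to economic
        # loss; the constants here are illustrative, not the paper's values.
        import math

        def taguchi_loss(y, target, k=1.0):
            """Quadratic (Taguchi-style) loss: unbounded growth with deviation."""
            return k * (y - target) ** 2

        def inverted_normal_loss(y, target, c_max=1.0, gamma=1.0):
            """Inverted-normal loss: saturates at c_max far from the target."""
            return c_max * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

        for dev in (0.5, 1.0, 3.0):
            print(dev, taguchi_loss(10 + dev, 10),
                  round(inverted_normal_loss(10 + dev, 10), 3))
        # Integrating per-deviation losses over a scenario gives the total
        # economic consequence that feeds the risk estimate.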

  8. Project-Method Fit: Exploring Factors That Influence Agile Method Use

    ERIC Educational Resources Information Center

    Young, Diana K.

    2013-01-01

    While the productivity and quality implications of agile software development methods (SDMs) have been demonstrated, research concerning the project contexts where their use is most appropriate has yielded less definitive results. Most experts agree that agile SDMs are not suited for all project contexts. Several project and team factors have been…

  9. Architected Agile Solutions for Software-Reliant Systems

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Lane, Jo Ann; Koolmanojwong, Supannika; Turner, Richard

    Systems are becoming increasingly reliant on software due to needs for rapid fielding of “70% capabilities,” interoperability, net-centricity, and rapid adaptation to change. The latter need has led to increased interest in agile methods of software development, in which teams rely on shared tacit interpersonal knowledge rather than explicit documented knowledge. However, such systems often need to be scaled up to higher level of performance and assurance, requiring stronger architectural support. Several organizations have recently transformed themselves by developing successful combinations of agility and architecture that can scale to projects of up to 100 personnel. This chapter identifies a set of key principles for such architected agile solutions for software-reliant systems, provides guidance for how much architecting is enough, and illustrates the key principles with several case studies.

  10. Development of Methodologies for IV and V of Neural Networks

    NASA Technical Reports Server (NTRS)

    Taylor, Brian; Darrah, Marjorie

    2003-01-01

    Non-deterministic systems often rely upon neural network (NN) technology to "learn" to manage flight systems under controlled conditions using carefully chosen training sets. How can these adaptive systems be certified to ensure that they will become increasingly efficient and behave appropriately in real-time situations? The bulk of Independent Verification and Validation (IV&V) research on non-deterministic software control systems such as Adaptive Flight Controllers (AFCs) addresses NNs in well-behaved and constrained environments such as simulations and strict process control. However, neither substantive research nor effective IV&V techniques have been found to address AFCs learning in real time and adapting to live flight conditions. Adaptive flight control systems offer good extensibility into commercial aviation as well as military aviation and transportation. Consequently, this area of IV&V represents an area of growing interest and urgency. ISR proposes to further the current body of knowledge to meet two objectives: research the current IV&V methods and assess where these methods may be applied toward a methodology for the V&V of neural networks; and identify effective methods for IV&V of NNs that learn in real time, including developing a prototype test bed for IV&V of AFCs. Currently, no practical method exists. ISR will meet these objectives through the tasks identified and described below. First, ISR will conduct a literature review of current IV&V technology. To do this, ISR will collect the existing body of research on IV&V of non-deterministic systems and neural networks. ISR will also develop the framework for disseminating this information through specialized training. This effort will focus on developing NASA's capability to conduct IV&V of neural network systems and to provide training to meet the increasing need for IV&V expertise in such systems.

  11. Flight dynamics research for highly agile aircraft

    NASA Technical Reports Server (NTRS)

    Nguyen, Luat T.

    1989-01-01

    This paper highlights recent results of research conducted at the NASA Langley Research Center as part of a broad flight dynamics program aimed at developing technology that will enable future combat aircraft to achieve greatly enhanced agility capability at subsonic combat conditions. Studies of advanced control concepts encompassing both propulsive and aerodynamic approaches are reviewed. Dynamic stall phenomena and their potential impact on maneuvering performance and stability are summarized. Finally, issues of mathematical modeling of complex aerodynamics occurring during rapid, large amplitude maneuvers are discussed.

  12. Towards a general object-oriented software development methodology

    NASA Technical Reports Server (NTRS)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    An object is an abstract software model of a problem domain entity. Objects are packages of both data and operations on that data (Goldberg 83, Booch 83). The Ada (tm) package construct is representative of this general notion of an object. Object-oriented design is the technique of using objects as the basic unit of modularity in systems design. The Software Engineering Laboratory at the Goddard Space Flight Center is currently involved in a pilot program to develop a flight dynamics simulator in Ada (approximately 40,000 statements) using object-oriented methods. Several authors have applied object-oriented concepts to Ada (e.g., Booch 83, Cherry 85). It was found that these methodologies are limited. As a result, a more general approach was synthesized which allows a designer to apply powerful object-oriented principles to a wide range of applications and at all stages of design. An overview of this approach is provided. Further, how object-oriented design fits into the overall software life cycle is considered.

  13. Lean vs Agile in the Context of Complexity Management in Organizations

    ERIC Educational Resources Information Center

    Putnik, Goran D.; Putnik, Zlata

    2012-01-01

    Purpose: The objective of this paper is to provide a deeper insight into the relationship of the issue "lean vs agile" in order to inform managers towards more coherent decisions especially in a dynamic, unpredictable, uncertain, non-linear environment. Design/methodology/approach: The methodology is an exploratory study based on secondary data…

  14. Development of an aeroelastic methodology for surface morphing rotors

    NASA Astrophysics Data System (ADS)

    Cook, James R.

    Helicopter performance capabilities are limited by maximum lift characteristics and vibratory loading. In high-speed forward flight, dynamic stall and transonic flow greatly increase the amplitude of vibratory loads. Experiments and computational simulations alike have indicated that a variety of active rotor control devices are capable of reducing vibratory loads. For example, periodic blade twist and flap excitation have been optimized to reduce vibratory loads in various rotors. Airfoil geometry can also be modified in order to increase lift coefficient, delay stall, or weaken transonic effects. To explore the potential benefits of active controls, computational methods are being developed for aeroelastic rotor evaluation, including coupling between computational fluid dynamics (CFD) and computational structural dynamics (CSD) solvers. In many contemporary CFD/CSD coupling methods it is assumed that the airfoil is rigid, reducing the interface to a single dimension. Some methods retain the conventional one-dimensional beam model while prescribing an airfoil shape to simulate active chord deformation. However, to simulate the actual response of a compliant airfoil it is necessary to include deformations that originate not only from control devices (such as piezoelectric actuators), but also from inertial forces, elastic stresses, and aerodynamic pressures. An accurate representation of the physics requires an interaction with a more complete representation of loads and geometry. A CFD/CSD coupling methodology capable of communicating three-dimensional structural deformations and a distribution of aerodynamic forces over the wetted blade surface has not yet been developed. In this research an interface is created within the Fully Unstructured Navier-Stokes (FUN3D) solver that communicates aerodynamic forces on the blade surface to University of Michigan's Nonlinear Active Beam Solver (UM/NLABS -- referred to as NLABS in this thesis). Interface routines are developed for

  15. Reaching the grassroots: publishing methodologies for development organizations.

    PubMed

    Zielinski, C

    1987-01-01

    There are 3 major distinctions between the traditional form of academic publishing and publishing for the grassroots as a development-organization activity, particularly in developing countries. Whereas academic publishing seeks to cover the target audience in its entirety, grassroots publishing can only cover a sampling. Academic publishing fulfills a need, while grassroots publishing demonstrates a need and a way to fulfill it. Finally, whereas academic publishing is largely a support activity aimed at facilitating the dissemination of information as a relatively minor part of a technical program, grassroots publishing is a more substantive activity aimed at producing a catalytic effect. Publication for the grassroots further calls for a different methodological approach. Given the constraint of numbers, publications aimed at the grassroots can only be examples or prototypes. The function of a prototype is to serve both as a basis for translation, adaptation, and replication and as a model end result. The approach to the use and promotion of prototypes differs according to the specific country situation. In countries with a heterogenous culture or several different languages, 2 items should be produced: a prototype of the complete text, which should be pretested and evaluated, and a prototype adaptation kit stripped of cultural and social biases. Promotion of the translation and replication of a publication can be achieved by involving officials at the various levels of government, interesting international and voluntary funding agencies, and stimulating indigenous printing capacities at the community level. The most important factors are the appropriateness of the publication in solving specific priority problems and the interest and involvement of national and state authorities at all stages of the project. PMID:12280779

  16. SU-E-T-610: Comparison of Treatment Times Between the MLCi and Agility Multileaf Collimators

    SciTech Connect

    Ramsey, C; Bowling, J

    2014-06-01

    Purpose: The Agility is a new 160-leaf MLC developed by Elekta for use in their Infinity and Versa HD linacs. As compared to the MLCi, the Agility increased the maximum leaf speed from 2 cm/s to 3.5 cm/s, and the maximum primary collimator speed from 1.5 cm/s to 9.0 cm/s. The purpose of this study was to determine if the Agility MLC resulted in improved plan quality and/or shorter treatment times. Methods: An Elekta Infinity that was originally equipped with an 80-leaf MLCi was upgraded to a 160-leaf Agility. Treatment plan quality was evaluated using the Pinnacle planning system with SmartArc. Optimization was performed once for the MLCi and once for the Agility beam models using the same optimization parameters and the same number of iterations. Patient treatment times were measured for all IMRT, VMAT, and SBRT patients treated on the Infinity with the MLCi and Agility MLCs. Treatment times were extracted from the EMR and measured from when the patient first walked into the treatment room until exiting the treatment room. Results: 11,380 delivery times were measured for patients treated with the MLCi, and 1,827 measurements have been made for the Agility MLC. The average treatment times were 19.1 minutes for the MLCi and 20.8 minutes for the Agility. Using a t-test analysis, there was no difference between the two groups (t = 0.22). The dose differences between patients planned with the MLCi and the Agility MLC were minimal. For example, the doses for the PTV, GTV, and cord for a head and neck patient planned using Pinnacle were effectively equivalent. However, the dose to the parotid glands was slightly worse with the Agility MLC. Conclusion: There was no statistical difference in treatment time, or any significant dosimetric difference, between the Agility MLC and the MLCi.
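
    The reported comparison is a standard two-sample t-test on in-room times. A sketch of such a test follows; the arrays are synthetic placeholders (the study used the 11,380 and 1,827 measured times), and the assumed spread is hypothetical.

        # Welch two-sample t-test on treatment times; data here are synthetic.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        mlci_times = rng.normal(19.1, 6.0, 500)      # assumed spread (minutes)
        agility_times = rng.normal(20.8, 6.0, 200)

        t, p = stats.ttest_ind(mlci_times, agility_times, equal_var=False)
        print(f"t = {t:.2f}, p = {p:.3f}")   # a large p supports "no difference"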

  17. Development of an in-situ soil structure characterization methodology

    NASA Astrophysics Data System (ADS)

    Debos, Endre; Kriston, Sandor

    2015-04-01

    Agricultural cultivation has several direct and indirect effects on soil properties, among which soil structure degradation is the best known and most detectable one. Soil structure degradation leads to several water and nutrient management problems, which reduce the efficiency of agricultural production. There are several innovative technological approaches aiming to reduce these negative impacts on the soil structure. The testing, validation, and optimization of these methods require an adequate technology to measure the impacts on the complex soil system. This study aims to develop an in-situ soil structure and root development testing methodology, which can be used in field experiments and which allows one to follow real-time changes in the soil structure (evolution/degradation) and their quantitative characterization. The method is adapted from remote sensing image processing technology. A specifically modified A4-size scanner is placed in the soil at a depth that cannot be reached by the agrotechnical treatments. Only the scanner's USB cable comes to the surface, allowing image acquisition without any soil disturbance. Several images of the same place can be taken throughout the vegetation season to follow soil consolidation and structure development after the last tillage treatment for seedbed preparation. The scanned image of the soil profile is classified using supervised image classification, namely the maximum likelihood classification algorithm. The resulting image has two principal classes, soil matrix and pore space, and other complementary classes to cover the occurring thematic classes, such as roots and stones. The calculated data are calibrated with field-sampled porosity data. As the scanner is buried under the soil with no changes in light conditions, the image processing can be automated for better temporal comparison. Besides the total porosity, each pore-size fraction and its distribution can be calculated for
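
    The porosity computation from a classified scan reduces to a pixel count. In the sketch below a simple intensity threshold stands in for the maximum likelihood classifier, and the image is synthetic.

        # Porosity from a classified soil scan: pore pixels / all pixels.
        # Threshold classification stands in for the ML classifier here.
        import numpy as np

        rng = np.random.default_rng(1)
        image = rng.integers(0, 256, size=(200, 200))   # stand-in for a scan
        threshold = 80     # assumed intensity separating dark pores from matrix

        pore_mask = image < threshold
        porosity = pore_mask.mean()        # fraction of pixels that are pore
        print(f"total porosity: {porosity:.2%}")
        # Pore-size fractions would follow by labeling connected pore regions
        # (e.g., scipy.ndimage.label) and histogramming their areas.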

  18. SuperAGILE: The Hard X-ray Imager of AGILE

    SciTech Connect

    Feroci, M.; Costa, E.; Barbanera, L.; Del Monte, E.; Di Persio, G.; Frutti, M.; Lapshov, I.; Lazzarotto, F.; Pacciani, L.; Porrovecchio, G.; Preger, B.; Rapisarda, M.; Rubini, A.; Soffitta, P.; Tavani, M.; Mastropietro, M.; Morelli, E.; Argan, A.; Ghirlanda, G.; Mereghetti, S.

    2004-09-28

    SuperAGILE is the hard X-ray (10-40 keV) imager for the gamma-ray mission AGILE, currently scheduled for launch in mid-2005. It is based on 4 Si-microstrip detectors, with a total geometric area of 1444 cm² (maximum effective area about 300 cm²), equipped with one-dimensional coded masks. The 4 detectors are perpendicularly oriented, in order to provide pairs of orthogonal one-dimensional images of the X-ray sky. The field of view of each 1-D detector is 107 deg. x 68 deg., at zero response, with an overlap in the central 68 deg. x 68 deg. area. The angular resolution on axis is 6 arcmin (pixel size). We present here the current status of the hardware development and the scientific potential for GRBs, for which an onboard trigger and imaging system will allow distributing locations through a fast communication telemetry link from AGILE to the ground.

  19. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  20. Evaluation of agile designs in first-in-human (FIH) trials--a simulation study.

    PubMed

    Perlstein, Itay; Bolognese, James A; Krishna, Rajesh; Wagner, John A

    2009-12-01

    The aim of the investigation was to evaluate alternatives to standard first-in-human (FIH) designs in order to optimize the information gained from such studies by employing novel agile trial designs. Agile designs combine adaptive and flexible elements to enable optimized use of prior information either before and/or during conduct of the study to seamlessly update the study design. A comparison of the traditional 6 + 2 (active + placebo) subjects per cohort design with alternative, reduced sample size, agile designs was performed by using discrete event simulation. Agile designs were evaluated for specific adverse event models and rates as well as dose-proportional, saturated, and steep-accumulation pharmacokinetic profiles. Alternative, reduced sample size (hereafter referred to as agile) designs are proposed for cases where prior knowledge about pharmacokinetics and/or adverse event relationships are available or appropriately assumed. Additionally, preferred alternatives are proposed for a general case when prior knowledge is limited or unavailable. Within the tested conditions and stated assumptions, some agile designs were found to be as efficient as traditional designs. Thus, simulations demonstrated that the agile design is a robust and feasible approach to FIH clinical trials, with no meaningful loss of relevant information, as it relates to PK and AE assumptions. In some circumstances, applying agile designs may decrease the duration and resources required for Phase I studies, increasing the efficiency of early clinical development. We highlight the value and importance of useful prior information when specifying key assumptions related to safety, tolerability, and PK.
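
    One ingredient of such simulations is the adverse-event detection probability of a given cohort size. A toy sketch, with an assumed true AE rate and illustrative cohort sizes:

        # Toy cohort comparison: chance of observing at least one adverse
        # event per cohort size. The AE model and values are assumed.
        import random

        def detect_rate(n_active, p_ae, trials=100_000):
            """P(at least one AE observed among n_active subjects)."""
            hits = sum(
                any(random.random() < p_ae for _ in range(n_active))
                for _ in range(trials)
            )
            return hits / trials

        # Traditional 6 active subjects per cohort vs reduced agile cohorts,
        # at an assumed true AE probability of 0.2:
        for n in (6, 4, 3):
            print(n, round(detect_rate(n, p_ae=0.2), 3))
        # The loss in AE-detection probability from smaller cohorts is what
        # such simulations weigh against shorter, cheaper studies.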

  1. The Pre-Conceptual Map Methodology: Development and Application

    ERIC Educational Resources Information Center

    Hipsky, Shellie

    2006-01-01

    The objective of this article is to present the Pre-Conceptual Map methodology as a formalized way to identify, document, and utilize preconceived assumptions on the part of the researcher in qualitative inquiry. This technique can be used as a stand alone method or in conjunction with other qualitative techniques (i.e., naturalistic inquiry).…

  2. Development of a methodology for LES of Turbulent Cavitating Flows

    NASA Astrophysics Data System (ADS)

    Gnanaskandan, Aswin

    The objective of this dissertation is to develop a numerical methodology for large eddy simulation of multiphase cavitating flows on unstructured grids and apply it to study two cavitating flow problems. The multiphase medium is represented using a homogeneous mixture model that assumes thermal equilibrium between the liquid and vapor phases. We develop a predictor-corrector approach to solve the governing Navier-Stokes equations for the liquid/vapor mixture, together with the transport equation for the vapor mass fraction. While a non-dissipative and symmetric scheme is used in the predictor step, a novel characteristic-based filtering scheme with a second-order TVD filter is developed for the corrector step to handle shocks and material discontinuities in non-ideal gases and mixtures. Additionally, a sensor based on vapor volume fraction is proposed to localize dissipation to the vicinity of discontinuities. The scheme is first validated for one-dimensional canonical problems to verify its accuracy in predicting jump conditions across material discontinuities and shocks. It is then applied to two turbulent cavitating flow problems - over a hydrofoil and over a wedge. Our results show that the simulations are in good agreement with experimental data for the above tested cases, and that the scheme can be successfully applied to RANS, LES and DNS methodologies. We first study cavitation over a circular cylinder at two different Reynolds numbers (Re = 200 and 3900 based on cylinder diameter and free-stream velocity) and four different cavitation numbers (sigma = 2.0, 1.0, 0.7 and 0.5). Large Eddy Simulation (LES) is employed at the higher Reynolds number and Direct Numerical Simulations (DNS) at the lower Reynolds number. The unsteady characteristics of the flow are found to be altered significantly by cavitation. It is observed that the simulated cases fall into two different cavitation regimes: cyclic and transitional. Cavitation is seen to significantly influence
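
    The sensor idea can be conveyed with a scalar sketch: flag cells where a normalized second difference of the vapor volume fraction is large, and apply dissipation only there. The smoothing below is a schematic stand-in for the characteristic-based TVD filter, and the threshold is assumed.

        # Schematic, scalar version of sensor-localized dissipation.
        import numpy as np

        alpha = np.concatenate([np.zeros(20), np.ones(20)])   # vapor-fraction jump
        u = np.linspace(0.0, 1.0, 40)                         # some conserved field

        # normalized second-difference sensor on the vapor volume fraction
        num = np.abs(alpha[2:] - 2 * alpha[1:-1] + alpha[:-2])
        den = np.abs(alpha[2:]) + 2 * np.abs(alpha[1:-1]) + np.abs(alpha[:-2]) + 1e-12
        sensor = num / den

        filtered = u.copy()
        mask = sensor > 0.1          # assumed sensor threshold
        # smooth only the flagged interior cells; smooth regions are untouched
        filtered[1:-1][mask] = 0.5 * (u[:-2][mask] + u[2:][mask])
        print(int(mask.sum()), "cells filtered near the interface")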

  3. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in industry every year. However, these methods still lack usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. A worldwide response of 92 practitioners was received. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in improved usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The most widely used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  4. The AGILE gamma-ray astronomy mission

    NASA Astrophysics Data System (ADS)

    Mereghetti, S.; Tavani, M.; Argan, A.; Barbiellini, G.; Caraveo, P.; Chen, A.; Cocco, V.; Costa, E.; Di Cocco, G.; Feroci, M.; Labanti, C.; Lapshov, I.; Lipari, P.; Longo, F.; Morselli, A.; Perotti, F.; Picozza, P.; Pittori, C.; Prest, M.; Rubini, A.; Soffitta, P.; Vallazza, E.; Vercellone, S.; Zanello, D.

    2001-09-01

    We describe the AGILE satellite: a unique tool for high-energy astrophysics in the 30 MeV - 50 GeV range before GLAST. The scientific performance of AGILE is comparable to that of EGRET, despite the much smaller weight and dimensions. The AGILE mission will be optimized for imaging capabilities above 30 MeV and for the study of transient phenomena, complemented by simultaneous monitoring in the hard X-ray band (10 - 40 keV).

  5. Design and characterization of frequency agile RF and microwave devices using ferroelectrics

    NASA Astrophysics Data System (ADS)

    Nath, Jayesh

    A methodology for the optimized design of tunable distributed resonators is introduced and verified. This technique enables maximum tuning with minimum degradation in quality (Q) factor. The concept of a network transformation factor and a new figure-of-merit for tunable resonators are introduced and applied to experimental data. The figure-of-merit quantifies the trade-off between tunability and Q factor for a given tuning ratio of the variable-reactance device. As such, it can be extended to the design of filters, phase shifters, antennas, matching networks, and other frequency-agile devices where resonant elements are used. Varactors utilizing Barium Strontium Titanate (BST) thin films were designed and fabricated in integrated form and also in discrete form as standard 0603 components. High-frequency characterization and modeling of BST varactors are described. A novel characterization technique for the intrinsic loss extraction of symmetrical two-port networks was developed and verified experimentally. Both integrated and discrete BST thin-film varactors were used to design, fabricate, and characterize frequency-agile circuits. Tunable bandpass and bandstop filters and matching networks are described. A dual-mode, narrowband microstrip patch antenna with independently tunable modes was developed and characterized. Tuning and nonlinear characterization results are presented. Investigations of the use of BST thin-film varactors for voltage-controlled oscillators and phase shifters are also presented. Design parameters, fabrication issues, and processing challenges are discussed.

  6. Development of a Design Methodology for Reconfigurable Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; McLean, C.

    2000-01-01

    A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.

  7. On the biomimetic design of agile-robot legs.

    PubMed

    Garcia, Elena; Arevalo, Juan Carlos; Muñoz, Gustavo; Gonzalez-de-Santos, Pablo

    2011-01-01

    The development of functional legged robots has encountered its limits in human-made actuation technology. This paper describes research on the biomimetic design of legs for agile quadrupeds. A biomimetic leg concept is presented that extracts the key principles of horse legs responsible for the agile and powerful locomotion of these animals. The proposed biomimetic leg model defines the effective leg length, leg kinematics, limb mass distribution, actuator power, and elastic energy recovery as determinants of agile locomotion, and values for these five key elements are given. The transfer of the extracted principles to technological instantiations is analyzed in detail, considering the availability of current materials, structures, and actuators. A real leg prototype has been developed following the proposed biomimetic leg concept. The actuation system is based on the hybrid use of series elasticity and magneto-rheological dampers, which provides variable compliance for natural motion. From the experimental evaluation of this prototype, conclusions are drawn about the technological barriers that currently prevent functional legged robots from walking dynamically with agile locomotion.

  8. Methodology development for evaluation of selective-fidelity rotorcraft simulation

    NASA Technical Reports Server (NTRS)

    Lewis, William D.; Schrage, D. P.; Prasad, J. V. R.; Wolfe, Daniel

    1992-01-01

    This paper addresses the initial step toward the goal of establishing performance and handling-qualities acceptance criteria for real-time rotorcraft simulators through a planned research effort to quantify the system capabilities of 'selective fidelity' simulators. Within this framework the simulator is classified based on the required task. The simulator is evaluated by separating the various subsystems (visual, motion, etc.) and applying corresponding fidelity constants based on the specific task. This methodology not only provides an assessment technique, but also a technique to determine the required levels of subsystem fidelity for a specific task.

  9. Multiply-agile encryption in high speed communication networks

    SciTech Connect

    Pierson, L.G.; Witzke, E.L.

    1997-05-01

    Different applications have different security requirements for data privacy, data integrity, and authentication. Encryption is one technique that addresses these requirements. Encryption hardware, designed for use in high-speed communications networks, can satisfy a wide variety of security requirements if that hardware is key-agile, robustness-agile and algorithm-agile. Hence, multiply-agile encryption provides enhanced solutions to the secrecy, interoperability and quality of service issues in high-speed networks. This paper defines these three types of agile encryption. Next, implementation issues are discussed. While single-algorithm, key-agile encryptors exist, robustness-agile and algorithm-agile encryptors are still research topics.
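
    In software terms, key- and algorithm-agility amount to switching the cipher and key per security association. The sketch below illustrates the concept only (the paper concerns hardware encryptors); it uses the third-party Python 'cryptography' package, and the channel names and associations are invented:

      # Each channel carries its own security association naming an algorithm
      # and key; the encryptor switches per payload. Nonces are discarded here
      # for brevity; real use must transmit them alongside the ciphertext.
      import os
      from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

      ASSOCIATIONS = {  # hypothetical per-channel security associations
          "vc-17": {"alg": "AES-CTR",  "key": os.urandom(32)},
          "vc-42": {"alg": "ChaCha20", "key": os.urandom(32)},
      }

      def encrypt(channel, payload):
          sa = ASSOCIATIONS[channel]                  # key-agile: per-channel key
          if sa["alg"] == "AES-CTR":                  # algorithm-agile: per-channel cipher
              cipher = Cipher(algorithms.AES(sa["key"]), modes.CTR(os.urandom(16)))
          elif sa["alg"] == "ChaCha20":
              cipher = Cipher(algorithms.ChaCha20(sa["key"], os.urandom(16)), mode=None)
          else:
              raise ValueError(f"unknown algorithm {sa['alg']}")
          enc = cipher.encryptor()
          return enc.update(payload) + enc.finalize()

      print(len(encrypt("vc-17", b"cell payload")), len(encrypt("vc-42", b"cell payload")))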

  10. Wavelength-Agile External-Cavity Diode Laser for DWDM

    NASA Technical Reports Server (NTRS)

    Pilgrim, Jeffrey S.; Bomse, David S.

    2006-01-01

    A prototype external-cavity diode laser (ECDL) has been developed for communication systems utilizing dense wavelength-division multiplexing (DWDM). This ECDL is an updated version of the ECDL reported in Wavelength-Agile External-Cavity Diode Laser (LEW-17090), NASA Tech Briefs, Vol. 25, No. 11 (November 2001), page 14a. To recapitulate: The wavelength-agile ECDL combines the stability of an external-cavity laser with the wavelength agility of a diode laser. Wavelength is modulated by modulating the injection current of the diode-laser gain element. The external cavity is a Littman-Metcalf resonator, in which the zeroth-order output from a diffraction grating is used as the laser output and the first-order-diffracted light is retro-reflected by a cavity feedback mirror, which establishes one end of the resonator. The other end of the resonator is the output surface of a Fabry-Perot resonator that constitutes the diode-laser gain element. Wavelength is selected by choosing the angle of the diffracted return beam, as determined by position of the feedback mirror. The present wavelength-agile ECDL is distinguished by design details that enable coverage of all 60 channels, separated by 100-GHz frequency intervals, that are specified in DWDM standards.
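
    The 60 channels at 100-GHz spacing correspond to the standard ITU DWDM grid anchored at 193.1 THz; the mapping from grid index to frequency and vacuum wavelength is a one-liner (the specific 60-channel index range below is an assumption):

      # Map a DWDM grid index to frequency and vacuum wavelength (ITU-T G.694.1).
      C = 299_792_458.0          # speed of light, m/s
      ANCHOR_HZ = 193.1e12       # ITU grid anchor frequency
      SPACING_HZ = 100e9         # 100-GHz channel spacing

      def channel(n):
          """Return (frequency in THz, vacuum wavelength in nm) for grid index n."""
          f = ANCHOR_HZ + n * SPACING_HZ
          return f / 1e12, C / f * 1e9

      for n in (-30, 0, 29):     # an illustrative 60-channel span
          f_thz, wl_nm = channel(n)
          print(f"n={n:+3d}: {f_thz:.1f} THz = {wl_nm:.3f} nm")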

  11. Network configuration management: paving the way to network agility.

    SciTech Connect

    Maestas, Joseph H.

    2007-08-01

    Sandia networks consist of nearly nine hundred routers and switches and nearly one million lines of command code, and each line ideally contributes to the capabilities of the network to convey information from one location to another. Sandia's Cyber Infrastructure Development and Deployment organizations recognize that it is therefore essential to standardize network configurations and enforce conformance to industry best business practices and documented internal configuration standards to provide a network that is agile, adaptable, and highly available. This is especially important in times of constrained budgets as members of the workforce are called upon to improve efficiency, effectiveness, and customer focus. Best business practices recommend using the standardized configurations in the enforcement process so that when root cause analysis results in recommended configuration changes, subsequent configuration auditing will improve compliance to the standard. Ultimately, this minimizes mean time to repair, maintains the network security posture, improves network availability, and enables efficient transition to new technologies. Network standardization brings improved network agility, which in turn enables enterprise agility, because the network touches all facets of corporate business. Improved network agility improves the business enterprise as a whole.
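
    The enforcement loop described here reduces, at its core, to diffing each device configuration against the standard. A minimal sketch (device names, standard lines, and configs are all hypothetical):

      # Report standard configuration lines missing from each device.
      STANDARD = {"service password-encryption", "no ip http server",
                  "logging host 10.0.0.9"}

      RUNNING = {  # hypothetical running configs
          "rtr-bldg-880": {"service password-encryption", "ip http server",
                           "logging host 10.0.0.9"},
          "sw-core-01":   {"service password-encryption", "no ip http server",
                           "logging host 10.0.0.9"},
      }

      def audit(config):
          """Return findings: standard lines absent from the device config."""
          return sorted(STANDARD - config)

      for device, config in RUNNING.items():
          missing = audit(config)
          status = "COMPLIANT" if not missing else f"missing: {missing}"
          print(f"{device}: {status}")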

  12. Neuromuscular strategies contributing to faster multidirectional agility performance.

    PubMed

    Spiteri, Tania; Newton, Robert U; Nimphius, Sophia

    2015-08-01

    The aim of this study was first to determine differences in neuromuscular strategy between a faster and a slower agility performance, and second to compare differences in the muscle activation strategy employed when performing two closely executed agility movements. Participants recruited from an elite female basketball team completed an ultrasound examination to determine quadriceps muscle cross-sectional area; a reactive isometric mid-thigh pull to determine the rate of muscle activation, rate of force development, pre-motor time, and motor time; and multidirectional agility tests completing two directional changes in response to a visual stimulus. Peak and average relative muscle activation of the rectus femoris, vastus medialis, vastus lateralis, biceps femoris, semitendinosus, and gastrocnemius were measured 100 ms prior to heel strike (pre-heel strike) and across the stance phase for both directional changes. Faster agility performance was characterized by greater pre-heel strike muscle activity and greater anterior muscle activation during the stance phase, resulting in greater hip and knee extension and increased propulsive impulse. Differences between directional changes appear to result from processing speed: a greater delay in refractory time during the second directional change resulted in greater anterior muscle activation, decelerating the body while movement direction was determined.

  13. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background: In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings: We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions: Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability…
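
    Pwrake workflows are Ruby rakefiles, so the following Python fragment is only a schematic of the paper's two-phase observation: keep the task graph (workflow definition) fixed while iterating on a separate parameter set. Task names and parameters are invented:

      # Parameter-adjustment phase edits only PARAMS; workflow-definition
      # phase edits only WORKFLOW. The two concerns stay separated.
      PARAMS = {"min_base_quality": 20, "threads": 4}

      def align(sample):
          return f"aligned {sample} on {PARAMS['threads']} threads"

      def call_variants(sample):
          return f"variants of {sample} at Q >= {PARAMS['min_base_quality']}"

      WORKFLOW = [align, call_variants]     # tasks in dependency order

      def run(sample):
          for task in WORKFLOW:
              print(task(sample))

      run("NA12878.fastq")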

  14. The GLAST-AGILE Support Program (GASP)

    NASA Astrophysics Data System (ADS)

    Villata, M.; Raiteri, C. M.; Webt Collaboration

    2008-10-01

    The GLAST-AGILE Support Program (GASP) was organized within the Whole Earth Blazar Telescope to provide optical-to-radio long-term continuous monitoring of a list of selected gamma-ray-loud blazars during the operation of the AGILE and GLAST satellites. We present some results obtained since its inception in September 2007.

  15. The Introduction of Agility into Albania.

    ERIC Educational Resources Information Center

    Smith-Stevens, Eileen J.; Shkurti, Drita

    1998-01-01

    Describes a plan to introduce and achieve a national awareness of agility (and easy entry into the world market) for Albania through the relatively stable higher-education order. Agility's four strategic principles are enriching the customer, cooperating to enhance competitiveness, organizing to master change and uncertainty, and leveraging the…

  16. Towards a general object-oriented software development methodology

    NASA Technical Reports Server (NTRS)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    Object diagrams were used to design a 5,000-statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000-statement Ada system and a personal-computer-based system that will be written in Modula-2. The design methodology evolves out of these experiences as well as the limitations of other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework which encompasses concepts from Yourdon, Booch, and Cherry. This general object-oriented approach handles high-level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life-cycle, such as specification and testing, is being studied concurrently.

  17. Development of a methodology for classifying software errors

    NASA Technical Reports Server (NTRS)

    Gerhart, S. L.

    1976-01-01

    A mathematical formalization of the intuition behind classification of software errors is devised and then extended to a classification discipline: Every classification scheme should have an easily discernible mathematical structure and certain properties of the scheme should be decidable (although whether or not these properties hold is relative to the intended use of the scheme). Classification of errors then becomes an iterative process of generalization from actual errors to terms defining the errors together with adjustment of definitions according to the classification discipline. Alternatively, whenever possible, small scale models may be built to give more substance to the definitions. The classification discipline and the difficulties of definition are illustrated by examples of classification schemes from the literature and a new study of observed errors in published papers of programming methodologies.

  18. AGILE integration into APC for high mix logic fab

    NASA Astrophysics Data System (ADS)

    Gatefait, M.; Lam, A.; Le Gratiet, B.; Mikolajczak, M.; Morin, V.; Chojnowski, N.; Kocsis, Z.; Smith, I.; Decaunes, J.; Ostrovsky, A.; Monget, C.

    2015-09-01

    …(high-mix logic fab) in terms of product and technology portfolio. AGILE corrects for up to 120 nm of product topography error on process layers with less than 50 nm depth of focus. Based on tool functionalities delivered by ASML and on high-volume manufacturing requirements, AGILE integration is a real challenge. With regard to ST requirements, the "Automatic AGILE" functionality developed by ASML was not a turnkey solution, and a dedicated capability was needed. An "ST homemade AGILE integration" has therefore been fully developed and implemented within ASML and ST constraints. This paper describes this integration into our Advanced Process Control (APC) platform.

  19. Agile Data Management with the Global Change Information System

    NASA Astrophysics Data System (ADS)

    Duggan, B.; Aulenbach, S.; Tilmes, C.; Goldstein, J.

    2013-12-01

    We describe experiences applying agile software development techniques to the realm of data management during the development of the Global Change Information System (GCIS), a web service and API for authoritative global change information under development by the US Global Change Research Program. Some of the challenges during system design and implementation have been: (1) balancing the need for a rigorous mechanism for ensuring information quality with the realities of large data sets whose contents are often in flux, (2) utilizing existing data to inform decisions about the scope and nature of new data, and (3) continuously incorporating new knowledge and concepts into a relational data model. The workflow for managing the content of the system has much in common with the development of the system itself. We examine various aspects of agile software development and discuss whether or how we have been able to use them for data curation as well as software development.

  1. Agile manufacturing from a statistical perspective

    SciTech Connect

    Easterling, R.G.

    1995-10-01

    The objective of agile manufacturing is to provide the ability to quickly realize high-quality, highly customized, in-demand products at a cost commensurate with mass production. More broadly, agility in manufacturing, or any other endeavor, is defined as change proficiency: the ability to thrive in an environment of unpredictable change. This report discusses the general direction of the agile manufacturing initiative, including research programs at the National Institute of Standards and Technology (NIST), the Department of Energy, and other government agencies, but focuses on agile manufacturing from a statistical perspective. The role of statistics can be important because agile manufacturing requires the collection and communication of process characterization and capability information, much of which will be data-based. The statistical community should initiate collaborative work in this important area.

  2. Agile manufacturing prototyping system (AMPS)

    SciTech Connect

    Garcia, P.

    1998-05-09

    The Agile Manufacturing Prototyping System (AMPS) is being integrated at Sandia National Laboratories. AMPS consists of state-of-the-industry flexible manufacturing hardware and software, enhanced with Sandia advancements in sensor- and model-based control; automated programming, assembly, and task planning; flexible fixturing; and automated reconfiguration technology. AMPS is focused on the agile production of complex electromechanical parts. It currently includes 7 robots (4 Adept One, 2 Adept 505, 1 Staubli RX90), conveyance equipment, and a collection of process equipment forming a flexible production line capable of assembling a wide range of electromechanical products. This system became operational in September 1995. Additional smart manufacturing processes will be integrated in the future. An automated spray-cleaning workcell capable of handling alcohol and similar solvents was added in 1996, as were parts cleaning and encapsulation equipment, automated deburring, and automated vision inspection stations. Plans for 1997 and out-years include adding manufacturing processes for the rapid prototyping of electronic components, such as soldering, paste dispensing, and pick-and-place hardware.

  3. Agile: From Software to Mission System

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Shirley, Mark H.; Hobart, Sarah Groves

    2016-01-01

    The Resource Prospector (RP) is an in-situ resource utilization (ISRU) technology demonstration mission, designed to search for volatiles at the Lunar South Pole. This is NASA's first near-real-time tele-operated rover on the Moon. The combination of short mission duration, a solar-powered rover, and the requirement to explore shadowed regions makes for an operationally challenging mission. To maximize efficiency and flexibility in Mission System design, and thus to improve the performance and reliability of the resulting Mission System, we are tailoring Agile principles that we have used effectively in ground data system software development and applying those principles to the design of elements of the mission operations system.

  4. Agility and mixed-model furniture production

    NASA Astrophysics Data System (ADS)

    Yao, Andrew C.

    2000-10-01

    The manufacture of upholstered furniture provides an excellent opportunity to analyze the effect of a comprehensive communication system on classical production management functions. The objective of the research is to study scheduling heuristics that embrace the concepts inherent in MRP, JIT, and TQM while recognizing the need for agility in a somewhat complex and demanding environment. An on-line, real-time data capture system provides the status and location of production lots, components, and subassemblies for schedule control. Current inventory status of raw materials and purchased items is required in order to develop and adhere to schedules. Given the large variety of styles and fabrics customers may order, the communication system must provide timely, accurate, and comprehensive information for intelligent decisions with respect to product mix and production resources.

  5. Optimum detection of multiple vapor materials with frequency-agile lidar

    NASA Astrophysics Data System (ADS)

    Warren, Russell E.

    1996-07-01

    Differential absorption lidar (DIAL) is a well-established technology for estimating the concentration of vapor materials, and its path integral CL, using two closely spaced wavelengths. The recent development of frequency-agile lasers (FALs) with as many as 60 wavelengths that can be rapidly scanned motivates the need for detection and estimation algorithms that are optimal for lidar employing these new sources. I derive detection and multimaterial CL estimation algorithms for FAL applications using the likelihood-ratio-test methodology of multivariate statistical inference theory. Three model sets of assumptions are considered with regard to the spectral properties of the backscatter from either topographic or aerosol targets. The calculations are illustrated through both simulated and actual lidar data.
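
    The estimation core of such multimaterial processing can be sketched as a linear inverse problem (a generic least-squares illustration under Gaussian noise, not the paper's full likelihood-ratio machinery; all cross sections and concentrations are invented): with K wavelengths and M materials, each measured two-way optical depth is modeled as tau_k = sum_j sigma[k, j] * CL_j, and the CL vector is recovered by least squares:

      import numpy as np

      rng = np.random.default_rng(0)
      K, M = 12, 2
      sigma = rng.uniform(0.5, 3.0, size=(K, M))        # absorption cross sections (arb.)
      cl_true = np.array([1.5, 0.4])                    # true concentration-path products
      tau = sigma @ cl_true + rng.normal(0.0, 0.05, K)  # noisy measured optical depths

      cl_hat, *_ = np.linalg.lstsq(sigma, tau, rcond=None)
      print("true CL:", cl_true, "estimated CL:", np.round(cl_hat, 3))

    With many more wavelengths than materials, the overdetermined system averages down noise, which is the practical payoff of a 60-wavelength FAL over a two-wavelength DIAL.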

  6. Range-resolved frequency-agile CO2 lidar measurements of smokestack vapor effluents

    NASA Astrophysics Data System (ADS)

    D'Amico, Francis M.; Vanderbeek, Richard G.; Warren, Russell E.

    1999-11-01

    Range-resolved lidar measurements of chemical vapor output from a smokestack were conducted using a moderate-power (100 millijoules per pulse) frequency-agile CO2 differential absorption lidar (DIAL) system. A 70-foot non-industrial smokestack, erected for the purpose of studying effluent emissions, was used in the experiment. These measurements were conducted to obtain real data supporting the development of advanced chemical and biological (CB) range-resolved vapor detection algorithms. Plume transmission measurements were made using natural atmospheric backscatter from points at the mouth of the stack and several positions downwind. Controlled releases of triethyl phosphate (TEP), dimethyl methylphosphonate (DMMP), and sulfur hexafluoride (SF6) were performed. Test methodology and experimental results are presented. Effective application of ground-based lidar to the monitoring of smokestack effluents, without the use of fixed targets, is discussed.
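
    For reference, the textbook range-resolved DIAL retrieval behind such measurements takes the concentration in each range cell from on/off return ratios, with backscatter and geometry cancelling when both wavelengths probe the same atmosphere (the signal values below are synthetic, not data from this experiment):

      import numpy as np

      def dial_concentration(p_on, p_off, d_sigma, d_range):
          """Concentration per range cell from on/off wavelength returns."""
          ratio = (p_on[:-1] * p_off[1:]) / (p_on[1:] * p_off[:-1])
          return np.log(ratio) / (2.0 * d_sigma * d_range)

      # Synthetic returns: plume in cells 5-8, differential cross section 0.1
      d_sigma, d_range = 0.1, 30.0
      n_true = np.zeros(12)
      n_true[5:9] = 0.02
      trans = np.exp(-2.0 * d_sigma * d_range * np.cumsum(n_true))
      geom = 1.0 / np.arange(1, 13) ** 2        # shared 1/R^2 falloff
      p_off, p_on = geom, geom * trans
      print(np.round(dial_concentration(p_on, p_off, d_sigma, d_range), 3))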

  7. New developments in the TEP neutral transport methodology

    NASA Astrophysics Data System (ADS)

    Zhang, Dingkang; Mandrekas, J.; Stacey, W. M.

    2004-11-01

    The Transmission and Escape Probabilities (TEP) method [W.M. Stacey, J. Mandrekas, Nucl. Fusion 34 (1994) 1385] is a computationally efficient and accurate technique for the calculation of neutral transport in edge plasmas. The method has been implemented in the GTNEUT code [J. Mandrekas, Comput. Phys. Commun. 161 (2004) 36], which has been benchmarked extensively against Monte Carlo simulations and experiment. Recently, the TEP methodology and the GTNEUT code have been extended to relax certain restrictive assumptions in the original formulation, namely the requirement of an isotropic distribution function at the interfaces and the assumption of a spatially uniform first-collision source. A double-P1 (DP1) expansion allows distributions with linear anisotropies at the interfaces, extending the accuracy of the TEP method to cases where anisotropic effects are important. Three different approaches to handling the non-uniformity of the first-collision source are compared (subdivision into smaller computational regions, spatially dependent expansion functions, and diffusion-theory calculation of the directional escape probabilities). Benchmarks with Monte Carlo simulations are presented.
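
    At its core, the TEP method closes a linear balance on interface partial currents: transmission probabilities couple the currents to each other, and escape probabilities inject the first-collision sources. A two-region toy version (all probabilities and sources invented; this is not the GTNEUT implementation):

      import numpy as np

      T = np.array([[0.0, 0.3],     # T[i, j]: probability a neutral leaving
                    [0.4, 0.0]])    # interface j is transmitted to interface i
      E = np.array([0.6, 0.5])      # escape probabilities for region sources
      S = np.array([1.0, 2.0])      # first-collision neutral sources

      # Balance: Gamma = T @ Gamma + E * S  =>  (I - T) Gamma = E * S
      gamma = np.linalg.solve(np.eye(2) - T, E * S)
      print("interface partial currents:", np.round(gamma, 4))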

  8. Exploring issues in the development of Ayurvedic research methodology.

    PubMed

    Singh, Ram H

    2010-04-01

    Research is the prime need of contemporary Ayurveda, but modern research on Ayurveda has not been very rewarding for Ayurveda itself. Much of it uses Ayurveda to extend modern bioscience. In contrast, Ayurveda needs research designed to test and validate its fundamental concepts as well as its treatments. In this context, if Ayurveda is to be truly explored and validated in all its aspects, scientific inputs should conform to Ayurveda's principles and philosophy. While its evidence base, established since antiquity, may need further verification, research should now focus on the Science of Ayurveda, rather than merely looking for new drugs based on Ayurveda herbals; in-depth research is needed on Ayurveda. Such research will require teamwork between scientists and vaidyas based on truth and trust. Ayurveda research methodology requires the 'whole system testing approach', global participation with protocols evolved through intense interface with modern science, regulatory reforms to eliminate barriers, and to be investigated 'as it is', using approaches adapted from its own basic principles.

  9. Development of the damage assessment methodology for ceiling elements

    NASA Astrophysics Data System (ADS)

    Nitta, Yoshihiro; Iwasaki, Atsumi; Nishitani, Akira; Wakatabe, Morimasa; Inai, Shinsuke; Ohdomari, Iwao; Tsutsumi, Hiroki

    2012-04-01

    This paper presents the basic concept of a damage assessment methodology for ceiling elements with the aid of smart sensor boards and an inspection robot. In the proposed system, the distributed smart sensor boards first detect the occurrence of damage. Next, the robot inspects the damage location and captures a photographic image of the damage condition. The smart sensor board mainly consists of a microcontroller, a strain gage, and a LAN module. The inspection robot integrated into the proposed system has a wireless camera and a wireless LAN device for receiving its control signals. The effectiveness of the smart sensor board and inspection robot is first tested in experiments on a full-scale suspended ceiling using shaking-table facilities. The model ceiling is subjected to several levels of excitation, causing various levels of damage. The robot inspection scheme is then applied to the ceiling of a real structure damaged by the 2011 off the Pacific coast of Tohoku Earthquake. The results indicate that the proposed system can detect both the location and the condition of damage.

  10. Developing an Item Bank for Use in Testing in Africa: Theory and Methodology

    ERIC Educational Resources Information Center

    Furtuna, Daniela

    2014-01-01

    The author describes the steps taken by a research team, of which she was part, to develop a specific methodology for assessing student attainment in primary school, working with the Programme for the Analysis of Education Systems (PASEC) of the Conference of Ministers of Education of French-speaking Countries (CONFEMEN). This methodology provides…

  11. On-line maintenance methodology development and its applications

    SciTech Connect

    Kim, J.; Jae, M.

    2012-07-01

    With increasing economic pressures and the potential for shortened outage times under deregulated electricity markets around the world, licensees are motivated to perform an increasing amount of on-line maintenance (OLM). The benefits of OLM include increased system and plant reliability, reduction of plant equipment and system material condition deficiencies that could adversely impact operations, and reduction of work scope during plant refueling outages. In Korea, allowance guidelines for risk assessment are specified in safety regulation guidelines 16.7 and 16.8 of the Korea Institute of Nuclear Safety (KINS), 'General guidelines of risk-informed application for requesting permission of changes' and 'Requesting permission of changes of risk-informed application for Technical Specification'. We select the emergency diesel generator (EDG) of Ulchin units 3 and 4 for a risk assessment analysis applying configuration changes. The EDG, a safety class 1E component, provides on-site standby power (A and B train EDGs) in the electrical distribution system. It is an important component because it must maintain standby status while the plant is operating, and we therefore select it as the target component of the risk assessment analysis. The risk assessment is limited to core damage frequency (CDF) and is performed using AIMS-PSA Release 2. We evaluate CDF by applying the configuration changes under stated assumptions; both full-power and low-power/shutdown operation were evaluated. This study introduces a methodology and performs the corresponding risk assessment. (authors)
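
    The screening arithmetic behind such configuration-risk decisions is simple: the incremental conditional core damage probability, ICCDP = (CDF_config - CDF_base) x duration, is compared against a screening value (5e-7 is the commonly cited figure, e.g., in US NRC Regulatory Guide 1.177). The CDF numbers below are illustrative assumptions, not Ulchin 3/4 results:

      HOURS_PER_YEAR = 8760.0

      def iccdp(cdf_base, cdf_config, hours):
          """Incremental conditional core damage probability for an OLM window."""
          return (cdf_config - cdf_base) * (hours / HOURS_PER_YEAR)

      cdf_base = 2.0e-5      # per year, baseline (illustrative)
      cdf_edg_out = 6.0e-5   # per year, one EDG train out of service (illustrative)
      for hours in (24.0, 72.0, 168.0):
          risk = iccdp(cdf_base, cdf_edg_out, hours)
          verdict = "screens out" if risk < 5.0e-7 else "needs compensatory measures"
          print(f"{hours:5.0f} h outage: ICCDP = {risk:.2e} ({verdict})")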

  12. Are Agile and Lean Manufacturing Systems Employing Sustainability, Complexity and Organizational Learning?

    ERIC Educational Resources Information Center

    Flumerfelt, Shannon; Siriban-Manalang, Anna Bella; Kahlen, Franz-Josef

    2012-01-01

    Purpose: This paper aims to peruse theories and practices of agile and lean manufacturing systems to determine whether they employ sustainability, complexity and organizational learning. Design/methodology/approach: The critical review of the comparative operational similarities and difference of the two systems was conducted while the new views…

  13. Preparing your Offshore Organization for Agility: Experiences in India

    NASA Astrophysics Data System (ADS)

    Srinivasan, Jayakanth

    Two strategies that have significantly changed the way we conventionally think about managing software development and sustainment are the family of development approaches collectively referred to as agile methods, and the distribution of development efforts on a global scale. When the two strategies are combined, organizations have to address not only the technical challenges that arise from introducing new ways of working, but, more importantly, have to manage the 'soft' factors that, if ignored, lead to hard challenges. Using two case studies of distributed agile software development in India, we illustrate the areas that organizations need to be aware of when transitioning work to India. The key issues that we emphasize are the need to recruit and retain personnel; the importance of teaching, mentoring, and coaching; the need to manage customer expectations; the criticality of a well-articulated senior leadership vision and commitment; and the reality of operating in a heterogeneous process environment.

  14. Development of Fuzzy Logic and Soft Computing Methodologies

    NASA Technical Reports Server (NTRS)

    Zadeh, L. A.; Yager, R.

    1999-01-01

    Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis, and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory - especially to probability theory, decision analysis, and control - and enhance the ability of that theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept - the concept of a perception - a concept which plays a central role in human cognition. The ability to reason with perceptions - perceptions of time, distance, force, direction, shape, intent, likelihood, truth, and other attributes of physical and mental objects - underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf, and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and, ultimately, the brain to resolve detail and store information. More concretely, perceptions are both fuzzy and granular or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity, or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression. In large measure, scientific progress has been, and continues to be…

  15. I'll Txt U if I Have a Problem: How the Société Canadienne du Cancer in Quebec Applied Behavior-Change Theory, Data Mining and Agile Software Development to Help Young Adults Quit Smoking

    PubMed Central

    van Mierlo, Trevor; Fournier, Rachel; Jean-Charles, Anathalie; Hovington, Jacinthe; Ethier, Isabelle; Selby, Peter

    2014-01-01

    Introduction: For many organizations, limited budgets and phased funding restrict the development of digital health tools. This problem is often exacerbated by the ever-increasing sophistication of technology and the costs of programming and maintenance. Traditional development methods tend to be costly, inflexible, and not client-centered. The purpose of this study is to analyze the use of Agile software development and the outcomes of a three-phase mHealth program designed to help young adult Quebecers quit smoking. Methods: In Phase I, literature reviews, focus groups, interviews, and behavior change theory were used in the adaptation and re-launch of an existing evidence-based mHealth platform. Based on analysis of user comments and utilization data from Phase I, the second phase expanded the service to allow participants to live text-chat with counselors. Phase II evaluation led to the third and current phase, in which algorithms were introduced to target pregnant smokers, substance users, students, full-time workers, and those affected by mood disorders and chronic disease. Results: Data collected throughout the three phases indicate that the incremental evolution of the intervention has led to increasing numbers of smokers being enrolled while functional enhancements were being made. In Phase I (240 days), 182 smokers registered with the service; 51% (n = 94) were male and 61.5% (n = 112) were between the ages of 18–24. In Phase II (300 days), 994 smokers registered with the service; 51% (n = 508) were male and 41% (n = 403) were between the ages of 18–24. At 174 days to date, 873 smokers have registered in the third phase; 44% (n = 388) were male and 24% (n = 212) were between the ages of 18–24. Conclusions: Emerging technologies in behavioral science show potential but do not have defined best practices for application development. In phase-based projects with limited funding, Agile appears to be a viable approach to building and expanding…

  16. Opening up the Agile Innovation Process

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Donnellan, Brian; Morgan, Lorraine; Wang, Xiaofeng

    The objective of this panel is to discuss how firms can operate both an open and agile innovation process. In an era of unprecedented changes, companies need to be open and agile in order to adapt rapidly and maximize their innovation processes. Proponents of agile methods claim that one of the main distinctions between agile methods and their traditional bureaucratic counterparts is their drive toward creativity and innovation. However, agile methods are rarely adopted in their textbook, "vanilla" format, and are usually adopted in part or are tailored or modified to suit the organization. While we are aware that this happens, there is still limited understanding of what is actually happening in practice. Using innovation adoption theory, this panel will discuss the issues and challenges surrounding the successful adoption of agile practices. In addition, this panel will report on the obstacles and benefits reported by over 20 industrial partners engaged in a pan-European research project into agile practices between 2006 and 2009.

  17. Social Protocols for Agile Virtual Teams

    NASA Astrophysics Data System (ADS)

    Picard, Willy

    Despite many works on collaborative networked organizations (CNOs), CSCW, groupware, workflow systems, and social networks, computer support for virtual teams is still insufficient, especially support for agility, i.e., the capability of virtual team members to rapidly and cost-efficiently adapt the way they interact to changes. In this paper, requirements for computer support for agile virtual teams are presented. Next, an extension of the concept of social protocol is proposed as a novel model supporting agile interactions within virtual teams. The extended concept of social protocol consists of an extended social network and a workflow model.

  18. Methodological Barriers Precluding the Development of Comprehensive Theory.

    ERIC Educational Resources Information Center

    Wilmot, William; King, Stephen

    The authors examine published research in speech communication and evaluate its potential for theory development. Two major suggestions are advanced that will facilitate the quest for viable theory of speech communication. First, research should begin to focus on relevant communication behaviors rather than merely using them as convenient contexts…

  19. Advances in Artificial Neural Networks - Methodological Development and Application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other ne...

  20. Semantic Differential Studies of Children's Language Development: Some Methodological Problems.

    ERIC Educational Resources Information Center

    O'Dowd, Sarah C.

    1980-01-01

    Demonstrates that both scale-checking style and concept-scale interaction are crucial variables in research using the semantic differential method to assess semantic development. Eight children from each of grades K-5 and 16 adults (aged 17.5-68 years) served as subjects. (MP)

  1. Review of methodological developments in laser Doppler flowmetry.

    PubMed

    Rajan, Vinayakrishnan; Varghese, Babu; van Leeuwen, Ton G; Steenbergen, Wiendelt

    2009-03-01

    Laser Doppler flowmetry is a non-invasive method of measuring microcirculatory blood flow in tissue. In this review the technique is discussed in detail. The theoretical and experimental developments to improve the technique are reviewed. The limitations of the method are elaborated upon, and the research done so far to overcome these limitations is critically assessed.

  2. Application of Information Integration Theory to Methodology of Theory Development.

    ERIC Educational Resources Information Center

    Shanteau, James

    Information integration theory (IIT) seeks to develop a unified theory of judgment and behavior. This theory provides a conceptual framework that has been applied to a variety of research areas including personality impression formation and decision making. In these applications information integration theory has helped to resolve methodological…

  3. Software Development and Test Methodology for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The software processes that have evolved still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to the initial requirements. This paper gives an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we mention the COTS tools that have been integrated into the processes and how these COTS tools have provided value to the project.

  4. Methodology for Developing and Evaluating the PROMIS® Smoking Item Banks

    PubMed Central

    Cai, Li; Stucky, Brian D.; Tucker, Joan S.; Shadel, William G.; Edelen, Maria Orlando

    2014-01-01

    Introduction: This article describes the procedures used in the PROMIS® Smoking Initiative for the development and evaluation of item banks, short forms (SFs), and computerized adaptive tests (CATs) for the assessment of 6 constructs related to cigarette smoking: nicotine dependence, coping expectancies, emotional and sensory expectancies, health expectancies, psychosocial expectancies, and social motivations for smoking. Methods: Analyses were conducted using response data from a large national sample of smokers. Items related to each construct were subjected to extensive item factor analyses and evaluation of differential item functioning (DIF). Final item banks were calibrated, and SF assessments were developed for each construct. The performance of the SFs and the potential use of the item banks for CAT administration were examined through simulation study. Results: Item selection based on dimensionality assessment and DIF analyses produced item banks that were essentially unidimensional in structure and free of bias. Simulation studies demonstrated that the constructs could be accurately measured with a relatively small number of carefully selected items, either through fixed SFs or CAT-based assessment. Illustrative results are presented, and subsequent articles provide detailed discussion of each item bank in turn. Conclusions: The development of the PROMIS smoking item banks provides researchers with new tools for measuring smoking-related constructs. The use of the calibrated item banks and suggested SF assessments will enhance the quality of score estimates, thus advancing smoking research. Moreover, the methods used in the current study, including innovative approaches to item selection and SF construction, may have general relevance to item bank development and evaluation. PMID:23943843
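
    The CAT administration simulated in such studies typically selects, at each step, the unadministered item with the greatest Fisher information at the current trait estimate. A generic two-parameter logistic (2PL) sketch, not the PROMIS code (item parameters and responses are invented):

      import numpy as np

      A = np.array([1.8, 1.2, 2.0, 0.9, 1.5])    # discriminations
      B = np.array([-1.0, 0.0, 0.5, 1.0, -0.5])  # difficulties
      GRID = np.linspace(-4, 4, 161)             # theta grid for ML scoring

      def p2pl(theta, a, b):
          return 1.0 / (1.0 + np.exp(-a * (theta - b)))

      def info(theta, a, b):
          p = p2pl(theta, a, b)
          return a * a * p * (1.0 - p)           # 2PL Fisher information

      def mle_theta(items, resps):
          ll = sum(r * np.log(p2pl(GRID, A[i], B[i])) +
                   (1 - r) * np.log(1.0 - p2pl(GRID, A[i], B[i]))
                   for i, r in zip(items, resps))
          return GRID[np.argmax(ll)]

      administered, responses, theta = [], [], 0.0
      simulated = {0: 1, 1: 1, 2: 0, 3: 0, 4: 1}  # fixed simulated answers
      for _ in range(3):                           # a 3-item CAT
          remaining = [i for i in range(len(A)) if i not in administered]
          nxt = max(remaining, key=lambda i: info(theta, A[i], B[i]))
          administered.append(nxt)
          responses.append(simulated[nxt])
          theta = mle_theta(administered, responses)
          print(f"item {nxt}, response {simulated[nxt]}, theta -> {theta:+.2f}")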

  5. Methodology Development for Assessment of Spaceport Technology Returns and Risks

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla; Zapata, Edgar

    2001-01-01

    As part of Kennedy Space Center's (KSC's) challenge to open the space frontier, new spaceport technologies must be developed, matured, and successfully transitioned to operational systems. R&D investment decisions can be considered from multiple perspectives. Near-, mid-, and far-term technology horizons must be understood. Because a multitude of technology investment opportunities are available, we must identify choices that promise the greatest likelihood of significant lifecycle returns. At the same time, the costs and risks of any choice must be well understood and balanced against its potential returns. The problem is not one of simply rank-ordering projects in terms of their desirability. KSC wants to determine a portfolio of projects that simultaneously satisfies multiple goals, such as getting the biggest bang for the buck, supporting projects that may be too risky for private funding, staying within annual budget cycles without foregoing the requirements of a long-term technology vision, and ensuring the development of a diversity of technologies that support the variety of operational functions involved in space transportation. This work aims to assist in the development of methods and techniques that support strategic technology investment decisions and ease the process of determining an optimal portfolio of spaceport R&D investments. Available literature on risks and returns to R&D is reviewed, and the most useful pieces are brought to the attention of the Spaceport Technology Development Office (STDO). KSC's current project management procedures are reviewed. It is found that the "one size fits all" nature of KSC's existing procedures and project selection criteria is not conducive to prudent decision-making. Directions for improving KSC's procedures and criteria are outlined. With the help of a contractor, STDO is currently developing a tool, named Change Management Analysis Tool (CMAT)/Portfolio Analysis Tool (PAT), to assist KSC's R&D portfolio determination…
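
    The portfolio framing sketched above is, in its simplest form, a constrained selection problem: choose the subset of candidate projects that maximizes expected return within the annual budget. A brute-force 0/1 knapsack illustration (not CMAT/PAT; all project names and figures are invented):

      from itertools import combinations

      PROJECTS = {  # name: (cost $M, expected life-cycle return $M)
          "cryo-loading automation":  (3.0, 9.0),
          "umbilical health monitor": (1.5, 4.0),
          "smart pad diagnostics":    (2.5, 8.5),
          "range-safety upgrade":     (2.0, 3.5),
      }
      BUDGET = 5.0

      best_value, best_set = 0.0, ()
      names = list(PROJECTS)
      for r in range(len(names) + 1):
          for subset in combinations(names, r):
              cost = sum(PROJECTS[p][0] for p in subset)
              value = sum(PROJECTS[p][1] for p in subset)
              if cost <= BUDGET and value > best_value:
                  best_value, best_set = value, subset

      print(f"fund {best_set} -> expected return ${best_value}M")

    Real portfolio tools add risk weighting and multi-year phasing, but the budget-constrained trade-off at the core is the same.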

  6. Development cooperation as methodology for teaching social responsibility to engineers

    NASA Astrophysics Data System (ADS)

    Lappalainen, Pia

    2011-12-01

    The role of engineering in promoting global well-being has become accentuated, turning the engineering curriculum into a means of distributing well-being equally. Gradually intensifying calls for humanitarian engineering have resulted in the incorporation of social responsibility themes into the university curriculum. Cooperation, communication, teamwork, intercultural cooperation, sustainability, and social and global responsibility represent socio-cultural dimensions that are becoming increasingly important as globalisation intensifies the demands for socially and globally adept engineering communities. This article describes an experiment, the Development Cooperation Project, which was conducted at Aalto University in Finland to integrate social responsibility themes into higher engineering education.

  7. Agile Data Curation at a State Geological Survey

    NASA Astrophysics Data System (ADS)

    Hills, D. J.

    2015-12-01

    State agencies, including geological surveys, are often the gatekeepers for myriad data products essential for scientific research and economic development. For example, the Geological Survey of Alabama (GSA) is mandated to explore for, characterize, and report Alabama's mineral, energy, water, and biological resources in support of economic development, conservation, management, and public policy for the betterment of Alabama's citizens, communities, and businesses. As part of that mandate, the GSA has increasingly been called upon to make our data more accessible to stakeholders. Even as demand for greater data accessibility grows, budgets for such efforts are often small, meaning that agencies must do more for less. Agile software development has yielded efficient, effective products, most often at lower cost and in shorter time. Taking guidance from the agile software development model, the GSA is working towards more agile data management and curation. To date, the GSA's work has been focused primarily on data rescue. By using workflows that maximize clear communication while encouraging simplicity (e.g., maximizing the amount of work not done or that can be automated), the GSA is bringing decades of dark data into the light. Regular checks by the data rescuer with the data provider (or their proxy) provides quality control without adding an overt burden on either party. Moving forward, these workflows will also allow for more efficient and effective data management.

  8. Developments in the Tools and Methodologies of Synthetic Biology

    PubMed Central

    Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul

    2014-01-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788

  9. Development of SPME-HPLC methodology for detection of nitroexplosives

    NASA Astrophysics Data System (ADS)

    Peña-Luengas, Sandra L.; Jerez-Rozo, Jackeline I.; Correa, Sandra N.; Peña, Nelson E.; Hernández-Rivera, Samuel P.

    2007-04-01

    Solid-phase microextraction (SPME) has been coupled with liquid chromatography to widen its range of application to nonvolatile and thermally unstable compounds, which are generally beyond the reach of SPME-GC. A method for the analysis of nitroaromatic explosives and their degradation products was developed by coupling SPME and high-performance liquid chromatography with ultraviolet detection (HPLC/UV), introducing a modified interface that ensures accuracy, precision, repeatability, high efficiency, unique selectivity, and high sensitivity for the detection and quantification of explosives from surface soil samples, with increased chromatographic efficiency. A pretreatment step was introduced for the soil samples, which extracted the target compounds into an aqueous phase. Several parameters that affect the microextraction were evaluated, such as fiber coating, adsorption and desorption time, and stirring rate. The effect of salting out (NaCl) on analyte extraction and the role of various solvents on the SPME fiber were also evaluated. Carbowax-templated resin (CW/TPR) and polydimethylsiloxane-divinylbenzene (PDMS-DVB) fibers were used to extract the analytes from the aqueous samples. Explosives were detected at low μg/mL concentrations. This study demonstrates that SPME-HPLC is a very promising method for the analysis of explosives in aqueous samples and has been successfully applied to the determination of nitroaromatic compounds such as TNT.

  10. Methodological choices for the clinical development of medical devices

    PubMed Central

    Bernard, Alain; Vaneau, Michel; Fournel, Isabelle; Galmiche, Hubert; Nony, Patrice; Dubernard, Jean Michel

    2014-01-01

    … clinical development of MDs. PMID:25285025

  11. Study on the methodology of developing evidence-based clinical practice guidelines of Chinese medicine.

    PubMed

    Chen, Zheng-guang; Luo, Hui; Xu, Shan; Yang, Yan; Wang, Shou-chuan

    2015-11-01

    At present, evidence-based clinical practice guidelines (EBCPGs) are the main mode of developing clinical practice guidelines (CPGs) worldwide, but in China most CPGs of Chinese medicine (CM) are still based on expert consensus. The objective of this study is to construct an initial methodology for developing EBCPGs of CM and to promote the standardization of CM. Based on the development of the "Guideline for Diagnosis and Treatment of Common Pediatric Diseases in CM", the methodology of developing EBCPGs of CM was explored by analyzing the pertinent literature and considering the characteristics of CM. In this study, the key contribution is a set of proposed suggestions and strategies. However, because the methodology for developing EBCPGs of CM is still at an initial stage, some problems need further study.

  12. A framework for assessing the adequacy and effectiveness of software development methodologies

    NASA Technical Reports Server (NTRS)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research on the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reducing the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has produced consistent results that demonstrate the utility and versatility of the procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measurement of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that accommodates change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.
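
    To make the linkage structure concrete, the sketch below rolls invented attribute scores up through principles to objectives. It is a hypothetical illustration only; the paper's actual objectives, principles, attributes, properties, and weighting scheme are not reproduced here.

    ```python
    # Hypothetical roll-up of an objectives-principles-attributes linkage.
    # Every name and score below is invented for illustration.

    ATTRIBUTES = {"traceability": 0.8, "modularity": 0.6, "readability": 0.7}

    PRINCIPLES = {  # principle -> attributes whose presence evidences it
        "abstraction": ["modularity", "readability"],
        "stepwise_refinement": ["traceability", "modularity"],
    }

    OBJECTIVES = {  # objective -> principles that support it
        "maintainability": ["abstraction", "stepwise_refinement"],
    }

    def principle_score(p):
        attrs = PRINCIPLES[p]
        return sum(ATTRIBUTES[a] for a in attrs) / len(attrs)

    def objective_score(o):
        prins = OBJECTIVES[o]
        return sum(principle_score(p) for p in prins) / len(prins)

    for o in OBJECTIVES:
        print(f"{o}: {objective_score(o):.2f}")  # averaged score per objective
    ```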

  13. Agile robotic edge finishing system research

    SciTech Connect

    Powell, M.A.

    1995-07-01

    This paper describes a new project undertaken by Sandia National Laboratories to develop an agile, automated, high-precision edge finishing system. The project has a two-year duration and was initiated in October 1994. It involves redesigning and adding capabilities to an existing finishing workcell at Sandia, and developing intelligent methods for automating process definition and for controlling finishing processes. The resulting system will serve as a prototype for systems that will be deployed into highly flexible automated production lines. The production systems will be used to produce a wide variety of products in limited production quantities with quick turnaround requirements. The prototype system is designed to allow programming, process definition, fixture re-configuration, and process verification to be performed off-line for new products. CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) models of the part will be used to assist with the automated process development and process control tasks. To achieve Sandia's performance goals, the system will employ advanced path planning, burr prediction expert systems, automated process definition, statistical process models in a process database, and a two-level control scheme using hybrid position-force control and fuzzy logic control. In this paper, we discuss the progress and the planned system development under this project.

  14. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates and provide an approach to aggregating multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented. A discussion of possible future steps in this research area is given.
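
    As a purely illustrative reading of the aggregation step, the sketch below combines three hypothetical experts' triangular distributions into a single distribution using a calibration-weighted linear opinion pool. The experts, weights, and parameter are invented; the report's ten-phase procedure is far richer than this.

    ```python
    import numpy as np

    # Calibration-weighted linear opinion pool (mixture of expert densities).
    rng = np.random.default_rng(0)

    experts = [   # (min, mode, max) elicited for some input parameter
        (900.0, 1000.0, 1200.0),
        (850.0, 1050.0, 1300.0),
        (950.0, 1000.0, 1100.0),
    ]
    weights = np.array([0.5, 0.2, 0.3])  # hypothetical calibration weights

    n = 100_000
    which = rng.choice(len(experts), size=n, p=weights)  # expert per draw
    samples = np.array([rng.triangular(*experts[k]) for k in which])

    print(f"aggregated mean = {samples.mean():.1f}")
    print(f"90% interval = [{np.percentile(samples, 5):.1f}, "
          f"{np.percentile(samples, 95):.1f}]")
    ```

    The resulting samples can feed directly into a multidisciplinary Monte Carlo risk analysis as the input distribution for that parameter.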

  15. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    ERIC Educational Resources Information Center

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  16. Balancing Plan-Driven and Agile Methods in Software Engineering Project Courses

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Port, Dan; Winsor Brown, A.

    2002-09-01

    For the past 6 years, we have been teaching a two-semester software engineering project course. The students organize into 5-person teams and develop largely web-based electronic services projects for real USC campus clients. We have been using and evolving a method called Model-Based (System) Architecting and Software Engineering (MBASE) for use in both the course and in industrial applications. The MBASE Guidelines include a lot of documents. We teach risk-driven documentation: if it is risky to document something, and not risky to leave it out (e.g., GUI screen placements), leave it out. Even so, students tend to associate more documentation with higher grades, although our grading eventually discourages this. We are always on the lookout for ways to have students learn best practices without having to produce excessive documentation. Thus, we were very interested in analyzing the various emerging agile methods. We found that agile methods and milestone plan-driven methods are part of a “how much planning is enough?” spectrum. Both agile and plan-driven methods have home grounds of project characteristics where they clearly work best, and where the other will have difficulties. Hybrid agile/plan-driven approaches are feasible, and necessary for projects having a mix of agile and plan-driven home ground characteristics. Information technology trends are going more toward the agile methods' home ground characteristics of emergent requirements and rapid change, although there is a concurrent increase in concern with dependability. As a result, we are currently experimenting with risk-driven combinations of MBASE and agile methods, such as integrating requirements, test plans, peer reviews, and pair programming into “agile quality management.”

  17. SAR imagery using chaotic carrier frequency agility pulses

    NASA Astrophysics Data System (ADS)

    Xu, Xiaojian; Feng, Xiangzhi

    2011-06-01

    Synthetic aperture radar (SAR) systems are finding more and more applications in both civilian and military remote sensing missions. With the increasing deployment of electronic countermeasures (ECM) on modern battlefields, SAR encounters more and more interfering jamming signals. ECM jamming signals cause the SAR system to receive and process erroneous information, which results in severe degradation of the output SAR images and/or the formation of phony images of nonexistent targets. As a consequence, development of electronic counter-countermeasures (ECCM) capability becomes one of the key problems in SAR system design. This paper develops radar signaling strategies and algorithms that enhance the ability of synthetic aperture radar to image targets under conditions of electronic jamming. The concept of SAR using chaotic carrier frequency agility pulses (CCFAP-SAR) is first proposed. The imaging procedure for CCFAP-SAR is then discussed in detail. The ECCM performance of CCFAP-SAR for both suppressive noise jamming and deceptive repeat jamming is analyzed. The impact of the carrier frequency agility range on the image quality of CCFAP-SAR is also studied. Simulation results demonstrate that, with an adequate agility range of the carrier frequency, the proposed CCFAP-SAR performs as well as conventional radar with a linear frequency modulation (LFM) waveform in image quality and slightly better against noise suppression jamming, while it performs very well against deceptive jamming, which cannot be rejected by LFM-SAR.
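
    A minimal sketch of the frequency-agility idea, assuming a logistic-map chaotic generator and an invented X-band center frequency and agility range (the paper's actual generator and parameters are not reproduced here):

    ```python
    import numpy as np

    # Chaotic carrier-frequency-agile pulse sequence from a logistic map.
    f0, agility_range = 10.0e9, 1.0e9   # assumed center frequency and band, Hz
    n_pulses = 64

    x = 0.37                            # seed in (0, 1), avoiding fixed points
    freqs = np.empty(n_pulses)
    for k in range(n_pulses):
        x = 4.0 * x * (1.0 - x)         # logistic map at r = 4 (chaotic)
        freqs[k] = f0 + (x - 0.5) * agility_range

    # Pulse k is transmitted at carrier freqs[k]; without the map and seed,
    # a repeater jammer cannot predict the next carrier frequency.
    print(np.round(freqs[:5] / 1e9, 4), "GHz")
    ```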

  18. Integrating a distributed, agile, virtual enterprise in the TEAM program

    NASA Astrophysics Data System (ADS)

    Cobb, C. K.; Gray, W. Harvey; Hewgley, Robert E.; Klages, Edward J.; Neal, Richard E.

    1997-01-01

    The Technologies Enabling Agile Manufacturing (TEAM) program enhances industrial capability by advancing and deploying manufacturing technologies that promote agility. TEAM has developed a product realization process that features the integration of product design and manufacturing groups. TEAM uses the tools it collects, develops, and integrates in support of the product realization process to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, forming, and electromechanical assembly. To provide a proof of principle, the material removal process has been addressed first and has been successfully demonstrated in an 'interconnected' mode. An internet-accessible intersite file manager (IFM) application has been deployed to allow geographically distributed TEAM participants to share and distribute information as the product realization process is executed. An automated inspection planning application has been demonstrated, importing a solid model from the IFM, generating an inspection plan and a part program to be used in the inspection process, and then distributing the part program to the inspection site via the IFM. TEAM seeks to demonstrate the material removal process in an integrated mode in June 1997, complete with an object-oriented framework and infrastructure. The current status and future plans for this project are presented here.

  19. AGILE/GRID Science Alert Monitoring System: The Workflow and the Crab Flare Case

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Conforti, V.; Parmiggiani, N.

    2013-10-01

    During the first five years of the AGILE mission we have observed many gamma-ray transients of Galactic and extragalactic origin. A fast reaction to unexpected transient events is a crucial part of the AGILE monitoring program, because the follow-up of astrophysical transients is a key point for this space mission. We present the workflow and the software developed by the AGILE Team to perform the automatic analysis for the detection of gamma-ray transients. In addition, an App for iPhone will be released enabling the Team to access the monitoring system through mobile phones. In September 2010 the science alert monitoring system presented in this paper recorded a transient phenomenon from the Crab Nebula, generating an automated alert sent via email and SMS two hours after the end of an AGILE satellite orbit, i.e., two hours after the Crab flare itself; for this discovery AGILE won the 2012 Bruno Rossi Prize. The alert system is designed for maximum speed, and in this case, as in many others, AGILE has demonstrated that the reaction speed of the monitoring system is crucial for the scientific return of the mission.

  20. Towards a Better Understanding of CMMI and Agile Integration - Multiple Case Study of Four Companies

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna

    The amount of software is increasing across different domains in Europe. This provides industries in smaller countries with good opportunities to work in international markets. Success in the global markets, however, demands the rapid production of high quality, error free software. Both CMMI and agile methods seem to provide a ready solution for quality and lead time improvements. There is not, however, much empirical evidence available about either 1) how the integration of these two aspects can be done in practice or 2) what it actually demands from assessors and software process improvement groups. The goal of this paper is to increase the understanding of CMMI and agile integration, in particular focusing on the research question: how to use a ‘lightweight’ style of CMMI assessment in agile contexts. This is done via four case studies in which assessments were conducted using the goals of the CMMI integrated project management and collaboration and coordination with relevant stakeholders process areas, and practices from XP and Scrum. The study shows that the use of agile practices may support the fulfilment of the goals of CMMI process areas, but there are still many challenges for agile teams to solve within their continuous improvement programs. It also identifies practical advice for assessors and improvement groups to take into consideration when conducting assessments in the context of agile software development.

  1. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.

  2. A comparison of linear speed, closed-skill agility, and open-skill agility qualities between backcourt and frontcourt adult semiprofessional male basketball players.

    PubMed

    Scanlan, Aaron T; Tucker, Patrick S; Dalbo, Vincent J

    2014-05-01

    The measurement of fitness qualities relevant to playing position is necessary to inform basketball coaching and conditioning staff of role-related differences in playing groups. To date, sprinting and agility performance have not been compared between playing positions in adult male basketball players. Therefore, the purpose of this study was to describe and compare linear speed, closed-skill agility, and open-skill agility qualities between backcourt (point guard and shooting guard positions) and frontcourt (small forward, power forward, and center positions) semiprofessional basketball players. Six backcourt (mean ± SD: age, 24.3 ± 7.9 years; stature, 183.4 ± 4.0 cm; body mass, 85.5 ± 12.3 kg; VO2max, 51.9 ± 4.8 ml·kg(-1)·min(-1)) and 6 frontcourt (mean ± SD: age, 27.5 ± 5.5 years; stature, 194.4 ± 7.1 cm; body mass, 109.4 ± 8.8 kg; VO2max, 47.1 ± 5.0 ml·kg(-1)·min(-1)) adult male basketball players completed 20-m sprint, closed-skill agility, and open-skill agility performance tests. Magnitude-based inferences revealed that backcourt players (5 m, 1.048 ± 0.027 seconds; 10 m, 1.778 ± 0.048 seconds; 20 m, 3.075 ± 0.121 seconds) possessed likely quicker linear sprint times than frontcourt players (5 m, 1.095 ± 0.085 seconds; 10 m, 1.872 ± 0.127 seconds; 20 m, 3.242 ± 0.221 seconds). Conversely, frontcourt players (1.665 ± 0.096 seconds) displayed possibly superior closed-skill agility performance compared with backcourt players (1.613 ± 0.111 seconds). In addition, unclear positional differences were apparent for open-skill agility qualities. These findings indicate that linear speed and change-of-direction speed might be differently developed across playing positions. Furthermore, position-related functions might similarly depend on aspects of open-skill agility performance across backcourt and frontcourt players. Basketball coaching and conditioning staff should consider the development of position-targeted training drills to improve speed, agility
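
    As a worked illustration of the group comparison, the snippet below computes a pooled-SD effect size from the 20-m sprint summary statistics quoted above. Magnitude-based inference proper also uses smallest-worthwhile-change thresholds and confidence intervals, which are omitted here.

    ```python
    import math

    # Cohen's d for the 20-m sprint times (mean +/- SD, n = 6 per group).
    n1 = n2 = 6
    m1, s1 = 3.075, 0.121   # backcourt 20-m time, s
    m2, s2 = 3.242, 0.221   # frontcourt 20-m time, s

    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m2 - m1) / sp      # positive: frontcourt slower over 20 m
    print(f"pooled SD = {sp:.3f} s, Cohen's d = {d:.2f}")  # d ~ 0.94, large
    ```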

  3. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    ERIC Educational Resources Information Center

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…

  4. Knowledge Consolidation Analysis: Toward a Methodology for Studying the Role of Argument in Technology Development

    ERIC Educational Resources Information Center

    Dyehouse, Jeremiah

    2007-01-01

    Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…

  5. Educational Planning Methodology for the Integrated Development of Rural Areas. Reports Studies... S.83.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Paris (France).

    A summary of educational planning methodologies tested in Argentina, Guatemala, Brazil, Ecuador, and Bolivia, the document offers opinions and proposals about integrated rural development. Integrated rural development is seen as a social, economic, political, and cultural process in rural areas, designed to improve living conditions. Chapters…

  6. Gamma-ray Astrophysics with AGILE

    SciTech Connect

    Longo, Francesco; Tavani, M.; Barbiellini, G.; Argan, A.; Basset, M.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.; Chen, A.; Costa, E.; Del Monte, E.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Feroci, M.; Fiorini, M.; Foggetta, L.; Froysland, T.; Frutti, M.

    2007-07-12

    AGILE will explore the gamma-ray Universe with a very innovative instrument combining for the first time a gamma-ray imager and a hard X-ray imager. AGILE will be operational in spring 2007 and will provide crucial data for the study of Active Galactic Nuclei, Gamma-Ray Bursts, unidentified gamma-ray sources, Galactic compact objects, supernova remnants, TeV sources, and fundamental physics by microsecond timing. The AGILE instrument is designed to simultaneously detect and image photons in the 30 MeV - 50 GeV and 15 - 45 keV energy bands with excellent imaging and timing capabilities, and a large field of view covering ≈ 1/5 of the entire sky at energies above 30 MeV. A CsI calorimeter is capable of GRB triggering in the energy band 0.3-50 MeV. AGILE is now (March 2007) undergoing launcher integration and testing. The PSLV launch is planned for spring 2007, and AGILE is foreseen to be fully operational during the summer of 2007.

  7. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    SciTech Connect

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-10-15

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report.

  8. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort investigates probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS) and analyzes designs that determine the stochastic properties of MEMS. For CMCs this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
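
    For reference, the Weibull size effect mentioned above follows from the weakest-link form of the two-parameter Weibull strength model; in generic notation (not taken from this report):

    ```latex
    P_f(\sigma) = 1 - \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right],
    \qquad
    \frac{\bar{\sigma}_1}{\bar{\sigma}_2} = \left(\frac{V_2}{V_1}\right)^{1/m}
    ```

    Here m is the Weibull modulus and σ0 the characteristic strength of the reference volume V0; the second relation gives the expected strength ratio of two specimen sizes, which is what a bulge-test series on thin films can confirm or refute.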

  9. Accelerating 32nm BEOL technology development by advanced wafer inspection methodology

    NASA Astrophysics Data System (ADS)

    Jeng, P. R.; Lin, C. L.; Jang, Simon; Liang, M. S.; Chen, Wallas; Tsui, David; Chen, Damian; Chen, Henry; Young, Chris; Chang, Ellis

    2008-11-01

    In the early development stage of 32nm processes, identifying and isolating systematic defects is critical to understanding the issues related to design and process interactions. Conventional inspection methodologies using random review sampling on large defect populations do not provide the information required to take accurate and quick corrective action. This paper demonstrates the successful identification and isolation of systematic defects using a novel methodology that combines Design Based Binning (DBB) and inline Defect Organizer (iDO). This new method of integrating design and defect data produced actionable inspection data, resulting in fewer mask revisions and reduced device development time.

  10. Development of an Improved Methodology to Assess Potential Unconventional Gas Resources

    SciTech Connect

    Salazar, Jesus; McVay, Duane A.; Lee, W. John

    2010-12-15

    Considering the important role played today by unconventional gas resources in North America and their enormous potential for the future around the world, it is vital to both policy makers and industry that the volumes of these resources and the impact of technology on them be assessed. To provide for optimal decision making regarding energy policy, research funding, and resource development, it is necessary to reliably quantify the uncertainty in these resource assessments. Since the 1970s, studies to assess potential unconventional gas resources have been conducted by various private and governmental agencies, the most rigorous of which was by the United States Geological Survey (USGS). The USGS employed a cell-based, probabilistic methodology which used analytical equations to calculate distributions of the resources assessed. USGS assessments have generally produced distributions for potential unconventional gas resources that, in our judgment, are unrealistically narrow for what are essentially undiscovered, untested resources. In this article, we present an improved methodology to assess potential unconventional gas resources. Our methodology is a stochastic approach that includes Monte Carlo simulation and correlation between input variables. Application of the improved methodology to the Uinta-Piceance province of Utah and Colorado with USGS data validates the means and standard deviations of resource distributions produced by the USGS methodology, but reveals that these distributions lack the right skew expected of a natural resource. Our investigation indicates that the unrealistic shape and width of the gas resource distributions are caused by the use of narrow triangular input parameter distributions. The stochastic methodology proposed here is more versatile and robust than the USGS analytic methodology. Adoption of the methodology, along with a careful examination and revision of input distributions, should allow a more realistic
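
    The following sketch illustrates the kind of stochastic approach described, assuming two invented lognormal inputs (productive area and per-cell recovery) correlated through a Gaussian copula; all parameter values are illustrative, not USGS data.

    ```python
    import numpy as np

    # Monte Carlo resource assessment with correlated, right-skewed inputs.
    rng = np.random.default_rng(1)
    n = 200_000

    corr = np.array([[1.0, 0.5],      # assumed correlation between inputs
                     [0.5, 1.0]])
    z = rng.standard_normal((n, 2)) @ np.linalg.cholesky(corr).T

    area = np.exp(np.log(50.0) + 0.4 * z[:, 0])      # productive area
    recovery = np.exp(np.log(1.2) + 0.6 * z[:, 1])   # gas recovered per cell
    resource = area * recovery                       # potential resource

    # Exceedance convention: P95 is the low value exceeded 95% of the time.
    lo, med, hi = np.percentile(resource, [5, 50, 95])
    print(f"P95 = {lo:.0f}, P50 = {med:.0f}, P5 = {hi:.0f}")
    print("right-skewed:", resource.mean() > med)
    ```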

  11. The National Aviation Operational Monitoring Service (NAOMS): A Documentation of the Development of a Survey Methodology

    NASA Technical Reports Server (NTRS)

    Connors, Mary M.; Mauro, Robert; Statler, Irving C.

    2012-01-01

    The National Aviation Operational Monitoring Service (NAOMS) was a research project under NASA's Aviation Safety Program during the years from 2000 to 2005. The purpose of this project was to develop a methodology for gaining reliable information on changes over time in the rates-of-occurrence of safety-related events, as a means of assessing the safety of the national airspace. The approach was a scientifically designed survey of the operators of the aviation system concerning their safety-related experiences. This report presents the methodology developed and a demonstration of the NAOMS concept through a survey of nearly 20,000 randomly selected air-carrier pilots. Results give evidence that the NAOMS methodology can provide a statistically sound basis for evaluating trends in incidents that could compromise safety. The approach and results are summarized in the report; supporting documentation and complete analyses of results are presented in 14 appendices.
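
    A minimal sketch of the rate-of-occurrence arithmetic underlying such a survey, with invented counts and exposure (the NAOMS statistical design, weighting, and recall-period handling are considerably more involved):

    ```python
    import math

    # Event rate per flight hour with a Poisson normal-approximation interval.
    events = 130                # events recalled by surveyed pilots (invented)
    exposure_hours = 52_000.0   # flight hours covered by recall windows

    rate = events / exposure_hours
    se = math.sqrt(events) / exposure_hours
    lo, hi = rate - 1.96 * se, rate + 1.96 * se
    print(f"rate = {rate * 1e3:.2f} per 1000 h "
          f"(95% CI {lo * 1e3:.2f} to {hi * 1e3:.2f})")
    ```

    Tracking this rate survey-wave by survey-wave is what supports the trend evaluation described above.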

  12. Development and application of a safety assessment methodology for waste disposals

    SciTech Connect

    Little, R.H.; Torres, C.; Schaller, K.H.

    1996-12-31

    As part of a European Commission funded research programme, QuantiSci (formerly the Environmental Division of Intera Information Technologies) and Instituto de Medio Ambiente of the Centro de Investigaciones Energeticas Medioambientales y Tecnologicas (IMA/CIEMAT) have developed and applied a comprehensive, yet practicable, assessment methodology for post-disposal safety assessment of land-based disposal facilities. This Safety Assessment Comparison (SACO) Methodology employs a systematic approach to the collection, evaluation and use of waste and disposal system data. It can be used to assess engineered barrier performance, the attenuating properties of host geological formations, and the long term impacts of a facility on the environment and human health, as well as allowing the comparison of different disposal options for radioactive, mixed and non-radioactive wastes. This paper describes the development of the methodology and illustrates its use.

  13. SuperAGILE and Gamma Ray Bursts

    SciTech Connect

    Pacciani, Luigi; Costa, Enrico; Del Monte, Ettore; Donnarumma, Immacolata; Evangelista, Yuri; Feroci, Marco; Frutti, Massimo; Lazzarotto, Francesco; Lapshov, Igor; Rubini, Alda; Soffitta, Paolo; Tavani, Marco; Barbiellini, Guido; Mastropietro, Marcello; Morelli, Ennio; Rapisarda, Massimo

    2006-05-19

    The solid-state hard X-ray imager of the AGILE gamma-ray mission -- SuperAGILE -- has a six arcmin on-axis angular resolution in the 15-45 keV range and a field of view in excess of 1 steradian. The instrument is very light: only 5 kg. It is equipped with an on-board self-triggering logic and image deconvolution, and it is able to transmit the coordinates of a GRB to the ground in real time through the ORBCOMM constellation of satellites. Photon-by-photon scientific data are sent to the Malindi ground station at every contact. In this paper we review the performance of the SuperAGILE experiment (scheduled for launch in the middle of 2006) after its first on-ground calibrations, and show the prospects for Gamma Ray Bursts.

  14. A strategy for developing a launch vehicle system for orbit insertion: Methodological aspects

    NASA Astrophysics Data System (ADS)

    Klyushnikov, V. Yu.; Kuznetsov, I. I.; Osadchenko, A. S.

    2014-12-01

    The article addresses methodological aspects of a development strategy to design a launch vehicle system for orbit insertion. The development and implementation of the strategy are broadly outlined. An analysis is provided of the criterial base and input data needed to define the main requirements for the launch vehicle system. Approaches are suggested for solving individual problems in working out the launch vehicle system development strategy.

  15. The Backyard Human Performance Technologist: Applying the Development Research Methodology to Develop and Validate a New Instructional Design Framework

    ERIC Educational Resources Information Center

    Brock, Timothy R.

    2009-01-01

    Development research methodology (DRM) has been recommended as a viable research approach to expand the practice-to-theory/theory-to-practice literature that human performance technology (HPT) practitioners can integrate into the day-to-day work flow they already use to develop instructional products. However, little has been written about how it…

  16. Explaining the Obvious - How Do You Teach Agile?

    NASA Astrophysics Data System (ADS)

    Lundh, Erik

    Agile is now a hot topic and many organizations decide on adopting “agile” without really knowing how and why. This workshop will explore how fresh and seasoned agile coaches teach traditional and novel agile concepts, by example, with discussions. All participants are invited to show and tell about agile with an audience of peers. It might be your first time in front of an audience, or golden hits that have served you well for years.

  17. Cognitive Sensitivity in Sibling Interactions: Development of the Construct and Comparison of Two Coding Methodologies

    ERIC Educational Resources Information Center

    Prime, Heather; Perlman, Michal; Tackett, Jennifer L.; Jenkins, Jennifer M.

    2014-01-01

    Research Findings: The goal of this study was to develop a construct of sibling cognitive sensitivity, which describes the extent to which children take their siblings' knowledge and cognitive abilities into account when working toward a joint goal. In addition, the study compared 2 coding methodologies for measuring the construct: a thin…

  18. Evaluating EPA’s AP-42 development methodology using a cotton gin total PM dataset

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In August 2013, the U.S. Environmental Protection Agency (EPA) published its new methodology for updating the Compilation of Air Pollution Emission Factors (AP-42). The “Recommended Procedures for Development of Emissions Factors and Use of the WebFIRE Database” has yet to be widely used. These ...

  19. EPA’s AP-42 development methodology: Converting or rerating current AP-42 datasets

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In August 2013, the U.S. Environmental Protection Agency (EPA) published its new methodology for updating the Compilation of Air Pollution Emission Factors (AP-42). The “Recommended Procedures for Development of Emissions Factors and Use of the WebFIRE Database” instructs that the ratings of the...

  20. A Methodological Framework for Instructional Design Model Development: Critical Dimensions and Synthesized Procedures

    ERIC Educational Resources Information Center

    Lee, Jihyun; Jang, Seonyoung

    2014-01-01

    Instructional design (ID) models have been developed to promote understandings of ID reality and guide ID performance. As the number and diversity of ID practices grows, implicit doubts regarding the reliability, validity, and usefulness of ID models suggest the need for methodological guidance that would help to generate ID models that are…

  1. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    SciTech Connect

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.
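
    Crash-frequency models of this kind are commonly summarized by a four-factor expression; the following is a generic sketch (indexing and symbols are illustrative, not quoted from the report):

    ```latex
    F = \sum_{i,j,k} N_{ijk}\, P_{ijk}\, f_{ijk}(x, y)\, A_{jk}
    ```

    Here N is the number of flight operations, P the crash rate per operation, f(x, y) the conditional probability density that a crashing aircraft impacts the facility location, and A the facility's effective target area, with the sum running over aircraft categories, flight phases, and operation types. The generic data documented by this report feed the N, P, and f factors.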

  2. SURVEY OF METHODOLOGIES FOR DEVELOPING MEDIA SCREENING VALUES FOR ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    Barron, Mace G. and Steve Wharton. Submitted. Survey of Methodologies for Developing Media Screening Values for Ecological Risk Assessment. Environ. Toxicol. Chem. 44 p. (ERL,GB 1200).

    Concurrent with the increase in the number of ecological risk assessments over the past...

  3. Cost Analysis of Air Force On-the-Job Training: Development and Demonstration of a Methodology.

    ERIC Educational Resources Information Center

    Eisele, Charles R.; And Others

    A research project was developed to construct and demonstrate a methodology for estimating the costs of conducting on-the-job training (OJT) in the Air Force. The project focused on the formal upgrade training to the three, five, and seven skill levels. Project efforts involved five major tasks: literature review, cost factor identification, cost…

  4. Developing Pre-Technical Secondary Education Programs: Rationale, Content, and Methodology.

    ERIC Educational Resources Information Center

    Georgia State Univ., Atlanta. Dept. of Vocational and Career Development.

    This guide outlines the rationale, content, and methodology of a three-part high technology program that was developed in Georgia to provide secondary school students with training in the areas of electronics and electromechanical and mechanical technologies. Discussed first are the Georgia Initiative, the impact of high technology and the role of…

  5. Methodology development for the sustainability process assessment of sheet metal forming of complex-shaped products

    NASA Astrophysics Data System (ADS)

    Pankratov, D. L.; Kashapova, L. R.

    2015-06-01

    A methodology was developed for automated assessment of the reliability of the sheet metal forming process, with the aim of reducing defects in the manufacture of complex components. The article identifies the range of allowable values of the stamping parameters needed to obtain defect-free stamping of truck spars.

  6. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures, which consequently necessitates accurate knowledge of the thermal properties, boundary conditions, and thermal interface conditions associated with the structural materials. The goal of this multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures; this can be thought of as a building-block approach. The strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of an existing general purpose finite element software.
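
    The estimation idea reduces to minimizing the misfit between measured and modeled temperatures over candidate property values. The sketch below does this for a deliberately simple lumped-capacitance cooling model with invented numbers; the actual work concerns spatially distributed composite and honeycomb structures.

    ```python
    import numpy as np

    # Recover a thermal time constant by least-squares over a parameter grid.
    rng = np.random.default_rng(2)

    t = np.linspace(0.0, 600.0, 61)   # measurement times, s
    tau_true = 180.0                  # "unknown" time constant, s
    T0, Tinf = 80.0, 20.0             # initial and ambient temperatures, C
    T_meas = (Tinf + (T0 - Tinf) * np.exp(-t / tau_true)
              + rng.normal(0.0, 0.2, t.size))        # noisy measurements

    taus = np.linspace(50.0, 400.0, 3501)            # candidate grid
    sse = [np.sum((T_meas - (Tinf + (T0 - Tinf) * np.exp(-t / tau)))**2)
           for tau in taus]
    print(f"estimated tau = {taus[int(np.argmin(sse))]:.1f} s "
          f"(true {tau_true:.1f} s)")
    ```

    Optimal experimental design then asks, for example, how long to record and where to place sensors so that this minimum is as sharp as possible.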

  7. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  8. Methodology Development and Physical Organic Chemistry: A Powerful Combination for the Advancement of Glycochemistry

    PubMed Central

    Crich, David

    2011-01-01

    This perspective article outlines work in the Crich group on the diastereoselective synthesis of the so-called difficult classes of glycosidic bond; the 2-deoxy-β-glycopyranosides, the β-mannopyranosides, the α-sialosides, the α-glucopyranosides and the β-arabinofuranosides with an emphasis on the critical interplay between mechanism and methodology development. PMID:21919522

  9. A Methodological Approach to Developing Bibliometric Models of Types of Humanities Scholarship.

    ERIC Educational Resources Information Center

    Wiberley, Stephen E., Jr.

    2003-01-01

    Outlines a methodological approach to developing bibliometric models of the sources used in different types of humanities scholarship. Identifies five types of scholarship: description of primary sources, editing of primary sources, historical studies, criticism, and theory. Illustrates the approach through an analysis of sources used in 54…

  10. Using Constructivist Case Study Methodology to Understand Community Development Processes: Proposed Methodological Questions to Guide the Research Process

    ERIC Educational Resources Information Center

    Lauckner, Heidi; Paterson, Margo; Krupa, Terry

    2012-01-01

    Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…

  11. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    SciTech Connect

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit; Unwin, Stephen

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  12. Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.

    1999-01-01

    A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, ΔJ_eff, as the governing parameter. The methodology contains original and literature J and ΔJ solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.
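
    To illustrate how a ΔJ_eff solution translates into a life prediction, here is a hedged sketch integrating a Paris-type law, da/dN = C(ΔJ_eff)^m, for a Y = 1 crack under small-scale yielding. The constants, geometry factor, and load level are assumptions chosen so the example runs quickly, not values from the methodology.

    ```python
    import math

    # Cycle-by-cycle integration of da/dN = C * (dJ_eff)**m.
    E = 200e9              # Pa, elastic modulus
    C, m = 1.0e-7, 1.3     # assumed constants (a in m, dJ_eff in kJ/m^2)
    d_sigma_eff = 150e6    # Pa, closure-corrected effective stress range

    a, a_final, N = 1.0e-3, 10.0e-3, 0      # crack grows from 1 mm to 10 mm
    while a < a_final:
        dK_eff = d_sigma_eff * math.sqrt(math.pi * a)   # Pa*sqrt(m), Y = 1
        dJ_eff = dK_eff**2 / E / 1e3    # small-scale yielding, kJ/m^2
        a += C * dJ_eff**m              # growth this cycle, m
        N += 1
    print(f"predicted life, 1 mm -> 10 mm: {N} cycles")
    ```

    Under elastic-plastic or fully plastic conditions the ΔJ_eff term would instead come from the methodology's J solutions rather than this elastic estimate.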

  13. Sequential detection and concentration estimation of chemical vapors using range-resolved lidar with frequency-agile lasers

    NASA Astrophysics Data System (ADS)

    Warren, Russell E.; Vanderbeek, Richard G.; D'Amico, Francis M.

    2000-07-01

    This paper extends our earlier work in developing statistically optimal algorithms for estimating the range-dependent concentration of multiple vapor materials using multiwavelength frequency-agile lidar with a fixed set of wavelength bursts to the case of a time series processor that recursively updates the estimates as new data become available. The concentration estimates are used to detect the presence of one or more vapor materials by a sequential approach that accumulates likelihood in time for each range cell. A Bayesian methodology is used to construct the concentration estimates with a prior concentration smoothness constraint chosen to produce numerically stable results at longer ranges having weak signal return. The approach is illustrated on synthetic and actual field test data collected by SBCCOM.
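
    The sequential detection idea can be sketched as a per-range-cell accumulation of log-likelihood ratio with a threshold test. The Gaussian signal model and all numbers below are illustrative stand-ins for the paper's Bayesian concentration estimates.

    ```python
    import numpy as np

    # Accumulate evidence per range cell as bursts arrive; declare detection
    # when the accumulated log-likelihood ratio crosses a threshold.
    rng = np.random.default_rng(3)

    n_cells, sigma, shift = 50, 1.0, 0.8
    truth = np.zeros(n_cells, dtype=bool)
    truth[18:23] = True                 # vapor cloud occupies cells 18-22

    llr = np.zeros(n_cells)
    threshold = np.log(1000.0)          # strong-evidence bound
    for burst in range(40):             # recursive update, burst by burst
        y = rng.normal(0.0, sigma, n_cells) + shift * truth
        llr += (shift * y - 0.5 * shift**2) / sigma**2   # Gaussian LLR
    print("declared cells:", np.nonzero(llr > threshold)[0])
    ```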

  14. Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Hermann, Robert

    1997-01-01

    The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; in joint work between the PI and M. Oberguggenberger, it has been extended to apply to ordinary differential systems of the type encountered in control. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u), and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation system of the type encountered in physics problems (e.g., fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as 'discretizations' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.

  15. Blending critical realist and emancipatory practice development methodologies: making critical realism work in nursing research.

    PubMed

    Parlour, Randal; McCormack, Brendan

    2012-12-01

    This paper examines the efficacy of facilitation as a practice development intervention in changing practice within an Older Person setting and in implementing evidence into practice. It outlines the influences exerted by the critical realist paradigm in guiding emancipatory practice development activities and, in particular, how the former may be employed within an emancipatory practice development study to elucidate and increase understanding pertinent to causation and outcomes. The methodology is based upon an emancipatory practice development approach set within a realistic evaluation framework. This allows for systematic analysis of the social and contextual elements that influence the explication of outcomes associated with facilitation. The study is concentrated upon five practice development cycles, within which a sequence of iterative processes is integrated. The authors assert that combining critical realist and emancipatory processes offers a robust and practical method for translating evidence and implementing changes in practice, as the former affirms or falsifies the influence that emancipatory processes exert on attaining culture shift, and enabling transformation towards effective clinical practice. A new framework for practice development is proposed that establishes methodological coherency between emancipatory practice development and realistic evaluation. This augments the existing theoretical bases for both these approaches by contributing new theoretical and methodological understandings of causation.

  16. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    SciTech Connect

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  17. [The development of the scientific and methodological basis of Russian forensic biology].

    PubMed

    Gusrov, A A

    2010-01-01

    The development of the scientific and methodological basis of Russian forensic biology is traced from the time of its formation as a self-contained branch of forensic medical examination. Comprehensive analysis of scientific publications and technical documents revealed the principal directions along which forensic examination of evidential objects of biological origin developed. The wealth of research data obtained by forensic biologists over a long period (from the 1930s to the 2000s) is for the first time summarized and thoroughly analysed.

  18. Surreptitious, Evolving and Participative Ontology Development: An End-User Oriented Ontology Development Methodology

    ERIC Educational Resources Information Center

    Bachore, Zelalem

    2012-01-01

    Ontology not only is considered to be the backbone of the semantic web but also plays a significant role in distributed and heterogeneous information systems. However, ontology still faces limited application and adoption to date. One of the major problems is that prevailing engineering-oriented methodologies for building ontologies do not…

  19. 5th Annual AGILE Science Workshop

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley

    2008-01-01

    The EGRET model of the galactic diffuse gamma-ray emission (GALDIF) has been extended to provide full-sky coverage and improved to address the discrepancies with the EGRET data. This improved model is compared with the AGILE results from the Galactic center. The comparison is discussed.

  1. Development of 3D pseudo pin-by-pin calculation methodology in ANC

    SciTech Connect

    Zhang, B.; Mayhue, L.; Huria, H.; Ivanov, B.

    2012-07-01

    Advanced core and fuel assembly designs have been developed to improve operational flexibility, economic performance, and safety features of nuclear power plants. The simulation of these new designs, along with strongly heterogeneous fuel loading, has brought new challenges to the reactor physics methodologies currently employed in the industrial codes for core analyses. Control rod insertion during normal operation is one operational feature of the AP1000® plant, the Westinghouse next-generation Pressurized Water Reactor (PWR) design. This design improves operational flexibility and efficiency but significantly challenges the conventional reactor physics methods, especially in pin power calculations. The mixed loading of fuel assemblies with significantly different neutron spectra causes a strong interaction between fuel assembly types that is not fully captured by the current core design codes. To overcome the weaknesses of the conventional methods, Westinghouse has developed a state-of-the-art 3D Pin-by-Pin Calculation Methodology (P3C) and successfully implemented it in the Westinghouse core design code ANC. The new methodology has been qualified and licensed for pin power prediction. The 3D P3C methodology, along with its application and validation, is discussed in the paper. (authors)

  2. Evaluation of probable maximum snow accumulation: Development of a methodology for climate change studies

    NASA Astrophysics Data System (ADS)

    Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick

    2016-06-01

    Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so that the ensuing spring PMF is a reasonable estimate. This is of particular importance in times of climate change (CC), since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology, with precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio used to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (the so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool for estimating the relative change of the PMSA. Absolute results are of the same order of magnitude as those obtained with the traditional method and observed data, but are also found to depend strongly on the climate projection used and to show spatial variability.
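
    In its usual form, moisture maximization scales a storm by the ratio of maximum precipitable water to the storm's precipitable water; schematically, with generic symbols and the customary cap on the ratio:

    ```latex
    r = \min\!\left(\frac{W_{\max}}{W_{\text{storm}}},\; r_{\text{cap}}\right),
    \qquad
    P_{\text{maximized}} = r \cdot P_{\text{storm}}
    ```

    Here W_storm is the precipitable water associated with the snowstorm, W_max the monthly maximum precipitable water (modeled non-stationarily above), and r_cap an upper bound on the maximization ratio; the maximized snowfalls are then accumulated to yield the PMSA.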

  3. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    PubMed

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work.

  4. Wired Widgets: Agile Visualization for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Gerschefske, K.; Witmer, J.

    2012-09-01

    Continued advancement in sensors and analysis techniques has resulted in a wealth of Space Situational Awareness (SSA) data, made available via tools and Service Oriented Architectures (SOA) such as those in the Joint Space Operations Center Mission Systems (JMS) environment. Current visualization software cannot quickly adapt to rapidly changing missions and data, preventing operators and analysts from performing their jobs effectively. The value of this wealth of SSA data is not fully realized, as the operators' existing software is not built with the flexibility to consume new or changing sources of data or to rapidly customize visualization as the mission evolves. While tools like the JMS user-defined operational picture (UDOP) have begun to fill this gap, this paper presents a further evolution, leveraging Web 2.0 technologies for maximum agility. We demonstrate a flexible Web widget framework with inter-widget data sharing, publish-subscribe eventing, and an API providing the basis for consumption of new data sources and adaptable visualization. Wired Widgets offers cross-portal widgets along with a widget communication framework and a development toolkit for rapid new-widget development, giving operators the ability to answer relevant questions as the mission evolves. Wired Widgets has been applied in a number of dynamic mission domains including disaster response, combat operations, and noncombatant evacuation scenarios. This variety of applications demonstrates that Wired Widgets provides a flexible, data-driven solution for visualization in changing environments. In this paper, we show how, deployed in the Ozone Widget Framework portal environment, Wired Widgets can provide agile, web-based visualization to support the SSA mission. Furthermore, we discuss how the tenets of agile visualization can be applied generally to the SSA problem space to provide operators flexibility, potentially informing future acquisition and system development.
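
    The inter-widget data sharing and publish-subscribe eventing described above follow a conventional event-bus pattern. The minimal sketch below (written in Python for brevity, although the framework itself is web-based; all names are hypothetical) illustrates the idea.

      from collections import defaultdict
      from typing import Any, Callable

      class EventBus:
          """Minimal publish-subscribe hub for inter-widget data sharing:
          widgets subscribe to named channels and receive any payload
          published on them."""
          def __init__(self):
              self._subscribers = defaultdict(list)

          def subscribe(self, channel: str, handler: Callable[[Any], None]):
              self._subscribers[channel].append(handler)

          def publish(self, channel: str, payload: Any):
              for handler in self._subscribers[channel]:
                  handler(payload)

      bus = EventBus()
      # A track-list widget publishes the selected object; a map widget reacts.
      bus.subscribe("object.selected", lambda obj: print("centering on", obj))
      bus.publish("object.selected", {"norad_id": 25544, "name": "ISS"})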

  5. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    SciTech Connect

    Not Available

    1987-04-01

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve its ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selected. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  6. Methodology for Developing the REScheckTM Software through Version 4.2

    SciTech Connect

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Lucas, R. G.; Schultz, Robert W.; Taylor, Zachary T.; Wiberg, John D.

    2009-08-01

    This report explains the methodology used to develop Version 4.2 of the REScheck software for the 1992, 1993, and 1995 editions of the MEC; the 1998, 2000, 2003, and 2006 editions of the IECC; and the 2006 edition of the International Residential Code (IRC). Although some requirements contained in these codes have changed, the methodology used to develop the REScheck software for these editions is similar. REScheck assists builders in meeting the most complicated part of the code: the building envelope Uo-, U-, and R-value requirements in Section 502 of the code. This document details the calculations and assumptions underlying the treatment of the code requirements in REScheck, with a major emphasis on the building envelope requirements.
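
    At its core, the envelope trade-off calculation that REScheck automates reduces to an area-weighted average of component U-factors compared against a code-required maximum. The sketch below illustrates that calculation with hypothetical values; it is not the REScheck implementation.

      def overall_uo(components):
          """Area-weighted overall thermal transmittance of an envelope
          assembly: Uo = sum(U_i * A_i) / sum(A_i)."""
          return (sum(u * a for u, a in components)
                  / sum(a for _, a in components))

      # Hypothetical wall assembly: (U-factor in Btu/h.ft2.F, area in ft2)
      wall = [(0.060, 850.0),   # insulated frame wall
              (0.350, 120.0),   # windows
              (0.200, 40.0)]    # door
      uo = overall_uo(wall)
      required_uo = 0.082  # hypothetical requirement from the code tables
      print(f"Uo = {uo:.3f}:", "passes" if uo <= required_uo else "fails")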

  7. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural: the PDS4 information architecture is developed and maintained independently of the infrastructure's process, application, and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries, and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source of information for configuring most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance also allows effective management of the informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model, and how an information model-driven architecture exhibits characteristics of agile curation, including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.
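
    To make the model-driven idea concrete, consider a toy example in which the information model is captured as data and a validator is configured from it rather than hard-coded. The attribute names below are merely reminiscent of PDS4 labels; the example is purely illustrative and not the PDS4 implementation.

      # A fragment of an information model reduced to a dictionary of
      # attribute definitions; tools read the model instead of embedding it.
      MODEL = {
          "Product_Observational": {
              "logical_identifier": {"type": str, "required": True},
              "version_id": {"type": str, "required": True},
              "target_name": {"type": str, "required": False},
          }
      }

      def validate(product_class, record):
          """Validate a product label against the information model."""
          errors = []
          for attr, rule in MODEL[product_class].items():
              if attr not in record:
                  if rule["required"]:
                      errors.append(f"missing required attribute: {attr}")
              elif not isinstance(record[attr], rule["type"]):
                  errors.append(f"wrong type for attribute: {attr}")
          return errors

      print(validate("Product_Observational",
                     {"logical_identifier": "urn:nasa:pds:example"}))
      # -> ['missing required attribute: version_id']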

  8. AGILE and Gamma-Ray Bursts

    SciTech Connect

    Longo, Francesco; Tavani, M.; Barbiellini, G.; Argan, A.; Basset, M.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.; Chen, A.; Costa, E.; Del Monte, E.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Feroci, M.; Fiorini, M.; Foggetta, L.; Froysland, T.; Frutti, M.

    2006-05-19

    AGILE is a Scientific Mission dedicated to high-energy astrophysics supported by ASI with the scientific participation of INAF and INFN. The AGILE instrument is designed to simultaneously detect and image photons in the 30 MeV - 50 GeV and 15 - 45 keV energy bands with excellent imaging and timing capabilities, and a large field of view covering ~1/5 of the entire sky at energies above 30 MeV. A CsI calorimeter is capable of GRB triggering in the energy band 0.3-50 MeV. The broadband detection of GRBs and the study of implications for particle acceleration and high-energy emission are primary goals of the mission. AGILE can image GRBs with 2-3 arcminute error boxes in the hard X-ray range, and provide broadband photon-by-photon detection in the 15-45 keV, 0.3-50 MeV, and 30 MeV-30 GeV energy ranges. Microsecond on-board photon tagging and a ~100 microsecond gamma-ray detection deadtime will be crucial for fast GRB timing. On-board calculated GRB coordinates and energy fluxes will be quickly transmitted to the ground by an ORBCOMM transceiver. AGILE recently (December 2005) completed its gamma-ray calibration. It is now (January 2006) undergoing satellite integration and testing. The PSLV launch is planned for early 2006. AGILE is then foreseen to be fully operational during the summer of 2006. It will be the only mission entirely dedicated to high-energy astrophysics above 30 MeV during the period mid-2006/mid-2007.

  9. Development of uncertainty methodology for COBRA-TF void distribution and critical power predictions

    NASA Astrophysics Data System (ADS)

    Aydogan, Fatih

    Thermal hydraulic codes are commonly used tools in licensing processes for the evaluation of various thermal hydraulic scenarios. The uncertainty of a thermal hydraulic code prediction is quantified with uncertainty analyses, whose objective is to determine how well a code predicts and with what uncertainty. A code with a large output uncertainty needs further development and/or model improvements; a code with a small output uncertainty needs a maintenance program to preserve it. Uncertainty analysis also indicates where more validation data are needed. Uncertainty analyses for BWR nominal steady-state and transient scenarios are necessary in order to develop and improve the two-phase flow models in thermal hydraulic codes. Because void distribution is the key factor in determining the flow regime and heat transfer regime, and critical power is an important factor for the safety margin, both steady-state void distribution and critical power predictions are important features of a code, and an uncertainty analysis for these two phenomena provides valuable results. These results can be used in the development of the thermal hydraulic codes that are used for designing a BWR bundle or for licensing procedures. This dissertation includes the development of a particular uncertainty methodology for steady-state void distribution and critical power predictions. In this methodology, the PIRT element of CSAU was used to eliminate the low-ranked uncertainty parameters, and the SPDF element of GRS was utilized to make the uncertainty methodology flexible in the assignment of PDFs to the uncertainty parameters. The developed methodology includes uncertainty comparison methods to assess the code precision with the sample-averaged bias, to assess the code spreading with the sample-averaged standard deviation and to assess the code reliability with the proportion of
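
    The comparison metrics named above are straightforward to compute. The sketch below, with hypothetical void-fraction data, shows the sample-averaged bias and sample-averaged standard deviation used to assess code precision and code spreading.

      import numpy as np

      def bias_and_spread(predicted, measured):
          """Sample-averaged bias and sample-averaged standard deviation
          of the prediction errors of a code against validation data."""
          errors = np.asarray(predicted) - np.asarray(measured)
          return errors.mean(), errors.std(ddof=1)

      # Hypothetical void-fraction predictions vs. measured bundle data
      predicted = [0.42, 0.55, 0.63, 0.71, 0.78]
      measured  = [0.40, 0.57, 0.60, 0.74, 0.76]
      bias, spread = bias_and_spread(predicted, measured)
      print(f"bias = {bias:+.3f}, spread = {spread:.3f}")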

  10. Application of Low-Cost Methodologies for Mobile Phone App Development

    PubMed Central

    Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-01-01

    Background The usage of mobile phones and mobile phone apps has become increasingly prevalent in the recent decade. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. Objective The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data for the apps and share initial users’ self-rated perceptions of the apps. Methods In this study, we present two techniques for creating a mobile app using two well-established online mobile app building websites. The costs of development are specified and the methodology for dissemination of the apps is shared. The application of the low-cost methodologies in the creation of the “Mastering Psychiatry” app for undergraduates and the “Déjà vu” app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students collating their perceptions of the apps. Results For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrate the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps. Conclusions This is one of

  11. Methodology for Developing Deprescribing Guidelines: Using Evidence and GRADE to Guide Recommendations for Deprescribing

    PubMed Central

    Rojas-Fernandez, Carlos H.; Bjerre, Lise M.; Thompson, Wade; Welch, Vivian

    2016-01-01

    Background Class-specific deprescribing guidelines could help clinicians taper and stop medications that are no longer needed or that may be causing more harm than benefit. We set out to develop a methodology to create such guidelines using evidence-based methods for guideline development, evidence synthesis and recommendation rating. Methods and Findings Using a comprehensive checklist for a successful guideline enterprise, we conducted a national modified Delphi consensus process to identify priorities for deprescribing guidelines, then conducted scoping exercises to identify feasible topics, and sequentially developed three deprescribing guidelines. We selected guideline development team members for clinical expertise; a GRADE member worked with staff to ensure guideline development processes were followed. We conducted or used systematic searches and reviews of deprescribing trials of selected drug classes, reviews or systematic reviews of drug class effectiveness, reviews of reviews of drug class harm, and narrative syntheses of contextual questions to inform recommendations and guideline development. Our 8-step process for guideline development included defining scope and purpose, developing a logic model to guide the process and generate key clinical questions, setting criteria for admissible evidence and conducting systematic reviews, synthesizing evidence considering additional contextual information and performing quality estimates, formulating recommendations and providing strength estimations, adding clinical considerations, conducting clinical and stakeholder review, and finally updating content pre-publication. Innovative aspects of the guideline development process included synthesizing evidence for outcomes of tapering or stopping medication, and incorporating evidence for medication harm into the recommendation strength rating. Through the development of three deprescribing guidelines (for proton pump inhibitors, benzodiazepine receptor agonists and

  12. Planning and scheduling for agile manufacturers: The Pantex Process Model

    SciTech Connect

    Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Turnquist, M.A.

    1998-02-01

    Effective use of resources that are shared among multiple products or processes is critical for agile manufacturing. This paper describes the development and implementation of a computerized model to support production planning in a complex manufacturing system at the Pantex Plant, a US Department of Energy facility. The model integrates two different production processes (nuclear weapon disposal and stockpile evaluation) that use common facilities and personnel at the plant. The two production processes are characteristic of flow-shop and job-shop operations. The model reflects the interactions of scheduling constraints, material flow constraints, and the availability of required technicians and facilities. Operational results show significant productivity increases from use of the model.

  13. Perspectives on Industrial Innovation from Agilent, HP, and Bell Labs

    NASA Astrophysics Data System (ADS)

    Hollenhorst, James

    2014-03-01

    Innovation is the life blood of technology companies. I will give perspectives gleaned from a career in research and development at Bell Labs, HP Labs, and Agilent Labs, from the point of view of an individual contributor and a manager. Physicists bring a unique set of skills to the corporate environment, including a desire to understand the fundamentals, a solid foundation in physical principles, expertise in applied mathematics, and most importantly, an attitude: namely, that hard problems can be solved by breaking them into manageable pieces. In my experience, hiring managers in industry seldom explicitly search for physicists, but they want people with those skills.

  14. Production planning tools and techniques for agile manufacturing

    SciTech Connect

    Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Turnquist, M.A.

    1996-10-01

    Effective use of resources shared among multiple products or processes is critical for agile manufacturing. This paper describes development and implementation of a computerized model to support production planning in a complex manufacturing system at Pantex Plant. The model integrates two different production processes (nuclear weapon dismantlement and stockpile evaluation) which use common facilities and personnel, and reflects the interactions of scheduling constraints, material flow constraints, and resource availability. These two processes reflect characteristics of flow-shop and job-shop operations in a single facility. Operational results from using the model are also discussed.

  15. The Southern Argentine Agile Meteor Radar (SAAMER)

    NASA Astrophysics Data System (ADS)

    Janches, Diego

    2014-11-01

    The Southern Argentina Agile Meteor Radar (SAAMER) is a new-generation system deployed in Rio Grande, Tierra del Fuego, Argentina (53 S) in May 2008. SAAMER transmits 10 times more power than regular meteor radars, and uses a newly developed transmitting array which focuses power upward instead of the traditional single-antenna all-sky configuration. The system is configured such that the transmitter array can also be utilized as a receiver. The new design greatly increases the sensitivity of the radar, enabling the detection of a large number of particles at low zenith angles. The more concentrated transmitted power enables additional meteor studies besides those typical of these systems based on the detection of specular reflections, such as routine detections of head echoes and non-specular trails, previously only possible with High Power and Large Aperture radars. In August 2010, SAAMER was upgraded to a system capable of determining meteoroid orbital parameters. This was achieved by adding two remote receiving stations approximately 10 km from the main site in near-perpendicular directions. The upgrade significantly expands the science that can be achieved with this new radar, enabling the study of the orbital properties of the interplanetary dust environment. Because of its unique geographical location, SAAMER allows additional inter-hemispheric comparison with measurements from the Canadian Meteor Orbit Radar, which is geographically conjugate. Initial surveys show, for example, that SAAMER observes a very strong contribution from the South Toroidal sporadic meteor source, for which limited observational data are available. In addition, SAAMER offers similar unique capabilities for meteor shower and stream studies, given the range of ecliptic latitudes covered, enabling detailed study of showers at high southern latitudes (e.g., the July Phoenicids or the Puppids complex). Finally, SAAMER is ideal for the deployment of complementary instrumentation in both, permanent

  16. A multimedia approach for teaching human embryology: Development and evaluation of a methodology.

    PubMed

    Moraes, Suzana Guimarães; Pereira, Luis Antonio Violin

    2010-12-20

    Human embryology requires students to understand the simultaneous changes in embryos, but students find it difficult to grasp the concepts presented and to visualise the related processes in three dimensions. The aims of this study were to develop and evaluate new educational materials and a teaching methodology based on multimedia approaches to improve the comprehension of human development. The materials, developed at the State University of Campinas, include clinical histories, movies, animations, and ultrasound, as well as autopsy images from embryos and foetuses. The series of embryology lectures was divided into two parts. The first part of the series addressed the development of the body's structures, while in the second part, clinical histories and the corresponding materials were shown to the students, who were encouraged to discuss the malformations. The teaching materials were made available in software used by the students in classes. At the end of the discipline, the material and methodology were evaluated with an attitudinal instrument, interviews, and a knowledge examination. The response rate to the attitudinal instrument was 95.35%, and the response rate to the interview was 46%. The students approved of the materials and the teaching methodology (the reliability of the attitudinal instrument was 0.9057). The exams showed that most students scored above 6.0. The multimedia approach proved useful for addressing an important problem associated with teaching methods in many medical institutions: the lack of integration between basic sciences and clinical disciplines.

  17. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.

    2003-01-01

    This effort investigates probabilistic life prediction methodologies for MicroElectroMechanical Systems (MEMS) and analyzes designs that determine the stochastic properties of MEMS. This includes completion of a literature survey regarding the Weibull size effect in MEMS and strength testing techniques, as well as the design of a proper test for the Weibull size effect in tensile specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. Another potential item of interest is the analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. Along these lines, work may also be performed on transient fatigue life prediction methodologies.
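
    The Weibull size effect referred to above can be made concrete with a short sketch: under a two-parameter Weibull strength law, failure probability grows with stressed volume, so larger specimens are statistically weaker. The material values below are hypothetical.

      import numpy as np

      def failure_probability(stress, sigma0, m, volume, v0=1.0):
          """Weibull failure probability with volume scaling:
          Pf = 1 - exp(-(V/V0) * (stress/sigma0)**m)."""
          return 1.0 - np.exp(-(volume / v0) * (stress / sigma0) ** m)

      def scaled_strength(sigma0, m, volume, v0=1.0):
          """Characteristic strength of a specimen of volume V relative
          to the reference volume V0: sigma_V = sigma0 * (V0/V)**(1/m).
          Larger specimens are statistically weaker (the size effect)."""
          return sigma0 * (v0 / volume) ** (1.0 / m)

      # Hypothetical values: sigma0 = 3.0 GPa at V0, Weibull modulus m = 10
      for vol in (1.0, 10.0, 100.0):
          print(f"V = {vol:6.1f}  sigma_V = {scaled_strength(3.0, 10.0, vol):.2f} GPa")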

  18. Agile development of ontologies through conversation

    NASA Astrophysics Data System (ADS)

    Braines, Dave; Bhattal, Amardeep; Preece, Alun D.; de Mel, Geeth

    2016-05-01

    Ontologies and semantic systems are necessarily complex but offer great potential in terms of their ability to fuse information from multiple sources in support of situation awareness. Current approaches do not place the ontologies directly into the hands of the end user in the field but instead hide them away behind traditional applications. We have been experimenting with human-friendly ontologies and conversational interactions to enable non-technical business users to interact with and extend these dynamically. In this paper we outline our approach via a worked example, covering: OWL ontologies, ITA Controlled English, Sensor/mission matching and conversational interactions between human and machine agents.

  19. Agile multiple aperture imager receiver development

    NASA Astrophysics Data System (ADS)

    Lees, David E. B.; Dillon, Robert F.

    1990-02-01

    A variety of unconventional imaging schemes have been investigated in recent years that rely on small, unphased optical apertures (subapertures) to measure properties of an incoming optical wavefront and recover images of distant objects without using precisely figured, large-aperture optical elements. Such schemes offer several attractive features. They provide the potential to create very large effective apertures that are expandable over time and can be launched into space in small pieces. Since the subapertures are identical in construction, they may be mass-producible at potentially low cost. A preliminary design for a practical low-cost optical receiver is presented. The multiple aperture design has high sensitivity and a wide field of view, and is lightweight. A combination of spectral, temporal, and spatial background suppression is used to achieve daytime operation at low signal levels. Modular packaging that makes the number of receiver subapertures conveniently scalable is also presented. The design is appropriate to a ground-based proof-of-concept experiment for long-range active speckle imaging.

  20. U.S. Geological Survey Methodology Development for Ecological Carbon Assessment and Monitoring

    USGS Publications Warehouse

    Zhu, Zhi-Liang; Stackpoole, S.M.

    2009-01-01

    Ecological carbon sequestration refers to transfer and storage of atmospheric carbon in vegetation, soils, and aquatic environments to help offset the net increase from carbon emissions. Understanding capacities, associated opportunities, and risks of vegetated ecosystems to sequester carbon provides science information to support formulation of policies governing climate change mitigation, adaptation, and land-management strategies. Section 712 of the Energy Independence and Security Act (EISA) of 2007 mandates the Department of the Interior to develop a methodology and assess the capacity of our nation's ecosystems for ecological carbon sequestration and greenhouse gas (GHG) flux mitigation. The U.S. Geological Survey (USGS) LandCarbon Project is responding to the Department of Interior's request to develop a methodology that meets specific EISA requirements.

  1. Discovery of new antimalarial chemotypes through chemical methodology and library development.

    PubMed

    Brown, Lauren E; Chih-Chien Cheng, Ken; Wei, Wan-Guo; Yuan, Pingwei; Dai, Peng; Trilles, Richard; Ni, Feng; Yuan, Jing; MacArthur, Ryan; Guha, Rajarshi; Johnson, Ronald L; Su, Xin-zhuan; Dominguez, Melissa M; Snyder, John K; Beeler, Aaron B; Schaus, Scott E; Inglese, James; Porco, John A

    2011-04-26

    In an effort to expand the stereochemical and structural complexity of chemical libraries used in drug discovery, the Center for Chemical Methodology and Library Development at Boston University has established an infrastructure to translate methodologies accessing diverse chemotypes into arrayed libraries for biological evaluation. In a collaborative effort, the NIH Chemical Genomics Center determined IC50 values for Plasmodium falciparum viability for each of 2,070 members of the CMLD-BU compound collection using quantitative high-throughput screening across five parasite lines of distinct geographic origin. Three compound classes displaying either differential or comprehensive antimalarial activity across the lines were identified, and the nascent structure-activity relationships (SAR) from this experiment were used to initiate optimization of these chemotypes for further development.

  2. Development of a methodology for the detection of hospital financial outliers using information systems.

    PubMed

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers.
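
    A distance-based scheme of the kind proposed can be sketched as follows, here using the Mahalanobis distance of each case from the centroid of the case cloud. The financial indices and the threshold below are hypothetical, and the paper's own method may differ in detail.

      import numpy as np

      def mahalanobis_outliers(X, threshold):
          """Flag cases whose Mahalanobis distance from the centroid of
          the case cloud, in the multi-dimensional space of financial
          indices, exceeds a threshold."""
          X = np.asarray(X, dtype=float)
          diffs = X - X.mean(axis=0)
          cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
          d = np.sqrt(np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs))
          return d, d > threshold

      # Hypothetical cases: [profit margin %, personnel cost ratio %]
      cases = [[5.1, 52], [4.8, 55], [5.5, 50], [4.9, 54], [5.3, 51],
               [4.7, 56], [5.0, 53], [5.2, 52], [4.6, 55], [15.0, 30]]
      distances, flags = mahalanobis_outliers(cases, threshold=2.0)
      print(flags)  # only the last case is flagged as an outlier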

  3. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world which tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We consider the characteristics of these methodologies to assess the possibility of a co-design, or combination, of them for developing an information system. To reach this purpose, four different aspects are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming change resistance. Finally, a case study using the quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate for co-design and offers some suggestions for the co-design.

  4. Implementation of a cooperative methodology to develop organic chemical engineering skills

    NASA Astrophysics Data System (ADS)

    Arteaga, J. F.; Díaz Blanco, M. J.; Toscano Fuentes, C.; Martín Alfonso, J. E.

    2013-08-01

    The objective of this work is to investigate how most of the competences required of engineering students may be developed through an active methodology based on cooperative learning/evaluation. Cooperative learning was employed with the University of Huelva's third-year engineering students. The teaching methodology aims to develop some of the most relevant engineering skills required nowadays, such as the ability to cooperate in finding appropriate information, the ability to solve problems through critical and creative thinking, and the ability to make decisions and to communicate effectively. The statistical study carried out supports the hypothesis that comprehensive and well-defined protocols in the development of the subject, the rubric, and cooperative evaluation allow students to learn successfully.

  5. Development of a methodology for the detection of hospital financial outliers using information systems.

    PubMed

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. PMID:23785010

  6. Development of a Pattern Recognition Methodology for Determining Operationally Optimal Heat Balance Instrumentation Calibration Schedules

    SciTech Connect

    Kurt Beran; John Christenson; Dragos Nica; Kenny Gross

    2002-12-15

    The goal of the project is to enable plant operators to detect, with high sensitivity and reliability, the onset of decalibration drifts in all of the instrumentation used as input to the reactor heat balance calculations. To achieve this objective, the collaborators developed and implemented at DBNPS an extension of the Multivariate State Estimation Technique (MSET) pattern recognition methodology pioneered by ANL. The extension was implemented during the second phase of the project and fully achieved the project goal.
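
    MSET itself is a specific ANL technique, but the general flavor of similarity-based state estimation that it represents can be sketched generically: the expected sensor state is a similarity-weighted combination of stored exemplar states, and a growing residual on one channel suggests a decalibration drift. The sketch below is illustrative only, with hypothetical data; it is not ANL's formulation.

      import numpy as np

      def similarity_estimate(memory, observation, bandwidth=1.0):
          """Estimate the expected state as a similarity-weighted
          combination of stored exemplar states; the residual between
          observation and estimate drives drift detection."""
          memory = np.asarray(memory, dtype=float)
          obs = np.asarray(observation, dtype=float)
          dists = np.linalg.norm(memory - obs, axis=1)
          weights = np.exp(-(dists / bandwidth) ** 2)
          weights /= weights.sum()
          estimate = weights @ memory
          return estimate, obs - estimate

      # Hypothetical heat-balance states (normalized feedwater flow,
      # feedwater temperature, steam pressure) from normal operation.
      memory = [[1.00, 0.98, 1.01], [0.99, 1.00, 1.00], [1.01, 0.99, 0.99]]
      estimate, residual = similarity_estimate(memory, [1.00, 1.06, 1.00])
      print(residual)  # a large residual on channel 2 suggests drift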

  7. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems

    PubMed Central

    Takecian, Pedro L.; Oikawa, Marcio K.; Braghetto, Kelly R.; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S.; Acker, Susan; Carneiro-Proietti, Anna B. F.; Sabino, Ester C.; Custer, Brian; Busch, Michael P.; Ferreira, João E.

    2013-01-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development. PMID:23729945

  8. Developing a methodology to predict PM10 concentrations in urban areas using generalized linear models.

    PubMed

    Garcia, J M; Teodoro, F; Cerdeira, R; Coelho, L M R; Kumar, Prashant; Carvalho, M G

    2016-09-01

    A methodology to predict PM10 concentrations in urban outdoor environments is developed based on generalized linear models (GLMs). The methodology is based on the relationship developed between atmospheric concentrations of air pollutants (i.e. CO, NO2, NOx, VOCs, SO2) and meteorological variables (i.e. ambient temperature, relative humidity (RH) and wind speed) for a city (Barreiro) in Portugal. The model uses air pollution and meteorological data from the Portuguese air quality monitoring station networks. The developed GLM considers PM10 concentrations as the dependent variable, and both the gaseous pollutants and meteorological variables as explanatory independent variables. A logarithmic link function was considered with a Poisson probability distribution. Particular attention was given to cases with air temperatures both below and above 25°C. The best performance of modelled results against the measured data was achieved for the model restricted to air temperatures above 25°C, compared with the model considering all ranges of air temperature and with the model considering only temperatures below 25°C. The model was also tested with similar data from another Portuguese city, Oporto, and the results were found to behave similarly. It is concluded that this model and the methodology could be adopted for other cities to predict PM10 concentrations when such data are not available from measurements at air quality monitoring stations or by other acquisition means. PMID:26839052
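
    A GLM of the form described, a Poisson response with a logarithmic link, can be fitted in a few lines. The sketch below uses synthetic data and the statsmodels library; it is not the authors' model or data.

      import numpy as np
      import statsmodels.api as sm

      # Synthetic hourly records: pollutants and meteorology as
      # explanatory variables, PM10 as the response.
      rng = np.random.default_rng(0)
      n = 500
      no2  = rng.uniform(5, 60, n)     # ug/m3
      co   = rng.uniform(0.1, 1.5, n)  # mg/m3
      temp = rng.uniform(5, 35, n)     # deg C
      rh   = rng.uniform(20, 95, n)    # %
      wind = rng.uniform(0.5, 8, n)    # m/s
      pm10 = rng.poisson(np.exp(2.0 + 0.02 * no2 + 0.3 * co - 0.05 * wind))

      # Poisson family with its canonical logarithmic link
      X = sm.add_constant(np.column_stack([no2, co, temp, rh, wind]))
      model = sm.GLM(pm10, X, family=sm.families.Poisson()).fit()
      pm10_pred = model.predict(X)  # predictions on the original scale
      print(model.summary())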

  9. Developing a methodology to predict PM10 concentrations in urban areas using generalized linear models.

    PubMed

    Garcia, J M; Teodoro, F; Cerdeira, R; Coelho, L M R; Kumar, Prashant; Carvalho, M G

    2016-09-01

    A methodology to predict PM10 concentrations in urban outdoor environments is developed based on generalized linear models (GLMs). The methodology is based on the relationship developed between atmospheric concentrations of air pollutants (i.e. CO, NO2, NOx, VOCs, SO2) and meteorological variables (i.e. ambient temperature, relative humidity (RH) and wind speed) for a city (Barreiro) in Portugal. The model uses air pollution and meteorological data from the Portuguese air quality monitoring station networks. The developed GLM considers PM10 concentrations as the dependent variable, and both the gaseous pollutants and meteorological variables as explanatory independent variables. A logarithmic link function was considered with a Poisson probability distribution. Particular attention was given to cases with air temperatures both below and above 25°C. The best performance of modelled results against the measured data was achieved for the model restricted to air temperatures above 25°C, compared with the model considering all ranges of air temperature and with the model considering only temperatures below 25°C. The model was also tested with similar data from another Portuguese city, Oporto, and the results were found to behave similarly. It is concluded that this model and the methodology could be adopted for other cities to predict PM10 concentrations when such data are not available from measurements at air quality monitoring stations or by other acquisition means.

  10. Development and application of a methodology for a clean development mechanism to avoid methane emissions in closed landfills.

    PubMed

    Janke, Leandro; Lima, André O S; Millet, Maurice; Radetski, Claudemir M

    2013-01-01

    In Brazil, solid waste disposal sites have operated without consideration of environmental criteria, and these areas are characterized by methane (CH4) emissions from the anaerobic degradation of organic matter. The United Nations has made efforts to control this situation through the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol, whereby projects that seek to reduce emissions of greenhouse gases (GHG) can be financially rewarded through Certified Emission Reductions (CERs) if they respect the requirements established by the Clean Development Mechanism (CDM), such as the use of methodologies approved by the CDM Executive Board (CDM-EB). Thus, a methodology was developed according to the CDM standards related to the aeration, excavation and composting of closed Municipal Solid Waste (MSW) landfills, which was submitted to the CDM-EB for assessment and, after its approval, applied to a real case study in Maringá City (Brazil) with a view to avoiding negative environmental impacts due to the production of methane and leachates even after closure. This paper describes the establishment of this CDM-EB-approved methodology to determine baseline emissions, project emissions and the resultant emission reductions with the application of appropriate aeration, excavation and composting practices at closed MSW landfills. A further result obtained through the application of the methodology in the landfill case study was that it would be possible to achieve an ex-ante emission reduction of 74,013 tCO2 equivalent if the proposed CDM project activity were implemented.
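
    Baseline methane emissions in methodologies of this kind are typically estimated with a first-order decay (FOD) model. The sketch below is a generic, simplified FOD estimate with hypothetical parameter values; it does not reproduce the approved methodology's exact equations.

      import math

      def fod_methane(waste_t, doc, doc_f, mcf, f, k, years):
          """Simplified first-order decay estimate of the methane (t CH4)
          generated by one year's waste deposit over a given horizon.
          waste_t: tonnes deposited; doc: degradable organic carbon
          fraction; doc_f: fraction of DOC that decomposes; mcf: methane
          correction factor; f: CH4 fraction in landfill gas; k: decay
          rate (1/yr)."""
          ddocm = waste_t * doc * doc_f * mcf           # decomposable C (t)
          decomposed = ddocm * (1 - math.exp(-k * years))
          return decomposed * f * (16.0 / 12.0)         # t C -> t CH4

      ch4_t = fod_methane(waste_t=100_000, doc=0.15, doc_f=0.5, mcf=1.0,
                          f=0.5, k=0.09, years=10)
      co2e_t = ch4_t * 21  # GWP of CH4 used in first-period CDM accounting
      print(f"{ch4_t:.0f} t CH4, about {co2e_t:.0f} t CO2e if avoided")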

  11. Development of a Probabilistic Assessment Methodology for Evaluation of Carbon Dioxide Storage

    USGS Publications Warehouse

    Burruss, Robert A.; Brennan, Sean T.; Freeman, P.A.; Merrill, Matthew D.; Ruppert, Leslie F.; Becker, Mark F.; Herkelrath, William N.; Kharaka, Yousif K.; Neuzil, Christopher E.; Swanson, Sharon M.; Cook, Troy A.; Klett, Timothy R.; Nelson, Philip H.; Schenk, Christopher J.

    2009-01-01

    This report describes a probabilistic assessment methodology developed by the U.S. Geological Survey (USGS) for evaluation of the resource potential for storage of carbon dioxide (CO2) in the subsurface of the United States as authorized by the Energy Independence and Security Act (Public Law 110-140, 2007). The methodology is based on USGS assessment methodologies for oil and gas resources created and refined over the last 30 years. The resource that is evaluated is the volume of pore space in the subsurface in the depth range of 3,000 to 13,000 feet that can be described within a geologically defined storage assessment unit consisting of a storage formation and an enclosing seal formation. Storage assessment units are divided into physical traps (PTs), which in most cases are oil and gas reservoirs, and the surrounding saline formation (SF), which encompasses the remainder of the storage formation. The storage resource is determined separately for these two types of storage. Monte Carlo simulation methods are used to calculate a distribution of the potential storage size for individual PTs and the SF. To estimate the aggregate storage resource of all PTs, a second Monte Carlo simulation step is used to sample the size and number of PTs. The probability of successful storage for individual PTs or the entire SF, defined in this methodology by the likelihood that the amount of CO2 stored will be greater than a prescribed minimum, is based on an estimate of the probability of containment using present-day geologic knowledge. The report concludes with a brief discussion of needed research data that could be used to refine assessment methodologies for CO2 sequestration.
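
    The two-stage Monte Carlo aggregation described above can be sketched as follows, with hypothetical distributions for individual physical-trap (PT) sizes, the number of traps, and the probability of successful containment.

      import numpy as np

      rng = np.random.default_rng(42)
      n_trials = 10_000
      p_success = 0.7  # hypothetical probability of successful storage

      totals = np.empty(n_trials)
      for i in range(n_trials):
          # Second stage: sample an uncertain number of PTs in the unit...
          n_traps = rng.integers(20, 61)
          # ...first stage: sample each PT's potential storage size (Mt CO2)
          sizes = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=n_traps)
          stored = sizes[rng.random(n_traps) < p_success]
          totals[i] = stored.sum()

      # Fractiles of aggregate PT storage; in resource convention the
      # 5th percentile is reported as P95 (95% chance of exceedance).
      p5, p50, p95 = np.percentile(totals, [5, 50, 95])
      print(f"P95 = {p5:.0f}, P50 = {p50:.0f}, P5 = {p95:.0f} Mt CO2")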

  12. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    PubMed Central

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740
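
    To make the idea of exposing an analysis tool as a RESTful service concrete, the following minimal sketch uses Flask with a hypothetical endpoint and a stubbed normalization step. The GEAS services themselves are built differently and additionally carry semantic annotations.

      from flask import Flask, jsonify, request

      app = Flask(__name__)

      @app.route("/services/normalize", methods=["POST"])
      def normalize():
          """Accept a gene-expression matrix (genes x samples) as JSON,
          apply a stubbed normalization step, and return the result."""
          matrix = request.get_json()["expression_matrix"]
          # Stub for the wrapped legacy tool: per-gene mean centering
          normalized = [[x - sum(row) / len(row) for x in row]
                        for row in matrix]
          return jsonify({"expression_matrix": normalized})

      if __name__ == "__main__":
          app.run(port=8080)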

  13. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    PubMed

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  14. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    PubMed

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  15. Methodology, status and plans for development and assessment of Cathare code

    SciTech Connect

    Bestion, D.; Barre, F.; Faydide, B.

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of each Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code, using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronics codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the 3-D modeling.

  16. Lesson Learned from AGILE and LARES ASI Projects About MATED Data Collection and Post Analysis

    NASA Astrophysics Data System (ADS)

    Carpentiero, Rita; Marchetti, Ernesto; Natalucci, Silvia; Portelli, Claudio

    2012-07-01

    ASI has managed and collected data on the project development of two scientific all-Italian missions: AGILE and LARES. Collection of the Model And Test Effectiveness Database (MATED) data, concerning project, AIV (Assembly Integration and Verification) and NCR (Non Conformance Report) aspects, has been performed by the Italian Space Agency (ASI) using the available technical documentation of both the AGILE and LARES projects. In this paper, some considerations on the need for 'real time' data collection are presented, together with a proposal for front-end improvements to this tool. In addition, a preliminary analysis of MATED effectiveness related to the above ASI projects is presented in a bottom-up and post-verification approach.

  17. A process for the agile product realization of electro-mechanical devices

    SciTech Connect

    Forsythe, C.; Ashby, M.R.; Benavides, G.L.; Diegert, K.V.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-09-01

    This paper describes a product realization process developed and demonstrated at Sandia by the A-PRIMED (Agile Product Realization for Innovative Electro MEchanical Devices) project that integrates many of the key components of "agile manufacturing" into a complete, design-to-production process. Evidence indicates that the process has reduced the product realization cycle and assured product quality. Products included discriminators for a robotic quick-change adapter and for an electronic defense system. These discriminators, built using A-PRIMED, met random vibration requirements and had life cycles that far surpassed the performance obtained from earlier efforts.

  18. Development of a Robust and Integrated Methodology for Predicting the Reliability of Microelectronic Packaging Systems

    NASA Astrophysics Data System (ADS)

    Fallah-Adl, Ali

    Ball Grid Arrays (BGAs) using lead-free or lead-rich solder materials are widely used as Second Level Interconnects (SLI) in mounting packaged components to the printed circuit board (PCB). The reliability of these solder joints is of significant importance to the performance of microelectronic components and systems. Product design/form-factor, solder material, manufacturing process, use condition, as well as the inherent variabilities present in the system, greatly influence product reliability. Accurate reliability analysis requires an integrated approach that concurrently accounts for all these factors and their synergistic effects. Such an integrated and robust methodology can be used in the design and development of new and advanced microelectronics systems and can provide significant improvements in cycle time, cost, and reliability. The IMPRPK approach is based on a probabilistic methodology focusing on three major tasks: (1) characterization of BGA solder joints to identify failure mechanisms and obtain statistical data, (2) finite element modeling (FEM) to predict the system response needed for life prediction, and (3) development of a probabilistic methodology to predict the reliability, as well as the sensitivity of the system to various parameters and variabilities. These tasks and the predictive capabilities of IMPRPK in microelectronic reliability analysis are discussed.

  19. Contraceptive introduction reconsidered: a new methodology for policy and program development.

    PubMed

    Simmons, R; Fajans, P

    1999-03-01

    Although new contraceptive technology has the potential for providing women with expanded options for fertility control, the historical record of international family planning shows that, in practice, introduction of new methods has not always broadened reproductive choice. Using the example of introduction of intrauterine devices into the Indian family planning program in the 1960s, we show that an exclusive focus on the technology itself is problematic and argue that methodologies are needed that relate introduction of new methods to user needs and program capacities. We summarize key findings from the Indonesian experience with Norplant introduction. Although an effort was made to address problems with previous approaches, major deficiencies in both the technical and interpersonal dimensions of care arose when the implants were made broadly available within the program. We subsequently present a methodology for contraceptive introduction developed by the World Health Organization. This methodology emphasizes the social and institutional context in which technology is used and suggests a participatory and research-based approach to program and policy development. We illustrate results from this new approach in its implementation in Vietnam and suggest areas for further evaluation.

  20. Progress in the Development of a Nozzle Design Methodology for Pulsed Detonation Engines

    NASA Technical Reports Server (NTRS)

    Leary, B. A.; Waltrup, P. J.; Rice, T.; Cybyk, B. Z.

    2002-01-01

    The Johns Hopkins University Applied Physics Laboratory (JHU/APL), in support of the NASA Glenn Research Center (NASA GRC), is investigating performance methodologies and system integration issues related to Pulsed Detonation Engine (PDE) nozzles. The primary goal of this ongoing effort is to develop design and performance assessment methodologies applicable to PDE exit nozzle(s). APL is currently focusing its efforts on a common plenum chamber design that collects the exhaust products from multiple PDE tubes prior to expansion in a single converging-diverging exit nozzle. To accomplish this goal, a time-dependent, quasi-one-dimensional analysis for determining the flow properties in and through a single plenum and exhaust nozzle is underway. In support of these design activities, parallel modeling efforts using commercial Computational Fluid Dynamics (CFD) software are on-going. These efforts include both two and three-dimensional as well as steady and time-dependent computations to assess the flow in and through these devices. This paper discusses the progress in developing this nozzle design methodology.
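
    PDE nozzle flow is inherently unsteady, which is why the analysis above is time-dependent; still, the steady isentropic area-Mach relation is a common starting point for sizing a converging-diverging exit nozzle. The sketch below uses that textbook relation with hypothetical numbers; it is not the JHU/APL methodology.

      import math

      def area_ratio(mach, gamma=1.4):
          """Isentropic quasi-one-dimensional area ratio A/A* as a
          function of Mach number."""
          t = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * mach**2)
          return (1.0 / mach) * t ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))

      # Size the exit area for a target exit Mach number (hypothetical)
      throat_area_m2 = 0.02
      target_exit_mach = 2.5
      exit_area_m2 = throat_area_m2 * area_ratio(target_exit_mach)
      print(f"A/A* = {area_ratio(target_exit_mach):.3f}, "
            f"A_exit = {exit_area_m2:.4f} m^2")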

  1. Study design, methodology and statistical analyses in the clinical development of sparfloxacin.

    PubMed

    Genevois, E; Lelouer, V; Vercken, J B; Caillon, R

    1996-05-01

    Many publications in the past 10 years have emphasised the difficulties of evaluating anti-infective drugs and the need for well-designed clinical trials in this therapeutic field. The clinical development of sparfloxacin in Europe, involving more than 4000 patients in ten countries, provided the opportunity to implement a methodology for evaluation and statistical analyses which would take into account actual requirements and past insufficiencies. This methodology focused on a rigorous and accurate patient classification for evaluability, subgroups of particular interest, efficacy assessment based on automation (algorithm) and individual case review by expert panel committees. In addition, the statistical analyses did not use significance testing but rather confidence intervals to determine whether sparfloxacin was therapeutically equivalent to the reference comparator antibacterial agents. PMID:8737126
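
    The confidence-interval approach to therapeutic equivalence can be illustrated with a short sketch: equivalence is concluded when the entire interval for the difference in cure rates lies within a pre-set margin, rather than by a significance test. The counts and margin below are hypothetical, not trial data.

      import math

      def equivalence_ci(cured_a, n_a, cured_b, n_b, z=1.96):
          """95% confidence interval for the difference in cure rates
          between a test drug (a) and a reference comparator (b)."""
          p_a, p_b = cured_a / n_a, cured_b / n_b
          se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
          diff = p_a - p_b
          return diff - z * se, diff + z * se

      low, high = equivalence_ci(176, 200, 170, 200)
      margin = 0.10  # pre-specified equivalence margin
      print(f"95% CI: [{low:+.3f}, {high:+.3f}]")
      print("equivalent" if -margin < low and high < margin
            else "equivalence not shown")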

  2. Development, characterization, and optimization of protein level in date bars using response surface methodology.

    PubMed

    Nadeem, Muhammad; Salim-ur-Rehman; Muhammad Anjum, Faqir; Murtaza, Mian Anjum; Mueen-ud-Din, Ghulam

    2012-01-01

    This project was designed to produce a nourishing date bar with commercial value, especially for school-going children, to meet their body development requirements. Protein level of date bars was optimized using response surface methodology (RSM). Economical and underutilized sources, that is, whey protein concentrate and vetch protein isolates, were explored for protein supplementation. Fourteen date bar treatments were produced using a central composite design (CCD) with 2 variables and 3 levels for each variable. Date bars were then analyzed for nutritional profile. Proximate composition revealed that addition of whey protein concentrate and vetch protein isolates improved the nutritional profile of date bars. Protein level, texture, and taste were considerably improved by incorporating 6.05% whey protein concentrate and 4.35% vetch protein isolates in the date bar without affecting any sensory characteristics during storage. Response surface methodology was found to be an economical and effective tool to optimize ingredient levels and to discriminate the interactive effects of independent variables. PMID:22792044
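
    A minimal sketch of the two-factor response-surface idea used above: fit a full quadratic model to designed runs and locate the predicted optimum. The design points and sensory scores are invented for illustration, not the study's data:

        import numpy as np

        # Factors: x1 = whey protein concentrate (%), x2 = vetch protein isolate (%)
        X = np.array([[4, 3], [8, 3], [4, 6], [8, 6], [3, 4.5], [9, 4.5],
                      [6, 2], [6, 7], [6, 4.5], [6, 4.5]])
        y = np.array([6.1, 6.8, 6.4, 6.9, 6.0, 6.7, 6.2, 6.5, 7.3, 7.2])

        x1, x2 = X[:, 0], X[:, 1]
        # Full quadratic model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        b, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Evaluate the fitted surface on a grid and report the predicted optimum.
        g1, g2 = np.meshgrid(np.linspace(3, 9, 61), np.linspace(2, 7, 51))
        pred = b[0] + b[1]*g1 + b[2]*g2 + b[3]*g1*g2 + b[4]*g1**2 + b[5]*g2**2
        i = np.unravel_index(pred.argmax(), pred.shape)
        print(f"predicted optimum near {g1[i]:.2f}% whey, {g2[i]:.2f}% vetch")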

  4. Development of a methodology for assessing the safety of embedded software systems

    NASA Technical Reports Server (NTRS)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety critical software functions.

  5. A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems

    PubMed Central

    Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio

    2013-01-01

    Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and, finally, adapt the functionality of the system to that situation; however, the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces aimed at programmers, which complicates the involvement of domain experts in the development life-cycle. The participation of users who do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents as its main contributions the implementation and evaluation of a web platform and a methodology for the collaborative development of context-aware systems by programmers and domain experts. PMID:23666131
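
    A schematic sketch of the acquire/identify/adapt loop such systems implement; the context attributes, rules, and adaptations below are invented placeholders rather than the platform's actual vocabulary:

        # Toy context-aware pipeline: acquire context, identify the situation,
        # adapt the functionality. Rules are illustrative stand-ins.
        def identify_situation(ctx):
            if ctx["speed_kmh"] > 20:
                return "driving"
            if ctx["hour"] >= 22 or ctx["hour"] < 7:
                return "resting"
            return "walking"

        ADAPTATIONS = {"driving": "voice-only UI",
                       "resting": "mute notifications",
                       "walking": "full UI"}

        context = {"speed_kmh": 35.0, "hour": 14}  # acquired from (simulated) sensors
        print(ADAPTATIONS[identify_situation(context)])   # -> voice-only UI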

  6. Using Modern Methodologies with Maintenance Software

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). Scheduling tasks is difficult because mission needs must be addressed before any other work and often spring up unexpectedly. Keeping track of what everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks to be completed within a sprint based on priority; the team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. The Scrum methodology defines a scrum master, a facilitator who keeps work moving smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for Scrum: it has many software applications in maintenance, team members working on disparate applications, many users, and work that can be interrupted by mission needs, issues, and requirements. In order to use Scrum, the methodology needed to be adapted to MPS. Scrum was chosen because it is adaptable. This paper describes the development of a process for using Scrum, a new development methodology, with a team that works on disparate, interruptible tasks across multiple software applications.

  7. Curriculum Development of a Research Laboratory Methodology Course for Complementary and Integrative Medicine Students

    PubMed Central

    Vasilevsky, Nicole; Schafer, Morgan; Tibbitts, Deanne; Wright, Kirsten; Zwickey, Heather

    2015-01-01

    Training in fundamental laboratory methodologies is valuable to medical students because it enables them to understand the published literature, critically evaluate clinical studies, and make informed decisions regarding patient care. It also prepares them for research opportunities that may complement their medical practice. The National College of Natural Medicine's (NCNM) Master of Science in Integrative Medicine Research (MSiMR) program has developed an Introduction to Laboratory Methods course. The objective of the course is to train clinical students to perform basic laboratory skills, analyze and manage data, and judiciously assess biomedical studies. Here we describe the course development and implementation as it applies to complementary and integrative medicine students. PMID:26500806

  8. Allometric multilevel modelling of agility and dribbling speed by skeletal age and playing position in youth soccer players.

    PubMed

    Valente-dos-Santos, J; Coelho-e-Silva, M J; Duarte, J; Pereira, J; Rebelo-Gonçalves, R; Figueiredo, A; Mazzuco, M A; Sherar, L B; Elferink-Gemser, M T; Malina, R M

    2014-08-01

    This study evaluates the contributions of age, skeletal maturation, body size and composition, training and playing position to the development of agility and dribbling speed in young male soccer players (10-18 years) followed longitudinally. 83 players [defenders (n=35), midfielders (n=27), forwards (n=21)] were followed annually over 5 years (average: 4.4 observations per player). Skeletal age (SA), stature, body mass, triceps and subscapular skinfolds, agility and dribbling speed were measured annually. Body composition was estimated from the 2 skinfolds. Annual training volume was estimated from weekly participation forms completed by coaches. The multiplicative allometric models with the best statistical fit showed that statural growth of 1 cm predicts 1.334 s and 1.927 s of improvement in agility and dribbling speed, respectively. Significant independent effects of fat-free mass and annual training volume were found for agility and dribbling speed, respectively (P<0.05). Predicted agility (from 12 to 18 years of SA) and dribbling speed (from 13 to 18 years of SA) differed significantly among players by playing position (midfielders>forwards>defenders). The present results provide developmental models for the interpretation of intra- and inter-individual variability in agility and dribbling speed among youth soccer players across adolescence, and may provide a framework for trainers and coaches to develop and evaluate individualized training protocols.
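
    A rough sketch of the multiplicative allometric modelling named above, fitted on the log scale by ordinary least squares; all numbers are synthetic, not the study's data:

        import numpy as np

        rng = np.random.default_rng(0)
        stature = rng.uniform(140, 185, 200)   # cm
        ffm = rng.uniform(30, 70, 200)         # fat-free mass, kg
        # Synthetic agility times following y = a * stature^k1 * ffm^k2 * error
        agility = 20.0 * stature**-0.3 * ffm**-0.1 * np.exp(rng.normal(0, 0.02, 200))

        # Taking logs makes the model linear: ln y = ln a + k1 ln stature + k2 ln ffm
        A = np.column_stack([np.ones_like(stature), np.log(stature), np.log(ffm)])
        c, *_ = np.linalg.lstsq(A, np.log(agility), rcond=None)
        print(f"a={np.exp(c[0]):.2f}, k_stature={c[1]:.3f}, k_ffm={c[2]:.3f}")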

  9. Development of an energy-use estimation methodology for the revised Navy Manual MO-303

    SciTech Connect

    Richman, E.E.; Keller, J.M.; Wood, A.G.; Dittmer, A.L.

    1995-01-01

    The U.S. Navy commissioned Pacific Northwest Laboratory (PNL) to revise and/or update the Navy Utilities Targets Manual, NAVFAC MO-303 (U.S. Navy 1972b). The purpose of the project was to produce a current, applicable, and easy-to-use version of the manual for use by energy and facility engineers and staff at all Navy Public Works Centers (PWCs), Public Works Departments (PWDs), Engineering Field Divisions (EFDs), and other related organizations. The revision of the MO-303 manual involved developing a methodology for estimating energy consumption in buildings and ships. This methodology can account for, and equitably allocate, energy consumption within Navy installations. The analyses used to develop this methodology included developing end-use intensities (EUIs) from a vast collection of Navy base metering and billing data. A statistical analysis of the metering data, weather data, and building energy-use characteristics was used to develop appropriate EUI values for use at all Navy bases. A complete Navy base energy reconciliation process was also created for use in allocating all known energy consumption. Initial attempts to use total Navy base consumption values did not produce usable results. A parallel effort using individual building consumption data provided an estimating method that incorporated weather effects. This method produced a set of building EUI values and weather adjustments for use in estimating building energy use. A method of reconciling total site energy consumption was developed based on a "zero-sum" principle. This method provides a way to account for all energy use and apportion part or all of it to buildings and other energy uses when actual consumption is not known. The entire text of the manual was also revised to present a more easily read, understood, and usable document.
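
    An illustrative sketch of the "zero-sum" reconciliation idea: scale EUI-based building estimates so that, together with known metered uses, they account for the whole site total. The building names, areas, and EUIs are hypothetical examples, not Navy data:

        def reconcile(total_site_mwh, metered_mwh, buildings):
            """buildings: {name: (floor_area_sqft, eui_kwh_per_sqft)}"""
            est = {b: area * eui / 1000.0 for b, (area, eui) in buildings.items()}
            residual = total_site_mwh - metered_mwh   # energy left to allocate
            scale = residual / sum(est.values())      # zero-sum: nothing unexplained
            return {b: e * scale for b, e in est.items()}

        alloc = reconcile(total_site_mwh=12_000, metered_mwh=4_500,
                          buildings={"barracks": (80_000, 35.0),
                                     "workshop": (50_000, 60.0),
                                     "offices": (60_000, 45.0)})
        for name, mwh in alloc.items():
            print(f"{name}: {mwh:,.0f} MWh")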

  10. Are They All Created Equal? A Comparison of Different Concept Inventory Development Methodologies

    NASA Astrophysics Data System (ADS)

    Lindell, Rebecca S.; Peak, Elizabeth; Foster, Thomas M.

    2007-01-01

    The creation of the Force Concept Inventory (FCI) was a seminal moment for Physics Education Research. Based on the development of the FCI, many more concept inventories have been developed. The problem with the development of all of these concept inventories is that there does not seem to be a concise methodology for developing these inventories, nor is there a concise definition of what these inventories measure. By comparing the development methodologies of many common Physics and Astronomy Concept Inventories we can draw inferences about different types of concept inventories, as well as different valid conclusions that can be drawn from the administration of these inventories. Inventories compared include: Astronomy Diagnostic Test (ADT), Brief Electricity and Magnetism Assessment (BEMA), Conceptual Survey in Electricity and Magnetism (CSEM), Diagnostic Exam Electricity and Magnetism (DEEM), Determining and Interpreting Resistive Electric Circuits Concept Test (DIRECT), Energy and Motion Conceptual Survey (EMCS), Force Concept Inventory (FCI), Force and Motion Conceptual Evaluation (FMCE), Lunar Phases Concept Inventory (LPCI), Test of Understanding Graphs in Kinematics (TUG-K) and Wave Concept Inventory (WCI).

  11. First GRB detections with the AGILE Minicalorimeter

    NASA Astrophysics Data System (ADS)

    Marisaldi, M.; Labanti, C.; Fuschino, F.; Galli, M.; Tavani, M.; Bulgarelli, A.; Gianotti, F.; Trifoglio, M.; Argan, A.

    2008-05-01

    The Minicalorimeter (MCAL) onboard the AGILE satellite is a 1400 cm² scintillation detector sensitive in the energy range 0.3-200 MeV. MCAL works both as a slave of the AGILE Silicon Tracker and as an autonomous detector for transient events (BURST mode). A dedicated onboard Burst Search logic scans BURST mode data in search of count-rate increases. Peculiar characteristics of the detector are its broad energy coverage and a timing resolution of about 2 microseconds. Even if a trigger is not issued, BURST mode data are used to build a broad-band energy spectrum (scientific ratemeters) organized in 11 bands for each of the two MCAL detection planes, with a time resolution of 1 second. After the first engineering commissioning phase following the AGILE launch on 23rd April 2007, eighteen GRBs were detected offline in the scientific ratemeters data between 22nd June and 5th November 2007, a detection rate of about one per week. In this paper the capabilities of the detector are described and an overview of the first detected GRBs is given.
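
    A toy sketch of the kind of count-rate burst search described above: flag a time bin whose counts exceed the Poisson fluctuation of a running background estimate. The window length and 5-sigma threshold are illustrative choices, not the MCAL flight logic:

        import numpy as np

        rng = np.random.default_rng(1)
        rate = np.full(600, 80.0)    # background counts per 1-s ratemeter bin
        rate[300:308] += 400.0       # injected GRB-like transient
        counts = rng.poisson(rate)

        bg_window, n_sigma = 50, 5.0
        for i in range(bg_window, len(counts)):
            bg = counts[i - bg_window:i].mean()
            if counts[i] > bg + n_sigma * np.sqrt(bg):
                print(f"trigger at t={i} s: {counts[i]} counts vs background {bg:.1f}")
                break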

  12. First GRB detections with the AGILE Minicalorimeter

    SciTech Connect

    Marisaldi, M.; Labanti, C.; Fuschino, F.; Bulgarelli, A.; Gianotti, F.; Trifoglio, M.; Galli, M.; Tavani, M.; Argan, A.

    2008-05-22

    The Minicalorimeter (MCAL) onboard the AGILE satellite is a 1400 cm² scintillation detector sensitive in the energy range 0.3-200 MeV. MCAL works both as a slave of the AGILE Silicon Tracker and as an autonomous detector for transient events (BURST mode). A dedicated onboard Burst Search logic scans BURST mode data in search of count rate increase. Peculiar characteristics of the detector are the high energy spectral coverage and a timing resolution of about 2 microseconds. Even if a trigger is not issued, BURST mode data are used to build a broad band energy spectrum (scientific ratemeters) organized in 11 bands for each of the two MCAL detection planes, with a time resolution of 1 second. After the first engineering commissioning phase, following the AGILE launch on 23rd April 2007, between 22nd June and 5th November 2007 eighteen GRBs were detected offline in the scientific ratemeters data, with a detection rate of about one per week. In this paper the capabilities of the detector will be described and an overview of the first detected GRBs will be given.

  13. Preliminary methodology to assess the national and regional impact of U.S. wind energy development on birds and bats

    USGS Publications Warehouse

    Diffendorfer, James E.; Beston, Julie A.; Merrill, Matthew D.; Stanton, Jessica C.; Corum, Margo D.; Loss, Scott R.; Thogmartin, Wayne E.; Johnson, Douglas H.; Erickson, Richard A.; Heist, Kevin W.

    2015-01-01

    Components of the methodology are based on simplifying assumptions and require information that, for many species, may be sparse or unreliable. These assumptions are presented in the report and should be carefully considered when using output from the methodology. In addition, this methodology can be used to recommend species for more intensive demographic modeling or highlight those species that may not require any additional protection because effects of wind energy development on their populations are projected to be small.

  14. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of dissipation and bioconversion. Methods with low recovery yields could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) media with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. From the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs

  15. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. 1: Introduction

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. This research program has developed a viable methodology for producing small scale rural land use maps in semi-arid developing countries using imagery obtained from orbital multispectral scanners.

  16. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review.

    PubMed

    Pollard, Beth; Johnston, Marie; Dixon, Diane

    2007-03-07

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest) and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological development, the

  17. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    NASA Astrophysics Data System (ADS)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of a broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge for territorial management in a practical and effective way. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in all this process, but still few researchers are investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. This work aims to develop an effective methodology for assessing as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. It is expected to attain the proper procedures in order to assess geodiversity

  18. [Application and development of spectroscopy methodologies in the study on non-covalent interactions].

    PubMed

    Li, Rui; Dai, Ben-Cai; Zhao, Yong-De; Lu, Kui

    2009-01-01

    Spectrophotometric methods are widely used in the structure determination of biological macromolecules and in the study of non-covalent interactions because of their convenience and speed. In the present paper, spectroscopic methodologies for studying non-covalent interactions between small molecules and biomacromolecules are comprehensively reviewed with 25 references. This review focuses on the applications and development of common spectroscopic methodologies in this field, including UV, fluorescence, CD, IR, Raman, the resonance light scattering technique and SPR. The advantages and disadvantages of these methodologies are also described. UV-Vis absorption spectroscopy (UV) is widely used in the study of non-covalent interactions for its convenience and speed. The number of binding sites, the apparent binding constant and the interaction mode of non-covalent interactions can be obtained by fluorescence spectroscopy. Circular dichroism (CD) is an effective way of measuring non-covalent interactions. Spectroscopic information about protein secondary structure and conformation can be acquired by infrared spectrometry (IR). Raman spectroscopy is well suited to investigating conformational changes of macromolecules in solution. Non-covalent interactions can be measured by the surface plasmon resonance (SPR) method under native, active conditions. X-ray diffraction analysis is also well suited to non-covalent interaction research, but growing crystalline complexes is difficult. PMID:19385248

  19. Toward the development of a Trust-Tech-based methodology for solving mixed integer nonlinear optimization

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Chiang, Hsiao-Dong

    Many applications in the smart grid can be formulated as constrained optimization problems. Because of the discrete controls involved in power systems, these problems are essentially mixed-integer nonlinear programs. In this paper, we review the Trust-Tech-based methodology for solving mixed-integer nonlinear optimization. Specifically, we have developed a two-stage Trust-Tech-based methodology to systematically compute all the local optimal solutions for constrained mixed-integer nonlinear programming (MINLP) problems. In the first stage, for a given MINLP problem this methodology starts with the construction of a new, continuous, unconstrained problem through relaxation and the penalty function method. A corresponding dynamical system is then constructed to search for a set of local optimal solutions for the unconstrained problem. In the second stage, a reduced constrained NLP is defined for each local optimal solution by determining and fixing the values of the integer variables of the MINLP problem. The Trust-Tech-based method is used to compute a set of local optimal solutions for these reduced NLP problems, from which the optimal solution of the original MINLP problem is determined. Numerical simulations of several test problems are provided to illustrate the effectiveness of the proposed method.
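
    The two-stage flow can be illustrated on a toy problem. Note that this sketch substitutes a plain multistart local search for the Trust-Tech dynamical-system search, and the objective function is invented:

        import numpy as np
        from scipy.optimize import minimize

        def f(v):              # toy MINLP objective: x continuous, y integer in [0, 3]
            x, y = v
            return (x - 1.3) ** 2 + (y - 2.2) ** 2 + 0.1 * np.sin(5 * x)

        # Stage 1: relax y to a continuous variable and search from several starts.
        starts = [np.array([x0, y0]) for x0 in (-1.0, 1.0, 3.0) for y0 in (0.0, 2.0)]
        relaxed = [minimize(f, s, bounds=[(-5, 5), (0, 3)]).x for s in starts]

        # Stage 2: fix y at nearby integers and re-solve the reduced NLP in x alone.
        best = None
        for x_rel, y_rel in relaxed:
            for y_fix in {int(np.floor(y_rel)), int(np.ceil(y_rel))}:
                res = minimize(lambda x: f([x[0], y_fix]), [x_rel], bounds=[(-5, 5)])
                cand = (res.fun, res.x[0], y_fix)
                best = cand if best is None or cand[0] < best[0] else best

        print(f"best found: f={best[0]:.4f} at x={best[1]:.3f}, y={best[2]}")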

  20. Risk assessment methodology applied to counter IED research & development portfolio prioritization

    SciTech Connect

    Shevitz, Daniel W; O' Brien, David A; Zerkle, David K; Key, Brian P; Chavez, Gregory M

    2009-01-01

    In an effort to protect the United States from the ever increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T), has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like a "risk based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it against all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model, using an Excel spreadsheet containing weights of the subjective merits of different technologies. The methodology produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for logic models combined with LANL software called INFTree for evidential reasoning.

  2. Using practice development methodology to develop children's centre teams: ideas for the future.

    PubMed

    Hemingway, Ann; Cowdell, Fiona

    2009-09-01

    The Children's Centre Programme is a recent development in the UK and brings together multi-agency teams to work with disadvantaged families. Practice development methods enable teams to work together in new ways. Although the term practice development remains relatively poorly defined, its key properties suggest that it embraces engagement, empowerment, evaluation and evolution. This paper introduces the Children's Centre Programme and practice development methods, and aims to discuss the relevance of using this method to develop teams in children's centres by considering the findings from an evaluation of a two-year project to develop inter-agency public health teams. The evaluation showed that practice development methods can enable successful team development and that, through effective facilitation, teams can change their practice to focus on areas of local need. The team came up with their own process to develop a strategy for their locality. PMID:19788167

  3. Appropriate Methodology for Assessing the Economic Development Impacts of Wind Power

    SciTech Connect

    NWCC Economic Development Work Group

    2003-12-17

    Interest in wind power development is growing as a means of expanding local economies. Such development holds promise as a provider of short-term employment during facility construction and long-term employment from ongoing facility operation and maintenance. It may also support some expansion of the local economy through ripple effects resulting from initial increases in jobs and income. However, there is a need for a theoretically sound method for assessing the economic impacts of wind power development. These ripple effects stem from subsequent expenditures for goods and services made possible by first-round income from the development, and are expressed in terms of a multiplier. If the local economy offers a wide range of goods and services, the resulting multiplier can be substantial, as much as three or four. If not, then much of the initial income will leave the local economy to buy goods and services from elsewhere. Loss of initial income to other locales is referred to as a leakage. Northwest Economic Associates (NEA), under contract to the National Wind Coordinating Committee (NWCC), investigated three case study areas in the United States where wind power projects were recently developed. The full report, "Assessing the Economic Development Impacts of Wind Power," is available at NWCC's website http://www.nationalwind.org/. The methodology used for that study is summarized here in order to provide guidance for future studies of the economic impacts of other wind power developments. The methodology used in the NEA study was specifically designed for these particular case study areas; however, it can be generally applied to other areas. Significant differences in local economic conditions and in the amount of goods and services that are purchased locally, as opposed to imported from outside the area, will strongly influence the results obtained. Listed below are some of the key tasks that interested parties should undertake to develop a reasonable picture of

  4. Agile text mining for the 2014 i2b2/UTHealth Cardiac risk factors challenge.

    PubMed

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2015-12-01

    This paper describes the use of an agile text mining platform (Linguamatics' Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system.
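
    To make the rule-plus-classifier pattern concrete, a minimal sketch of document-level extraction with simple negation handling; the patterns and the toy note are invented stand-ins, not Linguamatics I2E rules or i2b2 data:

        import re

        RULES = {
            "diabetes":     re.compile(r"\b(diabetes|diabetic|dm type [12])\b", re.I),
            "hypertension": re.compile(r"\b(hypertension|htn)\b", re.I),
            "smoker":       re.compile(r"\b(smok(er|ing)|tobacco)\b", re.I),
        }
        NEGATION = re.compile(r"\b(no|denies|negative for)\b[^.]*$", re.I)

        def extract(note):
            """Return the set of document-level risk-factor labels found."""
            found = set()
            for sentence in note.split("."):
                for label, pattern in RULES.items():
                    m = pattern.search(sentence)
                    # Skip mentions preceded by a negation cue in the same sentence.
                    if m and not NEGATION.search(sentence[:m.start()]):
                        found.add(label)
            return found

        note = "Patient denies tobacco use. Known HTN, on lisinopril. DM type 2."
        print(extract(note))   # -> {'hypertension', 'diabetes'}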

  5. Optimization-based methodology for the development of wastewater facilities for energy and nutrient recovery.

    PubMed

    Puchongkawarin, C; Gomez-Mont, C; Stuckey, D C; Chachuat, B

    2015-12-01

    A paradigm shift is currently underway from an attitude that considers wastewater streams as a waste to be treated, to a proactive interest in recovering materials and energy from these streams. This paper is concerned with the development and application of a systematic, model-based methodology for the development of wastewater resource recovery systems that are both economically attractive and sustainable. With the array of available treatment and recovery options growing steadily, a superstructure modeling approach based on rigorous mathematical optimization appears to be a natural way of tackling these problems. The development of reliable, yet simple, performance and cost models is a key issue with this approach in order to allow for a reliable solution based on global optimization. We argue that commercial wastewater simulators can be used to derive such models, and we illustrate this approach with a simple resource recovery system. The results show that the proposed methodology is computationally tractable, thereby supporting its application as a decision support system for selection of promising resource recovery systems whose development is worth pursuing.
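
    A toy illustration of the superstructure idea: enumerate one technology per treatment stage and keep the cheapest train that meets a recovery target. The stages, costs, and recovery fractions are invented, and a real formulation would use rigorous (global) optimization rather than exhaustive enumeration:

        from itertools import product

        # stage -> options as (name, annualized cost in k$, nutrient recovery fraction)
        STAGES = {
            "primary":   [("settler", 120, 0.05), ("uasb", 200, 0.15)],
            "secondary": [("cas", 300, 0.10), ("mbr", 450, 0.20)],
            "recovery":  [("none", 0, 0.00), ("struvite", 180, 0.45),
                          ("stripping", 220, 0.35)],
        }

        best = None
        for train in product(*STAGES.values()):
            cost = sum(c for _, c, _ in train)
            remaining = 1.0
            for _, _, r in train:      # each stage acts on what the previous left
                remaining *= 1.0 - r
            recovered = 1.0 - remaining
            if recovered >= 0.50 and (best is None or cost < best[0]):
                best = (cost, [name for name, _, _ in train], recovered)

        cost, train, rec = best
        print(f"cheapest feasible train: {train}, {cost} k$/yr, recovery {rec:.0%}")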

  6. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1993-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles such as the National Aero-Space Plane (NASP) and the High-Speed Civil Transport (HSCT) at NASA-LaRC. These analyses require knowledge of the temperature within the structures which consequently necessitates the need for thermal property data. The initial goal of this research effort was to develop a methodology for the estimation of thermal properties of aerospace structural materials at room temperature and to develop a procedure to optimize the estimation process. The estimation procedure was implemented utilizing a general purpose finite element code. In addition, an optimization procedure was developed and implemented to determine critical experimental parameters to optimize the estimation procedure. Finally, preliminary experiments were conducted at the Aircraft Structures Branch (ASB) laboratory.
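
    A hedged sketch of the estimation idea: recover a thermal diffusivity by least-squares matching of a simple 1-D transient conduction model to sensor data (synthetic here). The geometry, numbers, and finite-difference model are illustrative assumptions, not the procedure used at LaRC:

        import numpy as np
        from scipy.optimize import least_squares

        def simulate(alpha, n=50, dt=0.02, steps=300, t_hot=100.0):
            """Explicit 1-D slab conduction, hot boundary at x=0; returns the
            temperature history at a 'sensor' located at x = 0.2."""
            dx = 1.0 / n
            temp = np.zeros(n + 1)
            temp[0] = t_hot
            history = []
            for _ in range(steps):
                temp[1:-1] += alpha * dt / dx**2 * (temp[2:] - 2*temp[1:-1] + temp[:-2])
                history.append(temp[n // 5])
            return np.array(history)

        true_alpha = 4.0e-3    # the "unknown" diffusivity to be recovered
        data = simulate(true_alpha) + np.random.default_rng(2).normal(0.0, 0.05, 300)
        fit = least_squares(lambda a: simulate(a[0]) - data, x0=[1.0e-3],
                            bounds=([1.0e-4], [8.0e-3]))  # keeps the scheme stable
        print(f"estimated diffusivity {fit.x[0]:.2e} vs true {true_alpha:.1e}")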

  7. Structured system engineering methodologies used to develop a nuclear thermal propulsion engine

    NASA Technical Reports Server (NTRS)

    Corban, R.; Wagner, R.

    1993-01-01

    To facilitate the development of a space nuclear thermal propulsion engine for manned flights to Mars, requirements must be established early in the technology development cycle. The long lead times for the acquisition of the engine system and nuclear test facilities demands that the engine system size, performance and safety goals be defined at the earliest possible time. These systems are highly complex and require a large multidisciplinary systems engineering team to develop and track requirements, and to ensure that the as-built system reflects the intent of the mission. A methodology has been devised which uses sophisticated computer tools to effectively develop and interpret functional requirements, and furnish these to the specification level for implementation.

  8. Methodology for optimizing the development and operation of gas storage fields

    SciTech Connect

    Mercer, J.C.; Ammer, J.R.; Mroz, T.H.

    1995-04-01

    The Morgantown Energy Technology Center is pursuing the development of a methodology that uses geologic modeling and reservoir simulation for optimizing the development and operation of gas storage fields. Several Cooperative Research and Development Agreements (CRADAs) will serve as the vehicle to implement this product. CRADAs have been signed with National Fuel Gas and Equitrans, Inc. A geologic model is currently being developed for the Equitrans CRADA. Results from the CRADA with National Fuel Gas are discussed here. The first phase of the CRADA, based on original well data, was completed last year and reported at the 1993 Natural Gas RD&D Contractors Review Meeting. Phase 2 analysis was completed based on additional core and geophysical well log data obtained during a deepening/relogging program conducted by the storage operator. Good matches of wellhead pressure, within 10 percent, were obtained using a numerical simulator to history-match 2 1/2 injection/withdrawal cycles.

  9. Frequency-agile CO2 DIAL for environmental monitoring

    NASA Astrophysics Data System (ADS)

    Carr, Lewis W.; Fletcher, Leland; Crittenden, Max; Carlisle, Clinton B.; Gotoff, Steve W.; Reyes, Felix; D'Amico, Francis

    1994-06-01

    SRI International has designed and developed a fully automated frequency-agile CO2 DIAL (differential absorption lidar) system. The system sensor head consists of a single frequency-agile CO2 TEA laser; a 10-inch receiver telescope; a liquid-nitrogen-cooled HgCdTe detector; and a transmit energy monitor. The sensor head and its auxiliary equipment (including the data acquisition and processing system, laser power supply, and water cooler) are mounted in a Grumman-Olson 11-ft step van. The self-contained, mobile system can be used to detect and quantify many volatile organic compounds (VOCs) at parts-per-million sensitivities over open-path ranges of up to 5 km. Characterization and demonstration of the system are ongoing; data collected on benzene, toluene, xylene, methanol, ethyl acetate, acetic anhydride, and other VOCs are described herein. The system could be used by industry and government agencies in stand-off monitoring to map VOC emission sources and transport patterns into surrounding communities. A single mobile system could be used at several locations to verify compliance with environmental regulations such as the 1990 Clean Air Act Amendments.
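
    For orientation, the standard two-wavelength DIAL retrieval applied between two range gates; the absorption cross-section and lidar returns below are made-up numbers, not SRI measurements:

        import numpy as np

        def dial_number_density(p_on, p_off, dsigma_cm2, gate_m):
            """Path-averaged number density between gates R1 and R2:
               N = ln[(P_off(R2)/P_off(R1)) * (P_on(R1)/P_on(R2))] / (2*dsigma*dR)."""
            ratio = (p_off[1] / p_off[0]) * (p_on[0] / p_on[1])
            return np.log(ratio) / (2.0 * dsigma_cm2 * 1e-4 * gate_m)   # m^-3

        # Hypothetical returns at two gates 150 m apart, on/off the absorption line.
        n = dial_number_density(p_on=(1.00, 0.62), p_off=(1.00, 0.81),
                                dsigma_cm2=1.0e-18, gate_m=150.0)
        print(f"number density ~ {n:.2e} m^-3")  # ~0.4 ppm at sea-level air density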

  10. Frequency-agile microwave components using ferroelectric materials

    NASA Astrophysics Data System (ADS)

    Colom-Ustariz, Jose G.; Rodriguez-Solis, Rafael; Velez, Salmir; Rodriguez-Acosta, Snaider

    2003-04-01

    The non-linear electric field dependence of ferroelectric thin films can be used to design frequency- and phase-agile components. Tunable components have traditionally been developed using mechanically tuned resonant structures, ferrite components, or semiconductor-based voltage-controlled electronics, but they are limited by their frequency performance, high cost, high losses, and difficulty of integration into larger systems. In contrast, ferroelectric-based tunable microwave components can easily be integrated into conventional microstrip circuits, and attributes such as small size, light weight, and low loss make these components attractive for broadband and multi-frequency applications. Components that are essential elements in the design of a microwave sensor can be fabricated with ferroelectric materials to achieve tunability over a broad frequency range. It has been reported that with a thin ferroelectric film placed between the top conductor layer and the dielectric material of a microstrip structure, and the proper DC bias scheme, tunable components above the Ku band can be fabricated. Components such as phase shifters, coupled-line filters, and Lange couplers have been reported in the literature using this technique. In this work, simulated results from a full-wave electromagnetic simulator are obtained to show the tunability of a matching network typically used in the design of microwave amplifiers and antennas. In addition, simulated results of a multilayer Lange coupler and a patch antenna are also presented. The results show that typical microstrip structures can be easily modified to provide frequency-agile capabilities.

  11. The Test Equipment of the AGILE Minicalorimeter Prototype

    SciTech Connect

    Trifoglio, M.; Bulgarelli, A.; Gianotti, F.; Celesti, E.; Di Cocco, G.; Labanti, C.; Mauri, A.; Prest, M.; Vallazza, E.; Froysland, T.

    2004-09-28

    AGILE is an ASI (Italian Space Agency) Small Space Mission for high energy astrophysics in the range 30 MeV - 50 GeV. The AGILE satellite is currently in the C phase and is planned to be launched in 2005. The Payload shall consist of a Tungsten-Silicon Tracker, a CsI Minicalorimeter, an anticoincidence system and an X-ray detector sensitive in the 10-40 keV range. The purpose of the Minicalorimeter (MCAL) is twofold. It shall work in conjunction with the Tracker in order to evaluate the energy of the interacting photons, and it shall operate autonomously in the energy range 250 keV-250 MeV for detection of transients and gamma ray burst events and for the measurement of gamma ray background fluctuations. We present the architecture of the Test Equipment we have designed and developed in order to test and verify the MCAL Simplified Electrical Model prototype, which has been manufactured in order to validate the design of the MCAL Proto Flight Model.

  12. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no single standard methodology or technique used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  13. Backscatter measurements of aerosolized CB simulants with a frequency agile CO2 lidar

    NASA Astrophysics Data System (ADS)

    Vanderbeek, Richard; Gurton, Kristan

    2004-02-01

    A novel windowless chamber was developed to allow aerosol backscatter measurements with a frequency-agile CO2 lidar. The chamber utilizes curtains of air to contain the cloud, thus preventing the inevitable backscatter off conventional windows from corrupting the desired measurements. This feature is critical because the CO2 lidar has a long (1 μs) pulse and the backscatter off a window cannot be temporally separated from the backscatter off the aerosol in the chamber. The chamber was designed for testing with a variety of CB simulants and interferents in both vapor and aerosol form and has been successfully shown to contain a cloud of known size, concentration, and particle size distribution for 10-15 minutes. This paper shows results using Arizona road dust that was screened by the manufacturer into 0-3 μm and 5-10 μm particle size distributions. The measurements clearly show the effect of size distribution on the infrared backscatter coefficients as well as the dynamic nature of the size distribution for a population of aerosols. The test methodology and experimental results are presented.

  14. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1987-01-01

    The primary research objective of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) is to develop a methodology for constructing and maintaining large scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  15. Participatory scenario development for environmental management: a methodological framework illustrated with experience from the UK uplands.

    PubMed

    Reed, M S; Kenter, J; Bonn, A; Broad, K; Burt, T P; Fazey, I R; Fraser, E D G; Hubacek, K; Nainggolan, D; Quinn, C H; Stringer, L C; Ravera, F

    2013-10-15

    A methodological framework is proposed for participatory scenario development on the basis of evidence from the literature, and is tested and refined through the development of scenarios for the future of UK uplands. The paper uses a review of previous work to justify a framework based around the following steps: i) define context and establish whether there is a basis for stakeholder engagement in scenario development; ii) systematically identify and represent relevant stakeholders in the process; iii) define clear objectives for scenario development with stakeholders including spatial and temporal boundaries; iv) select relevant participatory methods for scenario development, during initial scenario construction, evaluation and to support decision-making based on scenarios; and v) integrate local and scientific knowledge throughout the process. The application of this framework in case study research suggests that participatory scenario development has the potential to: i) make scenarios more relevant to stakeholder needs and priorities; ii) extend the range of scenarios developed; iii) develop more detailed and precise scenarios through the integration of local and scientific knowledge; and iv) move beyond scenario development to facilitate adaptation to future change. It is argued that participatory scenario development can empower stakeholders and lead to more consistent and robust scenarios that can help people prepare more effectively for future change. PMID:23774752

  16. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND Computer I Program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
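
    A loose sketch of the SIMRAND idea: Monte Carlo over alternative task networks, ranking each alternative by expected utility. The triangular cost distributions and the utility function are placeholders, not assessments elicited from technical experts:

        import random

        random.seed(3)

        def utility(cost):           # risk-averse cardinal utility over total cost
            return -cost ** 1.2

        # Each alternative path = list of tasks, each task = (low, mode, high) cost, $M
        ALTERNATIVES = {
            "path A": [(1, 2, 4), (3, 5, 9)],
            "path B": [(2, 3, 5), (2, 3, 6), (1, 1.5, 2)],
        }

        def expected_utility(tasks, n=20_000):
            total = 0.0
            for _ in range(n):
                total += utility(sum(random.triangular(lo, hi, mode)
                                     for lo, mode, hi in tasks))
            return total / n

        for name, tasks in ALTERNATIVES.items():
            print(f"{name}: expected utility {expected_utility(tasks):.2f}")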

  17. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, thereby increasing the potential for a catastrophic collision during the rendezvous and the docking or berthing of spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.
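
    A back-of-envelope sketch of how per-event docking risks aggregate into an annual station risk; the traffic model and per-docking collision probabilities are purely notional, not values from the ISS risk models:

        # vehicle: (dockings per year, assumed P(collision) per docking)
        annual_traffic = {
            "Soyuz":    (4, 1.0e-4),
            "Progress": (4, 1.0e-4),
            "ATV":      (1, 2.0e-4),
            "HTV":      (1, 2.0e-4),
        }

        p_no_collision = 1.0
        for vehicle, (n, p) in annual_traffic.items():
            p_no_collision *= (1.0 - p) ** n   # assumes independent docking events

        print(f"P(at least one collision in a year) = {1.0 - p_no_collision:.2e}")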

  18. On the development of a strength prediction methodology for fibre metal laminates in pin bearing

    NASA Astrophysics Data System (ADS)

    Krimbalis, Peter Panagiotis

    The development of Fibre Metal Laminates (FMLs) for application in aerospace structures represents a paradigm shift in airframe and material technology. By consolidating both monolithic metallic alloys and fibre-reinforced composite layers, a new material structure is born, exhibiting desired qualities emerging from its heterogeneous constituency. When mechanically fastened via pins, bolts and rivets, these laminated materials develop damage and ultimately fail via mechanisms that were not entirely understood and that differ from those of either their metallic or composite constituents. The development of a predictive methodology capable of characterizing how FMLs fastened with pins behave and fail would drastically reduce the amount of experimentation necessary for material qualification and be an invaluable design tool. The body of this thesis discusses the extension of the characteristic dimension approach to FMLs and the subsequent development of a new failure mechanism as part of a progressive damage finite element (FE) modeling methodology, with yielding, delamination and buckling representing the central tenets of the new mechanism. This yielding through delamination buckling (YDB) mechanism and progressive FE model were investigated through multiple experimental studies. The experimental investigations required the development of a protocol with emphasis on measuring deformation locally as well as globally. With the extended protocol employed, complete characterization of the material response was possible, and a new definition for yield in a pin bearing configuration was developed and subsequently extended to a tensile testing configuration. The performance of this yield definition was compared directly to existing definitions and was shown to be effective in both quasi-isotropic and orthotropic materials. The results of the experiments and FE simulations demonstrated that yielding (according to the new definition), buckling and delamination

  19. Development and application of a hybrid transport methodology for active interrogation systems

    SciTech Connect

    Royston, K.; Walters, W.; Haghighat, A.; Yi, C.; Sjoden, G.

    2013-07-01

    A hybrid Monte Carlo and deterministic methodology has been developed for application to active interrogation systems. The methodology consists of four steps: i) neutron flux distribution due to neutron source transport and subcritical multiplication; ii) generation of gamma source distribution from (n,γ) interactions; iii) determination of gamma current at a detector window; iv) detection of gammas by the detector. This paper discusses the theory and results of the first three steps for the case of a cargo container with a sphere of HEU in third-density water cargo. To complete the first step, a response-function formulation has been developed to calculate the subcritical multiplication and neutron flux distribution. Response coefficients are pre-calculated using the MCNP5 Monte Carlo code. The second step uses the calculated neutron flux distribution and Bugle-96 (n,γ) cross sections to find the resulting gamma source distribution. In the third step the gamma source distribution is coupled with a pre-calculated adjoint function to determine the gamma current at a detector window. The AIMS (Active Interrogation for Monitoring Special-Nuclear-Materials) software has been written to output the gamma current for a source-detector assembly scanning across a cargo container using the pre-calculated values and taking significantly less time than a reference MCNP5 calculation. (authors)

  20. The AIV quick look and health monitoring system of the AGILE payload

    NASA Astrophysics Data System (ADS)

    Bulgarelli, Andrea; Gianotti, Fulvio; Trifoglio, Massimo; Di Cocco, Guido; Tavani, Marco; Marisaldi, Martino

    2008-07-01

    AGILE is an ASI (Italian Space Agency) Small Scientific Mission dedicated to high-energy astrophysics, launched on April 23, 2007 from the Satish Dhawan Space Centre, India, on a PSLV-C8 rocket. The AGILE Payload is composed of three instruments: a Tungsten-Silicon Tracker designed to detect and image photons in the 30 MeV-50 GeV energy band, an X-ray imager called SuperAGILE that works in the 18-60 keV energy band, and a Minicalorimeter that detects gamma-rays or particle energy deposits between 300 keV and 200 MeV. The instrument is surrounded by an anti-coincidence (AC) system. We have developed a set of Quick Look software tools in the framework of the Test Equipment (TE) and the Electrical Ground Support Equipment (EGSE). This software is required to support all the assembly, integration and verification (AIV) activities to be carried out for the AGILE mission, from data handling unit level to payload integrated level, calibration campaign, launch campaign and in-orbit commissioning. These software tools have enabled us to test the engineering performance and to perform a health check of the Payload during the various phases. We have used an incremental development approach and a common framework to rapidly adapt our software to the different requirements of the various phases.

  2. Application of side-oblique image-motion blur correction to Kuaizhou-1 agile optical images.

    PubMed

    Sun, Tao; Long, Hui; Liu, Bao-Cheng; Li, Ying

    2016-03-21

    Given the recent development of agile optical satellites for rapid-response land observation, side-oblique image-motion (SOIM) detection and blur correction have become increasingly essential for improving the radiometric quality of side-oblique images. The Chinese small-scale agile mapping satellite Kuaizhou-1 (KZ-1) was developed by the Harbin Institute of Technology and launched for multiple emergency applications. Like other agile satellites, KZ-1 suffers from SOIM blur, particularly in captured images with large side-oblique angles. SOIM detection and blur correction are critical for improving the image radiometric accuracy. This study proposes a SOIM restoration method based on segmental point spread function detection. The segment region width is determined by satellite parameters such as speed, height, integration time, and side-oblique angle. The corresponding algorithms and a matrix form are proposed for SOIM blur correction. Radiometric objective evaluation indices are used to assess the restoration quality. Beijing regional images from KZ-1 are used as experimental data. The radiometric quality is found to increase greatly after SOIM correction. Thus, the proposed method effectively corrects image motion for KZ-1 agile optical satellites. PMID:27136855
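
    To make the segmental correction idea concrete, the sketch below applies a row-wise Wiener deconvolution with a linear motion-blur kernel whose length is derived from satellite parameters. The blur model, the parameter values (speed, integration time, ground sample distance, angle) and the Wiener filter itself are illustrative assumptions, not the published KZ-1 algorithm.

```python
# Minimal sketch of segment-wise motion-blur correction in the spirit of a
# segmental-PSF approach; all models and numbers here are assumptions.
import numpy as np

def motion_psf(length_px, width):
    """Horizontal linear-motion PSF of given length, embedded in a row."""
    psf = np.zeros(width)
    psf[:max(int(round(length_px)), 1)] = 1.0
    return psf / psf.sum()

def wiener_deblur_rows(segment, blur_len_px, k=0.01):
    """Deblur each image row with an FFT-based Wiener filter."""
    h = np.fft.fft(motion_psf(blur_len_px, segment.shape[1]))
    wiener = np.conj(h) / (np.abs(h) ** 2 + k)   # regularized inverse filter
    return np.real(np.fft.ifft(np.fft.fft(segment, axis=1) * wiener, axis=1))

# Hypothetical satellite parameters determining the per-segment blur length:
ground_speed = 7000.0      # m/s (assumed)
integration_time = 5e-4    # s   (assumed)
gsd = 2.0                  # m/pixel ground sample distance (assumed)
side_oblique_angle = np.deg2rad(30.0)

# Cross-track image motion grows with the side-oblique angle (simplified model)
blur_len = ground_speed * integration_time * np.tan(side_oblique_angle) / gsd

image = np.random.rand(512, 512)           # stand-in for one image segment
restored = wiener_deblur_rows(image, blur_len)
```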

  3. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    NASA Astrophysics Data System (ADS)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility used to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done using anthropomorphic physical phantoms representing the human body. Because constructing representative physical phantoms is challenging, virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations, which in turn helps to study the major source of uncertainty associated with the in vivo measurement routine: the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps to optimize the counting measurement. Open-source packages such as MakeHuman and Blender have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, in-house software was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create sets of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium.
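
    The voxel-to-MCNPX conversion step can be illustrated with a small sketch that turns a 3D voxel array into a simplified lattice-style fill block. This is not the authors' in-house converter, and the deck fragment it emits is a schematic of the format rather than a complete, runnable MCNPX input.

```python
# Illustrative sketch of a voxel-to-MCNPX conversion step: turn a binary
# (or small-integer) 3D voxel grid into a lattice fill array fragment.
# The emitted format is simplified, not a complete MCNPX deck.
import numpy as np

def voxel_grid_to_lattice_fill(grid):
    """Flatten a 3D voxel array into an MCNPX-style FILL entry string."""
    nz, ny, nx = grid.shape
    lines = [f"c  lattice fill: {nx} x {ny} x {nz} voxels"]
    lines.append(f"fill=0:{nx-1} 0:{ny-1} 0:{nz-1}")
    for z in range(nz):
        for y in range(ny):
            # one row of universe numbers per line, x varying fastest
            row = " ".join(str(int(v)) for v in grid[z, y, :])
            lines.append("      " + row)
    return "\n".join(lines)

# Toy phantom: a 4x4x4 grid with a 2x2x2 'tissue' cube (universe 1) in a void
phantom = np.zeros((4, 4, 4), dtype=int)
phantom[1:3, 1:3, 1:3] = 1
print(voxel_grid_to_lattice_fill(phantom))
```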

  4. Information Models, Data Requirements, and Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next-generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements (statements about what is necessary for the system) are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during system development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted, with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  5. Wideband Agile Digital Microwave Radiometer

    NASA Technical Reports Server (NTRS)

    Gaier, Todd C.; Brown, Shannon T.; Ruf, Christopher; Gross, Steven

    2012-01-01

    The objectives of this work were to take the initial steps needed to develop a field-programmable gate array (FPGA)-based wideband digital radiometer back end (>500 MHz bandwidth) that will enable passive microwave observations with minimal performance degradation in a radio-frequency-interference (RFI)-rich environment. As manmade RF emissions increase over time and fill more of the microwave spectrum, microwave radiometer science applications will be increasingly negatively impacted, and the current generation of spaceborne microwave radiometers that use broadband analog back ends will become severely compromised or unusable over an increasing fraction of time on orbit. There is a need to develop a digital radiometer back end that, for each observation period, uses digital signal processing (DSP) algorithms to identify the maximum amount of RFI-free spectrum across the radiometer band, preserving bandwidth to minimize radiometer noise (which is inversely related to the bandwidth). Ultimately, the objective is to incorporate all necessary processing in the back end so that contaminated input spectra yield a single output value free of manmade signals, minimizing data rates for spaceborne radiometer missions. To meet these objectives, several intermediate processing algorithms had to be developed and their performance characterized relative to typical brightness temperature accuracy requirements for current and future microwave radiometer missions, including those for measuring salinity, soil moisture, and snow pack.
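
    One of the intermediate DSP steps described above, flagging RFI-contaminated sub-channels and averaging only the clean ones, can be sketched as follows. The robust median/MAD threshold and the synthetic spectrum are assumptions for illustration, not the flight algorithm.

```python
# Illustrative sketch (not the flight DSP chain): flag RFI-contaminated
# sub-channels with a robust threshold, then average only the clean ones so
# that usable bandwidth -- and hence radiometric sensitivity -- is preserved.
import numpy as np

def rfi_free_average(subchannel_power, n_sigma=4.0):
    """Return mean power over sub-channels not flagged as RFI, plus the mask."""
    med = np.median(subchannel_power)
    # robust spread estimate via the median absolute deviation
    mad = 1.4826 * np.median(np.abs(subchannel_power - med))
    clean = subchannel_power < med + n_sigma * mad
    return subchannel_power[clean].mean(), clean

rng = np.random.default_rng(1)
power = rng.normal(300.0, 1.0, 512)      # thermal noise across 512 sub-channels
power[100:104] += 50.0                   # injected narrowband RFI

mean_power, clean_mask = rfi_free_average(power)
print(f"{clean_mask.sum()} of 512 sub-channels kept; mean power {mean_power:.2f}")
```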

  6. Proposed Methodology for Developing a National Strategy for Human Resource Development: Lessons Learned from a NNSA Workshop

    SciTech Connect

    Elkhamri, Oksana O.; Frazar, Sarah L.; Essner, Jonathan; Vergino, Eileen; Bissani, Mo; Apt, Kenneth E.; McClelland-Kerr, John; Mininni, Margot; VanSickle, Matthew; Kovacic, Donald

    2009-10-07

    This paper describes a recent National Nuclear Security Administration (NNSA) workshop on Human Resource Development, which focused on a potential methodology for developing a national human resource strategy for nuclear power in emerging nuclear states. The need for indigenous human resource development (HRD) has been singled out as a key milestone by the International Atomic Energy Agency (IAEA) in its 2007 Milestones document. A number of countries considering nuclear energy have reiterated this need for experts and specialists to support a national nuclear program that is sustainable and secure. Many have expressed concern over how best to assure the long-term availability of crucial human resources, how to approach the workforce planning process, and how to determine the key elements of developing a national strategy.

  7. Development of a methodology for hydrologic characterization of faults for geological repository siting

    NASA Astrophysics Data System (ADS)

    Goto, J.; Yoshimura, K.; Moriya, T.; Tsuchi, H.; Karasaki, K.; Onishi, C. T.; Ueta, K.; Kiho, K.

    2011-12-01

    The Nuclear Waste Management Organization of Japan (NUMO) will select a site for an HLW and TRU waste repository through a three-stage program comprising the Literature Surveys, the Preliminary Investigations and the Detailed Investigations. Areas that are susceptible to natural hazards such as volcanism, faulting and significant uplift/erosion will be eliminated first. Then, sites that have a more favorable geological environment will be selected with respect to the repository design and long-term safety after closure. It is internationally acknowledged that the hydrologic features of faults are of special concern in the above respects. It is highly likely, based on site characterization experience worldwide, that one could encounter numerous faults in the roughly one hundred square kilometers assumed for the Preliminary Investigations. Efficient and practical investigation programs, and reliable models/parameters for the repository design and safety analysis, are important aspects for implementers. A comprehensive methodology, including strategies and procedures for characterizing such faults, should thus be prepared prior to the actual investigations. Surveys of site characterization results worldwide indicate the potential contribution of geological features of faults, such as host lithology, geometry, slip direction, internal structure and alteration, to fault hydrology. Therefore, NUMO, in collaboration with Lawrence Berkeley National Laboratory (LBNL), started a 5-year project in 2007 involving field investigations to develop a comprehensive methodology for hydrologic characterization of faults, with emphasis on the relationship between the geological and hydrologic features of faults. A series of field investigations including ground geophysics, geological mapping, trench surveys, borehole investigations, hydrochemical analyses and hydrologic monitoring have been carried out on the Wildcat Fault that runs along the Berkeley Hills, California (see Karasaki, et

  8. Recent developments in atomic/nuclear methodologies used for the study of cultural heritage objects

    SciTech Connect

    Appoloni, Carlos Roberto

    2013-05-06

    Archaeometry is an area established in the international community since the 1960s, with extensive use of atomic-nuclear methods in the characterization of art, archaeological and cultural heritage objects in general. In Brazil, however, until the early 1990s, the only area employing methods of physics was archaeological dating. It was only after this period that Brazilian groups became involved in the characterization of archaeological and art objects with these methodologies. The Laboratory of Applied Nuclear Physics, State University of Londrina (LFNA/UEL) pioneered the field in 1994, introducing archaeometry and related issues among its priority lines of research, after a member of LFNA became involved in 1992 with the possibilities of tomography in archaeometry, as well as the analysis of ancient bronzes by EDXRF. Since then, LFNA has been working with PXRF and portable Raman in several museums in Brazil, in field studies of cave paintings and in the laboratory with material sent by archaeologists, as well as carrying out collaborative work with new groups that followed in this area. From 2003/2004, LAMFI/DFN/IFUSP and LIN/COPPE/UFRJ began to engage in the area, respectively with methodologies using ion beams and PXRF, over time incorporating other techniques, followed later by other groups. Due to the growing number of laboratories and institutions/archaeologists/conservators interested in these applications, in May 2012 a network of available laboratories was created, based at http://www.dfn.if.usp.br/lapac. A panel of recent developments and applications of these methodologies by national groups will be presented, as well as a sampling of what has been done by leading groups abroad.

  9. Recent developments in atomic/nuclear methodologies used for the study of cultural heritage objects

    NASA Astrophysics Data System (ADS)

    Appoloni, Carlos Roberto

    2013-05-01

    Archaeometry is an area established in the international community since the 1960s, with extensive use of atomic-nuclear methods in the characterization of art, archaeological and cultural heritage objects in general. In Brazil, however, until the early 1990s, the only area employing methods of physics was archaeological dating. It was only after this period that Brazilian groups became involved in the characterization of archaeological and art objects with these methodologies. The Laboratory of Applied Nuclear Physics, State University of Londrina (LFNA/UEL) pioneered the field in 1994, introducing archaeometry and related issues among its priority lines of research, after a member of LFNA became involved in 1992 with the possibilities of tomography in archaeometry, as well as the analysis of ancient bronzes by EDXRF. Since then, LFNA has been working with PXRF and portable Raman in several museums in Brazil, in field studies of cave paintings and in the laboratory with material sent by archaeologists, as well as carrying out collaborative work with new groups that followed in this area. From 2003/2004, LAMFI/DFN/IFUSP and LIN/COPPE/UFRJ began to engage in the area, respectively with methodologies using ion beams and PXRF, over time incorporating other techniques, followed later by other groups. Due to the growing number of laboratories and institutions/archaeologists/conservators interested in these applications, in May 2012 a network of available laboratories was created, based at http://www.dfn.if.usp.br/lapac. A panel of recent developments and applications of these methodologies by national groups will be presented, as well as a sampling of what has been done by leading groups abroad.

  10. [Development of an optimized formulation of damask marmalade with low energy level using Taguchi methodology].

    PubMed

    Villarroel, Mario; Castro, Ruth; Junod, Julio

    2003-06-01

    The goal of this study was the development of an optimized formula for damask marmalade low in calories, applying Taguchi methodology to improve the quality of this product. The selection of this methodology rests on the fact that in real-life conditions the result of an experiment frequently depends on the influence of several variables, and one expedient way to address this problem is to use factorial designs. The influence of acid, thickener, sweetener and aroma additives, as well as time of cooking, and possible interactions among some of them, were studied in order to find the best combination of these factors to optimize the sensory quality of an experimental formulation of dietetic damask marmalade. An L8 (2^7) orthogonal array was applied, and level average analysis was carried out according to Taguchi methodology to determine suitable working levels of the previously chosen design factors and achieve the desired product quality. A trained sensory panel analyzed the marmalade samples using a composite scoring test with a descriptive quantitative scale ranging from 1 = Bad to 5 = Good. It was demonstrated that the design factors sugar/aspartame, pectin and damask aroma had a significant effect (p < 0.05) on the sensory quality of the marmalade, with an 82% contribution to the response. The optimal combination turned out to be: citric acid 0.2%; pectin 1%; 30 g sugar/16 mg aspartame/100 g; damask aroma 0.5 ml/100 g; time of cooking 5 minutes. Regarding chemical composition, the most important results were a decrease in carbohydrate content compared with traditional marmalade, a 56% reduction in caloric value, and an amount of dietary fiber greater than in similar commercial products. Storage stability assays were carried out on marmalade samples kept at different temperatures in plastic bags of different density. No perceptible sensory, microbiological or chemical changes
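
    The level-average analysis step of the Taguchi workflow can be illustrated generically with the canonical L8 (2^7) array. The response values below are made up for the demonstration and are not the study's sensory scores.

```python
# Level-average analysis for an L8(2^7) design, as a generic illustration of
# the Taguchi workflow described above; the responses are hypothetical.
import numpy as np

# Canonical L8 orthogonal array: 8 runs x 7 two-level columns (levels 1, 2)
L8 = np.array([
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2],
    [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2],
    [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1],
    [2, 2, 1, 2, 1, 1, 2],
])

response = np.array([3.2, 3.8, 4.1, 3.5, 4.4, 3.9, 3.0, 4.2])  # made-up scores

# Level averages: mean response at level 1 vs level 2 for each column/factor.
# The level with the higher mean is the preferred setting, and a large gap
# indicates a factor with a strong effect on the measured quality.
for col in range(L8.shape[1]):
    m1 = response[L8[:, col] == 1].mean()
    m2 = response[L8[:, col] == 2].mean()
    print(f"factor {col + 1}: level-1 mean {m1:.2f}, level-2 mean {m2:.2f}, "
          f"effect {abs(m1 - m2):.2f}")
```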

  11. The Interplay of Invention, Discovery, Development and Application in Organic Synthetic Methodology: A Case Study

    PubMed Central

    Denmark, Scott E.

    2009-01-01

    This Perspective chronicles the conceptual development, proof of principle, exploration of scope, mechanistic investigations and applications in natural product total synthesis of palladium-catalyzed cross-coupling reactions of silicon derivatives. The explication of how this new class of cross-coupling reactions evolved from problem formulation to use in complex molecule synthesis serves as one goal of the essay. The other goal is the presentation of the various stages of this methodological enterprise such that the reader gleans a first hand look at one approach to the creation of new synthetic reactions. These two goals are woven together such that the underlying thought processes that guide a program of reaction development emerge in clear view and imbue the chemical tapestry with a cohesive logic. PMID:19278233

  12. Overview of multiple testing methodology and recent development in clinical trials.

    PubMed

    Wang, Deli; Li, Yihan; Wang, Xin; Liu, Xuan; Fu, Bo; Lin, Yunzhi; Larsen, Lois; Offen, Walter

    2015-11-01

    Multiplicity control is an important statistical issue in clinical trials where strong control of the type I error rate is required. Many multiple testing methods have been proposed and applied to address multiplicity issues in clinical trials. This paper provides an application-oriented and comprehensive overview of commonly used multiple testing procedures and recent developments in statistical methodology for multiple testing in clinical trials. Commonly used multiple testing procedures are applied to test non-hierarchical hypotheses, and gatekeeping procedures can be used to test hierarchically ordered hypotheses while controlling the overall type I error rate. The recently developed graphical approach has the flexibility to integrate hierarchical and non-hierarchical procedures into one framework. A graphical multiple testing procedure with "no-dead-end" provides an opportunity to fully recycle α across hypothesis families. Two hypothetical clinical trial examples are used to illustrate applications of these procedures. The advantages and disadvantages of the different procedures are briefly discussed.
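
    The α-recycling logic of the graphical approach can be made concrete with a compact implementation in the style of Bretz et al.; the weights, transition matrix and p-values below are hypothetical.

```python
# Compact sketch of a weighted graphical multiple testing procedure
# (Bretz et al.-style alpha recycling); the graph and p-values are made up.
import numpy as np

def graphical_procedure(p, w, G, alpha=0.025):
    """Return the set of rejected hypothesis indices."""
    p = np.asarray(p, float)
    w = np.asarray(w, float).copy()
    G = np.asarray(G, float).copy()
    active, rejected = set(range(len(p))), set()
    while True:
        # find a hypothesis rejectable at its current local alpha level
        cand = [j for j in sorted(active) if p[j] <= w[j] * alpha]
        if not cand:
            return rejected
        j = cand[0]
        active.remove(j)
        rejected.add(j)
        # recycle the local alpha of j along the graph's edges
        w_new, G_new = w.copy(), np.zeros_like(G)
        for l in active:
            w_new[l] = w[l] + w[j] * G[j, l]
            for k in active:
                if l != k and G[l, j] * G[j, l] < 1:
                    G_new[l, k] = (G[l, k] + G[l, j] * G[j, k]) / (1 - G[l, j] * G[j, l])
        w, G = w_new, G_new

# Two primary (H1, H2) and two secondary (H3, H4) hypotheses, hypothetical graph:
w0 = [0.5, 0.5, 0.0, 0.0]
G0 = [[0, 0.5, 0.5, 0],
      [0.5, 0, 0, 0.5],
      [0, 1, 0, 0],
      [1, 0, 0, 0]]
print(graphical_procedure([0.01, 0.04, 0.03, 0.2], w0, G0))  # -> {0}
```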

  13. Development of residential-conservation-survey methodology for the US Air Force. Interim report. Task two

    SciTech Connect

    Abrams, D. W.; Hartman, T. L.; Lau, A. S.

    1981-11-13

    A US Air Force (USAF) Residential Energy Conservation Methodology was developed to compare USAF needs and available data against the procedures of the Residential Conservation Service (RCS) program as developed for general use by utility companies serving civilian customers. Attention was given to the data implications related to group housing, climatic data requirements, life-cycle cost analysis, energy-saving modifications beyond those covered by RCS, and methods for utilizing existing energy consumption data in approaching the USAF survey program. Detailed information and summaries are given on the five subtasks of the program. Energy conservation alternatives are listed, and the basic analysis techniques to be used in evaluating their thermal performance are described. (MCW)

  14. [Methodology for the development and update of practice guidelines: current state].

    PubMed

    Barrera-Cruz, Antonio; Viniegra-Osorio, Arturo; Valenzuela-Flores, Adriana Abigail; Torres-Arreola, Laura Pilar; Dávila-Torres, Javier

    2016-01-01

    The current scenario of health services in Mexico reveals as a priority the implementation of strategies that allow us to better respond to the needs and expectations of individuals and society as a whole, through the provision of efficient and effective alternatives for the prevention, diagnosis and treatment of diseases. In this context, clinical practice guidelines constitute a management element of the health care system whose objective is to establish a national benchmark for encouraging clinical and managerial decision making, based on recommendations drawn from the best available evidence, in order to contribute to the quality and effectiveness of health care. The purpose of this document is to show the methodology used for the development and updating of the clinical practice guidelines that the Instituto Mexicano del Seguro Social has developed, in line with the sectorial model, in order to serve the users of these guidelines.

  15. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    SciTech Connect

    Hugo, Jacques

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  16. Development of Hydrologic Characterization Methodology of Faults: Outline of the Project in Berkeley, California

    NASA Astrophysics Data System (ADS)

    Goto, J.; Miwa, T.; Tsuchi, H.; Karasaki, K.

    2009-12-01

    The Nuclear Waste Management Organization of Japan (NUMO), once municipalities volunteer, will start a three-stage program for selecting an HLW and TRU waste repository site. It is recognized from experience with various site characterization programs around the world that the hydrologic property of faults is one of the most important parameters in the early stage of such a program. Numerous faults of interest are expected to exist in an investigation area of several tens of square kilometers. It is, however, impossible to characterize all these faults within a limited time and budget. This raises problems for repository design and safety assessment, in that we may have to accept unrealistic or overly conservative results by using a single model or a single set of parameters for all the faults in the area. We therefore seek to develop an efficient and practical methodology for characterizing the hydrologic properties of faults. This project is a five-year program started in 2007, and comprises basic methodology development through literature study and its verification through field investigations. The literature study seeks to classify faults by correlating their geological features with hydraulic properties, to search for the most efficient technology for fault characterization, and to develop a work flow diagram. The field investigation starts with the selection of a site and fault(s), followed by analyses of existing site data, surface geophysics, geological mapping, trenching, water sampling, a series of borehole investigations and modeling/analyses. Based on the results of the field investigations, we plan to develop a systematic methodology for the hydrologic characterization of faults. A classification method that correlates combinations of geological features (rock type, fault displacement, fault type, position in a fault zone, fracture zone width, damage zone width) with the widths of high-permeability zones around a fault zone was proposed through a survey of available documents of the site

  17. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1994-01-01

    Thermal stress analyses are an important aspect of the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures, which consequently necessitates accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy follows a building-block approach: first focus on the development of property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of more and more complex systems, such as anisotropic multi-component systems. The estimation methodology is a statistically based method that incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the ASEE summer program. One important aspect involved the calibration of the estimation procedure for the estimation of thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and the thermal properties (thermal conductivity and volumetric heat capacity) were then estimated at each temperature. Confidence regions for the estimated values were also determined, and the results were compared to documented values. Another set of experimental tests was conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent offset between the estimated values and the previously determined values was found. Another effort
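
    The flavor of this statistically based estimation can be conveyed with a small sketch that fits a thermal property to transient data by nonlinear least squares. The semi-infinite-solid model, the sensor depth and the synthetic measurements are all assumptions for the demonstration, not the NASA-LaRC procedure.

```python
# Illustrative parameter-estimation sketch: fit a thermal property from
# synthetic transient data by combining a mathematical model with
# 'measurements'. Model and numbers are assumptions for the demo.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

x = 0.01                 # sensor depth, m (assumed)
T0, Ts = 20.0, 100.0     # initial and surface temperatures, deg C (assumed)

def model(t, alpha):
    """Semi-infinite solid with a step change in surface temperature."""
    return Ts + (T0 - Ts) * erf(x / (2.0 * np.sqrt(alpha * t)))

# Synthetic 'experiment': true diffusivity 5e-7 m^2/s plus measurement noise
t = np.linspace(1.0, 600.0, 120)
rng = np.random.default_rng(2)
T_meas = model(t, 5e-7) + rng.normal(0.0, 0.2, t.size)

alpha_hat, cov = curve_fit(model, t, T_meas, p0=[1e-7], bounds=(1e-9, 1e-5))
ci = 1.96 * np.sqrt(cov[0, 0])   # ~95% confidence half-width
print(f"estimated thermal diffusivity: {alpha_hat[0]:.3e} +/- {ci:.1e} m^2/s")
```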

  18. Agile Machining and Inspection Non-Nuclear Report (NNR) Project

    SciTech Connect

    Lazarus, Lloyd

    2009-02-19

    This report is a high-level summary of the eight major projects funded by the Agile Machining and Inspection Non-Nuclear Readiness (NNR) project (FY06.0422.3.04.R1). The largest project of the group is the Rapid Response project, whose six major sub-categories are summarized. This project focused on the operations of the machining departments that will comprise Special Applications Machining (SAM) in the Kansas City Responsive Infrastructure Manufacturing & Sourcing (KCRIMS) project. It was aimed at upgrading older machine tools, developing new inspection tools, eliminating Classified Removable Electronic Media (CREM) in the handling of classified Numerical Control (NC) programs by installing the CRONOS network, and developing methods to automatically load Coordinate-Measuring Machine (CMM) inspection data into bomb books and product score cards. Finally, the project personnel applied lean principles to the operations of some of the machine tool cells, and now have a model for continuing this activity.

  19. Evaluation of a proposed expert system development methodology: Two case studies

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1990-01-01

    Two expert system development projects were studied to evaluate a proposed Expert Systems Development Methodology (ESDM). The ESDM was developed to provide guidance to managers and technical personnel and serve as a standard in the development of expert systems. It was agreed that the proposed ESDM must be evaluated before it could be adopted; therefore a study was planned for its evaluation. This detailed study is now underway. Before the study began, however, two ongoing projects were selected for a retrospective evaluation. They were the Ranging Equipment Diagnostic Expert System (REDEX) and the Backup Control Mode Analysis and Utility System (BCAUS). Both projects were approximately 1 year into development. Interviews of project personnel were conducted, and the resulting data was used to prepare the retrospective evaluation. Decision models of the two projects were constructed and used to evaluate the completeness and accuracy of key provisions of ESDM. A major conclusion reached from these case studies is that suitability and risk analysis should be required for all AI projects, large and small. Further, the objectives of each stage of development during a project should be selected to reduce the next largest area of risk or uncertainty on the project.

  20. Methodological aspects of current problems in target-based anticancer drug development.

    PubMed

    Yamanaka, Takeharu; Okamoto, Tatsuro; Ichinose, Yukito; Oda, Shinya; Maehara, Yoshihiko

    2006-06-01

    Unlike conventional antineoplastic agents, target-based drugs are designed a priori, based on knowledge of various physiological molecules obtained through the development of molecular biology. This "Copernican revolution" in drug development may imply a paradigm shift in the field. However, contrary to initial expectations, many drugs developed by this approach now face difficulties, mainly because of the fundamental and theoretical limits of the approach. Not all physiological functions are known for every target molecule, and in low-molecular-weight drugs, i.e., "inhibitors," targets disperse due to structural similarities among physiological molecules. This double-faced "out-of-focusing" causes many problems in the various steps of drug development: drug design, clinical trials, and administration to patients. Many drugs are being abandoned because of unexpectedly low response rates or unforeseeable adverse effects, and the variety of the drugs exhibits a kaleidoscopic appearance. The double-faced "out-of-focusing" derives from the methodological limits of molecular biology, i.e., elementalism, and from the limits of our techniques for drug development. To overcome these currently inevitable limits, it appears essential to elucidate the specific changes in target molecules that chiefly promote tumor growth and, consequently, strongly predict response to the administered drugs. Precise and efficient detection of responder populations is the key to the development and establishment of target-based anticancer therapies. PMID:16850122

  1. Recent Developments in General Methodologies for the Synthesis of α-Ketoamides.

    PubMed

    De Risi, Carmela; Pollini, Gian Piero; Zanirato, Vinicio

    2016-03-01

    The α-ketoamide motif is widely found in many natural products and drug candidates with relevant biological activities. Furthermore, α-ketoamides are attractive candidates to synthetic chemists due to the ability of the motif to access a wide range of functional group transformations, including multiple bond-forming processes. For these reasons, a vast array of synthetic procedures for the preparation of α-ketoamides have been developed over the past decades, and the search for expeditious and efficient protocols continues unabated. The aim of this review is to give an overview of the diverse methodologies that have emerged since the 1990s up to the present. The different synthetic routes have been grouped according to the way the α-ketoamide moiety has been created. Thus, syntheses of α-ketoamides proceeding via C(2)-oxidation of amide starting compounds are detailed, as are amidation approaches installing the α-ketoamide residue through C(1)-N bond formation. Also discussed are the methodologies centered on C(1)-C(2) σ-bond construction and C(2)-R/Ar bond-forming processes. Finally, the literature regarding the synthesis of α-ketoamide compounds by palladium-catalyzed double-carbonylative amination reactions is discussed. PMID:26881454

  3. Development of an evaluation methodology for triple bottom line reports using international standards on reporting.

    PubMed

    Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis

    2009-08-01

    The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined, and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented, based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria, with a generic scoring device running from 0 to 4 points. Secondly, the proposed benchmark tool was applied to the TBL reports published by Greek companies. The results reveal major gaps in reporting practices, stressing the need for further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool, in conjunction with the Greek case study, is discussed, and recommendations for future research in this relatively new form of reporting are suggested.

  4. Development of an Evaluation Methodology for Triple Bottom Line Reports Using International Standards on Reporting

    NASA Astrophysics Data System (ADS)

    Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis

    2009-08-01

    The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined, and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented, based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria, with a generic scoring device running from 0 to 4 points. Secondly, the proposed benchmark tool was applied to the TBL reports published by Greek companies. The results reveal major gaps in reporting practices, stressing the need for further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool, in conjunction with the Greek case study, is discussed, and recommendations for future research in this relatively new form of reporting are suggested.
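
    The scoring idea shared by both versions of this record can be sketched as a toy calculation: each GRI-derived criterion receives a 0-4 score, and category and overall means summarize the report. The criterion names and scores below are hypothetical.

```python
# Toy version of the report-scoring idea: GRI-derived criteria scored on the
# 0-4 scale described above; criterion names and scores are hypothetical.
from statistics import mean

report_scores = {
    "economic":      {"EC1 direct economic value": 3, "EC3 pension coverage": 1},
    "environmental": {"EN3 energy consumption": 4, "EN8 water withdrawal": 0},
    "social":        {"LA1 workforce profile": 2, "HR4 non-discrimination": 1},
}

# Per-category means highlight where a report's disclosure is weakest
for category, criteria in report_scores.items():
    print(f"{category:14s} mean score: {mean(criteria.values()):.2f} / 4")

overall = mean(s for crit in report_scores.values() for s in crit.values())
print(f"overall report score: {overall:.2f} / 4")
```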

  5. Development, testing and implementation of an emergency services methodology in Alberta.

    PubMed

    Eliasoph, H; Ashdown, C

    1995-01-01

    Alberta was the first province in Canada to mandate reporting of hospital-based emergency services. This reporting is based on a workload measurement system that groups emergency visits into five discrete workload levels/classes driven by ICD-9-CM diagnoses. Other related workload measurement variables are incorporated, including admissions, transfers, maintenance monitoring, nursing and non-nursing patient support activities, trips, staff replacement, and personal fatigue and delay. The methodology used to design the reporting system has been subjected to extensive testing, auditing and refinement. One year of province-wide data collection yielded approximately 1.5 million emergency visits. These data reveal consistent patterns and trends of workload that vary by hospital size and type. Although this information can assist in utilization management efforts to predict and compare workload and staffing levels, the impetus for establishing this system derived from its potential for funding hospital-based emergency services. This would be the first time that such services would be funded on a systematic, system-wide basis, whereby hospitals would be reimbursed in relation to workload. This proposed funding system would distribute available funding in a consistent, fair and equitable manner across all hospitals providing a similar set of services, thus achieving one of the key goals of the Alberta Acute Care Funding Plan. Ultimately, this proposed funding methodology would be integrated into a broader Ambulatory Care Funding system currently being developed in Alberta. PMID:10142620

  6. Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA

    SciTech Connect

    O'Kula, K.R.; Sharp, D.A.; Amos, C.N.; Wagner, K.C.; Bradley, D.R.

    1992-01-01

    A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: analyze the existing margins of safety provided by the heavy-water reactor (HWR) design when challenged by postulated severe accidents; compare measures of risk to the general public and onsite workers against guideline values, as well as against those posed by commercial reactor operation; and develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) performed the assessment despite two obstacles: a variable baseline plant configuration and power level, and a lack of technically applicable code methodology for modeling SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained.

  7. Development of Methodology to Assess the Failure Behaviour of Bamboo Single Fibre by Acoustic Emission Technique

    NASA Astrophysics Data System (ADS)

    Alam, Md. Saiful; Gulshan, Fahmida; Ahsan, Qumrul; Wevers, Martine; Pfeiffer, Helge; van Vuure, Aart-Willem; Osorio, Lina; Verpoest, Ignaas

    2016-06-01

    Acoustic emission (AE) was used as a tool for detecting, evaluating and better understanding the damage mechanisms and failure behavior of composites during mechanical loading. A methodology was developed for tensile testing of natural fibres (bamboo single fibres). A series of experiments was performed, and load drops (one or two) were observed in the load-versus-time graphs. From the observed AE parameters, such as amplitude, energy and duration, significant information corresponding to the load drops was found. These AE signals at the load drops originated from failures such as debonding between two elementary fibres or at the joint of elementary fibres at an edge. The load at the first load drop was not consistent across samples (for one particular sample the value was 8 N, corresponding to a stress of 517.51 MPa). The final breaking of the fibre corresponded to the saturation amplitude of the preamplifier (99.9 dB) for all samples; it was therefore not possible to determine the exact AE energy value for final breaking. The same methodology was used for tensile tests of three single fibres, which gave a clear indication of a load drop before the final breaking of the first and second fibres.
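
    One step of the protocol, locating load drops in the load-time record so they can be matched against AE hits, can be sketched as follows; the synthetic signal and threshold are illustrative assumptions.

```python
# Minimal sketch: locate load drops in a load-time record so they can be
# matched with acoustic emission events. Signal and threshold are synthetic.
import numpy as np

def find_load_drops(load, min_drop=0.5):
    """Indices where the load falls by more than `min_drop` between samples."""
    d = np.diff(load)
    return np.where(d < -min_drop)[0] + 1

t = np.linspace(0.0, 10.0, 1001)
load = 0.9 * t                      # quasi-linear loading ramp (synthetic)
load[600:] -= 1.2                   # sudden partial failure: a load drop
load[900:] = 0.0                    # final fibre breakage

drops = find_load_drops(load, min_drop=0.5)
print("load drops at t =", np.round(t[drops], 2))
# Each detected drop time can then be compared with AE hits (amplitude,
# energy, duration) recorded on the same time base.
```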

  8. Methodological development for selection of significant predictors explaining fatal road accidents.

    PubMed

    Dadashova, Bahar; Arenas-Ramírez, Blanca; Mira-McWilliams, José; Aparicio-Izquierdo, Francisco

    2016-05-01

    Identification of the most relevant factors explaining road accident occurrence is an important issue in road safety research, particularly for future decision-making processes in transport policy. However, model selection for this particular purpose is still ongoing research. In this paper we propose a methodological development for model selection that addresses both explanatory variable selection and adequate model selection. A variable selection procedure, the TIM (two-input model) method, is carried out by combining neural network design and statistical approaches. The error structure of the fitted model is assumed to follow an autoregressive process. All models are estimated using the Markov chain Monte Carlo method, where the model parameters are assigned non-informative prior distributions. The final model is built using the results of the variable selection. The proposed methodology was applied to the number of fatal accidents in Spain during 2000-2011. This indicator experienced the largest reduction internationally during those years, making it an interesting time series from a road safety policy perspective; hence, the identification of the variables that have affected this reduction is of particular interest for future decision making. The results of the variable selection process show that the selected variables are the main subjects of road safety policy measures. PMID:26928290
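
    A simplified stand-in for the two-input model (TIM) idea is sketched below: every pair of candidate predictors is scored by how well a small model fits the series, and the variables appearing in the best pairs are retained. Plain linear least squares replaces the paper's neural networks, and all data and variable names are synthetic.

```python
# Simplified stand-in for two-input-model (TIM) variable selection: rank
# every predictor pair by fit quality. Linear least squares replaces the
# paper's neural networks; all data and names here are synthetic.
import itertools
import numpy as np

rng = np.random.default_rng(3)
n = 144
names = ["enforcement", "fleet_age", "fuel_price", "exposure", "noise"]
X = rng.normal(size=(n, len(names)))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=n)  # synthetic

scores = []
for i, j in itertools.combinations(range(len(names)), 2):
    A = np.column_stack([np.ones(n), X[:, i], X[:, j]])      # two-input model
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ coef) ** 2)                         # residual error
    scores.append((rss, names[i], names[j]))

# Variables that appear in the best-fitting pairs are flagged as relevant
for rss, a, b in sorted(scores)[:3]:
    print(f"({a}, {b}): RSS = {rss:.1f}")
```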

  9. Development of a 3D numerical methodology for fast prediction of gun blast induced loading

    NASA Astrophysics Data System (ADS)

    Costa, E.; Lagasco, F.

    2014-05-01

    In this paper, the development of a methodology based on semi-empirical models from the literature for 3D prediction of the pressure loading on surfaces adjacent to a weapon system during firing is presented. This loading results from the impact of the blast wave generated by the projectile exiting the muzzle bore. When it exceeds a threshold pressure level, the loading is potentially capable of inducing unwanted damage to nearby hard structures as well as to frangible panels or electronic equipment. The implemented model can quickly predict the distribution of blast wave parameters over three-dimensional complex geometry surfaces when the weapon design and emplacement data, as well as the propellant and projectile characteristics, are available. Given these capabilities, the proposed methodology is envisaged for use in the preliminary design phase of a combat system to predict adverse effects and identify the most appropriate countermeasures. By providing a preliminary but sensitive estimate of the operational environmental loading, this numerical tool represents a good alternative to more powerful but time-consuming advanced computational fluid dynamics tools, whose use can thus be limited to the final phase of the design.
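
    As a flavor of the semi-empirical models the methodology builds on, the sketch below evaluates peak incident overpressure from the scaled distance Z = R/W^(1/3) using a Mills-type curve fit. This generic free-field formula is a stand-in only; the paper's gun-blast model additionally accounts for muzzle geometry and directionality, and the charge and standoff values are assumptions.

```python
# Hedged sketch of the semi-empirical style of model referred to above:
# peak incident overpressure from scaled distance Z = R / W**(1/3), using a
# Mills-type curve fit as a generic stand-in for the paper's gun-blast model.
def peak_overpressure_kpa(standoff_m, charge_kg_tnt):
    """Mills-type approximation of peak incident overpressure, in kPa."""
    z = standoff_m / charge_kg_tnt ** (1.0 / 3.0)   # scaled distance, m/kg^(1/3)
    return 1772.0 / z**3 - 114.0 / z**2 + 108.0 / z

# Scan a hypothetical surface at several standoffs from the muzzle:
for r in (2.0, 5.0, 10.0, 20.0):
    p = peak_overpressure_kpa(r, charge_kg_tnt=0.5)
    print(f"R = {r:5.1f} m -> peak overpressure ~ {p:7.1f} kPa")
```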

  10. Development of a Lagrangian-Lagrangian methodology to predict brownout dust clouds

    NASA Astrophysics Data System (ADS)

    Syal, Monica

    A Lagrangian-Lagrangian dust cloud simulation methodology has been developed to help better understand the complicated two-phase nature of the rotorcraft brownout problem. Brownout conditions occur when rotorcraft land or take off from ground surfaces covered with loose sediment such as sand and dust, which decreases the pilot's visibility of the ground and poses a serious safety-of-flight risk. The present work involved the development of a comprehensive, computationally efficient three-dimensional sediment tracking method for dilute, low Reynolds number Stokes-type flows. The flow field generated by a helicopter rotor in ground-effect operation over a mobile sediment bed was modeled by using an inviscid, incompressible, Lagrangian free-vortex method, coupled to a viscous semi-empirical approximation for the boundary layer flow near the ground. A new threshold model for the onset of sediment mobility was developed by including the effects of the unsteady pressure forces that are induced in vortically dominated rotor flows, which can significantly alter the threshold conditions for particle motion. Other important aspects of particle mobility and uplift in such vortically driven dust flows were also modeled, including bombardment effects when previously suspended particles impact the bed and eject new particles. Bombardment effects were shown to be a particularly significant contributor to the mobilization and eventual suspension of large quantities of smaller dust particles, which tend to remain suspended. A numerically efficient Lagrangian particle tracking methodology was developed in which individual particles or clusters of particles are tracked in the flow. To this end, a multi-step, second-order accurate time-marching scheme was developed to solve the numerically stiff equations that govern the dynamics of particle motion. The stability and accuracy of this scheme were examined and matched to the characteristics of the free-vortex method. One-way coupling of the
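
    The kind of second-order time-marching used for the particle tracking can be illustrated with a midpoint (RK2) step for a particle under Stokes drag in a prescribed flow. The flow field, particle response time and step size below are assumptions, not the model's actual wake solution.

```python
# Sketch of second-order Lagrangian particle tracking: an RK2 (midpoint)
# step for a particle under Stokes drag in a prescribed flow field.
# The flow field and constants are illustrative assumptions.
import numpy as np

def flow_velocity(x, t):
    """Hypothetical local air velocity (stand-in for the free-vortex wake)."""
    return np.array([1.0 - 0.1 * x[1], 0.2 * np.sin(x[0] + t)])

def rk2_step(x, v, t, dt, tau_p=5e-3):
    """Midpoint step for dx/dt = v, dv/dt = (u(x,t) - v) / tau_p.

    tau_p is the particle response time; small tau_p makes the equations
    stiff, which is why the step size must resolve it."""
    ax = (flow_velocity(x, t) - v) / tau_p
    xm, vm = x + 0.5 * dt * v, v + 0.5 * dt * ax
    am = (flow_velocity(xm, t + 0.5 * dt) - vm) / tau_p
    return x + dt * vm, v + dt * am

x, v, t, dt = np.array([0.0, 0.5]), np.zeros(2), 0.0, 1e-4
for _ in range(1000):                     # march 0.1 s of particle motion
    x, v = rk2_step(x, v, t, dt)
    t += dt
print("particle position after 0.1 s:", np.round(x, 4))
```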

  11. Supply chain network design problem for a new market opportunity in an agile manufacturing system

    NASA Astrophysics Data System (ADS)

    Babazadeh, Reza; Razmi, Jafar; Ghodsi, Reza

    2012-08-01

    The characteristics of today's competitive environment, such as the speed with which products are designed, manufactured, and distributed, and the need for higher responsiveness and lower operational cost, are forcing companies to search for innovative ways to do business. The concept of agile manufacturing has been proposed in response to these challenges. This paper addresses the strategic- and tactical-level decisions in agile supply chain network design. An efficient mixed-integer linear programming model is developed that is able to consider the key characteristics of an agile supply chain, such as direct shipments, outsourcing, different transportation modes, discounts, alliances (process and information integration) between opened facilities, and the maximum waiting time of customers for deliveries. In addition, in the proposed model the capacities of facilities are determined as decision variables, whereas they are often assumed to be fixed. Computational results illustrate that the proposed model can be applied as a powerful tool in agile supply chain network design, as well as in the integration of strategic decisions with tactical decisions.
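
    A deliberately small model in the spirit of the formulation described above is sketched below with PuLP: binary facility-opening decisions, continuous shipping flows and outsourcing at a penalty cost. All data are hypothetical, and the paper's richer features (transportation modes, discounts, alliances, waiting times) are omitted.

```python
# Tiny MILP in the spirit of the agile supply chain model described above:
# open facilities (binary), ship to customers (continuous), allow outsourcing
# at a penalty. All data are hypothetical; richer features are omitted.
import pulp

plants, customers = ["P1", "P2"], ["C1", "C2", "C3"]
fixed = {"P1": 800, "P2": 600}                      # facility opening cost
cap = {"P1": 120, "P2": 90}                         # plant capacity
demand = {"C1": 50, "C2": 60, "C3": 40}
ship = {("P1", "C1"): 4, ("P1", "C2"): 6, ("P1", "C3"): 9,
        ("P2", "C1"): 7, ("P2", "C2"): 3, ("P2", "C3"): 4}
outsource_cost = 20                                  # per unit, third party

m = pulp.LpProblem("agile_scnd", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", plants, cat="Binary")
x = pulp.LpVariable.dicts("flow", list(ship), lowBound=0)
o = pulp.LpVariable.dicts("outsrc", customers, lowBound=0)

# Objective: fixed opening + shipping + outsourcing costs
m += (pulp.lpSum(fixed[p] * y[p] for p in plants)
      + pulp.lpSum(ship[k] * x[k] for k in ship)
      + pulp.lpSum(outsource_cost * o[c] for c in customers))

for c in customers:   # demand met by plants plus outsourcing (direct shipment)
    m += pulp.lpSum(x[(p, c)] for p in plants) + o[c] == demand[c]
for p in plants:      # capacity only available if the facility is opened
    m += pulp.lpSum(x[(p, c)] for c in customers) <= cap[p] * y[p]

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("open:", {p: int(y[p].value()) for p in plants},
      "cost:", pulp.value(m.objective))
```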

  12. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice, the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  13. Hybrid methodology for situation assessment model development within an air operations center domain

    NASA Astrophysics Data System (ADS)

    Ho, Stephen; Gonsalves, Paul; Call, Catherine

    2007-04-01

    Within the dynamic environment of an Air Operations Center (AOC), effective decision-making is highly dependent on timely and accurate situation assessment. In previous research efforts the capabilities and potential of a Bayesian belief network (BN) model-based approach to support situation assessment have been demonstrated. In our own prior research, we have presented and formalized a hybrid process for situation assessment model development that seeks to ameliorate specific concerns and drawbacks associated with using a BN-based model construct. Specifically, our hybrid methodology addresses the significant knowledge acquisition requirements and the associated subjective nature of using subject matter experts (SMEs) for model development. Our methodology consists of two distinct functional elements: an off-line mechanism for rapid construction of a Bayesian belief network (BN) library of situation assessment models tailored to different situations and derived from knowledge elicitation with SMEs; and an on-line machine-learning-based mechanism to learn, tune, or adapt BN model parameters and structure. The adaptation supports the ability to adjust the models over time to respond to novel situations not initially available or anticipated during initial model construction, thus ensuring that the models continue to meet the dynamic requirements of performing the situation assessment function within dynamic application environments such as an AOC. In this paper, we apply and demonstrate the hybrid approach within the specific context of an AOC-based air campaign monitoring scenario. We detail both the initial knowledge elicitation and subsequent machine learning phases of the model development process, as well as demonstrate model performance within an operational context.
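
    A two-evidence-node toy network shows the basic BN inference that underlies such situation assessment models; the "threat" structure, probabilities and evidence are hypothetical, not drawn from the elicited AOC models.

```python
# Tiny Bayesian-network inference-by-enumeration example to make the
# BN-based situation assessment idea concrete. The two-node "threat" model,
# its probabilities, and the evidence are all hypothetical.
# Network: Threat -> RadarContact, Threat -> CommsChatter
p_threat = 0.1
p_radar_given = {True: 0.8, False: 0.15}     # P(radar contact | threat)
p_chatter_given = {True: 0.7, False: 0.30}   # P(comms chatter | threat)

def posterior_threat(radar: bool, chatter: bool) -> float:
    """P(threat | radar evidence, chatter evidence) by enumeration."""
    def joint(threat: bool) -> float:
        p = p_threat if threat else 1.0 - p_threat
        p *= p_radar_given[threat] if radar else 1.0 - p_radar_given[threat]
        p *= p_chatter_given[threat] if chatter else 1.0 - p_chatter_given[threat]
        return p
    num = joint(True)
    return num / (num + joint(False))

# Both evidence streams positive -> the assessed threat probability jumps:
print(f"P(threat | radar+, chatter+) = {posterior_threat(True, True):.3f}")
```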

  14. Development of the CPXSD Methodology for Generation of Fine-Group Libraries for Shielding Applications

    SciTech Connect

    Alpan, F. Arzu; Haghighat, Alireza

    2005-01-15

    Multigroup cross sections are one of the major sources of uncertainty in the results of deterministic transport calculations. It is therefore important to prepare effective cross-section libraries that use an appropriate group structure and are based on an appropriate weighting spectrum. Several multigroup cross-section libraries are available for particular applications. For example, the 47-neutron, 20-gamma group BUGLE library, derived from the 199-neutron, 42-gamma group VITAMIN-B6 library, is widely used for light water reactor (LWR) shielding and pressure vessel dosimetry applications. However, there is no publicly available methodology that can construct problem-dependent libraries. The authors have therefore developed the Contributon and Point-wise Cross Section Driven (CPXSD) methodology for constructing effective fine- and broad-group structures. In this paper, new fine-group structures were constructed using CPXSD, and new fine-group cross-section libraries were generated. The 450-group LIB450 and 589-group LIB589 libraries were developed for neutrons in the fast and thermal energy ranges, respectively, for LWR shielding problems. Compared to a VITAMIN-B6-like library, the new fine-group library developed for fast neutron dosimetry calculations agreed more closely with continuous-energy predictions. For example, for the fast neutron cavity dosimetry, an improvement of approximately 4% was observed in the ^237Np(n,f) reaction rate, and for the thermal neutron ^1H(n,γ) reaction, a maximum improvement of approximately 14% was observed in the reaction rate at the mid-downcomer position.
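
    The basic operation behind building broad-group libraries from fine-group data is flux-weighted group collapse, sketched below with synthetic cross sections and spectrum; CPXSD's actual contribution is in choosing the group boundaries, which the contiguous mapping here does not attempt.

```python
# Illustrative flux-weighted group-collapse step, the basic operation behind
# deriving broad-group libraries from fine-group data; the cross sections,
# weighting spectrum and group mapping are synthetic.
import numpy as np

rng = np.random.default_rng(4)
n_fine, n_broad = 450, 47
sigma_fine = rng.uniform(0.5, 5.0, n_fine)      # fine-group cross sections, barns
phi = rng.uniform(0.1, 1.0, n_fine)             # weighting flux spectrum

# Map each fine group to one broad group (contiguous blocks here; CPXSD
# instead selects boundaries from contributon/point-wise information)
broad_of = np.linspace(0, n_broad, n_fine, endpoint=False).astype(int)

sigma_broad = np.array([
    np.sum(sigma_fine[broad_of == g] * phi[broad_of == g])
    / np.sum(phi[broad_of == g])
    for g in range(n_broad)
])
print("first five broad-group cross sections:", np.round(sigma_broad[:5], 3))
```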

  15. Agile Bodies: A New Imperative in Neoliberal Governance

    ERIC Educational Resources Information Center

    Gillies, Donald

    2011-01-01

    Modern business discourse suggests that a key bulwark against market fluctuation and the threat of failure is for organizations to become "agile'", a more dynamic and proactive position than that previously afforded by mere "flexibility". The same idea is also directed at the personal level, it being argued that the "agile" individual is better…

  16. ASTATINE-211 RADIOCHEMISTRY: THE DEVELOPMENT OF METHODOLOGIES FOR HIGH ACTIVITY LEVEL RADIOSYNTHESIS

    SciTech Connect

    MICHAEL R. ZALUTSKY

    2012-08-08

    Targeted radionuclide therapy is emerging as a viable approach for cancer treatment because of its potential for delivering curative doses of radiation to malignant cell populations while sparing normal tissues. Alpha particles such as those emitted by 211At are particularly attractive for this purpose because of their short path length in tissue and high energy, making them highly effective in killing cancer cells. The current impact of targeted radiotherapy in the clinical domain remains limited despite the fact that, in many cases, potentially useful molecular targets and labeled compounds have already been identified. Unfortunately, putting these concepts into practice has been impeded by limitations in radiochemistry methodologies. A critical problem is that the synthesis of therapeutic radiopharmaceuticals poses additional challenges compared with diagnostic reagents because of the need to perform radiosynthesis at high levels of radioactivity. This is particularly important for α-particle emitters such as 211At because they deposit large amounts of energy in a highly focal manner. The overall objective of this project is to develop convenient and reproducible radiochemical methodologies for the radiohalogenation of molecules with the α-particle emitter 211At at the radioactivity levels needed for clinical studies. Our goal is to address two problems in astatine radiochemistry. First, a well-known characteristic of 211At chemistry is that yields for electrophilic astatination reactions decline as the time interval after radionuclide isolation from the cyclotron target increases; this is a critical problem that must be addressed if cyclotrons are to supply 211At efficiently to remote users. Second, when the preparation of high levels of 211At-labeled compounds is attempted, the radiochemical yields can be considerably lower than those encountered at tracer dose. For these reasons, clinical evaluation of promising 211At
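
    Independent of the chemistry effects described above, radioactive decay alone constrains how quickly 211At must be used after isolation. A small numerical illustration (starting activity and delay times assumed) follows.

```python
# Small numerical illustration of why post-isolation delays matter for 211At
# work: activity, and hence achievable labeling scale, falls with the ~7.2 h
# half-life. The starting activity and delay times are hypothetical.
import math

T_HALF_H = 7.2   # half-life of 211At, hours

def activity(a0_mbq: float, hours: float) -> float:
    """Remaining activity after `hours`, from A = A0 * exp(-ln2 * t / T1/2)."""
    return a0_mbq * math.exp(-math.log(2) * hours / T_HALF_H)

a0 = 4000.0      # MBq isolated from the cyclotron target (assumed)
for t in (0, 2, 4, 8, 12):
    print(f"t = {t:2d} h: {activity(a0, t):7.1f} MBq "
          f"({100 * activity(a0, t) / a0:5.1f}% remaining)")
```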

  17. Advanced software development workstation. Comparison of two object-oriented development methodologies

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    This report is an attempt to clarify some of the concerns raised about the OMT method, specifically that OMT is weaker than the Booch method in a few key areas. This interim report specifically addresses the following issues: (1) is OMT object-oriented or only data-driven?; (2) can OMT be used as a front-end to implementation in C++?; (3) the inheritance concept in OMT is in contradiction with the 'pure and real' inheritance concept found in object-oriented (OO) design; (4) low support for software life-cycle issues, for project and risk management; (5) uselessness of functional modeling for the ROSE project; and (6) problems with event-driven and simulation systems. The conclusion of this report is that both Booch's method and Rumbaugh's method are good OO methods, each with strengths and weaknesses in different areas of the development process.

  18. Utilization of research methodology in designing and developing an interdisciplinary course in ethics.

    PubMed

    Stone, Jennie Ann M; Haas, Barbara A; Harmer-Beem, Marji J; Baker, David L

    2004-02-01

    Development research methodology was utilized to design an interdisciplinary ethics course for students from seven disciplines: dental hygiene, nursing, nurse anesthesia, occupational therapy, physician assistant, physical therapy, and social work. Two research questions, 'What content areas should be considered for inclusion in an interdisciplinary course in Ethics?' and 'What design framework, format, or structure would best fit the content chosen?' guided the study. An interdisciplinary faculty design team conducted a comparative analysis of each of the seven disciplines' codes of ethics to find common topics of interest. Further analysis then grouped these topics into eight categories of professional responsibility. The result was a fifteen-week course with validated content relevant to all disciplines.

  19. Development of genetic transformation methodologies for an industrially-promising microalga: Scenedesmus almeriensis.

    PubMed

    Dautor, Yasmeen; Úbeda-Mínguez, Patricia; Chileh, Tarik; García-Maroto, Federico; Alonso, Diego López

    2014-12-01

    The development of the microalgal industry requires advances in every aspect of microalgal biotechnology. In this regard, the availability of genetic engineering tools for industrially-promising species is key. As Scenedesmus almeriensis has promise for industrial use, we describe here an Agrobacterium-based methodology that allows stable genetic transformation of this species for the first time, thus opening the way to its genetic manipulation. Transformation was accomplished using two different antibiotic resistance genes [hygromycin phosphotransferase (hpt) and Shble] and was confirmed by PCR amplification of both the hpt/Shble and GUS genes and by the β-glucuronidase activity of transformed cells. Nevertheless, the single 35S promoter seems unable to drive gene expression at a sufficient level in S. almeriensis, as suggested by the low GUS enzymatic activity. Temperature was critical for the transformation efficiency.

  20. Chapter 43: Assessment of NE Greenland: Prototype for development of Circum-Arctic Resource Appraisal methodology

    USGS Publications Warehouse

    Gautier, D.L.; Stemmerik, L.; Christiansen, F.G.; Sorensen, K.; Bidstrup, T.; Bojesen-Koefoed, J. A.; Bird, K.J.; Charpentier, R.R.; Houseknecht, D.W.; Klett, T.R.; Schenk, C.J.; Tennyson, M.E.

    2011-01-01

    Geological features of NE Greenland suggest large petroleum potential, as well as high uncertainty and risk. The area was the prototype for development of methodology used in the US Geological Survey (USGS) Circum-Arctic Resource Appraisal (CARA), and was the first area evaluated. In collaboration with the Geological Survey of Denmark and Greenland (GEUS), eight "assessment units" (AU) were defined, six of which were probabilistically assessed. The most prospective areas are offshore in the Danmarkshavn Basin. This study supersedes a previous USGS assessment, from which it differs in several important respects: oil estimates are reduced and natural gas estimates are increased to reflect revised understanding of offshore geology. Despite the reduced estimates, the CARA indicates that NE Greenland may be an important future petroleum province. © 2011 The Geological Society of London.

  1. Development of improved methodology for the comparative assessment of potential repository concepts and locations

    SciTech Connect

    Tsuchi, Hiroyuki; Koike, Akihisa; Sato, Shoko; Kawamura, Hideki

    2007-07-01

    NUMO has adopted a volunteering approach to siting a geological repository for high-level radioactive waste (HLW). It is important for this process that the pros and cons of volunteer sites can be assessed from literature data in a clear and transparent manner, prior to the very careful selection of those sites that will be carried forward for more detailed characterisation. For this purpose, a multi-attribute analysis (MAA) methodology has been developed that allows the technical assessment of criteria to be represented as scoring models. The more difficult task of weighting different criteria involves expert opinion, which can be solicited by different methods. In particular, weighting of top-level attributes involves balancing a range of technical and socio-economic issues, which can be examined by considering the viewpoints of different stakeholders. The applicability of the MAA tool and its sensitivity to stakeholder viewpoints have been examined by simple case studies. (authors)
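
    Multi-attribute analysis of this kind typically reduces to scoring each candidate against each criterion and aggregating with stakeholder-elicited weights. The fragment below is a minimal weighted-sum illustration in Python; the scores, weights, and the linear aggregation rule are assumptions for the example, not NUMO's actual scoring models.

        import numpy as np

        # Hypothetical scores: rows = candidate sites, columns = criteria (0-10)
        scores = np.array([[7.0, 5.5, 8.0],
                           [6.0, 8.0, 6.5],
                           [9.0, 4.0, 7.0]])

        # Criterion weights elicited from one stakeholder group; they sum to 1.
        # Re-running with another group's weights exposes the sensitivity to
        # stakeholder viewpoint discussed in the abstract.
        weights = np.array([0.5, 0.3, 0.2])

        utility = scores @ weights            # weighted-sum score per site
        ranking = np.argsort(utility)[::-1]   # best site first
        print(utility, ranking)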

  2. Distributed situation awareness in dynamic systems: theoretical development and application of an ergonomics methodology.

    PubMed

    Stanton, N A; Stewart, R; Harris, D; Houghton, R J; Baber, C; McMaster, R; Salmon, P; Hoyle, G; Walker, G; Young, M S; Linsell, M; Dymott, R; Green, D

    The purpose of this paper is to propose foundations for a theory of situation awareness based on the analysis of interactions between agents (i.e. both human and non-human) in subsystems. This approach may help to promote a better understanding of technology-mediated interaction in systems, as well as helping in the formulation of hypotheses and predictions concerning distributed situation awareness. It is proposed that agents within a system each hold their own situation awareness, which may be very different from (although compatible with) that of other agents. It is argued that we should not always hope for, or indeed want, sharing of this awareness, as different system agents have different purposes. This view marks situation awareness as a dynamic and collaborative process binding agents together on tasks on a moment-by-moment basis. Implications of this viewpoint for the development of a new theory of, and accompanying methodology for, distributed situation awareness are offered. PMID:17008257

  3. An Iterative Methodology for Developing National Recommendations for Nursing Informatics Curricula.

    PubMed

    Egbert, Nicole; Thye, Johannes; Schulte, Georg; Liebe, Jan-David; Hackl, Werner O; Ammenwerth, Elske; Hübner, Ursula

    2016-01-01

    The increasing importance of IT in nursing requires educational measures to support its meaningful application. However, many countries do not yet have national recommendations for nursing informatics competencies. We thus developed an iterative triple methodology to yield validated and country-specific recommendations for informatics core competencies in nursing. We identified relevant competencies from national sources (step 1), matched and enriched these with input from the international literature (step 2) and fed the resulting 24 core competencies into a survey (120 invited experts, of whom 87 responded) and two focus group sessions with a total of 48 experts (steps 3a/3b). The focus group sessions confirmed and expanded the survey findings. As a result, we were able to define role-specific informatics core competencies for three countries. PMID:27577467

  4. Methodology for Analyzing and Developing Information Management Infrastructure to Support Telerehabilitation

    PubMed Central

    Saptono, Andi; Schein, Richard M.; Parmanto, Bambang; Fairman, Andrea

    2009-01-01

    The proliferation of advanced technologies led researchers within the Rehabilitation Engineering Research Center on Telerehabilitation (RERC-TR) to devise an integrated infrastructure for clinical services using the University of Pittsburgh (PITT) model. This model describes five required characteristics for a telerehabilitation (TR) infrastructure: openness, extensibility, scalability, cost-effectiveness, and security. The infrastructure is to deliver clinical services over distance to improve access to health services for people living in underserved or remote areas. The methodological approach to design, develop, and employ this infrastructure is explained and detailed for the remote wheelchair prescription project, a research task within the RERC-TR. The availability of this specific clinical service and personnel outside of metropolitan areas is limited due to the lack of specialty expertise and access to resources. The infrastructure is used to deliver expertise in wheeled mobility and seating through teleconsultation to remote clinics, and has been successfully deployed to five rural clinics in Western Pennsylvania. PMID:25945161

  5. An initial meteoroid stream survey in the southern hemisphere using the Southern Argentina Agile Meteor Radar (SAAMER)

    NASA Astrophysics Data System (ADS)

    Janches, D.; Hormaechea, J. L.; Brunini, C.; Hocking, W.; Fritts, D. C.

    2013-04-01

    We present in this manuscript a 4-year survey of meteor shower radiants utilizing the Southern Argentina Agile Meteor Radar (SAAMER). SAAMER, which operates at the southernmost region of South America, is a new-generation SKiYMET system designed with significant differences from typical meteor radars, including high transmitted power and an 8-antenna transmitting array enabling large detected rates at low zenith angles. We applied the statistical methodology developed by Jones and Jones (Jones, J., Jones, W. [2006]. Mon. Not. R. Astron. Soc. 367, 1050-1056) to the data collected each day and compiled the results into 1 composite representative year at 1° resolution in Solar Longitude. We then search for enhancements in the activity which last for at least 3 days and evolve temporally as is expected from a meteor shower. Using this methodology, we have identified in our data 32 shower radiants, two of which were not part of the IAU commission 22 meteor shower working list. Recently, SAAMER's capabilities were enhanced by adding two remote stations to receive meteor forward scatter signals from meteor trails and thus enable the determination of meteoroid orbital parameters. SAAMER started recording orbits in January 2012 and future surveys will focus on the search for unknown meteor streams, in particular in the southern ecliptic sky.
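
    The enhancement search described here amounts to scanning a composite-year activity curve, binned at 1° of solar longitude (roughly one day), for runs of consecutive bins that stand significantly above the sporadic background for at least three days. A minimal sketch of that scan in Python is below; the threshold, the Poisson-like noise model, and all data values are assumptions for illustration, not the Jones and Jones statistics actually used.

        import numpy as np

        def find_shower_candidates(counts, background, threshold=3.0, min_run=3):
            """Return (start, end) bin ranges where radiant counts exceed the
            background by `threshold` sigma for at least `min_run` bins."""
            sigma = np.sqrt(np.maximum(background, 1.0))  # crude Poisson noise
            hot = (counts - background) / sigma > threshold
            runs, start = [], None
            for i, flag in enumerate(hot):
                if flag and start is None:
                    start = i
                elif not flag and start is not None:
                    if i - start >= min_run:
                        runs.append((start, i - 1))
                    start = None
            if start is not None and len(hot) - start >= min_run:
                runs.append((start, len(hot) - 1))
            return runs

        counts = np.array([10, 11, 30, 33, 31, 12, 10, 9])
        bg = np.full(8, 10.0)
        print(find_shower_candidates(counts, bg))  # [(2, 4)]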

  6. An Initial Meteoroid Stream Survey in the Southern Hemisphere Using the Southern Argentina Agile Meteor Radar (SAAMER)

    NASA Technical Reports Server (NTRS)

    Janches, D.; Hormaechea, J. L.; Brunini, C.; Hocking, W.; Fritts, D. C.

    2013-01-01

    We present in this manuscript a 4-year survey of meteor shower radiants utilizing the Southern Argentina Agile Meteor Radar (SAAMER). SAAMER, which operates at the southernmost region of South America, is a new-generation SKiYMET system designed with significant differences from typical meteor radars, including high transmitted power and an 8-antenna transmitting array enabling large detected rates at low zenith angles. We applied the statistical methodology developed by Jones and Jones (Jones, J., Jones, W. [2006]. Mon. Not. R. Astron. Soc. 367, 1050-1056) to the data collected each day and compiled the results into 1 composite representative year at 1° resolution in Solar Longitude. We then search for enhancements in the activity which last for at least 3 days and evolve temporally as is expected from a meteor shower. Using this methodology, we have identified in our data 32 shower radiants, two of which were not part of the IAU commission 22 meteor shower working list. Recently, SAAMER's capabilities were enhanced by adding two remote stations to receive meteor forward scatter signals from meteor trails and thus enable the determination of meteoroid orbital parameters. SAAMER started recording orbits in January 2012 and future surveys will focus on the search for unknown meteor streams, in particular in the southern ecliptic sky.

  7. Development and extraction optimization of baicalein and pinostrobin from Scutellaria violacea through response surface methodology

    PubMed Central

    Subramaniam, Shankar; Raju, Ravikumar; Palanisamy, Anbumathi; Sivasubramanian, Aravind

    2015-01-01

    Objective: To develop a process that involves optimization of the amount of baicalein and pinostrobin from the hydro-methanolic extract of the leaves of Scutellaria violacea by response surface methodology (RSM). Materials and Methods: The combinatorial influence of various extraction parameters on the extraction yield was investigated by adopting a Box–Behnken experimental design. Preliminary experiments carried out based on traditional one-variable-at-a-time optimization revealed four operational parameters that play a crucial role in determining the yield. These four process parameters at three levels were used to construct the Box–Behnken experimental design. Results: The RSM-based model fitted to the resulting experimental data suggested that 52.3% methanol/water, a 12.46:1 solvent-solid ratio, 285 rpm agitation and 6.07 h of extraction time are the optimal conditions, which maximized the yields of baicalein and pinostrobin at 2.9 and 4.05 mg/g DM, respectively. Analysis of variance revealed a high correlation coefficient (R2 = 0.999 for baicalein and 0.994 for pinostrobin), signifying a good fit between the second-order regression model and the experimental observations. Conclusion: The present study reports that both metabolites have been extracted from S. violacea for the first time. Further, this study developed an optimized extraction procedure to obtain maximum yield of the metabolites, which is more efficient than conventional extraction methodology. The operational parameters under optimized conditions account for the lowest cost in the extraction process, thus providing an efficient, rapid and cost-effective method for isolation and scale-up of these commercially vital flavonoids.
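
    Response surface methodology of this kind fits a second-order polynomial (main effects, squares, and interactions) to a designed set of experiments and then locates the optimum of the fitted surface. The Python sketch below shows the core of that workflow on an invented two-factor face-centred design; the paper's actual study used four factors in a Box–Behnken design, and all data values here are made up.

        import numpy as np
        from itertools import combinations

        def quadratic_design_matrix(X):
            """Columns: 1, x_i, x_i^2, x_i*x_j -- the second-order RSM model."""
            n, k = X.shape
            cols = [np.ones(n)]
            cols += [X[:, i] for i in range(k)]
            cols += [X[:, i] ** 2 for i in range(k)]
            cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
            return np.column_stack(cols)

        # Hypothetical coded design points and measured yields (mg/g)
        X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [-1, 0],
                      [1, 0], [0, -1], [0, 1], [0, 0]], float)
        y = np.array([2.1, 2.6, 2.4, 2.8, 2.3, 2.75, 2.5, 2.6, 2.9])

        beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

        # Grid search for the predicted optimum over the coded region [-1, 1]^2
        g = np.linspace(-1, 1, 201)
        grid = np.array([[a, b] for a in g for b in g])
        pred = quadratic_design_matrix(grid) @ beta
        print(grid[pred.argmax()], pred.max())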

  8. Developing knowledge intensive ideas in engineering education: the application of camp methodology

    NASA Astrophysics Data System (ADS)

    Heidemann Lassen, Astrid; Løwe Nielsen, Suna

    2011-11-01

    Background: Globalization, technological advancement, environmental problems, etc. challenge organizations not just to consider cost-effectiveness, but also to develop new ideas in order to build competitive advantages. Hence, methods to deliberately enhance creativity and facilitate its processes of development must also play a central role in engineering education. However, so far the engineering education literature pays little attention to the important discussion of how to develop knowledge-intensive ideas based on creativity methods and concepts. Purpose: The purpose of this article is to investigate how to design creative camps from which knowledge-intensive ideas can unfold. Design/method/sample: A framework on the integration of creativity and knowledge intensity is first developed, and then tested through the planning, execution and evaluation of a specialized creativity camp with a focus on supply chain management. Detailed documentation of the learning processes of the participating 49 engineering and business students is developed through repeated interviews during the process as well as a survey. Results: The research illustrates the process of development of ideas, and how the participants through interdisciplinary collaboration, cognitive flexibility and joint ownership develop highly innovative and knowledge-intensive ideas, with direct relevance for the four companies whose problems they address. Conclusions: The article demonstrates how the creativity camp methodology holds the potential of combining advanced academic knowledge and creativity to produce knowledge-intensive ideas, when the design is based on ideas of experiential learning as well as creativity principles. This makes the method a highly relevant learning approach for engineering students in the search for skills to both develop and implement innovative ideas.

  9. Methodology for urban rail and construction technology research and development planning

    NASA Technical Reports Server (NTRS)

    Rubenstein, L. D.; Land, J. E.; Deshpande, G.; Dayman, B.; Warren, E. H.

    1980-01-01

    A series of transit system visits, organized by the American Public Transit Association (APTA), was conducted in which the system operators identified the most pressing development needs. These varied by property and were reformulated into a series of potential projects. To assist in the evaluation, a data base useful for estimating the present capital and operating costs of various transit system elements was generated from published data. An evaluation model was developed which considered the rate of deployment of the research and development project, potential benefits, development time and cost. An outline of an evaluation methodology that considered benefits other than capital and operating cost savings was also presented. During the course of the study, five candidate projects were selected for detailed investigation; (1) air comfort systems; (2) solid state auxiliary power conditioners; (3) door systems; (4) escalators; and (5) fare collection systems. Application of the evaluation model to these five examples showed the usefulness of modeling deployment rates and indicated a need to increase the scope of the model to quantitatively consider reliability impacts.

  10. Research & development methodology for recycling residues as building materials--a proposal.

    PubMed

    John, V M; Zordan, S E

    2001-01-01

    This article proposes a methodology for conducting research and development on recycling residues as building materials. The waste collection statistics phase must cover geographical distribution, seasonal variations in production rates, waste management practices, current applications and their related costs and revenues. Waste characterisation must be comprehensive, covering physical, environmental and chemical aspects, including waste variability and waste contamination from shipping, handling and storage activities. Based on these results, a broad forecast of potential applications must be developed, based on simple rules such as minimisation of transportation distances and energy consumption. Marketing evaluation is a very important step, frequently neglected when choosing the best applications for a particular waste. Other steps are product development and performance evaluation. Environmental evaluation of the new technology is very important because not all recycling is environmentally sound. This evaluation must be based on life cycle assessment (LCA) and has to consider the environmental benefit of avoiding landfill disposal of the waste, and could include leaching or other specific tests or simulations. Also, the technology transfer phase must be carefully planned and developed. Each proposed step is discussed, examples are given and needs for further research emphasised.

  11. From vision to reality: strategic agility in complex times.

    PubMed

    Soule, Barbara M

    2002-04-01

    Health care is experiencing turbulent times. Change has become the constant. Complexity and sometimes chaos are common characteristics. Within this context, infection control professionals strive to maintain their practices, achieve excellence, and plan for the future. As demands shift and expectations increase, professionals in infection surveillance, prevention, and control (ISPC) programs must develop strategic agility. This article describes the rationale for strategic thinking and action set within a framework of 6 thought-provoking questions. It also describes a number of techniques to use for thinking strategically, such as designing visions, becoming entrepreneurial, and engaging in creative and futuristic exercises to evaluate possibilities for program direction. These techniques can guide individual professionals or ISPC programs in strategic decision-making that will increase the ability to survive and succeed in the future.

  12. Agile and dexterous robot for inspection and EOD operations

    NASA Astrophysics Data System (ADS)

    Handelman, David A.; Franken, Gordon H.; Komsuoglu, Haldun

    2010-04-01

    The All-Terrain Biped (ATB) robot is an unmanned ground vehicle with arms, legs and wheels designed to drive, crawl, walk and manipulate objects for inspection and explosive ordnance disposal tasks. This paper summarizes on-going development of the ATB platform. Control technology for semi-autonomous legged mobility and dual-arm dexterity is described as well as preliminary simulation and hardware test results. Performance goals include driving on flat terrain, crawling on steep terrain, walking on stairs, opening doors and grasping objects. Anticipated benefits of the adaptive mobility and dexterity of the ATB platform include increased robot agility and autonomy for EOD operations, reduced operator workload and reduced operator training and skill requirements.

  13. Frequency-agile bandpass filter for direct detection lidar receivers.

    PubMed

    Gittins, C M; Lawrence, W G; Marinelli, W J

    1998-12-20

    We discuss the development of a frequency-agile receiver for CO(2) laser-based differential absorption lidar (DIAL) systems. The receiver is based on the insertion of a low-order tunable etalon into the detector field of view. The incorporation of the etalon into the receiver reduces system noise by decreasing the instantaneous spectral bandwidth of the IR detector to a narrow wavelength range centered on the transmitted CO(2) laser line, thereby improving the overall D* of the detection system. A consideration of overall lidar system performance results in a projected factor of 2-7 reduction in detector system noise, depending on the characteristics of the environment being probed. These improvements can play a key role in extending the ability of DIAL systems to monitor chemical releases from long standoff distances.

  14. A new algorithm for agile satellite-based acquisition operations

    NASA Astrophysics Data System (ADS)

    Bunkheila, Federico; Ortore, Emiliano; Circi, Christian

    2016-06-01

    Taking advantage of the high manoeuvrability and accurate pointing of the so-called agile satellites, an algorithm which allows efficient management of the operations concerning optical acquisitions is described. Fundamentally, this algorithm can be subdivided into two parts: in the first, the algorithm performs a geometric classification of the areas of interest and a partitioning of these areas into stripes which extend along the optimal scan directions; in the second, it computes the succession of time windows in which the acquisition operations of the areas of interest are feasible, taking into consideration the potential restrictions associated with these operations and with the geometric and stereoscopic constraints. The results and the performance of the proposed algorithm have been determined and discussed considering the case of Periodic Sun-Synchronous Orbits.
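
    The second stage, intersecting acquisition opportunities with operational restrictions, is essentially interval arithmetic on time windows. The Python sketch below shows that step in isolation; the window values and the two constraint sets are invented, and the real algorithm additionally handles the geometric and stereoscopic constraints described above.

        def intersect_windows(a, b):
            """Intersect two sorted lists of (start, end) time windows."""
            out, i, j = [], 0, 0
            while i < len(a) and j < len(b):
                s, e = max(a[i][0], b[j][0]), min(a[i][1], b[j][1])
                if s < e:
                    out.append((s, e))
                # Advance whichever window ends first
                if a[i][1] < b[j][1]:
                    i += 1
                else:
                    j += 1
            return out

        # Hypothetical visibility passes intersected with manoeuvre feasibility
        visibility = [(0, 120), (400, 560)]
        manoeuvre_ok = [(60, 200), (450, 500)]
        print(intersect_windows(visibility, manoeuvre_ok))  # [(60, 120), (450, 500)]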

  15. Development of a Methodology to Gather Seated Anthropometry in a Microgravity Environment

    NASA Technical Reports Server (NTRS)

    Rajulu, Sudhakar; Young, Karen; Mesloh, Miranda

    2009-01-01

    The Constellation Program's Crew Exploration Vehicle (CEV) is required to accommodate the full population range of crewmembers according to the anthropometry requirements stated in the Human-Systems Integration Requirement (HSIR) document (CxP70024). Seated height is one of many critical dimensions of importance to the CEV designers in determining the optimum seat configuration in the vehicle. Changes in seated height may have a large impact on the design, accommodation, and safety of the crewmembers. Seated height can change due to elongation of the spine when crewmembers are exposed to microgravity. Spinal elongation is the straightening of the natural curvature of the spine and the expansion of inter-vertebral disks. This straightening occurs due to fluid shifts in the body and the lack of compressive forces on the spinal vertebrae. Previous studies have shown that as the natural curvature of the spine straightens, an increase in overall height of 3% of stature occurs, which has been the basis of the current HSIR requirements. However, due to variations in the torso/leg ratio and the impact of soft tissue, data are nonexistent as to how spinal elongation specifically affects the measurement of seated height. In order to obtain this data, an experiment was designed to collect spinal elongation data while in a seated posture in microgravity. The purpose of this study was to provide quantitative data that represent the amount of change that occurs in seated height due to spinal elongation in microgravity environments. Given the schedule and budget constraints of ISS and Shuttle missions and the uniqueness of the problem, a methodology had to be developed to ensure that the seated height measurements were accurately collected. Therefore, simulated microgravity evaluations were conducted to test the methodology and procedures of the experiment. This evaluation obtained seat pan pressure and seated height data to a) ensure that the lap restraint provided sufficient

  16. Frequency/phase agile microwave circuits on ferroelectric films

    NASA Astrophysics Data System (ADS)

    Romanofsky, Robert Raymond

    This work describes novel microwave circuits that can be tuned in either frequency or phase through the use of nonlinear dielectrics, specifically thin ferroelectric films. These frequency and phase agile circuits in many cases provide a new capability or offer the potential for lower cost alternatives in satellite and terrestrial communications and sensor applications. A brief introduction to nonlinear dielectrics and a summary of some of the special challenges confronting the practical insertion of ferroelectric technology into commercial systems is provided. A theoretical solution for the propagation characteristics of the multi-layer structures, with emphasis on a new type of phase shifter based on coupled microstrip lines, is developed. The quasi-TEM analysis is based on a variational solution for line capacitance and an extension of coupled transmission line theory. It is shown that the theoretical model is applicable to a broad class of multi-layer transmission lines. The critical role that ferroelectric film thickness plays in loss and phase shift is closely examined. Experimental data for both thin film BaxSr1-xTiO3 phase shifters near room temperature and SMO3 phase shifters at cryogenic temperatures on MgO and LaAlO3 substrates are included. Some of these devices demonstrated an insertion loss of less than 5 dB at Ku-band with continuously variable phase shift in excess of 360 degrees. The performance of these devices is superior to that of state-of-the-art semiconductor counterparts. Frequency and phase agile antenna prototypes, including a microstrip patch that can operate at multiple microwave frequency bands and a new type of phased array antenna concept called the ferroelectric reflectarray, are introduced. Modeled data for tunable microstrip patch antennas are presented for various ferroelectric film thicknesses. A prototype linear phased array, with a conventional beam-forming manifold and an electronic controller, is described. This is the first

  17. Dynamic tumor tracking using the Elekta Agility MLC

    SciTech Connect

    Fast, Martin F.; Nill, Simeon; Bedford, James L.; Oelfke, Uwe

    2014-11-01

    Purpose: To evaluate the performance of the Elekta Agility multileaf collimator (MLC) for dynamic real-time tumor tracking. Methods: The authors have developed new control software which interfaces to the Agility MLC to dynamically program the movement of individual leaves, the dynamic leaf guides (DLGs), and the Y collimators (“jaws”) based on the actual target trajectory. A motion platform was used to perform dynamic tracking experiments with sinusoidal trajectories. The actual target positions reported by the motion platform at 20, 30, or 40 Hz were used as shift vectors for the MLC in beams-eye-view. The system latency of the MLC (i.e., the average latency comprising target device reporting latencies and MLC adjustment latency) and the geometric tracking accuracy were extracted from a sequence of MV portal images acquired during irradiation for the following treatment scenarios: leaf-only motion, jaw + leaf motion, and DLG + leaf motion. Results: The portal imager measurements indicated a clear dependence of the system latency on the target position reporting frequency. Deducting the effect of the target frequency, the leaf adjustment latency was measured to be 38 ± 3 ms for a maximum target speed v of 13 mm/s. The jaw + leaf adjustment latency was 53 ± 3 ms at a similar speed. The system latency at a target position frequency of 30 Hz was in the range of 56–61 ms for the leaves (v ≤ 31 mm/s), 71–78 ms for the jaw + leaf motion (v ≤ 25 mm/s), and 58–72 ms for the DLG + leaf motion (v ≤ 59 mm/s). The tracking accuracy showed a similar dependency on the target position frequency and the maximum target speed. For the leaves, the root-mean-squared error (RMSE) was between 0.6–1.5 mm depending on the maximum target speed. For the jaw + leaf (DLG + leaf) motion, the RMSE was between 0.7–1.5 mm (1.9–3.4 mm). Conclusions: The authors have measured the latency and geometric accuracy of the Agility MLC, facilitating its future use for clinical
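
    Latency figures like those quoted above are typically obtained by finding the time shift that best aligns the commanded target trajectory with the measured leaf trajectory. The Python sketch below illustrates that idea on synthetic data; the sampling rate, the 60 ms simulated lag, and the brute-force lag search are assumptions for the example, not the authors' portal-image analysis.

        import numpy as np

        def estimate_latency(target, leaf, dt, max_lag_s=0.2):
            """Return the lag (s) minimizing the RMS error between the target
            trajectory and the delayed leaf trajectory, plus that RMS error."""
            lags = np.arange(0, int(max_lag_s / dt))
            rms = [np.sqrt(np.mean((target[:-l or None] - leaf[l:]) ** 2))
                   for l in lags]
            best = int(np.argmin(rms))
            return lags[best] * dt, rms[best]

        # Synthetic sinusoidal trajectory sampled at 1 kHz; the "leaf" signal
        # lags the target by 60 ms.
        dt = 0.001
        t = np.arange(0.0, 10.0, dt)
        target = 10.0 * np.sin(2 * np.pi * 0.25 * t)
        leaf = np.roll(target, int(0.06 / dt))
        print(estimate_latency(target, leaf, dt))   # ~(0.06, 0.0)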

  18. Sensory enhancing insoles improve athletic performance during a hexagonal agility task.

    PubMed

    Miranda, Daniel L; Hsu, Wen-Hao; Gravelle, Denise C; Petersen, Kelsey; Ryzman, Rachael; Niemi, James; Lesniewski-Laas, Nicholas

    2016-05-01

    Athletes incorporate afferent signals from the mechanoreceptors of their plantar feet to provide information about posture, stability, and joint position. Sub-threshold stochastic resonance (SR) sensory enhancing insoles have been shown to improve balance and proprioception in young and elderly participant populations. Balance and proprioception are correlated with improved athletic performance, such as agility. Agility is defined as the ability to quickly change direction. An athlete's agility is commonly evaluated during athletic performance testing to assess their ability to participate in a competitive sporting event. Therefore, the purpose of this study was to examine the effects of SR insoles during a hexagonal agility task routinely used by coaches and sports scientists. Twenty recreational athletes were recruited to participate in this study. Each athlete was asked to perform a set of hexagonal agility trials while SR stimulation was either on or off. Vicon motion capture was used to measure feet position during six successful trials for each stimulation condition. Stimulation condition was randomized in a pairwise fashion. The study outcome measures were the task completion time and the positional accuracy of footfalls. Pairwise comparisons revealed a 0.12s decrease in task completion time (p=0.02) with no change in hopping accuracy (p=0.99) when SR stimulation was on. This is the first study to show athletic performance benefits while wearing proprioception and balance improving equipment on healthy participants. With further development, a self-contained sensory enhancing insole device could be used by recreational and professional athletes to improve movements that require rapid changes in direction.

  19. Agile Data Curation: A conceptual framework and approach for practitioner data management

    NASA Astrophysics Data System (ADS)

    Young, J. W.; Benedict, K. K.; Lenhardt, W. C.

    2015-12-01

    Data management occurs across a range of science and related activities such as decision-support. Exemplars within the science community operate data management systems that are extensively planned before implementation, staffed with robust data management expertise, equipped with appropriate services and technologies, and often highly structured. However, this is not the only approach to data management and almost certainly not the typical experience. The other end of the spectrum is often an ad hoc practitioner team, with changing requirements, limited training in data management, and constrained resources for both equipment and personnel. Much of the existing data management literature serves the exemplar community and ignores the ad hoc practitioners. Somewhere in the middle are examples where data are repurposed for new uses, thereby generating new data management challenges. This submission presents a conceptualization of an Agile Data Curation approach that provides foundational principles for data management efforts operating across the spectrum of data generation and use, from large science systems to efforts with constrained resources, limited expertise, and evolving requirements. The underlying principles of Agile Data Curation are a reapplication of agile software development principles to data management. The historical reality for many data management efforts is operating in a practitioner environment, so Agile Data Curation utilizes historical and current case studies to validate the foundational principles and, through comparison, to learn lessons for future application. This submission will provide an overview of Agile Data Curation, cover the foundational principles of the approach, and introduce a framework for gathering, classifying, and applying lessons from case studies of practitioner data management.

  20. A methodological proposal for the development of an HPC-based antenna array scheduler

    NASA Astrophysics Data System (ADS)

    Bonvallet, Roberto; Hoffstadt, Arturo; Herrera, Diego; López, Daniela; Gregorio, Rodrigo; Almuna, Manuel; Hiriart, Rafael; Solar, Mauricio

    2010-07-01

    As new astronomy projects choose interferometry to improve angular resolution and to minimize costs, preparing and optimizing schedules for an antenna array becomes an increasingly critical task. This problem shares similarities with the job-shop problem, which is known to be NP-hard, making a complete approach infeasible. In the case of ALMA, 18000 projects per season are expected, and the best schedule must be found on the order of minutes. The problem imposes severe difficulties: the large domain of observation projects to be taken into account; a complex objective function composed of several abstract, environmental, and hardware constraints; the number of restrictions imposed; and the dynamic nature of the problem, as weather is an ever-changing variable. A solution can benefit from the use of High-Performance Computing, not only for the final implementation to be deployed but also for the development process. Our research group proposes the use of both metaheuristic search and statistical learning algorithms in order to create schedules in a reasonable time. How these techniques will be applied is yet to be determined as part of the ongoing research. Several algorithms need to be implemented, tested and evaluated by the team. This work presents the methodology proposed to lead the development of the scheduler. The basic functionality is encapsulated into software components implemented on parallel architectures. These components expose a domain-level interface to the researchers, enabling them to develop early prototypes for evaluating and comparing their proposed techniques.
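
    Because an exact solution to such an NP-hard scheduling problem is infeasible at this scale, metaheuristic search of the kind proposed usually starts from a feasible ordering and improves it by local moves against an objective function. The Python skeleton below shows the shape of such a prototype; the project attributes, the weather factor, and the swap-based local search are illustrative assumptions, standing in for whatever metaheuristics and objective the team ultimately selects.

        import random

        def schedule_score(order, projects, weather):
            """Toy objective: total priority of projects that fit before their
            deadlines, scaled by a per-project weather suitability factor."""
            t, score = 0.0, 0.0
            for name in order:
                p = projects[name]
                if t + p["duration"] <= p["deadline"]:
                    score += p["priority"] * weather.get(name, 1.0)
                    t += p["duration"]
            return score

        def local_search(projects, weather, iters=5000):
            """Swap-based hill climbing; a stand-in for richer metaheuristics
            such as simulated annealing or genetic algorithms."""
            order = list(projects)
            best = schedule_score(order, projects, weather)
            for _ in range(iters):
                i, j = random.sample(range(len(order)), 2)
                order[i], order[j] = order[j], order[i]
                s = schedule_score(order, projects, weather)
                if s > best:
                    best = s
                else:
                    order[i], order[j] = order[j], order[i]  # undo the swap
            return order, best

        projects = {
            "P1": {"duration": 1.0, "deadline": 4.0, "priority": 3.0},
            "P2": {"duration": 2.0, "deadline": 2.0, "priority": 5.0},
            "P3": {"duration": 1.5, "deadline": 6.0, "priority": 2.0},
        }
        print(local_search(projects, weather={"P2": 0.8}))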

  1. Development of designer chicken shred with response surface methodology and evaluation of its quality characteristics.

    PubMed

    Reddy, K Jalarama; Jayathilakan, K; Pandey, M C

    2016-01-01

    Meat is considered to be an excellent source of protein, essential minerals, trace elements and vitamins, but concerns regarding meat consumption and its impact on human health have promoted research into the development of novel functional meat products. In the present study, rice bran oil (RBO) and flaxseed oil (FSO) were used for attaining an ideal lipid profile in the product. The experiment was designed to optimise the RBO and FSO concentrations for development of a product with an ideal lipid profile and maximum acceptability by applying the central composite rotatable design of response surface methodology (RSM). Levels of RBO and FSO were taken as independent variables and overall acceptability (OAA), n-6 and n-3 fatty acids as responses. A quadratic fit model was found to be suitable for optimising the product. The sample with RBO (20.51 ml) and FSO (2.57 ml) yielded an OAA score of 8.25, 29.54 % n-6 and 7.70 % n-3, with an n-6/n-3 ratio of 3.8:1. The optimised product was analysed for physico-chemical, sensory and microbial profile during storage at 4 ± 1 °C for 30 days. An increase in the lipid oxidative parameters was observed during storage, but it was not significant (p > 0.05). The study revealed great potential for developing functional poultry products with improved nutritional quality and good shelf stability by incorporating RBO and FSO.

  2. Development and methodology of level 1 probability safety assessment at PUSPATI TRIGA Reactor

    NASA Astrophysics Data System (ADS)

    Maskin, Mazleha; Tom, Phongsakorn Prak; Lanyau, Tonny Anak; Brayon, Fedrick Charlie Matthew; Mohamed, Faizal; Saad, Mohamad Fauzi; Ismail, Ahmad Razali; Abu, Mohamad Puad Haji

    2014-02-01

    As a consequence of the accident at the Fukushima Dai-ichi Nuclear Power Plant in Japan, the safety aspects of the one and only research reactor (31 years old) in Malaysia needed to be reviewed. Based on this decision, the Malaysian Nuclear Agency, in collaboration with the Atomic Energy Licensing Board and Universiti Kebangsaan Malaysia, developed a Level-1 Probability Safety Assessment (PSA) of this research reactor. This work aims to evaluate the potential risks of incidents in RTP and at the same time to identify internal and external hazards that may cause extreme initiating events. This report documents the methodology in developing a Level 1 PSA performed for the RTP as a complementary approach to deterministic safety analysis, both in neutronics and thermal hydraulics. This Level-1 PSA work has been performed according to the procedures suggested in relevant IAEA publications, and a number of procedures have been developed as part of an Integrated Management System programme implemented in Nuclear Malaysia.

  3. Methodology to develop crash modification functions for road safety treatments with fully specified and hierarchical models.

    PubMed

    Chen, Yongsheng; Persaud, Bhagwant

    2014-09-01

    Crash modification factors (CMFs) for road safety treatments are developed as multiplicative factors that are used to reflect the expected changes in safety performance associated with changes in highway design and/or the traffic control features. However, current CMFs have methodological drawbacks. For example, variability with application circumstance is not well understood, and, as important, correlation is not addressed when several CMFs are applied multiplicatively. These issues can be addressed by developing safety performance functions (SPFs) with components of crash modification functions (CM-Functions), an approach that includes all CMF related variables, along with others, while capturing quantitative and other effects of factors and accounting for cross-factor correlations. CM-Functions can capture the safety impact of factors through a continuous and quantitative approach, avoiding the problematic categorical analysis that is often used to capture CMF variability. There are two formulations to develop such SPFs with CM-Function components - fully specified models and hierarchical models. Based on sample datasets from two Canadian cities, both approaches are investigated in this paper. While both model formulations yielded promising results and reasonable CM-Functions, the hierarchical model was found to be more suitable in retaining homogeneity of first-level SPFs, while addressing CM-Functions in sub-level modeling. In addition, hierarchical models better capture the correlations between different impact factors.
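
    The distinction drawn here between a categorical CMF and a continuous CM-Function can be made concrete: in an SPF of the form mu = exp(b0) * AADT^b1 * exp(b2 * (x - x_ref)), the last factor is a CM-Function that varies smoothly with a design variable x instead of applying a single multiplicative CMF per category. The Python fragment below illustrates that form; the coefficient values, the lane-width variable, and the 3.6 m reference are invented for illustration, not taken from the paper's Canadian datasets.

        import numpy as np

        def predicted_crashes(aadt, lane_width, beta):
            """SPF with an embedded CM-Function for lane width:
            mu = exp(b0) * AADT^b1 * exp(b2 * (lane_width - 3.6))."""
            b0, b1, b2 = beta
            return np.exp(b0) * aadt ** b1 * np.exp(b2 * (lane_width - 3.6))

        beta = (-7.5, 0.85, -0.4)          # hypothetical fitted coefficients
        # Narrowing lanes from 3.6 m to 3.3 m scales predicted crashes by
        # exp(-0.4 * (3.3 - 3.6)) ~= 1.13 -- a continuous "CMF".
        print(predicted_crashes(aadt=12000, lane_width=3.3, beta=beta))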

  4. Development and methodology of level 1 probability safety assessment at PUSPATI TRIGA Reactor

    SciTech Connect

    Maskin, Mazleha; Tom, Phongsakorn Prak; Lanyau, Tonny Anak; Saad, Mohamad Fauzi; Ismail, Ahmad Razali; Abu, Mohamad Puad Haji; Brayon, Fedrick Charlie Matthew; Mohamed, Faizal

    2014-02-12

    As a consequence of the accident at the Fukushima Dai-ichi Nuclear Power Plant in Japan, the safety aspects of the one and only research reactor (31 years old) in Malaysia needed to be reviewed. Based on this decision, the Malaysian Nuclear Agency, in collaboration with the Atomic Energy Licensing Board and Universiti Kebangsaan Malaysia, developed a Level-1 Probability Safety Assessment (PSA) of this research reactor. This work aims to evaluate the potential risks of incidents in RTP and at the same time to identify internal and external hazards that may cause extreme initiating events. This report documents the methodology in developing a Level 1 PSA performed for the RTP as a complementary approach to deterministic safety analysis, both in neutronics and thermal hydraulics. This Level-1 PSA work has been performed according to the procedures suggested in relevant IAEA publications, and a number of procedures have been developed as part of an Integrated Management System programme implemented in Nuclear Malaysia.

  5. Mode decomposition as a methodology for developing convective-scale representations in global models

    NASA Astrophysics Data System (ADS)

    Yano, Jun-Ichi; Redelsperger, Jean-Luc; Bechtold, Peter; Guichard, Françoise

    2005-07-01

    Mode decomposition is proposed as a methodology for developing subgrid-scale physical representations in global models by a systematic reduction of an originally full system such as a cloud-resolving model (CRM). A general formulation is presented, and also discussed are mathematical requirements that make this procedure possible. Features of this general methodology are further elucidated by the two specific examples: mass fluxes and wavelets. The traditional mass-flux formulation for convective parametrizations is derived as a special case from this general formulation. It is based on the decomposition of a horizontal domain into an approximate sum of piecewise-constant segments. Thus, a decomposition of CRM outputs on this basis is crucial for their direct verification. However, this decomposition is mathematically not well-posed nor unique due to the lack of admissibility. A classification into cloud types, primarily based on precipitation characteristics of the atmospheric columns, that has been used as its substitute, does not necessarily provide a good approximation for a piecewise-constant segment decomposition. This difficulty with mass-flux decomposition makes a verification of the formulational details of parametrizations based on mass fluxes by a CRM inherently difficult. The wavelet decomposition is an alternative possibility that can more systematically decompose the convective system. Its completeness and orthogonality also allow a prognostic description of a CRM system in wavelet space in the same manner as is done in Fourier space. The wavelets can, furthermore, efficiently represent the various convective coherencies by a limited number of modes due to their spatial localization. Thus, the degree of complexity of the wavelet-based prognostic representation of a CRM can be extensively reduced. Such an extensive reduction may allow its use in place of current cumulus parametrizations. This wavelet-based scheme can easily be verified from the full
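
    The compression property claimed for wavelets, that localized convective structures survive in a few large coefficients while the rest can be truncated, is easy to demonstrate with the orthonormal Haar transform, the simplest wavelet basis. The sketch below is a generic one-dimensional illustration in Python, not the decomposition machinery of the paper; the synthetic "field" and the power-of-two grid are assumptions for the example.

        import numpy as np

        def haar_decompose(f):
            """Multi-level orthonormal Haar transform of a signal whose length
            is a power of two; returns the coarsest mean and per-scale details."""
            a = np.asarray(f, float)
            details = []
            while len(a) > 1:
                pair = a.reshape(-1, 2)
                details.append((pair[:, 0] - pair[:, 1]) / np.sqrt(2.0))
                a = (pair[:, 0] + pair[:, 1]) / np.sqrt(2.0)  # next coarser scale
            return a, details

        # Smooth background plus one localized "convective" spike
        field = np.sin(np.linspace(0.0, 2.0 * np.pi, 64)) + 2.0 * (np.arange(64) == 20)
        coarse, details = haar_decompose(field)
        # Count coefficients that carry most of the signal; truncating the rest
        # is the reduction idea described in the abstract.
        print([int(np.sum(np.abs(d) > 0.1)) for d in details])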

  6. Mapping plant species ranges in the Hawaiian Islands: developing a methodology and associated GIS layers

    USGS Publications Warehouse

    Price, Jonathan P.; Jacobi, James D.; Gon, Samuel M.; Matsuwaki, Dwight; Mehrhoff, Loyal; Wagner, Warren; Lucas, Matthew; Rowe, Barbara

    2012-01-01

    This report documents a methodology for projecting the geographic ranges of plant species in the Hawaiian Islands. The methodology consists primarily of the creation of several geographic information system (GIS) data layers depicting attributes related to the geographic ranges of plant species. The most important spatial-data layer generated here is an objectively defined classification of climate as it pertains to the distribution of plant species. By examining previous zonal-vegetation classifications in light of spatially detailed climate data, broad zones of climate relevant to contemporary concepts of vegetation in the Hawaiian Islands can be explicitly defined. Other spatial-data layers presented here include the following: substrate age, as large areas of the island of Hawai'i, in particular, are covered by very young lava flows inimical to the growth of many plant species; biogeographic regions of the larger islands that are composites of multiple volcanoes, as many of their species are restricted to a given topographically isolated mountain or a specified group of them; and human impact, which can reduce the range of many species relative to where they formerly were found. Other factors influencing the geographic ranges of species that are discussed here but not developed further, owing to limitations in rendering them spatially, include topography, soils, and disturbance. A method is described for analyzing these layers in a GIS, in conjunction with a database of species distributions, to project the ranges of plant species, which include both the potential range prior to human disturbance and the projected present range. Examples of range maps for several species are given as case studies that demonstrate different spatial characteristics of range. Several potential applications of species-range maps are discussed, including facilitating field surveys, informing restoration efforts, studying range size and rarity, studying biodiversity, managing
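
    Operationally, projecting a species range from the layers described here reduces to intersecting boolean rasters on a common grid: climate suitability, substrate age, biogeographic region, and, for the present-day range, remaining low-impact land. The Python fragment below sketches that overlay; the layer names and random rasters are placeholders for the report's actual GIS data.

        import numpy as np

        rng = np.random.default_rng(0)
        shape = (100, 100)                       # hypothetical island grid

        climate_ok   = rng.random(shape) > 0.4   # within suitable climate zones
        substrate_ok = rng.random(shape) > 0.2   # lava flows old enough to colonize
        region_mask  = np.ones(shape, bool)      # species' biogeographic region
        low_impact   = rng.random(shape) > 0.3   # not converted by human land use

        potential_range = climate_ok & substrate_ok & region_mask
        projected_present_range = potential_range & low_impact
        print(potential_range.sum(), projected_present_range.sum())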

  7. A Control Law Design Method Facilitating Control Power, Robustness, Agility, and Flying Qualities Tradeoffs: CRAFT

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Davidson, John B.

    1998-01-01

    A multi-input, multi-output control law design methodology, named "CRAFT", is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The methodology makes use of control law design metrics from each of the four design objective areas. It combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, with a graphical approach for representing the metrics that captures numerous design goals in one composite illustration. Sensitivity of the metrics to eigenspace choice is clearly displayed, enabling the designer to assess the cost of design tradeoffs. This approach enhances the designer's ability to make informed design tradeoffs and to reach effective final designs. An example of the CRAFT methodology applied to an advanced experimental fighter and discussion of associated design issues are provided.
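
    At the core of eigenspace assignment is choosing closed-loop eigenvalues (and, in the MIMO case, eigenvectors) and computing the state-feedback gain that realizes them. The fragment below shows the eigenvalue-placement building block with SciPy on an invented second-order system; the dynamics matrices and pole locations are illustrative only, and CRAFT's metric-driven tradeoff machinery is not represented.

        import numpy as np
        from scipy.signal import place_poles

        # Hypothetical linearized dynamics: x' = A x + B u
        A = np.array([[-0.8, 1.0],
                      [-2.0, -1.2]])
        B = np.array([[0.1],
                      [2.5]])

        # Desired closed-loop eigenvalues, e.g. chosen to trade agility
        # (bandwidth) against robustness (damping)
        poles = np.array([-3.0 + 2.0j, -3.0 - 2.0j])

        fb = place_poles(A, B, poles)
        K = fb.gain_matrix          # u = -K x gives eig(A - B K) = poles
        print(np.linalg.eigvals(A - B @ K))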

  8. Developing services for climate impact and adaptation baseline information and methodologies for the Andes

    NASA Astrophysics Data System (ADS)

    Huggel, C.

    2012-04-01

    Impacts of climate change are observed and projected across a range of ecosystems and economic sectors, and mountain regions rank among the hotspots of climate change. The Andes are considered particularly vulnerable to climate change, not only due to fragile ecosystems but also due to the high vulnerability of the population. Natural resources such as water systems play a critical role and are observed and projected to be seriously affected. Adaptation to climate change impacts is therefore crucial to contain the negative effects on the population. Adaptation projects require information on the climate and the affected socio-environmental systems. There is, however, generally a lack of methodological guidelines on how to generate the necessary scientific information and how to communicate it to implementing governmental and non-governmental institutions. This is particularly important in view of international funds for adaptation such as the Green Climate Fund, established and set into process at the UNFCCC Conferences of the Parties in Cancun 2010 and Durban 2011. To facilitate this process, international and regional organizations (World Bank and Andean Community) and a consortium of research institutions have joined forces to develop and define comprehensive methodologies for baseline and climate change impact assessments for the Andes, with an application potential to other mountain regions (AndesPlus project). Considered are the climatological baseline of a region, and the assessment of trends based on ground meteorological stations, reanalysis data, and satellite information. A challenge is the scarcity of climate information in the Andes, and the complex climatology of the mountain terrain. A climate data platform has been developed for the southern Peruvian Andes and is a key element for climate data service and exchange. Water resources are among the key livelihood components for the Andean population, and local and national economy, in particular for

  9. Brain tumors and synchrotron radiation: Methodological developments in quantitative brain perfusion imaging and radiation therapy

    SciTech Connect

    Adam, Jean-Francois

    2005-04-01

    High-grade gliomas are the most frequent type of primary brain tumors in adults. Unfortunately, the management of glioblastomas is still mainly palliative and remains a difficult challenge, despite advances in brain tumor molecular biology and in some emerging therapies. Synchrotron radiation opens fields for medical imaging and radiation therapy by using monochromatic intense x-ray beams. It is now well known that angiogenesis plays a critical role in the tumor growth process and that brain perfusion is representative of the tumor mitotic activity. Synchrotron radiation quantitative computed tomography (SRCT) is one of the most accurate techniques for measuring in vivo contrast agent concentration and thus computing precise and accurate absolute values of the brain perfusion key parameters. The methodological developments of SRCT absolute brain perfusion measurements as well as their preclinical validation are detailed in this thesis. In particular, absolute cerebral blood volume and blood brain barrier permeability high-resolution (pixel size <50×50 μm²) parametric maps were reported. In conventional radiotherapy, the treatment of these tumors remains a delicate challenge, because the damage to the surrounding normal brain tissue limits the amount of radiation that can be delivered. One strategy to overcome this limitation is to infuse an iodinated contrast agent into the patient during the irradiation. The contrast agent accumulates in the tumor through the broken blood brain barrier, and the irradiation is performed with kilovoltage x rays, in tomography mode, the tumor being located at the center of rotation and the beam size adjusted to the tumor dimensions. The dose enhancement results from the photoelectric effect on the heavy element and from the irradiation geometry. Synchrotron beams, providing high intensity, tunable monochromatic x rays, are ideal for this treatment. The beam properties allow the selection of monochromatic irradiation, at the optimal

  10. An Examination of an Information Security Framework Implementation Based on Agile Values to Achieve Health Insurance Portability and Accountability Act Security Rule Compliance in an Academic Medical Center: The Thomas Jefferson University Case Study

    ERIC Educational Resources Information Center

    Reis, David W.

    2012-01-01

    Agile project management is most often examined in relation to software development, while information security frameworks are often examined with respect to certain risk management capabilities rather than in terms of successful implementation approaches. This dissertation extended the study of both Agile project management and information…

  11. Methodologies in Cultural-Historical Activity Theory: The Example of School-Based Development

    ERIC Educational Resources Information Center

    Postholm, May Britt

    2015-01-01

    Background and purpose: Relatively little research has been conducted on methodology within Cultural-Historical Activity Theory (CHAT). CHAT is mainly used as a framework for developmental processes. The purpose of this article is to discuss both focuses for research and research questions within CHAT and to outline methodologies that can be used…

  12. Methodological and Pedagogical Potential of Reflection in Development of Contemporary Didactics

    ERIC Educational Resources Information Center

    Chupina, Valentina A.; Pleshakova, Anastasiia Yu.; Konovalova, Maria E.

    2016-01-01

    Applicability of the issue under research is preconditioned by the need of practical pedagogics to expand methodological and methodical tools of contemporary didactics. The purpose of the article is to detect the methodological core of reflection as a form of thinking and to provide insight thereunto on the basis of systematic attributes of the…

  13. Moving to Capture Children's Attention: Developing a Methodology for Measuring Visuomotor Attention.

    PubMed

    Hill, Liam J B; Coats, Rachel O; Mushtaq, Faisal; Williams, Justin H G; Aucott, Lorna S; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child's development. However, methodological limitations currently make large-scale assessment of children's attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of 'Visual Motor Attention' (VMA)-a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method's core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults' attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action). PMID:27434198

  14. Scenario development as a basis for formulating a research program on future agriculture: a methodological approach.

    PubMed

    Oborn, Ingrid; Bengtsson, Jan; Hedenus, Fredrik; Rydhmer, Lotta; Stenström, Maria; Vrede, Katarina; Westin, Charles; Magnusson, Ulf

    2013-11-01

    To increase the awareness of society to the challenges of global food security, we developed five contrasting global and European scenarios for 2050 and used these to identify important issues for future agricultural research. Using a scenario development method known as morphological analysis, scenarios were constructed that took economic, political, technical, and environmental factors into account. With the scenarios as a starting point, future challenges were discussed and research issues and questions were identified in an interactive process with stakeholders and researchers. Based on the outcome of this process, six socioeconomic and biophysical overarching challenges for future agriculture were formulated and related research issues identified. The outcome was compared with research priorities generated in five other research programs. In comparison, our research questions focus more on societal values and the role of consumers in influencing agricultural production, as well as on policy formulation and resolving conflicting goals, areas that are presently under-represented in agricultural research. The partly new and more interdisciplinary research priorities identified in Future Agriculture compared to the other programs analyzed are likely a result of the methodological approach used, combining scenarios and interaction between stakeholders and researchers.
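
    Morphological analysis builds scenarios by enumerating combinations of states across a set of factors and discarding internally inconsistent combinations. The Python fragment below illustrates the mechanics on an invented three-factor field; the factor names, states, and the single cross-consistency rule are placeholders, not the study's actual scenario dimensions.

        from itertools import product

        # Hypothetical morphological field: factors and their possible states
        factors = {
            "economy":    ["globalized growth", "regional blocs", "stagnation"],
            "climate":    ["+2C by 2050", "+4C by 2050"],
            "technology": ["incremental", "breakthrough"],
        }

        def consistent(s):
            # Illustrative cross-consistency rule: breakthrough technology is
            # judged implausible under prolonged economic stagnation.
            return not (s["economy"] == "stagnation"
                        and s["technology"] == "breakthrough")

        scenarios = [dict(zip(factors, combo))
                     for combo in product(*factors.values())]
        scenarios = [s for s in scenarios if consistent(s)]
        print(len(scenarios), "internally consistent scenarios")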

  15. Development of the Spanish version of the Systematized Nomenclature of Medicine: methodology and main issues.

    PubMed Central

    Reynoso, G. A.; March, A. D.; Berra, C. M.; Strobietto, R. P.; Barani, M.; Iubatti, M.; Chiaradio, M. P.; Serebrisky, D.; Kahn, A.; Vaccarezza, O. A.; Leguiza, J. L.; Ceitlin, M.; Luna, D. A.; Bernaldo de Quirós, F. G.; Otegui, M. I.; Puga, M. C.; Vallejos, M.

    2000-01-01

    This presentation features linguistic and terminology management issues related to the development of the Spanish version of the Systematized Nomenclature of Medicine (SNOMED). It describes the translation process and the difficulties encountered in delivering a natural and consistent medical nomenclature. Bunge's three-layered model is referenced to analyze the sequence of symbolic concept representations. It further explains how a communicative translation based on a concept-to-concept approach was used to achieve the highest level of flawlessness and naturalness for the Spanish rendition of SNOMED. Translation procedures and techniques are described and exemplified. Both the computer-aided and human translation methods are portrayed. The scientific and translation team tasks are detailed, with focus on Newmark's four-level principle for the translation process, extended with a fifth level relevant to the ontology to control the consistency of the typology of concepts. Finally, the adoption of a common methodology for developing non-English versions of SNOMED is suggested. PMID:11079973

  16. Moving to Capture Children’s Attention: Developing a Methodology for Measuring Visuomotor Attention

    PubMed Central

    Coats, Rachel O.; Mushtaq, Faisal; Williams, Justin H. G.; Aucott, Lorna S.; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child’s development. However, methodological limitations currently make large-scale assessment of children’s attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of ‘Visual Motor Attention’ (VMA)—a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method’s core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults’ attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action). PMID:27434198

  17. Design, Development and Optimization of S (-) Atenolol Floating Sustained Release Matrix Tablets Using Surface Response Methodology

    PubMed Central

    Gunjal, P. T.; Shinde, M. B.; Gharge, V. S.; Pimple, S. V.; Gurjar, M. K.; Shah, M. N.

    2015-01-01

    The objective of the present investigation was to develop and formulate floating sustained release matrix tablets of S (-) atenolol by using different polymer combinations and filler, to optimize them by using surface response methodology for different drug release variables, and to evaluate the drug release pattern of the optimized product. Floating sustained release matrix tablets of various combinations were prepared with cellulose-based polymers: hydroxypropyl methylcellulose, sodium bicarbonate as a gas generating agent, polyvinyl pyrrolidone as a binder and lactose monohydrate as filler. A 3² full factorial design was employed to investigate the effect of formulation variables on different properties of the tablets: floating lag time, buoyancy time, % drug release in 1 and 6 h (D1 h, D6 h) and time required for 90% drug release (t90%). Significance of results was analyzed using analysis of variance (ANOVA), and P < 0.05 was considered statistically significant. The S (-) atenolol floating sustained release matrix tablets followed Higuchi drug release kinetics, indicating that drug release follows an anomalous (non-Fickian) diffusion mechanism. The developed floating sustained release matrix tablet, with improved efficacy, can perform therapeutically better than a conventional tablet. PMID:26798171
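
    The 3² full factorial design named in the abstract has nine runs (two factors, three coded levels each), to which a quadratic response surface is typically fitted. A minimal sketch of that design-and-fit step, with an entirely invented response rather than the study's data:

      import numpy as np
      from itertools import product

      # 3^2 full factorial: two factors at coded levels -1, 0, +1 (9 runs).
      design = np.array(list(product([-1.0, 0.0, 1.0], repeat=2)))
      x1, x2 = design[:, 0], design[:, 1]

      # Invented response values (e.g. floating lag time, s), one per run.
      y = np.array([52, 47, 50, 45, 38, 41, 49, 43, 48], float)

      # Quadratic surface: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
      A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print("fitted coefficients:", np.round(coef, 2))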

  18. Clinical trials in Huntington's disease: Interventions in early clinical development and newer methodological approaches.

    PubMed

    Sampaio, Cristina; Borowsky, Beth; Reilmann, Ralf

    2014-09-15

    Since the identification of the Huntington's disease (HD) gene, knowledge has accumulated about mechanisms directly or indirectly affected by the mutated Huntingtin protein. Transgenic and knock-in animal models of HD facilitate the preclinical evaluation of these targets. Several treatment approaches with varying, but growing, preclinical evidence have been translated into clinical trials. We review major landmarks in clinical development and report on the main clinical trials that are ongoing or have been recently completed. We also review clinical trial settings and designs that influence drug-development decisions, particularly given that HD is an orphan disease. In addition, we provide a critical analysis of the evolution of the methodology of HD clinical trials to identify trends toward new processes and endpoints. Biomarker studies, such as TRACK-HD and PREDICT-HD, have generated evidence for the potential usefulness of novel outcome measures for HD clinical trials, such as volumetric imaging, quantitative motor (Q-Motor) measures, and novel cognitive endpoints. All of these endpoints are currently applied in ongoing clinical trials, which will provide insight into their reliability, sensitivity, and validity, and their use may expedite proof-of-concept studies. We also outline the specific opportunities that could provide a framework for a successful avenue toward identifying and efficiently testing and translating novel mechanisms of action in the HD field.

  19. Design, Development and Optimization of S (-) Atenolol Floating Sustained Release Matrix Tablets Using Surface Response Methodology.

    PubMed

    Gunjal, P T; Shinde, M B; Gharge, V S; Pimple, S V; Gurjar, M K; Shah, M N

    2015-01-01

    The objective of the present investigation was to develop and formulate floating sustained release matrix tablets of S (-) atenolol by using different polymer combinations and filler, to optimize them by using surface response methodology for different drug release variables, and to evaluate the drug release pattern of the optimized product. Floating sustained release matrix tablets of various combinations were prepared with cellulose-based polymers: hydroxypropyl methylcellulose, sodium bicarbonate as a gas generating agent, polyvinyl pyrrolidone as a binder and lactose monohydrate as filler. A 3² full factorial design was employed to investigate the effect of formulation variables on different properties of the tablets: floating lag time, buoyancy time, % drug release in 1 and 6 h (D1 h, D6 h) and time required for 90% drug release (t90%). Significance of results was analyzed using analysis of variance (ANOVA), and P < 0.05 was considered statistically significant. The S (-) atenolol floating sustained release matrix tablets followed Higuchi drug release kinetics, indicating that drug release follows an anomalous (non-Fickian) diffusion mechanism. The developed floating sustained release matrix tablet, with improved efficacy, can perform therapeutically better than a conventional tablet. PMID:26798171

  20. An integrated methodology on the suitability of offshore sites for wind farm development

    NASA Astrophysics Data System (ADS)

    Patlakas, Platon; Galanis, George; Péray, Marie; Filipot, Jean-François; Kalogeri, Christina; Spyrou, Christos; Diamantis, Dimitris; Kallos, George

    2016-04-01

    During the last decades, the potential of and interest in wind energy investments have been constantly increasing in European countries. As technology changes rapidly, more and more areas can be identified as suitable for energy applications. Offshore wind farms perfectly illustrate how new technologies allow bigger, more efficient wind power plants that are resistant to extreme conditions. The current work proposes an integrated methodology to determine the suitability of an offshore marine area for the development of wind farm structures. More specifically, the region of interest is evaluated based both on the natural resources, connected to the local environmental characteristics, and on potential constraints set by anthropogenic or other activities. State-of-the-art atmospheric and wave models and a 10-year hindcast database are utilized in conjunction with local information on a number of potential constraints, leading to a 5-scale suitability index for the whole area. In this way, sub-regions are characterized, at high resolution, as poorly or highly suitable for wind farm development, providing a new tool for technical/research teams and decision makers. In addition, extreme wind and wave conditions and their 50-year return periods are analyzed and used to define the safety level of the wind farms' structural characteristics.
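
    The 50-year return values mentioned above are commonly estimated by fitting an extreme-value distribution to annual maxima; one standard choice is the Gumbel (EV-I) distribution. A minimal sketch with invented hindcast maxima (this record does not specify the paper's own extreme-value procedure):

      import numpy as np
      from scipy import stats

      # Invented annual-maximum wind speeds (m/s) from a 10-year hindcast.
      annual_max = np.array([24.1, 26.8, 23.5, 28.2, 25.0,
                             27.3, 24.9, 26.1, 29.0, 25.6])

      loc, scale = stats.gumbel_r.fit(annual_max)      # fit Gumbel parameters
      T = 50                                           # return period (years)
      u50 = stats.gumbel_r.ppf(1 - 1 / T, loc, scale)  # 50-year return level
      print(f"estimated 50-year wind speed: {u50:.1f} m/s")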

  1. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT).

  2. Wavelength agile holmium-doped fiber laser

    NASA Astrophysics Data System (ADS)

    Simakov, N.; Daniel, J. M. O.; Ward, J.; Clarkson, W. A.; Hemming, A.; Haub, J.

    2016-03-01

    For the first time, an electronically-controlled, wavelength-agile tuneable holmium-doped fibre laser is presented. A narrow-band acousto-optic tuneable filter was characterized and used as the wavelength selective element to avoid any inertial effects associated with opto-mechanical tuning mechanisms. We demonstrate operation over a 90 nm wavelength range spanning 2040 - 2130 nm. The laser produced >150 mW over this entire range with a signal-to-noise ratio of >45 dB and line-width of ~0.16 nm. Switching times of ~35 μs and sweep rates of up to 9 nm/ms were also demonstrated.

  3. Compact, flexible, frequency agile parametric wavelength converter

    DOEpatents

    Velsko, Stephan P.; Yang, Steven T.

    2002-01-01

    This improved Frequency Agile Optical Parametric Oscillator provides near on-axis pumping of a single QPMC with a tilted periodically poled grating to overcome the necessity to find a particular crystal that will permit collinear birefringence in order to obtain a desired tuning range. A tilted grating design and the elongation of the transverse profile of the pump beam in the angle tuning plane of the FA-OPO reduces the rate of change of the overlap between the pumped volume in the crystal and the resonated and non-resonated wave mode volumes as the pump beam angle is changed. A folded mirror set relays the pivot point for beam steering from a beam deflector to the center of the FA-OPO crystal. This reduces the footprint of the device by as much as a factor of two over that obtained when using the refractive telescope design.

  4. Development and Validation of a New Air Carrier Block Time Prediction Model and Methodology

    NASA Astrophysics Data System (ADS)

    Litvay, Robyn Olson

    Commercial airline operations rely on predicted block times as the foundation for critical, successive decisions that include fuel purchasing, crew scheduling, and airport facility usage planning. Small inaccuracies in the predicted block times can result in large financial losses and, with profit margins for airline operations currently almost nonexistent, can erase any potential profit. Although optimization techniques have resulted in many models targeting airline operations, the challenge of accurately predicting and quantifying variables months in advance remains elusive. The objective of this work is the development of an airline block time prediction model and methodology that is practical, easily implemented, and easily updated. Actual U.S. domestic flight data from a major airline were used to develop a model that predicts airline block times with increased accuracy and smaller variance of the actual times from the predicted times. This reduction in variance represents tens of millions of dollars (U.S.) per year in operational cost savings for an individual airline. A new methodology for block time prediction is constructed using a regression model as the base, as it has both deterministic and probabilistic components, together with historic block time distributions. The estimation of block times for commercial domestic airline operations requires a probabilistic, general model that can be easily customized for a specific airline’s network. As individual block times vary by season, by day, and by time of day, the challenge is to make general, long-term estimations that represent the average actual block times while minimizing the variation. Predictions of block times for the third-quarter months of July and August of 2011 were calculated using this new model. The resulting actual block times were obtained from the Research and Innovative Technology Administration, Bureau of Transportation Statistics
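
    The combination described above, a deterministic regression base plus a probabilistic component drawn from historic block time distributions, can be sketched as follows. The seasonal structure, the chosen service level and all data are invented for illustration; this is not the thesis's actual model.

      import numpy as np

      # Invented historical block times (minutes) for one city pair.
      rng = np.random.default_rng(1)
      month = rng.integers(1, 13, 500)
      actual = 150 + 8 * np.isin(month, [7, 8]) + rng.gamma(4, 3, 500)

      # Deterministic base: seasonal mean; probabilistic part: residual percentile.
      base = np.array([actual[month == m].mean() for m in month])
      resid = actual - base
      predicted = base + np.percentile(resid, 65)   # pad to a chosen service level
      print(f"on-time coverage at this padding: {np.mean(actual <= predicted):.0%}")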

  5. Methodology for Developing the REScheck™ Software through Version 4.4.3

    SciTech Connect

    Bartlett, Rosemarie; Connell, Linda M; Gowri, Krishnan; Lucas, Robert G; Schultz, Robert W; Taylor, Zachary T; Wiberg, John D

    2012-09-01

    MECcheck was renamed REScheck™ to better identify it as a residential code compliance tool. The “MEC” in MECcheck was outdated because it was taken from the Model Energy Code, which has been succeeded by the IECC. The “RES” in REScheck is also a better fit with the companion commercial product, COMcheck™. The easy-to-use REScheck compliance materials include a compliance and enforcement manual for all the MEC and IECC requirements and three compliance approaches for meeting the code’s thermal envelope requirements: prescriptive packages, software, and a trade-off worksheet (included in the compliance manual). The compliance materials can be used for single-family and low-rise multifamily dwellings. The materials allow building energy efficiency measures (such as insulation levels) to be “traded off” against each other, allowing a wide variety of building designs to comply with the code. This report explains the methodology used to develop Version 4.4.3 of the REScheck software developed for the 1992, 1993, and 1995 editions of the MEC, and the 1998, 2000, 2003, 2006, 2007, 2009, and 2012 editions of the IECC, and the 2006 edition of the International Residential Code (IRC). Although some requirements contained in these codes have changed, the methodology used to develop the REScheck software for these editions is similar. Beginning with REScheck Version 4.4.0, support for the 1992, 1993, and 1995 MEC and the 1998 IECC is no longer included, but those sections remain in this document for reference purposes. REScheck assists builders in meeting the most complicated part of the code: the building envelope Uo-, U-, and R-value requirements in Section 502 of the code. This document details the calculations and assumptions underlying the treatment of the code requirements in REScheck, with a major emphasis on the building envelope requirements.
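
    The envelope trade-off that REScheck performs amounts to comparing the total UA (sum of U-factor times area over all assemblies) of the proposed design against a code-budget UA, so a strong component can offset a weak one. A minimal sketch of that idea only; all assemblies, U-factors and areas below are invented, not taken from the code or the software.

      # Proposed and code-budget envelopes: name -> (U-factor, area).
      proposed = {
          "ceiling": (0.030, 1200),
          "walls":   (0.060, 1400),
          "windows": (0.320, 220),
      }
      budget = {
          "ceiling": (0.035, 1200),
          "walls":   (0.057, 1400),
          "windows": (0.350, 220),
      }

      def total_ua(envelope):
          # Overall heat-loss coefficient: sum of U * A over assemblies.
          return sum(u * a for u, a in envelope.values())

      print(f"proposed UA = {total_ua(proposed):.1f}, budget UA = {total_ua(budget):.1f}")
      print("complies" if total_ua(proposed) <= total_ua(budget) else "does not comply")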

  6. Moving target detection for frequency agility radar by sparse reconstruction

    NASA Astrophysics Data System (ADS)

    Quan, Yinghui; Li, YaChao; Wu, Yaojun; Ran, Lei; Xing, Mengdao; Liu, Mengqi

    2016-09-01

    Frequency agility radar, with a carrier frequency that varies randomly from pulse to pulse, exhibits superior performance against electromagnetic interference compared with conventional fixed-carrier-frequency pulse-Doppler radar. A novel moving target detection (MTD) method is proposed for frequency agility radar that estimates the target's velocity from the pulses within a coherent processing interval by using sparse reconstruction. A hardware implementation of the orthogonal matching pursuit algorithm is executed on a Xilinx Virtex-7 Field Programmable Gate Array (FPGA) to perform the sparse optimization. Finally, a series of experiments is performed to evaluate the performance of the proposed MTD method for frequency agility radar systems.
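
    Orthogonal matching pursuit, the sparse-recovery algorithm named above, greedily selects the dictionary atom most correlated with the residual and re-solves a least-squares problem on the selected support. A generic numpy sketch of the algorithm on a toy problem; the paper's radar dictionary and FPGA implementation details are not reproduced here.

      import numpy as np

      def omp(A, y, k):
          """Recover a k-sparse x with y ~= A @ x by orthogonal matching pursuit."""
          residual, support = y.copy(), []
          for _ in range(k):
              # Atom most correlated with the current residual.
              j = int(np.argmax(np.abs(A.conj().T @ residual)))
              support.append(j)
              # Re-fit coefficients on the selected support.
              x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
              residual = y - A[:, support] @ x_s
          x = np.zeros(A.shape[1], dtype=complex)
          x[support] = x_s
          return x

      # Toy sparse recovery: random complex dictionary, 2-sparse truth.
      rng = np.random.default_rng(2)
      A = rng.standard_normal((32, 128)) + 1j * rng.standard_normal((32, 128))
      x_true = np.zeros(128, dtype=complex)
      x_true[[10, 77]] = [1.0, 0.6]
      print(np.flatnonzero(np.abs(omp(A, A @ x_true, 2)) > 1e-6))  # -> [10 77]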

  7. Developing a methodology for identifying action zones to protect and manage groundwater well fields

    NASA Astrophysics Data System (ADS)

    Bellier, Sandra; Viennot, Pascal; Ledoux, Emmanuel; Schott, Celine

    2013-04-01

    Implementation of a long-term action plan to manage and protect well fields is a complex and very expensive process. In this context, the relevance and efficiency of such action plans for water quality should be evaluated. The objective of this study is to set up a methodology to identify relevant action zones in which environmental changes may significantly impact the quantity or quality of pumped water. In the Seine-et-Marne department (France), three sectors comprising numerous well fields pumping from the Champigny limestone aquifer are considered a priority under French environmental law. This aquifer, located south-east of Paris, supplies more than one million people with drinking water. The catchment areas of these abstractions are very large (2000 km2), and their intrinsic vulnerability was established by a simple parametric approach that cannot capture the complexity of the hydrosystem. Consequently, a methodology based on distributed modelling of the aquifer processes was developed. The basin is modelled using the hydrogeological model MODCOU, developed at MINES ParisTech since the 1980s. It simulates surface and groundwater flow in aquifer systems and can represent the local characteristics of the hydrosystem (aquifers communicating by leakage, river infiltration, supply from sinkholes, and locally perched or dewatering aquifers). The model was calibrated by matching simulated river discharge hydrographs and piezometric heads with those observed since the 1970s. With this modelling tool, a methodology based on the transfer of a theoretical tracer through the hydrosystem, from the ground surface to the outlets, was implemented to evaluate the spatial distribution of the contributing areas during contrasting wet or dry recharge periods. The results show that the area contributing to the supply of most catchments is less than 300 km2 and that the major contributing zones are located along rivers. This finding illustrates the importance of

  8. Generic Competences in Higher Education: Studying Their Development in Undergraduate Social Science Studies by Means of a Specific Methodology

    ERIC Educational Resources Information Center

    Gallifa, Josep; Garriga, Jordi

    2010-01-01

    Research into the acquisition of generic competences was carried out with the undergraduate social science programmes offered by the Ramon Llull University, Barcelona (Spain). For these programmes an innovative methodology called "cross-course seminars" has been developed. Its focus is, amongst others, on developing generic competences. In the…

  9. Development of designer chicken shred with response surface methodology and evaluation of its quality characteristics.

    PubMed

    Reddy, K Jalarama; Jayathilakan, K; Pandey, M C

    2016-01-01

    Meat is considered to be an excellent source of protein, essential minerals, trace elements and vitamins, but concerns regarding meat consumption and its impact on human health have promoted research into the development of novel functional meat products. In the present study, rice bran oil (RBO) and flaxseed oil (FSO) were used to attain an ideal lipid profile in the product. The experiment was designed to optimise the RBO and FSO concentrations for the development of a product with an ideal lipid profile and maximum acceptability, by applying the central composite rotatable design of response surface methodology (RSM). The levels of RBO and FSO were taken as independent variables, and overall acceptability (OAA), n-6 and n-3 fatty acids as responses. A quadratic fit model was found to be suitable for optimising the product. The sample with RBO (20.51 ml) and FSO (2.57 ml) yielded an OAA score of 8.25, 29.54 % n-6 and 7.70 % n-3, with an n-6/n-3 ratio of 3.8:1. The optimised product was analysed for physico-chemical, sensory and microbial profile during storage at 4 ± 1 °C for 30 days. An increase in lipid oxidative parameters was observed during storage, but it was not significant (p > 0.05). The studies revealed great potential for developing functional poultry products with improved nutritional quality and good shelf stability by incorporating RBO and FSO. PMID:26787966
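
    The central composite rotatable design used above can be generated directly: for two factors it consists of four factorial points, four axial points at alpha = sqrt(2), and replicated centre points. A sketch with invented centre levels and steps, since the study's actual factor ranges are not given in this record:

      import numpy as np
      from itertools import product

      alpha = np.sqrt(2)  # rotatability condition for two factors
      factorial = list(product([-1.0, 1.0], repeat=2))
      axial = [(-alpha, 0.0), (alpha, 0.0), (0.0, -alpha), (0.0, alpha)]
      centre = [(0.0, 0.0)] * 5
      design = np.array(factorial + axial + centre)

      # Map coded units to invented volumes: RBO centred at 20 ml (step 5 ml),
      # FSO centred at 2.5 ml (step 1 ml).
      rbo = 20 + 5 * design[:, 0]
      fso = 2.5 + 1 * design[:, 1]
      for (c1, c2), r, f in zip(design, rbo, fso):
          print(f"coded ({c1:+.2f}, {c2:+.2f}) -> RBO {r:.1f} ml, FSO {f:.1f} ml")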

  10. Materials by design: methodological developments in the calculation of excited-state properties

    NASA Astrophysics Data System (ADS)

    Govoni, Marco

    Density functional theory (DFT) is one of the main tools used in first principle simulations of materials; however several of the current approximations of exchange and correlation functionals do not provide the level of accuracy required for predictive calculations of excited state properties. The application to heterogeneous systems of more accurate post-DFT approaches such as Many-Body Perturbation Theory (MBPT) - for example to nanostructured, disordered, and defective materials - has been hindered by high computational costs. In this talk recent methodological developments in MBPT calculations will be discussed, as recently implemented in the open source code WEST, which efficiently exploits HPC architectures. Results using a formulation that does not require the explicit calculation of virtual states, nor the storage and inversion of large dielectric matrices will be presented; these results include quasi particle energies for systems with thousands of electrons and encompass the electronic structure of aqueous solutions, spin defects in insulators, and benchmarks for molecules and solids containing heavy elements. Simplifications of MBPT calculations based on the use of static response properties, such as dielectric-dependent hybrid functionals, will also be discussed. Work done in collaboration with Hosung Seo, Peter Scherpelz, Ikutaro Hamada, Jonathan Skone, Alex Gaiduk, T. Anh Pham, and Giulia Galli. Supported by DOE-BES.

  11. Advanced Raman Spectroscopy of Methylammonium Lead Iodide: Development of a Non-destructive Characterisation Methodology

    PubMed Central

    Pistor, Paul; Ruiz, Alejandro; Cabot, Andreu; Izquierdo-Roca, Victor

    2016-01-01

    In recent years, there has been impressively fast technological progress in the development of highly efficient lead halide perovskite solar cells. However, the stability of perovskite films and of the corresponding solar cells is still an open point of concern and calls for advanced characterization methods. In this work, we identify appropriate measurement conditions for a meaningful analysis of spin-coated absorber-grade perovskite thin films based on methylammonium (MA) lead iodide (MAPbI3) by Raman spectroscopy. The material under investigation and its derivatives are the most commonly used for high-efficiency devices in the literature and have yielded working solar cell devices with efficiencies around 10% in our laboratory. We report highly detailed Raman spectra obtained with excitation at 532 nm and 633 nm and their deconvolution, taking advantage of the simultaneous fitting of spectra obtained with varying excitation wavelengths. Finally, we propose a fast and contactless methodology based on Raman to probe composition variations and/or degradation of these perovskite thin films, and discuss the potential of the presented technique as a quality control and degradation monitoring tool for other organic-inorganic perovskite materials and complete solar cell devices. PMID:27786250
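
    Deconvolution of overlapping Raman modes of the kind described above is typically done by least-squares fitting a sum of line-shape functions, often Lorentzians. A minimal single-spectrum sketch on synthetic data; the peak positions, widths and noise level are invented, and the paper's simultaneous multi-wavelength fitting is not reproduced here.

      import numpy as np
      from scipy.optimize import curve_fit

      def lorentzian(x, amp, cen, wid):
          return amp * wid**2 / ((x - cen) ** 2 + wid**2)

      def two_peaks(x, a1, c1, w1, a2, c2, w2):
          return lorentzian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)

      # Synthetic spectrum with two overlapping modes plus noise.
      x = np.linspace(50, 250, 400)  # Raman shift (cm^-1)
      rng = np.random.default_rng(3)
      y = two_peaks(x, 1.0, 110, 12, 0.6, 145, 18) + rng.normal(0, 0.02, x.size)

      p0 = [1, 100, 10, 0.5, 150, 10]  # rough initial guesses
      popt, _ = curve_fit(two_peaks, x, y, p0=p0)
      print("fitted peak centres (cm^-1):", np.round(popt[[1, 4]], 1))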

  12. DARE Train-the-Trainer Pedagogy Development Using 2-Round Delphi Methodology

    PubMed Central

    Kua, Phek Hui Jade; Soon, Swee Sung

    2016-01-01

    The Dispatcher-Assisted first REsponder programme aims to equip the public with skills to perform hands-only cardiopulmonary resuscitation (CPR) and to use an automated external defibrillator (AED). By familiarising them with instructions given by a medical dispatcher during an out-of-hospital cardiac arrest call, they will be prepared and empowered to react in an emergency. We aim to formalise the curriculum and standardise the way information is conveyed to the participants. A panel of 20 experts was chosen. Using Delphi methodology, selected issues were classified into open-ended and close-ended questions. Consensus for an item was established at a 70% agreement rate within the panel. Questions that had 60%–69% agreement were edited and sent to the panel for another round of voting. After 2 rounds of voting, 70 consensus statements were agreed upon. These covered the following: focus of CPR; qualities and qualifications of trainers; recognition of agonal breathing; head-tilt-chin lift; landmark for chest compression; performance of CPR when injuries are present; trainers' involvement in training lay people; modesty of female patients during CPR; AED usage; content of the trainer's manual; addressing of questions and answers; dissemination of updates to trainers and attendance of refresher courses. Recommendations for pedagogy for trainers of dispatcher-assisted CPR programmes were developed. PMID:27660757

  13. Development of a Methodology for the Characterisation of Air-coupled Ultrasound Probes

    SciTech Connect

    Pietroni, Paolo; Marco Revel, Gian

    2010-05-28

    This study is aimed at developing a technique for the characterisation of air-coupled ultrasound probes, starting from the analysis of the mechanical behaviour of the probe membrane. The vibratory behaviour of the emission membrane is studied using laser-Doppler vibrometry techniques with a high-frequency demodulation system (20 MHz). The determination of the vibration provides information which is useful for the assessment of the performance of the probe, in particular concerning the Quality factor and the portion of the membrane which really contributes to the emission. In a second step, the results of the vibration measurements are used to calculate, by means of a numerical boundary element method, the spatial intensity of the emitted ultrasound beam. The obtained field is compared with direct measurements carried out by scanning with the receiver probe and a pinhole plate. This comparison allows the potential and the problems of the two different characterisation techniques to be determined, even if the pinhole technique (which is currently considered the state of the art) cannot be used as an absolute reference. This study paves the way for a new methodology for the calibration of air-coupled ultrasound probes, which potentially could be used not only to improve the probe manufacturing process, but also to control conformity to specifications.

  14. Ispaghula mucilage-gellan mucoadhesive beads of metformin HCl: development by response surface methodology.

    PubMed

    Nayak, Amit Kumar; Pal, Dilipkumar; Santra, Kousik

    2014-07-17

    Response surface methodology based on a 3² factorial design was used to develop ispaghula (Plantago ovata F.) husk mucilage (IHM)-gellan gum (GG) mucoadhesive beads containing metformin HCl through a Ca²⁺ cross-linked ionotropic gelation technique for use in oral drug delivery. The GG to IHM ratio and the cross-linker (CaCl2) concentration were investigated as independent variables. Drug encapsulation efficiency (DEE, %) and cumulative drug release after 10 h (R10h, %) were analyzed as dependent variables. The optimized mucoadhesive beads (F-O) showed a DEE of 94.24 ± 4.18% and an R10h of 59.13 ± 2.27%. These beads were also characterized by SEM and FTIR analyses. The in vitro drug release from these beads showed a controlled-release (zero-order) pattern with a super case-II transport mechanism over 10 h. The optimized beads showed pH-dependent swelling and good mucoadhesivity with goat intestinal mucosa. The optimized IHM-GG mucoadhesive beads containing metformin HCl exhibited a significant antidiabetic effect in alloxan-induced diabetic rats over 10 h. PMID:24702916
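
    The transport-mechanism classification used above follows the Korsmeyer-Peppas power law, Mt/Minf = k * t^n, where for cylindrical matrices n > 0.89 indicates super case-II transport and 0.45 < n < 0.89 indicates anomalous (non-Fickian) transport. A minimal fitting sketch with invented release data, not the study's measurements:

      import numpy as np

      # Invented fractional release data (power-law fits are conventionally
      # restricted to the first ~60% of release).
      t = np.array([0.5, 1, 2, 3, 4, 5])                     # time (h)
      frac = np.array([0.06, 0.12, 0.25, 0.38, 0.50, 0.60])  # Mt/Minf

      # Fit log(Mt/Minf) = log(k) + n*log(t) by linear least squares.
      n, logk = np.polyfit(np.log(t), np.log(frac), 1)
      print(f"release exponent n = {n:.2f}, k = {np.exp(logk):.3f}")
      if n > 0.89:
          print("super case-II transport")
      elif n > 0.45:
          print("anomalous (non-Fickian) transport")
      else:
          print("Fickian diffusion")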

  15. Development of ginger based ready-to-eat appetizers by response surface methodology.

    PubMed

    Wadikar, D D; Nanjappa, C; Premavalli, K S; Bawa, A S

    2010-08-01

    Ginger is an herbaceous perennial rhizome traditionally used in cooking for its flavor and pungency. It is also used as a carminative and stimulant, and for its anti-emetic properties, due to gingerols and shogaols. Appetite loss is one of the problems faced at high altitudes, and appetizers based on ginger may be useful for appetite stimulation. The fruit munch and ginger munch, based on fresh and powdered ginger respectively, were developed using response surface methodology (RSM). The sensory score, acidity and total sugars were the responses in the central composite designs of experiments with three independent variables. The ingredients (raisins, dates, almonds) were pre-processed by frying in stable fat, while juice was extracted from pseudolemon and lemon. The optimized composition of ingredients was processed further through concentration. The carbohydrate-rich munches had a vitamin C content in the range 37-43 mg/100 g and a calorific value of about 90 kcal per munch. The munches packed in metalized polyester pouches had a shelf life of 8 months at ambient conditions (18-33 °C) as well as in storage at a fixed temperature of 37 °C. PMID:20417239

  16. Development of a methodology to assess organometallic effects on bioenergetic systems

    SciTech Connect

    Packer, L.; Mehlhorn, R.J.

    1981-06-01

    A methodology for assessing the impact of subacute concentrations of organometallic agents on bioenergetic and oxidative damage processes in animals, cells and energy transducing subcellular organelles is being developed. Several of the assays are noninvasive and thus lend themselves to human tests. At the whole-animal level we utilize a treadmill chamber where physiological parameters of exercising animals are monitored. These include parameters of whole animals' work performance such as oxygen consumption, carbon dioxide evolution and endurance. Oxidative damage can be monitored in experiments by analyzing expired air of the animals for ethane and n-pentane. These alkanes correlate with lipid peroxidation in vivo. At the cellular and subcellular levels, respiratory activity, lipid peroxidation and free radical species are assayed. Respiratory activity is measured in muscle homogenates and isolated mitochondria using substrates which feed into different segments of the electron transport chain. To demonstrate how these assay procedures correlate, iron deficiency anemia in rats was analyzed. Physiologically, iron deficiency caused a 90% decrease in endurance which correlated with an 80% decrease in pyruvate-malate oxidation rates in muscle homogenates. Significant but smaller effects were seen in hemoglobin/hematocrit levels (50% decrease) and in maximal oxygen consumption (50% decrease). Tissue free-radical signals observed by ESR at room temperature increased with exercise.

  17. The Component Packaging Problem: A Vehicle for the Development of Multidisciplinary Design and Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Fadel, Georges; Bridgewood, Michael; Figliola, Richard; Greenstein, Joel; Kostreva, Michael; Nowaczyk, Ronald; Stevenson, Steve

    1999-01-01

    This report summarizes academic research which has resulted in an increased appreciation for multidisciplinary efforts among our students, colleagues and administrators. It has also generated a number of research ideas that emerged from the interaction between disciplines. Overall, 17 undergraduate students and 16 graduate students benefited directly from the NASA grant; an additional 11 graduate students were impacted and participated without financial support from NASA. The work resulted in 16 theses (with 7 to be completed in the near future) and 67 papers or reports, mostly published in 8 journals and/or presented at various conferences (a total of 83 papers, presentations and reports published based on NASA-inspired or -supported work). In addition, the faculty and students presented related work at many meetings, and continuing work has been proposed to NSF, the Army, industry and other state and federal institutions to continue efforts in the direction of multidisciplinary and, recently, multi-objective design and analysis. The specific problem addressed is component packing, which was solved as a multi-objective problem using iterative genetic algorithms and decomposition. Further testing and refinement of the methodology developed is presently under investigation. Teaming issues research and classes resulted in the publication of a web site (http://design.eng.clemson.edu/psych4991), which provides pointers and techniques to interested parties. Specific advantages of using iterative genetic algorithms, hurdles faced and resolved, and institutional difficulties associated with multi-discipline teaming are described in some detail.
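
    The genetic-algorithm approach to component packing named above can be illustrated generically: encode candidate layouts as chromosomes, score them with an objective that penalizes overlap, and evolve the population by selection, crossover and mutation. The toy 2-D sketch below aggregates everything into a single objective; it is a generic illustration, not the report's multi-objective, decomposition-based method.

      import numpy as np

      rng = np.random.default_rng(4)
      SIZES = np.array([[4, 3], [3, 3], [2, 5], [5, 2], [3, 4]], float)  # w, h

      def objective(pos):
          """Bounding-box area plus a penalty for pairwise rectangle overlap."""
          lo, hi = pos, pos + SIZES
          area = np.prod(hi.max(axis=0) - lo.min(axis=0))
          overlap = 0.0
          for i in range(len(pos)):
              for j in range(i + 1, len(pos)):
                  inter = np.minimum(hi[i], hi[j]) - np.maximum(lo[i], lo[j])
                  if (inter > 0).all():
                      overlap += inter.prod()
          return area + 100 * overlap

      pop = rng.uniform(0, 10, (60, len(SIZES), 2))      # population of layouts
      for _ in range(300):
          scores = np.array([objective(p) for p in pop])
          parents = pop[np.argsort(scores)[:20]]         # truncation selection
          mums = parents[rng.integers(0, 20, 60)]
          dads = parents[rng.integers(0, 20, 60)]
          mask = rng.random(mums.shape) < 0.5            # uniform crossover
          pop = np.where(mask, mums, dads) + rng.normal(0, 0.3, mums.shape)
      best = min(pop, key=objective)
      print(f"best layout objective: {objective(best):.1f}")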

  18. Developing a robust methodology for assessing the value of weather/climate services

    NASA Astrophysics Data System (ADS)

    Krijnen, Justin; Golding, Nicola; Buontempo, Carlo

    2016-04-01

    Increasingly, scientists involved in providing weather and climate services are expected to demonstrate the value of their work for end users in order to justify the costs of developing and delivering these services. This talk will outline different approaches that can be used to assess the socio-economic benefits of weather and climate services, including, among others, willingness to pay and avoided costs. The advantages and limitations of these methods will be discussed, and relevant case studies will be used to illustrate each approach. The choice of valuation method may be influenced by different factors, such as resource and time constraints and the end purposes of the study. In addition, there are important methodological differences which will affect the value assessed. For instance, the ultimate value of a weather/climate forecast to a decision-maker depends not only on forecast accuracy but also on other factors, such as how the forecast is communicated to and consequently interpreted by the end-user. Thus, excluding these additional factors may result in inaccurate socio-economic value estimates. In order to reduce the inaccuracies in this valuation process, we propose an approach that assesses how the initial weather/climate forecast information can be incorporated within the value chain of a given sector, taking into account value gains and losses at each stage of the delivery process. In this way we aim to depict more accurately the socio-economic benefits of a weather/climate forecast to decision-makers.

  19. Development of ginger based ready-to-eat appetizers by response surface methodology.

    PubMed

    Wadikar, D D; Nanjappa, C; Premavalli, K S; Bawa, A S

    2010-08-01

    Ginger is an herbaceous perennial rhizome traditionally used in cooking for its flavor and pungency. It is also used as a carminative and stimulant, and for its anti-emetic properties, due to gingerols and shogaols. Appetite loss is one of the problems faced at high altitudes, and appetizers based on ginger may be useful for appetite stimulation. The fruit munch and ginger munch, based on fresh and powdered ginger respectively, were developed using response surface methodology (RSM). The sensory score, acidity and total sugars were the responses in the central composite designs of experiments with three independent variables. The ingredients (raisins, dates, almonds) were pre-processed by frying in stable fat, while juice was extracted from pseudolemon and lemon. The optimized composition of ingredients was processed further through concentration. The carbohydrate-rich munches had a vitamin C content in the range 37-43 mg/100 g and a calorific value of about 90 kcal per munch. The munches packed in metalized polyester pouches had a shelf life of 8 months at ambient conditions (18-33 °C) as well as in storage at a fixed temperature of 37 °C.

  20. DARE Train-the-Trainer Pedagogy Development Using 2-Round Delphi Methodology

    PubMed Central

    Kua, Phek Hui Jade; Soon, Swee Sung

    2016-01-01

    The Dispatcher-Assisted first REsponder programme aims to equip the public with skills to perform hands-only cardiopulmonary resuscitation (CPR) and to use an automated external defibrillator (AED). By familiarising them with instructions given by a medical dispatcher during an out-of-hospital cardiac arrest call, they will be prepared and empowered to react in an emergency. We aim to formalise the curriculum and standardise the way information is conveyed to the participants. A panel of 20 experts was chosen. Using Delphi methodology, selected issues were classified into open-ended and close-ended questions. Consensus for an item was established at a 70% agreement rate within the panel. Questions that had 60%–69% agreement were edited and sent to the panel for another round of voting. After 2 rounds of voting, 70 consensus statements were agreed upon. These covered the following: focus of CPR; qualities and qualifications of trainers; recognition of agonal breathing; head-tilt-chin lift; landmark for chest compression; performance of CPR when injuries are present; trainers' involvement in training lay people; modesty of female patients during CPR; AED usage; content of the trainer's manual; addressing of questions and answers; dissemination of updates to trainers and attendance of refresher courses. Recommendations for pedagogy for trainers of dispatcher-assisted CPR programmes were developed.

  1. Development of new methodologies for evaluating the energy performance of new commercial buildings

    NASA Astrophysics Data System (ADS)

    Song, Suwon

    The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V methods and analysis methods to measure energy savings from new buildings that would have hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings, including: (1) The development of a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) The development of an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) The development of an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed, including: (1) A new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) A new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) An analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. Third, an analysis of the actual energy savings compared to three different energy baselines was performed, including: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) New comparisons against
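
    The first of the M&V methods listed, synthesizing cooling energy from MCC electricity, rests on a simple regression: calibrate cooling energy against MCC electricity where both are metered, then apply the fit where chilled-water data are missing. A minimal sketch of that idea with invented daily data; the dissertation's actual correlation model is not reproduced in this record.

      import numpy as np

      # Invented daily data: MCC electricity (kWh) and metered chilled-water
      # cooling energy (kWh) for a calibration period.
      rng = np.random.default_rng(5)
      mcc = rng.uniform(400, 900, 120)
      cooling = 2.1 * mcc - 150 + rng.normal(0, 40, 120)

      slope, intercept = np.polyfit(mcc, cooling, 1)   # correlation model
      mcc_unmetered = np.array([520.0, 760.0, 880.0])  # days without CHW data
      synthesized = slope * mcc_unmetered + intercept  # synthesized cooling use
      print("synthesized cooling energy (kWh):", np.round(synthesized))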

  2. GRB 070724B: the first Gamma Ray Burst localized by SuperAGILE

    SciTech Connect

    Del Monte, E.; Costa, E.; Donnarumma, I.; Feroci, M.; Lapshov, I.; Lazzarotto, F.; Soffitta, P.; Argan, A.; Pucella, G.; Trois, A.; Vittorini, V.; Evangelista, Y.; Rapisarda, M.; Barbiellini, G.; Longo, F.; Basset, M.; Foggetta, L.; Vallazza, E.; Bulgarelli, A.; Di Cocco, G.

    2008-05-22

    GRB 070724B is the first Gamma Ray Burst localized by the SuperAGILE instrument aboard the AGILE space mission. The SuperAGILE localization was confirmed by the afterglow observation made with the XRT aboard the Swift satellite. No significant gamma-ray emission above 50 MeV has been detected for this GRB. In this paper we describe the SuperAGILE capabilities in detecting Gamma Ray Bursts and the AGILE observation of GRB 070724B.

  3. Development and evaluation of habitat suitability criteria for use in the instream flow incremental methodology

    USGS Publications Warehouse

    Bovee, Ken D.

    1986-01-01

    The Instream Flow Incremental Methodology (IFIM) is a habitat-based tool used to evaluate the environmental consequences of various water and land use practices. As such, knowledge about the conditions that provide favorable habitat for a species, and those that do not, is necessary for successful implementation of the methodology. In the context of IFIM, this knowledge is defined as habitat suitability criteria: characteristic behavioral traits of a species that are established as standards for comparison in the decision-making process. Habitat suitability criteria may be expressed in a variety of types and formats. The type, or category, refers to the procedure used to develop the criteria. Category I criteria are based on professional judgment, with little or no empirical data. Category II criteria have as their source microhabitat data collected at locations where target organisms are observed or collected. These are called “utilization” functions because they are based on observed locations that were used by the target organism. These functions tend to be biased by the environmental conditions that were available to the fish or invertebrates at the time they were observed. Correction of the utilization function for environmental availability creates category III, or “preference” criteria, which tend to be much less site specific than category II criteria. There are also several ways to express habitat suitability in graphical form. The binary format establishes a suitable range for each variable as it pertains to a life stage of interest, and is presented graphically as a step function. The quality rating for a variable is 1.0 if it falls within the range of the criteria, and 0.0 if it falls outside the range. The univariate curve format establishes both the usable range and the optimum range for each variable, with conditions of intermediate usability expressed along the portion between the tails and the peak of the curve. Multivariate probability
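
    A univariate-curve criterion of the kind described is applied by interpolating each curve at the observed habitat conditions and combining the component suitabilities, commonly by multiplication. A minimal sketch with invented curves; IFIM's full weighted-usable-area computation is not reproduced here.

      import numpy as np

      # Invented suitability curves: suitability in [0, 1] versus depth (m)
      # and velocity (m/s) for one life stage.
      depth_pts, depth_si = [0.0, 0.3, 0.8, 1.5, 3.0], [0.0, 0.6, 1.0, 0.7, 0.0]
      vel_pts, vel_si = [0.0, 0.2, 0.6, 1.2], [0.2, 1.0, 0.8, 0.0]

      def composite_suitability(depth, velocity):
          sd = np.interp(depth, depth_pts, depth_si)
          sv = np.interp(velocity, vel_pts, vel_si)
          return sd * sv  # product aggregation; geometric mean is another choice

      print(f"suitability at 0.8 m, 0.2 m/s: {composite_suitability(0.8, 0.2):.2f}")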

  4. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development

    PubMed Central

    2014-01-01

    Background: Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations using modelling, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. Results: The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input–output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard

  5. Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development

    NASA Astrophysics Data System (ADS)

    Jasiak, M. E.; Truslove, I.; Savoie, M.

    2013-12-01

    In scientific research, software development exists fundamentally for the results it creates. The core research must take focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This system of values is frequently misaligned with developing software in a sustainable fashion; short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines, while building in the support structure necessary for the code to live far beyond its original milestones. Agile and lean software development methodologies including Scrum, Kanban, Continuous Delivery and Test-Driven Development have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, as well as on promoting interaction between developers and researchers. This presentation will describe NSIDC's current scientific software development model, how this model addresses the short-term versus sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile- and lean-influenced model, and the current challenges faced by the organization.

  6. Agile data management for curation of genomes to watershed datasets

    NASA Astrophysics Data System (ADS)

    Varadharajan, C.; Agarwal, D.; Faybishenko, B.; Versteeg, R.

    2015-12-01

    A software platform is being developed for data management and assimilation [DMA] as part of the U.S. Department of Energy's Genomes to Watershed Sustainable Systems Science Focus Area 2.0. The DMA components and capabilities are driven by the project science priorities, and the development is based on agile development techniques. The goal of the DMA software platform is to enable users to integrate and synthesize diverse and disparate field, laboratory, and simulation datasets, including geological, geochemical, geophysical, microbiological, hydrological, and meteorological data across a range of spatial and temporal scales. The DMA objectives are (a) developing an integrated interface to the datasets, (b) storing field monitoring data and laboratory analytical results of water and sediment samples in a database, (c) providing automated QA/QC analysis of data, and (d) working with data providers to modify high-priority field and laboratory data collection and reporting procedures as needed. The first three objectives are driven by user needs, while the last objective is driven by data management needs. The project needs and priorities are reassessed regularly with the users. After each user session we identify development priorities to match the identified user priorities. For instance, data QA/QC and collection activities have focused on the data and products needed for on-going scientific analyses (e.g. water level and geochemistry). We have also developed, tested and released a broker and portal that integrates diverse datasets from two different databases used for curation of project data. The development of the user interface was based on a user-centered design process involving several user interviews and constant interaction with data providers. The initial version focuses on the most requested feature, i.e. finding the data needed for analyses through an intuitive interface. Once the data is found, the user can immediately plot and download data
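
    Automated QA/QC of the kind listed as objective (c) usually starts with simple screening rules, such as physical-range and spike checks, applied before data reach users. A minimal sketch on an invented water-level series; the project's actual rules and thresholds are not given in this record.

      import numpy as np

      # Invented water-level series with one injected bad value.
      rng = np.random.default_rng(6)
      level = 2.0 + 0.1 * np.sin(np.linspace(0, 6, 200)) + rng.normal(0, 0.01, 200)
      level[50] = 9.9

      in_range = (level > 0.0) & (level < 5.0)         # physical-range check
      step = np.abs(np.diff(level, prepend=level[0]))
      no_spike = step < 0.5                            # spike/step check
      flags = np.where(in_range & no_spike, "ok", "suspect")
      print(f"{(flags == 'suspect').sum()} of {level.size} samples flagged")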

  7. Development of an assessment methodology for hydrocarbon recovery potential using carbon dioxide and associated carbon sequestration-Workshop findings

    USGS Publications Warehouse

    Verma, Mahendra K.; Warwick, Peter D.

    2011-01-01

    The Energy Independence and Security Act of 2007 (Public Law 110-140) authorized the U.S. Geological Survey (USGS) to conduct a national assessment of geologic storage resources for carbon dioxide (CO2) and requested that the USGS estimate the "potential volumes of oil and gas recoverable by injection and sequestration of industrial carbon dioxide in potential sequestration formations" (121 Stat. 1711). The USGS developed a noneconomic, probability-based methodology to assess the Nation's technically assessable geologic storage resources available for sequestration of CO2 (Brennan and others, 2010) and is currently using the methodology to assess the Nation's CO2 geologic storage resources. Because the USGS has not developed a methodology to assess the potential volumes of technically recoverable hydrocarbons that could be produced by injection and sequestration of CO2, the Geologic Carbon Sequestration project initiated an effort in 2010 to develop a methodology for the assessment of the technically recoverable hydrocarbon potential in the sedimentary basins of the United States using enhanced oil recovery (EOR) techniques with CO2 (CO2-EOR). In collaboration with Stanford University, the USGS hosted a 2-day CO2-EOR workshop in May 2011, attended by 28 experts from academia, natural resource agencies and laboratories of the Federal Government, State and international geologic surveys, and representatives from the oil and gas industry. The geologic and the reservoir engineering and operations working groups formed during the workshop discussed various aspects of geology, reservoir engineering, and operations to make recommendations for the methodology.

  8. Methodological developments in searching for studies for systematic reviews: past, present and future?

    PubMed

    Lefebvre, Carol; Glanville, Julie; Wieland, L Susan; Coles, Bernadette; Weightman, Alison L

    2013-01-01

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well-developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. Initiatives across The Cochrane Collaboration to improve the quality inter alia of information retrieval include: the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting including searching, the development of Standard Training

  9. Methodological developments in searching for studies for systematic reviews: past, present and future?

    PubMed

    Lefebvre, Carol; Glanville, Julie; Wieland, L Susan; Coles, Bernadette; Weightman, Alison L

    2013-09-25

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well-developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. Initiatives across The Cochrane Collaboration to improve the quality inter alia of information retrieval include: the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting including searching, the development of Standard Training

  10. Methodological developments in searching for studies for systematic reviews: past, present and future?

    PubMed Central

    2013-01-01

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well-developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. Initiatives across The Cochrane Collaboration to improve the quality inter alia of information retrieval include: the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting including searching, the development of Standard
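
    To make the notion of a search filter concrete, the sketch below runs an abridged RCT filter against PubMed via NCBI's E-utilities. The filter text is an illustrative approximation in the spirit of the sensitivity-maximizing Cochrane Highly Sensitive Search Strategy, not its authoritative wording; the field tags are standard PubMed syntax.

```python
# A minimal sketch: count PubMed records matched by an abridged RCT filter
# using NCBI's E-utilities esearch endpoint. The filter is an illustrative
# approximation of a sensitivity-maximizing RCT strategy, not the official
# Cochrane Highly Sensitive Search Strategy text.
import json
import urllib.parse
import urllib.request

RCT_FILTER = (
    "randomized controlled trial[pt] OR controlled clinical trial[pt] "
    "OR randomized[tiab] OR placebo[tiab] OR randomly[tiab] OR trial[tiab] "
    "NOT (animals[mh] NOT humans[mh])"
)

def pubmed_hit_count(term: str) -> int:
    """Return the number of PubMed records matching a query term."""
    params = urllib.parse.urlencode(
        {"db": "pubmed", "term": term, "retmode": "json", "retmax": 0}
    )
    url = f"https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?{params}"
    with urllib.request.urlopen(url) as response:
        payload = json.load(response)
    return int(payload["esearchresult"]["count"])

if __name__ == "__main__":
    print("Records matched by the RCT filter:", pubmed_hit_count(RCT_FILTER))
```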

  11. Epilepsy therapy development: technical and methodologic issues in studies with animal models.

    PubMed

    Galanopoulou, Aristea S; Kokaia, Merab; Loeb, Jeffrey A; Nehlig, Astrid; Pitkänen, Asla; Rogawski, Michael A; Staley, Kevin J; Whittemore, Vicky H; Dudek, F Edward

    2013-08-01

    The search for new treatments for seizures, epilepsies, and their comorbidities faces considerable challenges. This is due in part to gaps in our understanding of the etiology and pathophysiology of most forms of epilepsy. An additional challenge is the difficulty in predicting the efficacy, tolerability, and impact of potential new treatments on epilepsies and comorbidities in humans, using the available resources. Herein we provide a summary of the discussions and proposals of the Working Group 2 as presented in the Joint American Epilepsy Society and International League Against Epilepsy Translational Workshop in London (September 2012). We propose methodologic and reporting practices that will enhance the uniformity, reliability, and reporting of early stage preclinical studies with animal seizure and epilepsy models that aim to develop and evaluate new therapies for seizures or epilepsies, using multidisciplinary approaches. The topics considered include the following: (1) implementation of better study design and reporting practices; (2) incorporation in the study design and analysis of covariants that may influence outcomes (including species, age, sex); (3) utilization of approaches to document target relevance, exposure, and engagement by the tested treatment; (4) utilization of clinically relevant treatment protocols; (5) optimization of the use of video-electroencephalography (EEG) recordings to best meet the study goals; and (6) inclusion of outcome measures that address the tolerability of the treatment or study end points apart from seizures. We further discuss the different expectations for studies aiming to meet regulatory requirements to obtain approval for clinical testing in humans. Implementation of the rigorous practices discussed in this report will require considerable investment in time, funds, and other research resources, which may create challenges for academic researchers seeking to contribute to epilepsy therapy discovery and

  12. Epilepsy Therapy Development: Technical and Methodological Issues in Studies with Animal Models

    PubMed Central

    Galanopoulou, Aristea S.; Kokaia, Merab; Loeb, Jeffrey A.; Nehlig, Astrid; Pitkänen, Asla; Rogawski, Michael A.; Staley, Kevin J.; Whittemore, Vicky H.; Dudek, F. Edward

    2013-01-01

    The search for new treatments for seizures, epilepsies and their comorbidities faces considerable challenges. This is due in part to gaps in our understanding of the etiology and pathophysiology of most forms of epilepsy. An additional challenge is the difficulty of predicting the efficacy, tolerability and impact of potential new treatments on epilepsies and comorbidities in humans, using the available resources. Here we provide a summary of the discussions and proposals of the Working Group 2 as presented in the Joint American Epilepsy Society and International League Against Epilepsy Translational Workshop in London (September 2012). We propose methodological and reporting practices that will enhance the uniformity, reliability and reporting of early stage preclinical studies with animal seizure and epilepsy models that aim to develop and evaluate new therapies for seizures or epilepsies, using multi-disciplinary approaches. The topics considered include: (a) implementation of better study design and reporting practices, (b) incorporation in the study design and analysis of covariants that may impact outcomes (including species, age, sex), (c) utilization of approaches to document target relevance, exposure and engagement by the tested treatment, (d) utilization of clinically relevant treatment protocols, (e) optimization of the use of video-EEG recordings to best meet the study goals, and (f) inclusion of outcome measures that address the tolerability of the treatment or study endpoints apart from seizures. We further discuss the different expectations for studies aiming to meet regulatory requirements to obtain approval for clinical testing in humans. Implementation of the rigorous practices discussed in this report will require considerable investment in time, funds and other research resources, which may create challenges for academic researchers seeking to contribute to epilepsy therapy discovery and development. We propose several infrastructure

  13. Development and optimization of ifosfamide nanostructured lipid carriers for oral delivery using response surface methodology

    NASA Astrophysics Data System (ADS)

    Velmurugan, Ramaiyan; Selvamuthukumar, Subramanian

    2016-02-01

    The research focuses on the development and optimization of ifosfamide nanostructured lipid carriers for oral delivery with the application of response surface methodology. The objectives of the study were to develop an oral formulation of ifosfamide, overcome the drug's instability in the acidic environment encountered during oral administration, sustain its release, and address drug leakage during storage and low loading capacity. A modified solvent diffusion method in an aqueous system was applied to prepare the nanostructured lipid carriers. Hydrophilic polymers such as chitosan and sodium alginate were used as coating materials. Glycerol monooleate and oleic acid were used as the solid and liquid lipid, respectively. Poloxamer was used as the stabilizer. A central composite rotatable design with three factors at three levels was used in this study. The physicochemical characterization included evaluation of the surface morphology, particle size and surface charge of the drug in the delivery system. The in vitro drug release, entrapment efficiency, drug loading and storage stability were evaluated. The results showed that the optimal formulation was composed of a drug/lipid ratio of 1:3, an organic/aqueous phase ratio of 1:10 and a surfactant concentration of 1% w/v. The ifosfamide nanostructured lipid carrier prepared under the optimized conditions gave an entrapment efficiency of 77%, a drug loading of 6.14%, a mean diameter of 223 nm and a zeta potential of -25 mV. Transmission electron microscopy analysis showed spherical particles. The in vitro experiment proved that ifosfamide was released gradually from the delivery system over a period of 72 h. The sodium alginate cross-linked chitosan nanostructured lipid carrier demonstrated enhanced stability of ifosfamide, high entrapment efficiency and sustained release.
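
    For readers unfamiliar with response surface methodology, the sketch below constructs the run matrix of a rotatable central composite design for three coded factors, the design class named in the abstract. The centre value and step size in the decoding step are hypothetical, chosen for illustration; the study's actual factor ranges are not reproduced here.

```python
# A minimal sketch: the run matrix of a rotatable central composite design
# (CCD) for three coded factors -- 8 factorial, 6 axial and 6 centre runs.
# Centre values and step sizes in the decoding step are assumptions for
# illustration, not the study's actual factor ranges.
import itertools
import numpy as np

K = 3                     # factors: drug/lipid, organic/aqueous, surfactant
ALPHA = (2 ** K) ** 0.25  # ~1.682, the axial distance giving rotatability
N_CENTER = 6              # replicated centre points (a common choice)

factorial = np.array(list(itertools.product((-1.0, 1.0), repeat=K)))
axial = np.vstack([ALPHA * np.eye(K), -ALPHA * np.eye(K)])
center = np.zeros((N_CENTER, K))
design = np.vstack([factorial, axial, center])  # 20 coded runs in total

# Decode one coded column into a real setting, e.g. surfactant % w/v with
# an assumed centre of 1.0 and step of 0.5 (hypothetical values).
surfactant_levels = 1.0 + 0.5 * design[:, 2]
print(design.shape, surfactant_levels.min(), surfactant_levels.max())
```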

  14. Frequency agile OPO-based transmitters for multiwavelength DIAL

    SciTech Connect

    Velsko, S.P.; Ruggiero, A.; Herman, M.

    1996-09-01

    We describe a first-generation mid-infrared transmitter with pulse-to-pulse frequency agility and both wide- and narrow-band capability. This transmitter was used to make multicomponent Differential Absorption LIDAR (DIAL) measurements in the field.

  15. Frequency agile OPO-based transmitters for multiwavelength DIAL

    SciTech Connect

    Velsko, S.P.; Ruggiero, A.; Herman, M.

    1996-09-01

    We describe a first-generation mid-infrared transmitter with pulse-to-pulse frequency agility and both wide- and narrow-band capability. This transmitter was used to make multicomponent DIAL measurements in the field.
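
    The core of any DIAL measurement is the two-wavelength retrieval: the ratio of on- and off-resonance backscatter returns at two ranges yields the mean absorber density in the intervening range cell. The sketch below implements that standard relation; all input values are placeholders, not field data from the transmitter described above.

```python
# A minimal sketch of the standard two-wavelength DIAL retrieval: the mean
# absorber number density over a range cell [R1, R2] from on- and off-line
# backscatter powers and the differential absorption cross-section.
import math

def dial_number_density(p_on_r1, p_on_r2, p_off_r1, p_off_r2,
                        delta_sigma_cm2, delta_r_cm):
    """Mean number density (cm^-3) over a cell of width delta_r_cm."""
    ratio = (p_off_r2 * p_on_r1) / (p_on_r2 * p_off_r1)
    return math.log(ratio) / (2.0 * delta_sigma_cm2 * delta_r_cm)

# Placeholder: 10% extra on-line attenuation across a 30 m cell with a
# differential cross-section of 1e-19 cm^2.
n = dial_number_density(1.0, 0.90, 1.0, 1.0, 1e-19, 3000.0)
print(f"{n:.3e} molecules cm^-3")
```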

  16. Laterality and performance of agility-trained dogs.

    PubMed

    Siniscalchi, Marcello; Bertino, Daniele; Quaranta, Angelo

    2014-01-01

    Correlations between lateralised behaviour and performance were investigated in 19 agility-trained dogs (Canis familiaris) by scoring paw preference to hold a food object and relating it to performance on typical agility obstacles (jump/A-frame and weave poles). In addition, because recent behavioural studies reported that visual stimuli of emotional valence presented to one visual hemifield at a time affect visually guided motor responses in dogs, we considered the possibility that the position of the owner in the left versus the right canine visual hemifield might be associated with the quality of performance during agility. The dogs' temperament was also measured by an owner-rated questionnaire. The most relevant finding was that agility-trained dogs displayed longer latencies to complete the obstacles when the owner was located in their left visual hemifield rather than their right. Interestingly, the results showed that this phenomenon was significantly linked to both the dogs' trainability and the strength of their paw preference.
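
    Paw-preference scoring of this kind is conventionally summarized with a laterality index and tested against a no-preference null with a binomial statistic. The sketch below shows that generic computation; it reflects common practice in laterality research, not the paper's exact analysis pipeline, and the counts are invented.

```python
# A minimal sketch of conventional paw-preference statistics: a laterality
# index in [-1, 1] and a binomial z-test against the no-preference null.
import math

def laterality_index(right: int, left: int) -> float:
    """Positive values indicate right-paw bias, negative left-paw bias."""
    return (right - left) / (right + left)

def paw_bias_z(right: int, left: int) -> float:
    """Z-score for the observed right-paw count under p = 0.5."""
    n = right + left
    return (right - 0.5 * n) / math.sqrt(0.25 * n)

print(laterality_index(34, 16))  # 0.36 -> right-biased (invented counts)
print(paw_bias_z(34, 16))        # ~2.55 -> significant at alpha = 0.05
```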

  17. Interferometric Meteor Head Echo Observations using the Southern Argentina Agile Meteor Radar (SAAMER)

    NASA Technical Reports Server (NTRS)

    Janches, D.; Hocking, W.; Pifko, S.; Hormaechea, J. L.; Fritts, D. C.; Brunini, C; Michell, R.; Samara, M.

    2013-01-01

    A radar meteor echo is the radar scattering signature from the free-electrons in a plasma trail generated by entry of extraterrestrial particles into the atmosphere. Three categories of scattering mechanisms exist: specular, nonspecular trails, and head-echoes. Generally, there are two types of radars utilized to detect meteors. Traditional VHF meteor radars (often called all-sky radars) primarily detect the specular reflection of meteor trails traveling perpendicular to the line of sight of the scattering trail, while High Power and Large Aperture (HPLA) radars efficiently detect meteor head-echoes and, in some cases, non-specular trails. The fact that head-echo measurements can be performed only with HPLA radars limits these studies in several ways. HPLA radars are very sensitive instruments constraining the studies to the lower masses, and these observations cannot be performed continuously because they take place at national observatories with limited allocated observing time. These drawbacks can be addressed by developing head echo observing techniques with modified all-sky meteor radars. In addition, the fact that the simultaneous detection of all different scattering mechanisms can be made with the same instrument, rather than requiring assorted different classes of radars, can help clarify observed differences between the different methodologies. In this study, we demonstrate that such concurrent observations are now possible, enabled by the enhanced design of the Southern Argentina Agile Meteor Radar (SAAMER) deployed at the Estacion Astronomica Rio Grande (EARG) in Tierra del Fuego, Argentina. The results presented here are derived from observations performed over a period of 12 days in August 2011, and include meteoroid dynamical parameter distributions, radiants and estimated masses. Overall, the SAAMER's head echo detections appear to be produced by larger particles than those which have been studied thus far using this technique.

  18. Interferometric meteor head echo observations using the Southern Argentina Agile Meteor Radar

    NASA Astrophysics Data System (ADS)

    Janches, D.; Hocking, W.; Pifko, S.; Hormaechea, J. L.; Fritts, D. C.; Brunini, C.; Michell, R.; Samara, M.

    2014-03-01

    A radar meteor echo is the radar scattering signature from the free electrons generated by the entry of extraterrestrial particles into the atmosphere. Three categories of scattering mechanisms exist: specular, nonspecular trails, and head echoes. Generally, there are two types of radars utilized to detect meteors. Traditional VHF all-sky meteor radars primarily detect the specular trails, while high-power, large-aperture (HPLA) radars efficiently detect meteor head echoes and, in some cases, nonspecular trails. The fact that head echo measurements can be performed only with HPLA radars limits these studies in several ways. HPLA radars are sensitive instruments constraining the studies to the lower masses, and these observations cannot be performed continuously because they take place at national observatories with limited allocated observing time. These drawbacks can be addressed by developing head echo observing techniques with modified all-sky meteor radars. Such systems would also permit simultaneous detection of all different scattering mechanisms using the same instrument, rather than requiring assorted different classes of radars, which can help clarify observed differences between the different methodologies. In this study, we demonstrate that such concurrent observations are now possible, enabled by the enhanced design of the Southern Argentina Agile Meteor Radar (SAAMER). The results presented here are derived from observations performed over a period of 12 days in August 2011 and include meteoroid dynamical parameter distributions, radiants, and estimated masses. Overall, the SAAMER's head echo detections appear to be produced by larger particles than those which have been studied thus far using this technique.
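
    The interferometric part of such measurements rests on a simple relation: the phase difference between two receiving antennas separated by a known baseline fixes the echo's angle of arrival. The sketch below applies that relation; the wavelength, baseline, and phase values are placeholders, not SAAMER's actual configuration, and real meteor interferometers use several baselines to resolve the phase ambiguity that any single baseline longer than half a wavelength introduces.

```python
# A minimal sketch of two-antenna radar interferometry: the phase difference
# across a known baseline fixes the echo's angle of arrival. Wavelength,
# baseline and phase below are placeholders, not SAAMER's configuration.
import math

def arrival_angle_deg(phase_diff_rad: float, baseline_m: float,
                      wavelength_m: float) -> float:
    """Angle from the baseline normal implied by an interferometric phase."""
    s = wavelength_m * phase_diff_rad / (2.0 * math.pi * baseline_m)
    return math.degrees(math.asin(s))

# Placeholder: a ~55 MHz radar (wavelength ~5.45 m), a two-wavelength
# baseline and a 60 degree measured phase lag.
print(arrival_angle_deg(math.radians(60.0), 2.0 * 5.45, 5.45))  # ~4.8 deg
```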

  19. Autonomous, agile micro-satellites and supporting technologies

    SciTech Connect

    Breitfeller, E; Dittman, M D; Gaughan, R J; Jones, M S; Kordas, J F; Ledebuhr, A G; Ng, L C; Whitehead, J C; Wilson, B

    1999-07-19

    This paper updates the ongoing effort at Lawrence Livermore National Laboratory to develop autonomous, agile micro-satellites (MicroSats). The objective of this effort is to develop MicroSats weighing only a few tens of kilograms that are able to autonomously perform precision maneuvers and can be used telerobotically in a variety of mission modes. The required capabilities include satellite rendezvous, inspection, proximity operations, docking, and servicing. The MicroSat carries an integrated proximity-operations sensor suite incorporating advanced avionics. A new self-pressurizing propulsion system utilizing a miniaturized pump and non-toxic monopropellant hydrogen peroxide was successfully tested. This system can provide a nominal 25 kg MicroSat with 200-300 m/s of delta-v, including a warm-gas attitude control system. The avionics are based on the latest PowerPC processor using a CompactPCI bus architecture that is modular, high-performance and processor-independent. This leverages commercial-off-the-shelf (COTS) technologies and minimizes the effects of future changes in processors. The MicroSat software development environment uses the VxWorks real-time operating system (RTOS), which provides a rapid development environment for integration of new software modules, allowing early integration and test. We summarize results of recent integrated ground-based flight testing of our latest non-toxic pumped-propulsion MicroSat testbed vehicle, operated on our unique dynamic air-rail.
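
    The quoted 200-300 m/s budget can be sanity-checked with the Tsiolkovsky rocket equation. In the sketch below, the 25 kg wet mass and a 250 m/s target come from the abstract, while the ~150 s specific impulse assumed for a pumped hydrogen peroxide monopropellant is our illustrative estimate, not a figure from the paper.

```python
# A minimal sketch: propellant needed for a given delta-v via the Tsiolkovsky
# rocket equation, m0/mf = exp(dv / (Isp * g0)). The ~150 s Isp assumed for a
# pumped hydrogen peroxide monopropellant is illustrative only.
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_mass(wet_mass_kg: float, delta_v_ms: float, isp_s: float) -> float:
    """Propellant mass a wet_mass_kg vehicle burns to reach delta_v_ms."""
    mass_ratio = math.exp(delta_v_ms / (isp_s * G0))
    return wet_mass_kg * (1.0 - 1.0 / mass_ratio)

print(f"{propellant_mass(25.0, 250.0, 150.0):.2f} kg")  # ~3.9 kg
```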

  20. How you count carbon matters: implications of differing cookstove carbon credit methodologies for climate and development cobenefits.

    PubMed

    Freeman, Olivia E; Zerriffi, Hisham

    2014-12-16

    The opportunity to apply for carbon credits for cookstove projects creates a source of funding that can be leveraged to promote the "win-win" environmental and development benefits of improved cookstoves. Yet, as in most environment-development efforts, unacknowledged trade-offs exist beneath the all-encompassing "win-win" claims. This study therefore compares different scenarios for calculating cookstove carbon credits, comparing different types of stoves using different fuels, different methodologies, and theoretical scenarios that account for a range of climate-relevant emissions. The results of the study highlight: (1) the impacts of different assumptions made within carbon credit methodologies, (2) potential trade-offs in such projects, and (3) the considerations needed to genuinely promote sustainable development. The Gold Standard methodology was more comprehensive in its accounting and generally calculated more carbon credits per scenario than the Clean Development Mechanism methodology. Including black carbon in the calculations would better reflect climate-relevant stove emissions and would greatly increase the number of credits calculated. Because health and other development benefits are not inherently included in carbon credit calculations, achieving "win-win" outcomes requires deliberate decisions about project design to ensure that objectives are met and not simply assumed.
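
    The accounting point generalizes: credits are the CO2-equivalent difference between baseline and project stove emissions, so adding a black-carbon term with its own global warming potential can change the total markedly. The sketch below illustrates this with invented numbers; none of the emission factors or GWP values are taken from the paper or from any approved methodology.

```python
# A minimal sketch of the accounting question: CO2-equivalent credits as the
# baseline-minus-project emission difference, with and without a black-carbon
# (BC) term. Every emission factor and GWP value here is invented.
CO2_GWP = 1.0
BC_GWP = 900.0  # assumed 100-year GWP for black carbon (illustrative)

def annual_tco2e(fuel_kg: float, co2_per_kg: float, bc_per_kg: float,
                 include_bc: bool) -> float:
    """CO2e (tonnes/year) attributable to one stove's fuel use."""
    co2e_kg = fuel_kg * co2_per_kg * CO2_GWP
    if include_bc:
        co2e_kg += fuel_kg * bc_per_kg * BC_GWP
    return co2e_kg / 1000.0

def credits(baseline, project, include_bc=False):
    """Creditable reduction: baseline emissions minus project emissions."""
    return annual_tco2e(*baseline, include_bc) - annual_tco2e(*project, include_bc)

baseline = (2000.0, 1.6, 0.001)  # traditional stove: fuel kg/yr, CO2, BC factors
project = (900.0, 1.6, 0.0003)   # improved stove (invented values)
print(credits(baseline, project, include_bc=False))  # ~1.76 tCO2e/yr
print(credits(baseline, project, include_bc=True))   # ~3.32 tCO2e/yr
```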