Science.gov

Sample records for agile process model

  1. Planning and scheduling for agile manufacturers: The Pantex Process Model

    SciTech Connect

Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Turnquist, M.A.

    1998-02-01

Effective use of resources that are shared among multiple products or processes is critical for agile manufacturing. This paper describes the development and implementation of a computerized model to support production planning in a complex manufacturing system at the Pantex Plant, a US Department of Energy facility. The model integrates two different production processes (nuclear weapon disposal and stockpile evaluation) that use common facilities and personnel at the plant. The two production processes are characteristic of flow-shop and job-shop operations. The model reflects the interactions of scheduling constraints, material flow constraints, and the availability of required technicians and facilities. Operational results show significant productivity increases from use of the model.

  2. Are we unnecessarily constraining the agility of complex process-based models?

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo A.; Clark, Martyn P.; Barlage, Michael; Rajagopalan, Balaji; Samaniego, Luis; Abramowitz, Gab; Gupta, Hoshin

    2015-01-01

In this commentary we suggest that hydrologists and land-surface modelers may be unnecessarily constraining the behavioral agility of very complex physics-based models. We argue that the relatively poor performance of such models can occur due to restrictions on their ability to refine their portrayal of physical processes, in part because of strong a priori constraints in: (i) the representation of spatial variability and hydrologic connectivity, (ii) the choice of model parameterizations, and (iii) the choice of model parameter values. We provide a specific example of problems associated with strong a priori constraints on parameters in a land surface model. Moving forward, we assert that improving hydrological models requires integrating the strengths of the "physics-based" modeling philosophy (which relies on prior knowledge of hydrologic processes) with the strengths of the "conceptual" modeling philosophy (which relies on data-driven inference). Such integration will accelerate progress on methods to define and discriminate among competing modeling options, which should ideally be incorporated in agile modeling frameworks and tested through a diagnostic evaluation approach.

  3. Opening up the Agile Innovation Process

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Donnellan, Brian; Morgan, Lorraine; Wang, Xiaofeng

    The objective of this panel is to discuss how firms can operate both an open and agile innovation process. In an era of unprecedented changes, companies need to be open and agile in order to adapt rapidly and maximize their innovation processes. Proponents of agile methods claim that one of the main distinctions between agile methods and their traditional bureaucratic counterparts is their drive toward creativity and innovation. However, agile methods are rarely adopted in their textbook, "vanilla" format, and are usually adopted in part or are tailored or modified to suit the organization. While we are aware that this happens, there is still limited understanding of what is actually happening in practice. Using innovation adoption theory, this panel will discuss the issues and challenges surrounding the successful adoption of agile practices. In addition, this panel will report on the obstacles and benefits reported by over 20 industrial partners engaged in a pan-European research project into agile practices between 2006 and 2009.

  4. Agile

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2013-01-01

This talk is based on a previous talk on agile development. It describes methods for delivering software on a short cycle, including interactions with the customer, the effect on the team, and how to be more effective, streamlined, and efficient.

  5. Agility and mixed-model furniture production

    NASA Astrophysics Data System (ADS)

    Yao, Andrew C.

    2000-10-01

The manufacture of upholstered furniture provides an excellent opportunity to analyze the effect of a comprehensive communication system on classical production management functions. The objective of the research is to study scheduling heuristics that embrace the concepts inherent in MRP, JIT, and TQM while recognizing the need for agility in a somewhat complex and demanding environment. An on-line, real-time data capture system provides the status and location of production lots, components, and subassemblies for schedule control. Current inventory status of raw material and purchased items is required in order to develop and adhere to schedules. For the large variety of styles and fabrics customers may order, the communication system must provide timely, accurate, and comprehensive information for intelligent decisions with respect to product mix and production resources.

  6. RFID-Based Critical Path Expert System for Agility Manufacture Process Management

    NASA Astrophysics Data System (ADS)

    Cheng, Haifang; Xiang, Yuli

This paper presents a critical path expert system for agile manufacturing process management based on radio frequency identification (RFID) technology. The paper shows that agile manufacturing processes become visible and controllable with RFID: critical paths and activities can be readily identified and tracked through RFID tracing. The expert system can then relieve bottlenecks in the process by adjusting and reforming the critical paths. Finally, the paper gives a simple application example of the system, discussing how to adjust the critical paths and how to make the process more agile and flexible. With an RFID-based critical path expert system, agile manufacturing process management becomes more effective and efficient.
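
The core computation behind critical-path tracking can be sketched as a longest-path calculation over a task dependency graph; the tasks, durations, and dependencies below are invented examples, and an RFID-driven system would recompute them as tracked activities report actual progress.

```python
# Critical-path calculation for a small task network (hypothetical tasks).
def critical_path(durations, predecessors):
    """Return (total duration, critical tasks) for a task DAG."""
    earliest_finish = {}

    def finish(task):
        if task not in earliest_finish:
            start = max((finish(p) for p in predecessors[task]), default=0)
            earliest_finish[task] = start + durations[task]
        return earliest_finish[task]

    for task in durations:
        finish(task)
    total = max(earliest_finish.values())
    # Walk back from the last-finishing task to recover the critical path.
    path = []
    task = max(earliest_finish, key=earliest_finish.get)
    while task is not None:
        path.append(task)
        preds = predecessors[task]
        task = max(preds, key=earliest_finish.get) if preds else None
    return total, list(reversed(path))

durations = {"cut": 3, "weld": 5, "paint": 2, "assemble": 4}
predecessors = {"cut": [], "weld": ["cut"], "paint": ["cut"],
                "assemble": ["weld", "paint"]}
total, path = critical_path(durations, predecessors)
print(total, path)  # 12 ['cut', 'weld', 'assemble']
```

An expert system of the kind described would re-run this calculation whenever RFID reads change a task's actual duration, flagging the path whose delay propagates to the delivery date.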

  7. The NERV Methodology: Non-Functional Requirements Elicitation, Reasoning and Validation in Agile Processes

    ERIC Educational Resources Information Center

    Domah, Darshan

    2013-01-01

    Agile software development has become very popular around the world in recent years, with methods such as Scrum and Extreme Programming (XP). Literature suggests that functionality is the primary focus in Agile processes while non-functional requirements (NFR) are either ignored or ill-defined. However, for software to be of good quality both…

  8. Bringing Agility to Business Process Management: Rules Deployment in an SOA

    NASA Astrophysics Data System (ADS)

    El Kharbili, Marwane; Keil, Tobias

Business process management (BPM) has emerged as a paradigm for integrating business strategies and enterprise architecture (EA). In this context, implementing BPM on top of web-service-based service-oriented architectures is an accepted approach, as a great amount of literature shows. One concern in this regard is how to make business processes (BPs) reactive to change. Our approach to the problem is to integrate business rule management (BRM) and BPM by modeling decisions hard-coded in BPs as separate business rules (BRs). These BRs become EA assets and need to be exploited when executing BPs. We motivate why BPM needs agility and discuss what requirements this poses on BPM. This paper presents prototyping work conducted at a BP modeling and analysis vendor that seeks to showcase how using business rule management (BRM) as a means of modeling decisions can help achieve the much-sought-after agility in BPM. The prototype relies on the integrated modeling of business rules (BRs) and BPs, and on deploying rules as web services within an SOA.
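
The decision externalization described above can be sketched as follows: rules live outside the process orchestration and are evaluated at runtime, so they can be redeployed (e.g. as web services in an SOA) without changing the process. The rule set and order fields are invented for illustration.

```python
# Business rules kept separate from process orchestration (invented example).
RULES = [
    # (condition, decision) pairs, evaluated in order; first match wins.
    (lambda o: o["amount"] > 10_000,          "manual_approval"),
    (lambda o: o["customer_tier"] == "gold",  "auto_approve"),
    (lambda o: True,                          "standard_review"),
]

def decide(order):
    """Evaluate the externalized rule set against one process instance."""
    for condition, decision in RULES:
        if condition(order):
            return decision

def handle_order(order):
    # The process only orchestrates; the decision logic lives in RULES
    # and can be redeployed without touching this function.
    route = decide(order)
    return f"order routed to: {route}"

print(handle_order({"amount": 500, "customer_tier": "gold"}))
```

Changing the approval threshold then means redeploying RULES, not re-releasing the process definition, which is the agility the paper is after.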

  9. Information Models, Data Requirements, and Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

The Planetary Data System's next-generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements, or statements about what is necessary for the system, are collected and analyzed as input to the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during system development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  10. Modeling the Agility MLC in the Monaco treatment planning system.

    PubMed

    Snyder, Michael; Halford, Robert; Knill, Cory; Adams, Jeffrey N; Bossenberger, Todd; Nalichowski, Adrian; Hammoud, Ahmad; Burmeister, Jay

    2016-01-01

We investigate the relationship between the various parameters in the Monaco MLC model and dose calculation accuracy for an Elekta Agility MLC. The vendor-provided MLC modeling procedure - completed first with external vendor participation and then exclusively in-house - was used in combination with our own procedures to investigate several sets of MLC modeling parameters to determine their effect on dose distributions and point-dose measurements. Simple plans provided in the vendor procedure were used to elucidate specific mechanical characteristics of the MLC, while ten complex treatment plans - five IMRT and five VMAT - created using TG-119-based structure sets were used to test the clinical dosimetric effects of particular parameter choices. EDR2 film was used for the vendor fields to give high spatial resolution, while a combination of MapCHECK and ion chambers was used for the in-house TG-119-based procedures. The vendor-determined parameter set provided a reasonable starting point for the MLC model and largely delivered acceptable gamma pass rates for clinical plans, including a passing external evaluation using the IROC H&N phantom. However, the vendor model did not provide point-dose accuracy consistent with that seen in other treatment systems at our center. Through further internal testing it was found that many sets of MLC parameters, often at opposite ends of their allowable ranges, provided similar dosimetric characteristics and good agreement with planar and point-dose measurements. In particular, the leaf offset and tip leakage parameters compensated for one another if adjusted in opposite directions, which produced a level curve of acceptable parameter sets across all plans. Interestingly, gamma pass rates of the plans were less dependent upon parameter choices than point-dose measurements, suggesting that MLC modeling using only gamma evaluation may generally be insufficient. It was also found that exploring all
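
Gamma evaluation, mentioned above, combines a dose-difference and a distance-to-agreement criterion (e.g. 3%/3 mm) into a single pass/fail index per measurement point. The sketch below computes a simplified 1-D, globally normalized gamma index on synthetic profiles; clinical gamma analysis is 2-D/3-D and uses interpolation, so this only illustrates the idea.

```python
# Simplified 1-D gamma-index evaluation on synthetic dose profiles.
import math

def gamma_index(measured, planned, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    """Return the gamma value at each measured point (global dose norm)."""
    d_max = max(planned)
    gammas = []
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dp in enumerate(planned):
            dist = abs(i - j) * spacing_mm           # distance to agreement
            dose_diff = (dm - dp) / d_max            # global dose difference
            g = math.sqrt((dist / dist_mm) ** 2 + (dose_diff / dose_tol) ** 2)
            best = min(best, g)
        gammas.append(best)
    return gammas

planned = [0.0, 0.5, 1.0, 1.0, 0.5, 0.0]
measured = [0.0, 0.52, 1.01, 0.99, 0.5, 0.0]
gammas = gamma_index(measured, planned, spacing_mm=2.0)
pass_rate = sum(g <= 1.0 for g in gammas) / len(gammas)
print(round(pass_rate, 2))  # 1.0 (all points pass 3%/3 mm)
```

Note the paper's point: two quite different MLC parameter sets can both yield pass rates like this while differing in point dose, which is why gamma alone can under-constrain the model.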

  11. A process for the agile product realization of electro-mechanical devices

    SciTech Connect

    Forsythe, C.; Ashby, M.R.; Benavides, G.L.; Diegert, K.V.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-09-01

This paper describes a product realization process developed and demonstrated at Sandia by the A-PRIMED (Agile Product Realization for Innovative Electro MEchanical Devices) project that integrates many of the key components of "agile manufacturing" into a complete, design-to-production process. Evidence indicates that the process has reduced the product realization cycle and assured product quality. Products included discriminators for a robotic quick-change adapter and for an electronic defense system. These discriminators, built using A-PRIMED, met random vibration requirements and had life cycles that far surpassed the performance obtained from earlier efforts.

  12. Impact of Agile Software Development Model on Software Maintainability

    ERIC Educational Resources Information Center

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  13. Unsteady aerodynamic models for agile flight at low Reynolds numbers

    NASA Astrophysics Data System (ADS)

    Brunton, Steven L.

This work develops low-order models for the unsteady aerodynamic forces on a wing in response to agile maneuvers at low Reynolds number. Model performance is assessed on the basis of accuracy across a range of parameters and frequencies, as well as computational efficiency and compatibility with existing control techniques and flight dynamic models. The result is a flexible modeling procedure that yields accurate, low-dimensional, state-space models. The modeling procedures are developed and tested on direct numerical simulations of a two-dimensional flat-plate airfoil in motion at low Reynolds number, Re=100, and on a wind tunnel experiment at the Illinois Institute of Technology involving a NACA 0006 airfoil pitching and plunging at Reynolds number Re=65,000. In both instances, low-order models are obtained that accurately capture the unsteady aerodynamic forces at all frequencies. These cases demonstrate the utility of the modeling procedure developed in this thesis for obtaining accurate models for different geometries and Reynolds numbers. Linear reduced-order models are constructed from either the indicial response (step response) or realistic input/output maneuvers using a flexible modeling procedure. The method is based on identifying stability derivatives and modeling the remaining dynamics with the eigensystem realization algorithm. A hierarchy of models is developed, based on linearizing the flow at various operating conditions. These models are shown to be accurate and efficient for plunging, pitching about various points, and combined pitch and plunge maneuvers, at various angles of attack and Reynolds numbers. Models are compared against the classical unsteady aerodynamic models of Wagner and Theodorsen over a large range of Strouhal numbers and reduced frequencies for a baseline comparison. Additionally, state-space representations are developed for Wagner's and Theodorsen's models, making them compatible with modern control-system analysis. A number of
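
The state-space models described above take the standard linear form x' = Ax + Bu, y = Cx + Du, with u a maneuver input (e.g. pitch rate) and y an aerodynamic force coefficient. The sketch below simulates an indicial (step) response for a small placeholder system; the matrices are invented for illustration, not fitted aerodynamic values.

```python
# Indicial (step) response of a toy linear state-space model.
def simulate_step(A, B, C, D, steps, dt=0.01):
    """Forward-Euler simulation of y(t) for a unit step input u = 1."""
    n = len(A)
    x = [0.0] * n
    outputs = []
    for _ in range(steps):
        y = sum(C[i] * x[i] for i in range(n)) + D
        outputs.append(y)
        x = [x[i] + dt * (sum(A[i][j] * x[j] for j in range(n)) + B[i])
             for i in range(n)]
    return outputs

# A stable two-state placeholder model (eigenvalues -2 and -5).
A = [[-2.0, 0.0], [0.0, -5.0]]
B = [1.0, 1.0]
C = [0.5, 0.5]
D = 0.1
y = simulate_step(A, B, C, D, steps=2000)
# The step response starts at the feedthrough D and settles toward the
# DC gain C (-A)^-1 B + D = 0.5*(1/2) + 0.5*(1/5) + 0.1 = 0.45.
print(round(y[0], 3), round(y[-1], 3))  # 0.1 0.45
```

In the thesis's framework, D plays the role of the quasi-steady stability derivative and the decaying states capture the remaining unsteady dynamics identified by the eigensystem realization algorithm.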

  14. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated, to improve efficiency and reduce redundancy; (3) update the legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
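
Goal (1) above, tracking a data set from acceptance to publication, can be sketched as a small workflow state machine; the states and transitions here are illustrative guesses, not the ORNL DAAC's actual workflow.

```python
# Toy state machine tracking a data set from acceptance to publication.
# States and transitions are invented for illustration.
TRANSITIONS = {
    "accepted":             ["format_check"],
    "format_check":         ["documentation_review", "returned_to_provider"],
    "returned_to_provider": ["format_check"],
    "documentation_review": ["archived"],
    "archived":             ["published"],
    "published":            [],
}

class DatasetRecord:
    def __init__(self, name):
        self.name = name
        self.state = "accepted"
        self.history = ["accepted"]

    def advance(self, new_state):
        # Only transitions declared in the workflow are allowed.
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"cannot move from {self.state} to {new_state}")
        self.state = new_state
        self.history.append(new_state)

ds = DatasetRecord("example_dataset")
for step in ["format_check", "documentation_review", "archived", "published"]:
    ds.advance(step)
print(ds.state)  # published
```

A centralized tracker of this kind is what lets the ingest team see every submission's stage at a glance and automate the steps that can be automated.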

  15. Developing a model for agile supply: an empirical study from Iranian pharmaceutical supply chain.

    PubMed

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

Agility is the fundamental characteristic a supply chain needs to survive in turbulent markets, where environmental forces create additional uncertainty and thus higher risk in supply chain management. Agility also helps provide the right product, at the right time, to the consumer. The main goal of this research is therefore to improve supplier selection in the pharmaceutical industry according to the formative basic factors, so that a company can configure its supply network to achieve an agile supply chain. The article analyzes the supply side of the supply chain based on the SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). The methodology provides an analytical model in which potential suppliers are assessed against multiple criteria using both quantitative and qualitative measures. In addition, the TOPSIS algorithm, a common multi-attribute decision-making (MADM) technique, is used to prioritize the critical factors. Finally, several factors, such as delivery speed, planning and reorder segmentation, trust development, and material quantity adjustment, are identified and prioritized as critical factors for agile supply of the API. PMID:24250689
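
The TOPSIS step described above can be sketched compactly: alternatives are scored on each criterion, the columns are vector-normalized and weighted, and each alternative is ranked by its relative closeness to an ideal point. The suppliers, criteria, weights, and scores below are invented for illustration.

```python
# Compact TOPSIS ranking of candidate suppliers (invented data).
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j] = score of alternative i on criterion j."""
    n_alt, n_crit = len(matrix), len(weights)
    # Vector-normalize each column, then apply the criterion weight.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # Ideal and anti-ideal points per criterion (benefit vs cost criteria).
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # closeness to the ideal
    return scores

# Three hypothetical API suppliers scored on delivery speed (benefit),
# quality (benefit), and cost (lower is better).
matrix = [[7, 9, 4], [9, 7, 6], [6, 6, 3]]
scores = topsis(matrix, weights=[0.4, 0.4, 0.2], benefit=[True, True, False])
best = scores.index(max(scores))
print(best)  # 0
```

The same closeness scores can also rank criteria rather than suppliers, which is how the paper uses TOPSIS to prioritize critical agility factors.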

  16. Developing a Model for Agile Supply: an Empirical Study from Iranian Pharmaceutical Supply Chain

    PubMed Central

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

Agility is the fundamental characteristic a supply chain needs to survive in turbulent markets, where environmental forces create additional uncertainty and thus higher risk in supply chain management. Agility also helps provide the right product, at the right time, to the consumer. The main goal of this research is therefore to improve supplier selection in the pharmaceutical industry according to the formative basic factors, so that a company can configure its supply network to achieve an agile supply chain. The article analyzes the supply side of the supply chain based on the SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). The methodology provides an analytical model in which potential suppliers are assessed against multiple criteria using both quantitative and qualitative measures. In addition, the TOPSIS algorithm, a common multi-attribute decision-making (MADM) technique, is used to prioritize the critical factors. Finally, several factors, such as delivery speed, planning and reorder segmentation, trust development, and material quantity adjustment, are identified and prioritized as critical factors for agile supply of the API. PMID:24250689

  17. Agile Development Processes: Delivering a Successful Data Management Platform Now and in the Future

    NASA Astrophysics Data System (ADS)

    Deaubl, E.; Lowry, S.

    2007-10-01

Developing a flexible, extensible architecture for scientific data archival and management is a monumental task under older, big-design-up-front methodologies. We will describe how we are using agile development techniques in our service-oriented architecture (SOA)-based platform to integrate astronomer and operator input into the development process, deliver functional software earlier, and ensure that the software is maintainable and extensible in the future.

  18. Beam modeling and VMAT performance with the Agility 160-leaf multileaf collimator.

    PubMed

    Bedford, James L; Thomas, Michael D R; Smyth, Gregory

    2013-01-01

The Agility multileaf collimator (Elekta AB, Stockholm, Sweden) has 160 leaves of projected width 0.5 cm at the isocenter, with a maximum leaf speed of 3.5 cm/s. These characteristics promise to facilitate fast and accurate delivery of radiotherapy, particularly volumetric-modulated arc therapy (VMAT). The aim of this study is therefore to create a beam model for the Pinnacle3 treatment planning system (Philips Radiation Oncology Systems, Fitchburg, WI), and to use this beam model to explore the performance of the Agility MLC in the delivery of VMAT. A 6 MV beam model was created and verified by measuring doses under irregularly shaped fields. VMAT treatment plans for five typical head-and-neck patients were created using the beam model and delivered using both binned and continuously variable dose rate (CVDR). Results were compared with those for an MLCi unit without CVDR. The beam model has parameters similar to those of an MLCi model, with interleaf leakage of only 0.2%. Verification of irregular fields shows a mean agreement between measured and planned dose of 1.3% (planned dose higher). The Agility VMAT head-and-neck plans show plan quality and delivery accuracy equivalent to those of an MLCi unit, with 95% of verification measurements within 3% and 3 mm of planned dose. Mean delivery time is 133 s with the Agility head and CVDR, 171 s without CVDR, and 282 s with an MLCi unit. Pinnacle3 has therefore been shown to model the Agility MLC accurately, and to provide accurate VMAT treatment plans which can be delivered significantly faster with Agility than with an MLCi. PMID:23470941

  19. Developments in Agile Manufacturing

    SciTech Connect

    Clinesmith, M.G.

    1993-09-01

As part of a project design initiative, Sandia National Laboratories and AlliedSignal Inc. Kansas City Division have joined efforts to develop a concurrent engineering capability for the manufacturing of complex precision components. The primary effort of this project, called Agile Manufacturing, is directed toward: (1) understanding the error associated with manufacturing and inspection; (2) developing methods for correcting error; and (3) integrating diverse software technologies into a compatible process. The Agile Manufacturing System (AMS) integrates product design, manufacturing, and inspection into a closed-loop, concurrent engineering process. The goals of developing the Agile Manufacturing System are to: (1) optimize accuracy in manufacturing and inspection, through (a) use of softgage software for product evaluation, ensuring ANSI Y14.5 compliance, (b) establishing and monitoring bias between the CMM and the machine center, (c) mapping probe-deflection error and applying corrections to inspection results, for both on-machine probing and CMM inspections, and (d) the inspection process itself; (2) compress the cycle time from product concept to production-level manufacturing and verification; (3) create a self-correcting process that feeds inspection results back into the machining process; and (4) link subordinate processes (cutting/probing path, softgage model, etc.) to the solid model definition.

  20. Agile IT: Thinking in User-Centric Models

    NASA Astrophysics Data System (ADS)

    Margaria, Tiziana; Steffen, Bernhard

We advocate a new teaching direction for modern CS curricula: extreme model-driven development (XMDD), a new development paradigm designed to continuously involve the customer/application expert throughout the whole system life cycle. Based on the "One-Thing Approach", which works by successively enriching and refining one single artifact, system development becomes in essence a user-centric orchestration of intuitive service functionality. XMDD differs radically from classical software development, which, in our opinion, is no longer adequate for the bulk of application programming - in particular when it comes to heterogeneous, cross-organizational systems that must adapt to rapidly changing market requirements. Thus there is a need for new curricula addressing this model-driven, lightweight, and cooperative development paradigm, which puts the user process at the center of development and the application expert in control of the process evolution.

  1. Modeling and Developing the Information System for the SuperAGILE Experiment

    NASA Astrophysics Data System (ADS)

    Lazzarotto, F.; Costa, E.; del Monte, E.; Feroci, M.

    2004-07-01

We will present a formal description of the SuperAGILE (SA) detection system data, the relationships among the data, and the operations applied to them, with the aid of tools such as Entity-Relationship (E-R) and UML diagrams. We have implemented functions for reception, pre-processing, archiving, and analysis of SA data using object-oriented and SQL open-source software tools.
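
As a toy illustration of the kind of relational design such E-R modeling produces, the sketch below archives detector events extracted from telemetry packets and runs one analysis query. The table and column names are invented for illustration, not the actual SuperAGILE schema.

```python
# Minimal relational sketch: telemetry packets and their extracted events.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE packet (
    packet_id  INTEGER PRIMARY KEY,
    received   TEXT NOT NULL          -- reception timestamp
);
CREATE TABLE event (
    event_id   INTEGER PRIMARY KEY,
    packet_id  INTEGER NOT NULL REFERENCES packet(packet_id),
    channel    INTEGER NOT NULL,      -- detector channel (invented field)
    energy     REAL NOT NULL
);
""")
conn.execute("INSERT INTO packet VALUES (1, '2004-07-01T12:00:00')")
conn.executemany("INSERT INTO event VALUES (?, ?, ?, ?)",
                 [(1, 1, 42, 25.3), (2, 1, 43, 30.1)])
# Analysis step: count events per archived packet.
row = conn.execute("""
    SELECT p.packet_id, COUNT(*) FROM packet p
    JOIN event e ON e.packet_id = p.packet_id
    GROUP BY p.packet_id
""").fetchone()
print(row)  # (1, 2)
```

The one-to-many packet/event relationship is exactly what an E-R diagram of the reception and archiving pipeline would capture before it is turned into SQL.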

  2. Micro-milling process improvement using an agile pulse-shaping fiber laser

    NASA Astrophysics Data System (ADS)

    Gay, David; Cournoyer, Alain; Deladurantaye, Pascal; Briand, Martin; Roy, Vincent; Labranche, Bruno; Levesque, Marc; Taillon, Y.

    2009-06-01

We demonstrate the usefulness of INO's pulse-shaping fiber laser platform for rapidly developing complex laser micromachining processes. The versatility of such laser sources allows straightforward control of the emitted energy envelope on the nanosecond timescale, to create multi-amplitude-level pulses and/or multi-pulse regimes. The pulses are amplified in an amplifier chain in a MOPA configuration that delivers output energy per pulse up to 60 μJ at 1064 nm at a repetition rate of 200 kHz, with excellent beam quality (M2 < 1.1) and narrow line widths suitable for efficient frequency conversion. Their pulse-on-demand and pulse-to-pulse shape selection capability at high repetition rates also makes these agile laser sources suitable for implementing high-throughput complex laser processing. Micro-milling experiments were carried out on two metals, aluminum and stainless steel, having very different thermal properties. For aluminum, our results show that material removal efficiency depends strongly on the pulse shape, especially near the ablation threshold, and can be maximized to develop efficient laser micro-milling processes. However, material removal efficiency is not always correlated with good surface quality; the roughness of the milled surface can be improved by removing a few layers of material using a different type of pulse shape. The agility of INO's fiber laser enables a fast two-step laser process that employs different pulse characteristics to maximize the material removal rate while obtaining good surface quality. A comparison of material removal efficiency for stainless steel, well known to be difficult to mill on the micron scale, is also presented.

  3. A process for the agile product realization of electro-mechanical devices

    SciTech Connect

    Forsythe, C.; Diegert, K.V.; Ashby, M.R.; Parratt, S.W.; Benavides, G.L.; Jones, R.E.; Longcope, D.B.

    1995-08-01

This paper describes a product realization process developed at Sandia National Laboratories by the A-PRIMED project that integrates many of the key components of "agile manufacturing" into a complete, step-by-step, design-to-production process. For three separate product realization efforts, each geared to a different set of requirements, A-PRIMED demonstrated product realization of a custom device in less than a month. A-PRIMED used a discriminator (a precision electro-mechanical device) as the demonstration device, but the process is readily adaptable to other electro-mechanical products. The process begins with a qualified design parameter space. From that point, the product realization process encompasses all facets of requirements development, analysis and testing, design, manufacturing, robotic assembly, and quality assurance, as well as product data management and concurrent engineering. In developing the product realization process, A-PRIMED employed an iterative approach whereby after each of three builds, the process was reviewed and refinements were made on the basis of lessons learned. This paper describes the integration of project functions and product realization technologies, with references to reports detailing specific facets of the overall process. The process described herein represents the outcome of an empirically based process development effort that, on repeated iterations, was proven successful.

  4. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  5. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  6. A process for the agile product realization of electromechanical devices (A-primed)

    SciTech Connect

    Forsythe, C.; Ashby, M.R.; Benavides, G.L.; Diegert, K.V.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1996-02-01

    This paper describes a product realization process developed at Sandia National Laboratories by the A-PRIMED project that integrates many of the key components of "agile manufacturing" (Nagel & Dove, 1992) into a complete, step-by-step, design-to-production process. For two separate product realization efforts, each geared to a different set of requirements, A-PRIMED demonstrated product realization of a custom device in less than a month. A-PRIMED used a discriminator (a precision electromechanical device) as the demonstration device, but the process is readily adaptable to other electromechanical products. The process begins with a qualified design parameter space (Diegert et al., 1995). From that point, the product realization process encompasses all facets of requirements development, analysis and testing, design, manufacturing, robot assembly and quality assurance, as well as product data management and concurrent engineering. In developing the product realization process, A-PRIMED employed an iterative approach whereby after each build, the process was reviewed and refinements were made on the basis of lessons learned. This paper describes the integration of project functions and product realization technologies to develop a product realization process that, over repeated iterations, proved successful.

  7. Development of an agility assessment module for preliminary fighter design

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Bauer, Brent; Biezad, Daniel; Hahn, Andrew

    1996-01-01

    A FORTRAN computer program is presented to perform agility analysis on fighter aircraft configurations. This code is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of the agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. FORTRAN programs were developed for two specific metrics, CCT (Combat Cycle Time) and PM (Pointing Margin), as part of the agility module. The validity of the code was evaluated by comparing its results with existing flight test data. Example trade studies using the agility module along with ACSYNT were conducted using Northrop F-20 Tigershark and McDonnell Douglas F/A-18 Hornet aircraft models. The sensitivity of the agility criteria to thrust loading and wing loading was investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements.
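    The abstract names the metrics but not their formulas. As a rough, hypothetical illustration of how a Combat Cycle Time-style metric could be computed from configuration parameters, the Python sketch below combines the standard coordinated-turn rate relation with invented roll-in/roll-out segments; it is a sketch under stated assumptions, not the ACSYNT implementation.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def turn_rate(v, n):
    """Sustained level-turn rate (rad/s) at speed v (m/s) and load factor n."""
    return G * math.sqrt(n ** 2 - 1.0) / v

def combat_cycle_time(v, n, heading_change_deg, roll_rate_deg_s, bank_deg):
    """Hypothetical CCT: time to roll into the bank, turn through the commanded
    heading change, and roll back out (unload/re-acceleration segments omitted)."""
    t_roll = 2.0 * bank_deg / roll_rate_deg_s            # roll in + roll out
    t_turn = math.radians(heading_change_deg) / turn_rate(v, n)
    return t_roll + t_turn

# Example: 180 deg heading reversal at 150 m/s, 7 g, 90 deg/s roll rate, 80 deg bank
t = combat_cycle_time(150.0, 7.0, 180.0, 90.0, 80.0)
```

A trade study in this style would sweep thrust loading and wing loading, map each point to achievable `v` and `n`, and compare the resulting cycle times between configurations.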

  8. Toward agile control of a flexible-spine model for quadruped bounding

    NASA Astrophysics Data System (ADS)

    Byl, Katie; Satzinger, Brian; Strizic, Tom; Terry, Pat; Pusey, Jason

    2015-05-01

    Legged systems should exploit non-steady gaits both for improved recovery from unexpected perturbations and also to enlarge the set of reachable states toward negotiating a range of known upcoming terrain obstacles. We present a 4-link planar, bounding, quadruped model with compliance in its legs and spine and describe design of an intuitive and effective low-level gait controller. We extend our previous work on meshing hybrid dynamic systems and demonstrate that our control strategy results in stable gaits with meshable, low-dimension step-to-step variability. This meshability is a first step toward enabling switching control, to increase stability after perturbations compared with any single gait control, and we describe how this framework can also be used to find the set of n-step reachable states. Finally, we propose new guidelines for quantifying "agility" for legged robots, providing a preliminary framework for quantifying and improving performance of legged systems.

  9. Attitude Estimation for Unresolved Agile Space Objects with Shape Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Holzinger, M.; Alfriend, K. T.; Wetterer, C. J.; Luu, K. K.; Sabol, C.; Hamada, K.; Harms, A.

    2012-09-01

    The increasing number of manufactured on-orbit objects as well as improving sensor capabilities indicate that the number of trackable objects will likely exceed 100,000 within the next several years. Characterizing the large population of non-spatially resolved active spacecraft, retired spacecraft, rocket bodies, debris, and High Area to Mass Ratio (HAMR) objects necessarily involves both attitude and shape estimation. While spatially unresolved space objects cannot be directly imaged, attitude and shape may be inferred by carefully examining their lightcurves. Lightcurves are temporally-resolved sequences of photometric intensity measurements over one or more bandwidths. Because the observable reflected light from an unresolved space object is a strong function of both its shape and attitude, estimating these parameters using lightcurves can provide an avenue to determine both space object attitude and shape. This problem is traditionally called `lightcurve inversion.' While lightcurves have been used for 25 years to characterize spin states and shapes of asteroids, estimating the attitude states and shapes of manufactured space objects involves a new set of challenges. New challenges addressed in this paper are 1) An active (agile) space object is often directly controlling its attitude, meaning that torques acting on the space object are not necessarily zero (non-homogeneous motion) and mass properties may not be known, 2) Shape models must often be estimated, and as such contain errors that need to be accounted for in the measurement function, 3) Dynamics and measurement functions are excessively nonlinear, and manufactured space objects may be quite symmetric about at least one axis of rotation/reflection. This can lead to multiple possible attitude estimate solutions and suggests the use of non-Gaussian estimation approaches. Agile space objects (those that can actively maneuver) pose new problems to lightcurve inversion efforts to estimate attitude. 
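    Lightcurve inversion with non-Gaussian estimation, as suggested above, is often approached with particle filters, which tolerate the multi-modal posteriors that symmetric shapes produce. The sketch below is a toy one-axis illustration with an assumed flat-facet brightness model; the shape model, noise levels, and dynamics are stand-ins, not the authors' method.

```python
import math
import random

def predicted_brightness(theta):
    """Toy measurement model: reflected flux from an assumed flat-facet shape.
    A real shape model would replace this (and carry its own uncertainty)."""
    return abs(math.cos(theta))

def particle_filter_step(particles, omega, dt, z, noise_std=0.05):
    """One predict/update cycle: propagate the spin state, weight each particle
    by the lightcurve sample z, then resample. Multiple attitude hypotheses
    (from shape symmetry) survive naturally in the particle set."""
    # Predict: rotate each particle, with small process noise on the angle
    particles = [(th + omega * dt + random.gauss(0.0, 0.01)) % (2 * math.pi)
                 for th in particles]
    # Update: Gaussian likelihood of the photometric measurement
    weights = [math.exp(-0.5 * ((z - predicted_brightness(th)) / noise_std) ** 2)
               for th in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample proportionally to weight
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(1)
true_theta, omega, dt = 0.3, 0.2, 1.0
particles = [random.uniform(0, 2 * math.pi) for _ in range(500)]
for _ in range(30):
    true_theta = (true_theta + omega * dt) % (2 * math.pi)
    z = predicted_brightness(true_theta) + random.gauss(0.0, 0.02)
    particles = particle_filter_step(particles, omega, dt, z)
```

After the updates, the particle cloud concentrates on the attitudes consistent with the observed brightness history; the residual multi-modality is exactly the symmetry ambiguity the abstract describes.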

  10. The Telemetry Agile Manufacturing Effort

    SciTech Connect

    Brown, K.D.

    1995-01-01

    The Telemetry Agile Manufacturing Effort (TAME) is an agile enterprising demonstration sponsored by the US Department of Energy (DOE). The project experimented with new approaches to product realization and assessed their impacts on performance, cost, flow time, and agility. The purpose of the project was to design the electrical and mechanical features of an integrated telemetry processor, establish the manufacturing processes, and produce an initial production lot of two to six units. This paper outlines the major methodologies utilized by the TAME, describes the accomplishments that can be attributed to each methodology, and finally, examines the lessons learned and explores the opportunities for improvement associated with the overall effort. The areas for improvement are discussed relative to an ideal vision of the future for agile enterprises. By the end of the experiment, the TAME reduced production flow time by approximately 50% and life cycle cost by more than 30%. Product performance was improved compared with conventional DOE production approaches.

  11. Agile manufacturing prototyping system (AMPS)

    SciTech Connect

    Garcia, P.

    1998-05-09

    The Agile Manufacturing Prototyping System (AMPS) is being integrated at Sandia National Laboratories. AMPS consists of state-of-the-industry flexible manufacturing hardware and software enhanced with Sandia advancements in sensor and model based control; automated programming, assembly and task planning; flexible fixturing; and automated reconfiguration technology. AMPS is focused on the agile production of complex electromechanical parts. It currently includes 7 robots (4 Adept One, 2 Adept 505, 1 Staubli RX90), conveyance equipment, and a collection of process equipment to form a flexible production line capable of assembling a wide range of electromechanical products. This system became operational in September 1995. Additional smart manufacturing processes will be integrated in the future. An automated spray cleaning workcell capable of handling alcohol and similar solvents was added in 1996 as well as parts cleaning and encapsulation equipment, automated deburring, and automated vision inspection stations. Plans for 1997 and out years include adding manufacturing processes for the rapid prototyping of electronic components such as soldering, paste dispensing and pick-and-place hardware.

  12. Human factors in agile manufacturing

    SciTech Connect

    Forsythe, C.

    1995-03-01

    As industries position themselves for the competitive markets of today, and the increasingly competitive global markets of the 21st century, agility, or the ability to rapidly develop and produce new products, represents a common trend. Agility manifests itself in many different forms, with the agile manufacturing paradigm proposed by the Iacocca Institute offering a generally accepted, long-term vision. In its many forms, common elements of agility or agile manufacturing include: changes in business, engineering and production practices, seamless information flow from design through production, integration of computer and information technologies into all facets of the product development and production process, application of communications technologies to enable collaborative work between geographically dispersed product development team members and introduction of flexible automation of production processes. Industry has rarely experienced as dramatic an infusion of new technologies or as extensive a change in culture and work practices. Human factors will not only play a vital role in accomplishing the technical and social objectives of agile manufacturing, but also has an opportunity to participate in shaping the evolution of industry paradigms for the 21st century.

  13. Utilization of an agility assessment module in analysis and optimization of preliminary fighter configuration

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Biezad, Daniel

    1996-01-01

    A study has been conducted to develop and to analyze a FORTRAN computer code for performing agility analysis on fighter aircraft configurations. This program is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of the agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. The validity of the existing code was evaluated by comparing its results with existing flight test data. A FORTRAN program was developed for a specific metric, PM (Pointing Margin), as part of the agility module. Example trade studies using the agility module along with ACSYNT were conducted using a McDonnell Douglas F/A-18 Hornet aircraft model. The sensitivity of the agility criteria to thrust loading, wing loading, and thrust vectoring was investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements in the preliminary design phase.

  14. AGILE Data Center and AGILE science highlights

    NASA Astrophysics Data System (ADS)

    Pittori, C.

    2013-06-01

    AGILE is a scientific mission of the Italian Space Agency (ASI), with INFN, INAF and CIFS participation, devoted to gamma-ray astrophysics. The satellite has been in orbit since April 23rd, 2007. Gamma-ray astrophysics above 100 MeV is an exciting field of astronomical sciences that has received a strong impulse in recent years. Despite its small size and budget, AGILE produced several important scientific results, among which the unexpected discovery of strong and rapid gamma-ray flares from the Crab Nebula. This discovery won the AGILE PI and the AGILE Team the prestigious Bruno Rossi Prize for 2012, an international recognition in the field of high energy astrophysics. We present here the main activities of the AGILE Data Center, and we give an overview of the AGILE scientific highlights after 5 years of operations.

  15. CT-assisted agile manufacturing

    NASA Astrophysics Data System (ADS)

    Stanley, James H.; Yancey, Robert N.

    1996-11-01

    The next century will witness at least two great revolutions in the way goods are produced. First, workers will use the medium of virtual reality in all aspects of marketing, research, development, prototyping, manufacturing, sales and service. Second, market forces will drive manufacturing towards small-lot production and just-in-time delivery. Already, we can discern the merging of these megatrends into what some are calling agile manufacturing. Under this new paradigm, parts and processes will be designed and engineered within the mind of a computer, tooled and manufactured by the offspring of today's rapid prototyping equipment, and evaluated for performance and reliability by advanced nondestructive evaluation (NDE) techniques and sophisticated computational models. Computed tomography (CT) is the premier example of an NDE method suitable for future agile manufacturing activities. It is the only modality that provides convenient access to the full suite of engineering data that users will need to avail themselves of computer-aided design, computer-aided manufacturing, and computer-aided engineering capabilities, as well as newly emerging reverse engineering, rapid prototyping and solid freeform fabrication technologies. As such, CT is assured a central, utilitarian role in future industrial operations. An overview of this exciting future for industrial CT is presented.

  16. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    SciTech Connect

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  17. Aircraft agility maneuvers

    NASA Technical Reports Server (NTRS)

    Cliff, Eugene M.; Thompson, Brian G.

    1992-01-01

    A new dynamic model for aircraft motions is presented. This model can be viewed as intermediate between a point-mass model, in which the body attitude angles are control-like, and a rigid-body model, in which the body-attitude angles evolve according to Newton's Laws. Specifically, consideration is given to the case of symmetric flight, and a model is constructed in which the body roll-rate and the body pitch-rate are the controls. In terms of this body-rate model a minimum-time heading change maneuver is formulated. When the bounds on the body-rates are large the results are similar to the point-mass model in that the model can very quickly change the applied forces and produce an acceleration to turn the vehicle. With finite bounds on these rates, the forces change in a smooth way. This leads to a measurable effect of agility.
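    A minimal numerical sketch of the body-rate model idea described above: heading evolves through the coordinated-turn relation while the bank angle is driven by a bounded body roll rate. All parameter values are invented for illustration; with a generous roll-rate bound the model approaches point-mass behavior, while a tight bound forces the applied forces to change smoothly, which is the measurable agility effect the abstract notes.

```python
import math

G = 9.81  # m/s^2

def heading_change_time(target_deg, v, n_max, roll_rate_max_deg_s, dt=0.01):
    """Integrate a simplified symmetric-flight body-rate model: the control is
    a bounded roll rate, and heading rate follows the resulting bank angle.
    Returns the time to achieve the commanded heading change."""
    bank_cmd = math.acos(1.0 / n_max)       # bank angle for max load factor
    p_max = math.radians(roll_rate_max_deg_s)
    target = math.radians(target_deg)
    bank, psi, t = 0.0, 0.0, 0.0
    while psi < target:
        bank = min(bank + p_max * dt, bank_cmd)   # roll in at the bounded rate
        psi += (G / v) * math.tan(bank) * dt      # coordinated-turn heading rate
        t += dt
        if t > 120.0:                             # safety cutoff
            break
    return t

fast = heading_change_time(90.0, 150.0, 6.0, 200.0)  # generous roll-rate bound
slow = heading_change_time(90.0, 150.0, 6.0, 20.0)   # tight roll-rate bound
```

The tight-bound case takes measurably longer, mirroring the paper's point that finite body-rate bounds smooth the force buildup and degrade turn agility.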

  18. Some Findings Concerning Requirements in Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases in the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; in some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this area is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and management of requirements dependencies.

  19. Implementing Kanban for agile process management within the ALMA Software Operations Group

    NASA Astrophysics Data System (ADS)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

    After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Due to their different stakeholders, each of these tasks presents a wide diversity of importances, lifespans and complexities. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology in our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.
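    The balancing mechanism described above - matching demand on the team against throughput - is typically realized in Kanban through work-in-progress (WIP) limits. A minimal sketch, with column names, limits, and task names invented for illustration (not ALMA's actual configuration):

```python
class KanbanBoard:
    """Minimal Kanban board: a task may only be pulled into a column
    if that column is under its work-in-progress (WIP) limit."""

    def __init__(self, wip_limits):
        self.wip_limits = dict(wip_limits)
        self.columns = {name: [] for name in wip_limits}

    def pull(self, task, column):
        """Pull a task into `column`; refuse if the WIP limit is reached."""
        if len(self.columns[column]) >= self.wip_limits[column]:
            return False  # demand exceeds capacity: the task waits upstream
        self.columns[column].append(task)
        return True

    def move(self, task, src, dst):
        """Move a task between columns, honoring the destination's WIP limit."""
        if task in self.columns[src] and self.pull(task, dst):
            self.columns[src].remove(task)
            return True
        return False

# Illustrative board: only two tasks may be in progress at once
board = KanbanBoard({"backlog": 100, "in_progress": 2, "done": 100})
board.pull("fix-antenna-monitor", "backlog")
board.pull("automate-report", "backlog")
board.pull("triage-alarms", "backlog")
board.move("fix-antenna-monitor", "backlog", "in_progress")
board.move("automate-report", "backlog", "in_progress")
blocked = board.move("triage-alarms", "backlog", "in_progress")  # WIP limit hit
```

The refusal in the last line is the point: the third task stays visible and prioritized in the backlog instead of silently stressing the engineers, which is the throughput-balancing behavior the authors credit to Kanban.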

  20. Agile manufacturing from a statistical perspective

    SciTech Connect

    Easterling, R.G.

    1995-10-01

    The objective of agile manufacturing is to provide the ability to quickly realize high-quality, highly-customized, in-demand products at a cost commensurate with mass production. More broadly, agility in manufacturing, or any other endeavor, is defined as change-proficiency; the ability to thrive in an environment of unpredictable change. This report discusses the general direction of the agile manufacturing initiative, including research programs at the National Institute of Standards and Technology (NIST), the Department of Energy, and other government agencies, but focuses on agile manufacturing from a statistical perspective. The role of statistics can be important because agile manufacturing requires the collection and communication of process characterization and capability information, much of which will be data-based. The statistical community should initiate collaborative work in this important area.
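    The process characterization and capability information mentioned above is commonly summarized with capability indices such as Cp and Cpk; the computation below is the standard one, with simulated measurement data standing in for real process data:

```python
import statistics

def process_capability(samples, lsl, usl):
    """Cp: spec width vs process spread; Cpk: Cp penalized for an off-center
    mean. Cpk >= 1.33 is a common benchmark for a capable process."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)          # sample standard deviation
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk

# Simulated measurements of a machined dimension (mm), spec limits 9.90-10.10 mm
data = [10.02, 9.98, 10.01, 10.03, 9.99, 10.00, 10.02, 9.97, 10.01, 10.00]
cp, cpk = process_capability(data, lsl=9.90, usl=10.10)
```

Communicating indices like these across an agile supply chain is exactly the data-based process-capability exchange the report calls for.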

  1. Social Protocols for Agile Virtual Teams

    NASA Astrophysics Data System (ADS)

    Picard, Willy

    Despite many works on collaborative networked organizations (CNOs), CSCW, groupware, workflow systems and social networks, computer support for virtual teams is still insufficient, especially support for agility, i.e. the capability of virtual team members to rapidly and cost-efficiently adapt the way they interact to changes. In this paper, requirements for computer support for agile virtual teams are presented. Next, an extension of the concept of social protocol is proposed as a novel model supporting agile interactions within virtual teams. The extended concept of social protocol consists of an extended social network and a workflow model.
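    One way to read the extended social protocol (a social network combined with a workflow model) is as a transition check that consults both structures. A minimal sketch, with roles, states, and member names invented for illustration rather than taken from the paper:

```python
class SocialProtocol:
    """Toy social protocol: a workflow (state transitions tagged with required
    roles) combined with a social network (who currently holds which role).
    Adapting the team means editing these structures at run time - the
    agility the paper argues computer support should provide."""

    def __init__(self, transitions, roles):
        self.transitions = transitions  # (state, action) -> (next_state, role)
        self.roles = roles              # member -> set of roles
        self.state = "drafting"

    def act(self, member, action):
        """Perform an interaction if the workflow allows it from the current
        state and the member holds the required role."""
        key = (self.state, action)
        if key not in self.transitions:
            return False
        next_state, required_role = self.transitions[key]
        if required_role not in self.roles.get(member, set()):
            return False  # the member's role does not permit this interaction
        self.state = next_state
        return True

protocol = SocialProtocol(
    transitions={("drafting", "submit"): ("review", "author"),
                 ("review", "approve"): ("published", "editor")},
    roles={"alice": {"author"}, "bob": {"editor"}},
)
ok1 = protocol.act("alice", "submit")
denied = protocol.act("alice", "approve")   # alice is not an editor
ok2 = protocol.act("bob", "approve")
```

Reassigning a role or adding a transition is a one-line change to the two dictionaries, which is the sense in which the interaction model adapts rapidly and cheaply.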

  2. Developing communications requirements for Agile Product Realization

    SciTech Connect

    Forsythe, C.; Ashby, M.R.

    1994-03-01

    Sandia National Laboratories has undertaken the Agile Product Realization for Innovative electroMEchanical Devices (A-PRIMED) pilot project to develop and implement technologies for agile design and manufacturing of electromechanical components. Emphasis on information-driven processes, concurrent engineering and multi-functional team communications makes computer-supported cooperative work critical to achieving significantly faster product development cycles. This report describes analyses conducted in developing communications requirements and a communications plan that addresses the unique communications demands of an agile enterprise.

  3. Agility enabled by the SEMATECH CIM framework

    NASA Astrophysics Data System (ADS)

    Hawker, Scott; Waskiewicz, Fred

    1997-01-01

    The survivor in today's market environment is agile: able to survive and thrive in a market place marked by rapid, continuous change. For manufacturers, this includes an ability to rapidly develop, deploy and reconfigure manufacturing information and control systems. The SEMATECH CIM framework defines an application integration architecture and standard application components that enable agile manufacturing information and control systems. Further, the CIM framework and its evolution process foster virtual organizations of suppliers and manufacturers, combining their products and capabilities into an agile manufacturing information and control system.

  4. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  5. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

    The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in their fabs in the front-end process. In fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used. But in the next years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing. But the integration of various equipment in a production plant requires unified handling of data flow and interfaces. Only agile vision systems can meet these competing demands: fast, reliable, adaptable, scalable and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computing-intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection. This is done in 3D, including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment and control tasks in IC package production and handling equipment, like Trim&Form, Tape&Reel and Pick&Place machines.

  6. Research on Modeling of the Agile Satellite Using a Single Gimbal Magnetically Suspended CMG and the Disturbance Feedforward Compensation for Rotors

    PubMed Central

    Cui, Peiling; Yan, Ning

    2012-01-01

    The magnetically suspended Control Moment Gyroscope (CMG) has the advantages of long life, low vibration and freedom from lubrication, and is the ideal actuator for agile-maneuver satellite attitude control. However, the stability of the rotor in the magnetic bearing and the precision of the output torque of a magnetically suspended CMG are affected by the rapid maneuvers of satellites. In this paper, a dynamic model of the agile satellite including a magnetically suspended single gimbal control moment gyroscope is built and the equivalent disturbance torque exerted on the rotor is obtained. The feedforward compensation control method is used to suppress the disturbance on the rotor. Simulation results are given to show that the rotor displacement is obviously reduced. PMID:23235442

  7. Research on modeling of the agile satellite using a single gimbal magnetically suspended CMG and the disturbance feedforward compensation for rotors.

    PubMed

    Cui, Peiling; Yan, Ning

    2012-01-01

    The magnetically suspended Control Moment Gyroscope (CMG) has the advantages of long life, low vibration and freedom from lubrication, and is the ideal actuator for agile-maneuver satellite attitude control. However, the stability of the rotor in the magnetic bearing and the precision of the output torque of a magnetically suspended CMG are affected by the rapid maneuvers of satellites. In this paper, a dynamic model of the agile satellite including a magnetically suspended single gimbal control moment gyroscope is built and the equivalent disturbance torque exerted on the rotor is obtained. The feedforward compensation control method is used to suppress the disturbance on the rotor. Simulation results are given to show that the rotor displacement is obviously reduced. PMID:23235442

  8. Integrating a distributed, agile, virtual enterprise in the TEAM program

    NASA Astrophysics Data System (ADS)

    Cobb, C. K.; Gray, W. Harvey; Hewgley, Robert E.; Klages, Edward J.; Neal, Richard E.

    1997-01-01

    The Technologies Enabling Agile Manufacturing (TEAM) program enhances industrial capability by advancing and deploying manufacturing technologies that promote agility. TEAM has developed a product realization process that features the integration of product design and manufacturing groups. TEAM uses the tools it collects, develops, and integrates in support of the product realization process to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, forming, and electromechanical assembly. In order to provide a proof-of-principle, the material removal process has been addressed first and has been successfully demonstrated in an 'interconnected' mode. An internet-accessible intersite file manager (IFM) application has been deployed to allow geographically distributed TEAM participants to share and distribute information as the product realization process is executed. An automated inspection planning application has been demonstrated, importing a solid model from the IFM, generating an inspection plan and a part program to be used in the inspection process, and then distributing the part program to the inspection site via the IFM. TEAM seeks to demonstrate the material removal process in an integrated mode in June 1997, complete with an object-oriented framework and infrastructure. The current status and future plans for this project are presented here.

  9. Production planning tools and techniques for agile manufacturing

    SciTech Connect

    Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Turnquist, M.A.

    1996-10-01

    Effective use of resources shared among multiple products or processes is critical for agile manufacturing. This paper describes development and implementation of a computerized model to support production planning in a complex manufacturing system at Pantex Plant. The model integrates two different production processes (nuclear weapon dismantlement and stockpile evaluation) which use common facilities and personnel, and reflects the interactions of scheduling constraints, material flow constraints, and resource availability. These two processes reflect characteristics of flow-shop and job-shop operations in a single facility. Operational results from using the model are also discussed.

  10. An agile implementation of SCRUM

    NASA Astrophysics Data System (ADS)

    Gannon, Michele

    Is Agile a way to cut corners? To some, the use of an Agile Software Development Methodology has a negative connotation - “Oh, you're just not producing any documentation.” So can a team with no experience in Agile successfully implement and use SCRUM?

  11. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  12. Development of a Computer Program for Analyzing Preliminary Aircraft Configurations in Relationship to Emerging Agility Metrics

    NASA Technical Reports Server (NTRS)

    Bauer, Brent

    1993-01-01

    This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.

  13. Analysis and optimization of preliminary aircraft configurations in relationship to emerging agility metrics

    NASA Technical Reports Server (NTRS)

    Sandlin, Doral R.; Bauer, Brent Alan

    1993-01-01

This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.

  14. Agile robotic edge finishing system research

    SciTech Connect

    Powell, M.A.

    1995-07-01

This paper describes a new project undertaken by Sandia National Laboratories to develop an agile, automated, high-precision edge finishing system. The project has a two-year duration and was initiated in October 1994. It involves redesigning and adding capabilities to an existing finishing workcell at Sandia, and developing intelligent methods for automating process definition and for controlling finishing processes. The resulting system will serve as a prototype for systems that will be deployed into highly flexible automated production lines. The production systems will be used to produce a wide variety of products with limited production quantities and quick-turnaround requirements. The prototype system is designed to allow programming, process definition, fixture re-configuration, and process verification to be performed off-line for new products. CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) models of the part will be used to assist with the automated process development and process control tasks. To achieve Sandia's performance goals, the system will employ advanced path planning, burr-prediction expert systems, automated process definition, statistical process models in a process database, and a two-level control scheme using hybrid position-force control and fuzzy logic control. In this paper, we discuss the progress and the planned system development under this project.

  15. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method, nor is it a rapid prototyping method (although rapid prototyping is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies and processes.

  16. Peridigm summary report : lessons learned in development with agile components.

    SciTech Connect

    Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John; Parks, Michael L.

    2011-09-01

    This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture which impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.

  17. Optical flows method for lightweight agile remote sensor design and instrumentation

    NASA Astrophysics Data System (ADS)

    Wang, Chong; Xing, Fei; Wang, Hongjian; You, Zheng

    2013-08-01

Lightweight agile remote sensors have become one of the most important classes of payloads and are widely utilized in space reconnaissance and resource surveying. These imaging sensors are designed to obtain imagery of high spatial, temporal, and spectral resolution. Key techniques in their instrumentation include flexible maneuvering, advanced imaging control algorithms, and integrative measuring techniques, which are closely interdependent and can even act as bottlenecks for one another. Therefore, these mutually restrictive problems must be solved and optimized jointly. Optical flow is the critical model through which information transfer, as well as radiation energy flow, is represented in dynamic imaging. For agile sensors, especially those with wide fields of view, imaging optical flows may distort and deviate seriously during large-angle attitude-maneuver imaging. The phenomena are mainly attributed to the geometrical characteristics of the three-dimensional Earth surface as well as coupled effects due to the complicated relative motion between the sensor and the scene. Under these circumstances the velocity fields are distributed nonlinearly, and the imagery may be badly smeared or its geometrical structure changed if the image-velocity matching errors are not eliminated. In this paper, a precise imaging optical flow model is established for agile remote sensors, in which the evolution of the optical flow is factorized into two components, due respectively to translational movement and to image shape changes. Based on that model, agile remote sensor instrumentation is investigated. The main techniques concerning optical flow modeling include integrative design with lightweight star sensors along with micro inertial measurement units and corresponding data fusion, focal-plane layout and control, and post-processing of imagery for agile remote sensors. Some experiments show that the optical flow analysis method is effective to

  18. Introduction to Stand-up Meetings in Agile Methods

    NASA Astrophysics Data System (ADS)

    Hasnain, Eisha; Hall, Tracy

    2009-05-01

In recent years, agile methods have become more popular in the software industry. Agile methods are a new approach compared to plan-driven approaches. One of the most important shifts in adopting an agile approach is the central focus given to people in the process. This is exemplified by the independence afforded to developers in the development work they do. This work investigates practitioners' opinions about daily stand-up meetings in agile methods and the role of the developer in them. For our investigation we joined a Yahoo group called "Extreme Programming". Our investigation suggests that although trust is an important factor in agile methods, stand-ups are not the place to build it.

  19. Supporting Agile Development of Authorization Rules for SME Applications

    NASA Astrophysics Data System (ADS)

    Bartsch, Steffen; Sohr, Karsten; Bormann, Carsten

    Custom SME applications for collaboration and workflow have become affordable when implemented as Web applications employing Agile methodologies. Security engineering is still difficult with Agile development, though: heavy-weight processes put the improvements of Agile development at risk. We propose Agile security engineering and increased end-user involvement to improve Agile development with respect to authorization policy development. To support the authorization policy development, we introduce a simple and readable authorization rules language implemented in a Ruby on Rails authorization plugin that is employed in a real-world SME collaboration and workflow application. Also, we report on early findings of the language’s use in authorization policy development with domain experts.
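The plugin described here is implemented in Ruby on Rails; as a rough illustration of the kind of simple, readable role-and-condition rules the abstract describes, the following Python sketch shows one way declarative authorization checks can work. The rule tuples and the `permitted` helper are invented for illustration, not the plugin's actual syntax.

```python
# Illustrative sketch only: rule format and helper names are assumptions,
# not the actual Ruby on Rails authorization plugin's rules language.

RULES = [
    # (role, action, resource type, condition on (user, resource))
    ("admin",  "*",    "*",        lambda user, res: True),
    ("member", "read", "document", lambda user, res: True),
    ("member", "edit", "document", lambda user, res: res.get("owner") == user),
]

def permitted(role, action, resource_type, user, resource):
    """Return True if any declarative rule grants `action` on `resource_type`."""
    return any(
        r_role == role
        and r_action in ("*", action)
        and r_type in ("*", resource_type)
        and cond(user, resource)
        for r_role, r_action, r_type, cond in RULES
    )

# An owner may edit their own document; other members may not.
print(permitted("member", "edit", "document", "alice", {"owner": "alice"}))  # True
print(permitted("member", "edit", "document", "bob", {"owner": "alice"}))    # False
```

Keeping the rules as plain data in one place is what makes them readable enough for domain experts to review, which is the end-user involvement the abstract argues for.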

  20. Achieving agility through parameter space qualification

    SciTech Connect

    Diegert, K.V.; Easterling, R.G.; Ashby, M.R.; Benavides, G.L.; Forsythe, C.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-02-01

The A-primed (Agile Product Realization of Innovative electro-Mechanical Devices) project is defining and proving processes for agile product realization for the Department of Energy complex. Like other agile production efforts reported in the literature, A-primed uses concurrent engineering and information automation technologies to enhance information transfer. A unique aspect of our approach to agility is the qualification during development of a family of related product designs and their production processes, rather than a single design and its attendant processes. Applying engineering principles and statistical design of experiments, economies of test and analytic effort are realized for the qualification of the device family as a whole. Thus the need is minimized for test and analysis to qualify future devices from this family, thereby further reducing the design-to-production cycle time. As a measure of the success of the A-primed approach, the first design took 24 days to produce, and operated correctly on the first attempt. A flow diagram for the qualification process is presented. Guidelines are given for implementation, based on the authors' experiences as members of the A-primed qualification team.
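The statistical design-of-experiments idea behind qualifying a whole parameter space can be illustrated with a minimal sketch. The parameter names and levels below are invented for illustration; a two-level full-factorial design simply enumerates every combination of levels, and the economies of effort the abstract mentions come from replacing such exhaustive designs with smaller fractional ones.

```python
from itertools import product

def full_factorial(levels):
    """Every combination of parameter levels (2**k runs for k two-level factors)."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

# Hypothetical device-family parameters, purely for illustration.
design = full_factorial({
    "spring_rate":   ("low", "high"),
    "gap_mm":        (0.5, 1.0),
    "temperature_C": (-40, 70),
})
# 2**3 = 8 runs covering every corner of the parameter space
```

Qualifying the corners of the space once lets future designs that fall inside it inherit the qualification, rather than repeating test and analysis per device.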

  1. Perspectives on Agile Coaching

    NASA Astrophysics Data System (ADS)

    Fraser, Steven; Lundh, Erik; Davies, Rachel; Eckstein, Jutta; Larsen, Diana; Vilkki, Kati

There are many perspectives on agile coaching, including growing coaching expertise, selecting the appropriate coach for your context, and evaluating value. A coach is often an itinerant who may observe, mentor, negotiate, influence, lead, and/or architect everything from team organization to system architecture. With roots in diverse fields ranging from technology to sociology, coaches have differing motivations and experience bases. This panel will bring together coaches to debate and discuss various perspectives on agile coaching. Some of the questions to be addressed will include: What are the skills required for effective coaching? What should be the expectations for teams or individuals being coached? Should coaches be a corporate resource (an internal team of consultants working with multiple internal teams), an integral part of a specific team, or external contractors? How should coaches exercise influence and authority? How should management assess the value of a coaching engagement? Do you have what it takes to be a coach? This panel will bring together seasoned agile coaches to offer their experience and advice on how to be the best you can be!

  2. An investigation of fighter aircraft agility

    NASA Technical Reports Server (NTRS)

    Valasek, John; Downing, David R.

    1993-01-01

of how to test and measure the metric, including any special data reduction requirements; typical values for the metric obtained using one or more aircraft types; and a sensitivity analysis if applicable. The report is organized as follows. The first chapter presents a historical review of air combat trends which demonstrate the need for agility metrics in assessing the combat performance of fighter aircraft in a modern, all-aspect missile environment. The second chapter presents a framework for classifying each candidate metric according to time scale (transient, functional, instantaneous), further subdivided by axis (pitch, lateral, axial). The report is then broadly divided into two parts, with the transient agility metrics (pitch, lateral, axial) covered in chapters three, four, and five, and the functional agility metrics covered in chapter six. Conclusions, recommendations, and an extensive reference list and bibliography are also included. Five appendices contain a comprehensive list of the definitions of all the candidate metrics; a description of the aircraft models and flight simulation programs used for testing the metrics; several relations and concepts which are fundamental to the study of lateral agility; an in-depth analysis of the axial agility metrics; and a derivation of the relations for instantaneous agility and their approximations.

  3. Pinnacle3 modeling and end-to-end dosimetric testing of a Versa HD linear accelerator with the Agility head and flattening filter-free modes.

    PubMed

    Saenz, Daniel L; Narayanasamy, Ganesh; Cruz, Wilbert; Papanikolaou, Nikos; Stathakis, Sotirios

    2016-01-01

The Elekta Versa HD incorporates a variety of upgrades to the line of Elekta linear accelerators, primarily the Agility head and flattening filter-free (FFF) photon beam delivery. The dosimetric output of the head, completely distinct from that of its predecessors, combined with the FFF beams, requires a new investigation of modeling in treatment planning systems. A model was created in Pinnacle3 v9.8 with the commissioned beam data. A phantom consisting of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle3, where beams of different field sizes, source-to-surface distances (SSDs), wedges, and gantry angles were devised. Beams included all of the available photon energies (6, 10, 18, 6 FFF, and 10 FFF MV), as well as the four electron energies commissioned for clinical use (6, 9, 12, and 15 MeV). The plans were verified at calculation points by measurement with a calibrated ionization chamber. Homogeneous and heterogeneous point-dose measurements agreed within 2% relative to maximum dose for all photon and electron beams. AP photon open-field measurements along the central axis at 100 cm SSD passed within 1%. In addition, IMRT testing was performed with three standard plans (a step-and-shoot IMRT plan, as well as a small- and a large-field VMAT plan). The IMRT plans were delivered on the Delta4 IMRT QA phantom, for which the gamma passing rate was > 99.5% for all plans with a 3% dose deviation, 3 mm distance-to-agreement, and 10% dose threshold. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4% ± 2.3%. Such testing ensures confidence in the ability of Pinnacle3 to model photon and electron beams with the Agility head. PMID:26894352
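The gamma criterion used for the IMRT QA (3% dose deviation, 3 mm distance-to-agreement, 10% dose threshold) can be sketched in one dimension. This is a simplified global-gamma illustration, not the Delta4 vendor algorithm; the grid spacing and the normalization to the reference maximum are assumptions.

```python
# Simplified 1-D global gamma analysis (3%/3 mm, 10% threshold by default).
# Illustration only; clinical QA software uses 2-D/3-D dose grids and
# sub-voxel interpolation.

def gamma_pass_rate(ref, meas, spacing_mm=1.0, dd_pct=3.0, dta_mm=3.0,
                    thresh_pct=10.0):
    """Percent of above-threshold measured points with gamma <= 1 vs. `ref`."""
    dmax = max(ref)
    dd = dd_pct / 100.0 * dmax          # global dose-difference criterion
    passed = total = 0
    for i, dm in enumerate(meas):
        if dm < thresh_pct / 100.0 * dmax:
            continue                    # below the low-dose threshold
        total += 1
        gamma_sq = min(
            ((dm - dr) / dd) ** 2 + ((i - j) * spacing_mm / dta_mm) ** 2
            for j, dr in enumerate(ref)
        )
        if gamma_sq <= 1.0:
            passed += 1
    return 100.0 * passed / total if total else 100.0
```

An identical pair of profiles passes at 100%; a 20% dose error at a single above-threshold point drops the rate by one point's share, which is how a passing-rate figure like 97.4% arises from a mostly-agreeing dose distribution.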

  4. Response of the Italian agile frog (Rana latastei) to a Ranavirus, frog virus 3: a model for viral emergence in naïve populations.

    PubMed

    Pearman, Peter B; Garner, Trenton W J; Straub, Monika; Greber, Urs F

    2004-10-01

Ranavirus (family Iridoviridae) is a genus of pathogens of poikilotherms, and some ranaviruses may play a role in widespread mortality of amphibians. The ecology of viral transmission in amphibians is poorly known but can be addressed through experimentation in the laboratory. In this study, we use the Ranavirus frog virus 3 (FV3) as an experimental model for pathogen emergence in naive populations of tadpoles. We simulated emerging disease by exposing tadpoles of the Italian agile frog (Rana latastei) to the North American Ranavirus FV3. We demonstrated that mortality occurred due to viral exposure, that exposure of tadpoles to decreasing concentrations of FV3 in the laboratory produced dose-dependent survival rates, and that cannibalism of virus-carrying carcasses increased mortality due to FV3. These experiments suggest the potential for ecological mechanisms to affect the level of exposure of tadpoles to Ranavirus and to impact transmission of viral pathogens in aquatic systems. PMID:15650083

  5. Supply chain network design problem for a new market opportunity in an agile manufacturing system

    NASA Astrophysics Data System (ADS)

    Babazadeh, Reza; Razmi, Jafar; Ghodsi, Reza

    2012-08-01

The characteristics of today's competitive environment, such as the speed with which products are designed, manufactured, and distributed, and the need for higher responsiveness and lower operational cost, are forcing companies to search for innovative ways to do business. The concept of agile manufacturing has been proposed in response to these challenges for companies. This paper addresses the strategic- and tactical-level decisions in agile supply chain network design. An efficient mixed-integer linear programming model is developed that is able to consider the key characteristics of an agile supply chain, such as direct shipments, outsourcing, different transportation modes, discounts, alliances (process and information integration) between opened facilities, and maximum customer waiting times for deliveries. In addition, in the proposed model, the capacities of facilities are determined as decision variables, whereas they are often assumed to be fixed. Computational results illustrate that the proposed model can be applied as a powerful tool in agile supply chain network design as well as in the integration of strategic decisions with tactical decisions.
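The fixed-charge flavor of such a network-design model (which facilities to open, and whether each customer is served through a facility or by direct shipment) can be illustrated by brute-force enumeration on a toy instance. All costs below are invented, and a model of realistic size would be formulated as a MILP and solved with a solver, not enumerated.

```python
from itertools import product

# Toy fixed-charge network design: open facilities, then serve each
# customer by the cheapest available channel (a facility or direct shipment).

open_cost = {"F1": 100, "F2": 80}                  # fixed facility-opening costs
ship_cost = {("F1", "c1"): 4, ("F1", "c2"): 7,
             ("F2", "c1"): 6, ("F2", "c2"): 3}     # facility-to-customer costs
direct_cost = {"c1": 60, "c2": 55}                 # direct shipment from the plant

def best_design():
    """Enumerate facility-opening decisions; serve each customer cheapest."""
    facilities, customers = list(open_cost), list(direct_cost)
    best = None
    for opening in product([0, 1], repeat=len(facilities)):
        opened = [f for f, o in zip(facilities, opening) if o]
        cost = sum(open_cost[f] for f in opened)
        plan = {}
        for c in customers:
            options = [(direct_cost[c], "direct")]
            options += [(ship_cost[f, c], f) for f in opened]
            w, source = min(options)
            cost += w
            plan[c] = source
        if best is None or cost < best[0]:
            best = (cost, opened, plan)
    return best

cost, opened, plan = best_design()   # opening F2 alone is cheapest here
```

In a MILP formulation the `opening` vector becomes binary variables with fixed-charge terms in the objective, and the per-customer channel choice becomes continuous or binary flow variables; direct shipment enters simply as one more arc.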

  6. Agile Walking Robot

    NASA Technical Reports Server (NTRS)

    Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.; Waldron, Kenneth J.

    1990-01-01

    Proposed agile walking robot operates over rocky, sandy, and sloping terrain. Offers stability and climbing ability superior to other conceptual mobile robots. Equipped with six articulated legs like those of insect, continually feels ground under leg before applying weight to it. If leg sensed unexpected object or failed to make contact with ground at expected point, seeks alternative position within radius of 20 cm. Failing that, robot halts, examines area around foot in detail with laser ranging imager, and replans entire cycle of steps for all legs before proceeding.

  7. Frequency agile relativistic magnetrons

    SciTech Connect

    Levine, J.S.; Harteneck, B.D.; Price, H.D.

    1995-11-01

    The authors are developing a family of frequency agile relativistic magnetrons to continuously cover the bands from 1 to 3 GHz. They have achieved tuning ranges of > 33%. The magnetrons have been operated repetitively in burst mode at rates up to 100 pps for 10 sec. Power is extracted from two resonators, and is in the range of 400--600 MW, fairly flat across the tuning bandwidth. They are using a network of phase shifters and 3-dB hybrids to combine the power into a single arm and to provide a continuously adjustable attenuator.

  8. Agile Infrastructure Monitoring

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.

    2014-06-01

    At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.

  9. Frequency agile optical parametric oscillator

    DOEpatents

    Velsko, Stephan P.

    1998-01-01

    The frequency agile OPO device converts a fixed wavelength pump laser beam to arbitrary wavelengths within a specified range with pulse to pulse agility, at a rate limited only by the repetition rate of the pump laser. Uses of this invention include Laser radar, LIDAR, active remote sensing of effluents/pollutants, environmental monitoring, antisensor lasers, and spectroscopy.

  10. Frequency agile optical parametric oscillator

    DOEpatents

    Velsko, S.P.

    1998-11-24

    The frequency agile OPO device converts a fixed wavelength pump laser beam to arbitrary wavelengths within a specified range with pulse to pulse agility, at a rate limited only by the repetition rate of the pump laser. Uses of this invention include Laser radar, LIDAR, active remote sensing of effluents/pollutants, environmental monitoring, antisensor lasers, and spectroscopy. 14 figs.

  11. Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2016-01-01

To ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASA's critical software (SW) systems as the software industry rapidly transitions from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA), established the Agile Benchmarking Team (ABT). The Team's tasks were: 1. Research background literature on current Agile processes; 2. Perform benchmark activities with other organizations that are involved in software Agile processes to determine best practices; 3. Collect information on Agile-developed systems to enable improvements to the current NASA standards and processes to enhance their ability to perform reliable software assurance on NASA Agile-developed systems; 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering and software assurance are addressed herein.

  12. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizing their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independently of the infrastructure's process, application and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source for information to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance also allows effective management of the informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  13. Agile manufacturing: The factory of the future

    NASA Technical Reports Server (NTRS)

    Loibl, Joseph M.; Bossieux, Terry A.

    1994-01-01

The factory of the future will require an operating methodology which effectively utilizes all of the elements of product design, manufacturing and delivery. The process must respond rapidly to changes in product demand, product mix, design changes or changes in the raw materials. To achieve agility in a manufacturing operation, the design and development of the manufacturing processes must focus on customer satisfaction. Achieving the greatest results requires that the manufacturing process be considered from product concept through sales. This provides the best opportunity to build a quality product for the customer at a reasonable rate. The primary elements of a manufacturing system include people, equipment, materials, methods and the environment. The most significant and most agile element in any process is the human resource. Only with a highly trained, knowledgeable work force can the proper methods be applied to efficiently process materials with machinery which is predictable, reliable and flexible. This paper discusses the effect of each element on the development of agile manufacturing systems.

  14. A Rule-Based Modeling for the Description of Flexible and Self-healing Business Processes

    NASA Astrophysics Data System (ADS)

    Boukhebouze, Mohamed; Amghar, Youssef; Benharkat, Aïcha-Nabila; Maamar, Zakaria

In this paper we discuss the importance of ensuring that business processes are at the same time robust and agile. To this end, we consider reviewing the way business processes are managed. For instance, we consider offering a flexible way to model processes so that changes in regulations are handled through self-healing mechanisms. These changes may raise exceptions at run-time if not properly reflected in the processes. To this end we propose a new rule-based model that adopts ECA rules and is built upon formal tools. The business logic of a process can be summarized with a set of rules that implement an organization's policies. Each business rule is formalized using our ECAPE formalism (Event-Condition-Action-Postcondition-postEvent). This formalism allows translating a process into a graph of rules that is analyzed in terms of reliability and flexibility.
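A minimal sketch of how ECA-style rules with postconditions and post-events can chain into a graph is given below. The `Rule` fields and the dispatch loop are illustrative assumptions, not the authors' formal ECAPE notation.

```python
# Illustrative ECA-style rule chaining with postconditions and post-events.

class Rule:
    def __init__(self, event, condition, action, postcondition, post_event=None):
        self.event, self.condition, self.action = event, condition, action
        self.postcondition, self.post_event = postcondition, post_event

def dispatch(event, state, rules):
    """Fire each rule subscribed to `event` whose condition holds, verify its
    postcondition, then raise its post-event (chaining rules into a graph)."""
    for rule in rules:
        if rule.event == event and rule.condition(state):
            rule.action(state)
            if not rule.postcondition(state):
                # a self-healing mechanism would repair or roll back here
                raise RuntimeError("postcondition violated")
            if rule.post_event:
                dispatch(rule.post_event, state, rules)

# Example: a two-rule order-handling policy.
rules = [
    Rule("order_placed",
         condition=lambda s: s["stock"] >= s["qty"],
         action=lambda s: s.update(stock=s["stock"] - s["qty"]),
         postcondition=lambda s: s["stock"] >= 0,
         post_event="order_confirmed"),
    Rule("order_confirmed",
         condition=lambda s: True,
         action=lambda s: s.update(status="confirmed"),
         postcondition=lambda s: s["status"] == "confirmed"),
]

state = {"stock": 5, "qty": 2, "status": "new"}
dispatch("order_placed", state, rules)   # leaves stock=3, status="confirmed"
```

Because each rule names the event it consumes and the event it emits, the rule set induces a directed graph of events, which is the structure the paper proposes to analyze for reliability and flexibility.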

  15. Agile manufacturing concept

    NASA Astrophysics Data System (ADS)

    Goldman, Steven L.

    1994-03-01

    The initial conceptualization of agile manufacturing was the result of a 1991 study -- chaired by Lehigh Professor Roger N. Nagel and California-based entrepreneur Rick Dove, President of Paradigm Shifts, International -- of what it would take for U.S. industry to regain global manufacturing competitiveness by the early twenty-first century. This industry-led study, reviewed by senior management at over 100 companies before its release, concluded that incremental improvement of the current system of manufacturing would not be enough to be competitive in today's global marketplace. Computer-based information and production technologies that were becoming available to industry opened up the possibility of an altogether new system of manufacturing, one that would be characterized by a distinctive integration of people and technologies; of management and labor; of customers, producers, suppliers, and society.

  16. Compact, Automated, Frequency-Agile Microspectrofluorimeter

    NASA Technical Reports Server (NTRS)

    Fernandez, Salvador M.; Guignon, Ernest F.

    1995-01-01

    Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.

  17. Parallel optimization methods for agile manufacturing

    SciTech Connect

    Meza, J.C.; Moen, C.D.; Plantenga, T.D.; Spence, P.A.; Tong, C.H.; Hendrickson, B.A.; Leland, R.W.; Reese, G.M.

    1997-08-01

    The rapid and optimal design of new goods is essential for meeting national objectives in advanced manufacturing. Currently almost all manufacturing procedures involve the determination of some optimal design parameters. This process is iterative in nature and because it is usually done manually it can be expensive and time consuming. This report describes the results of an LDRD, the goal of which was to develop optimization algorithms and software tools that will enable automated design thereby allowing for agile manufacturing. Although the design processes vary across industries, many of the mathematical characteristics of the problems are the same, including large-scale, noisy, and non-differentiable functions with nonlinear constraints. This report describes the development of a common set of optimization tools using object-oriented programming techniques that can be applied to these types of problems. The authors give examples of several applications that are representative of design problems including an inverse scattering problem, a vibration isolation problem, a system identification problem for the correlation of finite element models with test data and the control of a chemical vapor deposition reactor furnace. Because the function evaluations are computationally expensive, they emphasize algorithms that can be adapted to parallel computers.
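The observation that expensive function evaluations should be parallelized can be sketched generically: one step of a pattern-search style loop farms candidate evaluations out to a worker pool. The objective below is a cheap stand-in for a costly simulation, and the framework is a generic illustration, not the LDRD's actual toolkit.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def expensive_objective(x):
    # stand-in for a costly simulation (e.g., a finite-element run)
    return (x - 1.7) ** 2 + 0.1 * math.sin(3 * x)

def parallel_grid_step(f, center, radius, n=8, workers=4):
    """Evaluate n candidates around `center` concurrently; return best (f, x)."""
    pts = [center - radius + 2 * radius * i / (n - 1) for i in range(n)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        vals = list(pool.map(f, pts))
    return min(zip(vals, pts))

best_val, best_x = parallel_grid_step(expensive_objective, center=0.0, radius=4.0)
```

A real driver would iterate, re-centering on `best_x` and shrinking `radius`. Threads give genuine speedups only when evaluations release the GIL (as native simulation codes do); a process pool or distributed workers would be used for pure-Python objectives.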

  18. Elements of an Art - Agile Coaching

    NASA Astrophysics Data System (ADS)

    Lundh, Erik

    This tutorial gives you a lead on becoming or redefining yourself as an Agile Coach. Introduction to elements and dimensions of state-of-the-art Agile Coaching. How to position the agile coach to be effective in a larger setting. Making the agile transition - from a single team to thousands of people. How to support multiple teams as a coach. How to build a coaches network in your company. Challenges when the agile coach is a consultant and the organization is large.

  19. Development of EarthCube Governance: An Agile Approach

    NASA Astrophysics Data System (ADS)

    Pearthree, G.; Allison, M. L.; Patten, K.

    2013-12-01

Governance of geosciences cyberinfrastructure is a complex and essential undertaking, critical in enabling distributed knowledge communities to collaborate and communicate across disciplines, distances, and cultures. Advancing science with respect to "grand challenges," such as global climate change, weather prediction, and core fundamental science, depends not just on technical cyber systems, but also on social systems for strategic planning, decision-making, project management, learning, teaching, and building a community of practice. Simply put, a robust, agile technical system depends on an equally robust and agile social system. Cyberinfrastructure development is wrapped in social, organizational and governance challenges, which may significantly impede progress. An agile development process is underway for governance of transformative investments in geosciences cyberinfrastructure through the NSF EarthCube initiative. Agile development is iterative and incremental, and promotes adaptive planning and rapid and flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness. A project Secretariat acts as the coordinating body, carrying out duties for planning, organizing, communicating, and reporting. A broad coalition of stakeholder groups comprises an Assembly (Mainstream Scientists, Cyberinfrastructure Institutions, Information Technology/Computer Sciences, NSF EarthCube Investigators, Science Communities, EarthCube End-User Workshop Organizers, Professional Societies) to serve as a preliminary venue for identifying, evaluating, and testing potential governance models. To offer opportunity for broader end-user input, a crowd-source approach will engage stakeholders not involved otherwise. An Advisory Committee from the Earth, ocean, atmosphere, social, computer and library sciences is guiding the process from a high-level policy point of view. Developmental

  20. Tools for Supporting Distributed Agile Project Planning

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Maurer, Frank; Morgan, Robert; Oliveira, Josyleuda

    Agile project planning plays an important part in agile software development. In distributed settings, project planning is severely impacted by the lack of face-to-face communication and the inability to share paper index cards amongst all meeting participants. To address these issues, several distributed agile planning tools were developed. The tools vary in features, functions and running platforms. In this chapter, we first summarize the requirements for distributed agile planning. Then we give an overview of existing agile planning tools. We also evaluate existing tools based on the tool requirements. Finally, we present some practical advice for both designers and users of distributed agile planning tools.

  1. What Does an Agile Coach Do?

    NASA Astrophysics Data System (ADS)

    Davies, Rachel; Pullicino, James

    The surge in Agile adoption has created a demand for project managers who coach their teams rather than direct them. A sign of this trend is the ever-increasing number of people getting certified as scrum masters and agile leaders. Training courses that introduce agile practices are easy to find. But making the transition to coach is not as simple as understanding what agile practices are. Your challenge as an Agile Coach is to support your team in learning how to wield their new Agile tools in creating great software.

  2. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.

  3. Piloted simulator assessments of agility

    NASA Technical Reports Server (NTRS)

    Schneider, Edward T.

    1990-01-01

    NASA has utilized piloted simulators for nearly two decades to study high-angle-of-attack flying qualities, agility, and air-to-air combat. These studies have included assessments of an F-16XL aircraft equipped with thrust vectoring, an assessment of the F-18 HARV maneuvering requirements to assist in thrust vectoring control system design, and an agility assessment of the F-18. The F-18 agility assessment was compared with in-flight testing. Open-loop maneuvers such as 180-deg rolls to measure roll rate showed favorable simulator/in-flight comparison. Closed-loop maneuvers such as rolls to 90 deg with precision stops or certain maximum longitudinal pitching maneuvers showed poorer performance due to reduced aggressiveness of pilot inputs in flight to remain within flight envelope limits.

  4. The AGILE Data Center at ASDC

    NASA Astrophysics Data System (ADS)

    Pittori, Carlotta; AGILE Collaboration

    2013-01-01

    AGILE is a Scientific Mission of the Italian Space Agency (ASI) with INFN, INAF and CIFS participation, devoted to gamma-ray astrophysics. The satellite has been in orbit since April 23rd, 2007. Thanks to its sky monitoring capability and fast ground segment alert system, AGILE produced several important scientific results, among which was the unexpected discovery of strong and rapid gamma-ray flares from the Crab Nebula over daily timescales. This discovery won for the AGILE PI and the AGILE Team the Bruno Rossi Prize for 2012. The AGILE Data Center, located at ASDC, is in charge of all the scientific oriented activities related to the analysis and archiving of AGILE data. I will present the AGILE data center main activities, and I will give an overview of the AGILE scientific highlights after 5 years of operations.

  5. Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community

    NASA Astrophysics Data System (ADS)

    Young, J. W.; Lenhardt, W. C.; Parsons, M. A.; Benedict, K. K.

    2014-12-01

    The data life cycle has figured prominently in describing the context of digital scientific data stewardship and cyberinfrastructure in support of science. There are many different versions of the data life cycle, but they all follow a similar basic pattern: plan, collect, ingest, assess, preserve, discover, and reuse. The process is often interpreted in a fairly linear fashion despite it being a cycle conceptually. More recently, at GeoData 2014 and elsewhere, questions have been raised about the utility of the data life cycle as it is currently represented. We are proposing to the community a re-examination of the data life cycle using an agile lens. Our goal is not to deploy agile methods, but to use agile principles as a heuristic to think about how to incorporate data stewardship across the scientific process from proposal stage to research and beyond. We will present alternative conceptualizations of the data life cycle with a goal to solicit feedback and to develop a new model for conceiving and describing the overall data stewardship process. We seek to re-examine past assumptions and shed new light on the challenges and necessity of data stewardship. The ultimate goal is to support new science through enhanced data interoperability, usability, and preservation.

  6. The AGILE Alert System for Gamma-Ray Transients

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Parmiggiani, N.; Fioretti, V.; Chen, A. W.; Vercellone, S.; Pittori, C.; Verrecchia, F.; Lucarelli, F.; Santolamazza, P.; Fanari, G.; Giommi, P.; Beneventano, D.; Argan, A.; Trois, A.; Scalise, E.; Longo, F.; Pellizzoni, A.; Pucella, G.; Colafrancesco, S.; Conforti, V.; Tempesta, P.; Cerone, M.; Sabatini, P.; Annoni, G.; Valentini, G.; Salotti, L.

    2014-01-01

    In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) on board the AGILE space mission. AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many γ-ray transients of both galactic and extragalactic origin. This work presents the AGILE innovative approach to fast γ-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind search identification of transients within a short processing time, (3) the AGILE procedure for γ-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. As proper flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, via e-mail, and via push notifications from an application for smartphones and tablets. These alerts are crosschecked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in γ-ray astrophysics.
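
    The post-trial significance computation mentioned in goal (4) can be sketched generically. The snippet below is an illustration only, not the AGILE pipeline's actual code: it assumes (via Wilks' theorem) that the test statistic TS follows a chi-squared distribution with one degree of freedom under the null, and applies a simple trials correction; the TS value and trial count are invented.

```python
import math
from statistics import NormalDist

def post_trial_sigma(ts, n_trials):
    """Convert a likelihood-ratio test statistic into a post-trial
    Gaussian significance, assuming TS ~ chi-squared with 1 dof."""
    # Pre-trial p-value: chi2_sf(x, 1) == erfc(sqrt(x / 2))
    p_pre = math.erfc(math.sqrt(ts / 2.0))
    # Trials correction: probability of at least one false alarm
    # among n_trials independent searches (time bins, positions, ...)
    p_post = 1.0 - (1.0 - p_pre) ** n_trials
    # Express the corrected p-value as a one-sided Gaussian sigma
    return NormalDist().inv_cdf(1.0 - p_post)

# A candidate with TS = 25 found in one of 100 independent trials
sigma = post_trial_sigma(25.0, 100)
```

    The trials factor visibly lowers the significance: the same TS evaluated with `n_trials=1` yields a larger sigma.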

  7. The agile alert system for gamma-ray transients

    SciTech Connect

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Fioretti, V.; Chen, A. W.; Pittori, C.; Verrecchia, F.; Lucarelli, F.; Santolamazza, P.; Fanari, G.; Giommi, P.; Pellizzoni, A.; and others

    2014-01-20

    In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) on board the AGILE space mission. AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many γ-ray transients of both galactic and extragalactic origin. This work presents the AGILE innovative approach to fast γ-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind search identification of transients within a short processing time, (3) the AGILE procedure for γ-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. As proper flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, via e-mail, and via push notifications from an application for smartphones and tablets. These alerts are crosschecked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in γ-ray astrophysics.

  8. Lean and Agile Development of the AITS Ground Software System

    NASA Astrophysics Data System (ADS)

    Richters, Mark; Dutruel, Etienne; Mecredy, Nicolas

    2013-08-01

    We present the ongoing development of a new ground software system used for integrating, testing and operating spacecraft. The Advanced Integration and Test Services (AITS) project aims at providing a solution for electrical ground support equipment and mission control systems in future Astrium Space Transportation missions. Traditionally ESA ground or flight software development projects are conducted according to a waterfall-like process as specified in the ECSS-E-40 standard promoted by ESA in the European industry. In AITS a decision was taken to adopt an agile development process. This work could serve as a reference for future ESA software projects willing to apply agile concepts.

  9. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Mainstream industry software development practice has gone from a traditional waterfall process to agile iterative development, which allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  10. Development of perceived competence, tactical skills, motivation, technical skills, and speed and agility in young soccer players.

    PubMed

    Forsman, Hannele; Gråstén, Arto; Blomqvist, Minna; Davids, Keith; Liukkonen, Jarmo; Konttinen, Niilo

    2016-07-01

    The objective of this 1-year, longitudinal study was to examine the development of perceived competence, tactical skills, motivation, technical skills, and speed and agility characteristics of young Finnish soccer players. We also examined associations between latent growth models of perceived competence and other recorded variables. Participants were 288 competitive male soccer players ranging from 12 to 14 years (12.7 ± 0.6) from 16 soccer clubs. Players completed the self-assessments of perceived competence, tactical skills, and motivation, and participated in technical, and speed and agility tests. Results of this study showed that players' levels of perceived competence, tactical skills, motivation, technical skills, and speed and agility characteristics remained relatively high and stable across the period of 1 year. Positive relationships were found between these levels and changes in perceived competence and motivation, and levels of perceived competence and speed and agility characteristics. Together these results illustrate the multi-dimensional nature of talent development processes in soccer. Moreover, it seems crucial in coaching to support the development of perceived competence and motivation in young soccer players and that it might be even more important in later maturing players. PMID:26708723

  11. Onshore and Offshore Outsourcing with Agility: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Kussmaul, Clifton

    This chapter reflects on a case study of an agile distributed project that ran for approximately three years (from spring 2003 to spring 2006). The project involved (a) a customer organization with key personnel distributed across the US, developing an application with rapidly changing requirements; (b) onshore consultants with expertise in project management, development processes, offshoring, and relevant technologies; and (c) an external offsite development team in a CMM-5 organization in southern India. This chapter is based on surveys and discussions with multiple participants. The several years since the project was completed allow greater perspective on both the strengths and weaknesses, since the participants can reflect on the entire life of the project, and compare it to subsequent experiences. Our findings emphasize the potential for agile project management in distributed software development, and the importance of people and interactions, taking many small steps to find and correct errors, and matching the structures of the project and product to support implementation of agility.

  12. SU-E-T-610: Comparison of Treatment Times Between the MLCi and Agility Multileaf Collimators

    SciTech Connect

    Ramsey, C; Bowling, J

    2014-06-01

    Purpose: The Agility is a new 160-leaf MLC developed by Elekta for use in their Infinity and Versa HD linacs. As compared to the MLCi, the Agility increased the maximum leaf speed from 2 cm/s to 3.5 cm/s, and the maximum primary collimator speed from 1.5 cm/s to 9.0 cm/s. The purpose of this study was to determine if the Agility MLC resulted in improved plan quality and/or shorter treatment times. Methods: An Elekta Infinity that was originally equipped with an 80-leaf MLCi was upgraded to a 160-leaf Agility. Treatment plan quality was evaluated using the Pinnacle planning system with SmartArc. Optimization was performed once for the MLCi and once for the Agility beam models using the same optimization parameters and the same number of iterations. Patient treatment times were measured for all IMRT, VMAT, and SBRT patients treated on the Infinity with the MLCi and Agility MLCs. Treatment times were extracted from the EMR and measured from when the patient first walked into the treatment room until exiting the treatment room. Results: 11,380 delivery times were measured for patients treated with the MLCi, and 1,827 measurements have been made for the Agility MLC. The average treatment times were 19.1 minutes for the MLCi and 20.8 minutes for the Agility. Using a t-test analysis, there was no difference between the two groups (t = 0.22). The dose differences between patients planned with the MLCi and the Agility MLC were minimal. For example, the doses to the PTV, GTV, and cord for a head-and-neck patient planned using Pinnacle were effectively equivalent. However, the dose to the parotid glands was slightly worse with the Agility MLC. Conclusion: There was no statistical difference in treatment time, nor any significant dosimetric difference, between the Agility MLC and the MLCi.
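
    The treatment-time comparison above can be illustrated with a two-sample (Welch) t statistic. The data below are simulated from the reported group means (19.1 and 20.8 minutes) with an assumed spread; they are not the study's measurements, and the study's own result (t = 0.22) came from its real samples.

```python
import math
import random
import statistics

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances allowed)."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Simulated treatment times in minutes; the 6-minute spread is assumed
random.seed(1)
mlci = [random.gauss(19.1, 6.0) for _ in range(500)]
agility = [random.gauss(20.8, 6.0) for _ in range(500)]
t = welch_t(mlci, agility)
```

    The statistic is the standardized difference of the group means; its magnitude relative to ~2 is what a significance judgment like the study's rests on.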

  13. MODELING TREE LEVEL PROCESSES

    EPA Science Inventory

    An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...

  14. Biorobotics: using robots to emulate and investigate agile locomotion.

    PubMed

    Ijspeert, Auke J

    2014-10-10

    The graceful and agile movements of animals are difficult to analyze and emulate because locomotion is the result of a complex interplay of many components: the central and peripheral nervous systems, the musculoskeletal system, and the environment. The goals of biorobotics are to take inspiration from biological principles to design robots that match the agility of animals, and to use robots as scientific tools to investigate animal adaptive behavior. Used as physical models, biorobots contribute to hypothesis testing in fields such as hydrodynamics, biomechanics, neuroscience, and prosthetics. Their use may contribute to the design of prosthetic devices that more closely take human locomotion principles into account. PMID:25301621

  15. Joint Spitzer and AGILE observations of the blazar 3C 454.3

    NASA Astrophysics Data System (ADS)

    Donnarumma, Immacolata; D'Ammando, Filippo

    2007-12-01

    We require SPITZER IRAC and MIPS observations for multifrequency follow-up of the blazar 3C 454.3. During the last three days AGILE has been revealing strong activity in the gamma-ray band. Joint Spitzer and AGILE observations offer an unprecedented opportunity to study the correlated variability of the low- and high-energy peaks. This will contribute to improving the understanding of the structure of the inner jet and the origin of the seed photons for the IC process, and to discriminating among the different emission models for red blazars during high gamma-ray activity. The AGILE Team is going to activate a similar ToO with Swift, while monitoring in the optical band is occurring thanks to the WEBT. Therefore, Spitzer observation of this blazar will give a unique and extraordinary opportunity to investigate its electromagnetic emission over a wide energy range during its strong flaring activity and thus to determine its Spectral Energy Distribution.

  16. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  17. Modeling robot contour processes

    NASA Astrophysics Data System (ADS)

    Whitney, D. E.; Edsall, A. C.

    Robot contour processes include those with contact force, like car-body grinding or deburring of complex castings, as well as those with little or no contact force, like inspection. This paper describes ways of characterizing, identifying, and estimating contours and robot trajectories. Contour and robot are modeled as stochastic processes in order to emphasize that both successive robot cycles and successive industrial workpieces are similar but not exactly the same. The stochastic models can be used to identify the state of a workpiece or process, or to design a filter to estimate workpiece shape and robot position from robot-based measurements.

  18. Agile methods in biomedical software development: a multi-site experience report

    PubMed Central

    Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A

    2006-01-01

    Background Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. Results We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and included both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. Conclusion We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods. PMID:16734914

  19. Microwave sintering process model.

    PubMed

    Peng, Hu; Tinga, W R; Sundararaj, U; Eadie, R L

    2003-01-01

    In order to simulate and optimize the microwave sintering of a silicon nitride and tungsten carbide/cobalt toolbits process, a microwave sintering process model has been built. A cylindrical sintering furnace was used containing a heat insulating layer, a susceptor layer, and an alumina tube containing the green toolbit parts between parallel, electrically conductive, graphite plates. Dielectric and absorption properties of the silicon nitride green parts, the tungsten carbide/cobalt green parts, and an oxidizable susceptor material were measured using perturbation and waveguide transmission methods. Microwave absorption data were measured over a temperature range from 20 degrees C to 800 degrees C. These data were then used in the microwave process model which assumed plane wave propagation along the radial direction and included the microwave reflection at each interface between the materials and the microwave absorption in the bulk materials. Heat transfer between the components inside the cylindrical sintering furnace was also included in the model. The simulated heating process data for both silicon nitride and tungsten carbide/cobalt samples closely follow the experimental data. By varying the physical parameters of the sintering furnace model, such as the thickness of the susceptor layer, the thickness of the alumina tube wall, the sample load volume and the graphite plate mass, the model predicts their effects, which is helpful in optimizing those parameters in the industrial sintering process. PMID:15323110
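
    The model's radial plane-wave treatment (reflection at each material interface plus exponential bulk absorption) can be sketched in one dimension. The layer thicknesses, attenuation coefficients, and interface reflectances below are illustrative placeholders, not the measured values from the paper.

```python
import math

# (thickness in m, attenuation coefficient in 1/m, entry-interface reflectance)
layers = [
    (0.010, 5.0, 0.05),   # heat-insulating layer
    (0.002, 80.0, 0.10),  # susceptor layer
    (0.005, 2.0, 0.08),   # alumina tube wall
]

def transmitted_power(p_in, layers):
    """Track forward power through the layer stack, recording the
    power absorbed in each layer via Beer-Lambert decay."""
    p = p_in
    absorbed = []
    for thickness, alpha, r in layers:
        p *= (1.0 - r)                      # loss at the entry interface
        p_out = p * math.exp(-alpha * thickness)
        absorbed.append(p - p_out)          # power deposited in this layer
        p = p_out
    return p, absorbed

p_out, absorbed = transmitted_power(100.0, layers)
```

    In this toy stack the thin, strongly attenuating susceptor layer absorbs the most power, mirroring its heating role in the furnace.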

  20. Evaluation of a novel laparoscopic camera for characterization of renal ischemia in a porcine model using digital light processing (DLP) hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Olweny, Ephrem O.; Tan, Yung K.; Faddegon, Stephen; Jackson, Neil; Wehner, Eleanor F.; Best, Sara L.; Park, Samuel K.; Thapa, Abhas; Cadeddu, Jeffrey A.; Zuzak, Karel J.

    2012-03-01

    Digital light processing hyperspectral imaging (DLP® HSI) was adapted for use during laparoscopic surgery by coupling a conventional laparoscopic light guide with a DLP-based Agile Light source (OL 490, Optronic Laboratories, Orlando, FL), incorporating a 0° laparoscope, and a customized digital CCD camera (DVC, Austin, TX). The system was used to characterize renal ischemia in a porcine model.

  1. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses an homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
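
    The "empirically based time- and temperature-dependent density model" at the heart of the self-expansion can be illustrated with a simple assumed form: density relaxes from the unfoamed value toward the foamed value at an Arrhenius-type, temperature-dependent rate. Both the functional form and all parameter values below are assumptions for this sketch, not the report's actual fit.

```python
import math

RHO0, RHO_F = 1200.0, 60.0      # unfoamed / fully foamed density, kg/m^3
A, EA, R = 5.0e3, 3.0e4, 8.314  # rate prefactor 1/s, activation energy J/mol, gas constant

def density(t, temp_k):
    """Foam density (kg/m^3) after t seconds at temperature temp_k (K)."""
    k = A * math.exp(-EA / (R * temp_k))  # temperature-dependent rate
    return RHO_F + (RHO0 - RHO_F) * math.exp(-k * t)
```

    In a continuum simulation like the one described, the rate of density change from a relation of this kind supplies the volumetric expansion that moves the free surface.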

  2. AGILE integration into APC for high mix logic fab

    NASA Astrophysics Data System (ADS)

    Gatefait, M.; Lam, A.; Le Gratiet, B.; Mikolajczak, M.; Morin, V.; Chojnowski, N.; Kocsis, Z.; Smith, I.; Decaunes, J.; Ostrovsky, A.; Monget, C.

    2015-09-01

    For C040 technology and below, photolithographic depth-of-focus control and dispersion improvement are essential to secure product functionality. Critical 193nm immersion layers present initial focus process windows close to machine control capability. For previous technologies, the standard scanner sensor (Level sensor - LS) was used to map wafer topology and expose the wafer at the right focus. Such optical embedded metrology, based on light reflection, suffers from reading issues that can no longer be neglected. Metrology errors are correlated with the inspected product area, for which material types and densities change, so optical properties are not constant. Various optical phenomena occur across the product field during wafer inspection and affect the quality and position of the reflected light. This can result in incorrect heights being recorded and exposures possibly being done out of focus. Focus inaccuracy combined with aggressive process windows on critical layers will directly impact product realization and therefore functionality and yield. ASML has introduced an air gauge sensor to complement the optical level sensor and enable optimal topology metrology. The use of this new sensor is managed by the AGILE (Air Gauge Improved process LEveling) application. This measurement, having no optical dependency, corrects for the optical inaccuracy of the level sensor and so improves best-focus dispersion across the product. Because stack complexity increases through the process flow, optical perturbation of the standard level sensor metrology grows and is greatest for metallization layers. For these reasons, AGILE implementation was first considered for contact and all metal layers. Another key point is that standard metrology is sensitive to layer and reticle/product density. The gain from AGILE is therefore greatest for multi-product masks and for complex System on Chip designs.
Into ST context (High

  3. An Investigation of Agility Issues in Scrum Teams Using Agility Indicators

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Wang, Xiaofeng

    Agile software development methods have emerged and become increasingly popular in recent years, yet the issues encountered by software development teams that strive to achieve agility using agile methods have not been explored systematically. Built upon a previous study that established a set of indicators of agility, this study investigates what issues are manifested in software development teams using agile methods, focusing particularly on Scrum teams. In other words, the goal of the chapter is to evaluate Scrum teams using agility indicators and thereby further validate the previously presented indicators with additional cases. A multiple case study research method is employed. The findings of the study reveal that the teams using Scrum do not necessarily achieve agility in terms of team autonomy, sharing, stability and embraced uncertainty. Possible reasons include a previous organizational plan-driven culture, resistance towards the Scrum roles, and changing resources.

  4. The AGILE gamma-ray astronomy mission

    NASA Astrophysics Data System (ADS)

    Mereghetti, S.; Tavani, M.; Argan, A.; Barbiellini, G.; Caraveo, P.; Chen, A.; Cocco, V.; Costa, E.; Di Cocco, G.; Feroci, M.; Labanti, C.; Lapshov, I.; Lipari, P.; Longo, F.; Morselli, A.; Perotti, F.; Picozza, P.; Pittori, C.; Prest, M.; Rubini, A.; Soffitta, P.; Vallazza, E.; Vercellone, S.; Zanello, D.

    2001-09-01

    We describe the AGILE satellite: a unique tool for high-energy astrophysics in the 30 MeV - 50 GeV range before GLAST. The scientific performances of AGILE are comparable to those of EGRET, despite the much smaller weight and dimensions. The AGILE mission will be optimized for the imaging capabilities above 30 MeV and for the study of transient phenomena, complemented by simultaneous monitoring in the hard X-ray band (10 - 40 keV).

  5. Assessment of proposed fighter agility metrics

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.; Downing, David R.

    1990-01-01

    This paper presents the results of an analysis of proposed metrics to assess fighter aircraft agility. A novel framework for classifying these metrics is developed and applied. A set of transient metrics intended to quantify the axial and pitch agility of fighter aircraft is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed, and sensitivity to pilot-introduced errors during flight testing is investigated. Results indicate that the power onset and power loss parameters are promising candidates for quantifying axial agility, while maximum pitch-up and pitch-down rates are promising for quantifying pitch agility.
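
    One of the transient metrics of this kind (peak roll rate during a 180-deg roll, as used in the simulator assessments above) can be extracted from a sampled time history as sketched below; the bank-angle signal is synthetic, standing in for simulation output, with an assumed first-order response.

```python
import math

dt = 0.01                                  # sample period, s
times = [i * dt for i in range(500)]
# Synthetic bank angle (deg) for a 180-deg roll, first-order response
phi = [180.0 * (1.0 - math.exp(-1.5 * ti)) for ti in times]

# Central-difference roll rate (deg/s) and its peak value
p = [(phi[i + 1] - phi[i - 1]) / (2.0 * dt) for i in range(1, len(phi) - 1)]
max_roll_rate = max(p)
```

    The same finite-difference approach applies to pitch-rate metrics, given a pitch-attitude history instead of a bank-angle history.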

  6. Agile robotic edge finishing

    SciTech Connect

    Powell, M.

    1996-08-01

    Edge finishing processes have seemed like ideal candidates for automation. Most edge finishing processes are unpleasant, dangerous, tedious, expensive, not repeatable and labor intensive. Estimates place the cost of manual edge finishing processes at 12% of the total cost of fabricating precision parts. For small, high precision parts, the cost of hand finishing may be as high as 30% of the total part cost. Up to 50% of this cost could be saved through automation. This cost estimate includes the direct costs of edge finishing: the machining hours required and the 30% scrap and rework rate after manual finishing. Not included in these estimates are the indirect costs resulting from cumulative trauma disorders and the retraining costs caused by the high turnover rate for finishing jobs. Despite the apparent economic advantages, edge finishing has proven difficult to automate except in low precision and/or high volume production environments. Finishing automation systems have not been deployed successfully in Department of Energy defense programs (DOE/DP) production. A few systems have been attempted but were subsequently abandoned for traditional edge finishing approaches: scraping, grinding, and filing the edges using modified dental tools and hand-held power tools. Edge finishing automation has been an elusive but potentially lucrative production enhancement. The time required to reconfigure workcells for new parts, the time required to reprogram the workcells to finish new parts, and the inability of automation equipment to respond to fixturing errors and part tolerances are the most common reasons cited for eliminating automation as an option for DOE/DP edge finishing applications. Existing automated finishing systems have proven to be economically viable only where setup and reprogramming costs are a negligible fraction of overall production costs.

  7. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods still lack usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. A worldwide response of 92 practitioners was received. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of the usability and quality of the product developed; and has increased the satisfaction of the product's end-users. The most frequently used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  8. Network configuration management: paving the way to network agility.

    SciTech Connect

    Maestas, Joseph H.

    2007-08-01

    Sandia networks consist of nearly nine hundred routers and switches and nearly one million lines of command code, and each line ideally contributes to the capabilities of the network to convey information from one location to another. Sandia's Cyber Infrastructure Development and Deployment organizations recognize that it is therefore essential to standardize network configurations and enforce conformance to industry best business practices and documented internal configuration standards to provide a network that is agile, adaptable, and highly available. This is especially important in times of constrained budgets as members of the workforce are called upon to improve efficiency, effectiveness, and customer focus. Best business practices recommend using the standardized configurations in the enforcement process so that when root cause analysis results in recommended configuration changes, subsequent configuration auditing will improve compliance to the standard. Ultimately, this minimizes mean time to repair, maintains the network security posture, improves network availability, and enables efficient transition to new technologies. Network standardization brings improved network agility, which in turn enables enterprise agility, because the network touches all facets of corporate business. Improved network agility improves the business enterprise as a whole.

  9. Tailoring Agility: Promiscuous Pair Story Authoring and Value Calculation

    NASA Astrophysics Data System (ADS)

    Tendon, Steve

    This chapter describes how a multi-national software organization created a business plan involving business units from eight countries that followed an agile way, after two previously failed attempts with traditional approaches. The case is told by the consultant who initiated the implementation of agility into requirements gathering, estimation and planning processes in an international setting. The agile approach was inspired by XP, but tailored to meet the organization's peculiar requirements. Two innovations were critical. The first was promiscuous pair story authoring, where user stories were written by two people (similarly to pair programming) and the pairing changed very often (as frequently as every 15-20 minutes) to achieve promiscuity and cater for diverse points of view. The second was the attribution of an economic value (rather than a cost) to stories. Continuous recalculation of the financial value of the stories made it possible to assess the project's financial return. In this case the implementation of agility in an international context allowed the team members involved to reach consensus and unanimity in decisions, vision and purpose.

  10. Agile manufacturing and constraints management: a strategic perspective

    NASA Astrophysics Data System (ADS)

    Stratton, Roy; Yusuf, Yahaya Y.

    2000-10-01

    The definition of the agile paradigm has proved elusive; it is often viewed as a panacea, in contention with more traditional approaches to operations strategy development, and lacking its own methodology and tools. The Theory of Constraints (TOC) is also poorly understood, as it is commonly associated solely with production planning and control systems and bottleneck management. This paper will demonstrate the synergy between these two approaches together with the Theory of Inventive Problem Solving (TRIZ), and establish how the systematic elimination of trade-offs can support the agile paradigm. Whereas agility is often seen as a trade-off-free destination, both TOC and TRIZ may be considered route finders, as they comprise methodologies that focus on the identification and elimination of the trade-offs that constrain the purposeful improvement of a system, be it organizational or mechanical. This paper will also show how the TOC thinking process may be combined with the TRIZ knowledge-based approach and used to break contradictions within agile logistics.

  11. Agent-based scheduling system to achieve agility

    NASA Astrophysics Data System (ADS)

    Akbulut, Muhtar B.; Kamarthi, Sagar V.

    2000-12-01

    Today's competitive enterprises need to design, develop, and manufacture their products rapidly and inexpensively. Agile manufacturing has emerged as a new paradigm to meet these challenges. Agility requires, among many other things, scheduling and control software systems that are flexible, robust, and adaptive. In this paper a new agent-based scheduling system (ABSS) is developed to meet the challenges of an agile manufacturing system. In ABSS, unlike in traditional approaches, information and decision-making capabilities are distributed among the system entities called agents. In contrast with most agent-based scheduling systems, which commonly use a bidding approach, ABSS employs a global performance monitoring strategy. A production-rate-based global performance metric that effectively assesses the system performance is developed to assist the agents' decision-making process. To test the architecture, agent-based discrete event simulation software was developed. The experiments performed using the simulation software yielded encouraging results supporting the applicability of agent-based systems to the scheduling and control needs of an agile manufacturing system.
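
A minimal sketch of the global-performance idea, with hypothetical agent and metric definitions (the abstract does not give ABSS's actual formulas): each machine agent estimates when it could finish a job, and the job is routed to the agent whose estimate best sustains overall throughput:

```python
def production_rate(jobs_completed, elapsed_time):
    """Global performance metric: system throughput so far."""
    return jobs_completed / elapsed_time if elapsed_time > 0 else 0.0

class MachineAgent:
    def __init__(self, name, speed):
        self.name, self.speed = name, speed
        self.busy_until = 0.0

    def estimate_finish(self, job_size, now):
        """This agent's estimate: when the job would finish if it took it."""
        return max(self.busy_until, now) + job_size / self.speed

def assign(agents, job_size, now):
    """Route the job to the agent whose estimate best sustains global
    throughput, i.e. the earliest estimated finish time."""
    best = min(agents, key=lambda a: a.estimate_finish(job_size, now))
    best.busy_until = best.estimate_finish(job_size, now)
    return best
```

Unlike a pure bidding scheme, the decision here is driven by a shared system-level measure, which is the distinction the abstract draws.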

  12. Multiply-agile encryption in high speed communication networks

    SciTech Connect

    Pierson, L.G.; Witzke, E.L.

    1997-05-01

    Different applications have different security requirements for data privacy, data integrity, and authentication. Encryption is one technique that addresses these requirements. Encryption hardware, designed for use in high-speed communications networks, can satisfy a wide variety of security requirements if that hardware is key-agile, robustness-agile and algorithm-agile. Hence, multiply-agile encryption provides enhanced solutions to the secrecy, interoperability and quality of service issues in high-speed networks. This paper defines these three types of agile encryption. Next, implementation issues are discussed. While single-algorithm, key-agile encryptors exist, robustness-agile and algorithm-agile encryptors are still research topics.
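
To illustrate what key- and algorithm-agility mean at the interface level (a toy sketch with deliberately insecure stand-in ciphers, not the hardware design described in the paper), each connection can carry its own (algorithm, key) security association:

```python
# Deliberately insecure toy ciphers standing in for real algorithms.
TOY_CIPHERS = {
    "xor": (lambda data, key: bytes(b ^ key for b in data),
            lambda data, key: bytes(b ^ key for b in data)),
    "add": (lambda data, key: bytes((b + key) % 256 for b in data),
            lambda data, key: bytes((b - key) % 256 for b in data)),
}

class AgileEncryptor:
    """Key-agile and algorithm-agile: each connection keeps its own
    (algorithm, key) security association, switchable at any time."""
    def __init__(self):
        self.assoc = {}

    def configure(self, conn, algorithm, key):
        self.assoc[conn] = (algorithm, key)

    def encrypt(self, conn, data):
        alg, key = self.assoc[conn]
        return TOY_CIPHERS[alg][0](data, key)

    def decrypt(self, conn, data):
        alg, key = self.assoc[conn]
        return TOY_CIPHERS[alg][1](data, key)
```

In a real high-speed encryptor this per-connection context switch must happen at line rate, which is what makes the hardware problem hard.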

  13. On the Biomimetic Design of Agile-Robot Legs

    PubMed Central

    Garcia, Elena; Arevalo, Juan Carlos; Muñoz, Gustavo; Gonzalez-de-Santos, Pablo

    2011-01-01

    The development of functional legged robots has encountered its limits in human-made actuation technology. This paper describes research on the biomimetic design of legs for agile quadrupeds. A biomimetic leg concept that extracts key principles from horse legs which are responsible for the agile and powerful locomotion of these animals is presented. The proposed biomimetic leg model defines the effective leg length, leg kinematics, limb mass distribution, actuator power, and elastic energy recovery as determinants of agile locomotion, and values for these five key elements are given. The transfer of the extracted principles to technological instantiations is analyzed in detail, considering the availability of current materials, structures and actuators. A real leg prototype has been developed following the proposed biomimetic leg concept. The actuation system is based on the hybrid use of series elasticity and magneto-rheological dampers, which provides variable compliance for natural motion. From the experimental evaluation of this prototype, conclusions are presented on the current technological barriers to achieving functional legged robots capable of dynamic, agile locomotion. PMID:22247667
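
The variable-compliance actuation described can be illustrated with the standard series-elastic force law plus an adjustable damping term (a sketch under assumed linear spring/damper behavior; the prototype's actual magneto-rheological model is more complex):

```python
def sea_force(k_spring, c_damper, x_motor, x_load, v_motor, v_load):
    """Series-elastic actuator output: spring deflection force plus an
    adjustable damping force (c_damper stands in for a tunable MR damper)."""
    return k_spring * (x_motor - x_load) + c_damper * (v_motor - v_load)
```

Tuning `c_damper` online is what gives the leg its variable compliance: the same spring deflection yields a softer or stiffer response depending on the damper setting.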

  14. How Can Agile Practices Minimize Global Software Development Co-ordination Risks?

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Babar, Muhammad Ali; Verner, June

    The distribution of project stakeholders in Global Software Development (GSD) projects poses significant risks related to project communication, coordination and control processes. There is growing interest in applying agile practices in GSD projects in order to leverage the advantages of both approaches. In some cases, GSD project managers use agile practices to reduce project distribution challenges. We use an existing coordination framework to identify GSD coordination problems due to temporal, geographical and socio-cultural distances. An industry-based case study is used to describe, explore and explain the use of agile practices to reduce development coordination challenges.

  15. An Agile Course-Delivery Approach

    ERIC Educational Resources Information Center

    Capellan, Mirkeya

    2009-01-01

    In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…

  16. The Introduction of Agility into Albania.

    ERIC Educational Resources Information Center

    Smith-Stevens, Eileen J.; Shkurti, Drita

    1998-01-01

    Describes a plan to introduce and achieve a national awareness of agility (and easy entry into the world market) for Albania through the relatively stable higher-education order. Agility's four strategic principles are enriching the customer, cooperating to enhance competitiveness, organizing to master change and uncertainty, and leveraging the…

  17. Teaching Agile Software Development: A Case Study

    ERIC Educational Resources Information Center

    Devedzic, V.; Milenkovic, S. R.

    2011-01-01

    This paper describes the authors' experience of teaching agile software development to students of computer science, software engineering, and other related disciplines, and comments on the implications of this and the lessons learned. It is based on the authors' eight years of experience in teaching agile software methodologies to various groups…

  18. Transitioning from Distributed and Traditional to Distributed and Agile: An Experience Report

    NASA Astrophysics Data System (ADS)

    Wildt, Daniel; Prikladnicki, Rafael

    Global companies that experienced extensive waterfall phased plans are trying to improve their existing processes to expedite team engagement. Agile methodologies have become an acceptable path to follow because they incorporate project management among their practices. Agile practices have been used with the objective of simplifying project control through simple processes, easily updated documentation, and greater team interaction over exhaustive documentation, focusing on continuous team improvement and aiming to add value to business processes. The purpose of this chapter is to describe the experience of a global multinational company in transitioning from distributed and traditional to distributed and agile. This company has development centers across North America, South America and Asia. This chapter covers challenges faced by the project teams of two pilot projects, including strengths of using agile practices in a globally distributed environment and practical recommendations for similar endeavors.

  19. Fighter agility metrics, research, and test

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.

    1990-01-01

    Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A completed set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation provided by the NASA Dryden Flight Research Center. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available. Simulation documentation and user instructions are provided in an appendix.

  20. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be too heavyweight for actual bioinformatics practice. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment, are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method, which iterates development phases through trial and error. Here, we show the application of the scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. We therefore hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented Pwrake workflows to process next generation sequencing data using the Genome Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate the modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain-specific language design built on Ruby gives rakefiles the flexibility needed for writing scientific workflows.
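
The two-phase pattern the authors describe, a stable workflow definition plus a separately adjustable parameter set, can be sketched outside Rake as well. The following Python toy (task names and parameters are invented, not taken from the GATK/Dindel workflows) keeps the task graph and the tunable parameters in separate objects so each phase can change independently:

```python
# Phase 1: workflow definition -- a task graph that rarely changes.
# Phase 2: parameter adjustment -- a plain dict tweaked between runs.
PARAMS = {"min_quality": 20, "threads": 4}

log = []  # records (task, parameter) calls for demonstration

TASKS = {
    "align":  (lambda p: log.append(("align", p["threads"])), []),
    "filter": (lambda p: log.append(("filter", p["min_quality"])), ["align"]),
    "call":   (lambda p: log.append(("call", None)), ["filter"]),
}

def run_workflow(tasks, params):
    """Run each task after its dependencies, passing in the current params."""
    done, order = set(), []
    def run(name):
        if name in done:
            return
        func, deps = tasks[name]
        for dep in deps:
            run(dep)
        func(params)
        done.add(name)
        order.append(name)
    for name in tasks:
        run(name)
    return order
```

Re-running with a modified `PARAMS` dict exercises the parameter adjustment phase without touching the task graph, which mirrors the separation the authors found helpful.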

  1. Geometric simulation analysis of multi-band mosaic imaging from the same orbit by agile satellites

    NASA Astrophysics Data System (ADS)

    Xu, Yue; Chen, Jinwei; Chen, Yueting; Xu, Zhihai; Feng, Huajun; Li, Qi

    2015-08-01

    This paper establishes a geometric model of multi-band mosaic imaging from the same orbit by agile satellites, and introduces self-written simulation software. Geometric parameters of each band are calculated based on the attitude control capability of the satellite and the mission requirements. To account for the different ground resolution and imaging angle of each band, two new concepts, Gradient Entropy and the Structure Similarity Parameter, are presented. These two values are used to evaluate the change in image quality caused by agility and help to estimate the effect of the mission. By building the geometric model and calculating the agility information with the program, we propose a new approach to forward analysis of agile imaging, which helps users evaluate the image degradation.
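
The paper's exact definition of Gradient Entropy is not given in the abstract; one plausible reading, sketched here purely as an assumption, is the Shannon entropy of the gradient-magnitude histogram, which drops as agility-induced blur flattens local gradients:

```python
import math

def gradient_entropy(img, bins=8, max_grad=255):
    """Shannon entropy of the gradient-magnitude histogram; a blurred
    image concentrates gradients in few bins, lowering the entropy."""
    grads = []
    for r in range(len(img) - 1):
        for c in range(len(img[0]) - 1):
            gx = img[r][c + 1] - img[r][c]
            gy = img[r + 1][c] - img[r][c]
            grads.append(min(math.hypot(gx, gy), max_grad))
    hist = [0] * bins
    for g in grads:
        hist[min(int(g * bins / (max_grad + 1)), bins - 1)] += 1
    n = len(grads)
    return -sum(h / n * math.log2(h / n) for h in hist if h)
```

A flat image scores zero, while a detailed one spreads its gradients over many bins and scores higher, giving a simple scalar to compare bands imaged at different angles and resolutions.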

  2. Adopting best practices: "Agility" moves from software development to healthcare project management.

    PubMed

    Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge

    2006-01-01

    It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments. PMID:16554690

  3. Towards a Better Understanding of CMMI and Agile Integration - Multiple Case Study of Four Companies

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna

    The amount of software is increasing across different domains in Europe. This provides the industries in smaller countries good opportunities to work in the international markets. Success in the global markets, however, demands the rapid production of high quality, error free software. Both CMMI and agile methods seem to provide a ready solution for quality and lead time improvements. There is not, however, much empirical evidence available about either 1) how the integration of these two aspects can be done in practice or 2) what it actually demands from assessors and software process improvement groups. The goal of this paper is to increase the understanding of CMMI and agile integration, in particular focusing on the research question: how to use a ‘lightweight’ style of CMMI assessment in agile contexts. This is done via four case studies in which assessments were conducted using the goals of the CMMI integrated project management and collaboration and coordination with relevant stakeholder process areas, and practices from XP and Scrum. The study shows that the use of agile practices may support the fulfilment of the goals of CMMI process areas, but there are still many challenges for agile teams to solve within continuous improvement programs. It also offers practical advice for assessors and improvement groups to consider when conducting assessments in the context of agile software development.

  4. Agile manufacturing concepts and opportunities in ceramics

    SciTech Connect

    Booth, C.L.; Harmer, M.P.

    1995-08-01

    In 1991 Lehigh University facilitated seminars over a period of 8 months to define manufacturing needs for the 21st century. They concluded that the future will be characterized by rapid changes in technology advances, customer demands, and shifts in market dynamics, and coined the term "Agile Manufacturing". Agile manufacturing refers to the ability to thrive in an environment of constant unpredictable change. Market opportunities are attacked by partnering to form virtual firms to dynamically obtain the required skills for each product opportunity. This paper will describe and compare agile vs. traditional concepts of organization and structure, management policy and ethics, employee environment, product focus, information, and paradigm shift. Examples of agile manufacturing applied to ceramic materials will be presented.

  5. Preparing your Offshore Organization for Agility: Experiences in India

    NASA Astrophysics Data System (ADS)

    Srinivasan, Jayakanth

    Two strategies that have significantly changed the way we conventionally think about managing software development and sustainment are the family of development approaches collectively referred to as agile methods, and the distribution of development efforts on a global scale. When you combine the two strategies, organizations have to address not only the technical challenges that arise from introducing new ways of working, but more importantly have to manage the 'soft' factors that if ignored lead to hard challenges. Using two case studies of distributed agile software development in India we illustrate the areas that organizations need to be aware of when transitioning work to India. The key issues that we emphasize are the need to recruit and retain personnel; the importance of teaching, mentoring and coaching; the need to manage customer expectations; the criticality of well-articulated senior leadership vision and commitment; and the reality of operating in a heterogeneous process environment.

  6. Agility Following the Application of Cold Therapy

    PubMed Central

    Evans, Todd A.; Ingersoll, Christopher; Knight, Kenneth L.; Worrell, Teddy

    1995-01-01

    Cold application is commonly used before strenuous exercise due to its hypalgesic effects. Some have questioned this procedure because of reports that cold may reduce isokinetic torque. However, there have been no investigations of actual physical performance following cold application. The purpose of this study was to determine if a 20-minute ice immersion treatment to the foot and ankle affected the performance of three agility tests: the carioca maneuver, the cocontraction test, and the shuttle run. Twenty-four male athletic subjects were tested during two different treatment sessions following an orientation session. Subjects were tested following a 20-minute 1°C ice immersion treatment to the dominant foot and ankle and 20 minutes of rest. Following each treatment, subjects performed three trials of each agility test, with 30 seconds rest between each trial, and 1 minute between each different agility test. The order in which each subject performed the agility tests was determined by a balanced Latin square. A MANOVA with repeated measures was used to determine if there was an overall significant difference in the agility times recorded between the cold and control treatments and if the order of the treatment sessions affected the scores. Although the mean agility time scores were slightly slower following the cold treatment, cooling the foot and ankle caused no difference in agility times. Also, there was no difference resulting from the treatment orders. We felt that the slightly slower scores may have been a result of tissue stiffness and/or subjects' apprehension immediately following the cold treatment. Cold application to the foot and ankle can be used before strenuous exercise without altering agility. PMID:16558341

  7. Agile Data Management with the Global Change Information System

    NASA Astrophysics Data System (ADS)

    Duggan, B.; Aulenbach, S.; Tilmes, C.; Goldstein, J.

    2013-12-01

    We describe experiences applying agile software development techniques to the realm of data management during the development of the Global Change Information System (GCIS), a web service and API for authoritative global change information under development by the US Global Change Research Program. Some of the challenges during system design and implementation have been: (1) balancing the need for a rigorous mechanism for ensuring information quality with the realities of large data sets whose contents are often in flux, (2) utilizing existing data to inform decisions about the scope and nature of new data, and (3) continuously incorporating new knowledge and concepts into a relational data model. The workflow for managing the content of the system has much in common with the development of the system itself. We examine various aspects of agile software development and discuss whether or how we have been able to use them for data curation as well as software development.

  8. SuperAGILE Services at ASDC

    SciTech Connect

    Preger, B.; Verrecchia, F.; Pittori, C.; Antonelli, L. A.; Giommi, P.; Lazzarotto, F.; Evangelista, Y.

    2008-05-22

    The Italian Space Agency Science Data Center (ASDC) is a facility with several responsibilities, including support to all ASI scientific missions in the management and archiving of data, acting as the interface between ASI and the scientific community, and providing on-line access to the data hosted. In this poster we describe the services that ASDC provides for SuperAGILE, in particular the ASDC public web pages devoted to the dissemination of SuperAGILE scientific results. SuperAGILE is the X-ray imager onboard the AGILE mission, and provides the scientific community with orbit-by-orbit information on the observed sources. Crucial source information, including position and flux in chosen energy bands, will be reported on the SuperAGILE public web page at ASDC. Given their particular interest, another web page will be dedicated entirely to GRBs and other transients, where new event alerts will be notified and where users will find all the available information on the GRBs detected by SuperAGILE.

  9. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  10. A Big Data-driven Model for the Optimization of Healthcare Processes.

    PubMed

    Koufi, Vassiliki; Malamateniou, Flora; Vassilacopoulos, George

    2015-01-01

    Healthcare organizations increasingly navigate a highly volatile, complex environment in which technological advancements and new healthcare delivery business models are the only constants. In their effort to outperform in this environment, healthcare organizations need to be agile enough to become responsive to these increasingly changing conditions. To act with agility, healthcare organizations need to discover new ways to optimize their operations. To this end, they focus on the healthcare processes that guide healthcare delivery and on the technologies that support them. Business process management (BPM) and Service-Oriented Architecture (SOA) can provide a flexible, dynamic, cloud-ready infrastructure where business process analytics can be utilized to extract useful insights from mountains of raw data, and make them work in ways beyond the abilities of human brains or IT systems from just a year ago. This paper presents a framework that helps healthcare professionals gain better insight within and across their business processes. In particular, it performs real-time analysis on process-related data in order to reveal areas of potential process improvement. PMID:25991242
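
A minimal sketch of the kind of process analytics described, with invented event-log fields and a simple threshold rule rather than the framework's actual method: compute mean durations per process step and flag steps that dominate overall cycle time:

```python
from collections import defaultdict

def step_means(events):
    """events: (case_id, step, start, end) tuples from a process log.
    Returns the mean duration of each step."""
    totals = defaultdict(lambda: [0.0, 0])
    for _case, step, start, end in events:
        totals[step][0] += end - start
        totals[step][1] += 1
    return {step: total / n for step, (total, n) in totals.items()}

def bottlenecks(means, factor=2.0):
    """Flag steps whose mean duration exceeds factor x the overall mean."""
    overall = sum(means.values()) / len(means)
    return sorted(s for s, m in means.items() if m > factor * overall)
```

Fed continuously from a BPM engine's event stream, such a rule would surface candidate steps for process improvement in near real time.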

  11. Balancing Plan-Driven and Agile Methods in Software Engineering Project Courses

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Port, Dan; Winsor Brown, A.

    2002-09-01

    For the past 6 years, we have been teaching a two-semester software engineering project course. The students organize into 5-person teams and develop largely web-based electronic services projects for real USC campus clients. We have been using and evolving a method called Model-Based (System) Architecting and Software Engineering (MBASE) for use in both the course and in industrial applications. The MBASE Guidelines include a lot of documents. We teach risk-driven documentation: if it is risky to document something, and not risky to leave it out (e.g., GUI screen placements), leave it out. Even so, students tend to associate more documentation with higher grades, although our grading eventually discourages this. We are always on the lookout for ways to have students learn best practices without having to produce excessive documentation. Thus, we were very interested in analyzing the various emerging agile methods. We found that agile methods and milestone plan-driven methods are part of a “how much planning is enough?” spectrum. Both agile and plan-driven methods have home grounds of project characteristics where they clearly work best, and where the other will have difficulties. Hybrid agile/plan-driven approaches are feasible, and necessary for projects having a mix of agile and plan-driven home ground characteristics. Information technology trends are going more toward the agile methods' home ground characteristics of emergent requirements and rapid change, although there is a concurrent increase in concern with dependability. As a result, we are currently experimenting with risk-driven combinations of MBASE and agile methods, such as integrating requirements, test plans, peer reviews, and pair programming into “agile quality management.”

  12. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  13. Function-based integration strategy for an agile manufacturing testbed

    NASA Astrophysics Data System (ADS)

    Park, Hisup

    1997-01-01

    This paper describes an integration strategy for plug-and-play software based on functional descriptions of the software modules. The functional descriptions identify explicitly the role of each module with respect to the overall system. They define the critical dependencies that affect the individual modules and thus affect the behavior of the system. The specified roles, dependencies, and behavioral constraints are then incorporated in a group of shared objects that are distributed over a network. These objects may be interchanged with others without disrupting the system, so long as the replacements meet the interface and functional requirements. In this paper, we propose a framework for modeling the behavior of plug-and-play software modules that will be used to (1) design and predict the outcome of the integration, (2) generate the interface and functional requirements of individual modules, and (3) form a dynamic foundation for applying interchangeable software modules. We describe this strategy in the context of the development of an agile manufacturing testbed. The testbed represents a collection of production cells for machining operations, supported by a network of software modules or agents for planning, fabrication, and inspection. A process definition layer holds the functional description of the software modules. A network of distributed objects interact with one another over the Internet and comprise the plug-compatible software nodes that execute these functions. This paper will explore the technical and operational ramifications of using the functional description framework to organize and coordinate the distributed object modules.
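
    The interchangeability rule the abstract describes — a replacement module must satisfy the same role, interface, and functional requirements as the module it replaces — can be sketched in a few lines of Python. The `FunctionalDescription` class, its fields, and the module names below are hypothetical illustrations of the idea, not part of the testbed described in the paper.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class FunctionalDescription:
    """Declares a module's role and its interface dependencies."""
    role: str                                      # e.g. "process-planning"
    inputs: frozenset = field(default_factory=frozenset)
    outputs: frozenset = field(default_factory=frozenset)

def can_replace(current, candidate):
    """A candidate may be plugged in if it serves the same role, needs no
    inputs beyond those the system already provides, and produces at least
    the outputs other modules depend on."""
    return (candidate.role == current.role
            and candidate.inputs <= current.inputs
            and candidate.outputs >= current.outputs)

# Hypothetical planner modules for a machining cell.
planner_v1 = FunctionalDescription("process-planning",
                                   inputs=frozenset({"part-geometry"}),
                                   outputs=frozenset({"process-plan"}))
planner_v2 = FunctionalDescription("process-planning",
                                   inputs=frozenset({"part-geometry"}),
                                   outputs=frozenset({"process-plan", "cost-estimate"}))

print(can_replace(planner_v1, planner_v2))  # True: same role, richer outputs
```

    Note that the check is asymmetric: the richer `planner_v2` cannot in turn be replaced by `planner_v1`, since downstream modules may now depend on the extra output.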

  14. An agile mask data preparation and writer dispatching approach

    NASA Astrophysics Data System (ADS)

    Hsu, Chih-tung; Chen, Y. S.; Hsin, S. C.; Tuo, Laurent C.; Schulze, Steffen F.

    2004-08-01

    An agile mask data preparation (MDP) approach is proposed to cut the re-fracture cycle time incurred by mask writer dispatching policy changes. Shorter re-fracture cycle time increases the flexibility of mask writer dispatching; as a result, mask writer capacity can be utilized optimally. Preliminary results demonstrate promising benefits in MDP cycle time reduction and writer dispatching flexibility: the agile MDP can save up to 40% of re-fracture cycle time. OASIS (Open Artwork System Interchange Standard) was proposed to address the GDSII file size explosion problem; however, OASIS has yet to gain wide acceptance in the mask industry. The authors envision OASIS adoption by the mask industry as a three-phase process and identify key issues of each phase from the mask manufacturer's perspective. As a long-term MDP flow reengineering project, an agile MDP and writer dispatching approach based on OASIS is proposed. The paper describes the results of an extensive evaluation of OASIS performance compared to that of GDSII, for both original GDSII and post-OPC GDSII files. For eighty percent of the original GDSII files, the file size is more than ten times that of the OASIS counterpart.
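
    The ten-times comparison in the last sentence is a simple ratio statistic over paired file sizes, sketched below in Python. The file names and byte counts are hypothetical stand-ins, not measurements from the paper's evaluation.

```python
# Hypothetical (gdsii_bytes, oasis_bytes) pairs for a set of layouts.
measurements = {
    "layer_a.gds": (1_200_000_000, 95_000_000),
    "layer_b.gds": (800_000_000, 60_000_000),
    "layer_c.gds": (450_000_000, 50_000_000),
    "layer_d.gds": (2_500_000_000, 180_000_000),
    "layer_e.gds": (300_000_000, 40_000_000),
}

def fraction_over(pairs, threshold=10.0):
    """Fraction of layouts whose GDSII size exceeds `threshold` times
    the size of the equivalent OASIS file."""
    hits = sum(1 for gds, oas in pairs.values() if gds / oas > threshold)
    return hits / len(pairs)

print(f"{fraction_over(measurements):.0%} of files exceed the 10x ratio")
```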

  15. Agile Data Curation at a State Geological Survey

    NASA Astrophysics Data System (ADS)

    Hills, D. J.

    2015-12-01

    State agencies, including geological surveys, are often the gatekeepers for myriad data products essential for scientific research and economic development. For example, the Geological Survey of Alabama (GSA) is mandated to explore for, characterize, and report Alabama's mineral, energy, water, and biological resources in support of economic development, conservation, management, and public policy for the betterment of Alabama's citizens, communities, and businesses. As part of that mandate, the GSA has increasingly been called upon to make our data more accessible to stakeholders. Even as demand for greater data accessibility grows, budgets for such efforts are often small, meaning that agencies must do more with less. Agile software development has yielded efficient, effective products, most often at lower cost and in shorter time. Taking guidance from the agile software development model, the GSA is working towards more agile data management and curation. To date, the GSA's work has been focused primarily on data rescue. By using workflows that maximize clear communication while encouraging simplicity (e.g., maximizing the amount of work not done or that can be automated), the GSA is bringing decades of dark data into the light. Regular checks by the data rescuer with the data provider (or their proxy) provide quality control without adding an overt burden on either party. Moving forward, these workflows will also allow for more efficient and effective data management.

  16. Distributed agile software development for the SKA

    NASA Astrophysics Data System (ADS)

    Wicenec, Andreas; Parsons, Rebecca; Kitaeff, Slava; Vinsen, Kevin; Wu, Chen; Nelson, Paul; Reed, David

    2012-09-01

    The SKA software will most probably be developed by many groups distributed across the globe and coming from different backgrounds, such as industry and research institutions. The SKA software subsystems will have to cover a very wide range of different areas, yet they have to react and work together like a single system to achieve the scientific goals and satisfy the challenging data flow requirements. Designing and developing such a system in a distributed fashion requires proper tools and the setup of an environment that allows efficient detection and tracking of interface and integration issues in a timely way. Agile development can provide much faster feedback mechanisms and also much tighter collaboration between the customer (scientist) and the developer. Continuous integration and continuous deployment, on the other hand, can provide much faster feedback on integration issues from the system level to the subsystem developers. This paper describes the results obtained from trialing a potential SKA development environment based on existing science software development processes such as ALMA's, the expected distribution of the groups potentially involved in the SKA development, and experience gained in the development of large-scale commercial software projects.

  17. Design and characterization of frequency agile RF and microwave devices using ferroelectrics

    NASA Astrophysics Data System (ADS)

    Nath, Jayesh

    A methodology for the optimized design of tunable distributed resonators is introduced and verified. This technique enables maximum tuning with minimum degradation in quality (Q) factor. The concept of a network transformation factor and a new figure-of-merit for tunable resonators are introduced and applied to experimental data. The figure-of-merit quantifies the trade-off between tunability and Q factor for a given tuning ratio of the variable reactance device. As such, it can be extended to the design of filters, phase shifters, antennas, matching networks, and other frequency-agile devices where resonant elements are used. Varactors utilizing Barium Strontium Titanate (BST) thin film were designed and fabricated in integrated form and also in discrete form as standard 0603 components. High-frequency characterization and modeling of the BST varactors are described. A novel characterization technique for the intrinsic loss extraction of symmetrical two-port networks was developed and verified experimentally. Both integrated and discrete BST thin-film varactors were used to design, fabricate, and characterize frequency-agile circuits. Tunable bandpass and bandstop filters and matching networks are described. A dual-mode, narrowband microstrip patch antenna with independently tunable modes was developed and characterized. Tuning and nonlinear characterization results are presented. Investigations into the use of BST thin-film varactors for voltage-controlled oscillators and phase shifters are also presented. Design parameters, fabrication issues, and processing challenges are discussed.

  18. Fighter agility metrics. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.

    1990-01-01

    Fighter flying qualities and combat capabilities are currently measured and compared in terms relating to vehicle energy, angular rates and sustained acceleration. Criteria based on these measurable quantities have evolved over the past several decades and are routinely used to design aircraft structures, aerodynamics, propulsion and control systems. While these criteria, or metrics, have the advantage of being well understood, easily verified and repeatable during test, they tend to measure the steady state capability of the aircraft and not its ability to transition quickly from one state to another. Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available.

  19. Gamma-ray Astrophysics with AGILE

    SciTech Connect

    Longo, Francesco; Tavani, M.; Barbiellini, G.; Argan, A.; Basset, M.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.; Chen, A.; Costa, E.; Del Monte, E.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Feroci, M.; Fiorini, M.; Foggetta, L.; Froysland, T.; Frutti, M.

    2007-07-12

    AGILE will explore the gamma-ray Universe with a very innovative instrument combining, for the first time, a gamma-ray imager and a hard X-ray imager. AGILE will be operational in spring 2007 and will provide crucial data for the study of Active Galactic Nuclei, Gamma-Ray Bursts, unidentified gamma-ray sources, Galactic compact objects, supernova remnants, TeV sources, and fundamental physics via microsecond timing. The AGILE instrument is designed to simultaneously detect and image photons in the 30 MeV - 50 GeV and 15 - 45 keV energy bands with excellent imaging and timing capabilities, and a large field of view covering ~1/5 of the entire sky at energies above 30 MeV. A CsI calorimeter is capable of GRB triggering in the energy band 0.3-50 MeV. AGILE is now (March 2007) undergoing launcher integration and testing. The PSLV launch is planned for spring 2007, and AGILE is foreseen to be fully operational during the summer of 2007.

  20. Enterprise Technologies Deployment for Agile Manufacturing

    SciTech Connect

    Neal, R.E.

    1992-11-01

    This report is intended for high-level technical planners who are responsible for planning future developments for their company or Department of Energy/Defense Programs (DOE/DP) facilities. On one hand, the information may be too detailed or contain too much manufacturing technology jargon for a high-level, nontechnical executive, while at the same time an expert in any of the four infrastructure fields (Product Definition/Order Entry, Planning and Scheduling, Shop Floor Management, and Intelligent Manufacturing Systems) will know more than is conveyed here. The purpose is to describe a vision of technology deployment for an agile manufacturing enterprise. According to the 21st Century Manufacturing Enterprise Strategy, the root philosophy of agile manufacturing is that "competitive advantage in the new systems will belong to agile manufacturing enterprises, capable of responding rapidly to demand for high-quality, highly customized products." Such agility will be based on flexible technologies, skilled workers, and flexible management structures which collectively will foster cooperative initiatives in and among companies. The remainder of this report is dedicated to sharpening our vision and to establishing a framework for defining specific project or pre-competitive project goals which will demonstrate agility through technology deployment.

  2. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time scale of up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment. The most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. Compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^5 and to preserve most of the predictions for major species. This allows a systematic approach to model simplification and offers guidance in designing experiments for validation.
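
    As a toy illustration of the kind of balance such a kinetic model resolves, the H2O2 concentration under constant irradiation can be reduced to radiolytic production (proportional to dose rate) against first-order consumption. This two-term sketch stands in for the full reaction network, and the yield, dose rate, and rate constant below are illustrative values, not parameters from the model described in the paper.

```python
import math

# Toy balance dc/dt = g*D - k*c for radiolytic H2O2.
g_h2o2 = 1.0e-9   # mol/(L*Gy), hypothetical radiolytic yield
dose_rate = 10.0  # Gy/s, hypothetical dose rate in the water film
k_loss = 1.0e-4   # 1/s, hypothetical first-order consumption rate

c_ss = g_h2o2 * dose_rate / k_loss  # steady-state concentration, mol/L

def h2o2_conc(t, c0=0.0):
    """Analytic solution of dc/dt = g*D - k*c: exponential relaxation
    from c0 toward the steady state g*D/k."""
    return c_ss + (c0 - c_ss) * math.exp(-k_loss * t)
```

    With these numbers the steady state is 1e-4 mol/L, approached on the 1/k_loss timescale regardless of the starting concentration; a sensitivity analysis like the one in the abstract asks which rate parameters actually move such predictions.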

  3. AGILE observation of a gamma-ray flare from the blazar 3C 279

    NASA Astrophysics Data System (ADS)

    Giuliani, A.; D'Ammando, F.; Vercellone, S.; Vittorini, V.; Chen, A. W.; Donnarumma, I.; Pacciani, L.; Pucella, G.; Trois, A.; Bulgarelli, A.; Longo, F.; Tavani, M.; Tosti, G.; Impiombato, D.; Argan, A.; Barbiellini, G.; Boffelli, F.; Caraveo, P. A.; Cattaneo, P. W.; Cocco, V.; Costa, E.; Del Monte, E.; de Paris, G.; Di Cocco, G.; Evangelista, Y.; Feroci, M.; Fiorini, M.; Fornari, F.; Froysland, T.; Fuschino, F.; Galli, M.; Gianotti, F.; Labanti, C.; Lapshov, Y.; Lazzarotto, F.; Lipari, P.; Marisaldi, M.; Mereghetti, S.; Morselli, A.; Pellizzoni, A.; Perotti, F.; Picozza, P.; Prest, M.; Rapisarda, M.; Rappoldi, A.; Soffitta, P.; Trifoglio, M.; Vallazza, E.; Zambra, A.; Zanello, D.; Cutini, S.; Gasparrini, D.; Pittori, C.; Preger, B.; Santolamazza, P.; Verrecchia, F.; Giommi, P.; Colafrancesco, S.; Salotti, L.

    2009-02-01

    Context: We report the detection by the AGILE satellite of an intense gamma-ray flare from the gamma-ray source 3EG J1255-0549, associated with the Flat Spectrum Radio Quasar 3C 279, during the AGILE pointings towards the Virgo region on 2007 July 9-13. Aims: The simultaneous optical, X-ray, and gamma-ray coverage allows us to study the spectral energy distribution (SED) and the theoretical models for the mid-July flaring episode. Methods: AGILE observed the source during its Science Performance Verification Phase with its two co-aligned imagers: the Gamma-Ray Imaging Detector (GRID) and the hard X-ray imager (Super-AGILE), sensitive in the 30 MeV-50 GeV and 18-60 keV energy bands, respectively. During the AGILE observation the source was monitored simultaneously in the optical band by the REM telescope and in the X-ray band by the Swift satellite through four target-of-opportunity observations. Results: During 2007 July 9-13, AGILE-GRID detected gamma-ray emission from 3C 279, with the source at ~2° from the center of the field of view, with an average flux of (210 ± 38) × 10-8 ph cm-2 s-1 for energies above 100 MeV. No emission was detected by Super-AGILE, with a 3-σ upper limit of 10 mCrab. During the observation, which lasted about 4 days, no significant gamma-ray flux variation was observed. Conclusions: The spectral energy distribution is modelled with a homogeneous one-zone Synchrotron Self-Compton emission plus contributions from external Compton scattering of the direct disk radiation and, to a lesser extent, from external Compton scattering of photons from the Broad Line Region.

  4. Gamma-ray astrophysics with AGILE

    NASA Astrophysics Data System (ADS)

    Tavani, M.

    2003-09-01

    Gamma-ray astrophysics above 30 MeV will soon be revitalized by a new generation of high-energy detectors in space. We discuss here the AGILE Mission that will be dedicated to gamma-ray astrophysics above 30 MeV during the period 2005-2006. The main characteristics of AGILE are: (1) excellent imaging and monitoring capabilities both in the γ-ray (30 MeV - 30 GeV) and hard X-ray (10-40 keV) energy ranges (reaching an arcminute source positioning), (2) very good timing (improving by three orders of magnitude the instrumental deadtime for γ-ray detection compared to previous instruments), and (3) excellent imaging and triggering capability for Gamma-Ray Bursts. The AGILE scientific program will emphasize a quick response to gamma-ray transients and multiwavelength studies of gamma-ray sources.

  5. SuperAGILE and Gamma Ray Bursts

    SciTech Connect

    Pacciani, Luigi; Costa, Enrico; Del Monte, Ettore; Donnarumma, Immacolata; Evangelista, Yuri; Feroci, Marco; Frutti, Massimo; Lazzarotto, Francesco; Lapshov, Igor; Rubini, Alda; Soffitta, Paolo; Tavani, Marco; Barbiellini, Guido; Mastropietro, Marcello; Morelli, Ennio; Rapisarda, Massimo

    2006-05-19

    The solid-state hard X-ray imager of the AGILE gamma-ray mission -- SuperAGILE -- has a six-arcmin on-axis angular resolution in the 15-45 keV range and a field of view in excess of 1 steradian. The instrument is very light, only 5 kg. It is equipped with on-board self-triggering logic and image deconvolution, and it is able to transmit the coordinates of a GRB to the ground in real time through the ORBCOMM constellation of satellites. Photon-by-photon scientific data are sent to the Malindi ground station at every contact. In this paper we review the performance of the SuperAGILE experiment (scheduled for launch in the middle of 2006) after its first on-ground calibrations, and show the perspectives for Gamma Ray Bursts.

  6. Business Process Modeling: Perceived Benefits

    NASA Astrophysics Data System (ADS)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  7. Agile enterprise development framework utilizing services principles for building pervasive security

    NASA Astrophysics Data System (ADS)

    Farroha, Deborah; Farroha, Bassam

    2011-06-01

    We are in an environment of continuously changing mission requirements, and therefore our information systems must adapt to accomplish new tasks more quickly and proficiently. Agility is the only way we will be able to keep up with this change. But there are subtleties that must be considered as we adopt various agile methods: secure, protect, control, and authenticate are all elements needed to posture our information technology systems to counteract the real and perceived threats in today's environment. Many systems have been tasked to ingest, process, and analyze different data sets than they were originally designed for, and they have to interact with multiple new systems that were unaccounted for at design time. Leveraging the tenets of security, we have devised a new framework that takes agility into a new realm: the product will be built to work in a service-based environment but is developed using agile processes. Even though these two criteria promise to hone the development effort, they actually contradict each other in philosophy: services require stable interfaces, while agile development focuses on flexibility and tolerates change until much later stages of development. This framework is focused on enabling a successful product development that capitalizes on both philosophies.

  8. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  9. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare. PMID:22925789

  10. Modeling nuclear processes by Simulink

    NASA Astrophysics Data System (ADS)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the effect of delayed neutrons, the reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
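
    As an example of the describing equations mentioned above, the point kinetic equations with a single delayed neutron group can be integrated in a few lines; Python with explicit Euler is used here in place of Simulink, and the parameter values are illustrative, not tied to any particular reactor.

```python
# One-delayed-group point kinetics:
#   dn/dt = (rho - beta)/Lam * n + lam * C
#   dC/dt = beta/Lam * n - lam * C
beta = 0.0065   # delayed neutron fraction (illustrative)
lam = 0.08      # precursor decay constant, 1/s (illustrative)
Lam = 1.0e-3    # neutron generation time, s (illustrative)

def neutron_density(rho, t_end, dt=1.0e-3):
    """Explicit-Euler integration starting from the rho = 0 equilibrium
    (n = 1, C = beta/(Lam*lam)); returns n(t_end)."""
    n, C = 1.0, beta / (Lam * lam)
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lam * n + lam * C) * dt
        dC = (beta / Lam * n - lam * C) * dt
        n, C = n + dn, C + dC
    return n

# At rho = 0 the equilibrium is preserved; a small positive reactivity
# (rho < beta, i.e. below prompt critical) gives slow, delayed-controlled growth.
```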

  11. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the effect of delayed neutrons, the reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  12. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons. PMID:21064164
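
    Of the model classes listed, the Markov ion-channel model is the simplest to sketch: a single closed-open channel whose open probability relaxes exponentially toward alpha/(alpha+beta) with rate alpha+beta. The rate constants below are illustrative, not fitted to any real channel.

```python
import math

# Two-state Markov channel: closed <-> open with voltage-independent rates.
alpha = 0.5  # opening rate, 1/ms (illustrative)
beta = 1.5   # closing rate, 1/ms (illustrative)

def p_open(t, p0=0.0):
    """Analytic solution of dp/dt = alpha*(1 - p) - beta*p: the open
    probability relaxes from p0 toward alpha/(alpha + beta)."""
    p_inf = alpha / (alpha + beta)
    return p_inf + (p0 - p_inf) * math.exp(-(alpha + beta) * t)
```

    A population of N identical, independent channels then contributes a macroscopic conductance proportional to N * p_open(t), which is how single-channel Markov models connect to Hodgkin-Huxley-type descriptions of whole-membrane currents.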

  13. Models of the Reading Process

    PubMed Central

    Rayner, Keith; Reichle, Erik D.

    2010-01-01

    Reading is a complex skill involving the orchestration of a number of components. Researchers often talk about a “model of reading” when talking about only one aspect of the reading process (for example, models of word identification are often referred to as “models of reading”). Here, we review prominent models that are designed to account for (1) word identification, (2) syntactic parsing, (3) discourse representations, and (4) how certain aspects of language processing (e.g., word identification), in conjunction with other constraints (e.g., limited visual acuity, saccadic error, etc.), guide readers’ eyes. Unfortunately, it is the case that these various models addressing specific aspects of the reading process seldom make contact with models dealing with other aspects of reading. Thus, for example, the models of word identification seldom make contact with models of eye movement control, and vice versa. While this may be unfortunate in some ways, it is quite understandable in other ways because reading itself is a very complex process. We discuss prototypical models of aspects of the reading process in the order mentioned above. We do not review all possible models, but rather focus on those we view as being representative and most highly recognized. PMID:21170142

  14. An agile enterprise regulation architecture for health information security management.

    PubMed

    Chen, Ying-Pei; Hsieh, Sung-Huai; Cheng, Po-Hsun; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-09-01

    Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to fulfill as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises, implemented as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users showed that AERA encourages information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital. PMID:20815748

  15. Lean and Agile: An Epistemological Reflection

    ERIC Educational Resources Information Center

    Browaeys, Marie-Joelle; Fisser, Sandra

    2012-01-01

    Purpose: The aim of the paper is to contribute to the discussion of treating the concepts of lean and agile in isolation or combination by presenting an alternative view from complexity thinking on these concepts, considering an epistemological approach to this topic. Design/methodology/approach: The paper adopts an epistemological approach, using…

  16. The Frequency Agile Solar Radiotelescope (FASR)

    NASA Astrophysics Data System (ADS)

    White, S. M.; Gary, D. E.; Bastian, T. S.; Hurford, G. J.; Lanzerotti, L. J.

    2003-04-01

    The Frequency Agile Solar Radiotelescope (FASR) is a radio interferometer designed to make high spatial resolution images of the Sun across a broad range of radio wavelengths simultaneously, allowing the technique of imaging spectroscopy to be exploited on a routine basis. The telescope will cover the frequency range 0.1-30 GHz using several sets of receiving elements that provide full-disk imaging, with of order 100 antennas at the highest frequency range. FASR will be optimized for solar radio phenomena and will be the most powerful and versatile radioheliograph ever built, providing an improvement of orders of magnitude in image quality over existing instruments. FASR recently received the top ranking amongst all small projects considered by the decadal survey of the National Academy of Sciences Committee on Solar and Space Physics. FASR will probe all phenomena in the solar atmosphere from the mid-chromosphere outwards. In particular, FASR will provide direct measurement of coronal magnetic field strengths, will image the nonthermal solar atmosphere and show directly the locations of electrons accelerated by solar flares, will provide images of coronal mass ejections travelling outwards through the solar corona, and will supply extensive data products for forecasting and synoptic studies. A major emphasis in the project is to make FASR data as widely and easily usable as possible, i.e., providing the general user with processed, fully-calibrated, high-quality images that require no particular knowledge of radio astronomy for interpretation. This paper will describe the telescope and its science goals, and summarize its current status.

  17. A Roadmap for using Agile Development in a Traditional System

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas

    2006-01-01

    I. Ensemble Development Group: a) Produces activity planning software for spacecraft; b) Built on the Eclipse Rich Client Platform (open-source development and runtime software); c) Funded by multiple sources including the Mars Technology Program; d) Incorporated the use of Agile Development. II. Next Generation Uplink Planning System: a) Researches the Activity Planning and Sequencing Subsystem (APSS) for the Mars Science Laboratory; b) APSS includes Ensemble, Activity Modeling, Constraint Checking, Command Editing and Sequencing tools plus other uplink generation utilities; c) Funded by the Mars Technology Program; d) Integrates all of the tools for APSS.

  18. Kinetic Modeling of Microbiological Processes

    SciTech Connect

    Liu, Chongxuan; Fang, Yilin

    2012-08-26

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as wastewater treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also outlines future directions for modeling research best suited to petroleum and environmental biotechnologies.
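
    A minimal sketch of the traditional Monod-type kinetics mentioned above; the parameter values (mu_max, Ks, yield Y) are illustrative defaults, not taken from the article:

```python
def monod_step(X, S, dt, mu_max=0.4, Ks=0.5, Y=0.5):
    """One explicit-Euler step of the classic Monod growth model:
    dX/dt = mu_max * S / (Ks + S) * X,   dS/dt = -(1/Y) * dX/dt."""
    dX = mu_max * S / (Ks + S) * X * dt   # biomass produced this step
    return X + dX, max(S - dX / Y, 0.0)   # substrate consumed at yield Y

# Integrate a batch culture for 24 h; growth stalls once substrate runs out.
X, S = 0.05, 10.0                         # g/L biomass, g/L substrate
for _ in range(2400):                     # dt = 0.01 h
    X, S = monod_step(X, S, dt=0.01)
```

    With these numbers the culture exhausts its substrate well within 24 h, so the final biomass approaches the mass-balance limit X0 + Y*S0.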

  19. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  20. Autonomous Guidance of Agile Small-scale Rotorcraft

    NASA Technical Reports Server (NTRS)

    Mettler, Bernard; Feron, Eric

    2004-01-01

    This report describes a guidance system for agile vehicles based on a hybrid closed-loop model of the vehicle dynamics. The hybrid model represents the vehicle dynamics through a combination of linear-time-invariant control modes and pre-programmed, finite-duration maneuvers. This particular hybrid structure can be realized through a control system that combines trim controllers and a maneuvering control logic. The former enable precise trajectory tracking, and the latter enables trajectories at the edge of the vehicle capabilities. The closed-loop model is much simpler than the full vehicle equations of motion, yet it can capture a broad range of dynamic behaviors. It also supports a consistent link between the physical layer and the decision-making layer. The trajectory generation was formulated as an optimization problem using mixed-integer linear programming. The optimization is solved in a receding horizon fashion. Several techniques to improve the computational tractability were investigated. Simulation experiments using NASA Ames' R-50 model show that this approach fully exploits the vehicle's agility.
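
    The receding-horizon idea can be illustrated with a toy stand-in. The actual system selects among trim modes and maneuvers by solving a mixed-integer linear program; the greedy one-step selection, mode set, and velocities below are invented for illustration only:

```python
import numpy as np

# Finite set of closed-loop modes: velocity commands a trim controller can track.
MODES = {"hover": np.array([0.0, 0.0]),
         "fwd":   np.array([1.0, 0.0]),
         "left":  np.array([0.0, 1.0])}

def plan_step(pos, goal, dt=0.5):
    """One receding-horizon iteration: score each mode one step ahead,
    apply the best, then replan from the resulting state."""
    best = min(MODES, key=lambda m: float(np.linalg.norm(pos + dt * MODES[m] - goal)))
    return pos + dt * MODES[best], best

pos, goal = np.zeros(2), np.array([2.0, 0.5])
for _ in range(10):
    pos, mode = plan_step(pos, goal)
```

    Replanning at every step is what makes the scheme "receding horizon": only the first decision of each plan is ever executed.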

  1. Array Databases: Agile Analytics (not just) for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2015-12-01

    Gridded data, such as images, image timeseries, and climate datacubes, today are managed separately from the metadata, and with different, restricted retrieval capabilities. While databases are good at metadata modelled in tables, XML hierarchies, or RDF graphs, they traditionally do not support multi-dimensional arrays. This gap is being closed by Array Databases, pioneered by the scalable rasdaman ("raster data manager") array engine. Its declarative query language, rasql, extends SQL with array operators which are optimized and parallelized on the server side. Installations can easily be mashed up securely, thereby enabling large-scale location-transparent query processing in federations. Domain experts value the integration with their commonly used tools, leading to a quick learning curve. Earth, Space, and Life sciences, but also Social sciences as well as business, have massive amounts of data and complex analysis challenges that are answered by rasdaman. As of today, rasdaman is mature and in operational use on hundreds of Terabytes of timeseries datacubes, with transparent query distribution across more than 1,000 nodes. Additionally, its concepts have shaped international Big Data standards in the field, including the forthcoming array extension to ISO SQL, many of which are in the meantime supported by both open-source and commercial systems. In the geo field, rasdaman is the reference implementation for the Open Geospatial Consortium (OGC) Big Data standard, WCS, now also under adoption by ISO. Further, rasdaman is in the final stage of OSGeo incubation. In this contribution we present array queries a la rasdaman, describe the architecture and novel optimization and parallelization techniques introduced in 2015, and put this in the context of the intercontinental EarthServer initiative, which utilizes rasdaman for enabling agile analytics on Petascale datacubes.
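
    As an illustration of the declarative array-processing style described here, the comment below shows a rasql-flavored query (syntax simplified, not verbatim rasdaman grammar), paired with a NumPy stand-in for the server-side evaluation:

```python
import numpy as np

# Rasql-style query (illustrative only):
#   SELECT avg_cells(c[*:*, *:*, 0:11]) FROM ClimateCube AS c
# i.e. average over the first 12 time slices of a datacube, evaluated server-side.

cube = np.arange(4 * 3 * 24, dtype=float).reshape(4, 3, 24)  # stand-in x,y,t datacube
subset = cube[:, :, 0:12]          # trim the time axis, as the query interval does
result = float(subset.mean())      # avg_cells collapses the array to one scalar
```

    The point of the Array Database approach is that the trimming and aggregation happen inside the engine, close to the data, rather than after shipping the full cube to the client.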

  2. Metrics for Business Process Models

    NASA Astrophysics Data System (ADS)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors into real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument implies that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores the extent to which certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
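
    Two simple metrics of the kind examined in this line of work — model size and the ratio of arcs to nodes (a coefficient of connectivity) — can be computed directly from a process graph. The toy model below is invented for illustration:

```python
# A toy process model as a directed graph: nodes are tasks/gateways, arcs are flow.
arcs = [("start", "A"), ("A", "xor"), ("xor", "B"), ("xor", "C"),
        ("B", "join"), ("C", "join"), ("join", "end")]

nodes = {n for arc in arcs for n in arc}
size = len(nodes)                      # size metric: number of nodes
cnc = len(arcs) / len(nodes)           # coefficient of connectivity: arcs per node
# Larger, more densely connected models are hypothesized to be more error-prone.
```
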

  3. Architecture and performances of the AGILE Telemetry Preprocessing System (TMPPS)

    NASA Astrophysics Data System (ADS)

    Trifoglio, M.; Bulgarelli, A.; Gianotti, F.; Lazzarotto, F.; Di Cocco, G.; Fuschino, F.; Tavani, M.

    2008-07-01

    AGILE is an Italian Space Agency (ASI) satellite dedicated to high-energy astrophysics. It was launched successfully on 23 April 2007, and it has been operated by the AGILE Ground Segment, consisting of the Ground Station located in Malindi (Kenya), the Mission Operations Centre (MOC) and the AGILE Data Centre (ADC), established in Italy at Telespazio in Fucino and at the ASI Science Data Centre (ASDC) in Frascati respectively. Due to the low equatorial orbit at ~530 km with an inclination angle of ~2.5°, the satellite passes over the Ground Station every ~100'. During the visibility period of ~12', the Telemetry (TM) is downlinked through two separate virtual channels, VC0 and VC1. The former is devoted to the real-time TM generated during the pass at the average rate of 50 Kbit/s and is directly relayed to the Control Centre. The latter is used to downlink TM data collected in the satellite's on-board mass memory during the non-visibility period. This generates at the Ground Station a raw TM file of up to 37 MByte. Within 20' after the end of the contact, both the real-time and mass-memory TM arrive at the ADC through the dedicated VPN ASINet. Here they are automatically detected and ingested by the TMPPS pipeline in less than 5 minutes. The TMPPS archives each TM file and sorts its packets into one stream for each of the different TM layouts. Each stream is processed in parallel in order to unpack the various telemetry fields and archive them into suitable FITS files. Each operation is tracked in a MySQL database which interfaces the TMPPS pipeline to the rest of the scientific pipeline running at the ADC. In this paper the architecture and the performance of the TMPPS will be described and discussed.
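
    The sorting step — one stream per telemetry layout — can be sketched as follows. The packet format here is a made-up placeholder (layout id in the first byte); real space telemetry carries its identifiers in CCSDS packet headers:

```python
from collections import defaultdict

def sort_packets(packets):
    """Group raw telemetry packets into one stream per layout id.

    For illustration, each packet is assumed to carry its layout identifier
    in the first byte; a real implementation would parse the packet header.
    """
    streams = defaultdict(list)
    for pkt in packets:
        streams[pkt[0]].append(pkt)    # route the packet to its layout's stream
    return dict(streams)

raw = [bytes([1, 0xAA]), bytes([2, 0xBB]), bytes([1, 0xCC])]
streams = sort_packets(raw)
```

    Once demultiplexed, each stream can be unpacked and archived in parallel, as the abstract describes.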

  4. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  5. Lesson Learned from AGILE and LARES ASI Projects About MATED Data Collection and Post Analysis

    NASA Astrophysics Data System (ADS)

    Carpentiero, Rita; Marchetti, Ernesto; Natalucci, Silvia; Portelli, Claudio

    2012-07-01

    ASI has managed and collected data on the project development of two all-Italian scientific missions: AGILE and LARES. Collection of the Model And Test Effectiveness Database (MATED) data, concerning Project, AIV (Assembly, Integration and Verification) and NCR (Non-Conformance Report) aspects, has been performed by the Italian Space Agency (ASI) using the available technical documentation of both the AGILE and LARES projects. In this paper some considerations on the need for 'real time' data collection are made, together with a proposal for front-end improvements of this tool. In addition, a preliminary analysis of MATED effectiveness related to the above ASI projects will be presented in a bottom-up and post-verification approach.

  6. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  7. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  8. The Test Equipment of the AGILE Minicalorimeter Prototype

    NASA Astrophysics Data System (ADS)

    Trifoglio, M.; Bulgarelli, A.; Gianotti, F.; Celesti, E.; Di Cocco, G.; Labanti, C.; Mauri, A.; Prest, M.; Vallazza, E.; Froysland, T.

    2004-09-01

    AGILE is an ASI (Italian Space Agency) Small Space Mission for high-energy astrophysics in the range 30 MeV - 50 GeV. The AGILE satellite is currently in the C phase and is planned to be launched in 2005. The Payload shall consist of a Tungsten-Silicon Tracker, a CsI Minicalorimeter, an anticoincidence system and an X-ray detector sensitive in the 10-40 keV range. The purpose of the Minicalorimeter (MCAL) is twofold. It shall work in conjunction with the Tracker in order to evaluate the energy of the interacting photons, and it shall operate autonomously in the energy range 250 keV - 250 MeV for the detection of transients and gamma-ray burst events and for the measurement of gamma-ray background fluctuations. We present the architecture of the Test Equipment we have designed and developed in order to test and verify the MCAL Simplified Electrical Model prototype, which has been manufactured in order to validate the design of the MCAL Proto Flight Model.

  9. An Agile Constructionist Mentoring Methodology for Software Projects in the High School

    ERIC Educational Resources Information Center

    Meerbaum-Salant, Orni; Hazzan, Orit

    2010-01-01

    This article describes the construction process and evaluation of the Agile Constructionist Mentoring Methodology (ACMM), a mentoring method for guiding software development projects in the high school. The need for such a methodology has arisen due to the complexity of mentoring software project development in the high school. We introduce the…

  10. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  11. Modeling Production Plant Forming Processes

    SciTech Connect

    Rhee, M; Becker, R; Couch, R; Li, M

    2004-09-22

    Engineering has simulation tools and experience in modeling forming processes. Y-12 personnel have expressed interest in validating our tools and experience against their manufacturing process activities such as rolling, casting, and forging etc. We have demonstrated numerical capabilities in a collaborative DOE/OIT project with ALCOA that is nearing successful completion. The goal was to use ALE3D to model Alcoa's slab rolling process in order to demonstrate a computational tool that would allow Alcoa to define a rolling schedule that would minimize the probability of ingot fracture, thus reducing waste and energy consumption. It is intended to lead to long-term collaboration with Y-12 and perhaps involvement with other components of the weapons production complex. Using simulations to aid in design of forming processes can: decrease time to production; reduce forming trials and associated expenses; and guide development of products with greater uniformity and less scrap.

  12. Architecture-Centric Methods and Agile Approaches

    NASA Astrophysics Data System (ADS)

    Babar, Muhammad Ali; Abrahamsson, Pekka

    Agile software development approaches have had significant impact on industrial software development practices. Despite becoming widely popular, there is an increasing perplexity about the role and importance of a system’s software architecture in agile approaches [1, 2]. Advocates of the vital role of architecture in achieving the quality goals of large-scale software-intensive systems are skeptical of the scalability of any development approach that does not pay sufficient attention to architectural issues. However, the proponents of agile approaches usually perceive the upfront design and evaluation of architecture as being of less value to the customers of a system. According to them, for example, re-factoring can help fix most of the problems. Many experiences show that large-scale re-factoring often results in significant defects, which are very costly to address later in the development cycle. It is considered that re-factoring is worthwhile as long as the high-level design is good enough to limit the need for large-scale re-factoring [1, 3, 4].

  13. First GRB detections with the AGILE Minicalorimeter

    SciTech Connect

    Marisaldi, M.; Labanti, C.; Fuschino, F.; Bulgarelli, A.; Gianotti, F.; Trifoglio, M.; Galli, M.; Tavani, M.; Argan, A.

    2008-05-22

    The Minicalorimeter (MCAL) onboard the AGILE satellite is a 1400 cm² scintillation detector sensitive in the energy range 0.3-200 MeV. MCAL works both as a slave of the AGILE Silicon Tracker and as an autonomous detector for transient events (BURST mode). A dedicated onboard Burst Search logic scans BURST mode data in search of count-rate increases. Peculiar characteristics of the detector are the broad spectral energy coverage and a timing resolution of about 2 microseconds. Even if a trigger is not issued, BURST mode data are used to build a broad-band energy spectrum (scientific ratemeters) organized in 11 bands for each of the two MCAL detection planes, with a time resolution of 1 second. After the first engineering commissioning phase following the AGILE launch on 23rd April 2007, eighteen GRBs were detected offline in the scientific ratemeters data between 22nd June and 5th November 2007, with a detection rate of about one per week. In this paper the capabilities of the detector will be described and an overview of the first detected GRBs will be given.
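
    A much-simplified stand-in for a count-rate burst search on 1 s ratemeter data: flag bins that exceed a running background estimate by several Poisson sigmas. The window, threshold, and counts below are invented; the real onboard logic differs:

```python
import math

def burst_triggers(counts, window=20, nsigma=5.0):
    """Flag 1 s bins whose counts exceed the running background by nsigma,
    using sqrt(background) as the Poisson fluctuation scale."""
    triggers = []
    for i in range(window, len(counts)):
        bkg = sum(counts[i - window:i]) / window          # running background
        if counts[i] > bkg + nsigma * math.sqrt(bkg):     # significant excess?
            triggers.append(i)
    return triggers

quiet = [100] * 40                          # steady background, ~100 counts/s
grb = quiet[:30] + [400, 350] + quiet[32:]  # a two-second burst at t = 30 s
```
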

  14. Prospects for High Energy Detection of Microquasars with the AGILE and GLAST Gamma-Ray Telescopes

    SciTech Connect

    Santolamazza, Patrizia; Pittori, Carlotta; Verrecchia, Francesco

    2007-08-21

    We estimate the sensitivities of the AGILE and GLAST γ-ray experiments taking into account two cases for the galactic γ-ray diffuse background (at high galactic latitude and toward the galactic center). Then we use these sensitivities to estimate microquasar observability with the two experiments, assuming the γ-ray emission above 100 MeV of a recent microquasar model.

  15. Design studies for a spectrally agile staring sensor /SASS/ system

    NASA Astrophysics Data System (ADS)

    Kollodge, M. A.; Cox, J. A.; Marshall, W. C.; Solstad, R. G.; Steadman, S. S.

    1981-01-01

    The operation of the Spectrally Agile Staring Sensor (SASS) involves the employment of a telescope system which uses variable spectral band information to detect and identify moving IR sources against the background radiance of the earth. A description is presented of SASS simulation studies. A signal-to-noise ratio (SNR) expression used as a measure of system performance is considered. Attention is given to the target trajectory generator, a target signature model, a background and atmospheric model, a Dual Tunable Fabry-Perot (DTFP) optical filter model, problems of out-of-band leakage transmission, a Focal Plane Array (FPA)/spot convolution model, SNR improvement with high filter transmission efficiency, system performance vs DTFP optical filter parameters, and system performance vs atmospheric conditions.

  16. A review of the Technologies Enabling Agile Manufacturing program

    SciTech Connect

    Gray, W.H.; Neal, R.E.; Cobb, C.K.

    1996-10-01

    Addressing a technical plan developed in coordination with major US manufacturers, software and hardware providers, and government representatives, the Technologies Enabling Agile Manufacturing (TEAM) program is leveraging the expertise and resources of industry, universities, and federal agencies to develop, integrate, and deploy leap-ahead manufacturing technologies. One of the TEAM program's goals is to transition products from design to production faster, more efficiently, and at less cost. TEAM's technology development strategy also provides all participants with early experience in establishing and working within an electronic enterprise that includes access to high-speed networks and high-performance computing and storage systems. The TEAM program uses the cross-cutting tools it collects, develops, and integrates to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, sheet metal forming, and electro-mechanical assembly. This paper reviews the current status of the TEAM program with emphasis upon TEAM's information infrastructure.

  17. SAMPLE (Sandia agile MEMS prototyping, layout tools, and education)

    NASA Astrophysics Data System (ADS)

    Davies, Brady R.; Craig Barron, Carole; Sniegowski, Jeffry J.; Rodgers, M. Steven

    1997-09-01

    The SAMPLE (Sandia agile MEMS prototyping, layout tools, and education) service makes Sandia's state-of-the-art surface micromachining fabrication process, known as SUMMiT, available to U.S. industry for the first time. The service provides a short course and customized computer-aided design (CAD) tools to assist customers in designing micromachine prototypes to be fabricated in SUMMiT. Frequent small-scale manufacturing runs then provide SAMPLE designers with hundreds of sophisticated MEMS (microelectromechanical systems) chips. SUMMiT (Sandia ultra-planar, multi-level MEMS technology) offers unique surface-micromachining capabilities, including four levels of polycrystalline silicon (including the ground layer), flanged hubs, substrate contacts, one-micron design rules, and chemical-mechanical polishing (CMP) planarization. This paper describes the SUMMiT process, design tools, and other information relevant to the SAMPLE service and SUMMiT process.

  18. SAMPLE (Sandia Agile MEMS Prototyping, Layout tools, and Education)

    SciTech Connect

    Davies, B.R.; Barron, C.C.; Sniegowski, J.J.; Rodgers, M.S.

    1997-08-01

    The SAMPLE (Sandia Agile MEMS Prototyping, Layout tools, and Education) service makes Sandia's state-of-the-art surface-micromachining fabrication process, known as SUMMiT, available to US industry for the first time. The service provides a short course and customized computer-aided design (CAD) tools to assist customers in designing micromachine prototypes to be fabricated in SUMMiT. Frequent small-scale manufacturing runs then provide SAMPLE designers with hundreds of sophisticated MEMS (MicroElectroMechanical Systems) chips. SUMMiT (Sandia Ultra-planar, Multi-level MEMS Technology) offers unique surface-micromachining capabilities, including four levels of polycrystalline silicon (including the ground layer), flanged hubs, substrate contacts, one-micron design rules, and chemical-mechanical polishing (CMP) planarization. This paper describes the SUMMiT process, design tools, and other information relevant to the SAMPLE service and SUMMiT process.

  19. Future Research in Agile Systems Development: Applying Open Innovation Principles Within the Agile Organisation

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Morgan, Lorraine

    A particular strength of agile approaches is that they move away from ‘introverted' development and intimately involve the customer in all areas of development, supposedly leading to the development of a more innovative and hence more valuable information system. However, we argue that a single customer representative is too narrow a focus to adopt and that involvement of stakeholders beyond the software development itself is still often quite weak and in some cases non-existent. In response, we argue that current thinking regarding innovation in agile development needs to be extended to include multiple stakeholders outside the business unit. This paper explores the intra-organisational applicability and implications of open innovation in agile systems development. Additionally, it argues for a different perspective of project management that includes collaboration and knowledge-sharing with other business units, customers, partners, and other relevant stakeholders pertinent to the business success of an organisation, thus embracing open innovation principles.

  20. Candidate control design metrics for an agile fighter

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Bailey, Melvin L.; Ostroff, Aaron J.

    1991-01-01

    Success in the fighter combat environment of the future will certainly demand increasing capability from aircraft technology. These advanced capabilities in the form of superagility and supermaneuverability will require special design techniques which translate advanced air combat maneuvering requirements into design criteria. Control design metrics can provide some of these techniques for the control designer. This study presents an overview of control design metrics and investigates metrics for advanced fighter agility. The objectives of various metric users, such as airframe designers and pilots, are differentiated from the objectives of the control designer. Using an advanced fighter model, metric values are documented over a portion of the flight envelope through piloted simulation. These metric values provide a baseline against which future control system improvements can be compared and against which a control design methodology can be developed. Agility is measured for the axial, pitch, and roll axes. Axial metrics highlight acceleration and deceleration capabilities under different flight loads and include specific excess power measurements to characterize energy maneuverability. Pitch metrics cover both body-axis and wind-axis pitch rates and accelerations. Included in the pitch metrics are nose-pointing metrics which highlight displacement capability between the nose and the velocity vector. Roll metrics (or torsion metrics) focus on rotational capability about the wind axis.
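
    The specific excess power measurement mentioned above has the standard energy-maneuverability form Ps = V(T − D)/W. The numbers below are illustrative, not from the study:

```python
def specific_excess_power(v, thrust, drag, weight):
    """Specific excess power Ps = V * (T - D) / W, the classic
    energy-maneuverability measure of climb/acceleration capability (m/s)."""
    return v * (thrust - drag) / weight

# Made-up fighter numbers: 200 m/s, 110 kN thrust, 60 kN drag, 150 kN weight.
ps = specific_excess_power(200.0, 110e3, 60e3, 150e3)   # energy-height rate, m/s
```

    Positive Ps means the aircraft can climb or accelerate from that flight condition; Ps = 0 traces the boundary of sustained performance.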

  1. Gamma-ray blazars: The view from AGILE

    NASA Astrophysics Data System (ADS)

    D'Ammando, F.; Bulgarelli, A.; Chen, A. W.; Donnarumma, I.; Giuliani, A.; Longo, F.; Pacciani, L.; Pucella, G.; Striani, E.; Tavani, M.; Vercellone, S.; Vittorini, V.; Covino, S.; Krimm, H. A.; Raiteri, C. M.; Romano, P.; Villata, M.

    2011-07-01

    During the first 3 years of operation the Gamma-Ray Imaging Detector onboard the AGILE satellite detected several blazars in a high γ-ray activity: 3C 279, 3C 454.3, PKS 1510-089, S5 0716+714, 3C 273, W Comae, Mrk 421, PKS 0537-441 and 4C +21.35. Thanks to the rapid dissemination of our alerts, we were able to obtain multiwavelength data from other observatories such as Spitzer, Swift, RXTE, Suzaku, INTEGRAL, MAGIC, VERITAS, and ARGO as well as radio-to-optical coverage by means of the GASP Project of the WEBT and the REM Telescope. This large multifrequency coverage gave us the opportunity to study the variability correlations between the emission at different frequencies and to obtain simultaneous Spectral Energy Distributions of these sources from radio to γ-ray energy bands, investigating the different mechanisms responsible for their emission and uncovering in some cases a more complex behavior with respect to the standard models. We present a review of the most interesting AGILE results on these γ-ray blazars and their multifrequency data.

  2. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects of the processing parameters pressure, temperature, and time on the quality of continuous graphite-fiber-reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation had occurred at successive ply interfaces. Two models are presented predicting the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression-molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism explaining the phenomenon by which the plies bond to themselves. Theoretical predictions from the Reptation Theory relating autohesive strength and contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.
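
    Reptation theory predicts that autohesive bond strength grows with the fourth root of contact time until full healing at the reptation time, which can be sketched as (the normalization and the example values are illustrative):

```python
def autohesion_strength(t, t_reptation):
    """Fractional autohesive bond strength from reptation theory:
    strength grows as (t / t_r) ** 0.25, saturating at 1.0 when t >= t_r."""
    return min((t / t_reptation) ** 0.25, 1.0)

# Holding an interface at temperature for 1/16 of the reptation time recovers
# half of full bond strength, since (1/16) ** 0.25 == 0.5.
half = autohesion_strength(1.0, 16.0)
```

    The t**1/4 scaling is why modest increases in hold time yield diminishing strength gains, consistent with the processing-parameter effects the abstract discusses.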

  3. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    This report describes the research and analysis performed, software developed, and hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control. The Metals Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, the system is not intended to be used for welding process control.

  4. Applying Agile Methods to Weapon/Weapon-Related Software

    SciTech Connect

    Adams, D; Armendariz, M; Blackledge, M; Campbell, F; Cloninger, M; Cox, L; Davis, J; Elliott, M; Granger, K; Hans, S; Kuhn, C; Lackner, M; Loo, P; Matthews, S; Morrell, K; Owens, C; Peercy, D; Pope, G; Quirk, R; Schilling, D; Stewart, A; Tran, A; Ward, R; Williamson, M

    2007-05-02

    This white paper provides information and guidance to the Department of Energy (DOE) sites on Agile software development methods and the impact of their application on weapon/weapon-related software development. The purpose of this white paper is to provide an overview of Agile methods, examine the accepted interpretations/uses/practices of these methodologies, and discuss the applicability of Agile methods with respect to Nuclear Weapons Complex (NWC) Technical Business Practices (TBPs). It also provides recommendations on the application of Agile methods to the development of weapon/weapon-related software.

  5. Sharing environmental models: An Approach using GitHub repositories and Web Processing Services

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Nuest, Daniel; Pross, Benjamin

    2016-04-01

    accordingly. The admin tool of the 52°North WPS was extended to support automated retrieval and deployment of computational models from GitHub repositories. Once the R code is available in the GitHub repo, the contained process can be deployed and executed simply by entering the GitHub repository URL in the WPS admin tool. We illustrate the approach by sharing and running a model for land use system archetypes developed by the Helmholtz Centre for Environmental Research (UFZ, see Vaclavik et al.). The original R code was extended and published in the 52°North WPS using both public and non-public datasets (Nüst et al., see also https://github.com/52North/glues-wps). Hosting the analysis in a Git repository now allows WPS administrators, client developers, and modelers to easily work together on new versions or completely new web processes using the powerful GitHub collaboration platform. References: Hinz, M. et al. (2013): Spatial Statistics on the Geospatial Web. In: The 16th AGILE International Conference on Geographic Information Science, Short Papers. http://www.agile-online.org/Conference_Paper/CDs/agile_2013/Short_Papers/SP_S3.1_Hinz.pdf Nüst, D. et al. (2015): Open and reproducible global land use classification. In: EGU General Assembly Conference Abstracts, Vol. 17, European Geosciences Union, 2015, p. 9125, http://meetingorganizer.copernicus.org/EGU2015/EGU2015-9125.pdf Vaclavik, T., et al. (2013): Mapping global land system archetypes. Global Environmental Change 23(6): 1637-1647. Online available: October 9, 2013, DOI: 10.1016/j.gloenvcha.2013.09.004
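
    The retrieval-and-deploy flow might be sketched as below. The raw-URL helper follows GitHub's public raw-content URL scheme; the admin endpoint, payload shape, and R file path are hypothetical placeholders, not the actual 52°North WPS admin API:

```python
import urllib.request

def raw_github_url(owner, repo, path, branch="master"):
    """Build the raw-content URL for a file in a GitHub repository."""
    return f"https://raw.githubusercontent.com/{owner}/{repo}/{branch}/{path}"

def deploy_to_wps(wps_admin_endpoint, r_script_url):
    """Fetch the R process script and hand it to the WPS admin interface.
    The endpoint and payload shape here are hypothetical; the real
    52North WPS admin tool differs. This only sketches the flow."""
    with urllib.request.urlopen(r_script_url) as resp:
        r_code = resp.read()
    req = urllib.request.Request(wps_admin_endpoint, data=r_code,
                                 headers={"Content-Type": "text/plain"})
    return urllib.request.urlopen(req)

# hypothetical file path within the repository
url = raw_github_url("52North", "glues-wps", "R/land_use_archetypes.R")
```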

  6. Wideband Agile Digital Microwave Radiometer

    NASA Technical Reports Server (NTRS)

    Gaier, Todd C.; Brown, Shannon T.; Ruf, Christopher; Gross, Steven

    2012-01-01

    The objectives of this work were to take the initial steps needed to develop a field programmable gate array (FPGA)-based wideband digital radiometer backend (>500 MHz bandwidth) that will enable passive microwave observations with minimal performance degradation in a radiofrequency-interference (RFI)-rich environment. As manmade RF emissions increase over time and fill more of the microwave spectrum, microwave radiometer science applications will be increasingly degraded, and the current generation of spaceborne microwave radiometers that use broadband analog back ends will become severely compromised or unusable over an increasing fraction of time on orbit. There is a need to develop a digital radiometer back end that, for each observation period, uses digital signal processing (DSP) algorithms to identify the maximum amount of RFI-free spectrum across the radiometer band, preserving bandwidth to minimize radiometer noise (which is inversely related to the bandwidth). Ultimately, the objective is to incorporate all processing necessary in the back end to take contaminated input spectra and produce a single output value free of manmade signals, minimizing data rates for spaceborne radiometer missions. To meet these objectives, several intermediate processing algorithms had to be developed and their performance characterized relative to typical brightness temperature accuracy requirements for current and future microwave radiometer missions, including those for measuring salinity, soil moisture, and snow pack.
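
    One simple stand-in for the RFI-excision idea is to flag channels whose power sits several robust sigmas above the band median and keep the rest; the ideal radiometer equation then shows why the surviving bandwidth matters. Thresholds and numbers are illustrative, not the mission algorithms:

```python
import statistics

def rfi_free_channels(spectrum, n_sigma=4.0):
    """Keep channels whose power is within n_sigma (robust, MAD-based)
    of the band median; a simple stand-in for the DSP algorithms the
    abstract refers to."""
    med = statistics.median(spectrum)
    mad = statistics.median(abs(p - med) for p in spectrum)
    sigma = 1.4826 * mad  # MAD -> Gaussian-equivalent sigma
    return [p for p in spectrum if p <= med + n_sigma * sigma]

def radiometric_noise(t_sys, bandwidth_hz, tau_s):
    """Ideal radiometer equation: noise falls as 1/sqrt(B * tau),
    which is why preserving RFI-free bandwidth matters."""
    return t_sys / (bandwidth_hz * tau_s) ** 0.5

# two RFI-hit channels (9.0, 7.5) are discarded; seven clean ones remain
clean = rfi_free_channels([1.0, 1.1, 0.9, 1.0, 9.0, 1.05, 0.95, 1.0, 7.5])
```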

  7. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature, including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals, in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species and to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation.

  8. Model for amorphous aggregation processes

    NASA Astrophysics Data System (ADS)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.
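
    The paper's model itself is not reproduced here, but a toy second-order aggregation kinetics (dm/dt = -km², with aggregated mass as a turbidity proxy) shows the kind of turbidimetry curve such models are fitted to. Rate constant and units are illustrative:

```python
def turbidity_curve(k, m0, t_end, dt=0.01):
    """Toy irreversible-aggregation kinetics: monomers deplete by
    second-order collisions (dm/dt = -k * m**2) and the aggregated
    mass is taken as a turbidity proxy. A deliberate simplification
    of the paper's model, for illustration only."""
    m, t, curve = m0, 0.0, []
    while t < t_end:
        m += -k * m * m * dt       # explicit Euler step
        t += dt
        curve.append((t, m0 - m))  # turbidity ~ aggregated mass
    return curve

curve = turbidity_curve(k=1.0, m0=1.0, t_end=5.0)
final_turbidity = curve[-1][1]
```

Fitting k (and shape-dependent optical factors) to measured turbidity traces is the analogue of the rate-constant extraction described in the abstract.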

  9. The agile transversal filter - A flexible building block for ICNIA

    NASA Astrophysics Data System (ADS)

    Botha, D. G.; Smead, F. W.

    Integrated Communications, Navigation and Identification Avionics (ICNIA) is an advanced development program to demonstrate an integrated systems approach to the implementation of functions normally performed by a collection of independent black boxes. The system design partitions all CNI functions to optimize modular commonality within the ICNIA system. One function required in many parallel channels is the processing of signals with instantaneous bandwidths of 10 MHz or less. A specific implementation is the Narrow Band Agile Transversal Filter (NBATF), which can be implemented in state-of-the-art technology, can process signals with a variety of algorithms selectable under software control, and can be replicated within the system, as required, to perform the total set of functions. The NBATF constitutes a building block module within the ICNIA system.
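
    A transversal filter is a direct-form FIR filter whose behavior is set entirely by its tap weights, which is what makes a software-programmable version like the NBATF flexible: loading new taps selects a new algorithm. A toy sketch (tap values illustrative, not from the ICNIA design):

```python
def transversal_filter(samples, taps):
    """Direct-form FIR (transversal) filter: each output is the dot
    product of the tap weights with the most recent input samples.
    Software-selectable `taps` stand in for the NBATF's programmable
    processing algorithms."""
    out = []
    history = [0.0] * len(taps)
    for x in samples:
        history = [x] + history[:-1]   # shift register of recent inputs
        out.append(sum(t * h for t, h in zip(taps, history)))
    return out

# a 4-tap moving average smooths a unit step
y = transversal_filter([0, 0, 1, 1, 1, 1], [0.25] * 4)
```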

  10. Information Network Model Query Processing

    NASA Astrophysics Data System (ADS)

    Song, Xiaopu

    Information Networking Model (INM) [31] is a novel database model for the management of real-world objects and relationships. It naturally and directly supports various kinds of static and dynamic relationships between objects. In INM, objects are networked through various natural and complex relationships. INM Query Language (INM-QL) [30] is designed to explore such information networks, retrieve information about schema, instances, their attributes, relationships, and context-dependent information, and process query results in the user-specified form. The INM database management system has been implemented using Berkeley DB, and it supports INM-QL. This thesis is mainly focused on the implementation of the subsystem that effectively and efficiently processes INM-QL. The subsystem provides a lexical and syntactic analyzer of INM-QL, and it is able to choose appropriate evaluation strategies and index mechanisms to process queries in INM-QL without the user's intervention. It also uses an intermediate result structure to hold intermediate query results and other helper structures to reduce the complexity of query processing.
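
    The relationship traversal at the heart of such query evaluation can be caricatured in a few lines. The classes and query function below are a toy stand-in, not INM-QL syntax or the thesis's implementation:

```python
class Obj:
    """Minimal object with named attributes and named relationships."""
    def __init__(self, name, **attrs):
        self.name, self.attrs, self.rels = name, attrs, {}

    def relate(self, rel, other):
        self.rels.setdefault(rel, []).append(other)

def query(objs, rel, target_name):
    """Find objects holding relationship `rel` to an object named
    target_name -- a toy stand-in for relationship traversal."""
    return [o.name for o in objs
            if any(t.name == target_name for t in o.rels.get(rel, []))]

cs = Obj("CS-Dept")
alice, bob = Obj("Alice"), Obj("Bob")
alice.relate("works_in", cs)
result = query([alice, bob, cs], "works_in", "CS-Dept")
```

A real evaluator would consult indexes over relationships rather than scanning all objects, which is the strategy-selection problem the thesis addresses.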

  11. Between Oais and Agile a Dynamic Data Management Approach

    NASA Astrophysics Data System (ADS)

    Bennett, V. L.; Conway, E. A.; Waterfall, A. M.; Pepler, S.

    2015-12-01

    In this paper we describe an approach to the integration of existing archival activities which lies between compliance with the more rigid OAIS/TRAC standards and a more flexible "Agile" approach to the curation and preservation of Earth Observation data. We provide a high-level overview of existing practice and discuss how these procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. While processes are considered, they are not statically defined but rather driven by human interactions, in the form of risk management/review procedures that produce actionable plans which are responsive to change. We then describe the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected British Atmospheric Data Centre and NERC Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 petabytes of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the format-based risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We continue by presenting a dynamic data management information model and discuss the following core model entities and their relationships: aspirational entities, which include Data Entity definitions and their associated Preservation Objectives; risk entities, which act as drivers for change within the data lifecycle and include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks; and plan entities, which detail the actions to bring about change within an archive and include Acquisition Plans, Preservation Plans and Monitoring Plans which support
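
    A format audit of the kind described can be sketched as a walk over the holdings that tallies files by extension and weights each format by a risk score. The weights and the extension-only criterion below are illustrative assumptions; the real audit used richer characterisation:

```python
from collections import Counter
from pathlib import Path

# Illustrative risk weights per format (1 = low risk, 4 = high/unknown).
FORMAT_RISK = {".grib": 1, ".nc": 1, ".hdf": 2, ".dat": 3, "": 4}

def audit(paths):
    """Tally holdings by file extension and weight them by format risk,
    mimicking the format-based characterisation described above."""
    counts = Counter(Path(p).suffix.lower() for p in paths)
    burden = sum(FORMAT_RISK.get(ext, 4) * n for ext, n in counts.items())
    return counts, burden

counts, burden = audit(["a.nc", "b.nc", "c.dat", "legacy_file"])
```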

  12. Agile informatics: application of agile project management to the development of a personal health application.

    PubMed

    Chung, Jeanhee; Pankey, Evan; Norris, Ryan J

    2007-01-01

    We describe the application of the Agile method-- a short iteration cycle, user responsive, measurable software development approach-- to the project management of a modular personal health record, iHealthSpace, to be deployed to the patients and providers of a large academic primary care practice. PMID:18694014

  13. Chaste: using agile programming techniques to develop computational biology software.

    PubMed

    Pitt-Francis, Joe; Bernabeu, Miguel O; Cooper, Jonathan; Garny, Alan; Momtahan, Lee; Osborne, James; Pathmanathan, Pras; Rodriguez, Blanca; Whiteley, Jonathan P; Gavaghan, David J

    2008-09-13

    Cardiac modelling is the area of physiome modelling where the available simulation software is perhaps most mature, and it therefore provides an excellent starting point for considering the software requirements for the wider physiome community. In this paper, we begin by introducing some of the most advanced existing software packages for simulating cardiac electrical activity. We consider the software development methods used in producing codes of this type, and discuss their use of numerical algorithms, relative computational efficiency, usability, robustness and extensibility. We then go on to describe a class of software development methodologies known as test-driven agile methods and argue that such methods are more suitable for scientific software development than the traditional academic approaches. As a case study we present a project of our own, Chaste (Cancer, Heart and Soft Tissue Environment), a library of computational biology software that began as an experiment in the use of agile programming methods. We present our experiences with a review of our progress thus far, focusing on the advantages and disadvantages of this new approach compared with the development methods used in some existing packages. We conclude by considering whether the likely wider needs of the cardiac modelling community are currently being met and suggest that, in order to respond effectively to changing requirements, it is essential that these codes should be more malleable. Such codes will allow for reliable extensions to include both detailed mathematical models--of the heart and other organs--and more efficient numerical techniques that are currently being developed by many research groups worldwide. PMID:18565813

  14. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice, the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  15. Modeling of Plasma Spray Processes

    NASA Astrophysics Data System (ADS)

    Chang, Chong H.

    1996-10-01

    A comprehensive computational model for thermal plasma processes is being developed with sufficient generality and flexibility to apply to a wide variety of present and proposed plasma processing concepts and devices. In our model for gas-particle flows, the gas is represented as a continuous multicomponent chemically reacting gas with temperature-dependent thermodynamic and transport properties. Ions and electrons are considered as separate components or species of the mixture, while ionization and dissociation reactions are treated as chemical reactions. Entrained particles interacting with the plasma are represented by a stochastic particle model in which the velocities, temperatures, sizes, and other characteristics of typical particles are computed simultaneously with the plasma flow. The model in its present form can simulate particle injection, heating, and melting, but not evaporation and condensation. This model is embodied in the LAVA computer code, which has previously been applied to simulate plasma spraying, mixing and demixing of plasma gases, and departures from chemical (ionization/dissociation), thermal, and excitation equilibrium in plasmas. A transient simulation has been performed of stainless steel particles injected into a swirling high-velocity nitrogen-hydrogen plasma jet in air under typical operating conditions for a newly developed high-velocity high-power (HVHP) torch, which produces plasma jets with peak velocities in excess of 3000 m/s. The calculational results show that strong departures from ionization and dissociation equilibrium develop in the downstream region as the chemical reactions freeze out at lower temperatures. The calculational results also show good agreement with experimental data on particle temperature, velocity, and spray pattern, together with important statistical effects associated with distributions in particle properties and injection conditions. This work was performed under the auspices of the U. S
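
    A stochastic particle model of the kind embodied in LAVA tracks individual injected particles relaxing toward the local gas velocity and temperature, with randomised injection states. The following is a much-reduced single-particle sketch; the relaxation rates and injection ranges are illustrative, not LAVA's physics:

```python
import random

def track_particle(v_gas, t_gas, steps=1000, dt=1e-6, seed=0):
    """Follow one injected particle: Stokes-like drag toward the gas
    velocity and Newtonian heating toward the gas temperature, with a
    randomised injection state as in a stochastic particle model.
    All coefficients are illustrative."""
    rng = random.Random(seed)
    v = rng.uniform(5.0, 15.0)        # injection velocity, m/s
    temp = rng.uniform(300.0, 400.0)  # injection temperature, K
    drag, heat = 2.0e3, 1.0e3         # relaxation rates, 1/s
    for _ in range(steps):
        v += drag * (v_gas - v) * dt
        temp += heat * (t_gas - temp) * dt
    return v, temp

v, temp = track_particle(v_gas=3000.0, t_gas=10000.0)
```

Averaging many such trajectories with different random injection states yields the statistical spray-pattern and temperature distributions the abstract compares against experiment.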

  16. AGILE detection of intense gamma-ray emission from the blazar PKS 1510-089

    NASA Astrophysics Data System (ADS)

    Pucella, G.; Vittorini, V.; D'Ammando, F.; Tavani, M.; Raiteri, C. M.; Villata, M.; Argan, A.; Barbiellini, G.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P. W.; Chen, A. W.; Cocco, V.; Costa, E.; Del Monte, E.; de Paris, G.; Di Cocco, G.; Donnarumma, I.; Evangelista, Y.; Feroci, M.; Fiorini, M.; Froysland, T.; Fuschino, F.; Galli, M.; Gianotti, F.; Giuliani, A.; Labanti, C.; Lapshov, I.; Lazzarotto, F.; Lipari, P.; Longo, F.; Marisaldi, M.; Mereghetti, S.; Morselli, A.; Pacciani, L.; Pellizzoni, A.; Perotti, F.; Picozza, P.; Prest, M.; Rapisarda, M.; Rappoldi, A.; Soffitta, P.; Trifoglio, M.; Trois, A.; Vallazza, E.; Vercellone, S.; Zambra, A.; Zanello, D.; Antonelli, L. A.; Colafrancesco, S.; Cutini, S.; Gasparrini, D.; Giommi, P.; Pittori, C.; Verrecchia, F.; Salotti, L.; Aller, M. F.; Aller, H. D.; Carosati, D.; Larionov, V. M.; Ligustri, R.

    2008-11-01

    Context: We report the detection by the AGILE (Astro-rivelatore Gamma a Immagini LEggero) satellite of an intense gamma-ray flare from the source AGL J1511-0909, associated with the powerful quasar PKS 1510-089, during ten days of observations from 23 August to 1 September 2007. Aims: During the observation period, the source was in optical decrease following a flaring event monitored by the GLAST-AGILE Support Program (GASP) of the Whole Earth Blazar Telescope (WEBT). The simultaneous gamma-ray, optical, and radio coverage allows us to study the spectral energy distribution and the theoretical models based on the synchrotron and inverse Compton (IC) emission mechanisms. Methods: AGILE observed the source with its two co-aligned imagers, the Gamma-Ray Imaging Detector and the hard X-ray imager Super-AGILE, sensitive in the 30 MeV-50 GeV and 18-60 keV bands, respectively. Results: Between 23 and 27 August 2007, AGILE detected gamma-ray emission from PKS 1510-089 when this source was located 50° off-axis, with an average flux of (270 ± 65) × 10⁻⁸ photons cm⁻² s⁻¹ for photon energies above 100 MeV. In the following period, 28 August-1 September, after a satellite re-pointing, AGILE detected the source at 35° off-axis, with an average flux (E > 100 MeV) of (195 ± 30) × 10⁻⁸ photons cm⁻² s⁻¹. No emission was detected by Super-AGILE, with a 3-σ upper limit of 45 mCrab in 200 ks. Conclusions: The spectral energy distribution is modelled with a homogeneous one-zone synchrotron self-Compton (SSC) emission plus contributions from external photons: the SSC emission contributes primarily to the X-ray band, whereas the contribution of the IC from the external disc and the broad-line region matches the hard gamma-ray spectrum observed.
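
    As a side illustration (not a step performed in the paper), the two flux states above could be combined by standard inverse-variance weighting, giving roughly (208 ± 27) × 10⁻⁸ photons cm⁻² s⁻¹ if treated as independent measurements of one mean level:

```python
def inverse_variance_mean(measurements):
    """Combine independent (value, sigma) measurements by
    inverse-variance weighting: a textbook combination, shown
    here purely for illustration."""
    weights = [1.0 / s**2 for _, s in measurements]
    mean = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return mean, sigma

# the two AGILE flux states, in units of 1e-8 photons cm^-2 s^-1
mean, sigma = inverse_variance_mean([(270.0, 65.0), (195.0, 30.0)])
```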

  17. An Adaptive, Agile, Reconfigurable Photonic System for Handling Analog Signals

    NASA Astrophysics Data System (ADS)

    Middleton, C.; DeSalvo, R.; Escalera, N.

    2014-09-01

    Photonic techniques can be applied to microwave and millimeter wave transmission and signal processing challenges, including signal transport, distribution, filtering, and up- and down-conversion. We present measured performance results for a wideband photonic-assisted frequency converter with 4 GHz instantaneous bandwidth and full spectral coverage up to 45 GHz. The photonic-assisted converter is applicable for both ground and space applications. We show the system performance in a ground station application, in which high frequency analog signals were transported over a moderate distance and down-converted directly into a digitizing receiver. We also describe our progress in the packaging and space qualification of the photonic system, and discuss the next steps toward higher TRL. The photonic system provides an adaptive, agile, reconfigurable backbone for handling analog signals, with performance superior to existing microwave systems.

  18. Agile Bodies: A New Imperative in Neoliberal Governance

    ERIC Educational Resources Information Center

    Gillies, Donald

    2011-01-01

    Modern business discourse suggests that a key bulwark against market fluctuation and the threat of failure is for organizations to become "agile", a more dynamic and proactive position than that previously afforded by mere "flexibility". The same idea is also directed at the personal level, it being argued that the "agile" individual is better…

  19. Integrated product definition representation for agile numerical control applications

    SciTech Connect

    Simons, W.R. Jr.; Brooks, S.L.; Kirk, W.J. III; Brown, C.W.

    1994-11-01

    Realization of agile manufacturing capabilities for a virtual enterprise requires the integration of technology, management, and work force into a coordinated, interdependent system. This paper is focused on technology-enabling tools for agile manufacturing within a virtual enterprise, specifically relating to Numerical Control (N/C) manufacturing activities and the product definition requirements for these activities.

  20. Agile manufacturing in Intelligence, Surveillance and Reconnaissance (ISR)

    NASA Astrophysics Data System (ADS)

    DiPadua, Mark; Dalton, George

    2016-05-01

    The objective of the Agile Manufacturing for Intelligence, Surveillance, and Reconnaissance (AMISR) effort is to research, develop, design and build a prototype multi-intelligence (multi-INT), reconfigurable pod demonstrating benefits of agile manufacturing and a modular open systems approach (MOSA) to make podded intelligence, surveillance, and reconnaissance (ISR) capability more affordable and operationally flexible.

  1. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development, software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automated tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although that methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and procedures for developing software. This paper discusses some of the initial uses of the Agile Development methodology, the spread of this method, and the current status of its incorporation into current JPL development policies.

  2. Modeling pellet impact drilling process

    NASA Astrophysics Data System (ADS)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which can be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling of the pellet impact drilling process, which creates the scientific and methodological basis for the engineering design of drilling operations under different geo-technical conditions.
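
    The destructive potential of a pellet scales with its kinetic energy, E = mv²/2, with the mass set by pellet size and material. A small sketch with illustrative pellet dimensions and velocity (not values from the paper):

```python
import math

def pellet_kinetic_energy(diameter_m, velocity_ms, density_kgm3=7800.0):
    """Kinetic energy of a single steel pellet accelerated by the fluid
    jet: E = m * v**2 / 2, with m from a sphere of the given density.
    The default density is that of steel; dimensions are illustrative."""
    volume = math.pi * diameter_m**3 / 6.0
    mass = density_kgm3 * volume
    return 0.5 * mass * velocity_ms**2

# a 3 mm steel pellet at 50 m/s
energy_j = pellet_kinetic_energy(3e-3, 50.0)
```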

  3. Software Product Line Engineering Approach for Enhancing Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Martinez, Jabier; Diaz, Jessica; Perez, Jennifer; Garbajosa, Juan

    One of the main principles of Agile methodologies consists in the early and continuous delivery of valuable software by short, time-framed iterations. After each iteration, a working product is delivered according to the requirements defined at the beginning of the iteration. Testing tools facilitate the task of checking whether the system provides the expected behavior according to the specified requirements. However, since testing tools need to be adapted in order to test new working products in each iteration, a significant effort has to be invested. This work presents a Software Product Line Engineering (SPLE) approach that allows flexibility in the adaptation of testing tools to the working products in an iterative way. A case study is also presented using PLUM (Product Line Unified Modeller) as the tool suite for SPL implementation and management.

  4. Agile Machining and Inspection Non-Nuclear Report (NNR) Project

    SciTech Connect

    Lazarus, Lloyd

    2009-02-19

    This report is a high-level summary of the eight major projects funded by the Agile Machining and Inspection Non-Nuclear Readiness (NNR) project (FY06.0422.3.04.R1). The largest project of the group is the Rapid Response project, in which the six major sub-categories are summarized. This project focused on the operations of the machining departments that will comprise Special Applications Machining (SAM) in the Kansas City Responsive Infrastructure Manufacturing & Sourcing (KCRIMS) project. The project was aimed at upgrading older machine tools, developing new inspection tools, eliminating Classified Removable Electronic Media (CREM) in the handling of classified Numerical Control (NC) programs by installing the CRONOS network, and developing methods to automatically load Coordinate-Measuring Machine (CMM) inspection data into bomb books and product score cards. Finally, the project personnel leaned the operations of some of the machine tool cells, and now have the model to continue this activity.

  5. A Case Study of Coordination in Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Hole, Steinar; Moe, Nils Brede

    Global Software Development (GSD) has gained significant popularity as an emerging paradigm. Companies also show interest in applying agile approaches in distributed development to combine the advantages of both approaches. However, in their most radical forms, agile and GSD can be placed at opposite ends of a plan-based/agile spectrum because of how work is coordinated. We describe how three GSD projects applying agile methods coordinate their work. We found that trust is needed to reduce the need for standardization and direct supervision when coordinating work in a GSD project, and that electronic chatting supports mutual adjustment. Further, co-location and modularization mitigate communication problems, enable agility in at least part of a GSD project, and render the implementation of Scrum of Scrums possible.

  6. Frequency-agile wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Arms, Steven W.; Townsend, Christopher P.; Churchill, David L.; Hamel, Michael J.; Galbreath, Jacob H.; Mundell, Steven W.

    2004-07-01

    Our goal was to demonstrate a wireless communications system capable of simultaneous, high-speed data communications from a variety of sensors. We have previously reported on the design and application of 2 kHz data-logging transceiver nodes; however, only one node may stream data at a time, since all nodes on the network use the same communications frequency. To overcome these limitations, second-generation data-logging transceivers were developed with software-programmable radio frequency (RF) communications. Each node contains on-board memory (2 Mbytes), sensor excitation, instrumentation amplifiers with programmable gains and offsets, a multiplexer, a 16-bit A/D converter, a microcontroller, and a frequency-agile, bi-directional, frequency-shift-keyed (FSK) RF serial data link. These systems are capable of continuous data transmission from 26 distinct nodes (902-928 MHz band, 75 kbaud). The system was demonstrated in a compelling structural monitoring application. The National Park Service requested a means for continual monitoring and recording of sensor data from the Liberty Bell during a move to a new location (Philadelphia, October 2003). Three distinct, frequency-agile, wireless sensing nodes were used to detect visible crack shear/opening micromotions, triaxial accelerations, and hairline crack tip strains. The wireless sensors proved to be useful in protecting the Liberty Bell.
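
    Concurrent streaming from 26 nodes in the 902-928 MHz band implies each node owning its own carrier frequency. A sketch of one possible channel plan; the even spacing is an assumption, not the paper's actual frequency assignment:

```python
def channel_plan(n_nodes, band_mhz=(902.0, 928.0)):
    """Assign each node its own carrier inside the ISM band so all
    nodes can stream concurrently. Evenly spaced carriers with guard
    margins at the band edges are an illustrative choice."""
    lo, hi = band_mhz
    spacing = (hi - lo) / (n_nodes + 1)
    return [lo + spacing * (i + 1) for i in range(n_nodes)]

plan = channel_plan(26)  # one carrier per node, ~0.96 MHz apart
```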

  7. Analysis of VLF signals associated with AGILE Terrestrial Gamma-ray Flashes detected over Central America

    NASA Astrophysics Data System (ADS)

    Marisaldi, Martino; Lyu, Fanchao; Cummer, Steven; Ursi, Alessandro

    2016-04-01

    Analysis of radio signals detected on the ground and associated with Terrestrial Gamma-ray Flashes (TGFs) has proven to be a successful tool for extracting information on the TGF itself and the possible associated lightning process. Triangulation of Very Low Frequency (VLF) signals by means of the Time Of Arrival technique provides TGF locations with few-km accuracy. The AGILE satellite routinely observes TGFs in a narrow band across the Equator, limited by the small satellite orbital inclination (2.5°). However, until recently it was not possible to provide firm associations between AGILE TGFs and radio signals, because of two main limiting factors. First, dead-time effects led to a bias towards long-duration events in the AGILE TGF sample, which are less likely to be associated with strong radio pulses. In addition, most VLF detection networks are less sensitive along the equatorial region. Since the end of March 2015, a major change in the AGILE MiniCalorimeter instrument configuration has resulted in a tenfold increase in TGF detection rate, and in the detection of events as short as 20 microseconds. 14% of the events in the new sample proved simultaneous (within 200 microseconds) with sferics detected by the World Wide Lightning Location Network (WWLLN); therefore a source localisation is available for these events. We present here the first analysis of VLF waveforms associated with AGILE TGFs observed above Central America, detected by magnetic field sensors deployed in Puerto Rico. Among the seven TGFs with a WWLLN location at a distance of less than 10000 km from the sensors, four have detectable signals. These events are the closest to the sensors, at distances of less than 7500 km. We present the properties of these TGFs and the characteristics of the associated radio waveforms.

  8. Collapse models and perceptual processes

    NASA Astrophysics Data System (ADS)

    Carlo Ghirardi, Gian; Romano, Raffaele

    2014-04-01

    Theories including a collapse mechanism were presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merit derives from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems and for the reduction associated with measurement processes and the classical behavior of macroscopic objects. Since such theories qualify not as new interpretations but as modifications of the standard theory, they can, in principle, be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow at least precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we briefly mention some of the recent investigations in this direction, while concentrating our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices are discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make plausible, by discussing a toy model in detail, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  9. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergoes several physical processes, including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.

  10. A variability study of the AGILE first catalog of γ-ray sources on 2.3 years of AGILE pointed observations

    NASA Astrophysics Data System (ADS)

    Verrecchia, F.; Pittori, C.; Bulgarelli, A.; Chen, A. W.; Tavani, M.; Giommi, P.; AGILE Collaboration

    2013-01-01

    AGILE pointed observations performed from July 9, 2007 to October 30, 2009 cover a very large time interval, with a γ-ray data archive useful for monitoring studies of medium- to high-brightness γ-ray sources in the 30 MeV-50 GeV energy range. The first AGILE Gamma-Ray Imaging Detector (GRID) catalog (Pittori et al., 2009) included a significance-limited (4σ) sample of 47 sources (1AGL), detected with a conservative analysis over the first year of operations. We present a variability study of the 1AGL sources over the complete AGILE pointed Observation Blocks (OBs) dataset. In the analysis reported here we used data obtained with an improved full Field of View (FOV) event filter, on a much larger (about 27.5 months) dataset, integrating data on the OB timescales, mostly ranging between 4 and 30 days. The data processing resulted in an improved source list compared to the 1AGL one. We present here our results on the variability of some of these sources.

  11. Agile: From Software to Mission System

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Shirley, Mark H.; Hobart, Sarah Groves

    2016-01-01

    The Resource Prospector (RP) is an in-situ resource utilization (ISRU) technology demonstration mission designed to search for volatiles at the Lunar South Pole. This is NASA's first near real-time tele-operated rover on the Moon. The combination of short mission duration, a solar-powered rover, and the requirement to explore shadowed regions makes for an operationally challenging mission. To maximize efficiency and flexibility in Mission System design, and thus to improve the performance and reliability of the resulting Mission System, we are tailoring Agile principles that we have used effectively in ground data system software development and applying those principles to the design of elements of the mission operations system.

  12. Compact, flexible, frequency agile parametric wavelength converter

    DOEpatents

    Velsko, Stephan P.; Yang, Steven T.

    2002-01-01

    This improved Frequency Agile Optical Parametric Oscillator (FA-OPO) provides near on-axis pumping of a single quasi-phase-matched crystal (QPMC) with a tilted periodically poled grating, eliminating the need to find a particular crystal that permits collinear birefringence in order to obtain a desired tuning range. The tilted grating design and the elongation of the transverse profile of the pump beam in the angle-tuning plane of the FA-OPO reduce the rate of change of the overlap between the pumped volume in the crystal and the resonated and non-resonated wave mode volumes as the pump beam angle is changed. A folded mirror set relays the pivot point for beam steering from a beam deflector to the center of the FA-OPO crystal. This reduces the footprint of the device by as much as a factor of two compared to a refractive telescope design.

  13. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  14. Cupola Furnace Computer Process Model

    SciTech Connect

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings made annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing (electric) furnaces and the ability to melt less expensive metallic scrap. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in an optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society, and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency; repeating the process iteratively leads to near-optimum operating conditions within just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an ''Expert System'' to permit optimization in real time, and it has been combined with ''neural network'' programs to enable very easy scanning of a wide range of furnace operations. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems can be found in the ''Cupola Handbook'', Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  15. Agile rediscovering values: Similarities to continuous improvement strategies

    NASA Astrophysics Data System (ADS)

    Díaz de Mera, P.; Arenas, J. M.; González, C.

    2012-04-01

    Research in the late 1980s on technological companies that develop high-innovation-value products, with sufficient speed and flexibility to adapt quickly to changing market conditions, gave rise to the new set of methodologies known as the Agile Management Approach. In the current changing economic scenario, we considered it very interesting to study the similarities between these Agile Methodologies and other practices whose effectiveness has been amply demonstrated in both the West and Japan. Strategies such as Kaizen, Lean, World Class Manufacturing, and Concurrent Engineering are analyzed to check the values they have in common with the Agile Approach.

  16. Process modeling and industrial energy use

    SciTech Connect

    Howe, S O; Pilati, D A; Sparrow, F T

    1980-11-01

    How the process models developed at BNL are used to analyze industrial energy use is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Case study results from the pulp and paper model illustrate how process models can be used to analyze a variety of issues. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for energy end-use modeling and conservation analysis. Information on the current status of industry models at BNL is tabulated.

  17. Analog modelling of obduction processes

    NASA Astrophysics Data System (ADS)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics' oddities, whereby dense oceanic rocks (ophiolites) are presumably 'thrust' on top of light continental ones, as in the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (e.g., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank, and (2) high-viscosity silicone plates (Rhodorsil Gomme with PDMS iron fillers to reproduce the densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and overriding plates and to avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model at velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate, with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston, and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated.
Displacements, together with along-strike and across-strike internal deformation in all

  18. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  19. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the basis throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. In current practice, however, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributable to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. The model is defined based on CMMI and empirical data on improvement practices. We evaluate the model using industrial data. PMID:24977170
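
    The correlation analysis the paper describes is built on CMMI-specific empirical data, but the underlying idea of flagging correlated process elements can be illustrated with a plain Pearson correlation over hypothetical assessment scores (the practice names, scores, and the 0.7 threshold below are illustrative assumptions, not the paper's model):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical assessment scores for two CMMI practices across five projects
req_mgmt  = [2, 3, 3, 4, 5]
proj_plan = [1, 3, 2, 4, 4]
r = pearson(req_mgmt, proj_plan)
flagged = r > 0.7  # strongly correlated elements get considered jointly
```

    A high coefficient suggests the two practices tend to improve or degrade together, so an improvement plan could address them jointly rather than in isolation.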

  20. GRB 070724B: the first Gamma Ray Burst localized by SuperAGILE

    SciTech Connect

    Del Monte, E.; Costa, E.; Donnarumma, I.; Feroci, M.; Lapshov, I.; Lazzarotto, F.; Soffitta, P.; Argan, A.; Pucella, G.; Trois, A.; Vittorini, V.; Evangelista, Y.; Rapisarda, M.; Barbiellini, G.; Longo, F.; Basset, M.; Foggetta, L.; Vallazza, E.; Bulgarelli, A.; Di Cocco, G.

    2008-05-22

    GRB 070724B is the first Gamma Ray Burst localized by the SuperAGILE instrument aboard the AGILE space mission. The SuperAGILE localization was confirmed by the afterglow observation made by the XRT aboard the Swift satellite. No significant gamma-ray emission above 50 MeV was detected for this GRB. In this paper we describe the SuperAGILE capabilities in detecting Gamma Ray Bursts and the AGILE observation of GRB 070724B.

  1. The influence of ankle dorsiflexion on jumping capacity and the modified agility T-test performance.

    PubMed

    Salinero, Juan J; Abian-Vicen, Javier; Del Coso, Juan; González-Millán, Cristina

    2014-01-01

    Dorsiflexion sport shoes aim to increase jumping capacity and speed by means of a lower position of the heel relative to the forefoot, favouring additional stretching of the ankle plantar flexors. Previous studies have found contradictory results on the benefits of this type of shoe. With the aim of comparing a dorsiflexion sport shoe model (DF) with a conventional sport shoe (CS), 41 participants performed a countermovement jump (CMJ) test and an agility test (MAT) with both models of shoe. There were no significant differences in the jump test [CS=35.3 cm (6.4) and DF=35.6 cm (6.4), P>0.05]. In the agility test, the conventional shoe obtained better results than the dorsiflexion model with regard to time taken to complete the circuit [CS=6236 ms (540) and DF=6377 ms (507), P<0.05]. In spite of producing pre-stretching of the plantar flexor muscles, the DF sport shoes were not effective for improving either jump power or agility in a specific test. PMID:24533520
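
    The shoe comparison is a repeated-measures design (each participant tested in both CS and DF), for which a paired-samples t statistic is the natural test. A minimal sketch, assuming hypothetical raw agility times (the study reports only means and SDs, so the six values below are invented for illustration):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(a, b):
    """Paired-samples t statistic: mean of the differences over its standard error."""
    diffs = [x - y for x, y in zip(a, b)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical agility-circuit times (ms) for six participants in each shoe
cs_times = [6100, 6250, 6180, 6320, 6290, 6210]  # conventional shoe
df_times = [6240, 6380, 6300, 6450, 6400, 6350]  # dorsiflexion shoe
t = paired_t(cs_times, df_times)  # negative t: CS times are shorter (faster)
```

    A t statistic beyond the critical value for n-1 degrees of freedom would indicate a significant difference between the two shoe conditions, as the study reports for the agility test.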

  2. Modern Enterprise Systems as Enablers of Agile Development

    NASA Astrophysics Data System (ADS)

    Fredriksson, Odd; Ljung, Lennart

    Traditional ES technology and traditional project management methods support and match each other, but they do not effectively support the critical success conditions for ES development. Although the findings from one case study of a successful modern ES change project are not strong empirical evidence, we carefully propose that new modern ES technology supports and matches agile project management methods. In other words, it provides the flexibility required to put the agile way of running projects into practice, both for the system supplier and for the customer. In addition, we propose that the combination of modern ES technology and agile project management methods is more appropriate for supporting the realization of critical success conditions for ES development. The main purpose of this chapter is to compare critical success conditions for modern enterprise systems development projects with those for agile information systems development projects.

  3. Frequency agile OPO-based transmitters for multiwavelength DIAL

    SciTech Connect

    Velsko, S.P.; Ruggiero, A.; Herman, M.

    1996-09-01

    We describe a first-generation mid-infrared transmitter with pulse-to-pulse frequency agility and both wide- and narrowband capability. This transmitter was used to make multicomponent Differential Absorption LIDAR (DIAL) measurements in the field.

  4. Investigation into the impact of agility on conceptual fighter design

    NASA Technical Reports Server (NTRS)

    Engelbeck, R. M.

    1995-01-01

    The Agility Design Study was performed by the Boeing Defense and Space Group for the NASA Langley Research Center. The objective of the study was to assess the impact of agility requirements on new fighter configurations. Global trade issues investigated were the level of agility, the mission role of the aircraft (air-to-ground, multi-role, or air-to-air), and whether the customer is the Air Force, the Navy, or joint service. Mission profiles and design objectives were supplied by NASA. An extensive technology assessment was conducted to establish the technologies available to industry for the aircraft. Conceptual-level methodology is presented to assess the five NASA-supplied agility metrics. Twelve configurations were developed to address the global trade issues. Three-view drawings, inboard profiles, and performance estimates were made and are included in the report. A critical assessment and lessons learned from the study are also presented.

  5. Agile text mining for the 2014 i2b2/UTHealth Cardiac risk factors challenge.

    PubMed

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2015-12-01

    This paper describes the use of an agile text mining platform (Linguamatics' Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system. PMID:26209007
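
    The 91.7% figure reported above is an F-score, the harmonic mean of precision and recall. A minimal sketch (the precision/recall pair below is just one of many combinations that would produce roughly this F-score; the paper's actual values are not given in the record):

```python
def f_score(precision, recall, beta=1.0):
    """F-measure: weighted harmonic mean of precision and recall."""
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# One hypothetical precision/recall pair giving roughly the reported 91.7%
f = f_score(0.930, 0.904)
```

    Because the harmonic mean penalizes imbalance, a system must keep both precision and recall high to reach an F-score above 0.9.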

  6. Sandia Agile MEMS Prototyping, Layout Tools, Education and Services Program

    SciTech Connect

    Schriner, H.; Davies, B.; Sniegowski, J.; Rodgers, M.S.; Allen, J.; Shepard, C.

    1998-05-01

    Research and development in the design and manufacture of Microelectromechanical Systems (MEMS) is growing at an enormous rate. Advances in MEMS design tools and fabrication processes at Sandia National Laboratories' Microelectronics Development Laboratory (MDL) have broadened the scope of MEMS applications that can be designed and manufactured for both military and commercial use. As improvements in micromachining fabrication technologies continue to be made, MEMS designs can become more complex, thus opening the door to an even broader set of MEMS applications. In an effort to further research and development in MEMS design, fabrication, and application, Sandia National Laboratories has launched the Sandia Agile MEMS Prototyping, Layout Tools, Education and Services Program, or SAMPLES program. The SAMPLES program offers potential partners interested in MEMS the opportunity to prototype an idea and produce hardware that can be used to sell a concept. The SAMPLES program provides education and training on Sandia's design tools, analysis tools, and fabrication process. New designers can participate in the SAMPLES program and design MEMS devices using Sandia's design and analysis tools. As part of the SAMPLES program, participants' designs are fabricated using Sandia's 4-level polycrystalline silicon surface micromachine technology fabrication process known as SUMMiT (Sandia Ultra-planar, Multi-level MEMS Technology). Furthermore, SAMPLES participants can also opt to obtain state-of-the-art post-fabrication services provided at Sandia, such as release, packaging, reliability characterization, and failure analysis. This paper discusses the components of the SAMPLES program.

  7. Insights into Global Health Practice from the Agile Software Development Movement

    PubMed Central

    Flood, David; Chary, Anita; Austad, Kirsten; Diaz, Anne Kraemer; García, Pablo; Martinez, Boris; Canú, Waleska López; Rohloff, Peter

    2016-01-01

    Global health practitioners may feel frustration that current models of global health research, delivery, and implementation are overly focused on specific interventions, slow to provide health services in the field, and relatively ill-equipped to adapt to local contexts. Adapting design principles from the agile software development movement, we propose an analogous approach to designing global health programs that emphasizes tight integration between research and implementation, early involvement of ground-level health workers and program beneficiaries, and rapid cycles of iterative program improvement. Using examples from our own fieldwork, we illustrate the potential of ‘agile global health’ and reflect on the limitations, trade-offs, and implications of this approach. PMID:27134081

  8. The impact of flying qualities on helicopter operational agility

    NASA Technical Reports Server (NTRS)

    Padfield, Gareth D.; Lappos, Nick; Hodgkinson, John

    1993-01-01

    Flying qualities standards are formally set to ensure safe flight and therefore reflect minimum, rather than optimum, requirements. Agility is a flying quality but relates to operations at high, if not maximum, performance. While the quality metrics and test procedures for flying qualities, as covered for example in ADS-33C, may provide an adequate structure to encompass agility, they do not currently address flight at high performance. This is also true in the fixed-wing world, and a current concern in both communities is the absence of substantiated agility criteria and possible conflicts between flying qualities and high performance. AGARD is sponsoring a working group (WG19), titled 'Operational Agility', that deals with these and a range of related issues. This paper is condensed from contributions by the three authors to WG19 relating to flying qualities. Novel perspectives on the subject are presented, including the agility factor, which quantifies performance margins in flying qualities terms; a new parameter, based on maneuver acceleration, is introduced as a potential candidate for defining upper limits to flying qualities. Finally, a probabilistic analysis of pilot handling qualities ratings is presented that suggests a powerful relationship between inherent airframe flying qualities and operational agility.

  9. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  10. A Review of Agile and Lean Manufacturing as Issues in Selected International and National Research and Development Programs and Roadmaps

    ERIC Educational Resources Information Center

    Castro, Helio; Putnik, Goran D.; Shah, Vaibhav

    2012-01-01

    Purpose: The aim of this paper is to analyze international and national research and development (R&D) programs and roadmaps for the manufacturing sector, presenting how agile and lean manufacturing models are addressed in these programs. Design/methodology/approach: In this review, several manufacturing research and development programs and…

  11. Mathematical Modeling: A Structured Process

    ERIC Educational Resources Information Center

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  12. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    A first-order Markov model was developed on a digital computer for a population with specific characteristics. The system is user interactive and self-documenting, and does not require the user to have a complete understanding of the underlying model details. It contains thorough error-checking algorithms on input, as well as default capabilities.
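
    A first-order birth/death model of this kind can be captured by a tridiagonal transition matrix over population sizes. A minimal sketch, assuming constant per-step birth and death probabilities (the record gives no parameter details, so the values below are illustrative):

```python
def birth_death_matrix(n_states, birth, death):
    """Transition matrix for a first-order birth/death Markov chain over
    population sizes 0..n_states-1, with constant per-step probabilities."""
    P = [[0.0] * n_states for _ in range(n_states)]
    for i in range(n_states):
        up = birth if i < n_states - 1 else 0.0    # birth impossible at the cap
        down = death if i > 0 else 0.0             # death impossible at size 0
        P[i][i] = 1.0 - up - down
        if i < n_states - 1:
            P[i][i + 1] = up
        if i > 0:
            P[i][i - 1] = down
    return P

def step(dist, P):
    """Advance a distribution over population sizes by one time step."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

P = birth_death_matrix(5, birth=0.2, death=0.1)
dist = step([1.0, 0.0, 0.0, 0.0, 0.0], P)  # start certain at population size 0
```

    Reflecting boundaries at the smallest and largest sizes keep every row a proper probability distribution, so repeated calls to step() trace the population-size distribution over time.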

  13. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modelling techniques. This article presents research on the differences among business process modelling techniques. For each technique, we explain the definition and the structure. This paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented for Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use and serves as the basis for evaluating further modelling techniques.

  14. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters. PMID:26353243
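
    The gamma-mixed Poisson construction at the heart of the NB process can be sketched directly: drawing a rate from a gamma distribution and then a Poisson count with that rate yields a negative binomial draw. A minimal, illustrative sketch (the parameters r = 2, p = 0.5 are arbitrary, and the Poisson sampler is Knuth's simple algorithm, adequate for the small rates drawn here):

```python
import math
import random

random.seed(7)

def poisson(lam, rng=random):
    """Knuth's Poisson sampler; fine for small rates."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= L:
            return k - 1

def negative_binomial(r, prob, rng=random):
    """NB(r, prob) via its gamma-Poisson mixture: lam ~ Gamma(shape=r, scale=prob/(1-prob))."""
    lam = rng.gammavariate(r, prob / (1.0 - prob))
    return poisson(lam, rng)

draws = [negative_binomial(2.0, 0.5) for _ in range(5000)]
sample_mean = sum(draws) / len(draws)  # theory: r * p / (1 - p) = 2.0
```

    Marginalizing the gamma rate out of the Poisson is exactly what produces the NB's extra dispersion relative to a plain Poisson, which is why the abstract stresses inferring both the NB dispersion and probability parameters.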

  16. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  17. Frequency-agile, rapid scanning spectroscopy

    NASA Astrophysics Data System (ADS)

    Truong, G.-W.; Douglass, K. O.; Maxwell, S. E.; van Zee, R. D.; Plusquellic, D. F.; Hodges, J. T.; Long, D. A.

    2013-07-01

    Challenging applications in trace gas measurements require low uncertainty and high acquisition rates. Many cavity-enhanced spectroscopies exhibit significant sensitivity and potential, but their scanning rates are limited by reliance on either mechanical or thermal frequency tuning. Here, we present frequency-agile, rapid scanning spectroscopy (FARS) in which a high-bandwidth electro-optic modulator steps a selected laser sideband to successive optical cavity modes. This approach involves no mechanical motion and allows for a scanning rate of 8 kHz per cavity mode, a rate that is limited only by the cavity response time itself. Unlike rapidly frequency-swept techniques, FARS does not reduce the measurement duty cycle, degrade the spectrum's frequency axis or require an unusual cavity configuration. FARS allows for a sensitivity of ~2 × 10^-12 cm^-1 Hz^-1/2 and a tuning range exceeding 70 GHz. This technique shows promise for fast and sensitive trace gas measurements and studies of chemical kinetics.

  18. APID: Agile Protein Interaction DataAnalyzer.

    PubMed

    Prieto, Carlos; De Las Rivas, Javier

    2006-07-01

    Agile Protein Interaction DataAnalyzer (APID) is an interactive bioinformatics web tool developed to integrate and analyze, in a unified and comparative platform, the main currently known information about protein-protein interactions demonstrated by specific small-scale or large-scale experimental methods. At present, the application includes information from five main source databases, providing a unified server to explore >35 000 different proteins and 111 000 different proven interactions. The web includes search tools to query and browse the data, allowing selection of interaction pairs based on calculated parameters that weight and qualify the reliability of each given protein interaction. Such parameters are, for the 'proteins': connectivity, cluster coefficient, Gene Ontology (GO) functional environment, and GO environment enrichment; and for the 'interactions': number of methods, GO overlapping, and iPfam domain-domain interaction. APID also includes a graphic interactive tool to visualize selected sub-networks and to navigate on them or along the whole interaction network. The application is available open access at http://bioinfow.dep.usal.es/apid/. PMID:16845013
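
    Two of the per-protein reliability parameters mentioned, connectivity and cluster coefficient, are standard graph measures. A small sketch of the clustering-coefficient calculation on a toy interaction network (protein names hypothetical; this is not APID's actual implementation):

```python
from itertools import combinations

def clustering_coefficient(adj, node):
    """Fraction of pairs of a node's neighbours that are themselves connected."""
    nbrs = adj[node]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(sorted(nbrs), 2) if b in adj[a])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

# Toy undirected interaction network (hypothetical protein names).
adj = {
    "P1": {"P2", "P3", "P4"},
    "P2": {"P1", "P3"},
    "P3": {"P1", "P2"},
    "P4": {"P1"},
}
cc = clustering_coefficient(adj, "P1")
print(cc)  # P2-P3 is the only connected pair out of 3 -> 1/3
```

    Here P1's connectivity (degree) is 3, and only one of the three possible neighbour pairs interacts, giving a cluster coefficient of 1/3.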

  19. Frequency-agile dual-comb spectroscopy

    NASA Astrophysics Data System (ADS)

    Millot, Guy; Pitois, Stéphane; Yan, Ming; Hovhannisyan, Tatevik; Bendahmane, Abdelkrim; Hänsch, Theodor W.; Picqué, Nathalie

    2016-01-01

    Spectroscopic gas sensing and its applications to, for example, trace detection or chemical kinetics, require ever more demanding measurement times, acquisition rates, sensitivities, precisions and broad tuning ranges. Here, we propose a new approach to near-infrared molecular spectroscopy, utilizing advanced concepts of optical telecommunications and supercontinuum photonics. We generate, without mode-locked lasers, two frequency combs of slightly different repetition frequencies and moderate, but rapidly tunable, spectral span. The output of a frequency-agile continuous-wave laser is split and sent into two electro-optic intensity modulators. Flat-top low-noise frequency combs are produced by wave-breaking in a nonlinear optical fibre of normal dispersion. With a dual-comb spectrometer, we record Doppler-limited spectra spanning 60 GHz within 13 μs, at an 80 kHz refresh rate and a tuning speed of 10 nm s^-1. The sensitivity for weak absorption is enhanced by a long gas-filled hollow-core fibre. New opportunities for real-time diagnostics may be opened up, even outside the laboratory.

  20. The Dilemma of High Level Planning in Distributed Agile Software Projects: An Action Research Study in a Danish Bank

    NASA Astrophysics Data System (ADS)

    Svejvig, Per; Fladkjær Nielsen, Ann-Dorte

    The chapter reports on an action research study whose aim was to design a high-level planning process for distributed and co-located software projects based on agile methods. The main contributions are the insight that the high-level planning process is tightly integrated with other project disciplines, that specific steps have to be taken to apply the process in distributed projects, and that the action research approach is well suited to software process improvement.

  1. A Hierarchical Process-Dissociation Model

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Lu, Jun; Morey, Richard D.; Sun, Dongchu; Speckman, Paul L.

    2008-01-01

    In fitting the process-dissociation model (L. L. Jacoby, 1991) to observed data, researchers aggregate outcomes across participants, items, or both. T. Curran and D. L. Hintzman (1995) demonstrated how biases from aggregation may lead to artifactual support for the model. The authors develop a hierarchical process-dissociation model that does not…
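
    Jacoby's process-dissociation equations, and the aggregation artifact that motivates the hierarchical model, can be illustrated directly. A sketch with hypothetical hit rates (not data from the paper):

```python
def process_dissociation(p_inclusion, p_exclusion):
    # Jacoby's (1991) equations:
    #   P(inclusion) = R + (1 - R) * A   (recollection or familiarity)
    #   P(exclusion) = (1 - R) * A       (familiarity without recollection)
    # Solving: R = I - E and A = E / (1 - R).
    R = p_inclusion - p_exclusion
    A = p_exclusion / (1.0 - R)
    return R, A

R, A = process_dissociation(0.8, 0.3)       # -> R = 0.5, A = 0.6

# Aggregation bias (Curran & Hintzman, 1995): solving from averaged
# hit rates does not recover the average of the individual A estimates.
_, a1 = process_dissociation(0.9, 0.1)      # participant 1: A = 0.5
_, a2 = process_dissociation(0.5, 0.4)      # participant 2: A ~ 0.444
_, a_agg = process_dissociation(0.7, 0.25)  # averaged data:  A ~ 0.455
print(R, A, (a1 + a2) / 2, a_agg)
```

    R survives averaging because it is linear in the two rates, but the familiarity estimate A does not, which is exactly the kind of aggregation artifact the hierarchical model is built to avoid.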

  2. Embedding Agile Practices within a Plan-Driven Hierarchical Project Life Cycle

    SciTech Connect

    Millard, W. David; Johnson, Daniel M.; Henderson, John M.; Lombardo, Nicholas J.; Bass, Robert B.; Smith, Jason E.

    2014-07-28

    Organizations use structured, plan-driven approaches to provide continuity, direction, and control to large, multi-year programs. Projects within these programs vary greatly in size, complexity, level of maturity, technical risk, and clarity of the development objectives. Organizations that perform exploratory research, evolutionary development, and other R&D activities can obtain the benefits of Agile practices without losing the benefits of their program’s overarching plan-driven structure. This paper describes application of Agile development methods on a large plan-driven sensor integration program. While the client employed plan-driven, requirements flow-down methodologies, tight project schedules and complex interfaces called for frequent end-to-end demonstrations to provide feedback during system development. The development process maintained the many benefits of plan-driven project execution with the rapid prototyping, integration, demonstration, and client feedback possible through Agile development methods. This paper also describes some of the tools and implementing mechanisms used to transition between and take advantage of each methodology, and presents lessons learned from the project management, system engineering, and developer’s perspectives.

  3. The mechanism and realization of a band-agile coaxial relativistic backward-wave oscillator

    SciTech Connect

    Ge, Xingjun; Zhang, Jun; Zhong, Huihuang; Qian, Baoliang; Wang, Haitao

    2014-11-03

    The mechanism and realization of a band-agile coaxial relativistic backward-wave oscillator (RBWO) are presented. The operation frequency tuning can be easily achieved by merely altering the inner-conductor length. The key effects of the inner-conductor length contributing to the mechanical frequency tunability are investigated theoretically and experimentally. There is a specific inner-conductor length where the operation frequency can jump from one mode to another mode, which belongs to a different operation band. In addition, the operation frequency is tunable within each operation band. During simulation, the L-band microwave with a frequency of 1.61 GHz is radiated when the inner-conductor length is 39 cm. Meanwhile, the S-band microwave with a frequency of 2.32 GHz is radiated when the inner-conductor length is 5 cm. The frequency adjustment bandwidths of L-band and S-band are about 8.5% and 2%, respectively. Moreover, the online mechanical tunability process is described in detail. In the initial experiment, the generated microwave frequencies remain approximately 1.59 GHz and 2.35 GHz when the inner-conductor lengths are 39 cm and 5 cm. In brief, this technical route of the band-agile coaxial RBWO is feasible and provides a guide to design other types of band-agile high-power microwave sources.

  4. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  5. Properties of terrestrial gamma ray flashes detected by AGILE MCAL below 30 MeV

    NASA Astrophysics Data System (ADS)

    Marisaldi, M.; Fuschino, F.; Tavani, M.; Dietrich, S.; Price, C.; Galli, M.; Pittori, C.; Verrecchia, F.; Mereghetti, S.; Cattaneo, P. W.; Colafrancesco, S.; Argan, A.; Labanti, C.; Longo, F.; Del Monte, E.; Barbiellini, G.; Giuliani, A.; Bulgarelli, A.; Campana, R.; Chen, A.; Gianotti, F.; Giommi, P.; Lazzarotto, F.; Morselli, A.; Rapisarda, M.; Rappoldi, A.; Trifoglio, M.; Trois, A.; Vercellone, S.

    2014-02-01

    We present the characteristics of 308 terrestrial gamma ray flashes (TGFs) detected by the Minicalorimeter (MCAL) instrument on board the AGILE satellite during the period March 2009-July 2012 in the ±2.5° latitude band and selected to have the maximum photon energy up to 30 MeV. The characteristics of the AGILE events are analyzed and compared to the observational framework established by the two other currently active missions capable of detecting TGFs from space, RHESSI and Fermi. A detailed model of the MCAL dead time is presented, which is fundamental to properly interpret our observations. The most significant contribution to dead time is due to the anticoincidence shield in its current configuration and not to the MCAL detector itself. Longitude and local time distributions are compatible with previous observations, while the duration distribution is biased toward longer values because of dead time. The intensity distribution is compatible with previous observations, when dead time is taken into account. The TGFs cumulative spectrum supports a low production altitude, in agreement with previous measurements. We also compare our sample to lightning sferics detected by the World Wide Lightning Location Network and suggest a new method to assess quantitatively the consistency of two TGF populations based on the comparison of the associated lightning activity. According to this method, AGILE and RHESSI samples are compatible with the same parent population. The AGILE TGF catalog below 30 MeV is accessible online at the website of the ASI Science Data Center http://www.asdc.asi.it/mcaltgfcat/.
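
    The paper's MCAL dead-time model is detailed and instrument-specific (dominated by the anticoincidence shield), but the basic idea of a dead-time correction can be sketched with the textbook non-paralyzable formula. All numbers here are hypothetical, not AGILE values:

```python
def true_rate(measured_rate, dead_time):
    """Non-paralyzable dead-time correction: each recorded count blinds the
    detector for `dead_time` seconds, so the true rate n relates to the
    measured rate m by m = n / (1 + n*tau), i.e. n = m / (1 - m*tau)."""
    return measured_rate / (1.0 - measured_rate * dead_time)

# Hypothetical numbers: 50 kHz measured with 5 microseconds of dead time.
m, tau = 5.0e4, 5.0e-6
n = true_rate(m, tau)
print(n)  # ~66.7 kHz: a 25% dead-time loss at this measured rate
```

    The loss grows nonlinearly with rate, which is why intensity and duration distributions of bright, short events like TGFs are biased unless dead time is modeled explicitly.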

  6. Mathematical and physical modelling of materials processing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Mathematical and physical modeling of turbulence phenomena in metals processing, electromagnetically driven flows in materials processing, gas-solid reactions, rapid solidification processes, the electroslag casting process, the role of cathodic depolarizers in the corrosion of aluminum in sea water, and predicting viscoelastic flows are described.

  7. Terrestrial Gamma-Ray Flashes at the highest energies as detected by AGILE

    NASA Astrophysics Data System (ADS)

    Tavani, M.; Marisaldi, M.; Fuschino, F.; Labanti, C.; Argan, A.; Agile Team

    2011-12-01

    The AGILE satellite, operating since mid-2007, is ideal for the study of Terrestrial Gamma-Ray Flashes (TGFs) at the highest energies. AGILE has been detecting TGFs with both its Calorimeter and its imaging gamma-ray Tracker. The on-board trigger logic has a broad dynamic range (reaching sub-millisecond trigger timescales) and a detection capability in the range 0.3 - 100 MeV. Since the 2009-2010 discovery of a power-law spectral component surprisingly detected up to 100 MeV, AGILE has been collecting additional TGF data with a substantial improvement of the statistics. We will present the most recent results based on about 300 events, focusing on the properties of TGFs showing substantial emission above 40 MeV (High-Energy TGF, HE-TGFs). We will also present new results on the global and local correlation between the TGFs/HE-TGFs and the lightning activity in the equatorial region as obtained by LIS/OTD data. Theoretical implications of HE-TGFs on particle acceleration in thunderstorms will be discussed as well as the possible important impacts of HE-TGFs in the atmospheric environment. The atmosphere during severe thunderstorms becomes a most efficient particle accelerator on Earth, challenging current models of TGF production.

  8. The first AGILE low-energy (< 30 MeV) Terrestrial Gamma-ray Flashes catalog

    NASA Astrophysics Data System (ADS)

    Marisaldi, Martino; Fuschino, Fabio; Pittori, Carlotta; Verrecchia, Francesco; Giommi, Paolo; Tavani, Marco; Dietrich, Stefano; Price, Colin; Argan, Andrea; Labanti, Claudio; Galli, Marcello; Longo, Francesco; Del Monte, Ettore; Barbiellini, Guido; Giuliani, Andrea; Bulgarelli, Andrea; Gianotti, Fulvio; Trifoglio, Massimo; Trois, Alessio

    2014-05-01

    We present the first catalog of Terrestrial Gamma-ray Flashes (TGFs) detected by the Minicalorimeter (MCAL) instrument on board the AGILE satellite. The catalog includes 308 TGFs detected during the period March 2009 - July 2012 in the ±2.5° latitude band and selected to have the maximum photon energy up to 30 MeV. The characteristics of the AGILE events are analysed and compared to the observational framework established by the two other currently active missions capable of detecting TGFs from space, RHESSI and Fermi. A detailed model of the MCAL dead time is presented, which is fundamental to properly interpret our observations, particularly concerning duration, intensity and correlation with lightning sferics detected by the World Wide Lightning Location Network. The TGFs cumulative spectrum supports a low production altitude, in agreement with previous measurements. The AGILE TGF catalog below 30 MeV is publicly accessible online at the website of the ASI Science Data Center (ASDC), http://www.asdc.asi.it/mcaltgfcat/. In addition to the TGF sample properties, we also present the catalog website functionalities available to users.

  9. Agile data management for curation of genomes to watershed datasets

    NASA Astrophysics Data System (ADS)

    Varadharajan, C.; Agarwal, D.; Faybishenko, B.; Versteeg, R.

    2015-12-01

    A software platform is being developed for data management and assimilation [DMA] as part of the U.S. Department of Energy's Genomes to Watershed Sustainable Systems Science Focus Area 2.0. The DMA components and capabilities are driven by the project science priorities, and the development is based on agile development techniques. The goal of the DMA software platform is to enable users to integrate and synthesize diverse and disparate field, laboratory, and simulation datasets, including geological, geochemical, geophysical, microbiological, hydrological, and meteorological data across a range of spatial and temporal scales. The DMA objectives are (a) developing an integrated interface to the datasets, (b) storing field monitoring data and laboratory analytical results of water and sediment samples in a database, (c) providing automated QA/QC analysis of data, and (d) working with data providers to modify high-priority field and laboratory data collection and reporting procedures as needed. The first three objectives are driven by user needs, while the last objective is driven by data management needs. The project needs and priorities are reassessed regularly with the users. After each user session we identify development priorities to match the identified user priorities. For instance, data QA/QC and collection activities have focused on the data and products needed for on-going scientific analyses (e.g. water level and geochemistry). We have also developed, tested and released a broker and portal that integrates diverse datasets from two different databases used for curation of project data. The development of the user interface was based on a user-centered design process involving several user interviews and constant interaction with data providers. The initial version focuses on the most requested feature, i.e. finding the data needed for analyses through an intuitive interface. Once the data is found, the user can immediately plot and download it.
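
    An automated QA/QC pass like the one described in objective (c) often starts with simple physical-range checks. A minimal sketch (variable names, limits, and the error code are hypothetical, not the project's actual schema):

```python
def qaqc_range_check(records, limits):
    """Flag measurements outside plausible physical limits.
    `limits` maps a variable name to a (lo, hi) tuple; returns flagged rows."""
    flagged = []
    for rec in records:
        lo, hi = limits[rec["variable"]]
        if not (lo <= rec["value"] <= hi):
            flagged.append(rec)
    return flagged

# Hypothetical field data: water level in metres, pH.
limits = {"water_level_m": (0.0, 30.0), "pH": (0.0, 14.0)}
records = [
    {"variable": "water_level_m", "value": 3.2},
    {"variable": "water_level_m", "value": -9999.0},  # sensor error code
    {"variable": "pH", "value": 7.1},
]
bad = qaqc_range_check(records, limits)
print(len(bad))  # 1
```

    In practice such checks are one layer among several (spike, gap, and consistency tests), but range screening catches sensor error codes and unit mistakes cheaply.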

  10. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri net and an object-oriented top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.
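
    The core Petri net mechanic such a simulation builds on, transitions firing when all their input places hold tokens, can be sketched in a few lines. The process fragment and task names below are hypothetical, not the paper's model (and timing is omitted):

```python
def fire(marking, transition):
    """Fire a transition if every input place has a token: consume one token
    from each input place, add one to each output place. Returns True if fired."""
    inputs, outputs = transition
    if all(marking[p] >= 1 for p in inputs):
        for p in inputs:
            marking[p] -= 1
        for p in outputs:
            marking[p] += 1
        return True
    return False

# Hypothetical software-process fragment: work items flow
# specified -> coded -> tested, with a single shared developer resource.
marking = {"specified": 2, "coded": 0, "tested": 0, "dev_free": 1}
code_task = (("specified", "dev_free"), ("coded", "dev_free"))
test_task = (("coded",), ("tested",))

steps = 0
while fire(marking, code_task) or fire(marking, test_task):
    steps += 1
print(marking, steps)  # all items end in "tested" after 4 firings
```

    The `dev_free` token models a shared resource that is consumed and returned on each firing; adding timed delays to transitions is what turns this into the productivity simulator the abstract describes.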

  11. Models for Turbulent Transport Processes.

    ERIC Educational Resources Information Center

    Hill, James C.

    1979-01-01

    Since the statistical theories of turbulence that have developed over the last twenty or thirty years are too abstract and unreliable to be of much use to chemical engineers, this paper introduces the techniques of single point models and suggests some areas of needed research. (BB)

  12. The Southern Argentine Agile Meteor Radar (SAAMER)

    NASA Astrophysics Data System (ADS)

    Janches, Diego

    2014-11-01

    The Southern Argentina Agile Meteor Radar (SAAMER) is a new generation system deployed in Rio Grande, Tierra del Fuego, Argentina (53 S) in May 2008. SAAMER transmits 10 times more power than regular meteor radars and uses a newly developed transmitting array, which focuses power upward instead of the traditional single-antenna all-sky configuration. The system is configured such that the transmitter array can also be utilized as a receiver. The new design greatly increases the sensitivity of the radar, enabling the detection of large numbers of particles at low zenith angles. The more concentrated transmitted power enables additional meteor studies besides those typical of these systems, which are based on the detection of specular reflections, such as routine detections of head echoes and non-specular trails, previously only possible with High Power and Large Aperture radars. In August 2010, SAAMER was upgraded to a system capable of determining meteoroid orbital parameters. This was achieved by adding two remote receiving stations approximately 10 km away from the main site in near perpendicular directions. The upgrade significantly expands the science that can be achieved with this new radar, enabling study of the orbital properties of the interplanetary dust environment. Because of its unique geographical location, SAAMER allows for inter-hemispheric comparison with measurements from the Canadian Meteor Orbit Radar, which is geographically conjugate. Initial surveys show, for example, that SAAMER observes a very strong contribution of the South Toroidal sporadic meteor source, for which only limited observational data are available. In addition, SAAMER offers unique capabilities for meteor shower and stream studies, given the range of ecliptic latitudes covered, enabling detailed study of showers at high southern latitudes (e.g. the July Phoenicids or the Puppids complex). Finally, SAAMER is ideal for the deployment of complementary instrumentation in both, permanent

  13. Creativity in Agile Systems Development: A Literature Review

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Wang, Xiaofeng; Fitzgerald, Brian

    Proponents of agile methods claim that enabling, fostering and driving creativity is the key motivation that differentiates agile methods from their more traditional, bureaucratic counterparts. However, there is very little rigorous research to support this claim. Like most of their predecessors, the development and promotion of these methods has been almost entirely driven by practitioners and consultants, with little objective validation from the research community. This lack of validation is particularly relevant for SMEs, given that many of their project teams typify the environment to which agile methods are most suited, i.e. small, co-located teams with diverse, blended skills in unstructured, sometimes even chaotic surroundings. This paper uses creativity theory as a lens to review the current agile method literature to understand exactly how much we know about the extent to which creativity actually occurs in these agile environments. The study reveals many gaps and conflicts of opinion in the body of knowledge in its current state and identifies many avenues for further research.

  14. Drought processes, modeling, and mitigation

    NASA Astrophysics Data System (ADS)

    Mishra, Ashok K.; Sivakumar, Bellie; Singh, Vijay P.

    2015-07-01

    Accurate assessment of droughts is crucial for proper planning and management of our water resources, environment, and ecosystems. The combined influence of increasing water demands and the anticipated impacts of global climate change has already raised serious concerns about worsening drought conditions in the future and their social, economic, and environmental impacts. As a result, studies on droughts are currently a major focal point for a broad range of research communities, including civil engineers, hydrologists, environmentalists, ecologists, meteorologists, geologists, agricultural scientists, economists, policy makers, and water managers. There is, therefore, an urgent need for enhancing our understanding of droughts (e.g. occurrence, modeling), making more reliable assessments of their impacts on various sectors of our society (e.g. domestic, agricultural, industrial), and undertaking appropriate adaptation and mitigation measures, especially in the face of global climate change.

  15. Lithography process window analysis with calibrated model

    NASA Astrophysics Data System (ADS)

    Zhou, Wenzhan; Yu, Jin; Lo, James; Liu, Johnson

    2004-05-01

    As critical dimensions shrink below 0.13 μm, SPC (Statistical Process Control) based on CD (Critical Dimension) control in the lithography process becomes more difficult. The increasing requirements of a shrinking process window call for more accurate determination of the process window center. However, in practical fabrication we found that systematic error introduced by metrology and/or the resist process can significantly impact the process window analysis result. In particular, when simple polynomial functions are used to fit the lithographic data from a focus exposure matrix (FEM), the model will fit these systematic errors rather than filter them out. This will directly impact the process window analysis and the determination of the best process condition. In this paper, we propose using a calibrated first-principles model for process window analysis. With this method, the systematic metrology error can be filtered out efficiently, giving a more reasonable process window analysis result.

  16. Distillation modeling for a uranium refining process

    SciTech Connect

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
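
    For the molecular-basis model mentioned, evaporation under moderate vacuum is commonly described by the Hertz-Knudsen (Langmuir) flux, J = alpha * (P_sat - P) * sqrt(M / (2 pi R T)). A sketch with hypothetical salt properties, not actual process values:

```python
import math

R_GAS = 8.314  # J mol^-1 K^-1

def langmuir_flux(p_sat, p_ambient, molar_mass, temperature, alpha=1.0):
    """Hertz-Knudsen evaporation flux in kg m^-2 s^-1:
    J = alpha * (P_sat - P) * sqrt(M / (2 * pi * R * T))."""
    return alpha * (p_sat - p_ambient) * math.sqrt(
        molar_mass / (2.0 * math.pi * R_GAS * temperature))

# Hypothetical values: a KCl-like vapor (M = 74.5 g/mol) at 1000 K,
# with 100 Pa saturation pressure evaporating into vacuum.
J = langmuir_flux(p_sat=100.0, p_ambient=0.0, molar_mass=0.0745,
                  temperature=1000.0)
print(J)  # flux in kg per square metre per second
```

    The equilibrium model, by contrast, would relate vapor and liquid compositions directly; comparing the two against plant data is exactly the correlation exercise the abstract describes.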

  17. VARTM Process Modeling of Aerospace Composite Structures

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The model predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured most likely due to the inaccurate preform permeability values used in the simulation.

  18. INTEGRATED FISCHER TROPSCH MODULAR PROCESS MODEL

    SciTech Connect

    Donna Post Guillen; Richard Boardman; Anastasia M. Gribik; Rick A. Wood; Robert A. Carrington

    2007-12-01

    With declining petroleum reserves, increased world demand, and unstable politics in some of the world’s richest oil producing regions, the capability for the U.S. to produce synthetic liquid fuels from domestic resources is critical to national security and economic stability. Coal, biomass and other carbonaceous materials can be converted to liquid fuels using several conversion processes. The leading candidate for large-scale conversion of coal to liquid fuels is the Fischer Tropsch (FT) process. Process configuration, component selection, and performance are interrelated and dependent on feed characteristics. This paper outlines a flexible modular approach to model an integrated FT process that utilizes a library of key component models, supporting kinetic data and materials and transport properties allowing rapid development of custom integrated plant models. The modular construction will permit rapid assessment of alternative designs and feed stocks. The modeling approach consists of three thrust areas, or “strands” – model/module development, integration of the model elements into an end to end integrated system model, and utilization of the model for plant design. Strand 1, model/module development, entails identifying, developing, and assembling a library of codes, user blocks, and data for FT process unit operations for a custom feedstock and plant description. Strand 2, integration development, provides the framework for linking these component and subsystem models to form an integrated FT plant simulation. Strand 3, plant design, includes testing and validation of the comprehensive model and performing design evaluation analyses.

  19. Pathways to agility in the production of neutron generators

    SciTech Connect

    Stoltz, R.E.; Beavis, L.C.; Cutchen, J.T.; Garcia, P.; Gurule, G.A.; Harris, R.N.; McKey, P.C.; Williams, D.W.

    1994-02-01

    This report is the result of a study team commissioned to explore pathways for increased agility in the manufacture of neutron generators. As a part of Sandia's new responsibility for generator production, the goal of the study was to identify opportunities to reduce costs and increase flexibility in the manufacturing operation. Four parallel approaches (or pathways) were recommended: (1) Know the goal, (2) Use design leverage effectively, (3) Value simplicity, and (4) Configure for flexibility. Agility in neutron generator production can be enhanced if all of these pathways are followed. The key role of the workforce in achieving agility was also noted, with emphasis on ownership, continuous learning, and a supportive environment.

  20. Comparison of conventional and microlens-array agile beam steerers

    NASA Astrophysics Data System (ADS)

    McDearmon, Graham F.; Flood, Kevin M.; Finlan, J. Michael

    1995-05-01

    We analyzed the optical and mechanical performance of several designs of agile beam steerers based on refractive microlens arrays for sensing and imaging applications in the visible and infrared wavebands. Ray-trace analyses showed that the best design is capable of steering narrowband illumination ±25° in two dimensions with nearly diffraction-limited performance. The maximum steering angle depends on the materials. We found that imaging the field of regard takes significantly more time than scanning it unless cameras with very high frame rates are used. We performed many parametric studies that can be used to optimize the design for any application. We compared optimal designs for microlens-array and conventional galvanometric agile beam steerers. The microlens-array agile beam steerer provides significant improvements in scanning speed, random access pointing, energy consumption, mass reduction, and volume reduction.

  1. A study of a proposed modified torsional agility metric

    NASA Technical Reports Server (NTRS)

    Valasek, John; Eggold, David P.; Downing, David R.

    1991-01-01

    A new candidate lateral agility metric, the modified torsional agility parameter, is proposed and tested through generic, nonlinear, non-real-time flight simulation programs of the F-18 and F-5A. The metric is aimed at quantifying high subsonic loaded roll capabilities which might be useful in modern air combat. The metric is straightforward to test and measure using non-real-time unmanned flight simulation. The metric is found to be sensitive to pilot input errors when less than full lateral stick is used to capture bank angle. It is suggested that, for redesigned configurations of both aircraft with improved lateral agility, the major benefit would be provided by fast and highly effective rudders, and a high level of pitch, roll, and yaw damping at moderate to high normal load factor levels.

  2. Modelling Biological Processes Using Simple Matrices.

    ERIC Educational Resources Information Center

    Paton, Ray

    1991-01-01

    A variety of examples are given from different areas of biology to illustrate the general applicability of matrix algebra to discrete models. These models of biological systems are concerned with relations between processes occurring in discrete time intervals. Diffusion, ecosystems, and different types of cells are modeled. (KR/Author)
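
    A classic example of such a discrete-time matrix model is the Leslie population projection: multiply an age-structured population vector by a matrix of fecundities and survival rates once per time step. A sketch with arbitrary illustrative rates (not taken from the article):

```python
def mat_vec(M, v):
    """Plain matrix-vector product over nested lists."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# Hypothetical Leslie matrix for three age classes: the top row holds
# fecundities, the sub-diagonal holds survival into the next age class.
leslie = [
    [0.0, 1.5, 1.0],
    [0.6, 0.0, 0.0],
    [0.0, 0.4, 0.0],
]
pop = [100.0, 50.0, 20.0]   # initial counts per age class
for _ in range(3):          # project three discrete time intervals
    pop = mat_vec(leslie, pop)
print([round(x, 1) for x in pop])  # -> [109.5, 66.0, 22.8]
```

    The same matrix-times-vector step serves for the article's other examples (diffusion on a grid, ecosystem transitions): only the interpretation of the matrix entries changes.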

  3. Preform Characterization in VARTM Process Model Development

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomena of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3-Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.
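The Darcy permeability characterization mentioned above rests on Darcy's law for flow through a porous preform; a minimal sketch of the 1-D relation, with placeholder values rather than the paper's measured fabric data:

```python
# Illustrative 1-D Darcy relation underlying preform-permeability
# characterization: superficial velocity v = (K / mu) * (dP / L).
# All numbers are invented placeholders, not measured values.

def darcy_velocity(K, mu, dP, L):
    """Resin velocity [m/s] through a preform of length L [m] under
    pressure drop dP [Pa], permeability K [m^2], viscosity mu [Pa.s]."""
    return (K / mu) * (dP / L)

# Full-vacuum pressure differential (101.3 kPa) across a 0.5 m preform.
v = darcy_velocity(K=1e-10, mu=0.2, dP=101.3e3, L=0.5)
print(f"{v:.2e} m/s")
```

In a characterization test the same relation is inverted: velocity and pressure drop are measured, and K is solved for.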

  4. Modeling Cellular Processes in 3-D

    PubMed Central

    Mogilner, Alex; Odde, David

    2011-01-01

    Summary Recent advances in photonic imaging and fluorescent protein technology offer unprecedented views of molecular space-time dynamics in living cells. At the same time, advances in computing hardware and software enable modeling of ever more complex systems, from global climate to cell division. As modeling and experiment become more closely integrated, we must address the issue of modeling cellular processes in 3-D. Here, we highlight recent advances related to 3-D modeling in cell biology. While some processes require full 3-D analysis, we suggest that others are more naturally described in 2-D or 1-D. Keeping the dimensionality as low as possible reduces computational time and makes models more intuitively comprehensible; however, the ability to test full 3-D models will build greater confidence in models generally and remains an important emerging area of cell biological modeling. PMID:22036197

  5. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
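The system-dynamics feedback at the heart of SEPS can be caricatured in a few lines; the loop below (schedule pressure raises the error rate, and defective work returns as rework) is an invented toy, not the actual SEPS formulation:

```python
# Toy system-dynamics loop in the spirit of SEPS: remaining work creates
# schedule pressure, pressure raises the error fraction, and defective
# work flows back into the remaining workload.  All rates are invented.

def simulate(total_tasks=100.0, staff=5.0, base_rate=1.0, weeks=40):
    remaining, week = total_tasks, 0
    while remaining > 0.5 and week < weeks:
        pressure = remaining / total_tasks       # crude pressure proxy
        error_frac = 0.1 + 0.2 * pressure        # errors rise with pressure
        done = staff * base_rate                 # gross output this week
        remaining -= done * (1.0 - error_frac)   # defects return as rework
        week += 1
    return week

print(simulate())   # weeks to (effectively) finish the project
```

Changing staffing or policy parameters and re-running is exactly the kind of tradeoff exploration the abstract describes.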

  6. SDN-Enabled Dynamic Feedback Control and Sensing in Agile Optical Networks

    NASA Astrophysics Data System (ADS)

    Lin, Likun

    Fiber optic networks are no longer just pipelines for transporting data in the long haul backbone. Exponential growth in traffic in metro-regional areas has pushed higher capacity fiber toward the edge of the network, and highly dynamic patterns of heterogeneous traffic have emerged that are often bursty, severely stressing the historical "fat and dumb pipe" static optical network, which would need to be massively over-provisioned to deal with these loads. What is required is a more intelligent network with a span of control over the optical as well as electrical transport mechanisms which enables handling of service requests in a fast and efficient way that guarantees quality of service (QoS) while optimizing capacity efficiency. An "agile" optical network is a reconfigurable optical network comprised of high speed intelligent control system fed by real-time in situ network sensing. It provides fast response in the control and switching of optical signals in response to changing traffic demands and network conditions. This agile control of optical signals is enabled by pushing switching decisions downward in the network stack to the physical layer. Implementing such agility is challenging due to the response dynamics and interactions of signals in the physical layer. Control schemes must deal with issues such as dynamic power equalization, EDFA transients and cascaded noise effects, impairments due to self-phase modulation and dispersion, and channel-to-channel cross talk. If these issues are not properly predicted and mitigated, attempts at dynamic control can drive the optical network into an unstable state. In order to enable high speed actuation of signal modulators and switches, the network controller must be able to make decisions based on predictive models. In this thesis, we consider how to take advantage of Software Defined Networking (SDN) capabilities for network reconfiguration, combined with embedded models that access updates from deployed network

  7. The Coalescent Process in Models with Selection

    PubMed Central

    Kaplan, N. L.; Darden, T.; Hudson, R. R.

    1988-01-01

    Statistical properties of the process describing the genealogical history of a random sample of genes are obtained for a class of population genetics models with selection. For models with selection, in contrast to models without selection, the distribution of this process, the coalescent process, depends on the distribution of the frequencies of alleles in the ancestral generations. If the ancestral frequency process can be approximated by a diffusion, then the mean and the variance of the number of segregating sites due to selectively neutral mutations in random samples can be numerically calculated. The calculations are greatly simplified if the frequencies of the alleles are tightly regulated. If the mutation rates between alleles maintained by balancing selection are low, then the number of selectively neutral segregating sites in a random sample of genes is expected to substantially exceed the number predicted under a neutral model. PMID:3066685
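The neutral-model baseline the authors compare against is Watterson's classical result, E[S] = θ · Σ_{i=1}^{n-1} 1/i, for the expected number of segregating sites in a sample of n genes with θ = 4Nμ; a sketch with illustrative parameter values:

```python
# Watterson's expected number of segregating sites under the neutral
# coalescent.  The parameter values are illustrative only; under
# balancing selection the observed count is expected to exceed this.

def expected_segregating_sites(n, theta):
    """E[S] for a sample of n genes, scaled mutation rate theta = 4*N*mu."""
    return theta * sum(1.0 / i for i in range(1, n))

print(round(expected_segregating_sites(n=10, theta=5.0), 3))
```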

  8. Object-oriented models of cognitive processing.

    PubMed

    Mather, G

    2001-05-01

    Information-processing models of vision and cognition are inspired by procedural programming languages. Models that emphasize object-based representations are closely related to object-oriented programming languages. The concepts underlying object-oriented languages provide a theoretical framework for cognitive processing that differs markedly from that offered by procedural languages. This framework is well-suited to a system designed to deal flexibly with discrete objects and unpredictable events in the world. PMID:11323249

  9. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has focused almost exclusively on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  10. The (Mathematical) Modeling Process in Biosciences

    PubMed Central

    Torres, Nestor V.; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work, the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling poses to current biology. PMID:26734063

  11. The influence of agility training on physiological and cognitive performance.

    PubMed

    Lennemann, Lynette M; Sidrow, Kathryn M; Johnson, Erica M; Harrison, Catherine R; Vojta, Christopher N; Walker, Thomas B

    2013-12-01

    Agility training (AT) has recently been instituted in several military communities in hopes of improving combat performance and general fitness. The purpose of this study was to determine how substituting AT for traditional military physical training (PT) influences physical and cognitive performance. Forty-one subjects undergoing military technical training were divided randomly into 2 groups for 6 weeks of training. One group participated in standard military PT consisting of calisthenics and running. A second group duplicated the amount of exercise of the first group but used AT as their primary mode of training. Before and after training, subjects completed a physical and cognitive battery of tests including V̇O2max, reaction time, the Illinois Agility Test, body composition, visual vigilance, dichotic listening, and working memory tests. There were significant improvements within the AT group in V̇O2max, the Illinois Agility Test, visual vigilance, and continuous memory. There was a significant increase in time-to-exhaustion for the traditional group. We conclude that AT is as effective as, or more effective than, PT in enhancing physical fitness. Further, it is potentially more effective than PT in enhancing specific measures of physical and cognitive performance, such as physical agility, memory, and vigilance. Consequently, we suggest that AT be incorporated into existing military PT programs as a way to improve war-fighter performance. Further, it seems likely that the benefits of AT observed here would occur in various other populations. PMID:23442271

  12. Agile Software Development Methods: A Comparative Review

    NASA Astrophysics Data System (ADS)

    Abrahamsson, Pekka; Oza, Nilay; Siponen, Mikko T.

    Although agile software development methods have caught the attention of software engineers and researchers worldwide, scientific research still remains quite scarce. The aim of this study is to order and make sense of the different agile approaches that have been proposed. This comparative review is performed from the standpoint of using the following features as the analytical perspectives: project management support, life-cycle coverage, type of practical guidance, adaptability in actual use, type of research objectives and existence of empirical evidence. The results show that agile software development methods cover, without offering any rationale, different phases of the software development life-cycle and that most of these methods fail to provide adequate project management support. Moreover, quite a few methods continue to offer little concrete guidance on how to use their solutions or how to adapt them in different development situations. Empirical evidence after ten years of application remains quite limited. Based on the results, new directions on agile methods are outlined.

  13. Wavelength-Agile External-Cavity Diode Laser for DWDM

    NASA Technical Reports Server (NTRS)

    Pilgrim, Jeffrey S.; Bomse, David S.

    2006-01-01

    A prototype external-cavity diode laser (ECDL) has been developed for communication systems utilizing dense wavelength-division multiplexing (DWDM). This ECDL is an updated version of the ECDL reported in Wavelength-Agile External-Cavity Diode Laser (LEW-17090), NASA Tech Briefs, Vol. 25, No. 11 (November 2001), page 14a. To recapitulate: The wavelength-agile ECDL combines the stability of an external-cavity laser with the wavelength agility of a diode laser. Wavelength is modulated by modulating the injection current of the diode-laser gain element. The external cavity is a Littman-Metcalf resonator, in which the zeroth-order output from a diffraction grating is used as the laser output and the first-order-diffracted light is retro-reflected by a cavity feedback mirror, which establishes one end of the resonator. The other end of the resonator is the output surface of a Fabry-Perot resonator that constitutes the diode-laser gain element. Wavelength is selected by choosing the angle of the diffracted return beam, as determined by the position of the feedback mirror. The present wavelength-agile ECDL is distinguished by design details that enable coverage of all 60 channels, separated by 100-GHz frequency intervals, that are specified in DWDM standards.

  14. A Capstone Course on Agile Software Development Using Scrum

    ERIC Educational Resources Information Center

    Mahnic, V.

    2012-01-01

    In this paper, an undergraduate capstone course in software engineering is described that not only exposes students to agile software development, but also makes it possible to observe the behavior of developers using Scrum for the first time. The course requires students to work as Scrum Teams, responsible for the implementation of a set of user…

  15. The Epidemic Process and The Contagion Model

    ERIC Educational Resources Information Center

    Worthen, Dennis B.

    1973-01-01

    Goffman's epidemic theory is presented and compared to the contagion theory developed by Menzel. An attempt is made to compare the two models presented and examine their similarities and differences. The conclusion drawn is that the two models are very similar in their approach to understanding communication processes. (14 references) (Author/SJ)

  16. Information-Processing Models of Cognition.

    ERIC Educational Resources Information Center

    Simon, Herbert A.

    1981-01-01

    Reviews recent progress in modeling human cognition, in particular the use of computers in generating models. Topics covered include the information processing approach to cognition, problem solving, semantic memory, pattern induction, and learning and cognitive development. A 164-item reference list is attached. (JL)

  17. Modeling of fluidized bed silicon deposition process

    NASA Technical Reports Server (NTRS)

    Kim, K.; Hsu, G.; Lutwack, R.; Praturi, A. K.

    1977-01-01

    The model is intended for use as a means of improving fluidized bed reactor design and for the formulation of the research program in support of the contracts of the Silicon Material Task for the development of the fluidized bed silicon deposition process. A computer program derived from the simple model is also described. Results of some sample calculations using the computer program are shown.

  18. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with, and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport and transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems

  19. VizieR Online Data Catalog: AGILE bright gamma-ray sources updated list (Verrecchia+, 2013)

    NASA Astrophysics Data System (ADS)

    Verrecchia, F.; Pittori, C.; Chen, A. W.; Bulgarelli, A.; Tavani, M.; Lucarelli, F.; Giommi, P.; Vercellone, S.; Pellizzoni, A.; Giuliani, A.; Longo, F.; Barbiellini, G.; Trifoglio, M.; Gianotti, F.; Argan, A.; Antonelli, L. A.; Caraveo, P.; Cardillo, M.; Cattaneo, P. W.; Cocco, V.; Colafrancesco, S.; Contessi, T.; Costa, E.; Del Monte, E.; De Paris, G.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Evangelista, Y.; Fanari, G.; Feroci, M.; Ferrari, A.; Fiorini, M.; Fornari, F.; Fuschino, F.; Froysland, T.; Frutti, M.; Galli, M.; Labanti, C.; Lapshov, I.; Lazzarotto, F.; Liello, F.; Lipari, P.; Mattaini, E.; Marisaldi, M.; Mastropietro, M.; Mauri, A.; Mauri, F.; Mereghetti, S.; Morelli, E.; Moretti, E.; Morselli, A.; Pacciani, L.; Perotti, F.; Piano, G.; Picozza, P.; Pilia, M.; Pontoni, C.; Porrovecchio, G.; Prest, M.; Primavera, R.; Pucella, G.; Rapisarda, M.; Rappoldi, A.; Rossi, E.; Rubini, A.; Sabatini, S.; Santolamazza, P.; Sotta, P.; Stellato, S.; Striani, E.; Tamburelli, F.; Traci, A.; Trois, A.; Vallazza, E.; Vittorini, V.; Zanello, D.; Salotti, L.; Valentini, G.

    2013-10-01

    We present a variability study of a sample of bright γ-ray (30 MeV-50 GeV) sources. This sample is an extension of the first AGILE catalogue of γ-ray sources (1AGL), obtained using the complete set of AGILE observations in pointing mode performed during a 2.3-year period from July 9, 2007 until October 30, 2009. The dataset of AGILE pointed observations covers a long time interval and its γ-ray data archive is useful for monitoring studies of medium-to-high brightness γ-ray sources. In the analysis reported here, we used data obtained with an improved event filter that covers a wider field of view, on a much larger (about 27.5 months) dataset, integrating data on observation block time scales, which mostly range from a few days to thirty days. The data processing resulted in a better characterized source list than 1AGL, and includes 54 sources, 7 of which are new high galactic latitude (|BII| ≥ 5) sources, 8 are new sources on the galactic plane, and 20 sources from the previous catalogue with revised positions. Eight 1AGL sources (2 high-latitude and 6 on the galactic plane) were not detected in the final processing, either because of low Observing Block (OB) exposure and/or because of their position in complex galactic regions. We report the results in a catalogue of all the detections obtained in each single OB, including the variability results for each of these sources. In particular, we found that 12 sources out of 42 or 11 out of 53 are variable, depending on the variability index used, where 42 and 53 are the number of sources for which these indices could be calculated. Seven of the 11 variable sources are blazars; the others are the Crab pulsar+nebula, LS I +61 303, Cyg X-3, and 1AGLR J2021+4030. (2 data files).

  20. Agile hardware and software systems engineering for critical military space applications

    NASA Astrophysics Data System (ADS)

    Huang, Philip M.; Knuth, Andrew A.; Krueger, Robert O.; Garrison-Darrin, Margaret A.

    2012-06-01

    The Multi Mission Bus Demonstrator (MBD) is a successful demonstration of agile program management and systems engineering in a high-risk technology application where new, untraditional development strategies were necessary. MBD produced two fully functioning spacecraft for a military/DOD application in a record-breaking time frame and at dramatically reduced cost. This paper discloses the adaptation and application of concepts developed in agile software engineering to hardware product and system development for critical military applications. This challenging spacecraft did not use existing key technology (heritage hardware) and created a large paradigm shift from traditional spacecraft development. The insertion of new technologies and methods in space hardware has long been a problem due to long build times, the desire to use heritage hardware, and the lack of effective process. The role of momentum in the innovative process can be exploited to tackle ongoing technology disruptions and to allow risk interactions to be mitigated in a disciplined manner. Examples of how these concepts were used during the MBD program are delineated. Maintaining project momentum was essential to assess the constant non-recurring technological challenges, which needed to be retired rapidly from the engineering risk list. Development never slowed, thanks to tactical assessment of the hardware with the adoption of the Scrum technique. We adapted this concept as a representation of mitigation of technical risk while allowing for design freeze later in the program's development cycle. By using agile systems engineering and management techniques which enabled decisive action, the product development momentum was effectively used to produce two novel space vehicles in a fraction of the time and at dramatically reduced cost.

  1. Development and evaluation of an inverse solution technique for studying helicopter maneuverability and agility

    NASA Technical Reports Server (NTRS)

    Whalley, Matthew S.

    1991-01-01

    An inverse solution technique for determining the maximum maneuvering performance of a helicopter using smooth, pilotlike control inputs is presented. Also described is a pilot simulation experiment performed to investigate the accuracy of the solution resulting from this technique. The maneuverability and agility capability of the helicopter math model was varied by varying the pitch and roll damping, the maximum pitch and roll rate, and the maximum load-factor capability. Three maneuvers were investigated: a 180-deg turn, a longitudinal pop-up, and a lateral jink. The inverse solution technique yielded accurate predictions of pilot-in-the-loop maneuvering performance for two of the three maneuvers.

  2. Impact of flow unsteadiness on maneuvers and loads of agile aircraft

    NASA Technical Reports Server (NTRS)

    Jarrah, M. Ameen; Ashley, Holt

    1989-01-01

    A program of airload measurements on a family of low-aspect-ratio delta wings with sharp leading edges, subjected to large amplitude pitch transients with angles of attack up to 90 deg, is reviewed. Even for small values of the pitch-rate parameter, representative of maneuvers anticipated for agile aircraft, the force and moment overshoots can exceed by 50 percent their steady-state values. This is explained in terms of the hysteretic behavior of the breakdown locations of leading-edge vortices. An approximate theoretical model is proposed which includes the breakdown hysteresis as part of a three-term representation of the unsteady chordwise load distribution.

  3. Quantitative modeling of soil genesis processes

    NASA Technical Reports Server (NTRS)

    Levine, E. R.; Knox, R. G.; Kerber, A. G.

    1992-01-01

    For fine spatial scale simulation, a model is being developed to predict changes in properties over short-, meso-, and long-term time scales within horizons of a given soil profile. Processes that control these changes can be grouped into five major process clusters: (1) abiotic chemical reactions; (2) activities of organisms; (3) energy balance and water phase transitions; (4) hydrologic flows; and (5) particle redistribution. Landscape modeling of soil development is possible using digitized soil maps associated with quantitative soil attribute data in a geographic information system (GIS) framework to which simulation models are applied.

  4. Using Perspective to Model Complex Processes

    SciTech Connect

    Kelsey, R.L.; Bisset, K.R.

    1999-04-04

    The notion of perspective, when supported in an object-based knowledge representation, can facilitate better abstractions of reality for modeling and simulation. The object modeling of complex physical and chemical processes is made more difficult in part by the poor abstractions of state and phase changes available in these models. The notion of perspective can be used to create different views to represent the different states of matter in a process. These techniques can lead to a more understandable model. Additionally, the ability to record the progress of a process from start to finish is problematic. It is desirable to have a historical record of the entire process, not just its end result. A historical record should facilitate backtracking and re-starting a process at different points in time. The same representation structures and techniques can be used to create a sequence of process markers to represent a historical record. By using perspective, the sequence of markers can have multiple and varying views tailored for a particular user's context of interest.
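The record's two ideas, per-state perspectives on one object and a sequence of markers forming a historical record, can be sketched as follows; all class, attribute, and state names here are invented for illustration:

```python
# Hypothetical sketch: an object carrying multiple "perspectives"
# (one view per state of matter) plus an ordered marker history that
# supports backtracking and re-start of the modeled process.

class ProcessObject:
    def __init__(self, name):
        self.name = name
        self.perspectives = {}   # state name -> attribute dict (a view)
        self.history = []        # ordered (step, state) markers

    def add_perspective(self, state, attrs):
        self.perspectives[state] = attrs

    def mark(self, step, state):
        """Record a process marker, e.g. at a phase change."""
        self.history.append((step, state))

    def view(self, state):
        """Return the representation of this object in the given state."""
        return self.perspectives[state]

water = ProcessObject("H2O")
water.add_perspective("solid", {"phase": "ice"})
water.add_perspective("liquid", {"phase": "water"})
water.mark(0, "solid")
water.mark(1, "liquid")        # the phase change becomes a history marker
print(water.view("liquid")["phase"], len(water.history))
```

Re-starting the process at any marker amounts to replaying `history` up to that step, which is the backtracking capability the abstract calls for.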

  5. Modeling the VARTM Composite Manufacturing Process

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Loos, Alfred C.; Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal

    2004-01-01

    A comprehensive simulation model of the Vacuum-Assisted Resin Transfer Molding (VARTM) composite manufacturing process has been developed. For isothermal resin infiltration, the model incorporates submodels which describe cure of the resin and changes in resin viscosity due to cure, resin flow through the reinforcement preform and distribution medium, and compaction of the preform during the infiltration. The accuracy of the model was validated by measuring the flow patterns during resin infiltration of flat preforms. The modeling software was used to evaluate the effects of the distribution medium on resin infiltration of a flat preform. Different distribution medium configurations were examined using the model and the results were compared with data collected during resin infiltration of a carbon fabric preform. The results of the simulations show that the approach used to model the distribution medium can significantly affect the predicted resin infiltration times. Resin infiltration into the preform can be accurately predicted only when the distribution medium is modeled correctly.

  6. Silicon EFG process development by multiscale modeling

    NASA Astrophysics Data System (ADS)

    Müller, M.; Birkmann, B.; Mosel, F.; Westram, I.; Seidl, A.

    2010-04-01

    An overview of simulation models in use for optimizing the edge-defined film-fed growth (EFG) process of thin-walled hollow silicon tubes at WACKER SCHOTT Solar is presented. The simulations span the length scales from complete furnace models over growth simulations with a mesoscopic description of the crystalline character of silicon down to solidification simulations with atomic resolution. Results gained from one model are used as input parameters or boundary conditions on other levels. Examples for the application of these models and their impact on process design are given. These include the reduction of tube thickness variations, the control of tube deformations, residual stresses and dislocation densities and the identification of twin formation processes typical for EFG silicon.

  7. Incorporating process variability into stormwater quality modelling.

    PubMed

    Wijesiri, Buddhi; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha

    2015-11-15

    Process variability in pollutant build-up and wash-off generates inherent uncertainty that affects the outcomes of stormwater quality models. Poor characterisation of process variability constrains the accurate accounting of the uncertainty associated with pollutant processes. This acts as a significant limitation to effective decision making in relation to stormwater pollution mitigation. The study developed three theoretical scenarios, based on research findings that variations in particle size fractions <150 μm and >150 μm during pollutant build-up and wash-off primarily determine the variability associated with these processes. These scenarios, which combine pollutant build-up and wash-off processes that take place on a continuous timeline, are able to explain process variability under different field conditions. Given the variability characteristics of a specific build-up or wash-off event, the theoretical scenarios help to infer the variability characteristics of the associated pollutant process that follows. Mathematical formulation of the theoretical scenarios enables the incorporation of the variability characteristics of pollutant build-up and wash-off processes into stormwater quality models. The research outcomes will contribute to the quantitative assessment of uncertainty as an integral part of the interpretation of stormwater quality modelling outcomes. PMID:26179783
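Build-up and wash-off are commonly modeled with exponential relations in the stormwater literature; a sketch under that assumption, with invented parameter values (the paper's three scenarios and particle-size split are not reproduced here):

```python
import math

# Common exponential forms for pollutant accumulation and removal:
#   build-up : B(t) = B_max * (1 - exp(-k_b * t))        (t in dry days)
#   wash-off : W(t) = B_0   * (1 - exp(-k_w * I * t))    (I = rain intensity)
# All coefficients below are illustrative placeholders.

def build_up(t_days, b_max=10.0, k_b=0.4):
    """Pollutant load on the surface after t_days of dry weather."""
    return b_max * (1.0 - math.exp(-k_b * t_days))

def wash_off(b0, t_hours, intensity, k_w=0.05):
    """Load washed off an initial load b0 by a storm of given intensity."""
    return b0 * (1.0 - math.exp(-k_w * intensity * t_hours))

b = build_up(7)                                  # load after 7 dry days
w = wash_off(b, t_hours=1.0, intensity=20.0)     # 1 h storm at 20 mm/h
print(round(b, 2), round(w, 2))
```

Process variability of the kind the paper studies would enter as event-to-event variation in the coefficients `k_b` and `k_w`.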

  8. Mathematical modeling of biomass fuels formation process

    SciTech Connect

    Gaska, Krzysztof Wandrasz, Andrzej J.

    2008-07-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management is causing a drastic depletion of natural resources (fossil fuels). Meanwhile, numerous technical processes produce a huge mass of wastes. A segregated and converted combustible fraction of the wastes, with relatively high calorific value, may be used as a component of formed fuels. The utilization of formed-fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels yields significant savings resulting from the partial replacement of fossil fuels, and reduces environmental pollution directly by limiting waste migration to the environment (soil, atmospheric air, surface and underground water). The realization of technological processes with the utilization of formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes as well as factors of sustainable development, from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures which uniquely identify a real process, together with the conversion of these data by algorithms based on a linear programming problem. The paper also presents the optimization of parameters in the process of forming fuels using a modified simplex algorithm with polynomial running time. This model is a reference point for the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed-fuel components, given the assumed constraints and decision variables of the task.
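The linear-programming core of the fuel-forming model can be illustrated with a two-component blending toy: minimize blend cost subject to a minimum calorific value. A coarse grid search stands in for the paper's modified simplex algorithm, and all component data are invented:

```python
# Toy fuel-blending LP: choose mass fractions of a waste-derived
# component and coal to minimise cost, subject to a minimum calorific
# value of the blend.  Component data are invented placeholders.

components = {                  # (cost per kg, calorific value MJ/kg)
    "waste_fraction": (0.02, 16.0),
    "coal":           (0.10, 25.0),
}
MIN_CV = 20.0                   # required calorific value of the blend

cw, vw = components["waste_fraction"]
cc, vc = components["coal"]

best = None
for i in range(101):            # waste mass fraction in 1 % steps
    frac = i / 100.0
    cost = frac * cw + (1 - frac) * cc
    cv = frac * vw + (1 - frac) * vc
    if cv >= MIN_CV and (best is None or cost < best[0]):
        best = (cost, frac)

cost, frac = best
print(round(frac, 2), round(cost, 4))   # cheapest feasible blend
```

A real formed-fuel model would have many components and constraints (ash, moisture, chlorine), which is where a proper simplex solver replaces the grid search.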

  9. Organizational Agility and Complex Enterprise System Innovations: A Mixed Methods Study of the Effects of Enterprise Systems on Organizational Agility

    ERIC Educational Resources Information Center

    Kharabe, Amol T.

    2012-01-01

    Over the last two decades, firms have operated in "increasingly" accelerated "high-velocity" dynamic markets, which require them to become "agile." During the same time frame, firms have increasingly deployed complex enterprise systems--large-scale packaged software "innovations" that integrate and automate…

  10. Applying Business Process Re-engineering Patterns to optimize WS-BPEL Workflows

    NASA Astrophysics Data System (ADS)

    Buys, Jonas; de Florio, Vincenzo; Blondia, Chris

With the advent of XML-based SOA, WS-BPEL quickly became a widely accepted standard for modeling business processes. Though SOA is said to embrace the principle of business agility, BPEL process definitions are still manually crafted into their final executable version. While SOA has proven to be a giant leap forward in building flexible IT systems, this static BPEL workflow model runs counter to the need for real business agility and should be enhanced to better sustain continual process evolution. In this paper, we point out the potential of adding business intelligence, in the form of business process re-engineering (BPR) patterns, to the system to allow for automatic business process optimization. Furthermore, we point out that BPR macro-rules could be implemented leveraging micro-techniques from computer science. We present some practical examples that illustrate the benefit of such adaptive process models and our preliminary findings.

  11. It's the parameters, stupid! Moving beyond multi-model and multi-physics approaches to characterize and reduce predictive uncertainty in process-based hydrological models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Samaniego, Luis; Freer, Jim

    2014-05-01

    Multi-model and multi-physics approaches are a popular tool in environmental modelling, with many studies focusing on optimally combining output from multiple model simulations to reduce predictive errors and better characterize predictive uncertainty. However, a careful and systematic analysis of different hydrological models reveals that individual models are simply small permutations of a master modeling template, and inter-model differences are overwhelmed by uncertainty in the choice of the parameter values in the model equations. Furthermore, inter-model differences do not explicitly represent the uncertainty in modeling a given process, leading to many situations where different models provide the wrong results for the same reasons. In other cases, the available morphological data does not support the very fine spatial discretization of the landscape that typifies many modern applications of process-based models. To make the uncertainty characterization problem worse, the uncertain parameter values in process-based models are often fixed (hard-coded), and the models lack the agility necessary to represent the tremendous heterogeneity in natural systems. This presentation summarizes results from a systematic analysis of uncertainty in process-based hydrological models, where we explicitly analyze the myriad of subjective decisions made throughout both the model development and parameter estimation process. Results show that much of the uncertainty is aleatory in nature - given a "complete" representation of dominant hydrologic processes, uncertainty in process parameterizations can be represented using an ensemble of model parameters. Epistemic uncertainty associated with process interactions and scaling behavior is still important, and these uncertainties can be represented using an ensemble of different spatial configurations. Finally, uncertainty in forcing data can be represented using ensemble methods for spatial meteorological analysis. 
Our systematic…

  12. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in aerospace, automotive, machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, inside cavities and on complex objects, and infiltration within preforms called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC nitride, and Boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  13. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  14. Dynamic occupancy models for explicit colonization processes.

    PubMed

    Broms, Kristin M; Hooten, Mevin B; Johnson, Devin S; Altwegg, Res; Conquest, Loveday L

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and its spread via short distance movement, rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization including short- vs. long-distance dispersal, habitat quality, and distance from source populations. PMID:27008788
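The small-scale colonization rules described above (colonization more likely with more occupied neighbours, plus a long-distance term) can be illustrated with a toy grid simulation. The parameter values, the torus boundary, and the omission of the detection layer are all simplifying assumptions, not the authors' model:

```python
# Toy spatiotemporal colonization/extinction dynamic: a site's
# colonization probability grows with its occupied neighbours, plus a
# small baseline term standing in for long-distance dispersal.
# Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
side, T = 20, 30
gamma0, gamma_nbr, eps = 0.01, 0.15, 0.1   # baseline, per-neighbour, extinction
z = np.zeros((side, side), dtype=int)
z[side // 2, side // 2] = 1                # single founding population

for _ in range(T):
    # Count occupied rook neighbours (np.roll wraps edges: torus geometry).
    nbrs = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
            np.roll(z, 1, 1) + np.roll(z, -1, 1))
    p_col = 1 - (1 - gamma0) * (1 - gamma_nbr) ** nbrs
    u = rng.random(z.shape)
    # Occupied sites persist with prob 1 - eps; empty sites colonize with p_col.
    z = np.where(z == 1, (u > eps).astype(int), (u < p_col).astype(int))

print(z.sum())   # occupied sites after T seasons
```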

  15. Dynamic occupancy models for explicit colonization processes

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Johnson, Devin S.; Altwegg, Res; Conquest, Loveday

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and its spread via short distance movement, rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization including short- vs. long-distance dispersal, habitat quality, and distance from source populations.

  16. An agile acquisition decision-support workbench for evaluating ISR effectiveness

    NASA Astrophysics Data System (ADS)

    Stouch, Daniel W.; Champagne, Valerie; Mow, Christopher; Rosenberg, Brad; Serrin, Joshua

    2011-06-01

The U.S. Air Force is consistently evolving to support current and future operations through the planning and execution of intelligence, surveillance and reconnaissance (ISR) missions. However, it is a challenge to maintain a precise awareness of current and emerging ISR capabilities to properly prepare for future conflicts. We present a decision-support tool for acquisition managers to empirically compare ISR capabilities and approaches to employing them, thereby enabling the DoD to acquire ISR platforms and sensors that provide the greatest return on investment. We have developed an analysis environment to perform modeling and simulation-based experiments to objectively compare alternatives. First, the analyst specifies an operational scenario for an area of operations by providing terrain and threat information; a set of nominated collections; sensor and platform capabilities; and processing, exploitation, and dissemination (PED) capacities. Next, the analyst selects and configures ISR collection strategies to generate collection plans. The analyst then defines customizable measures of effectiveness or performance to compute during the experiment. Finally, the analyst empirically compares the efficacy of each solution and generates concise reports to document their conclusions, providing traceable evidence for acquisition decisions. Our capability demonstrates the utility of using a workbench environment for analysts to design and run experiments. Crafting impartial metrics enables the acquisition manager to focus on evaluating solutions based on specific military needs. Finally, the metric and collection plan visualizations provide an intuitive understanding of the suitability of particular solutions. This facilitates a more agile acquisition strategy that handles rapidly changing technology in response to current military needs.

  17. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experience with this notation for process modelling in Pathology, in Spain or elsewhere, is known to us. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals. PMID:18673511

  18. Stochastic differential equation model to Prendiville processes

    SciTech Connect

    Granita; Bahar, Arifah

    2015-10-22

    The Prendiville process is another variation of the logistic model which assumes linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in the finite interval. The continuous time Markov chain can be approximated by stochastic differential equation (SDE). This paper discusses the stochastic differential equation of Prendiville process. The work started with the forward Kolmogorov equation in continuous time Markov chain of Prendiville process. Then it was formulated in the form of a central-difference approximation. The approximation was then used in Fokker-Planck equation in relation to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance function of the Prendiville process could be easily found from the explicit solution.
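The SDE approximation described above can be sketched numerically. Assuming the standard Prendiville rates (birth b·(N − x), death d·x) and illustrative parameter values, an Euler–Maruyama simulation recovers the analytic mean implied by the linear drift:

```python
# Euler-Maruyama sketch of the SDE approximation to a Prendiville
# (stochastic logistic) process with birth rate b*(N - x) and death
# rate d*x on the interval [0, N]; parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
b, d, N = 0.4, 0.1, 100.0       # birth/death coefficients, upper state bound
x0, T, dt = 10.0, 5.0, 0.001
steps = int(T / dt)
paths = 2000

x = np.full(paths, x0)
for _ in range(steps):
    drift = b * (N - x) - d * x
    diffusion = np.sqrt(np.maximum(b * (N - x) + d * x, 0.0))
    x = x + drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal(paths)
    x = np.clip(x, 0.0, N)      # keep paths inside the finite state interval

# The mean of the linear-drift SDE solves dm/dt = b*N - (b+d)*m exactly.
m_inf = b * N / (b + d)
m_exact = m_inf + (x0 - m_inf) * np.exp(-(b + d) * T)
print(x.mean(), m_exact)        # sample mean should track the analytic mean
```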

  19. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.

  20. Incorporating evolutionary processes into population viability models.

    PubMed

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence. PMID:25494697

  1. Session on modeling of radiative transfer processes

    NASA Technical Reports Server (NTRS)

    Flatau, Piotr

    1993-01-01

The session on modeling of radiative transfer processes is reviewed. Six critical issues surfaced in the discussion concerning scale-interactive radiative processes relevant to mesoscale convective systems (MCS's). These issues are the need to expand basic knowledge of how MCS's influence climate through extensive cloud shields and increased humidity in the upper troposphere; to improve radiation parameterizations used in mesoscale models and General Circulation Models (GCMs); to improve our basic understanding of the influence of radiation on MCS dynamics due to diabatic heating, production of condensate, and vertical and horizontal heat fluxes; to quantify our understanding of radiative impacts of MCS's on the surface and free atmosphere energy budgets; to quantify and identify radiative and microphysical processes important in the evolution of MCS's; and to improve the capability to remotely sense MCS radiative properties from space and ground-based systems.

  2. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  4. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    SciTech Connect

    Currier, R.P.

    1994-10-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous flow tubular reactor vessel. The waste is maintained at reaction temperature of 300--550 C where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g. axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high temperature, high pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported.

  5. A process algebra model of QED

    NASA Astrophysics Data System (ADS)

    Sulis, William

    2016-03-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics.

  6. Retort process modelling for Indian traditional foods.

    PubMed

    Gokhale, S V; Lele, S S

    2014-11-01

Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer pouch packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold-point temperature. Initial process conditions, retort temperature and % solid content were the significant independent variables. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot based sweet product) and Upama (wheat based snack product). The predicted and experimental values of the temperature profile matched within ±10 % error, which is a good match considering the food was a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize retorting of various Indian traditional vegetarian foods. PMID:26396305
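In its simplest single-compartment form, a lumped-parameter heating model of the kind described above reduces to dT/dt = (T_retort − T)/τ for the cold-point temperature, with an exponential analytic solution. The sketch below uses illustrative values for τ and the temperatures, not the paper's fitted model:

```python
# Single-compartment lumped-parameter sketch of cold-point heating
# during retorting: dT/dt = (T_retort - T) / tau, solved analytically.
# tau and the temperatures are illustrative, not fitted values.
import numpy as np

tau = 12.0                  # heating time constant (min)
T0, T_retort = 30.0, 121.0  # initial and retort temperatures (deg C)
t = np.linspace(0, 60, 7)   # process times (min)

# Exponential approach of the cold point toward the retort temperature.
T_cold = T_retort + (T0 - T_retort) * np.exp(-t / tau)
print(np.round(T_cold, 1))
```

In the full model, τ would depend on pouch geometry and % solid content, which is where the validated multi-component fit enters.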

  7. The DAB model of drawing processes

    NASA Technical Reports Server (NTRS)

    Hochhaus, Larry W.

    1989-01-01

    The problem of automatic drawing was investigated in two ways. First, a DAB model of drawing processes was introduced. DAB stands for three types of knowledge hypothesized to support drawing abilities, namely, Drawing Knowledge, Assimilated Knowledge, and Base Knowledge. Speculation concerning the content and character of each of these subsystems of the drawing process is introduced and the overall adequacy of the model is evaluated. Second, eight experts were each asked to understand six engineering drawings and to think aloud while doing so. It is anticipated that a concurrent protocol analysis of these interviews can be carried out in the future. Meanwhile, a general description of the videotape database is provided. In conclusion, the DAB model was praised as a worthwhile first step toward solution of a difficult problem, but was considered by and large inadequate to the challenge of automatic drawing. Suggestions for improvements on the model were made.

  8. Soil processes parameterization in meteorological model.

    NASA Astrophysics Data System (ADS)

    Mazur, Andrzej; Duniec, Grzegorz

    2014-05-01

In August 2012 the Polish Institute of Meteorology and Water Management - National Research Institute (IMWM-NRI) started a collaboration with the Institute of Agrophysics - Polish Academy of Sciences (IA-PAS) in order to improve soil processes parameterization in the high-resolution COSMO meteorological model (horizontal grid size of 2.8 km). This cooperation turned into a project named "New approach to parameterization of physical processes in soil in numerical model". The new set of soil processes parameterizations is being developed considering many physical and microphysical processes in soil. Currently, the main effort is focused on the description of bare-soil evaporation, soil water transport and the runoff from soil layers. The preliminary results from the new mathematical formulation of bare-soil evaporation implemented in the COSMO model will be presented. Moreover, during the Conference the authors (realizing a constant need for further improvement) would like to show future plans and topics for further studies. It is planned to combine the mentioned new approach with TILE and MOSAIC parameterizations, previously investigated as a part of the TERRA-MultiLevel module of the COSMO model, and to use measurement data received from IA-PAS and from the Satellite Remote Sensing Center in soil-related COSMO model numerical experiments.

  9. Attrition and abrasion models for oil shale process modeling

    SciTech Connect

    Aldis, D.F.

    1991-10-25

As oil shale is processed, fine particles, much smaller than the original shale, are created. This process is called attrition or, more accurately, abrasion. In this paper, models of abrasion are presented for oil shale being processed in several unit operations. Two of these unit operations, a fluidized bed and a lift pipe, are used in the Lawrence Livermore National Laboratory Hot-Recycle-Solid (HRS) process being developed for the above-ground processing of oil shale. In two reports, studies were conducted on the attrition of oil shale in unit operations which are used in the HRS process. Carley reported results for attrition in a lift pipe for oil shale which had been pre-processed either by retorting or by retorting then burning. The second paper, by Taylor and Beavers, reported results for fluidized-bed processing of oil shale. Taylor and Beavers studied raw shale, retorted shale, and shale which had been retorted and then burned. In this paper, empirical models are derived from the experimental studies conducted on oil shale for the processes occurring in the HRS process. The derived models are presented along with comparisons with experimental results.

  10. Gaussian Process Modeling of Protein Turnover.

    PubMed

    Rahman, Mahbubur; Previs, Stephen F; Kasumov, Takhar; Sadygov, Rovshan G

    2016-07-01

We describe a stochastic model to compute in vivo protein turnover rate constants from stable-isotope labeling and high-throughput liquid chromatography-mass spectrometry experiments. We show that the often-used one- and two-compartment nonstochastic models allow explicit solutions from the corresponding stochastic differential equations. The resulting stochastic process is a Gaussian process with an Ornstein-Uhlenbeck covariance matrix. We applied the stochastic model to a large-scale data set from (15)N labeling and compared its performance metrics with those of the nonstochastic curve fitting. The comparison showed that for more than 99% of proteins, the stochastic model produced better fits to the experimental data (based on residual sum of squares). The model was used for extracting protein-decay rate constants from mouse brain (slow turnover) and liver (fast turnover) samples. We found that the most affected (compared to two-exponent curve fitting) results were those for liver proteins. The ratio of the median of degradation rate constants of liver proteins to those of brain proteins increased 4-fold in stochastic modeling compared to the two-exponent fitting. Stochastic modeling predicted stronger differences of protein turnover processes between mouse liver and brain than previously estimated. The model is independent of the labeling isotope. To show this, we also applied the model to protein turnover studied in induced heart failure in rats, in which metabolic labeling was achieved by administering heavy water. No changes in the model were necessary for adapting to heavy-water labeling. The approach has been implemented in a freely available R code. PMID:27229456
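The Ornstein-Uhlenbeck covariance structure mentioned above has the stationary form k(s, t) = σ²·exp(−θ|s − t|), where θ plays the role of the decay-rate constant. A short sketch (with illustrative θ and σ², not fitted turnover constants) builds the covariance matrix over a set of labeling time points and confirms it is a valid covariance:

```python
# Sketch of the Ornstein-Uhlenbeck covariance used in Gaussian-process
# turnover models: k(s, t) = s2 * exp(-theta * |s - t|).
# theta and s2 are illustrative, not fitted protein turnover constants.
import numpy as np

theta, s2 = 0.3, 1.0                # decay-rate constant, stationary variance
t = np.linspace(0.0, 10.0, 6)       # labeling time points (e.g., days)
K = s2 * np.exp(-theta * np.abs(t[:, None] - t[None, :]))

# A valid covariance matrix must be symmetric and positive definite.
eigvals = np.linalg.eigvalsh(K)
print(K[0, 1], eigvals.min())
```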

  11. Modeling Asymmetric Rolling Process of Mg alloys

    SciTech Connect

    Cho, Jaehyung; Kim, Hyung-Wuk; Kang, Suk-Bong

    2010-06-15

Asymmetric deformation during rolling can arise in various ways: differences in the radii, speeds, or frictions of the top and bottom rolls. Asymmetric warm rolling processes of magnesium alloys were modeled using a Lagrangian incremental approach. A constitutive equation representing flow behaviors of AZ31 magnesium alloys during warm deformation was implemented in the model. Various roll speed ratios were introduced to investigate deformation behaviors of the magnesium alloys. Bending and texturing of the strips were examined.

  12. Inbreeding avoidance, patch isolation and matrix permeability influence dispersal and settlement choices by male agile antechinus in a fragmented landscape.

    PubMed

    Banks, Sam C; Lindenmayer, David B

    2014-03-01

    Animal dispersal is highly non-random and has important implications for the dynamics of populations in fragmented habitat. We identified interpatch dispersal events from genetic tagging, parentage analyses and assignment tests and modelled the factors associated with apparent emigration and post-dispersal settlement choices by individual male agile antechinus (Antechinus agilis, a marsupial carnivore of south-east Australian forests). Emigration decisions were best modelled with data on patch isolation and inbreeding risk. The choice of dispersal destination by males was influenced by inbreeding risk, female abundance, patch size, patch quality and matrix permeability (variation in land cover). Males were less likely to settle in patches without highly unrelated females. Our findings highlight the importance of individual-level dispersal data for understanding how multiple processes drive non-randomness in dispersal in modified landscapes. Fragmented landscapes present novel environmental, demographic and genetic contexts in which dispersal decisions are made, so the major factors affecting dispersal decisions in fragmented habitat may differ considerably from those in unfragmented landscapes. We show that the spatial scale of genetic neighbourhoods can be large in fragmented habitat, such that dispersing males can potentially settle in the presence of genetically similar females after moving considerable distances, thereby necessitating both a choice to emigrate and a choice of where to settle to avoid inbreeding. PMID:23991826

  13. Measurement and modeling of moist processes

    NASA Technical Reports Server (NTRS)

    Cotton, William; Starr, David; Mitchell, Kenneth; Fleming, Rex; Koch, Steve; Smith, Steve; Mailhot, Jocelyn; Perkey, Don; Tripoli, Greg

    1993-01-01

    The keynote talk summarized five years of work simulating observed mesoscale convective systems with the RAMS (Regional Atmospheric Modeling System) model. Excellent results are obtained when simulating squall lines or other convective systems that are strongly forced by fronts or other lifting mechanisms. Less strongly forced systems are difficult to model. The next topic in this colloquium was measurement of water vapor and other constituents of the hydrologic cycle. Impressive accuracy was shown in measuring water vapor with both the airborne DIAL (Differential Absorption Lidar) system and the ground-based Raman lidar. NMC's plans for initializing land water hydrology in mesoscale models were presented before water vapor measurement concepts for GCIP were discussed. The subject of using satellite data to provide mesoscale moisture and wind analyses was next. Recent activities in modeling of moist processes in mesoscale systems were reported; these modeling activities at the Canadian Atmospheric Environment Service (AES) used a hydrostatic, variable-resolution grid model. Next, the effects of spatial resolution on moisture budgets were discussed, in particular the effects of temporal resolution on heat and moisture budgets for cumulus parameterization. The colloquium concluded with modeling of scale-interaction processes.

  14. Building phenomenological models of complex biological processes

    NASA Astrophysics Data System (ADS)

    Daniels, Bryan; Nemenman, Ilya

    2009-11-01

    A central goal of any modeling effort is to make predictions regarding experimental conditions that have not yet been observed. Overly simple models will not be able to fit the original data well, but overly complex models are likely to overfit the data and thus produce bad predictions. Modern quantitative biology modeling efforts often err on the complexity side of this balance, using myriads of microscopic biochemical reaction processes with a priori unknown kinetic parameters to model relatively simple biological phenomena. In this work, we show how Bayesian model selection (which is mathematically similar to a low-temperature expansion in statistical physics) can be used to build coarse-grained, phenomenological models of complex dynamical biological processes, which have better predictive power than microscopically correct, but poorly constrained, mechanistic molecular models. We illustrate this with the example of a multiply-modifiable protein molecule, which is a simplified description of multiple biological systems, such as immune receptors and the RNA polymerase complex. Our approach is similar in spirit to the phenomenological Landau expansion for the free energy in the theory of critical phenomena.
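    The trade-off described above can be made concrete with the Bayesian information criterion, a large-sample proxy for Bayesian model selection. A toy sketch (the data, candidate models, and noise level are fabricated for illustration): polynomial models of increasing degree are fitted to truly linear data, and BIC penalizes the extra parameters.

    ```python
    import numpy as np

    def bic(y, y_fit, n_params):
        # BIC = n*ln(RSS/n) + k*ln(n) for Gaussian errors; lower is better
        n = len(y)
        rss = np.sum((y - y_fit) ** 2)
        return n * np.log(rss / n) + n_params * np.log(n)

    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 50)
    y = 1.0 + 2.0 * x + 0.05 * rng.standard_normal(50)   # truly linear data

    scores = {}
    for deg in (1, 2, 5, 9):                             # candidate model complexities
        coef = np.polyfit(x, y, deg)
        scores[deg] = bic(y, np.polyval(coef, x), deg + 1)
    best = min(scores, key=scores.get)                   # most parsimonious adequate model
    ```

    The high-degree fits reduce the residual sum of squares slightly, but the ln(n) penalty per parameter keeps the coarse model preferred, mirroring the abstract's argument for phenomenological over microscopically detailed models.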

  15. Processing and Modeling of Porous Copper Using Sintering Dissolution Process

    NASA Astrophysics Data System (ADS)

    Salih, Mustafa Abualgasim Abdalhakam

    The growth of porous metals has produced materials with improved properties compared to non-metals and solid metals. Porous metals can be classified as either open cell or closed cell. Open-cell structures allow a fluid medium to pass through them; closed-cell structures are made up of adjacent sealed pores with shared cell walls. Metal foams offer higher strength-to-weight ratios, increased impact energy absorption, and a greater tolerance to high temperatures and adverse environmental conditions when compared to bulk materials. Copper and its alloys are examples of these, well known for high strength and good mechanical, thermal and electrical properties. In the present study, porous Cu was made by a powder metallurgy process using three different space holders: sodium chloride, sodium carbonate and potassium carbonate. Several different samples were produced using different volume-fraction ratios. The densities of the porous metals were measured and compared to the theoretical density calculated using an equation developed for these foams. The porous structure was obtained by removing the spacer material during the sintering process; the sintering schedule for each spacer depends on its melting point. Processing, characterization, and mechanical property testing were completed. These tests include density measurements, compression tests, computed tomography (CT) and scanning electron microscopy (SEM). The captured morphological images are used to generate object-oriented finite element (OOF) analyses of the porous copper. Porous copper was formed with porosities in the range of 40-66% and densities ranging from 3 to 5.2 g/cm3. A study of two different methods to measure porosity was completed.
OOF (Object Oriented Finite Elements) is a desktop software application for studying the relationship between the microstructure of a material and its overall mechanical, dielectric, or thermal properties using finite element models based on
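    The density-porosity bookkeeping in this abstract can be sketched numerically. The thesis's own foam-density equation is not given, so the rule-of-mixtures relation below (pores replace the spacer volume after sintering) and the standard bulk copper density of 8.96 g/cm3 are assumptions for illustration.

    ```python
    RHO_CU = 8.96  # g/cm^3, bulk copper (handbook value)

    def foam_density(vol_fraction_spacer):
        # assumed rule of mixtures: pore volume equals the removed spacer volume
        return RHO_CU * (1.0 - vol_fraction_spacer)

    def porosity(measured_density):
        # porosity inferred from a measured (e.g. Archimedes) density
        return 1.0 - measured_density / RHO_CU

    # the abstract's density range, 3 to 5.2 g/cm^3, back-converted to porosity
    p_low, p_high = porosity(5.2), porosity(3.0)
    ```

    Under these assumptions the reported 3-5.2 g/cm3 densities map to roughly 42-67% porosity, consistent with the 40-66% range quoted in the abstract.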

  16. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  17. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  18. Model-based internal wave processing

    SciTech Connect

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem, based on state-space representations of the normal-mode vertical velocity and plane-wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, model-based solutions to the signal enhancement problem for internal waves are investigated.

  19. Dynamical modeling of laser ablation processes

    SciTech Connect

    Leboeuf, J.N.; Chen, K.R.; Donato, J.M.; Geohegan, D.B.; Liu, C.L.; Puretzky, A.A.; Wood, R.F.

    1995-09-01

    Several physics and computational approaches have been developed to globally characterize phenomena important for film growth by pulsed laser deposition of materials. These include thermal models of laser-solid target interactions that initiate the vapor plume; plume ionization and heating through laser absorption beyond local thermodynamic equilibrium mechanisms; gas dynamic, hydrodynamic, and collisional descriptions of plume transport; and molecular dynamics models of the interaction of plume particles with the deposition substrate. The complexity of the phenomena involved in the laser ablation process is matched by the diversity of the modeling task, which combines materials science, atomic physics, and plasma physics.

  20. Model Identification of Integrated ARMA Processes

    ERIC Educational Resources Information Center

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
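    SCAN and ESACF are SAS-specific procedures, but the identification problem they address can be illustrated with plain autocorrelations: an integrated series has a sample ACF that stays near one at short lags, and differencing once removes that signature. A numpy-only sketch (the simulated ARIMA(0,1,0) series is an assumption for illustration):

    ```python
    import numpy as np

    def acf(x, lag):
        # sample autocorrelation of x at a given lag
        x = x - x.mean()
        return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

    rng = np.random.default_rng(2)
    walk = np.cumsum(rng.standard_normal(2000))  # ARIMA(0,1,0): integrated white noise

    r1_raw = acf(walk, 1)            # near 1 -> differencing is needed
    r1_diff = acf(np.diff(walk), 1)  # near 0 -> differenced series looks like white noise
    ```

    Automated tools like SCAN/ESACF essentially formalize this kind of diagnostic across many lags and candidate (p, d, q) orders.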

  21. Mathematical Modelling of Continuous Biotechnological Processes

    ERIC Educational Resources Information Center

    Pencheva, T.; Hristozov, I.; Shannon, A. G.

    2003-01-01

    Biotechnological processes (BTP) are characterized by a complicated structure of organization and interdependent characteristics. Partial differential equations or systems of partial differential equations are used for their behavioural description as objects with distributed parameters. Modelling of substrate without regard to dispersion…

  1. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  2. The SERIOL2 Model of Orthographic Processing

    ERIC Educational Resources Information Center

    Whitney, Carol; Marton, Yuval

    2013-01-01

    The SERIOL model of orthographic analysis proposed mechanisms for converting visual input into a serial encoding of letter order, which involved hemisphere-specific processing at the retinotopic level. As a test of SERIOL predictions, we conducted a consonant trigram-identification experiment, where the trigrams were briefly presented at various…

  3. Content, Process, and Product: Modeling Differentiated Instruction

    ERIC Educational Resources Information Center

    Taylor, Barbara Kline

    2015-01-01

    Modeling differentiated instruction is one way to demonstrate how educators can incorporate instructional strategies to address students' needs, interests, and learning styles. This article discusses how secondary teacher candidates learn to focus on content--the "what" of instruction; process--the "how" of instruction;…

  4. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and high material costs to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  5. Hot blast stove process model and model-based controller

    SciTech Connect

    Muske, K.R.; Howse, J.W.; Hansen, G.A.; Cagliostro, D.J.; Chaubal, P.C.

    1998-12-31

    This paper describes the process model and model-based control techniques implemented on the hot blast stoves for the No. 7 Blast Furnace at the Inland Steel facility in East Chicago, Indiana. A detailed heat transfer model of the stoves is developed and verified using plant data. This model is used as part of a predictive control scheme to determine the minimum amount of fuel necessary to achieve the blast air requirements. The model is also used to predict maximum and minimum temperature constraint violations within the stove so that the controller can take corrective actions while still achieving the required stove performance.

  6. Modeling chondrocyte patterns by elliptical cluster processes.

    PubMed

    Meinhardt, Martin; Lück, Sebastian; Martin, Pascal; Felka, Tino; Aicher, Wilhelm; Rolauffs, Bernd; Schmidt, Volker

    2012-02-01

    Superficial zone chondrocytes (CHs) of human joints are spatially organized in distinct horizontal patterns. Among other factors, the type of spatial CH organization within a given articular surface depends on whether the cartilage has been derived from an intact joint or the joint is affected by osteoarthritis (OA). Furthermore, specific variations of the type of spatial organization are associated with particular states of OA. This association may prove relevant for early disease recognition based on a quantitative structural characterization of CH patterns. Therefore, we present a point process model describing the distinct morphology of CH patterns within the articular surface of intact human cartilage. This reference model for intact CH organization can be seen as a first step towards a model-based statistical diagnostic tool. Model parameters are fitted to fluorescence microscopy data by a novel statistical methodology utilizing tools from cluster and principal component analysis. This way, the complex morphology of surface CH patterns is represented by a relatively small number of model parameters. We validate the point process model by comparing biologically relevant structural characteristics between the fitted model and data derived from photomicrographs of the human articular surface using techniques from spatial statistics. PMID:22155191
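    The paper fits its cluster process to microscopy data; as a toy version of the generative idea, one can simulate a Poisson cluster process whose offspring fill randomly oriented ellipses. All parameter values below are illustrative assumptions, not the authors' fitted model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def elliptical_cluster_process(n_parents, mean_offspring, a, b, window=1.0):
        # parents: uniform in the unit window; offspring: uniform in an ellipse
        # with semi-axes (a, b), randomly rotated -- a toy elliptical cluster process
        points = []
        parents = rng.uniform(0, window, size=(n_parents, 2))
        for px, py in parents:
            n = rng.poisson(mean_offspring)          # Poisson cluster size
            theta = rng.uniform(0, np.pi)            # cluster orientation
            r = np.sqrt(rng.uniform(0, 1, n))        # uniform radial density in the ellipse
            phi = rng.uniform(0, 2 * np.pi, n)
            ex, ey = a * r * np.cos(phi), b * r * np.sin(phi)
            x = px + ex * np.cos(theta) - ey * np.sin(theta)  # rotate into place
            y = py + ex * np.sin(theta) + ey * np.cos(theta)
            points.extend(zip(x, y))
        return np.array(points)

    pts = elliptical_cluster_process(n_parents=20, mean_offspring=15, a=0.08, b=0.02)
    ```

    Fitting such a model to data, as the paper does, amounts to estimating the parent intensity, mean cluster size, and ellipse geometry from observed point patterns.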

  7. Process Model for Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Adams, Glynn

    1996-01-01

    Friction stir welding (FSW) is a relatively new process being applied for joining of metal alloys. The process was initially developed by The Welding Institute (TWI) in Cambridge, UK. The FSW process is being investigated at NASA/MSFC as a repair/initial weld procedure for fabrication of the super-lightweight aluminum-lithium shuttle external tank. The FSW investigations at MSFC were conducted on a horizontal mill to produce butt welds of flat plate material. The weldment plates are butted together and fixed to a backing plate on the mill bed. A pin tool is placed into the tool holder of the mill spindle and rotated at approximately 400 rpm. The pin tool is then plunged into the plates such that the center of the probe lies at one end of the line of contact between the plates and the shoulder of the pin tool penetrates the top surface of the weldment. The weld is produced by traversing the tool along the line of contact between the plates. A lead angle allows the leading edge of the shoulder to remain above the top surface of the plate. The work presented here is a first attempt at modeling a complex phenomenon. The mechanical aspects of conducting the weld process are easily defined, and the process itself is controlled by relatively few input parameters. However, in the region of the weld, plasticizing and forging of the parent material occur. These are difficult processes to model. The model presented here addresses only variations in the radial dimension outward from the pin tool axis. Examinations of the grain structure of the weld reveal that a considerable amount of material deformation also occurs in the direction parallel to the pin tool axis of rotation, through the material thickness. In addition, measurements of the axial load on the pin tool demonstrate that the forging effect of the pin tool shoulder is an important process phenomenon. 
Therefore, the model needs to be expanded to account for the deformations through the material thickness and the

  8. Agile supply chain capabilities: emerging patterns as a determinant of competitive objectives

    NASA Astrophysics Data System (ADS)

    Yusuf, Yahaya Y.; Adeleye, E. O.; Sivayoganathan, K.

    2001-10-01

    Turbulent change caused by factors such as changing customer and technological requirements threatens manufacturers with shorter product life cycles, lower profits and bleak survival prospects. Therefore, several companies are stressing flexibility and agility in order to respond in real time to the unique needs of customers and markets. However, the resource competencies required are often difficult for single companies to mobilise and retain. It is therefore imperative for companies to co-operate and leverage complementary competencies. To this end, legally separate and spatially distributed companies are becoming integrated through Internet-based technologies. The paper reviews emerging patterns in supply chain integration. It also explores the relationship between the emerging patterns and attainment of competitive objectives. The results reported in the paper are based on data from a survey by questionnaire. The survey involved 600 companies in the UK, as part of a larger study of agile manufacturing. The study was driven by a conceptual model which relates supply chain practices to competitive objectives. The analysis involves the use of factor analysis to reduce the research variables to a few principal components. Subsequently, multiple regression was conducted to study the relationships amongst the reduced variables. The results validate the proposed conceptual model and lend credence to current thinking that supply chain integration is a vital tool for competitive advantage.
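    The two-step analysis pipeline described (factor-analytic reduction of survey items, then multiple regression on the reduced variables) can be sketched with principal components standing in for the rotated factors. The survey items, sample size, and coefficients below are fabricated for illustration, not the paper's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # toy survey: 100 firms x 6 supply-chain practice items (hypothetical)
    X = rng.standard_normal((100, 6))
    X[:, 1] += X[:, 0]; X[:, 2] += X[:, 0]   # items 0-2 load on one latent practice
    X[:, 4] += X[:, 3]; X[:, 5] += X[:, 3]   # items 3-5 load on another

    # step 1: reduce the items to a few principal components
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:2].T                   # first two component scores per firm

    # step 2: regress a competitive objective on the component scores
    y = 0.8 * scores[:, 0] + 0.3 * scores[:, 1] + 0.1 * rng.standard_normal(100)
    design = np.column_stack([np.ones(100), scores])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    ```

    The regression coefficients on the components then play the role of the paper's estimated links between supply-chain practice patterns and competitive objectives.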

  9. A model evaluation checklist for process-based environmental models

    NASA Astrophysics Data System (ADS)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against a dataset independent of that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. 
(3) Model structural inadequacies, whereby model structure may inadequately represent
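    The Nash-Sutcliffe statistic criticised in point (1) has a well-known reference point: a simulation no better than the observed mean scores zero, and a perfect simulation scores one, yet values in between say little about process realism. Its definition is simple (a sketch; the observation series is made up):

    ```python
    import numpy as np

    def nse(obs, sim):
        # Nash-Sutcliffe efficiency: 1 - SSE / sum of squares around the observed mean
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    obs = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
    perfect = nse(obs, obs)                        # 1.0: perfect simulation
    mean_bench = nse(obs, np.full(5, obs.mean()))  # 0.0: the mean as a "model"
    ```

    Because squared errors dominate the numerator, NSE is driven by the largest (typically high-flow) residuals, which is one reason a single NSE value can fail to discriminate between realistic and unrealistic simulations of phosphorus dynamics.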

  10. Coal-to-Liquids Process Model

    2006-01-01

    A comprehensive Aspen Plus model has been developed to rigorously model coal-to-liquids processes. This portion was developed under Laboratory Directed Research and Development (LDRD) funding. The model is built in a modular fashion to allow rapid reconfiguration for evaluation of process options. Aspen Plus is the framework in which the model is developed. The coal-to-liquids simulation package is an assembly of Aspen Hierarchy Blocks representing subsections of the plant. Each of these Blocks is considered an individual component of the Copyright, which may be extracted and licensed as an individual component, but which may be combined with one or more other components to model general coal-conversion processes, including the following plant operations: (1) coal handling and preparation, (2) coal pyrolysis, combustion, or gasification, (3) syngas conditioning and cleanup, (4) sulfur recovery using Claus-SCOT unit operations, (5) Fischer-Tropsch liquid fuels synthesis, (6) hydrocracking of high molecular weight paraffins, (7) hydrotreating of low molecular weight paraffins and olefins, (8) gas separations, and (9) power generation representing integrated combined cycle technology.

  11. [Cellular model of blood coagulation process].

    PubMed

    Bijak, Michał; Rzeźnicka, Paulina; Saluk, Joanna; Nowak, Paweł

    2015-07-01

    Blood coagulation is a process whose main objective is the prevention of blood loss when the integrity of a blood vessel is damaged. Over the years, a number of concepts characterizing the mechanism of thrombus formation have been presented. From the 1960s, the prevailing cascade model of coagulation held that formation of the fibrin clot is determined by two pathways, called the extrinsic and intrinsic pathways. In the 1990s, Monroe and Hoffman presented their concept of the blood coagulation process, which complements the cascade model with the participation of cells, especially blood platelets, whose role is to provide a negatively charged phospholipid surface and thereby allow the formation of the coagulation enzymatic complexes. They called their concept the cellular model of coagulation. The aim of this work was to present this model of blood coagulation in detail, including descriptions of its various phases. PMID:26277170

  12. Modeling of the vacuum plasma spray process

    SciTech Connect

    Varacalle, D.J. Jr.; Neiser, R.A.; Smith, M.F.

    1992-10-01

    Experimental and analytical studies have been conducted to investigate gas, particle, and coating dynamics in the vacuum plasma spray (VPS) process for a tungsten powder. VPS coatings were examined metallographically and the results compared with the model's predictions. The plasma was numerically modeled from the cathode tip to the spray distance in the free plume for the experimental conditions of this study. This information was then used as boundary conditions to solve the particle dynamics. The predicted temperature and velocity of the powder particles at standoff were then used as initial conditions for a coating dynamics code. The code predicts the coating morphology for the specific process parameters. The predicted characteristics exhibit good correlation with the observed coating properties.

  13. Pulsar timing and the Fermi and AGILE missions

    NASA Astrophysics Data System (ADS)

    Weltevrede, Patrick; Possenti, Andrea; Manchester, Dick; Johnston, Simon; Kramer, Michael; Hobbs, George; Keith, Michael; Romani, Roger W.; Thompson, David J.; Thorsett, Stephen; Roberts, Mallory

    2009-10-01

    We request time to observe 160 pulsars on a regular basis in order to provide accurate ephemerides necessary for the detection of gamma-ray pulsars with the Fermi and AGILE satellites. The main science goals are to increase the number of known gamma-ray pulsars (both radio loud and radio quiet), to determine accurate pulse profiles, to characterise their high energy spectra and phase resolved spectroscopy of the brightest pulsars. In the radio, the observations will also allow us to find glitches, characterise timing noise, investigate dispersion and rotation measure variability and enhance our knowledge of single pulse phenomenology. To date, we are (co-)authors on 2 Agile papers, 4 Fermi papers, 3 radio papers and authors on 3 papers in submission. The data are contributing to the PhD theses of Lucas Guillemot and Damien Parent from the Bordeaux Fermi group.

  14. ROADM architectures and technologies for agile optical networks

    NASA Astrophysics Data System (ADS)

    Eldada, Louay A.

    2007-02-01

    We review the different optoelectronic component and module technologies that have been developed for use in ROADM subsystems, and describe their principles of operation, designs, features, advantages, and challenges. We also describe the various needs for reconfigurable optical add/drop switching in agile optical networks. For each network need, we present the different ROADM subsystem architecture options with their pros and cons, and describe the optoelectronic technologies supporting each architecture.

  15. Sprint, agility, strength and endurance capacity in wheelchair basketball players

    PubMed Central

    Granados, C; Otero, M; Badiola, A; Olasagasti, J; Bidaurrazaga-Letona, I; Iturricastillo, A; Gil, SM

    2014-01-01

    The aims of the present study were, firstly, to determine the reliability and reproducibility of an agility T-test and Yo-Yo 10 m recovery test; and secondly, to analyse the physical characteristics measured by sprint, agility, strength and endurance field tests in wheelchair basketball (WB) players. 16 WB players (33.06 ± 7.36 years, 71.89 ± 21.71 kg and sitting body height 86.07 ± 6.82 cm) belonging to the national WB league participated in this study. Wheelchair sprint (5 and 20 m without ball, and 5 and 20 m with ball), agility (T-test and pick-up test), strength (handgrip and maximal pass) and endurance (Yo-Yo 10 m recovery test) tests were performed. The T-test and Yo-Yo 10 m recovery test showed good reproducibility values (intraclass correlation coefficient, ICC = 0.74-0.94). The WB players’ results in the 5 and 20 m sprints without a ball were 1.87 ± 0.21 s and 5.70 ± 0.43 s, and with a ball 2.10 ± 0.30 s and 6.59 ± 0.61 s, being better than those reported in the literature. Regarding the pick-up test (16.05 ± 0.52 s) and maximal pass (8.39 ± 1.77 m), players showed worse values than those obtained in elite players. The main contribution of the present study is the characterization of the physical performance profile of WB players using a field test battery. Furthermore, we demonstrated that the agility T-test and the aerobic Yo-Yo 10 m recovery test are reliable; consequently they may be appropriate instruments for measuring physical fitness in WB. PMID:25729153
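    The reproducibility figures quoted above are intraclass correlation coefficients. One common variant, ICC(3,1) (two-way mixed effects, consistency, single measures), can be computed from a subjects-by-trials matrix; whether the study used this exact variant is an assumption here, and the test-retest times below are invented for illustration.

    ```python
    import numpy as np

    def icc_3_1(X):
        # ICC(3,1) from two-way ANOVA mean squares: (MSB - MSE) / (MSB + (k-1)*MSE)
        n, k = X.shape
        m = X.mean()
        ssb = k * np.sum((X.mean(axis=1) - m) ** 2)   # between-subjects sum of squares
        ssc = n * np.sum((X.mean(axis=0) - m) ** 2)   # between-trials sum of squares
        sse = np.sum((X - m) ** 2) - ssb - ssc        # residual sum of squares
        msb = ssb / (n - 1)
        mse = sse / ((n - 1) * (k - 1))
        return (msb - mse) / (msb + (k - 1) * mse)

    # two agility T-test trials for five players (seconds, invented data)
    trials = np.array([[15.8, 15.9], [16.4, 16.2], [17.1, 17.3],
                       [15.2, 15.1], [16.8, 16.9]])
    reliability = icc_3_1(trials)
    ```

    Values close to one, as with these consistent repeat trials, indicate that between-player differences dominate trial-to-trial noise, which is the sense in which the T-test and Yo-Yo test were judged reliable.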

  16. AGILE detection of a flare from PKS 1510-089

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Tavani, M.; Fioretti, V.; Gianotti, F.; Trifoglio, M.; Pittori, C.; Verrecchia, F.; Lucarelli, F.; Vercellone, S.; Piano, G.; Donnarumma, I.; Striani, E.; Cardillo, M.; Giuliani, A.; Mereghetti, S.; Caraveo, P.; Perotti, F.; Chen, A.; Colafrancesco, S.; Del Monte, E.; Evangelista, Y.; Feroci, M.; Lazzarotto, F.; Pacciani, L.; Soffitta, P.; Costa, E.; Lapshov, I.; Rapisarda, M.; Argan, A.; Pucella, G.; Sabatini, S.; Trois, A.; Vittorini, V.; Fuschino, F.; Galli, M.; Labanti, C.; Marisaldi, M.; Di Cocco, G.; Pellizzoni, A.; Pilia, M.; Barbiellini, G.; Vallazza, E.; Longo, F.; Morselli, A.; Picozza, P.; Prest, M.; Lipari, P.; Zanello, D.; Cattaneo, P. W.; Rappoldi, A.; Giommi, P.; Salotti, L.; Valentini, G.

    2014-08-01

    AGILE is now detecting transient gamma-ray emission above 100 MeV from a source positionally consistent with PKS 1510-089. Integrating from 2014-07-31 00:43 UT to 2014-08-02 02:15 UT, a preliminary maximum likelihood analysis yields a detection above 100 MeV positioned at Galactic coordinates (l,b) = (350.96, 40.12) +/- 0.9 (stat.) +/- 0.1 (syst.).

  17. AGILE Observations of the Gravitational-wave Event GW150914

    NASA Astrophysics Data System (ADS)

    Tavani, M.; Pittori, C.; Verrecchia, F.; Bulgarelli, A.; Giuliani, A.; Donnarumma, I.; Argan, A.; Trois, A.; Lucarelli, F.; Marisaldi, M.; Del Monte, E.; Evangelista, Y.; Fioretti, V.; Zoli, A.; Piano, G.; Munar-Adrover, P.; Antonelli, L. A.; Barbiellini, G.; Caraveo, P.; Cattaneo, P. W.; Costa, E.; Feroci, M.; Ferrari, A.; Longo, F.; Mereghetti, S.; Minervini, G.; Morselli, A.; Pacciani, L.; Pellizzoni, A.; Picozza, P.; Pilia, M.; Rappoldi, A.; Sabatini, S.; Vercellone, S.; Vittorini, V.; Giommi, P.; Colafrancesco, S.; Cardillo, M.; Galli, M.; Fuschino, F.

    2016-07-01

We report the results of an extensive search through the AGILE data for a gamma-ray counterpart to the LIGO gravitational-wave (GW) event GW150914. Currently operating in spinning mode, AGILE can cover 80% of the sky with its gamma-ray instrument more than 100 times a day. It turns out that AGILE came within a minute of the event time of observing the accessible GW150914 localization region. Interestingly, the gamma-ray detector exposed ∼65% of this region during the 100 s time intervals centered at -100 and +300 s from the event time. We determine a 2σ flux upper limit in the band 50 MeV–10 GeV, UL = 1.9 × 10⁻⁸ erg cm⁻² s⁻¹, obtained ∼300 s after the event. This is the fastest such measurement yet obtained for GW150914, and it significantly constrains the electromagnetic emission of a possible high-energy counterpart. We also carried out a search for a gamma-ray precursor and delayed emission over five timescales ranging from minutes to days: in particular, we obtained an optimal exposure during the interval -150/-30 s. In none of these observations do we detect a significant signal associated with GW150914, nor do we detect the weak transient source reported by Fermi-GBM 0.4 s after the event time. However, even though a gamma-ray counterpart of the GW150914 event was not detected, the prospects for future AGILE observations of GW sources are decidedly promising.

  18. Iron and steel industry process model

    SciTech Connect

    Sparrow, F.T.; Pilati, D.; Dougherty, T.; McBreen, E.; Juang, L.L.

    1980-01-01

    The iron and steel industry process model depicts expected energy-consumption characteristics of the iron and steel industry and ancillary industries for the next 25 years by means of a process model of the major steps in steelmaking, from ore mining and scrap recycling to the final finishing of carbon, alloy, and stainless steel into steel products such as structural steel, slabs, plates, tubes, and bars. Two plant types are modeled: fully integrated mills and mini-mills. User-determined inputs into the model are as follows: projected energy and materials prices; projected costs of capacity expansion and replacement; energy-conserving options, both operating modes and investments; the internal rate of return required on investment; and projected demand for finished steel. Nominal input choices in the model for the inputs listed above are as follows: National Academy of Sciences Committee on Nuclear and Alternative Energy Systems Demand Panel nominal energy-price projections for oil, gas, distillates, residuals, and electricity and 1975 actual prices for materials; actual 1975 costs; new technologies added; 15% after taxes; and 1975 actual demand with 1.5%/y growth. The model reproduces the base-year (1975) actual performance of the industry; then, given the above nominal input choices, it projects modes of operation and capacity expansion that minimize the cost of meeting the given final demands for each of 5 years, each year being the midpoint of a 5-year interval. The output of the model includes the following: total energy use and intensity (Btu/ton) by type, by process, and by time period; energy conservation options chosen; utilization rates for existing capacity; capital-investment decisions for capacity expansion.
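The cost-minimizing operation and capacity choices described above can be illustrated in miniature. The sketch below is a hypothetical single-period, two-route version (the real model optimizes across many process steps and five time periods); the route names, costs, and capacities are invented for illustration:

```python
def plan_production(demand_tons, routes):
    """Least-cost dispatch across production routes. Cheapest-first greedy is
    optimal here because costs are constant per ton and the only constraints
    are route capacities."""
    plan, remaining = {}, demand_tons
    for name, cost_per_ton, capacity in sorted(routes, key=lambda r: r[1]):
        take = min(capacity, remaining)
        plan[name] = take
        remaining -= take
    if remaining > 1e-9:
        raise ValueError("insufficient capacity for demand")
    return plan

# Hypothetical costs ($/ton, incl. energy) and capacities (Mtons/yr)
routes = [("integrated_BOF", 310.0, 90.0), ("mini_mill_EAF", 260.0, 40.0)]
plan = plan_production(100.0, routes)
print(plan)  # mini-mill runs at capacity; the integrated mill covers the rest
```

With demand constraints spanning multiple years and investment decisions, the full problem becomes a linear program rather than a greedy dispatch, but the objective (minimize cost of meeting final demand) is the same.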

  19. Development of a comprehensive weld process model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.; Paul, A.

    1997-05-01

This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area with that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model welding processes. The primary drawback of most existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, that model does not account for multipass welds, microstructural evolution, distortion, or residual stresses. Additionally, it requires large amounts of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above.

  20. Thermal modeling of an epoxy encapsulation process

    SciTech Connect

    Baca, R.G.; Schutt, J.A.

    1991-01-01

The encapsulation of components is a widely used process at Sandia National Laboratories for packaging components to withstand structural loads. Epoxy encapsulants are also used for their outstanding dielectric strength characteristics. The production of high voltage assemblies requires the encapsulation of ceramic and electrical components (such as transformers). Separation of the encapsulant from internal contact surfaces, or voids within the encapsulant itself in regions near the mold base, has caused high voltage breakdown failures during production testing. In order to understand the failure mechanisms, a methodology was developed to predict both the thermal response and gel front progression of the epoxy during the encapsulation process. A thermal model constructed with PATRAN Plus (1) and solved with the P/THERMAL (2) analysis system was used to predict the thermal response of the encapsulant. This paper discusses the incorporation of an Arrhenius kinetics model into Q/TRAN (2) to model the complex volumetric heat generation of the epoxy during the encapsulation process. As the epoxy begins to cure, it generates heat and shrinks. The total cure time of the encapsulant (transformation from a viscous liquid to a solid) depends on both the initial temperature and the entire temperature history. Because the rate of cure is temperature dependent, the cure rate accelerates with a temperature increase and, likewise, is quenched if the temperature is reduced. The temperature and conversion predictions compared well against experimental data. The thermal simulation results were used to modify the temperature cure process of the encapsulant and improve production yields.
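The temperature dependence described above (cure accelerating when heated, quenching when cooled) is what an Arrhenius kinetics model captures. A minimal sketch with an nth-order cure law follows; the rate parameters are hypothetical, not the fitted values used in the Sandia analysis:

```python
import math

# Illustrative nth-order Arrhenius cure model:
#   d(alpha)/dt = A * exp(-Ea / (R * T)) * (1 - alpha)**n
# A, Ea, n are hypothetical; R is the gas constant (J/mol/K).
A, Ea, R, n = 1.0e5, 60_000.0, 8.314, 1.0

def cure_fraction(T_kelvin, t_end_s, dt=1.0):
    """Degree of cure alpha after t_end_s seconds at constant temperature,
    integrated by forward Euler."""
    k = A * math.exp(-Ea / (R * T_kelvin))
    alpha, t = 0.0, 0.0
    while t < t_end_s:
        alpha += dt * k * (1.0 - alpha) ** n
        alpha = min(alpha, 1.0)
        t += dt
    return alpha

# Higher temperature accelerates cure; lower temperature quenches it
print(cure_fraction(400.0, 3600.0), cure_fraction(350.0, 3600.0))
```

In the full model the exothermic heat of reaction feeds back into the temperature field, so the thermal solver and the kinetics model must be integrated together rather than at a prescribed constant temperature.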

  1. Dynamic displays of chemical process flowsheet models

    SciTech Connect

    Aull, J.E.

    1996-11-01

This paper describes the algorithms used in constructing dynamic graphical displays of a process flowsheet. Movies are created which portray changes in the process over time using animation in the flowsheet, such as individual streams that take on a color keyed to the current flow rate, tank levels that visibly rise and fall, and "gauges" that move to display parameter values. Movies of this type can be a valuable tool for visualizing, analyzing, and communicating the behavior of a process model. This paper describes the algorithms used in constructing displays of this kind for dynamic models using the SPEEDUP™ modeling package and the GMS™ graphics package. It also tells how data is exported from the SPEEDUP™ package to GMS™ and describes how a user environment for running movies and editing flowsheets is set up. The algorithms are general enough to be applied to other processes and graphics packages. In fact, the techniques described here can be used to create movies of any time-dependent data.

  2. Agile Science Operations: A New Approach for Primitive Exploration Bodies

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.; Thompson, David R.; Castillo-Rogez, Julie C.; Doyle, Richard; Estlin, Tara; Mclaren, David

    2012-01-01

Primitive body exploration missions such as potential Comet Surface Sample Return or Trojan Tour and Rendezvous would challenge traditional operations practices. Earth-based observations would provide only basic understanding before arrival, and many science goals would be defined during the initial rendezvous. It could be necessary to revise trajectories and observation plans to quickly characterize the target for safe, effective observations. Detection of outgassing activity and monitoring of comet surface activity are even more time constrained, with events occurring faster than round-trip light time. "Agile science operations" address these challenges with contingency plans that recognize the intrinsic uncertainty in the operating environment and science objectives. Planning for multiple alternatives can significantly reduce the time required to repair and validate spacecraft command sequences. When appropriate, time-critical decisions can be automated and shifted to the spacecraft for immediate access to instrument data. Mirrored planning systems on both sides of the light-time gap permit transfer of authority back and forth as needed. We survey relevant science objectives, identifying time bottlenecks and the techniques that could be used to speed missions' reaction to new science data. Finally, we discuss the results of a trade study simulating agile observations during flyby and comet rendezvous scenarios. These experiments quantify instrument coverage of key surface features as a function of planning turnaround time. Careful application of agile operations techniques can play a significant role in realizing the Decadal Survey plan for primitive body exploration.

  3. Clustering-based urbanisation to improve enterprise information systems agility

    NASA Astrophysics Data System (ADS)

    Imache, Rabah; Izza, Said; Ahmed-Nacer, Mohamed

    2015-11-01

Enterprises face daily pressure to demonstrate their ability to adapt quickly to the unpredictable changes of their dynamic environment in technological, social, legislative, competitive and global terms. Thus, to secure its place in this demanding context, an enterprise must remain agile and must ensure its sustainability through continuous improvement of its information system (IS). The agility of enterprise information systems (EISs) can therefore be considered today a primary objective of any enterprise. One way of achieving this objective is the urbanisation of the EIS in the context of continuous improvement, to make it a real asset serving enterprise strategy. This paper investigates the benefits of EIS urbanisation based on clustering techniques as a driver for producing and/or improving agility, to help managers and IT departments continuously improve the performance of the enterprise and make appropriate decisions within the scope of the enterprise's objectives and strategy. This approach is applied to the urbanisation of a tour operator's EIS.

  4. Observing peculiar γ-ray pulsars with AGILE

    NASA Astrophysics Data System (ADS)

    Pilia, M.; Pellizzoni, A.

    2011-08-01

The AGILE γ-ray satellite provides large sky exposure levels (≥10⁹ cm² s per year on the Galactic Plane) with sensitivity peaking at E ~ 100 MeV, where the bulk of pulsar energy output is typically released. Its ~1 μs absolute time tagging capability makes it perfectly suited for the study of γ-ray pulsars. AGILE collected a large number of γ-ray photons from EGRET pulsars (≥40,000 pulsed counts for Vela) in two years of observations, unveiling new interesting features at the sub-millisecond level in the pulsars' high-energy light-curves, γ-ray emission from pulsar glitches, and pulsar wind nebulae. AGILE detected about 20 nearby and energetic pulsars with good confidence through timing and/or spatial analysis. Among the newcomers we find pulsars with very high rotational energy losses, such as the remarkable PSR B1509-58, with a magnetic field in excess of 10¹³ Gauss, and PSR J2229+6114, providing a reliable identification for the previously unidentified EGRET source 3EG J2227+6122. Moreover, the powerful millisecond pulsar B1821-24, in the globular cluster M28, is detected during a fraction of the observations.

  5. Effect of fence height on joint angles of agility dogs.

    PubMed

    Birch, Emily; Leśniak, Kirsty

    2013-12-01

The Kennel Club (KC) and United Kingdom Agility (UKA) govern major dog agility competitions in the UK. Dogs are categorised into different jump heights depending on their height at the withers, with fence heights ranging from 300 to 650 mm for both organisations. Dogs fall into one of three height categories when competing under KC rules and one of four height categories under UKA rules. The aim of this study was to investigate the influence of an additional height category for agility dogs measuring over 430 mm at the withers. Jump heights were selected that related to the percentage of body height that dogs of 430 mm (7% lower) and 431 mm (51% higher) height at the withers would be encouraged to jump under UKA regulations without the addition of their fourth ('standard height') category. Joint angles were determined from anatomical markers placed on the forelimb and hind limb joints, and at six points along the vertebral column. As fence height increased, flexion of the scapulohumeral joint increased significantly for both the take-off and bascule (arc) phases of the jump. The increase in flexion as a consequence of the increase in fence height is likely to result in intensified stretching of the biceps brachii and supraspinatus muscles. In addition, increasing fence height resulted in an increase in the sacroiliac joint angle during take-off. PMID:24360736

  6. Modeling and simulation of plasma processing equipment

    NASA Astrophysics Data System (ADS)

    Kim, Heon Chang

Currently plasma processing technology is utilized in a wide range of applications including advanced Integrated Circuit (IC) fabrication. Traditionally, plasma processing equipment has been empirically designed and optimized at great expense of development time and cost. This research proposes the development of a first-principles based, multidimensional plasma process simulator with the aim of enhancing the equipment design procedure. The proposed simulator accounts for nonlinear interactions among various plasma chemistry and physics, neutral chemistry and transport, and dust transport phenomena. A three-moment modeling approach is employed that shows good predictive capabilities at reasonable computational expense. For numerical efficiency, various versions of explicit and implicit Essentially Non-Oscillatory (ENO) algorithms are employed. For the rapid evaluation of time-periodic steady-state solutions, a feedback control approach is employed. Two-dimensional simulation results of capacitively coupled rf plasmas show that ion bombardment uniformity can be improved through simulation-based design of the plasma process. Through self-consistent simulations of an rf triode, it is also shown that effects of secondary rf voltage and frequency on ion bombardment energy can be accurately captured. These results prove that scaling relations among important process variables can be identified through the three-moment modeling and simulation approach. Through coupling of the plasma model with a neutral chemistry and transport model, spatiotemporal distributions of both charged and uncharged species, including metastables, are predicted for an oxygen plasma. Furthermore, simulation results also verify the existence of a double layer in this electronegative plasma. Through Lagrangian simulation of dust in a plasma reactor, it is shown that small particles accumulate near the center and the radial sheath boundary, depending on their initial positions, while large

  7. Solidification modeling of continuous casting process

    NASA Astrophysics Data System (ADS)

    Lerner, V. S.; Lerner, Y. S.

    2005-04-01

    The aim of the present work was to utilize a new systematic mathematical-informational approach based on informational macrodynamics (IMD) to model and optimize the casting process, taking as an example horizontal continuous casting (HCC). The IMD model takes into account the interrelated thermal, diffusion, kinetic, hydrodynamic, and mechanical effects that are essential for the given casting process. The optimum technological process parameters are determined by the simultaneous solution of problems of identification and optimal control. The control functions of the synthesized optimal model are found from the extremum of the entropy functional having a particular sense of an integrated assessment of the continuous cast bar physicochemical properties. For the physical system considered, the IMD structures of the optimal model are connected with controllable equations of nonequilibrium thermodynamics. This approach was applied to the HCC of ductile iron, and the results were compared with experimental data and numerical simulation. Good agreement was confirmed between the predicted and practical data, as well as between new and traditional methods.

  8. Computer vision challenges and technologies for agile manufacturing

    NASA Astrophysics Data System (ADS)

    Molley, Perry A.

    1996-02-01

    applicable to commercial production processes and applications. Computer vision will play a critical role in the new agile production environment for automation of processes such as inspection, assembly, welding, material dispensing and other process control tasks. Although there are many academic and commercial solutions that have been developed, none have had widespread adoption considering the huge potential number of applications that could benefit from this technology. The reason for this slow adoption is that the advantages of computer vision for automation can be a double-edged sword. The benefits can be lost if the vision system requires an inordinate amount of time for reprogramming by a skilled operator to account for different parts, changes in lighting conditions, background clutter, changes in optics, etc. Commercially available solutions typically require an operator to manually program the vision system with features used for the recognition. In a recent survey, we asked a number of commercial manufacturers and machine vision companies the question, 'What prevents machine vision systems from being more useful in factories?' The number one (and unanimous) response was that vision systems require too much skill to set up and program to be cost effective.

  9. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ >= Cq >= E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With code-words of sufficiently long length, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.
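The statistical complexity Cμ mentioned above is the Shannon entropy of the ɛ-machine's stationary state distribution. As a toy illustration (the two-state Even Process is a standard textbook example, not one taken from this abstract), a minimal sketch:

```python
import math

def stationary(P):
    """Stationary distribution of a 2-state Markov chain, pi = pi P,
    solved in closed form from the off-diagonal transition probabilities."""
    p01, p10 = P[0][1], P[1][0]
    pi0 = p10 / (p01 + p10)
    return [pi0, 1.0 - pi0]

def shannon_entropy(dist):
    """Entropy in bits, H = -sum p log2 p."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical epsilon-machine for the Even Process: state A emits 0 (stay)
# or 1 (go to B) with equal probability; B emits 1 and returns to A.
P = [[0.5, 0.5], [1.0, 0.0]]
C_mu = shannon_entropy(stationary(P))
print(round(C_mu, 3))  # → 0.918
```

For this process Cμ ≈ 0.918 bits; the point of the abstract is that a quantum representation (q-machine) of the same process can achieve Cq below this classical value while still exceeding the excess entropy E.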

  10. Digraph reliability model processing advances and applications

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.; Patterson-Hine, F. A.

    1993-01-01

    This paper describes a new algorithm, called SourceDoubls, which efficiently solves for singletons and doubletons of a digraph reliability model. Compared with previous methods, the SourceDoubls algorithm provides up to a two order of magnitude reduction in the amount of time required to solve large digraph models. This significant increase in model solution speed allows complex digraphs containing thousands of nodes to be used as knowledge bases for real time automated monitoring and diagnosis applications. Currently, an application to provide monitoring and diagnosis of the Space Station Freedom Data Management System is under development at NASA/Ames Research Center and NASA/Johnson Space Center. This paper contains an overview of this system and provides details of how it will use digraph models processed by the SourceDoubls algorithm to accomplish its task.

  11. Qualitative simulation for process modeling and control

    NASA Technical Reports Server (NTRS)

    Dalle Molle, D. T.; Edgar, T. F.

    1989-01-01

    A qualitative model is developed for a first-order system with a proportional-integral controller without precise knowledge of the process or controller parameters. Simulation of the qualitative model yields all of the solutions to the system equations. In developing the qualitative model, a necessary condition for the occurrence of oscillatory behavior is identified. Initializations that cannot exhibit oscillatory behavior produce a finite set of behaviors. When the phase-space behavior of the oscillatory behavior is properly constrained, these initializations produce an infinite but comprehensible set of asymptotically stable behaviors. While the predictions include all possible behaviors of the real system, a class of spurious behaviors has been identified. When limited numerical information is included in the model, the number of predictions is significantly reduced.

  12. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations, and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop, utilizing real-time process data from the sensor groups to make production decisions (stop production and repair the machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven elements (see Fig. 1) critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.

  13. Multiscale numerical modeling of levee breach processes

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Akkerman, I.; Bazilevs, Y.

    2010-12-01

    One of the dominant failure modes of levees during flood and storm surge events is erosion-based breach formation due to high velocity flow over the back (land-side) slope. Modeling the breaching process numerically is challenging due to both physical and geometric complexity that develops and evolves during the overtopping event. The surface water flows are aerated and sediment-laden mixtures in the supercritical and turbulent regimes. The air/water free surface may undergo perturbations on the same order as the depth or even topological change (breaking). Likewise the soil/fluid interface is characterized by evolving headcuts, which are essentially moving discontinuities in the soil surface elevation. The most widely used models of levee breaching are nevertheless based on depth-integrated models of flow, sediment transport, and bed morphology. In this work our objective is to explore models with less restrictive modeling assumptions, which have become computationally tractable due to advances in both numerical methods and high-performance computing hardware. In particular, we present formulations of fully three-dimensional flow, transport, and morphological evolution for overtopping and breaching processes and apply recently developed finite element and level set methods to solve the governing equations for relevant test problems.

  14. Glacier lake outburst floods - modelling process chains

    NASA Astrophysics Data System (ADS)

    Schaub, Yvonne; Huggel, Christian; Haeberli, Wilfried

    2013-04-01

    New lakes are forming in high-mountain areas all over the world due to glacier recession. Often they will be located below steep, destabilized flanks and are therefore exposed to impacts from rock-/ice-avalanches. Several events worldwide are known, where an outburst flood has been triggered by such an impact. In regions such as in the European Alps or in the Cordillera Blanca in Peru, where valley bottoms are densely populated, these far-travelling, high-magnitude events can result in major disasters. For appropriate integral risk management it is crucial to gain knowledge on how the processes (rock-/ice-avalanches - impact waves in lake - impact on dam - outburst flood) interact and how the hazard potential related to corresponding process chains can be assessed. Research in natural hazards so far has mainly concentrated on describing, understanding, modeling or assessing single hazardous processes. Some of the above mentioned individual processes are quite well understood in their physical behavior and some of the process interfaces have also been investigated in detail. Multi-hazard assessments of the entire process chain, however, have only recently become subjects of investigations. Our study aims at closing this gap and providing suggestions on how to assess the hazard potential of the entire process chain in order to generate hazard maps and support risk assessments. We analyzed different types of models (empirical, analytical, physically based) for each process regarding their suitability for application in hazard assessments of the entire process chain based on literature. Results show that for rock-/ice-avalanches, dam breach and outburst floods, only numerical, physically based models are able to provide the required information, whereas the impact wave can be estimated by means of physically based or empirical assessments. We demonstrate how the findings could be applied with the help of a case study of a recent glacier lake outburst event at Laguna

  15. Expert Models and Modeling Processes Associated with a Computer-Modeling Tool

    ERIC Educational Resources Information Center

    Zhang, BaoHui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-01-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using "think aloud" technique…

  16. Towards a Framework for Using Agile Approaches in Global Software Development

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Ali Babar, Muhammad; Verner, June

As agile methods and Global Software Development (GSD) become increasingly popular, GSD project managers have been exploring the viability of using agile approaches in their development environments. Despite the expected benefits of using an agile approach with a GSD project, the mechanisms for combining the two approaches are not clearly understood. To address this challenge, we propose a conceptual framework based on the research literature. This framework is expected to aid a project manager in deciding which agile strategies are effective for a particular GSD project, taking into account project context. We use an industry-based case study to explore the components of our conceptual framework. Our case study is planned and conducted according to specific published case study guidelines. We identify the agile practices and agile supporting practices used by a GSD project manager in our case study and conclude with future research directions.

  17. Modeling of the vacuum plasma spray process

    SciTech Connect

Varacalle, D.J. Jr.; Neiser, R.A.; Smith, M.F.

    1992-01-01

    Experimental and analytical studies have been conducted to investigate gas, particle, and coating dynamics in the vacuum plasma spray (VPS) process for a tungsten powder. VPS coatings were examined metallographically and the results compared with the model's predictions. The plasma was numerically modeled from the cathode tip to the spray distance in the free plume for the experimental conditions of this study. This information was then used as boundary conditions to solve the particle dynamics. The predicted temperature and velocity of the powder particles at standoff were then used as initial conditions for a coating dynamics code. The code predicts the coating morphology for the specific process parameters. The predicted characteristics exhibit good correlation with the observed coating properties.

  18. High-Speed Time-Series CCD Photometry with Agile

    NASA Astrophysics Data System (ADS)

    Mukadam, Anjum S.; Owen, R.; Mannery, E.; MacDonald, N.; Williams, B.; Stauffer, F.; Miller, C.

    2011-12-01

We have assembled a high-speed time-series CCD photometer named Agile for the 3.5 m telescope at Apache Point Observatory, based on the design of a photometer called Argos at McDonald Observatory. Instead of a mechanical shutter, we use the frame-transfer operation of the CCD to end an exposure and initiate the subsequent new exposure. The frame-transfer operation is triggered by the negative edge of a GPS pulse; the instrument timing is controlled directly by hardware, without any software intervention or delays. This is the central pillar in the design of Argos that we have also used in Agile; this feature makes the accuracy of instrument timing better than a millisecond. Agile is based on a Princeton Instruments Acton VersArray camera with a frame-transfer CCD, which has 1K × 1K active pixels, each of size 13 μm × 13 μm. Using a focal reducer at the Nasmyth focus of the 3.5 m telescope at Apache Point Observatory, we obtain a field of view of 2.2 × 2.2 arcmin² with an unbinned plate scale of 0.13″ pixel⁻¹. The CCD is back-illuminated and thinned for improved blue sensitivity and provides a quantum efficiency ≥80% in the wavelength range of 4500-7500 Å. The unbinned full-frame readout time can be as fast as 1.1 s; this is achieved using a low-noise amplifier operating at 1 MHz with an average read noise of the order of 6.6 e⁻ rms. At the slow read rate of 100 kHz, to be used for exposure times longer than a few seconds, we determine an average read noise of the order of 3.7 e⁻ rms. Agile is optimized to observe variability at short timescales, from one-third of a second to several hundred seconds. The variable astronomical sources routinely observed with Agile include pulsating white dwarfs, cataclysmic variables, flare stars, planetary transits, and planetary satellite occultations.
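The quoted field of view follows directly from the detector format and plate scale given above; a quick arithmetic cross-check:

```python
# Field-of-view check from the figures in the abstract:
# 1024 active pixels per side at an unbinned plate scale of 0.13 arcsec/pixel.
pixels_per_side = 1024
plate_scale_arcsec = 0.13
fov_arcmin = pixels_per_side * plate_scale_arcsec / 60.0  # arcsec -> arcmin
print(round(fov_arcmin, 1))  # → 2.2, matching the quoted 2.2 x 2.2 arcmin^2
```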

  19. Coupling environmental models and geospatial data processing

    NASA Astrophysics Data System (ADS)

    Brandmeyer, Jo Ellen

    2000-10-01

    This research investigated geospatial functions for solving environmental problems from the perspective of the environmental modeler. Its purpose is to better understand the different approaches to coupling complex models and geospatial data processing, plus the implications for the coupled system. To this end, various coupling methodologies were systematically explored using a geographic information system (GIS) and an emissions processor (SMOKE) for air quality models (AQMs). SMOKE converts an emissions inventory into the format required by an AQM. A GIS creates a file describing the spatial distribution of emissions among the cells in a modeling domain. To demonstrate advantages of a coupled GIS-environmental model system, two methods of spatially distributing on-road mobile emissions to cells were examined. The existing method calculates emissions for each road class, but distributes emissions to the cells using population density. For the new method, a GIS builds road density by class and then distributes the emissions using road density. Comparing these methods reveals a significantly different spatial pattern of emissions. Next, various model-coupling methodologies were analyzed, revealing numerous coupling approaches, some of which were categorized in the literature. Critiquing these categorizations while comparing them with documented implementations led to the development of a new coupling hierarchy. The properties of each hierarchical level are discussed with the advantages and limitations of each design. To successfully couple models, the spatial and temporal scales of all models in the coupled system and the spatiotemporal extents of the data must be reconciled. Finally, a case study demonstrated methodologies for coupling SMOKE and a GIS. One methodology required a new approach utilizing dynamically linked libraries. Consequently, emissions were processed using SMOKE from a GIS. Also, a new method of converting data from netCDF files into a database

  20. Modeling biomedical experimental processes with OBI

    PubMed Central

    2010-01-01

    Background Experimental descriptions are typically stored as free text without using standardized terminology, creating challenges in comparison, reproduction and analysis. These difficulties impose limitations on data exchange and information retrieval. Results The Ontology for Biomedical Investigations (OBI), developed as a global, cross-community effort, provides a resource that represents biomedical investigations in an explicit and integrative framework. Here we detail three real-world applications of OBI, provide detailed modeling information and explain how to use OBI. Conclusion We demonstrate how OBI can be applied to different biomedical investigations to both facilitate interpretation of the experimental process and increase the computational processing and integration within the Semantic Web. The logical definitions of the entities involved allow computers to unambiguously understand and integrate different biological experimental processes and their relevant components. Availability OBI is available at http://purl.obolibrary.org/obo/obi/2009-11-02/obi.owl PMID:20626927

  1. Software Model Of Software-Development Process

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Necessary to include easily identifiable and more-readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.

  2. Near Field Environment Process Model Report

    SciTech Connect

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers as well as potentially changing the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  3. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    PubMed

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  4. Modeling hydrologic processes at the residential scale

    NASA Astrophysics Data System (ADS)

    Xiao, Q.; McPherson, G.; Simpson, J.; Ustin, S.

    2003-12-01

    In California, urbanization has led to polluted runoff, flooding during winter, and water shortages during summer. There is growing interest in application of microscale hydrologic solutions that eliminate storm runoff and conserve water at the source. In this study, a physically-based numerical model was developed to better understand hydrologic processes at the residential scale and the interaction of these processes among different Best Management Practices (BMPs). This model calculates all in-flow and out-flow using an hourly interval over a full year or for specific storm events. Water enters the system via precipitation and irrigation and leaves the system via evapotranspiration, surface and subsurface runoff, and from percolation to groundwater. The model was applied to two single-family residential parcels in Los Angeles. Two years of data collected from the control and treatment sites were used to calibrate and validate the model. More than 97% of storm runoff to the street was eliminated with installation of low-cost BMPs (i.e., rain gutters that direct roof runoff to a lawn retention basin and a driveway interceptor that directs runoff to a drywell in the lawn retention basin). Evaluated individually, the driveway interceptor was the most effective BMP for storm runoff reduction (65%), followed by the rain gutter installation (28%), and lawn converted to retention basin (12%). Installation of an 11 m3 cistern did not substantially reduce runoff, but did provide storage for 9% of annual irrigation demand. Simulated landscape irrigation demand was reduced 53% by increasing efficiency through use of a drip irrigation system for shrubs, and adjusting monthly application rates based on evapotranspirational water demand. The model showed that infiltration and surface runoff processes were particularly sensitive to the soil's physical properties and its effective depth. If the existing loam soil were replaced by clay soil, annual runoff discharge to the street
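
The hourly in-flow/out-flow bookkeeping described above can be sketched with a single soil store. Everything below (store capacity, storm depths, ET rate) is an invented illustration, not the study's actual parameterization, which resolves soil physics, BMPs, and percolation in far more detail:

```python
# Single-store hourly water balance (all depths in mm; parameters invented).
def simulate_water_balance(precip, irrigation, et, soil_capacity_mm):
    """Route hourly inputs through one soil store; overflow becomes runoff."""
    storage, runoff = 0.0, 0.0
    for p, i, e in zip(precip, irrigation, et):
        storage = max(storage + p + i - e, 0.0)  # cannot lose more than stored
        if storage > soil_capacity_mm:           # saturation excess
            runoff += storage - soil_capacity_mm
            storage = soil_capacity_mm
    return storage, runoff

# A 6 h storm of 10 mm/h onto a 30 mm store, no irrigation, 0.1 mm/h ET:
final_storage, total_runoff = simulate_water_balance(
    [10.0] * 6, [0.0] * 6, [0.1] * 6, 30.0)
print(final_storage, total_runoff)
```

Overflow beyond the store's capacity is counted as surface runoff, which is the quantity the BMPs in the study intercept at the source.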

  5. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  6. The Agile Approach with Doctoral Dissertation Supervision

    ERIC Educational Resources Information Center

    Tengberg, Lars Göran Wallgren

    2015-01-01

    Several research findings conclude that many doctoral students fail to complete their studies within the allowable time frame, in part because of problems related to the research and supervision process. Surveys show that most doctoral students are generally satisfied with their dissertation supervision. However, these surveys also reveal some…

  7. Thermal Modeling of A Friction Bonding Process

    SciTech Connect

    John Dixon; Douglas Burkes; Pavel Medvedev

    2007-10-01

    A COMSOL model capable of predicting temperature evolution during nuclear fuel fabrication is being developed at the Idaho National Laboratory (INL). Fuel plates are fabricated by friction bonding (FB) uranium-molybdenum (U-Mo) alloy foils positioned between two aluminum plates. The ability to predict temperature distribution during fabrication is imperative to ensure good quality bonding without inducing an undesirable chemical reaction between U-Mo and aluminum. A three-dimensional heat transfer model of the FB process implementing shallow pin penetration for cladding monolithic nuclear fuel foils is presented. Temperature distribution during the FB process as a function of fabrication parameters such as weld speed, tool load, and tool rotational frequency are predicted. Model assumptions, settings, and equations are described in relation to standard friction stir welding. Current experimental design for validation and calibration of the model is also demonstrated. Resulting experimental data reveal the accuracy in describing asymmetrical temperature distributions about the tool face. Temperature of the bonded plate drops beneath the pin and is higher on the advancing side than the retreating side of the tool.

  8. Modeling Dynamic Regulatory Processes in Stroke.

    SciTech Connect

    McDermott, Jason E.; Jarman, Kenneth D.; Taylor, Ronald C.; Lancaster, Mary J.; Shankaran, Harish; Vartanian, Keri B.; Stevens, S.L.; Stenzel-Poore, Mary; Sanfilippo, Antonio P.

    2012-10-11

    The ability to examine in silico the behavior of biological systems can greatly accelerate the pace of discovery in disease pathologies, such as stroke, where in vivo experimentation is lengthy and costly. In this paper we describe an approach to in silico examination of blood genomic responses to neuroprotective agents and subsequent stroke through the development of dynamic models of the regulatory processes observed in the experimental gene expression data. First, we identified functional gene clusters from these data. Next, we derived ordinary differential equations (ODEs) relating regulators and functional clusters from the data. These ODEs were used to develop dynamic models that simulate the expression of regulated functional clusters using system dynamics as the modeling paradigm. The dynamic model has the considerable advantage of only requiring an initial starting state, and does not require measurement of regulatory influences at each time point in order to make accurate predictions. The manipulation of input model parameters, such as changing the magnitude of gene expression, made it possible to assess the behavior of the networks through time under varying conditions. We report that an optimized dynamic model can provide accurate predictions of overall system behavior under several different preconditioning paradigms.
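
A toy version of the ODE-based approach described above: one regulator driving one downstream functional cluster, integrated forward from an initial state only, just as the dynamic model requires. The rate constants are hypothetical placeholders, not the fitted values from the study:

```python
# One regulator R decays while activating a downstream functional cluster C:
#   dR/dt = -k_deg * R,   dC/dt = k_act * R - k_deg * C
def simulate_expression(r0, c0, k_act=0.8, k_deg=0.3, t_end=10.0, dt=0.01):
    """Forward-Euler integration from the initial state only."""
    r, c = r0, c0
    for _ in range(int(t_end / dt)):
        # simultaneous update: both derivatives use the current (r, c)
        r, c = r + (-k_deg * r) * dt, c + (k_act * r - k_deg * c) * dt
    return r, c

r_final, c_final = simulate_expression(r0=1.0, c0=0.0)
print(round(r_final, 3), round(c_final, 3))
```

Changing `r0` or `k_act` and re-running corresponds to the abstract's manipulation of input parameters, e.g. the magnitude of gene expression, to probe network behavior under varying conditions.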

  9. Computational models of natural language processing

    SciTech Connect

    Bara, B.G.; Guida, G.

    1984-01-01

    The main concern in this work is the illustration of models for natural language processing, and the discussion of their role in the development of computational studies of language. Topics covered include the following: competence and performance in the design of natural language systems; planning and understanding speech acts by interpersonal games; a framework for integrating syntax and semantics; knowledge representation and natural language: extending the expressive power of proposition nodes; viewing parsing as word sense discrimination: a connectionist approach; a propositional language for text representation; from topic and focus of a sentence to linking in a text; language generation by computer; understanding the Chinese language; semantic primitives or meaning postulates: mental models of propositional representations; narrative complexity based on summarization algorithms; using focus to constrain language generation; and towards an integral model of language competence.

  10. Relationship Between Reactive Agility and Change of Direction Speed in Amateur Soccer Players.

    PubMed

    Matlák, János; Tihanyi, József; Rácz, Levente

    2016-06-01

    Matlák, J, Tihanyi, J, and Rácz, L. Relationship between reactive agility and change of direction speed in amateur soccer players. J Strength Cond Res 30(6): 1547-1552, 2016-The aim of the study was to assess the relationship between reactive agility and change of direction speed (CODS) among amateur soccer players using running tests with four directional changes. Sixteen amateur soccer players (24.1 ± 3.3 years; 72.4 ± 7.3 kg; 178.7 ± 6 cm) completed CODS and reactive agility tests with four changes of direction using the SpeedCourt™ system (Globalspeed GmbH, Hemsbach, Germany). Countermovement jump (CMJ) height and maximal foot tapping count (completed in 3 seconds) were also measured with the same device. In the reactive agility test, participants had to react to a series of light stimuli projected onto a screen. Total time was shorter in the CODS test than in the reactive agility test (p < 0.001). Nonsignificant correlations were found among variables measured in the CODS, reactive agility, and CMJ tests. Low common variance (r = 0.03-0.18) was found between CODS and reactive agility test variables. The results of this study underscore the importance of cognitive factors in reactive agility performance and suggest that specific methods may be required for training and testing reactive agility and CODS. PMID:26562713
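
The reported "common variance" between two tests is the squared Pearson correlation (r^2) of the paired scores. A minimal sketch with invented times, not the study's data:

```python
# Pearson correlation and shared variance (r^2) for paired test scores.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

cods = [5.1, 5.4, 5.0, 5.6, 5.3]     # hypothetical CODS total times (s)
agility = [6.3, 6.1, 6.2, 6.2, 6.4]  # hypothetical reactive-agility times (s)
r = pearson_r(cods, agility)
print(round(r, 2), round(r * r, 2))  # weak r -> little shared variance
```

A low r^2, like the 0.03-0.18 range reported, means performance on one test explains little of the variation on the other, which is the basis for treating CODS and reactive agility as distinct qualities.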

  11. Team-based work and work system balance in the context of agile manufacturing.

    PubMed

    Yauch, Charlene A

    2007-01-01

    Manufacturing agility is the ability to prosper in an environment characterized by constant and unpredictable change. The purpose of this paper is to analyze the team attributes necessary to facilitate agile manufacturing and, using Balance Theory as a framework, to evaluate the potential positive and negative impacts of these attributes, which could alter the balance of work system elements and the resulting "stress load" experienced by persons working on agile teams. Teams operating within the context of agile manufacturing are characterized as multifunctional, dynamic, cooperative, and virtual. A review of the literature relevant to each of these attributes is provided, as well as suggestions for future research. PMID:16631101

  12. How rolling forecasting facilitates dynamic, agile planning.

    PubMed

    Miller, Debra; Allen, Michael; Schnittger, Stephanie; Hackman, Theresa

    2013-11-01

    Rolling forecasting may be used to replace or supplement the annual budget process. The rolling forecast typically builds on the organization's strategic financial plan, focusing on the first three years of plan projections and comparing the strategic financial plan assumptions with the organization's expected trajectory. Leaders can then identify and respond to gaps between the rolling forecast and the strategic financial plan on an ongoing basis. PMID:24340653

  13. MODELING PAVEMENT DETERIORATION PROCESSES BY POISSON HIDDEN MARKOV MODELS

    NASA Astrophysics Data System (ADS)

    Nam, Le Thanh; Kaito, Kiyoyuki; Kobayashi, Kiyoshi; Okizuka, Ryosuke

    In pavement management, it is important to estimate lifecycle cost, which is composed of the expenses for repairing local damages, including potholes, and for repairing and rehabilitating the surface and base layers of pavements, including overlays. In this study, a model is produced under the assumption that the deterioration process of pavement is a complex one that includes local damages, which occur frequently, and the deterioration of the surface and base layers of pavement, which progresses slowly. The variation in pavement soundness is expressed by a Markov deterioration model, and a Poisson hidden Markov deterioration model is formulated in which the frequency of local damage depends on the distribution of pavement soundness. In addition, the authors suggest a model estimation method using the Markov Chain Monte Carlo (MCMC) method and attempt to demonstrate the applicability of the proposed Poisson hidden Markov deterioration model by studying concrete application cases.
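
A minimal generative sketch of a Poisson hidden Markov deterioration model: the hidden soundness state degrades via a Markov chain, and the count of local damages per period is Poisson with a state-dependent rate. All transition probabilities and rates below are illustrative, not calibrated values:

```python
import math
import random

TRANSITION = [
    [0.90, 0.10, 0.00],  # row i: P(next state | current soundness state i)
    [0.00, 0.85, 0.15],
    [0.00, 0.00, 1.00],  # worst state is absorbing until repair
]
POISSON_RATE = [0.2, 1.0, 3.0]  # expected local damages per period, by state

def sample_poisson(lam, rng):
    """Poisson draw via Knuth's multiplication method."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_damage_counts(periods, rng):
    """Generate a sequence of observed damage counts from the hidden chain."""
    state, damages = 0, []
    for _ in range(periods):
        damages.append(sample_poisson(POISSON_RATE[state], rng))
        state = rng.choices([0, 1, 2], weights=TRANSITION[state])[0]
    return damages

counts = simulate_damage_counts(20, random.Random(42))
print(counts)
```

The MCMC estimation described in the abstract runs in the opposite direction, inferring the hidden soundness states and rate parameters from observed damage counts.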

  14. TUNS/TCIS information model/process model

    NASA Technical Reports Server (NTRS)

    Wilson, James

    1992-01-01

    An Information Model is comprised of graphical and textual notation suitable for describing and defining the problem domain - in our case, TUNS or TCIS. The model focuses on the real world under study. It identifies what is in the problem and organizes the data into a formal structure for documentation and communication purposes. The Information Model is composed of an Entity Relationship Diagram (ERD) and a Data Dictionary component. The combination of these components provide an easy to understand methodology for expressing the entities in the problem space, the relationships between entities and the characteristics (attributes) of the entities. This approach is the first step in information system development. The Information Model identifies the complete set of data elements processed by TUNS. This representation provides a conceptual view of TUNS from the perspective of entities, data, and relationships. The Information Model reflects the business practices and real-world entities that users must deal with.

  15. Multiphase Flow Modeling of Biofuel Production Processes

    SciTech Connect

    D. Gaston; D. P. Guillen; J. Tester

    2011-06-01

    As part of the Idaho National Laboratory's (INL's) Secure Energy Initiative, the INL is performing research in areas that are vital to ensuring clean, secure energy supplies for the future. The INL Hybrid Energy Systems Testing (HYTEST) Laboratory is being established to develop and test hybrid energy systems, with the principal objective of safeguarding U.S. energy security by reducing dependence on foreign petroleum. HYTEST involves producing liquid fuels in a Hybrid Energy System (HES) by integrating carbon-based energy sources (e.g., biomass, oil shale) with non-carbon-based energy sources (e.g., wind, hydro, geothermal, nuclear). Advances in process development, control and modeling are the unifying vision for HES. This paper describes new modeling tools and methodologies to simulate advanced energy processes. Needs are emerging that require advanced computational modeling of multiphase reacting systems in the energy arena, driven by the 2007 Energy Independence and Security Act, which requires production of 36 billion gal/yr of biofuels by 2022, with 21 billion gal of this as advanced biofuels. Advanced biofuels derived from microalgal biomass have the potential to help achieve the 21 billion gal mandate, as well as reduce greenhouse gas emissions. Production of biofuels from microalgae is receiving considerable interest due to their potentially high oil yields (around 600 gal/acre). Microalgae have a high lipid content (up to 50%) and grow 10 to 100 times faster than terrestrial plants. The use of environmentally friendly alternatives to solvents and reagents commonly employed in reaction and phase separation processes is being explored. This is accomplished through the use of hydrothermal technologies, which are chemical and physical transformations in high-temperature (200-600 C), high-pressure (5-40 MPa) liquid or supercritical water. Figure 1 shows a simplified diagram of the production of biofuels from algae. Hydrothermal processing has significant

  16. Modeling the topological organization of cellular processes.

    PubMed

    Giavitto, Jean-Louis; Michel, Olivier

    2003-07-01

    The cell as a dynamical system presents the characteristic of having a dynamical structure. That is, the exact phase space of the system cannot be fixed before the evolution, and integrative cell models must state the evolution of the structure jointly with the evolution of the cell state. Such dynamical systems are very challenging to model and simulate, and new programming concepts must be developed to ease their modeling and simulation. In this context, the goal of the MGS project is to develop an experimental programming language dedicated to the simulation of this kind of system. MGS proposes a unified view of several computational mechanisms (CHAM, Lindenmayer systems, Paun systems, cellular automata) enabling the specification of spatially localized computations on heterogeneous entities. The evolution of a dynamical structure is handled through the concept of transformation, which relies on the topological organization of the system components. An example based on the modeling of spatially distributed biochemical networks is used to illustrate how these notions can be used to model the spatial and temporal organization of intracellular processes. PMID:12915272

  17. The Comprehensive Process Model of Engagement

    PubMed Central

    Cohen-Mansfield, Jiska; Marx, Marcia S.; Freedman, Laurence S.; Murad, Havi; Regier, Natalie G.; Thein, Khin; Dakheel-Ali, Maha

    2010-01-01

    Background Engagement refers to the act of being occupied or involved with an external stimulus. In dementia, engagement is the antithesis of apathy. Objective The Comprehensive Process Model of Engagement was examined, in which environmental, person, and stimulus characteristics impact the level of engagement of persons with dementia. Methods Participants were 193 residents of 7 Maryland nursing homes. All participants had a diagnosis of dementia. Stimulus engagement was assessed via the Observational Measure of Engagement. Engagement was measured by duration, attention, and attitude to the stimulus. 25 stimuli were presented, which were categorized as live human social stimuli, simulated social stimuli, inanimate social stimuli, a reading stimulus, manipulative stimuli, a music stimulus, task and work-related stimuli, and two different self-identity stimuli. Results All stimuli elicited significantly greater engagement in comparison to the control stimulus. In the multivariate model, music significantly increased engagement duration, while all other stimuli significantly increased duration, attention, and attitude. Significant environmental variables in the multivariate model that increased engagement were: use of the long introduction with modeling (relative to minimal introduction), any level of sound (most especially moderate sound), and the presence of between 2 to 24 people in the room. Significant personal attributes included MMSE scores, ADL performance and clarity of speech, which were positively associated with higher engagement scores. Conclusions Results are consistent with the Comprehensive Process Model of Engagement. Person attributes, environmental factors, and stimulus characteristics all contribute to the level and nature of engagement, with a secondary finding being that exposure to any stimulus elicits engagement in persons with dementia. PMID:21946802

  18. Three-dimensional model for fusion processes

    SciTech Connect

    Olson, A.P.

    1984-01-01

    Active galactic nuclei (AGN) emit unusual spectra of radiation which are interpreted to signify extreme distance, extreme power, or both. The status of AGNs was recently reviewed by Balick and Heckman. It seems that the greatest conceptual difficulty with understanding AGNs is how to form a coherent phenomenological model of their properties. What drives the galactic engine? What and where are the mass-flows of fuel to this engine? Is there more than one engine? Do the engines have any symmetry properties? Is observed radiation isotropically emitted from the source? If it is polarized, what causes the polarization? Why is there a roughly spherical cloud of ionized gas about the center of our own galaxy, the Milky Way? The purpose of this paper is to discuss a new model, based on fusion processes which are not axisymmetric, uniform, isotropic, or even time-invariant. Then, the relationship to these questions will be developed. A unified model of fusion processes applicable to many astronomical phenomena is proposed and discussed.

  19. Modeling of an Active Tablet Coating Process.

    PubMed

    Toschkoff, Gregor; Just, Sarah; Knop, Klaus; Kleinebudde, Peter; Funke, Adrian; Djuric, Dejan; Scharrer, Georg; Khinast, Johannes G

    2015-12-01

    Tablet coating is a common unit operation in the pharmaceutical industry, during which a coating layer is applied to tablet cores. The coating uniformity of tablets in a batch is especially critical for active coating, that is, coating that contains an active pharmaceutical ingredient. In recent years, discrete element method (DEM) simulations became increasingly common for investigating tablet coating. In this work, DEM was applied to model an active coating process as closely as possible, using measured model parameters and non-spherical particles. We studied how operational conditions (rotation speed, fill level, number of nozzles, and spray rate) influence the coating uniformity. To this end, simulation runs were planned and interpreted according to a statistical design of (simulation) experiments. Our general goal was to achieve a deeper understanding of the process in terms of residence times and dimensionless scaling laws. With that regard, the results were interpreted in light of analytical models. The results were presented at various detail levels, ranging from an overview of all variations to in-depth considerations. It was determined that the biggest uniformity improvement in a realistic setting was achieved by increasing the number of spray nozzles, followed by increasing the rotation speed and decreasing the fill level. PMID:26344941
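
Inter-tablet coating uniformity of the kind optimized above is commonly summarized as the relative standard deviation (RSD) of per-tablet coating mass. A sketch on invented masses, not data from the study:

```python
# Relative standard deviation (%) of per-tablet coating mass.
def coating_rsd(masses):
    n = len(masses)
    mean = sum(masses) / n
    var = sum((m - mean) ** 2 for m in masses) / (n - 1)  # sample variance
    return 100.0 * var ** 0.5 / mean

batch = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0]  # hypothetical coating masses (mg)
print(round(coating_rsd(batch), 1))      # lower RSD = more uniform coating
```

In a DEM study of this kind, the same statistic would be computed over the simulated spray mass received by each tablet, so that design-of-experiments factors (nozzle count, rotation speed, fill level) can be ranked by their effect on RSD.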

  20. Process-Based Modeling of Constructed Wetlands

    NASA Astrophysics Data System (ADS)

    Baechler, S.; Brovelli, A.; Rossi, L.; Barry, D. A.

    2007-12-01

    Constructed wetlands (CWs) are widespread facilities for wastewater treatment. In subsurface flow wetlands, contaminated wastewater flows through a porous matrix, where oxidation and detoxification phenomena occur. Despite the large number of working CWs, system design and optimization are still mainly based upon empirical equations or simplified first-order kinetics. This results from an incomplete understanding of the system functioning, and may in turn hinder the performance and effectiveness of the treatment process. As a result, CWs are often considered not suitable to meet high water quality-standards, or to treat water contaminated with recalcitrant anthropogenic contaminants. To date, only a limited number of detailed numerical models have been developed and successfully applied to simulate constructed wetland behavior. Among these, one of the most complete and powerful is CW2D, which is based on Hydrus2D. The aim of this work is to develop a comprehensive simulator tailored to model the functioning of horizontal flow constructed wetlands and in turn provide a reliable design and optimization tool. The model is based upon PHWAT, a general reactive transport code for saturated flow. PHWAT couples MODFLOW, MT3DMS and PHREEQC-2 using an operator-splitting approach. The use of PHREEQC to simulate reactions allows great flexibility in simulating biogeochemical processes. The biogeochemical reaction network is similar to that of CW2D, and is based on the Activated Sludge Model (ASM). Kinetic oxidation of carbon sources and nutrient transformations (nitrogen and phosphorous primarily) are modeled via Monod-type kinetic equations. Oxygen dissolution is accounted for via a first-order mass-transfer equation. While the ASM model only includes a limited number of kinetic equations, the new simulator permits incorporation of an unlimited number of both kinetic and equilibrium reactions. Changes in pH, redox potential and surface reactions can be easily incorporated
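
The kinetics named above can be sketched as a two-state explicit-Euler integration: Monod-type limitation on substrate oxidation in both substrate and oxygen, coupled to first-order oxygen dissolution. All rate constants below are illustrative placeholders, not the calibrated ASM/CW2D values:

```python
# Dual Monod substrate oxidation coupled to first-order oxygen dissolution.
def step(s, o, dt, mu_max=2.0, k_s=5.0, k_o=0.5, kla=0.3, o_sat=8.0, y=0.5):
    """One explicit-Euler step. s: substrate (mg/L), o: dissolved O2 (mg/L)."""
    rate = mu_max * s / (k_s + s) * o / (k_o + o)  # substrate oxidation rate
    s_next = max(s - rate * dt, 0.0)
    # re-aeration toward saturation, minus oxygen consumed by oxidation
    o_next = max(o + (kla * (o_sat - o) - rate / y) * dt, 0.0)
    return s_next, o_next

s, o = 50.0, 4.0                   # influent-like starting concentrations
for _ in range(int(24.0 / 0.01)):  # 24 h at a 0.01 h time step
    s, o = step(s, o, 0.01)
print(round(s, 1), round(o, 2))
```

With these numbers the system quickly becomes oxygen-limited: dissolved O2 drops until re-aeration balances uptake, and substrate removal proceeds at the oxygen-supply-limited rate, which is the behavior the first-order mass-transfer term is meant to capture.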

  1. Organizational Leadership Process for University Education

    ERIC Educational Resources Information Center

    Llamosa-Villalba, Ricardo; Delgado, Dario J.; Camacho, Heidi P.; Paéz, Ana M.; Valdivieso, Raúl F.

    2014-01-01

    This paper describes the "Agile School", an emerging archetype of the enterprise architecture: "Processes of Organizational Leadership" for leading and managing the strategies, tactics and operations of education in Higher Education Institutions. Agile School is a system for innovation and deep transformation of University Institutions…

  2. The evaluation of several agility metrics for fighter aircraft using optimal trajectory analysis

    NASA Technical Reports Server (NTRS)

    Ryan, George W., III; Downing, David R.

    1993-01-01

    Several functional agility metrics, including the combat cycle time metric, dynamic speed turn plots, and relative energy state metric, are used to compare turning performance for generic F-18, X-29, and X-31-type aircraft models. These three-degree-of-freedom models have characteristics similar to the real aircraft. The performance comparisons are made using data from optimal test trajectories to reduce sensitivities to different pilot input techniques and to reduce the effects of control system limiters. The turn performance for all three aircraft is calculated for simulated minimum time 180 deg heading captures from simulation data. Comparisons of the three aircraft give more insight into turn performance than would be available from traditional measures of performance. Using the optimal test technique yields significant performance improvements as measured by the metrics. These performance improvements were found without significant increases in turn radius.

  3. The determinants of parenting: a process model.

    PubMed

    Belsky, J

    1984-02-01

    This essay is based on the assumption that a long-neglected topic of socialization, the determinants of individual differences in parental functioning, is illuminated by research on the etiology of child maltreatment. Three domains of determinants are identified (personal psychological resources of parents, characteristics of the child, and contextual sources of stress and support), and a process model of competent parental functioning is offered on the basis of the analysis. The model presumes that parental functioning is multiply determined, that sources of contextual stress and support can directly affect parenting or indirectly affect parenting by first influencing individual psychological well-being, that personality influences contextual support/stress, which feeds back to shape parenting, and that, in order of importance, the personal psychological resources of the parent are more effective in buffering the parent-child relation from stress than are contextual sources of support, which are themselves more effective than characteristics of the child. PMID:6705636

  4. Development of a dynamic thermal model process

    SciTech Connect

    Smith, F. R.

    1996-04-01

    A dynamic electrical-thermal modeling simulation technique was developed to allow up-front design of thermal and electronic packaging with a high degree of accuracy and confidence. We are developing a hybrid multichip module output driver controlled by power MOSFET driver circuits. These MOSFET circuits will dissipate from 13 to 26 watts per driver in a physical package of less than two square inches. The power dissipation, plus an operating temperature range of -55 °C to 100 °C, makes an accurate thermal package design critical. The project goal was to develop a simulation process to dynamically model the electrical/thermal characteristics of the power MOSFETs using the SABER analog simulator and the ABAQUS finite element simulator. SABER simulates the electrical characteristics of the multichip module design while, in co-simulation, ABAQUS simulates the thermal characteristics of the solid model of the MOSFET package. The dynamic parameters, MOSFET power and chip temperature, are actively passed between simulators to effect a coupled simulation modelling technique. The project required the development of a SABER template for the analog ASIC controller circuit, a dynamic electrical/thermal template for the IRF150 and IRF9130 power MOSFETs, a solid model of the multichip module package, FORTRAN code to handle I/O between an HP755 workstation and SABER, and I/O between a CRAY J90 computer and ABAQUS. The simulation model was validated against measured electrical characteristics of the circuits and real-time thermal imaging of the output multichip module.
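
    The coupled exchange described above (one simulator computing MOSFET power, the other computing chip temperature, with the two quantities passed back and forth) can be sketched as a fixed-point iteration. This is a toy illustration, not the SABER/ABAQUS implementation: the algebraic device and thermal models and all parameter values are hypothetical.

```python
# Toy sketch of the coupled electrical/thermal exchange: power and
# temperature are passed between two simple models until they agree.
# All device and thermal parameters are hypothetical.

def mosfet_power(temp_c, i_drain=10.0, r_ds_25=0.055, alpha=0.006):
    """Conduction loss I^2 * R_ds(on); on-resistance rises with temperature."""
    r_ds = r_ds_25 * (1.0 + alpha * (temp_c - 25.0))
    return i_drain ** 2 * r_ds

def chip_temperature(power_w, t_ambient=100.0, r_theta=3.0):
    """Steady-state junction temperature through a lumped thermal resistance."""
    return t_ambient + r_theta * power_w

def cosimulate(t0=100.0, tol=1e-6, max_iter=100):
    """Fixed-point exchange of power and temperature between the two models."""
    temp = t0
    for _ in range(max_iter):
        p = mosfet_power(temp)            # "electrical" step
        new_temp = chip_temperature(p)    # "thermal" step
        if abs(new_temp - temp) < tol:
            return p, new_temp
        temp = new_temp
    return p, temp

power, temp = cosimulate()
```

In the real co-simulation the exchange happens per time step with transient models on both sides; the sketch only shows why the coupling converges when the electro-thermal feedback is weak.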

  5. Observer-participant models of neural processing.

    PubMed

    Fry, R L

    1995-01-01

    A model is proposed in which the neuron serves as an information channel. Distortion occurs through the channel because the mapping from input Boolean codes to output codes is many-to-one: neuron outputs consist of just two distinguished states. Within the described model, the neuron performs a decision-making function. Decisions are made regarding the validity of a question passively posed by the neuron. This question becomes defined through learning; hence learning is viewed as the process of determining an appropriate question based on supplied input ensembles. An application of the Shannon information measures of entropy and mutual information, taken together in the context of the proposed model, leads to the Hopfield neuron model with conditionalized Hebbian learning rules. Neural decisions are shown to be based on a sigmoidal transfer characteristic or, in the limit as computational temperature tends to zero, a maximum likelihood decision rule. The described work is contrasted with the information-theoretic approach of Linsker. PMID:18263380
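
    The decision rule in the abstract can be made concrete: a sigmoidal transfer characteristic parameterized by a computational temperature, which hardens into a deterministic threshold (maximum-likelihood) rule as the temperature tends to zero. The weights, inputs, and threshold below are hypothetical, chosen only to illustrate the limit.

```python
# Sketch of a sigmoidal neural decision rule that becomes a hard
# maximum-likelihood threshold as computational temperature -> 0.
# Weights, inputs, and threshold are hypothetical.
import math

def fire_probability(weights, inputs, threshold, temperature):
    """P(output = 1) under a sigmoid at the given computational temperature."""
    activation = sum(w * x for w, x in zip(weights, inputs)) - threshold
    return 1.0 / (1.0 + math.exp(-activation / temperature))

w, x, theta = [0.8, -0.3, 0.5], [1, 1, 0], 0.2
warm = fire_probability(w, x, theta, temperature=1.0)   # soft, graded decision
cold = fire_probability(w, x, theta, temperature=1e-6)  # activation > 0 => fires
```

At temperature 1 the output is a graded probability; at near-zero temperature the same positive activation yields firing with probability essentially 1, i.e., a deterministic threshold decision.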

  6. Relationships Between Reactive Agility Movement Time and Unilateral Vertical, Horizontal, and Lateral Jumps.

    PubMed

    Henry, Greg J; Dawson, Brian; Lay, Brendan S; Young, Warren B

    2016-09-01

    Henry, GJ, Dawson, B, Lay, BS, and Young, WB. Relationships between reactive agility movement time and unilateral vertical, horizontal, and lateral jumps. J Strength Cond Res 30(9): 2514-2521, 2016-This study compared reactive agility movement time and unilateral (vertical, horizontal, and lateral) jump performance and kinetics between dominant and nondominant legs in Australian rules footballers (n = 31) to investigate the role of leg strength characteristics in reactive agility performance. Jumps involved hopping forward on 1 leg and then jumping for maximum height or for horizontal or lateral distance. Agility and movement time components of reactive agility were assessed using a video-based test. Correlations between each of the jumps were strong (r = -0.62 to -0.77), but the relationships between the jumps and agility movement time were weak (r = -0.25 to -0.33). Dominant leg performance was superior in reactive agility movement time (4.5%; p = 0.04), lateral jump distance (3%; p = 0.008), and lateral reactive strength index (4.4%; p = 0.03) compared with the nondominant leg. However, when the subjects were divided into faster and slower performers (based on their agility movement times), movement time was significantly quicker in the faster group (n = 15; 12%; p < 0.001), but no differences in jump performance or kinetics were observed. Therefore, although factors involved in producing superior lateral jump performance in the dominant leg may also be associated with advantages in agility performance in that leg, the capacity for jumps to predict agility performance seems limited. Reactive strength as measured by unilateral jumps seems to play a limited role in reactive agility performance; other factors such as skill, balance, and coordination, as well as cognitive and decision-making factors, are likely to be more important. PMID:23820562

  7. Development of a Comprehensive Weld Process Model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area and that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model which was developed by LMES and which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, the model does not account for multipass welds, microstructural evolution, distortion and residual stresses. Additionally, the model requires large resources of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above. The following technical tasks have been accomplished as part of the CRADA: 1. The LMES welding code has been ported to the Intel Paragon parallel computer at ORNL.

  8. Wired Widgets: Agile Visualization for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Gerschefske, K.; Witmer, J.

    2012-09-01

    Continued advancement in sensors and analysis techniques has resulted in a wealth of Space Situational Awareness (SSA) data, made available via tools and Service Oriented Architectures (SOA) such as those in the Joint Space Operations Center Mission Systems (JMS) environment. Current visualization software cannot quickly adapt to rapidly changing missions and data, preventing operators and analysts from performing their jobs effectively. The value of this wealth of SSA data is not fully realized, as the operators' existing software is not built with the flexibility to consume new or changing sources of data or to rapidly customize their visualization as the mission evolves. While tools like the JMS user-defined operational picture (UDOP) have begun to fill this gap, this paper presents a further evolution, leveraging Web 2.0 technologies for maximum agility. We demonstrate a flexible Web widget framework with inter-widget data sharing, publish-subscribe eventing, and an API providing the basis for consumption of new data sources and adaptable visualization. Wired Widgets offers cross-portal widgets along with a widget communication framework and development toolkit for rapid new widget development, giving operators the ability to answer relevant questions as the mission evolves. Wired Widgets has been applied in a number of dynamic mission domains including disaster response, combat operations, and noncombatant evacuation scenarios. This variety of applications demonstrates that Wired Widgets provides a flexible, data-driven solution for visualization in changing environments. In this paper, we show how, deployed in the Ozone Widget Framework portal environment, Wired Widgets can provide an agile, web-based visualization to support the SSA mission. Furthermore, we discuss how the tenets of agile visualization can generally be applied to the SSA problem space to provide operators flexibility, potentially informing future acquisition and system development.

  9. Flexible Conveyance Control System for Agile Manufacturing

    NASA Astrophysics Data System (ADS)

    Uchiyama, Kazuhisa; Uchimura, Keiichi; Yoshikawa, Takeru; Shishino, Satoru; Hu, Zhencheng

    The control of conveyance between processes still relies on engineers' experience, and it is difficult to respond to changes in the conveyance layout caused by the expansion or reduction of production and by breakdowns of conveyance devices. In this paper, we propose a conveyance control system that can adapt to changes in the conveyance layout, with the aim of improving the total conveyance volume. The system is divided into a part that stores the conveyance layout as a database and a part that performs conveyance control using a congestion index. It is therefore possible to respond flexibly to sudden changes in the conveyance layout by changing only the database part, leaving the conveyance control part unchanged. In simulations over all combinations of conveyance layouts and turn-on patterns, the proposed conveyance control method achieved strong results in terms of total conveyance volume.

  10. An updated list of AGILE bright γ-ray sources and their variability in pointing mode

    NASA Astrophysics Data System (ADS)

    Verrecchia, F.; Pittori, C.; Chen, A. W.; Bulgarelli, A.; Tavani, M.; Lucarelli, F.; Giommi, P.; Vercellone, S.; Pellizzoni, A.; Giuliani, A.; Longo, F.; Barbiellini, G.; Trifoglio, M.; Gianotti, F.; Argan, A.; Antonelli, L. A.; Caraveo, P.; Cardillo, M.; Cattaneo, P. W.; Cocco, V.; Colafrancesco, S.; Contessi, T.; Costa, E.; Del Monte, E.; De Paris, G.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Evangelista, Y.; Fanari, G.; Feroci, M.; Ferrari, A.; Fiorini, M.; Fornari, F.; Fuschino, F.; Froysland, T.; Frutti, M.; Galli, M.; Labanti, C.; Lapshov, I.; Lazzarotto, F.; Liello, F.; Lipari, P.; Mattaini, E.; Marisaldi, M.; Mastropietro, M.; Mauri, A.; Mauri, F.; Mereghetti, S.; Morelli, E.; Moretti, E.; Morselli, A.; Pacciani, L.; Perotti, F.; Piano, G.; Picozza, P.; Pilia, M.; Pontoni, C.; Porrovecchio, G.; Prest, M.; Primavera, R.; Pucella, G.; Rapisarda, M.; Rappoldi, A.; Rossi, E.; Rubini, A.; Sabatini, S.; Santolamazza, P.; Soffitta, P.; Stellato, S.; Striani, E.; Tamburelli, F.; Traci, A.; Trois, A.; Vallazza, E.; Vittorini, V.; Zanello, D.; Salotti, L.; Valentini, G.

    2013-10-01

    Aims: We present a variability study of a sample of bright γ-ray (30 MeV-50 GeV) sources. This sample is an extension of the first AGILE catalogue of γ-ray sources (1AGL), obtained using the complete set of AGILE observations in pointing mode performed during a 2.3 year period from July 9, 2007 until October 30, 2009. Methods: The dataset of AGILE pointed observations covers a long time interval and its γ-ray data archive is useful for monitoring studies of medium-to-high brightness γ-ray sources. In the analysis reported here, we used data obtained with an improved event filter that covers a wider field of view, on a much larger (about 27.5 months) dataset, integrating data on observation block time scales, which mostly range from a few days to thirty days. Results: The data processing resulted in a better characterized source list than that of 1AGL, and includes 54 sources, 7 of which are new high galactic latitude (|BII| ≥ 5) sources, 8 are new sources on the galactic plane, and 20 sources from the previous catalogue with revised positions. Eight 1AGL sources (2 high-latitude and 6 on the galactic plane) were not detected in the final processing, because of low OB exposure and/or their position in complex galactic regions. We report the results in a catalogue of all the detections obtained in each single OB, including the variability results for each of these sources. In particular, we found that 12 sources out of 42 or 11 out of 53 are variable, depending on the variability index used, where 42 and 53 are the number of sources for which these indices could be calculated. Seven of the 11 variable sources are blazars; the others are Crab pulsar+nebula, LS I +61°303, Cyg X-3, and 1AGLR J2021+4030. Table 5 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/558/A137

  11. Stress Process Model for Individuals With Dementia

    PubMed Central

    Judge, Katherine S.; Menne, Heather L.; Whitlatch, Carol J.

    2010-01-01

    Purpose: Individuals with dementia (IWDs) face particular challenges in managing and coping with their illness. The experience of dementia may be affected by the etiology, stage, and severity of symptoms, preexisting and related chronic conditions, and available informal and formal supportive services. Although several studies have examined particular features of IWDs' illness experience, few draw upon a conceptual model that outlines the global illness experience and the resulting stressors that commence with symptom onset, proliferate over time, and continue through the later stages of cognitive loss. Building on the work of Pearlin and colleagues (1990, Caregiving and the stress process: An overview of concepts and their measures. The Gerontologist, 30, 583–594), this article proposes a stress process model (SPM) for IWDs that conceptualizes and examines the illness experience of IWDs. Implications: The proposed SPM for IWDs serves as a guide to (a) consider and understand the short- and long-term complexities of the illness experience for IWDs, (b) investigate specific hypotheses by outlining key stressors in the illness experience and by positing relationships among stressors and outcomes, and (c) help inform the development of interventions to prevent or reduce the negative stressors and enhance the positive experiences of living with a dementing illness. PMID:20022935

  12. Simulation model of clastic sedimentary processes

    SciTech Connect

    Tetzlaff, D.M.

    1987-01-01

    This dissertation describes SEDSIM, a computer model that simulates erosion, transport, and deposition of clastic sediments by free-surface flow in natural environments. SEDSIM is deterministic and is applicable to sedimentary processes in rivers, deltas, continental shelves, submarine canyons, and turbidite fans. The model is used to perform experiments in clastic sedimentation. Computer experimentation is limited by the computing power available, but is free from the scaling problems associated with laboratory experiments. SEDSIM responds to information provided to it at the outset of a simulation experiment, including topography, subsurface configuration, physical parameters of fluid and sediment, and characteristics of sediment sources. Extensive computer graphics are incorporated in SEDSIM. The user can display the three-dimensional geometry of simulated deposits in the form of successions of contour maps, perspective diagrams, vector plots of current velocities, and vertical sections of any azimuth orientation. The sections show both sediment age and composition. SEDSIM works realistically with processes involving channel shifting and topographic changes. Example applications include simulation of an ancient submarine canyon carved into a Cretaceous sequence in the National Petroleum Reserve in Alaska, known mainly from seismic sections, and of a sequence of Tertiary age in the Golden Meadow oil field of Louisiana, known principally from well logs.

  13. Anisotropic model-based SAR processing

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Gunther, Jake; Moon, Todd

    2013-05-01

    Synthetic aperture radar (SAR) collections that integrate over a wide range of aspect angles hold the potential for improved resolution and foster improved scene interpretability and target detection. In practice, however, it is difficult to realize this potential due to the anisotropic scattering of objects in the scene: the radar cross section (RCS) of most objects changes as a function of aspect angle. The isotropic assumption is tacitly made by most common image formation algorithms (IFA). For wide-aspect scenarios, one way to account for anisotropy is to employ a piecewise linear model. This paper focuses on such a model, but incorporates aspect and spatial magnitude filters in the image formation process. This is advantageous when prior knowledge is available regarding the desired targets' RCS signature, both spatially and in aspect. The appropriate filters can be incorporated into the image formation processing so that specific targets are emphasized while other targets are suppressed. This is demonstrated on the Air Force Research Laboratory (AFRL) GOTCHA data set to show the utility of the proposed approach.

  14. Perspectives on Industrial Innovation from Agilent, HP, and Bell Labs

    NASA Astrophysics Data System (ADS)

    Hollenhorst, James

    2014-03-01

    Innovation is the life blood of technology companies. I will give perspectives gleaned from a career in research and development at Bell Labs, HP Labs, and Agilent Labs, from the point of view of an individual contributor and a manager. Physicists bring a unique set of skills to the corporate environment, including a desire to understand the fundamentals, a solid foundation in physical principles, expertise in applied mathematics, and most importantly, an attitude: namely, that hard problems can be solved by breaking them into manageable pieces. In my experience, hiring managers in industry seldom explicitly search for physicists, but they want people with those skills.

  15. Impact of emerging technologies on future combat aircraft agility

    NASA Technical Reports Server (NTRS)

    Nguyen, Luat T.; Gilert, William P.

    1990-01-01

    The foreseeable character of future within-visual-range air combat entails a degree of agility which calls for the integration of high-alpha aerodynamics, thrust vectoring, intimate pilot/vehicle interfaces, and advanced weapons/avionics suites, in prospective configurations. The primary technology-development programs currently contributing to these goals are presently discussed; they encompass the F-15 Short Takeoff and Landing/Maneuver Technology Demonstrator Program, the Enhanced Fighter Maneuverability Program, the High Angle-of-Attack Technology Program, and the X-29 Technology Demonstrator Program.

  16. Agile delivery of protein therapeutics to CNS.

    PubMed

    Yi, Xiang; Manickam, Devika S; Brynskikh, Anna; Kabanov, Alexander V

    2014-09-28

    A variety of therapeutic proteins have shown potential to treat central nervous system (CNS) disorders. The challenge of delivering these protein molecules to the brain is well known. Proteins administered through parenteral routes are often excluded from the brain because of their poor bioavailability and the existence of the blood-brain barrier (BBB). Barriers also exist to proteins administered through non-parenteral routes that bypass the BBB. Several strategies have shown promise in delivering proteins to the brain. This review, first, describes the physiology and pathology of the BBB that underscore the rationale and needs of each strategy to be applied. Second, major classes of protein therapeutics along with some key factors that affect their delivery outcomes are presented. Third, different routes of protein administration (parenteral, central intracerebroventricular and intraparenchymal, intranasal and intrathecal) are discussed along with key barriers to CNS delivery associated with each route. Finally, current delivery strategies involving chemical modification of proteins and use of particle-based carriers are overviewed using examples from the literature and our own work. Whereas most of these studies are in the early stage, some provide proof of mechanism of increased protein delivery to the brain in relevant models of CNS diseases, while in a few cases proof of concept has been attained in clinical studies. This review will be useful to a broad audience of students, academicians and industry professionals who consider critical issues of protein delivery to the brain and aim at developing and studying effective brain delivery systems for protein therapeutics. PMID:24956489

  18. Migration and Marriage: Modeling the Joint Process

    PubMed Central

    Jang, Joy Bohyun; Casterline, John B; Snyder, Anastasia

    2016-01-01

    Background Previous research on inter-relations between migration and marriage has relied on overly simplistic assumptions about the structure of dependency between the two events. However, there is good reason to posit that each of the two transitions has an impact on the likelihood of the other, and that unobserved common factors may affect both migration and marriage, leading to a distorted impression of the causal impact of one on the other. Objective We will investigate relationships between migration and marriage in the United States using data from the National Longitudinal Survey of Youth 1979. We allow for inter-dependency between the two events and examine whether unobserved common factors affect the estimates of both migration and marriage. Methods We estimate a multi-process model in which migration and marriage are considered simultaneously in regression analysis and there is allowance for correlation between disturbances; the latter feature accounts for possible endogeneity between shared unobserved determinants. The model also includes random effects for persons, exploiting the fact that many people experience both events multiple times throughout their lives. Results Unobserved factors appear to significantly influence both migration and marriage, resulting in upward bias in estimates of the effects of each on the other when these shared common factors are not accounted for. Estimates from the multi-process model indicate that marriage significantly increases the hazard of migration while migration does not affect the hazard of marriage. Conclusions Omitting inter-dependency between life course events can lead to a mistaken impression of the direct effects of certain features of each event on the other. PMID:27182198

  19. ISS Double-Gimbaled CMG Subsystem Simulation Using the Agile Development Method

    NASA Technical Reports Server (NTRS)

    Inampudi, Ravi

    2016-01-01

    This paper presents an evolutionary approach to simulating a cluster of 4 Control Moment Gyros (CMG) on the International Space Station (ISS) using a common sense approach (the agile development method) for concurrent mathematical modeling and simulation of the CMG subsystem. This simulation is part of the Training Systems for the 21st Century simulator, which will provide training for crew members, instructors, and flight controllers. The basic idea of how the CMGs on the space station are used for its non-propulsive attitude control is briefly explained to set up the context for simulating a CMG subsystem. Next, different reference frames and the detailed equations of motion (EOM) for multiple double-gimbal variable-speed control moment gyroscopes (DGVs) are presented. Fixing some of the terms in the EOM yields the special-case EOM for the ISS's double-gimbaled fixed-speed CMGs. CMG simulation development using the agile development method is presented, in which the customer's requirements and solutions evolve through iterative analysis, design, coding, unit testing and acceptance testing. At the end of each iteration, the set of features implemented in that iteration is demonstrated to the flight controllers, thus creating a short feedback loop and helping create adaptive development cycles. The unified modeling language (UML) tool is used in illustrating the user stories, class designs and sequence diagrams. This incremental approach to mathematically modeling and simulating the CMG subsystem involved the development team and the customer early on, thus improving the quality of the working CMG system in each iteration and helping the team to accurately predict the cost, schedule and delivery of the software.

  20. Impact of Business Intelligence and IT Infrastructure Flexibility on Competitive Advantage: An Organizational Agility Perspective

    ERIC Educational Resources Information Center

    Chen, Xiaofeng

    2012-01-01

    There is growing use of business intelligence (BI) for better management decisions in industry. However, empirical studies on BI are still scarce in academic research. This research investigates BI from an organizational agility perspective. Organizational agility is the ability to sense and respond to market opportunities and threats with speed,…

  1. Renewed gamma-ray activity of the Blazar 3C 454.3 detected by AGILE

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Parmiggiani, N.; Fioretti, V.; Zoli, A.; Lucarelli, F.; Verrecchia, F.; Pittori, C.; Vercellone, S.; Piano, G.; Munar-Adrover, P.; Tavani, M.; Donnarumma, I.; Striani, E.; Cardillo, M.; Gianotti, F.; Trifoglio, M.; Giuliani, A.; Mereghetti, S.; Caraveo, P.; Perotti, F.; Chen, A.; Argan, A.; Costa, E.; Del Monte, E.; Evangelista, Y.; Feroci, M.; Lazzarotto, F.; Lapshov, I.; Pacciani, L.; Soffitta, P.; Sabatini, S.; Vittorini, V.; Pucella, G.; Rapisarda, M.; Di Cocco, G.; Fuschino, F.; Galli, M.; Labanti, C.; Marisaldi, M.; Pellizzoni, A.; Pilia, M.; Trois, A.; Barbiellini, G.; Vallazza, E.; Longo, F.; Morselli, A.; Picozza, P.; Prest, M.; Lipari, P.; Zanello, D.; Cattaneo, P. W.; Rappoldi, A.; Colafrancesco, S.; Ferrari, A.; Antonelli, A.; Giommi, P.; Salotti, L.; Valentini, G.; D'Amico, F.

    2016-06-01

    The AGILE satellite is detecting a significant enhancement in gamma-ray activity from the FSRQ 3C 454.3 (known as 1AGLR J2254+1609) since the recent AGILE ATel #9157, and the optical activity reported in ATel #9150.

  2. Project-Method Fit: Exploring Factors That Influence Agile Method Use

    ERIC Educational Resources Information Center

    Young, Diana K.

    2013-01-01

    While the productivity and quality implications of agile software development methods (SDMs) have been demonstrated, research concerning the project contexts where their use is most appropriate has yielded less definitive results. Most experts agree that agile SDMs are not suited for all project contexts. Several project and team factors have been…

  3. The Impacts of Agile Development Methodology Use on Project Success: A Contingency View

    ERIC Educational Resources Information Center

    Tripp, John F.

    2012-01-01

    Agile Information Systems Development Methods have emerged in the past decade as an alternative manner of managing the work and delivery of information systems development teams, with a large number of organizations reporting the adoption & use of agile methods. The practitioners of these methods make broad claims as to the benefits of their…

  4. Joint Scheduling and Spectrum Allocation in Wireless Networks with Frequency-Agile Radios

    NASA Astrophysics Data System (ADS)

    Uddin, Mohammad Faisal; Nurujjaman, Mohammad; Assi, Chadi

    We study the benefits of optimal spectrum allocation in a wireless network with frequency-agile radios, and we present a cross-layer problem formulation for joint routing and link scheduling under non-uniform spectrum allocation. We present a primal-dual decomposition to provide an exact solution for this complex optimization problem. Given the difficulty associated with such a design, we propose a heuristic approach based on simulated annealing to solve the dual sub-problem of the decomposed model. Numerical results revealed that, for larger networks, up to a 44% improvement in network performance is obtained when variable-width spectrum band allocation is used, as opposed to the best fixed-width spectrum band allocation. Numerical results also confirm that the primal-dual decomposition method using simulated annealing to solve the dual sub-problem substantially reduces the computation time and achieves near-optimal solutions.
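
The heuristic described in the abstract above can be illustrated with a generic simulated-annealing loop. The toy allocation problem, utility function, and parameter values below are invented stand-ins, not the authors' formulation:

```python
import math
import random

def simulated_annealing(initial, neighbor, cost, t0=1.0, cooling=0.95, steps=500):
    """Generic simulated-annealing loop (toy sketch, not the paper's model)."""
    state, best = initial, initial
    t = t0
    for _ in range(steps):
        cand = neighbor(state)
        delta = cost(cand) - cost(state)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / t):
            state = cand
        if cost(state) < cost(best):
            best = state
        t *= cooling  # geometric cooling schedule
    return best

# Toy problem: split 20 MHz of spectrum among 4 links, maximizing sum-log utility
# (so the "best" allocation spreads bandwidth rather than concentrating it).
random.seed(0)

def neighbor(alloc):
    a = list(alloc)
    i, j = random.sample(range(len(a)), 2)
    move = min(a[i], 1.0)          # shift up to 1 MHz from link i to link j
    a[i] -= move
    a[j] += move
    return tuple(a)

def cost(alloc):
    return -sum(math.log(1.0 + w) for w in alloc)  # negated utility to minimize

best = simulated_annealing((20.0, 0.0, 0.0, 0.0), neighbor, cost)
print(best, -cost(best))
```

The annealer escapes the all-to-one-link starting point and converges toward a more even split, which is the qualitative behavior the dual sub-problem heuristic relies on.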

  5. Research on rapid agile metrology for manufacturing based on real-time multitask operating system

    NASA Astrophysics Data System (ADS)

    Chen, Jihong; Song, Zhen; Yang, Daoshan; Zhou, Ji; Buckley, Shawn

    1996-10-01

    Rapid agile metrology for manufacturing (RAMM) using multiple non-contact sensors is likely to remain a growing trend in manufacturing. High-speed inspection systems for manufacturing are characterized by multiple tasks implemented in parallel and real-time events which occur simultaneously. In this paper, we introduce a real-time operating system into RAMM research. A general task model based on class-based object-oriented technology is proposed. A general multitask frame of a typical RAMM system using OPNet is discussed. Finally, an application example of a machine which inspects parts held on a carrier strip is described. With RTOS and OPNet, this machine can measure two dimensions of the contacts at 300 parts/second.

  6. Mechanical-mathematical modeling for landslide process

    NASA Astrophysics Data System (ADS)

    Svalova, V.

    2009-04-01

    500 m and displacement of a landslide in plan of over 1 m. The last serious activation of the landslide took place in 2002, with a movement of 53 cm. Catastrophic activation of the deep blockglide landslide in the Khoroshevo area of Moscow took place in 2006-2007. A crack 330 m long appeared in the old sliding circus, along which a new 220 m long creeping block separated from the plateau and began sinking, with the displaced surface of the plateau reaching 12 m. Such activation of the landslide process had not been observed in Moscow since the mid XIX century. The Khoroshevo sliding area had been stable for a long time without manifestations of activity. Revealing the causes of deformation and developing means of protection against deep landslide motions is an extremely pressing and difficult problem whose solution is necessary for the preservation of valuable historical monuments and modern city constructions. The causes of activation and protective measures are discussed. The structure of a monitoring system for urban territories is elaborated. A mechanical-mathematical model of a highly viscous fluid was used to model the behavior of matter on landslide slopes. The equation of continuity and an approximate Navier-Stokes equation for slow motion in a thin layer were used. The modelling results make it possible to determine the place of highest velocity on the landslide surface, which could be the best position for a monitoring post. The model can be used for calibration of monitoring equipment and makes it possible to investigate some fundamental aspects of the movement of matter on a landslide slope.
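
A minimal sketch of the thin-layer viscous-flow idea in the abstract above: slow viscous flow down an inclined slope (the lubrication limit of the Navier-Stokes equations) gives a parabolic velocity profile that peaks at the free surface. All material parameters here are illustrative placeholders, not values from the study:

```python
import math

# Slow viscous flow of a thin layer down a slope (lubrication limit of the
# Navier-Stokes equations). Parameters are illustrative, not from the study.
rho = 2000.0               # bulk density, kg/m^3
g = 9.81                   # gravitational acceleration, m/s^2
mu = 1e10                  # effective viscosity, Pa*s (landslides creep very slowly)
alpha = math.radians(10)   # slope angle
h = 5.0                    # layer thickness, m

def velocity(y):
    """Downslope velocity at height y above the base (no-slip at y = 0,
    zero shear stress at the free surface y = h)."""
    return (rho * g * math.sin(alpha) / mu) * (h * y - 0.5 * y * y)

# The profile increases monotonically from the base and peaks at the free
# surface -- the natural location for a monitoring post.
profile = [velocity(h * i / 10) for i in range(11)]
assert profile == sorted(profile)
print(f"surface velocity: {velocity(h):.3e} m/s")
```

The same closed form shows why surface monitoring captures the maximum velocity: for a pure gravity-driven thin layer the fastest motion is always at the top of the moving mass.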

  7. Discovery of New Gamma-Ray Pulsars with AGILE

    NASA Astrophysics Data System (ADS)

    Pellizzoni, A.; Pilia, M.; Possenti, A.; Chen, A.; Giuliani, A.; Trois, A.; Caraveo, P.; Del Monte, E.; Fornari, F.; Fuschino, F.; Mereghetti, S.; Tavani, M.; Argan, A.; Burgay, M.; Cognard, I.; Corongiu, A.; Costa, E.; D'Amico, N.; De Luca, A.; Esposito, P.; Evangelista, Y.; Feroci, M.; Johnston, S.; Kramer, M.; Longo, F.; Marisaldi, M.; Theureau, G.; Weltevrede, P.; Barbiellini, G.; Boffelli, F.; Bulgarelli, A.; Cattaneo, P. W.; Cocco, V.; D'Ammando, F.; DeParis, G.; Di Cocco, G.; Donnarumma, I.; Fiorini, M.; Froysland, T.; Galli, M.; Gianotti, F.; Labanti, C.; Lapshov, I.; Lazzarotto, F.; Lipari, P.; Mineo, T.; Morselli, A.; Pacciani, L.; Perotti, F.; Piano, G.; Picozza, P.; Prest, M.; Pucella, G.; Rapisarda, M.; Rappoldi, A.; Sabatini, S.; Soffitta, P.; Trifoglio, M.; Vallazza, E.; Vercellone, S.; Vittorini, V.; Zambra, A.; Zanello, D.; Pittori, C.; Verrecchia, F.; Preger, B.; Santolamazza, P.; Giommi, P.; Salotti, L.; Bignami, G. F.

    2009-04-01

    Using gamma-ray data collected by the Astro-rivelatore Gamma ad Immagini LEggero (AGILE) satellite over a period of almost one year (from 2007 July to 2008 June), we searched for pulsed signals from 35 potentially interesting radio pulsars, ordered according to F_γ ∝ √(Ė) d^-2 and for which contemporary or recent radio data were available. AGILE detected three new top-ranking nearby and Vela-like pulsars with good confidence both through timing and spatial analysis. Among the newcomers we find pulsars with very high rotational energy losses, such as the remarkable PSR B1509-58 with a magnetic field in excess of 10^13 Gauss, and PSR J2229+6114, providing a reliable identification for the previously unidentified EGRET source 3EG J2227+6122. Moreover, the powerful millisecond pulsar B1821-24, in the globular cluster M28, is detected during a fraction of the observations. Four other promising gamma-ray pulsar candidates, among which is the notable J2043+2740 with an age in excess of 1 million years, show a possible detection in the timing analysis only and deserve confirmation.

  8. Enhanced detection of Terrestrial Gamma-Ray Flashes by AGILE

    NASA Astrophysics Data System (ADS)

    Marisaldi, M.; Argan, A.; Ursi, A.; Gjesteland, T.; Fuschino, F.; Labanti, C.; Galli, M.; Tavani, M.; Pittori, C.; Verrecchia, F.; D'Amico, F.; Ostgaard, N.; Mereghetti, S.; Campana, R.; Cattaneo, P.; Bulgarelli, A.; Colafrancesco, S.; Dietrich, S.; Longo, F.; Gianotti, F.; Giommi, P.; Rappoldi, A.; Trifoglio, M.; Trois, A.

    2015-12-01

    At the end of March 2015 the onboard configuration of the AGILE satellite was modified in order to disable the veto signal of the anticoincidence shield for the minicalorimeter instrument. The motivation for this change was the understanding that the dead time induced by the anticoincidence prevented the detection of a large fraction of Terrestrial Gamma-ray Flashes (TGFs), especially the short-duration ones. We present here the characteristics of the new TGF sample after several months of stable operations in the new configuration. The configuration change was highly successful, resulting in the detection of about 100 TGFs/month, an increase in TGF detection rate of about a factor of 11 with respect to the previous configuration. As expected, the largest fraction of the new events has short duration, with a median duration of 80 microseconds. We also obtain a sample of events with simultaneous association, within 100 microseconds, with lightning sferics detected by the World Wide Lightning Location Network (WWLLN), confirming previous results reported by the Fermi mission. Given the high detection rate and AGILE's very low (±2.5°) orbital inclination, the new configuration provides the largest TGF detection rate surface density (TGFs/km²/year) to date, opening space for correlation studies with lightning and atmospheric parameters on short spatial and temporal scales along the equatorial region. Finally, the events with associated simultaneous WWLLN sferics provide a highly reliable sample to probe the long-standing issue of the TGF maximal energy.

  9. A 'Common Information Model' for the climate modelling process

    NASA Astrophysics Data System (ADS)

    Treshansky, Allyn; Devine, Gerard

    2010-05-01

    The Common Information Model (CIM), developed by the EU-funded METAFOR project (http://metaforclimate.eu), is a formal model of the climate modeling process. It provides a rich structured description of not only climate data but also the "provenance" of that data: the software models and tools used to generate that data, the simulations those models implement, the experiments those simulations conform to, etc. This formal metadata model is expected to add value to those datasets by firstly codifying what is currently found only in the heads of climate experts (the aforementioned provenance of climate datasets), and secondly by allowing tools to be developed that make searching for and analysing climate datasets a much more intuitive process than it has been in the past. This paper will describe the structure of the CIM, concentrating on how it works with and what it adds to other metadata standards. As alluded to above, current metadata standards concentrate on the contents of a climate dataset. Scientific detail and relevance of the model components that generated that data, as well as the context for why it was run, are missing. The CIM addresses this gap. However, it does not aim to replace existing standards. Rather, wherever possible it re-uses them. It also attempts to standardise our understanding of climate modeling at a very high level, at a conceptual level. This results in a UML description of climate modeling, the CONCIM. METAFOR extracts from this high-level UML the bits of the CIM that we want to use in our applications; these bits get converted into a set of XSD application schemas, the APPCIM. Other user groups may derive a different APPCIM (in a different format) that suits them from the same CONCIM. Thus there is a common understanding of the concepts used in climate modeling even if the implementation differs. In certain key places the CIM describes a general structure over which a specific Controlled Vocabulary (CV) can be applied. For example

  10. Model development for naphthenic acids ozonation process.

    PubMed

    Al Jibouri, Ali Kamel H; Wu, Jiangning

    2015-02-01

    Naphthenic acids (NAs) are toxic constituents of oil sands process-affected water (OSPW), which is generated during the extraction of bitumen from oil sands. NAs consist mainly of carboxylic acids which are generally biorefractory. For the treatment of OSPW, ozonation is a very beneficial method. It can significantly reduce the concentration of NAs and it can also convert NAs from biorefractory to biodegradable. In this study, a 2^4 factorial design was used for the ozonation of OSPW to study the influences of the operating parameters (ozone concentration, oxygen/ozone flow rate, pH, and mixing) on the removal of model NAs in a semi-batch reactor. It was found that ozone concentration had the most significant effect on the NAs concentration compared to the other parameters. An empirical model was developed to correlate the concentration of NAs with ozone concentration, oxygen/ozone flow rate, and pH. In addition, a theoretical analysis was conducted to gain insight into the relationship between the removal of NAs and the operating parameters. PMID:25189805
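
The 2^4 factorial design described above can be sketched as follows. The factor names follow the abstract, but the response function and its coefficients are synthetic illustrations, not the study's data:

```python
import itertools

# A 2^4 full factorial design: each factor at coded levels -1 / +1, 16 runs.
# Factor names follow the abstract; the response below is synthetic.
factors = ["ozone_conc", "flow_rate", "pH", "mixing"]
design = list(itertools.product([-1, 1], repeat=4))

# Synthetic NA-removal response, dominated by ozone concentration.
def response(run):
    o3, flow, ph, mix = run
    return 50 + 20 * o3 + 3 * flow + 2 * ph + 1 * mix

ys = [response(run) for run in design]

# Main effect of a factor = mean(y at level +1) - mean(y at level -1).
def main_effect(i):
    hi = [y for run, y in zip(design, ys) if run[i] == 1]
    lo = [y for run, y in zip(design, ys) if run[i] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: main_effect(i) for i, name in enumerate(factors)}
print(effects)  # ozone_conc dominates, mirroring the abstract's finding
```

Ranking the absolute main effects is exactly how a factorial screening study identifies which operating parameter matters most before fitting an empirical model.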

  11. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with applications of modeling and simulation tools in the optimization of business processes, especially in optimizing signal flow in a security company. Simul8 software was selected as the modeling tool; it supports process modeling based on discrete event simulation and enables the creation of a visual model of production and distribution processes.

  12. Developing Friction Stir Welding Process Model for ICME Application

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Ping

    2015-01-01

    A framework for developing a product involving manufacturing processes was developed with an integrated computational materials engineering (ICME) approach. The key component in the framework is a process modeling tool which includes a thermal model, a microstructure model, a thermo-mechanical model, and a property model. Using the friction stir welding (FSW) process as an example, development of the process modeling tool is introduced in detail. The thermal model and the microstructure model of FSW of steels were validated with experimental data. The model can predict reasonable temperature and hardness distributions as observed in the experiment. The model was applied to predict residual stress and joint strength of a pipe girth weld.

  13. Stochastic model updating utilizing Bayesian approach and Gaussian process model

    NASA Astrophysics Data System (ADS)

    Wan, Hua-Ping; Ren, Wei-Xin

    2016-03-01

    Stochastic model updating (SMU) has been increasingly applied in quantifying structural parameter uncertainty from responses variability. SMU for parameter uncertainty quantification refers to the problem of inverse uncertainty quantification (IUQ), which is a nontrivial task. Inverse problem solved with optimization usually brings about the issues of gradient computation, ill-conditionedness, and non-uniqueness. Moreover, the uncertainty present in response makes the inverse problem more complicated. In this study, Bayesian approach is adopted in SMU for parameter uncertainty quantification. The prominent strength of Bayesian approach for IUQ problem is that it solves IUQ problem in a straightforward manner, which enables it to avoid the previous issues. However, when applied to engineering structures that are modeled with a high-resolution finite element model (FEM), Bayesian approach is still computationally expensive since the commonly used Markov chain Monte Carlo (MCMC) method for Bayesian inference requires a large number of model runs to guarantee the convergence. Herein we reduce computational cost in two aspects. On the one hand, the fast-running Gaussian process model (GPM) is utilized to approximate the time-consuming high-resolution FEM. On the other hand, the advanced MCMC method using delayed rejection adaptive Metropolis (DRAM) algorithm that incorporates local adaptive strategy with global adaptive strategy is employed for Bayesian inference. In addition, we propose the use of the powerful variance-based global sensitivity analysis (GSA) in parameter selection to exclude non-influential parameters from calibration parameters, which yields a reduced-order model and thus further alleviates the computational burden. A simulated aluminum plate and a real-world complex cable-stayed pedestrian bridge are presented to illustrate the proposed framework and verify its feasibility.
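
The Bayesian workflow above, with the expensive FEM replaced by a fast surrogate and MCMC run against the surrogate, can be sketched with a basic random-walk Metropolis loop (DRAM adds delayed rejection and adaptive proposals on top of this). The surrogate, the observation, and all numeric values below are toy stand-ins:

```python
import math
import random

random.seed(1)

# Toy stand-in for the expensive FEM: in the paper this role is played by a
# Gaussian-process emulator trained on FEM runs; a closed form is used here.
def surrogate(theta):
    return 2.0 * theta + 1.0

true_theta, sigma = 3.0, 0.5
observed = surrogate(true_theta) + 0.1   # one noisy synthetic observation

def log_post(theta):
    """Log-posterior: flat prior on [0, 10], Gaussian likelihood."""
    if not 0.0 <= theta <= 10.0:
        return -math.inf
    r = observed - surrogate(theta)
    return -0.5 * (r / sigma) ** 2

# Basic random-walk Metropolis; DRAM augments this loop with delayed
# rejection and an adaptively tuned proposal covariance.
theta, samples = 5.0, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 0.5)
    if random.random() < math.exp(min(0.0, log_post(prop) - log_post(theta))):
        theta = prop
    samples.append(theta)

burned = samples[5000:]                  # discard burn-in
posterior_mean = sum(burned) / len(burned)
print(round(posterior_mean, 2))          # should land near the true value of 3
```

Because every posterior evaluation calls the cheap surrogate rather than the FEM, the 20,000 model runs that MCMC needs become affordable, which is precisely the motivation for the GPM in the abstract.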

  14. The frequency-agile radar: A multifunctional approach to remote sensing of the ionosphere

    NASA Astrophysics Data System (ADS)

    Tsunoda, R. T.; Livingston, R. C.; Buonocore, J. J.; McKinley, A. V.

    1995-09-01

    We introduce a new kind of diagnostic sensor that combines multifunctional measurement capabilities for ionospheric research. Multifunctionality is realized through agility in frequency selection over an extended band (1.5 to 50 MHz), system modularity, complete system control by software written in C, and a user-friendly computer interface. This sensor, which we call the frequency-agile radar (FAR), incorporates dual radar channels and an arbitrary waveform synthesizer that allows creative design of sophisticated waveforms as a means of increasing its sensitivity to weak signals while minimizing loss in radar resolution. The sensitivity of the FAR is determined by two sets of power amplifier modules: four 4-kW solid-state broadband amplifiers, and four 30-kW vacuum tube amplifiers. FAR control is by an AT-bus personal computer with on-line processing by a programmable array processor. The FAR does not simply house the separate functions of most radio sensors in use today; it provides convenient and flexible access to those functions as elements to be used in any combination. Some of the first new results obtained with the FAR during recent field campaigns are presented to illustrate its versatility. These include (1) the first detection of anomalous high-frequency (HF) reflections from a barium ion cloud, (2) the first evidence of unexpectedly large drifts and a shear north of the equatorial electrojet, (3) the first HF radar signature of a developing equatorial plasma bubble, and (4) the first measurements by a portable radar of altitude-extended, quasi-periodic backscatter from midlatitude sporadic E. We also mention the potential of the FAR for atmospheric remote sensing.

  15. Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application

    NASA Astrophysics Data System (ADS)

    Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.

    2013-12-01

    The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni time-series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and the local system to avoid data transfer delays. Analyses spanning 3, 6, 12, and 24 months of data were run on both the Cloud and the local system, and the processing times were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated based on an hourly rate, and the storage cost is calculated based on a rate per Gigabyte per month. Incoming data transfer is free, and data transfer out is charged per Gigabyte. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform had 38% better performance and cost 36% less than the local system. This investigation shows the potential of Cloud computing to increase system performance and lower the overall cost of system management.
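
The cost structure described above reduces to simple arithmetic. All rates below are illustrative placeholders, not actual AWS or GES DISC figures:

```python
# Back-of-envelope cost model of the kind described in the abstract.
# Every rate here is an invented placeholder for illustration only.
def cloud_monthly_cost(hours, hourly_rate, storage_gb, gb_month_rate,
                       egress_gb, egress_rate):
    compute = hours * hourly_rate          # instance-hours
    storage = storage_gb * gb_month_rate   # GB-months (ingress is free)
    egress = egress_gb * egress_rate       # data transferred out
    return compute + storage + egress

def local_monthly_cost(hardware_amortized, maintenance, operations):
    # Hardware/software purchase amortized to a monthly figure, plus
    # maintenance/updating and operating costs.
    return hardware_amortized + maintenance + operations

cloud = cloud_monthly_cost(hours=720, hourly_rate=0.10,
                           storage_gb=500, gb_month_rate=0.03,
                           egress_gb=200, egress_rate=0.09)
local = local_monthly_cost(hardware_amortized=60, maintenance=50, operations=40)
savings_pct = (local - cloud) / local * 100
print(f"cloud ${cloud:.2f}/mo vs local ${local:.2f}/mo ({savings_pct:.0f}% saved)")
```

Separating the rate terms this way makes it easy to see which component (compute, storage, or egress) dominates as workload parameters change.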

  16. Heterogeneous processes: Laboratory, field, and modeling studies

    NASA Technical Reports Server (NTRS)

    Poole, Lamont R.; Kurylo, Michael J.; Jones, Rod L.; Wahner, Andreas; Calvert, Jack G.; Leu, M.-T.; Fried, A.; Molina, Mario J.; Hampson, Robert F.; Pitts, M. C.

    1991-01-01

    The efficiencies of chemical families such as ClO(x) and NO(x) for altering the total abundance and distribution of stratospheric ozone are controlled by a partitioning between reactive (active) and nonreactive (reservoir) compounds within each family. Gas phase thermodynamics, photochemistry, and kinetics would dictate, for example, that only about 1 percent of the chlorine resident in the lower stratosphere would be in the form of active Cl or ClO, the remainder existing in the reservoir compounds HCl and ClONO2. The consistency of this picture was recently challenged by the recognition that important chemical transformations take place in polar regions, as revealed by two airborne campaigns: the Airborne Antarctic Ozone Experiment (AAOE) and the Airborne Arctic Stratospheric Expedition (AASE). Following the discovery of the Antarctic ozone hole, Solomon et al. suggested that the heterogeneous chemical reaction ClONO2(g) + HCl(s) → Cl2(g) + HNO3(s) could play a key role in converting chlorine from inactive forms into a species (Cl2) that would rapidly dissociate in sunlight to liberate atomic chlorine and initiate ozone depletion. The symbols (s) and (g) denote solid phase, or adsorbed onto a solid surface, and gas phase, respectively, and represent the approach by which such a reaction is modeled rather than the microscopic details of the reaction. The reaction was expected to be most important at altitudes where PSCs were most prevalent (10 to 25 km), thereby extending the altitude range over which chlorine compounds can efficiently destroy ozone from the 35 to 45 km region (where concentrations of active chlorine are usually highest) to lower altitudes where the ozone concentration is at its peak. This chapter will briefly review the current state of knowledge of heterogeneous processes in the stratosphere, emphasizing those results obtained since the World Meteorological Organization (WMO) conference. Sections are included on laboratory investigations of heterogeneous reactions, the

  17. Dynamic tumor tracking using the Elekta Agility MLC

    SciTech Connect

    Fast, Martin F.; Nill, Simeon; Bedford, James L.; Oelfke, Uwe

    2014-11-01

    Purpose: To evaluate the performance of the Elekta Agility multileaf collimator (MLC) for dynamic real-time tumor tracking. Methods: The authors have developed a new control software which interfaces to the Agility MLC to dynamically program the movement of individual leaves, the dynamic leaf guides (DLGs), and the Y collimators (“jaws”) based on the actual target trajectory. A motion platform was used to perform dynamic tracking experiments with sinusoidal trajectories. The actual target positions reported by the motion platform at 20, 30, or 40 Hz were used as shift vectors for the MLC in beams-eye-view. The system latency of the MLC (i.e., the average latency comprising target device reporting latencies and MLC adjustment latency) and the geometric tracking accuracy were extracted from a sequence of MV portal images acquired during irradiation for the following treatment scenarios: leaf-only motion, jaw + leaf motion, and DLG + leaf motion. Results: The portal imager measurements indicated a clear dependence of the system latency on the target position reporting frequency. Deducting the effect of the target frequency, the leaf adjustment latency was measured to be 38 ± 3 ms for a maximum target speed v of 13 mm/s. The jaw + leaf adjustment latency was 53 ± 3 ms at a similar speed. The system latency at a target position frequency of 30 Hz was in the range of 56–61 ms for the leaves (v ≤ 31 mm/s), 71–78 ms for the jaw + leaf motion (v ≤ 25 mm/s), and 58–72 ms for the DLG + leaf motion (v ≤ 59 mm/s). The tracking accuracy showed a similar dependency on the target position frequency and the maximum target speed. For the leaves, the root-mean-squared error (RMSE) was between 0.6–1.5 mm depending on the maximum target speed. For the jaw + leaf (DLG + leaf) motion, the RMSE was between 0.7–1.5 mm (1.9–3.4 mm). Conclusions: The authors have measured the latency and geometric accuracy of the Agility MLC, facilitating its future use for clinical
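
The relationship between latency, target speed, and tracking error reported above can be reproduced with a toy delayed-follower model. The amplitude, period, and latency values below are illustrative choices, not the measured Agility figures:

```python
import math

amplitude_mm = 10.0    # sinusoidal target amplitude (illustrative)
period_s = 4.0         # breathing-like period (illustrative)
latency_s = 0.060      # assumed total system latency (illustrative)

def target(t):
    return amplitude_mm * math.sin(2.0 * math.pi * t / period_s)

# Model the MLC as following the target with a pure delay equal to the
# system latency, then compute the RMSE over an integer number of periods.
dt, n = 0.01, 2000     # 20 s of samples = 5 full periods
errors = [target(i * dt) - target(i * dt - latency_s) for i in range(n)]
rmse = math.sqrt(sum(e * e for e in errors) / n)

# For a pure delay, the worst-case error is bounded by max speed * latency.
max_speed = amplitude_mm * 2.0 * math.pi / period_s   # mm/s
bound = max_speed * latency_s
print(f"RMSE = {rmse:.2f} mm, worst-case bound = {bound:.2f} mm")
```

This makes the abstract's scaling intuitive: both the RMSE and its bound grow linearly with latency and with target speed, which is why faster targets and slower position reporting both degrade tracking accuracy.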

  18. Modeling the Communication Process: The Map Is not the Territory.

    ERIC Educational Resources Information Center

    Bowman, Joel P.; Targowski, Andrew S.

    1987-01-01

    Presents a brief overview of the most significant models of the communication process, evaluates the communication models of the greatest relevance to business communication, and establishes a foundation for a new conception of that process. (JC)

  19. Development of a reburning boiler process model

    SciTech Connect

    Wu, K.T.

    1992-01-30

    The overall objective of this program is to integrate EER's expertise in boiler reburning performance evaluation into a package of analytical computer tools. Specific objectives of the program are to develop a computational capability with the following features: (1) can be used to predict the impact of gas reburning application on thermal conditions in the boiler radiant furnace, and on overall boiler performance; (2) can estimate gas reburning NOx reduction effectiveness based on specific reburning configurations and furnace/boiler configurations; (3) can be used as an analytical tool to evaluate the impact of boiler process parameters (e.g., fuel switching and changes in boiler operating conditions) on boiler thermal performance; (4) is adaptable to most boiler designs (tangential and wall fire boilers) and a variety of fuels (solid, liquid, gaseous and slurried fuels); (5) is sufficiently user friendly to be exercisable by engineers with a reasonable knowledge of boilers, and with reasonable computer skills. Here, "user friendly" means that the user will be guided by computer codes during the course of setting up individual input files for the boiler performance model.

  20. Agents: An approach for dynamic process modelling

    NASA Astrophysics Data System (ADS)

    Grohmann, Axel; Kopetzky, Roland; Lurk, Alexander

    1999-03-01

    With the growing amount of distributed and heterogeneous information and services, conventional information systems have come to their limits. This gave rise to the development of a Multi-Agent System (the "Logical Client") which can be used in complex information systems as well as in other advanced software systems. Computer agents are proactive, reactive and social. They form a community of independent software components that can communicate and co-operate in order to accomplish complex tasks. Thus the agent-oriented paradigm provides a new and powerful approach to programming distributed systems. The communication framework developed is based on standards like CORBA, KQML and KIF. It provides an embedded rule based system to find adequate reactions to incoming messages. The macro-architecture of the Logical Client consists of independent agents and uses artificial intelligence to cope with complex patterns of communication and actions. A set of system agents is also provided, including the Strategy Service as a core component for modelling processes at runtime, the Computer Supported Cooperative Work (CSCW) Component for supporting remote co-operation between human users and the Repository for managing and hiding the file based data flow in heterogeneous networks. This architecture seems to be capable of managing complexity in information systems. It is also being implemented in a complex simulation system that monitors and simulates the environmental radioactivity in the German state of Baden-Württemberg.
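
The rule-based reaction to incoming messages described above can be sketched as a small dispatcher. The performative names mimic KQML, but the agent class and its handlers are invented for illustration:

```python
# Minimal sketch of rule-based message handling. The performatives mimic
# KQML ("ask-one", "tell"); the Agent class and handlers are hypothetical.
class Agent:
    def __init__(self, name):
        self.name = name
        self.rules = {}    # performative -> handler function
        self.inbox = []

    def on(self, performative, handler):
        """Register a reaction rule for a message performative."""
        self.rules[performative] = handler

    def receive(self, message):
        self.inbox.append(message)

    def step(self):
        """Process queued messages by dispatching on their performative."""
        replies = []
        while self.inbox:
            msg = self.inbox.pop(0)
            handler = self.rules.get(msg["performative"])
            if handler:
                replies.append(handler(msg))
        return replies

# A toy "Repository"-style agent that answers queries about stored values.
repo = Agent("repository")
repo.on("ask-one", lambda m: {"performative": "tell",
                              "content": f"value-of:{m['content']}"})

repo.receive({"performative": "ask-one", "content": "dose-rate"})
print(repo.step())  # [{'performative': 'tell', 'content': 'value-of:dose-rate'}]
```

Each agent stays independent (it only sees its own inbox and rules), which is the property that lets such components be composed into the larger multi-agent architectures the abstract describes.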

  1. Systemic vulnerability model for coastal erosion processes

    NASA Astrophysics Data System (ADS)

    Greco, M.; Martino, G.; Guariglia, A.

    2010-09-01

    Many coastal areas constitute an extraordinary environmental and economic resource continuously exposed to unceasing transformation due to climatic and anthropic factors. Pressure-factor overloads amplify environmental degradation and reduce the economic rent of these territories, disrupting normal and anticipated community growth. This paper addresses the coastal erosion problem through a systemic vulnerability model and an environmental-indicators approach. Through the definition of an original indicator depending on the observed annual rate of coastal erosion and wave climate parameters, the approach allows scenario generation and provides a useful and powerful planning and management tool. The model has been applied to the test case of the Ionian coast of the Basilicata Region, located in the southern part of Italy in the middle of the Mediterranean basin. The littoral area consists of about 50 km of sandy shores with 5 river deltas. Looking at the shoreline in terms of displacements, the shift of a coastal stretch is a function of the grain size characteristics of the shore sands and of the wave climate. Therefore the selected index, which takes into account the energy stress affecting the shore area, characterizes the territorial system state and is aimed at vulnerability estimation, is defined through the maximum annual erosion, tE, and the surface-wave parameters (H, T) corresponding to wave-generated bottom orbital velocities higher than the critical velocity for incipient sediment transport. The resulting coefficient, defined from tE and the wave parameters, is dimensionless and represents the fraction of the available wave power dissipated by erosion processes. As the coefficient increases, system integrity decreases and system vulnerability increases. Available data, in terms of topographic/bathymetric information referred to the period 1873-2008, were utilized to derive tE by the use of a GIS

  2. Process modeling - Its history, current status, and future

    NASA Astrophysics Data System (ADS)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems associated with the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given of cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulation is found to reduce the costs and time associated with technological development when incorporated judiciously.

  3. CHEMICAL AND PHYSICAL PROCESS AND MECHANISM MODELING

    EPA Science Inventory

    The goal of this task is to develop and test chemical and physical mechanisms for use in the chemical transport models of EPA's Models-3. The target model for this research is the Community Multiscale Air Quality (CMAQ) model. These mechanisms include gas and aqueous phase ph...

  4. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    ERIC Educational Resources Information Center

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  5. Agile and dexterous robot for inspection and EOD operations

    NASA Astrophysics Data System (ADS)

    Handelman, David A.; Franken, Gordon H.; Komsuoglu, Haldun

    2010-04-01

    The All-Terrain Biped (ATB) robot is an unmanned ground vehicle with arms, legs and wheels designed to drive, crawl, walk and manipulate objects for inspection and explosive ordnance disposal tasks. This paper summarizes on-going development of the ATB platform. Control technology for semi-autonomous legged mobility and dual-arm dexterity is described as well as preliminary simulation and hardware test results. Performance goals include driving on flat terrain, crawling on steep terrain, walking on stairs, opening doors and grasping objects. Anticipated benefits of the adaptive mobility and dexterity of the ATB platform include increased robot agility and autonomy for EOD operations, reduced operator workload and reduced operator training and skill requirements.

  6. Agile and green manufacturing and super hard coated cutting tools

    SciTech Connect

    Chi-Hung Shen

    1995-12-31

    The paper discusses the global movement towards an agile and green manufacturing environment and its impact on high-volume producers such as the automotive industry. In the area of machining, two major shifts are envisioned: (1) proliferation of highly flexible CNC single-spindle machining centers to replace conventional dedicated transfer lines, and (2) implementation of "dry" machining systems with no or minimal use of machining fluids. In order to migrate towards these goals and still remain competitive and profitable, economically viable high-performance super hard coated cutting tools must be developed. Machining results with CVD diamond coated tools are presented to illustrate their current capabilities and limitations. Key areas for further research and development of super hard coated cutting tools are also discussed.

  7. Thrust Direction Optimization: Satisfying Dawn's Attitude Agility Constraints

    NASA Technical Reports Server (NTRS)

    Whiffen, Gregory J.

    2013-01-01

    The science objective of NASA's Dawn Discovery mission is to explore the giant asteroid Vesta and the dwarf planet Ceres, the two largest members of the main asteroid belt. Dawn successfully completed its orbital mission at Vesta. The Dawn spacecraft has complex, difficult to quantify, and in some cases severe limitations on its attitude agility. The low-thrust transfers between science orbits at Vesta required very complex time-varying thrust directions due to the strong and complex gravity and various science objectives. Traditional low-thrust design objectives (like minimum change in velocity or minimum transfer time) often result in thrust direction time evolutions that cannot be accommodated with the attitude control system available on Dawn. This paper presents several new optimal control objectives, collectively called thrust direction optimization, that were developed and proved essential to the successful navigation of Dawn at Vesta.

  8. Thrust Direction Optimization: Satisfying Dawn's Attitude Agility Constraints

    NASA Technical Reports Server (NTRS)

    Whiffen, Gregory J.

    2013-01-01

    The science objective of NASA's Dawn Discovery mission is to explore the two largest members of the main asteroid belt, the giant asteroid Vesta and the dwarf planet Ceres. Dawn successfully completed its orbital mission at Vesta. The Dawn spacecraft has complex, difficult to quantify, and in some cases severe limitations on its attitude agility. The low-thrust transfers between science orbits at Vesta required very complex time-varying thrust directions due to the strong and complex gravity and various science objectives. Traditional thrust design objectives (like minimum ΔV or minimum transfer time) often result in thrust direction time evolutions that cannot be accommodated with the attitude control system available on Dawn. This paper presents several new optimal control objectives, collectively called thrust direction optimization, that were developed and proved necessary to navigate Dawn successfully through all orbital transfers at Vesta.

  9. A new algorithm for agile satellite-based acquisition operations

    NASA Astrophysics Data System (ADS)

    Bunkheila, Federico; Ortore, Emiliano; Circi, Christian

    2016-06-01

    Taking advantage of the high manoeuvrability and accurate pointing of so-called agile satellites, an algorithm is described which allows efficient management of the operations concerning optical acquisitions. Fundamentally, the algorithm can be subdivided into two parts: in the first, it performs a geometric classification of the areas of interest and partitions these areas into stripes which develop along the optimal scan directions; in the second, it computes the succession of time windows in which the acquisition operations of the areas of interest are feasible, taking into consideration the potential restrictions associated with these operations and with the geometric and stereoscopic constraints. The results and performance of the proposed algorithm are presented and discussed for the case of periodic Sun-synchronous orbits.
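    The two-part structure described above can be sketched in a few lines of Python. Everything here (the stripe geometry, the minimum-slew-gap rule, all names and values) is an illustrative assumption, not the authors' algorithm:

    ```python
    def partition_into_stripes(area_width_km, swath_km):
        """Stage 1 (illustrative): cut the area of interest into parallel
        stripes no wider than the sensor swath, laid along the chosen
        optimal scan direction."""
        stripes, y = [], 0.0
        while y < area_width_km:
            stripes.append((y, min(y + swath_km, area_width_km)))
            y += swath_km
        return stripes

    def feasible_windows(candidate_windows, min_slew_gap_s):
        """Stage 2 (illustrative): keep only acquisition windows that leave
        enough time to re-point (slew) the satellite between consecutive
        stripe acquisitions."""
        kept, last_end = [], float("-inf")
        for start, end in sorted(candidate_windows):
            if start - last_end >= min_slew_gap_s:
                kept.append((start, end))
                last_end = end
        return kept
    ```

    The second stage here simply keeps windows greedily in time order; the published algorithm additionally accounts for the geometric and stereoscopic constraints mentioned in the abstract.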

  10. Frequency Agile Transceiver for Advanced Vehicle Data Links

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Macias, Filiberto; Cornelius, Harold

    2009-01-01

    Emerging and next-generation test instrumentation increasingly relies on network communication to manage complex and dynamic test scenarios, particularly for uninhabited autonomous systems. Adapting wireless communication infrastructure to accommodate challenging testing needs can benefit from reconfigurable radio technology. Frequency agility is one characteristic of reconfigurable radios that to date has seen only limited progress toward programmability. This paper provides an overview of an ongoing project to validate a promising chipset that performs conversion of RF signals directly into digital data for the wireless receiver and, for the transmitter, converts digital data into RF signals. The Software Configurable Multichannel Transceiver (SCMT) enables four transmitters and four receivers in a single unit, programmable for any frequency band between 1 MHz and 6 GHz.

  11. AGILE follow-up of the neutrino ICECUBE-160731 event

    NASA Astrophysics Data System (ADS)

    Lucarelli, F.; Pittori, C.; Verrecchia, F.; Piano, G.; Munar-Adrover, P.; Bulgarelli, A.; Fioretti, V.; Zoli, A.; Tavani, M.; Donnarumma, I.; Vercellone, S.; Minervini, G.; Striani, E.; Cardillo, M.; Gianotti, F.; Trifoglio, M.; Giuliani, A.; Mereghetti, S.; Caraveo, P.; Perotti, F.; Chen, A.; Argan, A.; Costa, E.; Del Monte, E.; Evangelista, Y.; Feroci, M.; Lazzarotto, F.; Lapshov, I.; Pacciani, L.; Soffitta, P.; Sabatini, S.; Vittorini, V.; Pucella, G.; Rapisarda, M.; Di Cocco, G.; Fuschino, F.; Galli, M.; Labanti, C.; Marisaldi, M.; Pellizzoni, A.; Pilia, M.; Trois, A.; Barbiellini, G.; Vallazza, E.; Longo, F.; Morselli, A.; Picozza, P.; Prest, M.; Lipari, P.; Zanello, D.; Cattaneo, P. W.; Rappoldi, A.; Colafrancesco, S.; Parmiggiani, N.; Ferrari, A.; Antonelli, A.; Giommi, P.; Salotti, L.; Valentini, G.; D'Amico, F.

    2016-08-01

    Following the GCN notice posted by the ICECUBE Collaboration on July 31, 2016, reporting the detection at T0=16/07/31 01:55:04 UT of a very high energy neutrino with reconstructed arrival direction pointing at RA, DEC (J2000)=(214.5440, -0.3347 [deg]) with a 90% containment radius of 45.00 arcmin (stat+sys), we searched for transient gamma-ray emission in the AGILE data above 100 MeV. Integrating over the 48 hours from 2016-07-29 02:00 UT to 2016-07-31 02:00 UT, a maximum likelihood analysis yields a possible detection at a significance level of about 3 sigma with a flux F(E > 100 MeV)=(1.5 +/- 0.7)x 10^-06 ph/cm^2/s within the GCN/AMON_ICECUBE_HESE notice error region.
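    A significance of "about 3 sigma" from a maximum likelihood analysis is conventionally derived from a likelihood-ratio test statistic. A minimal sketch, assuming Wilks' theorem with one extra free parameter (a generic illustration, not the AGILE analysis pipeline):

    ```python
    import math

    def detection_significance(loglike_source, loglike_null):
        """Approximate (pre-trials) significance of an excess from a
        likelihood-ratio test: TS = 2 * (lnL_source - lnL_null), with
        sigma ~ sqrt(TS) under Wilks' theorem for one additional free
        parameter. Negative TS (no excess) is clipped to zero."""
        ts = 2.0 * (loglike_source - loglike_null)
        return math.sqrt(max(ts, 0.0))
    ```

    For example, a source-model log-likelihood 4.5 higher than the null model gives TS = 9 and a significance of about 3 sigma, as quoted above.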

  12. Pulsar timing and the Fermi and AGILE missions

    NASA Astrophysics Data System (ADS)

    Johnston, Simon; Possenti, Andrea; Manchester, Dick; Hobbs, George; Keith, Michael; Romani, Roger W.; Thompson, David J.; Thorsett, Stephen; Roberts, Mallory; Weltevrede, Patrick

    2010-10-01

    We request time to observe 170 pulsars on a regular basis in order to provide accurate ephemerides necessary for the detection of gamma-ray pulsars with the Fermi and AGILE satellites. The main science goals are to increase the number of known gamma-ray pulsars (both radio loud and radio quiet), to determine accurate pulse profiles, to characterise their high energy spectra and phase resolved spectroscopy of the brightest pulsars. In the radio, the observations will also allow us to find glitches, characterise timing noise, investigate dispersion and rotation measure variability and enhance our knowledge of single pulse phenomenology. To date, we are (co-)authors on 27 papers arising from the collaboration and P574 data. The data have contributed to the PhD theses of Lucas Guillemot and Damien Parent from the Bordeaux Fermi group (submitted mid 2009) and Kyle Watters from Stanford.

  13. Pulsar timing and the Fermi and AGILE missions

    NASA Astrophysics Data System (ADS)

    Johnston, Simon; Possenti, Andrea; Manchester, Dick; Hobbs, George; Keith, Michael; Romani, Roger W.; Thompson, David J.; Thorsett, Stephen; Roberts, Mallory; Weltevrede, Patrick

    2010-04-01

    We request time to observe 170 pulsars on a regular basis in order to provide accurate ephemerides necessary for the detection of gamma-ray pulsars with the Fermi and AGILE satellites. The main science goals are to increase the number of known gamma-ray pulsars (both radio loud and radio quiet), to determine accurate pulse profiles, to characterise their high energy spectra and phase resolved spectroscopy of the brightest pulsars. In the radio, the observations will also allow us to find glitches, characterise timing noise, investigate dispersion and rotation measure variability and enhance our knowledge of single pulse phenomenology. To date, we are (co-)authors on 20 papers arising from the collaboration and P574 data. The data have contributed to the PhD theses of Lucas Guillemot and Damien Parent from the Bordeaux Fermi group (submitted mid 2009).

  14. Pulsar Timing and the Fermi and AGILE missions

    NASA Astrophysics Data System (ADS)

    Shannon, Ryan; Possenti, Andrea; Manchester, Dick; Johnston, Simon; Hobbs, George; Keith, Michael; Romani, Roger W.; Thompson, David J.; Roberts, Mallory; Weltevrede, Patrick; Kerr, Matthew; Petroff, Emily; Brook, Paul

    2013-10-01

    We request time to observe 170 pulsars on a regular basis in order to provide accurate ephemerides necessary for the detection of gamma-ray pulsars with the Fermi and AGILE satellites. The main science goals are to increase the number of known gamma-ray pulsars (both radio loud and radio quiet), to determine accurate pulse profiles, to characterise their high energy spectra and phase resolved spectroscopy of the brightest pulsars. In the radio, the observations will also allow us to find glitches, characterise timing noise, investigate dispersion and rotation measure variability and enhance our knowledge of single pulse phenomenology. To date, we are (co-)authors on 43 papers arising from the collaboration and P574 data. The data have contributed to the PhD theses of Lucas Guillemot and Damien Parent from the Bordeaux Fermi group and Kyle Watters from Stanford. Currently five students have active projects using the radio datasets.

  15. Pulsar Timing and the Fermi and AGILE missions

    NASA Astrophysics Data System (ADS)

    Shannon, Ryan; Possenti, Andrea; Manchester, Dick; Johnston, Simon; Hobbs, George; Keith, Michael; Romani, Roger W.; Thompson, David J.; Thorsett, Stephen; Roberts, Mallory; Weltevrede, Patrick

    2011-04-01

    We request time to observe 170 pulsars on a regular basis in order to provide accurate ephemerides necessary for the detection of gamma-ray pulsars with the Fermi and AGILE satellites. The main science goals are to increase the number of known gamma-ray pulsars (both radio loud and radio quiet), to determine accurate pulse profiles, to characterise their high energy spectra and phase resolved spectroscopy of the brightest pulsars. In the radio, the observations will also allow us to find glitches, characterise timing noise, investigate dispersion and rotation measure variability and enhance our knowledge of single pulse phenomenology. To date, we are (co-)authors on 27 papers arising from the collaboration and P574 data. The data have contributed to the PhD theses of Lucas Guillemot and Damien Parent from the Bordeaux Fermi group (submitted mid 2009) and Kyle Watters from Stanford.

  16. Pulsar Timing and the Fermi and AGILE missions

    NASA Astrophysics Data System (ADS)

    Shannon, Ryan; Possenti, Andrea; Manchester, Dick; Johnston, Simon; Hobbs, George; Keith, Michael; Romani, Roger W.; Thompson, David J.; Thorsett, Stephen; Roberts, Mallory; Weltevrede, Patrick

    2011-10-01

    We request time to observe 170 pulsars on a regular basis in order to provide accurate ephemerides necessary for the detection of gamma-ray pulsars with the Fermi and AGILE satellites. The main science goals are to increase the number of known gamma-ray pulsars (both radio loud and radio quiet), to determine accurate pulse profiles, to characterise their high energy spectra and phase resolved spectroscopy of the brightest pulsars. In the radio, the observations will also allow us to find glitches, characterise timing noise, investigate dispersion and rotation measure variability and enhance our knowledge of single pulse phenomenology. To date, we are (co-)authors on 27 papers arising from the collaboration and P574 data. The data have contributed to the PhD theses of Lucas Guillemot and Damien Parent from the Bordeaux Fermi group (submitted mid 2009) and Kyle Watters from Stanford.

  17. Pulsar Timing and the Fermi and AGILE missions

    NASA Astrophysics Data System (ADS)

    Shannon, Ryan; Possenti, Andrea; Manchester, Dick; Johnston, Simon; Hobbs, George; Keith, Michael; Romani, Roger W.; Thompson, David J.; Thorsett, Stephen; Roberts, Mallory; Weltevrede, Patrick

    2012-04-01

    We request time to observe 170 pulsars on a regular basis in order to provide accurate ephemerides necessary for the detection of gamma-ray pulsars with the Fermi and AGILE satellites. The main science goals are to increase the number of known gamma-ray pulsars (both radio loud and radio quiet), to determine accurate pulse profiles, to characterise their high energy spectra and phase resolved spectroscopy of the brightest pulsars. In the radio, the observations will also allow us to find glitches, characterise timing noise, investigate dispersion and rotation measure variability and enhance our knowledge of single pulse phenomenology. To date, we are (co-)authors on 27 papers arising from the collaboration and P574 data. The data have contributed to the PhD theses of Lucas Guillemot and Damien Parent from the Bordeaux Fermi group (submitted mid 2009) and Kyle Watters from Stanford.

  18. Pulsar Timing and the Fermi and AGILE missions

    NASA Astrophysics Data System (ADS)

    Shannon, Ryan; Possenti, Andrea; Manchester, Dick; Johnston, Simon; Hobbs, George; Keith, Michael; Romani, Roger W.; Thompson, David J.; Thorsett, Stephen; Roberts, Mallory; Weltevrede, Patrick

    2012-10-01

    We request time to observe 170 pulsars on a regular basis in order to provide accurate ephemerides necessary for the detection of gamma-ray pulsars with the Fermi and AGILE satellites. The main science goals are to increase the number of known gamma-ray pulsars (both radio loud and radio quiet), to determine accurate pulse profiles, to characterise their high energy spectra and phase resolved spectroscopy of the brightest pulsars. In the radio, the observations will also allow us to find glitches, characterise timing noise, investigate dispersion and rotation measure variability and enhance our knowledge of single pulse phenomenology. To date, we are (co-)authors on 37 papers arising from the collaboration and P574 data. The data have contributed to the PhD theses of Lucas Guillemot and Damien Parent from the Bordeaux Fermi group and Kyle Watters from Stanford.

  19. Pulsar Timing and the Fermi and AGILE missions

    NASA Astrophysics Data System (ADS)

    Shannon, Ryan; Possenti, Andrea; Manchester, Dick; Johnston, Simon; Hobbs, George; Keith, Michael; Romani, Roger W.; Thompson, David J.; Roberts, Mallory; Weltevrede, Patrick; Brook, Paul

    2013-04-01

    We request time to observe 170 pulsars on a regular basis in order to provide accurate ephemerides necessary for the detection of gamma-ray pulsars with the Fermi and AGILE satellites. The main science goals are to increase the number of known gamma-ray pulsars (both radio loud and radio quiet), to determine accurate pulse profiles, to characterise their high energy spectra and phase resolved spectroscopy of the brightest pulsars. In the radio, the observations will also allow us to find glitches, characterise timing noise, investigate dispersion and rotation measure variability and enhance our knowledge of single pulse phenomenology. To date, we are (co-)authors on 37 papers arising from the collaboration and P574 data. The data have contributed to the PhD theses of Lucas Guillemot and Damien Parent from the Bordeaux Fermi group and Kyle Watters from Stanford. Currently four students have active projects using the radio datasets.

  20. Enhanced detection of terrestrial gamma-ray flashes by AGILE

    NASA Astrophysics Data System (ADS)

    Marisaldi, M.; Argan, A.; Ursi, A.; Gjesteland, T.; Fuschino, F.; Labanti, C.; Galli, M.; Tavani, M.; Pittori, C.; Verrecchia, F.; D'Amico, F.; Østgaard, N.; Mereghetti, S.; Campana, R.; Cattaneo, P. W.; Bulgarelli, A.; Colafrancesco, S.; Dietrich, S.; Longo, F.; Gianotti, F.; Giommi, P.; Rappoldi, A.; Trifoglio, M.; Trois, A.

    2015-11-01

    At the end of March 2015 the onboard software configuration of the Astrorivelatore Gamma a Immagini Leggero (AGILE) satellite was modified in order to disable the veto signal of the anticoincidence shield for the minicalorimeter instrument. The motivation for this change was the understanding that the dead time induced by the anticoincidence prevented the detection of a large fraction of Terrestrial Gamma-Ray Flashes (TGFs). The configuration change was highly successful, resulting in an increase of one order of magnitude in the TGF detection rate. As expected, the largest fraction of the new events has short duration (<100 μs), and some of them are simultaneously associated with lightning sferics detected by the World Wide Lightning Location Network. The new configuration provides the largest TGF detection rate surface density (TGFs/km2/yr) to date, opening prospects for improved correlation studies with lightning and atmospheric parameters on short spatial and temporal scales along the equatorial region.

  1. A Speeded Item Response Model with Gradual Process Change

    ERIC Educational Resources Information Center

    Goegebeur, Yuri; De Boeck, Paul; Wollack, James A.; Cohen, Allan S.

    2008-01-01

    An item response theory model for dealing with test speededness is proposed. The model consists of two random processes, a problem solving process and a random guessing process, with the random guessing gradually taking over from the problem solving process. The involved change point and change rate are considered random parameters in order to…
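    The mixture described above, a problem-solving process gradually giving way to random guessing, can be sketched as a response-probability function. The Rasch form, the logistic change weight, and all parameter names are assumptions for illustration, not the authors' exact specification:

    ```python
    import math

    def p_correct(theta, b, position, change_point, change_rate, p_guess=0.25):
        """Illustrative speeded-test mixture: a Rasch solving probability
        is blended with random guessing, and the guessing weight rises
        logistically once the item position passes a change point."""
        p_solve = 1.0 / (1.0 + math.exp(-(theta - b)))  # problem-solving process
        # weight of the guessing process, growing with item position
        w_guess = 1.0 / (1.0 + math.exp(-change_rate * (position - change_point)))
        return (1.0 - w_guess) * p_solve + w_guess * p_guess
    ```

    Early in the test the response probability tracks ability; for items well past the change point it collapses toward the chance level p_guess.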

  2. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    ERIC Educational Resources Information Center

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  3. The Mathematical Models of the Periodical Literature Publishing Process.

    ERIC Educational Resources Information Center

    Guang, Yu; Daren, Yu; Yihong, Rong

    2000-01-01

    Describes two mathematical models of the periodical publishing process based on a theoretical analysis. Discusses the publishing process for periodical literature, explains the continuous model and the discrete model, presents partial differential equations, and demonstrates the adaptability and the validity of the models. (LRW)

  4. DESCRIPTION OF ATMOSPHERIC TRANSPORT PROCESSES IN EULERIAN AIR QUALITY MODELS

    EPA Science Inventory

    Key differences among many types of air quality models are the way atmospheric advection and turbulent diffusion processes are treated. Gaussian models use analytical solutions of the advection-diffusion equations. Lagrangian models use a hypothetical air parcel concept effecti...

  5. Modeling microbial processes in porous media

    NASA Astrophysics Data System (ADS)

    Murphy, Ellyn M.; Ginn, Timothy R.

    The incorporation of microbial processes into reactive transport models has generally proceeded along two separate lines of investigation: (1) transport of bacteria as inert colloids in porous media, and (2) the biodegradation of dissolved contaminants by a stationary phase of bacteria. Research over the last decade has indicated that these processes are closely linked. This linkage may occur when a change in metabolic activity alters the attachment/detachment rates of bacteria to surfaces, either promoting or retarding bacterial transport in a groundwater-contaminant plume. Changes in metabolic activity, in turn, are controlled by the time of exposure of the microbes to electron acceptors/donors and other components affecting activity. Similarly, metabolic activity can affect the reversibility of attachment, depending on the residence time of active microbes. Thus, improvements in quantitative analysis of active subsurface biota necessitate direct linkages between substrate availability, metabolic activity, growth, and attachment/detachment rates. This linkage requires both a detailed understanding of the biological processes and robust quantitative representations of these processes that can be tested experimentally. This paper presents an overview of current approaches used to represent physicochemical and biological processes in porous media, along with new conceptual approaches that link metabolic activity with partitioning of the microorganism between the aqueous and solid phases.

  7. Computer modeling of lung cancer diagnosis-to-treatment process

    PubMed Central

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick

    2015-01-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging, and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their applications in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181
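    A Markov chain model of such a diagnosis-to-treatment pathway can be sketched as follows; the states and transition probabilities below are invented for demonstration, not taken from the paper:

    ```python
    import random

    # Illustrative care-pathway states and transition probabilities; the
    # numbers are made up for demonstration purposes only.
    TRANSITIONS = {
        "referral":  [("diagnosis", 1.0)],
        "diagnosis": [("staging", 0.8), ("discharge", 0.2)],
        "staging":   [("treatment", 0.9), ("discharge", 0.1)],
        "treatment": [("discharge", 1.0)],
        "discharge": [],  # absorbing state
    }

    def simulate_pathway(rng):
        """Walk one simulated patient through the Markov chain until the
        absorbing 'discharge' state is reached; returns the visited states."""
        state, path = "referral", ["referral"]
        while TRANSITIONS[state]:
            r, acc = rng.random(), 0.0
            for nxt, p in TRANSITIONS[state]:
                acc += p
                if r < acc:
                    state = nxt
                    break
            path.append(state)
        return path
    ```

    Running many simulated patients through simulate_pathway gives empirical estimates of quantities (e.g. the fraction reaching treatment) that a closed-form Markov analysis yields exactly.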

  8. Verification of image processing based visibility models

    SciTech Connect

    Larson, S.M.; Cass, G.R.; Hussey, K.J.; Luce, F.

    1988-06-01

    Methods are presented for testing visibility models that use simulated photographs to display the results of model calculations. An experimental protocol is developed and used to obtain input data, including standard photographs of chosen scenes on a clear day and during a smog event at Pasadena, CA. With the clear-day photograph as a substrate, pollutant properties measured on the smoggy day are introduced into the visibility model, and the results of the model calculations are displayed as a synthetic photograph of the expected appearance of the smog event. Quantitative comparisons are made between the predicted and actual appearance of the smog event. The diagnostic techniques developed are applied to the visibility modeling procedure proposed by Malm et al. That model is shown to reproduce the contrast reduction characteristic of urban air pollution but produces synthetic photographs with sky elements that differ substantially from a real photograph of the actual smog event.

  9. Hot cheese: a processed Swiss cheese model.

    PubMed

    Li, Y; Thimbleby, H

    2014-01-01

    James Reason's classic Swiss cheese model is a vivid and memorable way to visualise how patient harm happens only when all system defences fail. Although Reason's model has been criticised for its simplicity and static portrait of complex systems, its use has been growing, largely because of the direct clarity of its simple and memorable metaphor. A more general, more flexible and equally memorable model of accident causation in complex systems is needed. We present the hot cheese model, which is more realistic, particularly in portraying defence layers as dynamic and active - more defences may cause more hazards. The hot cheese model, being more flexible, encourages deeper discussion of incidents than the simpler Swiss cheese model permits. PMID:24999771

  10. Modelling the Active Hearing Process in Mosquitoes

    NASA Astrophysics Data System (ADS)

    Avitabile, Daniele; Homer, Martin; Jackson, Joe; Robert, Daniel; Champneys, Alan

    2011-11-01

    A simple microscopic mechanistic model is described of the active amplification within the Johnston's organ of the mosquito species Toxorhynchites brevipalpis. The model is based on the description of the antenna as a forced-damped oscillator coupled to a set of active threads (ensembles of scolopidia) that provide an impulsive force when they twitch. This twitching is in turn controlled by channels that are opened and closed when the antennal oscillation reaches a critical amplitude. The model matches recent experiments both qualitatively and quantitatively. New results are presented using mathematical homogenization techniques to derive a mesoscopic model as a simple oscillator with nonlinear force and damping characteristics. It is shown that the results from this new model closely resemble those from the microscopic model as the number of threads approaches physiologically correct values.
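    The microscopic picture, a forced-damped oscillator plus threshold-triggered thread twitches, can be sketched as a time-stepped simulation. Every parameter value below is invented for illustration and is not taken from the paper:

    ```python
    import math

    def simulate_antenna(f_drive_hz, duration_s, dt=1e-6,
                         omega0=2 * math.pi * 400.0,  # assumed natural frequency
                         zeta=0.05,                   # assumed damping ratio
                         twitch_threshold=1e-9,       # displacement opening the channels
                         twitch_force=1e-6):          # assumed impulsive thread force
        """Qualitative sketch: a forced-damped oscillator receives an extra
        'twitch' force, aligned with its motion, once the oscillation
        exceeds a critical amplitude. Returns the peak displacement."""
        x = v = t = peak = 0.0
        while t < duration_s:
            force = 1e-3 * math.cos(2 * math.pi * f_drive_hz * t)  # acoustic drive
            if abs(x) > twitch_threshold:
                # active threads push in the direction of motion (amplification)
                force += math.copysign(twitch_force, v)
            a = force - 2 * zeta * omega0 * v - omega0 ** 2 * x
            v += a * dt          # semi-implicit Euler keeps the oscillator stable
            x += v * dt
            peak = max(peak, abs(x))
            t += dt
        return peak
    ```

    Driving near the assumed 400 Hz natural frequency produces a much larger response than driving far below it, as expected for a resonant antenna.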

  11. Modeling of Heating During Food Processing

    NASA Astrophysics Data System (ADS)

    Zheleva, Ivanka; Kamburova, Veselka

    Heat transfer processes are important for almost all aspects of food preparation and play a key role in determining food safety. Whether it is cooking, baking, boiling, frying, grilling, blanching, drying, sterilizing, or freezing, heat transfer is part of the processing of almost every food. Heat transfer is a dynamic process in which thermal energy is transferred from one body with higher temperature to another body with lower temperature. Temperature difference between the source of heat and the receiver of heat is the driving force in heat transfer.
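    The driving-force statement above is Fourier's law in its simplest form; a small worked example with assumed values:

    ```python
    def conduction_rate(k, area, t_hot, t_cold, thickness):
        """Fourier's law for steady conduction through a plane slab:
        Q = k * A * (T_hot - T_cold) / d. The temperature difference is
        the driving force; when it is zero, no heat flows."""
        return k * area * (t_hot - t_cold) / thickness

    # Worked example (assumed values): a 2 cm thick aluminium plate
    # (k ~ 205 W/m/K), 0.03 m^2 in area, with a 40 K temperature difference.
    q_watts = conduction_rate(205.0, 0.03, 180.0, 140.0, 0.02)  # ~12.3 kW
    ```

    Doubling the temperature difference doubles the heat flow; equal temperatures give zero, which is why heating stops once food and oven equilibrate.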

  12. Requirement Changes and Project Success: The Moderating Effects of Agile Approaches in System Engineering Projects

    NASA Astrophysics Data System (ADS)

    Maierhofer, Sabine; Stelzmann, Ernst; Kohlbacher, Markus; Fellner, Björn

    This paper reports the findings of an empirical study on the influence agile development methods exert on the success of projects. The goal is to determine whether agile methods can mitigate the negative effects requirement changes have on the performance of Systems Engineering projects, i.e. projects in which systems consisting of hardware and software are developed. Agile methods have been proven to successfully support development projects in the field of traditional software engineering, but with an ever-expanding market of integrated systems manufacturers, their usability for such complex projects has yet to be examined. This study focuses on 16 specific agile practices and their ability to improve the success of complex hardware and software projects.

  13. Renewed Gamma-Ray Emission from the blazar PKS 1510-089 Detected by AGILE

    NASA Astrophysics Data System (ADS)

    Munar-Adrover, P.; Pittori, C.; Bulgarelli, A.; Lucarelli, F.; Verrecchia, F.; Piano, G.; Fioretti, V.; Zoli, A.; Tavani, M.; Vercellone, S.; Minervini, G.; Striani, E.; Cardillo, M.; Gianotti, F.; Trifoglio, M.; Giuliani, A.; Mereghetti, S.; Caraveo, P.; Perotti, F.; Chen, A.; Argan, A.; Costa, E.; Del Monte, E.; Donnarumma, I.; Evangelista, Y.; Feroci, M.; Lazzarotto, F.; Lapshov, I.; Pacciani, L.; Soffitta, P.; Sabatini, S.; Vittorini, V.; Pucella, G.; Rapisarda, M.; Di Cocco, G.; Fuschino, F.; Galli, M.; Labanti, C.; Marisaldi, M.; Pellizzoni, A.; Pilia, M.; Trois, A.; Barbiellini, G.; Vallazza, E.; Longo, F.; Morselli, A.; Picozza, P.; Prest, M.; Lipari, P.; Zanello, D.; Cattaneo, P. W.; Rappoldi, A.; Colafrancesco, S.; Parmiggiani, N.; Ferrari, A.; Antonelli, A.; Giommi, P.; Salotti, L.; Valentini, G.; D'Amico, F.

    2016-09-01

    AGILE is currently detecting enhanced gamma-ray emission above 100 MeV from a source whose position is consistent with the blazar PKS 1510-089 (the last activity of this source was reported in ATel #9350).

  14. Agile Machining and Inspection Thrust Area Team - On-Machine Probing / Compatibility Assessment of Parametric Technology Corporation (PTC) Pro/CMM DMIS with Zeiss DMISEngine

    SciTech Connect

    Wade, James Rokwel; Tomlinson, Kurt; Bryce, Edwin Anthony

    2008-09-01

    The charter goal of the Agile Machining and Inspection Thrust Area Team is to identify technical requirements, within the nuclear weapons complex (NWC), for Agile Machining and Inspection capabilities. During FY 2008, the team identified Parametric Technology Corporation (PTC) Pro/CMM as a software tool for use in off-line programming of probing routines--used for measurement--for machining and turning centers. The probing routine would be used for in-process verification of part geometry. The same Pro/CMM program used on the machine tool could also be employed for program validation / part verification using a coordinate measuring machine (CMM). Funding was provided to determine the compatibility of the Pro/CMM probing program with CMM software (Zeiss DMISEngine).

  15. Comprehensive computational model for thermal plasma processing

    NASA Astrophysics Data System (ADS)

    Chang, C. H.

    A new numerical model is described for simulating thermal plasmas containing entrained particles, with emphasis on plasma spraying applications. The plasma is represented as a continuum multicomponent chemically reacting ideal gas, while the particles are tracked as discrete Lagrangian entities coupled to the plasma. The overall computational model is embodied in a new computer code called LAVA. Computational results are presented from a transient simulation of alumina spraying in a turbulent argon-helium plasma jet in air environment, including torch geometry, substrate, and multiple species with chemical reactions. Plasma-particle interactions including turbulent dispersion have been modeled in a fully self-consistent manner.

  16. Biomedical Simulation Models of Human Auditory Processes

    NASA Technical Reports Server (NTRS)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models were developed to explore the noise propagation mechanisms associated with the attenuation and transmission paths created when hearing protectors such as earplugs and headsets are used in high-noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data, which provide explicit external ear, ear canal, middle ear ossicular bone, and cochlea geometry. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics as well as the prioritization of noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for the exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in the development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  17. Modeling of dynamical processes in laser ablation

    SciTech Connect

    Leboeuf, J.N.; Chen, K.R.; Donato, J.M.; Geohegan, D.B.; Liu, C.L.; Puretzky, A.A.; Wood, R.F.

    1995-12-31

    Various physics and computational approaches have been developed to globally characterize phenomena important for film growth by pulsed-laser deposition of materials. These include thermal models of laser-solid target interactions that initiate the vapor plume, plume ionization and heating through laser absorption beyond local thermodynamic equilibrium mechanisms, hydrodynamic and collisional descriptions of plume transport, and molecular dynamics models of the interaction of plume particles with the deposition substrate.

  18. X-36 Tailless Fighter Agility Research Aircraft arrival at Dryden

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA and McDonnell Douglas Corporation (MDC) personnel remove protective covers from the newly arrived NASA/McDonnell Douglas Corporation X-36 Tailless Fighter Agility Research Aircraft. It arrived at NASA Dryden Flight Research Center, Edwards, California, on July 2, 1996. The NASA/Boeing X-36 Tailless Fighter Agility Research Aircraft program successfully demonstrated the tailless fighter design using advanced technologies to improve the maneuverability and survivability of possible future fighter aircraft. The program met or exceeded all project goals. For 31 flights during 1997 at the Dryden Flight Research Center, Edwards, California, the project team examined the aircraft's agility at low speed / high angles of attack and at high speed / low angles of attack. The aircraft's speed envelope reached up to 206 knots (234 mph). This aircraft was very stable and maneuverable. It handled very well. The X-36 vehicle was designed to fly without the traditional tail surfaces common on most aircraft. Instead, a canard forward of the wing was used as well as split ailerons and an advanced thrust-vectoring nozzle for directional control. The X-36 was unstable in both pitch and yaw axes, so an advanced, single-channel digital fly-by-wire control system (developed with some commercially available components) was put in place to stabilize the aircraft. Using a video camera mounted in the nose of the aircraft and an onboard microphone, the X-36 was remotely controlled by a pilot in a ground station virtual cockpit. A standard fighter-type head-up display (HUD) and a moving-map representation of the vehicle's position within the range in which it flew provided excellent situational awareness for the pilot. This pilot-in-the-loop approach eliminated the need for expensive and complex autonomous flight control systems and the risks associated with their inability to deal with unknown or unforeseen phenomena in flight. Fully fueled the X-36 prototype weighed approximately 1

  19. Model based control of polymer composite manufacturing processes

    NASA Astrophysics Data System (ADS)

    Potaraju, Sairam

    2000-10-01

    The objective of this research is to develop tools that help process engineers design, analyze, and control polymeric composite manufacturing processes to achieve higher productivity and cost reduction. Current techniques for process design and control of composite manufacturing suffer from the paucity of good process models that can accurately represent these non-linear systems. Existing models developed by researchers in the past are process- and operation-specific, so generating new simulation models is time consuming and requires significant effort. To address this issue, an Object Oriented Design (OOD) approach is used to develop a component-based model building framework. Process models for two commonly used industrial processes (Injected Pultrusion and Autoclave Curing) are developed using this framework to demonstrate its flexibility. Steady-state and dynamic validation of this simulator is performed using a bench-scale injected pultrusion process. This simulator could not be implemented online for control due to computational constraints. Models that are fast enough for online implementation, with nearly the same degree of accuracy, are developed using a two-tier scheme. First, lower-dimensional models that capture the essential resin flow, heat transfer, and cure kinetics important from a process monitoring and control standpoint are formulated. The second step is to reduce these low-dimensional models to Reduced Order Models (ROM) suited for online model-based estimation, control, and optimization. Model reduction is carried out using the Proper Orthogonal Decomposition (POD) technique in conjunction with a Galerkin formulation procedure. Subsequently, a nonlinear model-based estimation and inferential control scheme based on the ROM is implemented. In particular, this research work contributes in the following general areas: (1) Design and implementation of versatile frameworks for modeling and simulation of manufacturing processes using object
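    The POD-based model reduction step described in this abstract can be illustrated with a minimal sketch: the POD basis is taken from the SVD of a snapshot matrix, and a state is projected onto the leading modes. The snapshot data, dimensions, and 99% energy threshold below are synthetic stand-ins, not taken from the cited work.

    ```python
    import numpy as np

    # Synthetic snapshot matrix: each column is one "state" of the process
    # (a stand-in for a discretized temperature or cure field).
    rng = np.random.default_rng(0)
    n_states, n_snapshots = 200, 50
    snapshots = rng.standard_normal((n_states, n_snapshots)).cumsum(axis=1)

    # POD basis: left singular vectors of the snapshot matrix.
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)

    # Retain enough modes to capture 99% of the snapshot "energy".
    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, 0.99)) + 1
    basis = U[:, :r]                       # n_states x r reduced basis

    # Galerkin-style projection of a full state onto the reduced subspace,
    # then reconstruction back to the full space.
    x = snapshots[:, -1]
    x_reduced = basis.T @ x                # r coefficients
    x_approx = basis @ x_reduced           # rank-r reconstruction

    rel_err = np.linalg.norm(x - x_approx) / np.linalg.norm(x)
    print(r, rel_err)
    ```

    In a ROM the governing equations, not just the states, are projected onto this basis, yielding a small system of ordinary differential equations that is cheap enough for online estimation and control.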

  20. Strength and agility training in adolescents with Down syndrome: a randomized controlled trial.

    PubMed

    Lin, Hsiu-Ching; Wuang, Yee-Pay

    2012-01-01

    The purpose of this study was to investigate the effects of a proposed strength and agility training program for adolescents with Down syndrome. Ninety-two adolescents were recruited and evenly randomized to two intervention groups (exercise group vs. control group). The mean age was 10.6±3.2 years for the exercise group and 11.2±3.5 years for the control group. The exercise training program consisted of 5 min of treadmill exercise and one 20-min virtual-reality-based activity administered three times a week for 6 weeks. Pre- and post-test measures were taken for muscle strength and agility performance. The measured muscles included the hip extensors, hip flexors, knee extensors, knee flexors, hip abductors, and ankle plantarflexors. A handheld dynamometer was used to measure lower-extremity muscle strength, and agility performance was assessed with the strength and agility subtests of the Bruininks-Oseretsky Test of Motor Proficiency-Second Edition. After the 6-week intervention, the exercise group had significant improvements over the control group in agility (p=0.02, d=0.80) and in the strength of all muscle groups assessed (all p's<0.05, d=0.51-0.89). The knee muscle groups, both flexors and extensors, had the greatest gains among all the muscles measured. The short-term exercise training program used in this study is capable of improving the muscle strength and agility performance of adolescents with DS. PMID:22820064

  1. Three `C's of Agile Practice: Collaboration, Co-ordination and Communication

    NASA Astrophysics Data System (ADS)

    Sharp, Helen; Robinson, Hugh

    The importance of collaboration, co-ordination and communication in agile teams is often discussed and rarely disputed. These activities are supported through various practices including pairing, customer collaboration, stand-ups and the planning game. However, the mechanisms used to support these activities are sometimes more difficult to pin down. We have been studying agile teams for over a decade, and have found that story cards and the Wall are central to an agile team's activity, and that the information they hold and convey is crucial for supporting the team's collaboration and co-ordination activity. However, the information captured by these usually physical artefacts pertains mainly to progress rather than to functional dependencies. This latter information is fundamental to any software development, and in a non-agile environment is usually contained in detailed documentation not generally produced by an agile team. Instead, this information resides in the team's communication and social practices. In this chapter we discuss these three ‘C's of agile development and what we know about how they are supported through story cards and the Wall.

  2. AGILE/GRID Science Alert Monitoring System: The Workflow and the Crab Flare Case

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Conforti, V.; Parmiggiani, N.

    2013-10-01

    During the first five years of the AGILE mission we have observed many gamma-ray transients of Galactic and extragalactic origin. A fast reaction to unexpected transient events is a crucial part of the AGILE monitoring program, because the follow-up of astrophysical transients is a key point for this space mission. We present the workflow and the software developed by the AGILE Team to perform the automatic analysis for the detection of gamma-ray transients. In addition, an App for iPhone will be released enabling the Team to access the monitoring system through mobile phones. In 2010 September the science alert monitoring system presented in this paper recorded a transient phenomenon from the Crab Nebula, generating an automated alert sent via email and SMS two hours after the end of an AGILE satellite orbit, i.e. two hours after the Crab flare itself; for this discovery AGILE won the 2012 Bruno Rossi Prize. The alert system is designed for maximum reaction speed, and in this case, as in many others, AGILE has demonstrated that the speed of the monitoring system is crucial for the scientific return of the mission.

  3. Frequency-Agile Differential Cavity Ring-Down Spectroscopy

    NASA Astrophysics Data System (ADS)

    Reed, Zachary; Hodges, Joseph

    2015-06-01

    The ultimate precision of highly sensitive cavity-enhanced spectroscopic measurements is often limited by interferences (etalons) caused by weak coupled-cavity effects. Differential measurements of ring-down decay constants have previously been demonstrated to largely cancel these effects, but the measurement acquisition rates were relatively low [1,2]. We have previously demonstrated the use of frequency-agile rapid scanning cavity ring-down spectroscopy (FARS-CRDS) for acquisition of absorption spectra [3]. Here, the method of rapidly scanned, frequency-agile differential cavity ring-down spectroscopy (FADS-CRDS) is presented for reducing the effect of these interferences and other shot-to-shot statistical variations in measured decay times. To this end, an electro-optic phase modulator (EOM) with a bandwidth of 20 GHz is driven by a microwave source, generating pairs of sidebands on the probe laser. The optical resonator acts as a highly selective optical filter to all laser frequencies except for one tunable sideband. This sideband may be stepped arbitrarily from mode to mode of the ring-down cavity, at a rate limited only by the cavity buildup/decay time. The ability to probe any cavity mode across the EOM bandwidth enables a variety of methods for generating differential spectra. The differential mode spacing may be changed, and the effect of this method on suppressing the various coupled-cavity interactions present in the system is discussed. Alternatively, each mode may also be differentially referenced to a single point, providing immunity to temporal variations in the base losses of the cavity while allowing for conventional spectral fitting approaches. Differential measurements of absorption are acquired at 3.3 kHz and a minimum detectable absorption coefficient of 5×10^-12 cm^-1 in 1 s averaging time is achieved. 1. J. Courtois, K. Bielska, and J.T. Hodges, J. Opt. Soc. Am. B 30, 1486-1495, 2013. 2. H.F. Huang and K.K. Lehmann, Appl. Optics 49, 1378

  4. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    PubMed

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. Currently, neither an international standard nor a global, harmonized DFI process (DFIP) model exists. The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities in the number of processes, the scope, the hierarchical levels, and the concepts applied. This paper proposes a comprehensive model that harmonizes existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes, a contribution that should, together with the rest of the model, enable more efficient and effective DFI while ensuring the admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization. PMID:26258644

  5. Branching process in a stochastic extremal model

    NASA Astrophysics Data System (ADS)

    Manna, S. S.

    2009-08-01

    We considered a stochastic version of the Bak-Sneppen model (SBSM) of ecological evolution where the number M of sites mutated in a mutation event is restricted to only two. Here the mutation zone consists of only one site and this site is randomly selected from the neighboring sites at every mutation event in an annealed fashion. The critical behavior of the SBSM is found to be the same as the BS model in dimensions d=1 and 2. However on the scale-free graphs the critical fitness value is nonzero even in the thermodynamic limit but the critical behavior is mean-field like. Finally ⟨M⟩ has been made even smaller than two by probabilistically updating the mutation zone, which also shows the original BS model behavior. We conjecture that a SBSM on any arbitrary graph with any small branching factor greater than unity will lead to a self-organized critical state.
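    The M = 2 update rule described above (the minimal-fitness site plus one randomly chosen neighbour receive fresh random fitness values at each mutation event) can be sketched in a few lines. This is an illustrative toy implementation on a one-dimensional ring, not the authors' code.

    ```python
    import random

    def sbsm_step(fitness):
        """One mutation event of the stochastic Bak-Sneppen model (M = 2):
        the minimal-fitness site and ONE randomly chosen neighbour are
        assigned fresh random fitness values (periodic boundary)."""
        n = len(fitness)
        i = min(range(n), key=fitness.__getitem__)   # extremal site
        j = (i + random.choice((-1, 1))) % n         # annealed neighbour choice
        fitness[i] = random.random()
        fitness[j] = random.random()
        return i

    random.seed(1)
    sites = [random.random() for _ in range(100)]
    for _ in range(10_000):
        sbsm_step(sites)

    # After many events the fitness distribution develops a gap below a
    # critical threshold; here we just report the running minimum.
    print(min(sites))
    ```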

  6. Branching process in a stochastic extremal model.

    PubMed

    Manna, S S

    2009-08-01

    We considered a stochastic version of the Bak-Sneppen model (SBSM) of ecological evolution where the number M of sites mutated in a mutation event is restricted to only two. Here the mutation zone consists of only one site and this site is randomly selected from the neighboring sites at every mutation event in an annealed fashion. The critical behavior of the SBSM is found to be the same as the BS model in dimensions d=1 and 2. However on the scale-free graphs the critical fitness value is nonzero even in the thermodynamic limit but the critical behavior is mean-field like. Finally M has been made even smaller than two by probabilistically updating the mutation zone, which also shows the original BS model behavior. We conjecture that a SBSM on any arbitrary graph with any small branching factor greater than unity will lead to a self-organized critical state. PMID:19792102

  7. LISP based simulation generators for modeling complex space processes

    NASA Technical Reports Server (NTRS)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  8. IDENTIFICATION AND EVALUATION OF FUNDAMENTAL TRANSPORT AND TRANSFORMATION PROCESS MODELS

    EPA Science Inventory

    Chemical fate models require explicit algorithms for computing the effects of transformation and transport processes on the spatial and temporal distribution of chemical concentrations. Transport processes in aquatic systems are driven by physical characteristics on the system an...

  9. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  10. Sensory processing and world modeling for an active ranging device

    NASA Technical Reports Server (NTRS)

    Hong, Tsai-Hong; Wu, Angela Y.

    1991-01-01

    In this project, we studied world modeling and sensory processing for laser range data. World model data representations and operations were defined. Sensory processing algorithms for point processing and linear feature detection were designed and implemented. The interface between world modeling and sensory processing at the Servo and Primitive levels was investigated and implemented. At the Primitive level, linear feature detectors for edges were also implemented, analyzed, and compared. Existing world model representations are surveyed. Also presented is the design and implementation of the Y-frame model, a hierarchical world model. The interfaces between the world model module and the sensory processing module are discussed, as are the linear feature detectors that were designed and implemented.

  11. LAKE MICHIGAN MASS BALANCE: MODELING PROCESS

    EPA Science Inventory

    The Lake Michigan Mass Balance Study measured PCBs, mercury, trans-nonachlor, and atrazine in rivers, the atmosphere, sediments, lake water, and the food chain. A mathematical model will predict what effect reducing pollution will have on the lake, and its large fish (lake trout ...

  12. A Review on Mathematical Modeling for Textile Processes

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, R.

    2015-10-01

    A mathematical model is a powerful engineering tool for studying a variety of problems related to the design and development of products and processes, the optimization of manufacturing processes, understanding a phenomenon, and predicting a product's behaviour in actual use. Insight into the process and the use of appropriate mathematical tools are necessary for developing models. In the present paper, the types of models, the procedure followed in developing them, and their limitations are reviewed. Modeling techniques used in a few textile processes available in the literature are cited as examples.

  13. Statistical Inference for Point Process Models of Rainfall

    NASA Astrophysics Data System (ADS)

    Smith, James A.; Karr, Alan F.

    1985-01-01

    In this paper we develop maximum likelihood procedures for parameter estimation and model selection that apply to a large class of point process models that have been used to model rainfall occurrences, including Cox processes, Neyman-Scott processes, and renewal processes. The statistical inference procedures are based on the stochastic intensity λ(t) = lim_{s→0+} (1/s) E[N(t + s) - N(t) | N(u), u < t]. The likelihood function of a point process is shown to have a simple expression in terms of the stochastic intensity. The main result of this paper is a recursive procedure for computing stochastic intensities; the procedure is applicable to a broad class of point process models, including renewal Cox processes with Markovian intensity processes and an important class of Neyman-Scott processes. The model selection procedure we propose, which is based on likelihood ratios, allows direct comparison of two classes of point processes to determine which provides the better model for a given data set. The estimation and model selection procedures are applied to two data sets of simulated Cox process arrivals and a data set of daily rainfall occurrences in the Potomac River basin.
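    The stochastic intensity and the likelihood it induces take standard forms; for a point process observed on [0, T] with event times t_1 < … < t_{N(T)}, these can be written as:

    ```latex
    \lambda(t) = \lim_{s \downarrow 0} \frac{1}{s}\,
      E\bigl[N(t+s) - N(t) \,\big|\, N(u),\ u < t\bigr],
    \qquad
    L = \Biggl(\prod_{i=1}^{N(T)} \lambda(t_i)\Biggr)
        \exp\!\biggl(-\int_0^T \lambda(t)\,dt\biggr).
    ```

    The product term rewards placing intensity at the observed event times, while the exponential term penalizes total intensity elsewhere; the recursive procedure in the abstract supplies λ(t) for models where it is not available in closed form.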

  14. Diagnosing process faults using neural network models

    SciTech Connect

    Buescher, K.L.; Jones, R.D.; Messina, M.J.

    1993-11-01

    In order to be of use for realistic problems, a fault diagnosis method should have the following three features. First, it should apply to nonlinear processes. Second, it should not rely on extensive amounts of data regarding previous faults. Lastly, it should detect faults promptly. The authors present such a scheme for static (i.e., non-dynamic) systems. It involves using a neural network to create an associative memory whose fixed points represent the normal behavior of the system.
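    The fixed-point idea can be caricatured without a neural network: store patterns of normal behaviour and score a new measurement by its distance to the nearest stored pattern, flagging a fault when that distance is large. The sensor values below are hypothetical, and this nearest-pattern lookup is only a simplified stand-in for the associative memory described in the abstract.

    ```python
    import numpy as np

    # Hypothetical "normal" operating points of a static process
    # (each row: a vector of sensor readings under normal conditions).
    normal_patterns = np.array([
        [1.0, 0.50, 0.20],
        [0.9, 0.60, 0.25],
        [1.1, 0.45, 0.18],
    ])

    def fault_score(x, patterns):
        """Distance from the nearest stored normal pattern; a large value
        suggests the measurement is not explained by normal behaviour."""
        return float(np.min(np.linalg.norm(patterns - x, axis=1)))

    normal_reading = np.array([0.95, 0.55, 0.22])
    faulty_reading = np.array([2.0, -0.30, 0.90])

    print(fault_score(normal_reading, normal_patterns))  # small
    print(fault_score(faulty_reading, normal_patterns))  # large
    ```

    The associative-memory formulation improves on this lookup by interpolating between stored states, so that normal operating points not seen during training still land near a fixed point.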

  15. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  16. Animated-simulation modeling facilitates clinical-process costing.

    PubMed

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making. PMID:11552586

  17. An Integrated Model of Emotion Processes and Cognition in Social Information Processing.

    ERIC Educational Resources Information Center

    Lemerise, Elizabeth A.; Arsenio, William F.

    2000-01-01

    Interprets literature on contributions of social cognitive and emotion processes to children's social competence in the context of an integrated model of emotion processes and cognition in social information processing. Provides neurophysiological and functional evidence for the centrality of emotion processes in personal-social decision making.…

  18. Development and application of a process model for thermoplastic pultrusion

    NASA Astrophysics Data System (ADS)

    Astrom, B. T.

    A fundamental understanding of the effects of processing parameters and die geometry in a pultrusion process requires a mathematical model in order to minimize the number of necessary experiments. Previous investigators have suggested a variety of models for thermoset pultrusion, while comparatively little effort has been spent modelling its less well-understood thermoplastic counterpart. Herein, models to describe temperature and pressure distributions within a thermoplastic composite as it travels through the pultrusion line, as well as a model to calculate the accumulated pulling resistance from a pultrusion die, are presented. The predictions of the models are compared to experimentally obtained data in terms of composite temperature and pressure and process pulling force; the correlations between predictions and experimental data are found to be good, indicating the soundness of the models. The practical usefulness of the models in terms of die design and the effects of changes in processing parameters is demonstrated with examples.

  19. Agent-Based Modeling of Growth Processes

    ERIC Educational Resources Information Center

    Abraham, Ralph

    2014-01-01

    Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.

  20. Physics and Process Modeling (PPM) and Other Propulsion R and T. Volume 1; Materials Processing, Characterization, and Modeling; Lifting Models

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This CP contains the extended abstracts and presentation figures of 36 papers presented at the PPM and Other Propulsion R&T Conference. The focus of the research described in these presentations is on materials and structures technologies that are parts of the various projects within the NASA Aeronautics Propulsion Systems Research and Technology Base Program. These projects include Physics and Process Modeling; Smart, Green Engine; Fast, Quiet Engine; High Temperature Engine Materials Program; and Hybrid Hyperspeed Propulsion. Also presented were research results from the Rotorcraft Systems Program and work supported by the NASA Lewis Director's Discretionary Fund. Authors from NASA Lewis Research Center, industry, and universities conducted research in the following areas: material processing, material characterization, modeling, life, applied life models, design techniques, vibration control, mechanical components, and tribology. Key issues, research accomplishments, and future directions are summarized in this publication.

  1. Gaussian predictive process models for large spatial data sets

    PubMed Central

    Banerjee, Sudipto; Gelfand, Alan E.; Finley, Andrew O.; Sang, Huiyan

    2009-01-01

    With scientific data available at geocoded locations, investigators are increasingly turning to spatial process models for carrying out statistical inference. Over the last decade, hierarchical models implemented through Markov chain Monte Carlo methods have become especially popular for spatial modelling, given their flexibility and power to fit models that would be infeasible with classical methods as well as their avoidance of possibly inappropriate asymptotics. However, fitting hierarchical spatial models often involves expensive matrix decompositions whose computational complexity increases in cubic order with the number of spatial locations, rendering such models infeasible for large spatial data sets. This computational burden is exacerbated in multivariate settings with several spatially dependent response variables. It is also aggravated when data are collected at frequent time points and spatiotemporal process models are used. With regard to this challenge, our contribution is to work with what we call predictive process models for spatial and spatiotemporal data. Every spatial (or spatiotemporal) process induces a predictive process model (in fact, arbitrarily many of them). The latter models project process realizations of the former to a lower dimensional subspace, thereby reducing the computational burden. Hence, we achieve the flexibility to accommodate non-stationary, non-Gaussian, possibly multivariate, possibly spatiotemporal processes in the context of large data sets. We discuss attractive theoretical properties of these predictive processes. We also provide a computational template encompassing these diverse settings. Finally, we illustrate the approach with simulated and real data sets. PMID:19750209
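    The knot-based projection at the heart of a predictive process can be sketched numerically: the full covariance among observation locations is replaced by a low-rank surrogate built from a small set of knots. The one-dimensional layout, exponential covariance, and parameter values below are illustrative assumptions, not from the paper.

    ```python
    import numpy as np

    def exp_cov(a, b, phi=1.0, sigma2=1.0):
        """Exponential covariance between 1-D location sets a and b."""
        d = np.abs(a[:, None] - b[None, :])
        return sigma2 * np.exp(-phi * d)

    rng = np.random.default_rng(0)
    locs = np.sort(rng.uniform(0, 10, 500))   # many observation locations s
    knots = np.linspace(0, 10, 25)            # few knots s*

    C_sk = exp_cov(locs, knots)               # C(s, s*)
    C_kk = exp_cov(knots, knots)              # C(s*, s*)

    # Predictive-process covariance C(s, s*) C(s*, s*)^{-1} C(s*, s):
    # a rank-25 approximation of the full 500x500 covariance that only
    # ever requires solving a 25x25 system.
    C_pp = C_sk @ np.linalg.solve(C_kk, C_sk.T)

    C_full = exp_cov(locs, locs)
    print(C_pp.shape, np.max(np.abs(C_full - C_pp)))
    ```

    Because the projection is a conditional expectation, the surrogate never overstates the variance at any location, which is one of the theoretical properties the abstract alludes to.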

  2. Survey of Bayesian Models for Modelling of Stochastic Temporal Processes

    SciTech Connect

    Ng, B

    2006-10-12

    This survey gives an overview of popular generative models used in the modeling of stochastic temporal systems. In particular, this survey is organized into two parts. The first part discusses the discrete-time representations of dynamic Bayesian networks and dynamic relational probabilistic models, while the second part discusses the continuous-time representation of continuous-time Bayesian networks.

  3. Commissioning the CERN IT Agile Infrastructure with experiment workloads

    NASA Astrophysics Data System (ADS)

    Medrano Llamas, Ramón; Harald Barreiro Megino, Fernando; Kucharczyk, Katarzyna; Kamil Denis, Marek; Cinquilli, Mattia

    2014-06-01

    In order to ease the management of their infrastructure, most of the WLCG sites are adopting cloud-based strategies. CERN, the Tier 0 of the WLCG, is completely restructuring the resource and configuration management of its computing center under the codename Agile Infrastructure. Its goal is to manage 15,000 Virtual Machines by means of an OpenStack middleware in order to unify all the resources in CERN's two datacenters: the one in Meyrin and the new one in Wigner, Hungary. During the commissioning of this infrastructure, CERN IT is offering an attractive amount of computing resources to the experiments (800 cores for ATLAS and CMS) through a private cloud interface. ATLAS and CMS have joined forces to exploit them by running stress tests and simulation workloads since November 2012. This work describes the experience of the first deployments of the current experiment workloads on the CERN private cloud testbed. The paper is organized as follows: the first section explains the integration of the experiment workload management systems (WMS) with the cloud resources. The second section revisits the performance and stress testing performed with HammerCloud in order to evaluate and compare suitability for the experiment workloads. The third section goes deeper into dynamic provisioning techniques, such as the use of the cloud APIs directly by the WMS. The paper finishes with a review of the conclusions and the challenges ahead.

  4. a Photon Tag Calibration Beam for the Agile Satellite

    NASA Astrophysics Data System (ADS)

    Hasan, S.; Prest, M.; Foggetta, L.; Pontoni, C.; Mozzanica, A.; Barbiellini, G.; Basset, M.; Liello, F.; Longo, F.; Vallazza, E.; Buonomo, G.; Mazzitelli, G.; Quintieri, L.; Valente, P.; Boffelli, F.; Cattaneo, P.; Mauri, F.

    2006-04-01

The AGILE satellite will be launched in 2006 for the study of gamma rays in the energy range 30 MeV-50 GeV. The satellite has to be calibrated using gamma rays of known energy. The calibration facility is being developed at the Beam Test Facility (BTF) at the INFN Laboratories in Frascati. The photons are produced by bremsstrahlung of electrons with a maximum momentum of 750 MeV/c. The electrons are tagged using a dipole magnet whose internal walls are covered by microstrip silicon detectors: depending on the energy loss, they impinge on a different strip once the dipole current has been set to a given value. The correlation between the direction of the electron measured by a pair of x-y silicon chambers and the impinging position on the tagging module inside the magnet allows the tagging of the photon. The paper describes the calibration layout, the tests, and the results, compared with the Monte Carlo simulation in terms of production rate and energy resolution.

  5. MiniMAX: miniature, mobile, agile, x-ray system

    NASA Astrophysics Data System (ADS)

    Watson, Scott A.; Cunningham, Gwynneth; Gonzales, Samuel

    2012-06-01

We present a unique, lightweight, compact, low-cost, x-ray imager: MiniMAX (Miniature, Mobile, Agile, X-ray). This system, which exploits the best aspects of Computed Radiography (CR) and Digital Radiography (DR) technology, weighs less than 6 lbs, fits into a 6" diameter x 16" long carbon-fiber tube, and is constructed almost entirely from off-the-shelf components. MiniMAX is suitable for use in weld inspection, archaeology, homeland security, and veterinary medicine. While quantum limited for MeV radiography, the quantum efficiency is too low for routine medical use. Formats include 4"x6", 8"x12", or 16"x24", and images can be readily displayed on the camera back, using a pocket projector, or on a tablet computer. In contrast to a conventional, flying-spot scanner, MiniMAX records a photostimulated image from the entire phosphor at once using a bright, red LED flash filtered through an extremely efficient (OD>9) dichroic filter.

  6. Resonance versus aerodynamics for energy savings in agile natural flyers

    NASA Astrophysics Data System (ADS)

    Kok, Jia M.; Chahl, Javaan

    2014-03-01

Insects are the most diverse natural flyers in nature, being able to hover and perform agile manoeuvres. Dragonflies in particular are aggressive flyers, attaining accelerations of up to 4g. Flight in all insects requires that demanding aerodynamic and inertial loads be overcome. It has been proposed that resonance is a primary mechanism for reducing energy costs associated with flapping flight, by storing energy in an elastic thorax and releasing it on the following half-stroke. Certainly in insect flight motors dominated by inertial loads, such a mechanism would be extremely beneficial. However, in highly manoeuvrable, aerodynamically dominated flyers, such as the dragonfly, the use of elastic storage members requires further investigation. We show that employing resonant mechanisms in a real-world configuration produces minimal energy savings that are further reduced by 50 to 133% across the operational flapping frequency band of the dragonfly. Using a simple harmonic oscillator analysis to represent the dynamics of a dragonfly, we further demonstrate a reduction in manoeuvring limits of ~1.5 times for a system employing elastic mechanisms. This is in contrast to the potential power reductions of √2/2 from regulating aerodynamics via active wing articulation. Aerodynamic means of energy storage provides flexibility between an energy-efficient hover state and a manoeuvrable state capable of large accelerations. We conclude that active wing articulation is preferable to resonance for aerodynamically dominated natural flyers.
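The harmonic-oscillator argument above can be sketched numerically. The toy model below (an illustration, not the paper's actual analysis; all parameter values are arbitrary) treats the wing stroke as a driven oscillator and compares the actuator's mean positive power with a resonance-tuned spring versus no elastic element, assuming negative work cannot be recovered without storage.

```python
import numpy as np

# Illustrative sketch: wing stroke theta(t) = A*sin(w*t).  The actuator must
# supply inertia I*theta'' + linearized aero damping c*theta' + spring k*theta.
# I, c, A, w below are arbitrary nondimensional values, not dragonfly data.

def mean_positive_power(I, c, k, A, w, n=100_000):
    t = np.linspace(0.0, 2.0 * np.pi / w, n)
    dth = A * w * np.cos(w * t)                  # angular velocity
    ddth = -A * w**2 * np.sin(w * t)             # angular acceleration
    torque = I * ddth + c * dth + k * (A * np.sin(w * t))
    power = torque * dth
    # Without elastic storage the actuator cannot bank negative work,
    # so only positive power counts toward the energy cost.
    return float(np.mean(np.maximum(power, 0.0)))

I, c, A, w = 1.0, 0.1, 1.0, 1.0
p_stiff = mean_positive_power(I, c, k=I * w**2, A=A, w=w)  # spring at resonance
p_none = mean_positive_power(I, c, k=0.0, A=A, w=w)        # no spring
print(p_stiff, p_none)
```

With the spring tuned to resonance (k = Iω²) the inertial torque is cancelled and the actuator supplies only the aerodynamic damping power; without it, the inertial load dominates the cost, which is the inertially dominated regime where resonance pays off.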

  7. SBIR Grant:No-Vibration Agile Cryogenic Optical Refrigerator

    SciTech Connect

    Epstein, Richard

    2013-04-09

Optical refrigeration is currently the only all-solid-state cryocooling technology that has been demonstrated. Optical cryocoolers are devices that use laser light to cool small crystal or glass cooling elements. The cooling element absorbs the laser light and reradiates it at higher energy, an example of anti-Stokes fluorescence. The difference between the energy of the outgoing and incoming light comes from the thermal energy of the cooling element, which in turn becomes colder. Entitled No-Vibration Agile Cryocoolers using Optical Refrigeration, this Phase I proposal directly addressed the continued development of the optical refrigerator components necessary to transition this scientific breakthrough into National Nuclear Security Administration (NNSA) sensor applications in line with the objectives of topic 50b. ThermoDynamic Films LLC (TDF), in collaboration with the University of New Mexico (UNM), cooled an optical-refrigerator cooling element comprised of an ytterbium-doped yttrium lithium fluoride (Yb:YLF) crystal from room temperature to 123 K with about 2% efficiency. This is the world record in optical refrigeration and an important step toward revolutionizing cryogenic systems for sensor applications. During this period, they also designed and analyzed the crucial elements of a prototype optical refrigerator including the thermal link that connects the cooling element with the load.

  8. The AGILE-MCAL instrument as TGF monitor

    NASA Astrophysics Data System (ADS)

    Longo, F.; Marisaldi, M.; Fuschino, F.; Labanti, C.; Galli, M.

    2008-12-01

The Mini-Calorimeter (MCAL) detector on board the AGILE satellite was designed for astrophysical observations in the gamma-ray field, including cosmological Gamma-Ray Bursts (GRBs). Thanks to its flexible on-board trigger logic for transient events, since the beginning of its operation in April 2007 MCAL has also been able to detect very fast transient phenomena in the energy range from 0.3 up to several MeV. These exhibit the characteristics of Terrestrial Gamma-ray Flashes (TGFs). The candidate MCAL TGF events have durations ranging from a few hundred microseconds to a few milliseconds and reach up to MeV energies. For each triggered TGF a photon-by-photon history is recorded, with every photon described by its energy, position on the detector plane, and time (2 μs resolution). The on-ground data analysis strategy, based on more sophisticated algorithms than the on-board logic, allows discrimination between cosmological GRBs and TGFs and then supplies the spectral fluxes and the arrival area on the Earth, within a 2000 km circle, of the observed TGF. MCAL can then be used to increase the statistics of observed TGFs in the gamma-ray domain and eventually to correlate them with observations in other energy ranges. The MCAL TGF observation strategy, as well as a short catalogue of detected events, is described and discussed.

  9. Autonomous, agile micro-satellites and supporting technologies

    SciTech Connect

    Breitfeller, E; Dittman, M D; Gaughan, R J; Jones, M S; Kordas, J F; Ledebuhr, A G; Ng, L C; Whitehead, J C; Wilson, B

    1999-07-19

This paper updates the on-going effort at Lawrence Livermore National Laboratory to develop autonomous, agile micro-satellites (MicroSats). The objective of this development effort is to develop MicroSats weighing only a few tens of kilograms, that are able to autonomously perform precision maneuvers and can be used telerobotically in a variety of mission modes. The required capabilities include satellite rendezvous, inspection, proximity-operations, docking, and servicing. The MicroSat carries an integrated proximity-operations sensor-suite incorporating advanced avionics. A new self-pressurizing propulsion system utilizing a miniaturized pump and non-toxic mono-propellant hydrogen peroxide was successfully tested. This system can provide a nominal 25 kg MicroSat with 200-300 m/s delta-v including a warm-gas attitude control system. The avionics are based on the latest PowerPC processor using a CompactPCI bus architecture, which is modular, high-performance and processor-independent. This leverages commercial-off-the-shelf (COTS) technologies and minimizes the effects of future changes in processors. The MicroSat software development environment uses the VxWorks real-time operating system (RTOS) that provides a rapid development environment for integration of new software modules, allowing early integration and test. We summarize results of recent integrated ground flight testing of our latest non-toxic pumped propulsion MicroSat testbed vehicle operated on our unique dynamic air-rail.
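The quoted 200-300 m/s delta-v budget can be sanity-checked with the Tsiolkovsky rocket equation. The specific impulse assumed below (~150 s, typical of hydrogen peroxide monopropellant) is an illustrative assumption, not a figure from the paper.

```python
import math

# Back-of-envelope propellant budget for a 25 kg MicroSat using the
# Tsiolkovsky rocket equation: delta_v = Isp * g0 * ln(m0 / mf).
# The H2O2 Isp of 150 s is an assumption for illustration only.

G0 = 9.80665  # standard gravity, m/s^2

def propellant_mass(m0, delta_v, isp):
    """Propellant mass burned to achieve delta_v from initial mass m0 (kg)."""
    ve = isp * G0  # effective exhaust velocity, m/s
    return m0 * (1.0 - math.exp(-delta_v / ve))

for dv in (200.0, 300.0):
    print(dv, round(propellant_mass(25.0, dv, isp=150.0), 2))
```

Under these assumptions the 200-300 m/s budget corresponds to roughly 3-5 kg of propellant on a 25 kg vehicle, a plausible mass fraction for this class of spacecraft.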

  10. Clean, agile alternative binders, additives and plasticizers for propellant and explosive formulations

    SciTech Connect

    Hoffman, D.M.; Hawkins, T.W.; Lindsay, G.A.

    1994-12-01

As part of the Strategic Environmental Research and Development Program (SERDP), a clean, agile manufacturing of explosives, propellants and pyrotechnics (CANPEP) effort set about to identify new approaches to materials and processes for producing propellants, explosives and pyrotechnics (PEP). The RDX-based explosive PBXN-109 and gun propellant M-43 were identified as candidates for which waste minimization and recycling modifications might be implemented in a short time frame. The binders, additives and plasticizers subgroup identified cast non-curable thermoplastic elastomer (TPE) formulations as possible replacement candidates for these formulations. Paste extrudable explosives were also suggested as viable alternatives to PBXN-109. Commercial inert and energetic TPEs are reviewed. Biodegradable and hydrolyzable binders are discussed. The applicability of various types of explosive formulations is reviewed and some issues associated with implementation of recyclable formulations are identified. It is clear that some processing and weaponization modifications will need to be made if any of these approaches are to be implemented. The major advantage of the formulations suggested here over PBXN-109 and M-43 is their reuse/recyclability. Formulations using TPE or paste could be recovered from a generic bomb or propellant and reused if they met specification, or easily reprocessed and sold to the mining industry.

  11. Modified Invasion Percolation Models for Multiphase Processes

    SciTech Connect

    Karpyn, Zuleima

    2015-01-31

    This project extends current understanding and modeling capabilities of pore-scale multiphase flow physics in porous media. High-resolution X-ray computed tomography imaging experiments are used to investigate structural and surface properties of the medium that influence immiscible displacement. Using experimental and computational tools, we investigate the impact of wetting characteristics, as well as radial and axial loading conditions, on the development of percolation pathways, residual phase trapping and fluid-fluid interfacial areas.
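For orientation, the standard (unmodified) invasion-percolation algorithm that such models extend can be sketched in a few lines: each pore gets a random entry threshold, and the invading fluid always advances through the accessible pore with the smallest threshold. This is an illustrative sketch only; the project's modified variants add trapping, wettability and loading effects on real pore geometries.

```python
import heapq
import random

# Minimal standard invasion percolation on an n x n grid: inject along the
# left edge, repeatedly invade the frontier site with the lowest random
# threshold, and stop at breakthrough on the right edge.

def invade(n, seed=0):
    rng = random.Random(seed)
    thresh = [[rng.random() for _ in range(n)] for _ in range(n)]
    invaded = set()
    frontier = []  # min-heap of (threshold, i, j) candidate pores
    for i in range(n):  # inject along the left edge (column j = 0)
        heapq.heappush(frontier, (thresh[i][0], i, 0))
    while frontier:
        _, i, j = heapq.heappop(frontier)
        if (i, j) in invaded:
            continue
        invaded.add((i, j))
        if j == n - 1:  # breakthrough at the right edge
            return invaded
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and (a, b) not in invaded:
                heapq.heappush(frontier, (thresh[a][b], a, b))
    return invaded

cluster = invade(30)
print(len(cluster))  # number of invaded pores at breakthrough
```

The resulting cluster is ramified rather than compact, which is why invasion percolation reproduces the fingered displacement patterns seen in capillary-dominated flow.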

  12. Chemical kinetics models for semiconductor processing

    SciTech Connect

    Coltrin, M.E.; Creighton, J.R.; Meeks, E.; Grcar, J.F.; Houf, W.G.; Kee, R.J.

    1997-12-31

    Chemical reactions in the gas-phase and on surfaces are important in the deposition and etching of materials for microelectronic applications. A general software framework for describing homogeneous and heterogeneous reaction kinetics utilizing the Chemkin suite of codes is presented. Experimental, theoretical and modeling approaches to developing chemical reaction mechanisms are discussed. A number of TCAD application modules for simulating the chemically reacting flow in deposition and etching reactors have been developed and are also described.
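Reaction mechanisms of the kind handled by the Chemkin suite express each rate coefficient in the modified Arrhenius form k = A·T^b·exp(-Ea/RT). The parameters in the sketch below are invented for illustration, not taken from any real deposition or etching mechanism.

```python
import math

# Modified Arrhenius rate coefficient, k = A * T**b * exp(-Ea / (R*T)),
# the standard form for Chemkin-style gas-phase and surface mechanisms.
# A, b, Ea below are illustrative placeholders.

R = 8.314  # gas constant, J/(mol K)

def arrhenius(T, A, b, Ea):
    """Rate coefficient at temperature T (K); Ea in J/mol."""
    return A * T**b * math.exp(-Ea / (R * T))

k_900 = arrhenius(900.0, A=1.0e13, b=0.0, Ea=150e3)
k_1100 = arrhenius(1100.0, A=1.0e13, b=0.0, Ea=150e3)
print(k_900, k_1100)  # the rate rises steeply with temperature
```

The strong temperature sensitivity this form encodes is what couples the reactor's thermal field to deposition and etch rates in the TCAD modules described above.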

  13. A Garbage Can Model of the Psychological Research Process.

    ERIC Educational Resources Information Center

    Martin, Joanne

    1981-01-01

    Reviews models commonly used in psychological research, and, particularly, in organizational decision making. An alternative model of organizational decision making is suggested. The model, referred to as the garbage can model, describes a process in which members of an organization collect the problems and solutions they generate by dumping them…

  14. A Software Development Simulation Model of a Spiral Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.
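The idea of simulating a spiral process can be illustrated with a toy Monte Carlo sketch: repeated cycles of planning, risk analysis, build and evaluation with random phase durations, replicated to estimate a schedule distribution. This is not the paper's model; all phase durations, the growth factor and the cycle count are invented for illustration.

```python
import random

# Toy schedule simulation of a spiral process: each cycle runs four phases
# with normally distributed durations, and the "build" phase grows in later
# cycles as more functionality is added.  All numbers are illustrative.

PHASES = {"plan": 5.0, "risk": 3.0, "build": 15.0, "evaluate": 4.0}

def spiral_schedule(cycles, rng, growth=0.5):
    """Total schedule (arbitrary time units) for one simulated project."""
    total = 0.0
    for c in range(cycles):
        for phase, mean in PHASES.items():
            if phase == "build":
                mean = mean * (1.0 + growth * c)  # later builds are larger
            total += rng.gauss(mean, 0.1 * mean)
    return total

rng = random.Random(42)
runs = [spiral_schedule(3, rng) for _ in range(1000)]
mean_schedule = sum(runs) / len(runs)
print(mean_schedule)  # mean schedule estimate across 1000 replications
```

Replicating the simulation many times is what lets this style of model report schedule risk (a distribution) rather than a single point estimate, which is the advantage the paper attributes to simulating spiral development.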

  15. Modeling of Inner Magnetosphere Coupling Processes

    NASA Technical Reports Server (NTRS)

    Khazanov, George V.

    2011-01-01

The Ring Current (RC) is the biggest energy player in the inner magnetosphere. It is the source of free energy for Electromagnetic Ion Cyclotron (EMIC) wave excitation provided by a temperature anisotropy of RC ions, which develops naturally during inward E×B convection from the plasmasheet. The cold plasmasphere, which is under the strong influence of the magnetospheric electric field, strongly mediates the RC-EMIC wave-particle-coupling process and ultimately becomes part of the particle and energy interplay. On the other hand, there is a strong influence of the RC on the inner magnetospheric electric and magnetic field configurations and these configurations, in turn, are important to RC dynamics. Therefore, one of the biggest needs for inner magnetospheric research is the continued progression toward a coupled, interconnected system with the inclusion of nonlinear feedback mechanisms between the plasma populations, the electric and magnetic fields, and plasma waves. As we clearly demonstrated in our studies, EMIC waves strongly interact with electrons and ions of energies ranging from approx. 1 eV to approx. 10 MeV, and these waves strongly affect the dynamics of resonant RC ions, thermal electrons and ions, and the outer RB relativistic electrons. As we found, the rate of ion and electron scattering/heating in the Earth's magnetosphere is not only controlled by the wave intensity-spatial-temporal distribution but also strongly depends on the spectral distribution of the wave power. The latter is also a function of the plasmaspheric heavy ion content, and the plasma density and temperature distributions along the magnetic field lines. The above discussion places RC-EMIC wave coupling dynamics in context with inner magnetospheric coupling processes and, ultimately, relates RC studies with plasmaspheric and Superthermal Electrons formation processes as well as with outer RB physics.

  16. Animal models for information processing during sleep.

    PubMed

    Coenen, A M L; Drinkenburg, W H I M

    2002-12-01

    Information provided by external stimuli does reach the brain during sleep, although the amount of information is reduced during sleep compared to wakefulness. The process controlling this reduction is called 'sensory' gating and evidence exists that the underlying neurophysiological processes take place in the thalamus. Furthermore, it is clear that stimuli given during sleep can alter the functional state of the brain. Two factors have been shown to play a crucial role in causing changes in the sleeping brain: the intensity and the relevance of the stimulus. Intensive stimuli arouse the brain, as well as stimuli having a high informational impact on the sleeping person. The arousal threshold for important stimuli is quite low compared to neutral stimuli. A central question in sleep research is whether associative learning, or in other words the formation of new associations between stimuli, can take place in a sleeping brain. It has been shown that simple forms of learning are still possible during sleep. In sleeping rats, it is proven that habituation, an active, simple form of learning not to respond to irrelevant stimuli, can occur. Moreover, there is evidence for the view that more complex associations can be modulated and newly formed during sleep. This is shown by two experimental approaches: an extinction paradigm and a latent inhibition (pre-exposure) paradigm. The presentation of non-reinforced stimuli during sleep causes slower extinction compared to the same presentation of these stimuli during wakefulness. Consistently, the suppressive capacity of a stimulus in the latent inhibition paradigm is less when previously pre-exposed during sleep, as compared to pre-exposure during wakefulness. Thus, while associative learning is not completely blocked during sleep, aspects of association formation are clearly altered. However, animal studies also clearly indicate that complex forms of learning are not possible during sleep. It is hypothesised that this

  17. Beyond simple linear mixing models: process-based isotope partitioning of ecological processes.

    PubMed

    Ogle, Kiona; Tucker, Colin; Cable, Jessica M

    2014-01-01

    Stable isotopes are valuable tools for partitioning the components contributing to ecological processes of interest, such as animal diets and trophic interactions, plant resource use, ecosystem gas fluxes, streamflow, and many more. Stable isotope data are often analyzed with simple linear mixing (SLM) models to partition the contributions of different sources, but SLM models cannot incorporate a mechanistic understanding of the underlying processes and do not accommodate additional data associated with these processes (e.g., environmental covariates, flux data, gut contents). Thus, SLM models lack predictive ability. We describe a process-based mixing (PBM) model approach for integrating stable isotopes, other data sources, and process models to partition different sources or process components. This is accomplished via a hierarchical Bayesian framework that quantifies multiple sources of uncertainty and enables the incorporation of process models and prior information to help constrain the source-specific proportional contributions, thereby potentially avoiding identifiability issues that plague SLM models applied to "too many" sources. We discuss the application of the PBM model framework to three diverse examples: temporal and spatial partitioning of streamflow, estimation of plant rooting profiles and water uptake profiles (or water sources) with extension to partitioning soil and ecosystem CO2 fluxes, and reconstructing animal diets. These examples illustrate the advantages of the PBM modeling approach, which facilitates incorporation of ecological theory and diverse sources of information into the mixing model framework, thus enabling one to partition key process components across time and space. PMID:24640543
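The simple linear mixing (SLM) baseline that the PBM framework generalizes reduces, for two sources, to a one-line mass balance. The isotope values below are illustrative numbers for a streamflow-partitioning example, not data from the paper.

```python
# Two-source simple linear mixing: d_mix = f*d_a + (1 - f)*d_b,
# solved for f, the proportional contribution of source A.

def two_source_fraction(d_mix, d_a, d_b):
    """Fraction of source A in the mixture (isotope delta values, permil)."""
    return (d_mix - d_b) / (d_a - d_b)

# Illustrative streamflow partitioning: event water (-12.0 permil) vs
# pre-event groundwater (-8.0 permil), with the stream measured at -9.0.
f = two_source_fraction(-9.0, -12.0, -8.0)
print(f)  # 0.25 -> 25% event water
```

With more than two sources the same mass balance becomes underdetermined, which is exactly the identifiability problem the abstract notes for SLM models applied to "too many" sources, and which the PBM approach addresses with process models and priors.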

  18. A variability study of the first catalog of gamma-ray sources on the 2.3 years AGILE data archive

    NASA Astrophysics Data System (ADS)

    Verrecchia, Francesco; Pittori, C.; Bulgarelli, A.; Chen, A. W.; Tavani, M.; Giommi, P.; AGILE-Collaboration

AGILE pointed observations performed from July 9, 2007 to October 30, 2009 are a recent, high quality gamma-ray data archive for monitoring studies of medium to high brightness gamma-ray sources in the 30 MeV-50 GeV energy range. We present a variability study of the 1AGL sources over the complete AGILE pointed Observation Blocks (OBs) dataset. The first AGILE Gamma-Ray Imaging Detector (GRID) catalog (Pittori et al. 2009) included a significance-limited (4 sigma) sample of 47 sources (1AGL), detected with a conservative analysis over a first-year non-uniform sky coverage dataset. In this analysis we used data from an improved full Field of View (FOV) event filter, on a much larger (about 27.5 months) observation dataset, analyzing each OB separately. This data processing resulted in an improved source list as compared to the 1AGL one. We present results on the variability of several of these sources.

  19. Denitrification as a Model Chemical Process

    NASA Astrophysics Data System (ADS)

    Grguric, Gordan

    2002-02-01

    Bacterial denitrification in seawater facilities such as aquaria and mariculture systems is a process particularly well suited for illustrating important concepts in chemistry to undergraduates. Students can gain firsthand experience related to these concepts, for example by (i) analyzing and quantifying chemical reactions based on empirical data, (ii) employing stoichiometry and mass balance to determine the amounts of reactants required and products produced in a chemical reaction, and (iii) using acid-base speciation diagrams and other information to quantify the changes in pH and carbonic acid speciation in an aqueous medium. At the Richard Stockton College of New Jersey, we have utilized actual data from several seawater systems to discuss topics such as stoichiometry, mass and charge balance, and limiting reagents. This paper describes denitrification in closed seawater systems and how the process can be used to enhance undergraduate chemistry education. A number of possible student exercises are described that can be used as practical tools to enhance the students' quantitative understanding of chemical reactions.
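A stoichiometry exercise of the kind described can be worked directly from a balanced denitrification reaction. The methanol-driven form used below is a common textbook choice of carbon source, picked here for illustration; the paper's seawater systems may involve different electron donors.

```python
# Mass-balance exercise from the balanced methanol-driven denitrification
# reaction:  5 CH3OH + 6 NO3- -> 3 N2 + 5 CO2 + 7 H2O + 6 OH-
# (an illustrative carbon source, not necessarily the one in the paper).

M_CH3OH = 32.04   # molar mass of methanol, g/mol
M_N = 14.007      # molar mass of nitrogen (the N in nitrate), g/mol

def methanol_per_gram_nitrate_N():
    """Grams of methanol consumed per gram of nitrate-N denitrified."""
    return (5 * M_CH3OH) / (6 * M_N)

print(round(methanol_per_gram_nitrate_N(), 2))  # ~1.91 g MeOH per g N
```

The hydroxide produced on the right-hand side is also the entry point for the pH and carbonic acid speciation questions the abstract mentions: each mole of nitrate reduced adds a mole of alkalinity to the closed system.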

  20. A Point Process Model of Summer Season Rainfall Occurrences

    NASA Astrophysics Data System (ADS)

    Smith, James A.; Karr, Alan F.

    1983-02-01

    A point process model of summer season rainfall occurrences is developed. The model, which is termed an RCM process, is a member of the family of Cox processes (Poisson processes for which the rate of occurrence of events varies randomly over time). Model development is based on counts and interarrival time statistics estimated from Potomac River basin rainfall data. The counting parameters used are the conditional intensity function, index of dispersion, and counts spectrum; the interarrival time parameters are the coefficient of variation and the autocorrelation function. Explicit results are presented for the counts and interarrival time parameters of RCM processes. Of particular importance in this paper is the interpretation of clustering suggested by the form of the RCM process. For the RCM process the rate of occurrence alternates between two states, one of which is 0, the other positive. During periods when the intensity is 0, no events can occur. The form of the intensity process suggests that clustering of summer season rainfall occurrences in the Potomac River basin results from the alternation of wet and dry periods. Computational results are presented for two extensions of the RCM process model of rainfall occurrences: a marked RCM process model of rainfall occurrences and associated storm depths and a bivariate RCM process model of rainfall occurrences at two sites.
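The two-state intensity structure described above (a Cox process whose rate alternates between 0 and a positive value) can be simulated directly. The sketch below is a generic on/off-modulated Poisson simulation with illustrative rates, not the fitted Potomac basin model.

```python
import random

# Simulate a two-state Cox process: the occurrence rate alternates between
# 0 ("dry" state) and lam_event ("wet" state); events arrive as a Poisson
# process only while wet.  State holding times are exponential.
# All rates are arbitrary illustrative values.

def simulate_rcm(t_end, lam_event, rate_wet_to_dry, rate_dry_to_wet, seed=1):
    rng = random.Random(seed)
    t, wet, events = 0.0, False, []
    while t < t_end:
        if wet:
            # Race between the next state flip and the next occurrence;
            # both are exponential, so the process is memoryless.
            t_flip = rng.expovariate(rate_wet_to_dry)
            t_evt = rng.expovariate(lam_event)
            if t_evt < t_flip:
                t += t_evt
                if t < t_end:
                    events.append(t)
            else:
                t += t_flip
                wet = False
        else:
            t += rng.expovariate(rate_dry_to_wet)  # wait out the dry spell
            wet = True
    return events

evts = simulate_rcm(t_end=1000.0, lam_event=2.0,
                    rate_wet_to_dry=0.5, rate_dry_to_wet=0.5)
print(len(evts))  # occurrences arrive in bursts during wet spells
```

Plotting interarrival times from such a run shows the clustering the paper attributes to alternating wet and dry periods: short gaps within wet spells and long gaps spanning dry ones.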