Science.gov

Sample records for agile process model

  1. Planning and scheduling for agile manufacturers: The Pantex Process Model

    SciTech Connect

    Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Turnquist, M.A.

    1998-02-01

    Effective use of resources that are shared among multiple products or processes is critical for agile manufacturing. This paper describes the development and implementation of a computerized model to support production planning in a complex manufacturing system at the Pantex Plant, a US Department of Energy facility. The model integrates two different production processes (nuclear weapon disposal and stockpile evaluation) that use common facilities and personnel at the plant. The two production processes are characteristic of flow-shop and job shop operations. The model reflects the interactions of scheduling constraints, material flow constraints, and the availability of required technicians and facilities. Operational results show significant productivity increases from use of the model.
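
    Illustrative sketch: the kind of shared-resource constraint the abstract describes can be written as a small linear program in which two processes compete for technician-hours and facility-hours. The Python/PuLP code below is a generic toy with assumed names and numbers, not the Pantex Process Model.

      # Toy shared-resource planning model (illustrative only, not the Pantex Process Model).
      # Two processes (disposal and evaluation) compete for technician-hours and bay-hours.
      from pulp import LpMaximize, LpProblem, LpVariable, value

      prob = LpProblem("shared_resource_plan", LpMaximize)

      # Units of each process scheduled in the planning period (assumed integer variables).
      disposal = LpVariable("disposal_units", lowBound=0, cat="Integer")
      evaluation = LpVariable("evaluation_units", lowBound=0, cat="Integer")

      # Objective: maximize total units completed (real weights would reflect priorities).
      prob += disposal + evaluation

      # Assumed per-unit demands on the shared resources against weekly capacities.
      prob += 12 * disposal + 20 * evaluation <= 400, "technician_hours"
      prob += 8 * disposal + 6 * evaluation <= 240, "facility_bay_hours"

      prob.solve()
      print("disposal:", value(disposal), "evaluation:", value(evaluation))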

  2. Clean, Agile Processing Technology.

    DTIC Science & Technology

    1997-12-01

    Research letter dated 10 Jun 98. Final report: Clean, Agile Processing Technology. Contract # N00014-96-C-0139. PI: S. W. Sinton...Agile Processing Technology. Sinton, S. W...This document is requested by the Canadian Department

  3. Agile

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2013-01-01

    This is based on a previous talk on agile development. Methods for delivering software on a short cycle are described, including interactions with the customer, the effect on the team, and how to be more effective, streamlined, and efficient.

  4. Pilot users in agile development processes: motivational factors.

    PubMed

    Johannessen, Liv Karen; Gammon, Deede

    2010-01-01

    Despite a wealth of research on user participation, few studies offer insights into how to involve multi-organizational users in agile development methods. This paper is a case study of user involvement in developing a system for electronic laboratory requisitions using agile methodologies in a multi-organizational context. Building on an interpretive approach, we illuminate questions such as: How does collaboration between users and developers evolve and how might it be improved? What key motivational aspects are at play when users volunteer and continue contributing in the face of considerable added burdens? The study highlights how agile methods in themselves appear to facilitate mutually motivating collaboration between user groups and developers. Lessons learned for leveraging the advantages of agile development processes include acknowledging the substantial and ongoing contributions of users and their roles as co-designers of the system.

  5. The Perfect Process Storm: Integration of CMMI, Agile, and Lean Six Sigma

    DTIC Science & Technology

    2012-12-01

    projects using similar iterative methodologies including Scrum, Crystal, and Feature-driven Development, leading to the meeting of the Agile...[timeline residue: 1986; Lean Six Sigma (LSS) late 1990s; Lean Production 1990; CMM 1987-2002; CMMI 2002, V1.3 2010; Agile XP 1996; Agile Manifesto 2001; Scrum 2001]...Business Process Improvement. Most recently his efforts have targeted BPI for 22 Agile Scrum projects, deploying Project and Process Management

  6. Finding Discipline in an Agile Acquisition Process

    DTIC Science & Technology

    2011-05-18

    of technology to deployment. Documentation of processes with compliance audits, ensuring that processes are followed. Financial performance...deployment deltas (use case deferrals, shortfalls, test deficiencies) are in the domain-relevant language of end users and decision makers – avoids...Bottom line: when we speak of discipline, we are advocating the creation of a more disciplined mechanism (structures + processes) to: describe user

  7. The NERV Methodology: Non-Functional Requirements Elicitation, Reasoning and Validation in Agile Processes

    ERIC Educational Resources Information Center

    Domah, Darshan

    2013-01-01

    Agile software development has become very popular around the world in recent years, with methods such as Scrum and Extreme Programming (XP). Literature suggests that functionality is the primary focus in Agile processes while non-functional requirements (NFR) are either ignored or ill-defined. However, for software to be of good quality both…

  8. Information Models, Data Requirements, and Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during systems development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  9. A process for the agile product realization of electro-mechanical devices

    SciTech Connect

    Forsythe, C.; Ashby, M.R.; Benavides, G.L.; Diegert, K.V.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-09-01

    This paper describes a product realization process developed and demonstrated at Sandia by the A-PRIMED (Agile Product Realization for Innovative Electro MEchanical Devices) project that integrates many of the key components of "agile manufacturing" into a complete, design-to-production process. Evidence indicates that the process has reduced the product realization cycle and assured product quality. Products included discriminators for a robotic quick-change adapter and for an electronic defense system. These discriminators, built using A-PRIMED, met random vibration requirements and had life cycles that far surpassed the performance obtained from earlier efforts.

  10. An optimized end-to-end process for the analysis of agile earth observation satellite missions

    NASA Astrophysics Data System (ADS)

    Hahn, M.; Müller, T.; Levenhagen, J.

    2014-12-01

    Agile earth observation satellite missions are becoming more and more important due to their capability to perform fast reorientation maneuvers with 3 degrees of freedom to capture different target areas along the orbital path, thus increasing the observed area and the complexity of scans. The design of an agile earth observation satellite mission is a non-trivial task because a trade-off has to be made between the observed area and complexity of the scans on the one hand and the degree of agility available, and thus the performance of the attitude control devices, on the other hand. Additionally, the designed mission has to be evaluated in a realistic environment, also taking into account the specific characteristics of the chosen actuators. In the present work, several methods are combined to provide an integrated analysis of agile earth observation satellite missions, starting from the definition of a desired ground scan scenario, going via the creation of a guidance profile to a realistic simulation, and ending at the verification of feasibility by detailed closed-loop simulation. Regarding its technical implementation at Astrium GmbH, well-proven tools for the different tasks of the analysis are incorporated and well-defined interfaces for those tools are specified, allowing a high degree of automation and thus saving time and minimizing errors. This results in a complete end-to-end process for the design, analysis and verification of agile earth observation satellite missions. This process is demonstrated by means of an example analysis using control moment gyros for a high agility mission.

  11. A systematic review of the main factors that determine agility in sport using structural equation modeling.

    PubMed

    Hojka, Vladimir; Stastny, Petr; Rehak, Tomas; Gołas, Artur; Mostowik, Aleksandra; Zawart, Marek; Musálek, Martin

    2016-09-01

    While tests of basic motor abilities such as speed, maximum strength or endurance are well recognized, testing of complex motor functions such as agility remains unresolved in current literature. Therefore, the aim of this review was to evaluate which main factor or factor structures quantitatively determine agility. In methodological detail, this review focused on research that explained or described the relationships between latent variables in a factorial model of agility using approaches such as principal component analysis, factor analysis and structural equation modeling. Four research studies met the defined inclusion criteria. No quantitative empirical research was found that tried to verify the quality of the whole suggested model of the main factors determining agility through the use of a structural equation modeling (SEM) approach or a confirmatory factor analysis. From the whole structure of agility, only change of direction speed (CODS) and some of its subtests were appropriately analyzed. The combination of common CODS tests is reliable and useful to estimate performance in sub-elite athletes; however, for elite athletes, CODS tests must be specific to the needs of a particular sport discipline. Sprinting and jumping tests are stronger factors for CODS than explosive strength and maximum strength tests. The authors suggest the need to verify the agility factorial model by a second generation data analysis technique such as SEM.
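
    Illustrative sketch: the exploratory end of the factor-analytic toolbox mentioned above (PCA/factor analysis, as opposed to confirmatory SEM) can be tried on simulated agility test scores as follows; the tests, loadings, and two-factor choice are assumptions, not data from the reviewed studies.

      # Exploratory factor analysis on simulated agility test scores (illustrative only).
      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(0)
      n_athletes = 200

      # Two simulated latent abilities: explosiveness and change-of-direction (COD) skill.
      explosive = rng.normal(size=n_athletes)
      cod_skill = rng.normal(size=n_athletes)

      # Observed tests load on the latent abilities plus measurement noise (assumed loadings).
      scores = np.column_stack([
          0.9 * explosive + 0.1 * rng.normal(size=n_athletes),                    # 10 m sprint
          0.8 * explosive + 0.2 * rng.normal(size=n_athletes),                    # vertical jump
          0.7 * cod_skill + 0.3 * explosive + 0.2 * rng.normal(size=n_athletes),  # 5-0-5 test
          0.8 * cod_skill + 0.2 * rng.normal(size=n_athletes),                    # T-test
      ])

      fa = FactorAnalysis(n_components=2, random_state=0)
      fa.fit(scores)
      print("estimated loadings (tests x factors):")
      print(np.round(fa.components_.T, 2))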

  12. A systematic review of the main factors that determine agility in sport using structural equation modeling

    PubMed Central

    Hojka, Vladimir; Stastny, Petr; Rehak, Tomas; Gołas, Artur; Mostowik, Aleksandra; Zawart, Marek; Musálek, Martin

    2016-01-01

    While tests of basic motor abilities such as speed, maximum strength or endurance are well recognized, testing of complex motor functions such as agility remains unresolved in current literature. Therefore, the aim of this review was to evaluate which main factor or factor structures quantitatively determine agility. In methodological detail, this review focused on research that explained or described the relationships between latent variables in a factorial model of agility using approaches such as principal component analysis, factor analysis and structural equation modeling. Four research studies met the defined inclusion criteria. No quantitative empirical research was found that tried to verify the quality of the whole suggested model of the main factors determining agility through the use of a structural equation modeling (SEM) approach or a confirmatory factor analysis. From the whole structure of agility, only change of direction speed (CODS) and some of its subtests were appropriately analyzed. The combination of common CODS tests is reliable and useful to estimate performance in sub-elite athletes; however, for elite athletes, CODS tests must be specific to the needs of a particular sport discipline. Sprinting and jumping tests are stronger factors for CODS than explosive strength and maximum strength tests. The authors suggest the need to verify the agility factorial model by a second generation data analysis technique such as SEM. PMID:28149399

  13. An Agile Systems Engineering Process: The Missing Link?

    DTIC Science & Technology

    2011-05-01

    Quality Management Systems. 2008. 26. Standardization, International Organization for. "ISO 12207." Software Life Cycle Processes. 2008. 27. Software...has a number of standards available such as ISO 12207, ISO 9001 and the Capability Maturity Model Integration (CMMI®) [24,25,26]. The CMMI was a...Of The Air. Life Cycle Systems Engineering. 2007. 24. CMMI® for Development, Version 1.2. Pittsburgh: Carnegie Mellon University, 2006. 25. "ISO 9001

  14. Impact of Agile Software Development Model on Software Maintainability

    ERIC Educational Resources Information Center

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…

  15. Unsteady aerodynamic models for agile flight at low Reynolds numbers

    NASA Astrophysics Data System (ADS)

    Brunton, Steven L.

    This work develops low-order models for the unsteady aerodynamic forces on a wing in response to agile maneuvers at low Reynolds number. Model performance is assessed on the basis of accuracy across a range of parameters and frequencies as well as computational efficiency and compatibility with existing control techniques and flight dynamic models. The result is a flexible modeling procedure that yields accurate, low-dimensional, state-space models. The modeling procedures are developed and tested on direct numerical simulations of a two-dimensional flat plate airfoil in motion at low Reynolds number, Re=100, and in a wind tunnel experiment at the Illinois Institute of Technology involving a NACA 0006 airfoil pitching and plunging at Reynolds number Re=65,000. In both instances, low-order models are obtained that accurately capture the unsteady aerodynamic forces at all frequencies. These cases demonstrate the utility of the modeling procedure developed in this thesis for obtaining accurate models for different geometries and Reynolds numbers. Linear reduced-order models are constructed from either the indicial response (step response) or realistic input/output maneuvers using a flexible modeling procedure. The method is based on identifying stability derivatives and modeling the remaining dynamics with the eigensystem realization algorithm. A hierarchy of models is developed, based on linearizing the flow at various operating conditions. These models are shown to be accurate and efficient for plunging, pitching about various points, and combined pitch and plunge maneuvers, at various angles of attack and Reynolds numbers. Models are compared against the classical unsteady aerodynamic models of Wagner and Theodorsen over a large range of Strouhal number and reduced frequency for a baseline comparison. Additionally, state-space representations are developed for Wagner's and Theodorsen's models, making them compatible with modern control-system analysis. A number of
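
    Illustrative sketch: the eigensystem realization algorithm named above amounts, at its core, to an SVD of a Hankel matrix of impulse-response (Markov) parameters. The numpy code below shows that core step on a toy discrete-time system; the system matrices and dimensions are assumptions, not the thesis models.

      # Core of the eigensystem realization algorithm (ERA) on a toy system (illustrative).
      import numpy as np

      # Toy discrete-time system used only to generate Markov parameters h[k] = C A^k B.
      A = np.array([[0.9, 0.0], [0.1, 0.8]])
      B = np.array([[1.0], [0.5]])
      C = np.array([[1.0, 1.0]])
      h = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(40)]

      r, s = 10, 10  # Hankel block dimensions
      H0 = np.array([[h[i + j] for j in range(s)] for i in range(r)])      # H(0)
      H1 = np.array([[h[i + j + 1] for j in range(s)] for i in range(r)])  # H(1), shifted by one

      U, S, Vt = np.linalg.svd(H0)
      n = 2  # model order chosen from the singular-value decay
      Un, Vn = U[:, :n], Vt[:n, :].T
      S_half = np.diag(np.sqrt(S[:n]))
      S_half_inv = np.diag(1.0 / np.sqrt(S[:n]))

      Ar = S_half_inv @ Un.T @ H1 @ Vn @ S_half_inv  # reduced-order state matrix
      Br = (S_half @ Vn.T)[:, :1]                    # reduced-order input matrix (1 input)
      Cr = (Un @ S_half)[:1, :]                      # reduced-order output matrix (1 output)

      print("true eigenvalues:      ", np.sort(np.linalg.eigvals(A)))
      print("identified eigenvalues:", np.sort(np.linalg.eigvals(Ar)))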

  16. Agile-based "Semi-"Automated Data ingest process: ORNL DAAC example

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated to improve efficiency and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.

  17. Exploring the possibility of modeling a genetic counseling guideline using agile methodology.

    PubMed

    Choi, Jeeyae

    2013-01-01

    Increased demand for genetic counseling services has heightened the necessity of a computerized genetic counseling decision support system. In order to develop an effective and efficient computerized system, modeling of a genetic counseling guideline is an essential step. Throughout this pilot study, Agile methodology with the Unified Modeling Language (UML) was utilized to model a guideline. Thirteen tasks and 14 associated elements were extracted. Successfully constructed conceptual class and activity diagrams revealed that Agile methodology with UML was a suitable tool for modeling a genetic counseling guideline.

  18. Developing a Model for Agile Supply: an Empirical Study from Iranian Pharmaceutical Supply Chain

    PubMed Central

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

    Agility is the fundamental characteristic of a supply chain needed for survival in turbulent markets, where environmental forces create additional uncertainty, resulting in higher risk in supply chain management. In addition, agility helps provide the right product, at the right time, to the consumer. The main goal of this research is therefore to promote supplier selection in the pharmaceutical industry according to the formative basic factors. Moreover, this paper can configure its supply network to achieve an agile supply chain. The present article analyzes the supply part of the supply chain based on the SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). The methodology provides an analytical model that enables potential suppliers to be assessed against multiple criteria using both quantitative and qualitative measures. In addition, to prioritize the critical factors, the TOPSIS algorithm has been used as a common multiple-attribute decision making (MADM) technique. Finally, several factors such as delivery speed, planning and reorder segmentation, trust development and material quantity adjustment are identified and prioritized as critical factors for being agile in the supply of API. PMID:24250689
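
    Illustrative sketch: the TOPSIS steps named in the abstract (normalize, weight, measure distances to the ideal and anti-ideal solutions, rank by relative closeness) are shown below on invented supplier scores; the criteria, weights, and numbers are assumptions, not the study's data.

      # Minimal TOPSIS ranking of candidate API suppliers (invented data, illustrative only).
      import numpy as np

      # Rows: candidate suppliers; columns: delivery speed, quality, trust, flexibility.
      scores = np.array([
          [7.0, 8.0, 6.0, 5.0],
          [9.0, 6.0, 7.0, 8.0],
          [6.0, 9.0, 8.0, 6.0],
      ])
      weights = np.array([0.3, 0.3, 0.2, 0.2])       # assumed criterion weights (sum to 1)
      benefit = np.array([True, True, True, True])   # all criteria treated as larger-is-better

      # 1. Vector-normalize and weight the decision matrix.
      weighted = scores / np.linalg.norm(scores, axis=0) * weights

      # 2. Ideal and anti-ideal solutions per criterion.
      ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
      anti = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

      # 3. Distances to both and relative closeness (higher closeness ranks better).
      d_plus = np.linalg.norm(weighted - ideal, axis=1)
      d_minus = np.linalg.norm(weighted - anti, axis=1)
      closeness = d_minus / (d_plus + d_minus)

      print("closeness:", np.round(closeness, 3))
      print("ranking (best first):", np.argsort(-closeness))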

  19. Developing a model for agile supply: an empirical study from Iranian pharmaceutical supply chain.

    PubMed

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

    Agility is the fundamental characteristic of a supply chain needed for survival in turbulent markets, where environmental forces create additional uncertainty, resulting in higher risk in supply chain management. In addition, agility helps provide the right product, at the right time, to the consumer. The main goal of this research is therefore to promote supplier selection in the pharmaceutical industry according to the formative basic factors. Moreover, this paper can configure its supply network to achieve an agile supply chain. The present article analyzes the supply part of the supply chain based on the SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). The methodology provides an analytical model that enables potential suppliers to be assessed against multiple criteria using both quantitative and qualitative measures. In addition, to prioritize the critical factors, the TOPSIS algorithm has been used as a common multiple-attribute decision making (MADM) technique. Finally, several factors such as delivery speed, planning and reorder segmentation, trust development and material quantity adjustment are identified and prioritized as critical factors for being agile in the supply of API.

  20. The Army Game Studios Agile Process: A Retrospective

    DTIC Science & Technology

    2012-04-26

    The Army Game Studio adopted Scrum because of a growing team, growing project sizes, and the creation of a...Instructor Workstation, Vehicles, Dismount...AAVP3 / Scrum... Scrum was a process that fit our development style: requirements are never in stone, and most of the time are not completely known when funding occurs

  1. Agile Mythbusting

    DTIC Science & Technology

    2015-01-01

    does not fit all. Scrum: the most adopted Agile method. Scaling Agile methods: going beyond the team level. Challenges to Agile adoption: what's...Arsenal. Lapham, Wrubel, Jan 2015. © 2015 Carnegie Mellon University. Myth: You must choose Agile or Waterfall – you can't do both. What about "water-scrum...used in multiple environments, including DoD programs. Start with "Agile EVM in Scrum Projects" from AGILE 2006 to get started learning about Agile

  2. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project that has employed agile practices since 2004. We started with the assumption that a lighter process is better, focusing on evolving code and adding process elements only as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested, almost a decade ago, that they are not suitable for safety-critical systems; we present our experiences as a case study for renewing the discussion. PMID:21799545

  3. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project that has employed agile practices since 2004. We started with the assumption that a lighter process is better, focusing on evolving code and adding process elements only as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested, almost a decade ago, that they are not suitable for safety-critical systems; we present our experiences as a case study for renewing the discussion.

  4. A process for the agile product realization of electromechanical devices (A-primed)

    SciTech Connect

    Forsythe, C.; Ashby, M.R.; Benavides, G.L.; Diegert, K.V.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1996-02-01

    This paper describes a product realization process developed at Sandia National Laboratories by the A-PRIMED project that integrates many of the key components of "agile manufacturing" (Nagel & Dove, 1992) into a complete, step-by-step, design-to-production process. For two separate product realization efforts, each geared to a different set of requirements, A-PRIMED demonstrated product realization of a custom device in less than a month. A-PRIMED used a discriminator (a precision electromechanical device) as the demonstration device, but the process is readily adaptable to other electromechanical products. The process begins with a qualified design parameter space (Diegert et al., 1995). From that point, the product realization process encompasses all facets of requirements development, analysis and testing, design, manufacturing, robot assembly and quality assurance, as well as product data management and concurrent engineering. In developing the product realization process, A-PRIMED employed an iterative approach whereby, after each build, the process was reviewed and refinements were made on the basis of lessons learned. This paper describes the integration of project functions and product realization technologies to develop a product realization process that, on repeated iterations, was proven successful.

  5. Empirical Agility

    DTIC Science & Technology

    2014-06-01

    documented the fact that Unmanned Aerial Vehicles (now more commonly called drones) added substantially to the quality of surveillance, resulting in better...Battalions. CACI Inc.-Federal, Arlington, Virginia, 1977. DTIC Accession Number ADA123481. Olmstead, Joseph A., B. Leon Elder, and John M...greater agility – drones improved agility in ground and air operations – Nelson's C2 at Trafalgar demonstrated agility. Theater-level C2 during WW II

  6. YIP: Generic Environment Models (GEMs) for Agile Marine Autonomy

    DTIC Science & Technology

    2012-09-30

    Approach: The work is currently performed by PI Fumin Zhang and four graduate students at Georgia Tech: Paul...In addition, three undergraduate students are hired on an hourly basis to develop experimental marine robots. The PI is leading the team. Paul Varnell focuses...dotted lines show the expected value of the CLPT error over time, based on our Langevin model of CLPT error growth. Agreement between the model and

  7. YIP: Generic Environment Models (GEMs) for Agile Marine Autonomy

    DTIC Science & Technology

    2011-09-30

    Approach: The work is performed by PI Fumin Zhang and four graduate students at Georgia Tech: Paul Varnell (Fall...an hourly basis to develop experimental marine robots. The PI is leading the team. Paul Varnell focuses on the control system and software system of...Using a Langevin equation to model the growth of the expected glider position error (termed CLPT error), we have shown that the magnitude of the expected error
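
    Illustrative sketch: a Langevin-type (Ornstein-Uhlenbeck) simulation of expected position-error growth of the kind referred to above; the drift and noise parameters are assumptions, not the report's CLPT error model.

      # Generic Langevin (Ornstein-Uhlenbeck) sketch of expected position-error growth.
      # Parameters are assumptions, not the report's CLPT error model.
      import numpy as np

      rng = np.random.default_rng(1)
      theta, sigma = 0.05, 0.3          # assumed damping rate (1/h) and noise strength
      dt, steps, runs = 0.1, 500, 2000  # time step (h), horizon, ensemble size

      err = np.zeros(runs)
      mean_abs_err = np.empty(steps)
      for k in range(steps):
          # Euler-Maruyama update: de = -theta * e * dt + sigma * sqrt(dt) * N(0, 1)
          err += -theta * err * dt + sigma * np.sqrt(dt) * rng.normal(size=runs)
          mean_abs_err[k] = np.abs(err).mean()

      # The expected error grows quickly at first, then saturates as damping balances noise.
      print("E|err| at t = 1 h, 10 h, 50 h:",
            mean_abs_err[9].round(3), mean_abs_err[99].round(3), mean_abs_err[-1].round(3))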

  8. YIP: Generic Environment Models (GEMs) for Agile Marine Autonomy

    DTIC Science & Technology

    2013-09-30

    to survey the tidal lagoon located at Grand Isle State Park (Figure 4, upper left) in Louisiana, where oil pollution was spotted in 2010...We tested the accuracy of the error growth model under different flow conditions, including constant flow and tidal flow, using simulations run in...simulated in GENIOS, and the flow field for the real vehicle included a constant or tidal perturbation. We found that, under constant flow, the first

  9. Using the Agile Development Methodology and Applying Best Practice Project Management Processes

    DTIC Science & Technology

    2014-12-01

    detrimental to a system architecture function. Proponents of Agile argue that developers in waterfall development get trapped in "analysis...limited to very small web-based socio-technical systems. (Krutchen 2010, 497) 2. Agile Team Responsibilities. So, who on the scrum team is...the use of a sprint 0 in which the system architect, heretofore singularly referred to but reflecting an individual or team approach, will take the

  10. Development of a fast, lean and agile direct pelletization process using experimental design techniques.

    PubMed

    Politis, Stavros N; Rekkas, Dimitrios M

    2017-04-01

    A novel hot melt direct pelletization method was developed, characterized and optimized, using statistical thinking and experimental design tools. Mixtures of carnauba wax (CW) and HPMC K100M were spheronized using melted gelucire 50-13 as a binding material (BM). Experimentation was performed sequentially; a fractional factorial design was set up initially to screen the factors affecting the process, namely spray rate, quantity of BM, rotor speed, type of rotor disk, lubricant-glidant presence, additional spheronization time, powder feeding rate and quantity. From the eight factors assessed, three were further studied during process optimization (spray rate, quantity of BM and powder feeding rate), at different ratios of the solid mixture of CW and HPMC K100M. The study demonstrated that the novel hot melt process is fast, efficient, reproducible and predictable. Therefore, it can be adopted in a lean and agile manufacturing setting for the production of flexible pellet dosage forms with various release rates easily customized between immediate and modified delivery.
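
    Illustrative sketch: an eight-factor, two-level screening design of the kind described above can be generated as a 2^(8-4) fractional factorial; the generator choices below are assumptions for illustration, not the published design.

      # Build a 16-run, two-level fractional factorial for 8 factors (a 2^(8-4) design).
      # The generator choices are illustrative, not the study's actual design.
      import itertools
      import numpy as np

      base = np.array(list(itertools.product([-1, 1], repeat=4)))  # full factorial in A-D
      A, B, C, D = base.T

      # Four extra factors defined as products of base columns (the design "generators").
      E, F, G, H = A * B * C, A * B * D, A * C * D, B * C * D
      design = np.column_stack([A, B, C, D, E, F, G, H])

      factor_names = ["spray_rate", "binder_qty", "rotor_speed", "rotor_disk",
                      "lubricant_glidant", "extra_spheronization", "feed_rate", "powder_qty"]
      print("runs x factors:", design.shape)   # (16, 8)
      print("first three runs (coded -1/+1 levels):")
      print(design[:3])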

  11. Development of an agility assessment module for preliminary fighter design

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Bauer, Brent; Biezad, Daniel; Hahn, Andrew

    1996-01-01

    A FORTRAN computer program is presented to perform agility analysis on fighter aircraft configurations. This code is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of the agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. FORTRAN programs were developed for two specific metrics, CCT (Combat Cycle Time) and PM (Pointing Margin), as part of the agility module. The validity of the code was evaluated by comparison with existing flight test data. Example trade studies using the agility module along with ACSYNT were conducted using Northrop F-20 Tigershark and McDonnell Douglas F/A-18 Hornet aircraft models. The sensitivity of the agility criteria to thrust loading and wing loading was investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements.

  12. Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology

    NASA Astrophysics Data System (ADS)

    Macioł, Piotr; Michalik, Kazimierz

    2016-10-01

    Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is high computational demands. Among other solutions, the parallelization of multiscale computations is promising. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelizing multiscale models based on the Agile Multiscale Methodology framework is discussed. A sequential, FEM-based macroscopic model has been combined with concurrently computed fine-scale models, employing a MatCalc thermodynamic simulator. The main issues investigated in this work are: (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the decrease in the quality of computations enforced by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law. The problem of 'delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.
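
    For reference, Amdahl's law gives the speed-up S(n) = 1 / ((1 - p) + p/n) for a task whose parallelizable fraction is p running on n workers; the short check below uses assumed numbers.

      # Amdahl's law: speed-up of a task whose parallelizable fraction is p, run on n workers.
      def amdahl_speedup(p: float, n: int) -> float:
          return 1.0 / ((1.0 - p) + p / n)

      # Assumed example: 80% of the fine-scale work parallelizes.
      for n in (2, 4, 8, 16):
          print(n, "workers ->", round(amdahl_speedup(0.8, n), 2), "x speed-up")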

  13. Toward Agile Control of a Flexible-Spine Model for Quadruped Bounding

    DTIC Science & Technology

    2015-01-01

    step reachable states. Finally, we propose new guidelines for quantifying "agility" for legged robots, providing a preliminary framework for...quantifying and improving performance of legged systems. 1. INTRODUCTION. One goal in developing legged robot systems is to provide a high degree of agility...Intuitively, being agile means that future states (i.e., position and velocity variables defining snapshots of the dynamic robot as it moves) are not

  14. Development of telemetry for the agility flight test of a radio controlled fighter model

    NASA Astrophysics Data System (ADS)

    Gallagher, Michael J.

    1992-03-01

    Advanced design tools, control devices, and supermaneuverability concepts provide innovative solutions to traditional aircraft design trade-offs. Emerging technologies enable improved agility throughout the performance envelope. Unmanned Air Vehicles provide an excellent platform for dynamic measurements and agility research. A 1/8-scaled F-16A ducted-fan radio-controlled aircraft was instrumented with a telemetry system to acquire angle of attack, sideslip angle, control surface deflection, throttle position, and airspeed data. A portable ground station was built to record and visually present real-time telemetry data. Flight tests will be conducted to acquire baseline high angle-of-attack performance measurements, and follow-on research will evaluate agility improvements with varied control configurations.

  15. Agile Metrics: Progress Monitoring of Agile Contractors

    DTIC Science & Technology

    2014-01-01

    can be tailored to leverage the iterative nature of Agile methods. Using optional contract funding lines or indefinite delivery indefinite quantity...naturally created during the execution of the Agile implementation. In the following paragraphs, we identify issues to consider in building an Agile...employing Agile methods [Hartman 2006]. Be prepared to mine and effectively use the metrics data that naturally occur in typical Agile teams. In

  16. The Telemetry Agile Manufacturing Effort

    SciTech Connect

    Brown, K.D.

    1995-01-01

    The Telemetry Agile Manufacturing Effort (TAME) is an agile enterprising demonstration sponsored by the US Department of Energy (DOE). The project experimented with new approaches to product realization and assessed their impacts on performance, cost, flow time, and agility. The purpose of the project was to design the electrical and mechanical features of an integrated telemetry processor, establish the manufacturing processes, and produce an initial production lot of two to six units. This paper outlines the major methodologies utilized by the TAME, describes the accomplishments that can be attributed to each methodology, and finally, examines the lessons learned and explores the opportunities for improvement associated with the overall effort. The areas for improvement are discussed relative to an ideal vision of the future for agile enterprises. By the end of the experiment, the TAME reduced production flow time by approximately 50% and life cycle cost by more than 30%. Product performance was improved compared with conventional DOE production approaches.

  17. Utilization of an agility assessment module in analysis and optimization of preliminary fighter configuration

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Biezad, Daniel

    1996-01-01

    A study has been conducted to develop and analyze a FORTRAN computer code for performing agility analysis on fighter aircraft configurations. This program is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of the agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. The validity of the existing code was evaluated by comparison with existing flight test data. A FORTRAN program was developed for a specific metric, PM (Pointing Margin), as part of the agility module. Example trade studies using the agility module along with ACSYNT were conducted using a McDonnell Douglas F/A-18 Hornet aircraft model. The sensitivity of the agility criteria to thrust loading, wing loading, and thrust vectoring was investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements in preliminary design.

  18. Agile manufacturing prototyping system (AMPS)

    SciTech Connect

    Garcia, P.

    1998-05-09

    The Agile Manufacturing Prototyping System (AMPS) is being integrated at Sandia National Laboratories. AMPS consists of state of the industry flexible manufacturing hardware and software enhanced with Sandia advancements in sensor and model based control; automated programming, assembly and task planning; flexible fixturing; and automated reconfiguration technology. AMPS is focused on the agile production of complex electromechanical parts. It currently includes 7 robots (4 Adept One, 2 Adept 505, 1 Staubli RX90), conveyance equipment, and a collection of process equipment to form a flexible production line capable of assembling a wide range of electromechanical products. This system became operational in September 1995. Additional smart manufacturing processes will be integrated in the future. An automated spray cleaning workcell capable of handling alcohol and similar solvents was added in 1996 as well as parts cleaning and encapsulation equipment, automated deburring, and automated vision inspection stations. Plans for 1997 and out years include adding manufacturing processes for the rapid prototyping of electronic components such as soldering, paste dispensing and pick-and-place hardware.

  19. Human factors in agile manufacturing

    SciTech Connect

    Forsythe, C.

    1995-03-01

    As industries position themselves for the competitive markets of today, and the increasingly competitive global markets of the 21st century, agility, or the ability to rapidly develop and produce new products, represents a common trend. Agility manifests itself in many different forms, with the agile manufacturing paradigm proposed by the Iacocca Institute offering a generally accepted, long-term vision. In its many forms, common elements of agility or agile manufacturing include: changes in business, engineering and production practices; seamless information flow from design through production; integration of computer and information technologies into all facets of the product development and production process; application of communications technologies to enable collaborative work between geographically dispersed product development team members; and introduction of flexible automation of production processes. Industry has rarely experienced as dramatic an infusion of new technologies or as extensive a change in culture and work practices. Human factors will not only play a vital role in accomplishing the technical and social objectives of agile manufacturing, but also have an opportunity to participate in shaping the evolution of industry paradigms for the 21st century.

  20. Command and Control (C2) Agility (Agilite du commandement et du controle (C2))

    DTIC Science & Technology

    2014-10-01

    N2C2M2, C2 Agility Conceptual Model, Agility metrics, and an associated measurement process. Based on experiments designed and conducted using the...be tested than it would be otherwise. It should be noted that a meta-analysis of multiple experiments must adhere to the same design process...process. One approach was to identify theories and definitions that reflected the concepts underlying a variable. For instance, there is an important

  1. CT-assisted agile manufacturing

    NASA Astrophysics Data System (ADS)

    Stanley, James H.; Yancey, Robert N.

    1996-11-01

    The next century will witness at least two great revolutions in the way goods are produced. First, workers will use the medium of virtual reality in all aspects of marketing, research, development, prototyping, manufacturing, sales and service. Second, market forces will drive manufacturing towards small-lot production and just-in-time delivery. Already, we can discern the merging of these megatrends into what some are calling agile manufacturing. Under this new paradigm, parts and processes will be designed and engineered within the mind of a computer, tooled and manufactured by the offspring of today's rapid prototyping equipment, and evaluated for performance and reliability by advanced nondestructive evaluation (NDE) techniques and sophisticated computational models. Computed tomography (CT) is the premier example of an NDE method suitable for future agile manufacturing activities. It is the only modality that provides convenient access to the full suite of engineering data that users will need to avail themselves of computer- aided design, computer-aided manufacturing, and computer- aided engineering capabilities, as well as newly emerging reverse engineering, rapid prototyping and solid freeform fabrication technologies. As such, CT is assured a central, utilitarian role in future industrial operations. An overview of this exciting future for industrial CT is presented.

  2. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    SciTech Connect

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  3. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    PubMed

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
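
    Illustrative sketch: plain Luigi expresses dependent pipeline steps as tasks with requires/output/run methods, and SciLuigi (as the authors describe) layers named ports and workflow definitions on top of this idiom. The task names, file paths, and logic below are assumptions, not the SciLuigi API or the paper's pipeline.

      # Plain Luigi sketch of two dependent modelling steps (illustrative only; SciLuigi
      # adds named in/out ports and separate workflow definitions on top of this idiom).
      import luigi


      class PrepareDataset(luigi.Task):
          dataset = luigi.Parameter(default="interactions.csv")  # hypothetical input name

          def output(self):
              return luigi.LocalTarget("prepared_%s" % self.dataset)

          def run(self):
              with self.output().open("w") as out:
                  out.write("preprocessed records would be written here\n")


      class TrainModel(luigi.Task):
          dataset = luigi.Parameter(default="interactions.csv")

          def requires(self):
              return PrepareDataset(dataset=self.dataset)

          def output(self):
              return luigi.LocalTarget("model_%s.txt" % self.dataset)

          def run(self):
              with self.input().open() as prepared, self.output().open("w") as out:
                  out.write("model trained on %d prepared lines\n" % len(prepared.readlines()))


      if __name__ == "__main__":
          luigi.build([TrainModel()], local_scheduler=True)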

  4. Agility Quotient (AQ)

    DTIC Science & Technology

    2014-06-01

    system's Agility IQ?" and "What is the requisite amount of Agility that is required?" This paper suggests a way forward and illustrates it, in the...answer two questions: "How can we measure a system's Agility IQ?" and "What is the requisite amount of Agility that is required?" This paper...agility is worth our attention. AQ can be patterned after the Intelligence Quotient (IQ). IQ is a score that is associated with educational potential

  5. Aircraft agility maneuvers

    NASA Technical Reports Server (NTRS)

    Cliff, Eugene M.; Thompson, Brian G.

    1992-01-01

    A new dynamic model for aircraft motions is presented. This model can be viewed as intermediate between a point-mass model, in which the body attitude angles are control-like, and a rigid-body model, in which the body-attitude angles evolve according to Newton's Laws. Specifically, consideration is given to the case of symmetric flight, and a model is constructed in which the body roll-rate and the body pitch-rate are the controls. In terms of this body-rate model a minimum-time heading change maneuver is formulated. When the bounds on the body-rates are large the results are similar to the point-mass model in that the model can very quickly change the applied forces and produce an acceleration to turn the vehicle. With finite bounds on these rates, the forces change in a smooth way. This leads to a measurable effect of agility.
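
    Illustrative sketch: a generic bank-to-turn simulation conveys the trade the abstract describes, in which a bounded roll rate limits how quickly the turning force can be established; all numbers are assumed and this is not the paper's body-rate model.

      # Generic bank-to-turn sketch: time to change heading by 90 deg with a bounded roll
      # rate (illustrative only; not the paper's body-rate model, all numbers assumed).
      import math

      V = 150.0                        # airspeed, m/s
      g = 9.81                         # gravity, m/s^2
      phi_max = math.radians(70.0)     # bank-angle limit
      heading_target = math.radians(90.0)

      def time_to_turn(roll_rate_limit):
          """Integrate coordinated-turn kinematics while rolling toward the bank limit."""
          dt, t, phi, psi = 0.01, 0.0, 0.0, 0.0
          while psi < heading_target:
              phi = min(phi + roll_rate_limit * dt, phi_max)  # roll toward max bank
              psi += (g / V) * math.tan(phi) * dt             # heading rate at this bank
              t += dt
          return t

      for p in (0.2, 1.0, 5.0):        # roll-rate bounds, rad/s
          print("roll-rate limit %.1f rad/s -> %.1f s to turn 90 deg" % (p, time_to_turn(p)))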

  6. Some Findings Concerning Requirements in Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases in the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and management of requirements dependencies.

  7. Implementing Kanban for agile process management within the ALMA Software Operations Group

    NASA Astrophysics Data System (ADS)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

    After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Due to their different stakeholders, each of these tasks presents a wide diversity of importances, lifespans and complexities. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology in our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.
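
    Illustrative sketch: the central Kanban mechanism described above, limiting work in progress per column so that demand is balanced against throughput, can be expressed as a small data structure; the column names and limits are assumptions, not the ALMA group's actual board.

      # Toy Kanban board that enforces work-in-progress (WIP) limits per column (illustrative).
      class KanbanBoard:
          def __init__(self, wip_limits):
              self.wip_limits = dict(wip_limits)
              self.columns = {name: [] for name in wip_limits}

          def add(self, column, task):
              """Pull a task into a column only if its WIP limit allows it."""
              if len(self.columns[column]) >= self.wip_limits[column]:
                  return False  # demand exceeds throughput: the task waits upstream
              self.columns[column].append(task)
              return True

          def move(self, task, src, dst):
              """Move a task downstream, respecting the destination's WIP limit."""
              if task in self.columns[src] and self.add(dst, task):
                  self.columns[src].remove(task)
                  return True
              return False

      board = KanbanBoard({"backlog": 100, "in_progress": 3, "verification": 2, "done": 1000})
      for task in ["fix-archive-bug", "automate-report", "update-docs", "tune-pipeline"]:
          board.add("backlog", task)
      print(board.move("fix-archive-bug", "backlog", "in_progress"))  # True: WIP limit not hit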

  8. Agile manufacturing from a statistical perspective

    SciTech Connect

    Easterling, R.G.

    1995-10-01

    The objective of agile manufacturing is to provide the ability to quickly realize high-quality, highly-customized, in-demand products at a cost commensurate with mass production. More broadly, agility in manufacturing, or any other endeavor, is defined as change-proficiency; the ability to thrive in an environment of unpredictable change. This report discusses the general direction of the agile manufacturing initiative, including research programs at the National Institute of Standards and Technology (NIST), the Department of Energy, and other government agencies, but focuses on agile manufacturing from a statistical perspective. The role of statistics can be important because agile manufacturing requires the collection and communication of process characterization and capability information, much of which will be data-based. The statistical community should initiate collaborative work in this important area.

  9. SU-E-T-627: Precision Modelling of the Leaf-Bank Rotation in Elekta’s Agility MLC: Is It Necessary?

    SciTech Connect

    Vujicic, M; Belec, J; Heath, E; Gholampourkashi, S; Cygler, J

    2015-06-15

    Purpose: To demonstrate the method used to determine the leaf bank rotation angle (LBROT) as a parameter for modeling the Elekta Agility multi-leaf collimator (MLC) for Monte Carlo simulations and to evaluate the clinical impact of LBROT. Methods: A detailed model of an Elekta Infinity linac including an Agility MLC was built using the EGSnrc/BEAMnrc Monte Carlo code. The Agility 160-leaf MLC is modelled using the MLCE component module which allows for leaf bank rotation using the parameter LBROT. A precise value of LBROT is obtained by comparing measured and simulated profiles of a specific field, which has leaves arranged in a repeated pattern such that one leaf is opened and the adjacent one is closed. Profile measurements from an Agility linac are taken with gafchromic film, and an ion chamber is used to set the absolute dose. The measurements are compared to Monte Carlo (MC) simulations and the LBROT is adjusted until a match is found. The clinical impact of LBROT is evaluated by observing how an MC dose calculation changes with LBROT. A clinical Stereotactic Body Radiation Treatment (SBRT) plan is calculated using BEAMnrc/DOSXYZnrc simulations with different input values for LBROT. Results: Using the method outlined above, the LBROT is determined to be 9±1 mrad. Differences as high as 4% are observed in a clinical SBRT plan between the extreme case (LBROT not modeled) and the nominal case. Conclusion: In small-field radiation therapy treatment planning, it is important to properly account for LBROT as an input parameter for MC dose calculations with the Agility MLC. More work is ongoing to elucidate the observed differences by determining the contributions from transmission dose, change in field size, and source occlusion, which are all dependent on LBROT. This work was supported by OCAIRO (Ontario Consortium of Adaptive Interventions in Radiation Oncology), funded by the Ontario Research Fund.

  10. Agile High-Fidelity Mcnp Model Development Techniques for Rapid Mechanical Design Iteration

    NASA Astrophysics Data System (ADS)

    Kulesza, Joel A.

    2009-08-01

    In order to finalize mechanical design details and perform the associated radiological analyses for the AP1000 pressurized water reactor integrated head package (IHP) in time to meet industrial obligations, a process was developed that allowed a radiological analyst to rapidly respond to changing design criteria. This process used several tools together, most of which were freely available, that enabled the analyst to rapidly re-model both geometrical and radiological details, perform a three-dimensional dose field analysis with MCNP5, examine the results, and present the results in an informative and easily understandable manner to other technical working groups. Thus far the author has used this process to study the radiological impacts of different sources due to various incore instrumentation thimble assembly (IITA) materials, different IITA shield alloys and geometrical configurations, different MP shroud thicknesses, and parameterized air duct wall thicknesses and complementary shielding. Model processing before execution will be discussed in detail. Techniques will also be described which allow for rapid spatial redistribution based on the modified source term. Post processing tools and methods will also be described that yield both qualitative and quantitative results.

  11. Developing communications requirements for Agile Product Realization

    SciTech Connect

    Forsythe, C.; Ashby, M.R.

    1994-03-01

    Sandia National Laboratories has undertaken the Agile Product Realization for Innovative electroMEchanical Devices (A-PRIMED) pilot project to develop and implement technologies for agile design and manufacturing of electromechanical components. Emphasis on information-driven processes, concurrent engineering, and multi-functional team communications makes computer-supported cooperative work critical to achieving significantly faster product development cycles. This report describes analyses conducted in developing communications requirements and a communications plan that addresses the unique communications demands of an agile enterprise.

  12. Towards Agile Ontology Maintenance

    NASA Astrophysics Data System (ADS)

    Luczak-Rösch, Markus

    Ontologies are an appropriate means to represent knowledge on the Web. Research on ontology engineering has produced practices for integrative lifecycle support. However, broader success of ontologies in Web-based information systems remains out of reach, while more lightweight semantic approaches are rather successful. We assume that, paired with the emerging trend of services and microservices on the Web, new dynamic scenarios will gain momentum in which a shared knowledge base is made available to several dynamically changing services with disparate requirements. Our work envisions a step towards such a dynamic scenario, in which an ontology adapts in an agile way to the requirements of the accessing services and applications as well as the user's needs, reducing the experts' involvement in ontology maintenance processes.

  13. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  14. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

    The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in their fabs in the front-end process. In fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used. But in the coming years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing. At the same time, the integration of various equipment in a production plant calls for unified handling of data flow and interfaces. Only agile vision systems can cope with these competing demands: fast, reliable, adaptable, scalable and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computing-intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection. This is done in 3D, including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment and control tasks in IC package production and handling equipment, like Trim&Form, Tape&Reel and Pick&Place machines.

  15. Decision Science Challenges for C2 Agility

    DTIC Science & Technology

    2014-06-01

    Controlled and automatic human information processing: II. Perceptual learning, automatic attending and a general theory. Psychological Review, 84(2)... 19th ICCRTS “C2 Agility: Lessons Learned from Research and Operations”, Decision Science Challenges for C2 Agility, Topic 1 (First... systems. We have two vectors there. The first vector would be in things like man-machine interface. The second ... is in the whole area of cognition

  16. Research on modeling of the agile satellite using a single gimbal magnetically suspended CMG and the disturbance feedforward compensation for rotors.

    PubMed

    Cui, Peiling; Yan, Ning

    2012-12-12

    The magnetically suspended Control Moment Gyroscope (CMG) has the advantages of long life, low vibration, and lubrication-free operation, and is an ideal actuator for attitude control of agile maneuvering satellites. However, the stability of the rotor in the magnetic bearing and the precision of the output torque of a magnetically suspended CMG are affected by rapid satellite maneuvers. In this paper, a dynamic model of the agile satellite including a magnetically suspended single gimbal control moment gyroscope is built and the equivalent disturbance torque acting on the rotor is obtained. The feedforward compensation control method is used to suppress the disturbance on the rotor. Simulation results are given to show that the rotor displacement is markedly reduced.
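
    The feedforward idea can be illustrated with a one-axis toy model of the rotor suspension (a point mass held by a PD magnetic-bearing loop), not the paper's full satellite/CMG dynamics; the mass, gains, and disturbance waveform below are assumed illustration values. The estimated equivalent disturbance is simply subtracted from the bearing force command, and the peak rotor displacement with and without the compensation is compared.

      import numpy as np

      m, dt = 1.0, 1e-4                        # rotor mass (kg) and time step (s), assumed
      kp, kd = 4.0e4, 2.0e2                    # PD magnetic-bearing gains, assumed
      t = np.arange(0.0, 0.5, dt)
      d = 5.0 * np.sin(2 * np.pi * 10 * t)     # equivalent disturbance force from the maneuver (N)

      def simulate(use_feedforward):
          x = v = 0.0
          xs = np.empty_like(t)
          for k in range(t.size):
              u = -kp * x - kd * v             # feedback levitation force
              if use_feedforward:
                  u -= d[k]                    # cancel the estimated disturbance
              a = (u + d[k]) / m
              v += a * dt                      # semi-implicit Euler integration
              x += v * dt
              xs[k] = x
          return xs

      print("peak displacement, feedback only   : %.2e m" % np.abs(simulate(False)).max())
      print("peak displacement, with feedforward: %.2e m" % np.abs(simulate(True)).max())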

  17. Research on Modeling of the Agile Satellite Using a Single Gimbal Magnetically Suspended CMG and the Disturbance Feedforward Compensation for Rotors

    PubMed Central

    Cui, Peiling; Yan, Ning

    2012-01-01

    The magnetically suspended Control Moment Gyroscope (CMG) has the advantages of long life, low vibration, and lubrication-free operation, and is an ideal actuator for attitude control of agile maneuvering satellites. However, the stability of the rotor in the magnetic bearing and the precision of the output torque of a magnetically suspended CMG are affected by rapid satellite maneuvers. In this paper, a dynamic model of the agile satellite including a magnetically suspended single gimbal control moment gyroscope is built and the equivalent disturbance torque acting on the rotor is obtained. The feedforward compensation control method is used to suppress the disturbance on the rotor. Simulation results are given to show that the rotor displacement is markedly reduced. PMID:23235442

  18. Focused Logistics: Putting Agility in Agile Logistics

    DTIC Science & Technology

    2011-05-19

    envisioned an agile and adaptable logistics system built around common situational understanding. The Focused Logistics concept specified the requirement to...the tracking of resources moving through the TD network. As a result, distribution centers tagged inbound

  19. Integrating a distributed, agile, virtual enterprise in the TEAM program

    NASA Astrophysics Data System (ADS)

    Cobb, C. K.; Gray, W. Harvey; Hewgley, Robert E.; Klages, Edward J.; Neal, Richard E.

    1997-01-01

    The technologies enabling agile manufacturing (TEAM) program enhances industrial capability by advancing and deploying manufacturing technologies that promote agility. TEAM has developed a product realization process that features the integration of product design and manufacturing groups. TEAM uses the tools it collects, develops, and integrates in support of the product realization process to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, forming, and electromechanical assembly. In order to provide a proof-of-principle, the material removal process has been addressed first and has been successfully demonstrated in an 'interconnected' mode. An internet-accessible intersite file manager (IFM) application has been deployed to allow geographically distributed TEAM participants to share and distribute information as the product realization process is executed. An automated inspection planning application has been demonstrated, importing a solid model from the IFM, generating an inspection plan and a part program to be used in the inspection process, and then distributing the part program to the inspection site via the IFM. TEAM seeks to demonstrate the material removal process in an integrated mode in June 1997 complete with an object-oriented framework and infrastructure. The current status and future plans for this project are presented here.

  20. Production planning tools and techniques for agile manufacturing

    SciTech Connect

    Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Turnquist, M.A.

    1996-10-01

    Effective use of resources shared among multiple products or processes is critical for agile manufacturing. This paper describes development and implementation of a computerized model to support production planning in a complex manufacturing system at Pantex Plant. The model integrates two different production processes (nuclear weapon dismantlement and stockpile evaluation) which use common facilities and personnel, and reflects the interactions of scheduling constraints, material flow constraints, and resource availability. These two processes reflect characteristics of flow-shop and job-shop operations in a single facility. Operational results from using the model are also discussed.

  1. Unsteady Aerodynamic Models for Flight Control of Agile Micro Air Vehicles

    DTIC Science & Technology

    2010-08-13

    ...the model in [18] must be calibrated to experimental data, and often does not match data it was not specifically calibrated against [9]. The model [41...strategies have a serious limitation in that they need to be calibrated for a particular angle of attack, and are not immediately suitable for

  2. Model-Driven Agile Development of Reactive Multi-Agent Systems

    DTIC Science & Technology

    2006-01-01

    the Sage Tool Set The prototype Sage toolchain includes the Sage prototype tool set, sol2sal compiler, Salsa property checker [6], Sol compiler [4...to compute shared values. The sol2sal compiler translated a SOL model of the WCP, which Sage generated, to the language of Salsa. Salsa checked the...R. Bharadwaj and S. Simms. “Salsa: Combining Constraint Solvers with BDDs for Automatic Invariant Checking”, Lecture Notes in Computer Science

  3. Rapid, Agile Modeling Support for Human-Computer Interface Conceptual Design

    DTIC Science & Technology

    2008-12-01

    information (see Chi, Pirolli, and Pitkow, 2000). 5.7 LATENT SEMANTIC ANALYSIS (LSA) In CoLiDeS, semantic similarity is determined by Latent Semantic Analysis...P. W. Foltz, and D. Laham. 1998. “An Introduction to Latent Semantic Analysis,” Discourse Processes, vol. 25, pp. 259–284. Mannes, S. M. and W...1998. “Learning and Representing Verbal Meaning: Latent Semantic Analysis Theory,” Current Directions in Psychological Science, vol. 7, pp. 161–164

  4. An agile implementation of SCRUM

    NASA Astrophysics Data System (ADS)

    Gannon, Michele

    Is Agile a way to cut corners? To some, the use of an Agile Software Development Methodology has a negative connotation - “Oh, you're just not producing any documentation”. So can a team with no experience in Agile successfully implement and use SCRUM?

  5. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires the use of a project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  6. Analysis and optimization of preliminary aircraft configurations in relationship to emerging agility metrics

    NASA Technical Reports Server (NTRS)

    Sandlin, Doral R.; Bauer, Brent Alan

    1993-01-01

    This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition one study illustrates the module's ability to optimize a configuration's agility performance.

  7. Development of a Computer Program for Analyzing Preliminary Aircraft Configurations in Relationship to Emerging Agility Metrics

    NASA Technical Reports Server (NTRS)

    Bauer, Brent

    1993-01-01

    This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.

  8. Agile robotic edge finishing system research

    SciTech Connect

    Powell, M.A.

    1995-07-01

    This paper describes a new project undertaken by Sandia National Laboratories to develop an agile, automated, high-precision edge finishing system. The project has a two-year duration and was initiated in October, 1994. This project involves re-designing and adding capabilities to an existing finishing workcell at Sandia, and developing intelligent methods for automating process definition and for controlling finishing processes. The resulting system will serve as a prototype for systems that will be deployed into highly flexible automated production lines. The production systems will be used to produce a wide variety of products with limited production quantities and quick turnaround requirements. The prototype system is designed to allow programming, process definition, fixture re-configuration, and process verification to be performed off-line for new products. CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) models of the part will be used to assist with the automated process development and process control tasks. To achieve Sandia's performance goals, the system will employ advanced path planning, burr prediction expert systems, automated process definition, statistical process models in a process database, and a two-level control scheme using hybrid position-force control and fuzzy logic control. In this paper, we discuss the progress and the planned system development under this project.

  9. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method, and the current status of its successful incorporation into JPL development policies and processes.

  10. Peridigm summary report : lessons learned in development with agile components.

    SciTech Connect

    Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John; Parks, Michael L.

    2011-09-01

    This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture which impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.

  11. Achieving agility through parameter space qualification

    SciTech Connect

    Diegert, K.V.; Easterling, R.G.; Ashby, M.R.; Benavides, G.L.; Forsythe, C.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-02-01

    The A-primed (Agile Product Realization of Innovative electro-Mechanical Devices) project is defining and proving processes for agile product realization for the Department of Energy complex. Like other agile production efforts reported in the literature, A-primed uses concurrent engineering and information automation technologies to enhance information transfer. A unique aspect of our approach to agility is the qualification during development of a family of related product designs and their production processes, rather than a single design and its attendant processes. By applying engineering principles and statistical design of experiments, economies of test and analytic effort are realized for the qualification of the device family as a whole. Thus the need for test and analysis to qualify future devices from this family is minimized, thereby further reducing the design-to-production cycle time. As a measure of the success of the A-primed approach, the first design took 24 days to produce, and operated correctly on the first attempt. A flow diagram for the qualification process is presented. Guidelines are given for implementation, based on the authors' experiences as members of the A-primed qualification team.

  12. An investigation of fighter aircraft agility

    NASA Technical Reports Server (NTRS)

    Valasek, John; Downing, David R.

    1993-01-01

    of how to test and measure the metric, including any special data reduction requirements; typical values for the metric obtained using one or more aircraft types; and a sensitivity analysis if applicable. The report is organized as follows. The first chapter in the report presents a historical review of air combat trends which demonstrate the need for agility metrics in assessing the combat performance of fighter aircraft in a modern, all-aspect missile environment. The second chapter presents a framework for classifying each candidate metric according to time scale (transient, functional, instantaneous), further subdivided by axis (pitch, lateral, axial). The report is then broadly divided into two parts, with the transient agility metrics (pitch, lateral, axial) covered in chapters three, four, and five, and the functional agility metrics covered in chapter six. Conclusions, recommendations, and an extensive reference list and bibliography are also included. Five appendices contain a comprehensive list of the definitions of all the candidate metrics; a description of the aircraft models and flight simulation programs used for testing the metrics; several relations and concepts which are fundamental to the study of lateral agility; an in-depth analysis of the axial agility metrics; and a derivation of the relations for the instantaneous agility and their approximations.

  13. Perspectives on Agile Coaching

    NASA Astrophysics Data System (ADS)

    Fraser, Steven; Lundh, Erik; Davies, Rachel; Eckstein, Jutta; Larsen, Diana; Vilkki, Kati

    There are many perspectives to agile coaching, including: growing coaching expertise; selecting the appropriate coach for your context; and evaluating value. A coach is often an itinerant who may observe, mentor, negotiate, influence, lead, and/or architect everything from team organization to system architecture. With roots in diverse fields ranging from technology to sociology, coaches have differing motivations and experience bases. This panel will bring together coaches to debate and discuss various perspectives on agile coaching. Some of the questions to be addressed will include: What are the skills required for effective coaching? What should be the expectations for teams or individuals being coached? Should coaches be: a corporate resource (internal team of consultants working with multiple internal teams); an integral part of a specific team; or external contractors? How should coaches exercise influence and authority? How should management assess the value of a coaching engagement? Do you have what it takes to be a coach? - This panel will bring together seasoned agile coaches to offer their experience and advice on how to be the best you can be!

  14. Tailless Vectored Fighters Theory. Laboratory and Flight Tests, Including Vectorable Inlets/Nozzles and Tailless Flying Models vs. Pilot’s Tolerances Affecting Maximum Post-Stall Vectoring Agility.

    DTIC Science & Technology

    1991-07-01

    scaled models, TV-agility is an interdisciplinary subject involving a revolution in engineering and pilot education, References 1. Gal-Or, B...er 85-4014. Agility in Demand', Aerospace America, Vol. 26, May 1988, pp. 56-58. 6. Herbst, W. B., "Thrust Vectoring - Why and How?" ISABE-87-7061

  15. Pinnacle3 modeling and end-to-end dosimetric testing of a Versa HD linear accelerator with the Agility head and flattening filter-free modes.

    PubMed

    Saenz, Daniel L; Narayanasamy, Ganesh; Cruz, Wilbert; Papanikolaou, Nikos; Stathakis, Sotirios

    2016-01-08

    The Elekta Versa HD incorporates a variety of upgrades to the line of Elekta linear accelerators, primarily including the Agility head and flattening filter-free (FFF) photon beam delivery. The completely distinct dosimetric output of the head from its predecessors, combined with the FFF beams, requires a new investigation of modeling in treatment planning systems. A model was created in Pinnacle3 v9.8 with the commissioned beam data. A phantom consisting of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle3, where beams of different field sizes, source-to-surface distances (SSDs), wedges, and gantry angles were devised. Beams included all of the available photon energies (6, 10, 18, 6 FFF, and 10 FFF MV), as well as the four electron energies commissioned for clinical use (6, 9, 12, and 15 MeV). The plans were verified at calculation points by measurement with a calibrated ionization chamber. Homogeneous and heterogeneous point-dose measurements agreed within 2% relative to maximum dose for all photon and electron beams. AP photon open field measurements along the central axis at 100 cm SSD passed within 1%. In addition, IMRT testing was also performed with three standard plans (step and shoot IMRT, as well as a small- and large-field VMAT plan). The IMRT plans were delivered on the Delta4 IMRT QA phantom, for which a gamma passing rate was > 99.5% for all plans with a 3% dose deviation, 3 mm distance-to-agreement, and 10% dose threshold. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4% ± 2.3%. Such testing ensures confidence in the ability of Pinnacle3 to model photon and electron beams with the Agility head.
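
    For readers unfamiliar with the gamma passing rates quoted above, the sketch below computes a minimal one-dimensional global gamma index in the spirit of Low et al. (3% dose difference, 3 mm distance-to-agreement, 10% low-dose threshold). It is an illustration only, not the Delta4 vendor implementation, and the "measured" curve is a synthetic perturbation of the reference.

      import numpy as np

      def gamma_pass_rate(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0, cutoff=0.10):
          # Global gamma: dose criterion as a fraction of the reference maximum,
          # DTA in mm; points below the low-dose cutoff are excluded.
          d_max = d_ref.max()
          gammas = []
          for xe, de in zip(x_eval, d_eval):
              if de < cutoff * d_max:
                  continue
              dose_term = (de - d_ref) / (dd * d_max)
              dist_term = (xe - x_ref) / dta
              gammas.append(np.sqrt(dose_term ** 2 + dist_term ** 2).min())
          return 100.0 * np.mean(np.array(gammas) <= 1.0)

      x = np.linspace(-50.0, 50.0, 201)                            # mm
      reference = np.exp(-x ** 2 / (2 * 15.0 ** 2))
      measured = 1.01 * np.exp(-(x - 1.0) ** 2 / (2 * 15.0 ** 2))  # 1% scale, 1 mm shift
      print("gamma passing rate: %.1f%%" % gamma_pass_rate(x, reference, x, measured))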

  16. Operational Agility (La Maniabilite Operationnelle)

    DTIC Science & Technology

    1994-04-01

    describe how to go further and how to set up an analytical framework for the analysis of another fundamental property of modern combat aircraft, that is... analytical framework for the analysis of airframe agility and for the derivation of agility metrics. A general consensus has been found in relating agility...Gianchecchi, Aermacchi Lt Col. G. Fristachi, Italian Air Force Prof. M. Innocenti, Auburn University/University of Pisa United Kingdom Mr P. Gordon

  17. Supply chain network design problem for a new market opportunity in an agile manufacturing system

    NASA Astrophysics Data System (ADS)

    Babazadeh, Reza; Razmi, Jafar; Ghodsi, Reza

    2012-08-01

    The characteristics of today's competitive environment, such as the speed with which products are designed, manufactured, and distributed, and the need for higher responsiveness and lower operational cost, are forcing companies to search for innovative ways to do business. The concept of agile manufacturing has been proposed in response to these challenges. This paper addresses the strategic and tactical level decisions in agile supply chain network design. An efficient mixed-integer linear programming model is developed that is able to consider the key characteristics of an agile supply chain, such as direct shipments, outsourcing, different transportation modes, discounts, alliance (process and information integration) between opened facilities, and maximum waiting time of customers for deliveries. In addition, in the proposed model the capacities of facilities, which are often assumed to be fixed, are determined as decision variables. Computational results illustrate that the proposed model can be applied as a powerful tool in agile supply chain network design as well as in the integration of strategic decisions with tactical decisions.
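
    As an illustration of the kind of model described above, the sketch below solves a deliberately tiny network-design MILP (binary open/close decisions with fixed costs, continuous shipment flows, capacity and demand constraints) using the open-source PuLP package; all plants, customers, costs, and capacities are invented example data, and the model omits the paper's agility-specific features such as outsourcing, transportation modes, discounts, and waiting-time limits.

      import pulp

      plants = ["P1", "P2"]
      customers = ["C1", "C2", "C3"]
      fixed_cost = {"P1": 500, "P2": 700}           # cost of opening each plant (example)
      capacity = {"P1": 120, "P2": 160}
      demand = {"C1": 60, "C2": 50, "C3": 70}
      ship = {("P1", "C1"): 4, ("P1", "C2"): 6, ("P1", "C3"): 9,
              ("P2", "C1"): 7, ("P2", "C2"): 3, ("P2", "C3"): 4}  # unit transport cost

      prob = pulp.LpProblem("agile_network_design", pulp.LpMinimize)
      opened = pulp.LpVariable.dicts("open", plants, cat="Binary")
      flow = pulp.LpVariable.dicts("flow", list(ship), lowBound=0)

      # objective: fixed opening costs plus transportation cost
      prob += (pulp.lpSum(fixed_cost[p] * opened[p] for p in plants)
               + pulp.lpSum(ship[k] * flow[k] for k in ship))
      for c in customers:                           # satisfy every customer's demand
          prob += pulp.lpSum(flow[(p, c)] for p in plants) == demand[c]
      for p in plants:                              # ship only from opened plants, within capacity
          prob += pulp.lpSum(flow[(p, c)] for c in customers) <= capacity[p] * opened[p]

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      print({p: int(opened[p].value()) for p in plants}, pulp.value(prob.objective))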

  18. Modelling fire-fighter responses to exercise and asymmetric infrared radiation using a dynamic multi-mode model of human physiology and results from the sweating agile thermal manikin.

    PubMed

    Richards, M G M; Fiala, D

    2004-09-01

    In this study, predicted dynamic physiological responses are compared with wear trial results for firefighter suits: impermeable (A), semi-permeable (B) and permeable (C), and underwear. Wear trials consisted of three rest phases and two moderate work phases, with a frontal infrared (IR) radiation exposure of 500 W/m2 for the last 15 min of each work phase. Simulations were performed by detailed modelling of the experimental boundary conditions, including the inhomogeneous IR radiation combined with clothing properties for still and walking conditions measured using the Sweating Agile thermal Manikin. Accounting for the effect of sweat gland activity suppression with increased skin wettedness, the predicted total moisture loss was insignificantly different (P<0.05) from the wear trial value for suits B and C but was 37% too high for suit A. Predicted evolution of core, mean skin and local skin temperatures agreed well with the wear trial results for all clothing. Root mean square deviations ranged from 0.11 degrees C to 0.26 degrees C for core temperatures and from 0.28 degrees C to 0.38 degrees C for mean skin temperatures, which were typically lower than the experimental error. Transient thermodynamic processes occurring within suit A may account for the delayed/reduced fall in core temperature following exercise.

  19. Aging Contributes to Inflammation in Upper Extremity Tendons and Declines in Forelimb Agility in a Rat Model of Upper Extremity Overuse

    PubMed Central

    Kietrys, David M.; Barr-Gillespie, Ann E.; Amin, Mamta; Wade, Christine K.; Popoff, Steve N.; Barbe, Mary F.

    2012-01-01

    We sought to determine if tendon inflammatory and histopathological responses increase in aged rats compared to young rats performing a voluntary upper extremity repetitive task, and if these changes are associated with motor declines. Ninety-six female Sprague-Dawley rats were used in the rat model of upper extremity overuse: 67 aged and 29 young adult rats. After a training period of 4 weeks, task rats performed a voluntary high repetition low force (HRLF) handle-pulling task for 2 hrs/day, 3 days/wk for up to 12 weeks. Upper extremity motor function was assessed, as were inflammatory and histomorphological changes in flexor digitorum and supraspinatus tendons. The percentage of successful reaches improved in young adult HRLF rats, but not in aged HRLF rats. Forelimb agility decreased transiently in young adult HRLF rats, but persistently in aged HRLF rats. HRLF task performance for 12 weeks led to increased IL-1beta and IL-6 in flexor digitorum tendons of aged HRLF rats, compared to aged normal control (NC) as well as young adult HRLF rats. In contrast, TNF-alpha increased more in flexor digitorum tendons of young adult 12-week HRLF rats than in aged HRLF rats. Vascularity and collagen fibril organization were not affected by task performance in flexor digitorum tendons of either age group, although cellularity increased in both. By week 12 of HRLF task performance, vascularity and cellularity increased in the supraspinatus tendons of only aged rats. The increased cellularity was due to increased macrophages and connective tissue growth factor (CTGF)-immunoreactive fibroblasts in the peritendon. In conclusion, aged rat tendons were overall more affected by the HRLF task than young adult tendons, particularly supraspinatus tendons. Greater inflammatory changes were observed in aged HRLF rat tendons, increases that were temporally associated with decreased forelimb agility and a lack of improvement in task success. PMID:23056540

  20. Moving target detection for frequency agility radar by sparse reconstruction.

    PubMed

    Quan, Yinghui; Li, YaChao; Wu, Yaojun; Ran, Lei; Xing, Mengdao; Liu, Mengqi

    2016-09-01

    Frequency agility radar, with a randomly varied carrier frequency from pulse to pulse, exhibits superior performance against electromagnetic interference compared to conventional fixed carrier frequency pulse-Doppler radar. A novel moving target detection (MTD) method based on sparse reconstruction is proposed for estimating the target's velocity in frequency agility radar using the pulses within a coherent processing interval. A hardware implementation of the orthogonal matching pursuit algorithm is executed on a Xilinx Virtex-7 Field Programmable Gate Array (FPGA) to perform the sparse optimization. Finally, a series of experiments is performed to evaluate the performance of the proposed MTD method for frequency agility radar systems.
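
    The sketch below illustrates the sparse-recovery idea on a toy slow-time signal model: each pulse n with carrier f_n contributes a Doppler phase exp(j4πf_n v t_n / c), a dictionary is built over a grid of candidate velocities, and a plain orthogonal matching pursuit (OMP) in NumPy picks the dominant atom. The pulse count, carrier hopping pattern, and noise level are assumed values, and this is not the paper's exact formulation or its FPGA implementation.

      import numpy as np

      rng = np.random.default_rng(0)
      c = 3e8
      n_pulses, prf = 64, 2e3
      t = np.arange(n_pulses) / prf
      f = 10e9 + rng.integers(0, 32, n_pulses) * 10e6     # randomly hopped carrier per pulse

      v_true = 120.0                                      # m/s
      y = np.exp(1j * 4 * np.pi * f * v_true * t / c)     # slow-time phase history (range term dropped)
      y += 0.1 * (rng.standard_normal(n_pulses) + 1j * rng.standard_normal(n_pulses))

      v_grid = np.arange(-300.0, 300.0, 1.0)              # candidate velocities
      A = np.exp(1j * 4 * np.pi * np.outer(f * t, v_grid) / c)
      A /= np.linalg.norm(A, axis=0)                      # unit-norm dictionary atoms

      def omp(A, y, sparsity):
          # Greedy OMP: pick the atom most correlated with the residual,
          # then re-fit all selected atoms by least squares.
          residual, support, coeffs = y.copy(), [], None
          for _ in range(sparsity):
              support.append(int(np.argmax(np.abs(A.conj().T @ residual))))
              coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
              residual = y - A[:, support] @ coeffs
          return support, coeffs

      support, _ = omp(A, y, sparsity=1)
      print("estimated velocity: %.1f m/s" % v_grid[support[0]])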

  1. Moving target detection for frequency agility radar by sparse reconstruction

    NASA Astrophysics Data System (ADS)

    Quan, Yinghui; Li, YaChao; Wu, Yaojun; Ran, Lei; Xing, Mengdao; Liu, Mengqi

    2016-09-01

    Frequency agility radar, with a randomly varied carrier frequency from pulse to pulse, exhibits superior performance against electromagnetic interference compared to conventional fixed carrier frequency pulse-Doppler radar. A novel moving target detection (MTD) method based on sparse reconstruction is proposed for estimating the target's velocity in frequency agility radar using the pulses within a coherent processing interval. A hardware implementation of the orthogonal matching pursuit algorithm is executed on a Xilinx Virtex-7 Field Programmable Gate Array (FPGA) to perform the sparse optimization. Finally, a series of experiments is performed to evaluate the performance of the proposed MTD method for frequency agility radar systems.

  2. Agile Walking Robot

    NASA Technical Reports Server (NTRS)

    Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.; Waldron, Kenneth J.

    1990-01-01

    Proposed agile walking robot operates over rocky, sandy, and sloping terrain. Offers stability and climbing ability superior to other conceptual mobile robots. Equipped with six articulated legs like those of insect, continually feels ground under leg before applying weight to it. If leg sensed unexpected object or failed to make contact with ground at expected point, seeks alternative position within radius of 20 cm. Failing that, robot halts, examines area around foot in detail with laser ranging imager, and replans entire cycle of steps for all legs before proceeding.

  3. Agile Infrastructure Monitoring

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.

    2014-06-01

    At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.

  4. Frequency agile optical parametric oscillator

    DOEpatents

    Velsko, Stephan P.

    1998-01-01

    The frequency agile OPO device converts a fixed wavelength pump laser beam to arbitrary wavelengths within a specified range with pulse to pulse agility, at a rate limited only by the repetition rate of the pump laser. Uses of this invention include Laser radar, LIDAR, active remote sensing of effluents/pollutants, environmental monitoring, antisensor lasers, and spectroscopy.

  5. Frequency agile optical parametric oscillator

    DOEpatents

    Velsko, S.P.

    1998-11-24

    The frequency agile OPO device converts a fixed wavelength pump laser beam to arbitrary wavelengths within a specified range with pulse to pulse agility, at a rate limited only by the repetition rate of the pump laser. Uses of this invention include Laser radar, LIDAR, active remote sensing of effluents/pollutants, environmental monitoring, antisensor lasers, and spectroscopy. 14 figs.

  6. Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2016-01-01

    With the software industry rapidly transitioning from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA), established the Agile Benchmarking Team (ABT) to ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASA's critical software (SW) systems. The Team's tasks were: 1. Research background literature on current Agile processes; 2. Perform benchmark activities with other organizations involved in software Agile processes to determine best practices; 3. Collect information on Agile-developed systems to enable improvements to current NASA standards and processes, enhancing their ability to perform reliable software assurance on NASA Agile-developed systems; 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering, and software assurance are addressed herein.

  7. SU-E-T-360: End-To-End Dosimetric Testing of a Versa HD Linear Accelerator with the Agility Head Modeled in Pinnacle3

    SciTech Connect

    Saenz, D; Narayanasamy, G; Cruz, W; Papanikolaou, N; Stathakis, S

    2015-06-15

    Purpose: The Versa HD incorporates a variety of upgrades, primarily including the Agility head. The dosimetric properties of the head, distinct from those of its predecessors and combined with flattening-filter-free (FFF) beams, require a new investigation of modeling in planning systems and verification of modeling accuracy. Methods: A model was created in Pinnacle3 v9.8 with commissioned beam data. Leaf transmission was modeled as <0.5% with maximum leaf speed of 3 cm/s. Photon spectra were tuned for FFF beams, for which profiles were modeled with arbitrary profiles rather than with cones. For verification, a variety of plans with varied parameters were devised, and point dose measurements were compared to calculated values. A phantom of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle3. Beams of different field sizes, SSD, wedges, and gantry angles were created. All available photon energies (6 MV, 10 MV, 18 MV, 6 FFF, 10 FFF) as well as four clinical electron energies (6, 9, 12, and 15 MeV) were investigated. The plans were verified at a calculation point (8 cm deep for photons, variable for electrons) by measurement with a PTW Semiflex ionization chamber. In addition, IMRT testing was performed with three standard plans (step and shoot IMRT, small and large field VMAT plans). The plans were delivered on the Delta4 IMRT QA phantom (ScandiDos, Uppsala, Sweden). Results: Homogeneous point dose measurement agreed within 2% for all photon and electron beams. Open field photon measurements along the central axis at 100 cm SSD passed within 1%. Gamma passing rates were >99.5% for all plans with a 3%/3 mm tolerance criterion. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4±2.3%. Conclusion: The end-to-end testing ensured confidence in the ability of Pinnacle3 to model photon and electron beams with the Agility head.

  8. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizing their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independently of the infrastructure's process, application and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source of information to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also supported, allowing effective management of the informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  9. Agile manufacturing: The factory of the future

    NASA Technical Reports Server (NTRS)

    Loibl, Joseph M.; Bossieux, Terry A.

    1994-01-01

    The factory of the future will require an operating methodology which effectively utilizes all of the elements of product design, manufacturing and delivery. The process must respond rapidly to changes in product demand, product mix, design changes or changes in the raw materials. To achieve agility in a manufacturing operation, the design and development of the manufacturing processes must focus on customer satisfaction. Achieving the greatest results requires that the manufacturing process be considered from product concept through sales. This provides the best opportunity to build a quality product for the customer at a reasonable rate. The primary elements of a manufacturing system include people, equipment, materials, methods and the environment. The most significant and most agile element in any process is the human resource. Only with a highly trained, knowledgeable work force can the proper methods be applied to efficiently process materials with machinery which is predictable, reliable and flexible. This paper discusses the effect of each element on the development of agile manufacturing systems.

  10. Coaching for Better (Software) Buying Power in an Agile World

    DTIC Science & Technology

    2013-06-01

    professionals carefully to consider incorporation of agile methodologies into the set of acquisition tools at their disposal. This transformation is not...believes that DevOps, the process of warfighters and developers working together throughout the project, is superior to volumes of detailed...Professionalism of the Total Acquisition Workforce The DoD needs to invest in training the acquisition workforce in agile methodologies to add tools that

  11. Need for Agility in Security Constraints for Distributed Simulation

    DTIC Science & Technology

    2014-06-01

    Submission to: 19th ICCRTS. Title: Need for Agility in Security Constraints for Distributed Simulation (024). Topics: Primary Topic: 5. Modelling and Simulation; Alternates: 4. Experimentation, Metrics and Analysis, and 1. Concepts, Theory, and Policy. S. K. Numrich, Ph.D.

  12. XP Workshop on Agile Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Ghanam, Yaser; Cooper, Kendra; Abrahamsson, Pekka; Maurer, Frank

    Software Product Line Engineering (SPLE) promises to lower the costs of developing individual applications as they heavily reuse existing artifacts. Besides decreasing costs, software reuse achieves faster development and higher quality. Traditionally, SPLE favors big design upfront and employs traditional, heavyweight processes. On the other hand, agile methods have been proposed to rapidly develop high quality software by focusing on producing working code while reducing upfront analysis and design. Combining both paradigms, although challenging, can yield significant improvements.

  13. Compact, Automated, Frequency-Agile Microspectrofluorimeter

    NASA Technical Reports Server (NTRS)

    Fernandez, Salvador M.; Guignon, Ernest F.

    1995-01-01

    Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.

  14. Agile manufacturing concept

    NASA Astrophysics Data System (ADS)

    Goldman, Steven L.

    1994-03-01

    The initial conceptualization of agile manufacturing was the result of a 1991 study -- chaired by Lehigh Professor Roger N. Nagel and California-based entrepreneur Rick Dove, President of Paradigm Shifts, International -- of what it would take for U.S. industry to regain global manufacturing competitiveness by the early twenty-first century. This industry-led study, reviewed by senior management at over 100 companies before its release, concluded that incremental improvement of the current system of manufacturing would not be enough to be competitive in today's global marketplace. Computer-based information and production technologies that were becoming available to industry opened up the possibility of an altogether new system of manufacturing, one that would be characterized by a distinctive integration of people and technologies; of management and labor; of customers, producers, suppliers, and society.

  15. SU-E-T-250: Determining VMAT Machine Limitations of An Elekta Linear Accelerator with Agility MLC for Accurate Modeling in RayStation and Robust Delivery

    SciTech Connect

    Yang, K; Yu, Z; Chen, H; Mourtada, F

    2015-06-15

    Purpose: To implement VMAT in RayStation with the Elekta Synergy linac with the new Agility MLC, and to utilize the same vendor software to determine the optimum Elekta VMAT machine parameters in RayStation for accurate modeling and robust delivery. Methods: iCOMCat is utilized to create various beam patterns with user-defined dose rate, gantry, MLC, and jaw speeds for each control point. The accuracy and stability of the output and beam profile are qualified for each isolated functional component of VMAT delivery using an ion chamber and a Profiler2 with an isocentric mounting fixture. Service graphing on the linac console is used to verify the mechanical motion accuracy. The determined optimum Elekta VMAT machine parameters were configured in RayStation v4.5.1. To evaluate the overall system performance, TG-119 test cases and nine retrospective VMAT patients were planned on RayStation, and validated using both ArcCHECK (with plug and ion chamber) and MapCHECK2. Results: Machine output and profile vary by <0.3% when the only variable is dose rate (35 MU/min-600 MU/min). <0.9% output and <0.3% profile variation are observed with additional gantry motion (0.53 deg/s-5.8 deg/s, both directions). The output and profile variation are still <1% with additional slow leaf motion (<1.5 cm/s, both directions). However, the profile becomes less symmetric, and >1.5% output and 7% profile deviations are seen with >2.5 cm/s leaf motion. All clinical cases achieved plan quality comparable to the treated IMRT plans. The gamma passing rate is 99.5±0.5% on ArcCHECK (<3% isocenter dose deviation) and 99.1±0.8% on MapCHECK2 using 3%/3mm gamma (10% lower threshold). Mechanical motion accuracy in all VMAT deliveries is <1°/1mm. Conclusion: Accurate RayStation modeling and robust VMAT delivery are achievable on the Elekta Agility for <2.5 cm/s leaf motion and the full range of dose rates and gantry speeds, as determined by the same vendor software. Our TG-119 and patient results have provided us with the confidence to use VMAT

  16. Development of EarthCube Governance: An Agile Approach

    NASA Astrophysics Data System (ADS)

    Pearthree, G.; Allison, M. L.; Patten, K.

    2013-12-01

    Governance of geosciences cyberinfrastructure is a complex and essential undertaking, critical in enabling distributed knowledge communities to collaborate and communicate across disciplines, distances, and cultures. Advancing science with respect to "grand challenges," such as global climate change, weather prediction, and core fundamental science, depends not just on technical cyber systems, but also on social systems for strategic planning, decision-making, project management, learning, teaching, and building a community of practice. Simply put, a robust, agile technical system depends on an equally robust and agile social system. Cyberinfrastructure development is wrapped in social, organizational and governance challenges, which may significantly impede progress. An agile development process is underway for governance of transformative investments in geosciences cyberinfrastructure through the NSF EarthCube initiative. Agile development is iterative and incremental, and promotes adaptive planning and rapid and flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness. A project Secretariat acts as the coordinating body, carrying out duties for planning, organizing, communicating, and reporting. A broad coalition of stakeholder groups comprises an Assembly (Mainstream Scientists, Cyberinfrastructure Institutions, Information Technology/Computer Sciences, NSF EarthCube Investigators, Science Communities, EarthCube End-User Workshop Organizers, Professional Societies) to serve as a preliminary venue for identifying, evaluating, and testing potential governance models. To offer opportunity for broader end-user input, a crowd-source approach will engage stakeholders not otherwise involved. An Advisory Committee from the Earth, ocean, atmosphere, social, computer and library sciences is guiding the process from a high-level policy point of view. Developmental

  17. Parallel optimization methods for agile manufacturing

    SciTech Connect

    Meza, J.C.; Moen, C.D.; Plantenga, T.D.; Spence, P.A.; Tong, C.H.; Hendrickson, B.A.; Leland, R.W.; Reese, G.M.

    1997-08-01

    The rapid and optimal design of new goods is essential for meeting national objectives in advanced manufacturing. Currently almost all manufacturing procedures involve the determination of some optimal design parameters. This process is iterative in nature and because it is usually done manually it can be expensive and time consuming. This report describes the results of an LDRD, the goal of which was to develop optimization algorithms and software tools that will enable automated design thereby allowing for agile manufacturing. Although the design processes vary across industries, many of the mathematical characteristics of the problems are the same, including large-scale, noisy, and non-differentiable functions with nonlinear constraints. This report describes the development of a common set of optimization tools using object-oriented programming techniques that can be applied to these types of problems. The authors give examples of several applications that are representative of design problems including an inverse scattering problem, a vibration isolation problem, a system identification problem for the correlation of finite element models with test data and the control of a chemical vapor deposition reactor furnace. Because the function evaluations are computationally expensive, they emphasize algorithms that can be adapted to parallel computers.
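
    The snippet below illustrates, on a deliberately small problem, the class of optimization task described above: a noisy, non-differentiable objective with a nonlinear constraint, handled by a derivative-free solver (SciPy's COBYLA). It is only a stand-in for the report's parallel, object-oriented toolkit, and the objective, constraint, and starting point are invented.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)

      def design_cost(x):
          # Noisy, non-smooth stand-in for an expensive simulation-based objective.
          return abs(x[0] - 1.5) + (x[1] - 0.5) ** 2 + 1e-3 * rng.standard_normal()

      constraints = [
          {"type": "ineq", "fun": lambda x: 4.0 - (x[0] ** 2 + x[1] ** 2)},  # stay inside a disk
          {"type": "ineq", "fun": lambda x: x[0]},                           # first parameter nonnegative
      ]

      result = minimize(design_cost, x0=[0.0, 0.0], method="COBYLA",
                        constraints=constraints, options={"rhobeg": 0.5, "maxiter": 200})
      print("design parameters:", result.x, "cost: %.3f" % result.fun)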

  18. Elements of an Art - Agile Coaching

    NASA Astrophysics Data System (ADS)

    Lundh, Erik

    This tutorial gives you a lead on becoming or redefining yourself as an Agile Coach. Introduction to elements and dimensions of state-of-the-art Agile Coaching. How to position the agile coach to be effective in a larger setting. Making the agile transition - from a single team to thousands of people. How to support multiple teams as a coach. How to build a coaches network in your company. Challenges when the agile coach is a consultant and the organization is large.

  19. Providing Agility in C2 Environments Through Networked Information Processing: A Model of Expertise

    DTIC Science & Technology

    2014-06-01

    decision making,” Nursing Research, vol. 35, no. 2, March 1986. [4] M. Yagi, H. Ohno, and K. Takada, “Decision-making system for orthodontic treatment planning based on direct implementation of expertise knowledge,” 2010 Annual International Conference of the IEEE Engineering in Medicine

  20. Inserting Agility in System Development

    DTIC Science & Technology

    2012-07-01

    Agile IT Acquisition, IT Box, Scrum. Inserting Agility in System Development, Matthew R. Kennedy and Lt Col Dan Ward, USAF. With the fast-paced nature...1,700 individuals and 71 countries, found Scrum and eXtreme Programming to be the most widely followed methodologies (VersionOne, 2007). Other...Scrum is a framework used for project management, which is

  1. What Does an Agile Coach Do?

    NASA Astrophysics Data System (ADS)

    Davies, Rachel; Pullicino, James

    The surge in Agile adoption has created a demand for project managers who coach rather than direct their teams. A sign of this trend is the ever-increasing number of people getting certified as scrum masters and agile leaders. Training courses that introduce agile practices are easy to find. But making the transition to coach is not as simple as understanding what agile practices are. Your challenge as an Agile Coach is to support your team in learning how to wield their new Agile tools in creating great software.

  2. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.

  3. The influence of physical and cognitive factors on reactive agility performance in men basketball players.

    PubMed

    Scanlan, Aaron; Humphries, Brendan; Tucker, Patrick S; Dalbo, Vincent

    2014-01-01

    This study explored the influence of physical and cognitive measures on reactive agility performance in basketball players. Twelve men basketball players performed multiple sprint, Change of Direction Speed Test, and Reactive Agility Test trials. Pearson's correlation analyses were used to determine relationships between the predictor variables (stature, mass, body composition, 5-m, 10-m and 20-m sprint times, peak speed, closed-skill agility time, response time and decision-making time) and reactive agility time (response variable). Simple and stepwise regression analyses determined the individual influence of each predictor variable and the best predictor model for reactive agility time. Morphological (r = -0.45 to 0.19), sprint (r = -0.40 to 0.41) and change-of-direction speed measures (r = 0.43) had small to moderate correlations with reactive agility time. Response time (r = 0.76, P = 0.004) and decision-making time (r = 0.58, P = 0.049) had large to very large relationships with reactive agility time. Response time was identified as the sole predictor variable for reactive agility time in the stepwise model (R(2) = 0.58, P = 0.004). In conclusion, cognitive measures had the greatest influence on reactive agility performance in men basketball players. These findings suggest reaction and decision-making drills should be incorporated in basketball training programmes.
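
    A minimal sketch of the analysis pattern described above (correlate each predictor with reactive agility time, then regress on the strongest correlate as a stand-in for the stepwise step), using SciPy on synthetic numbers rather than the study's measurements; the sample size, means, and spreads are invented.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n = 12                                                    # players
      response_time = rng.normal(0.30, 0.04, n)                 # s, synthetic
      sprint_10m = rng.normal(1.75, 0.08, n)                    # s, synthetic
      reactive_agility = 1.2 + 2.5 * response_time + rng.normal(0.0, 0.05, n)

      for name, x in [("response time", response_time), ("10-m sprint", sprint_10m)]:
          r, p = stats.pearsonr(x, reactive_agility)
          print("%-13s  r = %+.2f  p = %.3f" % (name, r, p))

      # single-predictor model on the strongest correlate
      slope, intercept, r, p, stderr = stats.linregress(response_time, reactive_agility)
      print("single-predictor model R^2 = %.2f" % r ** 2)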

  4. Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community

    NASA Astrophysics Data System (ADS)

    Young, J. W.; Lenhardt, W. C.; Parsons, M. A.; Benedict, K. K.

    2014-12-01

    The data life cycle has figured prominently in describing the context of digital scientific data stewardship and cyberinfrastructure in support of science. There are many different versions of the data life cycle, but they all follow a similar basic pattern: plan, collect, ingest, assess, preserve, discover, and reuse. The process is often interpreted in a fairly linear fashion despite it being a cycle conceptually. More recently at GeoData 2014 and elsewhere, questions have been raised about the utility of the data life cycle as it is currently represented. We are proposing to the community a re-examination of the data life cycle using an agile lens. Our goal is not to deploy agile methods, but to use agile principles as a heuristic to think about how to incorporate data stewardship across the scientific process from proposal stage to research and beyond. We will present alternative conceptualizations of the data life cycle with a goal to solicit feedback and to develop a new model for conceiving and describing the overall data stewardship process. We seek to re-examine past assumptions and shed new light on the challenges and necessity of data stewardship. The ultimate goal is to support new science through enhanced data interoperability, usability, and preservation.

  5. The agile alert system for gamma-ray transients

    SciTech Connect

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Fioretti, V.; Chen, A. W.; Pittori, C.; Verrecchia, F.; Lucarelli, F.; Santolamazza, P.; Fanari, G.; Giommi, P.; Pellizzoni, A.; and others

    2014-01-20

    In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) on board the AGILE space mission. AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many γ-ray transients of both galactic and extragalactic origin. This work presents the AGILE innovative approach to fast γ-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind search identification of transients within a short processing time, (3) the AGILE procedure for γ-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. As proper flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, via e-mail, and via push notifications from an application for smartphones and tablets. These alerts are crosschecked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in γ-ray astrophysics.
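
    The alert logic described above can be pictured with a minimal sketch: count rates are binned on several timescales after each downlink, and an alert is dispatched when a rate exceeds its preset threshold. This is not the AGILE team's pipeline; the timescales, thresholds, and the notify() stub are illustrative assumptions, and the real system adds blind-search detection and likelihood ratio significance tests.

    ```python
    # Toy threshold-exceedance check of the kind an automated alert pipeline runs
    # after each data downlink.
    from dataclasses import dataclass
    from typing import Dict, Iterable, List

    @dataclass
    class Alert:
        timescale_h: float
        rate: float
        threshold: float

    def bin_rates(event_times_h: Iterable[float], window_h: float, span_h: float) -> List[float]:
        """Photon arrival times (hours) -> count rates per window of length window_h."""
        times = sorted(event_times_h)
        rates, t = [], 0.0
        while t < span_h:
            n = sum(1 for e in times if t <= e < t + window_h)
            rates.append(n / window_h)
            t += window_h
        return rates

    def check_downlink(event_times_h, span_h: float, thresholds: Dict[float, float]) -> List[Alert]:
        """thresholds: {timescale_h: max_rate}; returns one Alert per exceedance."""
        alerts = []
        for window_h, limit in thresholds.items():
            for rate in bin_rates(event_times_h, window_h, span_h):
                if rate > limit:
                    alerts.append(Alert(window_h, rate, limit))
        return alerts

    def notify(alert: Alert) -> None:
        # Stand-in for the SMS / e-mail / push dispatch described in the abstract.
        print(f"ALERT: {alert.rate:.1f} cts/h in a {alert.timescale_h} h bin "
              f"(threshold {alert.threshold:.1f})")

    # for a in check_downlink(events, span_h=24.0, thresholds={1.0: 50.0, 6.0: 20.0}):
    #     notify(a)
    ```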

  6. The AGILE Alert System for Gamma-Ray Transients

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Parmiggiani, N.; Fioretti, V.; Chen, A. W.; Vercellone, S.; Pittori, C.; Verrecchia, F.; Lucarelli, F.; Santolamazza, P.; Fanari, G.; Giommi, P.; Beneventano, D.; Argan, A.; Trois, A.; Scalise, E.; Longo, F.; Pellizzoni, A.; Pucella, G.; Colafrancesco, S.; Conforti, V.; Tempesta, P.; Cerone, M.; Sabatini, P.; Annoni, G.; Valentini, G.; Salotti, L.

    2014-01-01

    In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) on board the AGILE space mission. AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many γ-ray transients of both galactic and extragalactic origin. This work presents the AGILE innovative approach to fast γ-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind search identification of transients within a short processing time, (3) the AGILE procedure for γ-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. As proper flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, via e-mail, and via push notifications from an application for smartphones and tablets. These alerts are crosschecked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in γ-ray astrophysics.

  7. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  8. Lean and Agile Development of the AITS Ground Software System

    NASA Astrophysics Data System (ADS)

    Richters, Mark; Dutruel, Etienne; Mecredy, Nicolas

    2013-08-01

    We present the ongoing development of a new ground software system used for integrating, testing and operating spacecraft. The Advanced Integration and Test Services (AITS) project aims at providing a solution for electrical ground support equipment and mission control systems in future Astrium Space Transportation missions. Traditionally ESA ground or flight software development projects are conducted according to a waterfall-like process as specified in the ECSS-E-40 standard promoted by ESA in the European industry. In AITS a decision was taken to adopt an agile development process. This work could serve as a reference for future ESA software projects willing to apply agile concepts.

  9. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Main stream industry software development practice has gone from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state of the art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  10. Microwave sintering process model.

    PubMed

    Peng, Hu; Tinga, W R; Sundararaj, U; Eadie, R L

    2003-01-01

    In order to simulate and optimize the microwave sintering of a silicon nitride and tungsten carbide/cobalt toolbits process, a microwave sintering process model has been built. A cylindrical sintering furnace was used containing a heat insulating layer, a susceptor layer, and an alumina tube containing the green toolbit parts between parallel, electrically conductive, graphite plates. Dielectric and absorption properties of the silicon nitride green parts, the tungsten carbide/cobalt green parts, and an oxidizable susceptor material were measured using perturbation and waveguide transmission methods. Microwave absorption data were measured over a temperature range from 20 degrees C to 800 degrees C. These data were then used in the microwave process model which assumed plane wave propagation along the radial direction and included the microwave reflection at each interface between the materials and the microwave absorption in the bulk materials. Heat transfer between the components inside the cylindrical sintering furnace was also included in the model. The simulated heating process data for both silicon nitride and tungsten carbide/cobalt samples closely follow the experimental data. By varying the physical parameters of the sintering furnace model, such as the thickness of the susceptor layer, the thickness of the alumina tube wall, the sample load volume and the graphite plate mass, the model predicts their effects, which is helpful in optimizing those parameters in the industrial sintering process.
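
    The plane-wave treatment described above can be illustrated with a small sketch that computes the power reflected at an interface between two materials and the Beer-Lambert attenuation through an absorbing layer. It is a simplified stand-in, assuming non-magnetic media, normal incidence, and no multiple reflections; the permittivity values, frequency, and layer thickness are placeholders, and the heat-transfer part of the model is omitted.

    ```python
    import numpy as np

    def reflectance(eps1: complex, eps2: complex) -> float:
        """Power reflection coefficient at normal incidence between two
        non-magnetic media with relative permittivities eps1 -> eps2."""
        n1, n2 = np.sqrt(eps1), np.sqrt(eps2)
        gamma = (n1 - n2) / (n1 + n2)       # field reflection coefficient
        return float(abs(gamma) ** 2)

    def transmitted_fraction(eps_r: complex, thickness_m: float, freq_hz: float) -> float:
        """Fraction of power surviving one pass through an absorbing layer
        (Beer-Lambert attenuation; interference effects neglected)."""
        c = 2.998e8
        k0 = 2 * np.pi * freq_hz / c
        kappa = abs(np.sqrt(eps_r).imag)    # extinction part of the refractive index
        alpha_power = 2 * k0 * kappa        # power attenuation constant, 1/m
        return float(np.exp(-alpha_power * thickness_m))

    # Hypothetical example: 2.45 GHz wave hitting a lossy susceptor layer.
    # print(reflectance(1.0, 9.0 - 2.0j))
    # print(transmitted_fraction(9.0 - 2.0j, thickness_m=0.005, freq_hz=2.45e9))
    ```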

  11. Development of perceived competence, tactical skills, motivation, technical skills, and speed and agility in young soccer players.

    PubMed

    Forsman, Hannele; Gråstén, Arto; Blomqvist, Minna; Davids, Keith; Liukkonen, Jarmo; Konttinen, Niilo

    2016-07-01

    The objective of this 1-year, longitudinal study was to examine the development of perceived competence, tactical skills, motivation, technical skills, and speed and agility characteristics of young Finnish soccer players. We also examined associations between latent growth models of perceived competence and other recorded variables. Participants were 288 competitive male soccer players ranging from 12 to 14 years (12.7 ± 0.6) from 16 soccer clubs. Players completed the self-assessments of perceived competence, tactical skills, and motivation, and participated in technical, and speed and agility tests. Results of this study showed that players' levels of perceived competence, tactical skills, motivation, technical skills, and speed and agility characteristics remained relatively high and stable across the period of 1 year. Positive relationships were found between these levels and changes in perceived competence and motivation, and levels of perceived competence and speed and agility characteristics. Together these results illustrate the multi-dimensional nature of talent development processes in soccer. Moreover, it seems crucial in coaching to support the development of perceived competence and motivation in young soccer players and that it might be even more important in later maturing players.

  12. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
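
    A toy version of an empirically based time- and temperature-dependent density model is sketched below: the density relaxes toward a final foam density at an Arrhenius-type rate, integrated with a simple explicit Euler step. The functional form and all parameter values are assumptions for illustration, not the calibrated model implemented in SIERRA/ARIA.

    ```python
    import numpy as np

    def foam_density(t_s, temp_k, rho0=1000.0, rho_final=100.0,
                     a=1.0e6, e_act=50e3, r_gas=8.314):
        """Illustrative first-order density-decay model:
        drho/dt = -k(T) * (rho - rho_final), with an Arrhenius rate
        k(T) = a * exp(-e_act / (R*T)).  All parameters are placeholders."""
        t_s = np.asarray(t_s, dtype=float)
        temp_k = np.asarray(temp_k, dtype=float)
        rho = np.empty_like(t_s)
        rho[0] = rho0
        for i in range(1, len(t_s)):
            dt = t_s[i] - t_s[i - 1]
            k = a * np.exp(-e_act / (r_gas * temp_k[i - 1]))
            rho[i] = rho[i - 1] - dt * k * (rho[i - 1] - rho_final)
        return rho

    # t = np.linspace(0.0, 300.0, 301)        # seconds
    # T = np.full_like(t, 330.0)              # isothermal cure, K (hypothetical)
    # rho = foam_density(t, T)                # density history driving expansion
    ```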

  13. Agile Development of Advanced Prototypes

    DTIC Science & Technology

    2014-11-01

    genetically modified babies. A case where researchers supplemented women’s defective mitochondria with healthy mitochondria from a donor was...and immersive experience showing genetic engineering’s implication for the future of medicine. 15. SUBJECT TERMS Agile Development, Games for...provoking perspective on genetic engineering’s implication for the future of medicine. Experiencing Living with Prostheses (Xense) During this period

  14. Making Agile Work for You

    DTIC Science & Technology

    2011-07-20

    Extreme Programming (XP) • SCRUM • Dynamic Systems Development Method (DSDM) • Adaptive Software Development • Crystal • Feature-Driven Development...SCRUM Scrum is an iterative, incremental methodology for managing agile software projects. The Team

  15. Onshore and Offshore Outsourcing with Agility: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Kussmaul, Clifton

    This chapter reflects on a case study of an agile distributed project that ran for approximately three years (from spring 2003 to spring 2006). The project involved (a) a customer organization with key personnel distributed across the US, developing an application with rapidly changing requirements; (b) onshore consultants with expertise in project management, development processes, offshoring, and relevant technologies; and (c) an external offsite development team in a CMM-5 organization in southern India. This chapter is based on surveys and discussions with multiple participants. The several years since the project was completed allow greater perspective on both the strengths and weaknesses, since the participants can reflect on the entire life of the project, and compare it to subsequent experiences. Our findings emphasize the potential for agile project management in distributed software development, and the importance of people and interactions, taking many small steps to find and correct errors, and matching the structures of the project and product to support implementation of agility.

  16. Decision Support for Iteration Scheduling in Agile Environments

    NASA Astrophysics Data System (ADS)

    Szőke, Ákos

    Today’s software business development projects often must promise low-risk value to customers in order to be financed. Emerging agile processes offer shorter investment periods, faster time-to-market and better customer satisfaction. To date, however, agile environments lack sound methodological scheduling support, in contrast to traditional plan-based approaches. To address this situation, we present an agile iteration scheduling method whose usefulness is evaluated with post-mortem simulation. The simulation demonstrates that the method can significantly improve load balancing of resources (ca. 5×), produce a higher-quality, lower-risk feasible schedule, and support more informed decisions through optimized schedule production. Finally, the paper analyzes benefits and issues from the use of this method.
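
    One simple way to picture iteration scheduling with load balancing is a greedy longest-processing-time heuristic: stories are assigned, largest first, to the least-loaded iteration that can still fit them. The sketch below is a toy stand-in for the optimized scheduling method the paper evaluates; the story efforts and the capacity limit are assumptions.

    ```python
    import heapq
    from typing import Dict, List, Tuple

    def balance_iterations(stories: Dict[str, float], n_iterations: int,
                           capacity: float) -> Tuple[List[List[str]], List[str]]:
        """Greedy LPT heuristic: assign each story (effort in points, largest
        first) to the currently least-loaded iteration, deferring stories that
        would exceed the iteration capacity to a backlog."""
        loads: List[Tuple[float, int]] = [(0.0, i) for i in range(n_iterations)]
        heapq.heapify(loads)
        plan: List[List[str]] = [[] for _ in range(n_iterations)]
        backlog: List[str] = []
        for story, effort in sorted(stories.items(), key=lambda kv: -kv[1]):
            load, idx = heapq.heappop(loads)
            if load + effort <= capacity:
                plan[idx].append(story)
                load += effort
            else:
                backlog.append(story)
            heapq.heappush(loads, (load, idx))
        return plan, backlog

    # Hypothetical backlog of three stories spread over two iterations:
    # plan, backlog = balance_iterations({"login": 5, "search": 8, "export": 3},
    #                                    n_iterations=2, capacity=10)
    ```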

  17. SU-E-T-610: Comparison of Treatment Times Between the MLCi and Agility Multileaf Collimators

    SciTech Connect

    Ramsey, C; Bowling, J

    2014-06-01

    Purpose: The Agility is a new 160-leaf MLC developed by Elekta for use in their Infinity and Versa HD linacs. As compared to the MLCi, the Agility increased the maximum leaf speed from 2 cm/s to 3.5 cm/s, and the maximum primary collimator speed from 1.5 cm/s to 9.0 cm/s. The purpose of this study was to determine if the Agility MLC resulted in improved plan quality and/or shorter treatment times. Methods: An Elekta Infinity that was originally equipped with an 80-leaf MLCi was upgraded to a 160-leaf Agility. Treatment plan quality was evaluated using the Pinnacle planning system with SmartArc. Optimization was performed once for the MLCi and once for the Agility beam models using the same optimization parameters and the same number of iterations. Patient treatment times were measured for all IMRT, VMAT, and SBRT patients treated on the Infinity with the MLCi and Agility MLCs. Treatment times were extracted from the EMR and measured from when the patient first walked into the treatment room until exiting the treatment room. Results: 11,380 delivery times were measured for patients treated with the MLCi, and 1,827 measurements have been made for the Agility MLC. The average treatment times were 19.1 minutes for the MLCi and 20.8 minutes for the Agility. Using a t-test analysis, there was no significant difference between the two groups (t = 0.22). The dose differences between patients planned with the MLCi and the Agility MLC were minimal. For example, the dose differences for the PTV, GTV, and cord for a head and neck patient planned using Pinnacle were effectively equivalent. However, the dose to the parotid glands was slightly worse with the Agility MLC. Conclusion: There was no statistical difference in treatment time, or any significant dosimetric difference between the Agility MLC and the MLCi.
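
    The statistical comparison described in the abstract can be reproduced in outline with a two-sample t-test, as sketched below with SciPy. The arrays are placeholder data, not the 11,380 and 1,827 EMR measurements, so the computed statistic will not match the reported value.

    ```python
    import numpy as np
    from scipy import stats

    # Placeholder samples standing in for the per-patient room times (minutes).
    rng = np.random.default_rng(0)
    mlci_times = rng.normal(loc=19.1, scale=6.0, size=500)
    agility_times = rng.normal(loc=20.8, scale=6.0, size=200)

    # Welch's two-sample t-test (no equal-variance assumption).
    t_stat, p_value = stats.ttest_ind(mlci_times, agility_times, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
    ```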

  18. Evaluation of a novel laparoscopic camera for characterization of renal ischemia in a porcine model using digital light processing (DLP) hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Olweny, Ephrem O.; Tan, Yung K.; Faddegon, Stephen; Jackson, Neil; Wehner, Eleanor F.; Best, Sara L.; Park, Samuel K.; Thapa, Abhas; Cadeddu, Jeffrey A.; Zuzak, Karel J.

    2012-03-01

    Digital light processing hyperspectral imaging (DLP® HSI) was adapted for use during laparoscopic surgery by coupling a conventional laparoscopic light guide with a DLP-based Agile Light source (OL 490, Optronic Laboratories, Orlando, FL), incorporating a 0° laparoscope, and a customized digital CCD camera (DVC, Austin, TX). The system was used to characterize renal ischemia in a porcine model.

  19. Biorobotics: using robots to emulate and investigate agile locomotion.

    PubMed

    Ijspeert, Auke J

    2014-10-10

    The graceful and agile movements of animals are difficult to analyze and emulate because locomotion is the result of a complex interplay of many components: the central and peripheral nervous systems, the musculoskeletal system, and the environment. The goals of biorobotics are to take inspiration from biological principles to design robots that match the agility of animals, and to use robots as scientific tools to investigate animal adaptive behavior. Used as physical models, biorobots contribute to hypothesis testing in fields such as hydrodynamics, biomechanics, neuroscience, and prosthetics. Their use may contribute to the design of prosthetic devices that more closely take human locomotion principles into account.

  20. Impact of agile methodologies on team capacity in automotive radio-navigation projects

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Hutanu, A.; Volker, S.

    2017-01-01

    The development processes used in automotive radio-navigation projects are constantly under pressure to adapt. While the software development models are based on automotive production processes, the integration of peripheral components into an automotive system triggers a high number of requirement modifications. The use of traditional development models in the automotive industry pushes a team’s development capacity to its limits. The root cause lies in the inflexibility of current processes and their limited capacity to adapt. This paper addresses a new project management approach for the development of radio-navigation projects. Understanding the weaknesses of currently used models helped us develop and integrate agile methodologies into the traditional development model structure. In the first part we focus on change management methods to reduce the inflow of change requests. Established change management risk analysis processes enable project management to judge the impact of a requirement change and also give the project time to implement some changes. However, in large automotive radio-navigation projects the saved time is not enough to implement the large number of changes submitted to the project. In the second part of this paper we focus on increasing team capacity by integrating agile methodologies into the traditional model at critical project phases. The overall objective of this paper is to demonstrate the need for process adaptation in order to solve project team capacity bottlenecks.

  1. Organizational Agility Model and Simulation

    DTIC Science & Technology

    2011-06-01

    [Snippet contains figure residue: response plots of r(t) and x(t) with no control, P control (Kp = 10), and PI control for the independent, de-conflicted, coordinated, collaborative, and edge organizational forms; the accompanying text describes proportional (P), integral (I), and derivative (Kd) control terms.] PI control drives the steady-state error to zero, but induces

  2. An Organisational Interoperability Agility Model

    DTIC Science & Technology

    2005-06-01

    [Snippet contains figure residue, duplicated in the source: a diagram of OIM Levels 0-4 arranged as layers for one operational type across government organisations.]

  3. AGILE integration into APC for high mix logic fab

    NASA Astrophysics Data System (ADS)

    Gatefait, M.; Lam, A.; Le Gratiet, B.; Mikolajczak, M.; Morin, V.; Chojnowski, N.; Kocsis, Z.; Smith, I.; Decaunes, J.; Ostrovsky, A.; Monget, C.

    2015-09-01

    For C040 technology and below, photolithographic depth-of-focus control and dispersion improvement are essential to secure product functionality. Critical 193nm immersion layers present initial focus process windows close to machine control capability. For previous technologies, the standard scanner sensor (Level Sensor - LS) was used to map wafer topology and expose the wafer at the right focus. Such embedded optical metrology, based on light reflection, suffers from reading issues that can no longer be neglected. Metrology errors are correlated with the inspected product area, where material types and densities change and so optical properties are not constant. Various optical phenomena occur across the product field during wafer inspection and affect the quality and position of the reflected light. This can result in incorrect heights being recorded and exposures possibly being done out of focus. Focus inaccuracy associated with aggressive process windows on critical layers will directly impact product realization and therefore functionality and yield. ASML has introduced an air gauge sensor to complement the optical level sensor and enable optimal topology metrology. The use of this new sensor is managed by the AGILE (Air Gauge Improved process LEveling) application. This measurement, with no optical dependency, corrects for the optical inaccuracy of the level sensor and so improves best-focus dispersion across the product. Because stack complexity grows through the process flow, optical perturbation of the standard level sensor metrology increases and is greatest for metallization layers. For these reasons AGILE implementation was first considered for contact and all metal layers. Another key point is that standard metrology is sensitive to layer and reticle/product density, so the gain from AGILE is enhanced for masks carrying multiple products and for complex System on Chip designs. Into ST context (High

  4. An Investigation of Agility Issues in Scrum Teams Using Agility Indicators

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Wang, Xiaofeng

    Agile software development methods have emerged and become increasingly popular in recent years; yet the issues encountered by software development teams that strive to achieve agility using agile methods are yet to be explored systematically. Built upon a previous study that has established a set of indicators of agility, this study investigates what issues are manifested in software development teams using agile methods. It is focussed on Scrum teams particularly. In other words, the goal of the chapter is to evaluate Scrum teams using agility indicators and therefore to further validate previously presented agility indicators within the additional cases. A multiple case study research method is employed. The findings of the study reveal that the teams using Scrum do not necessarily achieve agility in terms of team autonomy, sharing, stability and embraced uncertainty. The possible reasons include previous organizational plan-driven culture, resistance towards the Scrum roles and changing resources.

  5. Barriers to Achieving Mentally Agile Junior Leaders

    DTIC Science & Technology

    2009-01-21

    To help answer this question, this paper will describe the operational environment the agile leader must be prepared to operate within and the...senior leadership identified their need over eight years ago? ...Persistent conflict and change characterize the strategic environment. We have

  6. Towards a Comparative Measure of Legged Agility

    DTIC Science & Technology

    2014-06-01

    so for this paper we explore the implications of a well-cited definition within the sports science community holding that agility is “a rapid whole...systems will have negligible agility according to our metric in accordance with biological observations that these motions require significantly less...W. Young, “Agility literature review: Classifications, training and testing,” Journal of sports sciences, vol. 24, no. 9, pp. 919–932, 2006. 19. D. L

  7. Modeling hyporheic zone processes

    USGS Publications Warehouse

    Runkel, Robert L.; McKnight, Diane M.; Rajaram, Harihar

    2003-01-01

    Stream biogeochemistry is influenced by the physical and chemical processes that occur in the surrounding watershed. These processes include the mass loading of solutes from terrestrial and atmospheric sources, the physical transport of solutes within the watershed, and the transformation of solutes due to biogeochemical reactions. Research over the last two decades has identified the hyporheic zone as an important part of the stream system in which these processes occur. The hyporheic zone may be loosely defined as the porous areas of the stream bed and stream bank in which stream water mixes with shallow groundwater. Exchange of water and solutes between the stream proper and the hyporheic zone has many biogeochemical implications, due to differences in the chemical composition of surface and groundwater. For example, surface waters are typically oxidized environments with relatively high dissolved oxygen concentrations. In contrast, reducing conditions are often present in groundwater systems leading to low dissolved oxygen concentrations. Further, microbial oxidation of organic materials in groundwater leads to supersaturated concentrations of dissolved carbon dioxide relative to the atmosphere. Differences in surface and groundwater pH and temperature are also common. The hyporheic zone is therefore a mixing zone in which there are gradients in the concentrations of dissolved gasses, the concentrations of oxidized and reduced species, pH, and temperature. These gradients lead to biogeochemical reactions that ultimately affect stream water quality. Due to the complexity of these natural systems, modeling techniques are frequently employed to quantify process dynamics.

  8. Assessment of proposed fighter agility metrics

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.; Downing, David R.

    1990-01-01

    This paper presents the results of an analysis of proposed metrics to assess fighter aircraft agility. A novel framework for classifying these metrics is developed and applied. A set of transient metrics intended to quantify the axial and pitch agility of fighter aircraft is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed, and sensitivity to pilot-introduced errors during flight testing is investigated. Results indicate that the power onset and power loss parameters are promising candidates for quantifying axial agility, while maximum pitch-up and pitch-down rates are promising candidates for quantifying pitch agility.
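
    One plausible reading of the transient metrics named above is sketched below: pitch agility as the extreme pitch rates over a manoeuvre, and axial agility as the peak rate of change of specific excess power. The function names and exact definitions are assumptions for illustration, not the paper's formal metric definitions.

    ```python
    import numpy as np

    def pitch_agility(pitch_rate_dps):
        """Maximum pitch-up and pitch-down rates (deg/s) over a manoeuvre
        time history -- one plausible reading of the pitch metrics above."""
        q = np.asarray(pitch_rate_dps, dtype=float)
        return q.max(), q.min()

    def axial_agility(time_s, specific_excess_power):
        """Peak rate of change of specific excess power (Ps), a stand-in for
        the 'power onset' and 'power loss' parameters discussed above."""
        dps_dt = np.gradient(np.asarray(specific_excess_power, dtype=float),
                             np.asarray(time_s, dtype=float))
        return dps_dt.max(), dps_dt.min()

    # Hypothetical usage with simulated time histories:
    # onset, loss = axial_agility(t, ps)
    # pitch_up, pitch_down = pitch_agility(q)
    ```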

  9. Frequency Agility Radar,

    DTIC Science & Technology

    1982-12-06

    way, when we change the size of the added direct current magnetic field which can change the magnetic conduction of the ferrite as well as the...by the measurement of the ferrite magnetic circuit and the size of the direct current magnetic field. When we have the above-mentioned measurements of...and the magnetic field of the TE011 mode there will be weaker. This can further reduce loss and raise the power limit. Because the ferrite has a

  10. Agile robotic edge finishing

    SciTech Connect

    Powell, M.

    1996-08-01

    Edge finishing processes have seemed like ideal candidates for automation. Most edge finishing processes are unpleasant, dangerous, tedious, expensive, not repeatable and labor intensive. Estimates place the cost of manual edge finishing processes at 12% of the total cost of fabricating precision parts. For small, high precision parts, the cost of hand finishing may be as high as 30% of the total part cost. Up to 50% of this cost could be saved through automation. This cost estimate includes the direct costs of edge finishing: the machining hours required and the 30% scrap and rework rate after manual finishing. Not included in these estimates are the indirect costs resulting from cumulative trauma disorders and retraining costs caused by the high turnover rate for finishing jobs. Despite the apparent economic advantages, edge finishing has proven difficult to automate except in low precision and/or high volume production environments. Finishing automation systems have not been deployed successfully in Department of Energy defense programs (DOE/DP) production. A few systems have been attempted but subsequently abandoned in favor of traditional edge finishing approaches: scraping, grinding, and filing the edges using modified dental tools and hand held power tools. Edge finishing automation has been an elusive but potentially lucrative production enhancement. The amount of time required for reconfiguring workcells for new parts, the time required to reprogram the workcells to finish new parts, and the inability of automation equipment to respond to fixturing errors and part tolerances are the most common reasons cited for eliminating automation as an option for DOE/DP edge finishing applications. Existing automated finishing systems have proven to be economically viable only where setup and reprogramming costs are a negligible fraction of overall production costs.

  11. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods are still lacking usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. A worldwide response from 92 practitioners was received. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The most widely used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  12. Network configuration management : paving the way to network agility.

    SciTech Connect

    Maestas, Joseph H.

    2007-08-01

    Sandia networks consist of nearly nine hundred routers and switches and nearly one million lines of command code, and each line ideally contributes to the capabilities of the network to convey information from one location to another. Sandia's Cyber Infrastructure Development and Deployment organizations recognize that it is therefore essential to standardize network configurations and enforce conformance to industry best business practices and documented internal configuration standards to provide a network that is agile, adaptable, and highly available. This is especially important in times of constrained budgets as members of the workforce are called upon to improve efficiency, effectiveness, and customer focus. Best business practices recommend using the standardized configurations in the enforcement process so that when root cause analysis results in recommended configuration changes, subsequent configuration auditing will improve compliance to the standard. Ultimately, this minimizes mean time to repair, maintains the network security posture, improves network availability, and enables efficient transition to new technologies. Network standardization brings improved network agility, which in turn enables enterprise agility, because the network touches all facets of corporate business. Improved network agility improves the business enterprise as a whole.
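
    The configuration-auditing idea described above can be illustrated with a toy compliance check that compares a device configuration against a standardized template and reports missing or unexpected lines. Real network-configuration management tools handle ordering, hierarchy, and policy rules; this sketch only shows the basic audit step, and its inputs are assumptions.

    ```python
    from typing import Dict, List

    def audit_config(device_config: str, standard: str) -> Dict[str, List[str]]:
        """Report standard lines missing from a device configuration and
        device lines not covered by the standard (line-by-line comparison)."""
        dev = {line.strip() for line in device_config.splitlines() if line.strip()}
        std = {line.strip() for line in standard.splitlines() if line.strip()}
        return {"missing": sorted(std - dev), "unexpected": sorted(dev - std)}

    # Hypothetical usage:
    # report = audit_config(open("switch42.cfg").read(), open("standard.cfg").read())
    # compliant = not report["missing"] and not report["unexpected"]
    ```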

  13. Tailoring Agility: Promiscuous Pair Story Authoring and Value Calculation

    NASA Astrophysics Data System (ADS)

    Tendon, Steve

    This chapter describes how a multi-national software organization created a business plan involving business units from eight countries that followed an agile way, after two previously failed attempts with traditional approaches. The case is told by the consultant who initiated implementation of agility into requirements gathering, estimation and planning processes in an international setting. The agile approach was inspired by XP, but then tailored to meet the peculiar requirements. Two innovations were critical. The first innovation was promiscuous pair story authoring, where user stories were written by two people (similarly to pair programming), and the pairing changed very often (as frequently as every 15-20 minutes) to achieve promiscuity and cater for diverse points of view. The second innovation was attributing an economic value (rather than a cost) to stories. Continuous recalculation of the financial value of the stories allowed the project's financial return to be assessed. In this case, implementation of agility in an international context allowed the involved team members to reach consensus and unanimity of decisions, vision and purpose.
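
    The value-calculation innovation described above can be pictured with a small sketch: each story carries an estimated economic value and cost, the release's financial return is recomputed whenever estimates change, and stories are reprioritized by value-to-cost ratio. The data model and the ratio rule are illustrative assumptions, not the consultant's actual method.

    ```python
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Story:
        name: str
        value_keur: float   # estimated economic value (e.g. thousands of EUR)
        cost_keur: float    # estimated implementation cost

    def portfolio_return(stories: List[Story]) -> float:
        """Simple financial view of a release: total value minus total cost,
        recomputed whenever estimates are revised."""
        return sum(s.value_keur - s.cost_keur for s in stories)

    def prioritize(stories: List[Story]) -> List[Story]:
        """Order stories by value-to-cost ratio (highest first)."""
        return sorted(stories, key=lambda s: s.value_keur / max(s.cost_keur, 1e-9),
                      reverse=True)

    # backlog = [Story("billing", 120, 40), Story("reports", 60, 30)]
    # print(portfolio_return(backlog), [s.name for s in prioritize(backlog)])
    ```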

  14. Agile manufacturing and constraints management: a strategic perspective

    NASA Astrophysics Data System (ADS)

    Stratton, Roy; Yusuf, Yahaya Y.

    2000-10-01

    The definition of the agile paradigm has proved elusive and is often viewed as a panacea, in contention with more traditional approaches to operations strategy development and lacking its own methodology and tools. The Theory of Constraints (TOC) is also poorly understood, as it is commonly solely associated with production planning and control systems and bottleneck management. This paper will demonstrate the synergy between these two approaches together with the Theory of Inventive Problem Solving (TRIZ), and establish how the systematic elimination of trade-offs can support the agile paradigm. Whereas agility is often seen as a trade-off-free destination, both TOC and TRIZ may be considered to be route finders, as they comprise methodologies that focus on the identification and elimination of the trade-offs that constrain the purposeful improvement of a system, be it organizational or mechanical. This paper will also show how the TOC thinking process may be combined with the TRIZ knowledge based approach and used in breaking contradictions within agile logistics.

  15. How Can Agile Practices Minimize Global Software Development Co-ordination Risks?

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Babar, Muhammad Ali; Verner, June

    The distribution of project stakeholders in Global Software Development (GSD) projects poses significant risks related to project communication, coordination and control processes. There is growing interest in applying agile practices in GSD projects in order to leverage the advantages of both approaches. In some cases, GSD project managers use agile practices to reduce project distribution challenges. We use an existing coordination framework to identify GSD coordination problems due to temporal, geographical and socio-cultural distances. An industry-based case study is used to describe, explore and explain the use of agile practices to reduce development coordination challenges.

  16. Multiply-agile encryption in high speed communication networks

    SciTech Connect

    Pierson, L.G.; Witzke, E.L.

    1997-05-01

    Different applications have different security requirements for data privacy, data integrity, and authentication. Encryption is one technique that addresses these requirements. Encryption hardware, designed for use in high-speed communications networks, can satisfy a wide variety of security requirements if that hardware is key-agile, robustness-agile and algorithm-agile. Hence, multiply-agile encryption provides enhanced solutions to the secrecy, interoperability and quality of service issues in high-speed networks. This paper defines these three types of agile encryption. Next, implementation issues are discussed. While single-algorithm, key-agile encryptors exist, robustness-agile and algorithm-agile encryptors are still research topics.

  17. On the Biomimetic Design of Agile-Robot Legs

    PubMed Central

    Garcia, Elena; Arevalo, Juan Carlos; Muñoz, Gustavo; Gonzalez-de-Santos, Pablo

    2011-01-01

    The development of functional legged robots has encountered its limits in human-made actuation technology. This paper describes research on the biomimetic design of legs for agile quadrupeds. A biomimetic leg concept that extracts key principles from horse legs which are responsible for the agile and powerful locomotion of these animals is presented. The proposed biomimetic leg model defines the effective leg length, leg kinematics, limb mass distribution, actuator power, and elastic energy recovery as determinants of agile locomotion, and values for these five key elements are given. The transfer of the extracted principles to technological instantiations is analyzed in detail, considering the availability of current materials, structures and actuators. A real leg prototype has been developed following the biomimetic leg concept proposed. The actuation system is based on the hybrid use of series elasticity and magneto-rheological dampers which provides variable compliance for natural motion. From the experimental evaluation of this prototype, conclusions on the current technological barriers to achieve real functional legged robots to walk dynamically in agile locomotion are presented. PMID:22247667

  18. On the biomimetic design of agile-robot legs.

    PubMed

    Garcia, Elena; Arevalo, Juan Carlos; Muñoz, Gustavo; Gonzalez-de-Santos, Pablo

    2011-01-01

    The development of functional legged robots has encountered its limits in human-made actuation technology. This paper describes research on the biomimetic design of legs for agile quadrupeds. A biomimetic leg concept that extracts key principles from horse legs which are responsible for the agile and powerful locomotion of these animals is presented. The proposed biomimetic leg model defines the effective leg length, leg kinematics, limb mass distribution, actuator power, and elastic energy recovery as determinants of agile locomotion, and values for these five key elements are given. The transfer of the extracted principles to technological instantiations is analyzed in detail, considering the availability of current materials, structures and actuators. A real leg prototype has been developed following the biomimetic leg concept proposed. The actuation system is based on the hybrid use of series elasticity and magneto-rheological dampers which provides variable compliance for natural motion. From the experimental evaluation of this prototype, conclusions on the current technological barriers to achieve real functional legged robots to walk dynamically in agile locomotion are presented.

  19. Transitioning from Distributed and Traditional to Distributed and Agile: An Experience Report

    NASA Astrophysics Data System (ADS)

    Wildt, Daniel; Prikladnicki, Rafael

    Global companies with long experience of extensive, waterfall phased plans are trying to improve their existing processes to expedite team engagement. Agile methodologies have become an acceptable path to follow because they include project management among their practices. Agile practices have been used with the objective of simplifying project control through simple processes, easy-to-update documentation, and greater team interaction rather than exhaustive documentation, focusing on continuous team improvement and aiming to add value to business processes. The purpose of this chapter is to describe the experience of a global multinational company in transitioning from distributed and traditional to distributed and agile. This company has development centers across North America, South America and Asia. This chapter covers challenges faced by the project teams of two pilot projects, including strengths of using agile practices in a globally distributed environment and practical recommendations for similar endeavors.

  20. Teaching Agile Software Development: A Case Study

    ERIC Educational Resources Information Center

    Devedzic, V.; Milenkovic, S. R.

    2011-01-01

    This paper describes the authors' experience of teaching agile software development to students of computer science, software engineering, and other related disciplines, and comments on the implications of this and the lessons learned. It is based on the authors' eight years of experience in teaching agile software methodologies to various groups…

  1. The RITE Approach to Agile Acquisition

    DTIC Science & Technology

    2013-04-01

    Review (SVR), and Production Readiness Review (PRR), which were evaluated against Agile development requirements. Further analysis was conducted...Audit (FCA), PRR, Operational Test Readiness Review (OTRR), Physical Configuration Audit (PCA), Integration Readiness Review (IRR), In Service...Technology (DSB Task Force, 2009, p. 48) In the context of the primary milestone reviews (PDR, CDR, and SVR/PRR), a nominal Agile development structure was

  2. The Introduction of Agility into Albania.

    ERIC Educational Resources Information Center

    Smith-Stevens, Eileen J.; Shkurti, Drita

    1998-01-01

    Describes a plan to introduce and achieve a national awareness of agility (and easy entry into the world market) for Albania through the relatively stable higher-education order. Agility's four strategic principles are enriching the customer, cooperating to enhance competitiveness, organizing to master change and uncertainty, and leveraging the…

  3. An Agile Course-Delivery Approach

    ERIC Educational Resources Information Center

    Capellan, Mirkeya

    2009-01-01

    In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…

  4. Fighter agility metrics, research, and test

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.

    1990-01-01

    Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation provided by the NASA Dryden Flight Research Center. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available. Simulation documentation and user instructions are provided in an appendix.

  5. Integrating Agile Combat Support within Title 10 Wargames

    DTIC Science & Technology

    2015-03-26

    incorporate logistics into Air Force Title 10 wargames. More specifically, we capture Air Force Materiel Command’s (AFMC) Agile Combat Support (ACS...within an unclassified general wargame scenario. Logistics has been omitted from wargames for a multitude of reasons throughout the years. We...develop a logistics simulation model of a simplified wargame scenario designed to be run within the Logistics Composite Model (LCOM) Analysis Toolkit

  6. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability
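
    The two-phase development pattern the authors observe (a rarely changing workflow definition plus frequently adjusted parameters) can be sketched in a language-neutral way as below. Pwrake itself uses Ruby rakefiles and real GATK/Dindel command lines; the Python task graph, tool names, and flags here are placeholders for illustration only.

    ```python
    # Separate, rarely edited task graph vs. frequently tuned parameters.
    from typing import Dict, List

    PARAMS: Dict[str, str] = {"min_quality": "30", "threads": "4"}   # adjusted often

    TASKS: Dict[str, dict] = {                                       # defined once
        "align":  {"deps": [],         "cmd": lambda p: f"aligner --threads {p['threads']}"},
        "filter": {"deps": ["align"],  "cmd": lambda p: f"filter --min-q {p['min_quality']}"},
        "call":   {"deps": ["filter"], "cmd": lambda p: "caller"},
    }

    def run(task: str, params: Dict[str, str], done=None) -> List[str]:
        """Resolve dependencies depth-first and return the command sequence."""
        done = done if done is not None else set()
        cmds: List[str] = []
        for dep in TASKS[task]["deps"]:
            if dep not in done:
                cmds += run(dep, params, done)
        cmds.append(TASKS[task]["cmd"](params))
        done.add(task)
        return cmds

    # print(run("call", PARAMS))
    # -> ['aligner --threads 4', 'filter --min-q 30', 'caller']
    ```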

  7. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  8. Integrating Low-Cost Rapid Usability Testing into Agile System Development of Healthcare IT: A Methodological Perspective.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    The development of more usable and effective healthcare information systems has become a critical issue. In the software industry methodologies such as agile and iterative development processes have emerged to lead to more effective and usable systems. These approaches highlight focusing on user needs and promoting iterative and flexible development practices. Evaluation and testing of iterative agile development cycles is considered an important part of the agile methodology and iterative processes for system design and re-design. However, the issue of how to effectively integrate usability testing methods into rapid and flexible agile design cycles has remained to be fully explored. In this paper we describe our application of an approach known as low-cost rapid usability testing as it has been applied within agile system development in healthcare. The advantages of the integrative approach are described, along with current methodological considerations.

  9. Analysis on critical success factors for agile manufacturing evaluation in original equipment manufacturing industry-an AHP approach

    NASA Astrophysics Data System (ADS)

    Ajay Guru Dev, C.; Senthil Kumar, V. S.

    2016-09-01

    Manufacturing industries are facing challenges in the implementation of agile manufacturing in their products and processes. Agility is widely accepted as a new competitive concept in the manufacturing sector in fulfilling varying customer demand. Thus, evaluation of agile manufacturing in industries has become a necessity. The success of an organisation depends on its ability to identify the critical success factors and give them special and continued attention in order to bring about high performance. This paper proposes a set of critical success factors (CSFs) for evaluating agile manufacturing considered appropriate for the manufacturing sector. The analytical hierarchy process (AHP) method is applied for prioritizing the success factors, by summarizing the opinions of experts. It is believed that the proposed CSFs enable and assist manufacturing industries to achieve a higher performance in agile manufacturing so as to increase competitiveness.
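
    The AHP prioritization step referred to above is commonly computed from a reciprocal pairwise-comparison matrix via its principal eigenvector, with Saaty's consistency ratio as a sanity check. The sketch below shows that standard calculation; the example matrix and the choice of three factors are assumptions, not the paper's elicited expert judgements.

    ```python
    import numpy as np

    def ahp_weights(pairwise: np.ndarray):
        """Priority weights from a reciprocal pairwise-comparison matrix via the
        principal eigenvector (standard AHP), plus Saaty's consistency ratio."""
        n = pairwise.shape[0]
        eigvals, eigvecs = np.linalg.eig(pairwise)
        k = int(np.argmax(eigvals.real))
        weights = np.abs(eigvecs[:, k].real)
        weights = weights / weights.sum()
        lam_max = float(eigvals[k].real)
        ci = (lam_max - n) / (n - 1)                          # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32,
              8: 1.41, 9: 1.45, 10: 1.49}.get(n, 1.49)        # Saaty's random index
        cr = ci / ri if n >= 3 else 0.0                       # consistency ratio
        return weights, cr

    # Example with three hypothetical CSFs compared on Saaty's 1-9 scale:
    # A = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]], dtype=float)
    # w, cr = ahp_weights(A)   # cr < 0.1 is conventionally taken as consistent
    ```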

  10. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  11. Addressing the Barriers to Agile Development in the Department of Defense: Program Structure, Requirements, and Contracting

    DTIC Science & Technology

    2015-04-30

    Many DoD IT acquisition programs are unfamiliar with the IT Box requirements concept, and thus cannot take advantage of its flexibilities to enable Agile development. In addition, long contracting...place. ...requirements, and contracting. The DoD can address these barriers by utilizing a proactively tailored Agile acquisition model, implementing an IT Box

  12. Towards a Better Understanding of CMMI and Agile Integration - Multiple Case Study of Four Companies

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna

    The amount of software is increasing across different domains in Europe. This gives industries in smaller countries good opportunities to work in international markets. Success in the global markets, however, demands the rapid production of high-quality, error-free software. Both CMMI and agile methods seem to provide a ready solution for quality and lead-time improvements. There is not, however, much empirical evidence available about either 1) how the integration of these two aspects can be done in practice or 2) what it actually demands from assessors and software process improvement groups. The goal of this paper is to increase the understanding of CMMI and agile integration, in particular focusing on the research question: how to use a ‘lightweight’ style of CMMI assessment in agile contexts. This is done via four case studies in which assessments were conducted using the goals of CMMI integrated project management and collaboration and coordination with relevant stakeholder process areas and practices from XP and Scrum. The study shows that the use of agile practices may support the fulfilment of the goals of CMMI process areas, but there are still many challenges for agile teams to solve within continuous improvement programs. It also identifies practical advice for assessors and improvement groups to take into consideration when conducting assessments in the context of agile software development.

  13. Adopting best practices: "Agility" moves from software development to healthcare project management.

    PubMed

    Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge

    2006-01-01

    It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments.

  14. A Big Data-driven Model for the Optimization of Healthcare Processes.

    PubMed

    Koufi, Vassiliki; Malamateniou, Flora; Vassilacopoulos, George

    2015-01-01

    Healthcare organizations increasingly navigate a highly volatile, complex environment in which technological advancements and new healthcare delivery business models are the only constants. In their effort to out-perform in this environment, healthcare organizations need to be agile enough to respond to these constantly changing conditions. To act with agility, healthcare organizations need to discover new ways to optimize their operations. To this end, they focus on the healthcare processes that guide healthcare delivery and on the technologies that support them. Business process management (BPM) and Service-Oriented Architecture (SOA) can provide a flexible, dynamic, cloud-ready infrastructure in which business process analytics can be utilized to extract useful insights from mountains of raw data, and to put them to work in ways beyond the abilities of human brains, or of IT systems from just a year ago. This paper presents a framework which helps healthcare professionals gain better insight within and across their business processes. In particular, it performs real-time analysis on process-related data in order to reveal areas of potential process improvement.
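
    As a minimal illustration of the kind of business process analytics such a framework performs, the sketch below computes average activity durations from process-related event data; the event fields, activity names, and timestamps are hypothetical examples, not data or interfaces drawn from the paper.

        # Minimal sketch of process analytics on event-log data (illustrative only;
        # event fields, activity names, and timestamps are hypothetical).
        from datetime import datetime
        from collections import defaultdict

        events = [
            {"case": "patient-001", "activity": "Triage",   "start": "2015-03-01T08:00", "end": "2015-03-01T08:20"},
            {"case": "patient-001", "activity": "Lab test", "start": "2015-03-01T08:30", "end": "2015-03-01T10:10"},
            {"case": "patient-002", "activity": "Triage",   "start": "2015-03-01T08:05", "end": "2015-03-01T08:50"},
            {"case": "patient-002", "activity": "Lab test", "start": "2015-03-01T09:00", "end": "2015-03-01T09:40"},
        ]

        durations = defaultdict(list)
        for e in events:
            start = datetime.fromisoformat(e["start"])
            end = datetime.fromisoformat(e["end"])
            durations[e["activity"]].append((end - start).total_seconds() / 60.0)

        # Average duration per activity highlights bottlenecks, i.e. candidates for process improvement.
        for activity, mins in sorted(durations.items()):
            print(f"{activity}: mean {sum(mins)/len(mins):.1f} min over {len(mins)} executions")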

  15. DoD Acquisitions Reform: Embracing and Implementing Agile

    DTIC Science & Technology

    2015-12-01

    austerity, the DoD must focus on Agile implementation in three main areas: training, business process re-engineering and contracting guidance.   It ...specific recommendations and bound the topic of research, the focus of this paper is on information technology ( IT ) acquisitions specifically, which...The B-52 lived and died by the quality of its sheet metal. Today, our aircraft will live or die by the quality of our software” (Hagan, Hurt

  16. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    question Which software development process are you currently using? 1. Agile software development (e.g., using Scrum , XP practices, test-driven... Scrum teams, product development teams, component teams, feature teams) spend almost all of their time fixing defects, and new capability...architectural runway provides the degree of architectural stability to support the next n iterations of development. In a Scrum project environment

  17. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    question Which software development process are you currently using? 1. Agile software development (e.g., using Scrum , XP practices, test-driven... Scrum teams, product development teams, component teams, feature teams) spend almost all of their time fixing defects, and new capability...architectural runway provides the degree of architectural stability to support the next n iterations of development. In a Scrum project environment, the

  18. Agile science: creating useful products for behavior change in the real world.

    PubMed

    Hekler, Eric B; Klasnja, Predrag; Riley, William T; Buman, Matthew P; Huberty, Jennifer; Rivera, Daniel E; Martin, Cesar A

    2016-06-01

    Evidence-based practice is important for behavioral interventions but there is debate on how best to support real-world behavior change. The purpose of this paper is to define products and a preliminary process for efficiently and adaptively creating and curating a knowledge base for behavior change for real-world implementation. We look to evidence-based practice suggestions and draw parallels to software development. We argue to target three products: (1) the smallest, meaningful, self-contained, and repurposable behavior change modules of an intervention; (2) "computational models" that define the interaction between modules, individuals, and context; and (3) "personalization" algorithms, which are decision rules for intervention adaptation. The "agile science" process includes a generation phase whereby contender operational definitions and constructs of the three products are created and assessed for feasibility, and an evaluation phase, whereby effect size estimates/causal inferences are created. The process emphasizes early-and-often sharing. If correct, agile science could enable a more robust knowledge base for behavior change.
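
    As a minimal illustration of the third product, a "personalization" algorithm expressed as decision rules for intervention adaptation, consider the sketch below; the variables, thresholds, and module names are hypothetical examples, not rules proposed in the paper.

        # Illustrative decision rule for intervention adaptation (a "personalization
        # algorithm" in the paper's sense). Thresholds, variables, and module names
        # are hypothetical, not drawn from the paper.
        def choose_module(yesterday_steps: int, goal_steps: int, reported_stress: int) -> str:
            """Return the behavior-change module to deliver today."""
            if reported_stress >= 7:                 # 0-10 self-report scale (assumed)
                return "stress-management module"
            if yesterday_steps < 0.5 * goal_steps:
                return "motivational prompt + easier step goal"
            if yesterday_steps >= goal_steps:
                return "reinforcement message + slightly harder step goal"
            return "standard daily reminder"

        print(choose_module(yesterday_steps=3200, goal_steps=8000, reported_stress=4))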

  19. A combined exercise model for improving muscle strength, balance, walking distance, and motor agility in multiple sclerosis patients: A randomized clinical trial

    PubMed Central

    Sangelaji, Bahram; Kordi, Mohammadreza; Banihashemi, Farzaneh; Nabavi, Seyed Massood; Khodadadeh, Sara; Dastoorpoor, Maryam

    2016-01-01

    Background: Multiple sclerosis (MS) is a neurological disease with a variety of signs and symptoms. Exercise therapy has been shown to improve physical functions in MS. However, questions about an optimal exercise therapy remain. In this regard, we suggest a combined exercise therapy including aerobic and resistance exercises for MS patients. The study is designed to observe, test and compare the effects of the proposed combined exercises on strength, balance, agility, fatigue, speed, and walking distance in people with mild to moderate MS [0 < expanded disability status scale (EDSS) < 5]. Methods: A total of 40 people with relapsing-remitting MS (16 male, 0 < EDSS < 5) were randomized into one of four groups (three intervention and one control). The intervention consisted of various combinations of aerobic and resistance exercises with different repetition rates. Pre- and post-intervention scores of fatigue severity scale (FSS), timed up and go (TUG) test, 6-minute walk test (6MWT), 10- and 20-MWT, Berg balance scale (BBS), and one repetition maximum (1RM) test were recorded and analyzed. Results: For most tests, post-intervention values of group 1, which performed three aerobic exercises and one resistance exercise, were significantly higher than those of the control group (P < 0.050). However, no significant progression was observed in the other two intervention groups. Conclusion: A combination of three aerobic exercises with one resistance exercise may result in improved balance, locomotion, and endurance in MS patients. PMID:27648171

  20. Agile multi-scale decompositions for automatic image registration

    NASA Astrophysics Data System (ADS)

    Murphy, James M.; Leija, Omar Navarro; Le Moigne, Jacqueline

    2016-05-01

    In recent works, the first and third authors developed an automatic image registration algorithm based on a multiscale hybrid image decomposition with anisotropic shearlets and isotropic wavelets. This prototype showed strong performance, improving robustness over registration with wavelets alone. However, this method imposed a strict hierarchy on the order in which shearlet and wavelet features were used in the registration process, and also involved an unintegrated mixture of MATLAB and C code. In this paper, we introduce a more agile model for generating features, in which a flexible and user-guided mix of shearlet and wavelet features are computed. Compared to the previous prototype, this method introduces a flexibility to the order in which shearlet and wavelet features are used in the registration process. Moreover, the present algorithm is now fully coded in C, making it more efficient and portable than the mixed MATLAB and C prototype. We demonstrate the versatility and computational efficiency of this approach by performing registration experiments with the fully-integrated C algorithm. In particular, meaningful timing studies can now be performed, to give a concrete analysis of the computational costs of the flexible feature extraction. Examples of synthetically warped and real multi-modal images are analyzed.
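
    As an illustration of the isotropic half of the hybrid decomposition described above, the sketch below extracts a multiscale wavelet edge-feature map of the kind such a registration pipeline can match. It is not the authors' integrated C code and omits the shearlet features; it assumes the third-party PyWavelets and NumPy packages, and the 'db2' wavelet and toy images are arbitrary choices.

        # Minimal sketch of wavelet feature extraction for registration (illustrative;
        # not the authors' integrated C implementation, and shearlet features are omitted).
        # Assumes the PyWavelets and NumPy packages are installed.
        import numpy as np
        import pywt

        def wavelet_edge_features(image: np.ndarray, level: int = 2) -> np.ndarray:
            """Return a coarse edge-like feature map from the coarsest detail sub-bands."""
            coeffs = pywt.wavedec2(image, wavelet="db2", level=level)
            cH, cV, cD = coeffs[1]            # horizontal, vertical, diagonal details
            return np.sqrt(cH**2 + cV**2 + cD**2)

        # Toy usage: feature maps from a reference image and a shifted copy could be
        # matched (e.g., by correlation) to estimate the translation between them.
        reference = np.zeros((128, 128)); reference[40:80, 40:80] = 1.0
        shifted = np.roll(reference, shift=(3, 5), axis=(0, 1))
        print(wavelet_edge_features(reference).shape, wavelet_edge_features(shifted).shape)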

  1. Agile Leaders, Agile Institutions: Educating Adaptive and Innovative Leaders for Today and Tomorrow

    DTIC Science & Technology

    2005-03-18

    to organizational learning , specifically for militaries at war. With these lenses and informed by observations from the CCCs, the paper advances...rapid, effective organizational learning is the essence of organizational agility. In line with this paper’s concept of individual agility...organizational agility is a metaphor for organizational learning that is faster, more flexible, and more sensitive to the speed with which individual experiential

  2. Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training

    NASA Astrophysics Data System (ADS)

    Macris, A.; Malamateniou, F.; Vassilacopoulos, G.

    Successful business process design requires active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic web enabling) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.

  3. Preparing your Offshore Organization for Agility: Experiences in India

    NASA Astrophysics Data System (ADS)

    Srinivasan, Jayakanth

    Two strategies that have significantly changed the way we conventionally think about managing software development and sustainment are the family of development approaches collectively referred to as agile methods, and the distribution of development efforts on a global scale. When you combine the two strategies, organizations have to address not only the technical challenges that arise from introducing new ways of working, but more importantly have to manage the 'soft' factors that if ignored lead to hard challenges. Using two case studies of distributed agile software development in India we illustrate the areas that organizations need to be aware of when transitioning work to India. The key issues that we emphasize are the need to recruit and retain personnel; the importance of teaching, mentoring and coaching; the need to manage customer expectations; the criticality of well-articulated senior leadership vision and commitment; and the reality of operating in a heterogeneous process environment.

  4. Test Methods for Robot Agility in Manufacturing

    PubMed Central

    Downs, Anthony; Harrison, William; Schlenoff, Craig

    2017-01-01

    Purpose The paper aims to define and describe test methods and metrics to assess industrial robot system agility in both simulation and in reality. Design/methodology/approach The paper describes test methods and associated quantitative and qualitative metrics for assessing robot system efficiency and effectiveness which can then be used for the assessment of system agility. Findings The paper describes how the test methods were implemented in a simulation environment and real world environment. It also shows how the metrics are measured and assessed as they would be in a future competition. Practical Implications The test methods described in this paper will push forward the state of the art in software agility for manufacturing robots, allowing small and medium manufacturers to better utilize robotic systems. Originality / value The paper fulfills the identified need for standard test methods to measure and allow for improvement in software agility for manufacturing robots. PMID:28203034

  5. Agile manufacturing concepts and opportunities in ceramics

    SciTech Connect

    Booth, C.L.; Harmer, M.P.

    1995-08-01

    In 1991 Lehigh University facilitated seminars over a period of 8 months to define manufacturing needs for the 21st century. They concluded that the future will be characterized by rapid changes in technology advances, customer demands, and shifts in market dynamics and coined the term "Agile Manufacturing". Agile manufacturing refers to the ability to thrive in an environment of constant unpredictable change. Market opportunities are attacked by partnering to form virtual firms to dynamically obtain the required skills for each product opportunity. This paper will describe and compare agile vs. traditional concepts of organization & structure, management policy and ethics, employee environment, product focus, information, and paradigm shift. Examples of agile manufacturing applied to ceramic materials will be presented.

  6. Agile Data Management with the Global Change Information System

    NASA Astrophysics Data System (ADS)

    Duggan, B.; Aulenbach, S.; Tilmes, C.; Goldstein, J.

    2013-12-01

    We describe experiences applying agile software development techniques to the realm of data management during the development of the Global Change Information System (GCIS), a web service and API for authoritative global change information under development by the US Global Change Research Program. Some of the challenges during system design and implementation have been : (1) balancing the need for a rigorous mechanism for ensuring information quality with the realities of large data sets whose contents are often in flux, (2) utilizing existing data to inform decisions about the scope and nature of new data, and (3) continuously incorporating new knowledge and concepts into a relational data model. The workflow for managing the content of the system has much in common with the development of the system itself. We examine various aspects of agile software development and discuss whether or how we have been able to use them for data curation as well as software development.

  7. Agile Port and High Speed Ship Technologies

    DTIC Science & Technology

    2009-12-31

    Report PNW Agile Port System Demonstration Center for the Commercial Deployment of Transportation Technologies milestone agenda for accomplishing the... report summarizes the results of the remaining three projects in the FY05 program cycle, in particular the PNW Agile Port System Demonstration, a system...the accomplishment of each project and the program objectives. With the submission of this report the FY05 CCDoTT Program is complete. Bibliography

  8. Modeling Primary Atomization Processes

    DTIC Science & Technology

    2007-11-02

    I., "Generation of Ripples by Wind Blowing Over a Viscous Fluid", The Scientific Papers of Sir Geoffrey Ingram Taylor, 1963. 2. A. A. Amsden, P. J...92, 1983. 28. Jin, Xiaoshi, "Boundary Element Study on Particle Orientation Caused by the Fountain Flow in Injection Molding ", Polymer Engineering...HTPB, PE is a thermoplastic which is commonly produced via extrusion from a die in a continuous process. Hence, PE grains could be produced using

  9. SuperAGILE Services at ASDC

    SciTech Connect

    Preger, B.; Verrecchia, F.; Pittori, C.; Antonelli, L. A.; Giommi, P.; Lazzarotto, F.; Evangelista, Y.

    2008-05-22

    The Italian Space Agency Science Data Center (ASDC) is a facility with several responsibilities including support to all the ASI scientific missions for the management and archival of their data, acting as the interface between ASI and the scientific community and providing on-line access to the data hosted. In this poster we describe the services that ASDC provides for SuperAGILE, in particular the ASDC public web pages devoted to the dissemination of SuperAGILE scientific results. SuperAGILE is the X-Ray imager onboard the AGILE mission, and provides the scientific community with orbit-by-orbit information on the observed sources. Crucial source information including position and flux in chosen energy bands will be reported in the SuperAGILE public web page at ASDC. Given their particular interest, another web page will be dedicated entirely to GRBs and other transients, where new event alerts will be posted and where users will find all the available information on the GRBs detected by SuperAGILE.

  10. Balancing Plan-Driven and Agile Methods in Software Engineering Project Courses

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Port, Dan; Winsor Brown, A.

    2002-09-01

    For the past 6 years, we have been teaching a two-semester software engineering project course. The students organize into 5-person teams and develop largely web-based electronic services projects for real USC campus clients. We have been using and evolving a method called Model-Based (System) Architecting and Software Engineering (MBASE) for use in both the course and in industrial applications. The MBASE Guidelines include a lot of documents. We teach risk-driven documentation: if it is risky to document something, and not risky to leave it out (e.g., GUI screen placements), leave it out. Even so, students tend to associate more documentation with higher grades, although our grading eventually discourages this. We are always on the lookout for ways to have students learn best practices without having to produce excessive documentation. Thus, we were very interested in analyzing the various emerging agile methods. We found that agile methods and milestone plan-driven methods are part of a “how much planning is enough?” spectrum. Both agile and plan-driven methods have home grounds of project characteristics where they clearly work best, and where the other will have difficulties. Hybrid agile/plan-driven approaches are feasible, and necessary for projects having a mix of agile and plan-driven home ground characteristics. Information technology trends are going more toward the agile methods' home ground characteristics of emergent requirements and rapid change, although there is a concurrent increase in concern with dependability. As a result, we are currently experimenting with risk-driven combinations of MBASE and agile methods, such as integrating requirements, test plans, peer reviews, and pair programming into “agile quality management.”

  11. Aerospace Materials Process Modelling

    DTIC Science & Technology

    1988-08-01

    Excerpt (translated from the report's French text): ...physico-chemical phenomena, still poorly understood, notably the phase reactions occurring over the solidification interval... knowledge of thermal data, as well as of the mechanical, physico-chemical and metallurgical behaviour of the cast parts and of the moulds... A Numerical Model of Directional Solidification of Cast Turbine Blades.

  12. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time-scale up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. Because H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment, the most sensitive parameters have been identified with respect to the predictions of a radiolysis model under typical conditions. Compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^-5 and to preserve most of the predictions for major species. This allows a systematic approach for model simplification and offers guidance in designing experiments for validation.
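
    To make the model-reduction idea concrete, the toy sketch below integrates a "full" and a "reduced" rate model for H2O2 and compares the predicted concentrations; the reactions, rate constants, and generation term are invented placeholders rather than the roughly 100-reaction set described above, and NumPy/SciPy are assumed available.

        # Toy illustration of radiolysis-model reduction: compare [H2O2] from a "full"
        # rate model against a reduced one that drops a minor reaction. Rate constants
        # and the generation term are invented placeholders, not the paper's reaction set.
        import numpy as np
        from scipy.integrate import solve_ivp

        G = 1.0e-9        # H2O2 production by radiolysis, mol/(L*s)   (placeholder)
        k_major = 1.0e-4  # dominant consumption pathway, 1/s          (placeholder)
        k_minor = 1.0e-7  # minor pathway dropped in the reduced model (placeholder)

        def full(t, y):    return [G - (k_major + k_minor) * y[0]]
        def reduced(t, y): return [G - k_major * y[0]]

        t_span, t_eval = (0.0, 2.0e5), np.linspace(0.0, 2.0e5, 5)
        c_full = solve_ivp(full, t_span, [0.0], t_eval=t_eval).y[0]
        c_red  = solve_ivp(reduced, t_span, [0.0], t_eval=t_eval).y[0]
        print("max relative error from dropping the minor reaction:",
              np.max(np.abs(c_full - c_red) / np.maximum(c_full, 1e-30)))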

  13. Function-based integration strategy for an agile manufacturing testbed

    NASA Astrophysics Data System (ADS)

    Park, Hisup

    1997-01-01

    This paper describes an integration strategy for plug-and-play software based on functional descriptions of the software modules. The functional descriptions identify explicitly the role of each module with respect to the overall system. They define the critical dependencies that affect the individual modules and thus affect the behavior of the system. The specified roles, dependencies and behavioral constraints are then incorporated in a group of shared objects that are distributed over a network. These objects may be interchanged with others without disrupting the system so long as the replacements meet the interface and functional requirements. In this paper, we propose a framework for modeling the behavior of plug-and-play software modules that will be used to (1) design and predict the outcome of the integration, (2) generate the interface and functional requirements of individual modules, and (3) form a dynamic foundation for applying interchangeable software modules. I describe this strategy in the context of the development of an agile manufacturing testbed. The testbed represents a collection of production cells for machining operations, supported by a network of software modules or agents for planning, fabrication, and inspection. A process definition layer holds the functional description of the software modules. A network of distributed objects interacts over the Internet and comprises the plug-compatible software nodes that execute these functions. This paper will explore the technical and operational ramifications of using the functional description framework to organize and coordinate the distributed object modules.
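
    As a sketch of how such a functional description might be used to decide whether a replacement module can be plugged in, the following fragment declares a module's role, inputs, and outputs and checks a simple compatibility rule; the field names, module names, and rule are hypothetical illustrations, not the testbed's actual interface definitions.

        # Illustrative "functional description" and a compatibility check for swapping
        # in a replacement module. Field names, modules, and the rule are hypothetical.
        from dataclasses import dataclass, field

        @dataclass(frozen=True)
        class FunctionalDescription:
            role: str                                 # e.g., "process-planning", "inspection"
            inputs: frozenset = field(default_factory=frozenset)
            outputs: frozenset = field(default_factory=frozenset)

        def is_compatible(current: FunctionalDescription, candidate: FunctionalDescription) -> bool:
            """A candidate may replace the current module if it plays the same role,
            requires no more inputs than the system already provides, and produces
            at least the outputs the system expects."""
            return (candidate.role == current.role
                    and candidate.inputs <= current.inputs
                    and candidate.outputs >= current.outputs)

        planner_v1 = FunctionalDescription("process-planning",
                                           inputs=frozenset({"part-geometry"}),
                                           outputs=frozenset({"process-plan"}))
        planner_v2 = FunctionalDescription("process-planning",
                                           inputs=frozenset({"part-geometry"}),
                                           outputs=frozenset({"process-plan", "cost-estimate"}))
        print(is_compatible(planner_v1, planner_v2))  # True: v2 needs no extra inputs and adds an output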

  14. SAR imagery using chaotic carrier frequency agility pulses

    NASA Astrophysics Data System (ADS)

    Xu, Xiaojian; Feng, Xiangzhi

    2011-06-01

    Synthetic aperture radar (SAR) systems are getting more and more applications in both civilian and military remote sensing missions. With the increasing deployment of electronic countermeasures (ECM) on modern battlefields, SAR encounters more and more interference jamming signals. The ECM jamming signals cause the SAR system to receive and process erroneous information which results in severe degradations in the output SAR images and/or formation of phony images of nonexistent targets. As a consequence, development of the electronic counter-countermeasures (ECCM) capability becomes one of the key problems in SAR system design. This paper develops radar signaling strategies and algorithms that enhance the ability of synthetic aperture radar to image targets under conditions of electronic jamming. The concept of SAR using chaotic carrier frequency agility pulses (CCFAP-SAR) is first proposed. Then the imaging procedure for CCFAP-SAR is discussed in detail. The ECCM performance of CCFAP-SAR for both depressive noise jamming and deceptive repeat jamming is analyzed. The impact of the carrier frequency agility range on the image quality of CCFAP-SAR is also studied. Simulation results demonstrate that, with adequate agility range of the carrier frequency, the proposed CCFAP-SAR performs as well as conventional radar with a linear frequency modulation (LFM) waveform in image quality, performs slightly better against depressive noise jamming, and performs very well against deceptive repeat jamming, which cannot be rejected by LFM-SAR.
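
    To illustrate the carrier-frequency agility idea, the sketch below uses a logistic map, a standard chaotic sequence generator, to assign a pseudo-random carrier frequency to each pulse within an agility band; the map, its parameters, and the band values are illustrative assumptions, not the waveform design from the paper.

        # Minimal sketch of chaotic carrier-frequency agility: a logistic map drives the
        # pulse-to-pulse carrier frequency inside an agility band. Parameter values are
        # illustrative, not taken from the paper. Assumes NumPy is installed.
        import numpy as np

        def chaotic_carriers(n_pulses: int, f_center: float, agility_bw: float,
                             x0: float = 0.37, r: float = 3.99) -> np.ndarray:
            """Return one carrier frequency per pulse, spread over +/- agility_bw/2."""
            x = np.empty(n_pulses)
            x[0] = x0
            for k in range(1, n_pulses):          # logistic map, chaotic for r close to 4
                x[k] = r * x[k - 1] * (1.0 - x[k - 1])
            return f_center + (x - 0.5) * agility_bw

        freqs = chaotic_carriers(n_pulses=8, f_center=10e9, agility_bw=500e6)
        print(np.round(freqs / 1e9, 4))            # carrier of each pulse, in GHz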

  15. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling - using the Business Process Modelling Notation (BPMN) standard is described. Main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need of interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  16. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  17. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  18. Modeling nuclear processes by Simulink

    NASA Astrophysics Data System (ADS)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
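
    As a plain-Python analogue of one of the Simulink examples listed above, the sketch below integrates the point reactor kinetics equations with a single effective delayed-neutron group after a small reactivity step; the parameter values are generic textbook-style placeholders rather than those used in the paper, and NumPy/SciPy are assumed available.

        # Python/SciPy analogue of a Simulink-style point-kinetics model with one
        # effective delayed-neutron group. Parameter values are generic placeholders.
        import numpy as np
        from scipy.integrate import solve_ivp

        beta, Lambda, lam = 0.0065, 1.0e-4, 0.08   # delayed fraction, generation time (s), precursor decay const (1/s)
        rho = 0.001                                # step reactivity insertion (absolute units)

        def kinetics(t, y):
            n, c = y                               # relative neutron density, precursor concentration
            dn = (rho - beta) / Lambda * n + lam * c
            dc = beta / Lambda * n - lam * c
            return [dn, dc]

        y0 = [1.0, beta / (Lambda * lam)]          # equilibrium initial condition at n = 1
        sol = solve_ivp(kinetics, (0.0, 10.0), y0,
                        t_eval=np.linspace(0.0, 10.0, 6), method="LSODA")
        print(np.round(sol.y[0], 3))               # relative power rises after the reactivity step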

  19. Agile Data Curation at a State Geological Survey

    NASA Astrophysics Data System (ADS)

    Hills, D. J.

    2015-12-01

    State agencies, including geological surveys, are often the gatekeepers for myriad data products essential for scientific research and economic development. For example, the Geological Survey of Alabama (GSA) is mandated to explore for, characterize, and report Alabama's mineral, energy, water, and biological resources in support of economic development, conservation, management, and public policy for the betterment of Alabama's citizens, communities, and businesses. As part of that mandate, the GSA has increasingly been called upon to make our data more accessible to stakeholders. Even as demand for greater data accessibility grows, budgets for such efforts are often small, meaning that agencies must do more for less. Agile software development has yielded efficient, effective products, most often at lower cost and in shorter time. Taking guidance from the agile software development model, the GSA is working towards more agile data management and curation. To date, the GSA's work has been focused primarily on data rescue. By using workflows that maximize clear communication while encouraging simplicity (e.g., maximizing the amount of work not done or that can be automated), the GSA is bringing decades of dark data into the light. Regular checks by the data rescuer with the data provider (or their proxy) provides quality control without adding an overt burden on either party. Moving forward, these workflows will also allow for more efficient and effective data management.

  20. Distributed agile software development for the SKA

    NASA Astrophysics Data System (ADS)

    Wicenec, Andreas; Parsons, Rebecca; Kitaeff, Slava; Vinsen, Kevin; Wu, Chen; Nelson, Paul; Reed, David

    2012-09-01

    The SKA software will most probably be developed by many groups distributed across the globe and coming from different backgrounds, like industries and research institutions. The SKA software subsystems will have to cover a very wide range of different areas, but still they have to react and work together like a single system to achieve the scientific goals and satisfy the challenging data flow requirements. Designing and developing such a system in a distributed fashion requires proper tools and the setup of an environment to allow for efficient detection and tracking of interface and integration issues in particular in a timely way. Agile development can provide much faster feedback mechanisms and also much tighter collaboration between the customer (scientist) and the developer. Continuous integration and continuous deployment on the other hand can provide much faster feedback of integration issues from the system level to the subsystem developers. This paper describes the results obtained from trialing a potential SKA development environment based on existing science software development processes like ALMA, the expected distribution of the groups potentially involved in the SKA development and experience gained in the development of large scale commercial software projects.

  1. Modelling of CWS combustion process

    NASA Astrophysics Data System (ADS)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion process of coal water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. The results of mathematical modelling and optimization of stationary regimes of CWS combustion are provided. During modelling, the problem of determining the possible equilibrium composition of the products that can be obtained from CWS combustion at different temperatures is solved.

  2. Fighter agility metrics. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.

    1990-01-01

    Fighter flying qualities and combat capabilities are currently measured and compared in terms relating to vehicle energy, angular rates and sustained acceleration. Criteria based on these measurable quantities have evolved over the past several decades and are routinely used to design aircraft structures, aerodynamics, propulsion and control systems. While these criteria, or metrics, have the advantage of being well understood, easily verified and repeatable during test, they tend to measure the steady state capability of the aircraft and not its ability to transition quickly from one state to another. Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available.

  3. Gamma-ray Astrophysics with AGILE

    SciTech Connect

    Longo, Francesco |; Tavani, M.; Barbiellini, G.; Argan, A.; Basset, M.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.; Chen, A.; Costa, E.; Del Monte, E.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Feroci, M.; Fiorini, M.; Foggetta, L.; Froysland, T.; Frutti, M.

    2007-07-12

    AGILE will explore the gamma-ray Universe with a very innovative instrument combining for the first time a gamma-ray imager and a hard X-ray imager. AGILE will be operational in spring 2007 and it will provide crucial data for the study of Active Galactic Nuclei, Gamma-Ray Bursts, unidentified gamma-ray sources, Galactic compact objects, supernova remnants, TeV sources, and fundamental physics by microsecond timing. The AGILE instrument is designed to simultaneously detect and image photons in the 30 MeV - 50 GeV and 15 - 45 keV energy bands with excellent imaging and timing capabilities, and a large field of view covering approximately 1/5 of the entire sky at energies above 30 MeV. A CsI calorimeter is capable of GRB triggering in the energy band 0.3-50 MeV. AGILE is now (March 2007) undergoing launcher integration and testing. The PSLV launch is planned in spring 2007. AGILE is then foreseen to be fully operational during the summer of 2007.

  4. Enterprise Technologies Deployment for Agile Manufacturing

    SciTech Connect

    Neal, R.E.

    1992-11-01

    This report is intended for high-level technical planners who are responsible for planning future developments for their company or Department of Energy/Defense Programs (DOE/DP) facilities. On one hand, the information may be too detailed or contain too much manufacturing technology jargon for a high-level, nontechnical executive, while at the same time an expert in any of the four infrastructure fields (Product Definition/Order Entry, Planning and Scheduling, Shop Floor Management, and Intelligent Manufacturing Systems) will know more than is conveyed here. The purpose is to describe a vision of technology deployment for an agile manufacturing enterprise. According to the 21st Century Manufacturing Enterprise Strategy, the root philosophy of agile manufacturing is that "competitive advantage in the new systems will belong to agile manufacturing enterprises, capable of responding rapidly to demand for high-quality, highly customized products." Such agility will be based on flexible technologies, skilled workers, and flexible management structures which collectively will foster cooperative initiatives in and among companies. The remainder of this report is dedicated to sharpening our vision and to establishing a framework for defining specific project or pre-competitive project goals which will demonstrate agility through technology deployment.

  5. Enterprise Technologies Deployment for Agile Manufacturing

    SciTech Connect

    Neal, R.E.

    1992-11-01

    This report is intended for high-level technical planners who are responsible for planning future developments for their company or Department of Energy/Defense Programs (DOE/DP) facilities. On one hand, the information may be too detailed or contain too much manufacturing technology jargon for a high-level, nontechnical executive, while at the same time an expert in any of the four infrastructure fields (Product Definition/Order Entry, Planning and Scheduling, Shop Floor Management, and Intelligent Manufacturing Systems) will know more than is conveyed here. The purpose is to describe a vision of technology deployment for an agile manufacturing enterprise. According to the 21st Century Manufacturing Enterprise Strategy, the root philosophy of agile manufacturing is that "competitive advantage in the new systems will belong to agile manufacturing enterprises, capable of responding rapidly to demand for high-quality, highly customized products." Such agility will be based on flexible technologies, skilled workers, and flexible management structures which collectively will foster cooperative initiatives in and among companies. The remainder of this report is dedicated to sharpening our vision and to establishing a framework for defining specific project or pre-competitive project goals which will demonstrate agility through technology deployment.

  6. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  7. SuperAGILE and Gamma Ray Bursts

    SciTech Connect

    Pacciani, Luigi; Costa, Enrico; Del Monte, Ettore; Donnarumma, Immacolata; Evangelista, Yuri; Feroci, Marco; Frutti, Massimo; Lazzarotto, Francesco; Lapshov, Igor; Rubini, Alda; Soffitta, Paolo; Tavani, Marco; Barbiellini, Guido; Mastropietro, Marcello; Morelli, Ennio; Rapisarda, Massimo

    2006-05-19

    The solid-state hard X-ray imager of the AGILE gamma-ray mission -- SuperAGILE -- has a six arcmin on-axis angular resolution in the 15-45 keV range and a field of view in excess of 1 steradian. The instrument is very light: only 5 kg. It is equipped with on-board self-triggering logic and image deconvolution, and it is able to transmit the coordinates of a GRB to the ground in real-time through the ORBCOMM constellation of satellites. Photon-by-photon scientific data are sent to the Malindi ground station at every contact. In this paper we review the performance of the SuperAGILE experiment (scheduled for launch in mid-2006), after its first on-ground calibrations, and show the perspectives for Gamma Ray Bursts.

  8. Waveform-Agile Tracking In Heavy Sea Clutter

    DTIC Science & Technology

    2007-01-01

    Waveform-Agile Tracking In Heavy Sea Clutter Sandeep P. Sira , Antonia Papandreou-Suppappola, Darryl Morrell†, and Douglas Cochran SenSIP Center...to a non-adaptive system. The paper is organized as follows. In Section II, we describe the models for the target dynamics, clutter, and observations ...dependence on zs[n]. C. Observations Model At the end of Sub-dwell 2 of the kth dwell, the measurement provided to the tracker is Yk = [yTn̂0−nv

  9. The Influence of Agility Training on Physiological and Cognitive Performance

    DTIC Science & Technology

    2010-11-01

    training, subjects completed a physical and cognitive battery of serum cortisol, VO2max, vertical jump , reaction time, Illinois Agility Test , body...strong trends toward the agility group improving more than the traditional group on VO2max (p=0.12), vertical jump (p=0.06), Illinois Agility Test ...levels, maximal oxygen uptake, Illinois Agility Test , Makoto reaction time, and vertical jump . The cognitive portion of the testing sessions

  10. Agile enterprise development framework utilizing services principles for building pervasive security

    NASA Astrophysics Data System (ADS)

    Farroha, Deborah; Farroha, Bassam

    2011-06-01

    We are in an environment of continuously changing mission requirements and therefore our Information Systems must adapt to accomplish new tasks more quickly and proficiently. Agility is the only way we will be able to keep up with this change. But there are subtleties that must be considered as we adopt various agile methods: secure, protect, control and authenticate are all elements needed to posture our Information Technology systems to counteract the real and perceived threats in today's environment. Many systems have been tasked to ingest, process, and analyze different data sets than they were originally designed for, and they have to interact with multiple new systems that were unaccounted for at design time. Leveraging the tenets of security, we have devised a new framework that takes agility into a new realm where the product will be built to work in a service-based environment but is developed using agile processes. Even though these two criteria promise to hone the development effort, they actually contradict each other in philosophy: services require stable interfaces, while Agile focuses on being flexible and tolerating changes up to much later stages of development. This framework is focused on enabling a successful product development that capitalizes on both philosophies.

  11. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  12. Theoretical aspects of the agile mirror

    NASA Astrophysics Data System (ADS)

    Manheimer, Wallace M.; Fernsler, Richard

    1994-01-01

    A planar plasma mirror which can be oriented electronically could have the capability of providing electronic steering of a microwave beam in a radar or electronic warfare system. This system is denoted the agile mirror. A recent experiment has demonstrated such a planar plasma and the associated microwave reflection. This plasma was produced by a hollow cathode glow discharge, where the hollow cathode was a grooved metallic trench in a Lucite plate. Various theoretical aspects of this configuration of an agile mirror are examined here.

  13. IMAGE: Simulation for Understanding Complex Situations and Increasing Future Force Agility

    DTIC Science & Technology

    2008-12-01

    IMAGE: SIMULATION FOR UNDERSTANDING COMPLEX SITUATIONS AND INCREASING FUTURE FORCE AGILITY Michel Lizotte, François Bernier, Marielle Mokhtari...ism exploiting this common vocabulary. Supporting a team sharing ideas also means handling ownership of comprehension models and allowing a user to...simulation models are closely related. Any meaningful simulation object actually implements a Representation concept. Some comprehension model

  14. Planning, Estimating, and Monitoring Progress in Agile Systems Development Environments

    DTIC Science & Technology

    2010-04-01

    agilemanifesto.org NORTHROP GRUMMAN Agile Terminology: Term - Definition; Product Backlog - Requirements/User Stories to be completed; Iteration...Marion, McKelvey, & Uhl-Bien. (2007). Leadership Quarterly, 18(4), 298-318. Agile Development Practices; Agile Project Management with Scrum, Ken Schwaber

  15. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  16. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  17. An Agile Enterprise Regulation Architecture for Health Information Security Management

    PubMed Central

    Chen, Ying-Pei; Hsieh, Sung-Huai; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-01-01

    Abstract Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital. PMID:20815748

  18. An agile enterprise regulation architecture for health information security management.

    PubMed

    Chen, Ying-Pei; Hsieh, Sung-Huai; Cheng, Po-Hsun; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-09-01

    Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital.

  19. Lean and Agile: An Epistemological Reflection

    ERIC Educational Resources Information Center

    Browaeys, Marie-Joelle; Fisser, Sandra

    2012-01-01

    Purpose: The aim of the paper is to contribute to the discussion of treating the concepts of lean and agile in isolation or combination by presenting an alternative view from complexity thinking on these concepts, considering an epistemological approach to this topic. Design/methodology/approach: The paper adopts an epistemological approach, using…

  20. The Holy Grail of Agile Acquisition

    DTIC Science & Technology

    2010-04-01

    Bestsellers …” [Erwin 2009] Motivation • Despite Erwin’s recommendation… – Agility seems to be a simple concept and it is commonly perceived as a virtue...osd mil/dapaproject/>. . . Erwin 2009 Erwin, S.I., Washington Pulse, Pentagon Brass: Stay Away From Management Bestsellers, National Defense, August

  1. AGILE and Gamma-Ray Bursts

    SciTech Connect

    Longo, Francesco; Tavani, M.; Barbiellini, G.; Argan, A.; Basset, M.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.; Chen, A.; Costa, E.; Del Monte, E.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Feroci, M.; Fiorini, M.; Foggetta, L.; Froysland, T.; Frutti, M.

    2006-05-19

    AGILE is a Scientific Mission dedicated to high-energy astrophysics supported by ASI with scientific participation of INAF and INFN. The AGILE instrument is designed to simultaneously detect and image photons in the 30 MeV - 50 GeV and 15 - 45 keV energy bands with excellent imaging and timing capabilities, and a large field of view covering approximately 1/5 of the entire sky at energies above 30 MeV. A CsI calorimeter is capable of GRB triggering in the energy band 0.3-50 MeV. The broadband detection of GRBs and the study of implications for particle acceleration and high energy emission are primary goals of the mission. AGILE can image GRBs with 2-3 arcminute error boxes in the hard X-ray range, and provide broadband photon-by-photon detection in the 15-45 keV, 0.3-50 MeV, and 30 MeV-30 GeV energy ranges. Microsecond on-board photon tagging and an approximately 100 microsecond gamma-ray detection deadtime will be crucial for fast GRB timing. On-board calculated GRB coordinates and energy fluxes will be quickly transmitted to the ground by an ORBCOMM transceiver. AGILE has recently (December 2005) completed its gamma-ray calibration. It is now (January 2006) undergoing satellite integration and testing. The PSLV launch is planned in early 2006. AGILE is then foreseen to be fully operational during the summer of 2006. It will be the only mission entirely dedicated to high-energy astrophysics above 30 MeV during the period mid-2006/mid-2007.

  2. Planning: The Participatory Process Model.

    ERIC Educational Resources Information Center

    McDowell, Elizabeth V.

    The participatory planning process model developed by Peirce Junior College is described in this paper. First, the rationale for shifting from a traditional authoritarian style of institutional leadership to a participatory style which encourages a broader concern for the institution and lessens morale problems is offered. The development of a new…

  3. Autonomous Guidance of Agile Small-scale Rotorcraft

    NASA Technical Reports Server (NTRS)

    Mettler, Bernard; Feron, Eric

    2004-01-01

    This report describes a guidance system for agile vehicles based on a hybrid closed-loop model of the vehicle dynamics. The hybrid model represents the vehicle dynamics through a combination of linear-time-invariant control modes and pre-programmed, finite-duration maneuvers. This particular hybrid structure can be realized through a control system that combines trim controllers and a maneuvering control logic. The former enable precise trajectory tracking, and the latter enables trajectories at the edge of the vehicle capabilities. The closed-loop model is much simpler than the full vehicle equations of motion, yet it can capture a broad range of dynamic behaviors. It also supports a consistent link between the physical layer and the decision-making layer. The trajectory generation was formulated as an optimization problem using mixed-integer linear programming. The optimization is solved in a receding horizon fashion. Several techniques to improve the computational tractability were investigated. Simulation experiments using the NASA Ames R-50 model show that this approach fully exploits the vehicle's agility.
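
    The hybrid structure described above, trim modes plus finite-duration maneuvers under a switching control logic, can be sketched in a few lines; the mode names, durations, and dispatch logic below are hypothetical illustrations, not the report's maneuver library or controllers.

        # Illustrative sketch of the hybrid closed-loop structure: steady trim modes that
        # track a reference, plus pre-programmed finite-duration maneuvers. Mode names,
        # durations, and the sequencing are hypothetical.
        from dataclasses import dataclass
        from typing import List, Union

        @dataclass
        class TrimMode:
            name: str            # e.g., "hover", "forward-flight"
            duration: float      # seconds spent tracking this trim trajectory

        @dataclass
        class Maneuver:
            name: str            # e.g., "hammerhead"
            duration: float      # fixed duration of the pre-programmed maneuver

        Plan = List[Union[TrimMode, Maneuver]]

        def execute(plan: Plan) -> None:
            """Walk the plan, dispatching each segment to the appropriate controller."""
            t = 0.0
            for segment in plan:
                kind = "trim controller" if isinstance(segment, TrimMode) else "maneuver logic"
                print(f"t={t:5.1f}s  ->  {kind}: {segment.name} for {segment.duration}s")
                t += segment.duration

        execute([TrimMode("hover", 2.0), Maneuver("hammerhead", 3.5), TrimMode("forward-flight", 4.0)])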

  4. Towards an Understanding of the Conceptual Underpinnings of Agile Development Methodologies

    NASA Astrophysics Data System (ADS)

    Nerur, Sridhar; Cannon, Alan; Balijepally, Venugopal; Bond, Philip

    While the growing popularity of agile development methodologies is undeniable, there has been little systematic exploration of its intellectual foundation. Such an effort would be an important first step in understanding this paradigm's underlying premises. This understanding, in turn, would be invaluable in our assessment of current practices as well as in our efforts to advance the field of software engineering. Drawing on a variety of sources, both within and outside the discipline, we argue that the concepts underpinning agile development methodologies are by no means novel. In the tradition of General Systems Theory this paper advocates a transdisciplinary examination of agile development methodologies to extend the intellectual boundaries of software development. This is particularly important as the field moves beyond instrumental processes aimed at satisfying mere technical considerations.

  5. Investigating the strategic antecedents of agility in humanitarian logistics.

    PubMed

    L'Hermitte, Cécile; Brooks, Benjamin; Bowles, Marcus; Tatham, Peter H

    2016-12-16

    This study investigates the strategic antecedents of operational agility in humanitarian logistics. It began by identifying the particular actions to be taken at the strategic level of a humanitarian organisation to support field-level agility. Next, quantitative data (n=59) were collected on four strategic-level capabilities (being purposeful, action-focused, collaborative, and learning-oriented) and on operational agility (field responsiveness and flexibility). Using a quantitative analysis, the study tested the relationship between organisational capacity building and operational agility and found that the four strategic-level capabilities are fundamental building blocks of agility. Collectively they account for 52 per cent of the ability of humanitarian logisticians to deal with ongoing changes and disruptions in the field. This study emphasises the need for researchers and practitioners to embrace a broader perspective of agility in humanitarian logistics. In addition, it highlights the inherently strategic nature of agility, the development of which involves focusing simultaneously on multiple drivers.

  6. Array Databases: Agile Analytics (not just) for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2015-12-01

    Gridded data, such as images, image timeseries, and climate datacubes, today are managed separately from the metadata, and with different, restricted retrieval capabilities. While databases are good at metadata modelled in tables, XML hierarchies, or RDF graphs, they traditionally do not support multi-dimensional arrays. This gap is being closed by Array Databases, pioneered by the scalable rasdaman ("raster data manager") array engine. Its declarative query language, rasql, extends SQL with array operators which are optimized and parallelized on the server side. Installations can easily be mashed up securely, thereby enabling large-scale location-transparent query processing in federations. Domain experts value the integration with their commonly used tools, leading to a quick learning curve. Earth, Space, and Life sciences, but also Social sciences as well as business, have massive amounts of data and complex analysis challenges that are answered by rasdaman. As of today, rasdaman is mature and in operational use on hundreds of Terabytes of timeseries datacubes, with transparent query distribution across more than 1,000 nodes. Additionally, its concepts have shaped international Big Data standards in the field, including the forthcoming array extension to ISO SQL, many of which are meanwhile supported by both open-source and commercial systems. In the geo field, rasdaman is the reference implementation for the Open Geospatial Consortium (OGC) Big Data standard, WCS, now also under adoption by ISO. Further, rasdaman is in the final stage of OSGeo incubation. In this contribution we present array queries a la rasdaman, describe the architecture and the novel optimization and parallelization techniques introduced in 2015, and put this in the context of the intercontinental EarthServer initiative, which utilizes rasdaman for enabling agile analytics on Petascale datacubes.

  7. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects that the processing parameters pressure, temperature, and time have on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented predicting the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism by which the plies bond to themselves. Theoretical predictions from Reptation Theory relating autohesive strength and contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.
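
    The following sketch illustrates the two-part bonding picture described above using generic functional forms: a power-law degree of intimate contact and a t^(1/4) autohesion law suggested by reptation theory. The exponents, time constants, and the product form for overall bond strength are assumptions for illustration, not the paper's fitted model.

      # Illustrative two-part bond model (generic forms, not the paper's fitted expressions):
      # autohesive strength grows roughly as t^(1/4) up to the reptation time, and the
      # overall interlaminar bond is taken as the product of contact and autohesion.
      def degree_of_intimate_contact(t, pressure, tau_c=60.0):
          """Fraction of the ply interface in intimate contact after time t [s].
          tau_c is an assumed characteristic flattening time that shrinks with pressure [MPa]."""
          tau = tau_c / max(pressure, 1e-6)          # higher pressure -> faster contact
          return min(1.0, (t / tau) ** 0.2)

      def degree_of_autohesion(t, t_rep=120.0):
          """Normalized autohesive strength: (t / t_rep)^(1/4), capped at 1 (assumed t_rep)."""
          return min(1.0, (t / t_rep) ** 0.25)

      def bond_strength(t, pressure):
          """Overall interlaminar strength fraction: contact must occur for healing to count."""
          return degree_of_intimate_contact(t, pressure) * degree_of_autohesion(t)

      for t in (10, 60, 300, 900):                   # seconds at the ply interface
          print(f"t={t:4d}s  P=1.0 MPa -> bond fraction = {bond_strength(t, 1.0):.2f}")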

  8. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, software developed, and hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended that this system be used for welding process control.

  9. Architecture and performances of the AGILE Telemetry Preprocessing System (TMPPS)

    NASA Astrophysics Data System (ADS)

    Trifoglio, M.; Bulgarelli, A.; Gianotti, F.; Lazzarotto, F.; Di Cocco, G.; Fuschino, F.; Tavani, M.

    2008-07-01

    AGILE is an Italian Space Agency (ASI) satellite dedicated to high energy Astrophysics. It was launched successfully on 23 April 2007, and it has been operated by the AGILE Ground Segment, consisting of the Ground Station located in Malindi (Kenya), the Mission Operations Centre (MOC) and the AGILE Data Centre (ADC) established in Italy, at Telespazio in Fucino and at the ASI Science Data Centre (ASDC) in Frascati respectively. Due to the low equatorial orbit at ~530 km with an inclination angle of ~2.5°, the satellite passes over the Ground Station every ~100'. During the visibility period of ~12', the Telemetry (TM) is downlinked through two separate virtual channels, VC0 and VC1. The former is devoted to the real time TM generated during the pass at the average rate of 50 Kbit/s and is directly relayed to the Control Centre. The latter is used to downlink TM data collected in the satellite on-board mass memory during the non-visibility period. This generates at the Ground Station a raw TM file of up to 37 MByte. Within 20' after the end of the contact, both the real time and mass memory TM arrive at the ADC through the dedicated VPN ASINet. Here they are automatically detected and ingested by the TMPPS pipeline in less than 5 minutes. The TMPPS archives each TM file and sorts its packets into one stream for each of the different TM layouts. Each stream is processed in parallel in order to unpack the various telemetry fields and archive them into suitable FITS files. Each operation is tracked in a MySQL database which interfaces the TMPPS pipeline to the rest of the scientific pipeline running at the ADC. In this paper the architecture and the performance of the TMPPS will be described and discussed.
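
    The sort-and-track pattern described above can be illustrated with a short Python sketch; the packet format, APID values, and the in-memory catalog standing in for the MySQL tracking tables and FITS archive are all hypothetical, not the actual TMPPS code.

      # Minimal sketch of the sort-and-track pattern (hypothetical packet format; a dict
      # stands in for the MySQL tracking tables and the FITS archive files).
      import struct
      from collections import defaultdict

      def parse_packets(raw: bytes):
          """Yield (apid, payload) pairs from a toy stream:
          2-byte APID, 2-byte length, then `length` payload bytes (assumed layout)."""
          offset = 0
          while offset + 4 <= len(raw):
              apid, length = struct.unpack_from(">HH", raw, offset)
              payload = raw[offset + 4: offset + 4 + length]
              yield apid, payload
              offset += 4 + length

      def ingest(raw: bytes, catalog: dict):
          """Sort packets into one stream per layout (APID) and record counts for tracking."""
          streams = defaultdict(list)
          for apid, payload in parse_packets(raw):
              streams[apid].append(payload)
          for apid, packets in streams.items():
              catalog[apid] = catalog.get(apid, 0) + len(packets)   # stand-in for DB insert
          return streams

      # Two toy packets: APID 0x0050 (science) and APID 0x0010 (housekeeping).
      raw = struct.pack(">HH3s", 0x0050, 3, b"ABC") + struct.pack(">HH2s", 0x0010, 2, b"HK")
      catalog = {}
      streams = ingest(raw, catalog)
      print({hex(k): len(v) for k, v in streams.items()}, catalog)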

  10. Lesson Learned from AGILE and LARES ASI Projects About MATED Data Collection and Post Analysis

    NASA Astrophysics Data System (ADS)

    Carpentiero, Rita; Mrchetti, Ernesto; Natalucci, Silvia; Portelli, Claudio

    2012-07-01

    ASI has managed and collected data on the project development of two scientific all-Italian missions: AGILE and LARES. Collection of the Model And Test Effectiveness Database (MATED) data, concerning Project, AIV (Assembly Integration and Verification) and NCR (Non Conformance Report) aspects, has been performed by the Italian Space Agency (ASI), using the available technical documentation of both the AGILE and LARES projects. In this paper some considerations on the need for 'real time' data collection are made, together with a proposal for front-end improvements to this tool. In addition, a preliminary analysis of MATED effectiveness related to the above ASI projects is presented, following a bottom-up and post-verification approach.

  11. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature, including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals, in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species and to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation.

  12. Model for amorphous aggregation processes

    NASA Astrophysics Data System (ADS)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the experimental amorphous aggregation data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  13. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  14. An Agile Constructionist Mentoring Methodology for Software Projects in the High School

    ERIC Educational Resources Information Center

    Meerbaum-Salant, Orni; Hazzan, Orit

    2010-01-01

    This article describes the construction process and evaluation of the Agile Constructionist Mentoring Methodology (ACMM), a mentoring method for guiding software development projects in the high school. The need for such a methodology has arisen due to the complexity of mentoring software project development in the high school. We introduce the…

  15. Sharing environmental models: An Approach using GitHub repositories and Web Processing Services

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Nuest, Daniel; Pross, Benjamin

    2016-04-01

    accordingly. The admin tool of the 52°North WPS was extended to support automated retrieval and deployment of computational models from GitHub repositories. Once the R code is available in the GitHub repo, the contained process can be easily deployed and executed by simply defining the GitHub repository URL in the WPS admin tool. We illustrate the usage of the approach by sharing and running a model for land use system archetypes developed by the Helmholtz Centre for Environmental Research (UFZ, see Vaclavik et al.). The original R code was extended and published in the 52°North WPS using both public and non-public datasets (Nüst et al., see also https://github.com/52North/glues-wps). Hosting the analysis in a Git repository now allows WPS administrators, client developers, and modelers to easily work together on new versions or completely new web processes using the powerful GitHub collaboration platform. References: Hinz, M. et al. (2013): Spatial Statistics on the Geospatial Web. In: The 16th AGILE International Conference on Geographic Information Science, Short Papers. http://www.agile-online.org/Conference_Paper/CDs/agile_2013/Short_Papers/SP_S3.1_Hinz.pdf Nüst, D. et al. (2015): Open and reproducible global land use classification. In: EGU General Assembly Conference Abstracts. Vol. 17. European Geophysical Union, 2015, p. 9125, http://meetingorganizer.copernicus.org/EGU2015/EGU2015-9125.pdf Vaclavik, T., et al. (2013): Mapping global land system archetypes. Global Environmental Change 23(6): 1637-1647. Online available: October 9, 2013, DOI: 10.1016/j.gloenvcha.2013.09.004

  16. Prospects for High Energy Detection of Microquasars with the AGILE and GLAST Gamma-Ray Telescopes

    SciTech Connect

    Santolamazza, Patrizia; Pittori, Carlotta; Verrecchia, Francesco

    2007-08-21

    We estimate the sensitivities of the AGILE and GLAST gamma-ray experiments taking into account two cases for the galactic gamma-ray diffuse background (at high galactic latitude and toward the galactic center). We then use these sensitivities to estimate microquasar observability with the two experiments, assuming the gamma-ray emission above 100 MeV of a recent microquasar model.

  17. Fall 2014 SEI Research Review Applying Agile Methods to DoD

    DTIC Science & Technology

    2014-10-28

    Briefing fragments: Agile Defense Adoption Proponents Team (ADAPT) member; E-Learning Agile Course; multiple presentations and program committees: GSAW 2014, Agile 2014, Contracts in Agile International Meeting, AFEI/SEI DoD Agile Summit, GAO Working Groups.

  18. Achieving Success via Multi-Model Process Improvement

    DTIC Science & Technology

    2007-03-01

    Briefing fragments: improvement models surveyed alongside CMMI include EIA 632, ISO 9000, ITIL, COBIT, PSM, GQIM, RUP, Agile, and Lean Six Sigma. Cautions against conflating maturity levels with sigma levels and defect density ("We're Level 5 therefore we must be Six Sigma", "We're doing Six Sigma therefore we must be Level 4"), illustrated with defects-per-million figures by sigma level (2: 308,537; 3: 66,807; 4: 6,210; 5: 233; 6: 3.4) for everyday processes such as IRS phone-in tax advice, restaurant bills, doctors' prescription writing, and payroll processing.

  19. A review of the Technologies Enabling Agile Manufacturing program

    SciTech Connect

    Gray, W.H.; Neal, R.E.; Cobb, C.K.

    1996-10-01

    Addressing a technical plan developed in consideration with major US manufacturers, software and hardware providers, and government representatives, the Technologies Enabling Agile Manufacturing (TEAM) program is leveraging the expertise and resources of industry, universities, and federal agencies to develop, integrate, and deploy leap-ahead manufacturing technologies. One of the TEAM program's goals is to transition products from design to production faster, more efficiently, and at less cost. TEAM's technology development strategy also provides all participants with early experience in establishing and working within an electronic enterprise that includes access to high-speed networks and high-performance computing and storage systems. The TEAM program uses the cross-cutting tools it collects, develops, and integrates to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, sheet metal forming, and electro-mechanical assembly. This paper reviews the current status of the TEAM program with emphasis upon TEAM's information infrastructure.

  20. Modeling climate related feedback processes

    SciTech Connect

    Elzen, M.G.J. den; Rotmans, J. )

    1993-11-01

    In order to assess their impact, the feedbacks which at present can be quantified reasonably are built into the Integrated Model to Assess the Greenhouse Effect (IMAGE). Unlike previous studies, this study describes the scenario- and time-dependent role of biogeochemical feedbacks. A number of simulation experiments are performed with IMAGE to project climate changes. Besides estimates of their absolute importance, the relative importance of individual biogeochemical feedbacks is considered by calculating the gain for each feedback process. This study focuses on feedback processes in the carbon cycle and the methane (semi-) cycle. Modeled feedbacks are then used to balance the past and present carbon budget. This results in substantially lower projections for atmospheric carbon dioxide than the Intergovernmental Panel on Climate Change (IPCC) estimates. The difference is approximately 18% from the 1990 level for the IPCC "Business-as-Usual" scenario. Furthermore, the IPCC's "best guess" value of the CO2 concentration in the year 2100 falls outside the uncertainty range estimated with our balanced modeling approach. For the IPCC "Business-as-Usual" scenario, the calculated total gain of the feedbacks within the carbon cycle appears to be negative, a result of the dominant role of the fertilization feedback. This study also shows that if temperature feedbacks on methane emissions from wetlands, rice paddies, and hydrates do materialize, methane concentrations might be increased by 30% by 2100. 70 refs., 17 figs., 7 tabs.
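
    A small worked example of the gain bookkeeping mentioned above, using the standard linear-feedback relation: individual feedback gains add, and the overall amplification of the no-feedback response is 1/(1 - sum of gains). The numerical gain values below are purely illustrative, not IMAGE results.

      # Worked example of feedback-gain bookkeeping (illustrative numbers, not IMAGE output):
      # individual gains g_i add, and the overall amplification is 1 / (1 - sum(g_i)).
      feedback_gains = {
          "CO2_fertilization": -0.15,   # negative gain: damps the CO2 rise (assumed value)
          "soil_respiration":  +0.05,   # positive gain: amplifies it (assumed value)
          "wetland_methane":   +0.04,   # assumed value
      }

      g_total = sum(feedback_gains.values())
      amplification = 1.0 / (1.0 - g_total)

      print(f"total gain = {g_total:+.2f}")
      print(f"response relative to no-feedback case = {amplification:.2f}x")
      # A negative total gain gives amplification < 1, i.e. net damping, consistent
      # with the dominant fertilization feedback discussed above.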

  1. Candidate control design metrics for an agile fighter

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Bailey, Melvin L.; Ostroff, Aaron J.

    1991-01-01

    Success in the fighter combat environment of the future will certainly demand increasing capability from aircraft technology. These advanced capabilities in the form of superagility and supermaneuverability will require special design techniques which translate advanced air combat maneuvering requirements into design criteria. Control design metrics can provide some of these techniques for the control designer. This study presents an overview of control design metrics and investigates metrics for advanced fighter agility. The objectives of various metric users, such as airframe designers and pilots, are differentiated from the objectives of the control designer. Using an advanced fighter model, metric values are documented over a portion of the flight envelope through piloted simulation. These metric values provide a baseline against which future control system improvements can be compared and against which a control design methodology can be developed. Agility is measured for the axial, pitch, and roll axes. Axial metrics highlight acceleration and deceleration capabilities under different flight loads and include specific excess power measurements to characterize energy maneuverability. Pitch metrics cover both body-axis and wind-axis pitch rates and accelerations. Included in the pitch metrics are nose-pointing metrics which highlight displacement capability between the nose and the velocity vector. Roll metrics (or torsion metrics) focus on rotational capability about the wind axis.
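
    As a concrete illustration of one of the axial metrics named above, the sketch below evaluates specific excess power, P_s = V(T - D)/W, for hypothetical thrust, drag, and weight values at two load factors.

      # Minimal example of one axial agility metric: specific excess power,
      # P_s = V * (T - D) / W, evaluated for hypothetical flight conditions.
      def specific_excess_power(velocity_mps, thrust_n, drag_n, weight_n):
          """Specific excess power [m/s]: rate at which energy height can be gained."""
          return velocity_mps * (thrust_n - drag_n) / weight_n

      # Hypothetical fighter at two load factors (drag grows with induced drag at high g).
      for load_factor, drag in [(1.0, 45e3), (5.0, 95e3)]:
          ps = specific_excess_power(velocity_mps=250.0, thrust_n=110e3, drag_n=drag,
                                     weight_n=160e3)
          print(f"n = {load_factor:.0f} g  ->  Ps = {ps:6.1f} m/s")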

  2. Evaluation of a burst aggregation method in an optical burst switched agile all-photonic network

    NASA Astrophysics Data System (ADS)

    Parveen, Sonia; Radziwilowicz, Robert; Paredes, Sofia A.; Hall, Trevor J.

    2005-09-01

    This paper presents a burst aggregation method for an Agile All-Photonic Network (AAPN) operating under an asynchronous burst switched mode. The model combines the timer-based and threshold-based approaches into a single composite burst assembly mechanism. This is evaluated semi-analytically for fixed-length packets and Poisson arrivals and used as a special case to verify a more general OPNET Modeler simulation. The dependence of the blocking probability on different burst aggregation parameters is also observed. The same procedure is extended to 'encapsulate' (aggregate) variable-packet-length traffic into 'envelopes' (bursts) matched to the time slots in an AAPN operating in a synchronous time-slotted mode. Results are presented for an emulation of this process using real IP network traffic from the local LAN with two encapsulation methods that differ in whether 'envelope' boundaries are allowed to cross constituent packets. Bandwidth utilization was measured for different encapsulation parameters, and it is confirmed that the model with boundaries allowed to cross packets (i.e., the model with packet segmentation) is more bandwidth-efficient even if the processing delay is slightly larger. The successful operation of the emulation system also suggests that a simple, low-cost software implementation would be suitable to perform the burst/slot aggregation process in an AAPN.
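
    The composite timer/threshold assembly rule can be sketched in a few lines: a burst is released when either the accumulated size reaches a threshold or the assembly timer expires, whichever comes first. The parameter values and the arrival-driven timer check below are illustrative simplifications, not the AAPN implementation.

      # Sketch of a composite timer/threshold burst assembler (hypothetical parameters).
      class BurstAssembler:
          def __init__(self, size_threshold=9000, timeout=0.5):
              self.size_threshold = size_threshold   # bytes
              self.timeout = timeout                 # seconds
              self.buffer, self.buffer_bytes, self.open_time = [], 0, None

          def offer(self, packet_len, arrival_time):
              """Add a packet; return a completed burst (list of packet lengths) or None."""
              if not self.buffer:
                  self.open_time = arrival_time
              self.buffer.append(packet_len)
              self.buffer_bytes += packet_len
              timer_expired = arrival_time - self.open_time >= self.timeout
              if self.buffer_bytes >= self.size_threshold or timer_expired:
                  burst, self.buffer, self.buffer_bytes = self.buffer, [], 0
                  return burst
              return None

      # Feed a toy arrival trace: (packet length in bytes, arrival time in seconds).
      assembler = BurstAssembler()
      trace = [(1500, 0.00), (1500, 0.10), (4000, 0.20), (3000, 0.25), (500, 0.90)]
      for length, t in trace:
          burst = assembler.offer(length, t)
          if burst:
              print(f"t={t:.2f}s  burst released: {sum(burst)} bytes in {len(burst)} packets")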

  3. Modeling Stem Cell Induction Processes

    PubMed Central

    Grácio, Filipe; Cabral, Joaquim; Tidor, Bruce

    2013-01-01

    Technology for converting human cells to pluripotent stem cells using induction processes has the potential to revolutionize regenerative medicine. However, the production of these so-called iPS cells is still quite inefficient and may be dominated by stochastic effects. In this work we build mass-action models of the core regulatory elements controlling stem cell induction and maintenance. The models include not only the network of transcription factors NANOG, OCT4, and SOX2, but also important epigenetic regulatory features of DNA methylation and histone modification. We show that the network topology reported in the literature is consistent with the observed experimental behavior of bistability and inducibility. Based on simulations of stem cell generation protocols, and in particular focusing on changes in epigenetic cellular states, we show that cooperative and independent reaction mechanisms have experimentally identifiable differences in the dynamics of reprogramming, and we analyze such differences and their biological basis. It had been argued that stochastic and elite models of stem cell generation represent distinct fundamental mechanisms. Work presented here suggests an alternative possibility: that they represent differences in the amount of information we have about the distribution of cellular states before and during reprogramming protocols. We show further that unpredictability and variation in reprogramming decrease as the cell progresses along the induction process, and that identifiable groups of cells with elite-seeming behavior can come about by a stochastic process. Finally we show how different mechanisms and kinetic properties impact the prospects of improving the efficiency of iPS cell generation protocols. PMID:23667423
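
    A heavily reduced illustration of the mass-action modelling approach is shown below: a two-factor, mutually activating core with Hill-type kinetics (a stand-in for the full NANOG/OCT4/SOX2 plus epigenetics model), integrated from two initial conditions to exhibit the bistability discussed above. Parameters and equations are assumptions for illustration only.

      # Illustrative mass-action-style ODE for a two-factor cross-activating core,
      # integrated from two initial conditions to show bistability.
      from scipy.integrate import solve_ivp

      def core_network(t, y, k=1.0, K=0.4, n=2, d=1.0):
          """dx1/dt, dx2/dt with cooperative mutual activation (Hill kinetics) and decay."""
          x1, x2 = y
          act = lambda u: k * u**n / (K**n + u**n)
          return [act(x2) - d * x1, act(x1) - d * x2]

      for label, y0 in [("low (somatic-like)", [0.05, 0.05]), ("high (induced)", [0.9, 0.9])]:
          sol = solve_ivp(core_network, (0.0, 50.0), y0, rtol=1e-8)
          print(f"start {label:18s} -> steady state ~ {sol.y[:, -1].round(3)}")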

  4. Modeling pellet impact drilling process

    NASA Astrophysics Data System (ADS)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for engineering design of drilling operations under different geo-technical conditions.

  5. Future Research in Agile Systems Development: Applying Open Innovation Principles Within the Agile Organisation

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Morgan, Lorraine

    A particular strength of agile approaches is that they move away from ‘introverted' development and intimately involve the customer in all areas of development, supposedly leading to the development of a more innovative and hence more valuable information system. However, we argue that a single customer representative is too narrow a focus to adopt and that involvement of stakeholders beyond the software development itself is still often quite weak and in some cases non-existent. In response, we argue that current thinking regarding innovation in agile development needs to be extended to include multiple stakeholders outside the business unit. This paper explores the intra-organisational applicability and implications of open innovation in agile systems development. Additionally, it argues for a different perspective of project management that includes collaboration and knowledge-sharing with other business units, customers, partners, and other relevant stakeholders pertinent to the business success of an organisation, thus embracing open innovation principles.

  6. End users transforming experiences into formal information and process models for personalised health interventions.

    PubMed

    Lindgren, Helena; Lundin-Olsson, Lillemor; Pohl, Petra; Sandlund, Marlene

    2014-01-01

    Five physiotherapists organised a user-centric design process of a knowledge-based support system for promoting exercise and preventing falls. The process integrated focus group studies with 17 older adults and prototyping. The transformation of informal medical and rehabilitation expertise and older adults' experiences into formal information and process models during the development was studied. As a tool they used ACKTUS, a development platform for knowledge-based applications. The process became agile and incremental, partly due to the diversity of expectations and preferences among both older adults and physiotherapists, and the participatory approach to design and development. In addition, there was a need to develop the knowledge content alongside the formal models and their presentations, which allowed the participants to test hands-on and evaluate the ideas, content and design. The resulting application is modular, extendable, flexible and adaptable to the individual end user. Moreover, the physiotherapists are able to modify the information and process models, and in this way further develop the application. The main constraint was found to be the lack of support for the initial phase of concept modelling, which led to a redesigned user interface and functionality of ACKTUS.

  7. Agile Methodology - Past and Future

    DTIC Science & Technology

    2011-05-01

    Briefing acronym list (fragment): … Model Integration; EA - DoD's Evolutionary Acquisition policy; SOA - Service-Oriented Architecture; HBR - Harvard Business Review; WS - Web Service; XP - Extreme Programming.

  8. Rolling and tumbling: status of the SuperAGILE experiment

    NASA Astrophysics Data System (ADS)

    Del Monte, E.; Costa, E.; di Persio, G.; Donnarumma, I.; Evangelista, Y.; Feroci, M.; Lapshov, I.; Lazzarotto, F.; Mastropietro, M.; Morelli, E.; Pacciani, L.; Rapisarda, M.; Rubini, A.; Soffitta, P.; Tavani, M.; Argan, A.; Trois, A.

    2010-07-01

    The SuperAGILE experiment is the hard X-ray monitor of the AGILE mission. It is a 2 x one-dimensional imager, with 6-arcmin angular resolution in the energy range 18 - 60 keV and a field of view in excess of 1 steradian. SuperAGILE has been operating successfully in orbit since summer 2007, providing long-term monitoring of bright sources and prompt detection and localization of gamma-ray bursts. In October 2009 the AGILE mission lost its reaction wheel and the satellite attitude is no longer stabilized. The current mode of operation of the AGILE satellite is a spinning mode, around the Sun-pointing direction, with an angular velocity of about 0.8 degree/s (corresponding to 8 times the SuperAGILE point spread function every second). In these new conditions, SuperAGILE continuously scans a much larger fraction of the sky, with much smaller exposure to each region. In this paper we review some of the results of the first 2.5 years of "standard" operation of SuperAGILE, and show how new implementations in the data analysis software allow the hard X-ray sky monitoring by SuperAGILE to continue in the new attitude conditions.

  9. Applying Agile Methods to Weapon/Weapon-Related Software

    SciTech Connect

    Adams, D; Armendariz, M; Blackledge, M; Campbell, F; Cloninger, M; Cox, L; Davis, J; Elliott, M; Granger, K; Hans, S; Kuhn, C; Lackner, M; Loo, P; Matthews, S; Morrell, K; Owens, C; Peercy, D; Pope, G; Quirk, R; Schilling, D; Stewart, A; Tran, A; Ward, R; Williamson, M

    2007-05-02

    This white paper provides information and guidance to the Department of Energy (DOE) sites on Agile software development methods and the impact of their application on weapon/weapon-related software development. The purpose of this white paper is to provide an overview of Agile methods, examine the accepted interpretations/uses/practices of these methodologies, and discuss the applicability of Agile methods with respect to Nuclear Weapons Complex (NWC) Technical Business Practices (TBPs). It also provides recommendations on the application of Agile methods to the development of weapon/weapon-related software.

  10. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergoes several physical processes including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.

  11. Wideband Agile Digital Microwave Radiometer

    NASA Technical Reports Server (NTRS)

    Gaier, Todd C.; Brown, Shannon T.; Ruf, Christopher; Gross, Steven

    2012-01-01

    The objectives of this work were to take the initial steps needed to develop a field programmable gate array (FPGA)-based wideband digital radiometer backend (>500 MHz bandwidth) that will enable passive microwave observations with minimal performance degradation in a radio-frequency-interference (RFI)-rich environment. As manmade RF emissions increase over time and fill more of the microwave spectrum, microwave radiometer science applications will be increasingly impacted in a negative way, and the current generation of spaceborne microwave radiometers that use broadband analog back ends will become severely compromised or unusable over an increasing fraction of time on orbit. There is a need to develop a digital radiometer back end that, for each observation period, uses digital signal processing (DSP) algorithms to identify the maximum amount of RFI-free spectrum across the radiometer band in order to preserve bandwidth and minimize radiometer noise (which is inversely related to the bandwidth). Ultimately, the objective is to incorporate all processing necessary in the back end to take contaminated input spectra and produce a single output value free of manmade signals, to minimize data rates for spaceborne radiometer missions. But, to meet these objectives, several intermediate processing algorithms had to be developed, and their performance characterized relative to typical brightness temperature accuracy requirements for current and future microwave radiometer missions, including those for measuring salinity, soil moisture, and snow pack.
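
    The back-end idea can be sketched as follows: flag RFI-contaminated sub-channels with a simple robust outlier test, keep the remaining bandwidth, and evaluate the noise penalty with the standard radiometer equation dT = Tsys / sqrt(B*tau). The threshold test and all numbers are illustrative assumptions, not the flight DSP algorithms.

      # Sketch: flag RFI-contaminated sub-channels with a robust outlier test, keep the
      # clean bandwidth, and evaluate the noise cost via dT = Tsys / sqrt(B * tau).
      import numpy as np

      rng = np.random.default_rng(0)
      n_chan, chan_bw = 512, 1.0e6                 # 512 sub-channels of 1 MHz each
      spectrum = rng.normal(300.0, 3.0, n_chan)    # natural emission, ~300 K per channel
      spectrum[[40, 41, 200]] += [80.0, 45.0, 150.0]   # injected RFI spikes

      # Flag channels more than 5 robust standard deviations above the median.
      median = np.median(spectrum)
      mad = np.median(np.abs(spectrum - median))
      clean = spectrum < median + 5 * 1.4826 * mad

      t_sys, tau = 500.0, 0.1                      # system temperature [K], integration [s]
      for label, bandwidth in [("full band", n_chan * chan_bw),
                               ("RFI-free band", clean.sum() * chan_bw)]:
          print(f"{label:13s}: B = {bandwidth/1e6:5.0f} MHz, "
                f"dT = {t_sys/np.sqrt(bandwidth*tau)*1e3:.1f} mK")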

  12. Modelling of Command and Control Agility

    DTIC Science & Technology

    2014-06-01

    Briefing fragments: experiments to explore different scenarios and identify possible counterintuitive effects (Bar-Yam 2003, Sheard & Mostashari 2009, Hitchins 2008, Alberts ...). References (fragment): Systems, Man, and Cybernetics - Part C: Applications and Reviews, Vol. 28, No. 4: 516-527; Bar-Yam, Y., 2003. When Systems Engineering Fails

  13. Between Oais and Agile a Dynamic Data Management Approach

    NASA Astrophysics Data System (ADS)

    Bennett, V. L.; Conway, E. A.; Waterfall, A. M.; Pepler, S.

    2015-12-01

    In this paper we describe an approach to the integration of existing archival activities which lies between compliance with the more rigid OAIS/TRAC standards and a more flexible "Agile" approach to the curation and preservation of Earth Observation data. We provide a high level overview of existing practice and discuss how these procedures can be extended and supported through the description of preservation state, the aim of which is to facilitate the dynamic, controlled management of scientific data through its lifecycle. While processes are considered, they are not statically defined but rather driven by human interactions in the form of risk management/review procedures that produce actionable plans, which are responsive to change. We then proceed by describing the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected British Atmospheric Data Centre and NERC Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 Petabytes of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the format-based risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We continue by presenting a dynamic data management information model and provide a discussion of the following core model entities and their relationships: Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives; Risk entities, which act as drivers for change within the data lifecycle, including Acquisitional Risks, Technical Risks, Strategic Risks and External Risks; and Plan entities, which detail the actions to bring about change within an archive, including Acquisition Plans, Preservation Plans and Monitoring Plans, which support
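
    The entity types listed above can be summarised in a small data-model sketch; the class names, fields, and the example archive below are illustrative, not the CEDA schema.

      # Hedged sketch of the core entities and relationships (illustrative field names).
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class PreservationObjective:
          description: str

      @dataclass
      class DataEntity:                       # aspirational entity
          name: str
          objectives: List[PreservationObjective] = field(default_factory=list)

      @dataclass
      class Risk:                             # driver for change: acquisitional, technical,
          category: str                       # strategic, or external
          description: str
          affects: DataEntity = None

      @dataclass
      class Plan:                             # actionable response: acquisition, preservation,
          kind: str                           # or monitoring plan
          mitigates: List[Risk] = field(default_factory=list)

      dataset = DataEntity("example EO archive",
                           [PreservationObjective("keep data readable for 20 years")])
      risk = Risk("Technical", "proprietary binary format nearing obsolescence", dataset)
      plan = Plan("Preservation", [risk])
      print(plan.kind, "plan covers", len(plan.mitigates), "risk(s) on", risk.affects.name)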

  14. Delaying Mobility Disability in People With Parkinson Disease Using a Sensorimotor Agility Exercise Program

    PubMed Central

    King, Laurie A; Horak, Fay B

    2009-01-01

    This article introduces a new framework for therapists to develop an exercise program to delay mobility disability in people with Parkinson disease (PD). Mobility, or the ability to efficiently navigate and function in a variety of environments, requires balance, agility, and flexibility, all of which are affected by PD. This article summarizes recent research identifying how constraints on mobility specific to PD, such as rigidity, bradykinesia, freezing, poor sensory integration, inflexible program selection, and impaired cognitive processing, limit mobility in people with PD. Based on these constraints, a conceptual framework for exercises to maintain and improve mobility is presented. An example of a constraint-focused agility exercise program, incorporating movement principles from tai chi, kayaking, boxing, lunges, agility training, and Pilates exercises, is presented. This new constraint-focused agility exercise program is based on a strong scientific framework and includes progressive levels of sensorimotor, resistance, and coordination challenges that can be customized for each patient while maintaining fidelity. Principles for improving mobility presented here can be incorporated into an ongoing or long-term exercise program for people with PD. PMID:19228832

  15. Delaying mobility disability in people with Parkinson disease using a sensorimotor agility exercise program.

    PubMed

    King, Laurie A; Horak, Fay B

    2009-04-01

    This article introduces a new framework for therapists to develop an exercise program to delay mobility disability in people with Parkinson disease (PD). Mobility, or the ability to efficiently navigate and function in a variety of environments, requires balance, agility, and flexibility, all of which are affected by PD. This article summarizes recent research identifying how constraints on mobility specific to PD, such as rigidity, bradykinesia, freezing, poor sensory integration, inflexible program selection, and impaired cognitive processing, limit mobility in people with PD. Based on these constraints, a conceptual framework for exercises to maintain and improve mobility is presented. An example of a constraint-focused agility exercise program, incorporating movement principles from tai chi, kayaking, boxing, lunges, agility training, and Pilates exercises, is presented. This new constraint-focused agility exercise program is based on a strong scientific framework and includes progressive levels of sensorimotor, resistance, and coordination challenges that can be customized for each patient while maintaining fidelity. Principles for improving mobility presented here can be incorporated into an ongoing or long-term exercise program for people with PD.

  16. Agile informatics: application of agile project management to the development of a personal health application.

    PubMed

    Chung, Jeanhee; Pankey, Evan; Norris, Ryan J

    2007-10-11

    We describe the application of the Agile method -- a short iteration cycle, user-responsive, measurable software development approach -- to the project management of a modular personal health record, iHealthSpace, to be deployed to the patients and providers of a large academic primary care practice.

  17. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  18. Frequency/phase agile microwave circuits on ferroelectric films

    NASA Astrophysics Data System (ADS)

    Romanofsky, Robert Raymond

    This work describes novel microwave circuits that can be tuned in either frequency or phase through the use of nonlinear dielectrics, specifically thin ferroelectric films. These frequency and phase agile circuits in many cases provide a new capability or offer the potential for lower cost alternatives in satellite and terrestrial communications and sensor applications. A brief introduction to nonlinear dielectrics and a summary of some of the special challenges confronting the practical insertion of ferroelectric technology into commercial systems is provided. A theoretical solution for the propagation characteristics of the multi-layer structures, with emphasis on a new type of phase shifter based on coupled microstrip lines, is developed. The quasi-TEM analysis is based on a variational solution for line capacitance and an extension of coupled transmission line theory. It is shown that the theoretical model is applicable to a broad class of multi-layer transmission lines. The critical role that ferroelectric film thickness plays in loss and phase shift is closely examined. Experimental data for both thin film BaxSr1-xTiO3 phase shifters near room temperature and SMO3 phase shifters at cryogenic temperatures on MgO and LaAlO3 substrates is included. Some of these devices demonstrated an insertion loss of less than 5 dB at Ku-band with continuously variable phase shift in excess of 360 degrees. The performance of these devices is superior to that of state-of-the-art semiconductor counterparts. Frequency and phase agile antenna prototypes, including a microstrip patch that can operate at multiple microwave frequency bands and a new type of phased array antenna concept called the ferroelectric reflectarray, are introduced. Modeled data for tunable microstrip patch antennas is presented for various ferroelectric film thicknesses. A prototype linear phased array, with a conventional beam-forming manifold and an electronic controller, is described. This is the first

  19. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  20. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice, the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  1. Cupola Furnace Computer Process Model

    SciTech Connect

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near optimum operating conditions with just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "Expert System" to permit optimization in real time. The program has been combined with "neural network" programs to effect very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the "Cupola Handbook", Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  2. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and procedures for developing software. This paper discusses some of the initial uses of the Agile Development methodology, the spread of this method, and the current status of its successful incorporation into current JPL development policies.

  3. Agile manufacturing in Intelligence, Surveillance and Reconnaissance (ISR)

    NASA Astrophysics Data System (ADS)

    DiPadua, Mark; Dalton, George

    2016-05-01

    The objective of the Agile Manufacturing for Intelligence, Surveillance, and Reconnaissance (AMISR) effort is to research, develop, design and build a prototype multi-intelligence (multi-INT), reconfigurable pod demonstrating benefits of agile manufacturing and a modular open systems approach (MOSA) to make podded intelligence, surveillance, and reconnaissance (ISR) capability more affordable and operationally flexible.

  4. Integrated product definition representation for agile numerical control applications

    SciTech Connect

    Simons, W.R. Jr.; Brooks, S.L.; Kirk, W.J. III; Brown, C.W.

    1994-11-01

    Realization of agile manufacturing capabilities for a virtual enterprise requires the integration of technology, management, and work force into a coordinated, interdependent system. This paper is focused on technology enabling tools for agile manufacturing within a virtual enterprise specifically relating to Numerical Control (N/C) manufacturing activities and product definition requirements for these activities.

  5. Process modeling and industrial energy use

    SciTech Connect

    Howe, S O; Pilati, D A; Sparrow, F T

    1980-11-01

    How the process models developed at BNL are used to analyze industrial energy use is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Case study results from the pulp and paper model illustrate how process models can be used to analyze a variety of issues. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for energy end-use modeling and conservation analysis. Information on the current status of industry models at BNL is tabulated.

  6. Agile Machining and Inspection Non-Nuclear Report (NNR) Project

    SciTech Connect

    Lazarus, Lloyd

    2009-02-19

    This report is a high level summary of the eight major projects funded by the Agile Machining and Inspection Non-Nuclear Readiness (NNR) project (FY06.0422.3.04.R1). The largest project of the group is the Rapid Response project, in which the six major sub-categories are summarized. This project focused on the operations of the machining departments that will comprise Special Applications Machining (SAM) in the Kansas City Responsive Infrastructure Manufacturing & Sourcing (KCRIMS) project. This project was aimed at upgrading older machine tools, developing new inspection tools, eliminating Classified Removable Electronic Media (CREM) in the handling of classified Numerical Control (NC) programs by installing the CRONOS network, and developing methods to automatically load Coordinate Measuring Machine (CMM) inspection data into bomb books and product score cards. Finally, the project personnel leaned the operations of some of the machine tool cells, and now have the model to continue this activity.

  7. A Case Study of Coordination in Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Hole, Steinar; Moe, Nils Brede

    Global Software Development (GSD) has gained significant popularity as an emerging paradigm. Companies also show interest in applying agile approaches in distributed development to combine the advantages of both approaches. However, in their most radical forms, agile and GSD can be placed at opposite ends of a plan-based/agile spectrum because of how work is coordinated. We describe how three GSD projects applying agile methods coordinate their work. We found that trust is needed to reduce the need for standardization and direct supervision when coordinating work in a GSD project, and that electronic chatting supports mutual adjustment. Further, co-location and modularization mitigate communication problems, enable agility in at least part of a GSD project, and render the implementation of Scrum of Scrums possible.

  8. Analysis of VLF signals associated to AGILE Terrestrial Gamma-ray Flashes detected over Central America

    NASA Astrophysics Data System (ADS)

    Marisaldi, Martino; Lyu, Fanchao; Cummer, Steven; Ursi, Alessandro

    2016-04-01

    Analysis of radio signals detected on the ground and associated with Terrestrial Gamma-ray Flashes (TGFs) has proven to be a successful tool to extract information on the TGF itself and the possible associated lightning process. Triangulation of Very Low Frequency (VLF) signals by means of the Time Of Arrival technique provides TGF locations with a few km accuracy. The AGILE satellite is routinely observing TGFs in a narrow band across the Equator, limited by the small satellite orbital inclination (2.5°). However, until recently it was not possible to provide firm associations between AGILE TGFs and radio signals, because of two main limiting factors. First, dead-time effects led to a bias towards long duration events in the AGILE TGF sample, which are less likely associated with strong radio pulses. In addition, most VLF detection networks are less sensitive along the equatorial region. Since the end of March 2015 a major change in the AGILE MiniCalorimeter instrument configuration has resulted in a tenfold increase in TGF detection rate, and in the detection of events as short as 20 microseconds. 14% of the events in the new sample were simultaneous (within 200 microseconds) with sferics detected by the World Wide Lightning Location Network (WWLLN), and therefore a source localisation is available for these events. We present here the first analysis of VLF waveforms associated with AGILE TGFs observed above Central America, detected by magnetic field sensors deployed in Puerto Rico. Among the seven TGFs with a WWLLN location at a distance of less than 10000 km from the sensors, four have detectable signals. These events are the closest to the sensors, with distances of less than 7500 km. We present here the properties of these TGFs and the characteristics of the associated radio waveforms.
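
    The Time Of Arrival principle mentioned above can be illustrated with a small least-squares sketch on a flat-Earth approximation, with synthetic sensor positions and timing noise; it is not the WWLLN processing chain.

      # Minimal Time-Of-Arrival localization sketch: solve for the source location and
      # emission time that best match the measured arrival times (illustrative only).
      import numpy as np
      from scipy.optimize import least_squares

      C = 3.0e5                                   # propagation speed [km/s] (~speed of light)
      sensors = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 900.0], [700.0, 750.0]])  # km
      true_src, t0_true = np.array([450.0, 300.0]), 0.010                           # km, s

      arrival_times = t0_true + np.linalg.norm(sensors - true_src, axis=1) / C
      arrival_times += np.random.default_rng(1).normal(0, 2e-6, len(sensors))       # 2 us jitter

      def residuals(p):
          x, y, t0 = p
          predicted = t0 + np.linalg.norm(sensors - np.array([x, y]), axis=1) / C
          return predicted - arrival_times

      fit = least_squares(residuals, x0=[100.0, 100.0, 0.0])
      print("estimated source (km):", fit.x[:2].round(1), " true:", true_src)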

  9. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the process of software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in these process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
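
    As a minimal illustration of deriving process-element correlations from assessment results, the sketch below computes a correlation matrix over hypothetical capability ratings for a few CMMI process areas; the data and threshold are assumptions, and the paper's model additionally draws on empirical improvement data.

      # Minimal sketch: correlate process-area ratings across projects (hypothetical data).
      import numpy as np

      practices = ["REQM", "PP", "PMC", "CM"]          # example CMMI process areas
      # Rows: process areas, columns: assessed capability ratings per project (1-5 scale).
      ratings = np.array([
          [2, 3, 3, 4, 2, 3],    # REQM
          [2, 3, 4, 4, 2, 3],    # PP
          [1, 2, 3, 3, 2, 2],    # PMC
          [3, 3, 2, 4, 3, 4],    # CM
      ], dtype=float)

      corr = np.corrcoef(ratings)
      for i in range(len(practices)):
          for j in range(i + 1, len(practices)):
              if corr[i, j] > 0.7:                      # simple threshold on correlation
                  print(f"{practices[i]} <-> {practices[j]}: r = {corr[i, j]:.2f}")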

  10. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  11. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
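
    As a concrete example of a prototypical geographic physical process written as partial difference equations (the discrete form the language builds on), the sketch below advances a 2-D diffusion process on a grid; the grid, parameters, and boundary treatment are hypothetical.

      # 2-D diffusion written as partial difference equations (illustrative grid/parameters).
      import numpy as np

      nx, ny, dx, dt, kappa = 50, 50, 1.0, 0.2, 0.5      # grid, spacing, time step, diffusivity
      field = np.zeros((nx, ny))
      field[25, 25] = 100.0                              # point release in the middle

      for _ in range(200):
          # u[i,j]^{t+1} = u[i,j]^t + kappa*dt/dx^2 * (u[i+1,j]+u[i-1,j]+u[i,j+1]+u[i,j-1]-4*u[i,j])
          lap = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                 np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4.0 * field)
          field = field + kappa * dt / dx**2 * lap       # periodic boundaries via np.roll

      print("total mass conserved:", round(field.sum(), 3), " peak:", round(field.max(), 3))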

  12. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  13. Reflections on Software Agility and Agile Methods: Challenges, Dilemmas, & the Way Ahead

    DTIC Science & Technology

    2005-05-11

    Research team: Richard Baskerville and Balasubramanian Ramesh, Department of Computer Information Systems, Georgia State University. References include: Agile Manifesto, http://agilemanifesto.org/; Baskerville, R., Pries-Heje, J., Levine, L., & Ramesh, B. (2005). The high speed balancing game: How software companies cope with Internet speed. Scandinavian Journal of Information Systems, 16, 11-54; Baskerville, R., Levine, L., ...

  14. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
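
    A minimal sketch of a first-order birth/death chain of the sort the tool models is shown below; the rates, population size, and discrete-time formulation are assumptions for illustration, not the original program's implementation.

```python
# Illustrative sketch (assumed parameters, not the NASA tool): simulating a
# simple first-order birth/death Markov process for a population.
import random

def simulate(n0, birth_rate, death_rate, steps, seed=1):
    """Discrete-time birth/death chain: each step, every individual
    independently gives birth with probability birth_rate and dies with
    probability death_rate."""
    random.seed(seed)
    n = n0
    history = [n]
    for _ in range(steps):
        births = sum(random.random() < birth_rate for _ in range(n))
        deaths = sum(random.random() < death_rate for _ in range(n))
        n = max(n + births - deaths, 0)
        history.append(n)
    return history

print(simulate(n0=20, birth_rate=0.10, death_rate=0.08, steps=50)[-5:])
```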

  15. Mathematical Modeling: A Structured Process

    ERIC Educational Resources Information Center

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  16. Ramping up for agility: Development of a concurrent engineering communications infrastructure

    SciTech Connect

    Forsythe, C.; Ashby, M.R.

    1995-09-01

    A-PRIMED (Agile Product Realization for Innovative Electro MEchanical Devices) demonstrated new product development in 24 days, accompanied by improved product quality, through agility-enabling technologies. A concurrent engineering communications infrastructure was developed that provided electronic data communications, information access, enterprise integration of computers and applications, and collaborative work tools. This paper describes how A-PRIMED accomplished this through attention to technologies, processes, and people.

  17. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modelling techniques. This article reports research on the differences among business process modelling techniques. For each technique, the definition and structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented in Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The conclusion recommends business process modelling techniques that are easy to use and serves as a basis for evaluating further modelling techniques.

  18. Compact, flexible, frequency agile parametric wavelength converter

    DOEpatents

    Velsko, Stephan P.; Yang, Steven T.

    2002-01-01

    This improved Frequency Agile Optical Parametric Oscillator provides near on-axis pumping of a single QPMC with a tilted periodically poled grating to overcome the necessity to find a particular crystal that will permit collinear birefringence in order to obtain a desired tuning range. A tilted grating design and the elongation of the transverse profile of the pump beam in the angle tuning plane of the FA-OPO reduces the rate of change of the overlap between the pumped volume in the crystal and the resonated and non-resonated wave mode volumes as the pump beam angle is changed. A folded mirror set relays the pivot point for beam steering from a beam deflector to the center of the FA-OPO crystal. This reduces the footprint of the device by as much as a factor of two over that obtained when using the refractive telescope design.

  19. Agile: From Software to Mission System

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Shirley, Mark H.; Hobart, Sarah Groves

    2016-01-01

    The Resource Prospector (RP) is an in-situ resource utilization (ISRU) technology demonstration mission, designed to search for volatiles at the Lunar South Pole. This is NASA's first near real time tele-operated rover on the Moon. The primary objective is to search for volatiles at one of the Lunar Poles. The combination of short mission duration, a solar powered rover, and the requirement to explore shadowed regions makes for an operationally challenging mission. To maximize efficiency and flexibility in Mission System design and thus to improve the performance and reliability of the resulting Mission System, we are tailoring Agile principles that we have used effectively in ground data system software development and applying those principles to the design of elements of the mission operations system.

  20. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    NASA Technical Reports Server (NTRS)

    Bolton, Albanie T.

    2016-01-01

    NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth orbit (BEO). The world's most powerful rocket, SLS, will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA, using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software division provides comprehensive engineering expertise for development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile test approach in testing and verification of software. The agile software test method opens the door for regular short sprint release cycles. The basic premise of agile software development and testing is that work is iterative and incremental: requirements and solutions evolve through collaboration between cross-functional teams. Doing testing and development incrementally allows for increased features and enhanced value in each release. This value can be seen throughout the T&V team processes, which are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage rather than at a later release, and team members gain increased knowledge of the system architecture by interfacing with designers. SLS Flight Software teams want to continue uncovering better ways of developing software in an efficient manner that benefits the project.

  1. Using formal methods to scope performance challenges for Smart Manufacturing Systems: focus on agility

    PubMed Central

    Jung, Kiwook; Morris, KC; Lyons, Kevin W.; Leong, Swee; Cho, Hyunbo

    2016-01-01

    Smart Manufacturing Systems (SMS) need to be agile to adapt to new situations by using detailed, precise, and appropriate data for intelligent decision-making. The intricacy of the relationship of strategic goals to operational performance across the many levels of a manufacturing system inhibits the realization of SMS. This paper proposes a method for identifying what aspects of a manufacturing system should be addressed to respond to changing strategic goals. The method uses standard modeling techniques in specifying a manufacturing system and the relationship between strategic goals and operational performance metrics. Two existing reference models related to manufacturing operations are represented formally and harmonized to support the proposed method. The method is illustrated for a single scenario using agility as a strategic goal. By replicating the proposed method for other strategic goals and with multiple scenarios, a comprehensive set of performance challenges can be identified. PMID:27141209

  2. Using formal methods to scope performance challenges for Smart Manufacturing Systems: focus on agility.

    PubMed

    Jung, Kiwook; Morris, K C; Lyons, Kevin W; Leong, Swee; Cho, Hyunbo

    2015-12-01

    Smart Manufacturing Systems (SMS) need to be agile to adapt to new situations by using detailed, precise, and appropriate data for intelligent decision-making. The intricacy of the relationship of strategic goals to operational performance across the many levels of a manufacturing system inhibits the realization of SMS. This paper proposes a method for identifying what aspects of a manufacturing system should be addressed to respond to changing strategic goals. The method uses standard modeling techniques in specifying a manufacturing system and the relationship between strategic goals and operational performance metrics. Two existing reference models related to manufacturing operations are represented formally and harmonized to support the proposed method. The method is illustrated for a single scenario using agility as a strategic goal. By replicating the proposed method for other strategic goals and with multiple scenarios, a comprehensive set of performance challenges can be identified.

  3. Agile rediscovering values: Similarities to continuous improvement strategies

    NASA Astrophysics Data System (ADS)

    Díaz de Mera, P.; Arenas, J. M.; González, C.

    2012-04-01

    Research in the late 1980s on technology companies that develop high-value, innovative products with sufficient speed and flexibility to adapt quickly to changing market conditions gave rise to the set of methodologies known as the Agile Management Approach. In the current changing economic scenario, we consider it worthwhile to study the similarities between these Agile methodologies and other practices whose effectiveness has been amply demonstrated in both the West and Japan. Strategies such as Kaizen, Lean, World Class Manufacturing, and Concurrent Engineering are analyzed to identify the values they have in common with the Agile approach.

  4. The Computer-Aided Analytic Process Model. Operations Handbook for the Analytic Process Model Demonstration Package

    DTIC Science & Technology

    1986-01-01

    Research Note 86-06: The Computer-Aided Analytic Process Model: Operations Handbook for the Analytic Process Model Demonstration Package, Ronald G. ... Keywords: Analytic Process Model; Operations Handbook; Tutorial; Apple; Systems Taxonomy Model; Training System; Bradley Infantry Fighting Vehicle (BIFV). The abstract continues in a companion volume, "The Analytic Process Model for ...".

  5. A Hierarchical Process-Dissociation Model

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Lu, Jun; Morey, Richard D.; Sun, Dongchu; Speckman, Paul L.

    2008-01-01

    In fitting the process-dissociation model (L. L. Jacoby, 1991) to observed data, researchers aggregate outcomes across participant, items, or both. T. Curran and D. L. Hintzman (1995) demonstrated how biases from aggregation may lead to artifactual support for the model. The authors develop a hierarchical process-dissociation model that does not…
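
    For context, the non-hierarchical process-dissociation equations that the hierarchical model builds on can be inverted directly from aggregate inclusion/exclusion rates, as in the sketch below; the response proportions are hypothetical.

```python
# Illustrative sketch of the standard (non-hierarchical) process-dissociation
# equations the paper builds on; parameter names follow Jacoby (1991):
#   P(inclusion) = R + A*(1 - R)
#   P(exclusion) = A*(1 - R)
def process_dissociation(p_inclusion, p_exclusion):
    recollection = p_inclusion - p_exclusion
    automaticity = p_exclusion / (1 - recollection) if recollection < 1 else float("nan")
    return recollection, automaticity

# Hypothetical aggregate response proportions.
R, A = process_dissociation(p_inclusion=0.80, p_exclusion=0.30)
print(f"recollection R = {R:.2f}, automaticity A = {A:.2f}")  # R = 0.50, A = 0.60
```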

  6. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    This work applies computational process and material modeling to powder bed additive manufacturing of IN 718. The goals are to optimize material build parameters with reduced time and cost through modeling, increase understanding of build properties, increase the reliability of builds, decrease the time to adoption of the process for critical hardware, and potentially decrease post-build heat treatments. The approach is to conduct single-track and coupon builds at various build parameters; record build parameter information and QM Meltpool data; refine the Applied Optimization powder bed AM process model using these data; report thermal modeling results; conduct metallography of build samples; calibrate STK models using the metallography findings; run the STK models using AO thermal profiles and report the STK modeling results; and validate the modeling with an additional build. Initial findings: photodiode intensity measurements are highly linear with power input; melt pool intensity is highly correlated with melt pool size; and melt pool size and intensity increase with power. Applied Optimization will use the data to develop a powder bed additive manufacturing process model.

  7. GRB 070724B: the first Gamma Ray Burst localized by SuperAGILE

    SciTech Connect

    Del Monte, E.; Costa, E.; Donnarumma, I.; Feroci, M.; Lapshov, I.; Lazzarotto, F.; Soffitta, P.; Argan, A.; Pucella, G.; Trois, A.; Vittorini, V.; Evangelista, Y.; Rapisarda, M.; Barbiellini, G.; Longo, F.; Basset, M.; Foggetta, L.; Vallazza, E.; Bulgarelli, A.; Di Cocco, G.

    2008-05-22

    GRB070724B is the first Gamma Ray Burst localized by the SuperAGILE instrument aboard the AGILE space mission. The SuperAGILE localization was confirmed by the afterglow observation by the XRT aboard the Swift satellite. No significant gamma ray emission above 50 MeV has been detected for this GRB. In this paper we describe the SuperAGILE capabilities in detecting Gamma Ray Bursts and the AGILE observation of GRB 070724B.

  8. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri Net and an object-oriented, top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
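
    The flavor of such a simulation can be conveyed with a much simpler Monte Carlo sketch (not the paper's timed Petri net model): time-to-market is estimated by sampling activity durations while varying one process parameter. The distributions and rework penalty are assumptions.

```python
# Minimal Monte Carlo sketch: estimating time-to-market and its sensitivity
# to a process parameter (probability of rework) by simulation.
import random

def project_duration(n_modules, rework_prob, seed):
    random.seed(seed)
    total = 0.0
    for _ in range(n_modules):
        effort = random.lognormvariate(2.0, 0.4)      # design+code, assumed distribution
        if random.random() < rework_prob:             # defect found in test
            effort *= 1.5                             # rework penalty (assumed)
        total += effort
    return total

for rework_prob in (0.1, 0.3, 0.5):
    runs = [project_duration(40, rework_prob, seed) for seed in range(200)]
    print(f"rework_prob={rework_prob}: mean duration = {sum(runs)/len(runs):.1f}")
```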

  9. Ada COCOMO and the Ada Process Model

    DTIC Science & Technology

    1989-01-01

    ... language, the use of incremental development, and the use of the Ada process model, capitalizing on the strengths of Ada to improve the efficiency of software development. This paper presents the portions of the revised Ada COCOMO dealing with the effects of Ada and the Ada process model. The remainder of this section of the paper discusses the objectives of Ada COCOMO. Section 2 describes the Ada Process Model and its overall effects on software ...
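
    For reference, the basic COCOMO effort equation that Ada COCOMO revises has the form Effort = a * KSLOC^b. The sketch below uses Boehm's 1981 basic-COCOMO coefficients, not the Ada-specific revisions discussed in the report.

```python
# Hedged sketch of the basic COCOMO form (coefficients are Boehm's 1981
# basic-COCOMO values, not the Ada COCOMO revision):
#   Effort [person-months] = a * KSLOC ** b
COEFFS = {"organic": (2.4, 1.05), "semidetached": (3.0, 1.12), "embedded": (3.6, 1.20)}

def cocomo_effort(ksloc, mode="embedded"):
    a, b = COEFFS[mode]
    return a * ksloc ** b

print(f"{cocomo_effort(100):.0f} person-months for a 100 KSLOC embedded project")
```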

  10. Models for Turbulent Transport Processes.

    ERIC Educational Resources Information Center

    Hill, James C.

    1979-01-01

    Since the statistical theories of turbulence that have developed over the last twenty or thirty years are too abstract and unreliable to be of much use to chemical engineers, this paper introduces the techniques of single point models and suggests some areas of needed research. (BB)

  11. Total Ship Design Process Modeling

    DTIC Science & Technology

    2012-04-30

    ... Microsoft Project® or Primavera®, and perform process simulations that can investigate risk, cost, and schedule trade-offs. Prior efforts to capture ... planning in the face of disruption, delay, and late-changing requirements. ADePT is interfaced with Primavera, the AEC industry's favorite program ...

  12. Agile text mining for the 2014 i2b2/UTHealth Cardiac risk factors challenge.

    PubMed

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2015-12-01

    This paper describes the use of an agile text mining platform (Linguamatics' Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system.
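
    A toy sketch of the general approach (rule-based extraction plus a downstream classifier) is shown below; the regular expressions, labels, and example note are invented for illustration and have no connection to the I2E platform or the i2b2 gold standard.

```python
# Illustrative sketch only: simple rule-based extraction of document-level
# risk-factor flags, in the spirit of the approach described above.
import re

RULES = {
    "diabetes": re.compile(r"\b(diabetes|diabetic|dm type ?2)\b", re.I),
    "hypertension": re.compile(r"\b(hypertension|htn|high blood pressure)\b", re.I),
    "smoker": re.compile(r"\b(smok(er|ing)|tobacco use)\b", re.I),
}

def extract_risk_factors(note: str) -> set:
    """Rule-based pass: return the document-level risk factors mentioned."""
    return {label for label, pattern in RULES.items() if pattern.search(note)}

note = "Patient with HTN and DM type 2; denies tobacco use currently."
print(extract_risk_factors(note))
# A supervised classifier trained on gold-standard annotations would then be
# layered on top to resolve harder cases, e.g. the negated "denies tobacco use"
# that the naive rules above flag incorrectly.
```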

  13. Agile Text Mining for the 2014 i2b2/UTHealth Cardiac Risk Factors Challenge

    PubMed Central

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2016-01-01

    This paper describes the use of an agile text mining platform (Linguamatics’ Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 Challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system. PMID:26209007

  14. Value Creation by Agile Projects: Methodology or Mystery?

    NASA Astrophysics Data System (ADS)

    Racheva, Zornitza; Daneva, Maya; Sikkel, Klaas

    Business value is a key concept in agile software development approaches. This paper presents results of a systematic review of literature on how business value is created by agile projects. We found that, with very few exceptions, most published studies take the concept of business value for granted and do not state what it means, either in general or in the specific study context. We could find no study which clearly indicates how exactly individual agile practices, or groups of practices, create value and keep accumulating it over time. The key implication for research is that we have an incentive to pursue the study of value creation in agile projects by deploying empirical research methods.

  15. Laterality and performance of agility-trained dogs.

    PubMed

    Siniscalchi, Marcello; Bertino, Daniele; Quaranta, Angelo

    2014-01-01

    Correlations between lateralised behaviour and performance were investigated in 19 agility-trained dogs (Canis familiaris) by scoring paw preference to hold a food object and relating it to performance during typical agility obstacles (jump/A-frame and weave poles). In addition, because recent behavioural studies reported that visual stimuli of emotional valence presented to one visual hemifield at a time affect visually guided motor responses in dogs, the possibility that the position of the owner respectively in the left and in the right canine visual hemifield might be associated with quality of performance during agility was considered. Dogs' temperament was also measured by an owner-rated questionnaire. The most relevant finding was that agility-trained dogs displayed longer latencies to complete the obstacles with the owner located in their left visual hemifield compared to the right. Interestingly, the results showed that this phenomenon was significantly linked to both dogs' trainability and the strength of paw preference.

  16. Frequency agile OPO-based transmitters for multiwavelength DIAL

    SciTech Connect

    Velsko, S.P.; Ruggiero, A.; Herman, M.

    1996-09-01

    We describe a first generation mid-infrared transmitter with pulse-to-pulse frequency agility and both wide and narrow band capability. This transmitter was used to make multicomponent DIAL measurements in the field.

  17. Investigation into the impact of agility on conceptual fighter design

    NASA Technical Reports Server (NTRS)

    Engelbeck, R. M.

    1995-01-01

    The Agility Design Study was performed by the Boeing Defense and Space Group for the NASA Langley Research Center. The objective of the study was to assess the impact of agility requirements on new fighter configurations. Global trade issues investigated were the level of agility, the mission role of the aircraft (air-to-ground, multi-role, or air-to-air), and whether the customer is Air Force, Navy, or joint service. Mission profiles and design objectives were supplied by NASA. An extensive technology assessment was conducted to establish the technologies available to industry for the aircraft. Conceptual level methodology is presented to assess the five NASA-supplied agility metrics. Twelve configurations were developed to address the global trade issues. Three-view drawings, inboard profiles, and performance estimates were made and are included in the report. A critical assessment and lessons learned from the study are also presented.

  18. Frequency agile OPO-based transmitters for multiwavelength DIAL

    SciTech Connect

    Velsko, S.P.; Ruggiero, A.; Herman, M.

    1996-09-01

    We describe a first generation mid-infrared transmitter with pulse to pulse frequency agility and both wide and narrow band capability. This transmitter was used to make multicomponent Differential Absorption LIDAR (DIAL) measurements in the field.

  19. Modern Enterprise Systems as Enablers of Agile Development

    NASA Astrophysics Data System (ADS)

    Fredriksson, Odd; Ljung, Lennart

    Traditional ES technology and traditional project management methods support and match each other, but they do not effectively support the critical success conditions for ES development. Although the findings from one case study of a successful modern ES change project are not strong empirical evidence, we carefully propose that the new modern ES technology supports and matches agile project management methods. In other words, it provides the required flexibility which makes it possible to put the agile way of running projects into practice, both for the system supplier and for the customer. In addition, we propose that the combination of modern ES technology and agile project management methods is more appropriate for supporting the realization of critical success conditions for ES development. The main purpose of this chapter is to compare critical success conditions for modern enterprise systems development projects with critical success conditions for agile information systems development projects.

  20. AGILE: Technologies and Electronics for gamma-ray and GRB detection

    NASA Astrophysics Data System (ADS)

    Bonati, A.; Monzani, F.; Poulsen, J. M.; Azzano, M.; Nicolini, L.; Massa, P.; Tavani, M.; Feroci, M.; Barbiellini, G.; Prest, M.; Argan, A.; Perotti, F.; Froysland, T.; Labanti, C.

    2004-06-01

    Following the success of the Beppo-SAX mission, a new family of single payloads for GRB observations is in the development phase. These payloads are characterized by a combination of instruments that provide a wide field of view and accurate pointing capability, as well as on-board triggers and source position identification. AGILE is a scientific space mission dedicated to gamma-ray astrophysics. The AGILE payload is based on the instrument concept outlined above, and it combines gamma-ray imaging detectors and an X-ray imaging detector. Fast triggers and short detector dead times allow detection of GRB pulses on time scales from 1 millisecond to several tens of seconds. The AGILE Small Mission is funded by the Italian Space Agency (ASI), and the instruments are developed in collaboration among Italian research institutes and Italian space industry. Laben S.p.A. (a FINMECCANICA company) designs and develops one of the detectors and most of the payload electronics. This paper gives an overview of the implementation features of some detectors and on-board processing electronics with a view to burst detection and processing.

  1. Insights into Global Health Practice from the Agile Software Development Movement

    PubMed Central

    Flood, David; Chary, Anita; Austad, Kirsten; Diaz, Anne Kraemer; García, Pablo; Martinez, Boris; Canú, Waleska López; Rohloff, Peter

    2016-01-01

    Global health practitioners may feel frustration that current models of global health research, delivery, and implementation are overly focused on specific interventions, slow to provide health services in the field, and relatively ill-equipped to adapt to local contexts. Adapting design principles from the agile software development movement, we propose an analogous approach to designing global health programs that emphasizes tight integration between research and implementation, early involvement of ground-level health workers and program beneficiaries, and rapid cycles of iterative program improvement. Using examples from our own fieldwork, we illustrate the potential of ‘agile global health’ and reflect on the limitations, trade-offs, and implications of this approach. PMID:27134081

  2. Agile Software Development in Defense Acquisition: A Mission Assurance Perspective

    DTIC Science & Technology

    2012-03-23

    AEROSPACE REPORT NO. ATR-2012(9010)-2: Agile Software Development in Defense Acquisition - A Mission Assurance Perspective, March 23, 2012, Peter ... Engineering and Technology Group. Approved for public release; distribution unlimited.

  3. Spectroscopic Investigation of Materials for Frequency Agile Laser Systems.

    DTIC Science & Technology

    1985-01-01

    The fluorescence spectra and lifetimes of divalent Rh, Ru, Pt, and Ir ions in alkali halide crystals are measured using pulsed nitrogen laser excitation. Spectroscopic Investigation of Materials for Frequency Agile Laser Systems; Richard C. Powell, Ph.D., Principal Investigator, Department of Physics, Oklahoma State University, Stillwater.

  4. Addressing the Barriers to Agile Development in DoD

    DTIC Science & Technology

    2015-05-01

    ... IT acquisition programs are subject to extensive documentation, reviews, and oversight that inhibit the speed and agility needed for IT major ... Structuring an agile program is based on program, operational, and technical risk. A notional structure is a 6-month release with 4-week sprints: continual development, integration, and testing; monthly demonstration of capabilities to users; and government testers, certifiers, and users involved early and often, which minimizes ...

  5. Architectural Tactics to Support Rapid and Agile Stability

    DTIC Science & Technology

    2012-05-01

    From CrossTalk, May/June 2012, "Rapid and Agile Stability": Scrum teams, product development teams, component teams, or feature teams spend almost ... and individuals in the roles of Scrum master, developer, project manager, and architect on projects from organizations that develop embedded real ... agile stability. Using Scrum, 25 teams participated in the development effort. Some of the teams were colocated ...

  6. Identification, Characterization, and Evaluation Criteria for Systems Engineering Agile Enablers

    DTIC Science & Technology

    2015-01-16

    Identification, Characterization, and Evaluation Criteria for Systems Engineering Agile Enablers, Technical Report SERC-2015-TR-049-1, 16 January 2015. Data collection included monitoring communications in social media groups and websites (such as LinkedIn or Facebook groups associated with the Scaled Agile Framework, Lean ...).

  7. Light-Driven Chiral Molecular Motors for Passive Agile Filters

    DTIC Science & Technology

    2014-05-20

    AFRL-OSR-VA-TR-2014-0121, Final Report, 05/20/2014: Light-Driven Chiral Molecular Motors for Passive Agile Filters; Quan Li, Kent State University, OH (AFOSR). As originally proposed, the major objective of this project was to synthesize novel light-driven chiral molecular motors or switches targeted ...

  8. Sandia Agile MEMS Prototyping, Layout Tools, Education and Services Program

    SciTech Connect

    Schriner, H.; Davies, B.; Sniegowski, J.; Rodgers, M.S.; Allen, J.; Shepard, C.

    1998-05-01

    Research and development in the design and manufacture of Microelectromechanical Systems (MEMS) is growing at an enormous rate. Advances in MEMS design tools and fabrication processes at Sandia National Laboratories' Microelectronics Development Laboratory (MDL) have broadened the scope of MEMS applications that can be designed and manufactured for both military and commercial use. As improvements in micromachining fabrication technologies continue to be made, MEMS designs can become more complex, thus opening the door to an even broader set of MEMS applications. In an effort to further research and development in MEMS design, fabrication, and application, Sandia National Laboratories has launched the Sandia Agile MEMS Prototyping, Layout Tools, Education and Services Program or SAMPLES program. The SAMPLES program offers potential partners interested in MEMS the opportunity to prototype an idea and produce hardware that can be used to sell a concept. The SAMPLES program provides education and training on Sandia's design tools, analysis tools and fabrication process. New designers can participate in the SAMPLES program and design MEMS devices using Sandia's design and analysis tools. As part of the SAMPLES program, participants' designs are fabricated using Sandia's 4 level polycrystalline silicon surface micromachine technology fabrication process known as SUMMiT (Sandia Ultra-planar, Multi-level MEMS Technology). Furthermore, SAMPLES participants can also opt to obtain state of the art, post-fabrication services provided at Sandia such as release, packaging, reliability characterization, and failure analysis. This paper discusses the components of the SAMPLES program.

  9. The impact of flying qualities on helicopter operational agility

    NASA Technical Reports Server (NTRS)

    Padfield, Gareth D.; Lappos, Nick; Hodgkinson, John

    1993-01-01

    Flying qualities standards are formally set to ensure safe flight and therefore reflect minimum, rather than optimum, requirements. Agility is a flying quality but relates to operations at high, if not maximum, performance. While the quality metrics and test procedures for flying qualities, as covered for example in ADS-33C, may provide an adequate structure to encompass agility, they do not currently address flight at high performance. This is also true in the fixed-wing world, and a current concern in both communities is the absence of substantiated agility criteria and possible conflicts between flying qualities and high performance. AGARD is sponsoring a working group (WG19), titled 'Operational Agility', that deals with these and a range of related issues. This paper is condensed from contributions by the three authors to WG19 relating to flying qualities. Novel perspectives on the subject are presented, including the agility factor, which quantifies performance margins in flying qualities terms; a new parameter, based on maneuver acceleration, is introduced as a potential candidate for defining upper limits to flying qualities. Finally, a probabilistic analysis of pilot handling qualities ratings is presented that suggests a powerful relationship between inherent airframe flying qualities and operational agility.

  10. A Review of Agile and Lean Manufacturing as Issues in Selected International and National Research and Development Programs and Roadmaps

    ERIC Educational Resources Information Center

    Castro, Helio; Putnik, Goran D.; Shah, Vaibhav

    2012-01-01

    Purpose: The aim of this paper is to analyze international and national research and development (R&D) programs and roadmaps for the manufacturing sector, presenting how agile and lean manufacturing models are addressed in these programs. Design/methodology/approach: In this review, several manufacturing research and development programs and…

  11. Framework for Modeling the Cognitive Process

    DTIC Science & Technology

    2005-06-16

    Yaworsky Air Force Research Laboratory/IFSB Rome, NY Keywords: Cognitive Process Modeling, Cognition, Conceptual Framework , Information...center of our conceptual framework and will distinguish our use of terms within the context of this framework. 3. A Conceptual Framework for...Modeling the Cognitive Process We will describe our conceptual framework using graphical examples to help illustrate main points. We form the two

  12. An Extension to the Weibull Process Model

    DTIC Science & Technology

    1981-11-01

    An Extension to the Weibull Process Model. ... indicating its importance to applications. 1. Introduction: Recent papers by Bain and Engelhardt (1980) and Crow ...

  13. Hybrid modelling of anaerobic wastewater treatment processes.

    PubMed

    Karama, A; Bernard, O; Genovesi, A; Dochain, D; Benhammou, A; Steyer, J P

    2001-01-01

    This paper presents a hybrid approach for the modelling of an anaerobic digestion process. The hybrid model combines a feed-forward network, describing the bacterial kinetics, and the a priori knowledge based on the mass balances of the process components. We have considered an architecture which incorporates the neural network as a static model of unmeasured process parameters (kinetic growth rate) and an integrator for the dynamic representation of the process using a set of dynamic differential equations. The paper contains a description of the neural network component training procedure. The performance of this approach is illustrated with experimental data.
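
    The hybrid structure described above can be sketched as follows: a placeholder function stands in for the trained neural network that estimates the specific growth rate, while simple chemostat mass balances provide the dynamic part. All parameter values are assumed for illustration and are not the authors' identified model.

```python
# Minimal sketch of a hybrid model: a black-box growth-rate estimate (here a
# Monod-shaped stand-in for the neural network) plugged into a priori mass
# balances of a continuous digester.
def mu_neural(S):
    """Placeholder for the neural-network estimate of the specific growth
    rate as a function of substrate concentration S (Monod-like shape assumed)."""
    return 0.35 * S / (2.0 + S)

def simulate(X=1.0, S=10.0, Y=0.2, D=0.05, S_in=10.0, dt=0.1, steps=1000):
    # Mass balances (the a priori knowledge part):
    #   dX/dt = (mu - D) * X
    #   dS/dt = D * (S_in - S) - mu * X / Y
    for _ in range(steps):
        mu = mu_neural(S)
        X += dt * (mu - D) * X
        S += dt * (D * (S_in - S) - mu * X / Y)
        S = max(S, 0.0)
    return X, S

print("approximate steady state (biomass, substrate):", simulate())
```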

  14. Distillation modeling for a uranium refining process

    SciTech Connect

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
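
    For orientation, two generic relations often used for the equilibrium and molecular descriptions of batch distillation are shown below; the actual cathode-processing model may differ. The symbols are standard rather than taken from the paper: W is moles of liquid remaining, x and y* are liquid and equilibrium vapor mole fractions, J is evaporative flux, alpha is the evaporation coefficient, and M is the molar mass.

```latex
% Equilibrium (Rayleigh) basis for incremental batch distillation:
\[
  \ln\frac{W}{W_0} \,=\, \int_{x_0}^{x} \frac{dx'}{\,y^{*}(x') - x'\,}
\]
% Molecular (free-evaporation) basis under vacuum, Hertz--Knudsen--Langmuir:
\[
  J \,=\, \alpha \,\bigl(p_{\mathrm{sat}}(T) - p\bigr)\,\sqrt{\frac{M}{2\pi R T}}
\]
```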

  15. Declarative business process modelling: principles and modelling languages

    NASA Astrophysics Data System (ADS)

    Goedertier, Stijn; Vanthienen, Jan; Caron, Filip

    2015-02-01

    The business process literature has proposed a multitude of business process modelling approaches or paradigms, each in response to a different business process type with a unique set of requirements. Two polar paradigms, i.e. the imperative and the declarative paradigm, appear to define the extreme positions on the paradigm spectrum. While imperative approaches focus on explicitly defining how an organisational goal should be reached, the declarative approaches focus on the directives, policies and regulations restricting the potential ways to achieve the organisational goal. In between, a variety of hybrid-paradigms can be distinguished, e.g. the advanced and adaptive case management. This article focuses on the less-exposed declarative approach on process modelling. An outline of the declarative process modelling and the modelling approaches is presented, followed by an overview of the observed declarative process modelling principles and an evaluation of the declarative process modelling approaches.
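
    The contrast can be made concrete with a small sketch in the declarative style: rather than prescribing a control flow, Declare-like constraints (names and activities assumed for illustration) are checked against an observed trace.

```python
# Illustrative sketch of the declarative style: state constraints and check
# whether an observed trace of activities satisfies them.
def response(trace, a, b):
    """Every occurrence of a must eventually be followed by b."""
    return all(b in trace[i + 1:] for i, act in enumerate(trace) if act == a)

def precedence(trace, a, b):
    """b may occur only after a has occurred at least once."""
    return a in trace[:trace.index(b)] if b in trace else True

CONSTRAINTS = [
    ("response",   lambda t: response(t, "submit_claim", "notify_customer")),
    ("precedence", lambda t: precedence(t, "assess_claim", "pay_claim")),
]

trace = ["submit_claim", "assess_claim", "pay_claim", "notify_customer"]
for name, check in CONSTRAINTS:
    print(name, "satisfied" if check(trace) else "violated")
```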

  16. VARTM Process Modeling of Aerospace Composite Structures

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured, most likely due to inaccurate preform permeability values used in the simulation.

  17. Threat processing: models and mechanisms.

    PubMed

    Bentz, Dorothée; Schiller, Daniela

    2015-01-01

    The experience of fear is closely linked to the survival of species. Fear can be conceptualized as a brain state that orchestrates defense reactions to threats. To avoid harm, an organism must be equipped with neural circuits that allow learning, detecting, and rapidly responding to threats. Past experience with threat can transform neutral stimuli present at the time of experience into learned threat-related stimuli via associative learning. Pavlovian threat conditioning is the central experimental paradigm to study associative learning. Once learned, these stimulus-response associations are not always expressed depending on context or new experiences with the conditioned stimuli. Neural circuits mediating threat learning have the inherent plasticity to adapt to changing environmental threats. Encounters devoid of danger pave the way for extinction or reconsolidation to occur. Extinction and reconsolidation can both lead to changes in the expression of threat-induced defense responses, but differ in stability and have a different neural basis. This review presents the behavioral models and the system-level neural mechanisms in animals and humans of threat learning and modulation.

  18. An information processing model of anxiety: automatic and strategic processes.

    PubMed

    Beck, A T; Clark, D A

    1997-01-01

    A three-stage schema-based information processing model of anxiety is described that involves: (a) the initial registration of a threat stimulus; (b) the activation of a primal threat mode; and (c) the secondary activation of more elaborative and reflective modes of thinking. The defining elements of automatic and strategic processing are discussed with the cognitive bias in anxiety reconceptualized in terms of a mixture of automatic and strategic processing characteristics depending on which stage of the information processing model is under consideration. The goal in the treatment of anxiety is to deactivate the more automatic primal threat mode and to strengthen more constructive reflective modes of thinking. Arguments are presented for the inclusion of verbal mediation as a necessary but not sufficient component in the cognitive and behavioral treatment of anxiety.

  19. The mechanism and realization of a band-agile coaxial relativistic backward-wave oscillator

    SciTech Connect

    Ge, Xingjun; Zhang, Jun; Zhong, Huihuang; Qian, Baoliang; Wang, Haitao

    2014-11-03

    The mechanism and realization of a band-agile coaxial relativistic backward-wave oscillator (RBWO) are presented. The operation frequency tuning can be easily achieved by merely altering the inner-conductor length. The key effects of the inner-conductor length contributing to the mechanical frequency tunability are investigated theoretically and experimentally. There is a specific inner-conductor length where the operation frequency can jump from one mode to another mode, which belongs to a different operation band. In addition, the operation frequency is tunable within each operation band. During simulation, the L-band microwave with a frequency of 1.61 GHz is radiated when the inner-conductor length is 39 cm. Meanwhile, the S-band microwave with a frequency of 2.32 GHz is radiated when the inner-conductor length is 5 cm. The frequency adjustment bandwidths of L-band and S-band are about 8.5% and 2%, respectively. Moreover, the online mechanical tunability process is described in detail. In the initial experiment, the generated microwave frequencies remain approximately 1.59 GHz and 2.35 GHz when the inner-conductor lengths are 39 cm and 5 cm. In brief, this technical route of the band-agile coaxial RBWO is feasible and provides a guide to designing other types of band-agile high power microwave sources.

  20. The mechanism and realization of a band-agile coaxial relativistic backward-wave oscillator

    NASA Astrophysics Data System (ADS)

    Ge, Xingjun; Zhang, Jun; Zhong, Huihuang; Qian, Baoliang; Wang, Haitao

    2014-11-01

    The mechanism and realization of a band-agile coaxial relativistic backward-wave oscillator (RBWO) are presented. The operation frequency tuning can be easily achieved by merely altering the inner-conductor length. The key effects of the inner-conductor length contributing to the mechanical frequency tunability are investigated theoretically and experimentally. There is a specific inner-conductor length where the operation frequency can jump from one mode to another mode, which belongs to a different operation band. In addition, the operation frequency is tunable within each operation band. During simulation, the L-band microwave with a frequency of 1.61 GHz is radiated when the inner-conductor length is 39 cm. Meanwhile, the S-band microwave with a frequency of 2.32 GHz is radiated when the inner-conductor length is 5 cm. The frequency adjustment bandwidths of L-band and S-band are about 8.5% and 2%, respectively. Moreover, the online mechanical tunability process is described in detail. In the initial experiment, the generated microwave frequencies remain approximately 1.59 GHz and 2.35 GHz when the inner-conductor lengths are 39 cm and 5 cm. In brief, this technical route of the band-agile coaxial RBWO is feasible and provides a guide to designing other types of band-agile high power microwave sources.

  1. Embedding Agile Practices within a Plan-Driven Hierarchical Project Life Cycle

    SciTech Connect

    Millard, W. David; Johnson, Daniel M.; Henderson, John M.; Lombardo, Nicholas J.; Bass, Robert B.; Smith, Jason E.

    2014-07-28

    Organizations use structured, plan-driven approaches to provide continuity, direction, and control to large, multi-year programs. Projects within these programs vary greatly in size, complexity, level of maturity, technical risk, and clarity of the development objectives. Organizations that perform exploratory research, evolutionary development, and other R&D activities can obtain the benefits of Agile practices without losing the benefits of their program’s overarching plan-driven structure. This paper describes application of Agile development methods on a large plan-driven sensor integration program. While the client employed plan-driven, requirements flow-down methodologies, tight project schedules and complex interfaces called for frequent end-to-end demonstrations to provide feedback during system development. The development process maintained the many benefits of plan-driven project execution with the rapid prototyping, integration, demonstration, and client feedback possible through Agile development methods. This paper also describes some of the tools and implementing mechanisms used to transition between and take advantage of each methodology, and presents lessons learned from the project management, system engineering, and developer’s perspectives.

  2. Three Models for the Curriculum Development Process

    ERIC Educational Resources Information Center

    O'Hanlon, James

    1973-01-01

    Presents descriptions of the management, systematic, and open-access curriculum development models to identify the decisionmaking bases, operational processes, evaluation requirements, and curriculum control methods of each model. A possible relationship among these models is then suggested. (Author/DN)

  3. "Agile" Battery Technology Transfer-Lessons Learnt

    NASA Astrophysics Data System (ADS)

    Sabatini, P.; Annoni, G.; Grossi, R.; Alia, Sergio; Reulier, David

    2008-09-01

    AGILE, the high energy astrophysics mission of the Italian Space Agency launched on April 23rd, 2007, is the first LEO satellite to be powered by Saft's commercially available space qualified MPS176065 rechargeable lithium ion batteries. Saft and Carlo Gavazzi Space (CGS) have achieved a successful technology transfer, replacing Ni-H2 batteries with high energy lithium ion batteries in a full speed program (4 months) and with a cost effective approach. The battery system comprises 2 x 24 Saft MPS176065 space qualified Li-ion cells in an 8s3p configuration (3 parallel arrays, each composed of 8 series cells), with a nominal capacity of 2 x 480 Wh and an integral autonomous cell balancing system that ensures the maximum possible battery life. The MPS176065 space qualified cell is based on Saft's well proven MP series of prismatic rechargeable Li-ion batteries. It offers an extremely high capacity made possible by the stainless steel prismatic container, which makes use of the volume that is otherwise lost when conventional cylindrical cells are packed together. A single prismatic cell has about 20% more volumetric energy density than an equivalent pack of cylindrical cells.

  4. Preform Characterization in VARTM Process Model Development

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomenon of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3- Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.
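
    A back-of-the-envelope use of the characterized quantities is the classic one-dimensional Darcy fill-time estimate, t_fill = phi*mu*L^2/(2*K*dP). The numbers below are assumed values, not the measured permeability or compaction data reported in the study.

```python
# Back-of-the-envelope sketch using Darcy's law (illustrative values only):
# 1-D fill time of a flat preform under a constant pressure difference,
#   t_fill = phi * mu * L**2 / (2 * K * dP)
phi = 0.5        # preform porosity (assumed)
mu  = 0.2        # resin viscosity, Pa.s (assumed)
K   = 2e-10      # preform permeability, m^2 (assumed)
dP  = 90e3       # pressure difference, Pa (vacuum-driven, below 101.3 kPa)
L   = 0.5        # flow length, m

t_fill = phi * mu * L**2 / (2 * K * dP)
print(f"estimated fill time: {t_fill/60:.1f} minutes")
```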

  5. Agile data management for curation of genomes to watershed datasets

    NASA Astrophysics Data System (ADS)

    Varadharajan, C.; Agarwal, D.; Faybishenko, B.; Versteeg, R.

    2015-12-01

    A software platform is being developed for data management and assimilation [DMA] as part of the U.S. Department of Energy's Genomes to Watershed Sustainable Systems Science Focus Area 2.0. The DMA components and capabilities are driven by the project science priorities and the development is based on agile development techniques. The goal of the DMA software platform is to enable users to integrate and synthesize diverse and disparate field, laboratory, and simulation datasets, including geological, geochemical, geophysical, microbiological, hydrological, and meteorological data across a range of spatial and temporal scales. The DMA objectives are (a) developing an integrated interface to the datasets, (b) storing field monitoring data, laboratory analytical results of water and sediments samples collected into a database, (c) providing automated QA/QC analysis of data and (d) working with data providers to modify high-priority field and laboratory data collection and reporting procedures as needed. The first three objectives are driven by user needs, while the last objective is driven by data management needs. The project needs and priorities are reassessed regularly with the users. After each user session we identify development priorities to match the identified user priorities. For instance, data QA/QC and collection activities have focused on the data and products needed for on-going scientific analyses (e.g. water level and geochemistry). We have also developed, tested and released a broker and portal that integrates diverse datasets from two different databases used for curation of project data. The development of the user interface was based on a user-centered design process involving several user interviews and constant interaction with data providers. The initial version focuses on the most requested feature - i.e. finding the data needed for analyses through an intuitive interface. Once the data is found, the user can immediately plot and download data

  6. Modeling cellular processes in 3D.

    PubMed

    Mogilner, Alex; Odde, David

    2011-12-01

    Recent advances in photonic imaging and fluorescent protein technology offer unprecedented views of molecular space-time dynamics in living cells. At the same time, advances in computing hardware and software enable modeling of ever more complex systems, from global climate to cell division. As modeling and experiment become more closely integrated we must address the issue of modeling cellular processes in 3D. Here, we highlight recent advances related to 3D modeling in cell biology. While some processes require full 3D analysis, we suggest that others are more naturally described in 2D or 1D. Keeping the dimensionality as low as possible reduces computational time and makes models more intuitively comprehensible; however, the ability to test full 3D models will build greater confidence in models generally and remains an important emerging area of cell biological modeling.

  7. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform post mortem assessments.
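
    A heavily simplified sketch of a system-dynamics-style feedback loop of the kind SEPS models is given below; the pressure-to-error relationship and all parameter values are invented for illustration and are not the SEPS equations.

```python
# Minimal feedback-loop sketch (hypothetical parameters, not SEPS): schedule
# pressure raises the error rate, and rework feeds back into the workload.
def run(tasks=100.0, staff=5.0, productivity=1.0, deadline=20.0, dt=1.0):
    done, t = 0.0, 0.0
    while done < tasks and t < 200:
        remaining = tasks - done
        pressure = max(remaining / max(deadline - t, 1.0) / (staff * productivity), 1.0)
        error_rate = 0.1 * pressure                 # more pressure -> more defects
        completed = staff * productivity * dt
        rework = completed * min(error_rate, 0.9)   # defective work returns
        done += completed - rework
        t += dt
    return t

print("simulated completion time:", run(), "time units vs. 20 planned")
```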

  8. Job Aiding/Training Decision Process Model

    DTIC Science & Technology

    1992-09-01

    AL-CR-1992-0004 (AD-A256 947): Job Aiding/Training Decision Process Model; John P. Zenyuh, Phillip C. ...; March 1990 - April 1990; contract F33615-86-C-0545, PE 62205F, PR 1121. Contents include a mapping of components to process model decision and selection points and a summary of subject recommendations for aiding approaches.

  9. MODEL OF DIFFUSERS / PERMEATORS FOR HYDROGEN PROCESSING

    SciTech Connect

    Hang, T; William Jacobs, W

    2007-08-27

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon or helium. The tubing becomes permeable to hydrogen when heated to more than 250 C and a differential pressure is created across the membrane. The hydrogen diffuses better at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be utilized to improve the fit of the model. A reliable model-based diffuser system design is the goal which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low pressure diffuser system. The modeling approach and the results are presented in this paper.
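
    As a rough illustration (not the model described above, which solves the full diffusion equations), steady-state hydrogen permeation through a Pd-Ag wall is often approximated with a Sieverts'-law expression; the permeability parameters and geometry below are assumed values.

```python
# Hedged sketch: Sieverts'/Richardson-type permeation estimate,
#   J = (Phi(T)/L) * (sqrt(p_feed) - sqrt(p_perm)),  Phi(T) = Phi0 * exp(-Ea/(R*T))
import math

R = 8.314                    # gas constant, J/(mol*K)
Phi0, Ea = 2.2e-7, 15_600    # permeability pre-factor and activation energy (assumed values)
L = 0.2e-3                   # membrane wall thickness, m (assumed)

def flux(T, p_feed, p_perm):
    phi = Phi0 * math.exp(-Ea / (R * T))            # mol H2 / (m*s*Pa^0.5)
    return phi / L * (math.sqrt(p_feed) - math.sqrt(p_perm))

for T in (523, 623, 723):                           # diffusers run above ~250 C
    print(T, "K ->", f"{flux(T, 1.0e5, 1.0e3):.3e}", "mol H2 m^-2 s^-1")
```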

  10. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  11. Creativity in Agile Systems Development: A Literature Review

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Wang, Xiaofeng; Fitzgerald, Brian

    Proponents of agile methods claim that enabling, fostering and driving creativity is the key motivation that differentiates agile methods from their more traditional, bureaucratic counterparts. However, there is very little rigorous research to support this claim. Like most of their predecessors, the development and promotion of these methods have been almost entirely driven by practitioners and consultants, with little objective validation from the research community. This lack of validation is particularly relevant for SMEs, given that many of their project teams typify the environment to which agile methods are most suited, i.e. small, co-located teams with diverse, blended skills in unstructured, sometimes even chaotic surroundings. This paper uses creativity theory as a lens to review the current agile method literature to understand exactly how much we know about the extent to which creativity actually occurs in these agile environments. The study reveals many gaps and conflicts of opinion in the body of knowledge in its current state and identifies many avenues for further research.

  12. The Southern Argentine Agile Meteor Radar (SAAMER)

    NASA Astrophysics Data System (ADS)

    Janches, Diego

    2014-11-01

    The Southern Argentina Agile Meteor Radar (SAAMER) is a new generation system deployed in Rio Grande, Tierra del Fuego, Argentina (53 S) in May 2008. SAAMER transmits 10 times more power than regular meteor radars, and uses a newly developed transmitting array, which focuses power upward instead of the traditional single-antenna all-sky configuration. The system is configured such that the transmitter array can also be utilized as a receiver. The new design greatly increases the sensitivity of the radar, enabling the detection of a large number of particles at low zenith angles. The more concentrated transmitted power enables additional meteor studies besides those typical of these systems based on the detection of specular reflections, such as routine detections of head echoes and non-specular trails, previously only possible with High Power and Large Aperture radars. In August 2010, SAAMER was upgraded to a system capable of determining meteoroid orbital parameters. This was achieved by adding two remote receiving stations approximately 10 km away from the main site in near perpendicular directions. The upgrade significantly expands the science that is achieved with this new radar, enabling us to study the orbital properties of the interplanetary dust environment. Because of its unique geographical location, SAAMER allows for additional inter-hemispheric comparison with measurements from the Canadian Meteor Orbit Radar, which is geographically conjugate. Initial surveys show, for example, that SAAMER observes a very strong contribution of the South Toroidal Sporadic meteor source, of which limited observational data are available. In addition, SAAMER offers similarly unique capabilities for meteor shower and stream studies: given the range of ecliptic latitudes covered, the system enables detailed study of showers at high southern latitudes (e.g., the July Phoenicids or the Puppids complex). Finally, SAAMER is ideal for the deployment of complementary instrumentation in both, permanent

  13. The (Mathematical) Modeling Process in Biosciences.

    PubMed

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work, the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  14. The (Mathematical) Modeling Process in Biosciences

    PubMed Central

    Torres, Nestor V.; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work, the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology. PMID:26734063

  15. Program Development and Evaluation: A Modeling Process.

    ERIC Educational Resources Information Center

    Green, Donald W.; Corgiat, RayLene

    A model of program development and evaluation was developed at Genesee Community College, utilizing a system theory/process of deductive and inductive reasoning to ensure coherence and continuity within the program. The model links activities to specific measurable outcomes. Evaluation checks and feedback are built in at various levels so that…

  16. A Process Model for Water Jug Problems

    ERIC Educational Resources Information Center

    Atwood, Michael E.; Polson, Peter G.

    1976-01-01

    A model is developed and evaluated for use in the water jug task, in which subjects are required to find a sequence of moves which produce a specified amount of water in each jug. Results indicate that the model presented correctly predicts the difficulties of different problems and describes the behavior of subjects in the process of problem…
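
    The production-system process model evaluated in the abstract is not reproduced here; as a minimal point of comparison, the sketch below only computes the shortest legal move sequence for a water jug problem with a breadth-first search, which gives an objective baseline against which problem difficulty can be judged. The jug capacities and the target state in the example are hypothetical.

      from collections import deque

      def solve_water_jugs(capacities, target):
          """Breadth-first search for the shortest move sequence that leaves
          target[i] units of water in jug i; states are tuples of jug contents."""
          start = tuple(0 for _ in capacities)
          queue = deque([(start, [])])
          seen = {start}
          while queue:
              state, moves = queue.popleft()
              if state == target:
                  return moves
              successors = []
              for i in range(len(state)):
                  filled = list(state); filled[i] = capacities[i]     # fill jug i from the tap
                  emptied = list(state); emptied[i] = 0               # empty jug i
                  successors += [(tuple(filled), f"fill {i}"), (tuple(emptied), f"empty {i}")]
                  for j in range(len(state)):                         # pour jug i into jug j
                      if i == j:
                          continue
                      amount = min(state[i], capacities[j] - state[j])
                      poured = list(state)
                      poured[i] -= amount
                      poured[j] += amount
                      successors.append((tuple(poured), f"pour {i}->{j}"))
              for nxt, move in successors:
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append((nxt, moves + [move]))
          return None  # target not reachable

      # Hypothetical example: 8-unit and 5-unit jugs, aiming for exactly (2, 0).
      print(solve_water_jugs((8, 5), (2, 0)))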

  17. Modeling of fluidized bed silicon deposition process

    NASA Technical Reports Server (NTRS)

    Kim, K.; Hsu, G.; Lutwack, R.; Praturi, A. K.

    1977-01-01

    The model is intended for use as a means of improving fluidized bed reactor design and for the formulation of the research program in support of the contracts of Silicon Material Task for the development of the fluidized bed silicon deposition process. A computer program derived from the simple modeling is also described. Results of some sample calculations using the computer program are shown.

  18. Stochastic model for supersymmetric particle branching process

    NASA Astrophysics Data System (ADS)

    Zhang, Yuanyuan; Chan, Aik Hui; Oh, Choo Hiap

    2017-01-01

    We develop a stochastic branching model to describe the jet evolution of supersymmetric (SUSY) particles. This model is a modified two-phase branching process, or more precisely, a two-phase simple birth process plus a Poisson process. Both the pure SUSY-parton-initiated jet scenario and the SUSY-plus-ordinary-parton-initiated jet scenario are considered. The stochastic branching equations are established and the multiplicity distributions (MDs) are derived for these two scenarios. We also fit the distribution for the general case (SUSY plus ordinary partons initiated jets) to experimental data. The fit shows that SUSY particles have not yet participated in branching at current collision energies.

  19. Quantitative modeling of soil genesis processes

    NASA Technical Reports Server (NTRS)

    Levine, E. R.; Knox, R. G.; Kerber, A. G.

    1992-01-01

    For fine spatial scale simulation, a model is being developed to predict changes in properties over short-, meso-, and long-term time scales within horizons of a given soil profile. Processes that control these changes can be grouped into five major process clusters: (1) abiotic chemical reactions; (2) activities of organisms; (3) energy balance and water phase transitions; (4) hydrologic flows; and (5) particle redistribution. Landscape modeling of soil development is possible using digitized soil maps associated with quantitative soil attribute data in a geographic information system (GIS) framework to which simulation models are applied.

  20. Filament winding cylinders. I - Process model

    NASA Technical Reports Server (NTRS)

    Lee, Soo-Yong; Springer, George S.

    1990-01-01

    A model was developed which describes the filament winding process of composite cylinders. The model relates the significant process variables such as winding speed, fiber tension, and applied temperature to the thermal, chemical and mechanical behavior of the composite cylinder and the mandrel. Based on the model, a user friendly code was written which can be used to calculate (1) the temperature in the cylinder and the mandrel, (2) the degree of cure and viscosity in the cylinder, (3) the fiber tensions and fiber positions, (4) the stresses and strains in the cylinder and in the mandrel, and (5) the void diameters in the cylinder.
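
    The published submodels for temperature, cure and viscosity are not reproduced in the abstract; as an illustration of the kind of calculation such a process model performs, the sketch below integrates a generic nth-order Arrhenius cure-kinetics law along an applied temperature history. The rate constants and the temperature profile are hypothetical, not values from the paper.

      import numpy as np

      def integrate_cure(T_profile_K, dt=1.0, A=1.0e5, Ea=60.0e3, n=1.5):
          """Euler integration of generic nth-order cure kinetics,
          d(alpha)/dt = A * exp(-Ea / (R*T)) * (1 - alpha)**n,
          along a prescribed temperature history (one sample per dt seconds)."""
          R = 8.314  # J/(mol K)
          alpha, history = 0.0, []
          for T in T_profile_K:
              rate = A * np.exp(-Ea / (R * T)) * (1.0 - alpha) ** n
              alpha = min(1.0, alpha + rate * dt)
              history.append(alpha)
          return np.array(history)

      # Hypothetical cure cycle: 10-minute ramp from 300 K to 450 K, then a 50-minute hold.
      T_hist = np.concatenate([np.linspace(300.0, 450.0, 600), np.full(3000, 450.0)])
      alpha = integrate_cure(T_hist)
      print(f"degree of cure after {len(T_hist)} s: {alpha[-1]:.2f}")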

  1. Using Perspective to Model Complex Processes

    SciTech Connect

    Kelsey, R.L.; Bisset, K.R.

    1999-04-04

    The notion of perspective, when supported in an object-based knowledge representation, can facilitate better abstractions of reality for modeling and simulation. The object modeling of complex physical and chemical processes is made more difficult in part due to the poor abstractions of state and phase changes available in these models. The notion of perspective can be used to create different views to represent the different states of matter in a process. These techniques can lead to a more understandable model. Additionally, the ability to record the progress of a process from start to finish is problematic. It is desirable to have a historic record of the entire process, not just the end result of the process. A historic record should facilitate backtracking and re-start of a process at different points in time. The same representation structures and techniques can be used to create a sequence of process markers to represent a historic record. By using perspective, the sequence of markers can have multiple and varying views tailored for a particular user's context of interest.

  2. Modeling the VARTM Composite Manufacturing Process

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Loos, Alfred C.; Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal

    2004-01-01

    A comprehensive simulation model of the Vacuum Assisted Resin Transfer Molding (VARTM) composite manufacturing process has been developed. For isothermal resin infiltration, the model incorporates submodels which describe cure of the resin and changes in resin viscosity due to cure, resin flow through the reinforcement preform and distribution medium, and compaction of the preform during the infiltration. The accuracy of the model was validated by measuring the flow patterns during resin infiltration of flat preforms. The modeling software was used to evaluate the effects of the distribution medium on resin infiltration of a flat preform. Different distribution medium configurations were examined using the model and the results were compared with data collected during resin infiltration of a carbon fabric preform. The results of the simulations show that the approach used to model the distribution medium can significantly affect the predicted resin infiltration times. Resin infiltration into the preform can be accurately predicted only when the distribution medium is modeled correctly.

  3. Mathematical modeling of the coating process.

    PubMed

    Toschkoff, Gregor; Khinast, Johannes G

    2013-12-05

    Coating of tablets is a common unit operation in the pharmaceutical industry. In most cases, the final product must meet strict quality requirements; to meet them, a detailed understanding of the coating process is required. To this end, numerous experimental studies have been performed. However, to acquire a mechanistic understanding, experimental data must be interpreted in the light of mathematical models. In recent years, a combination of analytical modeling and computational simulations enabled deeper insights into the nature of the coating process. This paper presents an overview of modeling and simulation approaches of the coating process, covering various relevant aspects from scale-up considerations to coating mass uniformity investigations and models for drop atomization. The most important analytical and computational concepts are presented and the findings are compared.

  4. The measurement and improvement of the lateral agility of the F-18

    NASA Technical Reports Server (NTRS)

    Eggold, David P.; Valasek, John; Downing, David R.

    1991-01-01

    The effect of vehicle configuration and flight control system performance on the roll agility of a modern fighter aircraft has been investigated. A batch simulation of a generic F-18 Hornet was used to study roll agility as measured by the time-to-roll-through-90-deg metric. Problems discussed include the definition of agility, factors affecting the agility of a vehicle, the development of the time-to-roll-through-90-deg agility metric, and a simulation experiment. It is concluded that the integral of stability-axis or wind-axis roll rate should be used as the measure of the roll angle traversed. The time-to-roll-through-90-deg metric is considered to be a good metric for measuring the transient performance aspect of agility. Roll agility of the F-18, as measured by the 90-deg metric, can be improved by 10 to 30 percent. Compatible roll and rudder actuator rates can significantly affect the 90-deg agility metric.
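
    The metric can be illustrated with a simple calculation: integrate the roll rate of a first-order roll-mode response until the bank angle has traversed 90 deg. The sketch below does exactly that; the steady roll rate and roll-mode time constant are hypothetical placeholders, not F-18 values from the study.

      import math

      def time_to_roll(delta_phi_deg, p_max_deg_s, tau_s, dt=1e-4):
          """Integrate a first-order roll-rate response p(t) = p_max*(1 - exp(-t/tau))
          until the bank angle has traversed delta_phi_deg degrees."""
          t, phi = 0.0, 0.0
          while phi < delta_phi_deg:
              t += dt
              p = p_max_deg_s * (1.0 - math.exp(-t / tau_s))
              phi += p * dt
          return t

      # Hypothetical numbers: 180 deg/s steady roll rate, 0.4 s roll-mode time constant.
      print(f"time to roll through 90 deg: {time_to_roll(90.0, 180.0, 0.4):.2f} s")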

  5. Pathways to agility in the production of neutron generators

    SciTech Connect

    Stoltz, R.E.; Beavis, L.C.; Cutchen, J.T.; Garcia, P.; Gurule, G.A.; Harris, R.N.; McKey, P.C.; Williams, D.W.

    1994-02-01

    This report is the result of a study team commissioned to explore pathways for increased agility in the manufacture of neutron generators. As a part of Sandia's new responsibility for generator production, the goal of the study was to identify opportunities to reduce costs and increase flexibility in the manufacturing operation. Four parallel approaches (or pathways) were recommended: (1) Know the goal, (2) Use design leverage effectively, (3) Value simplicity, and (4) Configure for flexibility. Agility in neutron generator production can be enhanced if all of these pathways are followed. The key role of the workforce in achieving agility was also noted, with emphasis on ownership, continuous learning, and a supportive environment.

  6. The AGILE Mission and Gamma-Ray Bursts

    SciTech Connect

    Longo, Francesco; Tavani, M.; Barbiellini, G.; Argan, A.; Basset, M.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.; Chen, A.; Costa, E.; Del Monte, E.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Feroci, M.; Fiorini, M.; Foggetta, L.; Froysland, T.; Frutti, M.

    2007-05-01

    The AGILE Mission will explore the gamma-ray Universe with a very innovative instrument combining for the first time a gamma-ray imager and a hard X-ray imager. AGILE will be operational at the beginning of 2007 and it will provide crucial data for the study of Active Galactic Nuclei, Gamma-Ray Bursts, unidentified gamma-ray sources, Galactic compact objects, supernova remnants, TeV sources, and fundamental physics by microsecond timing. The AGILE instrument is designed to simultaneously detect and image photons in the 30 MeV - 50 GeV and 15 - 45 keV energy bands with excellent imaging and timing capabilities, and a large field of view covering approximately 1/5 of the entire sky at energies above 30 MeV. A CsI calorimeter is capable of GRB triggering in the energy band 0.3-50 MeV. The broadband detection of GRBs and the study of implications for particle acceleration and high energy emission are primary goals of the mission. AGILE can image GRBs with 2-3 arcminute error boxes in the hard X-ray range, and provide broadband photon-by-photon detection in the 15-45 keV, 0.3-50 MeV, and 30 MeV-30 GeV energy ranges. Microsecond on-board photon tagging and an approximately 100 microsecond gamma-ray detection deadtime will be crucial for fast GRB timing. On-board calculated GRB coordinates and energy fluxes will be quickly transmitted to the ground by an ORBCOMM transceiver. AGILE is now (January 2007) undergoing final satellite integration and testing. The PSLV launch is planned in spring 2007. AGILE is then foreseen to be fully operational during the summer of 2007.

  7. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in the aerospace, automotive, and machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  8. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  9. Dynamic occupancy models for explicit colonization processes

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Johnson, Devin S.; Altwegg, Res; Conquest, Loveday

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and its spread via short distance movement, rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization including short- vs. long-distance dispersal, habitat quality, and distance from source populations.
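
    The hierarchical, detectability-corrected model of the paper is not reproduced here; the sketch below is only a forward simulation of the colonization rule the abstract describes, in which an empty site is more likely to be colonized when more of its neighbors are occupied, with a small long-distance term and a constant extinction probability. All parameter values, the grid size, and the rook-neighborhood choice are hypothetical.

      import numpy as np

      rng = np.random.default_rng(1)

      def simulate_occupancy(n=30, seasons=20, gamma0=0.05, gamma_nbr=0.15,
                             eps=0.10, long_dist=0.01):
          """Grid occupancy dynamics: empty cells are colonized with a probability
          that increases with the number of occupied rook neighbors (plus a small
          long-distance term); occupied cells go extinct with probability eps.
          Boundaries wrap around (torus), purely for simplicity."""
          z = np.zeros((n, n), dtype=int)
          z[n // 2, n // 2] = 1                        # single source population
          for _ in range(seasons):
              nbrs = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
                      np.roll(z, 1, 1) + np.roll(z, -1, 1))
              p_col = 1.0 - (1.0 - gamma0 - long_dist) * (1.0 - gamma_nbr) ** nbrs
              colonized = (z == 0) & (rng.random((n, n)) < p_col)
              survived = (z == 1) & (rng.random((n, n)) > eps)
              z = (colonized | survived).astype(int)
          return z

      print("occupied cells after 20 seasons:", simulate_occupancy().sum())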

  10. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experience of the use of this notation for modelling Pathology processes in Spain or in other countries is known to us. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of the processes of Anatomic Pathology using BPMN is presented. The presented subprocesses are those corresponding to the surgical pathology examination of the samples coming from the operating theatre, including the planning and realization of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals. PMID:18673511

  11. Stochastic differential equation model to Prendiville processes

    NASA Astrophysics Data System (ADS)

    Granita; Bahar, Arifah

    2015-10-01

    The Prendiville process is another variation of the logistic model, which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain of the Prendiville process. This equation was then formulated in the form of a central-difference approximation. The approximation was then used in the Fokker-Planck equation to relate it to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance functions of the Prendiville process could be easily found from the explicit solution.
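
    The exact Prendiville rate functions and the explicit solution derived in the paper are not given in the abstract; the block below only recalls, as background, the standard diffusion approximation that the described pipeline (forward Kolmogorov equation, central-difference approximation, Fokker-Planck equation, SDE) produces for a general birth-death chain with birth rate lambda(x) and death rate mu(x), of which the Prendiville rates are a special case.

      % Generic diffusion approximation of a birth-death CTMC; lambda(x) and mu(x)
      % stand for the Prendiville birth and death rates (not specified in the abstract).
      \begin{align}
        \frac{\partial p(x,t)}{\partial t}
          &\approx -\frac{\partial}{\partial x}\Big[\big(\lambda(x)-\mu(x)\big)\,p(x,t)\Big]
            + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\Big[\big(\lambda(x)+\mu(x)\big)\,p(x,t)\Big], \\
        dX_t &= \big(\lambda(X_t)-\mu(X_t)\big)\,dt + \sqrt{\lambda(X_t)+\mu(X_t)}\;dW_t .
      \end{align}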

  12. Stochastic differential equation model to Prendiville processes

    SciTech Connect

    Granita; Bahar, Arifah

    2015-10-22

    The Prendiville process is another variation of the logistic model, which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain of the Prendiville process. This equation was then formulated in the form of a central-difference approximation. The approximation was then used in the Fokker-Planck equation to relate it to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance functions of the Prendiville process could be easily found from the explicit solution.

  13. Chain binomial models and binomial autoregressive processes.

    PubMed

    Weiss, Christian H; Pollett, Philip K

    2012-09-01

    We establish a connection between a class of chain-binomial models of use in ecology and epidemiology and binomial autoregressive (AR) processes. New results are obtained for the latter, including expressions for the lag-conditional distribution and related quantities. We focus on two types of chain-binomial model, extinction-colonization and colonization-extinction models, and present two approaches to parameter estimation. The asymptotic distributions of the resulting estimators are studied, as well as their finite-sample performance, and we give an application to real data. A connection is made with standard AR models, which also has implications for parameter estimation.
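
    The estimators and asymptotic results of the paper are not reproduced here; the sketch below just simulates the binomial AR(1) recursion via binomial thinning, which is the standard way an extinction-colonization chain-binomial dynamic maps onto a binomial autoregressive process. The patch count N and the survival and colonization probabilities are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)

      def binomial_ar1(N, alpha, beta, T):
          """Simulate X_t = alpha o X_{t-1} + beta o (N - X_{t-1}), where 'o' denotes
          binomial thinning: occupied patches survive with probability alpha and
          empty patches are colonized with probability beta."""
          x = np.empty(T, dtype=int)
          x[0] = rng.binomial(N, beta / (1.0 - alpha + beta))   # start near the stationary mean
          for t in range(1, T):
              survivors = rng.binomial(x[t - 1], alpha)
              colonizers = rng.binomial(N - x[t - 1], beta)
              x[t] = survivors + colonizers
          return x

      series = binomial_ar1(N=50, alpha=0.7, beta=0.2, T=2000)
      print("sample mean:", series.mean(), " (stationary mean is N*beta/(1-alpha+beta) = 20)")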

  14. Process simulation and modeling for gas processing plant

    NASA Astrophysics Data System (ADS)

    Alhameli, Falah Obaid Kenish Mubarak

    Natural gas is one of the major energy sources and its demand is increasing rapidly due to its environmental and economic advantages over other fuels. Gas processing is an essential component of the natural gas system. In this work, a gas processing plant is introduced with the objective of meeting pipeline gas quality. It consists of separation, sweetening and dehydration units. The separation unit contains phase separators along with a stabilizer (conventional distillation column). The sweetening unit is an amine process with MDEA (Methyl DiEthanol Amine) solvent. The dehydration unit is glycol absorption with TEG (TriEthyleneGlycol) solvent. ProMax 3.2 was used to simulate the plant. A Box-Behnken design was applied to build a black-box model using design of experiments (DoE). Minitab 15 was used to generate and analyse the design. Ten variables, representing the gas feed conditions and the units' parameters, were chosen for the model. The total number of runs was 170. They were successfully implemented and analysed. Models for the total energy of the plant and the water content of the product gas were obtained. A case study was conducted to investigate the impact of an increase in H2S composition in the feed gas. The models were used for the case study with the objective of total energy minimization and a constraint of 4 lb/MMscf for water content in the product gas. Lingo 13 was used for the optimization. It was observed that the feed pressure had the highest influence among the other parameters. Finally, some recommendations were pointed out for future work.

  15. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.
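
    The actual tape casting and plasma spray submodels are not reproduced in the abstract; the sketch below only illustrates the QCM idea of sweeping a process variable through a process model that returns both a cost estimate and a quality index, and then reading off the minimum achievable cost at each quality level. The cost and quality functions, and the deposition-rate variable, are hypothetical placeholders.

      import numpy as np

      # Hypothetical process model: a single knob (deposition rate, kg/h) drives both
      # a cost estimate and a 0-1 microstructural quality index.
      rates = np.linspace(0.5, 10.0, 200)
      cost = 200.0 + 800.0 / rates           # $/kg: machine time dominates at low rates
      quality = np.exp(-0.15 * rates)        # defect/porosity-driven quality index

      # QCM curve: the minimum achievable cost at or above each quality level.
      order = np.argsort(quality)
      q_sorted, c_sorted = quality[order], cost[order]
      qcm_cost = np.minimum.accumulate(c_sorted[::-1])[::-1]
      for q, c in zip(q_sorted[::40], qcm_cost[::40]):
          print(f"quality >= {q:4.2f}  ->  minimum cost ${c:7.1f}/kg")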

  16. Session on modeling of radiative transfer processes

    NASA Technical Reports Server (NTRS)

    Flatau, Piotr

    1993-01-01

    The session on modeling of radiative transfer processes is reviewed. Six critical issues surfaced in the discussion concerning scale-interactive radiative processes relevant to mesoscale convective systems (MCS's). These issues are the need to expand basic knowledge of how MCS's influence climate through extensive cloud shields and increased humidity in the upper troposphere; to improve radiation parameterizations used in mesoscale and General Circulation Model (GCM) models; to improve our basic understanding of the influence of radiation on MCS dynamics due to diabatic heating, production of condensate, and vertical and horizontal heat fluxes; to quantify our understanding of radiative impacts of MCS's on the surface and free atmosphere energy budgets; to quantify and identify radiative and microphysical processes important in the evolution of MCS's; and to improve the capability to remotely sense MCS radiative properties from space and ground-based systems.

  17. Incorporating evolutionary processes into population viability models.

    PubMed

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence.

  18. A process algebra model of QED

    NASA Astrophysics Data System (ADS)

    Sulis, William

    2016-03-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics.

  19. Modeling Kanban Processes in Systems Engineering

    DTIC Science & Technology

    2012-06-01

    Systems engineering processes using pull scheduling methods (kanban) are being evaluated with hybrid development projects that incrementally evolve capabilities of existing systems and/or systems of systems. A kanban-based scheduling system was defined and

  20. Retort process modelling for Indian traditional foods.

    PubMed

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold point temperature. Initial process conditions, retort temperature and % solid content were the independent variables with significant effects. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot based sweet product) and Upama (wheat based snack product). The predicted and experimental values of the temperature profile matched within ±10 % error, which is a good match considering that the food was a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize the retorting of various Indian traditional vegetarian foods.
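
    The coefficients of the published unified model are not given in the abstract; the sketch below only shows the generic first-order lumped-parameter form such a cold-point model reduces to, with a single time constant that, in the real model, would depend on retort temperature, initial conditions and solids content. The temperatures and the time constant used here are hypothetical.

      import numpy as np

      def cold_point_temperature(t_min, T0=30.0, T_retort=121.0, tau_min=18.0):
          """Lumped-parameter (first-order) heating model for the slowest-heating point
          of a retort pouch: dT/dt = (T_retort - T) / tau, solved analytically."""
          return T_retort - (T_retort - T0) * np.exp(-t_min / tau_min)

      for t in (0, 10, 20, 40, 60):
          print(f"{t:3d} min: {cold_point_temperature(t):6.1f} deg C")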

  1. The DAB model of drawing processes

    NASA Technical Reports Server (NTRS)

    Hochhaus, Larry W.

    1989-01-01

    The problem of automatic drawing was investigated in two ways. First, a DAB model of drawing processes was introduced. DAB stands for three types of knowledge hypothesized to support drawing abilities, namely, Drawing Knowledge, Assimilated Knowledge, and Base Knowledge. Speculation concerning the content and character of each of these subsystems of the drawing process is introduced and the overall adequacy of the model is evaluated. Second, eight experts were each asked to understand six engineering drawings and to think aloud while doing so. It is anticipated that a concurrent protocol analysis of these interviews can be carried out in the future. Meanwhile, a general description of the videotape database is provided. In conclusion, the DAB model was praised as a worthwhile first step toward solution of a difficult problem, but was considered by and large inadequate to the challenge of automatic drawing. Suggestions for improvements on the model were made.

  2. Deterministic geologic processes and stochastic modeling

    SciTech Connect

    Rautman, C.A.; Flint, A.L.

    1991-12-31

    Recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial distribution of measured values and geostatistical measures of spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. These deterministic features have their origin in the complex, yet logical, interplay of a number of deterministic geologic processes, including magmatic evolution; volcanic eruption, transport, and emplacement; post-emplacement cooling and alteration; and late-stage (diagenetic) alteration. Because the geologic processes responsible for the formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling. It is unlikely that any single representation of physical properties at the site will be suitable for all modeling purposes. Instead, the same underlying physical reality will need to be described many times, each in a manner conducive to assessing specific performance issues.

  3. Applying Business Process Re-engineering Patterns to optimize WS-BPEL Workflows

    NASA Astrophysics Data System (ADS)

    Buys, Jonas; de Florio, Vincenzo; Blondia, Chris

    With the advent of XML-based SOA, WS-BPEL quickly became a widely accepted standard for modeling business processes. Though SOA is said to embrace the principle of business agility, BPEL process definitions are still manually crafted into their final executable version. While SOA has proven to be a giant leap forward in building flexible IT systems, this static BPEL workflow model is somewhat paradoxical to the need for real business agility and should be enhanced to better sustain continual process evolution. In this paper, we point out the potential of adding business intelligence with respect to business process re-engineering patterns to the system to allow for automatic business process optimization. Furthermore, we point out that BPR macro-rules could be implemented leveraging micro-techniques from computer science. We present some practical examples that illustrate the benefit of such adaptive process models and our preliminary findings.

  4. SDN-Enabled Dynamic Feedback Control and Sensing in Agile Optical Networks

    NASA Astrophysics Data System (ADS)

    Lin, Likun

    Fiber optic networks are no longer just pipelines for transporting data in the long haul backbone. Exponential growth in traffic in metro-regional areas has pushed higher capacity fiber toward the edge of the network, and highly dynamic patterns of heterogeneous traffic have emerged that are often bursty, severely stressing the historical "fat and dumb pipe" static optical network, which would need to be massively over-provisioned to deal with these loads. What is required is a more intelligent network with a span of control over the optical as well as electrical transport mechanisms, which enables handling of service requests in a fast and efficient way that guarantees quality of service (QoS) while optimizing capacity efficiency. An "agile" optical network is a reconfigurable optical network comprising a high speed intelligent control system fed by real-time in situ network sensing. It provides fast response in the control and switching of optical signals in response to changing traffic demands and network conditions. This agile control of optical signals is enabled by pushing switching decisions downward in the network stack to the physical layer. Implementing such agility is challenging due to the response dynamics and interactions of signals in the physical layer. Control schemes must deal with issues such as dynamic power equalization, EDFA transients and cascaded noise effects, impairments due to self-phase modulation and dispersion, and channel-to-channel cross talk. If these issues are not properly predicted and mitigated, attempts at dynamic control can drive the optical network into an unstable state. In order to enable high speed actuation of signal modulators and switches, the network controller must be able to make decisions based on predictive models. In this thesis, we consider how to take advantage of Software Defined Networking (SDN) capabilities for network reconfiguration, combined with embedded models that access updates from deployed network

  5. Attrition and abrasion models for oil shale process modeling

    SciTech Connect

    Aldis, D.F.

    1991-10-25

    As oil shale is processed, fine particles, much smaller than the original shale, are created. This process is called attrition or, more accurately, abrasion. In this paper, models of abrasion are presented for oil shale being processed in several unit operations. Two of these unit operations, a fluidized bed and a lift pipe, are used in the Lawrence Livermore National Laboratory Hot-Recycle-Solid (HRS) process being developed for the above-ground processing of oil shale. In two reports, studies were conducted on the attrition of oil shale in unit operations which are used in the HRS process. Carley reported results for attrition in a lift pipe for oil shale which had been pre-processed either by retorting or by retorting then burning. The second paper, by Taylor and Beavers, reported results for fluidized bed processing of oil shale. Taylor and Beavers studied raw, retorted, and shale which had been retorted and then burned. In this paper, empirical models are derived from the experimental studies conducted on oil shale for the processes occurring in the HRS process. The derived models are presented along with comparisons with experimental results.

  6. Modeling of a thermoplastic pultrusion process

    SciTech Connect

    Astroem, B.T.; Pipes, R.B.

    1991-07-01

    To obtain a fundamental understanding of the effects of processing parameters and die geometry in a pultrusion process, a mathematical model is essential in order to minimize the number of trial-and-error experiments. Previous investigators have suggested a variety of more or less complete models for thermoset pultrusion, while little effort seems to have been spent modeling its less well-understood thermoplastic equivalent. Hence, a set of intricately related models to describe the temperature and pressure distributions, as well as the matrix flow, in a thermoplastic composite as it travels through the pultrusion die is presented. An approach to calculate the accumulated pulling force is also explored, and the individual mechanisms contributing to the pulling force are discussed. The pressure model incorporates a matrix viscosity that varies with shear rate, temperature, and pressure. Comparisons are made between shear-rate-dependent and Newtonian viscosity representations, indicating the necessity of including non-Newtonian fluid behavior when modeling thermoplastic pultrusion. The governing equations of the models are stated in general terms, and simplifications are implemented in order to obtain solutions without extensive numerical efforts. Pressure, temperature, cooling rate, and pulling force distributions are presented for carbon-fiber-reinforced polyetheretherketone. Pulling force predictions are compared to data obtained from preliminary experiments conducted with a model pultrusion line that was built solely for the pultrusion of thermoplastic matrix composites, and the correlation is found to be qualitatively satisfactory.

  7. Measurement and modeling of moist processes

    NASA Technical Reports Server (NTRS)

    Cotton, William; Starr, David; Mitchell, Kenneth; Fleming, Rex; Koch, Steve; Smith, Steve; Mailhot, Jocelyn; Perkey, Don; Tripoli, Greg

    1993-01-01

    The keynote talk summarized five years of work simulating observed mesoscale convective systems with the RAMS (Regional Atmospheric Modeling System) model. Excellent results are obtained when simulating squall line or other convective systems that are strongly forced by fronts or other lifting mechanisms. Less highly forced systems are difficult to model. The next topic in this colloquium was measurement of water vapor and other constituents of the hydrologic cycle. Impressive accuracy was shown measuring water vapor with both the airborne DIAL (Differential Absorption Lidar) system and the ground-based Raman Lidar. NMC's plans for initializing land water hydrology in mesoscale models were presented before water vapor measurement concepts for GCIP were discussed. The subject of using satellite data to provide mesoscale moisture and wind analyses was next. Recent activities in modeling of moist processes in mesoscale systems were reported on. These modeling activities at the Canadian Atmospheric Environment Service (AES) used a hydrostatic, variable-resolution grid model. Next, the effects of spatial resolution on moisture budgets were discussed; in particular, the effects of temporal resolution on heat and moisture budgets for cumulus parameterization. The conclusion of this colloquium was on modeling scale interaction processes.

  8. Therapeutic Process During Exposure: Habituation Model

    PubMed Central

    Benito, Kristen G.; Walther, Michael

    2015-01-01

    The current paper outlines the habituation model of exposure process, which is a behavioral model emphasizing use of individually tailored functional analysis during exposures. This is a model of therapeutic process rather than one meant to explain the mechanism of change underlying exposure-based treatments. Habituation, or a natural decrease in anxiety level in the absence of anxiety-reducing behavior, might be best understood as an intermediate treatment outcome that informs therapeutic process, rather than as a mechanism of change. The habituation model purports that three conditions are necessary for optimal benefit from exposures: 1) fear activation, 2) minimization of anxiety-reducing behaviors, and 3) habituation. We describe prescribed therapist and client behaviors as those that increase or maintain anxiety level during an exposure (and therefore facilitate habituation), and proscribed therapist and client behaviors as those that decrease anxiety during an exposure (and therefore impede habituation). We illustrate model-consistent behaviors in the case of Monica, as well as outline the existing research support and call for additional research to further test the tenets of the habituation model as described in this paper. PMID:26258012

  9. Multiscale retinocortical model of contrast processing

    NASA Astrophysics Data System (ADS)

    Moorhead, Ian R.; Haig, Nigel D.

    1996-04-01

    Visual performance models have, in the past, typically been empirical, relying on the user to supply numerical values such as target contrast and background luminance to describe the performance of the visual system when undertaking a specified task. However, it is becoming increasingly easy to obtain computer images using, for example, digital cameras, scanners, and imaging photometers and radiometers. We have therefore been examining the possibility of producing a quantitative model of human vision that is capable of directly processing images in order to provide predictions of performance. We are particularly interested in being able to process images of 'real' scenes. The model is inspired by human vision and the components have analogies with parts of the human visual system, but their properties are governed primarily by existing psychophysical data. The first stage of the model generates a multiscale, difference of Gaussian (DoG) representation of the image (Burton, Haig and Moorhead), with a central foveal region of high resolution, and with a resolution that declines with eccentricity as the scale of the filter increases. Incorporated into this stage is a gain control process which ensures that the contrast sensitivity is consistent with the psychophysical data of van Nes and Bouman. The second stage incorporates a model of perceived contrast proposed by Cannon and Fullenkamp. Their model assumes the image is analyzed by oriented (Gabor) filters and produces a representation of the image in terms of perceived contrast.
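
    The published model's foveated sampling, gain control and Gabor-based second stage are not reproduced here; the sketch below only illustrates the first-stage idea of a multiscale difference-of-Gaussians (center-surround) decomposition of a luminance image. The filter scales, the surround ratio and the test image are hypothetical.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def dog_pyramid(image, sigmas=(1.0, 2.0, 4.0, 8.0), surround_ratio=1.6):
          """Return band-pass (center-surround) responses at several spatial scales:
          each channel is G(sigma) - G(surround_ratio * sigma) applied to the image."""
          image = image.astype(float)
          return [gaussian_filter(image, s) - gaussian_filter(image, surround_ratio * s)
                  for s in sigmas]

      # Hypothetical test image: a luminance grating plus a little noise.
      x = np.linspace(0.0, 8.0 * np.pi, 256)
      img = np.outer(np.ones(256), np.sin(x)) + 0.1 * np.random.default_rng(0).normal(size=(256, 256))
      bands = dog_pyramid(img)
      print("band energies:", [round(float(b.std()), 3) for b in bands])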

  10. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  11. Modeling stroke rehabilitation processes using the Unified Modeling Language (UML).

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pinciroli, Francesco

    2013-10-01

    In organising and providing rehabilitation procedures for stroke patients, the usual need for many refinements makes it inappropriate to attempt rigid standardisation, but greater detail is required concerning workflow. The aim of this study was to build a model of the post-stroke rehabilitation process. The model, implemented in the Unified Modeling Language, was grounded on international guidelines and refined following the clinical pathway adopted at local level by a specialized rehabilitation centre. The model describes the organisation of the rehabilitation delivery and it facilitates the monitoring of recovery during the process. Indeed, system software was developed and tested to support clinicians in the digital administration of clinical scales. The model's flexibility assures easy updating after process evolution.

  12. Mesoscopic Modeling of Reactive Transport Processes

    NASA Astrophysics Data System (ADS)

    Kang, Q.; Chen, L.; Deng, H.

    2012-12-01

    Reactive transport processes involving precipitation and/or dissolution are pervasive in geochemical, biological and engineered systems. Typical examples include self-assembled patterns such as Liesegang rings or bands, cones of stalactites in limestones caves, biofilm growth in aqueous environment, formation of mineral deposits in boilers and heat exchangers, uptake of toxic metal ions from polluted water by calcium carbonate, and mineral trapping of CO2. Compared to experimental studies, a numerical approach enables a systematic study of the reaction kinetics, mass transport, and mechanisms of nucleation and crystal growth, and hence provides a detailed description of reactive transport processes. In this study, we enhance a previously developed lattice Boltzmann pore-scale model by taking into account the nucleation process, and develop a mesoscopic approach to simulate reactive transport processes involving precipitation and/or dissolution of solid phases. The model is then used to simulate the formation of Liesegang precipitation patterns and investigate the effects of gel on the morphology of the precipitates. It is shown that this model can capture the porous structures of the precipitates and can account for the effects of the gel concentration and material. A wide range of precipitation patterns is predicted under different gel concentrations, including regular bands, treelike patterns, and for the first time with numerical models, transition patterns from regular bands to treelike patterns. The model is also applied to study the effect of secondary precipitate on the dissolution of primary mineral. Several types of dissolution and precipitation processes are identified based on the morphology and structures of the precipitates and on the extent to which the precipitates affect the dissolution of the primary mineral. Finally the model is applied to study the formation of pseudomorph. It is demonstrated for the first time by numerical simulation that a
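
    The authors' full model couples transport to nucleation, precipitation and dissolution with evolving solid nodes; none of that is reproduced here. The sketch below only shows the underlying mesoscopic transport kernel on which such pore-scale models are typically built: a single-relaxation-time (BGK) D2Q9 lattice Boltzmann scheme for a passive concentration field with periodic boundaries. The grid size, relaxation time and initial condition are hypothetical.

      import numpy as np

      # D2Q9 lattice: discrete velocities, weights, and a BGK diffusion kernel.
      ex = np.array([0, 1, 0, -1, 0, 1, -1, -1, 1])
      ey = np.array([0, 0, 1, 0, -1, 1, 1, -1, -1])
      w = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])
      tau = 0.8                                  # diffusivity D = (tau - 0.5) / 3 in lattice units

      nx = ny = 64
      C = np.zeros((ny, nx))
      C[ny // 2, nx // 2] = 1.0                  # point source of solute
      f = w[:, None, None] * C[None, :, :]       # start from the local equilibrium

      for _ in range(500):
          C = f.sum(axis=0)                      # concentration = zeroth moment
          feq = w[:, None, None] * C[None, :, :] # zero-velocity equilibrium
          f += (feq - f) / tau                   # BGK collision
          for i in range(9):                     # streaming step (periodic boundaries)
              f[i] = np.roll(np.roll(f[i], ey[i], axis=0), ex[i], axis=1)

      print("total solute mass conserved:", bool(np.isclose(f.sum(), 1.0)))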

  13. Hencky's model for elastomer forming process

    NASA Astrophysics Data System (ADS)

    Oleinikov, A. A.; Oleinikov, A. I.

    2016-08-01

    In the numerical simulation of elastomer forming processes, Hencky's isotropic hyperelastic material model can guarantee relatively accurate prediction of the strain range in terms of large deformations. It is shown that this material model prolongs Hooke's law from the area of infinitesimal strains to the area of moderate ones. A new representation of the fourth-order elasticity tensor for Hencky's hyperelastic isotropic material is obtained; it possesses both minor symmetries and the major symmetry. The constitutive relations of the considered model are implemented into the MSC.Marc code. By calculating and fitting curves, the polyurethane elastomer material constants are selected. Simulation of equipment for elastomer sheet forming is considered.
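
    The new representation of the fourth-order elasticity tensor derived in the paper is not reproduced here; the block below only recalls, as background, the standard Hencky (logarithmic-strain) hyperelastic law that the abstract describes as extending Hooke's law to moderate strains, written with the usual Lame constants.

      % Hencky isotropic hyperelasticity: Hooke's law written in the logarithmic
      % (Hencky) strain H = ln V, giving the Kirchhoff stress tau.
      \begin{align}
        \mathbf{H} &= \ln \mathbf{V} = \tfrac{1}{2}\,\ln\!\big(\mathbf{F}\mathbf{F}^{\mathsf{T}}\big), \\
        W(\mathbf{H}) &= \mu\,\mathbf{H}:\mathbf{H} + \tfrac{\lambda}{2}\,\big(\operatorname{tr}\mathbf{H}\big)^{2}, \\
        \boldsymbol{\tau} &= \frac{\partial W}{\partial \mathbf{H}}
             = 2\mu\,\mathbf{H} + \lambda\,\big(\operatorname{tr}\mathbf{H}\big)\,\mathbf{I}.
      \end{align}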

  14. Dynamical modeling of laser ablation processes

    SciTech Connect

    Leboeuf, J.N.; Chen, K.R.; Donato, J.M.; Geohegan, D.B.; Liu, C.L.; Puretzky, A.A.; Wood, R.F.

    1995-09-01

    Several physics and computational approaches have been developed to globally characterize phenomena important for film growth by pulsed laser deposition of materials. These include thermal models of laser-solid target interactions that initiate the vapor plume; plume ionization and heating through laser absorption beyond local thermodynamic equilibrium mechanisms; gas dynamic, hydrodynamic, and collisional descriptions of plume transport; and molecular dynamics models of the interaction of plume particles with the deposition substrate. The complexity of the phenomena involved in the laser ablation process is matched by the diversity of the modeling task, which combines materials science, atomic physics, and plasma physics.

  15. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  16. The SERIOL2 Model of Orthographic Processing

    ERIC Educational Resources Information Center

    Whitney, Carol; Marton, Yuval

    2013-01-01

    The SERIOL model of orthographic analysis proposed mechanisms for converting visual input into a serial encoding of letter order, which involved hemisphere-specific processing at the retinotopic level. As a test of SERIOL predictions, we conducted a consonant trigram-identification experiment, where the trigrams were briefly presented at various…

  17. STBRSIM. Oil Shale Retorting Process Model

    SciTech Connect

    Braun, R.L.; Diaz, J.C.

    1992-03-02

    STBRSIM simulates an aboveground oil-shale retorting process that utilizes two reactors: a staged, fluidized-bed retort and a lift-pipe combustor. The model calculates the steady-state operating conditions for the retorting system, taking into account the chemical and physical processes occurring in the two reactors and auxiliary equipment. Chemical and physical processes considered in modeling the retort include: kerogen pyrolysis, bound water release, fluidization of solids mixture, and bed pressure drop. Processes accounted for by the combustor model include: combustion of residual organic carbon and hydrogen, combustion of pyrite and pyrrhotite, combustion of nonpyrolized kerogen, decomposition of dolomite and calcite, pneumatic transport, heat transfer between solids and gas streams, pressure drop and change in void fraction, and particle attrition. The release of mineral water and the pyrolysis of kerogen take place in the retort when raw shale is mixed with hot partially-burned shale, and the partial combustion of residual char and sulfur takes place in the combustor as the shale particles are transported pneumatically by preheated air. Auxiliary equipment is modeled to determine its effect on the system. This equipment includes blowers and heat-exchangers for the recycle gas to the retort and air to the combustor, as well as a condenser for the product stream from the retort. Simulation results include stream flow rates, temperatures and pressures, bed dimensions, and heater, cooling, and compressor power requirements.

  18. STBRSIM. Oil Shale Retorting Process Model

    SciTech Connect

    Eyberger, L.R.

    1992-03-02

    STBRSIM simulates an aboveground oil-shale retorting process that utilizes two reactors - a staged, fluidized-bed retort and a lift-pipe combustor. The model calculates the steady-state operating conditions for the retorting system, taking into account the chemical and physical processes occurring in the two reactors and auxiliary equipment. Chemical and physical processes considered in modeling the retort include: kerogen pyrolysis, bound water release, fluidization of solids mixture, and bed pressure drop. Processes accounted for by the combustor model include: combustion of residual organic carbon and hydrogen, combustion of pyrite and pyrrhotite, combustion of nonpyrolized kerogen, decomposition of dolomite and calcite, pneumatic transport, heat transfer between solids and gas streams, pressure drop and change in void fraction, and particle attrition. The release of mineral water and the pyrolysis of kerogen take place in the retort when raw shale is mixed with hot partially-burned shale, and the partial combustion of residual char and sulfur takes place in the combustor as the shale particles are transported pneumatically by preheated air. Auxiliary equipment is modeled to determine its effect on the system. This equipment includes blowers and heat-exchangers for the recycle gas to the retort and air to the combustor, as well as a condenser for the product stream from the retort. Simulation results include stream flow rates, temperatures and pressures, bed dimensions, and heater, cooling, and compressor power requirements.

  19. Content, Process, and Product: Modeling Differentiated Instruction

    ERIC Educational Resources Information Center

    Taylor, Barbara Kline

    2015-01-01

    Modeling differentiated instruction is one way to demonstrate how educators can incorporate instructional strategies to address students' needs, interests, and learning styles. This article discusses how secondary teacher candidates learn to focus on content--the "what" of instruction; process--the "how" of instruction;…

  20. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  1. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost - many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  2. Modeling Low-temperature Geochemical Processes

    NASA Astrophysics Data System (ADS)

    Nordstrom, D. K.

    2003-12-01

    Geochemical modeling has become a popular and useful tool for a wide number of applications from research on the fundamental processes of water-rock interactions to regulatory requirements and decisions regarding permits for industrial and hazardous wastes. In low-temperature environments, generally thought of as those in the temperature range of 0-100 °C and close to atmospheric pressure (1 atm=1.01325 bar=101,325 Pa), complex hydrobiogeochemical reactions participate in an array of interconnected processes that affect us, and that, in turn, we affect. Understanding these complex processes often requires tools that are sufficiently sophisticated to portray multicomponent, multiphase chemical reactions yet transparent enough to reveal the main driving forces. Geochemical models are such tools. The major processes that they are required to model include mineral dissolution and precipitation; aqueous inorganic speciation and complexation; solute adsorption and desorption; ion exchange; oxidation-reduction (redox) transformations; gas uptake or production; organic matter speciation and complexation; evaporation; dilution; water mixing; reaction during fluid flow; reaction involving biotic interactions; and photoreaction. These processes occur in rain, snow, fog, dry atmosphere, soils, bedrock weathering, streams, rivers, lakes, groundwaters, estuaries, brines, and diagenetic environments. Geochemical modeling attempts to understand the redistribution of elements and compounds, through anthropogenic and natural means, over a large range of scales from nanometer to global. "Aqueous geochemistry" and "environmental geochemistry" are often used interchangeably with "low-temperature geochemistry" to emphasize hydrologic or environmental objectives. The strategy or philosophy behind the use of geochemical modeling is not often discussed or explicitly described. Plummer (1984, 1992) and Parkhurst and Plummer (1993) compare and contrast two approaches for

  3. Process Model for Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Adams, Glynn

    1996-01-01

    Friction stir welding (FSW) is a relatively new process being applied for joining of metal alloys. The process was initially developed by The Welding Institute (TWI) in Cambridge, UK. The FSW process is being investigated at NASA/MSFC as a repair/initial weld procedure for fabrication of the super-light-weight aluminum-lithium shuttle external tank. The FSW investigations at MSFC were conducted on a horizontal mill to produce butt welds of flat plate material. The weldment plates are butted together and fixed to a backing plate on the mill bed. A pin tool is placed into the tool holder of the mill spindle and rotated at approximately 400 rpm. The pin tool is then plunged into the plates such that the center of the probe lies at one end of the line of contact between the plates, and the shoulder of the pin tool penetrates the top surface of the weldment. The weld is produced by traversing the tool along the line of contact between the plates. A lead angle allows the leading edge of the shoulder to remain above the top surface of the plate. The work presented here is the first attempt at modeling a complex phenomenon. The mechanical aspects of conducting the weld process are easily defined and the process itself is controlled by relatively few input parameters. However, in the region of the weld, plasticizing and forging of the parent material occurs. These are difficult processes to model. The model presented here addresses only variations in the radial dimension outward from the pin tool axis. Examinations of the grain structure of the weld reveal that a considerable amount of material deformation also occurs in the direction parallel to the pin tool axis of rotation, through the material thickness. In addition, measurements of the axial load on the pin tool demonstrate that the forging effect of the pin tool shoulder is an important process phenomenon. Therefore, the model needs to be expanded to account for the deformations through the material thickness and the

  4. Agile hardware and software systems engineering for critical military space applications

    NASA Astrophysics Data System (ADS)

    Huang, Philip M.; Knuth, Andrew A.; Krueger, Robert O.; Garrison-Darrin, Margaret A.

    2012-06-01

    The Multi Mission Bus Demonstrator (MBD) is a successful demonstration of agile program management and system engineering in a high risk technology application where utilizing and implementing new, untraditional development strategies were necessary. MBD produced two fully functioning spacecraft for a military/DOD application in a record-breaking time frame and at dramatically reduced costs. This paper discloses the adaptation and application of concepts developed in agile software engineering to hardware product and system development for critical military applications. This challenging spacecraft did not use existing key technology (heritage hardware) and created a large paradigm shift from traditional spacecraft development. The insertion of new technologies and methods in space hardware has long been a problem due to long build times, the desire to use heritage hardware, and lack of effective process. The role of momentum in the innovative process can be exploited to tackle ongoing technology disruptions and allows risk interactions to be mitigated in a disciplined manner. Examples of how these concepts were used during the MBD program will be delineated. Maintaining project momentum was essential for assessing the constant nonrecurring technological challenges, which needed to be retired rapidly from the engineering risk list. Development never slowed due to tactical assessment of the hardware with the adoption of the SCRUM technique. We adapted this concept as a representation of mitigation of technical risk while allowing for design freeze later in the program's development cycle. By using Agile Systems Engineering and Management techniques which enabled decisive action, the product development momentum was effectively used to produce two novel space vehicles in a fraction of the time and at dramatically reduced cost.

  5. Development and evaluation of an inverse solution technique for studying helicopter maneuverability and agility

    NASA Technical Reports Server (NTRS)

    Whalley, Matthew S.

    1991-01-01

    An inverse solution technique for determining the maximum maneuvering performance of a helicopter using smooth, pilotlike control inputs is presented. Also described is a pilot simulation experiment performed to investigate the accuracy of the solution resulting from this technique. The maneuverability and agility capability of the helicopter math model was varied by varying the pitch and roll damping, the maximum pitch and roll rate, and the maximum load-factor capability. Three maneuvers were investigated: a 180-deg turn, a longitudinal pop-up, and a lateral jink. The inverse solution technique yielded accurate predictions of pilot-in-the-loop maneuvering performance for two of the three maneuvers.

  6. Process modeling with the regression network.

    PubMed

    van der Walt, T; Barnard, E; van Deventer, J

    1995-01-01

    A new connectionist network topology called the regression network is proposed. The structural and underlying mathematical features of the regression network are investigated. Emphasis is placed on the intricacies of the optimization process for the regression network and some measures to alleviate these difficulties of optimization are proposed and investigated. The ability of the regression network algorithm to perform either nonparametric or parametric optimization, as well as a combination of both, is also highlighted. It is further shown how the regression network can be used to model systems which are poorly understood on the basis of sparse data. A semi-empirical regression network model is developed for a metallurgical processing operation (a hydrocyclone classifier) by building mechanistic knowledge into the connectionist structure of the regression network model. Poorly understood aspects of the process are provided for by use of nonparametric regions within the structure of the semi-empirical connectionist model. The performance of the regression network model is compared to the corresponding generalization performance results obtained by some other nonparametric regression techniques.
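
    A minimal sketch of the general semi-empirical idea (not the paper's regression network architecture): a mechanistic term carries the known physics, while a data-driven correction fitted to sparse data absorbs the poorly understood residual. The relation and coefficients below are invented for illustration, and an ordinary least-squares polynomial stands in for the network's nonparametric regions:

      import numpy as np

      def mechanistic_part(feed_rate):
          # Assumed first-principles relation (hypothetical, for illustration only).
          return 0.8 * feed_rate

      rng = np.random.default_rng(0)
      feed = np.linspace(1.0, 10.0, 15)                 # sparse "plant" data
      measured = 0.8 * feed + 0.05 * feed**2 + rng.normal(0.0, 0.1, feed.size)

      residual = measured - mechanistic_part(feed)
      correction = np.polyfit(feed, residual, deg=2)    # data-driven correction term

      def semi_empirical_model(feed_rate):
          return mechanistic_part(feed_rate) + np.polyval(correction, feed_rate)

      print(semi_empirical_model(np.array([2.5, 7.5])))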

  7. Coal-to-Liquids Process Model

    SciTech Connect

    2006-01-01

    A comprehensive Aspen Plus model has been developed to rigorously model coal-to-liquids processes. This portion was developed under Laboratory Directed Research and Development (LDRD) funding. The model is built in a modular fashion to allow rapid reconfiguration for evaluation of process options. Aspen Plus is the framework in which the model is developed. The coal-to-liquids simulation package is an assembly of Aspen Hierarchy Blocks representing subsections of the plant. Each of these Blocks is considered an individual component of the Copyright, which may be extracted and licensed as an individual component, but which may be combined with one or more other components to model general coal-conversion processes, including the following plant operations: (1) coal handling and preparation, (2) coal pyrolysis, combustion, or gasification, (3) syngas conditioning and cleanup, (4) sulfur recovery using Claus-SCOT unit operations, (5) Fischer-Tropsch liquid fuels synthesis, (6) hydrocracking of high molecular weight paraffin, (7) hydrotreating of low molecular weight paraffin and olefins, (8) gas separations, and (9) power generation representing integrated combined cycle technology.

  8. Considerations for Using Agile in DoD Acquisition

    DTIC Science & Technology

    2010-04-01

    Buettner, Aerospace Joe Tatem, Raytheon Stephany Bellomo, SEI Nanette Brown, SEI John Foreman, SEI Dr. John Goodenough, SEI Harry Levinson, SEI...event_details.php?id=452 [2] H. Glazer, J. Dalton, D. Anderson, M. Konrad, and S. Shrum, "CMMI or Agile: Why Not Embrace Both," Carnegie Mellon

  9. A Capstone Course on Agile Software Development Using Scrum

    ERIC Educational Resources Information Center

    Mahnic, V.

    2012-01-01

    In this paper, an undergraduate capstone course in software engineering is described that not only exposes students to agile software development, but also makes it possible to observe the behavior of developers using Scrum for the first time. The course requires students to work as Scrum Teams, responsible for the implementation of a set of user…

  10. Correlation between agility and sprinting according to student age.

    PubMed

    Yanci, Javier; Los Arcos, Asier; Grande, Ignacio; Gil, Eneko; Cámara, Jesús

    2014-06-01

    The purposes of the study were to assess sprinting and agility performance characteristics and to determine the relationship between these two motor skills in elementary education students. Sprinting and agility performance were assessed in 176 children (88 boys and 88 girls) divided into three groups: Group 1 (G1, N = 98; 48 boys and 50 girls), from the first year of elementary education; Group 2 (G2, N = 38; 15 boys and 23 girls), from the second year of elementary education; Group 3 (G3, N = 40; 25 boys and 15 girls), from the third year of elementary education. Significant differences (p < 0.001) were found in agility among the groups and between G1-G3 and G2-G3 in the 5 and 15 m sprint. Regarding the gender of students of the same age, significant differences (p < 0.001) between boys and girls in groups G1 and G2 were found in the 5 and 15 m sprint. The correlation between agility and acceleration was significant but moderate (0.3 < r < 0.7) in all groups (G1, G2, and G3), in most cases. When the gender factor was included, the results were heterogeneous. Assessing this correlation according to age and gender produced heterogeneous results. For this reason, we consider the two to be independent qualities, with age and gender being two factors that influence the correlation results.
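
    The group-wise correlation analysis reported above can be sketched as follows; the sprint and agility times are synthetic placeholders, not the study's data:

      import numpy as np
      from scipy.stats import pearsonr

      rng = np.random.default_rng(1)
      groups = {                                   # columns: 15 m sprint time (s), agility time (s)
          "G1": rng.normal([3.9, 7.8], 0.3, size=(40, 2)),
          "G2": rng.normal([3.7, 7.4], 0.3, size=(30, 2)),
          "G3": rng.normal([3.5, 7.0], 0.3, size=(30, 2)),
      }
      for name, data in groups.items():
          r, p = pearsonr(data[:, 0], data[:, 1])  # Pearson correlation per group
          print(f"{name}: r = {r:.2f}, p = {p:.3f}")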

  11. Wavelength-Agile External-Cavity Diode Laser for DWDM

    NASA Technical Reports Server (NTRS)

    Pilgrim, Jeffrey S.; Bomse, David S.

    2006-01-01

    A prototype external-cavity diode laser (ECDL) has been developed for communication systems utilizing dense wavelength-division multiplexing (DWDM). This ECDL is an updated version of the ECDL reported in Wavelength-Agile External-Cavity Diode Laser (LEW-17090), NASA Tech Briefs, Vol. 25, No. 11 (November 2001), page 14a. To recapitulate: The wavelength-agile ECDL combines the stability of an external-cavity laser with the wavelength agility of a diode laser. Wavelength is modulated by modulating the injection current of the diode-laser gain element. The external cavity is a Littman-Metcalf resonator, in which the zeroth-order output from a diffraction grating is used as the laser output and the first-order-diffracted light is retro-reflected by a cavity feedback mirror, which establishes one end of the resonator. The other end of the resonator is the output surface of a Fabry-Perot resonator that constitutes the diode-laser gain element. Wavelength is selected by choosing the angle of the diffracted return beam, as determined by position of the feedback mirror. The present wavelength-agile ECDL is distinguished by design details that enable coverage of all 60 channels, separated by 100-GHz frequency intervals, that are specified in DWDM standards.

  12. Agile Software Development Methods: A Comparative Review1

    NASA Astrophysics Data System (ADS)

    Abrahamsson, Pekka; Oza, Nilay; Siponen, Mikko T.

    Although agile software development methods have caught the attention of software engineers and researchers worldwide, scientific research still remains quite scarce. The aim of this study is to order and make sense of the different agile approaches that have been proposed. This comparative review is performed from the standpoint of using the following features as the analytical perspectives: project management support, life-cycle coverage, type of practical guidance, adaptability in actual use, type of research objectives and existence of empirical evidence. The results show that agile software development methods cover, without offering any rationale, different phases of the software development life-cycle and that most of these methods fail to provide adequate project management support. Moreover, quite a few methods continue to offer little concrete guidance on how to use their solutions or how to adapt them in different development situations. Empirical evidence after ten years of application remains quite limited. Based on the results, new directions on agile methods are outlined.

  13. Agile Bodies: A New Imperative in Neoliberal Governance

    ERIC Educational Resources Information Center

    Gillies, Donald

    2011-01-01

    Modern business discourse suggests that a key bulwark against market fluctuation and the threat of failure is for organizations to become "agile'", a more dynamic and proactive position than that previously afforded by mere "flexibility". The same idea is also directed at the personal level, it being argued that the…

  14. Modeling veterans healthcare administration disclosure processes

    SciTech Connect

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  15. An ecological process model of systems change.

    PubMed

    Peirson, Leslea J; Boydell, Katherine M; Ferguson, H Bruce; Ferris, Lorraine E

    2011-06-01

    In June 2007 the American Journal of Community Psychology published a special issue focused on theories, methods and interventions for systems change which included calls from the editors and authors for theoretical advancement in this field. We propose a conceptual model of systems change that integrates familiar and fundamental community psychology principles (succession, interdependence, cycling of resources, adaptation) and accentuates a process orientation. To situate our framework we offer a definition of systems change and a brief review of the ecological perspective and principles. The Ecological Process Model of Systems Change is depicted, described and applied to a case example of policy driven systems level change in publicly funded social programs. We conclude by identifying salient implications for thinking and action which flow from the Model.

  16. A model evaluation checklist for process-based environmental models

    NASA Astrophysics Data System (ADS)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
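
    To make the performance-statistic point concrete, the sketch below computes the Nash-Sutcliffe efficiency alongside the Kling-Gupta efficiency, one commonly used alternative (whether this is the alternative adopted in the study is an assumption here); the observed and simulated series are made up:

      import numpy as np

      def nse(obs, sim):
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def kge(obs, sim):
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          r = np.corrcoef(obs, sim)[0, 1]      # correlation component
          alpha = sim.std() / obs.std()        # variability ratio
          beta = sim.mean() / obs.mean()       # bias ratio
          return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

      obs = np.array([0.12, 0.30, 0.85, 0.40, 0.22, 0.15])   # e.g. daily phosphorus load, made up
      sim = np.array([0.10, 0.25, 0.60, 0.45, 0.28, 0.18])
      print(f"NSE = {nse(obs, sim):.2f}, KGE = {kge(obs, sim):.2f}")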

  17. Development of a comprehensive weld process model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.; Paul, A.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area and that of LMES to develop computer models and simulation software for welding processes. This development has significant impact on the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, the model does not account for multipass welds, microstructural evolution, distortion and residual stresses. Additionally, the model requires large resources of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above.

  18. Thermal modeling of an epoxy encapsulation process

    SciTech Connect

    Baca, R.G.; Schutt, J.A.

    1991-01-01

    The encapsulation of components is a widely used process at Sandia National Laboratories for packaging components to withstand structural loads. Epoxy encapsulants are also used for their outstanding dielectric strength characteristics. The production of high voltage assemblies requires the encapsulation of ceramic and electrical components (such as transformers). Separation of the encapsulant from internal contact surfaces or voids within the encapsulant itself in regions near the mold base has caused high voltage breakdown failures during production testing. In order to understand the failure mechanisms, a methodology was developed to predict both the thermal response and gel front progression of the epoxy during the encapsulation process. A thermal model constructed with PATRAN Plus (1) and solved with the P/THERMAL (2) analysis system was used to predict the thermal response of the encapsulant. This paper discusses the incorporation of an Arrhenius kinetics model into Q/TRAN (2) to model the complex volumetric heat generation of the epoxy during the encapsulation process. As the epoxy begins to cure, it generates heat and shrinks. The total cure time of the encapsulant (transformation from a viscous liquid to solid) is dependent on both the initial temperature and the entire temperature history. Because the rate of cure is temperature dependent, the cure rate accelerates with a temperature increase and, likewise, the cure rate is quenched if the temperature is reduced. The temperature and conversion predictions compared well against experimental data. The thermal simulation results were used to modify the temperature cure process of the encapsulant and improve production yields.
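
    The Arrhenius cure-kinetics source term described above can be sketched as a simple nth-order model; the kinetic parameters, density, and heat of reaction below are hypothetical placeholders rather than the values used in the Q/TRAN implementation:

      import numpy as np
      from scipy.integrate import solve_ivp

      A, Ea, R, n = 1.0e5, 6.0e4, 8.314, 1.5    # 1/s, J/mol, J/(mol K), reaction order
      rho, H_total = 1150.0, 3.0e5              # kg/m^3, J/kg of exothermic cure heat

      def cure_rate(t, alpha, T):
          """d(alpha)/dt for an nth-order Arrhenius cure model at temperature T [K]."""
          return A * np.exp(-Ea / (R * T)) * (1.0 - alpha) ** n

      T_cure = 360.0                            # isothermal cure temperature, K
      sol = solve_ivp(cure_rate, (0.0, 3600.0), [0.0], args=(T_cure,), max_step=10.0)
      alpha = sol.y[0]
      q = rho * H_total * cure_rate(sol.t, alpha, T_cure)   # volumetric heat generation, W/m^3
      print(f"conversion after 1 h: {alpha[-1]:.2f}, peak heat generation: {q.max():.0f} W/m^3")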

  19. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ >= Cq >= E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With code-words of sufficiently long length, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.
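
    As a toy illustration of the classical side of this comparison, the statistical complexity Cμ of an ɛ-machine is the Shannon entropy of its stationary causal-state distribution. The two-state machine below (a Golden-Mean-like process in which the second state always emits a 1) is an assumed example, not one of the processes studied here:

      import numpy as np

      # T[i, j] = probability of moving from causal state i to state j,
      # summed over output symbols.
      T = np.array([[0.5, 0.5],    # state A: emit 1 and stay (0.5), emit 0 and go to B (0.5)
                    [1.0, 0.0]])   # state B: always emit 1 and return to A

      # Stationary distribution: left eigenvector of T for eigenvalue 1.
      evals, evecs = np.linalg.eig(T.T)
      pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
      pi /= pi.sum()

      C_mu = -np.sum(pi * np.log2(pi))          # statistical complexity in bits
      print(f"pi = {pi.round(3)}, C_mu = {C_mu:.3f} bits")   # about 0.918 bits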

  20. An agile acquisition decision-support workbench for evaluating ISR effectiveness

    NASA Astrophysics Data System (ADS)

    Stouch, Daniel W.; Champagne, Valerie; Mow, Christopher; Rosenberg, Brad; Serrin, Joshua

    2011-06-01

    The U.S. Air Force is consistently evolving to support current and future operations through the planning and execution of intelligence, surveillance and reconnaissance (ISR) missions. However, it is a challenge to maintain a precise awareness of current and emerging ISR capabilities to properly prepare for future conflicts. We present a decision-support tool for acquisition managers to empirically compare ISR capabilities and approaches to employing them, thereby enabling the DoD to acquire ISR platforms and sensors that provide the greatest return on investment. We have developed an analysis environment to perform modeling and simulation-based experiments to objectively compare alternatives. First, the analyst specifies an operational scenario for an area of operations by providing terrain and threat information; a set of nominated collections; sensor and platform capabilities; and processing, exploitation, and dissemination (PED) capacities. Next, the analyst selects and configures ISR collection strategies to generate collection plans. The analyst then defines customizable measures of effectiveness or performance to compute during the experiment. Finally, the analyst empirically compares the efficacy of each solution and generates concise reports to document their conclusions, providing traceable evidence for acquisition decisions. Our capability demonstrates the utility of using a workbench environment for analysts to design and run experiments. Crafting impartial metrics enables the acquisition manager to focus on evaluating solutions based on specific military needs. Finally, the metric and collection plan visualizations provide an intuitive understanding of the suitability of particular solutions. This facilitates a more agile acquisition strategy that handles rapidly changing technology in response to current military needs.

  1. Coupled process modeling and waste package performance

    SciTech Connect

    McGrail, B.P.; Engel, D.W.

    1992-11-01

    The interaction of borosilicate waste glasses with water has been studied extensively and reasonably good models are available that describe the reaction kinetics and solution chemical effects. Unfortunately, these models have not been utilized in performance assessment analyses, except in estimating radionuclide solubilities at the waste form surface. A geochemical model has been incorporated in the AREST code to examine the coupled processes of glass dissolution and transport within the engineering barrier system. Our calculations show that the typical assumptions used in performance assessment analyses, such as fixed solubilities or constant reaction rate at the waste form surface, do not always give conservative or realistic predictions of radionuclide release. Varying the transport properties of the waste package materials is shown to give counterintuitive effects on the release rates of some radionuclides. The use of noncoupled performance assessment models could lead a repository designer to an erroneous conclusion regarding the relative benefit of one waste package design or host rock setting over another.

  2. Organizational Agility and Complex Enterprise System Innovations: A Mixed Methods Study of the Effects of Enterprise Systems on Organizational Agility

    ERIC Educational Resources Information Center

    Kharabe, Amol T.

    2012-01-01

    Over the last two decades, firms have operated in "increasingly" accelerated "high-velocity" dynamic markets, which require them to become "agile." During the same time frame, firms have increasingly deployed complex enterprise systems--large-scale packaged software "innovations" that integrate and automate…

  3. Peer Review Process and Accreditation of Models

    DTIC Science & Technology

    1990-02-02

    AD-A268 573: Peer Review Process and Accreditation of Models (ASQBG-A-89-010), AIRMICS, 2 February 1990. The available record reproduces only scanned front matter and a table of contents (Executive Summary; Introduction: Purpose, Background, Current Issues; Previous Peer Reviews); no abstract is included.

  4. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

    An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop utilizing real-time process data from the sensor groups to make production decisions (stop production and repair machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven (see Fig. 1) elements critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.

  5. Glacier lake outburst floods - modelling process chains

    NASA Astrophysics Data System (ADS)

    Schaub, Yvonne; Huggel, Christian; Haeberli, Wilfried

    2013-04-01

    New lakes are forming in high-mountain areas all over the world due to glacier recession. Often they will be located below steep, destabilized flanks and are therefore exposed to impacts from rock-/ice-avalanches. Several events are known worldwide in which an outburst flood was triggered by such an impact. In regions such as the European Alps or the Cordillera Blanca in Peru, where valley bottoms are densely populated, these far-travelling, high-magnitude events can result in major disasters. For appropriate integral risk management it is crucial to gain knowledge on how the processes (rock-/ice-avalanches - impact waves in lake - impact on dam - outburst flood) interact and how the hazard potential related to corresponding process chains can be assessed. Research in natural hazards so far has mainly concentrated on describing, understanding, modeling or assessing single hazardous processes. Some of the above-mentioned individual processes are quite well understood in their physical behavior and some of the process interfaces have also been investigated in detail. Multi-hazard assessments of the entire process chain, however, have only recently become subjects of investigation. Our study aims at closing this gap and providing suggestions on how to assess the hazard potential of the entire process chain in order to generate hazard maps and support risk assessments. Based on the literature, we analyzed different types of models (empirical, analytical, physically based) for each process regarding their suitability for hazard assessments of the entire process chain. Results show that for rock-/ice-avalanches, dam breach and outburst floods, only numerical, physically based models are able to provide the required information, whereas the impact wave can be estimated by means of physically based or empirical assessments. We demonstrate how the findings could be applied with the help of a case study of a recent glacier lake outburst event at Laguna

  6. Theoretical model of crystal growth shaping process

    NASA Astrophysics Data System (ADS)

    Tatarchenko, V. A.; Uspenski, V. S.; Tatarchenko, E. V.; Nabot, J. Ph.; Duffar, T.; Roux, B.

    1997-10-01

    A theoretical investigation of the crystal growth shaping process is carried out on the basis of the dynamic stability concept. The capillary dynamic stability of shaped crystal growth processes for various forms of the liquid menisci is analyzed using the mathematical model of the phenomena in the axisymmetric case. The catching boundary condition of the capillary boundary problem is considered and the limits of its application for shaped crystal growth modeling are discussed. The static stability of a liquid free surface is taken into account by means of the Jacobi equation analysis. The result is that a large number of menisci having drop-like shapes are statically unstable. A few new non-traditional liquid meniscus shapes (e.g., bubbles and related shapes) are proposed for the case of a catching boundary condition.

  7. Expert Models and Modeling Processes Associated with a Computer-Modeling Tool

    ERIC Educational Resources Information Center

    Zhang, BaoHui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-01-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using "think aloud" technique…

  8. Analytical Modeling of High Rate Processes.

    DTIC Science & Technology

    2007-11-02

    Final report (13 Apr 98) covering 01 Sep 94 - 31 Aug 97: Analytical Modeling of High Rate Processes, S. E. Jones, University Research Professor, Department of Aerospace Engineering and Mechanics, University of Alabama, with contributions from Sandor Augustus and Jeffrey A. Drinkard. The available record reproduces only report documentation and transmittal pages; no abstract is included.

  9. Which Process Model Practices Support Project Success?

    NASA Astrophysics Data System (ADS)

    Lepmets, Marion

    In this research, the relevance of the guidance of software process models to industry was studied - more precisely, how relevant basic project management practices are to industry projects and to the success of these projects. The focus of the research is on project management and its related practices - the processes that support the achievement of capability levels 1 and 2 in CMMI and ISO/IEC 15504. These project management practices can also be viewed as best practices, the application of which can lead to project success. We aimed to discover whether the implementation of basic project management practices supports project success. There is evidence that higher process capability supports increased project performance; the question that remains is how significant the basic project management practices are to project performance.

  10. Software Model Of Software-Development Process

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Necessary to include easily identifiable and more-readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.

  11. Near Field Environment Process Model Report

    SciTech Connect

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers as well as potentially changing the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  12. Impact of numerical models on fragmentation processes

    NASA Astrophysics Data System (ADS)

    Renouf, Mathieu; Gezahengn, Belien; Abbas, Micheline; Bourgeois, Florent

    2013-06-01

    Simulating fragmentation processes in granular assemblies is a challenging problem that dates back to the beginning of the 1990s. While early approaches focused on the fragmentation of a single particle, the development of robust, fast numerical methods now makes it possible to simulate such processes in large collections of particles. The question of how to model fragmentation nevertheless remains open: should fragmentation be performed dynamically (one particle becoming two fragments), and according to which criterion, or should the fragment paths be defined initially, and what is the impact of the discretization and of the fragment model? The present contribution investigates the second aspect, i.e. the impact of fragment modeling on the fragmentation process. To perform such an analysis, the geometry of the fragments (disks/spheres or polygons/polyhedra), their behavior (rigid/deformable), and the law governing their interactions are first investigated. Such a model is then used in a grinding application in which the evolution of the fragments and their impact on the behavior of the whole packing are investigated.

  13. CROW{trademark} process modeling. Final report

    SciTech Connect

    1996-01-01

    The Western Research Institute (WRI) has patented a technology (CROW{trademark}) for the recovery of oily contaminants from water-saturated formations. The CROW process uses either hot water or low-pressure steam to flush contaminants to the surface by means of production wells. CROW is typically applied to highly permeable aquifers that have been invaded by organics such as coal tars or chemical solvents. In conceptualizing a model of the CROW process, we draw an analogy between flushing organics from an organic-contaminated aquifer and producing oil from a petroleum reservoir. The organic-contaminated aquifer can be represented as a petroleum reservoir. The injection of water or steam and production of water/organic admixtures can be described by standard reservoir well equations. Finally, the movement of organic and water within the aquifer can be represented by Darcy flow of the individual phases. Thus, in modeling the CROW process, it is reasonable to assume that a petroleum reservoir simulator would accurately portray the recovery of organics from a contaminated aquifer. Of course, the reservoir simulator would need to incorporate thermal aspects of Darcy flow to accurately represent recovery during CROW processing.

  14. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  15. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    PubMed

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  16. Thermal Modeling of A Friction Bonding Process

    SciTech Connect

    John Dixon; Douglas Burkes; Pavel Medvedev

    2007-10-01

    A COMSOL model capable of predicting temperature evolution during nuclear fuel fabrication is being developed at the Idaho National Laboratory (INL). Fuel plates are fabricated by friction bonding (FB) uranium-molybdenum (U-Mo) alloy foils positioned between two aluminum plates. The ability to predict temperature distribution during fabrication is imperative to ensure good quality bonding without inducing an undesirable chemical reaction between U-Mo and aluminum. A three-dimensional heat transfer model of the FB process implementing shallow pin penetration for cladding monolithic nuclear fuel foils is presented. Temperature distribution during the FB process as a function of fabrication parameters such as weld speed, tool load, and tool rotational frequency are predicted. Model assumptions, settings, and equations are described in relation to standard friction stir welding. Current experimental design for validation and calibration of the model is also demonstrated. Resulting experimental data reveal the accuracy in describing asymmetrical temperature distributions about the tool face. Temperature of the bonded plate drops beneath the pin and is higher on the advancing side than the retreating side of the tool.

  17. System Engineering Concept Demonstration, Process Model. Volume 3

    DTIC Science & Technology

    1992-12-01

    the results of SECD Process Model Task. The SECD Process Model is a system acquisition and development model that emphasizes System Engineering...activities over the entire system lifecycle. The Process Model is a graphical representation of the System Engineering Lifecycle activities, agents, flows...feedbacks, and work products. This interactive Process Model provides a multi-dimensional view of government acquisition and contractor development

  18. Modeling Dynamic Regulatory Processes in Stroke.

    SciTech Connect

    McDermott, Jason E.; Jarman, Kenneth D.; Taylor, Ronald C.; Lancaster, Mary J.; Shankaran, Harish; Vartanian, Keri B.; Stevens, S.L.; Stenzel-Poore, Mary; Sanfilippo, Antonio P.

    2012-10-11

    The ability to examine in silico the behavior of biological systems can greatly accelerate the pace of discovery in disease pathologies, such as stroke, where in vivo experimentation is lengthy and costly. In this paper we describe an approach to in silico examination of blood genomic responses to neuroprotective agents and subsequent stroke through the development of dynamic models of the regulatory processes observed in the experimental gene expression data. First, we identified functional gene clusters from these data. Next, we derived ordinary differential equations (ODEs) relating regulators and functional clusters from the data. These ODEs were used to develop dynamic models that simulate the expression of regulated functional clusters using system dynamics as the modeling paradigm. The dynamic model has the considerable advantage of only requiring an initial starting state, and does not require measurement of regulatory influences at each time point in order to make accurate predictions. The manipulation of input model parameters, such as changing the magnitude of gene expression, made it possible to assess the behavior of the networks through time under varying conditions. We report that an optimized dynamic model can provide accurate predictions of overall system behavior under several different preconditioning paradigms.
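
    A hedged sketch of the general approach (the actual regulators, functional clusters, and fitted coefficients are not reproduced here): an ordinary differential equation in which a regulator's expression drives the aggregate expression of a functional gene cluster, integrated forward from an initial state only:

      import numpy as np
      from scipy.integrate import solve_ivp

      k_act, k_deg = 0.8, 0.3          # hypothetical activation and decay rates (1/h)

      def regulator(t):
          # Assumed transient regulator induction after preconditioning (arbitrary units).
          return t * np.exp(-0.5 * t)

      def cluster_dynamics(t, y):
          # dy/dt = activation by the regulator - first-order decay of cluster expression
          return k_act * regulator(t) - k_deg * y

      sol = solve_ivp(cluster_dynamics, (0.0, 24.0), [0.0], dense_output=True)
      t = np.linspace(0.0, 24.0, 7)
      print(np.round(sol.sol(t)[0], 3))  # predicted cluster expression at 4 h intervals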

  19. Teachers as managers of the modelling process

    NASA Astrophysics Data System (ADS)

    Lingefjärd, Thomas; Meier, Stephanie

    2010-09-01

    The work in the Comenius Network project Developing Quality in Mathematics Education II (DQME II) has a main focus on development and evaluation of modelling tasks. One reason is the gap between what mathematical modelling is and what is taught in mathematical classrooms. This article deals with one modelling task and focuses on how two teachers handle this task in their classrooms. Initially, the notion of a teacher being the manager of the learning process is elaborated. Using criteria developed from taking this perspective, we analyse classroom sequences to determine the nature of "teaching like a manager" and the actions that are classroom evidence for working in this way. Conclusions include recommendations for how to realise "acting like a manager" in mathematics classrooms.

  20. GREENSCOPE: A Method for Modeling Chemical Process ...

    EPA Pesticide Factsheets

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describe the indicators and provide absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is in advancing definitions of data needs for the many indicators of the taxonomy. Each of the indicators has specific data that is necessary for their calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua
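
    The best-case/worst-case limit idea can be sketched as a simple percent score; the indicator names and limit values below are placeholders, not GREENSCOPE's published limits:

      def indicator_score(actual, worst, best):
          """Return a 0-100 % score, where 100 % means the best-case (target) limit."""
          return 100.0 * (actual - worst) / (best - worst)

      indicators = {
          # name: (actual value, worst-case limit, best-case limit)
          "E-factor (kg waste/kg product)": (5.0, 50.0, 0.0),
          "Energy intensity (MJ/kg product)": (30.0, 200.0, 10.0),
      }
      for name, (actual, worst, best) in indicators.items():
          print(f"{name}: {indicator_score(actual, worst, best):.1f} % of target")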

  1. Agile supply chain capabilities: emerging patterns as a determinant of competitive objectives

    NASA Astrophysics Data System (ADS)

    Yusuf, Yahaya Y.; Adeleye, E. O.; Sivayoganathan, K.

    2001-10-01

    Turbulent change caused by factors such as changing customer and technological requirements threatens manufacturers through shorter product life cycles, lower profits and bleak survival prospects. Therefore, several companies are stressing flexibility and agility in order to respond, in real time, to the unique needs of customers and markets. However, the resource competencies required are often difficult for single companies to mobilise and retain. It is therefore imperative for companies to co-operate and leverage complementary competencies. To this end, legally separate and spatially distributed companies are becoming integrated through Internet-based technologies. The paper reviews emerging patterns in supply chain integration. It also explores the relationship between the emerging patterns and attainment of competitive objectives. The results reported in the paper are based on data from a survey by questionnaire. The survey involved 600 companies in the UK, as part of a larger study of agile manufacturing. The study was driven by a conceptual model, which relates supply chain practices to competitive objectives. The analysis involves the use of factor analysis to reduce research variables to a few principal components. Subsequently, multiple regression was conducted to study the relationship amongst the reduced variables. The results validate the proposed conceptual model and lend credence to current thinking that supply chain integration is a vital tool for competitive advantage.
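
    The two-step analysis described above can be sketched with synthetic survey data, using principal component analysis as a stand-in for the factor analysis reported in the study, followed by multiple regression of a competitive-objective score on the reduced components:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(42)
      X = rng.normal(size=(600, 12))     # 600 firms x 12 supply chain practice items (synthetic)
      y = X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=600)   # made-up objective score

      components = PCA(n_components=3).fit_transform(X)    # reduce to a few components
      reg = LinearRegression().fit(components, y)           # multiple regression on components
      print("R^2 on reduced components:", round(reg.score(components, y), 2))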

  2. Multiphase Flow Modeling of Biofuel Production Processes

    SciTech Connect

    D. Gaston; D. P. Guillen; J. Tester

    2011-06-01

    As part of the Idaho National Laboratory's (INL's) Secure Energy Initiative, the INL is performing research in areas that are vital to ensuring clean, secure energy supplies for the future. The INL Hybrid Energy Systems Testing (HYTEST) Laboratory is being established to develop and test hybrid energy systems with the principal objective to safeguard U.S. Energy Security by reducing dependence on foreign petroleum. HYTEST involves producing liquid fuels in a Hybrid Energy System (HES) by integrating carbon-based (i.e., bio-mass, oil-shale, etc.) with non-carbon based energy sources (i.e., wind energy, hydro, geothermal, nuclear, etc.). Advances in process development, control and modeling are the unifying vision for HES. This paper describes new modeling tools and methodologies to simulate advanced energy processes. Needs are emerging that require advanced computational modeling of multiphase reacting systems in the energy arena, driven by the 2007 Energy Independence and Security Act, which requires production of 36 billion gal/yr of biofuels by 2022, with 21 billion gal of this as advanced biofuels. Advanced biofuels derived from microalgal biomass have the potential to help achieve the 21 billion gal mandate, as well as reduce greenhouse gas emissions. Production of biofuels from microalgae is receiving considerable interest due to their potentially high oil yields (around 600 gal/acre). Microalgae have a high lipid content (up to 50%) and grow 10 to 100 times faster than terrestrial plants. The use of environmentally friendly alternatives to solvents and reagents commonly employed in reaction and phase separation processes is being explored. This is accomplished through the use of hydrothermal technologies, which are chemical and physical transformations in high-temperature (200-600 C), high-pressure (5-40 MPa) liquid or supercritical water. Figure 1 shows a simplified diagram of the production of biofuels from algae. Hydrothermal processing has significant

  3. TUNS/TCIS information model/process model

    NASA Technical Reports Server (NTRS)

    Wilson, James

    1992-01-01

    An Information Model comprises graphical and textual notation suitable for describing and defining the problem domain - in our case, TUNS or TCIS. The model focuses on the real world under study. It identifies what is in the problem and organizes the data into a formal structure for documentation and communication purposes. The Information Model is composed of an Entity Relationship Diagram (ERD) and a Data Dictionary component. The combination of these components provides an easy-to-understand methodology for expressing the entities in the problem space, the relationships between entities and the characteristics (attributes) of the entities. This approach is the first step in information system development. The Information Model identifies the complete set of data elements processed by TUNS. This representation provides a conceptual view of TUNS from the perspective of entities, data, and relationships. The Information Model reflects the business practices and real-world entities that users must deal with.

  4. Modeling of an Active Tablet Coating Process.

    PubMed

    Toschkoff, Gregor; Just, Sarah; Knop, Klaus; Kleinebudde, Peter; Funke, Adrian; Djuric, Dejan; Scharrer, Georg; Khinast, Johannes G

    2015-12-01

    Tablet coating is a common unit operation in the pharmaceutical industry, during which a coating layer is applied to tablet cores. The coating uniformity of tablets in a batch is especially critical for active coating, that is, coating that contains an active pharmaceutical ingredient. In recent years, discrete element method (DEM) simulations became increasingly common for investigating tablet coating. In this work, DEM was applied to model an active coating process as closely as possible, using measured model parameters and non-spherical particles. We studied how operational conditions (rotation speed, fill level, number of nozzles, and spray rate) influence the coating uniformity. To this end, simulation runs were planned and interpreted according to a statistical design of (simulation) experiments. Our general goal was to achieve a deeper understanding of the process in terms of residence times and dimensionless scaling laws. With that regard, the results were interpreted in light of analytical models. The results were presented at various detail levels, ranging from an overview of all variations to in-depth considerations. It was determined that the biggest uniformity improvement in a realistic setting was achieved by increasing the number of spray nozzles, followed by increasing the rotation speed and decreasing the fill level.

  5. Model systems for life processes on Mars

    NASA Technical Reports Server (NTRS)

    Mitz, M. A.

    1974-01-01

    In the evolution of life forms nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process has problems. On Mars, the high intensity of light at the surface is a concern and alternative mechanisms need to be defined and analyzed. In the UV search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

  6. A Markovian Process Modeling for Pickomino

    NASA Astrophysics Data System (ADS)

    Cardon, Stéphane; Chetcuti-Sperandio, Nathalie; Delorme, Fabien; Lagrue, Sylvain

    This paper deals with a nondeterministic game based on die rolls and on the "stop or continue" principle: Pickomino. During each turn, a player has to make the best decisions: first which dice to keep, then whether to continue or stop, depending on the previous rolls and on the available resources. Markov Decision Processes (MDPs) offer a formal framework for modeling this game. The two main problems are first to determine the set of states and then to compute the transition probabilities.
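    One concrete ingredient of such transition probabilities is the distribution of face counts obtained when rolling a handful of dice. The sketch below enumerates that distribution exactly; it is a simplified illustration that deliberately ignores the full Pickomino state and action encoding (kept faces, remaining dice, available tiles), which the paper itself addresses.

```python
# Probability of each possible outcome (count of every face) when rolling
# n fair six-sided dice -- one building block of the game's transition model.
from itertools import product
from collections import Counter
from fractions import Fraction

def roll_distribution(n_dice, n_faces=6):
    """Map each face-count tuple to its exact probability."""
    outcomes = Counter()
    for roll in product(range(1, n_faces + 1), repeat=n_dice):
        counts = tuple(roll.count(face) for face in range(1, n_faces + 1))
        outcomes[counts] += 1
    total = n_faces ** n_dice
    return {counts: Fraction(k, total) for counts, k in outcomes.items()}

dist = roll_distribution(3)
print(len(dist), "distinct outcomes for 3 dice")   # 56 multisets of faces
print(dist[(3, 0, 0, 0, 0, 0)])                    # P(three ones) = 1/216
```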

  7. Time models and cognitive processes: a review

    PubMed Central

    Maniadakis, Michail; Trahanias, Panos

    2014-01-01

    The sense of time is an essential capacity of humans, with a major role in many of the cognitive processes expressed in our daily lives. So far, in cognitive science and robotics research, mental capacities have been investigated in a theoretical and modeling framework that largely neglects the flow of time. Only recently has there been a rather limited, but constantly increasing, interest in the temporal aspects of cognition, integrating time into a range of different models of perceptuo-motor capacities. The current paper aims to review existing works in the field and suggest directions for fruitful future work. This is particularly important for the newly developed field of artificial temporal cognition that is expected to significantly contribute to the development of sophisticated artificial agents seamlessly integrated into human societies. PMID:24578690

  8. Sprint, agility, strength and endurance capacity in wheelchair basketball players

    PubMed Central

    Granados, C; Otero, M; Badiola, A; Olasagasti, J; Bidaurrazaga-Letona, I; Iturricastillo, A; Gil, SM

    2014-01-01

    The aims of the present study were, firstly, to determine the reliability and reproducibility of an agility T-test and the Yo-Yo 10 m recovery test; and secondly, to analyse the physical characteristics measured by sprint, agility, strength and endurance field tests in wheelchair basketball (WB) players. 16 WB players (33.06 ± 7.36 years, 71.89 ± 21.71 kg and sitting body height 86.07 ± 6.82 cm) belonging to the national WB league participated in this study. Wheelchair sprint (5 and 20 m without ball, and 5 and 20 m with ball), agility (T-test and pick-up test), strength (handgrip and maximal pass) and endurance (Yo-Yo 10 m recovery test) tests were performed. The T-test and Yo-Yo 10 m recovery test showed good reproducibility values (intraclass correlation coefficient, ICC = 0.74-0.94). The WB players’ results in the 5 and 20 m sprints without a ball were 1.87 ± 0.21 s and 5.70 ± 0.43 s, and with a ball 2.10 ± 0.30 s and 6.59 ± 0.61 s, which were better than those reported in the literature. Regarding the pick-up test results (16.05 ± 0.52 s) and maximal pass (8.39 ± 1.77 m), players showed worse values than those obtained in elite players. The main contribution of the present study is the characterization of the physical performance profile of WB players using a field test battery. Furthermore, we demonstrated that the agility T-test and the aerobic Yo-Yo 10 m recovery test are reliable; consequently they may be appropriate instruments for measuring physical fitness in WB. PMID:25729153

  9. In Pursuit of Agile Acquisition: Are We There Yet?

    DTIC Science & Technology

    2013-03-01

    bureaucracy in the methodology, and avoid promoting activities that would further expand regulatory guidance and oversight to improve agility. Once an...through an integrated digital system called Blue Force Tracker . Instead of the lack of situational awareness, units now use streaming video to help...to trace forensics collected at other crime scenes or events and trace the data back to specific individuals thus identifying dangerous insurgents

  10. ROADM architectures and technologies for agile optical networks

    NASA Astrophysics Data System (ADS)

    Eldada, Louay A.

    2007-02-01

    We review the different optoelectronic component and module technologies that have been developed for use in ROADM subsystems, and describe their principles of operation, designs, features, advantages, and challenges. We also describe the various needs for reconfigurable optical add/drop switching in agile optical networks. For each network need, we present the different ROADM subsystem architecture options with their pros and cons, and describe the optoelectronic technologies supporting each architecture.

  11. Laser agile illumination for object tracking and classification - Feasibility study

    NASA Technical Reports Server (NTRS)

    Scholl, Marija S.; Vanzyl, Jakob J.; Meinel, Aden B.; Meinel, Marjorie P.; Scholl, James W.

    1988-01-01

    The 'agile illumination' concept for discrimination between ICBM warheads and decoys involves a two-aperture illumination with coherent light, diffraction of light by propagation, and a resulting interference pattern on the object surface. A scanning two-beam interference pattern illuminates one object at a time; depending on the shape, momentum, spinning, and tumbling characteristics of the interrogated object, different temporal signals will be obtained for different classes of objects.

  12. Force Projection, Strategic Agility and the Big Meltdown

    DTIC Science & Technology

    2001-05-18

    UNLIMITED Number of Pages 29 ii Abstract of FORCE PROJECTION, STRATEGIC AGILITY AND THE BIG MELTDOWN Due to global warming , the polar icepack which...INTRODUCTION The polar icecap which covers the Arctic Ocean is melting. It is a well-known, scientific fact. Global warming is the generally...operational factors and functions, as applicable. 3 CHAPTER II BACKGROUND Global Warming and the Arctic During this and the last century, researchers have

  13. Network architecture design of an agile sensing system with sandwich wireless sensor nodes

    NASA Astrophysics Data System (ADS)

    Dorvash, S.; Li, X.; Pakzad, S.; Cheng, L.

    2012-04-01

    Wireless sensor network (WSN) is recently emerged as a powerful tool in the structural health monitoring (SHM). Due to the limitations of wireless channel capacity and the heavy data traffic, the control on the network is usually not real time. On the other hand, many SHM applications require quick response when unexpected events, such as earthquake, happen. Realizing the need to have an agile monitoring system, an approach, called sandwich node, was proposed. Sandwich is a design of complex sensor node where two Imote2 nodes are connected with each other to enhance the capabilities of the sensing units. The extra channel and processing power, added into the nodes, enable agile responses of the sensing network, particularly in interrupting the network and altering the undergoing tasks for burst events. This paper presents the design of a testbed for examination of the performance of wireless sandwich nodes in a network. The designed elements of the network are the software architecture of remote and local nodes, and the triggering strategies for coordinating the sensing units. The performance of the designed network is evaluated through its implementation in a monitoring test in the laboratory. For both original Imote2 and the sandwich node, the response time is estimated. The results show that the sandwich node is an efficient solution to the collision issue in existing interrupt approaches and the latency in dense wireless sensor networks.

  14. Clustering-based urbanisation to improve enterprise information systems agility

    NASA Astrophysics Data System (ADS)

    Imache, Rabah; Izza, Said; Ahmed-Nacer, Mohamed

    2015-11-01

    Enterprises face daily pressure to demonstrate their ability to adapt quickly to unpredictable technological, social, legislative and competitive changes in a globalised environment. Thus, to secure its place in this hard context, an enterprise must always be agile and must ensure its sustainability through continuous improvement of its information system (IS). Therefore, the agility of enterprise information systems (EISs) can be considered a primary objective of any enterprise. One way of achieving this objective is urbanisation of the EIS in the context of continuous improvement, to make it a real asset serving enterprise strategy. This paper investigates the benefits of EIS urbanisation based on clustering techniques as a driver for producing and/or improving agility, helping managers and IT management departments to continuously improve the performance of the enterprise and make appropriate decisions within the scope of the enterprise's objectives and strategy. This approach is applied to the urbanisation of a tour operator EIS.

  15. Observing peculiar γ-ray pulsars with AGILE

    NASA Astrophysics Data System (ADS)

    Pilia, M.; Pellizzoni, A.

    2011-08-01

    The AGILE γ-ray satellite provides large sky exposure levels (>=10^9 cm^2 s per year on the Galactic Plane) with sensitivity peaking at E ~100 MeV, where the bulk of pulsar energy output is typically released. Its ~1 μs absolute time tagging capability makes it perfectly suited for the study of γ-ray pulsars. AGILE collected a large number of γ-ray photons from EGRET pulsars (>=40,000 pulsed counts for Vela) in two years of observations, unveiling new interesting features at the sub-millisecond level in the pulsars' high-energy light curves, γ-ray emission from pulsar glitches, and Pulsar Wind Nebulae. AGILE detected about 20 nearby and energetic pulsars with good confidence through timing and/or spatial analysis. Among the newcomers we find pulsars with very high rotational energy losses, such as the remarkable PSR B1509-58 with a magnetic field in excess of 10^13 Gauss, and PSR J2229+6114 providing a reliable identification for the previously unidentified EGRET source 3EG2227+6122. Moreover, the powerful millisecond pulsar B1821-24, in the globular cluster M28, is detected during a fraction of the observations.

  16. The reliability of a functional agility test for water polo.

    PubMed

    Tucher, Guilherme; de Souza Castro, Flávio Antônio; Garrido, Nuno Domingos; Martins da Silva, António José Rocha

    2014-06-28

    Few functional agility tests for water polo take into consideration its specific characteristics. The preliminary objective of this study was to evaluate the reliability of an agility test for water polo players. Fifteen players (16.3 ± 1.8 years old) with a minimum of two years of competitive experience were evaluated. A Functional Test for Agility Performance (FTAP) was designed to represent the context of this sport. Several trials were performed to familiarize the athletes with the movement. Two experienced coaches measured three repetitions of the FTAP. Descriptive statistics, repeated measures analysis of variance (ANOVA), 95% limits of agreement (LOA), the intraclass correlation coefficient (ICC) and the standard error of measurement (SEM) were used for data analysis. The adopted reliability criteria were considered to be met. There was no significant difference between the repetitions (p > 0.05), which may be explained by an effect of the evaluator, the ability of the players or fatigue. The average ICC across evaluators was high (0.88). The SEM varied between 0.13 s and 0.49 s. The average coefficient of variation (CV) for each individual was near 6-7%. These values depended on the condition of measurement. As the FTAP contains some characteristics that create a degree of unpredictability, the same athlete may reach different performance results, increasing variability. Adjustments to the sample, familiarization and careful selection of subjects help to improve this situation and enhance the reliability of the indicators.
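    The reliability statistics reported here (ICC and SEM) can be computed from a players-by-repetitions table via a two-way ANOVA decomposition. The sketch below uses one common ICC form, ICC(3,1) for consistency; the record does not state which variant the authors used, and the timing data are invented.

```python
# Two-way ANOVA-based reliability sketch: ICC(3,1) and SEM for repeated
# trials of a timed agility test. The times below are made-up examples.
import numpy as np

times = np.array([          # rows = players, columns = three FTAP repetitions (s)
    [8.1, 8.3, 8.0],
    [9.4, 9.1, 9.3],
    [7.8, 8.0, 7.9],
    [8.9, 9.0, 8.7],
])
n, k = times.shape
grand = times.mean()
ss_total = np.sum((times - grand) ** 2)
ss_subjects = k * np.sum((times.mean(axis=1) - grand) ** 2)
ss_trials = n * np.sum((times.mean(axis=0) - grand) ** 2)

ms_subjects = ss_subjects / (n - 1)
ms_error = (ss_total - ss_subjects - ss_trials) / ((n - 1) * (k - 1))

icc_3_1 = (ms_subjects - ms_error) / (ms_subjects + (k - 1) * ms_error)
sem = np.sqrt(ms_error)     # standard error of measurement in seconds
print(f"ICC(3,1) = {icc_3_1:.2f}, SEM = {sem:.3f} s")
```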

  17. Agile Science Operations: A New Approach for Primitive Exploration Bodies

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.; Thompson, David R.; Castillo-Rogez, Julie C.; Doyle, Richard; Estlin, Tara; Mclaren, David

    2012-01-01

    Primitive body exploration missions such as potential Comet Surface Sample Return or Trojan Tour and Rendezvous would challenge traditional operations practices. Earth-based observations would provide only basic understanding before arrival and many science goals would be defined during the initial rendezvous. It could be necessary to revise trajectories and observation plans to quickly characterize the target for safe, effective observations. Detection of outgassing activity and monitoring of comet surface activity are even more time constrained, with events occurring faster than round-trip light time. "Agile science operations" address these challenges with contingency plans that recognize the intrinsic uncertainty in the operating environment and science objectives. Planning for multiple alternatives can significantly improve the time required to repair and validate spacecraft command sequences. When appropriate, time-critical decisions can be automated and shifted to the spacecraft for immediate access to instrument data. Mirrored planning systems on both sides of the light-time gap permit transfer of authority back and forth as needed. We survey relevant science objectives, identifying time bottlenecks and the techniques that could be used to speed missions' reaction to new science data. Finally, we discuss the results of a trade study simulating agile observations during flyby and comet rendezvous scenarios. These experiments quantify instrument coverage of key surface features as a function of planning turnaround time. Careful application of agile operations techniques can play a significant role in realizing the Decadal Survey plan for primitive body exploration

  18. Development of a Comprehensive Weld Process Model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area and that of LMES to develop computer models and simulation software for welding processes. This development has significant impact on the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, the model does not account for multipass welds, microstructural evolution, distortion and residual stresses. Additionally, the model requires large resources of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above. The following technical tasks have been accomplished as part of the CRADA. 1. The LMES welding code has been ported to the Intel Paragon parallel computer at ORNL

  19. Computer vision challenges and technologies for agile manufacturing

    NASA Astrophysics Data System (ADS)

    Molley, Perry A.

    1996-02-01

    applicable to commercial production processes and applications. Computer vision will play a critical role in the new agile production environment for automation of processes such as inspection, assembly, welding, material dispensing and other process control tasks. Although there are many academic and commercial solutions that have been developed, none have had widespread adoption considering the huge potential number of applications that could benefit from this technology. The reason for this slow adoption is that the advantages of computer vision for automation can be a double-edged sword. The benefits can be lost if the vision system requires an inordinate amount of time for reprogramming by a skilled operator to account for different parts, changes in lighting conditions, background clutter, changes in optics, etc. Commercially available solutions typically require an operator to manually program the vision system with features used for the recognition. In a recent survey, we asked a number of commercial manufacturers and machine vision companies the question, 'What prevents machine vision systems from being more useful in factories?' The number one (and unanimous) response was that vision systems require too much skill to set up and program to be cost effective.

  20. Computer modeling of complete IC fabrication process

    NASA Astrophysics Data System (ADS)

    Dutton, Robert W.

    1987-05-01

    The development of fundamental algorithms for process and device modeling as well as novel integration of the tools for advanced Integrated Circuit (IC) technology design is discussed. The development of the first complete 2D process simulator, SUPREM 4, is reported. The algorithms are discussed as well as application to local-oxidation and extrinsic diffusion conditions which occur in CMOS and BiCMOS technologies. The evolution of 1D (SEDAN) and 2D (PISCES) device analysis is discussed. The application of SEDAN to a variety of non-silicon technologies (GaAs and HgCdTe) is considered. A new multi-window analysis capability for PISCES which exploits Monte Carlo analysis of hot carriers has been demonstrated and used to characterize a variety of silicon MOSFET and GaAs MESFET effects. A parallel computer implementation of PISCES has been achieved using a Hypercube architecture. The PISCES program has been used for a range of important device studies including latchup, analog switch analysis, MOSFET capacitance studies and bipolar transient device analysis for ECL gates. The program is broadly applicable to RAM and BiCMOS technology analysis and design. In the analog switch technology area this research effort has produced a variety of important modeling advances.

  1. Stress Process Model for Individuals With Dementia

    PubMed Central

    Judge, Katherine S.; Menne, Heather L.; Whitlatch, Carol J.

    2010-01-01

    Purpose: Individuals with dementia (IWDs) face particular challenges in managing and coping with their illness. The experience of dementia may be affected by the etiology, stage, and severity of symptoms, preexisting and related chronic conditions, and available informal and formal supportive services. Although several studies have examined particular features of IWD’s illness experience, few draw upon a conceptual model that outlines the global illness experience and the resulting stressors that commence with symptom onset, proliferate over time, and continue through the later stages of cognitive loss. Building on the work of Pearlin and colleagues (1990, Caregiving and the stress process: An overview of concepts and their measures. The Gerontologist, 30, 583–594), this article proposes a stress process model (SPM) for IWDs that conceptualizes and examines the illness experience of IWDs. Implications: The proposed SPM for IWDs serves as a guide to (a) consider and understand the short- and long-term complexities of the illness experience for IWDs, (b) investigate specific hypotheses by outlining key stressors in the illness experience and by positing relationships among stressors and outcomes, and (c) help inform the development of interventions to prevent or reduce the negative stressors and enhance the positive experiences of living with a dementing illness. PMID:20022935

  2. Simulation model of clastic sedimentary processes

    SciTech Connect

    Tetzlaff, D.M.

    1987-01-01

    This dissertation describes SEDSIM, a computer model that simulates erosion, transport, and deposition of clastic sediments by free-surface flow in natural environments. SEDSIM is deterministic and is applicable to sedimentary processes in rivers, deltas, continental shelves, submarine canyons, and turbidite fans. The model is used to perform experiments in clastic sedimentation. Computer experimentation is limited by the available computing power, but is free from the scaling problems associated with laboratory experiments. SEDSIM responds to information provided to it at the outset of a simulation experiment, including topography, subsurface configuration, physical parameters of fluid and sediment, and characteristics of sediment sources. Extensive computer graphics are incorporated in SEDSIM. The user can display the three-dimensional geometry of simulated deposits in the form of successions of contour maps, perspective diagrams, vector plots of current velocities, and vertical sections of any azimuth orientation. The sections show both sediment age and composition. SEDSIM works realistically with processes involving channel shifting and topographic changes. Example applications include simulation of an ancient submarine canyon carved into a Cretaceous sequence in the National Petroleum Reserve in Alaska, known mainly from seismic sections, and a sequence of Tertiary age in the Golden Meadow oil field of Louisiana, known principally from well logs.

  3. Organizational Leadership Process for University Education

    ERIC Educational Resources Information Center

    Llamosa-Villalba, Ricardo; Delgado, Dario J.; Camacho, Heidi P.; Paéz, Ana M.; Valdivieso, Raúl F.

    2014-01-01

    This paper relates the "Agile School", an emerging archetype of the enterprise architecture: "Processes of Organizational Leadership" for leading and managing strategies, tactics and operations of forming in Higher Education Institutions. Agile School is a system for innovation and deep transformation of University Institutions…

  4. Towards a Framework for Using Agile Approaches in Global Software Development

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Ali Babar, Muhammad; Verner, June

    As agile methods and Global Software Development (GSD) become increasingly popular, GSD project managers have been exploring the viability of using agile approaches in their development environments. Despite the expected benefits of using an agile approach with a GSD project, the overall mechanisms for combining the two approaches are not clearly understood. To address this challenge, we propose a conceptual framework, based on the research literature. This framework is expected to aid a project manager in deciding which agile strategies are effective for a particular GSD project, taking into account project context. We use an industry-based case study to explore the components of our conceptual framework. Our case study is planned and conducted according to specific published case study guidelines. We identify the agile practices and agile supporting practices used by a GSD project manager in our case study and conclude with future research directions.

  5. Migration and Marriage: Modeling the Joint Process

    PubMed Central

    Jang, Joy Bohyun; Casterline, John B; Snyder, Anastasia

    2016-01-01

    Background: Previous research on inter-relations between migration and marriage has relied on overly simplistic assumptions about the structure of dependency between the two events. However, there is good reason to posit that each of the two transitions has an impact on the likelihood of the other, and that unobserved common factors may affect both migration and marriage, leading to a distorted impression of the causal impact of one on the other. Objective: We will investigate relationships between migration and marriage in the United States using data from the National Longitudinal Survey of Youth 1979. We allow for inter-dependency between the two events and examine whether unobserved common factors affect the estimates of both migration and marriage. Methods: We estimate a multi-process model in which migration and marriage are considered simultaneously in regression analysis and there is allowance for correlation between disturbances; the latter feature accounts for possible endogeneity between shared unobserved determinants. The model also includes random effects for persons, exploiting the fact that many people experience both events multiple times throughout their lives. Results: Unobserved factors appear to significantly influence both migration and marriage, resulting in upward bias in estimates of the effects of each on the other when these shared common factors are not accounted for. Estimates from the multi-process model indicate that marriage significantly increases the hazard of migration while migration does not affect the hazard of marriage. Conclusions: Omitting inter-dependency between life course events can lead to a mistaken impression of the direct effects of certain features of each event on the other. PMID:27182198

  6. Mechanochemical models of processive molecular motors

    NASA Astrophysics Data System (ADS)

    Lan, Ganhui; Sun, Sean X.

    2012-05-01

    Motor proteins are the molecular engines powering the living cell. These nanometre-sized molecules convert chemical energy, both enthalpic and entropic, into useful mechanical work. High resolution single molecule experiments can now observe motor protein movement with increasing precision. The emerging data must be combined with structural and kinetic measurements to develop a quantitative mechanism. This article describes a modelling framework in which quantitative understanding of motor behaviour can be developed based on the protein structure. The framework is applied to myosin motors, with emphasis on how synchrony between motor domains gives rise to processive unidirectional movement. The modelling approach shows that the elasticity of protein domains is important in regulating motor function. Simple models of protein domain elasticity are presented. The framework can be generalized to other motor systems, or to an ensemble of motors such as in muscle contraction. Indeed, for hundreds of myosins, our framework can be reduced to the Huxley-Simmons description of muscle movement in the mean-field limit.

  7. Modelling infiltration processes in frozen soils

    NASA Astrophysics Data System (ADS)

    Ireson, A. M.; Barbour, L. S.

    2014-12-01

    Understanding the hydrological processes in soils subject to significant freeze-thaw is fraught with "experimental vagaries and theoretical imponderables" (Miller 1980, Applications of soil physics). The infiltration of snowmelt water and the subsequent transmission of unfrozen water during thawing are governed by hydraulic conductivity values which change with both ice and unfrozen water content. Water held within pores is subject to capillary forces, which results in a freezing point depression (i.e. water remains in the liquid state slightly below 0°C). As the temperature drops below zero, water freezes first in the larger pores, and then in progressively smaller pores. Since the larger pores also are the first to empty by drainage, these pores may be air filled during freezing, while smaller water-filled pores freeze. This explains why an unsaturated, frozen soil may still have a considerable infiltration capacity. Infiltration into frozen soil is a critical phenomenon related to the risk of flooding in the Canadian prairies, controlling the partitioning of snowmelt into either infiltration or runoff. We propose a new model, based on conceptualizing the pore space as a bundle of capillary tubes (with significant differences from the capillary bundle model of Watanabe and Flury, 2008, WRR, doi:10.1029/2008WR007102), which allows any air-filled macropores to contribute to the potential infiltration capacity of the soil. The patterns of infiltration and water movement during freeze-thaw from the model are compared to field observations from the Canadian prairies and Boreal Plains.

  8. High-Speed Time-Series CCD Photometry with Agile

    NASA Astrophysics Data System (ADS)

    Mukadam, Anjum S.; Owen, R.; Mannery, E.; MacDonald, N.; Williams, B.; Stauffer, F.; Miller, C.

    2011-12-01

    We have assembled a high-speed time-series CCD photometer named Agile for the 3.5 m telescope at Apache Point Observatory, based on the design of a photometer called Argos at McDonald Observatory. Instead of a mechanical shutter, we use the frame-transfer operation of the CCD to end an exposure and initiate the subsequent new exposure. The frame-transfer operation is triggered by the negative edge of a GPS pulse; the instrument timing is controlled directly by hardware, without any software intervention or delays. This is the central pillar in the design of Argos that we have also used in Agile; this feature makes the accuracy of instrument timing better than a millisecond. Agile is based on a Princeton Instruments Acton VersArray camera with a frame-transfer CCD, which has 1K × 1K active pixels, each of size . Using a focal reducer at the Nasmyth focus of the 3.5 m telescope at Apache Point Observatory, we obtain a field of view of 2.2 × 2.2 arcmin^2 with an unbinned plate scale of 0.13″ pixel^-1. The CCD is back-illuminated and thinned for improved blue sensitivity and provides a quantum efficiency ≥80% in the wavelength range of 4500–7500 Å. The unbinned full-frame readout time can be as fast as 1.1 s; this is achieved using a low-noise amplifier operating at 1 MHz with an average read noise of the order of rms. At the slow read rate of 100 kHz to be used for exposure times longer than a few seconds, we determine an average read noise of the order of rms. Agile is optimized to observe variability at short timescales from one-third of a second to several hundred seconds. The variable astronomical sources routinely observed with Agile include pulsating white dwarfs, cataclysmic variables, flare stars, planetary transits, and planetary satellite occultations.

  9. Team-based work and work system balance in the context of agile manufacturing.

    PubMed

    Yauch, Charlene A

    2007-01-01

    Manufacturing agility is the ability to prosper in an environment characterized by constant and unpredictable change. The purpose of this paper is to analyze team attributes necessary to facilitate agile manufacturing, and using Balance Theory as a framework, it evaluates the potential positive and negative impacts related to these team attributes that could alter the balance of work system elements and resulting "stress load" experienced by persons working on agile teams. Teams operating within the context of agile manufacturing are characterized as multifunctional, dynamic, cooperative, and virtual. A review of the literature relevant to each of these attributes is provided, as well as suggestions for future research.

  10. Mechanical-mathematical modeling for landslide process

    NASA Astrophysics Data System (ADS)

    Svalova, V.

    2009-04-01

    500 m and displacement of a landslide in plan over 1 m. The last serious activation of the landslide took place in 2002, with a movement of 53 cm. Catastrophic activation of the deep block-glide landslide in the Khoroshevo area of Moscow took place in 2006-2007. A crack 330 m long appeared in the old sliding circus, along which a new 220 m long creeping block separated from the plateau and began sinking, with the displaced surface of the plateau reaching 12 m. Such activation of the landslide process had not been observed in Moscow since the mid-nineteenth century. The Khoroshevo sliding area had been stable for a long time without manifestations of activity. Revealing the causes of the deformation and developing means of protection against deep landslide motion is an extremely pressing and difficult problem whose solution is necessary for the preservation of valuable historical monuments and modern city constructions. The causes of activation and possible protective measures are discussed, and the structure of a monitoring system for urban territories is elaborated. A mechanical-mathematical model of a highly viscous fluid was used to model the behaviour of material on landslide slopes, based on the continuity equation and an approximate Navier-Stokes equation for slow motion in a thin layer. The modelling results make it possible to locate the point of highest velocity on the landslide surface, which could be the best position for a monitoring post. The model can also be used to calibrate monitoring equipment and to investigate some fundamental aspects of material movement on a landslide slope.
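    As a purely illustrative companion to the thin-layer creeping-flow model mentioned above (the authors' actual equations and parameter values are not given in this record), the sketch below evaluates the classical lubrication-approximation velocity profile for a viscous layer on an incline; all material properties are invented.

```python
# Gravity-driven creeping flow of a highly viscous thin layer on an incline,
# u(z) = (rho * g * sin(alpha) / mu) * (h*z - z**2 / 2), no slip at the base.
# The velocity is largest at the free surface (z = h), consistent with the
# abstract's remark that the highest velocity occurs on the landslide surface.
import numpy as np

rho, g, mu = 1.8e3, 9.81, 1.0e12     # made-up density (kg/m^3) and viscosity (Pa s)
alpha, h = np.radians(8.0), 5.0      # slope angle and layer thickness (m)

z = np.linspace(0.0, h, 6)           # depth coordinates from base to surface
u = rho * g * np.sin(alpha) / mu * (h * z - z**2 / 2.0)
for zi, ui in zip(z, u):
    print(f"z = {zi:4.1f} m   u = {ui * 3.156e7:5.2f} m/yr")   # m/s -> m/yr
```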

  11. Application of molecular modeling to biological processing

    NASA Astrophysics Data System (ADS)

    Lowrey, Alfred H.; Famini, George R.; Wick, Charles

    1993-07-01

    Detailed understanding of the molecular basis for biological processes is now available through computational modeling techniques. Advances in computational algorithms and technology allow applications to large biological macromolecules and permit the study of such problems as binding mechanisms, chemical reactivity, structural and conformational effects, and simulations of molecular motions. Recent crystallographic data provide access to detailed structural information that allows analysis and comparison of various computational techniques. Preliminary semiempirical studies on N-acetylneuraminic acid are presented as an example of computational studies on binding mechanisms. N-acetylneuraminic acid is a substituted carbohydrate, which is a recognition site for the binding of proteins (e.g., cholera toxin). These calculations provide some insight into electronic effects on binding in a crystal complex and into the effect of the molecular charge on hydrogen bonding in the crystal complex.

  12. The Agile Approach with Doctoral Dissertation Supervision

    ERIC Educational Resources Information Center

    Tengberg, Lars Göran Wallgren

    2015-01-01

    Several research findings conclude that many doctoral students fail to complete their studies within the allowable time frame, in part because of problems related to the research and supervision process. Surveys show that most doctoral students are generally satisfied with their dissertation supervision. However, these surveys also reveal some…

  13. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 3. Future to be Asset Sustainment Process Model.

    DTIC Science & Technology

    2007-11-02

    Models), contains the To-Be Retail Asset Sustainment Process Model displaying the activities and functions related to the improved processes for receipt...of a logistics process model for a more distant future asset sustainment scenario unconstrained by today’s logistics information systems limitations...It also contains a process model reflecting the Reengineering Team’s vision of the future asset sustainment process.

  14. Modeling Sound Processing in Cochlear Nuclei

    NASA Astrophysics Data System (ADS)

    Meddis, Ray

    2003-03-01

    The cochlear nucleus is an obligatory relay nucleus between the ear and the rest of the brain. It consists of many different types of neurons each responding differently to the same stimulus. Much is known about the wiring diagram of the system but it has so far proved difficult to characterise the signal processing that is going on or what purpose it serves. The solution to this problem is a pre-requisite of any attempt to produce a practical electronic simulation that exploits the brain's unique capacity to recognise the significance of acoustic events and generate appropriate responses. This talk will explain the different types of neural cell and specify hypotheses as to their various functions. Cell-types vary in terms of their size and shape as well as the number and type of minute electrical currents that flow across the cell membranes. Computer models will also be used to illustrate how the physical substrate (the wet-ware) is used to achieve its signal-processing goals.
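    The talk described above links cell-type-specific membrane currents to signal-processing function. As a generic, heavily simplified illustration only (not any of the cochlear-nucleus cell models referred to in the record), the sketch below runs a leaky integrate-and-fire cell with invented parameters.

```python
# Minimal leaky integrate-and-fire neuron: membrane currents collapse into a
# single leak term, and a spike is recorded when the voltage crosses threshold.

dt, t_end = 1e-4, 0.05                       # 0.1 ms steps, 50 ms of simulated time
tau, v_rest, v_thresh, v_reset = 0.01, -65e-3, -50e-3, -65e-3   # s, V
r_m, i_in = 1e7, 2.0e-9                      # membrane resistance (ohm), input current (A)

v, spike_times = v_rest, []
for step in range(int(t_end / dt)):
    v += (-(v - v_rest) + r_m * i_in) / tau * dt   # leaky integration
    if v >= v_thresh:                              # threshold crossing -> spike, reset
        spike_times.append(step * dt)
        v = v_reset

print(f"{len(spike_times)} spikes in {t_end * 1e3:.0f} ms of stimulation")
```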

  15. Model development for naphthenic acids ozonation process.

    PubMed

    Al Jibouri, Ali Kamel H; Wu, Jiangning

    2015-02-01

    Naphthenic acids (NAs) are toxic constituents of oil sands process-affected water (OSPW), which is generated during the extraction of bitumen from oil sands. NAs consist mainly of carboxylic acids which are generally biorefractory. For the treatment of OSPW, ozonation is a very beneficial method. It can significantly reduce the concentration of NAs and it can also convert NAs from biorefractory to biodegradable. In this study, a factorial design (2^4) was used for the ozonation of OSPW to study the influences of the operating parameters (ozone concentration, oxygen/ozone flow rate, pH, and mixing) on the removal of model NAs in a semi-batch reactor. It was found that ozone concentration had the most significant effect on the NAs concentration compared to the other parameters. An empirical model was developed to correlate the concentration of NAs with ozone concentration, oxygen/ozone flow rate, and pH. In addition, a theoretical analysis was conducted to gain insight into the relationship between the removal of NAs and the operating parameters.
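    The 2^4 factorial design mentioned here lends itself to a simple contrast analysis: each factor is coded at -1/+1 and its main effect is the difference between the mean response at the high and low levels. The sketch below does this on invented removal data; it is not the paper's dataset or its empirical model.

```python
# Main-effect estimation for a 2^4 full factorial design with coded factors.
import itertools
import numpy as np

factors = ["ozone_conc", "flow_rate", "pH", "mixing"]
design = np.array(list(itertools.product([-1, 1], repeat=4)))   # 16 runs
rng = np.random.default_rng(1)

# Pretend response: % NAs removed, dominated by ozone concentration.
removal = 60 + 15 * design[:, 0] + 3 * design[:, 2] + rng.normal(0, 2, 16)

for j, name in enumerate(factors):
    effect = removal[design[:, j] == 1].mean() - removal[design[:, j] == -1].mean()
    print(f"main effect of {name:>10}: {effect:+5.1f} % removal")
```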

  16. Nanowire growth process modeling and reliability models for nanodevices

    NASA Astrophysics Data System (ADS)

    Fathi Aghdam, Faranak

    Nowadays, nanotechnology is becoming an inescapable part of everyday life. The big barrier in front of its rapid growth is our incapability of producing nanoscale materials in a reliable and cost-effective way. In fact, the current yield of nano-devices is very low (around 10 %), which makes fabrications of nano-devices very expensive and uncertain. To overcome this challenge, the first and most important step is to investigate how to control nano-structure synthesis variations. The main directions of reliability research in nanotechnology can be classified either from a material perspective or from a device perspective. The first direction focuses on restructuring materials and/or optimizing process conditions at the nano-level (nanomaterials). The other direction is linked to nano-devices and includes the creation of nano-electronic and electro-mechanical systems at nano-level architectures by taking into account the reliability of future products. In this dissertation, we have investigated two topics on both nano-materials and nano-devices. In the first research work, we have studied the optimization of one of the most important nanowire growth processes using statistical methods. Research on nanowire growth with patterned arrays of catalyst has shown that the wire-to-wire spacing is an important factor affecting the quality of resulting nanowires. To improve the process yield and the length uniformity of fabricated nanowires, it is important to reduce the resource competition between nanowires during the growth process. We have proposed a physical-statistical nanowire-interaction model considering the shadowing effect and shared substrate diffusion area to determine the optimal pitch that would ensure the minimum competition between nanowires. A sigmoid function is used in the model, and the least squares estimation method is used to estimate the model parameters. The estimated model is then used to determine the optimal spatial arrangement of catalyst arrays
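    The dissertation summary above mentions a sigmoid interaction model whose parameters are estimated by least squares. A hedged sketch of that fitting step is shown below; the functional form, variable names and pitch/length numbers are placeholders invented for illustration, not the dissertation's actual model or data.

```python
# Least-squares fit of a sigmoid response curve with scipy.optimize.curve_fit.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(pitch, top, k, pitch0):
    """Illustrative response (e.g. relative nanowire length) vs. catalyst pitch."""
    return top / (1.0 + np.exp(-k * (pitch - pitch0)))

rng = np.random.default_rng(2)
pitch = np.linspace(0.2, 3.0, 15)                         # spacing, arbitrary units
length = sigmoid(pitch, 10.0, 3.0, 1.2) + rng.normal(0, 0.3, pitch.size)

params, _ = curve_fit(sigmoid, pitch, length, p0=[8.0, 2.0, 1.0])
print("fitted (top, k, pitch0):", np.round(params, 2))
```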

  17. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, in particular the optimization of signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of a visual model of production and distribution processes.
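    Simul8 itself is a commercial discrete-event package, so as a language-neutral illustration of the underlying idea the sketch below implements a tiny discrete-event queue using only the Python standard library: alarm signals arrive at random and wait for a single operator. All rates and the scenario are invented and are not taken from the paper.

```python
# Minimal discrete-event simulation of a single-server signal-handling queue.
import heapq
import random

random.seed(0)
ARRIVAL_RATE, SERVICE_RATE, HORIZON = 1 / 40.0, 1 / 30.0, 8 * 3600   # per-second rates, 8 h shift

events = [(random.expovariate(ARRIVAL_RATE), "arrival")]   # (time, kind) priority queue
busy_until, waits = 0.0, []

while events:
    t, kind = heapq.heappop(events)
    if t > HORIZON:
        break
    if kind == "arrival":
        start = max(t, busy_until)                 # queue if the operator is busy
        waits.append(start - t)
        busy_until = start + random.expovariate(SERVICE_RATE)
        heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))

print(f"{len(waits)} signals handled, mean wait {sum(waits) / len(waits):.1f} s")
```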

  18. How rolling forecasting facilitates dynamic, agile planning.

    PubMed

    Miller, Debra; Allen, Michael; Schnittger, Stephanie; Hackman, Theresa

    2013-11-01

    Rolling forecasting may be used to replace or supplement the annual budget process. The rolling forecast typically builds on the organization's strategic financial plan, focusing on the first three years of plan projections and comparing the strategic financial plan assumptions with the organization's expected trajectory. Leaders can then identify and respond to gaps between the rolling forecast and the strategic financial plan on an ongoing basis.

  19. Agile Electromagnetics Exploiting High Speed Logic (AEEHSL).

    DTIC Science & Technology

    2014-09-26

    examination and alteration of codes and filter weights 3. READ Mode - This mode enables the reading or replaying of the data from the digital tape recorder...available in this subsystem are used to initialize the radar, clock the code from the high-speed code storage memory to drive the code modulator, delay correlation process. There is storage space within the high speed memory for 32 codes of length 64 bits or less. The radiated code can be changed by a

  20. Developing Friction Stir Welding Process Model for ICME Application

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Ping

    2015-01-01

    A framework for developing a product involving manufacturing processes was developed with an integrated computational materials engineering approach. The key component in the framework is a process modeling tool which includes a thermal model, a microstructure model, a thermo-mechanical model, and a property model. Using the friction stir welding (FSW) process as an example, the development of the process modeling tool was introduced in detail. The thermal model and the microstructure model of FSW of steels were validated with experimental data. The model can predict reasonable temperature and hardness distributions as observed in the experiment. The model was applied to predict the residual stress and joint strength of a pipe girth weld.

  1. The evaluation of several agility metrics for fighter aircraft using optimal trajectory analysis

    NASA Technical Reports Server (NTRS)

    Ryan, George W., III; Downing, David R.

    1993-01-01

    Several functional agility metrics, including the combat cycle time metric, dynamic speed turn plots, and relative energy state metric, are used to compare turning performance for generic F-18, X-29, and X-31-type aircraft models. These three-degree-of-freedom models have characteristics similar to the real aircraft. The performance comparisons are made using data from optimal test trajectories to reduce sensitivities to different pilot input techniques and to reduce the effects of control system limiters. The turn performance for all three aircraft is calculated for simulated minimum time 180 deg heading captures from simulation data. Comparisons of the three aircraft give more insight into turn performance than would be available from traditional measures of performance. Using the optimal test technique yields significant performance improvements as measured by the metrics. These performance improvements were found without significant increases in turn radius.

  2. Reactive Agility Performance in Handball; Development and Evaluation of a Sport-Specific Measurement Protocol.

    PubMed

    Spasic, Miodrag; Krolo, Ante; Zenic, Natasa; Delextrat, Anne; Sekulic, Damir

    2015-09-01

    There is no current study that has examined sport-specific tests of reactive agility and change-of-direction speed (CODS) designed to replicate the real-sport environment in handball (team handball). This investigation evaluated the reliability and validity of two novel tests designed to assess the reactive agility and CODS of handball players. Participants were female (25.14 ± 3.71 years of age; 1.77 ± 0.09 m and 74.1 ± 6.1 kg) and male handball players (26.9 ± 4.1 years of age; 1.90 ± 0.09 m and 93.90 ± 4.6 kg). Variables included body height, body mass, body mass index, broad jump, 5-m sprint, CODS and reactive-agility tests. Results showed satisfactory reliability for the reactive-agility test and the CODS test (ICC of 0.85-0.93, and CV of 2.4-4.8%). Reactive agility and CODS shared less than 20% of common variance. The calculated index of perceptual and reactive capacity (P&RC; the ratio between reactive-agility and CODS performance) was found to be a valid measure of true-game reactive-agility performance in handball in both genders. Therefore, the handball athletes' P&RC should be used in the evaluation of real-game reactive-agility performance. Future studies should explore other sport-specific reactive-agility tests and the factors associated with such performance in sports involving agile maneuvers. Key points: (i) reactive agility and change-of-direction speed should be observed as independent qualities, even when tested over the same course and a similar movement template; (ii) the reactive-agility performance of handball athletes involved in defensive duties is closer to their non-reactive-agility score than is the case for their peers who are not involved in defensive duties; (iii) handball-specific "true-game" reactive-agility performance should be evaluated as the ratio between reactive agility and the corresponding CODS performance.
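    The P&RC index described in this record is a simple ratio; a tiny sketch is below. The direction of the ratio and its interpretation here are assumptions made for illustration, and the two times are invented.

```python
# Index of perceptual and reactive capacity: reactive-agility time relative to
# the pre-planned change-of-direction time over the same movement template.
reactive_agility_s = 2.95   # stimulus-driven run (invented value)
cods_s = 2.40               # pre-planned run on the same course (invented value)

p_and_rc = reactive_agility_s / cods_s
print(f"P&RC = {p_and_rc:.2f}  (values nearer 1.0 suggest a smaller perceptual penalty)")
```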

  3. Stochastic model updating utilizing Bayesian approach and Gaussian process model

    NASA Astrophysics Data System (ADS)

    Wan, Hua-Ping; Ren, Wei-Xin

    2016-03-01

    Stochastic model updating (SMU) has been increasingly applied in quantifying structural parameter uncertainty from response variability. SMU for parameter uncertainty quantification refers to the problem of inverse uncertainty quantification (IUQ), which is a nontrivial task. An inverse problem solved with optimization usually brings about issues of gradient computation, ill-conditioning, and non-uniqueness. Moreover, the uncertainty present in the response makes the inverse problem more complicated. In this study, a Bayesian approach is adopted in SMU for parameter uncertainty quantification. The prominent strength of the Bayesian approach for the IUQ problem is that it solves the problem in a straightforward manner, which enables it to avoid the previous issues. However, when applied to engineering structures that are modeled with a high-resolution finite element model (FEM), the Bayesian approach is still computationally expensive since the commonly used Markov chain Monte Carlo (MCMC) method for Bayesian inference requires a large number of model runs to guarantee convergence. Herein we reduce the computational cost in two ways. On the one hand, the fast-running Gaussian process model (GPM) is utilized to approximate the time-consuming high-resolution FEM. On the other hand, an advanced MCMC method using the delayed rejection adaptive Metropolis (DRAM) algorithm, which incorporates a local adaptive strategy with a global adaptive strategy, is employed for Bayesian inference. In addition, we propose the use of powerful variance-based global sensitivity analysis (GSA) in parameter selection to exclude non-influential parameters from the calibration parameters, which yields a reduced-order model and thus further alleviates the computational burden. A simulated aluminum plate and a real-world complex cable-stayed pedestrian bridge are presented to illustrate the proposed framework and verify its feasibility.
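    The core loop of this workflow, sampling candidate parameters, evaluating a cheap surrogate instead of the full finite element model, and accepting or rejecting against the measured response, can be sketched with a plain random-walk Metropolis sampler. The sketch below is only a stand-in: a fixed-step sampler replaces DRAM, a quadratic function replaces the trained Gaussian process surrogate, and all numbers are invented.

```python
# Random-walk Metropolis calibration of one parameter against a surrogate model.
import numpy as np

rng = np.random.default_rng(3)
observed, sigma_obs = 2.0, 0.1                 # measured response and its variability

def surrogate(theta):
    """Cheap stand-in for a trained GP surrogate of the FEM response."""
    return 0.5 * theta**2 + 1.0

def log_post(theta):
    """Gaussian likelihood with a flat prior on [0, 5]."""
    if not 0.0 <= theta <= 5.0:
        return -np.inf
    return -0.5 * ((observed - surrogate(theta)) / sigma_obs) ** 2

theta, samples = 1.0, []
for _ in range(20000):
    proposal = theta + rng.normal(0.0, 0.2)    # symmetric random-walk step
    if np.log(rng.random()) < log_post(proposal) - log_post(theta):
        theta = proposal
    samples.append(theta)

posterior = np.array(samples[5000:])           # discard burn-in
print(f"posterior mean {posterior.mean():.3f}, std {posterior.std():.3f}")
```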

  4. Heterogeneous processes: Laboratory, field, and modeling studies

    NASA Technical Reports Server (NTRS)

    Poole, Lamont R.; Kurylo, Michael J.; Jones, Rod L.; Wahner, Andreas; Calvert, Jack G.; Leu, M.-T.; Fried, A.; Molina, Mario J.; Hampson, Robert F.; Pitts, M. C.

    1991-01-01

    The efficiencies of chemical families such as ClO(x) and NO(x) for altering the total abundance and distribution of stratospheric ozone are controlled by a partitioning between reactive (active) and nonreactive (reservoir) compounds within each family. Gas phase thermodynamics, photochemistry, and kinetics would dictate, for example, that only about 1 percent of the chlorine resident in the lower stratosphere would be in the form of active Cl or ClO, the remainder existing in the reservoir compounds HCl and ClONO2. The consistency of this picture was recently challenged by the recognition that important chemical transformations take place in polar regions, as documented by the Airborne Antarctic Ozone Experiment (AAOE) and the Airborne Arctic Stratospheric Expedition (AASE). Following the discovery of the Antarctic ozone hole, Solomon et al. suggested that the heterogeneous chemical reaction ClONO2(g) + HCl(s) yields Cl2(g) + HNO3(s) could play a key role in converting chlorine from inactive forms into a species (Cl2) that would rapidly dissociate in sunlight to liberate atomic chlorine and initiate ozone depletion. The symbols (s) and (g) denote solid phase, or adsorbed onto a solid surface, and gas phase, respectively, and represent the approach by which such a reaction is modeled rather than the microscopic details of the reaction. The reaction was expected to be most important at altitudes where polar stratospheric clouds (PSCs) were most prevalent (10 to 25 km), thereby extending the altitude range over which chlorine compounds can efficiently destroy ozone from the 35 to 45 km region (where concentrations of active chlorine are usually highest) to lower altitudes where the ozone concentration is at its peak. This chapter will briefly review the current state of knowledge of heterogeneous processes in the stratosphere, emphasizing those results obtained since the World Meteorological Organization (WMO) conference. Sections are included on laboratory investigations of heterogeneous reactions, the

  5. Wired Widgets: Agile Visualization for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Gerschefske, K.; Witmer, J.

    2012-09-01

    Continued advancement in sensors and analysis techniques have resulted in a wealth of Space Situational Awareness (SSA) data, made available via tools and Service Oriented Architectures (SOA) such as those in the Joint Space Operations Center Mission Systems (JMS) environment. Current visualization software cannot quickly adapt to rapidly changing missions and data, preventing operators and analysts from performing their jobs effectively. The value of this wealth of SSA data is not fully realized, as the operators' existing software is not built with the flexibility to consume new or changing sources of data or to rapidly customize their visualization as the mission evolves. While tools like the JMS user-defined operational picture (UDOP) have begun to fill this gap, this paper presents a further evolution, leveraging Web 2.0 technologies for maximum agility. We demonstrate a flexible Web widget framework with inter-widget data sharing, publish-subscribe eventing, and an API providing the basis for consumption of new data sources and adaptable visualization. Wired Widgets offers cross-portal widgets along with a widget communication framework and development toolkit for rapid new widget development, giving operators the ability to answer relevant questions as the mission evolves. Wired Widgets has been applied in a number of dynamic mission domains including disaster response, combat operations, and noncombatant evacuation scenarios. The variety of applications demonstrate that Wired Widgets provides a flexible, data driven solution for visualization in changing environments. In this paper, we show how, deployed in the Ozone Widget Framework portal environment, Wired Widgets can provide an agile, web-based visualization to support the SSA mission. Furthermore, we discuss how the tenets of agile visualization can generally be applied to the SSA problem space to provide operators flexibility, potentially informing future acquisition and system development.
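    The inter-widget data sharing described above rests on a publish-subscribe eventing pattern. As a language-neutral illustration only (the real Ozone Widget Framework and Wired Widgets APIs are JavaScript and are not reproduced here), the sketch below shows a minimal event bus with invented channel names and payloads.

```python
# Minimal publish-subscribe event bus: widgets register callbacks on named
# channels and react when another widget publishes to that channel.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, channel, callback):
        self._subscribers[channel].append(callback)

    def publish(self, channel, payload):
        for callback in self._subscribers[channel]:
            callback(payload)

bus = EventBus()
# A "track list" widget publishes a selection; a "details" widget reacts to it.
bus.subscribe("object.selected", lambda obj: print("details widget shows", obj))
bus.publish("object.selected", {"norad_id": 25544, "name": "ISS"})
```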

  6. Development of a reburning boiler process model

    SciTech Connect

    Wu, K.T.

    1992-01-30

    The overall objective of this program is to integrate EER's expertise in boiler reburning performance evaluation into a package of analytical computer tools. Specific objectives of the program are to develop a computational capability with the following features: (1) can be used to predict the impact of gas reburning application on thermal conditions in the boiler radiant furnace, and on overall boiler performance; (2) can estimate gas reburning NO{sub x} reduction effectiveness based on specific reburning configurations and furnace/boiler configurations; (3) can be used as an analytical tool to evaluate the impact of boiler process parameters (e.g., fuel switching and changes in boiler operating conditions) on boiler thermal performance; (4) is adaptable to most boiler designs (tangential and wall fire boilers) and a variety of fuels (solid, liquid, gaseous and slurried fuels); (5) is sufficiently user friendly to be exercisable by engineers with a reasonable knowledge of boilers, and with reasonable computer skills. Here, "user friendly" means that the user will be guided by computer codes during the course of setting up individual input files for the boiler performance model.

  7. Opponent process model and psychostimulant addiction.

    PubMed

    Koob, G F; Caine, S B; Parsons, L; Markou, A; Weiss, F

    1997-07-01

    There are many sources of reinforcement in the spectrum of cocaine dependence that contribute to the compulsive cocaine self-administration or loss of control of cocaine intake that constitutes the core of modern definitions of dependence. The development of withdrawal has long been considered an integral part of drug addiction but has lost its impact in the theorization of drug dependence because of new emphasis on the neurobiological substrates for the positive-reinforcing properties of drugs. The present treatise reviews the neurobiological substrates for the acute positive reinforcing effects of cocaine and what is beginning to be known about the neurobiological substrates of cocaine withdrawal. The concept of motivational or affective withdrawal is reintroduced, which reemphasizes opponent process theory as a model for the motivational effects of cocaine dependence. The same neural substrates hypothesized to be involved in the acute reinforcing properties of drugs (basal forebrain regions of nucleus accumbens and amygdala) are hypothesized to be altered during chronic drug treatment to produce the negative motivational states characterizing drug withdrawal. Within these brain regions, both the neurochemical system(s) on which the drug has its primary actions and other neurochemical systems may undergo adaptations to chronic presence of the drug. An understanding of the adaptations of the motivational systems of the brain accompanying cocaine dependence leads to important predictions not only about the etiology, treatment, and prevention of cocaine addiction but also about the vulnerability of these motivational systems in non-drug-induced psychopathology.

  8. Software Systems Reengineering Process Model, Version 1.0

    DTIC Science & Technology

    1993-01-01

    The Center for Information Management (CIM) Software Systems Reengineering Process Model provides guidance for applying software reengineering...to support current business needs. The purpose of the CIM Software Systems Reengineering Process Model is to capture the essence of software

  9. Perspectives on Industrial Innovation from Agilent, HP, and Bell Labs

    NASA Astrophysics Data System (ADS)

    Hollenhorst, James

    2014-03-01

    Innovation is the life blood of technology companies. I will give perspectives gleaned from a career in research and development at Bell Labs, HP Labs, and Agilent Labs, from the point of view of an individual contributor and a manager. Physicists bring a unique set of skills to the corporate environment, including a desire to understand the fundamentals, a solid foundation in physical principles, expertise in applied mathematics, and most importantly, an attitude: namely, that hard problems can be solved by breaking them into manageable pieces. In my experience, hiring managers in industry seldom explicitly search for physicists, but they want people with those skills.

  10. Impact of emerging technologies on future combat aircraft agility

    NASA Technical Reports Server (NTRS)

    Nguyen, Luat T.; Gilert, William P.

    1990-01-01

    The foreseeable character of future within-visual-range air combat entails a degree of agility which calls for the integration of high-alpha aerodynamics, thrust vectoring, intimate pilot/vehicle interfaces, and advanced weapons/avionics suites, in prospective configurations. The primary technology-development programs currently contributing to these goals are discussed; they encompass the F-15 Short Takeoff and Landing/Maneuver Technology Demonstrator Program, the Enhanced Fighter Maneuverability Program, the High Angle-of-Attack Technology Program, and the X-29 Technology Demonstrator Program.

  11. Process modeling - It's history, current status, and future

    NASA Astrophysics Data System (ADS)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems associated with the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given in the cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulations is found to reduce costs and time associated with technological development when incorporated judiciously.

  12. ISS Double-Gimbaled CMG Subsystem Simulation Using the Agile Development Method

    NASA Technical Reports Server (NTRS)

    Inampudi, Ravi

    2016-01-01

    This paper presents an evolutionary approach in simulating a cluster of 4 Control Moment Gyros (CMG) on the International Space Station (ISS) using a common sense approach (the agile development method) for concurrent mathematical modeling and simulation of the CMG subsystem. This simulation is part of the Training Systems for the 21st Century simulator, which will provide training for crew members, instructors, and flight controllers. The basic idea of how the CMGs on the space station are used for its non-propulsive attitude control is briefly explained to set up the context for simulating a CMG subsystem. Next, different reference frames and the detailed equations of motion (EOM) for multiple double-gimbal variable-speed control moment gyroscopes (DGVs) are presented. Fixing some of the terms in the EOM yields the special-case EOM for the ISS's double-gimbaled, fixed-speed CMGs. CMG simulation development using the agile development method is presented, in which the customer's requirements and solutions evolve through iterative analysis, design, coding, unit testing, and acceptance testing. At the end of each iteration, the set of features implemented in that iteration is demonstrated to the flight controllers, thus creating a short feedback loop and helping to create adaptive development cycles. The unified modeling language (UML) tool is used in illustrating the user stories, class designs, and sequence diagrams. This incremental development approach of mathematical modeling and simulating the CMG subsystem involved the development team and the customer early on, thus improving the quality of the working CMG system in each iteration and helping the team to accurately predict the cost, schedule, and delivery of the software.
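
    As a purely illustrative companion to the description of the CMG equations of motion, the sketch below approximates the output torque of a single fixed-speed double-gimbaled CMG as the finite-difference rate of change of its angular momentum vector. The gimbal geometry and numbers are assumptions for the example, not the ISS flight equations or the simulator's code.

```python
# Illustrative sketch (not the ISS flight equations): the torque produced by a
# fixed-speed double-gimbaled CMG can be approximated as the rate of change of
# its angular momentum vector as the gimbals rotate, tau = dh/dt.
import numpy as np


def cmg_momentum(h_mag: float, inner_angle: float, outer_angle: float) -> np.ndarray:
    """Angular momentum vector of one CMG for given gimbal angles (rad)."""
    # Hypothetical gimbal geometry: outer gimbal about z, inner about the rotated y axis.
    ci, si = np.cos(inner_angle), np.sin(inner_angle)
    co, so = np.cos(outer_angle), np.sin(outer_angle)
    return h_mag * np.array([ci * co, ci * so, si])


def cmg_torque(h_mag, inner, outer, inner_rate, outer_rate, dt=1e-4):
    """Finite-difference estimate of the output torque tau = dh/dt."""
    h0 = cmg_momentum(h_mag, inner, outer)
    h1 = cmg_momentum(h_mag, inner + inner_rate * dt, outer + outer_rate * dt)
    return (h1 - h0) / dt


# One iteration's acceptance check might compare this against an analytic value.
# h_mag is an example momentum magnitude, gimbal angles in rad, rates in rad/s.
print(cmg_torque(h_mag=4760.0, inner=0.1, outer=0.2, inner_rate=0.01, outer_rate=0.0))
```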

  13. Agile delivery of protein therapeutics to CNS.

    PubMed

    Yi, Xiang; Manickam, Devika S; Brynskikh, Anna; Kabanov, Alexander V

    2014-09-28

    A variety of therapeutic proteins have shown potential to treat central nervous system (CNS) disorders. The challenge of delivering these protein molecules to the brain is well known. Proteins administered through parenteral routes are often excluded from the brain because of their poor bioavailability and the existence of the blood-brain barrier (BBB). Barriers also exist to proteins administered through non-parenteral routes that bypass the BBB. Several strategies have shown promise in delivering proteins to the brain. This review, first, describes the physiology and pathology of the BBB that underscore the rationale and needs of each strategy to be applied. Second, major classes of protein therapeutics along with some key factors that affect their delivery outcomes are presented. Third, different routes of protein administration (parenteral, central intracerebroventricular and intraparenchymal, intranasal and intrathecal) are discussed along with key barriers to CNS delivery associated with each route. Finally, current delivery strategies involving chemical modification of proteins and use of particle-based carriers are overviewed using examples from the literature and our own work. Whereas most of these studies are at an early stage, some provide proof of mechanism of increased protein delivery to the brain in relevant models of CNS diseases, while in a few cases proof of concept has been attained in clinical studies. This review will be useful to a broad audience of students, academicians, and industry professionals who consider critical issues of protein delivery to the brain and aim to develop and study effective brain delivery systems for protein therapeutics.

  14. Agile Delivery of Protein Therapeutics to CNS

    PubMed Central

    Yi, Xiang; Manickam, Devika S.; Brynskikh, Anna; Kabanov, Alexander V.

    2014-01-01

    A variety of therapeutic proteins have shown potential to treat central nervous system (CNS) disorders. The challenge of delivering these protein molecules to the brain is well known. Proteins administered through parenteral routes are often excluded from the brain because of their poor bioavailability and the existence of the blood-brain barrier (BBB). Barriers also exist to proteins administered through non-parenteral routes that bypass the BBB. Several strategies have shown promise in delivering proteins to the brain. This review, first, describes the physiology and pathology of the BBB that underscore the rationale and needs of each strategy to be applied. Second, major classes of protein therapeutics along with some key factors that affect their delivery outcomes are presented. Third, different routes of protein administration (parenteral, central intracerebroventricular and intraparenchymal, intranasal and intrathecal) are discussed along with key barriers to CNS delivery associated with each route. Finally, current delivery strategies involving chemical modification of proteins and use of particle-based carriers are overviewed using examples from the literature and our own work. Whereas most of these studies are at an early stage, some provide proof of mechanism of increased protein delivery to the brain in relevant models of CNS diseases, while in a few cases proof of concept has been attained in clinical studies. This review will be useful to a broad audience of students, academicians, and industry professionals who consider critical issues of protein delivery to the brain and aim to develop and study effective brain delivery systems for protein therapeutics. PMID:24956489

  15. Agile development approach for the observatory control software of the DAG 4m telescope

    NASA Astrophysics Data System (ADS)

    Güçsav, B. Bülent; Çoker, Deniz; Yeşilyaprak, Cahit; Keskin, Onur; Zago, Lorenzo; Yerli, Sinan K.

    2016-08-01

    Observatory Control Software for the upcoming 4m infrared telescope of DAG (Eastern Anatolian Observatory in Turkish) is at the beginning of its lifecycle. After the process of elicitation and validation of the initial requirements, we have focused on preparing a rapid conceptual design, not only to see the big picture of the system but also to clarify the further development methodology. The existing preliminary designs for both software (including TCS and active optics control system) and hardware shall be presented here in brief to outline the challenges the DAG software team has been facing. The potential benefits of an agile approach for the development will be discussed in light of the published experience of the community and the resources available to us.

  16. Relationship Between Reactive Agility and Change of Direction Speed in Amateur Soccer Players.

    PubMed

    Matlák, János; Tihanyi, József; Rácz, Levente

    2016-06-01

    The aim of the study was to assess the relationship between reactive agility and change of direction speed (CODS) among amateur soccer players using running tests with four directional changes. Sixteen amateur soccer players (24.1 ± 3.3 years; 72.4 ± 7.3 kg; 178.7 ± 6 cm) completed CODS and reactive agility tests with four changes of direction using the SpeedCourt™ system (Globalspeed GmbH, Hemsbach, Germany). Countermovement jump (CMJ) height and maximal foot tapping count (completed in 3 seconds) were also measured with the same device. In the reactive agility test, participants had to react to a series of light stimuli projected onto a screen. Total time was shorter in the CODS test than in the reactive agility test (p < 0.001). Nonsignificant correlations were found among variables measured in the CODS, reactive agility, and CMJ tests. Low common variance (r = 0.03-0.18) was found between CODS and reactive agility test variables. The results of this study underscore the importance of cognitive factors in reactive agility performance and suggest that specific methods may be required for training and testing reactive agility and CODS.
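
    The "common variance" reported above is the square of the Pearson correlation between two tests. The short sketch below works that arithmetic on hypothetical timing data (not the study's measurements); the helper function is an assumption introduced for illustration.

```python
# Sketch of the statistic behind the "common variance" statement: the shared
# variance between two tests is the square of their Pearson correlation.
# The sample values below are hypothetical, not the study's data.
import statistics


def pearson_r(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den


cods_times = [5.42, 5.61, 5.38, 5.70, 5.55, 5.47]      # change-of-direction test (s)
agility_times = [6.91, 7.25, 6.88, 7.02, 7.30, 6.95]   # reactive agility test (s)

r = pearson_r(cods_times, agility_times)
print(f"r = {r:.2f}, common variance = {r * r:.2f}")
```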

  17. The Preparation of Cognitively Agile Principals for Turnaround Schools: A Leadership Preparation Programme Study

    ERIC Educational Resources Information Center

    Reyes-Guerra, Daniel; Pisapia, John; Mick, Annie

    2016-01-01

    The purpose of this study was to examine the ability of two educational leadership university programmes to improve the cognitive agility of their graduates. The research looked to discover whether the aspiring principals exited the programmes with an increased ability to employ cognitive agility--the ability to use the multiple thinking skills of…

  18. The Impacts of Agile Development Methodology Use on Project Success: A Contingency View

    ERIC Educational Resources Information Center

    Tripp, John F.

    2012-01-01

    Agile Information Systems Development Methods have emerged in the past decade as an alternative manner of managing the work and delivery of information systems development teams, with a large number of organizations reporting the adoption & use of agile methods. The practitioners of these methods make broad claims as to the benefits of their…

  19. Impact of Business Intelligence and IT Infrastructure Flexibility on Competitive Advantage: An Organizational Agility Perspective

    ERIC Educational Resources Information Center

    Chen, Xiaofeng

    2012-01-01

    There is growing use of business intelligence (BI) for better management decisions in industry. However, empirical studies on BI are still scarce in academic research. This research investigates BI from an organizational agility perspective. Organizational agility is the ability to sense and respond to market opportunities and threats with speed,…

  20. Project-Method Fit: Exploring Factors That Influence Agile Method Use

    ERIC Educational Resources Information Center

    Young, Diana K.

    2013-01-01

    While the productivity and quality implications of agile software development methods (SDMs) have been demonstrated, research concerning the project contexts where their use is most appropriate has yielded less definitive results. Most experts agree that agile SDMs are not suited for all project contexts. Several project and team factors have been…

  1. 76 FR 59456 - ASGI Agility Income Fund, et al.; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-26

    ... sales loads ("CDSCs"). Applicants: ASGI Agility Income Fund ("Agility Fund"), ASGI Aurora Opportunities Fund, LLC ("Aurora Fund"), and ASGI Corbin Multi-Strategy Fund, LLC ("Corbin Fund") (each a... Aurora Fund and the Corbin Fund are each organized as a Delaware limited liability company. The...

  2. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    ERIC Educational Resources Information Center

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  3. A Cognitive Architecture for Human Performance Process Model Research

    DTIC Science & Technology

    1992-11-01

    [Report documentation excerpt] A Cognitive Architecture for Human Performance Process Model Research; contract F33615-91-D-0009; author: Michael J. Young; subject terms: cognitive architectures, human performance process models, cognitive psychology, implementation architectures, computational... Contents include: Human Performance Process Models.

  4. FY 79 Software Acquisition Process Model Task. Revision 1

    DTIC Science & Technology

    1980-07-01

    This final report on the FY 79 Project 5220 Software Acquisition Process Model Task (522F) presents the approach taken to process model definition...plan for their incorporation and application in successive process model versions. The report contains diagrams that represent the Full-Scale

  5. Process Engineering with the Evolutionary Spiral Process Model. Version 01.00.01

    DTIC Science & Technology

    1992-12-01

    [Cover page and contents excerpt] Process Engineering with the Evolutionary Spiral Process Model, SPC-92079-CMC. Contents include: Synthesis Process Model; Process Activities; Summary; Evolutionary Spiral Process Model Concepts.

  6. Reliability of a Field Test of Defending and Attacking Agility in Australian Football and Relationships to Reactive Strength.

    PubMed

    Young, Warren B; Murray, Mitch P

    2017-02-01

    Young, WB and Murray, MP. Reliability of a field test of defending and attacking agility in Australian football and relationships to reactive strength. J Strength Cond Res 31(2): 509-516, 2017-Defending and attacking agility tests for Australian football do not exist, and it is unknown whether any physical qualities correlate with these types of agility. The purposes of this study were to develop new field tests of defending and attacking agility for Australian Rules football, to determine whether they were reliable, and to describe the relationship between the agility tests to determine their specificity. Because the reactive strength (RS) of the lower limb muscles has been previously correlated with change-of-direction speed, we also investigated the relationship between this quality and the agility tests. Nineteen male competitive recreational-level Australian Rules football players were assessed on the agility tests and a drop jump test to assess RS. Interday and interrater reliability was also assessed. The agility tests involved performing 10 trials of one-on-one agility tasks against 2 testers (opponents), in which the objective was to be in a position to tackle (defending) or to evade (attacking) the opponent. Both agility tests had good reliability (intraclass correlation > 0.8, %CV < 3, and no significant differences between test occasions [p > 0.05]), and interrater reliability was very high (r = 0.997, p < 0.001). The common variance between the agility tests was 45%, indicating that they represented relatively independent skills. There was a large correlation between RS and defending agility (r = 0.625, p = 0.004), and a very large correlation with attacking agility (r = 0.731, p < 0.001). Defending and attacking agility have different characteristics, possibly related to the footwork, physical, and cognitive demands of each. Nonetheless, RS seems to be important for agility, especially for attacking agility.

  7. Gender-specific influences of balance, speed, and power on agility performance.

    PubMed

    Sekulic, Damir; Spasic, Miodrag; Mirkov, Dragan; Cavar, Mile; Sattler, Tine

    2013-03-01

    The quick change of direction (i.e., agility) is an important athletic ability in numerous sports. Because of the diverse and therefore hardly predictable manifestations of agility in sports, studies noted that the improvement in speed, power, and balance should result in an improvement of agility. However, there is evident lack of data regarding the influence of potential predictors on different agility manifestations. The aim of this study was to determine the gender-specific influence of speed, power, and balance on different agility tests. A total of 32 college-aged male athletes and 31 college-aged female athletes (age 20.02 ± 1.89 years) participated in this study. The subjects were mostly involved in team sports (soccer, team handball, basketball, and volleyball; 80% of men, and 75% of women), martial arts, gymnastics, and dance. Anthropometric variables consisted of body height, body weight, and the body mass index. Five agility tests were used: a t-test (T-TEST), zig-zag test, 20-yard shuttle test, agility test with a 180-degree turn, and forward-backward running agility test (FWDBWD). Other tests included 1 jumping ability power test (squat jump, SQJ), 2 balance tests to determine the overall stability index and an overall limit of stability score (both measured by Biodex Balance System), and 2 running speed tests using a straight sprint for 10 and 20 m (S10 and S20, respectively). A reliability analysis showed that all the agility tests were reliable. Multiple regression and correlation analysis found speed and power (among women), and balance (among men), as most significant predictors of agility. The highest Pearson's correlation in both genders is found between the results of the FWDBWD and S10M tests (0.77 and 0.81 for men and women, respectively; p < 0.05). Power, measured using the SQJ, is significantly (p < 0.05) related to FWDBWD and T-TEST results but only for women (-0.44; -0.41). The balance measures were significantly related to the agility

  8. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    ERIC Educational Resources Information Center

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  9. DoD Information Assurance and Agile: Challenges and Recommendations Gathered Through Interviews with Agile Program Managers and DoD Accreditation Reviewers

    DTIC Science & Technology

    2012-11-01

    testing and certification are found in industry, too (e.g., water-Scrum-fall). The Office of the Secretary of Defense and other organizations have...report titled Water-Scrum-Fall Is The Reality Of Agile For Most Organizations Today. This paper served as the inspiration for many related blog...adoption has diverged some from the original ideas described in the Agile Manifesto. Many adoptions resemble what Forrester labels "water-Scrum-fall

  10. Research on rapid agile metrology for manufacturing based on real-time multitask operating system

    NASA Astrophysics Data System (ADS)

    Chen, Jihong; Song, Zhen; Yang, Daoshan; Zhou, Ji; Buckley, Shawn

    1996-10-01

    Rapid agile metrology for manufacturing (RAMM) using multiple non-contact sensors is likely to remain a growing trend in manufacturing. High-speed inspection systems for manufacturing are characterized by multiple tasks implemented in parallel and real-time events which occur simultaneously. In this paper, we introduce a real-time operating system into RAMM research. A general task model based on class-based object-oriented technology is proposed. A general multitask frame of a typical RAMM system using OPNet is discussed. Finally, an application example of a machine which inspects parts held on a carrier strip is described. With RTOS and OPNet, this machine can measure two dimensions of the contacts at 300 parts/second.
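
    To make the multitask framing concrete, the following is a hypothetical sketch of a class-based task model in which several inspection tasks run in parallel and consume parts from a shared queue. It is not the OPNet framework or the paper's implementation; the names, values, and threading approach are assumptions for illustration only.

```python
# Hypothetical sketch of a class-based task model for a multi-sensor inspection
# loop, in the spirit of the multitask framing described above. This is not the
# OPNet framework itself; names and values are illustrative only.
import queue
import threading


class InspectionTask(threading.Thread):
    """One measurement task that consumes part IDs and reports a dimension."""

    def __init__(self, name, parts_in, results_out):
        super().__init__(daemon=True)
        self.name = name
        self.parts_in = parts_in
        self.results_out = results_out

    def run(self):
        while True:
            part_id = self.parts_in.get()
            if part_id is None:          # sentinel: shut the task down
                break
            # A real task would trigger a non-contact sensor here.
            self.results_out.put((self.name, part_id, 1.25))


parts, results = queue.Queue(), queue.Queue()
tasks = [InspectionTask(f"sensor-{i}", parts, results) for i in range(2)]
for t in tasks:
    t.start()
for pid in range(4):
    parts.put(pid)
for _ in tasks:
    parts.put(None)                      # one shutdown sentinel per task
for t in tasks:
    t.join()
while not results.empty():
    print(results.get())
```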

  11. AGILE detection of variable γ-ray activity from the blazar S5 0716+714 in September-October 2007

    NASA Astrophysics Data System (ADS)

    Chen, A. W.; D'Ammando, F.; Villata, M.; Raiteri, C. M.; Tavani, M.; Vittorini, V.; Bulgarelli, A.; Donnarumma, I.; Ferrari, A.; Giuliani, A.; Longo, F.; Pacciani, L.; Pucella, G.; Vercellone, S.; Argan, A.; Barbiellini, G.; Boffelli, F.; Caraveo, P.; Carosati, D.; Cattaneo, P. W.; Cocco, V.; Costa, E.; Del Monte, E.; de Paris, G.; Di Cocco, G.; Evangelista, Y.; Feroci, M.; Fiorini, M.; Froysland, T.; Frutti, M.; Fuschino, F.; Galli, M.; Gianotti, F.; Kurtanidze, O. M.; Labanti, C.; Lapshov, I.; Larionov, V. M.; Lazzarotto, F.; Lipari, P.; Marisaldi, M.; Mastropietro, M.; Mereghetti, S.; Morelli, E.; Morselli, A.; Pasanen, M.; Pellizzoni, A.; Perotti, F.; Picozza, P.; Porrovecchio, G.; Prest, M.; Rapisarda, M.; Rappoldi, A.; Rubini, A.; Soffitta, P.; Trifoglio, M.; Trois, A.; Vallazza, E.; Zambra, A.; Zanello, D.; Cutini, S.; Gasparrini, D.; Pittori, C.; Santolamazza, P.; Verrecchia, F.; Giommi, P.; Antonelli, L. A.; Colafrancesco, S.; Salotti, L.

    2008-10-01

    Aims: We report the γ-ray activity from the intermediate BL Lac S5 0716+714 during observations acquired by the AGILE satellite in September and October 2007. These detections of activity were contemporaneous with a period of intense optical activity, which was monitored by GASP-WEBT. This simultaneous optical and γ-ray coverage allows us to study in detail the light curves, time lags, γ-ray photon spectrum, and Spectral Energy Distributions (SEDs) during different states of activity. Methods: AGILE observed the source with its two co-aligned imagers, the Gamma-Ray Imaging Detector (GRID) and the hard X-ray imager (Super-AGILE), which are sensitive to the 30 MeV-50 GeV and 18-60 keV energy ranges, respectively. Observations were completed in two different periods, the first between 2007 September 4-23, and the second between 2007 October 24-November 1. Results: Over the period 2007 September 7-12, AGILE detected γ-ray emission from the source at a significance level of 9.6-σ with an average flux (E > 100 MeV) of (97 ± 15) × 10^-8 photons cm^-2 s^-1, which increased by a factor of at least four within three days. No emission was detected by Super-AGILE for the energy range 18-60 keV to a 3-σ upper limit of 10 mCrab in 335 ks. In October 2007, AGILE repointed toward S5 0716+714 following an intense optical flare, measuring an average flux of (47 ± 11) × 10^-8 photons cm^-2 s^-1 at a significance level of 6.0-σ. Conclusions: The γ-ray flux of S5 0716+714 detected by AGILE is the highest ever detected for this blazar and one of the most intense γ-ray fluxes detected from a BL Lac object. The SED of mid-September appears to be consistent with the synchrotron self-Compton (SSC) emission model, but only by including two SSC components of different variabilities. The optical data presented in this paper are stored in the GASP-WEBT archive; for questions regarding their availability, please contact the WEBT President Massimo Villata. Figure 5 is only available in

  12. Model medication management process in Australian nursing homes using business process modeling.

    PubMed

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons for end user avoidance or rejection of health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using the business process modeling method. The paper presents the study design and preliminary findings from interviewing two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  13. Modeling microbial processes in porous media

    NASA Astrophysics Data System (ADS)

    Murphy, Ellyn M.; Ginn, Timothy R.

    The incorporation of microbial processes into reactive transport models has generally proceeded along two separate lines of investigation: (1) transport of bacteria as inert colloids in porous media, and (2) the biodegradation of dissolved contaminants by a stationary phase of bacteria. Research over the last decade has indicated that these processes are closely linked. This linkage may occur when a change in metabolic activity alters the attachment/detachment rates of bacteria to surfaces, either promoting or retarding bacterial transport in a groundwater-contaminant plume. Changes in metabolic activity, in turn, are controlled by the time of exposure of the microbes to electron acceptors/donors and other components affecting activity. Similarly, metabolic activity can affect the reversibility of attachment, depending on the residence time of active microbes. Thus, improvements in quantitative analysis of active subsurface biota necessitate direct linkages between substrate availability, metabolic activity, growth, and attachment/detachment rates. This linkage requires both a detailed understanding of the biological processes and robust quantitative representations of these processes that can be tested experimentally. This paper presents an overview of current approaches used to represent physicochemical and biological processes in porous media, along with new conceptual approaches that link metabolic activity with partitioning of the microorganism between the aqueous and solid phases. Résumé: The incorporation of microbial processes into reactive transport models has generally followed two different lines of research: (1) the transport of bacteria as inert colloids in porous media, and (2) the biodegradation of dissolved pollutants by a stationary phase of bacteria. Research conducted over the last ten years indicates that these processes are intimately linked. This linkage may occur when
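
    The linkage described above, between substrate availability, metabolic activity, growth, and attachment/detachment, can be sketched as a small set of coupled rate equations. The rate laws and constants below are hypothetical placeholders, not the models discussed in the paper.

```python
# Illustrative coupling, not a model from the paper: attached/aqueous bacterial
# populations whose attachment and detachment rates shift with metabolic
# activity, which in turn follows substrate availability (Monod-type term).
# All rate constants below are hypothetical.

def simulate(hours=48, dt=0.01):
    s, b_aq, b_att = 10.0, 1.0, 1.0      # substrate (mg/L), aqueous and attached biomass
    mu_max, ks, yield_ = 0.3, 2.0, 0.5   # growth parameters (per hour, mg/L, -)
    k_att0, k_det0 = 0.2, 0.05           # baseline attachment/detachment rates (per hour)
    for _ in range(int(hours / dt)):
        activity = s / (ks + s)                      # 0..1 metabolic activity proxy
        k_att = k_att0 * (1.0 - 0.5 * activity)      # active cells attach less readily
        k_det = k_det0 * (1.0 + 2.0 * activity)      # and detach more readily
        growth_aq = mu_max * activity * b_aq
        growth_att = mu_max * activity * b_att
        ds = -(growth_aq + growth_att) / yield_
        db_aq = growth_aq - k_att * b_aq + k_det * b_att
        db_att = growth_att + k_att * b_aq - k_det * b_att
        s, b_aq, b_att = max(s + ds * dt, 0.0), b_aq + db_aq * dt, b_att + db_att * dt
    return s, b_aq, b_att


print(simulate())
```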

  14. DESCRIPTION OF ATMOSPHERIC TRANSPORT PROCESSES IN EULERIAN AIR QUALITY MODELS

    EPA Science Inventory

    Key differences among many types of air quality models are the way atmospheric advection and turbulent diffusion processes are treated. Gaussian models use analytical solutions of the advection-diffusion equations. Lagrangian models use a hypothetical air parcel concept effecti...

  15. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  16. The Inference Construct: A Model of the Writing Process

    ERIC Educational Resources Information Center

    Van Nostrand, A. D.

    1978-01-01

    Presents a taxonomy of writing instruction, a model or paradigm of the writing process, an application of this model to the teaching of writing, and an explanation of the empirical basis of the model. (Author/GW)

  17. Frequency-agile microwave components using ferroelectric materials

    NASA Astrophysics Data System (ADS)

    Colom-Ustariz, Jose G.; Rodriguez-Solis, Rafael; Velez, Salmir; Rodriguez-Acosta, Snaider

    2003-04-01

    The non-linear electric field dependence of ferroelectric thin films can be used to design frequency and phase agile components. Tunable components have traditionally been developed using mechanically tuned resonant structures, ferrite components, or semiconductor-based voltage controlled electronics, but they are limited by their frequency performance, high cost, high losses, and integration into larger systems. In contrast, the ferroelectric-based tunable microwave component can easily be integrated into conventional microstrip circuits, and attributes such as small size, light weight, and low loss make these components attractive for broadband and multi-frequency applications. Components that are essential elements in the design of a microwave sensor can be fabricated with ferroelectric materials to achieve tunability over a broad frequency range. It has been reported that with a thin ferroelectric film placed between the top conductor layer and the dielectric material of a microstrip structure, and the proper DC bias scheme, tunable components above the Ku band can be fabricated. Components such as phase shifters, coupled line filters, and Lange couplers have been reported in the literature using this technique. In this work, simulated results from a full wave electromagnetic simulator are obtained to show the tunability of a matching network typically used in the design of microwave amplifiers and antennas. In addition, simulated results of a multilayer Lange coupler and a patch antenna are also presented. The results show that typical microstrip structures can be easily modified to provide frequency agile capabilities.

  18. Enhanced detection of Terrestrial Gamma-Ray Flashes by AGILE

    NASA Astrophysics Data System (ADS)

    Marisaldi, M.; Argan, A.; Ursi, A.; Gjesteland, T.; Fuschino, F.; Labanti, C.; Galli, M.; Tavani, M.; Pittori, C.; Verrecchia, F.; D'Amico, F.; Ostgaard, N.; Mereghetti, S.; Campana, R.; Cattaneo, P.; Bulgarelli, A.; Colafrancesco, S.; Dietrich, S.; Longo, F.; Gianotti, F.; Giommi, P.; Rappoldi, A.; Trifoglio, M.; Trois, A.

    2015-12-01

    At the end of March 2015 the onboard configuration of the AGILE satellite was modified in order to disable the veto signal of the anticoincidence shield for the minicalorimeter instrument. The motivation for such a change was the understanding that the dead time induced by the anticoincidence prevented the detection of a large fraction of Terrestrial Gamma-ray Flashes (TGFs), especially the short duration ones. We present here the characteristics of the new TGF sample after several months of stable operations with the new configuration. The configuration change was highly successful, resulting in the detection of about 100 TGFs/month, an increase in TGF detection rate by a factor of about 11 with respect to the previous configuration. As expected, the largest fraction of the new events has short duration, with a median duration of 80 microseconds. We also obtain a sample of events with simultaneous association, within 100 microseconds, with lightning sferics detected by the World Wide Lightning Location Network (WWLLN), confirming previous results reported by the Fermi mission. Given the high detection rate and AGILE's very low (+/-2.5°) orbital inclination, the new configuration provides the largest TGF detection rate surface density (TGFs / km^2 / year) to date, opening the way for correlation studies with lightning and atmospheric parameters on short spatial and temporal scales along the equatorial region. Finally, the events with associated simultaneous WWLLN sferics provide a highly reliable sample to probe the long-standing issue of the TGF maximal energy.

  19. Caffeine supplementation and reactive agility in elite youth soccer players.

    PubMed

    Jordan, J Bradley; Korgaokar, Ajit; Farley, Richard S; Coons, John M; Caputo, Jennifer L

    2014-05-01

    This study examined the effects of caffeine supplementation (6 mg·kg^-1) on performance of a reactive agility test (RAT) in 17 elite, male, youth (M = 14 y) soccer players. Using a double-blind, repeated-measures design, players completed 4 days of testing on the RAT after a standardized warm-up. On day 1, anthropometric measurements were taken and players were accommodated to the RAT. On day 2, baseline performance was established. Caffeine or placebo conditions were randomly assigned on day 3 and the condition was reversed on day 4. Players completed 3 randomized trials of the RAT on days 2, 3, and 4 with at least 1 trial to the players' dominant and nondominant sides. There were no significant differences among conditions in reaction time (RT) to the dominant side, heart rates at any point of measurement, or ratings of perceived exertion (RPE) after completion of the warm-up. Caffeine produced faster RT to the nondominant side (P = .041) and higher RPE at the conclusion of the RAT (P = .013). The effect on the total time (TT) to complete the agility test to the nondominant side approached significance (P = .051). Sprint time and TT to either side did not differ. Caffeine supplementation may provide ergogenic benefit to elite, male, youth soccer players.

  20. Modeling Trans-Scale Social Processes

    DTIC Science & Technology

    2010-05-01

    Presentation topics: Limits of Empiricism; Radial Concepts; Generic Model of Social Action; Theory Templates (Trans-scale Social Coprocess, TSC); Orientation. Preparation of this presentation was supported, in part, by the Office of Naval Research, Award No. N00014-09-1-0766, project Modeling Strategic Contexts. Further slides address the need for social-theoretical models, noting the limits of empiricism (recognized since Hume) and a move toward finer-grain, more dynamic models.

  1. Teachers as Managers of the Modelling Process

    ERIC Educational Resources Information Center

    Lingefjard, Thomas; Meier, Stephanie

    2010-01-01

    The work in the Comenius Network project Developing Quality in Mathematics Education II (DQME II) has a main focus on development and evaluation of modelling tasks. One reason is the gap between what mathematical modelling is and what is taught in mathematical classrooms. This article deals with one modelling task and focuses on how two teachers…

  2. A Total Quality Leadership Process Improvement Model

    DTIC Science & Technology

    1993-12-01

    aircraft, then it is unlikely that a major quality concern would be processing travel orders for personnel. However, if the business is a travel ... agency, it may be entirely appropriate to optimize travel processing procedures. Whenever possible, it is best to establish goals that will provide a STOP

  3. Computer modeling of lung cancer diagnosis-to-treatment process.

    PubMed

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging, and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the necessary data and procedures to develop a DES model for the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced, and the approach to derive a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.

  4. Computer modeling of lung cancer diagnosis-to-treatment process

    PubMed Central

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick

    2015-01-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging, and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the necessary data and procedures to develop a DES model for the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced, and the approach to derive a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181
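
    A minimal sketch of the Markov-chain style of process model mentioned in this record (and its duplicate above) is shown below: patients move through diagnosis-to-treatment stages with assumed weekly transition probabilities. The states and probabilities are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch of a Markov-chain process model of the diagnosis-to-
# treatment pathway: states and weekly transition probabilities are invented
# for illustration and are not the study's data.
import random

P = {                      # P[state] -> list of (next_state, probability)
    "referral":  [("referral", 0.3), ("diagnosis", 0.7)],
    "diagnosis": [("diagnosis", 0.4), ("staging", 0.6)],
    "staging":   [("staging", 0.5), ("treatment", 0.5)],
}


def weeks_to_treatment(runs=10_000):
    """Estimate the mean number of weekly steps to reach the absorbing state."""
    total = 0
    for _ in range(runs):
        state, weeks = "referral", 0
        while state != "treatment":
            r, cum = random.random(), 0.0
            for nxt, p in P[state]:
                cum += p
                if r < cum:
                    state = nxt
                    break
            weeks += 1
        total += weeks
    return total / runs


print(f"Mean weeks from referral to treatment: {weeks_to_treatment():.1f}")
```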

  5. Process Model for Defining Space Sensing and Situational Awareness Requirements

    DTIC Science & Technology

    2006-04-01

    process model for defining systems for space sensing and space situational awareness is presented. The paper concentrates on eight steps for determining the requirements to include: decision maker needs, system requirements, exploitation methods and vulnerabilities, critical capabilities, and identification of attack scenarios. Utilization of the USAF anti-tamper (AT) implementation process as a process model departure point for the space sensing and situational awareness (SSSA...is presented. The AT implementation process model, as an

  6. How to Improve Process Models for Better ISO/IEC 15504 Process Assessment

    NASA Astrophysics Data System (ADS)

    Picard, Michel; Renault, Alain; Cortina, Stéphane

    Since the evolution of SPICE towards a generic standard for process assessment in 2003, there have been an increasing number of initiatives aiming to propose Process Reference Models (PRM) and Process Assessment Models (PAM) in various fields of activity. Although these process models are the basis of any process assessment, the related ISO/IEC 15504-2:2003 requirements are not very strict and can be variously interpreted. Enhancing these requirements would improve both the intrinsic quality of process models and their added-value from the user standpoint. The current revision of the standard is an opportunity to bring an answer to issues that were raised by the experienced developers and users of ISO/IEC 15504 compliant process models. This paper proposes parts of an answer to some of these issues and motivates them through their direct impact on the process model relevance from the beneficiaries' point of view.

  7. Hot cheese: a processed Swiss cheese model.

    PubMed

    Li, Y; Thimbleby, H

    2014-01-01

    James Reason's classic Swiss cheese model is a vivid and memorable way to visualise how patient harm happens only when all system defences fail. Although Reason's model has been criticised for its simplicity and static portrait of complex systems, its use has been growing, largely because of the direct clarity of its simple and memorable metaphor. A more general, more flexible and equally memorable model of accident causation in complex systems is needed. We present the hot cheese model, which is more realistic, particularly in portraying defence layers as dynamic and active - more defences may cause more hazards. The hot cheese model, being more flexible, encourages deeper discussion of incidents than the simpler Swiss cheese model permits.

  8. Modelling the Active Hearing Process in Mosquitoes

    NASA Astrophysics Data System (ADS)

    Avitabile, Daniele; Homer, Martin; Jackson, Joe; Robert, Daniel; Champneys, Alan

    2011-11-01

    A simple microscopic mechanistic model is described of the active amplification within the Johnston's organ of the mosquito species Toxorhynchites brevipalpis. The model is based on the description of the antenna as a forced-damped oscillator coupled to a set of active threads (ensembles of scolopidia) that provide an impulsive force when they twitch. This twitching is in turn controlled by channels that are opened and closed if the antennal oscillation reaches a critical amplitude. The model matches both qualitatively and quantitatively with recent experiments. New results are presented using mathematical homogenization techniques to derive a mesoscopic model as a simple oscillator with nonlinear force and damping characteristics. It is shown how the results from this new model closely resemble those from the microscopic model as the number of threads approach physiologically correct values.
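
    The qualitative mechanism described above, a forced-damped oscillator that receives an impulsive push from active threads whenever a critical amplitude is crossed, can be sketched as follows. The parameter values are arbitrary assumptions, not fitted values from the model or the experiments.

```python
# Qualitative sketch only: a forced-damped oscillator whose "active threads"
# add an impulsive push whenever displacement crosses a critical amplitude,
# echoing the description above. Parameters are arbitrary, not fitted values.
import math


def simulate(t_end=2.0, dt=1e-4):
    m, c, k = 1.0, 0.8, 400.0          # mass, damping, stiffness (arbitrary units)
    f0, w = 1.0, math.sqrt(k / m)      # drive near resonance
    x_crit, kick = 0.02, 5.0           # twitch threshold and impulsive force
    x, v, t, peak = 0.0, 0.0, 0.0, 0.0
    while t < t_end:
        force = f0 * math.cos(w * t) - c * v - k * x
        if abs(x) > x_crit:            # threads twitch, pushing along the direction of motion
            force += kick * math.copysign(1.0, v)
        v += (force / m) * dt
        x += v * dt
        peak = max(peak, abs(x))
        t += dt
    return peak


print(f"Peak antennal displacement: {simulate():.4f}")
```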

  9. Verification of image processing based visibility models

    SciTech Connect

    Larson, S.M.; Cass, G.R.; Hussey, K.J.; Luce, F.

    1988-06-01

    Methods are presented for testing visibility models that use simulated photographs to display the results of model calculations. An experimental protocol is developed and used to obtain input data, including standard photographs of chosen scenes on a clear day and during a smog event at Pasadena, CA. With the clear day photograph as a substrate, pollutant properties measured on the smoggy day are introduced into the visibility model, and the results of the model calculations are displayed as a synthetic photograph of the expected appearance of the smog event. Quantitative comparisons are made between the predicted and actual appearance of the smog event. The diagnostic techniques developed are applied to the visibility modeling procedure proposed by Malm et al. That model is shown to reproduce the contrast reduction characteristic of urban air pollution but produces synthetic photographs with sky elements that differ substantially from a real photograph of the actual smog event.

  10. Application of a Process Model to a Management Support System.

    DTIC Science & Technology

    The concept of a process model is developed and used as a basis for data organization for use in a Management Support System (MSS). The data...organization is proposed as being useful for historical records that constitute the bulk of the information stored in an MSS. The concept of a process model, and...manager. Other uses of the proposed data organization are also considered. Extensions of the process model are considered by combining the model with

  11. Dynamics of the two process model of human sleep regulation

    NASA Astrophysics Data System (ADS)

    Kenngott, Max; McKay, Cavendish

    2011-04-01

    We examine the dynamics of the two process model of human sleep regulation. In this model, sleep propensity is governed by the interaction between a periodic threshold (process C) and a saturating growth/decay (process S). We find that the parameter space of this model admits sleep cycles with a wide variety of characteristics, many of which are not observed in normal human sleepers. We also examine the effects of phase dependent feedback on this model.
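
    For readers unfamiliar with the two process model, its switching logic can be sketched as below: homeostatic pressure S rises in a saturating fashion while awake, decays while asleep, and sleep onset/offset occur when S crosses sinusoidal circadian thresholds (process C). The thresholds and time constants used here are illustrative assumptions, not parameters from this study.

```python
# Minimal sketch of the two process model's switching logic, with illustrative
# parameter values: S rises toward an upper asymptote while awake and decays
# while asleep; state switches occur at the circadian upper/lower thresholds.
import math


def simulate(days=3, dt_h=0.01):
    s, awake, t = 0.5, True, 0.0
    events = []
    while t < days * 24:
        upper = 0.85 + 0.10 * math.sin(2 * math.pi * t / 24)   # wake -> sleep threshold
        lower = 0.25 + 0.10 * math.sin(2 * math.pi * t / 24)   # sleep -> wake threshold
        if awake:
            s += (1.0 - s) / 18.2 * dt_h       # saturating rise, assumed ~18 h time constant
            if s > upper:
                awake = False
                events.append(("sleep onset", round(t % 24, 1)))
        else:
            s -= s / 4.2 * dt_h                # exponential decay, assumed ~4 h time constant
            if s < lower:
                awake = True
                events.append(("wake", round(t % 24, 1)))
        t += dt_h
    return events


print(simulate())
```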

  12. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    developed at NCAR through a grant from the United States Air Force 557th Weather Wing (formerly the Air Force Weather Agency), where NCAR is sponsored...that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  13. The frequency-agile radar: A multifunctional approach to remote sensing of the ionosphere

    NASA Astrophysics Data System (ADS)

    Tsunoda, R. T.; Livingston, R. C.; Buonocore, J. J.; McKinley, A. V.

    1995-09-01

    We introduce a new kind of diagnostic sensor that combines multifunctional measurement capabilities for ionospheric research. Multifunctionality is realized through agility in frequency selection over an extended band (1.5 to 50 MHz), system modularity, complete system control by software written in C, and a user-friendly computer interface. This sensor, which we call the frequency-agile radar (FAR), incorporates dual radar channels and an arbitrary waveform synthesizer that allows creative design of sophisticated waveforms as a means of increasing its sensitivity to weak signals while minimizing loss in radar resolution. The sensitivity of the FAR is determined by two sets of power amplifier modules: four 4-kW solid-state broadband amplifiers, and four 30-kW vacuum tube amplifiers. FAR control is by an AT-bus personal computer with on-line processing by a programmable array processor. The FAR does not simply house the separate functions of most radio sensors in use today; it provides convenient and flexible access to those functions as elements to be used in any combination. Some of the first new results obtained with the FAR during recent field campaigns are presented to illustrate its versatility. These include (1) the first detection of anomalous high-frequency (HF) reflections from a barium ion cloud, (2) the first evidence of unexpectedly large drifts and a shear north of the equatorial electrojet, (3) the first HF radar signature of a developing equatorial plasma bubble, and (4) the first measurements by a portable radar of altitude-extended, quasi-periodic backscatter from midlatitude sporadic E. We also mention the potential of the FAR for atmospheric remote sensing.
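
    One example of the kind of "sophisticated waveform" an arbitrary waveform synthesizer makes possible is a linear FM chirp, whose pulse compression raises sensitivity to weak echoes without giving up range resolution. The sketch below is generic and hypothetical; the parameters are not tied to the FAR hardware or its actual waveforms.

```python
# Illustrative only: a linear FM chirp and its matched-filter (pulse compression)
# output. Sample rate, pulse length, and bandwidth are hypothetical values.
import numpy as np

fs = 2e6            # sample rate (Hz)
pulse_len = 1e-3    # pulse duration (s)
bandwidth = 200e3   # swept bandwidth (Hz)

t = np.arange(0, pulse_len, 1 / fs)
chirp = np.exp(1j * np.pi * (bandwidth / pulse_len) * t**2)   # linear FM pulse

# Matched filtering of a delayed, noise-free echo against the transmit pulse:
# convolution with the time-reversed conjugate is equivalent to correlation.
echo = np.concatenate([np.zeros(500), chirp, np.zeros(500)])
compressed = np.abs(np.convolve(echo, np.conj(chirp[::-1])))
print("Compressed peak at sample:", int(np.argmax(compressed)))
```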

  14. Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application

    NASA Astrophysics Data System (ADS)

    Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.

    2013-12-01

    The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility, and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni Time-Series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and the local system to avoid data transfer delays. The 3-, 6-, 12-, and 24-month data sets were used for analysis on the Cloud and the local system, and the processing times for the analysis were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated based on an hourly rate, and the storage cost is calculated based on a rate per Gigabyte per month. Incoming data transfer is free, while the cost of outgoing data transfer is based on a per-Gigabyte rate. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform had a 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
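
    The cost components listed above (compute hours, storage per Gigabyte-month, and outbound data transfer) lend themselves to a back-of-the-envelope estimate. The unit prices in the sketch below are placeholder assumptions, not actual Amazon rates or the figures from this study.

```python
# Back-of-the-envelope sketch of the cloud cost components listed above
# (compute hours, storage, and outbound data transfer). The unit prices are
# placeholders, not actual Amazon rates or the study's figures.

def monthly_cloud_cost(compute_hours, storage_gb, transfer_out_gb,
                       hourly_rate=0.25, storage_rate=0.10, transfer_rate=0.09):
    compute = compute_hours * hourly_rate          # $ per instance-hour
    storage = storage_gb * storage_rate            # $ per GB-month
    transfer = transfer_out_gb * transfer_rate     # $ per GB out; inbound is free
    return compute + storage + transfer


# Example: one instance running full time, 500 GB cached data, 50 GB served out.
print(f"${monthly_cloud_cost(730, 500, 50):.2f} per month")
```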

  15. Using Target Network Modelling to Increase Battlespace Agility

    DTIC Science & Technology

    2013-06-01

    Social constructivism, as it is used here to explain battlespace complexity, is defined as the view that the material world shapes and is shaped by human...all situational understandings for determining military actions as being socially constructed realities and constantly subjected to change. How...all, apparent realities are only social constructs and are therefore subject to change. It claims that there is no absolute truth and that the way

  16. SAS-085 C2 Agility Model Validation Using Case Studies

    DTIC Science & Technology

    2013-06-01

    2010/11 B. Comprehensive Approach in NATO Operations Peace-keeping C. Rwanda Genocide 1994 Cyber Warfare D. Estonia Cyber Attack 2007 E. Georgia...principle that whatever decisions and actions were taken needed to have an effect towards achieving operational objectives. C. Rwanda Genocide 1994 ...NATO Operations) C. Ms. Micheline Bélanger, Defence R&D Canada – Valcartier (Rwanda Genocide 1994) D. Prof. Michael Henshaw, Loughborough University

  17. Construction of Theoretical Model for Antiterrorism: From Reflexive Game Theory Viewpoint

    DTIC Science & Technology

    2014-06-01

    19th ICCRTS “C2 AGILITY: LESSONS LEARNED FROM RESEARCH AND OPERATIONS.” Construction of Theoretical Model for Antiterrorism: From Reflexive Game Theory Viewpoint. ...use of Reflexive Game Theory (RGT) for modeling the processes of decision making by terrorists. In the antiterrorist operations, an expert plays an

  18. Computational modeling in the primary processing of titanium: A review

    NASA Astrophysics Data System (ADS)

    Venkatesh, Vasisht; Wilson, Andrew; Kamal, Manish; Thomas, Matthew; Lambert, Dave

    2009-05-01

    Process modeling is increasingly becoming a vital tool for modern metals manufacturing. This paper reviews process modeling initiatives started at TIMET over the last decade for the primary processing of titanium alloys. SOLAR, a finite volume-based numerical model developed at the Ecole des Mines de Nancy, has been successfully utilized to optimize vacuum arc remelting process parameters, such as electromagnetic stirring profiles, in order to minimize macrosegregation and improve ingot quality. Thermo-mechanical modeling of heat treating, billet forging, and slab rolling is accomplished via the commercial finite element analysis model, DEFORM, to determine heating times, cooling rates, strain distributions, etc.

  19. Capability Maturity Model (CMM) for Software Process Improvements

    NASA Technical Reports Server (NTRS)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  20. A Multi-Scale Modeling of Laser Cladding Process (Preprint)

    DTIC Science & Technology

    2006-04-01

    possibilities to alter a component at its surface. Despite immense potential and advancements, the process model of microstructure evolution and its coupling...with the macro parameters of the laser cladding process has not been fully developed. To address this issue, a process model of microstructure evolution has