Sample records for process development requirements

  1. Technology and development requirements for advanced coal conversion systems

    NASA Technical Reports Server (NTRS)

    1981-01-01

    A compendium of coal conversion process descriptions is presented. The SRS and MC databases were utilized to provide information, particularly in the areas of existing process designs and process evaluations. Additional information requirements were established and arrangements were made to visit process developers, pilot plants, and process development units to obtain information that was not otherwise available. Plant designs, process descriptions and operating conditions, and performance characteristics were analyzed, and requirements for further development were identified and evaluated to determine the impact of these requirements on the process commercialization potential from the standpoint of economics and technical feasibility. A preliminary methodology was established for the comparative technical and economic assessment of advanced processes.

  2. Integrated approaches to the application of advanced modeling technology in process development and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  3. Technology development for lunar base water recycling

    NASA Technical Reports Server (NTRS)

    Schultz, John R.; Sauer, Richard L.

    1992-01-01

    This paper will review previous and ongoing work in aerospace water recycling and identify research activities required to support development of a lunar base. The development of a water recycle system for use in the life support systems envisioned for a lunar base will require considerable research work. A review of previous work on aerospace water recycle systems indicates that more efficient physical and chemical processes are needed to reduce expendables and power requirements. Development work on biological processes that can be applied to microgravity and lunar environments also needs to be initiated. Biological processes are inherently more efficient than physical and chemical processes and may be used to minimize resupply and waste disposal requirements. Processes for recovering and recycling nutrients such as nitrogen, phosphorus, and sulfur also need to be developed to support plant growth units. The development of efficient water quality monitors to be used for process control and environmental monitoring also needs to be initiated.

  4. Materials, Processes and Manufacturing in Ares 1 Upper Stage: Integration with Systems Design and Development

    NASA Technical Reports Server (NTRS)

    Bhat, Biliyar N.

    2008-01-01

    Ares I Crew Launch Vehicle Upper Stage is designed and developed based on sound systems engineering principles. Systems engineering starts with the Concept of Operations and mission requirements, which in turn determine the launch system architecture and its performance requirements. The Ares I Upper Stage is designed and developed to meet these requirements. Designers depend on support from materials, processes, and manufacturing during the design, development, and verification of subsystems and components. The requirements relative to reliability, safety, operability, and availability also depend on materials availability, characterization, process maturation, and vendor support. This paper discusses the roles and responsibilities of materials and manufacturing engineering during the various phases of Ares I Upper Stage development, including design and analysis, hardware development, and test and verification. Emphasis is placed on how materials, processes, and manufacturing support is integrated across the Upper Stage Project, both horizontally and vertically. In addition, the paper describes the approach used to ensure compliance with materials, processes, and manufacturing requirements during the project cycle, with a focus on hardware systems design and development.

  5. The software development process at the Chandra X-ray Center

    NASA Astrophysics Data System (ADS)

    Evans, Janet D.; Evans, Ian N.; Fabbiano, Giuseppina

    2008-08-01

    Software development for the Chandra X-ray Center Data System began in the mid-1990s, and the waterfall model of development was mandated by our documents. Although we initially tried this approach, we found that a process with elements of the spiral model worked better in our science-based environment. High-level science requirements are usually established by scientists and provided to the software development group. We follow with review and refinement of those requirements prior to the design phase. Design reviews are conducted for substantial projects within the development team, and include scientists whenever appropriate. Development follows agreed-upon schedules that include several internal releases of the task before completion. Feedback from science testing early in the process helps to identify and resolve misunderstandings present in the detailed requirements, and allows review of intangible requirements. The development process includes specific testing of requirements, developer and user documentation, and support after deployment to operations or to users. We discuss the process we follow at the Chandra X-ray Center (CXC) to develop software and support operations. We review the role of the science and development staff from conception to release of software, and some lessons learned from managing CXC software development for over a decade.

  6. A Process Research Framework: The International Process Research Consortium

    DTIC Science & Technology

    2006-12-01

    P-30 How should a process for collaborative development be formulated? The development at different companies ... requires some process for the actual collaboration. How should it be handled? P-31 How do we handle change? Requirements change during development ... source projects employ a single-site development model in which there is no large community of testers but rather a single-site small group

  7. A Scenario-Based Process for Requirements Development: Application to Mission Operations Systems

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Boyles, Carole A.

    2008-01-01

    The notion of using operational scenarios as part of requirements development during mission formulation (Phases A & B) is widely accepted as good system engineering practice. In the context of developing a Mission Operations System (MOS), there are numerous practical challenges to translating that notion into the cost-effective development of a useful set of requirements. These challenges can include such issues as a lack of Project-level focus on operations issues, insufficient or improper flowdown of requirements, flowdown of immature or poor-quality requirements from Project level, and MOS resource constraints (personnel expertise and/or dollars). System engineering theory must be translated into a practice that provides enough structure and standards to serve as guidance, but that retains sufficient flexibility to be tailored to the needs and constraints of a particular MOS or Project. We describe a detailed, scenario-based process for requirements development. Identifying a set of attributes for high quality requirements, we show how the portions of the process address many of those attributes. We also find that the basic process steps are robust, and can be effective even in challenging Project environments.

  8. The study on knowledge transferring incentive for information system requirement development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yang

    2015-03-10

    Information system requirement development is a process of sharing and transferring users' knowledge. However, developing tacit requirements is a main problem in this process, because such requirements are difficult to encode, express, and communicate; knowledge fusion and cooperative effort are needed to uncover them. Against this background, our paper builds an evolutionary game model under an incentive system to find the rule governing the dynamic evolution of effort by software developers and users, and provides an in-depth discussion at the end of the paper.
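    The abstract names but does not specify the paper's evolutionary game model. As a purely illustrative sketch, two-population replicator dynamics for a developer/user effort game might look like the following; the payoff values and incentive structure here are hypothetical, not taken from the paper.

```python
# Illustrative two-population replicator dynamics for a developer/user
# effort game. Each population mixes "high effort" and "low effort";
# all payoff numbers below are invented for illustration only.

def replicator_step(x, y, dt=0.01):
    """One Euler step of replicator dynamics.
    x: share of developers exerting high effort
    y: share of users exerting high effort
    """
    # Hypothetical payoffs: high effort is costly but pays off when the
    # other side also shares knowledge (an incentive-system effect).
    dev_high = 3 * y - 1   # high effort, rewarded by cooperative users
    dev_low = 1 * y        # free-riding on user effort
    usr_high = 3 * x - 1
    usr_low = 1 * x
    dev_avg = x * dev_high + (1 - x) * dev_low
    usr_avg = y * usr_high + (1 - y) * usr_low
    # Strategies growing faster than the population average gain share.
    x += dt * x * (dev_high - dev_avg)
    y += dt * y * (usr_high - usr_avg)
    return x, y

x, y = 0.6, 0.6            # both sides start mostly cooperative
for _ in range(5000):
    x, y = replicator_step(x, y)
# With these payoffs, mutual high effort is stable once both shares
# exceed one half, so (x, y) converges toward (1, 1).
```

With this payoff structure the dynamics reduce to dx/dt = x(1-x)(2y-1), so the incentive threshold sits at a one-half share of cooperators on the other side.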

  9. Improving Requirements Generation Thoroughness in User-Centered Workshops: The Role of Prompting and Shared User Stories

    ERIC Educational Resources Information Center

    Read, Aaron

    2013-01-01

    The rise of stakeholder centered software development has led to organizations engaging users early in the development process to help define system requirements. To facilitate user involvement in the requirements elicitation process, companies can use Group Support Systems (GSS) to conduct requirements elicitation workshops. The effectiveness of…

  10. Phase IIIA Crew Interface Specifications Development for Inflight Maintenance and Stowage Functions

    NASA Technical Reports Server (NTRS)

    Carl, John G.

    1973-01-01

    This report presents the findings and data products developed during the Phase IIIA Crew Interface Specification Study for Inflight Maintenance and Stowage Functions, performed by General Electric for NASA Johnson Space Center. The study produced a set of documentation that can be used as definitive guidelines to improve the present process of defining, controlling, and managing flight crew interface requirements related to inflight maintenance (including assembly and servicing) and stowage functions. During the Phase IIIA contract period, the following data products were developed: 1) Projected NASA Crew Procedures/Flight Data File Development Process; 2) Inflight Maintenance Management Process Description; 3) Preliminary Draft, General Specification, Inflight Maintenance Management Requirements; 4) Inflight Maintenance Operational Process Description; 5) Preliminary Draft, General Specification, Inflight Maintenance Task and Support Requirements Analysis; 6) Suggested IFM Data Processing Reports for Logistics Management. These Inflight Maintenance data products were developed during the Phase IIIA study after review of Space Shuttle Program documentation, including the Level II Integrated Logistics Requirements and other DOD and NASA data relative to Payloads Accommodations and Satellite On-Orbit Servicing, and were developed to be in consonance with Space Shuttle Program technical and management requirements.

  11. ISO 9000 and/or Systems Engineering Capability Maturity Model?

    NASA Technical Reports Server (NTRS)

    Gholston, Sampson E.

    2002-01-01

    For businesses and organizations to remain competitive today, they must have processes and systems in place that will allow them to first identify customer needs and then develop products and processes that will meet or exceed the customers' needs and expectations. Customer needs, once identified, are normally stated as requirements. Designers can then develop products and processes that will meet these requirements. Several functions, such as quality management and systems engineering management, are used to assist product development teams in the development process. Both functions exist in all organizations and both have a similar objective, which is to ensure that developed processes will meet customer requirements. Are efforts in these organizations being duplicated? Are both functions needed by organizations? What are the similarities and differences between the functions listed above? ISO 9000 is an international standard for goods and services. It sets broad requirements for the assurance of quality and for management's involvement. It requires organizations to document their processes and to follow these documented processes. ISO 9000 gives customers assurance that the suppliers have control of the process for product development. Systems engineering can broadly be defined as a discipline that seeks to ensure that all requirements for a system are satisfied throughout the life of the system by preserving their interrelationship. The key activities of systems engineering include requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control. The systems engineering process, when followed properly, will lead to higher quality products, lower cost products, and shorter development cycles. The Systems Engineering Capability Maturity Model (SE-CMM) will allow companies to measure their systems engineering capability and continuously improve those capabilities. 
ISO 9000 and SE-CMM seem to have a similar objective, which is to document the organization's processes and certify to potential customers the capability of a supplier to control the processes that determine the quality of the product or services being produced. The remaining sections of this report examine the differences and similarities between ISO 9000 and SE-CMM and make recommendations for implementation.

  12. Rapid Development: A Content Analysis Comparison of Literature and Purposive Sampling of AFRL Rapid Reaction Projects

    DTIC Science & Technology

    2011-12-01

    systems engineering technical and technical management processes. Technical Planning, Stakeholder Requirements Development, and Architecture Design were ... Stakeholder Requirements Definition, Architecture Design, and Technical Planning. A purposive sampling of AFRL rapid development program managers and engineers ... emphasize one process over another; however, Architecture Design and Implementation scored higher among Technical Processes. Decision Analysis, Technical

  13. Requirements Development Issues for Advanced Life Support Systems: Solid Waste Management

    NASA Technical Reports Server (NTRS)

    Levri, Julie A.; Fisher, John W.; Alazraki, Michael P.; Hogan, John A.

    2002-01-01

    Long duration missions pose substantial new challenges for solid waste management in Advanced Life Support (ALS) systems. These possibly include storing large volumes of waste material in a safe manner, rendering wastes stable or sterilized for extended periods of time, and/or processing wastes for recovery of vital resources. This is further complicated because future missions remain ill-defined with respect to waste stream quantity, composition and generation schedule. Without definitive knowledge of this information, development of requirements is hampered. Additionally, even if waste streams were well characterized, other operational and processing needs require clarification (e.g. resource recovery requirements, planetary protection constraints). Therefore, the development of solid waste management (SWM) subsystem requirements for long duration space missions is an inherently uncertain, complex and iterative process. The intent of this paper is to address some of the difficulties in writing requirements for missions that are not completely defined. This paper discusses an approach and motivation for ALS SWM requirements development, the characteristics of effective requirements, and the presence of those characteristics in requirements that are developed for uncertain missions. Associated drivers for life support system technological capability are also presented. A general means of requirements forecasting is discussed, including successive modification of requirements and the need to consider requirements integration among subsystems.

  14. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, disaster response requires rapid access to large data volumes, substantial storage space, and high-performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation covers work being conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source process code on a local prototype platform, and then transitioning this code, with its associated environment requirements, to an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. 
Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
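    The index computations named in this record are standard normalized band differences, and "band stacking" is simply the assembly of per-band rasters into one array. A minimal sketch follows; the band values are invented purely for illustration:

```python
# Sketch of the standard index computations named in the abstract.
# Band reflectance values below are illustrative, not real sensor data.
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def ndmi(nir, swir):
    """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
    nir, swir = np.asarray(nir, dtype=float), np.asarray(swir, dtype=float)
    return (nir - swir) / (nir + swir)

# Tiny 2x2 "rasters" standing in for per-band imagery.
red = np.array([[0.2, 0.3], [0.1, 0.4]])
nir = np.array([[0.6, 0.5], [0.7, 0.4]])
swir = np.array([[0.3, 0.2], [0.2, 0.3]])

# "Band stacking": combine the per-band rasters into one (band, y, x) cube.
stack = np.stack([red, nir, swir])  # shape (3, 2, 2)

v = ndvi(nir, red)    # values near +1 indicate dense vegetation
m = ndmi(nir, swir)   # values near +1 indicate high moisture content
```

In a bulk geoprocessing pipeline the same per-pixel functions would be mapped over whole scenes, which is what makes the workload a natural fit for the cloud platforms the project evaluates.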

  15. Conducting requirements analyses for research using routinely collected health data: a model driven approach.

    PubMed

    de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul

    2012-01-01

    Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but is rarely reported in the biomedical literature, and no generic approaches have been published for linking heterogeneous health data. We conducted a literature review, followed by a consensus process, to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis method, i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFD) to define data requirements; unified modeling language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models, using business process modeling notation (BPMN). These requirements and their associated models should become part of research study protocols.

  16. Space Station Water Processor Process Pump

    NASA Technical Reports Server (NTRS)

    Parker, David

    1995-01-01

    This report presents the results of the development program conducted under contract NAS8-38250-12 related to the International Space Station (ISS) Water Processor (WP) Process Pump. The results of the Process Pump evaluation conducted on this program indicate that further development is required in order to achieve the performance and life requirements for the ISS WP.

  17. Requirements for company-wide management

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1980-01-01

    Computing system requirements were developed for company-wide management of information and computer programs in an engineering data processing environment. The requirements are essential to the successful implementation of a computer-based engineering data management system; they exceed the capabilities provided by the commercially available data base management systems. These requirements were derived from a study entitled The Design Process, which was prepared by design engineers experienced in development of aerospace products.

  18. 1-G Human Factors for Optimal Processing and Operability of Ground Systems Up to CxP GOP PDR

    NASA Technical Reports Server (NTRS)

    Stambolian, Damon B.; Henderson, Gena; Miller, Darcy; Prevost, Gary; Tran, Donald; Barth, Tim

    2011-01-01

    This slide presentation reviews the development and use of a process and tool for developing human factors requirements and improving the design for ground operations. A Human Factors Engineering Analysis (HFEA) tool was developed to create a dedicated subset of requirements from the FAA requirements for each subsystem. As an example, the use of the human interface with an actuator motor is considered.

  19. The Systems Engineering Process for Human Support Technology Development

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2005-01-01

    Systems engineering is designing and optimizing systems. This paper reviews the systems engineering process and indicates how it can be applied in the development of advanced human support systems. Systems engineering develops the performance requirements, subsystem specifications, and detailed designs needed to construct a desired system. Systems design is difficult, requiring both art and science and balancing human and technical considerations. The essential systems engineering activity is trading off and compromising between competing objectives such as performance and cost, schedule and risk. Systems engineering is not a complete independent process. It usually supports a system development project. This review emphasizes the NASA project management process as described in NASA Procedural Requirement (NPR) 7120.5B. The process is a top down phased approach that includes the most fundamental activities of systems engineering - requirements definition, systems analysis, and design. NPR 7120.5B also requires projects to perform the engineering analyses needed to ensure that the system will operate correctly with regard to reliability, safety, risk, cost, and human factors. We review the system development project process, the standard systems engineering design methodology, and some of the specialized systems analysis techniques. We will discuss how they could apply to advanced human support systems development. The purpose of advanced systems development is not directly to supply human space flight hardware, but rather to provide superior candidate systems that will be selected for implementation by future missions. The most direct application of systems engineering is in guiding the development of prototype and flight experiment hardware. However, anticipatory systems engineering of possible future flight systems would be useful in identifying the most promising development projects.

  20. SLS Navigation Model-Based Design Approach

    NASA Technical Reports Server (NTRS)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. 
The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, whereas a requirement is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned from implementing the Model-based Design approach and process, from infancy through verification and certification, are discussed.
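    The common-interface and unit-test pattern this record describes might be sketched as follows. The class names, interface, and toy rate-gyro behavior are hypothetical illustrations only; the actual SLS Design Math Models are delivered as C/C++ code with their own interfaces.

```python
# Hypothetical sketch of a "common interface" for Design Math Models
# (DMMs): every component model exposes the same initialize/step calls,
# and shipped unit-test cases pin down the intended behavior so a tool
# integrator can verify their implementation matches the developer's.
from abc import ABC, abstractmethod

class DesignMathModel(ABC):
    """Common standalone interface for a component model (names invented)."""

    @abstractmethod
    def initialize(self, params: dict) -> None: ...

    @abstractmethod
    def step(self, t: float, inputs: dict) -> dict: ...

class RateGyroModel(DesignMathModel):
    """Toy rate-gyro model: measured rate = true rate + constant bias."""

    def initialize(self, params: dict) -> None:
        self.bias = params.get("bias", 0.0)

    def step(self, t: float, inputs: dict) -> dict:
        return {"measured_rate": inputs["true_rate"] + self.bias}

# A unit-test case shipped with the model documents intended behavior
# and catches divergent implementations in downstream analysis tools.
def test_rate_gyro_bias():
    m = RateGyroModel()
    m.initialize({"bias": 0.01})
    out = m.step(0.0, {"true_rate": 1.0})
    assert abs(out["measured_rate"] - 1.01) < 1e-12

test_rate_gyro_bias()
```

Because the model is executable, ambiguity of the kind a prose requirement can hide is surfaced immediately: the integrator either reproduces the test-case outputs or does not.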

  1. Research in software allocation for advanced manned mission communications and tracking systems

    NASA Technical Reports Server (NTRS)

    Warnagiris, Tom; Wolff, Bill; Kusmanoff, Antone

    1990-01-01

    An assessment of the planned processing hardware and software/firmware for the Communications and Tracking System of the Space Station Freedom (SSF) was performed. The intent of the assessment was to determine the optimum distribution of software/firmware in the processing hardware for maximum throughput with minimum required memory. As a product of the assessment process, an assessment methodology was to be developed that could be used for similar assessments of future manned spacecraft system designs. The assessment process was hampered by changing requirements for the Space Station. As a result, the initial objective of determining the optimum software/firmware allocation was not fulfilled, but several useful conclusions and recommendations resulted from the assessment. It was concluded that the assessment process would not be completely successful for a system with changing requirements. It was also concluded that memory and hardware requirements were being modified to fit as a consequence of the change process, and that although throughput could not be quantified, potential problem areas could be identified. Finally, inherent flexibility of the system design was essential for the success of a system design with changing requirements. Recommendations resulting from the assessment included development of common software for some embedded controller functions, reduction of embedded processor requirements by hardwiring some Orbital Replacement Units (ORUs) to make better use of processor capabilities, and improvement in communications between software development personnel to enhance the integration process. Lastly, a critical observation was made that the software integration tasks did not appear to be addressed in the design process to the degree necessary for successful satisfaction of the system requirements.

  2. Low-Cost Rapid Usability Testing: Its Application in Both Product Development and System Implementation.

    PubMed

    Kushniruk, Andre; Borycki, Elizabeth

    2017-01-01

    In recent years there has been considerable discussion around the need for certification and regulation of healthcare information technology (IT). In particular, the usability of the products being developed needs to be evaluated. This has included the application of standards designed to ensure the process of system development is user-centered and takes usability into consideration while a product is being developed. In addition to this, in healthcare, organizations in the United States and Europe have also addressed the need and requirement for product certification. However, despite these efforts there are continued reports of unusable and unsafe implementations. In this paper we discuss the need to not only include (and require) usability testing in the one-time development process of health IT products (such as EHRs), but we also argue for the need to additionally develop specific usability standards and requirements for usability testing during the implementation of vendor products (i.e. post product development) in healthcare settings. It is further argued that health IT products that may have been certified regarding their development process will still require application of usability testing in the process of implementing them in real hospital settings in order to ensure usability and safety. This is needed in order to ensure that the final result of both product development and implementation processes take into account and apply the latest usability principles and methods.

  3. SHARP's systems engineering challenge: rectifying integrated product team requirements with performance issues in an evolutionary spiral development acquisition

    NASA Astrophysics Data System (ADS)

    Kuehl, C. Stephen

    2003-08-01

    Completing its final development and early deployment on the Navy's multi-role aircraft, the F/A-18 E/F Super Hornet, the SHAred Reconnaissance Pod (SHARP) provides the war fighter with the latest digital tactical reconnaissance (TAC Recce) Electro-Optical/Infrared (EO/IR) sensor system. The SHARP program is an evolutionary acquisition that used a spiral development process across a prototype development phase tightly coupled into overlapping Engineering and Manufacturing Development (EMD) and Low Rate Initial Production (LRIP) phases. Under a tight budget environment with a highly compressed schedule, SHARP challenged traditional acquisition strategies and systems engineering (SE) processes. Adopting tailored state-of-the-art systems engineering process models allowd the SHARP program to overcome the technical knowledge transition challenges imposed by a compressed program schedule. The program's original goal was the deployment of digital TAC Recce mission capabilities to the fleet customer by summer of 2003. Hardware and software integration technical challenges resulted from requirements definition and analysis activities performed across a government-industry led Integrated Product Team (IPT) involving Navy engineering and test sites, Boeing, and RTSC-EPS (with its subcontracted hardware and government furnished equipment vendors). Requirements development from a bottoms-up approach was adopted using an electronic requirements capture environment to clarify and establish the SHARP EMD product baseline specifications as relevant technical data became available. Applying Earned-Value Management (EVM) against an Integrated Master Schedule (IMS) resulted in efficiently managing SE task assignments and product deliveries in a dynamically evolving customer requirements environment. 
Application of Six Sigma improvement methodologies uncovered root causes of errors in wiring interconnectivity drawings, pod manufacturing processes, and avionics requirements specifications. Utilizing the draft NAVAIR SE guideline handbook and the ANSI/EIA-632 standard, Processes for Engineering a System, a tailored systems engineering process approach was adopted for the accelerated SHARP EMD program. Tailoring SE processes in this accelerated product delivery environment provided unique opportunities to be technically creative in the establishment of a product performance baseline. This paper provides a historical overview of the systems engineering activities spanning the prototype phase through the EMD SHARP program phase, the performance requirement capture activities and refinement process challenges, and the SE process improvements that can be applied to future SHARP-like programs adopting a compressed, evolutionary spiral development acquisition paradigm.
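The EVM metrics credited above for managing SE task assignments are standard and can be sketched briefly. The task figures below are hypothetical illustrations, not data from the SHARP program:

```python
# Illustrative Earned-Value Management (EVM) indices; the task values are
# hypothetical examples, not drawn from the SHARP program.

def evm_indices(planned_value, earned_value, actual_cost):
    """Return the standard EVM performance indices and variances."""
    return {
        "SPI": earned_value / planned_value,  # Schedule Performance Index
        "CPI": earned_value / actual_cost,    # Cost Performance Index
        "SV": earned_value - planned_value,   # Schedule Variance
        "CV": earned_value - actual_cost,     # Cost Variance
    }

# Example: work planned at $40k to date, $35k of value earned, $45k spent.
metrics = evm_indices(planned_value=40_000, earned_value=35_000, actual_cost=45_000)
print(metrics["SPI"], metrics["CV"])  # 0.875 -10000: behind schedule, over cost
```

Indices below 1.0 (or negative variances) flag tasks needing management attention.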

  4. Process and utility water requirements for cellulosic ethanol production processes via fermentation pathway

    EPA Science Inventory

    The increasing need of additional water resources for energy production is a growing concern for future economic development. In technology development for ethanol production from cellulosic feedstocks, a detailed assessment of the quantity and quality of water required, and the ...

  5. What is the Final Verification of Engineering Requirements?

    NASA Technical Reports Server (NTRS)

    Poole, Eric

    2010-01-01

This slide presentation reviews the development process through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved parties need to formally agree upon the requirements, including any changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis, and demonstration. The verification plan should be created as soon as the system requirements are documented. The plan should ensure that every requirement is formally verified, that the methods and responsible organizations are specified, and that the plan itself is reviewed by all parties. The presentation also discusses the option of keeping the engineering team involved in all phases of development, as opposed to having another organization continue the process once the design is complete.
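The verification plan described above, with a method and a responsible organization per requirement, can be sketched as a simple verification matrix. The requirement IDs and organization names are hypothetical, not taken from the presentation:

```python
# Minimal requirements-verification matrix sketch; the requirement IDs and
# organizations below are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str
    text: str
    method: str            # "test" | "inspection" | "analysis" | "demonstration"
    responsible_org: str
    verified: bool = False

def unverified(matrix):
    """Return the IDs of requirements that still lack formal verification."""
    return [r.req_id for r in matrix if not r.verified]

matrix = [
    Requirement("SYS-001", "Total mass shall not exceed 250 kg", "analysis", "Structures"),
    Requirement("SYS-002", "Power-on self test shall pass", "test", "Avionics", verified=True),
]
print(unverified(matrix))  # ['SYS-001']
```

A closure report would require `unverified(matrix)` to return an empty list before the system is accepted.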

  6. DREAMS and IMAGE: A Model and Computer Implementation for Concurrent, Life-Cycle Design of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized into two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.

  7. Eliciting User Requirements Using Appreciative Inquiry

    ERIC Educational Resources Information Center

    Gonzales, Carol Kernitzki

    2010-01-01

    Many software development projects fail because they do not meet the needs of users, are over-budget, and abandoned. To address this problem, the user requirements elicitation process was modified based on principles of Appreciative Inquiry. Appreciative Inquiry, commonly used in organizational development, aims to build organizations, processes,…

  8. 24 CFR 1003.300 - Application requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Application requirements. 1003.300 Section 1003.300 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN DEVELOPMENT... Grant Application and Selection Process § 1003.300 Application requirements. (a) Application information...

  9. In-Space Manufacturing (ISM): Pioneering Space Exploration

    NASA Technical Reports Server (NTRS)

    Werkheiser, Niki

    2015-01-01

    ISM Objective: Develop and enable the manufacturing technologies and processes required to provide on-demand, sustainable operations for Exploration Missions. This includes development of the desired capabilities, as well as the required processes for the certification, characterization & verification that will enable these capabilities to become institutionalized via ground-based and ISS demonstrations.

  10. Juggling Act: Re-Planning and Building on Observatory...Simultaneously!

    NASA Technical Reports Server (NTRS)

    Zavala, Eddie; Daws, Patricia

    2011-01-01

SOFIA (Stratospheric Observatory for Infrared Astronomy) is a major SMD program that has been required to meet several new requirements and implement major planning and business initiatives over the past 1 1/2 years, in the midst of system development and flight test phases. The program was required to implement JCL and EVM simultaneously, as well as to undergo a major replan and a Standing Review Board, all without impacting technical schedule progress. The team developed innovative processes that met all the requirements and improved the Program Management process toolsets. The SOFIA team, subject to all the typical budget constraints, found ways to leverage existing roles in new ways to meet the requirements without creating unmanageable overhead. The team developed strategies and value-added processes, such as improved risk identification, structured reserves management, and cost/risk integration, so that the effort expended resulted in a positive return to the program.

  11. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). 
Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
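The inputs and outputs described above (product size in lines of code, productivity, defects per source line, iteration count) suggest the shape of such a simulation. The following is a deliberately simplified toy model in that spirit; the linear effort/defect relations and all parameter values are illustrative assumptions, not the PATT or IEEE 12207 model:

```python
# Toy spiral-process simulation sketch; the simple linear model and the
# parameter values are illustrative assumptions, not the PATT model.

def simulate_spiral(total_sloc, sloc_per_hour, defects_per_ksloc,
                    iterations, detect_rate=0.9):
    """Repeat a simplified waterfall pass once per spiral iteration."""
    effort_hours = 0.0
    escaped_defects = 0.0
    sloc_per_iteration = total_sloc / iterations
    for _ in range(iterations):
        effort_hours += sloc_per_iteration / sloc_per_hour     # development effort
        injected = sloc_per_iteration / 1000 * defects_per_ksloc
        escaped_defects += injected * (1 - detect_rate)        # defects not caught
    return effort_hours, escaped_defects

effort, escaped = simulate_spiral(total_sloc=50_000, sloc_per_hour=2.5,
                                  defects_per_ksloc=20, iterations=4)
print(round(effort), round(escaped, 1))  # 20000 100.0
```

A real model would also vary requirements between iterations and track rework, which is precisely what makes the spiral/waterfall comparison above non-trivial.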

  12. Crew interface specifications preparation for in-flight maintenance and stowage functions

    NASA Technical Reports Server (NTRS)

    Parker, F. W.; Carlton, B. E.

    1972-01-01

The findings and data products developed during the Phase 2 crew interface specification study are presented. Five new NASA general specifications were prepared: operations location coding system for crew interfaces; loose equipment and stowage management requirements; loose equipment and stowage data base information requirements; spacecraft loose equipment stowage drawing requirements; and inflight stowage management data requirements. Additional data were developed defining inflight maintenance processes and related data concepts for inflight troubleshooting, remove/repair/replace, and scheduled maintenance activities. The process of maintenance task and equipment definition during spacecraft design and development was also defined, and related data concepts were identified for further development into formal NASA specifications during future follow-on study phases of the contract.

  13. Advanced manufacturing development of a composite empennage component for L-1011 aircraft

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Work continued toward the development of tooling and processing concepts required for a cocured hat/skin cover assembly. A plan was developed and implemented to develop the process for using preimpregnated T300/5208 with a resin content of 34 + or - 2 percent by weight. Use of this material results in a simplified laminating process because removal by bleeding or prebleeding is no longer required. The approach to this task basically consists of fabricating and testing flat laminated panels and simulated structural panels to verify known processing techniques relative to end-laminate quality. The flat panels were used to determine air bleeding arrangement and required cure cycle. Single and multihat-stiffened panels were fabricated using the established air bleeding arrangement and cure cycle with the resulting cured parts yielding excellent correlation of ply thickness with all surfaces clear of porosity and voids.

  14. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Shivanand M., Handigund; Shweta, Bhat

The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve the software tools, the hardware devices, the manual procedures, the application programs, and the communication commands. These components are appropriately ordered for achieving the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram, and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in authenticating its correctness and completeness.

  15. Coal liquefaction processes and development requirements analysis for synthetic fuels production

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Focus of the study is on: (1) developing a technical and programmatic data base on direct and indirect liquefaction processes which have potential for commercialization during the 1980's and beyond, and (2) performing analyses to assess technology readiness and development trends, development requirements, commercial plant costs, and projected synthetic fuel costs. Numerous data sources and references were used as the basis for the analysis results and information presented.

  16. 78 FR 310 - Draft Revision of Guidance for Industry on Providing Regulatory Submissions in Electronic Format...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-03

    ...ApprovalProcess/FormsSubmissionRequirements/ElectronicSubmissions/ucm253101.htm , http://www.regulations.../Drugs/DevelopmentApprovalProcess/FormsSubmissionRequirements/ElectronicSubmissions/ucm253101.htm , http...), in a format that FDA can process, review, and archive. Currently, the Agency can process, review, and...

  17. KSC Shuttle ground turnaround evaluation

    NASA Technical Reports Server (NTRS)

    Ragusa, J. M.

    1983-01-01

    Payload/mission development, processing flows, facilities/systems, and the various environments to which a payload is exposed during ground processing are described. These considerations are important for payload design and ground processing requirements development.

  18. Software Development and Test Methodology for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  19. Software Design Methodology Migration for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; McNair, Ann R. (Technical Monitor)

    2002-01-01

The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new software processes still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  20. Capturing security requirements for software systems.

    PubMed

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-07-01

Security is often an afterthought during software development. Realizing security early, especially in the requirements phase, is important so that security problems can be tackled early enough, before going further in the process, to avoid rework. A more effective approach to security requirement engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on the problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited by such a methodology, in addition to the assistance offered to developers in eliciting security requirements in a more systematic way.
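The catalog-driven elicitation described above, threats modeled as abuse frames and mapped to candidate security requirements, can be sketched minimally. The threat names and countermeasures below are illustrative placeholders, not entries from the paper's actual catalog:

```python
# Minimal security-requirements catalog lookup sketch; the threat categories
# and countermeasures are illustrative, not taken from the paper's catalog.

CATALOG = {
    "spoofing":   ["authenticate all external interfaces"],
    "tampering":  ["integrity-check stored data", "sign inter-component messages"],
    "disclosure": ["encrypt data at rest and in transit"],
}

def elicit_requirements(threats):
    """Map identified threats (abuse frames) to candidate security requirements."""
    requirements = []
    for threat in threats:
        # Fall back to a manual-analysis marker for threats outside the catalog.
        requirements.extend(
            CATALOG.get(threat, [f"no catalog entry: analyze '{threat}' manually"]))
    return requirements

print(elicit_requirements(["tampering", "disclosure"]))
```

A fuller implementation would also cross-check the elicited requirements against each other, since the paper's evaluation criteria focus on conflict identification.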

  1. Capturing security requirements for software systems

    PubMed Central

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-01-01

Security is often an afterthought during software development. Realizing security early, especially in the requirements phase, is important so that security problems can be tackled early enough, before going further in the process, to avoid rework. A more effective approach to security requirement engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on the problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited by such a methodology, in addition to the assistance offered to developers in eliciting security requirements in a more systematic way. PMID:25685514

  2. DEVS Unified Process for Web-Centric Development and Testing of System of Systems

    DTIC Science & Technology

    2008-05-20

gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications... 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) or Business Process Execution Language (BPEL) provide a... information is stored in .wsdl and .bpel files for BPEL but in proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of

  3. Crew interface specifications development functions, phase 3A

    NASA Technical Reports Server (NTRS)

    Carl, J. G.

    1973-01-01

    The findings and data products developed during the crew interface specification study for inflight maintenance and stowage functions are presented. Guidelines are provided for improving the present progress of defining, controlling, and managing the flight crew requirements. The following data products were developed: (1) description of inflight maintenance management process, (2) specifications for inflight maintenance management requirements, and (3) suggested inflight maintenance data processing reports for logistics management.

  4. Post-test navigation data analysis techniques for the shuttle ALT

    NASA Technical Reports Server (NTRS)

    1975-01-01

Postflight test analysis data processing techniques for shuttle approach and landing tests (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.

  5. A comprehensive study on regulatory requirements for development and filing of generic drugs globally

    PubMed Central

    Handoo, Shweta; Arora, Vandana; Khera, Deepak; Nandi, Prafulla Kumar; Sahu, Susanta Kumar

    2012-01-01

    The regulatory requirements of various countries of the world vary from each other. Therefore, it is challenging for the companies to develop a single drug which can be simultaneously submitted in all the countries for approval. The regulatory strategy for product development is essentially to be established before commencement of developmental work in order to avoid major surprises after submission of the application. The role of the regulatory authorities is to ensure the quality, safety, and efficacy of all medicines in circulation in their country. It not only includes the process of regulating and monitoring the drugs but also the process of manufacturing, distribution, and promotion of it. One of the primary challenges for regulatory authority is to ensure that the pharmaceutical products are developed as per the regulatory requirement of that country. This process involves the assessment of critical parameters during product development. PMID:23373001

  6. Estimation of Managerial and Technical Personnel Requirements in Selected Industries. Training for Industry Series, No. 2.

    ERIC Educational Resources Information Center

    United Nations Industrial Development Organization, Vienna (Austria).

    The need to develop managerial and technical personnel in the cement, fertilizer, pulp and paper, sugar, leather and shoe, glass, and metal processing industries of various nations was studied, with emphasis on necessary steps in developing nations to relate occupational requirements to technology, processes, and scale of output. Estimates were…

  7. The opto-mechanical design process: from vision to reality

    NASA Astrophysics Data System (ADS)

    Kvamme, E. Todd; Stubbs, David M.; Jacoby, Michael S.

    2017-08-01

    The design process for an opto-mechanical sub-system is discussed from requirements development through test. The process begins with a proper mission understanding and the development of requirements for the system. Preliminary design activities are then discussed with iterative analysis and design work being shared between the design, thermal, and structural engineering personnel. Readiness for preliminary review and the path to a final design review are considered. The value of prototyping and risk mitigation testing is examined with a focus on when it makes sense to execute a prototype test program. System level margin is discussed in general terms, and the practice of trading margin in one area of performance to meet another area is reviewed. Requirements verification and validation is briefly considered. Testing and its relationship to requirements verification concludes the design process.

  8. Human Integration Design Processes (HIDP)

    NASA Technical Reports Server (NTRS)

    Boyer, Jennifer

    2014-01-01

The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. 
Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference missions. The HIDP is a reference document that is intended to be used during the development of crewed space systems and operations to guide human-systems development process activities.

  9. 24 CFR 972.109 - Conversion of developments.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Conversion of developments. 972.109... DEVELOPMENT CONVERSION OF PUBLIC HOUSING TO TENANT-BASED ASSISTANCE Required Conversion of Public Housing Developments Required Conversion Process § 972.109 Conversion of developments. (a)(1) The PHA may proceed to...

  10. 24 CFR 972.109 - Conversion of developments.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Conversion of developments. 972.109... DEVELOPMENT CONVERSION OF PUBLIC HOUSING TO TENANT-BASED ASSISTANCE Required Conversion of Public Housing Developments Required Conversion Process § 972.109 Conversion of developments. (a)(1) The PHA may proceed to...

  11. NTP comparison process

    NASA Technical Reports Server (NTRS)

    Corban, Robert

    1993-01-01

The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams to develop alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and provide a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum possible amount of quantitative data will be developed and/or validated to be utilized in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.
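The QFD evaluation matrix described above amounts to scoring each concept against customer-weighted attributes and comparing against the reference configuration. The attributes, weights, and scores below are hypothetical illustrations, not NTP program data:

```python
# Illustrative QFD-style weighted scoring of alternative concepts against a
# reference configuration; attributes, weights, and scores are hypothetical.

def qfd_score(weights, scores):
    """Weighted sum of a concept's attribute scores."""
    return sum(weights[attr] * scores[attr] for attr in weights)

weights   = {"thrust_to_weight": 0.40, "specific_impulse": 0.35, "reliability": 0.25}
concept_a = {"thrust_to_weight": 7,    "specific_impulse": 9,    "reliability": 6}
reference = {"thrust_to_weight": 6,    "specific_impulse": 6,    "reliability": 8}

# Concept A outscores the reference (7.45 vs 6.5), so its strong features
# would be candidates for incorporation into the reference configuration.
print(qfd_score(weights, concept_a), qfd_score(weights, reference))
```

In practice the weights come from the voice-of-the-customer analysis, and the scores from the analysis, modeling, and prototyping activities listed above.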

  12. 24 CFR 3288.110 - Alternative Process agreements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Alternative Process agreements... HOUSING AND URBAN DEVELOPMENT MANUFACTURED HOME DISPUTE RESOLUTION PROGRAM Alternative Process in HUD-Administered States § 3288.110 Alternative Process agreements. (a) Required agreement. To use the Alternative...

  13. Welding and joining techniques.

    PubMed

    Chipperfield, F A; Dunkerton, S B

    2001-05-01

There is a welding solution for most applications. As products must meet more stringent requirements or require more flexible processes to aid design or reduce cost, further improvements or totally new processes are likely to be developed. Quality control aspects are also becoming more important for meeting regulations, and the monitoring and control of welding processes and the standardised testing of joints will meet some, if not all, of these requirements.

  14. Missile signal processing common computer architecture for rapid technology upgrade

    NASA Astrophysics Data System (ADS)

    Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul

    2004-10-01

Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain comprises two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration, and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction, and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs, and general-purpose processors. The resulting systems tended to be function-specific and required custom software development. They were developed using non-integrated toolsets, and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. 
This application may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. The approach also enables standardized development tools, third-party software upgrades, and rapid replacement of processing components as improved algorithms are developed. The resulting weapon system will have superior processing capability at deployment compared with a custom approach, as a result of shorter development cycles and the use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system and, because modification is simple, can migrate between weapon system variants. This paper presents a reference design using the new approach that utilizes an AltiVec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS), and application code developed using an efficient parallel vector library (PVL). A quantification of computing requirements and a demonstration of an interceptor algorithm operating on this real-time platform are provided.
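    Two of the front-end stages named above, non-uniformity correction and frame integration, can be illustrated with a minimal NumPy sketch. This is a generic textbook illustration, not code from the paper; the toy detector model, calibration levels, and array sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64)

# Per-pixel fixed-pattern gain and offset: the non-uniformity to remove.
offset = rng.normal(10.0, 2.0, shape)
gain = rng.normal(1.0, 0.1, shape)

def sense(scene):
    """Toy detector model: each pixel applies its own gain and offset."""
    return gain * scene + offset

# Calibration frames at two known uniform irradiance levels.
low, high = 20.0, 120.0
cal_low, cal_high = sense(np.full(shape, low)), sense(np.full(shape, high))

# Two-point NUC: solve per-pixel gain and offset from the calibration pair.
g_est = (cal_high - cal_low) / (high - low)
o_est = cal_low - g_est * low

def nuc(frame):
    """Apply the estimated correction, flattening the fixed pattern."""
    return (frame - o_est) / g_est

# Frame integration: averaging N frames cuts temporal noise by ~1/sqrt(N).
scene = np.full(shape, 75.0)
frames = [nuc(sense(scene) + rng.normal(0.0, 1.0, shape)) for _ in range(16)]
integrated = np.mean(frames, axis=0)

print(np.std(sense(scene)) > 5.0)  # raw frame: large fixed-pattern spread
print(np.std(integrated) < 0.5)    # corrected + integrated: nearly flat
```

    Because both stages are identical, independent arithmetic per pixel, they map naturally onto the vector units and parallel libraries the paper advocates.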

  15. Real-time optical fiber digital speckle pattern interferometry for industrial applications

    NASA Astrophysics Data System (ADS)

    Chan, Robert K.; Cheung, Y. M.; Lo, C. H.; Tam, T. K.

    1997-03-01

    There is current interest, especially in the industrial sector, in using the digital speckle pattern interferometry (DSPI) technique to measure surface stress. Indeed, the many publications on the subject are evidence of the growing interest in the field. However, bringing the technology to industrial use requires the integration of several emerging technologies: optics, feedback control, electronics, image processing and digital signal processing. Due to the highly interdisciplinary nature of the technique, successful implementation and development require expertise in all of these fields. At Baptist University, under the funding of a major industrial grant, we are developing the technology for the industrial sector. Our system fully exploits optical fibers and diode lasers in its design to enable practical and rugged systems suited to industrial applications. Besides the development in optics, we have broken away from reliance on a microcomputer (PC) platform for both image capture and processing, and have developed a digital signal processing array system that can handle simultaneous and independent image capture/processing with feedback control. The system, named CASPA for 'cascadable architecture signal processing array,' is a third-generation development system that utilizes up to 7 digital signal processors and has proved to be very powerful. With CASPA we are now in a better position to develop novel optical measurement systems for industrial applications that may require different measurement systems to operate concurrently and to exchange information. Applications such as simultaneous in-plane and out-of-plane DSPI image capture/processing, vibration analysis with interactive DSPI, and phase-shifting control of optical systems are a few examples of the potential.
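    The phase-recovery step at the heart of phase-shifting interferometry can be sketched in a few lines. The following is the standard four-step (four-bucket) algorithm applied to synthetic fringe data, not CASPA code; the background level, modulation depth, and phase map are invented for the example.

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Four-step phase shifting: given intensity frames captured at
    reference-phase shifts of 0, pi/2, pi and 3*pi/2, recover the
    wrapped object phase in (-pi, pi]."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringe data with a known phase map phi.
x = np.linspace(0.0, 4.0 * np.pi, 256)
phi = np.sin(x)                      # "true" phase, safely inside (-pi, pi]
a, b = 2.0, 1.0                      # background intensity and modulation
frames = [a + b * np.cos(phi + k * np.pi / 2.0) for k in range(4)]

recovered = four_step_phase(*frames)
print(np.max(np.abs(recovered - phi)) < 1e-9)
```

    Phases exceeding the (-pi, pi] range would additionally require an unwrapping step; the per-pixel arctangent itself is exactly the kind of independent, data-parallel workload a DSP array can run concurrently with capture and feedback control.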

  16. Discovering system requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bahill, A.T.; Bentz, B.; Dean, F.F.

    1996-07-01

    Cost and schedule overruns are often caused by poor requirements that are produced by people who do not understand the requirements process. This report provides a high-level overview of the system requirements process, explaining types, sources, and characteristics of good requirements. System requirements, however, are seldom stated by the customer. Therefore, this report shows ways to help you work with your customer to discover the system requirements. It also explains terminology commonly used in the requirements development field, such as verification, validation, technical performance measures, and the various design reviews.

  17. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    NASA Technical Reports Server (NTRS)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements development, management, and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.

  18. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
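    The report's split between the two activities can be sketched abstractly: requirements-based (validation) tests check the model against a high-level property, while model-based (conformance) tests drive the model and the generated code with identical input sequences and demand equal outputs. The toy system, property, and function names below are invented for illustration and are not from the report.

```python
import itertools

def model_step(state, inp):
    """Executable model of a toy toggle latch (hypothetical example)."""
    return (not state) if inp == "toggle" else state

def generated_step(state, inp):
    """Stand-in for code auto-generated from the model."""
    if inp == "toggle":
        return not state
    return state

def requirements_based_test():
    """Validation: the model must satisfy a high-level requirement,
    here 'an even number of toggles leaves the state unchanged'."""
    state = False
    for _ in range(4):
        state = model_step(state, "toggle")
    return state is False

def conformance_test(steps=5):
    """Conformance: exhaustively compare model and generated code
    over all input sequences of a fixed length."""
    for seq in itertools.product(["toggle", "hold"], repeat=steps):
        m = c = False
        for inp in seq:
            m, c = model_step(m, inp), generated_step(c, inp)
            if m != c:
                return False  # behavioral divergence found
    return True

print(requirements_based_test(), conformance_test())
```

    The two oracles differ exactly as the report describes: the first needs a statement of intent independent of the model, the second needs only the model itself.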

  19. NASA Procurement Career Development Program

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The NASA Procurement Career Development Program establishes an agency-wide framework for the management of career development activity in the procurement field. Within this framework, installations are encouraged to modify the various components to meet installation-specific mission and organization requirements. This program provides a systematic process for assessing and planning the development, training, and education required to increase employees' competence in the procurement work functions. It defines the agency-wide basic knowledge and skills, by career field and level, upon which individual and organizational development plans are built. Also, it provides a system that is compatible with other human resource management and development systems, processes, and activities. This compatibility and linkage are important in fostering the dual responsibility of the individual and the organization in the career development process.

  20. Strengthening Interprofessional Requirements Engineering Through Action Sheets: A Pilot Study.

    PubMed

    Kunz, Aline; Pohlmann, Sabrina; Heinze, Oliver; Brandner, Antje; Reiß, Christina; Kamradt, Martina; Szecsenyi, Joachim; Ose, Dominik

    2016-10-18

    The importance of information and communication technology for healthcare is steadily growing. Newly developed tools are addressing different user groups: physicians, other health care professionals, social workers, patients, and family members. Since often many different actors with different expertise and perspectives are involved in the development process it can be a challenge to integrate the user-reported requirements of those heterogeneous user groups. Nevertheless, the understanding and consideration of user requirements is the prerequisite of building a feasible technical solution. In the course of the presented project it proved to be difficult to gain clear action steps and priorities for the development process out of the primary requirements compilation. Even if a regular exchange between involved teams took place there was a lack of a common language. The objective of this paper is to show how the already existing requirements catalog was subdivided into specific, prioritized, and coherent working packages and the cooperation of multiple interprofessional teams within one development project was reorganized at the same time. In the case presented, the manner of cooperation was reorganized and a new instrument called an Action Sheet was implemented. This paper introduces the newly developed methodology which was meant to smooth the development of a user-centered software product and to restructure interprofessional cooperation. There were 10 focus groups in which views of patients with colorectal cancer, physicians, and other health care professionals were collected in order to create a requirements catalog for developing a personal electronic health record. Data were audio- and videotaped, transcribed verbatim, and thematically analyzed. 
Afterwards, the requirements catalog was reorganized in the form of Action Sheets which supported the interprofessional cooperation referring to the development process of a personal electronic health record for the Rhine-Neckar region. In order to improve the interprofessional cooperation the idea arose to align the requirements arising from the implementation project with the method of software development applied by the technical development team. This was realized by restructuring the original requirements set in a standardized way and under continuous adjustment between both teams. As a result not only the way of displaying the user demands but also of interprofessional cooperation was steered in a new direction. User demands must be taken into account from the very beginning of the development process, but it is not always obvious how to bring them together with IT know-how and knowledge of the contextual factors of the health care system. Action Sheets seem to be an effective tool for making the software development process more tangible and convertible for all connected disciplines. Furthermore, the working method turned out to support interprofessional ideas exchange.

  1. Strengthening Interprofessional Requirements Engineering Through Action Sheets: A Pilot Study

    PubMed Central

    Pohlmann, Sabrina; Heinze, Oliver; Brandner, Antje; Reiß, Christina; Kamradt, Martina; Szecsenyi, Joachim; Ose, Dominik

    2016-01-01

    Background The importance of information and communication technology for healthcare is steadily growing. Newly developed tools are addressing different user groups: physicians, other health care professionals, social workers, patients, and family members. Since often many different actors with different expertise and perspectives are involved in the development process it can be a challenge to integrate the user-reported requirements of those heterogeneous user groups. Nevertheless, the understanding and consideration of user requirements is the prerequisite of building a feasible technical solution. In the course of the presented project it proved to be difficult to gain clear action steps and priorities for the development process out of the primary requirements compilation. Even if a regular exchange between involved teams took place there was a lack of a common language. Objective The objective of this paper is to show how the already existing requirements catalog was subdivided into specific, prioritized, and coherent working packages and the cooperation of multiple interprofessional teams within one development project was reorganized at the same time. In the case presented, the manner of cooperation was reorganized and a new instrument called an Action Sheet was implemented. This paper introduces the newly developed methodology which was meant to smooth the development of a user-centered software product and to restructure interprofessional cooperation. Methods There were 10 focus groups in which views of patients with colorectal cancer, physicians, and other health care professionals were collected in order to create a requirements catalog for developing a personal electronic health record. Data were audio- and videotaped, transcribed verbatim, and thematically analyzed. 
Afterwards, the requirements catalog was reorganized in the form of Action Sheets which supported the interprofessional cooperation referring to the development process of a personal electronic health record for the Rhine-Neckar region. Results In order to improve the interprofessional cooperation the idea arose to align the requirements arising from the implementation project with the method of software development applied by the technical development team. This was realized by restructuring the original requirements set in a standardized way and under continuous adjustment between both teams. As a result not only the way of displaying the user demands but also of interprofessional cooperation was steered in a new direction. Conclusions User demands must be taken into account from the very beginning of the development process, but it is not always obvious how to bring them together with IT know-how and knowledge of the contextual factors of the health care system. Action Sheets seem to be an effective tool for making the software development process more tangible and convertible for all connected disciplines. Furthermore, the working method turned out to support interprofessional ideas exchange. PMID:27756716

  2. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.

  3. PLAN-TA9-2443(U), Rev. B Remediated Nitrate Salt (RNS) Surrogate Formulation and Testing Standard Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Geoffrey Wayne

    2016-03-16

    This document identifies the scope and some general procedural steps for performing Remediated Nitrate Salt (RNS) Surrogate Formulation and Testing. This Test Plan describes the requirements, responsibilities, and process for preparing and testing a range of chemical surrogates intended to mimic the energetic response of waste created during processing of legacy nitrate salts. The surrogates developed are expected to bound the thermal and mechanical sensitivity of such waste, allowing for the development of process parameters required to minimize the risk to workers and the public when processing this waste. Such parameters will be based on the worst-case kinetic parameters as derived from APTAC measurements, as well as the development of controls to mitigate sensitivities that may exist due to friction, impact, and spark. This Test Plan will define the scope and technical approach for activities that implement Quality Assurance requirements relevant to formulation and testing.

  4. Development of utility generic functional requirements for electronic work packages and computer-based procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna

    The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward a vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involved with information technology used in nuclear-power utilities. NITSL strives to maintain awareness of industry information technology-related initiatives and events and communicates those events to its membership. NITSL and LWRS Program researchers have been coordinating activities, including joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility generic functional requirements for eWP systems. This set of requirements will support each utility in its process of identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members; the largest groups consist of 19 commercial U.S. nuclear utilities and 11 of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper describes the development process as well as a summary of the requirements.

  5. Possible monitoring requirements for the disinfectants and disinfection by-products (D/DBP) regulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-01-01

    The monitoring requirements presented in the report were developed by EPA before a negotiated Disinfectants and Disinfection By-Products (D/DBP) rule was considered. The framework described herein may be substantially changed as a result of the negotiated rulemaking process. The document is useful to consider in developing various monitoring options during the negotiated rulemaking process.

  6. Requirements Development for the NASA Advanced Engineering Environment (AEE)

    NASA Technical Reports Server (NTRS)

    Rogers, Eric; Hale, Joseph P.; Zook, Keith; Gowda, Sanjay; Salas, Andrea O.

    2003-01-01

    The requirements development process for the Advanced Engineering Environment (AEE) is presented. This environment has been developed to allow NASA to perform independent analysis and design of space transportation architectures and technologies. Given the highly collaborative and distributed nature of AEE, a variety of organizations are involved in the development, operations and management of the system. Furthermore, there are additional organizations involved representing external customers and stakeholders. Thorough coordination and effective communication are essential to translate desired expectations of the system into requirements. Functional, verifiable requirements for this (and indeed any) system are necessary to fulfill several roles. Requirements serve as a contractual tool, configuration management tool, and as an engineering tool, sometimes simultaneously. The role of requirements as an engineering tool is particularly important because a stable set of requirements for a system provides a common framework of system scope and characterization among team members. Furthermore, the requirements provide the basis for checking completion of system elements and form the basis for system verification. Requirements are at the core of systems engineering. The AEE Project has undertaken a thorough process to translate the desires and expectations of external customers and stakeholders into functional system-level requirements that are captured with sufficient rigor to allow development planning, resource allocation and system-level design, development, implementation and verification. These requirements are maintained in an integrated, relational database that provides traceability to governing Program requirements and also to verification methods and subsystem-level requirements.
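    A relational requirements store with traceability to parent requirements and verification methods, of the general kind described, can be sketched with a small SQLite schema. The table layout, requirement IDs, and verification categories below are illustrative assumptions, not the AEE Project's actual schema.

```python
import sqlite3

# In-memory requirements database; one row per requirement, with a
# self-referencing parent_id giving traceability to the governing level.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE requirement (
    id TEXT PRIMARY KEY,
    text TEXT NOT NULL,
    parent_id TEXT REFERENCES requirement(id),  -- trace to governing req
    verification TEXT CHECK (verification IN
        ('test', 'analysis', 'inspection', 'demonstration'))
);
""")
rows = [
    ("PRG-001", "Program shall enable independent architecture analysis.",
     None, "inspection"),
    ("SYS-010", "System shall support distributed collaborative design.",
     "PRG-001", "demonstration"),
    ("SUB-104", "Client shall exchange models over the program network.",
     "SYS-010", "test"),
]
db.executemany("INSERT INTO requirement VALUES (?, ?, ?, ?)", rows)

# Walk the trace chain from a subsystem requirement up to its program parent.
chain = db.execute("""
    WITH RECURSIVE trace(id, parent_id, depth) AS (
        SELECT id, parent_id, 0 FROM requirement WHERE id = 'SUB-104'
        UNION ALL
        SELECT r.id, r.parent_id, t.depth + 1
        FROM requirement r JOIN trace t ON r.id = t.parent_id
    )
    SELECT id FROM trace ORDER BY depth
""").fetchall()
print([r[0] for r in chain])
```

    The recursive query is what makes the database useful as an engineering tool: any subsystem requirement can be checked for an unbroken trace up to a governing program requirement, and the verification column supports completeness checks per verification method.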

  7. Implementation of a configurable laboratory information management system for use in cellular process development and manufacturing.

    PubMed

    Russom, Diana; Ahmed, Amira; Gonzalez, Nancy; Alvarnas, Joseph; DiGiusto, David

    2012-01-01

    Regulatory requirements for the manufacturing of cell products for clinical investigation require a significant level of record-keeping, starting early in process development and continuing through to the execution and requisite follow-up of patients on clinical trials. Central to record-keeping is the management of documentation related to patients, raw materials, processes, assays and facilities. To support these requirements, we evaluated several laboratory information management systems (LIMS), including their cost, flexibility, regulatory compliance, ongoing programming requirements and ability to integrate with laboratory equipment. After selecting a system, we performed a pilot study to develop a user-configurable LIMS for our laboratory in support of our pre-clinical and clinical cell-production activities. We report here on the design and utilization of this system to manage accrual with a healthy blood-donor protocol, as well as manufacturing operations for the production of a master cell bank and several patient-specific stem cell products. The system was used successfully to manage blood donor eligibility, recruiting, appointments, billing and serology, and to provide annual accrual reports. Quality management reporting features of the system were used to capture, report and investigate process and equipment deviations that occurred during the production of a master cell bank and patient products. Overall the system has served to support the compliance requirements of process development and phase I/II clinical trial activities for our laboratory and can be easily modified to meet the needs of similar laboratories.

  8. Solid Waste Management Requirements Definition for Advanced Life Support Missions: Results

    NASA Technical Reports Server (NTRS)

    Alazraki, Michael P.; Hogan, John; Levri, Julie; Fisher, John; Drysdale, Alan

    2002-01-01

    Prior to determining what Solid Waste Management (SWM) technologies should be researched and developed by the Advanced Life Support (ALS) Project for future missions, there is a need to define SWM requirements. Because future waste streams will be highly mission-dependent, missions need to be defined prior to developing SWM requirements. The SWM Working Group has used the mission architecture outlined in the System Integration, Modeling and Analysis (SIMA) Element Reference Missions Document (RMD) as a starting point in the requirements development process. The missions examined include the International Space Station (ISS), a Mars Dual Lander mission, and a Mars Base. The SWM Element has also identified common SWM functionalities needed for future missions. These functionalities include: acceptance, transport, processing, storage, monitoring and control, and disposal. Requirements in each of these six areas are currently being developed for the selected missions. This paper reviews the results of this ongoing effort and identifies mission-dependent resource recovery requirements.

  9. Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process

    DTIC Science & Technology

    2012-10-01

    involves early use of systems engineering and technical analyses to supplement the existing operational analysis techniques currently used in... complexity, and costs of systems now being developed require tight coupling between operational requirements stated in the CDD, system requirements... Keywords: Capability Development, Competitive Prototyping, Knowledge Points, Early Systems Engineering

  10. A Streamlined Approach for the Payload Customer in Identifying Payload Design Requirements

    NASA Technical Reports Server (NTRS)

    Miller, Ladonna J.; Schneider, Walter F.; Johnson, Dexer E.; Roe, Lesa B.

    2001-01-01

    NASA payload developers from across various disciplines were asked to identify areas where process changes would simplify their task of developing and flying flight hardware. Responses to this query included a central location for consistent hardware design requirements for middeck payloads. The multidisciplinary team assigned to review the numerous payload interface design documents is assessing the Space Shuttle middeck, the SPACEHAB Inc. locker, as well as the MultiPurpose Logistics Module (MPLM) and EXpedite the PRocessing of Experiments to Space Station (EXPRESS) rack design requirements for the payloads. They are comparing the multiple carriers and platform requirements and developing a matrix which illustrates the individual requirements, and where possible, the envelope that encompasses all of the possibilities. The matrix will be expanded to form an overall envelope that the payload developers will have the option to utilize when designing their payload's hardware. This will optimize the flexibility for payload hardware and ancillary items to be manifested on multiple carriers and platforms with minimal impact to the payload developer.

  11. Software for MR image overlay guided needle insertions: the clinical translation process

    NASA Astrophysics Data System (ADS)

    Ungi, Tamas; U-Thainual, Paweena; Fritz, Jan; Iordachita, Iulian I.; Flammang, Aaron J.; Carrino, John A.; Fichtinger, Gabor

    2013-03-01

    PURPOSE: Needle guidance software using augmented reality image overlay was translated from the experimental phase to support preclinical and clinical studies. Major functional and structural changes were needed to meet clinical requirements. We present the process applied to fulfill these requirements, and selected features that may be applied in the translational phase of other image-guided surgical navigation systems. METHODS: We used an agile software development process for rapid adaptation to unforeseen clinical requests. The process is based on iterations of operating room test sessions, feedback discussions, and software development sprints. The open-source application framework of 3D Slicer and the NA-MIC kit provided sufficient flexibility and stable software foundations for this work. RESULTS: All requirements were addressed in a process with 19 operating room test iterations. Most features developed in this phase were related to workflow simplification and operator feedback. CONCLUSION: Efficient and affordable modifications were facilitated by an open source application framework and frequent clinical feedback sessions. Results of cadaver experiments show that software requirements were successfully solved after a limited number of operating room tests.

  12. NASA Supportability Engineering Implementation Utilizing DoD Practices and Processes

    NASA Technical Reports Server (NTRS)

    Smith, David A.; Smith, John V.

    2010-01-01

    The Ares I design and development program determined early in the System Design Review phase to utilize the DoD Integrated Logistics Support (ILS) and Logistics Support Analysis (LSA) approach for supportability engineering as an integral part of the systems engineering process. This paper reviews the overall approach to designing the Ares I with an emphasis on a more affordable, supportable, and sustainable launch vehicle. Discussions include requirements development, design influence, support concept alternatives, ILS and LSA planning, logistics support analyses and trades performed, LSA tailoring for the NASA Ares Program, support system infrastructure identification, ILS design review documentation, working group coordination, and overall ILS implementation. At the outset, the Ares I Project initiated the development of the Integrated Logistics Support Plan (ILSP) and a Logistics Support Analysis process to provide a path forward for the management of the Ares I ILS program and supportability analysis activities. The ILSP provides the initial planning and coordination between the Ares I Project Elements and the Ground Operations Project. The LSA process provided a systems engineering approach to developing the Ares I supportability requirements, influencing the design for supportability, and developing alternative support concepts that satisfy the program operability requirements. The LSA planning and analysis results are documented in the Logistics Support Analysis Report. This document was required during the Ares I System Design Review (SDR) and Preliminary Design Review (PDR) cycles. To help coordinate the LSA process across the Ares I project and between programs, the LSA Report is updated and released quarterly. A System Requirements Analysis was performed to determine the supportability requirements and technical performance measurements (TPMs).
Two working groups were established to support the management and implementation of the Ares I ILS program: the Integrated Logistics Support Working Group (ILSWG) and the Logistics Support Analysis Record Working Group (LSARWG). The Ares I ILSWG was established to assess requirements, to conduct and evaluate analyses and trade studies associated with acquisition logistics and supportability processes, and to resolve Ares I integrated logistics and supportability issues. It established a strategic collaborative alliance for coordination of Logistics Support Analysis activities in support of the integrated Ares I vehicle design and the development of logistics support infrastructure. A joint Ares I - Orion LSAR Working Group was established to: 1) guide the development of Ares I and Orion LSAR data and serve as a model for future Constellation programs, 2) develop rules and assumptions that will apply across the Constellation program with regard to the program's LSAR development, and 3) maintain the Constellation LSAR Style Guide.

  13. Churn in the Aircraft Spares Requirements Process.

    DTIC Science & Technology

    1988-04-01

    This paper examined the effect of churn on the budget requirement in terms of dollars. The study concludes that the budget process is good and that the... develop the budget and the buy requirements is one of the most significant reasons for the discrepancy between the items budgeted for and those...

  14. Crew interface specification development study for in-flight maintenance and stowage functions

    NASA Technical Reports Server (NTRS)

    Carl, J. G.

    1971-01-01

    The need and potential solutions for an orderly systems engineering approach to the definition, management, and documentation requirements for in-flight maintenance, assembly, servicing, and stowage process activities of the flight crews of future spacecraft were investigated. These processes were analyzed and described using a new technique (mass/function flow diagramming), developed during the study, to give visibility to crew functions and supporting requirements, including data products. This technique is usable by NASA for specification baselines and can assist the designer in identifying both upper- and lower-level requirements associated with these processes. These diagrams provide increased visibility into the relationships between functions and the related equipment being utilized and managed, and can serve as a common communicating vehicle between the designer, program management, and the operational planner. The information and data product requirements to support the above processes were identified along with optimum formats and contents of these products. The resulting data product concepts are presented to support these in-flight maintenance and stowage processes.

  15. Pahoa geothermal industrial park. Engineering and economic analysis for direct applications of geothermal energy in an industrial park at Pahoa, Hawaii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreau, J.W.

    1980-12-01

    This engineering and economic study evaluated the potential for developing a geothermal industrial park in the Puna District near Pahoa on the Island of Hawaii. Direct heat industrial applications were analyzed from a marketing, engineering, economic, environmental, and sociological standpoint to determine the most viable industries for the park. An extensive literature search produced 31 existing processes currently using geothermal heat. An additional list was compiled indicating industrial processes that require heat that could be provided by geothermal energy. From this information, 17 possible processes were selected for consideration. Careful scrutiny and analysis of these 17 processes revealed three that justified detailed economic workups. The three processes chosen for detailed analysis were: an ethanol plant using bagasse and wood as feedstock; a cattle feed mill using sugar cane leaf trash as feedstock; and a papaya processing facility providing both fresh and processed fruit. In addition, a research facility to assess and develop other processes was treated as a concept. Consideration was given to the impediments to development, the engineering process requirements and the governmental support for each process. The study describes the geothermal well site chosen, the pipeline to transmit the hydrothermal fluid, and the infrastructure required for the industrial park. A conceptual development plan for the ethanol plant, the feedmill and the papaya processing facility was prepared. The study concluded that a direct heat industrial park in Pahoa, Hawaii, involves considerable risks.

  16. Closed Loop Requirements and Analysis Management

    NASA Technical Reports Server (NTRS)

    Lamoreaux, Michael; Verhoef, Brett

    2015-01-01

    Effective systems engineering involves the use of analysis in the derivation of requirements and verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during the change evaluation processes, and analyses are leveraged during the design verification process. Recommendations on concept validation case studies are also discussed.

  17. 24 CFR 1003.302 - Project specific threshold requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 4 2014-04-01 2014-04-01 false Project specific threshold requirements. 1003.302 Section 1003.302 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... Purpose Grant Application and Selection Process § 1003.302 Project specific threshold requirements. (a...

  18. Mathematical Analysis of High-Temperature Co-electrolysis of CO2 and O2 Production in a Closed-Loop Atmosphere Revitalization System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael G. McKellar; Manohar S. Sohal; Lila Mulloth

    2010-03-01

    NASA has been evaluating two closed-loop atmosphere revitalization architectures based on Sabatier and Bosch carbon dioxide, CO2, reduction technologies. The CO2 and steam, H2O, co-electrolysis process is another option that NASA has investigated. Utilizing recent advances in the fuel cell technology sector, the Idaho National Laboratory, INL, has developed a CO2 and H2O co-electrolysis process to produce oxygen and syngas (carbon monoxide, CO and hydrogen, H2 mixture) for terrestrial (energy production) application. The technology is a combined process that involves steam electrolysis, CO2 electrolysis, and the reverse water gas shift (RWGS) reaction. A number of process models have been developed and analyzed to determine the theoretical power required to recover oxygen, O2, in each case. These models include the current Sabatier and Bosch technologies and combinations of those processes with high-temperature co-electrolysis. The cases of constant CO2 supply and constant O2 production were evaluated. In addition, a process model of the hydrogenation process with co-electrolysis was developed and compared. Sabatier processes require the least amount of energy input per kg of oxygen produced. If co-electrolysis replaces solid polymer electrolyte (SPE) electrolysis within the Sabatier architecture, the power requirement is reduced by over 10%, but only if heat recuperation is used. Sabatier processes, however, require external water to achieve the lower power results. Under conditions of constant incoming carbon dioxide flow, the Sabatier architectures require more power than the other architectures. The Bosch, Boudouard with co-electrolysis, and the hydrogenation with co-electrolysis processes require little or no external water. The Bosch and hydrogenation processes produce water within their reactors, which aids in reducing the power requirement for electrolysis. The Boudouard with co-electrolysis process has a higher electrolysis power requirement because carbon dioxide is split instead of water, which has a lower heat of formation. Hydrogenation with co-electrolysis offers the best overall power performance for two reasons: it requires no external water, and it produces its own water, which reduces the power requirement for co-electrolysis.
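    The enthalpy comparison in the last sentence can be checked against standard gas-phase heats of formation (a back-of-envelope sketch; the kJ/mol values are textbook figures, and real electrolyzer power requirements are substantially higher than these thermodynamic minima):

```python
# Minimum (enthalpy-based) energy to split H2O vs CO2, using standard
# gas-phase heats of formation in kJ/mol. Illustrative sketch only; it
# ignores cell overpotentials, heat recuperation, and the Gibbs/enthalpy split.
DHF = {"H2O": -241.8, "CO2": -393.5, "CO": -110.5, "H2": 0.0, "O2": 0.0}

def reaction_enthalpy(products, reactants):
    """Sum of DHf(products) minus sum of DHf(reactants); coefficients as dicts."""
    return (sum(n * DHF[s] for s, n in products.items())
            - sum(n * DHF[s] for s, n in reactants.items()))

h2o_split = reaction_enthalpy({"H2": 1, "O2": 0.5}, {"H2O": 1})  # H2O -> H2 + 1/2 O2
co2_split = reaction_enthalpy({"CO": 1, "O2": 0.5}, {"CO2": 1})  # CO2 -> CO + 1/2 O2

print(f"H2O split: {h2o_split:.1f} kJ/mol, CO2 split: {co2_split:.1f} kJ/mol")
# CO2 splitting requires more energy per mole of O2 produced, consistent with
# the higher electrolysis power of the Boudouard/co-electrolysis route.
```

    Since each reaction yields half a mole of O2, doubling these values gives the minimum energy per mole of oxygen recovered.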

  19. Needs Assessment for the Use of NASA Remote Sensing Data in the Development and Implementation of Estuarine and Coastal Water Quality Standards

    NASA Technical Reports Server (NTRS)

    Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake

    2010-01-01

    The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.

  20. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
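    The core MINERVA step of numerically evaluating both the specification and the implementation on the same stress cases can be sketched as follows (a toy example with a hypothetical unit-square geofence; the actual project uses PVS specifications and formally verified algorithms, and these function names are illustrative):

```python
# Toy sketch of MINERVA-style numerical cross-validation: run the formally
# specified algorithm and its code implementation on the same test points
# and require agreement. Names and the simple polygon case are assumptions.

def spec_inside_unit_square(pt):
    """Stand-in for the verified specification: unit-square containment."""
    x, y = pt
    return 0.0 < x < 1.0 and 0.0 < y < 1.0

def impl_inside(pt, poly):
    """Implementation under test: general ray-casting point-in-polygon."""
    x, y = pt
    inside = False
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
stress_cases = [(0.5, 0.5), (2.0, 2.0), (-0.1, 0.5), (0.9, 0.99)]
agreement = all(impl_inside(p, square) == spec_inside_unit_square(p)
                for p in stress_cases)
print("spec/implementation agree:", agreement)
```

    Any disagreement on a stress case points either to a code bug or to an ambiguity in the specification, which is exactly the feedback loop the process is designed to provide.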

  1. Developing Organizational Adaptability for Complex Environment

    ERIC Educational Resources Information Center

    Boylan, Steven A.; Turner, Kenneth A.

    2017-01-01

    Developing organizations capable of adapting requires leaders to set conditions. Setting conditions normally requires purposeful activities by the leadership to foster and develop leader and individual adaptability, supported by processes and activities that enable adaptive behaviors through the totality of the organization (Goldstein, Hazy, &…

  2. Advanced local area network concepts

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1985-01-01

    Development of a good model of the data traffic requirements for Local Area Networks (LANs) onboard the Space Station is the driving problem in this work. A parameterized workload model is under development. An analysis contract has been started specifically to capture the distributed processing requirements for the Space Station and then to develop a top level model to simulate how various processing scenarios can handle the workload and what data communication patterns result. A summary of the Local Area Network Extendsible Simulator 2 Requirements Specification and excerpts from a grant report on the topological design of fiber optic local area networks with application to Expressnet are given.

  3. Development of Integrated Programs for Aerospace-vehicle Design (IPAD): Product manufacture interactions with the design process

    NASA Technical Reports Server (NTRS)

    Crowell, H. A.

    1979-01-01

    The product manufacturing interactions with the design process and the IPAD requirements to support the interactions are described. The data requirements supplied to manufacturing by design are identified and quantified. Trends in computer-aided manufacturing are discussed and the manufacturing process of the 1980's is anticipated.

  4. Onboard experiment data support facility, task 1 report. [space shuttles

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The conceptual design and specifications are developed for an onboard experiment data support facility (OEDSF) to provide end to end processing of data from various payloads on board space shuttles. Classical data processing requirements are defined and modeled. Onboard processing requirements are analyzed. Specifications are included for an onboard processor.

  5. Implementation of ionizing radiation environment requirements for Space Station

    NASA Technical Reports Server (NTRS)

    Boeder, Paul A.; Watts, John W.

    1993-01-01

    Proper functioning of Space Station hardware requires that the effects of high-energy ionizing particles from the natural environment and (possibly) from man-made sources be considered during design. At the Space Station orbit of 28.5-deg inclination and 330-440 km altitude, geomagnetically trapped protons and electrons contribute almost all of the dose, while galactic cosmic rays and anomalous cosmic rays may produce Single Event Upsets (SEUs), latchups, and burnouts of microelectronic devices. Implementing ionizing radiation environment requirements for Space Station has been a two-part process: developing a description of the environment for imposing requirements on the design, and developing a control process for assessing how well the design addresses the effects of the ionizing radiation environment. We will review both the design requirements and the control process for addressing ionizing radiation effects on Space Station.

  6. Sensori-Motor Experience Leads to Changes in Visual Processing in the Developing Brain

    ERIC Educational Resources Information Center

    James, Karin Harman

    2010-01-01

    Since Broca's studies on language processing, cortical functional specialization has been considered to be integral to efficient neural processing. A fundamental question in cognitive neuroscience concerns the type of learning that is required for functional specialization to develop. To address this issue with respect to the development of neural…

  7. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe program technology elements' uncertainties can only provide a qualitative and non-descript estimation of the technology uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. 
This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and subsequent impacts on capability, budget, and schedule requirements resulted in the conclusion that an analysis process that coupled a probabilistic analysis technique such as Monte Carlo Simulations with quantitative and parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints of these requirements to be established. These probabilities of requirements success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition as well as formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques to provide a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. 
In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of this method was performed on a notional program for acquiring the Carrier-based Suppression of Enemy Air Defenses (SEAD) using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of this methodology as well as its limitations that should be addressed in the future to narrow the gap between the current state and the desired state.
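The coupling of Monte Carlo sampling with requirement thresholds that the methodology relies on can be illustrated with a minimal sketch (distributions, thresholds, and variable names are assumptions for illustration, not values from the ENTERPRISE study):

```python
import random

# Sample uncertain technology development time and cost, then estimate the
# probability that schedule and budget requirements are met. That estimated
# probability serves as a quantitative requirements-robustness metric.
random.seed(1)

def sample_program_outcome():
    dev_time_months = random.triangular(24, 60, 36)  # (low, high, mode)
    dev_cost_musd = random.triangular(50, 150, 80)
    return dev_time_months, dev_cost_musd

N = 50_000
met = 0
for _ in range(N):
    t, c = sample_program_outcome()
    if t <= 48 and c <= 120:  # schedule and budget constraints
        met += 1

p_success = met / N
print(f"P(requirements met) ~ {p_success:.3f}")
```

A probability near 1 indicates requirements that are robust against the sampled technology uncertainties; a low probability flags requirements likely to be breached and therefore candidates for renegotiation or risk mitigation.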

  8. Hardware development process for Human Research facility applications

    NASA Astrophysics Data System (ADS)

    Bauer, Liz

    2000-01-01

    The simple goal of the Human Research Facility (HRF) is to conduct human research experiments on the International Space Station (ISS) astronauts during long-duration missions. This is accomplished by providing integration and operation of the necessary hardware and software capabilities. A typical hardware development flow consists of five stages: functional inputs and requirements definition, market research, design life cycle through hardware delivery, crew training, and mission support. The purpose of this presentation is to guide the audience through the early hardware development process: requirement definition through selecting a development path. Specific HRF equipment is used to illustrate the hardware development paths.

  9. The effect of requirements prioritization on avionics system conceptual design

    NASA Astrophysics Data System (ADS)

    Lorentz, John

    This dissertation will provide a detailed approach and analysis of a new collaborative requirements prioritization methodology that has been used successfully on four Coast Guard avionics acquisition and development programs valued at $400M+. A statistical representation of participant study results will be discussed and analyzed in detail. Many technically compliant projects fail to deliver levels of performance and capability that the customer desires. Some of these systems completely meet "threshold" levels of performance; however, the distribution of resources in the process devoted to the development and management of the requirements does not always represent the voice of the customer. This is especially true for technically complex projects such as modern avionics systems. A simplified facilitated process for prioritization of system requirements will be described. The collaborative prioritization process, and resulting artifacts, aids the systems engineer during early conceptual design. All requirements are not the same in terms of customer priority. While there is a tendency to have many thresholds inside of a system design, there is usually a subset of requirements and system performance that is of the utmost importance to the design. These critical capabilities and critical levels of performance typically represent the reason the system is being built. The systems engineer needs processes to identify these critical capabilities, the associated desired levels of performance, and the risks associated with the specific requirements that define the critical capability. The facilitated prioritization exercise is designed to collaboratively draw out these critical capabilities and levels of performance so they can be emphasized in system design. Developing the purpose, scheduling and process for prioritization events are key elements of systems engineering and modern project management. 
The benefits of early collaborative prioritization flow throughout the project schedule, resulting in greater success during system deployment and operational testing. This dissertation will discuss the data and findings from participant studies, present a literature review of systems engineering and design processes, and test the hypothesis that the prioritization process had no effect on stakeholder sentiment related to the conceptual design. In addition, the "Requirements Rationalization" process will be discussed in detail. Avionics, like many other systems, has transitioned from a discrete electronics engineering, hard engineering discipline to incorporate software engineering as a core process of the technology development cycle. As with other software-based systems, avionics now has significant soft system attributes that must be considered in the design process. The boundless opportunities that exist in software design demand prioritization to focus effort onto the critical functions that the software must provide. This has been a well documented and understood phenomenon in the software development community for many years. This dissertation will attempt to link the effect of software integrated avionics to the benefits of prioritization of requirements in the problem space and demonstrate the sociological and technical benefits of early prioritization practices.
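    The aggregation step of such a facilitated prioritization exercise can be sketched in a few lines (requirement names, scores, and the simple averaging scheme are illustrative assumptions, not the dissertation's actual instrument):

```python
# Illustrative sketch of collaborative requirements prioritization: each
# stakeholder scores each requirement on a 1-10 scale, and aggregate scores
# surface the critical capabilities for emphasis in conceptual design.
stakeholder_scores = {
    "REQ-01 detection range (threshold)": [9, 10, 8],  # one score per stakeholder
    "REQ-02 display latency":             [6, 7, 5],
    "REQ-03 operator seat finish":        [2, 1, 3],
}

def priority(scores):
    """Aggregate stakeholder scores; a mean here, though weighted schemes work too."""
    return sum(scores) / len(scores)

ranked = sorted(stakeholder_scores,
                key=lambda r: priority(stakeholder_scores[r]),
                reverse=True)
for req in ranked:
    print(f"{priority(stakeholder_scores[req]):4.1f}  {req}")
```

    The point of the exercise is the ordering itself: requirements at the top of the list are the critical capabilities that justify the system, while those at the bottom are thresholds that should not dominate design resources.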

  10. IEC 61511 and the capital project process--a protective management system approach.

    PubMed

    Summers, Angela E

    2006-03-17

    This year, the process industry has reached an important milestone in process safety-the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program to efficiently manage process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.

  11. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with design. It realizes the modern civil jet development concept that "requirement is the origin, design is the basis". So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and method for the requirements are introduced in detail in the study, with the hope of providing experience for other civil jet product designs.

  12. Toward the Decision Tree for Inferring Requirements Maturation Types

    NASA Astrophysics Data System (ADS)

    Nakatani, Takako; Kondo, Narihito; Shirogane, Junko; Kaiya, Haruhiko; Hori, Shozo; Katamine, Keiichi

    Requirements are elicited step by step during the requirements engineering (RE) process. However, some types of requirements are elicited completely only after the scheduled requirements elicitation process is finished. Such a situation is regarded as problematic. In our study, the difficulty of eliciting various kinds of requirements is observed component by component. We refer to the components as observation targets (OTs) and introduce the term "requirements maturation," which denotes when and how requirements are elicited completely in the project. Requirements maturation is discussed for physical and logical OTs. OTs viewed from a logical viewpoint are called logical OTs, e.g., quality requirements. The requirements of physical OTs, e.g., modules, components, subsystems, etc., include functional and non-functional requirements. They are influenced by their requesters' environmental changes, as well as by developers' technical changes. In order to infer the requirements maturation period of each OT, we need to know how much these factors influence the OTs' requirements maturation. Based on observation of actual past projects, we defined the PRINCE (Pre Requirements Intelligence Net Consideration and Evaluation) model. It aims to guide developers in observing the requirements maturation of OTs. We quantitatively analyzed actual cases of the requirements elicitation process and extracted essential factors that influence requirements maturation. The results of interviews with project managers were analyzed by WEKA, a data mining system, from which the decision tree was derived. This paper introduces the PRINCE model and the category of logical OTs to be observed. The decision tree that helps developers infer the maturation type of an OT is also described. We evaluate the tree through real projects and discuss its ability to infer requirements maturation types.
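    The kind of decision tree such an analysis might yield can be sketched as nested rules (the factor names, values, and maturation types here are hypothetical illustrations, not the tree actually derived by WEKA in the study):

```python
# Hypothetical sketch of a decision tree (as WEKA's J48 might produce one)
# for inferring an OT's requirements maturation type. All factor names,
# values, and type labels are illustrative assumptions.
def infer_maturation_type(ot):
    if ot["requester_env_volatility"] == "high":
        return "late"     # requirements keep maturing after elicitation ends
    if ot["developer_tech_novelty"] == "high":
        return "middle" if ot["kind"] == "logical" else "late"
    return "early"        # stable environment, mature technology

sample_ot = {"requester_env_volatility": "low",
             "developer_tech_novelty": "high",
             "kind": "logical"}
print(infer_maturation_type(sample_ot))
```

    Encoded this way, the tree lets a developer estimate during project planning which OTs will mature late, and schedule extra elicitation effort for them.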

  13. Evaluation of Mars CO2 Capture and Gas Separation Technologies

    NASA Technical Reports Server (NTRS)

    Muscatello, Anthony C.; Santiago-Maldonado, Edgardo; Gibson, Tracy; Devor, Robert; Captain, James

    2011-01-01

    Recent national policy statements have established that the ultimate destination of NASA's human exploration program is Mars. In Situ Resource Utilization (ISRU) is a key technology required to enable such missions and it is appropriate to review progress in this area and continue to advance the systems required to produce rocket propellant, oxygen, and other consumables on Mars using the carbon dioxide atmosphere and other potential resources. The Mars Atmospheric Capture and Gas Separation project is selecting, developing, and demonstrating techniques to capture and purify Martian atmospheric gases for their utilization for the production of hydrocarbons, oxygen, and water in ISRU systems. Trace gases will be required to be separated from Martian atmospheric gases to provide pure CO2 to processing elements. In addition, other Martian gases, such as nitrogen and argon, occur in concentrations high enough to be useful as buffer gas and should be captured as well. To achieve these goals, highly efficient gas separation processes will be required. These gas separation techniques are also required across various areas within the ISRU project to support various consumable production processes. The development of innovative gas separation techniques will evaluate the current state-of-the-art for the gas separation required, with the objective to demonstrate and develop light-weight, low-power methods for gas separation. Gas separation requirements include, but are not limited to, the selective separation of: (1) methane and water from unreacted carbon oxides (CO2-CO) and hydrogen typical of a Sabatier-type process, (2) carbon oxides and water from unreacted hydrogen from a Reverse Water-Gas Shift process, (3) carbon oxides from oxygen from a trash/waste processing reaction, and (4) helium from hydrogen or oxygen from a propellant scavenging process. Potential technologies for the separations include freezers, selective membranes, selective solvents, polymeric sorbents, zeolites, and new technologies. This paper summarizes the results of an extensive literature review of candidate technologies for the capture and separation of CO2 and other relevant gases. This information will be used to prioritize the technologies to be developed further during this and other ISRU projects.

  14. WFIRST: Update on the Coronagraph Science Requirements

    NASA Astrophysics Data System (ADS)

    Douglas, Ewan S.; Cahoy, Kerri; Carlton, Ashley; Macintosh, Bruce; Turnbull, Margaret; Kasdin, Jeremy; WFIRST Coronagraph Science Investigation Teams

    2018-01-01

    The WFIRST Coronagraph instrument (CGI) will enable direct imaging and low resolution spectroscopy of exoplanets in reflected light and imaging polarimetry of circumstellar disks. The CGI science investigation teams were tasked with developing a set of science requirements which advance our knowledge of exoplanet occurrence and atmospheric composition, as well as the composition and morphology of exozodiacal debris disks, cold Kuiper Belt analogs, and protoplanetary systems. We present the initial content, rationales, validation, and verification plans for the WFIRST CGI, informed by detailed and still-evolving instrument and observatory performance models. We also discuss our approach to the requirements development and management process, including the collection and organization of science inputs, open source approach to managing the requirements database, and the range of models used for requirements validation. These tools can be applied to requirements development processes for other astrophysical space missions, and may ease their management and maintenance. These WFIRST CGI science requirements allow the community to learn about and provide insights and feedback on the expected instrument performance and science return.

  15. Mars Atmospheric Capture and Gas Separation

    NASA Technical Reports Server (NTRS)

    Muscatello, Anthony; Santiago-Maldonado, Edgardo; Gibson, Tracy; Devor, Robert; Captain, James

    2011-01-01

    The Mars atmospheric capture and gas separation project is selecting, developing, and demonstrating techniques to capture and purify Martian atmospheric gases for their utilization for the production of hydrocarbons, oxygen, and water in ISRU systems. Trace gases will be required to be separated from Martian atmospheric gases to provide pure CO2 to processing elements. In addition, other Martian gases, such as nitrogen and argon, occur in concentrations high enough to be useful as buffer gas and should be captured as well. To achieve these goals, highly efficient gas separation processes will be required. These gas separation techniques are also required across various areas within the ISRU project to support various consumable production processes. The development of innovative gas separation techniques will evaluate the current state-of-the-art for the gas separation required, with the objective to demonstrate and develop light-weight, low-power methods for gas separation. Gas separation requirements include, but are not limited to, the selective separation of: (1) methane and water from unreacted carbon oxides (CO2-CO) and hydrogen typical of a Sabatier-type process, (2) carbon oxides and water from unreacted hydrogen from a Reverse Water-Gas Shift process, (3) carbon oxides from oxygen from a trash/waste processing reaction, and (4) helium from hydrogen or oxygen from a propellant scavenging process. Potential technologies for the separations include freezers, selective membranes, selective solvents, polymeric sorbents, zeolites, and new technologies. This paper and presentation will summarize the results of an extensive literature review and laboratory evaluations of candidate technologies for the capture and separation of CO2 and other relevant gases.

  16. Review of the workshop on low-cost polysilicon for terrestrial photovoltaic solar cell applications

    NASA Technical Reports Server (NTRS)

    Lutwack, R.

    1986-01-01

    Topics reviewed include: polysilicon material requirements; effects of impurities; requirements for high-efficiency solar cells; economics; development of silane processes; fluidized-bed processor development; silicon purification; and marketing.

  17. Development of functionally-oriented technological processes of electroerosive processing

    NASA Astrophysics Data System (ADS)

    Syanov, S. Yu

    2018-03-01

    The paper describes the stages of developing functionally oriented technological processes for electroerosive processing, from identifying the surfaces of parts and their service functions to determining the parameters of the electric erosion process that provide not only the required quality parameters of the surface layer but also the required operational properties.

  18. Mission Engineering of a Rapid Cycle Spacecraft Logistics Fleet

    NASA Technical Reports Server (NTRS)

    Holladay, Jon; McClendon, Randy (Technical Monitor)

    2002-01-01

    The requirement for logistics re-supply of the International Space Station has provided a unique opportunity for engineering the implementation of NASA's first dedicated pressurized logistics carrier fleet. The NASA fleet is comprised of three Multi-Purpose Logistics Modules (MPLM) provided to NASA by the Italian Space Agency in return for operations time aboard the International Space Station. Marshall Space Flight Center was responsible for oversight of the hardware development from preliminary design through acceptance of the third flight unit, and currently manages the flight hardware sustaining engineering and mission engineering activities. The actual MPLM mission began prior to NASA acceptance of the first flight unit in 1999 and will continue until the decommissioning of the International Space Station, planned for 20xx. Mission engineering of the MPLM program requires a broad focus on three distinct yet inter-related operations processes: pre-flight, flight operations, and post-flight turn-around. Within each primary area exist several complex subsets of distinct and inter-related activities. Pre-flight processing includes the evaluation of carrier hardware readiness for space flight. This includes integration of the payload into the carrier, integration of the carrier into the launch vehicle, and integration of the carrier onto the orbital platform. Flight operations include the actual carrier operations during flight and any required real-time ground support. Post-flight processing includes de-integration of the carrier hardware from the launch vehicle, de-integration of the payload, and preparation for returning the carrier to pre-flight staging. Typical space operations are engineered around the requirements and objectives of a dedicated mission on a dedicated operational platform (i.e., launch or orbiting vehicle).
The MPLM, however, has expanded this envelope by requiring operations with both vehicles during flight as well as pre-launch and post-landing operations. These unique requirements, combined with a success-oriented schedule of four flights within a ten-month period, have provided numerous opportunities for understanding and improving operations processes. Furthermore, they have increased the knowledge base for future payload carrier and launch vehicle hardware and requirements development. Discussion of the process flows and target areas for process improvement is provided in the subject paper. Special emphasis is also placed on supplying guidelines for hardware development. The combination of process knowledge and hardware development knowledge will provide a comprehensive overview for future vehicle developments as related to integration and transportation of payloads.

  19. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  20. The Development and Implementation of Ground Safety Requirements for Project Orion Abort Flight Testing - A Case Study

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, Paul D.; Williams, Jeffrey G.; Condzella, Bill R.

    2008-01-01

    A rigorous set of detailed ground safety requirements is required to ensure that ground support equipment (GSE) and associated planned ground operations are conducted safely. Detailed ground safety requirements supplement the GSE requirements already called out in NASA-STD-5005. This paper describes the initial genesis of these ground safety requirements, the establishment and approval process, and finally the implementation process for Project Orion. The future of the requirements is also described, and problems and issues encountered and overcome are discussed.

  1. Evolutionary Capability Delivery of Coast Guard Manpower System

    DTIC Science & Technology

    2014-06-01

    Office IID iterative incremental development model IT information technology MA major accomplishment MRA manpower requirements analysis MRD manpower...CG will need to ensure that development is low risk. The CG uses Manpower Requirements Analysis (MRAs) to collect the necessary manpower data to...of users. The CG uses two business processes to manage human capital: Manpower Requirements Analysis (MRA) and Manpower Requirements

  2. AMCC casting development, volume 2

    NASA Technical Reports Server (NTRS)

    1995-01-01

    PCC successfully cast and performed nondestructive testing (FPI and X-ray) on seventeen AMCC castings. Destructive testing (lab analysis and chemical milling) was performed on eleven of the castings, and the remaining six castings were shipped to NASA or Aerojet. Two of the six castings shipped, lots 015 and 016, were fully processed per blueprint requirements. PCC has fully developed the gating and processing parameters of this part and believes the part could be implemented into production after four more castings have been completed to ensure the repeatability of the process. The AMCC casting has been a technically challenging part due to its size, configuration, and alloy type. The height and weight of the wax pattern assembly necessitated the development of a hollow gating system to ensure structural integrity of the shell throughout the investment process. The complexity in the jacket area of the casting required the development of an innovative casting technology that PCC has termed 'TGC,' or thermal gradient control. This method of setting up thermal gradients in the casting during solidification represents a significant process improvement for PCC and has been successfully implemented on other programs. The alloy, JBK75, is relatively new in the investment casting arena and required the engineering staff to learn the gating, processing, and dimensional characteristics of the material.

  3. A Systems Engineering Process Supporting the Development of Operational Requirements Driven Federations

    DTIC Science & Technology

    2008-12-01

    A SYSTEMS ENGINEERING PROCESS SUPPORTING THE DEVELOPMENT OF OPERATIONAL REQUIREMENTS DRIVEN FEDERATIONS Andreas Tolk & Thomas G. Litwin ...Tolk, Litwin and Kewley Executive Office (PEO...capabilities and their relative changes ...Tolk, Litwin and Kewley based on the system to be evaluated as well, in particular when it comes to

  4. Space processing applications payload equipment study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Hammel, R. L.

    1974-01-01

    A study was conducted to derive and collect payload information on the anticipated space processing payload requirements for the Spacelab and space shuttle orbiter planning activities. The six objectives generated by the study are defined. Concepts and requirements for space processing payloads to accommodate the performance of the shuttle-supported research phase are analyzed. Diagrams and tables of data are developed to show the experiments involved, the power requirements, and the payloads for shared missions.

  5. Modular space station, phase B extension. Information management advanced development. Volume 4: Data processing assembly

    NASA Technical Reports Server (NTRS)

    Gerber, C. R.

    1972-01-01

    The computation and logical functions which are performed by the data processing assembly of the modular space station are defined. The subjects discussed are: (1) requirements analysis, (2) baseline data processing assembly configuration, (3) information flow study, (4) throughput simulation, (5) redundancy study, (6) memory studies, and (7) design requirements specification.

  6. Formal Specification of Information Systems Requirements.

    ERIC Educational Resources Information Center

    Kampfner, Roberto R.

    1985-01-01

    Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)

  7. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    NASA Astrophysics Data System (ADS)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper carries out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier applies a weighted scoring model, while the second tier develops SWOT matrices on the basis of the weighted scoring results to select an appropriate requirements negotiation model. Finally, the results are illustrated with statistical pie charts. On the basis of these results for prevalent negotiation models and approaches, a unified approach for requirements negotiation and stakeholder collaboration is proposed, in which the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters such as MBTI and opportunity analysis.
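    The first-tier weighted scoring described in this record can be sketched as follows. The criteria, weights, candidate model names, and scores below are hypothetical placeholders for illustration, not the values used in the paper.

```python
# Illustrative weighted scoring model for ranking requirements-negotiation
# approaches. All criteria, weights, and scores are hypothetical examples.

CRITERIA_WEIGHTS = {
    "stakeholder_coverage": 0.40,
    "tool_support": 0.25,
    "scalability": 0.20,
    "ease_of_adoption": 0.15,
}

# Candidate models scored 1-5 against each criterion (hypothetical values).
candidates = {
    "WinWin":        {"stakeholder_coverage": 5, "tool_support": 4, "scalability": 3, "ease_of_adoption": 3},
    "EasyWinWin":    {"stakeholder_coverage": 4, "tool_support": 5, "scalability": 4, "ease_of_adoption": 4},
    "MultiCriteria": {"stakeholder_coverage": 3, "tool_support": 3, "scalability": 5, "ease_of_adoption": 4},
}

def weighted_score(scores, weights):
    """Sum of criterion scores weighted by their relative importance."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank candidates by descending weighted score; the top entry would feed
# the paper's second-tier SWOT analysis.
ranked = sorted(candidates,
                key=lambda m: weighted_score(candidates[m], CRITERIA_WEIGHTS),
                reverse=True)
for model in ranked:
    print(model, round(weighted_score(candidates[model], CRITERIA_WEIGHTS), 2))
```

    The ranked output would then be the input to the second-tier SWOT matrices described in the abstract.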

  8. NOSS Altimeter Detailed Algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Mcmillan, J. D.

    1982-01-01

    The details of the algorithms and data sets required for satellite radar altimeter data processing are documented in a form suitable for (1) development of the benchmark software and (2) coding of the operational software. The algorithms reported in detail are those established for altimeter processing; algorithms that required additional development before being documented for production were only scoped. The algorithms are divided into two levels of processing: the first level converts the data to engineering units and applies corrections for instrument variations, and the second level provides geophysical measurements derived from altimeter parameters for oceanographic users.
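    The two-level structure described in this record can be sketched as a simple pipeline: level 1 converts raw counts to engineering units and applies instrument corrections, and level 2 derives a geophysical quantity from the corrected values. The scale factor, bias, altitude, and count values below are invented placeholders, not the NOSS algorithm constants.

```python
# Illustrative two-level altimeter processing pipeline (hypothetical constants).

def level1(raw_counts, scale=10.0, offset=0.0, instrument_bias=0.05):
    """Level 1: convert raw telemetry counts to engineering units (metres)
    and apply a correction for a known instrument bias."""
    return [c * scale + offset - instrument_bias for c in raw_counts]

def level2(ranges_m, satellite_altitude_m):
    """Level 2: derive a geophysical measurement (height of the surface
    below the satellite) from the corrected altimeter range."""
    return [satellite_altitude_m - r for r in ranges_m]

raw = [80000.00, 80000.02, 79999.97]   # hypothetical altimeter counts
ranges = level1(raw)                    # engineering units, corrected
heights = level2(ranges, satellite_altitude_m=800001.00)
```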

  9. Development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements

    NASA Technical Reports Server (NTRS)

    Rey, Charles A.

    1991-01-01

    The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.

  10. Development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements

    NASA Astrophysics Data System (ADS)

    Rey, Charles A.

    1991-03-01

    The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.

  11. Advancement of CMOS Doping Technology in an External Development Framework

    NASA Astrophysics Data System (ADS)

    Jain, Amitabh; Chambers, James J.; Shaw, Judy B.

    2011-01-01

    The consumer appetite for a rich multimedia experience drives technology development for mobile hand-held devices and the infrastructure to support them. Enhancements in functionality, speed, and user experience are derived from advancements in CMOS technology. The technical challenges in developing each successive CMOS technology node to support these enhancements have become increasingly difficult. These trends have motivated the CMOS business towards a collaborative approach based on strategic partnerships. This paper describes our model and experience of CMOS development, based on multi-dimensional industrial and academic partnerships. We provide to our process equipment, materials, and simulation partners, as well as to our silicon foundry partners, the detailed requirements for future integrated circuit products. This is done very early in the development cycle to ensure that these requirements can be met. In order to determine these fundamental requirements, we rely on a strategy that requires strong interaction between process and device simulation, physical and chemical analytical methods, and research at academic institutions. This learning is shared with each project partner to address integration and manufacturing issues encountered during CMOS technology development from its inception through product ramp. We utilize TI's core strengths in physical analysis, unit processes and integration, yield ramp, reliability, and product engineering to support this technological development. Finally, this paper presents examples of the advancement of CMOS doping technology for the 28 nm node and beyond through this development model.

  12. Postures and Motions Library Development for Verification of Ground Crew Human Systems Integration Requirements

    NASA Technical Reports Server (NTRS)

    Jackson, Mariea Dunn; Dischinger, Charles; Stambolian, Damon; Henderson, Gena

    2012-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineers in the future to infuse realistic human activities into CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the SLS and Orion launch systems. This paper explains how motion capture of unique ground systems activities is being used to verify the human factors analysis requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  13. Postures and Motions Library Development for Verification of Ground Crew Human Factors Requirements

    NASA Technical Reports Server (NTRS)

    Stambolian, Damon; Henderson, Gena; Jackson, Mariea Dunn; Dischinger, Charles

    2013-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineering analysts to infuse real to life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the Space Launch System (SLS) and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors engineering requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  14. 24 CFR 972.133 - Public and resident consultation process for developing a conversion plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Public and resident consultation process for developing a conversion plan. 972.133 Section 972.133 Housing and Urban Development... ASSISTANCE Required Conversion of Public Housing Developments Conversion Plans § 972.133 Public and resident...

  15. Analyzing Discrepancies in a Software Development Project Change Request (CR) Assessment Process and Recommendations for Process Improvements

    NASA Technical Reports Server (NTRS)

    Cunningham, Kenneth James

    2003-01-01

    The Change Request (CR) assessment process is essential to the display development cycle. The assessment process is performed to ensure that the changes stated in the description of the CR match the changes in the actual display requirements. If a discrepancy is found between the CR and the requirements, the CR must be returned to the originator for corrections. Data was gathered from each of the developers to determine the types of discrepancies and the amount of time spent assessing each CR. This study sought to determine the most common types of discrepancies and the amount of time required to assess those issues. The study found that even though removing a discrepancy before assessment would save half the time needed to assess a CR with a discrepancy, the number of CRs found to have a discrepancy was very small compared to the total number of CRs assessed during the data gathering period.
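    The trade-off in the study's conclusion can be made concrete with a simple expected-value sketch. The CR counts, discrepancy rate, and assessment times below are hypothetical illustrations chosen to mirror the "half the time" relationship, not the study's measured data.

```python
# Expected-value sketch of CR assessment effort (hypothetical figures).

def expected_assessment_time(n_crs, discrepancy_rate, t_clean, t_with_discrepancy):
    """Expected total assessment time (hours) for a batch of CRs, where a CR
    containing a discrepancy takes longer to assess than a clean one."""
    n_bad = n_crs * discrepancy_rate
    return (n_crs - n_bad) * t_clean + n_bad * t_with_discrepancy

# Hypothetical figures: 200 CRs, 5% contain a discrepancy, a clean CR takes
# 1 hour, a CR with a discrepancy takes 2 hours (catching the discrepancy
# before assessment would save half that time).
baseline = expected_assessment_time(200, 0.05, 1.0, 2.0)
improved = expected_assessment_time(200, 0.00, 1.0, 2.0)
print(baseline - improved)  # hours saved across the whole batch
```

    Because the discrepancy rate is small, the batch-level saving is modest relative to the total assessment effort, which is the study's point.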

  16. Development of an estimation model for the evaluation of the energy requirement of dilute acid pretreatments of biomass.

    PubMed

    Mafe, Oluwakemi A T; Davies, Scott M; Hancock, John; Du, Chenyu

    2015-01-01

    This study aims to develop a mathematical model to evaluate the energy required by pretreatment processes used in the production of second-generation ethanol. A dilute acid pretreatment process reported by the National Renewable Energy Laboratory (NREL) was selected as an example for the model's development. The energy demand of the pretreatment process was evaluated by considering the change of internal energy of the substances, the reaction energy, the heat lost, and the work done to/by the system, based on a number of simplifying assumptions. Sensitivity analyses were performed on the solids loading rate, temperature, acid concentration, and water evaporation rate. The results established that the solids loading rate had the most significant impact on the energy demand. The model was then verified with data from the NREL benchmark process. Application of this model to other dilute acid pretreatment processes reported in the literature illustrated that although similar sugar yields were reported by several studies, the energy required by the different pretreatments varied significantly.
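    The first-law accounting described in this record (internal energy change plus reaction energy, heat loss, and work) can be sketched as follows. The specific heats, temperatures, and loadings are hypothetical placeholders; this is an illustrative sketch, not the published NREL model.

```python
# Illustrative first-law energy balance for a dilute-acid pretreatment batch:
# Q = dU (sensible heating of biomass + water) + E_reaction + Q_loss + W.
# All property values and temperatures are hypothetical placeholders.

def pretreatment_energy_demand(mass_solids_kg, solids_loading,
                               cp_water=4.18, cp_biomass=1.5,
                               t_in_c=25.0, t_react_c=160.0,
                               reaction_energy_kj=0.0,
                               heat_loss_kj=0.0, work_kj=0.0):
    """Energy demand (kJ) of one pretreatment batch under simplifying
    assumptions (no evaporation, constant specific heats)."""
    # Water mass implied by the solids loading fraction.
    mass_water_kg = mass_solids_kg * (1.0 - solids_loading) / solids_loading
    dT = t_react_c - t_in_c
    dU = mass_solids_kg * cp_biomass * dT + mass_water_kg * cp_water * dT
    return dU + reaction_energy_kj + heat_loss_kj + work_kj

# Higher solids loading means less water to heat, hence a lower energy
# demand, consistent with the sensitivity result reported above.
low_loading = pretreatment_energy_demand(100.0, solids_loading=0.10)
high_loading = pretreatment_energy_demand(100.0, solids_loading=0.30)
print(low_loading > high_loading)
```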

  17. Biosimilarity Versus Manufacturing Change: Two Distinct Concepts.

    PubMed

    Declerck, Paul; Farouk-Rezk, Mourad; Rudd, Pauline M

    2016-02-01

    As products of living cells, biologics are far more complicated than small-molecular-weight drugs, not only with respect to size and structural complexity but also in their sensitivity to manufacturing processes and post-translational changes. Most of the information on the manufacturing process of biotherapeutics is proprietary and hence not fully accessible to the public. This information gap represents a key challenge for biosimilar developers and helps explain the differences between the regulatory pathways required to demonstrate biosimilarity and those required to ensure that a change in manufacturing process has no implications for safety and efficacy. Manufacturing process changes are frequently needed for a variety of reasons, including response to regulatory requirements, scaling up production, change in facility, change in raw materials, improving control of quality (consistency), or optimising production efficiency. The scope of the change is usually a key indicator of the scale of analysis required to evaluate the quality. In most cases, where the scope of the process change is limited, quality and analytical studies alone should be sufficient, while comparative clinical studies can be required in the case of major changes (e.g., cell line changes). Biosimilarity exercises have been addressed differently by regulators on the understanding that biosimilar developers start with fundamental differences, namely a new cell line and a knowledge gap regarding the innovator's processes, including culture media, purification processes, and potentially different formulations, and are thus required to ensure that differences from the innovator do not result in differences in efficacy and safety.

  18. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  19. Process Acceptance and Adoption by IT Software Project Practitioners

    ERIC Educational Resources Information Center

    Guardado, Deana R.

    2012-01-01

    This study addresses the question of what factors determine acceptance and adoption of processes in the context of Information Technology (IT) software development projects. This specific context was selected because processes required for managing software development projects are less prescriptive than in other, more straightforward, IT…

  20. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  1. Information technology strategic planning: art or science?

    PubMed

    Hutsell, Richard; Mancini-Newell, Lulcy

    2005-01-01

    It had been almost a decade since the hospitals that make up the Daughters of Charity Health System (DCHS) had engaged in a formal information technology strategic planning process. In the summer of 2002, as the health system re-formed, there was a unique opportunity to introduce a planning process that reflected the governance style of the new health system. DCHS embarked on this journey, with the CIO initiating and formally sponsoring the information technology strategic planning process in a dynamic and collaborative manner. The system sought to develop a plan tailored to encompass both enterprise-wide and local requirements; to develop a governance model to engage the members of the local health ministries in plan development, both now and in the future; and to conduct the process in a manner that reflected the values of the Daughters of Charity. The DCHS CIO outlined a premise that the CIO would guide and be continuously involved in the development of this tailored process, in conjunction with an external resource. Together, there would be joint responsibility for introducing a flexible information technology strategic planning methodology; providing education on the current state of healthcare IT, including future trends and success factors; facilitating support to tap into existing internal talent; cultivating a collaborative process to support both current requirements and future vision; and developing a well-functioning governance structure that would enable the plan to evolve and reflect user community requirements. This article highlights the planning process, including the lessons learned, the benchmarking during and after planning, and finally, but most importantly, the unexpected benefit that resulted from this planning process.

  2. The Role of Independent V&V in Upstream Software Development Processes

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve

    1996-01-01

    This paper describes the role of Verification and Validation (V&V) during the requirements and high level design processes, and in particular the role of Independent V&V (IV&V). The job of IV&V during these phases is to ensure that the requirements are complete, consistent and valid, and to ensure that the high level design meets the requirements. This contrasts with the role of Quality Assurance (QA), which ensures that appropriate standards and process models are defined and applied. This paper describes the current state of practice for IV&V, concentrating on the process model used in NASA projects. We describe a case study, showing the processes by which problem reporting and tracking takes place, and how IV&V feeds into decision making by the development team. We then describe the problems faced in implementing IV&V. We conclude that despite a well defined process model, and tools to support it, IV&V is still beset by communication and coordination problems.

  3. Medical device development.

    PubMed

    Panescu, Dorin

    2009-01-01

    The development of a successful medical product requires not only engineering design efforts, but also clinical, regulatory, marketing and business expertise. This paper reviews items related to the process of designing medical devices. It discusses the steps required to take a medical product idea from concept, through development, verification and validation, regulatory approvals and market release.

  4. Overview and development of EDA tools for integration of DSA into patterning solutions

    NASA Astrophysics Data System (ADS)

    Torres, J. Andres; Fenger, Germain; Khaira, Daman; Ma, Yuansheng; Granik, Yuri; Kapral, Chris; Mitra, Joydeep; Krasnova, Polina; Ait-Ferhat, Dehia

    2017-03-01

    Directed Self-Assembly is the method by which a self-assembly polymer is forced to follow a desired geometry defined or influenced by a guiding pattern. Such a guiding pattern uses surface potentials, confinement, or both to achieve polymer configurations that result in circuit-relevant topologies, which can be patterned onto a substrate. Chemo- and grapho-epitaxy of line and space structures are now routinely inspected at full-wafer level to understand the defectivity limits of the materials and their maximum resolution. In the same manner, there is a deeper understanding of the formation of cylinders using grapho-epitaxy processes. Academia has also contributed by developing methods that help reduce the number of masks in advanced nodes by "combining" DSA-compatible groups, thus reducing the total cost of the process. From the point of view of EDA, new tools are required when a technology is adopted, and most technologies are adopted when they show a clear cost-benefit over alternative techniques. In addition, years of EDA development have led to the creation of very flexible toolkits that permit rapid prototyping and evaluation of new process alternatives. With the development of high-chi materials, a move away from the well-characterized PS-PMMA systems, and novel integrations in the substrates that work in tandem with diblock copolymer systems, it is necessary to assess any new requirements that may or may not need custom tools to support such processes. Hybrid DSA processes (which contain both chemo and grapho elements) are currently being investigated as possible contenders for sub-5nm process techniques. Because such processes permit the re-distribution of discontinuities in the regular arrays between the substrate and a cut operation, they have the potential to extend the number of applications for DSA.
This paper illustrates why some DSA processes can be supported by existing rules and technology, while other processes require the development of highly customized correction tools and models. It also illustrates how DSA development cannot be done in isolation; it requires the full collaboration of EDA, materials suppliers, manufacturing equipment vendors, metrology, and electronics manufacturers.

  5. Processes involved in the development of latent fingerprints using the cyanoacrylate fuming method.

    PubMed

    Lewis, L A; Smithwick, R W; Devault, G L; Bolinger, B; Lewis, S A

    2001-03-01

    Chemical processes involved in the development of latent fingerprints using the cyanoacrylate fuming method have been studied. Two major types of latent prints have been investigated: clean and oily prints. Scanning electron microscopy (SEM) has been used as a tool for determining the morphology of the polymer developed separately on clean and oily prints after cyanoacrylate fuming. A correlation between the chemical composition of an aged latent fingerprint, prior to development, and the quality of a developed fingerprint has been observed in the morphology. The moisture in the print prior to fuming has been found to be more important than the moisture in the air during fuming for the development of a useful latent print. In addition, the amount of time required to develop a high-quality latent print has been found to be within 2 min. The cyanoacrylate polymerization process is extremely rapid. When heat is used to accelerate the fuming process, typically a period of 2 min is required to develop the print. The optimum development time depends upon the concentration of cyanoacrylate vapors within the enclosure.

  6. Seven Processes that Enable NASA Software Engineering Technologies

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired, and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software is appraised against the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management, and (1) Planning & Monitoring. Each process is described, along with the group(s) responsible for it.

  7. Development of parametric material, energy, and emission inventories for wafer fabrication in the semiconductor industry.

    PubMed

    Murphy, Cynthia F; Kenig, George A; Allen, David T; Laurent, Jean-Philippe; Dyer, David E

    2003-12-01

    Currently available data suggest that most of the energy and material consumption related to the production of an integrated circuit is due to the wafer fabrication process. The complexity of wafer manufacturing, requiring hundreds of steps that vary from product to product and from facility to facility and which change every few years, has discouraged the development of material, energy, and emission inventory modules for the purpose of insertion into life cycle assessments. To address this difficulty, a flexible, process-based system for estimating material requirements, energy requirements, and emissions in wafer fabrication has been developed. The method accounts for mass and energy use at the unit operation level. Parametric unit operation modules have been developed that can be used to predict changes in inventory as the result of changes in product design, equipment selection, or process flow. A case study of the application of the modules is given for energy consumption, but a similar methodology can be used for materials, individually or aggregated.
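    The record above describes parametric unit-operation modules that aggregate energy use over a process flow. A minimal sketch of that idea is shown below; the class, the tool names, and all numeric values are illustrative assumptions, not data from the study.

```python
from dataclasses import dataclass

@dataclass
class UnitOperation:
    """One parametric wafer-fab step; all parameter values are placeholders."""
    name: str
    tool_power_kw: float   # average tool power draw while processing
    process_time_h: float  # time one wafer (or one batch) occupies the tool
    wafers_per_batch: int = 1

    def energy_per_wafer_kwh(self) -> float:
        # Batch tools (e.g., furnaces) amortize energy over many wafers.
        return self.tool_power_kw * self.process_time_h / self.wafers_per_batch

def flow_energy_kwh(flow):
    """Aggregate per-wafer energy over an ordered process flow."""
    return sum(op.energy_per_wafer_kwh() for op in flow)

# A toy three-step flow; swapping a module models a change in process flow.
flow = [
    UnitOperation("oxidation_furnace", tool_power_kw=50.0,
                  process_time_h=4.0, wafers_per_batch=100),
    UnitOperation("lithography", tool_power_kw=30.0, process_time_h=0.05),
    UnitOperation("dry_etch", tool_power_kw=20.0, process_time_h=0.1),
]
total_kwh = flow_energy_kwh(flow)
```

    Because each step is parametric, the effect of a product-design or equipment change can be estimated by editing one module rather than rebuilding the whole inventory.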

  8. Propellant injection systems and processes

    NASA Technical Reports Server (NTRS)

    Ito, Jackson I.

    1995-01-01

    The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation, and ultimately analytical modeling based upon basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is the ability to rank candidate design concepts for relative probability of success or technical risk in all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if no single design concept can be shown analytically to satisfy all requirements simultaneously, a series of risk-mitigating key enabling technologies can be identified for early resolution. Lower-cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design-discriminating test plans can be developed based on the physical insight provided by these analyses.

  9. Cryogenic fluid management in space

    NASA Technical Reports Server (NTRS)

    Antar, Basil N.

    1988-01-01

    Many future space-based vehicles and satellites will require on-orbit refuelling procedures. Cryogenic fluid management technology is being developed to assess the requirements of such procedures as well as to aid in the design and development of these vehicles. Cryogenic fluid management technology for this application can be divided into two areas of study: one concerned with the fluid transfer process and the other with cryogenic liquid storage. This division is based upon the technology needed for the development of each area. In the first, the interaction of fluid dynamics with thermodynamics is essential, while in the second, thermodynamic analyses alone are sufficient to define the problem. The following specific processes related to the liquid transfer area are discussed: tank chilldown and fill; tank pressurization; liquid positioning; and slosh dynamics and control. These specific issues are discussed in relation to the required technology for their development in the low-gravity application area. In each process the relevant physics controlling the technology is identified, and methods for resolving some of the basic questions are discussed.

  10. Turnaround operations analysis for OTV. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Analyses performed for ground processing, both expendable and reusable ground-based Orbital Transfer Vehicles (OTVs) launched on the Space Transportation System (STS), a reusable space-based OTV (SBOTV) launched on the STS, and a reusable ground-based OTV (GBOTV) launched on an unmanned cargo vehicle and recovered by the Orbiter are summarized. Also summarized are the analyses performed for space processing the reusable SBOTV at the Space Station in low Earth orbit (LEO) as well as the maintenance and servicing of the SBOTV accommodations at the Space Station. In addition, the candidate OTV concepts, design and interface requirements, and the Space Station design, support, and interface requirements are summarized. A development schedule and associated costs for the required SBOTV accommodations at the Space Station are presented. Finally, the technology development plan to develop the capability to process both GBOTVs and SBOTVs is summarized.

  11. A requirements index for information processing in hospitals.

    PubMed

    Ammenwerth, E; Buchauer, A; Haux, R

    2002-01-01

    Reference models describing typical information processing requirements in hospitals do not currently exist. This leads to high hospital information system (HIS) management expenses, for example, during tender processes for the acquisition of software application programs. Our aim was, therefore, to develop a comprehensive, lasting, technology-independent, and sufficiently detailed index of requirements for information processing in hospitals in order to reduce such expenses. Two dozen German experts established an index of requirements for information processing in university hospitals. This was done in a consensus-based, top-down, cyclic manner. Each functional requirement was derived from information processing functions and sub-functions of a hospital. The result is the first official German version of a requirements index, containing 233 functional requirements and 102 function-independent requirements, focusing on German needs. The functional requirements are structured according to the primary care process from admission to discharge and supplemented by requirements for handling patient records, work organization and resource planning, hospital management, research, and education. Both the German version and its English translation are available on the Internet. The index of requirements contains general information processing requirements in hospitals, formulated independently of information processing tools or of HIS architectures. It aims at supporting HIS management, especially HIS strategic planning, HIS evaluation, and tender processes. The index can be regarded as a draft, which must, however, be refined according to the specific aims of a particular project. Although focused on German needs, we expect that it can also be useful in other countries. The considerable interest shown in the index supports its usefulness.

  12. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain-oriented analysis and development concept is based on entity-relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object-oriented design is also promoted by having requirements described in terms of objects. A five-step process is presented by which objects are identified from the requirements to create a problem definition model. This process involves establishing a baseline requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model, and there is a brief discussion of how this approach might be used in a large-scale development effort.

  13. Developing Software Requirements for a Knowledge Management System That Coordinates Training Programs with Business Processes and Policies in Large Organizations

    ERIC Educational Resources Information Center

    Kiper, J. Richard

    2013-01-01

    For large organizations, updating instructional programs presents a challenge to keep abreast of constantly changing business processes and policies. Each time a process or policy changes, significant resources are required to locate and modify the training materials that convey the new content. Moreover, without the ability to track learning…

  14. Earth resources data acquisition sensor study

    NASA Technical Reports Server (NTRS)

    Grohse, E. W.

    1975-01-01

    The minimum data collection and data processing requirements are investigated for the development of water monitoring systems, which disregard redundant and irrelevant data and process only those data predictive of the onset of significant pollution events. Two approaches are immediately suggested: (1) adaptation of a presently available ambient air monitoring system developed by TVA, and (2) consideration of an air, water, and radiological monitoring system developed by the Georgia Tech Experiment Station. In order to apply monitoring systems, threshold values and maximum allowable rates of change of critical parameters such as dissolved oxygen and temperature are required.

  15. Space station analysis study. Part 2, Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Objectives of the space station program requiring the support of man in space, either in the shuttle sortie mode or in extended duration facilities are identified and analyzed. A set of functional requirements was derived to identify specific technology advancement needs, tests to be conducted, and processes to be developed. Program options are summarized for: (1) satellite power system; (2) earth services; (3) space cosmological research and development; (4) space processing and manufacturing; (5) multidiscipline science laboratory; (6) sensor development facility; (7) living and working in space; and (8) orbital depot.

  16. Descriptive Study Analyzing Discrepancies in a Software Development Project Change Request (CR) Assessment Process and Recommendations for Process Improvements

    NASA Technical Reports Server (NTRS)

    Cunningham, Kenneth J.

    2002-01-01

    The Change Request (CR) assessment process is essential in the display development cycle. The assessment process is performed to ensure that the changes stated in the description of the CR match the changes in the actual display requirements. If a discrepancy is found between the CR and the requirements, the CR must be returned to the originator for corrections. Data will be gathered from each of the developers to determine the type of discrepancies and the amount of time spent assessing each CR. This study will determine the most common types of discrepancies and the amount of time spent assessing those issues. The results of the study will provide a foundation for future improvements as well as a baseline for future studies.

  17. A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS

    NASA Technical Reports Server (NTRS)

    Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.

    1989-01-01

    In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes are found which would be enhanced by the use of knowledge based system methods of implementation. The processes most suitable for prototyping using rule based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts, and knowledge based system requirements. A quick prototype knowledge based system environment is researched and developed.

  18. Validation Methods Research for Fault-Tolerant Avionics and Control Systems: Working Group Meeting, 2

    NASA Technical Reports Server (NTRS)

    Gault, J. W. (Editor); Trivedi, K. S. (Editor); Clary, J. B. (Editor)

    1980-01-01

    The validation process comprises the activities required to ensure the agreement of system realization with system specification. A preliminary validation methodology for fault tolerant systems is documented. A general framework for a validation methodology is presented along with a set of specific tasks intended for the validation of two specimen systems, SIFT and FTMP. Two major areas of research are identified: first, those activities required to support the ongoing development of the validation process itself, and second, those activities required to support the design, development, and understanding of fault tolerant systems.

  19. Partnerships for Policy Development: A Case Study From Uganda’s Costed Implementation Plan for Family Planning

    PubMed Central

    Lipsky, Alyson B; Gribble, James N; Cahaelen, Linda; Sharma, Suneeta

    2016-01-01

    ABSTRACT In global health, partnerships between practitioners and policy makers enable stakeholders to jointly address issues that require multiple perspectives for developing, implementing, and evaluating plans, strategies, and programs. For family planning, costed implementation plans (CIPs) are developed through a strategic government-led consultative process that results in a detailed plan for program activities and an estimate of the funding required to achieve an established set of goals. Since 2009, many countries have developed CIPs. Conventionally, the CIP approach has not been defined with partnerships as a focal point; nevertheless, cooperation between key stakeholders is vital to CIP development and execution. Uganda launched a CIP in November 2014, thus providing an opportunity to examine the process through a partnership lens. This article describes Uganda’s CIP development process in detail, grounded in a framework for assessing partnerships, and provides the findings from 22 key informant interviews. Findings reveal strengths in Uganda’s CIP development process, such as willingness to adapt and strong senior management support. However, the evaluation also highlighted challenges, including district health officers (DHOs), a key group of implementers, feeling excluded from the development process. There was also a lack of planning around long-term partnership practices that could help address anticipated execution challenges. The authors recommend that future CIP development efforts use a long-term partnership strategy that fosters accountability by encompassing both the short-term goal of developing the CIP and the longer-term goal of achieving the CIP objectives. Although this study focused on Uganda’s CIP for family planning, its lessons have implications for any policy or strategy development effort that requires multiple stakeholders to ensure successful execution. PMID:27353621

  20. Developing an undergraduate geography course on digital image processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Baumann, P. R.

    1981-01-01

    Problems relating to the development of a digital image processing course in an undergraduate geography environment are discussed. Computer resource requirements, course prerequisites, and the size of the study area are addressed.

  1. Unlocking Potentials of Microwaves for Food Safety and Quality

    PubMed Central

    Tang, Juming

    2015-01-01

    Microwave is an effective means to deliver energy to food through polymeric package materials, offering potential for developing short-time in-package sterilization and pasteurization processes. The complex physics related to microwave propagation and microwave heating require special attention to the design of process systems and development of thermal processes in compliance with regulatory requirements for food safety. This article describes the basic microwave properties relevant to heating uniformity and system design, and provides a historical overview on the development of microwave-assisted thermal sterilization (MATS) and pasteurization systems in research laboratories and used in food plants. It presents recent activities on the development of 915 MHz single-mode MATS technology, the procedures leading to regulatory acceptance, and sensory results of the processed products. The article discusses needs for further efforts to bridge remaining knowledge gaps and facilitate transfer of academic research to industrial implementation. PMID:26242920
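    The heating-uniformity and system-design considerations mentioned above depend on how deeply microwaves penetrate a lossy food. As a minimal sketch, the standard power penetration-depth formula can be evaluated for the two common industrial frequencies; the dielectric values used here are illustrative assumptions, not measurements from the article.

```python
import math

C = 3.0e8  # speed of light in vacuum, m/s

def penetration_depth_m(freq_hz, eps_prime, eps_double_prime):
    """Power penetration depth: the depth at which microwave power
    decays to 1/e of its value just inside the surface of a lossy
    dielectric with relative permittivity eps' - j*eps''."""
    loss_ratio = eps_double_prime / eps_prime
    return C / (2 * math.sqrt(2) * math.pi * freq_hz
                * math.sqrt(eps_prime * (math.sqrt(1 + loss_ratio**2) - 1)))

# Illustrative food-like dielectric properties (assumed values):
dp_915 = penetration_depth_m(915e6, eps_prime=60, eps_double_prime=20)
dp_2450 = penetration_depth_m(2450e6, eps_prime=60, eps_double_prime=20)
```

    For fixed dielectric properties the depth scales inversely with frequency, which is one reason 915 MHz systems such as MATS can heat thicker packages more uniformly than 2450 MHz ovens.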

  2. Unlocking Potentials of Microwaves for Food Safety and Quality.

    PubMed

    Tang, Juming

    2015-08-01

    Microwave is an effective means to deliver energy to food through polymeric package materials, offering potential for developing short-time in-package sterilization and pasteurization processes. The complex physics related to microwave propagation and microwave heating require special attention to the design of process systems and development of thermal processes in compliance with regulatory requirements for food safety. This article describes the basic microwave properties relevant to heating uniformity and system design, and provides a historical overview on the development of microwave-assisted thermal sterilization (MATS) and pasteurization systems in research laboratories and used in food plants. It presents recent activities on the development of 915 MHz single-mode MATS technology, the procedures leading to regulatory acceptance, and sensory results of the processed products. The article discusses needs for further efforts to bridge remaining knowledge gaps and facilitate transfer of academic research to industrial implementation. © 2015 Institute of Food Technologists®

  3. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Muery, Kim; Foshee, Mark; Marsh, Angela

    2006-01-01

    International Space Station (ISS) payload developers submit their payload science requirements for the development of on-board execution timelines. The ISS systems required to execute the payload science operations must be represented as constraints for the execution timeline. Payload developers use a software application, User Requirements Collection (URC), to submit their requirements by selecting a simplified representation of ISS system constraints. To fully represent the complex ISS systems, the constraints require a level of detail that is beyond the insight of the payload developer. To provide the complex representation of the ISS system constraints, HOSC operations personnel, specifically the Payload Activity Requirements Coordinators (PARC), manually translate the payload developers simplified constraints into detailed ISS system constraints used for scheduling the payload activities in the Consolidated Planning System (CPS). This paper describes the implementation for a software application, User Requirements Integration (URI), developed to automate the manual ISS constraint translation process.
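    At its core, the translation step described above maps each simplified, payload-facing constraint onto the detailed system constraints a scheduler consumes. A minimal sketch of that mapping is given below; the constraint names and the lookup-table structure are hypothetical illustrations, not the actual URI implementation.

```python
# Hypothetical translation table: simplified constraint names chosen by a
# payload developer, mapped to detailed system constraints for scheduling.
DETAILED_CONSTRAINTS = {
    "power": ["payload_power_channel_available", "bus_load_within_limits"],
    "crew": ["crew_member_scheduled", "no_overlap_with_crew_exercise"],
    "downlink": ["ku_band_coverage_window", "onboard_recorder_capacity"],
}

def translate(simplified):
    """Expand simplified constraints into detailed ones. Unknown names are
    returned for manual review rather than silently dropped."""
    detailed, unresolved = [], []
    for name in simplified:
        if name in DETAILED_CONSTRAINTS:
            detailed.extend(DETAILED_CONSTRAINTS[name])
        else:
            unresolved.append(name)
    return detailed, unresolved

detailed, unresolved = translate(["power", "downlink", "thermal"])
```

    Keeping an explicit "unresolved" list preserves the role the coordinators played in the manual process: anything the automated mapping cannot handle is flagged for human translation instead of being lost.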

  4. 45 CFR 98.14 - Plan process.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Department of Health and Human Services GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND General Application Procedures § 98.14 Plan process. In the development of each Plan, as required pursuant to § 98.17... Federal, State, and local child care and early childhood development programs, including such programs for...

  5. 45 CFR 98.14 - Plan process.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND General Application Procedures § 98.14 Plan process. In the development of each Plan, as required pursuant to § 98.17... Federal, State, and local child care and early childhood development programs, including such programs for...

  6. 45 CFR 98.14 - Plan process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND General Application Procedures § 98.14 Plan process. In the development of each Plan, as required pursuant to § 98.17... Federal, State, and local child care and early childhood development programs, including such programs for...

  7. 45 CFR 98.14 - Plan process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND General Application Procedures § 98.14 Plan process. In the development of each Plan, as required pursuant to § 98.17... Federal, State, and local child care and early childhood development programs, including such programs for...

  8. 45 CFR 98.14 - Plan process.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND General Application Procedures § 98.14 Plan process. In the development of each Plan, as required pursuant to § 98.17... Federal, State, and local child care and early childhood development programs, including such programs for...

  9. Climate Observing Systems: Where are we and where do we need to be in the future

    NASA Astrophysics Data System (ADS)

    Baker, B.; Diamond, H. J.

    2017-12-01

    Climate research and monitoring requires an observational strategy that blends long-term, carefully calibrated measurements with short-term, focused process studies. The operation and implementation of climate observing networks and the provision of related climate services both have a significant role to play in assisting the development of national climate adaptation policies and in facilitating national economic development. Climate observing systems will require a strong research element for a long time to come. This requires improved observations of the state variables and the ability to set them in a coherent physical (as well as chemical and biological) framework with models. Climate research and monitoring requires an integrated strategy of land/ocean/atmosphere observations, including both in situ and remote sensing platforms, and modeling and analysis. It is clear that we still need more research and analysis on climate processes, sampling strategies, and processing algorithms.

  10. Developing the skills required for evidence-based practice.

    PubMed

    French, B

    1998-01-01

    The current health care environment requires practitioners with the skills to find and apply the best currently available evidence for effective health care, to contribute to the development of evidence-based practice protocols, and to evaluate the impact of utilizing validated research findings in practice. Current approaches to teaching research are based mainly on gaining skills by participation in the research process. Emphasis on the requirement for rigour in the process of creating new knowledge is assumed to lead to skill in the process of using research information created by others. This article reflects upon the requirements for evidence-based practice, and the degree to which current approaches to teaching research prepare practitioners who are able to find, evaluate, and best use currently available research information. The potential for using the principles of systematic review as a teaching and learning strategy for research is explored, and some of the possible strengths and weaknesses of this approach are highlighted.

  11. SU-E-CAMPUS-J-04: Image Guided Radiation Therapy (IGRT): Review of Technical Standards and Credentialing in Radiotherapy Clinical Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giaddui, T; Chen, W; Yu, J

    2014-06-15

    Purpose: To review IGRT credentialing experience and unexpected technical issues encountered in connection with advanced radiotherapy technologies as implemented in RTOG clinical trials. To update IGRT credentialing procedures with the aim of improving the quality of the process, and to increase the proportion of IGRT credentialing compliance. To develop a living disease-site-specific IGRT encyclopedia. Methods: Numerous technical issues were encountered during the IGRT credentialing process. The criteria used for credentialing review were based on: image quality; anatomy included in fused data sets; and shift results. Credentialing requirements have been updated according to the AAPM task group reports for IGRT to ensure that all required technical items are included in the quality review process. Implementation instructions have been updated and expanded for recent protocols. Results: Technical issues observed during the credentialing review process include, but are not limited to: poor quality images; inadequate image acquisition region; poor data quality; shifts larger than acceptable; and no soft tissue surrogate. The updated IGRT credentialing process will address these issues and will also include the technical items required by the AAPM TG 104, TG 142, and TG 179 reports. An instruction manual has been developed describing a remote credentialing method for reviewers. Submission requirements are updated, including images/documents as well as the facility questionnaire. The review report now includes a summary of the review process and the parameters that reviewers check. We have reached consensus on the minimum IGRT technical requirement for a number of disease sites. RTOG 1311 (NRG-BR002, a Phase 1 study of stereotactic body radiotherapy (SBRT) for the treatment of multiple metastases) is an example; the protocol specified the minimum requirement for each anatomical site (with/without fiducials). Conclusion: Technical issues are identified and reported.
IGRT guidelines are updated, with the corresponding credentialing requirements. An IGRT encyclopedia describing site-specific implementation issues is currently in development.

  12. System Engineering Processes at Kennedy Space Center for Development of the SLS and Orion Launch Systems

    NASA Technical Reports Server (NTRS)

    Schafer, Eric J.

    2012-01-01

    There are over 40 subsystems being developed for the future SLS and Orion Launch Systems at Kennedy Space Center. These subsystems, developed at the Kennedy Space Center Engineering Directorate, follow a comprehensive design process which requires several different product deliverables during each phase for each subsystem. This paper describes the process and gives an example of where it has been applied.

  13. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-05-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  14. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-02-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  15. 24 CFR 235.1220 - Processing section 235(r) mortgages under the direct endorsement program.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... under the direct endorsement program. 235.1220 Section 235.1220 Housing and Urban Development... National Housing Act Eligibility Requirements; Direct Endorsement § 235.1220 Processing section 235(r) mortgages under the direct endorsement program. The regulations containing the requirements which a mortgage...

  16. ISRU System Model Tool: From Excavation to Oxygen Production

    NASA Technical Reports Server (NTRS)

    Santiago-Maldonado, Edgardo; Linne, Diane L.

    2007-01-01

    In the late 1980s, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that: "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible, up-to-date models of the oxygen extraction and production process has become even clearer. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes, NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates of energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence in the fidelity of each component's model is achieved, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first-generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.

  17. Group Contribution Methods for Phase Equilibrium Calculations.

    PubMed

    Gmehling, Jürgen; Constantinescu, Dana; Schmid, Bastian

    2015-01-01

    The development and design of chemical processes are carried out by solving the balance equations of a mathematical model for sections of or the whole chemical plant with the help of process simulators. For process simulation, besides kinetic data for the chemical reaction, various pure component and mixture properties are required. Because separation processes are of particular importance for a chemical plant, reliable knowledge of the phase equilibrium behavior is required. The phase equilibrium behavior can be calculated with the help of modern equations of state or g(E)-models using only binary parameters. Unfortunately, experimental data for fitting the required binary model parameters are available for only a small fraction of the relevant systems, so these models often cannot be applied directly. To solve this problem, powerful predictive thermodynamic models have been developed. Group contribution methods allow the prediction of the required phase equilibrium data using only a limited number of group interaction parameters. A prerequisite for fitting the required group interaction parameters is a comprehensive database. That is why, for the development of powerful group contribution methods, almost all published pure component properties, phase equilibrium data, excess properties, etc., were stored in computerized form in the Dortmund Data Bank. In this review, the present status, weaknesses, advantages and disadvantages, possible applications, and typical results of the different group contribution methods for the calculation of phase equilibria are presented.
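
As a minimal sketch of how such activity coefficient models enter a phase equilibrium calculation, the following computes a binary bubble-point pressure via modified Raoult's law, using a two-parameter Margules model as a simple stand-in for a group contribution prediction such as UNIFAC. All parameter and pressure values are hypothetical.

```python
import math

def margules_gamma(x1, A12, A21):
    """Two-parameter Margules activity coefficients for a binary mixture.
    A stand-in for a predictive group contribution model; A12, A21 would
    normally be fitted or predicted from group interaction parameters."""
    x2 = 1.0 - x1
    ln_g1 = x2 ** 2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1 ** 2 * (A21 + 2.0 * (A12 - A21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

def bubble_pressure(x1, psat1, psat2, A12, A21):
    """Bubble-point pressure and vapor composition from modified
    Raoult's law: P = x1*g1*P1sat + x2*g2*P2sat."""
    g1, g2 = margules_gamma(x1, A12, A21)
    p = x1 * g1 * psat1 + (1.0 - x1) * g2 * psat2
    y1 = x1 * g1 * psat1 / p
    return p, y1

# Hypothetical example: equimolar mixture, P1sat = 100, P2sat = 50 kPa.
# With A12 = A21 = 0 the mixture is ideal and Raoult's law is recovered.
p, y1 = bubble_pressure(0.5, 100.0, 50.0, 0.0, 0.0)
```

In a process simulator the same structure appears with the activity coefficients supplied by UNIFAC or a similar predictive method instead of fitted Margules constants.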

  18. Strategic planning: getting from here to there.

    PubMed

    Kaleba, Richard

    2006-11-01

    Hospitals should develop a strategic plan that defines specific actions in a realistic time frame. Hospitals can follow a five-phase process to develop a strategic plan. The strategic planning process requires a project leader and medical staff buy-in.

  19. Proceedings of the Flat-Plate Solar Array Project Workshop on Low-Cost Polysilicon for Terrestrial Photovoltaic Solar-Cell Applications

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Sessions conducted included: polysilicon material requirements; economics; process development in the U.S.; international process development; and polysilicon market and forecasts. Twenty-one papers were presented and discussed.

  20. Planning and Teaching Creatively within a Required Curriculum for School-Age Learners

    ERIC Educational Resources Information Center

    McKay, Penny, Ed.; Graves, Kathleen, Ed.

    2006-01-01

    As the second volume of a seven-volume series, this book describes curriculum development as three interrelated processes: planning, enacting, and evaluating. Curriculum development is a dynamic process that happens among learners and teachers in the classroom. In this volume, readers will encounter teachers, curriculum developers, and…

  1. Large area crop inventory experiment crop assessment subsystem software requirements document

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The functional data processing requirements are described for the Crop Assessment Subsystem of the Large Area Crop Inventory Experiment. These requirements are used as a guide for software development and implementation.

  2. Engineering the Future: Cell 6

    NASA Technical Reports Server (NTRS)

    Stahl, P. H.

    2010-01-01

    This slide presentation reviews the development of the James Webb Space Telescope (JWST), explaining the development using a systems engineering methodology. Included are slides showing the organizational chart, the JWST Science Goals, the size of the primary mirror, and full-scale mockups of the JWST. Also included are a review of the JWST Optical Telescope Requirements, a review of the preliminary design and analysis, and the technology development required to create the JWST, with particular attention to the specific mirror technology required, along with views of the mirror manufacturing process. Several slides review the process of verification and validation by testing and analysis, including a diagram of the Cryogenic Test Facility at Marshall and views of the primary mirror being tested in the cryogenic facility.

  3. Thermal design of the space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Bachrtel, F. D.; Vaniman, J. L.; Stuckey, J. M.; Gray, C.; Widofsky, B.

    1985-01-01

    The shuttle external tank thermal design presents many challenges in meeting the stringent requirements established by the structures, main propulsion systems, and Orbiter elements. The selected thermal protection design had to meet these requirements while offering ease of application and suitability for mass production, with low weight, low cost, and high reliability. This development led to a spray-on foam insulation (SOFI) which covers the entire tank. The need and design for a SOFI material with a dual role of cryogenic insulation and ablator, and the development of the SOFI-over-SLA concept for high heating areas, are discussed. Further issues, including minimum surface ice/frost, no debris, and the development of the TPS spray process considering the required quality and process control, are examined.

  4. A public health hazard mitigation planning process.

    PubMed

    Griffith, Jennifer M; Kay Carpender, S; Crouch, Jill Artzberger; Quiram, Barbara J

    2014-01-01

    The Texas A&M Health Science Center School of Rural Public Health, a member of the Training and Education Collaborative System Preparedness and Emergency Response Learning Center (TECS-PERLC), has long-standing partnerships with 2 Health Service Regions (Regions) in Texas. TECS-PERLC was contracted by these Regions to address 2 challenges identified in meeting requirements outlined by the Risk-Based Funding Project. First, within Metropolitan Statistical Areas, there is not a formal authoritative structure. Second, preexisting tools and processes did not adequately satisfy requirements to assess public health, medical, and mental health needs and link mitigation strategies to the Public Health Preparedness Capabilities, which provide guidance to prepare for, respond to, and recover from public health incidents. TECS-PERLC, with its partners, developed a framework to interpret and apply results from the Texas Public Health Risk Assessment Tool (TxPHRAT). The 3-phase community engagement-based TxPHRAT Mitigation Planning Process (Mitigation Planning Process) and associated tools facilitated the development of mitigation plans. Tools included (1) profiles interpreting TxPHRAT results and identifying, ranking, and prioritizing hazards and capability gaps; (2) a catalog of intervention strategies and activities linked to hazards and capabilities; and (3) a template to plan, evaluate, and report mitigation planning efforts. The Mitigation Planning Process provided a framework for Regions to successfully address all funding requirements. TECS-PERLC developed more than 60 profiles, cataloged and linked 195 intervention strategies, and developed a template resulting in 20 submitted mitigation plans. A public health-focused, community engagement-based mitigation planning process was developed by TECS-PERLC and successfully implemented by the Regions. 
The outcomes met all requirements and reinforced the effectiveness of academic-practice partnerships and the importance of community engagement in mitigation planning. Additional funding has been approved to expand the Mitigation Planning Process to all counties in Texas with local health departments.

  5. Marshall Space Flight Center's Virtual Reality Applications Program 1993

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1993-01-01

    A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning). The long-term goals of this technology program are to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process in which perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. 
These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.

  6. An interfaces approach to TES ground data system processing design with the Science Investigator-led Processing System (SIPS)

    NASA Technical Reports Server (NTRS)

    Kurian, R.; Grifin, A.

    2002-01-01

    Developing production-quality software to process the large volumes of scientific data is the responsibility of the TES Ground Data System, which is being developed at the Jet Propulsion Laboratory together with support contractor Raytheon/ITSS. The large data volume and processing requirements of the TES pose significant challenges to the design.

  7. Optimization of Primary Drying in Lyophilization during Early Phase Drug Development using a Definitive Screening Design with Formulation and Process Factors.

    PubMed

    Goldman, Johnathan M; More, Haresh T; Yee, Olga; Borgeson, Elizabeth; Remy, Brenda; Rowe, Jasmine; Sadineni, Vikram

    2018-06-08

    Development of optimal drug product lyophilization cycles is typically accomplished via multiple engineering runs to determine appropriate process parameters. These runs require significant time and product investments, which are especially costly during early phase development, when the drug product formulation and lyophilization process are often defined simultaneously. Even small changes in the formulation may require a new set of engineering runs to define lyophilization process parameters. In order to overcome these development difficulties, an eight-factor definitive screening design (DSD), including both formulation and process parameters, was executed on a fully human monoclonal antibody (mAb) drug product. The DSD enables evaluation of several interdependent factors to define critical parameters that affect primary drying time and product temperature. From these parameters, a lyophilization development model is defined from which near-optimal process parameters can be derived for many different drug product formulations. This concept is demonstrated on a mAb drug product where statistically predicted cycle responses agree well with those measured experimentally. This design of experiments (DoE) approach for early phase lyophilization cycle development offers a workflow that significantly decreases the development time of clinically and potentially commercially viable lyophilization cycles for a platform formulation that still has a variable range of compositions. Copyright © 2018. Published by Elsevier Inc.
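
A definitive screening design is commonly analyzed by regressing the response on coded factor settings to screen for dominant effects. The following sketch fits a main-effects model by ordinary least squares; the three factor names, the coded runs, and the drying-time values are invented for illustration and are not the study's eight-factor design or data.

```python
import numpy as np

# Hypothetical coded settings (-1/0/+1) for three illustrative factors
# (shelf temperature, chamber pressure, fill volume) and a hypothetical
# primary drying time response in hours.
X = np.array([
    [-1, -1,  0],
    [ 1, -1, -1],
    [-1,  1, -1],
    [ 1,  1,  1],
    [ 0,  0,  0],
    [-1,  0,  1],
    [ 1,  0, -1],
    [ 0, -1,  1],
    [ 0,  1, -1],
])
y = np.array([42.0, 31.0, 37.0, 29.0, 35.0, 41.0, 29.0, 38.0, 32.0])

# Main-effects model: y ~ b0 + b1*x1 + b2*x2 + b3*x3, fit by least squares.
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
effects = dict(zip(["intercept", "shelf_T", "pressure", "fill_vol"], coef))
```

Ranking the fitted coefficients by magnitude identifies which factors dominate primary drying time, which is the screening step that lets near-optimal parameters be predicted for new formulations without repeated engineering runs.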

  8. Way Forward for High Performance Payload Processing Development

    NASA Astrophysics Data System (ADS)

    Notebaert, Olivier; Franklin, John; Lefftz, Vincent; Moreno, Jose; Patte, Mathieu; Syed, Mohsin; Wagner, Arnaud

    2012-08-01

    Payload processing is facing technological challenges due to the large increase in performance requirements of future scientific, observation, and telecom missions, as well as future instrument technologies that capture much larger amounts of data. For several years, with the prospect of higher performance together with the planned obsolescence of solutions covering current needs, ESA and the European space industry have been developing several technology solutions. Silicon technologies, radiation mitigation techniques, and innovative functional architectures are being developed with the goal of designing future space-qualified processing devices with a much higher level of performance than today's. The fast-growing commercial market has developed very attractive technologies, but these are not fully suitable with respect to their tolerance of the space environment. Without the financial capacity to explore and develop all possible technology paths, a specific and global approach is required to cover future mission needs and their performance targets effectively. The next sections describe the main issues and priorities and provide further details relevant to this approach, covering high-performance processing technology.

  9. Metallic Fuel Casting Development and Parameter Optimization Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.S. Fielding; J. Crapps; C. Unal

    One of the advantages of metallic fuel is the ability to cast the fuel slugs to near net shape with little additional processing. However, the high aspect ratio of the fuel is not ideal for casting. EBR-II fuel was cast using counter-gravity injection casting (CGIC), but concerns have been raised about the feasibility of this process for americium-bearing alloys. The Fuel Cycle Research and Development program has begun developing gravity casting techniques suitable for fuel production. Compared to CGIC, gravity casting does not require a large heel that is then recycled, does not require application of a vacuum during melting, and is conducive to reusable molds. Development has included fabrication of two separate bench-scale (approximately 300 gram) systems. To shorten development time, computer simulations have been used to ensure mold and crucible designs are feasible and to identify which fluid properties most affect casting behavior and therefore require more characterization.

  10. Updating National Topographic Data Base Using Change Detection Methods

    NASA Astrophysics Data System (ADS)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time, and the development of specialized procedures. In many National Mapping and Cadastre Agencies (NMCAs), the updating cycle takes a few years. Today, reality is dynamic and changes occur every day; therefore, users expect the existing database to portray current reality. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying objects were low, and most detections were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, the development of mapping technologies, advances in image processing and computer vision algorithms, and the development of digital aerial cameras with an NIR band and Very High Resolution satellites have allowed the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis, and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step toward updating the NTDB at the Survey of Israel.
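
One building block of such an automated process, differencing two Digital Surface Model epochs to flag candidate changes, can be sketched as follows. The threshold value and the toy rasters are illustrative assumptions, not the actual production parameters.

```python
import numpy as np

def detect_changes(dsm_old, dsm_new, height_thresh=2.5):
    """Flag candidate change pixels where the surface height moved by
    more than height_thresh metres between two co-registered DSM epochs.
    Real pipelines would follow this with MS classification, segmentation,
    and object analysis to suppress false alarms."""
    diff = dsm_new - dsm_old
    return np.abs(diff) > height_thresh

# Toy 4x4 DSMs: a new "building" of +6 m appears in one corner.
old = np.zeros((4, 4))
new = old.copy()
new[:2, :2] += 6.0
mask = detect_changes(old, new)
```

The resulting boolean mask marks only the four corner pixels, illustrating how DSM differencing localizes candidate updates for subsequent classification and editing.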

  11. Autophagy in C. elegans development.

    PubMed

    Palmisano, Nicholas J; Meléndez, Alicia

    2018-04-27

    Autophagy involves the sequestration of cytoplasmic contents in a double-membrane structure referred to as the autophagosome and the degradation of its contents upon delivery to lysosomes. Autophagy activity has a role in multiple biological processes during the development of the nematode Caenorhabditis elegans. Basal levels of autophagy are required to remove aggregate-prone proteins, paternal mitochondria, and spermatid-specific membranous organelles. During larval development, autophagy is required for the remodeling that occurs during dauer development, and autophagy can selectively degrade components of the miRNA-induced silencing complex and modulate miRNA-mediated silencing. Basal levels of autophagy are important in synapse formation and in the germ line, where autophagy promotes stem cell proliferation. Autophagy activity is also required for the efficient removal of apoptotic cell corpses by promoting phagosome maturation. Finally, autophagy is also involved in lipid homeostasis and in the aging process. In this review, we first describe the molecular complexes involved in the process of autophagy, its regulation, and mechanisms for cargo recognition. In the second section, we discuss the developmental contexts where autophagy has been shown to be important. Studies in C. elegans provide valuable insights into the physiological relevance of this process during metazoan development. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Brg1 coordinates multiple processes during retinogenesis and is a tumor suppressor in retinoblastoma

    DOE PAGES

    Aldiri, Issam; Ajioka, Itsuki; Xu, Beisi; ...

    2015-12-01

    Retinal development requires precise temporal and spatial coordination of cell cycle exit, cell fate specification, cell migration and differentiation. When this process is disrupted, retinoblastoma, a developmental tumor of the retina, can form. Epigenetic modulators are central to precisely coordinating developmental events, and many epigenetic processes have been implicated in cancer. Studying epigenetic mechanisms in development is challenging because they often regulate multiple cellular processes; therefore, elucidating the primary molecular mechanisms involved can be difficult. Here we explore the role of Brg1 (Smarca4) in retinal development and retinoblastoma in mice using molecular and cellular approaches. Brg1 was found to regulate retinal size by controlling cell cycle length, cell cycle exit and cell survival during development. Brg1 was not required for cell fate specification but was required for photoreceptor differentiation and cell adhesion/polarity programs that contribute to proper retinal lamination during development. The combination of defective cell differentiation and lamination led to retinal degeneration in Brg1-deficient retinae. Despite the hypocellularity, premature cell cycle exit, increased cell death and extended cell cycle length, retinal progenitor cells persisted in Brg1-deficient retinae, making them more susceptible to retinoblastoma. In conclusion, ChIP-Seq analysis suggests that Brg1 might regulate gene expression through multiple mechanisms.

  13. Brg1 coordinates multiple processes during retinogenesis and is a tumor suppressor in retinoblastoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldiri, Issam; Ajioka, Itsuki; Xu, Beisi

    Retinal development requires precise temporal and spatial coordination of cell cycle exit, cell fate specification, cell migration and differentiation. When this process is disrupted, retinoblastoma, a developmental tumor of the retina, can form. Epigenetic modulators are central to precisely coordinating developmental events, and many epigenetic processes have been implicated in cancer. Studying epigenetic mechanisms in development is challenging because they often regulate multiple cellular processes; therefore, elucidating the primary molecular mechanisms involved can be difficult. Here we explore the role of Brg1 (Smarca4) in retinal development and retinoblastoma in mice using molecular and cellular approaches. Brg1 was found to regulate retinal size by controlling cell cycle length, cell cycle exit and cell survival during development. Brg1 was not required for cell fate specification but was required for photoreceptor differentiation and cell adhesion/polarity programs that contribute to proper retinal lamination during development. The combination of defective cell differentiation and lamination led to retinal degeneration in Brg1-deficient retinae. Despite the hypocellularity, premature cell cycle exit, increased cell death and extended cell cycle length, retinal progenitor cells persisted in Brg1-deficient retinae, making them more susceptible to retinoblastoma. In conclusion, ChIP-Seq analysis suggests that Brg1 might regulate gene expression through multiple mechanisms.

  14. Intercomparison of U.S. Ballast Water Test Facilities

    DTIC Science & Technology

    2012-11-01

    with many of the requirements of the ETV Protocol, although in some aspects they employed different approaches and processes. The ETV Protocol calls...and recommendations from the audit process will be reported separately by NSF (in preparation). Both TFs benefited from developing procedures needed...4.1.2  Development Process

  15. Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model

    NASA Astrophysics Data System (ADS)

    Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman

    2016-08-01

    e-Tendering is the electronic processing of tender documents via the internet, allowing tenderers to publish, communicate, access, receive, and submit all tender-related information and documentation online. This study aims to design an e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach to assigning tasks and responsibilities within the software development process. RUP has four phases that can assist researchers in adjusting to the requirements of projects with different scopes, problems, and sizes. RUP is characterized as a use-case-driven, architecture-centered, iterative and incremental process model. However, the scope of this study focuses only on the Inception and Elaboration phases as steps to develop the model, and performs only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the Inception and Elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. The UML notation and the software program Star UML are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can benefit e-Tendering developers and researchers in the e-Tendering domain. In addition, this study shows that RUP is one of the best system development methodologies and can be used as a research methodology in the Software Engineering domain related to the secure design of any observed application. This methodology has been tested in various studies in domains such as Simulation-based Decision Support, Security Requirement Engineering, Business Modeling, and Secure System Requirements. In conclusion, these studies showed that RUP is a good research methodology that can be adapted in any Software Engineering (SE) research domain that requires a few artifacts to be generated, such as use case models, misuse case models, activity diagrams, and an initial class diagram, from a list of requirements identified earlier by the SE researchers.

  16. User-driven product data manager system design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-03-01

    With the infusion of information technologies into product development and production processes, effective management of product data is becoming essential to modern production enterprises. When an enterprise-wide Product Data Manager (PDM) is implemented, PDM designers must satisfy the requirements of individual users with different job functions and requirements, as well as the requirements of the enterprise as a whole. Concern must also be shown for the interrelationships between information, methods for retrieving archival information, and integration of the PDM into the product development process. This paper describes a user-driven approach applied to PDM design for an agile manufacturing pilot project at Sandia National Laboratories that has been successful in achieving a much faster design-to-production process for a precision electromechanical surety device.

  17. Image Processing Using a Parallel Architecture.

    DTIC Science & Technology

    1987-12-01

    ENG/87D-25 Abstract This study developed a set of low-level image processing tools on a parallel computer that allows concurrent processing of images...environment, the set of tools offers a significant reduction in the time required to perform some commonly used image processing operations...step toward developing these systems, a structured set of image processing tools was implemented using a parallel computer. More important than
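
The underlying decomposition, splitting an image into bands so that independent workers process them concurrently, can be sketched in modern terms. The original work targeted a specific 1980s parallel computer; this thread-pool version is only an assumed illustration of the data-parallel idea, with an intensity inversion standing in for a low-level tool.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def invert_band(band):
    """A stand-in low-level operation (8-bit intensity inversion)
    applied independently to one horizontal band of the image."""
    return 255 - band

def parallel_apply(image, op, n_workers=4):
    """Split the image into row bands and process them concurrently,
    then reassemble the result in order."""
    bands = np.array_split(image, n_workers, axis=0)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return np.vstack(list(pool.map(op, bands)))

# Toy 8x8 "image" processed by four concurrent workers.
img = np.arange(64).reshape(8, 8)
result = parallel_apply(img, invert_band)
```

Pointwise operations like this partition cleanly; neighborhood operations (e.g., convolution) would additionally need overlapping band borders, which is where most of the implementation effort in such tool sets goes.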

  18. System Engineering Processes at Kennedy Space Center for Development of SLS and Orion Launch Systems

    NASA Technical Reports Server (NTRS)

    Schafer, Eric; Stambolian, Damon; Henderson, Gena

    2013-01-01

    There are over 40 subsystems being developed for the future SLS and Orion launch systems at Kennedy Space Center. These subsystems are developed by the Kennedy Space Center Engineering Directorate, which follows a comprehensive design process requiring several different product deliverables during each phase of each subsystem. This presentation describes that process, with examples of where it has been applied.

  19. Towards Requirements in Systems Engineering for Aerospace IVHM Design

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Roychoudhury, Indranil; Lin, Wei; Goebel, Kai

    2013-01-01

    Health management (HM) technologies have been employed in safety-critical systems for decades, but a coherent, systematic process to integrate HM into system design is not yet clear. Consequently, in most cases, health management ends up being an afterthought or 'band-aid' solution. Moreover, limited guidance exists for carrying out systems engineering (SE) when writing requirements for designs with integrated vehicle health management (IVHM). It is well accepted that requirements are key to developing a successful IVHM system, from the concept stage through development, verification, utilization, and support. However, writing requirements for systems with IVHM capability poses unique challenges that require designers to look beyond their own domains and consider the constraints and specifications of other interlinked systems. In this paper we look at various stages in the SE process and identify activities specific to IVHM design and development. More importantly, several relevant questions are posed that system engineers must address at various design and development stages. Addressing these questions should provide some guidance to systems engineers in writing IVHM-related requirements to ensure that appropriate IVHM functions are built into the system design.

  20. Linking Goal-Oriented Requirements and Model-Driven Development

    NASA Astrophysics Data System (ADS)

    Pastor, Oscar; Giachetti, Giovanni

    In the context of Goal-Oriented Requirement Engineering (GORE) there are interesting modeling approaches for the analysis of complex scenarios that are oriented to obtain and represent the relevant requirements for the development of software products. However, the way to use these GORE models in an automated Model-Driven Development (MDD) process is not clear, and, in general terms, the translation of these models into the final software products is still performed manually. Therefore, in this chapter, we show an approach to automatically link GORE models and MDD processes, which has been elaborated by considering the experience obtained from linking the i* framework with an industrially applied MDD approach. The linking approach proposed is formulated by means of a generic process that is based on current modeling standards and technologies in order to facilitate its application for different MDD and GORE approaches. Special attention is paid to how this process generates appropriate model transformation mechanisms to automatically obtain MDD conceptual models from GORE models, and how it can be used to specify validation mechanisms to assure the correct model transformations.
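
A toy sketch of the kind of automated model transformation described, mapping a goal-model fragment (i*-style actors and resource dependencies) to conceptual-model class stubs, is shown below. The input structure, names, and mapping rule are invented for illustration and are not the authors' transformation mechanisms.

```python
# Hypothetical goal-model fragment: actors plus (actor, resource)
# dependency pairs, loosely in the spirit of an i* model.
goal_model = {
    "actors": ["Customer", "Retailer"],
    "resources": [("Customer", "Order"), ("Retailer", "Invoice")],
}

def to_conceptual_model(model):
    """Illustrative transformation rule: each actor becomes a class,
    each resource dependency becomes a class plus an association from
    the depending actor to it."""
    classes = {}
    for actor in model["actors"]:
        classes[actor] = {"attributes": [], "associations": []}
    for owner, resource in model["resources"]:
        classes.setdefault(resource, {"attributes": [], "associations": []})
        classes[owner]["associations"].append(resource)
    return classes
```

Real MDD toolchains express such rules in transformation languages over metamodels rather than plain dictionaries, but the structure, traversing source-model elements and emitting target-model elements, is the same.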

  1. Silicon-Germanium Fast Packet Switch Developed for Communications Satellites

    NASA Technical Reports Server (NTRS)

    Quintana, Jorge A.

    1999-01-01

    Emerging multimedia applications and future satellite systems will require high-speed switching networks to accommodate high data-rate traffic among thousands of potential users. This will require advanced switching devices to enable communication between satellites. The NASA Lewis Research Center has been working closely with industry to develop a state-of-the-art fast packet switch (FPS) to fulfill this requirement. Recently, the Satellite Industry Task Force identified the need for high-capacity onboard processing switching components as one of the "grand challenges" for the satellite industry in the 21st century. In response to this challenge, future generations of onboard processing satellites will require low-power and low-mass components to enable transmission of services in the 100-gigabit-per-second (Gbps, i.e., 10^11 bits per second) range.

  2. 24 CFR 570.485 - Making of grants.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... future performance. If the Secretary makes any such determination, however, the State may be required to...) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN... includes requirements for the content of the consolidated plan, for the process of developing the plan...

  3. 24 CFR 570.485 - Making of grants.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... future performance. If the Secretary makes any such determination, however, the State may be required to...) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN... includes requirements for the content of the consolidated plan, for the process of developing the plan...

  4. 24 CFR 570.485 - Making of grants.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... future performance. If the Secretary makes any such determination, however, the State may be required to...) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN... includes requirements for the content of the consolidated plan, for the process of developing the plan...

  5. Automated process planning system

    NASA Technical Reports Server (NTRS)

    Mann, W.

    1978-01-01

    Program helps process engineers set up manufacturing plans for machined parts. System allows one to develop and store library of similar parts characteristics, as related to particular facility. Information is then used in interactive system to help develop manufacturing plans that meet required standards.

  6. Course Development Cycle Time: A Framework for Continuous Process Improvement.

    ERIC Educational Resources Information Center

    Lake, Erinn

    2003-01-01

    Details Edinboro University's efforts to reduce the extended cycle time required to develop new courses and programs. Describes a collaborative process improvement framework, illustrated data findings, the team's recommendations for improvement, and the outcomes of those recommendations. (EV)

  7. RELAP-7 Code Assessment Plan and Requirement Traceability Matrix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.

    2016-10-01

    RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed toward leveraging the advancements in computer science technology, numerical solution methods, and physical models made over the last decades. Recently, INL has also been working to establish a code assessment plan, which aims to ensure an improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements. To this end, we first survey the literature (i.e., international and domestic reports and research articles) addressing the desirable features generally required of advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts, as well as the legacy issues, of several recently developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate the code development process and its present capability.
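    The RTM idea described above can be sketched in miniature: a traceability matrix maps each requirement to the verification cases that cover it, so uncovered requirements are mechanically detectable. The requirement IDs and case names below are invented for illustration and are not taken from the RELAP-7 plan.

```python
# Minimal sketch of a Requirement Traceability Matrix (RTM).
# All IDs and case names are hypothetical.

requirements = {
    "SW-DES-001": "Modular solver architecture",
    "UI-002":     "Command-line input deck compatibility",
    "TECH-003":   "Two-phase flow model validated against benchmark data",
}

# Traceability: requirement ID -> verification cases covering it
rtm = {
    "SW-DES-001": ["case_solver_unit"],
    "TECH-003":   ["case_edwards_pipe", "case_marviken"],
}

def uncovered(requirements, rtm):
    """Return requirement IDs with no linked verification case."""
    return sorted(r for r in requirements if not rtm.get(r))

print(uncovered(requirements, rtm))  # ['UI-002']
```

    In practice the same gap check is what an assessment plan automates across hundreds of requirements, regardless of whether the matrix lives in a spreadsheet or a database.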

  8. Investigation of the current requirements engineering practices among software developers at the Universiti Utara Malaysia Information Technology (UUMIT) centre

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Abdullah, Inam

    2016-08-01

    Requirements Engineering (RE) is a systematic and integrated process of eliciting, elaborating, negotiating, validating, and managing the requirements of a system in a software development project. UUM has been supported by various systems developed and maintained by the UUM Information Technology (UUMIT) Centre. The aim of this study was to assess the current requirements engineering practices at UUMIT. The main problem that prompted this research is the lack of studies that support software development activities at the UUMIT. The study is geared toward helping UUMIT produce quality, yet time- and cost-saving, software products by implementing cutting-edge, state-of-the-art requirements engineering practices. The study also contributes to UUM by identifying the activities needed for software development so that management will be able to allocate budget to provide adequate and precise training for the software developers. Three variables were investigated: Requirements Description; Requirements Development (comprising Requirements Elicitation, Requirements Analysis and Negotiation, and Requirements Validation); and Requirements Management. The results showed that the current practice of requirements engineering at UUMIT is encouraging, but still needs further development and improvement because a few RE practices were seldom practiced.

  9. Lunar resource recovery: A definition of requirements

    NASA Technical Reports Server (NTRS)

    Elsworth, D.; Kohler, J. L.; Alexander, S. S.

    1992-01-01

    The capability to locate, mine, and process the natural resources of the Moon will be an essential requirement for lunar base development and operation. The list of materials that will be necessary is extensive and ranges from oxygen and hydrogen for fuel and life support to process tailings for emplacement over habitats. Despite these resource needs, little is known about methodologies that might be suitable for utilizing lunar resources. This paper examines some of the requirements and constraints for resource recovery and identifies key areas of research needed to locate, mine, and process extraterrestrial natural resources.

  10. Achieving continuous manufacturing: technologies and approaches for synthesis, workup, and isolation of drug substance. May 20-21, 2014 Continuous Manufacturing Symposium.

    PubMed

    Baxendale, Ian R; Braatz, Richard D; Hodnett, Benjamin K; Jensen, Klavs F; Johnson, Martin D; Sharratt, Paul; Sherlock, Jon-Paul; Florence, Alastair J

    2015-03-01

    This whitepaper highlights current challenges and opportunities associated with continuous synthesis, workup, and crystallization of active pharmaceutical ingredients (drug substances). We describe the technologies and requirements at each stage and emphasize the different considerations for developing continuous processes compared with batch. In addition to the specific sequence of operations required to deliver the necessary chemical and physical transformations for continuous drug substance manufacture, consideration is also given to how adoption of continuous technologies may impact different manufacturing stages in development, from discovery and process development through scale-up and into full-scale production. The impact of continuous manufacture on drug substance quality and the associated challenges for control and for process safety are also emphasized. In addition to the technology and operational considerations necessary for the adoption of continuous manufacturing (CM), this whitepaper also addresses the cultural, as well as skills and training, challenges that will need to be met, with support from organizations, in order to accommodate the new work flows. Specific action items for industry leaders are:
    - Develop flow chemistry toolboxes, exploiting the advantages of flow processing and including highly selective chemistries that allow use of simple and effective continuous workup technologies.
    - Make modular or plug-and-play equipment available, especially for workup, to assist straightforward deployment in the laboratory.
    - Pursue standardization, which, as learned from other industries, is highly desirable and will require cooperation across industry and academia to develop and implement.
    - Implement and exploit process analytical technologies (PAT) for real-time dynamic control of continuous processes.
    - Develop modeling and simulation techniques to support continuous process development and control; progress is required in multiphase systems such as crystallization.
    - Involve all parts of the organization, from discovery through research and development to manufacturing, in the implementation of CM.
    - Engage with academia to develop the training provision to support the skills base for CM, particularly flow chemistry, physical chemistry, and chemical engineering skills at the chemistry-process interface.
    - Promote and encourage publication and dissemination of examples of CM across the sector to demonstrate capability, engage with regulatory comment, establish benchmarks for performance, and highlight challenges.
    - Develop the economic case for CM of drug substance; this will involve various stakeholders at the project and business levels, but establishing the critical economic drivers is essential to driving the transformation in manufacturing.
    © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  11. The development of a post-test diagnostic system for rocket engines

    NASA Technical Reports Server (NTRS)

    Zakrajsek, June F.

    1991-01-01

    An effort was undertaken by NASA to develop an automated post-test, post-flight diagnostic system for rocket engines. The automated system is designed to be generic and to automate the rocket engine data review process. A modular, distributed architecture with a generic software core was chosen to meet the design requirements. The diagnostic system is initially being applied to the Space Shuttle Main Engine data review process. The system modules currently under development are the session/message manager, and portions of the applications section, the component analysis section, and the intelligent knowledge server. An overview is presented of a rocket engine data review process, the design requirements and guidelines, the architecture and modules, and the projected benefits of the automated diagnostic system.

  12. Practical considerations in clinical strategy to support the development of injectable drug-device combination products for biologics.

    PubMed

    Li, Zhaoyang; Easton, Rachael

    2018-01-01

    The development of an injectable drug-device combination (DDC) product for biologics is an intricate and evolving process that requires substantial investments of time and money. Consequently, the commercial dosage form(s) or presentation(s) are often not ready when pivotal trials commence, and it is common to have drug product changes (manufacturing process or presentation) during clinical development. A scientifically sound and robust bridging strategy is required in order to introduce these changes into the clinic safely. There is currently no single developmental paradigm, but a risk-based hierarchical approach has been well accepted. The rigor required of a bridging package depends on the level of risk associated with the changes. Clinical pharmacokinetic/pharmacodynamic comparability or outcome studies are only required when important changes occur at a late stage. Moreover, an injectable DDC needs to be user-centric, and usability assessment in real-world clinical settings may be required to support the approval of a DDC. In this review, we discuss the common issues during the manufacturing process and presentation development of an injectable DDC and practical considerations in establishing a clinical strategy to address these issues, including key elements of clinical studies. We also analyze current practice in the industry and review the relevant regulatory guidance, and its status, in the DDC field.

  13. Practical considerations in clinical strategy to support the development of injectable drug-device combination products for biologics

    PubMed Central

    Easton, Rachael

    2018-01-01

    ABSTRACT The development of an injectable drug-device combination (DDC) product for biologics is an intricate and evolving process that requires substantial investments of time and money. Consequently, the commercial dosage form(s) or presentation(s) are often not ready when pivotal trials commence, and it is common to have drug product changes (manufacturing process or presentation) during clinical development. A scientifically sound and robust bridging strategy is required in order to introduce these changes into the clinic safely. There is currently no single developmental paradigm, but a risk-based hierarchical approach has been well accepted. The rigor required of a bridging package depends on the level of risk associated with the changes. Clinical pharmacokinetic/pharmacodynamic comparability or outcome studies are only required when important changes occur at a late stage. Moreover, an injectable DDC needs to be user-centric, and usability assessment in real-world clinical settings may be required to support the approval of a DDC. In this review, we discuss the common issues during the manufacturing process and presentation development of an injectable DDC and practical considerations in establishing a clinical strategy to address these issues, including key elements of clinical studies. We also analyze current practice in the industry and review the relevant regulatory guidance, and its status, in the DDC field. PMID:29035675

  14. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1986-01-01

    Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such approach is to build a software system or environment that directly supports the software engineering process. This is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.

  15. SE Requirements Development Tool User Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benson, Faith Ann

    2016-05-13

    The LANL Systems Engineering Requirements Development Tool (SERDT) is a data collection tool created in InfoPath for use with Los Alamos National Laboratory's (LANL) SharePoint sites. Projects can fail if a clear definition of the final product requirements is not performed. For projects to be successful, requirements must be defined early in the project and tracked during execution to ensure that the goals of the project are met. The focus of this tool is therefore requirements definition. The content of this form is based on International Council on Systems Engineering (INCOSE) and Department of Defense (DoD) process standards and allows for single or collaborative input. The "Scoping" section is where project information is entered by the project team prior to requirements development; it includes definitions and examples to assist the user in completing the forms. The data entered are used to define the requirements, and once the form is filled out, a "Requirements List" is automatically generated and a Word document is created and saved to a SharePoint document library. SharePoint also includes the ability to download the requirements data defined in the InfoPath form into an Excel spreadsheet. This User Guide will assist you in navigating the data entry process.

  16. A method to accelerate creation of plasma etch recipes using physics and Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Chopra, Meghali J.; Verma, Rahul; Lane, Austin; Willson, C. G.; Bonnecaze, Roger T.

    2017-03-01

    Next-generation semiconductor technologies like high-density memory storage require precise 2D and 3D nanopatterns. Plasma etching processes are essential to achieving the nanoscale precision required for these structures. Current plasma process development methods rely primarily on iterative trial and error or factorial design of experiment (DOE) to define the plasma process space. Here we evaluate the efficacy of the software tool Recipe Optimization for Deposition and Etching (RODEo) against standard industry methods at determining the process parameters of a high-density O2 plasma system with three case studies. In the first case study, we demonstrate that RODEo is able to predict etch rates more accurately than a regression model based on a full factorial design while using 40% fewer experiments. In the second case study, we demonstrate that RODEo performs significantly better than a full factorial DOE at identifying optimal process conditions to maximize anisotropy. In the third case study we experimentally show how RODEo maximizes etch rates while using half the experiments of a full factorial DOE method. With enhanced process predictions and more accurate maps of the process space, RODEo reduces the number of experiments required to develop and optimize plasma processes.
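    The contrast the abstract draws between full-factorial DOE and model-guided experiment selection can be illustrated with a toy sketch. This is not the RODEo algorithm; the response surface, factor names, and greedy neighbour search below are all invented, purely to show why adaptive selection of experiments can reach the same optimum with far fewer runs.

```python
# Toy comparison: full-factorial DOE vs. a sequential (greedy) search.
# The etch-rate response function and both factors are invented.
import itertools

def etch_rate(power, pressure):
    """Toy unimodal response surface (arbitrary units)."""
    return 100 - (power - 0.6) ** 2 * 80 - (pressure - 0.3) ** 2 * 60

levels = [i / 4 for i in range(5)]           # 5 levels per factor on [0, 1]
grid = list(itertools.product(levels, levels))

# Full factorial: run every combination (5 x 5 = 25 experiments)
factorial = {pt: etch_rate(*pt) for pt in grid}
best_factorial = max(factorial, key=factorial.get)

# Sequential search: start at one corner, then repeatedly run the
# untested grid neighbours of the current best point until none remain
tested = {(0.0, 0.0): etch_rate(0.0, 0.0)}
while True:
    best = max(tested, key=tested.get)
    nbrs = [(best[0] + dp, best[1] + dq)
            for dp in (-0.25, 0.0, 0.25) for dq in (-0.25, 0.0, 0.25)]
    new = [n for n in nbrs if n in factorial and n not in tested]
    if not new:                               # current best is a grid optimum
        break
    for n in new:
        tested[n] = etch_rate(*n)

print(best_factorial, len(factorial))             # (0.5, 0.25) with 25 runs
print(max(tested, key=tested.get), len(tested))   # (0.5, 0.25) with 12 runs
```

    A greedy climb only works because this toy surface is unimodal; tools like RODEo instead fit a statistical model of the whole process space, but the economic argument is the same: each new experiment is chosen using what previous experiments revealed.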

  17. Advanced automation for in-space vehicle processing

    NASA Technical Reports Server (NTRS)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation-node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline of automated tasks. Both physical tasks, such as manipulation, assembly, and actuation, and cognitive tasks, such as visual inspection, monitoring and diagnosis, and task planning, are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed, and processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of "primitive" task descriptions. Primitive, or standard, tasks have been developed both for manual (crew) processing and for automated machine processing.
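    A ranking methodology of this kind is often implemented as weighted multi-criteria scoring over the primitive task set. The criteria, weights, and tasks below are hypothetical, invented only to illustrate the shape of such a calculation, not taken from the study itself.

```python
# Hypothetical weighted-scoring sketch for ranking processing tasks by
# automation potential. Criteria, weights, and scores are all invented.

weights = {"hazard": 0.40, "eva_time": 0.35, "repetitiveness": 0.25}

tasks = {  # criterion scores normalized to [0, 1]
    "propellant transfer": {"hazard": 0.9, "eva_time": 0.6, "repetitiveness": 0.5},
    "visual inspection":   {"hazard": 0.2, "eva_time": 0.8, "repetitiveness": 0.9},
    "truss assembly":      {"hazard": 0.5, "eva_time": 0.9, "repetitiveness": 0.7},
}

def automation_score(scores):
    """Weighted sum: higher means a stronger candidate for automation."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(tasks, key=lambda t: automation_score(tasks[t]), reverse=True)
print(ranked)  # ['propellant transfer', 'truss assembly', 'visual inspection']
```

    Separating the weights from the per-task scores makes it easy to re-rank the whole task list when mission priorities (e.g., crew safety vs. EVA time) shift.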

  18. Analytical techniques for the study of some parameters of multispectral scanner systems for remote sensing

    NASA Technical Reports Server (NTRS)

    Wiswell, E. R.; Cooper, G. R. (Principal Investigator)

    1978-01-01

    The author has identified the following significant results. The concept of average mutual information in the received spectral random process about the spectral scene was developed. Techniques amenable to implementation on a digital computer were also developed to make the required average mutual information calculations. These techniques required identification of models for the spectral response process of scenes. Stochastic modeling techniques were adapted for use. These techniques were demonstrated on empirical data from wheat and vegetation scenes.
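    For a discrete model of the scene/measurement pair, the average mutual information the abstract refers to can be computed directly from a joint probability table; a minimal sketch (with an invented joint distribution) is:

```python
# I(X;Y) in bits from a joint probability table p_xy[x][y].
# The joint distributions below are invented for illustration.
import math

def mutual_information(p_xy):
    """Average mutual information of a discrete joint distribution, in bits."""
    p_x = [sum(row) for row in p_xy]            # marginal of X
    p_y = [sum(col) for col in zip(*p_xy)]      # marginal of Y
    mi = 0.0
    for x, row in enumerate(p_xy):
        for y, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (p_x[x] * p_y[y]))
    return mi

# Perfectly correlated binary variables share exactly 1 bit
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0
# Independent variables share 0 bits
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))
```

    In the scanner setting, X would index spectral scene classes and Y the received (noisy) spectral measurements, with the joint table supplied by the stochastic scene models mentioned above.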

  19. Flat-plate solar array project. Volume 5: Process development

    NASA Technical Reports Server (NTRS)

    Gallagher, B.; Alexander, P.; Burger, D.

    1986-01-01

    The goal of the Process Development Area, as part of the Flat-Plate Solar Array (FSA) Project, was to develop and demonstrate the solar cell fabrication and module assembly process technologies required to meet the cost, lifetime, production capacity, and performance goals of the FSA Project. R&D efforts expended by Government, industry, and universities in developing processes capable of meeting the project's goals under volume production conditions are summarized. The cost goals allocated for processing were demonstrated with small-volume quantities that were extrapolated by cost analysis to large-volume production. To provide proper focus and coverage of the process development effort, four separate technology sections are discussed: surface preparation, junction formation, metallization, and module assembly.

  20. Cloud Computing and Validated Learning for Accelerating Innovation in IoT

    ERIC Educational Resources Information Center

    Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus

    2015-01-01

    Innovation in Internet of Things (IoT) requires more than just creation of technology and use of cloud computing or big data platforms. It requires accelerated commercialization or aptly called go-to-market processes. To successfully accelerate, companies need a new type of product development, the so-called validated learning process.…

  1. Conducting a Competitive Prototype Acquisition Program: An Account of the Joint Light Tactical Vehicle (JLTV) Technology Development Phase

    DTIC Science & Technology

    2013-03-01

    (Indexed fragment of the report's table of contents: B. Requirements Analysis Process; 1. Requirements Management and Analysis Plan; 2. Knowledge Point Reviews; ... are Identified; 5. RMAP/CDD Process Analysis and Results; IV. TD Phase Begins.)

  2. Knowledge Representation Artifacts for Use in Sensemaking Support Systems

    DTIC Science & Technology

    2015-03-12

    ... and manual processing must be replaced by automated processing wherever it makes sense and is possible. Clearly, given the data and cognitive ... knowledge-centric view of situation analysis and decision-making as previously discussed, has led to the development of several automated processing components ... for use in sensemaking support systems [6-11]. In turn, automated processing has required the development of appropriate knowledge ...

  3. Development of Entry-Level Competence Tests: A Strategy for Evaluation of Vocational Education Training Systems

    ERIC Educational Resources Information Center

    Schutte, Marc; Spottl, Georg

    2011-01-01

    Developing countries such as Malaysia and Oman have recently established occupational standards based on core work processes (functional clusters of work objects, activities and performance requirements), to which competencies (performance determinants) can be linked. While the development of work-process-based occupational standards is supposed…

  4. The Process of Professional School Counselor Multicultural Competency Development: A Grounded Theory

    ERIC Educational Resources Information Center

    Berry, Jessica L.

    2013-01-01

    Professional School Counselors who work in schools with a range of student diversity are posed with a unique set of challenges which require them to develop their multicultural competencies. The following qualitative study examined the process of developing multicultural competence for four professional school counselors. The four professional…

  5. International Space Station Requirement Verification for Commercial Visiting Vehicles

    NASA Technical Reports Server (NTRS)

    Garguilo, Dan

    2017-01-01

    The COTS program demonstrated that NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to the ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from thousands to hundreds) focused on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification process is being followed, and participates in joint verification events and analysis for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.

  6. Aerospace Engineering Systems

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: Physics-based analysis tools for filling the design space database; Distributed computational resources to reduce response time and cost; Web-based technologies to relieve machine-dependence; and Artificial intelligence technologies to accelerate processes and reduce process variability. Activities such as the Advanced Design Technologies Testbed (ADTT) project at NASA Ames Research Center study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities will be reported.

  7. Why and how Mastering an Incremental and Iterative Software Development Process

    NASA Astrophysics Data System (ADS)

    Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe

    2004-06-01

    One of the key issues regularly mentioned in the current software crisis of the space domain is related to the software development process, which must be performed while the system definition is not yet frozen. This is especially true for complex systems like launchers or space vehicles. Several more or less mature solutions are under study by EADS SPACE Transportation and are going to be presented in this paper. The basic principle is to develop the software through an iterative and incremental process instead of the classical waterfall approach, with the following advantages:
    - It permits systematic management and incorporation of requirements changes over the development cycle at minimal cost. As far as possible, the most dimensioning requirements are analyzed and developed first, to validate the architecture concept very early without the details.
    - A software prototype is available very quickly. It improves communication between the system and software teams, as it enables a very early and efficient check of the common understanding of the system requirements.
    - It allows the software team to complete a whole development cycle very early, and thus to become quickly familiar with the software development environment (methodology, technology, tools...). This is particularly important when the team is new, or when the environment has changed since the previous development. In any case, it greatly improves the learning curve of the software team.
    These advantages seem very attractive, but mastering an iterative development process efficiently is not so easy and raises difficulties such as:
    - How to freeze one configuration of the system definition as a development baseline, while most of the system requirements are completely and naturally unstable?
    - How to distinguish stable/unstable and dimensioning/standard requirements?
    - How to plan the development of each increment?
    - How to link classical waterfall development milestones with an iterative approach: when should the classical reviews be performed: Software Specification Review? Preliminary Design Review? Critical Design Review? Code Review? Etc.
    Several solutions envisaged or already deployed by EADS SPACE Transportation will be presented, both from a methodological and a technological point of view:
    - How the MELANIE EADS ST internal methodology improves the concurrent engineering activities between GNC, software, and simulation teams in a very iterative and reactive way.
    - How the CMM approach can help by better formalizing the Requirements Management and Planning processes.
    - How automatic code generation with "certified" tools (SCADE) can further dramatically shorten the development cycle.
    The presentation concludes with an evaluation of the cost and schedule reduction, based on a pilot application, comparing figures from two similar projects: one with the classical waterfall process, the other with an iterative and incremental approach.

  8. Spacelab Mission Implementation Cost Assessment (SMICA)

    NASA Technical Reports Server (NTRS)

    Guynes, B. V.

    1984-01-01

    A total savings of approximately 20 percent is attainable if: (1) mission management and ground processing schedules are compressed; (2) the equipping, staffing, and operating of the Payload Operations Control Center are revised; and (3) methods of working with experiment developers are changed. The development of a new mission implementation technique, which includes mission definition, experiment development, and mission integration/operations, is examined. The Payload Operations Control Center is to relocate and utilize new computer equipment to produce cost savings. Methods of reducing costs by minimizing the Spacelab and payload processing time during pre- and post-mission operations at KSC are analyzed. The changes required to reduce costs in the analytical integration process are studied. The influence of time, requirements accountability, and risk on costs is discussed. Recommendations for cost reductions developed by the Spacelab Mission Implementation Cost Assessment study are listed.

  9. Effective Software Engineering Leadership for Development Programs

    ERIC Educational Resources Information Center

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  10. Environmental Compliance Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1981-02-01

    The Guide is intended to assist Department of Energy personnel by providing information on the NEPA process, the processes of other environmental statutes that bear on the NEPA process, the timing relationships between the NEPA process and these other processes, as well as timing relationships between the NEPA process and the development process for policies, programs, and projects. This information should be helpful not only in formulating environmental compliance plans but also in achieving compliance with NEPA and various other environmental statutes. The Guide is divided into three parts with related appendices: Part I provides guidance for developing environmental compliance plans for DOE actions; Part II is devoted to NEPA, with detailed flowcharts depicting the compliance procedures required by CEQ regulations and Department of Energy NEPA Guidelines; and Part III contains a series of flowcharts for other Federal environmental requirements that may apply to DOE projects.

  11. Orbital construction support equipment

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Approximately 200 separate construction steps were defined for the three solar power satellite (SPS) concepts. Detailed construction scenarios were developed which describe the specific tasks to be accomplished, and identify general equipment requirements. The scenarios were used to perform a functional analysis, which resulted in the definition of 100 distinct SPS elements. These elements are the components, parts, subsystems, or assemblies upon which construction activities take place. The major SPS elements for each configuration are shown. For those elements, 300 functional requirements were identified in seven generic processes. Cumulatively, these processes encompass all functions required during SPS construction/assembly. Individually each process is defined such that it includes a specific type of activity. Each SPS element may involve activities relating to any or all of the generic processes. The processes are listed, and examples of the requirements defined for a typical element are given.

  12. LANL surveillance requirements management and surveillance requirements from NA-12 tasking memo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hills, Charles R

    2011-01-25

    Surveillance briefing to NNSA to support a tasking memo from NA-12 on Surveillance requirements. This talk presents the process for developing surveillance requirements, discusses the LANL requirements that were issued as part of that tasking memo, and presents recommendations on Component Evaluation and Planning Committee activities for FY11.

  13. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
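    The predictive step described above can be illustrated with a minimal sketch: a linear model fit on synthetic affinity-column readouts to predict drug-substance-equivalent quality attribute values. The model form, data, and variable names are illustrative assumptions; the paper does not disclose its actual predictive-analytics method.

```python
import numpy as np

# Synthetic stand-in for the paper's data: 3 affinity-column readouts
# measured on 20 lots (all values invented for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))

# Assumed "true" relationship between column readouts and the CQA value
# obtained after the full multi-step purification train.
true_w = np.array([0.5, -0.2, 1.0])
y = X @ true_w

# Fit a least-squares model mapping column readouts to the CQA...
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# ...and predict drug-substance-equivalent CQA values for new lots.
pred = X @ w
```

In practice the model would be trained on paired affinity-column and drug-substance measurements and validated before being used for process development decisions.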

  14. Self-Study Guide for Florida VPK Provider Improvement Plan Development

    ERIC Educational Resources Information Center

    Phillips, Beth M.; Mazzeo, Debbie; Smith, Kevin

    2016-01-01

    This Self-Study Guide has been developed to support Florida Voluntary Prekindergarten Providers (VPK) who are required to complete an improvement plan process (i.e., low-performing providers). The guide has sections that can be used during both the process of selecting target areas for an improvement plan and the process of implementing new or…

  15. Compliance through pollution prevention opportunity assessments at Edwards AFB -- Development, results and lessons learned

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beutelman, H.P.; Lawrence, A.

    1999-07-01

    Edwards Air Force Base (AFB), located in the Mojave Desert of southern California, is required to comply with environmental requirements for air pollution emissions, hazardous waste disposal, and clean water. The resources required to meet these many compliance requirements represent an ever-increasing financial burden to the base and to the Department of Defense. A recognized superior approach to environmental management is to achieve compliance through a proactive pollution prevention (P2) program which mitigates, and when possible eliminates, compliance requirements and costs, while at the same time reducing pollution released to the environment. At Edwards AFB, the Environmental Management Office P2 Branch developed and implemented a strategy that addresses this concept, better known as Compliance Through Pollution Prevention (CTP2). At the 91st AWMA Annual Meeting and Exhibition, Edwards AFB presented a paper on its strategy and implementation of its CTP2 concept. Part of that strategy and implementation included accomplishment of process-specific focused P2 opportunity assessments (OAs). Starting in 1998, Edwards AFB initiated a CTP2 OA project where OAs were targeted on those operational processes, identified as compliance sites, that contributed most to the compliance requirements and costs at Edwards AFB. The targeting of these compliance sites was accomplished by developing a compliance matrix that prioritized processes in accordance with an operational risk management approach. The Edwards AFB CTP2 OA project is the first of its kind within the Air Force Materiel Command, and is serving as a benchmark for establishment of the CTP2 OA process.

  16. Parallel processing architecture for computing inverse differential kinematic equations of the PUMA arm

    NASA Technical Reports Server (NTRS)

    Hsia, T. C.; Lu, G. Z.; Han, W. H.

    1987-01-01

    In advanced robot control problems, on-line computation of the inverse Jacobian solution is frequently required. Parallel processing architecture is an effective way to reduce computation time. A parallel processing architecture is developed for the inverse Jacobian (inverse differential kinematic equation) of the PUMA arm. The proposed pipeline/parallel algorithm can be implemented on an IC chip using systolic linear arrays. This implementation requires 27 processing cells and 25 time units. Computation time is thus significantly reduced.
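    The computation being parallelized here is the inverse differential kinematic solve J(q) q_dot = x_dot. A minimal serial NumPy sketch of that solve follows; the 2-DOF toy values are illustrative assumptions, not actual PUMA parameters, and this does not model the paper's systolic-array implementation.

```python
import numpy as np

def joint_rates(jacobian, ee_velocity):
    """Solve the inverse differential kinematics J(q) q_dot = x_dot
    for the joint rates q_dot (square, non-singular J assumed)."""
    return np.linalg.solve(jacobian, ee_velocity)

# Toy 2-DOF planar example (illustrative values only).
J = np.array([[1.0, 0.5],
              [0.0, 1.0]])
x_dot = np.array([0.2, 0.1])   # desired end-effector velocity
q_dot = joint_rates(J, x_dot)  # joint velocities that realize x_dot
```

A controller would repeat this solve at every servo cycle, which is why reducing its latency with dedicated parallel hardware matters.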

  17. A Development Architecture for Serious Games Using BCI (Brain Computer Interface) Sensors

    PubMed Central

    Sung, Yunsick; Cho, Kyungeun; Um, Kyhyun

    2012-01-01

    Games that use brainwaves via brain–computer interface (BCI) devices, to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories. PMID:23202227

  18. A development architecture for serious games using BCI (brain computer interface) sensors.

    PubMed

    Sung, Yunsick; Cho, Kyungeun; Um, Kyhyun

    2012-11-12

    Games that use brainwaves via brain-computer interface (BCI) devices, to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories.

  19. Corrosion of Highly Specular Vapor Deposited Aluminum (VDA) on Earthshade Door Sandwich Structure

    NASA Technical Reports Server (NTRS)

    Plaskon, Daniel; Hsieh, Cheng

    2003-01-01

    High-resolution infrared (IR) imaging requires spacecraft instrument design that is tightly coupled with overall thermal control design. The JPL Tropospheric Emission Spectrometer (TES) instrument measures the 3-dimensional distribution of ozone and its precursors in the lower atmosphere on a global scale. The TES earthshade must protect the 180-K radiator and the 230-K radiator from the Earth IR and albedo. Requirements for specularity, emissivity, and solar absorptance of inner surfaces could only be met with vapor deposited aluminum (VDA). Circumstances leading to corrosion of the VDA are described. Innovative materials and processing to meet the optical and thermal cycle requirements were developed. Examples of scanning electron microscope (SEM), atomic force microscope (AFM), and other surface analysis techniques used in failure analysis, problem solving, and process development are given. Materials and process selection criteria and development test results are presented in a decision matrix. Examples of conditions promoting and preventing galvanic corrosion between VDA and graphite fiber-reinforced laminates are provided.

  20. Robot-Assisted Fracture Surgery: Surgical Requirements and System Design.

    PubMed

    Georgilas, Ioannis; Dagnino, Giulio; Tarassoli, Payam; Atkins, Roger; Dogramadzi, Sanja

    2018-03-09

    The design of medical devices is a complex and crucial process to ensure patient safety. It has been shown that improperly designed devices lead to errors and associated accidents and costs. A key element for a successful design is incorporating the views of the primary and secondary stakeholders early in the development process. They provide insights into current practice and point out specific issues with the current processes and equipment in use. This work presents how information from a user-study conducted in the early stages of the RAFS (Robot Assisted Fracture Surgery) project informed the subsequent development and testing of the system. The user needs were captured using qualitative methods and converted to operational, functional, and non-functional requirements based on the methods derived from product design and development. This work presents how the requirements inform a new workflow for intra-articular joint fracture reduction using a robotic system. It is also shown how the various elements of the system are developed to explicitly address one or more of the requirements identified, and how intermediate verification tests are conducted to ensure conformity. Finally, a validation test in the form of a cadaveric trial confirms the ability of the designed system to satisfy the aims set by the original research question and the needs of the users.

  1. Planetary Protection Considerations for Life Support and Habitation Systems

    NASA Technical Reports Server (NTRS)

    Barta, Daniel J.; Hogan, John A.

    2010-01-01

    Life support systems for future human missions beyond low Earth orbit may include a combination of existing hardware components and advanced technologies. Discipline areas for technology development include atmosphere revitalization, water recovery, solid waste management, crew accommodations, food production, thermal systems, environmental monitoring, fire protection and radiation protection. Life support systems will be influenced by in situ resource utilization (ISRU), crew mobility and the degree of extravehicular activity. Planetary protection represents an additional set of requirements that technology developers have generally not considered. Planetary protection guidelines will affect the kind of operations, processes, and functions that can take place during future exploration missions, including venting and discharge of liquids and solids, ejection of wastes, use of ISRU, requirements for cabin atmospheric trace contaminant concentrations, cabin leakage, and restrictions on the materials, organisms, and technologies that may be brought on missions. Compliance with planetary protection requirements may drive development of new capabilities or processes (e.g. in situ sterilization, waste containment, contaminant measurement) and limit or prohibit certain kinds of operations or processes (e.g. unfiltered venting). Ultimately, there will be an effect on mission costs, including the mission trade space. Planetary protection requirements need to be considered early in technology development programs. It is expected that planetary protection will have a major impact on technology selection for future missions.

  2. Standard practices for the implementation of computer software

    NASA Technical Reports Server (NTRS)

    Irvine, A. P. (Editor)

    1978-01-01

    A standard approach to the development of computer programs is provided that covers the life cycle of software development from the planning and requirements phase through the software acceptance testing phase. All documents necessary to provide the required visibility into the software life cycle process are discussed in detail.

  3. Development of the Sensor for Environmental Assessment (SEA Buoy)

    DTIC Science & Technology

    2014-01-01

    requirements, were developed. Constraints include the requirement that none of the nodes become entangled with each other during deployment and that there... Post-processing for the Ambient Noise and Reverberation modes was completed in MATLAB. A prototype board is shown in Fig. 8 (caption: "Prototype SEA").

  4. On the Logic and Process of Collaborative Innovation in Higher Vocational Education and Industrial Development

    ERIC Educational Resources Information Center

    Zhibin, Tang; Weiping, Shi

    2017-01-01

    Modern development of vocational education requires the joint participation of multiple departments and entities, and industry-education cooperation is a basic requirement. Against the backdrop of the third industrial revolution, cultivating skilled, innovative graduates and transforming the model of industrial technical innovation requires…

  5. Evaluation in the Design of Complex Systems

    ERIC Educational Resources Information Center

    Ho, Li-An; Schwen, Thomas M.

    2006-01-01

    We identify literature arguing that the process of creating knowledge-based systems is often imbalanced. In most knowledge-based systems, development is often technology-driven instead of requirement-driven. Therefore, we argue that designers must recognize that evaluation is a critical link in the application of requirement-driven development models…

  6. Development of user customized smart keyboard using Smart Product Design-Finite Element Analysis Process in the Internet of Things.

    PubMed

    Kim, Jung Woo; Sul, Sang Hun; Choi, Jae Boong

    2018-06-07

    In a hyper-connected society and IoT environment, markets are rapidly changing as smartphones penetrate the global market. As smartphones are applied to various digital media, development of a novel smart product is required. In this paper, a Smart Product Design-Finite Element Analysis Process (SPD-FEAP) is developed to adopt fast-changing trends and user requirements that can be visually verified. The user requirements are derived and quantitatively evaluated from Smart Quality Function Deployment (SQFD) using WebData. Then the usage scenarios are created according to the priority of the functions derived from SQFD. 3D shape analysis by Finite Element Analysis (FEA) was conducted and printed out through Rapid Prototyping (RP) technology to identify any possible errors. Thus, a User Customized Smart Keyboard has been developed using SPD-FEAP. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
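    As a rough illustration of the SQFD prioritization step, the sketch below scores hypothetical keyboard requirements from web-derived signals and ranks them. All requirement names, signal fields, weights, and values are invented for illustration; the paper's actual SQFD matrices are not reproduced here.

```python
# Hypothetical requirements with web-mined signals (values assumed).
requirements = {
    "tactile feedback":  {"mentions": 120, "sentiment": 0.8},
    "key remapping":     {"mentions": 45,  "sentiment": 0.9},
    "backlight control": {"mentions": 60,  "sentiment": 0.5},
}

def priority(signals, w_mentions=0.01, w_sentiment=1.0):
    # Simple weighted linear score; a real SQFD uses full
    # requirement-to-function relationship matrices.
    return w_mentions * signals["mentions"] + w_sentiment * signals["sentiment"]

# Rank requirements so usage scenarios can target the top functions first.
ranked = sorted(requirements, key=lambda r: priority(requirements[r]), reverse=True)
```

The ranked list would then drive which usage scenarios and FEA analyses are performed first.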

  7. Integration of decentralized clinical data in a data warehouse: a service-oriented design and realization.

    PubMed

    Hanss, Sabine; Schaaf, T; Wetzel, T; Hahn, C; Schrader, T; Tolxdorff, T

    2009-01-01

    In this paper we present a general concept and describe the difficulties of integrating data from various clinical partners in one data warehouse, using the Open European Nephrology Science Center (OpEN.SC) as an example. This includes a requirements analysis of the data integration process and also a design according to these requirements. The conceptual approach is based on the Rational Unified Process (RUP) and the paradigm of Service-Oriented Architecture (SOA). Because we have to enhance the confidence of our partners in the OpEN.SC system, and with it their willingness to participate, important requirements are controllability, transparency, and security for all partners. Reusable and fine-grained components were found to be necessary when working with diverse data sources. With SOA the requested reusability is implemented easily. A key step in the development of a data integration process within a health information system like OpEN.SC is to analyze the requirements. To show that this is not only theoretical work, we present a design, developed with RUP and SOA, which fulfills these requirements.

  8. CHALLENGES OF PROCESSING BIOLOGICAL DATA FOR INCORPORATION INTO A LAKE EUTROPHICATION MODEL

    EPA Science Inventory

    A eutrophication model is in development as part of the Lake Michigan Mass Balance Project (LMMBP). Successful development and calibration of this model required the processing and incorporation of extensive biological data. Data were drawn from multiple sources, including nutrie...

  9. The DACUM Job Analysis Process.

    ERIC Educational Resources Information Center

    Dofasco, Inc., Hamilton (Ontario).

    This document explains the DACUM (Developing A Curriculum) process for analyzing task-based jobs to: identify where standard operating procedures are required; identify duplicated low value added tasks; develop performance standards; create job descriptions; and identify the elements that must be included in job-specific training programs. The…

  10. The FoReVer Methodology: A MBSE Framework for Formal Verification

    NASA Astrophysics Data System (ADS)

    Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald

    2013-08-01

    The need for high level of confidence and operational integrity in critical space (software) systems is well recognized in the Space industry and has been addressed so far through rigorous System and Software Development Processes and stringent Verification and Validation regimes. The Model Based Space System Engineering process (MBSSE) derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model based engineering technologies to support the space system and software development processes, from mission level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project where we aim at developing methodological, theoretical and technological support for a systematic approach to the space avionics system development, in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with the support for a Software Reference Architecture.

  11. TARGET's role in knowledge acquisition, engineering, validation, and documentation

    NASA Technical Reports Server (NTRS)

    Levi, Keith R.

    1994-01-01

    We investigate the use of the TARGET task analysis tool in the development of rule-based expert systems. We found TARGET to be very helpful in the knowledge acquisition process. It enabled us to perform knowledge acquisition with one knowledge engineer rather than two. In addition, it improved communication between the domain expert and the knowledge engineer. We also found it to be useful for both the rule development and refinement phases of the knowledge engineering process. Using the network in these phases required us to develop guidelines that enabled us to easily translate the network into production rules. A significant requirement for TARGET remaining useful throughout the knowledge engineering process was the need to carefully maintain consistency between the network and the rule representations. Maintaining consistency not only benefited the knowledge engineering process, but also had significant payoffs in the areas of validation of the expert system and documentation of the knowledge in the system.

  12. Organic electronics with polymer dielectrics on plastic substrates fabricated via transfer printing

    NASA Astrophysics Data System (ADS)

    Hines, Daniel R.

    Printing methods are fast becoming important processing techniques for the fabrication of flexible electronics. Some goals for flexible electronics are to produce cheap, lightweight, disposable radio frequency identification (RFID) tags, very large flexible displays that can be produced in a roll-to-roll process and wearable electronics for both the clothing and medical industries. Such applications will require fabrication processes for the assembly of dissimilar materials onto a common substrate in ways that are compatible with organic and polymeric materials as well as traditional solid-state electronic materials. A transfer printing method has been developed with these goals and application in mind. This printing method relies primarily on differential adhesion where no chemical processing is performed on the device substrate. It is compatible with a wide variety of materials with each component printed in exactly the same way, thus avoiding any mixed processing steps on the device substrate. The adhesion requirements of one material printed onto a second are studied by measuring the surface energy of both materials and by surface treatments such as plasma exposure or the application of self-assembled monolayers (SAM). Transfer printing has been developed within the context of fabricating organic electronics onto plastic substrates because these materials introduce unique opportunities associated with processing conditions not typically required for traditional semiconducting materials. Compared to silicon, organic semiconductors are soft materials that require low temperature processing and are extremely sensitive to chemical processing and environmental contamination. The transfer printing process has been developed for the important and commonly used organic semiconducting materials, pentacene (Pn) and poly(3-hexylthiophene) (P3HT). 
A three-step printing process has been developed by which these materials are printed onto an electrode subassembly consisting of previously printed electrodes separated by a polymer dielectric layer all on a plastic substrate. These bottom contact, flexible organic thin-film transistors (OTFT) have been compared to unprinted (reference) devices consisting of top contact electrodes and a silicon dioxide dielectric layer on a silicon substrate. Printed Pn and P3HT TFTs have been shown to out-perform the reference devices. This enhancement has been attributed to an annealing under pressure of the organic semiconducting material.

  13. Investigation of charge coupled device correlation techniques

    NASA Technical Reports Server (NTRS)

    Lampe, D. R.; Lin, H. C.; Shutt, T. J.

    1978-01-01

    Analog Charge Transfer Devices (CTDs) offer unique advantages to signal processing systems, which often have large development costs, making it desirable to define those devices which can be developed for general systems use. Such devices are best identified and developed early to give systems designers some interchangeable subsystem blocks not requiring additional individual development for each new signal processing system. The objective of this work is to describe a discrete analog signal processing device with reasonably broad system applicability and to implement its design, fabrication, and testing.

  14. Joint Distributed Regional Training Capacity: A Scoping Study

    DTIC Science & Technology

    2007-12-01

    use management mechanisms 4. Develop assessment tools to rapidly quantify temporary land-use disturbance risks. The development of such... the Army Environmental Requirements and Technology Assessments (AERTA) process to develop validated requirements upon which to base more focused... conducting a large environmental assessment study each time an exercise is planned is needlessly expensive and does not give the flexibility to

  15. Explosives Safety Requirements Manual

    DOT National Transportation Integrated Search

    1996-03-29

    This Manual describes the Department of Energy's (DOE's) explosives safety requirements applicable to operations involving the development, testing, handling, and processing of explosives or assemblies containing explosives. It is intended to reflect...

  16. Harnessing QbD, Programming Languages, and Automation for Reproducible Biology.

    PubMed

    Sadowski, Michael I; Grant, Chris; Fell, Tim S

    2016-03-01

    Building robust manufacturing processes from biological components is a highly complex task that requires sophisticated tools to describe processes, inputs, and measurements and to administer the management of knowledge, data, and materials. We argue that for bioengineering to fully access biological potential, it will require the application of statistically designed experiments to derive detailed empirical models of underlying systems. This requires the execution of large-scale structured experimentation, for which laboratory automation is necessary, which in turn requires the development of expressive, high-level languages that allow reusability of protocols, characterization of their reliability, and a change in focus from implementation details to functional properties. We review recent developments in these areas and identify what we believe is an exciting trend that promises to revolutionize biotechnology. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
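    As an example of the statistically designed experiments the authors call for, a two-level full-factorial design can be enumerated in a few lines for automated execution. The factor names and levels below are illustrative assumptions, not from the paper.

```python
from itertools import product

# Two-level full-factorial design over three assumed process factors;
# each run is a settings dict an automated platform could execute.
factors = {
    "temperature_C": [30, 37],
    "inducer_mM":    [0.1, 1.0],
    "feed_rate":     ["low", "high"],
}

# Enumerate every factor-level combination: 2^3 = 8 structured runs.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
```

A high-level protocol language would then map each run dict onto concrete liquid-handling and measurement steps, and the measured responses would feed the empirical model fit.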

  17. A Study of Emotions in Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Colomo-Palacios, Ricardo; Hernández-López, Adrián; García-Crespo, Ángel; Soto-Acosta, Pedro

    Requirements engineering (RE) is a crucial activity in software development projects. This phase in the software development cycle is knowledge intensive, and thus, human capital intensive. From the human point of view, emotions play an important role in behavior and can even act as behavioral motivators. Thus, if we consider that RE represents a set of knowledge-intensive tasks, which include acceptance and negotiation activities, then the emotional factor represents a key element in these issues. However, the emotional factor in RE has not received the attention it deserves. This paper aims to integrate the stakeholder's emotions into the requirement process, proposing to catalogue them like any other factor in the process such as clarity or stability. Results show that high arousal and low pleasure levels are predictors of high versioning requirements.
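    A minimal sketch of cataloguing emotion alongside conventional requirement attributes, as the paper proposes: the attribute names, cutoff values, and sample data are illustrative assumptions, while the flag mirrors the reported finding that high arousal and low pleasure predict heavy requirement re-versioning.

```python
# Hypothetical catalogue entries: emotion attributes stored next to
# conventional ones such as clarity (all values invented).
reqs = [
    {"id": "R1", "clarity": "high", "arousal": 0.9, "pleasure": 0.2},
    {"id": "R2", "clarity": "low",  "arousal": 0.4, "pleasure": 0.8},
]

def versioning_risk(req, arousal_cutoff=0.7, pleasure_cutoff=0.3):
    # Flag requirements whose stakeholder emotions (high arousal,
    # low pleasure) predict frequent re-versioning; cutoffs assumed.
    return req["arousal"] >= arousal_cutoff and req["pleasure"] <= pleasure_cutoff

flagged = [r["id"] for r in reqs if versioning_risk(r)]
```

Flagged requirements could then receive extra negotiation effort before the specification is baselined.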

  18. Development of integrated control system for smart factory in the injection molding process

    NASA Astrophysics Data System (ADS)

    Chung, M. J.; Kim, C. Y.

    2018-03-01

    In this study, we propose an integrated control system for automation of the injection molding process, as required for the construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory can be reduced.
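    The six-step cycle above can be sketched as a simple sequencer; the step names follow the paper, while the handler structure is an illustrative assumption of how an integrated controller might dispatch them.

```python
# Cycle steps as named in the abstract.
CYCLE = ["heating", "tool close", "injection", "cooling", "tool open", "take-out"]

def run_cycle(log):
    """Execute one molding cycle, recording each completed step."""
    for step in CYCLE:
        # An integrated controller would dispatch here to the barrel
        # heater, clamp unit, take-out robot, image-processing module,
        # and data-acquisition interface; this sketch only logs the step.
        log.append(step)
    return log

history = run_cycle([])
```

Centralizing this sequencing, together with image processing and data acquisition, is what lets the integrated controller simplify the overall process.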

  19. The integration of the risk management process with the lifecycle of medical device software.

    PubMed

    Pecoraro, F; Luzi, D

    2014-01-01

    The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MDs require complex procedures to make software compliant with safety requirements, thereby introducing new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. Under this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle, based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software have been used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle is also proposed. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer from the initial stages of software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software as an important component of MDs, as stated in regulations and standards. 
This implies highly iterative processes that integrate risk management into the framework of software development, and makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.
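
A minimal sketch of the idea of carrying risk analysis through the development lifecycle, not the paper's actual framework: the lifecycle phases, class names, and the severity-times-probability index below are illustrative assumptions (real schemes, e.g. those in standards such as IEC 62304 and ISO 14971, define their own phases and risk-acceptability matrices).

```python
from dataclasses import dataclass, field

# Illustrative lifecycle phases; real standards define their own stages.
PHASES = ["planning", "requirements", "architecture", "implementation",
          "verification", "release", "maintenance"]

@dataclass
class RiskItem:
    """A single hazard tracked across the software lifecycle."""
    hazard: str
    severity: int          # e.g. 1 (negligible) .. 5 (catastrophic)
    probability: int       # e.g. 1 (improbable) .. 5 (frequent)
    mitigations: list = field(default_factory=list)

    @property
    def risk_index(self) -> int:
        # Simple severity x probability index; actual schemes vary.
        return self.severity * self.probability

class RiskRegister:
    """Collects risks per phase so analysis starts at design time."""
    def __init__(self):
        self.items = {phase: [] for phase in PHASES}

    def log(self, phase: str, item: RiskItem) -> None:
        if phase not in self.items:
            raise ValueError(f"unknown phase: {phase}")
        self.items[phase].append(item)

    def open_high_risks(self, threshold: int = 12):
        """Risks at or above threshold with no recorded mitigation."""
        return [(phase, it) for phase, its in self.items.items()
                for it in its
                if it.risk_index >= threshold and not it.mitigations]

register = RiskRegister()
register.log("requirements", RiskItem("dose miscalculation", severity=5, probability=3))
register.log("implementation", RiskItem("UI freeze", 2, 2, mitigations=["watchdog timer"]))
print(len(register.open_high_risks()))  # prints 1: the unmitigated high-risk item
```

The point of the sketch is only that risk items are logged against the phase in which they arise and re-queried at every iteration, mirroring the paper's argument that risk analysis is an input to the earliest design stages rather than a post-hoc activity.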

  20. Saturn S-2 production operations techniques: Production welding. Volume 1: Bulkhead welding

    NASA Technical Reports Server (NTRS)

    Abel, O. G.

    1970-01-01

    The complex Saturn S-2 welding processes and procedures required considerable development and refinement to establish a production capability that could consistently produce aluminum alloy welds within specified requirements. The special processes and techniques are defined that were established for the welding of gore-to-gore and manhole- or closeout-to-gore.

  1. The NERV Methodology: Non-Functional Requirements Elicitation, Reasoning and Validation in Agile Processes

    ERIC Educational Resources Information Center

    Domah, Darshan

    2013-01-01

    Agile software development has become very popular around the world in recent years, with methods such as Scrum and Extreme Programming (XP). Literature suggests that functionality is the primary focus in Agile processes while non-functional requirements (NFR) are either ignored or ill-defined. However, for software to be of good quality both…

  2. Use of KRS-XE positive chemically amplified resist for optical mask manufacturing

    NASA Astrophysics Data System (ADS)

    Ashe, Brian; Deverich, Christina; Rabidoux, Paul A.; Peck, Barbara; Petrillo, Karen E.; Angelopoulos, Marie; Huang, Wu-Song; Moreau, Wayne M.; Medeiros, David R.

    2002-03-01

The traditional mask making process uses chain scission-type resists such as PBS, poly(butene-1-sulfone), and ZEP, poly(methyl α-chloroacrylate-co-α-methylstyrene), for making masks with dimensions greater than 180 nm. PBS resist requires a wet etch process to produce patterns in chrome. ZEP was employed for dry etch processing to meet the requirements of shrinking dimensions, optical proximity corrections and phase shift masks. However, ZEP offers low contrast, marginal etch resistance, organic solvent development, and concerns regarding resist heating with its high dose requirements [1]. Chemically Amplified Resist (CAR) systems are a very good choice for dimensions less than 180 nm because of their high sensitivity and contrast, high resolution, dry etch resistance, aqueous development, and process latitude [2]. KRS-XE was developed as a high contrast CA resist based on ketal protecting groups that eliminate the need for post exposure bake (PEB). This resist can be used for a variety of electron beam exposures, and improves the capability to fabricate masks for devices smaller than 180 nm. Many factors influence the performance of resists in mask making, such as post apply bake, exposure dose, resist develop, and post exposure bake. These items will be discussed, as well as the use of reactive ion etching (RIE) selectivity and pattern transfer.

  3. Polysilicon planarization and plug recess etching in a decoupled plasma source chamber using two endpoint techniques

    NASA Astrophysics Data System (ADS)

    Kaplita, George A.; Schmitz, Stefan; Ranade, Rajiv; Mathad, Gangadhara S.

    1999-09-01

The planarization and recessing of polysilicon to form a plug are processes of increasing importance in silicon IC fabrication. While this technology has been developed and applied to DRAM technology using Trench Storage Capacitors, the need for such processes in other IC applications (i.e. polysilicon studs) has increased. Both planarization and recess processes usually have stringent requirements on etch rate, recess uniformity, and selectivity to underlying films. Additionally, both processes generally must be isotropic, yet must not expand any seams that might be present in the polysilicon fill. These processes should also be insensitive to changes in exposed silicon area (pattern factor) on the wafer. A SF6 plasma process in a polysilicon DPS (Decoupled Plasma Source) reactor has demonstrated the capability of achieving the above process requirements for both planarization and recess etch. The SF6 process in the decoupled plasma source reactor exhibited less sensitivity to pattern factor than in other types of reactors. Control of these planarization and recess processes requires two endpoint systems to work sequentially in the same recipe: one for monitoring the endpoint when blanket polysilicon (100% Si loading) is being planarized and one for monitoring the recess depth while the plug is being recessed (less than 10% Si loading). The planarization process employs an optical emission endpoint system (OES). An interferometric endpoint system (IEP), capable of monitoring lateral interference, is used for determining the recess depth. The ability to use either or both systems is required to make these plug processes manufacturable. Measuring the recess depth resulting from the recess process can be difficult, costly and time-consuming. An Atomic Force Microscope (AFM) can greatly alleviate these problems and can serve as a critical tool in the development of recess processes.
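
The control scheme described above, an OES-style endpoint followed by an IEP-style depth count within one recipe, can be caricatured in a few lines. This is a toy sketch only: the signal traces, the 30% intensity-drop trigger, and the fringes-per-depth calibration are invented for illustration and are not reactor parameters from the paper.

```python
# Toy sketch of sequencing two endpoint detectors in one etch recipe:
# an OES-style intensity-drop trigger for planarization, then an
# IEP-style fringe counter for recess depth.

def planarize_endpoint(oes_signal, drop_fraction=0.3):
    """Stop when emission intensity falls by drop_fraction from its running peak."""
    peak = oes_signal[0]
    for step, intensity in enumerate(oes_signal):
        peak = max(peak, intensity)
        if intensity <= peak * (1 - drop_fraction):
            return step
    return None  # endpoint not reached

def recess_endpoint(iep_fringes, fringes_per_depth, target_depth):
    """Stop after enough interference fringes for the target recess depth."""
    needed = target_depth * fringes_per_depth
    count = 0
    for step, fringe in enumerate(iep_fringes):
        count += fringe  # 1 when a fringe maximum is detected, else 0
        if count >= needed:
            return step
    return None

# Synthetic traces: emission rises, plateaus, then drops as the blanket
# film clears; fringe events arrive as the plug is recessed.
oes = [0.8, 1.0, 1.0, 0.95, 0.65, 0.4]
fringes = [0, 1, 0, 1, 0, 1, 1]
print(planarize_endpoint(oes), recess_endpoint(fringes, fringes_per_depth=10, target_depth=0.3))
```

Running the two detectors in sequence mirrors the recipe structure in the abstract: the OES trigger hands off to the fringe counter once the blanket film has cleared and the exposed silicon fraction drops.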

  4. FGF-Dependent, Context-Driven Role for FRS Adapters in the Early Telencephalon

    PubMed Central

    Gutin, Grigoriy; Blackwood, Christopher A.; Kamatkar, Nachiket G.; Lee, Kyung W.; Fishell, Gordon; Wang, Fen

    2017-01-01

    FGF signaling, an important component of intercellular communication, is required in many tissues throughout development to promote diverse cellular processes. Whether FGF receptors (FGFRs) accomplish such varied tasks in part by activating different intracellular transducers in different contexts remains unclear. Here, we used the developing mouse telencephalon as an example to study the role of the FRS adapters FRS2 and FRS3 in mediating the functions of FGFRs. Using tissue-specific and germline mutants, we examined the requirement of Frs genes in two FGFR-dependent processes. We found that Frs2 and Frs3 are together required for the differentiation of a subset of medial ganglionic eminence (MGE)-derived neurons, but are dispensable for the survival of early telencephalic precursor cells, in which any one of three FGFRs (FGFR1, FGFR2, or FGFR3) is sufficient for survival. Although FRS adapters are dispensable for ERK-1/2 activation, they are required for AKT activation within the subventricular zone of the developing MGE. Using an FRS2,3-binding site mutant of Fgfr1, we established that FRS adapters are necessary for mediating most or all FGFR1 signaling, not only in MGE differentiation, but also in cell survival, implying that other adapters mediate at least in part the signaling from FGFR2 and FGFR3. Our study provides an example of a contextual role for an intracellular transducer and contributes to our understanding of how FGF signaling plays diverse developmental roles. SIGNIFICANCE STATEMENT FGFs promote a range of developmental processes in many developing tissues and at multiple developmental stages. The mechanisms underlying this multifunctionality remain poorly defined in vivo. 
Using telencephalon development as an example, we show here that FRS adapters exhibit some selectivity in their requirement for mediating FGF receptor (FGFR) signaling and activating downstream mediators that depend on the developmental process, with a requirement in neuronal differentiation but not cell survival. Differential engagement of FRS and non-FRS intracellular adapters downstream of FGFRs could therefore in principle explain how FGFs play several distinct roles in other developing tissues and developmental stages. PMID:28483978

  5. FGF-Dependent, Context-Driven Role for FRS Adapters in the Early Telencephalon.

    PubMed

    Nandi, Sayan; Gutin, Grigoriy; Blackwood, Christopher A; Kamatkar, Nachiket G; Lee, Kyung W; Fishell, Gordon; Wang, Fen; Goldfarb, Mitchell; Hébert, Jean M

    2017-06-07

FGF signaling, an important component of intercellular communication, is required in many tissues throughout development to promote diverse cellular processes. Whether FGF receptors (FGFRs) accomplish such varied tasks in part by activating different intracellular transducers in different contexts remains unclear. Here, we used the developing mouse telencephalon as an example to study the role of the FRS adapters FRS2 and FRS3 in mediating the functions of FGFRs. Using tissue-specific and germline mutants, we examined the requirement of Frs genes in two FGFR-dependent processes. We found that Frs2 and Frs3 are together required for the differentiation of a subset of medial ganglionic eminence (MGE)-derived neurons, but are dispensable for the survival of early telencephalic precursor cells, in which any one of three FGFRs (FGFR1, FGFR2, or FGFR3) is sufficient for survival. Although FRS adapters are dispensable for ERK-1/2 activation, they are required for AKT activation within the subventricular zone of the developing MGE. Using an FRS2,3-binding site mutant of Fgfr1, we established that FRS adapters are necessary for mediating most or all FGFR1 signaling, not only in MGE differentiation, but also in cell survival, implying that other adapters mediate at least in part the signaling from FGFR2 and FGFR3. Our study provides an example of a contextual role for an intracellular transducer and contributes to our understanding of how FGF signaling plays diverse developmental roles. SIGNIFICANCE STATEMENT FGFs promote a range of developmental processes in many developing tissues and at multiple developmental stages.
The mechanisms underlying this multifunctionality remain poorly defined in vivo. Using telencephalon development as an example, we show here that FRS adapters exhibit some selectivity in their requirement for mediating FGF receptor (FGFR) signaling and activating downstream mediators that depend on the developmental process, with a requirement in neuronal differentiation but not cell survival. Differential engagement of FRS and non-FRS intracellular adapters downstream of FGFRs could therefore in principle explain how FGFs play several distinct roles in other developing tissues and developmental stages. Copyright © 2017 the authors 0270-6474/17/375690-09$15.00/0.

  6. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, High-Level Use Cases for DMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianhui; Lu, Xiaonan; Martino, Sal

Many distribution management systems (DMS) projects have achieved limited success because the electric utility did not sufficiently plan for actual use of the DMS functions in the control room environment. As a result, end users were not clear on how to use the new application software in actual production environments with existing, well-established business processes. An important first step in the DMS implementation process is development and refinement of the “to be” business processes. Development of use cases for the required DMS application functions is a key activity that leads to the formulation of the “to be” requirements. It is also an important activity that is needed to develop specifications that are used to procure a new DMS.

  7. Problematics of Time and Timing in the Longitudinal Study of Human Development: Theoretical and Methodological Issues

    PubMed Central

    Lerner, Richard M.; Schwartz, Seth J; Phelps, Erin

    2009-01-01

Studying human development involves describing, explaining, and optimizing intraindividual change and interindividual differences in such change and, as such, requires longitudinal research. The selection of the appropriate type of longitudinal design requires selecting the option that best addresses the theoretical questions asked about developmental process, and the use of appropriate statistical procedures to best exploit data derived from theory-predicated longitudinal research. This paper focuses on several interrelated problematics involving the treatment of time and the timing of observations that developmental scientists face in creating theory-design fit and in charting developmental processes across the life span in change-sensitive ways. We discuss ways in which these problematics may be addressed to advance theory-predicated understanding of the role of time in processes of individual development. PMID:19554215

  8. Short Serious Games Creation under the Paradigm of Software Process and Competencies as Software Requirements. Case Study: Elementary Math Competencies

    ERIC Educational Resources Information Center

    Barajas-Saavedra, Arturo; Álvarez-Rodriguez, Francisco J.; Mendoza-González, Ricardo; Oviedo-De-Luna, Ana C.

    2015-01-01

    Development of digital resources is difficult due to their particular complexity relying on pedagogical aspects. Another aspect is the lack of well-defined development processes, experiences documented, and standard methodologies to guide and organize game development. Added to this, there is no documented technique to ensure correct…

  9. Red Plague Control Plan (RPCP)

    NASA Technical Reports Server (NTRS)

    Cooke, Robert W.

    2010-01-01

    SCOPE: Prescribes the minimum requirements for the control of cuprous / cupric oxide corrosion (a.k.a. Red Plague) of silver-coated copper wire, cable, and harness assemblies. PURPOSE: Targeted for applications where exposure to assembly processes, environmental conditions, and contamination may promote the development of cuprous / cupric oxide corrosion (a.k.a. Red Plague) in silver-coated copper wire, cable, and harness assemblies. Does not exclude any alternate or contractor-proprietary documents or processes that meet or exceed the baseline of requirements established by this document. Use of alternate or contractor-proprietary documents or processes shall require review and prior approval of the procuring NASA activity.

  10. Optimization of RET flow using test layout

    NASA Astrophysics Data System (ADS)

    Zhang, Yunqiang; Sethi, Satyendra; Lucas, Kevin

    2008-11-01

At advanced technology nodes with extremely low k1 lithography, it is very hard to achieve image fidelity requirements and process window for some layout configurations. Quite often these layouts are within simple design rule constraints for a given technology node. It is important to have these layouts included during early RET flow development. Most RET development is based on layouts shrunk from the previous technology node, which may not be sufficient. A better methodology for creating test layouts is required for optical proximity correction (OPC) recipe and assist feature development. In this paper we demonstrate the application of programmable test layouts in RET development. Layout pattern libraries are developed and embedded in a layout tool (ICWB). Assessment gauges are generated together with patterns for quick correction accuracy assessment. Several groups of test pattern libraries have been developed based on learning from product patterns and a layout DOE approach. The interaction between layout patterns and OPC recipes has been studied. Correction of a contact layer is quite challenging because of poor convergence and low process window. We developed a test pattern library with many different contact configurations. Different OPC schemes are studied on these test layouts. The worst process window patterns are pinpointed for a given illumination condition. Assist features (AF) are frequently placed according to pre-determined rules to improve lithography process window. These rules are usually derived from lithographic models and experiments. Direct validation of AF rules is required at the development phase. We use the test layout approach to determine rules in order to eliminate AF printability problems.

  11. Creating Communications, Computing, and Networking Technology Development Road Maps for Future NASA Human and Robotic Missions

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Hayden, Jeffrey L.

    2005-01-01

For human and robotic exploration missions in the Vision for Exploration, roadmaps are needed for capability development and investments based on advanced technology developments. A roadmap development process was undertaken for the communications and networking capabilities and technologies needed for future human and robotic missions. The underlying processes are derived from work carried out during development of the future space communications architecture, and NASA's Space Architect Office (SAO) defined formats and structures for accumulating data. Interrelationships were established among emerging requirements, the capability analysis and technology status, and performance data. After developing an architectural communications and networking framework structured around the assumed needs for human and robotic exploration, in the vicinity of Earth, Moon, along the path to Mars, and in the vicinity of Mars, information was gathered from expert participants. This information was used to identify the capabilities expected from the new infrastructure and the technological gaps in the way of obtaining them. We define realistic, long-term space communication architectures based on emerging needs and translate the needs into interfaces, functions, and computer processing that will be required. In developing our roadmapping process, we defined requirements for achieving end-to-end activities that will be carried out by future NASA human and robotic missions. This paper describes: 1) the architectural framework developed for analysis; 2) our approach to gathering and analyzing data from NASA, industry, and academia; 3) an outline of the technology research to be done, including milestones for technology research and demonstrations with timelines; and 4) the technology roadmaps themselves.

  12. Advances in multi-scale modeling of solidification and casting processes

    NASA Astrophysics Data System (ADS)

    Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang

    2011-04-01

The development of the aviation, energy and automobile industries requires advanced integrated product/process R&D systems that can optimize both the product and the process design. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, modeled using phase field and cellular automaton methods, mathematical models of segregation in large steel ingots, and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies, including microstructure simulation of aluminum castings for the automobile industry, segregation in large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry, are discussed.
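
To give a flavor of the cellular automaton approach mentioned above, here is a deliberately minimal 2D grain-growth sketch: liquid cells are captured by a neighbouring solidified grain each step. It is not the authors' model; real CA solidification codes couple the capture rule to local undercooling, interface curvature, and crystallographic orientation, all of which are omitted here.

```python
import random

# Minimal 2D cellular-automaton sketch of grain growth during
# solidification: liquid cells (ID 0) take the grain ID of a random
# solidified neighbour on each synchronous update step.

def grow(grid, steps, seed=0):
    rng = random.Random(seed)
    n = len(grid)
    for _ in range(steps):
        nxt = [row[:] for row in grid]
        for i in range(n):
            for j in range(n):
                if grid[i][j] != 0:
                    continue  # already solid
                # Von Neumann neighbourhood: capture by a random solid neighbour.
                solid = [grid[x][y] for x, y in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                         if 0 <= x < n and 0 <= y < n and grid[x][y] != 0]
                if solid:
                    nxt[i][j] = rng.choice(solid)
        grid = nxt
    return grid

# Two nuclei (grain IDs 1 and 2) in a 7x7 liquid domain.
n = 7
domain = [[0] * n for _ in range(n)]
domain[1][1], domain[5][5] = 1, 2
result = grow(domain, steps=n)
print(sum(cell != 0 for row in result for cell in row))  # prints 49: fully solidified
```

Even this toy version shows the characteristic CA behaviour: grains advance one cell per step from each nucleus until they impinge, producing a grain boundary where the two captured regions meet.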

  13. Some Findings Concerning Requirements in Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases in the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and management of requirements dependencies.

  14. The study of integrated coal-gasifier molten carbonate fuel cell systems

    NASA Technical Reports Server (NTRS)

    1983-01-01

A novel integration concept for a coal-fueled coal gasifier-molten carbonate fuel cell power plant was studied. Effort focused on determining the efficiency potential of the concept and the design and development requirements of the processes needed to achieve that efficiency. The concept incorporates a methane producing catalytic gasifier of the type previously under development by Exxon Research and Development Corp., a reforming molten carbonate fuel cell power section of the type currently under development by United Technologies Corp., and a gasifier-fuel cell recycle loop. The concept utilizes the fuel cell waste heat, in the form of hydrogen and carbon monoxide, to generate additional fuel in the coal gasifier, thereby eliminating the use of both an O2 plant and a steam bottoming cycle from the power plant. The concept has the potential for achieving coal-pile-to-busbar efficiencies of 50-59%, depending on the process configuration and the degree of process development. This is significantly higher than any previously reported gasifier-molten carbonate fuel cell system.

  15. Process-Based Development of Competence Models to Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  16. A practitioner's guide to service development.

    PubMed

    Lees, Liz

    2010-11-01

    Service development and service improvement are complex concepts, but this should not prevent practitioners engaging in, or initiating, them. There is no set blueprint for service development so this article examines the process, describes the skills required, lists some change management tools and offers a guide to the stages involved. The article aims to demystify service development for those considering embarking on the process for the first time.

  17. The Elements of an Effective Software Development Plan - Software Development Process Guidebook

    DTIC Science & Technology

    2011-11-11

standards and practices required for all XMPL software development. This SDP implements the <corporate> Standard Software Process (SSP), as tailored... Developing and integrating reusable software products • Approach to managing COTS/Reuse software implementation • COTS/Reuse software selection... final selection and submit to change board for approval. MAINTENANCE: Monitor current products for obsolescence or end of support; track new...

  18. ASRM test report: Autoclave cure process development

    NASA Technical Reports Server (NTRS)

    Nachbar, D. L.; Mitchell, Suzanne

    1992-01-01

    ASRM insulated segments will be autoclave cured following insulation pre-form installation and strip wind operations. Following competitive bidding, Aerojet ASRM Division (AAD) Purchase Order 100142 was awarded to American Fuel Cell and Coated Fabrics Company, Inc. (Amfuel), Magnolia, AR, for subcontracted insulation autoclave cure process development. Autoclave cure process development test requirements were included in Task 3 of TM05514, Manufacturing Process Development Specification for Integrated Insulation Characterization and Stripwind Process Development. The test objective was to establish autoclave cure process parameters for ASRM insulated segments. Six tasks were completed to: (1) evaluate cure parameters that control acceptable vulcanization of ASRM Kevlar-filled EPDM insulation material; (2) identify first and second order impact parameters on the autoclave cure process; and (3) evaluate insulation material flow-out characteristics to support pre-form configuration design.

  19. Post-treatment of reclaimed waste water based on an electrochemical advanced oxidation process

    NASA Technical Reports Server (NTRS)

    Verostko, Charles E.; Murphy, Oliver J.; Hitchens, G. D.; Salinas, Carlos E.; Rogers, Tom D.

    1992-01-01

    The purification of reclaimed water is essential to water reclamation technology life-support systems in lunar/Mars habitats. An electrochemical UV reactor is being developed which generates oxidants, operates at low temperatures, and requires no chemical expendables. The reactor is the basis for an advanced oxidation process in which electrochemically generated ozone and hydrogen peroxide are used in combination with ultraviolet light irradiation to produce hydroxyl radicals. Results from this process are presented which demonstrate concept feasibility for removal of organic impurities and disinfection of water for potable and hygiene reuse. Power, size requirements, Faradaic efficiency, and process reaction kinetics are discussed. At the completion of this development effort the reactor system will be installed in JSC's regenerative water recovery test facility for evaluation to compare this technique with other candidate processes.

  20. A process for prototyping onboard payload displays for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1992-01-01

Significant advances have been made in the area of Human-Computer Interface design. However, there is no well-defined process for going from user interface requirements to user interface design. Developing and designing a clear and consistent user interface for medium to large scale systems is a very challenging and complex task. The task becomes increasingly difficult when there is very little guidance on how the development process should flow from one stage to the next. Without a specific sequence of development steps, each design becomes difficult to repeat, to evaluate, to improve, and to articulate to others. This research contributes a process which identifies the phases of development, and the products produced as a result of each phase, for a rapid prototyping process to be used to develop requirements for the onboard payload displays for Space Station Freedom. The functional components of a dynamic prototyping environment in which this process can be carried out are also discussed. Some of the central questions which are answered here include: How does one go from specifications to an actual prototype? How is a prototype evaluated? How is usability defined and thus measured? How do we use the information from evaluation in redesign of an interface? And are there techniques which allow for convergence on a design?

  1. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively.
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  2. Incremental development and prototyping in current laboratory software development projects: Preliminary analysis

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann

    1988-01-01

    Several Laboratory software development projects that followed nonstandard development processes, which were hybrids of incremental development and prototyping, are being studied. Factors in the project environment leading to the decision to use a nonstandard development process and affecting its success are analyzed. A simple characterization of project environment based on this analysis is proposed, together with software development approaches which have been found effective for each category. These approaches include both documentation and review requirements.

  3. Data requirements for valuing externalities: The role of existing permitting processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, A.D.; Baechler, M.C.; Callaway, J.M.

    1990-08-01

While the assessment of externalities, or residual impacts, will place new demands on regulators, utilities, and developers, existing processes already require certain data and information that may fulfill some of the data needs for externality valuation. This paper examines existing siting, permitting, and other processes and highlights similarities and differences between their data requirements and the data required to value environmental externalities. It specifically considers existing requirements for siting new electricity resources in Oregon and compares them with the information and data needed to value externalities for such resources. This paper also presents several observations about how states can take advantage of data acquired through processes already in place as they move into an era when externalities are considered in utility decision-making. It presents other observations on the similarities and differences between the data requirements under existing processes and those for valuing externalities. This paper also briefly discusses the special case of cumulative impacts. And it presents recommendations on what steps to take in future efforts to value externalities. 35 refs., 2 tabs.

  4. Multiple roles for the Na,K-ATPase subunits, Atp1a1 and Fxyd1, during brain ventricle development

    PubMed Central

    Chang, Jessica T.; Lowery, Laura Anne; Sive, Hazel

    2012-01-01

    Formation of the vertebrate brain ventricles requires both production of cerebrospinal fluid (CSF), and its retention in the ventricles. The Na,K-ATPase is required for brain ventricle development, and we show here that this protein complex impacts three associated processes. The first requires both the alpha subunit (Atp1a1) and the regulatory subunit, Fxyd1, and leads to formation of a cohesive neuroepithelium, with continuous apical junctions. The second process leads to modulation of neuroepithelial permeability, and requires Atp1a1, which increases permeability with partial loss of function and decreases it with overexpression. In contrast, fxyd1 overexpression does not alter neuroepithelial permeability, suggesting that its activity is limited to neuroepithelium formation. RhoA regulates both neuroepithelium formation and permeability, downstream of the Na,K-ATPase. A third process, likely to be CSF production, is RhoA-independent, requiring Atp1a1, but not Fxyd1. Consistent with a role for Na,K-ATPase pump function, the inhibitor ouabain prevents neuroepithelium formation, while intracellular Na+ increases after Atp1a1 and Fxyd1 loss of function. These data include the first reported role for Fxyd1 in the developing brain, and indicate that the Na,K-ATPase regulates three aspects of brain ventricle development essential for normal function - formation of a cohesive neuroepithelium, restriction of neuroepithelial permeability, and production of CSF. PMID:22683378

  5. Speed isn’t everything: Complex processing speed measures mask individual differences and developmental changes in executive control

    PubMed Central

    Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko

    2012-01-01

    The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of “processing speed” may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive control and resulting in overestimation of processing-speed contributions to cognition. This concern may apply particularly to studies of developmental change, as even seemingly simple processing speed measures may require executive processes to keep children and older adults on task. We report two new studies and a re-analysis of a published study, testing predictions about how different processing speed measures influence conclusions about executive control across the life span. We find that the choice of processing speed measure affects the relationship observed between processing speed and executive control, in a manner that changes with age, and that choice of processing speed measure affects conclusions about development and the relationship among executive control measures. Implications for understanding processing speed, executive control, and their development are discussed. PMID:23432836

  6. [Development of Hospital Equipment Maintenance Information System].

    PubMed

    Zhou, Zhixin

    2015-11-01

    Hospital equipment maintenance information systems play an important role in improving medical treatment quality and efficiency. Based on a requirements analysis of hospital equipment maintenance, the system function diagram is drawn. From an analysis of the input and output data, tables, and reports connected with the equipment maintenance process, the relationships between entities and attributes are identified, an E-R diagram is drawn, and the relational database tables are established. The software development should meet the actual process requirements of maintenance and provide a friendly user interface and flexible operation. The software can analyze failure causes by statistical analysis.
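    As a hypothetical illustration of the kind of relational schema such an E-R analysis might yield (the table and column names below are invented for illustration, not taken from the paper), a minimal sketch using Python's sqlite3:

```python
import sqlite3

# Hypothetical minimal schema: equipment and maintenance-record entities
# linked by a foreign key, as an E-R diagram for such a system might specify.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE equipment (
    equipment_id INTEGER PRIMARY KEY,
    name         TEXT NOT NULL,
    department   TEXT
);
CREATE TABLE maintenance_record (
    record_id     INTEGER PRIMARY KEY,
    equipment_id  INTEGER NOT NULL REFERENCES equipment(equipment_id),
    reported_on   TEXT,
    failure_cause TEXT
);
""")
conn.execute("INSERT INTO equipment VALUES (1, 'Infusion pump', 'ICU')")
conn.execute("INSERT INTO maintenance_record VALUES (1, 1, '2015-10-01', 'battery')")

# Failure causes aggregated by count, in the spirit of the abstract's
# statistical failure-cause analysis.
rows = conn.execute("""
    SELECT failure_cause, COUNT(*) FROM maintenance_record
    GROUP BY failure_cause
""").fetchall()
print(rows)  # [('battery', 1)]
```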

  7. Development of system design information for carbon dioxide using an amine type sorber

    NASA Technical Reports Server (NTRS)

    Rankin, R. L.; Roehlich, F.; Vancheri, F.

    1971-01-01

    Development work on system design information for amine type carbon dioxide sorber is reported. Amberlite IR-45, an aminated styrene divinyl benzene matrix, was investigated to determine the influence of design parameters of sorber particle size, process flow rate, CO2 partial pressure, total pressure, and bed designs. CO2 capacity and energy requirements for a 4-man size system were related mathematically to important operational parameters. Some fundamental studies in CO2 sorber capacity, energy requirements, and process operation were also performed.

  8. Facilitating Behavior Change With Low-literacy Patient Education Materials

    PubMed Central

    Seligman, Hilary K.; Wallace, Andrea S.; DeWalt, Darren A.; Schillinger, Dean; Arnold, Connie L.; Shilliday, Betsy Bryant; Delgadillo, Adriana; Bengal, Nikki; Davis, Terry C.

    2014-01-01

    Objective To describe a process for developing low-literacy health education materials that increase knowledge and activate patients toward healthier behaviors. Methods We developed a theoretically informed process for developing educational materials. This process included convening a multidisciplinary creative team, soliciting stakeholder input, identifying key concepts to be communicated, mapping concepts to a behavioral theory, creating a supporting behavioral intervention, designing and refining materials, and assessing efficacy. Results We describe the use of this process to develop a diabetes self-management guide. Conclusions Developing low-literacy health education materials that will activate patients toward healthier behaviors requires attention to factors beyond reading level. PMID:17931139

  9. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
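    A system-dynamics model of this kind integrates interacting rates over time. As a minimal illustration of the idea only (the parameters and the linear completion rate below are invented; this is not the SLICS model), a toy Euler-integration sketch:

```python
# Toy system-dynamics sketch (Euler integration) of a software project:
# tasks flow from "remaining" to "done" at a rate set by staffing and
# productivity. All parameters are invented for illustration.
def simulate(total_tasks=1000.0, staff=10.0, productivity=0.5, dt=1.0, horizon=300):
    remaining, done, t = total_tasks, 0.0, 0.0
    while remaining > 1.0 and t < horizon:
        rate = staff * productivity           # tasks completed per unit time
        completed = min(rate * dt, remaining)
        remaining -= completed
        done += completed
        t += dt
    return t, done

t, done = simulate()
print(t, done)  # time steps used and tasks completed
```

A real life-cycle model would make the rate itself time-varying (learning, rework, phase transitions), which is what distinguishes system dynamics from a static estimate.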

  10. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Integrated information processing requirements

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1979-01-01

    The engineering-specified requirements for integrated information processing by means of the Integrated Programs for Aerospace-Vehicle Design (IPAD) system are presented. A data model is described and is based on the design process of a typical aerospace vehicle. General data management requirements are specified for data storage, retrieval, generation, communication, and maintenance. Information management requirements are specified for a two-component data model. In the general portion, data sets are managed as entities, and in the specific portion, data elements and the relationships between elements are managed by the system, allowing user access to individual elements for the purpose of query. Computer program management requirements are specified for support of a computer program library, control of computer programs, and installation of computer programs into IPAD.
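    The two-level data model described above (whole data sets managed as entities, plus element-level management for query) can be sketched abstractly. The class and attribute names below are invented for illustration, not taken from the IPAD specification:

```python
# Hypothetical sketch of a two-level data model: data sets are stored and
# managed as named entities, while individual elements remain queryable.
class DataSet:
    def __init__(self, name, elements=None):
        self.name = name
        self.elements = dict(elements or {})   # element name -> value

class Library:
    def __init__(self):
        self.datasets = {}

    def store(self, ds):
        # General portion: the data set is managed as a single entity.
        self.datasets[ds.name] = ds

    def query(self, dataset, element):
        # Specific portion: user access to an individual element.
        return self.datasets[dataset].elements[element]

lib = Library()
lib.store(DataSet("wing_geometry", {"span_m": 34.3, "sweep_deg": 25.0}))
print(lib.query("wing_geometry", "span_m"))  # 34.3
```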

  11. Overview of processing activities aimed at higher efficiencies and economical production

    NASA Technical Reports Server (NTRS)

    Bickler, D. B.

    1985-01-01

    An overview of processing activities aimed at higher efficiencies and economical production is presented. Present focus is on low-cost process technology for higher-efficiency cells of up to 18% or higher. Process development concerns center on the use of less-than-optimum silicon sheet, the control of production yields, and making uniformly efficient large-area cells. High-efficiency cell factors that require process development are bulk material perfection, very shallow junction formation, front-surface passivation, and finely detailed metallization. Better bulk properties of the silicon sheet, and retention of those qualities throughout large areas during cell processing, are required so that minority carrier lifetimes are maintained and cell performance is not degraded by high doping levels. When very shallow junctions are formed, the process must be sensitive to metallization punch-through, series resistance in the cell, and control of dopant leaching during surface passivation. There is a need to determine the sensitivity to processing by mathematical modeling and experimental activities.

  12. How to build a course in mathematical-biological modeling: content and processes for knowledge and skill.

    PubMed

    Hoskinson, Anne-Marie

    2010-01-01

    Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical-biological models requires cooperation among biologists and mathematicians, and mastery of building models. A new course in mathematical modeling presented the opportunity to build both content and process learning of mathematical models, the modeling process, and the cooperative process. There was little guidance from the literature on how to build such a course. Here, I describe the iterative process of developing such a course, beginning with objectives and choosing content and process competencies to fulfill the objectives. I include some inductive heuristics for instructors seeking guidance in planning and developing their own courses, and I illustrate with a description of one instructional model cycle. Students completing this class reported gains in learning of modeling content, the modeling process, and cooperative skills. Student content and process mastery increased, as assessed on several objective-driven metrics in many types of assessments.

  13. How to Build a Course in Mathematical–Biological Modeling: Content and Processes for Knowledge and Skill

    PubMed Central

    2010-01-01

    Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical–biological models requires cooperation among biologists and mathematicians, and mastery of building models. A new course in mathematical modeling presented the opportunity to build both content and process learning of mathematical models, the modeling process, and the cooperative process. There was little guidance from the literature on how to build such a course. Here, I describe the iterative process of developing such a course, beginning with objectives and choosing content and process competencies to fulfill the objectives. I include some inductive heuristics for instructors seeking guidance in planning and developing their own courses, and I illustrate with a description of one instructional model cycle. Students completing this class reported gains in learning of modeling content, the modeling process, and cooperative skills. Student content and process mastery increased, as assessed on several objective-driven metrics in many types of assessments. PMID:20810966

  14. Plug-and -Play Model Architecture and Development Environment for Powertrain/Propulsion System - Final CRADA Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rousseau, Aymeric

    2013-02-01

    Several tools already exist to develop detailed plant models, including GT-Power, AMESim, CarSim, and SimScape. The objective of Autonomie is not to provide a language to develop detailed models; rather, Autonomie supports the assembly and use of models from design to simulation to analysis with complete plug-and-play capabilities. Autonomie provides a plug-and-play architecture to support this ideal use of modeling and simulation for math-based automotive control system design. Models in the standard format create building blocks, which are assembled at runtime into a simulation model of a vehicle, system, subsystem, or component to simulate. All parts of the graphical user interface (GUI) are designed to be flexible to support architectures, systems, components, and processes not yet envisioned. This allows the software to be molded to individual uses, so it can grow as requirements and technical knowledge expand. This flexibility also allows for implementation of legacy code, including models, controller code, processes, drive cycles, and post-processing equations. A library of useful and tested models and processes is included as part of the software package to support a full range of simulation and analysis tasks immediately. Autonomie also includes a configuration and database management front end to facilitate the storage, versioning, and maintenance of all required files, such as the models themselves, the models' supporting files, test data, and reports. During the course of the CRADA, Argonne worked closely with GM to implement and demonstrate each of its requirements. A use case was developed by GM for every requirement and demonstrated by Argonne. Each of the new features was verified by GM experts through a series of Gate reviews. Once all the requirements were validated, they were presented to the directors as part of the GM Gate process.

  15. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. 
Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.

  16. Exploring Operational Test and Evaluation of Unmanned Aircraft Systems: A Qualitative Case Study

    NASA Astrophysics Data System (ADS)

    Saliceti, Jose A.

    The purpose of this qualitative case study was to explore and identify strategies that may potentially remedy operational test and evaluation procedures used to evaluate Unmanned Aircraft Systems (UAS) technology. The sample for analysis consisted of organizations testing and evaluating UASs (e.g., U.S. Air Force, U.S. Navy, U.S. Army, U.S. Marine Corps, U.S. Coast Guard, and Customs and Border Protection). A purposeful sampling technique was used to select 15 subject matter experts in the field of operational test and evaluation of UASs. A questionnaire was provided to participants to support a descriptive and robust study. Analysis of responses revealed themes related to each research question. Findings revealed operational testers utilized requirements documents to extrapolate measures for testing UAS technology and develop critical operational issues. The requirements documents were (a) developed without the contribution of stakeholders and operational testers, (b) developed with vague or unrealistic measures, and (c) developed without a systematic method to derive requirements from mission tasks. Four approaches are recommended to develop testable operational requirements and assist operational testers: (a) use a mission task analysis tool to derive requirements for mission essential tasks for the system, (b) exercise collaboration among stakeholders and testers to ensure testable operational requirements based on mission tasks, (c) ensure testable measures are used in requirements documents, and (d) create a repository list of critical operational issues by mission areas. The preparation of operational test and evaluation processes for UAS technology is not uniform across testers; the processes in place are not standardized, so test plan preparation and reporting differ among participants, and a standard method for preparing and reporting on UAS technology tests should be adopted. 
Using a systematic process, such as mission-based test design, resonated among participants as an analytical method to link UAS mission tasks and measures of performance to the capabilities of the system under test when developing operational test plans. Further research should examine system engineering designs for system requirements traceability matrix of mission tasks and subtasks while using an analysis tool that adequately evaluates UASs with an acceptable level of confidence in the results.

  17. Screening Methodologies to Support Risk and Technology ...

    EPA Pesticide Factsheets

    The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have largely completed the required “Maximum Achievable Control Technology” (MACT) standards. In the second stage of the regulatory process, EPA must review each MACT standard at least every eight years and revise them as necessary, “taking into account developments in practices, processes and control technologies.” We call this requirement the “technology review.” EPA is also required to complete a one-time assessment of the health and environmental risks that remain after sources come into compliance with MACT. This residual risk review also must be done within 8 years of setting the initial MACT standard. If additional risk reductions are necessary to protect public health with an ample margin of safety or to prevent adverse environmental effects, EPA must develop standards to address these remaining risks. Because the risk review is an important component of the RTR process, EPA is seeking SAB input on the scientific credibility of specific enhancements made to our risk assessment methodologies, particularly with respect to screening methodologies, since the last SAB review was completed in 2010. These enhancements to our risk methodologies are outlined in the document title

  18. Integrated tools and techniques applied to the TES ground data system

    NASA Technical Reports Server (NTRS)

    Morrison, B. A.

    2000-01-01

    The author of this paper will discuss the selection of CASE tools, the decision-making process, requirements tracking, and a review mechanism that together lead to a highly integrated approach to software development, one that must cope with the constant pressure to change software requirements and design that accompanies research and development.

  19. X-33 Environmental Impact Statement: A Fast Track Approach

    NASA Technical Reports Server (NTRS)

    McCaleb, Rebecca C.; Holland, Donna L.

    1998-01-01

    NASA is required by the National Environmental Policy Act (NEPA) to prepare an appropriate level of environmental analysis for its major projects. Development of the X-33 Technology Demonstrator and its associated flight test program required an environmental impact statement (EIS) under the NEPA. The EIS process consists of four parts: the "Notice of Intent" to prepare an EIS and scoping; the draft EIS, which is distributed for review and comment; the final EIS; and the "Record of Decision." Completion of this process normally takes 2 to 3 years, depending on the complexity of the proposed action. Many of the agency's newest fast-track technology demonstration programs require NEPA documentation but cannot sustain the lengthy time requirement from program concept development to implementation. Marshall Space Flight Center, in cooperation with Kennedy Space Center, accomplished the NEPA process for the X-33 Program in 13 months from Notice of Intent to Record of Decision. In addition, the environmental team implemented an extensive public involvement process, conducting a total of 23 public meetings for scoping and draft EIS comment along with numerous informal meetings with public officials, civic organizations, and Native American Indians. This paper will discuss the fast track approach used to successfully accomplish the NEPA process for X-33 on time.

  20. Economical Fabrication of Thick-Section Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Babcock, Jason; Ramachandran, Gautham; Williams, Brian; Benander, Robert

    2010-01-01

    A method was developed for producing thick-section [>2 in. (approx.5 cm)], continuous fiber-reinforced ceramic matrix composites (CMCs). Ultramet-modified fiber interface coating and melt infiltration processing, developed previously for thin-section components, were used for the fabrication of CMCs that were an order of magnitude greater in thickness [up to 2.5 in. (approx.6.4 cm)]. Melt processing first involves infiltration of a fiber preform with the desired interface coating, and then with carbon to partially densify the preform. A molten refractory metal is then infiltrated and reacts with the excess carbon to form the carbide matrix without damaging the fiber reinforcement. Infiltration occurs from the inside out as the molten metal fills virtually all the available void space. Densification to <5 vol% porosity is a one-step process requiring no intermediate machining steps. The melt infiltration method requires no external pressure. This prevents over-infiltration of the outer surface plies, which can lead to excessive residual porosity in the center of the part. However, processing of thick-section components required modification of the conventional process conditions, and the means by which the large amount of molten metal is introduced into the fiber preform. Modification of the low-temperature, ultraviolet-enhanced chemical vapor deposition process used to apply interface coatings to the fiber preform was also required to accommodate the high preform thickness. The thick-section CMC processing developed in this work proved to be invaluable for component development, fabrication, and testing in two complementary efforts. In a project for the Army, involving SiC/SiC blisk development, nominally 0.8 in. thick x 8 in. diameter (approx. 2 cm thick x 20 cm diameter) components were successfully infiltrated. Blisk hubs were machined using diamond-embedded cutting tools and successfully spin-tested. 
Good ply uniformity and extremely low residual porosity (<2 percent) were achieved, the latter being far lower than that achieved with SiC matrix composites fabricated via CVI or PIP. The pyrolytic carbon/zirconium nitride interface coating optimized in this work for use on carbon fibers was incorporated in the SiC/SiC composites and yielded a >41 ksi (approx. 283 MPa) flexural strength.

  1. Development and Validation of High Precision Thermal, Mechanical, and Optical Models for the Space Interferometry Mission

    NASA Technical Reports Server (NTRS)

    Lindensmith, Chris A.; Briggs, H. Clark; Beregovski, Yuri; Feria, V. Alfonso; Goullioud, Renaud; Gursel, Yekta; Hahn, Inseob; Kinsella, Gary; Orzewalla, Matthew; Phillips, Charles

    2006-01-01

    SIM Planetquest (SIM) is a large optical interferometer for making microarcsecond measurements of the positions of stars, and for detecting Earth-sized planets around nearby stars. To achieve this precision, SIM requires stability of optical components to tens of picometers per hour. The combination of SIM's large size (9 meter baseline) and the high stability requirement makes it difficult and costly to measure all aspects of system performance on the ground. To reduce risks and costs, and to allow for a design with fewer intermediate testing stages, the SIM project is developing an integrated thermal, mechanical, and optical modeling process that will allow predictions of the system performance to be made at the required high precision. This modeling process uses commercial, off-the-shelf tools and has been validated against experimental results at the precision of the SIM performance requirements. This paper presents the description of the model development, some of the models, and their validation in the Thermo-Opto-Mechanical (TOM3) testbed, which includes full scale brassboard optical components and the metrology to test them at the SIM performance requirement levels.

  2. 10 CFR 70.23 - Requirements for the approval of applications.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... be used for the conduct of research or development activities of a type specified in section 31 of... types of research and development activities specified in section 31 are those relating to: (1) Nuclear processes; (2) The theory and production of atomic energy, including processes, materials, and devices...

  3. Requirements Analysis for Large Ada Programs: Lessons Learned on CCPDS- R

    DTIC Science & Technology

    1989-12-01

    ...when the design had matured, the SRS role was to be the tester's contract. This approach was not optimal from the formal testing and implementation standpoint... One constraint on the software development process is the necessity to include sufficient testing of CPU processing load. These constraints primarily affect algorithm... allocations and timing requirements are by-products of the software design process when multiple CSCIs are executed within...

  4. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    PubMed

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

    Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring the investment of substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence of a particular sequence of images containing a fence crossing event. This resulted in a reduction of 54.8% of images that required further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
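    The background-subtraction idea can be sketched in a few lines: compare each frame against a background image and flag frames whose changed-pixel fraction exceeds a threshold. The thresholds and category labels below are invented for illustration and are not the paper's actual rules or values:

```python
import numpy as np

# Toy background-subtraction classifier: a frame is a "candidate_event"
# if the fraction of pixels differing from the background by more than
# diff_thresh exceeds frac_thresh. Thresholds are invented examples.
def classify(frame, background, diff_thresh=30, frac_thresh=0.02):
    diff = np.abs(frame.astype(int) - background.astype(int))
    changed = (diff > diff_thresh).mean()    # fraction of changed pixels
    return "candidate_event" if changed > frac_thresh else "empty"

bg = np.zeros((100, 100), dtype=np.uint8)
empty_frame = bg.copy()
animal_frame = bg.copy()
animal_frame[40:60, 40:60] = 200             # bright blob simulating an animal

print(classify(empty_frame, bg))             # empty
print(classify(animal_frame, bg))            # candidate_event
```

A real pipeline would add per-sequence logic (the paper categorizes sequences of images, not single frames) and histogram-based rules to reject lighting changes.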

  5. SU-E-T-419: Workflow and FMEA in a New Proton Therapy (PT) Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, C; Wessels, B; Hamilton, H

    2014-06-01

    Purpose: Workflow is an important component in the operational planning of a new proton facility. By integrating the concept of failure mode and effect analysis (FMEA) and traditional QA requirements, a workflow for a proton therapy treatment course is set up. This workflow serves as the blueprint for the planning of computer hardware/software requirements and network flow. A slight modification of the workflow generates a process map (PM) for FMEA and the planning of a QA program in PT. Methods: A flowchart is first developed outlining the sequence of processes involved in a PT treatment course. Each process consists of a number of sub-processes to encompass a broad scope of treatment and QA procedures. For each subprocess, the personnel involved, the equipment needed, and the computer hardware/software as well as network requirements are defined by a team of clinical staff, administrators, and IT personnel. Results: Eleven intermediate processes with a total of 70 sub-processes involved in a PT treatment course are identified. The number of sub-processes varies, ranging from 2-12. The sub-processes within each process are used for the operational planning. For example, in the CT-Sim process, there are 12 sub-processes: three involve data entry/retrieval from a record-and-verify system, two are controlled by the CT computer, two require the department/hospital network, and the other five are setup procedures. IT then decides the number of computers needed and the software and network requirements. By removing the traditional QA procedures from the workflow, a PM is generated for FMEA analysis to design a QA program for PT. Conclusion: Significant efforts are involved in the development of the workflow in a PT treatment course. Our hybrid model of combining FMEA and a traditional QA program serves the dual purpose of efficient operational planning and design of a QA program in PT.
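    FMEA conventionally ranks failure modes by a risk priority number, RPN = severity x occurrence x detectability. As a generic illustration only (the failure modes and scores below are invented, not the facility's actual analysis), a minimal ranking sketch:

```python
# Generic FMEA sketch: rank failure modes by risk priority number
# (RPN = severity * occurrence * detectability). All entries and scores
# are invented examples for illustration.
failure_modes = [
    {"step": "CT-Sim data entry", "severity": 8, "occurrence": 3, "detectability": 4},
    {"step": "Network transfer",  "severity": 6, "occurrence": 2, "detectability": 2},
    {"step": "Patient setup",     "severity": 9, "occurrence": 4, "detectability": 5},
]
for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detectability"]

# Highest-RPN failure modes get QA attention first.
ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
print([(fm["step"], fm["rpn"]) for fm in ranked])
```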

  6. Development of pulsed processes for the manufacture of solar cells

    NASA Technical Reports Server (NTRS)

    Minnucci, J. A.

    1978-01-01

    The results of a 1-year program to develop the processes required for low-energy ion implantation for the automated production of silicon solar cells are described. The program included: (1) demonstrating state-of-the-art ion implantation equipment and designing an automated ion implanter, (2) making efforts to improve the performance of ion-implanted solar cells to 16.5 percent AM1, (3) developing a model of the pulse annealing process used in solar cell production, and (4) preparing an economic analysis of the process costs of ion implantation.

  7. Automation and integration of components for generalized semantic markup of electronic medical texts.

    PubMed

    Dugan, J M; Berrios, D C; Liu, X; Kim, D K; Kaizer, H; Fagan, L M

    1999-01-01

    Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models.
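    The retrieval idea underlying ACQUIRE — text sections marked with concepts, and predefined questions that identify a subset of complementary concepts — can be sketched as a set-containment match. The sections and concept names below are hypothetical illustrations, not from the system itself:

    ```python
    # Sketch of concept-based markup and retrieval: each text section is
    # tagged with a set of concepts; a query is a set of concepts, and
    # retrieval returns the sections whose tags cover the whole query.
    # Section labels and concept names are hypothetical.
    markup = {
        "section 1": {"diabetes", "insulin", "dosage"},
        "section 2": {"diabetes", "diet"},
        "section 3": {"hypertension", "dosage"},
    }

    def answer(query_concepts):
        """Return the sections marked with every concept in the query."""
        return sorted(sec for sec, tags in markup.items()
                      if query_concepts <= tags)

    print(answer({"diabetes", "dosage"}))  # -> ['section 1']
    ```

    The domain expert's effort goes into building `markup` (the concept model) and the canned queries (the query model), which is exactly the work ACQUIRE is intended to reduce.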

  8. New Directions in Space Operations Services in Support of Interplanetary Exploration

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.

    2005-01-01

    To gain access to the necessary operational processes and data in support of NASA's Lunar/Mars Exploration Initiative, new services, adequate levels of computing cycles, and access to myriad forms of data must be provided to onboard spacecraft and ground-based personnel/systems (Earth, lunar, and Martian) to enable interplanetary exploration by humans. These systems, cycles, and access to vast amounts of development, test, and operational data will be required to provide a level of service not currently available to existing spacecraft, onboard crews, and other operational personnel. Although current voice, video, and data systems supporting space-based operations have been adequate, new highly reliable and autonomous processes and services will be necessary for future space exploration activities. These services will range from the relatively mundane, such as voice in LEO, to voice in interplanetary travel, which, because of the high latencies, will require new voice processes and standards. New services, such as component failure prediction based on data mining of significant quantities of data located at disparate locations, will be required. 3D or holographic representation of onboard components, systems, or family members will greatly improve maintenance, operations, and service restoration, not to mention crew morale. Current operational systems and standards, like the Internet Protocol, will not be able to provide the level of service required end to end, from an endpoint on the Martian surface, such as a scientific instrument, to a researcher at a university. Ground operations, whether on Earth, the Moon, or Mars, and in-flight operations to the Moon and especially to Mars will require significant autonomy, which in turn will require access to highly reliable processing capabilities and data storage based on network storage technologies. Significant processing cycles will be needed onboard but could be borrowed from other locations, either ground-based or onboard other spacecraft. Reliability will be a key factor, with onboard and distributed backup processing an absolutely necessary requirement. Current cluster-processing/Grid technologies may provide the basis for these services. An overview of existing services, future services that will be required, and the technologies and standards that must be developed will be presented. The purpose of this paper is to initiate a technological roadmap, albeit at a high level, from current voice, video, data, and network technologies and standards (which show promise for adaptation or evolution) to technologies and standards that need to be redefined or adjusted, and areas where new ones require development. The roadmap should begin to differentiate between unmanned and manned processes/services where applicable. The paper is based in part on the activities of the CCSDS Monitor and Control working group, which is beginning the process of standardizing these processes. Another element of the paper is based on an analysis of current technologies supporting spaceflight processes and services at JSC, MSFC, GSFC, and, to a lesser extent, KSC. Work being accomplished in areas such as Grid computing, data mining, and network storage at ARC, IBM, and the University of Alabama in Huntsville will be researched and analyzed.

  9. An aspect-oriented approach for designing safety-critical systems

    NASA Astrophysics Data System (ADS)

    Petrov, Z.; Zaykov, P. G.; Cardoso, J. P.; Coutinho, J. G. F.; Diniz, P. C.; Luk, W.

    The development of avionics systems is typically a tedious and cumbersome process. In addition to the required functions, developers must consider various and often conflicting non-functional requirements such as safety, performance, and energy efficiency. Certainly, an integrated approach with a seamless design flow that is capable of requirements modelling and supporting refinement down to an actual implementation in a traceable way may lead to a significant acceleration of development cycles. This paper presents an aspect-oriented approach supported by a tool chain that deals with functional and non-functional requirements in an integrated manner. It also discusses how the approach can be applied to the development of safety-critical systems and provides experimental results.

  10. On the Development of a Hospital-Patient Web-Based Communication Tool: A Case Study From Norway.

    PubMed

    Granja, Conceição; Dyb, Kari; Bolle, Stein Roald; Hartvigsen, Gunnar

    2015-01-01

    Surgery cancellations are undesirable in hospital settings as they increase costs, reduce productivity and efficiency, and directly affect the patient. The problem of elective surgery cancellations in a North Norwegian university hospital is addressed. Based on a three-step methodology conducted at the hospital, the preoperative planning process was modeled taking into consideration the narratives from different health professions. From the analysis of the generated process models, it is concluded that in order to develop a useful patient-centered web-based communication tool, it is necessary to fully understand how hospitals plan and organize surgeries today. Moreover, process reengineering is required to generate a standard process that can serve as a tool for health ICT designers to define the requirements for a robust and useful system.

  11. Designing for human presence in space: An introduction to environmental control and life support systems

    NASA Technical Reports Server (NTRS)

    Wieland, Paul

    1994-01-01

    Human exploration and utilization of space requires habitats to provide appropriate conditions for working and living. These conditions are provided by environmental control and life support systems (ECLSS) that ensure appropriate atmosphere composition, pressure, and temperature; manage and distribute water; process waste matter; provide fire detection and suppression; and perform other functions as necessary. The functions that are performed by ECLSS are described and basic information necessary to design an ECLSS is provided. Technical and programmatic aspects of designing and developing ECLSS for space habitats are described, including descriptions of technologies, analysis methods, test requirements, program organization, documentation requirements, and the requirements imposed by medical, mission, safety, and system needs. The design and development process is described from initial trade studies through system-level analyses to support operation. ECLSS needs for future space habitats are also described. Extensive listings of references and related works provide sources for more detailed information on each aspect of ECLSS design and development.

  12. Integrating Safety and Mission Assurance into Systems Engineering Modeling Practices

    NASA Technical Reports Server (NTRS)

    Beckman, Sean; Darpel, Scott

    2015-01-01

    During the early development of products, flight, or experimental hardware, emphasis is often given to the identification of technical requirements, utilizing such tools as use case and activity diagrams. Designers and project teams focus on understanding physical and performance demands and challenges. It is typically only later, during the evaluation of preliminary designs, that a first pass, if performed, is made to determine the process, safety, and mission quality assurance requirements. Evaluation early in the life cycle, though, can yield requirements that force a fundamental change in design. This paper discusses an alternate paradigm for using the concepts of use case or activity diagrams to identify safety hazards and mission quality assurance risks and concerns using the same systems engineering modeling tools being used to identify technical requirements. It contains two examples of how this process might be used, in the development of a space flight experiment and in the design of a Human Powered Pizza Delivery Vehicle, along with the potential benefits: decreased development time and stronger budget estimates.

  13. IPL Processing of the Viking Orbiter Images of Mars

    NASA Technical Reports Server (NTRS)

    Ruiz, R. M.; Elliott, D. A.; Yagi, G. M.; Pomphrey, R. B.; Power, M. A.; Farrell, W., Jr.; Lorre, J. J.; Benton, W. D.; Dewar, R. E.; Cullen, L. E.

    1977-01-01

    The Viking orbiter cameras returned over 9000 images of Mars during the 6-month nominal mission. Digital image processing was required to produce products suitable for quantitative and qualitative scientific interpretation. Processing included the production of surface elevation data using computer stereophotogrammetric techniques, crater classification based on geomorphological characteristics, and the generation of color products using multiple black-and-white images recorded through spectral filters. The Image Processing Laboratory of the Jet Propulsion Laboratory was responsible for the design, development, and application of the software required to produce these 'second-order' products.

  14. Advances in the Development of a WCl6 CVD System for Coating UO2 Powders with Tungsten

    NASA Technical Reports Server (NTRS)

    Mireles, Omar R.; Tieman, Alyssa; Broadway, Jeramie; Hickman, Robert

    2013-01-01

    Demonstrated viability and utilization of: (a) a fluidized powder bed; (b) the WCl6 CVD process; (c) spherical particles coated with tungsten. The highly corrosive nature of the WCl6 solid reagent limits materials of construction. There are indications that identifying optimized process variables will require substantial effort and that the variables will likely vary with changes in fuel requirements.

  15. Education Department Begins Process to Implement HEA Reauthorization with New Campus Safety Provisions

    ERIC Educational Resources Information Center

    Phillips, Lisa

    2008-01-01

    The U.S. Department of Education has announced the beginning of the process to develop rules for new requirements in the recently passed Higher Education Act (HEA). Highlights of the HEA that affect campus public safety departments include measures that: (1) Require a fire log be maintained at an institution of higher education for events that…

  16. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
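    XML nets are a variant of high-level Petri nets, so the modeling primitive beneath them is the classical place/transition net: tokens in places, and transitions that fire when all their input places hold tokens. A minimal sketch of that firing semantics, with a hypothetical order-handling net that is not taken from the chapter:

    ```python
    # Minimal place/transition Petri net sketch: places hold token counts;
    # a transition is enabled when every input place has a token, and
    # firing moves tokens from inputs to outputs. The net is hypothetical.
    places = {"order received": 1, "order checked": 0, "order fulfilled": 0}

    # transition name -> (input places, output places)
    transitions = {
        "check order":  ({"order received"}, {"order checked"}),
        "fulfil order": ({"order checked"},  {"order fulfilled"}),
    }

    def enabled(t):
        ins, _ = transitions[t]
        return all(places[p] > 0 for p in ins)

    def fire(t):
        ins, outs = transitions[t]
        assert enabled(t), f"transition {t!r} not enabled"
        for p in ins:
            places[p] -= 1
        for p in outs:
            places[p] += 1

    fire("check order")
    fire("fulfil order")
    print(places)  # the single token has moved to 'order fulfilled'
    ```

    XML nets extend this picture by typing the tokens as XML documents, which is what makes them suitable for modeling the document flows of a POIS.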

  17. Key issues in the thermal design of spaceborne cryogenic infrared instruments

    NASA Astrophysics Data System (ADS)

    Schember, Helene R.; Rapp, Donald

    1992-12-01

    Thermal design and analysis play an integral role in the development of spaceborne cryogenic infrared (IR) instruments. From conceptual sketches to final testing, both direct and derived thermal requirements place significant constraints on the instrument design. Although in practice these thermal requirements are interdependent, the sources of most thermal constraints may be grouped into six distinct categories. These are: (1) Detector temperatures, (2) Optics temperatures, (3) Pointing or alignment stability, (4) Mission lifetime, (5) Orbit, and (6) Test and Integration. In this paper, we discuss these six sources of thermal requirements with particular regard to development of instrument packages for low background infrared astronomical observatories. In the end, the thermal performance of these instruments must meet a set of thermal requirements. The development of these requirements is typically an ongoing and interactive process, however, and the thermal design must maintain flexibility and robustness throughout the process. The thermal (or cryogenic) engineer must understand the constraints imposed by the science requirements, the specific hardware, the observing environment, the mission design, and the testing program. By balancing these often competing factors, the system-oriented thermal engineer can work together with the experiment team to produce an effective overall design of the instrument.

  18. Development of a COTS Mass Storage Unit for the Space Readiness Coherent Lidar Experiment

    NASA Technical Reports Server (NTRS)

    Liggin, Karl; Clark, Porter

    1999-01-01

    Developing a Mass Storage Unit (MSU) using commercial off-the-shelf (COTS) hard drives is an ongoing challenge in meeting the Space Readiness Coherent Lidar Experiment (SPARCLE) program requirements. A conceptual view of SPARCLE's laser collecting atmospheric data from the shuttle is shown in Figure 1. The decision to develop this technology required several in-depth studies before an actual COTS hard drive was selected to continue this effort. Continued development of the MSU can, and will, serve future NASA programs that require larger data storage and more onboard processing.

  19. Automated documentation generator for advanced protein crystal growth

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    To achieve an environment less dependent on the flow of paper, automated techniques of data storage and retrieval must be utilized. This software system, 'Automated Payload Experiment Tool,' seeks to provide a knowledge-based, hypertext environment for the development of NASA documentation. Once developed, the final system should be able to guide a Principal Investigator through the documentation process in a more timely and efficient manner, while supplying more accurate information to the NASA payload developer. The current system is designed for the development of the Science Requirements Document (SRD), the Experiment Requirements Document (ERD), the Project Plan, and the Safety Requirements Document.

  20. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  1. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw-materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  2. 24 CFR 1003.301 - Selection process.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Selection process. 1003.301 Section... Application and Selection Process § 1003.301 Selection process. (a) Threshold requirement. An applicant that... establish weights for the selection criteria, will specify the maximum points available, and will describe...

  3. Purple L1 Milestone Review Panel TotalView Debugger Functionality and Performance for ASC Purple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, M

    2006-12-12

    ASC code teams require a robust software debugging tool to help developers quickly find bugs in their codes and get their codes running. Development debugging commonly runs up to 512 processes. Production jobs run up to full ASC Purple scale, and at times require introspection while running. Developers want a debugger that runs on all their development and production platforms and that works with all compilers and runtimes used with ASC codes. The TotalView Multiprocess Debugger made by Etnus was specified for ASC Purple to address this needed capability. The ASC Purple environment builds on the environment seen by TotalView on ASCI White. The debugger must now operate with the Power5 CPU, Federation switch, AIX 5.3 operating system including large pages, IBM compilers 7 and 9, POE 4.2 parallel environment, and rs6000 SLURM resource manager. Users require robust, basic debugger functionality with acceptable performance at development debugging scale. A TotalView installation must be provided at the beginning of the early user access period that meets these requirements. A functional enhancement, fast conditional data watchpoints, and a scalability enhancement, capability up to 8192 processes, are to be demonstrated.

  4. 45 CFR 162.910 - Maintenance of standards and adoption of modifications and new standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS General Provisions for... developed through a process that provides for the following: (1) Open public access. (2) Coordination with...

  5. 45 CFR 162.910 - Maintenance of standards and adoption of modifications and new standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS General Provisions for... developed through a process that provides for the following: (1) Open public access. (2) Coordination with...

  6. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  7. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  8. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  9. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  10. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  11. Technology assessment and requirements analysis: a process to facilitate decision making in picture archiving and communications system implementation.

    PubMed

    Radvany, M G; Chacko, A K; Richardson, R R; Grazdan, G W

    1999-05-01

    In a time of decreasing resources, managers need a tool to manage their resources effectively, support clinical requirements, and replace aging equipment in order to ensure adequate clinical care. To do this successfully, one must be able to perform technology assessment and capital equipment asset management. The lack of a commercial system that adequately performed technology needs assessment and addressed the unique needs of the military led to the development of an in-house Technology Assessment and Requirements Analysis (TARA) program. The TARA is a tool that provides an unbiased review of clinical operations and the resulting capital equipment requirements for military hospitals. The TARA report allows for the development of acquisition strategies for new equipment, enhances personnel management, and improves and streamlines clinical operations and processes.

  12. Development of Sensors for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Medelius, Pedro

    2005-01-01

    Advances in technology have led to the availability of smaller and more accurate sensors. Computer power to process large amounts of data is no longer the prevailing issue; thus multiple and redundant sensors can be used to obtain more accurate and comprehensive measurements in a space vehicle. The successful integration and commercialization of micro- and nanotechnology for aerospace applications require that a close and interactive relationship be developed between the technology provider and the end user early in the project. Close coordination between the developers and the end users is critical since qualification for flight is time-consuming and expensive. The successful integration of micro- and nanotechnology into space vehicles requires a coordinated effort throughout the design, development, installation, and integration processes.

  13. ISO 9000 Implementation and Assessment: A Guide to Developing and Evaluating Quality Management Systems

    NASA Technical Reports Server (NTRS)

    Navarro, Robert J.; Grimm, Barry

    1996-01-01

    The agency has developed this reference publication to aid NASA organizations and their suppliers in the transition to ISO 9000. This guide focuses on the standard's intent, clarifies its requirements, offers implementation examples, and highlights interrelated areas. It can assist anyone developing or evaluating NASA or supplier quality management systems. The ISO 9000 standards contain the basic elements for managing those processes that affect an organization's ability to consistently meet customer requirements. ISO 9000 was developed through the International Organization for Standardization and has been adopted as the U.S. national standard. These standards define a flexible foundation for the customer-focused process measurement, management, and improvement that is the hallmark of world-class enterprises.

  14. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. 
The course is in an interactive lecture/workshop format to engage the participants in active learning. The course addresses the breadth and depth of the process, requirements, phases, participants, multidisciplinary aspects, tasks, and critical elements, as well as providing guidance from previous lessons learned. The participants are led to develop their own understanding of the current process and how it can be improved. Included are course objectives and a session-by-session outline of course content. Also included is an initial identification of visual aid requirements.

  15. An evaluation of software tools for the design and development of cockpit displays

    NASA Technical Reports Server (NTRS)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  16. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed-graph representation of the failure-effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error-prone, due to the intensive and customized process necessary to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
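    Since an FFM is a directed graph of failure-effect propagation paths, a basic verification-style query is reachability: given a failed component, which downstream components are affected? A minimal sketch, with a hypothetical component graph that is not from the paper:

    ```python
    # Sketch of a functional fault model as a directed graph: an edge
    # u -> v means a failure effect of u propagates to v. The components
    # and edges below are hypothetical illustrations.
    from collections import deque

    ffm = {
        "pump": ["fuel line"],
        "fuel line": ["injector"],
        "injector": ["combustion chamber"],
        "sensor": ["controller"],
    }

    def affected(component):
        """All downstream components reachable from a failed component (BFS)."""
        seen, queue = set(), deque([component])
        while queue:
            for nxt in ffm.get(queue.popleft(), []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    print(sorted(affected("pump")))  # -> ['combustion chamber', 'fuel line', 'injector']
    ```

    Automated checks of this kind — comparing computed propagation sets against the expected behavior of the physical system — are what make comprehensive FFM verification tractable compared with manual review of each component model.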

  17. Process and product development in the manufacturing of molecular therapeutics.

    PubMed

    Atkinson, E M; Christensen, J R

    1999-08-01

    In the development of molecular therapies, a great deal of attention has focused on tissue targets, gene delivery vectors, and expression cassettes. In order to become an approved therapy, however, a molecular therapeutic has to pass down the same product registration pathway as any other biological product. Moving from research into industrial production requires careful attention to regulatory, manufacturing and quality concerns. Early work on developing and characterizing robust and scalable manufacturing processes will ultimately be rewarded by ease of implementation as the product is successful in clinical trials. Regulatory agencies require solid process and product characterization studies to demonstrate control and understanding of the molecular therapeutic. As the gene therapy industry matures, standards will continue to rise, creating an industry that is capable of producing safe, high-quality and effective therapies for many of the world's most difficult disease targets.

  18. SiC/SiC Composites for 1200 C and Above

    NASA Technical Reports Server (NTRS)

    DiCarlo, J. A.; Yun, H.-M.; Morscher, G. N.; Bhatt, R. T.

    2004-01-01

    The successful replacement of metal alloys by ceramic matrix composites (CMC) in high-temperature engine components will require the development of constituent materials and processes that can provide CMC systems with enhanced thermal capability along with the key thermostructural properties required for long-term component service. This chapter presents information concerning processes and properties for five silicon carbide (SiC) fiber-reinforced SiC matrix composite systems recently developed by NASA that can operate under mechanical loading and oxidizing conditions for hundreds of hours at 1204, 1315, and 1427 C, temperatures well above current metal capability. This advanced capability stems in large part from specific NASA-developed processes that significantly improve the creep-rupture and environmental resistance of the SiC fiber as well as the thermal conductivity, creep resistance, and intrinsic thermal stability of the SiC matrices.

  19. Friction Stir Welding Development at NASA, Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Gentz, Steve (Technical Monitor)

    2001-01-01

    Friction stir welding (FSW) is a solid state process that can be used to join materials without melting. The process was invented by The Welding Institute (TWI), Cambridge, England. Friction stir welding exhibits several advantages over fusion welding in that it produces welds with fewer defects and higher joint efficiency and is capable of joining alloys that are generally considered non-weldable with a fusion weld process. In 1994, NASA-Marshall began collaborating with TWI to transform FSW from a laboratory curiosity to a viable metal joining process suitable for manufacturing hardware. While teamed with TWI, NASA-Marshall began its own FSW research and development effort to investigate possible aerospace applications for the FSW process. The work involved nearly all aspects of FSW development, including process modeling, scale-up issues, applications to advanced materials and development of tooling to use FSW on components of the Space Shuttle, with particular emphasis on aluminum tanks. The friction stir welding process involves spinning a pin-tool at an appropriate speed, plunging it into the base metal pieces to be joined, and then translating it along the joint of the work pieces. In aluminum alloys the rotating speed typically ranges from 200 to 400 revolutions per minute and the translation speed is approximately two to five inches per minute. The pin-tool is inserted at a small lead angle from the axis normal to the work piece and requires significant loading along the axis of the tool. An anvil or reaction structure is required behind the welded material to react the load along the axis of the pin tool. The process requires no external heat input, filler material, protective shielding gas or inert atmosphere typical of fusion weld processes. The FSW solid-state weld process has resulted in aluminum welds with significantly higher strengths, higher joint efficiencies and fewer defects than fusion welds used to join similar alloys.
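    The aluminum parameter window quoted above (200 to 400 rpm rotation, two to five inches per minute travel) lends itself to a quick back-of-the-envelope check. The sketch below is illustrative only, not a real process controller; the function names and the derived weld-pitch quantity (tool advance per revolution) are mine:

```python
# Hedged sketch: checks candidate FSW parameters against the aluminum
# window quoted in the text (200-400 rpm, 2-5 in/min).
RPM_RANGE = (200, 400)      # pin-tool rotation speed, rev/min
TRAVEL_RANGE = (2.0, 5.0)   # translation speed, in/min

def in_window(rpm, travel):
    """True if both parameters fall inside the quoted aluminum ranges."""
    return (RPM_RANGE[0] <= rpm <= RPM_RANGE[1]
            and TRAVEL_RANGE[0] <= travel <= TRAVEL_RANGE[1])

def weld_pitch(rpm, travel):
    """Tool advance per revolution (in/rev) = travel speed / rotation speed."""
    return travel / rpm

print(in_window(300, 3.0))             # a mid-window aluminum schedule
print(round(weld_pitch(300, 3.0), 4))  # advance per revolution, inches
```

    At 300 rpm and 3 in/min the tool advances only about a hundredth of an inch per revolution, which is why the stirred material stays in the plastic (never molten) state.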

  20. Perspective on the National Aero-Space Plane Program instrumentation development

    NASA Technical Reports Server (NTRS)

    Bogue, Rodney K.; Erbland, Peter

    1993-01-01

    A review of the requirement for, and development of, advanced measurement technology for the National Aerospace Plane program is presented. The objective is to discuss the technical need and the program commitment required to ensure that adequate and timely measurement capabilities are provided for ground and flight testing in the NASP program. The scope of the measurement problem is presented, the measurement process is described, how instrumentation technology development has been affected by NASP program evolution is examined, the national effort to define measurement requirements and assess the adequacy of current technology to support the NASP program is discussed, and the measurement requirements are summarized. The unique features of the NASP program that complicate the understanding of requirements and the development of viable solutions are illustrated.

  1. Evaluation Criteria for Solid Waste Processing Research and Technology Development

    NASA Technical Reports Server (NTRS)

    Levri, Julie A.; Hogan, J. A.; Alazraki, M. P.

    2001-01-01

    A preliminary list of criteria is proposed for evaluation of solid waste processing technologies for research and technology development (R&TD) in the Advanced Life Support (ALS) Program. Completion of the proposed list by current and prospective ALS technology developers, with regard to specific missions of interest, may enable identification of appropriate technologies (or lack thereof) and guide future development efforts for the ALS Program solid waste processing area. An attempt is made to include criteria that capture information about the technology of interest as well as its system-wide impacts. Some of the criteria in the list are mission-independent, while the majority are mission-specific. In order for technology developers to respond to mission-specific criteria, critical information must be available on the quantity, composition and state of the waste stream, the waste processing requirements, as well as top-level mission scenario information (e.g. safety, resource recovery, planetary protection issues, and ESM equivalencies). The technology readiness level (TRL) determines the degree to which a technology developer is able to accurately report on the list of criteria. Thus, a criteria-specific minimum TRL for mandatory reporting has been identified for each criterion in the list. Although this list has been developed to define criteria that are needed to direct funding of solid waste processing technologies, it possesses significant overlap with the criteria required for technology selection for inclusion in specific tests or missions. Additionally, this approach to technology evaluation may be adapted to other ALS subsystems.

  2. Development of a strategy for energy efficiency improvement in a Kraft process based on systems interactions analysis

    NASA Astrophysics Data System (ADS)

    Mateos-Espejel, Enrique

    The objective of this thesis is to develop, validate, and apply a unified methodology for the energy efficiency improvement of a Kraft process that addresses globally the interactions of the various process systems that affect its energy performance. An implementation strategy is the final result. An operating Kraft pulping mill situated in Eastern Canada with a production of 700 adt/d of high-grade bleached pulp was the case study. The Pulp and Paper industry is Canada's premier industry. It is characterized by large thermal energy and water consumption. Rising energy costs and more stringent environmental regulations have led the industry to refocus its efforts toward identifying ways to improve energy and water conservation. Energy and water aspects are usually analyzed independently, but in reality they are strongly interconnected. Therefore, there is a need for an integrated methodology, which considers energy and water aspects, as well as the optimal utilization and production of the utilities. The methodology consists of four successive stages. The first stage is the base case definition. The development of a focused, reliable and representative model of an operating process is a prerequisite to the optimization and fine tuning of its energy performance. A four-pronged procedure has been developed: data gathering, master diagram, utilities systems analysis, and simulation. The computer simulation has been focused on the energy and water systems. The second stage corresponds to the benchmarking analysis. The benchmarking of the base case has the objectives of identifying process inefficiencies and establishing guidelines for the development of effective enhancement measures. The studied process is evaluated by a comparison of its efficiency to the current practice of the industry and by the application of new energy and exergy content indicators. The minimum energy and water requirements of the process are also determined in this step. 
The third stage is the core of the methodology; it represents the formulation of technically feasible energy enhancing options. Several techniques are applied in an iterative procedure to cast light on their synergies and counter-actions. The objective is to develop a path for improving the process so as to maximize steam savings while minimizing the investment required. The fourth stage is the implementation strategy. As the existing process configuration and operating conditions vary from process to process, it is important to develop a strategy for the implementation of energy enhancement programs in the most advantageous way for each case. A three-phase strategy was selected for the specific case study in the context of its management strategic plan: the elimination of fossil fuel, the production of power and the liberation of steam capacity. A post-benchmarking analysis is done to quantify the improvement of the energy efficiency. The performance indicators are computed after all energy enhancing measures have been implemented. The improvement of the process by applying the unified methodology results in substantially more steam savings than applying the typical techniques it comprises individually: energy savings of 5.6 GJ/adt (27% of the current requirement), water savings of 32 m3/adt (34% of the current requirement) and an electricity production potential of 44.5 MW. As a result of applying the unified methodology the process becomes eco-friendly as it does not require fossil fuel for producing steam; its water and steam consumption are below the Canadian average and it produces large revenues from the production of green electricity.
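    The percentages reported above implicitly fix the mill's baseline consumption. A quick hedged calculation (assuming the stated savings and fractions are exact; the rounding is mine) recovers those baselines:

```python
# Back-calculate baseline consumption from the reported savings.
# Figures are from the abstract; the arithmetic and rounding are mine.
steam_saving_GJ_per_adt = 5.6    # stated to be 27% of the current requirement
water_saving_m3_per_adt = 32.0   # stated to be 34% of the current requirement

baseline_steam = steam_saving_GJ_per_adt / 0.27   # ~20.7 GJ/adt before retrofit
baseline_water = water_saving_m3_per_adt / 0.34   # ~94.1 m3/adt before retrofit

print(round(baseline_steam, 1), round(baseline_water, 1))
```

    So the retrofitted mill would run on roughly 15 GJ/adt of steam and 62 m3/adt of water, consistent with the claim that both fall below the Canadian average.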

  3. Multi-Center Implementation of NPR 7123.1A: A Collaborative Effort

    NASA Technical Reports Server (NTRS)

    Hall, Phillip B.; McNelis, Nancy B.

    2011-01-01

    Collaboration efforts between MSFC and GRC Engineering Directorates to implement the NASA Systems Engineering (SE) Engine have expanded over the past year to include other NASA Centers. Sharing information on designing, developing, and deploying SE processes has sparked further interest based on the realization that there is relative consistency in implementing SE processes at the institutional level. This presentation will provide a status on the ongoing multi-center collaboration and provide insight into how these NPR 7123.1A SE-aligned directives are being implemented and managed to better support the needs of NASA programs and projects. NPR 7123.1A, NASA Systems Engineering Processes and Requirements, was released on March 26, 2007 to clearly articulate and establish the requirements on the implementing organization for performing, supporting, and evaluating SE activities. In early 2009, MSFC and GRC Engineering Directorates undertook a collaborative opportunity to share their research and work associated with developing, updating and revising their SE process policy to comply and align with NPR 7123.1A. The goal is to develop instructions, checklists, templates, and procedures for each of the 17 SE process requirements so that systems engineers will be in a position to define work that is process-driven. Greater efficiency and more effective technical management will be achieved due to consistency and repeatability of SE process implementation across and throughout each of the NASA centers. An added benefit will be to encourage NASA centers to pursue and collaborate on joint projects as a result of using common or similar processes, methods, tools, and techniques.

  4. Impact of agile methodologies on team capacity in automotive radio-navigation projects

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Hutanu, A.; Volker, S.

    2017-01-01

    The development processes used in automotive radio-navigation projects are constantly under adaption pressure. While the software development models are based on automotive production processes, the integration of peripheral components into an automotive system will trigger a high number of requirement modifications. The use of traditional development models in the automotive industry will push a team's development capacity to its limits. The root cause lies in the inflexibility of the current processes and their limited capacity for adaption. This paper addresses a new project management approach for the development of radio-navigation projects. Understanding the weaknesses of currently used models helped us in the development and integration of agile methodologies into the traditional development model structure. In the first part we focus on change management methods to reduce the inflow of requests for change. An established change management risk analysis process enables project management to judge the impact of a requirement change and also gives the project time to implement some changes. However, in big automotive radio-navigation projects the saved time is not enough to implement the large number of changes that are submitted to the project. In the second part of this paper we focus on increasing team capacity by integrating agile methodologies into the traditional model at critical project phases. The overall objective of this paper is to prove the need for process adaption in order to resolve project team capacity bottlenecks.

  5. An acetate precursor process for BSCCO (2223) thin films and coprecipitated powders

    NASA Technical Reports Server (NTRS)

    Haertling, Gene H.

    1992-01-01

    Since the discovery of high temperature superconducting oxides much attention has been paid to finding better and useful ways to take advantage of the special properties exhibited by these materials. One such effort is the development of thin films for engineering applications. Another is the coprecipitation route to producing superconducting powders. An acetate precursor process for use in thin film fabrication and a chemical coprecipitation route to bismuth-based superconducting materials have been developed. Data obtained from the thin film process have so far been inconclusive and require further study. The chemical coprecipitation method of producing bulk material is a viable method, and is preferred over the previously used solid state route. This method of powder production appears to be an excellent route to producing thin section tape cast material and screen printed devices, as it requires fewer calcination steps than the oxide route to produce quality powders.

  6. On the Inevitable Intertwining of Requirements and Architecture

    NASA Astrophysics Data System (ADS)

    Sutcliffe, Alistair

    The chapter investigates the relationship between architecture and requirements, arguing that architectural issues need to be addressed early in the RE process. Three trends are driving architectural implications for RE: the growth of intelligent, context-aware, and adaptable systems. First the relationship between architecture and requirements is considered from a theoretical viewpoint of problem frames and abstract conceptual models. The relationships between architectural decisions and non-functional requirements are reviewed, and then the impact of architecture on the RE process is assessed using a case study of developing configurable, semi-intelligent software to support medical researchers in e-science domains.

  7. Integration of safety engineering into a cost optimized development program.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1972-01-01

    A six-segment management model is presented, each segment of which represents a major area in a new product development program. The first segment of the model covers integration of specialist engineers into 'systems requirement definition' or the system engineering documentation process. The second covers preparation of five basic types of 'development program plans.' The third segment covers integration of system requirements, scheduling, and funding of specialist engineering activities into 'work breakdown structures,' 'cost accounts,' and 'work packages.' The fourth covers 'requirement communication' by line organizations. The fifth covers 'performance measurement' based on work package data. The sixth covers 'baseline requirements achievement tracking.'

  8. Development of cognitive processing and judgments of knowledge in medical students: Analysis of progress test results.

    PubMed

    Cecilio-Fernandes, Dario; Kerdijk, Wouter; Jaarsma, A D Debbie C; Tio, René A

    2016-11-01

    Besides acquiring knowledge, medical students should also develop the ability to apply and reflect on it, requiring higher-order cognitive processing. Ideally, students should have reached higher-order cognitive processing when they enter the clinical program. Whether this is the case is unknown. We investigated students' cognitive processing, and awareness of their knowledge during medical school. Data were gathered from 347 first-year preclinical and 196 first-year clinical students concerning the 2008 and 2011 Dutch progress tests. Questions were classified based upon Bloom's taxonomy: "simple questions" requiring lower and "vignette questions" requiring higher-order cognitive processing. Subsequently, we compared students' performance and awareness of their knowledge in 2008 to that in 2011 for each question type. Students' performance on each type of question increased as students progressed. Preclinical and first-year clinical students performed better on simple questions than on vignette questions. Third-year clinical students performed better on vignette questions than on simple questions. The accuracy of students' judgment of knowledge decreased over time. The progress test is a useful tool to assess students' cognitive processing and awareness of their knowledge. At the end of medical school, students achieved higher-order cognitive processing but their awareness of their knowledge had decreased.

  9. The application of intelligent process control to space based systems

    NASA Technical Reports Server (NTRS)

    Wakefield, G. Steve

    1990-01-01

    The application of Artificial Intelligence to electronic and process control can help attain the autonomy and safety requirements of manned space systems. An overview of documented applications within various industries is presented. The development process is discussed along with associated issues for implementing an intelligence process control system.

  10. Knowledge base and sensor bus messaging service architecture for critical tsunami warning and decision-support

    NASA Astrophysics Data System (ADS)

    Sabeur, Z. A.; Wächter, J.; Middleton, S. E.; Zlatev, Z.; Häner, R.; Hammitzsch, M.; Loewe, P.

    2012-04-01

    The intelligent management of large volumes of environmental monitoring data for early tsunami warning requires the deployment of robust and scalable service oriented infrastructure that is supported by an agile knowledge-base for critical decision-support. In the TRIDEC project (TRIDEC 2010-2013), a sensor observation service bus of the TRIDEC system is being developed for the advancement of complex tsunami event processing and management. Further, a dedicated TRIDEC system knowledge-base is being implemented to enable on-demand access to semantically rich OGC SWE compliant hydrodynamic observations and operationally oriented meta-information for multiple subscribers. TRIDEC decision support requires a scalable and agile real-time processing architecture which enables fast response to evolving subscribers' requirements as the tsunami crisis develops. This is also achieved with the support of intelligent processing services which specialise in multi-level fusion methods with relevance feedback and deep learning. The TRIDEC knowledge base development work coupled with that of the generic sensor bus platform shall be presented to demonstrate advanced decision-support with situation awareness in the context of tsunami early warning and crisis management.
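    The core of a sensor bus that pushes observations to multiple subscribers as a crisis evolves is a publish/subscribe pattern. The sketch below is a bare-bones illustration of that pattern only; the class, topic, and field names are hypothetical and do not reflect TRIDEC's actual OGC SWE interfaces:

```python
# Minimal publish/subscribe sketch of a sensor observation bus.
# All names here are illustrative; a real SWE-compliant bus would carry
# structured observation documents, not plain dicts.
from collections import defaultdict

class SensorBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive every observation on `topic`."""
        self._subs[topic].append(callback)

    def publish(self, topic, observation):
        """Fan an observation out to all subscribers of `topic`."""
        for cb in self._subs[topic]:
            cb(observation)

bus = SensorBus()
alerts = []
# A decision-support subscriber that keeps only anomalous sea-level readings.
bus.subscribe("sea_level",
              lambda obs: alerts.append(obs) if obs["anomaly_m"] > 0.5 else None)
bus.publish("sea_level", {"station": "buoy-21", "anomaly_m": 0.8})
bus.publish("sea_level", {"station": "buoy-07", "anomaly_m": 0.1})
print(alerts)  # only the anomalous observation is retained
```

    The decoupling is the point: new subscribers (warning dissemination, situation displays, fusion services) can be attached at runtime without touching the publishing sensors.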

  11. Detailed Modeling of Physical Processes in Electron Sources for Accelerator Applications

    NASA Astrophysics Data System (ADS)

    Chubenko, Oksana; Afanasev, Andrei

    2017-01-01

    At present, electron sources are essential in a wide range of applications - from common technical use to exploring the nature of matter. Depending on the application requirements, different methods and materials are used to generate electrons. State-of-the-art accelerator applications set a number of often-conflicting requirements for electron sources (e.g., quantum efficiency vs. polarization, current density vs. lifetime, etc.). Development of advanced electron sources includes modeling and design of cathodes, material growth, fabrication of cathodes, and cathode testing. The detailed simulation and modeling of physical processes is required in order to shed light on the exact mechanisms of electron emission and to develop new-generation electron sources with optimized efficiency. The purpose of the present work is to study physical processes in advanced electron sources and develop scientific tools, which could be used to predict electron emission from novel nano-structured materials. In particular, the area of interest includes bulk/superlattice gallium arsenide (bulk/SL GaAs) photo-emitters and nitrogen-incorporated ultrananocrystalline diamond ((N)UNCD) photo/field-emitters. Work supported by The George Washington University and Euclid TechLabs LLC.

  12. Basic Hitchhiker Payload Requirements

    NASA Technical Reports Server (NTRS)

    Horan, Stephen

    1999-01-01

    This document lists the requirements for the NMSU Hitchhiker experiment payload that were developed as part of the EE 498/499 Capstone Design class during the 1999-2000 academic year. It describes the system needs identified in the mission document. The requirements listed here are those primarily used to generate the basic electronic and data processing requirements developed in the class design document. The needs of the experiment components are more fully described in the draft NASA hitchhiker customer requirements document. Many of the details for the overall payload are given in full in the NASA hitchhiker documentation.

  13. Design requirements and development of an airborne descent path definition algorithm for time navigation

    NASA Technical Reports Server (NTRS)

    Izumi, K. H.; Thompson, J. L.; Groce, J. L.; Schwab, R. W.

    1986-01-01

    The design requirements for a 4D path definition algorithm are described. These requirements were developed for the NASA ATOPS as an extension of the Local Flow Management/Profile Descent algorithm. They specify the processing flow, functional and data architectures, and system input requirements, and recommend the addition of a broad path revision (reinitialization) function capability. The document also summarizes algorithm design enhancements and the implementation status of the algorithm on an in-house PDP-11/70 computer. Finally, the requirements for the pilot-computer interfaces, the lateral path processor, and the guidance and steering function are described.

  14. Space station needs, attributes, and architectural options study. Volume 1: Missions and requirements

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Science and applications, NOAA environmental observation, commercial resource observations, commercial space processing, commercial communications, national security, technology development, and GEO servicing are addressed. Approach to time phasing of mission requirements, system sizing summary, time-phased user mission payload support, space station facility requirements, and integrated time-phased system requirements are also addressed.

  15. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    ERIC Educational Resources Information Center

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  16. Advanced Research Deposition System (ARDS) for processing CdTe solar cells

    NASA Astrophysics Data System (ADS)

    Barricklow, Keegan Corey

    CdTe solar cells have been commercialized at the Gigawatt/year level. The development of volume manufacturing processes for next generation CdTe photovoltaics (PV) with higher efficiencies requires research systems with flexibility, scalability, repeatability and automation. The Advanced Research Deposition System (ARDS) developed by the Materials Engineering Laboratory (MEL) provides such a platform for the investigation of materials and manufacturing processes necessary to produce the next generation of CdTe PV. Limited by previous research systems, the ARDS was developed to provide process and hardware flexibility, accommodating advanced processing techniques, and capable of producing device quality films. The ARDS is a unique, in-line process tool with nine processing stations. The system was designed, built and assembled at the Materials Engineering Laboratory. Final assembly, startup, characterization and process development are the focus of this research. Many technical challenges encountered during the startup of the ARDS were addressed in this research. In this study, several hardware modifications needed for the reliable operation of the ARDS were designed, constructed and successfully incorporated into the ARDS. The effect of process conditions on film properties for each process step was quantified. Process development to achieve a 12%-efficient baseline solar cell required investigation of discrete processing steps, troubleshooting process variation, and developing performance correlations. Subsequent to this research, many advances have been demonstrated with the ARDS. The ARDS consistently produces devices of 12% ±0.5% by the process of record (POR). The champion cell produced to date utilizing the ARDS has an efficiency of 16.2% on low cost commercial soda-lime glass and utilizes advanced films. 
The ARDS has enabled investigation of advanced concepts for processing CdTe devices including Plasma Cleaning, Plasma Enhanced Closed Space Sublimation (PECSS), an Electron Reflector (ER) using a Cd1-xMgxTe (CMT) structure, and alternative device structures. The ARDS has been instrumental in the collaborative research with many institutions.

  17. Managing computer-controlled operations

    NASA Technical Reports Server (NTRS)

    Plowden, J. B.

    1985-01-01

    A detailed discussion of Launch Processing System Ground Software Production is presented to establish the interrelationships of firing room resource utilization, configuration control, system build operations, and Shuttle data bank management. The production of a test configuration identifier is traced from requirement generation to program development. The challenge of the operational era is to implement fully automated utilities to interface with a resident system build requirements document to eliminate all manual intervention in the system build operations. Automatic update/processing of Shuttle data tapes will enhance operations during multi-flow processing.

  18. Designing Flightdeck Procedures

    NASA Technical Reports Server (NTRS)

    Barshi, Immanuel; Mauro, Robert; Degani, Asaf; Loukopoulou, Loukia

    2016-01-01

    The primary goal of this document is to provide guidance on how to design, implement, and evaluate flight deck procedures. It provides a process for developing procedures that meet clear and specific requirements. This document provides a brief overview of: 1) the requirements for procedures, 2) a process for the design of procedures, and 3) a process for the design of checklists. The brief overview is followed by amplified procedures that follow the above steps and provide details for the proper design, implementation and evaluation of good flight deck procedures and checklists.

  19. ISS Payload Human Factors

    NASA Technical Reports Server (NTRS)

    Ellenberger, Richard; Duvall, Laura; Dory, Jonathan

    2016-01-01

    The ISS Payload Human Factors Implementation Team (HFIT) is the Payload Developer's resource for Human Factors. HFIT is the interface between Payload Developers and ISS Payload Human Factors requirements in SSP 57000. HFIT provides recommendations on how to meet the Human Factors requirements and guidelines early in the design process. HFIT coordinates with the Payload Developer and Astronaut Office to find low cost solutions to Human Factors challenges for hardware operability issues.

  20. Large Composite Structures Processing Technologies for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Clinton, R. G., Jr.; Vickers, J. H.; McMahon, W. M.; Hulcher, A. B.; Johnston, N. J.; Cano, R. J.; Belvin, H. L.; McIver, K.; Franklin, W.; Sidwell, D.

    2001-01-01

    Significant efforts have been devoted to establishing the technology foundation to enable the progression to large scale composite structures fabrication. We are not capable today of fabricating many of the composite structures envisioned for the second generation reusable launch vehicle (RLV). Conventional 'aerospace' manufacturing and processing methodologies (fiber placement, autoclave, tooling) will require substantial investment and lead time to scale up. Out-of-autoclave process techniques will require aggressive efforts to mature the selected technologies and to scale up. Focused composite processing technology development and demonstration programs utilizing the building block approach are required to enable envisioned second generation RLV large composite structures applications. Government/industry partnerships have demonstrated success in this area and represent the best combination of skills and capabilities to achieve this goal.

  1. Metrology: Calibration and measurement processes guidelines

    NASA Technical Reports Server (NTRS)

    Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.

    1994-01-01

    The guide is intended as a resource to aid engineers and systems contractors in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process. Emphasis is given to the flowdown of project requirements to measurement system requirements, then through the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.
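    One of the techniques the guide covers, evaluation of measurement uncertainty, is conventionally done by combining independent standard uncertainty components in quadrature (the root-sum-square rule of GUM-style analysis) and scaling by a coverage factor. A minimal sketch with made-up component values:

```python
# Root-sum-square combination of independent standard uncertainties,
# per the conventional (GUM-style) approach. Component values are
# illustrative, not from the guide.
import math

def combined_uncertainty(components):
    """u_c = sqrt(sum of u_i^2) for independent uncertainty components."""
    return math.sqrt(sum(u * u for u in components))

# e.g. reference-standard uncertainty and measurement repeatability
u_c = combined_uncertainty([0.03, 0.04])
U = 2.0 * u_c   # expanded uncertainty with coverage factor k = 2 (~95%)
print(round(u_c, 3), round(U, 3))
```

    Correlated components would instead require covariance terms in the sum; the quadrature form above only holds when the error sources are independent.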

  2. Payload crew training complex simulation engineer's handbook

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1984-01-01

    The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.

  3. Performance Steel Castings

    DTIC Science & Technology

    2012-09-30

    Development of Sand Properties 103 Advanced Modeling Dataset.. 105 High Strength Low Alloy (HSLA) Steels 107 Steel Casting and Engineering Support...to achieve the performance goals required for new systems. The dramatic reduction in weight and increase in capability will require high performance...for improved weapon system reliability. SFSA developed innovative casting design and manufacturing processes for high performance parts. SFSA is

  4. 42 CFR 137.320 - Is the Secretary required to consult with affected Indian Tribes concerning construction projects...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HUMAN SERVICES TRIBAL SELF-GOVERNANCE Construction Notification (prioritization Process, Planning, Development and Construction) § 137.320 Is the Secretary required to consult with affected Indian Tribes...

  5. Get the Power You Need, When and Where You Need It Aboard the International Space Station (ISS) Using the ISS Plug-In Plan (IPiP) Requirement Request Process

    NASA Technical Reports Server (NTRS)

    Moore, Kevin D.

    2017-01-01

    Trying to get your experiment aboard ISS? You likely will need power; many end-item providers do. The ISS Plug-In Plan (IPiP) supports power and data for science, Payloads (or Utilization), vehicle systems, and daily operations through the Electrical Power System (EPS) Secondary Power/Data Subsystem. Yet limited resources and increasing requirements continue to influence decisions on deployment of ISS end items. Given the fluid launch schedule and the rapidly increasing number of end-item providers requiring power support, the focus of the Plug-In Plan has evolved from a simple first-in, first-out (FIFO) recommendation for providing power to end-item users, to anticipating future requirements through judicious development and delivery of support equipment (cables, power supplies, power strips, and alternating current (AC) power inverters), innovative deployment strategies, and collaboration on end-item development. This paper describes the evolution of the relationship among the ISS Program Office, Engineering Directorate, Flight Operations Directorate (FOD), International Partners, and the end-item provider; explains how collaboration successfully reconciles unique requirements with limited on-board equipment, resources, tools, and processes, resulting in more agile integration; and describes the process designed to assure new ISS end-item providers that their power requirements will be met.

  6. Influence of technology on magnetic tape storage device characteristics

    NASA Technical Reports Server (NTRS)

    Gniewek, John J.; Vogel, Stephen M.

    1994-01-01

    There are available today many data storage devices that serve the diverse application requirements of the consumer, professional entertainment, and computer data processing industries. Storage technologies include semiconductors, several varieties of optical disk, optical tape, magnetic disk, and many varieties of magnetic tape. In some cases, devices are developed with specific characteristics to meet specification requirements. In other cases, an existing storage device is modified and adapted to a different application. For magnetic tape storage devices, examples of the former case are 3480/3490 and QIC device types developed for the high end and low end segments of the data processing industry respectively, VHS, Beta, and 8 mm formats developed for consumer video applications, and D-1, D-2, D-3 formats developed for professional video applications. Examples of modified and adapted devices include 4 mm, 8 mm, 12.7 mm and 19 mm computer data storage devices derived from consumer and professional audio and video applications. With the conversion of the consumer and professional entertainment industries from analog to digital storage and signal processing, there have been increasing references to the 'convergence' of the computer data processing and entertainment industry technologies. There has yet to be seen, however, any evidence of convergence of data storage device types. There are several reasons for this. The diversity of application requirements results in varying degrees of importance for each of the tape storage characteristics.

  7. PVD thermal barrier coating applications and process development for aircraft engines

    NASA Astrophysics Data System (ADS)

    Rigney, D. V.; Viguie, R.; Wortman, D. J.; Skelly, D. W.

    1997-06-01

    Thermal barrier coatings (TBCs) have been developed for application to aircraft engine components to improve service life in an increasingly hostile thermal environment. The choice of TBC type is related to the component, intended use, and economics. Selection of electron beam physical vapor deposition processing for turbine blades is due in part to part size, surface finish requirements, thickness control needs, and hole closure issues. Process development of PVD TBCs has been carried out at several different sites, including GE Aircraft Engines (GEAE). The influence of processing variables on microstructure is discussed, along with the GEAE development coater and initial experiences of pilot line operation.

  8. Evolutionary Software Development (Developpement Evolutionnaire de Logiciels)

    DTIC Science & Technology

    2008-08-01

    development processes. While this may be true, frequently it is not. MIL-STD-498 was explicitly introduced to encourage iterative development; ISO/IEC... 12207 was carefully worded not to prohibit iterative development. Yet both standards were widely interpreted as requiring waterfall development, as

  9. Evolutionary Software Development (Developpement evolutionnaire de logiciels)

    DTIC Science & Technology

    2008-08-01

    development processes. While this may be true, frequently it is not. MIL-STD-498 was explicitly introduced to encourage iterative development; ISO/IEC... 12207 was carefully worded not to prohibit iterative development. Yet both standards were widely interpreted as requiring waterfall development, as

  10. Regulation of cold-induced sweetening in potatoes and markers for fast-track new variety development

    USDA-ARS?s Scientific Manuscript database

    Potato breeding is a tedious, time consuming process. With the growing requirements of the potato processing industry for new potato varieties, there is need for effective tools to speed-up new variety development. The purpose of this study was to understand the enzymatic regulation of cold-induce...

  11. Functionalised particles using dry powder coating in pharmaceutical drug delivery: promises and challenges.

    PubMed

    Dahmash, Eman Z; Mohammed, Afzal R

    2015-01-01

    Production of functionalised particles using dry powder coating is a one-step, environmentally friendly process that paves the way for the development of particles with targeted properties and diverse functionalities. Applying the first principles in physical science for powders, fine guest particles can be homogeneously dispersed over the surface of larger host particles to develop functionalised particles. Multiple functionalities can be modified including: flowability, dispersibility, fluidisation, homogeneity, content uniformity and dissolution profile. The current publication seeks to understand the fundamental principles and science governing the dry coating process, evaluates key technologies developed to produce functionalised particles, outlines their advantages, limitations, and applications, and discusses in detail the resultant functionalities and their applications. Dry particle coating is a promising solvent-free manufacturing technology to produce particles with targeted functionalities. Progress within this area requires the development of continuous processing devices that can overcome challenges encountered with current technologies such as heat generation and particle attrition. Growth within this field requires extensive research to further understand the impact of process design and material properties on resultant functionalities.

  12. 24 CFR 203.5 - Direct Endorsement process.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Direct Endorsement process. 203.5... SINGLE FAMILY MORTGAGE INSURANCE Eligibility Requirements and Underwriting Procedures Direct Endorsement, Lender Insurance, and Commitments § 203.5 Direct Endorsement process. (a) General. Under the Direct...

  13. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  14. Laser dimpling process parameters selection and optimization using surrogate-driven process capability space

    NASA Astrophysics Data System (ADS)

    Ozkat, Erkan Caner; Franciosa, Pasquale; Ceglarek, Dariusz

    2017-08-01

    Remote laser welding technology offers opportunities for high production throughput at a competitive cost. However, the remote laser welding process of zinc-coated sheet metal parts in lap joint configuration poses a challenge due to the difference between the melting temperature of the steel (∼1500 °C) and the vapourizing temperature of the zinc (∼907 °C). In fact, the zinc layer at the faying surface is vapourized and the vapour might be trapped within the melting pool leading to weld defects. Various solutions have been proposed to overcome this problem over the years. Among them, laser dimpling has been adopted by manufacturers because of its flexibility and effectiveness along with its cost advantages. In essence, the dimple works as a spacer between the two sheets in lap joint and allows the zinc vapour escape during welding process, thereby preventing weld defects. However, there is a lack of comprehensive characterization of dimpling process for effective implementation in real manufacturing system taking into consideration inherent changes in variability of process parameters. This paper introduces a methodology to develop (i) surrogate model for dimpling process characterization considering multiple-inputs (i.e. key control characteristics) and multiple-outputs (i.e. key performance indicators) system by conducting physical experimentation and using multivariate adaptive regression splines; (ii) process capability space (Cp-Space) based on the developed surrogate model that allows the estimation of a desired process fallout rate in the case of violation of process requirements in the presence of stochastic variation; and, (iii) selection and optimization of the process parameters based on the process capability space. 
The proposed methodology provides a unique capability to: (i) simulate the effect of process variation as generated by manufacturing process; (ii) model quality requirements with multiple and coupled quality requirements; and (iii) optimize process parameters under competing quality requirements such as maximizing the dimple height while minimizing the dimple lower surface area.
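    The surrogate-plus-capability-space idea can be sketched in simplified form: fit a cheap surrogate to experimental data, then propagate stochastic variation of a process parameter through it to estimate the fallout rate against a specification. The sketch below substitutes a one-input quadratic fit for the paper's multivariate adaptive regression splines, and all data, specification limits, and variation figures are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical experimental data: laser power (kW) -> dimple height (mm).
power  = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
height = np.array([0.08, 0.14, 0.21, 0.26, 0.29, 0.30, 0.29])

# Simple quadratic surrogate standing in for the paper's MARS model.
surrogate = np.poly1d(np.polyfit(power, height, deg=2))

# Monte Carlo propagation of process variation: power varies stochastically
# around a nominal setting; fallout = fraction of dimples violating the spec.
nominal, sigma = 2.5, 0.15           # hypothetical nominal power and std dev
spec_lo, spec_hi = 0.20, 0.30        # hypothetical dimple-height spec (mm)
samples = surrogate(rng.normal(nominal, sigma, 100_000))
fallout = float(np.mean((samples < spec_lo) | (samples > spec_hi)))
```

    Sweeping the nominal setting and sigma over a grid of candidate values and keeping only combinations whose fallout stays below a target rate yields the process capability space from which optimized parameters are selected.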

  15. Making the Grade: Describing Inherent Requirements for the Initial Teacher Education Practicum

    ERIC Educational Resources Information Center

    Sharplin, Elaine; Peden, Sanna; Marais, Ida

    2016-01-01

    This study explores the development, description, and illustration of inherent requirement (IR) statements to make explicit the requirements for performance on an initial teacher education (ITE) practicum. Through consultative group processes with stakeholders involved in ITE, seven IR domains were identified. From interviews with academics,…

  16. Saint Lawrence Seaway Navigation-Aid System Study : Volume III - Appendix C - User's Manual and Documentation of the Ship Maneuvering Requirements Computer Program

    DOT National Transportation Integrated Search

    1978-09-01

    The requirements for a navigation guidance system which will effect an increase in the ship processing capacity of the Saint Lawrence Seaway (Lake Ontario to Montreal, Quebec) are developed. The requirements include a specification of system position...

  17. Yield model development project implementation plan

    NASA Technical Reports Server (NTRS)

    Ambroziak, R. A.

    1982-01-01

    Tasks remaining to be completed are summarized for the following major project elements: (1) evaluation of crop yield models; (2) crop yield model research and development; (3) data acquisition, processing, and storage; (4) related yield research: defining spectral and/or remote sensing data requirements, developing input for driving and testing crop growth/yield models, and real-time testing of wheat plant process models; and (5) project management and support.

  18. Systems Integration Processes for NASA Ares I Crew Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Taylor, James L.; Reuter, James L.; Sexton, Jeffrey D.

    2006-01-01

    NASA's Exploration Initiative will require development of many new elements to constitute a robust system of systems. New launch vehicles are needed to place cargo and crew in stable Low Earth Orbit (LEO). This paper examines the systems integration processes NASA is utilizing to ensure integration and control of propulsion and nonpropulsion elements within NASA's Crew Launch Vehicle (CLV), now known as the Ares I. The objective of the Ares I is to provide the transportation capabilities to meet the Constellation Program requirements for delivering a Crew Exploration Vehicle (CEV) or other payload to LEO in support of the lunar and Mars missions. The Ares I must successfully provide this capability within cost and schedule, and with an acceptable risk approach. This paper will describe the systems engineering management processes that will be applied to assure Ares I Project success through complete and efficient technical integration. Discussion of technical review and management processes for requirements development and verification, integrated design and analysis, integrated simulation and testing, and the integration of reliability, maintainability and supportability (RMS) into the design will also be included. The Ares I Project is logically divided into elements by the major hardware groupings, and associated management, system engineering, and integration functions. The processes to be described herein are designed to integrate within these Ares I elements and among the other Constellation projects. Also discussed is launch vehicle stack integration (Ares I to CEV, and Ground and Flight Operations integration) throughout the life cycle, including integrated vehicle performance through orbital insertion, recovery of the first stage, and reentry of the upper stage. 
The processes for decomposing requirements to the elements and ensuring that requirements have been correctly validated, decomposed, and allocated, and that the verification requirements are properly defined to ensure that the system design meets requirements, will be discussed.
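    The decomposition-and-verification discipline described above can be illustrated with a toy traceability check: every child requirement must trace to a baselined parent and carry a verification method before the allocation is considered complete. All identifiers and methods below are invented:

```python
# Parent (system-level) requirement IDs currently in the baseline.
parents = {"CLV-001", "CLV-002"}

# Child (element-level) requirements: each records its claimed parent
# and the verification method assigned to it (None = not yet assigned).
children = {
    "US-010": {"parent": "CLV-001", "verify": "test"},
    "US-011": {"parent": "CLV-002", "verify": "analysis"},
    "US-012": {"parent": "CLV-009", "verify": None},   # deliberately broken
}

# Orphans trace to a parent that is not in the baseline; unverified
# children lack a verification method. Both block baseline approval.
orphans    = [c for c, r in children.items() if r["parent"] not in parents]
unverified = [c for c, r in children.items() if not r["verify"]]
```

    In a real program the same checks run over thousands of requirements held in a requirements-management database, but the validation logic is the same shape.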

  19. The systems engineering overview and process (from the Systems Engineering Management Guide, 1990)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The past several decades have seen the rise of large, highly interactive systems that are on the forward edge of technology. As a result of this growth and the increased usage of digital systems (computers and software), the concept of systems engineering has gained increasing attention. Some of this attention is no doubt due to large program failures which possibly could have been avoided, or at least mitigated, through the use of systems engineering principles. The complexity of modern day weapon systems requires conscious application of systems engineering concepts to ensure producible, operable and supportable systems that satisfy mission requirements. Although many authors have traced the roots of systems engineering to earlier dates, the initial formalization of the systems engineering process for military development began to surface in the mid-1950s on the ballistic missile programs. These early ballistic missile development programs marked the emergence of engineering discipline 'specialists' which has since continued to grow. Each of these specialties not only has a need to take data from the overall development process, but also to supply data, in the form of requirements and analysis results, to the process. A number of technical instructions, military standards and specifications, and manuals were developed as a result of these development programs. In particular, MIL-STD-499 was issued in 1969 to assist both government and contractor personnel in defining the systems engineering effort in support of defense acquisition programs. This standard was updated to MIL-STD-499A in 1974, and formed the foundation for current application of systems engineering principles to military development programs.

  20. The systems engineering overview and process (from the Systems Engineering Management Guide, 1990)

    NASA Astrophysics Data System (ADS)

    The past several decades have seen the rise of large, highly interactive systems that are on the forward edge of technology. As a result of this growth and the increased usage of digital systems (computers and software), the concept of systems engineering has gained increasing attention. Some of this attention is no doubt due to large program failures which possibly could have been avoided, or at least mitigated, through the use of systems engineering principles. The complexity of modern day weapon systems requires conscious application of systems engineering concepts to ensure producible, operable and supportable systems that satisfy mission requirements. Although many authors have traced the roots of systems engineering to earlier dates, the initial formalization of the systems engineering process for military development began to surface in the mid-1950s on the ballistic missile programs. These early ballistic missile development programs marked the emergence of engineering discipline 'specialists' which has since continued to grow. Each of these specialties not only has a need to take data from the overall development process, but also to supply data, in the form of requirements and analysis results, to the process. A number of technical instructions, military standards and specifications, and manuals were developed as a result of these development programs. In particular, MIL-STD-499 was issued in 1969 to assist both government and contractor personnel in defining the systems engineering effort in support of defense acquisition programs. This standard was updated to MIL-STD-499A in 1974, and formed the foundation for current application of systems engineering principles to military development programs.

  1. A continuous process for the development of Kodak Aerochrome Infrared Film 2443 as a negative

    NASA Astrophysics Data System (ADS)

    Klimes, D.; Ross, D. I.

    1993-02-01

    A process for the continuous dry-to-dry development of Kodak Aerochrome Infrared Film 2443 as a negative (CIR-neg) is described. The process is well suited for production processing of long film lengths. Chemicals from three commercial film processes are used with modifications. Sensitometric procedures are recommended for the monitoring of processing quality control. Sensitometric data and operational aerial exposures indicate that films developed in this process have approximately the same effective aerial film speed as films processed in the reversal process recommended by the manufacturer (Kodak EA-5). The CIR-neg process is useful when aerial photography is acquired for resources management applications which require print reproductions. Originals can be readily reproduced using conventional production equipment (electronic dodging) in black and white or color (color compensation).

  2. Maintainability Program Requirements for Space Systems

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This document is established to provide common general requirements for all NASA programs to: design maintainability into all systems where maintenance is a factor in system operation and mission success; and ensure that maintainability characteristics are developed through the systems engineering process. These requirements are not new. Design for ease of maintenance and minimization of repair time have always been fundamental requirements of the systems engineering process. However, new or reusable orbital manned and in-flight maintainable unmanned space systems demand special emphasis on maintainability, and this document has been prepared to meet that need. Maintainability requirements on many NASA programs differ in phasing and task emphasis from requirements promulgated by other Government agencies. This difference is due to the research and development nature of NASA programs where quantities produced are generally small; therefore, the depth of logistics support typical of many programs is generally not warranted. The cost of excessive maintenance is very high due to the logistics problems associated with the space environment. The ability to provide timely maintenance often involves safety considerations for manned space flight applications. This document represents a basic set of requirements that will achieve a design for maintenance. These requirements are directed primarily at manned and unmanned orbital space systems. To be effective, maintainability requirements should be tailored to meet specific NASA program and project needs and constraints. NASA activities shall invoke the requirements of this document consistent with program planning in procurements or on inhouse development efforts.

  3. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1979-01-01

    The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.

  4. Some considerations on the attractiveness of participatory processes for researchers from natural science

    NASA Astrophysics Data System (ADS)

    Barthel, Roland

    2013-04-01

    Participatory modeling and participatory scenario development have become an essential part of environmental impact assessment and planning in the field of water resources management. But even if most people agree that participation is required to solve environmental problems in a way that satisfies both the environmental and societal needs, success stories are relatively rare, while many attempts to include stakeholders in the development of models are still reported to have failed. This paper proposes the hypothesis that the lack of success in participatory modeling can partly be attributed to a lack of attractiveness of participatory approaches for researchers from natural sciences (subsequently called 'modelers'). It has to be pointed out that this discussion is mainly concerned with natural scientists in academia and not with modelers who develop models for commercial purposes or modelers employed by public agencies. The involvement of modelers and stakeholders in participatory modeling has been intensively studied during recent years. However, such analysis is rarely made from the viewpoint of the modelers themselves. Modelers usually don't see participatory modeling and scenario development as scientific targets as such, because the theoretical foundations of such processes usually lie far outside their own area of expertise. Thus, participatory processes are seen mainly as a means to attract funding or to facilitate the access to data or (relatively rarely) as a way to develop a research model into a commercial product. The majority of modelers very likely do not spend too much time on reflecting whether or not their new tools are helpful to solve real world problems or if the results are understandable and acceptable for stakeholders. They consider their task completed when the model they developed satisfies the 'scientific requirements', which are essentially different from the requirements to satisfy a group of stakeholders. 
Funding often stops before a newly developed model can actually be tested in a stakeholder process. Therefore the gap between stakeholders and modelers persists or is even growing. A main reason for this probably lies in the way that the work of scientists (modelers) is evaluated. What counts is the number of journal articles produced, while applicability or societal impact is still not a measure of scientific success. A good journal article on a model requires an exemplary validation but only very rarely would a reviewer ask if a model was accepted by stakeholders. So why should a scientist go through a tedious stakeholder process? The stakeholder process might be a requirement of the research grant, but whether this is taken seriously, can be questioned, as long as stakeholder dialogues do not lead to quantifiable scientific success. In particular for researchers in early career stages who undergo typical, publication-based evaluation processes, participatory research is hardly beneficial. The discussion in this contribution is based on three pillars: (i) a comprehensive evaluation of the literature published on participatory modeling and scenario development, (ii) a case study involving the development of an integrated model for water and land use management including an intensive stakeholder process and (iii) unstructured, personal communication - with mainly young scientists - about the attractiveness of multidisciplinary, applied research.

  5. Strategies for automatic planning: A collection of ideas

    NASA Technical Reports Server (NTRS)

    Collins, Carol; George, Julia; Zamani, Elaine

    1989-01-01

    The main goal of the Jet Propulsion Laboratory (JPL) is to obtain science return from interplanetary probes. The uplink process is concerned with communicating commands to a spacecraft in order to achieve science objectives. There are two main parts to the development of the command file which is sent to a spacecraft. First, the activity planning process integrates the science requests for utilization of spacecraft time into a feasible sequence. Then the command generation process converts the sequence into a set of commands. The development of a feasible sequence plan is an expensive and labor intensive process requiring many months of effort. In order to save time and manpower in the uplink process, automation of parts of this process is desired. There is an ongoing effort to develop automatic planning systems. This has met with some success, but has also been informative about the nature of this effort. It is now clear that innovative techniques and state-of-the-art technology will be required in order to produce a system which can provide automatic sequence planning. As part of this effort to develop automatic planning systems, a survey of the literature was conducted, looking for known techniques which may be applicable to this work. Descriptions of and references for these methods are given, together with ideas for applying the techniques to automatic planning.

  6. EUV mask pilot line at Intel Corporation

    NASA Astrophysics Data System (ADS)

    Stivers, Alan R.; Yan, Pei-Yang; Zhang, Guojing; Liang, Ted; Shu, Emily Y.; Tejnil, Edita; Lieberman, Barry; Nagpal, Rajesh; Hsia, Kangmin; Penn, Michael; Lo, Fu-Chang

    2004-12-01

    The introduction of extreme ultraviolet (EUV) lithography into high volume manufacturing requires the development of a new mask technology. In support of this, Intel Corporation has established a pilot line devoted to encountering and eliminating barriers to manufacturability of EUV masks. It concentrates on EUV-specific process modules and makes use of the captive standard photomask fabrication capability of Intel Corporation. The goal of the pilot line is to accelerate EUV mask development to intersect the 32nm technology node. This requires EUV mask technology to be comparable to standard photomask technology by the beginning of the silicon wafer process development phase for that technology node. The pilot line embodies Intel's strategy to lead EUV mask development in the areas of the mask patterning process, mask fabrication tools, the starting material (blanks) and the understanding of process interdependencies. The patterning process includes all steps from blank defect inspection through final pattern inspection and repair. We have specified and ordered the EUV-specific tools and most will be installed in 2004. We have worked with International Sematech and others to provide for the next generation of EUV-specific mask tools. Our process of record is run repeatedly to ensure its robustness. This primes the supply chain and collects information needed for blank improvement.

  7. Critical Review of NOAA's Observation Requirements Process

    NASA Astrophysics Data System (ADS)

    LaJoie, M.; Yapur, M.; Vo, T.; Templeton, A.; Bludis, D.

    2017-12-01

    NOAA's Observing Systems Council (NOSC) maintains a comprehensive database of user observation requirements. The requirements collection process engages NOAA subject matter experts to document and effectively communicate the specific environmental observation measurements (parameters and attributes) needed to produce operational products and pursue research objectives. Documenting user observation requirements in a structured, standardized framework enables NOAA to assess its needs across organizational lines in an impartial, objective, and transparent manner. This structure provides the foundation for selecting, designing, developing, and acquiring observing technologies, systems, and architectures; for budget and contract formulation and decision-making; and for assessing, in a repeatable fashion, the productivity, efficiency, and optimization of NOAA's observing system enterprise. User observation requirements are captured independently from observing technologies. Therefore, they can be addressed by a variety of current or expected observing capabilities and can flexibly be remapped to new and evolving technologies. NOAA's current inventory of user observation requirements was collected over a ten-year period, and there have been many changes in policies, mission priorities, and funding levels during this time. In light of these changes, the NOSC initiated a critical, in-depth review during 2017 to examine all aspects of user observation requirements and associated processes. This presentation provides background on the NOAA requirements process, major milestones and outcomes of the critical review, and plans for evolving and connecting observing requirements processes in the next year.
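    A record in such a technology-independent requirements database can be sketched as a simple structure; the field names and values below are illustrative assumptions, not NOAA's actual schema:

```python
from dataclasses import dataclass

@dataclass
class ObservationRequirement:
    """Technology-independent user observation requirement (illustrative).

    Field names are hypothetical; the real database schema differs.
    """
    parameter: str                   # geophysical parameter being measured
    horizontal_resolution_km: float  # required spatial resolution
    refresh_hours: float             # required observation frequency
    accuracy: str                    # required measurement accuracy
    mission_area: str                # organizational line the need supports
    priority: int = 3                # 1 = highest

reqs = [
    ObservationRequirement("sea surface temperature", 4.0, 6.0,
                           "0.5 K", "Oceans", 1),
    ObservationRequirement("soil moisture", 10.0, 24.0,
                           "0.04 m3/m3", "Weather", 2),
]

# Because requirements carry no sensor-specific fields, the same records can
# be remapped to new observing technologies by querying on parameters alone.
high_priority = [r.parameter for r in reqs if r.priority == 1]
```

    Queries like the one above are what make cross-organizational, technology-neutral gap assessments repeatable.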

  8. A Successful Infusion Process for Enabling Lunar Exploration Technologies

    NASA Technical Reports Server (NTRS)

    Over, Ann P.; Klem, Mark K.; Motil, Susan M.

    2008-01-01

    The NASA Vision for Space Exploration begins with a more reliable flight capability to the International Space Station and ends with sending humans to Mars. An important stepping stone on the path to Mars encompasses human missions to the Moon. There is little doubt throughout the stakeholder community that new technologies will be required to enable this Vision. However, there are many factors that influence the ability to successfully infuse any technology, including technical risk, requirements and development schedule maturity, and available funds. This paper focuses on effective infusion processes that have been used recently for the technologies in development for the lunar exploration flight program, Constellation. Recent successes with Constellation customers are highlighted for the Exploration Technology Development Program (ETDP) Projects managed by NASA Glenn Research Center (GRC). Following an overview of the technical context of both the flight program and the technology capability mapping, the process is described for how to effectively build an integrated technology infusion plan. The process starts with a sound risk development plan and is completed with an integrated project plan, including content, schedule and cost. In reality, the available resources for this development are going to change over time, necessitating some level of iteration in the planning. However, the driving process is based on the initial risk assessment, which changes only when the overall architecture changes, enabling some level of stability in the process.

  9. Application of process mapping to understand integration of high risk medicine care bundles within community pharmacy practice.

    PubMed

    Weir, Natalie M; Newham, Rosemary; Corcoran, Emma D; Ali Atallah Al-Gethami, Ashwag; Mohammed Abd Alridha, Ali; Bowie, Paul; Watson, Anne; Bennie, Marion

    2017-11-21

    The Scottish Patient Safety Programme - Pharmacy in Primary Care collaborative is a quality improvement initiative adopting the Institute of Healthcare Improvement Breakthrough Series collaborative approach. The programme developed and piloted High Risk Medicine (HRM) Care Bundles (CB), focused on warfarin and non-steroidal anti-inflammatories (NSAIDs), within 27 community pharmacies over 4 NHS Regions. Each CB involves clinical assessment and patient education, although the CB content varies between regions. To support national implementation, this study aims to understand how the pilot pharmacies integrated the HRM CBs into routine practice to inform the development of a generic HRM CB process map. Regional process maps were developed in 4 pharmacies through simulation of the CB process, staff interviews and documentation of resources. Commonalities were collated to develop a process map for each HRM, which were used to explore variation at a national event. A single, generic process map was developed which underwent validation by case study testing. The findings allowed development of a generic process map applicable to warfarin and NSAID CB implementation. Five steps were identified as required for successful CB delivery: patient identification; clinical assessment; pharmacy CB prompt; CB delivery; and documentation. The generic HRM CB process map encompasses the staff and patients' journey and the CB's integration into routine community pharmacy practice. Pharmacist involvement was required only for clinical assessment, indicating suitability for whole-team involvement. Understanding CB integration into routine practice has positive implications for successful implementation. The generic process map can be used to develop targeted resources, and/or be disseminated to facilitate CB delivery and foster whole team involvement. 
Similar methods could be utilised within other settings, to allow those developing novel services to distil the key processes and consider their integration within routine workflows to effect maximal, efficient implementation and benefit to patient care. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Cryptococcus neoformans Mediator Protein Ssn8 Negatively Regulates Diverse Physiological Processes and Is Required for Virulence

    PubMed Central

    Wang, Lin-Ing; Lin, Yu-Sheng; Liu, Kung-Hung; Jong, Ambrose Y.; Shen, Wei-Chiang

    2011-01-01

    Cryptococcus neoformans is a ubiquitously distributed human pathogen. It is also a model system for studying fungal virulence, physiology and differentiation. Light is known to inhibit sexual development via the evolutionarily conserved white collar proteins in C. neoformans. To dissect molecular mechanisms regulating this process, we have identified the SSN8 gene whose mutation suppresses the light-dependent CWC1 overexpression phenotype. Characterization of sex-related phenotypes revealed that Ssn8 functions as a negative regulator in both heterothallic a-α mating and same-sex mating processes. In addition, Ssn8 is involved in the suppression of other physiological processes including invasive growth, and production of capsule and melanin. Interestingly, Ssn8 is also required for the maintenance of cell wall integrity and virulence. Our gene expression studies confirmed that deletion of SSN8 results in de-repression of genes involved in sexual development and melanization. Epistatic and yeast two hybrid studies suggest that C. neoformans Ssn8 plays critical roles downstream of the Cpk1 MAPK cascade and Ste12 and possibly resides at one of the major branches downstream of the Cwc complex in the light-mediated sexual development pathway. Taken together, our studies demonstrate that the conserved Mediator protein Ssn8 functions as a global regulator which negatively regulates diverse physiological and developmental processes and is required for virulence in C. neoformans. PMID:21559476

  11. Orbital transfer vehicle concept definition and system analysis study. Volume 4, Appendix A: Space station accommodations. Revision 1

    NASA Technical Reports Server (NTRS)

    Randall, Roger M.

    1987-01-01

    Orbit Transfer Vehicle (OTV) processing at the space station is divided into two major categories: OTV processing and assembly operations, and support operations. These categories are further subdivided into major functional areas to allow development of detailed OTV processing procedures and timelines. These procedures and timelines are used to derive the specific space station accommodations necessary to support OTV activities. The overall objective is to limit the impact of OTV processing requirements on space station operations, involvement of crew, and associated crew training and skill requirements. The operational concept maximizes use of automated and robotic systems to perform all required OTV servicing and maintenance tasks. Only potentially critical activities would require direct crew involvement or supervision. EVA operations are considered to be strictly contingency back-up to failure of the automated and robotic systems, with the exception of the initial assembly of Space-Based OTV accommodations at the space station, which will require manned involvement.

  12. Baseline LAW Glass Formulation Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina

    2013-06-13

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  13. Introduction to Session 5

    NASA Astrophysics Data System (ADS)

    Zullo, Luca; Snyder, Seth W.

    Production of bio-based products that are cost competitive in the marketplace requires well-developed operations that include innovative processes and separation solutions. Separations costs can make the difference between an interesting laboratory project and a successful commercial process. Bioprocessing and separations research and development addresses some of the most significant cost barriers in production of biofuels and bio-based chemicals. Models of integrated biorefineries indicate that success will require production of higher volume fuels in conjunction with high margin chemical products. Addressing the bioprocessing and separations cost barriers will be critical to the overall success of the integrated biorefinery.

  14. The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies

    NASA Technical Reports Server (NTRS)

    Mulqueen, Jack; Jones, David; Hopkins, Randy

    2011-01-01

    This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies including human space exploration missions, space transportation system studies and in-space science missions. The paper will describe the design team structure and specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis and subsystem analyses leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives from technology definition and requirements definition to preliminary design studies will be addressed. The paper will also describe the applicability of the collaborative engineering process to include an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.

  15. Japan-Specific Key Regulatory Aspects for Development of New Biopharmaceutical Drug Products.

    PubMed

    Desai, Kashappa Goud; Obayashi, Hirokazu; Colandene, James D; Nesta, Douglas P

    2018-03-28

    Japan represents the third largest pharmaceutical market in the world. Developing a new biopharmaceutical drug product for the Japanese market is a top business priority for global pharmaceutical companies while aligning with ethical drivers to treat more patients in need. Understanding Japan-specific key regulatory requirements is essential to achieve successful approvals. Understanding the full context of Japan-specific regulatory requirements/expectations is challenging to global pharmaceutical companies due to differences in language and culture. This article summarizes key Japan-specific regulatory aspects/requirements/expectations applicable to new drug development, approval, and postapproval phases. Formulation excipients should meet Japan compendial requirements with respect to the type of excipient, excipient grade, and excipient concentration. Preclinical safety assessments needed to support clinical phases I, II, and III development are summarized. Japanese regulatory authorities have taken appropriate steps to consider foreign clinical data, thereby enabling accelerated drug development and approval in Japan. Other important topics summarized in this article include: Japan new drug application-specific bracketing strategies for critical and noncritical aspects of the manufacturing process, regulatory requirements related to stability studies, release specifications and testing methods, standard processes involved in pre and postapproval inspections, management of postapproval changes, and Japan regulatory authority's consultation services available to global pharmaceutical companies. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  16. Processing of nutritious, safe and acceptable foods from CELSS candidate crops

    NASA Astrophysics Data System (ADS)

    Fu, B.; Nelson, P. E.; Irvine, R.; Kanach, L. L.

    A controlled ecological life-support system (CELSS) is required to sustain life for long-duration space missions. The challenge is preparing a wide variety of tasty, familiar, and nutritious foods from CELSS candidate crops under space environmental conditions. Conventional food processing technologies will have to be modified to adapt to the space environment. Extrusion is one of the processes being examined as a means of converting raw plant biomass into familiar foods. A nutrition-improved pasta has been developed using cowpea as a replacement for a portion of the durum semolina. A freeze-drying system that simulates the space conditions has also been developed. Other technologies that would fulfill the requirements of a CELSS will also be addressed.

  17. Processing of nutritious, safe and acceptable foods from CELSS candidate crops

    NASA Technical Reports Server (NTRS)

    Fu, B.; Nelson, P. E.; Irvine, R.; Kanach, L. L.; Mitchell, C. A. (Principal Investigator)

    1996-01-01

    A controlled ecological life-support system (CELSS) is required to sustain life for long-duration space missions. The challenge is preparing a wide variety of tasty, familiar, and nutritious foods from CELSS candidate crops under space environmental conditions. Conventional food processing technologies will have to be modified to adapt to the space environment. Extrusion is one of the processes being examined as a means of converting raw plant biomass into familiar foods. A nutrition-improved pasta has been developed using cowpea as a replacement for a portion of the durum semolina. A freeze-drying system that simulates the space conditions has also been developed. Other technologies that would fulfill the requirements of a CELSS will also be addressed.

  18. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 1B: Concise review

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.; Southall, J. W.; Kawaguchi, A. S.; Redhed, D. D.

    1973-01-01

    Reports on the design process, support of the design process, IPAD System design, catalog of IPAD technical program elements, IPAD System development and operation, and IPAD benefits and impact are concisely reviewed. The approach used to define the design is described. Major activities performed during the product development cycle are identified. The computer system requirements necessary to support the design process are given as computational requirements of the host system, technical program elements and system features. The IPAD computer system design is presented as concepts, a functional description and an organizational diagram of its major components. Costs, schedules, and a three-phase plan for IPAD implementation are presented. The benefits and impact of IPAD technology are discussed.

  19. Creating objective and measurable postgraduate year 1 residency graduation requirements.

    PubMed

    Starosta, Kaitlin; Davis, Susan L; Kenney, Rachel M; Peters, Michael; To, Long; Kalus, James S

    2017-03-15

    The process of developing objective and measurable postgraduate year 1 (PGY1) residency graduation requirements and a progress tracking system is described. The PGY1 residency accreditation standard requires that programs establish criteria that must be met by residents for successful completion of the program (i.e., graduation requirements), which should presumably be aligned with helping residents to achieve the purpose of residency training. In addition, programs must track a resident's progress toward fulfillment of residency goals and objectives. Defining graduation requirements and establishing the process for tracking residents' progress are left up to the discretion of the residency program. To help standardize resident performance assessments, leaders of an academic medical center-based PGY1 residency program developed graduation requirement criteria that are objective, measurable, and linked back to residency goals and objectives. A system for tracking resident progress relative to quarterly progress targets was instituted. Leaders also developed a focused, on-the-spot skills assessment termed "the Thunderdome," which was designed for objective evaluation of direct patient care skills. Quarterly data on residents' progress are used to update and customize each resident's training plan. Implementation of this system allowed seamless linkage of the training plan, the progress tracking system, and the specified graduation requirement criteria. PGY1 residency requirements that are objective, that are measurable, and that attempt to identify what skills the resident must demonstrate in order to graduate from the program were developed for use in our residency program. A system for tracking the residents' progress by comparing residents' performance to predetermined quarterly benchmarks was developed. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
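    The quarterly tracking described above can be illustrated with a small sketch; the objective names and benchmark numbers are invented for illustration, since the article does not publish its targets:

```python
def track_progress(completed, quarter, quarterly_targets):
    """Flag residency objectives that are behind their quarterly benchmark.

    `quarterly_targets[obj]` lists the cumulative completions expected by
    the end of quarters Q1..Q4.
    """
    behind = []
    for obj, targets in quarterly_targets.items():
        if completed.get(obj, 0) < targets[quarter - 1]:
            behind.append(obj)
    return sorted(behind)

targets = {
    "patient_care_workups": [5, 10, 15, 20],   # invented benchmarks
    "formal_presentations": [1, 2, 3, 4],
}
done = {"patient_care_workups": 8, "formal_presentations": 2}
print(track_progress(done, 2, targets))  # behind on workups at end of Q2
```

Comparing cumulative counts against per-quarter benchmarks is what lets the training plan be updated as soon as a resident falls behind, rather than at graduation.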

  20. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. The model is defined based on CMMI and empirical data from improvement practices. We evaluate the model using industrial data. PMID:24977170

  1. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. The model is defined based on CMMI and empirical data from improvement practices. We evaluate the model using industrial data.
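    The idea of identifying correlated process elements from assessment results can be sketched as follows; the practice names, ratings, and 0.8 threshold are illustrative assumptions, not the paper's actual model, which is grounded in CMMI and empirical improvement data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation of two equal-length rating sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def correlated_pairs(assessments, threshold=0.8):
    """Flag pairs of process elements whose assessment ratings co-vary.

    `assessments` maps a process element (e.g. a CMMI practice) to its
    rating across several assessed projects.  Strongly correlated pairs
    are candidates for joint treatment in an improvement plan.
    """
    names = sorted(assessments)
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = pearson(assessments[a], assessments[b])
            if abs(r) >= threshold:
                pairs.append((a, b, round(r, 2)))
    return pairs

# Hypothetical ratings (1-4) of three practices over five projects.
ratings = {
    "REQM.SP1.1": [1, 2, 2, 3, 4],
    "RD.SP1.1":   [1, 2, 3, 3, 4],
    "PPQA.SP1.1": [4, 3, 1, 2, 1],
}
print(correlated_pairs(ratings))
```

Here requirements management and requirements development co-vary strongly across projects, so an improvement plan that treats them independently would duplicate effort.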

  2. Using Life-Cycle Human Factors Engineering to Avoid $2.4 Million in Costs: Lessons Learned from NASA's Requirements Verification Process for Space Payloads

    NASA Technical Reports Server (NTRS)

    Carr, Daniel; Ellenberger, Rich

    2008-01-01

    The Human Factors Implementation Team (HFIT) process has been used to verify human factors requirements for NASA International Space Station (ISS) payloads since 2003, resulting in $2.4 million in avoided costs. This cost benefit has been realized by greatly reducing the need to process time-consuming formal waivers (exceptions) for individual requirements violations. The HFIT team, which includes astronauts and their technical staff, acts as the single source for human factors requirements integration of payloads. HFIT has the authority to provide inputs during early design phases, thus eliminating many potential requirements violations in a cost-effective manner. In those instances where it is not economically or technically feasible to meet the precise metric of a given requirement, HFIT can work with the payload engineers to develop common sense solutions and formally document that the resulting payload design does not materially affect the astronaut's ability to operate and interact with the payload. The HFIT process is fully ISO 9000 compliant and works concurrently with NASA's formal systems engineering work flow. Due to its success with payloads, the HFIT process is being adapted and extended to ISS systems hardware. Key aspects of this process are also being considered for NASA's Space Shuttle replacement, the Crew Exploration Vehicle.

  3. Shuttle mission simulator requirement report, volume 2, revision A

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    The training requirements of all mission phases for crews and ground support personnel are presented. The specifications are given for the design and development of the simulator, data processing systems, engine control, software, and systems integration.

  4. 15 CFR 923.3 - General requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...: resource protection, management of coastal development, and simplification of governmental processes. These...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS General § 923.3 General requirements. (a) The...

  5. 15 CFR 923.3 - General requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...: resource protection, management of coastal development, and simplification of governmental processes. These...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS General § 923.3 General requirements. (a) The...

  6. 15 CFR 923.3 - General requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...: resource protection, management of coastal development, and simplification of governmental processes. These...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS General § 923.3 General requirements. (a) The...

  7. 15 CFR 923.3 - General requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...: resource protection, management of coastal development, and simplification of governmental processes. These...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS General § 923.3 General requirements. (a) The...

  8. 15 CFR 923.3 - General requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...: resource protection, management of coastal development, and simplification of governmental processes. These...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS General § 923.3 General requirements. (a) The...

  9. Development of Airport Surface Required Navigation Performance (RNP)

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Smith, Alex; Hicok, Dan

    1999-01-01

    The U.S. and international aviation communities have adopted the Required Navigation Performance (RNP) process for defining aircraft performance when operating in the en-route, approach and landing phases of flight. RNP consists primarily of the following key parameters - accuracy, integrity, continuity, and availability. The processes and analytical techniques employed to define en-route, approach and landing RNP have been applied in the development of RNP for the airport surface. To validate the proposed RNP requirements, several methods were used. Operational and flight demonstration data were analyzed for conformance with proposed requirements, as were several aircraft flight simulation studies. The pilot failure risk component was analyzed through several hypothetical scenarios. Additional simulator studies are recommended to better quantify crew reactions to failures, as well as additional simulator and field testing to validate achieved accuracy performance. This research was performed in support of the NASA Low Visibility Landing and Surface Operations Programs.
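    The accuracy component of RNP can be illustrated with a toy conformance check on recorded navigation errors; the RNP value, containment level, and error samples below are hypothetical, and the integrity, continuity, and availability components are not modeled:

```python
def meets_rnp_accuracy(cross_track_errors_nm, rnp_value_nm, containment=0.95):
    """Check the RNP accuracy component against recorded navigation errors.

    RNP accuracy requires the total system error to stay within the RNP
    value (here in nautical miles) for at least 95% of flight time; the
    integrity component additionally bounds the probability of undetected
    errors beyond the containment limit (not modeled in this sketch).
    """
    within = sum(1 for e in cross_track_errors_nm if abs(e) <= rnp_value_nm)
    return within / len(cross_track_errors_nm) >= containment

# Hypothetical error samples checked against an assumed surface RNP of 0.1 nm.
errors = [0.02, -0.05, 0.08, 0.01, -0.09, 0.03, 0.12, -0.04, 0.06, 0.00,
          0.05, -0.07, 0.02, 0.04, -0.01, 0.09, 0.03, -0.02, 0.07, 0.01]
print(meets_rnp_accuracy(errors, rnp_value_nm=0.1))
```

Validating achieved accuracy, as the abstract recommends, amounts to collecting enough field and simulator samples for statistics like this to be meaningful.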

  10. Gaining insights into interrill soil erosion processes using rare earth element tracers

    USDA-ARS?s Scientific Manuscript database

    Increasing interest in developing process-based erosion models requires better understanding of the relationships among soil detachment, transportation, and deposition. The objectives are to 1) identify the limiting process between soil detachment and sediment transport for interrill erosion, 2) und...

  11. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of the system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While the ASPEN physical property calculation routines are capable of generating the physical properties required for process simulation, not all of the required physical property data are available; missing data must be user-entered.
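    The block-level material balance that such a system design model performs can be sketched at toy scale; the block names, components, and split fractions below are illustrative, not TVA facility data:

```python
def run_flowsheet(feed, blocks):
    """Propagate a component mass balance through a chain of process blocks.

    Each block is (name, recovery), where `recovery` maps a component to
    the fraction passed to the product stream; the remainder leaves as a
    byproduct.  This mirrors, at toy scale, the block-level material
    balances a system design model performs.
    """
    stream = dict(feed)
    byproducts = {}
    for name, recovery in blocks:
        out = {}
        for comp, mass in stream.items():
            frac = recovery.get(comp, 1.0)   # components not listed pass through
            out[comp] = mass * frac
            lost = mass * (1.0 - frac)
            if lost:
                byproducts[f"{name}:{comp}"] = byproducts.get(f"{name}:{comp}", 0.0) + lost
        stream = out
    return stream, byproducts

feed = {"coal": 100.0, "ash": 10.0}             # kg/h, illustrative
blocks = [
    ("gasifier", {"coal": 0.95, "ash": 0.0}),   # ash fully removed as slag
    ("cleanup",  {"coal": 0.98}),
]
product, losses = run_flowsheet(feed, blocks)
print(product)   # coal-derived stream remaining after both blocks
```

A full simulator layers energy balances, physical property packages, and equipment sizing on top of exactly this kind of stream bookkeeping.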

  12. Simulation in Metallurgical Processing: Recent Developments and Future Perspectives

    NASA Astrophysics Data System (ADS)

    Ludwig, Andreas; Wu, Menghuai; Kharicha, Abdellah

    2016-08-01

    This article briefly addresses the most important topics concerning numerical simulation of metallurgical processes, namely, multiphase issues (particle and bubble motion and flotation/sedimentation of equiaxed crystals during solidification), multiphysics issues (electromagnetic stirring, electro-slag remelting, Cu-electro-refining, fluid-structure interaction, and mushy zone deformation), process simulations on graphical processing units, integrated computational materials engineering, and automatic optimization via simulation. The present state-of-the-art as well as requirements for future developments are presented and briefly discussed.

  13. Technology development in support of the TWRS process flowsheet. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washenfelder, D.J.

    1995-10-11

    The Tank Waste Remediation System is to treat and dispose of Hanford's Single-Shell and Double-Shell Tank Waste. The TWRS Process Flowsheet (WHC-SD-WM-TI-613, Rev. 1) described a flowsheet based on a large number of assumptions and engineering judgements that require verification or further definition through process and technology development activities. This document builds on the TWRS Process Flowsheet to identify and prioritize tasks that should be completed to strengthen the technical foundation for the flowsheet.

  14. Engineering Tests of Experimental Ammonia Process Printer-Developer

    DTIC Science & Technology

    1950-07-06

    of materials and processes for photo reproduction by the ammonia process. c. It was expected that the new machine might also provide an interim...grease, oil, ammonia waste can, and attachments. A 6-inch diameter flexible tube is attached at the rear of the machine for carrying away the exhaust heat...by field troops. 2 TGIF 58 SUBJECT: Ammonia Process Equipment Developed Under Project 8-35-09-005 19 Jan 50 7. An early reply would be required in

  15. Model-centric approaches for the development of health information systems.

    PubMed

    Tuomainen, Mika; Mykkänen, Juha; Luostarinen, Heli; Pöyhölä, Assi; Paakkanen, Esa

    2007-01-01

    Modeling is used increasingly in healthcare to increase shared knowledge, to improve the processes, and to document the requirements of the solutions related to health information systems (HIS). There are numerous modeling approaches which aim to support these goals, but a careful assessment of their strengths, weaknesses and deficiencies is needed. In this paper, we compare three model-centric approaches in the context of HIS development: the Model-Driven Architecture, Business Process Modeling with BPMN and BPEL, and the HL7 Development Framework. The comparison reveals that all these approaches are viable candidates for the development of HIS. However, they have distinct strengths and abstraction levels, they require local and project-specific adaptation, and they offer varying levels of automation. In addition, illustration of the solutions to the end users must be improved.

  16. Knowledge-based requirements analysis for automating software development

    NASA Technical Reports Server (NTRS)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally-stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.
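    A toy stand-in for the paradigm: a declarative requirement is "transformed" into a configured implementation by selecting code templates. The spec keys and buffer behaviours below are invented for illustration and are far simpler than the prototype's domain-specific language:

```python
from collections import deque

def synthesize_buffer(spec):
    """Derive a buffer implementation from a declarative requirement.

    The spec names the required behaviour rather than an implementation;
    the "synthesis" step here just selects and configures code templates.
    """
    capacity = spec["capacity"]
    policy = spec.get("overflow", "block")

    class Buffer:
        def __init__(self):
            # drop_oldest maps naturally onto a bounded deque
            self._q = deque(maxlen=capacity if policy == "drop_oldest" else None)

        def put(self, item):
            if policy == "drop_newest" and len(self._q) >= capacity:
                return False                        # requirement: discard on overflow
            if policy == "block" and len(self._q) >= capacity:
                raise OverflowError("buffer full")  # caller must retry
            self._q.append(item)
            return True

        def get(self):
            return self._q.popleft()

    return Buffer()

buf = synthesize_buffer({"capacity": 2, "overflow": "drop_oldest"})
for x in [1, 2, 3]:
    buf.put(x)
print(buf.get(), buf.get())  # the oldest item was dropped to admit the newest
```

The point of the paradigm is that the engineer states `capacity` and `overflow` behaviour as requirements; the mapping from those requirements to working code is the system's job.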

  17. Requirements model for an e-Health awareness portal

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Nawi, Mohd Nasrun M.

    2016-08-01

    Requirements engineering is at the heart of the software engineering process. Poor quality requirements inevitably lead to poor quality software solutions, and poor requirements modeling is tantamount to designing a poor quality product. Quality-assured requirements development therefore goes hand in hand with usable products, giving the software the quality it demands. In light of the foregoing, the requirements for an e-Ebola Awareness Portal were modeled with careful attention to these software engineering concerns. The requirements for the e-Health Awareness Portal were modeled as a contribution to the fight against Ebola and toward fulfilling the United Nations' Millennium Development Goal No. 6. In this study, requirements were modeled using the UML 2.0 modeling technique.

  18. Mass production of silicon pore optics for ATHENA

    NASA Astrophysics Data System (ADS)

    Wille, Eric; Bavdaz, Marcos; Collon, Maximilien

    2016-07-01

    Silicon Pore Optics (SPO) provide high angular resolution with low effective area density, as required for the Advanced Telescope for High Energy Astrophysics (Athena). The x-ray telescope consists of several hundred SPO mirror modules. During development of the SPO process steps, the specific requirements of a future mass production were considered from the beginning. The manufacturing methods heavily utilise off-the-shelf equipment from the semiconductor industry, robotic automation and parallel processing. This allows the present production flow to be scaled up cost-effectively to produce hundreds of mirror modules per year. Based on manufacturing predictions from the current technology status, we present an analysis of the time and resources required for the Athena flight programme. This includes the full production process, starting with Si wafers and ending with the integration of the mirror modules. We present the times required for the individual process steps and identify the equipment required to produce two mirror modules per day. A preliminary timeline for building and commissioning the required infrastructure, and for flight model production of about 1000 mirror modules, is presented.
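The throughput and volume figures in the abstract imply a rough production timeline; the sketch below checks the arithmetic (the rate of two modules per day and the total of about 1000 modules come from the abstract, while the number of working days per year is an assumption):

```python
import math

MODULES_REQUIRED = 1000      # approximate flight-model count from the abstract
MODULES_PER_DAY = 2          # target production rate from the abstract
WORKING_DAYS_PER_YEAR = 230  # assumption: 5-day weeks minus holidays and maintenance

# Days of production needed, rounded up to whole working days
production_days = math.ceil(MODULES_REQUIRED / MODULES_PER_DAY)
production_years = production_days / WORKING_DAYS_PER_YEAR

print(f"{production_days} working days, about {production_years:.1f} years")
```

At two modules per day, roughly 500 working days (a little over two years of single-shift production) are needed, consistent with the abstract's hundreds-of-modules-per-year target.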

  19. Aerothermodynamic testing requirements for future space transportation systems

    NASA Technical Reports Server (NTRS)

    Paulson, John W., Jr.; Miller, Charles G., III

    1995-01-01

    Aerothermodynamics, encompassing aerodynamics, aeroheating, and fluid dynamic and physical processes, is the genesis for the design and development of advanced space transportation vehicles. It provides crucial information to other disciplines involved in the development process, such as structures, materials, propulsion, and avionics. Sources of aerothermodynamic information include ground-based facilities, computational fluid dynamic (CFD) and engineering computer codes, and flight experiments. All three are required to define optimal design requirements while reducing undue design conservatism, risk, and cost. This paper discusses the role of ground-based facilities in the design of future space transportation system concepts. Testing methodology is addressed, including the iterative approach often required for the assessment and optimization of configurations from an aerothermodynamic perspective. The influence of vehicle shape, and the transition from parametric studies for optimization to benchmark studies for final design and establishment of the flight data book, are discussed. Future aerothermodynamic testing requirements, including the need for new facilities, are also presented.

  20. Issues in NASA Program and Project Management. Special Edition: A Collection of Papers on NASA Procedures and Guidance 7120.5A. Volume 14

    NASA Technical Reports Server (NTRS)

    Hoffman, Edward J. (Editor); Lawbaugh, William M. (Editor)

    1998-01-01

    A key aspect of NASA's new Strategic Management System is improving the way we plan, approve, execute and evaluate our programs and projects. To this end, NASA has developed the NASA Program and Project Management Processes and Requirements-NASA Procedures and Guidelines (NPG) 7120.5A, which formally documents the "Provide Aerospace Products and Capabilities" crosscutting process and defines the processes and requirements that are responsive to the Program/Project Management-NPD 7120.4A. The Program/Project Management-NPD 7120.4A, issued November 14, 1996, provides the policy for managing programs and projects in a new way that is aligned with the new NASA environment. An Agencywide team has spent thousands of hours developing NPG 7120.5A. We have created significant flexibility, authority and discretion for the program and project managers to exercise in carrying out their duties, and have delegated to them the responsibility and accountability for their programs and projects.

  1. HALE UAS Command and Control Communications: Step 1 - Functional Requirements Document. Version 4.0

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The High Altitude Long Endurance (HALE) unmanned aircraft system (UAS) communicates with an off-board pilot-in-command in all flight phases via the C2 data link, making the link a critical component for the unmanned aircraft (UA) to fly safely and routinely in the NAS. This is a new requirement for current FAA communications planning and monitoring processes. This document provides a set of comprehensive C2 communications functional requirements and performance guidelines to help facilitate the future FAA certification process for civil UAS operating in the NAS. The guidelines are intended to enable validation of the functional requirements and, in the future, to support development of performance-level requirements.

  2. Process material management in the Space Station environment

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Humphries, W. R.

    1988-01-01

    The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.

  3. Recent progress in NASA Langley textile reinforced composites program

    NASA Technical Reports Server (NTRS)

    Dexter, H. Benson; Harris, Charles E.; Johnston, Norman J.

    1992-01-01

    The NASA LaRC is conducting and sponsoring research to explore the benefits of textile reinforced composites for civil transport aircraft primary structures. The objective of this program is to develop and demonstrate the potential of affordable textile reinforced composite materials to meet design properties and damage tolerance requirements of advanced aircraft structural concepts. In addition to in-house research, the program was recently expanded to include major participation by the aircraft industry and aerospace textile companies. The major program elements include development of textile preforms, processing science, mechanics of materials, experimental characterization of materials, and development and evaluation of textile reinforced composite structural elements and subcomponents. The NASA Langley in-house focus is as follows: development of a science-based understanding of resin transfer molding (RTM), development of powder-coated towpreg processes, analysis methodology, and development of a performance database on textile reinforced composites. The focus of the textile industry participation is on development of multidirectional, damage-tolerant preforms, and the aircraft industry participation is in the areas of design, fabrication and testing of textile reinforced composite structural elements and subcomponents. Textile processes such as 3D weaving, 2D and 3D braiding, and knitting/stitching are being compared with conventional laminated tape processes for improved damage tolerance. Through-the-thickness reinforcements offer significant damage tolerance improvements. However, these gains must be weighed against potential loss in in-plane properties such as strength and stiffness. Analytical trade studies are underway to establish design guidelines for the application of textile material forms to meet specific loading requirements. 
Fabrication and testing of large structural components are required to establish the full potential of textile reinforced composite materials.

  4. Lithium-Ion Batteries for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Surampudi, S.; Halpert, G.; Marsh, R. A.; James, R.

    1999-01-01

    This presentation reviews: (1) the goals and objectives, (2) the NASA and Air Force requirements, (3) the potential near-term missions, (4) the management approach, (5) the technical approach and (6) the program road map. The objectives of the program are to: (1) develop high specific energy, long-life lithium-ion cells and smart batteries for aerospace and defense applications, (2) establish domestic production sources, and (3) demonstrate technological readiness for various missions. The management approach is to encourage the teaming of universities, R&D organizations, and battery manufacturing companies, to build on existing commercial and government technology, and to develop two sources for manufacturing cells and batteries. The technical approach includes: (1) develop advanced electrode materials and electrolytes to achieve improved low-temperature performance and long cycle life, (2) optimize cell design to improve specific energy, cycle life and safety, (3) establish manufacturing processes to ensure predictable performance, (4) develop aerospace lithium-ion cells in various AH sizes and voltages, (5) develop electronics for smart battery management, (6) develop a performance database required for various applications, and (7) demonstrate technology readiness for the various missions. Charts reviewing the requirements for the Li-ion battery development program are presented.

  5. Requirements Development for Interoperability Simulation Capability for Law Enforcement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holter, Gregory M.

    2004-05-19

    The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for Research and Analysis to support the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as the deployment and use of the system. The original set of requirements was developed through a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure their accuracy. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders. 
This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of the NCC system. The paper also addresses the applicability of such an interoperability simulation capability to a broader set of law enforcement, border protection, site/facility security, and first-responder needs.

  6. Making the purchase decision: factors other than price.

    PubMed

    Lyons, D M

    1992-05-01

    Taking price out of the limelight and concentrating on customer relations, mutual respect, and build-in/buy-in; involving the user; developing communication and evaluation processes; and being process oriented to attain the results needed require commitment on the part of administration and materiel management. There must be a commitment of time to develop the process, commitment of resources to work through the process, and a commitment of support to enhance the process. With those three parameters in place, price will no longer be the only factor in the purchasing decision.

  7. Architecture for distributed design and fabrication

    NASA Astrophysics Data System (ADS)

    McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.

    1997-01-01

    We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semiconductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading-edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to realize new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.

  8. Mapping CMMI Level 2 to Scrum Practices: An Experience Report

    NASA Astrophysics Data System (ADS)

    Diaz, Jessica; Garbajosa, Juan; Calvo-Manzano, Jose A.

    CMMI has been adopted advantageously in large companies, yielding improvements in software quality, budget adherence, and customer satisfaction. However, SPI strategies based on CMMI-DEV require heavyweight software development processes and large investments of cost and time that small and medium-sized companies cannot afford. So-called lightweight software development processes, such as Agile Software Development (ASD), address these challenges. ASD welcomes changing requirements and stresses the importance of adaptive planning, simplicity and continuous delivery of valuable software in short, time-framed iterations. ASD is becoming increasingly attractive in a global and rapidly changing software market. It would therefore be greatly useful to be able to introduce agile methods such as Scrum in compliance with the CMMI process model. This paper aims to increase the understanding of the relationship between ASD and CMMI-DEV by reporting empirical results that confirm theoretical comparisons between ASD practices and CMMI Level 2.
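A mapping of the kind the paper reports can be represented as a simple lookup from CMMI-DEV Level 2 process areas to Scrum practices. The pairings below are an illustration based on commonly cited correspondences, not the paper's empirical results:

```python
# Hypothetical, illustrative mapping of CMMI-DEV Level 2 process areas
# to Scrum practices; the coverage judgments are assumptions for
# illustration, not this paper's findings.
CMMI_L2_TO_SCRUM = {
    "Requirements Management": ["Product Backlog", "Sprint Planning"],
    "Project Planning": ["Sprint Planning", "Release Planning"],
    "Project Monitoring and Control": ["Daily Scrum", "Sprint Burndown"],
    "Measurement and Analysis": ["Sprint Burndown", "Velocity Tracking"],
    "Process and Product Quality Assurance": ["Definition of Done", "Sprint Review"],
    "Configuration Management": [],       # Scrum prescribes no CM practice
    "Supplier Agreement Management": [],  # outside Scrum's scope
}

# Process areas with no mapped practice need complementary
# (non-Scrum) practices to reach Level 2 compliance.
uncovered = [area for area, practices in CMMI_L2_TO_SCRUM.items() if not practices]
print("Process areas needing complementary practices:", uncovered)
```

Such a table makes explicit which process areas an agile organization must cover with practices outside Scrum itself.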

  9. Systems Engineering in NASA's R&TD Programs

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2005-01-01

    Systems engineering is largely the analysis and planning that support the design, development, and operation of systems. The most common application of systems engineering is in guiding systems development projects that use a phased process of requirements, specifications, design, and development. This paper investigates how systems engineering techniques should be applied in research and technology development programs for advanced space systems. These programs should include anticipatory engineering of future space flight systems and a project portfolio selection process, as well as systems engineering for multiple development projects.

  10. Adaptive Signal Processing Testbed: VME-based DSP board market survey

    NASA Astrophysics Data System (ADS)

    Ingram, Rick E.

    1992-04-01

    The Adaptive Signal Processing Testbed (ASPT) is a real-time multiprocessor system utilizing digital signal processor technology on VMEbus-based printed circuit boards installed on a Sun workstation. The ASPT has specific requirements, particularly for the signal excision application, with respect to interfacing with current and planned data generation equipment, processing of the data, storage to disk of final and intermediate results, and the development tools for applications development and integration into the overall EW/COM computing environment. A prototype ASPT was implemented using three VME-C-30 boards from Applied Silicon. Experience gained during the prototype development led to the conclusion that interprocessor communications capability is the most significant contributor to overall ASPT performance. In addition, host involvement should be minimized. Boards using different processors were evaluated with respect to the ASPT system requirements, pricing, and availability. Specific recommendations based on various priorities are made, as well as recommendations concerning the integration and interaction of various tools developed during the prototype implementation.

  11. Hardware Development Process for Human Research Facility Applications

    NASA Technical Reports Server (NTRS)

    Bauer, Liz

    2000-01-01

    The simple goal of the Human Research Facility (HRF) is to conduct human research experiments on International Space Station (ISS) astronauts during long-duration missions. This is accomplished by providing integration and operation of the necessary hardware and software capabilities. A typical hardware development flow consists of five stages: functional inputs and requirements definition, market research, design life cycle through hardware delivery, crew training, and mission support. The purpose of this presentation is to guide the audience through the early hardware development process: requirement definition through selecting a development path. Specific HRF equipment is used to illustrate the hardware development paths. The sources of hardware requirements are the science community and the HRF program. The HRF Science Working Group, consisting of scientists from various medical disciplines, defined a basic set of equipment with functional requirements. This established the performance requirements of the hardware. HRF program requirements focus on making the hardware safe and operational in a space environment. This includes structural, thermal, human factors, and material requirements. Science and HRF program requirements are defined in a hardware requirements document which includes verification methods. Once the hardware is fabricated, requirements are verified by inspection, test, analysis, or demonstration. All data is compiled and reviewed to certify the hardware for flight. Obviously, the basis for all hardware development activities is requirement definition. Full and complete requirement definition is ideal prior to initiating hardware development. However, this is generally not the case, and the hardware team typically has functional inputs as a guide. The first step is for engineers to conduct market research based on the functional inputs provided by scientists. 
Commercially available products are evaluated against the science requirements as well as the modifications needed to meet program requirements. Options are consolidated and the hardware development team reaches a hardware development decision point. Within budget and schedule constraints, the team must decide whether to complete the hardware as an in-house, subcontracted, or commercial-off-the-shelf (COTS) development. An in-house development indicates NASA personnel or a contractor builds the hardware at a NASA site. A subcontracted development is completed off-site by a commercial company. A COTS item is a vendor product available by ordering a specific part number. The team evaluates the pros and cons of each development path. For example, in-house developments utilize existing corporate knowledge regarding how to build equipment for use in space. However, technical expertise would be required to fully understand the medical equipment capabilities, such as for an ultrasound system. It may require additional time and funding to gain the expertise that commercially exists. The major benefit of subcontracting a hardware development is that the product is delivered as an end-item and commercial expertise is utilized. On the other hand, NASA has limited control over schedule delays. The final option of COTS or modified COTS equipment is a compromise between in-house and subcontracted development. A vendor product may exist that meets all functional requirements but requires in-house modifications for successful operation in a space environment. The HRF utilizes equipment developed using all of the paths described: in-house, subcontract, and modified COTS.

  12. Framework for establishing records control in hospitals as an ISO 9001 requirement.

    PubMed

    Al-Qatawneh, Lina

    2017-02-13

    Purpose: The purpose of this paper is to present the process followed to control records in a Jordanian private community hospital as an ISO 9001:2008 standard requirement. Design/methodology/approach: Under the hospital quality council's supervision, the quality management and development office staff were responsible for designing, planning and implementing the quality management system (QMS) using the ISO 9001:2008 standard. A policy for records control was established. An action plan for establishing records control was developed and implemented. On completion, a coding system for records was specified for use by hospital staff. Finally, an internal audit was performed to verify conformity to the ISO 9001:2008 standard requirements. Findings: Successful certification by a neutral body ascertained that the hospital's QMS conformed to the ISO 9001:2008 requirements. A framework was developed that describes the records control process, which can be used by staff in any healthcare organization seeking ISO 9001:2008 certification. Originality/value: Given the increased interest among healthcare organizations in achieving ISO 9001 certification, the proposed framework for establishing records control is expected to be a valuable management tool for improving and sustaining healthcare quality.
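A records coding system of the kind described typically assigns each controlled record a structured identifier. The scheme below is a hypothetical sketch: the `record_code` helper and the department and record-type abbreviations are invented for illustration, not the hospital's actual coding system:

```python
def record_code(department: str, record_type: str, sequence: int) -> str:
    """Build a structured record identifier, e.g. department-type-sequence.

    Hypothetical scheme: 3-letter department code, 3-letter record-type
    code, zero-padded 3-digit sequence number.
    """
    return f"{department[:3].upper()}-{record_type[:3].upper()}-{sequence:03d}"

# Example: a quality department's first policy record
print(record_code("Quality", "Policy", 1))  # → QUA-POL-001
```

A fixed code structure like this lets staff file, retrieve, and audit records consistently, which is the practical point of the ISO 9001 records-control requirement.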

  13. The Toll pathway is required in the epidermis for muscle development in the Drosophila embryo

    NASA Technical Reports Server (NTRS)

    Halfon, M. S.; Keshishian, H.

    1998-01-01

    The Toll signaling pathway functions in several Drosophila processes, including dorsal-ventral pattern formation and the immune response. Here, we demonstrate that this pathway is required in the epidermis for proper muscle development. Previously, we showed that the zygotic Toll protein is necessary for normal muscle development; in the absence of zygotic Toll, close to 50% of hemisegments have muscle patterning defects consisting of missing, duplicated and misinserted muscle fibers (Halfon, M.S., Hashimoto, C., and Keshishian, H., Dev. Biol. 169, 151-167, 1995). We have now also analyzed the requirements for easter, spatzle, tube, and pelle, all of which function in the Toll-mediated dorsal-ventral patterning pathway. We find that spatzle, tube, and pelle, but not easter, are necessary for muscle development. Mutations in these genes give a phenotype identical to that seen in Toll mutants, suggesting that elements of the same pathway used for Toll signaling in dorsal-ventral development are used during muscle development. By expressing the Toll cDNA under the control of distinct Toll enhancer elements in Toll mutant flies, we have examined the spatial requirements for Toll expression during muscle development. Expression of Toll in a subset of epidermal cells that includes the epidermal muscle attachment cells, but not Toll expression in the musculature, is necessary for proper muscle development. Our results suggest that signals received by the epidermis early during muscle development are an important part of the muscle patterning process.

  14. C-C1-04: Building a Health Services Information Technology Research Environment

    PubMed Central

    Gehrum, David W; Jones, JB; Romania, Gregory J; Young, David L; Lerch, Virginia R; Bruce, Christa A; Donkochik, Diane; Stewart, Walter F

    2010-01-01

    Background: The electronic health record (EHR) has opened a new era for health services research (HSR) where information technology (IT) is used to re-engineer care processes. While the EHR provides one means of advancing novel solutions, a promising strategy is to develop tools (e.g., online questionnaires, visual display tools, decision support) distinct from, but which interact with, the EHR. Development of such software tools outside the EHR offers an advantage in flexibility, sophistication, and ultimately in portability to other settings. However, institutional IT departments have an imperative to protect patient data and to standardize IT processes to ensure system-level security and support traditional business needs. Such imperatives usually present formidable process barriers to testing novel software solutions. We describe how, in collaboration with our IT department, we are creating an environment and a process that allows for routine and rapid testing of novel software solutions. Methods: We convened a working group consisting of IT and research personnel with expertise in information security, database design/management, web design, EHR programming, and health services research. The working group was tasked with developing a research IT environment to accomplish two objectives: maintain network/ data security and regulatory compliance; allow researchers working with external vendors to rapidly prototype and, in a clinical setting, test web-based tools. Results: Two parallel solutions, one focused on hardware, the second on oversight and management, were developed. First, we concluded that three separate, staged development environments were required to allow external vendor access for testing software and for transitioning software to be used in a clinic. 
In parallel, the extant oversight process for approving/managing access for internal/external personnel had to be altered to reflect the scope and scale of discrete research projects, as opposed to an enterprise-level approach to IT management. Conclusions: Innovation in health services software development requires a flexible, scalable IT environment adapted to the unique objectives of an HSR software development model. In our experience, implementing the hardware solution is less challenging than the cultural change required to implement such a model and the modifications to administrative and oversight processes needed to sustain an environment for rapid product development and testing.

  15. An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements

    PubMed Central

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to budgetary deadlines and time-to-market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of the requirements that need to be considered first during the software development process. To achieve a high-quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed, no particular method or approach considers both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. Applying the proposed approach produces two separate prioritized lists of functional and nonfunctional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment comparing it with two state-of-the-art approaches, the analytic hierarchy process (AHP) and the hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in actual time consumption while the quality of its results remains in close agreement with those produced by the other two approaches. PMID:24982987

  16. An approach for integrating the prioritization of functional and nonfunctional requirements.

    PubMed

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to budgetary deadlines and time-to-market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of the requirements that need to be considered first during the software development process. To achieve a high-quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed, no particular method or approach considers both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. Applying the proposed approach produces two separate prioritized lists of functional and nonfunctional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment comparing it with two state-of-the-art approaches, the analytic hierarchy process (AHP) and the hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in actual time consumption while the quality of its results remains in close agreement with those produced by the other two approaches.
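AHP, one of the baseline methods compared here, derives requirement priorities from pairwise importance judgments. Below is a minimal sketch using the row geometric-mean approximation of AHP's principal-eigenvector weights; the requirement names and judgment values are invented for illustration:

```python
import math

# Pairwise comparison matrix on Saaty's 1-9 scale (illustrative values):
# entry [i][j] = how much more important requirement i is than requirement j.
requirements = ["login", "search", "export"]
comparisons = [
    [1.0, 3.0, 5.0],   # login  vs (login, search, export)
    [1/3, 1.0, 3.0],   # search vs ...
    [1/5, 1/3, 1.0],   # export vs ...
]

# Geometric-mean method: take each row's geometric mean,
# then normalize the means so the weights sum to 1.
geo_means = [math.prod(row) ** (1 / len(row)) for row in comparisons]
total = sum(geo_means)
weights = [g / total for g in geo_means]

# Rank requirements by descending weight.
ranking = sorted(zip(requirements, weights), key=lambda rw: -rw[1])
for name, w in ranking:
    print(f"{name}: {w:.3f}")
```

With these judgments, `login` receives the highest weight. A full AHP implementation would also compute the consistency ratio to check that the pairwise judgments are not self-contradictory.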

  17. [The White Paper of the health professions of Catalonia].

    PubMed

    Pomés, Xavier; Oriol, Albert; de Oleza, Rafael; Ania, Olinda; Avila, Alicia; Branda, Luis; Brugulat, Pilar; Gual, Arcadi; Creus, Mariona; Zurro, Amando Martin

    2003-01-01

    The White Paper of the Health Professions of Catalonia (WPHPC) is a strategic document for the development of the health professions. It deals with the main components of workforce development (education, management and planning) in relation to the health services development required to attain the objectives defined in the Catalan Health Plan. The WPHPC fosters coherence between social needs and the professional competencies required to respond to them, as well as the quantitative aspects of service needs under adequate standards of quality, effectiveness and efficiency. The WPHPC followed a methodological process with maximum stakeholder participation and transparency; citizens, professionals and health organizations contributed significantly. The conclusions and recommendations of the WPHPC are organized around four axes: the citizenship, the professionals, the health care organizations and the health care model. Key elements are: the requirement of a new social contract between the different stakeholders, the values of professionalism, the need for a new credentialing of professional competencies, innovation in the education process, innovation of governance and management for the organization of knowledge, the redistribution of work inside teams (requiring deregulation and re-regulation of the professions), the need for up-to-date data on workforce and job positions, and the ongoing need for sociological research.

  18. The NASA Commercial Crew Program (CCP) Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy

    2016-01-01

    In 2010, NASA established the Commercial Crew Program in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge for NASA has been how to determine that a commercial provider's transportation system complies with Programmatic safety requirements. The process used in this determination is the Safety Technical Review Board, which reviews and approves provider-submitted Hazard Reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100 percent of these safety-critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (SMA) model does not support the nature of the Commercial Crew Program. To that end, NASA SMA is implementing a Risk Based Assurance (RBA) process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is being developed to efficiently use the available resources to execute the verifications. This paper will describe the evolution of the CCP Mission Assurance process from the beginning of the Program to its current incarnation. Topics to be covered include a short history of the CCP; the development of the Programmatic mission assurance requirements; the current safety review process; and a description of the RBA process and its products, ending with a description of the Shared Assurance Model.

  19. Improvement in recording and reading holograms

    NASA Technical Reports Server (NTRS)

    Hallock, J. N.

    1968-01-01

    Three-beam technique superimposes a number of patterns in the same plane of a hologram and then uniquely identifies each pattern by a suitable readout process. The developed readout process does not require any movement of parts.

  20. Re-Engineering the Curriculum at a Rural Institution: Reflection on the Process of Development

    ERIC Educational Resources Information Center

    Naude, A.; Wium, A. M.; du Plessis, S.

    2011-01-01

    The Department of Speech-Language Pathology and Audiology at the University of Limpopo (Medunsa Campus) redesigned their curriculum at the beginning of 2010. The template that was developed shows the horizontal and vertical integration of outcomes. Although the outcomes of the entire process met the requirements of the Health Professions Council…

  1. Designing and Demonstrating a Master Student Project to Explore Carbon Dioxide Capture Technology

    ERIC Educational Resources Information Center

    Asherman, Florine; Cabot, Gilles; Crua, Cyril; Estel, Lionel; Gagnepain, Charlotte; Lecerf, Thibault; Ledoux, Alain; Leveneur, Sebastien; Lucereau, Marie; Maucorps, Sarah; Ragot, Melanie; Syrykh, Julie; Vige, Manon

    2016-01-01

    The rise in carbon dioxide (CO[subscript 2]) concentration in the Earth's atmosphere, and the associated strengthening of the greenhouse effect, requires the development of low carbon technologies. New carbon capture processes are being developed to remove CO[subscript 2] that would otherwise be emitted from industrial processes and fossil fuel…

  2. Study for Identification of Beneficial Uses of Space (BUS). Volume 2: Technical report. Book 4: Development and business analysis of space processed surface acoustic wave devices

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Preliminary development plans, analysis of required R and D and production resources, the costs of such resources, and, finally, the potential profitability of a commercial space processing opportunity for the production of very high frequency surface acoustic wave devices are presented.

  3. SCOOP: A Measurement and Database of Student Online Search Behavior and Performance

    ERIC Educational Resources Information Center

    Zhou, Mingming

    2015-01-01

    The ability to access and process massive amounts of online information is required in many learning situations. In order to develop a better understanding of students' online search processes, especially in academic contexts, an online tool (SCOOP) was developed for tracking mouse behavior on the web to build a more extensive account of student web…

  4. Development of advanced Czochralski growth process to produce low cost 150 kg silicon ingots from a single crucible for technology readiness

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The design and development of an advanced Czochralski crystal grower are described. Equipment specifications for several exhaust gas analysis systems are discussed. Process control requirements were defined, and design work began on melt temperature, melt level, and continuous diameter control. Sensor development included assembly and testing of a bench prototype of a diameter scanner system.

  5. Gsflow-py: An integrated hydrologic model development tool

    NASA Astrophysics Data System (ADS)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHMs) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model-resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorological parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHMs has required extensive GIS and computer programming expertise, which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular-grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
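    One of the laborious steps such a toolkit automates is distributing sparse meteorological observations over the model grid. A minimal sketch of one common scheme, inverse-distance weighting (the gauge data and function below are illustrative, not the GSFLOW toolkit's actual API):

```python
def idw_interpolate(stations, cell_xy, power=2.0):
    """Inverse-distance-weighted estimate at one model cell from
    (x, y, value) station tuples; a common scheme for spreading
    sparse meteorological observations over a model grid."""
    num = den = 0.0
    for sx, sy, value in stations:
        d2 = (sx - cell_xy[0]) ** 2 + (sy - cell_xy[1]) ** 2
        if d2 == 0.0:
            return value          # cell coincides with a station
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

# Three hypothetical precipitation gauges (x, y, mm/day):
gauges = [(0.0, 0.0, 10.0), (10.0, 0.0, 20.0), (0.0, 10.0, 30.0)]

# Estimate precipitation on a 3x3 grid with 5-unit cell spacing:
grid = [[idw_interpolate(gauges, (i * 5.0, j * 5.0)) for i in range(3)]
        for j in range(3)]
print(round(grid[0][0], 2))  # cell at a gauge location returns its value
```

    In complex terrain a plain distance weighting like this is usually augmented with elevation lapse rates, which is part of what makes automated, reproducible parameterization valuable.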

  6. Evaluation of ERDA-sponsored coal feed system development

    NASA Technical Reports Server (NTRS)

    Phen, R. L.; Luckow, W. K.; Mattson, L.; Otth, D.; Tsou, P.

    1977-01-01

    Coal feeders were evaluated based upon criteria such as technical feasibility, performance (i.e. ability to meet process requirements), projected life cycle costs, and projected development cost. An initial set of feeders was selected based on the feeders' cost savings potential compared with baseline lockhopper systems. Additional feeders were considered for selection based on: (1) increasing the probability of successful feeder development; (2) application to specific processes; and (3) technical merit. A coal feeder development program is outlined.

  7. Contamination monitoring approaches for EUV space optics

    NASA Technical Reports Server (NTRS)

    Ray, David C.; Malina, Roger F.; Welsh, Barry J.; Battel, Steven J.

    1989-01-01

    Data from contaminant-induced UV optics degradation studies and particulate models are used here to develop end-of-service-life instrument contamination requirements which are very stringent but achievable. The budget is divided into allocations for each phase of hardware processing. Optical and nonoptical hardware are monitored for particulate and molecular contamination during initial cleaning and baking, assembly, test, and calibration phases. The measured contamination levels are compared to the requirements developed for each phase to provide confidence that the required end-of-life levels will be met.
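    The phase-by-phase budgeting described above amounts to checking that the per-phase allocations sum to no more than the end-of-life requirement. A minimal sketch (the phase names, units, and numeric limits are hypothetical, not the instrument's actual budget):

```python
def within_budget(phase_allocations, end_of_life_limit):
    """Check a contamination budget: the sum of per-phase allocations
    must not exceed the end-of-service-life requirement. As hardware is
    processed, allocations would be replaced by measured levels."""
    total = sum(phase_allocations.values())
    return total <= end_of_life_limit, total

# Hypothetical molecular-film budget in ug/cm^2 across hardware phases:
budget = {"cleaning": 0.02, "assembly": 0.05, "test": 0.03, "calibration": 0.02}
ok, total = within_budget(budget, end_of_life_limit=0.15)
print(ok, total)
```

    Comparing measured levels against each phase's allocation, rather than only against the end-of-life total, is what gives early confidence that the final requirement will be met.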

  8. Framework Requirements for MDO Application Development

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Townsend, J. C.

    1999-01-01

    Frameworks or problem solving environments that support application development form an active area of research. The Multidisciplinary Optimization Branch at NASA Langley Research Center is investigating frameworks for supporting multidisciplinary analysis and optimization research. The Branch has generated a list of framework requirements, based on the experience gained from the Framework for Interdisciplinary Design Optimization project and the information acquired during a framework evaluation process. In this study, four existing frameworks are examined against these requirements. The results of this examination suggest several topics for further framework research.

  9. Functional requirements for onboard management of space shuttle consumables, volume 2.

    NASA Technical Reports Server (NTRS)

    Graf, P. J.; Herwig, H. A.; Neel, L. W.

    1973-01-01

    A study was conducted to develop the functional requirements for onboard management of space shuttle consumables. A specific consumables management concept for the space shuttle vehicle was developed and the functional requirements for the onboard portion of the concept were generated. Consumables management is the process of controlling or influencing the usage of expendable materials involved in vehicle subsystem operation. The subsystems considered in the study are: (1) propulsion, (2) power generation, and (3) environmental and life support.

  10. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  11. A knowledge-based approach to configuration layout, justification, and documentation

    NASA Technical Reports Server (NTRS)

    Craig, F. G.; Cutts, D. E.; Fennel, T. R.; Case, C.; Palmer, J. R.

    1990-01-01

    The design, development, and implementation is described of a prototype expert system which could aid designers and system engineers in the placement of racks aboard modules on Space Station Freedom. This type of problem is relevant to any program with multiple constraints and requirements demanding solutions which minimize usage of limited resources. This process is generally performed by a single, highly experienced engineer who integrates all the diverse mission requirements and limitations, and develops an overall technical solution which meets program and system requirements with minimal cost, weight, volume, power, etc. This system architect performs an intellectual integration process in which the underlying design rationale is often not fully documented. This is a situation which lends itself to an expert system solution for enhanced consistency, thoroughness, documentation, and change assessment capabilities.

  12. A Knowledge-Based Approach to Configuration Layout, Justification, and Documentation

    NASA Technical Reports Server (NTRS)

    Craig, F. G.; Cutts, D. E.; Fennel, T. R.; Case, C. M.; Palmer, J. R.

    1991-01-01

    The design, development, and implementation of a prototype expert system which could aid designers and system engineers in the placement of racks aboard modules on the Space Station Freedom are described. This type of problem is relevant to any program with multiple constraints and requirements demanding solutions which minimize usage of limited resources. This process is generally performed by a single, highly experienced engineer who integrates all the diverse mission requirements and limitations, and develops an overall technical solution which meets program and system requirements with minimal cost, weight, volume, power, etc. This system architect performs an intellectual integration process in which the underlying design rationale is often not fully documented. This is a situation which lends itself to an expert system solution for enhanced consistency, thoroughness, documentation, and change assessment capabilities.

  13. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in a collaboration, and its absence is now considered a major obstacle. The research work presented in this paper aims to develop an approach for specifying and verifying a set of interoperability requirements to be satisfied by each partner in a collaborative process prior to process implementation. Verifying these interoperability requirements first requires a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. A verification technique must then be introduced, and model checking is the preferred option herein. This paper focuses on applying the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. The first step entails translating the collaborative process model from BPMN into the UPPAAL modelling language 'Network of Timed Automata'. The second is formalising the interoperability requirements as properties in the dedicated UPPAAL language, the temporal logic TCTL.
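    The translation step can be pictured as mapping each BPMN task onto a small timed automaton and collecting the results into a network over which TCTL properties are stated. A minimal sketch (the location names, clock guard, and mapping are illustrative, not the paper's actual translation rules or UPPAAL's file format):

```python
from dataclasses import dataclass, field

@dataclass
class TimedAutomaton:
    """A minimal stand-in for one UPPAAL process: locations, clocks, and
    edges, with guards and synchronisations kept as strings as in UPPAAL."""
    name: str
    locations: list
    initial: str
    clocks: list = field(default_factory=list)
    edges: list = field(default_factory=list)   # (src, guard, sync, dst)

def bpmn_task_to_automaton(task_name, max_duration):
    """Illustrative translation of one BPMN task: idle -> busy on a start
    synchronisation, busy -> done while the clock is within the deadline."""
    return TimedAutomaton(
        name=task_name,
        locations=["idle", "busy", "done"],
        initial="idle",
        clocks=["x"],
        edges=[
            ("idle", "true", f"start_{task_name}?", "busy"),
            ("busy", f"x <= {max_duration}", f"end_{task_name}!", "done"),
        ],
    )

# A two-task collaborative process becomes a network of two automata, and
# an interoperability requirement becomes a TCTL property over the network:
network = [bpmn_task_to_automaton("SendOrder", 5),
           bpmn_task_to_automaton("ConfirmOrder", 3)]
requirement = "A<> ConfirmOrder.done"  # "confirmation always eventually completes"
print(len(network), requirement)
```

    In UPPAAL the network would be composed in parallel and the TCTL query checked by the model checker; the sketch only shows the shape of the artifacts the translation must produce.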

  14. Evaluation of the user requirements processes for NASA terrestrial applications programs

    NASA Technical Reports Server (NTRS)

    1982-01-01

    To support the evolution of increasingly sound user requirements definition processes that would meet the broad range of NASA's terrestrial applications planning and management needs during the 1980's, the user requirements processes as they function in the real world at the senior and middle management levels were evaluated. Special attention was given to geologic mapping and domestic crop reporting to provide insight into problems associated with the development and management of user established conventional practices and data sources. An attempt was made to identify alternative NASA user interfaces that sustain strengths, alleviate weaknesses, maximize application to multiple problems, and simplify management cognizance. Some of the alternatives are outlined and evaluated. It is recommended that NASA have an identified organizational point of focus for consolidation and oversight of the user processes.

  15. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  16. Math Process Standards Series, Grades 3-5

    ERIC Educational Resources Information Center

    O'Connell, Susan, Ed.

    2008-01-01

    NCTM's Process Standards support teaching that helps upper elementary level children develop independent, effective mathematical thinking. The books in the Heinemann Math Process Standards Series give every intermediate-grades teacher the opportunity to explore each standard in depth. With language and examples that don't require prior math…

  17. Applying Business Process Reengineering to the Marine Corps Information Assurance Certification and Accreditation Process

    DTIC Science & Technology

    2009-09-01

    ... models are explored using the Knowledge Value Added (KVA) methodology, and the most efficient model is developed and validated by applying it to the current IA C&A process flow at the TSO-KC. Finally, ... models requires only one available actor from its respective group, rather than all actors in the group, to ...

  18. Functional and performance requirements of the next NOAA-Kansas City computer system

    NASA Technical Reports Server (NTRS)

    Mosher, F. R.

    1985-01-01

    The development of the Advanced Weather Interactive Processing System for the 1990's (AWIPS-90) will result in more timely and accurate forecasts with improved cost effectiveness. As part of the AWIPS-90 initiative, the National Meteorological Center (NMC), the National Severe Storms Forecast Center (NSSFC), and the National Hurricane Center (NHC) are to receive upgrades of interactive processing systems. This National Center Upgrade program will support the specialized inter-center communications, data acquisition, and processing needs of these centers. The missions, current capabilities and general functional requirements for the upgrade to the NSSFC are addressed. System capabilities are discussed along with the requirements for the upgraded system.

  19. Data Quality Objectives for Regulatory Requirements for Dangerous Waste Sampling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MULKEY, C.H.

    1999-07-02

    This document describes sampling and analytical requirements needed to meet state and federal regulations for dangerous waste (DW). The River Protection Project (RPP) is assigned to the task of storage and interim treatment of hazardous waste. Any final treatment or disposal operations, as well as requirements under the land disposal restrictions (LDRs), fall in the jurisdiction of another Hanford organization and are not part of this scope. The requirements for this Data Quality Objective (DQO) Process were developed using the RPP Data Quality Objective Procedure (Banning 1996), which is based on the U.S. Environmental Protection Agency's (EPA) Guidance for the Data Quality Objectives Process (EPA 1994). Hereafter, this document is referred to as the DW DQO. Federal and state laws and regulations pertaining to waste contain requirements that are dependent upon the composition of the waste stream. These regulatory drivers require that pertinent information be obtained. For many requirements, documented process knowledge of a waste composition can be used instead of analytical data to characterize or designate a waste. When process knowledge alone is used to characterize a waste, it is a best management practice to validate the information with analytical measurements.

  20. How inverse solver technologies can support die face development and process planning in the automotive industry

    NASA Astrophysics Data System (ADS)

    Huhn, Stefan; Peeling, Derek; Burkart, Maximilian

    2017-10-01

    With the availability of die face design tools and incremental solver technologies that provide detailed forming feasibility results in a timely fashion, the value of inverse solver technologies, and the process improvements they enable during the product development of stamped parts, is often underestimated. This paper presents some applications of inverse technologies that are currently used in the automotive industry to streamline the product development process and greatly increase the quality of a developed process and the resulting product. The first focus is on the so-called target strain technology. Application examples will show how inverse forming analysis can be applied to support the process engineer during the development of a die face geometry for Class `A' panels. The drawing process is greatly affected by the die face design, and the process designer has to ensure that the resulting drawn panel will meet specific requirements regarding surface quality and a minimum strain distribution to ensure dent resistance. The target strain technology provides almost immediate feedback to the process engineer during the die face design process on whether a given change to the die face design will help achieve these requirements or be counterproductive. The paper will further show how an optimization of the material flow can be achieved through a newly developed technology called Sculptured Die Face (SDF). The die face generation in SDF is better suited for use in optimization loops than any conventional die face design technology based on cross-section design. A second focus in this paper is on the use of inverse solver technologies for secondary forming operations. The paper will show how inverse technology can be applied to accurately and quickly develop trim lines on simple as well as complex support geometries.

  1. Conversion from Tree to Graph Representation of Requirements

    NASA Technical Reports Server (NTRS)

    Mayank, Vimal; Everett, David Frank; Shmunis, Natalya; Austin, Mark

    2009-01-01

    A procedure and software to implement the procedure have been devised to enable conversion from a tree representation to a graph representation of the requirements governing the development and design of an engineering system. The need for this procedure and software and for other requirements-management tools arises as follows: In systems-engineering circles, it is well known that requirements-management capability improves the likelihood of success in the team-based development of complex systems involving multiple technological disciplines. It is especially desirable to be able to visualize (in order to identify and manage) requirements early in the system-design process, when errors can be corrected most easily and inexpensively.
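    The core of such a tree-to-graph conversion is merging nodes that denote the same requirement, so a requirement referenced from several branches becomes one shared node with multiple incoming edges. A minimal sketch (the node schema and requirement names are hypothetical, not the tool's actual data model):

```python
def tree_to_graph(tree):
    """Convert a requirements tree into a graph by merging nodes with the
    same identifier: a requirement duplicated under several branches of
    the tree becomes a single shared node in the resulting graph."""
    nodes = {}     # node id -> label
    edges = set()  # (parent_id, child_id) pairs; a set de-duplicates

    def visit(node, parent_id=None):
        node_id = node["id"]
        nodes[node_id] = node["label"]
        if parent_id is not None:
            edges.add((parent_id, node_id))
        for child in node.get("children", []):
            visit(child, node_id)

    visit(tree)
    return nodes, edges

# Hypothetical tree in which requirement R3 appears under both R1 and R2:
tree = {"id": "SYS", "label": "System", "children": [
    {"id": "R1", "label": "Power", "children": [
        {"id": "R3", "label": "Thermal margin"}]},
    {"id": "R2", "label": "Avionics", "children": [
        {"id": "R3", "label": "Thermal margin"}]},
]}

nodes, edges = tree_to_graph(tree)
print(len(nodes), len(edges))  # 4 nodes, 4 edges: R3 is shared, not duplicated
```

    The graph form makes shared requirements visible as nodes with several parents, which is exactly the dependency information a tree representation hides.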

  2. Coal conversion processes and analysis methodologies for synthetic fuels production. [technology assessment and economic analysis of reactor design for coal gasification

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support the design and implementation of coal-based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal-based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of the technology and data needed to improve gasification feasibility and economics are examined.

  3. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption promises to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes.
The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes to the new hardware and software, typified by hierarchical and disparate layers of compute, memory and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable, as the process becomes more developed and mature, and the process this year has identified clear and concrete steps to be taken. 
This report discloses (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.

  4. Emerging freeze-drying process development and scale-up issues.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael J

    2011-03-01

    Although several guidelines do exist for freeze-drying process development and scale-up, there are still a number of issues that require additional attention. The objective of this review article is to discuss some emerging process development and scale-up issues, with emphasis on the effect of load conditions and on freeze-drying in novel container systems such as syringes, Lyoguard trays, ampoules, and 96-well plates. Understanding heat and mass transfer under different load conditions and in these novel container systems will help in developing a robust freeze-drying process that is also easier to scale up. Further research and development needs in these emerging areas are also addressed. © 2011 American Association of Pharmaceutical Scientists

  5. Trade-off analysis of modes of data handling for earth resources (ERS), volume 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Data handling requirements are reviewed for earth observation missions along with likely technology advances. Parametric techniques for synthesizing potential systems are developed. Major tasks include: (1) review of the sensors under development and extensions of or improvements in these sensors; (2) development of mission models for missions spanning land, ocean, and atmosphere observations; (3) summary of data handling requirements including the frequency of coverage, timeliness of dissemination, and geographic relationships between points of collection and points of dissemination; (4) review of data routing to establish ways of getting data from the collection point to the user; (5) on-board data processing; (6) communications link; and (7) ground data processing. A detailed synthesis of three specific missions is included.

  6. Evidence development and publication planning: strategic process.

    PubMed

    Wittek, Michael R; Jo Williams, Mary; Carlson, Angeline M

    2009-11-01

    A number of decisions in the health care field rely heavily on published clinical evidence. A systematic approach to evidence development and publication planning is required to develop a portfolio of evidence that includes at minimum information on efficacy, safety, durability of effect, quality of life, and economic outcomes. The approach requires a critical assessment of available literature, identification of gaps in the literature, and a strategic plan to fill the gaps to ensure the availability of evidence demanded for clinical decisions, coverage/payment decisions and health technology assessments. The purpose of this manuscript is to offer a six-step strategic process leading to a portfolio of evidence that meets the informational needs of providers, payers, and governmental agencies concerning patient access to a therapy.

  7. Advanced optical sensing and processing technologies for the distributed control of large flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Williams, G. M.; Fraser, J. C.

    1991-01-01

    The objective was to examine state-of-the-art optical sensing and processing technology applied to controlling the motion of flexible spacecraft. Proposed large flexible space systems, such as optical telescopes and antennas, will require control over vast surfaces. Most likely, distributed control involving many sensors will be necessary to accurately measure the surface, and a similarly large number of actuators must act upon the system. The technical approach included reviewing proposed NASA missions to assess system needs and requirements. A candidate mission was chosen as a baseline study spacecraft for comparison of conventional and optical control components. Control system requirements of the baseline system were used to design both a control system containing current off-the-shelf components and a system utilizing electro-optical devices for sensing and processing. State-of-the-art surveys of conventional sensor, actuator, and processor technologies were performed. A technology development plan is presented that lays out a logical, effective way to develop and integrate the advancing technologies.

  8. Tritium processing for the European test blanket systems: current status of the design and development strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricapito, I.; Calderoni, P.; Poitevin, Y.

    2015-03-15

    Tritium processing technologies of the two European Test Blanket Systems (TBS), HCLL (Helium Cooled Lithium Lead) and HCPB (Helium Cooled Pebble Bed), play an essential role in meeting the main objectives of the TBS experimental campaign in ITER. Compliance with the ITER interface requirements, in terms of space availability, service fluids, limits on tritium release, and constraints on maintenance, is driving the design of the TBS tritium processing systems. Other requirements come from the characteristics of the relevant test blanket module and the scientific programme that has to be developed and implemented. This paper identifies the main requirements for the design of the TBS tritium systems and equipment and, at the same time, provides an updated overview of the current design status, mainly focusing on the tritium extractor from Pb-16Li and TBS tritium accountancy. Considerations are also given on the possible extrapolation to the DEMO breeding blanket. (authors)

  9. An OSEE Based Portable Surface Contamination Monitor

    NASA Technical Reports Server (NTRS)

    Perey, Daniel F.

    1997-01-01

    Many industrial and aerospace processes involving the joining of materials require sufficient surface cleanliness to ensure proper bonding. Processes as diverse as painting, welding, or the soldering of electronic circuits will be compromised if prior inspection and removal of surface contaminants is inadequate. As process requirements become more stringent and the number of different materials and identified contaminants increases, various instruments and techniques have been developed for improved inspection. One such technique, based on the principle of Optically Stimulated Electron Emission (OSEE), has been explored for a number of years as a tool for surface contamination monitoring. Some of the benefits of OSEE are that it is non-contacting, requires little operator training, and has very high contamination sensitivity. This paper describes the development of a portable OSEE-based surface contamination monitor. The instrument is suitable for both hand-held and robotic inspections with either manual or automated control of instrument operation. In addition, instrument output data is visually displayed to the operator and may be output to an external computer for archiving or analysis.

  10. Modeling and control of flow during impregnation of heterogeneous porous media, with application to composite mold-filling processes

    NASA Astrophysics Data System (ADS)

    Bickerton, Simon

    Liquid Composite Molding (LCM) encompasses a growing list of composite material manufacturing techniques. These processes hold the promise of complex fiber-reinforced plastic parts manufactured in a single molding step. In recent years a significant research effort has been invested in the development of process simulations, providing tools that have advanced current LCM technology and broadened the range of applications. The requirement to manufacture larger, more complex parts has motivated investigation of active control of LCM processes. Due to the unlimited variety of part geometries that can be produced, finite-element-based process simulations will be used to some extent in the design of actively controlled processes. Ongoing efforts are being made to improve material parameter specification for process simulations, increasing their value as design tools. Several phenomena occurring during mold filling have been addressed through flow visualization experiments and analysis of manufactured composite parts. The influence of well-defined air channels within a mold cavity is investigated, incorporating their effects within existing filling simulations. Three different flow configurations have been addressed, testing the application of 'equivalent permeabilities', effectively approximating air channels as representative porous media. LCM parts having doubly curved regions require preform fabrics to undergo significant, and varying, deformation throughout a mold cavity. Existing methods for predicting preform deformation, and the resulting permeability distribution, have been applied to a conical mold geometry. Comparisons between experiment and simulation are promising, although the geometry studied required large deformation over much of the part, shearing the preform fabric beyond the scope of the models applied.
An investigational study was performed to determine the magnitude of effect, if any, on mold filling caused by corners within LCM mold cavities. The molds applied in this study have required careful consideration of cavity thickness variations. Any effects on mold filling due to corner radii have been overshadowed by those due to preform compression. While numerical tools are available to study actively controlled mold filling in a virtual environment, some development is required for the physical equipment to implement this in practice. A versatile, multiple line fluid injection system is developed here. The equipment and control algorithms employed have provided servo control of flow rate, or injection pressure, and have been tested under very challenging conditions. The single injection line developed is expanded to a multiple line system, and shows great potential for application to actual resin systems. A case study is presented, demonstrating design and implementation of a simple actively controlled injection scheme. The experimental facility developed provides an excellent testbed for application of actively controlled mold filling concepts, an area that is providing great promise for the advancement of LCM processes.

  11. An Ada implementation of the network manager for the advanced information processing system

    NASA Technical Reports Server (NTRS)

    Nagle, Gail A.

    1986-01-01

    From an implementation standpoint, the Ada language provided many features which facilitated the data and procedure abstraction process. The language supported a design which was dynamically flexible (despite strong typing), modular, and self-documenting. Adequate training of programmers requires access to an efficient compiler which supports full Ada. When the performance issues for real time processing are finally addressed by more stringent requirements for tasking features and the development of efficient run-time environments for embedded systems, the full power of the language will be realized.

  12. The Development of Model for Measuring Railway Wheels Manufacturing Readiness Level

    NASA Astrophysics Data System (ADS)

    Inrawan Wiratmadja, Iwan; Mufid, Anas

    2016-02-01

    In an effort to grow the railway wheel industry in Indonesia and reduce the dependence on imports, the Metal Industries Development Center (MIDC) is implementing railway wheel manufacturing technology in Indonesia. MIDC is a research and development institution tasked with researching the production of railway wheel prototypes and acting as a supervisor to Indonesian industry in implementing railway wheel manufacturing technology. The process of implementing manufacturing technology requires a lot of resources; therefore it is necessary to measure manufacturing readiness. In this study, the measurement of railway wheel manufacturing readiness was done using the manufacturing readiness level (MRL) model from the United States Department of Defense. MRL consists of 10 manufacturing readiness levels described by 90 criteria and 184 sub-criteria. To obtain a good and accurate manufacturing readiness measurement instrument, the development process involved experts through the expert judgment method and was validated with a content validity ratio (CVR). The measurement instrument developed in this study consists of 448 indicators. The measurement results show that MIDC's railway wheel manufacturing readiness is at level 4. This shows that there is a gap between the current level of manufacturing readiness owned by MIDC and the manufacturing readiness level required to achieve the program objectives, which is level 5. To achieve the program objectives at level 5, a number of actions are required of MIDC. The indicators that must be improved to reach level 5 are those related to the cost and financing, process capability and control, quality management, workers, and manufacturing management criteria.
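
    The content validity ratio used to screen instrument items follows Lawshe's formula, CVR = (n_e - N/2) / (N/2), where n_e is the number of experts rating an item "essential" and N is the panel size. A minimal sketch (the item names, expert counts, and threshold below are illustrative, not taken from the paper):

```python
def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2).

    Ranges from -1 (no expert rates the item essential)
    to +1 (every expert rates it essential)."""
    half = n_experts / 2
    return (n_essential - half) / half

# Screen candidate sub-criteria against a minimum CVR threshold.
ratings = {"process capability": 9, "quality management": 8, "tooling cost": 5}
n_experts = 10
threshold = 0.62  # Lawshe's critical value for a panel of about 10 experts

retained = [item for item, n_e in ratings.items()
            if content_validity_ratio(n_e, n_experts) >= threshold]
print(retained)
```

Items falling below the threshold are dropped or revised before the instrument is used for measurement.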

  13. MODELING TREE LEVEL PROCESSES

    EPA Science Inventory

    An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...

  14. Development of Food Acceptance Patterns.

    ERIC Educational Resources Information Center

    Birch, Leann L.

    1990-01-01

    Provides a rationale for the significance of the study of early feeding and delineates major issues that require investigation. Includes a list of acquisition processes implicated in the development of food acceptance patterns. (RH)

  15. Processing lunar soils for oxygen and other materials

    NASA Technical Reports Server (NTRS)

    Knudsen, Christian W.; Gibson, Michael A.

    1992-01-01

    Two types of lunar materials are excellent candidates for lunar oxygen production: ilmenite and silicates such as anorthite. Both are lunar surface minable, occurring in soils, breccias, and basalts. Because silicates are considerably more abundant than ilmenite, they may be preferred as source materials. Depending on the processing method chosen for oxygen production and the feedstock material, various useful metals and bulk materials can be produced as byproducts. Available processing techniques include hydrogen reduction of ilmenite and electrochemical and chemical reductions of silicates. Processes in these categories are generally in preliminary development stages and need significant research and development support to carry them to practical deployment, particularly as a lunar-based operation. The goal of beginning lunar processing operations by 2010 requires that planning and research and development emphasize the simplest processing schemes. However, more complex schemes that now appear to present difficult technical challenges may offer more valuable metal byproducts later. While they require more time and effort to perfect, the more complex or difficult schemes may provide important processing and product improvements with which to extend and elaborate the initial lunar processing facilities. A balanced R&D program should take this into account. The following topics are discussed: (1) ilmenite--semi-continuous process; (2) ilmenite--continuous fluid-bed reduction; (3) utilization of spent ilmenite to produce bulk materials; (4) silicates--electrochemical reduction; and (5) silicates--chemical reduction.

  16. Effects of Amine and Anhydride Curing Agents on the VARTM Matrix Processing Properties

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Hubert, Pascal; Song, Xiaolan; Cano, Roberto J.; Loos, Alfred C.; Pipes, R. Byron

    2002-01-01

    To ensure successful application of composite structure for aerospace vehicles, it is necessary to develop material systems that meet a variety of requirements. The industry has recently developed a number of low-viscosity epoxy resins to meet the processing requirements associated with vacuum assisted resin transfer molding (VARTM) of aerospace components. The curing kinetics and viscosity of two of these resins, an amine-cured epoxy system, Applied Poleramic, Inc. VR-56-4 1, and an anhydride-cured epoxy system, A.T.A.R.D. Laboratories SI-ZG-5A, have been characterized for application in the VARTM process. Simulations were carried out using the process model, COMPRO, to examine heat transfer, curing kinetics and viscosity for different panel thicknesses and cure cycles. Results of these simulations indicate that the two resins have significantly different curing behaviors and flow characteristics.

  17. Sensory evaluation based fuzzy AHP approach for material selection in customized garment design and development process

    NASA Astrophysics Data System (ADS)

    Hong, Y.; Curteza, A.; Zeng, X.; Bruniaux, P.; Chen, Y.

    2016-06-01

    Material selection is the most difficult step in the customized garment product design and development process. This study aims to create a hierarchical framework for material selection. The analytic hierarchy process and fuzzy set theories have been applied to reconcile the diverse requirements from the customer and the inherent interactions/interdependencies among these requirements. Sensory evaluation ensures a quick and effective selection without complex laboratory tests such as KES and FAST, using the professional knowledge of the designers. A real empirical application for physically disabled people is carried out to demonstrate the proposed method. Both the theoretical and practical background of this paper indicate that the fuzzy analytical network process can capture experts' knowledge existing in the form of incomplete, ambiguous and vague information about the mutual influence of the attributes and criteria of material selection.
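
    The AHP weighting step in such a framework can be approximated with the row geometric mean method applied to a pairwise comparison matrix on Saaty's 1-9 scale. A minimal sketch, with hypothetical fabric criteria and comparison values (none taken from the paper):

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights via the row geometric mean method."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical pairwise comparisons of three fabric criteria:
# softness is moderately preferred to stretch (3) and strongly to durability (5).
pairwise = [
    [1,     3,   5],    # softness
    [1 / 3, 1,   2],    # stretch
    [1 / 5, 1 / 2, 1],  # durability
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

In a full fuzzy AHP, each comparison would be a fuzzy number (e.g. a triangular membership function) rather than a crisp value, but the prioritization logic is the same.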

  18. Integrated aerodynamic-structural design of a forward-swept transport wing

    NASA Technical Reports Server (NTRS)

    Haftka, Raphael T.; Grossman, Bernard; Kao, Pi-Jen; Polen, David M.; Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    The introduction of composite materials is having a profound effect on aircraft design. Since these materials permit the designer to tailor material properties to improve structural, aerodynamic and acoustic performance, they require an integrated multidisciplinary design process. Furthermore, because of the complexity of the design process, numerical optimization methods are required. The utilization of integrated multidisciplinary design procedures for improving aircraft design is not currently feasible because of software coordination problems and the enormous computational burden. Even with the expected rapid growth of supercomputers and parallel architectures, these tasks will not be practical without the development of efficient methods for cross-disciplinary sensitivities and efficient optimization procedures. The present research is part of an on-going effort which is focused on the processes of simultaneous aerodynamic and structural wing design as a prototype for design integration. A sequence of integrated wing design procedures has been developed in order to investigate various aspects of the design process.

  19. 78 FR 71635 - 60-Day Notice of Proposed Information Collection: Appalachia Economic Development Initiative and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... Information Collection: Appalachia Economic Development Initiative and Semi-Annual Reporting AGENCY: Office of... of Rural Housing & Economic Development, Department of Housing and Urban Development, 451 7th Street... application for the Appalachia Economic Development Initiative grant process. Information is required to rate...

  20. AAL service development loom--from the idea to a marketable business model.

    PubMed

    Kriegel, Johannes; Auinger, Klemens

    2015-01-01

    The Ambient Assisted Living (AAL) market is still in an early stage of development. Previous approaches to comprehensive AAL services are mostly supply-side driven and focused on hardware and software. Usually this type of AAL solution does not lead to sustainable success on the market. Research and development increasingly focus on demand and customer requirements in addition to the social and legal framework. The question is: how can a systematic performance measurement strategy along a service development process support the market-ready design of a concrete business model for an AAL service? Within the EU-funded research project DALIA (Assistant for Daily Life Activities at Home), an iterative service development process uses an adapted Osterwalder business model canvas. The application of a performance measurement index (PMI) to support the process has been developed and tested. The result is an iterative service development model using a supporting PMI. The PMI framework is developed throughout the engineering of a virtual assistant (AVATAR) as a modular interface to connect informal carers with necessary and useful services. Future research should seek to ensure that the PMI enables meaningful transparency regarding targeting (e.g. innovative AAL services), design (e.g. functional hybrid AAL services) and implementation (e.g. marketable AAL support services). To this end, further testing in practice is required. The aim must be to develop a weighted PMI in the context of further research, which supports both the service engineering and the subsequent service management process.

  1. EVALUATION OF ALTERNATIVE STRONTIUM AND TRANSURANIC SEPARATION PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SMALLEY CS

    2011-04-25

    In order to meet contract requirements on the concentrations of strontium-90 and transuranic isotopes in the immobilized low-activity waste, strontium-90 and transuranics must be removed from the supernate of tanks 241-AN-102 and 241-AN-107. The process currently proposed for this application is an in-tank precipitation process using strontium nitrate and sodium permanganate. Development work on the process has not proceeded since 2005. The purpose of the evaluation is to identify whether any promising alternative processes have been developed since this issue was last examined, evaluate the alternatives and the baseline process, and recommend which process should be carried forward.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clarke, Kester Diederik

    The intent of this report is to document a procedure used at LANL for HIP bonding aluminum cladding to U-10Mo fuel foils using a formed HIP can for the Domestic Reactor Conversion program in the NNSA Office of Material Management and Minimization, and to provide some details that may not have been published elsewhere. The HIP process is based on the procedures that have been used to develop the formed HIP can process, including the baseline process developed at Idaho National Laboratory (INL). The HIP bonding cladding process development is summarized in the listed references. Further iterations with Babcock & Wilcox (B&W) to refine the process to meet production and facility requirements are expected.

  3. Integration of sustainability into process simulation of a dairy process

    USDA-ARS?s Scientific Manuscript database

    Life cycle analysis, a method used to quantify the energy and environmental flows of a process or product on the environment, is increasingly utilized by food processors to develop strategies to lessen the carbon footprint of their operations. In the case of the milk supply chain, the method requir...

  4. A Study on Improving Information Processing Abilities Based on PBL

    ERIC Educational Resources Information Center

    Kim, Du Gyu; Lee, JaeMu

    2014-01-01

    This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…

  5. Encapsulation Processing and Manufacturing Yield Analysis

    NASA Technical Reports Server (NTRS)

    Willis, P. B.

    1984-01-01

    The development of encapsulation processing and a manufacturing productivity analysis for photovoltaic cells are discussed. The goals were: (1) to understand the relationships between both formulation variables and process variables; (2) to define conditions required for optimum performance; (3) to predict manufacturing yield; and (4) to provide documentation to industry.

  6. Computer program developed for flowsheet calculations and process data reduction

    NASA Technical Reports Server (NTRS)

    Alfredson, P. G.; Anastasia, L. J.; Knudsen, I. E.; Koppel, L. B.; Vogel, G. J.

    1969-01-01

    Computer program PACER-65 is used for flowsheet calculations and is easily adapted to process data reduction. Each unit, vessel, meter, and processing operation in the overall flowsheet is represented by a separate subroutine, which the program calls in the order required to complete an overall flowsheet calculation.
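
    The subroutine-per-unit idea can be sketched as a driver that passes a stream record through a sequence of unit functions in flowsheet order. The unit names, parameters, and values below are illustrative, not PACER-65's:

```python
# Each processing unit is a function that transforms the stream record.
def feed(stream):
    stream["flow_kg_h"] = 100.0           # hypothetical feed rate
    return stream

def reactor(stream):
    stream["conversion"] = 0.9            # hypothetical single-pass conversion
    stream["product_kg_h"] = stream["flow_kg_h"] * stream["conversion"]
    return stream

def separator(stream):
    stream["overhead_kg_h"] = stream["product_kg_h"] * 0.95  # recovery fraction
    return stream

def run_flowsheet(units):
    """Call the unit subroutines in the order required for the flowsheet."""
    stream = {}
    for unit in units:
        stream = unit(stream)
    return stream

result = run_flowsheet([feed, reactor, separator])
print(result["overhead_kg_h"])
```

Data reduction fits the same pattern: measured data replaces the feed function and the downstream units back-calculate process quantities.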

  7. 45 CFR 1303.5 - Service of process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Service of process. 1303.5 Section 1303.5 Public Welfare Regulations Relating to Public Welfare (Continued) OFFICE OF HUMAN DEVELOPMENT SERVICES... § 1303.5 Service of process. Whenever documents are required to be filed or served under this part, or...

  8. Encapsulation processing and manufacturing yield analysis

    NASA Astrophysics Data System (ADS)

    Willis, P. B.

    1984-10-01

    The development of encapsulation processing and a manufacturing productivity analysis for photovoltaic cells are discussed. The goals were: (1) to understand the relationships between both formulation variables and process variables; (2) to define conditions required for optimum performance; (3) to predict manufacturing yield; and (4) to provide documentation to industry.

  9. Energy requirements of the switchable polarity solvent forward osmosis (SPS-FO) water purification process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Daniel S.; Orme, Christopher J.; Mines, Gregory L.

    A model was developed to estimate the process energy requirements of a switchable polarity solvent forward osmosis (SPS FO) system for water purification from aqueous NaCl feed solution concentrations ranging from 0.5 to 4.0 molal at an operational scale of 480 m3/day (feed stream). The model indicates that recovering approximately 90% of the water from a feed solution with NaCl concentration similar to seawater using SPS FO would have total equivalent energy requirements between 2.4 and 4.3 kWh per m3 of purified water product. The process is predicted to be competitive with current costs for disposal/treatment of produced water from oil and gas drilling operations. As a result, once scaled up the SPS FO process may be a thermally driven desalination process that can compete with the cost of seawater reverse osmosis.
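
    A back-of-the-envelope check using the figures in the abstract (480 m3/day feed, roughly 90% recovery, 2.4-4.3 kWh per m3 of product) gives the plant-level daily energy demand; this is arithmetic on the stated numbers, not part of the authors' model:

```python
feed_m3_day = 480.0
recovery = 0.90
product_m3_day = feed_m3_day * recovery   # purified water produced per day

# Specific energy bounds from the abstract, in kWh per m3 of product.
for specific_energy in (2.4, 4.3):
    daily_kwh = product_m3_day * specific_energy
    print(f"{specific_energy} kWh/m3 -> {daily_kwh:.0f} kWh/day")
```

The same two-line calculation scales directly to other plant sizes or recovery fractions.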

  10. Energy requirements of the switchable polarity solvent forward osmosis (SPS-FO) water purification process

    DOE PAGES

    Wendt, Daniel S.; Orme, Christopher J.; Mines, Gregory L.; ...

    2015-08-01

    A model was developed to estimate the process energy requirements of a switchable polarity solvent forward osmosis (SPS FO) system for water purification from aqueous NaCl feed solution concentrations ranging from 0.5 to 4.0 molal at an operational scale of 480 m3/day (feed stream). The model indicates that recovering approximately 90% of the water from a feed solution with NaCl concentration similar to seawater using SPS FO would have total equivalent energy requirements between 2.4 and 4.3 kWh per m3 of purified water product. The process is predicted to be competitive with current costs for disposal/treatment of produced water from oil and gas drilling operations. As a result, once scaled up the SPS FO process may be a thermally driven desalination process that can compete with the cost of seawater reverse osmosis.

  11. Harmonization of reimbursement and regulatory approval processes: a systematic review of international experiences.

    PubMed

    Tsoi, Bernice; Masucci, Lisa; Campbell, Kaitryn; Drummond, Michael; O'Reilly, Daria; Goeree, Ron

    2013-08-01

    A considerable degree of overlap exists between reimbursement and regulatory approval of health technologies, and harmonization of certain aspects is both possible and feasible. Various models of harmonization have been suggested, from which a number of practical attempts have drawn. Based on a review of the literature, approaches can be categorized into those focused on reducing uncertainty and developing economies of scale in the evidentiary requirements, and/or aligning timeframes and logistical aspects of the review process. These strategies can further be classified based on the expected level of structural and organizational change required to implement them into the existing processes. Passive processes require less modification, whereas active processes are associated with greater restructuring. Attempts so far at harmonization have raised numerous legal and practical issues, and these must be considered when introducing a more harmonized framework into the existing regulatory and reimbursement arrangements.

  12. Genomics to feed a switchgrass breeding program

    USDA-ARS?s Scientific Manuscript database

    Development of improved cultivars is one of three pillars, along with sustainable production and efficient conversion, required for dedicated cellulosic bioenergy crops to succeed. Breeding new cultivars is a long, slow process requiring patience, dedication, and motivation to realize gains and adva...

  13. EDOS operations concept and development approach

    NASA Technical Reports Server (NTRS)

    Knoble, G.; Garman, C.; Alcott, G.; Ramchandani, C.; Silvers, J.

    1994-01-01

    The Earth Observing System (EOS) Data and Operations System (EDOS) is being developed by the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) for the capture, level zero processing, distribution, and backup archiving of high speed telemetry data received from EOS spacecraft. All data received will conform to the Consultative Committee for Space Data Standards (CCSDS) recommendations. The major EDOS goals are to: (1) minimize EOS program costs to implement and operate EDOS; (2) respond effectively to EOS growth requirements; and (3) maintain compatibility with existing and enhanced versions of NASA institutional systems required to support EOS spacecraft. In order to meet these goals, the following objectives have been defined for EDOS: (1) standardize EDOS interfaces to maximize utility for future requirements; (2) emphasize life-cycle cost (LCC) considerations (rather than procurement costs) in making design decisions and meeting reliability, maintainability, availability (RMA) and upgradability requirements; (3) implement data-driven operations to the maximum extent possible to minimize staffing requirements and to maximize system responsiveness; (4) provide a system capable of simultaneously supporting multiple spacecraft, each in different phases of their life-cycles; (5) provide for technology insertion features to accommodate growth and future LCC reductions during the operations phase; and (6) provide a system that is sufficiently robust to accommodate incremental performance upgrades while supporting operations. Operations concept working group meetings were facilitated to help develop the EDOS operations concept. This provided a cohesive concept that met with approval of responsible personnel from the start. This approach not only speeded up the development process by reducing review cycles, it also provided a medium for generating good ideas that were immediately molded into feasible concepts. 
The operations concept was then used as a basis for the EDOS specification. When it was felt that concept elements did not support detailed requirements, the facilitator process was used to resolve discrepancies or to add new concept elements to support the specification. This method provided ongoing revision of the operations concept and prevented large revisions at the end of the requirements analysis phase of system development.

  14. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of its successful incorporation into the current JPL development policies and processes.

  15. Vehicle Health Management Communications Requirements for AeroMACS

    NASA Technical Reports Server (NTRS)

    Kerczewski, Robert J.; Clements, Donna J.; Apaza, Rafael D.

    2012-01-01

    As the development of standards for the aeronautical mobile airport communications system (AeroMACS) progresses, the process of identifying and quantifying appropriate uses for the system is progressing. In addition to defining important elements of AeroMACS standards, identifying the system's uses impacts AeroMACS bandwidth requirements. Although an initial 59 MHz spectrum allocation for AeroMACS was established in 2007, the allocation may be inadequate; studies have indicated that 100 MHz or more of spectrum may be required to support airport surface communications. Hence additional spectrum allocations have been proposed. Vehicle health management (VHM) systems, which can produce large volumes of vehicle health data, were not considered in the original bandwidth requirements analyses, and are therefore of interest in supporting proposals for additional AeroMACS spectrum. VHM systems are an emerging development in air vehicle safety, and preliminary estimates of the amount of data that will be produced and transmitted off an aircraft, both in flight and on the ground, have been prepared based on estimates of data produced by on-board vehicle health sensors and initial concepts of data processing approaches. This allowed an initial estimate of VHM data transmission requirements for the airport surface. More recently, vehicle-level systems designed to process and analyze VHM data and draw conclusions on the current state of vehicle health have been undergoing testing and evaluation. These systems make use of vehicle system data that is mostly different from the VHM data considered previously for airport surface transmission, and produce processed system outputs that will also need to be archived, thus generating additional data load for AeroMACS. This paper provides an analysis of airport surface data transmission requirements resulting from the vehicle-level reasoning systems, within the context of overall VHM data requirements.

  16. Molybdenum-base cermet fuel development

    NASA Astrophysics Data System (ADS)

    Pilger, James P.; Gurwell, William E.; Moss, Ronald W.; White, George D.; Seifert, David A.

    Development of a multimegawatt (MMW) space nuclear power system requires identification and resolution of several technical feasibility issues before selecting one or more promising system concepts. Demonstration of reactor fuel fabrication technology is required for cermet-fueled reactor concepts. The MMW reactor fuel development activity at Pacific Northwest Laboratory (PNL) is focused on producing a molybdenum-matrix uranium-nitride (UN) fueled cermet. This cermet is to have a high matrix density (greater than or equal to 95 percent) for high strength and high thermal conductance, coupled with a high particle (UN) porosity (approximately 25 percent) for retention of released fission gas at high burnup. Fabrication process development involves the use of porous TiN microspheres as surrogate fuel material until porous UN microspheres become available. Process development was conducted in the areas of microsphere synthesis, particle sealing/coating, and high-energy-rate forming (HERF) and vacuum hot press consolidation techniques. This paper summarizes the status of these activities.

  17. Study of residue type defect formation mechanism and the effect of advanced defect reduction (ADR) rinse process

    NASA Astrophysics Data System (ADS)

    Arima, Hiroshi; Yoshida, Yuichi; Yoshihara, Kosuke; Shibata, Tsuyoshi; Kushida, Yuki; Nakagawa, Hiroki; Nishimura, Yukio; Yamaguchi, Yoshikazu

    2009-03-01

    Residue-type defects are among the yield detractors in the lithography process. It is known that the occurrence of residue-type defects depends on the resist development process and that the defects can be reduced by optimizing the rinsing conditions. However, defect formation is also affected by resist materials and substrate conditions. Therefore, it is necessary to optimize the development process conditions for each mask level. These optimization steps require a large amount of time and effort. In this work, the formation mechanism is investigated from the viewpoints of both material and process. Defect formation is affected by the resist material type, the substrate condition, and the development process condition (D.I.W. rinse step). An optimized resist formulation and a new rinse technology significantly reduce residue-type defects.

  18. Scale-up of ethanol production from winter barley by the EDGE (enhanced dry grind enzymatic) process in fermentors up to 300 liters

    USDA-ARS?s Scientific Manuscript database

    A fermentation process, which was designated the EDGE (enhanced dry grind enzymatic) process, has recently been developed for barley ethanol production. In the EDGE process, in addition to the enzymes normally required for starch hydrolysis, commercial Beta-glucanases were used to hydrolyze (1,3)(1,...

  19. Innovative Product Design Based on Comprehensive Customer Requirements of Different Cognitive Levels

    PubMed Central

    Zhao, Wu; Zheng, Yake; Wang, Rui; Wang, Chen

    2014-01-01

    To improve customer satisfaction in innovative product design, a topology structure of customer requirements is established and an innovative product design approach is proposed. The topology structure gives designers reasonable guidance for capturing customer requirements comprehensively. With the aid of the analytic hierarchy process (AHP), the importance of the customer requirements is evaluated. Quality function deployment (QFD) is used to translate customer requirements into product and process design demands and to identify the technical requirements that most urgently need improvement. In this way, the product is developed in a more targeted way to satisfy customers. The theory of inventive problem solving (TRIZ) is used to help designers produce innovative solutions. Finally, a case study of an automobile steering system illustrates the application of the proposed approach. PMID:25013862
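    The AHP step mentioned above reduces a pairwise comparison matrix over customer requirements to a vector of priority weights. One common approximation is the row geometric-mean method, sketched below; the comparison values are hypothetical, and the paper itself may use the principal-eigenvector method instead:

```python
import math

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise comparison matrix,
    using the row geometric-mean approximation."""
    n = len(pairwise)
    geo_means = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical comparisons among three customer requirements:
# entry [i][j] = how strongly requirement i matters relative to requirement j.
weights = ahp_weights([
    [1.0, 2.0, 4.0],
    [0.5, 1.0, 2.0],
    [0.25, 0.5, 1.0],
])
# For this perfectly consistent matrix, weights == [4/7, 2/7, 1/7]
```

    The resulting weights would then feed the QFD translation step, ranking which technical requirements deserve improvement effort first.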

  20. Innovative product design based on comprehensive customer requirements of different cognitive levels.

    PubMed

    Li, Xiaolong; Zhao, Wu; Zheng, Yake; Wang, Rui; Wang, Chen

    2014-01-01

    To improve customer satisfaction in innovative product design, a topology structure of customer requirements is established and an innovative product design approach is proposed. The topology structure gives designers reasonable guidance for capturing customer requirements comprehensively. With the aid of the analytic hierarchy process (AHP), the importance of the customer requirements is evaluated. Quality function deployment (QFD) is used to translate customer requirements into product and process design demands and to identify the technical requirements that most urgently need improvement. In this way, the product is developed in a more targeted way to satisfy customers. The theory of inventive problem solving (TRIZ) is used to help designers produce innovative solutions. Finally, a case study of an automobile steering system illustrates the application of the proposed approach.

  1. Use of a structured template to facilitate practice-based learning and improvement projects.

    PubMed

    McClain, Elizabeth K; Babbott, Stewart F; Tsue, Terance T; Girod, Douglas A; Clements, Debora; Gilmer, Lisa; Persons, Diane; Unruh, Greg

    2012-06-01

    The Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to meet and demonstrate outcomes across 6 competencies. Measuring residents' competency in practice-based learning and improvement (PBLI) is particularly challenging. We developed an educational tool to meet ACGME requirements for PBLI. The PBLI template helped programs document quality improvement (QI) projects and supported increased scholarly activity surrounding PBLI learning. We reviewed program requirements for 43 residency and fellowship programs and identified specific PBLI requirements for QI activities. We also examined ACGME Program Information Form responses on PBLI core competency questions surrounding QI projects for program sites visited in 2008-2009. Data were integrated by a multidisciplinary committee to develop a peer-protected PBLI template guiding programs through the process, documentation, and evaluation of QI projects. All steps were reviewed and approved through our GME Committee structure. An electronic template, companion checklist, and evaluation form were developed using identified project characteristics to guide programs through the PBLI process and to facilitate documentation and evaluation of that process. During a 24-month period, 27 programs completed PBLI projects; 15 others reviewed the template with their education committees but have not yet initiated projects using it. The development of the tool earned program leaders' support because it enhanced their ability to meet program-specific objectives. The document's peer-protected status, which provides confidentiality and protection from discovery, has been beneficial for program usage. The document aggregates data on PBLI and QI initiatives, offers opportunities to increase scholarship in QI, and meets the ACGME goal of linking measures to outcomes important for accreditation at the program and institutional levels.

  2. Use of a Structured Template to Facilitate Practice-Based Learning and Improvement Projects

    PubMed Central

    McClain, Elizabeth K.; Babbott, Stewart F.; Tsue, Terance T.; Girod, Douglas A.; Clements, Debora; Gilmer, Lisa; Persons, Diane; Unruh, Greg

    2012-01-01

    Background The Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to meet and demonstrate outcomes across 6 competencies. Measuring residents' competency in practice-based learning and improvement (PBLI) is particularly challenging. Purpose We developed an educational tool to meet ACGME requirements for PBLI. The PBLI template helped programs document quality improvement (QI) projects and supported increased scholarly activity surrounding PBLI learning. Methods We reviewed program requirements for 43 residency and fellowship programs and identified specific PBLI requirements for QI activities. We also examined ACGME Program Information Form responses on PBLI core competency questions surrounding QI projects for program sites visited in 2008–2009. Data were integrated by a multidisciplinary committee to develop a peer-protected PBLI template guiding programs through the process, documentation, and evaluation of QI projects. All steps were reviewed and approved through our GME Committee structure. Results An electronic template, companion checklist, and evaluation form were developed using identified project characteristics to guide programs through the PBLI process and to facilitate documentation and evaluation of that process. During a 24-month period, 27 programs completed PBLI projects; 15 others reviewed the template with their education committees but have not yet initiated projects using it. Discussion The development of the tool earned program leaders' support because it enhanced their ability to meet program-specific objectives. The document's peer-protected status, which provides confidentiality and protection from discovery, has been beneficial for program usage. The document aggregates data on PBLI and QI initiatives, offers opportunities to increase scholarship in QI, and meets the ACGME goal of linking measures to outcomes important for accreditation at the program and institutional levels.
PMID:23730444

  3. Automation and integration of components for generalized semantic markup of electronic medical texts.

    PubMed Central

    Dugan, J. M.; Berrios, D. C.; Liu, X.; Kim, D. K.; Kaizer, H.; Fagan, L. M.

    1999-01-01

    Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models. PMID:10566457

  4. Final Report. Baseline LAW Glass Formulation Testing, VSL-03R3460-1, Rev. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, Isabelle S.; Pegg, Ian L.; Gan, Hao

    2015-06-18

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  5. Laboratory for Atmospheres: Instrument Systems Report

    NASA Technical Reports Server (NTRS)

    2011-01-01

    Studies of the atmospheres of our solar system's planets including our own require a comprehensive set of observations, relying on instruments on spacecraft, aircraft, balloons, and on the surface. Laboratory personnel define requirements, conceive concepts, and develop instrument systems for spaceflight missions, and for balloon, aircraft, and ground-based observations. Laboratory scientists also participate in the design of data processing algorithms, calibration techniques, and data processing systems. The instrument sections of this report are organized by measurement technique: lidar, passive, in situ and microwave. A number of instruments in various stages of development or modification are also described. This report will be updated as instruments evolve.

  6. [Development method of healthcare information system integration based on business collaboration model].

    PubMed

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, the people participating in an integration project usually communicate through free-format documents, which impairs the efficiency and adaptability of integration. A method utilizing Business Process Model and Notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration was proposed in this paper. Based on the method, a tool was developed to model integration requirements and transform them into an integration configuration. In addition, an integration case in a radiology scenario was used to verify the method.
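    In miniature, the transformation the abstract describes (a modeled integration requirement compiled to an executable configuration) might look roughly like the sketch below. The step schema, field names, and the RIS/PACS example are invented for illustration and are not the paper's actual format:

```python
def to_integration_config(process_steps):
    """Compile a minimal BPMN-like step list into a routing configuration.

    Each step is a dict with hypothetical keys:
      'task'    - name of the message-flow task in the process model
      'from'    - sending system, 'to' - receiving system
      'message' - message type exchanged
    """
    routes = []
    for step in process_steps:
        routes.append({
            "route_id": step["task"],
            "source_endpoint": step["from"],
            "target_endpoint": step["to"],
            "message_type": step["message"],
        })
    return {"routes": routes}

# Hypothetical radiology message flow: order placed from RIS to PACS
config = to_integration_config(
    [{"task": "place_order", "from": "RIS", "to": "PACS", "message": "ORM"}]
)
```

    The value of such a transformation is that the same model both documents the requirement for project participants and drives the integration engine, removing the free-format document as an intermediary.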

  7. Development of Optimal Stressor Scenarios for New Operational Energy Systems

    DTIC Science & Technology

    2017-12-01

    Analyzing the previous model using a design of experiments (DOE) and regression analysis provides critical information about the associated operational...from experimentation. The resulting system requirements can be used to revisit the design requirements and develop a more robust system. This process...stressor scenarios for acceptance testing.
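    For an orthogonal two-level factorial design with factors coded to ±1, the regression coefficients referred to above reduce to simple averages of signed responses. A minimal sketch with made-up run data (the actual model and factors in the study are not reproduced here):

```python
def fit_two_factor_model(runs):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 for a coded 2-level
    factorial design (factors at +/-1). Because the design matrix is
    orthogonal, each estimate is just an average of signed responses.

    runs: iterable of (x1, x2, y) tuples.
    """
    n = len(runs)
    b0 = sum(y for _x1, _x2, y in runs) / n          # intercept: grand mean
    b1 = sum(x1 * y for x1, _x2, y in runs) / n      # half-effect of factor 1
    b2 = sum(x2 * y for _x1, x2, y in runs) / n      # half-effect of factor 2
    return b0, b1, b2

# Hypothetical 2x2 factorial: (x1, x2, response)
coeffs = fit_two_factor_model([(-1, -1, 1.0), (1, -1, 3.0), (-1, 1, 5.0), (1, 1, 7.0)])
# coeffs == (4.0, 1.0, 2.0): intercept 4, main-effect coefficients 1 and 2
```

    Significant coefficients from such a fit are what identify which operational stressors dominate and thus which system requirements merit revisiting.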

  8. Effective Electronic Security: Process for the Development and Validation from Requirements to Testing

    DTIC Science & Technology

    2013-06-01

    ABBREVIATIONS ANSI American National Standards Institute ASIS American Society of Industrial Security CCTV Closed Circuit Television CONOPS...is globally recognized for the development and maintenance of standards. ASTM defines a specification as an explicit set of requirements...www.rkb.us/saver/. One of the SAVER reports titled CCTV Technology Handbook has a chapter on system design. The report uses terms like functional

  9. Analysis Systems for Air Force Missions.

    DTIC Science & Technology

    1987-02-28

    satisfying requirements in a cost effective manner. Subroutine libraries were developed for use in the overall systems. These libraries allow for the...anomalies occur or requirements change. A library of HILAT routines has been developed which is used by all processing routines as necessary. Upon...the AIM data were directly applied to the AIRS. Moreover, many of the computer modules in the HILAT library of subroutines have direct application

  10. Patents for Soldiers

    DTIC Science & Technology

    2016-06-10

    required for the U.S. Army to achieve overmatch of its enemies.6 It is one of the eight tenets prescribed for commanders to consider while conducting ...regulations that are current as of January 1, 2016. Sixth, interviews were generally not conducted due to time constraints, procedural requirements, and a...development process, were searching for such a material to develop a catheter. The named inventors conducted extensive research and demonstrations with this

  11. Requirement Development Process and Tools

    NASA Technical Reports Server (NTRS)

    Bayt, Robert

    2017-01-01

    Requirements capture the system-level capabilities in a set of complete, necessary, clear, attainable, traceable, and verifiable statements of need. Requirements should not be unduly restrictive, but should set limits that eliminate items outside the boundaries drawn, encourage competition (or alternatives), and capture the source of and rationale for each requirement. If it is not needed by the customer, it is not a requirement. Requirements also establish the verification methods that will lead to product acceptance; these must be reproducible assessment methods.

  12. Face-to-face interference in typical and atypical development

    PubMed Central

    Riby, Deborah M; Doherty-Sneddon, Gwyneth; Whittle, Lisa

    2012-01-01

    Visual communication cues facilitate interpersonal communication. It is important that we look at faces to retrieve and subsequently process such cues. It is also important that we sometimes look away from faces as they increase cognitive load that may interfere with online processing. Indeed, when typically developing individuals hold face gaze it interferes with task completion. In this novel study we quantify face interference for the first time in Williams syndrome (WS) and Autism Spectrum Disorder (ASD). These disorders of development impact on cognition and social attention, but how do faces interfere with cognitive processing? Individuals developing typically as well as those with ASD (n = 19) and WS (n = 16) were recorded during a question and answer session that involved mathematics questions. In phase 1 gaze behaviour was not manipulated, but in phase 2 participants were required to maintain eye contact with the experimenter at all times. Looking at faces decreased task accuracy for individuals who were developing typically. Critically, the same pattern was seen in WS and ASD, whereby task performance decreased when participants were required to hold face gaze. The results show that looking at faces interferes with task performance in all groups. This finding requires the caveat that individuals with WS and ASD found it harder than individuals who were developing typically to maintain eye contact throughout the interaction. Individuals with ASD struggled to hold eye contact at all points of the interaction while those with WS found it especially difficult when thinking. PMID:22356183

  13. Controlled Ecological Life Support System: Research and Development Guidelines

    NASA Technical Reports Server (NTRS)

    Mason, R. M. (Editor); Carden, J. L. (Editor)

    1982-01-01

    Results of a workshop designed to provide a base for initiating a program of research and development of controlled ecological life support systems (CELSS) are summarized. Included are an evaluation of a ground based manned demonstration as a milestone in CELSS development, and a discussion of development requirements for a successful ground based CELSS demonstration. Research recommendations are presented concerning the following topics: nutrition and food processing, food production, waste processing, systems engineering and modelling, and ecology-systems safety.

  14. Facilitating NASA's Use of GEIA-STD-0005-1, Performance Standard for Aerospace and High Performance Electronic Systems Containing Lead-Free Solder

    NASA Technical Reports Server (NTRS)

    Plante, Jeannete

    2010-01-01

    GEIA-STD-0005-1 defines the objectives of, and requirements for, documenting processes that assure customers and regulatory agencies that AHP electronic systems containing lead-free solder, piece parts, and boards will satisfy the applicable requirements for performance, reliability, airworthiness, safety, and certifiability throughout the specified life of performance. It communicates requirements for a Lead-Free Control Plan (LFCP) to assist suppliers in the development of their own Plans. The Plan documents the Plan Owner's (supplier's) processes that assure their customer and all other stakeholders that the Plan Owner's products will continue to meet their requirements. The presentation reviews quality assurance requirements traceability and LFCP template instructions.

  15. Field Geology/Processes

    NASA Technical Reports Server (NTRS)

    Allen, Carlton; Jakes, Petr; Jaumann, Ralf; Marshall, John; Moses, Stewart; Ryder, Graham; Saunders, Stephen; Singer, Robert

    1996-01-01

    The field geology/process group examined the basic operations of a terrestrial field geologist and the manner in which these operations could be transferred to a planetary lander. Four basic requirements for robotic field geology were determined: geologic content; surface vision; mobility; and manipulation. Geologic content requires a combination of orbital and descent imaging. Surface vision requirements include range, resolution, stereo, and multispectral imaging. The minimum mobility for useful field geology depends on the scale of orbital imagery. Manipulation requirements include exposing unweathered surfaces, screening samples, and bringing samples in contact with analytical instruments. To support these requirements, several advanced capabilities for future development are recommended. Capabilities include near-infrared reflectance spectroscopy, hyper-spectral imaging, multispectral microscopy, artificial intelligence in support of imaging, x ray diffraction, x ray fluorescence, and rock chipping.

  16. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks, including: the development of several control-loop and dynamic-noise-model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped-frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second-order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop the hardware and software required to implement the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing of noiselike microwave radiometer signals.

  17. STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitch, S.H.; Morris, J.W.

    1962-12-15

    Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as to their application for determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished, and the number of required tests was substantially reduced, by application of these statistical methods to the SNAP 10A production development effort. (auth)
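    At the core of the analysis of variance described above is the one-way F statistic, which compares between-condition variability to within-condition variability. A self-contained sketch with hypothetical measurements (the actual SNAP 10A data are not reproduced here):

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic for lists of measurements,
    one list per processing condition."""
    k = len(groups)                                 # number of conditions
    N = sum(len(g) for g in groups)                 # total observations
    grand = sum(sum(g) for g in groups) / N         # grand mean
    # Between-group sum of squares: spread of condition means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations around their own mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

# Hypothetical element measurements under two processing conditions
F = one_way_anova_F([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
# F == 13.5 for these made-up groups
```

    A large F relative to the critical value for the corresponding degrees of freedom is what would justify retaining one processing step and eliminating another.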

  18. NASA SMD Airborne Science Capabilities for Development and Testing of New Instruments

    NASA Technical Reports Server (NTRS)

    Fladeland, Matthew

    2015-01-01

    The SMD NASA Airborne Science Program operates and maintains a fleet of highly modified aircraft to support instrument development, satellite instrument calibration, data product validation and earth science process studies. This poster will provide an overview of aircraft available to NASA researchers including performance specifications and modifications for instrument support, processes for requesting aircraft time and developing cost estimates for proposals, and policies and procedures required to ensure safety of flight.

  19. Biological Hydrogen Production: Simultaneous Saccharification and Fermentation With Nitrogen and Phosphorus Removal from Wastewater Effluent

    DTIC Science & Technology

    2010-01-01

    requiring thermochemical pretreatment, as would typically be required with lignocellulosic feedstocks. Therefore it offers a readily-processed and...Standards and Technology. The pH of the reactors was controlled throughout all fermentations by the automatic addition of 0.1 N NaOH. Total organic...nutrients. The optimized conditions developed with paper as a substrate may also convey to the use of a similar process with lignocellulosic biomass

  20. Space sensors for global change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canavan, G.H.

    1994-02-15

    Satellite measurements should contribute to a fuller understanding of the physical processes behind the radiation budget, exchange processes, and global change. Climate engineering requires global observation for early indications of predicted effects, which puts a premium on affordable, distributed constellations of satellites with effective, affordable sensors. Defense has a requirement for continuous global surveillance for warning of aggression, which could evolve from advanced sensors and satellites in development. Many climate engineering needs match those of defense technologies.
