Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Requirement Assurance: A Verification Process
NASA Technical Reports Server (NTRS)
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three cases studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
An analytical approach to customer requirement information processing
NASA Astrophysics Data System (ADS)
Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong
2013-11-01
'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.
Isothermal separation processes
NASA Technical Reports Server (NTRS)
England, C.
1982-01-01
The isothermal processes of membrane separation, supercritical extraction and chromatography were examined using availability analysis. The general approach was to derive equations that identified where energy is consumed in these processes and how they compare with conventional separation methods. These separation methods are characterized by pure work inputs, chiefly in the form of a pressure drop which supplies the required energy. Equations were derived for the energy requirement in terms of regular solution theory. This approach is believed to accurately predict the work of separation in terms of the heat of solution and the entropy of mixing. It can form the basis of a convenient calculation method for optimizing membrane and solvent properties for particular applications. Calculations were made on the energy requirements for a membrane process separating air into its components.
RTM: Cost-effective processing of composite structures
NASA Technical Reports Server (NTRS)
Hasko, Greg; Dexter, H. Benson
1991-01-01
Resin transfer molding (RTM) is a promising method for cost effective fabrication of high strength, low weight composite structures from textile preforms. In this process, dry fibers are placed in a mold, resin is introduced either by vacuum infusion or pressure, and the part is cured. RTM has been used in many industries, including automotive, recreation, and aerospace. Each of the industries has different requirements of material strength, weight, reliability, environmental resistance, cost, and production rate. These requirements drive the selection of fibers and resins, fiber volume fractions, fiber orientations, mold design, and processing equipment. Research is made into applying RTM to primary aircraft structures which require high strength and stiffness at low density. The material requirements are discussed of various industries, along with methods of orienting and distributing fibers, mold configurations, and processing parameters. Processing and material parameters such as resin viscosity, perform compaction and permeability, and tool design concepts are discussed. Experimental methods to measure preform compaction and permeability are presented.
Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach
NASA Astrophysics Data System (ADS)
Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.
Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements
Dabbagh, Mohammad; Lee, Sai Peck
2014-01-01
Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches. PMID:24982987
An approach for integrating the prioritization of functional and nonfunctional requirements.
Dabbagh, Mohammad; Lee, Sai Peck
2014-01-01
Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches.
Using the CoRE Requirements Method with ADARTS. Version 01.00.05
1994-03-01
requirements; combining ADARTS processes and objects derived from CoRE requirements into an ADARTS software architecture design ; and taking advantage of...CoRE’s precision in the ADARTS process structuring, class structuring, and software architecture design activities. Object-oriented requirements and
A method to accelerate creation of plasma etch recipes using physics and Bayesian statistics
NASA Astrophysics Data System (ADS)
Chopra, Meghali J.; Verma, Rahul; Lane, Austin; Willson, C. G.; Bonnecaze, Roger T.
2017-03-01
Next generation semiconductor technologies like high density memory storage require precise 2D and 3D nanopatterns. Plasma etching processes are essential to achieving the nanoscale precision required for these structures. Current plasma process development methods rely primarily on iterative trial and error or factorial design of experiment (DOE) to define the plasma process space. Here we evaluate the efficacy of the software tool Recipe Optimization for Deposition and Etching (RODEo) against standard industry methods at determining the process parameters of a high density O2 plasma system with three case studies. In the first case study, we demonstrate that RODEo is able to predict etch rates more accurately than a regression model based on a full factorial design while using 40% fewer experiments. In the second case study, we demonstrate that RODEo performs significantly better than a full factorial DOE at identifying optimal process conditions to maximize anisotropy. In the third case study we experimentally show how RODEo maximizes etch rates while using half the experiments of a full factorial DOE method. With enhanced process predictions and more accurate maps of the process space, RODEo reduces the number of experiments required to develop and optimize plasma processes.
40 CFR Appendix C to Part 75 - Missing Data Estimation Procedures
Code of Federal Regulations, 2010 CFR
2010-07-01
... certification of a parametric, empirical, or process simulation method or model for calculating substitute data... available process simulation methods and models. 1.2Petition Requirements Continuously monitor, determine... desulfurization, a corresponding empirical correlation or process simulation parametric method using appropriate...
Agile Methods for Open Source Safety-Critical Software
Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John
2011-01-01
The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago, we present our experiences as a case study for renewing the discussion. PMID:21799545
Agile Methods for Open Source Safety-Critical Software.
Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John
2011-08-01
The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the rightamount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago, we present our experiences as a case study for renewing the discussion.
Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Roberts, Larry W.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
Method for exfoliation of hexagonal boron nitride
NASA Technical Reports Server (NTRS)
Lin, Yi (Inventor); Connell, John W. (Inventor)
2012-01-01
A new method is disclosed for the exfoliation of hexagonal boron nitride into mono- and few-layered nanosheets (or nanoplatelets, nanomesh, nanoribbons). The method does not necessarily require high temperature or vacuum, but uses commercially available h-BN powders (or those derived from these materials, bulk crystals) and only requires wet chemical processing. The method is facile, cost efficient, and scalable. The resultant exfoliated h-BN is dispersible in an organic solvent or water thus amenable for solution processing for unique microelectronic or composite applications.
System and Method for Multi-Wavelength Optical Signal Detection
NASA Technical Reports Server (NTRS)
McGlone, Thomas D. (Inventor)
2017-01-01
The system and method for multi-wavelength optical signal detection enables the detection of optical signal levels significantly below those processed at the discrete circuit level by the use of mixed-signal processing methods implemented with integrated circuit technologies. The present invention is configured to detect and process small signals, which enables the reduction of the optical power required to stimulate detection networks, and lowers the required laser power to make specific measurements. The present invention provides an adaptation of active pixel networks combined with mixed-signal processing methods to provide an integer representation of the received signal as an output. The present invention also provides multi-wavelength laser detection circuits for use in various systems, such as a differential absorption light detection and ranging system.
An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle
NASA Astrophysics Data System (ADS)
Wang, Yue; Gao, Dan; Mao, Xuming
2018-03-01
A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on requirements project principle, this method will not affect the conventional design elements, and can effectively connect the requirements with design. It realizes the modern civil jet development concept, which is “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method of the requirements are detailed introduced in the study, with the hope of providing the experiences to other civil jet product design.
Supporting BPMN choreography with system integration artefacts for enterprise process collaboration
NASA Astrophysics Data System (ADS)
Nie, Hongchao; Lu, Xudong; Duan, Huilong
2014-07-01
Business Process Model and Notation (BPMN) choreography modelling depicts externally visible message exchanges between collaborating processes of enterprise information systems. Implementation of choreography relies on designing system integration solutions to realise message exchanges between independently developed systems. Enterprise integration patterns (EIPs) are widely accepted artefacts to design integration solutions. If the choreography model represents coordination requirements between processes with behaviour mismatches, the integration designer needs to analyse the routing requirements and address these requirements by manually designing EIP message routers. As collaboration scales and complexity increases, manual design becomes inefficient. Thus, the research problem of this paper is to explore a method to automatically identify routing requirements from BPMN choreography model and to accordingly design routing in the integration solution. To achieve this goal, recurring behaviour mismatch scenarios are analysed as patterns, and corresponding solutions are proposed as EIP routers. Using this method, a choreography model can be analysed by computer to identify occurrences of mismatch patterns, leading to corresponding router selection. A case study demonstrates that the proposed method enables computer-assisted integration design to implement choreography. A further experiment reveals that the method is effective to improve the design quality and reduce time cost.
K-space data processing for magnetic resonance elastography (MRE).
Corbin, Nadège; Breton, Elodie; de Mathelin, Michel; Vappou, Jonathan
2017-04-01
Magnetic resonance elastography (MRE) requires substantial data processing based on phase image reconstruction, wave enhancement, and inverse problem solving. The objective of this study is to propose a new, fast MRE method based on MR raw data processing, particularly adapted to applications requiring fast MRE measurement or high elastogram update rate. The proposed method allows measuring tissue elasticity directly from raw data without prior phase image reconstruction and without phase unwrapping. Experimental feasibility is assessed both in a gelatin phantom and in the liver of a porcine model in vivo. Elastograms are reconstructed with the raw MRE method and compared to those obtained using conventional MRE. In a third experiment, changes in elasticity are monitored in real-time in a gelatin phantom during its solidification by using both conventional MRE and raw MRE. The raw MRE method shows promising results by providing similar elasticity values to the ones obtained with conventional MRE methods while decreasing the number of processing steps and circumventing the delicate step of phase unwrapping. Limitations of the proposed method are the influence of the magnitude on the elastogram and the requirement for a minimum number of phase offsets. This study demonstrates the feasibility of directly reconstructing elastograms from raw data.
Rapid oxidation/stabilization technique for carbon foams, carbon fibers and C/C composites
Tan, Seng; Tan, Cher-Dip
2004-05-11
An enhanced method for the post processing, i.e. oxidation or stabilization, of carbon materials including, but not limited to, carbon foams, carbon fibers, dense carbon-carbon composites, carbon/ceramic and carbon/metal composites, which method requires relatively very short and more effective such processing steps. The introduction of an "oxygen spill over catalyst" into the carbon precursor by blending with the carbon starting material or exposure of the carbon precursor to such a material supplies required oxygen at the atomic level and permits oxidation/stabilization of carbon materials in a fraction of the time and with a fraction of the energy normally required to accomplish such carbon processing steps. Carbon based foams, solids, composites and fiber products made utilizing this method are also described.
Li, Lei; Gao, Cai; Zhao, Gang; Shu, Zhiquan; Cao, Yunxia; Gao, Dayong
2016-12-01
The measurement of hydraulic conductivity of the cell membrane is very important for optimizing the protocol of cryopreservation and cryosurgery. There are two different methods using differential scanning calorimetry (DSC) to measure the freezing response of cells and tissues. Devireddy et al. presented the slow-fast-slow (SFS) cooling method, in which the difference of the heat release during the freezing process between the osmotically active and inactive cells is used to obtain the cell membrane hydraulic conductivity and activation energy. Luo et al. simplified the procedure and introduced the single-slow (SS) cooling protocol, which requires only one cooling process although different cytocrits are required for the determination of the membrane transport properties. To the best of our knowledge, there is still a lack of comparison of experimental processes and requirements for experimental conditions between these two methods. This study made a systematic comparison between these two methods from the aforementioned aspects in detail. The SFS and SS cooling methods mentioned earlier were utilized to obtain the reference hydraulic conductivity (L pg ) and activation energy (E Lp ) of HeLa cells by fitting the model to DSC data. With the SFS method, it was determined that L pg = 0.10 μm/(min·atm) and E Lp = 22.9 kcal/mol; whereas the results obtained by the SS cooling method showed that L pg = 0.10 μm/(min·atm) and E Lp = 23.6 kcal/mol. The results indicated that the values of the water transport parameters measured by two methods were comparable. In other words, the two parameters can be obtained by comparing the heat releases between two slow cooling processes of the same sample according to the SFS method. However, the SS method required analyzing heat releases of samples with different cytocrits. Thus, more experimental time was required.
7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... compliance agreement shall specify the requirements necessary to prevent spread of plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of this chapter. The...
7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... compliance agreement shall specify the requirements necessary to prevent spread of plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of this chapter. The...
Group Contribution Methods for Phase Equilibrium Calculations.
Gmehling, Jürgen; Constantinescu, Dana; Schmid, Bastian
2015-01-01
The development and design of chemical processes are carried out by solving the balance equations of a mathematical model for sections of or the whole chemical plant with the help of process simulators. For process simulation, besides kinetic data for the chemical reaction, various pure component and mixture properties are required. Because of the great importance of separation processes for a chemical plant in particular, a reliable knowledge of the phase equilibrium behavior is required. The phase equilibrium behavior can be calculated with the help of modern equations of state or g(E)-models using only binary parameters. But unfortunately, only a very small part of the experimental data for fitting the required binary model parameters is available, so very often these models cannot be applied directly. To solve this problem, powerful predictive thermodynamic models have been developed. Group contribution methods allow the prediction of the required phase equilibrium data using only a limited number of group interaction parameters. A prerequisite for fitting the required group interaction parameters is a comprehensive database. That is why for the development of powerful group contribution methods almost all published pure component properties, phase equilibrium data, excess properties, etc., were stored in computerized form in the Dortmund Data Bank. In this review, the present status, weaknesses, advantages and disadvantages, possible applications, and typical results of the different group contribution methods for the calculation of phase equilibria are presented.
Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.
Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H
2009-01-01
Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.
What is the Final Verification of Engineering Requirements?
NASA Technical Reports Server (NTRS)
Poole, Eric
2010-01-01
This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs, and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal requirements including changes to the original requirements document. After the requirements have ben developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods that are used are test, inspection, analysis and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of the development as opposed to having some other organization continue the process once the design has been complete is discussed.
Advances in spectroscopic methods for quantifying soil carbon
Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco; Hively, W. Dean
2012-01-01
The current gold standard for soil carbon (C) determination is elemental C analysis using dry combustion. However, this method requires expensive consumables, is limited by the number of samples that can be processed (~100/d), and is restricted to the determination of total carbon. With increased interest in soil C sequestration, faster methods of analysis are needed, and there is growing interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared or mid-infrared spectral ranges. These spectral methods can decrease analytical requirements and speed sample processing, be applied to large landscape areas using remote sensing imagery, and be used to predict multiple analytes simultaneously. However, the methods require localized calibrations to establish the relationship between spectral data and reference analytical data, and also have additional, specific problems. For example, remote sensing is capable of scanning entire watersheds for soil carbon content but is limited to the surface layer of tilled soils and may require difficult and extensive field sampling to obtain proper localized calibration reference values. The objective of this chapter is to discuss the present state of spectroscopic methods for determination of soil carbon.
2009-10-01
actuelle M&S couvrant le soutien aux operations, la representation du comportement humain , la guerre asymetrique, la defense contre le terrorisme et...methods, tools, data, intellectual capital , and processes to address these capability requirements. Fourth, there is a need to compare capability...requirements to current capabilities to identify gaps that may be addressed with DoD HSCB methods, tools, data, intellectual capital , and process
Some Findings Concerning Requirements in Agile Methodologies
NASA Astrophysics Data System (ADS)
Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan
Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in literature, in most cases in the frame of conventional methods. However, conclusions of conventional methodologies could not be necessarily valid for Agile; in some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements is one of the most conflictive issues in agile approaches and better understanding about this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and management of requirements dependencies.
Schiek, Richard [Albuquerque, NM
2006-06-20
A method of generating two-dimensional masks from a three-dimensional model comprises providing a three-dimensional model representing a micro-electro-mechanical structure for manufacture and a description of process mask requirements, reducing the three-dimensional model to a topological description of unique cross sections, and selecting candidate masks from the unique cross sections and the cross section topology. The method further can comprise reconciling the candidate masks based on the process mask requirements description to produce two-dimensional process masks.
Route to one-step microstructure mold fabrication for PDMS microfluidic chip
NASA Astrophysics Data System (ADS)
Lv, Xiaoqing; Geng, Zhaoxin; Fan, Zhiyuan; Wang, Shicai; Su, Yue; Fang, Weihao; Pei, Weihua; Chen, Hongda
2018-04-01
The microstructure mold fabrication for PDMS microfluidic chip remains complex and time-consuming process requiring special equipment and protocols: photolithography and etching. Thus, a rapid and cost-effective method is highly needed. Comparing with the traditional microfluidic chip fabricating process based on the micro-electromechanical system (MEMS), this method is simple and easy to implement, and the whole fabrication process only requires 1-2 h. Different size of microstructure from 100 to 1000 μm was fabricated, and used to culture four kinds of breast cancer cell lines. Cell viability and morphology was assessed when they were cultured in the micro straight channels, micro square holes and the bonding PDMS-glass microfluidic chip. The experimental results indicate that the microfluidic chip is good and meet the experimental requirements. This method can greatly reduce the process time and cost of the microfluidic chip, and provide a simple and effective way for the structure design and in the field of biological microfabrications and microfluidic chips.
Integration of sustainability into process simulaton of a dairy process
USDA-ARS?s Scientific Manuscript database
Life cycle analysis, a method used to quantify the energy and environmental flows of a process or product on the environment, is increasingly utilized by food processors to develop strategies to lessen the carbon footprint of their operations. In the case of the milk supply chain, the method requir...
Requirement Development Process and Tools
NASA Technical Reports Server (NTRS)
Bayt, Robert
2017-01-01
Requirements capture the system-level capabilities in a set of complete, necessary, clear, attainable, traceable, and verifiable statements of need. Requirements should not be unduly restrictive, but should set limits that eliminate items outside the boundaries drawn, encourage competition (or alternatives), and capture source and reason of requirement. If it is not needed by the customer, it is not a requirement. They establish the verification methods that will lead to product acceptance. These must be reproducible assessment methods.
Design requirements for operational earth resources ground data processing
NASA Technical Reports Server (NTRS)
Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.
1972-01-01
Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.
A Review of Microwave-Assisted Reactions for Biodiesel Production
Nomanbhay, Saifuddin; Ong, Mei Yin
2017-01-01
The conversion of biomass into chemicals and biofuels is an active research area as trends move to replace fossil fuels with renewable resources due to society’s increased concern towards sustainability. In this context, microwave processing has emerged as a tool in organic synthesis and plays an important role in developing a more sustainable world. Integration of processing methods with microwave irradiation has resulted in a great reduction in the time required for many processes, while the reaction efficiencies have been increased markedly. Microwave processing produces a higher yield with a cleaner profile in comparison to other methods. The microwave processing is reported to be a better heating method than the conventional methods due to its unique thermal and non-thermal effects. This paper provides an insight into the theoretical aspects of microwave irradiation practices and highlights the importance of microwave processing. The potential of the microwave technology to accomplish superior outcomes over the conventional methods in biodiesel production is presented. A green process for biodiesel production using a non-catalytic method is still new and very costly because of the supercritical condition requirement. Hence, non-catalytic biodiesel conversion under ambient pressure using microwave technology must be developed, as the energy utilization for microwave-based biodiesel synthesis is reported to be lower and cost-effective. PMID:28952536
A Review of Microwave-Assisted Reactions for Biodiesel Production.
Nomanbhay, Saifuddin; Ong, Mei Yin
2017-06-15
The conversion of biomass into chemicals and biofuels is an active research area as trends move to replace fossil fuels with renewable resources due to society's increased concern towards sustainability. In this context, microwave processing has emerged as a tool in organic synthesis and plays an important role in developing a more sustainable world. Integration of processing methods with microwave irradiation has resulted in a great reduction in the time required for many processes, while the reaction efficiencies have been increased markedly. Microwave processing produces a higher yield with a cleaner profile in comparison to other methods. The microwave processing is reported to be a better heating method than the conventional methods due to its unique thermal and non-thermal effects. This paper provides an insight into the theoretical aspects of microwave irradiation practices and highlights the importance of microwave processing. The potential of the microwave technology to accomplish superior outcomes over the conventional methods in biodiesel production is presented. A green process for biodiesel production using a non-catalytic method is still new and very costly because of the supercritical condition requirement. Hence, non-catalytic biodiesel conversion under ambient pressure using microwave technology must be developed, as the energy utilization for microwave-based biodiesel synthesis is reported to be lower and cost-effective.
Dynamic Environmental Qualification Techniques.
1981-12-01
environments peculiar to military operations and requirements. numerous dynamic qualification test methods have been established. It was the purpose...requires the achievement of the highest practicable degree in the standard- ization of items, materials and engineering practices within the...standard is described as "A document that established engineering and technical requirements for processes, pro’cedures, practices and methods that have
A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS
NASA Technical Reports Server (NTRS)
Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.
1989-01-01
In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes are found which would be enhanced by the use of knowledge based system methods of implementation. The most suitable process for prototyping using rule based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts, and knowledge based system requirements. A quick prototype knowledge based system environment is researched and developed.
A comprehensive review on utilization of wastewater from coffee processing.
Rattan, Supriya; Parande, A K; Nagaraju, V D; Ghiwari, Girish K
2015-05-01
The coffee processing industry is one of the major agro-based industries contributing significantly in international and national growth. Coffee fruits are processed by two methods, wet and dry process. In wet processing, coffee fruits generate enormous quantities of high strength wastewater requiring systematic treatment prior to disposal. Different method approach is used to treat the wastewater. Many researchers have attempted to assess the efficiency of batch aeration as posttreatment of coffee processing wastewater from an upflow anaerobic hybrid reactor (UAHR)-continuous and intermittent aeration system. However, wet coffee processing requires a high degree of processing know-how and produces large amounts of effluents which have the potential to damage the environment. Characteristics of wastewater from coffee processing has a biological oxygen demand (BOD) of up to 20,000 mg/l and a chemical oxygen demand (COD) of up to 50,000 mg/l as well as the acidity of pH below 4. In this review paper, various methods are discussed to treat coffee processing wastewaters; the constitution of wastewater is presented and the technical solutions for wastewater treatment are discussed.
Formalizing Space Shuttle Software Requirements
NASA Technical Reports Server (NTRS)
Crow, Judith; DiVito, Ben L.
1996-01-01
This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.
Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong
2015-02-01
Integration of heterogeneous systems is the key to hospital information construction due to complexity of the healthcare environment. Currently, during the process of healthcare information system integration, people participating in integration project usually communicate by free-format document, which impairs the efficiency and adaptability of integration. A method utilizing business process model and notation (BPMN) to model integration requirement and automatically transforming it to executable integration configuration was proposed in this paper. Based on the method, a tool was developed to model integration requirement and transform it to integration configuration. In addition, an integration case in radiology scenario was used to verify the method.
An acetate precursor process for BSCCO (2223) thin films and coprecipitated powders
NASA Technical Reports Server (NTRS)
Haertling, Gene H.
1992-01-01
Since the discovery of high temperature superconducting oxides much attention has been paid to finding better and useful ways to take advantage of the special properties exhibited by these materials. One such process is the development of thin films for engineering applications. Another such process is the coprecipitation route to producing superconducting powders. An acetate precursor process for use in thin film fabrication and a chemical coprecipitation route to Bismuth based superconducting materials has been developed. Data obtained from the thin film process were inconclusive to date and require more study. The chemical coprecipitation method of producing bulk material is a viable method, and is preferred over the previously used solid state route. This method of powder production appears to be an excellent route to producing thin section tape cast material and screen printed devices, as it requires less calcines than the oxide route to produce quality powders.
Dynamic load balancing for petascale quantum Monte Carlo applications: The Alias method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudheer, C. D.; Krishnan, S.; Srinivasan, A.
Diffusion Monte Carlo is the most accurate widely used Quantum Monte Carlo method for the electronic structure of materials, but it requires frequent load balancing or population redistribution steps to maintain efficiency and avoid accumulation of systematic errors on parallel machines. The load balancing step can be a significant factor affecting performance, and will become more important as the number of processing elements increases. We propose a new dynamic load balancing algorithm, the Alias Method, and evaluate it theoretically and empirically. An important feature of the new algorithm is that the load can be perfectly balanced with each process receivingmore » at most one message. It is also optimal in the maximum size of messages received by any process. We also optimize its implementation to reduce network contention, a process facilitated by the low messaging requirement of the algorithm. Empirical results on the petaflop Cray XT Jaguar supercomputer at ORNL showing up to 30% improvement in performance on 120,000 cores. The load balancing algorithm may be straightforwardly implemented in existing codes. The algorithm may also be employed by any method with many near identical computational tasks that requires load balancing.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelt, Daniël M.; Gürsoy, Dogˇa; Palenstijn, Willem Jan
2016-04-28
The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it ismore » shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy's standard reconstruction method.« less
Pelt, Daniël M.; Gürsoy, Doǧa; Palenstijn, Willem Jan; Sijbers, Jan; De Carlo, Francesco; Batenburg, Kees Joost
2016-01-01
The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy’s standard reconstruction method. PMID:27140167
Scandurra, I; Hägglund, M; Koch, S
2008-08-01
This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.
Non-contact temperature measurement requirements for electronic materials processing
NASA Technical Reports Server (NTRS)
Lehoczky, S. L.; Szofran, F. R.
1988-01-01
The requirements for non-contact temperature measurement capabilities for electronic materials processing in space are assessed. Non-contact methods are probably incapable of sufficient accuracy for the actual absolute measurement of temperatures in most such applications but would be useful for imaging in some applications.
Scaling of ratings: Concepts and methods
Thomas C. Brown; Terry C. Daniel
1990-01-01
Rating scales provide an efficient and widely used means of recording judgments. This paper reviews scaling issues within the context of a psychometric model of the rating process, describes several methods of scaling rating data, and compares the methods in terms of the assumptions they require about the rating process and the information they provide about the...
IDC Re-Engineering Phase 2 System Requirements Document Version 1.4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, James M.; Burns, John F.; Satpathi, Meara Allena
This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data,more » but includes requirements for the dissemination of radionuclide data and products.« less
IDC Re-Engineering Phase 2 System Requirements Document V1.3.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, James M.; Burns, John F.; Satpathi, Meara Allena
2015-12-01
This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide datamore » but includes requirements for the dissemination of radionuclide data and products.« less
Code of Federal Regulations, 2010 CFR
2010-01-01
... disclosure is: (1) Required, or is one of the lawful or appropriate methods, to enforce your rights or the... service; or (2) Required, or is a usual, appropriate or acceptable method: (i) To carry out the transaction or the product or service business of which the transaction is a part, and record, service, or...
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Required, or is one of the lawful or appropriate methods, to enforce your rights or the rights of other...) Required, or is a usual, appropriate or acceptable method: (i) To carry out the transaction or the product or service business of which the transaction is a part, and record, service, or maintain the consumer...
Code of Federal Regulations, 2010 CFR
2010-01-01
... disclosure is: (1) Required, or is one of the lawful or appropriate methods, to enforce your rights or the... service; or (2) Required, or is a usual, appropriate or acceptable method: (i) To carry out the transaction or the product or service business of which the transaction is a part, and record, service, or...
Code of Federal Regulations, 2010 CFR
2010-04-01
... transaction means that the disclosure is: (1) Required, or is one of the lawful or appropriate methods, to... providing the product or service; or (2) Required, or is a usual, appropriate, or acceptable method: (i) To carry out the transaction or the product or service business of which the transaction is a part, and...
Code of Federal Regulations, 2010 CFR
2010-01-01
... disclosure is: (1) Required, or is one of the lawful or appropriate methods, to enforce your rights or the... service; or (2) Required, or is a usual, appropriate or acceptable method: (i) To carry out the transaction or the product or service business of which the transaction is a part, and record, service, or...
Adaptive Filtering Using Recurrent Neural Networks
NASA Technical Reports Server (NTRS)
Parlos, Alexander G.; Menon, Sunil K.; Atiya, Amir F.
2005-01-01
A method for adaptive (or, optionally, nonadaptive) filtering has been developed for estimating the states of complex process systems (e.g., chemical plants, factories, or manufacturing processes at some level of abstraction) from time series of measurements of system inputs and outputs. The method is based partly on the fundamental principles of the Kalman filter and partly on the use of recurrent neural networks. The standard Kalman filter involves an assumption of linearity of the mathematical model used to describe a process system. The extended Kalman filter accommodates a nonlinear process model but still requires linearization about the state estimate. Both the standard and extended Kalman filters involve the often unrealistic assumption that process and measurement noise are zero-mean, Gaussian, and white. In contrast, the present method does not involve any assumptions of linearity of process models or of the nature of process noise; on the contrary, few (if any) assumptions are made about process models, noise models, or the parameters of such models. In this regard, the method can be characterized as one of nonlinear, nonparametric filtering. The method exploits the unique ability of neural networks to approximate nonlinear functions. In a given case, the process model is limited mainly by limitations of the approximation ability of the neural networks chosen for that case. Moreover, despite the lack of assumptions regarding process noise, the method yields minimum- variance filters. In that they do not require statistical models of noise, the neural- network-based state filters of this method are comparable to conventional nonlinear least-squares estimators.
NASA Astrophysics Data System (ADS)
Swastika, Windra
2017-03-01
A money's nominal value recognition system has been developed using Artificial Neural Network (ANN). ANN with Back Propagation has one disadvantage. The learning process is very slow (or never reach the target) in the case of large number of iteration, weight and samples. One way to speed up the learning process is using Quickprop method. Quickprop method is based on Newton's method and able to speed up the learning process by assuming that the weight adjustment (E) is a parabolic function. The goal is to minimize the error gradient (E'). In our system, we use 5 types of money's nominal value, i.e. 1,000 IDR, 2,000 IDR, 5,000 IDR, 10,000 IDR and 50,000 IDR. One of the surface of each nominal were scanned and digitally processed. There are 40 patterns to be used as training set in ANN system. The effectiveness of Quickprop method in the ANN system was validated by 2 factors, (1) number of iterations required to reach error below 0.1; and (2) the accuracy to predict nominal values based on the input. Our results shows that the use of Quickprop method is successfully reduce the learning process compared to Back Propagation method. For 40 input patterns, Quickprop method successfully reached error below 0.1 for only 20 iterations, while Back Propagation method required 2000 iterations. The prediction accuracy for both method is higher than 90%.
Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prado, T. L.; Galuzio, P. P.; Lopes, S. R.
Efficient diagnostics of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and requires accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, which is the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers are able to evidence breast lesions in a way as good as the best standard image processing methods available, but with amore » better control over the spurious fragments in the image.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruiz, Steven Adriel
The following discussion contains a high-level description of methods used to implement software for data processing. It describes the required directory structures and file handling required to use Excel's Visual Basic for Applications programming language and how to identify shot, test and capture types to appropriately process data. It also describes how to interface with the software.
The ferrosilicon process for the generation of hydrogen
NASA Technical Reports Server (NTRS)
Weaver, E R; Berry, W M; Bohnson, V L; Gordon, B D
1920-01-01
Report describes the generation of hydrogen by the reaction between ferrosilicon, sodium hydroxide, and water. This method known as the ferrosilicon method is especially adapted for use in the military field because of the relatively small size and low cost of the generator required to produce hydrogen at a rapid rate, the small operating force required, and the fact that no power is used except the small amount required to operate the stirring and pumping machinery. These advantages make it possible to quickly generate sufficient hydrogen to fill a balloon with a generator which can be transported on a motor truck. This report gives a summary of the details of the ferrosilicon process and a critical examination of the means which are necessary in order to make the process successful.
Survey of NASA V and V Processes/Methods
NASA Technical Reports Server (NTRS)
Pecheur, Charles; Nelson, Stacy
2002-01-01
The purpose of this report is to describe current NASA Verification and Validation (V&V) techniques and to explain how these techniques are applicable to 2nd Generation RLV Integrated Vehicle Health Management (IVHM) software. It also contains recommendations for special V&V requirements for IVHM. This report is divided into the following three sections: 1) Survey - Current NASA V&V Processes/Methods; 2) Applicability of NASA V&V to 2nd Generation RLV IVHM; and 3) Special 2nd Generation RLV IVHM V&V Requirements.
Schulze, H Georg; Turner, Robin F B
2014-01-01
Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., standard deviation of spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes with coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
A Different Approach to Studying the Charge and Discharge of a Capacitor without an Oscilloscope
ERIC Educational Resources Information Center
Ladino, L. A.
2013-01-01
A different method to study the charging and discharging processes of a capacitor is presented. The method only requires a high impedance voltmeter. The charging and discharging processes of a capacitor are usually studied experimentally using an oscilloscope and, therefore, both processes are studied as a function of time. The approach presented…
Distributed systems status and control
NASA Technical Reports Server (NTRS)
Kreidler, David; Vickers, David
1990-01-01
Concepts are investigated for an automated status and control system for a distributed processing environment. System characteristics, data requirements for health assessment, data acquisition methods, system diagnosis methods and control methods were investigated in an attempt to determine the high-level requirements for a system which can be used to assess the health of a distributed processing system and implement control procedures to maintain an accepted level of health for the system. A potential concept for automated status and control includes the use of expert system techniques to assess the health of the system, detect and diagnose faults, and initiate or recommend actions to correct the faults. Therefore, this research included the investigation of methods by which expert systems were developed for real-time environments and distributed systems. The focus is on the features required by real-time expert systems and the tools available to develop real-time expert systems.
A Novel Calibration-Minimum Method for Prediction of Mole Fraction in Non-Ideal Mixture.
Shibayama, Shojiro; Kaneko, Hiromasa; Funatsu, Kimito
2017-04-01
This article proposes a novel concentration prediction model that requires little training data and is useful for rapid process understanding. Process analytical technology is currently popular, especially in the pharmaceutical industry, for enhancement of process understanding and process control. A calibration-free method, iterative optimization technology (IOT), was proposed to predict pure component concentrations, because calibration methods such as partial least squares, require a large number of training samples, leading to high costs. However, IOT cannot be applied to concentration prediction in non-ideal mixtures because its basic equation is derived from the Beer-Lambert law, which cannot be applied to non-ideal mixtures. We proposed a novel method that realizes prediction of pure component concentrations in mixtures from a small number of training samples, assuming that spectral changes arising from molecular interactions can be expressed as a function of concentration. The proposed method is named IOT with virtual molecular interaction spectra (IOT-VIS) because the method takes spectral change as a virtual spectrum x nonlin,i into account. It was confirmed through the two case studies that the predictive accuracy of IOT-VIS was the highest among existing IOT methods.
Defining Support Requirements During Conceptual Design of Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Morris, W. D.; White, N. H.; Davis, W. T.; Ebeling, C. E.
1995-01-01
Current methods for defining the operational support requirements of new systems are data intensive and require significant design information. Methods are being developed to aid in the analysis process of defining support requirements for new launch vehicles during their conceptual design phase that work with the level of information available during this phase. These methods will provide support assessments based on the vehicle design and the operating scenarios. The results can be used both to define expected support requirements for new launch vehicle designs and to help evaluate the benefits of using new technologies. This paper describes the models, their current status, and provides examples of their use.
Code of Federal Regulations, 2013 CFR
2013-07-01
... and Group 1/Group 2 determinations (determining which wastewater streams require control). (a... methods and procedures for determining applicability and Group 1/Group 2 determinations (determining which wastewater streams require control). 63.144 Section 63.144 Protection of Environment ENVIRONMENTAL PROTECTION...
Code of Federal Regulations, 2010 CFR
2010-07-01
... and Group 1/Group 2 determinations (determining which wastewater streams require control). (a... methods and procedures for determining applicability and Group 1/Group 2 determinations (determining which wastewater streams require control). 63.144 Section 63.144 Protection of Environment ENVIRONMENTAL PROTECTION...
Code of Federal Regulations, 2014 CFR
2014-07-01
... and Group 1/Group 2 determinations (determining which wastewater streams require control). (a... methods and procedures for determining applicability and Group 1/Group 2 determinations (determining which wastewater streams require control). 63.144 Section 63.144 Protection of Environment ENVIRONMENTAL PROTECTION...
Code of Federal Regulations, 2011 CFR
2011-07-01
... and Group 1/Group 2 determinations (determining which wastewater streams require control). (a... methods and procedures for determining applicability and Group 1/Group 2 determinations (determining which wastewater streams require control). 63.144 Section 63.144 Protection of Environment ENVIRONMENTAL PROTECTION...
Code of Federal Regulations, 2012 CFR
2012-07-01
... and Group 1/Group 2 determinations (determining which wastewater streams require control). (a... methods and procedures for determining applicability and Group 1/Group 2 determinations (determining which wastewater streams require control). 63.144 Section 63.144 Protection of Environment ENVIRONMENTAL PROTECTION...
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2012-06-01
The use of sophisticated information and communication technologies (ICTs) in the health care domain is a way to improve the quality of services. However, there are also hazards associated with the introduction of ICTs in this domain and a great number of projects have failed due to the lack of systematic consideration of human and other non-technology issues throughout the design or implementation process, particularly in the requirements engineering process. This paper presents the methodological approach followed in the design process of a web-based information system (WbIS) for managing the clinical information in hemophilia care, which integrates the values and practices of user-centered design (UCD) activities into the principles of software engineering, particularly in the phase of requirements engineering (RE). This process followed a paradigm that combines a grounded theory for data collection with an evolutionary design based on constant development and refinement of the generic domain model using three well-known methodological approaches: (a) object-oriented system analysis; (b) task analysis; and, (c) prototyping, in a triangulation work. This approach seems to be a good solution for the requirements engineering process in this particular case of the health care domain, since the inherent weaknesses of individual methods are reduced, and emergent requirements are easier to elicit. Moreover, the requirements triangulation matrix gives the opportunity to look across the results of all used methods and decide what requirements are critical for the system success. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hollingsworth, Peter Michael
The drive toward robust systems design, especially with respect to system affordability throughout the system life-cycle, has led to the development of several advanced design methods. While these methods have been extremely successful in satisfying the needs for which they have been developed, they inherently leave a critical area unaddressed. None of them fully considers the effect of requirements on the selection of solution systems. The goal of all of current modern design methodologies is to bring knowledge forward in the design process to the regions where more design freedom is available and design changes cost less. Therefore, it seems reasonable to consider the point in the design process where the greatest restrictions are placed on the final design, the point in which the system level requirements are set. Historically the requirements have been treated as something handed down from above. However, neither the customer nor the solution provider completely understood all of the options that are available in the broader requirements space. If a method were developed that provided the ability to understand the full scope of the requirements space, it would allow for a better comparison of potential solution systems with respect to both the current and potential future requirements. The key to a requirements conscious method is to treat requirements differently from the traditional approach. The method proposed herein is known as Requirements Controlled Design (RCD). By treating the requirements as a set of variables that control the behavior of the system, instead of variables that only define the response of the system, it is possible to determine a-priori what portions of the requirements space that any given system is capable of satisfying. Additionally, it should be possible to identify which systems can satisfy a given set of requirements and the locations where a small change in one or more requirements poses a significant risk to a design program. This thesis puts forth the theory and methodology to enable RCD, and details and validates a specific method called the Modified Strength Pareto Evolutionary Algorithm (MSPEA).
40 CFR 98.264 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-process phosphoric acid process line. You can use existing plant procedures that are used for accounting... the process line. Conduct the representative bulk sampling using the applicable standard method in the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalinin, V.P.; Tkacheva, O.N.
1986-03-01
Heat treatment entails considerable expenditure of power and often requires expensive equipment. One of the fundamental problems arising in the elaboration of heat treatment technology is the selection of the economically optimal process, which also has to ensure the quality of finished parts required by the customer. To correctly determine the expenditures on the basic kinds of resources it is necessary to improve the methods of calculating prime costs and to carry out such a calculation at the earliest stages of the technological preparation of production. A new method of optimizing synthesis of the structure of devising technological processes ofmore » heat treatment using the achievements of cybernetics and the possibilities of computerization is examined in this article. The method makes it possible to analyze in detail the economy of all possible variants of a technological process when one parameter is changed, without recalculating all items of prime cost.« less
7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of... and Budget under control number 0579-0049) [60 FR 27674, May 25, 1995, as amended at 69 FR 52418, Aug...
7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of... and Budget under control number 0579-0049) [60 FR 27674, May 25, 1995, as amended at 69 FR 52418, Aug...
7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of... and Budget under control number 0579-0049) [60 FR 27674, May 25, 1995, as amended at 69 FR 52418, Aug...
Development of replicated optics for AXAF-1 XDA testing
NASA Technical Reports Server (NTRS)
Engelhaupt, Darell; Wilson, Michele; Martin, Greg
1995-01-01
Advanced optical systems for applications such as grazing incidence Wolter I x-ray mirror assemblies require extraordinary mirror surfaces in terms of fine finish and surface figure. The impeccable mirror surface is on the inside of the rotational mirror form. One practical method of producing devices with these requirements is to first fabricate an exterior surface for the optical device then replicate that surface to have the inverse component with lightweight characteristics. The replicated optic is not better than the master or mandrel from which it is made. This task identifies methods and materials for forming these extremely low roughness optical components. The objectives of this contract were to (1) prepare replication samples of electroless nickel coated aluminum, and determine process requirements for plating XDA test optic; (2) prepare and assemble plating equipment required to process a demonstration optic; (3) characterize mandrels, replicas and test samples for residual stress, surface contamination and surface roughness and figure using equipment at MSFC and; (4) provide technical expertise in establishing the processes, procedures, supplies and equipment needed to process the XDA test optics.
Tamaoka, Katsuo; Asano, Michiko; Miyaoka, Yayoi; Yokosawa, Kazuhiko
2014-04-01
Using the eye-tracking method, the present study depicted pre- and post-head processing for simple scrambled sentences of head-final languages. Three versions of simple Japanese active sentences with ditransitive verbs were used: namely, (1) SO₁O₂V canonical, (2) SO₂O₁V single-scrambled, and (3) O₁O₂SV double-scrambled order. First pass reading times indicated that the third noun phrase just before the verb in both single- and double-scrambled sentences required longer reading times compared to canonical sentences. Re-reading times (the sum of all fixations minus the first pass reading) showed that all noun phrases including the crucial phrase before the verb in double-scrambled sentences required longer re-reading times than those required for single-scrambled sentences; single-scrambled sentences had no difference from canonical ones. Therefore, a single filler-gap dependency can be resolved in pre-head anticipatory processing whereas two filler-gap dependencies require much greater cognitive loading than a single case. These two dependencies can be resolved in post-head processing using verb agreement information.
1995-09-01
vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems
NASA Astrophysics Data System (ADS)
Nerita, S.; Maizeli, A.; Afza, A.
2017-09-01
Process Evaluation and Learning Outcomes of Biology subjects discusses the evaluation process in learning and application of designed and processed learning outcomes. Some problems found during this subject was the student difficult to understand the subject and the subject unavailability of learning resources that can guide and make students independent study. So, it necessary to develop a learning resource that can make active students to think and to make decisions with the guidance of the lecturer. The purpose of this study is to produce handout based on guided discovery method that match the needs of students. The research was done by using 4-D models and limited to define phase that is student requirement analysis. Data obtained from the questionnaire and analyzed descriptively. The results showed that the average requirement of students was 91,43%. Can be concluded that students need a handout based on guided discovery method in the learning process.
Li, Hailiang; Cui, Xiaoli; Tong, Yan; Gong, Muxin
2012-04-01
To compare inclusion effects and process conditions of two preparation methods-colloid mill and saturated solution-for beta-CD inclusion compound of four traditional Chinese medicine volatile oils and study the relationship between each process condition and volatile oil physical properties and the regularity of selective inclusion of volatile oil components. Volatile oils from Nardostachyos Radix et Rhizoma, Amomi Fructus, Zingiberis Rhizoma and Angelicaesinensis Radix were prepared using two methods in the orthogonal test. These inclusion compounds by optimized processes were assessed and compared by such methods as TLC, IR and scanning electron microscope. Inclusion oils were extracted by steam distillation, and the components found before and after inclusion were analyzed by GC-MS. Analysis showed that new inclusion compounds, but inclusion compounds prepared by the two processes had differences to some extent. The colloid mill method showed a better inclusion effect than the saturated solution method, indicating that their process conditions had relations with volatile oil physical properties. There were differences in the inclusion selectivity of components between each other. The colloid mill method for inclusion preparation is more suitable for industrial requirements. To prepare volatile oil inclusion compounds with heavy gravity and high refractive index, the colloid mill method needs longer time and more water, while the saturated solution method requires higher temperature and more beta-cyclodextrin. The inclusion complex prepared with the colloid mill method contains extended molecular weight chemical composition, but the kinds of components are reduced.
1983-04-08
constraints and legislation, the methods used for disposing of military lethal agents such as GB, VX, and HD, have changed from land and sea burial to...uction is the most generally accepted method of destroying toxic organic materials for all cases where the toxicity is associated with the totality of...8217., preferred method can be based on estimates or determinations of the required incineration conditions and an appraisal of the requirement for
Synthetic Aperture Radar (SAR) data processing
NASA Technical Reports Server (NTRS)
Beckner, F. L.; Ahr, H. A.; Ausherman, D. A.; Cutrona, L. J.; Francisco, S.; Harrison, R. E.; Heuser, J. S.; Jordan, R. L.; Justus, J.; Manning, B.
1978-01-01
The available and optimal methods for generating SAR imagery for NASA applications were identified. The SAR image quality and data processing requirements associated with these applications were studied. Mathematical operations and algorithms required to process sensor data into SAR imagery were defined. The architecture of SAR image formation processors was discussed, and technology necessary to implement the SAR data processors used in both general purpose and dedicated imaging systems was addressed.
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2000-01-01
Hospital information systems have to support quality improvement objectives. The design issues of health care information system can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constrains, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why" are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2001-12-01
Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.
Adapting Western research methods to indigenous ways of knowing.
Simonds, Vanessa W; Christopher, Suzanne
2013-12-01
Indigenous communities have long experienced exploitation by researchers and increasingly require participatory and decolonizing research processes. We present a case study of an intervention research project to exemplify a clash between Western research methodologies and Indigenous methodologies and how we attempted reconciliation. We then provide implications for future research based on lessons learned from Native American community partners who voiced concern over methods of Western deductive qualitative analysis. Decolonizing research requires constant reflective attention and action, and there is an absence of published guidance for this process. Continued exploration is needed for implementing Indigenous methods alone or in conjunction with appropriate Western methods when conducting research in Indigenous communities. Currently, examples of Indigenous methods and theories are not widely available in academic texts or published articles, and are often not perceived as valid.
Aspects of the BPRIM Language for Risk Driven Process Engineering
NASA Astrophysics Data System (ADS)
Sienou, Amadou; Lamine, Elyes; Pingaud, Hervé; Karduck, Achim
Nowadays organizations are exposed to frequent changes in business environment requiring continuous alignment of business processes on business strategies. This agility requires methods promoted in enterprise engineering approaches. Risk consideration in enterprise engineering is getting important since the business environment is becoming more and more competitive and unpredictable. Business processes are subject to the same quality requirements as material and human resources. Thus, process management is supposed to tackle value creation challenges but also the ones related to value preservation. Our research considers risk driven business process design as an integral part of enterprise engineering. A graphical modelling language for risk driven business process engineering was introduced in former research. This paper extends the language and handles questions related to modelling risk in organisational context.
NASA Astrophysics Data System (ADS)
Grenn, Michael W.
This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy H R is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process. The information I increases as HR and uncertainty decrease, and the change in information AI needed to reach the desired state of quality is estimated from the perspective of the receiver. The HR may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels. Current requirements trend metrics generally treat additions, deletions and revisions the same and simply measure the quantity of these changes over time. The REF evaluates the quantity of requirements changes over time, distinguishes between their positive and negative effects by calculating their impact on HR, Q, and AI, and forecasts when the desired state will be reached, enabling more accurate assessment of the status and progress of the requirements engineering effort. Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The increase in I, or decrease in H R and uncertainty, is proportional to the engineering effort E input into the requirements engineering process. The REF estimates the AE needed to transition R from their current state of quality to the desired end state or some other interim state of interest. Simulation results are compared with measured engineering effort data for Department of Defense programs published in the SE literature, and the results suggest the REF is a promising new method for estimation of AE.
Phytometric intelligence sensors
NASA Technical Reports Server (NTRS)
Seelig, Hans-Dieter (Inventor); Stoner, II, Richard J. (Inventor); Hoehn, Alexander (Inventor); Adams, III, William Walter (Inventor)
2010-01-01
Methods and apparatus for determining when plants require watering, and methods of attending to the watering of plants including signaling the grower that the plants are in need of hydration are provided. The novel methods include real-time measurement of plant metabolics and phytometric physiology changes of intrinsic physical or behavioral traits within the plant such as determining physiological flux measurement of enzyme flux due to environmental changes such as the wind and drought stress, soil and plant mineral deficiencies, or the interaction with a bio-control for organic disease control including, cell movement, signal transduction, internal chemical processes and external environmental processes including when plants require watering, and methods of attending to the watering of plants including signaling the grower that the plants are in need of hydration.
Yasman, Yakov; Bulatov, Valery; Gridin, Vladimir V; Agur, Sabina; Galil, Noah; Armon, Robert; Schechter, Israel
2004-09-01
A new method for detoxification of hydrophilic chloroorganic pollutants in effluent water was developed, using a combination of ultrasound waves, electrochemistry and Fenton's reagent. The advantages of the method are exemplified using two target compounds: the common herbicide 2,4-dichlorophenoxyacetic acid (2,4-D) and its derivative 2,4-dichlorophenol (2,4-DCP). The high degradation power of this process is due to the large production of oxidizing hydroxyl radicals and high mass transfer due to sonication. Application of this sono-electrochemical Fenton process (SEF) treatment (at 20 kHz) with quite a small current density, accomplished almost 50% oxidation of 2,4-D solution (300 ppm, 1.2 mM) in just 60 s. Similar treatments ran for 600 s resulted in practically full degradation of the herbicide; sizable oxidation of 2,4-DCP also occurs. The main intermediate compounds produced in the SEF process were identified. Their kinetic profile was measured and a chemical reaction scheme was suggested. The efficiency of the SEF process is tentatively much higher than the reference degradation methods and the time required for full degradation is considerably shorter. The SEF process maintains high performance up to concentrations which are higher than reference methods. The optimum concentration of Fe2+ ions required for this process was found to be of about 2 mM, which is lower than that in reference techniques. These findings indicate that SEF process may be an effective method for detoxification of environmental water.
Making Visible the Coding Process: Using Qualitative Data Software in a Post-Structural Study
ERIC Educational Resources Information Center
Ryan, Mary
2009-01-01
Qualitative research methods require transparency to ensure the "trustworthiness" of the data analysis. The intricate processes of organising, coding and analysing the data are often rendered invisible in the presentation of the research findings, which requires a "leap of faith" for the reader. Computer assisted data analysis software can be used…
ERIC Educational Resources Information Center
Zholdasbekova, S.; Karataev, G.; Yskak, A.; Zholdasbekov, A.; Nurzhanbaeva, J.
2015-01-01
This article describes the major components of required technological skills (TS) for future designers taught during the academic process of a college. It considers the choices in terms of the various logical operations required by the fashion industry including fabric processing, assembly charts, performing work operations, etc. The article…
ERIC Educational Resources Information Center
Domah, Darshan
2013-01-01
Agile software development has become very popular around the world in recent years, with methods such as Scrum and Extreme Programming (XP). Literature suggests that functionality is the primary focus in Agile processes while non-functional requirements (NFR) are either ignored or ill-defined. However, for software to be of good quality both…
Using multi-attribute decision-making approaches in the selection of a hospital management system.
Arasteh, Mohammad Ali; Shamshirband, Shahaboddin; Yee, Por Lip
2018-01-01
The most appropriate organizational software is always a real challenge for managers, especially, the IT directors. The illustration of the term "enterprise software selection", is to purchase, create, or order a software that; first, is best adapted to require of the organization; and second, has suitable price and technical support. Specifying selection criteria and ranking them, is the primary prerequisite for this action. This article provides a method to evaluate, rank, and compare the available enterprise software for choosing the apt one. The prior mentioned method is constituted of three-stage processes. First, the method identifies the organizational requires and assesses them. Second, it selects the best method throughout three possibilities; indoor-production, buying software, and ordering special software for the native use. Third, the method evaluates, compares and ranks the alternative software. The third process uses different methods of multi attribute decision making (MADM), and compares the consequent results. Based on different characteristics of the problem; several methods had been tested, namely, Analytic Hierarchy Process (AHP), Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), Elimination and Choice Expressing Reality (ELECTURE), and easy weight method. After all, we propose the most practical method for same problems.
Supportability Technologies for Future Exploration Missions
NASA Technical Reports Server (NTRS)
Watson, Kevin; Thompson, Karen
2007-01-01
Future long-duration human exploration missions will be challenged by resupply limitations and mass and volume constraints. Consequently, it will be essential that the logistics footprint required to support these missions be minimized and that capabilities be provided to make them highly autonomous from a logistics perspective. Strategies to achieve these objectives include broad implementation of commonality and standardization at all hardware levels and across all systems, repair of failed hardware at the lowest possible hardware level, and manufacture of structural and mechanical replacement components as needed. Repair at the lowest hardware levels will require the availability of compact, portable systems for diagnosis of failures in electronic systems and verification of system functionality following repair. Rework systems will be required that enable the removal and replacement of microelectronic components with minimal human intervention to minimize skill requirements and training demand for crews. Materials used in the assembly of electronic systems (e.g. solders, fluxes, conformal coatings) must be compatible with the available repair methods and the spacecraft environment. Manufacturing of replacement parts for structural and mechanical applications will require additive manufacturing systems that can generate near-net-shape parts from the range of engineering alloys employed in the spacecraft structure and in the parts utilized in other surface systems. These additive manufacturing processes will need to be supported by real-time non-destructive evaluation during layer-additive processing for on-the-fly quality control. This will provide capabilities for quality control and may serve as an input for closed-loop process control. Additionally, non-destructive methods should be available for material property determination. These nondestructive evaluation processes should be incorporated with the additive manufacturing process - providing an in-process capability to ensure that material deposited during layer-additive processing meets required material property criteria.
Code of Federal Regulations, 2012 CFR
2012-01-01
... with the requirements of this rule before February 9, 2007. (b) In the event a contractor has... methods to be added to the existing program, description, or process, that satisfy the requirements of...
A Framework to Determine New System Requirements Under Design Parameter and Demand Uncertainties
2015-04-30
relegates quantitative complexities of decision-making to the method and designates trade-space exploration to the practitioner. We demonstrate the...quantitative complexities of decision-making to the method and designates trade-space exploration to the practitioner. We demonstrate the approach...play a critical role in determining new system requirements. Scope and Method of Approach The early stages of the design process have substantial
Portable brine evaporator unit, process, and system
Hart, Paul John; Miller, Bruce G.; Wincek, Ronald T.; Decker, Glenn E.; Johnson, David K.
2009-04-07
The present invention discloses a comprehensive, efficient, and cost effective portable evaporator unit, method, and system for the treatment of brine. The evaporator unit, method, and system require a pretreatment process that removes heavy metals, crude oil, and other contaminates in preparation for the evaporator unit. The pretreatment and the evaporator unit, method, and system process metals and brine at the site where they are generated (the well site). Thus, saving significant money to producers who can avoid present and future increases in transportation costs.
Fast and Scalable Gaussian Process Modeling with Applications to Astronomical Time Series
NASA Astrophysics Data System (ADS)
Foreman-Mackey, Daniel; Agol, Eric; Ambikasaran, Sivaram; Angus, Ruth
2017-12-01
The growing field of large-scale time domain astronomy requires methods for probabilistic data analysis that are computationally tractable, even with large data sets. Gaussian processes (GPs) are a popular class of models used for this purpose, but since the computational cost scales, in general, as the cube of the number of data points, their application has been limited to small data sets. In this paper, we present a novel method for GPs modeling in one dimension where the computational requirements scale linearly with the size of the data set. We demonstrate the method by applying it to simulated and real astronomical time series data sets. These demonstrations are examples of probabilistic inference of stellar rotation periods, asteroseismic oscillation spectra, and transiting planet parameters. The method exploits structure in the problem when the covariance function is expressed as a mixture of complex exponentials, without requiring evenly spaced observations or uniform noise. This form of covariance arises naturally when the process is a mixture of stochastically driven damped harmonic oscillators—providing a physical motivation for and interpretation of this choice—but we also demonstrate that it can be a useful effective model in some other cases. We present a mathematical description of the method and compare it to existing scalable GP methods. The method is fast and interpretable, with a range of potential applications within astronomical data analysis and beyond. We provide well-tested and documented open-source implementations of this method in C++, Python, and Julia.
Human Expertise Helps Computer Classify Images
NASA Technical Reports Server (NTRS)
Rorvig, Mark E.
1991-01-01
Two-domain method of computational classification of images requires less computation than other methods for computational recognition, matching, or classification of images or patterns. Does not require explicit computational matching of features, and incorporates human expertise without requiring translation of mental processes of classification into language comprehensible to computer. Conceived to "train" computer to analyze photomicrographs of microscope-slide specimens of leucocytes from human peripheral blood to distinguish between specimens from healthy and specimens from traumatized patients.
Range and mission scheduling automation using combined AI and operations research techniques
NASA Technical Reports Server (NTRS)
Arbabi, Mansur; Pfeifer, Michael
1987-01-01
Ground-based systems for Satellite Command, Control, and Communications (C3) operations require a method for planning, scheduling and assigning the range resources such as: antenna systems scattered around the world, communications systems, and personnel. The method must accommodate user priorities, last minute changes, maintenance requirements, and exceptions from nominal requirements. Described are computer programs which solve 24 hour scheduling problems, using heuristic algorithms and a real time interactive scheduling process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Treese II, J. Van; Hanlon, Edward A.; Amponsah, Nana
Here, recent changes in the United States requiring the use of ethanol in gasoline for most vehicular transportation have created discussion about important issues, such as shifting the use of certain plants from food production to energy supply, related federal subsidies, effects on soil, water and atmosphere resources, tradeoffs between food production and energy production, speculation about biofuels as a possible means for energy security, potential reduction of greenhouse gas (GHG) emissions or development and expansion of biofuels industry. A sustainable approach to biofuel production requires understanding inputs (i.e., energy required to carry out a process, both natural and anthropogenic)more » and outputs (i.e., energy produced by that process) and cover the entire process, as well as environmental considerations that can be overlooked in a more traditional approach. This publication gives an overview of two methods for evaluating energy transformations in biofuels production: (1) Life Cycle Assessment (LCA) and (2) Emergy Assessment (EA). The LCA approach involves measurements affecting greenhouse gases (GHG), which can be linked to the energy considerations used in the EA. Although these two methods have their basis in energy or GHG evaluations, their approaches can lead to a reliable judgment regarding a biofuel process. Using these two methods can ensure that the energy components are well understood and can help to evaluate the economic environmental component of a biofuel process. In turn, using these two evaluative tools will allow for decisions about biofuel processes that favor sustainability« less
NASA Technical Reports Server (NTRS)
Warren, Andrew H.; Arelt, Joseph E.; Lalicata, Anthony L.; Rogers, Karen M.
1993-01-01
A method of efficient and automated thermal-structural processing of very large space structures is presented. The method interfaces the finite element and finite difference techniques. It also results in a pronounced reduction of the quantity of computations, computer resources and manpower required for the task, while assuring the desired accuracy of the results.
Designing Class Methods from Dataflow Diagrams
NASA Astrophysics Data System (ADS)
Shoval, Peretz; Kabeli-Shani, Judith
A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram; and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of methods design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that supports user task. The components and the process logic of each transaction are described in detail, using pseudocode. Then, each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.
Due Process Rights of Nursing Students in Case of Misconduct.
ERIC Educational Resources Information Center
Osinski, Kay
2003-01-01
Explains the concepts of academic misconduct, due process rights, and the implicit contract between students and the university. Discusses ways to incorporate due process in nursing school course catalogs, course requirements, evaluation methods, and grievance procedures. (SK)
Scandurra, Isabella; Hägglund, Maria
2009-01-01
Introduction Integrated care involves different professionals, belonging to different care provider organizations and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method Based on Human-computer Interaction Science and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].
Li, Youping; Yu, Jiajie; Du, Liang; Sun, Xin; Kwong, Joey S W; Wu, Bin; Hu, Zhiqiang; Lu, Jing; Xu, Ting; Zhang, Lingli
2015-11-01
After 38 years of development, the procedure of selection and evaluation of the World Health Organization Essential Medicine List (WHO EML) is increasingly scientific and formal. However, peer review for the applications of World Health Organization Essential Medicine List is always required in a short period. It is necessary to build up a set of methods and processes for rapid review. We identified the process of evidenced-based rapid review on WHO EML application for peer reviews according to 11 items which were required during reporting of the peer review results of the proposals. The most important items for the rapid review of World Health Organization Essential Medicine List peer reviewers are (1) to confirm the requirements and identify the purposes; (2) to establish the research questions and translate the questions into the 'Participants, Interventions, Comparators, Outcomes, Study design' (PICOS) format; (3) to search and screen available evidence, for which high-level evidence is preferred, such as systematic reviews or meta-analyses, health technology assessment, clinical guidelines; (4) to extract data, where we extract primary information based on the purposes; (5) to synthesize data by qualitative methods, assess the quality of evidence, and compare the results; (6) to provide the answers to the applications, quality of evidences and strength of recommendations. Our study established a set of methods and processes for the rapid review of World Health Organization Essential Medicine List peer review, and our findings were used to guide the reviewers to fulfill the 19(th) World Health Organization Essential Medicine List peer review. The methods and processes were feasible and met the necessary requirements in terms of time and quality. Continuous improvement and evaluation in practice are warranted. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.
Application of high speed machining technology in aviation
NASA Astrophysics Data System (ADS)
Bałon, Paweł; Szostak, Janusz; Kiełbasa, Bartłomiej; Rejman, Edward; Smusz, Robert
2018-05-01
Aircraft structures are exposed to many loads during their working lifespan. Every particular action made during a flight is composed of a series of air movements which generate various aircraft loads. The most rigorous requirement which modern aircraft structures must fulfill is to maintain their high durability and reliability. This requirement involves taking many restrictions into account during the aircraft design process. The most important factor is the structure's overall mass, which has a crucial impact on both utility properties and cost-effectiveness. This makes aircraft one of the most complex results of modern technology. Additionally, there is currently an increasing utilization of high strength aluminum alloys, which requires the implementation of new manufacturing processes. High Speed Machining technology (HSM) is currently one of the most important machining technologies used in the aviation industry, especially in the machining of aluminium alloys. The primary difference between HSM and other milling techniques is the ability to select cutting parameters - depth of the cut layer, feed rate, and cutting speed in order to simultaneously ensure high quality, precision of the machined surface, and high machining efficiency, all of which shorten the manufacturing process of the integral components. In this paper, the authors explain the implementation of the HSM method in integral aircraft constructions. It presents the method of the airframe manufacturing method, and the final results. The HSM method is compared to the previous method where all subcomponents were manufactured by bending and forming processes, and then, they were joined by riveting.
Adapting Western Research Methods to Indigenous Ways of Knowing
Christopher, Suzanne
2013-01-01
Indigenous communities have long experienced exploitation by researchers and increasingly require participatory and decolonizing research processes. We present a case study of an intervention research project to exemplify a clash between Western research methodologies and Indigenous methodologies and how we attempted reconciliation. We then provide implications for future research based on lessons learned from Native American community partners who voiced concern over methods of Western deductive qualitative analysis. Decolonizing research requires constant reflective attention and action, and there is an absence of published guidance for this process. Continued exploration is needed for implementing Indigenous methods alone or in conjunction with appropriate Western methods when conducting research in Indigenous communities. Currently, examples of Indigenous methods and theories are not widely available in academic texts or published articles, and are often not perceived as valid. PMID:23678897
Simplified dichromated gelatin hologram recording process
NASA Technical Reports Server (NTRS)
Georgekutty, Tharayil G.; Liu, Hua-Kuang
1987-01-01
A simplified method for making dichromated gelatin (DCG) holographic optical elements (HOE) has been discovered. The method is much less tedious and it requires a period of processing time comparable with that for processing a silver halide hologram. HOE characteristics including diffraction efficiency (DE), linearity, and spectral sensitivity have been quantitatively investigated. The quality of the holographic grating is very high. Ninety percent or higher diffraction efficiency has been achieved in simple plane gratings made by this process.
A Dynamic Time Warping based covariance function for Gaussian Processes signature identification
NASA Astrophysics Data System (ADS)
Silversides, Katherine L.; Melkumyan, Arman
2016-11-01
Modelling stratiform deposits requires a detailed knowledge of the stratigraphic boundaries. In Banded Iron Formation (BIF) hosted ores of the Hamersley Group in Western Australia these boundaries are often identified using marker shales. Both Gaussian Processes (GP) and Dynamic Time Warping (DTW) have been previously proposed as methods to automatically identify marker shales in natural gamma logs. However, each method has different advantages and disadvantages. We propose a DTW based covariance function for the GP that combines the flexibility of the DTW with the probabilistic framework of the GP. The three methods are tested and compared on their ability to identify two natural gamma signatures from a Marra Mamba type iron ore deposit. These tests show that while all three methods can identify boundaries, the GP with the DTW covariance function combines and balances the strengths and weaknesses of the individual methods. This method identifies more positive signatures than the GP with the standard covariance function, and has a higher accuracy for identified signatures than the DTW. The combined method can handle larger variations in the signature without requiring multiple libraries, has a probabilistic output and does not require manual cut-off selections.
NASA Astrophysics Data System (ADS)
Riveiro, B.; DeJong, M.; Conde, B.
2016-06-01
Despite the tremendous advantages of the laser scanning technology for the geometric characterization of built constructions, there are important limitations preventing more widespread implementation in the structural engineering domain. Even though the technology provides extensive and accurate information to perform structural assessment and health monitoring, many people are resistant to the technology due to the processing times involved. Thus, new methods that can automatically process LiDAR data and subsequently provide an automatic and organized interpretation are required. This paper presents a new method for fully automated point cloud segmentation of masonry arch bridges. The method efficiently creates segmented, spatially related and organized point clouds, which each contain the relevant geometric data for a particular component (pier, arch, spandrel wall, etc.) of the structure. The segmentation procedure comprises a heuristic approach for the separation of different vertical walls, and later image processing tools adapted to voxel structures allows the efficient segmentation of the main structural elements of the bridge. The proposed methodology provides the essential processed data required for structural assessment of masonry arch bridges based on geometric anomalies. The method is validated using a representative sample of masonry arch bridges in Spain.
Spatial Statistics for Tumor Cell Counting and Classification
NASA Astrophysics Data System (ADS)
Wirjadi, Oliver; Kim, Yoo-Jin; Breuel, Thomas
To count and classify cells in histological sections is a standard task in histology. One example is the grading of meningiomas, benign tumors of the meninges, which requires to assess the fraction of proliferating cells in an image. As this process is very time consuming when performed manually, automation is required. To address such problems, we propose a novel application of Markov point process methods in computer vision, leading to algorithms for computing the locations of circular objects in images. In contrast to previous algorithms using such spatial statistics methods in image analysis, the present one is fully trainable. This is achieved by combining point process methods with statistical classifiers. Using simulated data, the method proposed in this paper will be shown to be more accurate and more robust to noise than standard image processing methods. On the publicly available SIMCEP benchmark for cell image analysis algorithms, the cell count performance of the present paper is significantly more accurate than results published elsewhere, especially when cells form dense clusters. Furthermore, the proposed system performs as well as a state-of-the-art algorithm for the computer-aided histological grading of meningiomas when combined with a simple k-nearest neighbor classifier for identifying proliferating cells.
Method for materials deposition by ablation transfer processing
Weiner, Kurt H.
1996-01-01
A method in which a thin layer of semiconducting, insulating, or metallic material is transferred by ablation from a source substrate, coated uniformly with a thin layer of said material, to a target substrate, where said material is desired, with a pulsed, high intensity, patternable beam of energy. The use of a patternable beam allows area-selective ablation from the source substrate resulting in additive deposition of the material onto the target substrate which may require a very low percentage of the area to be covered. Since material is placed only where it is required, material waste can be minimized by reusing the source substrate for depositions on multiple target substrates. Due to the use of a pulsed, high intensity energy source the target substrate remains at low temperature during the process, and thus low-temperature, low cost transparent glass or plastic can be used as the target substrate. The method can be carried out atmospheric pressures and at room temperatures, thus eliminating vacuum systems normally required in materials deposition processes. This invention has particular application in the flat panel display industry, as well as minimizing materials waste and associated costs.
NASA Astrophysics Data System (ADS)
Verrelst, Jochem; Malenovský, Zbyněk; Van der Tol, Christiaan; Camps-Valls, Gustau; Gastellu-Etchegorry, Jean-Philippe; Lewis, Philip; North, Peter; Moreno, Jose
2018-06-01
An unprecedented spectroscopic data stream will soon become available with forthcoming Earth-observing satellite missions equipped with imaging spectroradiometers. This data stream will open up a vast array of opportunities to quantify a diversity of biochemical and structural vegetation properties. The processing requirements for such large data streams require reliable retrieval techniques enabling the spatiotemporally explicit quantification of biophysical variables. With the aim of preparing for this new era of Earth observation, this review summarizes the state-of-the-art retrieval methods that have been applied in experimental imaging spectroscopy studies inferring all kinds of vegetation biophysical variables. Identified retrieval methods are categorized into: (1) parametric regression, including vegetation indices, shape indices and spectral transformations; (2) nonparametric regression, including linear and nonlinear machine learning regression algorithms; (3) physically based, including inversion of radiative transfer models (RTMs) using numerical optimization and look-up table approaches; and (4) hybrid regression methods, which combine RTM simulations with machine learning regression methods. For each of these categories, an overview of widely applied methods with application to mapping vegetation properties is given. In view of processing imaging spectroscopy data, a critical aspect involves the challenge of dealing with spectral multicollinearity. The ability to provide robust estimates, retrieval uncertainties and acceptable retrieval processing speed are other important aspects in view of operational processing. Recommendations towards new-generation spectroscopy-based processing chains for operational production of biophysical variables are given.
Estimating Logistics Support of Reusable Launch Vehicles During Conceptual Design
NASA Technical Reports Server (NTRS)
Morris, W. D.; White, N. H.; Davies, W. T.; Ebeling, C. E.
1997-01-01
Methods exist to define the logistics support requirements for new aircraft concepts but are not directly applicable to new launch vehicle concepts. In order to define the support requirements and to discriminate among new technologies and processing choices for these systems, NASA Langley Research Center (LaRC) is developing new analysis methods. This paper describes several methods under development, gives their current status, and discusses the benefits and limitations associated with their use.
2011-01-01
Background Academic literature and international standards bodies suggest that user involvement, via the incorporation of human factors engineering methods within the medical device design and development (MDDD) process, offer many benefits that enable the development of safer and more usable medical devices that are better suited to users' needs. However, little research has been carried out to explore medical device manufacturers' beliefs and attitudes towards user involvement within this process, or indeed what value they believe can be added by doing so. Methods In-depth interviews with representatives from 11 medical device manufacturers are carried out. We ask them to specify who they believe the intended users of the device to be, who they consult to inform the MDDD process, what role they believe the user plays within this process, and what value (if any) they believe users add. Thematic analysis is used to analyse the fully transcribed interview data, to gain insight into medical device manufacturers' beliefs and attitudes towards user involvement within the MDDD process. Results A number of high-level themes emerged, relating who the user is perceived to be, the methods used, the perceived value and barriers to user involvement, and the nature of user contributions. The findings reveal that despite standards agencies and academic literature offering strong support for the employment formal methods, manufacturers are still hesitant due to a range of factors including: perceived barriers to obtaining ethical approval; the speed at which such activity may be carried out; the belief that there is no need given the 'all-knowing' nature of senior health care staff and clinical champions; a belief that effective results are achievable by consulting a minimal number of champions. Furthermore, less senior health care practitioners and patients were rarely seen as being able to provide valuable input into the process. Conclusions Medical device manufacturers often do not see the benefit of employing formal human factors engineering methods within the MDDD process. Research is required to better understand the day-to-day requirements of manufacturers within this sector. The development of new or adapted methods may be required if user involvement is to be fully realised. PMID:21356097
Limiting factors in the production of deep microstructures
NASA Astrophysics Data System (ADS)
Tolfree, David W. L.; O'Neill, William; Tunna, Leslie; Sutcliffe, Christopher
1999-10-01
Microsystems increasingly require precision deep microstructures that can be cost-effectively designed and manufactured. New products must be able to meet the demands of the rapidly growing markets for microfluidic, micro- optical and micromechanical devices in industrial sectors which include chemicals, pharmaceuticals, biosciences, medicine and food. The realization of such products, first requires an effective process to design and manufacture prototypes. Two process methods used for the fabrication of high aspect-ratio microstructures are based on X-ray beam lithography with electroforming processes and direct micromachining with a frequency multiplied Nd:YAG laser using nanosecond pulse widths. Factors which limit the efficiency and precision obtainable using such processes are important parameters when deciding on the best fabrication method to use. A basic microstructure with narrow channels suitable for a microfluidic mixer have been fabricated using both these techniques and comparisons made of the limitations and suitability of the processes in respect of fast prototyping and manufacture or working devices.
Improvement of the System of Training of Specialists by University for Coal Mining Enterprises
NASA Astrophysics Data System (ADS)
Mikhalchenko, Vadim; Seredkina, Irina
2017-11-01
In the article the ingenious technique of the Quality Function Deployment with reference to the process of training of specialists with higher education by university is considered. The method is based on the step-by-step conversion of customer requirements into specific organizational, meaningful and functional transformations of the technological process of the university. A fully deployed quality function includes four stages of tracking customer requirements while creating a product: product planning and design, process design, production design. The Quality Function Deployment can be considered as one of the methods for optimizing the technological processes of training of specialists with higher education in the current economic conditions. Implemented at the initial stages of the life cycle of the technological process, it ensures not only the high quality of the "product" of graduate school, but also the fullest possible satisfaction of consumer's requests and expectations.
40 CFR 98.114 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... requirements. If you determine annual process CO2 emissions using the carbon mass balance procedure in § 98.113... D5373-08 Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in...
40 CFR 98.114 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... requirements. If you determine annual process CO2 emissions using the carbon mass balance procedure in § 98.113... D5373-08 Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in...
40 CFR 98.114 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... requirements. If you determine annual process CO2 emissions using the carbon mass balance procedure in § 98.113... D5373-08 Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in...
40 CFR 98.114 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... requirements. If you determine annual process CO2 emissions using the carbon mass balance procedure in § 98.113... D5373-08 Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in...
An ERTS-1 investigation for Lake Ontario and its basin
NASA Technical Reports Server (NTRS)
Polcyn, F. C.; Falconer, A. (Principal Investigator); Wagner, T. W.; Rebel, D. L.
1975-01-01
The author has identified the following significant results. Methods of manual, semi-automatic, and automatic (computer) data processing were evaluated, as were the requirements for spatial physiographic and limnological information. The coupling of specially processed ERTS data with simulation models of the watershed precipitation/runoff process provides potential for water resources management. Optimal and full use of the data requires a mix of data processing and analysis techniques, including single band editing, two band ratios, and multiband combinations. A combination of maximum likelihood ratio and near-IR/red band ratio processing was found to be particularly useful.
The Cox proportional Hazard model on duration of birth process
NASA Astrophysics Data System (ADS)
Wuryandari, Triastuti; Haryatmi Kartiko, Sri; Danardono
2018-05-01
The duration of birth process, which is measured from the birth sign until baby born, is one important factor to the whole outcome of delivery process. There is a method of birth process that given relaxing and gentle treatment to the mother caled as gentlebirth. Gentlebirth is a method of birth process that combines brain science, birth science and technology to empower positive birth without pain. However the effect of method to the duration of birth process is still need empirical investigations. Therefore, the objective of this paper is to analyze the duration of birth process using the appropriate statistical methods for durational data, survival data or time to event data. Since there are many variables or factor that may affect the duration, a regression model is considerated. The flexibility of the Cox Proportional Hazard Model in the sense that there is no distributional assumption required, makes the Cox Model as the appropriate model and method to analyze the duration birth process. It is concluded that the Gentlebirth method affects on duration of birth process, with Hazard Ratio of 2.073, showing that the duration of birth process with gentlebirth method is faster than the other method.
Besseling, Rut; Damen, Michiel; Tran, Thanh; Nguyen, Thanh; van den Dries, Kaspar; Oostra, Wim; Gerich, Ad
2015-10-10
Dry powder mixing is a wide spread Unit Operation in the Pharmaceutical industry. With the advent of in-line Near Infrared (NIR) Spectroscopy and Quality by Design principles, application of Process Analytical Technology to monitor Blend Uniformity (BU) is taking a more prominent role. Yet routine use of NIR for monitoring, let alone control of blending processes is not common in the industry, despite the improved process understanding and (cost) efficiency that it may offer. Method maintenance, robustness and translation to regulatory requirements have been important barriers to implement the method. This paper presents a qualitative NIR-BU method offering a convenient and compliant approach to apply BU control for routine operation and process understanding, without extensive calibration and method maintenance requirements. The method employs a moving F-test to detect the steady state of measured spectral variances and the endpoint of mixing. The fundamentals and performance characteristics of the method are first presented, followed by a description of the link to regulatory BU criteria, the method sensitivity and practical considerations. Applications in upscaling, tech transfer and commercial production are described, along with evaluation of the method performance by comparison with results from quantitative calibration models. A full application, in which end-point detection via the F-test controls the blending process of a low dose product, was successfully filed in Europe and Australia, implemented in commercial production and routinely used for about five years and more than 100 batches. Copyright © 2015 Elsevier B.V. All rights reserved.
Canseco Grellet, M A; Castagnaro, A; Dantur, K I; De Boeck, G; Ahmed, P M; Cárdenas, G J; Welin, B; Ruiz, R M
2016-10-01
To calculate fermentation efficiency in a continuous ethanol production process, we aimed to develop a robust mathematical method based on the analysis of metabolic by-product formation. This method is in contrast to the traditional way of calculating ethanol fermentation efficiency, where the ratio between the ethanol produced and the sugar consumed is expressed as a percentage of the theoretical conversion yield. Comparison between the two methods, at industrial scale and in sensitivity studies, showed that the indirect method was more robust and gave slightly higher fermentation efficiency values, although fermentation efficiency of the industrial process was found to be low (~75%). The traditional calculation method is simpler than the indirect method as it only requires a few chemical determinations in samples collected. However, a minor error in any measured parameter will have an important impact on the calculated efficiency. In contrast, the indirect method of calculation requires a greater number of determinations but is much more robust since an error in any parameter will only have a minor effect on the fermentation efficiency value. The application of the indirect calculation methodology in order to evaluate the real situation of the process and to reach an optimum fermentation yield for an industrial-scale ethanol production is recommended. Once a high fermentation yield has been reached the traditional method should be used to maintain the control of the process. Upon detection of lower yields in an optimized process the indirect method should be employed as it permits a more accurate diagnosis of causes of yield losses in order to correct the problem rapidly. The low fermentation efficiency obtained in this study shows an urgent need for industrial process optimization where the indirect calculation methodology will be an important tool to determine process losses. © 2016 The Society for Applied Microbiology.
Boston-Fleischhauer, Carol
2008-01-01
The design and implementation of efficient, effective, and safe processes are never-ending challenges in healthcare. Less than optimal performance levels and rising concerns about patient safety suggest that traditional process design methods are insufficient to meet design requirements. In this 2-part series, the author presents human factors engineering and reliability science as important knowledge to enhance existing operational and clinical process design methods in healthcare. An examination of these theories, application approaches, and examples are presented.
Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ostrouchov, George; Doll, William E.; Beard, Les P.
2009-01-01
Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process, the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods that are needed to address the estimation problems are known but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty,more » is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.« less
Listening as a Method of Learning a Foreign Language at the Non-Language Faculty of the University
ERIC Educational Resources Information Center
Kondrateva, Irina G.; Safina, Minnisa S.; Valeev, Agzam A.
2016-01-01
Learning a foreign language is becoming an increasingly important with Russia's integration into the world community. In this regard, increased requirements for the educational process and the development of new innovative teaching methods meet the requirements of the time. One of the important aspects of learning a foreign language is listening…
Food Safety Impacts from Post-Harvest Processing Procedures of Molluscan Shellfish.
Baker, George L
2016-04-18
Post-harvest Processing (PHP) methods are viable food processing methods employed to reduce human pathogens in molluscan shellfish that would normally be consumed raw, such as raw oysters on the half-shell. Efficacy of human pathogen reduction associated with PHP varies with respect to time, temperature, salinity, pressure, and process exposure. Regulatory requirements and PHP molluscan shellfish quality implications are major considerations for PHP usage. Food safety impacts associated with PHP of molluscan shellfish vary in their efficacy and may have synergistic outcomes when combined. Further research for many PHP methods are necessary and emerging PHP methods that result in minimal quality loss and effective human pathogen reduction should be explored.
Lindoerfer, Doris; Mansmann, Ulrich
2017-07-01
Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) A systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards show a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step to create standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.
Methods of Adapting Digital Content for the Learning Process via Mobile Devices
ERIC Educational Resources Information Center
Lopez, J. L. Gimenez; Royo, T. Magal; Laborda, Jesus Garcia; Calvo, F. Garde
2009-01-01
This article analyses different methods of adapting digital content for its delivery via mobile devices taking into account two aspects which are a fundamental part of the learning process; on the one hand, functionality of the contents, and on the other, the actual controlled navigation requirements that the learner needs in order to acquire high…
ERIC Educational Resources Information Center
Garner, Johny T.
2015-01-01
Organizational communication processes are complex, but all too often, researchers oversimplify the study of these processes by relying on a single method. Particularly when scholars and practitioners partner together to solve organizational problems, meaningful results require methodological flexibility and diversity. As an exemplar of the fit…
Furnace and support equipment for space processing. [space manufacturing - Czochralski method
NASA Technical Reports Server (NTRS)
Mazelsky, R.; Duncan, C. S.; Seidensticker, R. G.; Johnson, R. A.; Hopkins, R. H.; Roland, G. W.
1975-01-01
A core facility capable of performing a majority of materials processing experiments is discussed. Experiment classes are described, the needs peculiar to each experiment type are outlined, and projected facility requirements to perform the experiments are treated. Control equipment (automatic control) and variations of the Czochralski method for use in space are discussed.
Analytical difficulties facing today's regulatory laboratories: issues in method validation.
MacNeil, James D
2012-08-01
The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and up-dating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method or on the design of a validation scheme for a complex multi-residue method require a well-considered strategy, based on a current knowledge of international guidance documents and regulatory requirements, as well the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.
Generalization of the Poincare sphere to process 2D displacement signals
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Lamberti, Luciano
2017-06-01
Traditionally the multiple phase method has been considered as an essential tool for phase information recovery. The in-quadrature phase method that theoretically is an alternative pathway to achieve the same goal failed in actual applications. The authors in a previous paper dealing with 1D signals have shown that properly implemented the in-quadrature method yields phase values with the same accuracy than the multiple phase method. The present paper extends the methodology developed in 1D to 2D. This extension is not a straight forward process and requires the introduction of a number of additional concepts and developments. The concept of monogenic function provides the necessary tools required for the extension process. The monogenic function has a graphic representation through the Poincare sphere familiar in the field of Photoelasticity and through the developments introduced in this paper connected to the analysis of displacement fringe patterns. The paper is illustrated with examples of application that show that multiple phases method and the in-quadrature are two aspects of the same basic theoretical model.
Manufacturing methods of a composite cell case for a Ni-Cd battery
NASA Technical Reports Server (NTRS)
Bauer, J. L.; Bogner, R. S.; Lowe, E. P.; Orlowski, E.
1979-01-01
Graphite epoxy material for a nickel cadmium battery cell case has been evaluated and determined to perform in the simulated environment of the battery. The basic manufacturing method requires refinement to demonstrate production feasibility. The various facets of production scale-up, i.e., process and tooling development together with material and process control, have been integrated into a comprehensive manufacturing process that assures production reproducibility and product uniformity. Test results substantiate that a battery cell case produced from graphite epoxy pre-impregnated material utilizing internal pressure bag fabrication method is feasible.
Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz
2012-01-01
Background The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet unit are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regards to product quality and operational value. Materials and methods Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. Results The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. Discussion These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement. PMID:22044958
Use of Foodomics for Control of Food Processing and Assessing of Food Safety.
Josić, D; Peršurić, Ž; Rešetar, D; Martinović, T; Saftić, L; Kraljević Pavelić, S
Food chain, food safety, and food-processing sectors face new challenges due to globalization of food chain and changes in the modern consumer preferences. In addition, gradually increasing microbial resistance, changes in climate, and human errors in food handling remain a pending barrier for the efficient global food safety management. Consequently, a need for development, validation, and implementation of rapid, sensitive, and accurate methods for assessment of food safety often termed as foodomics methods is required. Even though, the growing role of these high-throughput foodomic methods based on genomic, transcriptomic, proteomic, and metabolomic techniques has yet to be completely acknowledged by the regulatory agencies and bodies. The sensitivity and accuracy of these methods are superior to previously used standard analytical procedures and new methods are suitable to address a number of novel requirements posed by the food production sector and global food market. © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the users manual. A software description and program listings modification of the data analysis software are included.
Phase derivative method for reconstruction of slightly off-axis digital holograms.
Guo, Cheng-Shan; Wang, Ben-Yi; Sha, Bei; Lu, Yu-Jie; Xu, Ming-Yuan
2014-12-15
A phase derivative (PD) method is proposed for reconstruction of off-axis holograms. In this method, a phase distribution of the tested object wave constrained within 0 to pi radian is firstly worked out by a simple analytical formula; then it is corrected to its right range from -pi to pi according to the sign characteristics of its first-order derivative. A theoretical analysis indicates that this PD method is particularly suitable for reconstruction of slightly off-axis holograms because it only requires the spatial frequency of the reference beam larger than spatial frequency of the tested object wave in principle. In addition, because the PD method belongs to a pure local method with no need of any integral operation or phase shifting algorithm in process of the phase retrieval, it could have some advantages in reducing computer load and memory requirements to the image processing system. Some experimental results are given to demonstrate the feasibility of the method.
3D freeform printing of silk fibroin.
Rodriguez, Maria J; Dixon, Thomas A; Cohen, Eliad; Huang, Wenwen; Omenetto, Fiorenzo G; Kaplan, David L
2018-04-15
Freeform fabrication has emerged as a key direction in printing biologically-relevant materials and structures. With this emerging technology, complex structures with microscale resolution can be created in arbitrary geometries and without the limitations found in traditional bottom-up or top-down additive manufacturing methods. Recent advances in freeform printing have used the physical properties of microparticle-based granular gels as a medium for the submerged extrusion of bioinks. However, most of these techniques require post-processing or crosslinking for the removal of the printed structures (Miller et al., 2015; Jin et al., 2016) [1,2]. In this communication, we introduce a novel method for the one-step gelation of silk fibroin within a suspension of synthetic nanoclay (Laponite) and polyethylene glycol (PEG). Silk fibroin has been used as a biopolymer for bioprinting in several contexts, but chemical or enzymatic additives or bulking agents are needed to stabilize 3D structures. Our method requires no post-processing of printed structures and allows for in situ physical crosslinking of pure aqueous silk fibroin into arbitrary geometries produced through freeform 3D printing. 3D bioprinting has emerged as a technology that can produce biologically relevant structures in defined geometries with microscale resolution. Techniques for fabrication of free-standing structures by printing into granular gel media has been demonstrated previously, however, these methods require crosslinking agents and post-processing steps on printed structures. Our method utilizes one-step gelation of silk fibroin within a suspension of synthetic nanoclay (Laponite), with no need for additional crosslinking compounds or post processing of the material. This new method allows for in situ physical crosslinking of pure aqueous silk fibroin into defined geometries produced through freeform 3D printing. Copyright © 2018 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Thermal Spray Formation of Polymer Coatings
NASA Technical Reports Server (NTRS)
Coquill, Scott; Galbraith, Stephen L.; Tuss. Darren L.; Ivosevic, Milan
2008-01-01
This innovation forms a sprayable polymer film using powdered precursor materials and an in-process heating method. This device directly applies a powdered polymer onto a substrate to form an adherent, mechanically-sound, and thickness-regulated film. The process can be used to lay down both fully dense and porous, e.g., foam, coatings. This system is field-deployable and includes power distribution, heater controls, polymer constituent material bins, flow controls, material transportation functions, and a thermal spray apparatus. The only thing required for operation in the field is a power source. Because this method does not require solvents, it does not release the toxic, volatile organic compounds of previous methods. Also, the sprayed polymer material is not degraded because this method does not use hot combustion gas or hot plasma gas. This keeps the polymer from becoming rough, porous, or poorly bonded.
A parallel implementation of a multisensor feature-based range-estimation method
NASA Technical Reports Server (NTRS)
Suorsa, Raymond E.; Sridhar, Banavar
1993-01-01
There are many proposed vision based methods to perform obstacle detection and avoidance for autonomous or semi-autonomous vehicles. All methods, however, will require very high processing rates to achieve real time performance. A system capable of supporting autonomous helicopter navigation will need to extract obstacle information from imagery at rates varying from ten frames per second to thirty or more frames per second depending on the vehicle speed. Such a system will need to sustain billions of operations per second. To reach such high processing rates using current technology, a parallel implementation of the obstacle detection/ranging method is required. This paper describes an efficient and flexible parallel implementation of a multisensor feature-based range-estimation algorithm, targeted for helicopter flight, realized on both a distributed-memory and shared-memory parallel computer.
Ruszczyńska, A; Szteyn, J; Wiszniewska-Laszczych, A
2007-01-01
Producing dairy products which are safe for consumers requires the constant monitoring of the microbiological quality of raw material, the production process itself and the end product. Traditional methods, still a "gold standard", require a specialized laboratory working on recognized and validated methods. Obtaining results is time- and labor-consuming and do not allow rapid evaluation. Hence, there is a need for a rapid, precise method enabling the real-time monitoring of microbiological quality, and flow cytometry serves this function well. It is based on labeling cells suspended in a solution with fluorescent dyes and pumping them into a measurement zone where they are exposed to a precisely focused laser beam. This paper is aimed at presenting the possibilities of applying flow cytometry in the dairy industry.
40 CFR 60.675 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 6 2011-07-01 2011-07-01 false Test methods and procedures. 60.675... Mineral Processing Plants § 60.675 Test methods and procedures. (a) In conducting the performance tests required in § 60.8, the owner or operator shall use as reference methods and procedures the test methods in...
SELECTING SITES FOR COMPARISON WITH CREATED WETLANDS
The paper describes the method used for selecting natural wetlands to compare with created wetlands. The results of the selection process and the advantages and disadvantages of the method are discussed. The random site selection method required extensive field work and may have ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The ''Disposal of Waste or Excess High Explosives'' project began January 1971. Various methods of disposal were investigated with the conclusion that incineration, at major ERDA facilities, would be the most feasible and safest method with the least cost and development time required. Two independent incinerator concepts were investigated: a rotary type for continuous processing and an enclosed pit type for batch processing. Both concepts are feasible; however, it is recommended that further investigations would be required to render them acceptable. It is felt that a larger effort would be required in the case of the rotary incinerator. The projectmore » was terminated (December 1976) prior to completion as a result of a grant of authority by the Texas Air Control Board allowing the ERDA Pantex Plant to continue indefinitely outdoor burning of explosives.« less
NASA Astrophysics Data System (ADS)
Ansari, Muhammad Ahsan; Zai, Sammer; Moon, Young Shik
2017-01-01
Manual analysis of the bulk data generated by computed tomography angiography (CTA) is time consuming, and interpretation of such data requires previous knowledge and expertise of the radiologist. Therefore, an automatic method that can isolate the coronary arteries from a given CTA dataset is required. We present an automatic yet effective segmentation method to delineate the coronary arteries from a three-dimensional CTA data cloud. Instead of a region growing process, which is usually time consuming and prone to leakages, the method is based on the optimal thresholding, which is applied globally on the Hessian-based vesselness measure in a localized way (slice by slice) to track the coronaries carefully to their distal ends. Moreover, to make the process automatic, we detect the aorta using the Hough transform technique. The proposed segmentation method is independent of the starting point to initiate its process and is fast in the sense that coronary arteries are obtained without any preprocessing or postprocessing steps. We used 12 real clinical datasets to show the efficiency and accuracy of the presented method. Experimental results reveal that the proposed method achieves 95% average accuracy.
Jin, Bo; Zhao, Haibo; Zheng, Chuguang; Liang, Zhiwu
2017-01-03
Exergy-based methods are widely applied to assess the performance of energy conversion systems; however, these methods mainly focus on a certain steady-state and have limited applications for evaluating the control impacts on system operation. To dynamically obtain the thermodynamic behavior and reveal the influences of control structures, layers and loops, on system energy performance, a dynamic exergy method is developed, improved, and applied to a complex oxy-combustion boiler island system for the first time. The three most common operating scenarios are studied, and the results show that the flow rate change process leads to less energy consumption than oxygen purity and air in-leakage change processes. The variation of oxygen purity produces the largest impact on system operation, and the operating parameter sensitivity is not affected by the presence of process control. The control system saves energy during flow rate and oxygen purity change processes, while it consumes energy during the air in-leakage change process. More attention should be paid to the oxygen purity change because it requires the largest control cost. In the control system, the supervisory control layer requires the greatest energy consumption and the largest control cost to maintain operating targets, while the steam control loops cause the main energy consumption.
Sampling methods for microbiological analysis of red meat and poultry carcasses.
Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos
2004-06-01
Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giaddui, T; Chen, W; Yu, J
2014-06-15
Purpose: To review IGRT credentialing experience and unexpected technical issues encountered in connection with advanced radiotherapy technologies as implemented in RTOG clinical trials. To update IGRT credentialing procedures with the aim of improving the quality of the process, and to increase the proportion of IGRT credentialing compliance. To develop a living disease site-specific IGRT encyclopedia. Methods: Numerous technical issues were encountered during the IGRT credentialing process. The criteria used for credentialing review were based on: image quality; anatomy included in fused data sets and shift results. Credentialing requirements have been updated according to the AAPM task group reports for IGRTmore » to ensure that all required technical items are included in the quality review process. Implementation instructions have been updated and expanded for recent protocols. Results: Technical issues observed during the credentialing review process include, but are not limited to: poor quality images; inadequate image acquisition region; poor data quality; shifts larger than acceptable; no soft tissue surrogate. The updated IGRT credentialing process will address these issues and will also include the technical items required from AAPM: TG 104; TG 142 and TG 179 reports. An instruction manual has been developed describing a remote credentialing method for reviewers. Submission requirements are updated, including images/documents as well as facility questionnaire. The review report now includes summary of the review process and the parameters that reviewers check. We have reached consensus on the minimum IGRT technical requirement for a number of disease sites. RTOG 1311(NRG-BR002A Phase 1 Study of Stereotactic Body Radiotherapy (SBRT) for the Treatment of Multiple Metastases) is an example, here; the protocol specified the minimum requirement for each anatomical sites (with/without fiducials). Conclusion: Technical issues are identified and reported. IGRT guidelines are updated, with the corresponding credentialing requirements. An IGRT encyclopedia describing site-specific implementation issues is currently in development.« less
On demand processing of climate station sensor data
NASA Astrophysics Data System (ADS)
Wöllauer, Stephan; Forteva, Spaska; Nauss, Thomas
2015-04-01
Large sets of climate stations with several sensors produce big amounts of finegrained time series data. To gain value of this data, further processing and aggregation is needed. We present a flexible system to process the raw data on demand. Several aspects need to be considered to process the raw data in a way that scientists can use the processed data conveniently for their specific research interests. First of all, it is not feasible to pre-process the data in advance because of the great variety of ways it can be processed. Therefore, in this approach only the raw measurement data is archived in a database. When a scientist requires some time series, the system processes the required raw data according to the user-defined request. Based on the type of measurement sensor, some data validation is needed, because the climate station sensors may produce erroneous data. Currently, three validation methods are integrated in the on demand processing system and are optionally selectable. The most basic validation method checks if measurement values are within a predefined range of possible values. For example, it may be assumed that an air temperature sensor measures values within a range of -40 °C to +60 °C. Values outside of this range are considered as a measurement error by this validation method and consequently rejected. An other validation method checks for outliers in the stream of measurement values by defining a maximum change rate between subsequent measurement values. The third validation method compares measurement data to the average values of neighboring stations and rejects measurement values with a high variance. These quality checks are optional, because especially extreme climatic values may be valid but rejected by some quality check method. An other important task is the preparation of measurement data in terms of time. The observed stations measure values in intervals of minutes to hours. Often scientists need a coarser temporal resolution (days, months, years). Therefore, the interval of time aggregation is selectable for the processing. For some use cases it is desirable that the resulting time series are as continuous as possible. To meet these requirements, the processing system includes techniques to fill gaps of missing values by interpolating measurement values with data from adjacent stations using available contemporaneous measurements from the respective stations as training datasets. Alongside processing of sensor values, we created interactive visualization techniques to get a quick overview of a big amount of archived time series data.
Organic electronics with polymer dielectrics on plastic substrates fabricated via transfer printing
NASA Astrophysics Data System (ADS)
Hines, Daniel R.
Printing methods are fast becoming important processing techniques for the fabrication of flexible electronics. Some goals for flexible electronics are to produce cheap, lightweight, disposable radio frequency identification (RFID) tags, very large flexible displays that can be produced in a roll-to-roll process and wearable electronics for both the clothing and medical industries. Such applications will require fabrication processes for the assembly of dissimilar materials onto a common substrate in ways that are compatible with organic and polymeric materials as well as traditional solid-state electronic materials. A transfer printing method has been developed with these goals and application in mind. This printing method relies primarily on differential adhesion where no chemical processing is performed on the device substrate. It is compatible with a wide variety of materials with each component printed in exactly the same way, thus avoiding any mixed processing steps on the device substrate. The adhesion requirements of one material printed onto a second are studied by measuring the surface energy of both materials and by surface treatments such as plasma exposure or the application of self-assembled monolayers (SAM). Transfer printing has been developed within the context of fabricating organic electronics onto plastic substrates because these materials introduce unique opportunities associated with processing conditions not typically required for traditional semiconducting materials. Compared to silicon, organic semiconductors are soft materials that require low temperature processing and are extremely sensitive to chemical processing and environmental contamination. The transfer printing process has been developed for the important and commonly used organic semiconducting materials, pentacene (Pn) and poly(3-hexylthiophene) (P3HT). A three-step printing process has been developed by which these materials are printed onto an electrode subassembly consisting of previously printed electrodes separated by a polymer dielectric layer all on a plastic substrate. These bottom contact, flexible organic thin-film transistors (OTFT) have been compared to unprinted (reference) devices consisting of top contact electrodes and a silicon dioxide dielectric layer on a silicon substrate. Printed Pn and P3HT TFTs have been shown to out-perform the reference devices. This enhancement has been attributed to an annealing under pressure of the organic semiconducting material.
Selectively strippable paint schemes
NASA Astrophysics Data System (ADS)
Stein, R.; Thumm, D.; Blackford, Roger W.
1993-03-01
In order to meet the requirements of more environmentally acceptable paint stripping processes many different removal methods are under evaluation. These new processes can be divided into mechanical and chemical methods. ICI has developed a paint scheme with intermediate coat and fluid resistant polyurethane topcoat which can be stripped chemically in a short period of time with methylene chloride free and phenol free paint strippers.
Capitano, Cinzia; Peri, Giorgia; Rizzo, Gianfranco; Ferrante, Patrizia
2017-03-01
Marble is a natural dimension stone that is widely used in building due to its resistance and esthetic qualities. Unfortunately, some concerns have arisen regarding its production process because quarrying and processing activities demand significant amounts of energy and greatly affect the environment. Further, performing an environmental analysis of a production process such as that of marble requires the consideration of many environmental aspects (e.g., noise, vibrations, dust and waste production, energy consumption). Unfortunately, the current impact accounting tools do not seem to be capable of considering all of the major aspects of the (marble) production process that may affect the environment and thus cannot provide a comprehensive and concise assessment of all environmental aspects associated with the marble production process. Therefore, innovative, easy, and reliable methods for evaluating its environmental impact are necessary, and they must be accessible for the non-technician. The present study intends to provide a contribution in this sense by proposing a reliable and easy-to-use evaluation method to assess the significance of the environmental impacts associated with the marble production process. In addition, an application of the method to an actual marble-producing company is presented to demonstrate its practicability. Because of its relative ease of use, the method presented here can also be used as a "self-assessment" tool for pursuing a virtuous environmental policy because it enables company owners to easily identify the segments of their production chain that most require environmental enhancement.
Hayashi, Ryusuke; Watanabe, Osamu; Yokoyama, Hiroki; Nishida, Shin'ya
2017-06-01
Characterization of the functional relationship between sensory inputs and neuronal or observers' perceptual responses is one of the fundamental goals of systems neuroscience and psychophysics. Conventional methods, such as reverse correlation and spike-triggered data analyses are limited in their ability to resolve complex and inherently nonlinear neuronal/perceptual processes because these methods require input stimuli to be Gaussian with a zero mean. Recent studies have shown that analyses based on a generalized linear model (GLM) do not require such specific input characteristics and have advantages over conventional methods. GLM, however, relies on iterative optimization algorithms and its calculation costs become very expensive when estimating the nonlinear parameters of a large-scale system using large volumes of data. In this paper, we introduce a new analytical method for identifying a nonlinear system without relying on iterative calculations and yet also not requiring any specific stimulus distribution. We demonstrate the results of numerical simulations, showing that our noniterative method is as accurate as GLM in estimating nonlinear parameters in many cases and outperforms conventional, spike-triggered data analyses. As an example of the application of our method to actual psychophysical data, we investigated how different spatiotemporal frequency channels interact in assessments of motion direction. The nonlinear interaction estimated by our method was consistent with findings from previous vision studies and supports the validity of our method for nonlinear system identification.
NASA Technical Reports Server (NTRS)
Menzel, Christopher; Mayer, Richard J.; Edwards, Douglas D.
1991-01-01
The Process Description Capture Method (IDEF3) is one of several Integrated Computer-Aided Manufacturing (ICAM) DEFinition methods developed by the Air Force to support systems engineering activities, and in particular, to support information systems development. These methods have evolved as a distillation of 'good practice' experience by information system developers and are designed to raise the performance level of the novice practitioner to one comparable with that of an expert. IDEF3 is meant to serve as a knowledge acquisition and requirements definition tool that structures the user's understanding of how a given process, event, or system works around process descriptions. A special purpose graphical language accompanying the method serves to highlight temporal precedence and causality relationships relative to the process or event being described.
Dunst, J; Willich, N; Sack, H; Engenhart-Cabillic, R; Budach, V; Popp, W
2014-02-01
The QUIRO study aimed to establish a secure level of quality and innovation in radiation oncology. Over 6 years, 27 specific surveys were conducted at 24 radiooncological departments. In all, 36 renowned experts from the field of radiation oncology (mostly head physicians and full professors) supported the realization of the study. A salient feature of the chosen methodological approach is the "process" as a means of systematizing diversified medical-technical procedures according to standardized criteria. On the one hand, "processes" as a tool of translation are adapted for creating and transforming standards into concrete clinical and medical actions; on the other hand, they provide the basis for standardized instruments and methods to determine the required needs of physicians, staff, and equipment. In the foreground of the collection and measurement of resource requirements were the processes of direct service provision which were subdivided into modules for reasons of clarity and comprehensibility. Overhead tasks (i.e., participation in quality management) were excluded from the main study and examined in a separate survey with appropriate methods. After the exploration of guidelines, tumor- or indication-specific examination and treatment processes were developed in expert workshops. Moreover, those specific modules were defined which characterize these entities and indications in a special degree. Afterwards, these modules were compiled according to their time and resources required in the "reference institution", i.e., in specialized and as competent recognized departments (mostly from the university area), by various suitable survey methods. The significance of the QUIRO study and the validity of the results were optimized in a process of constant improvements and comprehensive checks. As a consequence, the QUIRO study yields representative results concerning the resource requirement for specialized, qualitatively and technologically highly sophisticated radiooncologic treatment in Germany.
Changing to Concept-Based Curricula: The Process for Nurse Educators
Baron, Kristy A.
2017-01-01
Background: The complexity of health care today requires nursing graduates to use effective thinking skills. Many nursing programs are revising curricula to include concept-based learning that encourages problem-solving, effective thinking, and the ability to transfer knowledge to a variety of situations—requiring nurse educators to modify their teaching styles and methods to promote student-centered learning. Changing from teacher-centered learning to student-centered learning requires a major shift in thinking and application. Objective: The focus of this qualitative study was to understand the process of changing to concept-based curricula for nurse educators who previously taught in traditional curriculum designs. Methods: The sample included eight educators from two institutions in one Western state using a grounded theory design. Results: The themes that emerged from participants’ experiences consisted of the overarching concept, support for change, and central concept, finding meaning in the change. Finding meaning is supported by three main themes: preparing for the change, teaching in a concept-based curriculum, and understanding the teaching-learning process. Conclusion: Changing to a concept-based curriculum required a major shift in thinking and application. Through support, educators discovered meaning to make the change by constructing authentic learning opportunities that mirrored practice, refining the change process, and reinforcing benefits of teaching. PMID:29399236
USING CFD TO ANALYZE NUCLEAR SYSTEMS BEHAVIOR: DEFINING THE VALIDATION REQUIREMENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard Schultz
2012-09-01
A recommended protocol to formulate numeric tool specifications and validation needs in concert with practices accepted by regulatory agencies for advanced reactors is described. The protocol is based on the plant type and perceived transient and accident envelopes that translates to boundary conditions for a process that gives the: (a) key phenomena and figures-of-merit which must be analyzed to ensure that the advanced plant can be licensed, (b) specification of the numeric tool capabilities necessary to perform the required analyses—including bounding calculational uncertainties, and (c) specification of the validation matrices and experiments--including the desired validation data. The result of applyingmore » the process enables a complete program to be defined, including costs, for creating and benchmarking transient and accident analysis methods for advanced reactors. By following a process that is in concert with regulatory agency licensing requirements from the start to finish, based on historical acceptance of past licensing submittals, the methods derived and validated have a high probability of regulatory agency acceptance.« less
The atmosphere of Mars - Resources for the exploration and settlement of Mars
NASA Technical Reports Server (NTRS)
Meyer, T. R.; Mckay, C. P.
1984-01-01
This paper describes methods of processing the Mars atmosphere to supply water, oxygen and buffer gas for a Mars base. Existing life support system technology is combined with innovative methods of water extraction, and buffer gas processing. The design may also be extended to incorporate an integrated greenhouse to supply food, oxygen and water recycling. It is found that the work required to supply one kilogram of an argon/nitrogen buffer gas is 9.4 kW-hr. To extract water from the dry Martian atmosphere can require up to 102.8 kW-hr per kilogram of water depending on the relative humidity of the air.
Method for materials deposition by ablation transfer processing
Weiner, K.H.
1996-04-16
A method in which a thin layer of semiconducting, insulating, or metallic material is transferred by ablation from a source substrate, coated uniformly with a thin layer of said material, to a target substrate, where said material is desired, with a pulsed, high intensity, patternable beam of energy. The use of a patternable beam allows area-selective ablation from the source substrate resulting in additive deposition of the material onto the target substrate which may require a very low percentage of the area to be covered. Since material is placed only where it is required, material waste can be minimized by reusing the source substrate for depositions on multiple target substrates. Due to the use of a pulsed, high intensity energy source the target substrate remains at low temperature during the process, and thus low-temperature, low cost transparent glass or plastic can be used as the target substrate. The method can be carried out atmospheric pressures and at room temperatures, thus eliminating vacuum systems normally required in materials deposition processes. This invention has particular application in the flat panel display industry, as well as minimizing materials waste and associated costs. 1 fig.
9 CFR 320.1 - Records required to be kept.
Code of Federal Regulations, 2010 CFR
2010-01-01
... processing procedures to destroy trichinae in § 318.10(c)(3)(iv) (Methods 5 and 6). (8) Records of nutrition... control of the production process using advanced meat/bone separation machinery and meat recovery systems...
Application of capability indices and control charts in the analytical method control strategy.
Oliva, Alexis; Llabres Martinez, Matías
2017-08-01
In this study, we assessed the usefulness of control charts in combination with the process capability indices, C pm and C pk , in the control strategy of an analytical method. The traditional X-chart and moving range chart were used to monitor the analytical method over a 2-year period. The results confirmed that the analytical method is in-control and stable. Different criteria were used to establish the specifications limits (i.e. analyst requirements) for fixed method performance (i.e. method requirements). If the specification limits and control limits are equal in breadth, the method can be considered "capable" (C pm = 1), but it does not satisfy the minimum method capability requirements proposed by Pearn and Shu (2003). Similar results were obtained using the C pk index. The method capability was also assessed as a function of method performance for fixed analyst requirements. The results indicate that the method does not meet the requirements of the analytical target approach. A real-example data of a SEC with light-scattering detection method was used as a model whereas previously published data were used to illustrate the applicability of the proposed approach. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Integrated aerodynamic-structural design of a forward-swept transport wing
NASA Technical Reports Server (NTRS)
Haftka, Raphael T.; Grossman, Bernard; Kao, Pi-Jen; Polen, David M.; Sobieszczanski-Sobieski, Jaroslaw
1989-01-01
The introduction of composite materials is having a profound effect on aircraft design. Since these materials permit the designer to tailor material properties to improve structural, aerodynamic and acoustic performance, they require an integrated multidisciplinary design process. Futhermore, because of the complexity of the design process, numerical optimization methods are required. The utilization of integrated multidisciplinary design procedures for improving aircraft design is not currently feasible because of software coordination problems and the enormous computational burden. Even with the expected rapid growth of supercomputers and parallel architectures, these tasks will not be practical without the development of efficient methods for cross-disciplinary sensitivities and efficient optimization procedures. The present research is part of an on-going effort which is focused on the processes of simultaneous aerodynamic and structural wing design as a prototype for design integration. A sequence of integrated wing design procedures has been developed in order to investigate various aspects of the design process.
Periodical capacity setting methods for make-to-order multi-machine production systems
Altendorfer, Klaus; Hübl, Alexander; Jodlbauer, Herbert
2014-01-01
The paper presents different periodical capacity setting methods for make-to-order, multi-machine production systems with stochastic customer required lead times and stochastic processing times to improve service level and tardiness. These methods are developed as decision support when capacity flexibility exists, such as, a certain range of possible working hours a week for example. The methods differ in the amount of information used whereby all are based on the cumulated capacity demand at each machine. In a simulation study the methods’ impact on service level and tardiness is compared to a constant provided capacity for a single and a multi-machine setting. It is shown that the tested capacity setting methods can lead to an increase in service level and a decrease in average tardiness in comparison to a constant provided capacity. The methods using information on processing time and customer required lead time distribution perform best. The results found in this paper can help practitioners to make efficient use of their flexible capacity. PMID:27226649
Harden, Angela; Thomas, James; Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Flemming, Kate; Booth, Andrew; Garside, Ruth; Hannes, Karin; Noyes, Jane
2018-05-01
The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method evidence from process evaluations. Despite a proliferation of methods for the synthesis of qualitative research, less attention has focused on how to integrate these syntheses within intervention effectiveness reviews. In this article, we report updated guidance from the group on approaches, methods, and tools, which can be used to integrate the findings from quantitative studies evaluating intervention effectiveness with those from qualitative studies and process evaluations. We draw on conceptual analyses of mixed methods systematic review designs and the range of methods and tools that have been used in published reviews that have successfully integrated different types of evidence. We outline five key methods and tools as devices for integration which vary in terms of the levels at which integration takes place; the specialist skills and expertise required within the review team; and their appropriateness in the context of limited evidence. In situations where the requirement is the integration of qualitative and process evidence within intervention effectiveness reviews, we recommend the use of a sequential approach. Here, evidence from each tradition is synthesized separately using methods consistent with each tradition before integration takes place using a common framework. Reviews which integrate qualitative and process evaluation evidence alongside quantitative evidence on intervention effectiveness in a systematic way are rare. This guidance aims to support review teams to achieve integration and we encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.
Methods for dispensing mercury into devices
Grossman, Mark W.; George, William A.
1987-04-28
A process for dispensing mercury into devices which requires mercury. Mercury is first electrolytically separated from either HgO or Hg.sub.2 Cl.sub.2 and plated onto a cathode wire. The cathode wire is then placed into a device requiring mercury.
Code of Federal Regulations, 2010 CFR
2010-07-01
... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 29-FEDERAL PRODUCT DESCRIPTIONS 29.2... requirements for items, processes, procedures, practices, and methods that have been adopted as customary. Standards may also establish requirements for selection, application, and design criteria so as to achieve...
HL-20 operations and support requirements for the Personnel Launch System mission
NASA Technical Reports Server (NTRS)
Morris, W. D.; White, Nancy H.; Caldwell, Ronald G.
1993-01-01
The processing, mission planning, and support requirements were defined for the HL-20 lifting-body configuration that can serve as a Personnel Launch System. These requirements were based on the assumption of an operating environment that incorporates aircraft and airline support methods and techniques that are applicable to operations. The study covered the complete turnaround process for the HL-20, including landing through launch, and mission operations, but did not address the support requirements of the launch vehicle except for the integrated activities. Support is defined in terms of manpower, staffing levels, facilities, ground support equipment, maintenance/sparing requirements, and turnaround processing time. Support results were drawn from two contracted studies, plus an in-house analysis used to define the maintenance manpower. The results of the contracted studies were used as the basis for a stochastic simulation of the support environment to determine the sufficiency of support and the effect of variance on vehicle processing. Results indicate the levels of support defined for the HL-20 through this process to be sufficient to achieve the desired flight rate of eight flights per year.
A Study on Improving Information Processing Abilities Based on PBL
ERIC Educational Resources Information Center
Kim, Du Gyu; Lee, JaeMu
2014-01-01
This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…
Code of Federal Regulations, 2014 CFR
2014-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Red Blood Cells § 640.16 Processing. (a) Separation. Within the..., Red Blood Cells may be prepared either by centrifugation, done in a manner that will not tend to... for Red Blood Cells shall be the original blood containers unless the method of processing requires a...
Code of Federal Regulations, 2012 CFR
2012-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Red Blood Cells § 640.16 Processing. (a) Separation. Within the..., Red Blood Cells may be prepared either by centrifugation, done in a manner that will not tend to... for Red Blood Cells shall be the original blood containers unless the method of processing requires a...
Code of Federal Regulations, 2013 CFR
2013-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Red Blood Cells § 640.16 Processing. (a) Separation. Within the..., Red Blood Cells may be prepared either by centrifugation, done in a manner that will not tend to... for Red Blood Cells shall be the original blood containers unless the method of processing requires a...
Code of Federal Regulations, 2011 CFR
2011-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Red Blood Cells § 640.16 Processing. (a) Separation. Within the..., Red Blood Cells may be prepared either by centrifugation, done in a manner that will not tend to... for Red Blood Cells shall be the original blood containers unless the method of processing requires a...
Using Visualization and Computation in the Analysis of Separation Processes
ERIC Educational Resources Information Center
Joo, Yong Lak; Choudhary, Devashish
2006-01-01
For decades, every chemical engineer has been asked to have a background in separations. The required separations course can, however, be uninspiring and superficial because understanding many separation processes involves conventional graphical methods and commercial process simulators. We utilize simple, user-friendly mathematical software,…
Simulation of Simple Controlled Processes with Dead-Time.
ERIC Educational Resources Information Center
Watson, Keith R.; And Others
1985-01-01
The determination of closed-loop response of processes containing dead-time is typically not covered in undergraduate process control, possibly because the solution by Laplace transforms requires the use of Pade approximation for dead-time, which makes the procedure lengthy and tedious. A computer-aided method is described which simplifies the…
Laser Synthesis of Supported Catalysts for Carbon Nanotubes
NASA Technical Reports Server (NTRS)
VanderWal, Randall L.; Ticich, Thomas M.; Sherry, Leif J.; Hall, Lee J.; Schubert, Kathy (Technical Monitor)
2003-01-01
Four methods of laser assisted catalyst generation for carbon nanotube (CNT) synthesis have been tested. These include pulsed laser transfer (PLT), photolytic deposition (PLD), photothermal deposition (PTD) and laser ablation deposition (LABD). Results from each method are compared based on CNT yield, morphology and structure. Under the conditions tested, the PLT was the easiest method to implement, required the least time and also yielded the best pattemation. The photolytic and photothermal methods required organometallics, extended processing time and partial vacuums. The latter two requirements also held for the ablation deposition approach. In addition to control of the substrate position, controlled deposition duration was necessary to achieve an active catalyst layer. Although all methods were tested on both metal and quartz substrates, only the quartz substrates proved to be inactive towards the deposited catalyst particles.
NASA Astrophysics Data System (ADS)
Hibbard-Lubow, David Luke
The demands of digital memory have increased exponentially in recent history, requiring faster, smaller and more accurate storage methods. Two promising solutions to this ever-present problem are Bit Patterned Media (BPM) and Spin-Transfer Torque Magnetic Random Access Memory (STT-MRAM). Producing these technologies requires difficult and expensive fabrication techniques. Thus, the production processes must be optimized to allow these storage methods to compete commercially while continuing to increase their information storage density and reliability. I developed a process for the production of nanomagnetic devices (which can take the form of several types of digital memory) embedded in thin silicon nitride films. My focus was on optimizing the reactive ion etching recipe required to embed the device in the film. Ultimately, I found that recipe 37 (Power: 250W, CF4 nominal/actual flow rate: 25/25.4 sccm, O2 nominal/actual flow rate: 3.1/5.2 sccm, which gave a maximum pressure around 400 mTorr) gave the most repeatable and anisotropic results. I successfully used processes described in this thesis to make embedded nanomagnets, which could be used as bit patterned media. Another promising application of this work is to make embedded magnetic tunneling junctions, which are the storage medium used in MRAM. Doing so will require still some tweaks to the fabrication methods. Techniques for making these changes and their potential effects are discussed.
Surface infrastructure functions, requirements and subsystems for a manned Mars mission
NASA Technical Reports Server (NTRS)
Fairchild, Kyle
1986-01-01
Planning and development for a permanently manned scientific outpost on Mars requires an in-depth understanding and analysis of the functions the outpost is expected to perform. The optimum configuration that accomplishes these functions then arises during the trade studies process. In a project this complex, it becomes necessary to use a formal methodology to document the design and planning process. The method chosen for this study is called top-down functional decomposition. This method is used to determine the functions that are needed to accomplish the overall mission, then determine what requirements and systems are needed to do each of the functions. This method facilitates automation of the trades and options process. In the example, this was done with an off-the shelf software package called TK! olver. The basic functions that a permanently manned outpost on Mars must accomplish are: (1) Establish the Life Critical Systems; (2) Support Planetary Sciences and Exploration; and (3) Develop and Maintain Long-term Support Functions, including those systems needed towards self-sufficiency. The top-down functional decomposition methology, combined with standard spread sheet software, offers a powerful tool to quickly assess various design trades and analyze options. As the specific subsystems, and the relational rule algorithms are further refined, it will be possible to very accurately determine the implications of continually evolving mission requirements.
The extraction of bitumen from western oil sands. Annual report, July 1991--July 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oblad, A.G.; Bunger, J.W.; Dahlstrom, D.A.
1992-08-01
The University of Utah tar sand research and development program is concerned with research and development on Utah is extensive oil sands deposits. The program has been intended to develop a scientific and technological base required for eventual commercial recovery of the heavy oils from oil sands and processing these oils to produce synthetic crude oil and other products such as asphalt. The overall program is based on mining the oil sand, processing the mined sand to recover the heavy oils and upgrading them to products. Multiple deposits are being investigated since it is believed that a large scale (approximatelymore » 20,000 bbl/day) plant would require the use of resources from more than one deposit. The tasks or projects in the program are organized according to the following classification: Recovery technologies which includes thermal recovery methods, water extraction methods, and solvent extraction methods; upgrading and processing technologies which covers hydrotreating, hydrocracking, and hydropyrolysis; solvent extraction; production of specialty products; and environmental aspects of the production and processing technologies. These tasks are covered in this report.« less
Method for rapidly producing microporous and mesoporous materials
Coronado, Paul R.; Poco, John F.; Hrubesh, Lawrence W.; Hopper, Robert W.
1997-01-01
An improved, rapid process is provided for making microporous and mesoporous materials, including aerogels and pre-ceramics. A gel or gel precursor is confined in a sealed vessel to prevent structural expansion of the gel during the heating process. This confinement allows the gelation and drying processes to be greatly accelerated, and significantly reduces the time required to produce a dried aerogel compared to conventional methods. Drying may be performed either by subcritical drying with a pressurized fluid to expel the liquid from the gel pores or by supercritical drying. The rates of heating and decompression are significantly higher than for conventional methods.
A Bayesian Approach to Determination of F, D, and Z Values Used in Steam Sterilization Validation.
Faya, Paul; Stamey, James D; Seaman, John W
2017-01-01
For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the well-known D T , z , and F o values that are used in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these values to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. LAY ABSTRACT: For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the critical process parameters that are evaluated in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these parameters to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. © PDA, Inc. 2017.
Method of Optimizing the Construction of Machining, Assembly and Control Devices
NASA Astrophysics Data System (ADS)
Iordache, D. M.; Costea, A.; Niţu, E. L.; Rizea, A. D.; Babă, A.
2017-10-01
Industry dynamics, driven by economic and social requirements, must generate more interest in technological optimization, capable of ensuring a steady development of advanced technical means to equip machining processes. For these reasons, the development of tools, devices, work equipment and control, as well as the modernization of machine tools, is the certain solution to modernize production systems that require considerable time and effort. This type of approach is also related to our theoretical, experimental and industrial applications of recent years, presented in this paper, which have as main objectives the elaboration and use of mathematical models, new calculation methods, optimization algorithms, new processing and control methods, as well as some structures for the construction and configuration of technological equipment with a high level of performance and substantially reduced costs..
40 CFR 60.644 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 6 2011-07-01 2011-07-01 false Test methods and procedures. 60.644... Gas Processing: SO2 Emissions § 60.644 Test methods and procedures. (a) In conducting the performance tests required in § 60.8, the owner or operator shall use as reference methods and procedures the test...
An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*
Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.
2014-01-01
Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144
Fault detection of Tennessee Eastman process based on topological features and SVM
NASA Astrophysics Data System (ADS)
Zhao, Huiyang; Hu, Yanzhu; Ai, Xinbo; Hu, Yu; Meng, Zhen
2018-03-01
Fault detection in industrial process is a popular research topic. Although the distributed control system(DCS) has been introduced to monitor the state of industrial process, it still cannot satisfy all the requirements for fault detection of all the industrial systems. In this paper, we proposed a novel method based on topological features and support vector machine(SVM), for fault detection of industrial process. The proposed method takes global information of measured variables into account by complex network model and predicts whether a system has generated some faults or not by SVM. The proposed method can be divided into four steps, i.e. network construction, network analysis, model training and model testing respectively. Finally, we apply the model to Tennessee Eastman process(TEP). The results show that this method works well and can be a useful supplement for fault detection of industrial process.
The report examines the technologies used for drying of biomass and the energy requirements of biomass dryers. Biomass drying processes, drying methods, and the conventional types of dryers are surveyed generally. Drying methods and dryer studies using superheated steam as the d...
Agile manufacturing: The factory of the future
NASA Technical Reports Server (NTRS)
Loibl, Joseph M.; Bossieux, Terry A.
1994-01-01
The factory of the future will require an operating methodology which effectively utilizes all of the elements of product design, manufacturing and delivery. The process must respond rapidly to changes in product demand, product mix, design changes or changes in the raw materials. To achieve agility in a manufacturing operation, the design and development of the manufacturing processes must focus on customer satisfaction. Achieving greatest results requires that the manufacturing process be considered from product concept through sales. This provides the best opportunity to build a quality product for the customer at a reasonable rate. The primary elements of a manufacturing system include people, equipment, materials, methods and the environment. The most significant and most agile element in any process is the human resource. Only with a highly trained, knowledgeable work force can the proper methods be applied to efficiently process materials with machinery which is predictable, reliable and flexible. This paper discusses the affect of each element on the development of agile manufacturing systems.
A method of hidden Markov model optimization for use with geophysical data sets
NASA Technical Reports Server (NTRS)
Granat, R. A.
2003-01-01
Geophysics research has been faced with a growing need for automated techniques with which to process large quantities of data. A successful tool must meet a number of requirements: it should be consistent, require minimal parameter tuning, and produce scientifically meaningful results in reasonable time. We introduce a hidden Markov model (HMM)-based method for analysis of geophysical data sets that attempts to address these issues.
Advances in spectroscopic methods for quantifying soil carbon
USDA-ARS?s Scientific Manuscript database
The gold standard for soil C determination is combustion. However, this method requires expensive consumables, is limited to the determination of the total carbon and in the number of samples which can be processed (~100/d). With increased interest in soil C sequestration, faster methods are needed....
Code of Federal Regulations, 2012 CFR
2012-01-01
... teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student... policies and procedures. (3) The appropriate methods, procedures, and techniques for conducting flight...) The corrective action in the case of unsatisfactory training progress. (6) The approved methods...
Code of Federal Regulations, 2011 CFR
2011-01-01
... teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student... policies and procedures. (3) The appropriate methods, procedures, and techniques for conducting flight...) The corrective action in the case of unsatisfactory training progress. (6) The approved methods...
Code of Federal Regulations, 2013 CFR
2013-01-01
... teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student... policies and procedures. (3) The appropriate methods, procedures, and techniques for conducting flight...) The corrective action in the case of unsatisfactory training progress. (6) The approved methods...
Code of Federal Regulations, 2010 CFR
2010-01-01
... teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student... policies and procedures. (3) The appropriate methods, procedures, and techniques for conducting flight...) The corrective action in the case of unsatisfactory training progress. (6) The approved methods...
Evaluation and development plan of NRTA measurement methods for the Rokkasho Reprocessing Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, T.K.; Hakkila, E.A.; Flosterbuer, S.F.
Near-real-time accounting (NRTA) has been proposed as a safeguards method at the Rokkasho Reprocessing Plant (RRP), a large-scale commercial boiling water and pressurized water reactors spent-fuel reprocessing facility. NRTA for RRP requires material balance closures every month. To develop a more effective and practical NRTA system for RRP, we have evaluated NRTA measurement techniques and systems that might be implemented in both the main process and the co-denitration process areas at RRP to analyze the concentrations of plutonium in solutions and mixed oxide powder. Based on the comparative evaluation, including performance, reliability, design criteria, operation methods, maintenance requirements, and estimatedmore » costs for each possible measurement method, recommendations for development were formulated. This paper discusses the evaluations and reports on the recommendation of the NRTA development plan for potential implementation at RRP.« less
Mass production of silicon pore optics for ATHENA
NASA Astrophysics Data System (ADS)
Wille, Eric; Bavdaz, Marcos; Collon, Maximilien
2016-07-01
Silicon Pore Optics (SPO) provide high angular resolution with low effective area density as required for the Advanced Telescope for High Energy Astrophysics (Athena). The x-ray telescope consists of several hundreds of SPO mirror modules. During the development of the process steps of the SPO technology, specific requirements of a future mass production have been considered right from the beginning. The manufacturing methods heavily utilise off-the-shelf equipment from the semiconductor industry, robotic automation and parallel processing. This allows to upscale the present production flow in a cost effective way, to produce hundreds of mirror modules per year. Considering manufacturing predictions based on the current technology status, we present an analysis of the time and resources required for the Athena flight programme. This includes the full production process starting with Si wafers up to the integration of the mirror modules. We present the times required for the individual process steps and identify the equipment required to produce two mirror modules per day. A preliminary timeline for building and commissioning the required infrastructure, and for flight model production of about 1000 mirror modules, is presented.
Improvement of Selected Logistics Processes Using Quality Engineering Tools
NASA Astrophysics Data System (ADS)
Zasadzień, Michał; Žarnovský, Jozef
2018-03-01
Increase in the number of orders, the increasing quality requirements and the speed of order preparation require implementation of new solutions and improvement of logistics processes. Any disruption that occurs during execution of an order often leads to customer dissatisfaction, as well as loss of his/her confidence. The article presents a case study of the use of quality engineering methods and tools to improve the e-commerce logistic process. This made it possible to identify and prioritize key issues, identify their causes, and formulate improvement and prevention measures.
An Improved Aerial Target Localization Method with a Single Vector Sensor
Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin
2017-01-01
This paper focuses on the problems encountered in the actual data processing with the use of the existing aerial target localization methods, analyzes the causes of the problems, and proposes an improved algorithm. Through the processing of the sea experiment data, it is found that the existing algorithms have higher requirements for the accuracy of the angle estimation. The improved algorithm reduces the requirements of the angle estimation accuracy and obtains the robust estimation results. The closest distance matching estimation algorithm and the horizontal distance estimation compensation algorithm are proposed. The smoothing effect of the data after being post-processed by using the forward and backward two-direction double-filtering method has been improved, thus the initial stage data can be filtered, so that the filtering results retain more useful information. In this paper, the aerial target height measurement methods are studied, the estimation results of the aerial target are given, so as to realize the three-dimensional localization of the aerial target and increase the understanding of the underwater platform to the aerial target, so that the underwater platform has better mobility and concealment. PMID:29135956
Baral, Nawa Raj; Shah, Ajay
2017-05-01
Pretreatment is required to destroy recalcitrant structure of lignocelluloses and then transform into fermentable sugars. This study assessed techno-economics of steam explosion, dilute sulfuric acid, ammonia fiber explosion and biological pretreatments, and identified bottlenecks and operational targets for process improvement. Techno-economic models of these pretreatment processes for a cellulosic biorefinery of 113.5 million liters butanol per year excluding fermentation and wastewater treatment sections were developed using a modelling software-SuperPro Designer. Experimental data of the selected pretreatment processes based on corn stover were gathered from recent publications, and used for this analysis. Estimated sugar production costs ($/kg) via steam explosion, dilute sulfuric acid, ammonia fiber explosion and biological methods were 0.43, 0.42, 0.65 and 1.41, respectively. The results suggest steam explosion and sulfuric acid pretreatment methods might be good alternatives at present state of technology and other pretreatment methods require research and development efforts to be competitive with these pretreatment methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
Service Contract Compliance Management in Business Process Management
NASA Astrophysics Data System (ADS)
El Kharbili, Marwane; Pulvermueller, Elke
Compliance management is a critical concern for corporations, required to respect contracts. This concern is particularly relevant in the context of business process management (BPM) as this paradigm is getting adopted more widely for-designing and building IT systems. Enforcing contractual compliance needs to be modeled at different levels of a BPM framework, which also includes the service layer. In this paper, we discuss requirements and methods for modeling contractual compliance for an SOA-supported BPM. We also show how business rule management integrated into an industry BPM tool allows modeling and processing functional and non-functional-property constraints which may be extracted from business process contracts. This work proposes a framework that responds to the requirements identified and proposes an architecture implementing it. Our approach is also illustrated by an example.
15 CFR 90.8 - Evidence required.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., DEPARTMENT OF COMMERCE PROCEDURE FOR CHALLENGING POPULATION ESTIMATES § 90.8 Evidence required. (a) The... the criteria, standards, and regular processes the Census Bureau employs to generate the population... uses a cohort-component of change method to produce population estimates. Each year, the components of...
Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria
2014-01-01
Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires a high motivation and willingness to implement changes of both employees and management. Definition of quality indicators is required to systematically measure the quality of the specified processes. One way to represent comparable quality results is the use of quality indicators of the external quality assurance in accordance with Sect. 137 SGB V—a method which the Federal Joint Committee (GBA) and the institutions commissioned by the GBA have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. The combination of specified processes with quality indicators is beneficial for the information of employees. A process-based indicator dashboard provides essential information about the treatment process. These can be used for process analysis. In a continuous consideration of these indicator results values can be determined and errors will be remedied quickly. If due consideration is given to these indicators, they can be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.
Conceptual design of industrial process displays.
Pedersen, C R; Lind, M
1999-11-01
Today, process displays used in industry are often designed on the basis of piping and instrumentation diagrams without any method of ensuring that the needs of the operators are fulfilled. Therefore, a method for a systematic approach to the design of process displays is needed. This paper discusses aspects of process display design taking into account both the designer's and the operator's points of view. Three aspects are emphasized: the operator tasks, the display content and the display form. The distinction between these three aspects is the basis for proposing an outline for a display design method that matches the industrial practice of modular plant design and satisfies the needs of reusability of display design solutions. The main considerations in display design in the industry are to specify the operator's activities in detail, to extract the information the operators need from the plant design specification and documentation, and finally to present this information. The form of the display is selected from existing standardized display elements such as trend curves, mimic diagrams, ecological interfaces, etc. Further knowledge is required to invent new display elements. That is, knowledge about basic visual means of presenting information and how humans perceive and interpret these means and combinations. This knowledge is required in the systematic selection of graphical items for a given display content. The industrial part of the method is first illustrated in the paper by a simple example from a plant with batch processes. Later the method is applied to develop a supervisory display for a condenser system in a nuclear power plant. The differences between the continuous plant domain of power production and the batch processes from the example are analysed and broad categories of display types are proposed. The problems involved in specification and invention of a supervisory display are analysed and conclusions from these problems are made. It is concluded that the design method proposed provides a framework for the progress of the display design and is useful in pin-pointing the actual problems. The method was useful in reducing the number of existing displays that could fulfil the requirements of the supervision task. The method provided at the same time a framework for dealing with the problems involved in inventing new displays based on structured analysis. However the problems in a systematic approach to display invention still need consideration.
NASA Astrophysics Data System (ADS)
Trusiak, M.; Patorski, K.; Tkaczyk, T.
2014-12-01
We propose a fast, simple and experimentally robust method for reconstructing background-rejected optically-sectioned microscopic images using two-shot structured illumination approach. Innovative data demodulation technique requires two grid-illumination images mutually phase shifted by π (half a grid period) but precise phase displacement value is not critical. Upon subtraction of the two frames the input pattern with increased grid modulation is computed. The proposed demodulation procedure comprises: (1) two-dimensional data processing based on the enhanced, fast empirical mode decomposition (EFEMD) method for the object spatial frequency selection (noise reduction and bias term removal), and (2) calculating high contrast optically-sectioned image using the two-dimensional spiral Hilbert transform (HS). The proposed algorithm effectiveness is compared with the results obtained for the same input data using conventional structured-illumination (SIM) and HiLo microscopy methods. The input data were collected for studying highly scattering tissue samples in reflectance mode. In comparison with the conventional three-frame SIM technique we need one frame less and no stringent requirement on the exact phase-shift between recorded frames is imposed. The HiLo algorithm outcome is strongly dependent on the set of parameters chosen manually by the operator (cut-off frequencies for low-pass and high-pass filtering and η parameter value for optically-sectioned image reconstruction) whereas the proposed method is parameter-free. Moreover very short processing time required to efficiently demodulate the input pattern predestines proposed method for real-time in-vivo studies. Current implementation completes full processing in 0.25s using medium class PC (Inter i7 2,1 GHz processor and 8 GB RAM). Simple modification employed to extract only first two BIMFs with fixed filter window size results in reducing the computing time to 0.11s (8 frames/s).
Localization of multiple defects using the compact phased array (CPA) method
NASA Astrophysics Data System (ADS)
Senyurek, Volkan Y.; Baghalian, Amin; Tashakori, Shervin; McDaniel, Dwayne; Tansel, Ibrahim N.
2018-01-01
Array systems of transducers have found numerous applications in detection and localization of defects in structural health monitoring (SHM) of plate-like structures. Different types of array configurations and analysis algorithms have been used to improve the process of localization of defects. For accurate and reliable monitoring of large structures by array systems, a high number of actuator and sensor elements are often required. In this study, a compact phased array system consisting of only three piezoelectric elements is used in conjunction with an updated total focusing method (TFM) for localization of single and multiple defects in an aluminum plate. The accuracy of the localization process was greatly improved by including wave propagation information in TFM. Results indicated that the proposed CPA approach can locate single and multiple defects with high accuracy while decreasing the processing costs and the number of required transducers. This method can be utilized in critical applications such as aerospace structures where the use of a large number of transducers is not desirable.
Zhang, Le; Lawson, Ken; Yeung, Bernice; Wypych, Jette
2015-01-06
A purity method based on capillary zone electrophoresis (CZE) has been developed for the separation of isoforms of a highly glycosylated protein. The separation was found to be driven by the number of sialic acids attached to each isoform. The method has been characterized using orthogonal assays and shown to have excellent specificity, precision and accuracy. We have demonstrated the CZE method is a useful in-process assay to support cell culture and purification development of this glycoprotein. Compared to isoelectric focusing (IEF), the CZE method provides more quantitative results and higher sample throughput with excellent accuracy, qualities that are required for process development. In addition, the CZE method has been applied in the stability testing of purified glycoprotein samples.
Code of Federal Regulations, 2010 CFR
2010-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
Code of Federal Regulations, 2014 CFR
2014-01-01
...-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student relationship. (d... procedures. (3) The appropriate methods, procedures, and techniques for conducting flight instruction. (4... corrective action in the case of unsatisfactory training progress. (6) The approved methods, procedures, and...
Code of Federal Regulations, 2014 CFR
2014-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
Code of Federal Regulations, 2012 CFR
2012-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
Code of Federal Regulations, 2013 CFR
2013-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
Code of Federal Regulations, 2011 CFR
2011-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
GPU applications for data processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vladymyrov, Mykhailo, E-mail: mykhailo.vladymyrov@cern.ch; Aleksandrov, Andrey; INFN sezione di Napoli, I-80125 Napoli
2015-12-31
Modern experiments that use nuclear photoemulsion imply fast and efficient data acquisition from the emulsion can be performed. The new approaches in developing scanning systems require real-time processing of large amount of data. Methods that use Graphical Processing Unit (GPU) computing power for emulsion data processing are presented here. It is shown how the GPU-accelerated emulsion processing helped us to rise the scanning speed by factor of nine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Liping; Zhu, Fulong, E-mail: zhufulong@hust.edu.cn; Duan, Ke
Ultrasonic waves are widely used, with applications including the medical, military, and chemical fields. However, there are currently no effective methods for ultrasonic power measurement. Previously, ultrasonic power measurement has been reliant on mechanical methods such as hydrophones and radiation force balances. This paper deals with ultrasonic power measurement based on an unconventional method: acousto-optic interaction. Compared with mechanical methods, the optical method has a greater ability to resist interference and also has reduced environmental requirements. Therefore, this paper begins with an experimental determination of the acoustic power in water contained in a glass tank using a set of opticalmore » devices. Because the light intensity of the diffraction image generated by acousto-optic interaction contains the required ultrasonic power information, specific software was written to extract the light intensity information from the image through a combination of filtering, binarization, contour extraction, and other image processing operations. The power value can then be obtained rapidly by processing the diffraction image using a computer. The results of this work show that the optical method offers advantages that include accuracy, speed, and a noncontact measurement method.« less
Ultrasonic power measurement system based on acousto-optic interaction.
He, Liping; Zhu, Fulong; Chen, Yanming; Duan, Ke; Lin, Xinxin; Pan, Yongjun; Tao, Jiaquan
2016-05-01
Ultrasonic waves are widely used, with applications including the medical, military, and chemical fields. However, there are currently no effective methods for ultrasonic power measurement. Previously, ultrasonic power measurement has been reliant on mechanical methods such as hydrophones and radiation force balances. This paper deals with ultrasonic power measurement based on an unconventional method: acousto-optic interaction. Compared with mechanical methods, the optical method has a greater ability to resist interference and also has reduced environmental requirements. Therefore, this paper begins with an experimental determination of the acoustic power in water contained in a glass tank using a set of optical devices. Because the light intensity of the diffraction image generated by acousto-optic interaction contains the required ultrasonic power information, specific software was written to extract the light intensity information from the image through a combination of filtering, binarization, contour extraction, and other image processing operations. The power value can then be obtained rapidly by processing the diffraction image using a computer. The results of this work show that the optical method offers advantages that include accuracy, speed, and a noncontact measurement method.
Ultrasonic power measurement system based on acousto-optic interaction
NASA Astrophysics Data System (ADS)
He, Liping; Zhu, Fulong; Chen, Yanming; Duan, Ke; Lin, Xinxin; Pan, Yongjun; Tao, Jiaquan
2016-05-01
Ultrasonic waves are widely used, with applications including the medical, military, and chemical fields. However, there are currently no effective methods for ultrasonic power measurement. Previously, ultrasonic power measurement has been reliant on mechanical methods such as hydrophones and radiation force balances. This paper deals with ultrasonic power measurement based on an unconventional method: acousto-optic interaction. Compared with mechanical methods, the optical method has a greater ability to resist interference and also has reduced environmental requirements. Therefore, this paper begins with an experimental determination of the acoustic power in water contained in a glass tank using a set of optical devices. Because the light intensity of the diffraction image generated by acousto-optic interaction contains the required ultrasonic power information, specific software was written to extract the light intensity information from the image through a combination of filtering, binarization, contour extraction, and other image processing operations. The power value can then be obtained rapidly by processing the diffraction image using a computer. The results of this work show that the optical method offers advantages that include accuracy, speed, and a noncontact measurement method.
Wang, Ling; Muralikrishnan, Bala; Rachakonda, Prem; Sawyer, Daniel
2017-01-01
Terrestrial laser scanners (TLS) are increasingly used in large-scale manufacturing and assembly where required measurement uncertainties are on the order of few tenths of a millimeter or smaller. In order to meet these stringent requirements, systematic errors within a TLS are compensated in-situ through self-calibration. In the Network method of self-calibration, numerous targets distributed in the work-volume are measured from multiple locations with the TLS to determine parameters of the TLS error model. In this paper, we propose two new self-calibration methods, the Two-face method and the Length-consistency method. The Length-consistency method is proposed as a more efficient way of realizing the Network method where the length between any pair of targets from multiple TLS positions are compared to determine TLS model parameters. The Two-face method is a two-step process. In the first step, many model parameters are determined directly from the difference between front-face and back-face measurements of targets distributed in the work volume. In the second step, all remaining model parameters are determined through the Length-consistency method. We compare the Two-face method, the Length-consistency method, and the Network method in terms of the uncertainties in the model parameters, and demonstrate the validity of our techniques using a calibrated scale bar and front-face back-face target measurements. The clear advantage of these self-calibration methods is that a reference instrument or calibrated artifacts are not required, thus significantly lowering the cost involved in the calibration process. PMID:28890607
Solving coupled groundwater flow systems using a Jacobian Free Newton Krylov method
NASA Astrophysics Data System (ADS)
Mehl, S.
2012-12-01
Jacobian Free Newton Kyrlov (JFNK) methods can have several advantages for simulating coupled groundwater flow processes versus conventional methods. Conventional methods are defined here as those based on an iterative coupling (rather than a direct coupling) and/or that use Picard iteration rather than Newton iteration. In an iterative coupling, the systems are solved separately, coupling information is updated and exchanged between the systems, and the systems are re-solved, etc., until convergence is achieved. Trusted simulators, such as Modflow, are based on these conventional methods of coupling and work well in many cases. An advantage of the JFNK method is that it only requires calculation of the residual vector of the system of equations and thus can make use of existing simulators regardless of how the equations are formulated. This opens the possibility of coupling different process models via augmentation of a residual vector by each separate process, which often requires substantially fewer changes to the existing source code than if the processes were directly coupled. However, appropriate perturbation sizes need to be determined for accurate approximations of the Frechet derivative, which is not always straightforward. Furthermore, preconditioning is necessary for reasonable convergence of the linear solution required at each Kyrlov iteration. Existing preconditioners can be used and applied separately to each process which maximizes use of existing code and robust preconditioners. In this work, iteratively coupled parent-child local grid refinement models of groundwater flow and groundwater flow models with nonlinear exchanges to streams are used to demonstrate the utility of the JFNK approach for Modflow models. Use of incomplete Cholesky preconditioners with various levels of fill are examined on a suite of nonlinear and linear models to analyze the effect of the preconditioner. Comparisons of convergence and computer simulation time are made using conventional iteratively coupled methods and those based on Picard iteration to those formulated with JFNK to gain insights on the types of nonlinearities and system features that make one approach advantageous. Results indicate that nonlinearities associated with stream/aquifer exchanges are more problematic than those resulting from unconfined flow.
Real-time biscuit tile image segmentation method based on edge detection.
Matić, Tomislav; Aleksi, Ivan; Hocenski, Željko; Kraus, Dieter
2018-05-01
In this paper we propose a novel real-time Biscuit Tile Segmentation (BTS) method for images from ceramic tile production line. BTS method is based on signal change detection and contour tracing with a main goal of separating tile pixels from background in images captured on the production line. Usually, human operators are visually inspecting and classifying produced ceramic tiles. Computer vision and image processing techniques can automate visual inspection process if they fulfill real-time requirements. Important step in this process is a real-time tile pixels segmentation. BTS method is implemented for parallel execution on a GPU device to satisfy the real-time constraints of tile production line. BTS method outperforms 2D threshold-based methods, 1D edge detection methods and contour-based methods. Proposed BTS method is in use in the biscuit tile production line. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Methods for dispensing mercury into devices
Grossman, M.W.; George, W.A.
1987-04-28
A process is described for dispensing mercury into devices which requires mercury. Mercury is first electrolytically separated from either HgO or Hg[sub 2]Cl[sub 2] and plated onto a cathode wire. The cathode wire is then placed into a device requiring mercury. 2 figs.
Documentation Panels Enhance Teacher Education Programs
ERIC Educational Resources Information Center
Warash, Bobbie Gibson
2005-01-01
Documentation of children's projects is advantageous to their learning process and is also a good method for student teachers to observe the process of learning. Documentation panels are a unique way to help student teachers understand how children learn. Completing a panel requires a student teacher to think through a process. Teachers must learn…
5 CFR 2429.12 - Service of process and papers by the Authority.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Service of process and papers by the... REQUIREMENTS Miscellaneous § 2429.12 Service of process and papers by the Authority. (a) Methods of service... COUNSEL OF THE FEDERAL LABOR RELATIONS AUTHORITY AND FEDERAL SERVICE IMPASSES PANEL FEDERAL LABOR...
The Transition Assessment Process and IDEIA 2004
ERIC Educational Resources Information Center
Sitlington, Patricia L.; Clark, Gary M.
2007-01-01
This article will first provide an overview of the transition assessment process in terms of the requirements of the Individuals with Disabilities Education Improvement Act of 2004 and the basic tenets of the process. The second section will provide an overview of the methods of gathering assessment information on the student and on the living,…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-15
... complete an approval process as required of other entities seeking to purchase REO properties under the FHA... process must meet the provisions of the NSP Guidance on Conditional Purchase Agreements found at http... Housing Commissioner, HUD. ACTION: Notice. SUMMARY: This notice outlines the process by which governmental...
Unintended Consequences: How Qualification Constrains Innovation
NASA Technical Reports Server (NTRS)
Brice, Craig A.
2011-01-01
The development and implementation of new materials and manufacturing processes for aerospace application is often hindered by the high cost and long time span associated with current qualification procedures. The data requirements necessary for material and process qualification are extensive and often require millions of dollars and multiple years to complete. Furthermore, these qualification data can become obsolete for even minor changes to the processing route. This burden is a serious impediment to the pursuit of revolutionary new materials and more affordable processing methods for air vehicle structures. The application of integrated computational materials engineering methods to this problem can help to reduce the barriers to rapid insertion of new materials and processes. By establishing predictive capability for the development of microstructural features in relation to processing and relating this to critical property characteristics, a streamlined approach to qualification is possible. This paper critically examines the advantages and challenges to a modeling-assisted qualification approach for aerospace structural materials. An example of how this approach might apply towards the emerging field of additive manufacturing is discussed in detail.
Solution-processed flexible NiO resistive random access memory device
NASA Astrophysics Data System (ADS)
Kim, Soo-Jung; Lee, Heon; Hong, Sung-Hoon
2018-04-01
Non-volatile memories (NVMs) using nanocrystals (NCs) as active materials can be applied to soft electronic devices requiring a low-temperature process because NCs do not require a heat treatment process for crystallization. In addition, memory devices can be implemented simply by using a patterning technique using a solution process. In this study, a flexible NiO ReRAM device was fabricated using a simple NC patterning method that controls the capillary force and dewetting of a NiO NC solution at low temperature. The switching behavior of a NiO NC based memory was clearly observed by conductive atomic force microscopy (c-AFM).
Spacelab Mission Implementation Cost Assessment (SMICA)
NASA Technical Reports Server (NTRS)
Guynes, B. V.
1984-01-01
A total savings of approximately 20 percent is attainable if: (1) mission management and ground processing schedules are compressed; (2) the equipping, staffing, and operating of the Payload Operations Control Center is revised, and (3) methods of working with experiment developers are changed. The development of a new mission implementation technique, which includes mission definition, experiment development, and mission integration/operations, is examined. The Payload Operations Control Center is to relocate and utilize new computer equipment to produce cost savings. Methods of reducing costs by minimizing the Spacelab and payload processing time during pre- and post-mission operation at KSC are analyzed. The changes required to reduce costs in the analytical integration process are studied. The influence of time, requirements accountability, and risk on costs is discussed. Recommendation for cost reductions developed by the Spacelab Mission Implementation Cost Assessment study are listed.
NASA Technical Reports Server (NTRS)
Withey, James V.
1986-01-01
The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included to assumed in such requirements. It was found that prelininary designs structured around this implementation abscure the data flow of the real world system that is modeled and that it is consequently difficult and costly to maintain, update and reuse the resulting software. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the largest environment and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.
Compute as Fast as the Engineers Can Think! ULTRAFAST COMPUTING TEAM FINAL REPORT
NASA Technical Reports Server (NTRS)
Biedron, R. T.; Mehrotra, P.; Nelson, M. L.; Preston, M. L.; Rehder, J. J.; Rogersm J. L.; Rudy, D. H.; Sobieski, J.; Storaasli, O. O.
1999-01-01
This report documents findings and recommendations by the Ultrafast Computing Team (UCT). In the period 10-12/98, UCT reviewed design case scenarios for a supersonic transport and a reusable launch vehicle to derive computing requirements necessary for support of a design process with efficiency so radically improved that human thought rather than the computer paces the process. Assessment of the present computing capability against the above requirements indicated a need for further improvement in computing speed by several orders of magnitude to reduce time to solution from tens of hours to seconds in major applications. Evaluation of the trends in computer technology revealed a potential to attain the postulated improvement by further increases of single processor performance combined with massively parallel processing in a heterogeneous environment. However, utilization of massively parallel processing to its full capability will require redevelopment of the engineering analysis and optimization methods, including invention of new paradigms. To that end UCT recommends initiation of a new activity at LaRC called Computational Engineering for development of new methods and tools geared to the new computer architectures in disciplines, their coordination, and validation and benefit demonstration through applications.
ERIC Educational Resources Information Center
Hamilton, Nancy Jo
2012-01-01
Reading is a process that requires the enactment of many cognitive processes. Each of these processes uses a certain amount of working memory resources, which are severely constrained by biology. More efficiency in the function of working memory may mediate the biological limits of same. Reading relevancy instructions may be one such method to…
An Application of the Quadrature-Free Discontinuous Galerkin Method
NASA Technical Reports Server (NTRS)
Lockard, David P.; Atkins, Harold L.
2000-01-01
The process of generating a block-structured mesh with the smoothness required for high-accuracy schemes is still a time-consuming process often measured in weeks or months. Unstructured grids about complex geometries are more easily generated, and for this reason, methods using unstructured grids have gained favor for aerodynamic analyses. The discontinuous Galerkin (DG) method is a compact finite-element projection method that provides a practical framework for the development of a high-order method using unstructured grids. Higher-order accuracy is obtained by representing the solution as a high-degree polynomial whose time evolution is governed by a local Galerkin projection. The traditional implementation of the discontinuous Galerkin uses quadrature for the evaluation of the integral projections and is prohibitively expensive. Atkins and Shu introduced the quadrature-free formulation in which the integrals are evaluated a-priori and exactly for a similarity element. The approach has been demonstrated to possess the accuracy required for acoustics even in cases where the grid is not smooth. Other issues such as boundary conditions and the treatment of non-linear fluxes have also been studied in earlier work This paper describes the application of the quadrature-free discontinuous Galerkin method to a two-dimensional shear layer problem. First, a brief description of the method is given. Next, the problem is described and the solution is presented. Finally, the resources required to perform the calculations are given.
Space infrared telescope pointing control system. Automated star pattern recognition
NASA Technical Reports Server (NTRS)
Powell, J. D.; Vanbezooijen, R. W. H.
1985-01-01
The Space Infrared Telescope Facility (SIRTF) is a free flying spacecraft carrying a 1 meter class cryogenically cooled infrared telescope nearly three oders of magnitude most sensitive than the current generation of infrared telescopes. Three automatic target acquisition methods will be presented that are based on the use of an imaging star tracker. The methods are distinguished by the number of guidestars that are required per target, the amount of computational capability necessary, and the time required for the complete acquisition process. Each method is described in detail.
NASA Technical Reports Server (NTRS)
Gault, J. W. (Editor); Trivedi, K. S. (Editor); Clary, J. B. (Editor)
1980-01-01
The validation process comprises the activities required to insure the agreement of system realization with system specification. A preliminary validation methodology for fault tolerant systems documented. A general framework for a validation methodology is presented along with a set of specific tasks intended for the validation of two specimen system, SIFT and FTMP. Two major areas of research are identified. First, are those activities required to support the ongoing development of the validation process itself, and second, are those activities required to support the design, development, and understanding of fault tolerant systems.
GSFC specification electronic data processing magnetic recording tape
NASA Technical Reports Server (NTRS)
Tinari, D. F.; Perry, J. L.
1980-01-01
The design requirements are given for magnetic oxide coated, electronic data processing tape, wound on reels. Magnetic recording tape types covered by this specification are intended for use on digital tape transports using the Non-Return-to-Zero-change-on-ones (NRZI) recording method for recording densities up to and including 800 characters per inch (cpi) and the Phase-Encoding (PE) recording method for a recording density of 1600 cpi.
Dahling, Daniel R
2002-01-01
Large-scale virus studies of groundwater systems require practical and sensitive procedures for both sample processing and viral assay. Filter adsorption-elution procedures have traditionally been used to process large-volume water samples for viruses. In this study, five filter elution procedures using cartridge filters were evaluated for their effectiveness in processing samples. Of the five procedures tested, the third method, which incorporated two separate beef extract elutions (one being an overnight filter immersion in beef extract), recovered 95% of seeded poliovirus compared with recoveries of 36 to 70% for the other methods. For viral enumeration, an expanded roller bottle quantal assay was evaluated using seeded poliovirus. This cytopathic-based method was considerably more sensitive than the standard plaque assay method. The roller bottle system was more economical than the plaque assay for the evaluation of comparable samples. Using roller bottles required less time and manipulation than the plaque procedure and greatly facilitated the examination of large numbers of samples. The combination of the improved filter elution procedure and the roller bottle assay for viral analysis makes large-scale virus studies of groundwater systems practical. This procedure was subsequently field tested during a groundwater study in which large-volume samples (exceeding 800 L) were processed through the filters.
Method for rapidly producing microporous and mesoporous materials
Coronado, P.R.; Poco, J.F.; Hrubesh, L.W.; Hopper, R.W.
1997-11-11
An improved, rapid process is provided for making microporous and mesoporous materials, including aerogels and pre-ceramics. A gel or gel precursor is confined in a sealed vessel to prevent structural expansion of the gel during the heating process. This confinement allows the gelation and drying processes to be greatly accelerated, and significantly reduces the time required to produce a dried aerogel compared to conventional methods. Drying may be performed either by subcritical drying with a pressurized fluid to expel the liquid from the gel pores or by supercritical drying. The rates of heating and decompression are significantly higher than for conventional methods. 3 figs.
Consolidation of lunar regolith: Microwave versus direct solar heating
NASA Technical Reports Server (NTRS)
Kunitzer, J.; Strenski, D. G.; Yankee, S. J.; Pletka, B. J.
1991-01-01
The production of construction materials on the lunar surface will require an appropriate fabrication technique. Two processing methods considered as being suitable for producing dense, consolidated products such as bricks are direct solar heating and microwave heating. An analysis was performed to compare the two processes in terms of the amount of power and time required to fabricate bricks of various size. The regolith was considered to be a mare basalt with an overall density of 60 pct. of theoretical. Densification was assumed to take place by vitrification since this process requires moderate amounts of energy and time while still producing dense products. Microwave heating was shown to be significantly faster compared to solar furnace heating for rapid production of realistic-size bricks.
USE OF TIE METHODS IN A LARGER CONTEXT: THE DIAGNOSTICS APPROACH
There is an increasing need to determine the identity of stressors in the environment. For example, in the US, the Total Maximum Daily Loading (TMDL) process requires states to determine if all surface waters meet specific use requirements (e.g., swimmable, fishable etc.). Surf...
UAS remote sensing for precision agriculture: An independent assessment
USDA-ARS?s Scientific Manuscript database
Small Unmanned Aircraft Systems (sUAS) are recognized as potentially important remote-sensing platforms for precision agriculture. However, research is required to determine which sensors and data processing methods are required to use sUAS in an efficient and cost-effective manner. Oregon State U...
Improved process robustness by using closed loop control in deep drawing applications
NASA Astrophysics Data System (ADS)
Barthau, M.; Liewald, M.; Christian, Held
2017-09-01
The production of irregular shaped deep-drawing parts with high quality requirements, which are common in today’s automotive production, permanently challenges production processes. High requirements on lightweight construction of passenger car bodies following European regulations until 2020 have been massively increasing the use of high strength steels substantially for years and are also leading to bigger challenges in sheet metal part production. Of course, the more and more complex shapes of today’s car body shells also intensify the issue due to modern and future design criteria. The metal forming technology tries to meet these challenges by developing a highly sophisticated layout of deep drawing dies that consider part quality requirements, process robustness and controlled material flow during the deep or stretch drawing process phase. A new method for controlling material flow using a closed loop system was developed at the IFU Stuttgart. In contrast to previous approaches, this new method allows a control intervention during the deep-drawing stroke. The blank holder force around the outline of the drawn part is used as control variable. The closed loop is designed as trajectory follow up with feed forward control. The used command variable is the part-wall stress that is measured with a piezo-electric measuring pin. In this paper the used control loop will be described in detail. The experimental tool that was built for testing the new control approach is explained here with its features. A method for gaining the follow up trajectories from simulation will also be presented. Furthermore, experimental results considering the robustness of the deep drawing process and the gain in process performance with developed control loop will be shown. Finally, a new procedure for the industrial application of the new control method of deep drawing will be presented by using a new kind of active element to influence the local blank holder pressure onto part flange.
Environmental monitoring of the orbiter payload bay and Orbiter Processing Facilities
NASA Technical Reports Server (NTRS)
Bartelson, D. W.; Johnson, A. M.
1985-01-01
Contamination control in the Orbiter Processing Facility (OPF) is studied. The clean level required in the OPF is generally clean, which means no residue, dirt, debris, or other extraneous contamination; various methods of maintaining this level of cleanliness are described. The monitoring and controlling of the temperature, relative humidity, and air quality in the OPF are examined. Additional modifications to the OPF to improve contamination control are discussed. The methods used to maintain the payload changeout room at a level of visually clean, no particulates are to be detected by the unaided eye, are described. The payload bay (PLB) must sustain the cleanliness level required for the specific Orbiter's mission; the three levels of clean are defined as: (1) standard, (2) sensitive, and (3) high sensitive. The cleaning and inspection verification required to achieve the desired cleanliness level on a variety of PLB surface types are examined.
Scandurra, Isabella; Hägglund, Maria; Koch, Sabine
2008-01-01
A significant problem with current health information technologies is that they poorly support collaborative work of healthcare professionals, sometimes leading to a fragmentation of workflow and disruption of healthcare processes. This paper presents two homecare cases, both applying multi-disciplinary thematic seminars (MdTS) as a collaborative method for user needs elicitation and requirements specification. This study describes the MdTS application to elicit user needs from different perspectives to coincide with collaborative professions' work practices in two cases. Despite different objectives, the two cases validated that MdTS emphasized the "points of intersection" in cooperative work. Different user groups with similar, yet distinct needs reached a common understanding of the entire work process, agreed upon requirements and participated in the design of prototypes supporting cooperative work. MdTS was applicable in both exploratory and normative studies aiming to elicit the specific requirements in a cooperative environment.
Structure and Randomness of Continuous-Time, Discrete-Event Processes
NASA Astrophysics Data System (ADS)
Marzen, Sarah E.; Crutchfield, James P.
2017-10-01
Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ɛ -machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
1991-10-01
SUBJECT TERMS 15. NUMBER OF PAGES engineering management information systems method formalization 60 information engineering process modeling 16 PRICE...CODE information systems requirements definition methods knowlede acquisition methods systems engineering 17. SECURITY CLASSIFICATION ji. SECURITY... Management , Inc., Santa Monica, California. CORYNEN, G. C., 1975, A Mathematical Theory of Modeling and Simula- tion. Ph.D. Dissertation, Department
Updating National Topographic Data Base Using Change Detection Methods
NASA Astrophysics Data System (ADS)
Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.
2016-06-01
The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and the changes occur every day, therefore, the users expect that the existing database will portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil user's requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information, should be developed. Until recently, automated processes did not yield satisfactory results, and a typically process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In the recent years, the development of technologies in mapping, advancement in image processing algorithms and computer vision, together with the development of digital aerial cameras with NIR band and Very High Resolution satellites, allow the implementation of a cost effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating NTDB in the Survey of Israel.
Dobecki, Marek
2012-01-01
This paper reviews the requirements for measurement methods of chemical agents in the air at workstations. European standards, which have a status of Polish standards, comprise some requirements and information on sampling strategy, measuring techniques, type of samplers, sampling pumps and methods of occupational exposure evaluation at a given technological process. Measurement methods, including air sampling and analytical procedure in a laboratory, should be appropriately validated before intended use. In the validation process, selected methods are tested and budget of uncertainty is set up. The validation procedure that should be implemented in the laboratory together with suitable statistical tools and major components of uncertainity to be taken into consideration, were presented in this paper. Methods of quality control, including sampling and laboratory analyses were discussed. Relative expanded uncertainty for each measurement expressed as a percentage, should not exceed the limit of values set depending on the type of occupational exposure (short-term or long-term) and the magnitude of exposure to chemical agents in the work environment.
Challenges and opportunities in the manufacture and expansion of cells for therapy.
Maartens, Joachim H; De-Juan-Pardo, Elena; Wunner, Felix M; Simula, Antonio; Voelcker, Nicolas H; Barry, Simon C; Hutmacher, Dietmar W
2017-10-01
Laboratory-based ex vivo cell culture methods are largely manual in their manufacturing processes. This makes it extremely difficult to meet regulatory requirements for process validation, quality control and reproducibility. Cell culture concepts with a translational focus need to embrace a more automated approach where cell yields are able to meet the quantitative production demands, the correct cell lineage and phenotype is readily confirmed and reagent usage has been optimized. Areas covered: This article discusses the obstacles inherent in classical laboratory-based methods, their concomitant impact on cost-of-goods and that a technology step change is required to facilitate translation from bed-to-bedside. Expert opinion: While traditional bioreactors have demonstrated limited success where adherent cells are used in combination with microcarriers, further process optimization will be required to find solutions for commercial-scale therapies. New cell culture technologies based on 3D-printed cell culture lattices with favourable surface to volume ratios have the potential to change the paradigm in industry. An integrated Quality-by-Design /System engineering approach will be essential to facilitate the scaled-up translation from proof-of-principle to clinical validation.
A methodology for designing aircraft to low sonic boom constraints
NASA Technical Reports Server (NTRS)
Mack, Robert J.; Needleman, Kathy E.
1991-01-01
A method for designing conceptual supersonic cruise aircraft to meet low sonic boom requirements is outlined and described. The aircraft design is guided through a systematic evolution from initial three view drawing to a final numerical model description, while the designer using the method controls the integration of low sonic boom, high supersonic aerodynamic efficiency, adequate low speed handling, and reasonable structure and materials technologies. Some experience in preliminary aircraft design and in the use of various analytical and numerical codes is required for integrating the volume and lift requirements throughout the design process.
Boronization on NSTX using Deuterated Trimethylboron
DOE Office of Scientific and Technical Information (OSTI.GOV)
W.R. Blanchard; R.C. Gernhardt; H.W. Kugel
2002-01-28
Boronization on the National Spherical Torus Experiment (NSTX) has proved to be quite beneficial with increases in confinement and density, and decreases in impurities observed in the plasma. The boron has been applied to the interior surfaces of NSTX, about every 2 to 3 weeks of plasma operation, by producing a glow discharge in the vacuum vessel using deuterated trimethylboron (TMB) in a 10% mixture with helium. Special NSTX requirements restricted the selection of the candidate boronization method to the use of deuterated boron compounds. Deuterated TMB met these requirements, but is a hazardous gas and special care in themore » execution of the boronization process is required. This paper describes the existing GDC, Gas Injection, and Torus Vacuum Pumping System hardware used for this process, the glow discharge process, and the automated control system that allows for remote operation to maximize both the safety and efficacy of applying the boron coating. The administrative requirements and the detailed procedure for the setup, operation and shutdown of the process are also described.« less
Discovery of rare mutations in populations: TILLING by sequencing
USDA-ARS?s Scientific Manuscript database
Discovery of rare mutations in populations requires methods for processing and analyzing in parallel many individuals. Previous TILLING methods employed enzymatic or physical discrimination of heteroduplexed from homoduplexed target DNA. We used mutant populations of rice and wheat to develop a meth...
14 CFR 121.911 - Indoctrination curriculum.
Code of Federal Regulations, 2013 CFR
2013-01-01
... knowledge appropriate to the duty position. (c) For instructors: The fundamental principles of the teaching and learning process; methods and theories of instruction; and the knowledge necessary to use aircraft... curriculums, as appropriate. (d) For evaluators: General evaluation requirements of the AQP; methods of...
14 CFR 121.911 - Indoctrination curriculum.
Code of Federal Regulations, 2014 CFR
2014-01-01
... knowledge appropriate to the duty position. (c) For instructors: The fundamental principles of the teaching and learning process; methods and theories of instruction; and the knowledge necessary to use aircraft... curriculums, as appropriate. (d) For evaluators: General evaluation requirements of the AQP; methods of...
14 CFR 121.911 - Indoctrination curriculum.
Code of Federal Regulations, 2012 CFR
2012-01-01
... knowledge appropriate to the duty position. (c) For instructors: The fundamental principles of the teaching and learning process; methods and theories of instruction; and the knowledge necessary to use aircraft... curriculums, as appropriate. (d) For evaluators: General evaluation requirements of the AQP; methods of...
Method of transition from 3D model to its ontological representation in aircraft design process
NASA Astrophysics Data System (ADS)
Govorkov, A. S.; Zhilyaev, A. S.; Fokin, I. V.
2018-05-01
This paper proposes the method of transition from a 3D model to its ontological representation and describes its usage in the aircraft design process. The problems of design for manufacturability and design automation are also discussed. The introduced method is to aim to ease the process of data exchange between important aircraft design phases, namely engineering and design control. The method is also intended to increase design speed and 3D model customizability. This requires careful selection of the complex systems (CAD / CAM / CAE / PDM), providing the basis for the integration of design and technological preparation of production and more fully take into account the characteristics of products and processes for their manufacture. It is important to solve this problem, as investment in the automation define the company's competitiveness in the years ahead.
Applicability and Limitations of Reliability Allocation Methods
NASA Technical Reports Server (NTRS)
Cruz, Jose A.
2016-01-01
Reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation process is often performed at different stages of system design. The allocation process often begins at the conceptual stage. As the system design develops, more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers. Applying reliability allocation techniques without understanding their limitations and assumptions can produce unrealistic results. This report addresses weighting factors, optimal reliability allocation techniques, and identifies the applicability and limitations of each reliability allocation technique.
Using nocturnal cold air drainage flow to monitor ecosystem processes in complex terrain
Thomas G. Pypker; Michael H. Unsworth; Alan C. Mix; William Rugh; Troy Ocheltree; Karrin Alstad; Barbara J. Bond
2007-01-01
This paper presents initial investigations of a new approach to monitor ecosystem processes in complex terrain on large scales. Metabolic processes in mountainous ecosystems are poorly represented in current ecosystem monitoring campaigns because the methods used for monitoring metabolism at the ecosystem scale (e.g., eddy covariance) require flat study sites. Our goal...
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2010 CFR
2010-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2012 CFR
2012-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2014 CFR
2014-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2013 CFR
2013-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.
Code of Federal Regulations, 2011 CFR
2011-07-01
... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... values, and engineering assessment control applicability assessment requirements are to be determined... by using the engineering assessment procedures in paragraph (k) of this section. (f) Volumetric flow...
Xu, Jason; Minin, Vladimir N
2015-07-01
Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes.
Xu, Jason; Minin, Vladimir N.
2016-01-01
Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes. PMID:26949377
Friendly Extensible Transfer Tool Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.
2016-04-15
Often data transfer software is designed to meet specific requirements or apply to specific environments. Frequently, this requires source code integration for added functionality. An extensible data transfer framework is needed to more easily incorporate new capabilities, in modular fashion. Using FrETT framework, functionality may be incorporated (in many cases without need of source code) to handle new platform capabilities: I/O methods (e.g., platform specific data access), network transport methods, data processing (e.g., data compression.).
NASA Astrophysics Data System (ADS)
Kowalska, Małgorzata; Janas, Sławomir; Woźniak, Magdalena
2018-04-01
The aim of this work was the presentation of an alternative method of determination of the total dry mass content in processed cheese. The authors claim that the presented method can be used in industry's quality control laboratories for routine testing and for quick in-process control. For the test purposes both reference method of determination of dry mass in processed cheese and moisture analyzer method were used. The tests were carried out for three different kinds of processed cheese. In accordance with the reference method, the sample was placed on a layer of silica sand and dried at the temperature of 102 °C for about 4 h. The moisture analyzer test required method validation, with regard to drying temperature range and mass of the analyzed sample. Optimum drying temperature of 110 °C was determined experimentally. For Hochland cream processed cheese sample, the total dry mass content, obtained using the reference method, was 38.92%, whereas using the moisture analyzer method, it was 38.74%. An average analysis time in case of the moisture analyzer method was 9 min. For the sample of processed cheese with tomatoes, the reference method result was 40.37%, and the alternative method result was 40.67%. For the sample of cream processed cheese with garlic the reference method gave value of 36.88%, and the alternative method, of 37.02%. An average time of those determinations was 16 min. Obtained results confirmed that use of moisture analyzer is effective. Compliant values of dry mass content were obtained for both of the used methods. According to the authors, the fact that the measurement took incomparably less time for moisture analyzer method, is a key criterion of in-process control and final quality control method selection.
Efficient SRAM yield optimization with mixture surrogate modeling
NASA Astrophysics Data System (ADS)
Zhongjian, Jiang; Zuochang, Ye; Yan, Wang
2016-12-01
Largely repeated cells such as SRAM cells usually require extremely low failure-rate to ensure a moderate chi yield. Though fast Monte Carlo methods such as importance sampling and its variants can be used for yield estimation, they are still very expensive if one needs to perform optimization based on such estimations. Typically the process of yield calculation requires a lot of SPICE simulation. The circuit SPICE simulation analysis accounted for the largest proportion of time in the process yield calculation. In the paper, a new method is proposed to address this issue. The key idea is to establish an efficient mixture surrogate model. The surrogate model is based on the design variables and process variables. This model construction method is based on the SPICE simulation to get a certain amount of sample points, these points are trained for mixture surrogate model by the lasso algorithm. Experimental results show that the proposed model is able to calculate accurate yield successfully and it brings significant speed ups to the calculation of failure rate. Based on the model, we made a further accelerated algorithm to further enhance the speed of the yield calculation. It is suitable for high-dimensional process variables and multi-performance applications.
WaferOptics® mass volume production and reliability
NASA Astrophysics Data System (ADS)
Wolterink, E.; Demeyer, K.
2010-05-01
The Anteryon WaferOptics® Technology platform contains imaging optics designs, materials, metrologies and combined with wafer level based Semicon & MEMS production methods. WaferOptics® first required complete new system engineering. This system closes the loop between application requirement specifications, Anteryon product specification, Monte Carlo Analysis, process windows, process controls and supply reject criteria. Regarding the Anteryon product Integrated Lens Stack (ILS), new design rules, test methods and control systems were assessed, implemented, validated and customer released for mass production. This includes novel reflowable materials, mastering process, replication, bonding, dicing, assembly, metrology, reliability programs and quality assurance systems. Many of Design of Experiments were performed to assess correlations between optical performance parameters and machine settings of all process steps. Lens metrologies such as FFL, BFL, and MTF were adapted for wafer level production and wafer mapping was introduced for yield management. Test methods for screening and validating suitable optical materials were designed. Critical failure modes such as delamination and popcorning were assessed and modeled with FEM. Anteryon successfully managed to integrate the different technologies starting from single prototypes to high yield mass volume production These parallel efforts resulted in a steep yield increase from 30% to over 90% in a 8 months period.
Functional Mobility Testing: A Novel Method to Create Suit Design Requirements
NASA Technical Reports Server (NTRS)
England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar L.
2008-01-01
This study was performed to aide in the creation of design requirements for the next generation of space suits that more accurately describe the level of mobility necessary for a suited crewmember through the use of an innovative methodology utilizing functional mobility. A novel method was utilized involving the collection of kinematic data while 20 subjects (10 male, 10 female) performed pertinent functional tasks that will be required of a suited crewmember during various phases of a lunar mission. These tasks were selected based on relevance and criticality from a larger list of tasks that may be carried out by the crew. Kinematic data was processed through Vicon BodyBuilder software to calculate joint angles for the ankle, knee, hip, torso, shoulder, elbow, and wrist. Maximum functional mobility was consistently lower than maximum isolated mobility. This study suggests that conventional methods for establishing design requirements for human-systems interfaces based on maximal isolated joint capabilities may overestimate the required mobility. Additionally, this method provides a valuable means of evaluating systems created from these requirements by comparing the mobility available in a new spacesuit, or the mobility required to use a new piece of hardware, to this newly established database of functional mobility.
Multidisciplinary and Active/Collaborative Approaches in Teaching Requirements Engineering
ERIC Educational Resources Information Center
Rosca, Daniela
2005-01-01
The requirements engineering course is a core component of the curriculum for the Master's in Software Engineering programme, at Monmouth University (MU). It covers the process, methods and tools specific to this area, together with the corresponding software quality issues. The need to produce software engineers with strong teamwork and…
Decision making in prioritization of required operational capabilities
NASA Astrophysics Data System (ADS)
Andreeva, P.; Karev, M.; Kovacheva, Ts.
2015-10-01
The paper describes an expert heuristic approach to prioritization of required operational capabilities in the field of defense. Based on expert assessment and by application of the method of Analytical Hierarchical Process, a methodology for their prioritization has been developed. It has been applied to practical simulation decision making games.
Methods for Documenting Systematic Review Searches: A Discussion of Common Issues
ERIC Educational Resources Information Center
Rader, Tamara; Mann, Mala; Stansfield, Claire; Cooper, Chris; Sampson, Margaret
2014-01-01
Introduction: As standardized reporting requirements for systematic reviews are being adopted more widely, review authors are under greater pressure to accurately record their search process. With careful planning, documentation to fulfill the Preferred Reporting Items for Systematic Reviews and Meta-Analyses requirements can become a valuable…
Source Testing for Particulate Matter.
ERIC Educational Resources Information Center
DeVorkin, Howard
Developed for presentation at the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971, this outline covers procedures for the testing of particulate matter. These are: (1) basic requirements, (2) information required, (3) collection of samples, (4) processing of samples, (5)…
Detection of nitrogen deficiency in potatoes using small unmanned aircraft systems
USDA-ARS?s Scientific Manuscript database
Small Unmanned Aircraft Systems (sUAS) are recognized as potentially important remote-sensing platforms for precision agriculture. However, research is required to determine which sensors and data processing methods are required to use sUAS in an efficient and cost-effective manner. We set up a ni...
Oberbichler, S; Hackl, W O; Hörbst, A
2017-10-18
Long-term data collection is a challenging task in the domain of medical research. Many effects in medicine require long periods of time to become traceable e.g. the development of secondary malignancies based on a given radiotherapeutic treatment of the primary disease. Nevertheless, long-term studies often suffer from an initial lack of available information, thus disallowing a standardized approach for their approval by the ethics committee. This is due to several factors, such as the lack of existing case report forms or an explorative research approach in which data elements may change over time. In connection with current medical research and the ongoing digitalization in medicine, Long Term Medical Data Registries (MDR-LT) have become an important means of collecting and analyzing study data. As with any clinical study, ethical aspects must be taken into account when setting up such registries. This work addresses the problem of creating a valid, high-quality ethics committee proposal for medical registries by suggesting groups of tasks (building blocks), information sources and appropriate methods for collecting and analyzing the information, as well as a process model to compile an ethics committee proposal (EsPRit). To derive the building blocks and associated methods software and requirements engineering approaches were utilized. Furthermore, a process-oriented approach was chosen, as information required in the creating process of ethics committee proposals remain unknown in the beginning of planning an MDR-LT. Here, we derived the needed steps from medical product certification. This was done as the medical product certification itself also communicates a process-oriented approach rather than merely focusing on content. A proposal was created for validation and inspection of applicability by using the proposed building blocks. The proposed best practice was tested and refined within SEMPER (Secondary Malignoma - Prospective Evaluation of the Radiotherapeutics dose distribution as the cause for induction) as a case study. The proposed building blocks cover the topics of "Context Analysis", "Requirements Analysis", "Requirements Validation", "Electronic Case Report (eCRF) Design" and "Overall Concept Creation". Additional methods are attached with regards to each topic. The goals of each block can be met by applying those methods. The proposed methods are proven methods as applied in e.g. existing Medical Data Registry projects, as well as in software or requirements engineering. Several building blocks and attached methods could be identified in the creation of a generic ethics committee proposal. Hence, an Ethics Committee can make informed decisions on the suggested study via said blocks, using the suggested methods such as "Defining Clinical Questions" within the Context Analysis. The study creators have to confirm that they adhere to the proposed procedure within the ethic proposal statement. Additional existing Medical Data Registry projects can be compared to EsPRit for conformity to the proposed procedure. This allows for the identification of gaps, which can lead to amendments requested by the ethics committee.
Statistical Process Control Techniques for the Telecommunications Systems Manager
1992-03-01
products that are out of 59 tolerance and bad designs. The third type of defect, mistakes, are remedied by Poka - Yoke methods that are 1 introduced later...based on total production costs plus quality costs. Once production is underway, interventions are determined by their impact on the QLF. F. POKA - YOKE ...Mistakes require process improvements called Poka Yoke or mistake proofing. Shiego Shingo developed Poka Yoke methods to incorporate 100% inspection at
Multi-tasking arbitration and behaviour design for human-interactive robots
NASA Astrophysics Data System (ADS)
Kobayashi, Yuichi; Onishi, Masaki; Hosoe, Shigeyuki; Luo, Zhiwei
2013-05-01
Robots that interact with humans in household environments are required to handle multiple real-time tasks simultaneously, such as carrying objects, collision avoidance and conversation with human. This article presents a design framework for the control and recognition processes to meet these requirements taking into account stochastic human behaviour. The proposed design method first introduces a Petri net for synchronisation of multiple tasks. The Petri net formulation is converted to Markov decision processes and processed in an optimal control framework. Three tasks (safety confirmation, object conveyance and conversation) interact and are expressed by the Petri net. Using the proposed framework, tasks that normally tend to be designed by integrating many if-then rules can be designed in a systematic manner in a state estimation and optimisation framework from the viewpoint of the shortest time optimal control. The proposed arbitration method was verified by simulations and experiments using RI-MAN, which was developed for interactive tasks with humans.
NASA Astrophysics Data System (ADS)
Orloff, Nathan D.; Long, Christian J.; Obrzut, Jan; Maillaud, Laurent; Mirri, Francesca; Kole, Thomas P.; McMichael, Robert D.; Pasquali, Matteo; Stranick, Stephan J.; Alexander Liddle, J.
2015-11-01
Advances in roll-to-roll processing of graphene and carbon nanotubes have at last led to the continuous production of high-quality coatings and filaments, ushering in a wave of applications for flexible and wearable electronics, woven fabrics, and wires. These applications often require specific electrical properties, and hence precise control over material micro- and nanostructure. While such control can be achieved, in principle, by closed-loop processing methods, there are relatively few noncontact and nondestructive options for quantifying the electrical properties of materials on a moving web at the speed required in modern nanomanufacturing. Here, we demonstrate a noncontact microwave method for measuring the dielectric constant and conductivity (or geometry for samples of known dielectric properties) of materials in a millisecond. Such measurement times are compatible with current and future industrial needs, enabling real-time materials characterization and in-line control of processing variables without disrupting production.
Microtechnology management considering test and cost aspects for stacked 3D ICs with MEMS
NASA Astrophysics Data System (ADS)
Hahn, K.; Wahl, M.; Busch, R.; Grünewald, A.; Brück, R.
2018-01-01
Innovative automotive systems require complex semiconductor devices currently only available in consumer grade quality. The European project TRACE will develop and demonstrate methods, processes, and tools to facilitate usage of Consumer Electronics (CE) components to be deployable more rapidly in the life-critical automotive domain. Consumer electronics increasingly use heterogeneous system integration methods and "More than Moore" technologies, which are capable to combine different circuit domains (Analog, Digital, RF, MEMS) and which are integrated within SiP or 3D stacks. Making these technologies or at least some of the process steps available under automotive electronics requirements is an important goal to keep pace with the growing demand for information processing within cars. The approach presented in this paper aims at a technology management and recommendation system that covers technology data, functional and non-functional constraints, and application scenarios, and that will comprehend test planning and cost consideration capabilities.
Orloff, Nathan D.; Long, Christian J.; Obrzut, Jan; Maillaud, Laurent; Mirri, Francesca; Kole, Thomas P.; McMichael, Robert D.; Pasquali, Matteo; Stranick, Stephan J.; Alexander Liddle, J.
2015-01-01
Advances in roll-to-roll processing of graphene and carbon nanotubes have at last led to the continuous production of high-quality coatings and filaments, ushering in a wave of applications for flexible and wearable electronics, woven fabrics, and wires. These applications often require specific electrical properties, and hence precise control over material micro- and nanostructure. While such control can be achieved, in principle, by closed-loop processing methods, there are relatively few noncontact and nondestructive options for quantifying the electrical properties of materials on a moving web at the speed required in modern nanomanufacturing. Here, we demonstrate a noncontact microwave method for measuring the dielectric constant and conductivity (or geometry for samples of known dielectric properties) of materials in a millisecond. Such measurement times are compatible with current and future industrial needs, enabling real-time materials characterization and in-line control of processing variables without disrupting production. PMID:26592441
Production and use of metals and oxygen for lunar propulsion
NASA Technical Reports Server (NTRS)
Hepp, Aloysius F.; Linne, Diane L.; Landis, Geoffrey A.; Groth, Mary F.; Colvin, James E.
1991-01-01
Production, power, and propulsion technologies for using oxygen and metals derived from lunar resources are discussed. The production process is described, and several of the more developed processes are discussed. Power requirements for chemical, thermal, and electrical production methods are compared. The discussion includes potential impact of ongoing power technology programs on lunar production requirements. The performance potential of several possible metal fuels including aluminum, silicon, iron, and titanium are compared. Space propulsion technology in the area of metal/oxygen rocket engines is discussed.
Automatic Fringe Detection for Oil Film Interferometry Measurement of Skin Friction
NASA Technical Reports Server (NTRS)
Naughton, Jonathan W.; Decker, Robert K.; Jafari, Farhad
2001-01-01
This report summarizes two years of work on investigating algorithms for automatically detecting fringe patterns in images acquired using oil-drop interferometry for the determination of skin friction. Several different analysis methods were tested, and a combination of a windowed Fourier transform followed by a correlation was found to be most effective. The implementation of this method is discussed and details of the process are described. The results indicate that this method shows promise for automating the fringe detection process, but further testing is required.
Manufacture of silicon carbide using solar energy
Glatzmaier, Gregory C.
1992-01-01
A method is described for producing silicon carbide particles using solar energy. The method is efficient and avoids the need for use of electrical energy to heat the reactants. Finely divided silica and carbon are admixed and placed in a solar-heated reaction chamber for a time sufficient to cause a reaction between the ingredients to form silicon carbide of very small particle size. No grinding of silicon carbide is required to obtain small particles. The method may be carried out as a batch process or as a continuous process.
Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks
Kaltenbacher, Barbara; Hasenauer, Jan
2017-01-01
Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, the computational methods for the analysis of ODE models which describe hundreds or thousands of biochemical species and reactions are missing so far. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large scale biochemical reaction networks. We present the approach for time-discrete measurement and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
Beamforming array techniques for acoustic emission monitoring of large concrete structures
NASA Astrophysics Data System (ADS)
McLaskey, Gregory C.; Glaser, Steven D.; Grosse, Christian U.
2010-06-01
This paper introduces a novel method of acoustic emission (AE) analysis which is particularly suited for field applications on large plate-like reinforced concrete structures, such as walls and bridge decks. Similar to phased-array signal processing techniques developed for other non-destructive evaluation methods, this technique adapts beamforming tools developed for passive sonar and seismological applications for use in AE source localization and signal discrimination analyses. Instead of relying on the relatively weak P-wave, this method uses the energy-rich Rayleigh wave and requires only a small array of 4-8 sensors. Tests on an in-service reinforced concrete structure demonstrate that the azimuth of an artificial AE source can be determined via this method for sources located up to 3.8 m from the sensor array, even when the P-wave is undetectable. The beamforming array geometry also allows additional signal processing tools to be implemented, such as the VESPA process (VElocity SPectral Analysis), whereby the arrivals of different wave phases are identified by their apparent velocity of propagation. Beamforming AE can reduce sampling rate and time synchronization requirements between spatially distant sensors which in turn facilitates the use of wireless sensor networks for this application.
Feasibility of Active Machine Learning for Multiclass Compound Classification.
Lang, Tobias; Flachsenberg, Florian; von Luxburg, Ulrike; Rarey, Matthias
2016-01-25
A common task in the hit-to-lead process is classifying sets of compounds into multiple, usually structural classes, which build the groundwork for subsequent SAR studies. Machine learning techniques can be used to automate this process by learning classification models from training compounds of each class. Gathering class information for compounds can be cost-intensive as the required data needs to be provided by human experts or experiments. This paper studies whether active machine learning can be used to reduce the required number of training compounds. Active learning is a machine learning method which processes class label data in an iterative fashion. It has gained much attention in a broad range of application areas. In this paper, an active learning method for multiclass compound classification is proposed. This method selects informative training compounds so as to optimally support the learning progress. The combination with human feedback leads to a semiautomated interactive multiclass classification procedure. This method was investigated empirically on 15 compound classification tasks containing 86-2870 compounds in 3-38 classes. The empirical results show that active learning can solve these classification tasks using 10-80% of the data which would be necessary for standard learning techniques.
Minimization of energy and surface roughness of the products machined by milling
NASA Astrophysics Data System (ADS)
Belloufi, A.; Abdelkrim, M.; Bouakba, M.; Rezgui, I.
2017-08-01
Metal cutting represents a large portion in the manufacturing industries, which makes this process the largest consumer of energy. Energy consumption is an indirect source of carbon footprint, we know that CO2 emissions come from the production of energy. Therefore high energy consumption requires a large production, which leads to high cost and a large amount of CO2 emissions. At this day, a lot of researches done on the Metal cutting, but the environmental problems of the processes are rarely discussed. The right selection of cutting parameters is an effective method to reduce energy consumption because of the direct relationship between energy consumption and cutting parameters in machining processes. Therefore, one of the objectives of this research is to propose an optimization strategy suitable for machining processes (milling) to achieve the optimum cutting conditions based on the criterion of the energy consumed during the milling. In this paper the problem of energy consumed in milling is solved by an optimization method chosen. The optimization is done according to the different requirements in the process of roughing and finishing under various technological constraints.
Power processing for electric propulsion
NASA Technical Reports Server (NTRS)
Finke, R. C.; Herron, B. G.; Gant, G. D.
1975-01-01
The inclusion of electric thruster systems in spacecraft design is considered. The propulsion requirements of such spacecraft dictate a wide range of thruster power levels and operational lifetimes, which must be matched by lightweight, efficient, and reliable thruster power processing systems. Electron bombardment ion thruster requirements are presented, and the performance characteristics of present power processing systems are reviewed. Design philosophies and alternatives in areas such as inverter type, arc protection, and control methods are discussed along with future performance potentials for meeting goals in the areas of power process or weight (10 kg/kW), efficiency (approaching 92 percent), reliability (0.96 for 15,000 hr), and thermal control capability (0.3 to 5 AU).
Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.
Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A
2017-04-01
Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites. However, these methods require a comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified, but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 Cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for the human plasma, and it entailed a single method for sample preparation, enabling quick processing of the samples followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple method for the sample preparation followed by an LC-MS method with a short run time. Therefore, this analytical method provides a useful method for both clinical and research purposes.
Histogram-driven cupping correction (HDCC) in CT
NASA Astrophysics Data System (ADS)
Kyriakou, Y.; Meyer, M.; Lapp, R.; Kalender, W. A.
2010-04-01
Typical cupping correction methods are pre-processing methods which require either pre-calibration measurements or simulations of standard objects to approximate and correct for beam hardening and scatter. Some of them require the knowledge of spectra, detector characteristics, etc. The aim of this work was to develop a practical histogram-driven cupping correction (HDCC) method to post-process the reconstructed images. We use a polynomial representation of the raw-data generated by forward projection of the reconstructed images; forward and backprojection are performed on graphics processing units (GPU). The coefficients of the polynomial are optimized using a simplex minimization of the joint entropy of the CT image and its gradient. The algorithm was evaluated using simulations and measurements of homogeneous and inhomogeneous phantoms. For the measurements a C-arm flat-detector CT (FD-CT) system with a 30×40 cm2 detector, a kilovoltage on board imager (radiation therapy simulator) and a micro-CT system were used. The algorithm reduced cupping artifacts both in simulations and measurements using a fourth-order polynomial and was in good agreement to the reference. The minimization algorithm required less than 70 iterations to adjust the coefficients only performing a linear combination of basis images, thus executing without time consuming operations. HDCC reduced cupping artifacts without the necessity of pre-calibration or other scan information enabling a retrospective improvement of CT image homogeneity. However, the method can work with other cupping correction algorithms or in a calibration manner, as well.
Yang, Jie; McArdle, Conor; Daniels, Stephen
2014-01-01
A new data dimension-reduction method, called Internal Information Redundancy Reduction (IIRR), is proposed for application to Optical Emission Spectroscopy (OES) datasets obtained from industrial plasma processes. For example in a semiconductor manufacturing environment, real-time spectral emission data is potentially very useful for inferring information about critical process parameters such as wafer etch rates, however, the relationship between the spectral sensor data gathered over the duration of an etching process step and the target process output parameters is complex. OES sensor data has high dimensionality (fine wavelength resolution is required in spectral emission measurements in order to capture data on all chemical species involved in plasma reactions) and full spectrum samples are taken at frequent time points, so that dynamic process changes can be captured. To maximise the utility of the gathered dataset, it is essential that information redundancy is minimised, but with the important requirement that the resulting reduced dataset remains in a form that is amenable to direct interpretation of the physical process. To meet this requirement and to achieve a high reduction in dimension with little information loss, the IIRR method proposed in this paper operates directly in the original variable space, identifying peak wavelength emissions and the correlative relationships between them. A new statistic, Mean Determination Ratio (MDR), is proposed to quantify the information loss after dimension reduction and the effectiveness of IIRR is demonstrated using an actual semiconductor manufacturing dataset. As an example of the application of IIRR in process monitoring/control, we also show how etch rates can be accurately predicted from IIRR dimension-reduced spectral data. PMID:24451453
Improvements in surface singularity analysis and design methods. [applicable to airfoils
NASA Technical Reports Server (NTRS)
Bristow, D. R.
1979-01-01
The coupling of the combined source vortex distribution of Green's potential flow function with contemporary numerical techniques is shown to provide accurate, efficient, and stable solutions to subsonic inviscid analysis and design problems for multi-element airfoils. The analysis problem is solved by direct calculation of the surface singularity distribution required to satisfy the flow tangency boundary condition. The design or inverse problem is solved by an iteration process. In this process, the geometry and the associated pressure distribution are iterated until the pressure distribution most nearly corresponding to the prescribed design distribution is obtained. Typically, five iteration cycles are required for convergence. A description of the analysis and design method is presented, along with supporting examples.
EVALUATION OF BIOSOLID SAMPLE PROCESSING TECHNIQUES TO MAXIMIZE RECOVERY OF BACTERIA
Current federal regulations (40 CFR 503) require enumeration of fecal coliform or Salmoella prior to land application of Class A biosolids. This regulation specifies use of enumeration methods included in "Standard Methods for the Examination of Water and Wastewater 18th Edition,...
Noncontaminating technique for making holes in existing process systems
NASA Technical Reports Server (NTRS)
Hecker, T. P.; Czapor, H. P.; Giordano, S. M.
1972-01-01
Technique is developed for making cleanly-contoured holes in assembled process systems without introducing chips or other contaminants into system. Technique uses portable equipment and does not require dismantling of system. Method was tested on Inconel, stainless steel, ASTMA-53, and Hastelloy X in all positions.
14 CFR 21.143 - Quality control data requirements; prime manufacturer.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., purchased items, and parts and assemblies produced by manufacturers' suppliers including methods used to... special manufacturing processes involved, the means used to control the processes, the final test... procedure for recording review board decisions and disposing of rejected parts; (5) An outline of a system...
49 CFR 236.905 - Railroad Safety Program Plan (RSPP).
Code of Federal Regulations, 2012 CFR
2012-10-01
... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...
49 CFR 236.905 - Railroad Safety Program Plan (RSPP).
Code of Federal Regulations, 2014 CFR
2014-10-01
... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...
49 CFR 236.905 - Railroad Safety Program Plan (RSPP).
Code of Federal Regulations, 2013 CFR
2013-10-01
... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...
49 CFR 236.905 - Railroad Safety Program Plan (RSPP).
Code of Federal Regulations, 2011 CFR
2011-10-01
... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
Integration of QFD, AHP, and LPP methods in supplier development problems under uncertainty
NASA Astrophysics Data System (ADS)
Shad, Zahra; Roghanian, Emad; Mojibian, Fatemeh
2014-04-01
Quality function deployment (QFD) is a customer-driven approach, widely used to develop or process new product to maximize customer satisfaction. Last researches used linear physical programming (LPP) procedure to optimize QFD; however, QFD issue involved uncertainties, or fuzziness, which requires taking them into account for more realistic study. In this paper, a set of fuzzy data is used to address linguistic values parameterized by triangular fuzzy numbers. Proposed integrated approach including analytic hierarchy process (AHP), QFD, and LPP to maximize overall customer satisfaction under uncertain conditions and apply them in the supplier development problem. The fuzzy AHP approach is adopted as a powerful method to obtain the relationship between the customer requirements and engineering characteristics (ECs) to construct house of quality in QFD method. LPP is used to obtain the optimal achievement level of the ECs and subsequently the customer satisfaction level under different degrees of uncertainty. The effectiveness of proposed method will be illustrated by an example.
3DD - Three Dimensional Disposal of Spent Nuclear Fuel - 12449
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dvorakova, Marketa; Slovak, Jiri
2012-07-01
Three dimensional disposal is being considered as a way in which to store long-term spent nuclear fuel in underground disposal facilities in the Czech Republic. This method involves a combination of the two most common internationally recognised disposal methods in order to practically apply the advantages of both whilst, at the same time, eliminating their weaknesses; the method also allows easy removal in case of potential re-use. The proposed method for the disposal of spent nuclear fuel will reduce the areal requirements of future deep geological repositories by more than 30%. It will also simplify the container handling process bymore » using gravitational forces in order to meet requirements concerning the controllability of processes and ensuring operational and nuclear safety. With regard to the issue of the efficient potential removal of waste containers, this project offers an ingenious solution which does not disrupt the overall stability of the original disposal complex. (authors)« less
Welding methods for joining thermoplastic polymers for the hermetic enclosure of medical devices.
Amanat, Negin; James, Natalie L; McKenzie, David R
2010-09-01
New high performance polymers have been developed that challenge traditional encapsulation materials for permanent active medical implants. The gold standard for hermetic encapsulation for implants is a titanium enclosure which is sealed using laser welding. Polymers may be an alternative encapsulation material. Although many polymers are biocompatible, and permeability of polymers may be reduced to acceptable levels, the ability to create a hermetic join with an extended life remains the barrier to widespread acceptance of polymers for this application. This article provides an overview of the current techniques used for direct bonding of polymers, with a focus on thermoplastics. Thermal bonding methods are feasible, but some take too long and/or require two stage processing. Some methods are not suitable because of excessive heat load which may be delivered to sensitive components within the capsule. Laser welding is presented as the method of choice; however the establishment of suitable laser process parameters will require significant research. 2010. Published by Elsevier Ltd.
Robot-Assisted Fracture Surgery: Surgical Requirements and System Design.
Georgilas, Ioannis; Dagnino, Giulio; Tarassoli, Payam; Atkins, Roger; Dogramadzi, Sanja
2018-03-09
The design of medical devices is a complex and crucial process to ensure patient safety. It has been shown that improperly designed devices lead to errors and associated accidents and costs. A key element for a successful design is incorporating the views of the primary and secondary stakeholders early in the development process. They provide insights into current practice and point out specific issues with the current processes and equipment in use. This work presents how information from a user-study conducted in the early stages of the RAFS (Robot Assisted Fracture Surgery) project informed the subsequent development and testing of the system. The user needs were captured using qualitative methods and converted to operational, functional, and non-functional requirements based on the methods derived from product design and development. This work presents how the requirements inform a new workflow for intra-articular joint fracture reduction using a robotic system. It is also shown how the various elements of the system are developed to explicitly address one or more of the requirements identified, and how intermediate verification tests are conducted to ensure conformity. Finally, a validation test in the form of a cadaveric trial confirms the ability of the designed system to satisfy the aims set by the original research question and the needs of the users.
Where do Students Go Wrong in Applying the Scientific Method?
NASA Astrophysics Data System (ADS)
Rubbo, Louis; Moore, Christopher
2015-04-01
Non-science majors completing a liberal arts degree are frequently required to take a science course. Ideally with the completion of a required science course, liberal arts students should demonstrate an improved capability in the application of the scientific method. In previous work we have demonstrated that this is possible if explicit instruction is spent on the development of scientific reasoning skills. However, even with explicit instruction, students still struggle to apply the scientific process. Counter to our expectations, the difficulty is not isolated to a single issue such as stating a testable hypothesis, designing an experiment, or arriving at a supported conclusion. Instead students appear to struggle with every step in the process. This talk summarizes our work looking at and identifying where students struggle in the application of the scientific method. This material is based upon work supported by the National Science Foundation under Grant No. 1244801.
Indexing and retrieving point and region objects
NASA Astrophysics Data System (ADS)
Ibrahim, Azzam T.; Fotouhi, Farshad A.
1996-03-01
R-tree and its variants are examples of spatial data structures for paged-secondary memory. To process a query, these structures require multiple path traversals. In this paper, we present a new image access method, SB+-tree which requires a single path traversal to process a query. Also, SB+-tree will allow commercial databases an access method for spatial objects without a major change, since most commercial databases already support B+-tree as an access method for text data. The SB+-tree can be used for zero and non-zero size data objects. Non-zero size objects are approximated by their minimum bounding rectangles (MBRs). The number of SB+-trees generated is dependent upon the number of dimensions of the approximation of the object. The structure supports efficient spatial operations such as regions-overlap, distance and direction. In this paper, we experimentally and analytically demonstrate the superiority of SB+-tree over R-tree.
Curry, Wayne; Conway, Samuel; Goodfield, Clara; Miller, Kimberly; Mueller, Ronald L; Polini, Eugene
2010-12-01
The preparation of sterile parenteral products requires careful control of all ingredients, materials, and processes to ensure the final product has the identity and strength, and meets the quality and purity characteristics that it purports to possess. Contamination affecting these critical properties of parenteral products can occur in many ways and from many sources. The use of closures supplied by manufacturers in a ready-to-use state can be an effective method for reducing the risk of contamination and improving the quality of the drug product. This article will address contamination attributable to elastomeric container closure components and the regulatory requirements associated with container closure systems. Possible contaminants, including microorganisms, endotoxins, and chemicals, along with the methods by which these contaminants can enter the product will be reviewed. Such methods include inappropriate material selection, improper closure preparation processes, compromised container closure integrity, degradation of closures, and leaching of compounds from the closures.
Consistent approach to describing aircraft HIRF protection
NASA Technical Reports Server (NTRS)
Rimbey, P. R.; Walen, D. B.
1995-01-01
The high intensity radiated fields (HIRF) certification process as currently implemented is comprised of an inconsistent combination of factors that tend to emphasize worst case scenarios in assessing commercial airplane certification requirements. By examining these factors which include the process definition, the external HIRF environment, the aircraft coupling and corresponding internal fields, and methods of measuring equipment susceptibilities, activities leading to an approach to appraising airplane vulnerability to HIRF are proposed. This approach utilizes technically based criteria to evaluate the nature of the threat, including the probability of encountering the external HIRF environment. No single test or analytic method comprehensively addresses the full HIRF threat frequency spectrum. Additional tools such as statistical methods must be adopted to arrive at more realistic requirements to reflect commercial aircraft vulnerability to the HIRF threat. Test and analytic data are provided to support the conclusions of this report. This work was performed under NASA contract NAS1-19360, Task 52.
NASA Technical Reports Server (NTRS)
Eckert, E R G; Livingood, N B
1954-01-01
Various parts of aircraft propulsion engines that are in contact with hot gases often require cooling. Transpiration and film cooling, new methods that supposedly utilize cooling air more effectively than conventional convection cooling, have already been proposed. This report presents material necessary for a comparison of the cooling requirements of these three methods. Correlations that are regarded by the authors as the most reliable today are employed in evaluating each of the cooling processes. Calculations for the special case in which the gas velocity is constant along the cooled wall (flat plate) are presented. The calculations reveal that a comparison of the three cooling processes can be made on quite a general basis. The superiority of transpiration cooling is clearly shown for both laminar and turbulent flow. This superiority is reduced when the effects of radiation are included; for gas-turbine blades, however, there is evidence indicating that radiation may be neglected.
Improvement of radiology services based on the process management approach.
Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fátima de Andrade; Mascarenhas, Yvone Maria
2011-06-01
The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Design of a Workstation by a Cognitive Approach
Jaspers, MWM; Steen, T.; Geelen, M.; van den Bos, C.
2001-01-01
To ensure ultimate acceptance of computer systems that are easy to use, provide the desired functionality and fits into users work practices requires the use of improved methods for system design and evaluation. Both designing and evaluating workstations that link up smoothly with daily routine of physicians' work requires a thorough understanding of their working practices. The application of methods from cognitive science may contribute to a thorough understanding of the activities involved in medical information processing. We used cognitive task analysis in designing a physicians' workstation, which seems a promising method to ensure that the system meets the user needs.
Evaluation and selection of decision-making methods to assess landfill mining projects.
Hermann, Robert; Baumgartner, Rupert J; Vorbach, Stefan; Ragossnig, Arne; Pomberger, Roland
2015-09-01
For the first time in Austria, fundamental technological and economic studies on recovering secondary raw materials from large landfills have been carried out, based on the 'LAMIS - Landfill Mining Austria' pilot project. A main focus of the research - and the subject of this article - was to develop an assessment or decision-making procedure that allows landfill owners to thoroughly examine the feasibility of a landfill mining project in advance. Currently there are no standard procedures that would sufficiently cover all the multiple-criteria requirements. The basic structure of the multiple attribute decision making process was used to narrow down on selection, conceptual design and assessment of suitable procedures. Along with a breakdown into preliminary and main assessment, the entire foundation required was created, such as definitions of requirements to an assessment method, selection and accurate description of the various assessment criteria and classification of the target system for the present 'landfill mining' vs. 'retaining the landfill in after-care' decision-making problem. Based on these studies, cost-utility analysis and the analytical-hierarchy process were selected from the range of multiple attribute decision-making procedures and examined in detail. Overall, both methods have their pros and cons with regard to their use for assessing landfill mining projects. Merging these methods or connecting them with single-criteria decision-making methods (like the net present value method) may turn out to be reasonable and constitute an appropriate assessment method. © The Author(s) 2015.
Comparative Evaluations of Four Specification Methods for Real-Time Systems
1989-12-01
December 1989 Comparative Evaluations of Four Specification Methods for Real - Time Systems David P. Wood William G. Wood Specification and Design Methods...Methods for Real - Time Systems Abstract: A number of methods have been proposed in the last decade for the specification of system and software requirements...and software specification for real - time systems . Our process for the identification of methods that meet the above criteria is described in greater
User-driven product data manager system design
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-03-01
With the infusion of information technologies into product development and production processes, effective management of product data is becoming essential to modern production enterprises. When an enterprise-wide Product Data Manager (PDM) is implemented, PDM designers must satisfy the requirements of individual users with different job functions and requirements, as well as the requirements of the enterprise as a whole. Concern must also be shown for the interrelationships between information, methods for retrieving archival information and integration of the PDM into the product development process. This paper describes a user-driven approach applied to PDM design for an agile manufacturing pilot projectmore » at Sandia National Laboratories that has been successful in achieving a much faster design-to-production process for a precision electro mechanical surety device.« less
Rapid microscale in-gel processing and digestion of proteins using surface acoustic waves.
Kulkarni, Ketav P; Ramarathinam, Sri H; Friend, James; Yeo, Leslie; Purcell, Anthony W; Perlmutter, Patrick
2010-06-21
A new method for in-gel sample processing and tryptic digestion of proteins is described. Sample preparation, rehydration, in situ digestion and peptide extraction from gel slices are dramatically accelerated by treating the gel slice with surface acoustic waves (SAWs). Only 30 minutes total workflow time is required for this new method to produce base peak chromatograms (BPCs) of similar coverage and intensity to those observed for traditional processing and overnight digestion. Simple set up, good reproducibility, excellent peptide recoveries, rapid turnover of samples and high confidence protein identifications put this technology at the fore-front of the next generation of proteomics sample processing tools.
STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitch, S.H.; Morris, J.W.
1962-12-15
Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as to their application for determining the effects of various processing steps, estabIishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished and the number of required tests was substantially reduced by application of these statistical methods to the SNAP 10A production development effort. (auth)
Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven
2017-01-01
Motivation Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi capillary column—ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. Results The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high throughput use of the technology. PMID:28910313
Evaluation of four methods for estimating leaf area of isolated trees
P.J. Peper; E.G. McPherson
2003-01-01
The accurate modeling of the physiological and functional processes of urban forests requires information on the leaf area of urban tree species. Several non-destructive, indirect leaf area sampling methods have shown good performance for homogenous canopies. These methods have not been evaluated for use in urban settings where trees are typically isolated and...
Systems Engineering Management Procedures
1966-03-10
load -..................................................... tch 2 1t55 Trade Study-Companson ,f Methods for Measuring Quantities of Loaded... method of system operation and the ancillary equipment required such as instru- system elements is a highly involved process mentation. depot tooling...Installation and checkout. MiGI-Maintenance g-round equipment. IM-Item manager. NIP-Materiel improvement proipct. indenturo-A method of showing relationships
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritterbusch, Stanley; Golay, Michael; Duran, Felicia
2003-01-29
OAK B188 Summary of methods proposed for risk informing the design and regulation of future nuclear power plants. All elements of the historical design and regulation process are preserved, but the methods proposed for new plants use probabilistic risk assessment methods as the primary decision making tool.
Liao, Yi-Fang; Tsai, Meng-Li; Yen, Chen-Tung; Cheng, Chiung-Hsiang
2011-02-15
Heat-fusing is a common process for fabricating microwire tetrodes. However, it is time-consuming, and the high-temperature treatment can easily cause the insulation of the microwire to overheat leading to short circuits. We herein provide a simple, fast method to fabricate microwire tetrodes without the heat-fusion process. By increasing the twisting density, we were able to fabricate tetrodes with good rigidity and integrity. This kind of tetrode showed good recording quality, penetrated the brain surface easily, and remained intact after chronic implantation. This method requires only general laboratory tools and is relatively simple even for inexperienced workers. © 2010 Elsevier B.V. All rights reserved.
Disc resonator gyroscope fabrication process requiring no bonding alignment
NASA Technical Reports Server (NTRS)
Shcheglov, Kirill V. (Inventor)
2010-01-01
A method of fabricating a resonant vibratory sensor, such as a disc resonator gyro. A silicon baseplate wafer for a disc resonator gyro is provided with one or more locating marks. The disc resonator gyro is fabricated by bonding a blank resonator wafer, such as an SOI wafer, to the fabricated baseplate, and fabricating the resonator structure according to a pattern based at least in part upon the location of the at least one locating mark of the fabricated baseplate. MEMS-based processing is used for the fabrication processing. In some embodiments, the locating mark is visualized using optical and/or infrared viewing methods. A disc resonator gyroscope manufactured according to these methods is described.
Material Processing Laser Systems In Production
NASA Astrophysics Data System (ADS)
Taeusch, David R.
1988-11-01
The laser processing system is now a respected, productive machine tool in the manufacturing industries. Systems in use today are proving their cost effectiveness and capabilities of processing quality parts. Several types of industrial lasers are described and their applications are discussed, with emphasis being placed on the production environment and methods of protection required for optical equipment against this normally hostile environment.
NASA Astrophysics Data System (ADS)
Au, How Meng
The aircraft design process traditionally starts with a given set of top-level requirements. These requirements can be aircraft performance related such as the fuel consumption, cruise speed, or takeoff field length, etc., or aircraft geometry related such as the cabin height or cabin volume, etc. This thesis proposes a new aircraft design process in which some of the top-level requirements are not explicitly specified. Instead, these previously specified parameters are now determined through the use of the Price-Per-Value-Factor (PPVF) index. This design process is well suited for design projects where general consensus of the top-level requirements does not exist. One example is the design of small commuter airliners. The above mentioned value factor is comprised of productivity, cabin volume, cabin height, cabin pressurization, mission fuel consumption, and field length, each weighted to a different exponent. The relative magnitude and positive/negative signs of these exponents are in agreement with general experience. The value factors of the commuter aircraft are shown to have improved over a period of four decades. In addition, the purchase price is shown to vary linearly with the value factor. The initial aircraft sizing process can be manpower intensive if the calculations are done manually. By incorporating automation into the process, the design cycle can be shortened considerably. The Fortran program functions and subroutines in this dissertation, in addition to the design and optimization methodologies described above, contribute to the reduction of manpower required for the initial sizing process. By combining the new design process mentioned above and the PPVF as the objective function, an optimization study is conducted on the design of a 20-seat regional jet. Handbook methods for aircraft design are written into a Fortran code. A genetic algorithm is used as the optimization scheme. The result of the optimization shows that aircraft designed to this PPVF index can be competitive compared to existing turboprop commuter aircraft. The process developed can be applied to other classes of aircraft with the designer modifying the cost function based upon the design goals.
Aligning observed and modelled behaviour based on workflow decomposition
NASA Astrophysics Data System (ADS)
Wang, Lu; Du, YuYue; Liu, Wei
2017-09-01
When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement of appropriate process models are increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion of the amount of event logs. Therefore, a new process mining technique is proposed based on a workflow decomposition method in this paper. Petri nets (PNs) are used to describe business processes, and then conformance checking of event logs and process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently; while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in programmable read-only memory (ProM). The correctness and effectiveness of the proposed methods are illustrated through experiments.
NASA Technical Reports Server (NTRS)
Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake
2010-01-01
The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.
Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.
2015-01-01
Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.
Real-time model learning using Incremental Sparse Spectrum Gaussian Process Regression.
Gijsberts, Arjan; Metta, Giorgio
2013-05-01
Novel applications in unstructured and non-stationary human environments require robots that learn from experience and adapt autonomously to changing conditions. Predictive models therefore not only need to be accurate, but should also be updated incrementally in real-time and require minimal human intervention. Incremental Sparse Spectrum Gaussian Process Regression is an algorithm that is targeted specifically for use in this context. Rather than developing a novel algorithm from the ground up, the method is based on the thoroughly studied Gaussian Process Regression algorithm, therefore ensuring a solid theoretical foundation. Non-linearity and a bounded update complexity are achieved simultaneously by means of a finite dimensional random feature mapping that approximates a kernel function. As a result, the computational cost for each update remains constant over time. Finally, algorithmic simplicity and support for automated hyperparameter optimization ensures convenience when employed in practice. Empirical validation on a number of synthetic and real-life learning problems confirms that the performance of Incremental Sparse Spectrum Gaussian Process Regression is superior with respect to the popular Locally Weighted Projection Regression, while computational requirements are found to be significantly lower. The method is therefore particularly suited for learning with real-time constraints or when computational resources are limited. Copyright © 2012 Elsevier Ltd. All rights reserved.
Sluiter, Amie; Sluiter, Justin; Wolfrum, Ed; ...
2016-05-20
Accurate and precise chemical characterization of biomass feedstocks and process intermediates is a requirement for successful technical and economic evaluation of biofuel conversion technologies. The uncertainty in primary measurements of the fraction insoluble solid (FIS) content of dilute acid pretreated corn stover slurry is the major contributor to uncertainty in yield calculations for enzymatic hydrolysis of cellulose to glucose. This uncertainty is propagated through process models and impacts modeled fuel costs. The challenge in measuring FIS is obtaining an accurate measurement of insoluble matter in the pretreated materials, while appropriately accounting for all biomass derived components. Three methods were testedmore » to improve this measurement. One used physical separation of liquid and solid phases, and two utilized direct determination of dry matter content in two fractions. We offer a comparison of drying methods. Lastly, our results show utilizing a microwave dryer to directly determine dry matter content is the optimal method for determining FIS, based on the low time requirements and the method optimization done using model slurries.« less
NASA Astrophysics Data System (ADS)
Sudibyo, Aji, B. B.; Sumardi, S.; Mufakir, F. R.; Junaidi, A.; Nurjaman, F.; Karna, Aziza, Aulia
2017-01-01
Gold amalgamation process was widely used to treat gold ore. This process produces the tailing or amalgamation solid waste, which still contains gold at 8-9 ppm. Froth flotation is one of the promising methods to beneficiate gold from this tailing. However, this process requires optimal conditions which depends on the type of raw material. In this study, Taguchi method was used to optimize the optimum conditions of the froth flotation process. The Taguchi optimization shows that the gold recovery was strongly influenced by the particle size which is the best particle size at 150 mesh followed by the Potassium amyl xanthate concentration, pH and pine oil concentration at 1133.98, 4535.92 and 68.04 gr/ton amalgamation tailing, respectively.
Optoelectronic Inner-Product Neural Associative Memory
NASA Technical Reports Server (NTRS)
Liu, Hua-Kuang
1993-01-01
Optoelectronic apparatus acts as artificial neural network performing associative recall of binary images. Recall process is iterative one involving optical computation of inner products between binary input vector and one or more reference binary vectors in memory. Inner-product method requires far less memory space than matrix-vector method.
Curing conditions to inactivate Trichinella spiralis muscle larvae in ready-to-eat pork sausage
USDA-ARS?s Scientific Manuscript database
Curing processes for ready to eat (RTE) pork products currently require individual validation of methods to demonstrate inactivation of Trichinella spiralis. This is a major undertaking for each process; currently no model of meat chemistry exists that can be correlated with inactivation of Trichin...
Mazzio, Katherine A; Okamoto, Ken; Li, Zhi; Gutmann, Sebastian; Strein, Elisabeth; Ginger, David S; Schlaf, Rudy; Luscombe, Christine K
2013-02-14
A one pot method for organic/colloidal CdSe nanoparticle hybrid material synthesis is presented. Relative to traditional ligand exchange processes, these materials require smaller amounts of the desired capping ligand, shorter syntheses and fewer processing steps, while maintaining nanoparticle morphology.
Energy Implications of Materials Processing
ERIC Educational Resources Information Center
Hayes, Earl T.
1976-01-01
Processing of materials could become energy-limited rather than resource-limited. Methods to extract metals, industrial minerals, and energy materials and convert them to useful states requires more than one-fifth of the United States energy budget. Energy accounting by industries must include a total systems analysis of costs to insure net energy…
40 CFR 98.176 - Data reporting requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., you must report the following information for each process: (1) The carbon content of each process input and output used to determine CO2 emissions. (2) Whether the carbon content was determined from information from the supplier or by laboratory analysis, and if by laboratory analysis, the method used. (3...
Heat recirculating cooler for fluid stream pollutant removal
Richards, George A.; Berry, David A.
2008-10-28
A process by which heat is removed from a reactant fluid to reach the operating temperature of a known pollutant removal method and said heat is recirculated to raise the temperature of the product fluid. The process can be utilized whenever an intermediate step reaction requires a lower reaction temperature than the prior and next steps. The benefits of a heat-recirculating cooler include the ability to use known pollutant removal methods and increased thermal efficiency of the system.
Gaussian process based intelligent sampling for measuring nano-structure surfaces
NASA Astrophysics Data System (ADS)
Sun, L. J.; Ren, M. J.; Yin, Y. H.
2016-09-01
Nanotechnology is the science and engineering that manipulate matters at nano scale, which can be used to create many new materials and devices with a vast range of applications. As the nanotech product increasingly enters the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and the quality control of the nanotechnology. However, many measuring instruments, for instance scanning probe microscopy, are limited to relatively small area of hundreds of micrometers with very low efficiency. Therefore some intelligent sampling strategies should be required to improve the scanning efficiency for measuring large area. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method makes use of Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimation of Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Then each sampling point is adaptively selected by determining the position which is the most likely outside of the required tolerance zone among the candidates and then inserted to update the model iteratively. Both simulationson the nominal surface and manufactured surface have been conducted on nano-structure surfaces to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency in measuring large area structured surfaces.
Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz
2012-01-01
The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet unit are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regards to product quality and operational value. Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement.
NASA Technical Reports Server (NTRS)
Eller, H. H.; Sugg, F. E.
1970-01-01
The methods and procedures used to perform nondestructive testing inspections of the Saturn S-2 liquid hydrogen and liquid oxygen tank weldments during fabrication and after proof testing are described to document special skills developed during the program. All post-test inspection requirements are outlined including radiographic inspections procedures.
ERIC Educational Resources Information Center
Hayden, Howard C.
1995-01-01
Presents a method to calculate the amount of high-level radioactive waste by taking into consideration the following factors: the fission process that yields the waste, identification of the waste, the energy required to run a 1-GWe plant for one year, and the uranium mass required to produce that energy. Briefly discusses waste disposal and…
Prefield methods: streamlining forest or nonforest determinations to increase inventory efficiency
Sara Goeking; Gretchen Moisen; Kevin Megown; Jason Toombs
2009-01-01
Interior West Forest Inventory and Analysis has developed prefield protocols to distinguish forested plots that require field visits from nonforested plots that do not require field visits. Recent innovations have increased the efficiency of the prefield process. First, the incorporation of periodic inventory data into a prefield database increased the amount of...
ERIC Educational Resources Information Center
Jovanovic, Vukica
2010-01-01
The present mixed-methods study examined the opinions of industry practitioners related to the implementation of environmental compliance requirements into design and manufacturing processes of mechatronic and electromechanical products. It focused on the environmental standards for mechatronic and electromechanical products and how Product…
Driving Objectives and High-level Requirements for KP-Lab Technologies
ERIC Educational Resources Information Center
Lakkala, Minna; Paavola, Sami; Toikka, Seppo; Bauters, Merja; Markannen, Hannu; de Groot, Reuma; Ben Ami, Zvi; Baurens, Benoit; Jadin, Tanja; Richter, Christoph; Zoserl, Eva; Batatia, Hadj; Paralic, Jan; Babic, Frantisek; Damsa, Crina; Sins, Patrick; Moen, Anne; Norenes, Svein Olav; Bugnon, Alexandra; Karlgren, Klas; Kotzinons, Dimitris
2008-01-01
One of the central goals of the KP-Lab project is to co-design pedagogical methods and technologies for knowledge creation and practice transformation in an integrative and reciprocal manner. In order to facilitate this process user tasks, driving objectives and high-level requirements have been introduced as conceptual tools to mediate between…
Analysis of a crossed Bragg cell acousto-optical spectrometer for SETI
NASA Technical Reports Server (NTRS)
Gulkis, S.
1989-01-01
The search for radio signals from extraterrestrial intelligent beings (SETI) requires the use of large instantaneous bandwidth (500 MHz) and high resolution (20 Hz) spectrometers. Digital systems with a high degree of modularity can be used to provide this capability, and this method has been widely discussed. Another technique for meeting the SETI requirement is to use a crossed Bragg cell spectrometer as described by Psaltis and Casasent. This technique makes use of the Folded Spectrum concept, introduced by Thomas. The Folded Spectrum is a 2-D Fourier Transform of a raster scanned 1-D signal. It is directly related to the long 1-D spectrum of the original signal and is ideally suited for optical signal processing. The folded spectrum technique has received little attention to date, primarily because early systems made use of photographic film which are unsuitable for the real time data analysis and voluminous data requirements of SETI. An analysis of the crossed Bragg cell spectrometer is presented as a method to achieve the spectral processing requirements for SETI. Systematic noise contributions unique to the Bragg cell system will be discussed.
Analysis of a crossed Bragg cell acousto-optical spectrometer for SETI.
Gulkis, S
1989-01-01
The search for radio signals from extraterrestrial intelligent beings (SETI) requires the use of large instantaneous bandwidth (500 MHz) and high resolution (20 Hz) spectrometers. Digital systems with a high degree of modularity can be used to provide this capability, and this method has been widely discussed. Another technique for meeting the SETI requirement is to use a crossed Bragg cell spectrometer as described by Psaltis and Casasent. This technique makes use of the Folded Spectrum concept, introduced by Thomas. The Folded Spectrum is a 2-D Fourier Transform of a raster scanned 1-D signal. It is directly related to the long 1-D spectrum of the original signal and is ideally suited for optical signal processing. The folded spectrum technique has received little attention to date, primarily because early systems made use of photographic film which are unsuitable for the real time data analysis and voluminous data requirements of SETI. An analysis of the crossed Bragg cell spectrometer is presented as a method to achieve the spectral processing requirements for SETI. Systematic noise contributions unique to the Bragg cell system will be discussed.
Analysis of a crossed Bragg cell acousto-optical spectrometer for SETI
NASA Astrophysics Data System (ADS)
Gulkis, Samuel
The search for radio signals from extraterrestrial intelligent beings (SETI) requires the use of large instantaneous bandwidth (500 MHz) and high resolution (20 Hz) spectrometers. Digital systems with a high degree of modularity can be used to provide this capability, and this method has been widely discussed. Another technique for meeting the SETI requirement is to use a crossed Bragg cell spectrometer as described by Psaltis and Casasent. This technique makes use of the Folded Spectrum concept, introduced by Thomas. The Folded Spectrum is a 2-D Fourier Transform of a raster scanned 1-D signal. It is directly related to the long 1-D spectrum of the original signal and is ideally suited for optical signal processing. The folded spectrum technique has received little attention to date, primarily because early systems made use of photographic film which are unsuitable for the real time data analysis and voluminous data requirements of SETI. An analysis of the crossed Bragg cell spectrometer is presented as a method to achieve the spectral processing requirements for SETI. Systematic noise contributions unique to the Bragg cell system will be discussed.
Strengthening Interprofessional Requirements Engineering Through Action Sheets: A Pilot Study
Pohlmann, Sabrina; Heinze, Oliver; Brandner, Antje; Reiß, Christina; Kamradt, Martina; Szecsenyi, Joachim; Ose, Dominik
2016-01-01
Background The importance of information and communication technology for healthcare is steadily growing. Newly developed tools are addressing different user groups: physicians, other health care professionals, social workers, patients, and family members. Since often many different actors with different expertise and perspectives are involved in the development process it can be a challenge to integrate the user-reported requirements of those heterogeneous user groups. Nevertheless, the understanding and consideration of user requirements is the prerequisite of building a feasible technical solution. In the course of the presented project it proved to be difficult to gain clear action steps and priorities for the development process out of the primary requirements compilation. Even if a regular exchange between involved teams took place there was a lack of a common language. Objective The objective of this paper is to show how the already existing requirements catalog was subdivided into specific, prioritized, and coherent working packages and the cooperation of multiple interprofessional teams within one development project was reorganized at the same time. In the case presented, the manner of cooperation was reorganized and a new instrument called an Action Sheet was implemented. This paper introduces the newly developed methodology which was meant to smooth the development of a user-centered software product and to restructure interprofessional cooperation. Methods There were 10 focus groups in which views of patients with colorectal cancer, physicians, and other health care professionals were collected in order to create a requirements catalog for developing a personal electronic health record. Data were audio- and videotaped, transcribed verbatim, and thematically analyzed. Afterwards, the requirements catalog was reorganized in the form of Action Sheets which supported the interprofessional cooperation referring to the development process of a personal electronic health record for the Rhine-Neckar region. Results In order to improve the interprofessional cooperation the idea arose to align the requirements arising from the implementation project with the method of software development applied by the technical development team. This was realized by restructuring the original requirements set in a standardized way and under continuous adjustment between both teams. As a result not only the way of displaying the user demands but also of interprofessional cooperation was steered in a new direction. Conclusions User demands must be taken into account from the very beginning of the development process, but it is not always obvious how to bring them together with IT knowhow and knowledge of the contextual factors of the health care system. Action Sheets seem to be an effective tool for making the software development process more tangible and convertible for all connected disciplines. Furthermore, the working method turned out to support interprofessional ideas exchange. PMID:27756716
Improved Linear Algebra Methods for Redshift Computation from Limited Spectrum Data - II
NASA Technical Reports Server (NTRS)
Foster, Leslie; Waagen, Alex; Aijaz, Nabella; Hurley, Michael; Luis, Apolo; Rinsky, Joel; Satyavolu, Chandrika; Gazis, Paul; Srivastava, Ashok; Way, Michael
2008-01-01
Given photometric broadband measurements of a galaxy, Gaussian processes may be used with a training set to solve the regression problem of approximating the redshift of this galaxy. However, in practice solving the traditional Gaussian processes equation is too slow and requires too much memory. We employed several methods to avoid this difficulty using algebraic manipulation and low-rank approximation, and were able to quickly approximate the redshifts in our testing data within 17 percent of the known true values using limited computational resources. The accuracy of one method, the V Formulation, is comparable to the accuracy of the best methods currently used for this problem.
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Improving the Aircraft Design Process Using Web-based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.
2003-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and muitifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Electrical Bonding: A Survey of Requirement, Methods, and Specifications
NASA Technical Reports Server (NTRS)
Evans, R. W.
1998-01-01
This document provides information helpful to engineers imposing electrical bonding requirements, reviewing waiver requests, or modifying specifications on various space programs. Electrical bonding specifications and some of the processes used in the United States have been reviewed. This document discusses the specifications, the types of bonds, the intent of each, and the basic requirements where possible. Additional topics discussed are resistance versus impedance, bond straps, corrosion, finishes, and special applications.
Design Tool Using a New Optimization Method Based on a Stochastic Process
NASA Astrophysics Data System (ADS)
Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio
Conventional optimization methods are based on a deterministic approach since their purpose is to find out an exact solution. However, such methods have initial condition dependence and the risk of falling into local solution. In this paper, we propose a new optimization method based on the concept of path integrals used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require techniques based on experiences. We applied the new optimization method to a hang glider design. In this problem, both the hang glider design and its flight trajectory were optimized. The numerical calculation results prove that performance of the method is sufficient for practical use.
Exploring Operational Test and Evaluation of Unmanned Aircraft Systems: A Qualitative Case Study
NASA Astrophysics Data System (ADS)
Saliceti, Jose A.
The purpose of this qualitative case study was to explore and identify strategies that may potentially remedy operational test and evaluation procedures used to evaluate Unmanned Aircraft Systems (UAS) technology. The sample for analysis consisted of organizations testing and evaluating UASs (e.g., U.S. Air Force, U.S. Navy, U.S. Army, U.S. Marine Corps, U.S. Coast Guard, and Customs Border Protection). A purposeful sampling technique was used to select 15 subject matter experts in the field of operational test and evaluation of UASs. A questionnaire was provided to participants to construct a descriptive and robust research. Analysis of responses revealed themes related to each research question. Findings revealed operational testers utilized requirements documents to extrapolate measures for testing UAS technology and develop critical operational issues. The requirements documents were (a) developed without the contribution of stakeholders and operational testers, (b) developed with vague or unrealistic measures, and (c) developed without a systematic method to derive requirements from mission tasks. Four approaches are recommended to develop testable operational requirements and assist operational testers: (a) use a mission task analysis tool to derive requirements for mission essential tasks for the system, (b) exercise collaboration among stakeholders and testers to ensure testable operational requirements based on mission tasks, (c) ensure testable measures are used in requirements documents, and (d) create a repository list of critical operational issues by mission areas. The preparation of operational test and evaluation processes for UAS technology is not uniform across testers. The processes in place are not standardized, thus test plan preparation and reporting are different among participants. A standard method to prepare and report UAS technology should be used when preparing and reporting on UAS technology. Using a systematic process, such as mission-based test design, resonated among participants as an analytical method to link UAS mission tasks and measures of performance to the capabilities of the system under test when developing operational test plans. Further research should examine system engineering designs for system requirements traceability matrix of mission tasks and subtasks while using an analysis tool that adequately evaluates UASs with an acceptable level of confidence in the results.
Registration and Marking Requirements for UAS. Unmanned Aircraft System (UAS) Registration
NASA Technical Reports Server (NTRS)
2005-01-01
The registration of an aircraft is a prerequisite for issuance of a U.S. certificate of airworthiness by the FAA. The procedures and requirements for aircraft registration, and the subsequent issuance of registration numbers, are contained in FAR Part 47. However, the process/method(s) for applying the requirements of Parts 45 & 47 to Unmanned Aircraft Systems (UAS) has not been defined. This task resolved the application of 14 CFR Parts 45 and 47 to UAS. Key Findings: UAS are aircraft systems and as such the recommended approach to registration is to follow the same process for registration as manned aircraft. This will require manufacturers to comply with the requirements for 14 CFR 47, Aircraft Registration and 14 CFR 45, Identification and Registration Marking. In addition, only the UA should be identified with the N number registration markings. There should also be a documentation link showing the applicability of the control station and communication link to the UA. The documentation link can be in the form of a Type Certificate Data Sheet (TCDS) entry or a UAS logbook entry. The recommended process for the registration of UAS is similar to the manned aircraft process and is outlined in a 6-step process in the paper.
WALSH, TIMOTHY F.; JONES, ANDREA; BHARDWAJ, MANOJ; ...
2013-04-01
Finite element analysis of transient acoustic phenomena on unbounded exterior domains is very common in engineering analysis. In these problems there is a common need to compute the acoustic pressure at points outside of the acoustic mesh, since meshing to points of interest is impractical in many scenarios. In aeroacoustic calculations, for example, the acoustic pressure may be required at tens or hundreds of meters from the structure. In these cases, a method is needed for post-processing the acoustic results to compute the response at far-field points. In this paper, we compare two methods for computing far-field acoustic pressures, onemore » derived directly from the infinite element solution, and the other from the transient version of the Kirchhoff integral. Here, we show that the infinite element approach alleviates the large storage requirements that are typical of Kirchhoff integral and related procedures, and also does not suffer from loss of accuracy that is an inherent part of computing numerical derivatives in the Kirchhoff integral. In order to further speed up and streamline the process of computing the acoustic response at points outside of the mesh, we also address the nonlinear iterative procedure needed for locating parametric coordinates within the host infinite element of far-field points, the parallelization of the overall process, linear solver requirements, and system stability considerations.« less
The MINERVA Software Development Process
NASA Technical Reports Server (NTRS)
Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.
2017-01-01
This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
An Adaptive Kalman Filter using a Simple Residual Tuning Method
NASA Technical Reports Server (NTRS)
Harman, Richard R.
1999-01-01
One difficulty in using Kalman filters in real world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate these parameters. Most of those methods such as maximum likelihood, subspace, and observer Kalman Identification require extensive offline processing and are not suitable for real time processing. One technique, which is suitable for real time processing, is the residual tuning method. Any mismodeling of the filter tuning parameters will result in a non-white sequence for the filter measurement residuals. The residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation results in a set of sequential equations that run in parallel with the Kalman filter. Equations for the estimation of the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
Detector motion method to increase spatial resolution in photon-counting detectors
NASA Astrophysics Data System (ADS)
Lee, Daehee; Park, Kyeongjin; Lim, Kyung Taek; Cho, Gyuseong
2017-03-01
Medical imaging requires high spatial resolution of an image to identify fine lesions. Photon-counting detectors in medical imaging have recently been rapidly replacing energy-integrating detectors due to the former`s high spatial resolution, high efficiency and low noise. Spatial resolution in a photon counting image is determined by the pixel size. Therefore, the smaller the pixel size, the higher the spatial resolution that can be obtained in an image. However, detector redesigning is required to reduce pixel size, and an expensive fine process is required to integrate a signal processing unit with reduced pixel size. Furthermore, as the pixel size decreases, charge sharing severely deteriorates spatial resolution. To increase spatial resolution, we propose a detector motion method using a large pixel detector that is less affected by charge sharing. To verify the proposed method, we utilized a UNO-XRI photon-counting detector (1-mm CdTe, Timepix chip) at the maximum X-ray tube voltage of 80 kVp. A similar spatial resolution of a 55- μm-pixel image was achieved by application of the proposed method to a 110- μm-pixel detector with a higher signal-to-noise ratio. The proposed method could be a way to increase spatial resolution without a pixel redesign when pixels severely suffer from charge sharing as pixel size is reduced.
NASA Technical Reports Server (NTRS)
Booth, E., Jr.; Yu, J. C.
1986-01-01
An experimental investigation of two dimensional blade vortex interaction was held at NASA Langley Research Center. The first phase was a flow visualization study to document the approach process of a two dimensional vortex as it encountered a loaded blade model. To accomplish the flow visualization study, a method for generating two dimensional vortex filaments was required. The numerical study used to define a new vortex generation process and the use of this process in the flow visualization study were documented. Additionally, photographic techniques and data analysis methods used in the flow visualization study are examined.
Reanalysis, compatibility and correlation in analysis of modified antenna structures
NASA Technical Reports Server (NTRS)
Levy, R.
1989-01-01
A simple computational procedure is synthesized to process changes in the microwave-antenna pathlength-error measure when there are changes in the antenna structure model. The procedure employs structural modification reanalysis methods combined with new extensions of correlation analysis to provide the revised rms pathlength error. Mainframe finite-element-method processing of the structure model is required only for the initial unmodified structure, and elementary postprocessor computations develop and deal with the effects of the changes. Several illustrative computational examples are included. The procedure adapts readily to processing spectra of changes for parameter studies or sensitivity analyses.
Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid
2017-10-21
Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, after cell disintegration, the first step to optimal product recovery is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, due to laboratory-scale methods requiring large amounts of materials and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing, regarding impurity depletion, and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.
Advance Technology Satellites in the Commercial Environment. Volume 2: Final Report
NASA Technical Reports Server (NTRS)
1984-01-01
A forecast of transponder requirements was obtained. Certain assumptions about system configurations are implicit in this process. The factors included are interpolation of baseline year values to produce yearly figures, estimation of satellite capture, effects of peak-hours and the time-zone staggering of peak hours, circuit requirements for acceptable grade of service capacity of satellite transponders, including various compression methods where applicable, and requirements for spare transponders in orbit. The graphical distribution of traffic requirements was estimated.
A New Sampling Strategy for the Detection of Fecal Bacteria Integrated with USEPA Method 1622/1623
USEPA Method 1622/1623 requires the concentration of Cryptosporidium and Giardia from 10 liters of water samples prior to detection. During this process the supernatant is discarded because it is assumed that most protozoa are retained in the filtration and centrifugation steps....
40 CFR 98.294 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... scales or methods used for accounting purposes. (3) Document the procedures used to ensure the accuracy of the monthly measurements of trona consumed. (b) If you calculate CO2 process emissions based on... your facility, or methods used for accounting purposes. (3) Document the procedures used to ensure the...
Assessing Creative Thinking in Design-Based Learning
ERIC Educational Resources Information Center
Doppelt, Yaron
2009-01-01
Infusing creative thinking competence through the design process of authentic projects requires not only changing the teaching methods and learning environment, but also adopting new assessment methods, such as portfolio assessment. The participants in this study were 128 high school pupils who have studied MECHATRONICS from 10th to 12th grades…
Hooked on Inquiry: History Labs in the Methods Course
ERIC Educational Resources Information Center
Wood, Linda Sargent
2012-01-01
Methods courses provide a rich opportunity to unpack what it means to "learn history by doing history." To help explain what "doing history" means, the author has created history labs to walk teacher candidates through the historical process. Each lab poses a historical problem, requires analysis of primary and secondary…
Identifying Teaching Methods that Engage Entrepreneurship Students
ERIC Educational Resources Information Center
Balan, Peter; Metcalfe, Mike
2012-01-01
Purpose: Entrepreneurship education particularly requires student engagement because of the complexity of the entrepreneurship process. The purpose of this paper is to describe how an established measure of engagement can be used to identify relevant teaching methods that could be used to engage any group of entrepreneurship students.…
Are We There Yet? Evaluating Library Collections, Reference Services, Programs, and Personnel.
ERIC Educational Resources Information Center
Robbins-Carter, Jane; Zweizig, Douglas L.
1985-01-01
This second in a five-lesson tutorial on library evaluation focuses on the evaluation of library collections. Highlights include the seven-step evaluation process described in lesson one; quantitative methods (total size, unfilled requests, circulation, turnover rate); and qualitative methods (impressionistic, list-checking). One required and…
NASA Astrophysics Data System (ADS)
Pries, V. V.; Proskuriakov, N. E.
2018-04-01
To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the operation of the devices and systems of automatic rotor line, there is always a real probability of getting defective (incomplete) products into the output process stream. Therefore, a continuous sampling control of the products completeness, based on the use of statistical methods, remains an important element in managing the quality of assembly of multi-element mass products on automatic rotor lines. The feature of continuous sampling control of the multi-element products completeness in the assembly process is its breaking sort, which excludes the possibility of returning component parts after sampling control to the process stream and leads to a decrease in the actual productivity of the assembly equipment. Therefore, the use of statistical procedures for continuous sampling control of the multi-element products completeness when assembled on automatic rotor lines requires the use of such sampling plans that ensure a minimum size of control samples. Comparison of the values of the limit of the average output defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows the possibility of providing lower limit values for the average output defects level using the ACSP-1. Also, the average sample size when using the ACSP-1 plan is less than when using the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, involving the use of proposed plans and methods for continuous selective control, will allow to automating sampling control procedures and the required level of quality of assembled products while minimizing sample size.
Analytical and regression models of glass rod drawing process
NASA Astrophysics Data System (ADS)
Alekseeva, L. B.
2018-03-01
The process of drawing glass rods (light guides) is being studied. The parameters of the process affecting the quality of the light guide have been determined. To solve the problem, mathematical models based on general equations of continuum mechanics are used. The conditions for the stable flow of the drawing process have been found, which are determined by the stability of the motion of the glass mass in the formation zone to small uncontrolled perturbations. The sensitivity of the formation zone to perturbations of the drawing speed and viscosity is estimated. Experimental models of the drawing process, based on the regression analysis methods, have been obtained. These models make it possible to customize a specific production process to obtain light guides of the required quality. They allow one to find the optimum combination of process parameters in the chosen area and to determine the required accuracy of maintaining them at a specified level.
NASA Astrophysics Data System (ADS)
Schuck, Miller Harry
Automotive head-up displays require compact, bright, and inexpensive imaging systems. In this thesis, a compact head-up display (HUD) utilizing liquid-crystal-on-silicon microdisplay technology is presented from concept to implementation. The thesis comprises three primary areas of HUD research: the specification, design and implementation of a compact HUD optical system, the development of a wafer planarization process to enhance reflective device brightness and light immunity and the design, fabrication and testing of an inexpensive 640 x 512 pixel active matrix backplane intended to meet the HUD requirements. The thesis addresses the HUD problem at three levels, the systems level, the device level, and the materials level. At the systems level, the optical design of an automotive HUD must meet several competing requirements, including high image brightness, compact packaging, video-rate performance, and low cost. An optical system design which meets the competing requirements has been developed utilizing a fully-reconfigurable reflective microdisplay. The design consists of two optical stages, the first a projector stage which magnifies the display, and a second stage which forms the virtual image eventually seen by the driver. A key component of the optical system is a diffraction grating/field lens which forms a large viewing eyebox while reducing the optical system complexity. Image quality biocular disparity and luminous efficacy were analyzed and results of the optical implementation are presented. At the device level, the automotive HUD requires a reconfigurable, video-rate, high resolution image source for applications such as navigation and night vision. The design of a 640 x 512 pixel active matrix backplane which meets the requirements of the HUD is described. The backplane was designed to produce digital field sequential color images at video rates utilizing fast switching liquid crystal as the modulation layer. The design methodology is discussed, and the example of a clock generator is described from design to implementation. Electrical and optical test results of the fabricated backplane are presented. At the materials level, a planarization method was developed to meet the stringent brightness requirements of automotive HUD's. The research efforts described here have resulted in a simple, low cost post-processing method for planarizing microdisplay substrates based on a spin-cast polymeric resin, benzocyclobutene (BCB). Six- fold reductions in substrate step height were accomplished with a single coating. Via masking and dry etching methods were developed. High reflectivity metal was deposited and patterned over the planarized substrate to produce high aperture pixel mirrors. The process is simple, rapid, and results in microdisplays better able to meet the stringent requirements of high brightness display systems. Methods and results of the post- processing are described.
Reducing the complexity of the software design process with object-oriented design
NASA Technical Reports Server (NTRS)
Schuler, M. P.
1991-01-01
Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchal OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
A Non-Intrusive GMA Welding Process Quality Monitoring System Using Acoustic Sensing.
Cayo, Eber Huanca; Alfaro, Sadek Crisostomo Absi
2009-01-01
Most of the inspection methods used for detection and localization of welding disturbances are based on the evaluation of some direct measurements of welding parameters. This direct measurement requires an insertion of sensors during the welding process which could somehow alter the behavior of the metallic transference. An inspection method that evaluates the GMA welding process evolution using a non-intrusive process sensing would allow not only the identification of disturbances during welding runs and thus reduce inspection time, but would also reduce the interference on the process caused by the direct sensing. In this paper a nonintrusive method for weld disturbance detection and localization for weld quality evaluation is demonstrated. The system is based on the acoustic sensing of the welding electrical arc. During repetitive tests in welds without disturbances, the stability acoustic parameters were calculated and used as comparison references for the detection and location of disturbances during the weld runs.
A Non-Intrusive GMA Welding Process Quality Monitoring System Using Acoustic Sensing
Cayo, Eber Huanca; Alfaro, Sadek Crisostomo Absi
2009-01-01
Most of the inspection methods used for detection and localization of welding disturbances are based on the evaluation of some direct measurements of welding parameters. This direct measurement requires an insertion of sensors during the welding process which could somehow alter the behavior of the metallic transference. An inspection method that evaluates the GMA welding process evolution using a non-intrusive process sensing would allow not only the identification of disturbances during welding runs and thus reduce inspection time, but would also reduce the interference on the process caused by the direct sensing. In this paper a nonintrusive method for weld disturbance detection and localization for weld quality evaluation is demonstrated. The system is based on the acoustic sensing of the welding electrical arc. During repetitive tests in welds without disturbances, the stability acoustic parameters were calculated and used as comparison references for the detection and location of disturbances during the weld runs. PMID:22399990
Gaussian processes: a method for automatic QSAR modeling of ADME properties.
Obrezanova, Olga; Csanyi, Gabor; Gola, Joelle M R; Segall, Matthew D
2007-01-01
In this article, we discuss the application of the Gaussian Process method for the prediction of absorption, distribution, metabolism, and excretion (ADME) properties. On the basis of a Bayesian probabilistic approach, the method is widely used in the field of machine learning but has rarely been applied in quantitative structure-activity relationship and ADME modeling. The method is suitable for modeling nonlinear relationships, does not require subjective determination of the model parameters, works for a large number of descriptors, and is inherently resistant to overtraining. The performance of Gaussian Processes compares well with and often exceeds that of artificial neural networks. Due to these features, the Gaussian Processes technique is eminently suitable for automatic model generation-one of the demands of modern drug discovery. Here, we describe the basic concept of the method in the context of regression problems and illustrate its application to the modeling of several ADME properties: blood-brain barrier, hERG inhibition, and aqueous solubility at pH 7.4. We also compare Gaussian Processes with other modeling techniques.
Method for Reducing the Refresh Rate of Fiber Bragg Grating Sensors
NASA Technical Reports Server (NTRS)
Parker, Allen R., Jr. (Inventor)
2014-01-01
The invention provides a method of obtaining the FBG data in final form (transforming the raw data into frequency and location data) by taking the raw FBG sensor data and dividing the data into a plurality of segments over time. By transforming the raw data into a plurality of smaller segments, processing time is significantly decreased. Also, by defining the segments over time, only one processing step is required. By employing this method, the refresh rate of FBG sensor systems can be improved from about 1 scan per second to over 20 scans per second.
Investigation of test methods, material properties and processes for solar cell encapsulants
NASA Technical Reports Server (NTRS)
Willis, P. B.
1985-01-01
The historical development of ethylene vinyl acetate (EVA) is presented, including the functional requirements, polymer selection, curing, stabilization, production and module processing. The construction and use of a new method for the accelerated aging of polymers is detailed. The method more closely resembles the conditions that may be encountered in actual module field exposure and additionally may permit service life to be predicted accurately. The use of hardboard as a low cost candidate substrate material is studied. The performance of surface antisoiling treatments useful for imparting a self cleaning property to modules is updated.
Farzandipour, Mehrdad; Meidani, Zahra; Riazi, Hossein; Sadeqi Jabali, Monireh
2016-12-01
Considering the integral role of understanding users' requirements in information system success, this research aimed to determine functional requirements of nursing information systems through a national survey. Delphi technique method was applied to conduct this study through three phases: focus group method modified Delphi technique and classic Delphi technique. A cross-sectional study was conducted to evaluate the proposed requirements within 15 general hospitals in Iran. Forty-three of 76 approved requirements were clinical, and 33 were administrative ones. Nurses' mean agreements for clinical requirements were higher than those of administrative requirements; minimum and maximum means of clinical requirements were 3.3 and 3.88, respectively. Minimum and maximum means of administrative requirements were 3.1 and 3.47, respectively. Research findings indicated that those information system requirements that support nurses in doing tasks including direct care, medicine prescription, patient treatment management, and patient safety have been the target of special attention. As nurses' requirements deal directly with patient outcome and patient safety, nursing information systems requirements should not only address automation but also nurses' tasks and work processes based on work analysis.
Identification of dynamic systems, theory and formulation
NASA Technical Reports Server (NTRS)
Maine, R. E.; Iliff, K. W.
1985-01-01
The problem of estimating parameters of dynamic systems is addressed in order to present the theoretical basis of system identification and parameter estimation in a manner that is complete and rigorous, yet understandable with minimal prerequisites. Maximum likelihood and related estimators are highlighted. The approach used requires familiarity with calculus, linear algebra, and probability, but does not require knowledge of stochastic processes or functional analysis. The treatment emphasizes unification of the various areas in estimation in dynamic systems is treated as a direct outgrowth of the static system theory. Topics covered include basic concepts and definitions; numerical optimization methods; probability; statistical estimators; estimation in static systems; stochastic processes; state estimation in dynamic systems; output error, filter error, and equation error methods of parameter estimation in dynamic systems, and the accuracy of the estimates.
Volume Segmentation and Ghost Particles
NASA Astrophysics Data System (ADS)
Ziskin, Isaac; Adrian, Ronald
2011-11-01
Volume Segmentation Tomographic PIV (VS-TPIV) is a type of tomographic PIV in which images of particles in a relatively thick volume are segmented into images on a set of much thinner volumes that may be approximated as planes, as in 2D planar PIV. The planes of images can be analysed by standard mono-PIV, and the volume of flow vectors can be recreated by assembling the planes of vectors. The interrogation process is similar to a Holographic PIV analysis, except that the planes of image data are extracted from two-dimensional camera images of the volume of particles instead of three-dimensional holographic images. Like the tomographic PIV method using the MART algorithm, Volume Segmentation requires at least two cameras and works best with three or four. Unlike the MART method, Volume Segmentation does not require reconstruction of individual particle images one pixel at a time and it does not require an iterative process, so it operates much faster. As in all tomographic reconstruction strategies, ambiguities known as ghost particles are produced in the segmentation process. The effect of these ghost particles on the PIV measurement is discussed. This research was supported by Contract 79419-001-09, Los Alamos National Laboratory.
Light, Janice; McNaughton, David
2014-06-01
In order to improve outcomes for individuals who require AAC, there is an urgent need for research across the full spectrum--from basic research to investigate fundamental language and communication processes, to applied clinical research to test applications of this new knowledge in the real world. To date, there has been a notable lack of basic research in the AAC field to investigate the underlying cognitive, sensory perceptual, linguistic, and motor processes of individuals with complex communication needs. Eye tracking research technology provides a promising method for researchers to investigate some of the visual cognitive processes that underlie interaction via AAC. The eye tracking research technology automatically records the latency, duration, and sequence of visual fixations, providing key information on what elements attract the individual's attention (and which ones do not), for how long, and in what sequence. As illustrated by the papers in this special issue, this information can be used to improve the design of AAC systems, assessments, and interventions to better meet the needs of individuals with developmental and acquired disabilities who require AAC (e.g., individuals with autism spectrum disorders, Down syndrome, intellectual disabilities of unknown origin, aphasia).
Solar Energy Systems for Lunar Oxygen Generation
NASA Technical Reports Server (NTRS)
Colozza, Anthony J.; Heller, Richard S.; Wong, Wayne A.; Hepp, Aloysius F.
2010-01-01
An evaluation of several solar concentrator-based systems for producing oxygen from lunar regolith was performed. The systems utilize a solar concentrator mirror to provide thermal energy for the oxygen production process. Thermal energy to power a Stirling heat engine and photovoltaics are compared for the production of electricity. The electricity produced is utilized to operate the equipment needed in the oxygen production process. The initial oxygen production method utilized in the analysis is hydrogen reduction of ilmenite. Utilizing this method of oxygen production a baseline system design was produced. This baseline system had an oxygen production rate of 0.6 kg/hr with a concentrator mirror size of 5 m. Variations were performed on the baseline design to show how changes in the system size and process (rate) affected the oxygen production rate. An evaluation of the power requirements for a carbothermal lunar regolith reduction reactor has also been conducted. The reactor had a total power requirement between 8,320 to 9,961 W when producing 1000 kg/year of oxygen. The solar concentrator used to provide the thermal power (over 82 percent of the total energy requirement) would have a diameter of less than 4 m.
Processes for Assessing the Thermal Stability of Han-Based Liquid Propellants. Revision
1990-07-01
indicators is not adequate, and potentiometric determination cr’ the equivalence point is the most suitable method (Kraft and Fischer 1972). The use of...be determined by Karl Fischer titration. This method requires a special titration apparatus because the Titroprozessor 636 is not suited for this type... methods obtained from the literature (Kraft and Fischer 1972), and, where necessary, the manufacturer has modified evaluation methods (Firmenschrift
Software for MR image overlay guided needle insertions: the clinical translation process
NASA Astrophysics Data System (ADS)
Ungi, Tamas; U-Thainual, Paweena; Fritz, Jan; Iordachita, Iulian I.; Flammang, Aaron J.; Carrino, John A.; Fichtinger, Gabor
2013-03-01
PURPOSE: Needle guidance software using augmented reality image overlay was translated from the experimental phase to support preclinical and clinical studies. Major functional and structural changes were needed to meet clinical requirements. We present the process applied to fulfill these requirements, and selected features that may be applied in the translational phase of other image-guided surgical navigation systems. METHODS: We used an agile software development process for rapid adaptation to unforeseen clinical requests. The process is based on iterations of operating room test sessions, feedback discussions, and software development sprints. The open-source application framework of 3D Slicer and the NA-MIC kit provided sufficient flexibility and stable software foundations for this work. RESULTS: All requirements were addressed in a process with 19 operating room test iterations. Most features developed in this phase were related to workflow simplification and operator feedback. CONCLUSION: Efficient and affordable modifications were facilitated by an open source application framework and frequent clinical feedback sessions. Results of cadaver experiments show that software requirements were successfully solved after a limited number of operating room tests.
NASA Astrophysics Data System (ADS)
Moustafa, Azza A.; Hegazy, Maha A.; Mohamed, Dalia; Ali, Omnia
2016-02-01
A novel approach for the resolution and quantitation of severely overlapped quaternary mixture of carbinoxamine maleate (CAR), pholcodine (PHL), ephedrine hydrochloride (EPH) and sunset yellow (SUN) in syrup was demonstrated utilizing different spectrophotometric assisted multivariate calibration methods. The applied methods have used different processing and pre-processing algorithms. The proposed methods were partial least squares (PLS), concentration residuals augmented classical least squares (CRACLS), and a novel method; continuous wavelet transforms coupled with partial least squares (CWT-PLS). These methods were applied to a training set in the concentration ranges of 40-100 μg/mL, 40-160 μg/mL, 100-500 μg/mL and 8-24 μg/mL for the four components, respectively. The utilized methods have not required any preliminary separation step or chemical pretreatment. The validity of the methods was evaluated by an external validation set. The selectivity of the developed methods was demonstrated by analyzing the drugs in their combined pharmaceutical formulation without any interference from additives. The obtained results were statistically compared with the official and reported methods where no significant difference was observed regarding both accuracy and precision.
Kundu, Anupam; Sabhapandit, Sanjib; Dhar, Abhishek
2011-03-01
We present an algorithm for finding the probabilities of rare events in nonequilibrium processes. The algorithm consists of evolving the system with a modified dynamics for which the required event occurs more frequently. By keeping track of the relative weight of phase-space trajectories generated by the modified and the original dynamics one can obtain the required probabilities. The algorithm is tested on two model systems of steady-state particle and heat transport where we find a huge improvement from direct simulation methods.
NASA Technical Reports Server (NTRS)
Newell, J. D.; Keller, R. A.; Baily, N. A.
1974-01-01
A simple method for outlining or contouring any area defined by a change in film density or fluoroscopic screen intensity is described. The entire process, except for the positioning of an electronic window, is accomplished using a small computer having appropriate softwave. The electronic window is operator positioned over the area to be processed. The only requirement is that the window be large enough to encompass the total area to be considered.
Taguchi experimental design to determine the taste quality characteristic of candied carrot
NASA Astrophysics Data System (ADS)
Ekawati, Y.; Hapsari, A. A.
2018-03-01
Robust parameter design is used to design product that is robust to noise factors so the product’s performance fits the target and delivers a better quality. In the process of designing and developing the innovative product of candied carrot, robust parameter design is carried out using Taguchi Method. The method is used to determine an optimal quality design. The optimal quality design is based on the process and the composition of product ingredients that are in accordance with consumer needs and requirements. According to the identification of consumer needs from the previous research, quality dimensions that need to be assessed are the taste and texture of the product. The quality dimension assessed in this research is limited to the taste dimension. Organoleptic testing is used for this assessment, specifically hedonic testing that makes assessment based on consumer preferences. The data processing uses mean and signal to noise ratio calculation and optimal level setting to determine the optimal process/composition of product ingredients. The optimal value is analyzed using confirmation experiments to prove that proposed product match consumer needs and requirements. The result of this research is identification of factors that affect the product taste and the optimal quality of product according to Taguchi Method.
A method of network topology optimization design considering application process characteristic
NASA Astrophysics Data System (ADS)
Wang, Chunlin; Huang, Ning; Bai, Yanan; Zhang, Shuo
2018-03-01
Communication networks are designed to meet the usage requirements of users for various network applications. The current studies of network topology optimization design mainly considered network traffic, which is the result of network application operation, but not a design element of communication networks. A network application is a procedure of the usage of services by users with some demanded performance requirements, and has obvious process characteristic. In this paper, we first propose a method to optimize the design of communication network topology considering the application process characteristic. Taking the minimum network delay as objective, and the cost of network design and network connective reliability as constraints, an optimization model of network topology design is formulated, and the optimal solution of network topology design is searched by Genetic Algorithm (GA). Furthermore, we investigate the influence of network topology parameter on network delay under the background of multiple process-oriented applications, which can guide the generation of initial population and then improve the efficiency of GA. Numerical simulations show the effectiveness and validity of our proposed method. Network topology optimization design considering applications can improve the reliability of applications, and provide guidance for network builders in the early stage of network design, which is of great significance in engineering practices.
Comparison of six methods for isolating mycobacteria from swine lymph nodes.
Thoen, C O; Richards, W D; Jarnagin, J L
1974-03-01
Six laboratory methods were compared for isolating acid-fast bacteria. Tuberculous lymph nodes from each of 48 swine as identified by federal meat inspectors were processed by each of the methods. Treated tissue suspensions were inoculated onto each of eight media which were observed at 7-day intervals for 9 weeks. There were no statistically significant differences between the number of Mycobacterium avium complex bacteria isolated by each of the six methods. Rapid tissue preparation methods involving treatment with 2% sodium hydroxide or treatment with 0.2% zephiran required only one-third to one-fourth the processing time as a standard method. There were small differences in the amount of contamination among the six methods, but no detectable differences in the time of first appearance of M. avium complex colonies.
Cho, Seok-Cheol; Choi, Woon-Yong; Oh, Sung-Ho; Lee, Choon-Geun; Seo, Yong-Chang; Kim, Ji-Seon; Song, Chi-Ho; Kim, Ga-Vin; Lee, Shin-Young; Kang, Do-Hyung; Lee, Hyeon-Yong
2012-01-01
Marine microalga, Scenedesmus sp., which is known to be suitable for biodiesel production because of its high lipid content, was subjected to the conventional Folch method of lipid extraction combined with high-pressure homogenization pretreatment process at 1200 psi and 35°C. Algal lipid yield was about 24.9% through this process, whereas only 19.8% lipid can be obtained by following a conventional lipid extraction procedure using the solvent, chloroform : methanol (2 : 1, v/v). Present approach requires 30 min process time and a moderate working temperature of 35°C as compared to the conventional extraction method which usually requires >5 hrs and 65°C temperature. It was found that this combined extraction process followed second-order reaction kinetics, which means most of the cellular lipids were extracted during initial periods of extraction, mostly within 30 min. In contrast, during the conventional extraction process, the cellular lipids were slowly and continuously extracted for >5 hrs by following first-order kinetics. Confocal and scanning electron microscopy revealed altered texture of algal biomass pretreated with high-pressure homogenization. These results clearly demonstrate that the Folch method coupled with high-pressure homogenization pretreatment can easily destruct the rigid cell walls of microalgae and release the intact lipids, with minimized extraction time and temperature, both of which are essential for maintaining good quality of the lipids for biodiesel production. PMID:22969270
Kim, Huiyong; Hwang, Sung June; Lee, Kwang Soon
2015-02-03
Among various CO2 capture processes, the aqueous amine-based absorption process is considered the most promising for near-term deployment. However, the performance evaluation of newly developed solvents still requires complex and time-consuming procedures, such as pilot plant tests or the development of a rigorous simulator. Absence of accurate and simple calculation methods for the energy performance at an early stage of process development has lengthened and increased expense of the development of economically feasible CO2 capture processes. In this paper, a novel but simple method to reliably calculate the regeneration energy in a standard amine-based carbon capture process is proposed. Careful examination of stripper behaviors and exploitation of energy balance equations around the stripper allowed for calculation of the regeneration energy using only vapor-liquid equilibrium and caloric data. Reliability of the proposed method was confirmed by comparing to rigorous simulations for two well-known solvents, monoethanolamine (MEA) and piperazine (PZ). The proposed method can predict the regeneration energy at various operating conditions with greater simplicity, greater speed, and higher accuracy than those proposed in previous studies. This enables faster and more precise screening of various solvents and faster optimization of process variables and can eventually accelerate the development of economically deployable CO2 capture processes.
Bright, T.J.
2013-01-01
Summary Background Many informatics studies use content analysis to generate functional requirements for system development. Explication of this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor when applying content analysis in informatics studies. Objective To describe a user-centered approach transforming emergent themes derived from focus group data into functional requirements for informatics solutions and to illustrate these methods to the development of an antibiotic clinical decision support system (CDS). Methods The approach consisted of five steps: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link with the identified informatics solutions and functional requirements. Results The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II included five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements of an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated required data elements for each informatics solution. Conclusion This study presents one example explicating content analysis of focus group data and the analysis process to functional requirements from narrative data. Illustration of this 5-step method was used to develop an antibiotic CDS system, resolving unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains. PMID:24454586
Method And Apparatus For Launching Microwave Energy Into A Plasma Processing Chamber
DOUGHTY, FRANK C.; [et al
2001-05-01
A method and apparatus for launching microwave energy to a plasma processing chamber in which the required magnetic field is generated by a permanent magnet structure and the permanent magnet material effectively comprises one or more surfaces of the waveguide structure. The waveguide structure functions as an impedance matching device and controls the field pattern of the launched microwave field to create a uniform plasma. The waveguide launcher may comprise a rectangular waveguide, a circular waveguide, or a coaxial waveguide with permanent magnet material forming the sidewalls of the guide and a magnetization pattern which produces the required microwave electron cyclotron resonance magnetic field, a uniform field absorption pattern, and a rapid decay of the fields away from the resonance zone. In addition, the incorporation of permanent magnet material as a portion of the waveguide structure places the magnetic material in close proximity to the vacuum chamber, allowing for a precisely controlled magnetic field configuration, and a reduction of the amount of permanent magnet material required.
Automated and unsupervised detection of malarial parasites in microscopic images.
Purwar, Yashasvi; Shah, Sirish L; Clarke, Gwen; Almugairi, Areej; Muehlenbachs, Atis
2011-12-13
Malaria is a serious infectious disease. According to the World Health Organization, it is responsible for nearly one million deaths each year. There are various techniques to diagnose malaria of which manual microscopy is considered to be the gold standard. However due to the number of steps required in manual assessment, this diagnostic method is time consuming (leading to late diagnosis) and prone to human error (leading to erroneous diagnosis), even in experienced hands. The focus of this study is to develop a robust, unsupervised and sensitive malaria screening technique with low material cost and one that has an advantage over other techniques in that it minimizes human reliance and is, therefore, more consistent in applying diagnostic criteria. A method based on digital image processing of Giemsa-stained thin smear image is developed to facilitate the diagnostic process. The diagnosis procedure is divided into two parts; enumeration and identification. The image-based method presented here is designed to automate the process of enumeration and identification; with the main advantage being its ability to carry out the diagnosis in an unsupervised manner and yet have high sensitivity and thus reducing cases of false negatives. The image based method is tested over more than 500 images from two independent laboratories. The aim is to distinguish between positive and negative cases of malaria using thin smear blood slide images. Due to the unsupervised nature of method it requires minimal human intervention thus speeding up the whole process of diagnosis. Overall sensitivity to capture cases of malaria is 100% and specificity ranges from 50-88% for all species of malaria parasites. Image based screening method will speed up the whole process of diagnosis and is more advantageous over laboratory procedures that are prone to errors and where pathological expertise is minimal. Further this method provides a consistent and robust way of generating the parasite clearance curves.
Generalization of the photo process window and its application to OPC test pattern design
NASA Astrophysics Data System (ADS)
Eisenmann, Hans; Peter, Kai; Strojwas, Andrzej J.
2003-07-01
From the early development phase up to the production phase, test pattern play a key role for microlithography. The requirement for test pattern is to represent the design well and to cover the space of all process conditions, e.g. to investigate the full process window and all other process parameters. This paper shows that the current state-of-the-art test pattern do not address these requirements sufficiently and makes suggestions for a better selection of test pattern. We present a new methodology to analyze an existing layout (e.g. logic library, test pattern or full chip) for critical layout situations which does not need precise process data. We call this method "process space decomposition", because it is aimed at decomposing the process impact to a layout feature into a sum of single independent contributions, the dimensions of the process space. This is a generalization of the classical process window, which examines defocus and exposure dependency of given test pattern, e.g. CD value of dense and isolated lines. In our process space we additionally define the dimensions resist effects, etch effects, mask error and misalignment, which describe the deviation of the printed silicon pattern from its target. We further extend it by the pattern space using a product based layout (library, full chip or synthetic test pattern). The criticality of pattern is defined by their deviation due to aerial image, their sensitivity to the respective dimension or several combinations of these. By exploring the process space for a given design, the method allows to find the most critical patterns independent of specific process parameters. The paper provides examples for different applications of the method: (1) selection of design oriented test pattern for lithography development (2) test pattern reduction in process characterization (3) verification/optimization of printability and performance of post processing procedures (like OPC) (4) creation of a sensitive process monitor.
Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2
NASA Technical Reports Server (NTRS)
Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)
2000-01-01
A new MDO method, BLISS, and two different variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a modular system optimization system into several subtasks optimization, that may be executed concurrently, and the system optimization that coordinates the subtasks optimization. The BLISS method and its variants are well suited for exploiting the concurrent processing capabilities in a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, response surfaces construction and updates are all ideally suited for concurrent processing. Needless to mention, such algorithms that can effectively exploit the concurrent processing capabilities of the compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.
Comparison of Direct Solar Energy to Resistance Heating for Carbothermal Reduction of Regolith
NASA Technical Reports Server (NTRS)
Muscatello, Anthony C.; Gustafson, Robert J.
2011-01-01
A comparison of two methods of delivering thermal energy to regolith for the carbo thermal reduction process has been performed. The comparison concludes that electrical resistance heating is superior to direct solar energy via solar concentrators for the following reasons: (1) the resistance heating method can process approximately 12 times as much regolith using the same amount of thermal energy as the direct solar energy method because of superior thermal insulation; (2) the resistance heating method is more adaptable to nearer-term robotic exploration precursor missions because it does not require a solar concentrator system; (3) crucible-based methods are more easily adapted to separation of iron metal and glass by-products than direct solar energy because the melt can be poured directly after processing instead of being remelted; and (4) even with projected improvements in the mass of solar concentrators, projected photovoltaic system masses are expected to be even lower.
Phillips, Jeffrey D.
2002-01-01
In 1997, the U.S. Geological Survey (USGS) contracted with Sial Geosciences Inc. for a detailed aeromagnetic survey of the Santa Cruz basin and Patagonia Mountains area of south-central Arizona. The contractor's Operational Report is included as an Appendix in this report. This section describes the data processing performed by the USGS on the digital aeromagnetic data received from the contractor. This processing was required in order to remove flight line noise, estimate the depths to the magnetic sources, and estimate the locations of the magnetic contacts. Three methods were used for estimating source depths and contact locations: the horizontal gradient method, the analytic signal method, and the local wavenumber method. The depth estimates resulting from each method are compared, and the contact locations are combined into an interpretative map showing the dip direction for some contacts.
Numerical techniques for high-throughput reflectance interference biosensing
NASA Astrophysics Data System (ADS)
Sevenler, Derin; Ünlü, M. Selim
2016-06-01
We have developed a robust and rapid computational method for processing the raw spectral data collected from thin film optical interference biosensors. We have applied this method to Interference Reflectance Imaging Sensor (IRIS) measurements and observed a 10,000 fold improvement in processing time, unlocking a variety of clinical and scientific applications. Interference biosensors have advantages over similar technologies in certain applications, for example highly multiplexed measurements of molecular kinetics. However, processing raw IRIS data into useful measurements has been prohibitively time consuming for high-throughput studies. Here we describe the implementation of a lookup table (LUT) technique that provides accurate results in far less time than naive methods. We also discuss an additional benefit that the LUT method can be used with a wider range of interference layer thickness and experimental configurations that are incompatible with methods that require fitting the spectral response.
Engineering Change Management Method Framework in Mechanical Engineering
NASA Astrophysics Data System (ADS)
Stekolschik, Alexander
2016-11-01
Engineering changes make an impact on different process chains in and outside the company, and lead to most error costs and time shifts. In fact, 30 to 50 per cent of development costs result from technical changes. Controlling engineering change processes can help us to avoid errors and risks, and contribute to cost optimization and a shorter time to market. This paper presents a method framework for controlling engineering changes at mechanical engineering companies. The developed classification of engineering changes and accordingly process requirements build the basis for the method framework. The developed method framework comprises two main areas: special data objects managed in different engineering IT tools and process framework. Objects from both areas are building blocks that can be selected to the overall business process based on the engineering process type and change classification. The process framework contains steps for the creation of change objects (both for overall change and for parts), change implementation, and release. Companies can select singleprocess building blocks from the framework, depending on the product development process and change impact. The developed change framework has been implemented at a division (10,000 employees) of a big German mechanical engineering company.
Joining precipitation-hardened nickel-base alloys by friction welding
NASA Technical Reports Server (NTRS)
Moore, T. J.
1972-01-01
Solid state deformation welding process, friction welding, has been developed for joining precipitation hardened nickel-base alloys and other gamma prime-strengthened materials which heretofore have been virtually unweldable. Method requires rotation of one of the parts to be welded, but where applicable, it is an ideal process for high volume production jobs.
Quality Space and Launch Requirements, Addendum to AS9100C
2015-05-08
45 8.9.1 Statistical Process Control (SPC) .......................................................................... 45...SMC Space and Missile Systems Center SME Subject Matter Expert SOW Statement of Work SPC Statistical Process Control SPO System Program Office SRP...occur without any individual data exceeding the control limits. Control limits are developed using standard statistical methods or other approved
Grading Homework to Emphasize Problem-Solving Process Skills
ERIC Educational Resources Information Center
Harper, Kathleen A.
2012-01-01
This article describes a grading approach that encourages students to employ particular problem-solving skills. Some strengths of this method, called "process-based grading," are that it is easy to implement, requires minimal time to grade, and can be used in conjunction with either an online homework delivery system or paper-based homework.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-17
...-length AUDIT PROCESS transmission contracts, See Note. production and operating agreements and related... production method. 1206.458(c)(1)(iv) and (c)(2)(vi).. Submit arm's-length washing AUDIT PROCESS contracts...). The Secretary is required by various laws to manage mineral resource production from Federal and...
Process and formulation effects on solar thermal drum dried prune pomace
USDA-ARS?s Scientific Manuscript database
The processing of dried plums into prune juice and concentrate yields prune pomace as a coproduct; the pomace could potentially be utilized as a food ingredient but requires stabilization for long-term storage. Drum drying is one method that could be used to dry and stabilize prune pomace, and a dru...
Teacher-Led Design of an Adaptive Learning Environment
ERIC Educational Resources Information Center
Mavroudi, Anna; Hadzilacos, Thanasis; Kalles, Dimitris; Gregoriades, Andreas
2016-01-01
This paper discusses a requirements engineering process that exemplifies teacher-led design in the case of an envisioned system for adaptive learning. Such a design poses various challenges and still remains an open research issue in the field of adaptive learning. Starting from a scenario-based elicitation method, the whole process was highly…
Guiding Students through the Jungle of Research-Based Literature
ERIC Educational Resources Information Center
Williams, Sherie
2005-01-01
Undergraduate students of today often lack the ability to effectively process research-based literature. In order to offer education students the most up-to-date methods, research-based literature must be considered. Hence a dilemma is born as to whether professors should discontinue requiring the processing of this type of information or teach…
NASA Technical Reports Server (NTRS)
Reinhart, L. E.
2001-01-01
This paper provides an overview of the U.S. space nuclear power system launch approval process as defined by the two separate requirements of the National Environmental Policy Act (NEPA) and Presidential Directive/National Security Council Memorandum No. 25 (PD/NSC-25).
An Adaptive Kalman Filter Using a Simple Residual Tuning Method
NASA Technical Reports Server (NTRS)
Harman, Richard R.
1999-01-01
One difficulty in using Kalman filters in real world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate these parameters. Most of those methods such as maximum likelihood, subspace, and observer Kalman Identification require extensive offline processing and are not suitable for real time processing. One technique, which is suitable for real time processing, is the residual tuning method. Any mismodeling of the filter tuning parameters will result in a non-white sequence for the filter measurement residuals. The residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation results in a set of sequential equations that run in parallel with the Kalman filter. A. H. Jazwinski developed a specialized version of this technique for estimation of process noise. Equations for the estimation of the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
NASA Astrophysics Data System (ADS)
Hatarik, Robert; Caggiano, J. A.; Callahan, D.; Casey, D.; Clark, D.; Doeppner, T.; Eckart, M.; Field, J.; Frenje, J.; Gatu Johnson, M.; Grim, G.; Hartouni, E.; Hurricane, O.; Kilkenny, J.; Knauer, J.; Ma, T.; Mannion, O.; Munro, D.; Sayre, D.; Spears, B.
2015-11-01
The method of moments was introduced by Pearson as a process for estimating the population distributions from which a set of ``random variables'' are measured. These moments are compared with a parameterization of the distributions, or of the same quantities generated by simulations of the process. Most diagnostics processes extract scalar parameters depending on the moments of spectra derived from analytic solutions to the fusion rate, necessarily based on simplifying assumptions of the confined plasma. The precision of the TOF spectra, and the nature of the implosions at the NIF require the inclusion of factors beyond the traditional analysis and require the addition of higher order moments to describe the data. This talk will present a diagnostic process for extracting the moments of the neutron energy spectrum for a comparison with theoretical considerations as well as simulations of the implosions. Work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344.
Processes involved in the development of latent fingerprints using the cyanoacrylate fuming method.
Lewis, L A; Smithwick, R W; Devault, G L; Bolinger, B; Lewis, S A
2001-03-01
Chemical processes involved in the development of latent fingerprints using the cyanoacrylate fuming method have been studied. Two major types of latent prints have been investigated-clean and oily prints. Scanning electron microscopy (SEM) has been used as a tool for determining the morphology of the polymer developed separately on clean and oily prints after cyanoacrylate fuming. A correlation between the chemical composition of an aged latent fingerprint, prior to development, and the quality of a developed fingerprint has been observed in the morphology. The moisture in the print prior to fuming has been found to be more important than the moisture in the air during fuming for the development of a useful latent print. In addition, the amount of time required to develop a high quality latent print has been found to be within 2 min. The cyanoacrylate polymerization process is extremely rapid. When heat is used to accelerate the fuming process, typically a period of 2 min is required to develop the print. The optimum development time depends upon the concentration of cyanoacrylate vapors within the enclosure.
The Use of Interactive Methods in the Educational Process of the Higher Education Institution
ERIC Educational Resources Information Center
Kutbiddinova, Rimma A.; Eromasova, Aleksandr? A.; Romanova, Marina A.
2016-01-01
The modernization of higher education and the transition to the new Federal Education Standards require a higher quality training of the graduates. The training of highly qualified specialists must meet strict requirements: a high level of professional competence, the developed communication skills, the ability to predict the results of one's own…
Survey of existing performance requirements in codes and standards for light-frame construction
G. E. Sherwood
1980-01-01
Present building codes and standards are a combination of specifications and performance criteria. Where specifications prevail, the introduction f new materials or methods can be a long, cumbersome process. To facilitate the introduction of new technology, performance requirements are becoming more prevalent. In some areas, there is a lack of information on which to...
40 CFR Table 4 to Subpart Dddd of... - Requirements for Performance Tests
Code of Federal Regulations, 2010 CFR
2010-07-01
... THC compliance option measure emissions of total HAP as THC Method 25A in appendix A to 40 CFR part 60... the methane emissions from the emissions of total HAP as THC. (6) each process unit subject to a... § 63.2240(c) establish the site-specific operating requirements (including the parameter limits or THC...
40 CFR Table 4 to Subpart Dddd of... - Requirements for Performance Tests
Code of Federal Regulations, 2011 CFR
2011-07-01
... THC compliance option measure emissions of total HAP as THC Method 25A in appendix A to 40 CFR part 60... the methane emissions from the emissions of total HAP as THC. (6) each process unit subject to a... § 63.2240(c) establish the site-specific operating requirements (including the parameter limits or THC...
Singlet oxygen detection in biological systems: Uses and limitations.
Koh, Eugene; Fluhr, Robert
2016-07-02
The study of singlet oxygen in biological systems is challenging in many ways. Singlet oxygen is a relatively unstable ephemeral molecule, and its properties make it highly reactive with many biomolecules, making it difficult to quantify accurately. Several methods have been developed to study this elusive molecule, but most studies thus far have focused on those conditions that produce relatively large amounts of singlet oxygen. However, the need for more sensitive methods is required as one begins to explore the levels of singlet oxygen required in signaling and regulatory processes. Here we discuss the various methods used in the study of singlet oxygen, and outline their uses and limitations.
Exploiting the User: Adapting Personas for Use in Security Visualization Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoll, Jennifer C.; McColgin, David W.; Gregory, Michelle L.
It has long been noted that visual representations of complex information can facilitate rapid understanding of data {citation], even with respect to ComSec applications {citation]. Recognizing that visualizations can increase usability in ComSec applications, [Zurko, Sasse] have argued that there is a need to create more usable security visualizations. (VisSec) However, usability of applications generally fall into the domain of Human Computer Interaction (HCI), which generally relies on heavy-weight user-centered design (UCD) processes. For example, the UCD process can involve many prototype iterations, or an ethnographic field study that can take months to complete. The problem is that VisSec projectsmore » generally do not have the resources to perform ethnographic field studies, or to employ complex UCD methods. They often are running on tight deadlines and budgets that can not afford standard UCD methods. In order to help resolve the conflict of needing more usable designs in ComSec, but not having the resources to employ complex UCD methods, in this paper we offer a stripped-down lighter weight version of a UCD process which can help with capturing user requirements. The approach we use is personas which a user requirements capturing method arising out of the Participatory Design philosophy [Grudin02].« less
Plasma process control with optical emission spectroscopy
NASA Astrophysics Data System (ADS)
Ward, P. P.
Plasma processes for cleaning, etching and desmear of electronic components and printed wiring boards (PWB) are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight loss coupons or post-plasma destructive testing must be used. The problem with these techniques is that they are not real-time methods and do not allow for immediate diagnosis and process correction. These methods often require scrapping some fraction of a batch to insure the integrity of the rest. Since these methods verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. Both of these methods are a costly part of the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process failures should be detected before the parts being treated. are damaged. Real time monitoring would allow for instantaneous corrections. Multiple site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control will be explored. A discussion of this technique as it applies towards process control, failure analysis and endpoint determination will be conducted. Methods for identifying process failures, progress and end of etch back and desmear processes will be discussed.
Risk management for moisture related effects in dry manufacturing processes: a statistical approach.
Quiroz, Jorge; Strong, John; Zhang, Lanju
2016-03-01
A risk- and science-based approach to control the quality in pharmaceutical manufacturing includes a full understanding of how product attributes and process parameters relate to product performance through a proactive approach in formulation and process development. For dry manufacturing, where moisture content is not directly manipulated within the process, the variability in moisture of the incoming raw materials can impact both the processability and drug product quality attributes. A statistical approach is developed using individual raw material historical lots as a basis for the calculation of tolerance intervals for drug product moisture content so that risks associated with excursions in moisture content can be mitigated. The proposed method is based on a model-independent approach that uses available data to estimate parameters of interest that describe the population of blend moisture content values and which do not require knowledge of the individual blend moisture content values. Another advantage of the proposed tolerance intervals is that, it does not require the use of tabulated values for tolerance factors. This facilitates the implementation on any spreadsheet program like Microsoft Excel. A computational example is used to demonstrate the proposed method.
Stable image acquisition for mobile image processing applications
NASA Astrophysics Data System (ADS)
Henning, Kai-Fabian; Fritze, Alexander; Gillich, Eugen; Mönks, Uwe; Lohweg, Volker
2015-02-01
Today, mobile devices (smartphones, tablets, etc.) are widespread and of high importance for their users. Their performance as well as versatility increases over time. This leads to the opportunity to use such devices for more specific tasks like image processing in an industrial context. For the analysis of images requirements like image quality (blur, illumination, etc.) as well as a defined relative position of the object to be inspected are crucial. Since mobile devices are handheld and used in constantly changing environments the challenge is to fulfill these requirements. We present an approach to overcome the obstacles and stabilize the image capturing process such that image analysis becomes significantly improved on mobile devices. Therefore, image processing methods are combined with sensor fusion concepts. The approach consists of three main parts. First, pose estimation methods are used to guide a user moving the device to a defined position. Second, the sensors data and the pose information are combined for relative motion estimation. Finally, the image capturing process is automated. It is triggered depending on the alignment of the device and the object as well as the image quality that can be achieved under consideration of motion and environmental effects.
Proposed algorithm to improve job shop production scheduling using ant colony optimization method
NASA Astrophysics Data System (ADS)
Pakpahan, Eka KA; Kristina, Sonna; Setiawan, Ari
2017-12-01
This paper deals with the determination of job shop production schedule on an automatic environment. On this particular environment, machines and material handling system are integrated and controlled by a computer center where schedule were created and then used to dictate the movement of parts and the operations at each machine. This setting is usually designed to have an unmanned production process for a specified interval time. We consider here parts with various operations requirement. Each operation requires specific cutting tools. These parts are to be scheduled on machines each having identical capability, meaning that each machine is equipped with a similar set of cutting tools therefore is capable of processing any operation. The availability of a particular machine to process a particular operation is determined by the remaining life time of its cutting tools. We proposed an algorithm based on the ant colony optimization method and embedded them on matlab software to generate production schedule which minimize the total processing time of the parts (makespan). We test the algorithm on data provided by real industry and the process shows a very short computation time. This contributes a lot to the flexibility and timelines targeted on an automatic environment.
Conceptual design of distillation-based hybrid separation processes.
Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang
2013-01-01
Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.
Hardware architecture design of a fast global motion estimation method
NASA Astrophysics Data System (ADS)
Liang, Chaobing; Sang, Hongshi; Shen, Xubang
2015-12-01
VLSI implementation of gradient-based global motion estimation (GME) faces two main challenges: irregular data access and high off-chip memory bandwidth requirement. We previously proposed a fast GME method that reduces computational complexity by choosing certain number of small patches containing corners and using them in a gradient-based framework. A hardware architecture is designed to implement this method and further reduce off-chip memory bandwidth requirement. On-chip memories are used to store coordinates of the corners and template patches, while the Gaussian pyramids of both the template and reference frame are stored in off-chip SDRAMs. By performing geometric transform only on the coordinates of the center pixel of a 3-by-3 patch in the template image, a 5-by-5 area containing the warped 3-by-3 patch in the reference image is extracted from the SDRAMs by burst read. Patched-based and burst mode data access helps to keep the off-chip memory bandwidth requirement at the minimum. Although patch size varies at different pyramid level, all patches are processed in term of 3x3 patches, so the utilization of the patch-processing circuit reaches 100%. FPGA implementation results show that the design utilizes 24,080 bits on-chip memory and for a sequence with resolution of 352x288 and frequency of 60Hz, the off-chip bandwidth requirement is only 3.96Mbyte/s, compared with 243.84Mbyte/s of the original gradient-based GME method. This design can be used in applications like video codec, video stabilization, and super-resolution, where real-time GME is a necessity and minimum memory bandwidth requirement is appreciated.
Event-driven processing for hardware-efficient neural spike sorting
NASA Astrophysics Data System (ADS)
Liu, Yan; Pereira, João L.; Constandinou, Timothy G.
2018-02-01
Objective. The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide here a new efficient means for hardware implementation that is completely activity dependant. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented in a low power FPGA platform to demonstrate its hardware viability. Main results. It is observed that considerably lower data rates are achievable when using 7 bits or less to represent the signals, whilst maintaining the signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting accuracies can be achieved with comparable or better accuracy than reference methods whilst also requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
Challenges and Approaches to Make Multidisciplinary Team Meetings Interoperable - The KIMBo Project.
Krauss, Oliver; Holzer, Karl; Schuler, Andreas; Egelkraut, Reinhard; Franz, Barbara
2017-01-01
Multidisciplinary team meetings (MDTMs) are already in use for certain areas in healthcare (e.g. treatment of cancer). Due to the lack of common standards and accessibility for the applied IT systems, their potential is not yet completely exploited. Common requirements for MDTMs shall be identified and aggregated into a process definition to be automated by an application architecture utilizing modern standards in electronic healthcare, e.g. HL7 FHIR. To identify requirements, an extensive literature review as well as semi-structured expert interviews were conducted. Results showed, that interoperability and flexibility in terms of the process are key requirements to be addressed. An architecture blueprint as well as an aggregated process definition were derived from the insights gained. To evaluate the feasibility of identified requirements, methods of explorative prototyping in software engineering were used. MDTMs will become an important part of modern and future healthcare but the need for standardization in terms of interoperability is imminent.
Laser Welding Process Monitoring Systems: Advanced Signal Analysis for Quality Assurance
NASA Astrophysics Data System (ADS)
D'Angelo, Giuseppe
Laser material processing today is widely used in industry. Especially laser welding became one of the key-technologies, e. g., for the automotive sector. This is due to the improvement and development of new laser sources and the increasing knowledge gained at countless scientific research projects. Nevertheless, it is still not possible to use the full potential of this technology. Therefore, the introduction and application of quality-assuring systems is required. For a long time, the statement "the best sensor is no sensor" was often heard. Today, a change of paradigm can be observed. On the one hand, ISO 9000 and other by law enforced regulations have led to the understanding that quality monitoring is an essential tool in modern manufacturing and necessary in order to keep production results in deterministic boundaries. On the other hand, rising quality requirements not only set higher and higher requirements for the process technology but also demand qualityassurance measures which ensure the reliable recognition of process faults. As a result, there is a need for reliable online detection and correction of welding faults by means of an in-process monitoring. The chapter describes an advanced signals analysis technique to extract information from signals detected, during the laser welding process, by optical sensors. The technique is based on the method of reassignment which was first applied to the spectrogram by Kodera, Gendrin and de Villedary22,23 and later generalized to any bilinear time-frequency representation by Auger and Flandrin.24 Key to the method is a nonlinear convolution where the value of the convolution is not placed at the center of the convolution kernel but rather reassigned to the center of mass of the function within the kernel. The resulting reassigned representation yields significantly improved components localization. We compare the proposed time-frequency distributions by analyzing signals detected during the laser welding of tailored blanks, demonstrating the advantages of the reassigned representation, giving practical applicability to the proposed method.
Aspheres for high speed cine lenses
NASA Astrophysics Data System (ADS)
Beder, Christian
2005-09-01
To fulfil the requirements of today's high performance cine lenses aspheres are an indispensable part of lens design. Among making them manageable in shape and size, tolerancing aspheres is an essential part of the development process. The traditional method of tolerancing individual aspherical coefficients results in unemployable theoretical figures only. In order to obtain viable parameters that can easily be dealt with in a production line, more enhanced techniques are required. In this presentation, a method of simulating characteristic manufacturing errors and deducing surface deviation and slope error tolerances will be shown.
NASA Astrophysics Data System (ADS)
Wu, Linqin; Xu, Sheng; Jiang, Dezhi
2015-12-01
Industrial wireless networked control system has been widely used, and how to evaluate the performance of the wireless network is of great significance. In this paper, considering the shortcoming of the existing performance evaluation methods, a comprehensive performance evaluation method of networks multi-indexes fuzzy analytic hierarchy process (MFAHP) combined with the fuzzy mathematics and the traditional analytic hierarchy process (AHP) is presented. The method can overcome that the performance evaluation is not comprehensive and subjective. Experiments show that the method can reflect the network performance of real condition. It has direct guiding role on protocol selection, network cabling, and node setting, and can meet the requirements of different occasions by modifying the underlying parameters.
Improving operational anodising process performance using simulation approach
NASA Astrophysics Data System (ADS)
Liong, Choong-Yeun; Ghazali, Syarah Syahidah
2015-10-01
The use of aluminium is very widespread, especially in transportation, electrical and electronics, architectural, automotive and engineering applications sectors. Therefore, the anodizing process is an important process for aluminium in order to make the aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in manufacturing and supplying of aluminium extrusion. The data required for the development of the model is collected from the observations and interviews conducted in the study. To study the current system, the processes involved in the anodizing process are modeled by using Arena 14.5 simulation software. Those processes consist of five main processes, namely the degreasing process, the etching process, the desmut process, the anodizing process, the sealing process and 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons that have been done between the improvement methods, the productivity could be increased by reallocating the workers and reducing loading time.
Advances in multi-scale modeling of solidification and casting processes
NASA Astrophysics Data System (ADS)
Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang
2011-04-01
The development of the aviation, energy and automobile industries requires an advanced integrated product/process R&D systems which could optimize the product and the process design as well. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make the product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications are presented in the paper. Dendrite morphology of magnesium and aluminum alloy of solidification process by using phase field and cellular automaton methods, mathematical models of segregation of large steel ingot, and microstructure models of unidirectionally solidified turbine blade casting are studied and discussed. In addition, some engineering case studies, including microstructure simulation of aluminum casting for automobile industry, segregation of large steel ingot for energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for aviation industry are discussed.
NASA Technical Reports Server (NTRS)
Bredt, J. H.
1974-01-01
Two types of space processing operations may be considered economically justified; they are manufacturing operations that make profits and experiment operations that provide needed applied research results at lower costs than those of alternative methods. Some examples from the Skylab experiments suggest that applied research should become cost effective soon after the space shuttle and Spacelab become operational. In space manufacturing, the total cost of space operations required to process materials must be repaid by the value added to the materials by the processing. Accurate estimates of profitability are not yet possible because shuttle operational costs are not firmly established and the markets for future products are difficult to estimate. However, approximate calculations show that semiconductor products and biological preparations may be processed on a scale consistent with market requirements and at costs that are at least compatible with profitability using the Shuttle/Spacelab system.
Challenges and requirements of mask data processing for multi-beam mask writer
NASA Astrophysics Data System (ADS)
Choi, Jin; Lee, Dong Hyun; Park, Sinjeung; Lee, SookHyun; Tamamushi, Shuichi; Shin, In Kyun; Jeon, Chan Uk
2015-07-01
To overcome the resolution and throughput of current mask writer for advanced lithography technologies, the platform of e-beam writer have been evolved by the developments of hardware and software in writer. Especially, aggressive optical proximity correction (OPC) for unprecedented extension of optical lithography and the needs of low sensitivity resist for high resolution result in the limit of variable shaped beam writer which is widely used for mass production. The multi-beam mask writer is attractive candidate for photomask writing of sub-10nm device because of its high speed and the large degree of freedom which enable high dose and dose modulation for each pixel. However, the higher dose and almost unlimited appetite for dose modulation challenge the mask data processing (MDP) in aspects of extreme data volume and correction method. Here, we discuss the requirements of mask data processing for multi-beam mask writer and presents new challenges of the data format, data flow, and correction method for user and supplier MDP tool.
Nebert, D.D.
1989-01-01
In the process of developing a continuous hydrographic data layer for water resources applications in the Pacific Northwest, map-edge discontinuities in the U.S. Geological Survey 1:100 ,000-scale digital data that required application of computer-assisted edgematching procedures were identified. The spatial data sets required by the project must have line features that match closely enough across map boundaries to ensure full line topology when adjacent files are joined by the computer. Automated edgematching techniques are evaluated as to their effects on positional accuracy. Interactive methods such as selective node-matching and on-screen editing are also reviewed. Interactive procedures complement automated methods by allowing supervision of edgematching in a cartographic and hydrologic context. Common edge conditions encountered in the preparation of the Northwest Rivers data base are described, as are recommended processing solutions. Suggested edgematching procedures for 1:100,000-scale hydrography data are included in an appendix to encourage consistent processing of this theme on a national scale. (USGS)
Flood inundation extent mapping based on block compressed tracing
NASA Astrophysics Data System (ADS)
Shen, Dingtao; Rui, Yikang; Wang, Jiechen; Zhang, Yu; Cheng, Liang
2015-07-01
Flood inundation extent, depth, and duration are important factors affecting flood hazard evaluation. At present, flood inundation analysis is based mainly on a seeded region-growing algorithm, which is an inefficient process because it requires excessive recursive computations and it is incapable of processing massive datasets. To address this problem, we propose a block compressed tracing algorithm for mapping the flood inundation extent, which reads the DEM data in blocks before transferring them to raster compression storage. This allows a smaller computer memory to process a larger amount of data, which solves the problem of the regular seeded region-growing algorithm. In addition, the use of a raster boundary tracing technique allows the algorithm to avoid the time-consuming computations required by the seeded region-growing. Finally, we conduct a comparative evaluation in the Chin-sha River basin, results show that the proposed method solves the problem of flood inundation extent mapping based on massive DEM datasets with higher computational efficiency than the original method, which makes it suitable for practical applications.
Emerging methods for the study of coastal ecosystem landscape structure and change
Brock, John C.; Danielson, Jeffrey J.; Purkis, Sam
2013-01-01
Coastal landscapes are heterogeneous, dynamic, and evolve over a range of time scales due to intertwined climatic, geologic, hydrologic, biologic, and meteorological processes, and are also heavily impacted by human development, commercial activities, and resource extraction. A diversity of complex coastal systems around the globe, spanning glaciated shorelines to tropical atolls, wetlands, and barrier islands are responding to multiple human and natural drivers. Interdisciplinary research based on remote-sensing observations linked to process studies and models is required to understand coastal ecosystem landscape structure and change. Moreover, new techniques for coastal mapping and monitoring are increasingly serving the needs of policy-makers and resource managers across local, regional, and national scales. Emerging remote-sensing methods associated with a diversity of instruments and platforms are a key enabling element of integrated coastal ecosystem studies. These investigations require both targeted and synoptic mapping, and involve the monitoring of formative processes such as hydrodynamics, sediment transport, erosion, accretion, flooding, habitat modification, land-cover change, and biogeochemical fluxes.
Noise suppression methods for robust speech processing
NASA Astrophysics Data System (ADS)
Boll, S. F.; Ravindra, H.; Randall, G.; Armantrout, R.; Power, R.
1980-05-01
Robust speech processing in practical operating environments requires effective environmental and processor noise suppression. This report describes the technical findings and accomplishments during this reporting period for the research program funded to develop real time, compressed speech analysis synthesis algorithms whose performance in invariant under signal contamination. Fulfillment of this requirement is necessary to insure reliable secure compressed speech transmission within realistic military command and control environments. Overall contributions resulting from this research program include the understanding of how environmental noise degrades narrow band, coded speech, development of appropriate real time noise suppression algorithms, and development of speech parameter identification methods that consider signal contamination as a fundamental element in the estimation process. This report describes the current research and results in the areas of noise suppression using the dual input adaptive noise cancellation using the short time Fourier transform algorithms, articulation rate change techniques, and a description of an experiment which demonstrated that the spectral subtraction noise suppression algorithm can improve the intelligibility of 2400 bps, LPC 10 coded, helicopter speech by 10.6 point.
Christensen, Tom; Grimsmo, Anders
2005-01-01
User participation is important for developing a functional requirements specification for electronic communication. General practitioners and practising specialists, however, often work in small practices without the resources to develop and present their requirements. It was necessary to find a method that could engage practising doctors in order to promote their needs related to electronic communication. Qualitative research methods were used, starting a process to develop and study documents and collect data from meetings in project groups. Triangulation was used, in that the participants were organised into a panel of experts, a user group, a supplier group and an editorial committee. The panel of experts created a list of functional requirements for electronic communication in health care, consisting of 197 requirements, in addition to 67 requirements selected from an existing Norwegian standard for electronic patient records (EPRs). Elimination of paper copies sent in parallel with electronic messages, optimal workflow, a common electronic 'envelope' with directory services for units and end-users, and defined requirements for content with the possibility of decision support were the most important requirements. The results indicate that we have found a method of developing functional requirements which provides valid results both for practising doctors and for suppliers of EPR systems.
Feenstra, Roeland; Christen, David; Paranthaman, Mariappan
1999-01-01
A method is disclosed for fabricating YBa.sub.2 Cu.sub.3 O.sub.7 superconductor layers with the capability of carrying large superconducting currents on a metallic tape (substrate) supplied with a biaxially textured oxide buffer layer. The method represents a simplification of previously established techniques and provides processing requirements compatible with scale-up to long wire (tape) lengths and high processing speeds. This simplification has been realized by employing the BaF.sub.2 method to grow a YBa.sub.2 Cu.sub.3 O.sub.7 film on a metallic substrate having a biaxially textured oxide buffer layer.
Method for silicon carbide production by reacting silica with hydrocarbon gas
Glatzmaier, G.C.
1994-06-28
A method is described for producing silicon carbide particles using a silicon source material and a hydrocarbon. The method is efficient and is characterized by high yield. Finely divided silicon source material is contacted with hydrocarbon at a temperature of 400 C to 1000 C where the hydrocarbon pyrolyzes and coats the particles with carbon. The particles are then heated to 1100 C to 1600 C to cause a reaction between the ingredients to form silicon carbide of very small particle size. No grinding of silicon carbide is required to obtain small particles. The method may be carried out as a batch process or as a continuous process. 5 figures.
Method for silicon carbide production by reacting silica with hydrocarbon gas
Glatzmaier, Gregory C.
1994-01-01
A method is described for producing silicon carbide particles using a silicon source material and a hydrocarbon. The method is efficient and is characterized by high yield. Finely divided silicon source material is contacted with hydrocarbon at a temperature of 400.degree. C. to 1000.degree. C. where the hydrocarbon pyrolyzes and coats the particles with carbon. The particles are then heated to 1100.degree. C. to 1600.degree. C. to cause a reaction between the ingredients to form silicon carbide of very small particle size. No grinding of silicon carbide is required to obtain small particles. The method may be carried out as a batch process or as a continuous process.
Visualizing the deep end of sound: plotting multi-parameter results from infrasound data analysis
NASA Astrophysics Data System (ADS)
Perttu, A. B.; Taisne, B.
2016-12-01
Infrasound is sound below the threshold of human hearing: approximately 20 Hz. The field of infrasound research, like other waveform based fields relies on several standard processing methods and data visualizations, including waveform plots and spectrograms. The installation of the International Monitoring System (IMS) global network of infrasound arrays, contributed to the resurgence of infrasound research. Array processing is an important method used in infrasound research, however, this method produces data sets with a large number of parameters, and requires innovative plotting techniques. The goal in designing new figures is to be able to present easily comprehendible, and information-rich plots by careful selection of data density and plotting methods.
An automated and universal method for measuring mean grain size from a digital image of sediment
Buscombe, Daniel D.; Rubin, David M.; Warrick, Jonathan A.
2010-01-01
Existing methods for estimating mean grain size of sediment in an image require either complicated sequences of image processing (filtering, edge detection, segmentation, etc.) or statistical procedures involving calibration. We present a new approach which uses Fourier methods to calculate grain size directly from the image without requiring calibration. Based on analysis of over 450 images, we found the accuracy to be within approximately 16% across the full range from silt to pebbles. Accuracy is comparable to, or better than, existing digital methods. The new method, in conjunction with recent advances in technology for taking appropriate images of sediment in a range of natural environments, promises to revolutionize the logistics and speed at which grain-size data may be obtained from the field.
Disk space and load time requirements for eye movement biometric databases
NASA Astrophysics Data System (ADS)
Kasprowski, Pawel; Harezlak, Katarzyna
2016-06-01
Biometric identification is a very popular area of interest nowadays. Problems with the so-called physiological methods like fingerprints or iris recognition resulted in increased attention paid to methods measuring behavioral patterns. Eye movement based biometric (EMB) identification is one of the interesting behavioral methods and due to the intensive development of eye tracking devices it has become possible to define new methods for the eye movement signal processing. Such method should be supported by an efficient storage used to collect eye movement data and provide it for further analysis. The aim of the research was to check various setups enabling such a storage choice. There were various aspects taken into consideration, like disk space usage, time required for loading and saving whole data set or its chosen parts.
NASA Technical Reports Server (NTRS)
Elrod, B. D.; Jacobsen, A.; Cook, R. A.; Singh, R. N. P.
1983-01-01
One-way range and Doppler methods for providing user orbit and time determination are examined. Forward link beacon tracking, with on-board processing of independent navigation signals broadcast continuously by TDAS spacecraft; forward link scheduled tracking; with on-board processing of navigation data received during scheduled TDAS forward link service intervals; and return link scheduled tracking; with ground-based processing of user generated navigation data during scheduled TDAS return link service intervals are discussed. A system level definition and requirements assessment for each alternative, an evaluation of potential navigation performance and comparison with TDAS mission model requirements is included. TDAS satellite tracking is also addressed for two alternatives: BRTS and VLBI tracking.
A perspective on the Human-Rating process of US spacecraft: Both past and present
NASA Astrophysics Data System (ADS)
Zupp, George
1995-04-01
The purpose of this report is to characterize the process of Human-Rating as employed by NASA for human spaceflight. An Agency-wide committee was formed in November 1992 to develop a Human-Rating Requirements Definition for Launch Vehicles based on conventional (historical) methods. The committee members were from NASA Headquarters, Marshall Space Flight Center, Kennedy Space Center, Stennis Space Center, and Johnson Space Center. After considerable discussion and analysis, committee members concluded that Human-Rating is the process of satisfying the mutual constraints of cost, schedule, mission performance, and risk while addressing the requirements for human safety, human performance, and human health management and care.
A perspective on the Human-Rating process of US spacecraft: Both past and present
NASA Technical Reports Server (NTRS)
Zupp, George (Editor)
1995-01-01
The purpose of this report is to characterize the process of Human-Rating as employed by NASA for human spaceflight. An Agency-wide committee was formed in November 1992 to develop a Human-Rating Requirements Definition for Launch Vehicles based on conventional (historical) methods. The committee members were from NASA Headquarters, Marshall Space Flight Center, Kennedy Space Center, Stennis Space Center, and Johnson Space Center. After considerable discussion and analysis, committee members concluded that Human-Rating is the process of satisfying the mutual constraints of cost, schedule, mission performance, and risk while addressing the requirements for human safety, human performance, and human health management and care.
A review of methods for assessment of the rate of gastric emptying in the dog and cat: 1898-2002.
Wyse, C A; McLellan, J; Dickie, A M; Sutton, D G M; Preston, T; Yam, P S
2003-01-01
Gastric emptying is the process by which food is delivered to the small intestine at a rate and in a form that optimizes intestinal absorption of nutrients. The rate of gastric emptying is subject to alteration by physiological, pharmacological, and pathological conditions. Gastric emptying of solids is of greater clinical significance because disordered gastric emptying rarely is detectable in the liquid phase. Imaging techniques have the disadvantage of requiring restraint of the animal and access to expensive equipment. Radiographic methods require administration of test meals that are not similar to food. Scintigraphy is the gold standard method for assessment of gastric emptying but requires administration of a radioisotope. Magnetic resonance imaging has not yet been applied for assessment of gastric emptying in small animals. Ultrasonography is a potentially useful, but subjective, method for assessment of gastric emptying in dogs. Gastric tracer methods require insertion of gastric or intestinal cannulae and are rarely applied outside of the research laboratory. The paracetamol absorption test has been applied for assessment of liquid phase gastric emptying in the dog, but requires IV cannulation. The gastric emptying breath test is a noninvasive method for assessment of gastric emptying that has been applied in dogs and cats. This method can be carried out away from the veterinary hospital, but the effects of physiological and pathological abnormalities on the test are not known. Advances in technology will facilitate the development of reliable methods for assessment of gastric emptying in small animals.
Cognitive Load in Algebra: Element Interactivity in Solving Equations
ERIC Educational Resources Information Center
Ngu, Bing Hiong; Chung, Siu Fung; Yeung, Alexander Seeshing
2015-01-01
Central to equation solving is the maintenance of equivalence on both sides of the equation. However, when the process involves an interaction of multiple elements, solving an equation can impose a high cognitive load. The balance method requires operations on both sides of the equation, whereas the inverse method involves operations on one side…
ERIC Educational Resources Information Center
Molina, Paola; Marotta, Monica; Bulgarelli, Daniela
2016-01-01
Ability to reflect on practice is a key element of early childhood professionalism and is positively associated with the quality of educational services. "Observation-Projet" (Fontaine 2008, 2011b) is a method designed to support practitioners' reflection through the observational process. The method adapts the required scientific…
Calculation of the Poisson cumulative distribution function
NASA Technical Reports Server (NTRS)
Bowerman, Paul N.; Nolty, Robert G.; Scheuer, Ernest M.
1990-01-01
A method for calculating the Poisson cdf (cumulative distribution function) is presented. The method avoids computer underflow and overflow during the process. The computer program uses this technique to calculate the Poisson cdf for arbitrary inputs. An algorithm that determines the Poisson parameter required to yield a specified value of the cdf is presented.
ERIC Educational Resources Information Center
Travers, Steven T.
2017-01-01
Many developmental mathematics programs at community colleges in recent years have undergone a process of redesign in an attempt increase the historical poor rate of student successful completion of required developmental coursework. Various curriculum and instructional design models that incorporate methods of avoiding and accelerating the…
The Asbestos NESHAP (National Emission Standard for Hazardous Air Pollutants) requires the removal of all Regulated Asbestos-Containing Material (RACM) prior to the demolition of the buildings that fall under the auspices of the NESHAP. This removal process can be a costly and ti...
Disinfection of Cystoscopes by Subatmospheric Steam and Steam and Formaldehyde at 80°C
Alder, V. G.; Gingell, J. C.; Mitchell, J. P.
1971-01-01
A new method of disinfection adapted for endoscopic instruments uses low temperature steam at 80°C or steam and formaldehyde at 80°C. The process has considerable advantages over existing methods and more closely approaches the ideal requirements. ImagesFIG. 3FIG. 4FIG. 5 PMID:5569551
Wang, Liu; Wang, Rui; Yu, Yonghua; Zhang, Fang; Wang, Xiaofu; Ying, Yibin; Wu, Jian; Xu, Junfeng
2016-01-01
The requirement of power-dependent instruments or excessive operation time usually restricts current nucleic acid amplification methods from being used for detection of transgenic crops in the field. In this paper, an easy and rapid detection method which requires no electricity supply has been developed. The time-consuming process of nucleic acid purification is omitted in this method. DNA solution obtained from leaves with 0.5 M sodium hydroxide (NaOH) can be used for loop-mediated isothermal amplification (LAMP) only after simple dilution. Traditional instruments like a polymerase chain reaction (PCR) amplifier and water bath used for DNA amplification are abandoned. Three kinds of dewar flasks were tested and it turned out that the common dewar flask was the best. Combined with visual detection of LAMP amplicons by phosphate (Pi)-induced coloration reaction, the whole process of detection of transgenic crops via genetically pure material (leaf material of one plant) could be accomplished within 30 min. The feasibility of this method was also verified by analysis of practical samples.
Evaluation of counting methods for oceanic radium-228
NASA Astrophysics Data System (ADS)
Orr, James C.
1988-07-01
Measurement of open ocean 228Ra is difficult, typically requiring at least 200 L of seawater. The burden of collecting and processing these large-volume samples severely limits the widespread use of this promising tracer. To use smaller-volume samples, a more sensitive means of analysis is required. To seek out new and improved counting method(s), conventional 228Ra counting methods have been compared with some promising techniques which are currently used for other radionuclides. Of the conventional methods, α spectrometry possesses the highest efficiency (3-9%) and lowest background (0.0015 cpm), but it suffers from the need for complex chemical processing after sampling and the need to allow about 1 year for adequate ingrowth of 228Th granddaughter. The other two conventional counting methods measure the short-lived 228Ac daughter while it remains supported by 228Ra, thereby avoiding the complex sample processing and the long delay before counting. The first of these, high-resolution γ spectrometry, offers the simplest processing and an efficiency (4.8%) comparable to α spectrometry; yet its high background (0.16 cpm) and substantial equipment cost (˜30,000) limit its widespread use. The second no-wait method, β-γ coincidence spectrometry, also offers comparable efficiency (5.3%), but it possesses both lower background (0.0054 cpm) and lower initial cost (˜12,000). Three new (i.e., untried for 228Ra) techniques all seem to promise about a fivefold increase in efficiency over conventional methods. By employing liquid scintillation methods, both α spectrometry and β-γ coincidence spectrometry can improve their counter efficiency while retaining low background. The third new 228Ra counting method could be adapted from a technique which measures 224Ra by 220Rn emanation. After allowing for ingrowth and then counting for the 224Ra great-granddaughter, 228Ra could be back calculated, thereby yielding a method with high efficiency, where no sample processing is required. The efficiency and background of each of the three new methods have been estimated and are compared with those of the three methods currently employed to measure oceanic 228Ra. From efficiency and background, the relative figure of merit and the detection limit have been determined for each of the six counters. These data suggest that the new counting methods have the potential to measure most 228Ra samples with just 30 L of seawater, to better than 5% precision. Not only would this reduce the time, effort, and expense involved in sample collection, but 228Ra could then be measured on many small-volume samples (20-30 L) previously collected with only 226Ra in mind. By measuring 228Ra quantitatively on such small-volume samples, three analyses (large-volume 228Ra, large-volume 226Ra, and small-volume 226Ra) could be reduced to one, thereby dramatically improving analytical precision.
NASA Technical Reports Server (NTRS)
Henke, Luke
2010-01-01
The ICARE method is a flexible, widely applicable method for systems engineers to solve problems and resolve issues in a complete and comprehensive manner. The method can be tailored by diverse users for direct application to their function (e.g. system integrators, design engineers, technical discipline leads, analysts, etc.). The clever acronym, ICARE, instills the attitude of accountability, safety, technical rigor and engagement in the problem resolution: Identify, Communicate, Assess, Report, Execute (ICARE). This method was developed through observation of Space Shuttle Propulsion Systems Engineering and Integration (PSE&I) office personnel approach in an attempt to succinctly describe the actions of an effective systems engineer. Additionally it evolved from an effort to make a broadly-defined checklist for a PSE&I worker to perform their responsibilities in an iterative and recursive manner. The National Aeronautics and Space Administration (NASA) Systems Engineering Handbook states, engineering of NASA systems requires a systematic and disciplined set of processes that are applied recursively and iteratively for the design, development, operation, maintenance, and closeout of systems throughout the life cycle of the programs and projects. ICARE is a method that can be applied within the boundaries and requirements of NASA s systems engineering set of processes to provide an elevated sense of duty and responsibility to crew and vehicle safety. The importance of a disciplined set of processes and a safety-conscious mindset increases with the complexity of the system. Moreover, the larger the system and the larger the workforce, the more important it is to encourage the usage of the ICARE method as widely as possible. According to the NASA Systems Engineering Handbook, elements of a system can include people, hardware, software, facilities, policies and documents; all things required to produce system-level results, qualities, properties, characteristics, functions, behavior and performance. The ICARE method can be used to improve all elements of a system and, consequently, the system-level functional, physical and operational performance. Even though ICARE was specifically designed for a systems engineer, any person whose job is to examine another person, product, or process can use the ICARE method to improve effectiveness, implementation, usefulness, value, capability, efficiency, integration, design, and/or marketability. This paper provides the details of the ICARE method, emphasizing the method s application to systems engineering. In addition, a sample of other, non-systems engineering applications are briefly discussed to demonstrate how ICARE can be tailored to a variety of diverse jobs (from project management to parenting).
NASA Astrophysics Data System (ADS)
Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt
2017-02-01
To ensure the competitiveness of manufacturing companies it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings have only marginally effects on the product quality. Therefore, the largest possible editing window is required. Such parameters are, for example, the movement of the laser beam across the component for the laser keyhole welding. That`s why it is necessary to keep the formation of welding seams within specified limits. Therefore, the quality of laser welding processes is ensured, by using post-process methods, like ultrasonic inspection, or special in-process methods. These in-process systems only achieve a simple evaluation which shows whether the weld seam is acceptable or not. Furthermore, in-process systems use no feedback for changing the control variables such as speed of the laser or adjustment of laser power. In this paper the research group presents current results of the research field of Online Monitoring, Online Controlling and Model predictive controlling in laser welding processes to increase the product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state space model is ascertained, which includes all the control variables of the system. Depending on simulation tools the model predictive controller (MPC) is designed for the model and integrated into an NI-Real-Time-System.
Overview and development of EDA tools for integration of DSA into patterning solutions
NASA Astrophysics Data System (ADS)
Torres, J. Andres; Fenger, Germain; Khaira, Daman; Ma, Yuansheng; Granik, Yuri; Kapral, Chris; Mitra, Joydeep; Krasnova, Polina; Ait-Ferhat, Dehia
2017-03-01
Directed Self-Assembly is the method by which a self-assembly polymer is forced to follow a desired geometry defined or influenced by a guiding pattern. Such guiding pattern uses surface potentials, confinement or both to achieve polymer configurations that result in circuit-relevant topologies, which can be patterned onto a substrate. Chemo, and grapho epitaxy of lines and space structures are now routinely inspected at full wafer level to understand the defectivity limits of the materials and their maximum resolution. In the same manner, there is a deeper understanding about the formation of cylinders using grapho-epitaxy processes. Academia has also contributed by developing methods that help reduce the number of masks in advanced nodes by "combining" DSA-compatible groups, thus reducing the total cost of the process. From the point of view of EDA, new tools are required when a technology is adopted, and most technologies are adopted when they show a clear cost-benefit over alternative techniques. In addition, years of EDA development have led to the creation of very flexible toolkits that permit rapid prototyping and evaluation of new process alternatives. With the development of high-chi materials, and by moving away of the well characterized PS-PMMA systems, as well as novel integrations in the substrates that work in tandem with diblock copolymer systems, it is necessary to assess any new requirements that may or may not need custom tools to support such processes. Hybrid DSA processes (which contain both chemo and grapho elements), are currently being investigated as possible contenders for sub-5nm process techniques. Because such processes permit the re-distribution of discontinuities in the regular arrays between the substrate and a cut operation, they have the potential to extend the number of applications for DSA. This paper illustrates the reason as to why some DSA processes can be supported by existing rules and technology, while other processes require the development of highly customized correction tools and models. It also illustrates how developing DSA cannot be done in isolation, and it requires the full collaboration of EDA, Material's suppliers, Manufacturing equipment, Metrology, and electronic manufacturers.
Study of Thermal Electrical Modified Etching for Glass and Its Application in Structure Etching
Zhan, Zhan; Li, Wei; Yu, Lingke; Wang, Lingyun; Sun, Daoheng
2017-01-01
In this work, an accelerating etching method for glass named thermal electrical modified etching (TEM etching) is investigated. Based on the identification of the effect in anodic bonding, a novel method for glass structure micromachining is proposed using TEM etching. To validate the method, TEM-etched glasses are prepared and their morphology is tested, revealing the feasibility of the new method for micro/nano structure micromachining. Furthermore, two kinds of edge effect in the TEM and etching processes are analyzed. Additionally, a parameter study of TEM etching involving transferred charge, applied pressure, and etching roughness is conducted to evaluate this method. The study shows that TEM etching is a promising manufacture method for glass with low process temperature, three-dimensional self-control ability, and low equipment requirement. PMID:28772521
Off-line real-time FTIR analysis of a process step in imipenem production
NASA Astrophysics Data System (ADS)
Boaz, Jhansi R.; Thomas, Scott M.; Meyerhoffer, Steven M.; Staskiewicz, Steven J.; Lynch, Joseph E.; Egan, Richard S.; Ellison, Dean K.
1992-08-01
We have developed an FT-IR method, using a Spectra-Tech Monit-IR 400 systems, to monitor off-line the completion of a reaction in real-time. The reaction is moisture-sensitive and analysis by more conventional methods (normal-phase HPLC) is difficult to reproduce. The FT-IR method is based on the shift of a diazo band when a conjugated beta-diketone is transformed into a silyl enol ether during the reaction. The reaction mixture is examined directly by IR and does not require sample workup. Data acquisition time is less than one minute. The method has been validated for specificity, precision and accuracy. The results obtained by the FT-IR method for known mixtures and in-process samples compare favorably with those from a normal-phase HPLC method.
The Use of a Microcomputer Based Array Processor for Real Time Laser Velocimeter Data Processing
NASA Technical Reports Server (NTRS)
Meyers, James F.
1990-01-01
The application of an array processor to laser velocimeter data processing is presented. The hardware is described along with the method of parallel programming required by the array processor. A portion of the data processing program is described in detail. The increase in computational speed of a microcomputer equipped with an array processor is illustrated by comparative testing with a minicomputer.
Continuous control of chaos based on the stability criterion.
Yu, Hong Jie; Liu, Yan Zhu; Peng, Jian Hua
2004-06-01
A method of chaos control based on stability criterion is proposed in the present paper. This method can stabilize chaotic systems onto a desired periodic orbit by a small time-continuous perturbation nonlinear feedback. This method does not require linearization of the system around the stabilized orbit and only an approximate location of the desired periodic orbit is required which can be automatically detected in the control process. The control can be started at any moment by choosing appropriate perturbation restriction condition. It seems that more flexibility and convenience are the main advantages of this method. The discussions on control of attitude motion of a spacecraft, Rössler system, and two coupled Duffing oscillators are given as numerical examples.
Simplified signal processing for impedance spectroscopy with spectrally sparse sequences
NASA Astrophysics Data System (ADS)
Annus, P.; Land, R.; Reidla, M.; Ojarand, J.; Mughal, Y.; Min, M.
2013-04-01
Classical method for measurement of the electrical bio-impedance involves excitation with sinusoidal waveform. Sinusoidal excitation at fixed frequency points enables wide variety of signal processing options, most general of them being Fourier transform. Multiplication with two quadrature waveforms at desired frequency could be easily accomplished both in analogue and in digital domains, even simplest quadrature square waves can be considered, which reduces signal processing task in analogue domain to synchronous switching followed by low pass filter, and in digital domain requires only additions. So called spectrally sparse excitation sequences (SSS), which have been recently introduced into bio-impedance measurement domain, are very reasonable choice when simultaneous multifrequency excitation is required. They have many good properties, such as ease of generation and good crest factor compared to similar multisinusoids. Typically, the usage of discrete or fast Fourier transform in signal processing step is considered so far. Usage of simplified methods nevertheless would reduce computational burden, and enable simpler, less costly and less energy hungry signal processing platforms. Accuracy of the measurement with SSS excitation when using different waveforms for quadrature demodulation will be compared in order to evaluate the feasibility of the simplified signal processing. Sigma delta modulated sinusoid (binary signal) is considered to be a good alternative for a synchronous demodulation.
Statistical Post-Processing of Wind Speed Forecasts to Estimate Relative Economic Value
NASA Astrophysics Data System (ADS)
Courtney, Jennifer; Lynch, Peter; Sweeney, Conor
2013-04-01
The objective of this research is to get the best possible wind speed forecasts for the wind energy industry by using an optimal combination of well-established forecasting and post-processing methods. We start with the ECMWF 51 member ensemble prediction system (EPS) which is underdispersive and hence uncalibrated. We aim to produce wind speed forecasts that are more accurate and calibrated than the EPS. The 51 members of the EPS are clustered to 8 weighted representative members (RMs), chosen to minimize the within-cluster spread, while maximizing the inter-cluster spread. The forecasts are then downscaled using two limited area models, WRF and COSMO, at two resolutions, 14km and 3km. This process creates four distinguishable ensembles which are used as input to statistical post-processes requiring multi-model forecasts. Two such processes are presented here. The first, Bayesian Model Averaging, has been proven to provide more calibrated and accurate wind speed forecasts than the ECMWF EPS using this multi-model input data. The second, heteroscedastic censored regression is indicating positive results also. We compare the two post-processing methods, applied to a year of hindcast wind speed data around Ireland, using an array of deterministic and probabilistic verification techniques, such as MAE, CRPS, probability transform integrals and verification rank histograms, to show which method provides the most accurate and calibrated forecasts. However, the value of a forecast to an end-user cannot be fully quantified by just the accuracy and calibration measurements mentioned, as the relationship between skill and value is complex. Capturing the full potential of the forecast benefits also requires detailed knowledge of the end-users' weather sensitive decision-making processes and most importantly the economic impact it will have on their income. Finally, we present the continuous relative economic value of both post-processing methods to identify which is more beneficial to the wind energy industry of Ireland.
Schulze, H Georg; Turner, Robin F B
2013-04-01
Raman spectra often contain undesirable, randomly positioned, intense, narrow-bandwidth, positive, unidirectional spectral features generated when cosmic rays strike charge-coupled device cameras. These must be removed prior to analysis, but doing so manually is not feasible for large data sets. We developed a quick, simple, effective, semi-automated procedure to remove cosmic ray spikes from spectral data sets that contain large numbers of relatively homogenous spectra. Although some inhomogeneous spectral data sets can be accommodated--it requires replacing excessively modified spectra with the originals and removing their spikes with a median filter instead--caution is advised when processing such data sets. In addition, the technique is suitable for interpolating missing spectra or replacing aberrant spectra with good spectral estimates. The method is applied to baseline-flattened spectra and relies on fitting a third-order (or higher) polynomial through all the spectra at every wavenumber. Pixel intensities in excess of a threshold of 3× the noise standard deviation above the fit are reduced to the threshold level. Because only two parameters (with readily specified default values) might require further adjustment, the method is easily implemented for semi-automated processing of large spectral sets.
Loss tolerant speech decoder for telecommunications
NASA Technical Reports Server (NTRS)
Prieto, Jr., Jaime L. (Inventor)
1999-01-01
A method and device for extrapolating past signal-history data for insertion into missing data segments in order to conceal digital speech frame errors. The extrapolation method uses past-signal history that is stored in a buffer. The method is implemented with a device that utilizes a finite-impulse response (FIR) multi-layer feed-forward artificial neural network that is trained by back-propagation for one-step extrapolation of speech compression algorithm (SCA) parameters. Once a speech connection has been established, the speech compression algorithm device begins sending encoded speech frames. As the speech frames are received, they are decoded and converted back into speech signal voltages. During the normal decoding process, pre-processing of the required SCA parameters will occur and the results stored in the past-history buffer. If a speech frame is detected to be lost or in error, then extrapolation modules are executed and replacement SCA parameters are generated and sent as the parameters required by the SCA. In this way, the information transfer to the SCA is transparent, and the SCA processing continues as usual. The listener will not normally notice that a speech frame has been lost because of the smooth transition between the last-received, lost, and next-received speech frames.
Winter, S; Smith, A; Lappin, D; McDonagh, G; Kirk, B
2017-12-01
Dental handpieces are required to be sterilized between patient use. Vacuum steam sterilization processes with fractionated pre/post-vacuum phases or unique cycles for specified medical devices are required for hollow instruments with internal lumens to assure successful air removal. Entrapped air will compromise achievement of required sterilization conditions. Many countries and professional organizations still advocate non-vacuum sterilization processes for these devices. To investigate non-vacuum downward/gravity displacement, type-N steam sterilization of dental handpieces, using thermometric methods to measure time to achieve sterilization temperature at different handpiece locations. Measurements at different positions within air turbines were undertaken with thermocouples and data loggers. Two examples of widely used UK benchtop steam sterilizers were tested: a non-vacuum benchtop sterilizer (Little Sister 3; Eschmann, Lancing, UK) and a vacuum benchtop sterilizer (Lisa; W&H, Bürmoos, Austria). Each sterilizer cycle was completed with three handpieces and each cycle in triplicate. A total of 140 measurements inside dental handpiece lumens were recorded. The non-vacuum process failed (time range: 0-150 s) to reliably achieve sterilization temperatures within the time limit specified by the international standard (15 s equilibration time). The measurement point at the base of the handpiece failed in all test runs (N = 9) to meet the standard. No failures were detected with the vacuum steam sterilization type B process with fractionated pre-vacuum and post-vacuum phases. Non-vacuum downward/gravity displacement, type-N steam sterilization processes are unreliable in achieving sterilization conditions inside dental handpieces, and the base of the handpiece is the site most likely to fail. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
Evaluation of STAT medication ordering process in a community hospital
Walsh., Kim; Schwartz., Barbara
Background: In most health care facilities, problems related to delays in STAT medication order processing time are of common concern. Objective: The purpose of this study was to evaluate processing time for STAT orders at Kimball Medical Center. Methods: All STAT orders were reviewed to determine processing time; order processing time was also stratified by physician order entry (physician entered (PE) orders vs. non-physician entered (NPE) orders). Collected data included medication ordered, indication, time ordered, time verified by pharmacist, time sent from pharmacy, and time charted as given to the patient. Results: A total of 502 STAT orders were reviewed and 389 orders were included for analysis. Overall, median time was 29 minutes, IQR 16–63; p<0.0001.). The time needed to process NPE orders was significantly less than that needed for PE orders (median 27 vs. 34 minutes; p=0.026). In terms of NPE orders, the median total time required to process STAT orders for medications available in the Automated Dispensing Devices (ADM) was within 30 minutes, while that required to process orders for medications not available in the ADM was significantly greater than 30 minutes. For PE orders, the median total time required to process orders for medications available in the ADM (i.e., not requiring pharmacy involvement) was significantly greater than 30 minutes. [Median time = 34 minutes (p<0.001)]. Conclusion: We conclude that STAT order processing time may be improved by increasing the availability of medications in ADM, and pharmacy involvement in the verification process. PMID:27382418
Brooks, Robin; Thorpe, Richard; Wilson, John
2004-11-11
A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to Manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral with the method is a new format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field Trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method for improving the quality of both process operations and product by providing Process Alarms and Alerts of much high quality than ever before. The paper describes the methods used, including a simple visual method for Alarm Rationalisation that quickly delivers large sets of Consistent Alarm Limits, and the extension to full Alert Management with highlights from the Field Trials to indicate the overall effectiveness of the method in practice.
Tsai, Po-Yen; Lee, I-Chin; Hsu, Hsin-Yun; Huang, Hong-Yuan; Fan, Shih-Kang; Liu, Cheng-Hsien
2016-01-01
Here, we describe a technique to manipulate a low number of beads to achieve high washing efficiency with zero bead loss in the washing process of a digital microfluidic (DMF) immunoassay. Previously, two magnetic bead extraction methods were reported in the DMF platform: (1) single-side electrowetting method and (2) double-side electrowetting method. The first approach could provide high washing efficiency, but it required a large number of beads. The second approach could reduce the required number of beads, but it was inefficient where multiple washes were required. More importantly, bead loss during the washing process was unavoidable in both methods. Here, an improved double-side electrowetting method is proposed for bead extraction by utilizing a series of unequal electrodes. It is shown that, with proper electrode size ratio, only one wash step is required to achieve 98% washing rate without any bead loss at bead number less than 100 in a droplet. It allows using only about 25 magnetic beads in DMF immunoassay to increase the number of captured analytes on each bead effectively. In our human soluble tumor necrosis factor receptor I (sTNF-RI) model immunoassay, the experimental results show that, comparing to our previous results without using the proposed bead extraction technique, the immunoassay with low bead number significantly enhances the fluorescence signal to provide a better limit of detection (3.14 pg/ml) with smaller reagent volumes (200 nl) and shorter analysis time (<1 h). This improved bead extraction technique not only can be used in the DMF immunoassay but also has great potential to be used in any other bead-based DMF systems for different applications. PMID:26858807
Supporting the design of office layout meeting ergonomics requirements.
Margaritis, Spyros; Marmaras, Nicolas
2007-11-01
This paper proposes a method and an information technology tool aiming to support the ergonomics layout design of individual workstations in a given space (building). The proposed method shares common ideas with previous generic methods for office layout. However, it goes a step forward and focuses on the cognitive tasks which have to be carried out by the designer or the design team trying to alleviate them. This is achieved in two ways: (i) by decomposing the layout design problem to six main stages, during which only a limited number of variables and requirements are considered and (ii) by converting the ergonomics requirements to functional design guidelines. The information technology tool (ErgoOffice 0.1) automates certain phases of the layout design process, and supports the design team either by its editing and graphical facilities or by providing adequate memory support.
Design and Analysis of Offshore Macroalgae Biorefineries.
Golberg, Alexander; Liberzon, Alexander; Vitkin, Edward; Yakhini, Zohar
2018-03-15
Displacing fossil fuels and their derivatives with renewables, and increasing sustainable food production are among the major challenges facing the world in the coming decades. A possible, sustainable direction for addressing this challenge is the production of biomass and the conversion of this biomass to the required products through a complex system coined biorefinery. Terrestrial biomass and microalgae are possible sources; however, concerns over net energy balance, potable water use, environmental hazards, and uncertainty in the processing technologies raise questions regarding their actual potential to meet the anticipated food, feed, and energy challenges in a sustainable way. Alternative sustainable sources for biorefineries are macroalgae grown and processed offshore. However, implementation of the offshore biorefineries requires detailed analysis of their technological, economic, and environmental performance. In this chapter, the basic principles of marine biorefineries design are shown. The methods to integrate thermodynamic efficiency, investment, and environmental aspects are discussed. The performance improvement by development of new cultivation methods that fit macroalgae physiology and development of new fermentation methods that address macroalgae unique chemical composition is shown.
NASA Technical Reports Server (NTRS)
Jeong, Myeong-Jae; Hsu, N. Christina; Kwiatkowska, Ewa J.; Franz, Bryan A.; Meister, Gerhard; Salustro, Clare E.
2012-01-01
The retrieval of aerosol properties from spaceborne sensors requires highly accurate and precise radiometric measurements, thus placing stringent requirements on sensor calibration and characterization. For the Terra/Moderate Resolution Imaging Spedroradiometer (MODIS), the characteristics of the detectors of certain bands, particularly band 8 [(B8); 412 nm], have changed significantly over time, leading to increased calibration uncertainty. In this paper, we explore a possibility of utilizing a cross-calibration method developed for characterizing the Terral MODIS detectors in the ocean bands by the National Aeronautics and Space Administration Ocean Biology Processing Group to improve aerosol retrieval over bright land surfaces. We found that the Terra/MODIS B8 reflectance corrected using the cross calibration method resulted in significant improvements for the retrieved aerosol optical thickness when compared with that from the Multi-angle Imaging Spectroradiometer, Aqua/MODIS, and the Aerosol Robotic Network. The method reported in this paper is implemented for the operational processing of the Terra/MODIS Deep Blue aerosol products.
NASA Technical Reports Server (NTRS)
Chen, J. C.; Garba, J. A.; Wada, B. K.
1978-01-01
In the design/analysis process of a payload structural system, the accelerations at the payload/launch vehicle interface obtained from a system analysis using a rigid payload are often used as the input forcing function to the elastic payload to obtain structural design loads. Such an analysis is at best an approximation since the elastic coupling effects are neglected. This paper develops a method wherein the launch vehicle/rigid payload interface accelerations are modified to account for the payload elasticity. The advantage of the proposed method, which is exact to the extent that the physical system can be described by a truncated set of generalized coordinates, is that the complete design/analysis process can be performed within the organization responsible for the payload design. The method requires the updating of the system normal modes to account for payload changes, but does not require a complete transient solution using the composite system model. An application to a real complex structure, the Viking Spacecraft System, is given.
Achieving superresolution with illumination-enhanced sparsity.
Yu, Jiun-Yann; Becker, Stephen R; Folberth, James; Wallin, Bruce F; Chen, Simeng; Cogswell, Carol J
2018-04-16
Recent advances in superresolution fluorescence microscopy have been limited by a belief that surpassing two-fold resolution enhancement of the Rayleigh resolution limit requires stimulated emission or the fluorophore to undergo state transitions. Here we demonstrate a new superresolution method that requires only image acquisitions with a focused illumination spot and computational post-processing. The proposed method utilizes the focused illumination spot to effectively reduce the object size and enhance the object sparsity and consequently increases the resolution and accuracy through nonlinear image post-processing. This method clearly resolves 70nm resolution test objects emitting ~530nm light with a 1.4 numerical aperture (NA) objective, and, when imaging through a 0.5NA objective, exhibits high spatial frequencies comparable to a 1.4NA widefield image, both demonstrating a resolution enhancement above two-fold of the Rayleigh resolution limit. More importantly, we examine how the resolution increases with photon numbers, and show that the more-than-two-fold enhancement is achievable with realistic photon budgets.
Non-Contact Temperature Requirements (NCTM) for drop and bubble physics
NASA Technical Reports Server (NTRS)
Hmelo, Anthony B.; Wang, Taylor G.
1989-01-01
Many of the materials research experiments to be conducted in the Space Processing program require a non-contaminating method of manipulating and controlling weightless molten materials. In these experiments, the melt is positioned and formed within a container without physically contacting the container's wall. An acoustic method, which was developed by Professor Taylor G. Wang before coming to Vanderbilt University from the Jet Propulsion Laboratory, has demonstrated the capability of positioning and manipulating room temperature samples. This was accomplished in an earth-based laboratory with a zero-gravity environment of short duration. However, many important facets of high temperature containerless processing technology have not been established yet, nor can they be established from the room temperature studies, because the details of the interaction between an acoustic field an a molten sample are largely unknown. Drop dynamics, bubble dynamics, coalescence behavior of drops and bubbles, electromagnetic and acoustic levitation methods applied to molten metals, and thermal streaming are among the topics discussed.
Quantifying induced effects of subsurface renewable energy storage
NASA Astrophysics Data System (ADS)
Bauer, Sebastian; Beyer, Christof; Pfeiffer, Tilmann; Boockmeyer, Anke; Popp, Steffi; Delfs, Jens-Olaf; Wang, Bo; Li, Dedong; Dethlefsen, Frank; Dahmke, Andreas
2015-04-01
New methods and technologies for energy storage are required for the transition to renewable energy sources. Subsurface energy storage systems such as salt caverns or porous formations offer the possibility of hosting large amounts of energy or substance. When employing these systems, an adequate system and process understanding is required in order to assess the feasibility of the individual storage option at the respective site and to predict the complex and interacting effects induced. This understanding is the basis for assessing the potential as well as the risks connected with a sustainable usage of these storage options, especially when considering possible mutual influences. For achieving this aim, in this work synthetic scenarios for the use of the geological underground as an energy storage system are developed and parameterized. The scenarios are designed to represent typical conditions in North Germany. The types of subsurface use investigated here include gas storage and heat storage in porous formations. The scenarios are numerically simulated and interpreted with regard to risk analysis and effect forecasting. For this, the numerical simulators Eclipse and OpenGeoSys are used. The latter is enhanced to include the required coupled hydraulic, thermal, geomechanical and geochemical processes. Using the simulated and interpreted scenarios, the induced effects are quantified individually and monitoring concepts for observing these effects are derived. This presentation will detail the general investigation concept used and analyze the parameter availability for this type of model applications. Then the process implementation and numerical methods required and applied for simulating the induced effects of subsurface storage are detailed and explained. Application examples show the developed methods and quantify induced effects and storage sizes for the typical settings parameterized. This work is part of the ANGUS+ project, funded by the German Ministry of Education and Research (BMBF).
Cook, Darren A N; Pilotte, Nils; Minetti, Corrado; Williams, Steven A; Reimer, Lisa J
2017-11-06
Background: Molecular xenomonitoring (MX), the testing of insect vectors for the presence of human pathogens, has the potential to provide a non-invasive and cost-effective method for monitoring the prevalence of disease within a community. Current MX methods require the capture and processing of large numbers of mosquitoes, particularly in areas of low endemicity, increasing the time, cost and labour required. Screening the excreta/feces (E/F) released from mosquitoes, rather than whole carcasses, improves the throughput by removing the need to discriminate vector species since non-vectors release ingested pathogens in E/F. It also enables larger numbers of mosquitoes to be processed per pool. However, this new screening approach requires a method of efficiently collecting E/F. Methods: We developed a cone with a superhydrophobic surface to allow for the efficient collection of E/F. Using mosquitoes exposed to either Plasmodium falciparum , Brugia malayi or Trypanosoma brucei brucei, we tested the performance of the superhydrophobic cone alongside two other collection methods. Results: All collection methods enabled the detection of DNA from the three parasites. Using the superhydrophobic cone to deposit E/F into a small tube provided the highest number of positive samples (16 out of 18) and facilitated detection of parasite DNA in E/F from individual mosquitoes. Further tests showed that following a simple washing step, the cone can be reused multiple times, further improving its cost-effectiveness. Conclusions: Incorporating the superhydrophobic cone into mosquito traps or holding containers could provide a simple and efficient method for collecting E/F. Where this is not possible, swabbing the container or using the washing method facilitates the detection of the three parasites used in this study.
Application of fluorescence spectroscopy for on-line bioprocess monitoring and control
NASA Astrophysics Data System (ADS)
Boehl, Daniela; Solle, D.; Toussaint, Hans J.; Menge, M.; Renemann, G.; Lindemann, Carsten; Hitzmann, Bernd; Scheper, Thomas-Helmut
2001-02-01
12 Modern bioprocess control requires fast data acquisition and in-time evaluation of bioprocess variables. On-line fluorescence spectroscopy for data acquisition and the use of chemometric methods accomplish these requirements. The presented investigations were performed with fluorescence spectrometers with wide ranges of excitation and emission wavelength. By detection of several biogenic fluorophors (amino acids, coenzymes and vitamins) a large amount of information about the state of the bioprocess are obtained. For the evaluation of the process variables partial least squares regression is used. This technique was applied to several bioprocesses: the production of ergotamine by Claviceps purpurea, the production of t-PA (tissue plasminogen activator) by animal cells and brewing processes. The main point of monitoring the brewing processes was to determine the process variables cell count and extract concentration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Abhijit; Voter, Arthur
2009-01-01
We develop a variation of the temperature accelerated dynamics (TAD) method, called the p-TAD method, that efficiently generates an on-the-fly kinetic Monte Carlo (KMC) process catalog with control over the accuracy of the catalog. It is assumed that transition state theory is valid. The p-TAD method guarantees that processes relevant at the timescales of interest to the simulation are present in the catalog with a chosen confidence. A confidence measure associated with the process catalog is derived. The dynamics is then studied using the process catalog with the KMC method. Effective accuracy of a p-TAD calculation is derived when amore » KMC catalog is reused for conditions different from those the catalog was originally generated for. Different KMC catalog generation strategies that exploit the features of the p-TAD method and ensure higher accuracy and/or computational efficiency are presented. The accuracy and the computational requirements of the p-TAD method are assessed. Comparisons to the original TAD method are made. As an example, we study dynamics in sub-monolayer Ag/Cu(110) at the time scale of seconds using the p-TAD method. It is demonstrated that the p-TAD method overcomes several challenges plaguing the conventional KMC method.« less
Jensen, Mallory A.; LaSalvia, Vincenzo; Morishige, Ashley E.; ...
2016-08-01
The capital expense (capex) of conventional crystal growth methods is a barrier to sustainable growth of the photovoltaic industry. It is challenging for innovative techniques to displace conventional growth methods due the low dislocation density and high lifetime required for high efficiency devices. One promising innovation in crystal growth is the noncontact crucible method (NOC-Si), which combines aspects of Czochralski (Cz) and conventional casting. This material has the potential to satisfy the dual requirements, with capex likely between that of Cz (high capex) and multicrystalline silicon (mc-Si, low capex). In this contribution, we observe a strong dependence of solar cellmore » efficiency on ingot height, correlated with the evolution of swirl-like defects, for single crystalline n-type silicon grown by the NOC-Si method. We posit that these defects are similar to those observed in Cz, and we explore the response of NOC-Si to high temperature treatments including phosphorous diffusion gettering (PDG) and Tabula Rasa (TR). The highest lifetimes (2033 us for the top of the ingot and 342 us for the bottom of the ingot) are achieved for TR followed by a PDG process comprising a standard plateau and a low temperature anneal. Further improvements can be gained by tailoring the time-temperature profiles of each process. Lifetime analysis after the PDG process indicates the presence of a getterable impurity in the as-grown material, while analysis after TR points to the presence of oxide precipitates especially at the bottom of the ingot. Uniform lifetime degradation is observed after TR which we assign to a presently unknown defect. Lastly, future work includes additional TR processing to uncover the nature of this defect, microstructural characterization of suspected oxide precipitates, and optimization of the TR process to achieve the dual goals of high lifetime and spatial homogenization.« less
OPERATOR BURDEN IN METAL ADDITIVE MANUFACTURING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elliott, Amy M; Love, Lonnie J
2016-01-01
Additive manufacturing (AM) is an emerging manufacturing process that creates usable machine parts via layer-by-layer joining of a stock material. With this layer-wise approach, high-performance geometries can be created which are impossible with traditional manufacturing methods. Metal AM technology has the potential to significantly reduce the manufacturing burden of developing custom hardware; however, a major consideration in choosing a metal AM system is the required amount of operator involvement (i.e., operator burden) in the manufacturing process. The operator burden not only determines the amount of operator training and specialization required but also the usability of the system in a facility.more » As operators of several metal AM processes, the Manufacturing Demonstration Facility (MDF) at Oak Ridge National Labs is uniquely poised to provide insight into requirements for operator involvement in each of the three major metal AM processes. The paper covers an overview of each of the three metal AM technologies, focusing on the burden on the operator to complete the build cycle, process the part for final use, and reset the AM equipment for future builds.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Posseme, N., E-mail: nicolas.posseme@cea.fr; Pollet, O.; Barnola, S.
2014-08-04
Silicon nitride spacer etching realization is considered today as one of the most challenging of the etch process for the new devices realization. For this step, the atomic etch precision to stop on silicon or silicon germanium with a perfect anisotropy (no foot formation) is required. The situation is that none of the current plasma technologies can meet all these requirements. To overcome these issues and meet the highly complex requirements imposed by device fabrication processes, we recently proposed an alternative etching process to the current plasma etch chemistries. This process is based on thin film modification by light ionsmore » implantation followed by a selective removal of the modified layer with respect to the non-modified material. In this Letter, we demonstrate the benefit of this alternative etch method in term of film damage control (silicon germanium recess obtained is less than 6 A), anisotropy (no foot formation), and its compatibility with other integration steps like epitaxial. The etch mechanisms of this approach are also addressed.« less
Gas Chromatic Mass Spectrometer
NASA Technical Reports Server (NTRS)
Wey, Chowen
1995-01-01
Gas chromatograph/mass spectrometer (GC/MS) used to measure and identify combustion species present in trace concentration. Advanced extractive diagnostic method measures to parts per billion (PPB), as well as differentiates between different types of hydrocarbons. Applicable for petrochemical, waste incinerator, diesel transporation, and electric utility companies in accurately monitoring types of hydrocarbon emissions generated by fuel combustion, in order to meet stricter environmental requirements. Other potential applications include manufacturing processes requiring precise detection of toxic gaseous chemicals, biomedical applications requiring precise identification of accumulative gaseous species, and gas utility operations requiring high-sensitivity leak detection.
Middleton, John; Vaks, Jeffrey E
2007-04-01
Errors of calibrator-assigned values lead to errors in the testing of patient samples. The ability to estimate the uncertainties of calibrator-assigned values and other variables minimizes errors in testing processes. International Organization of Standardization guidelines provide simple equations for the estimation of calibrator uncertainty with simple value-assignment processes, but other methods are needed to estimate uncertainty in complex processes. We estimated the assigned-value uncertainty with a Monte Carlo computer simulation of a complex value-assignment process, based on a formalized description of the process, with measurement parameters estimated experimentally. This method was applied to study uncertainty of a multilevel calibrator value assignment for a prealbumin immunoassay. The simulation results showed that the component of the uncertainty added by the process of value transfer from the reference material CRM470 to the calibrator is smaller than that of the reference material itself (<0.8% vs 3.7%). Varying the process parameters in the simulation model allowed for optimizing the process, while keeping the added uncertainty small. The patient result uncertainty caused by the calibrator uncertainty was also found to be small. This method of estimating uncertainty is a powerful tool that allows for estimation of calibrator uncertainty for optimization of various value assignment processes, with a reduced number of measurements and reagent costs, while satisfying the requirements to uncertainty. The new method expands and augments existing methods to allow estimation of uncertainty in complex processes.
Exclusive Reactions Involving Pions and Nucleons
NASA Technical Reports Server (NTRS)
Norbury, John W.; Blattnig, Steve R.; Tripathi, R. K.
2002-01-01
The HZETRN code requires inclusive cross sections as input. One of the methods used to calculate these cross sections requires knowledge of all exclusive processes contributing to the inclusive reaction. Conservation laws are used to determine all possible exclusive reactions involving strong interactions between pions and nucleons. Inclusive particle masses are subsequently determined and are needed in cross-section calculations for inclusive pion production.
40 CFR Table 4 to Subpart Dddd of... - Requirements for Performance Tests
Code of Federal Regulations, 2014 CFR
2014-07-01
... HAP as THC compliance option measure emissions of total HAP as THC Method 25A in appendix A to 40 CFR... subtract the methane emissions from the emissions of total HAP as THC. (6) each process unit subject to a... § 63.2240(c) establish the site-specific operating requirements (including the parameter limits or THC...
40 CFR Table 4 to Subpart Dddd of... - Requirements for Performance Tests
Code of Federal Regulations, 2012 CFR
2012-07-01
... HAP as THC compliance option measure emissions of total HAP as THC Method 25A in appendix A to 40 CFR... subtract the methane emissions from the emissions of total HAP as THC. (6) each process unit subject to a... § 63.2240(c) establish the site-specific operating requirements (including the parameter limits or THC...
40 CFR Table 4 to Subpart Dddd of... - Requirements for Performance Tests
Code of Federal Regulations, 2013 CFR
2013-07-01
... HAP as THC compliance option measure emissions of total HAP as THC Method 25A in appendix A to 40 CFR... subtract the methane emissions from the emissions of total HAP as THC. (6) each process unit subject to a... § 63.2240(c) establish the site-specific operating requirements (including the parameter limits or THC...
NASA Technical Reports Server (NTRS)
Dabney, James B.; Arthur, James Douglas
2017-01-01
Agile methods have gained wide acceptance over the past several years, to the point that they are now a standard management and execution approach for small-scale software development projects. While conventional Agile methods are not generally applicable to large multi-year and mission-critical systems, Agile hybrids are now being developed (such as SAFe) to exploit the productivity improvements of Agile while retaining the necessary process rigor and coordination needs of these projects. From the perspective of Independent Verification and Validation (IVV), however, the adoption of these hybrid Agile frameworks is becoming somewhat problematic. Hence, we find it prudent to question the compatibility of conventional IVV techniques with (hybrid) Agile practices.This paper documents our investigation of (a) relevant literature, (b) the modification and adoption of Agile frameworks to accommodate the development of large scale, mission critical systems, and (c) the compatibility of standard IVV techniques within hybrid Agile development frameworks. Specific to the latter, we found that the IVV methods employed within a hybrid Agile process can be divided into three groups: (1) early lifecycle IVV techniques that are fully compatible with the hybrid lifecycles, (2) IVV techniques that focus on tracing requirements, test objectives, etc. are somewhat incompatible, but can be tailored with a modest effort, and (3) IVV techniques involving an assessment requiring artifact completeness that are simply not compatible with hybrid Agile processes, e.g., those that assume complete requirement specification early in the development lifecycle.
Multiwavelength digital holography for polishing tool shape measurement
NASA Astrophysics Data System (ADS)
Lédl, Vít.; Psota, Pavel; Václavík, Jan; Doleček, Roman; Vojtíšek, Petr
2013-09-01
Classical mechano-chemical polishing is still a valuable technique, which gives unbeatable results for some types of optical surfaces. For example, optics for high power lasers requires minimized subsurface damage, very high cosmetic quality, and low mid spatial frequency error. One can hardly achieve this with use of subaperture polishing. The shape of the polishing tool plays a crucial role in achieving the required form of the optical surface. Often the shape of the polishing tool or pad is not known precisely enough during the manufacturing process. The tool shape is usually premachined and later is changed during the polishing procedure. An experienced worker could estimate the shape of the tool indirectly from the shape of the polished element, and that is why he can achieve the required shape in few reasonably long iterative steps. Therefore the lack of the exact tool shape knowledge is tolerated. Sometimes, this indirect method is not feasible even if small parts are considered. Moreover, if processes on machines like planetary (continuous) polishers are considered, the incorrect shape of the polishing pad could extend the polishing times extremely. Every iteration step takes hours. Even worse, polished piece could be wasted if the pad has a poor shape. The ability of the tool shape determination would be very valuable in those types of lengthy processes. It was our primary motivation to develop a contactless measurement method for large diffusive surfaces and demonstrate its usability. The proposed method is based on application of multiwavelength digital holographic interferometry with phase shift.
Russian Earth Science Research Program on ISS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armand, N. A.; Tishchenko, Yu. G.
1999-01-22
Version of the Russian Earth Science Research Program on the Russian segment of ISS is proposed. The favorite tasks are selected, which may be solved with the use of space remote sensing methods and tools and which are worthwhile for realization. For solving these tasks the specialized device sets (submodules), corresponding to the specific of solved tasks, are working out. They would be specialized modules, transported to the ISS. Earth remote sensing research and ecological monitoring (high rates and large bodies transmitted from spaceborne information, comparatively stringent requirements to the period of its processing, etc.) cause rather high requirements tomore » the ground segment of receiving, processing, storing, and distribution of space information in the interests of the Earth natural resources investigation. Creation of the ground segment has required the development of the interdepartmental data receiving and processing center. Main directions of works within the framework of the ISS program are determined.« less
[Energy requirements of petroleum workers in Western Siberia].
Bondarev, G I; Vissarionova, V Ia; Dupik, V S; Zemlianskaia, T A
1982-01-01
Energy requirements of drillers, derrick mounters and maintenance workers belonging to dispersed collectives were defined on the basis of materials available at the oil field Surgutneft named for the 50th anniversary of October. Energy requirements of the team workers were studied by the method of Douglas-Haldane during autumn-winter in the course of performing various production processes. Energy requirements were established as regards the operations made in the course of the basic technological processes. The budget of the working time was calculated in accordance with a rate-qualification manual. Energy consumption during out-of-work time was established by the method of individual questionnaires, followed by energy consumption calculation during various types of the work according to the generally accepted energy equivalents. The daily energy consumption with regard to the eight-hour work was found to constitute 3100-3660 kcal for drillers and the first assistant drillers, and 3700-3900 kcal for the second and third assistant drillers. The oilmen were distributed into groups in terms of the work intensity: group II--drillers, first assistant drillers and maintenance workers; group III--the second and third assistant drillers, assistant maintenance workers, and derrick mounters.
Supercritical Fluid Technologies to Fabricate Proliposomes.
Falconer, James R; Svirskis, Darren; Adil, Ali A; Wu, Zimei
2015-01-01
Proliposomes are stable drug carrier systems designed to form liposomes upon addition of an aqueous phase. In this review, current trends in the use of supercritical fluid (SCF) technologies to prepare proliposomes are discussed. SCF methods are used in pharmaceutical research and industry to address limitations associated with conventional methods of pro/liposome fabrication. The SCF solvent methods of proliposome preparation are eco-friendly (known as green technology) and, along with the SCF anti-solvent methods, could be advantageous over conventional methods; enabling better design of particle morphology (size and shape). The major hurdles of SCF methods include poor scalability to industrial manufacturing which may result in variable particle characteristics. In the case of SCF anti-solvent methods, another hurdle is the reliance on organic solvents. However, the amount of solvent required is typically less than that used by the conventional methods. Another hurdle is that most of the SCF methods used have complicated manufacturing processes, although once the setup has been completed, SCF technologies offer a single-step process in the preparation of proliposomes compared to the multiple steps required by many other methods. Furthermore, there is limited research into how proliposomes will be converted into liposomes for the end-user, and how such a product can be prepared reproducibly in terms of vesicle size and drug loading. These hurdles must be overcome and with more research, SCF methods, especially where the SCF acts as a solvent, have the potential to offer a strong alternative to the conventional methods to prepare proliposomes.