Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-02
Notice of availability: ``Draft Guidance for Industry and FDA Staff: Processing/Reprocessing Medical Devices in Health Care Settings: Validation Methods and Labeling.'' The recommendations in this guidance...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-25
... elements of process validation for the manufacture of human and animal drug and biological products, including APIs. This guidance describes process validation activities in three stages: in Stage 1, Process Design, the...
FDA 2011 process validation guidance: lifecycle compliance model.
Campbell, Cliff
2014-01-01
This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, with documentation requirements treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The 2011 guidance provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, from both a technical and a quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.
ERIC Educational Resources Information Center
Jacobs, James A.
In an effort to develop a course in materials and processes of industry at Norfolk State College using Barton Herrscher's model of systematic instruction, a group of 12 NASA-Langley Research Center's (NASA-LRC) research engineers and technicians were recruited. The group acted as consultants in validating the content of the course and aided in…
How Many Batches Are Needed for Process Validation under the New FDA Guidance?
Yang, Harry
2013-01-01
The newly updated FDA Guidance for Industry on Process Validation: General Principles and Practices ushers in a life cycle approach to process validation. While the guidance no longer considers the traditional three-batch validation appropriate, it does not prescribe the number of validation batches for a prospective validation protocol, nor does it provide specific methods to determine it. This could potentially leave manufacturers in a quandary. In this paper, I develop a Bayesian method to address the issue. By combining process knowledge gained from Stage 1 Process Design (PD) with expected outcomes of Stage 2 Process Performance Qualification (PPQ), the number of validation batches for PPQ is determined to provide a high level of assurance that the process will consistently produce future batches meeting quality standards. Several examples based on simulated data are presented to illustrate the use of the Bayesian method in helping manufacturers make risk-based decisions for Stage 2 PPQ, and they highlight the advantages of the method over traditional frequentist approaches. The discussions in the paper lend support for a life cycle and risk-based approach to process validation recommended in the new FDA guidance.
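The abstract does not reproduce the paper's actual computation, but the flavor of such a Bayesian batch-number calculation can be sketched. As a hypothetical simplification (the paper's model may well differ), suppose Stage 1 knowledge is encoded as a Beta(a, 1) prior on the batch pass probability p, so that n passing PPQ batches give a Beta(a + n, 1) posterior with the closed-form tail P(p >= t) = 1 - t^(a+n):

```python
def batches_needed(a_prior, p_target=0.95, assurance=0.90, n_max=50):
    """Smallest number n of passing PPQ batches such that the posterior
    Beta(a_prior + n, 1) puts at least `assurance` probability on the
    event p >= p_target. Assumes a Beta(a, 1) prior, i.e. all prior
    pseudo-observations from Stage 1 are passes (an illustrative choice)."""
    for n in range(1, n_max + 1):
        # For Beta(alpha, 1), the CDF is t**alpha, so the tail is 1 - t**alpha
        tail = 1.0 - p_target ** (a_prior + n)
        if tail >= assurance:
            return n
    return None
```

With a strong Stage 1 prior such as Beta(40, 1) (roughly forty prior passing batches' worth of evidence), five passing PPQ batches already give 90% assurance that the pass probability exceeds 0.95; a weaker Beta(10, 1) prior would require 35, which illustrates why the guidance's risk-based framing makes the batch count knowledge-dependent rather than fixed at three.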
Microbiological Validation of the IVGEN System
NASA Technical Reports Server (NTRS)
Porter, David A.
2013-01-01
The principal purpose of this report is to describe a validation process that can be performed in part on the ground prior to launch, and in space for the IVGEN system. The general approach taken is derived from standard pharmaceutical industry validation schemes modified to fit the special requirements of in-space usage.
NASA Astrophysics Data System (ADS)
Staszak, Katarzyna
2017-11-01
Membrane processes play an important role in industrial separation. These technologies are found in all industrial areas, such as food, beverages, metallurgy, pulp and paper, textiles, pharmaceuticals, automotive, biotechnology and the chemical industry, as well as in water treatment for domestic and industrial applications. Although these processes have been known since the twentieth century, many studies still focus on testing new membrane materials and determining the conditions for optimal selectivity, i.e., the optimum transmembrane pressure (TMP) or permeate flux that minimizes fouling. Researchers have also proposed calculation methods to predict membrane process properties. In this article, laboratory-scale experiments on membrane separation techniques, as well as their validation by calculation methods, are presented. Because the membrane is the "heart" of the process, experimental and computational methods for its characterization are also described.
NASA Astrophysics Data System (ADS)
Colla, V.; Desanctis, M.; Dimatteo, A.; Lovicu, G.; Valentini, R.
2011-09-01
The purpose of the present work is the implementation and validation of a model able to predict the microstructure changes and the mechanical properties of modern high-strength dual-phase steels after the continuous annealing process line (CAPL) and galvanizing (Galv) process. Experimental continuous cooling transformation (CCT) diagrams for 13 differently alloyed dual-phase steels were measured by dilatometry from the intercritical range and were used to tune the parameters of the microstructural prediction module of the model. Mechanical properties and microstructural features were measured for more than 400 dual-phase steels simulating the CAPL and Galv industrial process, and the results were used to construct the mechanical model that predicts mechanical properties from microstructural features, chemistry, and process parameters. The model was validated and proved its efficiency in reproducing the transformation kinetics and mechanical properties of dual-phase steels produced by typical industrial processes. Although it is limited to the dual-phase grades and chemical compositions explored, this model will constitute a useful tool for the steel industry.
Validation of contractor HMA testing data in the materials acceptance process.
DOT National Transportation Integrated Search
2010-08-01
"This study conducted an analysis of the SCDOT HMA specification. A Research Steering Committee comprised of SCDOT, FHWA, and Industry representatives provided oversight of the process. The research process included a literature review, a brief surve...
A European Competence Framework for Industrial Pharmacy Practice in Biotechnology.
Atkinson, Jeffrey; Crowley, Pat; De Paepe, Kristien; Gennery, Brian; Koster, Andries; Martini, Luigi; Moffat, Vivien; Nicholson, Jane; Pauwels, Gunther; Ronsisvalle, Giuseppe; Sousa, Vitor; van Schravendijk, Chris; Wilson, Keith
2015-07-29
The PHAR-IN project ("Competences for industrial pharmacy practice in biotechnology") looked at whether industrial employees and academics rank competences for practice in the biotechnology industry differently. A small expert panel consisting of the authors of this paper produced a biotechnology competence framework by drawing up an initial list of competences and then ranking them in importance using a three-stage Delphi process. The framework was next evaluated and validated by a large expert panel of academics (n = 37) and industrial employees (n = 154). Results show that priorities of industrial employees and academics were similar. The competences for biotechnology practice that received the highest scores were mainly in "Research and Development", '"Upstream" and "Downstream" Processing', "Product development and formulation", "Aseptic processing", "Analytical methodology", "Product stability", and "Regulation". The main area of disagreement was the category "Ethics and drug safety", where academics ranked competences higher than industrial employees did.
A review and validation of the IMPLAN model for Pennsylvania's solid hardwood product industries
Bruce E. Lord; Charles H. Strauss
1993-01-01
The IMPLAN model for Pennsylvania was reviewed with respect to the industries processing the state's solid hardwood resources. Several sectors were found to be underrepresented in the standard sources of industrial activity. Further problems were attributed to the lack of distinction between hardwoods and softwoods in the national model. A further set of changes...
Process Improvements in Training Device Acceptance Testing: A Study in Total Quality Management
1990-12-12
Quality Management, a small group of Government and industry specialists examined the existing training device acceptance test process for potential improvements. The agreed-upon mission of the Air Force/industry partnership was to continuously identify and promote implementable approaches that minimize the cost and time required for acceptance testing while ensuring that validated performance supports the user training requirements. Application of a Total Quality process improvement model focused on the customers and their requirements, analyzed how work was accomplished, and...
TRANSPORT PLANNING MODEL FOR WIDE AREA RECYCLING SYSTEM OF INDUSTRIAL WASTE PLASTIC
NASA Astrophysics Data System (ADS)
Arai, Yasuhiro; Kawamura, Hisashi; Koizumi, Akira; Mogi, Satoshi
To date, the majority of industrial waste plastic generated in urban cities has been sent to landfill. However, it is now necessary to actively utilize that plastic as a useful resource to create a recycling society with low environmental impact. In order to construct a reasonable recycling system, it is necessary to address the "transportation problem": determining how much industrial waste plastic is to be transported to which location. With the goal of eliminating landfill disposal, this study develops a transport planning model for industrial waste plastic using linear programming. The results of running optimized calculations under given scenarios clarified not only the possibilities for recycle processing in the Metropolitan area, but also the validity of a wide-area recycling system.
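The "transportation problem" mentioned above is a classic linear program. As a toy illustration (not the study's actual model or data), a balanced two-source, two-facility instance is small enough to solve by hand: fixing the flow from source 1 to sink 1 determines every other flow, so the LP collapses to minimizing a linear function of one variable over an interval, and the optimum lies at an endpoint:

```python
def solve_2x2_transport(supply, demand, cost):
    """Minimize total transport cost for a balanced 2x2 transportation
    problem. supply = (s1, s2), demand = (d1, d2), cost[i][j] = unit cost
    from source i to sink j. Returns (flow matrix, total cost)."""
    s1, s2 = supply
    d1, d2 = demand
    assert s1 + s2 == d1 + d2, "problem must be balanced"
    # Feasible range for x = flow from source 1 to sink 1
    lo = max(0, s1 - d2)
    hi = min(s1, d1)
    # Total cost is linear in x; its slope decides which endpoint is optimal
    slope = cost[0][0] - cost[0][1] - cost[1][0] + cost[1][1]
    x = lo if slope >= 0 else hi
    flows = [[x, s1 - x], [d1 - x, s2 - (d1 - x)]]
    total = sum(cost[i][j] * flows[i][j] for i in range(2) for j in range(2))
    return flows, total
```

Real instances with many generation sites and recycling facilities are solved the same way in principle, but with a general-purpose LP solver rather than this endpoint argument.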
Development and validation of nonthermal and advanced thermal food safety intervention technologies
USDA-ARS?s Scientific Manuscript database
Alternative nonthermal and thermal food safety interventions are gaining acceptance by the food processing industry and consumers. These technologies include high pressure processing, ultraviolet and pulsed light, ionizing radiation, pulsed and radiofrequency electric fields, cold atmospheric plasm...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-31
... intended for transfusion, including recommendations for validation and quality control monitoring of the..., including recommendations for validation and quality control monitoring of the leukocyte reduction process... control number 0910-0052; the collections of information in 21 CFR 606.100(b), 606.100(c), and 606.121...
System theory in industrial patient monitoring: an overview.
Baura, G D
2004-01-01
Patient monitoring refers to the continuous observation of repeating events of physiologic function to guide therapy or to monitor the effectiveness of interventions; it is used primarily in the intensive care unit and operating room. Commonly processed signals are the electrocardiogram, intraarterial blood pressure, arterial oxygen saturation, and cardiac output. To this day, the majority of physiologic waveform processing in patient monitors is conducted using heuristic curve fitting. However, in the early 1990s, a few enterprising engineers and physicians began using system theory to improve their core processing. Applications included improvement of the signal-to-noise ratio, whether degraded by low signal levels or by motion artifact, and improvement in feature detection. The goal of this mini-symposium is to review the early work in this emerging field, which has led to technologic breakthroughs. This overview talk discusses the process of system theory algorithm research and development. Research for industrial monitors involves substantial data collection, with some data used for algorithm training and the remainder used for validation. Once the algorithms are validated, they are translated into detailed specifications, which development then turns into DSP code. The DSP code is verified and validated per the Good Manufacturing Practices mandated by the FDA.
Development and validation of instrument for ergonomic evaluation of tablet arm chairs
Tirloni, Adriana Seára; dos Reis, Diogo Cunha; Bornia, Antonio Cezar; de Andrade, Dalton Francisco; Borgatto, Adriano Ferreti; Moro, Antônio Renato Pereira
2016-01-01
The purpose of this study was to develop and validate an evaluation instrument for tablet arm chairs based on ergonomic requirements, focused on user perceptions and using Item Response Theory (IRT). This exploratory study involved 1,633 participants (university students and professors) in four steps: a pilot study (n=26), semantic validation (n=430), content validation (n=11) and construct validation (n=1,166). Samejima's graded response model was applied to validate the instrument. The results showed that all the steps (theoretical and practical) of the instrument's development and validation processes were successful and that the group of remaining items (n=45) had a high consistency (0.95). This instrument can be used in the furniture industry by engineers and product designers and in the purchasing process of tablet arm chairs for schools, universities and auditoriums. PMID:28337099
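The abstract does not state which statistic produced the 0.95 consistency figure; Cronbach's alpha is one common internal-consistency estimate for multi-item instruments of this kind, and it can be computed without any libraries:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for an instrument. `items` is a list of lists:
    one row per item, one column per respondent (all rows equal length).
    Alpha compares the sum of per-item variances with the variance of
    respondents' total scores."""
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))
```

When every item orders respondents identically, the total-score variance dominates the per-item variances and alpha reaches its maximum of 1.0; values near 0.95, as reported above, indicate a highly consistent item set.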
A review of the solar array manufacturing industry costing standards
NASA Technical Reports Server (NTRS)
1977-01-01
The solar array manufacturing industry costing standards model is designed to compare the cost of producing solar arrays using alternative manufacturing processes. Constructive criticism of the methodology used is intended to enhance its implementation as a practical design tool. Three main elements of the procedure include workbook format and presentation, theoretical model validity and standard financial parameters.
ERIC Educational Resources Information Center
Hamm, Michael S.
In 1993-1996, the Grocers Research and Educational Foundation of the National Grocers Association developed entry-level skill standards for the food marketing industry. A coalition formed early in the project directed the skill standard development process and solicited input from major organizations involved in the industry. The validity of the…
NASA Astrophysics Data System (ADS)
Zheng, Zhongchao; Seto, Tatsuru; Kim, Sanghong; Kano, Manabu; Fujiwara, Toshiyuki; Mizuta, Masahiko; Hasebe, Shinji
2018-06-01
The Czochralski (CZ) process is the dominant method for manufacturing large cylindrical single-crystal ingots for the electronics industry. Although many models and control methods for the CZ process have been proposed, they were only tested with small equipment, and only a few industrial applications were reported. In this research, we constructed a first-principle model for controlling industrial CZ processes that produce 300 mm single-crystal silicon ingots. The developed model, which consists of energy, mass balance, hydrodynamic, and geometrical equations, calculates the crystal radius and the crystal growth rate as output variables by using the heater input, the crystal pulling rate, and the crucible rise rate as input variables. To improve accuracy, we modeled the CZ process by considering factors such as changes in the positions of the crucible and the melt level. The model was validated with operation data from an industrial 300 mm CZ process. We compared the calculated and actual values of the crystal radius and the crystal growth rate, and the results demonstrated that the developed model simulated the industrial process with high accuracy.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
(Formerly FDA-2007D-0393) Notice of availability: ``Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility,'' dated April 2013.
Fault detection and diagnosis in an industrial fed-batch cell culture process.
Gunther, Jon C; Conner, Jeremy S; Seborg, Dale E
2007-01-01
A flexible process monitoring method was applied to industrial pilot plant cell culture data for the purpose of fault detection and diagnosis. Data from 23 batches, 20 normal operating conditions (NOC) and three abnormal, were available. A principal component analysis (PCA) model was constructed from 19 NOC batches, and the remaining NOC batch was used for model validation. Subsequently, the model was used to successfully detect (both offline and online) abnormal process conditions and to diagnose the root causes. This research demonstrates that data from a relatively small number of batches (approximately 20) can still be used to monitor for a wide range of process faults.
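The PCA monitoring described above can be illustrated with a toy squared-prediction-error (SPE) computation. This is a from-scratch sketch on made-up two-variable data, not the paper's model; real batch monitoring would unfold three-way batch trajectories and set statistical control limits for both T² and SPE:

```python
import math

def power_iteration(cov, iters=200):
    """Leading eigenvector (principal component) of a symmetric 2x2
    covariance matrix via power iteration."""
    v = [1.0, 1.0]
    for _ in range(iters):
        w = [cov[0][0] * v[0] + cov[0][1] * v[1],
             cov[1][0] * v[0] + cov[1][1] * v[1]]
        norm = math.hypot(w[0], w[1])
        v = [w[0] / norm, w[1] / norm]
    return v

def spe(sample, mean, pc):
    """Squared prediction error: squared distance from a sample to the
    one-component PCA subspace fitted on normal operating (NOC) data."""
    c = [sample[0] - mean[0], sample[1] - mean[1]]
    score = c[0] * pc[0] + c[1] * pc[1]               # projection onto PC
    resid = [c[0] - score * pc[0], c[1] - score * pc[1]]
    return resid[0] ** 2 + resid[1] ** 2

# Hypothetical NOC batches: two correlated summary variables per batch
noc = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.0)]
n = len(noc)
mean = [sum(x) / n for x in zip(*noc)]
cov = [[0.0, 0.0], [0.0, 0.0]]
for a, b in noc:
    da, db = a - mean[0], b - mean[1]
    cov[0][0] += da * da / (n - 1); cov[0][1] += da * db / (n - 1)
    cov[1][0] += db * da / (n - 1); cov[1][1] += db * db / (n - 1)
pc = power_iteration(cov)

# A batch consistent with the NOC correlation has near-zero SPE;
# one that breaks the correlation has a large SPE and is flagged as faulty.
normal_spe = spe((2.5, 5.0), mean, pc)
fault_spe = spe((2.5, 9.0), mean, pc)
```

The same mechanism scales to the roughly 20 batches and many variables of the study: the PCA model captures the correlation structure of normal batches, and abnormal batches reveal themselves by falling far from the model plane.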
Development of a Scale-up Tool for Pervaporation Processes
Thiess, Holger; Strube, Jochen
2018-01-01
In this study, an engineering tool for the design and optimization of pervaporation processes is developed based on physico-chemical modelling coupled with laboratory/mini-plant experiments. The model incorporates the solution-diffusion-mechanism, polarization effects (concentration and temperature), axial dispersion, pressure drop and the temperature drop in the feed channel due to vaporization of the permeating components. The permeance, being the key model parameter, was determined via dehydration experiments on a mini-plant scale for the binary mixtures ethanol/water and ethyl acetate/water. A second set of experimental data was utilized for the validation of the model for two chemical systems. The industrially relevant ternary mixture, ethanol/ethyl acetate/water, was investigated close to its azeotropic point and compared to a simulation conducted with the determined binary permeance data. Experimental and simulation data proved to agree very well for the investigated process conditions. In order to test the scalability of the developed engineering tool, large-scale data from an industrial pervaporation plant used for the dehydration of ethanol was compared to a process simulation conducted with the validated physico-chemical model. Since the membranes employed in both mini-plant and industrial scale were of the same type, the permeance data could be transferred. The comparison of the measured and simulated data proved the scalability of the derived model. PMID:29342956
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geng, R. L.; Golden, B. A.; Kushnick, P.
2011-07-01
One of the major goals of ILC SRF cavity R&D is to develop industrial capabilities for cavity manufacture and processing in all three regions. In the past several years, Jefferson Lab, in collaboration with Fermi National Accelerator Laboratory, has processed and tested all the 9-cell cavities of the first batch (4 cavities) and second batch (6 cavities) of production cavities manufactured by Advanced Energy Systems Inc. (AES). Throughout, close information feedback was maintained, resulting in changes to fabrication and processing procedures. A light buffered chemical polishing was introduced, removing the weld splatters that could not be effectively removed by heavy EP alone. An 800 Celsius, 2-hour vacuum furnace heat treatment procedure replaced the original 600 Celsius, 10-hour procedure. Four of the six 9-cell cavities of the second production batch achieved a gradient of 36-41 MV/m at a Q0 of more than 8E9 at 35 MV/m. This result validated AES as the first ``ILC certified'' industrial vendor in the US for ILC cavity manufacture.
Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.
Scott, Bradley; Wilcock, Anne
2006-01-01
Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.
Carrasco, Juan A; Dormido, Sebastián
2006-04-01
The use of industrial control systems in simulators facilitates the execution of engineering activities related to the installation and optimization of control systems in real plants. "Industrial control system" is intended here as a broad term covering all the control systems that can be installed in an industrial plant, ranging from complex distributed control systems and SCADA packages to small single control devices. This paper summarizes the current alternatives for the development of simulators of industrial plants and presents an analysis of the process of integrating an industrial control system into a simulator, with the aim of helping in the installation of real control systems in simulators.
Assessment Methodology for Process Validation Lifecycle Stage 3A.
Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana
2017-07-01
The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small molecule as well as bio-pharma products, where substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: number of 3A batch determination; evaluation of critical material attributes, critical process parameters, critical quality attributes; in vivo in vitro correlation; estimation of inherent process variability (IPV) and PaCS index; process capability and quality dashboard (PCQd); and enhanced control strategy. US FDA guidance on Process Validation: General Principles and Practices, January 2011 encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. Elements of 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.
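Several of the Stage 3A elements listed above reduce to simple statistics over batch data. As an illustration only (the data and specification limits below are hypothetical, and the paper's IPV, PaCS, and PCQd metrics are not detailed in the abstract), a process performance index Ppk compares how far the batch mean sits from the nearer specification limit, in units of three standard deviations:

```python
def ppk(data, lsl, usl):
    """Process performance index from continued-process-verification data:
    the distance of the mean to the nearer spec limit (LSL/USL), divided
    by three sample standard deviations."""
    n = len(data)
    mean = sum(data) / n
    sd = (sum((x - mean) ** 2 for x in data) / (n - 1)) ** 0.5
    return min(usl - mean, mean - lsl) / (3 * sd)
```

A Ppk comfortably above about 1.33 is conventionally read as a capable process; trending this index batch over batch is one way a "quality dashboard" of the kind mentioned above can flag drift before specifications are breached.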
NASA Astrophysics Data System (ADS)
Nilasari, Yoni; Dasining
2018-04-01
In this era of globalization, every human resource faces a competitive climate that has a major impact on the development of the business and industrial sectors. It is therefore necessary to research curriculum development based on the INQF and the business/industry sector in order to improve competence in sewing techniques for vocational high school students in the fashion clothing program. Developing a curriculum based on the INQF and the business/industry sector is an activity that produces a curriculum suited to the needs of the business and industrial sectors. The research problems are: (1) what is a curriculum based on the INQF and the business/industry sector?; (2) what are the process and procedure for developing a fashion program curriculum based on the INQF and the business/industry sector?; and (3) what are the results of such a curriculum? The aims of the research are: (1) to explain what is meant by a curriculum based on the INQF and the business/industry sector; (2) to describe the process and procedure of developing such a curriculum; and (3) to present the resulting curriculum. The research method chosen for developing the curriculum is the 4-D model of Thiagarajan, which includes: (1) define; (2) design; (3) develop; and (4) disseminate. Step 4 was not carried out in this study.
The results show that: (1) the curriculum based on the INQF and the business/industry sector is a curriculum created by applying the principles and procedures of the Indonesian National Qualification Framework (INQF), intended to improve the quality of level-2 vocational high school graduates, with cooperation from business/industry in the form of guest teachers (counselors) in the learning process; (2) the development process and procedure involved several stages: a feasibility and requirements study, preparation of an initial concept of curriculum planning based on the INQF and the business/industry sector in the field of fashion, and development of an implementation plan; this development produces a curriculum for the fashion proficiency program in the form of sewing technology learning competencies, with guest teachers from the business/industry sector as the instructors (counselors); and (3) the learning materials earned an average validity score of 3.5 (very valid criteria) and an average practicality score of 3.3 (practical criteria).
Validity and consistency assessment of accident analysis methods in the petroleum industry.
Ahmadi, Omran; Mortazavi, Seyed Bagher; Khavanin, Ali; Mokarami, Hamidreza
2017-11-17
Accident analysis is the main aspect of accident investigation: it connects different causes in a procedural way. It is therefore important to use valid and reliable methods for investigating the causal factors of accidents, especially the noteworthy ones. This study assessed the accuracy (sensitivity index [SI]) and consistency of the six accident analysis methods most commonly used in the petroleum industry. Two real case studies from the petroleum industry (a process safety accident and a personal accident) were analyzed by 10 assessors, who had been trained in a workshop on accident analysis methods, and the accuracy and consistency of the methods were evaluated. The systematic cause analysis technique and the bowtie method earned the greatest SI scores for the personal and process safety accidents, respectively. The best average consistency for a single method (based on 10 independent assessors) was in the region of 70%. This study confirmed that the application of methods with pre-defined causes and a logic tree can enhance the sensitivity and consistency of accident analysis.
Validation of X1 motorcycle model in industrial plant layout by using WITNESS™ simulation software
NASA Astrophysics Data System (ADS)
Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.
2017-09-01
This paper demonstrates a case study on simulation, modelling and analysis of the X1 motorcycle model. A motorcycle assembly plant was selected as the main site of the research. Simulation techniques using the WITNESS software were applied to evaluate the performance of the existing manufacturing system. The main objective was to validate the data and identify the significant impacts on the overall performance of the system for future improvement. The validation process started with identification of the assembly line layout. All components were evaluated to determine whether the data are significant for future improvement. Machine and labor statistics were among the parameters evaluated for process improvement. The average total cycle time of given workstations was used as the criterion for comparing possible variants. The simulation showed that the data used are appropriate and meet the criteria for two-sided assembly line problems.
NASA Astrophysics Data System (ADS)
Martin, Ffion A.; Warrior, Nicholas A.; Simacek, Pavel; Advani, Suresh; Hughes, Adrian; Darlington, Roger; Senan, Eissa
2018-03-01
Very short manufacture cycle times are required if continuous carbon fibre and epoxy composite components are to be economically viable for high-volume composite production in the automotive industry. Here, a manufacturing process variant of resin transfer moulding (RTM) targets a reduction of in-mould manufacture time by reducing the time to inject and cure components. The process involves two stages: resin injection followed by compression. A flow simulation methodology using an RTM solver has been developed for the process. This paper compares the simulation predictions to experiments performed on industrial equipment. The issues encountered during manufacturing are included in the simulation, and their sensitivity to the process is explored.
Ungers, L J; Moskowitz, P D; Owens, T W; Harmon, A D; Briggs, T M
1982-02-01
Determining occupational health and safety risks posed by emerging technologies is difficult because of limited statistics. Nevertheless, estimates of such risks must be constructed to permit comparison of various technologies to identify the most attractive processes. One way to estimate risks is to use statistics on related industries. Based on process labor requirements and associated occupational health data, risks to workers and to society posed by an emerging technology can be calculated. Using data from the California semiconductor industry, this study applies a five-step occupational risk assessment procedure to four processes for the fabrication of photovoltaic cells. The validity of the occupational risk assessment method is discussed.
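The risk-estimation step described above amounts to combining process labor requirements with incidence statistics from a surrogate industry; a minimal sketch, with all numbers invented for illustration:

```python
# Illustrative sketch of the surrogate-industry risk estimate described above:
# expected cases = labor requirement (person-years) x incidence rate
# (rates per 100 full-time workers per year). All numbers are invented.

def expected_cases(person_years, incidence_per_100):
    """Expected number of occupational injuries/illnesses for a process."""
    return person_years * incidence_per_100 / 100.0

# Hypothetical photovoltaic-cell fabrication process requiring 250 person-years
# of labor, assessed with an assumed semiconductor-industry rate of 4.1 per 100.
cases = expected_cases(250, 4.1)
print(round(cases, 2))
```

Comparing such estimates across candidate fabrication processes is what allows the most attractive process to be identified despite the lack of direct statistics.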
78 FR 56718 - Draft Guidance for Industry on Bioanalytical Method Validation; Availability
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-13
...] Draft Guidance for Industry on Bioanalytical Method Validation; Availability AGENCY: Food and Drug... availability of a draft guidance for industry entitled ``Bioanalytical Method Validation.'' The draft guidance is intended to provide recommendations regarding analytical method development and validation for the...
Workflow for Criticality Assessment Applied in Biopharmaceutical Process Validation Stage 1.
Zahel, Thomas; Marschall, Lukas; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Mueller, Eric M; Murphy, Patrick; Natschläger, Thomas; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph
2017-10-12
Identification of critical process parameters that impact product quality is a central task during regulatory requested process validation. Commonly, this is done via design of experiments and identification of parameters significantly impacting product quality (rejection of the null hypothesis that the effect equals 0). However, parameters that show large uncertainty, and that might therefore drive product quality beyond a critical limit, may be missed. This can occur during the evaluation of experiments when the residual/un-modelled variance in the experiments is larger than expected a priori. Estimating this risk is the task of the novel retrospective power analysis permutation test presented here. It is evaluated using a data set for two unit operations established during characterization of a biopharmaceutical process in industry. The results show that, for one unit operation, the observed variance in the experiments is much larger than expected a priori, resulting in low power levels for all non-significant parameters. Moreover, we present a workflow for mitigating the risk associated with overlooked parameter effects. This enables a statistically sound identification of critical process parameters. The developed workflow will substantially support industry in delivering constant product quality, reducing process variance and increasing patient safety.
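A permutation test with a simulation-based retrospective power estimate, of the general kind described above, can be sketched as follows (my illustration, not the authors' exact procedure):

```python
import random

# Minimal sketch: a two-sample permutation test for a parameter effect, and a
# crude retrospective power estimate at a critical effect size given the
# residual noise actually observed in the experiments.

def perm_test(low, high, n_perm=2000, rng=random.Random(0)):
    """Permutation p-value for the difference of means between two settings."""
    observed = abs(sum(high) / len(high) - sum(low) / len(low))
    pooled = list(low) + list(high)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(low)], pooled[len(low):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return hits / n_perm

def retrospective_power(effect, noise_sd, n, alpha=0.05, n_sim=200, n_perm=200):
    """Fraction of simulated DOE runs (true effect `effect`, residual SD
    `noise_sd`, n runs per setting) in which the test rejects at `alpha`."""
    rng = random.Random(1)
    rejections = 0
    for _ in range(n_sim):
        low = [rng.gauss(0.0, noise_sd) for _ in range(n)]
        high = [rng.gauss(effect, noise_sd) for _ in range(n)]
        if perm_test(low, high, n_perm, rng) <= alpha:
            rejections += 1
    return rejections / n_sim

# Residual variance much larger than expected -> low power: a non-significant
# parameter may still be critical, which is the risk the workflow quantifies.
print(retrospective_power(effect=1.0, noise_sd=2.0, n=4))
```

With only four runs per setting and noise twice the critical effect, the estimated power is low, illustrating why "not significant" cannot be read as "not critical".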
Jarrett, P Gary
2006-01-01
The primary purpose of this study is to undertake a diagnostic investigation of the international health care logistical environment, determine whether regulatory policies or industry procedures have hindered the implementation of just-in-time (JIT) systems, and recommend operational improvements to be achieved by implementing JIT systems. The analysis was conducted in a systematic manner and compared the anticipated benefits with benefits validated in other industries from the implementation of JIT, supported by an extensive literature review. In this particular study, the cost and benefit outcomes achieved from a health care JIT implementation were compared with those achieved by the manufacturing, service, and retail industries. Chiefly, it was found that the health service market must be restructured to encourage greater price competition among providers, and that a new standardization process should eliminate duplication of products and realize substantial savings.
Validation of Air Force Hazard Assessment Rating Methodology.
1985-09-01
the product of the probability and magnitude of each undesirable possible outcome, integrated or summed over all undesirable outcomes [28:5]. Risk...is difficult to keep risk management factors out of the risk assessment process (27:35). Site Contamination. Since site contamination is a major part...same standards for risks from new products and industries as it does for old products and industries. The risks of new products are screened very
Jiang, Yingnan; Hua, Ming; Wu, Bian; Ma, Hongrui; Pan, Bingcai; Zhang, Quanxing
2014-05-01
Effective arsenic removal from highly laden industrial wastewater is an important but challenging task. Here, a combined coprecipitation/nano-adsorption process, with ferric chloride and calcium chloride as coprecipitation agents and polymer-based nanocomposite as selective adsorbent, has been validated for arsenic removal from tungsten-smelting wastewater. On the basis of operating optimization, a binary FeCl3 (520 mg/L)-CaCl2 (300 mg/L) coprecipitation agent could remove more than 93% arsenic from the wastewater. The resulting precipitate has proved environmental safety based on leaching toxicity test. Fixed-bed column packed with zirconium or ferric-oxide-loaded nanocomposite was employed for further elimination of arsenic in coprecipitated effluent, resulting in a significant decrease of arsenic (from 0.96 to less than 0.5 mg/L). The working capacity of zirconium-loaded nanocomposite was 220 bed volumes per run, much higher than that of ferric-loaded nanocomposite (40 bed volumes per run). The exhausted zirconium-loaded nanocomposite could be efficiently in situ regenerated with a binary NaOH-NaCl solution for reuse without any significant capacity loss. The results validated the combinational coprecipitation/nano-adsorption process to be a potential alternative for effective arsenic removal from highly laden industrial effluent.
The development of the ICME supply-chain: Route to ICME implementation and sustainment
NASA Astrophysics Data System (ADS)
Furrer, David; Schirra, John
2011-04-01
Over the past twenty years, integrated computational materials engineering (ICME) has emerged as a key engineering field with great promise. Models simulating materials-related phenomena have been developed and are being validated for industrial application. The integration of computational methods into material, process and component design has been a challenge, however, in part due to the complexities in the development of an ICME "supply-chain" that supports, sustains and delivers this emerging technology. ICME touches many disciplines, which results in a requirement for many types of computational-based technology organizations to be involved to provide tools that can be rapidly developed, validated, deployed and maintained for industrial applications. The need for, and the current state of an ICME supply-chain along with development and future requirements for the continued pace of introduction of ICME into industrial design practices will be reviewed within this article.
Validation of sterilizing grade filtration.
Jornitz, M W; Meltzer, T H
2003-01-01
Validation considerations for sterilizing grade filters, namely 0.2 micron, changed when the FDA voiced concerns about the validity of bacterial challenge tests performed in the past. Such validation exercises are nowadays considered filter qualification. Filter validation requires more thorough analysis, especially bacterial challenge testing with the actual drug product under process conditions. To do so, viability testing is necessary to determine the bacterial challenge test methodology. In addition to these two compulsory tests, other evaluations such as extractables, adsorption and chemical compatibility tests should be considered. PDA Technical Report #26, Sterilizing Filtration of Liquids, describes all parameters and aspects required for the comprehensive validation of filters. The report is a most helpful tool for validation of liquid filters used in the biopharmaceutical industry. It sets the cornerstones of validation requirements and other filtration considerations.
NASA Astrophysics Data System (ADS)
Pizette, Patrick; Govender, Nicolin; Wilke, Daniel N.; Abriak, Nor-Edine
2017-06-01
The use of the Discrete Element Method (DEM) for industrial civil engineering applications is currently limited by the computational demands when large numbers of particles are considered. The graphics processing unit (GPU), with its highly parallelized hardware architecture, shows potential to enable the solution of civil engineering problems using discrete granular approaches. We demonstrate in this study the practical utility of a validated GPU-enabled DEM modeling environment to simulate industrial-scale granular problems. As an illustration, the flow discharge of storage silos using 8 and 17 million particles is considered. DEM simulations were performed to investigate the influence of particle size (equivalent size for the 20/40-mesh gravel) and induced shear stress for two hopper shapes. The preliminary results indicate that the shape of the hopper significantly influences the discharge rates for the same material. Specifically, this work shows that GPU-enabled DEM modeling environments can model industrial-scale problems on a single portable computer within a day for 30 seconds of process time.
Pairis-Garcia, M; Moeller, S J
2017-03-01
The Common Swine Industry Audit (CSIA) was developed and scientifically evaluated through the combined efforts of a task force consisting of university scientists, veterinarians, pork producers, packers, processers, and retail and food service personnel to provide stakeholders throughout the pork chain with a consistent, reliable, and verifiable system to ensure on-farm swine welfare and food safety. The CSIA tool was built from the framework of the Pork Quality Assurance Plus (PQA Plus) site assessment program with the purpose of developing a single, common audit platform for the U.S. swine industry. Twenty-seven key aspects of swine care are captured and evaluated in CSIA and cover the specific focal areas of animal records, animal observations, facilities, and caretakers. Animal-based measures represent approximately 50% of CSIA evaluation criteria and encompass critical failure criteria, including observation of willful acts of abuse and determination of timely euthanasia. Objective, science-based measures of animal well-being parameters (e.g., BCS, lameness, lesions, hernias) are assessed within CSIA using statistically validated sample sizes providing a detection ability of 1% with 95% confidence. The common CSIA platform is used to identify care issues and facilitate continuous improvement in animal care through a validated, repeatable, and feasible animal-based audit process. Task force members provide continual updates to the CSIA tool with a specific focus toward 1) identification and interpretation of appropriate animal-based measures that provide inherent value to pig welfare, 2) establishment of acceptability thresholds for animal-based measures, and 3) interpretation of CSIA data for use and improvement of welfare within the U.S. swine industry.
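The statistically validated sample sizes mentioned above (1% detection ability with 95% confidence) are consistent with the standard detection sample-size calculation; a sketch under the infinite-population assumption, rather than the CSIA task force's exact method:

```python
import math

# Number of animals to inspect so that, if a condition is present at a given
# prevalence, at least one affected animal is observed with the stated
# confidence. Derived from P(miss all) = (1 - prevalence)**n <= 1 - confidence.

def detection_sample_size(prevalence, confidence):
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

print(detection_sample_size(0.01, 0.95))  # → 299
```

So detecting a condition affecting 1% of pigs with 95% confidence requires observing 299 animals; relaxing the detection ability to 5% drops the sample to 59.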
The Role of Empirical Evidence for Transferring a New Technology to Industry
NASA Astrophysics Data System (ADS)
Baldassarre, Maria Teresa; Bruno, Giovanni; Caivano, Danilo; Visaggio, Giuseppe
Technology transfer and innovation diffusion are key success factors for an enterprise. The shift to a new software technology involves, on one hand, inevitable changes to ingrained and familiar processes and, on the other, requires training, changes in practices, and commitment on behalf of technical staff and management. Nevertheless, industry is often reluctant to innovate because of the changes this entails. The process of innovation diffusion is easier if the new technology is supported by empirical evidence. In this sense, our conjecture is that Empirical Software Engineering (ESE) serves as a means for validating and transferring a new technology within production processes. In this paper, the authors report their experience with a method, the Multiview Framework, defined in the SERLAB research laboratory to support designing and managing a goal-oriented measurement program, which was validated through various empirical studies before being transferred to an Italian SME. Our discussion points out the important role of empirical evidence in obtaining management commitment and buy-in on behalf of technical staff, and in making technology transfer possible.
Fuel quality-processing study. Volume 2: Literature survey
NASA Technical Reports Server (NTRS)
Jones, G. E., Jr.; Amero, R.; Murthy, B.; Cutrone, M.
1981-01-01
The validity of initial assumptions about raw materials choices and relevant upgrading processing options was confirmed. The literature survey also served to define the on-site (at the turbine location) options for fuel treatment and exhaust gas treatment. The literature survey also contains a substantial compilation of specification and physical property information about liquid fuel products relevant to industrial gas turbines.
NASA Astrophysics Data System (ADS)
Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej
2016-04-01
Due to their exceptional strength properties combined with good workability, Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. The design of these processes can be significantly improved by numerical models of phase transformations. Evaluation of the predictive capabilities of such models, as far as their applicability to the simulation of thermal cycles for AHSS is concerned, was the objective of the paper. Two models were considered: the former was an upgrade of the JMAK equation, while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS, though the examples quoted in the paper refer to a Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of continuous annealing lines; the objective was to validate the models' behavior under more complex cooling conditions. The second set of tests included experimental simulations of the thermal cycle characteristic of continuous annealing lines, and the capability of the models to properly describe phase transformations in this process was evaluated. The third set included data from an industrial continuous annealing line. Validation and verification of the models confirmed their good predictive capabilities. Since it does not require application of the additivity rule, the upgrade of the Leblond model was selected as the better one for simulation of industrial processes in AHSS production.
NASA Astrophysics Data System (ADS)
Petrila, S.; Brabie, G.; Chirita, B.
2016-08-01
The analysis performed on manufacturing flows within industrial enterprises producing hydrostatic components was based on a number of factors that influence the smooth running of production, such as: the distance between pieces; the waiting time from one operation to another; the time to complete setups on CNC machines; and tool changing in the case of a large number of operators and the manufacturing complexity of large files [2]. To optimize the manufacturing flow, the Tecnomatix software was used. This software is a complete portfolio of digital manufacturing solutions produced by Siemens. It supports innovation by linking all production stages of a product, from process design through process simulation and validation to the manufacturing process itself. Among its many capabilities for creating a wide range of simulations, the program offers various demonstrations of the behavior of manufacturing cycles. It allows the simulation and optimization of production systems and processes in several areas, such as: automotive suppliers; production of industrial equipment; electronics manufacturing; and the design and production of aerospace and defense parts.
Enabling Automotive Innovation: Tales from a Physicist in Industry
NASA Astrophysics Data System (ADS)
Pinkerton, Frederick
Measurements and instrumentation play an obvious and critical technical role in the automotive industry to assure compliance with government and industry standards such as emissions and fuel economy. Less obvious and equally critical is the role they play in innovative materials for future transportation needs. In today's open innovation environment, where research is distributed among industrial, academic, and government lab partners, the ability to capture, validate, and incorporate both internal and external inventions combines a deep knowledge base and the research tools to evaluate advanced materials and processes. Examples of the impact of measurements and instrumentation on internal, external, and shared research will be given from the experiences of the author and his research colleagues.
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
Vehicle Technologies Program Awards and Patents
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2011-12-13
Award-winning technologies and processes are hallmarks of the programs funded by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, and industrial partners. Awards, patents, and other recognition validate the products of research undertaken as part of the Vehicle Technologies Program.
Application of industry-standard guidelines for the validation of avionics software
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Shagnea, Anita M.
1990-01-01
The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.
Dozier, Samantha; Brown, Jeffrey; Currie, Alistair
2011-11-29
In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.
Sensor Fusion to Estimate the Depth and Width of the Weld Bead in Real Time in GMAW Processes.
Bestard, Guillermo Alvarez; Sampaio, Renato Coral; Vargas, José A R; Alfaro, Sadek C Absi
2018-03-23
The arc welding process is widely used in industry but its automatic control is limited by the difficulty in measuring the weld bead geometry and closing the control loop on the arc, which has adverse environmental conditions. To address this problem, this work proposes a system to capture the welding variables and send stimuli to the Gas Metal Arc Welding (GMAW) conventional process with a constant voltage power source, which allows weld bead geometry estimation with an open-loop control. Dynamic models of depth and width estimators of the weld bead are implemented based on the fusion of thermographic data, welding current and welding voltage in a multilayer perceptron neural network. The estimators were trained and validated off-line with data from a novel algorithm developed to extract the features of the infrared image, a laser profilometer was implemented to measure the bead dimensions and an image processing algorithm that measures depth by making a longitudinal cut in the weld bead. These estimators are optimized for embedded devices and real-time processing and were implemented on a Field-Programmable Gate Array (FPGA) device. Experiments to collect data, train and validate the estimators are presented and discussed. The results show that the proposed method is useful in industrial and research environments.
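The estimator architecture described above (thermographic, current and voltage features fused in a multilayer perceptron) can be sketched as follows; the feature set, layer sizes and weights are illustrative assumptions, not the trained FPGA models from the paper:

```python
import numpy as np

# Minimal sketch of an MLP that fuses infrared, current and voltage features
# to estimate weld bead depth and width. In the paper such estimators were
# trained off-line against profilometer and cross-section measurements; here
# the weights are random, so outputs are meaningless except for shape.

rng = np.random.default_rng(0)

class MLP:
    """One hidden layer, tanh activation, two linear outputs (depth, width)."""
    def __init__(self, n_in=4, n_hidden=8, n_out=2):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        h = np.tanh(x @ self.W1 + self.b1)
        return h @ self.W2 + self.b2

# Assumed feature vector: [IR peak temperature, weld pool area from the IR
# image, welding current, arc voltage], normalized to comparable scales.
features = np.array([0.8, 0.6, 0.5, 0.4])
depth, width = MLP().forward(features)
print(depth, width)
```

The fixed, feed-forward structure is what makes this kind of estimator cheap to pipeline on an FPGA for real-time use.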
Bryan-Jones, Katherine; Bero, Lisa A.
2003-01-01
Objectives. We describe tobacco industry strategies to defeat the Occupational Safety and Health Administration (OSHA) Indoor Air Quality rule and the implementation of those strategies. Methods. We analyzed tobacco industry documents, public commentary on, and media coverage of the OSHA rule. Results. The tobacco industry had 5 strategies: (1) maintain scientific debate about the basis of the rule, (2) delay deliberation on the rule, (3) redefine the scope of the rule, (4) recruit and assist labor and business organizations in opposing the rule, and (5) increase media coverage of the tobacco industry position. The tobacco industry successfully implemented all 5 strategies. Conclusions. Our findings suggest that regulatory authorities must take into account the source, motivation, and validity of arguments used in the regulatory process in order to make accurately informed decisions. PMID:12660202
Adams, Farzana; Nolte, Fred; Colton, James; De Beer, John; Weddig, Lisa
2018-02-23
An experiment to validate the precooking of tuna as a control for histamine formation was carried out at a commercial tuna factory in Fiji. Albacore tuna ( Thunnus alalunga) were brought on board long-line catcher vessels alive, immediately chilled but never frozen, and delivered to an on-shore facility within 3 to 13 days. These fish were then allowed to spoil at 25 to 30°C for 21 to 25 h to induce high levels of histamine (>50 ppm), as a simulation of "worst-case" postharvest conditions, and subsequently frozen. These spoiled fish later were thawed normally and then precooked at a commercial tuna processing facility to a target maximum core temperature of 60°C. These tuna were then held at ambient temperatures of 19 to 37°C for up to 30 h, and samples were collected every 6 h for histamine analysis. After precooking, no further histamine formation was observed for 12 to 18 h, indicating that a conservative minimum core temperature of 60°C pauses subsequent histamine formation for 12 to 18 h. Using the maximum core temperature of 60°C provided a challenge study to validate a recommended minimum core temperature of 60°C, and 12 to 18 h was sufficient to convert precooked tuna into frozen loins or canned tuna. This industrial-scale process validation study provides support at a high confidence level for the preventive histamine control associated with precooking. This study was conducted with tuna deliberately allowed to spoil to induce high concentrations of histamine and histamine-forming capacity and to fail standard organoleptic evaluations, and the critical limits for precooking were validated. Thus, these limits can be used in a hazard analysis critical control point plan in which precooking is identified as a critical control point.
Noel, Jonathan K; Babor, Thomas F
2017-01-01
Exposure to alcohol marketing is considered to be potentially harmful to adolescents. In addition to statutory regulation, industry self-regulation is a common way to protect adolescents from alcohol marketing exposures. This paper critically reviews research designed to evaluate the effectiveness of the alcohol industry's compliance procedures to manage complaints when alcohol marketing is considered to have violated a self-regulatory code. Peer-reviewed papers were identified through four literature search engines: PubMed, SCOPUS, PsychINFO and CINAHL. Non-peer-reviewed reports produced by public health agencies, alcohol research centers, non-governmental organizations, government research centers and national industry advertising associations were also included. The search process yielded three peer-reviewed papers, seven non-peer reviewed reports published by academic institutes and non-profit organizations and 20 industry reports. The evidence indicates that the complaint process lacks standardization across countries, industry adjudicators may be trained inadequately or biased and few complaints are upheld against advertisements pre-determined to contain violations of a self-regulatory code. The current alcohol industry marketing complaint process used in a wide variety of countries may be ineffective at removing potentially harmful content from the market-place. The process of determining the validity of complaints employed by most industry groups appears to suffer from serious conflict of interest and procedural weaknesses that could compromise objective adjudication of even well-documented complaints. In our opinion the current system of self-regulation needs major modifications if it is to serve public health objectives, and more systematic evaluations of the complaint process are needed. © 2016 Society for the Study of Addiction.
The development of an industrial-scale fed-batch fermentation simulation.
Goldrick, Stephen; Ştefan, Andrei; Lovett, David; Montague, Gary; Lennox, Barry
2015-01-10
This paper describes a simulation of an industrial-scale fed-batch fermentation that can be used as a benchmark in process systems analysis and control studies. The simulation was developed using a mechanistic model and validated using historical data collected from an industrial-scale penicillin fermentation process. Each batch was carried out in a 100,000 L bioreactor that used an industrial strain of Penicillium chrysogenum. The manipulated variables recorded during each batch were used as inputs to the simulator and the predicted outputs were then compared with the on-line and off-line measurements recorded in the real process. The simulator adapted a previously published structured model to describe the penicillin fermentation and extended it to include the main environmental effects of dissolved oxygen, viscosity, temperature, pH and dissolved carbon dioxide. In addition the effects of nitrogen and phenylacetic acid concentrations on the biomass and penicillin production rates were also included. The simulated model predictions of all the on-line and off-line process measurements, including the off-gas analysis, were in good agreement with the batch records. The simulator and industrial process data are available to download at www.industrialpenicillinsimulation.com and can be used to evaluate, study and improve on the current control strategy implemented on this facility. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
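A mechanistic fed-batch model of the general kind the simulator extends can be sketched with Monod growth, product formation and feed dilution; all parameter values here are illustrative, not those of the published penicillin model:

```python
# Fed-batch balances for biomass X, substrate S, product P (g/L) and volume V
# (L), integrated with explicit Euler. Parameters are invented for illustration.
mu_max, Ks = 0.11, 0.006    # max specific growth rate (1/h), Monod constant (g/L)
Yxs, q_p = 0.45, 0.004      # biomass yield on substrate (g/g), production rate (g/(g h))
F, S_feed = 50.0, 500.0     # feed rate (L/h), substrate concentration in feed (g/L)

def step(X, S, P, V, dt):
    """One Euler step of the fed-batch balances."""
    mu = mu_max * S / (Ks + S)        # Monod specific growth rate
    D = F / V                         # dilution rate from feeding
    dX = mu * X - D * X
    dS = -(mu / Yxs) * X + D * (S_feed - S)
    dP = q_p * X - D * P
    return X + dX * dt, S + dS * dt, P + dP * dt, V + F * dt

X, S, P, V = 1.0, 1.0, 0.0, 50_000.0  # initial biomass, substrate, product, volume
dt = 0.001                            # h; small step keeps Euler stable at low S
for _ in range(int(100 / dt)):        # 100 h batch
    X, S, P, V = step(X, S, P, V, dt)
print(round(X, 2), round(P, 2), round(V, 1))
```

Once substrate becomes limiting, growth is pinned to the feed rate, which is the substrate-limited regime a penicillin fed-batch is operated in; the published simulator layers dissolved oxygen, viscosity, temperature, pH and CO2 effects on top of balances like these.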
Pageler, Natalie M; Grazier G'Sell, Max Jacob; Chandler, Warren; Mailes, Emily; Yang, Christine; Longhurst, Christopher A
2016-09-01
The objective of this project was to use statistical techniques to determine the completeness and accuracy of data migrated during an electronic health record conversion. Data validation during migration consists of mapped record testing and validation of a sample of the data for completeness and accuracy. We statistically determined a randomized sample size for each data type based on the desired confidence level and error limits. The only error identified in the post go-live period was a failure to migrate some clinical notes, which was unrelated to the validation process. No errors in the migrated data were found during the 12-month post-implementation period. Compared to the typical industry approach, we have demonstrated that a statistical approach to sample size for data validation can ensure consistent confidence levels while maximizing the efficiency of the validation process during a major electronic health record conversion. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
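A randomized sample size chosen from a desired confidence level and error limit, as described above, is commonly computed with the finite-population-corrected formula for a proportion; this sketch assumes that standard formula rather than the authors' exact method:

```python
import math

# Records to sample so the observed error rate is within +/- `error` of the
# true rate at the stated confidence (worst case p = 0.5), with the usual
# finite-population correction for a data type of `population` records.

def validation_sample_size(population, confidence_z=1.96, error=0.05, p=0.5):
    n0 = (confidence_z ** 2) * p * (1 - p) / error ** 2   # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))    # finite-population correction

# Hypothetical data type with 100,000 migrated records, 95% confidence, 5% error.
print(validation_sample_size(100_000))  # → 383
```

The sample size saturates quickly: even millions of records of one data type need only a few hundred sampled checks at these limits, which is what makes the statistical approach more efficient than ad hoc spot-checking.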
NASA Technical Reports Server (NTRS)
Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl
2017-01-01
The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gauge predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.
Processing data base information having nonwhite noise
Gross, Kenneth C.; Morreale, Patricia
1995-01-01
A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can process either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
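The statistical probability ratio test applied to the whitened residual can be illustrated with a Wald-style sequential probability ratio test for a Gaussian mean shift (a sketch under assumed hypotheses and error rates; the patent's actual parameterization is not reproduced here):

```python
import math

def sprt(residuals, m0=0.0, m1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test on a residual signal.

    H0: residuals ~ N(m0, sigma^2)  (sensor/process healthy)
    H1: residuals ~ N(m1, sigma^2)  (mean shift, i.e. degradation)
    alpha/beta are the target false-alarm and missed-alarm rates.
    Returns 'accept H0', 'accept H1', or 'continue'.
    """
    upper = math.log((1 - beta) / alpha)   # decide H1 above this
    lower = math.log(beta / (1 - alpha))   # decide H0 below this
    llr = 0.0
    for r in residuals:
        # Log-likelihood ratio increment for a Gaussian mean shift.
        llr += (m1 - m0) * (r - (m0 + m1) / 2) / sigma ** 2
        if llr >= upper:
            return "accept H1"
        if llr <= lower:
            return "accept H0"
    return "continue"
```

Because the test runs sequentially on each new residual sample, a decision is typically reached with far fewer observations than a fixed-sample test at the same error rates.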
The Development of Model for Measuring Railway Wheels Manufacturing Readiness Level
NASA Astrophysics Data System (ADS)
Inrawan Wiratmadja, Iwan; Mufid, Anas
2016-02-01
In an effort to grow the railway wheel industry in Indonesia and reduce the dependence on imports, the Metal Industries Development Center (MIDC) is implementing railway wheel manufacturing technology in Indonesia. MIDC is a research and development institution tasked with researching the production of a railway wheel prototype, and it acts as a supervisor to industry in Indonesia in implementing railway wheel manufacturing technology. The process of implementing manufacturing technology requires a lot of resources; therefore it is necessary to measure manufacturing readiness. In this study, the measurement of railway wheel manufacturing readiness was done using the manufacturing readiness level (MRL) model from the United States Department of Defense. The MRL model consists of 10 manufacturing readiness levels described by 90 criteria and 184 sub-criteria. To obtain a good and accurate manufacturing readiness measurement instrument, the development process involved experts through an expert judgment method and was validated with the content validity ratio (CVR). The measurement instrument developed in this study consists of 448 indicators. The measurement results show that MIDC's railway wheel manufacturing readiness is at level 4. This shows that there is a gap between the current level of manufacturing readiness owned by MIDC and the manufacturing readiness level required to achieve the program objectives, which is level 5. To achieve the program objectives at level 5, a number of actions are required of MIDC. The indicators that must be improved to achieve level 5 are those related to the cost and financing, process capability and control, quality management, workers, and manufacturing management criteria.
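The content validity ratio used to validate such an instrument follows Lawshe's formula; a minimal sketch (the function name and the panel figures in the example are illustrative, not from the study):

```python
def content_validity_ratio(n_essential, n_experts):
    """Lawshe's content validity ratio for one indicator.

    CVR = (n_e - N/2) / (N/2), where n_e is the number of experts
    rating the indicator 'essential' and N is the panel size.
    Ranges from -1 (no expert rates it essential) to +1 (all do).
    """
    half = n_experts / 2
    return (n_essential - half) / half

# e.g. 9 of 10 experts rate an indicator essential -> CVR = 0.8
cvr = content_validity_ratio(9, 10)
```

Indicators whose CVR falls below the critical value for the panel size are dropped or reworded, which is how an initial pool of sub-criteria is pruned into a final instrument.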
Drug-loaded erythrocytes: on the road toward marketing approval
Bourgeaux, Vanessa; Lanao, José M; Bax, Bridget E; Godfrin, Yann
2016-01-01
Erythrocyte drug encapsulation is one of the most promising therapeutic alternative approaches for the administration of toxic or rapidly cleared drugs. Drug-loaded erythrocytes can operate through one of the three main mechanisms of action: extension of circulation half-life (bioreactor), slow drug release, or specific organ targeting. Although the clinical development of erythrocyte carriers is confronted with regulatory and development process challenges, industrial development is expanding. The manufacture of this type of product can be either centralized or bedside based, and different procedures are employed for the encapsulation of therapeutic agents. The major challenges for successful industrialization include production scalability, process validation, and quality control of the released therapeutic agents. Advantages and drawbacks of the different manufacturing processes as well as success key points of clinical development are discussed. Several entrapment technologies based on osmotic methods have been industrialized. Companies have already achieved many of the critical clinical stages, thus providing the opportunity in the future to cover a wide range of diseases for which effective therapies are not currently available. PMID:26929599
Developing a new industrial engineering curriculum using a systems engineering approach
NASA Astrophysics Data System (ADS)
Buyurgan, Nebil; Kiassat, Corey
2017-11-01
This paper reports on the development of an engineering curriculum for a new industrial engineering programme at a medium-sized private university in the northeast United States. A systems engineering process has been followed to design and develop the new curriculum. Considering the programme curriculum as a system, first the stakeholders have been identified, and some preliminary analysis on their needs and requirements has been conducted. Following that, the phases of conceptual design, preliminary design, and detailed design have been pursued during which different levels of validation, assessment, and evaluation processes have been utilised. In addition, a curriculum assessment and continuous improvement process have been developed to assess the curriculum and the courses frequently. The resulting curriculum is flexible, allowing the pursuit of accelerated graduate programmes, a second major, various minor options, and study-abroad; relevant, tailored to the needs of industry partners in the vicinity; and practical, providing hands-on education, resulting in employment-ready graduates.
Catalysis on Single Supported Atoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeBusk, Melanie Moses; Narula, Chaitanya Kumar
2015-01-01
The highly successful application of supported metals as heterogeneous catalysts in automotive catalysts, fuel cells, and a multitude of other industrial processes has led to extensive efforts to understand catalyst behavior at the nano-scale. The recent discovery of simple wet methods to prepare single supported atoms, the smallest nano-catalysts, has allowed for experimental validation of the catalytic activity of a variety of catalysts and the potential for large-scale production of such catalysts for industrial processes. In this chapter, we summarize the synthetic and structural aspects of single supported atoms. We also present proposed mechanisms for the activity of single supported catalysts where conventional mechanisms cannot operate due to the lack of M-M bonds in the catalysts.
Float-zone processing in a weightless environment
NASA Technical Reports Server (NTRS)
Fowle, A. A.; Haggerty, J. S.; Perron, R. R.; Strong, P. F.; Swanson, J. L.
1976-01-01
Results are reported of investigations to: (1) test the validity of analyses which set maximum practical diameters for Si crystals that can be processed by the float-zone method in a near weightless environment, (2) determine the convective flow patterns induced in a typical float-zone Si melt under conditions perceived to be advantageous to the crystal growth process, using flow visualization techniques applied to a dimensionally scaled model of the Si melt, (3) revise the estimates of the economic impact of space-produced Si crystals grown by the float-zone method on the U.S. electronics industry, and (4) devise a rational plan for future work related to crystal growth phenomena wherein low-gravity conditions available at a space site can be used to maximum benefit to the U.S. electronics industry.
Development of an environmental impact model for the steel industry in Libya
NASA Astrophysics Data System (ADS)
Zaghinin, Mansur Salem
The global demand for steel is rising due to the infrastructural development of emergent economies in countries such as India, China, Thailand and Libya. Consequently, global steel production has increased dramatically and is expected to grow further in the future. Processing iron and steel is associated with a number of sustainable development challenges, including various economic, environmental and social issues. The increasing prominence of environmental issues in international and national political discourse, including in developing countries, means that stakeholders demand that manufacturers minimise the negative impacts of their operations. The steel industry must be able to measure and assess its environmental impacts and demonstrate continuous improvement. This requires an environmental management strategy to manage and minimise impacts on the environment. This study focuses on developing an environmental impact model for the steel industry to investigate the most important environmental parameters and their importance in order to mitigate environmental impacts. Based on the literature review and the elements that are considered waste (derived from the waste survey in the Libyan iron and steel industry), the potential environmental impacts of the steel industry are identified as criteria and sub-criteria. A model is then built using Analytical Hierarchy Process (AHP) software based on the identified criteria and sub-criteria. The model also illustrates the overall goal, which is creating an environmental impact model for the steel industry; in addition, criteria and sub-criteria are listed to clarify the situation and make the analysis clearer and more understandable. Pairwise comparisons are used to derive accurate ratio-scale priorities. The results are analysed and presented as a prioritised list of environmental impacts.
Moreover, a series of sensitivity analyses are conducted to investigate the impact of changing the priority of the criteria on the alternatives' ranking. Validation of the proposed model is carried out to assess its validity and to examine the model from the perspective of professionals in the steel industry.
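The pairwise-comparison priority derivation at the heart of AHP can be approximated with row geometric means; a minimal sketch (the three criteria and the Saaty-scale judgments below are illustrative, not taken from the study):

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority vector via row geometric means.

    `matrix[i][j]` is the Saaty-scale judgment of criterion i over
    criterion j (reciprocal matrix: matrix[j][i] == 1 / matrix[i][j]).
    Returns normalized weights that sum to 1.
    """
    n = len(matrix)
    gm = [math.prod(row) ** (1 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Three hypothetical criteria: air emissions, water discharge, solid waste
m = [[1, 3, 5],
     [1 / 3, 1, 3],
     [1 / 5, 1 / 3, 1]]
weights = ahp_priorities(m)  # largest weight goes to the first criterion
```

The geometric-mean method closely approximates the principal-eigenvector weights for consistent matrices, which is why it is a common hand-computable stand-in for the full eigenvalue calculation.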
Presgrave, Octavio; Moura, Wlamir; Caldeira, Cristiane; Pereira, Elisabete; Bôas, Maria H Villas; Eskes, Chantra
2016-03-01
The need for the creation of a Brazilian centre for the validation of alternative methods was recognised in 2008, and members of academia, industry and existing international validation centres immediately engaged with the idea. In 2012, co-operation between the Oswaldo Cruz Foundation (FIOCRUZ) and the Brazilian Health Surveillance Agency (ANVISA) instigated the establishment of the Brazilian Center for the Validation of Alternative Methods (BraCVAM), which was officially launched in 2013. The Brazilian validation process follows OECD Guidance Document No. 34, where BraCVAM functions as the focal point to identify and/or receive requests from parties interested in submitting tests for validation. BraCVAM then informs the Brazilian National Network on Alternative Methods (RENaMA) of promising assays, which helps with prioritisation and contributes to the validation studies of selected assays. A Validation Management Group supervises the validation study, and the results obtained are peer-reviewed by an ad hoc Scientific Review Committee, organised under the auspices of BraCVAM. Based on the peer-review outcome, BraCVAM will prepare recommendations on the validated test method, which will be sent to the National Council for the Control of Animal Experimentation (CONCEA). CONCEA is in charge of the regulatory adoption of all validated test methods in Brazil, following an open public consultation. 2016 FRAME.
Siracusa, Giovanna; Becarelli, Simone; Lorenzi, Roberto; Gentini, Alessandro; Di Gregorio, Simona
2017-10-25
Polychlorinated biphenyls (PCBs) are hazardous soil contaminants for which a bio-based technology for their recovery is essential. The objective of this study was to validate the exploitation of spent mushroom substrate (SMS), a low- or no-cost organic waste derived from the industrial production of P. ostreatus, as a bulking agent in a dynamic biopile pilot plant. The SMS shows potential oxidative capacity towards recalcitrant compounds. The aim was consistent with the design of a process for the oxidation of highly chlorinated PCBs that is independent of their reductive dehalogenation. Feasibility was verified at the mesocosm scale and validated at the pilot scale in a dynamic biopile pilot plant treating ten tons of a historically contaminated soil (9.28 ± 0.08 mg PCB/kg soil dry weight). Mixing of the SMS with the soil was required for the depletion of the contaminants. At the pilot scale, after eight months of incubation, 94.1% depletion was recorded. A positive correlation between Actinobacteria and Firmicutes active metabolism, soil laccase activity and PCB removal was observed. The SMS was found to be exploitable as a versatile low-cost organic substrate capable of activating processes for the oxidation of highly chlorinated PCBs. Moreover, its exploitation as a bulking agent in biopiles is a valuable management strategy for the re-utilisation of an organic waste deriving from the industrial cultivation of edible mushrooms. Copyright © 2017 Elsevier B.V. All rights reserved.
Verma, Tushar; Wei, Xinyao; Lau, Soon Kiat; Bianchini, Andreia; Eskridge, Kent M; Subbiah, Jeyamkondan
2018-04-01
Salmonella in low-moisture foods is an emerging challenge due to numerous food product recalls and foodborne illness outbreaks. Identification of a suitable surrogate is critical for process validation at the industry level due to implementation of the Food Safety Modernization Act of 2011. The objective of this study was to evaluate Enterococcus faecium NRRL B-2354 as a surrogate for Salmonella during the extrusion of low-moisture food. Oat flour, a low-moisture food, was adjusted to different moisture (14% to 26% wet basis) and fat (5% to 15% w/w) contents and was inoculated with E. faecium NRRL B-2354. The inoculated material was then extruded in a lab-scale single-screw extruder running at different screw speeds (75 to 225 rpm) and different temperatures (75, 85, and 95 °C). A split-plot central composite 2nd-order response surface design was used, with the central point replicated six times. The data from the selective media (m-Enterococcus agar) were used to build the response surface model for inactivation of E. faecium NRRL B-2354. Results indicated that E. faecium NRRL B-2354 always had higher heat resistance than Salmonella at all conditions evaluated in this study. However, the patterns of the contour plots showing the effect of various product and process parameters on inactivation of E. faecium NRRL B-2354 were different from those of Salmonella. Although E. faecium NRRL B-2354 may be an acceptable surrogate for extrusion of low-moisture products due to its higher resistance than Salmonella, another surrogate with similar inactivation behavior may be preferred and needs to be identified. The Food Safety Modernization Act requires the food industry to validate processing interventions. This study validated extrusion processing and demonstrated that E. faecium NRRL B-2354 is an acceptable surrogate for extrusion of low-moisture products.
The developed response surface model allows the industry to identify process conditions to achieve a desired lethality for their products based on composition. © 2018 Institute of Food Technologists®.
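A second-order response surface model of the kind described can be fit by least squares on a quadratic design matrix; a sketch assuming two coded factors (the factor choice, coding, and coefficient values below are illustrative, not the paper's fitted model):

```python
import numpy as np

def fit_rsm(X, y):
    """Fit a 2nd-order (quadratic) response surface model.

    X: (n, 2) array of coded factors (e.g. moisture, temperature).
    The design matrix columns are 1, x1, x2, x1*x2, x1^2, x2^2.
    Returns the least-squares coefficient vector.
    """
    x1, x2 = X[:, 0], X[:, 1]
    D = np.column_stack([np.ones(len(y)), x1, x2, x1 * x2, x1**2, x2**2])
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)
    return beta

def predict_rsm(beta, x1, x2):
    """Predicted response (e.g. log reduction) at coded factor levels."""
    return beta @ np.array([1, x1, x2, x1 * x2, x1**2, x2**2])

# Illustrative check: recover known coefficients from noiseless grid data
pts = np.array([[a, b] for a in (-1.0, 0.0, 1.0) for b in (-1.0, 0.0, 1.0)])
beta_true = np.array([1.0, 0.5, -0.3, 0.2, 0.1, -0.05])
y = np.array([predict_rsm(beta_true, a, b) for a, b in pts])
beta_hat = fit_rsm(pts, y)
```

Once fitted, the model can be inverted over the factor space to find process conditions that achieve a target lethality, which is the practical use the abstract describes.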
EUV mask manufacturing readiness in the merchant mask industry
NASA Astrophysics Data System (ADS)
Green, Michael; Choi, Yohan; Ham, Young; Kamberian, Henry; Progler, Chris; Tseng, Shih-En; Chiou, Tsann-Bim; Miyazaki, Junji; Lammers, Ad; Chen, Alek
2017-10-01
As nodes progress into the 7nm and below regime, extreme ultraviolet lithography (EUVL) becomes critical for all industry participants interested in remaining at the leading edge. One key cost driver for EUV in the supply chain is the reflective EUV mask. As of today, the relatively few end users of EUV consist primarily of integrated device manufactures (IDMs) and foundries that have internal (captive) mask manufacturing capability. At the same time, strong and early participation in EUV by the merchant mask industry should bring value to these chip makers, aiding the wide-scale adoption of EUV in the future. For this, merchants need access to high quality, representative test vehicles to develop and validate their own processes. This business circumstance provides the motivation for merchants to form Joint Development Partnerships (JDPs) with IDMs, foundries, Original Equipment Manufacturers (OEMs) and other members of the EUV supplier ecosystem that leverage complementary strengths. In this paper, we will show how, through a collaborative supplier JDP model between a merchant and OEM, a novel, test chip driven strategy is applied to guide and validate mask level process development. We demonstrate how an EUV test vehicle (TV) is generated for mask process characterization in advance of receiving chip maker-specific designs. We utilize the TV to carry out mask process "stress testing" to define process boundary conditions which can be used to create Mask Rule Check (MRC) rules as well as serve as baseline conditions for future process improvement. We utilize Advanced Mask Characterization (AMC) techniques to understand process capability on designs of varying complexity that include EUV OPC models with and without sub-resolution assist features (SRAFs). Through these collaborations, we demonstrate ways to develop EUV processes and reduce implementation risks for eventual mass production. 
By reducing these risks, we hope to expand access to EUV mask capability for the broadest community possible as the technology is implemented first within and then beyond the initial early adopters.
Use of a Computer-Mediated Delphi Process to Validate a Mass Casualty Conceptual Model
CULLEY, JOAN M.
2012-01-01
Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters. PMID:21076283
Lifetime evaluation of large format CMOS mixed signal infrared devices
NASA Astrophysics Data System (ADS)
Linder, A.; Glines, Eddie
2015-09-01
New large-scale foundry processes continue to produce reliable products. These large-format devices continue to use industry best practices to screen for failure mechanisms and validate their long lifetimes. Failure-in-Time (FIT) analysis, in conjunction with foundry qualification information, can be used to evaluate large-format device lifetimes. This analysis is a helpful tool when zero-failure life tests are typical. The reliability of the device is estimated by applying the failure rate to the use conditions. JEDEC publications continue to be the industry-accepted methods.
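Zero-failure life-test data of the kind mentioned are commonly converted to an upper-bound failure rate with the chi-squared method; a sketch (the confidence level, device counts, and acceleration factor in the example are illustrative, not from the paper):

```python
import math

def fit_upper_bound(devices, hours, acceleration, confidence=0.6, failures=0):
    """Upper-bound failure rate, in FIT, from a zero-failure life test.

    Uses the chi-squared method: lambda <= chi2(confidence, 2f + 2) /
    (2 * equivalent device-hours), scaled to failures per 1e9 hours.
    For failures = 0 the chi-squared quantile is -2 * ln(1 - CL).
    """
    dh = devices * hours * acceleration  # equivalent device-hours
    if failures == 0:
        chi2 = -2 * math.log(1 - confidence)
    else:
        raise NotImplementedError("sketch covers only the zero-failure case")
    return chi2 / (2 * dh) * 1e9

# e.g. 100 devices, 1000 h, 10x thermal acceleration, 60% confidence
rate_fit = fit_upper_bound(100, 1000, 10)
```

Even with zero observed failures, the bound shrinks only as equivalent device-hours grow, which is why acceleration factors and large sample sizes dominate lifetime claims.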
Risk-based Methodology for Validation of Pharmaceutical Batch Processes.
Wiles, Frederick
2013-01-01
In January 2011, the U.S. Food and Drug Administration published new process validation guidance for pharmaceutical processes. The new guidance debunks the long-held industry notion that three consecutive validation batches or runs are all that are required to demonstrate that a process is operating in a validated state. Instead, the new guidance now emphasizes that the level of monitoring and testing performed during process performance qualification (PPQ) studies must be sufficient to demonstrate statistical confidence both within and between batches. In some cases, three qualification runs may not be enough. Nearly two years after the guidance was first published, little has been written defining a statistical methodology for determining the number of samples and qualification runs required to satisfy Stage 2 requirements of the new guidance. This article proposes using a combination of risk assessment, control charting, and capability statistics to define the monitoring and testing scheme required to show that a pharmaceutical batch process is operating in a validated state. In this methodology, an assessment of process risk is performed through application of a process failure mode, effects, and criticality analysis (PFMECA). The output of PFMECA is used to select appropriate levels of statistical confidence and coverage which, in turn, are used in capability calculations to determine when significant Stage 2 (PPQ) milestones have been met. The achievement of Stage 2 milestones signals the release of batches for commercial distribution and the reduction of monitoring and testing to commercial production levels. Individuals, moving range, and range/sigma charts are used in conjunction with capability statistics to demonstrate that the commercial process is operating in a state of statistical control. The new process validation guidance published by the U.S. 
Food and Drug Administration in January of 2011 indicates that the number of process validation batches or runs required to demonstrate that a pharmaceutical process is operating in a validated state should be based on sound statistical principles. The old rule of "three consecutive batches and you're done" is no longer sufficient. The guidance, however, does not provide any specific methodology for determining the number of runs required, and little has been published to augment this shortcoming. The paper titled "Risk-based Methodology for Validation of Pharmaceutical Batch Processes" describes a statistically sound methodology for determining when a statistically valid number of validation runs has been acquired based on risk assessment and calculation of process capability.
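The capability statistics referred to can be illustrated with a process performance index computed from PPQ sample data (a minimal sketch; the acceptance limit, e.g. Ppk >= 1.33 at a chosen confidence, would come from the PFMECA-derived risk tier and is not reproduced here):

```python
import statistics

def ppk(samples, lsl, usl):
    """Process performance index from PPQ sample data.

    Ppk = min(USL - mean, mean - LSL) / (3 * s), using the overall
    sample standard deviation s. A Ppk well above 1 indicates the
    process mean sits comfortably inside the specification limits.
    """
    mean = statistics.fmean(samples)
    s = statistics.stdev(samples)
    return min(usl - mean, mean - lsl) / (3 * s)

# Hypothetical assay results against 90-110% of label claim limits
index = ppk([98, 99, 100, 101, 102], lsl=90, usl=110)
```

Tracked batch over batch alongside individuals and moving-range charts, the index supplies the statistical evidence of within- and between-batch control that the guidance asks Stage 2 studies to demonstrate.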
Lardon, L; Puñal, A; Martinez, J A; Steyer, J P
2005-01-01
Anaerobic digestion (AD) plants are highly efficient wastewater treatment processes with possible energetic valorisation. Despite these advantages, many industries are still reluctant to use them because of their instability in the face of changes in operating conditions. To face this drawback and to enhance the industrial use of anaerobic digestion, one solution is to develop and implement knowledge-based (KB) systems that are able to detect and assess in real time the quality of the operating conditions of the processes. Case-based techniques and heuristic approaches have already been tested and validated on AD processes, but two major properties were lacking: modularity of the system (the knowledge-based system should be easily tuned to a new process and should still work if one or more sensors are added or removed) and uncertainty management (the assessment of the KB system should remain relevant even in the case of too poor or conflicting information sources). This paper addresses these two points and presents a modular KB system where an uncertain reasoning formalism is used to combine partial and complementary fuzzy diagnosis modules. Demonstration of the interest of the approach is provided through real-life experiments performed on an industrial 2,000 m3 CSTR anaerobic digester.
A new e-beam application in the pharmaceutical industry
NASA Astrophysics Data System (ADS)
Sadat, Theo; Malcolm, Fiona
2005-10-01
The paper presents a new electron beam application in the pharmaceutical industry: an in-line self-shielded aseptic transfer system using an electron beam for surface decontamination of products entering a pharmaceutical filling line. The unit was developed by Linac Technologies in response to the specifications of a multi-national pharmaceutical company, to eliminate the risk of microbial contamination entering a filling line housed inside an isolator. In order to fit the sterilization unit inside the pharmaceutical plant, a "miniature" low-energy (200 keV) electron beam accelerator and e-beam tunnel were designed, all conforming to pharmaceutical good manufacturing practice (GMP) regulations. Process validation using biological indicators is described, with reference to the regulations governing the pharmaceutical industry. Other industrial applications of a small-sized self-shielded electron beam sterilization unit are also mentioned.
A Real-time Evaluation of Human-based Approaches to Safety Testing: What We Can Do Now (TDS)
Despite ever-increasing efforts in early safety assessment in all industries, there are still many chemicals that prove toxic in humans. While greater use of human in vitro test methods may serve to reduce this problem, the formal validation process applied to such tests represen...
Towards Developing an Industry-Validated Food Technology Curriculum in Afghanistan
ERIC Educational Resources Information Center
Ebner, Paul; McNamara, Kevin; Deering, Amanda; Oliver, Haley; Rahimi, Mirwais; Faisal, Hamid
2017-01-01
Afghanistan remains an agrarian country with most analyses holding food production and processing as key to recovery. To date, however, there are no public or private higher education departments focused on food technology. To bridge this gap, Herat University initiated a new academic department conferring BS degrees in food technology. Models for…
NASA Astrophysics Data System (ADS)
Yaakob, Mazri; Ali, Wan Nur Athirah Wan; Radzuan, Kamaruddin
2016-08-01
Building Information Modeling (BIM) spans a project from the earliest concept to demolition and involves creating and using an intelligent 3D model to inform and communicate project decisions. This research aims to identify the critical success factors (CSFs) of BIM implementation in the Malaysian construction industry. A literature review was done to explore previous BIM studies on the definitions and history of BIM, construction issues, the application of BIM in construction projects, and the benefits of BIM. A series of interviews with multidisciplinary Malaysian construction experts will be conducted for the data collection process, guided by the research design and methodology approach of this study. The analysis of qualitative data from this process will be combined with criteria identified in the literature review in order to identify the CSFs. Finally, the CSFs of BIM implementation will be validated by further Malaysian industrialists during a workshop. The validated CSFs can be used as a term of reference for both Malaysian practitioners and academics towards measuring the BIM effectiveness level in their organizations.
A European Competence Framework for Industrial Pharmacy Practice in Biotechnology
Atkinson, Jeffrey; Crowley, Pat; De Paepe, Kristien; Gennery, Brian; Koster, Andries; Martini, Luigi; Moffat, Vivien; Nicholson, Jane; Pauwels, Gunther; Ronsisvalle, Giuseppe; Sousa, Vitor; van Schravendijk, Chris; Wilson, Keith
2015-01-01
The PHAR-IN project (“Competences for industrial pharmacy practice in biotechnology”) examined whether industrial employees and academics differ in how they rank competences for practice in the biotechnology industry. A small expert panel consisting of the authors of this paper produced a biotechnology competence framework by drawing up an initial list of competences and then ranking them in importance using a three-stage Delphi process. The framework was next evaluated and validated by a large expert panel of academics (n = 37) and industrial employees (n = 154). Results show that the priorities of industrial employees and academics were similar. The competences for biotechnology practice that received the highest scores were mainly in: “Research and Development”, “‘Upstream’ and ‘Downstream’ Processing”, “Product development and formulation”, “Aseptic processing”, “Analytical methodology”, “Product stability”, and “Regulation”. The main area of disagreement was the category “Ethics and drug safety”, where academics ranked competences higher than industrial employees did. PMID:28975907
Localized analysis of paint-coat drying using dynamic speckle interferometry
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel; Tebaldi, Myrian; Grumel, Eduardo; Rabal, Hector; Elmaghraby, Adel
2018-07-01
Paint-coating is part of several industrial processes, including the automotive industry, architectural coatings, machinery, and appliances. These paint-coatings must comply with high quality standards; for this reason, evaluation techniques for paint-coatings are under constant development. One important factor in the paint-coating process is drying, as it influences the quality of the final result. In this work we present an assessment technique based on optical dynamic speckle interferometry, which allows for temporal evaluation of activity during the paint-coating drying process, providing localized information on drying. This localized information is relevant for addressing drying homogeneity, optimal drying, and quality control. The technique relies on the definition of a new temporal history of the speckle patterns to obtain the local activity; this information is then clustered to provide a convenient indicator of the different stages of the drying process. The experimental results presented were validated using gravimetric drying curves.
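The per-pixel "temporal history" of speckle activity can be illustrated with frame-to-frame absolute intensity differences; a minimal sketch in Python (an illustrative measure, not necessarily the authors' exact definition):

```python
def temporal_activity(frames):
    """Per-pixel activity map: sum of absolute intensity differences
    between consecutive speckle frames (equal-size 2-D lists)."""
    rows, cols = len(frames[0]), len(frames[0][0])
    act = [[0.0] * cols for _ in range(rows)]
    for prev, curr in zip(frames, frames[1:]):
        for i in range(rows):
            for j in range(cols):
                act[i][j] += abs(curr[i][j] - prev[i][j])
    return act

# Three hypothetical 2x2 "frames": the top-left pixel fluctuates
# (still wet, high activity); the other pixels are static (dry).
frames = [[[10, 5], [5, 5]],
          [[14, 5], [5, 5]],
          [[9, 5], [5, 5]]]
print(temporal_activity(frames))  # [[9.0, 0.0], [0.0, 0.0]]
```

Clustering such activity maps (e.g., by thresholding) then separates regions at different drying stages.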
Validation of optimization strategies using the linear structured production chains
NASA Astrophysics Data System (ADS)
Kusiak, Jan; Morkisz, Paweł; Oprocha, Piotr; Pietrucha, Wojciech; Sztangret, Łukasz
2017-06-01
Different optimization strategies applied to a sequence of several stages of production chains were validated in this paper. Two benchmark problems described by ordinary differential equations (ODEs) were considered: a water tank and a passive CR-RC filter were used as exemplary objects described by first- and second-order differential equations, respectively. The optimization problems considered in this work serve as validators of the strategies developed by the authors. However, the main goal of the research is selection of the best strategy for optimization of two real metallurgical processes, which will be investigated in ongoing projects. The first problem is the oxidizing roasting of zinc sulphide concentrate, where the sulphur in the input concentrate should be eliminated and a minimal concentration of sulphide sulphur in the roasted product has to be achieved. The second problem is the lead refining process, consisting of three stages: roasting to the oxide, reduction of the oxide to metal, and oxidizing refining. The strategies that prove most effective on the benchmark problems will be candidates for optimization of the industrial processes mentioned above.
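A first-order water-tank benchmark of the kind described can be sketched as a simulate-and-search loop; a hedged Python illustration with assumed dynamics dh/dt = (q_in − c·√h)/A and hypothetical parameter values (c, A, the candidate grid are all invented for illustration):

```python
import math

def simulate_level(q_in, c=0.5, area=1.0, h0=0.0, t_end=200.0, dt=0.01):
    """Euler integration of the tank level: dh/dt = (q_in - c*sqrt(h)) / area."""
    h = h0
    for _ in range(int(t_end / dt)):
        h += dt * (q_in - c * math.sqrt(max(h, 0.0))) / area
    return h

def best_inflow(target=4.0, candidates=None):
    """Grid-search strategy: pick the inflow whose settled level is closest
    to the target steady-state level."""
    if candidates is None:
        candidates = [i / 10 for i in range(1, 31)]  # 0.1 .. 3.0
    return min(candidates, key=lambda q: abs(simulate_level(q) - target))

# Analytically, steady state satisfies q = c*sqrt(h*), so q = 0.5*sqrt(4) = 1.0
print(best_inflow())  # 1.0
```

Real strategy comparisons would swap the grid search for the candidate optimizers (e.g., gradient-based or evolutionary methods) while keeping the same ODE simulator as the objective.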
Eremenco, Sonya; Pease, Sheryl; Mann, Sarah; Berry, Pamela
2017-01-01
This paper describes the rationale and goals of the Patient-Reported Outcome (PRO) Consortium's instrument translation process. The PRO Consortium has developed a number of novel PRO measures which are in the process of qualification by the U.S. Food and Drug Administration (FDA) for use in clinical trials where endpoints based on these measures would support product labeling claims. Given the importance of FDA qualification of these measures, the PRO Consortium's Process Subcommittee determined that a detailed linguistic validation (LV) process was necessary to ensure that all translations of Consortium-developed PRO measures are performed using a standardized approach with the rigor required to meet regulatory and pharmaceutical industry expectations, and that a clearly defined instrument translation process that the translation industry can support was needed. The consensus process involved gathering information about current best practices from 13 translation companies with expertise in LV, consolidating the findings to generate a proposed process, and obtaining iterative feedback from the translation companies and PRO Consortium member firms on the proposed process in two rounds of review. The aim was to update existing principles of good practice in LV and to provide sufficient detail to ensure consistency across PRO Consortium measures, sponsors, and translation companies. The consensus development resulted in a 12-step process that outlines universal and country-specific new translation approaches, as well as country-specific adaptations of existing translations. The PRO Consortium translation process will play an important role in maintaining the validity of the data generated through these measures by ensuring that they are translated by qualified linguists following a standardized and rigorous process that reflects best practice.
Development and Validation of a Safety Climate Scale for Manufacturing Industry
Ghahramani, Abolfazl; Khalkhali, Hamid R.
2015-01-01
Background: This paper describes the development of a scale for measuring safety climate. Methods: This study was conducted in six manufacturing companies in Iran. The scale was developed by conducting a literature review on safety climate and constructing a question pool; the number of items was reduced to 71 after a screening process. Results: The content validity analysis showed that 59 items had an excellent item content validity index (≥ 0.78) and content validity ratio (> 0.38). Exploratory factor analysis yielded eight safety climate dimensions. The reliability value for the final 45-item scale was 0.96, and confirmatory factor analysis showed that the safety climate model is satisfactory. Conclusion: This study produced a valid and reliable scale for measuring safety climate in manufacturing companies. PMID:26106508
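The content validity cutoffs reported (item content validity index ≥ 0.78, content validity ratio > 0.38) follow standard formulas; a minimal Python sketch with a hypothetical expert panel:

```python
def item_cvi(ratings, relevant=frozenset({3, 4})):
    """Item-level content validity index: fraction of experts rating the
    item as relevant (3 or 4 on a 4-point relevance scale)."""
    return sum(r in relevant for r in ratings) / len(ratings)

def lawshe_cvr(n_essential, n_experts):
    """Lawshe's content validity ratio: CVR = (ne - N/2) / (N/2),
    where ne experts rate the item 'essential' out of N experts."""
    half = n_experts / 2
    return (n_essential - half) / half

# Hypothetical panel of 10 experts rating one scale item
ratings = [4, 4, 3, 4, 3, 4, 4, 3, 4, 2]
print(item_cvi(ratings))   # 0.9 -> exceeds the 0.78 cutoff
print(lawshe_cvr(8, 10))   # 0.6 -> exceeds the 0.38 cutoff
```

Items failing either cutoff are dropped in the screening step, which is how a 71-item pool shrinks toward a final scale.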
Experimental study on combined cold forging process of backward cup extrusion and piercing
NASA Astrophysics Data System (ADS)
Henry, Robinson; Liewald, Mathias
2018-05-01
A reduction in the material usage of cold forged components while maintaining functional requirements can be achieved using hollow or tubular preforms. These preforms are used to meet lightweight requirements and to decrease the production costs of cold formed components. To increase production efficiency in common multi-stage cold forming processes, the manufacturing of hollow preforms by combining backward cup extrusion and piercing was established and is discussed in this paper. The corresponding investigations and experimental studies are reported in this article. The objectives of the experimental investigations were the detection of significant process parameters, the determination of process limits for the combined processes, and the validation of the numerical investigations. In addition, the general influence on the surface quality and diameter tolerance of hollow preforms is discussed. The final goal is to summarize a guideline for industrial application and, moreover, to transfer to industry the knowledge of which part geometries are required to reduce the number of forming stages as well as tool cost.
IVHM for the 3rd Generation RLV Program: Technology Development
NASA Technical Reports Server (NTRS)
Kahle, Bill
2000-01-01
The objective behind the Integrated Vehicle Health Management (IVHM) project is to develop and integrate technologies that can provide a continuous, intelligent, and adaptive health state of a vehicle and use this information to improve safety and reduce the cost of operations. Technological areas discussed include: developing, validating, and transferring next-generation IVHM technologies to near-term industry and government reusable launch systems; focusing NASA on next-generation and highly advanced sensor and software technologies; and validating the IVHM systems engineering design process for future programs.
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. It has therefore become urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules; these become the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed to quantify drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner.
The unique contribution of this research is that it is the first time the electroplating industry can (i) use the available WM strategies systematically, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive waste minimization in the following decade.
Verification and Validation of Digitally Upgraded Control Rooms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Lau, Nathan
2015-09-01
As nuclear power plants undertake main control room modernization, a challenge is the lack of a clearly defined human factors process to follow. Verification and validation (V&V) as applied in the nuclear power community has tended to involve efforts such as integrated system validation, which comes at the tail end of the design stage. To fill in guidance gaps and create a step-by-step process for control room modernization, we have developed the Guideline for Operational Nuclear Usability and Knowledge Elicitation (GONUKE). This approach builds on best practices in the software industry, which prescribe an iterative user-centered approach featuring multiple cycles of design and evaluation. Nuclear regulatory guidance for control room design emphasizes summative evaluation, which occurs after the design is complete. In the GONUKE approach, evaluation is also performed at the formative stage of design, early in the design cycle, using mockups and prototypes for evaluation. The evaluation may involve expert review (e.g., software heuristic evaluation at the formative stage and design verification against human factors standards like NUREG-0700 at the summative stage). The evaluation may also involve user testing (e.g., usability testing at the formative stage and integrated system validation at the summative stage). An additional, often overlooked component of evaluation is knowledge elicitation, which captures operator insights into the system. In this report we outline these evaluation types across design phases that support the overall modernization process. The objective is to provide industry-suitable guidance for steps to be taken in support of the design and evaluation of a new human-machine interface (HMI) in the control room. We suggest the value of early-stage V&V and highlight how this early-stage V&V can help improve the design process for control room modernization. We argue that there is a need to overcome two shortcomings of V&V in current practice: the propensity for late-stage V&V and the use of increasingly complex psychological assessment measures for V&V.
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how these might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
The Development and Validation of the Game User Experience Satisfaction Scale (GUESS).
Phan, Mikki H; Keebler, Joseph R; Chaparro, Barbara S
2016-12-01
The aim of this study was to develop and psychometrically validate a new instrument that comprehensively measures video game satisfaction based on key factors. Playtesting is often conducted in the video game industry to help game developers build better games by providing insight into the players' attitudes and preferences. However, quality feedback is difficult to obtain from playtesting sessions without a quality gaming assessment tool. There is a need for a psychometrically validated and comprehensive gaming scale that is appropriate for playtesting and game evaluation purposes. The process of developing and validating this new scale followed current best practices of scale development and validation. As a result, a mixed-method design that consisted of item pool generation, expert review, questionnaire pilot study, exploratory factor analysis (N = 629), and confirmatory factor analysis (N = 729) was implemented. A new instrument measuring video game satisfaction, called the Game User Experience Satisfaction Scale (GUESS), with nine subscales emerged. The GUESS was demonstrated to have content validity, internal consistency, and convergent and discriminant validity. The GUESS was developed and validated based on the assessments of over 450 unique video game titles across many popular genres. Thus, it can be applied across many types of video games in the industry both as a way to assess what aspects of a game contribute to user satisfaction and as a tool to aid in debriefing users on their gaming experience. The GUESS can be administered to evaluate user satisfaction of different types of video games by a variety of users. © 2016, Human Factors and Ergonomics Society.
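Internal consistency of a multi-item scale such as the GUESS is conventionally reported as Cronbach's alpha, α = k/(k−1) · (1 − Σ item variances / variance of totals); a minimal Python sketch with hypothetical respondent ratings (alpha is a standard statistic, not necessarily the exact coefficient used in this study):

```python
def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(data):
    """data: list of respondents, each a list of k item scores."""
    k = len(data[0])
    items = list(zip(*data))                      # transpose to per-item columns
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in data])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical ratings: 4 respondents x 3 items on a 5-point scale
data = [[5, 4, 5], [4, 4, 4], [2, 3, 2], [1, 2, 1]]
print(round(cronbach_alpha(data), 3))  # 0.956
```

High alpha here reflects that the three hypothetical items rise and fall together across respondents, which is what "internal consistency" measures.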
A new dawn for industrial photosynthesis.
Robertson, Dan E; Jacobson, Stuart A; Morgan, Frederick; Berry, David; Church, George M; Afeyan, Noubar B
2011-03-01
Several emerging technologies are aiming to meet renewable fuel standards, mitigate greenhouse gas emissions, and provide viable alternatives to fossil fuels. Direct conversion of solar energy into fungible liquid fuel is a particularly attractive option, though conversion of that energy on an industrial scale depends on the efficiency of its capture and conversion. Large-scale programs have been undertaken in the recent past that used solar energy to grow innately oil-producing algae for biomass processing to biodiesel fuel. These efforts were ultimately deemed to be uneconomical because the costs of culturing, harvesting, and processing of algal biomass were not balanced by the process efficiencies for solar photon capture and conversion. This analysis addresses solar capture and conversion efficiencies and introduces a unique systems approach, enabled by advances in strain engineering, photobioreactor design, and a process that contradicts prejudicial opinions about the viability of industrial photosynthesis. We calculate efficiencies for this direct, continuous solar process based on common boundary conditions, empirical measurements and validated assumptions wherein genetically engineered cyanobacteria convert industrially sourced, high-concentration CO2 into secreted, fungible hydrocarbon products in a continuous process. These innovations are projected to operate at areal productivities far exceeding those based on accumulation and refining of plant or algal biomass or on prior assumptions of photosynthetic productivity. This concept, currently enabled for production of ethanol and alkane diesel fuel molecules, and operating at pilot scale, establishes a new paradigm for high productivity manufacturing of nonfossil-derived fuels and chemicals.
Braithwaite, Jeffrey; Westbrook, Johanna; Pawsey, Marjorie; Greenfield, David; Naylor, Justine; Iedema, Rick; Runciman, Bill; Redman, Sally; Jorm, Christine; Robinson, Maureen; Nathan, Sally; Gibberd, Robert
2006-01-01
Background: Accreditation has become ubiquitous across the international health care landscape. Award of full accreditation status in health care is viewed, as it is in other sectors, as a valid indicator of high-quality organisational performance. However, few studies have empirically demonstrated this assertion. The value of accreditation therefore remains uncertain, and this persists as a central legitimacy problem for accreditation providers, policymakers and researchers. The question arises as to how best to research the validity, impact and value of accreditation processes in health care. Most health care organisations participate in some sort of accreditation process, and thus it is not possible to study its merits using a randomised controlled strategy. Further, tools and processes for accreditation and organisational performance are multifaceted. Methods/design: To understand the relationship between them, a multi-method research approach is required which incorporates both quantitative and qualitative data. The generic nature of accreditation standard development and inspection within different sectors enhances the extent to which the findings of in-depth study of the accreditation process in one industry can be generalised to other industries. This paper presents a research design which comprises a prospective, multi-method, multi-level, multi-disciplinary approach to assess the validity, impact and value of accreditation. Discussion: The accreditation program which assesses over 1,000 health services in Australia is used as an exemplar for testing this design. The paper proposes this design as a framework suitable for application to future international research into accreditation. Our aim is to stimulate debate on the role of accreditation and how to research it. PMID:16968552
Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems
NASA Astrophysics Data System (ADS)
Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.
2008-08-01
This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners active in the real-time embedded systems domain. The Gene-Auto code generator will significantly improve current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, C.M.
2011-06-01
The need for risk-driven field experiments for CO2 geologic storage processes to complement ongoing pilot-scale demonstrations is discussed. These risk-driven field experiments would be aimed at understanding the circumstances under which things can go wrong with a CO2 capture and storage (CCS) project and cause it to fail, as distinguished from accomplishing this end using demonstration- and industrial-scale sites. Such risk-driven tests would complement risk-assessment efforts that have already been carried out by providing opportunities to validate risk models. In addition to experimenting with high-risk scenarios, these controlled field experiments could help validate monitoring approaches to improve performance assessment and guide development of mitigation strategies.
Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony
2010-02-01
Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system.
Fiber Bragg grating sensors for real-time monitoring of evacuation process
NASA Astrophysics Data System (ADS)
Guru Prasad, A. S.; Hegde, Gopalkrishna M.; Asokan, S.
2010-03-01
Fiber Bragg grating (FBG) sensors have been widely used for a number of sensing applications, such as temperature, pressure, acousto-ultrasonic, static and dynamic strain, and refractive index change measurements. The present work demonstrates the use of FBG sensors for in-situ measurement of a vacuum process with simultaneous leak detection capability. Experiments were conducted in a bell-jar vacuum chamber fitted with a conventional Pirani gauge for vacuum measurement. Three different experiments were conducted to validate the performance of the FBG sensor in monitoring the vacuum creation process and air bleeding. The preliminary results of the FBG sensors in vacuum monitoring are compared with those of a commercial Pirani gauge sensor. This novel technique offers a simple alternative to the conventional method for real-time monitoring of the evacuation process. The proposed FBG-based vacuum sensor has potential applications in vacuum systems involving hazardous environments, such as chemical and gas plants, automobile industries, aeronautical establishments, and leak monitoring in process industries, where electrical or MEMS-based sensors are prone to explosion and corrosion.
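The sensing principle behind an FBG is the Bragg condition, λ_B = 2·n_eff·Λ, with pressure- or strain-induced shifts of the reflected wavelength; a minimal Python sketch with representative (assumed) values for a silica fiber, not parameters from this study:

```python
def bragg_wavelength(n_eff, period_nm):
    """Bragg condition: lambda_B = 2 * effective index * grating period."""
    return 2.0 * n_eff * period_nm

def strain_shift(lam_nm, strain, pe=0.22):
    """Approximate strain response: d_lambda = lambda_B * (1 - pe) * strain,
    where pe is the effective photo-elastic coefficient (~0.22 for silica)."""
    return lam_nm * (1.0 - pe) * strain

lam = bragg_wavelength(1.447, 535.0)      # assumed n_eff and period
print(round(lam, 1))                       # 1548.3 (nm, C-band)
print(strain_shift(lam, 1e-6))             # shift per microstrain, ~1.2 pm
```

Interrogating the reflected wavelength thus converts a mechanical response of the sensor mount (e.g., chamber deformation during evacuation) into an optical readout with no electrical parts at the sensing point.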
Product pricing in the Solar Array Manufacturing Industry - An executive summary of SAMICS
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.
1978-01-01
The capabilities, methodology, and a description of the input data of the Solar Array Manufacturing Industry Costing Standards (SAMICS) are presented. SAMICS was developed to provide a standardized procedure and data base for comparing the manufacturing processes of Low-cost Solar Array (LSA) subcontractors, to guide the setting of research priorities, and to assess the progress of LSA toward its hundred-fold cost reduction goal. SAMICS can be used to estimate manufacturing costs and product prices and to determine the impact of inflation, taxes, and interest rates, but it is limited by ignoring the effects of market supply and demand and by the assumption that all factories operate in a production-line mode. The SAMICS methodology defines the industry structure, hypothetical supplier companies, and manufacturing processes, and maintains a body of standardized data which is used to compute the final product price. The input data include the product description, the process characteristics, the equipment cost factors, and production data for the preparation of detailed cost estimates. Activities validating that SAMICS produces realistic price estimates and cost breakdowns are described.
Space - A unique environment for process modeling R&D
NASA Technical Reports Server (NTRS)
Overfelt, Tony
1991-01-01
Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting, and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space: joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.
Dietary Intake of Trans Fatty Acids in Children Aged 4-5 in Spain: The INMA Cohort Study.
Scholz, Alexander; Gimenez-Monzo, Daniel; Navarrete-Muñoz, Eva Maria; Garcia-de-la-Hera, Manuela; Fernandez-Somoano, Ana; Tardon, Adonina; Santa Marina, Loreto; Irazabal, Amaia; Romaguera, Dora; Guxens, Mònica; Julvez, Jordi; Llop, Sabrina; Lopez-Espinosa, Maria-Jose; Vioque, Jesus
2016-10-10
Trans fatty acid (TFA) intake has been identified as a health hazard in adults, but data on preschool children are scarce. We analyzed the data from the Spanish INMA Project to determine the intake of total, industrial, and natural TFA, their main sources, and the associated socio-demographic and lifestyle factors in children aged 4-5 (n = 1793). TFA intake was estimated using a validated Food Frequency Questionnaire, and multiple linear regression was used to explore associated factors. The mean daily intakes of total, industrial, and natural TFA were 1.36, 0.60, and 0.71 g/day, respectively. Ten percent of the children obtained >1% of their energy intake from TFA. The main sources of industrial TFA were fast food, white bread, and processed baked goods. Milk, red and processed meat, and processed baked goods were the main sources of natural TFA. Having parents from countries other than Spain was significantly associated with higher natural TFA intake in mg/day (β = 45.5), and television viewing was significantly associated with higher industrial TFA intake (β = 18.3). Higher fruit and vegetable intake was significantly associated with lower intakes of all TFAs, whereas higher sweetened beverage intake was significantly associated with lower total and natural TFA intake. Thus, total and industrial TFA intake was associated with less healthy food patterns and lifestyles in Spanish preschool children.
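The β coefficients above come from multiple linear regression; the mechanics can be illustrated with a single-predictor least-squares fit in Python (the numbers below are invented for illustration and are not the INMA data):

```python
def ols_fit(x, y):
    """Least-squares fit of y = a + b*x: b = cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    b = cov / var
    return my - b * mx, b  # intercept, slope

# Hypothetical data: daily TV hours vs. industrial-TFA intake (mg/day)
tv_hours = [0, 1, 2, 3, 4]
tfa_mg   = [560, 580, 595, 615, 635]
a, b = ols_fit(tv_hours, tfa_mg)
print(b)  # 18.5 -> interpreted as mg/day of extra intake per TV hour
```

In the actual study the model is multivariable (several predictors fitted jointly), but each reported β has this same interpretation: the change in TFA intake per unit change in the predictor, holding the others fixed.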
Opportunities and challenges of real-time release testing in biopharmaceutical manufacturing.
Jiang, Mo; Severson, Kristen A; Love, John Christopher; Madden, Helena; Swann, Patrick; Zang, Li; Braatz, Richard D
2017-11-01
Real-time release testing (RTRT) is defined as "the ability to evaluate and ensure the quality of in-process and/or final drug product based on process data, which typically includes a valid combination of measured material attributes and process controls" (ICH Q8[R2]). This article discusses sensors (process analytical technology, PAT) and control strategies that enable RTRT for the spectrum of critical quality attributes (CQAs) in biopharmaceutical manufacturing. Case studies from the small-molecule and biologic pharmaceutical industry are described to demonstrate how RTRT can be facilitated by integrated manufacturing and multivariable control strategies to ensure the quality of products. RTRT can enable increased assurance of product safety, efficacy, and quality-with improved productivity including faster release and potentially decreased costs-all of which improve the value to patients. To implement a complete RTRT solution, biologic drug manufacturers need to consider the special attributes of their industry, particularly sterility and the measurement of viral and microbial contamination. Continued advances in on-line and in-line sensor technologies are key for the biopharmaceutical manufacturing industry to achieve the potential of RTRT. Related article: http://onlinelibrary.wiley.com/doi/10.1002/bit.26378/full. © 2017 Wiley Periodicals, Inc.
On the Modeling of Vacuum Arc Remelting Process in Titanium Alloys
NASA Astrophysics Data System (ADS)
Patel, Ashish; Fiore, Daniel
2016-07-01
Mathematical modeling is routinely used in the process development and production of advanced aerospace alloys to gain greater insight into the effect of process parameters on final properties. This article describes the application of a 2-D mathematical VAR model presented at previous LMPC meetings. The impact of process parameters on melt pool geometry, solidification behavior, fluid flow, and chemistry in a Ti-6Al-4V ingot is discussed. Model predictions are validated against published data from an industrial-size ingot, and the results of a parametric study on particle dissolution are also discussed.
Space station needs, attributes, and architectural options. Volume 1. Executive summary
NASA Technical Reports Server (NTRS)
Pritchard, E. B.
1983-01-01
The initial space station should be manned, placed in 28.5 deg orbit, and provide substantial economic, performance, and social benefits. The most beneficial space station capabilities include: a space test facility; a transport harbor; satellite servicing and assembly; and an observatory. A space industrial park could be added once further development effort validates the cost and expanding commercial market for space processed materials. The potential accrued gross mission model benefit derived from these capabilities is $5.9B without the industrial park, and $9.3B with it. An unclassified overview of all phases of the study is presented.
Knowledge Repository for FMEA-Related Knowledge
NASA Astrophysics Data System (ADS)
Cândea, Gabriela Simona; Kifor, Claudiu Vasile; Cândea, Ciprian
2014-11-01
This paper presents an innovative use of a knowledge system in the Failure Mode and Effects Analysis (FMEA) process, using an ontology to represent the knowledge. The knowledge system is built to serve the multi-project work that nowadays takes place in any manufacturing or service provider, where knowledge must be retained and reused at the company level, not only at the project level. The system follows the FMEA methodology, and the validation of the concept is compliant with the automotive industry standards published by the Automotive Industry Action Group, among others. Collaboration is assured through a web-based GUI that supports multi-user access at any time.
Advanced bulk processing of lightweight materials for utilization in the transportation sector
NASA Astrophysics Data System (ADS)
Milner, Justin L.
The overall objective of this research is to develop the microstructure of metallic lightweight materials via multiple advanced processing techniques with potential for large-scale industrial utilization, to meet the demands of the aerospace and automotive sectors. This work focused on (i) refining the grain structure to increase strength, (ii) controlling the texture to increase formability, and (iii) directly reducing the processing/production cost of lightweight material components. Advanced processing is conducted on a bulk scale by several severe plastic deformation techniques, including accumulative roll bonding, isolated shear rolling and friction stir processing, to achieve the multiple targets of this research. Development and validation of the processing techniques is achieved through wide-ranging experiments along with detailed mechanical and microstructural examination of the processed material. On a broad level, this research makes advancements in the processing of bulk lightweight materials, facilitating industrial-scale implementation. Accumulative roll bonding and isolated shear rolling, currently feasible on an industrial scale, process bulk sheet materials capable of replacing more expensive grades of alloys and enabling low-temperature and high-strain-rate formability. Furthermore, friction stir processing to manufacture lightweight tubes made from magnesium alloys has the potential to increase the utilization of these materials in the automotive and aerospace sectors for high-strength, high-formability applications. Increased adoption of these advanced processing techniques will significantly reduce the cost associated with lightweight materials for many applications in the transportation sector.
Large Aircraft Robotic Paint Stripping (LARPS) system and the high pressure water process
NASA Astrophysics Data System (ADS)
See, David W.; Hofacker, Scott A.; Stone, M. Anthony; Harbaugh, Darcy
1993-03-01
The aircraft maintenance industry is beset by new Environmental Protection Agency (EPA) guidelines on air emissions, Occupational Safety and Health Administration (OSHA) standards, dwindling labor markets, Federal Aviation Administration (FAA) safety guidelines, and increased operating costs. In light of these factors, the USAF's Wright Laboratory Manufacturing Technology Directorate and the Aircraft Division of the Oklahoma City Air Logistics Center initiated a MANTECH/REPTECH effort to automate an alternate paint removal method and eliminate the current manual methylene chloride chemical stripping methods. This paper presents some of the background and history of the LARPS program, describes the LARPS system, documents the projected operational flow, quantifies some of the projected system benefits and describes the High Pressure Water Stripping Process. Certification of an alternative paint removal method to replace the current chemical process is being performed in two phases: Process Optimization and Process Validation. This paper also presents the results of the Process Optimization phase for metal substrates. Data on the coating removal rate, residual stresses, surface roughness, preliminary process envelopes, and technical plans for Process Validation testing will be discussed.
Mathematical modeling of simultaneous carbon-nitrogen-sulfur removal from industrial wastewater.
Xu, Xi-Jun; Chen, Chuan; Wang, Ai-Jie; Ni, Bing-Jie; Guo, Wan-Qian; Yuan, Ye; Huang, Cong; Zhou, Xu; Wu, Dong-Hai; Lee, Duu-Jong; Ren, Nan-Qi
2017-01-05
A mathematical model of carbon, nitrogen and sulfur removal (C-N-S) from industrial wastewater was constructed considering the interactions of sulfate-reducing bacteria (SRB), sulfide-oxidizing bacteria (SOB), nitrate-reducing bacteria (NRB), facultative bacteria (FB), and methane producing archaea (MPA). For the kinetic network, the bioconversion of C-N by heterotrophic denitrifiers (NO₃⁻ → NO₂⁻ → N₂), and that of C-S by SRB (SO₄²⁻ → S²⁻) and SOB (S²⁻ → S⁰) was proposed and calibrated based on batch experimental data. The model closely predicted the profiles of nitrate, nitrite, sulfate, sulfide, lactate, acetate, methane and oxygen under both anaerobic and micro-aerobic conditions. The best-fit kinetic parameters had small 95% confidence regions with mean values approximately at the center. The model was further validated using independent data sets generated under different operating conditions. This work was the first successful mathematical modeling of simultaneous C-N-S removal from industrial wastewater and, more importantly, the proposed model was proven feasible to simulate other relevant processes, such as the sulfate-reducing, sulfide-oxidizing (SR-SO) process and the denitrifying sulfide removal (DSR) process. The model developed is expected to enhance our ability to predict the treatment of carbon-nitrogen-sulfur contaminated industrial wastewater. Copyright © 2016 Elsevier B.V. All rights reserved.
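Kinetic networks of this kind are typically built from Monod-type rate expressions. The following is a minimal forward-Euler sketch of single-substrate Monod growth (e.g. SRB consuming sulfate); the parameter values (mu_max, ks, yield) are illustrative placeholders, not the calibrated values from the paper:

```python
def simulate(s0=200.0, x0=50.0, mu_max=0.25, ks=25.0, yield_=0.1,
             dt=0.01, t_end=24.0):
    """Forward-Euler integration of single-substrate Monod growth.

    s: substrate concentration (e.g. sulfate, mg/L)
    x: biomass concentration (e.g. SRB, mg/L)
    """
    s, x = s0, x0
    steps = int(t_end / dt)
    for _ in range(steps):
        mu = mu_max * s / (ks + s)      # specific growth rate, 1/h
        ds = -(mu / yield_) * x * dt    # substrate consumed for growth
        x += mu * x * dt                # biomass grows
        s = max(s + ds, 0.0)            # substrate cannot go negative
    return s, x
```

A full C-N-S model couples several such terms (one per bacterial group and substrate pair) plus inhibition factors; this sketch only shows the common building block.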
Application of process tomography in gas-solid fluidised beds in different scales and structures
NASA Astrophysics Data System (ADS)
Wang, H. G.; Che, H. Q.; Ye, J. M.; Tu, Q. Y.; Wu, Z. P.; Yang, W. Q.; Ocone, R.
2018-04-01
Gas-solid fluidised beds are commonly used in particle-related processes, e.g. for coal combustion and gasification in the power industry, and for coating and granulation in the pharmaceutical industry. Because the operation efficiency depends on the gas-solid flow characteristics, it is necessary to investigate the flow behaviour. This paper is about the application of process tomography, including electrical capacitance tomography (ECT) and microwave tomography (MWT), in multi-scale gas-solid fluidisation processes in the pharmaceutical and power industries. This is the first time that both ECT and MWT have been applied for this purpose in multi-scale and complex structures. To evaluate the sensor design and image reconstruction, and to investigate the effects of sensor structure and dimension on image quality, a normalised sensitivity coefficient is introduced. In addition, computational fluid dynamics (CFD) analysis based on a computational particle fluid dynamics (CPFD) model and a two-phase fluid model (TFM) is used. Part of the CPFD-TFM simulation results are compared with and validated against experimental results from ECT and/or MWT. By both simulation and experiment, the complex hydrodynamic flow behaviour at different scales is analysed. Time-series capacitance data are analysed in both the time and frequency domains to reveal the flow characteristics.
Hamawand, Ihsan; Pittaway, Pam; Lewis, Larry; Chakrabarty, Sayan; Caldwell, Justin; Eberhard, Jochen; Chakraborty, Arpita
2017-02-01
This article addresses the novel dewatering process of immersion-frying paunch and dissolved air flotation (DAF) sludge to produce high-energy pellets. The literature has been analysed to address the feasibility of replacing conventional boiler fuel at meat processing facilities with high-energy paunch-DAF sludge pellets (capsules). The value proposition of pelleting and frying this mixture into energy pellets is based on a Cost-Benefit Analysis (CBA). The CBA is based on information derived from the literature and consultation with the Australian Meat Processing Industry. The calorific properties of a mixture of paunch cake solids and DAF sludge were predicted from the literature and industry consultation to validate the product. This study shows that the concept of pelletizing and frying paunch is economically feasible. The complete frying and dewatering of the paunch and DAF sludge mixture produces pellets with an energy content per kilogram equivalent to coal. The estimated cost of this new product is half the price of coal, and the payback period is estimated to be between 1.8 and 3.2 years. Further research is required for proof of concept, and to identify the technical challenges associated with integrating this technology into existing meat processing plants. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
Integration of design and inspection
NASA Astrophysics Data System (ADS)
Simmonds, William H.
1990-08-01
Developments in advanced computer integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real life advanced computer integrated manufacturing facility for demonstration and evaluation.
Adaptive inferential sensors based on evolving fuzzy models.
Angelov, Plamen; Kordon, Arthur
2010-04-01
A new technique for the design and use of inferential sensors in the process industry is proposed in this paper, based on the recently introduced concept of evolving fuzzy models (EFMs). These address the challenge that the modern process industry faces today, namely, developing adaptive and self-calibrating online inferential sensors that reduce maintenance costs while keeping high precision and interpretability/transparency. The proposed methodology makes it possible for inferential sensors to recalibrate automatically, which significantly reduces the life-cycle effort of their maintenance. This is achieved by the adaptive and flexible open-structure EFM used. The novelty of this paper lies in the following: (1) the overall concept of inferential sensors with an evolving and self-developing structure learned from data streams; (2) the new methodology for online automatic selection of the input variables that are most relevant for the prediction; (3) the technique to automatically detect a shift in the data pattern using the age of the clusters (and fuzzy rules); (4) the online standardization technique used by the learning procedure of the evolving model; and (5) the application of this innovative approach to several real-life industrial processes from the chemical industry (evolving inferential sensors, namely, eSensors, were used for predicting the chemical properties of different products in The Dow Chemical Company, Freeport, TX). It should be noted, however, that the methodology and conclusions of this paper are valid for the broader area of chemical and process industries in general. The results demonstrate that well-interpretable inferential sensors with simple structure can be designed automatically from the data stream in real time, predicting various process variables of interest.
The proposed approach can be used as a basis for the development of a new generation of adaptive and evolving inferential sensors that can address the challenges of the modern advanced process industry.
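One of the listed novelties, online standardization within the learning procedure, can be illustrated with running statistics. The paper does not specify its exact algorithm; Welford's method is assumed here as a common streaming choice:

```python
class OnlineStandardizer:
    """Welford running mean/variance so a streaming sensor input can be
    standardized without storing its history (an assumed technique; the
    paper does not name its exact algorithm)."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self._m2 = 0.0   # sum of squared deviations from the running mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (x - self.mean)

    def std(self):
        # Sample standard deviation; zero until two points have been seen
        return (self._m2 / (self.n - 1)) ** 0.5 if self.n > 1 else 0.0

    def transform(self, x):
        s = self.std()
        return (x - self.mean) / s if s > 0 else 0.0

# Usage: feed the stream, then standardize a new observation
stats = OnlineStandardizer()
for v in [1.0, 2.0, 3.0, 4.0, 5.0]:
    stats.update(v)
z = stats.transform(6.0)
```

The point of the recursive form is that each new sample updates the scaling in O(1), which is what lets an eSensor-style model recalibrate continuously instead of in batch.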
1992 NASA Life Support Systems Analysis workshop
NASA Technical Reports Server (NTRS)
Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.
1992-01-01
The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.
An Overview of the Processes of Social Transition in Rural Appalachia. Discussion Paper.
ERIC Educational Resources Information Center
Photiadis, John D.
There are two very distinct schools of thought concerning the causation of many of Appalachia's problems. One school treats the region's socio-cultural peculiarities as the major cause of developmental problems; the other blames the coal industry. This paper suggests that, at the base, both schools of thought are valid in explaining cause and…
Ruiz-Gonzalez, Ruben; Gomez-Gil, Jaime; Gomez-Gil, Francisco Javier; Martínez-Martínez, Víctor
2014-01-01
The goal of this article is to assess the feasibility of estimating the state of various rotating components in agro-industrial machinery by employing just one vibration signal acquired from a single point on the machine chassis. To do so, a Support Vector Machine (SVM)-based system is employed. Experimental tests evaluated this system by acquiring vibration data from a single point of an agricultural harvester, while varying several of its working conditions. The whole process included two major steps. Initially, the vibration data were preprocessed through twelve feature extraction algorithms, after which the Exhaustive Search method selected the most suitable features. Secondly, the SVM-based system accuracy was evaluated by using Leave-One-Out cross-validation, with the selected features as the input data. The results of this study provide evidence that (i) accurate estimation of the status of various rotating components in agro-industrial machinery is possible by processing the vibration signal acquired from a single point on the machine structure; (ii) the vibration signal can be acquired with a uniaxial accelerometer, the orientation of which does not significantly affect the classification accuracy; and, (iii) when using an SVM classifier, an 85% mean cross-validation accuracy can be reached, which only requires a maximum of seven features as its input, and no significant improvements are noted between the use of either nonlinear or linear kernels. PMID:25372618
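The two-step procedure described (feature-based training followed by Leave-One-Out evaluation) can be sketched as follows. This is a minimal stand-in, not the authors' pipeline: a toy subgradient-descent linear SVM on synthetic two-class data replaces the twelve feature extraction algorithms and real vibration signals:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Subgradient descent on the regularized hinge loss; labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:       # margin violated: hinge-loss step
                w += lr * (yi * xi - lam * w)
                b += lr * yi
            else:                            # margin satisfied: only shrink w
                w -= lr * lam * w
    return w, b

def loo_accuracy(X, y):
    """Leave-One-Out CV: train on n-1 samples, test on the held-out one."""
    n = len(y)
    hits = 0
    for i in range(n):
        mask = np.arange(n) != i
        w, b = train_linear_svm(X[mask], y[mask])
        hits += int(np.sign(X[i] @ w + b) == y[i])
    return hits / n

# Toy, well-separated two-class data standing in for the selected features
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (10, 2)), rng.normal(2.0, 0.3, (10, 2))])
y = np.array([-1] * 10 + [1] * 10)
acc = loo_accuracy(X, y)
```

Because each fold holds out exactly one sample, LOO gives an almost-unbiased accuracy estimate on small datasets, at the cost of retraining the classifier n times.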
Thomas, Craig E; Will, Yvonne
2012-02-01
Attrition in the drug industry due to safety findings remains high and requires a shift in the current safety testing paradigm. Many companies are now positioning safety assessment at each stage of the drug development process, including discovery, where an early perspective on potential safety issues is sought, often at chemical scaffold level, using a variety of emerging technologies. Given the lengthy development time frames of drugs in the pharmaceutical industry, the authors believe that the impact of new technologies on attrition is best measured as a function of the quality and timeliness of candidate compounds entering development. The authors provide an overview of in silico and in vitro models, as well as more complex approaches such as 'omics,' and where they are best positioned within the drug discovery process. It is important to take away that not all technologies should be applied to all projects. Technologies vary widely in their validation state, throughput and cost. A thoughtful combination of validated and emerging technologies is crucial in identifying the most promising candidates to move to proof-of-concept testing in humans. In spite of the challenges inherent in applying new technologies to drug discovery, the successes and recognition that we cannot continue to rely on safety assessment practices used for decades have led to rather dramatic strategy shifts and fostered partnerships across government agencies and industry. We are optimistic that these efforts will ultimately benefit patients by delivering effective and safe medications in a timely fashion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skaggs, B.E.
1995-07-01
The Environmental Protection Agency staff published the final storm water regulation on November 16, 1990. The storm water regulation is included in the National Pollutant Discharge Elimination System (NPDES) regulations. It specifies the permit application requirements for certain storm water discharges such as industrial activity or municipal separate storm sewers serving populations of 100,000 or greater. Storm water discharge associated with industrial activity is discharge from any conveyance used for collecting and conveying storm water that is directly related to manufacturing, processing, or raw material storage areas at an industrial plant. Quantitative testing data is required for these discharges. An individual storm water permit application was completed and submitted to Tennessee Department of Environment and Conservation (TDEC) personnel in October 1992. After reviewing this data in the permit application, TDEC personnel expressed concern with the fecal coliform levels at many of the outfalls. The 1995 NPDES Permit (Part III-N, page 44) requires that an investigation be conducted to determine the validity of this data. If the fecal coliform data is valid, the permit requires that a report be submitted indicating possible causes and proposed corrective actions.
Bamboo–Polylactic Acid (PLA) Composite Material for Structural Applications
Pozo Morales, Angel; Güemes, Alfredo; Fernandez-Lopez, Antonio; Carcelen Valero, Veronica; De La Rosa Llano, Sonia
2017-01-01
Developing an eco-friendly industry based on green materials, sustainable technologies, and optimum processes with low environmental impact is a general societal goal, but this remains a considerable challenge to achieve. Despite the large body of research on green structural composites, there has been limited investigation into the most appropriate manufacturing methodology for developing a structural material at the industrial level. Laboratory panels have been manufactured with different natural fibers, but the methodologies and values obtained could not be extrapolated to the industrial level. The use of industrial bamboo panels has increased in secondary structural sectors such as building, flooring and sporting goods, because bamboo is one of the cheapest raw materials. At the industrial level, the panels are manufactured with only the inner and intermediate regions of the bamboo culm. However, it has been found that the mechanical properties of the external shells of the bamboo culm are much better than the average cross-sectional properties. Thin strips of bamboo (1.5 mm thick and 1500 mm long) were machined and arranged with the desired lay-up and shape to obtain laminates with specific properties better than those of conventional E-Glass/Epoxy laminates in terms of both strength and stiffness. The strips of bamboo were bonded together by a natural thermoplastic polylactic acid (PLA) matrix to meet biodegradability requirements. The innovative mechanical extraction process developed in this study can extract natural strip reinforcements with high performance, low cost, and high rate, with no negative environmental impact, as no chemical treatments are used. The process can be performed at the industrial level. Furthermore, in order to validate the structural applications of the composite, the mechanical properties were analyzed under ageing conditions.
This material could satisfy the requirements for adequate mechanical properties and life cycle costs at industrial sectors such as energy or automotive. PMID:29120398
Experimental and computational fluid dynamic studies of mixing for complex oral health products
NASA Astrophysics Data System (ADS)
Garcia, Marti Cortada; Mazzei, Luca; Angeli, Panagiota
2015-11-01
Mixing highly viscous non-Newtonian fluids is common in the consumer health industry. Sometimes this process is empirical and involves many product-specific pilot-plant trials. The first step in studying the mixing process is to build knowledge of the rheology of the fluids involved. In this research a systematic approach is used to validate the rheology of two liquids: glycerol and a gel formed by polyethylene glycol and carbopol. Initially, the constitutive equation is determined, which relates the viscosity of the fluids to temperature, shear rate, and concentration. The key variable for the validation is the power required for mixing, which can be obtained both from CFD and experimentally, using a stirred tank and impeller of well-defined geometries at different impeller speeds. Good agreement between the two values indicates a successful validation of the rheology and allows the CFD model to be used to study mixing in the complex vessel geometries and increased sizes encountered during scale-up.
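The power comparison described above rests on standard stirred-tank relations rather than anything specific to this paper. As a hedged sketch: in the fully turbulent regime P = Np ρ N³ D⁵ with a constant power number Np, and the impeller Reynolds number decides whether that constant-Np assumption is valid. The Np, density and viscosity values below are illustrative:

```python
def mixing_power(power_number, rho, n, d):
    """P = Np * rho * N^3 * D^5 (fully turbulent regime, constant Np).

    rho in kg/m^3, n in rev/s, d (impeller diameter) in m; returns watts.
    """
    return power_number * rho * n**3 * d**5

def impeller_reynolds(rho, n, d, mu):
    """Re = rho * N * D^2 / mu; constant Np only holds at high Re."""
    return rho * n * d**2 / mu

# Illustrative numbers: a Rushton-type impeller (Np ~ 5) in glycerol
p = mixing_power(5.0, 1260.0, 5.0, 0.1)        # W
re = impeller_reynolds(1260.0, 5.0, 0.1, 1.0)  # dimensionless
```

At the Reynolds number this example produces (~63) the flow is transitional, so in practice Np would be read from a power-number curve rather than taken as constant; the CFD-versus-experiment power comparison in the study sidesteps that correlation entirely.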
Computational fluid dynamics modeling of laboratory flames and an industrial flare.
Singh, Kanwar Devesh; Gangadharan, Preeti; Chen, Daniel H; Lou, Helen H; Li, Xianchang; Richmond, Peyton
2014-11-01
A computational fluid dynamics (CFD) methodology for simulating the combustion process has been validated with experimental results. Three different experimental setups were used to validate the CFD model: an industrial-scale flare setup and two lab-scale flames. The CFD study also involved three different fuels: C3H6/CH/Air/N2, C2H4/O2/Ar and CH4/Air. In the first setup, flare efficiency data from the Texas Commission on Environmental Quality (TCEQ) 2010 field tests were used to validate the CFD model. In the second setup, a McKenna burner with flat flames was simulated, and temperature and mass fractions of important species were compared with the experimental data. Finally, results of an experimental study done at Sandia National Laboratories to generate a lifted jet flame were used for the purpose of validation. The reduced 50-species mechanism LU 1.1, the realizable k-epsilon turbulence model, and the EDC turbulence-chemistry interaction model were used for this work. Flare efficiency, axial profiles of temperature, and mass fractions of various intermediate species obtained in the simulation were compared with experimental data, and good agreement between the profiles was clearly observed. In particular, the simulation match with the TCEQ 2010 flare tests has been significantly improved (within 5% of the data) compared to the results reported by Singh et al. in 2012. Validation of the speciated flat flame data supports the view that flares can be a primary source of formaldehyde emission.
Microbiological corrosion of ASTM SA105 carbon steel pipe for industrial fire water usage
NASA Astrophysics Data System (ADS)
Chidambaram, S.; Ashok, K.; Karthik, V.; Venkatakrishnan, P. G.
2018-02-01
A large number of metallic systems have been developed over the last few decades to resist both general uniform corrosion and localized corrosion. Among these, microbiologically induced corrosion (MIC) is notable for being multidisciplinary and complex in nature. Many chemical processing industries utilize fresh water for fire service to extinguish major or minor fires. One such fire water service line pipe was attacked by micro-organisms, leading to leakage, which is industrially important from a safety point of view. Large numbers of leaks have also been reported in the similar fire water services of a nearby food processing plant, paper and pulp plant, steel plant, electricity board, etc. In the present investigation, one such industrial fire water service line failure of a carbon steel line pipe was analyzed to determine the cause of failure. The water sample was subjected to various chemical and bacterial analyses: turbidity, pH, calcium hardness, free chlorine, oxidation-reduction potential, fungi, yeasts, sulphate-reducing bacteria (SRB) and total bacteria (TB) were measured. The corrosion rate was measured on steel samples, and corrosion coupons were installed in the fire water to validate non-flow-assisted localized corrosion. The sulphate-reducing bacteria (SRB) present in the fire water cause a localized microbiological corrosion attack on the line pipe.
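Coupon-based corrosion rates of the kind measured here are conventionally computed from weight loss. A minimal sketch using the ASTM G31-style formula (the constant 3.45 × 10⁶ yields mils per year); the coupon numbers are illustrative, not from this study:

```python
def corrosion_rate_mpy(weight_loss_g, area_cm2, hours, density_g_cm3):
    """ASTM G31-style weight-loss corrosion rate.

    K = 3.45e6 gives the rate in mils per year (mpy) when weight loss is
    in grams, exposed area in cm^2, exposure time in hours, and density
    in g/cm^3.
    """
    return 3.45e6 * weight_loss_g / (area_cm2 * hours * density_g_cm3)

# Illustrative coupon: 0.05 g lost over 30 days (720 h) on 10 cm^2
# of carbon steel (density ~7.86 g/cm^3)
rate = corrosion_rate_mpy(0.05, 10.0, 720.0, 7.86)
```

Note that weight loss averages the attack over the whole coupon, so a uniform-rate number like this understates MIC, which is localized; pit-depth measurement is the usual complement.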
2013-06-01
ABBREVIATIONS: ANSI, American National Standards Institute; ASIS, American Society of Industrial Security; CCTV, Closed Circuit Television; CONOPS, ... is globally recognized for the development and maintenance of standards. ASTM defines a specification as an explicit set of requirements ... www.rkb.us/saver/. One of the SAVER reports, titled CCTV Technology Handbook, has a chapter on system design. The report uses terms like functional
USDA-ARS?s Scientific Manuscript database
The beef industry must provide documentation to the regulatory agency that the antimicrobial interventions implemented or any subsequent change in the process is effective under the actual conditions that apply in its operation. The main objective of this study was to determine whether surface pH af...
Failure mode and effects analysis outputs: are they valid?
Shebl, Nada Atef; Franklin, Bryony Dean; Barber, Nick
2012-06-10
Failure Mode and Effects Analysis (FMEA) is a prospective risk assessment tool that has been widely used within the aerospace and automotive industries and has been utilised within healthcare since the early 1990s. The aim of this study was to explore the validity of FMEA outputs within a hospital setting in the United Kingdom. Two multidisciplinary teams each conducted an FMEA for the use of vancomycin and gentamicin. Four different validity tests were conducted: face validity, by comparing the FMEA participants' mapped processes with observational work; content validity, by presenting the FMEA findings to other healthcare professionals; criterion validity, by comparing the FMEA findings with data reported on the trust's incident report database; and construct validity, by exploring the relevant mathematical theories involved in calculating the FMEA risk priority number (RPN). Face validity was positive, as the researcher documented the same processes of care as mapped by the FMEA participants. However, other healthcare professionals identified potential failures missed by the FMEA teams. Furthermore, the FMEA groups failed to include failures related to omitted doses, yet these were the failures most commonly reported in the trust's incident database. Calculating the RPN by multiplying severity, probability and detectability scores was deemed invalid because it is based on calculations that breach the mathematical properties of the scales used. There are significant methodological challenges in validating FMEA. It is a useful tool to aid multidisciplinary groups in mapping and understanding a process of care; however, the results of our study cast doubt on its validity. FMEA teams are likely to need different sources of information, besides their personal experience and knowledge, to identify potential failures.
As for FMEA's methodology for scoring failures, there were discrepancies between the teams' estimates and similar incidents reported on the trust's incident database. Furthermore, the concept of multiplying ordinal scales to prioritise failures is mathematically flawed. Until FMEA's validity is further explored, healthcare organisations should not solely depend on their FMEA results to prioritise patient safety issues.
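The RPN criticism above can be illustrated in a few lines; the scores below are hypothetical, chosen to show how multiplying ordinal scales collapses very different risk profiles onto the same priority value.

```python
# Illustrative FMEA risk priority number (hypothetical scores, not from the study).
def rpn(severity, probability, detectability):
    """Conventional RPN: S x P x D, each scored on a 1-10 ordinal scale."""
    return severity * probability * detectability

# Two hypothetical failure modes with very different risk profiles:
a = rpn(9, 2, 2)   # severe, but rare and easy to detect -> 36
b = rpn(2, 9, 2)   # minor, but frequent -> 36
# Identical RPNs despite dissimilar profiles: one reason multiplying
# ordinal scores is considered mathematically questionable.
print(a, b)
```

Because the inputs are ordinal (ranked categories, not true quantities), arithmetic on them has no defined meaning, which is the flaw the study highlights.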
Cognitive Bias in the Verification and Validation of Space Flight Systems
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. 
A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.
Tan, Xiaodong; Lin, Jianyan; Wang, Fengjie; Luo, Hong; Luo, Lan; Wu, Lei
2007-09-01
This study was designed to understand the status of HIV/AIDS knowledge, attitude and practice (KAP) among different populations and to provide scientific evidence for further health education. Three rounds of questionnaires were administered among service industry workers selected through stratified cluster sampling. Study subjects included hotel attendants, employees of beauty parlors and service workers in the transportation industry. Data were analyzed using the analytical hierarchy process. All groups demonstrated high KAP overall: synthetic scoring indexes in all three surveys were above 75%. However, the rates of correct responses on whether mosquito bites can transmit HIV/AIDS and on the relationship between STDs and HIV were lower than expected, and attitudes towards people living with HIV and AIDS needed improvement. Moreover, the effect of health education on these groups was unclear. In conclusion, the analytical hierarchy process is a valid method for estimating the overall effect of HIV/AIDS health education. Although the present status of HIV/AIDS KAP among service industry workers was relatively good, greater efforts should be made to improve their knowledge of HIV transmission, their attitudes and their understanding of the relationship between STDs and HIV.
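The core computation of the analytical hierarchy process, deriving priority weights from a pairwise comparison matrix via its principal eigenvector, can be sketched as follows. The comparison matrix and criteria are hypothetical, not taken from the survey instrument.

```python
import numpy as np

# Hypothetical pairwise comparisons of three criteria (e.g. knowledge vs
# attitude vs practice), on Saaty's 1-9 scale; A[i][j] = importance of i over j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue
w = eigvecs[:, k].real
weights = w / w.sum()                    # normalised priority weights

# Consistency ratio guards against contradictory judgements
# (CR < 0.1 is the usual threshold; random index RI = 0.58 for 3x3).
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
cr = ci / 0.58
print(weights.round(3), round(cr, 3))
```

The weighted criterion scores can then be combined into an overall synthetic index such as the one reported in the abstract.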
Rodríguez, N Husillos; Granados, R J; Blanco-Varela, M T; Cortina, J L; Martínez-Ramírez, S; Marsal, M; Guillem, M; Puig, J; Fos, C; Larrotcha, E; Flores, J
2012-03-01
This paper describes an industrial process for stabilising sewage sludge (SS) with lime and evaluates the viability of the stabilised product, denominated Neutral, as a raw material for the cement industry. Lime not only stabilised the sludge but also raised the temperature of the mix to 80-100°C, furthering water evaporation, portlandite formation and the partial oxidation of the organic matter present in the sludge. Process mass and energy balances were determined. Neutral, a white powder consisting of portlandite (49.8%), calcite (16.6%), inorganic oxides (13.4%) and organic matter and moisture (20.2%), proved to be technologically apt for inclusion as a component in cement raw mixes. In this study, it was used instead of limestone in raw mixes clinkerised at 1400, 1450 and 1500°C. These raw meals exhibited greater reactivity at high temperatures than the limestone product, and their calcination at 1500°C yielded clinker containing over 75% calcium silicates, the key phases in Portland clinker. Finally, the two types of raw meal (Neutral and limestone) were observed to exhibit similar mineralogy and crystal size and distribution. Published by Elsevier Ltd.
Ruszczyńska, A; Szteyn, J; Wiszniewska-Laszczych, A
2007-01-01
Producing dairy products which are safe for consumers requires constant monitoring of the microbiological quality of the raw material, the production process itself and the end product. Traditional methods, still the "gold standard", require a specialized laboratory working with recognized and validated methods. Obtaining results is time- and labor-consuming and does not allow rapid evaluation. Hence, there is a need for a rapid, precise method enabling the real-time monitoring of microbiological quality, and flow cytometry serves this function well. It is based on labeling cells suspended in a solution with fluorescent dyes and pumping them into a measurement zone where they are exposed to a precisely focused laser beam. This paper is aimed at presenting the possibilities of applying flow cytometry in the dairy industry.
NASA Technical Reports Server (NTRS)
Bache, George
1993-01-01
Validation of CFD codes is a critical first step in the process of developing CFD design capability. The MSFC Pump Technology Team has recognized the importance of validation and has funded several experimental programs designed to obtain CFD-quality validation data. The first data set to become available is for the SSME High Pressure Fuel Turbopump Impeller. LDV data were taken at the impeller inlet (to obtain a reliable inlet boundary condition) and at three radial positions at the impeller discharge. Our CFD code, TASCflow, is used within the propulsion and commercial pump industry as a tool for pump design. The objective of this work, therefore, is to further validate TASCflow for application in pump design. TASCflow was used to predict flow at the impeller discharge for flowrates of 80, 100 and 115 percent of design flow. Comparison to data has been made with encouraging results.
NASA Astrophysics Data System (ADS)
Rehman, Minhaj Ahemad Abdul; Shrivastava, Rakesh Lakshmikumar; Shrivastava, Rashmi Rakesh
2017-04-01
Green Manufacturing (GM) deals with manufacturing practices that reduce or eliminate adverse environmental impact during any phase of manufacturing. It emphasizes the use of processes that do not contaminate the environment or harm consumers, employees, or other stakeholders. This paper presents a comparative analysis of two Indian industries representing different sectors, undertaken to validate a GM framework. It also highlights the companies' road maps for achieving performance improvement through GM implementation and its impact on organisational performance. The case studies help in evaluating the companies' GM implementation and overall business performance. For this purpose, a diagnostic instrument in the form of a questionnaire was administered to employees in the respective companies and their responses were analysed. In order to better understand the impact of GM implementation, information about overall business performance over the last 3 years was also obtained. The diagnostic instrument developed here may be used by manufacturing organisations to prioritise their management efforts in assessing and implementing GM.
Risk assessment of occupational exposure to heavy metal mixtures: a study protocol.
Omrane, Fatma; Gargouri, Imed; Khadhraoui, Moncef; Elleuch, Boubaker; Zmirou-Navier, Denis
2018-03-05
Sfax is a highly industrialized city in southern Tunisia where heavy metals (HMs) pollution is now well established. The health of its residents, mainly those engaged in metals-based industrial activities, is under threat: such workers are exposed to a variety of HMs mixtures, and this exposure is cumulative. Whereas current HMs exposure assessment is mainly carried out using direct air monitoring approaches, the present study aims to assess health risks associated with chronic occupational exposure to HMs in industry using a modeling approach that will be validated later on. To this end, two questionnaires were used. The first was an identification/descriptive questionnaire aimed at identifying, for each company, the specific activities, materials used, manufactured products and number of employees exposed. The second related to the job-task of the exposed persons, workplace characteristics (dimensions, ventilation, etc.), types of metals and the emission configuration in space and time. Indoor air HMs concentrations were predicted based on the mathematical models generally used to estimate occupational exposure to volatile substances (such as solvents). Later on, in order to validate the adopted model, air monitoring will be carried out, as well as biological monitoring aimed at assessing HMs excretion in the urine of workers volunteering to participate. Lastly, an interaction-based hazard index HI int and a decision support tool will be used to predict the cumulative risk assessment for HMs mixtures. One hundred sixty-one persons working in the five participating companies have been identified; of these, 110 are directly engaged with HMs in the course of the manufacturing process. This model-based prediction of occupational exposure represents an alternative tool that is both time-saving and cost-effective in comparison with direct air monitoring approaches.
Following validation of the different models according to job processes, via comparison with direct measurements and exploration of correlations with biological monitoring, these estimates will allow a cumulative risk characterization.
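The cumulative risk calculation can be illustrated with the simple additive hazard index, the sum of hazard quotients C_i/OEL_i. Note that the study's interaction-based HI int additionally weights pairwise interactions between metals, which this sketch does not reproduce; the concentrations and exposure limits below are illustrative assumptions.

```python
# Minimal additive hazard index for a hypothetical metal mixture.
def hazard_index(conc_mg_m3, oel_mg_m3):
    """Additive HI: sum of hazard quotients C_i / OEL_i; HI > 1 flags concern."""
    return sum(c / oel for c, oel in zip(conc_mg_m3, oel_mg_m3))

# Hypothetical lead, cadmium, nickel air levels vs. illustrative OELs.
hi = hazard_index([0.03, 0.002, 0.4], [0.05, 0.005, 1.0])
print(round(hi, 2))  # 0.6 + 0.4 + 0.4 = 1.4 -> cumulative concern
```

Each metal individually stays below its limit, yet the mixture exceeds HI = 1, which is the rationale for cumulative rather than substance-by-substance assessment.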
Validation of mathematical model for CZ process using small-scale laboratory crystal growth furnace
NASA Astrophysics Data System (ADS)
Bergfelds, Kristaps; Sabanskis, Andrejs; Virbulis, Janis
2018-05-01
This paper focuses on the modelling of a small-scale laboratory NaCl-RbCl crystal growth furnace. First steps towards fully transient simulations are taken in the form of stationary simulations that optimize material properties to match the model to experimental conditions. For this purpose, simulation software primarily used for modelling the industrial-scale silicon crystal growth process was successfully applied. Finally, transient simulations of the crystal growth are presented, showing sufficient agreement with experimental results.
Analysis of the regimes in the scanner-based laser hardening process
NASA Astrophysics Data System (ADS)
Martínez, S.; Lamikiz, A.; Ukar, E.; Calleja, A.; Arrizubieta, J. A.; Lopez de Lacalle, L. N.
2017-03-01
Laser hardening is becoming a consolidated process in different industrial sectors such as the automotive industry or the die and mold industry. The key to success in this process is controlling the surface temperature and the hardened layer thickness. Furthermore, the development of reliable scanners, based on moving optics for guiding high power lasers at extremely fast speeds, allows the rapid motion of laser spots, resulting in tailored shapes of the areas swept by the laser. If a scanner is used to sweep a determined area, the laser energy density distribution can be adapted by varying parameters such as the scanning speed or the laser power inside this area. Despite its advantages in terms of versatility, the use of scanners for laser hardening has not yet been introduced in the thermal hardening industry because of the difficulty of temperature control and the possibility of a non-homogeneous hardened layer thickness. In the present work the laser hardening process with scanning optics applied to AISI 1045 steel has been studied, with special emphasis on the influence of the scanning speed and the results derived from its variation, the evolution of the hardened layer thickness and different strategies for the control of the process temperature. For this purpose, the hardened material has been studied by measuring microhardness at different points, and the shape of the hardened layer has also been evaluated. All tests have been performed using an experimental setup designed to keep a nominal temperature value using closed-loop control. The test results show two different regimes depending on the scanning speed and feed rate values. The experimental conclusions have been validated by means of thermal simulations at different conditions.
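The closed-loop temperature control described above can be illustrated with a toy proportional controller that trims laser power to hold a nominal surface temperature. The first-order plant model, gains and setpoint below are invented for illustration and are not taken from the paper's experimental setup.

```python
# Toy closed-loop surface temperature control for a laser pass.
def run_control(setpoint=1200.0, steps=200, dt=0.01):
    temp = 20.0            # starting (ambient) temperature, degC
    kp = 4.0               # proportional gain, W per degC of error
    gain = 0.9             # plant: steady-state degC rise per W of laser power
    tau = 0.2              # plant time constant, s
    for _ in range(steps):
        power = max(0.0, kp * (setpoint - temp))   # actuator cannot go negative
        target = 20.0 + gain * power               # steady-state temp at this power
        temp += (target - temp) * dt / tau         # first-order thermal response
    return temp

final = run_control()
print(round(final, 1))
```

Proportional-only control leaves a steady-state offset below the setpoint, which is one reason practical temperature controllers add integral action.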
NASA Astrophysics Data System (ADS)
Jadhav, J. R.; Mantha, S. S.; Rane, S. B.
2015-06-01
The demand for automobiles in India increased drastically in the last two and a half decades. Many global automobile manufacturers and Tier-1 suppliers have already set up research, development and manufacturing facilities in India. The Indian automotive component industry started implementing Lean practices to fulfill the demand of these customers. The United Nations Industrial Development Organization (UNIDO), in association with the Automotive Component Manufacturers Association of India (ACMA) and the Government of India, has taken a proactive approach since 1999 to assist Indian SMEs in various clusters and make them globally competitive. The primary objectives of this research are to study the UNIDO-ACMA Model as well as the ISM Model of Lean implementation and to validate the ISM Model by comparison with the UNIDO-ACMA Model. It also aims at presenting a roadmap for Lean implementation in the Indian automotive component industry. This paper is based on secondary data, including research articles, web articles, doctoral theses, survey reports and books on the automotive industry in the fields of Lean, JIT and ISM. The ISM Model for Lean practice bundles was developed by the authors in consultation with Lean practitioners. The UNIDO-ACMA Model has six stages, whereas the ISM Model has eight phases for Lean implementation. The ISM-based Lean implementation model is validated through its high degree of similarity with the UNIDO-ACMA Model. The major contribution of this paper is the proposed ISM Model for sustainable Lean implementation, whose framework offers greater insight into the implementation process at a more micro level than the UNIDO-ACMA Model.
DOT National Transportation Integrated Search
1996-03-13
This is the transcript of the Joint FAA/Industry Symposium on Level B Airplane Simulator Aeromodel Validation Requirements held on March 13-14, 1996, at the Washington Dulles Airport Hilton. The purpose of the meeting was to discuss the aeromodeling...
Dynamics of Postcombustion CO2 Capture Plants: Modeling, Validation, and Case Study
2017-01-01
The capture of CO2 from power plant flue gases provides an opportunity to mitigate emissions that are harmful to the global climate. While the process of CO2 capture using an aqueous amine solution is well-known from experience in other technical sectors (e.g., acid gas removal in the gas processing industry), its operation combined with a power plant still needs investigation because in this case, the interaction with power plants that are increasingly operated dynamically poses control challenges. This article presents the dynamic modeling of CO2 capture plants followed by a detailed validation using transient measurements recorded from the pilot plant operated at the Maasvlakte power station in the Netherlands. The model predictions are in good agreement with the experimental data related to the transient changes of the main process variables such as flow rate, CO2 concentrations, temperatures, and solvent loading. The validated model was used to study the effects of fast power plant transients on the capture plant operation. A relevant result of this work is that an integrated CO2 capture plant might enable more dynamic operation of retrofitted fossil fuel power plants because the large amount of steam needed by the capture process can be diverted rapidly to and from the power plant. PMID:28413256
Lucena, Rafael; Cárdenas, Soledad; Gallego, Mercedes; Valcárcel, Miguel
2006-03-01
Monitoring the exhaustion of alkaline degreasing baths is one of the main aspects of process control in the metal machining industry. The global levels of surfactant and, mainly, grease can be used as ageing indicators. In this paper, an attenuated total reflection-Fourier transform infrared (ATR-FTIR) membrane-based sensor is presented for the determination of these parameters. The system is based on a micro liquid-liquid extraction of the analytes through a polymeric membrane from the aqueous layer to the organic solvent layer, which is in close contact with the internal reflection element and continuously monitored. Samples are automatically processed using a simple, robust sequential injection analysis (SIA) configuration coupled on-line to the instrument. The global signals obtained for both families of compounds are processed via a multivariate calibration technique (partial least squares, PLS). Excellent correlation was obtained between the values given by the proposed method and those of the gravimetric reference method, with very low error values for both calibration and validation.
Using 'big data' to validate claims made in the pharmaceutical approval process.
Wasser, Thomas; Haynes, Kevin; Barron, John; Cziraky, Mark
2015-01-01
Big Data in the healthcare setting refers to the storage, assimilation, and analysis of large quantities of information regarding patient care. These data can be collected and stored in a wide variety of ways, including electronic medical records collected at the patient bedside or medical records that are coded and passed to insurance companies for reimbursement. When these data are processed, it is possible to validate claims as part of the regulatory review process regarding the anticipated performance of medications and devices. In order to properly analyze claims by manufacturers and others, claims must be expressed in terms that are testable in a timeframe that is useful and meaningful to formulary committees. Claims for the comparative benefits and costs, including budget impact, of products and devices need to be expressed in measurable terms, ideally in the context of submission or validation protocols. Claims should be either consistent with accessible Big Data or able to support observational studies where Big Data identifies target populations. Protocols should identify, in disaggregated terms, key variables that would lead to direct or proxy validation. Once these variables are identified, Big Data can be used to query massive quantities of data in the validation process. Research can be passive or active in nature: passive research collects data retrospectively, whereas active research prospectively looks for indicators of co-morbid conditions, side-effects or adverse events, testing these indicators to determine whether claims fall within the ranges set forth by the manufacturer. Additionally, Big Data can be used to assess the effectiveness of therapy through health insurance records, which could, for example, indicate that a disease or co-morbid condition has ceased to be treated.
Understanding the basic strengths and weaknesses of Big Data in the claim validation process provides a glimpse of the value that this research can provide to industry. Big Data can support a research agenda that focuses on the process of claims validation to support formulary submissions as well as inputs to ongoing disease area and therapeutic class reviews.
Improvement of Computer Software Quality through Software Automated Tools.
1986-08-31
The requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience... The result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is... Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software. Therefore, government and industry...
Methodology for assessing laser-based equipment
NASA Astrophysics Data System (ADS)
Pelegrina-Bonilla, Gabriel; Hermsdorf, Jörg; Thombansen, Ulrich; Abels, Peter; Kaierle, Stefan; Neumann, Jörg
2017-10-01
Methodologies for assessing a technology's maturity are widely used in industry and research. Probably the best known are technology readiness levels (TRLs), initially pioneered by the National Aeronautics and Space Administration (NASA). At the beginning, only descriptively defined TRLs existed, but over time automated assessment techniques in the form of questionnaires emerged to determine TRLs. TRLs originally targeted equipment for space applications, but the demands on industrially relevant equipment differ in part, for example in overall costs, product quantities, or the presence of competitors. Therefore, we present a generally valid assessment methodology aimed at laser-based equipment for industrial use. The assessment is carried out with the help of a questionnaire, which provides a user-friendly and easily accessible way to monitor progress from the lab-proven state to the application-ready product throughout the complete development period. The assessment result is presented as a multidimensional metric in order to reveal the current strengths and weaknesses of the equipment under development, which can be used to steer the remaining development in the right direction.
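A minimal sketch of turning questionnaire answers into a multidimensional readiness metric: answers are grouped by assessment dimension and each dimension is reported separately rather than collapsed into a single level. The dimensions, scale and answers below are hypothetical, not the paper's actual instrument.

```python
# Hypothetical questionnaire answers, scored 0-4, grouped by dimension.
answers = {
    "technology":    [4, 3, 4],
    "manufacturing": [2, 2, 3],
    "market":        [1, 2, 1],
}

# One score per dimension (mean of its answers) -> multidimensional metric.
metric = {dim: sum(vals) / len(vals) for dim, vals in answers.items()}
weakest = min(metric, key=metric.get)   # dimension needing the most work
print(metric, weakest)
```

Keeping the dimensions separate, rather than averaging them into one number, is what lets the metric point at the specific weakness to address next.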
NASA Astrophysics Data System (ADS)
Pranav Nithin, R.; Gopikrishnan, S.; Sumesh, A.
2018-02-01
Cooling towers are heat transfer devices commonly found in industry; they extract heat from a coolant so that it can be reused in various plants. A cooling tower contains fills, made of stacked PVC sheets, which increase the surface area exposed by the cooling liquid flowing through them. This paper focuses on a manufacturing plant where such fills are produced. Productivity with the current manufacturing method, which relied on manual labor, was only 6 to 8 fills per day, against an ideal capacity of 14 fills per day. A process modification, designed and implemented here, helps the plant increase productivity to 14 fills per day. A simulation study was first carried out in the ARENA simulation package; the new design was then created in a CAD package and validated using Ansys Mechanical APDL. It was found that implementing the safe design can increase productivity to 196 units.
SERENITY in Air Traffic Management
NASA Astrophysics Data System (ADS)
Felici, Massimo; Meduri, Valentino; Tedeschi, Alessandra; Riccucci, Carlo
This chapter is concerned with the validation of an implementation of the SERENITY Runtime Framework (SRF) tailored for the Air Traffic Management (ATM) domain. It reports our experience in the design and validation phases of a tool which relies on the SRF to support Security and Dependability (S&D) Patterns in work practices. In particular, this chapter pinpoints the activities concerning the identification of S&D Patterns, the design of an ATM prototype and its validation. The validation activities involve qualitative as well as quantitative approaches. These activities as a whole highlight the validation process for adopting S&D Patterns within the ATM domain. Moreover, they stress how S&D Patterns enhance and relate to critical features within an industry domain. The empirical results point out that S&D Patterns relate to work practices. Furthermore, they highlight the design and validation activities needed to tailor systems relying on S&D Patterns to specific application domains. This strengthens and supports the adoption of S&D Patterns as a way to address AmI (Ambient Intelligence) requirements (e.g., awareness, proactiveness, resilience, etc.) within the ATM domain.
Green supplier selection: a new genetic/immune strategy with industrial application
NASA Astrophysics Data System (ADS)
Kumar, Amit; Jain, Vipul; Kumar, Sameer; Chandra, Charu
2016-10-01
With the onset of the 'climate change movement', organisations are striving to include environmental criteria in the supplier selection process. This article hybridises a Green Data Envelopment Analysis (GDEA)-based approach with a new Genetic/Immune Strategy for Data Envelopment Analysis (GIS-DEA). The GIS-DEA approach provides a different view of solving multi-criteria decision making problems with data envelopment analysis (DEA) by treating DEA as a multi-objective optimisation problem, with efficiency as one objective and proximity of the solution to the decision makers' preferences as the other. The hybrid approach, called GIS-GDEA, is applied here to a well-known automobile spare parts manufacturer in India and the results are presented. User validation based on a specific set of criteria suggests that the supplier selection process with GIS-GDEA is more practical than other approaches in a current industrial scenario with multiple decision makers.
Space station needs, attributes and architectural options. Part 1: Summary
NASA Technical Reports Server (NTRS)
1983-01-01
Candidate missions for the space station were subjected to an evaluation/filtering process which included the application of budgetary constraints and performance of benefits analysis. Results show that the initial space station should be manned, placed in a 28.5 deg orbit, and provide capabilities which include a space test facility, satellite service, a transport harbor, and an observatory. A space industrial park may be added once further development effort validates the cost and expanding commercial market for space-processed material. Using the space station as a national space test facility can enhance national security, as well as commercial and scientific interests alike. The potential accrued gross mission model benefit derived from these capabilities is $5.9B without the industrial park, and $9.3B with it. Other benefits include the lowering of acquisition costs for NASA and DoD space assets and a basis for broadening international participation.
Demey, D; Vanderhaegen, B; Vanhooren, H; Liessens, J; Van Eyck, L; Hopkins, L; Vanrolleghem, P A
2001-01-01
In this paper, the practical implementation and validation of advanced control strategies, designed using model-based techniques, at an industrial wastewater treatment plant are demonstrated. The plant under study treats the wastewater of a large pharmaceutical production facility. The process characteristics of the wastewater treatment were quantified by means of tracer tests, intensive measurement campaigns and the use of on-line sensors. In parallel, a dynamic model of the complete wastewater plant was developed according to the specific kinetic characteristics of the sludge and the highly varying composition of the industrial wastewater. Based on real-time data and dynamic models, control strategies for the equalisation system, the polymer dosing and phosphorus addition were established. The control strategies are being integrated into the existing SCADA system, combining traditional PLC technology with robust PC-based control calculations. The use of intelligent control in wastewater treatment offers a wide spectrum of possibilities to upgrade existing plants, to increase plant capacity and to eliminate peaks. This can result in a more stable and secure overall performance and, finally, in cost savings. On-line sensors have potential not only for monitoring concentrations but also for manipulating flows and concentrations, so that the performance of the plant can be secured.
Implementation of WirelessHART in the NS-2 Simulator and Validation of Its Correctness
Zand, Pouria; Mathews, Emi; Havinga, Paul; Stojanovski, Spase; Sisinni, Emiliano; Ferrari, Paolo
2014-01-01
One of the first standards in the wireless sensor networks domain, WirelessHART (HART: Highway Addressable Remote Transducer), was introduced to address industrial process automation and control requirements. This standard can be used as a reference point for evaluating other wireless protocols in the domain of industrial monitoring and control, which makes it worthwhile to set up a reliable WirelessHART simulator in order to obtain that reference point in a relatively easy manner. Moreover, a simulator offers an alternative to expensive testbeds for testing and evaluating the performance of WirelessHART. This paper explains our implementation of WirelessHART in the NS-2 network simulator. To our knowledge, this is the first implementation that supports the WirelessHART network manager as well as the whole stack (all OSI (Open Systems Interconnection model) layers) of the WirelessHART standard. It also explains our effort to validate the correctness of the implementation, namely through validation of the WirelessHART stack protocol and of the network manager, using sniffed traffic from a real WirelessHART testbed installed in the Idrolab plant. This confirms the validity of our simulator. Empirical analysis shows that the simulated results are closely comparable to the results obtained from real networks. We also demonstrate the versatility and usability of our implementation by providing further evaluation results in diverse scenarios; for example, we evaluate the performance of a WirelessHART network under incremental interference in a multi-hop setting. PMID:24841245
Benedict, Ralph HB; DeLuca, John; Phillips, Glenn; LaRocca, Nicholas; Hudson, Lynn D; Rudick, Richard
2017-01-01
Cognitive and motor performance measures are commonly employed in multiple sclerosis (MS) research, particularly when the purpose is to determine the efficacy of treatment. The increasing focus of new therapies on slowing progression or reversing neurological disability makes the utilization of sensitive, reproducible, and valid measures essential. Processing speed is a basic elemental cognitive function that likely influences downstream processes such as memory. The Multiple Sclerosis Outcome Assessments Consortium (MSOAC) includes representatives from advocacy organizations, Food and Drug Administration (FDA), European Medicines Agency (EMA), National Institute of Neurological Disorders and Stroke (NINDS), academic institutions, and industry partners along with persons living with MS. Among the MSOAC goals is acceptance and qualification by regulators of performance outcomes that are highly reliable and valid, practical, cost-effective, and meaningful to persons with MS. A critical step for these neuroperformance metrics is elucidation of clinically relevant benchmarks, well-defined degrees of disability, and gradients of change that are deemed clinically meaningful. This topical review provides an overview of research on one particular cognitive measure, the Symbol Digit Modalities Test (SDMT), recognized as being particularly sensitive to slowed processing of information that is commonly seen in MS. The research in MS clearly supports the reliability and validity of this test and recently has supported a responder definition of SDMT change approximating 4 points or 10% in magnitude. PMID:28206827
Dhillon, Gurpreet S; Brar, Satinder K; Kaur, Surinder; Verma, Mausam
2013-05-01
The citric acid (CA) industry is currently struggling to develop a sustainable and economical process owing to high substrate and energy costs. Increasing interest in the replacement of costly synthetic substrates by renewable waste biomass has fostered research on agro-industrial wastes and screening of raw materials for economical CA production. The food-processing industry generates substantial quantities of waste biomass that could be used as a valuable low-cost fermentation substrate. The present study evaluated the potential of different agro-industrial wastes, namely apple pomace (AP), brewer's spent grain, citrus waste and sphagnum peat moss, as substrates for solid-state CA production using Aspergillus niger NRRL 2001. Among the four substrates, AP resulted in the highest CA production of 61.06 ± 1.9 g kg(-1) dry substrate (DS) after a 72 h incubation period. Based on the screening studies, AP was selected for optimisation studies through response surface methodology (RSM). Maximum CA production of 312.32 g kg(-1) DS was achieved at 75% (v/w) moisture and 3% (v/w) methanol after a 144 h incubation period. The validation of RSM-optimised parameters in plastic trays resulted in maximum CA production of 364.4 ± 4.50 g kg(-1) DS after a 120 h incubation period. The study demonstrated the potential of AP as a cheap substrate for higher CA production and contributes to knowledge about the future application of carbon-rich agro-industrial wastes and their valorisation to CA. It also offers economic and environmental benefits over traditional ways of disposing of agro-industrial wastes. © 2012 Society of Chemical Industry.
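The RSM step described above fits a second-order model to yield data and locates its stationary point. The sketch below illustrates that idea with synthetic data built around the reported optimum (75% moisture, 3% methanol); the yield values are invented for the example and are not the study's measurements.

```python
import numpy as np

# Synthetic quadratic response surface with a known maximum at (75, 3),
# used only so the fitting step can be checked exactly.
def yield_surface(x1, x2):
    return 300 - 0.5 * (x1 - 75) ** 2 - 20 * (x2 - 3) ** 2

x1 = np.repeat([65.0, 75.0, 85.0], 3)   # moisture, % (v/w)
x2 = np.tile([2.0, 3.0, 4.0], 3)        # methanol, % (v/w)
y = yield_surface(x1, x2)               # CA yield, g/kg DS (synthetic)

# Second-order model: b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point of the fitted quadratic: solve gradient = 0.
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(A, np.array([-b[1], -b[2]]))
```

Because the synthetic data are noise-free, the fitted optimum recovers (75, 3) exactly; with real measurements the same pipeline yields an estimated optimum with confidence regions.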
Challenges in Materials Transformation Modeling for Polyolefins Industry
NASA Astrophysics Data System (ADS)
Lai, Shih-Yaw; Swogger, Kurt W.
2004-06-01
Unlike most published polymer processing and/or forming research, the transformation of polyolefins into fabricated articles often involves non-confined flow, or so-called free surface flow (e.g. fiber spinning, blown films, and cast films), in which elongational flow takes place during fabrication. Obviously, the characterization and validation of extensional rheological parameters and their use to develop rheological constitutive models are the focus of polyolefin materials transformation research. Unfortunately, challenges remain, with limited validation of non-linear, non-isothermal constitutive models for polyolefins. Further complexity arises in the transformation of polyolefins in elongational flow systems, as it involves a stress-induced crystallization process. The complicated nature of elongational, non-linear rheology and non-isothermal crystallization kinetics makes the development of numerical methods for polyolefin materials forming modeling very challenging. From a product-based company standpoint, the challenges of materials transformation research go beyond elongational rheology, crystallization kinetics and numerical modeling. To make models useful for the polyolefin industry, it is critical to develop links between molecular parameters and both equipment and materials forming parameters. Recent advances in constrained geometry catalysis and materials science understanding (INSITE technology and molecular design capability) have made industrial polyolefinic materials forming modeling more viable, because the molecular structure of the polymer can be well predicted and controlled during polymerization.
In this paper, we discuss inter-relationships (models) among molecular parameters such as polymer molecular weight (Mw), molecular weight distribution (MWD), long chain branching (LCB), and short chain branching (SCB, or comonomer types and distribution), and their effects on shear and elongational rheology, tie-molecule probabilities, non-isothermal stress-induced crystallization, and the crystalline/amorphous orientation versus mechanical property relationship. All of the above inter-relationships (models) are critical to the successful development of a knowledge-based industrial model. Dow's Polyolefins and Elastomers business is one of the world's largest polyolefin resin producers, with the most advanced INSITE technology and a "6-Day model" molecular design capability. Dow also offers one of the broadest polyolefinic product ranges and applications on the market.
Merz, Michael; Kettner, Lucas; Langolf, Emma; Appel, Daniel; Blank, Imre; Stressler, Timo; Fischer, Lutz
2016-08-01
Due to allergies or other health disorders, a certain segment of the population is not able to safely consume some plant proteins, which are the main protein source in human nutrition. Coeliac disease is a prominent autoimmune disorder and requires strict adherence to a gluten-free diet. The aim of this study was to identify suitable combinations of enzymatic hydrolysis and common unit operations in food processing (centrifugation, ultra-filtration) to produce gluten-free wheat gluten hydrolysates for food application. To analyse the hydrolysates, a simple and cheap competitive ELISA protocol was also designed and validated in this study. The competitive ELISA was validated using gliadin-spiked skim milk protein hydrolysates, owing to the later application of the assay. The limit of quantification was 4.19 mg kg(-1), which allowed the identification of gluten-free (<20 mg kg(-1)) hydrolysates. Enzymatic hydrolysis, including the type of peptidase, and the downstream processing greatly affected the antigenicity of the hydrolysates. Enzymatic hydrolysis and downstream processing operations, such as centrifugation and ultra-filtration, reduced the antigenicity of wheat gluten hydrolysates. Gluten-free hydrolysates were obtained with Flavourzyme after centrifugation (25 g L(-1) substrate) and after 1 kDa ultra-filtration (100 g L(-1) substrate). A multiple-peptidase complex, such as Flavourzyme, seems to be required for the production of gluten-free hydrolysates. © 2015 Society of Chemical Industry.
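The classification logic implied by the assay above combines two fixed numbers: the study's limit of quantification (4.19 mg/kg) and the 20 mg/kg gluten-free limit. A minimal sketch, with function and label names of our own choosing:

```python
# Thresholds from the abstract: assay LOQ and the gluten-free limit.
LOQ = 4.19                 # mg gluten per kg, limit of quantification
GLUTEN_FREE_LIMIT = 20.0   # mg gluten per kg

def classify(gluten_mg_per_kg):
    """Classify a hydrolysate from its measured gluten content."""
    if gluten_mg_per_kg < LOQ:
        return "below LOQ (gluten-free)"
    if gluten_mg_per_kg < GLUTEN_FREE_LIMIT:
        return "quantifiable but gluten-free"
    return "not gluten-free"
```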
A survey of size-fractionated dust levels in the U.S. wood processing industry.
Kalliny, Medhat I; Brisolara, Joseph A; Glindmeyer, Henry; Rando, Roy
2008-08-01
A survey of size-fractionated dust exposure was carried out in 10 wood processing plants across the United States as part of a 5-year longitudinal respiratory health study. The facilities included a sawmill, plywood assembly plants, secondary wood milling operations, and factories producing finished wood products such as wood furniture and cabinets. Size-fractionated dust exposures were determined using the RespiCon Personal Particle Sampler. There were 2430 valid sets of respirable, thoracic, and inhalable dust samples collected. Overall, geometric mean (geometric standard deviation) exposure levels were found to be 1.44 (2.67), 0.35 (2.65), and 0.18 (2.54) mg/m³ for the inhalable, thoracic, and respirable fractions, respectively. Averaged across all samples, the respirable fraction accounted for 16.7% of the inhalable dust mass, whereas the corresponding figure for the thoracic fraction as a percentage of the inhalable fraction was 28.7%. Exposures in the furniture manufacturing plants were significantly higher than those in the sawmill and plywood assembly plants, wood milling plants, and cabinet manufacturing plants, whereas the sawmill and plywood assembly plants exhibited significantly lower dust levels than the other industry segments. Among work activities, cleaning with compressed air and sanding produced the highest size-fractionated dust exposures, whereas forklift driving showed the lowest respirable and inhalable fractions and shipping produced the lowest thoracic fraction. Other common work activities such as sawing, milling, and clamping exhibited intermediate exposure levels, but there were significant differences in the relative ranking of these across the various industry segments. Processing of hardwood and mixed woods was generally associated with higher exposures than softwood and plywood, although these results were also confounded with industry segment.
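The geometric mean (GM) and geometric standard deviation (GSD) reported above are the standard summary statistics for lognormally distributed exposure data: the mean and standard deviation are computed on log-transformed values and then exponentiated. A minimal sketch, using invented measurement values rather than the study's raw data:

```python
import math

def gm_gsd(values):
    """Geometric mean and geometric standard deviation of positive values."""
    logs = [math.log(v) for v in values]
    n = len(logs)
    mean_log = sum(logs) / n
    sd_log = math.sqrt(sum((x - mean_log) ** 2 for x in logs) / (n - 1))
    return math.exp(mean_log), math.exp(sd_log)

# Illustrative inhalable-dust measurements in mg/m^3 (not the study's data).
inhalable = [0.8, 1.2, 1.9, 3.1, 0.9]
gm, gsd = gm_gsd(inhalable)
```

Note that the GSD is a multiplicative spread: roughly two-thirds of measurements fall between GM/GSD and GM*GSD, which is why a GSD near 2.7 indicates wide variability around a GM of 1.44 mg/m³.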
The RFad Method--a new fatigue recovery time assessment for industrial activities.
Silva e Santos, Marcello; Vidal, Mario Cesar Rodriguez; Moreira, Sergio Bastos
2012-01-01
This paper presents a study of fatigue recovery time assessment processes in work activities. It came about due to a demand from an automotive industry giant that needed to update existing cycle time sheets and TAKT time parameters. The company decided to hire an ergonomics laboratory with ties to a major Brazilian university to evaluate current conditions and establish a new method to either calculate recovery times or validate existing assessment criteria, based on an ergonomic evaluation of the work activities. There has been evident evolution in the industrial sector in the past two decades, bringing rapid modernization of industrial processes, not only in production but also in management systems. Due to improved computer and robotics systems, combined with overall operational advancements, such as the use of lighter hand tools and more effective hoist systems, most work activities have had their physical effort requirements reduced in recent decades. Thus, compensation factors built into production times need to be reviewed in order to avoid unnecessary costs associated with them. By using ergonomics considerations, we avoid simply removing the physical variables built into rest time calculations without taking into account, for example, the additional cognitive load represented by the use of more sophisticated equipment.
Reniers, G L L; Audenaert, A; Pauwels, N; Soudan, K
2011-02-15
This article empirically assesses and validates a methodology to make evacuation decisions in case of major fire accidents in chemical clusters. In this paper, a number of empirical results are presented, processed and discussed with respect to the implications and management of evacuation decisions in chemical companies. It has been shown in this article that in realistic industrial settings, suboptimal interventions may result in case the prospect to obtain additional information at later stages of the decision process is ignored. Empirical results also show that implications of interventions, as well as the required time and workforce to complete particular shutdown activities, may be very different from one company to another. Therefore, to be optimal from an economic viewpoint, it is essential that precautionary evacuation decisions are tailor-made per company. Copyright © 2010 Elsevier B.V. All rights reserved.
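The abstract's central point, that ignoring the prospect of later information can make an immediate intervention suboptimal, is a value-of-information argument. The sketch below illustrates it with invented costs and probabilities (none of these numbers come from the paper): deciding now forces a choice between a certain shutdown cost and an expected escalation loss, while deciding after information arrives pays only for the action that turns out to be needed.

```python
# Illustrative two-stage evacuation decision (all numbers are assumptions).
p_escalation = 0.3      # probability the fire escalates
cost_evacuate = 100.0   # production loss from a precautionary shutdown
cost_hit = 1000.0       # loss if escalation occurs without evacuation

# Decide now, with no further information:
ev_evacuate_now = cost_evacuate
ev_wait_blind = p_escalation * cost_hit
best_now = min(ev_evacuate_now, ev_wait_blind)

# Decide after (perfect) information arrives in time to still act:
# evacuate only in the escalation branch, do nothing otherwise.
ev_with_info = p_escalation * cost_evacuate + (1 - p_escalation) * 0.0

value_of_information = best_now - ev_with_info
```

With these numbers, waiting for information lowers the expected loss from 100 to 30, so the precautionary shutdown ordered immediately would be the suboptimal intervention the authors describe.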
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2008-05-01
The Failure Mode and Effect Analysis (FMEA) model was applied for risk assessment of salmon manufacturing. A tentative approach to FMEA application in the salmon industry was attempted in conjunction with ISO 22000. Preliminary hazard analysis was used to analyze and predict the failure modes occurring in a food chain system (a salmon processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical control points were identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work, a comparison of ISO 22000 analysis with HACCP is carried out for salmon processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Fish receiving, casing/marking, blood removal, evisceration, filleting, cooling/freezing, and distribution were identified as the processes with the highest RPNs (252, 240, 210, 210, 210, 210, and 200, respectively) and corrective actions were undertaken. After the application of corrective actions, a second calculation of RPN values was carried out, resulting in substantially lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect or tree) diagram led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a salmon processing industry is anticipated to prove advantageous to industrialists, state food inspectors, and consumers.
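The RPN screening above can be sketched as code. In standard FMEA the risk priority number is the product of severity, occurrence, and detection scores; the upper acceptable limit of 130 is from the study, but the example score split (7, 6, 6), which reproduces the reported RPN of 252 for fish receiving, is our own assumption.

```python
# Upper acceptable RPN limit used in the study.
RPN_LIMIT = 130

def rpn(severity, occurrence, detection):
    """Risk priority number: product of the three FMEA scores (each 1-10)."""
    return severity * occurrence * detection

def needs_corrective_action(severity, occurrence, detection):
    return rpn(severity, occurrence, detection) > RPN_LIMIT

# Hypothetical S/O/D split for fish receiving before corrective action:
# 7 * 6 * 6 = 252, above the 130 limit, so action is required.
```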
Joint Service Solvent Substitution (JS3)
2012-05-01
[Search-result fragments: a demonstration/validation process flow (process evaluation, acceptance criteria, market research, demonstration plan, implementation approval) and a weight-change table in mg/cm² for Mg (AZ 31B-H24, 0.7; SAE AMS 4377), Al (AMS-QQ-A-250; 7075-T6, 0.49), and Ti (AMS 4911, 6Al-4V) alloys, plus notes on vendor test results, industry experience, and DOD Aerospace & Shipbuilding NESHAP experience.]
JPRS Report, Near East & South Asia
1989-08-24
the parliament persisted, even after the forces of occupation pulled out of Lebanon. Based on the depth of nationalist feeling in Damascus ...poured method after the primary aluminum block is heated. The rods then solidify, are cut, and undergo a process of industrial aging. The sections...of Arab feebleness the like of which was seen only in the age of Arab decadence. Difference of opinion validates democracy in any arena. But when
Analytical Modeling and Performance Prediction of Remanufactured Gearbox Components
NASA Astrophysics Data System (ADS)
Pulikollu, Raja V.; Bolander, Nathan; Vijayakar, Sandeep; Spies, Matthew D.
Gearbox components operate in extreme environments, often leading to premature removal or overhaul. Though worn or damaged, these components can still function if the appropriate remanufacturing processes are deployed. Doing so saves a significant amount of resources (time, materials, energy, manpower) otherwise required to produce a replacement part. Unfortunately, current design and analysis approaches require extensive testing and evaluation to validate the effectiveness and safety of a component that has been used in the field and then processed outside of original OEM specification. Testing every combination of component, level of potential damage, and processing option would be an expensive and time-consuming feat, thus prohibiting broad deployment of remanufacturing processes across industry. However, such evaluation and validation can occur through Integrated Computational Materials Engineering (ICME) modeling and simulation. Sentient developed a microstructure-based component life prediction (CLP) tool to quantify and assist the remanufacturing of gearbox components. This was achieved by modeling the design-manufacturing-microstructure-property relationship. The CLP tool assists in the remanufacturing of high-value, high-demand rotorcraft, automotive, and wind turbine gears and bearings. This paper summarizes the development of the CLP models and the validation efforts, comparing simulation results with rotorcraft spiral bevel gear physical test data. CLP analyzes gear components and systems for safety, longevity, reliability, and cost by (1) predicting new gearbox component performance and the optimal time to remanufacture, (2) qualifying used gearbox components for the remanufacturing process, and (3) predicting remanufactured component performance.
Evolution of microbiological analytical methods for dairy industry needs.
Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence
2014-01-01
Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives on integrating microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry's needs. Recent studies show that polymerase chain reaction (PCR)-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potential. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards.
Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares
2011-09-15
Life cycle thinking (LCT) is one of the philosophies that has recently emerged in the context of sustainable development. Some of the already existing tools and methods, as well as some recently emerged ones, which seek to understand, interpret and design the life of a product, can be included in the scope of the LCT philosophy. That is the case of material and energy flow analysis (MEFA), a tool derived from the industrial metabolism concept. This paper proposes a methodology combining MEFA with another technique derived from sustainable development that also fits the LCT philosophy, BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify so-called improvable flows by MEFA, so that appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to a representative roof tile manufacturing plant for validation. Fourteen improvable flows were identified and seven candidate BAT were proposed to reduce these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and to select the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.
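The MEFA screening step above quantifies inputs, outputs and internal flows and then flags "improvable" ones. A minimal sketch of that bookkeeping, with invented flow quantities and an assumed 5% share threshold (the paper does not state a numeric flagging criterion):

```python
def improvable_flows(inputs, outputs, threshold=0.05):
    """Check the mass balance of a unit process and flag non-product
    output flows whose share of total input exceeds the threshold."""
    total_in = sum(inputs.values())
    total_out = sum(outputs.values())
    balance_gap = abs(total_in - total_out) / total_in
    flagged = [name for name, qty in outputs.items()
               if name != "product" and qty / total_in > threshold]
    return balance_gap, flagged

# Illustrative flows for a tile kiln in t/day (not the study's data).
inputs = {"clay": 100.0, "water": 20.0, "fuel": 8.0}
outputs = {"product": 95.0, "stack_losses": 18.0, "waste": 15.0}
gap, flagged = improvable_flows(inputs, outputs)
```

A near-zero balance gap confirms the flow accounting is closed; the flagged flows are then the candidates against which BAT options are evaluated.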
Modeling and Analysis of CNC Milling Process Parameters on Al3030 based Composite
NASA Astrophysics Data System (ADS)
Gupta, Anand; Soni, P. K.; Krishna, C. M.
2018-04-01
The machining of Al3030-based composites on Computer Numerical Control (CNC) high-speed milling machines has assumed importance because of their wide application in the aerospace, marine and automotive industries. Industries mainly focus on surface irregularities, material removal rate (MRR) and tool wear rate (TWR), which usually depend on the input process parameters, namely cutting speed, feed in mm/min, depth of cut and step-over ratio. Many researchers have carried out work in this area, but very few have also taken step-over ratio (radial depth of cut) as one of the input variables. In this research work, the machining characteristics of Al3030 are studied on a high-speed CNC milling machine over the speed range of 3000 to 5000 r.p.m. Step-over ratio, depth of cut and feed rate are the other input variables taken into consideration. A total of nine experiments are conducted according to a Taguchi L9 orthogonal array. The machining is carried out on a high-speed CNC milling machine using a flat end mill of diameter 10 mm. Flatness, MRR and TWR are taken as output parameters. Flatness has been measured using a portable coordinate measuring machine (CMM). Linear regression models have been developed using Minitab 18 software and the results are validated by conducting a selected additional set of experiments. The selection of input process parameters to obtain the best machining outputs is the key contribution of this research work.
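The design-then-regress pipeline described above can be sketched compactly: a Taguchi L9 orthogonal array assigns nine runs across four three-level factors, and a linear model is fitted to the measured response. The sketch below uses the standard L9 array with levels coded 0-2 and a synthetic linear response (so the fit can be checked exactly); it illustrates the method only and does not reproduce the study's Minitab models.

```python
import numpy as np

# Standard Taguchi L9 array for four 3-level factors, levels coded 0-2.
# Columns: cutting speed, feed, depth of cut, step-over ratio.
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

# Synthetic linear response with known coefficients, for illustration.
true_coef = np.array([2.0, -1.0, 0.5, 0.25])
y = 10.0 + L9 @ true_coef

# Ordinary least squares: intercept plus one slope per factor.
X = np.column_stack([np.ones(len(L9)), L9])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Because the nine L9 runs are balanced across levels, the least-squares fit recovers the intercept and all four factor effects from only nine experiments instead of the 3^4 = 81 runs of a full factorial.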
Kinetic models in industrial biotechnology - Improving cell factory performance.
Almquist, Joachim; Cvijovic, Marija; Hatzimanikatis, Vassily; Nielsen, Jens; Jirstrand, Mats
2014-07-01
An increasing number of industrial bioprocesses capitalize on living cells by using them as cell factories that convert sugars into chemicals. These processes range from the production of bulk chemicals in yeasts and bacteria to the synthesis of therapeutic proteins in mammalian cell lines. One of the tools in the continuous search for improved performance of such production systems is the development and application of mathematical models. To be of value for industrial biotechnology, mathematical models should be able to assist in the rational design of cell factory properties or of the production processes in which they are utilized. Kinetic models are particularly suitable towards this end because they are capable of representing the complex biochemistry of cells more completely than most other types of models. They can, at least in principle, be used to understand, predict, and evaluate in detail the effects of adding, removing, or modifying molecular components of a cell factory, and to support the design of the bioreactor or fermentation process. However, several challenges remain before kinetic modeling will reach the degree of maturity required for routine application in industry. Here we review the current status of kinetic cell factory modeling. Emphasis is on modeling methodology concepts, including model network structure, kinetic rate expressions, parameter estimation, optimization methods, identifiability analysis, model reduction, and model validation, but several applications of kinetic models for the improvement of cell factories are also discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
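The "kinetic rate expressions" the review refers to are exemplified by the classic Monod model of microbial growth. The sketch below integrates a minimal batch cell-factory model (biomass grows on substrate with yield Yxs) with a simple Euler scheme; all parameter values are illustrative assumptions, not fitted to any real process.

```python
def simulate(mu_max=0.4, Ks=0.5, Yxs=0.5, X0=0.1, S0=10.0, dt=0.01, t_end=20.0):
    """Batch Monod growth: returns final biomass X and substrate S (g/L)."""
    X, S = X0, S0
    t = 0.0
    while t < t_end:
        mu = mu_max * S / (Ks + S)   # Monod rate expression
        dX = mu * X                  # biomass growth rate
        dS = -dX / Yxs               # substrate consumed per biomass formed
        X += dX * dt
        S = max(S + dS * dt, 0.0)    # substrate cannot go negative
        t += dt
    return X, S

X_final, S_final = simulate()
```

The mass balance gives a built-in sanity check: final biomass should approach X0 + Yxs * S0 = 5.1 g/L as substrate is exhausted, which is the kind of consistency test used when validating larger kinetic models.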
A Premiere example of the illusion of harm reduction cigarettes in the 1990s.
Pollay, R W; Dewhirst, T
2003-09-01
To use the product launch of Player's Premiere as a case study for understanding the new cigarette product development process during the 1990s. We determine the (in)validity of industry claims that: (1) development of the physical product preceded the promotional promise of "less irritation"; (2) "less irritation" was actually realised; (3) advertising informed consumers; and (4) advertising regulations caused the product's failure in the marketplace. Court proceedings assessing the constitutionality of Canada's Tobacco Act, which substantially restricts cigarette advertising. The 2002 Quebec Superior Court trial yielded a new collection of internal documents from Imperial Tobacco Ltd (ITL), including several about the development and marketing of Player's Premiere. Trial testimony and corporate documents were reviewed to determine the validity of the industry representations about the new cigarette product development process, focusing on the case history of Player's Premiere. In direct contradiction to industry testimony, the documentary evidence demonstrates that (1) communications for Player's Premiere, which claimed less irritation, were developed long before finding a product that could deliver on the promise; (2) ITL did not sell a "less irritating" product that matched its promotional promise; (3) the advertising and other communications for Player's Premiere were extensive, relying on the hi-tech appearances ("tangible credibility") of a "unique" filter, yet were uninformative and vague; and (4) Player's Premiere failed in the marketplace, despite extensive advertising and retail support, because it was an inferior product that did not live up to its promotional promise, not because of regulation of commercial speech. New product development entails extensive consumer research to craft all communications tools in fine detail. 
In the case of Player's Premiere, this crafting created a false and misleading impression of technological advances producing a "less irritating" cigarette. This product was solely a massive marketing ploy with neither consumer benefits, nor public health benefits. The industry attempted to deceive both consumers and the court.
Banach, Jennifer L.; Sampers, Imca; Van Haute, Sam; van der Fels-Klerx, H.J. (Ine)
2015-01-01
The potential cross-contamination of pathogens between clean and contaminated produce in the washing tank is highly dependent on the water quality. Process wash water disinfectants are applied to maintain the water quality during processing. This review examines the efficacy of process wash water disinfectants during produce processing, with the aim of preventing cross-contamination by pathogens. Process wash water disinfection involves short contact times, so microorganisms must be rapidly inactivated. Free chlorine, chlorine dioxide, ozone, and peracetic acid were considered suitable disinfectants. A disinfectant's reactivity with organic matter determines the disinfectant residual, which is of paramount importance for microbial inactivation and should be monitored in situ. Furthermore, chemical and worker safety and the legislative framework determine the suitability of a disinfection technique. Current research often focuses on produce decontamination and, to a lesser extent, on preventing cross-contamination. Further research on a sanitizer's efficacy in the washing water is recommended at the laboratory scale, in particular with experimental designs reflecting industrial conditions. Validation at the industrial scale is warranted to better understand the overall effects of a sanitizer. PMID:26213953
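The link between residual and inactivation that the review stresses is commonly modeled with Chick-Watson kinetics, where log10 reduction scales with the product of disinfectant concentration and contact time. The sketch below uses an invented rate constant purely to illustrate why a residual consumed by organic matter undermines short-contact-time washing; neither the constant nor the scenario numbers come from the review.

```python
def log10_reduction(k, concentration_mg_l, contact_time_s):
    """Chick-Watson inactivation: log10 reduction = k * C * t."""
    return k * concentration_mg_l * contact_time_s

# Illustrative rate constant (assumed), in L/(mg*s).
k = 0.1

# With a maintained 1 mg/L free chlorine residual, 30 s of contact
# gives a 3-log reduction; if organic matter consumes the residual
# down to 0.2 mg/L, the same 30 s achieves only 0.6 log.
full_residual = log10_reduction(k, 1.0, 30.0)
depleted_residual = log10_reduction(k, 0.2, 30.0)
```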
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKane, Aimee; Scheihing, Paul; Williams, Robert
2007-07-01
More than fifteen years after the launch of programs in the U.K. and U.S., industry still offers one of the largest opportunities for energy savings worldwide. The International Energy Agency (IEA) estimates the savings potential from cost-optimization of industrial motor-driven systems alone at 7 percent of global electricity use. The U.S. Department of Energy (USDOE) Industrial Technologies Program estimates 7 percent savings potential in total US industrial energy use through the application of proven best practice. Simple paybacks for these types of projects are frequently two years or less. The technology required to achieve these savings is widely available; the technical skills required to identify energy saving opportunities are known and transferable. Although programs like USDOE's Best Practices have been highly successful, most plants, as supported by 2002 MECS data, remain either unaware or unmotivated to improve their energy efficiency, as evidenced by the 98 percent of US industrial facilities reporting to MECS that say they lack a full-time energy manager. With the renewed interest in energy efficiency worldwide and the emergence of carbon trading and new financial instruments such as white certificates, there is a need to introduce greater transparency into the way that industrial facilities identify, develop, and document energy efficiency projects. Historically, industrial energy efficiency projects have been developed by plant engineers, frequently with assistance from consultants and/or suppliers with highly specialized technical skills. Under this scenario, implementation of energy efficiency improvements is dependent on individuals. These individuals typically include "champions" within an industrial facility or corporation, working in cooperation with consultants or suppliers who have substantial knowledge based on years of experience.
This approach is not easily understood by others without this specialized technical knowledge, penetrates the market fairly slowly, and has no assurance of persistence, since champions may leave the company or be reassigned after project completion. This paper presents an alternate scenario that builds on the body of expert knowledge concerning energy management best practices and the experience of industrial champions to engage industry in continuous energy efficiency improvement at the facility rather than the individual level. Under this scenario, standardized methodologies for applying and validating energy management best practices in industrial facilities will be developed through a consensus process involving both plant personnel and specialized consultants and suppliers. The resulting protocols will describe a process or framework for conducting an energy savings assessment and verifying the results that will be transparent to policymakers, managers, and the financial community, and validated by a third-party organization. Additionally, a global dialogue is being initiated by the United Nations Industrial Development Organization (UNIDO) concerning the development of an international industrial energy management standard that would be ISO compatible. The proposed scenario will combine the resulting standard with the best practice protocols for specific energy systems (i.e., steam, process heating, compressed air, pumping systems, etc.) to form the foundation of a third-party, performance-based certification program for the overall industrial facility that is compatible with existing management systems, including ISO 9001:2000, 14001:2004 and 6 Sigma. The long-term goal of this voluntary, industry-designed certification program is to develop a transparent, globally accepted system for validating energy efficiency projects and management practices.
This system would create a verified record of energy savings with potential market value that could be recognized among sectors and countries.
Friesen, Melissa C; Coble, Joseph B; Katki, Hormuzd A; Ji, Bu-Tian; Xue, Shouzheng; Lu, Wei; Stewart, Patricia A
2011-07-01
In epidemiologic studies that rely on professional judgment to assess occupational exposures, the raters' accurate assessment is vital to detect associations. We examined the influence of the type of questionnaire, type of industry, and type of rater on the raters' ability to reliably and validly assess within-industry differences in exposure. Our aim was to identify areas where improvements in exposure assessment may be possible. Subjects from three foundries (n = 72) and three textile plants (n = 74) in Shanghai, China, completed an occupational history (OH) and an industry-specific questionnaire (IQ). Six total dust measurements were collected per subject and were used to calculate a subject-specific measurement mean, which was used as the gold standard. Six raters independently ranked the intensity of each subject's current job on an ordinal scale (1-4) based on the OH alone and on the OH and IQ together. Aggregate ratings were calculated for the group, for industrial hygienists, and for occupational physicians. We calculated intra-class correlation coefficients (ICCs) to evaluate the reliability of the raters. We calculated the correlation between the subject-specific measurement means and the ratings to evaluate the raters' validity. Analyses were stratified by industry, type of questionnaire, and type of rater. We also examined the agreement between the ratings by exposure category, where the subject-specific measurement means were categorized into two and four categories. The reliability and validity measures were higher for the aggregate ratings than for the ratings from the individual raters. The group's performance was maximized with three raters. Both the reliability and validity measures were higher for the foundry industry than for the textile industry. The ICCs were consistently lower in the OH/IQ round than in the OH round in both industries. 
In contrast, the correlations with the measurement means were higher in the OH/IQ round than in the OH round for the foundry industry (group rating, OH/IQ: Spearman rho = 0.77; OH: rho = 0.64). No pattern by questionnaire type was observed for the textile industry (group rating, Spearman rho = 0.50, both assessment rounds). For both industries, the agreement by exposure category was higher when the task was reduced to discriminating between two rather than four exposure categories. Assessments based on professional judgment may reduce misclassification by using two or three raters, by using questionnaires that systematically collect task information, and by defining intensity categories that are distinguishable by the raters. However, few studies have the resources to use multiple raters, and these additional efforts may not be adequate for obtaining valid subjective ratings. Thus, improving exposure assessment approaches for studies that rely on professional judgment remains an important research need.
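The rank-based validity measure used above, Spearman's rho between ordinal ratings and subject-specific measurement means, can be sketched in a few lines. The rating and measurement values below are illustrative, not the study's data:

```python
# Spearman rank correlation between raters' ordinal intensity scores (1-4)
# and subject-specific exposure means -- a minimal sketch with made-up data.

def ranks(values):
    """Assign 1-based average ranks (ties receive the mean of their ranks)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # mean of positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical data: group ratings (1-4) vs. measured dust means (mg/m3)
ratings = [1, 2, 2, 3, 4, 4, 1, 3]
means = [0.4, 0.9, 1.1, 2.0, 3.5, 2.9, 0.5, 1.8]
print(round(spearman(ratings, means), 2))
```

A rho near 1 would indicate that the raters' ordinal rankings track the gold-standard measurement means closely, as reported for the foundry industry.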
Failure mode and effects analysis outputs: are they valid?
2012-01-01
Background: Failure Mode and Effects Analysis (FMEA) is a prospective risk assessment tool that has been widely used within the aerospace and automotive industries and has been utilised within healthcare since the early 1990s. The aim of this study was to explore the validity of FMEA outputs within a hospital setting in the United Kingdom.
Methods: Two multidisciplinary teams each conducted an FMEA for the use of vancomycin and gentamicin. Four different validity tests were conducted:
· Face validity: by comparing the FMEA participants’ mapped processes with observational work.
· Content validity: by presenting the FMEA findings to other healthcare professionals.
· Criterion validity: by comparing the FMEA findings with data reported on the trust’s incident report database.
· Construct validity: by exploring the relevant mathematical theories involved in calculating the FMEA risk priority number (RPN).
Results: Face validity was positive, as the researcher documented the same processes of care as mapped by the FMEA participants. However, other healthcare professionals identified potential failures missed by the FMEA teams. Furthermore, the FMEA groups failed to include failures related to omitted doses, yet these were the failures most commonly reported in the trust’s incident database. Calculating the RPN by multiplying severity, probability and detectability scores was deemed invalid because it is based on calculations that breach the mathematical properties of the scales used.
Conclusion: There are significant methodological challenges in validating FMEA. It is a useful tool to aid multidisciplinary groups in mapping and understanding a process of care; however, the results of our study cast doubt on its validity. FMEA teams are likely to need different sources of information, besides their personal experience and knowledge, to identify potential failures.
As for FMEA’s methodology for scoring failures, there were discrepancies between the teams’ estimates and similar incidents reported on the trust’s incident database. Furthermore, the concept of multiplying ordinal scales to prioritise failures is mathematically flawed. Until FMEA’s validity is further explored, healthcare organisations should not solely depend on their FMEA results to prioritise patient safety issues. PMID:22682433
Abdollahi, Yadollah; Sairi, Nor Asrina; Said, Suhana Binti Mohd; Abouzari-lotf, Ebrahim; Zakaria, Azmi; Sabri, Mohd Faizul Bin Mohd; Islam, Aminul; Alias, Yatimah
2015-11-05
It is believed that 80% of industrial carbon dioxide emissions can be controlled by separation and storage technologies that use blended ionic liquid absorbers. Among the blended absorbers, the mixture of water, N-methyldiethanolamine (MDEA) and guanidinium trifluoromethane sulfonate (gua) has shown superior stripping qualities. However, the blended solution exhibits high viscosity, which increases the cost of the separation process. In this work, the fabrication of the blend was scheduled, that is, the process of arranging, controlling and optimizing it. The blend's components and operating temperature were modeled and optimized as effective input variables to minimize its viscosity as the final output by using a back-propagation artificial neural network (ANN). The modeling was carried out with four mathematical algorithms, each with an individual experimental design, to obtain the optimum topology using root mean squared error (RMSE), R-squared (R(2)) and absolute average deviation (AAD). As a result, the final model (QP-4-8-1), with the minimum RMSE and AAD as well as the highest R(2), was selected to guide the fabrication of the blended solution. The model was then applied to obtain the optimum initial levels of the input variables, which included temperature 303-323 K, x[gua] 0-0.033, x[MDEA] 0.3-0.4, and x[H2O] 0.7-1.0. Moreover, the model yielded the relative importance order of the variables: x[gua] > temperature > x[MDEA] > x[H2O]; none of the variables was negligible in the fabrication. Furthermore, the model predicted the optimum points of the variables to minimize the viscosity, which was validated by further experiments. The validation results confirmed the model's suitability for scheduling the fabrication. Accordingly, ANN succeeded in modeling the initial components of the blended solutions as absorbers for CO2 capture in separation technologies, in a form that can be scaled up to industrial level. Copyright © 2015 Elsevier B.V. All rights reserved.
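The three model-selection statistics the abstract relies on (RMSE, R(2) and AAD) can be sketched as follows; the predicted/observed viscosity values are illustrative, not the study's data:

```python
# RMSE, R^2 and absolute average deviation (AAD, in percent) for a toy set
# of observed vs. predicted viscosities -- the statistics used to choose
# the optimum ANN topology in the abstract.

def rmse(obs, pred):
    n = len(obs)
    return (sum((o - p) ** 2 for o, p in zip(obs, pred)) / n) ** 0.5

def r_squared(obs, pred):
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

def aad(obs, pred):
    # mean absolute relative deviation, expressed in percent
    return 100 / len(obs) * sum(abs((p - o) / o) for o, p in zip(obs, pred))

obs = [2.1, 2.6, 3.4, 4.0, 5.2]    # e.g. measured viscosity (illustrative units)
pred = [2.0, 2.7, 3.3, 4.2, 5.1]   # model predictions
print(round(rmse(obs, pred), 3), round(r_squared(obs, pred), 3), round(aad(obs, pred), 2))
```

A candidate model with the lowest RMSE and AAD and the highest R(2), as for QP-4-8-1 above, would be retained.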
Assessing Requirements Quality through Requirements Coverage
NASA Technical Reports Server (NTRS)
Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt
2008-01-01
In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, that is, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage.
The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
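The MC/DC criterion cited above can be illustrated with a minimal sketch: for a decision, each condition must be shown to independently toggle the outcome, demonstrated by a pair of test cases that differ only in that condition. The decision below is an arbitrary example, not one from the paper:

```python
# For the decision D = A and (B or C), find, for each condition, the pairs
# of test cases that differ only in that condition yet flip the decision.
# Such "independence pairs" are the core requirement of MC/DC.
from itertools import product

def decision(a, b, c):
    return a and (b or c)

cases = list(product([False, True], repeat=3))

def independence_pairs(pos):
    """Pairs of cases differing only in condition `pos` that flip the decision."""
    pairs = []
    for t in cases:
        u = list(t)
        u[pos] = not u[pos]
        u = tuple(u)
        if decision(*t) != decision(*u) and t < u:  # t < u avoids double-counting
            pairs.append((t, u))
    return pairs

for i, name in enumerate("ABC"):
    print(name, independence_pairs(i))
```

An MC/DC-adequate test set must include at least one such pair per condition; note that condition A has several candidate pairs while B and C each have exactly one, so those pairs are mandatory.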
Energy optimization aspects by injection process technology
NASA Astrophysics Data System (ADS)
Tulbure, A.; Ciortea, M.; Hutanu, C.; Farcas, V.
2016-08-01
In the proposed paper, the authors examine the energy aspects related to the injection moulding process technology in the automotive industry. Theoretical considerations have been validated by experimental measurements on the manufacturing process for two types of injection moulding machines, hydraulic and electric. Practical measurements have been taken with professional equipment separately on each technological operation: lamination, compression, injection and expansion. For results traceability, the following parameters were, whenever possible, maintained: cycle time, product weight and the relative time. The aim of the investigations was to carry out a professional energy audit with accurate identification of losses. Based on the technological diagram for each production cycle, at the end of this contribution some measures to reduce energy consumption are proposed.
A Fresnel collector process heat experiment at Capitol Concrete Products
NASA Technical Reports Server (NTRS)
Hauger, J. S.
1981-01-01
An experiment is planned, conducted and evaluated to determine the feasibility of using a Power Kinetics' Fresnel concentrator to provide process heat in an industrial environment. The plant provides process steam at 50 to 60 psig to two autoclaves for curing masonry blocks. When steam is not required, the plant preheats hot water for later use. A second system is installed at the Jet Propulsion Laboratory parabolic dish test site for hardware validation and experiment control. Experiment design allows for the extrapolation of results to varying demands for steam and hot water, and includes a consideration of some socio-technical factors such as the impact on production scheduling of diurnal variations in energy availability.
Rahman, Md. Sayedur; Sathasivam, Kathiresan V.
2015-01-01
Biosorption process is a promising technology for the removal of heavy metals from industrial wastes and effluents using low-cost and effective biosorbents. In the present study, adsorption of Pb2+, Cu2+, Fe2+, and Zn2+ onto dried biomass of red seaweed Kappaphycus sp. was investigated as a function of pH, contact time, initial metal ion concentration, and temperature. The experimental data were evaluated by four isotherm models (Langmuir, Freundlich, Temkin, and Dubinin-Radushkevich) and four kinetic models (pseudo-first-order, pseudo-second-order, Elovich, and intraparticle diffusion models). The adsorption process was feasible, spontaneous, and endothermic in nature. Functional groups in the biomass involved in metal adsorption process were revealed as carboxylic and sulfonic acids and sulfonate by Fourier transform infrared analysis. A total of nine error functions were applied to validate the models. We strongly suggest the analysis of error functions for validating adsorption isotherm and kinetic models using linear methods. The present work shows that the red seaweed Kappaphycus sp. can be used as a potentially low-cost biosorbent for the removal of heavy metal ions from aqueous solutions. Further study is warranted to evaluate its feasibility for the removal of heavy metals from the real environment. PMID:26295032
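One of the isotherm models named above (Langmuir) and one common error function (sum of squared errors) can be sketched as follows. The equilibrium data are synthetic, generated for illustration, not the Kappaphycus measurements:

```python
# Fit the Langmuir isotherm, qe = qmax*KL*Ce / (1 + KL*Ce), via its common
# linearisation Ce/qe = Ce/qmax + 1/(KL*qmax), then evaluate one of the
# error functions used to validate such fits (sum of squared errors, SSE).

def linear_fit(x, y):
    """Ordinary least squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

Ce = [5.0, 10.0, 20.0, 40.0, 80.0]   # equilibrium concentration, mg/L (synthetic)
qe = [8.3, 12.5, 16.7, 20.0, 22.2]   # metal uptake, mg/g (synthetic)

slope, intercept = linear_fit(Ce, [c / q for c, q in zip(Ce, qe)])
qmax = 1 / slope          # maximum adsorption capacity, mg/g
KL = slope / intercept    # Langmuir constant, L/mg

def langmuir(c):
    return qmax * KL * c / (1 + KL * c)

sse = sum((q - langmuir(c)) ** 2 for c, q in zip(Ce, qe))
print(round(qmax, 1), round(KL, 3), round(sse, 4))
```

In practice several error functions (SSE, chi-square, average relative error, etc.) are compared, since linearisation distorts the error structure; that is the motivation for the nine error functions the study applies.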
Results from an Independent View on The Validation of Safety-Critical Space Systems
NASA Astrophysics Data System (ADS)
Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.
2013-08-01
Independent verification and validation (IV&V) has been a key process for decades and is considered in several international standards. One of the activities described in the “ESA ISVV Guide” is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked, since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the testers' work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained and the advantages/disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.
Sterilization validation for medical compresses at IRASM multipurpose irradiation facility
NASA Astrophysics Data System (ADS)
Alexandru, Mioara; Ene, Mihaela
2007-08-01
In Romania, the IRASM Radiation Processing Center is the unique supplier of radiation sterilization services at industrial scale (ISO 9001:2000 and ISO 13485:2003 certified). Its Laboratory of Microbiological Testing is the sole third-party competent laboratory (GLP license, ISO 17025 certification in progress) for pharmaceutics and medical devices as well. We here refer to medical compresses as a distinct category of sterile products, made from different kinds of hydrophilic materials (cotton, non-woven, polyurethane foam) with or without an impregnated ointment base (paraffin, plant extracts). These products are included in the class of medical devices, but for sterilization validation, from a microbiological point of view, there are important differences in testing method compared to common medical devices (syringes, catheters, etc.). In this paper, we present some results and practical solutions chosen to perform a sterilization validation compliant with ISO 11137:2006.
Development and Validation of a 3-Dimensional CFB Furnace Model
NASA Astrophysics Data System (ADS)
Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti
At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory- and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of 3-dimensional process modeling, which provides a chain of knowledge that is utilized as feedback for phenomenon research. Knowledge gathered by model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to modeling of combustion and formation of char and volatiles for various fuel types in CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window for characterizing fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace and are used, together with lateral temperature profiles at the bed and in the upper parts of the furnace, to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed by the lower furnace design and the mixing characteristics of fuel and combustion air, which affect the NO furnace profile through reduction and volatile-nitrogen reactions.
This paper presents CFB process analysis focused on combustion and NO profiles in pilot and industrial scale bituminous coal combustion.
Scheduling job shop - A case study
NASA Astrophysics Data System (ADS)
Abas, M.; Abbas, A.; Khan, W. A.
2016-08-01
Scheduling in a job shop is important for efficient utilization of machines in the manufacturing industry. There are a number of algorithms available for scheduling of jobs, which depend on the machine tools, indirect consumables and the jobs to be processed. In this paper a case study is presented for scheduling of jobs when parts are processed on the available machines. Through time and motion study, setup time and operation time are measured as the total processing time for a variety of products having different manufacturing processes. Based on due dates, different levels of priority are assigned to the jobs, and the jobs are scheduled on the basis of priority. In view of the measured processing times, the processing times of some new jobs are estimated, and an algorithm for efficient utilization of the available machines is proposed and validated.
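The due-date-based priority scheduling described above can be sketched as an earliest-due-date dispatch rule on a single machine; the job data (names, processing times, due dates) are hypothetical:

```python
# Earliest-due-date (EDD) dispatching: jobs with measured processing times
# (setup + operation) are sequenced by due date, then completion times and
# lateness are computed. Job data are illustrative.

jobs = [
    # (name, processing time in hours, due date in hours from now)
    ("J1", 4.0, 16.0),
    ("J2", 2.5, 6.0),
    ("J3", 6.0, 30.0),
    ("J4", 3.0, 10.0),
]

schedule = sorted(jobs, key=lambda j: j[2])   # earliest due date first

t = 0.0
completion, lateness = {}, {}
for name, p, due in schedule:
    t += p
    completion[name] = t
    lateness[name] = max(0.0, t - due)

print([name for name, _, _ in schedule])
print(completion, lateness)
```

EDD minimizes the maximum lateness on a single machine; a job-shop with multiple machines needs a dispatching rule applied per machine queue, which is the setting the case study addresses.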
Application of a Model for Simulating the Vacuum Arc Remelting Process in Titanium Alloys
NASA Astrophysics Data System (ADS)
Patel, Ashish; Tripp, David W.; Fiore, Daniel
Mathematical modeling is routinely used in the process development and production of advanced aerospace alloys to gain greater insight into system dynamics and to predict the effect of process modifications or upsets on final properties. This article describes the application of a 2-D mathematical VAR model presented in previous LMPC meetings. The impact of process parameters on melt pool geometry, solidification behavior, fluid-flow and chemistry in Ti-6Al-4V ingots will be discussed. Model predictions were first validated against the measured characteristics of industrially produced ingots, and process inputs and model formulation were adjusted to match macro-etched pool shapes. The results are compared to published data in the literature. Finally, the model is used to examine ingot chemistry during successive VAR melts.
Rahman, Syed Abidur; Taghizadeh, Seyedeh Khadijeh; Ramayah, T; Ahmad, Noor Hazlina
2015-01-01
Service innovation management practice is currently being widely scrutinized mainly in the developed countries, where it has been initiated. The current study attempts to propose a framework and empirically validate and explain the service innovation practices for successful performance in the telecommunications industry of two developing countries, Malaysia and Bangladesh. The research framework proposes relationships among organisational culture, operating core (innovation process, cross-functional organisation, and implementation of tools/technology), competition-informed pricing, and performance. A total of 176 usable data from both countries are analysed for the purpose of the research. The findings show that organisational culture tends to be more influential on innovation process and cross-functional organisation in the Malaysian telecommunication industry. In contrast, implementation of tools/technology plays a more instrumental role in competition-informed pricing practices in Bangladesh. This study revealed a few differences in the innovation management practices between the two developing countries. The findings have strategic implications for the service sectors in both developing countries regarding implementation of innovative enterprises, especially in Bangladesh, where innovation is the basis for survival. Testing the innovation management practices in developing countries perhaps contains uniqueness in the field of innovation management.
NASA Astrophysics Data System (ADS)
Binti Shamsuddin, Norsila
Technology advancement and development in a higher learning institution is a chance for students to be motivated to learn in depth in the information technology areas. Students should take hold of the opportunity to blend their skills towards these technologies as preparation for when they graduate. The curriculum itself can raise the students' interest and persuade them to be directly involved in the evolution of the technology. The aim of this study is to see how deep the students' involvement is, as well as their acceptance of the adoption of the technology used in Computer Graphics and Image Processing subjects. The study covers Bachelor students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. in Multimedia Industry, BSc. Computer Science and BSc. Computer Science (Software Engineering). This study utilizes the Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of Computer Graphics and Image Processing technologies. Four (4) of the eight (8) independent factors in UTAUT will be studied in relation to the dependent factor.
Wanka, Franziska; Arentshorst, Mark; Cairns, Timothy C; Jørgensen, Thomas; Ram, Arthur F J; Meyer, Vera
2016-08-20
The filamentous ascomycete Aspergillus niger is used in many industrial processes for the production of enzymes and organic acids by batch and fed-batch cultivation. An alternative technique is continuous cultivation, which promises improved yield and optimized pipeline efficiency. In this work, we have used perfusion (retentostat) cultivation to validate two promoters that are suitable for A. niger continuous cultivation of industrially relevant products. Firstly, promoters of genes encoding either an antifungal protein (Panafp) or putative hydrophobin (PhfbD) were confirmed as active throughout retentostat culture by assessing mRNA and protein levels using a luciferase (mluc) reporter system. This demonstrated the anafp promoter mediates a high but temporally variable expression profile, whereas the hfbD promoter mediates a semi-constant, moderate-to-high protein expression during retentostat culture. In order to assess whether these promoters were suitable to produce heterologous proteins during retentostat cultivation, the secreted antifungal protein (AFP) from Aspergillus giganteus, which has many potential biotechnological applications, was expressed in A. niger during retentostat cultivation. Additionally, this assay was used to concomitantly validate that native secretion signals encoded in anafp and hfbD genes can be harnessed for secretion of heterologous proteins. Afp mRNA and protein abundance were comparable to luciferase measurements throughout retentostat cultivation, validating the use of Panafp and PhfbD for perfusion cultivation. Finally, a gene encoding the highly commercially relevant thermal hysteresis protein (THP) was expressed in this system, which did not yield detectable protein. Both hfbD and anafp promoters are suitable for production of useful products in A. niger during perfusion cultivation. These findings provide a platform for further optimisations for high production of heterologous proteins with industrial relevance.
Chen, Zejun; Han, Huiquan; Ren, Wei; Huang, Guangjie
2015-01-01
On-line spray water cooling (OSWC) of electric-resistance-welded (ERW) steel pipes can replace the conventional off-line heat treatment process and become an important and critical procedure. The OSWC process improves production efficiency, decreases costs, and enhances the mechanical properties of ERW steel pipe, especially the impact properties of the weld joint. In this paper, an annular OSWC process is investigated based on an experimental simulation platform that can obtain precise real-time measurements of the temperature of the pipe, the water pressure and flux, etc. The effects of the modes of annular spray water cooling and related cooling parameters on the mechanical properties of the pipe are investigated. The temperature evolutions of the inner and outer walls of the pipe are measured during the spray water cooling process, and the uniformity of mechanical properties along the circumferential and longitudinal directions is investigated. A heat transfer coefficient model of spray water cooling is developed based on measured temperature data in conjunction with simulation using the finite element method. Industrial tests prove the validity of the heat transfer model of a steel pipe undergoing spray water cooling. The research results can provide a basis for the industrial application of the OSWC process in the production of ERW steel pipes. PMID:26201073
NASA Astrophysics Data System (ADS)
Dou, S.; Commer, M.; Ajo Franklin, J. B.; Freifeld, B. M.; Robertson, M.; Wood, T.; McDonald, S.
2017-12-01
Archer Daniels Midland Company's (ADM) world-scale agricultural processing and biofuels production complex located in Decatur, Illinois, is host to two industrial-scale carbon capture and storage projects. The first operation within the Illinois Basin-Decatur Project (IBDP) is a large-scale pilot that injected 1,000,000 metric tons of CO2 over a three year period (2011-2014) in order to validate the Illinois Basin's capacity to permanently store CO2. Injection for the second operation, the Illinois Industrial Carbon Capture and Storage Project (ICCS), started in April 2017, with the purpose of demonstrating the integration of carbon capture and storage (CCS) technology at an ethanol plant. The capacity to store over 1,000,000 metric tons of CO2 per year is anticipated. The latter project is accompanied by the development of an intelligent monitoring system (IMS) that will, among other tasks, perform hydrogeophysical joint analysis of pressure, temperature and seismic reflection data. Using a preliminary radial model assumption, we carry out synthetic joint inversion studies of these data combinations. We validate the history-matching process to be applied to field data once CO2-breakthrough at observation wells occurs. This process will aid the estimation of permeability and porosity for a reservoir model that best matches monitoring observations. The reservoir model will further be used for forecasting studies in order to evaluate different leakage scenarios and develop appropriate early-warning mechanisms. Both the inversion and forecasting studies aim at building an IMS that will use the seismic and pressure-temperature data feeds for providing continuous model calibration and reservoir status updates.
Tomasula, P M; Yee, W C F; McAloon, A J; Nutter, D W; Bonnaillie, L M
2013-05-01
Energy-savings measures have been implemented in fluid milk plants to lower energy costs and the energy-related carbon dioxide (CO2) emissions. Although these measures have resulted in reductions in steam, electricity, compressed air, and refrigeration use of up to 30%, a benchmarking framework is necessary to examine the implementation of process-specific measures that would lower energy use, costs, and CO2 emissions even further. In this study, using information provided by the dairy industry and equipment vendors, a customizable model of the fluid milk process was developed for use in process design software to benchmark the electrical and fuel energy consumption and CO2 emissions of current processes. It may also be used to test the feasibility of new processing concepts to lower energy and CO2 emissions with calculation of new capital and operating costs. The accuracy of the model in predicting total energy usage of the entire fluid milk process and the pasteurization step was validated using available literature and industry energy data. Computer simulation of small (40.0 million L/yr), medium (113.6 million L/yr), and large (227.1 million L/yr) processing plants predicted the carbon footprint of milk, defined as grams of CO2 equivalents (CO2e) per kilogram of packaged milk, to within 5% of the value of 96 g of CO2e/kg of packaged milk obtained in an industry-conducted life cycle assessment and also showed, in agreement with the same study, that plant size had no effect on the carbon footprint of milk but that larger plants were more cost effective in producing milk. Analysis of the pasteurization step showed that increasing the percentage regeneration of the pasteurizer from 90 to 96% would lower its thermal energy use by almost 60% and that implementation of partial homogenization would lower electrical energy use and CO2e emissions of homogenization by 82 and 5.4%, respectively.
It was also demonstrated that implementation of steps to lower non-process-related electrical energy in the plant would be more effective in lowering energy use and CO2e emissions than fuel-related energy reductions. The model also predicts process-related water usage, but this portion of the model was not validated due to a lack of data. The simulator model can serve as a benchmarking framework for current plant operations and a tool to test cost-effective process upgrades or evaluate new technologies that improve the energy efficiency and lower the carbon footprint of milk processing plants. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
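The regeneration figure quoted above follows directly from an energy balance: a pasteurizer's thermal duty scales with the fraction of heat *not* recovered, i.e. with (1 − regeneration), so going from 90% to 96% regeneration cuts duty by (0.10 − 0.04)/0.10 = 60%. A minimal sketch (the flow and temperature values are illustrative assumptions, not figures from the study):

```python
def thermal_duty(mass_kg, regen, cp=3.93, t_in=4.0, t_past=72.0):
    """Thermal energy (kJ) to pasteurize `mass_kg` of milk with heat
    regeneration fraction `regen` (0..1). cp in kJ/(kg*K); inlet and
    pasteurization temperatures in deg C are illustrative values."""
    return mass_kg * cp * (t_past - t_in) * (1.0 - regen)

q90 = thermal_duty(1000.0, 0.90)
q96 = thermal_duty(1000.0, 0.96)
saving = 1.0 - q96 / q90  # = (0.10 - 0.04) / 0.10 = 0.60
print(f"thermal energy saving: {saving:.0%}")  # -> 60%
```

The saving is independent of the assumed flow rate and temperatures, which cancel in the ratio; only the regeneration fractions matter.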
Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric
2014-03-01
Pharmaceutical companies are progressively adopting and introducing the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, which aim to build quality directly into the product by combining thorough scientific understanding with quality risk management. An analytical method based on near-infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The working ranges were established as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. Because the variability of the sampling method and the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and elimination of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Saeli, Manfredi; Novais, Rui M.; Seabra, Maria Paula; Labrincha, João A.
2017-11-01
Sustainability in construction is a major concern worldwide, due to the huge volume of materials and energy consumed by this sector. Associated supplementing industries (e.g. Portland cement production) constitute a significant source of CO2 emissions and global warming. Valorisation and reuse of industrial wastes and by-products make geopolymers a solid and sustainable path to follow as a valid alternative to Portland cement. In this work the mix design of a green fly ash-based geopolymer is evaluated as an environmentally friendly construction material. In the pursuit of sustainability, wastes from a regional kraft pulp industry are exploited for the material processing. Furthermore, a simple, reproducible, and low-cost manufacturing process is used. The mix design is hence optimised in order to improve the mechanical performance of the material, which is intended for structural applications in construction. Tests indicate that geopolymers may efficiently substitute ordinary Portland cement as a mortar/concrete binder. Furthermore, valorisation and reuse of wastes in geopolymers is a viable way of generating financial surplus for the industrial players involved, while contributing to the implementation of a desirable circular economy.
ERIC Educational Resources Information Center
DUENK, LESTER G.
The primary objective of this study was to establish the concurrent validity of the Minnesota Tests of Creative Thinking, Abbreviated Form VII (MTCT VII), by determining the relationship between its scores and creative ability as measured by accumulated teacher ratings of industrial arts projects and investigator-developed tests of creativity. The…
Sensor sentinel computing device
Damico, Joseph P.
2016-08-02
Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.
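The abstract does not specify how the validation signal is computed from the time-series signal; one plausible sketch, offered purely as an assumption, is a keyed hash (HMAC) over the packed samples, which would let the PLC detect tampered or substituted sensor data. The key name and framing below are hypothetical:

```python
import hmac
import hashlib
import struct

SECRET = b"device-provisioned-key"  # hypothetical key shared with the PLC

def validation_signal(samples, key=SECRET):
    """One plausible validation function (an assumption, not the patented
    design): an HMAC-SHA256 over the little-endian-packed samples."""
    payload = b"".join(struct.pack("<d", s) for s in samples)
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

readings = [20.1, 20.3, 20.2, 19.9]
tag = validation_signal(readings)
# PLC side: recompute over the received samples and compare in constant time
assert hmac.compare_digest(tag, validation_signal(readings))
```

Any modification of the time series changes the tag, so forwarding both samples and tag over the wireline link authenticates the data end to end.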
NASA Technical Reports Server (NTRS)
Walker, Eric L.
2005-01-01
Wind tunnel experiments will continue to be a primary source of validation data for many types of mathematical and computational models in the aerospace industry. The increased emphasis on accuracy of data acquired from these facilities requires understanding of the uncertainty of not only the measurement data but also any correction applied to the data. One of the largest and most critical corrections made to these data is due to wall interference. In an effort to understand the accuracy and suitability of these corrections, a statistical validation process for wall interference correction methods has been developed. This process is based on the use of independent cases which, after correction, are expected to produce the same result. Comparison of these independent cases with respect to the uncertainty in the correction process establishes a domain of applicability based on the capability of the method to provide reasonable corrections with respect to customer accuracy requirements. The statistical validation method was applied to the version of the Transonic Wall Interference Correction System (TWICS) recently implemented in the National Transonic Facility at NASA Langley Research Center. The TWICS code generates corrections for solid and slotted wall interference in the model pitch plane based on boundary pressure measurements. Before validation could be performed on this method, it was necessary to calibrate the ventilated wall boundary condition parameters. Discrimination comparisons are used to determine the most representative of three linear boundary condition models which have historically been used to represent longitudinally slotted test section walls. Of the three linear boundary condition models implemented for ventilated walls, the general slotted wall model was the most representative of the data. 
The TWICS code using the calibrated general slotted wall model was found to be valid to within the process uncertainty for test section Mach numbers less than or equal to 0.60. The scatter among the mean corrected results of the bodies of revolution validation cases was within one count of drag on a typical transport aircraft configuration for Mach numbers at or below 0.80 and two counts of drag for Mach numbers at or below 0.90.
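For readers outside the wind-tunnel community, the "count" used above is the standard aerospace unit of 1×10⁻⁴ in drag coefficient. A trivial conversion sketch (the tolerance values simply restate those quoted in the abstract):

```python
def drag_counts(delta_cd):
    """Convert a drag-coefficient increment to drag counts
    (1 count = 1e-4 in C_D, the standard aerospace convention)."""
    return delta_cd / 1e-4

# Scatter among the corrected validation cases, expressed in counts:
print(drag_counts(0.0001))  # -> 1.0 count  (the M <= 0.80 tolerance above)
print(drag_counts(0.0002))  # -> 2.0 counts (the M <= 0.90 tolerance above)
```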
Challenges and opportunities in the manufacture and expansion of cells for therapy.
Maartens, Joachim H; De-Juan-Pardo, Elena; Wunner, Felix M; Simula, Antonio; Voelcker, Nicolas H; Barry, Simon C; Hutmacher, Dietmar W
2017-10-01
Laboratory-based ex vivo cell culture methods are largely manual in their manufacturing processes. This makes it extremely difficult to meet regulatory requirements for process validation, quality control and reproducibility. Cell culture concepts with a translational focus need to embrace a more automated approach where cell yields are able to meet the quantitative production demands, the correct cell lineage and phenotype is readily confirmed and reagent usage has been optimized. Areas covered: This article discusses the obstacles inherent in classical laboratory-based methods and their concomitant impact on cost of goods, and argues that a technology step change is required to facilitate translation from bench to bedside. Expert opinion: While traditional bioreactors have demonstrated limited success where adherent cells are used in combination with microcarriers, further process optimization will be required to find solutions for commercial-scale therapies. New cell culture technologies based on 3D-printed cell culture lattices with favourable surface-to-volume ratios have the potential to change the paradigm in industry. An integrated Quality-by-Design/systems engineering approach will be essential to facilitate the scaled-up translation from proof-of-principle to clinical validation.
Emulsion droplet interactions: a front-tracking treatment
NASA Astrophysics Data System (ADS)
Mason, Lachlan; Juric, Damir; Chergui, Jalel; Shin, Seungwon; Craster, Richard V.; Matar, Omar K.
2017-11-01
Emulsion coalescence influences a multitude of industrial applications including solvent extraction, oil recovery and the manufacture of fast-moving consumer goods. Droplet interaction models are vital for the design and scale-up of processing systems, however predictive modelling at the droplet-scale remains a research challenge. This study simulates industrially relevant moderate-inertia collisions for which a high degree of droplet deformation occurs. A hybrid front-tracking/level-set approach is used to automatically account for interface merging without the need for `bookkeeping' of interface connectivity. The model is implemented in Code BLUE using a parallel multi-grid solver, allowing both film and droplet-scale dynamics to be resolved efficiently. Droplet interaction simulations are validated using experimental sequences from the literature in the presence and absence of background turbulence. The framework is readily extensible for modelling the influence of surfactants and non-Newtonian fluids on droplet interaction processes. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM), PETRONAS.
The current state of drug discovery and a potential role for NMR metabolomics.
Powers, Robert
2014-07-24
The pharmaceutical industry has significantly contributed to improving human health. Drugs have been attributed to both increasing life expectancy and decreasing health care costs. Unfortunately, there has been a recent decline in the creativity and productivity of the pharmaceutical industry. This is a complex issue with many contributing factors resulting from the numerous mergers, increase in out-sourcing, and the heavy dependency on high-throughput screening (HTS). While a simple solution to such a complex problem is unrealistic and highly unlikely, the inclusion of metabolomics as a routine component of the drug discovery process may provide some solutions to these problems. Specifically, as the binding affinity of a chemical lead is evolved during the iterative structure-based drug design process, metabolomics can provide feedback on the selectivity and the in vivo mechanism of action. Similarly, metabolomics can be used to evaluate and validate HTS leads. In effect, metabolomics can be used to eliminate compounds with potential efficacy and side effect problems while prioritizing well-behaved leads with druglike characteristics.
OAST Technology for the Future. Executive Summary
NASA Technical Reports Server (NTRS)
1988-01-01
NASA's Office of Aeronautics and Space Technology (OAST) conducted a workshop on the In-Space Technology Experiments Program (IN-STEP) December 6-9, 1988, in Atlanta, Georgia. The purpose of this workshop was to identify and prioritize space technologies which are critical for future national space programs and which require validation in the space environment. A secondary objective was to review the current NASA (In-Reach) and Industry/University (Out-Reach) experiments. Finally, the aerospace community was requested to review and comment on the proposed plans for the continuation of the In-Space Technology Experiments Program. In particular, the review included the proposed process for focusing the next experiment selection on specific, critical technologies and the process for implementing the hardware development and integration on the Space Shuttle vehicle. The product of the workshop was a prioritized listing of the critical space technology needs in each of eight technology disciplines. These listings were the cumulative recommendations of nearly 400 participants, which included researchers, technologists, and managers from aerospace industries, universities, and government organizations.
NASA Astrophysics Data System (ADS)
Rashid, Asif; Masood, Tariq; Erkoyuncu, John Ahmet; Tjahjono, Benny; Khan, Nawar; Shami, Muiz-ud-din
2018-02-01
The research aims to investigate business value critical success factors (CSFs) of enterprise systems (ES) through their life cycle, in pursuit of a resilient smart factory for the emerging aircraft industry. This article provides an extensive analysis of the past 22 years of literature, based on rigorous author-selection criteria: (i) authors who have published strategic content relevant to CSFs, (ii) who have received more than 300 citations and (iii) who have concurrently published two or more papers relevant to ES CSFs. The most cited strategic CSFs were termed classical CSFs. The 22 CSFs were identified, validated and synthesised by aircraft industry experts for a better understanding of success across the life cycle. The top 10 empirically verified CSFs show numerous differences from the past generic classical CSFs. This article canvasses two distinct views of the ES CSFs: the process and variance approaches. The process approach, a neglected research area, helps researchers identify the ES life cycle process together with a view of when resource deployment is needed most, while the variance approach helps practitioners and researchers determine which resource (CSF) is relatively more important. The significant findings can help practitioners and researchers make rational decisions throughout the ES life cycle.
Certifiable database generation for SVS
NASA Astrophysics Data System (ADS)
Schiefele, Jens; Damjanovic, Dejan; Kubbat, Wolfgang
2000-06-01
In future aircraft cockpits, SVS will be used to display 3D physical and virtual information to pilots. A review of prototype and production Synthetic Vision Displays (SVD) from Euro Telematic, UPS Advanced Technologies, Universal Avionics, VDO-Luftfahrtgeratewerk, and NASA is presented. Because terrain, obstacle, navigation, and airport data are needed as data sources, Jeppesen-Sanderson, Inc. and Darmstadt Univ. of Technology are currently developing certifiable methods for the acquisition, validation, and processing of terrain, obstacle, and airport databases. The acquired data will be integrated into a High-Quality Database (HQ-DB). This database is the master repository; it contains all information relevant to all types of aviation applications. From the HQ-DB, SVS-relevant data is retrieved, converted, decimated, and adapted into an SVS Real-Time Onboard Database (RTO-DB). The process of data acquisition, verification, and data processing will be defined in a way that allows certification within DO-200A and new RTCA/EUROCAE standards for airport and terrain data. The proposed open formats will be established and evaluated for industrial usability. Finally, a NASA-industry cooperation to develop industrial SVS products under the umbrella of the NASA Aviation Safety Program (ASP) is introduced. A key element of the SVS NASA-ASP is the Jeppesen-led task to develop methods for worldwide database generation and certification. Jeppesen will build three airport databases that will be used in flight trials with NASA aircraft.
1989-09-25
Orders and test specifications. Some mandatory replacements of high-failure items are directed by Technical Orders to extend MTBF. Precision bearing and... Experience is very high but natural attrition is reducing the numbers faster than training is furnishing younger mechanics. Surge conditions would be... model validation run output revealed that utilization of equipment is very low and manpower is high. Based on this analysis and the brainstorming
European Workshop on Industrial Computer Systems approach to design for safety
NASA Technical Reports Server (NTRS)
Zalewski, Janusz
1992-01-01
This paper presents guidelines on designing systems for safety, developed by the Technical Committee 7 on Reliability and Safety of the European Workshop on Industrial Computer Systems. The focus is on complementing the traditional development process by adding the following four steps: (1) overall safety analysis; (2) analysis of the functional specifications; (3) designing for safety; (4) validation of design. Quantitative assessment of safety is possible by means of a modular questionnaire covering various aspects of the major stages of system development.
van der Kuijp, Tsering Jan; Huang, Lei; Cherry, Christopher R
2013-08-03
Despite China's leaded gasoline phase-out in 2000, the continued high rates of lead poisoning found in children's blood lead levels reflect the need for identifying and controlling other sources of lead pollution. From 2001 to 2007, 24% of children in China studied (N = 94,778) were lead poisoned with levels exceeding 100 μg/L. These levels stand well above the global average of 16%. These trends reveal that China still faces significant public health challenges, with millions of children currently at risk of lead poisoning. The unprecedented growth of China's lead-acid battery industry from the electric bike, automotive, and photovoltaic industries may explain these persistently high levels, as China remains the world's leading producer, refiner, and consumer of both lead and lead-acid batteries. This review assesses the role of China's rising lead-acid battery industry in lead pollution and exposure. It starts with a synthesis of biological mechanisms of lead exposure followed by an analysis of the key technologies driving the rapid growth of this industry. It then details the four main stages of lead battery production, explaining how each stage results in significant lead loss and pollution. A province-level accounting of each of these industrial operations is also included. Next, reviews of the literature describe how this industry may have contributed to mass lead poisonings throughout China. Finally, the paper closes with a discussion of new policies that address the lead-acid battery industry and identifies policy frameworks to mitigate exposure. This paper is the first to integrate the market factors, production processes, and health impacts of China's growing lead-acid battery industry to illustrate its vast public health consequences.
The implications of this review are two-fold: it validates calls for a nationwide assessment of lead exposure pathways and levels in China as well as for a more comprehensive investigation into the health impacts of the lead-acid battery industry. The continuous growth of this industry signals the urgent need for effective regulatory action to protect the health and lives of China's future generations.
Formal verification and testing: An integrated approach to validating Ada programs
NASA Technical Reports Server (NTRS)
Cohen, Norman H.
1986-01-01
An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.
Kern, Simon; Meyer, Klas; Guhl, Svetlana; Gräßer, Patrick; Paul, Andrea; King, Rudibert; Maiwald, Michael
2018-05-01
Monitoring specific chemical properties is the key to chemical process control. Today, mainly optical online methods are applied, which require time- and cost-intensive calibration effort. NMR spectroscopy, which has the advantage of being a direct comparison method with no need for calibration, has a high potential for enabling closed-loop process control while exhibiting short set-up times. Compact NMR instruments make NMR spectroscopy accessible in industrial and rough environments for process monitoring and advanced process control strategies. We present a fully automated data analysis approach that is completely based on physically motivated spectral models as first-principles information (indirect hard modeling, IHM) and apply it to a pharmaceutical lithiation reaction in the framework of the European Union's Horizon 2020 project CONSENS. Online low-field NMR (LF NMR) data was analyzed by IHM with low calibration effort, compared to a multivariate PLS-R (partial least squares regression) approach, and both were validated using online high-field NMR (HF NMR) spectroscopy. Graphical abstract: NMR sensor module for monitoring of the aromatic coupling of 1-fluoro-2-nitrobenzene (FNB) with aniline to 2-nitrodiphenylamine (NDPA) using lithium bis(trimethylsilyl)amide (Li-HMDS) in continuous operation. Online 43.5 MHz low-field NMR (LF) was compared to 500 MHz high-field NMR spectroscopy (HF) as the reference method.
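The core idea of indirect hard modeling, fitting physically motivated line shapes whose areas track concentrations rather than training a purely statistical calibration, can be illustrated with a toy two-peak spectrum. This sketch is not the CONSENS implementation; peak positions, widths, and the single-Lorentzian-per-species assumption are all illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, area, center, width):
    """Area-normalized Lorentzian line: integral over x equals `area`."""
    return area * (width / np.pi) / ((x - center) ** 2 + width ** 2)

# Toy "hard model": one Lorentzian per species; fitted areas track concentration.
x = np.linspace(0.0, 10.0, 500)
true_spec = lorentzian(x, 2.0, 3.0, 0.2) + lorentzian(x, 1.0, 7.0, 0.3)
rng = np.random.default_rng(1)
spectrum = true_spec + 0.005 * rng.standard_normal(x.size)

def model(x, a1, c1, w1, a2, c2, w2):
    return lorentzian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)

p0 = [1.0, 3.1, 0.25, 1.0, 6.9, 0.25]  # physically motivated starting values
popt, _ = curve_fit(model, x, spectrum, p0=p0)
print("fitted peak areas:", popt[0], popt[3])  # ~2.0 and ~1.0
```

Because the fit parameters are physical (areas, shifts, widths), the "calibration" reduces to relating peak area to concentration, which is the low-calibration-effort property the abstract highlights.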
Material model validation for laser shock peening process simulation
NASA Astrophysics Data System (ADS)
Amarchinta, H. K.; Grandhi, R. V.; Langer, K.; Stargel, D. S.
2009-01-01
Advanced mechanical surface enhancement techniques have been used successfully to increase the fatigue life of metallic components. These techniques impart deep compressive residual stresses into the component to counter potentially damage-inducing tensile stresses generated under service loading. Laser shock peening (LSP) is an advanced mechanical surface enhancement technique used predominantly in the aircraft industry. To reduce costs and make the technique available on a large-scale basis for industrial applications, simulation of the LSP process is required. Accurate simulation of the LSP process is a challenging task, because the process has many parameters, such as laser spot size, pressure profile and material model, that must be precisely determined. This work focuses on investigating the appropriate material model that could be used in simulation and design. In the LSP process the material is subjected to strain rates of 10^6 s^-1, which is very high compared with conventional strain rates. The importance of an accurate material model increases because the material behaves significantly differently at such high strain rates. This work investigates the effect of multiple nonlinear material models for representing the elastic-plastic behavior of materials. Elastic perfectly plastic, Johnson-Cook and Zerilli-Armstrong models are used, and the performance of each model is compared with available experimental results.
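Of the three material models compared above, the Johnson-Cook model makes the strain-rate sensitivity explicit: flow stress is scaled by a logarithmic rate term. A minimal sketch of the standard Johnson-Cook form (the constants below are generic illustrative values, not the calibrated LSP parameters from the study):

```python
import math

def johnson_cook(strain, strain_rate, temp,
                 A=520e6, B=477e6, n=0.52, C=0.025, m=1.0,
                 eps0=1.0, t_room=293.0, t_melt=1793.0):
    """Johnson-Cook flow stress (Pa):
        sigma = (A + B*eps^n) * (1 + C*ln(rate/eps0)) * (1 - T*^m)
    with T* = (T - T_room)/(T_melt - T_room). Constants are illustrative."""
    t_star = (temp - t_room) / (t_melt - t_room)
    return ((A + B * strain ** n)
            * (1.0 + C * math.log(strain_rate / eps0))
            * (1.0 - t_star ** m))

# Rate sensitivity at the ~10^6 1/s strain rates seen in LSP, relative to
# quasi-static loading (1 1/s) at room temperature:
ratio = johnson_cook(0.05, 1e6, 293.0) / johnson_cook(0.05, 1.0, 293.0)
print(f"flow stress ratio at 1e6 1/s vs 1 1/s: {ratio:.3f}")
```

With C = 0.025 the model predicts roughly a 35% flow-stress increase at 10^6 s^-1, which is why simulations using quasi-static material data misrepresent LSP.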
Nanotubes May Break Through "Chip Wall"
NASA Technical Reports Server (NTRS)
Laufenberg, Larry
2003-01-01
In 1965, just four years after the first planar integrated circuit (IC) was invented, Gordon Moore observed that the number of transistors per integrated circuit had grown exponentially. He predicted that this would continue, and the media soon began to call his prophecy "Moore's Law." For nearly forty years, Moore's Law has been validated by the technological progress achieved in the semiconductor industry. Now, however, industry experts are warning of a "Red Brick Wall" that may soon block the continued scaling predicted by Moore's Law. The "red bricks" in the wall are those areas of technical challenge for which no known manufacturable solution exists. One such "brick" is the challenge of finding a new material and processing technology to replace the metals used today to interconnect transistors on a chip.
Tuning the sapphire EFG process to the growth of Al2O3/YAG/ZrO2:Y eutectic
NASA Astrophysics Data System (ADS)
Carroz, L.; Duffar, T.
2018-05-01
In this work, a model is proposed to analytically study the working point of the Edge-defined Film-fed Growth (EFG) pulling of crystal plates. The model takes into account the heat equilibrium at the interface and the pressure equilibrium across the meniscus. It is validated on an industrial device dedicated to the pulling of sapphire ribbons. The model is then applied to the pulling of ceramic alloy plates of the ternary eutectic Al2O3/YAG/ZrO2:Y. This allowed the experimental difficulties of pulling this new material to be understood and suggested improvements to the control software. From these results, pulling net-shaped ceramic alloy plates was successful in the same industrial equipment as used for sapphire.
NASA Astrophysics Data System (ADS)
Riera, Enrique; Blanco, Alfonso; García, José; Benedito, José; Mulet, Antonio; Gallego-Juárez, Juan A.; Blasco, Miguel
2010-01-01
Oil is an important component of almonds and other vegetable substrates that can have an influence on human health. In this work, the development and validation of an innovative, robust, stable, reliable and efficient pilot-scale ultrasonic system to assist supercritical CO2 extraction of oils from different substrates is presented. In the extraction procedure, ultrasonic energy is an efficient way of producing deep agitation that enhances mass transfer through several mechanisms (radiation pressure, streaming, agitation, high-amplitude vibrations, etc.). Work prior to this research demonstrated the feasibility of integrating an ultrasonic field inside a supercritical extractor without losing a significant volume fraction. This pioneering method made it possible to accelerate mass transfer and thus improve supercritical extraction times. To develop the new procedure commercially and fulfil industrial requirements, a new device configuration has been designed, implemented, tested and successfully validated for supercritical fluid extraction of oil from different vegetable substrates.
In Situ Roughness Measurements for the Solar Cell Industry Using an Atomic Force Microscope
González-Jorge, Higinio; Alvarez-Valado, Victor; Valencia, Jose Luis; Torres, Soledad
2010-01-01
Areal roughness parameters always need to be under control in the thin film solar cell industry because of their close relationship with the electrical efficiency of the cells. In this work, these parameters are evaluated for measurements carried out in a typical fabrication area for this industry. Measurements are made using a portable atomic force microscope on the CNC diamond cutting machine where an initial sample of transparent conductive oxide is cut into four pieces. The method is validated by making a comparison between the parameters obtained in this process and in the laboratory under optimal conditions. Areal roughness parameters and Fourier Spectral Analysis of the data show good compatibility and open the possibility to use this type of measurement instrument to perform in situ quality control. This procedure gives a sample for evaluation without destroying any of the transparent conductive oxide; in this way 100% of the production can be tested, so improving the measurement time and rate of production. PMID:22319338
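The areal roughness parameters evaluated in this study are simple statistics of the measured height map about its mean plane. A minimal sketch of the two most common ones, Sa and Sq (the sinusoidal height map below is synthetic, standing in for AFM data):

```python
import numpy as np

def areal_roughness(z):
    """Sa (arithmetic mean height) and Sq (root-mean-square height),
    the basic ISO 25178 areal parameters, for a 2-D height map `z`,
    computed about the mean plane."""
    dev = z - z.mean()
    sa = np.abs(dev).mean()
    sq = np.sqrt((dev ** 2).mean())
    return sa, sq

# Synthetic "AFM" height map: a 5 nm sinusoidal texture over full periods
x = np.linspace(0.0, 4.0 * np.pi, 128, endpoint=False)
z = 5.0 * np.sin(x)[None, :] * np.ones((128, 1))  # heights in nm
sa, sq = areal_roughness(z)
print(sa, sq)  # Sa ~ 5*2/pi ≈ 3.18 nm, Sq = 5/sqrt(2) ≈ 3.54 nm
```

Comparing such parameters from in-situ measurements on the cutting machine against laboratory values is exactly the validation step the abstract describes.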
Perceived importance of employees' traits in the service industry.
Lange, Rense; Houran, James
2009-04-01
Selection assessments are common practice for reducing employee turnover in the service industry, but little is known about which employee characteristics human resources professionals value most highly. A sample of 108 managers and human resources professionals therefore rated the perceived importance of 31 performance traits for Line, Middle, and Senior employees. Rasch scaling analyses indicated strong consensus among the respondents. Nonsocial skills, abilities, and traits such as Ethical Awareness, Self-motivation, Writing Skills, Verbal Ability, Creativity, and Problem Solving were rated as more important for higher-level employees. By contrast, traits that directly affect interaction with customers and coworkers (Service Orientation, Communication Style, Agreeableness, Sense of Humor, Sensitivity to Diversity, Group Process, and Team Building) were rated as more important for lower-level employees. Respondents' age and sex did not substantially alter these findings. Results are discussed in terms of improving industry professionals' perceived ecological and external validities of generic and customized assessments of employees.
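The Rasch scaling mentioned above places raters and traits on a common logistic scale. The core of the dichotomous Rasch model is a single formula, sketched here as a generic illustration (the ability/difficulty values are arbitrary, not estimates from the study):

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch model: the probability that a respondent with
    the given ability (in logits) endorses an item of the given
    difficulty. P = 1 / (1 + exp(-(ability - difficulty)))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A trait rated "important" more readily sits lower on the difficulty scale:
print(rasch_probability(0.0, -1.0))  # easy-to-endorse trait,  p ≈ 0.73
print(rasch_probability(0.0, 1.0))   # hard-to-endorse trait, p ≈ 0.27
```

Fitting this model to the 108 raters' responses yields the consensus ordering of trait importance that the study reports.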
Kolbl, Sabina; Paloczi, Attila; Panjan, Jože; Stres, Blaž
2014-02-01
The primary aim of the study was to develop and validate an in-house upscale of the Automatic Methane Potential Test System II for studying real-time inocula and real-scale substrates in batch, codigestion and enzyme-enhanced hydrolysis experiments, in addition to semi-continuous operation of the developed equipment and experiments testing inoculum functional quality. The successful upscale to 5 L enabled comparison of different process configurations in shorter preparation times, with acceptable accuracy and high throughput intended for industrial decision making. The adoption of the same scales, equipment and methodologies in batch and semi-continuous tests mirroring those at full-scale biogas plants resulted in matching methane yields between the two laboratory tests and full scale, thus confirming the increased decision-making value of the approach for industrial operations. Copyright © 2013 Elsevier Ltd. All rights reserved.
Compliance revisited: pharmaceutical drug trials in the era of the contract research organization.
Jonvallen, Petra
2009-12-01
Over the past decade, the management of clinical trials of pharmaceuticals has become a veritable industry, as evidenced by the emergence and proliferation of contract research organizations (CROs) that co-ordinate and monitor trials. This article focuses on work performed by one CRO involved in the introduction of new software, modelled on industrial production processes, into clinical trial practices. It investigates how this new management technique relates to the work performed in the clinic to ensure that trial participants comply with the protocol. Using an analytical distinction between 'classical' management work and invisible work, the article contextualizes the meaning of compliance in the clinic and suggests that the work involved in producing compliance should be taken into consideration by those concerned with the validity of trials, as clinical trials are put under private industrial management. The article builds on participant observation at a Swedish university hospital and interviews with nurses, dieticians, doctors and a software engineer, all part of a team involved in pharmaceutical drug trials of a potential obesity drug.
The Development of Ontology from Multiple Databases
NASA Astrophysics Data System (ADS)
Kasim, Shahreen; Aswa Omar, Nurul; Fudzee, Mohd Farhan Md; Azhar Ramli, Azizul; Aizi Salamat, Mohamad; Mahdin, Hairulnizam
2017-08-01
The halal industry is the fastest-growing global business across the world. The halal food industry is thus crucial for Muslims all over the world, as it serves to ensure them that the food items they consume daily are syariah compliant. Currently, ontology is widely used in computer science areas such as heterogeneous information processing on the web, the semantic web, and information retrieval. However, ontology has still not been used widely in the halal industry. Today, the Muslim community still has difficulty verifying the halal status of products in the market, especially foods containing E numbers. This research tries to solve the problem of validating halal status from various halal sources. Various chemical ontologies from multiple databases were found to support this ontology development. The E numbers in these chemical ontologies are codes for chemicals that can be used as food additives. With this E-number ontology, the Muslim community could effectively identify and verify the halal status of products in the market.
Henry, Teresa R; Penn, Lara D; Conerty, Jason R; Wright, Francesca E; Gorman, Gregory; Pack, Brian W
2016-11-01
Non-clinical dose formulations (also known as pre-clinical or GLP formulations) play a key role in early drug development. These formulations are used to introduce active pharmaceutical ingredients (APIs) into test organisms for both pharmacokinetic and toxicological studies. Since these studies are ultimately used to support dose and safety ranges in human studies, it is important to understand not only the concentration and PK/PD of the active ingredient but also to generate safety data for likely process impurities and degradation products of the active ingredient. As such, many in the industry have chosen to develop and validate methods which can accurately detect and quantify the active ingredient along with impurities and degradation products. Such methods often provide trendable results that are predictive of stability, thus leading to the name: stability-indicating methods. This document provides an overview of best practices for those choosing to include development and validation of such methods as part of their non-clinical drug development program. It is intended to support teams who are either new to stability-indicating method development and validation or who are less familiar with the requirements of validation due to their position within the product development life cycle.
A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.
Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan
2018-04-01
This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed due to the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach outlining the practicality and statistical rationalization using traditional sampling and analytical methods. The approach is designed to fit solid dose processes assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle complying with ASTM standards as recommended by the US FDA.
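As an illustrative aside (not the authors' GAVE method, whose details are not given in this record), stratified blend/content uniformity assessments of this kind typically partition variability into between-location and within-location components via one-way ANOVA mean squares. A generic numpy sketch on hypothetical data:

```python
import numpy as np

# hypothetical stratified content-uniformity data: 10 locations x 3 dosage units,
# in % label claim (generic variance-component sketch, not the GAVE method itself)
rng = np.random.default_rng(1)
n_loc, n_rep = 10, 3
between = rng.normal(0.0, 1.0, (n_loc, 1))     # location-to-location effect
within = rng.normal(0.0, 1.5, (n_loc, n_rep))  # unit-to-unit effect
data = 100.0 + between + within

ms_within = data.var(axis=1, ddof=1).mean()               # pooled within-location MS
ms_between = n_rep * data.mean(axis=1).var(ddof=1)        # between-location MS
var_between = max((ms_between - ms_within) / n_rep, 0.0)  # between-location component
```

A large between-location component relative to the within-location component flags poor blend uniformity even when the overall mean is on target.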
Industrial Instrument Mechanic. Occupational Analyses Series.
ERIC Educational Resources Information Center
Dean, Ann; Zagorac, Mike; Bumbaka, Nick
This analysis covers tasks performed by an industrial instrument mechanic, an occupational title some provinces and territories of Canada have also identified as industrial instrumentation and instrument mechanic. A guide to analysis discusses development, structure, and validation method; scope of the occupation; trends; and safety. To facilitate…
An Investigation and Prediction of Springback of Sheet Metals under Cold Forming Condition
NASA Astrophysics Data System (ADS)
Elsayed, A.; Mohamed, M.; Shazly, M.; Hegazy, A.
2017-12-01
Low formability and springback, especially at room temperature, are known to be major obstacles to advancements in the sheet metal forming industries. The integration of numerical simulation within the R&D activities of the automotive industries provides a significant means of overcoming these drawbacks. The aim of the present work is to model and predict the springback of a galvanized low carbon steel automotive panel part. This part suffers from both positive and negative springback, which was physically measured using a CMM. The objective is to determine suitable forming process parameters that minimize and compensate for the springback through a robust FE model. The analysis of the springback was carried out using both an isotropic hardening model and the Yoshida-Uemori model, calibrated against cyclic stress-strain curves. The material data of the galvanized low carbon steel were implemented via lookup tables in the commercial finite element software Pam-Stamp(TM). Firstly, the FE model was validated against the deformed part, which suffers from the springback problem under the same forming conditions. The FE results were compared with the experimental measurements, showing very good agreement. Secondly, the validated FE model was used to determine suitable forming parameters that could minimise the springback of the deformed part.
Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P
2013-05-05
Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which could be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a bench-top laboratory analytical method and is usually not implemented in the production process. Concerning application in the production process, many scientific approaches stop at the level of feasibility studies and never make the step to production-scale process applications. The present work focuses on the scale-up of an active coating process, a step of highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, developed for a lab-scale process, to a production-scale process, the robustness of the analytical method and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing could be shown. Finally, the method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
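As an illustrative aside (not the authors' model), the core PLS calibration step described here can be sketched with a one-component NIPALS fit in plain numpy; the synthetic "spectra" below stand in for real in-line Raman data, and all values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 60, 200
coat = np.linspace(0.0, 10.0, n)                          # coated API amount (arbitrary units)
peak = np.exp(-0.5 * ((np.arange(m) - 80) / 6.0) ** 2)    # synthetic Raman band
X = coat[:, None] * peak + rng.normal(0, 0.05, (n, m))    # spectra = signal + noise

# one-component PLS (NIPALS), calibrated on the even-indexed spectra
Xc, yc = X[::2], coat[::2]
Xm, ym = Xc.mean(axis=0), yc.mean()
Xc0, yc0 = Xc - Xm, yc - ym
w = Xc0.T @ yc0
w /= np.linalg.norm(w)              # weight vector (direction of y-relevant variance)
t = Xc0 @ w                         # scores
b = (t @ yc0) / (t @ t)            # inner regression coefficient

pred = ym + ((X[1::2] - Xm) @ w) * b                      # predict held-out spectra
rmsep = np.sqrt(np.mean((pred - coat[1::2]) ** 2))        # prediction error
```

A small RMSEP on held-out spectra is what justifies using such a model for in-line endpoint determination; model transfer to production scale then amounts to re-checking this error on production-scale spectra.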
Sampling and sample processing in pesticide residue analysis.
Lehotay, Steven J; Cook, Jo Marie
2015-05-13
Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
Biological treatment of model dyes and textile wastewaters.
Paz, Alicia; Carballo, Julia; Pérez, María José; Domínguez, José Manuel
2017-08-01
Previous work conducted in our laboratory revealed that Bacillus aryabhattai DC100 produces ligninolytic enzymes such as laccases and/or peroxidases, opening new applications in different bioprocesses, including the treatment of disposal residues such as dyestuffs from textile processing industries. This work describes the degradation of three commercial model dyes, Coomassie Brilliant Blue G-250 (CBB), Indigo Carmine (IC) and Remazol Brilliant Blue R (RBBR), under different culture media and operational conditions. The process was optimized using a Central Composite Rotatable Design, and the desirability function predicted complete decolorization of 150 mg/L CBB at 37 °C, 304.09 rpm and a salt concentration of 19.204 g/L. The model was validated at concentrations up to 180 mg/L for CBB and IC, although high amounts of RBBR could not be removed. The procedure developed here also allowed Chemical Oxygen Demand (COD) reductions of about 42% for CBB, while tests on real effluents from a local textile industry gave COD reductions of 50% in a liquid wastewater and 14% in a semi-liquid sludge, thus allowing the authorized discharge of the wastewater into the corresponding treatment plant. The decolorization efficiencies and COD reductions point to the potential application of B. aryabhattai DC100 in the bioremediation of real effluents from textile industries. Copyright © 2017 Elsevier Ltd. All rights reserved.
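As an illustrative aside (not the authors' design matrix, which is not given in this record), a Central Composite Rotatable Design for three factors consists of 2^3 factorial points, 2*3 axial points at distance alpha = (2^3)^(1/4), and replicated centre runs. A sketch of the coded design, with a hypothetical mapping for one factor:

```python
import itertools
import numpy as np

k = 3                                     # three factors, e.g. temperature, rpm, salt
alpha = (2 ** k) ** 0.25                  # rotatability criterion: alpha = (2^k)^(1/4)
factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
axial = np.vstack([alpha * np.eye(k), -alpha * np.eye(k)])
center = np.zeros((6, k))                 # replicated centre runs
design = np.vstack([factorial, axial, center])   # 8 + 6 + 6 = 20 coded runs

# map coded temperature (column 0) to real units; centre 37 °C and
# step 5 °C per coded unit are hypothetical choices for illustration
temperature = 37.0 + 5.0 * design[:, 0]
```

A second-order response surface fitted to decolorization measured at these runs is what the desirability function then optimizes over.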
Fleetwood, Gill; Chlebus, Magda; Coenen, Joachim; Dudoignon, Nicolas; Lecerf, Catherine; Maisonneuve, Catherine; Robinson, Sally
2015-01-01
Animal research together with other investigational methods (computer modeling, in vitro tests, etc) remains an indispensable part of the pharmaceutical research and development process. The European pharmaceutical industry recognizes the responsibilities inherent in animal research and is committed to applying and enhancing 3Rs principles. New nonsentient, ex vivo, and in vitro methods are developed every day and contribute to reducing and, in some instances, replacing in vivo studies. Their utility is however limited by the extent of our current knowledge and understanding of complex biological systems. Until validated alternative ways to model these complex interactions become available, animals remain indispensable in research and safety testing. In the interim, scientists continue to look for ways to reduce the number of animals needed to obtain valid results, refine experimental techniques to enhance animal welfare, and replace animals with other research methods whenever feasible. As research goals foster increasing cross-sector and international collaboration, momentum is growing to enhance and coordinate scientific innovation globally—beyond a single company, stakeholder group, sector, region, or country. The implementation of 3Rs strategies can be viewed as an integral part of this continuously evolving science, demonstrating the link between science and welfare, benefiting both the development of new medicines and animal welfare. This goal is one of the key objectives of the Research and Animal Welfare working group of the European Federation of Pharmaceutical Industries and Associations. PMID:25836966
Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview
NASA Technical Reports Server (NTRS)
Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard
1996-01-01
The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal de-icing & anti-icing) and ANTICE (hot-gas & electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis Icing program, and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions for comparison with analytical predictions. This paper will present an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results will be presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort at this point will be summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2 - The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3 - The Validation of ANTICE'.
NREL Spectrum of Clean Energy Innovation (Brochure)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-09-01
This brochure describes the NREL Spectrum of Clean Energy Innovation, which includes analysis and decision support, fundamental science, market-relevant research, systems integration, testing and validation, and commercialization and deployment. Through deep technical expertise and an unmatched breadth of capabilities, the National Renewable Energy Laboratory (NREL) leads an integrated approach across the spectrum of renewable energy innovation. From scientific discovery to accelerating market deployment, NREL works in partnership with private industry to drive the transformation of our nation's energy systems. Our world-class analysis and decision support informs every point on the spectrum. The innovation process at NREL is inter-dependent and iterative. Many scientific breakthroughs begin in our own laboratories, but new ideas and technologies may come to NREL at any point along the innovation spectrum to be validated and refined for commercial use.
Dozier, Samantha; Brown, Jeffrey; Currie, Alistair
2011-01-01
Simple Summary Many vaccines are tested for quality in experiments that require the use of large numbers of animals in procedures that often cause significant pain and distress. Newer technologies have fostered the development of vaccine quality control tests that reduce or eliminate the use of animals, but the availability of these newer methods has not guaranteed their acceptance by regulators or use by manufacturers. We discuss a strategic approach that has been used to assess and ultimately increase the use of non-animal vaccine quality tests in the U.S. and U.K. Abstract In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches. PMID:26486625
Technologically enhanced naturally occurring radioactive materials.
Vearrier, David; Curtis, John A; Greenberg, Michael I
2009-05-01
Naturally occurring radioactive materials (NORM) are ubiquitous throughout the earth's crust. Human manipulation of NORM for economic ends, such as mining, ore processing, fossil fuel extraction, and commercial aviation, may lead to what is known as "technologically enhanced naturally occurring radioactive materials," often called TENORM. The existence of TENORM results in an increased risk for human exposure to radioactivity. Workers in TENORM-producing industries may be occupationally exposed to ionizing radiation. TENORM industries may release significant amounts of radioactive material into the environment, resulting in the potential for widespread exposure to ionizing radiation. These industries include mining, phosphate processing, metal ore processing, heavy mineral sand processing, titanium pigment production, fossil fuel extraction and combustion, manufacture of building materials, thorium compounds, aviation, and scrap metal processing. A search of the PubMed database (www.pubmed.com) and Ovid Medline database (ovidsp.tx.ovid.com) was performed using a variety of search terms including NORM, TENORM, and occupational radiation exposure. A total of 133 articles were identified, retrieved, and reviewed. Seventy-three peer-reviewed articles were chosen to be cited in this review. A number of studies have evaluated the extent of ionizing radiation exposure both among workers and the general public due to TENORM. Quantification of radiation exposure is limited because of modeling constraints. In some occupational settings, an increased risk of cancer has been reported and postulated to be secondary to exposure to TENORM, though these reports have not been validated using toxicological principles. NORM and TENORM have the potential to cause important human health effects. It is important that these adverse health effects are evaluated using the basic principles of toxicology, including the magnitude and type of exposure, as well as threshold and dose response.
Vieites, J M; Botana, L M; Vieytes, M R; Leira, F J
1999-05-01
Changes in toxin profile and total toxicity levels of paralytic shellfish poison (PSP)-containing mussels were monitored during the standard canning process of pickled mussels and mussels in brine using mouse bioassays and high-performance liquid chromatography. Detoxification percentages for canned mussel meat exceeded 50% of initial toxicity. Total toxicity reduction did not fully correspond to toxin destruction, which was due to the loss of PSP to cooking water and packing media of the canned product. Significant differences in detoxification percentages were due to changes in toxin profile during heat treatment in packing media. Toxin conversion phenomena should be determined to validate detoxification procedures in the canning industry.
A validity test of movie, television, and video-game ratings.
Walsh, D A; Gentile, D A
2001-06-01
Numerous studies have documented the potential effects on young audiences of violent content in media products, including movies, television programs, and computer and video games. Similar studies have evaluated the effects associated with sexual content and messages. Cumulatively, these effects represent a significant public health risk for increased aggressive and violent behavior, spread of sexually transmitted diseases, and pediatric pregnancy. In partial response to these risks and to public and legislative pressure, the movie, television, and gaming industries have implemented ratings systems intended to provide information about the content and appropriate audiences for different films, shows, and games. To test the validity of the current movie-, television-, and video game-rating systems. Panel study. Participants used the KidScore media evaluation tool, which evaluates films, television shows, and video games on 10 aspects, including the appropriateness of the media product for children based on age. When an entertainment industry rates a product as inappropriate for children, parent raters agree that it is inappropriate for children. However, parent raters disagree with industry usage of many of the ratings designating material suitable for children of different ages. Products rated as appropriate for adolescents are of the greatest concern. The level of disagreement varies from industry to industry and even from rating to rating. Analysis indicates that the amount of violent content and portrayals of violence are the primary markers for disagreement between parent raters and industry ratings. As 1 part of a solution to the complex public health problems posed by violent and sexually explicit media products, ratings can have value if used with caution. Parents and caregivers relying on the ratings systems to guide their children's use of media products should continue to monitor content independently. 
Industry ratings systems should be revised with input from the medical and scientific communities to improve their reliability and validity. A single ratings system, applied universally across industries, would greatly simplify the efforts of parents and caregivers to use the system as well as the efforts of outside parties to monitor the use and validity of the system.
Chang, Joonho; Moon, Seung Ki; Jung, Kihyo; Kim, Wonmo; Parkinson, Matthew; Freivalds, Andris; Simpson, Timothy W; Baik, Seon Pill
2018-05-01
This study presents usability considerations and solutions for the design of glasses-type wearable computer displays and examines their effectiveness in a case study. Design countermeasures were investigated by a four-step design process: (1) preliminary design analysis; (2) design idea generation; (3) final design selection; and (4) virtual fitting trial. Three design interventions were devised from the design process: (1) weight balance to reduce pressure concentrated on the nose, (2) compliant temples to accommodate diverse head sizes and (3) a hanger mechanism to help spectacle users hang their wearable display on their eyeglasses. To investigate their effectiveness in the case study, the novel 3D glasses adopting the three interventions were compared with two existing 3D glasses in terms of neck muscle fatigue and subjective discomfort rating. While neck muscle fatigue was not significantly different among the three glasses (p = 0.467), the novel glasses had significantly smaller discomfort ratings (p = 0.009). Relevance to Industry: A four-step design process identified usability considerations and solutions for the design of glasses-type wearable computer displays. Novel 3D glasses were proposed through this process and their effectiveness was validated. The results identify design considerations and opportunities relevant to the emerging wearable display industry.
NASA Astrophysics Data System (ADS)
Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.
2012-04-01
Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, needed to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and that could happen in future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, using macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of country-specific vulnerability modifiers, using past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared with average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by means of "as-if" historical scenario simulations of three past earthquakes in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for the client market portfolio align with the repeatability of such catastrophe losses in the country. The validation process also included collaboration between Aon Benfield and its client in order to consider the insurance market penetration in Algeria, estimated at approximately 5%. Thus, we believe that the applied approach led to the production of an earthquake model for Algeria that is scientifically sound and reliable on one side, and market and client oriented on the other.
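As an illustrative aside (not from the record above), the EMS-98 rebuilding cost factors quoted in this abstract turn a damage-grade distribution into a loss estimate via a simple expectation; the grade distribution, portfolio value and variable names below are hypothetical:

```python
# rebuilding cost factor per EMS-98 damage grade, as quoted in the abstract
cost_factor = {1: 0.10, 2: 0.20, 3: 0.35, 4: 0.75, 5: 1.00}

# hypothetical share of buildings in each damage grade after a scenario event
grade_share = {0: 0.40, 1: 0.25, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.03}

# mean damage ratio = expected repair cost as a fraction of replacement value
mean_damage_ratio = sum(grade_share[g] * cost_factor.get(g, 0.0) for g in grade_share)

total_value = 1_000_000_000        # replacement value of the portfolio (hypothetical)
penetration = 0.05                 # insurance market penetration quoted in the abstract
ground_up_loss = total_value * mean_damage_ratio
insured_loss = ground_up_loss * penetration
```

Scenario losses computed this way per event, compared against observed historical losses, are the substance of the "as-if" validation step described above.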
De-oiled jatropha seed cake is a useful nutrient for pullulan production.
Choudhury, Anirban Roy; Sharma, Nishat; Prasad, G S
2012-03-30
The ever-increasing demand for fossil fuels is a major factor in the rapid depletion of these non-renewable energy resources, which has heightened interest in finding alternative sources of energy. In recent years, jatropha seed oil has been used extensively for the production of biodiesel and has shown significant potential to replace petroleum fuels, at least partially. De-oiled jatropha seed cake (DOJSC), which comprises approximately 55 to 65% of the biomass, is a byproduct of the biodiesel industry. DOJSC contains toxic components such as phorbol esters, which restrict its utilization as animal feed. Thus, along with the enhancement of biodiesel production from jatropha, there is an associated problem of handling this toxic byproduct. Utilization of DOJSC as a feedstock for the production of biochemicals may be an attractive solution to this problem. Pullulan is an industrially important polysaccharide with several potential applications in the food, pharmaceutical, and cosmetic industries. However, the major bottleneck for commercial utilization of pullulan is its high cost. A cost-effective process for pullulan production may be developed using DOJSC as the sole nutrient source, which would in turn also help utilize this byproduct of the biodiesel industry. In the present study, DOJSC was used as a nutrient for the production of pullulan in place of conventional nutrients such as yeast extract and peptone. Process optimization was carried out in shake flasks, and under optimized conditions (8% DOJSC, 15% dextrose, 28°C, 200 rpm, 5% inoculum, pH 6.0), 83.98 g/L pullulan was obtained. The process was further validated in a 5 L laboratory-scale fermenter. This is the first report of using DOJSC as a nutrient for the production of an exopolysaccharide. Successful use of DOJSC as a nutrient will help find a significant application for this toxic byproduct of the biodiesel industry.
This in turn will also have a significant impact on cost reduction and may lead to the development of a cost-effective green technology for pullulan production.
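The optimized-condition figures in the record above lend themselves to a quick back-of-the-envelope check. The sketch below (Python, illustrative only; the yield-coefficient calculation is our assumption, not an analysis performed in the original study) converts the reported 15% w/v dextrose and 83.98 g/L pullulan titre into an apparent product yield:

```python
# Illustrative sketch only: the yield coefficient below is an assumption
# layered on the figures reported in the abstract, not part of the study.

def product_yield(titre_g_per_l, substrate_g_per_l):
    """Apparent product yield Yp/s = product formed / substrate supplied."""
    return titre_g_per_l / substrate_g_per_l

optimised_conditions = {
    "DOJSC_percent": 8,      # de-oiled jatropha seed cake, sole nutrient source
    "dextrose_percent": 15,  # 15% w/v = 150 g/L
    "temperature_C": 28,
    "agitation_rpm": 200,
    "inoculum_percent": 5,
    "pH": 6.0,
}

pullulan_titre = 83.98  # g/L, reported under the optimised conditions
dextrose_g_per_l = optimised_conditions["dextrose_percent"] * 10  # % w/v -> g/L

yp_s = product_yield(pullulan_titre, dextrose_g_per_l)
print(f"Apparent yield Yp/s = {yp_s:.3f} g pullulan / g dextrose")
```

On these figures the apparent yield is roughly 0.56 g of pullulan per g of dextrose supplied, which is part of what makes pairing the polysaccharide with a cheap nutrient source attractive.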
An industrial approach to design compelling VR and AR experience
NASA Astrophysics Data System (ADS)
Richir, Simon; Fuchs, Philippe; Lourdeaux, Domitile; Buche, Cédric; Querrec, Ronan
2013-03-01
The convergence of technologies currently observed in the fields of VR, AR, robotics and consumer electronics reinforces the trend of new applications appearing every day. But when transferring knowledge acquired from research to businesses, research laboratories are often at a loss because of a lack of knowledge of the design and integration processes involved in creating an industrial-scale product. In fact, the innovation approaches that take a good idea from the laboratory to a successful industrial product are often little known to researchers. The objective of this paper is to present the results of the work of several research teams that have finalized a working method for researchers and manufacturers that allows them to design virtual or augmented reality systems and enables their users to enjoy "a compelling VR experience". That approach, called "the I2I method", presents 11 phases, from "Establishing technological and competitive intelligence and industrial property" to "Improvements", through the "Definition of the Behavioral Interface, Virtual Environment and Behavioral Software Assistance". As a result of the experience gained by various research teams, this design approach benefits from contributions from current VR and AR research. Our objective is to validate such multidisciplinary design team methods and to move them forward continuously.
Intellectual property and access to medicines: an analysis of legislation in Central America.
Cerón, Alejandro; Godoy, Angelina Snodgrass
2009-10-01
Globalization of intellectual property (IP) protection for medicines has been advancing during the past decade. Countries are obliged to adapt their legislation as a requirement of their membership to the World Trade Organization or as a condition of being part of international trade agreements. There is a growing recognition that, in low-income countries, stronger IP protection is a barrier to access to medicines. At the same time, the number of low-income countries writing national legislation to protect IP for pharmaceutical products is growing worldwide, but little research has been done on the ways in which this process is happening at the national level. This paper aims to contribute to the understanding of the implementation of IP legislation at the national level by providing a comparative analysis of the countries that are part of the United States-Dominican Republic-Central America Free Trade Agreement (DR-CAFTA). The analysis shows three trends. First, countries have often implemented stronger IP protection than required by trade agreements. Second, some countries have adopted IP protection before signing the trade agreements. Third, the process of ratification of DR-CAFTA increased public debate around these issues, which in some cases led to IP legislation that considers public health needs. These trends suggest that industrialized countries and the pharmaceutical industry are using more tactics than just trade agreements to push for increased IP protection and that the process of national legislation is a valid arena for confronting public health needs to those of the industry.
An experimental study on particle effects in liquid sheets
NASA Astrophysics Data System (ADS)
Sauret, Alban; Troger, Anthony; Jop, Pierre
2017-06-01
Many industrial processes, such as surface coating or liquid transport in tubes, involve liquid sheets or thin films of suspensions. In these situations, the thickness of the liquid film becomes comparable to the particle size, which leads to unexpected dynamics. In addition, the classical constitutive rheological law for suspensions cannot be applied as the continuum approximation is no longer valid. Here, we consider experimentally a transient particle-laden liquid sheet that expands radially. We characterize the influence of the particles on the shape of the liquid film and the atomization process. We highlight that the presence of particles modifies the thickness and stability of the liquid sheet. Our study suggests that the influence of particles through capillary effects can modify significantly the dynamics of processes that involve suspensions and particles confined in liquid films.
Sun, Peishi; Huang, Bing; Huang, Ruohua; Yang, Ping
2002-05-01
For the process of biopurifying waste gas containing VOCs at low concentration in a biological trickling filter, the kinetic model and simulation of the new adsorption-biofilm theory were investigated in this study. Comparison and validation against laboratory and industrial test data indicated that the model describes the practical bio-purification of VOC waste gas well. In the simulation study of the main factors, such as the toluene concentration in the inlet gas, the gas flow rate, and the height of the biofilm packing, good agreement was found between calculated and test data, with correlation coefficients of 0.80-0.97.
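Correlation coefficients of the kind quoted above (0.80-0.97 between calculated and measured values) are plain Pearson correlations. A minimal sketch (Python; the paired data below are hypothetical placeholders, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired values: model-calculated vs measured outlet toluene (mg/m3)
calculated = [12.1, 10.4, 8.9, 7.2, 5.8, 4.1]
measured = [11.8, 10.9, 8.4, 7.6, 5.5, 4.4]

print(f"r = {pearson_r(calculated, measured):.3f}")
```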
ERIC Educational Resources Information Center
Simkin, Mark G.
2008-01-01
Data-validation routines enable computer applications to test data to ensure their accuracy, completeness, and conformance to industry or proprietary standards. This paper presents five programming cases that require students to validate five different types of data: (1) simple user data entries, (2) UPC codes, (3) passwords, (4) ISBN numbers, and…
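Two of the validation tasks listed above, UPC codes and ISBN numbers, are classic check-digit exercises. The sketch below (Python) implements the standard published check-digit rules for UPC-A and ISBN-10; it is generic, not taken from the paper's own case materials:

```python
def valid_upc_a(code: str) -> bool:
    """UPC-A check: 12 digits; 3x the odd-position digits plus the
    even-position digits (1-based) must be divisible by 10."""
    if len(code) != 12 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    total = 3 * sum(digits[0::2]) + sum(digits[1::2])
    return total % 10 == 0

def valid_isbn10(code: str) -> bool:
    """ISBN-10 check: weighted sum 10*d1 + 9*d2 + ... + 1*d10 must be
    divisible by 11; a final 'X' stands for the value 10."""
    code = code.replace("-", "")
    if len(code) != 10:
        return False
    total = 0
    for i, ch in enumerate(code):
        if ch.isdigit():
            value = int(ch)
        elif ch == "X" and i == 9:
            value = 10
        else:
            return False
        total += (10 - i) * value
    return total % 11 == 0

print(valid_upc_a("036000291452"), valid_isbn10("0306406152"))
```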
Reexamining competitive priorities: Empirical study in service sector
NASA Astrophysics Data System (ADS)
Idris, Fazli; Mohammad, Jihad
2015-02-01
The general objective of this study is to validate the multi-level concept of competitive priorities using a reflective-formative model at a higher order for service industries. An empirical study of 228 firms from 9 different service industries was conducted to meet this objective. Partial least squares analysis with SmartPLS 2.0 was used to perform the analysis. Findings revealed six priorities: cost, flexibility, delivery, quality talent management, quality tangibility, and innovativeness. It emerges that quality is expanded into two types: one related to managing talent for process improvement, and the second the physical appearance and tangibility of the service quality. This study has confirmed competitive priorities as a formative second-order hierarchical latent construct using rigorous empirical evidence. Implications, limitations and suggestions for future research are discussed accordingly in this paper.
1989-08-14
for a long time in getting those required parts. 6-29-89 WR-ALC Model Validation Meeting Minutes Page Seven MANPSB • This is a manufacturing RCC. • No... Manpower. 4. Training shop - mechanics get transferred to F-15 Shop. Minutes of Brainstorming Session June 20, 1989 Morning Session Page Two 5. Two shifts... 11. RCC MANPSB completes then ships to storage - delay 10 to 15 days to get the same part back in finishing the repair. 12. Major repair coordination
Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2013)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David C.; Syamlal, Madhava; Cottrell, Roger
2013-09-30
The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories’ core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI’s industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms.
The CCSI’s academic participants (Carnegie Mellon University, Princeton University, West Virginia University, Boston University and the University of Texas at Austin) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 13, CCSI announced the initial release of its first set of computational tools and models during the October 2012 meeting of its Industry Advisory Board. This initial release led to five companies licensing the CCSI Toolset under a Test and Evaluation Agreement this year. By the end of FY13, the CCSI Technical Team had completed development of an updated suite of computational tools and models. The list below summarizes the new and enhanced toolset components that were released following comprehensive testing during October 2013. 1. FOQUS. Framework for Optimization and Quantification of Uncertainty and Sensitivity. Package includes: FOQUS Graphic User Interface (GUI), simulation-based optimization engine, Turbine Client, and heat integration capabilities. There is also an updated simulation interface and new configuration GUI for connecting Aspen Plus or Aspen Custom Modeler (ACM) simulations to FOQUS and the Turbine Science Gateway. 2. A new MFIX-based Computational Fluid Dynamics (CFD) model to predict particle attrition. 3. A new dynamic reduced model (RM) builder, which generates computationally efficient RMs of the behavior of a dynamic system. 4. A completely re-written version of the algebraic surrogate model builder for optimization (ALAMO). The new version is several orders of magnitude faster than the initial release and eliminates the MATLAB dependency. 5. A new suite of high resolution filtered models for the hydrodynamics associated with horizontal cylindrical objects in a flow path. 6. 
The new Turbine Science Gateway (Cluster), which supports FOQUS for running multiple simulations for optimization or UQ using a local computer or cluster. 7. A new statistical tool (BSS-ANOVA-UQ) for calibration and validation of CFD models. 8. A new basic data submodel in Aspen Plus format for a representative high viscosity capture solvent, 2-MPZ system. 9. An updated RM tool for CFD (REVEAL) that can create a RM from MFIX. A new lightweight, stand-alone version will be available in late 2013. 10. An updated RM integration tool to convert the RM from REVEAL into a CAPE-OPEN or ACM model for use in a process simulator. 11. An updated suite of unified steady-state and dynamic process models for solid sorbent carbon capture, including bubbling fluidized bed and moving bed reactors. 12. An updated and unified set of compressor models including steady-state design point model and dynamic model with surge detection. 13. A new framework for the synthesis and optimization of coal oxycombustion power plants using advanced optimization algorithms. This release focuses on modeling and optimization of a cryogenic air separation unit (ASU). 14. A new technical risk model in spreadsheet format. 15. An updated version of the sorbent kinetic/equilibrium model for parameter estimation for the 1st generation sorbent model. 16. An updated process synthesis superstructure model to determine optimal process configurations utilizing surrogate models from ALAMO for adsorption and regeneration in a solid sorbent process. 17. Validation models for NETL Carbon Capture Unit utilizing sorbent AX. Additional validation models will be available for sorbent 32D in 2014. 18. An updated hollow fiber membrane model and system example for carbon capture. 19. An updated reference power plant model in Thermoflex that includes additional steam extraction and reinjection points to enable heat integration module. 20. An updated financial risk model in spreadsheet format.
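Several toolset components in this record (ALAMO, the reduced-model builders) revolve around surrogate modelling: replacing an expensive simulation with a cheap algebraic stand-in fitted to a few sample runs. The generic idea, not any CCSI tool's actual algorithm, can be sketched as follows (Python; the sample points are hypothetical):

```python
def quadratic_surrogate(p0, p1, p2):
    """Build a quadratic surrogate passing exactly through three (x, y)
    samples, using the Lagrange interpolation form. Real surrogate builders
    (e.g. ALAMO) do far more: basis selection, regression, error control."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2

    def f(x):
        return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))

    return f

# Hypothetical "expensive simulation" samples: capture fraction vs. a
# normalised sorbent circulation rate.
surrogate = quadratic_surrogate((0.5, 0.62), (1.0, 0.85), (1.5, 0.90))
print(f"predicted capture fraction at 0.75: {surrogate(0.75):.3f}")
```

Once fitted, the surrogate can be evaluated thousands of times inside an optimizer or UQ loop at negligible cost, which is the point of this class of tools.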
Identifying thermal breakdown products of thermoplastics.
Guillemot, Marianne; Oury, Benoît; Melin, Sandrine
2017-07-01
Polymers processed to produce plastic articles are subjected to temperatures between 150°C and 450°C or more during overheated processing and breakdowns. Heat-based processing of this nature can lead to emission of volatile organic compounds (VOCs) into the thermoplastic processing shop. In this study, laboratory experiments and qualitative and quantitative emissions measurements in thermoplastic factories were carried out. The first step was to identify the compounds released depending on the thermoplastic nature, the temperature and the type of process. Then a thermal degradation protocol that can extrapolate the laboratory results to industry scenarios was developed. The influence of three parameters on released thermal breakdown products was studied: the sample preparation method (manual cutting, ambient, or cold grinding), the heating rate during thermal degradation (5, 10, 20, and 50°C/min) and the decomposition method (thermogravimetric analysis and pyrolysis). Laboratory results were compared to atmospheric measurements taken at 13 companies to validate the protocol and thereby ensure its representativeness of industrial thermal processing. This protocol was applied to the most commonly used thermoplastics to determine their thermal breakdown products and their thermal behaviour. Emissions data collected by personal exposure monitoring and sampling at the process emission area show airborne concentrations of detected compounds to be in the range of 0-3 mg/m3 under normal operating conditions. Laser cutting or purging operations generate higher pollution levels, in particular formaldehyde, which was found in some cases at a concentration above the workplace exposure limit.
Validation and calibration of a TDLAS oxygen sensor for in-line measurement on flow-packed products
NASA Astrophysics Data System (ADS)
Cocola, L.; Fedel, M.; Allermann, H.; Landa, S.; Tondello, G.; Bardenstein, A.; Poletto, L.
2016-05-01
A device based on Tunable Diode Laser Absorption Spectroscopy has been developed for non-invasive evaluation of gaseous oxygen concentration inside packed food containers. This work has been done in the context of the SAFETYPACK European project in order to enable full, automated product testing on a production line. The chosen samples at the end of the manufacturing process are modified atmosphere bags of processed mozzarella, in which the target oxygen concentration is required to be below 5%. The spectrometer allows in-line measurement of moving samples which are passing on a conveyor belt, with an optical layout optimized for bags made of a flexible scattering material, and works by sensing the gas phase in the headspace at the top of the package. A field applicable method for the calibration of this device has been identified and validated against traditional, industry standard, invasive measurement techniques. This allows some degrees of freedom for the end-user regarding packaging dimensions and shape. After deployment and setup of the instrument at the end-user manufacturing site, performance has been evaluated on a different range of samples in order to validate the choice of electro optical and geometrical parameters regarding sample handling and measurement timing at the actual measurement conditions.
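At the heart of a TDLAS retrieval is an absorption measurement inverted for gas concentration. As a simplified illustration (Python; a single-coefficient Beer-Lambert inversion with made-up numbers — a real TDLAS instrument such as the one described fits the full absorption line shape and corrects for pressure, temperature, and scattering by the packaging film):

```python
import math

def o2_concentration(i_transmitted, i_incident, eff_abs_coeff, path_cm):
    """Invert the Beer-Lambert law I = I0 * exp(-eps * C * L) for C.
    eff_abs_coeff lumps line strength and broadening into one effective
    coefficient (per %-O2 per cm) -- a deliberate simplification."""
    absorbance = math.log(i_incident / i_transmitted)
    return absorbance / (eff_abs_coeff * path_cm)

# Hypothetical values: 10% absorption over a 5 cm headspace path
c_o2 = o2_concentration(i_transmitted=0.90, i_incident=1.00,
                        eff_abs_coeff=0.02, path_cm=5.0)
print(f"O2 ~= {c_o2:.2f} %  (target: below 5 %)")
```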
Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David C.; Syamlal, Madhava; Cottrell, Roger
2012-09-30
The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB). 
By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team. These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. 
The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.
Asilian-Mahabadi, Hassan; Khosravi, Yahya; Hassanzadeh-Rangi, Narmin; Hajizadeh, Ebrahim; Behzadan, Amir H
2018-02-05
Occupational safety in general, and construction safety in particular, is a complex phenomenon. This study was designed to develop a new valid measure to evaluate factors affecting unsafe behavior in the construction industry. A new questionnaire was generated from qualitative research according to the principles of grounded theory. Key measurement properties (face validity, content validity, construct validity, reliability and discriminative validity) were examined using qualitative and quantitative approaches. The receiver operating characteristic curve was used to estimate the discriminating power and the optimal cutoff score. Construct validity revealed an interpretable 12-factor structure which explained 61.87% of variance. Good internal consistency (Cronbach's α = 0.94) and stability (intra-class correlation coefficient = 0.93) were found for the new instrument. The area under the curve, sensitivity and specificity were 0.80, 0.80 and 0.75, respectively. The new instrument also discriminated safety performance among the construction sites with different workers' accident histories (F = 6.40, p < 0.05). The new instrument appears to be a valid, reliable and sensitive instrument that will contribute to investigating the root causes of workers' unsafe behaviors, thus promoting safety performance in the construction industry.
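The discrimination statistics quoted above (AUC 0.80, sensitivity 0.80, specificity 0.75) can be computed from raw scores without a statistics package. A minimal sketch (Python; the scores and cutoff below are hypothetical, not the study's data):

```python
def auc_mann_whitney(pos_scores, neg_scores):
    """AUC as the probability a positive case outscores a negative one,
    over all pairs (ties count 0.5); equals Mann-Whitney U / (n_pos * n_neg)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

def sens_spec(pos_scores, neg_scores, cutoff):
    """Sensitivity and specificity when scoring >= cutoff flags a positive."""
    sens = sum(p >= cutoff for p in pos_scores) / len(pos_scores)
    spec = sum(n < cutoff for n in neg_scores) / len(neg_scores)
    return sens, spec

# Hypothetical questionnaire scores: sites with vs. without accident history
high_risk = [72, 65, 80, 58]
low_risk = [50, 55, 60, 48]

print(auc_mann_whitney(high_risk, low_risk), sens_spec(high_risk, low_risk, 58))
```

Sweeping the cutoff and picking the point that balances sensitivity against specificity is exactly the "optimal cutoff score" exercise the ROC analysis in the study performs.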
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-01
... Effectiveness of a Proposed Rule Change Relating to FINRA Trade Reporting Notice on Price Validation and Price... (``Notice'') that explains the price validation protocol of the FINRA trade reporting facilities and sets... trades by comparing the submitted price against price validation parameters established by FINRA...
75 FR 2435 - Addition to the List of Validated End-Users in the People's Republic of China (PRC)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-15
... DEPARTMENT OF COMMERCE Bureau of Industry and Security 15 CFR Part 748 [Docket No. 0908111226... (PRC) AGENCY: Bureau of Industry and Security, Commerce. ACTION: Final rule. SUMMARY: In this final rule, the Bureau of Industry and Security (BIS) amends the Export Administration Regulations (EAR) to...
Composites Materials and Manufacturing Technologies for Space Applications
NASA Technical Reports Server (NTRS)
Vickers, J. H.; Tate, L. C.; Gaddis, S. W.; Neal, R. E.
2016-01-01
Composite materials offer significant advantages in space applications. Weight reduction is imperative for deep space systems. However, the pathway to deployment of composites alternatives is problematic. Improvements in the materials and processes are needed, and extensive testing is required to validate the performance, qualify the materials and processes, and certify components. Addressing these challenges could lead to the confident adoption of composites in space applications and provide spin-off technical capabilities for the aerospace and other industries. To address the issues associated with composites applications in space systems, NASA sponsored a Technical Interchange Meeting (TIM) entitled, "Composites Materials and Manufacturing Technologies for Space Applications," the proceedings of which are summarized in this Conference Publication. The NASA Space Technology Mission Directorate and the Game Changing Program chartered the meeting. The meeting was hosted by the National Center for Advanced Manufacturing (NCAM)-a public/private partnership between NASA, the State of Louisiana, Louisiana State University, industry, and academia, in association with the American Composites Manufacturers Association. The Louisiana Center for Manufacturing Sciences served as the coordinator for the TIM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zee, Ralph; Schindler, Anton; Duke, Steve
The objective of this project is to conduct research to determine the feasibility of using alternate fuel sources for the production of cement. Successful completion of this project will also be beneficial to other commercial processes that are highly energy intensive. During this report period, we have completed all the subtasks in the preliminary survey. Literature searches focused on the types of alternative fuels currently used in the cement industry around the world. Information was obtained on the effects of particular alternative fuels on the clinker/cement product and on cement plant emissions. Federal regulations involving the use of waste fuels were examined. Information was also obtained about the trace elements likely to be found in alternative fuels, coal, and raw feeds, as well as the effects of various trace elements introduced into the system at the feed or fuel stage on the kiln process, the clinker/cement product, and concrete made from the cement. The experimental part of this project addresses the feasibility of a variety of alternative materials, mainly commercial wastes, as substitutes for coal in an industrial cement kiln at Lafarge NA, with validation of the experimental results against energy conversion considerations.
Wu, Xiaohui; Yang, Yang; Wu, Gaoming; Mao, Juan; Zhou, Tao
2016-01-01
Applications of activated sludge models (ASM) to simulating industrial biological wastewater treatment plants (WWTPs) remain difficult due to refractory and complex components in influents as well as diversity in activated sludges. In this study, an ASM3 modeling study was conducted to simulate and optimize a practical coking wastewater treatment plant (CWTP). First, respirometric characterizations of the coking wastewater and CWTP biomasses were conducted to determine the specific kinetic and stoichiometric model parameters for the consecutive aeration-anoxic-aeration (O-A/O) biological process. All ASM3 parameters were then estimated and calibrated through cross-validation with the model's dynamic simulation procedure. Consequently, an ASM3 model was successfully established that accurately simulates the CWTP's performance in removing COD and NH4-N. An optimized CWTP operating condition could be proposed, reducing the operation cost from 6.2 to 5.5 €/m(3) wastewater. This study is expected to provide a useful reference for mathematical simulations of practical industrial WWTPs. Copyright © 2015 Elsevier Ltd. All rights reserved.
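As an illustration of the kind of dynamic simulation involved, and of the reported cost improvement, the toy sketch below (Python) integrates simple first-order COD removal with explicit Euler. This is not the calibrated ASM3 model from the study; the rate constant and influent COD are invented placeholders, and only the 6.2 to 5.5 €/m3 cost figures come from the abstract:

```python
def simulate_cod(cod_in, k_per_h, hours, dt=0.1):
    """Explicit-Euler integration of first-order removal dC/dt = -k * C.
    A deliberately minimal stand-in for a full multi-state ASM3 model."""
    c = cod_in
    for _ in range(int(round(hours / dt))):
        c += -k_per_h * c * dt
    return c

effluent = simulate_cod(cod_in=2500.0, k_per_h=0.15, hours=24.0)  # mg/L
cost_saving = (6.2 - 5.5) / 6.2  # relative saving from the reported figures

print(f"effluent COD ~ {effluent:.0f} mg/L; cost saving ~ {cost_saving:.1%}")
```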
OAST Technology for the Future. Volume 3 - Critical Technologies, Themes 5-8
NASA Technical Reports Server (NTRS)
1988-01-01
NASA's Office of Aeronautics and Space Technology (OAST) conducted a workshop on the In-Space Technology Experiments Program (IN-STEP) December 6-9, 1988, in Atlanta, Georgia. The purpose of this workshop was to identify and prioritize space technologies which are critical for future national space programs and which require validation in the space environment. A secondary objective was to review the current NASA (In-Reach) and Industry/University (Out-Reach) experiments. Finally, the aerospace community was requested to review and comment on the proposed plans for the continuation of the In-Space Technology Experiments Program. In particular, the review included the proposed process for focusing the next experiment selection on specific, critical technologies and the process for implementing the hardware development and integration on the Space Shuttle vehicle. The product of the workshop was a prioritized listing of the critical space technology needs in each of eight technology disciplines. These listings were the cumulative recommendations of nearly 400 participants, which included researchers, technologists, and managers from aerospace industries, universities, and government organizations.
OAST Technology for the Future. Volume 2 - Critical Technologies, Themes 1-4
NASA Technical Reports Server (NTRS)
1988-01-01
NASA's Office of Aeronautics and Space Technology (OAST) conducted a workshop on the In-Space Technology Experiments Program (IN-STEP) December 6-9, 1988, in Atlanta, Georgia. The purpose of this workshop was to identify and prioritize space technologies which are critical for future national space programs and which require validation in the space environment. A secondary objective was to review the current NASA (In-Reach) and Industry/University (Out-Reach) experiments. Finally, the aerospace community was requested to review and comment on the proposed plans for the continuation of the In-Space Technology Experiments Program. In particular, the review included the proposed process for focusing the next experiment selection on specific, critical technologies and the process for implementing the hardware development and integration on the Space Shuttle vehicle. The product of the workshop was a prioritized listing of the critical space technology needs in each of eight technology disciplines. These listings were the cumulative recommendations of nearly 400 participants, which included researchers, technologists, and managers from aerospace industries, universities, and government organizations.
Srivastava, Abhinay Kumar; Tripathi, Abhishek Dutt; Jha, Alok; Poonia, Amrita; Sharma, Nitya
2015-06-01
In the present work, Lactobacillus delbrueckii was used to utilize an agro-industrial byproduct (cane molasses) for lactic acid production under a submerged fermentation process. Screening of LAB was done by Fourier transform infrared spectroscopy (FTIR). The effect of different amino acids (DL-phenylalanine, L-lysine and DL-aspartic acid) on the fermentation process was assessed by high-performance liquid chromatography (HPLC). A central composite rotatable design (CCRD) was used to optimize the levels of three parameters, viz. Tween 80, amino acid and cane molasses concentration, during fermentative production of lactic acid. Under optimum conditions, lactic acid production was enhanced from 55.89 g/L to 84.50 g/L. Further, validation showed 81.50 g/L lactic acid production. Scale-up was done in a 7.5 L fermentor. Productivity was found to be 3.40 g/L/h, which was higher than in previous studies, with fermentation time reduced from 24 h to 12 h. Further characterization of lactic acid was done by FTIR.
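The central composite rotatable design (CCRD) mentioned in this abstract has a standard coded-unit layout: two-level factorial points, axial ("star") points at a distance alpha chosen for rotatability, and replicated centre points. A minimal sketch of generating such a design is given below; the three-factor setup and the six centre replicates are illustrative assumptions, not details taken from the study.

```python
from itertools import product

def ccrd_points(k: int, n_center: int = 6):
    """Generate coded design points for a central composite rotatable design.

    Factorial points at +/-1, axial (star) points at +/-alpha with
    alpha = (2**k) ** 0.25 (the rotatability condition), plus centre replicates.
    """
    alpha = (2 ** k) ** 0.25
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for sign in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = sign
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center, alpha

# Three factors (e.g. Tween 80, amino acid, molasses concentration -- coded units)
points, alpha = ccrd_points(3)
print(len(points), round(alpha, 3))  # 20 runs, alpha ~ 1.682
```

For k = 3 this yields the classic 20-run CCRD (8 factorial + 6 axial + 6 centre runs); a second-order response surface model is then fitted to the measured lactic acid yields.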
The Current State of Drug Discovery and a Potential Role for NMR Metabolomics
2015-01-01
The pharmaceutical industry has significantly contributed to improving human health. Drugs have been credited with both increasing life expectancy and decreasing health care costs. Unfortunately, there has been a recent decline in the creativity and productivity of the pharmaceutical industry. This is a complex issue with many contributing factors resulting from the numerous mergers, increase in out-sourcing, and the heavy dependency on high-throughput screening (HTS). While a simple solution to such a complex problem is unrealistic and highly unlikely, the inclusion of metabolomics as a routine component of the drug discovery process may provide some solutions to these problems. Specifically, as the binding affinity of a chemical lead is evolved during the iterative structure-based drug design process, metabolomics can provide feedback on the selectivity and the in vivo mechanism of action. Similarly, metabolomics can be used to evaluate and validate HTS leads. In effect, metabolomics can be used to eliminate compounds with potential efficacy or side-effect problems while prioritizing well-behaved leads with drug-like characteristics. PMID:24588729
Ultrasound-Assisted Extraction of Stilbenes from Grape Canes.
Piñeiro, Zulema; Marrufo-Curtido, Almudena; Serrano, Maria Jose; Palma, Miguel
2016-06-16
An analytical ultrasound-assisted extraction (UAE) method has been optimized and validated for the rapid extraction of stilbenes from grape canes. The influence of sample pre-treatment (oven or freeze-drying) and several extraction variables (solvent, sample-solvent ratio and extraction time, among others) on the extraction process was analyzed. The new method allowed the main stilbenes in grape canes to be extracted in just 10 min, with an extraction temperature of 75 °C and 60% ethanol in water as the extraction solvent. Validation of the extraction method was based on analytical properties. The resulting RSDs (n = 5) for interday/intraday precision were less than 10%. Furthermore, the method was successfully applied in the analysis of 20 different grape cane samples. The results showed that grape cane byproducts are potential sources of bioactive compounds of interest for the pharmaceutical and food industries.
Minimization of Blast furnace Fuel Rate by Optimizing Burden and Gas Distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. Chenn Zhou
2012-08-15
The goal of the research is to improve the competitive edge of steel mills by using advanced CFD technology to optimize the gas and burden distributions inside a blast furnace for achieving the best gas utilization. A state-of-the-art 3-D CFD model has been developed for simulating the gas distribution inside a blast furnace at given burden conditions, burden distribution and blast parameters. The comprehensive 3-D CFD model has been validated by plant measurement data from an actual blast furnace. Validation of the sub-models is also achieved. The user-friendly software package named Blast Furnace Shaft Simulator (BFSS) has been developed to simulate the blast furnace shaft process. The research has significant benefits to the steel industry: high productivity, low energy consumption, and an improved environment.
Experimental Equipment Validation for Methane (CH4) and Carbon Dioxide (CO2) Hydrates
NASA Astrophysics Data System (ADS)
Saad Khan, Muhammad; Yaqub, Sana; Manner, Naathiya; Ani Karthwathi, Nur; Qasim, Ali; Mellon, Nurhayati Binti; Lal, Bhajan
2018-04-01
Clathrate hydrates are structures regarded as a threat to the oil and gas industry in light of their propensity to plug subsea pipelines. In natural gas transmission and processing, gas hydrate formation is one of the main flow assurance problems and has led researchers to conduct fresh and meticulous studies on various aspects of gas hydrates. This paper presents a thermodynamic analysis of pure CH4 and CO2 gas hydrates in custom-fabricated equipment (a sapphire-cell hydrate reactor) for experimental validation. CO2 gas hydrate formed at lower pressure (41 bar) compared to CH4 gas hydrate (70 bar), and a comparison of thermodynamic properties between CH4 and CO2 is also presented in this study. This preliminary study could provide pathways for the quest for potent hydrate inhibitors.
Grants4Targets - an innovative approach to translate ideas from basic research into novel drugs.
Lessl, Monika; Schoepe, Stefanie; Sommer, Anette; Schneider, Martin; Asadullah, Khusru
2011-04-01
Collaborations between industry and academia are steadily gaining importance. To combine expertise, Bayer Healthcare has set up a novel open innovation approach called Grants4Targets. Ideas on novel drug targets can easily be submitted to http://www.grants4targets.com. After a review process, grants are provided to perform focused experiments to further validate the proposed targets. In addition to financial support, specific know-how on target validation and drug discovery is provided. Experienced scientists are nominated as project partners and, depending on the project, tools or specific models are provided. Around 280 applications have been received and 41 projects granted. In our experience, this type of bridging fund combined with joint efforts provides a valuable tool to foster drug discovery collaborations. Copyright © 2010 Elsevier Ltd. All rights reserved.
de Silva, Odile
2002-12-01
COLIPA (the European Federation of the Cosmetics Industry) represents 24 international companies and 2000 small and medium-sized enterprises. Together with ECVAM, COLIPA has been involved in the development and validation of alternative methods since the beginning of the validation efforts. The work of the Steering Committee on Alternatives to Animal Testing (SCAAT) is based on collaboration between companies, but also with academia, trade associations, the Scientific Committee on Cosmetics and Non-Food Products (SCCNFP), European Commission Directorates General, and ECVAM. Some success has been achieved, but some validation efforts have failed. One lesson is that the search for alternatives requires a lot of humility.
NASA Astrophysics Data System (ADS)
Mohd Salleh, Khairul Anuar; Rahman, Mohd Fitri Abdul; Lee, Hyoung Koo; Al Dahhan, Muthanna H.
2014-06-01
Local liquid velocity measurements in Trickle Bed Reactors (TBRs) are one of the essential components of their hydrodynamic studies. These measurements are used to effectively determine a reactor's operating condition. This study was conducted to validate a newly developed technique that combines Digital Industrial Radiography (DIR) with Particle Tracking Velocimetry (PTV) to measure the Local Liquid Velocity (VLL) inside TBRs. Three-millimeter-sized Expanded Polystyrene (EPS) beads were used as packing material. Three validation procedures were designed to test the newly developed technique. All procedures and statistical approaches provided strong evidence that the technique can be used to measure the VLL within TBRs.
A review of the FDA draft guidance document for software validation: guidance for industry.
Keatley, K L
1999-01-01
A Draft Guidance Document (Version 1.1) was issued by the United States Food and Drug Administration (FDA) to address the software validation requirement of the Quality System Regulation, 21 CFR Part 820, effective June 1, 1997. The guidance document outlines validation considerations that the FDA regards as applicable to both medical device software and software used to "design, develop or manufacture" medical devices. The Draft Guidance is available at the FDA web site http://www.fda.gov/cdrh/comps/swareval.html. Presented here is a review of the main features of the FDA document for the Quality System Regulation (QSR), and some guidance for its implementation in industry.
Health risk assessment and the practice of industrial hygiene.
Paustenbach, D J
1990-07-01
It has been claimed that there may be as many as 2000 airborne chemicals to which persons could be exposed in the workplace and in the community. Of these, occupational exposure limits have been set for approximately 700 chemicals, and only about 30 chemicals have limits for the ambient air. It is likely that some type of health risk assessment methodology will be used to establish limits for the remainder. Although these methods have been used for over 10 yr to set environmental limits, each step of the process (hazard identification, dose-response assessment, exposure assessment, and risk characterization) contains a number of traps into which scientists and risk managers can fall. For example, regulatory approaches to the hazard identification step have allowed little discrimination between the various animal carcinogens, even though these chemicals can vary greatly in their potency and mechanisms of action. In general, epidemiology data have been given little weight compared to the results of rodent bioassays. The dose-response extrapolation process, as generally practiced, often does not present the range of equally plausible values. Procedures which acknowledge and quantitatively account for some or all of the different classes of chemical carcinogens have not been widely adopted. For example, physiologically based pharmacokinetic (PB-PK) and biologically based models need to become a part of future risk assessments. The exposure evaluation portion of risk assessments can now be significantly more valid because of better dispersion models, validated exposure parameters, and the use of computers to account for complex environmental factors. Using these procedures, industrial hygienists are now able to quantitatively estimate the risks caused not only by the inhalation of chemicals but also those caused by dermal contact and incidental ingestion. 
The appropriate use of risk assessment methods should allow scientists and risk managers to set scientifically valid environmental and occupational standards for air contaminants.
Research Activities at Plasma Research Laboratory at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Sharma, S. P.; Rao, M. V. V. S.; Meyyappan, Meyya
2000-01-01
In order to meet NASA's requirements for the rapid development and validation of future generation electronic devices as well as associated materials and processes, enabling technologies are being developed at NASA-Ames Research Center using a multi-discipline approach. The first step is to understand the basic physics of the chemical reactions in the area of plasma reactors and processes. Low pressure glow discharges are indispensable in the fabrication of microelectronic circuits. These plasmas are used to deposit materials and also etch fine features in device fabrication. However, many plasma-based processes suffer from stability and reliability problems leading to a compromise in performance and a potentially increased cost for the semiconductor manufacturing industry. Although a great deal of laboratory-scale research has been performed on many of these processing plasmas, little is known about the gas-phase and surface chemical reactions that are critical in many etch and deposition processes, and how these reactions are influenced by the variation in operating conditions. Such a lack of understanding has hindered the development of process models that can aid in the scaling and improvement of plasma etch and deposition systems. Our present research involves the study of such plasmas. An inductively-coupled plasma (ICP) source in place of the standard upper electrode assembly of the Gaseous Electronics Conference (GEC) radio-frequency (RF) Reference Cell is used to investigate the discharge characteristics. This ICP source generates plasmas with higher electron densities and lower operating pressures than obtainable with the original parallel-plate version of the GEC Cell. This expanded operating regime is more relevant to new generations of industrial plasma systems being used by the microelectronics industry. 
The research goal is to develop an understanding of the physical phenomena involved in plasma processing and to measure much needed fundamental parameters, such as gas phase and surface reaction rates, species concentration, temperature, ion energy distribution, and electron number density.
Manufacturing of tailored tubes with a process integrated heat treatment
NASA Astrophysics Data System (ADS)
Hordych, Illia; Boiarkin, Viacheslav; Rodman, Dmytro; Nürnberger, Florian
2017-10-01
The usage of workpieces with tailored properties allows for reducing costs and materials. One example is tailored tubes that can be used as end parts, e.g. in the automotive industry or in domestic applications, as well as semi-finished products for subsequent controlled deformation processes. An innovative technology to manufacture tubes is roll forming with a subsequent inductive heating and adapted quenching to obtain tailored properties in the longitudinal direction. This processing offers great potential for the production of tubes with a wide range of properties, although this novel approach still requires a suitable process design. Based on experimental data, a process simulation is being developed. The simulation shall be suitable for a virtual design of the tubes and allow for gaining a deeper understanding of the required processing. The proposed model shall predict microstructural and mechanical tube properties by considering process parameters, different geometries, batch-related influences, etc. A validation is carried out using experimental data from tubes manufactured from various steel grades.
Li, Kangkang; Yu, Hai; Tade, Moses; Feron, Paul; Yu, Jingwen; Wang, Shujuan
2014-06-17
An advanced NH3 abatement and recycling process that makes great use of the waste heat in flue gas was proposed to solve the problems of ammonia slip, NH3 makeup, and flue gas cooling in the ammonia-based CO2 capture process. The rigorous rate-based model, RateFrac in Aspen Plus, was validated thermodynamically and kinetically against experimental data from the open literature and CSIRO pilot trials at Munmorah Power Station, Australia, respectively. After a thorough sensitivity analysis and process improvement, the NH3 recycling efficiency reached as high as 99.87%, and the NH3 exhaust concentration was only 15.4 ppmv. Most importantly, the energy consumption of the NH3 abatement and recycling system was only 59.34 kJ/kg CO2 of electricity. The evaluation of mass balance and temperature stability shows that this NH3 recovery process is technically effective and feasible. This process therefore has a promising prospect for industrial application.
NASA Astrophysics Data System (ADS)
Buchholz, Bernhard; Ebert, Volker
2018-01-01
Highly accurate water vapor measurements are indispensable for understanding a variety of scientific questions as well as industrial processes. While in metrology water vapor concentrations can be defined, generated, and measured with relative uncertainties in the single-percent range, field-deployable airborne instruments deviate by up to 10-20 % even under quasi-static laboratory conditions. The novel SEALDH-II hygrometer, a calibration-free, tuneable diode laser spectrometer, bridges this gap by implementing a new holistic concept to achieve higher accuracy levels in the field. We present in this paper the absolute validation of SEALDH-II at a traceable humidity generator during 23 days of permanent operation at 15 different H2O mole fraction levels between 5 and 1200 ppmv. At each mole fraction level, we studied the pressure dependence at six different gas pressures between 65 and 950 hPa. Further, we describe the setup for this metrological validation, the challenges to overcome when assessing water vapor measurements at a high accuracy level, and the comparison results. With this validation, SEALDH-II is the first airborne, metrologically validated humidity transfer standard, linking several scientific airborne and laboratory measurement campaigns to the international metrological water vapor scale.
Validation of the geographic position of EPER-Spain industries
García-Pérez, Javier; Boldo, Elena; Ramis, Rebeca; Vidal, Enrique; Aragonés, Nuria; Pérez-Gómez, Beatriz; Pollán, Marina; López-Abente, Gonzalo
2008-01-01
Background The European Pollutant Emission Register in Spain (EPER-Spain) is a public inventory of pollutant industries created by decision of the European Union. The location of these industries is geocoded and the first published data correspond to 2001. Publication of these data will allow for quantification of the effect of proximity to one or more such plants on cancer and all-cause mortality observed in nearby towns. However, as errors have been detected in the geocoding of many of the pollutant foci shown in the EPER, it was decided that a validation study should be conducted into the accuracy of these co-ordinates. EPER-Spain geographic co-ordinates were drawn from the European Environment Agency (EEA) server and the Spanish Ministry of the Environment (MOE). The Farm Plot Geographic Information System (Sistema de Información Geográfica de Parcelas Agrícolas) (SIGPAC) enables orthophotos (digitalized aerial images) of any territorial point across Spain to be obtained. Through a search of co-ordinates in the SIGPAC, all the industrial foci (except farms) were located. The quality criteria used to ascertain possible errors in industrial location were high, medium and low quality, where industries were situated at a distance of less than 500 metres, more than 500 metres but less than 1 kilometre, and more than 1 kilometre from their real locations, respectively. Results Insofar as initial registry quality was concerned, 84% of industrial complexes were inaccurately positioned (low quality) according to EEA data versus 60% for Spanish MOE data. The distribution of the distances between the original and corrected co-ordinates for each of the industries on the registry revealed that the median error was 2.55 kilometres for Spain overall (according to EEA data). The Autonomous Regions that displayed most errors in industrial geocoding were Murcia, Canary Islands, Andalusia and Madrid. 
Correct co-ordinates were successfully allocated to 100% of EPER-Spain industries. Conclusion Knowing the exact location of pollutant foci is vital to obtain reliable and valid conclusions in any study where distance to the focus is a decisive factor, as in the case of the consequences of industrial pollution on the health of neighbouring populations. PMID:18190678
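The quality classes in this study are defined purely by the distance between the registry co-ordinates and the corrected (orthophoto-verified) location. A minimal sketch of that classification is shown below, using the haversine great-circle distance; the thresholds come from the abstract, while the co-ordinate values are illustrative, not taken from the registry.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    R = 6371.0  # mean Earth radius, km
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def geocoding_quality(distance_km):
    """Quality classes used in the EPER-Spain validation: <500 m high,
    500 m - 1 km medium, >1 km low."""
    if distance_km < 0.5:
        return "high"
    if distance_km < 1.0:
        return "medium"
    return "low"

# Registry vs. orthophoto-corrected co-ordinates (illustrative values)
d = haversine_km(40.4168, -3.7038, 40.4268, -3.7038)
print(round(d, 2), geocoding_quality(d))
```

Applying this per industrial focus and taking the median of the distances reproduces the kind of summary statistic reported (a 2.55 km median error for EEA data).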
Globally sustainable manganese metal production and use.
Hagelstein, Karen
2009-09-01
The "cradle to grave" concept of managing chemicals and wastes has been a descriptive analogy for proper environmental stewardship since the 1970s. The concept incorporates environmentally sustainable product choices, such as the metal alloys utilized in the steel products on which civilization depends. Manganese consumption is related to the increasing production of raw steel and upgraded ferroalloys. Nonferrous applications of manganese include production of dry-cell batteries, plant fertilizer components, animal feed and colorant for bricks. The worldwide production of manganese ore (high grade, 35% manganese) is about 6 million ton/year and electrolytic manganese metal demand is about 0.7 million ton/year. The total manganese demand is consumed globally by industries including construction (23%), machinery (14%), and transportation (11%). Manganese is recycled within scrap of iron and steel; a small amount is recycled within aluminum used beverage cans. The recycling rate is 37% and efficiency is estimated at 53% [Roskill Metals and Minerals Reports, January 13, 2005. Manganese Report: rapid rise in output caused by Chinese crude steel production. Available from: http://www.roskill.com/reports/manganese.]. Environmentally sustainable management choices include identifying raw material chemistry, utilizing clean production processes, minimizing waste generation, recycling materials, controlling occupational exposures, and collecting representative environmental data. This paper will discuss two electrolytically produced manganese metals, the differences in their production, and environmental impacts cited to date. The two electrolytic manganese processes differ in the addition of sulfur dioxide or selenium dioxide. Adverse environmental impacts of the selenium dioxide methodology include increased water consumption and an order of magnitude greater solid waste generation per ton of metal processed. 
The use of high grade manganese ores in the electrolytic process also reduces the quantity of solid wastes generated during processing. Secondary aluminum facilities have reported hazardous waste generation management issues due to baghouse dusts from rotary furnaces processing selenium contaminated manganese alloys. Environmental impacts resulting from industry are represented by emission inventories of chemical releases to the air, water, and soil. The U.S. metals industry releases reported to EPA Toxic Release Inventory indicate the primary metals industry is the major source of metal air toxic emissions, exceeding electric utility air toxic emissions. The nonferrous metals industry is reported to be the Organization for Economic Co-operation and Development (OECD) most intensive airborne and land pollution source of bioaccumulative metals. However, total waste emissions from industries in the OECD countries have declined due to improving energy consumption. Emission registers and access are improving around the world. However, environmental databases for metal particulates have low confidence ratings since the majority of air toxic emissions are not reported, not monitored, or are estimated based on worst-case emission factors. Environmental assessments including biological monitoring are necessary to validate mandated particulate metal emission reductions and control technologies during metal processing.
Winge, Stefan; Yderland, Louise; Kannicht, Christoph; Hermans, Pim; Adema, Simon; Schmidt, Torben; Gilljam, Gustav; Linhult, Martin; Tiemeyer, Maya; Belyanskaya, Larisa; Walter, Olaf
2015-11-01
Human-cl rhFVIII (Nuwiq®), a new generation recombinant factor VIII (rFVIII), is the first rFVIII produced in a human cell-line approved by the European Medicines Agency. To describe the development, upscaling and process validation for industrial-scale human-cl rhFVIII purification. The purification process involves one centrifugation, two filtration, five chromatography columns and two dedicated pathogen clearance steps (solvent/detergent treatment and 20 nm nanofiltration). The key purification step uses an affinity resin (VIIISelect) with high specificity for FVIII, removing essentially all host-cell proteins with >80% product recovery. The production-scale multi-step purification process efficiently removes process- and product-related impurities and results in a high-purity rhFVIII product, with an overall yield of ∼50%. Specific activity of the final product was >9000 IU/mg, and the ratio between active FVIII and total FVIII protein present was >0.9. The entire production process is free of animal-derived products. Leaching of potential harmful compounds from chromatography resins and all pathogens tested were below the limit of quantification in the final product. Human-cl rhFVIII can be produced at 500 L bioreactor scale, maintaining high purity and recoveries. The innovative purification process ensures a high-purity and high-quality human-cl rhFVIII product with a high pathogen safety margin. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Workshop Report: Crystal City VI-Bioanalytical Method Validation for Biomarkers.
Arnold, Mark E; Booth, Brian; King, Lindsay; Ray, Chad
2016-11-01
With the growing focus on translational research and the use of biomarkers to drive drug development and approvals, biomarkers have become a significant area of research within the pharmaceutical industry. However, until the US Food and Drug Administration's (FDA) 2013 draft guidance on bioanalytical method validation included consideration of biomarker assays using LC-MS and LBA, those assays were created, validated, and used without standards of performance. This lack of expectations resulted in the FDA receiving data from assays of varying quality in support of efficacy and safety claims. The AAPS Crystal City VI (CC VI) Workshop in 2015 was held as the first forum for industry-FDA discussion around the general issues of biomarker measurements (e.g., endogenous levels) and specific technology strengths and weaknesses. The 2-day workshop served to develop a common understanding among the industrial scientific community of the issues around biomarkers, informed the FDA of the current state of the science, and will serve as a basis for further dialogue as experience with biomarkers expands with both groups.
Li, Kaiyue; Wang, Weiying; Liu, Yanping; Jiang, Su; Huang, Guo; Ye, Liming
2017-01-01
The active ingredients and thus pharmacological efficacy of traditional Chinese medicine (TCM) at different degrees of parching process vary greatly. Near-infrared spectroscopy (NIR) was used to develop a new method for rapid online analysis of the TCM parching process, using two kinds of chemical indicators (5-(hydroxymethyl) furfural [5-HMF] content and 420 nm absorbance) as reference values, which were obviously observed and changed in most TCM parching processes. Three representative TCMs, Areca (Areca catechu L.), Malt (Hordeum vulgare L.), and Hawthorn (Crataegus pinnatifida Bge.), were used in this study. With partial least squares regression, calibration models of NIR were generated based on two kinds of reference values, i.e. 5-HMF contents measured by high-performance liquid chromatography (HPLC) and 420 nm absorbance measured by ultraviolet-visible spectroscopy (UV/Vis), respectively. In the optimized models for 5-HMF, the root mean square errors of prediction (RMSEP) for Areca, Malt, and Hawthorn were 0.0192, 0.0301, and 0.2600 and correlation coefficients (Rcal) were 99.86%, 99.88%, and 99.88%, respectively. Moreover, in the optimized models using 420 nm absorbance as reference values, the RMSEP for Areca, Malt, and Hawthorn were 0.0229, 0.0096, and 0.0409 and Rcal were 99.69%, 99.81%, and 99.62%, respectively. NIR models with 5-HMF content and 420 nm absorbance as reference values can rapidly and effectively identify three kinds of TCM in different parching processes. This method has great promise to replace current subjective color judgment and time-consuming HPLC or UV/Vis methods and is suitable for rapid online analysis and quality control in TCM industrial manufacturing processes. 
Near-infrared spectroscopy (NIR) was used to develop a new method for online analysis of the traditional Chinese medicine (TCM) parching process. Calibration and validation models of Areca, Malt, and Hawthorn were generated by partial least squares regression using 5-(hydroxymethyl) furfural contents and 420 nm absorbance as reference values, respectively, which were the main indicator components during the parching process of most TCMs. The established NIR models of the three TCMs had low root mean square errors of prediction and high correlation coefficients. The NIR method has great promise for use in TCM industrial manufacturing processes for rapid online analysis and quality control. Abbreviations used: NIR: near-infrared spectroscopy; TCM: traditional Chinese medicine; Areca: Areca catechu L.; Hawthorn: Crataegus pinnatifida Bge.; Malt: Hordeum vulgare L.; 5-HMF: 5-(hydroxymethyl) furfural; PLS: partial least squares; D: dimension faction; SLS: straight line subtraction; MSC: multiplicative scatter correction; VN: vector normalization; RMSECV: root mean square errors of cross-validation; RMSEP: root mean square errors of validation; Rcal: correlation coefficients; RPD: residual predictive deviation; PAT: process analytical technology; FDA: Food and Drug Administration; ICH: International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use.
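The figures of merit reported for the NIR models (RMSEP and the correlation coefficient Rcal) have standard definitions and can be computed directly from paired reference and predicted values. A minimal sketch follows; the data values are illustrative, not taken from the study.

```python
from math import sqrt

def rmsep(y_ref, y_pred):
    """Root mean square error of prediction."""
    n = len(y_ref)
    return sqrt(sum((p - r) ** 2 for r, p in zip(y_ref, y_pred)) / n)

def correlation(y_ref, y_pred):
    """Pearson correlation coefficient between reference and predicted values."""
    n = len(y_ref)
    mr = sum(y_ref) / n
    mp = sum(y_pred) / n
    cov = sum((r - mr) * (p - mp) for r, p in zip(y_ref, y_pred))
    sr = sqrt(sum((r - mr) ** 2 for r in y_ref))
    sp = sqrt(sum((p - mp) ** 2 for p in y_pred))
    return cov / (sr * sp)

# Illustrative 5-HMF contents: HPLC reference values vs. NIR-model predictions
ref = [0.10, 0.25, 0.40, 0.55, 0.70]
pred = [0.12, 0.24, 0.41, 0.53, 0.71]
print(round(rmsep(ref, pred), 4), round(100 * correlation(ref, pred), 2))
```

A low RMSEP together with an Rcal near 100% (as reported for all three TCMs) indicates the PLS calibration tracks the reference method closely over the parching range.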
Technology Readiness of the NEXT Ion Propulsion System
NASA Technical Reports Server (NTRS)
Benson, Scott W.; Patterson, Michael J.
2008-01-01
NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system has been in advanced technology development under the NASA In-Space Propulsion Technology project. The highest fidelity hardware planned has now been completed by the government/industry team, including: a flight prototype model (PM) thruster, an engineering model (EM) power processing unit, EM propellant management assemblies, a breadboard gimbal, and control unit simulators. Subsystem and system level technology validation testing is in progress. To achieve the objective Technology Readiness Level 6, environmental testing is being conducted to qualification levels in ground facilities simulating the space environment. Additional tests have been conducted to characterize the performance range and life capability of the NEXT thruster. This paper presents the status and results of technology validation testing accomplished to date, the validated subsystem and system capabilities, and the plans for completion of this phase of NEXT development. The next round of competed planetary science mission announcements of opportunity, and directed mission decisions, are anticipated to occur in 2008 and 2009. Progress to date, and the success of on-going technology validation, indicate that the NEXT ion propulsion system will be a primary candidate for mission consideration in these upcoming opportunities.
NASA Astrophysics Data System (ADS)
Hadzaman, N. A. H.; Takim, R.; Nawawi, A. H.; Mohamad Yusuwan, N.
2018-04-01
A BIM governance assessment instrument is used to analyse what is important in developing a BIM governance solution to tackle the existing problems of team collaboration in BIM-based projects. Despite the deployment of integrative technologies in the construction industry, particularly BIM, uptake is still insufficient compared to other sectors. Several studies have established the requirements of BIM implementation concerning all technical and non-technical BIM adoption issues. However, the data are regarded as inadequate to develop a BIM governance framework. Hence, the objective of this paper is to evaluate the content validity of the BIM governance instrument prior to the main data collection. Two methods were employed: a literature review and a questionnaire survey. Based on the literature review, 273 items with six main constructs are suggested for incorporation in the BIM governance instrument. The Content Validity Ratio (CVR) scores revealed that 202 of the 273 items are considered the most critical by the content experts. The findings for the Item-Level Content Validity Index (I-CVI) and Modified Kappa Coefficient, however, revealed that 257 items in the BIM governance instrument are appropriate and excellent. The instrument is highly reliable for future strategies and the development of BIM projects in Malaysia.
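The three indices named in this abstract have widely used textbook definitions: Lawshe's CVR, the item-level CVI, and the I-CVI adjusted for chance agreement via a modified kappa. A minimal sketch under the assumption that the study used these standard formulas is given below; the panel size and ratings are illustrative.

```python
from math import comb

def cvr(n_essential: int, n_experts: int) -> float:
    """Lawshe's Content Validity Ratio: (ne - N/2) / (N/2)."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

def i_cvi(n_relevant: int, n_experts: int) -> float:
    """Item-level Content Validity Index: proportion of experts rating the item relevant."""
    return n_relevant / n_experts

def modified_kappa(n_relevant: int, n_experts: int) -> float:
    """I-CVI adjusted for the probability of chance agreement."""
    pc = comb(n_experts, n_relevant) * 0.5 ** n_experts  # chance-agreement probability
    icvi = i_cvi(n_relevant, n_experts)
    return (icvi - pc) / (1 - pc)

# Illustrative panel: 9 of 10 content experts rate an item essential/relevant
print(cvr(9, 10), i_cvi(9, 10), round(modified_kappa(9, 10), 3))
```

Items are typically retained when CVR exceeds the critical value for the panel size and the modified kappa is rated "excellent" (commonly taken as above 0.74), which is the kind of screening that reduced 273 candidate items to the retained sets reported.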
NASA Astrophysics Data System (ADS)
Launch vehicle propulsion system reliability considerations during the design and verification processes are discussed. The tools available for predicting and minimizing anomalies or failure modes are described and objectives for validating advanced launch system propulsion reliability are listed. Methods for ensuring vehicle/propulsion system interface reliability are examined and improvements in the propulsion system development process are suggested to improve reliability in launch operations. Also, possible approaches to streamline the specification and procurement process are given. It is suggested that government and industry should define reliability program requirements and manage production and operations activities in a manner that provides control over reliability drivers. Also, it is recommended that sufficient funds should be invested in design, development, test, and evaluation processes to ensure that reliability is not inappropriately subordinated to other management considerations.
Segarra-Oña, María-del-Val; Peiró-Signes, Angel; Cervelló-Royo, Roberto
2015-12-01
This paper examines key aspects of the innovative behavior of construction firms that determine their environmental orientation while innovating. Structural equation modeling was applied to data on 222 firms retrieved from the Spanish Technological Innovation Panel (PITEC) for 2010 to analyse the drivers of the environmental orientation of construction firms during the innovation process. The results show that environmental orientation is positively affected by the product and process orientation of construction firms during the innovation process. Furthermore, the positive relation between the importance of market information sources and environmental orientation, mediated by process and product orientation, is discussed. Finally, a model that explains these relations is proposed and validated. The results have important managerial implications for companies concerned about their eco-innovative focus, as the types of actions and relations within firms most suitable for improving their eco-innovative orientation are highlighted.
Evaluation in industry of a draft code of practice for manual handling.
Ashby, Liz; Tappin, David; Bentley, Tim
2004-05-01
This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.
Advanced Software V&V for Civil Aviation and Autonomy
NASA Technical Reports Server (NTRS)
Brat, Guillaume P.
2017-01-01
With advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally intensive software techniques such as those used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety in at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.
Modeling the Formation of Transverse Weld during Billet-on-Billet Extrusion
Mahmoodkhani, Yahya; Wells, Mary; Parson, Nick; Jowett, Chris; Poole, Warren
2014-01-01
A comprehensive mathematical model of the hot extrusion process for aluminum alloys has been developed and validated. The plasticity module was developed using a commercial finite element package, DEFORM-2D, a transient Lagrangian model which couples the thermal and deformation phenomena. Validation of the model against industrial data indicated that it gave excellent predictions of the pressure during extrusion. The finite element predictions of the velocity fields were post-processed to calculate the thickness of the surface cladding as one billet is fed in after another through the die (i.e., the transverse weld). The mathematical model was then used to assess the effect a change in feeder dimensions would have on the shape, thickness and extent of the transverse weld during extrusion. Experimental measurements for different combinations of billet materials show that the model is able to accurately predict the transverse weld shape as well as the clad surface layer to thicknesses of 50 μm. The transverse weld is significantly affected by the feeder geometry shape, but the effects of ram speed, billet material and temperature on the transverse weld dimensions are negligible. PMID:28788629
Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches
NASA Astrophysics Data System (ADS)
Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia
2017-10-01
With the increasing level of complexity and automation in automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to cope with tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Therefore, an experimental validation of three different temperature model approaches is carried out, discussed and compared in the scope of this article. To evaluate the range of application of the presented approaches with respect to further implementation in semi-physical tyre models, the main focus lies on a physical parameterisation. Aside from good modelling accuracy, focus is placed on computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.
In silico analysis of Pycnoporus cinnabarinus laccase active site with toxic industrial dyes.
Prasad, Nirmal K; Vindal, Vaibhav; Narayana, Siva Lakshmi; Ramakrishna, V; Kunal, Swaraj Priyaranjan; Srinivas, M
2012-05-01
Laccases belong to the multicopper oxidases, a widespread class of enzymes implicated in many oxidative functions across various industrial oxidative processes, from the production of fine chemicals to the bioremediation of contaminated soil and water. In order to understand the mechanisms of substrate binding and the interaction between substrates and Pycnoporus cinnabarinus laccase, a homology model was generated. The resulting model was further validated and used for docking studies with the toxic industrial dyes acid blue 74, reactive black 5 and reactive blue 19. Interactions of chemical mediators with the laccase were also examined. The docking analysis showed that the active site cannot always accommodate the dye molecules, owing to the constricted nature of the active site pocket and steric hindrance of the residues, whereas mediators are relatively small and can easily be accommodated into the active site pocket, which thereafter leads to productive binding. The binding properties of these compounds, along with the identification of critical active site residues, can be used for further site-directed mutagenesis experiments to identify their role in activity and substrate specificity, ultimately leading to improved mutants for the degradation of these toxic compounds.
Chen, Yu-Cheng; Coble, Joseph B; Deziel, Nicole C; Ji, Bu-Tian; Xue, Shouzheng; Lu, Wei; Stewart, Patricia A; Friesen, Melissa C
2014-11-01
The reliability and validity of six experts' exposure ratings were evaluated for 64 nickel-exposed and 72 chromium-exposed workers from six Shanghai electroplating plants based on airborne and urinary nickel and chromium measurements. Three industrial hygienists and three occupational physicians independently ranked the exposure intensity of each metal on an ordinal scale (1-4) for each worker's job in two rounds: the first round was based on responses to an occupational history questionnaire and the second round also included responses to an electroplating industry-specific questionnaire. The Spearman correlation (rs) was used to compare each rating's validity to its corresponding subject-specific arithmetic mean of four airborne or four urinary measurements. Reliability was moderately high (weighted kappa range = 0.60-0.64). Validity was poor to moderate (rs = -0.37 to 0.46) for both airborne and urinary concentrations of both metals. For airborne nickel concentrations, validity differed by plant. For dichotomized metrics, sensitivity and specificity were higher based on urinary measurements (47-78%) than airborne measurements (16-50%). Few patterns were observed by metal, assessment round, or expert type. These results suggest that, for electroplating exposures, experts can achieve moderately high agreement and reasonably distinguish between low and high exposures when reviewing responses to in-depth questionnaires used in population-based case-control studies.
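The two agreement statistics used above, weighted kappa for inter-rater reliability on the ordinal 1-4 scale and the Spearman rank correlation for validity, have standard definitions. A plain-Python sketch of those formulas (linear weights assumed for the kappa; the ratings in the example are hypothetical, not the study's data):

```python
def weighted_kappa(r1, r2, categories=4):
    """Cohen's kappa with linear weights for two raters' ordinal
    scores taking values 1..categories."""
    n, k = len(r1), categories
    obs = [[0.0] * k for _ in range(k)]          # observed proportions
    for a, b in zip(r1, r2):
        obs[a - 1][b - 1] += 1 / n
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]  # rater-1 marginals
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]  # rater-2 marginals
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    po = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of mid-ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r, i = [0.0] * len(v), 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1                            # group tied values
            mid = (i + j) / 2 + 1                 # average (mid) rank
            for t in range(i, j + 1):
                r[order[t]] = mid
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical 1-4 ratings from two experts for eight workers:
a = [1, 2, 3, 4, 2, 3, 1, 4]
b = [1, 2, 2, 4, 3, 3, 1, 4]
print(round(weighted_kappa(a, b), 2), round(spearman(a, b), 2))
```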
ERIC Educational Resources Information Center
Hofstader, Robert; Chapman, Kenneth
This document discusses the Voluntary Industry Standards for Chemical Process Industries Technical Workers Project and issues of relevance to the education and employment of chemical laboratory technicians (CLTs) and process technicians (PTs). Section 1 consists of the following background information: overview of the chemical process industries,…
Schechtman, Leonard M
2002-01-01
Toxicological testing in the current regulatory environment is steeped in a history of using animals to answer questions about the safety of products to which humans are exposed. That history forms the basis for the testing strategies that have evolved to satisfy the needs of the regulatory bodies that render decisions that affect, for the most part, virtually all phases of premarket product development and evaluation and, to a lesser extent, postmarketing surveillance. Only relatively recently have the levels of awareness of, and responsiveness to, animal welfare issues reached current proportions. That paradigm shift, although sluggish, has nevertheless been progressive. New and alternative toxicological methods for hazard evaluation and risk assessment have now been adopted and are being viewed as a means to address those issues in a manner that considers humane treatment of animals yet maintains scientific credibility and preserves the goal of ensuring human safety. To facilitate this transition, regulatory agencies and regulated industry must work together toward improved approaches. They will need assurance that the methods will be reliable and the results comparable with, or better than, those derived from the current classical methods. That confidence will be a function of the scientific validation and resultant acceptance of any given method. In the United States, to fulfill this need, the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and its operational center, the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM), have been constituted as prescribed in federal law. Under this mandate, ICCVAM has developed a process and established criteria for the scientific validation and regulatory acceptance of new and alternative methods. The role of ICCVAM in the validation and acceptance process and the criteria instituted toward that end are described. 
Also discussed are the participation of the US Food and Drug Administration (FDA) in the ICCVAM process and that agency's approach to the application and implementation of ICCVAM-recommended methods.
Direct adaptive control of a PUMA 560 industrial robot
NASA Technical Reports Server (NTRS)
Seraji, Homayoun; Lee, Thomas; Delpech, Michel
1989-01-01
The implementation and experimental validation of a new direct adaptive control scheme on a PUMA 560 industrial robot is described. The testbed facility consists of a Unimation PUMA 560 six-jointed robot and controller, and a DEC MicroVAX II computer which hosts the Robot Control C Library software. The control algorithm is implemented on the MicroVAX which acts as a digital controller for the PUMA robot, and the Unimation controller is effectively bypassed and used merely as an I/O device to interface the MicroVAX to the joint motors. The control algorithm for each robot joint consists of an auxiliary signal generated by a constant-gain Proportional plus Integral plus Derivative (PID) controller, and an adaptive position-velocity (PD) feedback controller with adjustable gains. The adaptive independent joint controllers compensate for the inter-joint couplings and achieve accurate trajectory tracking without the need for the complex dynamic model and parameter values of the robot. Extensive experimental results on PUMA joint control are presented to confirm the feasibility of the proposed scheme, in spite of strong interactions between joint motions. Experimental results validate the capabilities of the proposed control scheme. The control scheme is extremely simple and computationally very fast for concurrent processing with high sampling rates.
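The per-joint control law described above, a fixed-gain PID auxiliary signal plus a PD term whose gains are adjusted online, can be illustrated with a single-joint simulation. The unit-inertia joint model and the simple gain-adaptation rule below are simplifying assumptions for illustration, not the PUMA dynamics or the paper's exact adaptation equations:

```python
import math

def simulate(T=10.0, dt=1e-3):
    """Track xd(t) = sin(t) with a PID auxiliary signal plus an
    adaptive PD term on a unit-inertia joint; returns the worst
    tracking error over the final 2 seconds."""
    x = v = ie = 0.0            # joint position, velocity, error integral
    kp, kd = 20.0, 10.0         # adaptive PD gains (initial values)
    ki, gamma = 5.0, 2.0        # fixed integral gain, adaptation rate
    errors, t = [], 0.0
    while t < T:
        xd, vd = math.sin(t), math.cos(t)   # desired trajectory
        e, ed = xd - x, vd - v
        ie += e * dt
        u = kp * e + kd * ed + ki * ie      # PID auxiliary + adaptive PD
        kp += gamma * e * e * dt            # crude gain adaptation (assumed law)
        v += u * dt                         # unit-inertia joint dynamics
        x += v * dt
        errors.append(abs(e))
        t += dt
    return max(errors[-2000:])

print(simulate())
```

Even this toy version shows the appeal of the scheme: no dynamic model of the joint enters the controller, only measured position and velocity errors.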
Validation of the FEA of a deep drawing process with additional force transmission
NASA Astrophysics Data System (ADS)
Behrens, B.-A.; Bouguecha, A.; Bonk, C.; Grbic, N.; Vucetic, M.
2017-10-01
In order to meet automotive industry requirements such as decreasing CO2 emissions, which translates into reducing vehicle mass in the car body, the chassis and the powertrain, continuous innovation and further development of existing production processes are required. In sheet metal forming processes, the process limits and component characteristics are defined by the process-specific loads. When the load limits are exceeded, failure occurs in the material; this can be avoided by an additional force transmission activated in the deep drawing process before the process limit is reached. This contribution deals with experimental investigations of a forming process with additional force transmission with regard to extending the process limits. Based on FEA, a tool system was designed and developed by IFUM. For this purpose, the steel material HCT600 is analyzed numerically. Within the experimental investigations, deep drawing processes with and without additional force transmission are carried out, and the produced rectangular cups are compared. Subsequently, the identical deep drawing processes are investigated numerically. Thereby, the values of the punch reaction force and displacement are determined and compared with the experimental results; thus, the material model is successfully validated at process scale. For further quantitative verification of the FEA results, the experimentally determined geometry of the rectangular cup is measured optically with the ATOS system of GOM mbH and digitally compared using the external software Geomagic® Qualify™. The goal of this paper is to verify the transferability of the FEA model for a conventional deep drawing process to a deep drawing process with additional force transmission with a counter punch.
Ponce-Robles, Laura; Rivas, Gracia; Esteban, Belen; Oller, Isabel; Malato, Sixto; Agüera, Ana
2017-10-01
An analytical method was developed and validated for the determination of ten pesticides in sewage sludge coming from an agro-food industry. The method was based on the application of Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) extraction for solid sewage sludge and SPE extraction for the sludge aqueous phase, followed by liquid chromatography (LC) coupled to hybrid quadrupole/linear ion trap mass spectrometry (QqLIT-MS). The QuEChERS method was reported 14 years ago and nowadays is mainly applied to the analysis of pesticides in food. More recent applications have been reported in other matrices such as sewage sludge, but the complexity of the matrix makes optimization of the cleanup step necessary to improve the efficiency of the analysis. With this aim, several dispersive solid-phase extraction cleanup sorbents were tested, and C18 + PSA was chosen as the d-SPE sorbent. The proposed method was satisfactorily validated for most compounds investigated, showing recoveries higher than 80% in most cases, with the only exception of prochloraz (71%) at the low concentration level. Limits of quantification were lower than 40 ng l-1 in the aqueous phase and below 40 ng g-1 in the solid phase for the majority of the analytes. The method was applied to solid sludge and the sludge aqueous phase coming from an agro-food industry which processes fruits and vegetables. Graphical abstract: Application of advanced LC-MS/MS analytical techniques for the determination of pesticides contained in sewage sludge.
Study of Material Used in Nanotechnology for the Recycling of Industrial Waste Water
NASA Astrophysics Data System (ADS)
Larbi, L.; Fertikh, N.; Toubal, A.
The objective of our study is to recycle the industrial waste water of an industrial complex after treatment by the membrane bioreactor (MBR) bioprocess. In order to apply this bioprocess, the quality of the water in question was first studied. To characterize this industrial waste water, a series of physicochemical analyses was carried out according to standardized directives and methods. To verify that the rejected industrial waste water meets regulatory requirements, the following relevant parameters were monitored permanently: the flow, the potential of hydrogen (pH), the total suspended solids (TSS), the turbidity (Turb), the chemical oxygen demand (COD), the biochemical oxygen demand (BOD), the Kjeldahl total nitrogen (KTN) and ammonia (NH4+), the total phosphorus (Ptot), the fluorine (F), the oils, the fats and the phenols (Ph). On the basis of the collected information, the sampling rates at which the quality control was done were established, the selected analytical methods were validated by control charts, and the number of analysis tests was determined by the Cochran test. The results of the quality control show that some rejected water contents are not within the Algerian standards; in our case, however, the objective is to bring these industrial water parameters within standards so as to recycle the water. The MBR process adopted for waste water treatment is being studied, starting with the experimental characterization of the reactor and the selected membrane.
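The Cochran test mentioned for method validation can be illustrated with its C statistic, which checks whether one replicate series has an outlying variance before its results are pooled. The statistic below is the standard textbook form; the triplicate COD values are hypothetical, not data from the study:

```python
def cochran_c(groups):
    """Cochran's C statistic for homogeneity of variances across
    replicate series: the largest sample variance divided by the sum
    of all sample variances. The result is compared against a
    tabulated critical value (not reproduced here)."""
    def var(v):                       # unbiased sample variance
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)
    s2 = [var(g) for g in groups]
    return max(s2) / sum(s2)

# Hypothetical triplicate COD measurements (mg O2/l) from three series:
series = [[212.0, 215.0, 214.0],
          [208.0, 209.0, 211.0],
          [230.0, 221.0, 226.0]]
print(round(cochran_c(series), 3))   # 0.813
```

A value above the critical value would flag the third series (the one with the widest spread) for re-analysis before pooling.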
In-situ monitoring of β phase transformation in Ti-6Al-6V-2Sn using laser ultrasonics
NASA Astrophysics Data System (ADS)
Hinterlechner, Irina; Barriobero-Vila, Pere; Reitinger, Bernhard; Fromherz, Thomas; Requena, Guillermo; Burgholzer, Peter
2018-04-01
Titanium is of great interest for metal processing industries due to its superior material properties, but it is also quite expensive. Therefore, detailed knowledge of the β phase transformation and, consequently, of the distribution of the α and β phases in titanium alloys is crucial for their material properties and, as a consequence, for further processing steps. By measuring the ultrasonic velocity and attenuation with laser ultrasonics (LUS), a non-destructive and non-contact technique, it is possible to qualitatively monitor the phase transformation in situ while heating the sample from room temperature up to the β phase field. We validate the LUS methodology against high-energy X-ray diffraction as well as against conventional metallurgical measurements and obtain excellent agreement between the results of these methods.
Holvoet, Kevin; Jacxsens, Liesbeth; Sampers, Imca; Uyttendaele, Mieke
2012-04-01
This study provided insight into the degree of microbial contamination in the processing chain of prepacked (bagged) lettuce in two Belgian fresh-cut produce processing companies. The pathogens Salmonella and Listeria monocytogenes were not detected. Total psychrotrophic aerobic bacterial counts (TPACs) in water samples, fresh produce, and environmental samples suggested that the TPAC is not a good indicator of overall quality and best manufacturing practices during production and processing. Because of the high TPACs in the harvested lettuce crops, the process water becomes quickly contaminated, and subsequent TPACs do not change much throughout the production process of a batch. The hygiene indicator Escherichia coli was used to assess the water management practices in these two companies in relation to food safety. Practices such as insufficient cleaning and disinfection of washing baths, irregular refilling of the produce wash baths with water of good microbial quality, and the use of high product/water ratios resulted in a rapid increase in E. coli in the processing water, with potential transfer to the end product (fresh-cut lettuce). The washing step in the production of fresh-cut lettuce was identified as a potential pathway for dispersion of microorganisms and introduction of E. coli to the end product via cross-contamination. An intervention step to reduce microbial contamination is needed, particularly when no sanitizers are used as is the case in some European Union countries. Thus, from a food safety point of view proper water management (and its validation) is a critical point in the fresh-cut produce processing industry.
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.
1995-01-01
NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.
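The kind of calculation such fatigue and fracture codes perform can be sketched with a textbook Paris-law integration driven by a stress-intensity factor. The constant geometry factor, material constants, and crack lengths below are hypothetical illustrations, not values from the NASA codes or their look-up tables:

```python
import math

def cycles_to_grow(a0, af, dS, Y=1.12, C=1e-11, m=3.0, da=1e-5):
    """Integrate Paris-law fatigue crack growth da/dN = C * dK**m from
    crack length a0 to af (metres) under stress range dS (MPa), with
    stress-intensity factor range dK = Y * dS * sqrt(pi * a).
    Returns the estimated cycle count."""
    n, a = 0.0, a0
    while a < af:
        dK = Y * dS * math.sqrt(math.pi * a)   # MPa*sqrt(m)
        n += da / (C * dK ** m)                # cycles spent on increment da
        a += da
    return n

print(f"{cycles_to_grow(1e-3, 5e-3, 100.0):.3e} cycles")
```

Production codes replace the constant Y with geometry-dependent stress-intensity solutions (the "look-up tables" mentioned above) and track multiple interacting cracks, but the growth-rate integration is the same in spirit.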
Life in the fast lane: high-throughput chemistry for lead generation and optimisation.
Hunter, D
2001-01-01
The pharmaceutical industry has come under increasing pressure due to regulatory restrictions on the marketing and pricing of drugs, competition, and the escalating costs of developing new drugs. These forces can be addressed by the identification of novel targets, reductions in the development time of new drugs, and increased productivity. Emphasis has been placed on identifying and validating new targets and on lead generation: the response from industry has been very evident in genomics and high throughput screening, where new technologies have been applied, usually coupled with a high degree of automation. The combination of numerous new potential biological targets and the ability to screen large numbers of compounds against many of these targets has generated the need for large diverse compound collections. To address this requirement, high-throughput chemistry has become an integral part of the drug discovery process. Copyright 2002 Wiley-Liss, Inc.
Evolution of microbiological analytical methods for dairy industry needs
Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence
2014-01-01
Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives on integrating the monitoring of microbial physiology to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry's needs. Recent studies show that polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potential. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675
Seismic anisotropy in deforming salt bodies
NASA Astrophysics Data System (ADS)
Prasse, P.; Wookey, J. M.; Kendall, J. M.; Dutko, M.
2017-12-01
Salt is often involved in forming hydrocarbon traps. Studying salt dynamics and deformation processes is important for the exploration industry. We have performed numerical texture simulations of single halite crystals deformed by simple shear and axial extension using the visco-plastic self-consistent (VPSC) approach. A methodology from subduction studies for estimating strain in a geodynamic simulation is applied to a complex high-resolution salt diapir model. The salt diapir deformation is modelled with the ELFEN software by our industrial partner Rockfield, which is based on a finite-element code. High-strain areas at the bottom of the head-like structures of the salt diapir show a high degree of seismic anisotropy due to the LPO development of halite crystals. The results demonstrate that a significant degree of seismic anisotropy can be generated, validating the view that this should be accounted for in the treatment of seismic data in, for example, salt diapir settings.
Streamlining Software Aspects of Certification: Report on the SSAC Survey
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Dorsey, Cheryl A.; Knight, John C.; Leveson, Nancy G.; McCormick, G. Frank
1999-01-01
The aviation system now depends on information technology more than ever before to ensure safety and efficiency. To address concerns about the efficacy of software aspects of the certification process, the Federal Aviation Administration (FAA) began the Streamlining Software Aspects of Certification (SSAC) program. The SSAC technical team was commissioned to gather data, analyze results, and propose recommendations to maximize efficiency and minimize cost and delay, without compromising safety. The technical team conducted two public workshops to identify and prioritize software approval issues, and conducted a survey to validate the most urgent of those issues. The SSAC survey, containing over two hundred questions about the FAA's software approval process, reached over four hundred industry software developers, aircraft manufacturers, and FAA designated engineering representatives. Three hundred people responded. This report presents the SSAC program rationale, survey process, preliminary findings, and recommendations.
Dynamic of particle-laden liquid sheet
NASA Astrophysics Data System (ADS)
Sauret, Alban; Jop, Pierre; Troger, Anthony
2016-11-01
Many industrial processes, such as surface coating or liquid transport in tubes, involve liquid sheets or thin liquid films of suspensions. In these situations, the thickness of the liquid film becomes comparable to the particle size, which leads to unexpected dynamics. In addition, the classical constitutive rheological law cannot be applied as the continuum approximation is no longer valid. Here, we consider experimentally a transient free liquid sheet that expands radially. We characterize the influence of the particles on the shape of the liquid film as a function of time and the atomization process. We highlight that the presence of particles modifies the thickness and the stability of the liquid sheet. Our study suggests that the influence of particles through capillary effects can modify significantly the dynamics of processes that involve suspensions and particles confined in liquid films.
A Tailored Ontology Supporting Sensor Implementation for the Maintenance of Industrial Machines.
Maleki, Elaheh; Belkadi, Farouk; Ritou, Mathieu; Bernard, Alain
2017-09-08
The long-term productivity of an industrial machine is improved by condition-based maintenance strategies. To this end, the integration of sensors and other cyber-physical devices is necessary in order to capture and analyze a machine's condition through its lifespan. Thus, choosing the best sensor is a critical step to ensure the efficiency of the maintenance process. Indeed, considering the variety of sensors and their features and performance, a formal classification of the sensor domain knowledge is crucial. This classification facilitates the search for and reuse of solutions during the design of a new maintenance service. Following a knowledge management methodology, the paper proposes and develops a new sensor ontology that structures the domain knowledge, covering both theoretical and experimental sensor attributes. An industrial case study is conducted to validate the proposed ontology and to demonstrate its utility as a guideline to ease the search for suitable sensors. Based on the ontology, the final solution will be implemented in a shared repository connected to legacy CAD (computer-aided design) systems. The selection of the best sensor is first obtained by matching application requirements with the sensor specifications proposed by this sensor repository; it is then refined from the experimentation results. The achieved solution is recorded in the sensor repository for future reuse. As a result, the time and cost of designing new condition-based maintenance services are reduced.
Application of target costing in machining
NASA Astrophysics Data System (ADS)
Gopalakrishnan, Bhaskaran; Kokatnur, Ameet; Gupta, Deepak P.
2004-11-01
In today's intensely competitive and highly volatile business environment, consistent development of low-cost, high-quality products that meet functionality requirements is key to a company's survival. Companies continuously strive to reduce costs while still producing quality products to stay ahead of the competition. Many companies have turned to target costing to achieve this objective. Target costing is a structured approach to determining the cost at which a proposed product, meeting the quality and functionality requirements, must be produced in order to generate the desired profit. It subtracts the desired profit margin from the company's selling price to establish the manufacturing cost of the product. An extensive literature review revealed that companies in the automotive, electronics, and process industries have reaped the benefits of target costing. However, the target costing approach has not been applied in the machining industry; instead, techniques based on Geometric Programming, Goal Programming, and Lagrange multipliers have been proposed for this industry. These models follow a forward approach, first selecting a set of machining parameters and then determining the machining cost. Hence, in this study we developed an algorithm to apply the concepts of target costing, a backward approach that selects the machining parameters based on the required machining cost, and is therefore more suitable for practical applications in process improvement and cost reduction. A target costing model was developed for the turning operation and successfully validated using practical data.
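The backward selection logic described in this abstract can be sketched in a few lines. The cost model, prices, and parameter grids below are invented for illustration only; they are not the paper's validated turning model.

```python
# Minimal sketch of the backward (target-costing) selection logic for a
# turning operation. All numbers and the cost model are hypothetical
# illustrations, not the authors' validated model.

def target_cost(selling_price, desired_margin):
    """Target cost = selling price - desired profit."""
    return selling_price * (1.0 - desired_margin)

def machining_cost(speed_m_min, feed_mm_rev, machine_rate=1.2, tool_cost=0.8):
    """Toy cost model: faster cutting lowers time cost but raises tool wear."""
    time_cost = machine_rate * 1000.0 / (speed_m_min * feed_mm_rev)
    wear_cost = tool_cost * (speed_m_min / 100.0) ** 2
    return time_cost + wear_cost

def select_parameters(budget, speeds, feeds):
    """Backward approach: keep only parameter sets whose cost fits the
    target-cost budget, then pick the cheapest of those."""
    feasible = [(machining_cost(v, f), v, f) for v in speeds for f in feeds
                if machining_cost(v, f) <= budget]
    return min(feasible) if feasible else None

budget = target_cost(selling_price=50.0, desired_margin=0.3)   # 35.0
best = select_parameters(budget, speeds=[80, 100, 120, 150], feeds=[0.1, 0.2, 0.3])
print(budget, best)
```

The point of the sketch is the direction of the calculation: the cost budget comes first, and the machining parameters are chosen to fit it, rather than the other way around.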
Fiehe, Sandra; Wagner, Georg; Schlanstein, Peter; Rosefort, Christiane; Kopp, Rüdger; Bensberg, Ralf; Knipp, Peter; Schmitz-Rode, Thomas; Steinseifer, Ulrich; Arens, Jutta
2014-04-01
The ultimate objective of university research and development projects is usually to create knowledge, but also to successfully transfer results to industry for subsequent marketing. We hypothesized that the university technology transfer requires efficient measures to improve this important step. Besides good scientific practice, foresighted and industry-specific adapted documentation of research processes in terms of a quality management system might improve the technology transfer. In order to bridge the gap between research institute and cooperating industry, a model project has been accompanied by a project specific amount of quality management. However, such a system had to remain manageable and must not constrain the researchers' creativity. Moreover, topics and research team are strongly interdisciplinary, which entails difficulties regarding communication because of different perspectives and terminology. In parallel to the technical work of the model project, an adaptable quality management system with a quality manual, defined procedures, and forms and documents accompanying the research, development and validation was implemented. After process acquisition and analysis the appropriate amount of management for the model project was identified by a self-developed rating system considering project characteristics like size, innovation, stakeholders, interdisciplinarity, etc. Employees were trained according to their needs. The management was supported and the technical documentation was optimized. Finally, the quality management system has been transferred successfully to further projects.
Computational Modeling in Structural Materials Processing
NASA Technical Reports Server (NTRS)
Meyyappan, Meyya; Arnold, James O. (Technical Monitor)
1997-01-01
High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in the aerospace, automotive, and machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable, since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.
Mantanus, J; Rozet, E; Van Butsele, K; De Bleye, C; Ceccato, A; Evrard, B; Hubert, Ph; Ziémons, E
2011-08-05
Using near infrared (NIR) and Raman spectroscopy as PAT tools, three critical quality attributes of a silicone-based drug reservoir were studied. First, the Active Pharmaceutical Ingredient (API) homogeneity in the reservoir was evaluated using Raman spectroscopy (mapping): the API distribution within the industrial drug reservoirs was found to be homogeneous, while API aggregates were detected in laboratory-scale samples manufactured with a non-optimal mixing process. Second, the crosslinking process of the reservoirs was monitored at different temperatures with NIR spectroscopy. Conformity tests and Principal Component Analysis (PCA) were performed on the collected data to determine the relation between the temperature and the time necessary to reach the crosslinking endpoints. Agreement was found between the conformity test results and the PCA results. Compared to the conformity test method, PCA had the advantage of discriminating the heating effect from the crosslinking effect, which occur together during the monitored process; the two approaches were therefore found to be complementary. Third, based on the HPLC reference method, an NIR model able to quantify the API in the drug reservoir was developed and thoroughly validated. Partial Least Squares (PLS) regression on the calibration set was performed to build prediction models, whose ability to quantify accurately was tested with the external validation set. The 1.2% Root Mean Squared Error of Prediction (RMSEP) of the NIR model indicated the global accuracy of the model. The accuracy profile based on tolerance intervals was used to generate a complete validation report. The 95% tolerance interval calculated on the validation results indicated that each future result will have a relative error below ±5% with a probability of at least 95%. In conclusion, three critical quality attributes of silicone-based drug reservoirs were quickly and efficiently evaluated by NIR and Raman spectroscopy.
Copyright © 2011 Elsevier B.V. All rights reserved.
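The two validation figures reported above, RMSEP and a ±5% relative-error acceptance check, are straightforward to compute once reference and predicted values are available. The HPLC/NIR values below are invented placeholders, not the study's data.

```python
import math

# Hypothetical external-validation data (% API content): HPLC reference
# values vs. NIR model predictions. Numbers are illustrative only.
reference = [18.0, 19.5, 20.0, 20.5, 21.0, 22.0]
predicted = [18.2, 19.3, 20.1, 20.6, 20.8, 22.3]

# Root Mean Squared Error of Prediction over the validation set
rmsep = math.sqrt(sum((p - r) ** 2 for p, r in zip(reference, predicted))
                  / len(reference))

# Relative error of each prediction, as used in accuracy-profile style
# acceptance limits (e.g. +/- 5%)
rel_errors = [100.0 * (p - r) / r for p, r in zip(reference, predicted)]
within_limits = all(abs(e) <= 5.0 for e in rel_errors)

print(f"RMSEP = {rmsep:.3f}, all within +/-5%: {within_limits}")
```

A full accuracy profile additionally requires tolerance-interval statistics per concentration level; the sketch shows only the point estimates that feed into it.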
Multiphysics modeling of selective laser sintering/melting
NASA Astrophysics Data System (ADS)
Ganeriwala, Rishi Kumar
A significant percentage of total global employment is due to the manufacturing industry. However, manufacturing also accounts for nearly 20% of total energy usage in the United States according to the EIA. In fact, manufacturing accounted for 90% of industrial energy consumption and 84% of industry carbon dioxide emissions in 2002. Clearly, advances in manufacturing technology and efficiency are necessary to curb emissions and help society as a whole. Additive manufacturing (AM) refers to a relatively recent group of manufacturing technologies whereby one can 3D print parts, which has the potential to significantly reduce waste, reconfigure the supply chain, and generally disrupt the whole manufacturing industry. Selective laser sintering/melting (SLS/SLM) is one type of AM technology with the distinct advantage of being able to 3D print metals and rapidly produce net shape parts with complicated geometries. In SLS/SLM parts are built up layer-by-layer out of powder particles, which are selectively sintered/melted via a laser. However, in order to produce defect-free parts of sufficient strength, the process parameters (laser power, scan speed, layer thickness, powder size, etc.) must be carefully optimized. Obviously, these process parameters will vary depending on material, part geometry, and desired final part characteristics. Running experiments to optimize these parameters is costly, energy intensive, and extremely material specific. Thus a computational model of this process would be highly valuable. In this work a three dimensional, reduced order, coupled discrete element - finite difference model is presented for simulating the deposition and subsequent laser heating of a layer of powder particles sitting on top of a substrate. Validation is provided and parameter studies are conducted showing the ability of this model to help determine appropriate process parameters and an optimal powder size distribution for a given material. 
Next, thermal stresses upon cooling are calculated using the finite difference method. Different case studies are performed and general trends can be seen. This work concludes by discussing future extensions of this model and the need for a multi-scale approach to achieve comprehensive part-level models of the SLS/SLM process.
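The finite-difference half of the coupled model can be illustrated with a one-dimensional explicit heat-conduction sketch: laser flux at the powder surface, fixed temperature at the substrate. All material values and the flux are placeholder numbers, not calibrated SLS/SLM parameters, and the real model is three-dimensional and coupled to a discrete element layer.

```python
# One-dimensional explicit finite-difference (FTCS) sketch of laser heating
# of a powder layer on a substrate. Material values and laser flux are
# placeholder numbers, not calibrated SLS/SLM parameters.
k, rho, cp = 10.0, 4000.0, 600.0        # W/m/K, kg/m^3, J/kg/K (assumed)
alpha = k / (rho * cp)                   # thermal diffusivity, m^2/s
nx, dx = 50, 2e-6                        # 50 nodes over a ~100 um layer
dt = 0.4 * dx * dx / alpha               # stable: dt < dx^2 / (2*alpha)
q = 5e7                                  # absorbed laser flux, W/m^2 (assumed)

T = [300.0] * nx                         # initial temperature, K
for _ in range(2000):
    Tn = T[:]
    for i in range(1, nx - 1):           # interior nodes: FTCS update
        Tn[i] = T[i] + alpha * dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1])
    Tn[0] = Tn[1] + q * dx / k           # surface node: prescribed heat flux
    Tn[-1] = 300.0                       # substrate held at ambient
    T = Tn

print(f"surface temperature after heating: {T[0]:.0f} K")
```

The stability restriction on `dt` is the standard explicit-scheme limit; a production model would also need temperature-dependent properties, phase change, and the powder-bed effective conductivity the thesis discusses.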
NASA Astrophysics Data System (ADS)
Kaiser, C.; Roll, K.; Volk, W.
2017-09-01
In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, an efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which results in disadvantages especially in the early product development phase. To increase the efficiency of designing hemming processes, this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panel is initially split into individual segments. By parametrizing each of the segments and assigning basic geometric shapes, the outline of the part is approximated. Based on this, hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the geometric basic shapes by performing a metamodel-based segmental product validation. The metamodel is based on an element-similar formulation that includes a reference dataset of various geometric basic shapes. A random automotive outer panel can now be analysed and optimized based on the hemming-specific database. By implementing this approach into a planning system, an efficient optimization of hemming process design will be enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle's development process.
Method validation for methanol quantification present in working places
NASA Astrophysics Data System (ADS)
Muna, E. D. M.; Bizarri, C. H. B.; Maciel, J. R. M.; da Rocha, G. P.; de Araújo, I. O.
2015-01-01
Given the widespread use of methanol by different industry sectors and the high toxicity associated with this substance, an analytical method is needed that can determine methanol levels in the air of working environments in a sensitive, precise, and accurate manner. Based on the methodology established by the National Institute for Occupational Safety and Health (NIOSH), a methodology for the determination of methanol in silica gel tubes was validated; its effectiveness was demonstrated through participation in the international collaborative program sponsored by the American Industrial Hygiene Association (AIHA).
Zimmermann, Hartmut F; Hentschel, Norbert
2011-01-01
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. 
The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
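The ranking step described above reduces to computing the risk priority number, RPN = occurrence × severity × detectability, from the 1-to-10 ratings and sorting. The failure modes and ratings below are invented examples, not taken from the cited guideline.

```python
# Minimal sketch of FMEA risk ranking: RPN = occurrence x severity x
# detectability, each rated on a 1-to-10 scale. Failure modes and ratings
# are invented examples for illustration.
failure_modes = [
    # (name, occurrence, severity, detectability)
    ("bioreactor pH excursion",      4, 8, 3),
    ("chromatography resin fouling", 6, 5, 5),
    ("filter integrity breach",      2, 9, 2),
]

ranked = sorted(((o * s * d, name) for name, o, s, d in failure_modes),
                reverse=True)
for rpn, name in ranked:
    print(f"RPN {rpn:4d}  {name}")
```

In practice the ranked list is then cut at an acceptance threshold, and parameters above the threshold are flagged for process development, characterization, or validation work.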
NASA Astrophysics Data System (ADS)
Ismail, A. H.; Mahardika, R. Z. Z.
2017-12-01
Supply chain management has gained significance with the impact of globalization. In the present worldwide market, a well-managed supply chain is among the most vital requirements for staying competitive. For any organization, including those in the cement industry, the most critical decision in the initial stage of supply chain management is buying products, materials, or services from suppliers, so the role of suppliers is irrefutably important in globally competitive markets. An appropriate supplier-selection decision can reduce costs across the supply chain. However, the decision is complex because of the various criteria involved and the need to engage suitable experts within the company to reach a valid judgment against those criteria. In this study, the supplier selection of Indonesia's leading cement company is analyzed using a popular multi-criteria decision-making method, Saaty's analytic network process (ANP). It is employed to select the best of three suppliers of pasted bags. The highest-ranked supplier emerges from several major steps, from building the relationships between the various criteria to rating the alternatives with the help of experts from the company. The results show that communication capability, flexible payment terms, and the ability to meet delivery quantities are the most important criteria in pasted-bag supplier selection in the Indonesian cement industry, with ANP coefficients of 0.155, 0.110 and 0.1, respectively. Based on the coefficient values in the limit supermatrix, supplier 2 (A2) had the highest score, with 64.7% (0.13 ANP coefficient).
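The "limit supermatrix" step mentioned above is the numerical core of ANP: a column-stochastic weighted supermatrix is raised to successive powers until its columns converge to the limiting priorities. The 3x3 matrix below is an invented example, not the paper's supplier data.

```python
# Sketch of the ANP limit-supermatrix computation by repeated squaring.
# The 3x3 weighted supermatrix W is an invented, column-stochastic example.

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def limit_supermatrix(w, tol=1e-9, max_iter=100):
    m = w
    for _ in range(max_iter):
        nxt = mat_mul(m, m)          # squaring converges quickly
        if max(abs(nxt[i][j] - m[i][j]) for i in range(len(w))
               for j in range(len(w))) < tol:
            return nxt
        m = nxt
    return m

# Columns sum to 1 (column-stochastic), as ANP requires.
W = [[0.5, 0.3, 0.2],
     [0.3, 0.4, 0.5],
     [0.2, 0.3, 0.3]]
L = limit_supermatrix(W)
priorities = [row[0] for row in L]    # every column converges to the same vector
print(priorities)
```

For an irreducible, aperiodic supermatrix every column of the limit matrix equals the same stationary priority vector, which is then read off as the ranking of criteria or alternatives.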
Intellectual property and access to medicines: an analysis of legislation in Central America
Cerón, Alejandro
2009-01-01
Abstract Globalization of intellectual property (IP) protection for medicines has been advancing during the past decade. Countries are obliged to adapt their legislation as a requirement of their membership to the World Trade Organization or as a condition of being part of international trade agreements. There is a growing recognition that, in low-income countries, stronger IP protection is a barrier to access to medicines. At the same time, the number of low-income countries writing national legislation to protect IP for pharmaceutical products is growing worldwide, but little research has been done on the ways in which this process is happening at the national level. This paper aims to contribute to the understanding of the implementation of IP legislation at the national level by providing a comparative analysis of the countries that are part of the United States–Dominican Republic–Central America Free Trade Agreement (DR-CAFTA). The analysis shows three trends. First, countries have often implemented stronger IP protection than required by trade agreements. Second, some countries have adopted IP protection before signing the trade agreements. Third, the process of ratification of DR-CAFTA increased public debate around these issues, which in some cases led to IP legislation that considers public health needs. These trends suggest that industrialized countries and the pharmaceutical industry are using more tactics than just trade agreements to push for increased IP protection and that the process of national legislation is a valid arena for confronting public health needs to those of the industry. PMID:19876546
Hurtado, Adriana; Guàrdia, Maria Dolors; Picouet, Pierre; Jofré, Anna; Ros, José María; Bañón, Sancho
2017-02-01
Non-thermal pasteurisation by high-pressure processing (HPP) is increasingly replacing thermal processing (TP) to maintain the properties of fresh fruit products. The resulting products need to be validated from a sensory and nutritional standpoint. The objective was to assess a mild HPP treatment to stabilise red fruit-based smoothies in a wide-ranging study covering sensory quality and major nutrients. HPP (350 MPa/10 °C/5 min) provided 'fresh-like' smoothies, free of cooked-fruit flavours, for at least 14 days at 4 °C, although their sensory stability was low compared with the TP smoothies (85 °C/7 min). In HPP smoothies, the loss of fresh fruit flavour and reduced sliminess were the clearest signs of sensory deterioration during storage. Furthermore, HPP permitted a higher initial retention of vitamin C, although this vitamin and, to a lesser extent, total phenols had a higher degradation rate during storage. The sugar content was not affected by either processing treatment. Mild HPP treatment did not alter the sensory and nutritional properties of smoothies. The sensory and nutritional losses during storage were less than might be expected, probably due to the high antioxidant content and the natural turbidity provided by red fruits. © 2016 Society of Chemical Industry.
NASA Technical Reports Server (NTRS)
Kuo, Kenneth K.; Lu, Y. C.; Chiaverini, Martin J.; Harting, George C.
1994-01-01
An experimental study on the fundamental processes involved in fuel decomposition and boundary layer combustion in hybrid rocket motors is being conducted at the High Pressure Combustion Laboratory of the Pennsylvania State University. This research should provide a useful engineering technology base in the development of hybrid rocket motors as well as a fundamental understanding of the complex processes involved in hybrid propulsion. A high pressure slab motor has been designed and manufactured for conducting experimental investigations. Oxidizer (LOX or GOX) supply and control systems have been designed and partly constructed for the head-end injection into the test chamber. Experiments using HTPB fuel, as well as fuels supplied by NASA designated industrial companies will be conducted. Design and construction of fuel casting molds and sample holders have been completed. The portion of these items for industrial company fuel casting will be sent to the McDonnell Douglas Aerospace Corporation in the near future. The study focuses on the following areas: observation of solid fuel burning processes with LOX or GOX, measurement and correlation of solid fuel regression rate with operating conditions, measurement of flame temperature and radical species concentrations, determination of the solid fuel subsurface temperature profile, and utilization of experimental data for validation of a companion theoretical study (Part 2) also being conducted at PSU.
Integrated Syntactic/Semantic XML Data Validation with a Reusable Software Component
ERIC Educational Resources Information Center
Golikov, Steven
2013-01-01
Data integration is a critical component of enterprise system integration, and XML data validation is the foundation for sound data integration of XML-based information systems. Since B2B e-commerce relies on data validation as one of the critical components for enterprise integration, it is imperative for financial industries and e-commerce…
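The two layers of XML validation this abstract distinguishes, syntactic and semantic, can be sketched with the standard library alone: the parser enforces well-formedness, and a rule function enforces business constraints a schema cannot express. The invoice format and rules below are invented for illustration.

```python
# Sketch of a two-layer XML check: a syntactic pass (well-formedness via the
# standard-library parser) followed by a semantic pass (business rules).
# The invoice format and its rules are invented examples.
import xml.etree.ElementTree as ET

DOC = """<invoice>
  <total currency="USD">150.00</total>
  <line amount="100.00"/>
  <line amount="50.00"/>
</invoice>"""

def validate(xml_text):
    try:                                      # layer 1: syntax
        root = ET.fromstring(xml_text)
    except ET.ParseError as e:
        return [f"not well-formed: {e}"]
    errors = []
    # layer 2: semantics -- line amounts must sum to the declared total
    total = float(root.findtext("total", default="0"))
    lines = sum(float(ln.get("amount", "0")) for ln in root.iter("line"))
    if abs(total - lines) > 0.005:
        errors.append(f"total {total} != sum of lines {lines}")
    return errors

print(validate(DOC))            # empty list: document passes both layers
```

Production B2B pipelines would typically add an XSD or Schematron pass between the two layers; the sketch only shows the division of labour between parser and rule code.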
PFEM-based modeling of industrial granular flows
NASA Astrophysics Data System (ADS)
Cante, J.; Dávalos, C.; Hernández, J. A.; Oliver, J.; Jonsén, P.; Gustafsson, G.; Häggblad, H.-Å.
2014-05-01
The potential of numerical methods for the solution and optimization of industrial granular flow problems is widely accepted by the industries of this field, the challenge being to promote their industrial practice effectively. In this paper, we attempt to make an exploratory step in this regard by using a numerical model based on continuum mechanics and on the so-called Particle Finite Element Method (PFEM). This goal is achieved by focusing on two specific industrial applications in the mining and pellet-manufacturing industries: silo discharge and calculation of power draw in tumbling mills. Both examples are representative of variations in the mechanical response of granular material, ranging from a stagnant configuration to a flow condition. The silo discharge is validated using experimental data collected on a full-scale, flat-bottomed cylindrical silo. The simulation is conducted with the aim of characterizing and understanding the correlation between flow patterns and pressures for concentric discharges. In the second example, the potential of PFEM as a numerical tool to track the positions of the particles inside the drum is analyzed. Pressure and wall-pressure distributions are also studied. The power draw is also computed and validated against experiments in which the power is plotted in terms of the rotational speed of the drum.
Determination of aflatoxins in by-products of industrial processing of cocoa beans.
Copetti, Marina V; Iamanaka, Beatriz T; Pereira, José Luiz; Lemes, Daniel P; Nakano, Felipe; Taniwaki, Marta H
2012-01-01
This study has examined the occurrence of aflatoxins in 168 samples of different fractions obtained during the processing of cocoa in manufacturing plants (shell, nibs, mass, butter, cake and powder) using an optimised methodology for cocoa by-products. The method validation was based on selectivity, linearity, limit of detection and recovery. The method was shown to be adequate for use in quantifying the contamination of cocoa by aflatoxins B1, B2, G1 and G2. Furthermore, the method was easier to use than other methods available in the literature. For aflatoxin extraction from cocoa samples, a methanol-water solution was used, and then immunoaffinity columns were employed for clean-up before determination by high-performance liquid chromatography. A survey demonstrated a widespread occurrence of aflatoxins in cocoa by-products, although in general the levels of aflatoxins present in the fractions from industrial processing of cocoa were low. A maximum aflatoxin contamination of 13.3 ng/g was found in a nib sample. The lowest contamination levels were found in cocoa butter. Continued monitoring of aflatoxins in cocoa by-products is nevertheless necessary because these toxins have a high toxicity to humans and cocoa is widely consumed by children through cocoa-containing products, like candies.
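The calibration-based figures of merit named in the validation (linearity, limit of detection) can be sketched from a least-squares fit, with an ICH-style detection limit LOD = 3.3·s/b, where s is the residual standard deviation and b the slope. The spiking levels and peak areas below are invented numbers, not the study's data.

```python
# Sketch of calibration-based method-validation figures: linearity via
# least squares, and LOD = 3.3 * s / b (ICH-style). Concentrations (ng/g)
# and peak areas are invented illustration values.
conc = [0.5, 1.0, 2.0, 5.0, 10.0]      # spiking levels, ng/g
area = [52.0, 101.0, 205.0, 498.0, 1003.0]

n = len(conc)
mx, my = sum(conc) / n, sum(area) / n
slope = (sum(x * y for x, y in zip(conc, area)) - n * mx * my) / \
        (sum(x * x for x in conc) - n * mx * mx)
intercept = my - slope * mx
resid = [y - (intercept + slope * x) for x, y in zip(conc, area)]
s_res = (sum(r * r for r in resid) / (n - 2)) ** 0.5   # residual std. dev.

lod = 3.3 * s_res / slope
print(f"slope={slope:.2f}  intercept={intercept:.2f}  LOD={lod:.3f} ng/g")
```

Recovery and selectivity, the other two criteria named, come from spiked-sample experiments rather than the calibration curve and are not shown here.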
NREL Begins On-Site Validation of Drivetrain Gearbox and Bearings
Drivetrain failure often leads to higher-than-expected operations and maintenance costs for the wind industry, which NREL researchers aim to reduce. The validation is expected to last through the spring.
NASA Astrophysics Data System (ADS)
Tsibranska, Irene; Vlaev, Serafim; Tylkowski, Bartosz
2018-01-01
Integrating biological treatment with membrane separation has found broad application and industrial attention. Submerged membrane bioreactors (SMBRs), based on membrane modules immersed in the bioreactor, or side-stream modules connected in a recycle loop, have been employed in different biotechnological processes for the separation of thermally unstable products. Fouling is one of the most important challenges in integrated SMBRs. A number of works are devoted to fouling analysis and its treatment, especially exploring opportunities for enhanced fouling control in SMBRs. The main goal of this review is to provide a comprehensive yet concise overview of modeling fouling in SMBRs, in view of the problem of model validation, either by real system measurements at different scales or by analysis of the obtained theoretical results. The review focuses on the current state of research applying computational fluid dynamics (CFD) modeling techniques.
Donelan, Ronan; Walker, Stuart; Salek, Sam
2016-01-01
The impact of decision-making during the development and the regulatory review of medicines greatly influences the delivery of new medicinal products. Currently, there is no generic instrument that can be used to assess the quality of decision-making. This study describes the development of the Quality of Decision-Making Orientation Scheme (QoDoS©) instrument for appraising the quality of decision-making. Semi-structured interviews about decision-making were carried out with 29 senior decision makers from the pharmaceutical industry (10), regulatory authorities (9) and contract research organizations (10). The interviews offered a qualified understanding of the subjective decision-making approach, influences, behaviors and other factors that impact such processes for individuals and organizations involved in the delivery of new medicines. Thematic analysis of the transcribed interviews was carried out using NVivo8® software. Content validity was assessed with qualitative and quantitative data by an expert panel, which led to the developmental version of the QoDoS©. Further psychometric evaluations were performed, including factor analysis, item reduction, reliability testing and construct validation. The thematic analysis of the interviews yielded a 94-item initial version of the QoDoS© with a 5-point Likert scale. The instrument was tested for content validity using a panel of experts for language clarity, completeness, relevance and scaling, resulting in a favorable agreement by panel members with an intra-class correlation coefficient value of 0.89 (95% confidence interval = 0.56, 0.99). A 76-item QoDoS© (version 2) emerged from content validation. Factor analysis produced a 47-item measure with four domains. The 47-item QoDoS© (version 3) showed high internal consistency (n = 120, Cronbach's alpha = 0.89), high reproducibility (n = 20, intra-class correlation = 0.77) and a mean completion time of 10 min.
Reliability testing and construct validation were successfully performed. The QoDoS© is both reliable and valid for use. It has the potential for extensive use in medicines development by both the pharmaceutical industry and regulatory authorities. The QoDoS© can be used to assess the quality of decision-making and to inform decision makers of the factors that influence decision-making.
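The internal-consistency statistic reported above, Cronbach's alpha, is defined as k/(k-1) · (1 − Σ item variances / variance of totals) and is easy to compute directly. The Likert responses below are invented, not QoDoS data.

```python
# Sketch of the internal-consistency statistic reported above:
# Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)).
# The 5-point Likert responses below are invented, not QoDoS data.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)   # sample variance

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]             # per-respondent totals
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

# 4 items x 6 respondents, scores 1-5 (invented)
responses = [
    [4, 5, 3, 4, 5, 4],
    [4, 4, 3, 5, 5, 4],
    [3, 5, 2, 4, 4, 3],
    [4, 5, 3, 5, 5, 4],
]
print(f"alpha = {cronbach_alpha(responses):.3f}")
```

Values near 0.9, as reported for the 47-item QoDoS©, indicate that the items vary together and so measure a common construct.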
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2009-08-01
Failure Mode and Effect Analysis (FMEA) has been applied to the risk assessment of snail processing. A tentative approach to applying FMEA in the snail industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the failure modes occurring in a food chain system (a snail processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points have been identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work, a comparison of ISO 22000 analysis with HACCP is carried out for snail processing and packaging. The main emphasis, however, was placed on quantifying risk assessment by determining the RPN per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells, and poisonous mushrooms were identified as the processes with the highest RPNs (280, 240, 147 and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, RPN values were recalculated, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect or tree) diagram led to converging results, thus corroborating the validity of the conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a snail processing industry is considered imperative.
Online monitoring of fermentation processes via non-invasive low-field NMR.
Kreyenschulte, Dirk; Paciok, Eva; Regestein, Lars; Blümich, Bernhard; Büchs, Jochen
2015-09-01
For the development of biotechnological processes in academia as well as in industry, new techniques are required that enable online monitoring for process characterization and control. Nuclear magnetic resonance (NMR) spectroscopy is a promising analytical tool, which has already found broad application in offline process analysis. Its use for online monitoring, however, is often constrained by the high complexity of custom-made NMR bioreactors and the considerable cost of high-field NMR instruments (>US$200,000). Therefore, low-field ¹H NMR was investigated in this study in a bypass system for real-time observation of fermentation processes. The new technique was validated with two microbial systems. For the yeast Hansenula polymorpha, glycerol consumption could be accurately assessed in spite of the presence of high amounts of complex constituents in the medium. During cultivation of the fungal strain Ustilago maydis, which is accompanied by the formation of several by-products, the concentrations of glucose, itaconic acid, and the relative amount of glycolipids could be quantified. While low-field spectra are characterized by reduced spectral resolution compared to high-field NMR, the compact design combined with the high temporal resolution of spectra acquisition (15 s to 8 min) allowed online monitoring of the respective processes. Both applications clearly demonstrate that the investigated technique is well suited for reaction monitoring in opaque media while at the same time being highly robust and chemically specific. It can thus be concluded that low-field NMR spectroscopy has great potential for non-invasive online monitoring of biotechnological processes at research and practical industrial scales. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Zhang, Zhongyang; Nian, Qiong; Doumanidis, Charalabos C.; Liao, Yiliang
2018-02-01
Nanosecond pulsed laser shock processing (LSP) techniques, including laser shock peening, laser peen forming, and laser shock imprinting, have been employed in widespread industrial applications. In these processes, the main beneficial characteristic is the laser-induced shockwave with high pressure (on the order of GPa), which leads to plastic deformation at an ultrahigh strain rate (10⁵-10⁶ s⁻¹) on the surface of target materials. Although LSP processes have been extensively studied experimentally, little effort has been devoted to elucidating the underlying process mechanisms through a physics-based process model. In particular, the development of a first-principles model is critical for process optimization and novel process design. This work introduces such a theoretical model for a fundamental understanding of process mechanisms in LSP. Emphasis is placed on the laser-matter interaction and plasma dynamics. The model is capable of predicting key parameters including electron and ion temperatures, plasma state variables (temperature, density, and pressure), and the propagation of the laser shockwave. The modeling results were validated against experimental data.
Participatory design of a preliminary safety checklist for general practice
Bowie, Paul; Ferguson, Julie; MacLeod, Marion; Kennedy, Susan; de Wet, Carl; McNab, Duncan; Kelly, Moya; McKay, John; Atkinson, Sarah
2015-01-01
Background The use of checklists to minimise errors is well established in high reliability, safety-critical industries. In health care there is growing interest in checklists to standardise checking processes and ensure task completion, and so provide further systemic defences against error and patient harm. However, in UK general practice there is limited experience of safety checklist use. Aim To identify workplace hazards that impact on safety, health and wellbeing, and performance, and codesign a standardised checklist process. Design and setting Application of mixed methods to identify system hazards in Scottish general practices and develop a safety checklist based on human factors design principles. Method A multiprofessional ‘expert’ group (n = 7) and experienced front-line GPs, nurses, and practice managers (n = 18) identified system hazards and developed and validated a preliminary checklist using a combination of literature review, documentation review, consensus building workshops using a mini-Delphi process, and completion of content validity index exercise. Results A prototype safety checklist was developed and validated consisting of six safety domains (for example, medicines management), 22 sub-categories (for example, emergency drug supplies) and 78 related items (for example, stock balancing, secure drug storage, and cold chain temperature recording). Conclusion Hazards in the general practice work system were prioritised that can potentially impact on the safety, health and wellbeing of patients, GP team members, and practice performance, and a necessary safety checklist prototype was designed. However, checklist efficacy in improving safety processes and outcomes is dependent on user commitment, and support from leaders and promotional champions. Although further usability development and testing is necessary, the concept should be of interest in the UK and internationally. PMID:25918338
Biowaste biorefinery in Europe: opportunities and research & development needs.
Fava, Fabio; Totaro, Grazia; Diels, Ludo; Reis, Maria; Duarte, Jose; Carioca, Osvaldo Beserra; Poggi-Varaldo, Héctor M; Ferreira, Bruno Sommer
2015-01-25
This review explores the needs and opportunities for research & development in the field of biowaste biorefinery in Europe. In recent years, modern industry has paid close attention to organic waste as a precious new bioresource. Specific biowaste valorisation pathways focus on food processing waste, the food sector being the largest manufacturing sector in Europe. However, these pathways need to be further tested and validated and then transferred to larger scale. In particular, they also need to become integrated, combining biomass pretreatments and recovery of biogenic chemicals with bioconversion processes in order to obtain a large class of chemicals. This will help to (a) use the whole biowaste, avoiding the production of residues and providing the approach with the required environmental sustainability, and (b) produce different biobased products that enter different markets, to achieve the economic sustainability of the whole biorefinery. However, the costs of the developed integrated processes might be high, mostly because the industry dealing with such issues is still underdeveloped and therefore dominated by high processing costs. Such costs can be significantly reduced by intensifying research & development on process integration and intensification. The low or zero cost of the starting material, along with the environmental benefits of the concomitant biowaste disposal, would offset the high capital costs of initiating such a biorefinery. As long as oil prices tend to increase (and they will), this strategy will become even more attractive. Copyright © 2013 Elsevier B.V. All rights reserved.
Ten Commandments Revisited: A Ten-Year Perspective on the Industrial Application of Formal Methods
NASA Technical Reports Server (NTRS)
Bowen, Jonathan P.; Hinchey, Michael G.
2005-01-01
Ten years ago, our 1995 paper Ten Commandments of Formal Methods suggested some guidelines to help ensure the success of a formal methods project. It proposed ten important requirements (or "commandments") for formal developers to consider and follow, based on our knowledge of several industrial application success stories, most of which have been reported in more detail in two books. The paper was surprisingly popular, is still widely referenced, and is used as required reading in a number of formal methods courses. However, not all readers have agreed with some of our commandments, feeling that they may not be valid in the long term. We re-examine the original commandments ten years on and consider their validity in the light of a further decade of industrial best practice and experience.
Zhang, Ti; Cai, Shuang; Forrest, Wai Chee; Mohr, Eva; Yang, Qiuhong; Forrest, M Laird
2016-09-01
Cisplatin, a platinum chemotherapeutic, is one of the most commonly used chemotherapeutic agents for many solid tumors. In this work, we developed and validated an inductively coupled plasma mass spectrometry (ICP-MS) method for quantitative determination of platinum levels in rat urine, plasma, and tissue matrices including liver, brain, lungs, kidney, muscle, heart, spleen, bladder, and lymph nodes. The tissues were processed using a microwave accelerated reaction system (MARS) prior to analysis on an Agilent 7500 ICP-MS. According to the Food and Drug Administration guidance for industry, bioanalytical validation parameters of the method, such as selectivity, accuracy, precision, recovery, and stability, were evaluated in rat biological samples. Our data suggested that the method was selective for platinum, without interference from other elements present, and the lower limit of quantification was 0.5 ppb. The accuracy and precision of the method were within 15% variation, and the recoveries of platinum for all tissue matrices examined were determined to be 85-115% of the theoretical values. The stability of the platinum-containing solutions, including calibration standards, stock solutions, and processed samples in rat biological matrices, was investigated. Results indicated that the samples were stable after three cycles of freeze-thaw and for up to three months. © The Author(s) 2016.
DOT National Transportation Integrated Search
2016-04-15
This state of the practice review is a literature and industry review of existing vehicle trajectory datasets, vehicle trajectory collection methods, and traffic simulation model validation techniques. This report has the following four sections and ...
Quantitative analysis of packed and compacted granular systems by x-ray microtomography
NASA Astrophysics Data System (ADS)
Fu, Xiaowei; Milroy, Georgina E.; Dutt, Meenakshi; Bentham, A. Craig; Hancock, Bruno C.; Elliott, James A.
2005-04-01
The packing and compaction of powders are common processes in the pharmaceutical, food, ceramic, and powder metallurgy industries. Understanding how particles pack in a confined space and how powders behave during compaction is crucial for producing high quality products. This paper outlines a new technique, based on modern desktop X-ray tomography and image processing, to quantitatively investigate the packing of particles during powder compaction and provide insight into how powders densify during compaction, relating material properties and processing conditions to tablet manufacture. A variety of powder systems were considered, including glass, sugar, and NaCl, with a typical particle size of 200-300 μm, and binary mixtures of NaCl and glass spheres. The results are new and have been validated by SEM observation and numerical simulations using the discrete element method (DEM). The research demonstrates that the XMT technique has potential for further investigation of pharmaceutical processing and even for verifying other physical models of complex packing.
Han, Tae Hee; Kim, Moon Jung; Kim, Shinyoung; Kim, Hyun Ok; Lee, Mi Ae; Choi, Ji Seon; Hur, Mina; St John, Andrew
2013-05-01
Failure modes and effects analysis (FMEA) is a risk management tool used by the manufacturing industry but now being applied in laboratories. Teams from six South Korean blood banks used this tool to map their manual and automated blood grouping processes and determine the risk priority numbers (RPNs) as a total measure of error risk. The RPNs determined by each of the teams consistently showed that the use of automation dramatically reduced the RPN compared to manual processes. In addition, FMEA showed where the major risks occur in each of the manual processes and where attention should be prioritized to improve the process. Despite no previous experience with FMEA, the teams found the technique relatively easy to use and the subjectivity associated with assigning risk numbers did not affect the validity of the data. FMEA should become a routine technique for improving processes in laboratories. © 2012 American Association of Blood Banks.
Ibrahim, Reham S; Fathy, Hoda
2018-03-30
Tracking the impact of commonly applied post-harvest and industrial processing practices on the compositional integrity of ginger rhizome was undertaken in this work. Untargeted metabolite profiling was performed using a digitally enhanced HPTLC method in which the chromatographic fingerprints were extracted using ImageJ software and then analysed by multivariate Principal Component Analysis (PCA) for pattern recognition. A targeted approach was applied using a new, validated, simple, and fast HPTLC image analysis method for simultaneous quantification of the officially recognized markers 6-, 8-, and 10-gingerol and 6-shogaol, in conjunction with chemometric Hierarchical Clustering Analysis (HCA). The results of both targeted and untargeted metabolite profiling revealed that the peeling, drying, and storage employed during processing have a great influence on the ginger chemo-profile; the different forms of processed ginger should not be used interchangeably. Moreover, it is deemed necessary to consider the holistic metabolic profile for a comprehensive evaluation of ginger during processing. Copyright © 2018. Published by Elsevier B.V.
Kolstad, Henrik A; Sønderskov, Jette; Burstyn, Igor
2005-03-01
In epidemiological research, self-reported information about determinants and levels of occupational exposures is difficult to obtain, especially if the disease under study has a high mortality rate or follow-up has exceeded several years. In this paper, we present a semi-quantitative exposure assessment strategy for nested case-control studies of styrene exposure among workers of the Danish reinforced plastics industry when no information on job title, task or other indicators of individual exposure were readily available from cases and controls. The strategy takes advantage of the variability in styrene exposure level and styrene exposure probability across companies. The study comprised 1522 cases of selected malignancies and neurodegenerative diseases and controls employed in 230 reinforced plastics companies and other related industries. Between 1960 and 1996, 3057 measurements of styrene exposure level obtained from 191 companies, were identified. Mixed effects models were used to estimate expected styrene exposure levels by production characteristics for all companies. Styrene exposure probability within each company was estimated for all but three cases and controls from the fraction of laminators, which was reported by a sample of 945 living colleagues of the cases and controls and by employers and dealers of plastic raw materials. The estimates were validated from a subset of 427 living cases and controls that reported their own work as laminators in the industry. We computed styrene exposure scores that integrated estimated styrene exposure level and styrene exposure probability. Product (boats), process (hand and spray lamination) and calendar year period were the major determinants of styrene exposure level. Within-company styrene exposure variability increased by calendar year and was accounted for when computing the styrene exposure scores. 
Exposure probability estimates based on colleagues' reports showed the highest predictive values in the validation test, which also indicated that up to 67% of the workers were correctly classified into a styrene-exposed job. Styrene exposure scores declined about 10-fold from the 1960s to the 1990s. This exposure assessment approach may be justified in other industries, especially industries dominated by small companies with simple exposure conditions.
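As a sketch of how an integrated exposure score of this kind can be computed, the snippet below duration-weights (exposure level × exposure probability) over a worker's employment spells. The duration-weighting scheme and all numbers are assumptions for illustration, not values taken from the study:

```python
def exposure_score(employments) -> float:
    """Integrate exposure over employment spells.

    employments: iterable of (years, level_ppm, probability) tuples,
    where level_ppm is the company's estimated styrene exposure level
    and probability is the estimated fraction of exposed work (0..1).
    """
    return sum(years * level * prob for years, level, prob in employments)

# Invented example: 5 years at a high-exposure company, then 3 years
# at a lower-exposure one.
worker = [(5.0, 40.0, 0.8), (3.0, 15.0, 0.3)]
print(exposure_score(worker))  # 5*40*0.8 + 3*15*0.3 = 173.5
```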
NASA Astrophysics Data System (ADS)
Jiang, Hui; Liu, Guohai; Mei, Congli; Yu, Shuang; Xiao, Xiahong; Ding, Yuhan
2012-11-01
The feasibility of rapid determination of process variables (i.e. pH and moisture content) in solid-state fermentation (SSF) of wheat straw using Fourier transform near infrared (FT-NIR) spectroscopy was studied. The synergy interval partial least squares (siPLS) algorithm was implemented to calibrate the regression model. The number of PLS factors and the number of subintervals were optimized simultaneously by cross-validation. The performance of the prediction model was evaluated by the root mean square error of cross-validation (RMSECV), the root mean square error of prediction (RMSEP), and the correlation coefficient (R). The measurement results of the optimal model were as follows: RMSECV = 0.0776, Rc = 0.9777, RMSEP = 0.0963, and Rp = 0.9686 for the pH model; RMSECV = 1.3544% w/w, Rc = 0.8871, RMSEP = 1.4946% w/w, and Rp = 0.8684 for the moisture content model. Finally, compared with classic PLS and iPLS models, the siPLS model revealed superior performance. The overall results demonstrate that FT-NIR spectroscopy combined with the siPLS algorithm can be used to measure process variables in solid-state fermentation of wheat straw, and that the NIR spectroscopy technique has potential for use in the SSF industry.
Free Trade, A New National Security Policy for the 21st Century
1990-03-30
view had some validity prior to the industrial revolution as countries were basically self-sufficient. However, with the growth and spread of the...eliminated complete self-sufficiency. As the Industrial Revolution expanded, communities and then regions within nations became interdependent and...prosperous national economies emerged. A significant by-product of the industrial revolution was the development and massive production of weapons that
Efficiency analysis of wood processing industry in China during 2006-2015
NASA Astrophysics Data System (ADS)
Zhang, Kun; Yuan, Baolong; Li, Yanxuan
2018-03-01
The wood processing industry is an important industry affecting the national economy and social development. Data envelopment analysis (DEA) is a quantitative evaluation method for studying industrial efficiency. In this paper, the wood processing industry of 8 provinces in southern China is taken as the study object; the efficiency of each province from 2006 to 2015 was measured with the DEA method, and efficiency changes, technological changes, and the Malmquist index were analyzed dynamically. The empirical results show that there is a widening gap in the efficiency of the wood processing industry across the 8 provinces, and that technological progress has lagged in promoting the wood processing industry. Based on these conclusions, and in view of the state of wood processing industry development at home and abroad, the government should introduce policies to strengthen the technology innovation policy system and the coordinated development system of the wood processing industry.
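The DEA efficiency score referred to above is the solution of a small linear program per decision-making unit. A minimal sketch of the input-oriented CCR model follows, using SciPy; the provinces, inputs, and outputs here are invented illustration data, not figures from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k.

    X: (m inputs, n units), Y: (s outputs, n units).
    Solves: min theta s.t. sum_j lam_j*x_j <= theta*x_k,
                           sum_j lam_j*y_j >= y_k, lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # decision vars: [theta, lam]
    A_in = np.hstack([-X[:, [k]], X])            # inputs: lam*x - theta*x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # outputs: -lam*y <= -y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

X = np.array([[10.0, 20.0, 30.0]])  # one input (e.g. capital), three units
Y = np.array([[10.0, 30.0, 30.0]])  # one output (e.g. gross output value)
print([round(ccr_efficiency(X, Y, k), 3) for k in range(3)])  # [0.667, 1.0, 0.667]
```

A Malmquist index would compare such scores across adjacent years, separating the efficiency-change and frontier-shift (technological change) components.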
Evaluation of an Innovative Approach to Validation of ...
UV disinfection is an effective process for inactivating many microbial pathogens found in source waters with the potential as stand-alone treatment or in combination with other disinfectants. For surface and groundwater sourced drinking water applications, the U.S. Environmental Protection Agency (USEPA) provided guidance on the validation of UV reactors nearly a decade ago. The focus of the guidance was primarily for inactivation of Cryptosporidium and Giardia. Over the last ten years many lessons have been learned, validation practices have been modified, new science issues discovered, and changes in operation & monitoring of UV systems need to be addressed. Also, there remains no standard approach for validating UV reactors to meet a 4-log (99.99%) inactivation of viruses. USEPA in partnership with the Cadmus Group, Carollo Engineers, and other State & Industry collaborators, are evaluating new approaches for validating UV reactors to meet groundwater & surface water pathogen inactivation including viruses for low-pressure and medium-pressure UV systems. A particular challenge for medium-pressure UV is the monitoring of low-wavelength germicidal contributions for appropriate crediting of disinfection under varying reactor conditions of quartz sleeve fouling, lamp aging, and changes in UV absorbance of the water over time. In the current effort, bench and full-scale studies are being conducted on a low pressure (LP) UV reactor and a medium pressure (MP) UV re
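For reference, the "4-log (99.99%)" crediting mentioned above is a straightforward base-10 conversion between log reduction and percent inactivation, sketched here:

```python
import math

def log_inactivation(n0: float, n: float) -> float:
    """Log10 reduction from initial count n0 to surviving count n."""
    return math.log10(n0 / n)

def percent_inactivated(logs: float) -> float:
    """Percent of organisms inactivated for a given log reduction."""
    return 100.0 * (1.0 - 10.0 ** (-logs))

print(percent_inactivated(4.0))   # 99.99
print(log_inactivation(1e4, 1))   # 4.0
```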
Diagnostic for Plasma Enhanced Chemical Vapor Deposition and Etch Systems
NASA Technical Reports Server (NTRS)
Cappelli, Mark A.
1999-01-01
In order to meet NASA's requirements for the rapid development and validation of future generation electronic devices as well as associated materials and processes, enabling technologies in the processing of semiconductor materials arising from understanding etch chemistries are being developed through a research collaboration between Stanford University and NASA-Ames Research Center. Although a great deal of laboratory-scale research has been performed on many materials processing plasmas, little is known about the gas-phase and surface chemical reactions that are critical in many etch and deposition processes, and how these reactions are influenced by variations in operating conditions. In addition, many plasma-based processes suffer from stability and reliability problems, leading to compromised performance and potentially increased cost for the semiconductor manufacturing industry. This lack of understanding has hindered the development of process models that can aid in the scaling and improvement of plasma etch and deposition systems. The research described involves the study of plasmas used in semiconductor processes. An inductively coupled plasma (ICP) source in place of the standard upper electrode assembly of the Gaseous Electronics Conference (GEC) radio-frequency (RF) Reference Cell is used to investigate the discharge characteristics and chemistries. This ICP source generates plasmas with higher electron densities (approximately 10¹² cm⁻³) and lower operating pressures (approximately 7 mTorr) than obtainable with the original parallel-plate version of the GEC Cell. This expanded operating regime is more relevant to new generations of industrial plasma systems being used by the microelectronics industry. The motivation for this study is to develop an understanding of the physical phenomena involved in plasma processing and to measure much-needed fundamental parameters, such as gas-phase and surface reaction rates, species concentration, temperature, ion energy distribution, and electron number density. A wide variety of diagnostic techniques are under development through this consortium grant to measure these parameters, including molecular beam mass spectrometry (MBMS), Fourier transform infrared (FTIR) spectroscopy, broadband ultraviolet (UV) absorption spectroscopy, and a compensated Langmuir probe. Additional diagnostics, such as microwave interferometry and microwave absorption for measurements of plasma density and radical concentrations, are also planned.
NASA Astrophysics Data System (ADS)
Wells, M. A.; Samarasekera, I. V.; Brimacombe, J. K.; Hawbolt, E. B.; Lloyd, D. J.
1998-06-01
A comprehensive mathematical model of the hot tandem rolling process for aluminum alloys has been developed. Reflecting the complex thermomechanical and microstructural changes effected in the alloys during rolling, the model incorporated heat flow, plastic deformation, kinetics of static recrystallization, final recrystallized grain size, and texture evolution. The results of this microstructural engineering study, combining computer modeling, laboratory tests, and industrial measurements, are presented in three parts. In this Part I, laboratory measurements of static recrystallization kinetics and final recrystallized grain size are described for AA5182 and AA5052 aluminum alloys and expressed quantitatively by semiempirical equations. In Part II, laboratory measurements of the texture evolution during static recrystallization are described for each of the alloys and expressed mathematically using a modified form of the Avrami equation. Finally, Part III of this article describes the development of an overall mathematical model for an industrial aluminum hot tandem rolling process which incorporates the microstructure and texture equations developed, and the model validation using industrial data. The laboratory measurements for the microstructural evolution were carried out using industrially rolled material and a state-of-the-art plane strain compression tester at Alcan International. Each sample was given a single deformation and heat treated in a salt bath at 400 °C for various lengths of time to effect different levels of recrystallization in the samples. The range of hot-working conditions used for the laboratory study was chosen to represent conditions typically seen in industrial aluminum hot tandem rolling processes, i.e., deformation temperatures of 350 °C to 500 °C, strain rates of 0.5 to 100 s⁻¹, and total strains of 0.5 to 2.0.
The semiempirical equations developed indicated that both the recrystallization kinetics and the final recrystallized grain size were dependent on the deformation history of the material, i.e., the total strain and the Zener-Hollomon parameter (Z), where $Z = \dot{\varepsilon}\exp\left(Q_{def}/(RT_{def})\right)$, and on the time at the recrystallization temperature.
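The Zener-Hollomon parameter in the equation above is easy to evaluate numerically. In the sketch below, the activation energy is a typical figure for Al-Mg alloys chosen for illustration, not a value taken from this article:

```python
import math

R = 8.314  # J/(mol K), universal gas constant

def zener_hollomon(strain_rate: float, q_def: float, temp_k: float) -> float:
    """Z = strain_rate * exp(Q_def / (R * T_def)), temperatures in kelvin."""
    return strain_rate * math.exp(q_def / (R * temp_k))

# Illustrative only: Q_def ~ 156 kJ/mol, deformation at 450 C and 10 s^-1.
print(f"Z = {zener_hollomon(10.0, 156e3, 723.15):.3e} s^-1")
```

Higher Z (colder or faster deformation) corresponds to a larger stored driving force, which in these semiempirical equations accelerates recrystallization and refines the final grain size.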
Lofgren, Don J; Reeb-Whitaker, Carolyn K; Adams, Darrin
2010-07-01
Chemical substance exposure data from the Washington State Occupational Safety and Health Administration (OSHA) program were reviewed to determine if inspections conducted as a result of a report of a hazard from a complainant or referent may alert the agency to uncharacterized or emerging health hazards. Exposure and other electronically stored data from 6890 health inspection reports conducted between April 2003 and August 2008 were extracted from agency records. A total of 515 (7%) inspections with one or more personal airborne chemical substance samples were identified for further study. Inspections by report of a hazard and by targeting were compared for the following: number of inspections, number and percentage of inspections with workers exposed to substances above an agency's permissible exposure limit, types of industries inspected, and number and type of chemical substances assessed. Report of a hazard inspections documented work sites with worker overexposure at the same rate as agency targeted inspections (approximately 35% of the time), suggesting that complainants and referents are a credible pool of observers capable of directing the agency to airborne chemical substance hazards. Report of a hazard inspections were associated with significantly broader distribution of industries as well as a greater variety of chemical substance exposures than were targeted inspections. Narrative text that described business type and processes inspected was more useful than NAICS codes alone and critical in identifying processes and industries that may be associated with new hazards. Finally, previously identified emerging hazards were found among the report of a hazard data. These findings indicate that surveillance of OSHA inspection data can be a valid tool to identify uncharacterized and emerging health hazards. Additional research is needed to develop criteria for objective review and prioritization of the data for intervention. 
Federal OSHA and other state OSHA agencies will need to add electronic data entry fields more descriptive of industry, process, and substance to fully use agency exposure data for hazard surveillance.
ERIC Educational Resources Information Center
Seboka, B.; Deressa, A.
2000-01-01
Indigenous social networks of Ethiopian farmers participate in seed exchange based on mutual interdependence and trust. A government-imposed extension program must validate the role of local seed systems in developing a national seed industry. (SK)
A fuzzy-logic-based controller for methane production in anaerobic fixed-film reactors.
Robles, A; Latrille, E; Ruano, M V; Steyer, J P
2017-01-01
The main objective of this work was to develop a controller for biogas production in continuous anaerobic fixed-bed reactors, using the effluent total volatile fatty acid (VFA) concentration as a control input in order to prevent process acidification in closed loop. To this end, a fuzzy-logic-based control system was developed, tuned, and validated in a pilot-scale anaerobic fixed-bed reactor treating industrial winery wastewater. The proposed controller varied the flow rate of wastewater entering the system as a function of the gaseous outflow rate of methane and the VFA concentration. Simulation results show that the proposed controller is capable of achieving great process stability even when operating at high VFA concentrations. Pilot results showed the potential of this control approach to keep the process working properly under conditions similar to those expected at full-scale plants.
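A fuzzy controller of the kind described can be sketched in a few lines of plain Python. The membership breakpoints and rule consequents below are invented for illustration and are not the tuned values from the pilot study:

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def feed_adjustment(vfa_g_l: float, ch4_norm: float) -> float:
    """Relative change in influent flow rate (-1 .. +1).

    vfa_g_l: effluent VFA concentration (g/L); ch4_norm: methane outflow
    normalized to its expected value. All breakpoints are illustrative.
    """
    vfa_low = tri(vfa_g_l, -1.0, 0.0, 2.0)
    vfa_high = tri(vfa_g_l, 1.0, 3.0, 5.0)
    ch4_low = tri(ch4_norm, -0.5, 0.0, 0.7)
    ch4_high = tri(ch4_norm, 0.3, 1.0, 1.5)

    # Rule base: high VFA -> cut feed (avoid acidification);
    # low VFA with high methane -> increase feed; low VFA, low methane -> hold.
    rules = [
        (vfa_high, -1.0),
        (min(vfa_low, ch4_high), +0.5),
        (min(vfa_low, ch4_low), 0.0),
    ]
    # Weighted-average (centroid-like) defuzzification.
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0
```

For example, `feed_adjustment(4.0, 0.5)` returns -1.0 (acidification risk, feed cut back), while `feed_adjustment(0.5, 0.9)` returns a positive adjustment (stable operation, feed increased).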
The new millennium program: Fast-track procurements
NASA Astrophysics Data System (ADS)
Metzger, Robert M.
1996-11-01
The National Aeronautics and Space Administration's (NASA's) New Millennium Program (NMP) has embarked on a technology flight-validation demonstration program to enable the kinds of missions that NASA envisions for the 21st century. Embedded in this program is the concept of rapid mission development supported by a fast-track procurement process. This process begins with the decision to initiate a procurement very early in the program along with the formation of a technical acquisition team. A close working relationship among the team members is essential to avoiding delays and developing a clear acquisition plan. The request for proposal (RFP) that is subsequently issued seeks a company with proven capabilities, so that the time allotted for responses from proposers and the length of proposals they submit can be shortened. The fast-track procurement process has been demonstrated during selection of NMP's industrial partners and has been proven to work.
Nowak, Sascha; Winter, Martin
2017-03-06
Quantitative electrolyte extraction from lithium ion batteries (LIB) is of great interest for recycling processes. Following the generally applicable EU legal guidelines for the recycling of batteries, 50 wt % of a LIB cell has to be recovered, which cannot be achieved without the electrolyte; hence, the electrolyte represents a target component for the recycling of LIBs. Additionally, fluoride or fluorinated compounds, inevitably present in LIB electrolytes, can hamper or even damage industrial recycling processes and have to be removed from the solid LIB parts as well. Finally, extraction is a necessary tool for LIB electrolyte aging analysis and for post-mortem investigations in general, because a qualitative overview can already be achieved after a few minutes of extraction, even for well-aged, apparently "dry" LIB cells in which the electrolyte has deeply penetrated or even gelled within the solid battery materials.
Comprehensive pulsed electric field (PEF) system analysis for microalgae processing.
Buchmann, Leandro; Bloch, Robin; Mathys, Alexander
2018-06-07
Pulsed electric field (PEF) is an emerging nonthermal technique with promising applications in microalgae biorefinery concepts. In this work, the flow field in continuous PEF processing and its influencing factors were analyzed, and energy input distributions in PEF treatment chambers were investigated. The results were obtained using an interdisciplinary approach that combined multiphysics simulations with ultrasonic Doppler velocity profiling (UVP) and rheological measurements of Arthrospira platensis suspensions as a case study for applications in the biobased industry. UVP enabled non-invasive validation of the multiphysics simulations. A. platensis suspensions follow a non-Newtonian, shear-thinning behavior, and measurement data could be fitted with rheological functions, which were used as an input for fluid dynamics simulations. Within the present work, a comprehensive system characterization was achieved that will facilitate research in the field of PEF processing. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
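The shear-thinning behavior noted above is commonly described by a power-law (Ostwald-de Waele) model. The abstract does not specify which rheological functions were fitted, so the sketch below uses a power law with synthetic data as an illustration of the fitting step:

```python
import numpy as np

# Synthetic flow-curve data generated from tau = K * gamma_dot**n
# with K = 0.8 Pa s^n and n = 0.4 (n < 1 means shear-thinning).
shear_rate = np.array([1.0, 10.0, 100.0, 1000.0])   # 1/s
shear_stress = 0.8 * shear_rate ** 0.4              # Pa

# Power law is linear in log-log space:
# log tau = log K + n * log gamma_dot
n, logK = np.polyfit(np.log(shear_rate), np.log(shear_stress), 1)
K = float(np.exp(logK))
print(f"K={K:.3f} Pa s^n, n={n:.3f}  (n < 1 => shear-thinning)")
```

The fitted (K, n) pair can then parameterize the viscosity function of a non-Newtonian fluid model in a CFD/multiphysics simulation of the treatment chamber.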
Spectral studies related to dissociation of HBr, HCl and BrO
NASA Technical Reports Server (NTRS)
Ginter, M. L.
1986-01-01
Concern over halogen-catalyzed decomposition of O3 in the upper atmosphere has generated a need for data on the atomic and molecular species X, HX and XO (where X is Cl or Br). Of special importance are Cl produced from freon decomposition and Cl and Br produced from natural processes and from other industrial and agricultural chemicals. Basic spectral data on HCl, HBr, and BrO are provided, as needed to detect specific states and energy levels, to enable detailed modeling of processes involving molecular dissociation, ionization, etc., and to help evaluate field experiments that check the validity of model calculations for these species in the upper atmosphere. Results contained in four published papers and two major spectral compilations are summarized, together with other results obtained.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, Joshua; Burnham, Laurie; Jones, Christian Birk
The U.S. DOE Regional Test Center for Solar Technologies program was established to validate photovoltaic (PV) technologies installed in a range of different climates. The program is funded by the Energy Department's SunShot Initiative. The initiative seeks to make solar energy cost-competitive with other forms of electricity by the end of the decade. Sandia National Laboratories currently manages four different sites across the country. The National Renewable Energy Laboratory manages a fifth site in Colorado. The entire PV portfolio currently includes 20 industry partners and almost 500 kW of installed systems. The program follows a defined process that outlines tasks, milestones, agreements, and deliverables. The process is broken out into four main parts: 1) planning and design, 2) installation, 3) operations, and 4) decommissioning. This operations manual defines the various elements of each part.
Localization of wood floor structure by infrared thermography
NASA Astrophysics Data System (ADS)
Cochior Plescanu, C.; Klein, M.; Ibarra-Castanedo, C.; Bendada, A.; Maldague, X.
2008-03-01
One of our industrial partners, Assek Technologie, is interested in developing a technique that would improve the drying process of wood floors in basements after flooding. In order to optimize the procedure, the floor structure and the extent of the damaged (wet) area must first be determined with minimum intrusion (minimum or no dismantling). The present study demonstrates the use of infrared thermography to reveal the structure of (flooded) wood floors. The procedure involves opening holes in the floor; injecting hot air through those holes reveals the framing structure even if the floor is covered by vinyl or ceramic tiles. This study indicates that thermal imaging can also be used as a tool to validate the decontamination process after drying. Thermal images were obtained on small-scale models and in a demonstration room.
A status of the Turbine Technology Team activities
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.
1992-01-01
The recent activities of the Turbine Technology Team of the Consortium for Computational Fluid Dynamics (CFD) Application in Propulsion Technology is presented. The team consists of members from the government, industry, and universities. The goal of this team is to demonstrate the benefits to the turbine design process attainable through the application of CFD. This goal is to be achieved by enhancing and validating turbine design tools for improved loading and flowfield definition and loss prediction, and transferring the advanced technology to the turbine design process. In order to demonstrate the advantages of using CFD early in the design phase, the Space Transportation Main Engine (STME) turbines for the National Launch System (NLS) were chosen on which to focus the team's efforts. The Turbine Team activities run parallel to the STME design work.
NASA Astrophysics Data System (ADS)
Nishino, Takayuki
The face hobbing process has been widely applied in the automotive industry, but so far few analytical tools have been developed for it, which makes it difficult to optimize gear design. To address this situation, this study aims at developing a computerized tool to predict running performances such as loaded tooth contact pattern and static transmission error. First, based upon a kinematical analysis of the cutting machine, a mathematical description of tooth surface generation is given. Second, based upon the theory of gearing and differential geometry, conjugate tooth surfaces are studied and contact lines are generated. Third, the load distribution along the contact lines is formulated. Last, the numerical model is validated by measuring loaded transmission error and loaded tooth contact pattern.
An optimization model for the agroindustrial sector in Antioquia (Colombia, South America)
NASA Astrophysics Data System (ADS)
Fernandez, J.
2015-06-01
This paper develops a general optimization model for the flower industry, defined using discrete simulation and nonlinear optimization; the mathematical models were solved using ProModel simulation tools and GAMS optimization. The paper defines the operations that constitute the production and marketing of the sector, using statistically validated data taken directly from each operation through field work, and formulates the discrete simulation model of the operations and the optimization model of the entire industry chain. The model is solved with the tools described above, and the results are validated in a case study.
Use of stable isotope signatures to determine mercury sources in the Great Lakes
Lepak, Ryan F.; Yin, Runsheng; Krabbenhoft, David P.; Ogorek, Jacob M.; DeWild, John F.; Holsen, Thomas M.; Hurley, James P.
2015-01-01
Sources of mercury (Hg) in Great Lakes sediments were assessed with stable Hg isotope ratios using multicollector inductively coupled plasma mass spectrometry. An isotopic mixing model based on mass-dependent (MDF) and mass-independent fractionation (MIF) (δ202Hg and Δ199Hg) identified three primary Hg sources for sediments: atmospheric, industrial, and watershed-derived. Results indicate atmospheric sources dominate in Lakes Huron, Superior, and Michigan sediments, while watershed-derived and industrial sources dominate in Lakes Erie and Ontario sediments. Anomalous Δ200Hg signatures, also apparent in sediments, provided independent validation of the model. Comparison of Δ200Hg signatures in predatory fish from three lakes reveals that bioaccumulated Hg is more isotopically similar to atmospherically derived Hg than to a lake's sediment. Previous research suggests that Δ200Hg is conserved during biogeochemical processing and that odd-mass MIF is conserved during metabolic processing, so it is suspected that even-mass MIF is similarly conserved. Given these assumptions, our data suggest that in some cases atmospherically derived Hg may be a more important source of MeHg to higher trophic levels than legacy sediments in the Great Lakes.
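A three-endmember mixing model of the kind described reduces to a 3×3 linear system: the source fractions must sum to one and reproduce the measured δ202Hg and Δ199Hg. A minimal sketch via Cramer's rule, with purely hypothetical endmember signatures (the study's actual values are not reproduced here):

```python
def det3(a):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
          - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
          + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

def mixing_fractions(d202, D199, ends):
    """Solve f1 + f2 + f3 = 1 plus the two isotope mass balances.

    `ends` maps source name -> (delta202Hg, Delta199Hg) endmember signature.
    """
    names = list(ends)
    A = [[1.0, 1.0, 1.0],
         [ends[n][0] for n in names],
         [ends[n][1] for n in names]]
    b = [1.0, d202, D199]
    d = det3(A)
    fracs = []
    for col in range(3):
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = b[r]          # replace one column with b
        fracs.append(det3(Ac) / d)     # Cramer's rule
    return dict(zip(names, fracs))

# Endmember signatures below are purely hypothetical placeholders.
ends = {"atmospheric": (-1.0, 0.3), "industrial": (-0.6, 0.0),
        "watershed": (-1.6, -0.1)}
f = mixing_fractions(-1.0, 0.13, ends)
```

With the measured sediment ratios substituted, `f` gives the fractional contribution of each source.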
Command Disaggregation Attack and Mitigation in Industrial Internet of Things
Zhu, Pei-Dong; Hu, Yi-Fan; Cui, Peng-Shuai; Zhang, Yan
2017-01-01
A cyber-physical attack in the industrial Internet of Things can cause severe damage to the physical system. In this paper, we focus on the command disaggregation attack, wherein attackers modify disaggregated commands by intruding on command aggregators such as programmable logic controllers, and then maliciously manipulate the physical process. It is necessary to investigate these attacks, analyze their impact on the physical process, and seek effective detection mechanisms. We depict two different types of command disaggregation attack modes: (1) the command sequence is disordered and (2) disaggregated sub-commands are allocated to the wrong actuators. We describe three attack models that implement these modes while going undetected by existing detection methods. A novel and effective framework is provided to detect command disaggregation attacks. The framework utilizes the correlations among two-tier command sequences, including commands from the output of the central controller and sub-commands from the input of actuators, to detect attacks before disruptions occur. We have designed the components of the framework and explain how to mine and use these correlations to detect attacks. We present two case studies to validate the different levels of impact of the various attack models and the effectiveness of the detection framework. Finally, we discuss how to enhance the detection framework. PMID:29065461
NASA Astrophysics Data System (ADS)
Cha, Kyung-Jin; Kim, Yang Sok
2018-01-01
Nowadays, information technology (IT) outsourcing companies face enduring demands to reduce cost while increasing productivity. This pressure leads many IT outsourcing companies to rely on outsourcing arrangements with IT personnel suppliers. In order to maximise efficiency, outsourcing companies have focused on fostering high-performing suppliers through improved collaboration and mutual relations. However, it is very difficult to advance to a long-term partnership using the existing outsourcing process because of insufficient collaboration between IT outsourcing companies and their suppliers. Based on the collaboration perspective of supply chain management (SCM), this study identifies the critical success factors for collaborative strategic partnerships and presents an evaluation framework for assessing and managing suppliers. We have developed an organisational process model for supplier relationship management (SRM)-based collaboration, which includes key constructs from previous studies and from interviews with people in the IT outsourcing industry. In this study, we identify four types of strategic suppliers and suggest approaches for improving the collaborative relationship between an IT outsourcing company and its partner companies. In addition, to validate the feasibility of the proposed model, we applied it to a well-known Korean IT outsourcing company, 'A'.
Aleixandre-Tudo, Jose Luis; Nieuwoudt, Helene; Aleixandre, Jose Luis; du Toit, Wessel
2018-01-01
The wine industry requires reliable methods for the quantification of phenolic compounds during the winemaking process. Infrared spectroscopy appears to be a suitable technique for process control and monitoring. The ability of Fourier transform near infrared (FT-NIR), attenuated total reflectance mid infrared (ATR-MIR) and Fourier transform infrared (FT-IR) spectroscopy to predict compositional phenolic levels during red wine fermentation and aging was investigated. Prediction models were validated using a large number of samples collected over two vintages from several industrial fermentation tanks, as well as wine samples covering a number of vintages. FT-NIR appeared to be the most accurate technique for predicting phenolic content. Although slightly less accurate models were observed, ATR-MIR and FT-IR can also be used for the prediction of the majority of phenolic measurements. Additionally, the slope and intercept test indicated a systematic error for all three techniques, which seems to be slightly more pronounced for HPLC-generated phenolic data than for the spectrophotometric parameters. However, the results also showed that the predictions made with the three instruments are statistically comparable. The robustness of the prediction models was also investigated and discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
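The slope and intercept test mentioned above amounts to regressing predicted values against reference values: a slope different from 1 indicates a proportional bias and a non-zero intercept a constant bias. A minimal sketch with illustrative data (not the study's measurements):

```python
def slope_intercept(reference, predicted):
    """Ordinary least-squares fit of predicted vs. reference values.

    A slope different from 1 or an intercept different from 0 points to a
    systematic (proportional or constant) prediction error.
    """
    n = len(reference)
    mx = sum(reference) / n
    my = sum(predicted) / n
    sxx = sum((x - mx) ** 2 for x in reference)
    sxy = sum((x - mx) * (y - my) for x, y in zip(reference, predicted))
    slope = sxy / sxx
    return slope, my - slope * mx

# Illustrative data with a deliberate proportional + constant bias
ref = [10.0, 20.0, 30.0, 40.0]
pred = [1.1 * x + 2.0 for x in ref]
slope, intercept = slope_intercept(ref, pred)
```

In practice the fitted slope and intercept would be compared with 1 and 0 using their standard errors before declaring a systematic error.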
Modelling the transport and decay processes of microbial tracers in a macro-tidal estuary.
Abu-Bakar, Amyrhul; Ahmadian, Reza; Falconer, Roger A
2017-10-15
The Loughor Estuary is a macro-tidal coastal basin located along the Bristol Channel, in the South West of the U.K. The maximum spring tidal range in the estuary is up to 7.5 m, near Burry Port Harbour. This estuarine region can experience severe coastal flooding during high spring tides, including extreme flooding of the intertidal saltmarshes at Llanrhidian, as well as the lower industrial and residential areas at Llanelli and Gowerton. The water quality of this estuarine basin needs to comply with the designated standards for safe recreational bathing and for the shellfish harvesting industries. However, the waterbody potentially receives an overload of bacterial inputs that enter the estuarine system from both point and diffuse sources. Therefore, a microbial tracer study was carried out to gain a better understanding of the faecal bacteria sources and to enable a hydro-environmental model to be refined and calibrated for both advection and dispersion transport. A two-dimensional hydro-environmental model was refined and extended to predict the highest water level covering the intertidal floodplains of the Loughor Estuary. The hydrodynamic model, validated for both water levels and currents, was then supplied with the injected mass of a microbial tracer, MS2 coliphage, released upstream of the estuary and modelled as a non-conservative tracer over several tidal cycles through the system. The calibration and validation of the transport and decay of the microbial tracer were undertaken by comparing the model results with the measured data at two different sampling locations. The refined model developed as part of this study was used to acquire a better understanding of the water quality processes and the potential sources of bacterial pollution in the estuary. Copyright © 2017 Elsevier Ltd. All rights reserved.
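Non-conservative tracers such as MS2 coliphage are commonly modelled with first-order decay, often parameterised by T90, the time for a 90% reduction in concentration. A minimal sketch with hypothetical values (the study's calibrated decay rates are not given here):

```python
import math

def tracer_concentration(c0, t, t90):
    """First-order (non-conservative) decay, parameterised by T90,
    the time required for a 90% reduction in concentration."""
    k = math.log(10.0) / t90       # decay rate, 1/time
    return c0 * math.exp(-k * t)

# Hypothetical values: initial count of 1000 pfu/ml, T90 of 24 hours
after_one_t90 = tracer_concentration(1000.0, 24.0, 24.0)
```

In the transport model this decay term is applied to each advected-dispersed parcel over every time step.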
48 CFR 970.1504-1-9 - Special considerations: Cost-plus-award-fee.
Code of Federal Regulations, 2012 CFR
2012-10-01
....e., nuclear energy processing, industrial environmental cleanup); (iii) Construction of facilities... industrial/DOE settings (i.e., nuclear energy, chemical or petroleum processing, industrial environmental... industrial/DOE settings (i.e., nuclear energy, chemical processing, industrial environmental cleanup); (ii...
48 CFR 970.1504-1-9 - Special considerations: Cost-plus-award-fee.
Code of Federal Regulations, 2014 CFR
2014-10-01
....e., nuclear energy processing, industrial environmental cleanup); (iii) Construction of facilities... industrial/DOE settings (i.e., nuclear energy, chemical or petroleum processing, industrial environmental... industrial/DOE settings (i.e., nuclear energy, chemical processing, industrial environmental cleanup); (ii...
48 CFR 970.1504-1-9 - Special considerations: Cost-plus-award-fee.
Code of Federal Regulations, 2013 CFR
2013-10-01
....e., nuclear energy processing, industrial environmental cleanup); (iii) Construction of facilities... industrial/DOE settings (i.e., nuclear energy, chemical or petroleum processing, industrial environmental... industrial/DOE settings (i.e., nuclear energy, chemical processing, industrial environmental cleanup); (ii...
48 CFR 970.1504-1-9 - Special considerations: Cost-plus-award-fee.
Code of Federal Regulations, 2010 CFR
2010-10-01
....e., nuclear energy processing, industrial environmental cleanup); (iii) Construction of facilities... industrial/DOE settings (i.e., nuclear energy, chemical or petroleum processing, industrial environmental... industrial/DOE settings (i.e., nuclear energy, chemical processing, industrial environmental cleanup); (ii...
48 CFR 970.1504-1-9 - Special considerations: Cost-plus-award-fee.
Code of Federal Regulations, 2011 CFR
2011-10-01
....e., nuclear energy processing, industrial environmental cleanup); (iii) Construction of facilities... industrial/DOE settings (i.e., nuclear energy, chemical or petroleum processing, industrial environmental... industrial/DOE settings (i.e., nuclear energy, chemical processing, industrial environmental cleanup); (ii...
McCambridge, Jim; Mialon, Melissa
2018-06-13
Alcohol companies have recently invested large sums of money in answering research questions to which they have clear vested interests in the outcomes. There have been extensive concerns about corporate influence on public health sciences, following the experience with the tobacco industry. This systematic review aims to investigate the perspectives of researchers on the activities of alcohol industry actors in relation to science, in order to guide future research. All data published in peer-reviewed journals (including commentaries, opinion pieces, editorials and letters as well as research reports) were eligible for inclusion. This analysis focuses on the manifest rather than latent content of the articulated views, and accordingly adopts a thematic analysis using an inductive approach to the generation of themes. There are serious concerns identified in three main areas, principally defined by where the impacts of industry scientific activities occur; on evidence informed policy making (instrumental uses of research by industry actors), on the content of the scientific evidence base itself (industry funding as a source of bias); and on the processes of undertaking research (transgressions of basic scientific norms). There are also opposing views which provide a useful critique. The evidence-base on the validity of all concerns has been slow to develop. The concerns are extensive, longstanding and unresolved and high quality investigations are needed. This study informs the detailed content of the research needed to address the concerns identified here. © 2018 The Authors Drug and Alcohol Review published by John Wiley & Sons Australia, Ltd on behalf of Australasian Professional Society on Alcohol and other Drugs.
Varzakas, Theodoros H
2011-09-01
The Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of pastry processing. A tentative approach to FMEA application in the pastry industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (a pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical control points were identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work a comparison of the ISO 22000 analysis with HACCP is carried out for pastry processing and packaging. However, the main emphasis was placed on the quantification of risk assessment by determining the Risk Priority Number (RPN) for each identified processing hazard. Storage of raw materials, storage of final products at -18°C, and freezing were identified as the processes with the highest RPN values (225, 225, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect or tree) diagram led to converging results, thus corroborating the validity of conclusions derived from the risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a pastry processing industry is considered imperative.
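The Risk Priority Number used above is conventionally the product of severity, occurrence and detection ratings. A minimal sketch reproducing the reported values of 225 and 144 against the 130 action limit (the individual ratings shown are hypothetical splits, not the study's actual scores):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of the three FMEA ratings."""
    return severity * occurrence * detection

ACTION_LIMIT = 130  # upper acceptable RPN limit quoted in the abstract

def needs_corrective_action(value, limit=ACTION_LIMIT):
    return value > limit

# Hypothetical rating splits that reproduce the reported RPN values
raw_storage = rpn(9, 5, 5)      # 225
frozen_storage = rpn(9, 5, 5)   # 225
freezing = rpn(6, 6, 4)         # 144
```

After corrective actions, the ratings are re-scored and the RPN recomputed until every hazard falls below the limit.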
Organizational capacity for change in health care: Development and validation of a scale.
Spaulding, Aaron; Kash, Bita A; Johnson, Christopher E; Gamm, Larry
We do not have a strong understanding of a health care organization's capacity for attempting and completing multiple and sometimes competing change initiatives. Capacity for change implementation is a critical success factor as the health care industry is faced with ongoing demands for change and transformation because of technological advances, market forces, and regulatory environment. The aim of this study was to develop and validate a tool to measure health care organizations' capacity to change by building upon previous conceptualizations of absorptive capacity and organizational readiness for change. A multistep process was used to develop the organizational capacity for change survey. The survey was sent to two populations requesting answers to questions about the organization's leadership, culture, and technologies in use throughout the organization. Exploratory and confirmatory factor analyses were conducted to validate the survey as a measurement tool for organizational capacity for change in the health care setting. The resulting organizational capacity for change measurement tool proves to be a valid and reliable method of evaluating a hospital's capacity for change through the measurement of the population's perceptions related to leadership, culture, and organizational technologies. The organizational capacity for change measurement tool can help health care managers and leaders evaluate the capacity of employees, departments, and teams for change before large-scale implementation.
Validation of a program for supercritical power plant calculations
NASA Astrophysics Data System (ADS)
Kotowicz, Janusz; Łukowicz, Henryk; Bartela, Łukasz; Michalski, Sebastian
2011-12-01
This article describes the validation of a supercritical steam cycle model. The cycle model was created with the commercial program GateCycle and validated using the in-house code of the Institute of Power Engineering and Turbomachinery, which has been used extensively for industrial power plant calculations with good results. In the first step of the validation process, assumptions were made about the live steam temperature and pressure, net power, characteristic quantities for the high- and low-pressure regenerative heat exchangers, and pressure losses in the heat exchangers. These assumptions were then used to develop a steam cycle model in GateCycle and a model based on the in-house code. Properties such as the thermodynamic parameters at characteristic points of the steam cycle, net power values and efficiencies, and the heat provided to and taken from the steam cycle were compared. The last step of the analysis was the calculation of the relative errors of the compared values; the method used for these calculations is presented in the paper. The resulting relative errors are very small, generally not exceeding 0.1%. Based on our analysis, it can be concluded that using the GateCycle software for calculations of supercritical power plants is possible.
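The relative-error comparison described above can be sketched as follows; the net-power figures used are hypothetical placeholders, not values from the article:

```python
def relative_error_percent(reference, computed):
    """Relative error of `computed` against `reference`, in percent."""
    return abs(computed - reference) / abs(reference) * 100.0

# Hypothetical pair of net-power results from the two codes
in_house = 600.00    # MW, in-house code (illustrative value)
gate_cycle = 600.30  # MW, GateCycle model (illustrative value)
err = relative_error_percent(in_house, gate_cycle)
```

The same comparison is repeated for every characteristic quantity of the cycle, and the model is accepted when all errors stay below the chosen threshold.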
Real-time sensor validation and fusion for distributed autonomous sensors
NASA Astrophysics Data System (ADS)
Yuan, Xiaojing; Li, Xiangshang; Buckles, Bill P.
2004-04-01
Multi-sensor data fusion has found widespread applications in industrial and research sectors. The purpose of real-time multi-sensor data fusion is to dynamically estimate an improved system model from a set of different data sources, i.e., sensors. This paper presents a systematic and unified real-time sensor validation and fusion framework (RTSVFF) based on distributed autonomous sensors. The RTSVFF is an open architecture consisting of four layers: the transaction layer, the process fusion layer, the control layer, and the planning layer. This paradigm facilitates the distribution of intelligence to the sensor level and the sharing of information among sensors, controllers, and other devices in the system. The openness of the architecture also provides a platform for testing different sensor validation and fusion algorithms, and thus facilitates the selection of near-optimal algorithms for a specific sensor fusion application. In the version of the model presented in this paper, confidence-weighted averaging is employed to handle the dynamic system state: the state is computed using an adaptive estimator and a dynamic validation curve for numeric data fusion, and a robust diagnostic map for decision-level qualitative fusion. The framework is then applied to the automatic monitoring of a gas-turbine engine, including a performance comparison of the proposed real-time sensor fusion algorithms and a traditional numerical weighted average.
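Confidence-weighted averaging, as used in the RTSVFF for numeric fusion, can be sketched minimally as below; the readings and confidence values are illustrative, and the adaptive estimator and validation curve that would actually produce the confidences are omitted:

```python
def fuse(readings, confidences):
    """Confidence-weighted average of redundant sensor readings.

    Each reading is weighted by the confidence assigned to it by the
    validation stage; invalid sensors get a confidence near zero.
    """
    total = sum(confidences)
    if total == 0:
        raise ValueError("all sensors rejected by validation")
    return sum(r * c for r, c in zip(readings, confidences)) / total

# Three redundant readings; the outlier carries a low validation confidence
estimate = fuse([10.0, 12.0, 20.0], [0.5, 0.4, 0.1])
```

Because the outlier is down-weighted rather than discarded, the fused estimate degrades gracefully as sensors drift or fail.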
Simulation of a complete X-ray digital radiographic system for industrial applications.
Nazemi, E; Rokrok, B; Movafeghi, A; Choopan Dastjerdi, M H
2018-05-19
Simulating X-ray images is of great importance in industry and medicine. Such simulation permits the optimization of parameters that affect image quality without the limitations of an experimental procedure. This study presents a novel methodology to simulate a complete industrial X-ray digital radiographic system, composed of an X-ray tube and a computed radiography (CR) image plate, using the Monte Carlo N-Particle eXtended (MCNPX) code. An industrial X-ray tube with a maximum voltage of 300 kV and a current of 5 mA was simulated. A three-layer uniform plate comprising a polymer overcoat layer, a phosphor layer and a polycarbonate backing layer was also defined and simulated as the CR imaging plate. To model image formation in the image plate, the absorbed dose was first calculated in each pixel inside the phosphor layer of the CR imaging plate using the mesh tally in the MCNPX code and then converted to a gray value using a mathematical relationship determined in a separate procedure. To validate the simulation results, an experimental setup was designed, and the images of two step wedges made of aluminum and steel were captured experimentally and compared with the simulations. The results show that the simulated images are in good agreement with the experimental ones, demonstrating the ability of the proposed methodology to simulate an industrial X-ray imaging system. Copyright © 2018 Elsevier Ltd. All rights reserved.
Influence of agricultural activities, forest fires and agro-industries on air quality in Thailand.
Phairuang, Worradorn; Hata, Mitsuhiko; Furuuchi, Masami
2017-02-01
Annual and monthly emission inventories in northern, central and north-eastern provinces of Thailand, where agriculture and related agro-industries are very intensive, were estimated to evaluate the contribution of agricultural activity, including crop residue burning, forest fires and related agro-industries, to the air quality monitored in the corresponding provinces. The monthly emission inventories of air pollutants, i.e., particulate matter (PM), NOx and SO2, for various agricultural crops were estimated based on the production levels of typical crops (rice, corn, sugarcane, cassava, soybeans and potatoes) using emission factors and other country-specific parameters that take into account crop type and the local residue burning period. The estimated monthly emission inventory was compared with air monitoring data obtained at stations operated by the Pollution Control Department of Thailand (PCD) to validate the estimated emission inventory. The agro-industry with the greatest impact on the regions being evaluated is the sugar processing industry, which uses sugarcane as a raw material and its residue as fuel for the boiler. A backward trajectory analysis of the air mass arriving at the PCD stations was calculated to confirm this influence. For the provinces evaluated in upper northern, lower northern and north-eastern Thailand, agricultural activities and forest fires were shown to be closely correlated with the ambient PM concentration, while their contribution to gaseous pollutants is much smaller. Copyright © 2016. Published by Elsevier B.V.
Development of code evaluation criteria for assessing predictive capability and performance
NASA Technical Reports Server (NTRS)
Lin, Shyi-Jang; Barson, S. L.; Sindir, M. M.; Prueger, G. H.
1993-01-01
Computational Fluid Dynamics (CFD), because of its unique ability to predict complex three-dimensional flows, is being applied with increasing frequency in the aerospace industry. Currently, no consistent code validation procedure is applied within the industry. Such a procedure is needed to increase confidence in CFD and reduce risk in the use of these codes as a design and analysis tool. This final contract report defines classifications for three levels of code validation, directly relating the use of CFD codes to the engineering design cycle. Evaluation criteria by which codes are measured and classified are recommended and discussed. Criteria for selecting experimental data against which CFD results can be compared are outlined. A four phase CFD code validation procedure is described in detail. Finally, the code validation procedure is demonstrated through application of the REACT CFD code to a series of cases culminating in a code to data comparison on the Space Shuttle Main Engine High Pressure Fuel Turbopump Impeller.
ERIC Educational Resources Information Center
McGaughy, Charis; Bryck, Rick; de Gonzalez, Alicia
2012-01-01
This study is a validity study of the recently revised version of the Health Science Standards. The purpose of this study is to understand how the Health Science Standards relate to college and career readiness, as represented by survey ratings submitted by entry-level college instructors of health science courses and industry representatives. For…
A Virtual Sensor for Online Fault Detection of Multitooth-Tools
Bustillo, Andres; Correa, Maritza; Reñones, Anibal
2011-01-01
The installation of suitable sensors close to the tool tip on milling centres is not possible in industrial environments. It is therefore necessary to design virtual sensors for these machines to perform online fault detection in many industrial tasks. This paper presents a virtual sensor for online fault detection of multitooth tools based on a Bayesian classifier. The device that performs this task applies mathematical models that function in conjunction with physical sensors. Only two experimental variables are collected from the milling centre that performs the machining operations: the electrical power consumption of the feed drive and the time required for machining each workpiece. The task of achieving reliable signals from a milling process is especially complex when multitooth tools are used, because each kind of cutting insert in the milling centre only works on each workpiece during a certain time window. Great effort has gone into designing a robust virtual sensor that can avoid re-calibration due to, e.g., maintenance operations. The virtual sensor developed as a result of this research is successfully validated under real conditions on a milling centre used for the mass production of automobile engine crankshafts. Recognition accuracy, calculated with a k-fold cross validation, averaged a true-positive rate of 0.957 and a true-negative rate of 0.986. Moreover, measured accuracy was 98%, which suggests that the virtual sensor correctly identifies new cases. PMID:22163766
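The reported figures are fold-averaged true-positive and true-negative rates from k-fold cross validation. A minimal sketch of the bookkeeping involved (the Bayesian classifier itself is omitted; the labels and predictions are illustrative):

```python
def k_fold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross validation."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i, test in enumerate(folds):
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

def rates(y_true, y_pred):
    """True-positive and true-negative rates (1 = fault, 0 = no fault)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative labels and predictions for one fold
tpr, tnr = rates([1, 1, 1, 0, 0], [1, 1, 0, 0, 1])
```

In the full procedure, `rates` is evaluated on each held-out fold and the per-fold values are averaged.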
NASA Astrophysics Data System (ADS)
Jonny, Zagloed, Teuku Yuri M.
2017-11-01
This paper aims to present an integrated health care model for the Indonesian health care industry. Based on previous research, there are two health care models in the industry: the disease-centered and patient-centered care models. Of the two, the patient-centered care model is widely applied due to its capability to reduce cost and improve quality simultaneously. However, there is still no comprehensive model resulting in cost reduction, quality improvement, patient satisfaction and hospital profitability simultaneously. Therefore, this research is intended to develop such a model. In doing so, first, a conceptual model using Kano's Model, Quality Function Deployment (QFD) and the Balanced Scorecard (BSC) is developed to generate several important elements of the model as required by stakeholders. Then, a case study of an Indonesian hospital is presented to evaluate the validity of the model using correlation analysis. As a result, it can be concluded that the model is validated, implying several managerial insights among its elements: 1) leadership (r=0.85) and context of the organization (r=0.77) improve operations; 2) planning (r=0.96), support process (r=0.87) and continual improvement (r=0.95) also improve operations; 3) operations improve customer satisfaction (r=0.89) and financial performance (r=0.93); and 4) customer satisfaction improves financial performance (r=0.98).
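The r values quoted in the abstract above come from a correlation analysis; the coefficient itself is straightforward to compute. A minimal sketch of the sample Pearson correlation, with invented questionnaire scores rather than the study's data:

```python
import math

def pearson_r(x, y):
    # sample Pearson correlation coefficient between two score series
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical questionnaire scores: "operations" vs "customer satisfaction"
ops = [3.1, 3.8, 4.0, 2.5, 4.4, 3.6]
sat = [3.0, 3.9, 4.2, 2.7, 4.3, 3.5]
print(round(pearson_r(ops, sat), 2))
```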
Geiger, M F; Astrin, J J; Borsch, T; Burkhardt, U; Grobe, P; Hand, R; Hausmann, A; Hohberg, K; Krogmann, L; Lutz, M; Monje, C; Misof, B; Morinière, J; Müller, K; Pietsch, S; Quandt, D; Rulik, B; Scholler, M; Traunspurger, W; Haszprunar, G; Wägele, W
2016-09-01
Biodiversity loss is mainly driven by human activity. While concern grows over the fate of hot spots of biodiversity, contemporary species losses still prevail in industrialized nations. Therefore, strategies were formulated to halt or reverse the loss, driven by evidence for its value for ecosystem services. Maintenance of the latter through conservation depends on correctly identified species. To this aim, the German Federal Ministry of Education and Research is funding the GBOL project, a consortium of natural history collections, botanic gardens, and universities working on a barcode reference database for the country's fauna and flora. Several noticeable findings could be useful for future campaigns: (i) validating taxon lists to serve as a taxonomic backbone is time-consuming, but without alternative; (ii) offering financial incentives to taxonomic experts, often citizen scientists, is indispensable; (iii) completion of the libraries for widespread species enables analyses of environmental samples, but the process may not hold pace with technological advancements; (iv) discoveries of new species are among the best stories for the media; (v) a commitment to common data standards and repositories is needed, as well as transboundary cooperation between nations; (vi) after validation, all data should be published online via the BOLD to make them searchable for external users and to allow cross-checking with data from other countries.
A Tailored Ontology Supporting Sensor Implementation for the Maintenance of Industrial Machines
Belkadi, Farouk; Bernard, Alain
2017-01-01
The longtime productivity of an industrial machine is improved by condition-based maintenance strategies. To do this, the integration of sensors and other cyber-physical devices is necessary in order to capture and analyze a machine’s condition through its lifespan. Thus, choosing the best sensor is a critical step to ensure the efficiency of the maintenance process. Indeed, considering the variety of sensors and their features and performance, a formal classification of a sensor’s domain knowledge is crucial. This classification facilitates the search for and reuse of solutions during the design of a new maintenance service. Following a Knowledge Management methodology, the paper proposes and develops a new sensor ontology that structures the domain knowledge, covering both theoretical and experimental sensor attributes. An industrial case study is conducted to validate the proposed ontology and to demonstrate its utility as a guideline to ease the search for suitable sensors. Based on the ontology, the final solution will be implemented in a shared repository connected to legacy CAD (computer-aided design) systems. The selection of the best sensor is, firstly, obtained by matching application requirements to the sensor specifications proposed by this sensor repository. Then, it is refined from the experimentation results. The achieved solution is recorded in the sensor repository for future reuse. As a result, the time and cost of designing new condition-based maintenance services are reduced. PMID:28885592
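The selection step described above, matching application requirements against sensor specifications held in a repository, can be illustrated with a toy filter. The catalogue entries, attribute names and thresholds below are all hypothetical, not taken from the paper's ontology:

```python
# toy catalogue of sensor records, loosely mimicking ontology attributes
catalogue = [
    {"name": "acc-01", "kind": "accelerometer", "range_hz": 10000, "ip_rating": 67},
    {"name": "acc-02", "kind": "accelerometer", "range_hz": 5000,  "ip_rating": 54},
    {"name": "tmp-01", "kind": "thermocouple",  "range_hz": 10,    "ip_rating": 67},
]

def match(catalogue, requirements):
    # keep sensors whose specifications meet every stated requirement
    hits = []
    for sensor in catalogue:
        if sensor["kind"] == requirements["kind"] \
           and all(sensor.get(k, 0) >= v for k, v in requirements["min"].items()):
            hits.append(sensor["name"])
    return hits

# hypothetical application requirements for a vibration-monitoring service
reqs = {"kind": "accelerometer", "min": {"range_hz": 8000, "ip_rating": 65}}
print(match(catalogue, reqs))  # prints ['acc-01']
```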
Sloothaak, J; Odoni, D I; de Graaff, L H; Martins Dos Santos, V A P; Schaap, P J; Tamayo-Ramos, J A
2015-01-01
The development of biological processes that replace the existing petrochemical-based industry is one of the biggest challenges in biotechnology. Aspergillus niger is one of the main industrial producers of lignocellulolytic enzymes, which are used in the conversion of lignocellulosic feedstocks into fermentable sugars. Both the hydrolytic enzymes responsible for lignocellulose depolymerisation and the molecular mechanisms controlling their expression have been well described, but little is known about the transport systems for sugar uptake in A. niger. Understanding the transportome of A. niger is essential to achieve further improvements at the strain and process design level. Therefore, this study aims to identify and classify A. niger sugar transporters, using newly developed tools for in silico and in vivo analysis of its membrane-associated proteome. In the present research work, a hidden Markov model (HMM) that shows a good performance in the identification and segmentation of functionally validated glucose transporters was constructed. The model (HMMgluT) was used to analyse the A. niger membrane-associated proteome response to high and low glucose concentrations at a low pH. By combining the abundance patterns of the proteins found in the A. niger plasmalemma proteome with their HMMgluT scores, two new putative high-affinity glucose transporters, denoted MstG and MstH, were identified. MstG and MstH were functionally validated and biochemically characterised by heterologous expression in an S. cerevisiae glucose transport null mutant. They were shown to be a high-affinity glucose transporter (Km = 0.5 ± 0.04 mM) and a very high-affinity glucose transporter (Km = 0.06 ± 0.005 mM), respectively. This study, focusing for the first time on the membrane-associated proteome of the industrially relevant organism A. niger, shows the global response of the transportome to the availability of different glucose concentrations. Analysis of the A. niger transportome with the newly developed HMMgluT proved to be an efficient approach for the identification and classification of new glucose transporters.
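The reported Km values can be put in context with the standard Michaelis-Menten relation, which shows why the very-high-affinity MstH dominates uptake at low glucose concentrations. A sketch using only the Km figures from the abstract; Vmax is normalised to 1 because the abstract gives no Vmax:

```python
def uptake_rate(s_mM, km_mM, vmax=1.0):
    # Michaelis-Menten uptake velocity v = Vmax * S / (Km + S)
    return vmax * s_mM / (km_mM + s_mM)

# Km values reported for the two transporters
KM_MSTG = 0.5    # mM, high-affinity
KM_MSTH = 0.06   # mM, very-high-affinity

for s in (0.01, 0.1, 1.0):
    g = uptake_rate(s, KM_MSTG)
    h = uptake_rate(s, KM_MSTH)
    print(f"S={s} mM  MstG={g:.2f}  MstH={h:.2f}")
```

At half-saturation (S = Km) the rate is exactly Vmax/2, so MstH reaches half its maximum rate at roughly one-eighth the glucose concentration MstG needs.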
Environmental effects of interstate power trading on electricity consumption mixes.
Marriott, Joe; Matthews, H Scott
2005-11-15
Although many studies of electricity generation use national or state average generation mix assumptions, in reality a great deal of electricity is transferred between states with very different mixes of fossil and renewable fuels, and using the average numbers could result in incorrect conclusions in these studies. We create electricity consumption profiles for each state and for key industry sectors in the U.S. based on existing state generation profiles, net state power imports, industry presence by state, and an optimization model to estimate interstate electricity trading. Using these "consumption mixes" can provide a more accurate assessment of electricity use in life-cycle analyses. We conclude that the published generation mixes for states that import power are misleading, since the power consumed in-state has a different makeup than the power that was generated. And, while most industry sectors have consumption mixes similar to the U.S. average, some of the most critical sectors of the economy--such as resource extraction and material processing sectors--are very different. This result does validate the average mix assumption made in many environmental assessments, but it is important to accurately quantify the generation methods for electricity used when doing life-cycle analyses.
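The core bookkeeping behind a consumption mix, blending in-state generation with imported power, is a MWh-weighted average. A toy sketch with invented numbers; the paper's actual model additionally uses an optimization step to estimate interstate flows, which is not reproduced here:

```python
def consumption_mix(own_gen, own_mix, imports):
    # blend in-state generation with imported power, weighted by energy (TWh)
    totals = {}
    total_energy = own_gen
    for fuel, share in own_mix.items():
        totals[fuel] = totals.get(fuel, 0.0) + own_gen * share
    for energy, mix in imports:
        total_energy += energy
        for fuel, share in mix.items():
            totals[fuel] = totals.get(fuel, 0.0) + energy * share
    return {fuel: amt / total_energy for fuel, amt in totals.items()}

# hypothetical state: 80 TWh generated in-state (mostly hydro),
# 20 TWh imported from a coal-heavy neighbour
mix = consumption_mix(
    80.0, {"hydro": 0.9, "gas": 0.1},
    imports=[(20.0, {"coal": 0.8, "gas": 0.2})],
)
print({k: round(v, 2) for k, v in mix.items()})  # {'hydro': 0.72, 'gas': 0.12, 'coal': 0.16}
```

The consumed power is noticeably dirtier than the in-state generation mix alone would suggest, which is exactly the distortion the paper warns about.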
NASA Astrophysics Data System (ADS)
Latré, S.; Desplentere, F.; De Pooter, S.; Seveno, D.
2017-10-01
Nanoscale materials showing superior thermal properties have raised the interest of the building industry. By adding these materials to conventional construction materials, it is possible to decrease the total thermal conductivity by almost one order of magnitude. This conductivity is mainly influenced by the dispersion quality within the matrix material. At the industrial scale, the main challenge is to control this dispersion to reduce or even eliminate thermal bridges, which makes it possible to reach an industrially relevant process that balances the high material cost against the superior thermal insulation properties. Therefore, a methodology is required to measure and describe these nanoscale distributions within the inorganic matrix material. These distributions are either random or normally distributed through the thickness of the matrix material. We show that the influence of these distributions is meaningful and modifies the thermal conductivity of the building material. Hence, this strategy will generate a thermal model that predicts the thermal behavior of the nanoscale particles and their distributions. This thermal model will be validated by the hot wire technique. For the moment, a good correlation is found between the numerical results and experimental data for a randomly distributed form of nanoparticles in all directions.
Saving Material with Systematic Process Designs
NASA Astrophysics Data System (ADS)
Kerausch, M.
2011-08-01
Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, they have two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to a generic rule or guideline, e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part. All requirements are defined such that a predefined set of die design standards with industrial relevance is fulfilled. In a first step, binder and addendum geometry are systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness versus splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with a stochastic variation of input variables. With the proposed workflow, a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.
Atmospheric stability and complex terrain: comparing measurements and CFD
NASA Astrophysics Data System (ADS)
Koblitz, T.; Bechmann, A.; Berg, J.; Sogachev, A.; Sørensen, N.; Réthoré, P.-E.
2014-12-01
For wind resource assessment, the wind industry is increasingly relying on Computational Fluid Dynamics models that focus on modeling the airflow in a neutrally stratified surface layer. So far, physical processes that are specific to the atmospheric boundary layer, for example the Coriolis force, buoyancy forces and heat transport, are mostly ignored in state-of-the-art flow solvers. In order to decrease the uncertainty of wind resource assessment, the effect of thermal stratification on the atmospheric boundary layer should be included in such models. The present work focuses on non-neutral atmospheric flow over complex terrain including physical processes like stability and Coriolis force. We examine the influence of these effects on the whole atmospheric boundary layer using the DTU Wind Energy flow solver EllipSys3D. To validate the flow solver, measurements from Benakanahalli hill, a field experiment that took place in India in early 2010, are used. The experiment was specifically designed to address the combined effects of stability and Coriolis force over complex terrain, and provides a dataset to validate flow solvers. Including those effects into EllipSys3D significantly improves the predicted flow field when compared against the measurements.
15 CFR 748.15 - Authorization Validated End-User (VEU).
Code of Federal Regulations, 2010 CFR
2010-01-01
..., Bureau of Industry and Security, U.S. Department of Commerce, 14th Street and Pennsylvania Avenue, NW... People's Republic of China. (2) India. (c) Item restrictions. Items controlled under the EAR for missile... Services, Bureau of Industry and Security, U.S. Department of Commerce, 14th Street and Constitution Avenue...
Lee, Jin; Huang, Yueng-hsiang; Robertson, Michelle M; Murphy, Lauren A; Garabet, Angela; Chang, Wen-Ruey
2014-02-01
The goal of this study was to examine the external validity of a 12-item generic safety climate scale for lone workers in order to evaluate the appropriateness of generalized use of the scale in the measurement of safety climate across various lone work settings. External validity evidence was established by investigating the measurement equivalence (ME) across different industries and companies. Confirmatory factor analysis (CFA)-based and item response theory (IRT)-based perspectives were adopted to examine the ME of the generic safety climate scale for lone workers across 11 companies from the trucking, electrical utility, and cable television industries. Fairly strong evidence of ME was observed for both organization- and group-level generic safety climate sub-scales. Although significant invariance was observed in the item intercepts across the different lone work settings, absolute model fit indices remained satisfactory in the most robust step of CFA-based ME testing. IRT-based ME testing identified only one differentially functioning item from the organization-level generic safety climate sub-scale, but its impact was minimal and strong ME was supported. The generic safety climate scale for lone workers reported good external validity and supported the presence of a common feature of safety climate among lone workers. The scale can be used as an effective safety evaluation tool in various lone work situations. Copyright © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burge, S.W.
Erosion has been identified as one of the significant design issues in fluid beds. A cooperative R&D venture of industry, research, and government organizations was recently formed to meet the industry need for a better understanding of erosion in fluid beds. Research focused on bed hydrodynamics, which are considered to be the primary erosion mechanism. As part of this work, ANL developed an analytical model (FLUFIX) for bed hydrodynamics. Partial validation was performed using data from experiments sponsored by the research consortium. Development of a three-dimensional fluid bed hydrodynamic model was part of Asea-Babcock's in-kind contribution to the R&D venture. This model, FORCE2, was developed by Babcock & Wilcox's Research and Development Division and was based on an existing B&W program and on the gas-solids modeling technology developed by ANL and others. FORCE2 contains many of the features needed to model plant-size beds and therefore can be used along with the erosion technology to assess metal wastage in industrial equipment. As part of the development efforts, FORCE2 was partially validated using ANL's two-dimensional model, FLUFIX, and experimental data. Time constraints as well as the lack of good hydrodynamic data, particularly at the plant scale, prohibited a complete validation of FORCE2. This report describes this initial validation of FORCE2.
A Model of Physical Performance for Occupational Tasks.
ERIC Educational Resources Information Center
Hogan, Joyce
This report acknowledges the problems faced by industrial/organizational psychologists who must make personnel decisions involving physically demanding jobs. The scarcity of criterion-related validation studies and the difficulty of generalizing validity are considered, and a model of physical performance that builds on Fleishman's (1984)…
Introduction to Validation of Analytical Methods: Potentiometric Determination of CO₂
ERIC Educational Resources Information Center
Hipólito-Nájera, A. Ricardo; Moya-Hernandez, M. Rosario; Gomez-Balderas, Rodolfo; Rojas-Hernandez, Alberto; Romero-Romo, Mario
2017-01-01
Validation of analytical methods is a fundamental subject for chemical analysts working in chemical industries. These methods are also relevant for pharmaceutical enterprises, biotechnology firms, analytical service laboratories, government departments, and regulatory agencies. Therefore, for undergraduate students enrolled in majors in the field…
Slide presentation at Conference: ASCE 7th Civil Engineering Conference in the Asian Region. USEPA in partnership with the Cadmus Group, Carollo Engineers, and other State & Industry collaborators, are evaluating new approaches for validating UV reactors to meet groundwater & sur...
76 FR 44331 - Ocean Transportation Intermediary License; Revocation
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-25
.... Johns Industrial Parkway North, Suite 6, Jacksonville, FL 32246. Date Revoked: June 13, 2011. Reason... Revoked: June 3, 2011. Reason: Failed to maintain valid bonds. License Number: 001362F. Name: Malvar..., 2011. Reason: Failed to maintain a valid bond. License Number: 003644NF. Name: Forward Logistics Group...
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has so far been addressed through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering process (MBSSE), derived in the System and Software Functional Requirement Techniques study (SSFRT), focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, in which we aim to develop methodological, theoretical and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
Remote Laser Welding of Zinc Coated Steel Sheets in an Edge Lap Configuration with Zero Gap
NASA Astrophysics Data System (ADS)
Roos, Christian; Schmidt, Michael
Remote Laser Welding (RLW) of zinc-coated steel sheets is a great challenge for the automotive industry but offers high potential with respect to flexibility and cost. In state-of-the-art applications, sheets are joined in an overlap configuration with a preset gap for stable zinc degassing. This paper investigates RLW of fillets without a preset gap and the conditions for a stable process. The influence of process parameters on weld quality and process stability is shown. Experimental data give evidence that degassing of zinc through the capillary and the rear melt pool are the major degassing mechanisms. Furthermore, the paper gives experimental validation of zinc degassing in advance of the process zone to the open side of the fillet. Chemical analysis of the hot-dip galvanized zinc coating proves the iron-zinc alloys to be the reason for the limited effectiveness of this mechanism in comparison to pure zinc as an intermediate.
NASA Astrophysics Data System (ADS)
Montoya Villena, Rafael
According to its title, the general objective of the Thesis is to develop a clear, simple and systematic methodology for programming PLC-type devices. With this aim in mind, we will use the following elements: Codification of all variable types. This section is very important since it allows us to work with little information. The necessary rules are given to codify all types of phrases produced in industrial processes. An algorithm that describes process evolution, which has been called the process D.F. This is one of the most important contributions, since together with information codification it will allow us to represent the process evolution graphically and with any design theory used. Theory selection. Evidently, the use of some kind of design method is necessary to obtain logic equations. For this particular case, we will use binodal theory, an ideal theory for wired technologies, since it can obtain highly reduced schemas for relatively simple automatisms, which means a minimum number of components is used. User program outline algorithm (D.F.P.). This is another necessary contribution and perhaps the most important one, since the logic equations resulting from binodal theory are compatible with process evolution if wired technology is used, whether electric, electronic, pneumatic, etc. On the other hand, the performance characteristics of PLC devices mean that the order of the program instructions determines whether the automatism is validated or not, as we have proven in different articles and lectures at congresses both national and international. Therefore, we will codify any information concerning the process to be automated, graphically represent its temporal evolution and, applying binodal theory and the D.F.P. (previously adapted), succeed in making the logic equations compatible with the process to be automated and the device in which they will be implemented (a PLC in our case).
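The abstract's point that instruction order in a PLC program can validate or invalidate an automatism can be illustrated with a toy scan-cycle simulation. The start/stop latch and all names below are hypothetical, not taken from the Thesis:

```python
# one PLC scan: evaluate rung equations in fixed program order, writing
# each output as it is computed, so later rungs see values already
# updated earlier in the same scan (this is why ordering matters)
def scan(state, rungs):
    for output, equation in rungs:
        state[output] = equation(state)
    return state

# hypothetical start/stop latch: memory bit M holds itself in until STOP opens
rungs = [
    ("M", lambda s: (s["START"] or s["M"]) and not s["STOP"]),
    ("MOTOR", lambda s: s["M"]),
]

state = {"START": True, "STOP": False, "M": False, "MOTOR": False}
scan(state, rungs)            # operator presses START
print(state["MOTOR"])         # prints True
state["START"] = False
scan(state, rungs)            # latch keeps the motor running
print(state["MOTOR"])         # prints True
```

Swapping the two rungs would delay MOTOR by one scan cycle relative to M: same logic equations, different behaviour, which is the ordering effect the abstract describes for PLC targets.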
NASA Astrophysics Data System (ADS)
Li, Can; Wang, Fei; Zang, Lixuan; Zang, Hengchang; Alcalà, Manel; Nie, Lei; Wang, Mingyu; Li, Lian
2017-03-01
Nowadays, as a powerful process analytical tool, near infrared spectroscopy (NIRS) has been widely applied in process monitoring. In the present work, NIRS combined with multivariate analysis was used to monitor the ethanol precipitation process of fraction I + II + III (FI + II + III) supernatant in human albumin (HA) separation, to achieve qualitative and quantitative monitoring at the same time and assure the product's quality. First, a qualitative model was established by using principal component analysis (PCA) with 6 of 8 normal batches of samples, and evaluated with the remaining 2 normal batches and 3 abnormal batches. The results showed that the first principal component (PC1) score chart could be successfully used for fault detection and diagnosis. Then, two quantitative models were built with 6 of 8 normal batches to determine the content of total protein (TP) and HA separately by using a partial least squares regression (PLS-R) strategy, and the models were validated with the 2 remaining normal batches. The determination coefficient of validation (Rp2), root mean square error of cross validation (RMSECV), root mean square error of prediction (RMSEP) and ratio of performance deviation (RPD) were 0.975, 0.501 g/L, 0.465 g/L and 5.57 for TP, and 0.969, 0.530 g/L, 0.341 g/L and 5.47 for HA, respectively. The results showed that the established models could give a rapid and accurate measurement of the content of TP and HA. The results of this study indicated that NIRS is an effective tool and could be successfully used for qualitative and quantitative monitoring of the ethanol precipitation process of FI + II + III supernatant simultaneously. This research has significant reference value for assuring the quality and improving the recovery ratio of HA at industrial scale by using NIRS.
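Two of the figures of merit reported above, RMSEP and RPD, are simple to compute from reference and predicted values, using the usual definitions RMSEP = sqrt(mean squared prediction error) and RPD = SD(reference) / RMSEP. A sketch with invented protein contents, not the study's data:

```python
import math
import statistics

def rmsep(actual, predicted):
    # root mean square error of prediction
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def rpd(actual, predicted):
    # ratio of performance deviation: SD of reference values over RMSEP
    return statistics.stdev(actual) / rmsep(actual, predicted)

# hypothetical reference vs NIRS-predicted protein contents (g/L)
ref = [12.1, 14.3, 10.8, 15.6, 13.0, 11.5]
nir = [12.3, 14.0, 11.0, 15.4, 13.2, 11.3]
print(round(rmsep(ref, nir), 3), round(rpd(ref, nir), 2))
```

An RPD above 5, as reported for both TP and HA, is conventionally taken to indicate a model fit for quantitative use.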
NASA Technical Reports Server (NTRS)
Lewis, Pattie
2007-01-01
Headquarters National Aeronautics and Space Administration (NASA) chartered the NASA Acquisition Pollution Prevention (AP2) Office to coordinate agency activities affecting pollution prevention issues identified during system and component acquisition and sustainment processes. The primary objectives of the AP2 Office are to: (1) Reduce or eliminate the use of hazardous materials or hazardous processes at manufacturing, remanufacturing, and sustainment locations. (2) Avoid duplication of effort in actions required to reduce or eliminate hazardous materials through joint center cooperation and technology sharing. The objective of this project was to qualify candidate alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel applications at NASA facilities. This project compares the surface preparation/depainting performance of the proposed alternatives to existing surface preparation/depainting systems or standards. This Joint Test Report (JTR) contains the results of testing as per the outlines of the Joint Test Protocol (JTP), Joint Test Protocol for Validation of Alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel, and the Field Test Plan (FTP), Field Evaluations Test Plan for Validation of Alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel, for critical requirements and tests necessary to qualify alternatives for coating removal systems. These tests were derived from engineering, performance, and operational impact (supportability) requirements defined by a consensus of government and industry participants. This JTR documents the results of the testing as well as any test modifications made during the execution of the project. This JTR is made available as a reference for future pollution prevention endeavors by other NASA Centers, the Department of Defense and commercial users to minimize duplication of effort. 
The current coating removal processes identified herein are for polyurethane, epoxy and other paint systems applied by conventional wet-spray processes. A table summarizes the target hazardous materials, processes and materials, applications, affected programs, and candidate substrates.
Rigaux, Clémence; André, Stéphane; Albert, Isabelle; Carlin, Frédéric
2014-02-03
Microbial spoilage of canned foods by thermophilic and highly heat-resistant spore-forming bacteria, such as Geobacillus stearothermophilus, is a persistent problem in the food industry. An incubation test at 55 °C for 7 days, then validation of biological stability, is used as an indicator of compliance with good manufacturing practices. We propose a microbial risk assessment model predicting the percentage of non-stability due to G. stearothermophilus in canned green beans manufactured by a French company. The model accounts for initial microbial contaminations of fresh unprocessed green beans with G. stearothermophilus, cross-contaminations in the processing chain, inactivation processes and probability of survival and growth. The sterilization process is modeled by an equivalent heating time depending on sterilization value F₀ and on G. stearothermophilus resistance parameter z(T). Following the recommendations of international organizations, second order Monte-Carlo simulations are used, separately propagating uncertainty and variability on parameters. As a result of the model, the mean predicted non-stability rate is of 0.5%, with a 95% uncertainty interval of [0.1%; 1.2%], which is highly similar to data communicated by the French industry. A sensitivity analysis based on Sobol indices and some scenario tests underline the importance of cross-contamination at the blanching step, in addition to inactivation due to the sterilization process. Copyright © 2013 Elsevier B.V. All rights reserved.
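The equivalent heating time in the model above rests on the standard thermal-processing relation: one minute at temperature T is worth 10^((T - Tref)/z) minutes at the reference temperature. A sketch with an invented retort profile; Tref = 121.1 °C and z = 10 °C are conventional defaults, not the paper's G. stearothermophilus values:

```python
def lethal_rate(temp_c, t_ref=121.1, z=10.0):
    # lethality of one minute at temp_c relative to one minute at t_ref,
    # for an organism with resistance parameter z (degrees C)
    return 10 ** ((temp_c - t_ref) / z)

def f_value(profile, z=10.0, t_ref=121.1):
    # integrate lethal rates over a (minutes, temperature) process profile
    return sum(minutes * lethal_rate(temp, t_ref, z) for minutes, temp in profile)

# hypothetical retort profile: come-up, hold, cool-down
profile = [(5, 111.1), (20, 121.1), (5, 111.1)]
print(round(f_value(profile), 1))  # equivalent minutes at 121.1 degrees C
```

The 10 minutes spent 10 °C (one z-value) below the reference contribute only one extra equivalent minute, which is why the hold phase dominates the F value.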
A model to predict accommodations needed by disabled persons.
Babski-Reeves, Kari; Williams, Sabrina; Waters, Tzer Nan; Crumpton-Young, Lesia L; McCauley-Bell, Pamela
2005-09-01
In this paper, several approaches to assist employers in the accommodation process for disabled employees are discussed and a mathematical model is proposed to assist employers in predicting the accommodation level needed by an individual with a mobility-related disability. This study investigates the validity and reliability of this model in assessing the accommodation level needed by individuals utilizing data collected from twelve individuals with mobility-related disabilities. Based on the results of the statistical analyses, this proposed model produces a feasible preliminary measure for assessing the accommodation level needed for persons with mobility-related disabilities. Suggestions for practical application of this model in an industrial setting are addressed.
Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent
2018-01-01
Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach has been used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular approach of quality by design was implemented, consisting of five consecutive steps to cover all the stages from product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis was conducted and identified 19 critical quality attributes among the product quality attributes. Process risk analysis was carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiment approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established.
The quality risk management and risk mitigation outputs, including verification of proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a hybrid of the traditional approach and elements of the quality by design enhanced approach, as illustrated, to more robustly assign material and process controls and to reliably meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation, and a clear justification of the process and product specifications as a basis for the control strategy and future comparability exercises.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crabtree, George; Glotzer, Sharon; McCurdy, Bill
This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software.
This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness. The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. 
Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. 
Similarly challenging is the coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.
Nutrient and media recycling in heterotrophic microalgae cultures.
Lowrey, Joshua; Armenta, Roberto E; Brooks, Marianne S
2016-02-01
In order for microalgae-based processes to reach commercial production for biofuels and high-value products such as omega-3 fatty acids, it is necessary that economic feasibility be demonstrated at the industrial scale. Therefore, process optimization is critical to ensure that the maximum yield can be achieved from the most efficient use of resources. This is particularly true for processes involving heterotrophic microalgae, which have not been studied as extensively as phototrophic microalgae. An area that has received significant conceptual praise, but little experimental validation, is that of nutrient recycling, where the waste materials from prior cultures and post-lipid extraction are reused for secondary fermentations. While the concept is very simple and could result in significant economic and environmental benefits, there are some underlying challenges that must be overcome before adoption of nutrient recycling is viable at commercial scale. Moreover, adapting nutrient recycling to optimized heterotrophic cultures presents added challenges, largely unexplored to date, that must be identified and addressed. These challenges center on carbon and nitrogen recycling and the implications of using waste materials in conjunction with virgin nutrients for secondary cultures. The aim of this review is to provide a foundation for further understanding of nutrient recycling for microalgae cultivation. As such, we outline the current state of technology and practical challenges associated with nutrient recycling for heterotrophic microalgae on an industrial scale and give recommendations for future work.
The Air Force Officer Qualifying Test: Validity, Fairness, and Bias
2010-01-01
scores. The Standards for Educational and Psychological Testing (AERA, APA, and NCME, 1999) provides a set of guidelines published and endorsed by the...determining the validity and bias of selection tests falls upon professionals in the discipline of industrial/organizational psychology 20 See Roper v. Dep’t...i). 30 The Air Force Officer Qualifying Test : Validity, Fairness, and Bias and closely related fields (e.g., educational psychology and
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Tengfang; Flapper, Joris; Ke, Jing
The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of detail of the process or plant, i.e., 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website.
It is expected that use of the BEST-Dairy tool will advance understanding of energy and water usage in individual dairy plants, augment benchmarking activities in the marketplace, and facilitate implementation of efficiency measures and strategies to save energy and water in the dairy industry. Industrial adoption of this emerging tool and technology is expected to benefit dairy plants, which are important customers of California utilities. Further demonstration of this benchmarking tool is recommended to facilitate its commercialization and the expansion of its functions. Wider use of the BEST-Dairy tool and its continuous expansion in functionality will help to reduce the actual consumption of energy and water in the dairy industry sector. The outcomes comply well with the goals set by AB 1250 for the PIER program.
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as the fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, to propose how to conduct the necessary steps, and to provide the data templates necessary for documentation and useful for following current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand the choices that have been made. In particular, this article shows how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.
Hydropower Research | Water Power | NREL
facilities are available to support hydropower technology validation and design optimization; designs can be optimized prior to expensive and time-consuming open-water validation. Using these methodologies, tools, and direct industry data, they analyze the near- and long-term
The Polygraph: Concept, Usage and Validity.
ERIC Educational Resources Information Center
Simpson, B. Allen
1986-01-01
Deals with the use of the "lie detector" or "polygraphic test" as a method of detecting deception in industries and law enforcement agencies. Explains what the polygraph is and how it operates. Presents a series of specific arguments for and against the validity of the instrument. Research appears to be inconclusive. (Author/ABB)
78 FR 3425 - Ocean Transportation Intermediary License Revocations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-16
...., Suite 208, Artesia, CA 90701. Date Revoked: November 11, 2012. Reason: Failed to maintain a valid bond... of Industry, CA 91748. Date Revoked: November 18, 2012. Reason: Failed to maintain a valid bond...: November 5, 2012. Reason: Voluntary Surrender of License. License No.: 021296NF. Name: ITW International...
Non-destructive inspection in industrial equipment using robotic mobile manipulation
NASA Astrophysics Data System (ADS)
Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah
2016-05-01
The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, on equipment arranged horizontally (using ground robots) or vertically (using climbing robots). The industrial objective has been to provide a means of measuring several physical parameters at multiple points with autonomous robots able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile manipulation point of view, mainly due to their extension (e.g. a 50 MW thermal solar plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes, and a 140 m high tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper, two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, localization, and planning algorithms to manage navigation over huge extensions, and (2) non-destructive inspection operations: thermography-based detection algorithms to provide automatic inspection abilities to the robots.
Multimodal inspection in power engineering and building industries: new challenges and solutions
NASA Astrophysics Data System (ADS)
Kujawińska, Małgorzata; Malesa, Marcin; Malowany, Krzysztof
2013-09-01
Recently, the demand for and number of applications of full-field optical measurement methods based on noncoherent light sources have increased significantly. They include traditional image processing, thermovision, digital image correlation (DIC), and structured light methods. However, there are still numerous challenges connected with implementing these methods for in-situ, long-term monitoring in industrial, civil engineering, and cultural heritage applications, with multimodal measurements of a variety of object features, or simply with adapting instruments to work in harsh environmental conditions. In this paper we focus on the 3D DIC method and present its enhancements concerning software modifications (new visualization methods and a method for automatic merging of data distributed in time) and hardware improvements. The modified 3D DIC system, combined with an infrared camera system, is applied in several interesting cases: measurements of a boiler drum during annealing and of pipelines in heat power stations, monitoring of steel struts at a construction site, and validation of numerical models of large building structures constructed of graded metal plate arches.
Application of a Subspace-Based Fault Detection Method to Industrial Structures
NASA Astrophysics Data System (ADS)
Mevel, L.; Hermans, L.; van der Auweraer, H.
1999-11-01
Early detection and localization of damage allow increased reliability and safety and reduced maintenance costs. This paper deals with the industrial validation of a technique to monitor the health of a structure in operating conditions (e.g. rotating machinery, civil constructions subject to ambient excitations, etc.) and to detect slight deviations in a modal model derived from in-operation measured data. In this paper, a statistical local approach based on covariance-driven stochastic subspace identification is proposed. The capabilities and limitations of the method with respect to health monitoring and damage detection are discussed, and it is explained how the method can be used in practice in industrial environments. After the successful validation of the proposed method on a few laboratory structures, its application to a sports car is discussed. The example illustrates that the method allows the early detection of a vibration-induced fatigue problem in a sports car.
The genomic applications in practice and prevention network.
Khoury, Muin J; Feero, W Gregory; Reyes, Michele; Citrin, Toby; Freedman, Andrew; Leonard, Debra; Burke, Wylie; Coates, Ralph; Croyle, Robert T; Edwards, Karen; Kardia, Sharon; McBride, Colleen; Manolio, Teri; Randhawa, Gurvaneet; Rasooly, Rebekah; St Pierre, Jeannette; Terry, Sharon
2009-07-01
The authors describe the rationale and initial development of a new collaborative initiative, the Genomic Applications in Practice and Prevention Network. The network convened by the Centers for Disease Control and Prevention and the National Institutes of Health includes multiple stakeholders from academia, government, health care, public health, industry and consumers. The premise of Genomic Applications in Practice and Prevention Network is that there is an unaddressed chasm between gene discoveries and demonstration of their clinical validity and utility. This chasm is due to the lack of readily accessible information about the utility of most genomic applications and the lack of necessary knowledge by consumers and providers to implement what is known. The mission of Genomic Applications in Practice and Prevention Network is to accelerate and streamline the effective integration of validated genomic knowledge into the practice of medicine and public health, by empowering and sponsoring research, evaluating research findings, and disseminating high quality information on candidate genomic applications in practice and prevention. Genomic Applications in Practice and Prevention Network will develop a process that links ongoing collection of information on candidate genomic applications to four crucial domains: (1) knowledge synthesis and dissemination for new and existing technologies, and the identification of knowledge gaps, (2) a robust evidence-based recommendation development process, (3) translation research to evaluate validity, utility and impact in the real world and how to disseminate and implement recommended genomic applications, and (4) programs to enhance practice, education, and surveillance.
de Brugerolle, Anne
2007-01-01
SkinEthic Laboratories is a France-based biotechnology company recognised as the world leader in tissue engineering. SkinEthic is devoted to developing and producing reliable and robust in vitro alternatives to animal use in the cosmetic, chemical, and pharmaceutical industries. SkinEthic models provide relevant tools for efficacy and safety screening tests to support integrated decision-making during research and development phases. Some screening tests are referenced and validated as alternatives to animal use (Episkin); others are in the process of validation under ECVAM and OECD guidelines. SkinEthic laboratories provide a unique, combined experience of more than 20 years from Episkin SNC and SkinEthic SA. Their unique cell culture process allows in vitro reconstructed human tissues with well-characterized histology, functionality, and ultrastructure features to be mass produced. Our product line includes skin models: a reconstructed human epidermis with a collagen layer (Episkin), reconstructed human epidermis without or with melanocytes (with a tanning degree from phototype II to VI), and reconstructed human epithelia, i.e. cornea and other mucosa (oral, gingival, oesophageal and vaginal). Our philosophy is based on three main commitments: to support our customers by providing robust and reliable models; to ensure training and education in using validated protocols, allowing a large array of raw materials, active ingredients, and finished products in solid, liquid, powder, cream, or gel form to be screened; and to provide a dedicated service to our partners.
Gómez-Carracedo, M P; Andrade, J M; Rutledge, D N; Faber, N M
2007-03-07
Selecting the correct dimensionality is critical for obtaining partial least squares (PLS) regression models with good predictive ability. Although calibration and validation sets are best established using experimental designs, industrial laboratories often cannot afford such an approach. Typically, samples are collected in a (formally) undesigned way, spread over time, and their measurements are included in routine measurement processes. This makes it hard to evaluate PLS model dimensionality. In this paper, classical criteria (leave-one-out cross-validation and adjusted Wold's criterion) are compared to recently proposed alternatives (smoothed PLS-PoLiSh and a randomization test) to seek out the optimum dimensionality of PLS models. Kerosene (jet fuel) samples were measured by attenuated total reflectance mid-IR spectrometry and their spectra were used to predict eight important properties determined using reference methods that are time-consuming and prone to analytical errors. The alternative methods were shown to give reliable dimensionality predictions when compared to external validation. By contrast, the simpler methods seemed to be largely affected by the largest changes in the modeling capabilities of the first components.
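The classical criteria compared here can be sketched in a few lines. The code below is a minimal illustration (not the authors' kerosene data or their PoLiSh/randomization implementations): a PLS1 model fitted by NIPALS, a leave-one-out PRESS, and an adjusted Wold's criterion that stops adding components once the PRESS ratio exceeds a threshold; the 0.95 threshold and all function names are assumptions.

```python
import numpy as np

def pls1_coef(X, y, k):
    """PLS1 regression coefficients with k latent variables (NIPALS)."""
    xm, ym = X.mean(0), y.mean()
    Xd, yd = X - xm, y - ym
    W, P, Q = [], [], []
    for _ in range(k):
        w = Xd.T @ yd
        w = w / np.linalg.norm(w)
        t = Xd @ w
        tt = float(t @ t)
        p = Xd.T @ t / tt
        q = float(yd @ t) / tt
        Xd = Xd - np.outer(t, p)   # deflate X
        yd = yd - q * t            # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(Q))
    return B, xm, ym

def loo_press(X, y, k):
    """Leave-one-out predictive residual sum of squares, k components."""
    n, err = len(y), 0.0
    for i in range(n):
        m = np.arange(n) != i
        B, xm, ym = pls1_coef(X[m], y[m], k)
        err += (y[i] - ((X[i] - xm) @ B + ym)) ** 2
    return err

def wold_dimension(X, y, kmax, r_crit=0.95):
    """Adjusted Wold's criterion: stop once PRESS(k+1)/PRESS(k) > r_crit."""
    press = [loo_press(X, y, k) for k in range(1, kmax + 1)]
    for k in range(1, kmax):
        if press[k] / press[k - 1] > r_crit:
            return k, press
    return kmax, press
```

On data generated from two latent factors, `wold_dimension` typically stops at two or three components, since later components no longer reduce the PRESS appreciably.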
Developing a Measure of Internet Well-Being: Nomological (Predictive) Validation
ERIC Educational Resources Information Center
Sirgy, M. Joseph; Lee, Dong-Jin; Bae, Jeannie
2006-01-01
This paper reports on an effort to develop an Internet well-being measure for possible use by government agencies and industry associations that are directly involved with the promulgation of the Internet. Such measures can help officials gauge the social health of those Internet-related industries and institutional sectors, which in turn can…
Validation of a pre-existing safety climate scale for the Turkish furniture manufacturing industry.
Akyuz, Kadri Cemil; Yildirim, Ibrahim; Gungor, Celal
2018-03-22
Understanding the safety climate level is essential to implementing a proactive safety program. The objective of this study is to explore the possibility of having a safety climate scale for the Turkish furniture manufacturing industry, since no such scale has been available. The questionnaire was administered to 783 subjects. Confirmatory factor analysis (CFA) tested a pre-existing safety scale's fit to the industry. The CFA indicated that the structure of the model presents an unsatisfactory fit to the data (χ² = 2033.4, df = 314, p ≤ 0.001; root mean square error of approximation = 0.08, normed fit index = 0.65, Tucker-Lewis index = 0.65, comparative fit index = 0.69, parsimony goodness-of-fit index = 0.68). The results suggest that a new scale should be developed and validated to measure the safety climate level in the Turkish furniture manufacturing industry. Due to the hierarchical structure of organizations, future studies should consider a multilevel approach in their exploratory factor analyses while developing a new scale.
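For reference, the RMSEA reported above follows directly from the χ², degrees of freedom, and sample size via the standard formula (this sketch uses N − 1 in the denominator; some software packages use N):

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation from a chi-square fit
    statistic, its degrees of freedom, and the sample size n."""
    return math.sqrt(max((chi2 - df) / (df * (n - 1)), 0.0))

# Values reported for the Turkish furniture-industry CFA
print(round(rmsea(2033.4, 314, 783), 2))  # → 0.08
```

Values at or below about 0.06 are conventionally taken to indicate good fit, so 0.08 is consistent with the unsatisfactory fit reported.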
The Transition from Spacecraft Development to Flight Operation: Human Factor Considerations
NASA Technical Reports Server (NTRS)
Basilio, Ralph R.
2000-01-01
In the field of aeronautics and astronautics, a paradigm shift has been witnessed by those in academia, research and development, and private industry. Long development life cycles, and the budgets to support such programs and projects, have given way to aggressive task schedules and leaner resources to draw from, all the while challenging assigned individuals to create and produce improved products or processes. However, this "faster, better, cheaper" concept cannot merely be applied to the design, development, and test of complex systems such as earth-orbiting or interplanetary robotic spacecraft. Full advantage is not possible without due consideration of and application to mission operations planning and flight operations. Equally as important as the flight system, the mission operations system, consisting of qualified personnel, ground hardware and software tools, and verified and validated operational processes, should also be regarded as a complex system requiring personnel to draw upon formal education, training, related experiences, and heuristic reasoning in engineering an effective and efficient system. Unquestionably, qualified personnel are the most important element of a mission operations system. This paper examines the experiences of the Deep Space 1 Project, the first in a series of new-technology in-flight validation missions sponsored by the United States National Aeronautics and Space Administration (NASA), specifically in developing a subsystems analysis and technology validation team comprised of former spacecraft development personnel. Human factor considerations are investigated from initial concept/vision formulation, through operational process development and personnel test and training, to initial uplink product development and test support.
Emphasis has been placed on challenges and applied or recommended solutions, so as to provide opportunities for future programs and projects to address and disposition potential issues and concerns as early as possible to reap the benefits associated with learning from other's past experiences.
Ens, Waldemar; Senner, Frank; Gygax, Benjamin; Schlotterbeck, Götz
2014-05-01
A new method for the simultaneous determination of iodinated X-ray contrast media (ICM) and artificial sweeteners (AS) by liquid chromatography-tandem mass spectrometry (LC-MS/MS), operated in positive and negative ionization switching mode, was developed. The method was validated for surface, ground, and drinking water samples. In order to gain higher sensitivity, a 10-fold sample enrichment step using a Genevac EZ-2 plus centrifugal vacuum evaporator, which provided excellent recoveries (90 ± 6%), was selected for sample preparation. Limits of quantification below 10 ng/L were obtained for all compounds. Furthermore, sample preparation recoveries and matrix effects were investigated thoroughly for all matrix types. Considerable matrix effects were observed in surface water and could be compensated by the use of four stable isotope-labeled internal standards. Due to their persistence, fractions of diatrizoic acid, iopamidol, and acesulfame could pass through the whole drinking water production process and were also observed in drinking water. To monitor the fate and occurrence of these compounds, the validated method was applied to samples from different stages of the drinking water production process of the Industrial Works of Basel (IWB). Diatrizoic acid was found to be the most persistent compound, being eliminated by just 40% during the whole drinking water treatment process, followed by iopamidol (80% elimination) and acesulfame (85% elimination). All other compounds were completely retained and/or degraded in the soil and thus were not detected in groundwater. Additionally, a direct injection method without sample preparation, achieving 3-20 ng/L limits of quantification, was compared to the developed method.
Robust interval-based regulation for anaerobic digestion processes.
Alcaraz-González, V; Harmand, J; Rapaport, A; Steyer, J P; González-Alvarez, V; Pelayo-Ortiz, C
2005-01-01
A robust regulation law is applied to the stabilization of a class of biochemical reactors exhibiting partially known, highly nonlinear dynamic behavior. An uncertain environment with the presence of unknown inputs is considered. Based on some structural and operational conditions, this regulation law is shown to exponentially stabilize the aforementioned bioreactors around a desired set-point. The approach is experimentally applied and validated on a pilot-scale (1 m3) anaerobic digestion process for the treatment of raw industrial wine distillery wastewater, where the objective is the regulation of the chemical oxygen demand (COD) using the dilution rate as the manipulated variable. Despite large disturbances on the input COD and state and parametric uncertainties, this regulation law gave excellent performance, leading the output COD towards its set-point and keeping it inside a pre-specified interval.
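The control objective described above (steering effluent COD to a set-point by manipulating the dilution rate, which must stay within an admissible interval) can be illustrated with a deliberately simplified toy model. The balance equation, gains, and parameter values below are invented for illustration and are not the paper's robust interval law:

```python
import numpy as np

# Toy chemostat-like COD balance: dS/dt = D*(S_in - S) - k*S
# (a crude stand-in for the anaerobic digester; all parameters are invented)
k, S_in, S_ref = 0.5, 10.0, 2.0      # degradation rate, inlet COD, set-point (g/L)
D_min, D_max = 0.0, 1.0              # admissible dilution-rate interval (1/h)

S, dt = 8.0, 0.01                    # initial COD and Euler time step
for _ in range(5000):
    # Simple proportional law, saturated to the allowed dilution-rate interval
    D = float(np.clip(0.2 - 0.5 * (S - S_ref), D_min, D_max))
    S += dt * (D * (S_in - S) - k * S)
print(round(S, 2))                   # settles in the neighbourhood of the set-point
```

Even this crude saturated law drives the COD close to its set-point; the paper's contribution is proving such convergence under unknown inputs and parametric uncertainty.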
NASA Astrophysics Data System (ADS)
Das, A.; Bang, H. S.; Bang, H. S.
2018-05-01
Multi-material combinations of aluminium alloy and carbon-fiber-reinforced plastics (CFRP) have gained attention in the automotive and aerospace industries as a way to enhance fuel efficiency and the strength-to-weight ratio of components. Various limitations of laser beam welding, adhesive bonding and mechanical fasteners make these processes inefficient for joining metal and CFRP sheets; friction lap joining is an alternative choice. Comprehensive studies of friction lap joining of aluminium to CFRP sheets are essential but scarce in the literature. The present work reports a combined theoretical and experimental study of the joining of AA5052 and CFRP sheets using the friction lap joining process. A three-dimensional finite-element-based heat transfer model is developed to compute the temperature fields and thermal cycles. The computed results are validated extensively against the corresponding experimentally measured results.
[Validation of a scale measuring coping with extreme risks].
López-Vázquez, Esperanza; Marván, María Luisa
2004-01-01
The objective of this study was to validate, in Mexico, the French coping scale "Echelle Toulousaine de Coping". In the fall of 2001, the scale questionnaire was applied to 209 subjects living in different areas of Mexico, exposed to five different types of extreme natural or industrial risks. The discriminatory capacity of the items, as well as the factorial structure and internal consistency of the scale, were analyzed using Mann-Whitney's U test, principal components factorial analysis, and Cronbach's alpha. The final scale was composed of 26 items forming two groups: active coping and passive coping. Internal consistency of the instrument was high, both in the total sample and in the subsample of natural and industrial risks. The coping scale is reliable and valid for the Mexican population. The English version of this paper is available at: http://www.insp.mx/salud/index.html.
Production process stability - core assumption of INDUSTRY 4.0 concept
NASA Astrophysics Data System (ADS)
Chromjakova, F.; Bobak, R.; Hrusecka, D.
2017-06-01
Today’s industrial enterprises, confronted with implementation of the INDUSTRY 4.0 concept, face a basic problem: stabilising manufacturing and supporting processes. Through such stabilisation they can achieve positive digital management of processes and continuous throughput. Structural stability of the horizontal (business) and vertical (digitised) manufacturing processes is required, supported by the digitalised technologies of the INDUSTRY 4.0 concept. The results presented in this paper are based on research and a survey carried out in several industrial companies. A basic model for structural process stabilisation in a manufacturing environment is then described.
Jiang, Hui; Liu, Guohai; Mei, Congli; Yu, Shuang; Xiao, Xiahong; Ding, Yuhan
2012-11-01
The feasibility of rapid determination of the process variables (i.e. pH and moisture content) in solid-state fermentation (SSF) of wheat straw using Fourier transform near infrared (FT-NIR) spectroscopy was studied. The synergy interval partial least squares (siPLS) algorithm was implemented to calibrate the regression model. The number of PLS factors and the number of subintervals were optimized simultaneously by cross-validation. The performance of the prediction model was evaluated according to the root mean square error of cross-validation (RMSECV), the root mean square error of prediction (RMSEP) and the correlation coefficient (R). The measurement results of the optimal model were as follows: RMSECV=0.0776, R(c)=0.9777, RMSEP=0.0963, and R(p)=0.9686 for the pH model; RMSECV=1.3544% w/w, R(c)=0.8871, RMSEP=1.4946% w/w, and R(p)=0.8684 for the moisture content model. Finally, compared with classic PLS and iPLS models, the siPLS model showed superior performance. The overall results demonstrate that FT-NIR spectroscopy combined with the siPLS algorithm can be used to measure process variables in solid-state fermentation of wheat straw, and that NIR spectroscopy has the potential to be utilized in the SSF industry. Copyright © 2012 Elsevier B.V. All rights reserved.
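The figures of merit reported above (RMSEP and the correlation coefficient R) are straightforward to compute from predicted versus reference values. A generic sketch with made-up pH numbers, not the study's data:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between reference and predicted values."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

def corr(y_true, y_pred):
    """Pearson correlation coefficient R."""
    return float(np.corrcoef(y_true, y_pred)[0, 1])

# Hypothetical reference pH values and model predictions
y_ref  = np.array([4.8, 5.1, 5.5, 6.0, 6.4])
y_pred = np.array([4.9, 5.0, 5.6, 5.9, 6.5])
print(round(rmse(y_ref, y_pred), 3))  # → 0.1  (RMSEP-style error)
print(round(corr(y_ref, y_pred), 3))  # correlation coefficient R, close to 1
```

RMSECV is the same error measure, computed on predictions produced during cross-validation rather than on a held-out prediction set.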
Modeling of Texture Evolution During Hot Forging of Alpha/Beta Titanium Alloys (Preprint)
2007-06-01
The approach was validated via an industrial-scale trial comprising hot pancake forging of Ti-6Al-4V. Keywords: Titanium, Texture, Modeling, Strain Partitioning, Variant Selection.
U.S. healthcare fix: leveraging the lessons from the food supply chain.
Kumar, Sameer; Blair, John T
2013-01-01
U.S. healthcare costs consistently outpace inflation, causing growing problems of affordability. This trend cannot be sustained indefinitely. The purpose of this study is to use supply-chain tools for a macro-level examination of U.S. healthcare as a business system and to identify options and best-use practices. We compare the important and successful U.S. food industry to the essential but problematic U.S. healthcare industry. Supply chain strategies leading to food business operations success are examined and healthcare applications suggested. We emphasize "total cost of ownership," which includes all costs incurred by all stakeholders of U.S. healthcare, including maintenance and cleanup, not just the initial purchase price. U.S. hospitals and clinics can use supply chain strategies in a total cost of ownership framework to reduce healthcare costs while maintaining patient care quality. Supply chain strategies of resource pooling, mass customization, centralized logistics, specialization, postponement and continuous improvement that have been successfully used in the U.S. food industry should be more widely applied to the U.S. healthcare industry. New and growing areas of telemedicine and medical tourism should be included in the supply chain analysis of U.S. healthcare. Valid statistical analysis of results in all areas of U.S. healthcare is an important part of the process. U.S. healthcare industry problems are systematic operational and supply chain problems rather than problems with workforce or technology. Examination of the U.S. healthcare industry through a supply chain framework should lead to significant operational improvement in both prevention and treatment of acute and chronic ailments. A rational and unemotional reorganization of U.S. healthcare system operations, using supply chain strategies, should help reduce healthcare costs while maintaining quality and increasing accessibility.
NASA Astrophysics Data System (ADS)
Wu, Yu-Liang; Jiang, Ze-Yi; Zhang, Xin-Xin; Xue, Qing-Guo; Yu, Ai-Bing; Shen, Yan-Song
2017-10-01
Metallurgical dusts can be recycled through direct reduction in rotary hearth furnaces (RHFs) via addition into carbon-based composite pellets. While iron in the dust is recycled, several heavy and alkali metal elements harmful for blast furnace operation, including Zn, Pb, K, and Na, can also be separated and then recycled. However, there is a lack of understanding on thermochemical behavior related to direct reduction in an industrial-scale RHF, especially removal behavior of Zn, Pb, K, and Na, leading to technical issues in industrial practice. In this work, an integrated model of the direct reduction process in an industrial-scale RHF is described. The integrated model includes three mathematical submodels and one physical model, specifically, a three-dimensional (3-D) CFD model of gas flow and heat transfer in an RHF chamber, a one-dimensional (1-D) CFD model of direct reduction inside a pellet, an energy/mass equilibrium model, and a reduction physical experiment using a Si-Mo furnace. The model is validated by comparing the simulation results with measurements in terms of furnace temperature, furnace pressure, and pellet indexes. The model is then used for describing in-furnace phenomena and pellet behavior in terms of heat transfer, direct reduction, and removal of a range of heavy and alkali metal elements under industrial-scale RHF conditions. The results show that the furnace temperature in the preheating section should be kept at a higher level in an industrial-scale RHF compared with that in a pilot-scale RHF. The removal rates of heavy and alkali metal elements inside the composite pellet are all faster than iron metallization, specifically in the order of Pb, Zn, K, and Na.
Wallert, Mark A; Provost, Joseph J
2014-01-01
To enhance the preparedness of graduates of the Biochemistry and Biotechnology (BCBT) Major at Minnesota State University Moorhead for employment in the bioscience industry, we have developed a new industry certificate program. The BCBT Industry Certificate was developed to address specific skill sets that local, regional, and national industry experts identified as lacking in new B.S. and B.A. biochemistry graduates. The certificate addresses concerns related to working in a regulated industry, such as Good Laboratory Practices, Good Manufacturing Practices, and working in a Quality System. In this article we describe how we developed a validation course that uses Standard Operating Procedures to define grading policy and laboratory notebook requirements, in an effort to better prepare students for the transition into industry careers. © 2013 by The International Union of Biochemistry and Molecular Biology.
High-power ultrasonic processing: Recent developments and prospective advances
NASA Astrophysics Data System (ADS)
Gallego-Juarez, Juan A.
2010-01-01
Although the application of ultrasonic energy to produce or enhance a wide variety of processes has been explored since about the middle of the 20th century, only a small number of ultrasonic processes have been established at the industrial level. During the last ten years, however, interest in ultrasonic processing has revived, particularly in industrial sectors where ultrasonic technology may represent a clean and efficient tool to improve existing processes, or an innovative alternative for the development of new ones. Such is the case in relevant sectors such as the food industry, the environment, pharmaceutical and chemical manufacture, machinery and mining, where power ultrasound is becoming an emerging technology for process development. The major difficulty in applying high-intensity ultrasound to industrial processing is the design and development of efficient power ultrasonic systems (generators and reactors) capable of successful large-scale operation, specifically adapted to each individual process. In the area of ultrasonic processing in fluid media, and more specifically in gases, the development of stepped-plate transducers and other power generators with extensive radiating surfaces has strongly contributed to the implementation, at the semi-industrial and industrial stages, of several commercial applications in sectors such as the food and beverage industry (defoaming, drying, extraction, etc.), the environment (air cleaning, sludge filtration, etc.) and machinery and manufacturing processes (textile washing, paint manufacture, etc.). The development of different cavitational reactors for liquid treatment in continuous flow is helping to introduce the wide potential of sonochemistry into industry. Processes such as water and effluent treatment, crystallization and soil remediation have already been implemented at the semi-industrial and/or industrial stage.
Other individual advances in sectors such as mining and energy also deserve mention. The objective of this paper is to review some recent developments in ultrasonic processing and to show the present situation and prospective progress of high-power ultrasonics as an innovative technology in many industrial sectors.
NASA Astrophysics Data System (ADS)
Jamaluddin, Z.; Razali, A. M.; Mustafa, Z.
2015-02-01
The purpose of this paper is to examine the relationship between quality management practices (QMPs) and organisational performance in the Malaysian manufacturing industry. A QMPs and organisational performance framework is developed from a comprehensive literature review covering hard and soft quality factors in the manufacturing process environment. A total of 11 hypotheses are put forward to test the relationships among six constructs: management commitment, training, process management, quality tools, continuous improvement and organisational performance. The model is analysed using Structural Equation Modeling (SEM) with AMOS software version 18.0 and Maximum Likelihood (ML) estimation. A total of 480 questionnaires were distributed, of which 210 were valid for analysis. The fit statistics of the QMPs and organisational performance model for the manufacturing industry are admissible. The results show that management commitment has a significant impact on training and process management. Similarly, training has a significant effect on quality tools, process management and continuous improvement. Furthermore, quality tools have a significant influence on process management and continuous improvement. Likewise, process management has a significant impact on continuous improvement, and continuous improvement has a significant influence on organisational performance. However, no significant relationship was found between management commitment and quality tools, or between management commitment and continuous improvement. The results can be used by managers to prioritize the implementation of QMPs.
For instance, practices found to have a positive impact on organisational performance can be recommended to managers so that they can allocate resources to improve those practices and obtain better performance.
Impact Of The Material Variability On The Stamping Process: Numerical And Analytical Analysis
NASA Astrophysics Data System (ADS)
Ledoux, Yann; Sergent, Alain; Arrieux, Robert
2007-05-01
Finite element simulation is a very useful tool in the deep drawing industry, used in particular for the development and validation of new stamping tools, and it reduces the cost and time of tooling design and set-up. One of the main difficulties in obtaining good agreement between the simulation and the real process, however, lies in the definition of the numerical conditions (mesh, punch travel speed, boundary conditions, …) and of the parameters that model the material behavior. Indeed, in the press shop, a variation of the formed part geometry is often observed when the sheet set changes, owing to the variability of material properties between sets. This is probably one of the main sources of process deviation at set-up, which is why it is important to study the influence of material data variation on the geometry of a classical stamped part. The chosen geometry is an omega-shaped part, both for its simplicity and because it is representative of automotive parts (car body reinforcement); moreover, it shows significant springback deviations. An isotropic behaviour law is assumed. The impact of statistical deviations of the three law coefficients characterizing the material, and of the friction coefficient, around their nominal values is tested. A Gaussian distribution is assumed, and the resulting geometry variation is studied by FE simulation. A second approach models the process variability mathematically: as a function of the input parameter variability, an analytical model is defined that yields the part geometry variability around the nominal shape. These two approaches allow the process capability to be predicted as a function of the material parameter variability.
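The statistical approach described (Gaussian scatter on material-law and friction coefficients propagated to part geometry) can be sketched as a Monte Carlo loop. Here a hypothetical linear surrogate stands in for the FE simulation, and all coefficients and scatter values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo sample size

# Hypothetical nominal values and Gaussian scatter for a Hollomon-type law
# sigma = K * eps**n, plus a friction coefficient mu (all values invented)
K  = rng.normal(500.0, 15.0, N)   # strength coefficient (MPa)
n  = rng.normal(0.20, 0.01, N)    # hardening exponent
mu = rng.normal(0.10, 0.01, N)    # friction coefficient

# Invented linear surrogate for springback angle (deg) around the nominal part;
# in the paper's approach this role is played by the FE model or the fitted
# analytical model.
springback = 2.0 + 0.004 * (K - 500.0) - 8.0 * (n - 0.20) + 5.0 * (mu - 0.10)

print(round(float(springback.mean()), 2), round(float(springback.std()), 3))
```

The resulting mean and standard deviation of the geometry response are exactly what a process-capability assessment needs as inputs.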
Unusual Applications of Ultrasound in Industry
NASA Astrophysics Data System (ADS)
Keilman, George
The application of physical acoustics in industry has been accelerated by increased understanding of the physics of industrial processes, coupled with rapid advancements in transducers, microelectronics, data acquisition, signal processing, and related software fields. This has led to some unusual applications of ultrasound to improve industrial processes.
NASA Astrophysics Data System (ADS)
Lemieux, Alain
The advantages of producing metal parts by rheocasting are generally recognised for common Al-Si foundry alloys. However, other, higher-performing alloys in terms of mechanical properties could be of great interest for specialized applications in the automotive industry while remaining competitive in forming. Indeed, the growing demand for more competitive products requires the development of new alloys better suited to semi-solid processes. Among others, Al-Cu alloys of the 2XX series are known for their superior mechanical strength. In the past, however, 2XX alloys were never candidates for pressure die casting, the main reason being their propensity to hot tearing. Semi-solid processes provide better molding conditions, with the rheological behavior of a paste and lower molding temperatures reducing this type of defect. In its initial phase, this research studied factors that reduce the hot tearing susceptibility of castings produced from alloy 206 by the semi-solid SEED process. Subsequently, a comparative study of tensile and fatigue properties was performed on four variants of alloy 206. The tensile strength and fatigue results were compared with the specifications for automotive applications and also with competing processes and alloys. Several metallurgical aspects were analyzed, and the following main points were validated: i) the main effects of compositional variations of silicon, iron and copper in the Al-Cu (206) alloy on the mechanical properties, and ii) certain relationships between the hot cracking mechanism and the solidification rate in the semi-solid state. Parts produced from semi-solid paste obtained by the SEED process, combined with modified 206 alloys, were successfully molded and achieved mechanical properties superior to the requirements of the automotive industry.
The fatigue properties of the two best modified 206 alloys were higher than those of A357 alloy castings and close to those of the wrought alloy AA6061. At present, there is simply no known application of alloy 206 in conventional liquid pressure die casting, mainly because of its high propensity to hot cracking and the limitations it imposes on part geometry and subsequent assembly. This study demonstrated that, in addition to accommodating large variations in chemical composition in parts produced by semi-solid die casting, the SEED process allows sound parts of more complex geometry to be obtained. Moreover, as semi-solid parts have less porosity, they can also be machined and welded for some applications. The conclusions of this study represent significant progress in identifying the main issues related to the feasibility of die casting sound, high-performance parts using the modified 206 alloy combined with the SEED process. This work is therefore a baseline for the development of new Al-Cu alloys for the semi-solid industry and, at the same time, for the expansion of aluminum into high-performance applications. N.B. This thesis is part of a research project developed by the NSERC / Rio Tinto Alcan Industrial Research Chair in Metallurgy of Innovative Aluminum Transformation (CIMTAL).
Ensuring Evidence-Based Safe and Effective mHealth Applications.
Vallespin, Bárbara; Cornet, Joan; Kotzeva, Anna
2016-01-01
The Internet and the digitalization of information have brought big changes to healthcare, but the arrival of smartphones and tablets represents a true revolution, opening a new paradigm that completely changes our lives. In order to validate the impact of these new technologies on health care, it is essential to have enough clinical studies assessing their effect on patient wellbeing and care. Traditional regulatory organisations are still looking for their role in this area; if they follow the classical path of medical devices, the result is a technical, administrative and economic collapse. This contribution first presents the main indicators showing the potential of mHealth adoption. It then proposes a classification of mobile health care apps and presents frameworks for mHealth evaluation. Regulation of mHealth as part of the evaluation process is discussed. Finally, the necessary steps and challenges that the industry must take into account to prepare the entrance of these technologies into the EU market are analysed.
Retrieval and Validation of Aerosol Optical Depth by using the GF-1 Remote Sensing Data
NASA Astrophysics Data System (ADS)
Zhang, L.; Xu, S.; Wang, L.; Cai, K.; Ge, Q.
2017-05-01
Based on the characteristics of GF-1 remote sensing data, a method and data processing procedure to retrieve the Aerosol Optical Depth (AOD) are developed in this study. The surface contributions over dense vegetation and urban bright-target areas are removed using the dark target and deep blue algorithms, respectively. Our method is applied to three seriously polluted regions: Beijing-Tianjin-Hebei (BTH), the Yangtze River Delta (YRD) and the Pearl River Delta (PRD). The retrieved AOD values are validated against ground-based AERONET data from the Beijing, Hangzhou and Hong Kong sites. Our results show that: 1) heavy aerosol loadings are usually found in densely populated cities with high industrial emissions, with AOD values near 1; 2) there is good agreement between satellite retrievals and in-situ observations, with correlation coefficients of 0.71 (BTH), 0.55 (YRD) and 0.54 (PRD); 3) the GF-1 retrieval uncertainties mainly arise from cloud contamination, high surface reflectance and the assumed aerosol model.
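A validation statistic like the correlation coefficients quoted above is a simple comparison between collocated satellite retrievals and ground observations. A sketch with invented AOD pairs (not the study's measurements):

```python
import numpy as np

# Hypothetical collocated AOD pairs: AERONET ground truth vs. satellite retrieval
aeronet   = np.array([0.12, 0.35, 0.48, 0.72, 0.95, 1.10])
retrieved = np.array([0.15, 0.30, 0.55, 0.65, 1.00, 1.02])

r = float(np.corrcoef(aeronet, retrieved)[0, 1])   # correlation coefficient
bias = float(np.mean(retrieved - aeronet))         # mean retrieval bias
print(round(r, 2), round(bias, 3))
```

In practice, each pair would be a spatio-temporal collocation (satellite pixels averaged around the site, AERONET observations averaged around the overpass time) before the statistics are computed.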
Primary standards for measuring flow rates from 100 nl/min to 1 ml/min - gravimetric principle.
Bissig, Hugo; Petter, Harm Tido; Lucas, Peter; Batista, Elsa; Filipe, Eduarda; Almeida, Nelson; Ribeiro, Luis Filipe; Gala, João; Martins, Rui; Savanier, Benoit; Ogheard, Florestan; Niemann, Anders Koustrup; Lötters, Joost; Sparreboom, Wouter
2015-08-01
Microflow and nanoflow rate calibrations are important in several applications such as liquid chromatography, (scaled-down) process technology, and special health-care applications. However, traceability in the microflow and nanoflow range does not go below 16 μl/min in Europe. Furthermore, the European metrology organization EURAMET did not yet validate this traceability by means of an intercomparison between different National Metrology Institutes (NMIs). The NMIs METAS, Centre Technique des Industries Aérauliques et Thermiques, IPQ, Danish Technological Institute, and VSL have therefore developed and validated primary standards to cover the flow rate range from 0.1 μl/min to at least 1 ml/min. In this article, we describe the different designs and methods of the primary standards of the gravimetric principle and the results obtained at the intercomparison for the upper flow rate range for the various NMIs and Bronkhorst High-Tech, the manufacturer of the transfer standards used.
Barriers to Achieving Economies of Scale in Analysis of EHR Data. A Cautionary Tale.
Sendak, Mark P; Balu, Suresh; Schulman, Kevin A
2017-08-09
Signed in 2009, the Health Information Technology for Economic and Clinical Health Act infused $28 billion of federal funds to accelerate adoption of electronic health records (EHRs). Yet, EHRs have produced mixed results and have even raised concern that the current technology ecosystem stifles innovation. We describe the development process and report initial outcomes of a chronic kidney disease analytics application that identifies high-risk patients for nephrology referral. The cost to validate and integrate the analytics application into clinical workflow was $217,138. Despite the success of the program, redundant development and validation efforts will require $38.8 million to scale the application across all multihospital systems in the nation. We address the shortcomings of current technology investments and distill insights from the technology industry. To yield a return on technology investments, we propose policy changes that address the underlying issues now being imposed on the system by an ineffective technology business model.
NASA Astrophysics Data System (ADS)
Boughari, Yamina
New methodologies have been developed to optimize the integration, testing and certification of flight control systems, an expensive process in the aerospace industry. This thesis investigates the stability of the Cessna Citation X aircraft without control and then optimizes two different flight controllers from design to validation. The aircraft model was obtained from data provided by the Research Aircraft Flight Simulator (RAFS) of the Cessna Citation business aircraft. To increase the stability and control of aircraft systems, two flight control designs were optimized: 1) the Linear Quadratic Regulation and Proportional Integral controllers were optimized using the Differential Evolution algorithm with the level 1 handling qualities as the objective function; the results were validated for the linear and nonlinear aircraft models, and some of the clearance criteria were investigated; and 2) the H-infinity control method was applied to the stability and control augmentation systems. To minimize the time required for flight control design and validation, the controller design was optimized using the Differential Evolution (DE) and Genetic (GA) algorithms; the DE algorithm proved more efficient than the GA. New tools for visualization of the linear validation process were also developed to reduce the time required for flight controller assessment. Matlab software was used to validate the results of the different optimization algorithms. Research platforms for the aircraft's linear and nonlinear models were developed and compared with the results of flight tests performed on the Research Aircraft Flight Simulator. Some of the clearance criteria of the optimized H-infinity flight controller were evaluated, including its linear stability, eigenvalues, and handling qualities criteria.
Nonlinear simulations of the maneuver criteria were also performed during this research to assess the Cessna Citation X flight controller's clearance and, therefore, its anticipated certification.
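Differential Evolution gain tuning of the kind described can be sketched with SciPy's implementation on a toy first-order plant with a PI controller. The plant, cost function and gain bounds below are illustrative stand-ins, not the Citation X model or the thesis's handling-qualities objective:

```python
from scipy.optimize import differential_evolution

def step_cost(gains, dt=0.01, t_end=5.0):
    """Integrated absolute tracking error of a PI loop around the toy plant dy/dt = -y + u."""
    kp, ki = gains
    y = integ = cost = 0.0
    for _ in range(int(t_end / dt)):
        e = 1.0 - y                 # unit-step reference minus output
        integ += e * dt             # integral of the error
        u = kp * e + ki * integ     # PI control law
        y += dt * (-y + u)          # explicit-Euler plant update
        cost += abs(e) * dt         # accumulate |error| (IAE criterion)
    return cost

# DE searches the bounded gain space without needing gradients
result = differential_evolution(step_cost,
                                bounds=[(0.0, 20.0), (0.0, 20.0)],
                                seed=1, maxiter=50, tol=1e-6)
kp_opt, ki_opt = result.x
print(round(float(result.fun), 3))  # small residual tracking error
```

The same pattern scales to the thesis's setting by replacing `step_cost` with a simulation of the aircraft model scored against handling-qualities criteria.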
NASA Astrophysics Data System (ADS)
Kim, Dongwook; Quagliato, Luca; Lee, Wontaek; Kim, Naksoo
2017-09-01
In ERW (electric resistance welding) pipe manufacturing, material properties, process conditions and settings strongly influence the mechanical performance of the final product and can make it non-uniform, varying from point to point in the pipe. The present research work proposes an integrated numerical model of the whole ERW process, considering the roll forming, welding and sizing stations, which allows the influence of the process parameters on the final quality of the pipe, in terms of final shape and residual stress, to be inferred. The numerical model was initially validated by comparing the pipe dimensions derived from the simulation results with those of industrial production, proving the reliability of the approach. Afterwards, by varying the process parameters in the numerical simulation, namely the roll speed, the sizing ratio and the friction factor, their influence on the residual stress in the pipe, at the end of the process and after each station, is studied and discussed in the paper.
Benefits of an automated GLP final report preparation software solution.
Elvebak, Larry E
2011-07-01
The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.
Development and Validation of an Acid Mine Drainage Treatment Process for Source Water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lane, Ann
Throughout Northern Appalachia and surrounding regions, hundreds of abandoned mine sites exist which frequently are the source of Acid Mine Drainage (AMD). AMD typically contains metal ions in solution with sulfate ions which have been leached from the mine. These large volumes of water, if treated to a minimum standard, may be of use in Hydraulic Fracturing (HF) or other industrial processes. This project’s focus is to evaluate an AMD water treatment technology for the purpose of providing treated AMD as an alternative source of water for HF operations. The HydroFlex™ technology allows the conversion of a previous environmental liability into an asset while reducing stress on potable water sources. The technology achieves greater than 95% water recovery, while removing sulfate to concentrations below 100 mg/L and common metals (e.g., iron and aluminum) below 1 mg/L. The project is intended to demonstrate the capability of the process to provide AMD as alternative source water for HF operations. The second budget period of the project has been completed, during which Battelle conducted two individual test campaigns in the field. The first test campaign demonstrated the ability of the HydroFlex system to remove sulfate to levels below 100 mg/L, meeting the requirements indicated by industry stakeholders for use of the treated AMD as source water. The second test campaign consisted of a series of focused confirmatory tests aimed at gathering additional data to refine the economic projections for the process. Throughout the project, regular communications were held with a group of project stakeholders to ensure alignment of the project objectives with industry requirements. Finally, the process byproduct generated by the HydroFlex process was evaluated for the treatment of produced water against commercial treatment chemicals. It was found that the process byproduct achieved similar results for produced water treatment as the chemicals currently in use.
Further, the process byproduct demonstrated better settling characteristics in bench scale testing. The field testing conducted in the second project budget period demonstrated the ability of the HydroFlex technology to meet industry requirements for AMD water chemical composition so that it can be used as source water in HF activities. System and operational improvements were identified in an additional series of confirmatory tests to achieve competitive cost targets. Finally, the application of the HydroFlex process byproduct in produced water treatment was demonstrated, further supporting the commercial implementation of the technology. Overall, the project results demonstrate a path to the economic treatment of AMD to support its increased use as source water in HF, particularly in regions with limited local freshwater availability.« less
ERIC Educational Resources Information Center
Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette
2017-01-01
Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-25
... intended for transfusion, including recommendations for validation and quality control monitoring of the... intended for transfusion, including recommendations for validation and quality control monitoring of the... control number 0910-0052; the collections of information in 21 CFR 606.100(b), 606.100(c), 606.121, and...
Cast iron cutting with nano TiN and multilayer TiN-CrN coated inserts
NASA Astrophysics Data System (ADS)
Perucca, M.; Durante, S.; Semmler, U.; Rüger, C.; Fuentes, G. G.; Almandoz, E.
2012-09-01
During the past decade, great success has been achieved in the development of duplex and multilayer multi-functional surface systems. Among these surface systems, nanoscale multilayer coatings exhibit outstanding properties. Within the framework of the M3-2S project, funded under the 7th European Framework Programme, several nanoscale multilayer coatings have been developed and investigated for experimental and industrial validation. This paper shows the performance of TiN and TiN/CrN nanoscale multilayer coatings on WC cutting inserts when machining GJL250 cast iron. The thin films have been deposited by cathodic arc evaporation in an industrial PVD system. The multilayer deposition characteristics and properties are shown. The inserts have been investigated in systematic cutting experiments of cast iron bars on a turning machine specifically equipped for force measurements, accompanied by wear determination. Furthermore, equivalent experiments have been carried out on an industrial turning unit. Industrial validation criteria have been applied to assess the comparative performance of the coatings. The choice of the material and the machined parts is driven by an interest in automotive applications. The industrial tests show the need to further optimise the multi-scale modelling approach in order to reduce the lead time of the coating development as well as to improve simulation reliability.
NASA Astrophysics Data System (ADS)
Zhou, Zongchuan; Dang, Dongsheng; Qi, Caijuan; Tian, Hongliang
2018-02-01
Accurate forecasting of the power consumption of high energy-consuming industries is of great significance. A forecasting model for the power consumption of high energy-consuming industries, based on system dynamics, is proposed in this paper. First, several factors that have influenced the development of high energy-consuming industries in recent years are carefully dissected. Next, by analysing the relationship between each factor and power consumption, a system dynamics flow diagram and equations are set up to reflect the relevant relationships among the variables. Finally, the validity of the model is verified by forecasting the power consumption of the electrolytic aluminium industry in Ningxia with the proposed model.
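The stock-and-flow structure that system-dynamics models like this rely on can be sketched numerically. The sketch below is a minimal illustration with invented coefficients (initial capacity, growth rate, energy intensity), not the paper's calibrated Ningxia model:

```python
# System-dynamics sketch (hypothetical coefficients, not the paper's
# calibrated electrolytic-aluminium model): a capacity stock grows with
# its net-investment flow and drives annual power consumption.

def simulate(years=5, dt=1.0,
             capacity0=100.0,        # initial capacity in kt/yr (assumed)
             growth_rate=0.04,       # net fractional capacity growth per year (assumed)
             energy_intensity=13.5): # MWh consumed per tonne produced (assumed)
    capacity = capacity0
    consumption = []                 # annual power consumption, GWh
    for _ in range(int(years / dt)):
        capacity += growth_rate * capacity * dt          # stock += flow * dt
        consumption.append(capacity * energy_intensity)  # kt/yr * MWh/t = GWh/yr
    return consumption
```

A real model would add feedback loops (price, policy, efficiency improvements) as additional flows acting on the stock.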
ERIC Educational Resources Information Center
Demirkan, Haluk; Goul, Michael; Gros, Mary
2010-01-01
Many e-learning service systems fail. This is particularly true for those sponsored by joint industry/university consortia where substantial economic investments are required up-front. This article provides an industry/university consortia reference model validated through experiences with the 8-year-old Teradata University Network. The reference…
The Motion Picture Audience: A Neglected Aspect of Film Research.
ERIC Educational Resources Information Center
Austin, Bruce A.
There has been little valid and reliable research of the motion picture audience. Specific reasons for the movie industry's own inattention to audience research include the early popularity of films and the fact that since the industry does not sell advertising it does not need to account for its audience size and preferences. Some researchers…
Deciphering the V-Chip: An Examination of the Television Industry's Program Rating Judgments.
ERIC Educational Resources Information Center
Kunkel, Dale; Farinola, Wendy Jo Maynard; Farrar, Kirstie; Donnerstein, Edward; Biely, Erica; Zwarun, Lara
2002-01-01
Investigates the validity of the television industry's labeling of sensitive program content following the advent of the V-chip television ratings system. Examines programs for the nature and extent of portrayals of violence, sexual behavior and dialogue, and adult language. Suggests there are substantial limitations in the ability of the V-chip…
Assessment of educational research capabilities at selected minority institutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, W.J.
1983-07-01
A university, or consortia, to be engaged in DOE R and D programs, must be a source of high quality science, be able to mount multidisciplinary efforts, be dedicated to the agency mission, and be able to bring together unique outside resources if not available elsewhere. The DOE should establish a process through which the minority institutions can more fully participate in the funded research process and not be subjected to criticism which has any reasonable chance of being valid. The DOE should support efforts to establish opportunity for access to and participation in all of the DOE programs by minority colleges and universities so that they can become resources that can contribute to the solution of the nation's energy problems through: involvement in research and development programs of the DOE, and eventually in those of other agencies and industry; education and training of the quantities of personnel needed in energy, energy technology, energy-related issues and disciplines; planning, decision and preparation of quality interdisciplinary curricula; acquisition of the understanding of energy, energy technology, and energy-related issues and policy necessary for technology and information transfer to the local community, industry, academia, and governments; and support of exploratory research in unique projects and new ideas prior to the researcher's obtaining longer-term support elsewhere.
Navya, P N; Pushpa, S Murthy
2013-08-01
Coffee cherry husk (CH) is one of the major by-products of the coffee processing industry and contains 43 ± 5.9% cellulose. Screening of fungal organisms for cellulase production was carried out, and the potential organism was identified as Rhizopus stolonifer by internal transcribed spacer (ITS)-5.8S rDNA analysis. A systematic study with response surface methodology (RSM) based on CCRD was used to study the interactions among the variables pH (3-7), moisture (40-80%) and duration (72-168 h) of the fermentation process to maximize enzyme production. Under the optimized cultivation conditions, R. stolonifer synthesized 22,109 U/gds. Model validation at the optimum operating conditions showed excellent agreement between the experimental results and the predicted responses at a confidence level of 95%. The endoglucanase thus produced was utilized for ethanol production by simultaneous saccharification and fermentation, and a maximum of 65.5 g/L of ethanol was obtained. This fungal cellulase has also been reported to be an efficient detergent additive and is promising for commercial use. The present study demonstrates coffee husk as a significant bioprocess substrate. Statistical optimization of the major parameters for cellulase production can be highly applicable at industrial scale. Furthermore, value addition to coffee husk with sustainable waste management, leading to environmental conservation, can be achieved.
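The RSM step described above fits a second-order model in the coded factors and then locates its optimum. The sketch below shows that last step with invented coefficients and no interaction terms; these are not the fitted values from the study:

```python
import itertools

# Response-surface sketch: a second-order model in three coded factors
# (pH, moisture, fermentation time), maximized by grid search over the
# coded design region. All coefficients are invented for illustration.
B0 = 20000.0
LIN = {'pH': 500.0, 'moist': 800.0, 'time': 300.0}
QUAD = {'pH': -900.0, 'moist': -1200.0, 'time': -400.0}

def predict(x):
    """Quadratic response surface (no interaction terms), units U/gds."""
    y = B0
    for k, v in x.items():
        y += LIN[k] * v + QUAD[k] * v * v
    return y

def grid_optimum(step=0.25):
    """Exhaustive search over the coded region [-1, 1]^3."""
    levels = [i * step for i in range(-4, 5)]
    best_x, best_y = None, float('-inf')
    for p in itertools.product(levels, repeat=3):
        x = dict(zip(('pH', 'moist', 'time'), p))
        y = predict(x)
        if y > best_y:
            best_x, best_y = x, y
    return best_x, best_y
```

In practice the model coefficients come from regressing the CCRD runs, and the optimum is confirmed experimentally, as the validation runs in the study do.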
Modelling of Dispersed Gas-Liquid Flow using LBGK and LPT Approach
NASA Astrophysics Data System (ADS)
Agarwal, Alankar; Prakash, Akshay; Ravindra, B.
2017-11-01
The dynamics of gas bubbles play a significant, if not crucial, role in a large variety of industrial processes that involve reactors. Many of these processes are still not well understood in terms of optimal scale-up strategies. Accurate modeling of bubbles and bubble swarms becomes important for high-fidelity bioreactor simulations. This study is part of the development of robust bubble-fluid interaction modules for the simulation of industrial-scale reactors. The work presents the simulation of a single bubble rising in a quiescent water tank using current models from the literature for bubble-fluid interaction. In this multiphase benchmark problem, the continuous phase (water) is discretized using the Lattice Bhatnagar-Gross-Krook (LBGK) model of the Lattice Boltzmann Method (LBM), while the dispersed gas phase (i.e., the air bubble) is modeled with the Lagrangian particle tracking (LPT) approach. A computationally cheap clipped fourth-order polynomial function is used to model the interaction between the two phases. The model is validated by comparing the simulated terminal velocity at varying bubble diameter, and the influence of bubble motion on the liquid velocity, with theoretical and previously available experimental data. This work was supported by the ``Centre for Development of Advanced Computing (C-DAC), Pune'', which provided the advanced computational facility on PARAM Yuva-II.
Ultrasonic measurements of the bulk flow field in foams
NASA Astrophysics Data System (ADS)
Nauber, Richard; Büttner, Lars; Eckert, Kerstin; Fröhlich, Jochen; Czarske, Jürgen; Heitkam, Sascha
2018-01-01
The flow field of moving foams is relevant for basic research and for the optimization of industrial processes such as froth flotation. However, no adequate measurement technique exists for the local velocity distribution inside the foam bulk. We have investigated ultrasound Doppler velocimetry (UDV), providing the first two-dimensional, non-invasive velocity measurement technique with an adequate spatial (10 mm) and temporal (2.5 Hz) resolution that is applicable to medium-scale foam flows. The measurement object is dry aqueous foam flowing upward in a rectangular channel. An array of ultrasound transducers is mounted within the channel, sending pulses along the main flow axis and receiving echoes from the foam bulk. This results in a temporally and spatially resolved, planar velocity field up to a measurement depth of 200 mm, which is approximately one order of magnitude larger than that of optical techniques. A comparison with optical reference measurements of the surface velocity of the foam allows validation of the UDV results. At a 2.5 Hz frame rate, an uncertainty below 15 percent and an axial spatial resolution better than 10 mm are found. Therefore, UDV is a suitable tool for the monitoring of industrial processes as well as the scientific investigation of three-dimensional foam flows on medium scales.
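The pulse-echo geometry behind UDV reduces to two textbook relations: depth from the round-trip time of flight, and axial velocity from the Doppler shift. The sketch below uses placeholder values for sound speed and frequencies, which are assumptions; the effective sound speed in aqueous foam is far lower than in water and must be calibrated in practice:

```python
import math

# Pulsed-Doppler velocimetry sketch. The relations are standard; the
# numeric values in the usage below are illustrative, not from the study.

def axial_velocity(f_doppler, f0, c, theta_deg=0.0):
    """v = c * f_d / (2 * f0 * cos(theta)) for pulse-echo geometry."""
    return c * f_doppler / (2.0 * f0 * math.cos(math.radians(theta_deg)))

def echo_depth(time_of_flight, c):
    """Reflector depth from round-trip time of flight: d = c * t / 2."""
    return c * time_of_flight / 2.0
```

With an assumed effective sound speed of 50 m/s in foam, a reflector echoing after 8 ms would sit at 0.2 m, consistent with the 200 mm measurement depth quoted above.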
NASA Astrophysics Data System (ADS)
Dresp, G.; Petermann, M.; Fieback, T. M.
2018-04-01
An existing apparatus for forced flow-through of liquid sorbents has been enhanced with an optically accessible system comprising a transparent crucible, a high-pressure viewing cell, and a camera. With this optical system, the active surface area between gas and liquid can be determined in situ for the first time under industrial process conditions while maintaining the accuracy of a magnetic suspension balance. Additionally, any swelling and the resulting buoyancy changes can now be corrected, further improving the quality of the data. Validation measurements focusing on the sorption isotherms, swelling, and bubble geometry of 1-butyl-3-methylimidazolium tetrafluoroborate with nitrogen at 303 K and up to 17 MPa, as well as with carbon dioxide at 303 K, 323 K, and 373 K at up to 3.5 MPa, were completed. Absorption of nitrogen resulted in no observable volume change, whereas absorption of carbon dioxide resulted in temperature-independent swelling of up to 9.8%. The gas bubble's structure and behavior during its ascent through the liquid were optically tracked in situ. Combining these two data sets with the absorption kinetics forms the basis for determining mass transfer coefficients that are independent of the measuring system and applicable in other laboratory-scale and industrial processes.
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory guidelines, such as the Guidance for Industry on Process Validation (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Sciences Authority, Singapore, Guidance for Product Quality Review (2008), and International Organization for Standardization ISO 9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes and the determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determining the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among the quality control parameters.
Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study on process. Interpretation of such a study provides information about stability, process variability, changing of trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts provides the least capable and most variable process that is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
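The capability analysis described above centers on the Cp and Cpk indices, which relate the specification window to the observed process spread. A minimal sketch follows; the specification limits and tablet-weight sample are invented for illustration, not data from the study:

```python
import statistics

# Process-capability sketch: Cp (potential capability) and Cpk (actual
# capability, penalizing off-center processes). Limits and weights below
# are invented, not the study's data.

def capability(data, lsl, usl):
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)                # sample standard deviation
    cp = (usl - lsl) / (6.0 * sigma)              # spec width vs. 6-sigma spread
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma) # nearest limit vs. 3-sigma
    return cp, cpk

weights = [98.0, 99.0, 100.0, 101.0, 102.0]       # tablet weights, mg (invented)
cp, cpk = capability(weights, lsl=94.0, usl=106.0)
sigma_level = 3.0 * cpk                           # short-term sigma capability
```

A six sigma-capable process in this framing corresponds to Cpk ≥ 2, i.e. the nearest specification limit sits at least six standard deviations from the process mean.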
US DOE Regional Test Centers Program - 2016 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, Joshua
The US Department of Energy’s Regional Test Center (RTC) program provides outdoor validation and bankability data for innovative solar technologies at five sites across the US representing a range of climate conditions. Data helps get new technologies to market faster and improves US industry competitiveness. Managed by Sandia National Laboratories and the National Renewable Energy Laboratory (NREL), the RTC program partners with US manufacturers of photovoltaic (PV) technologies, including modules, inverters, and balance-of-system equipment. The study is collaborative, with manufacturers (also known as RTC industry partners) and the national labs working together on a system design and validation strategy that meets a clearly defined set of performance and reliability objectives.
Real-time monitoring of high-gravity corn mash fermentation using in situ Raman spectroscopy.
Gray, Steven R; Peretti, Steven W; Lamb, H Henry
2013-06-01
In situ Raman spectroscopy was employed for real-time monitoring of simultaneous saccharification and fermentation (SSF) of corn mash by an industrial strain of Saccharomyces cerevisiae. An accurate univariate calibration model for ethanol was developed based on the very strong 883 cm(-1) C-C stretching band. Multivariate partial least squares (PLS) calibration models for total starch, dextrins, maltotriose, maltose, glucose, and ethanol were developed using data from eight batch fermentations and validated using predictions for a separate batch. The starch, ethanol, and dextrins models showed significant prediction improvement when the calibration data were divided into separate high- and low-concentration sets. Collinearity between the ethanol and starch models was avoided by excluding regions containing strong ethanol peaks from the starch model and, conversely, excluding regions containing strong saccharide peaks from the ethanol model. The two-set calibration models for starch (R(2) = 0.998, percent error = 2.5%) and ethanol (R(2) = 0.999, percent error = 2.1%) provide more accurate predictions than any previously published spectroscopic models. Glucose, maltose, and maltotriose are modeled to accuracy comparable to previous work on less complex fermentation processes. Our results demonstrate that Raman spectroscopy is capable of real-time in situ monitoring of a complex industrial biomass fermentation. To our knowledge, this is the first PLS-based chemometric modeling of corn mash fermentation under typical industrial conditions, and the first Raman-based monitoring of a fermentation process with glucose, oligosaccharides and polysaccharides present. Copyright © 2013 Wiley Periodicals, Inc.
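The univariate calibration step (ethanol concentration regressed on the intensity of a single band) is ordinary least squares plus an R² check. The sketch below uses synthetic (band area, concentration) pairs, not the paper's spectra:

```python
# Univariate calibration sketch: least-squares line through synthetic
# (Raman band area, ethanol concentration) pairs, plus R^2. The study's
# actual calibration uses the 883 cm(-1) band; the data here are invented.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def r_squared(xs, ys, slope, intercept):
    my = sum(ys) / len(ys)
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot
```

The multivariate PLS models in the study generalize this idea to many wavenumbers at once, with the region exclusions described above guarding against collinearity.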
Myszka, Kamila; Schmidt, Marcin T; Białas, Wojciech; Olkowicz, Mariola; Leja, Katarzyna; Czaczyk, Katarzyna
2016-09-01
In the process of Pseudomonas fluorescens biofilm formation, N-acyl-l-homoserine lactone (AHL)-mediated flagella synthesis plays a key role. Inhibition of AHL production may attenuate P. fluorescens biofilm on solid surfaces. This work validated the anti-biofilm properties of p-coumaric and gallic acids via the ability of phenolics to suppress AHL synthesis in P. fluorescens KM120. The dependence between synthesis of AHL molecules, expression of flagella gene (flgA) and the ability of biofilm formation by P. fluorescens KM120 on a stainless steel surface (type 304L) was also investigated. Research was carried out in a purpose-built flow cell device. Limitations on AHL synthesis in P. fluorescens KM120 were observed at concentrations of 120 and 240 µmol L(-1) of phenolic acids in medium. At such levels of gallic and p-coumaric acids the ability of P. fluorescens KM120 to synthesize 3-oxo-C6-homoserine lactone (HSL) was not observed. These concentrations caused decreased expression of flgA gene in P. fluorescens KM120. The changes in expression of AHL-dependent flgA gene significantly decreased the rate of microorganism colonization on the stainless steel surface. Phenolic acids are able to inhibit biofilm formation. The results obtained in the work may help to develop alternative techniques for anti-biofilm treatment in the food industry. © 2015 Society of Chemical Industry.
A study of palm biomass processing strategy in Sarawak
NASA Astrophysics Data System (ADS)
Lee, S. J. Y.; Ng, W. P. Q.; Law, K. H.
2017-06-01
Over the past decades, the palm industry has boomed due to its profitable nature. An environmental concern regarding the palm industry is the enormous amount of waste it produces. This waste, or palm biomass, is a significant renewable energy source and a raw material for value-added products such as fiber mats, activated carbon, dried fiber and bio-fertilizer in Malaysia. There is a need to establish a palm biomass industry for the recovery of palm biomass for efficient utilization and waste reduction. The development of the industry depends strongly on two factors: the availability and supply consistency of palm biomass, and the availability of palm biomass processing facilities. In Malaysia, the development of the palm biomass industry is lagging due to the lack of mature commercial technology and difficult logistics planning, a result of the scattered locations of the palm oil mills where palm biomass is generated. Two main studies have been carried out in this research work: i) an industrial study of the feasibility of decentralized and centralized palm biomass processing in Sarawak, and ii) the development of a systematic, optimized palm biomass processing plan for the development of the palm biomass industry in Sarawak, Malaysia. Mathematical optimization is used in this work to model the above scenarios for biomass processing to achieve maximum economic potential and resource feasibility. An industrial study of palm biomass processing strategy in Sarawak has been carried out to evaluate the optimality of centralized versus decentralized processing for the local biomass industry, and an optimal biomass processing strategy is obtained.
Optimization of an asymmetric thin-walled tube in rotary draw bending process
NASA Astrophysics Data System (ADS)
Xue, Xin; Liao, Juan; Vincze, Gabriela; Gracio, Jose J.
2013-12-01
Rotary draw bending is one of the advanced thin-walled tube forming processes, offering high efficiency, low consumption and good flexibility in several industries such as automotive, aerospace and shipbuilding. However, it may cause undesirable deformations such as over-thinning and ovalization, which weaken the tube's strength and complicate the assembly process, respectively. Accurate modeling and effective design optimization to eliminate or reduce undesirable deformations in the tube bending process have been a challenging topic. In this paper, in order to study the deformation behavior of an asymmetric thin-walled tube in the rotary draw bending process, a 3D elastic-plastic finite element model was built in the ABAQUS environment, and the reliability of the model was validated by comparison with experiment. The deformation mechanism of the thin-walled tube in bending was then briefly analyzed, and the effects of wall thickness ratio, section height-to-width ratio and mandrel extension on wall thinning and ovalization were investigated using Response Surface Methodology. Finally, a multi-objective optimization method was used to obtain an optimum solution for the design variables based on the simulation results.
Industry-academic partnerships: an approach to accelerate innovation.
Chen, Jennwood; Pickett, Timothy; Langell, Ashley; Trane, Ashley; Charlesworth, Brian; Loken, Kris; Lombardo, Sarah; Langell, John T
2016-09-01
Biotechnology companies are process-driven organizations and often struggle with their ability to innovate. Universities, on the other hand, thrive on discovery and variation as a source of innovation. As such, properly structured academic-industry partnerships in medical technology development may enhance and accelerate innovation. Through joint industry-academic efforts, our objective was to develop a technology aimed at global cervical cancer prevention. Our Center for Medical Innovation assembled a multidisciplinary team of students, surgical residents, and clinical faculty to enter the University of Utah's annual Bench-to-Bedside competition. Bench-to-Bedside is a university program centered on medical innovation. Teams are given access to university resources and are provided $500.00 for prototype development. Participation by team members is on a volunteer basis. Our industry partner presented the validated need and provided business mentorship. The team studied the therapeutic landscape and environmental constraints, and used simulation to understand human factors design and usage requirements. A physical device was manufactured by first creating a digital image (SOLIDWORKS 3D CAD). Then, using a 3-dimensional printer (Stratasys Objet30 Prime 3D printer), the image was translated into a physical object. Tissue burn depth analysis was performed on raw chicken breasts warmed to room temperature. Varying combinations of time and temperature were tested, and burn depth and diameter were measured 30 min after each trial. An arithmetic mean was calculated for each corresponding time and temperature combination. User comprehension of operation and sterilization was tested via a participant validation study. Clinical obstetricians and gynecologists were given explicit instructions on usage details and then asked to operate the device. Participant behaviors and questions were recorded.
Our efforts resulted in a functional battery-powered hand-held thermocoagulation prototype in just 72 d. Total cost of development was <$500. Proof of concept trials at 100°C demonstrated an average ablated depth and diameter of 4.7 mm and 23.3 mm, respectively, corresponding to treatment efficacy of all grades of precancerous cervical lesions. User comprehension studies showed variable understanding with respect to operation and sterilization instructions. Our experience with using industry-academic partnerships as a means to create medical technologies resulted in the rapid production of a low-cost device that could potentially serve as an integral piece of the "screen-and-treat" approach to premalignant cervical lesions as outlined by World Health Organization. This case study highlights the impact of accelerating medical advances through industry-academic partnership that leverages their combined resources. Published by Elsevier Inc.
Chen, Jing; McGhee, Sarah M; Townsend, Joy; Lam, Tai Hing; Hedley, Anthony J
2015-06-01
Estimates of illicit cigarette consumption are limited and the data obtained from studies funded by the tobacco industry have a tendency to inflate them. This study aimed to validate an industry-funded estimate of 35.9% for Hong Kong using a framework taken from an industry-funded report, but with more transparent data sources. Illicit cigarette consumption was estimated as the difference between total cigarette consumption and the sum of legal domestic sales and legal personal imports (duty-free consumption). Reliable data from government reports and scientifically valid routine sources were used to estimate the total cigarette consumption by Hong Kong smokers and legal domestic sales in Hong Kong. Consumption by visitors and legal duty-free consumption by Hong Kong passengers were estimated under three scenarios for the assumptions to examine the uncertainty around the estimate. A two-way sensitivity analysis was conducted using different levels of possible undeclared smoking and under-reporting of self-reported daily consumption. Illicit cigarette consumption was estimated to be about 8.2-15.4% of the total cigarette consumption in Hong Kong in 2012 with a midpoint estimate of 11.9%, as compared with the industry-funded estimate of 35.9% of cigarette consumption. The industry-funded estimate was inflated by 133-337% of the probable true value. Only with significant levels of under-reporting of daily cigarette consumption and undeclared smoking could we approximate the value reported in the industry-funded study. The industry-funded estimate inflates the likely levels of illicit cigarette consumption. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
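The estimation framework described above is a residual-accounting identity: illicit consumption is what remains of total consumption after legal domestic sales and legal duty-free imports are subtracted. A sketch follows; the stick counts are placeholders chosen only so the share lands near the study's 11.9% midpoint, not the study's actual data:

```python
# Residual-accounting sketch for illicit cigarette consumption:
#   illicit = total - (legal domestic sales + legal duty-free imports).
# The quantities below are placeholder units, not the study's figures.

def illicit_share(total, legal_domestic, legal_duty_free):
    illicit = total - (legal_domestic + legal_duty_free)
    return illicit / total
```

The study's three scenarios vary only the inputs (visitor consumption, duty-free assumptions); the identity itself stays fixed, which is what makes the sensitivity analysis straightforward.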
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
Industrial Chemistry: A Series of New Courses at the Undergraduate Level.
ERIC Educational Resources Information Center
Jasinski, Jerry P.; Miller, Robert E.
1985-01-01
Describes four courses in the undergraduate bachelor of science program in industrial chemistry at Keene State College (NH). They are (1) introduction to industrial chemistry; (2) polymers--synthesis and separation techniques; (3) inorganic industrial processes; and (4) organic industrial processes. (JN)
System for monitoring an industrial or biological process
Gross, Kenneth C.; Wegerich, Stephan W.; Vilim, Rick B.; White, Andrew M.
1998-01-01
A method and apparatus for monitoring and responding to conditions of an industrial process. Industrial process signals, such as repetitive manufacturing, testing and operational machine signals, are generated by a system. Sensor signals characteristic of the process are generated over a time length and compared to reference signals over the time length. The industrial signals are adjusted over the time length relative to the reference signals, the phase shift of the industrial signals is optimized to the reference signals and the resulting signals output for analysis by systems such as SPRT.
System for monitoring an industrial or biological process
Gross, K.C.; Wegerich, S.W.; Vilim, R.B.; White, A.M.
1998-06-30
A method and apparatus are disclosed for monitoring and responding to conditions of an industrial process. Industrial process signals, such as repetitive manufacturing, testing and operational machine signals, are generated by a system. Sensor signals characteristic of the process are generated over a time length and compared to reference signals over the time length. The industrial signals are adjusted over the time length relative to the reference signals, the phase shift of the industrial signals is optimized to the reference signals and the resulting signals output for analysis by systems such as SPRT. 49 figs.
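The SPRT referenced in both patent records is, in its textbook Gaussian form, a running log-likelihood-ratio test with Wald thresholds. The sketch below is that textbook form with invented fault-magnitude parameters; it is not the patented implementation:

```python
import math

# Textbook Gaussian SPRT sketch: accumulate the log-likelihood ratio of
# "residual mean = m1 (fault)" vs. "residual mean = 0 (normal)" and stop
# at Wald's thresholds. Parameters (m1, sigma, alpha, beta) are assumed.

def sprt(residuals, m1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1.0 - beta) / alpha)   # decide "fault"
    lower = math.log(beta / (1.0 - alpha))   # decide "normal"
    llr = 0.0
    for i, r in enumerate(residuals):
        llr += (m1 / sigma ** 2) * (r - m1 / 2.0)  # Gaussian LLR increment
        if llr >= upper:
            return 'fault', i
        if llr <= lower:
            return 'normal', i
    return 'undecided', len(residuals) - 1
```

Applied to sensor-minus-reference residuals like those described above, the test flags a sustained drift after only a handful of samples while controlling both false- and missed-alarm rates.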
Modelling and analysis of solar cell efficiency distributions
NASA Astrophysics Data System (ADS)
Wasmer, Sven; Greulich, Johannes
2017-08-01
We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrial feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiencies of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by only requiring two common measurements of finished cells. The presented approaches can be especially helpful for ramping-up production, but can also be applied to enhance established manufacturing.
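The variance-based sensitivity analysis this record describes can be sketched with a plain Monte Carlo estimate. The two-input efficiency metamodel below is a made-up stand-in for the paper's PERC metamodel, and the one-at-a-time variance ratio approximates Sobol first-order indices only for near-additive models:

```python
import random

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def first_order_indices(model, bounds, n=20000, seed=1):
    """Approximate first-order sensitivity indices for a (near-)additive
    metamodel: vary one uniform input at a time, hold others at midpoint,
    and compare the induced output variance to the total variance."""
    rng = random.Random(seed)
    mids = [(lo + hi) / 2 for lo, hi in bounds]
    total = variance([model([rng.uniform(lo, hi) for lo, hi in bounds])
                      for _ in range(n)])
    indices = []
    for i, (lo, hi) in enumerate(bounds):
        ys = []
        for _ in range(n):
            x = mids[:]
            x[i] = rng.uniform(lo, hi)  # only input i varies
            ys.append(model(x))
        indices.append(variance(ys) / total)
    return indices

# Hypothetical linear efficiency metamodel: input 0 dominates.
eff = lambda x: 17.0 + 0.8 * x[0] + 0.2 * x[1]
```

For the linear toy model the first input should claim roughly 94% of the output variance, mirroring how the paper identifies which process parameters most need tightening.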
Bayar, Nadia; Friji, Marwa; Kammoun, Radhouane
2018-02-15
In this study, pectin was isolated from Opuntia ficus indica (OFI) cladodes after removing mucilage using xylanase and cellulase. The process variables were optimized by a Box-Behnken design with three factors at three levels. The optimal extraction conditions obtained were liquid-to-solid (LS), cellulase-to-xylanase, and enzymes-to-matter ratios of 22 ml/g, 2:1 U/U and 4 U/g, respectively. The simulated extraction yield of 17.91% was validated by the experimental result (16.67 ± 0.30). The enzyme-extracted pectin from OFI cladodes (EAEPC) was low-methylated, with a high uronic acid content, water and oil holding capacities of 5.42 g/g and 1.23 g/g, respectively, good foam and emulsion stability, and important DPPH radical scavenging activity. Both OFI cladodes and the enzymatic process present promising alternatives to traditional sources and extraction processes of pectin, respectively. EAEPC thus represents a promising additive for the food industry. Copyright © 2017. Published by Elsevier Ltd.
An optimization method for defects reduction in fiber laser keyhole welding
NASA Astrophysics Data System (ADS)
Ai, Yuewei; Jiang, Ping; Shao, Xinyu; Wang, Chunming; Li, Peigen; Mi, Gaoyang; Liu, Yang; Liu, Wei
2016-01-01
Laser welding has been widely used in automotive, power, chemical, nuclear and aerospace industries. The quality of welded joints is closely related to the existing defects which are primarily determined by the welding process parameters. This paper proposes a defects optimization method that takes the formation mechanism of welding defects and weld geometric features into consideration. The analysis of welding defects formation mechanism aims to investigate the relationship between welding defects and process parameters, and weld features are considered to identify the optimal process parameters for the desired welded joints with minimum defects. The improved back-propagation neural network possessing good modeling for nonlinear problems is adopted to establish the mathematical model and the obtained model is solved by genetic algorithm. The proposed method is validated by macroweld profile, microstructure and microhardness in the confirmation tests. The results show that the proposed method is effective at reducing welding defects and obtaining high-quality joints for fiber laser keyhole welding in practical production.
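The record couples a back-propagation network (as surrogate model) with a genetic algorithm. The sketch below shows only the GA half, minimizing a hypothetical quadratic "defect score" in place of a trained network; population size, operators, and the surrogate itself are assumptions:

```python
import random

def genetic_minimize(f, bounds, pop_size=40, generations=60, seed=2):
    """Minimal real-coded genetic algorithm: tournament selection,
    blend crossover, Gaussian mutation. Returns (best params, best value)."""
    rng = random.Random(seed)
    def clip(v, lo, hi):
        return max(lo, min(hi, v))
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = min(pop, key=f)
    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            a = min(rng.sample(pop, 3), key=f)  # tournament selection
            b = min(rng.sample(pop, 3), key=f)
            # blend crossover plus Gaussian mutation, clipped to bounds
            child = [clip(0.5 * (x + y) + rng.gauss(0, 0.1 * (hi - lo)), lo, hi)
                     for x, y, (lo, hi) in zip(a, b, bounds)]
            children.append(child)
        pop = children
        cand = min(pop, key=f)
        if f(cand) < f(best):
            best = cand
    return best, f(best)

# Hypothetical surrogate: defect score minimized at power=2.0, speed=1.5.
defects = lambda p: (p[0] - 2.0) ** 2 + (p[1] - 1.5) ** 2
```

In the paper's workflow the quadratic would be replaced by the trained back-propagation model mapping process parameters to predicted weld defects.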
NASA Technical Reports Server (NTRS)
Kuo, Kenneth K.; Lu, Y. C.; Chiaverini, Martin J.; Harting, George C.
1994-01-01
An experimental study on the fundamental processes involved in fuel decomposition and boundary layer combustion in hybrid rocket motors is being conducted at the High Pressure Combustion Laboratory of the Pennsylvania State University. This research should provide an engineering technology base for development of large scale hybrid rocket motors as well as a fundamental understanding of the complex processes involved in hybrid propulsion. A high pressure slab motor has been designed for conducting experimental investigations. Oxidizer (LOX or GOX) is injected through the head-end over a solid fuel (HTPB) surface. Experiments using fuels supplied by NASA designated industrial companies will also be conducted. The study focuses on the following areas: measurement and observation of solid fuel burning with LOX or GOX, correlation of solid fuel regression rate with operating conditions, measurement of flame temperature and radical species concentrations, determination of the solid fuel subsurface temperature profile, and utilization of experimental data for validation of a companion theoretical study also being conducted at PSU.
NASA Astrophysics Data System (ADS)
Ghafuri, Mohazabeh; Golfar, Bahareh; Nosrati, Mohsen; Hoseinkhani, Saman
2014-12-01
The process of ATP production is one of the most vital processes in living cells and occurs with high efficiency. Thermodynamic evaluation of this process and of the factors involved in oxidative phosphorylation can provide a valuable guide for increasing energy production efficiency in research and industry. Although energy transduction has been studied qualitatively in several studies, there are only a few brief reviews based on mathematical models of this subject. In our previous work, we suggested a mathematical model for ATP production based on non-equilibrium thermodynamic principles. In the present study, based on new discoveries about the respiratory chain of animal mitochondria, Golfar's model has been used to generate improved results for the efficiency of oxidative phosphorylation and the rate of energy loss. The results calculated from the modified coefficients for the proton pumps of the respiratory chain enzymes are closer to the experimental results and validate the model.
Jones, Kelly K; Zenk, Shannon N; Tarlov, Elizabeth; Powell, Lisa M; Matthews, Stephen A; Horoi, Irina
2017-01-07
Food environment characterization in health studies often requires data on the location of food stores and restaurants. While commercial business lists are commonly used as data sources for such studies, current literature provides little guidance on how to use validation study results to make decisions on which commercial business list to use and how to maximize the accuracy of those lists. Using data from a retrospective cohort study [Weight And Veterans' Environments Study (WAVES)], we (a) explain how validity and bias information from existing validation studies (count accuracy, classification accuracy, locational accuracy, as well as potential bias by neighborhood racial/ethnic composition, economic characteristics, and urbanicity) were used to determine which commercial business listing to purchase for retail food outlet data and (b) describe the methods used to maximize the quality of the data and results of this approach. We developed data improvement methods based on existing validation studies. These methods included purchasing records from commercial business lists (InfoUSA and Dun and Bradstreet) based on store/restaurant names as well as standard industrial classification (SIC) codes, reclassifying records by store type, improving geographic accuracy of records, and deduplicating records. We examined the impact of these procedures on food outlet counts in US census tracts. After cleaning and deduplicating, our strategy resulted in a 17.5% reduction in the count of food stores that were valid from those purchased from InfoUSA and 5.6% reduction in valid counts of restaurants purchased from Dun and Bradstreet. Locational accuracy was improved for 7.5% of records by applying street addresses of subsequent years to records with post-office (PO) box addresses. In total, up to 83% of US census tracts annually experienced a change (either positive or negative) in the count of retail food outlets between the initial purchase and the final dataset. 
Our study provides a step-by-step approach to purchase and process business list data obtained from commercial vendors. The approach can be followed by studies of any size, including those with datasets too large to process each record by hand and will promote consistency in characterization of the retail food environment across studies.
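A minimal sketch of the kind of record deduplication the WAVES workflow describes; the normalization rules (suffix list, punctuation stripping) are illustrative assumptions, not the study's actual matching criteria:

```python
import re

def normalize(name, address):
    """Crude matching key: lowercase, strip punctuation, drop common
    corporate suffixes (the suffix list here is an assumption)."""
    def clean(s):
        s = re.sub(r"[^a-z0-9 ]", "", s.lower())
        return re.sub(r"\s+", " ", s).strip()
    name = re.sub(r"\b(inc|llc|co|corp)\b", "", clean(name))
    return (re.sub(r"\s+", " ", name).strip(), clean(address))

def deduplicate(records):
    """Keep the first record seen for each normalized (name, address) key."""
    seen, unique = set(), []
    for rec in records:
        key = normalize(rec["name"], rec["address"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

Applied to two vendor lists, "Joe's Grocery, Inc." at "12 Main St." and "joes grocery" at "12 main st" collapse to one outlet, which is the mechanism behind the count reductions the study reports.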
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreau, J.W.
1980-12-01
This engineering and economic study evaluated the potential for developing a geothermal industrial park in the Puna District near Pahoa on the Island of Hawaii. Direct heat industrial applications were analyzed from a marketing, engineering, economic, environmental, and sociological standpoint to determine the most viable industries for the park. An extensive literature search produced 31 existing processes currently using geothermal heat. An additional list was compiled indicating industrial processes that require heat that could be provided by geothermal energy. From this information, 17 possible processes were selected for consideration. Careful scrutiny and analysis of these 17 processes revealed three that justified detailed economic workups. The three processes chosen for detailed analysis were: an ethanol plant using bagasse and wood as feedstock; a cattle feed mill using sugar cane leaf trash as feedstock; and a papaya processing facility providing both fresh and processed fruit. In addition, a research facility to assess and develop other processes was treated as a concept. Consideration was given to the impediments to development, the engineering process requirements and the governmental support for each process. The study describes the geothermal well site chosen, the pipeline to transmit the hydrothermal fluid, and the infrastructure required for the industrial park. A conceptual development plan for the ethanol plant, the feedmill and the papaya processing facility was prepared. The study concluded that a direct heat industrial park in Pahoa, Hawaii, involves considerable risks.
Metrology - Beyond the Calibration Lab
NASA Technical Reports Server (NTRS)
Mimbs, Scott M.
2008-01-01
We rely on data from measurements every day; a gas-pump, a speedometer, and a supermarket weight scale are just three examples of measurements we use to make decisions. We generally accept the data from these measurements as "valid." One reason we can accept the data is the "legal metrology" requirements established and regulated by the government in matters of commerce. The measurement data used by NASA, other government agencies, and industry can be critical to decisions which affect everything from economic viability, to mission success, to the security of the nation. Measurement data can even affect life and death decisions. Metrology requirements must adequately provide for risks associated with these decisions. To do this, metrology must be integrated into all aspects of an industry including research, design, testing, and product acceptance. Metrology, the science of measurement, has traditionally focused on the calibration of instruments, and although instrument calibration is vital, it is only a part of the process that assures quality in measurement data. For example, measurements made in research can influence the fundamental premises that establish the design parameters, which then flow down to the manufacturing processes, and eventually impact the final product. Because a breakdown can occur anywhere within this cycle, measurement quality assurance has to be integrated into every part of the life-cycle process starting with the basic research and ending with the final product inspection process. The purpose of this paper is to discuss the role of metrology in the various phases of a product's life-cycle. For simplicity, the cycle will be divided in four broad phases, with discussions centering on metrology within NASA. .
Scientific Data Purchase Project Overview Presentation
NASA Technical Reports Server (NTRS)
Holekamp, Kara; Fletcher, Rose
2001-01-01
The Scientific Data Purchase (SDP) project acquires science data from commercial sources. It is a demonstration project to test a new way of doing business, tap new sources of data, support Earth science research, and support the commercial remote sensing industry. Phase I of the project reviews simulated/prototypical data sets from 10 companies. Phase II of the project is a 3 year purchase/distribution of select data from 5 companies. The status of several SDP projects is reviewed in this viewgraph presentation, as is the SDP process of tasking, verification, validation, and data archiving. The presentation also lists SDP results for turnaround time, metrics, customers, data use, science research, applications research, and user feedback.
Drug discovery in the next millennium.
Ohlstein, E H; Ruffolo, R R; Elliott, J D
2000-01-01
Selection and validation of novel molecular targets have become of paramount importance in light of the plethora of new potential therapeutic drug targets that have emerged from human gene sequencing. In response to this revolution within the pharmaceutical industry, the development of high-throughput methods in both biology and chemistry has been necessitated. This review addresses these technological advances as well as several new areas that have been created by necessity to deal with this new paradigm, such as bioinformatics, cheminformatics, and functional genomics. With many of these key components of future drug discovery now in place, it is possible to map out a critical path for this process that will be used into the new millennium.
Wind Plant Performance Prediction (WP3) Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craig, Anna
The methods for analysis of operational wind plant data are highly variable across the wind industry, leading to high uncertainties in the validation and bias-correction of preconstruction energy estimation methods. Lack of credibility in the preconstruction energy estimates leads to significant impacts on project financing and therefore the final levelized cost of energy for the plant. In this work, the variation in the evaluation of a wind plant's operational energy production as a result of variations in the processing methods applied to the operational data is examined. Preliminary results indicate that the selection of the filters applied to the data and the filter parameters can have significant impacts on the final computed assessment metrics.
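The sensitivity of assessment metrics to filter choices can be seen even in a toy example; the cutoff semantics below are an assumption, not the project's actual filter set:

```python
def filtered_energy_kwh(power_kw, dt_h=1.0, min_kw=None):
    """Total plant energy after optionally dropping samples below a cutoff.
    Different filter parameters yield different 'operational' totals,
    which is the variability the WP3 project examines."""
    kept = power_kw if min_kw is None else [p for p in power_kw if p >= min_kw]
    return sum(kept) * dt_h
```

With hourly samples of [-5, 50, 100] kW (the negative value representing parasitic consumption), the unfiltered total is 145 kWh, while a zero-floor filter reports 150 kWh: the metric moved without the plant changing.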
Assurance of COTS Boards for Space Flight. Part 1
NASA Technical Reports Server (NTRS)
Plante, Jeannette; Helmold, Norm; Eveland, Clay
1998-01-01
Space Flight hardware and software designers are increasingly turning to Commercial-Off-the-Shelf (COTS) products in hopes of meeting the demands imposed on them by projects with short development cycle times. The Technology Validation Assurance (TVA) team at NASA GSFC has embarked on applying a method for inserting COTS hardware into the Spartan 251 spacecraft. This method includes Procurement, Characterization, Ruggedization/Remediation, and Verification Testing process steps, which are intended to increase the user's confidence in the hardware's ability to function in the intended application for the required duration. As this method is refined with use, it has the potential for becoming a benchmark for industry-wide use of COTS in high-reliability systems.
Fugitive Methane Gas Emission Monitoring in oil and gas industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Levente
Identifying fugitive methane leaks allows optimization of the extraction process, can extend gas extraction equipment lifetime, and eliminates hazardous work conditions. We demonstrate a wireless sensor network based on cost-effective and robust chemi-resistive methane sensors, combined with real-time analytics, to identify leaks from 2 scfh to 10,000 scfh. The chemi-resistive sensors were validated for sensitivity better than 1 ppm of methane plume detection. The real-time chemical sensor and wind data are integrated into an inversion model to identify the location and magnitude of the methane leak. This integrated solution can be deployed in outdoor environments for long-term monitoring of chemical plumes.
Austin, S Bryn; Gordon, Allegra R; Kennedy, Grace A; Sonneville, Kendrin R; Blossom, Jeffrey; Blood, Emily A
2013-12-06
Cosmetic procedures have proliferated rapidly over the past few decades, with over $11 billion spent on cosmetic surgeries and other minimally invasive procedures and another $2.9 billion spent on U.V. indoor tanning in 2012 in the United States alone. While research interest is increasing in tandem with the growth of the industry, methods have yet to be developed to identify and geographically locate the myriad types of businesses purveying cosmetic procedures. Geographic location of cosmetic-procedure businesses is a critical element in understanding the public health impact of this industry; however no studies we are aware of have developed valid and feasible methods for spatial analyses of these types of businesses. The aim of this pilot validation study was to establish the feasibility of identifying businesses offering surgical and minimally invasive cosmetic procedures and to characterize the spatial distribution of these businesses. We developed and tested three methods for creating a geocoded list of cosmetic-procedure businesses in Boston (MA) and Seattle (WA), USA, comparing each method on sensitivity and staff time required per confirmed cosmetic-procedure business. Methods varied substantially. Our findings represent an important step toward enabling rigorous health-linked spatial analyses of the health implications of this little-understood industry.
Lima, Gustavo F; Freitas, Victor C G; Araújo, Renan P; Maitelli, André L; Salazar, Andrés O
2017-09-15
The pipeline inspection using a device called Pipeline Inspection Gauge (PIG) is safe and reliable when the PIG moves at low speed during inspection. We built a Testing Laboratory, containing a testing loop and supervisory system, to study speed control techniques for PIGs. The objective of this work is to present and validate the Testing Laboratory, which will allow development of a speed controller for PIGs and solve an existing problem in the oil industry. The experimental methodology used throughout the project is also presented. We installed pressure transducers on the pipeline's outer walls to detect the PIG's movement and, with data from the supervisory system, calculated an average speed of 0.43 m/s. At the same time, the electronic board inside the PIG received data from an odometer and calculated an average speed of 0.45 m/s. We found an error of 4.44%, which is experimentally acceptable. The results showed that it is possible to successfully build a Testing Laboratory to detect the PIG's passage and estimate its speed. The validation of the Testing Laboratory using data from the odometer and its auxiliary electronics was very successful. Lastly, we hope to develop more research in the oil industry using this Testing Laboratory.
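The speed figures in this record follow from simple distance/time arithmetic; a sketch under assumed sensor positions and timestamps (the 4.44% figure is reproduced by taking the odometer speed as the reference):

```python
def average_speed(positions_m, timestamps_s):
    """Average speed between the first and last passage events recorded
    by pressure transducers mounted along the test loop."""
    return (positions_m[-1] - positions_m[0]) / (timestamps_s[-1] - timestamps_s[0])

def relative_error_pct(measured, reference):
    """Percent deviation of one speed estimate from a reference estimate."""
    return abs(measured - reference) / reference * 100.0
```

With the supervisory estimate of 0.43 m/s compared against the odometer's 0.45 m/s, `relative_error_pct(0.43, 0.45)` reproduces the abstract's roughly 4.44% discrepancy.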
TIMES-SS--recent refinements resulting from an industrial skin sensitisation consortium.
Patlewicz, G; Kuseva, C; Mehmed, A; Popova, Y; Dimitrova, G; Ellis, G; Hunziker, R; Kern, P; Low, L; Ringeissen, S; Roberts, D W; Mekenyan, O
2014-01-01
The TImes MEtabolism Simulator platform for predicting Skin Sensitisation (TIMES-SS) is a hybrid expert system, first developed at Bourgas University using funding and data from a consortium of industry and regulators. TIMES-SS encodes structure-toxicity and structure-skin metabolism relationships through a number of transformations, some of which are underpinned by mechanistic 3D QSARs. The model estimates semi-quantitative skin sensitisation potency classes and has been developed with the aim of minimising animal testing, and also to be scientifically valid in accordance with the OECD principles for (Q)SAR validation. In 2007 an external validation exercise was undertaken to fully address these principles. In 2010, a new industry consortium was established to coordinate research efforts in three specific areas: refinement of abiotic reactions in the skin (namely autoxidation), refinement of the manner in which chemical reactivity is captured in terms of structure-toxicity rules (inclusion of alert reliability parameters), and definition of the domain based on the underlying experimental data (study of discrepancies between the Local Lymph Node Assay (LLNA) and the Guinea Pig Maximisation Test (GPMT)). The present paper summarises the progress of these activities and explains how the insights derived have been translated into refinements, resulting in increased confidence and transparency in the robustness of the TIMES-SS predictions.
Environmental effects of interstate power trading on electricity consumption mixes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joe Marriott; H. Scott Matthews
2005-11-15
Although many studies of electricity generation use national or state average generation mix assumptions, in reality a great deal of electricity is transferred between states with very different mixes of fossil and renewable fuels, and using the average numbers could result in incorrect conclusions in these studies. The authors create electricity consumption profiles for each state and for key industry sectors in the U.S. based on existing state generation profiles, net state power imports, industry presence by state, and an optimization model to estimate interstate electricity trading. Using these 'consumption mixes' can provide a more accurate assessment of electricity use in life-cycle analyses. It is concluded that the published generation mixes for states that import power are misleading, since the power consumed in-state has a different makeup than the power that was generated. And, while most industry sectors have consumption mixes similar to the U.S. average, some of the most critical sectors of the economy - such as resource extraction and material processing sectors - are very different. This result does validate the average mix assumption made in many environmental assessments, but it is important to accurately quantify the generation methods for electricity used when doing life-cycle analyses. 16 refs., 7 figs., 2 tabs.
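The 'consumption mix' idea reduces to blending in-state generation with the mixes of imported power; a sketch with made-up numbers (a real analysis must also net out exports and line losses, which this ignores):

```python
def consumption_mix(generation_mwh, imports):
    """Blend a state's own generation mix with imported power.

    generation_mwh: {fuel: MWh generated in-state}
    imports: list of (mwh, {fuel: share}) pairs, one per exporting state.
    Returns fuel shares of total consumption."""
    totals = dict(generation_mwh)
    for mwh, mix in imports:
        for fuel, share in mix.items():
            totals[fuel] = totals.get(fuel, 0.0) + mwh * share
    grand = sum(totals.values())
    return {fuel: mwh / grand for fuel, mwh in totals.items()}
```

A state generating 50/50 coal and hydro but importing an equal amount of all-coal power consumes a 75/25 mix, illustrating why published generation mixes can mislead for importing states.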
Xiao, Yong-Qing; Li, Li; Liu, Ying; Ma, Yin-Lian; Yu, Ding-Rong
2016-01-01
To elucidate the key issues in the development and innovation of the traditional Chinese medicine processing discipline and the Chinese herbal pieces industry. Based on the author's experience accumulated over many years and the demands of the development of the Chinese herbal pieces industry, the key issues in the development and innovation of the Chinese herbal pieces industry were summarized. According to the author, the traditional Chinese medicine processing discipline should focus on applied basic research. The development of this discipline should be closely related to the development of Chinese herbal pieces. The traditional Chinese medicine processing discipline can be improved and its results transformed only if the discipline is correlated with the Chinese herbal pieces industry, matched with the development of that industry, and addresses the problems in its development. The development of the traditional Chinese medicine processing discipline and the Chinese herbal pieces industry also requires scientific researchers to make constant innovations, specialize their research, and innovate on the basis of inheritance. Copyright© by the Chinese Pharmaceutical Association.
The prevalence of occupational dermatitis in the UK printing industry
Livesley, E; Rushton, L; English, J; Williams, H
2002-01-01
Aims: To quantify occupational ill health resulting from dermatitis in the UK printing industry and to explore links with particular processes and activities. Methods: Approximately 2600 members of the Graphical, Paper and Media Union living in Nottinghamshire were sent a self completion questionnaire. A sample of respondents, both those who reported current skin problems and those who did not, were invited for a short dermatological examination. Results: The overall response rate was 62%. A total of 1189 respondents were directly involved in the printing industry and categorised according to work in pre-press (25%), printing (46%), or finishing (42%) processes. A total of 490 respondents (41%) self reported having a skin complaint at some time. Prevalence was highest in males (43%) and those working in printing (49%), in particular those who cleaned rollers and cylinders or who came into contact with substances containing isocyanates on a daily basis. The most commonly affected areas reported were the fingers and webs between the fingers. Twenty six per cent of the 490 reported a current problem on the hand. Reported symptoms included itching (61%), rash (58%), and dry skin (56%). Although certain printing industry substances were thought by respondents to aggravate their condition, constant washing and friction was most often cited. Reported use of protective equipment and cleansing products was generally high, particularly by printers. Clinical examination confirmed the high self reported prevalence and also identified a substantial proportion of mild cases which were not reported. The overall prevalence of occupationally related skin complaints is estimated to be 40%. Conclusions: A much higher prevalence of dermatitis has been identified than from routine surveillance schemes. The use of good quality records from unions with high membership facilitated access to workers across a range of company sites and printing processes. 
Validation of self reported symptoms through clinical examination was shown to be essential. The importance of non-chemical causes of dermatitis was highlighted. The findings point towards the need for the development of effective and acceptable risk reduction strategies, in particular to reduce water contact and friction. PMID:12107299
Status and plans for the ANOPP/HSR prediction system
NASA Technical Reports Server (NTRS)
Nolan, Sandra K.
1992-01-01
ANOPP is a comprehensive prediction system which was developed and validated by NASA. Because ANOPP is a system prediction program, it allows aerospace industry researchers to perform trade-off studies on a variety of aircraft noise problems. The extensive validation of ANOPP allows the program results to be used as a benchmark for testing other prediction codes.
Engagement DEOCS 4.1 Construct Validity Summary
2017-08-01
Defense Equal Opportunity Management Institute, Directorate of... increasingly popular construct in industry and research. Indeed, management literature suggests employee engagement is the key to an organization's... definition was drawn upon to inform the creation of a definition and measure of engagement that was then adapted using subject matter expert (SME)
ERIC Educational Resources Information Center
Peltier, James W.; Cummins, Shannon; Pomirleanu, Nadia; Cross, James; Simon, Rob
2014-01-01
Students' desire and intention to pursue a career in sales continue to lag behind industry demand for sales professionals. This article develops and validates a reliable and parsimonious scale for measuring and predicting student intention to pursue a selling career. The instrument advances previous scales in three ways. The instrument is…
ERIC Educational Resources Information Center
Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter
2017-01-01
Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…
Authentication of Electromagnetic Interference Removal in Johnson Noise Thermometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Britton Jr, Charles L.; Roberts, Michael
This report summarizes the testing performed offsite at the TVA Kingston Fossil Plant (KFP). This location was selected as a valid offsite test facility because the environment is very similar to the expected industrial nuclear power plant environment. This report discusses the EMI discovered in the environment, the validity of the removal technique, and results from the measurements.
ERIC Educational Resources Information Center
Rogers, Sandra K.; Dahlberg, Maurine F.
This report documents a statewide competency validation project to provide current information about job skills considered "important to know" by Texas industrial experts in the areas of auto mechanics, diesel mechanics, office occupations, print shop trades, and welding. Section 1 describes the steps used to conduct the study and…
Development and Validation of a Computer Interactive Test Battery.
ERIC Educational Resources Information Center
Sheppard, Valarie A.; Baker, Todd A.; Gebhardt, Deborah L.; Leonard, Kristine M.
The purpose of this project was to develop valid evaluation procedures for the selection of Container Equipment Operators (CEOs) in the shipping industry. A job analysis was conducted to identify the essential tasks of the CEO job. Site visits, a task inventory, and the determination of essential tasks were used in the job analysis. The skills and…
1980-01-15
KEY WORDS: Strategic Targeting; Copper Industry; INDATAK ...develop, debug and test an industrial simulation model (INDATAK) using the LOGATAK model as a point of departure. The copper processing industry is...significant processes in the copper industry, including the transportation network connecting the processing elements, have been formatted for use in
Evaluation of the whole body physiologically based pharmacokinetic (WB-PBPK) modeling of drugs.
Munir, Anum; Azam, Shumaila; Fazal, Sahar; Bhatti, A I
2018-08-14
Physiologically based pharmacokinetic (PBPK) modeling is a supporting tool in drug discovery and development. Simulations produced by these models help to save time and aid in examining the effects of different variables on the pharmacokinetics of drugs. For this purpose, Sheila and Peters suggested a PBPK model capable of performing simulations to study a given drug's absorption. There is a need to extend this model to the whole body, entailing the other processes of distribution, metabolism, and elimination besides absorption. The aim of this study is to propose a WB-PBPK model by integrating absorption, distribution, metabolism, and elimination processes with the existing PBPK model. Absorption, distribution, metabolism, and elimination models are designed, integrated with the PBPK model, and validated. For validation purposes, clinical records of a few drugs were collected from the literature. The developed WB-PBPK model is affirmed by comparing the simulations produced by the model against the collected clinical data. It is proposed that the WB-PBPK model may be used in the pharmaceutical industry to create pharmacokinetic profiles of drug candidates for better outcomes, as it is an advanced PBPK model and creates comprehensive PK profiles for drug ADME as concentration-time plots. Copyright © 2018 Elsevier Ltd. All rights reserved.
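The compartmental idea behind such models can be illustrated with a minimal sketch. This is not the WB-PBPK model from the abstract, which links many physiological organ compartments by blood flow; it is a single well-stirred compartment with an assumed dose, volume, and elimination rate constant, chosen only to show the concentration-time profile such models produce.

```python
# Minimal one-compartment PK sketch (illustrative only; a WB-PBPK model
# couples many organ compartments, each with its own mass balance).

def simulate_concentration(dose_mg, volume_l, k_el_per_h, t_end_h=12.0, dt_h=0.01):
    """Euler integration of dC/dt = -k_el * C after an IV bolus dose."""
    conc = dose_mg / volume_l  # initial concentration, mg/L
    times, concs = [0.0], [conc]
    t = 0.0
    while t < t_end_h:
        conc += -k_el_per_h * conc * dt_h  # first-order elimination
        t += dt_h
        times.append(t)
        concs.append(conc)
    return times, concs

# Hypothetical drug: 100 mg IV bolus, 40 L volume, k_el = 0.3 /h
times, concs = simulate_concentration(dose_mg=100.0, volume_l=40.0, k_el_per_h=0.3)
```

A whole-body model would replace the single `conc` state with one balance per organ, linked by arterial and venous blood flows, which is what turns a concentration-time plot into a full ADME profile.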
Lifecycle assessment of microalgae to biofuel: Comparison of thermochemical processing pathways
Bennion, Edward P.; Ginosar, Daniel M.; Moses, John; ...
2015-01-16
Microalgae are currently being investigated as a renewable transportation fuel feedstock based on various advantages, including high annual yields, utilization of poor-quality land, no competition with food crops, and the ability to integrate with various waste streams. This study focuses on directly assessing the impact of two different thermochemical conversion technologies on the microalgae-to-biofuel process through life cycle assessment. A "well to pump" (WTP) system boundary is defined and includes sub-process models of growth, dewatering, thermochemical bio-oil recovery, bio-oil stabilization, conversion to renewable diesel, and transport to the pump. Models were validated with experimental and literature data and are representative of an industrial-scale microalgae-to-biofuel process. Two different thermochemical bio-oil conversion systems are modeled and compared on a systems level: hydrothermal liquefaction (HTL) and pyrolysis. The environmental impacts of the two pathways were quantified on the metrics of net energy ratio (NER), defined here as energy consumed over energy produced, and greenhouse gas (GHG) emissions. Results for WTP biofuel production through the HTL pathway were determined to be 1.23 for the NER and GHG emissions of -11.4 g CO2-eq (MJ renewable diesel)-1. WTP biofuel production through the pyrolysis pathway results in a NER of 2.27 and GHG emissions of 210 g CO2-eq (MJ renewable diesel)-1. The large environmental impact associated with the pyrolysis pathway is attributed to feedstock drying requirements and combustion of co-products to improve system energetics. Discussion focuses on a detailed breakdown of the overall process energetics and GHGs, the impact of modeling at laboratory scale compared to industrial scale, environmental impact sensitivity to engineering systems input parameters for future focused research and development, and a comparison of results to literature.
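The two metrics above are simple ratios once a process inventory exists. The sketch below implements the definitions as stated in the abstract (NER as energy consumed over energy produced; GHG intensity per MJ of fuel at the pump); the inventory numbers fed in are made-up placeholders, not the study's actual LCA data.

```python
def net_energy_ratio(energy_consumed_mj, energy_produced_mj):
    """NER as defined in the study: energy consumed over energy produced.
    Values below 1.0 mean the pathway yields more energy than it uses."""
    return energy_consumed_mj / energy_produced_mj

def ghg_intensity(total_ghg_g_co2eq, fuel_energy_mj):
    """GHG emissions per MJ of renewable diesel delivered to the pump."""
    return total_ghg_g_co2eq / fuel_energy_mj

# Hypothetical inventory, normalized to 1 MJ of fuel at the pump:
ner = net_energy_ratio(energy_consumed_mj=1.23, energy_produced_mj=1.0)
ghg = ghg_intensity(total_ghg_g_co2eq=210.0, fuel_energy_mj=1.0)
```

Note the sign convention: with this definition of NER, a lower value is better, which is why the HTL pathway (1.23) outperforms pyrolysis (2.27) in the study.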
IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
William M. Bond; Salih Ersayin
2007-03-30
This project involved industrial-scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate the findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming the results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, a concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach.
Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern Minnesota, and future proposals are pending with non-taconite mineral processing applications.
Karri, Rama Rao; Sahu, J N
2018-01-15
Zn(II) is one of the common pollutants among heavy metals found in industrial effluents. Removal of pollutants from industrial effluents can be accomplished by various techniques, of which adsorption was found to be an efficient method. Application of adsorption is limited by the high cost of adsorbents. In this regard, a low-cost adsorbent produced from palm oil kernel shell based agricultural waste is examined for its efficiency in removing Zn(II) from wastewater and aqueous solution. The influence of independent process variables such as initial concentration, pH, residence time, activated carbon (AC) dosage, and process temperature on the removal of Zn(II) by palm kernel shell based AC in a batch adsorption process is studied systematically. Based on the design-of-experiments matrix, 50 experimental runs are performed with each process variable in the experimental range. The optimal values of the process variables to achieve maximum removal efficiency are studied using response surface methodology (RSM) and artificial neural network (ANN) approaches. A quadratic model, consisting of first-order and second-order regression terms, is developed using analysis of variance within the RSM-CCD framework. Particle swarm optimization, a meta-heuristic optimization method, is embedded in the ANN architecture to optimize the search space of the neural network. The optimized trained neural network depicts the testing data and validation data well, with R2 equal to 0.9106 and 0.9279, respectively. The outcomes indicate the superiority of the ANN-PSO based model predictions over the quadratic model predictions provided by RSM. Copyright © 2017 Elsevier Ltd. All rights reserved.
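The quadratic response surface referred to above is an ordinary least-squares fit of a second-order polynomial in the coded process variables. The sketch below shows the idea with two variables instead of the study's five, on noise-free synthetic data; the variable names and coefficients are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Second-order (quadratic) response surface fit of the kind RSM uses.
# Two coded variables for brevity; the study used five. Data are synthetic.

def quadratic_design_matrix(x1, x2):
    """Columns: intercept, x1, x2, x1*x2 interaction, x1^2, x2^2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 50)  # e.g. coded pH
x2 = rng.uniform(-1, 1, 50)  # e.g. coded AC dosage

# Hypothetical "true" surface for removal efficiency (%):
true_beta = np.array([80.0, 5.0, -3.0, 1.5, -4.0, -2.0])
y = quadratic_design_matrix(x1, x2) @ true_beta

# Least-squares estimate of the quadratic model coefficients:
beta_hat, *_ = np.linalg.lstsq(quadratic_design_matrix(x1, x2), y, rcond=None)
```

The ANN-PSO approach in the study replaces this fixed polynomial form with a trained network whose weights are searched by particle swarm optimization, which is why it can outperform the quadratic model when the response is not well approximated by a second-order surface.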
NASA Astrophysics Data System (ADS)
Bennion, Edward P.
Microalgae are currently being investigated as a renewable transportation fuel feedstock based on various advantages, including high annual yields, utilization of poor-quality land, no competition with food crops, and the ability to integrate with various waste streams. This study focuses on directly assessing the impact of two different thermochemical conversion technologies on the microalgae-to-biofuel process through life cycle assessment. A system boundary of "well to pump" (WTP) is defined and includes sub-process models of the growth, dewatering, thermochemical bio-oil recovery, bio-oil stabilization, conversion to renewable diesel, and transport to the pump. Models were validated with experimental and literature data and are representative of an industrial-scale microalgae-to-biofuel process. Two different thermochemical bio-oil conversion systems are modeled and compared on a systems level, hydrothermal liquefaction (HTL) and pyrolysis. The environmental impacts of the two pathways were quantified on the metrics of net energy ratio (NER), defined here as energy consumed over energy produced, and greenhouse gas (GHG) emissions. Results for WTP biofuel production through the HTL pathway were determined to be 1.23 for the NER and GHG emissions of -11.4 g CO2 eq (MJ renewable diesel)-1. WTP biofuel production through the pyrolysis pathway results in a NER of 2.27 and GHG emissions of 210 g CO2 eq (MJ renewable diesel)-1. The large environmental impact associated with the pyrolysis pathway is attributed to feedstock drying requirements and combustion of co-products to improve system energetics. Discussion focuses on a detailed breakdown of the overall process energetics and GHGs, impact of modeling at laboratory-scale compared to industrial-scale, environmental impact sensitivity to engineering systems input parameters for future focused research and development, and a comparison of results to literature.
Spacecraft Testing Programs: Adding Value to the Systems Engineering Process
NASA Technical Reports Server (NTRS)
Britton, Keith J.; Schaible, Dawn M.
2011-01-01
Testing has long been recognized as a critical component of spacecraft development activities - yet many major systems failures may have been prevented with more rigorous testing programs. The question is why is more testing not being conducted? Given unlimited resources, more testing would likely be included in a spacecraft development program. Striking the right balance between too much testing and not enough has been a long-term challenge for many industries. The objective of this paper is to discuss some of the barriers, enablers, and best practices for developing and sustaining a strong test program and testing team. This paper will also explore the testing decision factors used by managers; the varying attitudes toward testing; methods to develop strong test engineers; and the influence of behavior, culture and processes on testing programs. KEY WORDS: Risk, Integration and Test, Validation, Verification, Test Program Development
Backside contacted field effect transistor array for extracellular signal recording.
Ingebrandt, S; Yeung, C K; Staab, W; Zetterer, T; Offenhäusser, A
2003-04-01
A new approach to the design of field-effect transistor (FET) sensors and the use of these FETs in detecting extracellular electrophysiological recordings is reported. Backside contacts were engineered by deep reactive ion etching and a gas phase boron doping process of the holes using planar diffusion sources. The metal contacts were designed to fit on top of the bonding pads of a standard industrial 22-pin DIL (dual inline) chip carrier. To minimise contact resistance, the metal backside contacts of the chips were electroless plated with gold. The chips were mounted on top of the bonding pads using a standard flip-chip process and a fineplacer unit previously described. Rat embryonic myocytes were cultured on these new devices (effective growth area 6 x 6 mm(2)) in order to confirm their validity in electrophysiological recording. Copyright 2003 Elsevier Science B.V.
An innovative recycling process to obtain pure polyethylene and polypropylene from household waste.
Serranti, Silvia; Luciani, Valentina; Bonifazi, Giuseppe; Hu, Bin; Rem, Peter C
2015-01-01
An innovative recycling process, based on magnetic density separation (MDS) and hyperspectral imaging (HSI), to obtain high-quality polypropylene and polyethylene as secondary raw materials is presented. In more detail, MDS was applied to two different polyolefin mixtures coming from household waste. The quality of the two separated PP and PE streams, in terms of purity, was evaluated by a classification procedure based on HSI working in the near-infrared range (1000-1700 nm). The classification model was built using known PE and PP samples as a training set. The results obtained by HSI were compared with those obtained by classical density analysis carried out in the laboratory on the same polymers. The results obtained by MDS and the quality assessment of the plastic products by HSI showed that the combined action of these two technologies is a valid solution that can be implemented at the industrial level. Copyright © 2014 Elsevier Ltd. All rights reserved.
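The abstract does not say which classifier was trained on the known PE and PP spectra, so the sketch below uses a generic nearest-centroid rule on fabricated stand-in "spectra" purely to illustrate the train-on-known-samples, classify-unknown-pixels workflow; none of the numbers represent real polymer measurements.

```python
import numpy as np

# Nearest-centroid classification of synthetic NIR "spectra" into PE vs PP.
# Generic illustration only, not the study's actual classification model.

def train_centroids(spectra, labels):
    """Mean spectrum per class, from labeled training samples."""
    return {lab: spectra[labels == lab].mean(axis=0) for lab in set(labels)}

def classify(spectrum, centroids):
    """Assign the class whose mean spectrum is nearest in Euclidean distance."""
    return min(centroids, key=lambda lab: np.linalg.norm(spectrum - centroids[lab]))

rng = np.random.default_rng(1)
n_bands = 50  # stand-in for sampling the 1000-1700 nm band
pe_train = 1.0 + 0.05 * rng.standard_normal((20, n_bands))  # fabricated PE spectra
pp_train = 2.0 + 0.05 * rng.standard_normal((20, n_bands))  # fabricated PP spectra
spectra = np.vstack([pe_train, pp_train])
labels = np.array(["PE"] * 20 + ["PP"] * 20)

centroids = train_centroids(spectra, labels)
prediction = classify(1.0 + 0.05 * rng.standard_normal(n_bands), centroids)
```

In a real HSI quality check, every pixel of the hyperspectral image would be classified this way, and stream purity would be estimated from the fraction of pixels assigned to each polymer.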
A Computational Fluid Dynamic Model for a Novel Flash Ironmaking Process
NASA Astrophysics Data System (ADS)
Perez-Fontes, Silvia E.; Sohn, Hong Yong; Olivas-Martinez, Miguel
A computational fluid dynamic model for a novel flash ironmaking process based on the direct gaseous reduction of iron oxide concentrates is presented. The model solves the three-dimensional governing equations including both gas-phase and gas-solid reaction kinetics. The turbulence-chemistry interaction in the gas-phase is modeled by the eddy dissipation concept incorporating chemical kinetics. The particle cloud model is used to track the particle phase in a Lagrangian framework. A nucleation and growth kinetics rate expression is adopted to calculate the reduction rate of magnetite concentrate particles. Benchmark experiments reported in the literature for a nonreacting swirling gas jet and a nonpremixed hydrogen jet flame were simulated for validation. The model predictions showed good agreement with measurements in terms of gas velocity, gas temperature and species concentrations. The relevance of the computational model for the analysis of a bench reactor operation and the design of an industrial-pilot plant is discussed.
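The Lagrangian particle tracking mentioned above amounts to integrating each particle's equation of motion through the gas field. The sketch below does this for a single particle with a linear drag law and gravity on one axis; the relaxation time and gas velocity are illustrative assumptions, and the real model tracks particle clouds through a 3-D turbulent, reacting flow.

```python
# One-particle Lagrangian tracking sketch: dv/dt = (u_gas - v)/tau + g.
# Linear (Stokes-like) drag with an assumed relaxation time tau; the actual
# CFD model couples particle clouds to a 3-D turbulent gas field.

def track_particle(u_gas=2.0, tau_s=0.05, g=-9.81, t_end=1.0, dt=1e-4):
    """Explicit Euler integration of the vertical particle velocity."""
    v = 0.0  # particle released at rest
    t = 0.0
    while t < t_end:
        v += ((u_gas - v) / tau_s + g) * dt  # drag toward gas velocity + gravity
        t += dt
    return v

v_final = track_particle()
```

After many relaxation times the particle settles to the terminal velocity u_gas + g*tau, which is the balance point between drag and gravity; resolving the approach to it is why dt must be small relative to tau.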
NASA Astrophysics Data System (ADS)
Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Chen, Zhuo; Wang, Jin-liang
2018-05-01
Based on the principle of multiphase equilibrium, a mathematical model of the copper flash converting process was established by the equilibrium constant method, and a computational system was developed using the MetCal software platform. The mathematical model was validated by comparing simulated outputs, industrial data, and published data. To obtain high-quality blister copper, a low copper content in slag, and an increased impurity removal rate, the model was then applied to investigate the effects of the operational parameters [oxygen/feed ratio (ROF), flux rate (RF), and converting temperature (T)] on the product weights, compositions, and the distribution behaviors of impurity elements. The optimized results showed that ROF, RF, and T should be controlled at approximately 156 Nm³/t, within 3.0 pct, and at approximately 1523 K (1250 °C), respectively.
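The equilibrium constant method can be illustrated on a single toy reaction A ⇌ B + C solved for its extent by bisection. This is only a sketch of the numerical idea: the actual converting model solves many coupled equilibria across slag, blister copper, and gas phases, and the K value and initial amounts below are made up.

```python
# Toy equilibrium-constant solve for A <-> B + C at 1 atm, starting from
# 1 mol of pure A. With extent x: n_A = 1-x, n_B = n_C = x, n_total = 1+x,
# so K = x^2 / (1 - x^2) in mole fractions. Solve for x by bisection.

def equilibrium_extent(K, iters=200):
    """Bisection for x in (0, 1) satisfying x^2 / (1 - x^2) = K."""
    lo, hi = 0.0, 1.0 - 1e-12
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mid * mid / (1.0 - mid * mid) < K:
            lo = mid  # equilibrium lies further toward products
        else:
            hi = mid
    return 0.5 * (lo + hi)

x = equilibrium_extent(K=1.0)  # hypothetical K; x^2 = 1 - x^2 gives x = 1/sqrt(2)
```

A full multiphase model assembles one such relation per reaction plus elemental mass balances and solves the resulting nonlinear system simultaneously, which is what a platform like MetCal automates.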
NASA Technical Reports Server (NTRS)
2002-01-01
Ames Research Center granted Reality Capture Technologies (RCT), Inc., a license to further develop NASA's Mars Map software platform. The company incorporated NASA's innovation into software that uses the Virtual Plant Model (VPM)(TM) to structure, modify, and implement the construction sites of industrial facilities, as well as develop, validate, and train operators on procedures. The VPM orchestrates the exchange of information between engineering, production, and business transaction systems. This enables users to simulate, control, and optimize work processes while increasing the reliability of critical business decisions. Engineers can complete the construction process and test various aspects of it in virtual reality before building the actual structure. With virtual access to and simulation of the construction site, project personnel can manage, control access to, and respond to changes on complex constructions more effectively. Engineers can also create operating procedures, training, and documentation. Virtual Plant Model(TM) is a trademark of Reality Capture Technologies, Inc.