NASA Technical Reports Server (NTRS)
Befrui, Bizhan A.
1995-01-01
This viewgraph presentation discusses the following: STAR-CD computational features; STAR-CD turbulence models; common features of industrial complex flows; industry-specific CFD development requirements; applications and experiences of industrial complex flows, including flow in rotating disc cavities, diffusion hole film cooling, internal blade cooling, and external car aerodynamics; and conclusions on turbulence modeling needs.
As part of its continuing development and evaluation, the QUIC model (Quick Urban & Industrial Complex) was used to study flow and dispersion in complex terrain for two cases. First, for a small area of lower Manhattan near the World Trade Center site, comparisons were made bet...
The poster shows comparisons of wind velocities and sand fluxes between field measurements and a computer model, called QUIC (Quick Urban & Industrial Complex). The comparisons were made for a small desert region in New Mexico.
Economic and environmental optimization of a multi-site utility network for an industrial complex.
Kim, Sang Hun; Yoon, Sung-Geun; Chae, Song Hwa; Park, Sunwon
2010-01-01
Most chemical companies consume large amounts of steam, water, and electricity in the production process. Given recent record fuel costs, utility networks must be optimized to reduce the overall cost of production. Environmental concerns must also be considered when preparing modifications to satisfy the requirements for industrial utilities, since the wastes discharged from utility networks are restricted by environmental regulations. Construction of Eco-Industrial Parks (EIPs) has drawn attention as a promising approach for retrofitting existing industrial parks to improve energy efficiency. The optimization of the utility network within an industrial complex is one of the most important undertakings to minimize energy consumption and waste loads in the EIP. In this work, a systematic approach to optimizing the utility network of an industrial complex is presented. An important issue in the optimization of a utility network is the desire of the companies to achieve high profits while complying with environmental regulations. Therefore, the proposed optimization was performed with consideration of both economic and environmental factors. The proposed approach consists of unit modeling using thermodynamic principles and mass and energy balances, development of a multi-period Mixed Integer Linear Programming (MILP) model for the integration of utility systems in an industrial complex, and an economic/environmental analysis of the results. This approach is applied to the Yeosu Industrial Complex, considering seasonal utility demands. The results show that both the total utility cost and the waste load are reduced by optimizing the utility network of an industrial complex.
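A minimal sketch of the kind of multi-period MILP described above can be written with the open-source PuLP library. The two boilers, the two periods, and all demand, cost, capacity, and emission figures below are invented for illustration; this is not the authors' Yeosu model.

```python
# Minimal multi-period utility-network MILP sketch (illustrative data, not the Yeosu model).
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

periods = ["summer", "winter"]
boilers = ["B1", "B2"]
steam_demand = {"summer": 120.0, "winter": 180.0}   # t/h, assumed
fuel_cost = {"B1": 25.0, "B2": 32.0}                # cost per t steam, assumed
capacity = {"B1": 100.0, "B2": 150.0}               # t/h, assumed
emission = {"B1": 0.9, "B2": 0.6}                   # waste load per t steam, assumed
emission_cap = 160.0                                # per period, assumed

prob = LpProblem("utility_network", LpMinimize)
q = {(b, t): LpVariable(f"q_{b}_{t}", lowBound=0) for b in boilers for t in periods}
on = {(b, t): LpVariable(f"on_{b}_{t}", cat=LpBinary) for b in boilers for t in periods}

# Objective: total fuel cost over all periods.
prob += lpSum(fuel_cost[b] * q[b, t] for b in boilers for t in periods)

for t in periods:
    prob += lpSum(q[b, t] for b in boilers) >= steam_demand[t]             # meet steam demand
    prob += lpSum(emission[b] * q[b, t] for b in boilers) <= emission_cap  # environmental cap
    for b in boilers:
        prob += q[b, t] <= capacity[b] * on[b, t]                          # link load to on/off status

prob.solve()
for (b, t), var in q.items():
    print(b, t, value(var))
```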
NASA Astrophysics Data System (ADS)
Bosikov, I. I.; Klyuev, R. V.; Revazov, V. Ch; Pilieva, D. E.
2018-03-01
The article describes research and analysis of hazardous processes occurring in the natural-industrial system and an effectiveness assessment of its functioning using mathematical models. Studies of the functioning regularities of the natural-industrial system are becoming increasingly relevant in connection with the task of modernizing production and the economy of Russia as a whole. Because a significant amount of poorly structured data is involved, it is difficult to formulate regulations for the effective functioning of production processes and of social and natural complexes under which sustainable development of the natural-industrial system of the mining and processing complex would be ensured. Therefore, the scientific and applied problems whose solution allows one to formalize the hidden structural functioning patterns of the natural-industrial system and to make managerial decisions of an organizational and technological nature to improve the efficiency of the system are very relevant.
NASA Astrophysics Data System (ADS)
Wray, Timothy J.
Computational fluid dynamics (CFD) is routinely used in performance prediction and design of aircraft, turbomachinery, automobiles, and in many other industrial applications. Despite its wide range of use, deficiencies in its prediction accuracy still exist. One critical weakness is the accurate simulation of complex turbulent flows using the Reynolds-Averaged Navier-Stokes equations in conjunction with a turbulence model. The goal of this research has been to develop an eddy-viscosity-type turbulence model to increase the accuracy of flow simulations for mildly separated flows, flows with rotation and curvature effects, and flows with surface roughness. It is accomplished by developing a new zonal one-equation turbulence model which relies heavily on the flow physics; it is now known in the literature as the Wray-Agarwal one-equation turbulence model. The effectiveness of the new model is demonstrated by comparing its results with those obtained by the industry-standard one-equation Spalart-Allmaras model, the two-equation Shear-Stress-Transport (SST) k-ω model, and experimental data. Results for subsonic, transonic, and supersonic flows in and about complex geometries are presented. It is demonstrated that the Wray-Agarwal model can provide industry and CFD researchers with an accurate, efficient, and reliable turbulence model for the computation of a large class of complex turbulent flows.
A Conceptual Model for Analysing Management Development in the UK Hospitality Industry
ERIC Educational Resources Information Center
Watson, Sandra
2007-01-01
This paper presents a conceptual, contingent model of management development. It explains the nature of the UK hospitality industry and its potential influence on MD practices, prior to exploring dimensions and relationships in the model. The embryonic model is presented as a model that can enhance our understanding of the complexities of the…
Spreading Effect in Industrial Complex Network Based on Revised Structural Holes Theory.
Xing, Lizhi; Ye, Qing; Guan, Jun
2016-01-01
This paper analyzes the spreading effect of industrial sectors with a complex network model from the perspective of econophysics. Input-output analysis, an important research tool, focuses mainly on static analysis. However, the fundamental aim of industry analysis is to figure out how interaction between different industries affects economic development, which is a dynamic process. Thus, an industrial complex network based on input-output tables from the WIOD is proposed as a bridge connecting accurate static quantitative analysis and comparable dynamic analysis. With the application of revised structural holes theory, flow betweenness and random walk centrality were chosen to evaluate industrial sectors' long-term and short-term spreading effects, respectively. It shows that industries with higher flow betweenness or random walk centrality bring about a more intensive spreading effect in the industrial chains they stand in, because the value-stream transmission of an industrial sector depends on how many products or services it can obtain from the others, and such sectors act as brokers with greater information superiority and more intermediate interests.
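NetworkX exposes Newman's random-walk betweenness as current-flow betweenness, a close relative of the centrality measures used above. The sketch below applies it, together with ordinary shortest-path betweenness as a rough stand-in for flow betweenness, to a toy inter-industry graph whose sectors and edge weights are invented rather than taken from the WIOD tables.

```python
# Toy inter-industry network; edge weights stand in for input-output flows (illustrative, not WIOD data).
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("mining", "metals", 5.0), ("metals", "machinery", 4.0),
    ("machinery", "autos", 3.0), ("chemicals", "autos", 2.0),
    ("agriculture", "food", 6.0), ("food", "retail", 2.5),
    ("metals", "construction", 3.5), ("chemicals", "construction", 1.5),
    ("chemicals", "food", 1.0),
])

# Shortest-path betweenness as a rough stand-in for "flow betweenness", and
# current-flow betweenness, which equals Newman's random-walk betweenness.
bet = nx.betweenness_centrality(G, weight="weight")
rw = nx.current_flow_betweenness_centrality(G, weight="weight")

for sector in sorted(G, key=rw.get, reverse=True):
    print(f"{sector:12s}  betweenness={bet[sector]:.3f}  random-walk={rw[sector]:.3f}")
```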
NASA Astrophysics Data System (ADS)
Gorlov, A. P.; Averchenkov, V. I.; Rytov, M. Yu; Eryomenko, V. T.
2017-01-01
The article is concerned with mathematical simulation for assessing the protection level of complex organizational and technical systems of industrial enterprises by creating an automated system whose main functions are: information security (IS) audit, formation of the enterprise threat model, recommendations concerning creation of the information protection system, and a set of organizational-administrative documentation.
Chen, Sheng-Po; Wang, Chieh-Heng; Lin, Wen-Dian; Tong, Yu-Huei; Chen, Yu-Chun; Chiu, Ching-Jui; Chiang, Hung-Chi; Fan, Chen-Lun; Wang, Jia-Lin; Chang, Julius S
2018-05-01
The present study combines high-resolution measurements at various distances from a world-class gigantic petrochemical complex with model simulations to test a method to assess industrial emissions and their effect on local air quality. Because the wind conditions were complex and highly seasonal, the dominant wind flow patterns in the coastal region of interest were classified into three types, namely northeast monsoonal (NEM) flows, southwest monsoonal (SWM) flows, and local circulation (LC), based on six years of monitoring data. Sulfur dioxide (SO2) was chosen as an indicative pollutant for prominent industrial emissions. A high-density monitoring network of 12 air-quality stations distributed within a 20-km radius surrounding the petrochemical complex provided hourly measurements of SO2 and wind parameters. The SO2 emissions from major industrial sources registered by the monitoring network were then used to validate model simulations and to illustrate the transport of the SO2 plumes under the three typical wind patterns. It was found that the coupling of observations and modeling was able to successfully explain the transport of the industrial plumes. Although the petrochemical complex was seemingly the only major source to affect local air quality, multiple prominent sources from afar also played a significant role. As a result, we found that a more complete and balanced assessment of local air quality can be achieved only after taking into account the wind characteristics and emission factors of a much larger spatial scale than the initial (20 km by 20 km) study domain.
Using ICT techniques for improving mechatronic systems' dependability
NASA Astrophysics Data System (ADS)
Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe
2013-10-01
The use of analysis techniques such as simulation and formal verification for the analysis of industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require a high investment in specifically skilled human resources with sufficient theoretical knowledge in those domains. This paper aims, mainly, to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process and promotes a discussion of the main difficulties that can be found and a possible way to handle them. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform which, nowadays, has become a common delivery model for many applications, because SaaS is typically accessed by users via the internet.
The application of CFD to the modelling of fires in complex geometries
NASA Astrophysics Data System (ADS)
Burns, A. D.; Clarke, D. S.; Guilbert, P.; Jones, I. P.; Simcox, S.; Wilkes, N. S.
The application of Computational Fluid Dynamics (CFD) to industrial safety is a challenging activity. In particular it involves the interaction of several different physical processes, including turbulence, combustion, radiation, buoyancy, compressible flow and shock waves in complex three-dimensional geometries. In addition, there may be multi-phase effects arising, for example, from sprinkler systems for extinguishing fires. The FLOW3D software (1-3) from Computational Fluid Dynamics Services (CFDS) is in widespread use in industrial safety problems, both within AEA Technology, and also by CFDS's commercial customers, for example references (4-13). This paper discusses some other applications of FLOW3D to safety problems. These applications illustrate the coupling of the gas flows with radiation models and combustion models, particularly for complex geometries where simpler radiation models are not applicable.
Structural Behavioral Study on the General Aviation Network Based on Complex Network
NASA Astrophysics Data System (ADS)
Zhang, Liang; Lu, Na
2017-12-01
The general aviation system is an open and dissipative system with complex structures and behavioral features. This paper establishes the system model and network model for general aviation. We analyze integral attributes and individual attributes by applying complex network theory and conclude that the general aviation network has influential enterprise factors and node relations. We check whether the network has the small-world effect, the scale-free property, and the network centrality property that a complex network should have by applying degree distribution functions, and prove that the general aviation network system is a complex network. Therefore, we propose to advance the evolution of the general aviation industrial chain toward a collaborative innovation cluster of advanced-form industries by strengthening the network multiplication effect, stimulating innovation performance, and spanning the structural-hole path.
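The structural checks mentioned above (degree distribution, small-world indicators) can be sketched with NetworkX; here a synthetic Barabási-Albert graph stands in for the general aviation network, since the paper's data are not available.

```python
# Sketch of the structural checks described above on a synthetic network
# (a Barabasi-Albert graph stands in for the general-aviation network).
import networkx as nx
from collections import Counter

G = nx.barabasi_albert_graph(n=500, m=2, seed=1)

# Degree distribution (a heavy tail suggests a scale-free structure).
deg_counts = Counter(d for _, d in G.degree())
for k in sorted(deg_counts)[:10]:
    print(f"degree {k}: {deg_counts[k]} nodes")

# Small-world indicators: clustering and path length compared with a random graph.
C, L = nx.average_clustering(G), nx.average_shortest_path_length(G)
R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=1)
if nx.is_connected(R):
    print(f"C={C:.3f} vs C_rand={nx.average_clustering(R):.3f}; "
          f"L={L:.2f} vs L_rand={nx.average_shortest_path_length(R):.2f}")
```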
Tips on Creating Complex Geometry Using Solid Modeling Software
ERIC Educational Resources Information Center
Gow, George
2008-01-01
Three-dimensional computer-aided drafting (CAD) software, sometimes referred to as "solid modeling" software, is easy to learn, fun to use, and becoming the standard in industry. However, many users have difficulty creating complex geometry with the solid modeling software. And the problem is not entirely a student problem. Even some teachers and…
Chang, Pao-Erh Paul; Yang, Jen-Chih Rena; Den, Walter; Wu, Chang-Fu
2014-09-01
Emissions of volatile organic compounds (VOCs) are among the most frequent causes of environmental nuisance complaints in urban areas, especially where industrial districts are nearby. Unfortunately, identifying the responsible emission sources of VOCs is an essentially difficult task. In this study, we proposed a dynamic approach to gradually confine the location of potential VOC emission sources in an industrial complex by combining multi-path open-path Fourier transform infrared spectrometry (OP-FTIR) measurement with the statistical method of principal component analysis (PCA). Closed-cell FTIR was further used to verify the VOC emission sources by measuring emitted VOCs from selected exhaust stacks at factories in the confined areas. Multiple open-path monitoring lines were deployed during a 3-month monitoring campaign in a complex industrial district. The emission patterns were identified and the locations of emissions were confined using the wind data collected simultaneously. N,N-Dimethylformamide (DMF), 2-butanone, toluene, and ethyl acetate, with mean concentrations of 80.0 ± 1.8, 34.5 ± 0.8, 103.7 ± 2.8, and 26.6 ± 0.7 ppbv, respectively, were identified as the major VOC mixture at all times of the day around the receptor site. As a toxic air pollutant, DMF was found in air samples at concentrations exceeding the ambient standard despite the path-averaging effect of OP-FTIR on concentration levels. The PCA data identified three major emission sources, comprising the PU coating, chemical packaging, and lithographic printing industries. Applying instrumental measurement and statistical modeling, this study has established a systematic approach for locating emission sources. Statistical modeling (PCA) plays an important role in reducing the dimensionality of a large measured dataset and identifying underlying emission sources. Instrumental measurement, however, helps verify the outcomes of the statistical modeling. The field study has demonstrated the feasibility of using multi-path OP-FTIR measurement; the wind data, incorporated with the statistical modeling (PCA), can successfully identify the major emission sources in a complex industrial district.
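A hedged sketch of the PCA step is shown below: a synthetic (hours x species) concentration matrix replaces the real OP-FTIR time series, and the component loadings group co-varying VOCs, which is the reasoning used above to associate species mixtures with common emission sources. The two hidden source profiles are invented.

```python
# PCA on a synthetic (hours x species) concentration matrix, standing in for the
# multi-path OP-FTIR time series; loadings on each component group co-varying VOCs,
# which is the step used to infer common emission sources.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
species = ["DMF", "2-butanone", "toluene", "ethyl acetate"]

# Two hidden "sources" with fixed emission profiles plus noise (assumed values).
source_profiles = np.array([[0.8, 0.6, 0.1, 0.0],    # e.g. a coating-like source
                            [0.0, 0.1, 0.7, 0.7]])   # e.g. a printing-like source
activity = rng.lognormal(size=(240, 2))              # hourly source strengths
X = activity @ source_profiles + 0.05 * rng.standard_normal((240, 4))

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("explained variance ratio:", pca.explained_variance_ratio_)
for i, comp in enumerate(pca.components_):
    top = sorted(zip(species, comp), key=lambda p: abs(p[1]), reverse=True)
    print(f"PC{i+1} loadings:", [(s, round(w, 2)) for s, w in top])
```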
CarbonSAFE Illinois - Macon County
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whittaker, Steve
CarbonSAFE Illinois is a feasibility study to develop an established geologic storage complex in Macon County, Illinois, for commercial-scale storage of industrially sourced CO2. Feasibility activities are focused on the Mt. Simon Storage Complex; a step-out well will be drilled near existing storage sites (i.e., the Midwest Geological Sequestration Consortium's Illinois Basin – Decatur Project and the Illinois Industrial Carbon Capture and Storage Project) to further establish the commercial viability of this complex and to evaluate EOR potential in a co-located oil-field trend. The Archer Daniels Midland facility (ethanol plant), City Water, Light, and Power in Springfield, Illinois (coal-fired power station), and other regional industries are potential sources of anthropogenic CO2 for storage at this complex. Site feasibility will be evaluated through drilling results, static and dynamic modeling, and quantitative risk assessment. Both studies will entail stakeholder engagement and consideration of infrastructure requirements, existing policy, and business models. Project data will help calibrate the National Risk Assessment Partnership (NRAP) Toolkit to better understand the risks of commercial-scale carbon storage.
Trigger chemistries for better industrial formulations.
Wang, Hsuan-Chin; Zhang, Yanfeng; Possanza, Catherine M; Zimmerman, Steven C; Cheng, Jianjun; Moore, Jeffrey S; Harris, Keith; Katz, Joshua S
2015-04-01
In recent years, innovations and consumer demands have led to increasingly complex liquid formulations. These growing complexities have provided industrial players and their customers access to new markets through product differentiation, improved performance, and compatibility/stability with other products. One strategy for enabling more complex formulations is the use of active encapsulation. When encapsulation is employed, strategies are required to effect the release of the active at the desired location and time of action. One particular route that has received significant academic research effort is the employment of triggers to induce active release upon a specific stimulus, though little has translated for industrial use to date. To address emerging industrial formulation needs, in this review, we discuss areas of trigger release chemistries and their applications specifically as relevant to industrial use. We focus the discussion on the use of heat, light, shear, and pH triggers as applied in several model polymeric systems for inducing active release. The goal is that through this review trends will emerge for how technologies can be better developed to maximize their value through industrial adaptation.
Parametric Modeling as a Technology of Rapid Prototyping in Light Industry
NASA Astrophysics Data System (ADS)
Tomilov, I. N.; Grudinin, S. N.; Frolovsky, V. D.; Alexandrov, A. A.
2016-04-01
The paper deals with a parametric modeling method for virtual mannequins for the purposes of design automation in the clothing industry. The described approach includes the steps of generating the basic model from the initial one (obtained in a 3D-scanning process), its parameterization, and its deformation. The complex surfaces are represented by a wireframe model. The modeling results are evaluated with a set of similarity factors. Deformed models are compared with their virtual prototypes, and the results of modeling are estimated by the standard deviation factor.
Fault detection of Tennessee Eastman process based on topological features and SVM
NASA Astrophysics Data System (ADS)
Zhao, Huiyang; Hu, Yanzhu; Ai, Xinbo; Hu, Yu; Meng, Zhen
2018-03-01
Fault detection in industrial processes is a popular research topic. Although the distributed control system (DCS) has been introduced to monitor the state of industrial processes, it still cannot satisfy all the requirements for fault detection in all industrial systems. In this paper, we propose a novel method based on topological features and a support vector machine (SVM) for fault detection in industrial processes. The proposed method takes global information of the measured variables into account through a complex network model and predicts by SVM whether a system has generated a fault. The method can be divided into four steps: network construction, network analysis, model training, and model testing. Finally, we apply the model to the Tennessee Eastman process (TEP). The results show that this method works well and can be a useful supplement for fault detection in industrial processes.
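The four steps can be sketched as follows on synthetic data (the real TEP benchmark is not reproduced here): a correlation network is built per data window, a few topological features summarize it, and an SVM is trained to separate normal from faulty windows. The threshold, window size, and fault signature are assumptions.

```python
# Sketch of the four steps above on synthetic windows (not the real TEP data):
# build a correlation network per window, extract simple topological features, train an SVM.
import numpy as np
import networkx as nx
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def window_features(X, threshold=0.6):
    """Correlation network over variables, summarized by a few graph statistics."""
    C = np.corrcoef(X, rowvar=False)
    A = (np.abs(C) > threshold) & ~np.eye(C.shape[0], dtype=bool)
    G = nx.from_numpy_array(A.astype(int))
    degs = [d for _, d in G.degree()]
    return [np.mean(degs), np.max(degs), nx.density(G), nx.average_clustering(G)]

def make_window(faulty, n_samples=100, n_vars=10):
    X = rng.standard_normal((n_samples, n_vars))
    if faulty:                       # a fault couples a subset of variables
        X[:, :4] += 1.5 * rng.standard_normal((n_samples, 1))
    return window_features(X)

X = np.array([make_window(f) for f in [0] * 200 + [1] * 200])
y = np.array([0] * 200 + [1] * 200)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))
```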
S. Salazar; M. Mendoza; A. M. Tejeda
2006-01-01
A spatial model is presented to explain the concentration of heavy metals (Fe, Cu, Zn, Ni, Cr, Co and Pb) in the soils around the industrial complex near the Port of Veracruz, Mexico. Unexpectedly low-concentration sites were then tested to detect woody plant species that may have the capability to hyperaccumulate these contaminants, hence having a potential for...
Scale-free phenomenon in industries in China
NASA Astrophysics Data System (ADS)
Tang, Da-Hai; Chen, Bo-Kui; Gao, Ya-Chun; Wang, Bing-Hong
2013-12-01
In this paper, we investigate the data of industries in China and find that the frequency distributions of fixed assets and fixed-assets’ investment of industries obey power laws. We show that these power-law modes can be explained by the rules of the Simon Model, rather than the existing investment theories such as the classical investment theory or acceleration principle. Moreover, the mechanism of the investment distribution may be similar to the forest-fire model of self-organizing criticality. By introducing the complex system methods, this research changes the traditional opinion of the investment and gains some meaningful understanding in the dynamics of industries and the economic cycle.
SPECIATION OF COMPLEX ORGANIC CONTAMINANTS IN WATER WITH RAMAN SPECTROSCOPY
Pesticides and industrial chemicals are typically complex organic molecules with multiple heteroatoms that can ionize, tautomerize, and form various types of hydrates in water. However, conceptual models for predicting the fate of these chemicals in the environment ignore these ...
ACTIVIS: Visual Exploration of Industry-Scale Deep Neural Network Models.
Kahng, Minsuk; Andrews, Pierre Y; Kalro, Aditya; Polo Chau, Duen Horng
2017-08-30
While deep learning models have achieved state-of-the-art accuracies for many prediction tasks, understanding these models remains a challenge. Despite the recent interest in developing visual tools to help users interpret deep learning models, the complexity and wide variety of models deployed in industry, and the large-scale datasets that they used, pose unique design challenges that are inadequately addressed by existing work. Through participatory design sessions with over 15 researchers and engineers at Facebook, we have developed, deployed, and iteratively improved ACTIVIS, an interactive visualization system for interpreting large-scale deep learning models and results. By tightly integrating multiple coordinated views, such as a computation graph overview of the model architecture, and a neuron activation view for pattern discovery and comparison, users can explore complex deep neural network models at both the instance- and subset-level. ACTIVIS has been deployed on Facebook's machine learning platform. We present case studies with Facebook researchers and engineers, and usage scenarios of how ACTIVIS may work with different models.
Multiscale Materials Modeling in an Industrial Environment.
Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard
2016-06-07
In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand.
A Simplified Approach to Risk Assessment Based on System Dynamics: An Industrial Case Study.
Garbolino, Emmanuel; Chery, Jean-Pierre; Guarnieri, Franck
2016-01-01
Seveso plants are complex sociotechnical systems, which makes it appropriate to support any risk assessment with a model of the system. However, more often than not, this step is only partially addressed, simplified, or avoided in safety reports. At the same time, investigations have shown that the complexity of industrial systems is frequently a factor in accidents, due to interactions between their technical, human, and organizational dimensions. In order to handle both this complexity and changes in the system over time, this article proposes an original and simplified qualitative risk evaluation method based on the system dynamics theory developed by Forrester in the early 1960s. The methodology supports the development of a dynamic risk assessment framework dedicated to industrial activities. It consists of 10 complementary steps grouped into two main activities: system dynamics modeling of the sociotechnical system and risk analysis. This system dynamics risk analysis is applied to a case study of a chemical plant and provides a way to assess the technological and organizational components of safety.
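As a generic illustration of the Forrester-style stock-and-flow reasoning behind the method (not the paper's 10-step framework or its chemical-plant model), the sketch below integrates a single "latent hazard" stock with an inflow from degradation and an outflow from audits, using simple Euler steps and invented parameters.

```python
# Minimal Forrester-style stock-and-flow sketch (Euler integration) of a generic
# "latent hazard" stock; structure and parameters are invented for illustration,
# not taken from the chemical-plant case study.
dt, horizon = 0.1, 60.0                 # time step and horizon in months
hazard = 10.0                           # stock: accumulated latent hazard
degradation_rate = 0.8                  # inflow per month (equipment wear, drift)
audit_effectiveness = 0.05              # fraction of hazard removed per month per audit level

steps = int(horizon / dt)
for i in range(steps):
    audit_level = 1.0 if hazard > 15.0 else 0.5   # management reacts when hazard is high
    hazard += (degradation_rate - audit_effectiveness * audit_level * hazard) * dt  # Euler step
    if (i + 1) % 120 == 0:                        # print roughly once a simulated year
        print(f"t={(i + 1) * dt:5.1f}  hazard={hazard:6.2f}")
```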
A Telecommunications Industry Primer: A Systems Model.
ERIC Educational Resources Information Center
Obermier, Timothy R.; Tuttle, Ronald H.
2003-01-01
Describes the Telecommunications Systems Model to help technical educators and students understand the increasingly complex telecommunications infrastructure. Specifically looks at ownership and regulatory status, service providers, transport medium, network protocols, and end-user services. (JOW)
The formulations of the AMS/EPA Regulatory Model Improvement Committee's applied air dispersion model (AERMOD) are described. This is the second in a series of three articles. Part I describes the model's methods for characterizing the atmospheric boundary layer and complex ter...
Rivers, Patrick A; Glover, Saundra H
2008-01-01
In all industries, competition among businesses has long been encouraged as a mechanism to increase value for patients. In other words, competition ensures the provision of better products and services to satisfy the needs of customers. This paper aims to develop a model that can be used to empirically investigate a number of complex issues and relationships associated with competition in the health care industry. A literature review was conducted. A total of 50 items of literature related to the subject were reviewed. Various perspectives of competition, the nature of service quality, health system costs, and patient satisfaction in health care are examined. A model of the relationship among these variables is developed. The model depicts patient satisfaction as an outcome measure directly dependent on competition. Quality of care and health care system costs, while also directly dependent on the strategic mission and goals, are considered as determinants of customer satisfaction as well. The model is discussed in the light of propositions for empirical research. Empirical studies based on the model proposed in this paper should help identify areas with significant impact on patient satisfaction while maintaining high quality of service at lower costs in a competitive environment. The authors develop a research model which includes propositions to examine the complex issues of competition in the health care industry.
Hemachandra, C K; Pathiratne, A
2017-10-01
Complex effluents originating from diverse industrial processes in industrial zones could pose cytotoxic/genotoxic hazards to biota in the receiving ecosystems which cannot be revealed by conventional monitoring methods. This study assessed potential cytotoxicity/genotoxicity of treated effluents of two industrial zones which are discharged into Kelani river, Sri Lanka combining erythrocytic abnormality tests and comet assay of the tropical model fish, Nile tilapia. Exposure of fish to the effluents induced erythrocytic DNA damage and deformed erythrocytes with serrated membranes, vacuolations, nuclear buds and micronuclei showing cytotoxic/genotoxic hazards in all cases. Occasional exceedance of industrial effluent discharge regulatory limits was noted for color and lead which may have contributed to the observed cytotoxicity/genotoxicity of effluents. The results demonstrate that fish erythrocytic responses could be used as effective bioanalytical tools for cytotoxic/genotoxic hazard assessments of complex effluents of industrial zones for optimization of the waste treatment process in order to reduce biological impacts.
The corporate university: a model for sustaining an expert workforce in the human services.
Gould, Karen E
2005-05-01
The human service industry has become a complex industry in which agencies must respond to the demands of the marketplace. To respond to these demands, agencies must develop and maintain their knowledge capital by offering an extensive array of learning opportunities related to their business goals. The corporate university, a contemporary educational model designed to maintain an expert workforce, allows agencies to meet this need effectively.
The model for estimation production cost of embroidery handicraft
NASA Astrophysics Data System (ADS)
Nofierni; Sriwana, IK; Septriani, Y.
2017-12-01
The embroidery industry is a type of micro industry that produces embroidery handicrafts. These industries are emerging in some rural areas of Indonesia. Embroidery products such as scarves and clothes are produced that show the cultural value of a certain region. The owner of an enterprise must calculate the cost of production before making a decision on how many products to accept from a customer. A calculation approach to production cost analysis is therefore needed to consider the feasibility of each incoming order. This study proposes the design of an expert system (ES) to improve production management in the embroidery industry. The model uses a fuzzy inference system to estimate production cost. The research was conducted based on surveys and knowledge acquisition from stakeholders of the embroidery handicraft supply chain at Bukittinggi, West Sumatera, Indonesia. The model takes fuzzy inputs of quality, design complexity, and required working hours, and its results are useful for managing production cost in embroidery production.
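A minimal fuzzy-inference sketch of the kind of cost estimator described is given below, using a zero-order Sugeno-style rule base in plain Python. The membership functions, rules, and cost consequents are invented placeholders, not the authors' calibrated system.

```python
# Zero-order Sugeno-style fuzzy sketch: quality, design complexity and working hours
# (all scaled 0-10) map to a production-cost estimate. Membership functions, rules
# and cost consequents are invented placeholders, not the authors' calibrated model.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def low(x):  return tri(x, -1, 0, 5)
def high(x): return tri(x, 5, 10, 11)

def estimate_cost(quality, complexity, hours):
    # Each rule: (firing strength, cost consequent in thousand rupiah, assumed values).
    rules = [
        (min(low(quality), low(complexity), low(hours)),    80.0),
        (min(high(quality), low(complexity), low(hours)),   120.0),
        (min(low(quality), high(complexity), high(hours)),  180.0),
        (min(high(quality), high(complexity), high(hours)), 260.0),
    ]
    w = np.array([r[0] for r in rules])
    z = np.array([r[1] for r in rules])
    return float(np.dot(w, z) / w.sum()) if w.sum() > 0 else float("nan")

print(estimate_cost(quality=7, complexity=8, hours=6))
```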
Cooperative Support a Model to Increase Minority Participation in Science
ERIC Educational Resources Information Center
Smith, Melvin O.
1978-01-01
A model is described that can be used to increase minority participation in the sciences and involves the cooperation of the business-industrial complex, higher education in the historically Black colleges and the government. (MN)
Preferential attachment and growth dynamics in complex systems
NASA Astrophysics Data System (ADS)
Yamasaki, Kazuko; Matia, Kaushik; Buldyrev, Sergey V.; Fu, Dongfeng; Pammolli, Fabio; Riccaboni, Massimo; Stanley, H. Eugene
2006-09-01
Complex systems can be characterized by classes of equivalency of their elements defined according to system specific rules. We propose a generalized preferential attachment model to describe the class size distribution. The model postulates preferential growth of the existing classes and the steady influx of new classes. According to the model, the distribution changes from a pure exponential form for zero influx of new classes to a power law with an exponential cut-off form when the influx of new classes is substantial. Predictions of the model are tested through the analysis of a unique industrial database, which covers both elementary units (products) and classes (markets, firms) in a given industry (pharmaceuticals), covering the entire size distribution. The model’s predictions are in good agreement with the data. The paper sheds light on the emergence of the exponent τ≈2 observed as a universal feature of many biological, social and economic problems.
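The class-growth mechanism described above can be simulated directly: with probability b a new class of size one enters; otherwise an existing class gains a unit with probability proportional to its size. The parameters below are illustrative, not fitted to the pharmaceutical database.

```python
# Simulation of the generalized preferential attachment class model described above.
import random
from collections import Counter

def simulate(n_units=200_000, b=0.05, seed=1):
    random.seed(seed)
    sizes = [1]                        # class sizes, starting with one class of size 1
    owners = [0]                       # one entry per unit, holding its class index
    for _ in range(n_units):
        if random.random() < b:
            sizes.append(1)            # influx of a new class
            owners.append(len(sizes) - 1)
        else:
            # Picking a random existing unit selects its class proportionally to class size.
            i = owners[random.randrange(len(owners))]
            sizes[i] += 1
            owners.append(i)
    return sizes

sizes = simulate()
hist = Counter(sizes)
for k in (1, 2, 5, 10, 50, 100, 500):
    print(f"classes of size >= {k}: {sum(v for s, v in hist.items() if s >= k)}")
```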
The kinetics of thermal generation of flavour.
Parker, Jane K
2013-01-01
Control and optimisation of flavour is the ultimate challenge for the food and flavour industry. The major route to flavour formation during thermal processing is the Maillard reaction, which is a complex cascade of interdependent reactions initiated by the reaction between a reducing sugar and an amino compound. The complexity of the reaction means that researchers turn to kinetic modelling in order to understand the control points of the reaction and to manipulate the flavour profile. Studies of the kinetics of flavour formation have developed over the past 30 years from single-response empirical models of binary aqueous systems to sophisticated multi-response models in food matrices, based on the underlying chemistry, with the power to predict the formation of some key aroma compounds. This paper discusses in detail the development of kinetic models of thermal generation of flavour and looks at the challenges involved in predicting flavour.
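As a toy illustration of the kinetic-modelling approach (far simpler than real multi-response Maillard models), the sketch below integrates a two-step scheme, sugar + amino acid to intermediate to aroma compound, with invented rate constants and no temperature dependence.

```python
# Toy two-step kinetic scheme for thermally generated flavour:
#   sugar + amino acid -> intermediate -> aroma compound
# Rate constants are invented; real multi-response Maillard models track many more
# species and include temperature dependence.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 0.02, 0.05          # L/(mol*min) and 1/min, assumed

def rhs(t, y):
    sugar, amino, intermediate, aroma = y
    r1 = k1 * sugar * amino          # second-order initial step
    r2 = k2 * intermediate           # first-order breakdown to the aroma compound
    return [-r1, -r1, r1 - r2, r2]

sol = solve_ivp(rhs, t_span=(0, 120), y0=[0.5, 0.3, 0.0, 0.0],
                t_eval=np.linspace(0, 120, 7))
for t, aroma in zip(sol.t, sol.y[3]):
    print(f"t={t:6.1f} min  aroma={aroma:.4f} mol/L")
```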
Collaboration between industry and academia--prospects for male fertility control.
Stock, G; Habenicht, U F
1999-12-01
Drug development within the pharmaceutical industry is probably the field with the highest level of regulation. Due to the complexity of the different components of drug development and drug surveillance, the need for a sophisticated organization and infrastructure is obvious. In addition, there is a necessity for sufficient resources and long-term commitment as well as logistic and long-term knowledge management. In order to secure high professional standards at all levels of this highly complex value-creating chain, the number of cooperative arrangements in the pharmaceutical industry is increasing. The identification of new targets in the drug-finding process calls in particular for outside partners. At the same time, the preparedness of non-industrial researchers to cooperate with industry has also increased significantly. The area of fertility control, especially male fertility control, provides an excellent example of this kind of cooperation between industrial and non-industrial partners. Here a cooperative network is described which probably meets practically all relevant criteria for both the non-industrial and the industrial partner. Some principles for the management of such a cooperative network are discussed. We believe that this kind of network can serve as a model for similar networks in other fields.
Economic-environmental modeling of point source pollution in Jefferson County, Alabama, USA.
Kebede, Ellene; Schreiner, Dean F; Huluka, Gobena
2002-05-01
This paper uses an integrated economic-environmental model to assess point source pollution from major industries in Jefferson County, northern Alabama. Industrial expansion generates employment, income, and tax revenue for the public sector; however, it is also often associated with the discharge of chemical pollutants. Jefferson County is one of the largest industrial counties in Alabama and experienced smog warnings and elevated ambient ozone concentrations during 1996-1999. Past studies of chemical discharge from industries have used models to assess the pollution impact of individual plants. This study, however, uses an extended input-output (I-O) economic model with pollution emission coefficients to assess direct and indirect pollutant emissions for several major industries in Jefferson County. The major findings of the study are: (a) the principal emissions by the selected industries are volatile organic compounds (VOCs), and these contribute to the ambient ozone concentration; (b) the combined direct and indirect emissions are significantly higher than the direct emissions of some industries, indicating that an isolated analysis will underestimate the emissions of an industry; and (c) while industries with low emission coefficients may seem preferable, they may also emit the most hazardous chemicals. This study is limited by the assumptions made and by data availability; however, it provides a useful analytical tool for direct and cumulative emission estimation and generates insights on the complexity of the choice of industries.
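The mechanics of an emission-extended input-output model can be sketched with a made-up three-sector economy: total output solves the Leontief system x = Ax + d, and comparing direct emission intensities with total intensities e(I-A)^-1 shows why indirect emissions can dominate. All coefficients below are invented, not Jefferson County data.

```python
# Emission-extended Leontief input-output sketch for a made-up 3-sector economy.
import numpy as np

A = np.array([[0.10, 0.25, 0.05],
              [0.20, 0.05, 0.30],
              [0.05, 0.10, 0.10]])   # technical coefficients (inputs per unit output), assumed
d = np.array([100.0, 50.0, 80.0])    # final demand by sector, assumed
e = np.array([0.5, 2.0, 0.1])        # direct VOC emissions per unit output, assumed

L = np.linalg.inv(np.eye(3) - A)     # Leontief inverse
x = L @ d                            # total output required to meet final demand
total_intensity = e @ L              # direct + indirect emissions per unit of final demand

print("total output x:", np.round(x, 1))
print("direct intensity e:", e)
print("total intensity e(I-A)^-1:", np.round(total_intensity, 2))
print("emissions attributed to each sector's final demand:", np.round(total_intensity * d, 1))
```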
A fuzzy model for assessing risk of occupational safety in the processing industry.
Tadic, Danijela; Djapan, Marko; Misita, Mirjana; Stefanovic, Miladin; Milanovic, Dragan D
2012-01-01
Managing occupational safety in any kind of industry, especially in processing, is very important and complex. This paper develops a new method for occupational risk assessment in the presence of uncertainties. Uncertain values of hazardous factors and consequence frequencies are described with linguistic expressions defined by a safety management team. They are modeled with fuzzy sets. Consequence severities depend on current hazardous factors, and their values are calculated with the proposed procedure. The proposed model is tested with real-life data from fruit processing firms in Central Serbia.
NASA Astrophysics Data System (ADS)
Xing, Lizhi; Dong, Xianlei; Guan, Jun
2017-04-01
The input-output table is comprehensive and detailed in describing the national economic system with many economic relationships, containing supply and demand information among industrial sectors. The complex network, a theory and method for measuring the structure of complex systems, can describe the structural characteristics of the internal structure of the research object by measuring structural indicators of the social and economic system, revealing the complex relationship between the inner hierarchy and the external economic function. This paper builds GIVCN-WIOT models based on the World Input-Output Database in order to depict the topological structure of the Global Value Chain (GVC), and assumes that a nation's competitive advantage is equal to the overall impact of its domestic sectors on the GVC. Under the perspective of econophysics, the Global Industrial Impact Coefficient (GIIC) is proposed to measure national competitiveness in gaining information superiority and intermediate interests. Analysis of the GIVCN-WIOT models yields several insights, including the following: (1) sectors with higher Random Walk Centrality contribute more to transmitting value streams within the global economic system; (2) the Half-Value Ratio can be used to measure the robustness of open-economy macroeconomics in the process of globalization; and (3) the positive correlation between GIIC and GDP indicates that one country's global industrial impact could reveal its international competitive advantage.
Optimal service using Matlab - simulink controlled Queuing system at call centers
NASA Astrophysics Data System (ADS)
Balaji, N.; Siva, E. P.; Chandrasekaran, A. D.; Tamilazhagan, V.
2018-04-01
This paper presents graphical, integrated-model-based academic research on telephone call centres. It introduces the important feature of impatient customers and abandonments in the queuing system. The modern call centre is a complex socio-technical system, and queuing theory has become a suitable tool in the telecom industry for providing better online services. Matlab-Simulink multi-queue structured models provide better solutions in complex situations at call centres, and service performance measures are analyzed at the optimal level through the Simulink queuing model.
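A Monte-Carlo sketch of a call centre with impatient customers is given below in Python rather than Matlab-Simulink: a multi-server FCFS queue in which each caller abandons if the offered wait exceeds an exponentially distributed patience. All rates and the number of agents are assumptions.

```python
# Monte-Carlo sketch of a call centre as a multi-server queue with impatient customers:
# each caller abandons if the offered wait exceeds an exponential patience time.
# Written in Python (not Matlab-Simulink); all rates are illustrative.
import random

def simulate(n_calls=50_000, arrival_rate=5.0, service_rate=1.0, servers=6,
             patience_mean=0.5, seed=1):
    random.seed(seed)
    free_at = [0.0] * servers          # time at which each agent becomes free
    t, abandoned, total_wait = 0.0, 0, 0.0
    for _ in range(n_calls):
        t += random.expovariate(arrival_rate)
        k = min(range(servers), key=lambda i: free_at[i])
        wait = max(0.0, free_at[k] - t)            # offered wait under FCFS
        if wait > random.expovariate(1.0 / patience_mean):
            abandoned += 1                         # caller hangs up before being served
            continue
        total_wait += wait
        free_at[k] = t + wait + random.expovariate(service_rate)
    served = n_calls - abandoned
    return abandoned / n_calls, total_wait / served

p_abandon, avg_wait = simulate()
print(f"abandonment probability ~ {p_abandon:.3f}, mean wait of served calls ~ {avg_wait:.3f}")
```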
The Defense Industrial Base: Prescription for a Psychosomatic Ailment
1983-08-01
The Decision-Making Process ... Notes ... FIGURE 4-1. The Decision-Making Process ... the strategy and tactics process to make certain that we can attain our national security objectives. (IFP is also known as mobilization planning or ... decision-making model that could improve the capacity and capability of the military-industrial complex, thereby increasing the probability of success
Sung, Joo Hyun; Oh, Inbo; Kim, Ahra; Lee, Jiho; Sim, Chang Sun; Yoo, Cheolin; Park, Sang Jin; Kim, Geun Bae; Kim, Yangho
2018-01-29
Industrial pollution may affect the heavy metal body burden of people living near industrial complexes. We determined the average concentrations of atmospheric heavy metals in areas close to and distant from industrial complexes in Korea, and the body concentrations of these heavy metals in residents living near and distant from these facilities. The atmospheric data of heavy metals (lead and cadmium) were from the Regional Air Monitoring Network in Ulsan. We recruited 1,148 participants, 872 who lived near an industrial complex ("exposed" group) and 276 who lived distant from industrial complexes ("non-exposed" group), and measured their concentrations of blood lead, urinary cadmium, and urinary total mercury. The results showed that atmospheric and human concentrations of heavy metals were higher in areas near industrial complexes. In addition, residents living near industrial complexes had higher individual and combined concentrations (cadmium + lead + mercury) of heavy metals. We conclude that residents living near industrial complexes are exposed to high concentrations of heavy metals, and should be carefully monitored.
2017-01-01
Background: Industrial pollution may affect the heavy metal body burden of people living near industrial complexes. We determined the average concentrations of atmospheric heavy metals in areas close to and distant from industrial complexes in Korea, and the body concentrations of these heavy metals in residents living near and distant from these facilities. Methods: The atmospheric data of heavy metals (lead and cadmium) were from the Regional Air Monitoring Network in Ulsan. We recruited 1,148 participants, 872 who lived near an industrial complex (“exposed” group) and 276 who lived distant from industrial complexes (“non-exposed” group), and measured their concentrations of blood lead, urinary cadmium, and urinary total mercury. Results: The results showed that atmospheric and human concentrations of heavy metals were higher in areas near industrial complexes. In addition, residents living near industrial complexes had higher individual and combined concentrations (cadmium + lead + mercury) of heavy metals. Conclusion: We conclude that residents living near industrial complexes are exposed to high concentrations of heavy metals, and should be carefully monitored. PMID:29349943
The academic-industrial complex: navigating the translational and cultural divide.
Freedman, Stephen; Mullane, Kevin
2017-07-01
In general, the fruits of academic discoveries can only be realized through joint efforts with industry. However, the poor reproducibility of much academic research has damaged credibility and jeopardized translational efforts that could benefit patients. Meanwhile, journals are rife with articles bemoaning the limited productivity and increasing costs of the biopharmaceutical industry and its resultant predilection for mergers and reorganizations while decreasing internal research efforts. The ensuing disarray and uncertainty has created tremendous opportunities for academia and industry to form even closer ties, and to embrace new operational and financial models to their joint benefit. This review article offers a personal perspective on the opportunities, models and approaches that harness the increased interface and growing interdependency between biomedical research institutes, the biopharmaceutical industry and the technological world.
A cooperative game theory approach to transmission planning in power systems
NASA Astrophysics Data System (ADS)
Contreras, Javier
The rapid restructuring of the electric power industry from a vertically integrated entity into a decentralized industry has given rise to complex problems. In particular, the transmission component of the electric power system requires new methodologies to fully capture this emerging competitive industry. Game theory models are used to model strategic interactions in a competitive environment. This thesis presents a new decentralized framework to study the transmission network expansion problem using cooperative game theory. First, the players and the rules of the game are defined. Second, a coalition formation scheme is developed. Finally, the optimized cost of expansion is allocated based on the history of the coalition formation.
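As a generic cooperative-game illustration (the thesis allocates cost from the history of coalition formation, which may differ), the sketch below computes a Shapley-value allocation of a transmission-expansion cost among three made-up players.

```python
# Generic cooperative-game sketch: Shapley-value allocation of a transmission-expansion
# cost among three players. The cost function below is invented for illustration and is
# not the allocation rule used in the thesis.
from itertools import permutations

players = ["GenCo", "LoadA", "LoadB"]
cost = {                                # stand-alone and coalition expansion costs, assumed
    frozenset(): 0.0,
    frozenset({"GenCo"}): 60.0, frozenset({"LoadA"}): 50.0, frozenset({"LoadB"}): 40.0,
    frozenset({"GenCo", "LoadA"}): 90.0, frozenset({"GenCo", "LoadB"}): 80.0,
    frozenset({"LoadA", "LoadB"}): 70.0,
    frozenset({"GenCo", "LoadA", "LoadB"}): 110.0,
}

shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    coalition = frozenset()
    for p in order:
        marginal = cost[coalition | {p}] - cost[coalition]   # player's marginal cost
        shapley[p] += marginal / len(orders)
        coalition = coalition | {p}

print(shapley)    # the allocations sum to the grand-coalition cost of 110
```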
The system of technical diagnostics of the industrial safety information network
NASA Astrophysics Data System (ADS)
Repp, P. V.
2017-01-01
This research is devoted to problems of safety of the industrial information network. The basic sub-networks ensuring reliable operation of the elements of the industrial Automatic Process Control System were identified. The core tasks of technical diagnostics of industrial information safety are presented. The structure of the technical diagnostics system for information safety is proposed. It includes two parts: a generator of cyber-attacks and a virtual model of the enterprise information network. The virtual model was obtained by scanning a real enterprise network. A new classification of cyber-attacks is proposed. This classification enables one to design an efficient generator of cyber-attack sets for testing the virtual models of the industrial information network. Monte Carlo methods (with Sobol LPτ sequences) and Markov chains were considered as the design basis for the cyber-attack generation algorithm. The proposed system also includes a diagnostic analyzer performing expert functions. The stability factor (Kstab) was selected as an integrative quantitative indicator of network reliability. This factor is determined by the weights of the sets of cyber-attacks that identify vulnerabilities of the network. The weight depends on the frequency and complexity of cyber-attacks, the degree of damage, and the complexity of remediation. The proposed Kstab is an effective integral quantitative measure of information network reliability.
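One way the attack-set generator could be sketched is as a Markov chain over coarse attack categories, as below; the states and transition matrix are invented, and the Sobol LPτ sampling and the paper's own attack classification are not reproduced.

```python
# Markov-chain sketch of a cyber-attack sequence generator over coarse attack categories.
# The categories and transition matrix are invented placeholders.
import numpy as np

states = ["recon", "intrusion", "escalation", "exfiltration"]
P = np.array([[0.5, 0.4, 0.1, 0.0],     # rows: current state, columns: next state
              [0.2, 0.4, 0.3, 0.1],
              [0.0, 0.2, 0.5, 0.3],
              [0.3, 0.0, 0.2, 0.5]])

rng = np.random.default_rng(7)

def attack_sequence(length=8, start=0):
    seq, s = [states[start]], start
    for _ in range(length - 1):
        s = rng.choice(len(states), p=P[s])   # sample the next attack category
        seq.append(states[s])
    return seq

for _ in range(3):
    print(" -> ".join(attack_sequence()))
```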
Industrial systems biology and its impact on synthetic biology of yeast cell factories.
Fletcher, Eugene; Krivoruchko, Anastasia; Nielsen, Jens
2016-06-01
Engineering industrial cell factories to effectively yield a desired product while dealing with industrially relevant stresses is usually the most challenging step in the development of industrial production of chemicals using microbial fermentation processes. Using synthetic biology tools, microbial cell factories such as Saccharomyces cerevisiae can be engineered to express synthetic pathways for the production of fuels, biopharmaceuticals, fragrances, and food flavors. However, directing fluxes through these synthetic pathways towards the desired product can be demanding due to complex regulation or poor gene expression. Systems biology, which applies computational tools and mathematical modeling to understand complex biological networks, can be used to guide synthetic biology design. Here, we present our perspective on how systems biology can impact synthetic biology towards the goal of developing improved yeast cell factories.
Rivers, Patrick A.; Glover, Saundra H.
2010-01-01
Purpose: In all industries, competition among businesses has long been encouraged as a mechanism to increase value for patients. In other words, competition ensures the provision of better products and services to satisfy the needs of customers. This paper aims to develop a model that can be used to empirically investigate a number of complex issues and relationships associated with competition in the health care industry. Design/methodology/approach: A literature review was conducted. A total of 50 items of literature related to the subject were reviewed. Various perspectives of competition, the nature of service quality, health system costs, and patient satisfaction in health care are examined. Findings: A model of the relationship among these variables is developed. The model depicts patient satisfaction as an outcome measure directly dependent on competition. Quality of care and health care system costs, while also directly dependent on the strategic mission and goals, are considered as determinants of customer satisfaction as well. The model is discussed in the light of propositions for empirical research. Practical implications: Empirical studies based on the model proposed in this paper should help identify areas with significant impact on patient satisfaction while maintaining high quality of service at lower costs in a competitive environment. Originality/value: The authors develop a research model which includes propositions to examine the complex issues of competition in the health care industry. PMID:19579575
OpenFOAM: Open source CFD in research and industry
NASA Astrophysics Data System (ADS)
Jasak, Hrvoje
2009-12-01
The current focus of development in industrial Computational Fluid Dynamics (CFD) is the integration of CFD into computer-aided product development, geometrical optimisation, robust design, and similar tasks. On the other hand, CFD research aims to extend the boundaries of practical engineering use into "non-traditional" areas. The requirements of computational flexibility and code integration are contradictory: a change of coding paradigm, with object orientation, library components, and equation mimicking, is proposed as a way forward. This paper describes OpenFOAM, a C++ object-oriented library for Computational Continuum Mechanics (CCM) developed by the author. Efficient and flexible implementation of complex physical models is achieved by mimicking the form of the partial differential equation in software, with code functionality provided in library form. The open-source deployment and development model allows the user to achieve the desired versatility in physical modeling without sacrificing complex geometry support and execution efficiency.
Clinical Complexity in Medicine: A Measurement Model of Task and Patient Complexity.
Islam, R; Weir, C; Del Fiol, G
2016-01-01
Complexity in medicine needs to be reduced to simple components in a way that is comprehensible to researchers and clinicians. Few studies in the current literature propose a measurement model that addresses both task and patient complexity in medicine. The objective of this paper is to develop an integrated approach to understand and measure clinical complexity by incorporating both task and patient complexity components focusing on the infectious disease domain. The measurement model was adapted and modified for the healthcare domain. Three clinical infectious disease teams were observed, audio-recorded and transcribed. Each team included an infectious diseases expert, one infectious diseases fellow, one physician assistant and one pharmacy resident fellow. The transcripts were parsed and the authors independently coded complexity attributes. This baseline measurement model of clinical complexity was modified in an initial set of coding processes and further validated in a consensus-based iterative process that included several meetings and email discussions by three clinical experts from diverse backgrounds from the Department of Biomedical Informatics at the University of Utah. Inter-rater reliability was calculated using Cohen's kappa. The proposed clinical complexity model consists of two separate components. The first is a clinical task complexity model with 13 clinical complexity-contributing factors and 7 dimensions. The second is the patient complexity model with 11 complexity-contributing factors and 5 dimensions. The measurement model for complexity encompassing both task and patient complexity will be a valuable resource for future researchers and industry to measure and understand complexity in healthcare.
NASA Astrophysics Data System (ADS)
Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej
2016-04-01
Due to their exceptional strength properties combined with good workability, Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. Design of these processes can be significantly improved by numerical models of phase transformations. Evaluation of the predictive capabilities of such models, as far as their applicability to the simulation of thermal cycles for AHSS is concerned, was the objective of the paper. Two models were considered. The former was an upgrade of the JMAK equation, while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS, though the examples quoted in the paper refer to Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of continuous annealing lines; the objective was to validate the models' behavior in more complex cooling conditions. The second set of tests included experimental simulations of the thermal cycle characteristic of continuous annealing lines, and the capability of the models to properly describe phase transformations in this process was evaluated. The third set included data from an industrial continuous annealing line. Validation and verification of the models confirmed their good predictive capabilities. Since it does not require application of the additivity rule, the upgrade of the Leblond model was selected as the better one for simulation of industrial processes in AHSS production.
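For reference, the classical isothermal JMAK (Avrami) relation that the first model upgrades is X(t) = 1 - exp(-k t^n); the sketch below evaluates it with invented constants rather than fitted DP-steel values.

```python
# Classical isothermal JMAK (Avrami) relation, X(t) = 1 - exp(-k t^n), which the first
# model above upgrades; k and n below are invented, not fitted DP-steel values.
import math

def jmak_fraction(t, k=1.5e-3, n=2.5):
    """Transformed phase fraction after an isothermal hold of t seconds."""
    return 1.0 - math.exp(-k * t ** n)

for t in (5, 10, 20, 40, 80):
    print(f"t = {t:3d} s  ->  X = {jmak_fraction(t):.3f}")
```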
Microcomputers, Model Rockets, and Race Cars.
ERIC Educational Resources Information Center
Mirus, Edward A., Jr.
1985-01-01
The industrial education orientation program at Wisconsin School for the Deaf (WSD) presents problem-solving situations to all seventh- and eighth-grade hearing-impaired students. WSD developed user-friendly microcomputer software to guide students individually through complex computations involving model race cars and rockets while freeing…
NASA Astrophysics Data System (ADS)
Gwiazda, A.; Banas, W.; Sekala, A.; Foit, K.; Hryniewicz, P.; Kost, G.
2015-11-01
The process of workcell design is limited by different constructional requirements. They are related to the technological parameters of the manufactured element, to the specifications of purchased workcell elements and to the technical characteristics of the workcell scene. This shows the complexity of the design-construction process itself. The result of such an approach is an individually designed workcell suited to a specific location and a specific production cycle; changing these parameters forces a rebuild of the whole workcell configuration. Taking this into consideration, it is important to elaborate a base of typical elements of a robot kinematic chain that could be used as a tool for building robotized workcells. Virtual modelling of the kinematic chains of industrial robots requires several preparatory phases. Firstly, it is important to create a database of elements that will serve as models of industrial robot arms. These models can be described as functional primitives that represent the links between the components of the kinematic pairs and the structural members of industrial robots. A database with the following elements is created: the base of kinematic pairs, the base of robot structural elements, and the base of robot work scenes. The first of these databases includes the kinematic pairs that are the key components of the manipulator actuator modules; as mentioned previously, it includes rotary pairs of the fifth class. This type of kinematic pair was chosen because it occurs most frequently in the structures of industrial robots. The second database consists of structural robot elements and therefore allows the conversion of schematic structures of kinematic chains into the structural elements of industrial robot arms. It contains, inter alia, structural elements such as bases and stiff members (simple or angular units), which allow the recorded schematic elements to be converted into three-dimensional ones. The last database is a database of scenes. It includes both simple and complex elements: simple models of technological equipment, conveyor models, models of obstacles and the like. Using these elements, various production spaces (robotized workcells) can be formed, in which it is possible to virtually track the operation of an industrial robot arm modelled in the system.
The central purpose of our study was to examine the performance of the United States Environmental Protection Agency's (EPA) nonreactive Gaussian air quality dispersion model, the Industrial Source Complex Short Term Model (ISCST3) Version 98226, in predicting polychlorinated dib...
Stability of ecological industry chain: an entropy model approach.
Wang, Qingsong; Qiu, Shishou; Yuan, Xueliang; Zuo, Jian; Cao, Dayong; Hong, Jinglan; Zhang, Jian; Dong, Yong; Zheng, Ying
2016-07-01
A novel methodology is proposed in this study to examine the stability of an ecological industry chain network based on entropy theory. The methodology is developed according to the associated dissipative structure characteristics, i.e., complexity, openness, and nonlinearity. As defined in the methodology, the network organization is the object of analysis, while the main focus is the identification of core enterprises and core industry chains. It is proposed that the chain network should be established around the core enterprise, and that supplementation of the core industry chain helps to improve system stability, which is verified quantitatively. The relational entropy model can be used to identify the core enterprise and the core eco-industry chain; it determines the core of the network organization and the core eco-industry chain through the link form and direction of node enterprises. Similarly, the conductive mechanism of different node enterprises can be examined quantitatively despite the absence of key data. The structural entropy model can be employed to solve the problem of the order degree of the network organization. Results showed that the stability of the entire system could be enhanced by supplementing the chain around the core enterprise in the eco-industry chain network organization. As a result, the sustainability of the entire system could be further improved.
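As a loose illustration of the entropy-based viewpoint (not the paper's relational or structural entropy models), the sketch below computes the Shannon entropy of the degree distribution of a small, hypothetical chain network; the node names and edges are invented.

```python
# Generic sketch: Shannon entropy of a network's degree distribution, a crude
# proxy for the (dis)order of a chain network organized around a core enterprise.
import math
from collections import Counter

edges = [("core", "supplier1"), ("core", "supplier2"), ("core", "recycler"),
         ("supplier1", "recycler"), ("supplier2", "byproduct_user")]

degree = Counter()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

total = sum(degree.values())
probs = [d / total for d in degree.values()]
entropy = -sum(p * math.log2(p) for p in probs)
print(f"degree-distribution entropy: {entropy:.3f} bits")
```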
Thermal Indices and Thermophysiological Modeling for Heat Stress.
Havenith, George; Fiala, Dusan
2015-12-15
The assessment of the risk of human exposure to heat is a topic as relevant today as it was a century ago. The introduction and use of heat stress indices and models to predict and quantify heat stress and heat strain has helped to reduce morbidity and mortality in industrial, military, sports, and leisure activities dramatically. Models used range from simple instruments that attempt to mimic human-environment heat exchange to complex thermophysiological models that simulate both internal and external heat and mass transfer, including the related processes through (protective) clothing. This article discusses the most commonly used indices and models and looks at how these are deployed in the different contexts of industrial, military, and biometeorological applications, with a focus on their use to predict thermal sensation and acute risk of heat illness, and on epidemiological analysis of morbidity and mortality. A critical assessment is made of the tendency to use simple indices such as WBGT in more complex conditions (e.g., while wearing protective clothing), or in conjunction with inappropriate sensors. Regarding the more complex thermophysiological models, the article discusses more recent developments, including model individualization approaches and advanced systems that combine simulation models with (body-worn) sensors to provide real-time risk assessment. The models discussed range from historical indices to recent developments in using thermophysiological models in (bio)meteorological applications as an indicator of the combined effect of outdoor weather settings on humans. Copyright © 2015 John Wiley & Sons, Inc.
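As a concrete example of the simple end of this spectrum, the sketch below evaluates the standard WBGT index in its common outdoor (solar load) and indoor forms; the input temperatures are illustrative values, not data from the article.

```python
# Standard WBGT weightings: outdoor = 0.7*Tnwb + 0.2*Tg + 0.1*Ta; indoor = 0.7*Tnwb + 0.3*Tg.
def wbgt_outdoor(t_nwb, t_globe, t_air):
    """Natural wet-bulb, globe and dry-bulb air temperatures in deg C."""
    return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_air

def wbgt_indoor(t_nwb, t_globe):
    return 0.7 * t_nwb + 0.3 * t_globe

print(f"WBGT outdoor: {wbgt_outdoor(25.0, 45.0, 32.0):.1f} C")  # illustrative inputs
print(f"WBGT indoor:  {wbgt_indoor(25.0, 40.0):.1f} C")
```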
Experimental application of OMA solutions on the model of industrial structure
NASA Astrophysics Data System (ADS)
Mironov, A.; Mironovs, D.
2017-10-01
It is very important, and sometimes even vital, to maintain the reliability of industrial structures. High-quality control during production and structural health monitoring (SHM) in exploitation provide reliable functioning of large, massive and remote structures, such as wind generators, pipelines, power line posts, etc. This paper introduces a complex of technological and methodical solutions for SHM and diagnostics of industrial structures, including those that are actuated by periodic forces. The solutions were verified on a scaled wind generator model with an integrated system of piezo-film deformation sensors. Simultaneous and multi-patch Operational Modal Analysis (OMA) approaches were implemented as methodical means for structural diagnostics and monitoring. Specially designed data processing algorithms provide objective evaluation of structural state modification.
On the engineering design for systematic integration of agent-orientation in industrial automation.
Yu, Liyong; Schüller, Andreas; Epple, Ulrich
2014-09-01
In today's automation industry, agent-oriented development of system functionalities appears to have great potential for increasing the autonomy and flexibility of complex operations while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
A modified Lotka-Volterra model for the evolution of coordinate symbiosis in energy enterprise
NASA Astrophysics Data System (ADS)
Zhou, Li; Wang, Teng; Lyu, Xiaohuan; Yu, Jing
2018-02-01
Recent developments in energy markets make the operating industries more dynamic and complex, and energy enterprises cooperate more closely in the industrial chain and in symbiosis. In order to further discuss the evolution of coordinate symbiosis in energy enterprises, a modified Lotka-Volterra equation is introduced to develop a symbiosis analysis model of energy groups. According to the equilibrium and stability analysis, the conclusion is obtained that if the upstream energy group and the downstream energy group are in a symbiotic state, the growth of their utilities will be greater than their independent values. Energy enterprises can obtain mutual benefits and positive promotion in the industrial chain through cooperation.
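A minimal numerical sketch of a mutualistic (symbiotic) Lotka-Volterra system of the kind the paper modifies is shown below; the growth rates, carrying capacities and benefit coefficients are illustrative assumptions, not values from the paper. At equilibrium both utilities exceed their independent carrying capacities, mirroring the abstract's conclusion.

```python
# Mutualistic Lotka-Volterra sketch: upstream group x, downstream group y.
r1, r2 = 0.4, 0.3        # intrinsic growth rates (assumed)
K1, K2 = 100.0, 80.0     # independent carrying capacities ("independent values")
a12, a21 = 0.3, 0.2      # mutual benefit coefficients (a12 * a21 < 1 for stability)

x, y, dt = 10.0, 10.0, 0.01
for _ in range(int(200 / dt)):                      # forward-Euler integration
    dx = r1 * x * (1 - x / K1 + a12 * y / K2)
    dy = r2 * y * (1 - y / K2 + a21 * x / K1)
    x, y = x + dx * dt, y + dy * dt

print(f"equilibrium utilities: x ~ {x:.1f} (> K1 = {K1}), y ~ {y:.1f} (> K2 = {K2})")
```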
Egea, Francisco J; Torrente, Roberto G; Aguilar, Alfredo
2018-01-25
In the last ten years, bioeconomy strategies and policy-related bioeconomy initiatives have been developed all over the world. Some of them are currently in the process of translation into specific actions. In most cases, the approaches followed have been top-down policy-related initiatives, triggered by the public sector originating a dynamic which can bring together different bioeconomy stakeholders i.e. industry, academia, financial operators and farmers. This article describes a bottom-up situation with unique bioeconomy-related features that deserve specific attention. Over the last 40 years, Almería, in the south east of Spain, has developed one of the most efficient agro-industrial complexes in the world, evolving from a traditional and subsistence agriculture, to becoming the major vegetable exporter in the European Union (EU). This growth set aside issues such as sustainability, long-term perspectives on water resources or agricultural waste. However, societal concerns about a circular economy, as well as policy initiatives in the EU and in Spain on bioeconomy are changing the situation towards an integrated, efficient and sustainable bioeconomy system. Currently, the production chain demands innovations related to the use of biomass as source of bioproducts and bioenergy in order to remain competitive. Some positive aspects are the relatively small size of the agro-industrial area, making transport and communications rapid and easy, and the existence of strong and dedicated academic and financial institutions. This article outlines the current efforts and initiatives to couple the existing successful agro-industrial complex with that of a fully sustainable bioeconomy model. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Lawley, Russell; Lee, Kathryn; Lark, Murray
2015-04-01
At BGS, expert elicitation has been used to evaluate the relative impacts arising from multiple geohazards that can affect the UK housing stock. In turn, this 'consensus' understanding has been developed into a tool to assist the UK insurance industry in underwriting the domestic property market. BGS models six geohazards deemed relevant to UK housing: landslides, shrink-swell (heave), compressible ground, dissolution (karst), collapsible ground and running sand. The models are widely used and have been developed over two decades of research. However, stakeholders such as the insurance industry are not well equipped to manage geohazard models directly and need the information to be categorised in a meaningful way, appropriate to their business models. Creating terminologies to communicate the relative threats for each geohazard has been relatively straightforward, but communicating the impacts of multiple geohazards, or comparing the relative risks of one geohazard against another, has proved more difficult. Expert elicitation has been used since 2010 to try to build a consensus model for geohazards and to help BGS communicate its knowledge to stakeholders. Typically, the BGS geohazard models are provided with five levels of susceptibility: A (low or absent), B, C, D and E (high). Resolving individual models is relatively simple, but the insurance market is extremely dynamic, and a need has emerged to simplify and convey the possible threats from all geohazards as a single 'rating' of susceptibility. This poses a problem when trying to convey the geological understanding behind the models: for example, how does one convey the combined (or comparative) susceptibility of an area with a high susceptibility to dissolution and a moderate susceptibility to landslides? The difficulty is compounded by the fact that stakeholders resolve spatial distributions via frameworks such as postcode sectors, and that the outputs of most geohazard models are sensitive to the scope and scale of such frameworks. The elicitation process (the first to be deployed by BGS) allowed a significant degree of structured knowledge exchange between experts with differing geohazard backgrounds. Consensus over the likely impacts arising from the geohazards was achieved (where previously there had been none). In the process of harmonising the models, it became clear that further elicitation (within BGS and externally) could be used to refine the models on a more regular basis and provide a consistency relevant to other industries (such as construction). By establishing a consensus, it has been possible to provide improved understanding to the insurance industry with simpler metrics, whilst maintaining scope for also conveying the underlying complexity and natural variance in the models. We will discuss our experience of the use of elicitation methodology and the implications of our results for further work at BGS to convey uncertain and complex models to stakeholders and non-geologists.
NASA Astrophysics Data System (ADS)
Mammarella, M. C.; Grandoni, G.; Fernando, J.; Cacciani, M.; di Sabatino, S.; Favaron, M.; Fedele, P.
2010-09-01
The connection among boundary layer phenomena, atmospheric pollutant dynamics and human health is an established fact, taking many different forms depending on local characteristics, including the slope and position of relief and/or coastline, surface roughness, and emission patterns. The problem is especially interesting in complex and coastal terrain, where slope- and sea-induced local circulations interact reciprocally, yielding a complex pattern whose interpretation may go beyond pure modeling and require specific measurements, among them the planetary boundary layer (PBL) height. An occasion for studying this important theme has been offered by Regione Molise and the Valle del Biferno Consortium (COSIB), for the specific case of the industrial complex of Valle del Biferno, 3 km inland of Termoli, in central Italy, on the Adriatic coast. The local government, sensitive to air quality and public health in the industrial area, together with COSIB has co-financed a research project aimed at gaining knowledge about local meteorology, PBL phenomena and atmospheric pollutant dispersion in the area. Expected results include new air quality monitoring and control methodologies in Valle del Biferno for sustainable development in an environmentally respectful manner, at a site already characterized by high environmental and landscape value. The research project, developed by ENEA, began in 2007 and will conclude in December 2010. Project activities involve research groups from Europe, the United States of America, and the Russian Federation. Scientific and practical results will be published and presented at the final workshop to be held at the project's conclusion. The scientific interest of the Valle del Biferno case stems from the specific local characteristics of the site. Given the valley's orientation with respect to the mean synoptic circulation, local effects such as sea and slope breezes are dominant, and a complex wind regime develops that affects local transport and diffusion of pollutants emitted in the area of the industrial complex. All effects studied, although influenced by local conditions, characterize not only this industrial area but all areas located along the coastline. Such settings are highly frequent in Italy and worldwide, as most industrial complexes are located at coastal sites, where access to harbors and transport networks is facilitated. The Valle del Biferno case may therefore yield important data for many industrial sites.
Geohydrology and simulation of ground-water flow in the aquifer system near Calvert City, Kentucky
Starn, J.J.; Arihood, L.D.; Rose, M.F.
1995-01-01
The U.S. Geological Survey, in cooperation with the Kentucky Natural Resources and Environmental Protection Cabinet, constructed a two-dimensional, steady-state ground-water-flow model to estimate hydraulic properties, contributing areas to discharge boundaries, and the average linear velocity at selected locations in an aquifer system near Calvert City, Ky. Nonlinear regression was used to estimate values of model parameters and the reliability of the parameter estimates. The regression minimizes the weighted difference between observed and calculated hydraulic heads and rates of flow. The calibrated model generally was better than the alternative models considered, and although adding transmissive faults in the bedrock produced a slightly better model, fault transmissivity was not estimated reliably. The average transmissivity of the aquifer was 20,000 feet squared per day. Recharge rates to two outcrop areas, the McNairy Formation of Cretaceous age and the alluvium of Quaternary age, were 0.00269 feet per day (11.8 inches per year) and 0.000484 feet per day (2.1 inches per year), respectively. Contributing areas to wells at the Calvert City Water Company in 1992 did not include the Calvert City Industrial Complex. Since completion of the fieldwork for this study in 1992, the Calvert City Water Company discontinued use of its wells and began withdrawing water from new wells located 4.5 miles east-southeast of the previous location; the contributing area moved farther from the industrial complex. The extent of the alluvium contributing water to wells was limited by the overlying lacustrine deposits. The average linear ground-water velocity at the industrial complex ranged from 0.90 feet per day to 4.47 feet per day, with a mean of 1.98 feet per day.
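A quick arithmetic check (not part of the report) confirms the parenthetical unit conversions quoted above for the recharge rates.

```python
# Convert recharge from feet per day to inches per year.
FT_PER_DAY_TO_IN_PER_YR = 12 * 365.25  # inches per foot times days per year

for recharge_ft_day in (0.00269, 0.000484):
    in_per_yr = recharge_ft_day * FT_PER_DAY_TO_IN_PER_YR
    print(f"{recharge_ft_day} ft/d  ~  {in_per_yr:.1f} in/yr")  # 11.8 and 2.1 in/yr
```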
Real-time monitoring of high-gravity corn mash fermentation using in situ raman spectroscopy.
Gray, Steven R; Peretti, Steven W; Lamb, H Henry
2013-06-01
In situ Raman spectroscopy was employed for real-time monitoring of simultaneous saccharification and fermentation (SSF) of corn mash by an industrial strain of Saccharomyces cerevisiae. An accurate univariate calibration model for ethanol was developed based on the very strong 883 cm⁻¹ C-C stretching band. Multivariate partial least squares (PLS) calibration models for total starch, dextrins, maltotriose, maltose, glucose, and ethanol were developed using data from eight batch fermentations and validated using predictions for a separate batch. The starch, ethanol, and dextrins models showed significant prediction improvement when the calibration data were divided into separate high- and low-concentration sets. Collinearity between the ethanol and starch models was avoided by excluding regions containing strong ethanol peaks from the starch model and, conversely, excluding regions containing strong saccharide peaks from the ethanol model. The two-set calibration models for starch (R² = 0.998, percent error = 2.5%) and ethanol (R² = 0.999, percent error = 2.1%) provide more accurate predictions than any previously published spectroscopic models. Glucose, maltose, and maltotriose are modeled to accuracy comparable to previous work on less complex fermentation processes. Our results demonstrate that Raman spectroscopy is capable of real time in situ monitoring of a complex industrial biomass fermentation. To our knowledge, this is the first PLS-based chemometric modeling of corn mash fermentation under typical industrial conditions, and the first Raman-based monitoring of a fermentation process with glucose, oligosaccharides and polysaccharides present. Copyright © 2013 Wiley Periodicals, Inc.
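A hedged sketch of the kind of PLS calibration described (spectra in, analyte concentration out) is shown below; it is built on synthetic Gaussian-peak "spectra" rather than the study's Raman data, and the peak position, noise level and component count are all assumptions.

```python
# Synthetic PLS calibration sketch: spectra -> concentration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_samples, n_channels = 40, 300
concentration = rng.uniform(0, 120, n_samples)                 # e.g. g/L of analyte
peak = np.exp(-0.5 * ((np.arange(n_channels) - 150) / 8) ** 2)  # single synthetic band
spectra = np.outer(concentration, peak) + rng.normal(0, 0.5, (n_samples, n_channels))

pls = PLSRegression(n_components=3)
pls.fit(spectra[:30], concentration[:30])                       # "calibration batches"
pred = pls.predict(spectra[30:]).ravel()                        # held-out "validation batch"
print(f"R^2 on held-out samples: {r2_score(concentration[30:], pred):.3f}")
```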
Predicting Deforestation Patterns in Loreto, Peru from 2000-2010 Using a Nested GLM Approach
NASA Astrophysics Data System (ADS)
Vijay, V.; Jenkins, C.; Finer, M.; Pimm, S.
2013-12-01
Loreto is the largest province in Peru, covering about 370,000 km2. Because of its remote location in the Amazonian rainforest, it is also one of the most sparsely populated. Though a majority of the region remains covered by forest, deforestation is being driven by human encroachment through industrial activities and the spread of colonization and agriculture. The importance of accurate predictive modeling of deforestation has spawned an extensive body of literature on the topic. We present a nested GLM approach based on predictions of deforestation from 2000-2010 and using variables representing the expected drivers of deforestation. Models were constructed using 2000 to 2005 changes and tested against data for 2005 to 2010. The most complex model, which included transportation variables (roads and navigable rivers), spatial contagion processes, population centers and industrial activities, performed better in predicting the 2005 to 2010 changes (75.8% accurate) than did a simpler model using only transportation variables (69.2% accurate). Finally we contrast the GLM approach with a more complex spatially articulated model.
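For illustration only, the sketch below fits a simple and a more complex logistic GLM to synthetic pixel data and compares their accuracy; the predictors (distance to transport and a spatial-contagion proxy) echo the abstract, but the data and coefficients are invented.

```python
# Synthetic comparison of a simple vs. a more complex logistic GLM for deforestation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n = 2000
dist_transport = rng.exponential(10, n)     # km to nearest road or navigable river
neighbor_loss = rng.uniform(0, 1, n)        # fraction of deforested neighbors (contagion proxy)
logit = 1.5 - 0.3 * dist_transport + 2.0 * neighbor_loss
deforested = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_simple = dist_transport.reshape(-1, 1)
X_complex = np.column_stack([dist_transport, neighbor_loss])
simple = LogisticRegression().fit(X_simple, deforested)
complex_model = LogisticRegression().fit(X_complex, deforested)

print("simple model accuracy: ", accuracy_score(deforested, simple.predict(X_simple)))
print("complex model accuracy:", accuracy_score(deforested, complex_model.predict(X_complex)))
```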
Dynamic Fuzzy Model Development for a Drum-type Boiler-turbine Plant Through GK Clustering
NASA Astrophysics Data System (ADS)
Habbi, Ahcène; Zelmat, Mimoun
2008-10-01
This paper discusses a TS fuzzy model identification method for an industrial drum-type boiler plant using the GK fuzzy clustering approach. The fuzzy model is constructed from a set of input-output data that covers a wide operating range of the physical plant. The reference data are generated using a complex first-principles mathematical model that describes the key dynamical properties of the boiler-turbine dynamics. The proposed fuzzy model is derived by means of a fuzzy clustering method, with particular attention to structure flexibility and model interpretability issues. This may provide the basis for a new way to design model-based control and diagnosis mechanisms for the complex nonlinear plant.
NASA Astrophysics Data System (ADS)
Erickson, M.; Olaguer, J.; Wijesinghe, A.; Colvin, J.; Neish, B.; Williams, J.
2014-12-01
It is becoming increasingly important to understand the emissions and health effects of industrial facilities. Many areas have no or limited sustained monitoring capabilities, making it difficult to quantify the major pollution sources affecting human health, especially in fence line communities. Developments in real-time monitoring and micro-scale modeling offer unique ways to tackle these complex issues. This presentation will demonstrate the capability of coupling real-time observations with micro-scale modeling to provide real-time information and near real-time source attribution. The Houston Advanced Research Center constructed the Mobile Acquisition of Real-time Concentrations (MARC) laboratory. MARC consists of a Ford E-350 passenger van outfitted with a Proton Transfer Reaction Mass Spectrometer (PTR-MS) and meteorological equipment. This allows for the fast measurement of various VOCs important to air quality. The data recorded from the van is uploaded to an off-site database and the information is broadcast to a website in real-time. This provides for off-site monitoring of MARC's observations, which allows off-site personnel to provide immediate input to the MARC operators on how to best achieve project objectives. The information stored in the database can also be used to provide near real-time source attribution. An inverse model has been used to ascertain the amount, location, and timing of emissions based on MARC measurements in the vicinity of industrial sites. The inverse model is based on a 3D micro-scale Eulerian forward and adjoint air quality model known as the HARC model. The HARC model uses output from the Quick Urban and Industrial Complex (QUIC) wind model and requires a 3D digital model of the monitored facility based on lidar or industrial permit data. MARC is one of the instrument platforms deployed during the 2014 Benzene and other Toxics Exposure Study (BEE-TEX) in Houston, TX. The main goal of the study is to quantify and explain the origin of ambient exposure to hazardous air pollutants in an industrial fence line community near the Houston Ship Channel. Preliminary results derived from analysis of MARC observations during the BEE-TEX experiment will be presented.
Source apportionment is challenging in urban environments with clustered source emissions that have similar chemical signatures. A field and inverse modeling study was conducted in Elizabeth, New Jersey to observe gaseous and particulate pollution near the Port of New York and New J...
The University in the Knowledge Economy: The Triple Helix Model and Its Implications
ERIC Educational Resources Information Center
Zheng, Peijun; Harris, Michael
2007-01-01
In the context of the global knowledge economy, the three major players--university, industry, and government--are becoming increasingly interdependent. As more intensified interactions and relationships of increasing complexity among the institutions evolve, the Triple Helix model attempts to describe not only interactions among university,…
Simulation of the effect of air pollution on forest ecosystems in a region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarko, A.M.; Bykadorov, A.V.; Kryuchkov, V.V.
1995-03-01
This article describes a model of air pollution effects on spruce in forests of the northern taiga regions which have been exposed to air pollution from a large metallurgical industrial complex. Both the predictions the model makes about forest ecosystem degradation zones and the limitations of the model are discussed. 5 refs., 1 fig.
Cabello, Purificación; Luque-Almagro, Víctor M; Olaya-Abril, Alfonso; Sáez, Lara P; Moreno-Vivián, Conrado; Roldán, M Dolores
2018-03-01
Mining, jewellery and metal-processing industries use cyanide for extracting gold and other valuable metals, generating large amounts of highly toxic wastewater. Biological treatments may be a clean alternative, from an environmental point of view, to the conventional physical or chemical processes used to remove cyanide and related compounds from these industrial effluents. Pseudomonas pseudoalcaligenes CECT5344 can grow under alkaline conditions using cyanide, cyanate or different nitriles as the sole nitrogen source, and is able to remove up to 12 mM total cyanide from a jewellery industry wastewater that contains cyanide both free and complexed to metals. Complete genome sequencing of this bacterium has allowed the application of transcriptomic and proteomic techniques, providing a holistic view of the cyanide biodegradation process. The complex response to cyanide of the cyanotrophic bacterium P. pseudoalcaligenes CECT5344 and the potential biotechnological applications of this model organism in the bioremediation of cyanide-containing industrial residues are reviewed.
Ruth, Matthias; Davidsdottir, Brynhildur; Amato, Anthony
2004-03-01
Changes in material use, energy use and emissions profiles of industry are the result of complex interrelationships among a multitude of technological and economic drivers. To better understand and guide such changes requires that attention be paid to the time-varying consequences that technological and economic influences have on an industry's choice of inputs and its associated (desired and undesired) outputs. This paper lays out an approach to improving our understanding of the dynamics of large industrial systems. The approach combines engineering and econometric analysis with a detailed representation of an industry's capital stock structure. A transparent dynamic computer modeling approach is chosen to integrate information from these analyses in ways that foster participation of stakeholders from industry and government agencies in all stages of the modeling process, from problem definition and determination of system boundaries to generation of scenarios and interpretation of results. Three case studies of industrial energy use in the USA are presented, one each for the iron and steel, pulp and paper, and ethylene industries. Dynamic models of these industries are described and then used to investigate alternative carbon emissions and investment-led policies. A comparison of results clearly points towards two key issues: the need for industry-specific policy approaches in order to effectively influence industrial energy use, fuel mix and carbon emissions, and the need for longer time horizons than have typically been chosen for the analysis of industrial responses to climate change policies.
Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony
2010-02-01
Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system.
NASA Astrophysics Data System (ADS)
Schaefer, R. K.; Nix, M.; Ihde, A. G.; Paxton, L. J.; Weiss, M.; Simpkins, S.; Fountain, G. H.; APL GAIA Team
2011-12-01
In this paper we describe the application of a proven methodology for modeling the complex social and economic interactions of a system under stress to the regional issues that are tied to global climate disruption. Under the auspices of the GAIA project (http://gaia.jhuapl.edu), we have investigated simulating the complex interplay between climate, politics, society, industry, and the environment in the Chesapeake Bay Watershed and associated geographic areas of Maryland, Virginia, and Pennsylvania. This Chesapeake Bay simulation draws on interrelated geophysical and climate models to support decision-making analysis about the Bay. In addition to physical models, however, human activity is also incorporated via input and output calculations. For example, policy implications are modeled in relation to business activities surrounding fishing, farming, industry and manufacturing, land development, and tourism. This approach fosters collaboration among subject matter experts to advance a more complete understanding of the regional impacts of climate change. Simulated interactive competition, in which teams of experts are assigned conflicting objectives in a controlled environment, allows exploration of the subject while avoiding trivial solutions that neglect the possible responses of affected parties. Results include improved planning, the anticipation of areas of conflict or high risk, and the increased likelihood of developing mutually acceptable solutions.
The consumer-provider relationship in the dental industry.
Griffith, Andrew S; Abratt, Russell
2013-01-01
This article explains how a consumer's level of trust and commitment to his or her dental service provider factors into the relationship between the consumer and the dental professional. Very little research has been done that describes the complexity of this relationship. This article documents the complexity of and influences on that relationship by providing an interaction model of this relationship.
Xing, Lizhi
2017-01-01
The input-output table is comprehensive and detailed in describing the national economic system with complex economic relationships, which embodies information of supply and demand among industrial sectors. This paper aims to scale the degree of competition/collaboration on the global value chain from the perspective of econophysics. Global Industrial Strongest Relevant Network models were established by extracting the strongest and most immediate industrial relevance in the global economic system with inter-country input-output tables and then transformed into Global Industrial Resource Competition Network/Global Industrial Production Collaboration Network models embodying the competitive/collaborative relationships based on bibliographic coupling/co-citation approach. Three indicators well suited for these two kinds of weighted and non-directed networks with self-loops were introduced, including unit weight for competitive/collaborative power, disparity in the weight for competitive/collaborative amplitude and weighted clustering coefficient for competitive/collaborative intensity. Finally, these models and indicators were further applied to empirically analyze the function of sectors in the latest World Input-Output Database, to reveal inter-sector competitive/collaborative status during the economic globalization.
Gordon, Sarah; Daneshian, Mardas; Bouwstra, Joke; Caloni, Francesca; Constant, Samuel; Davies, Donna E; Dandekar, Gudrun; Guzman, Carlos A; Fabian, Eric; Haltner, Eleonore; Hartung, Thomas; Hasiwa, Nina; Hayden, Patrick; Kandarova, Helena; Khare, Sangeeta; Krug, Harald F; Kneuer, Carsten; Leist, Marcel; Lian, Guoping; Marx, Uwe; Metzger, Marco; Ott, Katharina; Prieto, Pilar; Roberts, Michael S; Roggen, Erwin L; Tralau, Tewes; van den Braak, Claudia; Walles, Heike; Lehr, Claus-Michael
2015-01-01
Models of the outer epithelia of the human body - namely the skin, the intestine and the lung - have found valid applications in both research and industrial settings as attractive alternatives to animal testing. A variety of approaches to model these barriers are currently employed in such fields, ranging from the utilization of ex vivo tissue to reconstructed in vitro models, and further to chip-based technologies, synthetic membrane systems and, of increasing current interest, in silico modeling approaches. An international group of experts in the field of epithelial barriers was convened from academia, industry and regulatory bodies to present both the current state of the art of non-animal models of the skin, intestinal and pulmonary barriers in their various fields of application, and to discuss research-based, industry-driven and regulatory-relevant future directions for both the development of new models and the refinement of existing test methods. Issues of model relevance and preference, validation and standardization, acceptance, and the need for simplicity versus complexity were focal themes of the discussions. The outcomes of workshop presentations and discussions, in relation to both current status and future directions in the utilization and development of epithelial barrier models, are presented by the attending experts in the current report.
Network model of bilateral power markets based on complex networks
NASA Astrophysics Data System (ADS)
Wu, Yang; Liu, Junyong; Li, Furong; Yan, Zhanxin; Zhang, Li
2014-06-01
The bilateral power transaction (BPT) mode has become a typical market organization with the restructuring of the electric power industry, and a proper model that can capture its characteristics is urgently needed. However, such a model has been lacking because of this market organization's complexity. As a promising approach to modeling complex systems, complex networks could provide a sound theoretical framework for developing a proper simulation model. In this paper, a complex network model of the BPT market is proposed. In this model, the price advantage mechanism is a precondition. Unlike other general commodity transactions, both the financial layer and the physical layer are considered in the model. Through simulation analysis, the feasibility and validity of the model are verified. At the same time, some typical statistical features of the BPT network are identified: the degree distribution follows a power law, the clustering coefficient is low, and the average path length is relatively long. Moreover, the topological stability of the BPT network is tested. The results show that the network displays topological robustness to random failures of market members while it is fragile against deliberate attacks, and the network can resist cascading failure to some extent. These features are helpful for decision making and risk management in BPT markets.
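A small illustrative sketch of the statistics reported above (clustering, average path length, and robustness to removal of highly connected members) is given below, using networkx and a generic scale-free random graph as a stand-in, since the paper's own network-generation rules are not reproduced here.

```python
# Network statistics on a scale-free stand-in for the BPT network.
import networkx as nx

g = nx.barabasi_albert_graph(200, 2, seed=42)   # heavy-tailed degrees, low clustering

print("average clustering:  ", round(nx.average_clustering(g), 3))
print("average path length: ", round(nx.average_shortest_path_length(g), 3))

# "Deliberate attack": remove the highest-degree nodes and check connectivity.
hubs = sorted(g.degree, key=lambda kv: kv[1], reverse=True)[:5]
g.remove_nodes_from(n for n, _ in hubs)
largest = max(nx.connected_components(g), key=len)
print("largest component after removing 5 hubs:", len(largest), "of", g.number_of_nodes())
```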
Complex networks as an emerging property of hierarchical preferential attachment.
Hébert-Dufresne, Laurent; Laurence, Edward; Allard, Antoine; Young, Jean-Gabriel; Dubé, Louis J
2015-12-01
Real complex systems are not rigidly structured; no clear rules or blueprints exist for their construction. Yet, amidst their apparent randomness, complex structural properties universally emerge. We propose that an important class of complex systems can be modeled as an organization of many embedded levels (potentially infinite in number), all of them following the same universal growth principle known as preferential attachment. We give examples of such hierarchy in real systems, for instance, in the pyramid of production entities of the film industry. More importantly, we show how real complex networks can be interpreted as a projection of our model, from which their scale independence, their clustering, their hierarchy, their fractality, and their navigability naturally emerge. Our results suggest that complex networks, viewed as growing systems, can be quite simple, and that the apparent complexity of their structure is largely a reflection of their unobserved hierarchical nature.
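A minimal sketch of the single-level preferential-attachment growth rule that the hierarchical model applies at every embedded level is shown below; the network size and the degree threshold used to summarize the heavy tail are arbitrary illustrative choices.

```python
# Basic preferential attachment: each new node links to an existing node with
# probability proportional to its degree (one new edge per node).
import random
from collections import Counter

random.seed(0)
targets = [0, 1]                      # start with a single edge between nodes 0 and 1
for new_node in range(2, 5000):
    # picking uniformly from past edge endpoints is equivalent to degree-proportional choice
    attach_to = random.choice(targets)
    targets += [new_node, attach_to]

degrees = Counter(targets)            # each node's count of endpoint appearances = its degree
tail = sum(1 for d in degrees.values() if d >= 20)
print(f"nodes with degree >= 20: {tail} of {len(degrees)} (heavy-tailed degree distribution)")
```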
Aliabadi, Mohsen; Golmohammadi, Rostam; Khotanlou, Hassan; Mansoorizadeh, Muharram; Salarpour, Amir
2014-01-01
Noise prediction is considered to be the best method for evaluating cost-preventative noise controls in industrial workrooms. One of the most important issues is the development of accurate models for analysis of the complex relationships among acoustic features affecting the noise level in workrooms. In this study, advanced fuzzy approaches were employed to develop relatively accurate models for predicting noise in noisy industrial workrooms. The data were collected from 60 industrial embroidery workrooms in the Khorasan Province, east of Iran. The main acoustic and embroidery process features that influence the noise were used to develop prediction models using MATLAB software. The multiple regression technique was also employed and its results were compared with those of the fuzzy approaches. Prediction errors of all prediction models based on fuzzy approaches were within the acceptable level (lower than one dB). However, the neuro-fuzzy model (RMSE = 0.53 dB and R² = 0.88) could slightly improve the accuracy of noise prediction compared with the generated fuzzy model. Moreover, fuzzy approaches provided more accurate predictions than did the regression technique. The developed models based on fuzzy approaches are useful prediction tools that give professionals the opportunity to make an optimal decision about the effectiveness of acoustic treatment scenarios in embroidery workrooms.
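The fuzzy and neuro-fuzzy models themselves are not reproduced here, but the sketch below shows, on synthetic workroom features, how the multiple-regression baseline and its RMSE/R² figures of merit would be computed; all feature names and values are invented.

```python
# Multiple-regression baseline with RMSE and R^2 on synthetic workroom data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(3)
n = 60
features = rng.normal(size=(n, 4))   # e.g. machine count, absorption, floor area, machine speed
noise_db = 85 + features @ np.array([3.0, -2.0, -1.5, 1.0]) + rng.normal(0, 1, n)

model = LinearRegression().fit(features, noise_db)
pred = model.predict(features)
rmse = mean_squared_error(noise_db, pred) ** 0.5
print(f"RMSE = {rmse:.2f} dB, R2 = {r2_score(noise_db, pred):.2f}")
```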
NASA Astrophysics Data System (ADS)
Glushkov, A. V.; Khetselius, O. Yu; Agayar, E. V.; Buyadzhi, V. V.; Romanova, A. V.; Mansarliysky, V. F.
2017-10-01
We present a new effective approach to analysing and modelling natural air ventilation in the atmosphere of an industrial city, based on the Arakawa-Schubert and Glushkov models, modified to calculate the current involvement of the ensemble of clouds, and on advanced mathematical methods for modelling unsteady turbulence in the urban area. For the first time, the methods of a plane complex field and spectral expansion algorithms are applied to calculate the air circulation for cloud layer arrays penetrating into the territory of the industrial city. We have also taken into account the mechanisms of transformation of cloud-system advection over the urban area. The results of test computations of the air ventilation characteristics are presented for the city of Odessa. All the above methods and models, together with standard monitoring and management systems, can be considered as a basis for a comprehensive "Green City" construction technology.
Microfluidic Model Porous Media: Fabrication and Applications.
Anbari, Alimohammad; Chien, Hung-Ta; Datta, Sujit S; Deng, Wen; Weitz, David A; Fan, Jing
2018-05-01
Complex fluid flow in porous media is ubiquitous in many natural and industrial processes. Direct visualization of the fluid structure and flow dynamics is critical for understanding and eventually manipulating these processes. However, the opacity of realistic porous media makes such visualization very challenging. Micromodels, microfluidic model porous media systems, have been developed to address this challenge. They provide a transparent interconnected porous network that enables the optical visualization of the complex fluid flow occurring inside at the pore scale. In this Review, the materials and fabrication methods to make micromodels, the main research activities that are conducted with micromodels and their applications in petroleum, geologic, and environmental engineering, as well as in the food and wood industries, are discussed. The potential applications of micromodels in other areas are also discussed and the key issues that should be addressed in the near future are proposed. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Datasets for Ostrava PMF paper
These data support a published journal paper described as follows: A 14-week investigation during warm and cold seasons was conducted to improve understanding of air pollution sources that might be impacting air quality in Ostrava, the Czech Republic. Fine particulate matter (PM2.5) samples were collected in consecutive 12-h day and night increments during spring and fall 2012 sampling campaigns. Sampling sites were strategically located to evaluate conditions in close proximity of a large steel works industrial complex, as well as away from direct influence of the industrial complex. These samples were analyzed for metals and other elements, organic and elemental (black) carbon, and polycyclic aromatic hydrocarbons (PAHs). The PM2.5 samples were supplemented with pollutant gases and meteorological parameters. We applied the EPA PMF v5.1 model with uncertainty estimate features to the Ostrava data set. Using the model's bootstrapping procedure and other considerations, six factors were determined to provide the optimum solution. Each model run consisted of 100 iterations to ensure that the solution represents a global minimum. The resulting factors were identified as representing coal (power plants), mixed Cl, crustal, industrial 1 (alkali metals and PAHs), industrial 2 (transition metals), and home heat/transportation. The home heating source is thought to be largely domestic boilers burning low quality fuels such as lignite, wood, and domestic waste. Transportation-r
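As a rough, simplified analogue of the factor-analysis step (EPA PMF v5.1 additionally weights residuals by measurement uncertainties and provides bootstrapping, none of which is reproduced here), the sketch below factorizes a synthetic samples-by-species matrix into six non-negative source factors.

```python
# Simplified stand-in for receptor-model factor analysis: non-negative matrix factorization.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)
X = rng.gamma(shape=2.0, scale=1.0, size=(150, 25))  # hypothetical 12-h samples x chemical species

model = NMF(n_components=6, init="nndsvda", max_iter=500, random_state=0)
contributions = model.fit_transform(X)   # G: factor contributions per sample
profiles = model.components_             # F: chemical profile of each factor
print(contributions.shape, profiles.shape)
```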
Pines, Jesse M
2006-05-01
Emergency Medicine plays a vital role in the health care continuum in the United States. Michael Porter's five forces model of industry analysis provides insight into the economics of emergency care by showing how the forces of supplier power, buyer power, threat of substitution, barriers to entry, and internal rivalry affect Emergency Medicine. Illustrating these relationships provides a view into the complexities of the emergency care industry and offers opportunities for Emergency Departments, groups of physicians, and the individual emergency physician to maximize their relationships with other market players.
Intelligent simulation of aquatic environment economic policy coupled ABM and SD models.
Wang, Huihui; Zhang, Jiarui; Zeng, Weihua
2018-03-15
Rapid urbanization and population growth have resulted in serious water shortages and pollution of the aquatic environment, which are important contributors to the increasingly complex environmental deterioration in the region. This study examines the environmental consequences and economic impacts of water resource shortages under different economic policies; however, this requires complex models that jointly consider multiple agents and sectors within a systems perspective. Thus, we propose a complex system model that couples multi-agent based models (ABM) and system dynamics (SD) models to simulate the impact of alternative economic policies on water use and pricing. Moreover, this model takes the constraint of the local water resources carrying capacity into consideration. Results show that to achieve the 13th Five Year Plan targets in Dianchi, water prices for local residents and industries should rise to 3.23 and 4.99 CNY/m³, respectively. The corresponding sewage treatment fees for residents and industries should rise to 1.50 and 2.25 CNY/m³, respectively, assuming comprehensive adjustment of industrial structure and policy. At the same time, the local government should exercise fine-scale economic policy combined with emission fees assessed on those exceeding a standard, and collect fines imposed as punishment on enterprises that exceed emission standards. When fines reach 500,000 CNY, the total number of enterprises that exceed emission standards in the basin can be controlled within 1%. Moreover, it is suggested that the volume of water diversion into Dianchi should be appropriately reduced to 3.06×10⁸ m³. The expense saved on water diversion should provide funds for the construction of recycled water facilities. The local rate of recycled water use should then rise to 33%, and a recycled water price of 1.4 CNY/m³ could be set to ensure the sustainable utilization of local water resources. Copyright © 2017 Elsevier B.V. All rights reserved.
Eom, Sang-Yong; Choi, Jonghyuk; Bae, Sanghyuk; Lim, Ji-Ae; Kim, Guen-Bae; Yu, Seung-Do; Kim, Yangho; Lim, Hyun-Sul; Son, Bu-Soon; Paek, Domyung; Kim, Yong-Dae; Kim, Heon; Ha, Mina; Kwon, Ho-Jang
2018-01-01
Several epidemiological studies have reported an association between environmental pollution and various health conditions in individuals residing in industrial complexes. To evaluate the effects of pollution from industrial complexes on human health, we performed a pooled analysis of environmental epidemiologic monitoring data for residents living near national industrial complexes in Korea. The respiratory and allergic symptoms and the prevalence of acute and chronic diseases, including cancer, were used as the outcome variables for health effects. Multiple logistic regression analysis was used to analyze the relationship between exposure to pollution from industrial complexes and health conditions. After adjusting for age, sex, smoking status, occupational exposure, level of education, and body mass index, the residents near the industrial complexes were found to have more respiratory symptoms, such as cough (odds ratio [OR], 1.18; 95% confidence interval [CI], 1.06 to 1.31) and sputum production (OR, 1.13; 95% CI, 1.03 to 1.24), and symptoms of atopic dermatitis (OR, 1.10; 95% CI, 1.01 to 1.20). Among residents of the industrial complexes, the prevalence of acute eye disorders was approximately 40% higher (OR, 1.39; 95% CI, 1.04 to 1.84), and the prevalence of lung and uterine cancer was 3.45 times and 1.88 times higher, respectively, than among residents of the control area. This study showed that residents living in the vicinity of industrial complexes have a high risk of acute and chronic diseases including respiratory and allergic conditions. These results can be used as basic objective data for developing health management measures for individuals residing near industrial complexes.
Directions for computational mechanics in automotive crashworthiness
NASA Technical Reports Server (NTRS)
Bennett, James A.; Khalil, T. B.
1993-01-01
The automotive industry has used computational methods for crashworthiness since the early 1970s. These methods have ranged from simple lumped-parameter models to full finite element models. The emergence of the full finite element models in the mid-1980s has significantly altered the research direction. However, there remains a need for both simple, rapid modeling methods and complex detailed methods. Some directions for continuing research are discussed.
Gao, Xiangyun; Huang, Shupei; Sun, Xiaoqi; Hao, Xiaoqing; An, Feng
2018-03-01
Microscopic factors are the basis of macroscopic phenomena. We proposed a network analysis paradigm to study the macroscopic financial system from a microstructure perspective. We built the cointegration network model and the Granger causality network model based on econometrics and complex network theory and chose stock price time series of the real estate industry and its upstream and downstream industries as empirical sample data. Then, we analysed the cointegration network for understanding the steady long-term equilibrium relationships and analysed the Granger causality network for identifying the diffusion paths of the potential risks in the system. The results showed that the influence from a few key stocks can spread conveniently in the system. The cointegration network and Granger causality network are helpful to detect the diffusion path between the industries. We can also identify and intervene in the transmission medium to curb risk diffusion.
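A hedged sketch of the two pairwise tests from which such networks are typically built, Engle-Granger cointegration and Granger causality, is given below using statsmodels on synthetic price series; the series, lag order and decision thresholds are illustrative and are not the paper's real-estate stock data.

```python
# Pairwise cointegration and Granger-causality tests on synthetic price series.
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(7)
common = np.cumsum(rng.normal(size=500))            # shared stochastic trend
x = common + rng.normal(scale=0.5, size=500)        # "stock A" price
y = 0.8 * common + rng.normal(scale=0.5, size=500)  # "stock B" price

t_stat, p_value, _ = coint(x, y)
print(f"cointegration p-value: {p_value:.3f}")       # small p -> draw a cointegration edge

# Granger causality of x -> y on first differences, up to 2 lags
# (small p at some lag -> draw a directed causality edge from A to B).
results = grangercausalitytests(np.column_stack([np.diff(y), np.diff(x)]), maxlag=2)
```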
COMPUTATIONAL TOXICOLOGY: AN APPROACH FOR PRIORITIZING CHEMICAL RISK ASSESSMENTS
Characterizing toxic effects for industrial chemicals carries the challenge of focusing resources on the greatest potential risks for human health and the environment. The union of molecular modeling, bioinformatics and simulation of complex systems with emerging technologies suc...
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2012-01-01
This paper presents past and current work in dealing with indirect industry and NASA costs when providing cost estimation or analysis for NASA projects and programs. Indirect costs, defined here as those costs in a project removed from the actual hands-on hardware or software labor, make up most of the costs of today's complex, large-scale NASA/industry projects. This appears to be the case across phases, from research into development, into production, and into the operation of the system. Space transportation is the case of interest here. Modeling and cost estimation as a process rather than a product will be emphasized. Analysis as a series of belief systems in play among decision makers and decision factors will also be emphasized to provide context.
Computational Chemistry Toolkit for Energetic Materials Design
2006-11-01
industry are aggressively engaged in efforts to develop multiscale modeling and simulation methodologies to model and analyze complex phenomena across...energetic materials design. It is hoped that this toolkit will evolve into a collection of well-integrated multiscale modeling methodologies... [Fragment of a results table comparing experimental, theoretical, and this work's values for 1,5-diamino-4-methyl-tetrazolium nitrate (8.4, 41.7, 47.5) and 1,5-diamino-4-methyl-tetrazolium azide (138.1, 161.6).]
Streamlining DOD Acquisitions: Balancing Schedule with Complexity
2006-09-01
from them has a distinct industrial flavor: streamlined processes, benchmarking, and business models. The requirements generation community led by... model), and the Department of the Navy assumed program lead. [Stable Program Inputs (-)] By 1984, the program goals included delivery of 913 V-22... they subsequently specified a crew of two. [Stable Program Input (-)] The contractor team won in a "fly-off" solely via modeling and simulation
Labour supply in the home care industry: A case study in a Dutch region.
Breedveld, Elly J; Meijboom, Bert R; de Roo, Aad A
2006-04-01
Health organizations have started to become more market-driven. Therefore, it is important for health organizations to analyse the competitive dynamics of their industrial structure. However, relevant theories and models have mainly been developed for organizations acting in the profit sector. In this paper, we adapt Porter's 'five forces model' to the home care industry. In particular, we modify the (determinants of the) bargaining power of labour suppliers. We then apply the modified Porter model to the home care industry in the Netherlands for the period 1987-1997, with special attention to labour supply. The new instrument clarifies the complexity of the supply chains and value systems of the home care industry. As illustrated by developments in the home care industry in the province of North Brabant during the 1990s, competition between home care providers has influenced labour market relations, but so have other factors. Between 1987 and 1997, the bargaining power of labour suppliers was relatively limited. After 1997, however, the demand for home care personnel increased strongly. In spite of the present economic recession, scarcity on this labour market seems likely to prevail in the longer term due to a growing demand for home care services.
Biomass Resource Allocation among Competing End Uses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newes, E.; Bush, B.; Inman, D.
The Biomass Scenario Model (BSM) is a system dynamics model developed by the U.S. Department of Energy as a tool to better understand the interaction of complex policies and their potential effects on the biofuels industry in the United States. However, it does not currently have the capability to account for allocation of biomass resources among the various end uses, which limits its utilization in analysis of policies that target biomass uses outside the biofuels industry. This report provides a more holistic understanding of the dynamics surrounding the allocation of biomass among uses that include traditional use, wood pellet exports, bio-based products and bioproducts, biopower, and biofuels by (1) highlighting the methods used in existing models' treatments of competition for biomass resources; (2) identifying coverage and gaps in industry data regarding the competing end uses; and (3) exploring options for developing models of biomass allocation that could be integrated with the BSM to actively exchange and incorporate relevant information.
Reported emissions of organic gases are not consistent with observations
Henry, Ronald C.; Spiegelman, Clifford H.; Collins, John F.; Park, EunSug
1997-01-01
Regulatory agencies and photochemical models of ozone rely on self-reported industrial emission rates of organic gases. Incorrect self-reported emissions can severely affect air quality models and regulatory decisions. We compared self-reported emissions of organic gases in Houston, Texas, to measurements at a receptor site near the Houston ship channel, a major petrochemical complex. We analyzed hourly observations of total nonmethane organic carbon and 54 hydrocarbon compounds from C-2 to C-9 for the period June through November 1993. We were able to demonstrate severe inconsistencies between reported emissions and major sources as derived from the data using a multivariate receptor model. The composition and the location of the sources as deduced from the data are not consistent with the reported industrial emissions. On the other hand, our observationally based methods did correctly identify the location and composition of a relatively small nearby chemical plant. This paper provides strong empirical evidence that regulatory agencies and photochemical models are making predictions based on inaccurate industrial emissions. PMID:11038551
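As an illustration of the multivariate receptor-modelling idea, the sketch below uses non-negative matrix factorization as a stand-in for the authors' receptor model; the data matrix, the assumed number of sources, and all values are placeholders rather than the Houston observations.

```python
# Illustrative receptor-model sketch via non-negative matrix factorization.
# X holds hourly concentrations (rows = hours, cols = species).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.gamma(2.0, 1.0, size=(4000, 54))        # placeholder for the hourly data

n_sources = 6                                    # assumed number of sources
model = NMF(n_components=n_sources, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)                       # source contributions per hour
F = model.components_                            # source composition profiles

# Normalise each profile to mass fractions so it can be compared, species by
# species, with a self-reported emission inventory.
profiles = F / F.sum(axis=1, keepdims=True)
print(profiles.round(3))
```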
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khawli, Toufik Al; Eppelt, Urs; Hermanns, Torsten
2016-06-08
In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.
NASA Astrophysics Data System (ADS)
Khawli, Toufik Al; Gebhardt, Sascha; Eppelt, Urs; Hermanns, Torsten; Kuhlen, Torsten; Schulz, Wolfgang
2016-06-01
In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.
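A minimal sketch of the surrogate-plus-sensitivity workflow follows; the Gaussian-process surrogate, the toy objective standing in for the laser-drilling simulation, and the Morris-style step size are assumptions for illustration only.

```python
# Fit a Gaussian-process metamodel to a handful of "expensive" runs, then screen
# parameters with Morris-style elementary effects evaluated on the surrogate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):                                   # placeholder for the drilling simulation
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2]

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(40, 3))         # 40 simulation runs, 3 parameters
y = f(X)

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(X, y)

delta, n_traj = 0.1, 50
effects = np.zeros((n_traj, X.shape[1]))
for r in range(n_traj):
    base = rng.uniform(0, 1 - delta, size=(1, 3))
    for j in range(3):
        step = base.copy()
        step[0, j] += delta
        effects[r, j] = (surrogate.predict(step)[0] - surrogate.predict(base)[0]) / delta

print("mean |elementary effect| per parameter:", np.abs(effects).mean(axis=0))
```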
Kumar, B Shiva; Venkateswarlu, Ch
2014-08-01
The complex nature of biological reactions in biofilm reactors often poses difficulties in analyzing such reactors experimentally. Mathematical models could be very useful for their design and analysis. However, application of biofilm reactor models to practical problems proves somewhat ineffective due to the lack of knowledge of accurate kinetic models and uncertainty in model parameters. In this work, we propose an inverse modeling approach based on tabu search (TS) to estimate the parameters of kinetic and film thickness models. TS is used to estimate these parameters as a consequence of the validation of the mathematical models of the process with the aid of measured data obtained from an experimental fixed-bed anaerobic biofilm reactor involving the treatment of pharmaceutical industry wastewater. The results evaluated for different modeling configurations of varying degrees of complexity illustrate the effectiveness of TS for accurate estimation of kinetic and film thickness model parameters of the biofilm process. The results show that the two-dimensional mathematical model with Edward kinetics (with its optimum parameters as μmax ρs/Y = 24.57, Ks = 1.352 and Ki = 102.36) and three-parameter film thickness expression (with its estimated parameters as a = 0.289 × 10(-5), b = 1.55 × 10(-4) and c = 15.2 × 10(-6)) better describes the biofilm reactor treating the industry wastewater.
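A compact sketch of the tabu-search parameter estimation idea is shown below; the substrate-inhibition rate expression, bounds, move sizes, and synthetic "measurements" are illustrative assumptions rather than the paper's actual model or data.

```python
# Tabu search minimizing the sum of squared errors of a kinetic rate model.
import numpy as np

def rate(S, p):                       # substrate-inhibition style kinetics
    mu_max, Ks, Ki = p
    return mu_max * S / (Ks + S + S ** 2 / Ki)

S_obs = np.linspace(0.5, 50, 20)
r_obs = rate(S_obs, (25.0, 1.4, 100.0)) + np.random.default_rng(2).normal(0, 0.1, 20)

def sse(p):
    return np.sum((rate(S_obs, p) - r_obs) ** 2)

rng = np.random.default_rng(3)
current = np.array([10.0, 5.0, 50.0])
best, best_cost = current.copy(), sse(current)
tabu = []                             # recently visited points (tabu list)
for it in range(300):
    # generate neighbourhood moves and drop those too close to tabu points
    moves = current + rng.normal(0, [1.0, 0.2, 5.0], size=(20, 3))
    moves = np.clip(moves, 1e-3, None)
    allowed = [m for m in moves if all(np.linalg.norm(m - t) > 0.5 for t in tabu)]
    if not allowed:
        allowed = list(moves)
    current = min(allowed, key=sse)
    tabu.append(current.copy())
    if len(tabu) > 15:                # fixed-length tabu memory
        tabu.pop(0)
    if sse(current) < best_cost:
        best, best_cost = current.copy(), sse(current)

print("estimated (mu_max, Ks, Ki):", best.round(2), " SSE:", round(best_cost, 4))
```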
Analysis and Design of Complex Network Environments
2012-03-01
and J. Lowe, "The myths and facts behind cyber security risks for industrial control systems," in the Proceedings of the VDE Kongress, VDE Congress... questions about 1) how to model them, 2) the design of experiments necessary to discover their structure (and thus adapt system inputs to optimize the... theoretical work that clarifies fundamental limitations of complex networks with network engineering and systems biology to implement specific designs and
ERIC Educational Resources Information Center
Novick, Sheldon
1976-01-01
The basic automobile design has persisted for fifty years with innovations only in production and marketing. Complex interrelationships among steel, oil, rubber, road building, and automobile industries perpetuate the "modern Model T's." Efforts to alter design to meet safety and environmental standards face tremendous resistance from a…
Park, Hung-Suck; Rene, Eldon R; Choi, Soo-Mi; Chiu, Anthony S F
2008-04-01
The Korea National Cleaner Production Center (KNCPC), affiliated with the Korea Institute of Industrial Technology (KITECH), has started a 15-year, 3-phase EIP master plan with the support of the Ministry of Commerce, Industry, and Energy (MOCIE). A total of 6 industrial parks, including the industrial parks in Ulsan city, known as the industrial capital of South Korea, are planning projects to assess the feasibility of converting existing industrial parks into eco-industrial parks. The basic survey shows that the Ulsan industrial complex has been continuously evolving from conventional industrial complexes to eco-industrial parks through spontaneous industrial symbiosis. This paper describes the Korean national policies and the developmental activities of this vision to drive the global trend of innovation for converting the existing industrial parks to eco-industrial parks through inter-industry waste, energy, and material exchange in the Ulsan industrial complexes. In addition, the primary and supportive components of the Ulsan EIP pilot project, which will be implemented over 5 years, are elaborated along with its schedules and economic benefits.
Lobo, Francine Albernaz Tf; Silva, Vitoria; Domingues, Josiane; Rodrigues, Silvana; Costa, Valéria; Falcão, Deborah; de Lima Araújo, Kátia G
2018-05-01
This work aimed to prepare inclusion complexes using yellow bell pepper pigments and β-cyclodextrin by two different procedures (method A, ultrasonic homogenisation; method B, kneading), to characterise them and to evaluate their colour stability in an isotonic beverage model. The extract/β-cyclodextrin ratio was 1:2 for both inclusion methodologies evaluated. The formed extract-β-cyclodextrin complexes and a physical mixture of extract and β-cyclodextrin were evaluated by differential scanning calorimetry (DSC) and Fourier transform infrared spectroscopy (FTIR). Both methodologies resulted in good complex yield and inclusion efficiency. The colour indices L* (lightness), a* (green/red) and b* (blue/yellow) of isotonic drinks containing the complexes were measured during storage under irradiance (1400 lx) and in the absence of light at temperatures between 25 and 31 °C for 21 days. The complex obtained by inclusion method B promoted better colour protection for the beverage compared with the use of the crude extract, showing that the molecular inclusion of yellow bell pepper carotenoids can provide good results for that purpose. © 2017 Society of Chemical Industry.
Kinetic models in industrial biotechnology - Improving cell factory performance.
Almquist, Joachim; Cvijovic, Marija; Hatzimanikatis, Vassily; Nielsen, Jens; Jirstrand, Mats
2014-07-01
An increasing number of industrial bioprocesses capitalize on living cells by using them as cell factories that convert sugars into chemicals. These processes range from the production of bulk chemicals in yeasts and bacteria to the synthesis of therapeutic proteins in mammalian cell lines. One of the tools in the continuous search for improved performance of such production systems is the development and application of mathematical models. To be of value for industrial biotechnology, mathematical models should be able to assist in the rational design of cell factory properties or in the production processes in which they are utilized. Kinetic models are particularly suitable towards this end because they are capable of representing the complex biochemistry of cells in a more complete way compared to most other types of models. They can, at least in principle, be used to understand, predict, and evaluate in detail the effects of adding, removing, or modifying molecular components of a cell factory, and to support the design of the bioreactor or fermentation process. However, several challenges still remain before kinetic modeling will reach the degree of maturity required for routine application in industry. Here we review the current status of kinetic cell factory modeling. Emphasis is on modeling methodology concepts, including model network structure, kinetic rate expressions, parameter estimation, optimization methods, identifiability analysis, model reduction, and model validation, but several applications of kinetic models for the improvement of cell factories are also discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
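For readers unfamiliar with kinetic cell-factory models, the sketch below shows the simplest possible instance: an assumed Monod-type growth model with yield-coupled product formation integrated as an ODE system. Parameter values are invented for illustration and do not describe any specific organism or process.

```python
# Minimal kinetic cell-factory sketch: Monod growth with yield-coupled product formation.
from scipy.integrate import solve_ivp

mu_max, Ks, Yxs, Yps = 0.4, 0.5, 0.5, 0.3   # 1/h, g/L, gX/gS, gP/gS (assumed values)

def fermentation(t, y):
    X, S, P = y                              # biomass, sugar, product (g/L)
    mu = mu_max * S / (Ks + S)               # specific growth rate
    dX = mu * X
    dS = -dX / Yxs                           # sugar consumed to make biomass
    dP = Yps * (-dS)                         # product coupled to sugar uptake
    return [dX, dS, dP]

sol = solve_ivp(fermentation, (0, 30), [0.1, 20.0, 0.0], dense_output=True)
X, S, P = sol.y[:, -1]
print(f"after 30 h: biomass={X:.2f} g/L, sugar={S:.2f} g/L, product={P:.2f} g/L")
```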
An analysis of the adoption of managerial innovation: cost accounting systems in hospitals.
Glandon, G L; Counte, M A
1995-11-01
The adoption of new medical technologies has received significant attention in the hospital industry, in part, because of its observed relation to hospital cost increases. However, few comprehensive studies exist regarding the adoption of non-medical technologies in the hospital setting. This paper develops and tests a model of the adoption of a managerial innovation, new to the hospital industry, that of cost accounting systems based upon standard costs. The conceptual model hypothesizes that four organizational context factors (size, complexity, ownership and slack resources) and two environmental factors (payor mix and interorganizational dependency) influence hospital adoption of cost accounting systems. Based on responses to a mail survey of hospitals in the Chicago area and AHA annual survey information for 1986, a sample of 92 hospitals was analyzed. Greater hospital size, complexity, slack resources, and interorganizational dependency all were associated with adoption. Payor mix had no significant influence and the hospital ownership variables had a mixed influence. The logistic regression model was significant overall and explained over 15% of the variance in the adoption decision.
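A hedged sketch of such a logistic adoption model follows; the predictor names and the synthetic data are placeholders for the survey variables, not the study's dataset.

```python
# Logistic regression of adoption (0/1) on organizational context and environment factors.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 92
df = pd.DataFrame({
    "size": rng.normal(300, 100, n),          # e.g. bed count (placeholder)
    "complexity": rng.normal(0, 1, n),
    "slack": rng.normal(0, 1, n),
    "dependency": rng.normal(0, 1, n),
    "payor_mix": rng.normal(0, 1, n),
})
logit = 0.004 * df["size"] + 0.8 * df["complexity"] + 0.5 * df["slack"] - 1.5
df["adopted"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression(max_iter=1000).fit(df.drop(columns="adopted"), df["adopted"])
print(dict(zip(df.columns[:-1], model.coef_[0].round(3))))
```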
OPTIMIZATION OF MODERN DISPERSIVE RAMAN SPECTROMETERS FOR MOLECULAR SPECIATION OF ORGANICS IN WATER
Pesticides and industrial chemicals are typically complex organic molecules with multiple heteroatoms that can ionize in water. However, models for understanding the behavior of these chemicals in the environment typically assume that they exist exclusively as neutral species --...
Screening procedure for airborne pollutants emitted from a high-tech industrial complex in Taiwan.
Wang, John H C; Tsai, Ching-Tsan; Chiang, Chow-Feng
2015-11-01
Despite the modernization of computational techniques, atmospheric dispersion modeling remains a complicated task, as it involves the use of large amounts of interrelated data with wide variability. The continuously growing list of regulated air pollutants also increases the difficulty of this task. To address these challenges, this study aimed to develop a screening procedure for a long-term exposure scenario by generating a site-specific lookup table of hourly averaged dispersion factors (χ/Q), which could be evaluated by downwind distance, direction, and effective plume height only. To allow for such simplification, the average plume rise was weighted with the frequency distribution of meteorological data so that the prediction of χ/Q could be decoupled from the meteorological data. To illustrate this procedure, 20 receptors around a high-tech complex in Taiwan were selected. Five consecutive years of hourly meteorological data were acquired to generate a lookup table of χ/Q, as well as two regression formulas of plume rise as functions of downwind distance, buoyancy flux, and stack height. To calculate the concentrations for the selected receptors, a six-step Excel algorithm was programmed with four years of emission records, and the 10 most critical toxics were screened out. A validation check using the Industrial Source Complex (ISC3) model with the same meteorological and emission data showed an acceptable overestimate of 6.7% in the average concentration of 10 nearby receptors. The procedure proposed in this study allows practical and focused emission management for a large industrial complex and can therefore be integrated into an air quality decision-making system. Copyright © 2015 Elsevier Ltd. All rights reserved.
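To make the lookup-table idea concrete, the sketch below builds a small χ/Q table from a textbook ground-level Gaussian-plume formula; the dispersion coefficients, wind speed, and grid of distances and plume heights are illustrative assumptions and do not reproduce the study's frequency-weighted plume-rise regressions.

```python
# Build a chi/Q lookup table indexed by downwind distance and effective plume height.
import numpy as np

def sigma_yz(x):                      # rough neutral-stability dispersion coefficients (assumed)
    return 0.08 * x / np.sqrt(1 + 0.0001 * x), 0.06 * x / np.sqrt(1 + 0.0015 * x)

def chi_over_q(x, H, u=3.0):          # centreline, ground-level receptor (s/m3)
    sy, sz = sigma_yz(x)
    return np.exp(-H ** 2 / (2 * sz ** 2)) / (np.pi * sy * sz * u)

distances = np.array([200.0, 500.0, 1000.0, 2000.0, 5000.0])   # m downwind
heights = np.array([20.0, 50.0, 100.0])                        # effective plume heights, m
table = np.array([[chi_over_q(x, H) for H in heights] for x in distances])
print(table)   # rows = downwind distance, cols = plume height
# long-term concentration at a receptor ~ emission rate Q (g/s) * chi/Q
```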
On-line identification of fermentation processes for ethanol production.
Câmara, M M; Soares, R M; Feital, T; Naomi, P; Oki, S; Thevelein, J M; Amaral, M; Pinto, J C
2017-07-01
A strategy for monitoring fermentation processes, specifically simultaneous saccharification and fermentation (SSF) of corn mash, was developed. The strategy covered the development and use of a first-principles, semimechanistic, unstructured process model based on major kinetic phenomena, along with mass and energy balances. The model was then used as a reference model within an identification procedure capable of running on-line. The on-line identification procedure consists of updating the reference model through the estimation of corrective parameters for certain reaction rates using the most recent process measurements. The strategy makes use of standard laboratory measurements for sugar quantification and in situ temperature and liquid level data. The model, along with the on-line identification procedure, has been tested against real industrial data and has been able to accurately predict the main variables of operational interest, i.e., state variables and their dynamics, and key process indicators. The results demonstrate that the strategy is capable of monitoring, in real time, this complex industrial biomass fermentation. This new tool provides great support for decision-making and opens a new range of opportunities for industrial optimization.
NASA Astrophysics Data System (ADS)
Shughrue, C. M.; Werner, B.; Nugnug, P. T.
2010-12-01
The catastrophic Deepwater Horizon oil spill highlights the risks for widespread environmental damage resulting from petroleum resource extraction. Possibilities for amelioration of these risks depend critically on understanding the dynamics and nonlinear interactions between various components of the coupled human-environmental resource extraction system. We use a complexity analysis to identify the levels of description and time scales at which these interactions are strongest, and then use the analysis as the basis for an agent-based numerical model with which decadal trends can be analyzed. Oil industry economic and technological activity and associated oil spills are components of a complex system that is coupled to natural environment, legislation, regulation, media, and resistance systems over annual to decadal time scales. In the model, oil spills are produced stochastically with a range of magnitudes depending on a reliability-engineering-based assessment of failure for the technology employed, human factors including compliance with operating procedures, and risks associated with the drilling environment. Oil industry agents determine drilling location and technological investment using a cost-benefit analysis relating projected revenue from added production to technology cost and government regulation. Media outlet agents reporting on the oil industry and environmental damage from oil spills assess the impacts of aggressively covering a story on circulation increases, advertiser concerns and potential loss of information sources. Environmental advocacy group agents increase public awareness of environmental damage (through media and public contact), solicit memberships and donations, and apply direct pressure on legislators for policy change. Heterogeneous general public agents adjust their desire for change in the level of regulation, contact their representatives or participate in resistance via protest by considering media sources, personal experiences with oil spills and individual predispositions toward the industry. Legislator agents pass legislation and influence regulator agents based on interaction with oil industry, media and general public agents. Regulator agents generate and enforce regulations by responding to pressure from legislator and oil industry agents. Oil spill impacts on the natural environment are related to number and magnitude of spills, drilling locations, and spill response methodology, determined collaboratively by government and oil company agents. Agents at the corporate and government levels use heterogeneous prediction models combined with a constant absolute risk aversion utility for wealth. This model simulates a nonlinear adaptive system with mechanisms to self-regulate oil industry activity, environmental damage and public response. A comparison of model output with historical oil industry development and environmental damage; the sensitivity of oil spill damage to economic, political and social factors; the potential for the emergence of new and possibly unstable behaviors; and opportunities for intervening in system dynamics to alter expected outcomes will be discussed. Supported by NSF: Geomorphology and Land Use Dynamics Program
Comprehensive chemical characterization of industrial PM2.5 from steel industry activities
NASA Astrophysics Data System (ADS)
Sylvestre, Alexandre; Mizzi, Aurélie; Mathiot, Sébastien; Masson, Fanny; Jaffrezo, Jean L.; Dron, Julien; Mesbah, Boualem; Wortham, Henri; Marchand, Nicolas
2017-03-01
Industrial sources are among the least documented PM (Particulate Matter) sources in terms of chemical composition, which limits our understanding of their effective impact on ambient PM concentrations. We report 4 chemical emission profiles of PM2.5 for multiple activities located in a vast metallurgical complex. Emission profiles were calculated as the difference of species concentrations between an upwind and a downwind site, normalized by the absolute PM2.5 enrichment between both sites. We characterized the PM2.5 emission profiles of the industrial activities related to the cast iron (complex 1) and the iron ore conversion processes (complex 2), as well as 2 storage areas: a blast furnace slag area (complex 3) and an ore terminal (complex 4). PM2.5 major fractions (Organic Carbon (OC) and Elemental Carbon (EC), major ions), organic markers as well as metals/trace elements are reported for the 4 industrial complexes. Among the trace elements, iron is the most emitted for complex 1 (146.0 mg g-1 of PM2.5), complex 2 (70.07 mg g-1) and complex 3 (124.4 mg g-1), followed by Al, Mn and Zn. A strong emission of Polycyclic Aromatic Hydrocarbons (PAH), representing 1.3% of the Organic Matter (OM), is observed for the iron ore transformation complex (complex 2), which merges the activities of coke and iron sinter production and the blast furnace processes. In addition to unsubstituted PAHs, sulfur-containing PAHs (SPAHs) are also significantly emitted (between 0.011 and 0.068 mg g-1) by complex 2 and could become very useful organic markers of steel industry activities. For complexes 1 and 2 (cast iron and iron ore converters), a strong fraction of sulfate (ranging from 0.284 to 0.336 g g-1), only partially neutralized by ammonium, is observed, indicating that sulfates, if not directly emitted by the industrial activity, are formed very quickly in the plume. Emissions from complex 4 (ore terminal) are characterized by a high contribution of Al (125.7 mg g-1 of PM2.5) but also, to a lesser extent, of Fe, Mn, Ti and Zn. We also highlight a high contribution of calcium, ranging from 0.123 to 0.558 g g-1, for all of the industrial complexes under study. Since calcium is also widely used as a proxy of dust contributions in source apportionment studies, our results suggest that this assumption should be reexamined in environments impacted by industrial emissions.
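The upwind/downwind profile calculation described above reduces to a simple normalization; the sketch below shows it with placeholder concentrations (the species values are invented, not the measured ones).

```python
# Emission profile = species enrichment divided by the absolute PM2.5 enrichment.
import numpy as np

species = ["Fe", "Al", "Mn", "Zn", "SO4"]
upwind   = np.array([0.30, 0.25, 0.02, 0.04, 1.10])   # ug/m3 at the upwind site
downwind = np.array([3.20, 0.90, 0.30, 0.25, 7.40])   # ug/m3 at the downwind site
pm25_up, pm25_down = 12.0, 31.0                        # ug/m3

profile = (downwind - upwind) / (pm25_down - pm25_up)  # g per g of emitted PM2.5
for name, frac in zip(species, profile):
    print(f"{name}: {frac * 1000:.1f} mg per g PM2.5")
```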
NASA Astrophysics Data System (ADS)
Pascu, Nicoleta Elisabeta; Căruţaşu, Nicoleta Luminiţa; Geambaşu, Gabriel George; Adîr, Victor Gabriel; Arion, Aurel Florin; Ivaşcu, Laura
2018-02-01
Aerial vehicles have become indispensable. The field includes UAVs (unconventional aerial vehicles), transport airplanes, and other aerospace vehicles for space tourism. Today, research and development activity in the aerospace industry focuses on obtaining good, efficient airplane designs, solving the problem of high pollution, and reducing noise. Meeting these goals requires light, resistant components. Aerospace industry products generally have very complex geometric shapes, and their costs are usually high. Progress in this field (products obtained using FDM) has made it possible to reduce the number of tools and welding belts used and, of course, to eliminate many machine tools. In addition, complex shapes are easier to produce using this technology, the cost is more attractive, and the time is lower. This paper presents a few aspects of FDM technology and the structures obtained using it, as follows: computer geometric modeling (different design software) to design and redesign complex structures using 3D printing for this kind of vehicle; finite element analysis to identify the influence of design on different structures; and testing of the structures.
NASA Astrophysics Data System (ADS)
Wang, Rongxi; Gao, Xu; Gao, Jianmin; Gao, Zhiyong; Kang, Jiani
2018-02-01
As one of the most important approaches for analyzing the mechanism of fault pervasion, fault root cause tracing is a powerful and useful tool for detecting the fundamental causes of faults so as to prevent any further propagation and amplification. Focusing on the problems arising from the lack of systematic and comprehensive integration, a novel information-transfer-based, data-driven framework for fault root cause tracing of complex electromechanical systems in the processing industry was proposed, taking into consideration the experience and qualitative analysis of conventional fault root cause tracing methods. Firstly, an improved symbolic transfer entropy method was presented to construct a directed-weighted information model for a specific complex electromechanical system based on the information flow. Secondly, considering the feedback mechanisms in complex electromechanical systems, a method for determining the threshold values of weights was developed to explore the regularities of fault propagation. Lastly, an iterative method was introduced to identify the fault development process. The fault root cause was traced by analyzing the changes in information transfer between the nodes along the fault propagation pathway. An actual fault root cause tracing application of a complex electromechanical system is used to verify the effectiveness of the proposed framework. A unique fault root cause is obtained regardless of the choice of the initial variable. Thus, the proposed framework can be flexibly and effectively used in fault root cause tracing for complex electromechanical systems in the processing industry, and forms the foundation for system vulnerability analysis and condition prediction, as well as other engineering applications.
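A compact sketch of symbolic transfer entropy, the quantity used above to weight directed edges of the information model, is given below; the ordinal-pattern symbolization of order 3 is an assumption, and the paper's improved estimator differs in detail.

```python
# Symbolic transfer entropy between two signals via ordinal-pattern symbolization.
import numpy as np
from collections import Counter

def symbolize(x, order=3):
    # map each length-'order' window to the rank pattern of its values
    windows = np.lib.stride_tricks.sliding_window_view(x, order)
    return [tuple(np.argsort(w)) for w in windows]

def transfer_entropy(x, y, order=3):
    sx, sy = symbolize(x, order), symbolize(y, order)
    n = min(len(sx), len(sy)) - 1
    triples = Counter((sy[t + 1], sy[t], sx[t]) for t in range(n))
    pairs_yx = Counter((sy[t], sx[t]) for t in range(n))
    pairs_yy = Counter((sy[t + 1], sy[t]) for t in range(n))
    singles = Counter(sy[t] for t in range(n))
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_yx = c / pairs_yx[(y0, x0)]
        p_y1_given_y = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * np.log2(p_y1_given_yx / p_y1_given_y)
    return te

rng = np.random.default_rng(5)
x = rng.normal(size=2000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)     # y is driven by x with a one-step lag
print("TE x->y:", round(transfer_entropy(x, y), 3),
      " TE y->x:", round(transfer_entropy(y, x), 3))
```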
NASA Astrophysics Data System (ADS)
Chicea, Anca-Lucia
2015-09-01
The paper presents the process of building geometric and kinematic models of technological equipment used in the process of manufacturing devices. First, the process of building the model of a six-axis industrial robot is presented. In the second part of the paper, the process of building the model of a five-axis CNC milling machining center is also shown. Both models can be used for accurate simulation of the cutting processes of complex parts, such as prosthetic devices.
Natural gas availability and ambient air quality in the Baton Rouge/New Orleans industrial complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fieler, E.R.; Harrison, D.P.
1978-02-26
Three scenarios were modeled for the Baton Rouge/New Orleans area for 1985: one assumes the substitution of residual oil (0.7% sulfur) for gas to decrease gas-burning stationary sources from 80 to 8% and the use of properly designed stacks for large emitters; the second makes identical gas supply assumptions but adds proper stack dispersion for medium as well as large emitters; and the third is based on 16% gas-burning stationary sources. The Climatological Dispersion Model was used to translate (1974) emission rates into ambient air concentrations. Growth rates for residential, commercial, and transportation sources, but not industry, were considered. The results show that proper policies, which would require not only tall stacks for large oil burning units (and for intermediate units also in the areas of high industrial concentration), but also the careful location of new plants, would permit continued industrial expansion without severe air pollution problems.
A methodology model for quality management in a general hospital.
Stern, Z; Naveh, E
1997-01-01
A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, presents the main intrinsic characteristics, which create problems in application of the industrial quality management approaches to medical care. Several examples are the complexity of the relationship between the medical action and the result obtained, the client's nonacceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors, measurements of quality care, cost of medical care, quality of service, and human resources. These are based on the trilogy of planning quality, quality control, and improving quality. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in formulation of appropriate concepts.
Ren, Jingzheng; Liang, Hanwei; Dong, Liang; Sun, Lu; Gao, Zhiqiu
2016-08-15
Industrial symbiosis provides a novel and practical pathway to design for sustainability. A decision support tool for its verification is necessary for practitioners and policy makers, yet to date quantitative research is limited. The objective of this work is to present an innovative approach for supporting decision-making in design for sustainability through the implementation of industrial symbiosis in a chemical complex. By incorporating emergy theory, the model is formulated as a multi-objective approach that can optimize both the economic benefit and the sustainability performance of the integrated industrial system. A set of emergy-based evaluation indices is designed. A multi-objective particle swarm optimization algorithm is proposed to solve the model, and decision-makers are allowed to choose suitable solutions from the Pareto solutions. An illustrative case has been studied with the proposed method; a set of compromises between high profitability and high sustainability can be obtained for the decision-makers/stakeholders. Copyright © 2016 Elsevier B.V. All rights reserved.
Mittra, James; Tait, Joyce; Wield, David
2011-03-01
The pharmaceutical and agro-biotechnology industries have been confronted by dwindling product pipelines and rapid developments in life sciences, thus demanding a strategic rethink of conventional research and development. Despite offering both industries a solution to the pipeline problem, the life sciences have also brought complex regulatory challenges for firms. In this paper, we comment on the response of these industries to the life science trajectory, in the context of maturing conventional small-molecule product pipelines and routes to market. The challenges of managing transition from maturity to new high-value-added innovation models are addressed. Furthermore, we argue that regulation plays a crucial role in shaping the innovation systems of both industries, and as such, we suggest potentially useful changes to the current regulatory system. Copyright © 2010 Elsevier Ltd. All rights reserved.
Enhancing Manufacturing Process Education via Computer Simulation and Visualization
ERIC Educational Resources Information Center
Manohar, Priyadarshan A.; Acharya, Sushil; Wu, Peter
2014-01-01
Industrially significant metal manufacturing processes such as melting, casting, rolling, forging, machining, and forming are multi-stage, complex processes that are labor, time, and capital intensive. Academic research develops mathematical modeling of these processes that provide a theoretical framework for understanding the process variables…
Anticipatory control: A software retrofit for current plant controllers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parthasarathy, S.; Parlos, A.G.; Atiya, A.F.
1993-01-01
The design and simulated testing of an artificial neural network (ANN)-based self-adapting controller for complex process systems are presented in this paper. The proposed controller employs concepts based on anticipatory systems, which have been widely used in the petroleum and chemical industries, and they are slowly finding their way into the power industry. In particular, model predictive control (MPC) is used for the systematic adaptation of the controller parameters to achieve desirable plant performance over the entire operating envelope. The versatile anticipatory control algorithm developed in this study is projected to enhance plant performance and lend robustness to drifts in plant parameters and to modeling uncertainties. This novel technique of integrating recurrent ANNs with a conventional controller structure appears capable of controlling complex, nonlinear, and nonminimum phase process systems. The direct, on-line adaptive control algorithm presented in this paper considers the plant response over a finite time horizon, diminishing the need for manual control or process interruption for controller gain tuning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cafferty, Kara Grace
This report describes conditions, as required by the state of Idaho Industrial Wastewater Reuse Permit (WRU-I-0160-01, Modification 1, formerly LA 000160 01), for the wastewater reuse site at the Idaho National Laboratory Site’s Materials and Fuels Complex Industrial Waste Ditch and Industrial Waste Pond from November 1, 2015, through October 31, 2016.
NASA Astrophysics Data System (ADS)
Chen, Bing; Stein, Ariel F.; Maldonado, Pabla Guerrero; Sanchez de la Campa, Ana M.; Gonzalez-Castanedo, Yolanda; Castell, Nuria; de la Rosa, Jesus D.
2013-06-01
This study presents a description of the emission, transport, dispersion, and deposition of heavy metals contained in atmospheric aerosols emitted from a large industrial complex in southern Spain using the HYSPLIT model coupled with high- (MM5) and low-resolution (GDAS) meteorological simulations. The dispersion model was configured to simulate eight size fractions (<0.33, 0.66, 1.3, 2.5, 5, 14, 17, and >17 μm) of metals based on direct measurements taken at the industrial emission stacks. Twelve stacks in four plants were studied and the stacks showed considerable differences for both emission fluxes and size ranges of metals. We model the dispersion of six major metals; Cr, Co, Ni, La, Zn, and Mo, which represent 77% of the total mass of the 43 measured elements. The prediction shows that the modeled industrial emissions produce an enrichment of heavy metals by a factor of 2-5 for local receptor sites when compared to urban and rural background areas in Spain. The HYSPLIT predictions based on the meteorological fields from MM5 show reasonable consistence with the temporal evolution of concentrations of Cr, Co, and Ni observed at three sites downwind of the industrial area. The magnitude of concentrations of metals at two receptors was underestimated for both MM5 (by a factor of 2-3) and GDAS (by a factor of 4-5) meteorological runs. The model prediction shows that heavy metal pollution from industrial emissions in this area is dominated by the ultra-fine (<0.66 μm) and fine (<2.5 μm) size fractions.
NASA Astrophysics Data System (ADS)
Kassem, M.; Soize, C.; Gagliardini, L.
2009-06-01
In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.
Three dimensional hair model by means particles using Blender
NASA Astrophysics Data System (ADS)
Alvarez-Cedillo, Jesús Antonio; Almanza-Nieto, Roberto; Herrera-Lozada, Juan Carlos
2010-09-01
The simulation and modeling of human hair is a computationally demanding process due to the large number of factors that must be calculated to give a realistic appearance. Generally, the method used in the film industry to simulate hair is based on particle-handling graphics. In this paper we present a simple approximation of how to model human hair using particles in Blender.
Petticrew, Mark; Katikireddi, Srinivasa Vittal; Knai, Cécile; Cassidy, Rebecca; Maani Hessari, Nason; Thomas, James; Weishaar, Heide
2017-01-01
Background Corporations use a range of strategies to dispute their role in causing public health harms and to limit the scope of effective public health interventions. This is well documented in relation to the activities of the tobacco industry, but research on other industries is less well developed. We therefore analysed public statements and documents from four unhealthy commodity industries to investigate whether and how they used arguments about complexity in this way. Methods We analysed alcohol, food, soda and gambling industry documents and websites and minutes of reports of relevant health select committees, using standard document analysis methods. Results Two main framings were identified: (i) these industries argue that aetiology is complex, so individual products cannot be blamed; and (ii) they argue that population health measures are ‘too simple’ to address complex public health problems. However, in this second framing, there are inherent contradictions in how industry used ‘complexity’, as their alternative solutions are generally not, in themselves, complex. Conclusion The concept of complexity, as commonly used in public health, is also widely employed by unhealthy commodity industries to influence how the public and policymakers understand health issues. It is frequently used in response to policy announcements and in response to new scientific evidence (particularly evidence on obesity and alcohol harms). The arguments and language may reflect the existence of a cross-industry ‘playbook’, whose use results in the undermining of effective public health policies – in particular the undermining of effective regulation of profitable industry activities that are harmful to the public’s health. PMID:28978619
High-frequency CAD-based scattering model: SERMAT
NASA Astrophysics Data System (ADS)
Goupil, D.; Boutillier, M.
1991-09-01
Specifications for an industrial radar cross section (RCS) calculation code are given: it must be able to exchange data with many computer aided design (CAD) systems, it must be fast, and it must have powerful graphic tools. Classical physical optics (PO) and equivalent currents (EC) techniques have proven their efficiency on simple objects for a long time. Difficult geometric problems occur when objects with very complex shapes have to be computed. Only a specific geometric code can solve these problems. We have established that, once these problems have been solved: (1) PO and EC give good results on complex objects of large size compared to wavelength; and (2) the implementation of these objects in a software package (SERMAT) allows fast and sufficiently precise domain RCS calculations to meet industry requirements in the domain of stealth.
NASA Astrophysics Data System (ADS)
Gigante-Barrera, Ángel; Dindar, Serdar; Kaewunruen, Sakdirat; Ruikar, Darshan
2017-10-01
Railway turnouts are complex systems designed using complex geometries and grades, which makes them difficult to manage in terms of risk prevention. This feature poses a substantial peril to rail users, as it is considered a cause of derailment. In addition, derailment leads to financial losses due to operational downtime and to monetary compensation in the case of death or injury. These are fundamental drivers for mitigating risks arising from poor risk management during design. Prevention through design (PtD) is a process that introduces tacit knowledge from industry professionals during the design process. There is evidence that Building Information Modelling (BIM) can help to mitigate risk from the inception of the project. BIM is considered an Information System (IS) where tacit knowledge can be stored in and retrieved from a digital database, making it easy to take prompt decisions as information is ready to be analysed. BIM at the model element level entails working with 3D elements and embedded data, thereby adding a layer of complexity to the management of information along the different stages of the project and across different disciplines. In order to overcome this problem, the industry has created a framework for model progression specification named Level of Development (LOD). The paper presents an IDM-based framework for design risk mitigation through code validation using the LOD. This effort resulted in risk datasets which describe a rail turnout graphically and non-graphically as the model progresses, thus permitting its inclusion within risk information systems. The assignment of an LOD construct to a set of data requires specialised management and process-related expertise. Furthermore, the selection of a set of LOD constructs requires a purpose-based analysis. Therefore, a framework for LOD construct implementation within the IDM for code checking is required for the industry to progress in this particular field.
Hong, H. L.; Wang, Q.; Dong, C.; Liaw, Peter K.
2014-01-01
Metallic alloys show complex chemistries that are not yet fully understood. It has been widely accepted that behind the composition selection lies a short-range-order mechanism for solid solutions. The present paper addresses this fundamental question by examining the face-centered-cubic Cu-Zn α-brasses. A new structural approach, the cluster-plus-glue-atom model, is introduced, which is specifically suited to the description of short-range-order structures in disordered systems. Two types of formulas are pointed out, [Zn-Cu12]Zn1~6 and [Zn-Cu12](Zn,Cu)6, which explain the α-brasses listed in the American Society for Testing and Materials (ASTM) specifications. In these formulas, the bracketed parts represent the 1st-neighbor cluster, and each cluster is matched with one to six 2nd-neighbor Zn atoms or with six mixed (Zn,Cu) atoms. Such a cluster-based formalism describes the 1st- and 2nd-neighbor local atomic units where the solute and solvent interactions are ideally satisfied. The Cu-Ni industrial alloys are also explained, thus proving the universality of the cluster-formula approach in understanding the alloy selections. The revelation of the composition formulas for the Cu-(Zn,Ni) industrial alloys points to the common existence of simple composition rules behind seemingly complex chemistries of industrial alloys, thus offering a fundamental and practical method towards composition interpretations of all kinds of alloys. PMID:25399835
Hong, H. L.; Wang, Q.; Dong, C.; ...
2014-11-17
Metallic alloys show complex chemistries that are not yet fully understood. It has been widely accepted that behind the composition selection lies a short-range-order mechanism for solid solutions. The present paper addresses this fundamental question by examining the face-centered-cubic Cu-Zn α-brasses. A new structural approach, the cluster-plus-glue-atom model, is introduced, which is specifically suited to the description of short-range-order structures in disordered systems. Two types of formulas are pointed out, [Zn-Cu12]Zn1~6 and [Zn-Cu12](Zn,Cu)6, which explain the α-brasses listed in the American Society for Testing and Materials (ASTM) specifications. In these formulas, the bracketed parts represent the 1st-neighbor cluster, and each cluster is matched with one to six 2nd-neighbor Zn atoms or with six mixed (Zn,Cu) atoms. Such a cluster-based formalism describes the 1st- and 2nd-neighbor local atomic units where the solute and solvent interactions are ideally satisfied. The Cu-Ni industrial alloys are also explained, thus proving the universality of the cluster-formula approach in understanding the alloy selections. As a result, the revelation of the composition formulas for the Cu-(Zn,Ni) industrial alloys points to the common existence of simple composition rules behind seemingly complex chemistries of industrial alloys, thus offering a fundamental and practical method towards composition interpretations of all kinds of alloys.
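As a worked illustration of the cluster-plus-glue-atom formulas, the sketch below converts a formula such as [Zn-Cu12]Zn3 (one Zn-centred Cu12 cluster plus an assumed three glue Zn atoms) into atomic and weight percent for comparison with ASTM brass grades; the chosen glue count is illustrative only.

```python
# Convert a cluster-plus-glue-atom formula into atomic and weight percent.
ATOMIC_MASS = {"Cu": 63.546, "Zn": 65.38}

def cluster_formula(center, shell, shell_count, glue, glue_count):
    counts = {center: 1}
    counts[shell] = counts.get(shell, 0) + shell_count
    counts[glue] = counts.get(glue, 0) + glue_count
    total = sum(counts.values())
    at_pct = {el: 100 * n / total for el, n in counts.items()}
    mass = {el: n * ATOMIC_MASS[el] for el, n in counts.items()}
    wt_pct = {el: 100 * m / sum(mass.values()) for el, m in mass.items()}
    return at_pct, wt_pct

at_pct, wt_pct = cluster_formula("Zn", "Cu", 12, "Zn", 3)   # [Zn-Cu12]Zn3, 16 atoms
print("at.%:", {k: round(v, 1) for k, v in at_pct.items()})
print("wt.%:", {k: round(v, 1) for k, v in wt_pct.items()})
```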
ERIC Educational Resources Information Center
Angier, Natalie
1983-01-01
Scientists are designing computer models of biological systems, and of compounds with complex molecules, that can be used to get answers once obtainable only by sacrificing laboratory animals. Although most programs are still under development, some are in use by industrial/pharmaceutical companies. The programs and experiments they simulate are…
Ganga, G M D; Esposto, K F; Braatz, D
2012-01-01
The occupational exposure limits of different risk factors for development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown phenomena. Industrial ergonomists' role becomes further complicated because the potential risk factors that may contribute towards the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to develop a comparative study between predictions based on the neural network-based model proposed by Zurada, Karwowski & Marras (1997) and a linear discriminant analysis model, for making predictions about industrial jobs according to their potential risk of low back disorders due to workplace design. The results obtained through applying the discriminant analysis-based model proved that it is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous regarding cost and time savings for future data gathering.
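A hedged sketch of the comparison follows: a linear discriminant model and a small neural network classifying jobs into low/high-risk groups, with synthetic placeholder features standing in for the lifting-job risk factors used in the original studies.

```python
# Compare linear discriminant analysis against a small MLP on a binary risk label.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n = 235
X = rng.normal(size=(n, 5))            # e.g. lift rate, load moment, trunk angles (placeholders)
y = (X @ np.array([0.9, 0.7, 0.4, 0.2, 0.1]) + rng.normal(0, 0.8, n) > 0).astype(int)

lda = LinearDiscriminantAnalysis()
mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
print("LDA accuracy:", cross_val_score(lda, X, y, cv=5).mean().round(3))
print("MLP accuracy:", cross_val_score(mlp, X, y, cv=5).mean().round(3))
```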
Advanced process control framework initiative
NASA Astrophysics Data System (ADS)
Hill, Tom; Nettles, Steve
1997-01-01
The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking the knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. Each has complementary goals and expertise to contribute; Honeywell represents the supplier viewpoint, AMD represents the user with 'real customer requirements', and SEMATECH provides a consensus-building organization that widely disseminates technology to suppliers and users in the semiconductor industry that face similar equipment and factory control systems challenges.
Application of process tomography in gas-solid fluidised beds in different scales and structures
NASA Astrophysics Data System (ADS)
Wang, H. G.; Che, H. Q.; Ye, J. M.; Tu, Q. Y.; Wu, Z. P.; Yang, W. Q.; Ocone, R.
2018-04-01
Gas-solid fluidised beds are commonly used in particle-related processes, e.g. for coal combustion and gasification in the power industry, and the coating and granulation process in the pharmaceutical industry. Because the operation efficiency depends on the gas-solid flow characteristics, it is necessary to investigate the flow behaviour. This paper is about the application of process tomography, including electrical capacitance tomography (ECT) and microwave tomography (MWT), in multi-scale gas-solid fluidisation processes in the pharmaceutical and power industries. This is the first time that both ECT and MWT have been applied for this purpose in multi-scale and complex structure. To evaluate the sensor design and image reconstruction and to investigate the effects of sensor structure and dimension on the image quality, a normalised sensitivity coefficient is introduced. In the meantime, computational fluid dynamic (CFD) analysis based on a computational particle fluid dynamic (CPFD) model and a two-phase fluid model (TFM) is used. Part of the CPFD-TFM simulation results are compared and validated by experimental results from ECT and/or MWT. By both simulation and experiment, the complex flow hydrodynamic behaviour in different scales is analysed. Time-series capacitance data are analysed both in time and frequency domains to reveal the flow characteristics.
Zhang, Qin; Yao, Quanying
2018-05-01
The dynamic uncertain causality graph (DUCG) is a newly presented framework for uncertain causality representation and probabilistic reasoning. It has been successfully applied to online fault diagnosis of large, complex industrial systems and to disease diagnosis. This paper extends the DUCG to model more complex cases than could previously be modeled, e.g., the case in which statistical data are in different groups with or without overlap, and some domain knowledge and actions (new variables with uncertain causalities) are introduced. In other words, this paper proposes to use -mode, -mode, and -mode of the DUCG to model such complex cases and then transform them into either the standard -mode or the standard -mode. In the former situation, if no directed cyclic graph is involved, the transformed result is simply a Bayesian network (BN), and existing inference methods for BNs can be applied. In the latter situation, an inference method based on the DUCG is proposed. Examples are provided to illustrate the methodology.
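The "transform to a BN, then infer" step can be illustrated on a toy three-node fault chain evaluated by direct enumeration; the structure and probabilities below are invented for illustration and are unrelated to the DUCG formalism's actual parameters.

```python
# Posterior of a root fault given two observed symptoms, by direct enumeration.
p_root = {True: 0.05, False: 0.95}
p_s1_given_root = {True: 0.9, False: 0.1}      # P(S1 observed | Root state)
p_s2_given_root = {True: 0.7, False: 0.05}     # P(S2 observed | Root state)

def joint(root, s1, s2):
    p = p_root[root]
    p *= p_s1_given_root[root] if s1 else 1 - p_s1_given_root[root]
    p *= p_s2_given_root[root] if s2 else 1 - p_s2_given_root[root]
    return p

evidence = {"s1": True, "s2": True}            # both symptoms observed
num = joint(True, **evidence)
den = sum(joint(r, **evidence) for r in (True, False))
print("P(Root fault | S1, S2) =", round(num / den, 3))
```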
Ammonia formation by a thiolate-bridged diiron amide complex as a nitrogenase mimic
NASA Astrophysics Data System (ADS)
Li, Yang; Li, Ying; Wang, Baomin; Luo, Yi; Yang, Dawei; Tong, Peng; Zhao, Jinfeng; Luo, Lun; Zhou, Yuhan; Chen, Si; Cheng, Fang; Qu, Jingping
2013-04-01
Although nitrogenase enzymes routinely convert molecular nitrogen into ammonia under ambient temperature and pressure, this reaction is currently carried out industrially using the Haber-Bosch process, which requires extreme temperatures and pressures to activate dinitrogen. Biological fixation occurs through dinitrogen and reduced NxHy species at multi-iron centres of compounds bearing sulfur ligands, but it is difficult to elucidate the mechanistic details and to obtain stable model intermediate complexes for further investigation. Metal-based synthetic models have been applied to reveal partial details, although most models involve a mononuclear system. Here, we report a diiron complex bridged by a bidentate thiolate ligand that can accommodate HN=NH. Following reductions and protonations, HN=NH is converted to NH3 through pivotal intermediate complexes bridged by N2H3- and NH2- species. Notably, the final ammonia release was effected with water as the proton source. Density functional theory calculations were carried out, and a pathway of biological nitrogen fixation is proposed.
Li, Yanxia; Xiong, Lihu; Zhu, Wenjia
2017-01-01
Coastal wetlands offer many important ecosystem services in both natural and social systems. How to simultaneously decrease the destructive effects of human activities and maintain the sustainability of regional wetland ecosystems is an important issue for coastal wetland zones. We use carbon credits as the basis for regional sustainable development policy-making. For the case of Gouqi Island, a typical coastal wetland zone located in the East China Sea, a carbon cycle model was developed to illustrate the complex social-ecological processes. Carbon-related processes in the natural ecosystem, primary industry, secondary industry, tertiary industry, and residents of the island were identified in the model. The model showed that 36,780 tons of carbon were released to the atmosphere in the form of CO2 and 51,240 tons of carbon were captured by the ecosystem in 2014, and that the three major sources of carbon emissions are transportation, tourism development and seawater desalination. Based on the carbon-related processes and carbon balance, we propose suggestions on the sustainable development strategy of Gouqi Island as a coastal wetland zone.
MLP based models to predict PM10, O3 concentrations, in Sines industrial area
NASA Astrophysics Data System (ADS)
Durao, R.; Pereira, M. J.
2012-04-01
Sines is an important Portuguese industrial area located on the southwest coast of Portugal, close to important protected natural areas. The main economic activities are related to this industrial area, the deep-water port, and the petrochemical and thermo-electric industries. Nevertheless, tourism is also an important economic activity, especially in summer, with potential to grow. The aim of this study is to develop prediction models of pollutant concentration categories (e.g. low concentration and high concentration) in order to provide early warnings to the competent authorities responsible for air quality management. Advance knowledge of the occurrence of high pollutant concentrations will allow the implementation of mitigation actions and the release of precautionary alerts to the population. The regional air quality monitoring network consists of three monitoring stations where a set of pollutant concentrations is registered on a continuous basis. Tropospheric ozone (O3) and particulate matter (PM10) stand out from this set because of the high concentrations occurring in the region and their adverse effects on human health. Moreover, the major industrial plants of the region monitor SO2, NO2 and particle emission flows at the principal chimneys (point sources), also on a continuous basis. Artificial neural networks (ANN) were therefore the methodology applied to predict next-day pollutant concentrations; owing to their structure, ANNs are able to capture the non-linear relationships between predictor variables. Hence the first step of this study was to apply multivariate exploratory techniques to select the best predictor variables; the classification trees methodology (CART) proved the most appropriate in this case. Results showed that atmospheric pollutant concentrations depend mainly on industrial emissions and on a complex combination of meteorological factors and the time of year. In the second step, multi-layer perceptrons (MLP) were shown to be able to learn the existing complex relationships using different combinations of meteorological and emission variables. Furthermore, the MLP models identified the meteorological conditions that most affect O3 and PM10 concentrations in the region, namely wind speed and direction, boundary layer height, temperature, sunshine duration, relative humidity and weather type. The developed MLP models showed good predictive success, with model performances between 0.66 and 0.87, indicating reasonable accuracy for model development and generalization capability. These performance values were obtained using cross-entropy error functions. These error functions are only available for classification problems and ensure that the network outputs are true class-membership probabilities, which is known to enhance the performance of classification neural networks.
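For illustration only: the abstract does not give the network architecture or the exact predictors, so the minimal sketch below uses a generic scikit-learn multi-layer perceptron classifier (trained with a log-loss, i.e. cross-entropy, objective) on invented meteorological/emission features to label days as low or high concentration.

```python
# Minimal sketch, assuming synthetic placeholder predictors (wind, boundary layer
# height, temperature, humidity, industrial emission flows, etc.).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                                        # placeholder predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0)  # 1 = "high concentration" day
y = y.astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)

# MLP trained with a cross-entropy (log-loss) objective, so outputs are class probabilities
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_tr), y_tr)
print("class-membership probabilities:", clf.predict_proba(scaler.transform(X_te))[:3])
print("accuracy:", clf.score(scaler.transform(X_te), y_te))
```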
Posch, Andreas E; Spadiut, Oliver; Herwig, Christoph
2012-06-22
Filamentous fungi are versatile cell factories and widely used for the production of antibiotics, organic acids, enzymes and other industrially relevant compounds at large scale. In fact, industrial production processes employing filamentous fungi are commonly based on complex raw materials. However, considerable lot-to-lot variability of complex media ingredients not only demands exhaustive incoming-component inspection and quality control, but unavoidably affects process stability and performance. Thus, switching bioprocesses from complex to defined media is highly desirable. This study presents a strategy for strain characterization of filamentous fungi on partly complex media using redundant mass balancing techniques. Applying the suggested method, interdependencies between specific biomass and side-product formation rates, production of fructooligosaccharides, specific complex media component uptake rates and fungal strains were revealed. A 2-fold increase in the overall penicillin space-time yield and a 3-fold increase in the maximum specific penicillin formation rate were reached in defined media compared to complex media. The newly developed methodology enabled fast characterization of two different industrial Penicillium chrysogenum candidate strains on complex media, based on specific complex media component uptake kinetics, and identification of the most promising strain for switching the process from complex to defined conditions. Characterization at different complex/defined media ratios using only a limited number of analytical methods allowed maximizing the overall industrial objectives of increasing both method throughput and the generation of scientific process understanding.
[The economic-industrial health care complex and the social and economic dimension of development].
Gadelha, Carlos Augusto Grabois; Costa, Laís Silveira; Maldonado, José
2012-12-01
The strategic role of health care in the national development agenda has been increasingly recognized and institutionalized. In addition to its importance as a structuring element of the Social Welfare State, health care plays a leading role in the generation of innovation - an essential element for competitiveness in the knowledge society. However, health care's productive basis is still fragile, and this negatively affects both the universal provision of health care services and Brazil's competitive inclusion in the globalized environment. This situation suggests the need for a more systematic analysis of the complex relationships among productive, technological and social interests in the field of health care. Consequently, it is necessary to produce further knowledge about the Economic-Industrial Health Care Complex due to its potential for contributing to a socially inclusive development model. This means reversing the hierarchy between economic and social interests in the sanitary field, thus minimizing the vulnerability of the Brazilian health care policy.
New Age of 3D Geological Modelling or Complexity is not an Issue Anymore
NASA Astrophysics Data System (ADS)
Mitrofanov, Aleksandr
2017-04-01
A geological model has significant value in almost all types of research related to regional mapping and geodynamics, and especially to the structural and resource geology of mineral deposits. A well-developed geological model must take into account all vital features of the modelled object without over-simplification and should also adequately represent the interpretation of the geologist. In recent years, with the gradual exhaustion of deposits with relatively simple morphology, geologists all over the world have been faced with the necessity of building representative models for more and more structurally complex objects. Meanwhile, the set of tools used for this has not changed significantly in the last two to three decades. The most widespread method of wireframe geological modelling was developed in the 1990s and is fully based on an engineering design set of instruments (so-called CAD). Strings and polygons representing the section-based interpretation are used as an intermediate step in the generation of wireframes. Despite the significant time required for this type of modelling, it can still provide sufficient results for simple and medium-complexity geological objects. However, with increasing complexity, more and more vital features of the deposit are sacrificed because of the fundamental inability of CAD-based explicit techniques to develop wireframes of the appropriate complexity, or the much greater modelling time they require. At the same time, an alternative technology, which is not based on a sectional approach and which uses fundamentally different mathematical algorithms, has been actively developed in a variety of other disciplines: medicine, advanced industrial design, and the game and cinema industries. In recent years this implicit technology has begun to be developed for geological modelling purposes, and it is now represented by a very powerful set of tools integrated into almost all major commercial software packages. Implicit modelling allows the development of geological models that genuinely correspond to complicated geological reality. Models can include fault blocking, complex structural trends and folding; they can be based on extensive input datasets (such as dense drilling at the mining stage) or, on the other hand, on quite few drillhole intersections with significant input from the geological interpretation of the deposit. In any case, implicit modelling, if used correctly, allows the whole batch of geological data to be incorporated and relatively quickly yields easily adjustable, flexible and robust geological wireframes that can be used as a reliable foundation in the following stages of geological investigation. In SRK practice nowadays, almost all wireframe models used for structural and resource geology are developed with implicit modelling tools, which has significantly increased the speed and quality of geological modelling.
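For orientation only: implicit methods generally interpolate a scalar field from sparse observations and take an iso-surface as the geological boundary. The sketch below assumes made-up drillhole intersection coordinates and uses SciPy's radial basis function interpolator, one common implicit technique but not necessarily the one implemented in the commercial packages mentioned above.

```python
# Minimal sketch: interpolate a signed scalar field from sparse drillhole
# intersections (+1 inside ore, -1 outside) and treat the zero level set as the
# modelled boundary. Coordinates and values are invented for illustration.
import numpy as np
from scipy.interpolate import RBFInterpolator

pts = np.array([[0, 0, 0], [10, 0, 5], [0, 10, 5], [10, 10, 0],   # inside-ore intersections
                [5, 5, 20], [-5, 5, 15], [15, -5, 15]], float)     # outside-ore intersections
vals = np.array([1, 1, 1, 1, -1, -1, -1], float)

field = RBFInterpolator(pts, vals, kernel="thin_plate_spline")

# Evaluate on a coarse grid; the implicit ore boundary is where the field crosses zero
xs, ys, zs = np.meshgrid(np.linspace(-5, 15, 21), np.linspace(-5, 15, 21),
                         np.linspace(0, 20, 21), indexing="ij")
grid = np.column_stack([xs.ravel(), ys.ravel(), zs.ravel()])
f = field(grid).reshape(xs.shape)
print("fraction of grid classified as ore:", (f > 0).mean())
```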
Optimization of a method for preparing solid complexes of essential clove oil with β-cyclodextrins.
Hernández-Sánchez, Pilar; López-Miranda, Santiago; Guardiola, Lucía; Serrano-Martínez, Ana; Gabaldón, José Antonio; Nuñez-Delicado, Estrella
2017-01-01
Clove oil (CO) is an aromatic oily liquid used in the food, cosmetics and pharmaceutical industries for its functional properties. However, its disadvantages of pungent taste, volatility, light sensitivity and poor water solubility can be solved by applying microencapsulation or complexation techniques. Essential CO was successfully solubilized in aqueous solution by forming inclusion complexes with β-cyclodextrins (β-CDs). Moreover, phase solubility studies demonstrated that essential CO also forms insoluble complexes with β-CDs. Based on these results, essential CO-β-CD solid complexes were prepared by the novel approach of microwave irradiation (MWI), followed by three different drying methods: vacuum oven drying (VO), freeze-drying (FD) or spray-drying (SD). FD was the best option for drying the CO-β-CD solid complexes, followed by VO and SD. MWI can be used efficiently to prepare essential CO-β-CD complexes with good yield on an industrial scale. © 2016 Society of Chemical Industry.
Query-Time Optimization Techniques for Structured Queries in Information Retrieval
ERIC Educational Resources Information Center
Cartright, Marc-Allen
2013-01-01
The use of information retrieval (IR) systems is evolving towards larger, more complicated queries. Both the IR industrial and research communities have generated significant evidence indicating that in order to continue improving retrieval effectiveness, increases in retrieval model complexity may be unavoidable. From an operational perspective,…
THE INFLUENCE OF A TALL BUILDING ON STREET-CANYON FLOW IN AN URBAN NEIGHBORHOOD
This study presents a velocity comparison between meteorological wind tunnel results and results from the Quick Urban & Industrial Complex model (QUIC, version 3.9) for a simplified urban area, representing a regular array of city blocks composed of row houses in Brooklyn, New Yo...
Exploring the sustainability of industrial production and energy generation with a model system
The importance and complexity of sustainability has been well recognized and a formal study of sustainability based on system theory approaches is imperative as many of the relationships between the various components of the system could be non-linear, intertwined, and non-intuit...
Calibration of 3D ALE finite element model from experiments on friction stir welding of lap joints
NASA Astrophysics Data System (ADS)
Fourment, Lionel; Gastebois, Sabrina; Dubourg, Laurent
2016-10-01
In order to support the design of a complex process such as Friction Stir Welding (FSW) for the aeronautic industry, numerical simulation software requires (1) an efficient and accurate Finite Element (F.E.) formulation that allows welding defects to be predicted, (2) proper modeling of the thermo-mechanical complexity of the FSW process and (3) calibration of the F.E. model against accurate measurements from FSW experiments. This work uses a parallel ALE formulation developed in the Forge® F.E. code to model the different possible defects (flashes and worm holes), while pin and shoulder threads are modeled by a new friction law at the tool/material interface. The FSW experiments use a complex tool with a scroll shoulder, instrumented to provide sensitive thermal data close to the joint. Calibration of unknown material thermal coefficients, constitutive equation parameters and the friction model from measured forces, torques and temperatures is carried out using two F.E. models, Eulerian and ALE, to reach a satisfactory agreement assessed by the proper sensitivity of the simulation to process parameters.
Using neural networks for prediction of air pollution index in industrial city
NASA Astrophysics Data System (ADS)
Rahman, P. A.; Panchenko, A. A.; Safarov, A. M.
2017-10-01
This paper is dedicated to the use of artificial neural networks for predicting the state of the atmospheric air of an industrial city, so as to support prompt environmental decisions. The paper also describes the development of two types of prediction models for determining the air pollution index on the basis of neural networks: a temporal model (a short-term forecast of pollutant content in the air for the coming days) and a spatial model (a forecast of the atmospheric pollution index at any point in the city). The stages of development of the neural network models are briefly reviewed and their parameters are described. An assessment of the adequacy of the prediction models, based on the correlation coefficient between the output and reference data, is also provided. Moreover, because the "neural network code" of the proposed models is difficult for ordinary users to interpret, software implementations allowing practical use of the neural network models are also offered. It is established that the obtained neural network models provide sufficiently reliable forecasts, which means that they are an effective tool for analyzing and predicting the dynamics of air pollution in an industrial city. Thus, this work addresses the pressing problem of forecasting the atmospheric air pollution index in industrial cities using neural network models.
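A minimal, purely illustrative sketch of the adequacy check described above follows; the abstract gives neither the network architecture nor the predictors, so both are invented here, and a scikit-learn regressor stands in for whatever network the authors used.

```python
# Minimal sketch, assuming synthetic meteorology/emission inputs and a synthetic
# next-day air pollution index as the target.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 6))                                     # placeholder daily inputs
api = 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.3, size=400)   # placeholder pollution index

X_tr, X_te, y_tr, y_te = train_test_split(X, api, test_size=0.25, random_state=1)
model = MLPRegressor(hidden_layer_sizes=(12,), max_iter=3000, random_state=1).fit(X_tr, y_tr)

# adequacy assessed as the correlation between model output and reference data
r = np.corrcoef(model.predict(X_te), y_te)[0, 1]
print(f"correlation between output and reference data: {r:.2f}")
```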
Large eddy simulation of flows in industrial compressors: a path from 2015 to 2035
Gourdain, N.; Sicot, F.; Duchaine, F.; Gicquel, L.
2014-01-01
A better understanding of turbulent unsteady flows is a necessary step towards a breakthrough in the design of modern compressors. Owing to high Reynolds numbers and very complex geometry, the flow that develops in such industrial machines is extremely hard to predict. At this time, the most popular method to simulate these flows is still based on a Reynolds-averaged Navier–Stokes approach. However, there is some evidence that this formalism is not accurate for these components, especially when a description of time-dependent turbulent flows is desired. With the increase in computing power, large eddy simulation (LES) emerges as a promising technique to improve both knowledge of complex physics and reliability of flow solver predictions. The objective of the paper is thus to give an overview of the current status of LES for industrial compressor flows as well as to propose future research axes regarding the use of LES for compressor design. While the use of wall-resolved LES for industrial multistage compressors at realistic Reynolds number should not be ready before 2035, some possibilities exist to reduce the cost of LES, such as wall modelling and the adaptation of the phase-lag condition. This paper also points out the necessity to combine LES to techniques able to tackle complex geometries. Indeed LES alone, i.e. without prior knowledge of such flows for grid construction or the prohibitive yet ideal use of fully homogeneous meshes to predict compressor flows, is quite limited today. PMID:25024422
The dynamic simulation of the Progetto Energia combined cycle power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giglio, R.; Cerabolini, M.; Pisacane, F.
1996-12-31
Over the next four years, the Progetto Energia project is building several cogeneration plants to satisfy the increasing demands of Italy's industrial complex and the country's demand for electrical power. Located at six different sites within Italy's borders, these Combined Cycle Cogeneration Plants will supply a total of 500 MW of electricity and 100 tons/hr of process steam to Italian industries and residences. To ensure project success, a dynamic model of the 50 MW base unit was developed. The goal established for the model was to predict the dynamic behavior of the complex thermodynamic system in order to assess equipment performance and control system effectiveness for normal operation and, more importantly, abrupt load changes. In addition to fulfilling its goals, the dynamic study guided modifications to controller logic that significantly improved steam drum pressure control and bypassed steam de-superheating performance. Simulations of normal and abrupt transient events allowed engineers to define optimum controller gain coefficients. The paper discusses the Combined Cycle plant configuration, its operating modes and control system, the dynamic model representation, the simulation results and project benefits.
Investigation of model-based physical design restrictions (Invited Paper)
NASA Astrophysics Data System (ADS)
Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl
2005-05-01
As lithography and other patterning processes become more complex and more non-linear with each generation, the task of physical design rules necessarily increases in complexity also. The goal of the physical design rules is to define the boundary between the physical layout structures which will yield well from those which will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However the rapid increase in design rule requirement complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) due to increased patterning complexity, opportunities for improving physical design restrictions by implementing model-based physical design methods are evident. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development and usage methodologies for semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low K1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally we summarize with a proposed flow and key considerations for MBPDR implementation.
Zhang, Yajun; Chai, Tianyou; Wang, Hong; Wang, Dianhui; Chen, Xinkai
2018-06-01
Complex industrial processes are multivariable and generally exhibit strong coupling among their control loops, with a heavily nonlinear nature. These properties make it very difficult to obtain an accurate model. As a result, conventional and data-driven control methods are difficult to apply. Using a twin-tank level control system as an example, a novel multivariable decoupling control algorithm with adaptive neural-fuzzy inference system (ANFIS)-based unmodeled dynamics (UD) compensation is proposed in this paper for a class of complex industrial processes. First, a nonlinear multivariable decoupling controller with UD compensation is introduced. Different from existing methods, a decomposition estimation algorithm using ANFIS is employed to estimate the UD, and the desired estimation and decoupling control effects are achieved. Second, the proposed method does not require the complicated switching mechanism that has been commonly used in the literature. This significantly simplifies the obtained decoupling algorithm and its realization. Third, based on some new lemmas and theorems, conditions for the stability and convergence of the closed-loop system are analyzed to show the uniform boundedness of all the variables. This is followed by a summary of experimental tests on a heavily coupled nonlinear twin-tank system that demonstrate the effectiveness and practicability of the proposed method.
García-Diéguez, Carlos; Bernard, Olivier; Roca, Enrique
2013-03-01
The Anaerobic Digestion Model No. 1 (ADM1) is a complex model which is widely accepted as a common platform for anaerobic process modeling and simulation. However, it has a large number of parameters and states that hinder its calibration and use in control applications. A principal component analysis (PCA) technique was extended and applied to simplify the ADM1 using data from an industrial wastewater treatment plant processing winery effluent. The method shows that the main model features could be obtained with a minimum of two reactions. A reduced stoichiometric matrix was identified and the kinetic parameters were estimated on the basis of representative known biochemical kinetics (Monod and Haldane). The obtained reduced model takes into account the measured states in the anaerobic wastewater treatment (AWT) plant and reproduces the dynamics of the process fairly accurately. The reduced model can support on-line control, optimization and supervision strategies for AWT plants. Copyright © 2013 Elsevier Ltd. All rights reserved.
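As a minimal illustration of the idea behind such a reduction (not the authors' extended PCA method), one can apply PCA to measured state trajectories and count how many components dominate the variance; two dominant components would be consistent with a two-reaction reduced model. The data below are synthetic placeholders, not the winery-effluent plant measurements.

```python
# Minimal sketch, assuming synthetic process data generated by two underlying "reactions".
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 200)
r1, r2 = np.exp(-0.3 * t), 1.0 - np.exp(-0.1 * t)                 # two underlying reaction extents
states = np.column_stack([r1 + 0.2 * r2, 2 * r1, 0.5 * r2, r1 - r2])  # measured states as mixtures
states += rng.normal(scale=0.01, size=states.shape)               # measurement noise

pca = PCA().fit(states)
# if the first two ratios carry nearly all the variance, two reactions suffice
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
```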
Wei, Wei; Lv, Zhaofeng; Cheng, Shuiyuan; Wang, Lili; Ji, Dongsheng; Zhou, Ying; Han, Lihui; Wang, Litao
2015-06-01
This study selected a petrochemical industrial complex in Beijing, China, to understand the characteristics of surface ozone (O3) in this industrial area through on-site measurement campaigns during July-August of 2010 and 2011, and to reveal the response of local O3 to its precursors' emissions through simulation with the NCAR Master Mechanism model (NCAR-MM). Measurement results showed that the O3 concentration in this industrial area was significantly higher, with a mean daily average of 124.6 μg/m3 and a mean daily maximum of 236.8 μg/m3, which are, respectively, 90.9 and 50.6 % higher than those in the Beijing urban area. Moreover, the diurnal O3 peak generally started early, at 11:00-12:00, and usually persisted for 5-6 h, quite different from the normal diurnal pattern of urban O3. We then used the NCAR-MM to simulate the average diurnal variation of photochemical O3 on sunny days of August 2010 in both the industrial and urban areas. A good agreement in the O3 diurnal variation pattern and in the relative O3 level was obtained for both areas. For the O3 daily maximum, for example, the calculated value in the industrial area was about 51 % higher than in the urban area, while the measured value in the industrial area was approximately 60 % higher than in the urban area. Finally, a sensitivity analysis of photochemical O3 to its precursors was conducted based on a set of VOCs/NOx emission cases. Simulation results implied that in the industrial area, under the current conditions, the response of O3 to VOCs is negative and to NOx is positive, with sensitivity coefficients of -0.16 to -0.43 and +0.04 to +0.06, respectively. By contrast, the urban area is within the VOC-limited regime, where ozone is enhanced in response to increasing VOC emissions and to decreasing NOx emissions. We therefore conclude that VOC emission controls for this petrochemical industrial complex will increase the potential risk of aggravating local ozone pollution, but will help inhibit ozone formation in the Beijing urban area by reducing VOC transport from the industrial area to the urban area.
Next Step Toward Widespread Residential Deep Energy Retrofits
DOE Office of Scientific and Technical Information (OSTI.GOV)
McIlvaine, J.; Saunders, S.; Bordelon, E.
The complexity of deep energy retrofits warrants additional training to successfully manage multiple improvements that will change whole house air, heat, and moisture flow dynamics. The home performance contracting industry has responded to these challenges by aggregating skilled labor for assessment of and implementation under one umbrella. Two emerging business models are profiled that seek to resolve many of the challenges, weaknesses, opportunities, and threats described for the conventional business models.
Proposed best practice for projects that involve modelling and simulation.
O'Kelly, Michael; Anisimov, Vladimir; Campbell, Chris; Hamilton, Sinéad
2017-03-01
Modelling and simulation has been used in many ways when developing new treatments. To be useful and credible, it is generally agreed that modelling and simulation should be undertaken according to some kind of best practice. A number of authors have suggested elements required for best practice in modelling and simulation. Elements that have been suggested include the pre-specification of goals, assumptions, methods, and outputs. However, a project that involves modelling and simulation could be simple or complex and could be of relatively low or high importance to the project. It has been argued that the level of detail and the strictness of pre-specification should be allowed to vary, depending on the complexity and importance of the project. This best practice document does not prescribe how to develop a statistical model. Rather, it describes the elements required for the specification of a project and requires that the practitioner justify in the specification the omission of any of the elements and, in addition, justify the level of detail provided about each element. This document is an initiative of the Special Interest Group for modelling and simulation. The Special Interest Group for modelling and simulation is a body open to members of Statisticians in the Pharmaceutical Industry and the European Federation of Statisticians in the Pharmaceutical Industry. Examples of a very detailed specification and a less detailed specification are included as appendices. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Tan, Chao; Chen, Hui; Wang, Chao; Zhu, Wanping; Wu, Tong; Diao, Yuanbo
2013-03-01
Near- and mid-infrared (NIR/MIR) spectroscopy techniques have gained great acceptance in industry due to their multiple applications and versatility. However, the success of an application often depends heavily on the construction of accurate and stable calibration models. For this purpose, a simple multi-model fusion strategy is proposed. It combines the Kohonen self-organizing map (KSOM), mutual information (MI) and partial least squares (PLS), and is therefore named KMICPLS. It works as follows: first, the original training set is fed into a KSOM for unsupervised clustering of samples, from which a series of training subsets is constructed. Thereafter, on each training subset, an MI spectrum is calculated and only the variables with MI values higher than the mean value are retained, based on which a candidate PLS model is constructed. Finally, a fixed number of PLS models are selected to produce a consensus model. Two NIR/MIR spectral datasets from the brewing industry are used for experiments. The results confirm its superior performance compared with two reference algorithms, i.e., conventional PLS and genetic algorithm-PLS (GAPLS). It can build more accurate and stable calibration models without increasing the complexity, and can be generalized to other NIR/MIR applications.
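A rough, non-authoritative sketch of this kind of fusion workflow is given below. KMeans stands in for the Kohonen map, the spectra and reference values are synthetic, and only the variable-selection threshold (MI above its mean) follows the description in the abstract.

```python
# Minimal sketch, assuming 120 synthetic "spectra" with 200 variables and a
# synthetic reference property; not the brewing-industry datasets of the paper.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.cross_decomposition import PLSRegression
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 200))
y = X[:, 40] + 0.5 * X[:, 120] + rng.normal(scale=0.1, size=120)

labels = KMeans(n_clusters=4, n_init=10, random_state=3).fit_predict(X)  # KSOM stand-in
models = []
for k in range(4):
    Xk, yk = X[labels == k], y[labels == k]
    mi = mutual_info_regression(Xk, yk, random_state=3)
    keep = mi > mi.mean()                         # retain variables above the mean MI
    models.append((keep, PLSRegression(n_components=3).fit(Xk[:, keep], yk)))

# consensus prediction: average the candidate PLS models
x_new = X[:5]
pred = np.mean([m.predict(x_new[:, keep]).ravel() for keep, m in models], axis=0)
print(pred)
```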
Shi, Ping; Yan, Bo
2016-01-01
We conducted an exploratory investigation of factors influencing the adoption of radio frequency identification (RFID) methods in the agricultural product distribution industry. Through a literature review and field research, and based on the technology-organization-environment (TOE) theoretical framework, this paper analyzes factors influencing RFID adoption in the agricultural product distribution industry with reference to three contexts: technological, organizational, and environmental. An empirical analysis of the TOE framework was conducted by applying structural equation modeling to data from a questionnaire survey of the agricultural product distribution industry in China. The results show that employee resistance and uncertainty are not supported by the model. Technological compatibility, perceived effectiveness, organizational size, upper management support, trust between enterprises, technical knowledge, competitive pressure and support from the Chinese government, which are supported by the model, have significantly positive effects on RFID adoption. Organizational size has the strongest positive effect, while competitive pressure has the smallest. Technological complexity and cost have significantly negative effects on RFID adoption, with cost being the most negative influencing factor. These research findings will give enterprises in the agricultural products supply chain a stronger understanding of the factors that influence RFID adoption, help them remain aware of how these factors affect adoption, and thus help them make more accurate and rational decisions when promoting RFID application in the agricultural product distribution industry.
Multiple beam mask writers: an industry solution to the write time crisis
NASA Astrophysics Data System (ADS)
Litt, Lloyd C.
2010-09-01
The semiconductor industry is under constant pressure to reduce production costs even as technology complexity increases. Lithography represents the most expensive process due to its high capital equipment costs and the implementation of low-k1 lithographic processes, which has added to the complexity of making masks through the greater use of optical proximity correction, pixelated masks, and double or triple patterning. Each of these mask technologies allows the production of semiconductors at future nodes while extending the utility of current immersion tools. Low k1 patterning complexity combined with increased data due to smaller feature sizes is driving extremely long mask write times. While a majority of the industry is willing to accept mask write times of up to 24 hours, evidence suggests that the write times for many masks at the 22 nm node and beyond will be significantly longer. It has been estimated that $50M+ in non-recurring engineering (NRE) costs will be required to develop a multiple beam mask writer system, yet the business case to recover this kind of investment is not strong. Moreover, funding such a development is a high risk for an individual supplier. The problem is compounded by a disconnect between the tool customer (the mask supplier) and the final mask customer that will bear the increased costs if a high speed writer is not available. Since no individual company will likely risk entering this market, some type of industry-wide funding model will be needed. Because SEMATECH's member companies strongly support a multiple beam technology for mask writers to reduce the write time and cost of 193 nm and EUV masks, SEMATECH plans to pursue an advanced mask writer program in 2011 and 2012. In 2010, efforts will focus on identifying a funding model to address the investment to develop such a technology.
NASA Technical Reports Server (NTRS)
Zerkle, Ronald D.; Prakash, Chander
1995-01-01
This viewgraph presentation summarizes some CFD experience at GE Aircraft Engines for flows in the primary gaspath of a gas turbine engine and in turbine blade cooling passages. It is concluded that application of the standard k-epsilon turbulence model with wall functions is not adequate for accurate CFD simulation of aerodynamic performance and heat transfer in the primary gas path of a gas turbine engine. New models are required in the near-wall region which include more physics than wall functions. The two-layer modeling approach appears attractive because of its modest computational complexity. In addition, improved CFD simulation of film cooling and turbine blade internal cooling passages will require anisotropic turbulence models. New turbulence models must be practical in order to have a significant impact on the engine design process. A coordinated turbulence modeling effort between NASA centers would be beneficial to the gas turbine industry.
Industrialization and Household Complexity in Rural Taiwan.
ERIC Educational Resources Information Center
Lavely, William
1990-01-01
In 274 Taiwanese townships, farm household complexity in 1960 and 1970 was positively related to the proportion of the labor force in nonagricultural occupations. The close proximity of industry to family farms in Taiwan has reduced rural to urban migration usually associated with industrialization. Contains 46 references. (Author/SV)
Cost efficiency of the non-associative flow rule simulation of an industrial component
NASA Astrophysics Data System (ADS)
Galdos, Lander; de Argandoña, Eneko Saenz; Mendiguren, Joseba
2017-10-01
Over the last decade, the metal forming industry has become more and more competitive. In this context, FEM modeling has become a primary source of information for component and process design. Numerous researchers have focused on improving the accuracy of the material models implemented in FEM in order to improve the efficiency of the simulations. With the aim of increasing the efficiency of anisotropic behavior modelling, the use of non-associative flow rule (NAFR) models has in recent years been presented as an alternative to the classic associative flow rule (AFR) models. In this work, the cost efficiency of the chosen flow rule has been numerically analyzed by simulating an industrial drawing operation with two different models of the same degree of flexibility: one AFR model and one NAFR model. From the present study, it is concluded that the flow rule has a negligible influence on the final drawing prediction; this is mainly driven by the model parameter identification procedure. Even though the NAFR formulation is more complex than the AFR one, the present study shows that the total simulation time with explicit FE solvers is reduced without loss of accuracy. Furthermore, NAFR formulations have an advantage over AFR formulations in parameter identification, because the formulation decouples the yield stress and the Lankford coefficients.
NASA Astrophysics Data System (ADS)
Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said
2016-02-01
In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining modelling by Petri nets with simulation using ARENA. The linear approach typically followed for this kind of problem faces modelling difficulties due to the complexity and the number of parameters of concern. The approach used in this work is therefore able to structure the modelling in a way that covers all aspects of the performance study. The structured modelling approach is first introduced and then applied to the case of an industrial system in the phosphate sector. The performance indicators obtained from the developed models made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. In addition, this paper shows how the Arena software can be used to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.
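Purely as an illustration of this kind of discrete-event performance model (using the open-source SimPy library instead of ARENA, and invented arrival and processing times rather than the phosphate plant's data), a minimal queueing sketch might look like this:

```python
# Minimal sketch, assuming exponential inter-arrival (mean 5) and processing (mean 4)
# times for parts flowing through a single processing unit; waiting time and
# throughput are collected as simple performance indicators.
import random
import simpy

WAITS = []

def part(env, machine):
    arrive = env.now
    with machine.request() as req:
        yield req
        WAITS.append(env.now - arrive)                    # waiting-time indicator
        yield env.timeout(random.expovariate(1 / 4.0))    # processing time

def source(env, machine):
    while True:
        yield env.timeout(random.expovariate(1 / 5.0))    # inter-arrival time
        env.process(part(env, machine))

random.seed(0)
env = simpy.Environment()
machine = simpy.Resource(env, capacity=1)
env.process(source(env, machine))
env.run(until=1000)
print(f"parts that entered service: {len(WAITS)}, mean wait: {sum(WAITS)/len(WAITS):.2f}")
```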
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mike Lewis
2013-02-01
This report summarizes radiological monitoring performed on samples from specific groundwater monitoring wells associated with the Industrial Wastewater Reuse Permit for the Materials and Fuels Complex Industrial Waste Ditch and Industrial Waste Pond WRU-I-0160-01, Modification 1 (formerly LA-000160-01). The radiological monitoring was performed to fulfill Department of Energy requirements under the Atomic Energy Act.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mike Lewis
2014-02-01
This report summarizes radiological monitoring performed on samples from specific groundwater monitoring wells associated with the Industrial Wastewater Reuse Permit for the Materials and Fuels Complex Industrial Waste Ditch and Industrial Waste Pond WRU-I-0160-01, Modification 1 (formerly LA-000160-01). The radiological monitoring was performed to fulfill Department of Energy requirements under the Atomic Energy Act.
NASA Astrophysics Data System (ADS)
Ren, Shuwen; Li, Jing; Guan, Huashi
2010-12-01
An excess of reactive oxygen species (ROS) leads to a variety of chronic health problems. As potent antioxidants, marine bioactive extracts containing oligosaccharides and peptides have been extensively studied. Recently, there has been growing interest in protein-polysaccharide complexes because of their potential uses in the pharmaceutical and food industries. However, only a few studies are available on the antioxidant activities of such complexes in terms of their ROS scavenging capability. In this study, we combined different marine oligosaccharides (isolated and purified) with collagen peptides derived from tilapia fish skin, and evaluated the antioxidant activity of the marine peptide-oligosaccharide complexes vis-à-vis the activity of their original component molecules. Biochemical and cellular assays were performed to measure the scavenging effects on 1,1-diphenyl-2-picrylhydrazyl (DPPH), hydroxyl and superoxide radicals, and to evaluate the influence on the activities of superoxide dismutase (SOD) and glutathione peroxidase (GSH-Px) and on the level of malondialdehyde (MDA) in UV-induced photoaging models. The results indicated that the antioxidant activities of all the complexes were stronger than those of their individual components. Among the 11 complexes tested, two, namely MA1000+CP and κ-ca3000+CP, turned out to be highly effective antioxidants. Although the detailed mechanisms of this improved scavenging ability are not fully understood, this work provides insights into the design of highly efficient peptide-oligosaccharide complexes for potential applications in the pharmaceutical, cosmetics and food industries.
Hanning, Brian; Predl, Nicolle
2015-09-01
Traditional overnight rehabilitation payment models in the private sector are not based on a rigorous classification system and vary greatly between contracts with no consideration of patient complexity. The payment rates are not based on relative cost and the length-of-stay (LOS) point at which a reduced rate applies (step downs) varies markedly. The rehabilitation Australian National Sub-Acute and Non-Acute Patient (AN-SNAP) model (RAM), which has been in place for over 2 years in some private hospitals, bases payment on a rigorous classification system, relative cost and industry LOS. RAM is in the process of being rolled out more widely. This paper compares and contrasts RAM with traditional overnight rehabilitation payment models. It considers the advantages of RAM for hospitals and Australian Health Service Alliance. It also considers payment model changes in the context of maintaining industry consistency with Electronic Claims Lodgement and Information Processing System Environment (ECLIPSE) and health reform generally.
USDA-ARS?s Scientific Manuscript database
Bovine mastitis is an inflammation-driven disease of the bovine mammary gland that costs the global dairy industry several billion dollars per annum. Because disease susceptibility is a multi-factorial complex phenotype, a multi-omic integrative biology approach is required to dissect the multilayer...
Inhibition of lipid oxidation by formation of caseinate-oil-oat gum complexes
USDA-ARS?s Scientific Manuscript database
Lipid oxidation, particularly oxidation of unsaturated fatty acids such as omega-3 fatty acids, has posed a serious challenge to the food industry trying to incorporate heart-healthy oil products into their lines of healthful foods and beverages. In this study, model plant oil was dispersed into so...
Managing Reward in Developing Economies: The Challenge for Multinational Corporations
ERIC Educational Resources Information Center
Opute, John
2010-01-01
Reward has been, and continues to be, subject to significant changes in developing economies; the industrial relations model prevalent being driven by the complex socio-economic and cultural paradigms and the increasing demands of globalisation. The issue of reward in developing economies is therefore central and dependent on numerous contextual…
A simulated approach to estimating PM10 and PM2.5 concentrations downwind from cotton gins
USDA-ARS?s Scientific Manuscript database
Cotton gins are required to obtain operating permits from state air pollution regulatory agencies (SAPRA), which regulate the amount of particulate matter that can be emitted. Industrial Source Complex Short Term version 3 (ISCST3) is the Gaussian dispersion model currently used by some SAPRAs to pr...
Sandstorms are frequent in the northern Chihuahuan Desert in New Mexico, an area characterized by open areas lacking vegetation, individual mesquite bushes, and mesquite coppice dunes. Field measurements of sand fluxes and wind velocities over a two year period provided a descri...
A Study of Ship Acquisition Cost Estimating in the Naval Sea Systems Command. Appendices
1977-10-01
Shipbuilding is a heavy fabrication industry producing small numbers of expensive, complex units of output. (1) Due to its heavy construction orientation, ... estimate future ship construction costs ... used extensively in production line industries such as automotive products and the airframe industry.
Application of the GERTS II simulator in the industrial environment.
NASA Technical Reports Server (NTRS)
Whitehouse, G. E.; Klein, K. I.
1971-01-01
GERT was originally developed to aid in the analysis of stochastic networks. GERT can be used to graphically model and analyze complex systems. Recently a simulator model, GERTS II, has been developed to solve GERT Networks. The simulator language used in the development of this model was GASP II A. This paper discusses the possible application of GERTS II to model and analyze (1) assembly line operations, (2) project management networks, (3) conveyor systems and (4) inventory systems. Finally, an actual application dealing with a job shop loading problem is presented.
The Next Step Toward Widespread Residential Deep Energy Retrofits
DOE Office of Scientific and Technical Information (OSTI.GOV)
McIlvaine, J.; Martin, E.; Saunders, S.
The complexity of deep energy retrofits warrants additional training to successfully manage multiple improvements that will change whole house air, heat, and moisture flow dynamics. The home performance contracting industry has responded to these challenges by aggregating skilled labor for assessment of and implementation under one umbrella. Two emerging business models are profiled that seek to resolve many of the challenges, weaknesses, opportunities, and threats described for the conventional business models.
NASA Astrophysics Data System (ADS)
Kozak, J.; Gulbinowicz, D.; Gulbinowicz, Z.
2009-05-01
The need for complex and accurate three-dimensional (3-D) microcomponents is increasing rapidly for many industrial and consumer products. The electrochemical machining (ECM) process has the potential to generate the desired crack-free and stress-free surfaces of microcomponents. This paper reports a study of pulse electrochemical micromachining (PECMM) using ultrashort (nanosecond) pulses for generating complex 3-D microstructures of high accuracy. A mathematical model of the microshaping process that takes into consideration unsteady phenomena in the electrical double layer has been developed. Software for computer simulation of PECMM has been developed, and the effects of machining parameters on anodic localization and the final shape of the machined surface are presented.
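For orientation only, the sketch below integrates the classical quasi-steady one-dimensional ECM gap equation ds/dt = k_v·κ·(U − ΔU)/s − v_f; it deliberately omits the double-layer charging dynamics that the paper adds for nanosecond pulses, and all parameter values are illustrative assumptions rather than values from the study.

```python
# Minimal sketch of the classical ECM inter-electrode gap evolution (explicit Euler).
# Units and magnitudes are illustrative only.
dt, t_end = 1e-3, 5.0
s, v_f = 0.30, 0.02                       # initial gap (mm), tool feed rate (mm/s)
k_v, kappa, U_eff = 0.033, 0.015, 12.0    # machinability mm^3/(A*s), conductivity S/mm, U - dU in V

t = 0.0
while t < t_end:
    i = kappa * U_eff / s                 # current density across the gap, A/mm^2
    s += (k_v * i - v_f) * dt             # gap grows by dissolution, shrinks by tool feed
    t += dt

print(f"gap after {t_end:.0f} s: {s:.3f} mm "
      f"(equilibrium gap: {k_v * kappa * U_eff / v_f:.3f} mm)")
```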
Constituent bioconcentration in rainbow trout exposed to a complex chemical mixture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linder, G.; Bergman, H.L.; Meyer, J.S.
1984-09-01
Classically, aquatic contaminant fate models predicting a chemical's bioconcentration factor (BCF) are based upon single-compound derived models, yet such BCF predictions may deviate from observed BCFs when physicochemical interactions or biological responses to complex chemical mixture exposures are not adequately considered in the predictive model. Rainbow trout were exposed to oil-shale retort waters. Such a study was designed to model the potential biological effects precluded by exposure to complex chemical mixtures such as solid waste leachates, agricultural runoff, and industrial process waste waters. Chromatographic analysis of aqueous and nonaqueous liquid-liquid reservoir components yielded differences in mixed extraction solvent HPLC profiles of whole fish exposed for 1 and 3 weeks to the highest dilution of the complex chemical mixture when compared to their corresponding control, yet subsequent whole fish extractions at 6, 9, 12, and 15 weeks into exposure demonstrated no qualitative differences between control and exposed fish. Liver extractions and deproteinized bile samples from exposed fish were qualitatively different than their corresponding controls. These findings support the projected NOEC of 0.0045% dilution, even though the differences in bioconcentration profiles suggest hazard assessment strategies may be useful in evaluating environmental fate processes associated with complex chemical mixtures. 12 references, 4 figures, 2 tables.
On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi
2008-01-01
Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from the predictions of their respective heritage cost models. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments has recently changed rapidly, by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument electronics data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general-purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments be developed for single-digit-watt power consumption, small size and light weight, while delivering super-computing capabilities. The conflict between the actual development cost of newer complex instruments and the predictions of the heritage cost models for their electronics components seems to be irreconcilable. This conflict and an approach to its resolution are addressed in this paper by determining complexity parameters and a complexity index, and by their use in an enhanced cost model.
Scaling Laws of Discrete-Fracture-Network Models
NASA Astrophysics Data System (ADS)
Philippe, D.; Olivier, B.; Caroline, D.; Jean-Raynald, D.
2006-12-01
The statistical description of fracture networks across scales still remains a concern for geologists, considering the complexity of fracture networks. A challenging task of the last 20 years of studies has been to find a solid and testable rationale for the trivial observation that fractures exist everywhere and at all sizes. The emergence of fractal models and power-law distributions quantifies this fact, and postulates in some way that small-scale fractures are genetically linked to their larger-scale relatives. But the validation of these scaling concepts remains an issue, considering the unreachable amount of information that would be necessary with regard to the complexity of natural fracture networks. Beyond the theoretical interest, a scaling law is a basic and necessary ingredient of Discrete-Fracture-Network (DFN) models that are used for many environmental and industrial applications (groundwater resources, the mining industry, assessment of the safety of deep waste disposal sites, ...). Indeed, such a function is necessary to assemble scattered data, taken at different scales, into a unified scaling model, and to interpolate fracture densities between observations. In this study, we discuss some important issues related to the scaling laws of DFN models: we first describe a complete theoretical and mathematical framework that takes account of both the fracture-size distribution and fracture clustering through scales (fractal dimension); we then review the scaling laws that have been obtained and discuss the ability of fracture datasets to really constrain the parameters of the DFN model; and finally we discuss the limits of scaling models.
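To make the power-law ingredient concrete, here is a small self-contained sketch (not taken from the paper) that draws fracture lengths from a density n(l) ∝ l^(−a) above a cutoff l_min and recovers the exponent by maximum likelihood; the exponent, cutoff and sample size are arbitrary choices.

```python
# Minimal sketch: inverse-transform sampling of a Pareto-tailed length distribution
# and a maximum-likelihood (Hill-type) estimate of its exponent.
import numpy as np

rng = np.random.default_rng(4)
a, l_min, n = 2.5, 1.0, 5000
u = rng.random(n)
lengths = l_min * (1.0 - u) ** (-1.0 / (a - 1.0))    # inverse CDF of the power-law tail

a_hat = 1.0 + n / np.sum(np.log(lengths / l_min))    # MLE of the exponent
print(f"true exponent {a}, estimated {a_hat:.2f}")
```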
Workplace injuries, safety climate and behaviors: application of an artificial neural network.
Abubakar, A Mohammed; Karadal, Himmet; Bayighomog, Steven W; Merdan, Ethem
2018-05-09
This article proposes and tests a model for the interaction effect of organizational safety climate and safety behaviors on workplace injuries. Using an artificial neural network and survey data from 306 metal casting industry employees in central Anatolia, we found that an organizational safety climate mitigates workplace injuries, and that safety behaviors reinforce the strength of the negative impact of the safety climate on workplace injuries. The results suggest a complex relationship between organizational safety climate, safety behavior and workplace injuries. Theoretical and practical implications are discussed in light of decreasing workplace injuries in the Anatolian metal casting industry.
Diaz, Ana Belen; Blandino, Ana; Webb, Colin; Caro, Ildefonso
2016-11-01
A simple kinetic model, with only three fitting parameters, for several enzyme productions in Petri dishes by solid-state fermentation is proposed in this paper; it may be a valuable tool for simulation of this type of process. Basically, the model is able to predict the time course of fungal enzyme production by solid-state fermentation on complex substrates, the maximum enzyme activity expected, and the time at which these maxima are reached. In this work, several solid-state fermentations were performed in Petri dishes, using four filamentous fungi grown on different agro-industrial residues and measuring xylanase, exo-polygalacturonase, cellulase and laccase activities over time. Regression coefficients after fitting the experimental data to the proposed model turned out to be quite high in all cases. In fact, these results are very interesting considering, on the one hand, the simplicity of the model and, on the other hand, that the enzyme activities correspond to different enzymes produced by different fungi on different substrates.
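The paper's three-parameter model is not reproduced in the abstract, so the hedged sketch below fits a generic three-parameter logistic curve E(t) = E_max / (1 + exp(-k·(t - t_m))) instead, which likewise yields the maximum activity and the time at which it is approached; the data points are invented for illustration.

```python
# Minimal sketch, assuming a generic three-parameter logistic production curve and
# made-up activity measurements (e.g. xylanase, in arbitrary units).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, e_max, k, t_m):
    return e_max / (1.0 + np.exp(-k * (t - t_m)))

t_obs = np.array([0, 24, 48, 72, 96, 120, 144], float)       # fermentation time, h
act   = np.array([0.1, 0.8, 3.5, 8.2, 11.0, 11.8, 12.0])     # measured activity, U/g

popt, _ = curve_fit(logistic, t_obs, act, p0=(12.0, 0.05, 60.0))
e_max, k, t_m = popt
print(f"E_max = {e_max:.1f} U/g, k = {k:.3f} 1/h, t_m = {t_m:.1f} h")
```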
Measuring the impact of final demand on global production system based on Markov process
NASA Astrophysics Data System (ADS)
Xing, Lizhi; Guan, Jun; Wu, Shan
2018-07-01
The input-output table is a comprehensive and detailed description of a national economic system, consisting of supply and demand information among various industrial sectors. Complex network theory, a framework for measuring the structure of complex systems, can depict the structural properties of social and economic systems and reveal the complicated relationships between their inner hierarchies and external macroeconomic functions. This paper measures the degree of globalization of industrial sectors on the global value chain. Firstly, it constructs inter-country input-output network models to reproduce the topological structure of the global economic system. Secondly, it treats the propagation of intermediate goods on the global value chain as a Markov process and introduces counting first passage betweenness to quantify the added processing amount when global final demand stimulates the production system. Thirdly, it analyzes the features of globalization at both the global and country-sector levels.
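As a toy illustration of treating inter-industry flows probabilistically (not the paper's counting first passage betweenness measure), the snippet below converts a hypothetical technical-coefficient matrix into total sector outputs driven by final demand via the Leontief-type inverse.

```python
import numpy as np

# Toy illustration (not the paper's exact measure): propagation of intermediate
# goods is driven by a technical-coefficient matrix A, and the total activity
# triggered by final demand over all production rounds follows from (I - A)^-1.
A = np.array([[0.10, 0.30, 0.05],     # hypothetical technical coefficients
              [0.20, 0.05, 0.25],
              [0.15, 0.10, 0.10]])

final_demand = np.array([100.0, 50.0, 80.0])   # hypothetical final demand

# Total output needed to satisfy final demand through all production rounds.
leontief_inverse = np.linalg.inv(np.eye(3) - A)
total_output = leontief_inverse @ final_demand
print("total output per sector:", np.round(total_output, 1))
```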
Status, Emerging Ideas and Future Directions of Turbulence Modeling Research in Aeronautics
NASA Technical Reports Server (NTRS)
Duraisamy, Karthik; Spalart, Philippe R.; Rumsey, Christopher L.
2017-01-01
In July 2017, a three-day Turbulence Modeling Symposium sponsored by the University of Michigan and NASA was held in Ann Arbor, Michigan. This meeting brought together nearly 90 experts from academia, government and industry, with good international participation, to discuss the state of the art in turbulence modeling, emerging ideas, and to wrestle with questions surrounding its future. Emphasis was placed on turbulence modeling in a predictive context in complex problems, rather than on turbulence theory or descriptive modeling. This report summarizes many of the questions, discussions, and conclusions from the symposium, and suggests immediate next steps.
NASA Astrophysics Data System (ADS)
Gavrishchaka, Valeriy V.; Kovbasinskaya, Maria; Monina, Maria
2008-11-01
Novelty detection is a very desirable additional feature of any practical classification or forecasting system. Novelty and rare patterns detection is the main objective in such applications as fault/abnormality discovery in complex technical and biological systems, fraud detection and risk management in financial and insurance industry. Although many interdisciplinary approaches for rare event modeling and novelty detection have been proposed, significant data incompleteness due to the nature of the problem makes it difficult to find a universal solution. Even more challenging and much less formalized problem is novelty detection in complex strategies and models where practical performance criteria are usually multi-objective and the best state-of-the-art solution is often not known due to the complexity of the task and/or proprietary nature of the application area. For example, it is much more difficult to detect a series of small insider trading or other illegal transactions mixed with valid operations and distributed over long time period according to a well-designed strategy than a single, large fraudulent transaction. Recently proposed boosting-based optimization was shown to be an effective generic tool for the discovery of stable multi-component strategies/models from the existing parsimonious base strategies/models in financial and other applications. Here we outline how the same framework can be used for novelty and fraud detection in complex strategies and models.
NASA Astrophysics Data System (ADS)
Hill, Ian; White, Toby; Owen, Sarah
2014-05-01
Extraction and processing of rock materials to produce aggregates is carried out at some 20,000 quarries across the EU. All stages of the processing and transport of hard and dense materials inevitably consume high levels of energy and have consequent significant carbon footprints. The FP7 project "the Energy Efficient Quarry" (EE-Quarry) has been addressing this problem and has devised strategies, supported by modelling software, to assist the quarrying industry to assess and optimise its energy use, and to minimise its carbon footprint. Aggregate quarries across Europe vary enormously in the scale of the quarrying operations, the nature of the worked mineral, and the processing to produce a final market product. Nevertheless most quarries involve most or all of a series of essential stages; deposit assessment, drilling and blasting, loading and hauling, and crushing and screening. The process of determining the energy-efficiency of each stage is complex, but is broadly understood in principle and there are numerous sources of information and guidance available in the literature and on-line. More complex still is the interaction between each of these stages. For example, using a little more energy in blasting to increase fragmentation may save much greater energy in later crushing and screening, but also generate more fines material which is discarded as waste and the embedded energy in this material is lost. Thus the calculation of the embedded energy in the waste material becomes an input to the determination of the blasting strategy. Such feedback loops abound in the overall quarry optimisation. The project has involved research and demonstration operations at a number of quarries distributed across Europe carried out by all partners in the EE-Quarry project, working in collaboration with many of the major quarrying companies operating in the EU. The EE-Quarry project is developing a sophisticated modelling tool, the "EE-Quarry Model" available to the quarrying industry on a web-based platform. This tool guides quarry managers and operators through the complex, multi-layered, iterative, process of assessing the energy efficiency of their own quarry operation. They are able to evaluate the optimisation of the energy-efficiency of the overall quarry through examining both the individual stages of processing, and the interactions between them. The project is also developing on-line distance learning modules designed for Continuous Professional Development (CPD) activities for staff across the quarrying industry in the EU and beyond. The presentation will describe development of the model, and the format and scope of the resulting software tool and its user-support available to the quarrying industry.
Development of a structured approach for decomposition of complex systems on a functional basis
NASA Astrophysics Data System (ADS)
Yildirim, Unal; Felician Campean, I.
2014-07-01
The purpose of this paper is to present the System State Flow Diagram (SSFD) as a structured and coherent methodology to decompose a complex system on a solution-independent functional basis. The paper starts by reviewing common function modelling frameworks in the literature and discusses practical requirements of the SSFD in the context of the current literature and current approaches in industry. The proposed methodology is illustrated through the analysis of a case study: design analysis of a generic Bread Toasting System (BTS).
NASA Astrophysics Data System (ADS)
Perconti, Philip; Bedair, Sarah S.; Bajaj, Jagmohan; Schuster, Jonathan; Reed, Meredith
2016-09-01
To increase Soldier readiness and enhance situational understanding in ever-changing and complex environments, there is a need for rapid development and deployment of Army technologies utilizing sensors, photonics, and electronics. Fundamental aspects of these technologies include the research and development of semiconductor materials and devices which are ubiquitous in numerous applications. Since many Army technologies are considered niche, there is a lack of significant industry investment in the fundamental research and understanding of semiconductor technologies relevant to the Army. To address this issue, the US Army Research Laboratory is establishing a Center for Semiconductor Materials and Device Modeling and seeks to leverage expertise and resources across academia, government and industry. Several key research areas—highlighted and addressed in this paper—have been identified by ARL and external partners and will be pursued in a collaborative fashion by this Center. This paper will also address the mechanisms by which the Center is being established and will operate.
Silva, Patrick J; Ramos, Kenneth S
2018-04-17
Innovation ecosystems tied to academic medical centers (AMCs) are inextricably linked to policy, practices, and infrastructure resulting from the passage of the Bayh-Dole Act in 1980. Bayh-Dole smoothed the way to patenting and licensing new drugs, and to some degree, medical devices and diagnostic reagents. Property rights under Bayh-Dole provided a significant incentive for industry investments in clinical trials, clinical validation, and industrial scale-up of products that advanced health care. Bayh-Dole amplified private investment in biotechnology drug development, and from the authors' perspective did not significantly interfere with the ability of AMCs to produce excellent peer-reviewed science. In today's policy environment, it is increasingly difficult to patent and license products based on the laws of nature, as the scope of patentability has been narrowed by case law and development of a suitable clinical and business case for the technology is increasingly a gating consideration for licensees. Consequently, fewer academic patents are commercially valuable. The role of technology transfer organizations in engaging industry partners has thus become increasingly complex. The partnering toolbox and the organizational mandate for commercialization must evolve toward novel collaborative models that exploit opportunities for future patent creation (early drug discovery), data exchange (precision medicine using big data), cohort assembly (clinical trials), and decision rule validation (clinical trials). These inputs all contribute to intellectual property rights, and their clinical exploitation manifests the commercialization of translational science. New collaboration models between AMCs and industry must be established to leverage the assets within AMCs that industry partners deem valuable.
Xin-Gang, Zhao; Yu-Zhuo, Zhang; Ling-Zhi, Ren; Yi, Zuo; Zhi-Gong, Wu
2017-10-01
Among the regulatory policies, feed-in tariffs (FIT) and renewable portfolio standards (RPS) are the most popular to promote the development of renewable energy power industry. They can significantly contribute to the expansion of domestic industrial activities in terms of sustainable energy. This paper uses system dynamics (SD) to establish models of long-term development of China's waste incineration power industry under FIT and RPS schemes, and provides a case study by using scenario analysis method. The model, on the one hand, not only clearly shows the complex logical relationship between the factors but also assesses policy effects of the two policy tools in the development of the industry. On the other hand, it provides a reference for scholars to study similar problems in different countries, thereby facilitating an understanding of waste incineration power's long-term sustainable development pattern under FIT and RPS schemes, and helping to provide references for policy-making institutions. The results show that in the perfect competitive market, the implementation of RPS can promote long-term and rapid development of China's waste incineration power industry given the constraints and actions of the mechanisms of RPS quota proportion, the TGC valid period, and fines, compared with FIT. At the end of the paper, policy implications are offered as references for the government. Copyright © 2017 Elsevier Ltd. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Bovine respiratory disease complex (BRDC) is the leading cause of economic loss in the U.S. cattle industry. BRDC likely results from simultaneous or sequential infections with multiple pathogens including both viruses and bacteria. Bovine viral diarrhea virus (BVDV) and bovine corona virus (BoCV...
Master Teachers: Making a Difference on the Edge of Chaos
ERIC Educational Resources Information Center
Chapin, Dexter
2008-01-01
The No Child Left Behind legislation, by legitimizing a stark, one-size-fits-all, industrial model of education, has denied the inherent complexity and richness of what teachers do. Discussing teaching in terms of Chaos Theory, Chapin explains that while excellent teaching may occur at the edge of chaos, it is not chaotic. There are patterns…
Three Applications of Automated Test Assembly within a User-Friendly Modeling Environment
ERIC Educational Resources Information Center
Cor, Ken; Alves, Cecilia; Gierl, Mark
2009-01-01
While linear programming is a common tool in business and industry, there have not been many applications in educational assessment and only a handful of individuals have been actively involved in conducting psychometric research in this area. Perhaps this is due, at least in part, to the complexity of existing software packages. This article…
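As a generic illustration of linear programming applied to automated test assembly (not the article's models), the sketch below solves an LP relaxation of the usual 0/1 item-selection problem with scipy; the item pool and constraints are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Generic sketch of automated test assembly as a linear program (LP relaxation
# of the usual 0/1 selection problem); the item pool and targets are made up.
rng = np.random.default_rng(2)
n_items = 40
difficulty = rng.uniform(-2, 2, n_items)
information = rng.uniform(0.2, 1.5, n_items)   # item information at the cut score

test_length = 10
# Maximize total information  <=>  minimize its negative.
c = -information
# Constraints: exactly `test_length` items; mean difficulty near zero.
A_eq = np.vstack([np.ones(n_items), difficulty])
b_eq = np.array([test_length, 0.0])
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * n_items, method="highs")

selected = np.argsort(res.x)[-test_length:]   # round the relaxed solution
print("items picked (relaxed):", sorted(selected.tolist()))
```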
NASA Astrophysics Data System (ADS)
Mu, Nan; Wang, Kun; Xie, Zexiao; Ren, Ping
2017-05-01
To realize online rapid measurement for complex workpieces, a flexible measurement system based on an articulated industrial robot with a structured light sensor mounted on the end-effector is developed. A method for calibrating the system parameters is proposed in which the hand-eye transformation parameters and the robot kinematic parameters are synthesized in the calibration process. An initial hand-eye calibration is first performed using a standard sphere as the calibration target. By applying the modified complete and parametrically continuous method, we establish a synthesized kinematic model that combines the initial hand-eye transformation and distal link parameters as a whole with the sensor coordinate system as the tool frame. According to the synthesized kinematic model, an error model is constructed based on spheres' center-to-center distance errors. Consequently, the error model parameters can be identified in a calibration experiment using a three-standard-sphere target. Furthermore, the redundancy of error model parameters is eliminated to ensure the accuracy and robustness of the parameter identification. Calibration and measurement experiments are carried out based on an ER3A-C60 robot. The experimental results show that the proposed calibration method enjoys high measurement accuracy, and this efficient and flexible system is suitable for online measurement in industrial scenes.
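A minimal sketch of the distance-error metric described above is given below; the transform, sphere centers and nominal distance are made-up placeholders, not calibration data from the paper.

```python
import numpy as np

# Sketch of the distance-error metric used for calibration: sphere centers are
# measured in the sensor frame at a robot pose, mapped to the base frame with
# the current kinematic/hand-eye estimate, and compared against the known
# center-to-center distance of the standard-sphere target.
# All numbers and transforms below are made-up placeholders.

def to_base(T_base_sensor, p_sensor):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return (T_base_sensor @ np.append(p_sensor, 1.0))[:3]

known_distance = 150.0  # mm, nominal distance between two reference spheres

# Hypothetical current estimate of the base<-sensor transform at one robot pose.
T = np.eye(4)
T[:3, 3] = [500.0, 120.0, 300.0]

center_a_sensor = np.array([10.0, 5.0, 400.0])
center_b_sensor = np.array([158.0, 9.0, 402.0])

d = np.linalg.norm(to_base(T, center_a_sensor) - to_base(T, center_b_sensor))
residual = d - known_distance
print(f"distance residual used in least-squares identification: {residual:.3f} mm")
```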
An IUR evolutionary game model on the patent cooperate of Shandong China
NASA Astrophysics Data System (ADS)
Liu, Mengmeng; Ma, Yinghong; Liu, Zhiyuan; You, Xuemei
2017-06-01
Organizations of industries and university & research institutes cooperate, on the basis of social contacts and trust, to meet their respective needs and to share complementary resources. From the perspective of complex networks, together with the patent data of Shandong province in China, a novel evolutionary game model on a patent cooperation network is presented. The two sides in the game model are industries and universities & research institutes, respectively. Cooperation is represented by a connection when a new patent is developed jointly by the two sides. The optimal strategy of the evolutionary game model is quantified by the average positive cooperation probability p̄ and the average payoff Ū. The feasibility of this game model is simulated with respect to parameters such as the knowledge spillover, the punishment, the development cost and the distribution coefficient of the benefit. The numerical simulations show that cooperative behaviors are affected by the variation of these parameters. The knowledge spillover displays different behaviors depending on whether the punishment is larger or smaller than the development cost. These results indicate that reasonable punishment would improve positive cooperation, and that appropriate punishment helps high-degree nodes cooperate positively with industries and universities & research institutes. An equitable plan for distributing cooperative profits is a half-and-half split between the two sides of the game.
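The following replicator-dynamics sketch conveys the flavor of such a two-population cooperation game; the payoff structure (benefit, cost, spillover, punishment) and the half-and-half split are illustrative assumptions rather than the paper's calibrated model.

```python
# Replicator-dynamics sketch for a two-population cooperation game between
# industry (I) and university/research institutes (U). The payoff parameters
# (benefit b, development cost c, knowledge spillover s, punishment f) and the
# half-and-half benefit split are illustrative assumptions.
b, c, s, f = 8.0, 3.0, 1.0, 2.0
share = 0.5                      # half-and-half distribution of the joint benefit

def payoffs(p_i, p_u):
    """Expected payoff of cooperating vs defecting for each side."""
    coop_i = p_u * (share * b - c) + (1 - p_u) * (-c + f)   # partner defects -> fine collected
    defc_i = p_u * s                                        # free-ride on spillover
    coop_u = p_i * (share * b - c) + (1 - p_i) * (-c + f)
    defc_u = p_i * s
    return coop_i, defc_i, coop_u, defc_u

p_i, p_u, dt = 0.3, 0.6, 0.05
for _ in range(2000):
    ci, di, cu, du = payoffs(p_i, p_u)
    p_i += dt * p_i * (1 - p_i) * (ci - di)
    p_u += dt * p_u * (1 - p_u) * (cu - du)
print(f"long-run cooperation probabilities: industry {p_i:.2f}, university {p_u:.2f}")
```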
Reliability Standards of Complex Engineering Systems
NASA Astrophysics Data System (ADS)
Galperin, E. M.; Zayko, V. A.; Gorshkalev, P. A.
2017-11-01
Production and manufacturing play an important role in modern society. Industrial production is nowadays characterized by extensive and complex communications between its parts, and the problem of preventing accidents at a large industrial enterprise becomes especially relevant. In these circumstances, the reliability of enterprise functioning is of particular importance: the potential damage caused by an accident at such an enterprise may lead to substantial material losses and, in some cases, even to a loss of human lives. In terms of their reliability, industrial facilities (objects) are divided into simple and complex. Simple objects are characterized by only two conditions: operable and non-operable. A complex object exists in more than two conditions, and the main characteristic here is the stability of its operation. This paper develops a reliability indicator combining the set-theory methodology and a state space method, both of which are widely used to analyze dynamically developing probability processes. The research also introduces a set of reliability indicators for complex technical systems.
Tan, Chao; Chen, Hui; Wang, Chao; Zhu, Wanping; Wu, Tong; Diao, Yuanbo
2013-03-15
Near and mid-infrared (NIR/MIR) spectroscopy techniques have gained great acceptance in the industry due to their multiple applications and versatility. However, successful application often depends heavily on the construction of accurate and stable calibration models. For this purpose, a simple multi-model fusion strategy is proposed. It is the combination of a Kohonen self-organizing map (KSOM), mutual information (MI) and partial least squares (PLS), and is therefore named KMICPLS. It works as follows: first, the original training set is fed into a KSOM for unsupervised clustering of samples, from which a series of training subsets are constructed. Thereafter, on each of the training subsets, an MI spectrum is calculated and only the variables with MI values higher than the mean value are retained, based on which a candidate PLS model is constructed. Finally, a fixed number of PLS models are selected to produce a consensus model. Two NIR/MIR spectral datasets from the brewing industry are used for experiments. The results confirm its superior performance over two reference algorithms, i.e., the conventional PLS and genetic algorithm-PLS (GAPLS). It can build more accurate and stable calibration models without increasing the complexity, and can be generalized to other NIR/MIR applications. Copyright © 2012 Elsevier B.V. All rights reserved.
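A rough sketch of the fusion idea, built from scikit-learn pieces, is shown below; KMeans is used only as a stand-in for the Kohonen map and the spectra are synthetic, so this is not the authors' KMICPLS implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.cross_decomposition import PLSRegression
from sklearn.feature_selection import mutual_info_regression

# Rough sketch of the fusion idea with scikit-learn building blocks. KMeans is
# a stand-in for the Kohonen self-organizing map, and the spectra/targets are
# synthetic; this is not the authors' KMICPLS code.
rng = np.random.default_rng(3)
X = rng.normal(size=(120, 200))                 # 120 "spectra", 200 variables
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=120)

# 1) Unsupervised grouping of calibration samples into training subsets.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

models = []
for k in range(4):
    Xk, yk = X[labels == k], y[labels == k]
    # 2) Keep only variables whose mutual information exceeds the mean MI.
    mi = mutual_info_regression(Xk, yk, random_state=0)
    keep = mi > mi.mean()
    # 3) Fit one candidate PLS model per subset on the retained variables.
    models.append((keep, PLSRegression(n_components=3).fit(Xk[:, keep], yk)))

# 4) Consensus prediction: average the candidate models' outputs.
x_new = X[:5]
preds = np.column_stack([m.predict(x_new[:, keep]).ravel() for keep, m in models])
print("consensus predictions:", np.round(preds.mean(axis=1), 2))
```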
Industrial Adoption of Model-Based Systems Engineering: Challenges and Strategies
NASA Astrophysics Data System (ADS)
Maheshwari, Apoorv
As design teams are becoming more globally integrated, one of the biggest challenges is to communicate efficiently across the team. The increasing complexity and multi-disciplinary nature of products is also making it difficult to keep track of all the information generated during the design process by these global team members. Systems engineers have identified Model-Based Systems Engineering (MBSE) as a possible solution, in which the emphasis is placed on the application of visual modeling methods and best practices to systems engineering (SE) activities right from the beginning of the conceptual design phases through to the end of the product lifecycle. Despite several advantages, there are multiple challenges restricting the adoption of MBSE by industry. We mainly consider the following two challenges: a) industry perceives MBSE merely as a diagramming tool and does not see much value in it; b) industrial adopters are skeptical about whether products developed using an MBSE approach will be accepted by regulatory bodies. To provide counter-evidence to the former challenge, we developed a generic framework for translation from an MBSE tool (Systems Modeling Language, SysML) to an analysis tool (Agent-Based Modeling, ABM). The translation is demonstrated using a simplified air traffic management problem and provides an example of potentially significant value: the ability to use MBSE representations directly in an analysis setting. For the latter challenge, we are developing a reference model that uses SysML to represent a generic infusion pump and the SE process for planning, developing, and obtaining regulatory approval of a medical device. This reference model demonstrates how regulatory requirements can be captured effectively through model-based representations. We will present another case study at the end, in which we apply the knowledge gained from both case studies to a UAV design problem.
Chusai, Chatinai; Manomaiphiboon, Kasemsan; Saiyasitpanich, Phirun; Thepanondh, Sarawut
2012-08-01
Map Ta Phut industrial area (MA) is the largest industrial complex in Thailand. There has been concern about many air pollutants over this area. Air quality management for the area is known to be difficult, due to lack of understanding of how emissions from different sources or sectors (e.g., industrial, power plant, transportation, and residential) contribute to air quality degradation in the area. In this study, a dispersion study of NO2 and SO2 was conducted using the AERMOD model. The area-specific emission inventories of NOx and SO2 were prepared, including both stack and nonstack sources, and divided into 11 emission groups. Annual simulations were performed for the year 2006. Modeled concentrations were evaluated with observations. Underestimation of both pollutants was found, and stack emission estimates were scaled to improve the modeled results before quantifying relative roles of individual emission groups to ambient concentration over four selected impacted areas (two are residential and the others are highly industrialized). Two concentration measures (i.e., annual average area-wide concentration or AC, and area-wide robust highest concentration or AR) were used to aggregately represent mean and high-end concentrations for each individual area, respectively. For AC-NO2, on-road mobile emissions were found to be the largest contributor in the two residential areas (36-38% of total AC-NO2), while petrochemical-industry emissions play the most important role in the two industrialized areas (34-51%). For AR-NO2, biomass burning has the most influence in all impacted areas (>90%) except for one residential area where on-road mobile is the largest (75%). For AC-SO2, the petrochemical industry contributes most in all impacted areas (38-56%). For AR-SO2, the results vary. Since the petrochemical industry was often identified as the major contributor despite not being the largest emitter, air quality workers should pay special attention to this emission group when managing air quality for the MA.
Can Industrial Physics Avoid Being Creatively Destroyed?
NASA Astrophysics Data System (ADS)
Hass, Kenneth C.
2004-03-01
Opportunities abound for physics and physicists to remain vital contributors to industrial innovation throughout the 21st century. The key questions are whether those trained in physics are sufficiently willing and flexible to continuously enhance their value to their companies by adapting to changing business priorities and whether business leaders are sufficiently enlightened to recognize and exploit the unique skills and creativity that physicists often provide. "Industrial physics" today is more diverse than ever, and answers to the above questions will vary with sector, company, and even individual physicists. Such heterogeneity creates new challenges for the physics community in general, which may need to undergo significant cultural change to maintain strong ties between physicists in industry, academia, and government. Insights from the emerging science of complex systems will be used to emphasize the importance of realistic mental models for the interactions between science and technology and the pathways from scientific advance to successful commercialization. Examples will be provided of the ongoing value of physics-based research in the auto industry and of the growing importance of interdisciplinary approaches to the technical needs of industry.
Model Recommended Values of Corporate Culture for Industrial Companies in Slovak Republic
NASA Astrophysics Data System (ADS)
Urbanovičová, Petra; Mikulášková, Justína; Čambál, Miloš
2017-09-01
The main objective of the paper is to describe a model of recommended corporate culture values that supports business performance for industrial companies operating in the Slovak Republic. This model was developed on the basis of research results within the STU Project to support young researchers entitled "Changing the potential of the company's success using the principles of spiral management and its impact on corporate culture". The current paper is a part of submitted VEGA project No.1/0348/17 "The impact of the coexistence of different generations of employees on the sustainable performance of organisations". This model will be the basis for defining corporate values and developing or changing corporate culture for companies operating on, or coming (from abroad) to, the Slovak market. The characteristic features of the value model are simplicity, complexity and applicability, and the model takes into account the current situation on the Slovak market. The values in this model carry different levels of significance, and each value is defined by specified principles.
On-line coating of glass with tin oxide by atmospheric pressure chemical vapor deposition.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allendorf, Mark D.; Sopko, J.F.; Houf, William G.
2006-11-01
Atmospheric pressure chemical vapor deposition (APCVD) of tin oxide is a very important manufacturing technique used in the production of low-emissivity glass. It is also the primary method used to provide wear-resistant coatings on glass containers. The complexity of these systems, which involve chemical reactions in both the gas phase and on the deposition surface, as well as complex fluid dynamics, makes process optimization and design of new coating reactors a very difficult task. In 2001 the U.S. Dept. of Energy Industrial Technologies Program Glass Industry of the Future Team funded a project to address the need for more accurate data concerning the tin oxide APCVD process. This report presents a case study of on-line APCVD using organometallic precursors, which are the primary reactants used in industrial coating processes. Research staff at Sandia National Laboratories in Livermore, CA, and the PPG Industries Glass Technology Center in Pittsburgh, PA collaborated to produce this work. In this report, we describe a detailed investigation of the factors controlling the growth of tin oxide films. The report begins with a discussion of the basic elements of the deposition chemistry, including gas-phase thermochemistry of tin species and mechanisms of chemical reactions involved in the decomposition of tin precursors. These results provide the basis for experimental investigations in which tin oxide growth rates were measured as a function of all major process variables. The experiments focused on growth from monobutyltintrichloride (MBTC) since this is one of the two primary precursors used industrially. There are almost no reliable growth-rate data available for this precursor. Robust models describing the growth rate as a function of these variables are derived from modeling of these data. Finally, the results are used to conduct computational fluid dynamic simulations of both pilot- and full-scale coating reactors. As a result, general conclusions are reached concerning the factors affecting the growth rate in on-line APCVD reactors. In addition, a substantial body of data was generated that can be used to model many different industrial tin oxide coating processes. These data include the most extensive compilation of thermochemistry for gas-phase tin-containing species as well as kinetic expressions describing tin oxide growth rates over a wide range of temperatures, pressures, and reactant concentrations.
An efficient formulation of robot arm dynamics for control and computer simulation
NASA Astrophysics Data System (ADS)
Lee, C. S. G.; Nigam, R.
This paper describes an efficient formulation of the dynamic equations of motion of industrial robots based on the Lagrange formulation of d'Alembert's principle. This formulation, as applied to a PUMA robot arm, results in a set of closed form second order differential equations with cross product terms. They are not as efficient in computation as those formulated by the Newton-Euler method, but provide a better analytical model for control analysis and computer simulation. Computational complexities of this dynamic model together with other models are tabulated for discussion.
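For readers unfamiliar with the Lagrangian closed form, the sketch below evaluates the textbook two-link planar arm inverse dynamics tau = M(q) qdd + c(q, qd) + g(q); it is a stand-in for, not a reproduction of, the PUMA model derived in the paper, and all physical parameters are made up.

```python
import numpy as np

# Textbook two-link planar arm inverse dynamics in the standard Lagrangian
# form tau = M(q) qdd + c(q, qd) + g(q). Illustrative stand-in only; the paper
# concerns a PUMA arm, and all parameters below are made up.
m1, m2 = 2.0, 1.5          # link masses
l1 = 0.4                   # length of link 1
lc1, lc2 = 0.2, 0.15       # distances to link centers of mass
I1, I2 = 0.02, 0.01        # link inertias about their centers of mass
g0 = 9.81

def inverse_dynamics(q, qd, qdd):
    q1, q2 = q
    c2, s2 = np.cos(q2), np.sin(q2)
    # Inertia matrix M(q)
    m11 = m1 * lc1**2 + m2 * (l1**2 + lc2**2 + 2 * l1 * lc2 * c2) + I1 + I2
    m12 = m2 * (lc2**2 + l1 * lc2 * c2) + I2
    m22 = m2 * lc2**2 + I2
    M = np.array([[m11, m12], [m12, m22]])
    # Coriolis/centrifugal vector c(q, qd) with the cross-product terms
    h = -m2 * l1 * lc2 * s2
    c = np.array([h * (2 * qd[0] * qd[1] + qd[1]**2), -h * qd[0]**2])
    # Gravity vector g(q)
    g = np.array([(m1 * lc1 + m2 * l1) * g0 * np.cos(q1) + m2 * lc2 * g0 * np.cos(q1 + q2),
                  m2 * lc2 * g0 * np.cos(q1 + q2)])
    return M @ qdd + c + g

tau = inverse_dynamics(q=np.array([0.3, 0.5]), qd=np.array([0.1, -0.2]), qdd=np.array([0.5, 0.2]))
print("joint torques [N*m]:", np.round(tau, 3))
```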
The Academic-Industrial Complexity: Failure to Launch.
Levin, Leonard A; Behar-Cohen, Francine
2017-12-01
The pharmaceutical industry has long known that ∼80% of the results of academic laboratories cannot be reproduced when repeated in industry laboratories. Yet academic investigators are typically unaware of this problem, which severely impedes the drug development process. This academic-industrial complication is not one of deception, but rather a complex issue related to how scientific research is carried out and translated in strikingly different enterprises. This Opinion describes the reasons for inconsistencies between academic and industrial laboratories and what can be done to repair this failure of translation. Copyright © 2017 Elsevier Ltd. All rights reserved.
The Role of the Food Industry in Obesity Prevention.
Binks, Martin
2016-06-01
Obesity is a complex disease of diverse etiology. Among the potential influences in the development of obesity, the food supply chain remains an important influence. We provide a conceptual overview related to the food industry's role in obesity prevention. We first discuss some limitations of current public health efforts. We then describe how a model that attends to personal autonomy in the context of supportive policy intervention can empower individuals in their efforts to navigate the food supply chain. We then provide an evidence informed overview of key areas where continued efforts to collaboratively engage the food industry, through solution-focused dialogue and action, have the potential to contribute to obesity prevention. While challenging, appropriately transparent, well-governed public-private partnerships have the demonstrated potential to benefit the communities we serve.
From the water wheel to turbines and hydroelectricity. Technological evolution and revolutions
NASA Astrophysics Data System (ADS)
Viollet, Pierre-Louis
2017-08-01
Since its appearance in the first century BC, the water wheel developed alongside growing pre-industrial activities and was at the origin of the industrial revolution in metallurgy, textile mills, and paper mills. In the nineteenth century, the water wheel became highly efficient. The reaction turbine appeared by 1825 and continued to undergo technological development; the impulse turbine, for high heads, appeared around 1880, and other turbines were later designed for low-head sites. After 1890, turbine development was associated with the use of hydropower to generate electricity, both for industrial activities and for the benefit of cities. A model of "one city + one plant" was followed in the twentieth century by more complex and efficient schemes as electrical interconnection developed, together with pumped plants for energy storage.
Shuai, Jianfei; Kim, Sunshin; Ryu, Hyeonsu; Park, Jinhyeon; Lee, Chae Kwan; Kim, Geun-Bae; Ultra, Venecio U; Yang, Wonho
2018-04-20
Studying human health in areas with industrial contamination is a serious and complex issue. In recent years, attention has increasingly focused on the health implications of large industrial complexes. A variety of potentially toxic chemicals are produced during manufacturing processes and activities in industrial complexes in South Korea. A large number of dyeing factories are clustered in the Daegu dyeing industrial complex (DDIC), and residents near the industrial complex can often be exposed to volatile organic compounds (VOCs). This study aimed to evaluate VOC levels in the ambient air of the DDIC, to assess the impact on human health risks, and to find more convincing evidence that these VOCs are emitted from the DDIC. According to deterministic risk assessment, inhalation was the most important route. Residential indoor, outdoor and personal exposure air VOCs were measured by passive samplers in the exposed area and in a control area in different seasons. Satisfaction with ambient environments and self-reported diseases were also obtained by questionnaire survey. The VOC concentrations in the exposed and control areas were compared by t-test, the relationships among the VOCs were tested by correlation, and the values of the hazard quotient (HQ) and lifetime cancer risk were estimated. The concentrations of the measured VOCs are presented, together with their variation with distance from the residential settings to the industrial complex site in the exposed area. The residential indoor, outdoor, and personal exposure concentrations of toluene, DMF and chloroform in the exposed area were significantly higher than the corresponding concentrations in the control area in both summer and autumn. Toluene, DMF, chloroform and MEK had significantly positive correlations with each other indoors, outdoors, and in personal exposure. The HQ for DMF exceeded 1, and the lifetime cancer risk of chloroform was greater than 10^-4 in the exposed area. The prevalence of respiratory, anaphylactic and cardiovascular diseases in the exposed area was significantly higher than in the control area. This study shows that adverse cancer and non-cancer health effects may occur due to VOCs emitted from the DDIC, and that risk management measures are needed. Moreover, this study provides a convenient preliminary method for characterizing pollutant sources.
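The screening formulas behind these findings are simple; the sketch below applies the standard hazard quotient (HQ = exposure concentration / reference concentration) and inhalation cancer risk (risk = inhalation unit risk x exposure concentration) with placeholder values that are not the study's measurements.

```python
# Sketch of the screening-level inhalation risk formulas used in studies like
# this one: hazard quotient HQ = EC / RfC and lifetime cancer risk = IUR * EC.
# The exposure concentrations and toxicity values below are placeholders, not
# the values measured or used in the study.
exposure_ug_m3 = {"DMF": 40.0, "chloroform": 2.0}      # hypothetical long-term ECs
rfc_ug_m3 = {"DMF": 30.0}                               # hypothetical reference concentration
iur_per_ug_m3 = {"chloroform": 2.3e-5}                  # hypothetical inhalation unit risk

hq_dmf = exposure_ug_m3["DMF"] / rfc_ug_m3["DMF"]
risk_chloroform = iur_per_ug_m3["chloroform"] * exposure_ug_m3["chloroform"]

print(f"HQ (DMF): {hq_dmf:.2f} -> concern if > 1")
print(f"lifetime cancer risk (chloroform): {risk_chloroform:.1e} -> concern if > 1e-4")
```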
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2012-01-01
This paper reviews the derivation of an equation for scaling response surface modeling experiments. The equation represents the smallest number of data points required to fit a linear regression polynomial so as to achieve certain specified model adequacy criteria. Specific criteria are proposed which simplify an otherwise rather complex equation, generating a practical rule of thumb for the minimum volume of data required to adequately fit a polynomial with a specified number of terms in the model. This equation and the simplified rule of thumb it produces can be applied to minimize the cost of wind tunnel testing.
Hierarchical analytical and simulation modelling of human-machine systems with interference
NASA Astrophysics Data System (ADS)
Braginsky, M. Ya; Tarakanov, D. V.; Tsapko, S. G.; Tsapko, I. V.; Baglaeva, E. A.
2017-01-01
The article considers the principles of building an analytical and simulation model of the human operator and of the industrial control system hardware and software. E-networks, an extension of Petri nets, are used as the mathematical apparatus. This approach allows complex parallel distributed processes in human-machine systems to be simulated. A structural and hierarchical approach is used as the building method for the mathematical model of the human operator: the upper level is represented by a logical dynamic model of decision making based on E-networks, while the lower level reflects the psychophysiological characteristics of the human operator.
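To make the Petri-net basis concrete, the toy below fires transitions on a plain place/transition net; E-networks extend this formalism, and the net itself is a made-up example, not the article's model.

```python
import numpy as np

# Minimal Petri-net firing sketch (plain place/transition semantics). E-networks
# extend this formalism; the net below is a made-up two-step "operator acts,
# controller responds" toy, not the article's model.
# Places: 0 = request pending, 1 = operator busy, 2 = action logged
pre = np.array([[1, 0],     # tokens consumed by transitions t0, t1 from each place
                [0, 1],
                [0, 0]])
post = np.array([[0, 0],    # tokens produced by transitions t0, t1 into each place
                 [1, 0],
                 [0, 1]])
marking = np.array([1, 0, 0])

def enabled(m, t):
    return np.all(m >= pre[:, t])

def fire(m, t):
    return m - pre[:, t] + post[:, t]

for t in (0, 1):
    if enabled(marking, t):
        marking = fire(marking, t)
print("final marking:", marking)   # expected [0, 0, 1]
```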
A constitutive law for dense granular flows.
Jop, Pierre; Forterre, Yoël; Pouliquen, Olivier
2006-06-08
A continuum description of granular flows would be of considerable help in predicting natural geophysical hazards or in designing industrial processes. However, the constitutive equations for dry granular flows, which govern how the material moves under shear, are still a matter of debate. One difficulty is that grains can behave like a solid (in a sand pile), a liquid (when poured from a silo) or a gas (when strongly agitated). For the two extreme regimes, constitutive equations have been proposed based on kinetic theory for collisional rapid flows, and soil mechanics for slow plastic flows. However, the intermediate dense regime, where the granular material flows like a liquid, still lacks a unified view and has motivated many studies over the past decade. The main characteristics of granular liquids are: a yield criterion (a critical shear stress below which flow is not possible) and a complex dependence on shear rate when flowing. In this sense, granular matter shares similarities with classical visco-plastic fluids such as Bingham fluids. Here we propose a new constitutive relation for dense granular flows, inspired by this analogy and recent numerical and experimental work. We then test our three-dimensional (3D) model through experiments on granular flows on a pile between rough sidewalls, in which a complex 3D flow pattern develops. We show that, without any fitting parameter, the model gives quantitative predictions for the flow shape and velocity profiles. Our results support the idea that a simple visco-plastic approach can quantitatively capture granular flow properties, and could serve as a basic tool for modelling more complex flows in geophysical or industrial applications.
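For reference, the constitutive relation commonly cited from this work is the mu(I) rheology; the form below is written from memory in standard notation (tau shear stress, P pressure, gamma-dot shear rate, d grain diameter, rho_s grain density) and should be checked against the original paper.

```latex
% mu(I) rheology commonly associated with this work (notation assumed):
\tau = \mu(I)\, P, \qquad
\mu(I) = \mu_s + \frac{\mu_2 - \mu_s}{I_0 / I + 1}, \qquad
I = \frac{|\dot{\gamma}|\, d}{\sqrt{P / \rho_s}}
```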
A multi-model approach to monitor emissions of CO2 and CO from an urban-industrial complex
NASA Astrophysics Data System (ADS)
Super, Ingrid; Denier van der Gon, Hugo A. C.; van der Molen, Michiel K.; Sterk, Hendrika A. M.; Hensen, Arjan; Peters, Wouter
2017-11-01
Monitoring urban-industrial emissions is often challenging because observations are scarce and regional atmospheric transport models are too coarse to represent the high spatiotemporal variability in the resulting concentrations. In this paper we apply a new combination of an Eulerian model (Weather Research and Forecast, WRF, with chemistry) and a Gaussian plume model (Operational Priority Substances - OPS). The modelled mixing ratios are compared to observed CO2 and CO mole fractions at four sites along a transect from an urban-industrial complex (Rotterdam, the Netherlands) towards rural conditions for October-December 2014. Urban plumes are well-mixed at our semi-urban location, making this location suited for an integrated emission estimate over the whole study area. The signals at our urban measurement site (with average enhancements of 11 ppm CO2 and 40 ppb CO over the baseline) are highly variable due to the presence of distinct source areas dominated by road traffic/residential heating emissions or industrial activities. This causes different emission signatures that are translated into a large variability in observed ΔCO : ΔCO2 ratios, which can be used to identify dominant source types. We find that WRF-Chem is able to represent synoptic variability in CO2 and CO (e.g. the median CO2 mixing ratio is 9.7 ppm, observed, against 8.8 ppm, modelled), but it fails to reproduce the hourly variability of daytime urban plumes at the urban site (R2 up to 0.05). For the urban site, adding a plume model to the model framework is beneficial to adequately represent plume transport especially from stack emissions. The explained variance in hourly, daytime CO2 enhancements from point source emissions increases from 30 % with WRF-Chem to 52 % with WRF-Chem in combination with the most detailed OPS simulation. The simulated variability in ΔCO : ΔCO2 ratios decreases drastically from 1.5 to 0.6 ppb ppm-1, which agrees better with the observed standard deviation of 0.4 ppb ppm-1. This is partly due to improved wind fields (increase in R2 of 0.10) but also due to improved point source representation (increase in R2 of 0.05) and dilution (increase in R2 of 0.07). Based on our analysis we conclude that a plume model with detailed and accurate dispersion parameters adds substantially to top-down monitoring of greenhouse gas emissions in urban environments with large point source contributions within a ˜ 10 km radius from the observation sites.
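The enhancement-ratio diagnostic used here is easy to reproduce; the sketch below subtracts an assumed baseline from invented hourly mole fractions and forms the dCO:dCO2 ratio used to fingerprint source types.

```python
import numpy as np

# Sketch of the enhancement-ratio diagnostic: subtract a baseline from the
# observed mole fractions and form the dCO:dCO2 ratio used to fingerprint
# source types. The hourly values below are invented for illustration.
co2_ppm = np.array([412.0, 418.5, 423.0, 415.2])
co_ppb = np.array([105.0, 150.0, 180.0, 120.0])
co2_baseline, co_baseline = 410.0, 100.0

d_co2 = co2_ppm - co2_baseline
d_co = co_ppb - co_baseline
ratio = d_co / d_co2                      # ppb CO per ppm CO2
print("dCO:dCO2 [ppb/ppm]:", np.round(ratio, 2))
```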
A modelling case study to evaluate control strategies for ozone reduction in Southwestern Spain
NASA Astrophysics Data System (ADS)
Castell, N.; Mantilla, E.; Salvador, R.; Stein, A. F.; Millán, M.
2009-09-01
Ozone is a strong oxidant and when certain concentrations are reached it has adverse effects on health, vegetation and materials. With the aim of protecting human health and ecosystems, European Directive 2008/50/EC establishes target values for ozone concentrations, to be achieved from 2010 onwards. In our study area, located in southwestern Spain, ozone levels regularly exceed the human health protection threshold defined in the European Directive. Indeed, this threshold was exceeded on 92 days in 2007, despite the fact that the Directive stipulates that it should not be exceeded on more than 25 days per calendar year averaged over three years. It is urgent, therefore, to reduce the current ozone levels, but because ozone is a secondary pollutant, this reduction must necessarily involve limiting the emission of its precursors, primarily nitrogen oxides (NOx) and volatile organic compounds (VOC). During the central months of the year, southwestern Spain is under strong insolation and weak synoptic forcing, promoting the development of sea breezes and mountain-induced winds and creating re-circulations of pollutants. The complex topography of the area induces the formation of vertical layers, into which the pollutants are injected and subjected to long distance transport and compensatory subsidence. The characteristics of these highly complex flows have important effects on the pollutant dispersion. In this study two ozone pollution episodes have been selected to assess the ozone response to reductions in NOx and VOC emissions from industry and traffic. The first corresponds to a typical summer episode, with the development of breezes in an anticyclonic situation with low gradient pressure and high temperatures, while the second episode presents a configuration characteristic of spring or early summer, with a smooth westerly flow and more moderate temperatures. Air pollution studies in complex terrain require the use of high-resolution models to resolve the complex structures of the local flows and their impact on emissions; nevertheless, these mesoscale systems are developed within the scope of a synoptic circulation, which also affects both the breeze development and the pollutant transport. In order to take the relationship between the different atmospheric scales into account, we used the CAMx photochemical model coupled with the MM5 meteorological model, both configured with a system of nested grids. The study domain covers an area of 28224 km2, with 2 km horizontal resolution and 18 vertical layers up to a height of 5 km with high resolution in the levels close to the ground. This paper assesses the impact over the hourly and 8-hourly maximum daily ozone concentrations of four reduction strategies in an area with complex terrain: (i) 25% reduction in VOC and NOx from industry and traffic, (ii) 50% reduction in NOx and VOC from the industry, (iii) 50% reduction in NOx and VOC from traffic, and (iv) 100% reduction in NOx and VOC from the petrochemical plant and the refinery. The study area has large industrial sources, such as a petroleum refinery, a petrochemical plant, several chemical complexes and co-generation power plants, among others. The study area includes the cities of Huelva (148,000 inhabitants), Seville (699,760 inhabitants) and Cadiz (127,200 inhabitants). The analyses presented in this work provide an assessment of the effectiveness of several strategies to reduce ozone pollution in different meteorological scenarios.
NASA Astrophysics Data System (ADS)
Armenio, Vincenzo; Fakhari, Ahmad; Petronio, Andrea; Padovan, Roberta; Pittaluga, Chiara; Caprino, Giovanni
2015-11-01
Massive flow separation is ubiquitous in industrial applications, governing drag and hydrodynamic noise. In spite of considerable efforts, its numerical prediction still represents a challenge for the CFD models used in engineering. Aside from commercial software, over recent years the open-source software OpenFOAM (OF) has emerged as a valid tool for the prediction of complex industrial flows. In the present work, we simulate two flows representative of a class of situations occurring in industrial problems: the flow around a sphere and that around a wall-mounted square cylinder at Re = 10000. We compare the performance of two different tools, namely OF and ANSYS CFX 15.0 (CFX), using different unstructured grids and turbulence models. The grids have been generated using SNAPPYHEXMESH and ANSYS ICEM CFD 15.0 with different near-wall resolutions. The codes have been run in RANS mode using the k-ɛ model (OF) and the SST k-ω model (CFX), with and without wall-layer models. OF has also been used in LES, WMLES and DES modes. Regarding the sphere, the RANS models were not able to capture separation, while good predictions of separation and of the stress distribution over the surface were obtained using LES, WMLES and DES. Results for the second test case are currently under analysis. Financial support from COSMO "cfd open source per opera mortta" PAR FSC 2007-2013, Friuli Venezia Giulia.
Pollution characterization of liquid waste of the factory complex Fertial (Arzew, Algeria).
Redouane, Fares; Mourad, Lounis
2016-03-01
Industrial development in Algeria has created a worrying situation for all socioeconomic stakeholders. Indeed, this economic growth has been marked in recent years by the establishment of factories and industrial plants that discharge liquid waste onto marine shorelines. These releases could destabilize the environmental balance in the coming years, hence the need to support the treatment of all sources of pollution. Remediation of such discharges requires several steps, from identifying the various pollutants to treating them. Therefore, the authors conducted this first work of characterization of industrial effluents generated by the mineral fertilizer factory complex Fertial (Arzew), and discussed the pollution load generated by this type of industry. This monitoring would establish a tool for reflection and decision support, developed by a management system capable of ensuring effective and sustainable management of effluents from the industrial activities of Fertial.
NASA Astrophysics Data System (ADS)
Aksenova, Olesya; Pachkina, Anna
2017-11-01
The article deals with the need to transform the educational process to meet the requirements of the modern mining industry: the cooperative development of new educational programs and the implementation of an educational process that takes modern manufacturability into account. The paper argues for introducing, into the training of mining professionals, the study of three-dimensional models of the surface technological complex, ore reserves and underground workings, as well as the creation of these models in different graphic editors and work with the information analysis model obtained on the basis of these three-dimensional models. It also covers the technological process of unmanned coal mining at the Polysaevskaya mine, controlled by information analysis models built on the basis of three-dimensional models of individual objects and of the technological process as a whole, which in turn requires staff able to use programs for three-dimensional positioning of miners and equipment in a global frame of reference.
NASA Astrophysics Data System (ADS)
Yondo, Raul; Andrés, Esther; Valero, Eusebio
2018-01-01
Full-scale aerodynamic wind tunnel testing, numerical simulation of high-dimensional (full-order) aerodynamic models and flight testing are some of the fundamental but complex steps in the various design phases of recent civil transport aircraft. Current aircraft aerodynamic designs have increased in complexity (multidisciplinary, multi-objective or multi-fidelity) and need to address the challenges posed by the nonlinearity of the objective functions and constraints, uncertainty quantification in aerodynamic problems, and restrained computational budgets. With the aim of reducing the computational burden and generating low-cost but accurate models that mimic full-order models at different values of the design variables, recent progress has seen the introduction of surrogate-based approaches into real-time and many-query analyses as rapid and cheaper-to-simulate models. In this paper, a comprehensive and state-of-the-art survey of common surrogate modeling techniques and surrogate-based optimization methods is given, with an emphasis on model selection and validation, dimensionality reduction, sensitivity analyses, constraint handling, and infill and stopping criteria. Benefits, drawbacks and comparative discussions of applying those methods are described. Furthermore, the paper familiarizes readers with surrogate models that have been successfully applied to the general field of fluid dynamics, but not yet in the aerospace industry. Additionally, the review revisits the most popular sampling strategies used in conducting physical and simulation-based experiments in aircraft aerodynamic design. Attractive or smart designs infrequently used in the field and discussions of advanced sampling methodologies are presented, to give a glimpse of the various efficient possibilities for sampling the parameter space a priori. Closing remarks focus on future perspectives, challenges and shortcomings associated with the use of surrogate models by aircraft industrial aerodynamicists, despite their increased interest among the research communities.
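A generic surrogate-modeling sketch (not taken from the survey) is given below: a Latin hypercube sample of a 2D design space, an inexpensive stand-in for the full-order model, and a kriging (Gaussian-process) surrogate that can then be queried cheaply, with its predictive uncertainty available to guide infill.

```python
import numpy as np
from scipy.stats.qmc import LatinHypercube
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Generic surrogate-modeling sketch (not from the survey): sample a 2D design
# space with a Latin hypercube, evaluate an expensive-model stand-in, and fit a
# kriging (Gaussian-process) surrogate that can be queried cheaply afterwards.
def expensive_model(x):
    """Cheap stand-in for a full-order aerodynamic evaluation."""
    return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1]) + 0.5 * x[:, 0]

sampler = LatinHypercube(d=2, seed=0)
X_train = sampler.random(n=40)                 # 40 design points in [0, 1]^2
y_train = expensive_model(X_train)

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True)
surrogate.fit(X_train, y_train)

X_query = sampler.random(n=5)
y_pred, y_std = surrogate.predict(X_query, return_std=True)
print("surrogate predictions:", np.round(y_pred, 3))
print("predictive std (guides infill sampling):", np.round(y_std, 3))
```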
Garg, Harish
2013-03-01
The main objective of the present paper is to propose a methodology for analyzing the behavior of complex repairable industrial systems. In real-life situations, it is difficult to find optimal design policies for MTBF (mean time between failures), MTTR (mean time to repair) and related costs by utilizing available resources and uncertain data. For this, an availability-cost optimization model has been constructed for determining the optimal design parameters that improve system design efficiency. The uncertainties in the data related to each component of the system are estimated with the help of fuzzy and statistical methodology in the form of triangular fuzzy numbers. Using these data, the various reliability parameters that affect system performance are obtained in the form of fuzzy membership functions by the proposed confidence-interval-based fuzzy Lambda-Tau (CIBFLT) methodology. The results computed by CIBFLT are compared with the existing fuzzy Lambda-Tau methodology. Sensitivity analysis of the system MTBF has also been addressed. The methodology is illustrated through a case study of the washing unit, a main part of the paper industry. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
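The idea of propagating triangular fuzzy MTBF and MTTR through the steady-state availability can be sketched with alpha-cut interval arithmetic, as below; the numbers are made up and the snippet illustrates the general fuzzy Lambda-Tau idea, not the CIBFLT method itself.

```python
# Sketch of propagating triangular fuzzy MTBF and MTTR through the steady-state
# availability A = MTBF / (MTBF + MTTR) with alpha-cut interval arithmetic.
# The triangular numbers are made up; this illustrates the idea behind fuzzy
# Lambda-Tau style analyses, not the paper's CIBFLT method.
mtbf = (90.0, 100.0, 115.0)    # triangular fuzzy number (low, modal, high), hours
mttr = (2.0, 3.0, 5.0)

def alpha_cut(tri, alpha):
    low, mode, high = tri
    return low + alpha * (mode - low), high - alpha * (high - mode)

for alpha in (0.0, 0.5, 1.0):
    b_lo, b_hi = alpha_cut(mtbf, alpha)
    r_lo, r_hi = alpha_cut(mttr, alpha)
    # Availability increases with MTBF and decreases with MTTR, so the
    # interval endpoints pair up as below.
    a_lo = b_lo / (b_lo + r_hi)
    a_hi = b_hi / (b_hi + r_lo)
    print(f"alpha={alpha:.1f}: availability in [{a_lo:.4f}, {a_hi:.4f}]")
```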
The new medical-industrial complex.
Relman, A S
1980-10-23
The most important health-care development of the day is the recent, relatively unheralded rise of a huge new industry that supplies health-care services for profit. Proprietary hospitals and nursing homes, diagnostic laboratories, home-care and emergency-room services, hemodialysis, and a wide variety of other services produced a gross income to this industry last year of about $35 billion to $40 billion. This new "medical-industrial complex" may be more efficient than its nonprofit competition, but it creates the problems of overuse and fragmentation of services, overemphasis on technology, and "cream-skimming," and it may also exercise undue influence on national health policy. In this medical market, physicians must act as discerning purchasing agents for their patients and therefore should have no conflicting financial interests. Closer attention from the public and the profession, and careful study, are necessary to ensure that the "medical-industrial complex" puts the interests of the public before those of its stockholders.
Switching and optimizing control for coal flotation process based on a hybrid model
Dong, Zhiyong; Wang, Ranfeng; Fan, Minqiang; Fu, Xiang
2017-01-01
Flotation is an important part of coal preparation, and the flotation column is widely applied as efficient flotation equipment. This process is complex and affected by many factors, with the froth depth and reagent dosage being two of the most important and frequently manipulated variables. This paper proposes a new method of switching and optimizing control for the coal flotation process. A hybrid model is built and evaluated using industrial data. First, wavelet analysis and principal component analysis (PCA) are applied for signal pre-processing. Second, a control model for optimizing the set point of the froth depth is constructed based on fuzzy control, and a control model is designed to optimize the reagent dosages based on expert system. Finally, the least squares-support vector machine (LS-SVM) is used to identify the operating conditions of the flotation process and to select one of the two models (froth depth or reagent dosage) for subsequent operation according to the condition parameters. The hybrid model is developed and evaluated on an industrial coal flotation column and exhibits satisfactory performance. PMID:29040305
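A rough sketch of the switching logic is shown below, using scikit-learn's SVC as a stand-in for LS-SVM to pick which control model acts next; the condition features and training data are invented.

```python
import numpy as np
from sklearn.svm import SVC

# Rough sketch of the switching logic: a classifier (here sklearn's SVC as a
# stand-in for LS-SVM) maps condition parameters (e.g. feed ash, feed rate) to
# the control model that should act next. Training data and features are invented.
rng = np.random.default_rng(4)
X = rng.normal(size=(200, 2))                      # condition parameters
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # 0 = adjust froth depth, 1 = adjust reagent

classifier = SVC(kernel="rbf", gamma="scale").fit(X, y)

def control_step(condition):
    mode = classifier.predict(condition.reshape(1, -1))[0]
    if mode == 0:
        return "run fuzzy controller: update froth-depth set point"
    return "run expert-system controller: update reagent dosages"

print(control_step(np.array([0.8, -0.1])))
```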
Wu, Xiaohui; Yang, Yang; Wu, Gaoming; Mao, Juan; Zhou, Tao
2016-01-01
Applications of activated sludge models (ASM) in simulating industrial biological wastewater treatment plants (WWTPs) are still difficult due to refractory and complex components in influents as well as diversity in activated sludges. In this study, an ASM3 modeling study was conducted to simulate and optimize a practical coking wastewater treatment plant (CWTP). First, respirometric characterizations of the coking wastewater and CWTP biomasses were conducted to determine the specific kinetic and stoichiometric model parameters for the consecutive aeration-anoxic-aeration (O-A/O) biological process. All ASM3 parameters were then estimated and calibrated through cross-validation against dynamic model simulations. Consequently, an ASM3 model was successfully established to accurately simulate the CWTP performance in removing COD and NH4-N. An optimized CWTP operating condition was proposed, reducing the operating cost from 6.2 to 5.5 €/m³ of wastewater. This study is expected to provide a useful reference for mathematical simulations of practical industrial WWTPs. Copyright © 2015 Elsevier Ltd. All rights reserved.
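The calibration workflow (simulate a kinetic model, compare with measurements, adjust parameters) can be sketched with a heavily simplified single-substrate stand-in for ASM3, which in reality has many more state variables and processes. The Monod-type rate law, biomass value and COD "measurements" below are invented and serve only to show the fitting mechanics.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Hypothetical measured COD (mg/L) over time (h) in an aeration basin
t_obs = np.array([0, 2, 4, 8, 12, 24.])
cod_obs = np.array([3000, 2300, 1800, 1150, 800, 350.])
X = 4000.0  # assumed constant biomass concentration, mg/L

def cod_model(params, t):
    mu_max, Ks = params
    rhs = lambda t, S: -mu_max * X * S / (Ks + S)   # simple Monod-type removal
    sol = solve_ivp(rhs, (t[0], t[-1]), [cod_obs[0]], t_eval=t)
    return sol.y[0]

res = least_squares(lambda p: cod_model(p, t_obs) - cod_obs,
                    x0=[1e-4, 500.0], bounds=([1e-6, 1.0], [1e-2, 5000.0]))
print("fitted mu_max, Ks:", res.x)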
Fleetwood, Gill; Chlebus, Magda; Coenen, Joachim; Dudoignon, Nicolas; Lecerf, Catherine; Maisonneuve, Catherine; Robinson, Sally
2015-01-01
Animal research together with other investigational methods (computer modeling, in vitro tests, etc) remains an indispensable part of the pharmaceutical research and development process. The European pharmaceutical industry recognizes the responsibilities inherent in animal research and is committed to applying and enhancing 3Rs principles. New nonsentient, ex vivo, and in vitro methods are developed every day and contribute to reducing and, in some instances, replacing in vivo studies. Their utility is however limited by the extent of our current knowledge and understanding of complex biological systems. Until validated alternative ways to model these complex interactions become available, animals remain indispensable in research and safety testing. In the interim, scientists continue to look for ways to reduce the number of animals needed to obtain valid results, refine experimental techniques to enhance animal welfare, and replace animals with other research methods whenever feasible. As research goals foster increasing cross-sector and international collaboration, momentum is growing to enhance and coordinate scientific innovation globally—beyond a single company, stakeholder group, sector, region, or country. The implementation of 3Rs strategies can be viewed as an integral part of this continuously evolving science, demonstrating the link between science and welfare, benefiting both the development of new medicines and animal welfare. This goal is one of the key objectives of the Research and Animal Welfare working group of the European Federation of Pharmaceutical Industries and Associations. PMID:25836966
Salmani-Ghabeshi, S; Palomo-Marín, M R; Bernalte, E; Rueda-Holgado, F; Miró-Rodríguez, C; Cereceda-Balic, F; Fadic, X; Vidal, V; Funes, M; Pinilla-Gil, E
2016-11-01
The Punchuncaví Valley in central Chile, heavily affected by a range of anthropogenic emissions from a localized industrial complex, has been studied as a model environment for evaluating the spatial gradient of human health risk, which is mainly caused by trace elemental pollutants in soil. Soil elemental profiles in 121 samples from five selected locations representing different degrees of impact from the industrial source were used for human risk estimation. Cumulative non-carcinogenic hazard indexes for children above 1 (max 4.4, min 1.5), decreasing with distance to the source, were found in the study area, with ingestion being the most relevant exposure pathway. The significance of health risk differences within the study area was confirmed by statistical analysis (ANOVA and HCA) of individual hazard index values at the five sampling locations. Arsenic (As) was the dominant factor causing unacceptable carcinogenic risk levels for children (>10⁻⁴) at the two sampling locations closest to the industrial complex, whereas the risk was within the tolerable range (10⁻⁶-10⁻⁴) for children and adults at the remaining sampling locations in the study area. Furthermore, we assessed gamma-ray external hazard indexes and annual effective dose rates from the natural radioactivity levels (²²⁶Ra, ²³²Th and ⁴⁰K) in the surface soils of the study area. The highest average specific activities of ²³²Th (31 Bq kg⁻¹), ⁴⁰K (615 Bq kg⁻¹) and ²²⁶Ra (25 Bq kg⁻¹) are lower than the limits recommended by the OECD, so no significant radioactive risk was detected within the study area. In addition, no significant variability of radioactive risk was observed among sampling locations. Copyright © 2016 Elsevier Ltd. All rights reserved.
USDA-ARS's Scientific Manuscript database
Feed efficiency (FE), characterized as the ability to convert feed nutrients into saleable milk or meat, directly affects the profitability of dairy production and is of increasing economic importance in the dairy industry. We conjecture that FE is a complex trait whose variation and relationships or pa...
Complex dynamics and empirical evidence (Invited Paper)
NASA Astrophysics Data System (ADS)
Delli Gatti, Domenico; Gaffeo, Edoardo; Giulioni, Gianfranco; Gallegati, Mauro; Kirman, Alan; Palestrini, Antonio; Russo, Alberto
2005-05-01
Standard macroeconomics, based on a reductionist approach centered on the representative agent, is badly equipped to explain the empirical evidence, where heterogeneity and industrial dynamics are the rule. In this paper we show that a simple agent-based model of heterogeneous, financially fragile agents is able to replicate a large number of scaling-type stylized facts with a remarkable degree of statistical precision.
NASA Astrophysics Data System (ADS)
Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen
2018-01-01
Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by selectively fusing valuable information from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven model of industrial process parameters from mechanical vibration and acoustic frequency spectra, based on the selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert's cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model for each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine the outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing selective information fusion based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
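One plausible reading of the adaptive-weighted-fusion step is sketched below: each selected sub-model is weighted by the inverse of its validation error, and the weighted outputs are summed. This is a generic illustration, not the paper's exact weighting rule; the RMSE values and sub-model predictions are hypothetical.

import numpy as np

def awf_weights(val_errors):
    """Adaptive weighted fusion: weight each selected sub-model by its inverse validation error."""
    inv = 1.0 / np.asarray(val_errors, dtype=float)
    return inv / inv.sum()

# hypothetical validation RMSEs of three selected inside-layer SEN sub-models
rmse = [0.8, 1.2, 2.0]
w = awf_weights(rmse)

# hypothetical predictions of the three sub-models for four new samples
preds = np.array([[10.2, 11.0,  9.8],
                  [12.1, 12.5, 13.0],
                  [ 9.7, 10.1, 10.4],
                  [11.3, 11.0, 11.9]])
print("weights:", np.round(w, 3))
print("fused predictions:", preds @ w)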
Adsorption of saturated fatty acid in urea complexation: Kinetics and equilibrium studies
NASA Astrophysics Data System (ADS)
Setyawardhani, Dwi Ardiana; Sulistyo, Hary; Sediawan, Wahyudi Budi; Fahrurrozi, Mohammad
2018-02-01
Urea complexation is a fractionation process for concentrating poly-unsaturated fatty acids (PUFAs) from vegetable oils or animal fats. For process design and optimization in commercial industries, it is necessary to provide kinetics and equilibrium data. Urea inclusion compounds (UICs), the product, are a unique complex form in which one molecule (the guest) is enclosed within another molecule (the host). In urea complexation, the guest-host bonding exists between saturated fatty acids (SFAs) and crystalline urea. This research treated the complexation as analogous to an adsorption process. A batch adsorption process was developed to obtain the experimental data. The ethanolic urea solution was mixed with SFA at various compositions and adsorption times. The mixture was heated until it formed a homogeneous, clear solution, then cooled very slowly until the first crystals appeared. Adsorption times for the kinetic data were measured from the moment the crystals formed. The temperature was kept constant at room temperature. The experimental data were analyzed with adsorption kinetics and equilibrium models. A high concentration of saturated fatty acid (SFA) was used to determine the adsorption kinetics and equilibrium parameters. Kinetic data were examined with pseudo-first-order, pseudo-second-order and intra-particle diffusion models. Linear, Freundlich and Langmuir isotherms were used to study the equilibrium model of this adsorption. The experimental data showed that SFA adsorption in urea crystals followed the pseudo-second-order model. The compatibility of the data with the Langmuir isotherm indicated that urea complexation is a monolayer adsorption.
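As a concrete illustration of the models named above, the sketch below fits the integrated pseudo-second-order kinetic equation, q(t) = k2*qe^2*t / (1 + k2*qe*t), and the Langmuir isotherm, qe = qmax*KL*Ce / (1 + KL*Ce), to invented data; the numbers are placeholders, not the study's measurements.

import numpy as np
from scipy.optimize import curve_fit

# hypothetical kinetic data: adsorbed SFA (mg per g urea) versus time (min)
t = np.array([5, 10, 20, 40, 60, 120.])
qt = np.array([120, 180, 240, 290, 310, 330.])

def pseudo_second_order(t, qe, k2):
    # integrated pseudo-second-order form: q(t) = k2*qe^2*t / (1 + k2*qe*t)
    return k2 * qe**2 * t / (1 + k2 * qe * t)

(qe, k2), _ = curve_fit(pseudo_second_order, t, qt, p0=[350, 1e-3])
print(f"qe = {qe:.1f} mg/g, k2 = {k2:.2e} g/(mg*min)")

# hypothetical equilibrium data: adsorbed amount vs. equilibrium concentration
Ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0])      # g/L
qe_obs = np.array([150, 220, 290, 340, 370.])  # mg/g

def langmuir(Ce, qmax, KL):
    return qmax * KL * Ce / (1 + KL * Ce)

(qmax, KL), _ = curve_fit(langmuir, Ce, qe_obs, p0=[400, 1.0])
print(f"qmax = {qmax:.0f} mg/g, KL = {KL:.2f} L/g")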
Meta-control of combustion performance with a data mining approach
NASA Astrophysics Data System (ADS)
Song, Zhe
Large-scale combustion processes are complex and pose challenges for optimizing their performance. Traditional approaches based on thermodynamics are limited in finding optimal operating regions due to the time-varying nature of the process. Recent advances in information technology enable people to collect large volumes of process data easily and continuously. The collected process data contain rich information about the process and, to some extent, represent a digital copy of the process over time. Although large volumes of data exist in industrial combustion processes, they are not fully utilized to the level where the process can be optimized. Data mining is an emerging science which finds patterns or models in large data sets. It has found many successful applications in business marketing, medical and manufacturing domains. The focus of this dissertation is on applying data mining to industrial combustion processes, and ultimately optimizing the combustion performance. However, the philosophy, methods and frameworks discussed in this research can also be applied to other industrial processes. Optimizing an industrial combustion process has two major challenges. One is that the underlying process model changes over time, making it nontrivial to obtain an accurate process model. The other is that a high-fidelity process model is usually highly nonlinear, so solving the optimization problem needs efficient heuristics. This dissertation is set to solve these two major challenges. The major contribution of this 4-year research is a data-driven solution for optimizing the combustion process, in which the process model or knowledge is identified from the process data, and optimization is then executed by evolutionary algorithms to search for optimal operating regions.
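The two ingredients named above, a data-driven process model and an evolutionary search over it, can be illustrated with a toy sketch: a hypothetical surrogate of combustion efficiency over two normalized manipulated variables is searched with a simple evolution strategy. Neither the surrogate nor the algorithm is taken from the dissertation; both are stand-ins.

import numpy as np

rng = np.random.default_rng(1)

def surrogate_efficiency(x):
    """Hypothetical data-driven surrogate of combustion efficiency versus two
    normalized manipulated variables (e.g., air/fuel ratio and burner tilt)."""
    a, b = x
    return -(a - 0.6) ** 2 - 2.0 * (b - 0.3) ** 2 + 0.05 * np.sin(15 * a)

def evolve(f, bounds, pop=30, gens=60, sigma=0.1):
    lo, hi = np.array(bounds).T
    X = rng.uniform(lo, hi, size=(pop, len(lo)))
    for _ in range(gens):
        fit = np.array([f(x) for x in X])
        parents = X[np.argsort(fit)[-pop // 2:]]          # keep the best half
        children = parents + rng.normal(0, sigma, parents.shape)
        X = np.clip(np.vstack([parents, children]), lo, hi)
    best = max(X, key=f)
    return best, f(best)

best, val = evolve(surrogate_efficiency, bounds=[(0, 1), (0, 1)])
print("best operating point:", np.round(best, 3), "surrogate value:", round(val, 4))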
SOI technology for power management in automotive and industrial applications
NASA Astrophysics Data System (ADS)
Stork, Johannes M. C.; Hosey, George P.
2017-02-01
Semiconductor on Insulator (SOI) technology offers an assortment of opportunities for chip manufacturers in the Power Management market. Recent advances in the automotive and industrial markets, along with emerging features, the increasing use of sensors, and the ever-expanding "Internet of Things" (IoT) are providing for continued growth in these markets while also driving more complex solutions. The potential benefits of SOI include the ability to place both high-voltage and low-voltage devices on a single chip, saving space and cost, simplifying designs and models, and improving performance, thereby cutting development costs and improving time to market. SOI also offers novel approaches to long-standing technologies.
Structure and dynamics of microbe-exuded polymers and their interactions with calcite surfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cygan, Randall Timothy; Mitchell, Ralph; Perry, Thomas D.
2005-12-01
Cation binding by polysaccharides is observed in many environments and is important for predictive environmental modeling, and numerous industrial and food technology applications. The complexities of these organo-cation interactions are well suited to predictive molecular modeling studies for investigating the roles of conformation and configuration of polysaccharides on cation binding. In this study, alginic acid was chosen as a model polymer and representative disaccharide and polysaccharide subunits were modeled. The ability of disaccharide subunits to bind calcium and to associate with the surface of calcite was investigated. The findings were extended to modeling polymer interactions with calcium ions.
NASA Astrophysics Data System (ADS)
Gurnon, Amanda Kate
The complex, nonlinear flow behavior of soft materials transcends industrial applications, smart material design and non-equilibrium thermodynamics. A long-standing, fundamental challenge in soft-matter science is establishing a quantitative connection between the deformation field, the local microstructure and the macroscopic dynamic flow properties, i.e., the rheology. Soft materials are widely used in consumer products and industrial processes including energy recovery, surfactants for personal healthcare (e.g. soap and shampoo), coatings, plastics, drug delivery, medical devices and therapeutics. Oftentimes, these materials are processed by, used during, or exposed to non-equilibrium conditions for which the transient response of the complex fluid is critical. As such, designing new dynamic experiments is imperative to testing these materials and further developing micromechanical models to predict their transient response. Two of the most common classes of these soft materials stand as the focus of the present research: solutions of polymer-like micelles (PLMs, also known as wormlike micelles, WLMs) and concentrated colloidal suspensions. In addition to their varied applications, these two classes of soft materials are also governed by different physics. In contrast to the shear-thinning behavior of the WLMs at high shear rates, the near hard-sphere colloidal suspensions are known to display increases, sometimes quite substantial, in viscosity (known as shear thickening). The stress response of these complex fluids derives from the shear-induced microstructure, thus measurements of the microstructure under flow are critical for understanding the mechanisms underlying the complex, nonlinear rheology of these fluids. A popular micromechanical model is reframed from its original derivation for predicting steady shear rheology of polymers and WLMs to be applicable to weakly nonlinear oscillatory shear flow. The validity, utility and limits of this constitutive model are tested by comparison with experiments on model WLM solutions. Further comparisons to the nonlinear oscillatory shear responses measured from colloidal suspensions establish this analysis as a promising, quantitative method for understanding the underlying mechanisms responsible for the nonlinear dynamic response of complex fluids. A new experimental technique is developed to measure the microstructure of complex fluids during steady and transient shear flow using small-angle neutron scattering (SANS). The flow-SANS experimental method is now available to the broader user communities at the NIST Center for Neutron Research, Gaithersburg, MD and the Institut Laue-Langevin, Grenoble, France. Using this new method, a model shear-banding WLM solution is interrogated under steady and oscillatory shear. For the first time, the flow-SANS methods identify new metastable states for shear-banding WLM solutions, thus establishing the method as capable of probing states not accessible using traditional steady or linear oscillatory shear methods. The flow-induced three-dimensional microstructure of a colloidal suspension under steady and dynamic oscillatory shear is also measured using these rheo- and flow-SANS methods. A new structure state is identified in the shear-thickening regime that proves critical for defining the "hydrocluster" microstructure state of the suspension that is responsible for shear thickening.
For both the suspensions and the WLM solutions, stress-SANS rules applied to the measured microstructures define the individual stress components arising separately from conservative and hydrodynamic forces, and these are compared with the macroscopic rheology. Analysis of these results defines the crucial length- and time-scales of the transient microstructure response. The novel dynamic microstructural measurements presented in this dissertation provide new insights into the complexities of shear thickening and shear banding flow phenomena, which are effects observed more broadly across many different types of soft materials. Consequently, the microstructure-rheology property relationships developed for these two classes of complex fluids will aid in the testing and advancement of micromechanical constitutive model development, smart material design, industrial processing and fundamental non-equilibrium thermodynamic research of a broad range of soft materials.
Lanthanide complexes as luminogenic probes to measure sulfide levels in industrial samples.
Thorson, Megan K; Ung, Phuc; Leaver, Franklin M; Corbin, Teresa S; Tuck, Kellie L; Graham, Bim; Barrios, Amy M
2015-10-08
A series of lanthanide-based, azide-appended complexes were investigated as hydrogen sulfide-sensitive probes. Europium complex 1 and Tb complex 3 both displayed a sulfide-dependent increase in luminescence, while Tb complex 2 displayed a decrease in luminescence upon exposure to NaHS. The utility of the complexes for monitoring sulfide levels in industrial oil and water samples was investigated. Complex 3 provided a sensitive measure of sulfide levels in petrochemical water samples (detection limit ∼ 250 nM), while complex 1 was capable of monitoring μM levels of sulfide in partially refined crude oil. Copyright © 2015 Elsevier B.V. All rights reserved.
Harnsilawat, Thepkunya; Pongsawatmanit, Rungnaphar; McClements, David J
2006-07-26
The potential of utilizing interfacial complexes, formed through the electrostatic interactions of proteins and polysaccharides at oil-water interfaces, to stabilize model beverage cloud emulsions has been examined. These interfacial complexes were formed by mixing charged polysaccharides with oil-in-water emulsions containing oppositely charged protein-coated oil droplets. Model beverage emulsions were prepared that consisted of 0.1 wt % corn oil droplets coated by beta-lactoglobulin (beta-Lg), beta-Lg/alginate, beta-Lg/iota-carrageenan, or beta-Lg/gum arabic interfacial layers (pH 3 or 4). Stable emulsions were formed when the polysaccharide concentration was sufficient to saturate the protein-coated droplets. The emulsions were subjected to variations in pH (from 3 to 7), ionic strength (from 0 to 250 mM NaCl), and thermal processing (30 or 90 degrees C), and the influence on their stability was determined. The emulsions containing alginate and carrageenan had the best stability to ionic strength and thermal processing. This study shows that the controlled formation of protein-polysaccharide complexes at droplet surfaces may be used to produce stable beverage emulsions, which may have important implications for industrial applications.
Efficient evaluation of wireless real-time control networks.
Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon
2015-02-11
In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multi-hop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology, which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated in the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.
Rodríguez-Amigo, Beatriz; Delcanale, Pietro; Rotger, Gabriel; Juárez-Jiménez, Jordi; Abbruzzetti, Stefania; Summer, Andrea; Agut, Montserrat; Luque, F Javier; Nonell, Santi; Viappiani, Cristiano
2015-01-01
Using a combination of molecular modeling and spectroscopic experiments, the naturally occurring, pharmacologically active hypericin compound is shown to form a stable complex with the dimeric form of β-lactoglobulin (β-LG). Binding is predicted to occur at the narrowest cleft found at the interface between monomers in the dimeric β-LG. The complex is able to preserve the fluorescence and singlet oxygen photosensitizing properties of the dye. The equilibrium constant for hypericin binding has been determined as Ka = 1.40 ± 0.07 µM⁻¹, equivalent to a dissociation constant Kd = 0.71 ± 0.03 µM. The complex is active against Staphylococcus aureus bacteria. Overall, the results are encouraging for pursuing the potential application of the complex between hypericin and β-LG as a nanodevice with bactericidal properties for disinfection. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Evaluation of an urban vegetative canopy scheme and impact on plume dispersion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Matthew A; Williams, Michael D; Zajic, Dragan
2009-01-01
The Quick Urban and Industrial Complex (QUIC) atmospheric dispersion modeling system attempts to fill an important gap between the fast but nonbuilding-aware Gaussian plume models and the building-aware but slow computational fluid dynamics (CFD) models. While Gaussian models have the ability to give answers quickly to emergency responders, they are unlikely to be able to adequately account for the effects of building-induced complex flow patterns on the near-source dispersion of contaminants. QUIC uses a diagnostic mass-consistent empirical wind model called QUIC-URB that is based on the methodology of Rockle (1990) (see also Kaplan and Dinar 1996). In this approach, the recirculation zones that form around and between buildings are inserted into the flow using empirical parameterizations, and then the wind field is forced to be mass consistent. Although not as accurate as CFD codes, this approach is several orders of magnitude faster and accounts for the bulk effects of buildings.
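The "forced to be mass consistent" step can be illustrated in two dimensions: solve a Poisson equation for a Lagrange-multiplier field whose gradient removes the divergence of the initial wind guess. The sketch below is a simplified 2D stand-in, not QUIC-URB itself; QUIC-URB works in 3D and first inserts the empirical building recirculation zones, which are not reproduced here. The grid, "building" and wind values are invented.

import numpy as np

def fdiv(u, v):
    """Forward-difference divergence on a unit grid (zero on the last row/column)."""
    d = np.zeros_like(u)
    d[:-1, :-1] = (u[:-1, 1:] - u[:-1, :-1]) + (v[1:, :-1] - v[:-1, :-1])
    return d

def mass_consistent_2d(u0, v0, n_iter=5000):
    """Solve lap(lam) = div(u0, v0) with Jacobi sweeps, then subtract the gradient of
    lam from the initial field so the interior divergence goes to (nearly) zero."""
    div = fdiv(u0, v0)
    lam = np.zeros_like(u0)
    for _ in range(n_iter):
        lam[1:-1, 1:-1] = 0.25 * (lam[1:-1, 2:] + lam[1:-1, :-2] +
                                  lam[2:, 1:-1] + lam[:-2, 1:-1] - div[1:-1, 1:-1])
    u, v = u0.copy(), v0.copy()
    u[:, 1:] -= lam[:, 1:] - lam[:, :-1]     # backward-difference gradient of lam
    v[1:, :] -= lam[1:, :] - lam[:-1, :]
    return u, v

# uniform westerly flow with a crude "building" blocking part of the domain
u0 = np.full((40, 40), 5.0)
v0 = np.zeros((40, 40))
u0[18:22, 18:22] = 0.0
u, v = mass_consistent_2d(u0, v0)
print("max |divergence| before:", np.abs(fdiv(u0, v0)).max(),
      " after (interior):", np.abs(fdiv(u, v)[1:-1, 1:-1]).max())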
Predicting Human Preferences Using the Block Structure of Complex Social Networks
Guimerà, Roger; Llorente, Alejandro; Moro, Esteban; Sales-Pardo, Marta
2012-01-01
With ever-increasing available data, predicting individuals' preferences and helping them locate the most relevant information has become a pressing need. Understanding and predicting preferences is also important from a fundamental point of view, as part of what has been called a “new” computational social science. Here, we propose a novel approach based on stochastic block models, which have been developed by sociologists as plausible models of complex networks of social interactions. Our model is in the spirit of predicting individuals' preferences based on the preferences of others but, rather than fitting a particular model, we rely on a Bayesian approach that samples over the ensemble of all possible models. We show that our approach is considerably more accurate than leading recommender algorithms, with major relative improvements between 38% and 99% over industry-level algorithms. Besides, our approach sheds light on decision-making processes by identifying groups of individuals that have consistently similar preferences, and enabling the analysis of the characteristics of those groups. PMID:22984533
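The block idea behind this approach can be illustrated with a toy sketch: fix one partition of users and items into groups and predict an unobserved preference as the mean observed preference within the corresponding (user-group, item-group) block. The paper's approach is Bayesian and samples over all possible partitions rather than fixing one; the matrix, group assignments and labels below are invented.

import numpy as np

# hypothetical 0/1 preference matrix (users x items), NaN = unobserved
R = np.array([[1, 1, 0, np.nan],
              [1, np.nan, 0, 0],
              [0, 0, 1, 1],
              [np.nan, 0, 1, 1.]])

user_group = np.array([0, 0, 1, 1])   # assumed fixed here; sampled in the actual method
item_group = np.array([0, 0, 1, 1])

def block_predict(R, ug, ig, u, i):
    """Predict an unobserved preference as the mean observed preference within the
    (user-group, item-group) block -- the core idea behind block-model prediction."""
    mask = (ug[:, None] == ug[u]) & (ig[None, :] == ig[i]) & ~np.isnan(R)
    return R[mask].mean() if mask.any() else np.nanmean(R)

print(block_predict(R, user_group, item_group, 0, 3))   # user 0, item 3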
Active disturbance rejection controller for chemical reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Both, Roxana; Dulf, Eva H.; Muresan, Cristina I.
2015-03-10
In the petrochemical industry, the synthesis of 2-ethyl-hexanol oxo-alcohols (plasticizer alcohols) is of high importance, being achieved through hydrogenation of 2-ethyl-hexenal inside catalytic trickle-bed three-phase reactors. For this type of process, the use of advanced control strategies is suitable due to the nonlinear behavior and extreme sensitivity to load changes and other disturbances. Due to the complexity of the mathematical model, one approach is to use a simple linear model of the process in combination with an advanced control algorithm that takes into account the model uncertainties, the disturbances and command signal limitations, such as robust control. However, the resulting controller is complex and requires costly hardware. This paper proposes a simple integer-order control scheme using a linear model of the process, based on the active disturbance rejection method. By treating the model dynamics as a common disturbance and actively rejecting it, active disturbance rejection control (ADRC) can achieve the desired response. Simulation results are provided to demonstrate the effectiveness of the proposed method.
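A minimal linear-ADRC sketch for a generic first-order plant is given below to make the "treat the model dynamics as a disturbance and reject it" idea concrete. The plant, observer/controller bandwidths and load disturbance are invented and are not the reactor model from the paper.

# Linear ADRC for a first-order plant  y' = -a*y + b*u + d(t).
# The extended state observer (ESO) estimates y (z1) and the "total disturbance"
# z2 (everything except b0*u); the control law then cancels z2.
a, b, b0 = 1.0, 2.0, 2.0          # true plant gains vs. assumed input gain b0
wo, wc = 20.0, 4.0                # observer and controller bandwidths (rad/s)
beta1, beta2 = 2 * wo, wo ** 2
kp = wc

dt, T, r = 1e-3, 5.0, 1.0         # time step, horizon, set point
y, z1, z2 = 0.0, 0.0, 0.0
log = []
for k in range(int(T / dt)):
    t = k * dt
    u = (kp * (r - z1) - z2) / b0           # disturbance-rejecting control law
    d = 0.5 if t > 2.5 else 0.0             # step load disturbance at t = 2.5 s
    y += dt * (-a * y + b * u + d)          # plant (Euler integration)
    e = y - z1
    z1 += dt * (z2 + b0 * u + beta1 * e)    # extended state observer
    z2 += dt * (beta2 * e)
    log.append(y)
print("output near t=2.4 s and at t=5 s:", round(log[int(2.4 / dt) - 1], 3), round(log[-1], 3))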
Dynamic patterns of overexploitation in fisheries.
Perissi, Ilaria; Bardi, Ugo; El Asmar, Toufic; Lavacchi, Alessandro
2017-09-10
Understanding overfishing and regulating fishing quotas is a major global challenge for the 21st century, both in terms of providing food for humankind and of preserving the oceans' ecosystems. However, fishing is a complex economic activity, affected not just by overfishing but also by such factors as pollution, technology and finance. For this reason, it is often difficult to state with complete certainty that overfishing is the cause of the decline of a fishery. In this study, we developed a simple dynamic model specifically designed to isolate and study the role of depletion on production. The model is based on the well-known Lotka-Volterra, or prey-predator, mechanism, assuming that the fish stock and the fishing industry are coupled variables that dynamically affect each other. In the model, the fishing industry acts as the "predator" and the fish stock as the "prey". If the model can fit historical data, in particular relative to the productive decline of specific fisheries, then we have a strong indication that the decline of the fish stock is driving the decline of the fishery production. The model does not pretend to be a general description of the fishing industry in all its varied forms; however, the data reported here show that the model can describe several historical cases of fisheries whose production decreased and collapsed, indicating that the overexploitation of fish stocks is an important factor in the decline of fisheries.
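In its simplest textbook form, the coupled stock-industry dynamics described above can be written as S' = r*S - a*S*E and E' = c*a*S*E - m*E, with production given by the harvest term a*S*E. The sketch below integrates this minimal form with invented parameters; it may differ from the paper's exact formulation.

import numpy as np
from scipy.integrate import solve_ivp

r, a, c, m = 0.5, 0.02, 0.1, 0.3    # stock growth, catchability, conversion, capital decay (illustrative)

def rhs(t, x):
    S, E = x                         # S = fish stock ("prey"), E = fishing effort/capital ("predator")
    return [r * S - a * S * E,
            c * a * S * E - m * E]

sol = solve_ivp(rhs, (0, 100), [100.0, 5.0], t_eval=np.linspace(0, 100, 11))
production = a * sol.y[0] * sol.y[1]   # harvest rate over time
print(np.round(production, 1))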
Deconstructing the Education-Industrial Complex in the Digital Age
ERIC Educational Resources Information Center
Loveless, Douglas, Ed.; Sullivan, Pamela, Ed.; Dredger, Katie, Ed.; Burns, Jim, Ed.
2017-01-01
Developments in the education field are affected by numerous, and often conflicting, social, cultural, and economic factors. With the increasing corporatization of education, teaching and learning paradigms are continuously altered. "Deconstructing the Education-Industrial Complex in the Digital Age" is an authoritative reference source…
Viegas, Carla; Sabino, Raquel; Botelho, Daniel; dos Santos, Mateus; Gomes, Anita Quintal
2015-09-01
Cork oak is the second most dominant forest species in Portugal and makes this country the world leader in cork export. Occupational exposure to Chrysonilia sitophila and the Penicillium glabrum complex in cork industry is common, and the latter fungus is associated with suberosis. However, as conventional methods seem to underestimate its presence in occupational environments, the aim of our study was to see whether information obtained by polymerase chain reaction (PCR), a molecular-based method, can complement conventional findings and give a better insight into occupational exposure of cork industry workers. We assessed fungal contamination with the P. glabrum complex in three cork manufacturing plants in the outskirts of Lisbon using both conventional and molecular methods. Conventional culturing failed to detect the fungus at six sampling sites in which PCR did detect it. This confirms our assumption that the use of complementing methods can provide information for a more accurate assessment of occupational exposure to the P. glabrum complex in cork industry.
Stochastic Industrial Source Detection Using Lower Cost Methods
NASA Astrophysics Data System (ADS)
Thoma, E.; George, I. J.; Brantley, H.; Deshmukh, P.; Cansler, J.; Tang, W.
2017-12-01
Hazardous air pollutants (HAPs) can be emitted from a variety of sources in industrial facilities, energy production, and commercial operations. Stochastic industrial sources (SISs) represent a subcategory of emissions from fugitive leaks, variable area sources, malfunctioning processes, and improperly controlled operations. From the shared perspective of industries and communities, cost-effective detection of mitigable SIS emissions can yield benefits such as safer working environments, cost savings through reduced product loss, lower air shed pollutant impacts, and improved transparency and community relations. Methods for SIS detection can be categorized by their spatial regime of operation, ranging from component-level inspection to high-sensitivity kilometer-scale surveys. Methods can be temporally intensive (providing snap-shot measures) or sustained in both time-integrated and continuous forms. Each method category has demonstrated utility; however, broad adoption (or routine use) has thus far been limited by cost and implementation viability. Described here is a subset of SIS methods explored by the U.S. EPA's next generation emission measurement (NGEM) program that focus on lower cost methods and models. An emerging systems approach that combines multiple forms to help compensate for reduced performance factors of lower cost systems is discussed. A case study of a multi-day HAP emission event observed by a combination of low cost sensors, open-path spectroscopy, and passive samplers is detailed. Early field results of a novel field gas chromatograph coupled with a fast HAP concentration sensor are described. Progress toward near real-time inverse source triangulation assisted by pre-modeled facility profiles using the Los Alamos Quick Urban & Industrial Complex (QUIC) model is discussed.
NASA Astrophysics Data System (ADS)
Krimi, Soufiene; Beigang, René
2017-02-01
In this contribution, we present a highly accurate approach for real-time thickness measurements of multilayered coatings using terahertz time-domain spectroscopy in reflection geometry. The proposed approach combines the benefits of a model-based material parameter extraction method to calibrate the specimen under test, a generalized modeling method to simulate the terahertz radiation behavior within arbitrary thin films, and the robustness of a powerful evolutionary optimization algorithm to increase the sensitivity and the precision of the minimum thickness measurement limit. Furthermore, a novel self-calibration model is introduced, which takes into consideration real industrial challenges such as the effect of wet-on-wet spray in the car painting process and the influence of the spraying conditions and the sintering process on ceramic thermal barrier coatings (TBCs) in the aircraft industry. In addition, the developed approach enables, for some applications, the simultaneous determination of the complex refractive index and the coating thickness. Hence, a pre-calibration of the specimen under test is not required in such cases. Due to the high robustness of the self-calibration method and the genetic optimization algorithms, the approach has been successfully applied to resolve individual layer thicknesses within multi-layered coated samples down to less than 10 µm. The regression method can be applied in the time domain, in the frequency domain, or in both simultaneously. The data evaluation uses general-purpose computing on graphics processing units and, thanks to the highly parallelized algorithm developed, takes less than 300 ms. Thus, industrial requirements for fast thickness measurements with an "every-second-cycle" can be fulfilled.
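The underlying time-of-flight relation can be sketched in a few lines for the simplest case of a single coating: the layer thickness follows from the delay between the echoes reflected at the front and back interfaces as d = c*dt/(2*n). This is only the textbook single-layer limit, not the paper's model-based multilayer regression; the refractive index and echo delay below are invented.

# Single-layer time-of-flight estimate: d = c * dt / (2 * n)
c = 2.998e8            # speed of light, m/s
n_coating = 1.9        # assumed refractive index of the coating at THz frequencies
dt = 0.63e-12          # s, hypothetical measured separation between front- and back-surface echoes
d = c * dt / (2 * n_coating)
print(f"estimated thickness: {d * 1e6:.1f} um")   # about 50 um for these values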
Deposition parameterizations for the Industrial Source Complex (ISC3) model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wesely, Marvin L.; Doskey, Paul V.; Shannon, J. D.
2002-06-01
Improved algorithms have been developed to simulate the dry and wet deposition of hazardous air pollutants (HAPs) with the Industrial Source Complex version 3 (ISC3) model system. The dry deposition velocities (downward fluxes divided by concentrations at a specified height) of the gaseous HAPs are modeled with algorithms adapted from existing dry deposition modules. The dry deposition velocities are described in a conventional resistance scheme, for which micrometeorological formulas are applied to describe the aerodynamic resistances above the surface. Pathways to uptake at the ground and in vegetative canopies are depicted with several resistances that are affected by variations in air temperature, humidity, solar irradiance, and soil moisture. The role of soil moisture variations in affecting the uptake of gases through vegetative plant leaf stomata is assessed with the relative available soil moisture, which is estimated with a rudimentary budget of soil moisture content. Some of the procedures and equations are simplified to be commensurate with the type and extent of information on atmospheric and surface conditions available to the ISC3 model system user. For example, standardized land use types and seasonal categories provide sets of resistances to uptake by various components of the surface. To describe the dry deposition of the large number of gaseous organic HAPs, a new technique based on laboratory study results and theoretical considerations has been developed, providing a means of evaluating the role of lipid solubility in uptake by the waxy outer cuticle of vegetative plant leaves.
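The resistance scheme mentioned above is commonly written as vd = 1/(Ra + Rb + Rc), with Ra the aerodynamic resistance, Rb the quasi-laminar boundary-layer resistance and Rc the bulk surface (canopy) resistance. A minimal sketch with invented resistance values follows; ISC3's algorithms compute each resistance from micrometeorology, land use, season and soil moisture.

# Dry deposition velocity from the resistance analogy: vd = 1 / (Ra + Rb + Rc)
Ra, Rb, Rc = 50.0, 30.0, 120.0        # s/m, hypothetical values
vd = 1.0 / (Ra + Rb + Rc)
print(f"dry deposition velocity: {vd * 100:.2f} cm/s")   # 0.50 cm/s for these values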
NASA Technical Reports Server (NTRS)
Seufzer, William J.
2014-01-01
Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.
NASA Astrophysics Data System (ADS)
Matthaios, Vasileios N.; Triantafyllou, Athanasios G.; Albanis, Triantafyllos A.; Sakkas, Vasileios; Garas, Stelios
2018-05-01
Atmospheric modeling is considered an important tool with several applications, such as prediction of air pollution levels, air quality management, and environmental impact assessment studies. Therefore, evaluation studies must be performed continuously in order to improve the accuracy and the approaches of air quality models. In the present work, an attempt is made to examine the efficiency of the air pollution model TAPM in simulating the surface meteorology, as well as the SO2 concentrations, in a mountainous complex terrain industrial area. Three configurations (first with default datasets, second with data assimilation, and third with updated land use) were run to investigate the surface meteorology over a three-year period (2009-2011), and one configuration was applied to predict SO2 concentration levels for 2011. The modeled hourly averaged meteorological and SO2 concentration values were statistically compared with those from five monitoring stations across the domain to evaluate the model's performance. Statistical measures showed that the surface temperature and relative humidity are predicted well in all three simulations, with index of agreement (IOA) higher than 0.94 and 0.70, respectively, at all monitoring sites, while an overprediction of extreme low temperature values is noted, with mountain altitudes playing an important role. However, the results also showed that the model's performance with respect to wind depends on the configuration. The TAPM default dataset predicted the wind variables better in the center of the simulation domain than at the boundaries, while the improvement in the boundary horizontal winds reflected the performance of TAPM with updated land use. TAPM with data assimilation predicted the wind variables fairly well over the whole domain, with IOA higher than 0.83 for the wind speed and higher than 0.85 for the horizontal wind components. Finally, the SO2 concentrations were assessed by the model with IOA values varying from 0.37 to 0.57, depending mostly on the grid/monitoring station within the simulated domain. The present study can be used, with relevant adaptations, as a user guideline for conducting future simulations in mountainous complex terrain.
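The index of agreement quoted throughout the evaluation is commonly computed with Willmott's formulation, d = 1 - sum((P - O)^2) / sum((|P - Obar| + |O - Obar|)^2), where P are predictions, O observations and Obar the observed mean. A minimal sketch with invented model and station values follows, assuming this is the IOA variant used.

import numpy as np

def index_of_agreement(pred, obs):
    """Willmott's index of agreement."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    obar = obs.mean()
    return 1.0 - ((pred - obs) ** 2).sum() / ((np.abs(pred - obar) + np.abs(obs - obar)) ** 2).sum()

# hypothetical hourly wind speeds (m/s): model vs. monitoring station
model = [2.1, 3.4, 4.0, 2.8, 1.9, 3.1]
station = [2.5, 3.0, 4.4, 2.6, 2.2, 3.5]
print(round(index_of_agreement(model, station), 2))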
López-Navarro, Miguel Ángel; Llorens-Monzonís, Jaume; Tortosa-Edo, Vicente
2013-01-01
Perceived risk of environmental threats often translates into psychological stress with a wide range of effects on health and well-being. Petrochemical industrial complexes constitute one of the sites that can cause considerable pollution and health problems. The uncertainty around emissions results in a perception of risk for citizens residing in neighboring areas, which translates into anxiety and physiological stress. In this context, social trust is a key factor in managing the perceived risk. In the case of industrial risks, it is essential to distinguish between trust in the companies that make up the industry, and trust in public institutions. In the context of a petrochemical industrial complex located in the port of Castellón (Spain), this paper primarily discusses how trust—both in the companies located in the petrochemical complex and in the public institutions—affects citizens’ health risk perception. The research findings confirm that while the trust in companies negatively affects citizens’ health risk perception, trust in public institutions does not exert a direct and significant effect. Analysis also revealed that trust in public institutions and health risk perception are essentially linked indirectly (through trust in companies). PMID:23337129
Buys, L; Mengersen, K; Johnson, S; van Buuren, N; Chauvin, A
2014-01-15
Sustainability is a key driver for decisions in the management and future development of industries. The World Commission on Environment and Development (WCED, 1987) outlined imperatives which need to be met for environmental, economic and social sustainability. Development of strategies for measuring and improving sustainability in and across these domains, however, has been hindered by intense debate between advocates for one approach fearing that efforts by those who advocate for another could have unintended adverse impacts. Studies attempting to compare the sustainability performance of countries and industries have also found ratings of performance quite variable depending on the sustainability indices used. Quantifying and comparing the sustainability of industries across the triple bottom line of economy, environment and social impact continues to be problematic. Using the Australian dairy industry as a case study, a Sustainability Scorecard, developed as a Bayesian network model, is proposed as an adaptable tool to enable informed assessment, dialogue and negotiation of strategies at a global level as well as being suitable for developing local solutions. Copyright © 2013 Elsevier Ltd. All rights reserved.
Creating Value with Long Term R&D: The life science industry
NASA Astrophysics Data System (ADS)
Soloman, Darlene J. S.
2008-03-01
Agilent Laboratories looks to the future to identify, invest in and enable technologies and applications that will nurture the world's people, environment and economies, and help ensure Agilent's continuing leadership. Following a brief introduction to Agilent Technologies and Agilent Laboratories, Solomon will discuss how innovation and long-term R&D are transcending traditional boundaries. Focusing on the life sciences industry, she will discuss current trends in R&D and the importance of measurement in advancing the industry. She will describe some of the challenges that are disrupting the pharmaceutical industry, where significant and sustained investment in R&D has not translated into large numbers of blockbuster therapeutics. Much of this gap results from the profound complexity of biological systems. New discoveries quickly generate new questions, which in turn drive more research and necessitate new business models. Solomon will highlight examples of Agilent's long-range R&D in life sciences, emphasizing the importance of physics. She will conclude with the importance of creating sustainable value with R&D.
[Process design in high-reliability organizations].
Sommer, K-J; Kranz, J; Steffens, J
2014-05-01
Modern medicine is a highly complex service industry in which individual care providers are linked in a complicated network. The complexity and interlinkedness are associated with risks concerning patient safety. Other highly complex industries like commercial aviation have succeeded in maintaining or even increasing their safety levels despite rapidly increasing passenger figures. Standard operating procedures (SOPs), crew resource management (CRM), as well as operational risk evaluation (ORE) are historically developed and trusted parts of a comprehensive and systemic safety program. If medicine wants to follow this quantum leap towards increased patient safety, it must intensively evaluate the results of other high-reliability industries and seek step-by-step implementation after a critical assessment.
A Study of the Interaction of Millimeter Wave Fields with Biological Systems.
1984-07-01
structurally complex proteins. The third issue is the relevance of the parameters used in previous modeling efforts. The strength of the exciton-phonon ... modes of proteins in the millimeter and submillimeter regions of the electromagnetic spectrum. Specifically: four separate groups of frequencies ... Rhodopseudomonas sphaeroides (4). In industrial or military environments, a significant number of personnel are exposed to electromagnetic fields
Dispersion of pollutants in densely populated urban areas is a research area of clear importance. Currently, few numerical tools exist that are capable of describing airflow and dispersion patterns in these complex regions in a time-efficient manner. (QUIC), Quick Urban & Industrial C...
NASA Technical Reports Server (NTRS)
Kline, S. J. (Editor); Cantwell, B. J. (Editor); Lilley, G. M.
1982-01-01
Computational techniques for simulating turbulent flows were explored, together with the results of experimental investigations. Particular attention was devoted to the possibility of defining a universal closure model, applicable for all turbulence situations; however, conclusions were drawn that zonal models, describing localized structures, were the most promising techniques to date. The taxonomy of turbulent flows was summarized, as were algebraic, differential, integral, and partial differential methods for numerical depiction of turbulent flows. Numerous comparisons of theoretically predicted and experimentally obtained data for wall pressure distributions, velocity profiles, turbulent kinetic energy profiles, Reynolds shear stress profiles, and flows around transonic airfoils were presented. Simplifying techniques for reducing the necessary computational time for modeling complex flowfields were surveyed, together with the industrial requirements and applications of computational fluid dynamics techniques.
Dynamic Business Networks: A Headache for Sustainable Systems Interoperability
NASA Astrophysics Data System (ADS)
Agostinho, Carlos; Jardim-Goncalves, Ricardo
Collaborative networked environments emerged with the spread of the internet, contributing to overcoming past communication barriers and identifying interoperability as an essential property. When achieved seamlessly, efficiency is increased in the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with the different partners, or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are only defined once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from the complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.
A survey of fuzzy logic monitoring and control utilisation in medicine.
Mahfouf, M; Abbod, M F; Linkens, D A
2001-01-01
Intelligent systems have appeared in many technical areas, such as consumer electronics, robotics and industrial control systems. Many of these intelligent systems are based on fuzzy control strategies, which describe complex systems in terms of linguistic rules rather than precise mathematical models. Since the 1980s new techniques have appeared, and fuzzy logic has been applied extensively in medical systems. The justification for such intelligent-system-driven solutions is that biological systems are so complex that the development of computerised systems within such environments is not always a straightforward exercise. In practice, a precise model may not exist for biological systems, or it may be too difficult to model. In most cases fuzzy logic is considered to be an ideal tool, as human minds work from approximate data, extract meaningful information and produce crisp solutions. This paper surveys the utilisation of fuzzy logic control and monitoring in medical sciences, with an analysis of its possible future penetration.
NASA Astrophysics Data System (ADS)
Aggarwal, Anil Kr.; Kumar, Sanjeev; Singh, Vikram
2017-03-01
The binary-state (i.e., success or failure) assumptions used in conventional reliability analysis are inappropriate for the reliability analysis of complex industrial systems due to the lack of sufficient probabilistic information. For large complex systems, the uncertainty of each individual parameter enhances the uncertainty of the system reliability. In this paper, the concept of fuzzy reliability has been used for reliability analysis of the system, and the effect of the coverage factor and the failure and repair rates of subsystems on fuzzy availability is analyzed for the fault-tolerant crystallization system of a sugar plant. Mathematical modeling of the system is carried out using the mnemonic rule to derive the Chapman-Kolmogorov differential equations. These governing differential equations are solved with the fourth-order Runge-Kutta method.
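To show the shape of such a computation, the sketch below integrates the Chapman-Kolmogorov equations of a single repairable unit (two states, crisp rates) with a hand-written fourth-order Runge-Kutta step. The crystallization system in the paper has many more states and fuzzified, coverage-dependent rates; the rates here are hypothetical.

import numpy as np

lam, mu = 0.01, 0.5     # hypothetical failure and repair rates (per hour)

def rhs(P):
    """Chapman-Kolmogorov equations: state 0 = operating, state 1 = under repair."""
    P0, P1 = P
    return np.array([-lam * P0 + mu * P1,
                      lam * P0 - mu * P1])

def rk4_step(P, h):
    k1 = rhs(P)
    k2 = rhs(P + 0.5 * h * k1)
    k3 = rhs(P + 0.5 * h * k2)
    k4 = rhs(P + h * k3)
    return P + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

P, h = np.array([1.0, 0.0]), 0.1
for _ in range(int(200 / h)):            # integrate to t = 200 h
    P = rk4_step(P, h)
print("availability A(200 h) ~", round(P[0], 4),
      " steady state mu/(lam+mu) =", round(mu / (lam + mu), 4))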
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
NASA Astrophysics Data System (ADS)
Nissen-Meyer, T.; Luo, Y.; Morency, C.; Tromp, J.
2008-12-01
Seismic-wave propagation in exploration-industry settings has seen major research and development efforts for decades, yet large-scale applications have often been limited to 2D or 3D finite-difference, (visco-)acoustic wave propagation due to computational limitations. We explore the possibility of including all relevant physical signatures in the wavefield using the spectral-element method (SPECFEM3D, SPECFEM2D), thereby accounting for acoustic, (visco-)elastic, poroelastic and anisotropic wave propagation in meshes which honor all crucial discontinuities. Mesh design is the crux of the problem, and we use CUBIT (Sandia Laboratories) to generate unstructured quadrilateral 2D and hexahedral 3D meshes for these complex background models. While general hexahedral mesh generation is an unresolved problem, we are able to accommodate most of the relevant settings (e.g., layer-cake models, salt bodies, overthrusting faults, and strong topography) with respectively tailored workflows. 2D simulations show localized, characteristic wave effects due to these features that shall be helpful in designing survey acquisition geometries in a relatively economic fashion. We address some of the fundamental issues this comprehensive modeling approach faces regarding its feasibility: assessing geological structures in terms of the necessity to honor the major structural units, appropriate velocity model interpolation, quality control of the resultant mesh, and computational cost for realistic settings up to frequencies of 40 Hz. The solution to this forward problem forms the basis for subsequent 2D and 3D adjoint tomography within this context, which is the subject of a companion paper.
Tao, Jing; Barry, Terrell; Segawa, Randy; Neal, Rosemary; Tuli, Atac
2013-01-01
Kettleman City, California, reported a higher than expected number of birth defect cases between 2007 and 2010, raising the concern of community and government agencies. A pesticide exposure evaluation was conducted as part of a complete assessment of community chemical exposure. Nineteen pesticides that potentially cause birth defects were investigated. The Industrial Source Complex Short-Term Model Version 3 (ISCST3) was used to estimate off-site air concentrations associated with pesticide applications within 8 km of the community from late 2006 to 2009. The health screening levels were designed to indicate potential health effects and used for preliminary health evaluations of estimated air concentrations. A tiered approach was conducted. The first tier modeled simple, hypothetical worst-case situations for each of 19 pesticides. The second tier modeled specific applications of the pesticides with estimated concentrations exceeding health screening levels in the first tier. The pesticide use report database of the California Department of Pesticide Regulation provided application information. Weather input data were summarized from the measurements of a local weather station in the California Irrigation Management Information System. The ISCST3 modeling results showed that during the target period, only two application days of one pesticide (methyl isothiocyanate) produced air concentration estimates above the health screening level for developmental effects at the boundary of Kettleman City. These results suggest that the likelihood of birth defects caused by pesticide exposure was low. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
Jun, Gyuchan Thomas; Ward, James; Clarkson, P John
2010-07-01
The UK health service, which had been diagnosed to be seriously out of step with good design practice, has been recommended to obtain knowledge of design and risk management practice from other safety-critical industries. While these other industries have benefited from a broad range of systems modelling approaches, healthcare remains a long way behind. In order to investigate the healthcare-specific applicability of systems modelling approaches, this study identified 10 distinct methods through meta-model analysis. Healthcare workers' perception on 'ease of use' and 'usefulness' was then evaluated. The characterisation of the systems modelling methods showed that each method had particular capabilities to describe specific aspects of a complex system. However, the healthcare workers found that some of the methods, although potentially very useful, would be difficult to understand, particularly without prior experience. This study provides valuable insights into a better use of the systems modelling methods in healthcare. STATEMENT OF RELEVANCE: The findings in this study provide insights into how to make a better use of various systems modelling approaches to the design and risk management of healthcare delivery systems, which have been a growing research interest among ergonomists and human factor professionals.
NASA Astrophysics Data System (ADS)
Buyvis, V. A.; Novichikhin, A. V.; Temlyantsev, M. V.
2017-09-01
A number of features of coal industry functioning were determined for the conditions of the Kemerovo region, and the specifics of planning and organization of coal transportation were revealed. An analysis of indicators for road and rail transport in the process of coal transportation was carried out. The necessity of improving the tools of coal product transportation under modern conditions is substantiated. Specific features of the functioning of a road-transport complex in a fuel and raw material region (using the example of the Kemerovo region) are determined. The modern scientific and applied problems of the functioning and allocation of road-transport complex resources are identified. To justify management decisions on the development and improvement of the road-transport complex, a set of indicators is proposed: infrastructural, transportation performance, operating, social and economic. Mathematical models of these indicators are recommended for formulating and justifying decisions during the operational and strategic planning of the future development, functioning and allocation of the road-transport sector in the Kemerovo region.
NASA Astrophysics Data System (ADS)
Yan, Wang-Ji; Ren, Wei-Xin
2016-12-01
Recent advances in signal processing and structural dynamics have spurred the adoption of transmissibility functions in academia and industry alike. Due to the inherent randomness of measurement and variability of environmental conditions, uncertainty affects their applications. This study is focused on statistical inference for raw scalar transmissibility functions modeled as complex ratio random variables. The goal is achieved through companion papers. This paper (Part I) is dedicated to the formal mathematical proofs. New theorems on the multivariate circularly-symmetric complex normal ratio distribution are proved on the basis of the principle of the probabilistic transformation of continuous random vectors. Closed-form distributional formulas for multivariate ratios of correlated circularly-symmetric complex normal random variables are analytically derived. Afterwards, several properties are deduced as corollaries and lemmas to the new theorems. Monte Carlo simulation (MCS) is utilized to verify the accuracy of some representative cases. This work lays the mathematical groundwork to find probabilistic models for raw scalar transmissibility functions, which are to be expounded in detail in Part II of this study.
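As a minimal illustration of the Monte Carlo verification step mentioned in the preceding abstract, the hedged Python sketch below draws correlated circularly-symmetric complex normal samples and forms their ratio; the sample size and the correlation value are illustrative assumptions, and in the cited work such empirical samples would be compared against the derived closed-form distribution rather than merely summarized.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
rho = 0.6  # assumed correlation between numerator and denominator (illustrative)

# Zero-mean, unit-variance circularly-symmetric complex normal samples.
z = (rng.standard_normal((n, 2)) + 1j * rng.standard_normal((n, 2))) / np.sqrt(2)
num = z[:, 0]
den = rho * z[:, 0] + np.sqrt(1 - rho**2) * z[:, 1]

t = num / den  # samples of a raw "transmissibility-like" complex ratio

# Robust summary statistics (the ratio distribution is heavy-tailed).
print("median(Re t) =", np.median(t.real))
print("median(|t|)  =", np.median(np.abs(t)))
```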
Computational Process Modeling for Additive Manufacturing (OSU)
NASA Technical Reports Server (NTRS)
Bagg, Stacey; Zhang, Wei
2015-01-01
Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until layer-by-layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost -many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.
2017-09-29
Report: The Military-Industrial-Scientific Complex and the Rise of New Powers: Conceptual, Theoretical and Methodological Contributions and the Brazilian Case.
Evaluating the usability of a commercial cooling vest in the Hong Kong industries.
Chan, Albert P; Yang, Yang; Song, Wen-Fang
2018-03-01
The provision of appropriate personal cooling vests is recognized as an effective measure to combat heat stress. However, personal cooling vests are not widely adopted in Hong Kong industries. The current study aims to evaluate the usability of a hybrid cooling vest, which is associated with the success of its application in industrial settings. A self-administered questionnaire focusing on 10 subjective attributes of cooling effect, ergonomic design and usability of a hybrid cooling vest was administered to 232 occupational workers in the construction, horticultural and cleaning, airport apron services and kitchen and catering industries. A structural equation model estimated by analysis of moment structures was constructed to evaluate the usability of the cooling vest, as influenced by cooling effect and ergonomic design. Results showed that cooling effect (path coefficient = 0.69, p < 0.001) and ergonomic design (path coefficient = 0.55, p < 0.001) significantly affected the usability of the cooling vest. The structural equation model is a feasible means of examining the complex nature of the structural relationships among the subjective perceptions of personal cooling vests. The empirical findings furnish sound evidence for further optimization of the hybrid cooling vest in terms of cooling effect and ergonomic design for occupational workers.
Cybersecurity in Hospitals: A Systematic, Organizational Perspective
Kaiser, Jessica P
2018-01-01
Background Cybersecurity incidents are a growing threat to the health care industry in general and hospitals in particular. The health care industry has lagged behind other industries in protecting its main stakeholder (ie, patients), and now hospitals must invest considerable capital and effort in protecting their systems. However, this is easier said than done because hospitals are extraordinarily technology-saturated, complex organizations with high end point complexity, internal politics, and regulatory pressures. Objective The purpose of this study was to develop a systematic and organizational perspective for studying (1) the dynamics of cybersecurity capability development at hospitals and (2) how these internal organizational dynamics interact to form a system of hospital cybersecurity in the United States. Methods We conducted interviews with hospital chief information officers, chief information security officers, and health care cybersecurity experts; analyzed the interview data; and developed a system dynamics model that unravels the mechanisms by which hospitals build cybersecurity capabilities. We then use simulation analysis to examine how changes to variables within the model affect the likelihood of cyberattacks across both individual hospitals and a system of hospitals. Results We discuss several key mechanisms that hospitals use to reduce the likelihood of cybercriminal activity. The variable that most influences the risk of cyberattack in a hospital is end point complexity, followed by internal stakeholder alignment. Although resource availability is important in fueling efforts to close cybersecurity capability gaps, low levels of resources could be compensated for by setting a high target level of cybersecurity. Conclusions To enhance cybersecurity capabilities at hospitals, the main focus of chief information officers and chief information security officers should be on reducing end point complexity and improving internal stakeholder alignment. These strategies can solve cybersecurity problems more effectively than blindly pursuing more resources. On a macro level, the cyber vulnerability of a country’s hospital infrastructure is affected by the vulnerabilities of all individual hospitals. In this large system, reducing variation in resource availability makes the whole system less vulnerable—a few hospitals with low resources for cybersecurity threaten the entire infrastructure of health care. In other words, hospitals need to move forward together to make the industry less attractive to cybercriminals. Moreover, although compliance is essential, it does not equal security. Hospitals should set their target level of cybersecurity beyond the requirements of current regulations and policies. As of today, policies mostly address data privacy, not data security. Thus, policy makers need to introduce policies that not only raise the target level of cybersecurity capabilities but also reduce the variability in resource availability across the entire health care system. PMID:29807882
NASA Astrophysics Data System (ADS)
Wang, Ling; Lin, Li
2004-02-01
Since the 1970s, the environmental protection movement has challenged industries to increase their investment in Environmentally Conscious Manufacturing (ECM) techniques and management tools. Social considerations for global citizens and their descendants also motivated the examination of the complex issues of sustainable development beyond the immediate economic impact. Consequently, industrial enterprises have started to approach sustainable development by considering the Triple Bottom Line (TBL): economic prosperity, environmental quality and social justice. For management, however, a lack of systematic ECM methodologies hinders its efforts in planning, evaluating, reporting and auditing of sustainability. To address this critical need, this research develops a framework of a sustainable management system by incorporating a Life Cycle Analysis (LCA) of industrial operations with the TBL mechanism. A TBL metric system with seven sets of indices for the TBL elements and their complex relations is identified for the comprehensive evaluation of a company's sustainability performance. Utilities of the TBL indices are estimated to represent the views of various stakeholders, including the company, investors, employees and society at large. Costs of these indices are also captured to reflect the company's effort in meeting the utilities. An optimization model is formulated to maximize the economic, environmental and social benefits through the company's effort in developing sustainable strategies. To promote environmental and social consciousness, the methodology can significantly facilitate management decisions by its capabilities of including "non-business" values and external costs that the company has not contemplated before.
Gallego, Alejandro; O'Hara Murray, Rory; Berx, Barbara; Turrell, William R; Beegle-Krause, C J; Inall, Mark; Sherwin, Toby; Siddorn, John; Wakelin, Sarah; Vlasenko, Vasyl; Hole, Lars R; Dagestad, Knut Frode; Rees, John; Short, Lucy; Rønningen, Petter; Main, Charlotte E; Legrand, Sebastien; Gutierrez, Tony; Witte, Ursula; Mulanaphy, Nicole
2018-02-01
As oil reserves in established basins become depleted, exploration and production moves towards relatively unexploited areas, such as deep waters off the continental shelf. The Faroe-Shetland Channel (FSC, NE Atlantic) and adjacent areas have been subject to increased focus by the oil industry. In addition to extreme depths, metocean conditions in this region characterise an environment with high waves and strong winds, strong currents, complex circulation patterns, sharp density gradients, and large small- and mesoscale variability. These conditions pose operational challenges to oil spill response and question the suitability of current oil spill modelling frameworks (oil spill models and their forcing data) to adequately simulate the behaviour of a potential oil spill in the area. This article reviews the state of knowledge relevant to deepwater oil spill modelling for the FSC area and identifies knowledge gaps and research priorities. Our analysis should be relevant to other areas of complex oceanography. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
Unraveling reaction pathways and specifying reaction kinetics for complex systems.
Vinu, R; Broadbelt, Linda J
2012-01-01
Many natural and industrial processes involve a complex set of competing reactions that include several different species. Detailed kinetic modeling of such systems can shed light on the important pathways involved in various transformations and therefore can be used to optimize the process conditions for the desired product composition and properties. This review focuses on elucidating the various components involved in modeling the kinetics of pyrolysis and oxidation of polymers. The elementary free radical steps that constitute the chain reaction mechanism of gas-phase/nonpolar liquid-phase processes are outlined. Specification of the rate coefficients of the various reaction families, which is central to the theme of kinetics, is described. Construction of the reaction network on the basis of the types of end groups and reactive moieties in a polymer chain is discussed. Modeling frameworks based on the method of moments and kinetic Monte Carlo are evaluated using illustrations. Finally, the prospects and challenges in modeling biomass conversion are addressed.
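As a toy illustration of one of the modeling frameworks named in the preceding abstract, the sketch below runs a minimal kinetic Monte Carlo (Gillespie-type) simulation of a single first-order decomposition channel; the rate coefficient, population, and end time are assumptions chosen purely for demonstration, and a real pyrolysis mechanism would track many species and reaction families.

```python
import numpy as np

rng = np.random.default_rng(1)
k = 0.05                 # s^-1, assumed first-order rate coefficient
n_A, n_B = 10_000, 0     # initial molecule counts for A -> B
t, t_end = 0.0, 60.0

while n_A > 0 and t < t_end:
    a_total = k * n_A                    # total propensity of the single channel
    t += rng.exponential(1.0 / a_total)  # waiting time to the next event
    n_A -= 1                             # fire the reaction: one A becomes B
    n_B += 1

print(f"t = {t:.1f} s, remaining fraction of A = {n_A / 10_000:.3f}")
```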
Recent Advances in Algal Genetic Tool Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Dahlin, Lukas; T. Guarnieri, Michael
The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.
A flexible 3D laser scanning system using a robotic arm
NASA Astrophysics Data System (ADS)
Fei, Zixuan; Zhou, Xiang; Gao, Xiaofei; Zhang, Guanliang
2017-06-01
In this paper, we present a flexible 3D scanning system based on a MEMS scanner mounted on an industrial arm with a turntable. This system has 7 degrees of freedom and is able to conduct a full-field scan from any angle, making it suitable for scanning objects with complex shapes. Existing non-contact 3D scanning systems usually use a laser scanner that projects a fixed stripe and is mounted on a Coordinate Measuring Machine (CMM) or an industrial robot. These existing systems cannot perform path planning without CAD models. The 3D scanning system presented in this paper can scan an object without CAD models, and we introduce this path planning method in the paper. We also propose a practical approach to calibrating the hand-eye system based on binocular stereo vision and analyze the errors of the hand-eye calibration.
Seismic anisotropy in deforming salt bodies
NASA Astrophysics Data System (ADS)
Prasse, P.; Wookey, J. M.; Kendall, J. M.; Dutko, M.
2017-12-01
Salt is often involved in forming hydrocarbon traps. Studying salt dynamics and the deformation processes is important for the exploration industry. We have performed numerical texture simulations of single halite crystals deformed by simple shear and axial extension using the visco-plastic self-consistent (VPSC) approach. A methodology from subduction studies to estimate strain in a geodynamic simulation is applied to a complex high-resolution salt diapir model. The salt diapir deformation is modelled with the ELFEN software of our industrial partner Rockfield, which is based on a finite-element code. High strain areas at the bottom of the head-like structures of the salt diapir show a high degree of seismic anisotropy due to LPO development of halite crystals. The results demonstrate that a significant degree of seismic anisotropy can be generated, validating the view that this should be accounted for in the treatment of seismic data in, for example, salt diapir settings.
NASA Astrophysics Data System (ADS)
Mezentsev, Yu A.; Baranova, N. V.
2018-05-01
A universal economic and mathematical model designed to determine optimal strategies for managing the production and logistics subsystems (and subsystem components) of enterprises is considered. The declared universality allows the model to take into account, at the system level, both production components, including limitations on the ways of converting raw materials and components into sold goods, and resource and logical restrictions on input and output material flows. The presented model and the generated control problems are developed within a unified framework that allows one to implement logical conditions of any complexity and to define the corresponding formal optimization problems. The conceptual meaning of the criteria and constraints used is explained. The generated mixed programming problems are shown to belong to the class NP. An approximate polynomial algorithm is proposed for solving the posed mixed programming optimization problems of realistic dimension and high computational complexity. Results of testing the algorithm on problems over a wide range of dimensions are presented.
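To make the style of model concrete, the sketch below sets up a toy mixed integer linear program for a production/logistics decision using the PuLP library; the two-product structure, the variable names, and all coefficients are illustrative assumptions, not the model or algorithm described in the abstract above.

```python
from pulp import LpMaximize, LpProblem, LpVariable, lpSum, value

prob = LpProblem("toy_production_logistics", LpMaximize)

products = ["A", "B"]
profit = {"A": 40, "B": 55}        # profit per unit shipped (assumed)
raw_use = {"A": 2.0, "B": 3.5}     # raw material consumed per unit (assumed)
setup_cost = {"A": 200, "B": 300}  # fixed cost if a production line is opened

x = {p: LpVariable(f"make_{p}", lowBound=0) for p in products}    # units produced
y = {p: LpVariable(f"open_{p}", cat="Binary") for p in products}  # line open?

prob += lpSum(profit[p] * x[p] - setup_cost[p] * y[p] for p in products)
prob += lpSum(raw_use[p] * x[p] for p in products) <= 500         # raw material limit
for p in products:
    prob += x[p] <= 1000 * y[p]    # logical condition: produce only on an open line

prob.solve()
print({p: x[p].value() for p in products}, value(prob.objective))
```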
Improving the water solubility of Monascus pigments under acidic conditions with gum arabic.
Jian, Wenjie; Sun, Yuanming; Wu, Jian-Yong
2017-07-01
Monascus pigments (Mps) are natural food colorants and their stability in acidic solutions is important for application in the food industry. This study aimed to evaluate the use of gum arabic (GA) as a stabilizer for maintaining the solubility of Mps in an acidic aqueous solution exposed to a high temperature, and to analyze the molecular interactions between GA and Mps. Mps dispersed (0.2 g kg⁻¹) in deionized water at pH 3.0-4.0 without GA formed precipitates but remained in a stable solution in the presence of GA (1 g kg⁻¹). The significant improvement of Mps water solubility under acidic conditions was attributed to the formation of Mps-GA complexes, as indicated by a sharp increase in the fluorescence intensity. The results on particle size, zeta potential, and transmission electron microscopy further suggested that molecular binding of Mps to GA, electrostatic repulsion, and steric hindrance of GA were contributing factors to preventing the aggregation of Mps in acidic solutions. A mechanistic model was presented for GA-Mps interactions and complex structures. GA was proven to be an effective stabilizer of natural food colorants in acidic solutions. © 2016 Society of Chemical Industry.
Sheesley, Rebecca J; Schauer, James J; Orf, Marya L
2010-02-01
Industrial sources can have a significant but poorly defined impact on ambient particulate matter concentrations in select areas. Detailed emission profiles are often not available and are hard to develop because of the diversity of emissions across time and space at large industrial complexes. A yearlong study was conducted in an industrial area in Detroit, MI, which combined real-time particle mass (tapered element oscillating microbalance) and black carbon (aethalometer) measurements with molecular marker measurements of monthly average concentrations as well as daily concentrations of select high pollution days. The goal of the study was to use the real-time data to define days in which the particulate matter concentration in the atmosphere was largely impacted by local source emissions and to use daily speciation data to derive emission profiles for the industrial source. When combined with motor vehicle exhaust, wood smoke and road dust profiles, the industrial source profile was used to determine the contribution of the local industrial source to the total organic carbon (OC) concentrations using molecular marker-chemical mass balance modeling (MM-CMB). The MM-CMB analysis revealed that the industrial source had minimal impact on the monthly average carbonaceous aerosol concentration, but contributed approximately 2 μg m⁻³, or a little over one-third of the total OC, on select high-impact days.
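The core arithmetic of a chemical mass balance apportionment of the kind described above can be sketched as a non-negative least squares fit of source profiles to ambient marker concentrations; in the snippet below the profile matrix, the species, and the ambient values are all made-up placeholders, not data from the Detroit study.

```python
import numpy as np
from scipy.optimize import nnls

# Rows: marker species; columns: sources (vehicle, wood smoke, road dust, industry).
# Entries are illustrative marker abundances per unit of OC from each source.
profiles = np.array([
    [0.030, 0.001, 0.000, 0.002],
    [0.001, 0.025, 0.000, 0.000],
    [0.000, 0.000, 0.015, 0.001],
    [0.002, 0.000, 0.001, 0.020],
])
ambient = np.array([0.012, 0.008, 0.005, 0.009])  # measured marker concentrations

contributions, residual = nnls(profiles, ambient)  # OC attributed to each source
print("source contributions:", contributions.round(3), "residual:", round(residual, 4))
```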
Research on the exponential growth effect on network topology: Theoretical and empirical analysis
NASA Astrophysics Data System (ADS)
Li, Shouwei; You, Zongjun
An integrated circuit (IC) industry network has been built in the Yangtze River Delta with the constant expansion of the IC industry. The IC industry network grows exponentially with the establishment of new companies and of contacts with old firms. Based on preferential attachment and exponential growth, the paper presents analytical results in which the vertex degree of the scale-free network follows the power-law distribution p(k) ~ k^(-γ), with γ = 2β + 1 and the parameter β satisfying 0.5 ≤ β ≤ 1. At the same time, we find that the preferential attachment takes place in a dynamic local world whose size is in direct proportion to the size of the whole network. The paper also gives analytical results for non-preferential attachment with exponential growth on random networks. Computer simulations of the model illustrate these analytical results. Through investigations of the enterprises, the paper first presents the distribution of the IC industry and the composition of its industrial chain and service chain. Then, the corresponding networks of the industrial chain and service chain are presented and analyzed, together with a correlative analysis of the whole IC industry. Based on complex network theory, an analysis and comparison of the industrial chain network and service chain network in the Yangtze River Delta are provided in the paper.
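For readers who want to see the flavor of such a growth model, the sketch below simulates plain preferential attachment with one edge per new node and tabulates the degree distribution; it is a simplification under stated assumptions, not the exponential-growth local-world model of the paper (pure preferential attachment of this kind is expected to give a power-law exponent near 3 rather than the 2β + 1 derived above).

```python
import random
from collections import Counter

random.seed(0)
edges = [(0, 1)]
stubs = [0, 1]  # each appearance of a node here is one unit of degree

for new in range(2, 5000):
    target = random.choice(stubs)  # preferential attachment: pick proportional to degree
    edges.append((new, target))
    stubs += [new, target]

degree = Counter(stubs)
histogram = Counter(degree.values())
for k in sorted(histogram)[:8]:
    print(k, histogram[k])  # counts fall off roughly as a power law in k
```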
Coal conversion products industrial applications
NASA Technical Reports Server (NTRS)
Dunkin, J. H.; Warren, D.
1980-01-01
Coal-based synthetic fuels complexes under development consideration by NASA/MSFC will produce large quantities of synthetic fuels, primarily medium-BTU gas, which could be sold commercially to industries located in South Central Tennessee and Northern Alabama. The complexes would be modular in construction, and subsequent modules may produce liquid fuels or fuels for electric power production. Current and projected industries in the two states which have a propensity for utilizing coal-based synthetic fuels were identified, and a data base was compiled to support MSFC activities.
The development of the ICME supply-chain: Route to ICME implementation and sustainment
NASA Astrophysics Data System (ADS)
Furrer, David; Schirra, John
2011-04-01
Over the past twenty years, integrated computational materials engineering (ICME) has emerged as a key engineering field with great promise. Models simulating materials-related phenomena have been developed and are being validated for industrial application. The integration of computational methods into material, process and component design has been a challenge, however, in part due to the complexities in the development of an ICME "supply-chain" that supports, sustains and delivers this emerging technology. ICME touches many disciplines, which results in a requirement for many types of computational-based technology organizations to be involved to provide tools that can be rapidly developed, validated, deployed and maintained for industrial applications. The need for, and the current state of an ICME supply-chain along with development and future requirements for the continued pace of introduction of ICME into industrial design practices will be reviewed within this article.
On Parallelizing Single Dynamic Simulation Using HPC Techniques and APIs of Commercial Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diao, Ruisheng; Jin, Shuangshuang; Howell, Frederic
Time-domain simulations are heavily used in today’s planning and operation practices to assess power system transient stability and post-transient voltage/frequency profiles following severe contingencies to comply with industry standards. Because of the increased modeling complexity, it is several times slower than real time for state-of-the-art commercial packages to complete a dynamic simulation for a large-scale model. With the growing stochastic behavior introduced by emerging technologies, the power industry has seen a growing need for performing security assessment in real time. This paper presents a parallel implementation framework to speed up a single dynamic simulation by leveraging the existing stability model library in commercial tools through their application programming interfaces (APIs). Several high-performance computing (HPC) techniques are explored, such as parallelizing the calculation of generator current injection, identifying fast linear solvers for network solution, and parallelizing data outputs when interacting with APIs in the commercial package, TSAT. The proposed method has been tested on a WECC planning base case with detailed synchronous generator models and exhibits outstanding scalable performance with sufficient accuracy.
A network analysis of indirect carbon emission flows among different industries in China.
Du, Qiang; Xu, Yadan; Wu, Min; Sun, Qiang; Bai, Libiao; Yu, Ming
2018-06-17
Indirect carbon emissions account for a large ratio of the total carbon emissions in processes to make the final products, and this implies indirect carbon emission flow across industries. Understanding these flows is crucial for allocating a carbon allowance for each industry. By combining input-output analysis and complex network theory, this study establishes an indirect carbon emission flow network (ICEFN) for 41 industries from 2005 to 2014 to investigate the interrelationships among different industries. The results show that the ICEFN was consistent with a small-world nature based on an analysis of the average path lengths and the clustering coefficients. Moreover, key industries in the ICEFN were identified using complex network theory on the basis of degree centrality and betweenness centrality. Furthermore, the 41 industries of the ICEFN were divided into four industrial subgroups that are related closely to one another. Finally, possible policy implications were provided based on the knowledge of the structure of the ICEFN and its trend.
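A minimal sketch of the centrality calculations mentioned above, using the networkx library on a tiny, made-up emission-flow edge list (the industry names and flow values are placeholders, not data from the study):

```python
import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("PowerGen", "Metals", 120.0),        # embodied emission flows (illustrative units)
    ("Metals", "Construction", 80.0),
    ("PowerGen", "Construction", 60.0),
    ("Chemicals", "Construction", 30.0),
])

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G, weight="weight")

# Rank industries by each centrality measure to flag "key" nodes.
print(sorted(degree, key=degree.get, reverse=True))
print(sorted(betweenness, key=betweenness.get, reverse=True))
```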
ALGE3D: A Three-Dimensional Transport Model
NASA Astrophysics Data System (ADS)
Maze, G. M.
2017-12-01
Of the top 10 most populated US cities from a 2015 US Census Bureau estimate, 7 are situated near the ocean, a bay, or on one of the Great Lakes. Contamination of the waterways in the United States could be devastating to the economy (through tourism and industries such as fishing), public health (from direct contact, or contaminated drinking water), and in some cases even infrastructure (water treatment plants). Current national response models employed by emergency response agencies are well developed for simulating the effects of hazardous contaminants in riverine systems that are primarily driven by one-dimensional flows; however, in more complex systems, such as tidal estuaries, bays, or lakes, a more complex model is needed. While many models exist, none are capable of quick deployment in emergency situations that could involve a variety of release scenarios, including a mixture of both particulate and dissolved chemicals in a complex flow area. ALGE3D, developed at the Department of Energy's (DOE) Savannah River National Laboratory (SRNL), is a three-dimensional hydrodynamic code which solves the momentum, mass, and energy conservation equations to predict the movement and dissipation of thermal or dissolved chemical plumes discharged into cooling lakes, rivers, and estuaries. ALGE3D is capable of modeling very complex flows, including areas with tidal flows which include wetting and drying of land. Recent upgrades have increased its capabilities, including the transport of particulate tracers, allowing for more complete modeling of the transport of pollutants. In addition, the model is capable of coupling with a one-dimensional riverine transport model or a two-dimensional atmospheric deposition model in the event that a contamination event occurs upstream or upwind of the water body.
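As a much-simplified stand-in for the kind of transport equations such a code solves, the sketch below integrates a one-dimensional advection-diffusion equation for a dissolved tracer with an explicit upwind scheme; the grid spacing, velocity, and diffusivity are illustrative assumptions and bear no relation to ALGE3D's actual numerics.

```python
import numpy as np

ncell, dx, dt = 200, 10.0, 1.0   # cells, cell size (m), time step (s)
u, D = 0.5, 2.0                  # advection velocity (m/s), diffusivity (m^2/s)
c = np.zeros(ncell)
c[5] = 100.0                     # instantaneous release of a dissolved tracer

for _ in range(1000):
    adv = -u * (c - np.roll(c, 1)) / dx                         # upwind advection (u > 0)
    dif = D * (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2  # central-difference diffusion
    c = c + dt * (adv + dif)

print("peak location (m):", np.argmax(c) * dx, "peak concentration:", round(c.max(), 3))
```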
2010-04-30
...previous and current complex SW development efforts, the program offices will have a source of objective lessons learned and metrics that can be applied...
[The Third World before the Third World, 1770-1870].
Batou, J
1992-01-01
The advent of the development gap between the industrialized countries and the underdeveloped countries is explored through an examination of early attempts to industrialize in Latin America and the Middle East in the years 1770-1870. The beginning of the development gap can be dated to 1830-60, with the diffusion of the industrial revolution in Western Europe and the US. The periphery remained poorly defined and still enjoyed a significant degree of economic autonomy through 1870, but the lowered cost of international freight, the increasing cost and technological complexity of machinery, and other factors after that date combined to assure increasing economic integration of nations. Latin America and the Middle East were selected for study because they were the only present-day developing regions to have developed modern industry before 1850-60 except for Bengal, which was already colonized by the British. The industrial revolution was a decisive development in the history of human societies, marked by a drastic acceleration of the rate of economic growth as much as by an unprecedented increase in inequality of development between countries. Societies bypassed by technological innovations thus seemed doomed sooner or later to depend on societies at the center of development. Third world contemporaries of the early industrial revolution appear to have been aware of this, and some peripheral states made serious efforts to avoid the worst forms of external dependence and to resist the deindustrialization, pauperization, and direct colonization of underdevelopment. 3 types of attempts at industrialization in Latin America and the Middle East before 1860-80 are distinguished and described, including partial and unsuccessful public efforts in several countries, isolated private initiatives going against prevailing trends in Mexico and Brazil, and industrial development directed step by step by the state in Egypt and Paraguay. It is argued that the model of industrialization in Egypt and Paraguay anticipated the Japanese experience in certain respects and would have had a good chance of success had not devastating warfare destroyed the economies of both countries. The author explores 5 questions to assess the relevance of the Paraguayan and Egyptian model: 1) whether the natural environment of the 2 countries offered favorable conditions for modern factories, 2) whether peripheral states had the resources for financing a true industrialization policy, 3) whether the sociocultural context of the 2 countries would have permitted them to develop an industrial culture, 4) whether the West would have tolerated the competition implied by their economic development and industrialization, and 5) whether this model of industrialization was adjusted to the specific conditions of the periphery.
Chuang, Hsiao-Chi; Shie, Ruei-Hao; Chio, Chia-Pin; Yuan, Tzu-Hsuen; Lee, Jui-Huan; Chan, Chang-Chuan
2018-05-01
This study evaluated associations between the bioreactivity of PM2.5 in vitro and emission sources in the vicinity of a petrochemical complex in Taiwan. The average PM2.5 was 30.2 μg/m³ from 9 February to 23 March 2016, and the PM2.5 was clustered in long-range transport (with major local source) (12.8 μg/m³), and major (17.3 μg/m³) and minor industrial emissions (4.7 μg/m³) using a k-means clustering model. A reduction in cell viability and increases in the cytotoxicity-related lactate dehydrogenase (LDH), oxidative stress-related 8-isoprostane, and inflammation-related interleukin (IL)-6 occurred due to PM2.5 in a dose-dependent manner. The PM2.5 from major industrial emissions was significantly correlated with increased 8-isoprostane and IL-6, but this was not observed for long-range transport or minor industrial emissions. The bulk metal concentration was 9.52 ng/m³ in PM2.5. We further observed that As, Ba, Cd, and Se were correlated with LDH in the long-range transport group. Pb in PM2.5 from the major industrial emissions was correlated with LDH, whereas Pb and Se were correlated with 8-isoprostane. Sr was correlated with cell viability in the minor industrial emissions group. We demonstrated a new approach to investigate particle bioreactivity, which suggested that petrochemical-emitted PM2.5 should be a concern for surrounding residents' health. Copyright © 2018 Elsevier Ltd. All rights reserved.
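The clustering step described above can be sketched with an off-the-shelf k-means implementation; the feature matrix below is random placeholder data (not the Taiwan measurements), and a real analysis would need proper treatment of circular wind-direction features.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(0, 60, 300),    # PM2.5 (ug/m3), synthetic
    rng.uniform(0, 360, 300),   # wind direction (deg), synthetic
    rng.uniform(0, 10, 300),    # wind speed (m/s), synthetic
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for k in range(3):
    print("cluster", k, "mean PM2.5:", round(X[labels == k, 0].mean(), 1))
```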
Complex Adaptive System Models and the Genetic Analysis of Plasma HDL-Cholesterol Concentration
Rea, Thomas J.; Brown, Christine M.; Sing, Charles F.
2006-01-01
Despite remarkable advances in diagnosis and therapy, ischemic heart disease (IHD) remains a leading cause of morbidity and mortality in industrialized countries. Recent efforts to estimate the influence of genetic variation on IHD risk have focused on predicting individual plasma high-density lipoprotein cholesterol (HDL-C) concentration. Plasma HDL-C concentration (mg/dl), a quantitative risk factor for IHD, has a complex multifactorial etiology that involves the actions of many genes. Single gene variations may be necessary but are not individually sufficient to predict a statistically significant increase in risk of disease. The complexity of phenotype-genotype-environment relationships involved in determining plasma HDL-C concentration has challenged commonly held assumptions about genetic causation and has led to the question of which combination of variations, in which subset of genes, in which environmental strata of a particular population significantly improves our ability to predict high or low risk phenotypes. We document the limitations of inferences from genetic research based on commonly accepted biological models, consider how evidence for real-world dynamical interactions between HDL-C determinants challenges the simplifying assumptions implicit in traditional linear statistical genetic models, and conclude by considering research options for evaluating the utility of genetic information in predicting traits with complex etiologies. PMID:17146134
NASA Technical Reports Server (NTRS)
Bowen, Brent D.; Headley, Dean E.; Kane, Karisa D.
1998-01-01
Enhancing competitiveness in the global airline industry is at the forefront of attention with airlines, government, and the flying public. The seemingly unchecked growth of major airline alliances is heralded as an enhancement to global competition. However, like many mega-conglomerates, mega-airlines will face complications driven by size regardless of the many recitations of enhanced efficiency. Outlined herein is a conceptual model to serve as a decision tool for policy-makers, managers, and consumers of airline services. This model is developed using public data for the United States (U.S.) major airline industry available from the U.S. Department of Transportation, the Federal Aviation Administration, the National Aeronautics and Space Administration, the National Transportation Safety Board, and other public and private sector sources. Data points include number of accidents, pilot deviations, operational performance indicators, flight problems, and other factors. Data from these sources provide the opportunity to develop a model based on a complex dot product equation of two vectors. A row vector is weighted for importance by a key informant panel of government, industry, and consumer experts, while a column vector is established with the factor value. The resulting equation, known as the national Airline Quality Rating (AQR), where Q is quality, C_i is the weight, and V_i is the value of variable i, is stated as Q = Σ C_i × V_i for i = 1 to 19. Looking at historical patterns of AQR results provides the basis for establishment of an industry benchmark for the purpose of enhancing airline operational performance. A 7-year average of overall operational performance provides the resulting benchmark indicator. Applications from this example can be applied to the many competitive environments of the global industry and assist policy-makers faced with rapidly changing regulatory challenges.
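The weighted-sum structure of the rating can be illustrated with a few lines of Python; the three factors and their weights below are made-up stand-ins for the 19 weighted variables in the actual AQR.

```python
# Q = sum_i C_i * V_i with illustrative weights (C) and factor values (V).
weights = {"on_time_rate": 8.0, "denied_boardings": -8.0, "complaints": -7.0}
values = {"on_time_rate": 0.82, "denied_boardings": 0.64, "complaints": 0.91}

aqr = sum(weights[k] * values[k] for k in weights)
print(round(aqr, 3))
```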
Complex Economies Have a Lateral Escape from the Poverty Trap
Pugliese, Emanuele; Chiarotti, Guido L.; Zaccaria, Andrea; Pietronero, Luciano
2017-01-01
We analyze the decisive role played by the complexity of economic systems at the onset of the industrialization process of countries over the past 50 years. Our analysis of the input growth dynamics, considering a further dimension through a recently introduced measure of economic complexity, reveals that more differentiated and more complex economies face a lower barrier (in terms of GDP per capita) when starting the transition towards industrialization. As a consequence, we can extend the classical concept of a one-dimensional poverty trap, by introducing a two-dimensional poverty trap: a country will start the industrialization process if it is rich enough (as in neo-classical economic theories), complex enough (using this new dimension and laterally escaping from the poverty trap), or a linear combination of the two. This naturally leads to the proposal of a Complex Index of Relative Development (CIRD) which shows, when analyzed as a function of the growth due to input, a shape of an upside down parabola similar to that expected from the standard economic theories when considering only the GDP per capita dimension. PMID:28072867
Reusing models of actors and services in smart homecare to improve sustainability.
Walderhaug, Ståle; Stav, Erlend; Mikalsen, Marius
2008-01-01
Industrial countries are faced with a growing elderly population. Homecare systems with assistive smart house technology enable elderly people to live independently at home. Development of such smart home care systems is complex and expensive, and there is no common reference model that can facilitate service reuse. This paper proposes reusable actor and service models based on a model-driven development process in which end user organizations and domain healthcare experts from four European countries have been involved. The models, specified using UML, can be reused actively as assets in the system design and development process and can reduce development costs and improve the interoperability and sustainability of systems. The models are being evaluated in the European IST project MPOWER.
The Effect of Salt on the Complex Coacervation of Vinyl Polyelectrolytes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perry, Sarah; Li, Yue; Priftis, Dimitrios
2014-06-01
Complex coacervation is an electrostatically-driven phase separation phenomenon that is utilized in a wide range of everyday applications and is of great interest for the creation of self-assembled materials. Here, we utilized turbidity to characterize the effect of salt type on coacervate formation using two vinyl polyelectrolytes, poly(acrylic acid sodium salt) (pAA) and poly(allylamine hydrochloride) (pAH), as simple models for industrial and biological coacervates. We confirmed the dominant role of salt valence on the extent of coacervate formation, while demonstrating the presence of significant secondary effects, which can be described by Hofmeister-like behavior. These results revealed the importance of ion-specific interactions, which are crucial for the informed design of coacervate-based materials for use in complex ionic environments, and can enable more detailed theoretical investigations on the role of subtle electrostatic and thermodynamic effects in complex coacervation.
Hong, Eunju; Lee, Seokwon; Kim, Geun-Bae; Kim, Tae-Jong; Kim, Hyoung-Wook; Lee, Kyoungho; Son, Bu-Soon
2018-04-24
This study aims to identify environmental air pollution adversely affecting pulmonary function among a community-based general population living in Korean industrial complexes. A total of 1963 residents participated in a pulmonary function test (PFT). The sample population consisted of an exposed group (n = 1487) living within a radius of 5 km of industrial complexes and a control group (n = 476) living over a radius of 10 km from the industrial complexes in Gwangyang and Yeosu cities. PFT results were calculated for each resident of the study population. On-site questionnaire surveys with face-to-face interviews were also conducted to collect more detailed information on personal lifestyles, medical history, exposure to air pollution, and respiratory disease and related symptoms. A total of 486 measured samples were collected by eight automated air-monitoring stations installed in four counties of Gwangyang and four counties of Yeosu in South Korea from January 2006 to February 2007. Mean levels of SO₂ (0.012 ppm), CO (0.648 ppm), NO₂ (0.02 ppm), O₃ (0.034 ppm), and PM10 (43.07 μg/m³), collected within a radius of 5 km, were significantly higher than those collected over a radius of 10 km from Gwangyang and Yeosu industrial complexes. Prevalence odds ratio (OR) of abnormal pulmonary function in the exposed group of residents (<5 km) was elevated at 1.24 (95% CI 0.71–1.96), but not statistically significant (p > 0.05). In multiple linear regression analysis, forced expiratory volume in one second (FEV₁) and forced vital capacity (FVC) levels significantly declined as SO₂, CO, and O₃ levels increased when adjusting for age, sex, body mass index (BMI), alcohol, smoking, secondhand smoke, and respiratory disease and related symptoms (n = 1963) (p < 0.05). These results suggest that exposure to air pollution affects pulmonary function levels of residents living in Korean industrial complexes.
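The regression adjustment described above can be sketched with an ordinary least squares fit; the synthetic data and assumed effect sizes below are purely illustrative and are not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "so2": rng.normal(0.012, 0.003, n),   # ppm, synthetic exposure
    "age": rng.integers(30, 75, n),
    "bmi": rng.normal(24, 3, n),
    "sex": rng.integers(0, 2, n),
})
# Fabricate an FEV1 (L) that declines with SO2 and age (assumed effect sizes).
df["fev1"] = 3.5 - 20 * df["so2"] - 0.02 * df["age"] + rng.normal(0, 0.3, n)

model = smf.ols("fev1 ~ so2 + age + bmi + C(sex)", data=df).fit()
print(model.params[["so2", "age"]])
```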
Fast mask writers: technology options and considerations
NASA Astrophysics Data System (ADS)
Litt, Lloyd C.; Groves, Timothy; Hughes, Greg
2011-04-01
The semiconductor industry is under constant pressure to reduce production costs even as the complexity of technology increases. Lithography represents the most expensive process due to its high capital equipment costs and the implementation of low-k1 lithographic processes, which have added to the complexity of making masks because of the greater use of optical proximity correction, pixelated masks, and double or triple patterning. Each of these mask technologies allows the production of semiconductors at future nodes while extending the utility of current immersion tools. Low-k1 patterning complexity combined with increased data due to smaller feature sizes is driving extremely long mask write times. While a majority of the industry is willing to accept times of up to 24 hours, evidence suggests that the write times for many masks at the 22 nm node and beyond will be significantly longer. It has been estimated that funding on the order of 50M to 90M for non-recurring engineering (NRE) costs will be required to develop a multiple-beam mask writer system, yet the business case to recover this kind of investment is not strong. Moreover, funding such a development poses a high risk for an individual supplier. The structure of the mask fabrication marketplace separates the mask writer equipment customer (the mask supplier) from the final customer (the wafer manufacturer), which will be most affected by the increase in mask cost that will result if a high-speed mask writer is not available. Since no individual company will likely risk entering this market, some type of industry-wide funding model will be needed.
Occupational exposure to fungi and particles in animal feed industry.
Viegas, Carla; Faria, Tiago; Carolino, Elisabete; Sabino, Raquel; Gomes, Anita Quintal; Viegas, Susana
Very few studies regarding fungal and particulate matter (PM) exposure in the feed industry have been reported, although such contaminants are likely to be a significant contributing factor to several symptoms reported among workers. The purpose of this study has been to characterize fungal and dust exposure in one Portuguese feed industry. Air and surface samples were collected and subjected to further macro- and microscopic observations. In addition, we collected other air samples in order to perform real-time quantitative polymerase chain reaction (PCR) amplification of genes from Aspergillus fumigatus and Aspergillus flavus complexes as well as Stachybotrys chartarum. Additionally, two exposure metrics were considered - particle mass concentration (PMC), measured in 5 different sizes (PM0.5, PM1, PM2.5, PM5, PM10), and particle number concentration (PNC) based on results given in 6 different sizes in terms of diameter (0.3 μm, 0.5 μm, 1 μm, 2.5 μm, 5 μm and 10 μm). Species from the Aspergillus fumigatus complex were the most abundant in the air (46.6%), while on surfaces the Penicillium genus was the most frequently found (32%). DNA was detected only from the A. fumigatus complex. The particles most prevalent in dust samples were the smaller ones, which may reach deep into the respiratory system and trigger not only local effects but also systemic ones. Future research should aim at assessing the real health effects of these co-exposures. Med Pr 2016;67(2):143-154. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
Translations on Eastern Europe Political, Sociological, and Military Affairs No. 1567
1978-07-21
the industrial development of Jordan by building some industrial capital investments units, for instance electric power plants, cement and ceramics...independence and to the industrialization of these countries and at the same time creates possibilities for expanding imports of economically important raw...construction of important industrial projects, agro-complexes, industrial and agricultural cooperation, the use of new technologies in industry and
Neo-Industrial and Sustainable Development of Russia as Mineral Resources Exploiting Country
NASA Astrophysics Data System (ADS)
Prokudina, Marina; Zhironkina, Olga; Kalinina, Oksana; Gasanov, Magerram; Agafonov, Felix
2017-11-01
In the Russian economy, the world leadership in the extraction of different mineral resources is combined with the potential for their processing and a significant scientific sector. Innovative development of raw materials extraction is impossible without the parallel technological modernization of the high-tech sector. In general, the complex of these processes is a neo-industrialization of the economy. Neo-industrially oriented transformation of the economy reflects complex changes in its structure, the transformation of established stable relationships between various elements of the system of social production that determine macroeconomic proportions. Neo-industrial transformations come along with the modification of economic relations associated with investments, innovations, labor and income distribution, with the process of locating productive forces and regulating the economy by the government. Neo-industrialization of economy is not only significant changes in its technological and reproductive structure (the development of high-tech industries, the integration of science and industry), but, above all, the implementation of a system structural policy of innovative development of raw material industry and the recovery of manufacturing industries on a new technological basis.
Engineering Hydrogel Microenvironments to Recapitulate the Stem Cell Niche.
Madl, Christopher M; Heilshorn, Sarah C
2018-06-04
Stem cells are a powerful resource for many applications including regenerative medicine, patient-specific disease modeling, and toxicology screening. However, eliciting the desired behavior from stem cells, such as expansion in a naïve state or differentiation into a particular mature lineage, remains challenging. Drawing inspiration from the native stem cell niche, hydrogel platforms have been developed to regulate stem cell fate by controlling microenvironmental parameters including matrix mechanics, degradability, cell-adhesive ligand presentation, local microstructure, and cell-cell interactions. We survey techniques for modulating hydrogel properties and review the effects of microenvironmental parameters on maintaining stemness and controlling differentiation for a variety of stem cell types. Looking forward, we envision future hydrogel designs spanning a spectrum of complexity, ranging from simple, fully defined materials for industrial expansion of stem cells to complex, biomimetic systems for organotypic cell culture models.
Policy modeling for industrial energy use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Worrell, Ernst; Park, Hi-Chun; Lee, Sang-Gon
2003-03-01
The international workshop on Policy Modeling for Industrial Energy Use was jointly organized by EETA (Professional Network for Engineering Economic Technology Analysis) and INEDIS (International Network for Energy Demand Analysis in the Industrial Sector). The workshop has helped to lay out the needs and challenges of including policy more explicitly in energy-efficiency modeling. The current state-of-the-art models have a proven track record in forecasting future trends under conditions similar to those faced in the recent past. However, the future of energy policy in a climate-constrained world is likely to demand different and additional services to be provided by energy modelers. In this workshop, some of the international models used to make energy consumption forecasts were discussed, as well as innovations to enable the modeling of policy scenarios. This was followed by a discussion of future challenges, new insights into the data needed to determine the inputs to energy models, and methods to incorporate decision making and policy in the models. Based on the discussion, the workshop participants came to the following conclusions and recommendations: Current energy models are already complex, and it is already difficult to collect the model inputs. Hence, new approaches should be transparent and not lead to extremely complex models that try to ''do everything''. The model structure will be determined by the questions that need to be answered. A good understanding of the decision-making framework of policy makers and clear communication on the needs are essential to make any future energy modeling effort successful. There is a need to better understand the effects of policy on future energy use, emissions and the economy. To allow the inclusion of policy instruments in models, evaluation of programs and instruments is essential and needs to be included in policy instrument design. Increased efforts are needed to better understand the effects of innovative (non-monetary) policy instruments through evaluation and to develop approaches to model both conventional and innovative policies. The explicit modeling of barriers and decision making in the models seems a promising way to enable modeling of conventional and innovative policies. A modular modeling approach is essential not only to provide transparency, but also to use the available resources most effectively and efficiently. Many large models have been developed in the past, but have been abandoned after only brief periods of use. A development path based on modular building blocks needs the establishment of a flexible but uniform modeling framework. The leadership of international agencies and organizations is essential in the establishment of such a framework. A preference is given for ''softlinks'' between different modules and models, to increase transparency and reduce complexity. There is a strong need to improve the efficiency of data collection and interpretation efforts to produce reliable model inputs. The workshop participants support the establishment of (in)formal exchanges of information as well as of modeling approaches. The development of an informal network of research institutes and universities to help build a common dataset and exchange ideas on specific areas is proposed. Starting with an exchange of students would be a relatively low-cost way to start such collaboration. It would be essential to focus on specific topics. It is also essential to maintain means of regular exchange of ideas between researchers in the different focus points.
Modeling topology formation during laser ablation
NASA Astrophysics Data System (ADS)
Hodapp, T. W.; Fleming, P. R.
1998-07-01
Micromachining high aspect-ratio structures can be accomplished through ablation of surfaces with high-powered lasers. Industrial manufacturers now use these methods to form complex and regular surfaces at the 10-1000 μm feature size range. Despite its increasingly wide acceptance on the manufacturing floor, the underlying photochemistry of the ablation mechanism, and hence the dynamics of the machining process, is still a question of considerable debate. We have constructed a computer model to investigate and predict the topological formation of ablated structures. Qualitative as well as quantitative agreement with excimer-laser machined polyimide substrates has been demonstrated. This model provides insights into the drilling process for high-aspect-ratio holes.
Economic development and wage inequality: A complex system analysis
Pugliese, Emanuele; Pietronero, Luciano
2017-01-01
Adapting methods from complex system analysis, this paper analyzes the features of the complex relationship between wage inequality and the development and industrialization of a country. Development is understood as a combination of a monetary index, GDP per capita, and a recently introduced measure of a country’s economic complexity: Fitness. Initially the paper looks at wage inequality on a global scale, over the time period 1990–2008. Our empirical results show that globally the movement of wage inequality along with the ongoing industrialization of countries has followed a longitudinally persistent pattern comparable to the one theorized by Kuznets in the fifties: countries with an average level of development suffer the highest levels of wage inequality. Next, the study narrows its focus on wage inequality within the United States. By using data on wages and employment in the approximately 3100 US counties over the time interval 1990–2014, it generalizes the Fitness-Complexity metric for geographic units and industrial sectors, and then investigates wage inequality between NAICS industries. The empirical time and scale dependencies are consistent with a relation between wage inequality and development driven by institutional factors comparing countries, and by change in the structural compositions of sectors in a homogeneous institutional environment, such as the counties of the United States. PMID:28926577
Economic development and wage inequality: A complex system analysis.
Sbardella, Angelica; Pugliese, Emanuele; Pietronero, Luciano
2017-01-01
Adapting methods from complex system analysis, this paper analyzes the features of the complex relationship between wage inequality and the development and industrialization of a country. Development is understood as a combination of a monetary index, GDP per capita, and a recently introduced measure of a country's economic complexity: Fitness. Initially the paper looks at wage inequality on a global scale, over the time period 1990-2008. Our empirical results show that globally the movement of wage inequality along with the ongoing industrialization of countries has followed a longitudinally persistent pattern comparable to the one theorized by Kuznets in the fifties: countries with an average level of development suffer the highest levels of wage inequality. Next, the study narrows its focus on wage inequality within the United States. By using data on wages and employment in the approximately 3100 US counties over the time interval 1990-2014, it generalizes the Fitness-Complexity metric for geographic units and industrial sectors, and then investigates wage inequality between NAICS industries. The empirical time and scale dependencies are consistent with a relation between wage inequality and development driven by institutional factors comparing countries, and by change in the structural compositions of sectors in a homogeneous institutional environment, such as the counties of the United States.
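The Fitness metric referred to above is typically computed with the nonlinear iterative map of Tacchella and co-workers over a binary bipartite matrix of geographic units versus industries or products. The sketch below is a minimal, illustrative implementation of that standard iteration; the toy matrix, variable names, and iteration count are placeholders, not data or code from the paper.

```python
import numpy as np

def fitness_complexity(M, n_iter=100):
    """Iterate the Fitness-Complexity map on a binary bipartite matrix M
    (rows: geographic units, columns: industrial sectors/products)."""
    n_units, n_sectors = M.shape
    F = np.ones(n_units)      # Fitness of each geographic unit
    Q = np.ones(n_sectors)    # Complexity of each sector
    for _ in range(n_iter):
        # Raw updates: fitness sums sector complexities; complexity is
        # penalized by the least-fit units that host the sector.
        F_new = M @ Q
        Q_new = 1.0 / (M.T @ (1.0 / F))
        # Normalize to the mean at every step, as in the standard scheme.
        F = F_new / F_new.mean()
        Q = Q_new / Q_new.mean()
    return F, Q

# Toy example: 3 counties x 4 sectors presence/absence matrix.
M = np.array([[1, 1, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
F, Q = fitness_complexity(M)
print(F, Q)
```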
Climate change trade measures : estimating industry effects
DOT National Transportation Integrated Search
2009-06-01
Estimating the potential effects of domestic emissions pricing for industries in the United States is complex. If the United States were to regulate greenhouse gas emissions, production costs could rise for certain industries and could cause output, ...
Yerger, Valerie B; Przewoznik, Jennifer; Malone, Ruth E
2007-11-01
Industry has played a complex role in the rise of tobacco-related diseases in the United States. The tobacco industry's activities, including targeted marketing, are arguably among the most powerful corporate influences on health and health policy. We analyzed over 400 internal tobacco industry documents to explore how, during the past several decades, the industry targeted inner cities populated predominantly by low-income African American residents with highly concentrated menthol cigarette marketing. We examined how major tobacco companies competed against one another in menthol wars fought within these urban cores. Little previous work has analyzed the way in which the inner city's complex geography of race, class, and place shaped the avenues used by tobacco corporations to increase tobacco use in low-income, predominantly African American urban cores in the 1970s-1990s. Our analysis shows how the industry's activities contributed to the racialized geography of today's tobacco-related health disparities.
ERIC Educational Resources Information Center
Meiners, Erica R.; Reyes, Karen Benita
2008-01-01
In this article, the authors seek to contribute to the growing engagement with the school-prison nexus by considering two, perhaps less obvious, factors that implicate schools in the business of the prison industrial complex (PIC)--the examples of gentrification and sex offender registries. By unpacking some of the rhetoric that surrounds…
CAD-supported university course on photonics and fiber optic communications
NASA Astrophysics Data System (ADS)
Chan, David K. C.; Richter, Andre
2002-05-01
The highly competitive global photonics industry has created a significant demand for professional Photonic Design Automation (PDA) tools and personnel trained to use them effectively. In such a dynamic field, CAD-supported courses built around widely used industrial PDA tools provide many advantages, especially when offered through tertiary education institutions (which are ideally suited to producing the future workforce of the photonics industry). An objective of VPIsystems' University program is to develop tertiary level courses based on VPIsystems' WDM transmission and component modeling software tools. Advantages offered by such courses include: visualizing and aiding the understanding of complex physical problems encountered in the design of fiber-optic communication systems; virtual laboratory exercises that can accurately reproduce the behavior of real systems and components without the prohibitive infrastructure and maintenance costs of real laboratories; flexibility in studying interrelated physical effects individually or in combination to facilitate learning; expertise and practical insights in areas, including industry-focused topics, that are not generally covered in traditional tertiary courses; and exposure to the PDA tools currently most widely used in the industry. In this paper, details of VPIsystems' University program and its CAD-supported Photonics courses will be presented.
Material model for physically based rendering
NASA Astrophysics Data System (ADS)
Robart, Mathieu; Paulin, Mathias; Caubet, Rene
1999-09-01
In computer graphics, a complete knowledge of the interactions between light and a material is essential to obtain photorealistic pictures. Physical measurements allow us to obtain data on the material response, but are limited to industrial surfaces and depend on measurement conditions. Analytic models do exist, but they are often inadequate for common use: the empirical ones are too simple to be realistic, and the physically-based ones are often too complex or too specialized to be generally useful. Therefore, we have developed a multiresolution virtual material model that not only describes the surface of a material, but also its internal structure, thanks to distribution functions of microelements arranged in layers. Each microelement possesses its own response to incident light, from an elementary reflection to a complex response provided by its inner structure, taking into account the geometry, energy, polarization, etc., of each light ray. This model is virtually illuminated in order to compute its response to an incident radiance. This directional response is stored in a compressed data structure using spherical wavelets, and is intended to be used in a rendering model such as directional radiosity.
Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches
NASA Astrophysics Data System (ADS)
Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia
2017-10-01
With the increasing level of complexity and automation in the area of automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to cope with tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Therefore, experimental validation of three different temperature model approaches is carried out, discussed, and compared in the scope of this article. To evaluate the range of application of the presented approaches with respect to further implementation in semi-physical tyre models, the main focus lies on a physical parameterisation. Aside from good modelling accuracy, focus is held on computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.
Challenges in Materials Transformation Modeling for Polyolefins Industry
NASA Astrophysics Data System (ADS)
Lai, Shih-Yaw; Swogger, Kurt W.
2004-06-01
Unlike most published polymer processing and/or forming research, the transformation of polyolefins to fabricated articles often involves non-confined flow or so-called free surface flow (e.g. fiber spinning, blown films, and cast films) in which elongational flow takes place during a fabrication process. Obviously, the characterization and validation of extensional rheological parameters and their use to develop rheological constitutive models are the focus of polyolefins materials transformation research. Unfortunately, challenges remain, with limited validation for non-linear, non-isothermal constitutive models for polyolefins. Further complexity arises in the transformation of polyolefins in the elongational flow system as it involves a stress-induced crystallization process. The complicated nature of elongational, non-linear rheology and non-isothermal crystallization kinetics makes the development of numerical methods very challenging for polyolefins materials forming modeling. From a product-based company standpoint, the challenges of materials transformation research go beyond elongational rheology, crystallization kinetics, and their numerical modeling. In order to make models useful for the polyolefin industry, it is critical to develop links between molecular parameters and both equipment and materials forming parameters. The recent advances in constrained geometry catalysis and materials sciences understanding (INSITE technology and molecular design capability) have made industrial polyolefinic materials forming modeling more viable, due to the fact that the molecular structure of the polymer can be well predicted and controlled during the polymerization. In this paper, we will discuss inter-relationships (models) among molecular parameters such as polymer molecular weight (Mw), molecular weight distribution (MWD), long chain branching (LCB), short chain branching (SCB or comonomer types and distribution) and their effects on shear and elongational rheologies, on tie-molecule probabilities, on non-isothermal stress-induced crystallization, on the crystalline/amorphous orientation vs. mechanical property relationship, etc. All of the above-mentioned inter-relationships (models) are critical to the successful development of a knowledge-based industrial model. Dow Polyolefins and Elastomers business is one of the world's largest polyolefins resin producers with the most advanced INSITE technology and a "6-Day model" molecular design capability. Dow also offers one of the broadest polyolefinic product ranges and applications to the market.
Development Of Simulation Model For Fluid Catalytic Cracking
NASA Astrophysics Data System (ADS)
Ghosh, Sobhan
2010-10-01
Fluid Catalytic Cracking (FCC) is the most widely used secondary conversion process in the refining industry for producing gasoline, olefins, and middle distillates from heavier petroleum fractions. There are more than 500 units in the world, with a total processing capacity of about 17 to 20% of the crude capacity. FCC catalyst is the most consumed catalyst in the process industry. On one hand, FCC is quite flexible with respect to its ability to process a wide variety of crudes with a flexible product yield pattern; on the other hand, the interdependence of the major operating parameters makes the process extremely complex. An operating unit is self-balancing, and some fluctuations in the independent parameters are automatically adjusted by changing the temperatures and flow rates at different sections. However, a good simulation model is very useful to the refiner to get the best out of the process, in terms of selecting the best catalyst and coping with the day-to-day changes in feed quality and in the demands for different products from the FCC unit. In addition, a good model is of great help in designing the process units and peripherals. A simple empirical model is often adequate to monitor day-to-day operations, but it is of no use in handling other problems such as catalyst selection or design/modification of the plant. For this, a rigorous kinetics-based model is required. Considering the complexity of the process, with a large number of chemical species undergoing many parallel and consecutive reactions, it is virtually impossible to develop a simulation model based on the kinetic parameters. The most common approach is to settle for a semi-empirical model. We shall take up the key issues in developing an FCC model and the contribution of such models to the optimum operation of the plant.
[Service quality in health care: the application of the results of marketing research].
Verheggen, F W; Harteloh, P P
1993-01-01
This paper deals with quality assurance in health care and its relation to quality assurance in trade and industry. We present the service quality model--a model of quality from marketing research--and discuss how it can be applied to health care. Traditional quality assurance appears to have serious flaws. It lacks a general theory of the sources of hazards in the complex process of patient care and tends to stagnate, for no real improvement takes place. Departing from this criticism, modern quality assurance in health care is marked by: defining quality in a preferential sense as "fitness for use"; the use of theories and models from trade and industry (process control); an emphasis on analyzing the process, instead of merely inspecting it; use of the Deming problem-solving technique (plan, do, check, act); and improvement of the process of care by altering the perceptions of the parties involved. We present our experience with the application of this method at the University Hospital Maastricht, The Netherlands. The successful application of this model requires a favorable corporate culture and the motivation of the health care workers. This model provides a useful framework for improving upon the traditional approach to quality assurance in health care.
NASA Astrophysics Data System (ADS)
Navaz, H. K.; Dang, A. L.; Atkinson, T.; Zand, A.; Nowakowski, A.; Kamensky, K.
2014-05-01
A general-purpose multi-phase and multi-component computer model capable of solving the complex problems encountered in agent-substrate interaction is developed. The model solves the transient and time-accurate mass and momentum governing equations in three-dimensional space. Provisions for considering all the inter-phase activities (solidification, evaporation, condensation, etc.) are included in the model. Chemical reactions among all phases are allowed, and products of the existing chemical reactions in all three phases are possible. The impact of chemical reaction products on the transport properties in porous media, such as porosity, capillary pressure, and permeability, is considered. Numerous validations for simulants, agents, and pesticides with laboratory and open-air data are presented. Results for chemical reactions in the presence of pre-existing water in porous materials, such as moisture, or separated agent and water droplets on porous substrates are presented. The model will greatly enhance capabilities for predicting the level of threat after the release of any chemical, such as Toxic Industrial Chemicals (TICs) and Toxic Industrial Materials (TIMs), on environmental substrates. The model's generality makes it suitable for both defense and pharmaceutical applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindsey, Nicholas C.
The growth of additive manufacturing as a disruptive technology poses nuclear proliferation concerns worthy of serious consideration. Additive manufacturing began in the early 1980s with technological advances in polymer manipulation, computer capabilities, and computer-aided design (CAD) modeling. It was originally limited to rapid prototyping; however, it eventually developed into a complete means of production that has slowly penetrated the consumer market. Today, additive manufacturing machines can produce complex and unique items in a vast array of materials including plastics, metals, and ceramics. These capabilities have democratized the manufacturing industry, allowing almost anyone to produce items as simple as cup holders or as complex as jet fuel nozzles. Additive manufacturing, or three-dimensional (3D) printing as it is commonly called, relies on CAD files created or shared by individuals with additive manufacturing machines to produce a 3D object from a digital model. This sharing of files means that a 3D object can be scanned or rendered as a CAD model in one country, and then downloaded and printed in another country, allowing items to be shared globally without physically crossing borders. The sharing of CAD files online has been a challenging task for the export controls regime to manage over the years, and additive manufacturing could make these transfers more common. In this sense, additive manufacturing is a disruptive technology not only within the manufacturing industry but also within the nuclear nonproliferation world. This paper provides an overview of the proliferation concerns raised by additive manufacturing.
NASA Astrophysics Data System (ADS)
Prucha, R. H.; Dayton, C. S.; Hawley, C. M.
2002-12-01
The Rocky Flats Environmental Technology Site (RFETS) in Golden, Colorado, a former Department of Energy nuclear weapons manufacturing facility, is currently undergoing closure. The natural semi-arid interaction between surface and subsurface flow at RFETS is complex and is further complicated by the industrial modifications to the flow system. Using a substantial site data set, a distributed parameter, fully-integrated hydrologic model was developed to assess the hydrologic impact of different hypothetical site closure configurations on the current flow system and to better understand the integrated hydrologic behavior of the system. An integrated model with this level of detail has not been previously developed in a semi-arid area, and a unique, but comprehensive, approach was required to calibrate and validate the model. Several hypothetical scenarios were developed to simulate hydrologic effects of modifying different aspects of the site. For example, some of the simulated modifications included regrading the current land surface, changing the existing surface channel network, removing subsurface trenches and gravity drain flow systems, installing a slurry wall and geotechnical cover, changing the current vegetative cover, and converting existing buildings and pavement to permeable soil areas. The integrated flow model was developed using a rigorous physically-based code so that realistic design parameters can simulate these changes. This code also permitted evaluation of changes to complex integrated hydrologic system responses that included channelized and overland flow, pond levels, unsaturated zone storage, groundwater heads and flow directions, and integrated water balances for key areas. Results generally show that channel flow offsite decreases substantially for different scenarios, while groundwater heads generally increase within the reconfigured industrial area, most of which is then discharged as evapotranspiration. These changes have significant implications for site closure and operation.
NASA Astrophysics Data System (ADS)
Wells, M. A.; Samarasekera, I. V.; Brimacombe, J. K.; Hawbolt, E. B.; Lloyd, D. J.
1998-06-01
A comprehensive mathematical model of the hot tandem rolling process for aluminum alloys has been developed. Reflecting the complex thermomechanical and microstructural changes effected in the alloys during rolling, the model incorporated heat flow, plastic deformation, kinetics of static recrystallization, final recrystallized grain size, and texture evolution. The results of this microstructural engineering study, combining computer modeling, laboratory tests, and industrial measurements, are presented in three parts. In this Part I, laboratory measurements of static recrystallization kinetics and final recrystallized grain size are described for AA5182 and AA5052 aluminum alloys and expressed quantitatively by semiempirical equations. In Part II, laboratory measurements of the texture evolution during static recrystallization are described for each of the alloys and expressed mathematically using a modified form of the Avrami equation. Finally, Part III of this article describes the development of an overall mathematical model for an industrial aluminum hot tandem rolling process which incorporates the microstructure and texture equations developed and the model validation using industrial data. The laboratory measurements for the microstructural evolution were carried out using industrially rolled material and a state-of-the-art plane strain compression tester at Alcan International. Each sample was given a single deformation and heat treated in a salt bath at 400 °C for various lengths of time to effect different levels of recrystallization in the samples. The range of hot-working conditions used for the laboratory study was chosen to represent conditions typically seen in industrial aluminum hot tandem rolling processes, i.e., deformation temperatures of 350 °C to 500 °C, strain rates of 0.5 to 100 s⁻¹, and total strains of 0.5 to 2.0. The semiempirical equations developed indicated that both the recrystallization kinetics and the final recrystallized grain size were dependent on the deformation history of the material, i.e., the total strain and the Zener-Hollomon parameter (Z), where $Z = \dot{\varepsilon}\,\exp\left(\frac{Q_{def}}{R\,T_{def}}\right)$, and on the time at the recrystallization temperature.
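As a quick illustration of the quantities appearing in these semiempirical relations, the sketch below evaluates the Zener-Hollomon parameter and an Avrami-type recrystallized fraction; the activation energy, rate constant, and Avrami exponent are placeholder values, not the parameters fitted for AA5182 or AA5052 in this work.

```python
import numpy as np

R = 8.314  # J/(mol K), universal gas constant

def zener_hollomon(strain_rate, T_def_K, Q_def=156e3):
    """Z = strain_rate * exp(Q_def / (R * T_def)); Q_def here is a
    placeholder activation energy, not the fitted value from the study."""
    return strain_rate * np.exp(Q_def / (R * T_def_K))

def avrami_fraction(t, k, n):
    """Avrami-type recrystallized fraction X = 1 - exp(-k * t**n)."""
    return 1.0 - np.exp(-k * t**n)

# Example within the laboratory window quoted above: strain rate 10 /s
# at a deformation temperature of 450 C, annealing times of 1-100 s.
Z = zener_hollomon(strain_rate=10.0, T_def_K=450 + 273.15)
X = avrami_fraction(t=np.array([1.0, 10.0, 100.0]), k=0.05, n=1.5)
print(f"Z = {Z:.3e}", X)
```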
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Frederick
2012-02-01
This report describes conditions, as required by the state of Idaho Industrial Wastewater Reuse Permit (LA-000160-01), for the wastewater reuse site at the Idaho National Laboratory Site's Materials and Fuels Complex Industrial Waste Ditch and Industrial Waste Pond from November 1, 2010 through October 31, 2011. The report contains the following information: (1) Facility and system description; (2) Permit required effluent monitoring data and loading rates; (3) Groundwater monitoring data; (4) Status of special compliance conditions; and (5) Discussion of the facility's environmental impacts. During the 2011 reporting year, an estimated 6.99 million gallons of wastewater were discharged to the Industrial Waste Ditch and Pond, which is well below the permit limit of 13 million gallons per year. Using the dissolved iron data, the concentrations of all permit-required analytes in the samples from the down gradient monitoring wells were below the Ground Water Quality Rule Primary and Secondary Constituent Standards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
David B. Frederick
2011-02-01
This report describes conditions, as required by the state of Idaho Industrial Wastewater Reuse Permit (#LA 000160 01), for the wastewater reuse site at the Idaho National Laboratory Site’s Materials and Fuels Complex Industrial Waste Ditch and Industrial Waste Pond from May 1, 2010 through October 31, 2010. The report contains the following information: (1) Facility and system description; (2) Permit required effluent monitoring data and loading rates; (3) Groundwater monitoring data; (4) Status of special compliance conditions; and (5) Discussion of the facility’s environmental impacts. During the 2010 partial reporting year, an estimated 3.646 million gallons of wastewater were discharged to the Industrial Waste Ditch and Pond, which is well below the permit limit of 13 million gallons per year. The concentrations of all permit-required analytes in the samples from the down gradient monitoring wells were below the Ground Water Quality Rule Primary and Secondary Constituent Standards.
Ilizaliturri-Hernández, César Arturo; González-Mille, Donaji Josefina; Mejía-Saavedra, Jesús; Espinosa-Reyes, Guillermo; Torres-Dosal, Arturo; Pérez-Maldonado, Iván
2013-02-01
The Coatzacoalcos Region in Veracruz, Mexico, houses one of the most important industrial complexes in Mexico and Latin America. Lead is a ubiquitous environmental pollutant that represents a great risk to human health and ecosystems. Amphibian populations have been recognized as biomonitors of changes in environmental conditions. The purpose of this research is to measure lead exposure and evaluate hematological and biochemical effects in specimens of giant toads (Rhinella marina) taken from three areas surrounding an industrial complex in the downstream reach of the Coatzacoalcos River. Lead levels in the toads' blood are between 10.8 and 70.6 μg/dL and are significantly higher at industrial sites. We have found a significant decrease in delta-aminolevulinic acid dehydratase (δ-ALAD) activity in blood, from 35.3 to 78 % for the urban-industrial and industrial sites, respectively. In addition, we have identified a strong inverse relationship between δ-ALAD activity and blood lead levels (r = -0.84, p < 0.001). Hemoglobin and mean corpuscular hemoglobin levels, as well as the condition factor, are found to be lower at industrial sites compared with the reference sites. Our results suggest that R. marina can be considered a good biomonitor of δ-ALAD activity inhibition and hematological alterations at low lead concentrations.
NASA Astrophysics Data System (ADS)
Xiong, H.; Hamila, N.; Boisse, P.
2017-10-01
Pre-impregnated thermoplastic composites have recently attracted increasing interest in the automotive industry for their excellent mechanical properties and their rapid-cycle manufacturing process. Modelling and numerical simulation of forming processes for composite parts with complex geometry are necessary to predict and optimize manufacturing practices, especially regarding consolidation effects. A viscoelastic relaxation model is proposed to characterize the consolidation behavior of thermoplastic prepregs, based on compaction tests over a range of temperatures. The intimate contact model is employed to predict the evolution of consolidation, which permits prediction of the microstructure of voids present through the prepreg. Within a hyperelastic framework, several simulation tests are launched by combining a newly developed solid-shell finite element with the consolidation models.
NASA Astrophysics Data System (ADS)
Hullo, J.-F.; Thibault, G.; Boucheny, C.
2015-02-01
In a context of increased maintenance operations and workforce generational renewal, a nuclear owner and operator like Electricité de France (EDF) is interested in scaling up tools and methods of "as-built virtual reality" for larger buildings and wider audiences. However, the acquisition and sharing of as-built data on a large scale (large and complex multi-floored buildings) challenge current scientific and technical capacities. In this paper, we first present a state of the art of scanning tools and methods for industrial plants with very complex architecture. Then, we introduce the inner characteristics of the multi-sensor scanning and visualization of the interior of the most complex building of a power plant: a nuclear reactor building. We introduce several developments that made possible a first complete survey of such a large building, from acquisition to processing and fusion of multiple data sources (3D laser scans, total-station survey, RGB panoramic, 2D floor plans, 3D CAD as-built models). In addition, we present the concepts of a smart application developed for the painless exploration of the whole dataset. The goal of this application is to help professionals unfamiliar with the manipulation of such datasets to take into account the spatial constraints induced by the building complexity while preparing maintenance operations. Finally, we discuss the main feedback from this large experiment, the remaining issues for the generalization of such large-scale surveys, and the future technical and scientific challenges in the field of industrial "virtual reality".
NASA Astrophysics Data System (ADS)
Kalsom Yusof, Umi; Nor Akmal Khalid, Mohd
2015-05-01
Semiconductor industries need to constantly adjust to the rapid pace of change in the market. Most manufactured products have a very short life cycle. These scenarios imply the need to improve the efficiency of capacity planning, an important aspect of the machine allocation plan known for its complexity. Various studies have been performed to balance productivity and flexibility in the flexible manufacturing system (FMS). Many approaches have been developed by researchers to determine a suitable balance between exploration (global improvement) and exploitation (local improvement). However, not much work has focused on the domain of the machine allocation problem that considers the effects of machine breakdowns. This paper develops a model to minimize the effect of machine breakdowns, thus increasing productivity. The objectives are to minimize system unbalance and makespan as well as to increase throughput while satisfying technological constraints such as machine time availability. To examine the effectiveness of the proposed model, results for throughput, system unbalance, and makespan were obtained on real industrial datasets with the application of intelligence techniques, namely a hybrid of genetic algorithm and harmony search. The aim is to obtain a feasible solution to the domain problem.
NASA Astrophysics Data System (ADS)
Nagy, M.; Behúlová, M.
2017-11-01
Nowadays, laser technology is used in a wide spectrum of applications, especially in engineering, electronics, medicine, and the automotive, aeronautic, and military industries. In the field of mechanical engineering, laser technology has seen its biggest growth in the automotive industry, mainly due to the introduction of automation utilizing 5-axis movements. Modelling and numerical simulation of laser welding processes have been exploited with many advantages for the investigation of physical principles and complex phenomena connected with this joining technology. The paper is focused on the application of numerical simulation to the design of welding parameters for the circumferential laser welding of thin-walled exhaust pipes from AISI 304 steel for the automotive industry. Using the developed and experimentally verified simulation model for laser welding of tubes, the influence of welding parameters, including laser velocities from 30 mm·s⁻¹ to 60 mm·s⁻¹ and laser powers from 500 W to 1200 W, on the temperature fields and dimensions of the fusion zone was investigated using the program code ANSYS. Based on the obtained results, a welding schedule for the laser beam welding of thin-walled tubes from AISI 304 steel was suggested.
Experimental determination of dynamic parameters of an industrial robot
NASA Astrophysics Data System (ADS)
Banas, W.; Cwikła, G.; Foit, K.; Gwiazda, A.; Monica, Z.; Sekala, A.
2017-08-01
Industrial robots are increasingly used in industry. Two basic programming methods are commonly used: on-line programming and off-line programming. In both cases, programming consists of moving to selected points, recording these positions, setting the order of the robot's movements, and introducing logical tests. Such a program is easy to write, and it is suitable for most industrial applications, especially when the process is known, relatively slow, and unchanging. In this case, the program is prepared for a universal model of the robot with the appropriate geometry, and only collisions are checked. The dynamics of the robot, and how it will really behave while in motion, are not taken into account. For this reason, the programmed robot has to be tested at a reduced speed, which is raised gradually to the final value. Depending on the complexity of the move and the proximity of the elements, this takes a lot of time. It is easy to notice that the robot has different trajectories at different speeds and behaves differently.
NASA Astrophysics Data System (ADS)
Maluck, Julian; Donner, Reik V.
2017-02-01
International trade has grown considerably during the process of globalization. Complex supply chains for the production of goods have resulted in an increasingly connected International Trade Network (ITN). Traditionally, direct trade relations between industries have been regarded as mediators of supply and demand spillovers. With increasing network connectivity, the question arises whether higher-order relations become more important in explaining a national sector's susceptibility to supply and demand changes of its trading partners. In this study, we address this question by investigating empirically to what extent the topological properties of the ITN provide information about positive correlations in the production of two industry sectors. We observe that although direct trade relations between industries serve as important indicators for correlations in the industries' value-added growth, opportunities of substitution for required production inputs as well as second-order trade relations cannot be neglected. Our results contribute to a better understanding of the relation between trade and economic productivity and can serve as a basis for the improvement of crisis spreading models that evaluate contagion threats in the case of a node's failure in the ITN.
Radio-Frequency Applications for Food Processing and Safety.
Jiao, Yang; Tang, Juming; Wang, Yifen; Koral, Tony L
2018-03-25
Radio-frequency (RF) heating, as a thermal-processing technology, has been extending its applications in the food industry. Although RF has shown some unique advantages over conventional methods in industrial drying and frozen food thawing, more research is needed to make it applicable for food safety applications because of its complex heating mechanism. This review provides comprehensive information regarding RF-heating history, mechanism, fundamentals, and applications that have already been fully developed or are still under research. The application of mathematical modeling as a useful tool in RF food processing is also reviewed in detail. At the end of the review, we summarize the active research groups in the RF food thermal-processing field, and address the current problems that still need to be overcome.
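For context, the rate at which RF energy is converted to heat in a dielectric food load is commonly expressed with the standard volumetric dielectric-heating relation below; it is included here as general background rather than as a result from the review.

```latex
% Volumetric power converted to heat in the food (W/m^3):
% f     = RF frequency (Hz)
% E_rms = local (rms) electric field strength in the load (V/m)
% eps_0 = permittivity of free space, eps'' = dielectric loss factor
P_v = 2\pi f\,\varepsilon_0\,\varepsilon''\,E_{\mathrm{rms}}^{2}
```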
How to revive breakthrough innovation in the pharmaceutical industry.
Munos, Bernard H; Chin, William W
2011-06-29
Over the past 20 years, pharmaceutical companies have implemented conservative management practices to improve the predictability of therapeutics discovery and success rates of drug candidates. This approach has often yielded compounds that are only marginally better than existing therapies, yet require larger, longer, and more complex trials. To fund them, companies have shifted resources away from drug discovery to late clinical development; this has hurt innovation and amplified the crisis brought by the expiration of patents on many best-selling drugs. Here, we argue that more breakthrough therapeutics will reach patients only if the industry ceases to pursue "safe" incremental innovation, re-engages in high-risk discovery research, and adopts collaborative innovation models that allow sharing of knowledge and costs among collaborators.
Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario
2016-01-01
The development of an error compensation model for coordinate measuring machines (CMMs) and its integration into feature measurement is presented. CMMs are widespread and dependable instruments in industry and laboratories for dimensional measurement. From the tip probe sensor to the machine display, there is a complex transformation of probed point coordinates through the geometrical feature model that makes the assessment of accuracy and measurement uncertainty difficult. Therefore, error compensation is not standardized, unlike for other, simpler instruments. Detailed coordinate error compensation models are generally based on the CMM as a rigid body and require a detailed mapping of the CMM’s behavior. In this paper a new type of error compensation model is proposed. It evaluates the error from the vectorial composition of the length error by axis and its integration into the geometrical measurement model. The variability not explained by the model is incorporated into the uncertainty budget. Model parameters are analyzed and linked to the geometrical errors and uncertainty of the CMM response. Next, the measurement models for flatness, angle, and roundness are developed. The proposed models are useful for measurement improvement with easy integration into CMM signal processing, in particular in industrial environments where built-in solutions are sought. A battery of implementation tests is presented in Part II, where the experimental endorsement of the model is included. PMID:27690052
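As a minimal sketch of what a vectorial composition of per-axis length errors can look like (an illustration under simple assumptions, not the authors' exact formulation): if each machine axis carries a scale-type error e_x, e_y, e_z per unit length, the first-order length error of a measured displacement with components (Δx, Δy, Δz) composes as

```latex
L = \sqrt{\Delta x^{2} + \Delta y^{2} + \Delta z^{2}}, \qquad
\delta L \approx \frac{e_x\,\Delta x^{2} + e_y\,\Delta y^{2} + e_z\,\Delta z^{2}}{L}
```

which follows from expanding the corrected components Δx(1+e_x), Δy(1+e_y), Δz(1+e_z) to first order in the small errors.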
Co-operation in the Training of Students between Higher Education Institutions and Industry.
ERIC Educational Resources Information Center
Meshkov, N. N.
1983-01-01
The evolution, purposes, and administration of teaching-research-industry complexes in the Byelorussian Soviet Socialist Republic, established to improve the coordination of specialist training and industry productivity and technology, are described, and industry progress made in recent years as a result of these programs is outlined. (MSE)
Atmospheric stability and complex terrain: comparing measurements and CFD
NASA Astrophysics Data System (ADS)
Koblitz, T.; Bechmann, A.; Berg, J.; Sogachev, A.; Sørensen, N.; Réthoré, P.-E.
2014-12-01
For wind resource assessment, the wind industry is increasingly relying on Computational Fluid Dynamics models that focus on modeling the airflow in a neutrally stratified surface layer. So far, physical processes that are specific to the atmospheric boundary layer, for example the Coriolis force, buoyancy forces and heat transport, are mostly ignored in state-of-the-art flow solvers. In order to decrease the uncertainty of wind resource assessment, the effect of thermal stratification on the atmospheric boundary layer should be included in such models. The present work focuses on non-neutral atmospheric flow over complex terrain including physical processes like stability and Coriolis force. We examine the influence of these effects on the whole atmospheric boundary layer using the DTU Wind Energy flow solver EllipSys3D. To validate the flow solver, measurements from Benakanahalli hill, a field experiment that took place in India in early 2010, are used. The experiment was specifically designed to address the combined effects of stability and Coriolis force over complex terrain, and provides a dataset to validate flow solvers. Including those effects into EllipSys3D significantly improves the predicted flow field when compared against the measurements.
Potential toxicity and affinity of triphenylmethane dye malachite green to lysozyme.
Ding, Fei; Li, Xiu-Nan; Diao, Jian-Xiong; Sun, Ye; Zhang, Li; Ma, Lin; Yang, Xin-Ling; Zhang, Li; Sun, Ying
2012-04-01
Malachite green is a triphenylmethane dye that is used extensively in many industrial and aquacultural processes, generating environmental concerns and health problems for human beings. In this contribution, the complexation between lysozyme and malachite green was verified by means of computer-aided molecular modeling, steady-state and time-resolved fluorescence, and circular dichroism (CD) approaches. The precise binding patch of malachite green in lysozyme has been identified from molecular modeling and ANS displacement; the Trp-62, Trp-63, and Trp-108 residues of lysozyme were found to possess high affinity for this dye, and the principal forces in the lysozyme-malachite green adduct are hydrophobic and π-π interactions. Steady-state fluorescence indicated that the complexation of malachite green with lysozyme quenches fluorescence through a static mechanism, which is substantiated by time-resolved fluorescence measurements; the lysozyme-malachite green complex has an affinity of the order of 10³ M⁻¹. Moreover, from molecular modeling and CD data, we can safely conclude that the polypeptide chain of lysozyme is partially destabilized upon complexation with malachite green. The data presented here will help to further understand the toxicological action of malachite green in the human body. Copyright © 2012 Elsevier Inc. All rights reserved.
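For orientation, affinities of this kind are often extracted from static-quenching data with the double-logarithmic relation below, where F0 and F are the fluorescence intensities without and with quencher, [Q] is the quencher (here malachite green) concentration, Ka is the binding constant, and n is the number of binding sites; this is a generic treatment commonly used in such studies, not necessarily the exact analysis applied here.

```latex
\log\!\left(\frac{F_0 - F}{F}\right) = \log K_a + n\,\log [Q]
```

A linear fit of log[(F0 - F)/F] against log[Q] then yields log Ka as the intercept and n as the slope.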
Emergent behaviour in a chlorophenol-mineralising three-tiered microbial ‘food web’
Wade, M.J.; Pattinson, R.W.; Parker, N.G.; Dolfing, J.
2016-01-01
Anaerobic digestion enables the water industry to treat wastewater as a resource for generating energy and recovering valuable by-products. The complexity of the anaerobic digestion process has motivated the development of complex models. However, this complexity makes it intractable to pin-point stability and emergent behaviour. Here, the widely used Anaerobic Digestion Model No. 1 (ADM1) has been reduced to its very backbone, a syntrophic two-tiered microbial ‘food chain’ and a slightly more complex three-tiered microbial ‘food web’, with their stability analysed as a function of the inflowing substrate concentration and dilution rate. Parameterised for phenol and chlorophenol degradation, steady-states were always stable and non-oscillatory. Low input concentrations of chlorophenol were sufficient to maintain chlorophenol- and phenol-degrading populations but resulted in poor conversion and a hydrogen flux that was too low to sustain hydrogenotrophic methanogens. The addition of hydrogen and phenol boosted the populations of all three organisms, resulting in the counterintuitive phenomena that (i) the phenol degraders were stimulated by adding hydrogen, even though hydrogen inhibits phenol degradation, and (ii) the dechlorinators indirectly benefitted from measures that stimulated their hydrogenotrophic competitors; both phenomena hint at emergent behaviour. PMID:26551153
Asian-American Studies in the Age of the Prison Industrial Complex: Departures and Re-Narrations
ERIC Educational Resources Information Center
Rodriguez, Dylan
2005-01-01
This essay offers a schematic reflection on the institutional formation and political location of Asian-American Studies in relation to the rise of the United States prison industrial complex over the last three decades. The author is generally concerned with the peculiar location of "Asian-Americans" as fabricated cultural figures within a U.S.…
ERIC Educational Resources Information Center
Maranto, Robert; Van Raemdonck, Dirk C.
2011-01-01
Many people view subgovernments such as the "military-industrial complex" as largely self-governing and budget maximizing. Yet, as defense cutbacks in the 1970s and 1990s show, such networks do not maintain their privileged status indefinitely. In similar fashion, some claim public education is too autonomous and too focused on budget…
ERIC Educational Resources Information Center
Feldman, Jonathan
This book presents the thesis that U.S. universities have become part of an academic-military-industrial complex that support repression and murder in Central America. Part 1 explains how U.S. policies have been based on murder in Central America and examines the responsibility of transnational corporations and U.S. war planners in this…
ERIC Educational Resources Information Center
Patton, Madeline
2015-01-01
Data breach prevention is a battle, rarely plain and never simple. For higher education institutions, the Sisyphean aspects of the task are more complex than for industry and business. Two-year colleges have payrolls and vendor contracts like those enterprises. They also have public record and student confidentiality requirements. Colleges must…
ERIC Educational Resources Information Center
Fink, C. Dennis; And Others
Recent efforts to assess complex human performances in various work settings are reviewed. The review is based upon recent psychological, educational, and industrial literature, and technical reports sponsored by the military services. A few selected military and industrial locations were also visited in order to learn about current research and…
Locating Anomalies in Complex Data Sets Using Visualization and Simulation
NASA Technical Reports Server (NTRS)
Panetta, Karen
2001-01-01
The research goals are to create a simulation framework that can accept any combination of models written at the gate or behavioral level. The framework provides the ability to fault-simulate and to create scenarios of experiments using concurrent simulation. In order to meet these goals we have had to fulfill the following requirements: the ability to accept models written in VHDL, Verilog, or C; the ability to propagate faults through any model type; the ability to create experiment scenarios efficiently without generating every possible combination of variables; and the ability to accept a diversity of fault models beyond the single stuck-at model. Major development has been done on a parser that can accept models written in various languages. This work has generated considerable attention from other universities and industry for its flexibility and usefulness. The parser uses LEXX and YACC to parse Verilog and C. We have also utilized our industrial partnership with Alternative System's Inc. to import VHDL into our simulator. For multilevel simulation, we needed to modify the simulator architecture to accept models that contain multiple outputs. This enabled us to accept behavioral components. The next major accomplishment was the addition of "functional fault models". Functional fault models change the behavior of a gate or model. For example, a bridging fault can make an OR gate behave like an AND gate. This has applications beyond fault simulation. This modeling flexibility will make the simulator more useful for doing verification and model comparison. For instance, two or more versions of an ALU can be comparatively simulated in a single execution. The results will show where and how the models differ so that the performance and correctness of the models may be evaluated. A considerable amount of time has been dedicated to validating the simulator performance on larger models provided by industry and other universities.
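The notion of a functional fault model can be illustrated in a few lines: instead of pinning a node to 0 or 1, the fault substitutes a different gate function, and concurrent simulation flags the input vectors where the good and faulty responses diverge. The toy sketch below is illustrative only and is not code from the simulator described above.

```python
# Toy illustration of a functional fault model: a bridging fault that
# makes an OR gate behave like an AND gate (hypothetical example).

def or_gate(a: int, b: int) -> int:
    return a | b

def and_gate(a: int, b: int) -> int:
    return a & b

def simulate(gate, vectors):
    return [gate(a, b) for a, b in vectors]

vectors = [(0, 0), (0, 1), (1, 0), (1, 1)]
good = simulate(or_gate, vectors)        # fault-free OR response
faulty = simulate(and_gate, vectors)     # OR gate under the bridging fault
# Concurrent fault simulation would report the vectors that detect the fault.
detecting = [v for v, g, f in zip(vectors, good, faulty) if g != f]
print(good, faulty, detecting)           # detecting -> [(0, 1), (1, 0)]
```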
The Application of Simulation Method in Isothermal Elastic Natural Gas Pipeline
NASA Astrophysics Data System (ADS)
Xing, Chunlei; Guan, Shiming; Zhao, Yue; Cao, Jinggang; Chu, Yanji
2018-02-01
The elastic pipeline mathematical model is of crucial importance in natural gas pipeline simulation because of its compliance with practical industrial cases. The numerical model of an elastic pipeline brings non-linear complexity to the discretized equations. Hence, the Newton-Raphson method cannot achieve fast convergence in this kind of problem. Therefore, a new Newton-based method with the Powell-Wolfe condition to simulate isothermal elastic pipeline flow is presented. The results obtained by the new method are given based on the defined boundary conditions. It is shown that the method converges in all cases and reduces computational cost significantly.
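As a concrete flavor of such a scheme, the sketch below shows a generic damped Newton iteration whose step length is accepted only when simple Wolfe-type sufficient-decrease and curvature tests hold on a residual-norm merit function; the merit function, test system, and constants are illustrative assumptions, not the discretized pipeline equations or the exact Powell-Wolfe test of the paper.

```python
import numpy as np

def newton_wolfe(F, J, x0, c1=1e-4, c2=0.9, tol=1e-10, max_iter=50):
    """Damped Newton iteration for F(x)=0 with a backtracking line search
    on the merit function f = 0.5*||F||^2, accepting steps that satisfy
    (weak) Wolfe conditions.  Illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        Jx = J(x)
        d = np.linalg.solve(Jx, -Fx)          # Newton direction
        f0 = 0.5 * Fx @ Fx
        g0 = Jx.T @ Fx                        # gradient of the merit function
        slope = g0 @ d                        # directional derivative (< 0)
        alpha = 1.0
        for _ in range(30):                   # backtracking line search
            x_new = x + alpha * d
            F_new = F(x_new)
            f_new = 0.5 * F_new @ F_new
            g_new = J(x_new).T @ F_new
            armijo = f_new <= f0 + c1 * alpha * slope
            curvature = g_new @ d >= c2 * slope
            if armijo and curvature:          # Wolfe conditions satisfied
                break
            alpha *= 0.5
        x = x_new
    return x

# Tiny usage example on a nonlinear 2x2 system (placeholder equations,
# not the pipeline model): x0^2 + x1 - 3 = 0, x0 + x1^2 - 5 = 0.
F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
J = lambda x: np.array([[2*x[0], 1.0], [1.0, 2*x[1]]])
print(newton_wolfe(F, J, x0=[1.0, 1.0]))      # converges to (1, 2)
```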
Beef cattle growing and backgrounding programs.
Peel, Derrell S
2003-07-01
The stocker industry is one of many diverse production and marketing activities that make up the United States beef industry. The stocker industry is probably the least understood industry sector and yet it plays a vital role in helping the industry exploit its competitive advantage of using forage resources and providing an economical means of adjusting the timing and volume of cattle and meat in a complex market environment.
Cloud Model-Based Artificial Immune Network for Complex Optimization Problem.
Wang, Mingan; Feng, Shuo; Li, Jianming; Li, Zhonghua; Xue, Yu; Guo, Dongliang
2017-01-01
This paper proposes an artificial immune network based on cloud model (AINet-CM) for complex function optimization problems. Three key immune operators-cloning, mutation, and suppression-are redesigned with the help of the cloud model. To be specific, an increasing half cloud-based cloning operator is used to adjust the dynamic clone multipliers of antibodies, an asymmetrical cloud-based mutation operator is used to control the adaptive evolution of antibodies, and a normal similarity cloud-based suppressor is used to keep the diversity of the antibody population. To quicken the searching convergence, a dynamic searching step length strategy is adopted. For comparative study, a series of numerical simulations are arranged between AINet-CM and the other three artificial immune systems, that is, opt-aiNet, IA-AIS, and AAIS-2S. Furthermore, two industrial applications-finite impulse response (FIR) filter design and proportional-integral-differential (PID) controller tuning-are investigated and the results demonstrate the potential searching capability and practical value of the proposed AINet-CM algorithm.
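The normal cloud model underlying these redesigned operators is usually realized with a forward normal cloud generator parameterized by expectation (Ex), entropy (En), and hyper-entropy (He). Below is a minimal sketch of that generator and of a hypothetical cloud-based mutation step; the mutation wrapper and parameter values are illustrative assumptions, not code from the AINet-CM paper.

```python
import numpy as np

def normal_cloud(Ex, En, He, n_drops=1000, rng=None):
    """Forward normal cloud generator: for each drop, draw a perturbed
    entropy En' ~ N(En, He^2), then a drop x ~ N(Ex, En'^2)."""
    rng = np.random.default_rng(rng)
    En_prime = rng.normal(En, He, size=n_drops)
    return rng.normal(Ex, np.abs(En_prime))

def cloud_mutate(antibody, En, He, rng=None):
    """Hypothetical cloud-based mutation: perturb each coordinate with a
    cloud drop centred on its current value (illustration only)."""
    rng = np.random.default_rng(rng)
    return np.array([normal_cloud(x, En, He, n_drops=1, rng=rng)[0]
                     for x in antibody])

drops = normal_cloud(Ex=0.0, En=1.0, He=0.1, n_drops=5, rng=42)
print(drops)
print(cloud_mutate(np.array([1.0, 2.0]), En=0.05, He=0.01, rng=42))
```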
Solid Rocket Motor Combustion Instability Modeling in COMSOL Multiphysics
NASA Technical Reports Server (NTRS)
Fischbach, Sean R.
2015-01-01
Combustion instability modeling of Solid Rocket Motors (SRM) remains a topic of active research. Many rockets display violent fluctuations in pressure, velocity, and temperature originating from the complex interactions between the combustion process, acoustics, and steady-state gas dynamics. Recent advances in defining the energy transport of disturbances within steady flow-fields have been applied by combustion stability modelers to improve the analysis framework [1, 2, 3]. Employing this more accurate global energy balance requires a higher fidelity model of the SRM flow-field and acoustic mode shapes. The current industry standard analysis tool utilizes a one-dimensional analysis of the time-dependent fluid dynamics along with a quasi-three-dimensional propellant grain regression model to determine the SRM ballistics. The code then couples with another application that calculates the eigenvalues of the one-dimensional homogeneous wave equation. The mean flow parameters and acoustic normal modes are coupled to evaluate the stability theory developed and popularized by Culick [4, 5]. The assumption of a linear, non-dissipative wave in a quiescent fluid remains valid while acoustic amplitudes are small and local gas velocities stay below Mach 0.2. The current study employs the COMSOL multiphysics finite element framework to model the steady flow-field parameters and acoustic normal modes of a generic SRM. The study requires one-way coupling of the CFD High Mach Number Flow (HMNF) and mathematics modules. The HMNF module evaluates the gas flow inside of an SRM using St. Robert's law to model the solid propellant burn rate, no-slip boundary conditions, and the hybrid outflow condition. Results from the HMNF model are verified by comparing the pertinent ballistics parameters with the industry standard code outputs (i.e., pressure drop, thrust, etc.). These results are then used by the coefficient form of the mathematics module to determine the complex eigenvalues of the Acoustic Velocity Potential Equation (AVPE). The mathematics model is truncated at the nozzle sonic line, where a zero flux boundary condition is self-satisfying. The remaining boundaries are modeled with a zero flux boundary condition, assuming zero acoustic absorption on all surfaces. The results of the steady-state CFD and AVPE analyses are used to calculate the linear acoustic growth rate as defined by Flandro and Jacob [2, 3]. In order to verify the process implemented within COMSOL, we first employ the Culick theory and compare the results with the industry standard. After the process is verified, the Flandro/Jacob energy balance theory is employed and the results are displayed.
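St. Robert's law, referenced above for the propellant burn rate, is the power-law relation r = a·p^n between burn rate and chamber pressure. The snippet below evaluates it with placeholder propellant constants, purely for illustration; the coefficient and exponent are not values from the study.

```python
def st_roberts_burn_rate(p_chamber_pa, a=3.5e-5, n=0.35):
    """St. Robert's law r = a * p**n in SI units; a and n are placeholder
    propellant constants, not data from the study."""
    return a * p_chamber_pa**n

# Burn rate at a 6 MPa chamber pressure, in m/s (roughly 8 mm/s here).
print(st_roberts_burn_rate(6.0e6))
```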
Sustainable Financing of Innovative Therapies: A Review of Approaches.
Hollis, Aidan
2016-10-01
The process of innovation is inherently complex, and it occurs within an even more complex institutional environment characterized by incomplete information, market power, and externalities. There are therefore different competing approaches to supporting and financing innovation in medical technologies, each bringing its own advantages and disadvantages. This article reviews value- and cost-based pricing, as well as direct government funding and cross-cutting institutional structures. It argues that performance-based risk-sharing agreements are likely to have little effect on the sustainability of financing; that there is a role for cost-based pricing models in some situations; and that the push towards longer exclusivity periods is likely contrary to the interests of industry.
A complex network for studying the transmission mechanisms in stock market
NASA Astrophysics Data System (ADS)
Long, Wen; Guan, Lijing; Shen, Jiangjian; Song, Linqiu; Cui, Lingxiao
2017-10-01
This paper introduces a new complex network to describe the volatility transmission mechanisms in the stock market. The network can not only endogenize the stock market's volatility but also identify the direction of volatility spillover. In this model, we first use BEKK-GARCH to estimate the volatility spillover effects among 18 Chinese industry sectors. Then, based on the ARCH coefficients and GARCH coefficients, the directional shock networks and variance networks in different stages are constructed separately. We find that the spillover effects and network structures change across stages. The results of the topological stability test demonstrate that the connectivity of the networks becomes more fragile to selective attacks than to stochastic attacks.
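To make the network-construction step concrete, the sketch below builds a directed spillover network from a matrix of pairwise coefficients; the random matrix stands in for the estimated BEKK-GARCH ARCH/GARCH terms, and the threshold is an illustrative choice rather than the criterion used in the paper.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
sectors = [f"sector_{i}" for i in range(18)]        # 18 industry sectors
# Placeholder spillover coefficients; in the study these would come from
# the estimated BEKK-GARCH ARCH (shock) or GARCH (variance) terms.
coef = rng.uniform(0.0, 1.0, size=(18, 18))
np.fill_diagonal(coef, 0.0)

threshold = 0.8
G = nx.DiGraph()
G.add_nodes_from(sectors)
for i, src in enumerate(sectors):
    for j, dst in enumerate(sectors):
        if coef[i, j] > threshold:                  # directional spillover i -> j
            G.add_edge(src, dst, weight=coef[i, j])

print(G.number_of_edges(), nx.density(G))
```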
On the dimension of complex responses in nonlinear structural vibrations
NASA Astrophysics Data System (ADS)
Wiebe, R.; Spottswood, S. M.
2016-07-01
The ability to accurately model engineering systems under extreme dynamic loads would prove a major breakthrough in many aspects of aerospace, mechanical, and civil engineering. Extreme loads frequently induce both nonlinearities and coupling, which increase the complexity of the response and the computational cost of finite element models. Dimension reduction has recently gained traction and promises the ability to distill dynamic responses down to a minimal dimension without sacrificing accuracy. In this context, the dimensionality of a response is related to the number of modes needed in a reduced order model to accurately simulate the response. Thus, an important step is characterizing the dimensionality of complex nonlinear responses of structures. In this work, the dimensionality of the nonlinear response of a post-buckled beam is investigated. Significant detail is dedicated to carefully introducing the experiment, the verification of a finite element model, and the dimensionality estimation algorithm, as it is hoped that this system may help serve as a benchmark test case. It is shown that with minor modifications, the method of false nearest neighbors can quantitatively distinguish between the response dimension of various snap-through, non-snap-through, random, and deterministic loads. The state-space dimension of the nonlinear system in question increased from 2 to 10 as the system response moved from simple, low-level harmonic to chaotic snap-through. Beyond the problem studied herein, the techniques developed will serve as a prescriptive guide in developing fast and accurate dimensionally reduced models of nonlinear systems, and eventually as a tool for adaptive dimension-reduction in numerical modeling. The results are especially relevant in the aerospace industry for the design of thin structures such as beams, panels, and shells, which are all capable of spatio-temporally complex dynamic responses that are difficult and computationally expensive to model.
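For readers unfamiliar with the method of false nearest neighbors named above, the sketch below implements a standard Kennel-style test on a delay-embedded scalar series; the tolerance, delay, and test signal are illustrative, and the minor modifications the authors introduce for structural response data are not reproduced here.

```python
import numpy as np

def false_nearest_neighbors(x, m, tau=1, r_tol=10.0):
    """Fraction of false nearest neighbours at embedding dimension m.
    A neighbour is 'false' if adding the (m+1)-th delay coordinate
    stretches the pair distance by more than r_tol (simplified Kennel
    criterion).  Illustrative sketch, not the authors' modified method."""
    x = np.asarray(x, dtype=float)
    n_vec = len(x) - m * tau             # so the (m+1)-th coordinate exists
    # m-dimensional delay vectors
    Y = np.column_stack([x[k * tau: k * tau + n_vec] for k in range(m)])
    extra = x[m * tau: m * tau + n_vec]  # would-be (m+1)-th coordinate
    false_count = 0
    for i in range(n_vec):
        d = np.linalg.norm(Y - Y[i], axis=1)
        d[i] = np.inf                    # exclude the point itself
        j = np.argmin(d)                 # nearest neighbour in dimension m
        if d[j] == 0:
            continue
        growth = abs(extra[i] - extra[j]) / d[j]
        if growth > r_tol:
            false_count += 1
    return false_count / n_vec

# Toy check: a lightly noisy sine needs only a low embedding dimension.
t = np.linspace(0, 40 * np.pi, 4000)
signal = np.sin(t) + 0.01 * np.random.default_rng(1).standard_normal(t.size)
for m in (1, 2, 3, 4):
    print(m, false_nearest_neighbors(signal, m, tau=25))
```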
The impact of manufacturing complexity drivers on performance-a preliminary study
NASA Astrophysics Data System (ADS)
Huah Leang, Suh; Mahmood, Wan Hasrulnizzam Wan; Rahman, Muhamad Arfauz A.
2018-03-01
Manufacturing systems, in pursuit of cost, time and flexibility optimisation, are becoming more and more complex, exhibiting dynamic and nonlinear behaviour. Unpredictability is a distinct characteristic of such behaviour and affects production planning significantly. This study was therefore undertaken to investigate the priority level and current achievement of manufacturing performance in Malaysia’s manufacturing industry, and the effect of complexity drivers on manufacturing productivity performance. The results showed that Malaysia’s manufacturing industry prioritised product quality and managed to achieve good on-time delivery performance. For the other manufacturing performance measures, however, current achievement was slightly lower than the priority assigned to them. For priority status, a strong, significant correlation was observed between efficient production levelling (finished goods) and finished product management, while for current achievement the strong, significant correlations involved minimising the number of workstations and the factory transportation system. This indicates that complexity drivers have an impact on manufacturing performance. Consequently, it is necessary to identify complexity drivers in order to achieve good manufacturing performance.
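As a rough illustration of the correlation screening mentioned above, a sketch using rank correlation on fabricated Likert-style scores; the driver and performance variables, sample size, and scores are all hypothetical.

```python
# Sketch: rank-correlation screening between a complexity driver and a
# manufacturing-performance measure, in the spirit of the study's priority /
# achievement analysis. The survey-style scores are fabricated.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_firms = 40
production_levelling = rng.integers(1, 6, n_firms)              # driver score, 1-5
finished_product_mgmt = np.clip(
    production_levelling + rng.integers(-1, 2, n_firms), 1, 5)  # related performance score

rho, p_value = spearmanr(production_levelling, finished_product_mgmt)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```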
A methodology for overall consequence modeling in chemical industry.
Arunraj, N S; Maiti, J
2009-09-30
Risk assessment in the chemical process industry is a very important issue for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve time-consuming, complex mathematical models and simple aggregation of losses without considering all of the consequence factors, which degrades the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses in reasonable time, to improve the decisive value of the risk estimate. The losses can be broadly categorized into production loss, asset loss, human health and safety loss, and environmental loss. In this paper, a conceptual framework is first developed to assess the overall consequence considering all of the important components of these major losses. Secondly, a methodology is developed for calculating all of the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case-study plant involving benzene extraction. The result obtained using the proposed consequence assessment scheme is compared with those from existing methodologies.
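A minimal sketch of the normalization-and-aggregation idea described above, assuming invented loss figures, bounds, and equal weights purely for illustration; the paper's actual loss models and weighting scheme are not reproduced.

```python
# Sketch of an overall-consequence index: estimate each major loss category,
# normalize to a common 0-1 scale, and combine with weights. The categories
# follow the abstract; all numbers are invented.
losses = {                       # raw loss estimates (e.g. in currency units)
    "production": 2.0e6,
    "assets": 5.0e5,
    "human_health_safety": 1.2e6,
    "environment": 8.0e5,
}
bounds = {k: (0.0, 5.0e6) for k in losses}   # assumed min/max per category
weights = {k: 0.25 for k in losses}          # assumed equal importance

def normalize(value, lo, hi):
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

overall = sum(weights[k] * normalize(v, *bounds[k]) for k, v in losses.items())
print(f"overall consequence index: {overall:.3f}")   # 0 = negligible, 1 = worst case
```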
Exploration of agent of change’s role in biodiesel energy transition process using agent-based model
NASA Astrophysics Data System (ADS)
Hidayatno, A.; Vicky, L. R.; Destyanto, A. R.
2017-11-01
As the world’s largest producer of Crude Palm Oil (CPO), Indonesia uses CPO as the raw material for biodiesel. A number of policies have been designed by the Indonesian government to support the adoption of biodiesel. However, the uptake of this energy alternative faces complex problems. Agent-based modeling can be applied to predict the impact of policies on the actors in the business process and to gain a rich understanding of the behavior and decision making of the biodiesel industries. This study evaluates government policy by examining the adoption of the biodiesel industry in the government-run tender under the intervention of two policy options for biodiesel energy utilization, using an agent-based model. The simulation results show that the policy of adding biodiesel plant installed capacity has a positive impact on increasing production capacity and vendor adoption in the tender. Even so, the government should consider the costs to be incurred and the profits for vendors, so that the biodiesel production targets can be successfully fulfilled.
Realization of a Complex Control & Diagnosis System on Simplified Hardware
NASA Astrophysics Data System (ADS)
Stetter, R.; Swamy Prasad, M.
2015-11-01
Energy is an important factor in today's industrial environment. Pump systems account for about 20% of total industrial electrical energy consumption. Several studies show that with proper monitoring, control and maintenance, the efficiency of pump systems can be increased. Controlling pump systems with intelligent systems can help to reduce a pump's energy consumption by up to one third of its original consumption. The research in this paper was carried out in the scope of a research project involving modelling and simulation of pump systems. This paper focuses on the future implementation of modelling capabilities in PLCs (programmable logic controllers). The overall project aims to use the pump itself as the sensor rather than introducing external sensors into the system, which would increase the cost considerably. One promising approach for an economic and robust industrial implementation of this intelligence is the use of PLCs. PLCs can be simulated in multiple ways; in this project, Codesys was chosen for several reasons which are explained in this paper. The first part of this paper explains the modelling of the pump itself, the process load of the asynchronous motor with a control system, and the simulation possibilities of the motor in Codesys. The second part describes the simulation and testing of the realized system. The third part elaborates the Codesys system structure and the interfacing of the system with external files. The final part compares the results with an earlier Matlab/SIMULINK model and with original test data.
Decision support methods for the environmental assessment of contamination at mining sites.
Jordan, Gyozo; Abdaal, Ahmed
2013-09-01
Polluting mine accidents and the widespread environmental contamination associated with historic mining in Europe and elsewhere have triggered improvements in the related environmental legislation and in the environmental assessment and management methods used by the mining industry. Mining has some unique features: natural background pollution associated with natural mineral deposits, industrial activities and contamination located in the three-dimensional sub-surface space, the problem of long-term remediation after mine closure, and the problem of secondary contaminated areas around mine sites and abandoned mines in historic regions such as Europe. These mining-specific problems require special tools to address the complexity of the environmental problems of mining-related contamination. The objective of this paper is to review and evaluate some of the decision support methods that have been developed and applied to mining contamination. Only those methods that are both efficient decision support tools and provide a 'holistic' approach to the complex problem are considered. These tools are (1) landscape ecology, (2) industrial ecology, (3) landscape geochemistry, (4) geo-environmental models, (5) environmental impact assessment, (6) environmental risk assessment, (7) material flow analysis and (8) life cycle assessment. This inter-disciplinary study should enable both the researcher and the practitioner to obtain a broad view of the state of the art of decision support methods for the environmental assessment of contamination at mine sites. Documented examples and abundant references are also provided.
Biomechanics of Tetrahymena escaping from dead ends
NASA Astrophysics Data System (ADS)
Ishikawa, Takuji; Kikuchi, Kenji
2017-11-01
The behavior of swimming microorganisms in complex environments is important for understanding how cells are distributed in nature and in industry. Although cell swimming and spreading in an infinite fluid has been intensively investigated, behavior in a narrow region bounded by walls is still unclear. In this study, we therefore used Tetrahymena thermophila as a model microorganism and experimentally investigated its behavior between flat plates set at an angle. The results showed that the cells tended to escape from the narrow region, and that the swimming velocity and the radius of curvature of the trajectories decreased as the cells swam into narrower regions. We then developed a computational model of swimming Tetrahymena. The results showed that the escaping behavior could be well explained by fluid mechanics. The knowledge obtained is useful for understanding cell behavior in complex environments, such as porous media and granular matter. This research was supported by JSPS KAKENHI Grants, numbers 25000008 and 17H00853.
Experimental and computational fluid dynamics studies of mixing of complex oral health products
NASA Astrophysics Data System (ADS)
Cortada-Garcia, Marti; Migliozzi, Simona; Weheliye, Weheliye Hashi; Dore, Valentina; Mazzei, Luca; Angeli, Panagiota; ThAMes Multiphase Team
2017-11-01
Highly viscous non-Newtonian fluids are largely used in the manufacturing of specialized oral care products. Mixing often takes place in mechanically stirred vessels where the flow fields and mixing times depend on the geometric configuration and the fluid physical properties. In this research, we study the mixing performance of complex non-Newtonian fluids using Computational Fluid Dynamics models and validate them against experimental laser-based optical techniques. To this aim, we developed a scaled-down version of an industrial mixer. As test fluids, we used mixtures of glycerol and a Carbomer gel. The viscosities of the mixtures against shear rate at different temperatures and phase ratios were measured and found to be well described by the Carreau model. The numerical results were compared against experimental measurements of velocity fields from Particle Image Velocimetry (PIV) and concentration profiles from Planar Laser Induced Fluorescence (PLIF).
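The Carreau model mentioned above can be written as a short function; the parameter values below are illustrative placeholders, not the fitted values for the glycerol/Carbomer mixtures studied in the work.

```python
# Sketch: Carreau viscosity model for a shear-thinning fluid.
# eta(gamma_dot) = eta_inf + (eta_0 - eta_inf) * [1 + (lam*gamma_dot)^2]^((n-1)/2)
import numpy as np

def carreau_viscosity(shear_rate, eta_0, eta_inf, lam, n):
    """Apparent viscosity as a function of shear rate (Carreau model)."""
    return eta_inf + (eta_0 - eta_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

gamma_dot = np.logspace(-2, 3, 6)            # shear rates, 1/s
eta = carreau_viscosity(gamma_dot, eta_0=50.0, eta_inf=0.1, lam=2.0, n=0.4)  # assumed parameters
for g, e in zip(gamma_dot, eta):
    print(f"shear rate {g:8.2f} 1/s  ->  viscosity {e:8.3f} Pa.s")
```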
Characterizing Adsorption Performance of Granular Activated Carbon with Permittivity.
Yang, Yang; Shi, Chao; Zhang, Yi; Ye, Jinghua; Zhu, Huacheng; Huang, Kama
2017-03-07
A number of studies have reached the consensus that microwave thermal technology can regenerate granular activated carbon (GAC) more efficiently and with less energy than other technologies. In the microwave heating industry in particular, permittivity is a crucial parameter. This paper develops two equivalent models to establish the relationship between the effective complex permittivity and the pore volume of GAC, based on Maxwell-Garnett approximation (MGA) theory. With two different assumptions in the model, two quantitative expressions were derived. Permittivity measurements and Brunauer-Emmett-Teller (BET) testing were carried out in the experiments. The results confirmed the two expressions, which were extremely similar, and the theoretical and experimental curves matched. This paper thus builds a bridge linking the effective complex permittivity and the pore volume of GAC, and provides a potential and convenient method for the rapid assisted characterization of the adsorption performance of GAC.
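A minimal sketch of a Maxwell-Garnett-type effective-medium estimate, treating pores as spherical inclusions in a carbon host; the permittivity values and pore fractions are assumptions for illustration, and the paper's two derived expressions may differ in detail.

```python
# Sketch: Maxwell-Garnett effective permittivity of a porous particle,
# with air-filled pores as spherical inclusions in a solid carbon matrix.
def maxwell_garnett(eps_matrix, eps_inclusion, volume_fraction):
    """Effective permittivity of spherical inclusions in a host matrix."""
    num = eps_inclusion + 2 * eps_matrix + 2 * volume_fraction * (eps_inclusion - eps_matrix)
    den = eps_inclusion + 2 * eps_matrix - volume_fraction * (eps_inclusion - eps_matrix)
    return eps_matrix * num / den

eps_carbon = 12.0 - 3.0j        # assumed complex permittivity of the solid carbon
eps_air = 1.0 + 0.0j            # pores filled with air
for pore_fraction in (0.2, 0.4, 0.6):
    eps_eff = maxwell_garnett(eps_carbon, eps_air, pore_fraction)
    print(f"pore volume fraction {pore_fraction:.1f}: eps_eff = {eps_eff:.2f}")
```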
Characterizing Adsorption Performance of Granular Activated Carbon with Permittivity
Yang, Yang; Shi, Chao; Zhang, Yi; Ye, Jinghua; Zhu, Huacheng; Huang, Kama
2017-01-01
A number of studies have reached the consensus that microwave thermal technology can regenerate granular activated carbon (GAC) more efficiently and with less energy than other technologies. In the microwave heating industry in particular, permittivity is a crucial parameter. This paper develops two equivalent models to establish the relationship between the effective complex permittivity and the pore volume of GAC, based on Maxwell-Garnett approximation (MGA) theory. With two different assumptions in the model, two quantitative expressions were derived. Permittivity measurements and Brunauer–Emmett–Teller (BET) testing were carried out in the experiments. The results confirmed the two expressions, which were extremely similar, and the theoretical and experimental curves matched. This paper thus builds a bridge linking the effective complex permittivity and the pore volume of GAC, and provides a potential and convenient method for the rapid assisted characterization of the adsorption performance of GAC. PMID:28772628
Climate Change Impacts and Vulnerability Assessment in Industrial Complexes
NASA Astrophysics Data System (ADS)
Lee, H. J.; Lee, D. K.
2016-12-01
Climate change has recently caused frequent natural disasters, such as floods, droughts, and heat waves, and these disasters have also increased damage to industry. Climate change adaptation policies are needed to reduce this industrial damage, and accurate vulnerability assessment is essential for establishing such policies. This study therefore aims at establishing a new index to assess the vulnerability of industrial complexes. Most vulnerability indices have been developed with subjective approaches, such as the Delphi survey and the Analytic Hierarchy Process (AHP). These subjective approaches rely on the knowledge of a few experts, which undermines the reliability of the indices. To alleviate this problem, we designed a vulnerability index incorporating objective approaches. We investigated 42 industrial complex sites in the Republic of Korea (ROK). To calculate the weights of the variables, we used the entropy method as an objective approach, integrated with the Delphi survey as a subjective one. Finally, we found that our method, integrating both subjective and objective approaches, could generate results; in particular, the integration of the entropy method enables the vulnerability to be assessed objectively. Our method will be useful for establishing climate change adaptation policies by reducing the uncertainties of methods based on purely subjective approaches.
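The entropy weighting step can be sketched as follows, assuming a small, fabricated indicator matrix; the study's actual indicators and its integration with the Delphi survey are not reproduced here.

```python
# Sketch of the entropy weight method for deriving objective indicator weights.
# Rows = industrial complexes, columns = vulnerability indicators (all values invented).
import numpy as np

X = np.array([[0.2, 30.0, 5.0],
              [0.5, 12.0, 9.0],
              [0.9, 25.0, 2.0],
              [0.4, 18.0, 7.0]])            # hypothetical positive indicator values

# 1. Normalize each indicator column so its entries sum to one.
P = X / X.sum(axis=0)

# 2. Shannon entropy of each indicator (k scales entropy into [0, 1]);
#    zero entries would need special handling, all values here are positive.
k = 1.0 / np.log(X.shape[0])
entropy = -k * np.sum(P * np.log(P), axis=0)

# 3. Weights: indicators with lower entropy (more discriminating) weigh more.
weights = (1.0 - entropy) / np.sum(1.0 - entropy)
print("entropy:", np.round(entropy, 3))
print("weights:", np.round(weights, 3))
```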
Klein, Michael T; Hou, Gang; Quann, Richard J; Wei, Wei; Liao, Kai H; Yang, Raymond S H; Campain, Julie A; Mazurek, Monica A; Broadbelt, Linda J
2002-01-01
A chemical engineering approach for the rigorous construction, solution, and optimization of detailed kinetic models for biological processes is described. This modeling capability addresses the required technical components of detailed kinetic modeling, namely, the modeling of reactant structure and composition, the building of the reaction network, the organization of model parameters, the solution of the kinetic model, and the optimization of the model. Even though this modeling approach has enjoyed successful application in the petroleum industry, its application to biomedical research has just begun. We propose to expand the horizons on classic pharmacokinetics and physiologically based pharmacokinetics (PBPK), where human or animal bodies were often described by a few compartments, by integrating PBPK with reaction network modeling described in this article. If one draws a parallel between an oil refinery, where the application of this modeling approach has been very successful, and a human body, the individual processing units in the oil refinery may be considered equivalent to the vital organs of the human body. Even though the cell or organ may be much more complicated, the complex biochemical reaction networks in each organ may be similarly modeled and linked in much the same way as the modeling of the entire oil refinery through linkage of the individual processing units. The integrated chemical engineering software package described in this article, BioMOL, denotes the biological application of molecular-oriented lumping. BioMOL can build a detailed model in 1-1,000 CPU sec using standard desktop hardware. The models solve and optimize using standard and widely available hardware and software and can be presented in the context of a user-friendly interface. We believe this is an engineering tool with great promise in its application to complex biological reaction networks. PMID:12634134
Demonstration of reduced-order urban scale building energy models
Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew; ...
2017-09-08
The aim of this study is to demonstrate a framework for rapidly creating urban-scale reduced-order building energy models, using a systematic summary of the simplifications required to represent building exteriors and thermal zones. These urban-scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to quantify their building energy models systematically and report model complexity alongside model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can use the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefit of the developed framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing a unified exchange of data. Altogether, the results of this study have implications for large-scale modeling of buildings in support of urban energy consumption analyses or the assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.
Demonstration of reduced-order urban scale building energy models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew
The aim of this study is to demonstrate a framework for rapidly creating urban-scale reduced-order building energy models, using a systematic summary of the simplifications required to represent building exteriors and thermal zones. These urban-scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to quantify their building energy models systematically and report model complexity alongside model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can use the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefit of the developed framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing a unified exchange of data. Altogether, the results of this study have implications for large-scale modeling of buildings in support of urban energy consumption analyses or the assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.
Natural Hazards characterisation in industrial practice
NASA Astrophysics Data System (ADS)
Bernardara, Pietro
2017-04-01
The definition of rare hydroclimatic extremes (annual probabilities of occurrence as low as 10^-4) is of the utmost importance for the design of high-value industrial infrastructures such as grids, power plants, and offshore platforms. Underestimation as well as overestimation of the risk may lead to huge costs (e.g., expensive mid-life works or overdesign), which may even prevent a project from happening. Nevertheless, the uncertainties associated with extrapolation towards these rare frequencies are huge and manifold. They are mainly due to the scarcity of observations, the limited quality of extreme value records, and the somewhat arbitrary choice of the models used for extrapolation. This often puts design engineers in uncomfortable situations when they must choose the design values to use. Fortunately, recent progress in earth observation techniques, information technology, historical data collection, and weather and ocean modelling is making huge datasets available. Careful use of big datasets of observations and modelled data is leading towards a better understanding of the physics of the underlying phenomena and of the complex interactions between them, and thus of extreme event frequency extrapolation. This will move engineering practice from single-site, small-sample application of statistical analysis to a more spatially coherent, physically driven extrapolation of extreme values. A few examples from EDF industrial practice are given to illustrate this progress and its potential impact on design approaches.
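As an illustration of the kind of extrapolation discussed above, a sketch that fits a generalized extreme value (GEV) distribution to synthetic annual maxima and reads off a rare return level; the data and the distribution choice are assumptions, not EDF practice.

```python
# Sketch: fit a GEV distribution to annual maxima and extrapolate to the
# 10,000-year event (1e-4 annual exceedance probability). The "observations"
# are synthetic, standing in for roughly 60 years of records.
from scipy.stats import genextreme

annual_maxima = genextreme.rvs(c=-0.1, loc=10.0, scale=2.0, size=60,
                               random_state=42)

shape, loc, scale = genextreme.fit(annual_maxima)
return_level = genextreme.isf(1e-4, shape, loc=loc, scale=scale)
print(f"fitted shape={shape:.2f}, loc={loc:.2f}, scale={scale:.2f}")
print(f"estimated 10,000-year return level: {return_level:.1f}")
```

With only a few decades of data, the fitted shape parameter (and hence the extrapolated return level) carries large sampling uncertainty, which is one of the issues the abstract highlights.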
Cybersecurity in Hospitals: A Systematic, Organizational Perspective.
Jalali, Mohammad S; Kaiser, Jessica P
2018-05-28
Cybersecurity incidents are a growing threat to the health care industry in general and hospitals in particular. The health care industry has lagged behind other industries in protecting its main stakeholder (ie, patients), and now hospitals must invest considerable capital and effort in protecting their systems. However, this is easier said than done because hospitals are extraordinarily technology-saturated, complex organizations with high end point complexity, internal politics, and regulatory pressures. The purpose of this study was to develop a systematic and organizational perspective for studying (1) the dynamics of cybersecurity capability development at hospitals and (2) how these internal organizational dynamics interact to form a system of hospital cybersecurity in the United States. We conducted interviews with hospital chief information officers, chief information security officers, and health care cybersecurity experts; analyzed the interview data; and developed a system dynamics model that unravels the mechanisms by which hospitals build cybersecurity capabilities. We then use simulation analysis to examine how changes to variables within the model affect the likelihood of cyberattacks across both individual hospitals and a system of hospitals. We discuss several key mechanisms that hospitals use to reduce the likelihood of cybercriminal activity. The variable that most influences the risk of cyberattack in a hospital is end point complexity, followed by internal stakeholder alignment. Although resource availability is important in fueling efforts to close cybersecurity capability gaps, low levels of resources could be compensated for by setting a high target level of cybersecurity. To enhance cybersecurity capabilities at hospitals, the main focus of chief information officers and chief information security officers should be on reducing end point complexity and improving internal stakeholder alignment. These strategies can solve cybersecurity problems more effectively than blindly pursuing more resources. On a macro level, the cyber vulnerability of a country's hospital infrastructure is affected by the vulnerabilities of all individual hospitals. In this large system, reducing variation in resource availability makes the whole system less vulnerable-a few hospitals with low resources for cybersecurity threaten the entire infrastructure of health care. In other words, hospitals need to move forward together to make the industry less attractive to cybercriminals. Moreover, although compliance is essential, it does not equal security. Hospitals should set their target level of cybersecurity beyond the requirements of current regulations and policies. As of today, policies mostly address data privacy, not data security. Thus, policy makers need to introduce policies that not only raise the target level of cybersecurity capabilities but also reduce the variability in resource availability across the entire health care system. ©Mohammad S Jalali, Jessica P Kaiser. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 28.05.2018.
Solid Rocket Motor Combustion Instability Modeling in COMSOL Multiphysics
NASA Technical Reports Server (NTRS)
Fischbach, S. R.
2015-01-01
Combustion instability modeling of Solid Rocket Motors (SRM) remains a topic of active research. Many rockets display violent fluctuations in pressure, velocity, and temperature originating from the complex interactions between the combustion process, acoustics, and steady-state gas dynamics. Recent advances in defining the energy transport of disturbances within steady flow-fields have been applied by combustion stability modelers to improve the analysis framework. Employing this more accurate global energy balance requires a higher fidelity model of the SRM flow-field and acoustic mode shapes. The current industry standard analysis tool utilizes a one-dimensional analysis of the time-dependent fluid dynamics along with a quasi-three-dimensional propellant grain regression model to determine the SRM ballistics. The code then couples with another application that calculates the eigenvalues of the one-dimensional homogeneous wave equation. The mean flow parameters and acoustic normal modes are coupled to evaluate the stability theory developed and popularized by Culick. The assumption of a linear, non-dissipative wave in a quiescent fluid remains valid while acoustic amplitudes are small and local gas velocities stay below Mach 0.2. The current study employs the COMSOL Multiphysics finite element framework to model the steady flow-field parameters and acoustic normal modes of a generic SRM. This work builds upon previous efforts to verify the use of the acoustic velocity potential equation (AVPE) laid out by Campos. The acoustic velocity potential $\psi$, describing the acoustic wave motion in the presence of an inhomogeneous steady high-speed flow, is governed by $\nabla^{2}\psi-(\lambda/c)^{2}\psi-\mathbf{M}\cdot\left[\mathbf{M}\cdot\nabla(\nabla\psi)\right]-2\left(\lambda\mathbf{M}/c+\mathbf{M}\cdot\nabla\mathbf{M}\right)\cdot\nabla\psi-2\lambda\psi\left[\mathbf{M}\cdot\nabla(1/c)\right]=0$, with $\mathbf{M}$ the Mach vector, $c$ the speed of sound, and $\lambda$ the complex eigenvalue. The study requires one-way coupling of the CFD High Mach Number Flow (HMNF) and mathematics modules. The HMNF module evaluates the gas flow inside an SRM using St. Robert's law to model the solid propellant burn rate, slip boundary conditions, and the supersonic outflow condition. Results from the HMNF model are verified by comparing the pertinent ballistics parameters with the industry standard code outputs (i.e., pressure drop, axial velocity, exit velocity). These results are then used by the coefficient form of the mathematics module to determine the complex eigenvalues of the AVPE. The mathematics model is truncated at the nozzle sonic line, where a zero-flux boundary condition is self-satisfying. The remaining boundaries are modeled with a zero-flux boundary condition, assuming zero acoustic absorption on all surfaces. The one-way coupled analysis is performed four times using geometries determined through traditional SRM modeling procedures. The results of the steady-state CFD and AVPE analyses are used to calculate the linear acoustic growth rate as defined by Flandro and Jacob. In order to verify the process implemented within COMSOL, we first employ the Culick theory and compare the results with the industry standard. After the process is verified, the Flandro/Jacob energy balance theory is employed and the results displayed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turinsky, Paul J., E-mail: turinsky@ncsu.edu; Kothe, Douglas B., E-mail: kothe@ornl.gov
The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, and all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics “core simulator” based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry. - Highlights: • Complexity of physics based modeling of light water reactor cores being addressed. • Capability developed to help address problems that have challenged the nuclear power industry. • Simulation capabilities that take advantage of high performance computing developed.
Qualitative Fault Isolation of Hybrid Systems: A Structural Model Decomposition-Based Approach
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Daigle, Matthew; Roychoudhury, Indranil
2016-01-01
Quick and robust fault diagnosis is critical to ensuring safe operation of complex engineering systems. A large number of techniques are available to provide fault diagnosis in systems with continuous dynamics. However, many systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete behavioral modes, each with its own continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task computationally more complex due to the large number of possible system modes and the existence of autonomous mode transitions. This paper presents a qualitative fault isolation framework for hybrid systems based on structural model decomposition. The fault isolation is performed by analyzing the qualitative information of the residual deviations. However, in hybrid systems this process becomes complex due to the possible existence of observation delays, which can cause observed deviations to be inconsistent with the expected deviations for the current mode in the system. The great advantage of structural model decomposition is that (i) it allows the design of residuals that respond to only a subset of the faults, and (ii) every time a mode change occurs, only a subset of the residuals needs to be reconfigured, thus reducing the complexity of the reasoning process for isolation purposes. To demonstrate and test the validity of our approach, we use an electric circuit simulation as the case study.
Izard, Catherine F; Weber, Christopher L; Matthews, H Scott
2010-09-01
Carbon Border Tax Adjustments (BTAs) are a politically popular strategy for avoiding competitive disadvantage when a country implements a unilateral climate change policy. A BTA taxes the carbon embodied in imported goods in order to protect domestic industry and motivate other countries to implement climate change policies. To estimate the effectiveness of a BTA, it is necessary to know which products are covered, where they were originally produced and ultimately exported from, and how the covered amount compares to total production in foreign countries. Using a scrap-adjusted, mixed-unit input-output model in conjunction with a multiregional input-output model, this analysis evaluates the effectiveness of BTAs for the case study of U.S. steel imports. Most imported steel by mass is embedded in finished products (60%), and 30% of that steel is produced in a different country than the one from which the final good is exported. Given the magnitudes involved and the complexities of global supply chains, a BTA that protects domestic industry will be a challenge to implement. We propose a logistically feasible BTA structure that minimizes the information burden while still accounting for these complexities. However, the amount of steel imported to the U.S. is negligible (5%) compared to foreign production in BTA-eligible countries and is unlikely to motivate affected countries to impose an emissions reduction policy.
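A minimal sketch of the Leontief input-output accounting that underlies such embodied-flow estimates, with a toy three-sector technology matrix and intensity vector; all numbers are invented, and the study's scrap-adjusted, mixed-unit, multiregional model is far more detailed.

```python
# Sketch: embodied-quantity accounting with a Leontief input-output model.
# x = (I - A)^-1 y gives total sectoral output needed to supply final demand y;
# multiplying by direct intensities f gives the embodied quantity (e.g. steel or carbon).
import numpy as np

A = np.array([[0.10, 0.30, 0.05],     # inputs of sector 1 per unit output of each sector
              [0.05, 0.20, 0.10],     # inputs of sector 2 per unit output of each sector
              [0.02, 0.10, 0.05]])    # inputs of sector 3 per unit output of each sector
y = np.array([10.0, 50.0, 100.0])     # final demand vector (e.g. imported goods), toy values
f = np.array([1.8, 0.4, 0.05])        # direct intensity per unit output, toy values

x = np.linalg.solve(np.eye(3) - A, y)         # total output by sector
embodied = f @ x                              # total embodied quantity
print(f"total output by sector: {np.round(x, 1)}")
print(f"embodied quantity covered by the BTA base: {embodied:.1f}")
```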
A model-based design and validation approach with OMEGA-UML and the IF toolset
NASA Astrophysics Data System (ADS)
Ben-hafaiedh, Imene; Constant, Olivier; Graf, Susanne; Robbana, Riadh
2009-03-01
Intelligent embedded systems such as autonomous robots and other industrial systems are becoming increasingly heterogeneous with respect to the platforms on which they are implemented, and their software architectures are thus becoming more complex to design and analyse. In this context, it is important to have well-defined design methodologies supported by (1) high-level design concepts allowing the design complexity to be mastered, (2) concepts for the expression of non-functional requirements, and (3) analysis tools allowing one to verify or invalidate that the system under development will be able to conform to its requirements. We illustrate such an approach for the design of complex embedded systems using a small case study as a running example for illustration purposes. We briefly present the important concepts of the OMEGA-RT UML profile, show how we use this profile in a modelling approach, and explain how these concepts are used in the IFx verification toolbox to integrate validation into the design flow and make scalable verification possible.
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of the increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans control safety-critical functions. A new systems accident model is developed, based upon modern systems theory and human cognitive processes, to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Industry's Struggle for Skilled Workers.
ERIC Educational Resources Information Center
Barker, Don
1979-01-01
The growing shortage of skilled workers in industrial maintenance, the growing complexity of equipment, and the automation of production processes call for improved and increased employee training and retraining. A General Motors training supervisor notes how education and industry can cooperate to provide this education and training. (MF)
NASA Astrophysics Data System (ADS)
Angst, Sebastian; Engelke, Lukas; Winterer, Markus; Wolf, Dietrich E.
2017-06-01
Densification of (semi-)conducting particle agglomerates with the help of an electrical current is much faster and more energy efficient than traditional thermal sintering or powder compression. Therefore, this method becomes more and more common among experimentalists, engineers, and in industry. The mechanisms at work at the particle scale are highly complex because of the mutual feedback between current and pore structure. This paper extends previous modelling approaches in order to study mixtures of particles of two different materials. In addition to the delivery of Joule heat throughout the sample, especially in current bottlenecks, thermoelectric effects must be taken into account. They lead to segregation or spatial correlations in the particle arrangement. Various model extensions are possible and will be discussed.
Ice Accretion Modeling using an Eulerian Approach for Droplet Impingement
NASA Technical Reports Server (NTRS)
Kim, Joe Woong; Garza, Dennis P.; Sankar, Lakshmi N.; Kreeger, Richard E.
2012-01-01
A three-dimensional Eulerian analysis has been developed for modeling droplet impingement on lifting bodies. The Eulerian model solves the conservation equations of mass and momentum to obtain the droplet flow field properties on the same mesh used in CFD simulations. For complex configurations such as a full rotorcraft, the Eulerian approach is more efficient because the Lagrangian approach would require a significant amount of seeding for accurate estimates of collection efficiency. Simulations are performed for various benchmark cases, such as the NACA0012 airfoil, the MS317 airfoil and an oscillating SC2110 airfoil, to illustrate its use. The present results are compared with results from the Lagrangian approach used in an industry standard analysis called LEWICE.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mike Lewis
2013-02-01
This report describes conditions, as required by the state of Idaho Industrial Wastewater Reuse Permit (WRU-I-0160-01, formerly LA 000160 01), for the wastewater reuse site at the Idaho National Laboratory Site’s Materials and Fuels Complex Industrial Waste Ditch and Industrial Waste Pond from November 1, 2011 through October 31, 2012. The report contains the following information: • Facility and system description • Permit required effluent monitoring data and loading rates • Groundwater monitoring data • Status of special compliance conditions • Discussion of the facility’s environmental impacts. During the 2012 reporting year, an estimated 11.84 million gallons of wastewater were discharged to the Industrial Waste Ditch and Pond, which is well below the permit limit of 17 million gallons per year. The concentrations of all permit-required analytes in the samples from the down gradient monitoring wells were below the Ground Water Quality Rule Primary and Secondary Constituent Standards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Mike
This report describes conditions, as required by the state of Idaho Industrial Wastewater Reuse Permit (WRU-I-0160-01, formerly LA 000160 01), for the wastewater reuse site at the Idaho National Laboratory Site’s Materials and Fuels Complex Industrial Waste Ditch and Industrial Waste Pond from November 1, 2013 through October 31, 2014. The report contains the following information: Facility and system description; Permit required effluent monitoring data and loading rates; Groundwater monitoring data; Status of special compliance conditions; Noncompliance issues; and Discussion of the facility’s environmental impacts. During the 2014 reporting year, an estimated 10.11 million gallons of wastewater were discharged to the Industrial Waste Ditch and Pond, which is well below the permit limit of 17 million gallons per year. The concentrations of all permit-required analytes in the samples from the down gradient monitoring wells were below the applicable Idaho Department of Environmental Quality’s groundwater quality standard levels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mike Lewis
This report describes conditions, as required by the state of Idaho Industrial Wastewater Reuse Permit (WRU-I-0160-01, formerly LA 000160 01), for the wastewater reuse site at the Idaho National Laboratory Site’s Materials and Fuels Complex Industrial Waste Ditch and Industrial Waste Pond from November 1, 2012 through October 31, 2013. The report contains the following information: • Facility and system description • Permit required effluent monitoring data and loading rates • Groundwater monitoring data • Status of special compliance conditions • Discussion of the facility’s environmental impacts. During the 2013 reporting year, an estimated 9.64 million gallons of wastewater were discharged to the Industrial Waste Ditch and Pond, which is well below the permit limit of 17 million gallons per year. The concentrations of all permit-required analytes in the samples from the down gradient monitoring wells were below the applicable Idaho Department of Environmental Quality’s groundwater quality standard levels.
Modeling Particle Exposure in US Trucking Terminals
Davis, ME; Smith, TJ; Laden, F; Hart, JE; Ryan, LM; Garshick, E
2007-01-01
Multi-tiered sampling approaches are common in environmental and occupational exposure assessment, where exposures for a given individual are often modeled based on simultaneous measurements taken at multiple indoor and outdoor sites. The monitoring data from such studies is hierarchical by design, imposing a complex covariance structure that must be accounted for in order to obtain unbiased estimates of exposure. Statistical methods such as structural equation modeling (SEM) represent a useful alternative to simple linear regression in these cases, providing simultaneous and unbiased predictions of each level of exposure based on a set of covariates specific to the exposure setting. We test the SEM approach using data from a large exposure assessment of diesel and combustion particles in the US trucking industry. The exposure assessment includes data from 36 different trucking terminals across the United States sampled between 2001 and 2005, measuring PM2.5 and its elemental carbon (EC), organic carbon (OC) components, by personal monitoring, and sampling at two indoor work locations and an outdoor “background” location. Using the SEM method, we predict: 1) personal exposures as a function of work related exposure and smoking status; 2) work related exposure as a function of terminal characteristics, indoor ventilation, job location, and background exposure conditions; and 3) background exposure conditions as a function of weather, nearby source pollution, and other regional differences across terminal sites. The primary advantage of SEMs in this setting is the ability to simultaneously predict exposures at each of the sampling locations, while accounting for the complex covariance structure among the measurements and descriptive variables. The statistically significant results and high R2 values observed from the trucking industry application supports the broader use of this approach in exposure assessment modeling. PMID:16856739
Modeling the Pre-Industrial Roots of Modern Super-Exponential Population Growth
Stutz, Aaron Jonas
2014-01-01
To Malthus, rapid human population growth—so evident in 18th Century Europe—was obviously unsustainable. In his Essay on the Principle of Population, Malthus cogently argued that environmental and socioeconomic constraints on population rise were inevitable. Yet, he penned his essay on the eve of the global census size reaching one billion, as nearly two centuries of super-exponential increase were taking off. Introducing a novel extension of J. E. Cohen's hallmark coupled difference equation model of human population dynamics and carrying capacity, this article examines just how elastic population growth limits may be in response to demographic change. The revised model involves a simple formalization of how consumption costs influence carrying capacity elasticity over time. Recognizing that complex social resource-extraction networks support ongoing consumption-based investment in family formation and intergenerational resource transfers, it is important to consider how consumption has impacted the human environment and demography—especially as global population has become very large. Sensitivity analysis of the consumption-cost model's fit to historical population estimates, modern census data, and 21st Century demographic projections supports a critical conclusion. The recent population explosion was systemically determined by long-term, distinctly pre-industrial cultural evolution. It is suggested that modern globalizing transitions in technology, susceptibility to infectious disease, information flows and accumulation, and economic complexity were endogenous products of much earlier biocultural evolution of family formation's embeddedness in larger, hierarchically self-organizing cultural systems, which could potentially support high population elasticity of carrying capacity. Modern super-exponential population growth cannot be considered separately from long-term change in the multi-scalar political economy that connects family formation and intergenerational resource transfers to wider institutions and social networks. PMID:25141019
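A toy sketch of a Cohen-style coupled difference-equation system in which carrying capacity responds to population change; the functional form of the feedback and all parameter values are illustrative assumptions, not the article's calibrated consumption-cost model.

```python
# Sketch: population P grows logistically toward carrying capacity K, while K
# itself responds to recent population change with elasticity parameter c.
def simulate(P0=0.5, K0=1.0, r=0.03, c=0.8, steps=300):
    """Iterate P(t+1) = P + r*P*(1 - P/K) and K(t+1) = K + c*(P(t+1) - P)."""
    P, K = P0, K0
    traj = [(P, K)]
    for _ in range(steps):
        P_new = P + r * P * (1.0 - P / K)
        K = K + c * (P_new - P)
        P = P_new
        traj.append((P, K))
    return traj

# c < 1: capacity responds weakly and growth eventually saturates;
# c > 1: capacity keeps receding ahead of population, sustaining growth.
for c in (0.8, 1.2):
    P, K = simulate(c=c)[-1]
    print(f"c={c}:  P(300)={P:.3f}  K(300)={K:.3f}")
```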
NASA Astrophysics Data System (ADS)
Claussen, U.
1984-01-01
The improvement of contrast and visibility of LCD by two different means was undertaken. The two methods are: (1) development of fluorescent dyes to increase the visibility of fluorescent activated displays (FLAD); and (2) development of dichroic dyes to increase the contrast of displays. This work was done in close cooperation with the electronic industry, where the newly synthesized dyes were tested. The targets for the chemical synthesis were selected with the help of computer model calculations. A marketable range of dyes was developed. Since the interest of the electronic industries concerning FLAD was low, the investigations were stopped. Dichroic dyes, especially black mixtures with good light fastness, order parameter, and solubility in nematic phases were developed. The application of these dyes is restricted to indoor use because of an increase of viscosity below -10 C. Applications on a technical scale, e.g., for the automotive industry, will be possible if the displays work at temperatures down to -40 C. This problem requires a complex optimization of the dye/nematic phase system.
Rattanachomsri, Ukrit; Kanokratana, Pattanop; Eurwilaichitr, Lily; Igarashi, Yasuo; Champreda, Verawat
2011-01-01
Sugarcane bagasse is an important lignocellulosic by-product with potential for conversion to biofuels and chemicals in biorefinery. As a step towards an understanding of microbial diversity and the processes existing in bagasse collection sites, the microbial community in industrial bagasse feedstock piles was investigated. Molecular biodiversity analysis of 16S rDNA sequences revealed the presence of a complex bacterial community. A diverse group of mainly aerobic and facultative anaerobic bacteria was identified reflecting the aerobic and high temperature microenvironmental conditions under the pile surface. The major bacterial taxa present were identified as Firmicutes, Alpha- and Gammaproteobacteria, Acidobacteria, Bacteroidetes, and Actinobacteria. Analysis of the eukaryotic microbial assemblage based on an internal transcribed spacer revealed the predominance of diverse cellulolytic and hemicellulolytic ascomycota. A microbial interaction model is proposed, focusing on lignocellulose degradation and methane metabolism. The insights into the microbial community in this study provide a basis for efficient utilization of bagasse in lignocellulosic biomass-based industries.
Tang, Zhentao; Hou, Wenqian; Liu, Xiuming; Wang, Mingfeng; Duan, Yixiang
2016-08-26
Integral analysis plays an important role in the study and quality control of substances with complex matrices in our daily life. As a preliminary step towards the integral analysis of substances with complex matrices, developing a relatively comprehensive and sensitive methodology can offer more informative and reliable characteristic components. Flavoring mixtures, representative of substances with complex matrices, are now widely used in various fields. To better study and control the quality of flavoring mixtures as additives in the food industry, an in-house fabricated solid-phase microextraction (SPME) fiber was prepared based on sol-gel technology in this work. The active organic component of the fiber coating was multi-walled carbon nanotubes (MWCNTs) functionalized with hydroxyl-terminated polydimethyldiphenylsiloxane, which integrates the non-polar and polar chains of both materials. In this way, more sensitive extraction capability for a wider range of compounds can be obtained in comparison with commercial SPME fibers. Preliminary integral analysis of three similar types of samples was realized with the optimized SPME-GC-MS method. With the obtained GC-MS data, a valid and well-fitted model was established by partial least squares discriminant analysis (PLS-DA) for classification of these samples (R2X=0.661, R2Y=0.996, Q2=0.986). The validation of the model (R2=0.266, Q2=-0.465) also confirmed its potential to predict the "belongingness" of new samples. With the PLS-DA and SPSS methods, further screening of the markers among the three similar batches of samples may be helpful for monitoring and controlling the quality of flavoring mixtures as additives in the food industry. Conversely, the reliability and effectiveness of the GC-MS data verified the comprehensive and efficient extraction performance of the in-house fabricated fiber. Copyright © 2016 Elsevier B.V. All rights reserved.
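The PLS-DA classification step might be sketched as follows on placeholder data, regressing one-hot class labels on spectral features with PLS and assigning each sample to the class with the largest predicted response; the component count and data are assumptions, not the study's GC-MS matrix.

```python
# Sketch: PLS-DA on fabricated "GC-MS feature" data for three sample batches.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_per_class, n_features = 10, 50
X = np.vstack([rng.normal(loc=mu, size=(n_per_class, n_features))
               for mu in (0.0, 0.5, 1.0)])          # three sample batches
labels = np.repeat([0, 1, 2], n_per_class)
Y = np.eye(3)[labels]                               # one-hot dummy matrix

pls = PLSRegression(n_components=3)                 # assumed number of latent variables
pls.fit(X, Y)
pred_class = pls.predict(X).argmax(axis=1)          # assign to class with largest response
print("training accuracy:", (pred_class == labels).mean())
```

In practice the model would be assessed with cross-validation and permutation testing (as the R2/Q2 figures in the abstract suggest) rather than training accuracy alone.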
A Modified Theoretical Model of Intrinsic Hardness of Crystalline Solids
Dai, Fu-Zhi; Zhou, Yanchun
2016-01-01
Super-hard materials have been extensively investigated due to their practical importance in numerous industrial applications. To stimulate the design and exploration of new super-hard materials, microscopic models that elucidate the fundamental factors controlling hardness are desirable. The present work modified the theoretical model of intrinsic hardness proposed by Gao. In the modification, we emphasize the critical role of appropriately decomposing a crystal to pseudo-binary crystals, which should be carried out based on the valence electron population of each bond. After modification, the model becomes self-consistent and predicts well the hardness values of many crystals, including crystals composed of complex chemical bonds. The modified model provides fundamental insights into the nature of hardness, which can facilitate the quest for intrinsic super-hard materials. PMID:27604165
NASA Astrophysics Data System (ADS)
Tveito, Knut Omdal; Pakanati, Akash; M'Hamdi, Mohammed; Combeau, Hervé; Založnik, Miha
2018-04-01
Macrosegregation is a result of the interplay of various transport mechanisms, including natural convection, solidification shrinkage, and grain motion. Experimental observations also indicate the impact of grain morphology, ranging from dendritic to globular, on macrosegregation formation. To avoid the complexity arising due to modeling of an equiaxed dendritic grain, we present the development of a simplified three-phase, multiscale equiaxed dendritic solidification model based on the volume-averaging method, which accounts for the above-mentioned transport phenomena. The validity of the model is assessed by comparing it with the full three-phase model without simplifications. It is then applied to qualitatively analyze the impact of grain morphology on macrosegregation formation in an industrial scale direct chill cast aluminum alloy ingot.
NASA Astrophysics Data System (ADS)
Bastien, J.; Picot-Colbeaux, G.; Crastes de Paulet, F.; Rorive, A.; Bouvet, A.; Goderniaux, P.; Thiery, D.
2016-12-01
The Carboniferous Limestone groundwater body extends from east to west across Belgium and the north of France (1420 km²). In a region of high population density and industrial activity, it provides huge volumes of abstracted groundwater (98 Mm³). The aquifer thus constitutes a critical reserve for public distribution and industrial uses. This water reservoir has been intensively exploited from both sides of the border since the end of the 19th century. Historically, this transboundary aquifer was overexploited due to the massive requirements of industry. As a consequence, a substantial piezometric level decrease was observed (up to 50 m). Owing to the karstic nature of the aquifer, many sinkhole collapses were induced in the studied area. A reduction of the abstracted volumes was implemented in the 1990s, which contributed to the relative stabilization of the piezometric levels, but the equilibrium remains uncertain. Because of complex political, urbanistic and industrial developments across this region, a reasonable, long-term management model was needed, involving all of the countries and regions concerned. Within the framework of the Interreg ScaldWIN Project, a Belgian-French collaboration allowed the acquisition of new sets of geological and hydrogeological data. A new piezometric map was established and correlated with chemical and isotopic analyses. It provided a more accurate knowledge of the main flow directions within the aquifer and of the relation between the recharge area and the confined area, where the groundwater is up to 10,000 years old. A new numerical model of the aquifer was implemented and calibrated using the MARTHE code. This 4-layer model includes a part of the French chalk aquifer and integrates all abstracted groundwater volumes (wells and quarries) from 1900 to 2010. Atmospheric and surface waters and potential evapotranspiration are included in relation to the groundwater. This model is used by the different partners to assess, globally and locally, the impact of existing and future abstracted water volumes, and to support sustainable management of the water resources shared by the two countries.
NASA Astrophysics Data System (ADS)
Mottaeva, Asiiat
2017-10-01
The article is dedicated to the problems of the participation of energy enterprises in the socio-economic development of regions and municipalities. It presents a set of mechanisms for implementing the Energy Strategy in the form of strategic initiatives for the development of the energy industry, representing complex inter-industry, public-private, long-term projects. The author considers the development of the energy industry to be a key driver of the socio-economic development of regions, and argues that increasing the competitiveness of the Russian energy sector, geographical and product diversification of exports, and improvement of the quality of export products would help to solve a number of problems in the development of the national economy.
Combined analysis of modeled and monitored SO2 concentrations at a complex smelting facility.
Rehbein, Peter J G; Kennedy, Michael G; Cotsman, David J; Campeau, Madonna A; Greenfield, Monika M; Annett, Melissa A; Lepage, Mike F
2014-03-01
Vale Canada Limited owns and operates a large nickel smelting facility located in Sudbury, Ontario. This is a complex facility with many sources of SO2 emissions, including a mix of source types ranging from passive building roof vents to North America's tallest stack. In addition, as this facility performs batch operations, there is significant variability in the emission rates depending on the operations that are occurring. Although SO2 emission rates for many of the sources have been measured by source testing, the reliability of these emission rates has not been tested from a dispersion modeling perspective. This facility is a significant source of SO2 in the local region, making it critical that, when modeling the emissions from this facility for regulatory or other purposes, the resulting concentrations are representative of what would actually be measured or otherwise observed. To assess the accuracy of the modeling, a detailed analysis of modeled and monitored SO2 data at the facility was performed. A mobile SO2 monitor sampled at five locations downwind of different source groups for different wind directions, resulting in a total of 168 hr of valid data that could be used for the modeled-to-monitored comparison. The facility was modeled in AERMOD (American Meteorological Society/U.S. Environmental Protection Agency Regulatory Model) using site-specific meteorological data such that the modeled periods coincided with the monitored events. In addition, great effort was invested in estimating the actual SO2 emission rates that were likely occurring during each of the monitoring events. SO2 concentrations were modeled for receptors around each monitoring location so that the modeled data could be directly compared with the monitored data. The comparison of modeled and monitored concentrations showed no systematic biases in the modeled concentrations. This paper is a case study of a Combined Analysis of Modelled and Monitored Data (CAMM), an approach promulgated within air quality regulations in the Province of Ontario, Canada. Although combining dispersion models and monitoring data to estimate or refine estimates of source emission rates is not a new technique, this study shows how, with a high degree of rigor in the design of the monitoring and filtering of the data, it can be applied to a large industrial facility with a variety of emission sources. The comparison of modeled and monitored SO2 concentrations in this case study also illustrates AERMOD model performance for a large industrial complex with many sources at short time scales. Overall, this analysis demonstrated that the AERMOD model performed well.
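The study's full comparison protocol is not reproduced here, but the general idea of pairing modeled and monitored concentrations and checking for systematic bias can be illustrated with standard air-quality evaluation metrics. The sketch below is illustrative only: the data are synthetic and the metric choices (fractional bias, factor-of-two agreement) are common conventions rather than the specific statistics reported in the paper.

```python
import numpy as np

# Illustrative sketch with synthetic data (not the study's measurements):
# fractional bias (FB) and the fraction of pairs within a factor of two (FAC2)
# are typical metrics for paired modeled-vs-monitored concentrations.
def fractional_bias(obs, mod):
    return 2.0 * (np.mean(obs) - np.mean(mod)) / (np.mean(obs) + np.mean(mod))

def fac2(obs, mod):
    ratio = mod / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))

rng = np.random.default_rng(42)
observed = rng.lognormal(mean=3.0, sigma=0.6, size=168)            # hypothetical hourly SO2
modeled = observed * rng.lognormal(mean=0.0, sigma=0.4, size=168)  # unbiased model with scatter
print(f"FB = {fractional_bias(observed, modeled):+.2f}, FAC2 = {fac2(observed, modeled):.2f}")
```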
Systems Biology of Industrial Microorganisms
NASA Astrophysics Data System (ADS)
Papini, Marta; Salazar, Margarita; Nielsen, Jens
The field of industrial biotechnology is expanding rapidly as the chemical industry is looking towards more sustainable production of chemicals that can be used as fuels or as building blocks for the production of solvents and materials. In connection with the development of sustainable bioprocesses, it is a major challenge to design and develop efficient cell factories that can ensure cost-efficient conversion of the raw material into the chemical of interest. This is achieved through metabolic engineering, where the metabolism of the cell factory is engineered such that there is an efficient conversion of sugars, the typical raw materials in the fermentation industry, into the desired product. However, engineering of cellular metabolism is often challenging due to the complex regulation that has evolved in connection with adaptation of the different microorganisms to their ecological niches. In order to map these regulatory structures and further de-regulate them, as well as to identify ingenious metabolic engineering strategies that fulfill mass balance constraints, tools from systems biology can be applied. This involves both high-throughput analysis tools such as transcriptome, proteome and metabolome analysis, and the use of mathematical modeling to simulate the phenotypes resulting from the different metabolic engineering strategies. It is in fact expected that systems biology may substantially improve the process of cell factory development, and we therefore propose the term Industrial Systems Biology for how systems biology will enhance the development of industrial biotechnology for sustainable chemical production.
Systems biology of industrial microorganisms.
Papini, Marta; Salazar, Margarita; Nielsen, Jens
2010-01-01
The field of industrial biotechnology is expanding rapidly as the chemical industry is looking towards more sustainable production of chemicals that can be used as fuels or as building blocks for the production of solvents and materials. In connection with the development of sustainable bioprocesses, it is a major challenge to design and develop efficient cell factories that can ensure cost-efficient conversion of the raw material into the chemical of interest. This is achieved through metabolic engineering, where the metabolism of the cell factory is engineered such that there is an efficient conversion of sugars, the typical raw materials in the fermentation industry, into the desired product. However, engineering of cellular metabolism is often challenging due to the complex regulation that has evolved in connection with adaptation of the different microorganisms to their ecological niches. In order to map these regulatory structures and further de-regulate them, as well as to identify ingenious metabolic engineering strategies that fulfill mass balance constraints, tools from systems biology can be applied. This involves both high-throughput analysis tools such as transcriptome, proteome and metabolome analysis, and the use of mathematical modeling to simulate the phenotypes resulting from the different metabolic engineering strategies. It is in fact expected that systems biology may substantially improve the process of cell factory development, and we therefore propose the term Industrial Systems Biology for how systems biology will enhance the development of industrial biotechnology for sustainable chemical production.
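The abstract mentions metabolic engineering strategies that must fulfill mass balance constraints; flux balance analysis is the standard computational expression of that idea. The sketch below is a deliberately tiny, hypothetical three-reaction network (not a model from the paper), solved as a linear program.

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux-balance sketch: steady-state mass balance S v = 0 for one internal
# metabolite A, bounds on uptake and biomass, and maximization of product flux.
#   R1: substrate uptake -> A     R2: A -> biomass     R3: A -> product
S = np.array([[1.0, -1.0, -1.0]])               # balance of metabolite A
bounds = [(0, 10.0), (0.3, None), (0, None)]     # uptake limit, minimal biomass flux
c = np.array([0.0, 0.0, -1.0])                   # maximize v3 (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds)
print("optimal fluxes [uptake, biomass, product]:", np.round(res.x, 3))
```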
Simulating Mass Removal of Groundwater Contaminant Plumes with Complex and Simple Models
NASA Astrophysics Data System (ADS)
Lopez, J.; Guo, Z.; Fogg, G. E.
2016-12-01
Chlorinated solvents used in industrial, commercial, and other applications continue to pose significant threats to human health through contamination of groundwater resources. A recent National Research Council report concludes that it is unlikely that remediation of these complex sites will be achieved in a time frame of 50-100 years under current methods and standards (NRC, 2013). Pump and treat has been a common strategy at many sites to contain and treat groundwater contamination. At these sites, extensive retention of contaminant mass in low-permeability materials (tailing) has been observed after years or decades of pumping. Although transport models can be built that contain enough of the complex, 3D heterogeneity to simulate the tailing and long cleanup times, this is seldom done because of the large data and computational burdens. Hence, useful, reliable models to simulate various cleanup strategies are rare. The purpose of this study is to explore other potential ways to simulate the mass-removal processes in less time and at lower cost while still producing robust results that capture the effects of heterogeneity and long-term retention of mass. A site containing a trichloroethylene groundwater plume was selected as the study area. The plume is located within alluvial sediments in the Tucson Basin. A fully heterogeneous domain is generated first, and MODFLOW is used to simulate the flow field. Contaminant transport is simulated using both MT3D and RWHet for the fully heterogeneous model. Other approaches, including dual-domain mass transfer and heterogeneous chemical reactions, are applied to simulate the mass removal in a less heterogeneous, or homogeneous, domain, and the results are compared to those obtained from the complex models. The capability of these simpler models to simulate remediation processes, and especially to capture the late-time tailing, is examined.
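As a conceptual companion to the dual-domain approach mentioned above, the following zero-dimensional sketch shows how first-order mass transfer between a mobile and an immobile domain produces late-time tailing under flushing. It is not the study's MODFLOW/MT3D/RWHet setup; the porosities, flushing rate, and mass-transfer coefficient are assumed values chosen only for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Zero-dimensional, hedged sketch of dual-domain mass transfer during flushing:
# a mobile concentration Cm is removed by pumping while exchanging mass with an
# immobile concentration Cim, which produces the characteristic tailing.
theta_m, theta_im = 0.25, 0.10   # assumed mobile / immobile porosities
flush = 0.05                      # assumed flushing rate, 1/day
omega = 0.002                     # assumed first-order mass-transfer coefficient, 1/day

def rhs(t, c):
    cm, cim = c
    dcm = -flush * cm + omega * (cim - cm) / theta_m
    dcim = -omega * (cim - cm) / theta_im
    return [dcm, dcim]

sol = solve_ivp(rhs, (0.0, 5000.0), [1.0, 1.0], t_eval=np.linspace(0, 5000, 6))
print(np.round(sol.y[0], 4))   # mobile concentration: fast drop, then slow tail
```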
Gonzalez-Vogel, Alvaro; Eyzaguirre, Jaime; Oleas, Gabriela; Callegari, Eduardo; Navarrete, Mario
2011-01-01
Proteins secreted by filamentous fungi play key roles in different aspects of their biology. The fungus Penicillium purpurogenum, used as a model organism, is able to degrade hemicelluloses and pectins by secreting a variety of enzymes to the culture medium. This work shows that these enzymes interact with each other to form high molecular weight, catalytically active complexes. By using a proteomics approach, we were able to identify several protein complexes in the secretome of this fungus. The expression and assembly of these complexes depend on the carbon source used and display molecular masses ranging from 300 to 700 kDa. These complexes are composed of a variety of enzymes, including arabinofuranosidases, acetyl xylan esterases, feruloyl esterases, β-glucosidases and xylanases. The protein-protein interactions in these multienzyme complexes were confirmed by coimmunoprecipitation assays. One of the complexes was purified from sugar beet pulp cultures and the subunits identified by tandem mass spectrometry. A better understanding of the biological significance of these kinds of interactions will help in the comprehension of the degradation mechanisms used by fungi and may be of special interest to the biotechnology industry.
Roussy, Georges; Dichtel, Bernard; Chaabane, Haykel
2003-01-01
By using a new integrated circuit, which is marketed for Bluetooth applications, it is possible to simplify the measurement of the complex impedance, complex reflection coefficient and complex transmission coefficient in an industrial microwave setup. The Analog Devices circuit AD 8302, which measures gain and phase up to 2.7 GHz, operates with variable-level input signals and is less sensitive to both amplitude and frequency fluctuations of industrial magnetrons than are mixers and AM crystal detectors. Therefore, accurate gain and phase measurements can be performed with low-stability generators. A mechanical setup with an AD 8302 is described; the calibration procedure and its performance are presented.
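For readers unfamiliar with how a gain/phase reading translates into the network quantities mentioned above, the short sketch below converts a magnitude-in-dB and phase-in-degrees pair into a complex reflection coefficient and the corresponding load impedance. The specific numbers and the 50-ohm reference are illustrative assumptions, not values from the paper.

```python
import cmath
import math

# Hedged sketch: turn a gain (dB) and phase (degrees) reading, such as one
# obtained from a gain-and-phase detector behind a directional coupler, into a
# complex reflection coefficient Gamma and a load impedance Z.
def reflection_coefficient(gain_db, phase_deg):
    mag = 10 ** (gain_db / 20.0)                  # ratio of reflected to incident wave
    return mag * cmath.exp(1j * math.radians(phase_deg))

def impedance(gamma, z0=50.0):
    return z0 * (1 + gamma) / (1 - gamma)

gamma = reflection_coefficient(-9.5, 63.0)        # example reading (assumed)
z = impedance(gamma)
print(f"|Gamma| = {abs(gamma):.3f}, Z = {z.real:.1f} {z.imag:+.1f}j ohm")
```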
Development of a numerical methodology for flowforming process simulation of complex geometry tubes
NASA Astrophysics Data System (ADS)
Varela, Sonia; Santos, Maite; Arroyo, Amaia; Pérez, Iñaki; Puigjaner, Joan Francesc; Puigjaner, Blanca
2017-10-01
Nowadays, the incremental flowforming process is being widely explored because the use of complex tubular products is increasing, driven by the lightweighting trend and the use of expensive materials. The enhanced mechanical properties of finished parts, combined with the process efficiency in terms of raw material and energy consumption, are the key factors for its competitiveness and sustainability, which is consistent with EU industry policy. As a promising technology, additional steps for extending the existing flowforming limits in the production of tubular products are required. The objective of the present research is to further expand the current state of the art regarding limitations on tube thickness and diameter, exploring the feasibility of flowforming complex geometries such as tubes with thicknesses of up to 60 mm. In this study, the analysis of the backward flowforming process of a 7075 aluminum tubular preform is carried out as a demonstration case to define the optimum process parameters, machine requirements and tooling geometry. Numerical simulation studies on flowforming of thin-walled tubular components have been considered to increase the knowledge of the technology. The calculation of the rotational movement of the meshed preform, the high thickness-to-length ratio and the thermomechanical conditions significantly increase the computation time of the numerical simulation model. This means that efficient and reliable tools able to predict the forming loads and the quality of flowformed thick tubes are not available. This paper aims to overcome this situation by developing a simulation methodology based on an FEM simulation code and including new strategies. Material characterization has also been performed through tensile testing to support the process design. Finally, to check the reliability of the model, flowforming tests in an industrial environment have been carried out.
Solving Problems With SINDA/FLUINT
NASA Technical Reports Server (NTRS)
2002-01-01
SINDA/FLUINT, the NASA standard software system for thermohydraulic analysis, provides computational simulation of interacting thermal and fluid effects in designs modeled as heat transfer and fluid flow networks. The product saves time and money by making the user's design process faster and easier, and allowing the user to gain a better understanding of complex systems. The code is completely extensible, allowing the user to choose the features, accuracy and approximation levels, and outputs. Users can also add their own customizations as needed to handle unique design tasks or to automate repetitive tasks. Applications for SINDA/FLUINT include the pharmaceutical, petrochemical, biomedical, electronics, and energy industries. The system has been used to simulate nuclear reactors, windshield wipers, and human windpipes. In the automotive industry, it simulates the transient liquid/vapor flows within air conditioning systems.
1981-11-01
... essence of these arrangements is specialization based on international differentials in the costs of labor services. The availability of low ... of electronic equipment vary with the complexity and cost of the equipment, a differentiated market for chips of varying densities, for use in ... level of chip density, while more complex products will be most economically produced with higher levels of chip density. Thus a differentiated ...
ERIC Educational Resources Information Center
Fraser, Cary
2009-01-01
This article presents the author's response to Henry Giroux's "The University in Chains: Confronting the Military-Industrial-Academic Complex." Henry Giroux has written a provocative assessment of the contemporary challenges facing the United States as a society, which over the course of the 20th century had assumed the role of leader and exemplar…
Analysis of Metals Concentration in the Soils of SIPCOT Industrial Complex, Cuddalore, Tamil Nadu
Mathivanan, V.; Prabavathi, R.; Prithabai, C.; Selvisabhanayakam
2010-01-01
Phytoremediation is a promising area of new research, both for its low cost and great benefit to society in the clean recovery of contaminated sites. Phytoremediation is the use of living green plants for in situ risk reduction and/or removal of contaminants from contaminated soil, water, sediments, and air. Specially selected or engineered plants are used in the process. Soil samples were taken from Cuddalore Old Town (OT) and from the SIPCOT industrial complex, the study area, and analyzed for various metal concentrations. Fifteen metals were analyzed following standard procedures, and the detection limits of the metal concentrations were taken as the control. The concentrations of all 15 metals were higher in soil taken from the SIPCOT industrial complex than in samples taken from Cuddalore OT. In all observations, metals such as calcium, cadmium, chromium, cobalt, nickel, and zinc showed maximum concentrations, whereas arsenic, antimony, lead, magnesium, and sodium showed minimum concentrations, both compared with the control. From the present study, it was found that the soil collected from the SIPCOT complex area was more polluted, owing to the presence of various industrial effluents, municipal wastes, and sewage, than the soil collected from Cuddalore OT. PMID:21170256
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sayyar-Rodsari, Bijan; Schweiger, Carl; /SLAC /Pavilion Technologies, Inc., Austin, TX
2010-08-25
Timely estimation of deviations from optimal performance in complex systems and the ability to identify corrective measures in response to the estimated parameter deviations have been the subject of extensive research over the past four decades. The implications in terms of lost revenue from costly industrial processes, the operation of large-scale public works projects, and the volume of the published literature on this topic clearly indicate the significance of the problem. Applications range from manufacturing industries (integrated circuits, automotive, etc.), to large-scale chemical plants, pharmaceutical production, power distribution grids, and avionics. In this project we investigated a new framework for building parsimonious models that are suited for diagnosis and fault estimation of complex technical systems. We used Support Vector Machines (SVMs) to model potentially time-varying parameters of a First-Principles (FP) description of the process. The combined SVM & FP model was built (i.e., model parameters were trained) using constrained optimization techniques. We used the trained models to estimate faults affecting simulated beam lifetime. In the case where a large number of process inputs are required for model-based fault estimation, the proposed framework performs an optimal nonlinear principal component analysis of the large-scale input space and creates a lower-dimension feature space in which fault estimation results can be effectively presented to the operations personnel. To fulfill the main technical objectives of the Phase I research, our Phase I efforts focused on: (1) SVM Training in a Combined Model Structure - We developed the software for the constrained training of the SVMs in a combined model structure, and successfully modeled the parameters of a first-principles model for beam lifetime with support vectors. (2) Higher-order Fidelity of the Combined Model - We used constrained training to ensure that the outputs of the SVM (i.e., the parameters of the beam lifetime model) are physically meaningful. (3) Numerical Efficiency of the Training - We investigated the numerical efficiency of the SVM training. More specifically, for the primal formulation of the training, we developed a problem formulation that avoids the linear increase in the number of constraints as a function of the number of data points. (4) Flexibility of Software Architecture - The software framework for the training of the support vector machines was designed to enable experimentation with different solvers. We experimented with two commonly used nonlinear solvers in our simulations. The primary application of interest for this project has been the sustained optimal operation of particle accelerators at the Stanford Linear Accelerator Center (SLAC). Particle storage rings are used for a variety of applications ranging from 'colliding beam' systems for high-energy physics research to highly collimated x-ray generators for synchrotron radiation science. Linear accelerators are also used for collider research such as the International Linear Collider (ILC), as well as for free electron lasers, such as the Linac Coherent Light Source (LCLS) at SLAC. One common theme in the operation of storage rings and linear accelerators is the need to precisely control the particle beams over long periods of time with minimum beam loss and stable, yet challenging, beam parameters.
We strongly believe that, beyond applications in particle accelerators, the high fidelity and cost benefits of a combined model-based fault estimation/correction system will attract customers from a wide variety of commercial and scientific industries. Even though the acquisition of Pavilion Technologies, Inc. by Rockwell Automation Inc. in 2007 altered the small-business status of Pavilion so that it no longer qualifies for Phase II funding, our findings in the course of the Phase I research have convinced us that further research will render a workable model-based fault estimation and correction system for particle accelerators and industrial plants feasible.
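The project's constrained, combined-model training is not reproduced here, but the basic idea of letting an empirical learner supply a time-varying parameter inside a first-principles structure can be sketched simply. The example below is hypothetical: it uses a made-up exponential-decay "beam lifetime" law and trains an off-the-shelf support vector regressor directly on synthetic parameter observations, rather than through the constrained optimization described in the abstract.

```python
import numpy as np
from sklearn.svm import SVR

# Hedged sketch of the combined-model idea: a first-principles decay model
#   n(t) = n0 * exp(-t / tau(u))
# whose lifetime parameter tau depends on an operating condition u; an SVR is
# trained to model tau(u) from data, and the FP structure is kept for prediction.
def first_principles(n0, t, tau):
    return n0 * np.exp(-t / tau)

rng = np.random.default_rng(1)
u = rng.uniform(0.0, 1.0, 200).reshape(-1, 1)        # assumed operating condition
tau_true = 5.0 + 3.0 * np.sin(2 * np.pi * u[:, 0])   # hidden parameter law (made up)
tau_obs = tau_true + rng.normal(0, 0.1, 200)         # "measured" lifetimes

svm_tau = SVR(kernel="rbf", C=10.0).fit(u, tau_obs)  # empirical parameter model

u_new = np.array([[0.25]])
tau_hat = svm_tau.predict(u_new)[0]
print("predicted beam intensity after 2 s:", first_principles(1.0, 2.0, tau_hat))
```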
Coarse-grained simulation of polymer-filler blends
NASA Astrophysics Data System (ADS)
Legters, Gregg; Kuppa, Vikram; Beaucage, Gregory; Univ of Dayton Collaboration; Univ of Cincinnati Collaboration
The practical use of polymers often relies on additives that improve the properties of the mixture. Examples of such complex blends include tires, pigments, blowing agents and other reactive additives in thermoplastics, and recycled polymers. Such systems usually exhibit a complex partitioning of the components. Most prior work has either focused on fine-grained details, such as molecular modeling of chains at interfaces, or on coarse, heuristic, trial-and-error approaches to compounding (e.g., in the tire industry). Thus, there is a significant gap in our understanding of how complex hierarchical structure (across several decades in length) develops in these multicomponent systems. This research employs dissipative particle dynamics (DPD) in conjunction with a pseudo-thermodynamic parameter derived from scattering experiments to represent polymer-filler interactions. DPD simulations will probe how filler dispersion and hierarchical morphology develop in these complex blends, and are validated against experimental (scattering) data. The outcome of our approach is a practical solution to compounding issues, based on a mutually validating experimental and simulation methodology. Support from the NSF (CMMI-1636036/1635865) is gratefully acknowledged.
The chemical industry faces environmental, social and health challenges that are common across all economic sectors. From worker exposure to toxic substances, to product design and use, to the cost and handling of waste disposal, the industry must overcome numerous complex hurdle...
48 CFR 315.201 - Exchanges with industry before receipt of proposals.
Code of Federal Regulations, 2010 CFR
2010-10-01
48 Federal Acquisition Regulations System ... Receipt of Proposals and Information, 315.201 Exchanges with industry before receipt of proposals. (e)(1) ... complex projects involving R&D, IT, construction, and other highly technical requirements. An RFI may ...
NASA Astrophysics Data System (ADS)
Debnath, Lokenath
2010-09-01
This article is essentially devoted to a brief historical introduction to Euler's formula for polyhedra, topology, and the theory of graphs and networks, with many examples from the real world. The celebrated Königsberg seven-bridge problem and some of the basic properties of graphs and networks, useful for understanding the macroscopic behaviour of real physical systems, are included. We also mention some important and modern applications of graph theory or network problems, from transportation to telecommunications. Graphs or networks are effectively used as powerful tools in industrial, electrical and civil engineering, in communication networks, and in the planning of business and industry. Graph theory and combinatorics can be used to understand the changes that occur in many large and complex scientific, technical and medical systems. With the advent of fast large computers and the ubiquitous Internet, consisting of a very large network of computers, large-scale complex optimization problems can be modelled in terms of graphs or networks and then solved by algorithms available in graph theory. Many larger and more complex combinatorial problems, dealing with the possible arrangements of situations of various kinds and with computing the number and properties of such arrangements, can be formulated in terms of networks. The Knight's tour problem, Hamilton's tour problem, the problem of magic squares, the Euler Graeco-Latin squares problem and their modern developments in the twentieth century are also included.
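Two of the classical results mentioned above lend themselves to a few lines of code: Euler's polyhedron formula V - E + F = 2 and the degree-parity argument that settles the Königsberg seven-bridge problem. The snippet below is a minimal illustration, not tied to any specific example in the article.

```python
# Euler's polyhedron formula checked for a cube, and the degree-parity argument
# showing the Königsberg walk is impossible (more than two odd-degree nodes).
V, E, F = 8, 12, 6                      # cube
assert V - E + F == 2

# Königsberg multigraph: land masses A, B, C, D and the seven bridges
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]
degree = {}
for u, v in bridges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
odd_nodes = [n for n, d in degree.items() if d % 2]
print("odd-degree land masses:", odd_nodes)                 # all four
print("Eulerian walk possible:", len(odd_nodes) in (0, 2))  # False
```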
Sensors, nano-electronics and photonics for the Army of 2030 and beyond
NASA Astrophysics Data System (ADS)
Perconti, Philip; Alberts, W. C. K.; Bajaj, Jagmohan; Schuster, Jonathan; Reed, Meredith
2016-02-01
The US Army's future operating concept will rely heavily on sensors, nano-electronics and photonics technologies to rapidly develop situational understanding in challenging and complex environments. Recent technology breakthroughs in integrated 3D multiscale semiconductor modeling (from atoms-to-sensors), combined with ARL's Open Campus business model for collaborative research provide a unique opportunity to accelerate the adoption of new technology for reduced size, weight, power, and cost of Army equipment. This paper presents recent research efforts on multi-scale modeling at the US Army Research Laboratory (ARL) and proposes the establishment of a modeling consortium or center for semiconductor materials modeling. ARL's proposed Center for Semiconductor Materials Modeling brings together government, academia, and industry in a collaborative fashion to continuously push semiconductor research forward for the mutual benefit of all Army partners.
NASA Astrophysics Data System (ADS)
Junker, Philipp; Hackl, Klaus
2016-09-01
Numerical simulations are a powerful tool to analyze the complex, thermo-mechanically coupled material behavior of shape memory alloys during product engineering. The benefit of the simulations strongly depends on the quality of the underlying material model. In this contribution, we discuss a variational approach which is based solely on energetic considerations and demonstrate that a unique calibration of such a model is sufficient to predict the material behavior at varying ambient temperature. In the beginning, we recall the necessary equations of the material model and explain the fundamental idea. Afterwards, we focus on the numerical implementation and provide all information that is needed for programming. Then, we show two different ways to calibrate the model and discuss the results. Furthermore, we show how this model is used during real-life industrial product engineering.
Metaphors to Drive By: Exploring New Ways to Guide Human-Robot Interaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
David J. Bruemmer; David I. Gertman; Curtis W. Nielsen
2007-08-01
Autonomous behaviors created by the research and development community are not being extensively utilized within energy, defense, security, or industrial contexts. This paper provides evidence that the interaction methods used alongside these behaviors may not provide a mental model that can be easily adopted or used by operators. Although autonomy has the potential to reduce overall workload, the use of robot behaviors often increased the complexity of the underlying interaction metaphor. This paper reports our development of new metaphors that support increased robot complexity without passing the complexity of the interaction onto the operator. Furthermore, we illustrate how recognition of problems in human-robot interactions can drive the creation of new metaphors for design and how human factors lessons in usability, human performance, and our social contract with technology have the potential for enormous payoff in terms of establishing effective, user-friendly robot systems when appropriate metaphors are used.
Difficult Decisions Made Easier
NASA Technical Reports Server (NTRS)
2006-01-01
NASA missions are extremely complex and prone to sudden, catastrophic failure if equipment falters or if an unforeseen event occurs. For these reasons, NASA trains to expect the unexpected. It tests its equipment and systems in extreme conditions, and it develops risk-analysis tests to foresee any possible problems. The Space Agency recently worked with an industry partner to develop reliability analysis software capable of modeling complex, highly dynamic systems, taking into account variations in input parameters and the evolution of the system over the course of a mission. The goal of this research was multifold. It included performance and risk analyses of complex, multiphase missions, like the insertion of the Mars Reconnaissance Orbiter; reliability analyses of systems with redundant and/or repairable components; optimization analyses of system configurations with respect to cost and reliability; and sensitivity analyses to identify optimal areas for uncertainty reduction or performance enhancement.
Light-Directed Particle Patterning by Evaporative Optical Marangoni Assembly.
Varanakkottu, Subramanyan Namboodiri; Anyfantakis, Manos; Morel, Mathieu; Rudiuk, Sergii; Baigl, Damien
2016-01-13
Controlled particle deposition on surfaces is crucial for both exploiting collective properties of particles and their integration into devices. Most available methods depend on intrinsic properties of either the substrate or the particles to be deposited making them difficult to apply to complex, naturally occurring or industrial formulations. Here we describe a new strategy to pattern particles from an evaporating drop, regardless of inherent particle characteristics and suspension composition. We use light to generate Marangoni surface stresses resulting in flow patterns that accumulate particles at predefined positions. Using projected images, we generate a broad variety of complex patterns, including multiple spots, lines and letters. Strikingly, this method, which we call evaporative optical Marangoni assembly (eOMA), allows us to pattern particles regardless of their size or surface properties, in model suspensions as well as in complex, real-world formulations such as commercial coffee.
The importance of ligand speciation in environmental research: a case study.
Sillanpää, M; Orama, M; Rämö, J; Oikari, A
2001-02-21
The speciation of EDTA and DTPA in process, waste and river waters is modelled and simulated, with specific reference to their mode of occurrence in pulp and paper mill effluents and subsequently in receiving waters. Owing to the relatively short residence times in the bleaching process and waste water treatment and to slow exchange kinetics, it is expected that thermodynamic equilibrium is not necessarily reached. Therefore, the initial speciation plays a key role. As such, the simulations have been extended to the process waters of the pulp and paper industry, taking into account estimated average conditions. The results reveal that the main species are Mn and Ca complexes of EDTA and DTPA in pulp mill process waters, Fe(III) and Mn complexes of EDTA and DTPA in waste waters, and Fe(III) and Zn complexes of EDTA and DTPA in receiving waters. It is also shown how an increasing concentration of complexing agents affects the speciation. Alkaline earth metal chelation plays a significant role in the speciation of EDTA and DTPA when there is a noticeable molar excess of complexing agents compared with transition metals.
Mechanics of Cellulose Synthase Complexes in Living Plant Cells
NASA Astrophysics Data System (ADS)
Zehfroosh, Nina; Liu, Derui; Ramos, Kieran P.; Yang, Xiaoli; Goldner, Lori S.; Baskin, Tobias I.
The polymer cellulose is one of the major components of the world's biomass, with unique and fascinating characteristics such as its high tensile strength, renewability, biodegradability, and biocompatibility. Because of these distinctive aspects, cellulose has been the subject of enormous scientific and industrial interest, yet there are still fundamental open questions about cellulose biosynthesis. Cellulose is synthesized by a complex of transmembrane proteins called "Cellulose Synthase A" (CESA) in the plasma membrane. Studying the dynamics and kinematics of the CESA complex will help reveal the mechanism of cellulose synthesis and permit the development and validation of models of CESA motility. To understand what drives these complexes through the cell membrane, we used total internal reflection fluorescence microscopy (TIRFM) and variable-angle epi-fluorescence microscopy to track individual, fluorescently labeled CESA complexes as they move in the hypocotyl and root of living plants. A mean square displacement analysis will be applied to distinguish ballistic, diffusional, and other forms of motion. We report on the results of these tracking experiments. This work was funded by NSF/PHY-1205989.
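A mean square displacement analysis of the kind referred to above can be sketched in a few lines: compute the MSD as a function of lag time and estimate the scaling exponent, with values near 1 indicating diffusive motion and values near 2 indicating ballistic (directed) motion. The trajectory below is synthetic; it is not the authors' tracking data or analysis code, and the drift value is an assumption for illustration.

```python
import numpy as np

# Illustrative MSD analysis of a 2-D trajectory: MSD ~ tau^alpha, where
# alpha ~ 1 suggests diffusion and alpha ~ 2 suggests ballistic motion
# such as motor-driven CESA movement.
def msd(track, max_lag):
    # track: (N, 2) array of x, y positions at equally spaced times
    lags = np.arange(1, max_lag)
    out = []
    for lag in lags:
        disp = track[lag:] - track[:-lag]
        out.append(np.mean(np.sum(disp**2, axis=1)))
    return lags, np.array(out)

rng = np.random.default_rng(0)
drift = 5e-3                                   # assumed ballistic component per frame
track = np.cumsum(rng.normal(0, 1e-2, (500, 2)), axis=0)
track[:, 0] += drift * np.arange(500)          # add directed motion along x
lags, m = msd(track, 50)
alpha = np.polyfit(np.log(lags), np.log(m), 1)[0]
print(f"estimated scaling exponent alpha = {alpha:.2f}")
```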
Saving lives: A meta-analysis of team training in healthcare.
Hughes, Ashley M; Gregory, Megan E; Joseph, Dana L; Sonesh, Shirley C; Marlow, Shannon L; Lacerenza, Christina N; Benishek, Lauren E; King, Heidi B; Salas, Eduardo
2016-09-01
As the nature of work becomes more complex, teams have become necessary to ensure effective functioning within organizations. The healthcare industry is no exception. As such, the prevalence of training interventions designed to optimize teamwork in this industry has increased substantially over the last 10 years (Weaver, Dy, & Rosen, 2014). Using Kirkpatrick's (1956, 1996) training evaluation framework, we conducted a meta-analytic examination of healthcare team training to quantify its effectiveness and understand the conditions under which it is most successful. Results demonstrate that healthcare team training improves each of Kirkpatrick's criteria (reactions, learning, transfer, results; d = .37 to .89). Second, findings indicate that healthcare team training is largely robust to trainee composition, training strategy, and characteristics of the work environment, with the only exception being the reduced effectiveness of team training programs that involve feedback. As a tertiary goal, we proposed and found empirical support for a sequential model of healthcare team training where team training affects results via learning, which leads to transfer, which increases results. We find support for this sequential model in the healthcare industry (i.e., the current meta-analysis) and in training across all industries (i.e., using meta-analytic estimates from Arthur, Bennett, Edens, & Bell, 2003), suggesting the sequential benefits of training are not unique to medical teams. Ultimately, this meta-analysis supports the expanded use of team training and points toward recommendations for optimizing its effectiveness within healthcare settings. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Scramjet Combustor Simulations Using Reduced Chemical Kinetics for Practical Fuels
2003-12-01
... the aerospace industry in reducing prototype and testing costs and the time needed to bring products to market. Accurate simulation of chemical ... JP-8 kinetics and soot models into the UNICORN CFD code (Montgomery et al., 2003a); NSF Phase I and II SBIRs for development of a computer-assisted ... [Nomenclature: ... divided by diameter; QSS, quasi-steady state; REI, Reaction Engineering International; UNICORN, UNsteady Ignition and COmbustion with ReactioNs; VULCAN, Viscous Upwind aLgorithm for Complex flow ANalysis]
Leading Change: Implementation of a New Care Coordination Model.
Ireland, Anne M
2016-05-01
Today's healthcare environment is characterized by a multitude of changes: acquisitions and mergers, streamlining of operations, restructuring and leadership shifts, new regulatory requirements with the implementation of the 10th revision of the International Statistical Classification of Diseases and Related Health Problems and meaningful use, and advances in technology driven by the adoption of electronic health records. The impact of these changes is complex and fraught with challenges in an industry that historically and culturally is cautious and slow to change.
Symbolic PathFinder: Symbolic Execution of Java Bytecode
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Rungta, Neha
2010-01-01
Symbolic PathFinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the-shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
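SPF itself analyzes Java bytecode, but the core idea of collecting a path condition and handing it to a constraint solver can be illustrated with a toy example. The sketch below uses the z3 solver's Python bindings on a hypothetical two-branch condition; it is a conceptual illustration, not part of the SPF tool.

```python
from z3 import Int, Solver, sat

# Toy illustration of symbolic execution: for code such as
#     if x > 10 and x + y == 25: reach_target()
# the symbolic executor collects the path condition along the branch of
# interest and asks a solver for concrete inputs that satisfy it.
x, y = Int('x'), Int('y')
path_condition = [x > 10, x + y == 25]

s = Solver()
s.add(*path_condition)
if s.check() == sat:
    m = s.model()
    print("test input reaching the target branch:", m[x], m[y])
```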
Learning Human Aspects of Collaborative Software Development
ERIC Educational Resources Information Center
Hadar, Irit; Sherman, Sofia; Hazzan, Orit
2008-01-01
Collaboration has become increasingly widespread in the software industry as systems have become larger and more complex, adding human complexity to the technological complexity already involved in developing software systems. To deal with this complexity, human-centric software development methods, such as Extreme Programming and other agile…
Hopkins, Adam J; Richmond, Geraldine L
2013-03-01
Adsorption of small molecular solutes in an aqueous solution to a soft hydrophobic surface is a topic relevant to many fields. In biological and industrial systems, the interfacial environment is often complex, containing an array of salts and organic compounds in the solution phase. Additionally, the surface itself can have a complex structure that can interact in unpredictable ways with small solutes in its vicinity. In this work, we studied model adsorption processes on hydrocarbon and fluorocarbon self-assembled monolayers by using vibrational sum frequency spectroscopy, with methanol and butylammonium chloride as adsorbates. The results indicate that differences in surface functionality have a significant impact on the organization of adsorbed organic species at hydrophobic surfaces.
Combustion research for gas turbine engines
NASA Technical Reports Server (NTRS)
Mularz, E. J.; Claus, R. W.
1985-01-01
Research on combustion is being conducted at Lewis Research Center to provide improved analytical models of the complex flow and chemical reaction processes which occur in the combustor of gas turbine engines and other aeropropulsion systems. The objective of the research is to obtain a better understanding of the various physical processes that occur in the gas turbine combustor in order to develop models and numerical codes which can accurately describe these processes. Activities include in-house research projects, university grants, and industry contracts, and are classified under the subject areas of advanced numerics, fuel sprays, fluid mixing, and radiation-chemistry. Results are highlighted from several projects.
Bioremediation of a Complex Industrial Effluent by Biosorbents Derived from Freshwater Macroalgae
Kidgell, Joel T.; de Nys, Rocky; Hu, Yi; Paul, Nicholas A.; Roberts, David A.
2014-01-01
Biosorption with macroalgae is a promising technology for the bioremediation of industrial effluents. However, the vast majority of research has been conducted on simple mock effluents with little data available on the performance of biosorbents in complex effluents. Here we evaluate the efficacy of dried biomass, biochar, and Fe-treated biomass and biochar to remediate 21 elements from a real-world industrial effluent from a coal-fired power station. The biosorbents were produced from the freshwater macroalga Oedogonium sp. (Chlorophyta) that is native to the industrial site from which the effluent was sourced, and which has been intensively cultivated to provide a feed stock for biosorbents. The effect of pH and exposure time on sorption was also assessed. These biosorbents showed specificity for different suites of elements, primarily differentiated by ionic charge. Overall, biochar and Fe-biochar were more successful biosorbents than their biomass counterparts. Fe-biochar adsorbed metalloids (As, Mo, and Se) at rates independent of effluent pH, while untreated biochar removed metals (Al, Cd, Ni and Zn) at rates dependent on pH. This study demonstrates that the biomass of Oedogonium is an effective substrate for the production of biosorbents to remediate both metals and metalloids from a complex industrial effluent. PMID:24919058
NASA Technical Reports Server (NTRS)
Follen, Gregory J.; Naiman, Cynthia G.
1999-01-01
The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines: the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structures, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry-standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.
NASA Astrophysics Data System (ADS)
Grilo, Tiago J.; Vladimirov, Ivaylo N.; Valente, Robertt A. F.; Reese, Stefanie
2016-06-01
In the present paper, a finite strain model for complex combined isotropic-kinematic hardening is presented. It accounts for finite elastic and finite plastic strains and is suitable for any anisotropic yield criterion. In order to model complex cyclic hardening phenomena, the kinematic hardening is described by several back stress components. To that end, a new procedure is proposed in which several multiplicative decompositions of the plastic part of the deformation gradient are considered. The formulation incorporates a completely general format of the yield function, which means that any yield function can be employed by following a procedure that ensures the principle of material frame indifference. The constitutive equations are derived in a thermodynamically consistent way and numerically integrated by means of a backward-Euler algorithm based on the exponential map. The performance of the constitutive model is assessed via numerical simulations of industry-relevant sheet metal forming processes (U-channel forming and draw/re-draw of a panel benchmarks), the results of which are compared to experimental data. The comparison between numerical and experimental results shows that the use of multiple back stress components is very advantageous in the description of springback, in particular when compared with the results of using only one component. Moreover, the numerically obtained results are in excellent agreement with the experimental data.
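The paper's finite-strain, exponential-map integration is well beyond a short snippet, but the role of multiple back stress components can already be seen in a small-strain, one-dimensional return-mapping sketch with two linear kinematic hardening variables. All material constants below are illustrative assumptions, and the algorithm is a strong simplification of the model described above.

```python
import numpy as np

# Minimal 1-D, small-strain return mapping with two back-stress components
# (a strong simplification of the finite-strain, multi-back-stress model).
E = 70e3          # Young's modulus [MPa] (assumed)
sigma_y = 200.0   # initial yield stress [MPa] (assumed)
H = [15e3, 2e3]   # kinematic hardening moduli of the two back stresses [MPa] (assumed)

def simulate(strain_path):
    eps_p, X = 0.0, [0.0, 0.0]
    stresses = []
    for eps in strain_path:
        sig_tr = E * (eps - eps_p)          # elastic trial stress
        xi = sig_tr - sum(X)                # relative (effective) stress
        f = abs(xi) - sigma_y               # yield function
        if f > 0.0:                         # plastic step: return mapping
            dgamma = f / (E + sum(H))       # closed form for linear hardening
            n = np.sign(xi)
            eps_p += dgamma * n
            X = [Xi + Hi * dgamma * n for Xi, Hi in zip(X, H)]
        stresses.append(E * (eps - eps_p))
    return np.array(stresses)

# one loading-unloading cycle exhibiting kinematic hardening (Bauschinger effect)
path = np.concatenate([np.linspace(0, 0.01, 50), np.linspace(0.01, -0.01, 100)])
print(simulate(path)[-1])
```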
Aziza, Fanny; Mettler, Eric; Daudin, Jean-Jacques; Sanaa, Moez
2006-06-01
Cheese smearing is a complex process and the potential for cross-contamination with pathogenic or undesirable microorganisms is critical. During ripening, cheeses are salted and washed with brine to develop flavor and remove molds that could develop on the surfaces. Considering the potential for cross-contamination of this process in quantitative risk assessments could contribute to a better understanding of this phenomenon and, eventually, improve its control. The purpose of this article is to model the cross-contamination of smear-ripened cheeses due to the smearing operation under industrial conditions. A compartmental, dynamic, and stochastic model is proposed for mechanical brush smearing. This model has been developed to describe the exchange of microorganisms between compartments. Based on the analytical solution of the model equations and on experimental data collected with an industrial smearing machine, we assessed the values of the transfer parameters of the model. Monte Carlo simulations, using the distributions of transfer parameters, provide the final number of contaminated products in a batch and their final level of contamination for a given scenario taking into account the initial number of contaminated cheeses of the batch and their contaminant load. Based on analytical results, the model provides indicators for smearing efficiency and propensity of the process for cross-contamination. Unlike traditional approaches in mechanistic models, our approach captures the variability and uncertainty inherent in the process and the experimental data. More generally, this model could represent a generic base to use in modeling similar processes prone to cross-contamination.
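A highly simplified Monte Carlo version of such a compartmental transfer model is sketched below: a brush compartment exchanges cells with each cheese in sequence, so a few initially contaminated cheeses can spread contamination through the batch. The transfer fractions, batch size, and initial loads are assumed for illustration and are not the parameters estimated in the study.

```python
import numpy as np

# Hedged sketch of smearing-driven cross-contamination: the brush visits each
# cheese once, picking up and depositing cells with assumed transfer fractions.
rng = np.random.default_rng(7)
n_cheeses, n_runs = 200, 500
p_to_brush, p_to_cheese = 0.10, 0.02     # assumed transfer fractions per contact
initial_load = 1e4                        # assumed CFU on each initially contaminated cheese

contaminated_counts = []
for _ in range(n_runs):
    loads = np.zeros(n_cheeses)
    loads[rng.choice(n_cheeses, size=3, replace=False)] = initial_load
    brush = 0.0
    for i in range(n_cheeses):            # brush visits cheeses in sequence
        to_brush = rng.binomial(int(loads[i]), p_to_brush)
        to_cheese = rng.binomial(int(brush), p_to_cheese)
        loads[i] += to_cheese - to_brush
        brush += to_brush - to_cheese
    contaminated_counts.append(np.sum(loads > 0))

print("mean number of contaminated cheeses per batch:", np.mean(contaminated_counts))
```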
Beyond a series of security nets: Applying STAMP & STPA to port security
Williams, Adam D.
2015-11-17
Port security is an increasing concern considering the significant role of ports in global commerce and today's increasingly complex threat environment. Current approaches to port security mirror traditional models of accident causality -- 'a series of security nets' based on component reliability and probabilistic assumptions. Traditional port security frameworks result in isolated and inconsistent improvement strategies. Recent work in engineered safety combines the ideas of hierarchy, emergence, control and communication into a new paradigm for understanding port security as an emergent complex system property. The 'System-Theoretic Accident Model and Process (STAMP)' is a new model of causality based on systems and control theory. The associated analysis process -- System Theoretic Process Analysis (STPA) -- identifies specific technical or procedural security requirements designed to work in coordination with (and be traceable to) overall port objectives. This process yields port security design specifications that can mitigate (if not eliminate) port security vulnerabilities related to an emphasis on component reliability, lack of coordination between port security stakeholders or economic pressures endemic in the maritime industry. As a result, this article aims to demonstrate how STAMP's broader view of causality and complexity can better address the dynamic and interactive behaviors of social, organizational and technical components of port security.
Beyond a series of security nets: Applying STAMP & STPA to port security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Adam D.
Port security is an increasing concern considering the significant role of ports in global commerce and today's increasingly complex threat environment. Current approaches to port security mirror traditional models of accident causality -- 'a series of security nets' based on component reliability and probabilistic assumptions. Traditional port security frameworks result in isolated and inconsistent improvement strategies. Recent work in engineered safety combines the ideas of hierarchy, emergence, control and communication into a new paradigm for understanding port security as an emergent complex system property. The 'System-Theoretic Accident Model and Process (STAMP)' is a new model of causality based on systems and control theory. The associated analysis process -- System Theoretic Process Analysis (STPA) -- identifies specific technical or procedural security requirements designed to work in coordination with (and be traceable to) overall port objectives. This process yields port security design specifications that can mitigate (if not eliminate) port security vulnerabilities related to an emphasis on component reliability, lack of coordination between port security stakeholders or economic pressures endemic in the maritime industry. As a result, this article aims to demonstrate how STAMP's broader view of causality and complexity can better address the dynamic and interactive behaviors of social, organizational and technical components of port security.
Modern methods of surveyor observations in opencast mining under complex hydrogeological conditions.
NASA Astrophysics Data System (ADS)
Usoltseva, L. A.; Lushpei, V. P.; Mursin, VA
2017-10-01
The article considers how modern surveyor observation methods can be applied to secure open-pit mining operations and thereby improve industrial safety in the Primorsky Territory, as well as their use in the educational process. Industrial safety in the management of surface mining depends largely on the assessment methods applied and on the methods used to evaluate the stability of pit walls and dump slopes under complex mining and hydrogeological conditions.
The Sines industrial complex monitoring programme: A preliminary report.
Jones, M P; Catarino, F M; Sérgio, C; Bento-Pereira, F
1981-06-01
It is anticipated that the establishment of the industrial complex at Sines, Alentejo, Portugal, will have some impact on the environment. Details of the methods used in the monitoring programme are provided. Records of the epiphytic lichen vegetation in permanent quadrats have been made and changes shown in selected sites over a three year period are discussed. Material has been collected for analysis for heavy metals and the results discussed. There is considerable variation in replicates and in interspecies values. The problem of age and bio-accumulation is mentioned. Scanning electron microscopy has shown the accumulation of particulates, as yet unidentified, the quantity varying with increase in age and surface texture. A broadly based study of the local epiphytic flora is being carried out to record the present day diversity. There appears, as yet, to be no detectable influence of the industrial complex on the epiphytic flora of the permanent quadrats.
Risk assessment in the upstream crude oil supply chain: Leveraging analytic hierarchy process
NASA Astrophysics Data System (ADS)
Briggs, Charles Awoala
For an organization to be successful, an effective strategy is required, and if implemented appropriately the strategy will result in a sustainable competitive advantage. The importance of decision making in the oil industry is reflected in the magnitude and nature of the industry. Specific features of the oil industry supply chain, such as its longer chain, the complexity of its transportation system, its complex production and storage processes, etc., pose challenges to its effective management. Hence, understanding the risks, the risk sources, and their potential impacts on the oil industry's operations will be helpful in proposing a risk management model for the upstream oil supply chain. The risk-based model in this research uses a three-level analytic hierarchy process (AHP), a multiple-attribute decision-making technique, to underline the importance of risk analysis and risk management in the upstream crude oil supply chain. Level 1 represents the overall goal of risk management; Level 2 is comprised of the various risk factors; and Level 3 represents the alternative criteria of the decision maker as indicated on the hierarchical structure of the crude oil supply chain. Several risk management experts from different oil companies around the world were surveyed, and six major types of supply chain risks were identified: (1) exploration and production, (2) environmental and regulatory compliance, (3) transportation, (4) availability of oil, (5) geopolitical, and (6) reputational. Also identified are the preferred methods of managing risks which include; (1) accept and control the risks, (2) avoid the risk by stopping the activity, or (3) transfer or share the risks to other companies or insurers. The results from the survey indicate that the most important risk to manage is transportation risk with a priority of .263, followed by exploration/production with priority of .198, with an overall inconsistency of .03. With respect to major objectives the most preferred risk management policy option based on the result of the composite score is accept and control risk with a priority of .446, followed by transfer or share risk with a priority of .303. The least likely option is to terminate or forgo activity with a priority of .251.
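The priority values quoted above come from the standard AHP eigenvector calculation. The sketch below shows that calculation on a made-up 3x3 pairwise comparison matrix (not the survey data from the study): priorities are the normalized principal eigenvector, and the consistency ratio checks whether the judgments are acceptably consistent.

```python
import numpy as np

# Hedged AHP illustration: the comparison matrix A is an invented example.
A = np.array([[1.0,  2.0,  4.0],
              [0.5,  1.0,  3.0],
              [0.25, 1/3,  1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
priorities = w / w.sum()                 # normalized principal eigenvector

n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)          # consistency index
RI = 0.58                                # random index for n = 3 (standard table)
print("priorities:", np.round(priorities, 3), " CR =", round(CI / RI, 3))
```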
Microbial Cellulases and Their Industrial Applications
Kuhad, Ramesh Chander; Gupta, Rishi; Singh, Ajay
2011-01-01
Microbial cellulases have shown their potential application in various industries including pulp and paper, textile, laundry, biofuel production, food and feed industry, brewing, and agriculture. Due to the complexity of enzyme system and immense industrial potential, cellulases have been a potential candidate for research by both the academic and industrial research groups. Nowadays, significant attentions have been devoted to the current knowledge of cellulase production and the challenges in cellulase research especially in the direction of improving the process economics of various industries. Scientific and technological developments and the future prospects for application of cellulases in different industries are discussed in this paper. PMID:21912738
Moving Contact Lines: Linking Molecular Dynamics and Continuum-Scale Modeling.
Smith, Edward R; Theodorakis, Panagiotis E; Craster, Richard V; Matar, Omar K
2018-05-17
Despite decades of research, the modeling of moving contact lines has remained a formidable challenge in fluid dynamics whose resolution will impact numerous industrial, biological, and daily life applications. On the one hand, molecular dynamics (MD) simulation has the ability to provide unique insight into the microscopic details that determine the dynamic behavior of the contact line, which is not possible with either continuum-scale simulations or experiments. On the other hand, continuum-based models provide a link to the macroscopic description of the system. In this Feature Article, we explore the complex range of physical factors, including the presence of surfactants, which governs the contact line motion through MD simulations. We also discuss links between continuum- and molecular-scale modeling and highlight the opportunities for future developments in this area.
[Market-based medicine or patient-based medicine?].
Justich, Pablo R
2015-04-01
Health care has evolved over the centuries from a theocentric model to a model centered on man, environment and society. The process of neoliberal globalization has changed the relationship between the components of the health system and the population. The active participation of organizations such as the World Trade Organization, the International Monetary Fund and the World Bank, alongside the techno-medical industrial complex, tends to turn health care into a model focused on the economy. This has a negative impact on all components of the health care process and an adverse effect on humanized care. The analysis of each sector in particular, and of their interactions, shows the effects of this change. Alternatives are proposed for each sector to contribute to a model of care focused on the patient, their family and the social environment.
Using an instrumented manikin for Space Station Freedom analysis
NASA Technical Reports Server (NTRS)
Orr, Linda; Hill, Richard
1989-01-01
One of the most intriguing and complex areas of current computer graphics research is animating human figures to behave in a realistic manner. Believable, accurate human models are desirable for many everyday uses including industrial and architectural design, medical applications, and human factors evaluations. For zero-gravity (0-g) spacecraft design and mission planning scenarios, they are particularly valuable since 0-g conditions are difficult to simulate in a one-gravity Earth environment. At NASA/JSC, an in-house human modeling package called PLAID is currently being used to produce animations for human factors evaluation of Space Station Freedom design issues. Presented here is an introductory background discussion of problems encountered in existing techniques for animating human models and how an instrumented manikin can help improve the realism of these models.
NASA Astrophysics Data System (ADS)
Wimmer, E.
2008-02-01
A workshop, 'Theory Meets Industry', was held on 12-14 June 2007 in Vienna, Austria, attended by a well balanced number of academic and industrial scientists from America, Europe, and Japan. The focus was on advances in ab initio solid state calculations and their practical use in industry. The theoretical papers addressed three dominant themes, namely (i) more accurate total energies and electronic excitations, (ii) more complex systems, and (iii) more diverse and accurate materials properties. Hybrid functionals give some improvements in energies, but encounter difficulties for metallic systems. Quantum Monte Carlo methods are progressing, but no clear breakthrough is on the horizon. Progress in order-N methods is steady, as is the case for efficient methods for exploring complex energy hypersurfaces and large numbers of structural configurations. The industrial applications were dominated by materials issues in energy conversion systems, the quest for hydrogen storage materials, improvements of electronic and optical properties of microelectronic and display materials, and the simulation of reactions on heterogeneous catalysts. The workshop is a clear testimony that ab initio computations have become an industrial practice with increasingly recognized impact.
Optimization and Determination of Fe-Oxinate Complex by Using High Performance Liquid Chromatography
NASA Astrophysics Data System (ADS)
Oktavia, B.; Nasra, E.; Sary, R. C.
2018-04-01
The growing need for iron expands the industrial processes that require iron as a raw material, so control of industrial iron waste is very important. One method of iron analysis is to conduct indirect analysis of iron(III) ions by complexing them with 8-hydroxyquinoline (oxine). In this research, qualitative and quantitative tests of iron(III) ions were carried out in the form of a complex with oxine. The analysis was performed using HPLC at a wavelength of 470 nm with an ODS C18 column. Three methods of analysis were compared: (1) Fe-oxinate complexes were prepared in an ethanol solvent, so no further separation was needed; (2) Fe-oxinate complexes were prepared in chloroform, so a solvent extraction was required before the complex was injected into the column; and (3) the complex was formed in the column, where the eluent contains the oxine and the metal ions are then injected. The resulting chromatograms show that the third method provides the better chromatogram for iron analysis.
Ratkin, N E; Asming, V E; Koshkin, V V
2001-01-01
The goal of this work was to develop computational techniques for sulphate, nickel and copper accumulation in the snow in the local pollution zone. The main task was to reveal the peculiarities of formation and pollution of snow cover in a region with complex cross-relief. A digital cartographic model of aerotechnogenic pollution of snow cover in the landscapes of the local zone has been developed, based on five-year experimental data. Data on annual emissions from the industrial complex, together with information on wind distribution and the sum of precipitation from the meteostation "Nikel" for the winter period, allowed the model to provide: (1) presentation of material in the form of maps of water capacity and accumulation of sulphates, nickel and copper in the snow over any winter period in retrospect; (2) calculation of water capacity and accumulation of pollutants for watersheds and other natural-territorial complexes; and (3) solution of the inverse problem of determining the emissions of sulphates, nickel and copper from the enterprise by measuring snow pollution at datum points. The model can be used in other northern regions of the Russian Federation with similar physical-geographical and climatic conditions. The relationships between the sum of precipitation and water capacity in landscapes of the same type, as well as the relationships between pollutant content in snow and relief, and pollutant content in snow and distance from the source of emissions, were used as the basis for the model.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Glassman, Nanci A.; Affelder, Linda O.; Hecht, Laura M.; Kennedy, John M.; Barclay, Rebecca O.
1993-01-01
An exploratory study was conducted that investigated the influence of technical uncertainty and project complexity on information use by U.S. industry-affiliated aerospace engineers and scientists. The study utilized survey research in the form of a self-administered mail questionnaire. U.S. aerospace engineers and scientists on the Society of Automotive Engineers (SAE) mailing list served as the study population. The adjusted response rate was 67 percent. The survey instrument is appendix C to this report. Statistically significant relationships were found to exist between technical uncertainty, project complexity, and information use. Statistically significant relationships were found to exist between technical uncertainty, project complexity, and the use of federally funded aerospace R&D. The results of this investigation are relevant to researchers investigating information-seeking behavior of aerospace engineers. They are also relevant to R&D managers and policy planners concerned with transferring the results of federally funded aerospace R&D to the U.S. aerospace industry.
NASA Astrophysics Data System (ADS)
Zaccheo, T. S.; Pernini, T.; Dobler, J. T.; Blume, N.; Braun, M.
2017-12-01
This work highlights the use of greenhouse-gas laser imaging tomography experiment (GreenLITE™) data in conjunction with a sparse tomography approach to identify and quantify both urban and industrial sources of CO2 and CH4. The GreenLITE™ system provides a user-defined set of time-sequenced intersecting chords or integrated column measurements at a fixed height through a quasi-horizontal plane of interest. This plane, with unobstructed views along the lines of sight, may range from complex industrial facilities to a small city scale or urban sector. The continuous time-phased absorption measurements are converted to column concentrations and combined with a plume-based model to estimate the 2-D distribution of gas concentration over extended areas ranging from 0.04-25 km2. Finally, these 2-D maps of concentration are combined with ancillary meteorological and atmospheric data to identify potential emission sources and provide first-order estimates of their associated fluxes. In this presentation, we will provide a brief overview of the systems and results from both controlled release experiments and a long-term system deployment in Paris, France. These results provide a quantitative assessment of the system's ability to detect and estimate CO2 and CH4 sources, and demonstrate its ability to perform long-term autonomous monitoring and quantification of either persistent or sporadic emissions that may have both health and safety as well as environmental impacts.
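The abstract does not give the retrieval equations, but the idea of mapping integrated column measurements onto a 2-D concentration field with a plume-based model can be illustrated with a minimal Gaussian plume forward model. The function, grid, and all parameter values below are illustrative assumptions, not the GreenLITE sparse-tomography formulation.

```python
import numpy as np

def gaussian_plume(x, y, q, u, sigma_y_coef=0.08, sigma_z_coef=0.06, h=2.0):
    """Ground-level concentration (kg/m^3) from a continuous point source.

    x, y : downwind / crosswind distances from the source (m)
    q    : emission rate (kg/s); u : mean wind speed (m/s)
    sigma_*_coef : illustrative dispersion-growth coefficients
    h    : effective release height (m)
    """
    x = np.maximum(x, 1.0)                 # avoid the singularity at the source
    sy = sigma_y_coef * x                  # crude linear growth of plume width
    sz = sigma_z_coef * x
    return (q / (2 * np.pi * u * sy * sz)
            * np.exp(-y**2 / (2 * sy**2))
            * np.exp(-h**2 / (2 * sz**2)))

# Reconstruct a 2-D concentration map on a small grid (all values illustrative).
xs, ys = np.meshgrid(np.linspace(0, 500, 100), np.linspace(-100, 100, 81))
c_map = gaussian_plume(xs, ys, q=0.05, u=3.0)
print(f"peak modelled concentration: {c_map.max():.2e} kg/m^3")
```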
Piva, Francesco; Ciaprini, Francesco; Onorati, Fulvio; Benedetti, Maura; Fattorini, Daniele; Ausili, Antonella; Regoli, Francesco
2011-04-01
Quality assessments are crucial to all activities related to removal and management of sediments. Following a multidisciplinary, weight-of-evidence approach, a new model is presented here for comprehensive assessment of hazards associated with polluted sediments. The lines of evidence considered were sediment chemistry, assessment of bioavailability, sub-lethal effects on biomarkers, and ecotoxicological bioassays. A conceptual and software-assisted model was developed with logical flow-charts elaborating results from each line of evidence on the basis of several chemical and biological parameters, normative guidelines or scientific evidence; the data are thus summarized into four specific synthetic indices before their integration into an overall sediment hazard evaluation. This model was validated using European eels (Anguilla anguilla) as the bioindicator species, exposed under laboratory conditions to sediments from an industrial site, and caged under field conditions in two harbour areas. The concentrations of aliphatic hydrocarbons, polycyclic aromatic hydrocarbons and trace metals were much higher in the industrial than in the harbour sediments, and accordingly the bioaccumulation in liver and gills of exposed eels showed marked differences between conditions. Among biomarkers, significant variations were observed for cytochrome P450-related responses, oxidative stress biomarkers, lysosomal stability and genotoxic effects; the overall elaboration of these data, like that of standard ecotoxicological bioassays with bacteria, algae and copepods, confirmed a higher level of biological hazard for industrial sediments. Based on comparisons with expert judgment, the model efficiently discriminates between the various conditions, both as individual modules and as an integrated final evaluation, and it appears to be a powerful tool to support more complex processes of environmental risk assessment. Copyright © 2010 Elsevier Ltd. All rights reserved.
A Distributed Model of Oilseed Biorefining, via Integrated Industrial Ecology Exchanges
NASA Astrophysics Data System (ADS)
Ferrell, Jeremy C.
As the demand for direct petroleum substitutes increases, biorefineries are poised to become centers for conversion of biomass into fuels, energy, and biomaterials. A distributed model offers reduced transportation, process technology tailored to the available feedstock, and increased local resilience. Oilseeds are capable of producing a wide variety of useful products additive to food, feed, and fuel needs. Biodiesel manufacturing technology lends itself to smaller-scale distributed facilities able to process diverse feedstocks and meet demand for critical diesel fuel for basic municipal services, safety, sanitation, infrastructure repair, and food production. Integrating biodiesel refining facilities as tenants of eco-industrial parks presents a novel approach for synergistic energy and material exchanges whereby environmental and economic metrics can be significantly improved compared to stand-alone models. This research is based on the Catawba County NC EcoComplex and the oilseed crushing and biodiesel processing facilities (capacity: 433 tons biodiesel per year) located within it. Technical and environmental analyses of the biorefinery components as well as agronomic and economic models are presented. The life cycle assessment for the two optimal biodiesel feedstocks, soybeans and used cooking oil, resulted in fossil energy ratios of 7.19 and 12.1, with carbon intensity values of 12.51 gCO2-eq/MJ and 7.93 gCO2-eq/MJ, respectively, within the industrial ecology system. Economic modeling resulted in a biodiesel conversion cost of 1.43 per liter of fuel produced with used cooking oil, requiring a subsidy of 0.58 per liter to reach the break-even point. As subsidies continue to fluctuate significantly, metrics other than operating costs are required to justify small-scale biofuel projects.
Characterising freeze in the UK: applications for the insurance industry
NASA Astrophysics Data System (ADS)
Raven, E. K.; Keef, C.; Busby, K.
2012-04-01
The UK winters of 2009-2010 and 2010-2011 were characterised by prolonged and widespread low temperatures. This was challenging for the UK insurance industry and organisations such as the emergency services, the Highways Agency and British Gas, who had to manage the extra demands that resulted. In the 6-day period running up to Christmas Eve 2010, British Gas reported 100,000 boiler repair call-outs, whilst the 190,000 homes and businesses left with frozen and subsequently burst pipes contributed to the ABI's estimated £900 million in insured losses for December 2010 alone, the highest payout by the industry for damages associated with cold weather. Unfortunately, the severity of these winters made the difference between profit and loss for some primary UK insurance companies. To enable better pricing of premiums in the future, insurance companies are looking to understand the potential risk from cold waves at a local, postcode level, whilst reinsurance firms seek to determine the accumulated loss across the UK associated with spatially coherent events. Other industry sectors also strive to improve their understanding of weather extremes for planning and management. Underpinning this is the need to statistically characterise the physical hazard. Aimed primarily at the re/insurance industry, we have applied an established methodology for developing statistical event sets to generate a UK freeze event set. An event set provides a stochastic set of several thousand events over tens of thousands of years and is typically applied within probabilistic catastrophe models. Our method applies extreme value theory and dependence modelling to explain low-temperature relationships across the UK and over time using historical records. The resulting event set represents the spatial and temporal dependence of cold waves in the UK and is modelled against household factors that increase the vulnerability to freezing conditions, such as property type, age and condition. By presenting our methodology, we illustrate some of the complex spatial and temporal relationships in UK freeze events and place the past two winters into a statistical context. Furthermore, we demonstrate the application of event sets within catastrophe modelling and risk mapping services.
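As a rough illustration of the extreme value theory step mentioned above, the sketch below fits a generalized extreme value (GEV) distribution to synthetic annual minimum temperatures for a single site and reads off a 1-in-100-year cold level. The data, the single-site treatment, and the omission of the spatial and temporal dependence modelling are all simplifying assumptions.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Hypothetical annual minimum winter temperatures (degC) for one site;
# a real event set would use long historical records and many sites.
annual_min_temp = rng.normal(loc=-6.0, scale=3.0, size=60)

# Work with negated minima so that "more extreme cold" maps to larger values.
block_maxima = -annual_min_temp
shape, loc, scale = genextreme.fit(block_maxima)

# 1-in-100-year cold level implied by the fitted GEV (illustrative only).
cold_100yr = -genextreme.ppf(1 - 1.0 / 100, shape, loc=loc, scale=scale)
print(f"fitted shape = {shape:.2f}, 100-year minimum temperature ~ {cold_100yr:.1f} degC")
```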
NASA Astrophysics Data System (ADS)
Commendatore, Pasquale; Kubin, Ingrid; Sushko, Iryna
2018-05-01
We consider a three-region developing economy with poor transport infrastructures. Two models are related to different stages of development: in the first all regions are autarkic; in the second two of the regions begin to integrate with the third region still not accessible to trade. The properties of the two models are studied also considering the interplay between industry location and trade patterns. Dynamics of these models are described by two-dimensional piecewise smooth maps, characterized by multistability and complex bifurcation structure of the parameter space. We obtain analytical results related to stability of various fixed points and illustrate several bifurcation structures by means of two-dimensional bifurcation diagrams and basins of coexisting attractors.
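The maps in the paper are two-dimensional and model-specific, but the way a bifurcation diagram is assembled numerically can be sketched with a generic one-dimensional piecewise-smooth map: iterate past a transient, then record the asymptotic orbit for each parameter value. The map and parameter range below are illustrative assumptions, not the authors' model.

```python
import numpy as np
import matplotlib.pyplot as plt

def piecewise_map(x, mu):
    # Illustrative one-dimensional piecewise-smooth (tent-like) map.
    return mu * x if x < 0.5 else mu * (1.0 - x)

mus, pts = np.linspace(1.0, 2.0, 400), []
for mu in mus:
    x = 0.3
    for _ in range(300):            # discard transient iterations
        x = piecewise_map(x, mu)
    orbit = []
    for _ in range(80):             # record asymptotic orbit points
        x = piecewise_map(x, mu)
        orbit.append(x)
    pts.append(orbit)

# Bifurcation diagram: asymptotic states plotted against the parameter.
plt.plot(np.repeat(mus, 80), np.ravel(pts), ',k')
plt.xlabel('parameter mu')
plt.ylabel('asymptotic x')
plt.show()
```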
Dsikowitzky, Larissa; Hagemann, Lukas; Dwiyitno; Ariyani, Farida; Irianto, Hari Eko; Schwarzbauer, Jan
2017-12-01
During the last decades, global industrial production has partly shifted from industrialized nations to emerging and developing countries. In these upcoming economies, the newly developed industrial centers are generally located in densely populated areas, resulting in the discharge of often only partially treated industrial and municipal wastewaters into surface waters. There is a huge gap in knowledge about the composition of the complex organic pollutant mixtures occurring in such heavily impacted areas. Therefore, we applied a non-target screening to comprehensively assess river pollution in a large industrial area located in the megacity Jakarta. More than 100 structurally diverse organic contaminants were identified, some of which are reported here for the first time as environmental contaminants. The concentrations of paper manufacturing chemicals in river water - for example, of the endocrine-disrupting compound bisphenol A (50-8000 ng/L) - were as high as in pure untreated paper industry wastewaters. The non-target screening approach is an adequate tool for the identification of water contaminants in the new global centers of industrial manufacturing, as a first crucial step towards the evaluation of as yet unrecognized environmental risks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flathers, M.B.; Bache, G.E.; Rainsberger, R.
1996-04-01
The flow field of a complex three-dimensional radial inlet for an industrial pipeline centrifugal compressor has been experimentally determined on a half-scale model. Based on the experimental results, inlet guide vanes have been designed to correct pressure and swirl angle distribution deficiencies. The unvaned and vaned inlets are analyzed with a commercially available fully three-dimensional viscous Navier-Stokes code. Since experimental results were available prior to the numerical study, the unvaned analysis is considered a postdiction while the vaned analysis is considered a prediction. The computational results of the unvaned inlet have been compared to the previously obtained experimental results. The experimental method utilized for the unvaned inlet is repeated for the vaned inlet and the data have been used to verify the computational results. The paper will discuss experimental, design, and computational procedures, grid generation, boundary conditions, and experimental versus computational methods. Agreement between experimental and computational results is very good, both in prediction and postdiction modes. The results of this investigation indicate that CFD offers a measurable advantage in design, schedule, and cost and can be applied to complex, three-dimensional radial inlets.
Reducing systems biology to practice in pharmaceutical company research; selected case studies.
Benson, N; Cucurull-Sanchez, L; Demin, O; Smirnov, S; van der Graaf, P
2012-01-01
Reviews of the productivity of the pharmaceutical industry have concluded that the current business model is unsustainable. Various remedies for this have been proposed, however, arguably these do not directly address the fundamental issue; namely, that it is the knowledge required to enable good decisions in the process of delivering a drug that is largely absent; in turn, this leads to a disconnect between our intuition of what the right drug target is and the reality of pharmacological intervention in a system such as a human disease state. As this system is highly complex, modelling will be required to elucidate emergent properties together with the data necessary to construct such models. Currently, however, both the models and data available are limited. The ultimate solution to the problem of pharmaceutical productivity may be the virtual human, however, it is likely to be many years, if at all, before this goal is realised. The current challenge is, therefore, whether systems modelling can contribute to improving productivity in the pharmaceutical industry in the interim and help to guide the optimal route to the virtual human. In this context, this chapter discusses the emergence of systems pharmacology in drug discovery from the interface of pharmacokinetic-pharmacodynamic modelling and systems biology. Examples of applications to the identification of optimal drug targets in given pathways, selecting drug modalities and defining biomarkers are discussed, together with future directions.
Description and evaluation of the QUIC bio-slurry scheme: droplet evaporation and surface deposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zajic, Dragan; Brown, Michael J; Nelson, Matthew A
2010-01-01
The Quick Urban and Industrial Complex (QUIC) dispersion modeling system was developed with the goal of improving the transport and dispersion modeling capabilities within urban areas. The modeling system has the ability to rapidly obtain a detailed 3D flow field around building clusters and uses an urbanized Lagrangian random-walk approach to account for transport and dispersion (e.g., see Singh et al., 2008; Williams et al., 2009; and Gowardhan et al., 2009). In addition to wind-tunnel testing, the dispersion modeling system has been evaluated against full-scale urban tracer experiments performed in Salt Lake City, Oklahoma City, and New York City (Gowardhan et al., 2006; Gowardhan et al., 2009; Allwine et al., 2008), and the wind model output has been compared to measurements taken in downtown Oklahoma City.
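A bare-bones illustration of a Lagrangian random-walk dispersion step is sketched below: particles are advected by a mean wind and perturbed by random turbulent displacements, with reflection at the ground. The parameters are illustrative, and the sketch omits the building-resolving flow field and urbanized turbulence treatment that QUIC actually uses.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal Lagrangian random-walk sketch (uncorrelated turbulent velocities).
n_particles, n_steps, dt = 5000, 600, 1.0
u_mean = np.array([2.0, 0.0, 0.0])        # m/s mean wind along x
sigma_turb = np.array([0.5, 0.5, 0.3])    # m/s turbulent velocity scales (illustrative)

pos = np.zeros((n_particles, 3))
for _ in range(n_steps):
    # Advect by the mean wind and add a random turbulent velocity over dt.
    pos += (u_mean + rng.normal(0.0, sigma_turb, size=pos.shape)) * dt
    pos[:, 2] = np.abs(pos[:, 2])         # reflect particles at the ground (z = 0)

# Crude concentration surrogate: particle counts in 50 m downwind bins.
counts, edges = np.histogram(pos[:, 0], bins=np.arange(0, 2000, 50))
print(counts[:10])
```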
Wang, Rusong
2005-12-01
Based on the Social-Economic-Natural Complex Ecosystem theory, this paper questioned 8 kinds of misunderstandings in the current planning, incubation, development, and management of circular economy, which had led to either ultra-right or ultra-left actions in ecological and economic development. Rather than concentrating only on the 3-r micro-principles of "reduce-reuse-recycle", this paper suggested 3-R macro-principles of "Rethinking-Reform-Refunction" for circular economy development. Nine kinds of eco-integrative strategies in industrial transition were put forward, i.e., food web-based horizontal/parallel coupling, life cycle-oriented vertical/serial coupling, functional service rather than product-oriented production, flexible and adaptive structure, ecosystem-based regional coupling, social integrity, comprehensive capacity building, employment enhancement, and respect for human dignity. Ten promising potential eco-industries in China's near-future circular economy development were proposed, such as the transition of the traditional chemical fertilizer and pesticide industry to a new kind of industrial complex for agro-ecosystem management.
NASA Astrophysics Data System (ADS)
Mallory, Nicolas Joseph
The design of robust automated flight control systems for aircraft of varying size and complexity is a topic of continuing interest for both military and civilian industries. By merging the benefits of robustness from sliding mode control (SMC) with the familiarity and transparency of design tradeoff offered by frequency domain approaches, this thesis presents pseudo-sliding mode control as a viable option for designing automated flight control systems for complex six degree-of-freedom aircraft. The infinite frequency control switching of SMC is replaced, by necessity, with control inputs that are continuous in nature. An introduction to SMC theory is presented, followed by a detailed design of a pseudo-sliding mode control and automated flight control system for a six degree-of-freedom model of a Hughes OH6 helicopter. This model is then controlled through three different waypoint missions that demonstrate the stability of the system and the aircraft's ability to follow certain maneuvers despite time delays, large changes in model parameters and vehicle dynamics, actuator dynamics, sensor noise, and atmospheric disturbances.
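The core idea of pseudo-sliding mode control, replacing the discontinuous switching term of SMC with a continuous saturation inside a boundary layer, can be sketched on a generic second-order plant. The plant, gains, and disturbance below are illustrative assumptions and bear no relation to the six degree-of-freedom OH-6 helicopter model in the thesis.

```python
import numpy as np

def sat(s, phi):
    """Continuous saturation used in place of sign() to avoid chattering."""
    return np.clip(s / phi, -1.0, 1.0)

# Illustrative second-order plant: x_ddot = a*x_dot + b*u + d(t).
a, b = -0.5, 1.0
lam, eta, phi = 2.0, 5.0, 0.05            # surface slope, reaching gain, boundary layer
dt, x, xdot = 0.001, 1.0, 0.0

for k in range(10000):
    e, edot = x, xdot                     # regulate to the origin (reference = 0)
    s = edot + lam * e                    # sliding surface
    u = (-a * xdot - lam * edot - eta * sat(s, phi)) / b   # equivalent + continuous switching term
    d = 0.3 * np.sin(0.01 * k)            # bounded disturbance
    xddot = a * xdot + b * u + d
    xdot += xddot * dt
    x += xdot * dt

print(f"final regulation error: {x:.4f}")
```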
Levels of blood lead and urinary cadmium in industrial complex residents in Ulsan.
Kim, Sang Hoon; Kim, Yang Ho; An, Hyun Chan; Sung, Joo Hyun; Sim, Chang Sun
2017-01-01
Populations neighboring industrial complexes are at an increased health risk due to constant exposure to various potentially hazardous compounds released during industrial production activity. Although there are many previous studies that focus on occupational exposure to heavy metals, studies that focus on environmental exposure to lead and cadmium are relatively rare. The purpose of this study is to evaluate the extent of environmental exposure to heavy metals in residents of an industrial area. Four areas in close proximity to the Ulsan petrochemical industrial complex and the Onsan national industrial complex were selected for the exposure group, and an area remotely located from these industrial complexes was selected as the non-exposure group. Among the residents of the study areas, a total of 1573 subjects aged 20 years and older were selected, and all study subjects completed a written questionnaire. Blood and urine samples were obtained from about one third of the subjects (465 subjects) who provided informed consent for biological sample collection. A total of 429 subjects (320 from the exposure area, 109 from the non-exposure area) were included in the final analysis. The geometric mean blood lead level among the subjects in the exposed group was 2.449 μg/dL, which was significantly higher than the non-exposure group's level of 2.172 μg/dL. Similarly, the geometric mean urine cadmium levels between the two groups differed significantly, at 1.077 μg/g Cr. for the exposed group and 0.709 μg/g Cr. for the non-exposure group. In a multiple linear regression analysis of the relationship between blood lead level and related factors, blood lead level had a significant positive correlation with age, male sex, residence in the exposure area, and non-drinking. In the same way, urine cadmium level was positively correlated with age, female sex, residence in the exposure area, and smoking. This study found that blood lead levels and urine cadmium levels were significantly higher among the residents of industrial areas than among the non-exposure area residents, which is thought to be due to the difference in environmental exposure to lead and cadmium. Furthermore, it was clear that at a low level of exposure, differences in blood lead or urine cadmium levels based on age, gender, and smoking status were greater than the differences based on area of residence. Therefore, when evaluating heavy metal levels in the body at a low level of exposure, age, gender, and smoking status must be adjusted for, as they are significant confounding factors.
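For readers unfamiliar with the statistic, the geometric means reported above are simply back-transformed means of log concentrations; a minimal sketch with hypothetical values follows (the numbers are not the study's data).

```python
import numpy as np

def geometric_mean(x):
    """Geometric mean: exponential of the mean of the log-transformed values."""
    x = np.asarray(x, dtype=float)
    return np.exp(np.mean(np.log(x)))

# Hypothetical blood lead values (ug/dL) for two groups.
exposed = [2.1, 3.0, 2.6, 1.9, 2.8, 2.4]
control = [1.8, 2.3, 2.0, 2.5, 2.1, 1.9]
print(f"exposed GM = {geometric_mean(exposed):.3f}, control GM = {geometric_mean(control):.3f}")
```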
2016-05-26
[Report documentation page residue; only fragments are recoverable: a title ending in "Complex and Austere Environment"; author MAJ Sun Ryu; a note that the Army organic industrial base supports the United States Armed Forces to project combat power during hostilities; a reference to TRADOC publishing the new Army Operating Concept (AOC) in 2014; and a citation to "Sustaining the Army Organic Industrial Base in the Post-Afghanistan Conflict Era" (Civilian Research Project, US Army War College, 2014).]
Data-based virtual unmodeled dynamics driven multivariable nonlinear adaptive switching control.
Chai, Tianyou; Zhang, Yajun; Wang, Hong; Su, Chun-Yi; Sun, Jing
2011-12-01
For a complex industrial system, its multivariable and nonlinear nature generally makes it very difficult, if not impossible, to obtain an accurate model, especially when the model structure is unknown. The control of this class of complex systems is difficult to handle with traditional controller designs built around operating points. This paper, however, explores the concepts of a controller-driven model and virtual unmodeled dynamics to propose a new design framework. The design consists of two controllers with distinct functions. First, using input and output data, a self-tuning controller is constructed based on a linear controller-driven model. Then the output signals of the controller-driven model are compared with the true outputs of the system to produce so-called virtual unmodeled dynamics. Based on the compensator of the virtual unmodeled dynamics, a second controller based on a nonlinear controller-driven model is proposed. These two controllers are integrated by an adaptive switching control algorithm to take advantage of their complementary features: one offers a stabilizing function and the other provides improved performance. The conditions for the stability and convergence of the closed-loop system are analyzed. Both simulation and experimental tests on a heavily coupled nonlinear twin-tank system are carried out to confirm the effectiveness of the proposed method.
NASA Astrophysics Data System (ADS)
Sun, Zhiyong; Hao, Lina; Song, Bo; Yang, Ruiguo; Cao, Ruimin; Cheng, Yu
2016-10-01
Micro/nano positioning technologies have been attractive for decades for their various applications in both industrial and scientific fields. The actuators employed in these technologies are typically smart material actuators, which possess inherent hysteresis that may cause systems to behave unexpectedly. Periodic reference tracking capability is fundamental for apparatuses such as the scanning probe microscope, which employs smart material actuators to generate periodic scanning motion. However, a traditional controller such as the PID method cannot guarantee accurate fast periodic scanning motion. To tackle this problem and to enable practical implementation in digital devices, this paper proposes a novel control method named the discrete extended unparallel Prandtl-Ishlinskii model based internal model (d-EUPI-IM) control approach. To tackle modeling uncertainties, the robust d-EUPI-IM control approach is investigated, and the associated sufficient stabilizing conditions are derived. The advantages of the proposed controller are: it is designed and represented in discrete form, and is thus practical for implementation in digital devices; the extended unparallel Prandtl-Ishlinskii model can precisely represent forward/inverse complex hysteretic characteristics, which reduces modeling uncertainties and benefits controller design; and the internal model principle based control module can be utilized as a natural oscillator for tackling the periodic reference tracking problem. The proposed controller was verified through comparative experiments on a piezoelectric actuator platform, and convincing results have been achieved.
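A minimal sketch of the classical Prandtl-Ishlinskii construction, a weighted superposition of play operators, is given below to illustrate the kind of hysteresis model the controller builds on. The thresholds, weights, and input signal are illustrative, and the sketch does not implement the extended unparallel variant proposed in the paper.

```python
import numpy as np

def play_operator(u, r, y0=0.0):
    """Classical play (backlash) operator with threshold r for an input sequence u."""
    y = np.empty_like(u)
    y_prev = y0
    for k, uk in enumerate(u):
        # Output is clamped to stay within +/- r of the current input.
        y_prev = min(max(y_prev, uk - r), uk + r)
        y[k] = y_prev
    return y

def prandtl_ishlinskii(u, thresholds, weights):
    """Weighted superposition of play operators (classical PI hysteresis model)."""
    return sum(w * play_operator(u, r) for w, r in zip(weights, thresholds))

t = np.linspace(0, 4 * np.pi, 2000)
u = np.sin(t) * np.exp(-0.05 * t)          # decaying sinusoidal input (illustrative)
y = prandtl_ishlinskii(u, thresholds=[0.0, 0.1, 0.2, 0.4], weights=[0.5, 0.3, 0.15, 0.05])
print(y[-5:])
```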
NASA Astrophysics Data System (ADS)
Saldanha, Shamith L.; Kalaichelvi, V.; Karthikeyan, R.
2018-04-01
TIG welding is a high-quality form of welding that is very popular in industry. It is one of the few types of welding that can be used to join dissimilar metals. Here a weld joint is formed between stainless steel and monel alloy. It is desired to have control over the weld geometry of such a joint through the adjustment of the experimental parameters, which are welding current, wire feed speed, arc length and shielding gas flow rate. To facilitate automation, a model of the welding system is needed. However, the underlying welding process is complex and non-linear, and analytical methods are impractical for industrial use. Therefore, artificial neural networks (ANN) are explored for developing the model, as they are well suited for modelling non-linear multi-variate data. Feed-forward neural networks with the backpropagation training algorithm are used, and the data for training the ANN were taken from experimental work. There are four outputs corresponding to the weld geometry. Different training and testing phases were carried out using MATLAB software, and the ANN approximates the given data with a minimal amount of error.
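The original model was built in MATLAB; as a stand-in, the sketch below shows the same idea, a feed-forward network trained by a backpropagation-based optimiser mapping the four welding parameters to four weld-geometry outputs, using scikit-learn and entirely synthetic data. The input ranges and output relations are placeholder assumptions, not the experimental welds.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Hypothetical training data standing in for the experiments:
# inputs  = [current, wire feed speed, arc length, shielding gas flow rate]
# outputs = four weld-geometry measurements.
X = rng.uniform([80, 1.0, 2.0, 8.0], [160, 3.0, 4.0, 15.0], size=(60, 4))
Y = np.column_stack([
    0.02 * X[:, 0] + 0.5 * X[:, 1],            # placeholder relations, not weld physics
    0.01 * X[:, 0] + 0.2 * X[:, 2],
    0.015 * X[:, 0] - 0.1 * X[:, 3] + 3.0,
    0.05 * X[:, 1] * X[:, 2],
]) + rng.normal(0, 0.05, size=(60, 4))

# Small feed-forward network with one hidden layer, analogous in spirit to the MATLAB ANN.
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X, Y)
print(model.predict(X[:2]))
```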
A Multi-Agent Approach to the Simulation of Robotized Manufacturing Systems
NASA Astrophysics Data System (ADS)
Foit, K.; Gwiazda, A.; Banaś, W.
2016-08-01
The recent years of eventful industry development have brought many competing products addressed to the same market segment. Shortening the development cycle has become a necessity for a company that wants to remain competitive. With the switch to the Intelligent Manufacturing model, industry searches for new scheduling algorithms, as the traditional ones do not meet current requirements. The agent-based approach has been considered by many researchers as an important direction in the evolution of modern manufacturing systems. Due to the properties of multi-agent systems, this methodology is very helpful during creation of a model of a production system, allowing both the processing and informational parts to be depicted. The complexity of such an approach makes the analysis impossible without computer assistance. Computer simulation still uses a mathematical model to recreate a real situation, but nowadays 2D or 3D virtual environments, or even virtual reality, are used for realistic illustration of the considered systems. This paper focuses on robotized manufacturing systems and presents one possible approach to the simulation of such systems. The selection of the multi-agent approach is motivated by the flexibility of this solution, which offers modularity, robustness and autonomy.
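To make the agent-based idea concrete, the toy sketch below lets machine agents bid for announced jobs in a contract-net-style fashion, with the lowest estimated completion time winning. The agent behaviours and job data are illustrative assumptions, not the simulation environment described in the paper.

```python
import random

random.seed(0)

class MachineAgent:
    """A machine/robot agent that bids on jobs with its estimated completion time."""
    def __init__(self, name, speed):
        self.name, self.speed, self.busy_until = name, speed, 0.0

    def bid(self, job_length, now):
        # Bid = time at which this agent could finish the job.
        start = max(now, self.busy_until)
        return start + job_length / self.speed

    def assign(self, job_length, now):
        self.busy_until = self.bid(job_length, now)

agents = [MachineAgent("robot_A", 1.0), MachineAgent("robot_B", 1.5)]
jobs = [random.uniform(2, 8) for _ in range(6)]    # hypothetical job processing requirements

now = 0.0
for j, length in enumerate(jobs):
    winner = min(agents, key=lambda a: a.bid(length, now))   # lowest bid wins the job
    winner.assign(length, now)
    print(f"job {j} (len {length:.1f}) -> {winner.name}, done at t = {winner.busy_until:.1f}")
```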
Xie, Y L; Li, Y P; Huang, G H; Li, Y F; Chen, L R
2011-04-15
In this study, an inexact chance-constrained water quality management (ICC-WQM) model is developed for planning regional environmental management under uncertainty. This method is based on an integration of interval linear programming (ILP) and chance-constrained programming (CCP) techniques. ICC-WQM allows uncertainties presented as both probability distributions and interval values to be incorporated within a general optimization framework. Complexities in environmental management systems can be systematically reflected, and thus the applicability of the modeling process is greatly enhanced. The developed method is applied to planning chemical-industry development in the Binhai New Area of Tianjin, China. Interval solutions associated with different risk levels of constraint violation have been obtained. They can be used for generating decision alternatives and thus help decision makers identify desired policies under various system-reliability constraints on the water environmental capacity for pollutants. Tradeoffs between system benefits and constraint-violation risks can also be tackled. They are helpful for supporting (a) decisions on wastewater discharge and government investment, (b) formulation of local policies regarding water consumption, economic development and industry structure, and (c) analysis of interactions among economic benefits, system reliability and pollutant discharges. Copyright © 2011 Elsevier B.V. All rights reserved.
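The mechanics of the chance-constrained step can be illustrated with a toy problem: when the environmental capacity on the right-hand side is normally distributed, the probabilistic constraint reduces to a deterministic one with a quantile-adjusted capacity, and the resulting linear program can be solved for several violation-risk levels. The coefficients below are illustrative, not the Binhai New Area data, and the interval-programming component is omitted.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Toy chance-constrained LP: maximise economic benefit c.x subject to a
# pollutant-discharge constraint a.x <= b, where capacity b ~ Normal(mu_b, sigma_b).
# Under normality, Pr(a.x <= b) >= 1 - alpha  <=>  a.x <= mu_b + sigma_b * Phi^{-1}(alpha).
c = np.array([-3.0, -5.0])                 # negated benefits (linprog minimises)
a = np.array([2.0, 4.0])                   # discharge per unit of each activity
mu_b, sigma_b = 100.0, 10.0

for alpha in (0.01, 0.05, 0.10):           # allowed violation probabilities
    b_eff = mu_b + sigma_b * norm.ppf(alpha)
    res = linprog(c, A_ub=[a], b_ub=[b_eff], bounds=[(0, 30), (0, 30)])
    print(f"alpha = {alpha:.2f}: benefit = {-res.fun:.1f}, plan = {res.x.round(2)}")
```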
Transformation of environmental conditions in large former Soviet countries: regional analysis
NASA Astrophysics Data System (ADS)
Bityukova, V. R.; Borovikov, M. S.
2018-01-01
The article studies changes in the structure of environmental conditions of regions in the large former Soviet countries (case study of Russia and Kazakhstan) that have formed considerable contrasts in the placement of industrial complex and population settlement during the previous development stages. The changes related to the transition to market economy have led to essential transformation of environmental conditions. A complex index allowing to assess changes at the regional level in Kazakhstan and Russia and to reveal main similarities and differences between those changes is applied to studying the transformation of regional and industry structure. The article examines both industry-specific and spatial patterns forming environmental conditions at the regional level.
[Health maintenance strategy for construction industry workers].
Perminova, I Iu; Logvinenko, I I
2011-01-01
The authors analyzed the working conditions and health state of workers engaged in the construction industry in the city of Kemerovo. The findings are that a comprehensive approach to implementing the strategy "Health for all in the XXI century" supports the preservation of workers' health.
Remote analysis of anthropogenic effect on boreal forests using nonlinear multidimensional models
NASA Astrophysics Data System (ADS)
Shchemel, Anton; Ivanova, Yuliya; Larko, Alexander
Nowadays the anthropogenic stress of mining and refining oil and gas is becoming a significant problem in Eastern Siberia. The task of revealing the effect of that industry is not trivial because of the complicated access to the mining sites. This creates a severe problem of ensuring detection of the effect of the oil and gas complex on forest ecosystems. That estimation should allow revealing sites of any negative changes in forest communities in good time. An intellectual system for analyzing remote sensing data of different resolutions and different spectral characteristics with sophisticated nonlinear models is dedicated to solving this problem. The work considers remote detection and estimation of forest degradation using analysis of freely available remote sensing data, without full field observations of the oil and gas mining territory. To analyze the state of vegetation, the following remote sensing data were used as input parameters for our models: albedo, surface temperature and data from about thirty spectral bands in the visible and infrared regions. MODIS satellite data from the year 2000 were used. The chosen data allowed a complex estimation of parameters linked with the quality (set of species, physiological state) and the quantity of vegetation. To verify the obtained estimation, each index was calculated for a territory in which oil and gas mining is carried out, along with the same calculations for a sample "clean" territory. Monthly data for the vegetation period and annual mean values were analyzed. The work revealed some trends in the annual data probably linked with the intensification of anthropogenic effects on the ecosystems. The models we built are easy for personnel of emergency control and oversight institutions to apply. It was found helpful to use the full set of values obtained from the satellite for multilateral estimation of the anthropogenic effect of oil industry facilities on forest ecosystems, producing generalized estimation indices with the developed models.
Drivers of multidimensional eco-innovation: empirical evidence from the Brazilian industry.
da Silva Rabêlo, Olivan; de Azevedo Melo, Andrea Sales Soares
2018-03-08
The study analyses the relationships between the main drivers of eco-innovation introduced by innovative industries, with a focus on cooperation strategy. Eco-innovation is analysed by means of a multidimensional identification strategy, showing the relationships between the independent variables and the variable of interest. The literature discussing environmental innovation differs from that discussing other types of innovation inasmuch as it seeks to grasp its determinants and mostly highlights the relevance of environmental regulation. The key feature of this paper is that it ascribes special relevance to cooperation strategy with external partners and to the propensity of innovative industry to introduce eco-innovation. A sample of 35,060 Brazilian industries was analysed, between 2003 and 2011, by means of Binomial, Multinomial and Ordinal logistic regressions, with microdata collected from the research and innovation survey (PINTEC) of the Brazilian Institute of Geography and Statistics (Instituto Brasileiro de Geografia e Estatística). The econometric results estimated by the Multinomial Logit method suggest that the cooperation with external partners practiced by innovative industries facilitates the adoption of eco-innovation in dimension 01 with a probability of 64.59%, in dimension 02 with 57.63%, and in dimension 03 with 81.02%. The data reveal that the higher the degree of eco-innovation complexity, the more strongly industries seek cooperation with external partners. When calculated with the Ordinal and Binomial Logit models, cooperation increases the probability that an industry is eco-innovative by 65.09% and 89.34%, respectively. Environmental regulation and innovation in product and information management were also positively correlated as drivers of eco-innovation.
Dolinoy, Dana C.; Miranda, Marie Lynn
2004-01-01
The Toxics Release Inventory (TRI) requires facilities with 10 or more full-time employees that process > 25,000 pounds in aggregate or use > 10,000 pounds of any one TRI chemical to report releases annually. However, little is known about releases from non-TRI-reporting facilities, nor has attention been given to the very localized equity impacts associated with air toxics releases. Using geographic information systems and industrial source complex dispersion modeling, we developed methods for characterizing air releases from TRI-reporting as well as non-TRI-reporting facilities at four levels of geographic resolution. We characterized the spatial distribution and concentration of air releases from one representative industry in Durham County, North Carolina (USA). Inclusive modeling of all facilities rather than modeling of TRI sites alone significantly alters the magnitude and spatial distribution of modeled air concentrations. Modeling exposure receptors at more refined levels of geographic resolution reveals localized, neighborhood-level exposure hot spots that are not apparent at coarser geographic scales. Multivariate analysis indicates that inclusive facility modeling at fine levels of geographic resolution reveals exposure disparities by income and race. These new methods significantly enhance the ability to model air toxics, perform equity analysis, and clarify conflicts in the literature regarding environmental justice findings. This work has substantial implications for how to structure TRI reporting requirements, as well as methods and types of analysis that will successfully elucidate the spatial distribution of exposure potentials across geographic, income, and racial lines. PMID:15579419
Schuck, Edgar; Bohnert, Tonika; Chakravarty, Arijit; Damian-Iordache, Valeriu; Gibson, Christopher; Hsu, Cheng-Pang; Heimbach, Tycho; Krishnatry, Anu Shilpa; Liederer, Bianca M; Lin, Jing; Maurer, Tristan; Mettetal, Jerome T; Mudra, Daniel R; Nijsen, Marjoleen Jma; Raybon, Joseph; Schroeder, Patricia; Schuck, Virna; Suryawanshi, Satyendra; Su, Yaming; Trapa, Patrick; Tsai, Alice; Vakilynejad, Majid; Wang, Shining; Wong, Harvey
2015-03-01
The application of modeling and simulation techniques is increasingly common in preclinical stages of the drug discovery and development process. A survey focusing on preclinical pharmacokinetic/pharmacodynamics (PK/PD) analysis was conducted across pharmaceutical companies that are members of the International Consortium for Quality and Innovation in Pharmaceutical Development. Based on survey responses, ~68% of companies use preclinical PK/PD analysis in all therapeutic areas indicating its broad application. An important goal of preclinical PK/PD analysis in all pharmaceutical companies is for the selection/optimization of doses and/or dose regimens, including prediction of human efficacious doses. Oncology was the therapeutic area with the most PK/PD analysis support and where it showed the most impact. Consistent use of more complex systems pharmacology models and hybrid physiologically based pharmacokinetic models with PK/PD components was less common compared to traditional PK/PD models. Preclinical PK/PD analysis is increasingly being included in regulatory submissions with ~73% of companies including these data to some degree. Most companies (~86%) have seen impact of preclinical PK/PD analyses in drug development. Finally, ~59% of pharmaceutical companies have plans to expand their PK/PD modeling groups over the next 2 years indicating continued growth. The growth of preclinical PK/PD modeling groups in pharmaceutical industry is necessary to establish required resources and skills to further expand use of preclinical PK/PD modeling in a meaningful and impactful manner.
Inheritance of evolved clethodim resistance in Lolium rigidum populations from Australia.
Saini, Rupinder Kaur; Malone, Jenna; Gill, Gurjeet; Preston, Christopher
2017-08-01
In Australia, the extensive use of clethodim for the control of Lolium rigidum has resulted in the evolution of many clethodim-resistant L. rigidum populations. Five clethodim-resistant populations of L. rigidum were analysed for the inheritance of clethodim resistance. Reciprocal crosses were made between resistant (R) and susceptible (S) populations. Within crosses, dose-responses of reciprocal F1 families of all populations except A61 were similar to each other, indicating that clethodim resistance in these populations is encoded on the nuclear genome. The level of dominance observed in the dose-response experiments ranged from partial to complete within the herbicide rate used. In the A61 population, within each cross, the response of F1 from the maternal and paternal parent was different, indicating that resistance is inherited through the female parent. All backcross populations segregated in a different manner. Only one population, FP, fitted a single-gene model (1:1). Two populations fitted two-gene models: a 3:1 inheritance model for F4 and a 1:3 inheritance model for A91. For population E2, no clear pattern of inheritance was determined, suggesting more complex inheritance. The results of this study indicate that different patterns of clethodim resistance in L. rigidum exist. © 2016 Society of Chemical Industry.
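Testing whether backcross counts fit a 1:1, 3:1 or 1:3 segregation ratio is typically done with a chi-square goodness-of-fit test; the sketch below shows the calculation with hypothetical counts, not the paper's data.

```python
from scipy.stats import chisquare

# Hypothetical backcross counts (resistant, susceptible). A 1:1 ratio is the
# single-dominant-gene expectation; 3:1 and 1:3 are consistent with two-gene models.
observed = [112, 98]
total = sum(observed)

for name, ratio in [("1:1", (0.5, 0.5)), ("3:1", (0.75, 0.25)), ("1:3", (0.25, 0.75))]:
    expected = [total * p for p in ratio]
    stat, pval = chisquare(observed, f_exp=expected)
    print(f"{name} model: chi2 = {stat:.2f}, p = {pval:.3f}")
```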
NASA Astrophysics Data System (ADS)
Kristiana, S. P. D.
2017-12-01
The corporate chain store is one type of retail company that is growing rapidly in Indonesia. Competition between retail companies is very tight, so retail companies should evaluate their performance continuously in order to survive. The selling price of products is one of the essential attributes that gets the attention of many consumers and is used to evaluate the performance of the industry. This research aimed to determine the optimal selling price of a product considering three cost factors, namely the purchase price of the product from the supplier, holding costs, and transportation costs. A fuzzy logic approach is used in data processing with MATLAB software. Fuzzy logic is selected to solve the problem because this method can handle complex factors. The result is a model for determining the optimal selling price that takes the three cost factors as inputs. MAPE and model prediction ability, calculated for several products, are used for validation and verification; the average values are 0.0525 for MAPE and 94.75% for prediction ability. The conclusion is that this model can predict the selling price with up to 94.75% accuracy, so it can be used as a tool for the corporate chain store, in particular, to determine the optimal selling price for its products.
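The validation statistic quoted above can be reproduced mechanically: MAPE is the mean absolute percentage error, and the reported prediction ability appears consistent with 100 × (1 − MAPE). The sketch below uses hypothetical prices to show the calculation; the specific prices and the prediction-ability formula are assumptions.

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, expressed as a fraction."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.mean(np.abs((actual - predicted) / actual))

# Hypothetical selling prices (actual vs model output) for a few products.
actual = [12500, 8900, 15200, 7100]
predicted = [12050, 9300, 14600, 7450]
err = mape(actual, predicted)
print(f"MAPE = {err:.4f}, prediction ability = {(1 - err) * 100:.2f}%")
```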
Rodrigues, Diulia C Q; Soares, Atílio P; Costa, Esly F; Costa, Andréa O S
2017-01-01
Cement is one of the most used building materials in the world. The process of cement production involves numerous and complex reactions that occur at different temperatures. Thus, there is great interest in the optimization of cement manufacturing. Clinker production is one of the main steps of cement production and it occurs inside the kiln. In this paper, the dry process of clinker production is analysed in a rotary kiln that operates in counter flow. The main phenomena involved in clinker production are as follows: evaporation of free residual water from the raw material, decomposition of magnesium carbonate, decarbonation, formation of C3A and C4AF, formation of dicalcium silicate, and formation of tricalcium silicate. The main objective of this study was to propose a mathematical model that realistically describes the temperature profile and the concentration of clinker components in a real rotary kiln. In addition, the influence of different inlet speeds of gas and solids in the system was analysed. The mathematical model is composed of partial differential equations. The model was implemented in Mathcad (available at CCA/UFES) and solved using industrial input data. The proposed model satisfactorily describes the temperature and concentration profiles of a real rotary kiln.
Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.
Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald
2017-07-01
The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Mesoscale carbon sequestration site screening and CCS infrastructure analysis.
Keating, Gordon N; Middleton, Richard S; Stauffer, Philip H; Viswanathan, Hari S; Letellier, Bruce C; Pasqualini, Donatella; Pawar, Rajesh J; Wolfsberg, Andrew V
2011-01-01
We explore carbon capture and sequestration (CCS) at the meso-scale, a level of study between regional carbon accounting and highly detailed reservoir models for individual sites. We develop an approach to CO(2) sequestration site screening for industries or energy development policies that involves identification of appropriate sequestration basin, analysis of geologic formations, definition of surface sites, design of infrastructure, and analysis of CO(2) transport and storage costs. Our case study involves carbon management for potential oil shale development in the Piceance-Uinta Basin, CO and UT. This study uses new capabilities of the CO(2)-PENS model for site screening, including reservoir capacity, injectivity, and cost calculations for simple reservoirs at multiple sites. We couple this with a model of optimized source-sink-network infrastructure (SimCCS) to design pipeline networks and minimize CCS cost for a given industry or region. The CLEAR(uff) dynamical assessment model calculates the CO(2) source term for various oil production levels. Nine sites in a 13,300 km(2) area have the capacity to store 6.5 GtCO(2), corresponding to shale-oil production of 1.3 Mbbl/day for 50 years (about 1/4 of U.S. crude oil production). Our results highlight the complex, nonlinear relationship between the spatial deployment of CCS infrastructure and the oil-shale production rate.
NASA Astrophysics Data System (ADS)
Shi, Pengpeng; Zhang, Pengcheng; Jin, Ke; Chen, Zhenmao; Zheng, Xiaojing
2018-04-01
Metal magnetic memory (MMM) testing (also known as micro-magnetic testing) is a new non-destructive electromagnetic testing method that can diagnose ferromagnetic materials at an early stage by measuring the MMM signal directly on the material surface. Previous experiments have shown that many factors affect MMM signals, in particular, the temperature, the elastoplastic state, and the complex environmental magnetic field. However, the fact that there have been only a few studies of either how these factors affect the signals or the physical coupling mechanisms among them seriously limits the industrial applications of MMM testing. In this paper, a nonlinear constitutive relation for a ferromagnetic material considering the influences of temperature and elastoplastic state is established under a weak magnetic field and is used to establish a nonlinear thermo-magneto-elastoplastic coupling model of MMM testing. Comparing with experimental data verifies that the proposed theoretical model can accurately describe the thermo-magneto-elastoplastic coupling influence on MMM signals. The proposed theoretical model can predict the MMM signals in a complex environment and so is expected to provide a theoretical basis for improving the degree of quantification in MMM testing.
NASA Astrophysics Data System (ADS)
Vintila, Iuliana; Gavrus, Adinel
2017-10-01
The present research paper proposes the validation of a rigorous computational model used as a numerical tool to identify the rheological behavior of complex W/O emulsions. Considering a three-dimensional description of a general viscoplastic flow, the thermo-mechanical equations used to identify the rheological laws of fluids or soft materials from global experimental measurements are detailed. Analyses are conducted for complex W/O emulsions, which generally exhibit Bingham behavior, using a power-law shear stress - strain rate dependency and an improved analytical model. Experimental results are investigated for the rheological behavior of crude and refined rapeseed/soybean oils and four types of corresponding W/O emulsions with different physical-chemical compositions. The rheological behavior model was correlated with the thermo-mechanical analysis of a plane-plane rheometer, oil content, chemical composition, particle size and emulsifier concentration. The parameters of the rheological laws describing the behavior of the industrial oils and the concentrated W/O emulsions were computed from estimated shear stresses using a non-linear regression technique, and from experimental torques using the inverse analysis tool designed by A. Gavrus (1992-2000).
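A minimal sketch of the non-linear regression step, fitting a yield-stress power-law (Herschel-Bulkley) curve, of which the Bingham model is the n = 1 special case, to shear stress versus shear rate data, is given below. The data points and initial guesses are hypothetical, and the sketch does not reproduce the inverse analysis from plane-plane rheometer torques used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def herschel_bulkley(gamma_dot, tau0, k, n):
    """Yield-stress power-law model: tau = tau0 + k * gamma_dot**n (Bingham when n = 1)."""
    return tau0 + k * gamma_dot**n

# Hypothetical shear-rate / shear-stress pairs standing in for rheometer data.
gamma_dot = np.array([1, 2, 5, 10, 20, 50, 100, 200], dtype=float)   # 1/s
tau = np.array([6.2, 7.1, 8.9, 10.8, 13.5, 18.9, 24.6, 33.0])        # Pa

popt, _ = curve_fit(herschel_bulkley, gamma_dot, tau, p0=[5.0, 1.0, 0.8])
tau0, k, n = popt
print(f"yield stress = {tau0:.2f} Pa, consistency = {k:.2f}, flow index = {n:.2f}")
```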
1984-10-01
[Extraction residue from a table and surrounding text; recoverable fragments: export categories (textile fibers/products; foods, feeds, beverages; industrial supplies); value of goods exported ($ billions) for 1958, 1968 and 1978, including a figure of $18.1 billion; text fragments mentioning "the character of its government, the soundness of its economy, its industrial efficiency, the development of its internal communications, the quality..."; and the statement that for decades the United States produced more raw materials than its growing industrial complex could consume, moving from a raw-materials-surplus nation to...]
Petroleum accounting principles, procedures, and issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brock, H.R.; Klingstedt, J.P.; Jones, D.M.
1985-01-01
This book begins with the basics and leads one through the complexities of accounting and reporting for the industry. It presents the material one needs as an accountant in the petroleum industry. Examples deal with real problems and issues. It also includes numerous illustrations and examples, as well as sample forms, lease agreements, and industry and governmental regulations.
Atkinson, Jo-An; O'Donnell, Eloise; Wiggers, John; McDonnell, Geoff; Mitchell, Jo; Freebairn, Louise; Indig, Devon; Rychetnik, Lucie
2017-02-15
Development of effective policy responses to address complex public health problems can be challenged by a lack of clarity about the interaction of risk factors driving the problem, differing views of stakeholders on the most appropriate and effective intervention approaches, a lack of evidence to support commonly implemented and acceptable intervention approaches, and a lack of acceptance of effective interventions. Consequently, political considerations, community advocacy and industry lobbying can contribute to a hotly contested debate about the most appropriate course of action; this can hinder consensus and give rise to policy resistance. The problem of alcohol misuse and its associated harms in New South Wales (NSW), Australia, provides a relevant example of such challenges. Dynamic simulation modelling is increasingly being valued by the health sector as a robust tool to support decision making to address complex problems. It allows policy makers to ask 'what-if' questions and test the potential impacts of different policy scenarios over time, before solutions are implemented in the real world. Participatory approaches to modelling enable researchers, policy makers, program planners, practitioners and consumer representatives to collaborate with expert modellers to ensure that models are transparent, incorporate diverse evidence and perspectives, are better aligned to the decision-support needs of policy makers, and can facilitate consensus building for action. This paper outlines a procedure for embedding stakeholder engagement and consensus building in the development of dynamic simulation models that can guide the development of effective, coordinated and acceptable policy responses to complex public health problems, such as alcohol-related harms in NSW.
NASA Astrophysics Data System (ADS)
Nishida, R. T.; Beale, S. B.; Pharoah, J. G.; de Haart, L. G. J.; Blum, L.
2018-01-01
This work is among the first where the results of an extensive experimental research programme are compared to performance calculations of a comprehensive computational fluid dynamics model for a solid oxide fuel cell stack. The model, which combines electrochemical reactions with momentum, heat, and mass transport, is used to obtain results for an established industrial-scale fuel cell stack design with complex manifolds. To validate the model, comparisons with experimentally gathered voltage and temperature data are made for the Jülich Mark-F, 18-cell stack operating in a test furnace. Good agreement is obtained between the model and experiment results for cell voltages and temperature distributions, confirming the validity of the computational methodology for stack design. The transient effects during ramp up of current in the experiment may explain a lower average voltage than model predictions for the power curve.
Users matter : multi-agent systems model of high performance computing cluster users.
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, M. J.; Hood, C. S.; Decision and Information Sciences
2005-01-01
High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.
NASA Astrophysics Data System (ADS)
Komen, E. M. J.; Camilo, L. H.; Shams, A.; Geurts, B. J.; Koren, B.
2017-09-01
LES for industrial applications with complex geometries is mostly characterised by: a) a finite volume CFD method using a non-staggered arrangement of the flow variables and second order accurate spatial and temporal discretisation schemes, b) an implicit top-hat filter, where the filter length is equal to the local computational cell size, and c) eddy-viscosity type LES models. LES based on these three main characteristics is indicated as industrial LES in this paper. It becomes increasingly clear that the numerical dissipation in CFD codes typically used in industrial applications with complex geometries may inhibit the predictive capabilities of explicit LES. Therefore, there is a need to quantify the numerical dissipation rate in such CFD codes. In this paper, we quantify the numerical dissipation rate in physical space based on an analysis of the transport equation for the mean turbulent kinetic energy. Using this method, we quantify the numerical dissipation rate in a quasi-Direct Numerical Simulation (DNS) and in under-resolved DNS of, as a basic demonstration case, fully-developed turbulent channel flow. By quasi-DNS, we denote a DNS performed using a second order accurate finite volume method typically used in industrial applications. Furthermore, we determine and explain the trends in the performance of industrial LES for fully-developed turbulent channel flow for four different Reynolds numbers and three different LES mesh resolutions. The presented explanation of the mechanisms behind the observed trends is based on an analysis of the turbulent kinetic energy budgets. The presented quantitative analyses demonstrate that the numerical errors in the industrial LES computations of the considered turbulent channel flows result in a net numerical dissipation rate which is larger than the subgrid-scale dissipation rate. No new computational methods are presented in this paper. Instead, the main new elements in this paper are our detailed quantification method for the numerical dissipation rate, the application of this method to a quasi-DNS and under-resolved DNS of fully-developed turbulent channel flow, and the explanation of the effects of the numerical dissipation on the observed trends in the performance of industrial LES for fully-developed turbulent channel flows.
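A schematic form of the budget-residual idea is sketched below; the paper's term-by-term bookkeeping is more detailed, but for statistically steady channel flow the resolved mean turbulent kinetic energy budget closes with production, transport, viscous and subgrid-scale dissipation, so the numerical dissipation rate can be recovered as the residual.

```latex
% Numerical dissipation estimated as the residual of the resolved mean TKE budget
% (schematic; the detailed decomposition in the paper may differ)
P_k + T_k - \varepsilon_{\nu} - \varepsilon_{\mathrm{sgs}} - \varepsilon_{\mathrm{num}} = 0
\quad\Longrightarrow\quad
\varepsilon_{\mathrm{num}} = P_k + T_k - \varepsilon_{\nu} - \varepsilon_{\mathrm{sgs}}
```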
Challenges to the development of complex virtual reality surgical simulations.
Seymour, N E; Røtnes, J S
2006-11-01
Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as smaller potential for return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated for their usefulness in formative training and assessment of skill to be established.
Karwowski, Waldemar; Ahram, Tareq Z
2012-01-01
In order to leverage individual and organizational learning and to remain competitive in current turbulent markets it is important for employees, managers, planners and leaders to perform at high levels over time. Employee competence and skills are extremely important matters in view of the general shortage of talent and the mobility of employees with talent. Two factors emerged to have the greatest impact on the competitiveness of complex service systems: improving managerial and employee knowledge attainment for skills, and improving the training and development of the workforce. This paper introduces the knowledge-based user-centered service design approach for sustainable skill and performance improvement in education, design and modeling of the next generation of complex service systems. The rest of the paper covers topics in human factors and sustainable business process modeling for the service industry, and illustrates the user-centered service system development cycle with the integration of systems engineering concepts in service systems. A roadmap for designing service systems of the future is discussed. The framework introduced in this paper is based on key user-centered design principles and systems engineering applications to support service competitiveness.
Spectroscopic confirmation of uranium(VI)-carbonato adsorption complexes on hematite
Bargar, John R.; Reitmeyer, Rebecca; Davis, James A.
1999-01-01
Evaluating societal risks posed by uranium contamination from waste management facilities, mining sites, and heavy industry requires knowledge about uranium transport in groundwater, often the most significant pathway of exposure to humans. It has been proposed that uranium mobility in aquifers may be controlled by adsorption of U(VI)−carbonato complexes on oxide minerals. The existence of such complexes has not been demonstrated, and little is known about their compositions and reaction stoichiometries. We have used attenuated total reflectance Fourier transform infrared (ATR-FTIR) and extended X-ray absorption fine structure (EXAFS) spectroscopies to probe the existence, structures, and compositions of ≡FeOsurface−U(VI)−carbonato complexes on hematite throughout the pH range of uranyl uptake under conditions relevant to aquifers. U(VI)−carbonato complexes were found to be the predominant adsorbed U(VI) species at all pH values examined, a much wider pH range than previously postulated based on analogy to aqueous U(VI)−carbonato complexes, which are trace constituents at pH < 6. This result indicates the inadequacy of the common modeling assumption that the compositions and predominance of adsorbed species can be inferred from aqueous species. By extension, adsorbed carbonato complexes may be of major importance to the groundwater transport of similar actinide contaminants such as neptunium and plutonium.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Windt, Laurent, E-mail: laurent.dewindt@mines-paristech.fr; Bertron, Alexandra; Larreur-Cayol, Steeves
2015-03-15
Interactions of short-chain organic acids with hydrated cement phases affect structure durability in the agro-food and nuclear waste industries but can also be used to modify cement properties. Most previous studies have been experimental, performed at fixed concentrations and pH, without quantitatively discriminating among polyacidity effects, or complexation and salt precipitation processes. This paper addresses such issues by thermodynamic equilibrium calculations for acetic, citric, oxalic and succinic acids and a simplified hydrated CEM-I. The thermodynamic constants collected from the literature allow the speciation to be modeled over a wide range of pH and concentrations. Citric and oxalic acids had a stronger chelating effect than acetic acid, while succinic acid was intermediate. Similarly, Ca-citrate and Ca-oxalate salts were more insoluble than Ca-acetate and Ca-succinate salts. Regarding aluminium complexation, competition among hydroxyls, sulfates, and the organic acids was highlighted. The exploration of acid mixtures showed the preponderant effect of oxalate and citrate over acetate and succinate.
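As an illustration of the mass-action bookkeeping behind such speciation and precipitation calculations (a generic example, not the specific thermodynamic database assembled in the paper), the precipitation of a calcium oxalate salt is governed by the saturation index of its ion activity product relative to the solubility constant:

```latex
% Generic mass-action example for calcium oxalate precipitation (illustrative notation)
\mathrm{Ca^{2+} + C_2O_4^{2-} \rightleftharpoons CaC_2O_4(s)},
\qquad
\mathrm{SI} = \log_{10}\!\left(\frac{a_{\mathrm{Ca^{2+}}}\, a_{\mathrm{C_2O_4^{2-}}}}{K_{sp}}\right)
```

Here SI > 0 indicates supersaturation (precipitation favoured) and SI < 0 indicates undersaturation.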
PumpKin: A tool to find principal pathways in plasma chemical models
NASA Astrophysics Data System (ADS)
Markosyan, A. H.; Luque, A.; Gordillo-Vázquez, F. J.; Ebert, U.
2014-10-01
PumpKin is a software package to find all principal pathways, i.e. the dominant reaction sequences, in chemical reaction systems. Although many tools are available to integrate numerically arbitrarily complex chemical reaction systems, few tools exist in order to analyze the results and interpret them in relatively simple terms. In particular, due to the large disparity in the lifetimes of the interacting components, it is often useful to group reactions into pathways that recycle the fastest species. This allows a researcher to focus on the slow chemical dynamics, eliminating the shortest timescales. Based on the algorithm described by Lehmann (2004), PumpKin automates the process of finding such pathways, allowing the user to analyze complex kinetics and to understand the consumption and production of a certain species of interest. We designed PumpKin with an emphasis on plasma chemical systems but it can also be applied to atmospheric modeling and to industrial applications such as plasma medicine and plasma-assisted combustion.
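A minimal sketch of the lifetime criterion used to select "fast" species is shown below, with hypothetical concentrations and rates; PumpKin's implementation of the Lehmann (2004) pathway-construction algorithm is considerably more involved.

```python
# Minimal illustration of the lifetime criterion for picking "branching" species.
# Hypothetical concentrations and destruction rates; not PumpKin's actual code.
concentrations = {"e": 1e9, "N2(A)": 1e11, "O3": 1e14}          # cm^-3
destruction_rates = {"e": 1e12, "N2(A)": 5e13, "O3": 1e11}      # cm^-3 s^-1

def lifetime(species):
    """Chemical lifetime: concentration divided by total destruction rate (s)."""
    return concentrations[species] / destruction_rates[species]

def branching_species(tau_max):
    """Species recycled faster than the timescale of interest."""
    return [s for s in concentrations if lifetime(s) < tau_max]

# Pathways are then built by merging each fast species' production and
# consumption reactions, leaving only the slow net chemistry visible.
print(branching_species(tau_max=1e-2))   # -> ['e', 'N2(A)']
```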
Myerburg, Robert J; Ullmann, Steven G
2015-04-01
Although identification and management of cardiovascular risk markers have provided important population risk insights and public health benefits, individual risk prediction remains challenging. Using sudden cardiac death risk as a base case, the complex epidemiology of sudden cardiac death risk and the substantial new funding required to study individual risk are explored. Complex epidemiology derives from the multiple subgroups having different denominators and risk profiles, while funding limitations emerge from saturation of conventional sources of research funding without foreseeable opportunities for increases. A resolution to this problem would have to emerge from new sources of funding targeted to individual risk prediction. In this analysis, we explore the possibility of a research funding strategy that would offer business incentives to the insurance industries, while providing support for unresolved research goals. The model is developed for the case of sudden cardiac death risk, but the concept is applicable to other areas of the medical enterprise. © 2015 American Heart Association, Inc.
Assessing Requirements Quality through Requirements Coverage
NASA Technical Reports Server (NTRS)
Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt
2008-01-01
In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, that is, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
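As a small, generic illustration of the model-coverage side (the standard textbook notion of MC/DC, not the specific metrics of [8] or [9]): for the decision a and (b or c), an MC/DC-adequate test set shows that each condition independently affects the outcome.

```python
# Generic MC/DC illustration for the decision: a and (b or c).
# Standard textbook example; not taken from the cited coverage metrics.
def decision(a: bool, b: bool, c: bool) -> bool:
    return a and (b or c)

# Each adjacent pair below differs in exactly one condition and flips the outcome,
# demonstrating that condition's independent effect (4 tests for 3 conditions).
mcdc_tests = [
    (True,  True,  False),   # baseline: decision is True
    (False, True,  False),   # toggling a flips outcome -> shows effect of a
    (True,  False, False),   # toggling b (vs baseline) flips outcome -> effect of b
    (True,  False, True),    # toggling c (vs previous) flips outcome -> effect of c
]
for t in mcdc_tests:
    print(t, decision(*t))
```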
Virtual environments from panoramic images
NASA Astrophysics Data System (ADS)
Chapman, David P.; Deacon, Andrew
1998-12-01
A number of recent projects have demonstrated the utility of Internet-enabled image databases for the documentation of complex, inaccessible and potentially hazardous environments typically encountered in the petrochemical and nuclear industries. Unfortunately, machine vision and image processing techniques have not, to date, enabled the automatic extraction of geometrical data from such images, and thus 3D CAD modeling remains an expensive and laborious manual activity. Recent developments in panoramic image capture and presentation offer an alternative intermediate deliverable which, in turn, offers some of the benefits of a 3D model at a fraction of the cost. Panoramic image display tools such as Apple's QuickTime VR (QTVR) and Live Spaces RealVR provide compelling and accessible digital representations of the real world and justifiably claim to 'put the reality in Virtual Reality.' This paper will demonstrate how such technologies can be customized, extended and linked to facility management systems delivered over a corporate intranet to enable end users to become familiar with remote sites and extract simple dimensional data. In addition, strategies for the integration of such images with documents gathered from 2D or 3D CAD and Process and Instrumentation Diagrams (P&IDs) will be described, as will techniques for precise 'As-Built' modeling using the calibrated images from which panoramas have been derived and the use of textures from these images to increase the realism of rendered scenes. A number of case studies relating to both nuclear and process engineering will demonstrate the extent to which such solutions are scalable in order to deal with the very large volumes of image data required to fully document the large, complex facilities typical of these industry sectors.
View of an unknown industrial building in the Dolphin Jute ...
View of an unknown industrial building in the Dolphin Jute Mill Complex, looking southwest. Note Garret Mountain at upper left and historic Dexter-Lambert smokestack. - Dolphin Manufacturing Company, Spruce & Barbour Streets, Paterson, Passaic County, NJ
Complex deformation routes for direct recycling aluminium alloy scrap via industrial hot extrusion
NASA Astrophysics Data System (ADS)
Paraskevas, Dimos; Kellens, Karel; Kampen, Carlos; Mohammadi, Amirahmad; Duflou, Joost R.
2018-05-01
This paper presents the final results of an industrial project aiming at direct hot extrusion of wrought aluminium alloy scrap at an industrial scale. Two types of complex deformation/extrusion routes were tested for the production of the same profile, starting from AA6060 scrap in the form of machining chips. More specifically, scrap-based billets were extruded through a 2-porthole and a 4-porthole die-set, modified for enhanced scrap consolidation and grain refinement. For comparison purposes, cast billets of the same alloy were extruded through the modified 2-porthole die set. The tensile testing results as well as microstructural investigations show that the 4-porthole extrusion route further improves scrap consolidation compared to the 2-porthole die output. The successful implementation of solid-state recycling directly at the industrial level indicates the technological readiness level of this research.
NASA Astrophysics Data System (ADS)
Chen, Hudong
2001-06-01
There have been considerable advances in Lattice Boltzmann (LB) based methods in the last decade. By now, the fundamental concept of using the approach as an alternative tool for computational fluid dynamics (CFD) has been substantially appreciated and validated in mainstream scientific research and in industrial engineering communities. Lattice Boltzmann based methods possess several major advantages: a) less numerical dissipation due to the linear Lagrange type advection operator in the Boltzmann equation; b) local dynamic interactions suitable for highly parallel processing; c) physical handling of boundary conditions for complicated geometries and accurate control of fluxes; d) microscopically consistent modeling of thermodynamics and of interface properties in complex multiphase flows. It provides a great opportunity to apply the method to practical engineering problems encountered in a wide range of industries from automotive, aerospace to chemical, biomedical, petroleum, nuclear, and others. One of the key challenges is to extend the applicability of this alternative approach to regimes of highly turbulent flows commonly encountered in practical engineering situations involving high Reynolds numbers. Over the past ten years, significant efforts have been made on this front at Exa Corporation in developing a lattice Boltzmann based commercial CFD software, PowerFLOW. It has become a useful computational tool for the simulation of turbulent aerodynamics in practical engineering problems involving extremely complex geometries and flow situations, such as in new automotive vehicle designs worldwide. In this talk, we present an overall LB based algorithm concept along with certain key extensions in order to accurately handle turbulent flows involving extremely complex geometries. To demonstrate the accuracy of turbulent flow simulations, we provide a set of validation results for some well known academic benchmarks. These include straight channels, backward-facing steps, flows over a curved hill and typical NACA airfoils at various angles of attack, including prediction of stall angle. We further provide numerous engineering cases, ranging from external aerodynamics around various car bodies to internal flows involved in various industrial devices. We conclude with a discussion of certain future extensions for complex fluids.
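For reference, the single-relaxation-time (BGK) form that underlies most lattice Boltzmann solvers is recalled below in standard textbook notation; production codes such as the one described here layer turbulence modeling and boundary treatments on top of this basic update.

```latex
% Standard lattice BGK update with second-order equilibrium (textbook notation)
f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\, t + \Delta t) - f_i(\mathbf{x}, t)
  = -\frac{\Delta t}{\tau}\Bigl[f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t)\Bigr],
\qquad
f_i^{\mathrm{eq}} = w_i\,\rho\left[1 + \frac{\mathbf{c}_i\cdot\mathbf{u}}{c_s^{2}}
  + \frac{(\mathbf{c}_i\cdot\mathbf{u})^{2}}{2 c_s^{4}}
  - \frac{\mathbf{u}\cdot\mathbf{u}}{2 c_s^{2}}\right]
```

The macroscopic density and momentum are moments of the distributions f_i, and the kinematic viscosity follows from the relaxation time as ν = c_s²(τ − Δt/2).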
NASA Astrophysics Data System (ADS)
Skryzalin, P. A.; Ramirez, C.; Durrheim, R. J.; Raveloson, A.; Nyblade, A.; Feineman, M. D.
2016-12-01
The Bushveld Igneous Complex contains one of the most studied and economically important layered mafic intrusions in the world. The Rustenburg Layered Suite outcrops in northern South Africa over an area of 65,000 km2, and has a volume of up to 1,000,000 km3. Both the Bushveld Igneous Complex and the Molopo Farms Complex in Botswana intruded the crust at 2.05 Ga. Despite being extensively exploited by the mining industry, many questions still exist regarding the structure of the Bushveld Igneous Complex, specifically the total size and connectivity of the different outcrops. In this study, we used receiver function analysis, a technique for determining the seismic velocity structure of the crust and upper mantle, to search for evidence of the Bushveld at station LBTB, which lies in Botswana, between the Far Western Limb of the Bushveld and the Molopo Farms Complex. The goal of our study was to determine whether a fast, high-density mafic body can be seen in the crust beneath this region using receiver functions. Observation of a high density layer would argue in favor of connectivity of the Bushveld between The Far Western Limb and the Molopo Farms Complex. We forward modeled stacks of receiver functions as well as sub-stacks that were split into azimuthal groups which share similar characteristics. We found that there was no evidence for a high velocity zone in the crust, and that the Moho in this region is located at a depth of 38 ± 3 km, about 8-9 km shallower than Moho depths determined beneath the Bushveld Complex. These two lines of evidence give no reason to assume connectivity between the Bushveld Igneous Complex and the Molopo Farms Complex, and rather suggest two separate intrusive suites.
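For context on how a Moho depth such as 38 ± 3 km is constrained by the P-to-S converted phase, the standard single-layer relation between the Ps-P delay time, the ray parameter p and the average crustal velocities is given below (as in Zhu and Kanamori, 2000); the study's forward modelling of the full receiver functions is more detailed than this.

```latex
% Crustal thickness H from the Ps-P delay time t_Ps for ray parameter p
H = \frac{t_{Ps}}{\sqrt{\dfrac{1}{V_s^{2}} - p^{2}} \;-\; \sqrt{\dfrac{1}{V_p^{2}} - p^{2}}}
```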
Guerreiro, Joana F.; Muir, Alexander; Ramachandran, Subramaniam; Thorner, Jeremy; Sá-Correia, Isabel
2016-01-01
Acetic acid-induced inhibition of yeast growth and metabolism limits the productivity of industrial fermentation processes, especially when lignocellulosic hydrolysates are used as feedstock in industrial biotechnology. Tolerance to acetic acid of food spoilage yeasts is also a problem in the preservation of acidic foods and beverages. Thus, understanding the molecular mechanisms underlying adaptation and tolerance to acetic acid stress is increasingly important in industrial biotechnology and the food industry. Prior genetic screens for S. cerevisiae mutants with increased sensitivity to acetic acid identified loss-of-function mutations in the YPK1 gene, which encodes a protein kinase activated by the Target of Rapamycin (TOR) Complex 2 (TORC2). We show here by several independent criteria that TORC2-Ypk1 signaling is stimulated in response to acetic acid stress. Moreover, we demonstrate that TORC2-mediated Ypk1 phosphorylation and activation is necessary for acetic acid tolerance, and occurs independently of Hrk1, a protein kinase previously implicated in the cellular response to acetic acid. In addition, we show that TORC2-Ypk1-mediated activation of L-serine: palmitoyl-CoA acyltransferase, the enzyme complex that catalyzes the first committed step of sphingolipid biosynthesis, is required for acetic acid tolerance. Furthermore, analysis of the sphingolipid pathway using inhibitors and mutants indicates that it is production of certain complex sphingolipids that contributes to conferring acetic acid tolerance. Consistent with that conclusion, promoting sphingolipid synthesis by adding exogenous long-chain base precursor phytosphingosine to the growth medium enhanced acetic acid tolerance. Thus, appropriate modulation of the TORC2-Ypk1-sphingolipid axis in industrial yeast strains may have utility in improving fermentations of acetic acid-containing feedstocks. PMID:27671892
Identification of specific organic contaminants in different units of a chemical production site.
Dsikowitzky, L; Botalova, O; al Sandouk-Lincke, N A; Schwarzbauer, J
2014-07-01
Due to the very limited number of studies dealing with the chemical composition of industrial wastewaters, many industrial organic contaminants still escape our view and consequently also our control. We present here the chemical characterization of wastewaters from different units of a chemical complex, thereby contributing to the characterization of industrial pollution sources. The chemicals produced in the investigated complex are widely and intensively used and the synthesis processes are common and applied worldwide. The chemical composition of untreated and treated wastewaters from the chemical complex was investigated by applying a non-target screening which allowed for the identification of 39 organic contaminants. According to their application, most of them belonged to four groups: (i) unspecific educts or intermediates of industrial syntheses, (ii) chemicals for the manufacturing of pharmaceuticals, (iii) educts for the synthesis of polymers and resins, and (iv) compounds known as typical constituents of municipal sewage. A number of halogenated compounds with unknown toxicity and very high molecular diversity belonged to the second group. Although these compounds were completely removed or degraded during wastewater treatment, they could be useful as "alarm indicators" for industrial accidents in pharmaceutical manufacturing units or for malfunctions of wastewater treatment plants. Three potential branch-specific indicators for polymer manufacturing were found in the outflow of the complex. Among all compounds, bisphenol A, which was present in the leachate water of the on-site waste deposit, occurred in the highest concentrations, of up to 20,000 μg L⁻¹. The comparison of contaminant loads in the inflow and outflow of the on-site wastewater treatment facility showed that most contaminants were completely or at least significantly removed or degraded during the treatment, except for two alkylthiols, which were enriched during the treatment process. The chemical composition of the inflow samples was highly heterogeneous and varied strongly, reflecting that large-scale industrial synthesis is carried out in batches. The outflow contained mainly unspecific chlorinated educts or intermediates of industrial syntheses as well as compounds known as typical constituents of municipal wastewaters.
Uncovering Randomness and Success in Society
Jalan, Sarika; Sarkar, Camellia; Madhusudanan, Anagha; Dwivedi, Sanjiv Kumar
2014-01-01
An understanding of how individuals shape and impact the evolution of society is vastly limited due to the unavailability of large-scale reliable datasets that can simultaneously capture information regarding individual movements and social interactions. We believe that the popular Indian film industry, “Bollywood”, can provide a social network apt for such a study. Bollywood provides massive amounts of real, unbiased data that spans more than 100 years, and hence this network has been used as a model for the present paper. The nodes which maintain a moderate degree or widely cooperate with the other nodes of the network tend to be more fit (measured as the success of the node in the industry) in comparison to the other nodes. The analysis carried forth in the current work, using a conjoined framework of complex network theory and random matrix theory, aims to quantify the elements that determine the fitness of an individual node and the factors that contribute to the robustness of a network. The authors of this paper believe that the method of study used in the current paper can be extended to study various other industries and organizations. PMID:24533073
Past and present of adolescence in society: the 'teen brain' debate in perspective.
Feixa, Carles
2011-08-01
Understood as the stage in individual life comprised between physiological puberty (a "natural" condition) and the recognition of the adult status (a "cultural" construction), adolescence has been envisaged as a universal condition, a stage in human development to be found in all societies and historical moments. Nevertheless, anthropological findings across space and time depict a more complex panorama. The large variety of situations can be grouped into five big models of adolescence, which correspond to different types of society: "puber" from the primitive stateless societies; "ephebe" from ancient states; "boy and girl" from pre-industrial rural societies; "teenager" from the first industrialisation process and "youngsters" from modern post-industrial societies. In order to describe the features of these five models of youth, this article presents a series of ethnographical examples to illustrate the enormous plasticity of adolescence in past and present. This perspective is to be considered as the psycho-social and cultural environment for adolescent brain development, which will be discussed in depth in this special issue. Copyright © 2011 Elsevier Ltd. All rights reserved.
Engineering: Defining and differentiating its unique culture
NASA Astrophysics Data System (ADS)
Pilotte, Mary K.
The world of work for engineering professionals is changing. At a rapid pace, experienced engineers are exiting the workforce due to retirement of the Baby Boomer generation, while at the same time the problems facing engineers are increasingly complex and frequently global in nature. For firms to protect their knowledge assets, they must ensure that acquired understandings are shared among their engineering work groups. Engineering teaching and learning in the workplace (i.e., knowledge sharing), is a social activity that resides in a social context governed by the professional engineering culture. This quantitative study uses Hofstede's Organizational Cultural Values Model (Hofstede, Neuijen, Ohayv, & Sanders, 1990) to examine dimensions of engineering culture in the workplace, producing a central tendency profile of engineering's cultural practices. Further, it explores through hypotheses if demographic differentiators, including birth generation, gender, race, industry sector of employment, and engineering discipline, play roles in forming engineering cultural practices. Results both corroborate aspects of Hofstede's model and assert new understandings relative to factors influencing dimensions of engineering practice. Outcomes are discussed in terms of their potential impact on industrial knowledge sharing and formation of beneficial engineering cultures.
Sustainable intensification: a multifaceted, systemic approach to international development.
Himmelstein, Jennifer; Ares, Adrian; van Houweling, Emily
2016-12-01
Sustainable intensification (SI) is a term increasingly used to describe a type of approach applied to international agricultural projects. Despite its widespread use, there is still little understanding or knowledge of the various facets of this composite paradigm. A review of the literature has led to the formalization of three principles that convey the current characterization of SI, comprising a whole-system, participatory, agroecological approach. Specific examples of potential bottlenecks to the SI approach are cited, in addition to various technologies and techniques that can be applied to overcome these obstacles. Models of similar, successful approaches to agricultural development are examined, along with higher-level processes. Additionally, this review explores the desired end points of SI and argues for the inclusion of gender and nutrition throughout the process. To properly apply the SI approach, its various aspects need to be understood and adapted to different cultural and geographic situations. New modeling systems and examples of the effective execution of SI strategies can assist with the successful application of the SI paradigm within complex developing communities. © 2016 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Luo, Y.; Nissen-Meyer, T.; Morency, C.; Tromp, J.
2008-12-01
Seismic imaging in the exploration industry is often based upon ray-theoretical migration techniques (e.g., Kirchhoff) or other ideas which neglect some fraction of the seismic wavefield (e.g., wavefield continuation for acoustic-wave first arrivals) in the inversion process. In a companion paper we discuss the possibility of solving the full physical forward problem (i.e., including visco- and poroelastic, anisotropic media) using the spectral-element method. With such a tool at hand, we can readily apply the adjoint method to tomographic inversions, i.e., iteratively improving an initial 3D background model to fit the data. In the context of this inversion process, we draw connections between kernels in adjoint tomography and basic imaging principles in migration. We show that the images obtained by migration are nothing but particular kinds of adjoint kernels (mainly density kernels). Migration is basically a first step in the iterative inversion process of adjoint tomography. We apply the approach to basic 2D problems involving layered structures, overthrusting faults, topography, salt domes, and poroelastic regions.
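As an example of the kernels referred to above, one common form of the density sensitivity kernel in adjoint tomography, built by correlating the forward and adjoint wavefields (following Tromp, Tape and Liu, 2005), is shown below; the companion paper's exact definitions may differ.

```latex
% Density sensitivity kernel from the forward wavefield s and adjoint wavefield s^dagger
K_{\rho}(\mathbf{x}) = -\int_{0}^{T} \rho(\mathbf{x})\,
  \partial_t \mathbf{s}^{\dagger}(\mathbf{x},\, T - t)\cdot
  \partial_t \mathbf{s}(\mathbf{x},\, t)\, \mathrm{d}t
```

Zero-lag correlations of this type are exactly the imaging conditions used in migration, which is the sense in which migration images are particular adjoint kernels.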
Smart Grid Interoperability Maturity Model Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Drummond, R.; Giroti, Tony
The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.
Rueda-Holgado, F; Calvo-Blázquez, L; Cereceda-Balic, F; Pinilla-Gil, E
2016-02-01
Fractionation of elemental contents in atmospheric samples is useful to evaluate pollution levels for risk assessment and pollution source assignment. We present here the main results of a long-term characterization of atmospheric deposition using a recently developed atmospheric elemental fractionation sampler (AEFS) for major and trace element monitoring around an important industrial complex located in the Puchuncaví region (Chile). Atmospheric deposition samples were collected during two sampling campaigns (2010 and 2011) at four sampling locations: La Greda (LG), Los Maitenes (LM), Puchuncaví (PU) and Valle Alegre (VA). Sample digestion and ICP-MS analysis provided elemental deposition values (Al, As, Ba, Cd, Co, Cu, Fe, K, Mn, Pb, Sb, Ti, V and Zn) in the insoluble fraction of the total atmospheric deposition. Results showed that LG, the location closest to the industrial complex, was the most polluted sampling site, with the highest values for the analyzed elements. PU and LM were the next most polluted and, finally, the lowest element concentrations were registered at VA. The application of Principal Component Analysis and Cluster Analysis identified industrial, traffic and mineral-crustal factors. We found critical load exceedances for Pb at all sampling locations in the area affected by the industrial emissions, more significant at LG close to the industrial complex, with a trend to decrease in 2011, whereas no exceedances due to atmospheric deposition were detected for Cd. Copyright © 2015 Elsevier Ltd. All rights reserved.
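A minimal sketch of the multivariate step is given below with a hypothetical sample-by-element matrix (not the study's data): standardized deposition values are decomposed by Principal Component Analysis and the scores are grouped by hierarchical clustering to suggest candidate source factors.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical deposition matrix: rows = samples, columns = elements
# (e.g. Al, As, Cu, Pb, V, Zn); values are illustrative only.
rng = np.random.default_rng(0)
X = rng.lognormal(mean=2.0, sigma=0.8, size=(24, 6))

Xs = StandardScaler().fit_transform(X)        # element-wise standardization
pca = PCA(n_components=3).fit(Xs)
scores = pca.transform(Xs)
print("explained variance ratios:", pca.explained_variance_ratio_)

# Hierarchical clustering of the PCA scores into tentative source groups
labels = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")
print("cluster labels:", labels)
```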
Three-dimensional microstructure simulation of Ni-based superalloy investment castings
NASA Astrophysics Data System (ADS)
Pan, Dong; Xu, Qingyan; Liu, Baicheng
2011-05-01
An integrated macro and micro multi-scale model for the three-dimensional microstructure simulation of Ni-based superalloy investment castings was developed, and applied to industrial castings to investigate grain evolution during solidification. A ray tracing method was used to deal with the complex heat radiation transfer. The microstructure evolution was simulated based on the Modified Cellular Automaton method, which was coupled with three-dimensional nested macro and micro grids. Experiments for Ni-based superalloy turbine wheel investment casting were carried out, which showed a good correspondence with the simulated results. It is indicated that the proposed model is able to predict the microstructure of the casting precisely, which provides a tool for the optimizing process.
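A heavily simplified two-dimensional sketch of the cellular-automaton idea behind such microstructure models is given below (hypothetical nucleation and capture probabilities; it is not the coupled macro-micro Modified Cellular Automaton model of the paper): undercooled liquid cells nucleate new grains at random, and solid cells capture neighbouring liquid cells.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 60
grain = np.zeros((N, N), dtype=int)      # 0 = liquid, >0 = grain identifier
next_id = 1
p_nucleate = 0.0005                      # nucleation probability per liquid cell per step
p_capture = 0.3                          # probability a liquid cell is captured by a solid neighbour

for step in range(200):
    new = grain.copy()
    liquid = np.argwhere(grain == 0)
    # Nucleation in undercooled liquid cells
    for i, j in liquid:
        if rng.random() < p_nucleate:
            new[i, j] = next_id
            next_id += 1
    # Growth: liquid cells adopt the identity of a random solid neighbour
    for i, j in liquid:
        if new[i, j] != 0:
            continue
        neighbours = [grain[(i + di) % N, (j + dj) % N]
                      for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        solid = [g for g in neighbours if g != 0]
        if solid and rng.random() < p_capture:
            new[i, j] = rng.choice(solid)
    grain = new
    if np.all(grain != 0):
        break

print(f"solidified after {step + 1} steps with {next_id - 1} grains")
```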
Radiological impact of natural radionuclides from soils of Salamanca, Mexico.
Mandujano-García, C D; Sosa, M; Mantero, J; Costilla, R; Manjón, G; García-Tenorio, R
2016-11-01
Salamanca is the centre of a large industrial complex associated with the production and refining of oil-derived products in the state of Guanajuato, Mexico. The city also hosts a large chemical industry, and in past years a major fertilizer industry. All of them followed NORM (naturally occurring radioactive materials) industrial activities, where either raw materials or residues enriched in natural radionuclides are handled or generated, which can have an environmental radiological impact on their environmental compartments (e.g. soils and aquatic systems). In this study, activity concentrations of radionuclides from the 238U and 232Th natural series present in superficial urban soils surrounding an industrial complex in Salamanca, México, have been determined to analyse the possible environmental radiological impact of some of the industrial activities. Alpha-particle and gamma-ray spectrometry were used for the radiometric characterization. The results revealed the presence of 10-42, 11-51 and 178-811 Bq/kg of 238U, 232Th and 40K, respectively, without any clear anthropogenic increment in relation to the values normally found in unaffected soils. Thus, the radioactive impact of the industrial activities on the surrounding soils can be evaluated as very low, representing no radiological risk for the health of the population. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Fifield, Leonard S.; Gandhi, Umesh N.
This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, PNNL, in collaboration with Toyota and Magna, developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost, including the material cost, for making the equivalent part in steel was determined and compared to the costs for making the LCF/PA66 part to determine the cost per "saved" pound.
Curriculum Development for the Achievement of Multiple Goals in the Agri-Food Industry.
ERIC Educational Resources Information Center
Stonehouse, D. P.
1994-01-01
The agri-food industry is concerned with maximizing global food output while preventing environmental damage. Agricultural education focuses on multidisciplinary, holistic, and integrative approaches that enhance student capabilities to address this complex issue. (SK)
ERIC Educational Resources Information Center
Daniel, Ryan; Daniel, Leah
2015-01-01
This article reflects on ongoing research-led teaching in the area of creative industries in higher education. Specifically it reports on key work-integrated learning strategies designed to better prepare graduates for the employment sector. The creative industries sector is complex and competitive, characterized by non-linear career paths driven…
ERIC Educational Resources Information Center
Reams, Bernard Dinsmore
The use of complex research agreements for joint research activities between industry and universities is assessed, with attention to the legal rights of the contracting parties. The focus is research relationships between a university and a company or an individual scientist and industry. The historical development and legal foundation of…
The Nubian Complex of Dhofar, Oman: an African middle stone age industry in Southern Arabia.
Rose, Jeffrey I; Usik, Vitaly I; Marks, Anthony E; Hilbert, Yamandu H; Galletti, Christopher S; Parton, Ash; Geiling, Jean Marie; Cerný, Viktor; Morley, Mike W; Roberts, Richard G
2011-01-01
Despite the numerous studies proposing early human population expansions from Africa into Arabia during the Late Pleistocene, no archaeological sites have yet been discovered in Arabia that resemble a specific African industry, which would indicate demographic exchange across the Red Sea. Here we report the discovery of a buried site and more than 100 new surface scatters in the Dhofar region of Oman belonging to a regionally-specific African lithic industry--the late Nubian Complex--known previously only from the northeast and Horn of Africa during Marine Isotope Stage 5, ∼128,000 to 74,000 years ago. Two optically stimulated luminescence age estimates from the open-air site of Aybut Al Auwal in Oman place the Arabian Nubian Complex at ∼106,000 years ago, providing archaeological evidence for the presence of a distinct northeast African Middle Stone Age technocomplex in southern Arabia sometime in the first half of Marine Isotope Stage 5.