Sample records for principles process-product models

  1. Research and exploration of product innovative design for function

    NASA Astrophysics Data System (ADS)

    Wang, Donglin; Wei, Zihui; Wang, Youjiang; Tan, Runhua

    2009-07-01

    Product innovation presupposes the realization of a new function, and realizing the new function requires resolving a contradiction. A new process model for innovative product design is proposed, based on Axiomatic Design (AD) theory and Functional Structure Analysis (FSA), with principles for resolving contradictions embedded in it. In this model, AD theory guides the FSA and identifies the contradiction that must be resolved to reach a principle solution. Contradiction-solving principles are embedded in the model to provide powerful tool support during principle solution, strengthening the innovativeness of the resulting solutions. As a case study, an innovative design of a button-battery separator-paper punching machine was achieved by applying the proposed model.
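A small sketch of one mechanical step in an AD-based workflow of this kind: under AD's Independence Axiom, the design matrix relating functional requirements (FRs) to design parameters (DPs) should be diagonal (uncoupled) or triangular (decoupled); a coupled matrix flags the kind of contradiction the model targets. This classifier is my own illustration, not code from the paper.

```python
# Classify an Axiomatic Design matrix A, where A[i][j] is truthy if
# functional requirement FR_i depends on design parameter DP_j.
# Diagonal -> uncoupled, triangular -> decoupled, otherwise coupled
# (a coupled matrix signals a contradiction to be resolved).

def classify_design_matrix(A):
    n = len(A)
    off_diag = [(i, j) for i in range(n) for j in range(n)
                if i != j and A[i][j]]
    if not off_diag:
        return "uncoupled"
    if all(i > j for i, j in off_diag):   # lower-triangular
        return "decoupled"
    if all(i < j for i, j in off_diag):   # upper-triangular
        return "decoupled"
    return "coupled"

print(classify_design_matrix([[1, 0], [0, 1]]))  # uncoupled
print(classify_design_matrix([[1, 0], [1, 1]]))  # decoupled
print(classify_design_matrix([[1, 1], [1, 1]]))  # coupled
```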

  2. Usability in product design--the importance and need for systematic assessment models in product development--Usa-Design Model (U-D) ©.

    PubMed

    Merino, Giselle Schmidt Alves Díaz; Teixeira, Clarissa Stefani; Schoenardie, Rodrigo Petry; Merino, Eugenio Andrés Diáz; Gontijo, Leila Amaral

    2012-01-01

    In product design, human factors are an element of differentiation, given that today's consumer demands are increasing. Safety, wellbeing, satisfaction, health, effectiveness, efficiency, and other aspects must be effectively incorporated into the product development process. This work proposes a usability assessment model that can be incorporated as an assessment tool. The methodological approach has two stages. First, a literature review focuses specifically on usability and the development of user-centred products. A usability model named Usa-Design (U-D©) is then presented, consisting of four phases: understanding the use context; a preliminary usability assessment (efficiency/effectiveness/satisfaction); assessment of usability principles; and results. U-D© is modular and flexible, allowing the principles used in Phase 3 to be changed according to the needs and scenario of each situation. With qualitative/quantitative measurement scales that are easy to understand and apply, the model is viable and applicable throughout the product development process.

  3. Technological, Economic, and Environmental Optimization of Aluminum Recycling

    NASA Astrophysics Data System (ADS)

    Ioana, Adrian; Semenescu, Augustin

    2013-08-01

    The four strategic directions, covering the entire life cycle of aluminum, are production, primary use, recycling, and reuse. In this work, the following are analyzed and optimized: reducing greenhouse gas emissions from aluminum production, increasing energy efficiency in aluminum production, and maximizing used-product collection, recycling, and reuse. Based on the energy balance at the gaseous-environment level, the conductive transfer model is also analyzed using the finite element method. Several principles of modeling and optimization are presented and analyzed: the principle of analogy, the principle of concepts, and the principle of hierarchization. Based on these principles, an original diagram model is designed together with the corresponding logic diagram. This article also presents and analyzes the main benefits of aluminum recycling and reuse. Their main advantage is that recycling requires only about 5% of the energy consumed to produce aluminum from bauxite. The aluminum recycling and production process emits pollutants such as dioxins and furans, hydrogen chloride, and particulate matter. To control these emissions, aluminum recyclers are required to comply with the National Emission Standards for Hazardous Air Pollutants for Secondary Aluminum Production. The results of the technological, economic, and ecological optimization of aluminum recycling are based on the evaluation of the criteria function in the modeling system.
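A back-of-envelope illustration of the energy argument in this abstract, using the ~5% figure it cites. The primary-production energy value below is an illustrative assumption of mine, not a number from the article.

```python
# Energy saved by recycling aluminum instead of producing it from
# bauxite, taking recycling to need ~5% of the primary energy (as the
# abstract states). PRIMARY_MJ_PER_KG is an assumed round number.

PRIMARY_MJ_PER_KG = 170.0     # assumed primary production energy
RECYCLING_FRACTION = 0.05     # ~5%, per the abstract

def energy_saved_mj(mass_kg, primary=PRIMARY_MJ_PER_KG,
                    frac=RECYCLING_FRACTION):
    """Energy saved by recycling `mass_kg` of aluminum."""
    return mass_kg * primary * (1.0 - frac)

# Recycling one tonne under these assumptions:
print(round(energy_saved_mj(1000.0)))  # -> 161500 (MJ)
```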

  4. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural: the PDS4 information architecture is developed and maintained independently of the infrastructure's process, application, and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries, and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source of configuration for most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance also allows effective management of the informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model, and how an information model-driven architecture exhibits characteristics of agile curation, including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  5. In silico regenerative medicine: how computational tools allow regulatory and financial challenges to be addressed in a volatile market

    PubMed Central

    Geris, L.; Guyot, Y.; Schrooten, J.; Papantoniou, I.

    2016-01-01

    The cell therapy market is a highly volatile one, due to the use of disruptive technologies, the current economic situation and the small size of the market. In such a market, companies as well as academic research institutes are in need of tools to advance their understanding and, at the same time, reduce their R&D costs, increase product quality and productivity, and reduce the time to market. An additional difficulty is the regulatory path that needs to be followed, which is challenging in the case of cell-based therapeutic products and should rely on the implementation of quality by design (QbD) principles. In silico modelling is a tool that allows the above-mentioned challenges to be addressed in the field of regenerative medicine. This review discusses such in silico models and focuses more specifically on the bioprocess. Three (clusters of) examples related to this subject are discussed. The first example comes from the pharmaceutical engineering field where QbD principles and their implementation through the use of in silico models are both a regulatory and economic necessity. The second example is related to the production of red blood cells. The described in silico model is mainly used to investigate the manufacturing process of the cell-therapeutic product, and pays special attention to the economic viability of the process. Finally, we describe the set-up of a model capturing essential events in the development of a tissue-engineered combination product in the context of bone tissue engineering. For each of the examples, a short introduction to some economic aspects is given, followed by a description of the in silico tool or tools that have been developed to allow the implementation of QbD principles and optimal design. PMID:27051516

  7. Application of quality by design principles to the development and technology transfer of a major process improvement for the manufacture of a recombinant protein.

    PubMed

    Looby, Mairead; Ibarra, Neysi; Pierce, James J; Buckley, Kevin; O'Donovan, Eimear; Heenan, Mary; Moran, Enda; Farid, Suzanne S; Baganz, Frank

    2011-01-01

    This study describes the application of quality by design (QbD) principles to the development and implementation of a major manufacturing process improvement for a commercially distributed therapeutic protein produced in Chinese hamster ovary cell culture. The intent of this article is to focus on QbD concepts and provide guidance and understanding on how the various components combine to deliver a robust process in keeping with the principles of QbD. A fed-batch production culture and a virus inactivation step are described as representative examples of upstream and downstream unit operations that were characterized. A systematic approach incorporating QbD principles was applied to both unit operations, involving risk assessment of potential process failure points, small-scale model qualification, design and execution of experiments, definition of operating parameter ranges and process validation acceptance criteria, followed by manufacturing-scale implementation and process validation. Statistical experimental designs were applied to the process characterization studies, evaluating the impact of operating parameters on product quality attributes and process performance parameters. Data from the characterization experiments were used to define the proven acceptable range and classification of operating parameters for each unit operation. Analysis of variance and Monte Carlo simulation methods were used to assess the appropriateness of the process design spaces. Successful implementation and validation of the process in the manufacturing facility, and the subsequent manufacture of hundreds of batches of this therapeutic protein, verify the approaches taken as a suitable model for the development, scale-up and operation of any biopharmaceutical manufacturing process. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
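A hedged sketch of the kind of Monte Carlo design-space assessment the abstract mentions: sample operating parameters around their set points, propagate them through a response model, and estimate the probability that a quality attribute stays in specification. The response surface, set points, and variation below are assumptions standing in for a model fitted to characterization data; they are not from the study.

```python
import random

random.seed(0)

def titer_model(temp_c, ph):
    """Assumed response surface; not the study's fitted model."""
    return 5.0 - 0.8 * abs(temp_c - 36.5) - 2.0 * abs(ph - 7.0)

def in_spec_probability(n=100_000, spec_min=4.0):
    """Fraction of sampled runs whose predicted titer meets spec."""
    hits = 0
    for _ in range(n):
        temp = random.gauss(36.5, 0.3)   # assumed set point +/- variation
        ph = random.gauss(7.0, 0.05)
        if titer_model(temp, ph) >= spec_min:
            hits += 1
    return hits / n

print(f"estimated P(titer in spec): {in_spec_probability():.3f}")
```

A high in-spec probability under realistic parameter variation is one quantitative way to argue a design space is appropriate.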

  8. Modeling and Simulation of Metallurgical Process Based on Hybrid Petri Net

    NASA Astrophysics Data System (ADS)

    Ren, Yujuan; Bao, Hong

    2016-11-01

    To meet the energy-saving and emission-reduction goals of iron and steel enterprises, a growing number of modeling and simulation technologies are being used to study and analyse the metallurgical production process. In this paper, the basic principles of hybrid Petri nets are used to model and analyse the metallurgical process. First, the Hybrid Petri Net System of Metallurgical Process (MPHPNS) is defined and its modeling theory is proposed. Second, an MPHPNS model based on material flow is constructed. The dynamic flow of materials and the real-time change of each technological state in the metallurgical process are simulated with this model. The simulation allows the continuous-event and discrete-event dynamic systems to interact at the same level, and supports production decision-making.

  9. Energy-based culture medium design for biomanufacturing optimization: A case study in monoclonal antibody production by GS-NS0 cells.

    PubMed

    Quiroga-Campano, Ana L; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2018-03-02

    Demand for high-value biologics, a rapidly growing pipeline, and pressure from competition, time to market, and regulators necessitate novel biomanufacturing approaches, including Quality by Design (QbD) principles and Process Analytical Technologies (PAT), to facilitate accelerated, efficient, and effective process development platforms that ensure consistent product quality and reduced lot-to-lot variability. Herein, QbD and PAT principles were incorporated within an innovative in vitro-in silico integrated framework for upstream process development (UPD). The central component of the UPD framework is a mathematical model that predicts dynamic nutrient uptake and average intracellular ATP content, based on biochemical reaction networks, to quantify and characterize energy metabolism and its adaptive response (metabolic shifts) that maintains ATP homeostasis. The accuracy and flexibility of the model depend on critical cell type-, product-, and clone-specific parameters, which are estimated experimentally. The integrated in vitro-in silico platform and the model's predictive capacity reduced the burden, time, and expense of experimentation, resulting in an optimal medium design (an 80% reduction in amino acids relative to commercially available culture media) and a fed-batch feeding strategy that increased productivity by 129%. The framework is a flexible and efficient tool that transforms, improves, and accelerates conventional process development in biomanufacturing, with wide applications including stem cell-based therapies. Copyright © 2018. Published by Elsevier Inc.
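A minimal sketch (not the authors' model) of the dynamic nutrient-uptake idea: Monod-type specific growth with yield-linked substrate consumption, integrated by forward Euler. All parameter values are illustrative assumptions, not GS-NS0 estimates.

```python
# Monod growth on a single limiting substrate; growth stops once the
# substrate is exhausted. Parameters are assumed for illustration.

MU_MAX = 0.04   # 1/h, assumed maximum specific growth rate
K_S = 0.5       # mM, assumed half-saturation constant
Y_XS = 2e8      # cells per mmol substrate, assumed yield

def simulate(x0=2e5, s0=20.0, dt=0.1, hours=120.0):
    """Return final (cells/mL, substrate in mM) after `hours`."""
    x, s = x0, s0
    for _ in range(int(hours / dt)):
        mu = MU_MAX * s / (K_S + s)           # Monod kinetics
        dx = mu * x * dt
        x += dx
        s = max(0.0, s - dx / Y_XS * 1000.0)  # mmol/mL -> mM
    return x, s

x_end, s_end = simulate()
print(f"after 120 h: {x_end:.2e} cells/mL, {s_end:.2f} mM substrate")
```

With these assumed numbers the culture becomes substrate-limited well before 120 h, so the final density is pinned by the mass balance x0 + s0·Y_XS/1000.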

  10. What Should Leaders Do When Inefficiency Is Perceived as a Cost of Inclusivity in Strategic Planning Processes in Health Care?

    PubMed

    Kochar, Aveena; Chisty, Alia

    2017-11-01

    During the development of new health care policies, quality improvement teams can face the challenge of weighing differing opinions within the group, which can hinder progress. In such cases it is essential to refer to the four key principles of quality improvement (QI) as a guide to enhance group cooperation and promote development of the mutual objective. Co-production is a model that emphasizes the participation of the patient, a service receiver, in the production of the services being rendered by the health care professional. By putting the QI principles into practice and using the model of co-production, quality improvement teams can improve the efficiency of health systems and clinical outcomes. © 2017 American Medical Association. All Rights Reserved.

  11. On the Efficiency of Text Production in Vocabulary Learning: An Empirical Study on Iranian GFL-Learners

    ERIC Educational Resources Information Center

    Haghani, Nader; Kiani, Samira

    2018-01-01

    The concept of text-oriented vocabulary exercises is based on Kühn's (2000) three-step model of vocabulary teaching--receptive, reflective and productive vocabulary exercises--which focuses on working with texts. Since production is in principle more effortful than reception--as can be seen from the Levels of Processing Effect--one can…

  12. Optimality and inference in hydrology from entropy production considerations: synthetic hillslope numerical experiments

    NASA Astrophysics Data System (ADS)

    Kollet, S. J.

    2015-05-01

    In this study, entropy production optimization and inference principles are applied to a synthetic semi-arid hillslope in high-resolution, physics-based simulations. The results suggest that entropy production, or power, is indeed maximized, because of the strong nonlinearity of variably saturated flow and competing processes related to soil moisture fluxes, the depletion of gradients, and the movement of a free water table. It thus appears that the maximum entropy production (MEP) principle may indeed be applicable to hydrologic systems. In this application, the free water table constitutes an important degree of freedom in the optimization of entropy production and may also relate the theory to actual observations. In an ensuing analysis, an attempt is made to transfer the complex, "microscopic" hillslope model into a macroscopic model of reduced complexity, using the MEP principle as an inference tool to obtain effective conductance coefficients and forces/gradients. The results demonstrate a new approach to applying MEP to hydrologic systems and may form the basis for fruitful discussions and research in the future.
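The MEP idea can be demonstrated on a classic toy system (my own sketch, not the hillslope model): two reservoirs with linearized radiative loss O = A + B·T exchange a steady flux F; the steady-state temperatures follow from the energy balance, and the entropy production of the exchange has an interior maximum in F. All parameter values are assumptions.

```python
# Two-box MEP toy model: sigma(F) = F * (1/T_cold - 1/T_warm), with
# T_warm and T_cold set by steady-state energy balance. A grid search
# locates the flux that maximizes entropy production.

A, B = -300.0, 2.0        # assumed linearized radiation law, O = A + B*T
I1, I2 = 300.0, 150.0     # assumed absorbed fluxes, W/m^2

def entropy_production(F):
    T_warm = (I1 - A - F) / B   # box 1 balance: I1 = A + B*T + F
    T_cold = (I2 - A + F) / B   # box 2 balance: I2 + F = A + B*T
    return F * (1.0 / T_cold - 1.0 / T_warm)

F_opt = max((0.1 * k for k in range(1501)), key=entropy_production)
print(f"flux maximizing entropy production: F ≈ {F_opt:.1f} W/m^2")
```

The maximum exists because a larger flux increases dissipation but also depletes the temperature gradient driving it, the same trade-off the hillslope study exploits with soil moisture gradients.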

  13. A quality by design approach to scale-up of high-shear wet granulation process.

    PubMed

    Pandey, Preetanshu; Badawy, Sherif

    2016-01-01

    High-shear wet granulation is a complex process, which in turn makes scale-up a challenging task. Scale-up of the high-shear wet granulation process has been studied extensively, and various methodologies have been proposed in the literature. This review article discusses existing scale-up principles and categorizes the various approaches into two main strategies - parameter-based and attribute-based. With the advent of the quality by design (QbD) principle in drug product development, an increased emphasis on the latter approach may be needed to ensure product robustness. In practice, a combination of both scale-up strategies is often utilized. In a QbD paradigm, there is also a need for increased fundamental and mechanistic understanding of the process. This can be achieved either by increased experimentation, which comes at higher cost, or by using modeling techniques, which are also discussed as part of this review.
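As an illustration of the parameter-based strategy (offered as a common example, not as this article's recommendation), one widely used rule keeps the impeller tip speed v = π·D·N constant across scales, so the large-scale impeller speed is N2 = N1·(D1/D2). The bowl diameters and speed below are assumed numbers.

```python
import math

def scaled_impeller_speed(n1_rpm, d1_m, d2_m):
    """Impeller speed at the larger scale preserving tip speed."""
    return n1_rpm * d1_m / d2_m

n1, d1, d2 = 300.0, 0.30, 0.60   # rpm and impeller diameters (assumed)
n2 = scaled_impeller_speed(n1, d1, d2)

tip1 = math.pi * d1 * n1 / 60.0  # tip speed at small scale, m/s
tip2 = math.pi * d2 * n2 / 60.0  # tip speed at large scale, m/s
print(f"N2 = {n2:.0f} rpm, tip speed {tip1:.2f} -> {tip2:.2f} m/s")
```

An attribute-based strategy would instead adjust N2 (and other parameters) until granule attributes such as density or size distribution match the small scale.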

  14. Improving preanalytic processes using the principles of lean production (Toyota Production System).

    PubMed

    Persoon, Thomas J; Zaleski, Sue; Frerichs, Janice

    2006-01-01

    The basic technologies used in preanalytic processes for chemistry tests have been mature for a long time, and improvements in preanalytic processes have lagged behind improvements in analytic and postanalytic processes. We describe our successful efforts to improve chemistry test turnaround time from a central laboratory by improving preanalytic processes, using existing resources and the principles of lean production. Our goal is to report 80% of chemistry tests in less than 1 hour and to no longer recognize a distinction between expedited and routine testing. We used principles of lean production (the Toyota Production System) to redesign preanalytic processes. The redesigned preanalytic process has fewer steps and uses 1-piece flow to move blood samples through the accessioning, centrifugation, and aliquoting processes. Median preanalytic processing time was reduced from 29 to 19 minutes, and the laboratory met the goal of reporting 80% of chemistry results in less than 1 hour for 11 consecutive months.

  15. Computational principles of working memory in sentence comprehension.

    PubMed

    Lewis, Richard L; Vasishth, Shravan; Van Dyke, Julie A

    2006-10-01

    Understanding a sentence requires a working memory of the partial products of comprehension, so that linguistic relations between temporally distal parts of the sentence can be rapidly computed. We describe an emerging theoretical framework for this working memory system that incorporates several independently motivated principles of memory: a sharply limited attentional focus, rapid retrieval of item (but not order) information subject to interference from similar items, and activation decay (forgetting over time). A computational model embodying these principles provides an explanation of the functional capacities and severe limitations of human processing, as well as accounts of reading times. The broad implication is that the detailed nature of cross-linguistic sentence processing emerges from the interaction of general principles of human memory with the specialized task of language comprehension.
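The decay principle described here can be sketched with an ACT-R-style base-level activation, in whose spirit the authors' model is built: activation falls off as a power function of the time since each encoding, and retrieval probability is a logistic function of activation. The parameter values are illustrative assumptions; similarity-based interference is not modeled in this sketch.

```python
import math

D = 0.5         # assumed decay exponent
TAU, S = 0.0, 0.4   # assumed retrieval threshold and noise scale

def activation(encoding_times, now):
    """Base-level activation: ln(sum over uses of (now - t)^-D)."""
    return math.log(sum((now - t) ** -D for t in encoding_times))

def p_retrieval(a):
    """Logistic retrieval probability given activation a."""
    return 1.0 / (1.0 + math.exp(-(a - TAU) / S))

recent = activation([9.0, 9.5], now=10.0)   # two recent encodings
stale = activation([1.0], now=10.0)         # one old encoding
print(f"recent: {p_retrieval(recent):.2f}, stale: {p_retrieval(stale):.2f}")
```

Frequent, recent encodings keep an item retrievable; a single distal encoding decays toward the threshold, mirroring the model's account of locality effects in comprehension.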

  16. A first-principle model of 300 mm Czochralski single-crystal Si production process for predicting crystal radius and crystal growth rate

    NASA Astrophysics Data System (ADS)

    Zheng, Zhongchao; Seto, Tatsuru; Kim, Sanghong; Kano, Manabu; Fujiwara, Toshiyuki; Mizuta, Masahiko; Hasebe, Shinji

    2018-06-01

    The Czochralski (CZ) process is the dominant method for manufacturing large cylindrical single-crystal ingots for the electronics industry. Although many models and control methods for the CZ process have been proposed, they were tested only with small equipment, and few industrial applications have been reported. In this research, we constructed a first-principle model for controlling industrial CZ processes that produce 300 mm single-crystal silicon ingots. The developed model, which consists of energy balance, mass balance, hydrodynamic, and geometrical equations, calculates the crystal radius and the crystal growth rate as output variables from the heater input, the crystal pulling rate, and the crucible rise rate as input variables. To improve accuracy, we modeled the CZ process by considering factors such as changes in the positions of the crucible and the melt level. The model was validated with operation data from an industrial 300 mm CZ process. We compared the calculated and actual values of the crystal radius and the crystal growth rate, and the results demonstrated that the developed model simulates the industrial process with high accuracy.
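A simplified mass-balance relation in the spirit of the paper's geometrical and mass-balance equations (my own sketch, not the authors' model): the melt-level drop caused by solidification couples the growth rate to the pulling and crucible rise rates, giving v_growth = (v_pull − v_crucible) / (1 − (ρ_s/ρ_l)·(r/R)²) for a flat interface and constant radius. The crucible radius and rates below are assumed.

```python
# Steady-shape CZ mass balance: solidifying a crystal of radius r
# lowers the melt level in a crucible of radius R, which adds to the
# relative pull and closes the balance on the growth rate.

RHO_S, RHO_L = 2329.0, 2570.0   # solid / liquid Si density, kg/m^3

def growth_rate(v_pull, v_crucible, r_crystal, r_crucible):
    """Crystal growth rate in the same units as v_pull (e.g. mm/min)."""
    k = (RHO_S / RHO_L) * (r_crystal / r_crucible) ** 2
    return (v_pull - v_crucible) / (1.0 - k)

# 300 mm ingot (r = 0.15 m) in an assumed 0.40 m-radius crucible:
v = growth_rate(v_pull=1.0, v_crucible=0.1, r_crystal=0.15,
                r_crucible=0.40)
print(f"crystal growth rate ≈ {v:.3f} mm/min")
```

The full model adds energy-balance and hydrodynamic equations so that the radius itself, not just the growth rate, becomes a predicted output.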

  17. Clarity of objectives and working principles enhances the success of biomimetic programs.

    PubMed

    Wolff, Jonas O; Wells, David; Reid, Chris R; Blamires, Sean J

    2017-09-26

    Biomimetics, the transfer of functional principles from living systems into product designs, is increasingly being utilized by engineers. Nevertheless, recurring problems must be overcome if it is to avoid becoming a short-lived fad. Here we assess the efficiency and suitability of methods typically employed by examining three flagship examples of biomimetic design approaches from different disciplines: (1) the creation of gecko-inspired adhesives; (2) the synthesis of spider silk; and (3) the derivation of computer algorithms from natural self-organizing systems. We find that identification of the elemental working principles is the most crucial step in the biomimetic design process. It bears the highest risk of failure (e.g. losing the target function) due to false assumptions about the working principle. Common problems that hamper successful implementation are: (i) a discrepancy between biological functions and the desired properties of the product, (ii) uncertainty about objectives and applications, (iii) inherent limits in methodologies, and (iv) false assumptions about the biology of the models. Projects that aim for multi-functional products are particularly challenging to accomplish. We suggest a simplification, modularisation and specification of objectives, and a critical assessment of the suitability of the model. Comparative analyses, experimental manipulation, and numerical simulations followed by tests of artificial models have led to the successful extraction of working principles. A searchable database of biological systems would optimize the choice of a model system in top-down approaches that start at an engineering problem. Only when biomimetic projects become more predictable will there be wider acceptance of biomimetics as an innovative problem-solving tool among engineers and industry.

  18. Utilisation of biomass gasification by-products for onsite energy production.

    PubMed

    Vakalis, S; Sotiropoulos, A; Moustakas, K; Malamis, D; Baratieri, M

    2016-06-01

    Small-scale biomass gasification is a growing sector with increasing applications, owing to the environmental goals of the European Union and the incentivised policies of most European countries. This study addresses two aspects that are at the centre of attention in the operation and development of small-scale gasifiers: reuse of waste and increase of energy efficiency. Several authors have noted that the low electrical efficiency of these systems is the main barrier to further commercial development. In addition, gasification has several by-products that have no further use and are discarded as waste. In this manuscript, a secondary reactor is introduced and modelled. Its main operating principle is the utilisation of char and flue gases for further energy production: these by-products are reformed into secondary producer gas in the secondary reactor, while a set of heat exchangers captures the waste heat and optimises the process. The case study is modelled in a MATLAB-Cantera environment; the model is non-stoichiometric and applies the Gibbs minimisation principle. The simulations show that some of the thermal energy is consumed during the process, owing to the preheating of the flue gases. Nonetheless, the addition of a secondary reactor increases the electrical power production efficiency and the combined heat and power (CHP) efficiency. © The Author(s) 2016.

  19. Using natural selection and optimization for smarter vegetation models - challenges and opportunities

    NASA Astrophysics Data System (ADS)

    Franklin, Oskar; Han, Wang; Dieckmann, Ulf; Cramer, Wolfgang; Brännström, Åke; Pietsch, Stephan; Rovenskaya, Elena; Prentice, Iain Colin

    2017-04-01

    Dynamic global vegetation models (DGVMs) are now indispensable for understanding the biosphere and for estimating the capacity of ecosystems to provide services. The models are continuously developed to include an increasing number of processes and to utilize the growing amounts of observational data becoming available. However, while the versatility of the models increases as new processes and variables are added, their accuracy suffers from the accumulation of uncertainty, especially in the absence of overarching principles controlling their concerted behaviour. We have initiated a collaborative working group to address this problem based on a 'missing law' - adaptation and optimization principles rooted in natural selection. Even though this 'missing law' constrains relationships between traits, and can therefore vastly reduce the number of uncertain parameters in ecosystem models, it has rarely been applied to DGVMs. Our recent research has shown that optimization- and trait-based models of gross primary production can be both much simpler and more accurate than current models based on fixed functional types, and that observed plant carbon allocations and distributions of plant functional traits are predictable with eco-evolutionary models. While there are many other examples of the usefulness of these and other theoretical principles, it is not always straightforward to make them operational in predictive models. In particular, on longer time scales, the representation of functional diversity and the dynamic interactions among individuals and species presents a formidable challenge. Here we present recent ideas on the use of adaptation and optimization principles in vegetation models, including examples of promising developments, but also limitations of the principles and some key challenges.

  20. Maximum entropy production: Can it be used to constrain conceptual hydrological models?

    Treesearch

    M.C. Westhoff; E. Zehe

    2013-01-01

    In recent years, optimality principles have been proposed to constrain hydrological models. The principle of maximum entropy production (MEP) is one of the proposed principles and is subject of this study. It states that a steady state system is organized in such a way that entropy production is maximized. Although successful applications have been reported in...

  1. Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Oppenheim, Jacob N.; Magnasco, Marcelo O.

    2013-01-01

    The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
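The Gabor limit quoted here can be checked numerically: for a Gaussian pulse, the product of the temporal and spectral widths (standard deviations of |f(t)|² and |F(ν)|²) equals exactly 1/(4π), the bound the subjects beat. This is a standard textbook check, computed below with a plain O(N²) discrete Fourier transform from the standard library.

```python
import cmath
import math

N, T = 512, 48.0                  # samples and total time window (s)
dt = T / N
ts = [(n - N // 2) * dt for n in range(N)]
sig = [math.exp(-t * t / 2.0) for t in ts]   # Gaussian, sigma = 1

def std_dev(xs, weights):
    """Weighted standard deviation of xs under the given weights."""
    w = sum(weights)
    m = sum(x * p for x, p in zip(xs, weights)) / w
    return math.sqrt(sum((x - m) ** 2 * p
                         for x, p in zip(xs, weights)) / w)

# Spectrum evaluated on frequencies k/T for k = -N/2 .. N/2 - 1.
freqs = [(k - N // 2) / T for k in range(N)]
spec = []
for nu in freqs:
    F = sum(s * cmath.exp(-2j * math.pi * nu * t)
            for s, t in zip(sig, ts))
    spec.append(abs(F) ** 2)

product = std_dev(ts, [s * s for s in sig]) * std_dev(freqs, spec)
print(f"Δt·Δν = {product:.4f}  (1/(4π) = {1 / (4 * math.pi):.4f})")
```

Any pulse that is narrower in time broadens in frequency, so no linear time-frequency analysis can push this product below 1/(4π) ≈ 0.0796, which is what makes the psychophysical result diagnostic of nonlinear processing.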

  2. Semi-inner-products in Banach Spaces with Applications to Regularized Learning, Sampling, and Sparse Approximation

    DTIC Science & Technology

    2016-03-13

    Related publication: Jun Zhang et al., "Dynamics of percept formation modeled as operant (selectionist) process," Cognitive Neurodynamics (2013), doi: 10.1007/s11571-013-9262-0. The project develops RKBS-based machine learning methods rooted in human cognitive principles for categorization; the execution plan includes three specific topics ("Aims"): 1. Apply RKBS theory to…

  3. GMP in blood collection and processing.

    PubMed

    Wagstaff, W

    1998-01-01

    The principles of Good Manufacturing Practice have, in the main, been developed for the guidance of the pharmaceutical industry rather than for transfusion services. However, these rules and guides are increasingly being adapted for use in blood centres, in the production of labile blood components and of plasma for fractionation. The guide for pharmaceutical industries produced by the Commission of the European Communities is used as a model here, the nine basic requirements being those applicable to quality management, personnel, premises and equipment, documentation, production, quality control, contract manufacture and analysis, complaints and product recall, and self-inspection. Though having more direct application to the production laboratory preparing blood components, the majority of these requirements and principles are also directly applicable to all of the activities involved in blood collection.

  4. 3-PG simulations of young ponderosa pine plantations under varied management intensity: why do they grow so differently?

    Treesearch

    Liang Wei; Marshall John; Jianwei Zhang; Hang Zhou; Robert Powers

    2014-01-01

    Models can be powerful tools for estimating forest productivity and guiding forest management, but their credibility and complexity are often an issue for forest managers. We parameterized a process-based forest growth model, 3-PG (Physiological Principles Predicting Growth), to simulate growth of ponderosa pine (Pinus ponderosa) plantations in...

  5. A practical approach for exploration and modeling of the design space of a bacterial vaccine cultivation process.

    PubMed

    Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M

    2009-10-15

A licensed pharmaceutical process is required to be executed within its validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for biological products, usually require the manufacturer to demonstrate through new or additional clinical testing that the safety and efficacy of the product remain unchanged. Recent changes in pharmaceutical regulations allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space; a manufacturer can then optimize the process within the submitted ranges after the product has entered the market, which allows flexible processes. In this article, the applicability of the process design space concept is investigated for the cultivation step of a vaccine against whooping cough. A design of experiments (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near-infrared spectroscopy, are used to build a descriptive model of the processes in the experimental design. Finally, the data of all processes are integrated into a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of process analytical technology (PAT) and process design space can be applied to an undefined biological product such as a whole-cell vaccine. The model development approach described here allows on-line monitoring and control of cultivation batches, to assure in real time that a process is running within the process design space.
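The core of such a DoE is enumerating combinations of critical process parameters at chosen levels. A minimal sketch of a two-level full-factorial design; the parameter names and levels here are illustrative assumptions, not the cultivation settings used in the study:

```python
from itertools import product

# Hypothetical critical process parameters with low/high levels
# (illustrative only; not the actual cultivation-process ranges).
factors = {
    "pH": (6.8, 7.4),
    "temperature_C": (33.0, 37.0),
    "duration_h": (24.0, 48.0),
}

def full_factorial(factors):
    """Enumerate every combination of factor levels: a 2^k design for k factors."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

runs = full_factorial(factors)
print(len(runs))  # 2^3 = 8 experimental runs spanning the candidate design space
```

Each run dictionary then defines one cultivation batch; the measured quality attributes of those batches feed the descriptive and batch-monitoring models the abstract describes.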

  6. Application of Quality by Design to the characterization of the cell culture process of an Fc-Fusion protein.

    PubMed

    Rouiller, Yolande; Solacroup, Thomas; Deparis, Véronique; Barbafieri, Marco; Gleixner, Ralf; Broly, Hervé; Eon-Duval, Alex

    2012-06-01

The production bioreactor step of an Fc-Fusion protein manufacturing cell culture process was characterized following Quality by Design principles. Using scientific knowledge derived from the literature and process knowledge gathered during development studies and during manufacturing to support clinical trials, potential critical and key process parameters with a possible impact on product quality and process performance, respectively, were identified in a risk assessment exercise. The identified process parameters were evaluated using a design-of-experiments approach. The regression models generated from the data allowed characterization of the impact of the identified process parameters on quality attributes. The main parameters affecting product titer were pH and dissolved oxygen, while those with the highest impact on process- and product-related impurities and variants were pH and culture duration. The models derived from the characterization studies were used to define the cell culture process design space, whose limits were set so as to ensure that the drug substance would consistently have the desired quality. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Articulatory speech synthesis and speech production modelling

    NASA Astrophysics Data System (ADS)

    Huang, Jun

This dissertation addresses the problem of speech synthesis and speech production modelling based on the fundamental principles of human speech production. Unlike the conventional source-filter model, which assumes the independence of the excitation and the acoustic filter, we treat the entire vocal apparatus as one system consisting of a fluid-dynamic aspect and a mechanical part. We model the vocal tract as a three-dimensional moving geometry, and the sound propagation inside the vocal apparatus as three-dimensional nonplane-wave propagation in a viscous fluid described by the Navier-Stokes equations. We first propose a combined minimum-energy and minimum-jerk criterion to estimate the dynamic vocal tract movements during speech production. Both theoretical error-bound analysis and experimental results show that this method achieves a very close match at the target points while avoiding abrupt changes in the articulatory trajectory. Second, a mechanical vocal fold model is used to compute the excitation signal of the vocal tract. The advantage of this model is that it is closely coupled with the vocal tract system on the basis of fundamental aerodynamics; as a result, we obtain an excitation signal with much more detail than the conventional parametric vocal fold excitation model, and strong evidence of source-tract interaction is observed. Finally, we propose a computational model of fricative and stop sounds based on the physical principles of speech production. Its advantage is that it uses an exogenous process to capture the additional nonsteady and nonlinear effects of the flow mode, which are ignored by the conventional source-filter speech production model; a recursive algorithm estimates the model parameters. Experimental results show that this model is able to synthesize good-quality fricative and stop sounds.
Based on our dissertation work, we carefully argue that the articulatory speech production model has the potential to flexibly synthesize natural-quality speech sounds and to provide a compact computational model for speech production that can be beneficial to a wide range of areas in speech signal processing.
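A combined criterion of this kind is usually expressed as a weighted cost over the articulatory trajectory x(t); the following is a generic sketch, with the weights and exact terms being assumptions rather than the dissertation's formulation:

```latex
J \;=\; \int_{0}^{T} \Big[\, \alpha\,\|\dot{\mathbf{x}}(t)\|^{2}
      \;+\; \beta\,\big\|\dddot{\mathbf{x}}(t)\big\|^{2} \,\Big]\, dt
```

minimized subject to the trajectory passing through the phonetic target points: the first term penalizes articulatory effort (energy), the second penalizes jerk, i.e., abrupt changes in acceleration.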

  8. Architectures, representations and processes of language production

    PubMed Central

    Alario, F.-Xavier; Costa, Albert; Ferreira, Victor S.; Pickering, Martin J.

    2007-01-01

    We present an overview of recent research conducted in the field of language production based on papers presented at the first edition of the International Workshop on Language Production (Marseille, France, September 2004). This article comprises two main parts. In the first part, consisting of three sections, we review the articles that are included in this Special Issue. These three sections deal with three different topics of general interest for models of language production: (A) the general organisational principles of the language production system, (B) several aspects of the lexical selection process and (C) the representations and processes used during syntactic encoding. In the second part, we discuss future directions for research in the field of language production, given the considerable developments that have occurred in recent years. PMID:17710209

  9. An improvement in the calculation of the efficiency of oxidative phosphorylation and rate of energy dissipation in mitochondria

    NASA Astrophysics Data System (ADS)

    Ghafuri, Mohazabeh; Golfar, Bahareh; Nosrati, Mohsen; Hoseinkhani, Saman

    2014-12-01

The process of ATP production is one of the most vital processes in living cells and occurs with high efficiency. Thermodynamic evaluation of this process and of the factors involved in oxidative phosphorylation can provide a valuable guide for increasing energy-production efficiency in research and industry. Although energy transduction has been studied qualitatively in several works, there are only a few brief reviews based on mathematical models of this subject. In our previous work, we suggested a mathematical model for ATP production based on non-equilibrium thermodynamic principles. In the present study, based on new findings on the respiratory chain of animal mitochondria, Golfar's model has been used to generate improved results for the efficiency of oxidative phosphorylation and the rate of energy loss. The results calculated from the modified coefficients for the proton pumps of the respiratory-chain enzymes are closer to the experimental results and validate the model.
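Non-equilibrium models of oxidative phosphorylation are commonly built on the linear flux-force formalism for two coupled processes, oxidation (o) driving phosphorylation (p). A sketch of the standard Kedem-Caplan form, not the paper's exact equations:

```latex
J_p = L_{pp} X_p + L_{po} X_o, \qquad
J_o = L_{op} X_p + L_{oo} X_o, \qquad
\eta = \frac{-\,J_p X_p}{J_o X_o}
```

where the J's are fluxes, the X's their conjugate thermodynamic forces, the L's Onsager coefficients (with reciprocity L_{po} = L_{op}), and η the efficiency of oxidative phosphorylation.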

  10. Prebiotic Synthesis of Autocatalytic Products From Formaldehyde-Derived Sugars as the Carbon and Energy Source

    NASA Technical Reports Server (NTRS)

    Weber, Arthur L.

    2003-01-01

Our research objective is to understand and model the chemical processes on the primitive Earth that generated the first autocatalytic molecules and microstructures involved in the origin of life. Our approach involves: (a) investigation of a model origin-of-life process, named the Sugar Model, that is based on the reaction of formaldehyde-derived sugars (trioses and tetroses) with ammonia, and (b) elucidation of the constraints imposed on the chemistry of the origin of life by the fixed energies and rates of C,H,O-organic reactions under mild aqueous conditions. Recently, we demonstrated that under mild aqueous conditions the Sugar Model process yields autocatalytic products and generates organic microspherules (2-20 μm in diameter) that exhibit budding, size uniformity, and chain formation. We also discovered that the sugar substrates of the Sugar Model are capable of reducing nitrite to ammonia under mild aqueous conditions. In addition, studies done in collaboration with Sandra Pizzarrello (Arizona State University) revealed that chiral amino acids (including meteoritic isovaline) catalyze both the synthesis and the specific handedness of chiral sugars. Our systematic survey of the energies and rates of reactions of C,H,O-organic substrates under mild aqueous conditions revealed several general principles (rules) that govern the direction and rate of organic reactions. These reactivity principles constrain the structure of the chemical pathways used in the origin of life, and in modern and primitive metabolism.

  11. 21 CFR 123.10 - Training.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... HACCP principles to fish and fishery product processing at least equivalent to that received under... standardized curriculum. (a) Developing a HACCP plan, which could include adapting a model or generic-type HACCP plan, that is appropriate for a specific processor, in order to meet the requirements of § 123.6(b...

  12. Teacher Education: Considerations for a Knowledge Base Framework.

    ERIC Educational Resources Information Center

    Tumposky, Nancy

    Traditionally, the knowledge base has been defined more as product than process and has encompassed definitions, principles, values, and facts. Recent reforms in teaching and teacher education have brought about efforts to redefine the knowledge base. The reconceptualized knowledge base builds upon the earlier model but gives higher priority to…

13. Mechanism of dehydration of phenols on noble metals using first-principles microkinetic modeling

    USDA-ARS?s Scientific Manuscript database

    Phenolic compounds constitute a sizable fraction of depolymerized biomass and are an ideal feedstock for the production of chemicals such as benzene and toluene. However, these compounds require catalytic upgrade via hydrodeoxygenation (HDO), a process whereby oxygen is removed as water by adding hy...

  14. Methodology for Physics and Engineering of Reliable Products

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Gibbel, Mark

    1996-01-01

Physics-of-failure approaches have gained widespread acceptance within the electronics reliability community. These methodologies involve identifying root-cause failure mechanisms, developing associated models, and using those models to achieve shorter time to market, lower development and build costs, and higher reliability. The methodology outlined herein sets forth a process, based on the integration of physics and engineering principles, for achieving the same goals.

  15. A rugged landscape model for self-organization and emergent leadership in creative problem solving and production groups.

    PubMed

    Guastello, Stephen J; Craven, Joanna; Zygowicz, Karen M; Bock, Benjamin R

    2005-07-01

The process by which an initially leaderless group differentiates into one containing leadership and secondary role structures was examined using the swallowtail catastrophe model and principles of self-organization. The objectives were to identify the control variables in the process of leadership emergence in creative problem-solving groups and in production groups. In the first of two experiments, groups of university students (total N = 114) played a creative problem-solving game. Participants later rated each other on leadership behavior, styles, and variables related to the process of conversation; a performance-quality measure was also included. Control parameters in the swallowtail catastrophe model were identified through a combination of factor analysis and nonlinear regression. Leaders displayed a broad spectrum of behaviors in the general categories of Controlling the Conversation and Creativity in their role-play. In the second experiment, groups of university students (total N = 197) engaged in a laboratory work experiment that had a substantial production-goal component. The same system of ratings and the same modeling strategy were used, along with a work production measure. Leaders in the production task emerged to the extent that they exhibited control over both the creative and production aspects of the task, they could keep tension low, and the externally imposed production goals were realistic.
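For reference, the swallowtail is the cuspoid catastrophe with one order parameter y (here, degree of leadership emergence) and three control parameters a, b, c. In one common normalization its potential and equilibrium surface are (a textbook form, not the authors' fitted regression model):

```latex
V(y) \;=\; \tfrac{1}{5}y^{5} \;-\; \tfrac{1}{3}a\,y^{3} \;-\; \tfrac{1}{2}b\,y^{2} \;-\; c\,y,
\qquad
\frac{\partial V}{\partial y} \;=\; y^{4} - a\,y^{2} - b\,y - c \;=\; 0
```

The nonlinear regression step in such studies estimates which observed group variables act as the controls a, b, and c.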

  16. Ergonomics and design: its principles applied in the industry.

    PubMed

    Tavares, Ademario Santos; Silva, Francisco Nilson da

    2012-01-01

Industrial Design encompasses both product development and the optimization of production processes. In this sense, Ergonomics plays a fundamental role, because its principles, methods and techniques can help operators carry out their tasks more successfully. A case study carried out in industry shows that interaction among the Design, Production Engineering and Materials Engineering departments may improve aspects concerning safety, comfort, efficiency and performance. In this process, Ergonomics proved to be of essential importance to strategic decision making for the improvement of the production section.

  17. Research on manufacturing service behavior modeling based on block chain theory

    NASA Astrophysics Data System (ADS)

    Zhao, Gang; Zhang, Guangli; Liu, Ming; Yu, Shuqin; Liu, Yali; Zhang, Xu

    2018-04-01

According to the attribute characteristics of the processing craft, manufacturing service behavior is divided into service, basic, process, and resource attributes, and an attribute information model of the manufacturing service is established. The manufacturing service behavior information is divided into public and private domains. Blockchain technology is then introduced, and an information model of the manufacturing service based on blockchain principles is established, which solves the problem of sharing and securing processing-behavior information and ensures that data are not tampered with. Based on key-pair verification, a selective publishing mechanism for manufacturing information is established, achieving traceability of product data and guaranteeing processing quality.

  18. Perspective: Maximum caliber is a general variational principle for dynamical systems.

    PubMed

    Dixit, Purushottam D; Wagoner, Jason; Weistuch, Corey; Pressé, Steve; Ghosh, Kingshuk; Dill, Ken A

    2018-01-07

We review here Maximum Caliber (Max Cal), a general variational principle for inferring distributions of paths in dynamical processes and networks. Max Cal is to dynamical trajectories what the principle of maximum entropy is to equilibrium states or stationary populations. In Max Cal, you maximize a path entropy over all possible pathways, subject to dynamical constraints, in order to predict relative path weights. Many well-known relationships of non-equilibrium statistical physics, such as the Green-Kubo fluctuation-dissipation relations, Onsager's reciprocal relations, and Prigogine's minimum entropy production, are limited to near-equilibrium processes. Max Cal is more general. While it can readily derive these results under those limits, Max Cal is also applicable far from equilibrium. We give examples of Max Cal as a method of inference about trajectory distributions from limited data, finding reaction coordinates in bio-molecular simulations, and modeling the complex dynamics of non-thermal systems such as gene regulatory networks or the collective firing of neurons. We also survey its theoretical basis and some limitations.
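The maximization described above can be written compactly (a standard formulation, not quoted from the paper): over probabilities p_Γ of microscopic paths Γ, Max Cal maximizes the path entropy subject to normalization and dynamical constraints ⟨A_i⟩ = a_i,

```latex
\mathcal{C} \;=\; -\sum_{\Gamma} p_{\Gamma} \ln p_{\Gamma}
\quad\Longrightarrow\quad
p_{\Gamma} \;=\; \frac{1}{Z}\,\exp\!\Big(\sum_i \lambda_i\, A_i(\Gamma)\Big)
```

where the Lagrange multipliers λ_i are fixed by the constraint data and Z normalizes the distribution, in direct analogy with the Boltzmann distribution of equilibrium maximum entropy.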

  19. Lean Production as an Innovative Approach to Construction

    NASA Astrophysics Data System (ADS)

    Spišáková, Marcela; Kozlovská, Mária

    2013-06-01

Lean production presents a new approach to construction management that has enabled enterprises to attain very high levels of efficiency, competitiveness and flexibility in production systems. Nowadays, a number of industrial processes are managed in accordance with these advanced management principles [1]. The principles of lean production are applied within integrated design and delivery solutions (IDDS) and prefabricated construction. IDDS uses collaborative work processes and enhanced skills, with integrated data, information and knowledge management, to minimize structural and process inefficiencies and to enhance the value delivered during design, build and operation, and across projects. Prefabrication is one opportunity among construction methods that allows compliance with the principles of sustainable design and provides potential benefits such as faster construction, fewer housing defects, reductions in energy use and waste, and the elimination of environmental and safety risks. This paper presents lean production within IDDS and its potential in modern prefabrication, establishing a basis for realizing the benefits of lean production in the construction industry.

  20. Mathematical model of whole-process calculation for bottom-blowing copper smelting

    NASA Astrophysics Data System (ADS)

    Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Li, He-song

    2017-11-01

The distribution law of materials among smelting products is key to cost accounting and contaminant control, yet it is difficult to determine quickly and accurately by sampling and analysis alone. Mathematical models for material and heat balance in bottom-blowing smelting, converting, anode-furnace refining, and electrolytic refining were established based on the principles of material (element) conservation, energy conservation, and control-index constraints in copper bottom-blowing smelting. A simulation of the entire bottom-blowing copper smelting process was implemented on the self-developed MetCal software platform, and a whole-process simulation for an enterprise in China was then conducted. Results indicated that the quantity and composition of unknown materials, as well as heat-balance information, can be quickly calculated using the model. Comparison with production data revealed that the model can broadly reflect the distribution law of the materials in bottom-blowing copper smelting. This finding provides theoretical guidance for mastering the performance of the entire process.
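At their core, such material-balance models reduce to linear conservation constraints solved simultaneously. A toy sketch with made-up streams and assay values, not the paper's model:

```python
def solve_2x2(a11, a12, b1, a21, a22, b2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("singular balance system")
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Toy copper balance: feed splits into matte (m) and slag (s).
# Total mass:   m + s = 100.0 t
# Copper mass:  0.60*m + 0.05*s = 40.0 t   (assumed assays: 60% Cu matte, 5% Cu slag)
matte, slag = solve_2x2(1.0, 1.0, 100.0, 0.60, 0.05, 40.0)
```

The full model stacks one such conservation equation per element and per process stage, which is why unknown stream quantities and compositions can be recovered from a handful of measured assays.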

  1. Maximum entropy production in environmental and ecological systems.

    PubMed

    Kleidon, Axel; Malhi, Yadvinder; Cox, Peter M

    2010-05-12

    The coupled biosphere-atmosphere system entails a vast range of processes at different scales, from ecosystem exchange fluxes of energy, water and carbon to the processes that drive global biogeochemical cycles, atmospheric composition and, ultimately, the planetary energy balance. These processes are generally complex with numerous interactions and feedbacks, and they are irreversible in their nature, thereby producing entropy. The proposed principle of maximum entropy production (MEP), based on statistical mechanics and information theory, states that thermodynamic processes far from thermodynamic equilibrium will adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate. This issue focuses on the latest development of applications of MEP to the biosphere-atmosphere system including aspects of the atmospheric circulation, the role of clouds, hydrology, vegetation effects, ecosystem exchange of energy and mass, biogeochemical interactions and the Gaia hypothesis. The examples shown in this special issue demonstrate the potential of MEP to contribute to improved understanding and modelling of the biosphere and the wider Earth system, and also explore limitations and constraints to the application of the MEP principle.
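In the standard notation of non-equilibrium thermodynamics (a sketch, not equations from this issue), the entropy production that MEP maximizes is the bilinear sum of conjugate fluxes J_i and forces X_i:

```latex
\sigma \;=\; \sum_i J_i\, X_i \;\ge\; 0
```

MEP then selects, among the steady states permitted by the boundary conditions, the one with maximal σ; for example, a heat flux J driven by a temperature difference contributes J(1/T_cold − 1/T_hot) to σ.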

  2. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

  3. Using the Maximum Entropy Principle as a Unifying Theory Characterization and Sampling of Multi-Scaling Processes in Hydrometeorology

    DTIC Science & Technology

    2015-08-20

The MEP model parameterized turbulent transfer coefficients… fluxes, ocean freshwater fluxes, and regional crop yield, among others. An on-going study suggests that the global annual evapotranspiration (ET) over oceans may be significantly lower than previously thought. Related publication: Bras and Jingfeng Wang, "A model of evapotranspiration based on the theory of maximum entropy production," Water Resources Research (Mar. 2011), doi…

  4. Good Manufacturing Practices (GMP) manufacturing of advanced therapy medicinal products: a novel tailored model for optimizing performance and estimating costs.

    PubMed

    Abou-El-Enein, Mohamed; Römhild, Andy; Kaiser, Daniel; Beier, Carola; Bauer, Gerhard; Volk, Hans-Dieter; Reinke, Petra

    2013-03-01

Advanced therapy medicinal products (ATMP) have gained considerable attention in academia due to their therapeutic potential. Good Manufacturing Practice (GMP) principles ensure the quality and sterility of manufacturing these products. We developed a model for estimating the manufacturing costs of cell therapy products and optimizing the performance of academic GMP facilities. The "Clean-Room Technology Assessment Technique" (CTAT) was tested prospectively in the GMP facility of BCRT, Berlin, Germany, then retrospectively in the GMP facility of the University of California-Davis, California, USA. CTAT is a two-level model: level one identifies operational (core) processes and measures their fixed costs; level two identifies production (supporting) processes and measures their variable costs. The model comprises several tools to measure and optimize the performance of these processes. Manufacturing costs were itemized using an adjusted micro-costing system. CTAT identified GMP activities with strong correlation to the manufacturing process of cell-based products. Building best-practice standards allowed for performance improvement and the elimination of human errors. The model also demonstrated the unidirectional dependencies that may exist among the core GMP activities. When compared to traditional business models, the CTAT assessment resulted in a more accurate allocation of annual expenses. The estimated expenses were used to set a fee structure for both GMP facilities. A mathematical equation was also developed to provide the final product cost. CTAT can be a useful tool in estimating accurate costs for ATMPs manufactured in an optimized GMP process. These estimates are useful when analyzing the cost-effectiveness of these novel interventions. Copyright © 2013 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  5. Integrated Application of Quality-by-Design Principles to Drug Product Development: A Case Study of Brivanib Alaninate Film-Coated Tablets.

    PubMed

    Badawy, Sherif I F; Narang, Ajit S; LaMarche, Keirnan R; Subramanian, Ganeshkumar A; Varia, Sailesh A; Lin, Judy; Stevens, Tim; Shah, Pankaj A

    2016-01-01

Modern drug product development is expected to follow the quality-by-design (QbD) paradigm. Although there are several issue-specific examples in the literature that demonstrate the application of QbD principles, a holistic demonstration of their application to drug product development and control strategy is lacking. This article provides an integrated case study on the systematic application of QbD to product development and demonstrates the implementation of QbD concepts in the different aspects of product and process design for brivanib alaninate film-coated tablets. Using a risk-based approach, the development strategy entailed identification of product critical quality attributes (CQAs), assessment of risks to the CQAs, and experiments to understand and mitigate the identified risks. Quality risk assessments and design of experiments were performed to understand the quality of the input raw materials required for a robust formulation and the impact of manufacturing process parameters on CQAs. In addition to material property and process parameter controls, the proposed control strategy includes the use of process analytical technology and conventional analytical tests to control in-process material attributes and ensure the quality of the final product. Copyright © 2016. Published by Elsevier Inc.

  6. Aura Satellite Mission: Oxford/RAL Spring School in Quantitative Earth Observation

    NASA Technical Reports Server (NTRS)

    Douglass, Anne

    2005-01-01

    The four instruments on Aura are providing new and exciting measurements of stratospheric and tropospheric ozone, species that contribute to ozone production and loss, and long-lived gases such as nitrous oxide and methane that provide information about atmospheric transport. These discussions of atmospheric chemistry will start with the basic principles of ozone production and loss. Aura data will be used where possible to illustrate the pertinent atmospheric processes. Three-dimensional model simulations will be used both to illustrate present capabilities in constituent modeling and to demonstrate how observations are used to evaluate and improve models and our ability to predict future ozone evolution.

  7. HEALTH TECHNOLOGY ASSESSMENT FOR DECISION MAKING IN LATIN AMERICA: GOOD PRACTICE PRINCIPLES.

    PubMed

    Pichon-Riviere, Andrés; Soto, Natalie C; Augustovski, Federico Ariel; García Martí, Sebastián; Sampietro-Colom, Laura

    2018-06-11

    The aim of this study was to identify good practice principles for health technology assessment (HTA) that are the most relevant and of highest priority for application in Latin America and to identify potential barriers to their implementation in the region. HTA good practice principles proposed at the international level were identified and then explored during a deliberative process in a forum of assessors, funders, and product manufacturers. Forty-two representatives from ten Latin American countries participated. Good practice principles proposed at the international level were considered valid and potentially relevant to Latin America. Five principles were identified as priority and with the greatest potential to be strengthened at this time: transparency in the production of HTA, involvement of relevant stakeholders in the HTA process, mechanisms to appeal decisions, clear priority-setting processes in HTA, and a clear link between HTA and decision making. The main challenge identified was to find a balance between the application of these principles and the available resources in a way that would not detract from the production of reports and adaptation to the needs of decision makers. The main recommendation was to progress gradually in strengthening HTA and its link to decision making by developing appropriate processes for each country, without trying to impose, in the short-term, standards taken from examples at the international level without adequate adaptation of these to local contexts.

  8. Generalized mathematical model of red muds’ thickener of alumina production

    NASA Astrophysics Data System (ADS)

    Fedorova, E. R.; Vinogradova, A. A.

    2018-03-01

The article describes the principles of construction of a generalized mathematical model of a red mud thickener in alumina production. The model consists of sub-models of the flocculation zone receiving the solid fraction of the feed slurry, the free-fall and hindered-settling (effective sedimentation) zones, and the clarification zone. The generalized thickener model allows prediction of the solid-fraction content in the thickened product and in the upper discharge. The aggregation sub-model allows calculation of the average size of the floccules created during flocculation in the feedwell. The free-fall and hindered-settling sub-model allows calculation of the concentration profile, taking into account the variable cross-sectional area of the thickener. The clarification-zone sub-model is constructed on the basis of the Kynch theory of sedimentation, supplemented by correction factors.

  9. Kaizen: a process improvement model for the business of health care and perioperative nursing professionals.

    PubMed

    Tetteh, Hassan A

    2012-01-01

    Kaizen is a proven management technique that has a practical application for health care in the context of health care reform and the 2010 Institute of Medicine landmark report on the future of nursing. Compounded productivity is the unique benefit of kaizen, and its principles are change, efficiency, performance of key essential steps, and the elimination of waste through small and continuous process improvements. The kaizen model offers specific instruction for perioperative nurses to achieve process improvement in a five-step framework that includes teamwork, personal discipline, improved morale, quality circles, and suggestions for improvement. Published by Elsevier Inc.

  10. The amount of ergonomics and user involvement in 151 design processes.

    PubMed

    Kok, Barbara N E; Slegers, Karin; Vink, Peter

    2012-01-01

    Ergonomics, usability and user-centered design are terms that are well known among designers. Yet, products often seem to fail to meet the users' needs, resulting in a gap between expected and experienced usability. To understand the possible causes of this gap the actions taken by the designer during the design process are studied in this paper. This can show whether and how certain actions influence the user-friendliness of the design products. The aim of this research was to understand whether ergonomic principles and methods are included in the design process, whether users are involved in this process and whether the experience of the designer (in ergonomics/user involvement) has an effect on the end product usability. In this study the design processes of 151 tangible products of students in design were analyzed. It showed that in 75% of the cases some ergonomic principles were applied. User involvement was performed in only 1/3 of the design cases. Hardly any correlation was found between the designers' experience in ergonomic principles and the way they applied it and no correlations were found between the designers' experience in user involvement and the users' involvement in the design process.

  11. Chemical vapor deposition modeling for high temperature materials

    NASA Technical Reports Server (NTRS)

    Gokoglu, Suleyman A.

    1992-01-01

    The formalism for the accurate modeling of chemical vapor deposition (CVD) processes has matured based on the well established principles of transport phenomena and chemical kinetics in the gas phase and on surfaces. The utility and limitations of such models are discussed in practical applications for high temperature structural materials. Attention is drawn to the complexities and uncertainties in chemical kinetics. Traditional approaches based on only equilibrium thermochemistry and/or transport phenomena are defended as useful tools, within their validity, for engineering purposes. The role of modeling is discussed within the context of establishing the link between CVD process parameters and material microstructures/properties. It is argued that CVD modeling is an essential part of designing CVD equipment and controlling/optimizing CVD processes for the production and/or coating of high performance structural materials.

  12. Optimal control of nutrition restricted dynamics model of Microalgae biomass growth model

    NASA Astrophysics Data System (ADS)

    Ratianingsih, R.; Azim; Nacong, N.; Resnawati; Mardlijah; Widodo, B.

    2017-12-01

    Microalgal biomass is a promising alternative renewable energy resource because lipid can be extracted from it and then processed into biodiesel or bioethanol. Studying biomass extraction in the lipid-synthesis process is important because the process yields only a limited amount of lipid: a mathematical model of nutrition-restricted microalgae biomass growth gives a lipid-to-biomass proportion of only 1/3 in the synthesis process. An optimal control is designed to raise the ratio of lipid formation to the microalgae biomass used in the synthesis process. The Pontryagin minimum/maximum principle is used to obtain the optimal lipid production. The simulation shows that optimal lipid formation can be reached by simultaneously controlling the carbon dioxide rates in respiration and photosynthesis and the nutrient intake rates of liquid waste and urea substrate. The production of controlled microalgae lipid increases 6.5-fold compared with the uncontrolled case.
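
The Pontryagin-principle machinery used in such studies can be illustrated with a forward-backward sweep on a toy problem. This is only a sketch: the logistic growth/harvest dynamics, the objective, and every constant (r, K, c, umax, T) are invented stand-ins for the paper's microalgae system.

```python
# Hedged sketch: optimal control via Pontryagin's principle, solved with
# a forward-backward sweep on a *toy* logistic growth/harvest model.
# All dynamics and constants are illustrative assumptions.
r, K = 1.0, 1.0          # growth rate and carrying capacity (assumed)
c, umax = 0.5, 1.0       # control cost and control bound (assumed)
T, N = 5.0, 500          # horizon and number of time steps
dt = T / N

def sweep(iters=200):
    x = [0.1] * (N + 1)      # state trajectory (biomass)
    lam = [0.0] * (N + 1)    # costate trajectory, lam(T) = 0
    u = [0.0] * (N + 1)      # control (harvest effort)
    for _ in range(iters):
        # forward pass: x' = r x (1 - x/K) - u x
        for k in range(N):
            x[k + 1] = x[k] + dt * (r * x[k] * (1 - x[k] / K) - u[k] * x[k])
        # backward pass: lam' = -dH/dx,
        # with H = u x - c u^2 + lam * (r x (1 - x/K) - u x)
        for k in range(N, 0, -1):
            dHdx = u[k] + lam[k] * (r * (1 - 2 * x[k] / K) - u[k])
            lam[k - 1] = lam[k] + dt * dHdx
        # optimality: dH/du = x - 2 c u - lam x = 0 -> u = x (1 - lam)/(2c),
        # clipped to [0, umax] and relaxed for stability
        for k in range(N + 1):
            cand = x[k] * (1 - lam[k]) / (2 * c)
            u[k] = 0.5 * u[k] + 0.5 * min(umax, max(0.0, cand))
    payoff = sum((u[k] * x[k] - c * u[k] ** 2) * dt for k in range(N))
    return x, lam, u, payoff

x, lam, u, payoff = sweep()
print(f"optimal harvest payoff ~ {payoff:.3f}")
```

The same sweep structure scales to multi-state problems such as the biomass/lipid/nutrient system described above, with one costate per state variable.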

  13. Production of high-quality polydisperse construction mixes for additive 3D technologies.

    NASA Astrophysics Data System (ADS)

    Gerasimov, M. D.; Brazhnik, Yu V.; Gorshkov, P. S.; Latyshev, S. S.

    2018-03-01

    The paper describes a new mixer design that allows production of the high-quality polydisperse powders used in additive 3D technologies. A new principle of dry powder particle mixing is considered, enabling a close-to-ideal distribution of the particles in the shared volume. A mathematical model of the mixer is presented that allows evaluation of quality indicators of the produced mixture. Experimental results are shown and rational values of the mixer's process parameters are obtained.
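
A standard mixture-quality indicator that such a model could report is the coefficient of variation of a key component's concentration across spot samples (lower means better mixed). The sample data below are invented for illustration; the paper's actual quality indicators are not reproduced here.

```python
# Hedged sketch: coefficient of variation (CV) as a mixing-quality
# index. Sample concentrations are invented illustrations.
def mixing_cv(samples):
    """CV of component concentration across spot samples (lower = better)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return (var ** 0.5) / mean

well_mixed = [0.248, 0.252, 0.250, 0.249, 0.251]
poorly_mixed = [0.10, 0.40, 0.22, 0.31, 0.19]
print(mixing_cv(well_mixed), mixing_cv(poorly_mixed))
```

A close-to-ideal particle distribution drives the CV toward zero as the sample size grows.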

  14. Structural principles for computational and de novo design of 4Fe-4S metalloproteins

    PubMed Central

    Nanda, Vikas; Senn, Stefan; Pike, Douglas H.; Rodriguez-Granillo, Agustina; Hansen, Will; Khare, Sagar D.; Noy, Dror

    2017-01-01

    Iron-sulfur centers in metalloproteins can access multiple oxidation states over a broad range of potentials, allowing them to participate in a variety of electron transfer reactions and to serve as catalysts for high-energy redox processes. The nitrogenase FeMo-co cluster converts dinitrogen to ammonia in an eight-electron transfer process. The 2[4Fe-4S]-containing bacterial ferredoxin is an evolutionarily ancient metalloprotein fold and is thought to be a primordial progenitor of extant oxidoreductases. Controlling chemical transformations mediated by iron-sulfur centers, such as nitrogen fixation and hydrogen production as well as the electron transfer reactions involved in photosynthesis, is of tremendous importance for sustainable chemistry and energy production initiatives. As such, there is significant interest in the design of iron-sulfur proteins, both as minimal models for gaining a fundamental understanding of complex natural systems and as lead molecules for industrial and energy applications. Herein, we discuss salient structural characteristics of natural iron-sulfur proteins and how they guide principles for design. Model structures of past designs are analyzed in the context of these principles, potential directions for enhanced designs are presented, and new areas of iron-sulfur protein design are proposed. PMID:26449207

  15. The role of production and teamwork practices in construction safety: a cognitive model and an empirical case study.

    PubMed

    Mitropoulos, Panagiotis Takis; Cupido, Gerardo

    2009-01-01

    In construction, the challenge for researchers and practitioners is to develop work systems (production processes and teams) that can achieve high productivity and high safety at the same time. However, construction accident causation models ignore the role of work practices and teamwork. This study investigates the mechanisms by which production and teamwork practices affect the likelihood of accidents. The paper synthesizes a new model for construction safety based on the cognitive perspective (Fuller's Task-Demand-Capability Interface model, 2005) and then presents an exploratory case study. The case study investigates and compares the work practices of two residential framing crews: a 'High Reliability Crew' (HRC)--that is, a crew with exceptional productivity and safety over several years--and an average-performing crew from the same company. The model explains how the production and teamwork practices generate the work situations that workers face (the task demands) and affect the workers' ability to cope (capabilities). The case study indicates that the work practices of the HRC directly influence the task demands and match them with the applied capabilities. These practices were guided by the 'principle' of avoiding errors and rework and included work planning and preparation, work distribution, managing the production pressures, and quality and behavior monitoring. The Task Demand-Capability model links construction research to a cognitive model of accident causation and provides a new way to conceptualize safety as an emergent property of the production practices and teamwork processes. The empirical evidence indicates that the crews' work practices and team processes strongly affect the task demands, the applied capabilities, and the match between demands and capabilities. The proposed model and the exploratory case study will guide further discovery of work practices and teamwork processes that can increase both productivity and safety in construction operations. Such understanding will enable training of construction foremen and crews in these practices to systematically develop high-reliability crews.

  16. Completing and Adapting Models of Biological Processes

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana; Hinchey, Michael G.; Raffelt, Harald; Rash, James L.; Rouff, Christopher A.; Steffen, Bernhard

    2006-01-01

    We present a learning-based method for model completion and adaptation, which is based on the combination of two approaches: 1) R2D2C, a technique for mechanically transforming system requirements via provably equivalent models to running code, and 2) automata learning-based model extrapolation. The intended impact of this new combination is to make model completion and adaptation accessible to experts of the field, like biologists or engineers. The principle is briefly illustrated by generating models of biological procedures concerning gene activities in the production of proteins, although the main application is going to concern autonomic systems for space exploration.

  17. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, and differences between product types were considered. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluation into crucial design verification factors, by means of a generalized evaluation scale based on product attributes, and applying the design factors in product design can improve users' UD evaluation. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
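
Quantification Theory Type I, used above, is essentially least-squares regression of a numeric response on categorical design factors. For a balanced orthogonal design it reduces to estimating each category's effect as its group mean minus the grand mean, which the sketch below shows on invented hand-tool data (the factors, levels, and scores are illustrations, not the study's data).

```python
# Hedged sketch: Quantification Theory Type I on a balanced design,
# where each category score is the group mean minus the grand mean.
# All factors and ratings are invented illustrations.
ratings = [
    # (grip shape, handle material, mean UD evaluation score)
    ("straight", "plastic", 3.0),
    ("straight", "rubber",  4.0),
    ("bent",     "plastic", 3.5),
    ("bent",     "rubber",  4.5),
]
grand = sum(y for _, _, y in ratings) / len(ratings)

def category_score(factor_index, level):
    """Effect of one category level on the evaluation score."""
    ys = [row[2] for row in ratings if row[factor_index] == level]
    return sum(ys) / len(ys) - grand

effects = {("grip", lvl): category_score(0, lvl) for lvl in ("straight", "bent")}
effects |= {("material", lvl): category_score(1, lvl)
            for lvl in ("plastic", "rubber")}
print(effects)
```

For unbalanced designs the category scores come from solving the full dummy-coded normal equations instead, but the interpretation is the same.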

  18. Adsorption and ion exchange: basic principles and their application in food processing.

    PubMed

    Kammerer, Judith; Carle, Reinhold; Kammerer, Dietmar R

    2011-01-12

    A comprehensive overview of adsorption and ion exchange technology applied for food and nutraceutical production purposes is given in the present paper. Emanating from these fields of application, the main adsorbent and ion-exchange resin materials, their historical development, industrial production, and the main parameters characterizing these sorbents are covered. Furthermore, adsorption and ion exchange processes are detailed, also providing profound insights into kinetics, thermodynamics, and equilibrium model assumptions. In addition, the most important industrial adsorber and ion exchange processes making use of vessels and columns are summarized. Finally, an extensive overview of selected industrial applications of these technologies is provided, which is divided into general applications, food production applications, and the recovery of valuable bio- and technofunctional compounds from the byproducts of plant food processing, which may be used as natural food additives or for their potential health-beneficial effects in functional or enriched foods and nutraceuticals.
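
Among the equilibrium model assumptions such reviews cover, the Langmuir isotherm is the most common. The sketch below shows its two limiting regimes; the capacity and affinity values are illustrative assumptions, not figures from the review.

```python
# Hedged sketch: the Langmuir adsorption isotherm
# q(C) = q_max * K * C / (1 + K * C). Parameters are illustrative.
q_max, K = 120.0, 0.8   # capacity (mg/g) and affinity (L/mg), assumed

def langmuir(c):
    """Equilibrium loading q (mg/g) at liquid concentration c (mg/L)."""
    return q_max * K * c / (1.0 + K * c)

# Loading is ~linear (Henry regime) at low concentration and
# approaches the monolayer capacity q_max at high concentration.
print(langmuir(0.01), langmuir(1000.0))
```

Fitting q_max and K to batch equilibrium data is the usual first step in sizing an industrial adsorber or ion-exchange column.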

  19. Unified reduction principle for the evolution of mutation, migration, and recombination

    PubMed Central

    Altenberg, Lee; Liberman, Uri; Feldman, Marcus W.

    2017-01-01

    Modifier-gene models for the evolution of genetic information transmission between generations of organisms exhibit the reduction principle: Selection favors reduction in the rate of variation production in populations near equilibrium under a balance of constant viability selection and variation production. Whereas this outcome has been proven for a variety of genetic models, it has not been proven in general for multiallelic genetic models of mutation, migration, and recombination modification with arbitrary linkage between the modifier and major genes under viability selection. We show that the reduction principle holds for all of these cases by developing a unifying mathematical framework that characterizes all of these evolutionary models. PMID:28265103
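
The setting behind the reduction principle can be seen in the simplest mutation-selection balance: at equilibrium, mean fitness is 1 - mu (Haldane), so a modifier that lowers the rate of variation production raises equilibrium mean fitness. The one-locus haploid toy model below is only an illustration of that balance, not the paper's multiallelic modifier framework; s and the mutation rates are assumed values.

```python
# Hedged sketch: one-locus haploid mutation-selection balance.
# At equilibrium the mutant frequency is mu/s and mean fitness 1 - mu,
# independent of s. Parameter values are illustrative.
def equilibrium_mean_fitness(mu, s=0.1, iters=5000):
    q = 0.5                      # mutant frequency, arbitrary start
    for _ in range(iters):
        wbar = 1.0 - s * q       # mean fitness (mutant fitness 1 - s)
        # selection then one-way mutation (wild type -> mutant at rate mu)
        q = (q * (1.0 - s) + mu * (1.0 - q)) / wbar
    return 1.0 - s * q

w_low = equilibrium_mean_fitness(mu=0.001)
w_high = equilibrium_mean_fitness(mu=0.01)
print(w_low, w_high)
```

The lower mutation rate yields the higher equilibrium mean fitness, which is the selective pressure that favors reduction-type modifiers near such balances.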

  20. Deep hierarchies in the primate visual cortex: what can we learn for computer vision?

    PubMed

    Krüger, Norbert; Janssen, Peter; Kalkan, Sinan; Lappe, Markus; Leonardis, Ales; Piater, Justus; Rodríguez-Sánchez, Antonio J; Wiskott, Laurenz

    2013-08-01

    Computational modeling of the primate visual system yields insights of potential relevance to some of the challenges that computer vision is facing, such as object recognition and categorization, motion detection and activity recognition, or vision-based navigation and manipulation. This paper reviews some functional principles and structures that are generally thought to underlie the primate visual cortex, and attempts to extract biological principles that could further advance computer vision research. Organized for a computer vision audience, we present functional principles of the processing hierarchies present in the primate visual system considering recent discoveries in neurophysiology. The hierarchical processing in the primate visual system is characterized by a sequence of different levels of processing (on the order of 10) that constitute a deep hierarchy in contrast to the flat vision architectures predominantly used in today's mainstream computer vision. We hope that the functional description of the deep hierarchies realized in the primate visual system provides valuable insights for the design of computer vision algorithms, fostering increasingly productive interaction between biological and computer vision research.

  1. 40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... recovering monomer, reaction products, by-products, or solvent from a stripper operated in batch mode, and the primary condenser recovering monomer, reaction products, by-products, or solvent from a...

  2. The precautionary principle.

    PubMed

    Hayes, A Wallace

    2005-06-01

    The Precautionary Principle in its simplest form states: "When an activity raises threats of harm to human health or the environment, precautionary measures should be taken even if some cause-and-effect relationships are not fully established scientifically". This Principle is the basis for European environmental law, and plays an increasing role in developing environmental health policies as well. It also is used in environmental decision-making in Canada and in several European countries, especially in Denmark, Sweden, and Germany. The Precautionary Principle has been used in the environmental decision-making process and in regulating drugs and other consumer products in the United States. The Precautionary Principle enhances the collection of risk information for, among other items, high production volume chemicals and risk-based analyses in general. It does not eliminate the need for good science or for science-based risk assessments. Public participation is encouraged in both the review process and the decision-making process. The Precautionary Principle encourages, and in some cases may require, transparency of the risk assessment process on health risk of chemicals both for public health and the environment. A debate continues on whether the Principle should embrace the "polluter pays" directive and place the responsibility for providing risk assessment on industry. The best elements of a precautionary approach demand good science and challenge the scientific community to improve methods used for risk assessment.

  3. 76 FR 4360 - Guidance for Industry on Process Validation: General Principles and Practices; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... elements of process validation for the manufacture of human and animal drug and biological products... process validation for the manufacture of human and animal drug and biological products, including APIs. This guidance describes process validation activities in three stages: In Stage 1, Process Design, the...

  4. 40 CFR 63.488 - Methods and procedures for batch front-end process vent group determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... primary condenser recovering monomer, reaction products, by-products, or solvent from a stripper operated in batch mode, and the primary condenser recovering monomer, reaction products, by-products, or...

  5. Creating an environment for caring using lean principles of the Virginia Mason Production System.

    PubMed

    Nelson-Peterson, Dana L; Leppa, Carol J

    2007-06-01

    As healthcare leaders search for viable options to cut costs, increase efficiencies, and improve the product that they offer to customers, many are looking at different business models to adopt. At the same time, an aging workforce of nurses feels the pressure of being overworked and understaffed, resulting in decreased job satisfaction and patient satisfaction. Virginia Mason Medical Center in Seattle, Wash, has implemented the Virginia Mason Production System, using proven concepts adapted from the Toyota Production System that effectively eliminate "muda," or waste, in workplace processes. The authors discuss the application of the Virginia Mason Production System and how it has resulted in increased time for nurses to care for their patients.

  6. Principles to Products: Toward Realizing MOS 2.0

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Delp, Christopher L.

    2012-01-01

    This is a report on the Operations Revitalization Initiative, part of the ongoing NASA-funded Advanced Multi-Mission Operations Systems (AMMOS) program. We are implementing products that significantly improve efficiency and effectiveness of Mission Operations Systems (MOS) for deep-space missions. We take a multi-mission approach, in keeping with our organization's charter to "provide multi-mission tools and services that enable mission customers to operate at a lower total cost to NASA." Focusing first on architectural fundamentals of the MOS, we review the effort's progress. In particular, we note the use of stakeholder interactions and consideration of past lessons learned to motivate a set of Principles that guide the evolution of the AMMOS. Thus guided, we have created essential patterns and connections (detailed in companion papers) that are explicitly modeled and support elaboration at multiple levels of detail (system, sub-system, element...) throughout a MOS. This architecture is realized in design and implementation products that provide lifecycle support to a Mission at the system and subsystem level. The products include adaptable multi-mission engineering documentation that describes essentials such as operational concepts and scenarios, requirements, interfaces and agreements, information models, and mission operations processes. Because we have adopted a model-based system engineering method, these documents and their contents are meaningfully related to one another and to the system model. This means they are both more rigorous and reusable (from mission to mission) than standard system engineering products. The use of models also enables detailed, early (e.g., formulation phase) insight into the impact of changes (e.g., to interfaces or to software) that is rigorous and complete, allowing better decisions on cost or technical trades. Finally, our work provides clear and rigorous specification of operations needs to software developers, further enabling significant gains in productivity.

  7. A Hybrid Fuzzy Model for Lean Product Development Performance Measurement

    NASA Astrophysics Data System (ADS)

    Osezua Aikhuele, Daniel; Mohd Turan, Faiz

    2016-02-01

    In the effort by manufacturing companies to meet emerging consumer demands for mass-customized products, many are turning to the application of lean in their product development process, and this is gradually moving from being a competitive advantage to a necessity. However, owing to a lack of clear understanding of lean performance measurement, many of these companies are unable to implement and fully integrate the lean principle into their product development process. The extensive literature shows that only a few studies have focused systematically on lean product development performance (LPDP) evaluation. To fill this gap, the study proposes a novel hybrid model based on the Fuzzy Reasoning Approach (FRA) and extensions of the Fuzzy-AHP and Fuzzy-TOPSIS methods for the assessment of LPDP. Unlike existing methods, the model considers the importance weight of each decision maker (expert), since the performance criteria/attributes must be rated and these experts have different levels of expertise. The rating is done using a new fuzzy Likert rating scale (membership scale) designed to address the information loss/distortion that results from closed-form scaling and the ordinal nature of the existing Likert scale.
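
The closeness-coefficient step at the heart of a fuzzy-TOPSIS ranking can be sketched with triangular fuzzy numbers. This is a generic illustration only: the criteria, alternatives, and ratings are invented, and the paper's FRA and expert-weighting layers are not reproduced.

```python
# Hedged sketch: fuzzy-TOPSIS closeness coefficients with triangular
# fuzzy numbers (TFNs). All ratings are invented illustrations.
def d(a, b):
    """Vertex distance between two TFNs a = (l, m, u), b = (l, m, u)."""
    return (sum((ai - bi) ** 2 for ai, bi in zip(a, b)) / 3) ** 0.5

# Normalized fuzzy ratings per alternative over two benefit criteria.
ratings = {
    "team_A": [(0.5, 0.7, 0.9), (0.6, 0.8, 1.0)],
    "team_B": [(0.2, 0.4, 0.6), (0.3, 0.5, 0.7)],
}
fpis = (1.0, 1.0, 1.0)   # fuzzy positive ideal solution
fnis = (0.0, 0.0, 0.0)   # fuzzy negative ideal solution

closeness = {}
for alt, tfns in ratings.items():
    d_plus = sum(d(t, fpis) for t in tfns)   # distance to ideal
    d_minus = sum(d(t, fnis) for t in tfns)  # distance to anti-ideal
    closeness[alt] = d_minus / (d_plus + d_minus)

ranking = sorted(closeness, key=closeness.get, reverse=True)
print(ranking, closeness)
```

In a full hybrid model, criterion weights (e.g. from Fuzzy-AHP) and per-expert importance weights multiply into the ratings before this distance step.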

  8. Virtual manufacturing in reality

    NASA Astrophysics Data System (ADS)

    Papstel, Jyri; Saks, Alo

    2000-10-01

    SMEs play an important role in the manufacturing industry, but from time to time they lack the resources to complete a particular order on time. A number of systems have been introduced to produce digital information supporting product and process development activities. The main problem is the lack of direct data transfer between design system modules when a temporary extension of design capacity (virtuality) is needed or when integrated concurrent product development principles are to be implemented. Planning experience in the field is also weakly used. The concept of virtual manufacturing is a supporting idea for solving this problem, but a number of practical problems must first be solved, such as information conformity, data transfer, and the acceptance of unified technological concepts. This paper describes proposed ways to solve these practical problems of virtual manufacturing. The general objective is to introduce a knowledge-based CAPP system as the missing module for virtual manufacturing in the selected product domain. A surface-centered planning concept based on STEP-based modeling principles and a knowledge-based process planning methodology are used to reach this objective. The expected result is a planning module supplied with directly accessible design data and a supporting advising environment. A mould-producing SME would serve as the test basis.

  9. Supercritical Fluid Fractionation of JP-8

    DTIC Science & Technology

    1991-12-26

    applications, such as coffee decaffeination, spice extraction, and lipids purification. The processing principles have also long been well known and practiced... [Table-of-contents excerpt:] PRINCIPLES OF SUPERCRITICAL FLUID EXTRACTION: A. Background on Supercritical Fluid Solubility; B. Supercritical Fluid Extraction Process Operation: 1. Batch Extraction of Solid Materials; 2. Counter-Current Continuous SCF Processing of Liquid Products; 3. Supercritical Fluid Extraction vs ...

  10. Experimenting `learn by doing' and `learn by failing'

    NASA Astrophysics Data System (ADS)

    Pozzi, Rossella; Noè, Carlo; Rossi, Tommaso

    2015-01-01

    According to the literature, in recent years experiential learning has fulfilled engineering students' need for a deep understanding of lean philosophy, demonstrating the advantages and disadvantages of some of the key principles of lean manufacturing. On the other hand, the literature shows how some kinds of game-based experiential learning overlook the daily difficulties that play a central role in manufacturing systems. To fill the need for a game overcoming this lack of vision, an innovative game run directly in the field, named Kart Factory, has been developed. Actual production shifts are simulated while keeping all the elements peculiar to a real production setting (i.e. complexity, effort, safety). The working environment is a real pedal-car assembly department, the products to be assembled have relevant size and weight (i.e. up to approximately 35 kg), and the provided tools are real production equipment (e.g. keys, screwdrivers, trans-pallets, etc.). To maximise the impact on students, a labour-intensive process characterises the production department. The whole training process is based on three educational principles: the Experience Value Principle, the Error Value Principle, and the Team Value Principle. As 'learn by doing' and 'learn by failing' are favoured, theory follows practice, creating the willingness to 'do' instead of just designing or planning. The gathered data prove the Kart Factory's effectiveness in reaching a good knowledge of lean concepts, notwithstanding the students' initial knowledge level.

  11. Application of multivariate analysis and mass transfer principles for refinement of a 3-L bioreactor scale-down model--when shake flasks mimic 15,000-L bioreactors better.

    PubMed

    Ahuja, Sanjeev; Jain, Shilpa; Ram, Kripa

    2015-01-01

    Characterization of manufacturing processes is key to understanding the effects of process parameters on process performance and product quality. These studies are generally conducted using small-scale model systems. Because of the importance of the results derived from these studies, the small-scale model should be predictive of large scale. Typically, small-scale bioreactors, which are considered superior to shake flasks in simulating large-scale bioreactors, are used as the scale-down models for characterizing mammalian cell culture processes. In this article, we describe a case study where a cell culture unit operation in bioreactors using one-sided pH control and their satellites (small-scale runs conducted using the same post-inoculation cultures and nutrient feeds) in 3-L bioreactors and shake flasks indicated that shake flasks mimicked the large-scale performance better than 3-L bioreactors. We detail here how multivariate analysis was used to make the pertinent assessment and to generate the hypothesis for refining the existing 3-L scale-down model. Relevant statistical techniques such as principal component analysis, partial least square, orthogonal partial least square, and discriminant analysis were used to identify the outliers and to determine the discriminatory variables responsible for performance differences at different scales. The resulting analysis, in combination with mass transfer principles, led to the hypothesis that observed similarities between 15,000-L and shake flask runs, and differences between 15,000-L and 3-L runs, were due to pCO2 and pH values. This hypothesis was confirmed by changing the aeration strategy at 3-L scale. By reducing the initial sparge rate in 3-L bioreactor, process performance and product quality data moved closer to that of large scale. © 2015 American Institute of Chemical Engineers.
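
The mass-transfer reasoning behind the refined scale-down model can be sketched as a steady-state balance: cellular CO2 production equals stripping, q = kLa * (C - C_sat), so dissolved CO2 rises as the sparge-dependent kLa falls. The numbers below are illustrative assumptions, not values from the study.

```python
# Hedged sketch: steady-state CO2 balance behind the scale-down
# hypothesis. All parameter values are illustrative assumptions.
q = 2.0          # volumetric CO2 production rate, mmol/L/h (assumed)
c_sat = 0.2      # CO2 saturation at the gas inlet, mmol/L (assumed)

def dissolved_co2(kla):
    """Steady-state dissolved CO2 for a given stripping coefficient kLa (1/h)."""
    return c_sat + q / kla

# Reducing the sparge rate lowers kLa and raises dissolved CO2 (and
# hence pCO2, depressing pH) -- the direction of the 3-L vs 15,000-L
# mismatch discussed above.
high_sparge = dissolved_co2(kla=10.0)
low_sparge = dissolved_co2(kla=2.0)
print(high_sparge, low_sparge)
```

Matching pCO2 across scales therefore means matching q/kLa, not just kLa, which is why the 3-L sparge strategy was adjusted downward.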

  12. Competition and cooperation among similar representations: toward a unified account of facilitative and inhibitory effects of lexical neighbors.

    PubMed

    Chen, Qi; Mirman, Daniel

    2012-04-01

    One of the core principles of how the mind works is the graded, parallel activation of multiple related or similar representations. Parallel activation of multiple representations has been particularly important in the development of theories and models of language processing, where coactivated representations (neighbors) have been shown to exhibit both facilitative and inhibitory effects on word recognition and production. Researchers generally ascribe these effects to interactive activation and competition, but there is no unified explanation for why the effects are facilitative in some cases and inhibitory in others. We present a series of simulations of a simple domain-general interactive activation and competition model that is broadly consistent with more specialized domain-specific models of lexical processing. The results showed that interactive activation and competition can indeed account for the complex pattern of reversals. Critically, the simulations revealed a core computational principle that determines whether neighbor effects are facilitative or inhibitory: strongly active neighbors exert a net inhibitory effect, and weakly active neighbors exert a net facilitative effect.
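
The claimed crossover, strongly active neighbors net inhibitory and weakly active neighbors net facilitative, can be reproduced in a tiny interactive-activation-and-competition network: one shared feature unit feeding two word units that inhibit each other laterally. The architecture and all weights below are invented illustrations, not the paper's simulations.

```python
# Hedged sketch: a 3-unit interactive activation and competition (IAC)
# network. A weak neighbor boosts the target via shared-feature
# feedback; a strong neighbor saturates the feature unit, so only its
# lateral inhibition remains. All weights are invented.
def clip(v):
    return min(1.0, max(0.0, v))

def settle(ext_neighbor, steps=3000, dt=0.2):
    w1 = w2 = f = 0.0    # target word, neighbor word, shared feature
    for _ in range(steps):
        net1 = 0.6 * f - 0.3 * w2 + 0.10        # excitation - lateral inhibition
        net2 = 0.6 * f - 0.3 * w1 + ext_neighbor
        netf = 0.3 + 0.6 * (w1 + w2)            # bottom-up input + word feedback
        w1, w2, f = (clip(w1 + dt * (net1 - w1)),
                     clip(w2 + dt * (net2 - w2)),
                     clip(f + dt * (netf - f)))
    return w1

w_alone = settle(ext_neighbor=-1.0)   # neighbor silenced (baseline)
w_weak = settle(ext_neighbor=0.05)    # weakly active neighbor
w_strong = settle(ext_neighbor=0.9)   # strongly active neighbor
print(w_alone, w_weak, w_strong)
```

The crossover depends on the saturating nonlinearity: in the linear regime feedback facilitation outweighs lateral inhibition, but once the shared feature is pinned at its ceiling only the inhibition scales with neighbor activation.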

  13. UCXp camera imaging principle and key technologies of data post-processing

    NASA Astrophysics Data System (ADS)

    Yuan, Fangyan; Li, Guoqing; Zuo, Zhengli; Liu, Jianmin; Wu, Liang; Yu, Xiaoping; Zhao, Haitao

    2014-03-01

    The large-format digital aerial camera product UCXp was introduced into the Chinese market in 2008; its image consists of 17310 columns and 11310 rows with a pixel size of 6 μm. The UCXp camera has many advantages over other cameras of the same generation, with multiple lenses exposed almost at the same time and no oblique lenses. The camera has a complex imaging process whose principle is detailed in this paper. In addition, the UCXp image post-processing method, including data pre-processing and orthophoto production, is emphasized in this article. Using data from new Beichuan County, this paper describes the data processing and its effects.

  14. Model competencies in regulatory therapeutic product assessment: Health Canada's good review guiding principles as a reviewing community's code of intellectual conduct.

    PubMed

    Lim, Robyn R

    2007-08-01

    This article describes some work from the Therapeutic Products Directorate of Health Canada regarding Good Review Practices (GRP). Background information is provided on the Therapeutic Products Directorate (TPD) and its regulatory activities regarding drug and medical device assessment in both the pre- and post-market setting. The TPD Good Review Guiding Principles (GRGP) are described, which include a Definition of a Good Therapeutic Product Regulatory Review, Ten Hallmarks of a Good Therapeutic Product Regulatory Review, and Ten Precepts. An analysis of the guiding principles discusses possible linkages between the guiding principles and intellectual virtues. Through this analysis a hypothesis is developed that the guiding principles outline a code of intellectual conduct for Health Canada's reviewers of evidence for efficacy, safety, manufacturing quality, and benefit-risk regarding therapeutic products. Opportunities to advance therapeutic product regulatory review as a scientific discipline in its own right and to acknowledge that these reviewers constitute a specific community of practice are discussed. Integration of intellectual and ethical approaches across therapeutic product review sectors is also suggested.

  15. Comparison between 2 methods of solid-liquid extraction for the production of Cinchona calisaya elixir: an experimental kinetics and numerical modeling approach.

    PubMed

    Naviglio, Daniele; Formato, Andrea; Gallo, Monica

    2014-09-01

    The purpose of this study is to compare the extraction process for the production of China elixir starting from the same vegetable mixture, as performed by conventional maceration or a cyclically pressurized extraction process (rapid solid-liquid dynamic extraction) using the Naviglio Extractor. Dry residue was used as a marker for the kinetics of the extraction process because it was proportional to the amount of active principles extracted and, therefore, to their total concentration in the solution. UV spectra of the hydroalcoholic extracts allowed for the identification of the predominant chemical species in the extracts, while the organoleptic tests carried out on the final product provided an indication of the acceptance of the beverage and highlighted features that were not detectable by instrumental analytical techniques. In addition, a numerical simulation of the process has been performed, obtaining useful information about the timing of the process (time history) as well as its mathematical description. © 2014 Institute of Food Technologists®
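    The abstract treats dry residue as a kinetic marker for the extraction but does not reproduce the authors' equations. For orientation only, solid-liquid extractions of this kind are commonly described by first-order mass-transfer kinetics; a minimal sketch with invented parameter values (not the paper's data) is:

    ```python
    import math

    def extract_mass(t, m_inf, k):
        """First-order solid-liquid extraction kinetics: extracted
        dry residue at time t approaches the equilibrium value m_inf
        with rate constant k."""
        return m_inf * (1.0 - math.exp(-k * t))

    def half_time(k):
        """Time to reach half of the equilibrium dry residue."""
        return math.log(2.0) / k

    # Illustrative parameters (hypothetical, not from the paper):
    # conventional maceration vs. cyclically pressurized extraction.
    m_inf = 12.0          # g/L dry residue at equilibrium
    k_maceration = 0.01   # 1/h, slow diffusion-limited transfer
    k_dynamic = 0.50      # 1/h, pressurization accelerates transfer

    speedup = half_time(k_maceration) / half_time(k_dynamic)  # ratio of half-times
    ```

    Given a measured dry-residue time series, k can be fitted by regressing ln(1 - m/m_inf) against t, which is how such a marker yields the extraction "time history" the abstract mentions.
    
    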

  16. Kinetic models in industrial biotechnology - Improving cell factory performance.

    PubMed

    Almquist, Joachim; Cvijovic, Marija; Hatzimanikatis, Vassily; Nielsen, Jens; Jirstrand, Mats

    2014-07-01

    An increasing number of industrial bioprocesses capitalize on living cells by using them as cell factories that convert sugars into chemicals. These processes range from the production of bulk chemicals in yeasts and bacteria to the synthesis of therapeutic proteins in mammalian cell lines. One of the tools in the continuous search for improved performance of such production systems is the development and application of mathematical models. To be of value for industrial biotechnology, mathematical models should be able to assist in the rational design of cell factory properties or in the production processes in which they are utilized. Kinetic models are particularly suitable towards this end because they are capable of representing the complex biochemistry of cells in a more complete way compared to most other types of models. They can, at least in principle, be used to understand in detail, predict, and evaluate the effects of adding, removing, or modifying molecular components of a cell factory and for supporting the design of the bioreactor or fermentation process. However, several challenges still remain before kinetic modeling will reach the degree of maturity required for routine application in industry. Here we review the current status of kinetic cell factory modeling. Emphasis is on modeling methodology concepts, including model network structure, kinetic rate expressions, parameter estimation, optimization methods, identifiability analysis, model reduction, and model validation, but several applications of kinetic models for the improvement of cell factories are also discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
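    As a toy illustration of the kind of kinetic model the review surveys (not taken from the review itself), a single Michaelis-Menten conversion step integrated with explicit Euler already shows the rate-law and mass-balance machinery involved:

    ```python
    # Minimal kinetic cell-factory sketch (illustrative parameters):
    # one Michaelis-Menten step converting substrate s into product p.

    def simulate(s0, vmax, km, dt=0.01, t_end=50.0):
        """Integrate ds/dt = -v, dp/dt = v with v = vmax*s/(km + s)
        using explicit Euler; returns final (s, p)."""
        s, p, t = s0, 0.0, 0.0
        while t < t_end:
            v = vmax * s / (km + s)   # Michaelis-Menten rate law
            s -= v * dt
            p += v * dt               # product gains what substrate loses
            t += dt
        return s, p

    s, p = simulate(s0=10.0, vmax=1.0, km=0.5)
    # mass balance: s + p stays at the initial substrate amount
    ```

    Real cell factory models chain many such rate expressions over a metabolic network; the parameter estimation and identifiability issues the review emphasizes arise precisely because vmax and km for every step are rarely all measurable.
    
    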

  17. Deciphering mRNA Sequence Determinants of Protein Production Rate

    NASA Astrophysics Data System (ADS)

    Szavits-Nossan, Juraj; Ciandrini, Luca; Romano, M. Carmen

    2018-03-01

    One of the greatest challenges in biophysical models of translation is to identify coding sequence features that affect the rate of translation and therefore the overall protein production in the cell. We propose an analytic method to solve a translation model based on the inhomogeneous totally asymmetric simple exclusion process, which allows us to unveil simple design principles of nucleotide sequences determining protein production rates. Our solution shows excellent agreement with numerical genome-wide simulations of S. cerevisiae transcript sequences and predicts that the first 10 codons, which correspond to the ribosome footprint length on the mRNA, together with the value of the initiation rate, are the main determinants of protein production rate under physiological conditions. Finally, we interpret the obtained analytic results based on the evolutionary role of the codons' choice for regulating translation rates and ribosome densities.
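    For flavour, a stripped-down Gillespie simulation of the inhomogeneous TASEP can estimate the protein production rate as the ribosome flux off the lattice. This sketch uses point-size particles and invented rates, whereas the paper's model accounts for the extended ~10-codon ribosome footprint, so it is only a hedged illustration of the process class, not the authors' solved model:

    ```python
    import random

    def tasep_current(alpha, beta, rates, steps=20000, seed=1):
        """Monte Carlo sketch of the inhomogeneous TASEP as a translation
        model: alpha = initiation rate, beta = termination rate,
        rates[i] = elongation rate at codon i (all illustrative).
        Returns the estimated production rate (flux off the lattice)."""
        random.seed(seed)
        L = len(rates)
        lattice = [0] * L
        completed, t = 0, 0.0
        for _ in range(steps):
            # enumerate enabled moves with their rates
            moves = []
            if lattice[0] == 0:
                moves.append(("init", alpha))
            for i in range(L - 1):
                if lattice[i] == 1 and lattice[i + 1] == 0:
                    moves.append((i, rates[i]))
            if lattice[-1] == 1:
                moves.append(("term", beta))
            total = sum(r for _, r in moves)
            t += random.expovariate(total)      # Gillespie waiting time
            x = random.uniform(0.0, total)      # pick a move by rate
            for move, r in moves:
                x -= r
                if x <= 0:
                    break
            if move == "init":
                lattice[0] = 1
            elif move == "term":
                lattice[-1] = 0
                completed += 1
            else:
                lattice[move], lattice[move + 1] = 0, 1
        return completed / t

    # slow codons near the start typically throttle production more,
    # echoing the paper's emphasis on the first codons
    fast = [10.0] * 20
    slow_start = [1.0] * 5 + [10.0] * 15
    print(tasep_current(0.5, 10.0, fast), tasep_current(0.5, 10.0, slow_start))
    ```
    
    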

  18. The heuristic-analytic theory of reasoning: extension and evaluation.

    PubMed

    Evans, Jonathan St B T

    2006-06-01

    An extensively revised heuristic-analytic theory of reasoning is presented incorporating three principles of hypothetical thinking. The theory assumes that reasoning and judgment are facilitated by the formation of epistemic mental models that are generated one at a time (singularity principle) by preconscious heuristic processes that contextualize problems in such a way as to maximize relevance to current goals (relevance principle). Analytic processes evaluate these models but tend to accept them unless there is good reason to reject them (satisficing principle). At a minimum, analytic processing of models is required so as to generate inferences or judgments relevant to the task instructions, but more active intervention may result in modification or replacement of default models generated by the heuristic system. Evidence for this theory is provided by a review of a wide range of literature on thinking and reasoning.

  19. A theoretical study of concentration of profiles of primary cytochemical-enzyme reaction products in membrane-bound cell organelles and its application to lysosomal acid phosphatase.

    PubMed

    Cornelisse, C J; Hermens, W T; Joe, M T; Duijndam, W A; van Duijn, P

    1976-11-01

    A numerical method was developed for computing the steady-state concentration gradient of a diffusible enzyme reaction product in a membrane-limited compartment of a simplified theoretical cell model. In cytochemical enzyme reactions proceeding according to the metal-capture principle, the local concentration of the primary reaction product is an important factor in the onset of the precipitation process and in the distribution of the final reaction product. The following variables were incorporated into the model: enzyme activity, substrate concentration, Km, diffusion coefficient of substrate and product, particle radius and cell radius. The method was applied to lysosomal acid phosphatase. Numerical values for the variables were estimated from experimental data in the literature. The results show that the calculated phosphate concentrations inside lysosomes are several orders of magnitude lower than the critical concentrations for efficient phosphate capture found in a previous experimental model study. Reasons for this apparent discrepancy are discussed.
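    A hedged sketch of the kind of computation described, much simplified (uniform production inside the organelle, a perfect sink at the cell boundary, no enzyme kinetics or Km, so it is not the authors' model): in spherical symmetry the steady state follows from flux balance, 4πr²D dc/dr = -Q(r) with Q(r) the production inside radius r, which can be integrated numerically from the boundary inward.

    ```python
    import math

    def centre_concentration(D, q, a, R, n=20000):
        """Steady-state concentration at the centre of a cell of radius R
        containing a centred organelle (radius a) producing a diffusible
        product at volumetric rate q, with c = 0 at the cell boundary.
        Integrates dc/dr = -Q(r)/(4*pi*D*r**2) from r = R inward."""
        def Q(r):
            ra = min(r, a)
            return (4.0 / 3.0) * math.pi * ra ** 3 * q

        h = R / n
        c = 0.0
        for i in range(n, 0, -1):          # midpoint rule, r = R -> 0
            mid = (i - 0.5) * h
            c += Q(mid) / (4.0 * math.pi * D * mid ** 2) * h
        return c

    # Hypothetical values (um, s), not the paper's estimates:
    # D = 100 um^2/s, q = 1e4 molecules/um^3/s, a = 0.25 um, R = 5 um.
    # Analytically c(0) = q*a**2/(2*D) - q*a**3/(3*D*R) ~ 3.02 here.
    c0 = centre_concentration(D=100.0, q=1e4, a=0.25, R=5.0)
    ```

    The same flux-balance picture explains the abstract's finding: for small organelle radii the interior concentration scales as a², so intralysosomal product levels can sit orders of magnitude below capture thresholds.
    
    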

  20. Integrating the Principles of Toxicology into a Chemistry Curriculum

    EPA Science Inventory

    Designing safer products, processes and materials requires a commitment to engaging a transdisciplinary, systems approach utilizing the principles of chemistry, toxicology, environmental sciences and other allied disciplines. Chemistry and toxicology are inherently complementary ...

  1. Biological evolution of replicator systems: towards a quantitative approach.

    PubMed

    Martin, Osmel; Horvath, J E

    2013-04-01

    The aim of this work is to study the features of a simple replicator chemical model of the relation between kinetic stability and entropy production under the action of external perturbations. We quantitatively explore the different paths leading to evolution in a toy model where two independent replicators compete for the same substrate. To do that, the same scenario described originally by Pross (J Phys Org Chem 17:312-316, 2004) is revisited and new criteria to define the kinetic stability are proposed. Our results suggest that fast replicator populations are continually favored by the effects of strong stochastic environmental fluctuations capable of determining the global population, these fluctuations being assumed to be the only evolutionary force acting. We demonstrate that the process is continually driven by strong perturbations only, and that population crashes may be useful proxies for these catastrophic environmental fluctuations. As expected, such behavior is particularly enhanced under very large-scale perturbations, suggesting a likely dynamical footprint in the recovery patterns of new species after mass extinction events in the Earth's geological past. Furthermore, the hypothesis that natural selection always favors the faster processes may give theoretical support to different studies that claim the applicability of maximum principles like the Maximum Metabolic Flux (MMF) or the Maximum Entropy Production Principle (MEPP), seen as the main goal of biological evolution.
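    A toy numerical version of this scenario (equations, rates, and the crash mechanism are invented for illustration, not Pross's model): two autocatalytic replicators draw on a shared, replenished substrate and are knocked down by random catastrophes; the kinetically faster replicator comes to dominate across regrowth cycles.

    ```python
    import random

    def compete(k1, k2, crash_rate, t_end=500.0, dt=0.01, seed=7):
        """Two replicators X1, X2 grow autocatalytically on substrate s
        (dXi/dt = ki*Xi*s); s is replenished at a fixed inflow; random
        'catastrophes' cut both populations to 1% (floored at 1e-6).
        Returns the final population fraction of the slower replicator."""
        random.seed(seed)
        x1, x2, s, t = 1.0, 1.0, 10.0, 0.0
        while t < t_end:
            r1 = k1 * x1 * s
            r2 = k2 * x2 * s
            x1 += r1 * dt
            x2 += r2 * dt
            s = max(s + (5.0 - r1 - r2) * dt, 0.0)  # inflow - consumption
            if random.random() < crash_rate * dt:   # environmental crash
                x1 = max(x1 * 0.01, 1e-6)
                x2 = max(x2 * 0.01, 1e-6)
            t += dt
        return x1 / (x1 + x2)

    # with strong perturbations the faster replicator (k2 > k1) wins
    frac_fast = 1.0 - compete(k1=0.10, k2=0.12, crash_rate=0.1)
    ```

    The selection mechanism is visible in the equations: d ln(X2/X1)/dt = (k2 - k1)·s > 0, and each crash-and-regrowth cycle keeps s high enough for that divergence to compound, mirroring the abstract's point that strong fluctuations continually favor the fast replicator.
    
    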

  2. Biological Evolution of Replicator Systems: Towards a Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Martin, Osmel; Horvath, J. E.

    2013-04-01

    The aim of this work is to study the features of a simple replicator chemical model of the relation between kinetic stability and entropy production under the action of external perturbations. We quantitatively explore the different paths leading to evolution in a toy model where two independent replicators compete for the same substrate. To do that, the same scenario described originally by Pross (J Phys Org Chem 17:312-316, 2004) is revisited and new criteria to define the kinetic stability are proposed. Our results suggest that fast replicator populations are continually favored by the effects of strong stochastic environmental fluctuations capable of determining the global population, these fluctuations being assumed to be the only evolutionary force acting. We demonstrate that the process is continually driven by strong perturbations only, and that population crashes may be useful proxies for these catastrophic environmental fluctuations. As expected, such behavior is particularly enhanced under very large-scale perturbations, suggesting a likely dynamical footprint in the recovery patterns of new species after mass extinction events in the Earth's geological past. Furthermore, the hypothesis that natural selection always favors the faster processes may give theoretical support to different studies that claim the applicability of maximum principles like the Maximum Metabolic Flux (MMF) or the Maximum Entropy Production Principle (MEPP), seen as the main goal of biological evolution.

  3. Instructors' Use of the Principles of Teaching and Learning during College Class Sessions

    ERIC Educational Resources Information Center

    Foster, Daniel D.; Whittington, M. Susie

    2017-01-01

    The purpose of this study was to measure the frequency of utilization of the Principles of Teaching and Learning (Newcomb, McCracken, Warmbrod, & Whittington, 2004) during college class sessions. Process-product research was implemented (Gage, 1972; Rosenshine & Furst, 1973) using the Principles of Teaching and Learning Assessment (PTLA)…

  4. Tobacco industry responsibility for butts: a Model Tobacco Waste Act

    PubMed Central

    Curtis, Clifton; Novotny, Thomas E; Lee, Kelley; Freiberg, Mike; McLaughlin, Ian

    2017-01-01

    Cigarette butts and other postconsumer products from tobacco use are the most common waste elements picked up worldwide each year during environmental cleanups. Under the environmental principle of Extended Producer Responsibility, tobacco product manufacturers may be held responsible for collection, transport, processing and safe disposal of tobacco product waste (TPW). Legislation has been applied to other toxic and hazardous postconsumer waste products such as paints, pesticide containers and unused pharmaceuticals, to reduce, prevent and mitigate their environmental impacts. Additional product stewardship (PS) requirements may be necessary for other stakeholders and beneficiaries of tobacco product sales and use, especially suppliers, retailers and consumers, in order to ensure effective TPW reduction. This report describes how a Model Tobacco Waste Act may be adopted by national and subnational jurisdictions to address the environmental impacts of TPW. Such a law will also reduce tobacco use and its health consequences by raising attention to the environmental hazards of TPW, increasing the price of tobacco products, and reducing the number of tobacco product retailers. PMID:26931480

  5. Design and optimization of the micro-engine turbine rotor manufacturing using the rapid prototyping technology

    NASA Astrophysics Data System (ADS)

    Vdovin, R. A.; Smelov, V. G.

    2017-02-01

    This work describes the experience in manufacturing the turbine rotor for a micro-engine. It demonstrates the design principles for a complex investment casting process combining the ProCast software with rapid prototyping techniques. At the virtual modelling stage, in addition to optimizing the process parameters, the casting structure was improved to obtain a defect-free section. The real production stage demonstrated the performance and fitness of rapid prototyping techniques for the manufacture of geometrically complex engine-building parts.

  6. A study on using pre-forming blank in single point incremental forming process by finite element analysis

    NASA Astrophysics Data System (ADS)

    Abass, K. I.

    2016-11-01

    Single Point Incremental Forming (SPIF) is a sheet-forming technique based on layered manufacturing principles. The edges of the sheet are clamped while the forming tool is moved along the tool path; a CNC milling machine is used to manufacture the product. SPIF involves extensive plastic deformation, and the description of the process is further complicated by highly nonlinear boundary conditions, namely contact and frictional effects. Due to the complex nature of these models, numerical approaches dominated by Finite Element Analysis (FEA) are now in widespread use. The paper presents the data and main results of an FEA study of the effect of using a pre-formed blank in SPIF. The SPIF process was studied under defined process conditions for the test workpiece, tool, etc., applying ANSYS 11. The results show that the simulation model can predict an ideal profile of the processing track, the behaviour of the tool-workpiece contact, and the product accuracy, by evaluating the thickness, surface strain, and stress distribution along the deformed blank section during the deformation stages.

  7. Fulfillment of GMP standard, halal standard, and applying HACCP for production process of beef floss (Case study: Ksatria enterprise)

    NASA Astrophysics Data System (ADS)

    A'diat, Arkan Addien Al; Liquiddanu, Eko; Laksono, Pringgo Widyo; Sutopo, Wahyudi; Suletra, I. Wayan

    2018-02-01

    Along with the increasing number of modern retail businesses in Indonesia comes an opportunity for small and medium enterprises (SMEs) to sell their products through modern retailers. Among the obstacles faced by SMEs are product standards: SMEs must hold both GMP and halal certification. This research was conducted to assess how the beef floss enterprise in Jagalan fulfils the GMP and halal standards. In addition, a Hazard Analysis and Critical Control Points (HACCP) system was applied to analyze the process. The HACCP used in this research was based on the seven principles in SNI (Indonesian National Standard) 01-4852-1998. The seven principles comprise hazard analysis, critical control point (CCP) determination, critical limit establishment, CCP monitoring system establishment, corrective action establishment, verification, and documentation, all of which must be applied in preparing a HACCP plan. Based on this case study, it is concluded that there are 5 CCPs: the boiling process, roasting process, frying process, the beef floss draining process, and the packaging process.

  8. Achieving Research Impact Through Co-creation in Community-Based Health Services: Literature Review and Case Study.

    PubMed

    Greenhalgh, Trisha; Jackson, Claire; Shaw, Sara; Janamian, Tina

    2016-06-01

    Co-creation (collaborative knowledge generation by academics working alongside other stakeholders) is an increasingly popular approach to aligning research and service development. It has potential for "moving beyond the ivory towers" to deliver significant societal impact via dynamic, locally adaptive community-academic partnerships. Principles of successful co-creation include a systems perspective, a creative approach to research focused on improving human experience, and careful attention to governance and process. If these principles are not followed, co-creation efforts may fail. Co-creation (collaborative knowledge generation by academics working alongside other stakeholders) reflects a "Mode 2" relationship (knowledge production rather than knowledge translation) between universities and society. Co-creation is widely believed to increase research impact. We undertook a narrative review of different models of co-creation relevant to community-based health services. We contrasted their diverse disciplinary roots and highlighted their common philosophical assumptions, principles of success, and explanations for failures. We applied these to an empirical case study of a community-based research-service partnership led by the Centre of Research Excellence in Quality and Safety in Integrated Primary-Secondary Care at the University of Queensland, Australia. Co-creation emerged independently in several fields, including business studies ("value co-creation"), design science ("experience-based co-design"), computer science ("technology co-design"), and community development ("participatory research"). These diverse models share some common features, which were also evident in the case study. 
Key success principles included (1) a systems perspective (assuming emergence, local adaptation, and nonlinearity); (2) the framing of research as a creative enterprise with human experience at its core; and (3) an emphasis on process (the framing of the program, the nature of relationships, and governance and facilitation arrangements, especially the style of leadership and how conflict is managed). In both the literature review and the case study, co-creation "failures" could often be tracked back to abandoning (or never adopting) these principles. All co-creation models made strong claims for significant and sustainable societal impacts as a result of the adaptive and developmental research process; these were illustrated in the case study. Co-creation models have high potential for societal impact but depend critically on key success principles. To capture the nonlinear chains of causation in the co-creation pathway, impact metrics must reflect the dynamic nature and complex interdependencies of health research systems and address processes as well as outcomes. © 2016 Milbank Memorial Fund.

  9. Achieving Research Impact Through Co‐creation in Community‐Based Health Services: Literature Review and Case Study

    PubMed Central

    JACKSON, CLAIRE; SHAW, SARA; JANAMIAN, TINA

    2016-01-01

    Policy Points: Co‐creation—collaborative knowledge generation by academics working alongside other stakeholders—is an increasingly popular approach to aligning research and service development.It has potential for “moving beyond the ivory towers” to deliver significant societal impact via dynamic, locally adaptive community‐academic partnerships.Principles of successful co‐creation include a systems perspective, a creative approach to research focused on improving human experience, and careful attention to governance and process.If these principles are not followed, co‐creation efforts may fail. Context Co‐creation—collaborative knowledge generation by academics working alongside other stakeholders—reflects a “Mode 2” relationship (knowledge production rather than knowledge translation) between universities and society. Co‐creation is widely believed to increase research impact. Methods We undertook a narrative review of different models of co‐creation relevant to community‐based health services. We contrasted their diverse disciplinary roots and highlighted their common philosophical assumptions, principles of success, and explanations for failures. We applied these to an empirical case study of a community‐based research‐service partnership led by the Centre of Research Excellence in Quality and Safety in Integrated Primary‐Secondary Care at the University of Queensland, Australia. Findings Co‐creation emerged independently in several fields, including business studies (“value co‐creation”), design science (“experience‐based co‐design”), computer science (“technology co‐design”), and community development (“participatory research”). These diverse models share some common features, which were also evident in the case study. 
Key success principles included (1) a systems perspective (assuming emergence, local adaptation, and nonlinearity); (2) the framing of research as a creative enterprise with human experience at its core; and (3) an emphasis on process (the framing of the program, the nature of relationships, and governance and facilitation arrangements, especially the style of leadership and how conflict is managed). In both the literature review and the case study, co‐creation “failures” could often be tracked back to abandoning (or never adopting) these principles. All co‐creation models made strong claims for significant and sustainable societal impacts as a result of the adaptive and developmental research process; these were illustrated in the case study. Conclusions Co‐creation models have high potential for societal impact but depend critically on key success principles. To capture the nonlinear chains of causation in the co‐creation pathway, impact metrics must reflect the dynamic nature and complex interdependencies of health research systems and address processes as well as outcomes. PMID:27265562

  10. Formulating "Principles of Procedure" for the Foreign Language Classroom: A Framework for Process Model Language Curricula

    ERIC Educational Resources Information Center

    Villacañas de Castro, Luis S.

    2016-01-01

    This article aims to apply Stenhouse's process model of curriculum to foreign language (FL) education, a model which is characterized by enacting "principles of procedure" which are specific to the discipline which the school subject belongs to. Rather than to replace or dissolve current approaches to FL teaching and curriculum…

  11. How quantitative measures unravel design principles in multi-stage phosphorylation cascades.

    PubMed

    Frey, Simone; Millat, Thomas; Hohmann, Stefan; Wolkenhauer, Olaf

    2008-09-07

    We investigate design principles of linear multi-stage phosphorylation cascades by using quantitative measures for signaling time, signal duration and signal amplitude. We compare alternative pathway structures by varying the number of phosphorylations and the length of the cascade. We show that a model for a weakly activated pathway does not reflect the biological context well, unless it is restricted to certain parameter combinations. Focusing therefore on a more general model, we compare alternative structures with respect to a multivariate optimization criterion. We test the hypothesis that the structure of a linear multi-stage phosphorylation cascade is the result of an optimization process aiming for a fast response, defined by the minimum of the product of signaling time and signal duration. It is then shown that certain pathway structures minimize this criterion. Several popular models of MAPK cascades form the basis of our study. These models represent different levels of approximation, which we compare and discuss with respect to the quantitative measures.
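    The quantitative measures named here are usually defined Heinrich-style: signaling time as the temporal mean of the signal time course, signal duration as its spread, and signal amplitude as the signal area divided by twice the duration. A hedged sketch with an assumed single-exponential input and explicit Euler integration (illustrative parameters, not the paper's MAPK models):

    ```python
    def cascade_measures(betas, alpha=1.0, dt=0.001, t_end=200.0):
        """Integrate a linear cascade x_i' = alpha*x_{i-1} - beta_i*x_i
        driven by an input pulse u(t) = exp(-t), then compute
        Heinrich-style measures for the last stage:
        tau = signaling time, theta = signal duration, and amplitude."""
        n = len(betas)
        x = [0.0] * n
        i0 = i1 = i2 = 0.0       # zeroth/first/second moments of x_n(t)
        u, t = 1.0, 0.0
        while t < t_end:
            out = x[-1]
            i0 += out * dt
            i1 += t * out * dt
            i2 += t * t * out * dt
            new, prev = [], u     # Euler step using old values
            for xi, b in zip(x, betas):
                new.append(xi + (alpha * prev - b * xi) * dt)
                prev = xi
            x = new
            u -= u * dt
            t += dt
        tau = i1 / i0
        theta = (i2 / i0 - tau * tau) ** 0.5
        return tau, theta, i0 / (2.0 * theta)

    tau1, theta1, amp1 = cascade_measures([1.0])        # one stage
    tau2, theta2, amp2 = cascade_measures([1.0, 1.0])   # two stages
    # adding stages delays the signal: tau2 > tau1
    ```

    For this all-unit-rate case the signaling time is analytic (tau is the sum of the mean lifetimes of input and stages, here 2 for one stage and 3 for two), which makes the sketch easy to check against the numerics.
    
    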

  12. How extrusion shapes food processing

    USDA-ARS?s Scientific Manuscript database

    This month's column will explore food extrusion. Extrusion is one of the most commonly used food manufacturing processes. Its versatility enables production of a diverse array of food products. This column will review the basic principles and provide an overview of applications. I would like to ...

  13. Applying the Principles of Lean Production to Gastrointestinal Biopsy Handling: From the Factory Floor to the Anatomic Pathology Laboratory.

    PubMed

    Sugianto, Jessica Z; Stewart, Brian; Ambruzs, Josephine M; Arista, Amanda; Park, Jason Y; Cope-Yokoyama, Sandy; Luu, Hung S

    2015-01-01

    To implement Lean principles to accommodate expanding volumes of gastrointestinal biopsies and to improve laboratory processes overall. Our continuous improvement (kaizen) project analyzed the current state for gastrointestinal biopsy handling using value-stream mapping for specimens obtained at a 487-bed tertiary care pediatric hospital in Dallas, Texas. We identified non-value-added time within the workflow process, from receipt of the specimen in the histology laboratory to the delivery of slides and paperwork to the pathologist. To eliminate non-value-added steps, we implemented the changes depicted in a revised-state value-stream map. Current-state value-stream mapping identified a total specimen processing time of 507 minutes, of which 358 minutes were non-value-added. This translated to a process cycle efficiency of 29%. Implementation of a revised-state value stream resulted in a total process time reduction to 238 minutes, of which 89 minutes were non-value-added, and an improved process cycle efficiency of 63%. Lean production principles of continuous improvement and waste elimination can be successfully implemented within the clinical laboratory.
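    The process cycle efficiency figures quoted in this record follow directly from the Lean definition (value-added time divided by total turnaround time):

    ```python
    def process_cycle_efficiency(total_min, non_value_added_min):
        """Lean metric: fraction of total turnaround spent on
        value-added work."""
        value_added = total_min - non_value_added_min
        return value_added / total_min

    before = process_cycle_efficiency(507, 358)   # current state, as reported
    after = process_cycle_efficiency(238, 89)     # revised state, as reported
    print(f"{before:.0%} -> {after:.0%}")
    ```

    Note that the value-added time is 149 minutes in both states; the improvement comes entirely from removing non-value-added waiting, which is the point of the value-stream mapping exercise.
    
    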

  14. Bioprocess systems engineering: transferring traditional process engineering principles to industrial biotechnology.

    PubMed

    Koutinas, Michalis; Kiparissides, Alexandros; Pistikopoulos, Efstratios N; Mantalaris, Athanasios

    2012-01-01

    The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance of developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.

  15. Bioprocess systems engineering: transferring traditional process engineering principles to industrial biotechnology

    PubMed Central

    Koutinas, Michalis; Kiparissides, Alexandros; Pistikopoulos, Efstratios N.; Mantalaris, Athanasios

    2013-01-01

    The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance of developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals. PMID:24688682

  16. Study unique artistic lopburi province for design brass tea set of bantahkrayang community

    NASA Astrophysics Data System (ADS)

    Pliansiri, V.; Seviset, S.

    2017-07-01

    The objectives of this study were as follows: 1) to study the production process of the handcrafted Brass Tea Set; and 2) to design and develop the handcrafted Brass Tea Set. The design process started with a joint analysis combining a conceptual framework for product design, Quality Function Deployment, the Theory of Inventive Problem Solving, Principles of Craft Design, and the Principle of Reverse Engineering. Experts in the fields of both Industrial Product Design and Brass Handicraft Products evaluated the Brass Tea Set design, and the resulting prototype was assessed by a sample of consumers who had previously bought Brass Tea Sets from the Bantahkrayang Community. The statistical methods used were percentage, mean (X̄), and standard deviation (S.D.). Consumer satisfaction with the handcrafted Brass Tea Set was assessed and found to be at a high level.

  17. Reverse engineering of the homogeneous-entity product profiles based on CCD

    NASA Astrophysics Data System (ADS)

    Gan, Yong; Zhong, Jingru; Sun, Ning; Sun, Aoran

    2011-08-01

    This measurement system uses a layered ("delaminated") measurement principle to obtain values of an entity in three perpendicular directions. As the measured entity is immersed in the liquid layer by layer, each layer's image is collected by CCD and digitally processed. The basic measuring principle and the working procedure of the method are introduced. According to Archimedes' law, the buoyancy and submerged volume at each layer depth are measured with an electronic balance, and the corresponding mathematical models are established. By computing each layer's weight and centre of gravity with Artificial Intelligence methods, the 3D coordinates of every minute entity cell in each layer can be reckoned and a 3D contour picture constructed. The experimental results show that any homogeneous entity insoluble in water can be measured; the measurement is fast and non-destructive, and entities with internal holes can also be measured.
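    The volume reconstruction described here rests on Archimedes' law: each immersion step adds buoyant force proportional to the newly submerged volume. A minimal sketch of that step (variable names and the uniform-layer assumption are ours, not the paper's):

    ```python
    def layer_volumes(buoyancy_n, rho=1000.0, g=9.81):
        """Archimedes' law: each immersion step adds buoyant force
        dF = rho*g*dV, so the volume added by layer k is
        dV_k = (F_k - F_{k-1}) / (rho*g).  buoyancy_n holds the
        cumulative buoyant force (N) after each successive layer."""
        vols, prev = [], 0.0
        for f in buoyancy_n:
            vols.append((f - prev) / (rho * g))
            prev = f
        return vols

    def centroid_height(vols, layer_h):
        """Height of the centre of volume, assuming layer k spans
        [k*layer_h, (k+1)*layer_h] and the entity is homogeneous."""
        total = sum(vols)
        moment = sum(v * (k + 0.5) * layer_h for k, v in enumerate(vols))
        return moment / total

    # cumulative buoyancy for three equal 1e-5 m^3 layers (illustrative)
    forces = [0.0981, 0.1962, 0.2943]
    vols = layer_volumes(forces)          # ~1e-5 m^3 per layer
    z = centroid_height(vols, layer_h=0.01)
    ```

    In the actual system these per-layer volumes are cross-registered with the CCD images of each layer to place the volume cells in 3D; the sketch covers only the balance-side arithmetic.
    
    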

  18. A fiber optic sensor for on-line non-touch monitoring of roll shape

    NASA Astrophysics Data System (ADS)

    Guo, Yuan; Qu, Weijian; Yuan, Qi

    2009-07-01

    Based on the principle of the reflective displacement fibre-optic sensor, a high-accuracy, non-contact, on-line optical fibre sensor for detecting roll shape is presented. The principle and composition of the detection system and its operation are also explained. By using a novel probe of three optical fibres at equal transverse spacing, the effects of fluctuations in the light source, changes in the reflectivity of the target surface, and intensity losses in the fibre lines are automatically compensated. Meanwhile, an optical fibre sensor model for correcting static error, based on a BP artificial neural network (ANN), is set up. Interpolation and value filtering are also used to process the signals, effectively reducing the influence of random noise and the vibration of the roll bearing, so the accuracy and resolution are enhanced remarkably. Experiment proves that the resolution is 1 μm and the precision can reach 0.1%, so the system meets the demands of the practical production process.
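    The compensation idea behind such a multi-fibre probe can be illustrated with a two-fibre intensity ratio. The inverse-square response assumed below is purely illustrative (the real sensor's calibration curve, and the exact role of its third fibre, are not specified in this record); the point is that source power and target reflectivity cancel in the ratio:

    ```python
    def gap_from_ratio(i1, i2, delta):
        """Invert the intensity ratio of two receiving fibres offset by
        delta and 2*delta from the emitter, assuming an illustrative
        inverse-square response I_k = S * R / (d + k*delta)**2
        (S = source power, R = target reflectivity, d = probe-to-roll
        gap). The ratio is independent of S and R, which is the essence
        of the compensation scheme."""
        ratio = (i1 / i2) ** 0.5          # = (d + 2*delta) / (d + delta)
        return delta * (2.0 - ratio) / (ratio - 1.0)

    # Source power and reflectivity fluctuate, but cancel in the ratio.
    S, R, d, delta = 3.7, 0.42, 0.8, 0.1  # arbitrary illustrative values
    i1 = S * R / (d + delta) ** 2
    i2 = S * R / (d + 2 * delta) ** 2
    recovered = gap_from_ratio(i1, i2, delta)   # ~0.8 regardless of S, R
    ```

    A third fibre adds a second independent ratio, allowing the system to check consistency or compensate additional disturbances such as fibre-line losses.
    
    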

  19. Optical Fiber On-Line Detection System for Non-Touch Monitoring Roller Shape

    NASA Astrophysics Data System (ADS)

    Guo, Y.; Wang, Y. T.

    2006-10-01

    Based on the principle of the reflective displacement fiber-optic sensor, a high-accuracy non-contact on-line optical fiber measurement system for roller shape is presented. The principle and composition of the detection system and its operation are also explained. By using a novel probe of three optical fibers at equal transverse spacing, the effects of fluctuations in the light source, changes in target-surface reflectivity and intensity losses in the fiber lines are automatically compensated. Meanwhile, an optical fiber sensor model for correcting static error, based on a BP artificial neural network (ANN), is set up. Interpolation and value filtering of the signals also effectively reduce the influence of random noise and of roller-bearing vibration, enhancing accuracy and resolution remarkably. Experiments prove that the accuracy of the system meets the demands of the practical production process, providing a new method for high-speed, accurate and automatic on-line detection of mill roller shape.

  20. The experience factory: Can it make you a 5? or what is its relationship to other quality and improvement concepts?

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1992-01-01

    The concepts of quality improvement have permeated many businesses. It is clear that the nineties will be the quality era for software, and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. Not all software is the same: process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. A variety of organizational frameworks have been proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvement through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long-term success through customer satisfaction based on the participation of all members of an organization; the SEI Capability Maturity Model, a staged process improvement based upon assessment with regard to a set of key process areas until level 5 is reached, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value added' activities and the elimination or reduction of 'non-value-added' activities.

  1. Drug-device combination products in the twenty-first century: epinephrine auto-injector development using human factors engineering.

    PubMed

    Edwards, Eric S; Edwards, Evan T; Simons, F Estelle R; North, Robert

    2015-05-01

    The systematic application of human factors engineering (HFE) principles to the development of drug-device combination products, including epinephrine auto-injectors (EAIs), has the potential to improve the effectiveness and safety of drug administration. A PubMed search was performed to assess the role of HFE in the development of drug-device combination products. The following keywords were used in different combinations: 'human factors engineering,' 'human factors,' 'medical products,' 'epinephrine/adrenaline auto-injector,' 'healthcare' and 'patient safety.' This review provides a summary of HFE principles and their application to the development of drug-device combination products as advised by the US FDA. It also describes the HFE process that was applied to the development of Auvi-Q, a novel EAI, highlighting specific steps that occurred during the product-development program. For drug-device combination products, device labeling and usability are critical and have the potential to impact clinical outcomes. Application of HFE principles to the development of drug-delivery devices has the potential to improve product quality and reliability, reduce risk and improve patient safety when applied early in the development process. Additional clinical and real-world studies will confirm whether the application of HFE has helped to develop an EAI that better meets the needs of patients at risk of anaphylaxis.

  2. A symbolic/subsymbolic interface protocol for cognitive modeling

    PubMed Central

    Simen, Patrick; Polk, Thad

    2009-01-01

    Researchers studying complex cognition have grown increasingly interested in mapping symbolic cognitive architectures onto subsymbolic brain models. Such a mapping seems essential for understanding cognition under all but the most extreme viewpoints (namely, that cognition consists exclusively of digitally implemented rules; or instead, involves no rules whatsoever). Making this mapping reduces to specifying an interface between symbolic and subsymbolic descriptions of brain activity. To that end, we propose parameterization techniques for building cognitive models as programmable, structured, recurrent neural networks. Feedback strength in these models determines whether their components implement classically subsymbolic neural network functions (e.g., pattern recognition), or instead, logical rules and digital memory. These techniques support the implementation of limited production systems. Though inherently sequential and symbolic, these neural production systems can exploit principles of parallel, analog processing from decision-making models in psychology and neuroscience to explain the effects of brain damage on problem solving behavior. PMID:20711520
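    A hedged toy model of the feedback-strength claim (all parameters illustrative, not from the paper): with weak self-feedback a sigmoidal unit relaxes after an input pulse, while strong self-feedback turns the same unit into a bistable digital latch.

```python
import math

def run_unit(w_self, pulse, steps=60):
    """One sigmoidal unit, x <- sigmoid(w_self*x + drive - bias), pulsed early on.
    bias = w_self/2 centres the map so strong self-feedback yields two attractors."""
    x, bias = 0.0, w_self / 2.0
    for t in range(steps):
        drive = pulse if t < 5 else 0.0      # brief input pulse, then silence
        x = 1.0 / (1.0 + math.exp(-(w_self * x + drive - bias)))
    return x

latched = run_unit(10.0, pulse=6.0)   # strong feedback: stays high after the pulse
relaxed = run_unit(1.0, pulse=6.0)    # weak feedback: relaxes back toward 0.5
```

    The same parameterized unit thus behaves as analog pattern processing or as digital memory depending on feedback strength, which is the interface idea the abstract describes.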

  3. Reprint of Design of synthetic microbial communities for biotechnological production processes.

    PubMed

    Jagmann, Nina; Philipp, Bodo

    2014-12-20

    In their natural habitats microorganisms live in multi-species communities, in which the community members exhibit complex metabolic interactions. In contrast, biotechnological production processes catalyzed by microorganisms are usually carried out with single strains in pure cultures. A number of production processes, however, may be more efficiently catalyzed by the concerted action of microbial communities. This review will give an overview of organismic interactions between microbial cells and of biotechnological applications of microbial communities. It focuses on synthetic microbial communities that consist of microorganisms that have been genetically engineered. Design principles for such synthetic communities will be exemplified based on plausible scenarios for biotechnological production processes. These design principles comprise interspecific metabolic interactions via cross-feeding, regulation by interspecific signaling processes via metabolites and autoinducing signal molecules, and spatial structuring of synthetic microbial communities. In particular, the implementation of metabolic interdependencies, of positive feedback regulation and of inducible cell aggregation and biofilm formation will be outlined. Synthetic microbial communities constitute a viable extension of the biotechnological application of metabolically engineered single strains and enlarge the scope of microbial production processes. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Building bridges from process R&D: from a customer-supplier relationship to full partnership.

    PubMed

    Federsel

    2000-08-01

    A new and forward-looking way of running process R&D is introduced that integrates this core business in an efficient manner into the network of activities in different disciplines, which constitute the arena for the development of pharmaceutical products. The interfaces with surrounding areas are discussed in addition to the novel organizational principles implemented in process R&D and the workflow emanating from this. Furthermore, the Tollgate model used to keep track of the progress in a project and the pre-study concept are presented in detail. Finally, the main differences between operating modes in the past and in the future are highlighted.

  5. WHO Expert Committee on Specifications for Pharmaceutical Preparations.

    PubMed

    2014-01-01

    The Expert Committee on Specifications for Pharmaceutical Preparations works towards clear, independent and practical standards and guidelines for the quality assurance of medicines. Standards are developed by the Committee through worldwide consultation and an international consensus-building process. The following new guidelines were adopted and recommended for use, in addition to 20 monographs and general texts for inclusion in The International Pharmacopoeia and 11 new International Chemical Reference Substances. The International Pharmacopoeia--updating mechanism for the section on radiopharmaceuticals; WHO good manufacturing practices for pharmaceutical products: main principles; Model quality assurance system for procurement agencies; Assessment tool based on the model quality assurance system for procurement agencies: aide-memoire for inspection; Guidelines on submission of documentation for prequalification of finished pharmaceutical products approved by stringent regulatory authorities; and Guidelines on submission of documentation for a multisource (generic) finished pharmaceutical product: quality part.

  6. Layout design-based research on optimization and assessment method for shipbuilding workshop

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Meng, Mei; Liu, Shuang

    2013-06-01

    This study examines a three-dimensional visualization program, emphasizing improvement of genetic algorithms through the optimization of the layout design of a standard, discrete shipbuilding workshop. Taking a steel processing workshop as an example, the principle of minimum logistics cost is applied to obtain an idealized equipment layout and a mathematical model whose objective is to minimize the total distance traveled between machines. An improved control operator is implemented to improve the iterative efficiency of the genetic algorithm and yield the relevant parameters. The Computer Aided Tri-Dimensional Interface Application (CATIA) software is applied to establish the manufacturing resource base and a parametric model of the steel processing workshop. Based on the results of the optimized planar logistics, a visual parametric model of the steel processing workshop is constructed, and qualitative and quantitative adjustments are then applied to the model. A method for evaluating the layout results is subsequently established using AHP. The optimized discrete production workshop has practical significance as a reference for the optimization and layout of digitalized production workshops.
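    A minimal sketch of this kind of genetic-algorithm layout search (the flow matrix, slot coordinates and GA settings are all invented for illustration; the paper's improved control operator is not reproduced):

```python
import random

# Toy problem: assign 3 machines to 3 slots on a line so that total
# flow-weighted travel distance (the logistics cost) is minimised.
FLOW = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]   # material flow between machines (assumed)
POS = [0.0, 1.0, 2.0]                       # slot coordinates along the line (assumed)

def cost(perm):
    """perm[i] = slot of machine i; cost = sum of flow x slot distance."""
    return sum(FLOW[i][j] * abs(POS[perm[i]] - POS[perm[j]])
               for i in range(len(perm)) for j in range(len(perm)))

def evolve(n=3, pop_size=20, gens=50, seed=0):
    rng = random.Random(seed)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        pop = pop[:pop_size // 2]                 # elitist truncation
        while len(pop) < pop_size:
            child = list(rng.choice(pop[:5]))     # copy a good parent
            a, b = rng.sample(range(n), 2)        # swap mutation keeps a valid layout
            child[a], child[b] = child[b], child[a]
            pop.append(child)
    return min(pop, key=cost)

best = evolve()
```

    On this toy instance the optimum places the machine with the heaviest flows in the middle slot, giving a total cost of 14.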

  7. Considerations for setting the specifications of vaccines.

    PubMed

    Minor, Philip

    2012-05-01

    The specifications of vaccines are determined by the particular product and its method of manufacture, which raise issues unique to the vaccine in question. However, the general principles are shared, including the need to have sufficient active material to immunize a very high proportion of recipients, an acceptable level of safety, which may require specific testing or may come from the production process, and an acceptable low level of contamination with unwanted materials, which may include infectious agents or materials used in production. These principles apply to the earliest smallpox vaccines and the most recent recombinant vaccines, such as those against HPV. Manufacturing development includes more precise definitions of the product through improved tests and tighter control of the process parameters. Good manufacturing practice plays a major role, which is likely to increase in importance in assuring product quality almost independent of end-product specifications.

  8. Interactive natural language acquisition in a multi-modal recurrent neural architecture

    NASA Astrophysics Data System (ADS)

    Heinrich, Stefan; Wermter, Stefan

    2018-01-01

    For the complex human brain that enables us to communicate in natural language, a good understanding has been gathered of the principles underlying language acquisition and processing, of the sociocultural conditions involved, and of activity patterns in the brain. However, the behavioural and mechanistic characteristics of natural language, and how mechanisms in the brain allow language to be acquired and processed, are not yet understood. In bridging the insights from behavioural psychology and neuroscience, the goal of this paper is to contribute a computational understanding of the characteristics that favour language acquisition. Accordingly, we provide concepts and refinements in cognitive modelling regarding principles and mechanisms in the brain, and propose a neurocognitively plausible model for embodied language acquisition from real-world interaction of a humanoid robot with its environment. In particular, the architecture consists of a continuous-time recurrent neural network in which parts have different leakage characteristics and thus operate on multiple timescales for every modality, together with the association of the higher-level nodes of all modalities into cell assemblies. The model is capable of learning language production grounded in both temporal dynamic somatosensation and vision, and features hierarchical concept abstraction, concept decomposition, multi-modal integration, and self-organisation of latent representations.
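    The multiple-timescale leakage idea can be sketched as a discrete leaky integrator, u_t = (1 − 1/τ)·u_{t−1} + (1/τ)·x_t, where a larger time constant τ means slower, longer-memory dynamics (an illustrative form, not the paper's exact equations):

```python
# Hedged sketch: two leakage settings of the same leaky-integrator update.
def leaky_trace(inputs, tau):
    """Discrete leaky integration; tau=1 tracks the input, large tau averages it."""
    u, out = 0.0, []
    for x in inputs:
        u = (1.0 - 1.0 / tau) * u + (1.0 / tau) * x
        out.append(u)
    return out

fast = leaky_trace([1.0] * 10, tau=1.0)    # fast units: follow the input step
slow = leaky_trace([1.0] * 10, tau=10.0)   # slow units: drift toward it gradually
```

    Mixing such fast and slow units in one recurrent network is what lets different parts of the architecture specialize on different temporal scales of a modality.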

  9. The influence of organic production on food quality - research findings, gaps and future challenges.

    PubMed

    Załęcka, Aneta; Bügel, Susanne; Paoletti, Flavio; Kahl, Johannes; Bonanno, Adriana; Dostalova, Anne; Rahmann, Gerold

    2014-10-01

    Although several meta-analysis studies have been published comparing the quality of food derived from organic and non-organic origin, it is still not clear if food from organic production per se can guarantee product-related added value to consumers. This paper aims to summarize the status quo in order to identify research gaps and suggest future research challenges. Organic food is described according to a quality model already published. The influence of organic production on food quality is structured in primary production and processing. Furthermore, organic food authentication is discussed. Organic food seems to contain fewer pesticide residues and statistically more selected health-related compounds such as polyphenols in plant products and polyunsaturated fatty acids in milk and meat products, but the health relevance for consumers is not clear yet. Comparing food from organic origin with so called 'conventional' food seems not to be appropriate, because 'conventional' is not defined. In organic food quality research a system approach is needed from which systemic markers can be selected. Research on the impact of processing technologies on the quality according to organic principles seems of high relevance, since most of the food is processed. © 2014 Society of Chemical Industry.

  10. On-line identification of fermentation processes for ethanol production.

    PubMed

    Câmara, M M; Soares, R M; Feital, T; Naomi, P; Oki, S; Thevelein, J M; Amaral, M; Pinto, J C

    2017-07-01

    A strategy for monitoring fermentation processes, specifically the simultaneous saccharification and fermentation (SSF) of corn mash, was developed. The strategy covered the development and use of a first-principles, semimechanistic, unstructured process model based on the major kinetic phenomena, along with mass and energy balances. The model was then used as a reference model within an identification procedure capable of running on-line. The on-line identification procedure consists of updating the reference model through the estimation of corrective parameters for certain reaction rates using the most recent process measurements. The strategy makes use of standard laboratory measurements for sugar quantification and of in situ temperature and liquid-level data. The model, along with the on-line identification procedure, has been tested against real industrial data and has been able to accurately predict the main variables of operational interest, i.e., the state variables and their dynamics, and key process indicators. The results demonstrate that the strategy is capable of monitoring, in real time, this complex industrial biomass fermentation. This new tool provides great support for decision-making and opens a new range of opportunities for industrial optimization.
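    A hedged sketch of the corrective-parameter step (the paper's estimator is not specified in the abstract; here a single multiplicative correction on a reaction rate is fitted to recent measurements by closed-form least squares):

```python
# Illustrative only: find theta so that theta * r_model best matches the
# observed reaction rates over the latest measurement window.
def fit_correction(predicted_rates, observed_rates):
    """theta minimising sum (theta*pred - obs)^2; closed-form solution."""
    num = sum(p * o for p, o in zip(predicted_rates, observed_rates))
    den = sum(p * p for p in predicted_rates)
    return num / den
```

    The reference model is then rerun with the corrected rate, so the estimate stays anchored to first principles while tracking the most recent plant data.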

  11. Statistical mechanical theory for steady state systems. VI. Variational principles

    NASA Astrophysics Data System (ADS)

    Attard, Phil

    2006-12-01

    Several variational principles that have been proposed for nonequilibrium systems are analyzed. These include the principle of minimum rate of entropy production due to Prigogine [Introduction to Thermodynamics of Irreversible Processes (Interscience, New York, 1967)], the principle of maximum rate of entropy production, which is common on the internet and in the natural sciences, two principles of minimum dissipation due to Onsager [Phys. Rev. 37, 405 (1931)] and to Onsager and Machlup [Phys. Rev. 91, 1505 (1953)], and the principle of maximum second entropy due to Attard [J. Chem. Phys. 122, 154101 (2005); Phys. Chem. Chem. Phys. 8, 3585 (2006)]. The approaches of Onsager and Attard are argued to be the only viable theories. These two are related, although their physical interpretation and mathematical approximations differ. A numerical comparison with computer simulation results indicates that Attard's expression is the only accurate theory. The implications for the Langevin and other stochastic differential equations are discussed.
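    For orientation, the quantities these variational principles act on can be written in the standard linear-irreversible-thermodynamics form (textbook notation, not reproduced from the paper): the entropy production rate is bilinear in the thermodynamic fluxes and forces, with linear constitutive laws and symmetric Onsager coefficients.

```latex
% Standard linear-regime relations (textbook forms, assumed background):
\dot{S} \;=\; \sum_i J_i X_i, \qquad
J_i \;=\; \sum_j L_{ij} X_j, \qquad
L_{ij} \;=\; L_{ji}.
```

    The competing principles in the abstract differ in which functional of the J_i and X_i is extremized and under which constraints.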

  12. Green extraction of natural products: concept and principles.

    PubMed

    Chemat, Farid; Vian, Maryline Abert; Cravotto, Giancarlo

    2012-01-01

    The design of green and sustainable extraction methods for natural products is currently a hot research topic in the multidisciplinary area of applied chemistry, biology and technology. Herein we introduce the six principles of green extraction, describing a multifaceted strategy to apply this concept at the research and industrial levels. The mainstays of this working protocol are new and innovative technologies, process intensification, agro-solvents and energy saving. The concept, principles and examples of green extraction discussed here offer an updated glimpse of the huge technological effort that is being made and the diverse applications that are being developed.

  13. Soft-sensing model of temperature for aluminum reduction cell on improved twin support vector regression

    NASA Astrophysics Data System (ADS)

    Li, Tao

    2018-06-01

    The complexity of the aluminum electrolysis process makes the temperature of aluminum reduction cells hard to measure directly, yet temperature is central to the control of aluminum production. To solve this problem, drawing on practice data from an aluminum plant, this paper presents a soft-sensing model of temperature for the aluminum electrolysis process based on Improved Twin Support Vector Regression (ITSVR). ITSVR overcomes the slow learning speed of Support Vector Regression (SVR) and the over-fitting risk of Twin Support Vector Regression (TSVR) by introducing a regularization term into the objective function of TSVR, which enforces the structural risk minimization principle at lower computational complexity. Finally, with other process parameters as auxiliary variables, the model predicts the temperature by ITSVR. Simulation results show that the ITSVR-based soft-sensing model is less time-consuming and generalizes better.
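    The regularization idea can be illustrated in a scalar setting (this is not ITSVR itself, which solves two twin quadratic programs; it only shows how an added penalty term shrinks the fitted parameter and thereby bounds model complexity):

```python
# Illustrative only: least squares with an added regularisation term,
# minimising sum (w*x - y)^2 + c*w^2 over a scalar weight w (closed form).
def ridge_fit(xs, ys, c):
    """c = 0 gives plain least squares; larger c shrinks w toward zero."""
    num = sum(x * y for x, y in zip(xs, ys))
    den = sum(x * x for x in xs) + c
    return num / den
```

    The same structural-risk trade-off, a data-fit term plus a norm penalty, is what the added term in the TSVR objective provides.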

  14. How eco-evolutionary principles can guide tree breeding and tree biotechnology for enhanced productivity.

    PubMed

    Franklin, Oskar; Palmroth, Sari; Näsholm, Torgny

    2014-11-01

    Tree breeding and biotechnology can enhance forest productivity and help alleviate the rising pressure on forests from climate change and human exploitation. While many physiological processes and genes are targeted in search of genetically improved tree productivity, an overarching principle to guide this search is missing. Here, we propose a method to identify the traits that can be modified to enhance productivity, based on the differences between trees shaped by natural selection and 'improved' trees with traits optimized for productivity. We developed a tractable model of plant growth and survival to explore such potential modifications under a range of environmental conditions, from non-water limited to severely drought-limited sites. We show how key traits are controlled by a trade-off between productivity and survival, and that productivity can be increased at the expense of long-term survival by reducing isohydric behavior (stomatal regulation of leaf water potential) and allocation to defense against pests compared with native trees. In contrast, at dry sites occupied by naturally drought-resistant trees, the model suggests a better strategy may be to select trees with slightly lower wood density than the native trees and to augment isohydric behavior and allocation to defense. Thus, which traits to modify, and in which direction, depend on the original tree species or genotype, the growth environment and wood-quality versus volume production preferences. In contrast to this need for customization of drought and pest resistances, consistent large gains in productivity for all genotypes can be obtained if root traits can be altered to reduce competition for water and nutrients. Our approach illustrates the potential of using eco-evolutionary theory and modeling to guide plant breeding and genetic technology in selecting target traits in the quest for higher forest productivity. © The Author 2014. Published by Oxford University Press. 
All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Simultaneous Clostridial fermentation, lipase-catalyzed esterification, and ester extraction to enrich diesel with butyl butyrate.

    PubMed

    van den Berg, Corjan; Heeres, Arjan S; van der Wielen, Luuk A M; Straathof, Adrie J J

    2013-01-01

    The recovery of 1-butanol from fermentation broth is energy-intensive since typical concentrations in fermentation broth are below 20 g L(-1). To prevent butanol inhibition and high downstream processing costs, we aimed at producing butyl esters instead of 1-butanol. It is shown that it is possible to perform simultaneously clostridial fermentation, esterification of the formed butanol to butyl butyrate, and extraction of this ester by hexadecane. The very high partition coefficient of butyl butyrate pulls the esterification towards the product side even at fermentation pH and relatively low butanol concentrations. The hexadecane extractant is a model diesel compound and is nontoxic to the cells. If butyl butyrate enriched diesel can directly be used as car fuel, no product recovery is required. A proof-of-principle experiment for the one-pot bio-ester production from glucose led to 5 g L(-1) butyl butyrate in the hexadecane phase. The principle may be extended to a wide range of esters, especially to longer chain ones. Copyright © 2012 Wiley Periodicals, Inc.

  16. Towards quality by design in pharmaceutical manufacturing: modelling and control of air jet mills

    NASA Astrophysics Data System (ADS)

    Bhonsale, Satyajeet; Telen, Dries; Stokbroekx, Bard; Van Impe, Jan

    2017-06-01

    Milling is an important step in pharmaceutical manufacturing as it not only determines the final formulation of the drug product, but also influences the bioavailability and dissolution rate of the active pharmaceutical ingredient (API). In this respect, the air jet mill (AJM) is most commonly used in the pharmaceutical industry as it is a non-contaminating, non-degrading, self-classifying process capable of delivering narrow particle size distributions (PSD). Keeping the principles of Quality by Design in mind, the Critical Process Parameters (CPPs) of the AJM have been identified as the pressures at the grinding nozzles and the feed rate, which affect the PSD, surface charge and morphology of the product (i.e. the Critical Material Attributes (CMAs)). For the purpose of this research, the PSD is considered the only relevant CMA. A population balance based model is proposed to simulate the dynamic milling operation by utilizing the concept of breakage functions. This model agrees qualitatively with experimental observations of the air jet mill unit at Janssen Pharmaceutica, but further steps for model validation need to be carried out.
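    A hedged sketch of a discretised breakage population balance of the kind the abstract describes (size classes, selection rates and breakage fractions are all invented for illustration; the paper's specific breakage functions are not reproduced):

```python
# dn_i/dt = -S_i*n_i + sum_{j<i} b[j][i]*S_j*n_j, stepped with explicit Euler.
# Size classes run coarse (0) to fine; S[i] is the selection (breakage) rate of
# class i and b[j][i] the fraction of broken class-j material landing in class i.
def pbe_step(n, S, b, dt):
    out = []
    for i in range(len(n)):
        birth = sum(b[j][i] * S[j] * n[j] for j in range(i))
        out.append(n[i] + dt * (birth - S[i] * n[i]))
    return out

n0 = [1.0, 0.0, 0.0]                     # all material starts in the coarsest class
S = [1.0, 0.5, 0.0]                      # the finest class does not break further
b = [[0.0, 0.6, 0.4], [0.0, 0.0, 1.0], [0.0, 0.0, 0.0]]
n1 = pbe_step(n0, S, b, dt=0.1)
```

    Because each row of b sums to one for breakable classes, total material is conserved while the distribution shifts toward the finer classes, which is how breakage functions encode the milling dynamics.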

  17. A Comprehensive Community Nursing Center Model: Maximizing Practice Income--A Challenge to Educators.

    ERIC Educational Resources Information Center

    Walker, Patricia Hinton

    1994-01-01

    The University of Rochester's community nursing center is an entrepreneurial model for faculty practice based on sound business principles to enhance financial success. These principles include development and pricing of the product of nursing services, consumer dialogue instead of advertising monologue, and a diversified income base. (SK)

  18. 15 CFR 295.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... in accordance with applicable Federal cost principles. (e) The term foreign-owned company means a... allowability of indirect costs in accordance with applicable Federal cost principles. (i) The term industry-led..., marketing, or distribution of any product, process, or service that is not reasonably required to conduct...

  19. Development of dynamic Bayesian models for web application test management

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool for modeling complex stochastic dynamic processes. According to the results of the research, the mathematical models and methods of dynamic Bayesian networks provide high coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model organizes the logical connections between individual test assets across multiple time slices. This approach presents testing as a discrete process with defined structural components responsible for the generation of test assets. Dynamic Bayesian network-based models make it possible to combine, in one management area, individual units and testing components with different functionalities that directly influence each other during comprehensive testing for various groups of software bugs. The proposed models support a consistent approach to formalizing test principles and procedures, methods for treating situational error signs, and methods for producing analytical conclusions from test results.
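    A two-state, two-slice toy version of such a dynamic Bayesian model (transition and observation probabilities are invented for illustration): the belief about a hidden "module is buggy" state is propagated one time slice forward and then reweighted by the latest test outcome.

```python
# States: 0 = healthy, 1 = buggy. Observations: 0 = test passes, 1 = test fails.
T = [[0.9, 0.1], [0.2, 0.8]]      # P(state_t | state_{t-1}), assumed numbers
E = [[0.95, 0.05], [0.3, 0.7]]    # P(observation | state), assumed numbers

def filter_step(belief, observation):
    """One time slice: propagate belief through T, weight by evidence E, renormalise."""
    pred = [sum(belief[s] * T[s][s2] for s in range(2)) for s2 in range(2)]
    upd = [pred[s2] * E[s2][observation] for s2 in range(2)]
    z = sum(upd)
    return [p / z for p in upd]
```

    Repeating this step over successive test runs links the assets of all time slices, which is the logical connection across slices the text describes.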

  20. Promoting Usability in Organizations with a New Health Usability Model: Implications for Nursing Informatics

    PubMed Central

    Staggers, Nancy; Rodney, Melanie

    2012-01-01

    Usability issues with products such as Electronic Health Records (EHRs) are of global interest to nursing informaticists. Although improvements in patient safety, clinical productivity and effectiveness are possible when usability principles and practices are in place, most organizations do not embrace usability. This paper presents a new Health Usability Maturity Model consisting of 5 phases: unrecognized, preliminary, implemented, integrated and strategic. Within each level various aspects are discussed including focus on users, management, education, resources, processes and infrastructure. Nurse informaticists may use this new model as a guide for assessing their organization’s level of usability and transitioning to the next level. Using tactics outlined here, nurse informaticists may also serve as catalysts for change and lead efforts to improve the user experience in organizations across industry, academe and healthcare settings. PMID:24199128

  1. A Conceptual Framework and Principles for Trusted Pervasive Health

    PubMed Central

    Blobel, Bernd Gerhard; Seppälä, Antto Veikko; Sorvari, Hannu Olavi; Nykänen, Pirkko Anneli

    2012-01-01

    Background Ubiquitous computing technology, sensor networks, wireless communication and the latest developments of the Internet have enabled the rise of a new concept—pervasive health—which takes place in an open, unsecure, and highly dynamic environment (ie, in the information space). To be successful, pervasive health requires implementable principles for privacy and trustworthiness. Objective This research has two interconnected objectives. The first is to define pervasive health as a system and to understand its trust and privacy challenges. The second goal is to build a conceptual model for pervasive health and use it to develop principles and policies which can make pervasive health trustworthy. Methods In this study, a five-step system analysis method is used. Pervasive health is defined using a metaphor of digital bubbles. A conceptual framework model focused on trustworthiness and privacy is then developed for pervasive health. On that model, principles and rules for trusted information management in pervasive health are defined. Results In the first phase of this study, a new definition of pervasive health was created. Using this model, differences between pervasive health and health care are stated. Reviewed publications demonstrate that the widely used principles of predefined and static trust cannot guarantee trustworthiness and privacy in pervasive health. Instead, such an environment requires personal dynamic and context-aware policies, awareness, and transparency. A conceptual framework model focused on information processing in pervasive health is developed. Using features of pervasive health and relations from the framework model, new principles for trusted pervasive health have been developed. The principles propose that personal health data should be under control of the data subject. The person shall have the right to verify the level of trust of any system which collects or processes his or her health information. 
Principles require that any stakeholder or system collecting or processing health data must support transparency and shall publish its trust and privacy attributes and even its domain specific policies. Conclusions The developed principles enable trustworthiness and guarantee privacy in pervasive health. The implementation of principles requires new infrastructural services such as trust verification and policy conflict resolution. After implementation, the accuracy and usability of principles should be analyzed. PMID:22481297

  2. A conceptual framework and principles for trusted pervasive health.

    PubMed

    Ruotsalainen, Pekka Sakari; Blobel, Bernd Gerhard; Seppälä, Antto Veikko; Sorvari, Hannu Olavi; Nykänen, Pirkko Anneli

    2012-04-06

    Ubiquitous computing technology, sensor networks, wireless communication and the latest developments of the Internet have enabled the rise of a new concept, pervasive health, which takes place in an open, unsecure, and highly dynamic environment (ie, in the information space). To be successful, pervasive health requires implementable principles for privacy and trustworthiness. This research has two interconnected objectives. The first is to define pervasive health as a system and to understand its trust and privacy challenges. The second goal is to build a conceptual model for pervasive health and use it to develop principles and policies which can make pervasive health trustworthy. In this study, a five-step system analysis method is used. Pervasive health is defined using a metaphor of digital bubbles. A conceptual framework model focused on trustworthiness and privacy is then developed for pervasive health. On that model, principles and rules for trusted information management in pervasive health are defined. In the first phase of this study, a new definition of pervasive health was created. Using this model, differences between pervasive health and health care are stated. Reviewed publications demonstrate that the widely used principles of predefined and static trust cannot guarantee trustworthiness and privacy in pervasive health. Instead, such an environment requires personal dynamic and context-aware policies, awareness, and transparency. A conceptual framework model focused on information processing in pervasive health is developed. Using features of pervasive health and relations from the framework model, new principles for trusted pervasive health have been developed. The principles propose that personal health data should be under control of the data subject. The person shall have the right to verify the level of trust of any system which collects or processes his or her health information.
Principles require that any stakeholder or system collecting or processing health data must support transparency and shall publish its trust and privacy attributes and even its domain specific policies. The developed principles enable trustworthiness and guarantee privacy in pervasive health. The implementation of principles requires new infrastructural services such as trust verification and policy conflict resolution. After implementation, the accuracy and usability of principles should be analyzed.

  3. Production of photocurrent due to intermediate-to-conduction-band transitions: a demonstration of a key operating principle of the intermediate-band solar cell.

    PubMed

    Martí, A; Antolín, E; Stanley, C R; Farmer, C D; López, N; Díaz, P; Cánovas, E; Linares, P G; Luque, A

    2006-12-15

    We present intermediate-band solar cells manufactured using quantum dot technology that show for the first time the production of photocurrent when two sub-band-gap energy photons are absorbed simultaneously. One photon produces an optical transition from the intermediate band to the conduction band while the second pumps an electron from the valence band to the intermediate band. The detection of this two-photon absorption process is essential to verify the principles of operation of the intermediate-band solar cell. The phenomenon is the cornerstone physical principle that ultimately allows the production of photocurrent in a solar cell by below-band-gap photon absorption, without degradation of its output voltage.

  4. Connectionism, parallel constraint satisfaction processes, and gestalt principles: (re) introducing cognitive dynamics to social psychology.

    PubMed

    Read, S J; Vanman, E J; Miller, L C

    1997-01-01

    We argue that recent work in connectionist modeling, in particular the parallel constraint satisfaction processes that are central to many of these models, has great importance for understanding issues of both historical and current concern for social psychologists. We first provide a brief description of connectionist modeling, with particular emphasis on parallel constraint satisfaction processes. Second, we examine the tremendous similarities between parallel constraint satisfaction processes and the Gestalt principles that were the foundation for much of modern social psychology. We propose that parallel constraint satisfaction processes provide a computational implementation of the principles of Gestalt psychology that were central to the work of such seminal social psychologists as Asch, Festinger, Heider, and Lewin. Third, we describe how parallel constraint satisfaction processes have been applied to three areas that were key to the beginnings of modern social psychology and remain central today: impression formation and causal reasoning, cognitive consistency (balance and cognitive dissonance), and goal-directed behavior. We conclude by discussing implications of parallel constraint satisfaction principles for a number of broader issues in social psychology, such as the dynamics of social thought and the integration of social information within the narrow time frame of social interaction.

  5. Design of clinical trials for therapeutic cancer vaccines development.

    PubMed

    Mackiewicz, Jacek; Mackiewicz, Andrzej

    2009-12-25

    Advances in molecular and cellular biology as well as biotechnology have led to the definition of a group of drugs referred to as medicinal products of advanced technologies. It includes gene therapy products, somatic cell therapeutics and tissue engineering. Therapeutic cancer vaccines, including whole tumor cell vaccines or gene-modified whole cells, belong to the somatic therapeutics and/or gene therapy products category. Drug development is a multistep, complex process. It comprises two phases: preclinical and clinical. Guidelines on preclinical testing of cell-based immunotherapy medicinal products have been defined by regulatory agencies and are available. However, clinical testing of therapeutic cancer vaccines is still under debate. This presents a serious problem, since the clinical efficacy of a number of cancer vaccines has recently been demonstrated, attracting considerable public attention. In general, clinical testing in its current form is very expensive, time consuming and poorly designed, which may lead to overlooking products that are clinically beneficial for patients. Accordingly, regulatory authorities and researchers, including the Cancer Vaccine Clinical Trial Working Group, have proposed three regulatory solutions to facilitate the clinical development of cancer vaccines: a cost-recovery program, conditional marketing authorization, and a new development paradigm. The paradigm includes a model in which cancer vaccines are investigated in two types of clinical trials: proof-of-principle and efficacy. The proof-of-principle trial objectives are: safety; dose selection and schedule of vaccination; and demonstration of proof-of-principle. Efficacy trials are randomized clinical trials with the objective of demonstrating clinical benefit either directly or through a surrogate. The clinical end points are still under debate.

  6. Application of Hands-On Simulation Games to Improve Classroom Experience

    ERIC Educational Resources Information Center

    Hamzeh, Farook; Theokaris, Christina; Rouhana, Carel; Abbas, Yara

    2017-01-01

    While many construction companies claim substantial productivity and profit gains when applying lean construction principles, it remains a challenge to teach these principles in a classroom. Lean construction emphasises collaborative processes and integrated delivery practices. Consequently, new teaching methods that nurture such values should…

  7. Organic food processing: a framework for concept, starting definitions and evaluation.

    PubMed

    Kahl, Johannes; Alborzi, Farnaz; Beck, Alexander; Bügel, Susanne; Busscher, Nicolaas; Geier, Uwe; Matt, Darja; Meischner, Tabea; Paoletti, Flavio; Pehme, Sirli; Ploeger, Angelika; Rembiałkowska, Ewa; Schmid, Otto; Strassner, Carola; Taupier-Letage, Bruno; Załęcka, Aneta

    2014-10-01

    In 2007 EU Regulation (EC) 834/2007 introduced principles and criteria for organic food processing. These regulations have been analysed and discussed in several scientific publications and research project reports. Recently, organic food quality was described by principles, aspects and criteria. These principles from organic agriculture were verified and adapted for organic food processing. Different levels for evaluation were suggested. In another document, underlying paradigms and consumer perception of organic food were reviewed against functional food, resulting in identifying integral product identity as the underlying paradigm and a holistic quality view connected to naturalness as consumers' perception of organic food quality. In a European study, the quality concept was applied to the organic food chain, revealing a problem: clear principles and related criteria for evaluating processing methods were missing. Therefore the goal of this paper is to describe and discuss the topic of organic food processing to make it operational. A conceptual background for organic food processing is given by verifying the underlying paradigms and principles of organic farming and organic food as well as of organic processing. The proposed definition connects organic processing to related systems such as minimal, sustainable and careful, gentle processing, and describes clear principles and related criteria. Based on food examples, such as milk with different heat treatments, the concept and definitions were verified. Organic processing can be defined by clear paradigms and principles and evaluated according to criteria from a multidimensional approach. Further work has to be done on developing indicators and parameters for assessment of organic food quality. © 2013 Society of Chemical Industry.

  8. Collaborative, Sequential and Isolated Decisions in Design

    NASA Technical Reports Server (NTRS)

    Lewis, Kemper; Mistree, Farrokh

    1997-01-01

    The Massachusetts Institute of Technology (MIT) Commission on Industrial Productivity, in their report Made in America, found that six recurring weaknesses were hampering American manufacturing industries. The two weaknesses most relevant to product development were 1) technological weakness in development and production, and 2) failures in cooperation. The remedies to these weaknesses are considered the essential twin pillars of CE: 1) improved development process, and 2) closer cooperation. In the MIT report, it is recognized that total cooperation among teams in a CE environment is rare in American industry, while the majority of the design research in mathematically modeling CE has assumed total cooperation. In this paper, we present mathematical constructs, based on game theoretic principles, to model degrees of collaboration characterized by approximate cooperation, sequential decision making and isolation. The design of a pressure vessel and a passenger aircraft are included as illustrative examples.

  9. Irreversibility and entropy production in transport phenomena, IV: Symmetry, integrated intermediate processes and separated variational principles for multi-currents

    NASA Astrophysics Data System (ADS)

    Suzuki, Masuo

    2013-10-01

    The mechanism of entropy production in transport phenomena is discussed again by emphasizing the role of symmetry of non-equilibrium states and also by reformulating Einstein's theory of Brownian motion to derive entropy production from it. This yields conceptual reviews of the previous papers [M. Suzuki, Physica A 390 (2011) 1904; 391 (2012) 1074; 392 (2013) 314]. Separated variational principles of steady states for multiple external fields {X_i} and induced currents {J_i} are proposed by extending the principle of minimum integrated entropy production found by the present author for a single external field. The basic strategy of our theory on steady states is to take in all the intermediate processes from the equilibrium state to the final possible steady states in order to study the irreversible physics even in the steady states. As an application of this principle, Glansdorff-Prigogine's evolution criterion inequality (or stability condition) d_X P ≡ ∫dr Σ_i J_i dX_i ≤ 0 is derived in the stronger form dQ_i ≡ ∫dr J_i dX_i ≤ 0 for the individual force X_i and current J_i, even in nonlinear responses which depend on all the external forces {X_k} nonlinearly. This is called the "separated evolution criterion". Some explicit demonstrations of the present general theory on simple electric circuits with multiple external fields are given in order to clarify the physical essence of our new theory and to establish the condition of its validity concerning the existence of the solutions of the simultaneous equations obtained by the separated variational principles. It is also instructive to compare the two results obtained by the new variational theory and by the old scheme based on the instantaneous entropy production. This seems suggestive even for the energy problem in the world.
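    Written out as display equations (using the notation of the abstract), the two inequalities being compared are:

    ```latex
    % Glansdorff-Prigogine evolution criterion (summed over all forces):
    d_X P \;\equiv\; \int \! d\mathbf{r} \, \sum_i J_i \, dX_i \;\le\; 0 .
    % Suzuki's separated evolution criterion (one inequality per force/current pair):
    dQ_i \;\equiv\; \int \! d\mathbf{r} \; J_i \, dX_i \;\le\; 0 \qquad \text{for each } i .
    ```

    The separated form is the stronger statement: summing the per-index inequality over i recovers the Glansdorff-Prigogine criterion.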

  10. WHO Expert Committee on Specifications for Pharmaceutical Preparations.

    PubMed

    2011-01-01

    The Expert Committee on Specifications for Pharmaceutical Preparations works towards clear, independent and practical standards and guidelines for the quality assurance of medicines. Standards are developed by the Committee through worldwide consultation and an international consensus-building process. The following new guidelines were adopted and recommended for use: procedure for adoption of International Chemical Reference Substances; WHO good practices for pharmaceutical microbiology laboratories; good manufacturing practices: main principles for pharmaceutical products; good manufacturing practices for blood establishments (jointly with the Expert Committee on Biological Standardization); guidelines on good manufacturing practices for heating, ventilation and air-conditioning systems for non-sterile pharmaceutical dosage forms; good manufacturing practices for sterile pharmaceutical products; guidelines on transfer of technology in pharmaceutical manufacturing; good pharmacy practice: standards for quality of pharmacy services (joint FIP/WHO); model guidance for the storage and transport of time- and temperature-sensitive pharmaceutical products (jointly with the Expert Committee on Biological Standardization); procedure for prequalification of pharmaceutical products; guide on submission of documentation for prequalification of innovator finished pharmaceutical products approved by stringent regulatory authorities; prequalification of quality control laboratories: procedure for assessing the acceptability, in principle, of quality control laboratories for use by United Nations agencies; guidelines for preparing a laboratory information file; guidelines for drafting a site master file; guidelines on submission of documentation for a multisource (generic) finished product: general format: preparation of product dossiers in common technical document format.

  11. Sustainment in the Army 2020: Using the Army’s Sustainment Principles to Identify and Mitigate Risks Associated with Organizational Change

    DTIC Science & Technology

    2015-06-12

    redesign itself to be better suited to a post Cold War world. In 2008, the Army established the brigade combat team as the primary basic unit of...important to understand models established for the business world and not just those used by the military. Historically, the term logistics as we know...involves every possible phase of the product support process.12 Peter Drucker, a renowned management consultant, argued that logistics is

  12. Growth-coupled overproduction is feasible for almost all metabolites in five major production organisms

    NASA Astrophysics Data System (ADS)

    von Kamp, Axel; Klamt, Steffen

    2017-06-01

    Computational modelling of metabolic networks has become an established procedure in the metabolic engineering of production strains. One key principle that is frequently used to guide the rational design of microbial cell factories is the stoichiometric coupling of growth and product synthesis, which makes production of the desired compound obligatory for growth. Here we show that the coupling of growth and production is feasible under appropriate conditions for almost all metabolites in genome-scale metabolic models of five major production organisms. These organisms comprise eukaryotes and prokaryotes as well as heterotrophic and photoautotrophic organisms, which shows that growth coupling as a strain design principle has a wide applicability. The feasibility of coupling is proven by calculating appropriate reaction knockouts, which enforce the coupling behaviour. The study presented here is the most comprehensive computational investigation of growth-coupled production so far and its results are of fundamental importance for rational metabolic engineering.
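    The coupling idea can be sketched numerically on a deliberately tiny hypothetical network (not one of the paper's genome-scale models): after knocking out a bypass reaction, any growth forces product secretion. All reaction names and stoichiometry below are invented for illustration; the linear programs follow the standard flux-balance pattern.

    ```python
    # Toy growth-coupling demonstration with linear programming.
    # Internal metabolites: A, B, P. Hypothetical reactions:
    #   v0: uptake -> A        (capped at 10)
    #   v1: A -> B + P         (growth pathway that co-produces P)
    #   v2: A -> B             (bypass avoiding P; the knockout target)
    #   v3: B -> biomass       (growth)
    #   v4: P -> export        (product secretion)
    import numpy as np
    from scipy.optimize import linprog

    S = np.array([  # rows: A, B, P; columns: v0..v4 (steady state: S v = 0)
        [1, -1, -1,  0,  0],
        [0,  1,  1, -1,  0],
        [0,  1,  0,  0, -1],
    ])

    def max_growth_min_product(knockout_v2=False):
        ub = [10, 1000, 0 if knockout_v2 else 1000, 1000, 1000]
        bounds = [(0, u) for u in ub]
        # Step 1: maximize growth v3 (linprog minimizes, so negate).
        c = np.zeros(5); c[3] = -1
        res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
        mu_max = res.x[3]
        # Step 2: fix growth at its maximum, minimize product secretion v4.
        bounds[3] = (mu_max - 1e-9, mu_max)
        c = np.zeros(5); c[4] = 1
        res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
        return mu_max, res.x[4]

    mu, p_min = max_growth_min_product(knockout_v2=False)
    mu_ko, p_min_ko = max_growth_min_product(knockout_v2=True)
    print(mu, p_min)        # wild type: maximal growth possible with zero product
    print(mu_ko, p_min_ko)  # knockout: any growth now forces product secretion
    ```

    The wild type can grow without making P, so production is not coupled; with v2 knocked out, the minimum product flux at maximal growth becomes strictly positive, which is exactly the coupling criterion the paper evaluates at genome scale.
    
    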

  13. Information flow analysis and Petri-net-based modeling for welding flexible manufacturing cell

    NASA Astrophysics Data System (ADS)

    Qiu, T.; Chen, Shanben; Wang, Y. T.; Wu, Lin

    2000-10-01

    With the development of advanced manufacturing technology and the introduction of the smart-manufacturing notion in modern industrial production, the welding flexible manufacturing system (WFMS) based on robot technology has become the inevitable direction of development in welding automation. In the WFMS process, flexibility across different welding products and the corresponding control of welding parameters are the guarantees of welding quality. Based on a new intelligent arc-welding flexible manufacturing cell (WFMC), the system structure and control policies are studied in this paper. To handle the different information flows between each subsystem and the central monitoring computer in this WFMC, Petri net theory is introduced into the welding manufacturing process. With its help, a discrete control model of the WFMC has been constructed, in which system status is regarded as places and the control process as transitions. Moreover, grounded in automation Petri net principles, the judgment and use of information obtained from welding sensors are imported into the net structure, which extends traditional Petri net concepts. The control model and policies researched in this paper establish a foundation for further intelligent real-time control of the WFMC and WFMS.
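    The modelling idea in this record (system status as places, control steps as transitions) can be sketched as a minimal place/transition "token game". The place and transition names below are hypothetical, invented for illustration; they are not taken from the paper's WFMC model.

    ```python
    # A minimal place/transition Petri net: places hold tokens, a transition
    # fires when every input place has enough tokens, consuming inputs and
    # producing outputs.
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)   # place -> token count
            self.transitions = {}          # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

        def fire(self, name):
            if not self.enabled(name):
                raise RuntimeError(f"transition {name!r} not enabled")
            inputs, outputs = self.transitions[name]
            for p, n in inputs.items():
                self.marking[p] -= n
            for p, n in outputs.items():
                self.marking[p] = self.marking.get(p, 0) + n

    # Hypothetical weld-cell states: a part and an idle robot start a weld;
    # finishing the weld releases the robot and yields a finished part.
    net = PetriNet({"part_waiting": 2, "robot_idle": 1, "welding": 0, "done": 0})
    net.add_transition("start_weld",
                       {"part_waiting": 1, "robot_idle": 1}, {"welding": 1})
    net.add_transition("finish_weld",
                       {"welding": 1}, {"done": 1, "robot_idle": 1})

    net.fire("start_weld")   # robot busy: start_weld is now disabled
    net.fire("finish_weld")  # robot released, one part done
    print(net.marking)       # {'part_waiting': 1, 'robot_idle': 1, 'welding': 0, 'done': 1}
    ```

    Sensor information, as the abstract suggests, would enter such a model as extra input places whose tokens gate the firing of control transitions.
    
    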

  14. THz spectroscopy: An emerging technology for pharmaceutical development and pharmaceutical Process Analytical Technology (PAT) applications

    NASA Astrophysics Data System (ADS)

    Wu, Huiquan; Khan, Mansoor

    2012-08-01

    As an emerging technology, THz spectroscopy has gained increasing attention in the pharmaceutical area during the last decade. This attention is due to the fact that (1) it provides a promising alternative approach for in-depth understanding of both intermolecular interaction among pharmaceutical molecules and pharmaceutical product quality attributes; (2) it provides a promising alternative approach for enhanced process understanding of certain pharmaceutical manufacturing processes; and (3) it aligns with the FDA pharmaceutical quality initiatives, most notably the Process Analytical Technology (PAT) initiative. In this work, the current status and progress made so far on using THz spectroscopy for pharmaceutical development and pharmaceutical PAT applications are reviewed. In the spirit of demonstrating the utility of a first-principles modeling approach for addressing the model validation challenge and reducing unnecessary model validation "burden" to facilitate THz pharmaceutical PAT applications, two scientific case studies based on published THz spectroscopy measurement results are created and discussed. Furthermore, other technical challenges and opportunities associated with adapting THz spectroscopy as a pharmaceutical PAT tool are highlighted.

  15. Thermodynamic Modeling and Optimization of the Copper Flash Converting Process Using the Equilibrium Constant Method

    NASA Astrophysics Data System (ADS)

    Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Chen, Zhuo; Wang, Jin-liang

    2018-05-01

    Based on the principle of multiphase equilibrium, a mathematical model of the copper flash converting process was established by the equilibrium constant method, and a computational system was developed with the use of the MetCal software platform. The mathematical model was validated by comparing simulated outputs, industrial data, and published data. To obtain high-quality blister copper, a low copper content in slag, and an increased impurity removal rate, the model was then applied to investigate the effects of the operational parameters [oxygen/feed ratio (R_OF), flux rate (R_F), and converting temperature (T)] on the product weights, compositions, and the distribution behaviors of impurity elements. The optimized results showed that R_OF, R_F, and T should be controlled at approximately 156 Nm³/t, within 3.0 pct, and at approximately 1523 K (1250 °C), respectively.
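    The equilibrium constant method solves for compositions that satisfy the equilibrium constants of the governing reactions. As a hedged, one-reaction toy (the paper solves a full multiphase copper-converting system; the reaction, K value, and ideal-mixture assumption below are illustrative only):

    ```python
    # Single ideal-gas reaction A + B <-> C at fixed T and total pressure 1:
    # find the reaction extent x such that K = y_C / (y_A * y_B),
    # where y_i are mole fractions. Root-found with scipy's brentq.
    from scipy.optimize import brentq

    def equilibrium_extent(K, nA0, nB0):
        def residual(x):
            n_tot = nA0 + nB0 - x            # one mole lost per mole of C formed
            return x * n_tot / ((nA0 - x) * (nB0 - x)) - K
        eps = 1e-12
        # residual < 0 at x ~ 0 and -> +inf as x -> min(nA0, nB0): a bracket.
        return brentq(residual, eps, min(nA0, nB0) - eps)

    x = equilibrium_extent(K=4.0, nA0=1.0, nB0=1.0)
    n_tot = 2.0 - x
    print(x)                                          # extent of reaction
    print((x / n_tot) / (((1.0 - x) / n_tot) ** 2))   # recovers K = 4
    ```

    A multiphase model like the paper's stacks one such constraint per reaction, plus element balances, and solves the resulting nonlinear system simultaneously.
    
    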

  16. Current pulse: can a production system reduce medical errors in health care?

    PubMed

    Printezis, Antonios; Gopalakrishnan, Mohan

    2007-01-01

    One of the reasons for rising health care costs is medical errors, a majority of which result from faulty systems and processes. Health care in the past has used process-based initiatives such as Total Quality Management, Continuous Quality Improvement, and Six Sigma to reduce errors. These initiatives to redesign health care, reduce errors, and improve overall efficiency and customer satisfaction have had moderate success. The current trend is to apply the successful Toyota Production System (TPS) to health care, since its organizing principles have led to tremendous improvement in productivity and quality for Toyota and other businesses that have adapted them. This article presents insights on the effectiveness of TPS principles in health care and the challenges that lie ahead in successfully integrating this approach with other quality initiatives.

  17. Reflexion on linear regression trip production modelling method for ensuring good model quality

    NASA Astrophysics Data System (ADS)

    Suprayitno, Hitapriya; Ratnasari, Vita

    2017-11-01

    Transport modelling is important. For certain cases the conventional model still has to be used, in which a good trip production model is essential. A good model can only be obtained from a good sample. Two of the basic principles of good sampling are that the sample must be capable of representing the population characteristics and capable of producing an acceptable error at a certain confidence level. It seems that these principles are not yet well understood and used in trip production modelling. Therefore, it is necessary to investigate trip production modelling practice in Indonesia and to formulate a better modelling method for ensuring model quality. The results are as follows. Statistics provides a method to calculate the span of a predicted value at a certain confidence level for linear regression, called the Confidence Interval of Predicted Value. Common modelling practice uses R2 as the principal quality measure, while sampling practice varies and does not always conform to the sampling principles. An experiment indicates that a small sample is already capable of giving an excellent R2 value and that sample composition can significantly change the model. Hence, a good R2 value does not in fact always mean good model quality. These observations lead to three basic ideas for ensuring good model quality: reformulating the quality measure, the calculation procedure, and the sampling method. A quality measure is defined as having both a good R2 value and a good Confidence Interval of Predicted Value. The calculation procedure must incorporate statistical calculation methods and the appropriate statistical tests. A good sampling method must incorporate random, well-distributed, stratified sampling with a certain minimum number of samples. These three ideas need to be further developed and tested.
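    For simple (one-predictor) linear regression, the Confidence Interval of Predicted Value the record refers to can be computed with the standard textbook prediction-interval formula. The sketch below uses only numpy/scipy on synthetic data; the variable names and the paper's exact (possibly multivariate) procedure are not taken from the source.

    ```python
    # Prediction interval for a new observation in simple linear regression:
    # y_hat +/- t * s * sqrt(1 + 1/n + (x_new - x_bar)^2 / Sxx)
    import numpy as np
    from scipy import stats

    def prediction_interval(x, y, x_new, conf=0.95):
        n = len(x)
        b1, b0 = np.polyfit(x, y, 1)               # slope, intercept
        resid = y - (b0 + b1 * x)
        s = np.sqrt(resid @ resid / (n - 2))       # residual standard error
        se = s * np.sqrt(1 + 1 / n
                         + (x_new - x.mean()) ** 2 / ((x - x.mean()) ** 2).sum())
        t = stats.t.ppf(0.5 + conf / 2, df=n - 2)
        y_hat = b0 + b1 * x_new
        return y_hat - t * se, y_hat + t * se

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, 30)                     # e.g. a household-size proxy
    y = 3.0 + 2.0 * x + rng.normal(0, 1.0, 30)     # trips generated, with noise
    lo, hi = prediction_interval(x, y, x_new=5.0)
    print(lo, hi)  # span that should contain a new observation ~95% of the time
    ```

    This is exactly the complement to R2 the paper argues for: the width of (lo, hi) reflects sample size and scatter, which a high R2 alone can hide.
    
    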

  18. Ergodicity, Maximum Entropy Production, and Steepest Entropy Ascent in the Proofs of Onsager's Reciprocal Relations

    NASA Astrophysics Data System (ADS)

    Benfenati, Francesco; Beretta, Gian Paolo

    2018-04-01

    We show that to prove the Onsager relations using the microscopic time reversibility one necessarily has to make an ergodic hypothesis, or a hypothesis closely linked to that. This is true in all the proofs of the Onsager relations in the literature: from the original proof by Onsager, to more advanced proofs in the context of linear response theory and the theory of Markov processes, to the proof in the context of the kinetic theory of gases. The only three proofs that do not require any kind of ergodic hypothesis are based on additional hypotheses on the macroscopic evolution: Ziegler's maximum entropy production principle (MEPP), the principle of time reversal invariance of the entropy production, or the steepest entropy ascent principle (SEAP).

  19. Proposal for a Conceptual Model for Evaluating Lean Product Development Performance: A Study of LPD Enablers in Manufacturing Companies

    NASA Astrophysics Data System (ADS)

    Osezua Aikhuele, Daniel; Mohd Turan, Faiz

    2016-02-01

    The instability in today's market and customers' emerging demands for mass-customized products are driving companies to seek cost-effective and time-efficient improvements in their production systems, and this has led to real pressure to adopt new developmental architectures and operational parameters to remain competitive in the market. Among the architectures adopted is the integration of lean thinking into the product development process. However, due to a lack of clear understanding of lean performance and its measurement, many companies are unable to implement and fully integrate lean principles into their product development process; without proper performance measurement, the performance level of the organizational value stream remains unknown and the specific areas of improvement related to the LPD program cannot be tracked, resulting in poor decision making in LPD implementation. This paper therefore presents a conceptual model for evaluating LPD performance by identifying and analysing the core existing LPD enablers (Chief Engineer, cross-functional teams, set-based engineering, Poka-yoke (mistake-proofing), knowledge-based environment, value-focused planning and development, top management support, technology, supplier integration, workforce commitment and continuous improvement culture) for assessing LPD performance.

  20. From science into practice: modelling hot spots for corporate flood risk and emergency management with high-resolution digital terrain data

    NASA Astrophysics Data System (ADS)

    Pfurtscheller, Clemens; Vetter, Michael; Werthmann, Markus

    2010-05-01

    In times of increasing scarcity of private and public resources and uncertain changes in the natural environment caused by climate variations, prevention and risk management against floods and related processes in mountainous regions, such as debris flows or log jams, should be faced as a main challenge for globalised enterprises whose production facilities are located in flood-prone areas. From an entrepreneurial perspective, the vulnerability of production facilities, which causes restrictions or a total termination of production processes, has to be optimised by means of cost-benefit principles. Modern production enterprises are subject to globalisation and its accompanying aspects, like short order and delivery periods, interlinked production processes and just-in-time manufacturing, so a breakdown of production provokes substantial financial impacts, unemployment and a decline in gross regional product. The aim of the presented project is to identify weak and critical points of corporate emergency planning ("hot spots") and to assess possible losses triggered by mountainous flood processes using high-resolution digital terrain models (DTM) from airborne LiDAR (ALS). We derive flood hot spots and model critical locations where the risk of natural hazards is very high. To model those hot spots, a flood simulation based on an ALS-DTM has to be calculated. Based on that flood simulation, the overflowed locations where the flood height is lower than a threshold are mapped as flood hot spots. Then the corporate critical infrastructure, e.g. production facilities or lifelines, affected by the flooding can be identified. After the identification of hot spots and the possible damage potential, the implementation of the results into corporate risk and emergency management guarantees a transdisciplinary approach involving stakeholders, risk and safety management officers and the corporate fire brigade.
Thus, the interdisciplinary analysis, combining remote sensing techniques like LiDAR and the economic assessment of natural hazards with corporate action, secures production, guarantees income and helps to stabilise the region's wealth after major flood events. Beyond that, the assessment of hot spots could be raised as a locational issue for greenfield strategies or company foundation.

  1. A Study on Improving Logistics in a Production Enterprise in the Automotive Domain

    NASA Astrophysics Data System (ADS)

    Muntean, A.; Inţă, M.; Stroilă, I. A.

    2016-11-01

    In order to remain on the market and to overcome competition, companies need to develop and implement a strategy of continuous improvement, based on satisfying customer needs, management based on realities, and respect for people. To meet global market requirements, the Faurecia Company uses a system called the "Faurecia Excellence System" (FES), a system that promotes continuous improvement. The FES system controls all activities of Faurecia, from research and development to sales, including production and company functions, and implements all 114 main procedures. This paper examines the current activity and introduces new methods to improve manufacturing processes. These new methods are based on some fundamental concepts, namely: a high level of stocks blocks resources and can mask other problems, decreasing visibility and communication; product quality is essential; and goods must be produced only when needed, using the Kanban principle. Additional expenditure on purchased raw materials (storage until they are needed, identification and packaging for storage, and inventory costs) was reduced using a JIT-based purchasing principle. Thus, the paper proposes reorganizing production by reducing the level of stocks and the production time using the JIT principle. An application built in the Arena simulation software simulates the fabrication process to reduce throughput time.

  2. Reducing waste and errors: piloting lean principles at Intermountain Healthcare.

    PubMed

    Jimmerson, Cindy; Weber, Dorothy; Sobek, Durward K

    2005-05-01

    The Toyota Production System (TPS), based on industrial engineering principles and operational innovations, is used to achieve waste reduction and efficiency while increasing product quality. Several key tools and principles, adapted to health care, have proved effective in improving hospital operations. Value Stream Maps (VSMs), which represent the key people, material, and information flows required to deliver a product or service, distinguish between value-adding and non-value-adding steps. The one-page Problem-Solving A3 Report guides staff through a rigorous and systematic problem-solving process. In a pilot project at Intermountain Healthcare, participants made many improvements, ranging from simple changes implemented immediately (for example, heart monitor paper not being available when a patient presented with a dysrhythmia) to larger projects involving patient or information flow issues across multiple departments. Most of the improvements required little or no investment and reduced significant amounts of wasted time for front-line workers. In one unit, turnaround time for pathologist reports from an anatomical pathology lab was reduced from five to two days. TPS principles and tools are applicable to an endless variety of processes and work settings in health care and can be used to address critical challenges such as medical errors, escalating costs, and staffing shortages.

  3. A basic introduction to the thermodynamics of the Earth system far from equilibrium and maximum entropy production

    PubMed Central

    Kleidon, A.

    2010-01-01

    The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion. PMID:20368248

  4. A basic introduction to the thermodynamics of the Earth system far from equilibrium and maximum entropy production.

    PubMed

    Kleidon, A

    2010-05-12

    The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion.
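The trade-off at the heart of MEP can be illustrated with the kind of two-box toy model the abstract describes: a heat flow between a warm and a cold box both produces entropy and erodes the temperature difference driving it, so entropy production peaks at an intermediate flow. A minimal sketch (all numbers invented for illustration, not taken from the paper):

```python
def entropy_production(F, S_hot=300.0, S_cold=160.0, b=1.0):
    """Entropy production of a heat flow F exchanged between two boxes.

    Each box balances its absorbed input (S_hot, S_cold) against linear
    radiative cooling b*T plus the exchanged flow F; all parameter values
    here are illustrative toy numbers.
    """
    T_hot = (S_hot - F) / b    # steady-state temperature of the warm box
    T_cold = (S_cold + F) / b  # steady-state temperature of the cold box
    return F * (1.0 / T_cold - 1.0 / T_hot)

# Grid search for the maximum-entropy-production (MEP) flow.
# For these numbers T_hot > T_cold only while F < 70.
flows = [0.1 * i for i in range(1, 700)]
best_F = max(flows, key=entropy_production)
```

The maximum lies strictly between F = 0 (no flow, no dissipation) and the flow that equalizes the two temperatures (no remaining thermodynamic force), which is exactly the flexibility-of-boundary-conditions trade-off the MEP discussion turns on.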

  5. Tobacco industry responsibility for butts: a Model Tobacco Waste Act.

    PubMed

    Curtis, Clifton; Novotny, Thomas E; Lee, Kelley; Freiberg, Mike; McLaughlin, Ian

    2017-01-01

    Cigarette butts and other postconsumer products from tobacco use are the most common waste elements picked up worldwide each year during environmental cleanups. Under the environmental principle of Extended Producer Responsibility, tobacco product manufacturers may be held responsible for the collection, transport, processing and safe disposal of tobacco product waste (TPW). Such legislation has been applied to other toxic and hazardous postconsumer waste products, such as paints, pesticide containers and unused pharmaceuticals, to reduce, prevent and mitigate their environmental impacts. Additional product stewardship (PS) requirements may be necessary for other stakeholders and beneficiaries of tobacco product sales and use, especially suppliers, retailers and consumers, in order to ensure effective TPW reduction. This report describes how a Model Tobacco Waste Act may be adopted by national and subnational jurisdictions to address the environmental impacts of TPW. Such a law will also reduce tobacco use and its health consequences by raising attention to the environmental hazards of TPW, increasing the price of tobacco products, and reducing the number of tobacco product retailers. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  6. The effects of type of knowledge upon human problem solving in a process control task

    NASA Technical Reports Server (NTRS)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The question of what the operator of a dynamic system needs to know was investigated in an experiment using PLANT, a simulation of a generic dynamic production process. Knowledge of PLANT was manipulated via different types of instruction, so that four different groups were created: (1) minimal instructions only; (2) minimal instructions plus guidelines for operation (procedures); (3) minimal instructions plus dynamic relationships (principles); and (4) minimal instructions plus procedures and principles. Subjects controlled PLANT in a variety of situations which required maintaining production while also diagnosing familiar and unfamiliar failures. Although these manipulations produced differences in subjects' knowledge, as assessed via a written test at the end of the experiment, instructions had no effect upon achievement of the primary goal of production, or upon subjects' ability to diagnose unfamiliar failures. However, the groups receiving procedures controlled the system in a more stable manner. Possible reasons for the failure to find an effect of principles are presented, and the implications of these results for operator training and aiding are discussed.

  7. R-HyMOD: an R-package for the hydrological model HyMOD

    NASA Astrophysics Data System (ADS)

    Baratti, Emanuele; Montanari, Alberto

    2015-04-01

    A software code for the implementation of the HyMOD hydrological model [1] is presented. HyMOD is a conceptual lumped rainfall-runoff model based on the probability-distributed soil storage capacity principle introduced by R. J. Moore in 1985 [2]. The general idea behind this model is to describe the spatial variability of some process parameters, such as the soil structure or the water storage capacities, through probability distribution functions. In HyMOD, the rainfall-runoff process is represented through a nonlinear tank connected with three identical linear tanks in parallel representing the surface flow, and a slow-flow tank representing groundwater flow. The model requires the optimization of five parameters: Cmax (the maximum storage capacity within the watershed), β (the degree of spatial variability of the soil moisture capacity within the watershed), α (a factor partitioning the flow between the two series of tanks) and the two residence-time parameters of the quick-flow and slow-flow tanks, kquick and kslow respectively. Given its relative simplicity and robustness, the model is widely used in the literature. The input data consist of precipitation and potential evapotranspiration at the given time scale. The R-HyMOD package comprises a 'canonical' R implementation of HyMOD and a fast FORTRAN implementation. The former can be easily modified and used, for instance, for educational purposes; the latter combines the user-friendly R interface with a fast processing unit. [1] Boyle D.P. (2000), Multicriteria calibration of hydrological models, Ph.D. dissertation, Dep. of Hydrol. and Water Resour., Univ. of Arizona, Tucson. [2] Moore, R.J. (1985), The probability-distributed principle and runoff production at point and basin scale, Hydrol. Sci. J., 30(2), 273-297.
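The structure described above fits in a few lines. The following single-time-step sketch (in Python rather than R, with invented default parameter values, and with the three quick tanks chained as a cascade, one common arrangement) is an illustrative reading of the probability-distributed storage idea, not the R-HyMOD code:

```python
def hymod_step(precip, pet, soil, quick, slow,
               cmax=300.0, beta=0.5, alpha=0.6, k_quick=0.5, k_slow=0.05):
    """One time step of a HyMOD-style rainfall-runoff model (sketch only).

    Moore's probability-distributed principle: the basin fraction with
    storage capacity below c is F(c) = 1 - (1 - c/cmax)**beta, so the
    maximum basin-average storage is smax = cmax / (1 + beta).
    """
    smax = cmax / (1.0 + beta)
    ratio = min(soil / smax, 1.0)
    c = cmax * (1.0 - (1.0 - ratio) ** (1.0 / (1.0 + beta)))  # critical capacity
    # Rain fills storage up to capacity; the remainder is saturation-excess runoff.
    c_new = min(c + precip, cmax)
    soil_new = smax * (1.0 - (1.0 - c_new / cmax) ** (1.0 + beta))
    excess = max(precip - (soil_new - soil), 0.0)
    # Actual evapotranspiration, scaled by relative soil wetness.
    soil_new -= min(soil_new, pet * soil_new / smax)
    # Routing: fraction alpha through three quick linear tanks, rest to the slow tank.
    q = alpha * excess
    for i in range(3):
        quick[i] += q
        q = k_quick * quick[i]
        quick[i] -= q
    slow += (1.0 - alpha) * excess
    q_slow = k_slow * slow
    slow -= q_slow
    return q + q_slow, soil_new, quick, slow

# Drive the model with constant toy forcing for a few steps.
soil, quick, slow = 0.0, [0.0, 0.0, 0.0], 0.0
for _ in range(50):
    discharge, soil, quick, slow = hymod_step(10.0, 2.0, soil, quick, slow)
```

Each linear tank here releases a fixed fraction of its storage per step, which is the discrete analogue of the residence-time parameters kquick and kslow mentioned in the abstract.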

  8. Expanding lean thinking to the product and process design and development within the framework of sustainability

    NASA Astrophysics Data System (ADS)

    Sorli, M.; Sopelana, A.; Salgado, M.; Pelaez, G.; Ares, E.

    2012-04-01

    Companies require tools to change towards a new way of developing and producing innovative products, manufactured with the economic, social and environmental impact along the product life cycle in mind. By translating Lean principles into Product Development (PD) from the design stage onward and along the entire product life cycle, this work aims to address both sustainability and environmental issues. The drivers of a sustainable culture within lean PD have been identified, and a baseline for future research on the development of appropriate tools and techniques has been provided. This research provides industry with a framework that balances environmental and sustainability factors with lean principles, to be considered and incorporated from the beginning of product design and development and covering the entire product life cycle.

  9. Principle considerations for the risk assessment of sprayed consumer products.

    PubMed

    Steiling, W; Bascompta, M; Carthew, P; Catalano, G; Corea, N; D'Haese, A; Jackson, P; Kromidas, L; Meurice, P; Rothe, H; Singal, M

    2014-05-16

    In recent years, the official regulation of chemicals and chemical products has been intensified. For spray products specifically, enhanced requirements for assessing consumers' and professionals' exposure to this product type have been introduced. In this regard, the Aerosol Dispensers Directive (75/324/EEC), which governs the marketing of aerosol dispensers, and the Cosmetic Products Regulation (1223/2009/EC), which mandates a safety assessment, have to be mentioned. Both enactments, like the REACH regulation (1907/2006/EC), require a robust chemical safety assessment. From such an assessment, appropriate risk management measures may be identified to adequately control the risk of these chemicals/products to human health and the environment when used. Currently, the above-mentioned regulations lack guidance on which data are needed for preparing a proper hazard analysis and safety assessment of spray products. Mandatory in the process of inhalation risk and safety assessment is the determination and quantification of the actual exposure to the spray product and, more specifically, its ingredients. In this respect the current article, prepared by the European Aerosol Federation (FEA, Brussels) task force "Inhalation Toxicology", introduces toxicological principles and the state of the art in currently available exposure models adapted to typical application scenarios. This review of current methodologies is intended to guide safety assessors to better estimate inhalation exposure by using the most relevant data. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  10. A new submarine oil-water separation system

    NASA Astrophysics Data System (ADS)

    Cai, Wen-Bin; Liu, Bo-Hong

    2017-12-01

    In current offshore oil production, lifting the produced liquid to offshore platforms for separation causes environmental problems and economic losses for the oil field. Starting from the most basic separation principles, a new oil-water separation system based on adsorption and desorption of related materials has been developed, achieving high-efficiency separation of the oil and water phases, and a submarine oil-water separation device has been designed. The main structure of the device consists of a gas-solid phase separation device, a periodic separating device and an adsorption device, which together accomplish high-efficiency separation of oil, gas and water under the adsorption-desorption principle; the processing capacity of the device is also calculated.

  11. [Quality process control system of Chinese medicine preparation based on "holistic view"].

    PubMed

    Wang, Ya-Qi; Jiao, Jiao-Jiao; Wu, Zhen-Feng; Zheng, Qin; Yang, Ming

    2018-01-01

    "High quality, safety and effectiveness" are the primary principles for the pharmaceutical research and development process in China. The quality of a product relies not only on the inspection method, but also on design and development, process control and standardized management; quality ultimately depends on the level of process control. In this paper, the history and current development of quality control for traditional Chinese medicine (TCM) preparations are reviewed systematically. Based on the international development model of drug quality control and common misunderstandings in the quality control of TCM preparations, the reasons affecting the homogeneity of TCM preparations are analyzed and summarized. In line with TCM characteristics, efforts were made to control the diversity of TCM and turn "unstable" TCM into "stable" Chinese patent medicines, and the concepts of "holistic view" and "QbD (quality by design)" were put forward, so as to create a "holistic, modular, data-based, standardized" model as the core of the TCM preparation quality process control model. Scientific studies should conform to the actual production of TCM preparations, and be conducive to supporting advanced equipment and technology upgrades, thoroughly applying scientific research achievements to Chinese patent medicines, and promoting the clustered application and translation of TCM pharmaceutical technology, so as to improve the quality and effectiveness of the TCM industry and realize green development. Copyright© by the Chinese Pharmaceutical Association.

  12. The role of language in mathematical development: evidence from children with specific language impairments.

    PubMed

    Donlan, Chris; Cowan, Richard; Newton, Elizabeth J; Lloyd, Delyth

    2007-04-01

    A sample (n=48) of eight-year-olds with specific language impairments (SLI) is compared with age-matched (n=55) and language-matched (n=55) controls on a range of tasks designed to test the interdependence of language and mathematical development. Performance across tasks varies substantially in the SLI group, showing profound deficits in production of the count word sequence and basic calculation, and significant deficits in understanding of the place-value principle in Hindu-Arabic notation. Only in understanding of arithmetic principles does SLI performance approximate that of age-matched controls, indicating that principled understanding can develop even where number sequence production and other aspects of number processing are severely compromised.

  13. Redundancy and reduction: Speakers manage syntactic information density

    PubMed Central

    Florian Jaeger, T.

    2010-01-01

    A principle of efficient language production based on information theoretic considerations is proposed: Uniform Information Density predicts that language production is affected by a preference to distribute information uniformly across the linguistic signal. This prediction is tested against data from syntactic reduction. A single multilevel logit model analysis of naturally distributed data from a corpus of spontaneous speech is used to assess the effect of information density on complementizer that-mentioning, while simultaneously evaluating the predictions of several influential alternative accounts: availability, ambiguity avoidance, and dependency processing accounts. Information density emerges as an important predictor of speakers’ preferences during production. As information is defined in terms of probabilities, it follows that production is probability-sensitive, in that speakers’ preferences are affected by the contextual probability of syntactic structures. The merits of a corpus-based approach to the study of language production are discussed as well. PMID:20434141
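Uniform Information Density is stated in terms of per-word surprisal, −log₂ p(word | context). A toy illustration (with made-up probabilities, not corpus estimates): mentioning an optional complementizer like *that* inserts a low-information word, smoothing a spike of information that would otherwise fall on the first word of the embedded clause.

```python
import math

def surprisal(p):
    """Information carried by an event of probability p, in bits."""
    return -math.log2(p)

# Invented conditional probabilities for the words following a main verb:
with_that = [0.5, 0.2, 0.3]   # "that", then the embedded-clause words
without_that = [0.02, 0.3]    # embedded clause starts abruptly, first word is surprising
peak_with = max(surprisal(p) for p in with_that)
peak_without = max(surprisal(p) for p in without_that)
```

Under UID, speakers should prefer the variant with the lower information peak when the embedded-clause onset is unpredictable; with these toy numbers, `peak_without` exceeds `peak_with`.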

  14. TOXPERT: An Expert System for Risk Assessment

    PubMed Central

    Soto, R. J.; Osimitz, T. G.; Oleson, A.

    1988-01-01

    TOXPERT is an artificial intelligence based system used to model product safety, toxicology (TOX) and regulatory (REG) decision processes. An expert system shell uses backward-chaining rule control to link “marketing approval” goals to the type of product, REG agency, exposure conditions and TOX. Marketing risks are primarily a function of the TOX hazards and exposure potential. The method employed differentiates between REG requirements in goal-seeking control for various types of products. This is accomplished by controlling rule execution through frames defined for each REG agency. In addition, TOXPERT produces classifications of TOX ratings and suggested product labeling. This production-rule system uses principles of TOX, REGs, corporate guidelines and internal “rules of thumb.” TOXPERT acts as an advisor for this narrow domain. Its advantages are that it can make routine decisions, freeing professionals' time for more complex problem solving, and provide backup and training.
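Backward chaining of the kind described works goal-down: to prove a goal, find a rule that concludes it and recursively prove that rule's premises. A minimal sketch (rule names and facts invented for illustration, not TOXPERT's actual knowledge base):

```python
def backward_chain(goal, rules, facts, seen=frozenset()):
    """Return True if `goal` follows from `facts` via `rules`.

    rules: list of (conclusion, [premise, ...]) pairs; `seen` guards
    against cyclic rule chains.
    """
    if goal in facts:
        return True
    if goal in seen:
        return False
    seen = seen | {goal}
    # A goal holds if some rule concluding it has all premises provable.
    return any(all(backward_chain(p, rules, facts, seen) for p in premises)
               for conclusion, premises in rules if conclusion == goal)

rules = [
    ("marketing_approval", ["tox_acceptable", "reg_requirements_met"]),
    ("tox_acceptable", ["low_hazard", "low_exposure"]),
]
facts = {"low_hazard", "low_exposure", "reg_requirements_met"}
```

Removing any leaf fact (say `"low_exposure"`) makes the top-level goal unprovable, which is how such a system localizes which data gap blocks an approval recommendation.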

  15. The Future of Pharmaceutical Manufacturing Sciences

    PubMed Central

    2015-01-01

    The entire pharmaceutical sector is in urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial‐scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. In that regard, state‐of‐the‐art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular‐based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed covering process measurement and process control solutions. The last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot‐melt processing and printing‐based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci 104:3612–3638, 2015 PMID:26280993

  16. The Future of Pharmaceutical Manufacturing Sciences.

    PubMed

    Rantanen, Jukka; Khinast, Johannes

    2015-11-01

    The entire pharmaceutical sector is in urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial-scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. In that regard, state-of-the-art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular-based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed covering process measurement and process control solutions. The last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot-melt processing and printing-based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.

  17. On the relationship between cell cycle analysis with ergodic principles and age-structured cell population models.

    PubMed

    Kuritz, K; Stöhr, D; Pollak, N; Allgöwer, F

    2017-02-07

    Cyclic processes, in particular the cell cycle, are of great importance in cell biology. Continued improvement in cell population analysis methods like fluorescence microscopy, flow cytometry, CyTOF or single-cell omics has made mathematical methods based on ergodic principles a powerful tool in studying these processes. In this paper, we establish the relationship between cell cycle analysis with ergodic principles and age-structured population models. To this end, we describe the progression of a single cell through the cell cycle by a stochastic differential equation on a one-dimensional manifold in the high-dimensional data space of cell cycle markers. Given the assumption that the cell population is in a steady state, we derive transformation rules which transform the number density on the manifold into the steady state number density of age-structured population models. Our theory facilitates the study of cell cycle dependent processes, including local molecular events, cell death and cell division, from high-dimensional "snapshot" data. Ergodic analysis can in general be applied to every process that exhibits a steady state distribution. By combining ergodic analysis with age-structured population models, we furthermore provide the theoretical basis for extending ergodic principles to distributions that deviate from their steady state. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Design of experiments (DoE) in pharmaceutical development.

    PubMed

    N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios

    2017-06-01

    At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables quality to be built into the product, by adopting Deming's profound knowledge approach, comprising systems thinking, variation understanding, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms, compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies, rather than implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach and the statistical toolset for its implementation. As such, DoE is presented in detail since it represents the first choice for rational pharmaceutical development.
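At its simplest, a DoE run plan is a full factorial over coded factor levels, and a factor's main effect is the difference between the mean response at its high and low levels. A minimal sketch (the two-level response below is a toy model, not pharmaceutical data):

```python
from itertools import product

def full_factorial(n_factors):
    """All runs of a 2**n full factorial design in coded -1/+1 levels."""
    return list(product((-1, 1), repeat=n_factors))

def main_effect(design, response, factor):
    """Mean response at the +1 level minus mean response at the -1 level."""
    hi = [y for run, y in zip(design, response) if run[factor] == +1]
    lo = [y for run, y in zip(design, response) if run[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

design = full_factorial(3)                       # 8 runs for 3 factors
# Toy process: y = 5 + 2*A + 0.5*B, with factor C inert.
response = [5 + 2 * a + 0.5 * b for a, b, _ in design]
```

With a noise-free linear response, each main effect recovers twice the underlying coefficient (the level change spans 2 coded units), and the inert factor's effect is zero; in practice, effects estimated this way feed the design-space models the abstract describes.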

  19. Mechanism for multiplicity of steady states with distinct cell concentration in continuous culture of mammalian cells.

    PubMed

    Yongky, Andrew; Lee, Jongchan; Le, Tung; Mulukutla, Bhanu Chandra; Daoutidis, Prodromos; Hu, Wei-Shou

    2015-07-01

    Continuous culture for the production of biopharmaceutical proteins offers the possibility of steady state operations and thus more consistent product quality and increased productivity. Under some conditions, multiplicity of steady states has been observed in continuous cultures of mammalian cells, wherein, with the same dilution rate and feed nutrient composition, steady states with very different cell and product concentrations may be reached. At those different steady states, cells may exhibit a high glycolysis flux with high lactate production and low cell concentration, or a low glycolysis flux with low lactate and high cell concentration. These different steady states, with different cell concentrations, also differ in productivity. Developing a mechanistic understanding of the occurrence of steady state multiplicity, and devising a strategy to steer the culture toward the desired steady state, is critical. We establish a multi-scale kinetic model that integrates a mechanistic intracellular metabolic model and a cell growth model in a continuous bioreactor. We show that steady state multiplicity exists over a range of dilution rates in continuous culture as a result of the bistable behavior of glycolysis. The insights from the model were used to devise strategies to guide the culture to the desired steady state in the multiple-steady-state region. The model provides a guiding principle for the design of continuous culture processes for mammalian cells. © 2015 Wiley Periodicals, Inc.

  20. Onsager's variational principle in soft matter.

    PubMed

    Doi, Masao

    2011-07-20

    In the celebrated paper on the reciprocal relations for the kinetic coefficients in irreversible processes, Onsager (1931 Phys. Rev. 37 405) extended Rayleigh's principle of least energy dissipation to general irreversible processes. In this paper, I shall show that this variational principle gives us a very convenient framework for deriving many established equations which describe nonlinear and non-equilibrium phenomena in soft matter, such as phase separation kinetics in solutions, gel dynamics, molecular modeling for viscoelasticity, nemato-hydrodynamics, etc. Onsager's variational principle can therefore be regarded as a solid general basis for soft matter physics.
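The variational structure behind this principle can be stated compactly. For slow variables $x=(x_1,\dots,x_n)$ with free energy $A(x)$ and symmetric friction coefficients $\zeta_{ij}$, one minimizes the Rayleighian with respect to the rates:

```latex
R(\dot{x}) \;=\; \underbrace{\tfrac{1}{2}\sum_{i,j}\zeta_{ij}\,\dot{x}_i\dot{x}_j}_{\text{dissipation }\Phi}
\;+\; \underbrace{\sum_i \frac{\partial A}{\partial x_i}\,\dot{x}_i}_{\dot{A}},
\qquad
\frac{\partial R}{\partial \dot{x}_i} = 0
\;\Longrightarrow\;
\sum_j \zeta_{ij}\,\dot{x}_j = -\frac{\partial A}{\partial x_i}.
```

The resulting kinetic equations balance frictional forces against thermodynamic driving forces, and Onsager's reciprocity $\zeta_{ij}=\zeta_{ji}$ makes the dissipation function $\Phi$ a well-defined quadratic form.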

  1. An Evaporative Cooling Model for Teaching Applied Psychrometrics

    ERIC Educational Resources Information Center

    Johnson, Donald M.

    2004-01-01

    Evaporative cooling systems are commonly used in controlled environment plant and animal production. These cooling systems operate based on well defined psychrometric principles. However, students often experience considerable difficulty in learning these principles when they are taught in an abstract, verbal manner. This article describes an…

  2. Human contact imagined during the production process increases food naturalness perceptions.

    PubMed

    Abouab, Nathalie; Gomez, Pierrick

    2015-08-01

    It is well established that food processing and naturalness are not good friends, but is food processing always detrimental to naturalness? Building on the contagion principle, this research examines how production mode (handmade vs. machine-made) influences naturalness perceptions. In a pilot study (n = 69) and an experiment (n = 133), we found that compared with both a baseline condition and a condition in which the mode of production process was portrayed as machine-made, a handmade production mode increases naturalness ratings of a grape juice. A mediation analysis demonstrates that these effects result from higher perceived human contact suggesting that the production process may preserve food naturalness when humanized. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. 75 FR 43922 - Interim Guidance for Determining Subject Matter Eligibility for Process Claims in View of Bilski...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ..., who do not routinely encounter claims that implicate the abstract idea exception. Under the principles... principles: Laws of nature, physical phenomena, and abstract ideas. See id. The Office has been using the so... marketing a product, comprising: Developing a shared marketing force, said shared marketing force including...

  4. Creation of the BMA ensemble for SST using a parallel processing technique

    NASA Astrophysics Data System (ADS)

    Kim, Kwangjin; Lee, Yang Won

    2013-10-01

    Although satellite products share the same purpose, each has a different value because of its unavoidable uncertainty. Moreover, satellite products have accumulated over a long period and are varied and voluminous, so efforts to reduce uncertainty and handle such large data volumes are necessary. In this paper, we create an ensemble Sea Surface Temperature (SST) product using MODIS Aqua, MODIS Terra and COMS (Communication, Ocean and Meteorological Satellite), with Bayesian Model Averaging (BMA) as the ensemble method. The principle of BMA is to synthesize the conditional probability density function (PDF) using posterior probabilities as weights; the posterior probabilities are estimated using the EM algorithm, and the BMA PDF is obtained as a weighted average. As a result, the ensemble SST showed the lowest RMSE and MAE, which demonstrates the applicability of BMA to satellite data ensembles. As future work, parallel processing techniques based on the Hadoop framework will be adopted for more efficient computation of very large satellite datasets.
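The EM-weighted mixture at the core of BMA can be sketched in a simplified form (raw member forecasts as Gaussian means with a shared spread, no bias correction; the data below are invented toy numbers, not SST retrievals):

```python
import math

def gauss_pdf(y, mu, sigma):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bma_em(obs, forecasts, iters=100):
    """EM for BMA weights and a shared Gaussian spread (simplified sketch).

    obs: n observations; forecasts: n rows of K member forecasts.
    """
    n, k = len(obs), len(forecasts[0])
    w, sigma = [1.0 / k] * k, 1.0
    for _ in range(iters):
        # E-step: responsibility z[i][j] of member j for observation i.
        z = []
        for y, row in zip(obs, forecasts):
            like = [w[j] * gauss_pdf(y, row[j], sigma) for j in range(k)]
            total = sum(like) or 1e-300
            z.append([l / total for l in like])
        # M-step: weights are mean responsibilities; the variance is the
        # responsibility-weighted mean squared forecast error.
        w = [sum(row[j] for row in z) / n for j in range(k)]
        var = sum(z[i][j] * (obs[i] - forecasts[i][j]) ** 2
                  for i in range(n) for j in range(k)) / n
        sigma = math.sqrt(max(var, 1e-12))
    return w, sigma

obs = [20.1, 21.0, 19.5, 22.3, 20.8]                      # toy "truth"
forecasts = [[20.0, 22.5], [21.2, 18.9], [19.4, 21.7],
             [22.1, 20.0], [20.9, 23.1]]                  # member 0 tracks obs closely
weights, spread = bma_em(obs, forecasts)
```

The BMA predictive PDF is then the weight-mixed Gaussian Σⱼ wⱼ N(y; fⱼ, σ²); a member that tracks the observations closely accumulates weight, which is how the ensemble down-weights the more uncertain satellite product.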

  5. Enhancement of Pyrometallurgical Teaching Using Excel Simulation Models

    NASA Astrophysics Data System (ADS)

    Grimsey, Eric J.

    Steady-state Excel models for a copper flash smelter and an iron blast furnace are used to enhance the teaching of pyrometallurgical smelting principles within a fourth-year process engineering unit delivered at the Western Australian School of Mines. A lecture/workshop approach has been adopted in which student teams undertake process simulation assignments that illustrate the multifaceted responses of process outputs to variation of inputs, the objective being to reinforce their understanding of smelting principles. The approach has proven popular with students, as evidenced by the consistently high ratings the unit has received through student feedback. This paper provides an overview of the teaching approach and the process models used.

  6. Monitoring of laser material processing using machine integrated low-coherence interferometry

    NASA Astrophysics Data System (ADS)

    Kunze, Rouwen; König, Niels; Schmitt, Robert

    2017-06-01

    Laser material processing has become an indispensable tool in modern production. With the availability of high-power pico- and femtosecond laser sources, laser material processing is advancing into applications which demand the highest accuracies, such as laser micro milling or laser drilling. In order to meet narrow tolerance windows, closed-loop monitoring of the geometrical properties of the processed work piece is essential for achieving a robust manufacturing process. Low-coherence interferometry (LCI) is a high-precision measuring principle well known from surface metrology. In recent years, we demonstrated successful integrations of LCI into several different laser material processing methods. Within this paper, we give an overview of the different machine integration strategies, which always aim at a complete and ideally telecentric integration of the measurement device into the existing beam path of the processing laser. Thus, highly accurate depth measurements within machine coordinates, and subsequent process control and quality assurance, are possible. The first products using this principle have already found their way to the market, which underlines the potential of this technology for the monitoring of laser material processing.
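The depth resolution such an LCI monitor can achieve is set by the source's coherence length: for a Gaussian spectrum, δz = (2 ln 2 / π) · λ₀² / Δλ. A quick sketch (the 1310 nm / 60 nm source below is an illustrative example, not a parameter of the cited systems):

```python
import math

def lci_axial_resolution(center_wavelength, bandwidth):
    """Axial (depth) resolution of low-coherence interferometry for a
    Gaussian source spectrum: (2*ln2/pi) * lambda0**2 / delta_lambda.
    Input and output share the same length unit.
    """
    return (2.0 * math.log(2) / math.pi) * center_wavelength ** 2 / bandwidth

# Example: a 1310 nm source with 60 nm bandwidth resolves roughly 13 micrometres.
res_nm = lci_axial_resolution(1310.0, 60.0)
```

Broadening the source bandwidth is therefore the main lever for finer depth resolution, independent of the focusing optics that set the lateral spot size.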

  7. Proposed correlation of modern processing principles for Ayurvedic herbal drug manufacturing: A systematic review.

    PubMed

    Jain, Rahi; Venkatasubramanian, Padma

    2014-01-01

    Quality Ayurvedic herbal medicines are potential, low-cost solutions for addressing the contemporary healthcare needs of both the Indian and the global community. Correlating Ayurvedic herbal preparations with modern processing principles (MPPs) can help in developing new technology, and in using appropriate existing technology, for scaling up production of the medicines, which is necessary to meet the growing demand. Understanding the fundamental Ayurvedic principles behind formulation and processing is also important for improving the dosage forms. Even though the Ayurvedic industry has adopted technologies from the food, chemical and pharmaceutical industries, there is no systematic study correlating the traditional and modern processing methods. This study is an attempt to provide a possible correlation between Ayurvedic processing methods and MPPs. A systematic literature review was performed to identify the Ayurvedic processing methods, collecting information on medicine preparation methods from English editions of classical Ayurveda texts. Correlation between traditional methods and MPPs was based on the techniques used in Ayurvedic drug processing. It was observed that Ayurvedic medicine preparation involves two major types of processes, namely extraction and separation. Extraction uses membrane-rupturing and solute-diffusion principles, while separation uses volatility, adsorption, and size-exclusion principles. The study provides systematic documentation of the methods used in Ayurveda for herbal drug preparation, along with their interpretation in terms of MPPs. This is a first step toward improving or replacing traditional techniques: new or existing technologies can then be used to improve dosage forms and scale up production while keeping the processing faithful to Ayurvedic principles.

  8. Overview of the TriBITS Lifecycle Model: Lean/Agile Software Lifecycle Model for Research-based Computational Science and Engineering Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  9. Questionnaire typography and production.

    PubMed

    Gray, M

    1975-06-01

    This article describes the typographic principles and practice which provide the basis of good design and print, the relevant printing processes which can be used, and the graphic designer's function in questionnaire production. As they impose constraints on design decisions to be discussed later in the text, the various methods of printing and production are discussed first.

  10. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

    Through unsustainable land use practices, mining, deforestation, urbanisation and degradation by industrial pollution, soil losses are now hypothesized to be much faster (100 times or more) than soil formation - with the consequence that soil has become a finite resource. The crucial challenge for the international research community is to understand the rates of processes that dictate soil mass stocks and their function within Earth's Critical Zone (CZ). The CZ is the environment where soils are formed, degrade and provide their essential ecosystem services. Key among these ecosystem services are food and fibre production, filtering, buffering and transformation of water, nutrients and contaminants, storage of carbon and maintaining biological habitat and genetic diversity. We have initiated a new research project to address the priority research areas identified in the European Union Soil Thematic Strategy and to contribute to the development of a global network of Critical Zone Observatories (CZO) committed to soil research. Our hypothesis is that the combined physical-chemical-biological structure of soil can be assessed from first principles and the resulting soil functions can be quantified in process models that couple the formation and loss of soil stocks with descriptions of biodiversity and nutrient dynamics. The objectives of this research are to 1. Describe from first principles how soil structure influences processes and functions of soils, 2. Establish 4 European Critical Zone Observatories to link with established CZOs, 3. Develop a CZ Integrated Model of soil processes and function, 4. Create a GIS-based modelling framework to assess soil threats and mitigation at EU scale, 5. Quantify impacts of changing land use, climate and biodiversity on soil function and its value and 6. Form with international partners a global network of CZOs for soil research and deliver a programme of public outreach and research transfer on soil sustainability.
The experimental design studies soil processes across the temporal evolution of the soil profile, from its formation on bare bedrock, through managed use as productive land to its degradation under longstanding pressures from intensive land use. To understand this conceptual life cycle of soil, we have selected 4 European field sites as Critical Zone Observatories. These are to provide data sets of soil parameters, processes and functions which will be incorporated into the mathematical models. The field sites are 1) the BigLink field station which is located in the chronosequence of the Damma Glacier forefield in alpine Switzerland and is established to study the initial stages of soil development on bedrock; 2) the Lysina Catchment in the Czech Republic which is representative of productive soils managed for intensive forestry, 3) the Fuchsenbigl Field Station in Austria which is an agricultural research site that is representative of productive soils managed as arable land and 4) the Koiliaris Catchment in Crete, Greece which represents degraded Mediterranean region soils, heavily impacted by centuries of intensive grazing and farming, under severe risk of desertification.

  11. Addressing the medicinal chemistry bottleneck: a lean approach to centralized purification.

    PubMed

    Weller, Harold N; Nirschl, David S; Paulson, James L; Hoffman, Steven L; Bullock, William H

    2012-09-10

    The use of standardized lean manufacturing principles to improve drug discovery productivity is often thought to be at odds with fostering innovation. This manuscript describes how selective implementation of a lean optimized process, in this case centralized purification for medicinal chemistry, can improve operational productivity and increase scientist time available for innovation. A description of the centralized purification process is provided along with both operational and impact (productivity) metrics, which indicate lower cost, higher output, and presumably more free time for innovation as a result of the process changes described.

  12. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  13. Basic principles of stability.

    PubMed

    Egan, William; Schofield, Timothy

    2009-11-01

    An understanding of the principles of degradation, as well as the statistical tools for measuring product stability, is essential to management of product quality. Key to this is management of vaccine potency. Vaccine shelf life is best managed through determination of a minimum potency release requirement, which helps assure adequate potency throughout expiry. Statistical tools such as least squares regression analysis should be employed to model potency decay. The use of such tools provides an incentive to properly design vaccine stability studies, while holding stability measurements to specification presents a disincentive for collecting valuable data. The laws of kinetics, such as Arrhenius behavior, help practitioners design effective accelerated stability programs, which can be utilized to manage stability after a process change. Design of stability studies should be carefully considered, with an eye to minimizing the variability of the stability parameter. In the case of measuring the degradation rate, testing at the beginning and the end of the study improves the precision of this estimate. Additional design considerations such as bracketing and matrixing improve the efficiency of stability evaluation of vaccines.
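
    The design intuition in this abstract - that measuring at the beginning and the end of a study sharpens the degradation-rate estimate - follows directly from the least-squares variance formula Var(slope) = sigma^2 / sum((t_i - t_mean)^2). The sketch below illustrates this with hypothetical potency numbers (decay rate, assay noise, and pull schedules are illustrative assumptions, not from the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)
P0, k, sigma = 100.0, 0.5, 1.0  # initial %LC, decay rate %LC/month, assay SD (hypothetical)

def slope_variance(times, sigma):
    """Variance of the least-squares slope estimate for a given pull schedule."""
    t = np.asarray(times, dtype=float)
    return sigma**2 / np.sum((t - t.mean())**2)

uniform = [0, 6, 12, 18, 24]      # evenly spaced stability pulls (months)
endpoints = [0, 0, 12, 24, 24]    # replicates concentrated at start and end

# Endpoint-weighted designs spread the time points farther from the mean,
# so the degradation-rate estimate is more precise
assert slope_variance(endpoints, sigma) < slope_variance(uniform, sigma)

# Fit the degradation rate from simulated potency data
t = np.array(uniform, dtype=float)
potency = P0 - k * t + rng.normal(0, sigma, t.size)
k_hat = -np.polyfit(t, potency, 1)[0]
print(f"estimated decay rate: {k_hat:.2f} %LC/month")
```

    The same variance formula is what justifies bracketing designs: points in the middle of the study contribute little to the precision of the slope.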

  14. Development of a Methodology for Conducting Training Effectiveness Evaluations of Air Defense Training, and Abstracts of TEE-Related Literature

    DTIC Science & Technology

    1982-06-01

    process) pertain to the second. Time or cost factors sometimes preclude the use of product measures, leaving measures of task process as the only... Training Effectiveness Evaluation, Product evaluation, Training effectiveness, Air defense training, Training... requirements. The TEE system described in this report incorporates the principles of instructional system development and provides for both product evaluation

  15. Representing idioms: syntactic and contextual effects on idiom processing.

    PubMed

    Holsinger, Edward

    2013-09-01

    Recent work on the processing of idiomatic expressions argues against the idea that idioms are simply big words. For example, hybrid models of idiom representation, originally investigated in the context of idiom production, propose a priority of literal computation, and a principled relationship between the conceptual meaning of an idiom, its literal lemmas and its syntactic structure. We examined the predictions of the hybrid representation hypothesis in the domain of idiom comprehension. We conducted two experiments to examine the role of syntactic, lexical and contextual factors on the interpretation of idiomatic expressions. Experiment 1 examines the role of syntactic compatibility and lexical compatibility on the real-time processing of potentially idiomatic strings. Experiment 2 examines the role of contextual information on idiom processing and how context interacts with lexical information during processing. We find evidence that literal computation plays a causal role in the retrieval of idiomatic meaning and that contextual, lexical and structural information influence the processing of idiomatic strings at early stages during processing, which provide support for the hybrid model of idiom representation in the domain of idiom comprehension.

  16. First-principles modeling of laser-matter interaction and plasma dynamics in nanosecond pulsed laser shock processing

    NASA Astrophysics Data System (ADS)

    Zhang, Zhongyang; Nian, Qiong; Doumanidis, Charalabos C.; Liao, Yiliang

    2018-02-01

    Nanosecond pulsed laser shock processing (LSP) techniques, including laser shock peening, laser peen forming, and laser shock imprinting, have been employed for widespread industrial applications. In these processes, the main beneficial characteristic is the laser-induced shockwave with a high pressure (on the order of GPa), which leads to plastic deformation at an ultrahigh strain rate (10^5-10^6 s^-1) on the surface of target materials. Although LSP processes have been extensively studied by experiments, few efforts have been devoted to elucidating the underlying process mechanisms through developing a physics-based process model. In particular, development of a first-principles model is critical for process optimization and novel process design. This work aims at introducing such a theoretical model for a fundamental understanding of process mechanisms in LSP. Emphasis is placed on the laser-matter interaction and plasma dynamics. This model is found to offer capabilities in predicting key parameters including electron and ion temperatures, plasma state variables (temperature, density, and pressure), and the propagation of the laser shockwave. The modeling results were validated by experimental data.

  17. Nondestructive methods of integrating energy harvesting systems with structures

    NASA Astrophysics Data System (ADS)

    Inamdar, Sumedh; Zimowski, Krystian; Crawford, Richard; Wood, Kristin; Jensen, Dan

    2012-04-01

    Designing an attachment structure that is both novel and meets the system requirements can be a difficult task, especially for inexperienced designers. This paper presents a design methodology for concept generation of a "parent/child" attachment system. The "child" is broadly defined as any device, part, or subsystem that will attach to any existing system, part, or device called the "parent." An inductive research process was used to study a variety of products, patents, and biological examples that exemplified the parent/child system. Common traits among these products were found and categorized as attachment principles in three different domains: mechanical, material, and field. The attachment principles within the mechanical domain and accompanying examples are the focus of this paper. As an example of the method, a case study of generating concepts for a bridge-mounted wind energy harvester using the mechanical attachment principles derived from the methodology and TRIZ principles derived from Altshuller's matrix of contradictions is presented.

  18. Using a mass balance to determine the potency loss during the production of a pharmaceutical blend.

    PubMed

    Mackaplow, Michael B

    2010-09-01

    The manufacture of a blend containing the active pharmaceutical ingredient (API) and inert excipients is a precursor for the production of most pharmaceutical capsules and tablets. However, if there is a net water gain or preferential loss of API during production, the potency of the final drug product may be less than the target value. We use a mass balance to predict the mean potency loss during the production of a blend via wet granulation and fluidized bed drying. The result is an explicit analytical equation for the change in blend potency as a function of net water gain, solids losses (both regular and high-potency), and the fraction of excipients added extragranularly. This model predicts that each 1% gain in moisture content (as determined by a loss on drying test) will decrease the API concentration of the final blend by at least 1% LC. The effect of pre-blend solid losses increases with their degree of superpotency. This work supports Quality by Design by providing a rational method to set the process design space to minimize blend potency losses. When an overage is necessary, the model can help justify it by providing a quantitative, first-principles understanding of the sources of potency loss. The analysis is applicable to other manufacturing processes where the primary sources of potency loss are net water gain and/or mass losses.
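
    The paper's explicit analytical equation is not reproduced in the abstract; the sketch below is a simplified, hypothetical mass balance (function name and parameterization are assumptions) that captures the two stated effects - dilution by net water gain, measured as a loss-on-drying (LOD) fraction, and preferential loss of superpotent pre-blend solids - and recovers the quoted rule of thumb that each 1% moisture gain costs at least 1% of label claim (%LC):

```python
def blend_potency_pct_lc(moisture_gain_lod, solids_loss_frac=0.0, loss_superpotency=1.0):
    """Simplified, illustrative mass balance (not the paper's exact equation).

    moisture_gain_lod : net water gain as a loss-on-drying fraction (0-1)
    solids_loss_frac  : fraction of pre-blend solids lost during processing
    loss_superpotency : API enrichment of the lost solids
                        (1.0 = lost solids have the same potency as the blend)
    Returns the final blend potency in % of label claim for a 100 %LC target.
    """
    # Superpotent solids losses remove API faster than they remove mass
    remaining_api = 1.0 - solids_loss_frac * loss_superpotency
    remaining_mass = 1.0 - solids_loss_frac
    # Net water gain dilutes the blend one-for-one on an LOD basis
    return 100.0 * (remaining_api / remaining_mass) * (1.0 - moisture_gain_lod)

print(blend_potency_pct_lc(0.01))             # ~99 %LC: 1% moisture gain -> >=1 %LC loss
print(blend_potency_pct_lc(0.01, 0.02, 2.0))  # superpotent losses cost extra potency
```

    In this toy form the "at least" in the abstract's rule of thumb comes from the second factor: any superpotent solids loss pushes the potency below the pure-dilution value.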

  19. Building Capacity in Community-Based Participatory Research Partnerships Through a Focus on Process and Multiculturalism.

    PubMed

    Corbie-Smith, Giselle; Bryant, Angela R; Walker, Deborah J; Blumenthal, Connie; Council, Barbara; Courtney, Dana; Adimora, Ada

    2015-01-01

    In health research, investigators and funders are emphasizing the importance of collaboration between communities and academic institutions to achieve health equity. Although the principles underlying community-academic partnered research have been well-articulated, the processes by which partnerships integrate these principles when working across cultural differences are not as well described. We present how Project GRACE (Growing, Reaching, Advocating for Change and Empowerment) integrated participatory research principles with the process of building individual and partnership capacity. We worked with Vigorous Interventions In Ongoing Natural Settings (VISIONS) Inc., a process consultant and training organization, to develop a capacity building model. We present the conceptual framework and multicultural process of change (MPOC) that was used to build individual and partnership capacity to address health disparities. The process and capacity building model provides a common language, approach, and toolset to understand differences and the dynamics of inequity. These tools can be used by other partnerships in the conduct of research to achieve health equity.

  20. Introduction to the application of QbD principles for the development of monoclonal antibodies.

    PubMed

    Finkler, Christof; Krummen, Lynne

    2016-09-01

    Quality by Design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody. This chapter introduces a publication series on the application of Quality by Design for biopharmaceuticals, with a focus on the development of recombinant monoclonal antibodies. The development of the QbD concept applied by Roche and Genentech is described, an overview is given, and essential QbD elements are presented. Copyright © 2016 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  1. Dielectric properties of agricultural products – fundamental principles, influencing factors, and measurement techniques. Chapter 4. Electrotechnologies for Food Processing: Book Series. Volume 3. Radio-Frequency Heating

    USDA-ARS?s Scientific Manuscript database

    In this chapter, definitions of dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of frequency of the electric fields and temperatur...

  2. Utilization of lean management principles in the ambulatory clinic setting.

    PubMed

    Casey, Jessica T; Brinton, Thomas S; Gonzalez, Chris M

    2009-03-01

    The principles of 'lean management' have permeated many sectors of today's business world, secondary to the success of the Toyota Production System. This management method enables workers to eliminate mistakes, reduce delays, lower costs, and improve the overall quality of the product or service they deliver. These lean management principles can be applied to health care. Their implementation within the ambulatory care setting is predicated on the continuous identification and elimination of waste within the process. The key concepts of flow time, inventory and throughput are utilized to improve the flow of patients through the clinic, and to identify points that slow this process -- so-called bottlenecks. Nonessential activities are shifted away from bottlenecks (i.e. the physician), and extra work capacity is generated from existing resources, rather than being added. The additional work capacity facilitates a more efficient response to variability, which in turn results in cost savings, more time for the physician to interact with patients, and faster completion of patient visits. Finally, application of the lean management principle of 'just-in-time' management can eliminate excess clinic inventory, better synchronize office supply with patient demand, and reduce costs.
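
    The flow time / inventory / throughput relationship invoked in this abstract is usually formalized as Little's Law (inventory = throughput x flow time). A minimal sketch with hypothetical clinic numbers (the figures are illustrative assumptions, not from the article):

```python
def throughput(patients_in_process, flow_time_hours):
    """Little's Law rearranged: L = lambda * W  =>  lambda = L / W."""
    return patients_in_process / flow_time_hours

baseline = throughput(12, 1.5)   # 12 patients in the clinic, 1.5 h average visit
leaner   = throughput(12, 1.25)  # 0.25 h of waiting removed at the bottleneck

print(f"{baseline:.1f} -> {leaner:.1f} patients/hour")
```

    The lean insight is the direction of causality: shortening flow time at the bottleneck raises throughput from existing resources, rather than by adding capacity.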

  3. Development of a pheromone elution rate physical model

    Treesearch

    M.E. Teske; H.W. Thistle; B.L. Strom; H. Zhu

    2015-01-01

    A first principle modeling approach has been applied to available data describing the elution of semiochemicals from pheromone dispensers. These data included field data for 27 products developed by several manufacturers, including homemade devices, as well as environmental chamber data collected on three semiochemical products. The goal of this effort was to...

  4. PRODUCT LIFE-CYCLE ASSESSMENT: INVENTORY GUIDELINES AND PRINCIPLES

    EPA Science Inventory

    The Life Cycle Assessment (LCA) can be used as an objective technical tool to evaluate the environmental consequences of a product, process, or activity holistically, across its entire life cycle. A complete LCA can be viewed as consisting of three complementary components (1) the i...

  5. Model of Fluidized Bed Containing Reacting Solids and Gases

    NASA Technical Reports Server (NTRS)

    Bellan, Josette; Lathouwers, Danny

    2003-01-01

    A mathematical model has been developed for describing the thermofluid dynamics of a dense, chemically reacting mixture of solid particles and gases. As used here, "dense" signifies having a large volume fraction of particles, as for example in a bubbling fluidized bed. The model is intended especially for application to fluidized beds that contain mixtures of carrier gases, biomass undergoing pyrolysis, and sand. So far, the design of fluidized beds and other gas/solid industrial processing equipment has been based on empirical correlations derived from laboratory- and pilot-scale units. The present mathematical model is a product of continuing efforts to develop a computational capability for optimizing the designs of fluidized beds and related equipment on the basis of first principles. Such a capability could eliminate the need for expensive, time-consuming predesign testing.

  6. Nonequilibrium thermodynamics and maximum entropy production in the Earth system: applications and implications.

    PubMed

    Kleidon, Axel

    2009-06-01

    The Earth system is maintained in a unique state far from thermodynamic equilibrium, as, for instance, reflected in the high concentration of reactive oxygen in the atmosphere. The myriad of processes that transform energy, that result in the motion of mass in the atmosphere, in oceans, and on land, processes that drive the global water, carbon, and other biogeochemical cycles, all have in common that they are irreversible in their nature. Entropy production is a general consequence of these processes and measures their degree of irreversibility. The proposed principle of maximum entropy production (MEP) states that systems are driven to steady states in which they produce entropy at the maximum possible rate given the prevailing constraints. In this review, the basics of nonequilibrium thermodynamics are described, as well as how these apply to Earth system processes. Applications of the MEP principle are discussed, ranging from the strength of the atmospheric circulation, the hydrological cycle, and biogeochemical cycles to the role that life plays in these processes. Nonequilibrium thermodynamics and the MEP principle have potentially wide-ranging implications for our understanding of Earth system functioning, how it has evolved in the past, and why it is habitable. Entropy production allows us to quantify an objective direction of Earth system change (closer to vs further away from thermodynamic equilibrium, or, equivalently, towards a state of MEP). When a maximum in entropy production is reached, MEP implies that the Earth system reacts to perturbations primarily with negative feedbacks. In conclusion, this nonequilibrium thermodynamic view of the Earth system shows great promise to establish a holistic description of the Earth as one system. This perspective is likely to allow us to better understand and predict its function as one entity, how it has evolved in the past, and how it is modified by human activities in the future.
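
    A classic illustration of the MEP principle discussed above is a Paltridge-style two-box energy-balance toy: a heat flow F from a warm box to a cold box produces entropy at rate sigma = F(1/T_cold - 1/T_warm), and steady-state energy balances make both temperatures functions of F. The sketch below (all parameter values hypothetical, with linearized emission) finds the flow that maximizes sigma; it is an illustration of the principle, not a model from the review:

```python
import numpy as np

# Two-box climate toy: warm (equatorial) and cold (polar) box
S1, S2 = 300.0, 150.0  # absorbed shortwave per box, W/m^2 (hypothetical)
k = 1.0                # linearized longwave emission coefficient, W/m^2/K

def entropy_production(F):
    """Entropy production rate of a heat flow F from box 1 to box 2."""
    T1 = (S1 - F) / k   # steady state for box 1: S1 = k*T1 + F
    T2 = (S2 + F) / k   # steady state for box 2: S2 + F = k*T2
    return F * (1.0 / T2 - 1.0 / T1)

F = np.linspace(0.0, 70.0, 7001)
sigma = entropy_production(F)
F_mep = F[np.argmax(sigma)]
print(f"MEP-selected heat transport: {F_mep:.1f} W/m^2")
```

    Entropy production vanishes both at F = 0 (no flow) and where the transport equalizes the temperatures, so the maximum lies at an intermediate flow - the steady state the MEP principle selects.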

  7. Understanding Variability To Reduce the Energy and GHG Footprints of U.S. Ethylene Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, Yuan; Graziano, Diane J.; Riddle, Matthew

    2015-11-18

    Recent growth in U.S. ethylene production due to the shale gas boom is affecting the U.S. chemical industry's energy and greenhouse gas (GHG) emissions footprints. To evaluate these effects, a systematic, first-principles model of the cradle-to-gate ethylene production system was developed and applied. The variances associated with estimating the energy consumption and GHG emission intensities of U.S. ethylene production, both from conventional natural gas and from shale gas, are explicitly analyzed. A sensitivity analysis illustrates that the large variances in energy intensity are due to process parameters (e.g., compressor efficiency), and that large variances in GHG emissions intensity are due to fugitive emissions from upstream natural gas production. On the basis of these results, the opportunities with the greatest leverage for reducing the energy and GHG footprints are presented. The model and analysis provide energy analysts and policy makers with a better understanding of the drivers of energy use and GHG emissions associated with U.S. ethylene production. They also constitute a rich data resource that can be used to evaluate options for managing the industry's footprints moving forward.

  8. Mechanical-Kinetic Modeling of a Molecular Walker from a Modular Design Principle

    NASA Astrophysics Data System (ADS)

    Hou, Ruizheng; Loh, Iong Ying; Li, Hongrong; Wang, Zhisong

    2017-02-01

    Artificial molecular walkers beyond burnt-bridge designs are complex nanomachines that potentially replicate biological walkers in mechanisms and functionalities. Bringing man-made walkers up to the performance needed for widespread applications remains difficult, largely because their biomimetic design principles involve entangled kinetic and mechanical effects that complicate the link between a walker's construction and its ultimate performance. Here, a synergic mechanical-kinetic model is developed for a recently reported DNA bipedal walker, which is based on a modular design principle that potentially enables many directional walkers driven by a length-switching engine. The model reproduces the experimental data of the walker and identifies its performance-limiting factors. The model also captures features common to the underlying design principle, including counterintuitive performance-construction relations that are explained by detailed balance, entropy production, and bias cancellation. While indicating a low directional fidelity for the present walker, the model suggests the possibility of improving the fidelity above 90% with a more powerful engine, which may be an improved version of the present engine or an entirely new engine motif, thanks to the flexible design principle. The model is readily adaptable to aid these experimental developments towards high-performance molecular walkers.

  9. A Model for Determining Optimal Governance Structure in DoD Acquisition Projects in a Performance-Based Environment

    DTIC Science & Technology

    2010-04-30

    combating market dynamism (Aldrich, 1979; Child, 1972), which is a result of evolving technology, shifting prices, or variance in product availability... principles : (1) human beings are bounded rationally, and (2), as a result of being rationally bound, will always choose to further their own self... principles to govern the relationship among the buyers and suppliers. Our conceptual model aligns the alternative governance structures derived

  10. Process development and modeling of fluidized-bed reactor with coimmobilized biocatalyst for fuel ethanol production

    NASA Astrophysics Data System (ADS)

    Sun, May Yongmei

    This research focuses on two steps of commercial fuel ethanol production processes: the starch hydrolysis process and the fermentation process. The goal of this research is to evaluate the performance of co-immobilized biocatalysts in a fluidized bed reactor with emphasis on economic and engineering aspects and to develop a predictive mathematical model for this system. The productivity of an FBR is higher than that of a traditional batch reactor or CSTR. Fluidized beds offer great advantages over packed beds for immobilized cells when small particles are used or when the reactant feed contains suspended solids. Plugging problems, excessive pressure drops (and thus attrition), or crushing risks may be avoided. No mechanical stirring is required as mixing occurs due to the natural turbulence in the fluidized process. Both the enzyme and the microorganism are immobilized in a single catalyst bead, which is called co-immobilization. Inside this biocatalyst matrix, starch is hydrolyzed by the enzyme glucoamylase to form glucose and then converted to ethanol and carbon dioxide by microorganisms. Two biocatalysts were evaluated: (1) co-immobilized yeast strain Saccharomyces cerevisiae and glucoamylase, and (2) co-immobilized Zymomonas mobilis and glucoamylase. A co-immobilized biocatalyst accomplishes simultaneous saccharification and fermentation (the SSF process). When compared to a two-step process involving separate saccharification and fermentation stages, the SSF process has productivity values twice that given by the pre-saccharified process when the time required for pre-saccharification (15--25 h) was taken into account. The SSF process should also save capital cost. Information about productivity, fermentation yield, concentration profiles along the bed, ethanol inhibition, etc., was obtained from the experimental data.
For the yeast system, experimental results showed that no apparent decrease in productivity occurred after two and a half months; the productivity was 25--44 g/L-hr (based on reactor volume); the average yield was 0.45 g ethanol/g starch; the biocatalyst retained physical integrity; and contamination did not affect fermentation. For the Z. mobilis system, the maximum volumetric productivity was 38 g ethanol/L-h, the average yield was 0.51 g ethanol/g starch, and the FBR was successfully operated for almost one month. In order to develop, scale up and economically evaluate this system more efficiently, a predictive mathematical model based on fundamental principles was developed and verified. This model includes the kinetics of the reactions, the transport of reactant and product by diffusion within the biocatalyst bead, and the hydrodynamics of the three-phase fluidized bed. The co-immobilized biocatalyst involves a consecutive reaction mechanism. Mathematical descriptions of the effectiveness factors of the reactant and the intermediate product were developed. Hydrodynamic literature correlations were used to obtain the dispersion coefficient and the gas, liquid, and solid holdups. The coupled non-linear second-order equations for the biocatalyst bead and the reactor, together with the boundary conditions, were solved numerically. This model gives considerable information about the system, such as concentration profiles inside both the beads and the column, the influence of flow rate and feed concentration on productivity and phase holdup, and the influence of enzyme and cell mass loading in the catalyst. This model is generic in nature such that it can be easily applied to a diverse set of applications and operating conditions.
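
    The coupled consecutive-reaction effectiveness factors described above are solved numerically in the thesis; as a simpler stand-in (not the thesis model), the classical effectiveness factor for a single first-order reaction in a spherical bead shows how intraparticle diffusion limits productivity via the Thiele modulus. All parameter values below are hypothetical:

```python
import math

def thiele_modulus(R, k, D):
    """Sphere Thiele modulus, phi = (R/3)*sqrt(k/D), first-order kinetics."""
    return (R / 3.0) * math.sqrt(k / D)

def effectiveness_factor(phi):
    """Classical spherical-pellet effectiveness factor for a single reaction:
    eta = (1/phi) * (1/tanh(3*phi) - 1/(3*phi)); eta -> 1 as phi -> 0."""
    return (1.0 / phi) * (1.0 / math.tanh(3.0 * phi) - 1.0 / (3.0 * phi))

# Hypothetical bead: R = 1 mm, k = 0.5 1/s, D_eff = 1e-9 m^2/s
phi = thiele_modulus(1e-3, 0.5, 1e-9)
eta = effectiveness_factor(phi)
print(f"phi = {phi:.1f}, eta = {eta:.3f}")  # strong diffusion limitation
```

    At large phi the observed rate falls well below the intrinsic rate, which is why bead size and enzyme/cell loading appear as key design variables in the model.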

  11. Modeling transformation of soil organic matter through the soil enzyme activity

    NASA Astrophysics Data System (ADS)

    Tregubova, Polina; Vladimirov, Artem; Vasilyeva, Nadezda

    2017-04-01

    The sensitivity of soil heterotrophic respiration to changing environmental conditions is widely investigated but remains extremely controversial, and the underlying mechanisms are still to be revealed. In this work we model soil C and N biogeochemical cycles based on general principles of soil carbon and nitrogen dynamics, focusing on biochemical processes in the soil based on well-known classes of enzymes and the organic compounds they can transform. According to classical theory, the exoenzymes and endoenzymes of bacteria and fungi, as catalytic components that are stable over long periods, play a significant role in the degradation of plant and animal residues, the decomposition of biopolymers of different sizes, humification processes, and the release of labile compounds essential for microorganism and plant growth and germination. We test the sensitivity of the model regimes to environmental factors such as temperature and moisture. Modeling the directions and patterns of soil biochemical activity is important for evaluating soil agricultural productivity as well as soil ecological functions.
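
    The core of such models is a set of pool equations whose rates respond to temperature and moisture. A minimal single-pool sketch (a common Q10/parabolic formulation with hypothetical parameter values, not the authors' model) is:

```python
def decomposition_rate(c_pool, k_base, temp_c, moisture,
                       q10=2.0, t_ref=15.0, m_opt=0.6):
    """First-order decomposition of a soil carbon pool (g C per time step),
    modulated by a Q10 temperature response and a parabolic moisture
    response peaking at m_opt. All parameter values are illustrative."""
    f_temp = q10 ** ((temp_c - t_ref) / 10.0)
    f_moist = max(0.0, 1.0 - ((moisture - m_opt) / m_opt) ** 2)
    return k_base * f_temp * f_moist * c_pool
```

A full model would couple several such pools for C and N, with enzyme-mediated transfer terms between them as described in the abstract.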

  12. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.

  13. An ecological process model of systems change.

    PubMed

    Peirson, Leslea J; Boydell, Katherine M; Ferguson, H Bruce; Ferris, Lorraine E

    2011-06-01

    In June 2007 the American Journal of Community Psychology published a special issue focused on theories, methods and interventions for systems change which included calls from the editors and authors for theoretical advancement in this field. We propose a conceptual model of systems change that integrates familiar and fundamental community psychology principles (succession, interdependence, cycling of resources, adaptation) and accentuates a process orientation. To situate our framework we offer a definition of systems change and a brief review of the ecological perspective and principles. The Ecological Process Model of Systems Change is depicted, described and applied to a case example of policy driven systems level change in publicly funded social programs. We conclude by identifying salient implications for thinking and action which flow from the Model.

  14. Process design and control of a twin screw hot melt extrusion for continuous pharmaceutical tamper-resistant tablet production.

    PubMed

    Baronsky-Probst, J; Möltgen, C-V; Kessler, W; Kessler, R W

    2016-05-25

    Hot melt extrusion (HME) is a well-known process within the plastic and food industries that has been utilized for the past several decades and is increasingly accepted by the pharmaceutical industry for continuous manufacturing. For tamper-resistant formulations of e.g. opioids, HME is the most efficient production technique. The focus of this study is thus to evaluate the manufacturability of the HME process for tamper-resistant formulations. Parameters such as the specific mechanical energy (SME), as well as the melt pressure and its standard deviation, are important and will be discussed in this study. In the first step, the existing process data are analyzed by means of multivariate data analysis. Key critical process parameters such as feed rate, screw speed, and the concentration of the API in the polymers are identified, and critical quality parameters of the tablet are defined. In the second step, a relationship between the critical material, product and process quality attributes is established by means of Design of Experiments (DoEs). The resulting SME and the temperature at the die are essential data points needed to indirectly qualify the degradation of the API, which should be minimal. NIR-spectroscopy is used to monitor the material during the extrusion process. In contrast to most applications, in which the probe is directly integrated into the die, the optical sensor is integrated into the cooling line of the strands. This saves costs in probe design and maintenance and increases the robustness of the chemometric models. Finally, a process measurement system is installed to monitor and control all of the critical attributes in real time by means of first principles, DoE models, soft sensor models, and spectroscopic information. Overall, the process is very robust as long as the screw speed is kept low. Copyright © 2015 Elsevier B.V. All rights reserved.
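
    The specific mechanical energy mentioned above is commonly estimated from drive power, screw speed, torque, and throughput. A sketch using one widely used simplified relation (all numeric values illustrative, not taken from this study):

```python
def specific_mechanical_energy(motor_power_kw, screw_speed_rpm,
                               max_speed_rpm, torque_pct, feed_rate_kg_h):
    """SME (kWh/kg) for a twin-screw extruder, via the common
    simplified relation:
        SME = P_max * (N / N_max) * (torque% / 100) / feed_rate
    Higher SME generally means more shear/thermal stress on the API."""
    power_kw = (motor_power_kw * (screw_speed_rpm / max_speed_rpm)
                * (torque_pct / 100.0))
    return power_kw / feed_rate_kg_h

# Illustrative operating point: 10 kW drive, 150 of 600 rpm,
# 60% torque, 5 kg/h throughput.
sme = specific_mechanical_energy(10.0, 150.0, 600.0, 60.0, 5.0)
```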

  15. Understanding Financial Innovation: An Introduction to Derivative Financial Products.

    ERIC Educational Resources Information Center

    Robinson, J. N.

    1992-01-01

    Explains the use of forwards, futures, swaps, and options in international currency trading. Argues that pricing options are based on the same basic principles as pricing other financial instruments. Concludes that, although financial markets have developed several new products, hedging and speculation involve similar processes. (CFR)

  16. Pharmaceutical Applications of Ion-Exchange Resins

    ERIC Educational Resources Information Center

    Elder, David

    2005-01-01

    The historical uses of ion-exchange resins and a summary of the basic chemical principles involved in the ion-exchange process are discussed. Specific applications of ion-exchange are provided that include drug stabilization, pharmaceutical excipients, taste-masking agents, oral sustained-release products, topical products for local application…

  17. An industrial ecology approach to municipal solid waste management: I. Methodology

    EPA Science Inventory

    Municipal solid waste (MSW) can be viewed as a feedstock for industrial ecology inspired conversions of wastes to valuable products and energy. The industrial ecology principle of symbiotic processes using waste streams for creating value-added products is applied to MSW, with e...

  18. Anaerobic Treatment of Municipal Solid Waste and Sludge for Energy Production and Recycling of Nutrients

    NASA Astrophysics Data System (ADS)

    Leinonen, S.

    This volume contains 18 papers presented at a Nordic workshop dealing with application of anaerobic decomposition processes on various types of organic wastes, held at the Siikasalmi Research and Experimental Station of the University of Joensuu on 1-2 Oct. 1992. Subject coverage of the presentations extends from the biochemical and microbiological principles of organic waste processing to descriptions and practical experiences of various types of treatment plants. The theoretical and experimental papers include studies on anaerobic and thermophilic degradation processes, methanogenesis, effects of hydrogen, treatment of chlorinated and phenolic compounds, and process modeling, while the practical examples range from treatment of various types of municipal, industrial, and mining wastes to agricultural and fish farm effluents. The papers provide technical descriptions of several biogas plants in operation. Geographically, the presentations span the Nordic and Baltic countries.

  19. Future issues including broadening the scope of the GLP principles.

    PubMed

    Liem, Francisca E; Lehr, Mark J

    2008-01-01

    When the principles of good laboratory practice (GLP) were drafted in 1982 by the Organisation for Economic Cooperation and Development (OECD), the electronic era was in its infancy, and many of the issues surrounding what may affect the environment and human health were not anticipated. Today, technologies for capturing and recording data for the reconstruction of a study are available, and are being developed, that operate at speeds which could not have been known or understood in years past. Since that time, the United States Environmental Protection Agency (EPA) has required the conduct of additional studies in support of a pesticide registration in accordance with the GLP regulations. However, not all of these studies are required in other countries, or they may not require adherence to the principles of GLP. Companies are using computer models as virtual studies instead of in-life or bench-type regulated research. Studies are often conducted at institutions of higher learning because of the academic expertise they offer. What is the overall impact of advancing technology on the principles of GLP? Are monitoring authorities (MAs) ready? The medical products field faces similar issues: development and testing of these products and devices is being conducted similarly to development and testing in the pesticide arena. To garner trust in the mutual acceptance of data, each participating country must adhere to practices that ensure the highest standards of quality and integrity. The GLP inspector will need a good understanding of the science supporting the study conduct and of the electronic systems that generate, process, and maintain study records.

  20. Modelling non-equilibrium thermodynamic systems from the speed-gradient principle.

    PubMed

    Khantuleva, Tatiana A; Shalymov, Dmitry S

    2017-03-06

    The application of the speed-gradient (SG) principle to the non-equilibrium distribution systems far away from thermodynamic equilibrium is investigated. The options for applying the SG principle to describe the non-equilibrium transport processes in real-world environments are discussed. Investigation of a non-equilibrium system's evolution at different scale levels via the SG principle allows for a fresh look at the thermodynamics problems associated with the behaviour of the system entropy. Generalized dynamic equations for finite and infinite number of constraints are proposed. It is shown that the stationary solution to the equations, resulting from the SG principle, entirely coincides with the locally equilibrium distribution function obtained by Zubarev. A new approach to describe time evolution of systems far from equilibrium is proposed based on application of the SG principle at the intermediate scale level of the system's internal structure. The problem of the high-rate shear flow of viscous fluid near the rigid plane plate is discussed. It is shown that the SG principle allows closed mathematical models of non-equilibrium processes to be constructed. This article is part of the themed issue 'Horizons of cybernetical physics'. © 2017 The Author(s).
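
    As a sketch of the underlying idea (the standard speed-gradient formulation due to Fradkov, stated here from general knowledge rather than reproduced from this paper): for a goal functional $Q$ to be decreased along trajectories of $\dot{x} = f(x,u,t)$, one first computes the rate of change $\dot{Q}$ and then evolves the adjustable quantity $u$ against its gradient:

```latex
\dot{Q}(x,u,t) = \frac{\partial Q}{\partial t} + \nabla_x Q(x,t) \cdot f(x,u,t),
\qquad
\frac{du}{dt} = -\Gamma \, \nabla_u \dot{Q}(x,u,t), \qquad \Gamma = \Gamma^{\top} \succ 0 .
```

In the non-equilibrium setting of the paper, $Q$ plays the role of an entropy-related functional, consistent with the abstract's statement that the stationary solution coincides with Zubarev's locally equilibrium distribution.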

  1. Modelling non-equilibrium thermodynamic systems from the speed-gradient principle

    NASA Astrophysics Data System (ADS)

    Khantuleva, Tatiana A.; Shalymov, Dmitry S.

    2017-03-01

    The application of the speed-gradient (SG) principle to the non-equilibrium distribution systems far away from thermodynamic equilibrium is investigated. The options for applying the SG principle to describe the non-equilibrium transport processes in real-world environments are discussed. Investigation of a non-equilibrium system's evolution at different scale levels via the SG principle allows for a fresh look at the thermodynamics problems associated with the behaviour of the system entropy. Generalized dynamic equations for finite and infinite number of constraints are proposed. It is shown that the stationary solution to the equations, resulting from the SG principle, entirely coincides with the locally equilibrium distribution function obtained by Zubarev. A new approach to describe time evolution of systems far from equilibrium is proposed based on application of the SG principle at the intermediate scale level of the system's internal structure. The problem of the high-rate shear flow of viscous fluid near the rigid plane plate is discussed. It is shown that the SG principle allows closed mathematical models of non-equilibrium processes to be constructed. This article is part of the themed issue 'Horizons of cybernetical physics'.

  2. Modelling non-equilibrium thermodynamic systems from the speed-gradient principle

    PubMed Central

    Khantuleva, Tatiana A.

    2017-01-01

    The application of the speed-gradient (SG) principle to the non-equilibrium distribution systems far away from thermodynamic equilibrium is investigated. The options for applying the SG principle to describe the non-equilibrium transport processes in real-world environments are discussed. Investigation of a non-equilibrium system's evolution at different scale levels via the SG principle allows for a fresh look at the thermodynamics problems associated with the behaviour of the system entropy. Generalized dynamic equations for finite and infinite number of constraints are proposed. It is shown that the stationary solution to the equations, resulting from the SG principle, entirely coincides with the locally equilibrium distribution function obtained by Zubarev. A new approach to describe time evolution of systems far from equilibrium is proposed based on application of the SG principle at the intermediate scale level of the system's internal structure. The problem of the high-rate shear flow of viscous fluid near the rigid plane plate is discussed. It is shown that the SG principle allows closed mathematical models of non-equilibrium processes to be constructed. This article is part of the themed issue ‘Horizons of cybernetical physics’. PMID:28115617

  3. Protection - Principles and practice.

    NASA Technical Reports Server (NTRS)

    Graham, G. S.; Denning, P. J.

    1972-01-01

    The protection mechanisms of computer systems control the access to objects, especially information objects. The principles of protection system design are formalized as a model (theory) of protection. Each process has a unique identification number which is attached by the system to each access attempted by the process. Details of system implementation are discussed, taking into account the storing of the access matrix, aspects of efficiency, and the selection of subjects and objects. Two systems which have protection features incorporating all the elements of the model are described.

  4. Application of a mathematical model for ergonomics in lean manufacturing.

    PubMed

    Botti, Lucia; Mora, Cristina; Regattieri, Alberto

    2017-10-01

    The data presented in this article are related to the research article "Integrating ergonomics and lean manufacturing principles in a hybrid assembly line" (Botti et al., 2017) [1]. The results refer to the application of the mathematical model for the design of lean processes in hybrid assembly lines, meeting both the lean principles and the ergonomic requirements for safe assembly work. Data show that the success of a lean strategy is possible when ergonomics of workers is a parameter of the assembly process design.

  5. Quality by design approach: application of artificial intelligence techniques of tablets manufactured by direct compression.

    PubMed

    Aksu, Buket; Paradkar, Anant; de Matas, Marcel; Ozer, Ozgen; Güneri, Tamer; York, Peter

    2012-12-01

    The publication of the International Conference of Harmonization (ICH) Q8, Q9, and Q10 guidelines paved the way for the standardization of quality after the Food and Drug Administration issued current Good Manufacturing Practices guidelines in 2003. "Quality by Design", mentioned in the ICH Q8 guideline, offers a better scientific understanding of critical process and product qualities using knowledge obtained during the life cycle of a product. In this scope, the "knowledge space" is a summary of all process knowledge obtained during product development, and the "design space" is the area in which a product can be manufactured within acceptable limits. To create the spaces, artificial neural networks (ANNs) can be used to emphasize the multidimensional interactions of input variables and to closely bind these variables to a design space. This helps guide the experimental design process to include interactions among the input variables, along with modeling and optimization of pharmaceutical formulations. The objective of this study was to develop an integrated multivariate approach to obtain a quality product based on an understanding of the cause-effect relationships between formulation ingredients and product properties with ANNs and genetic programming on the ramipril tablets prepared by the direct compression method. In this study, the data are generated through the systematic application of the design of experiments (DoE) principles and optimization studies using artificial neural networks and neurofuzzy logic programs.
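
    As a toy illustration of the ANN idea (with hypothetical data and a far smaller architecture than the study's networks): a one-hidden-layer network trained by plain gradient descent to map two scaled formulation inputs, e.g. binder fraction and compression force, to one tablet property.

```python
import math

# Hypothetical scaled data: (input1, input2) -> tablet property
data = [([0.1, 0.2], 0.15), ([0.4, 0.5], 0.45), ([0.8, 0.9], 0.85)]

# Fixed small initial weights for a 2-4-1 tanh network (illustrative)
w1 = [[0.5, -0.5], [0.3, 0.7], [-0.6, 0.4], [0.2, 0.1]]
b1 = [0.0, 0.0, 0.0, 0.0]
w2 = [0.5, -0.3, 0.4, 0.6]
b2 = 0.0

def forward(x):
    """Return network output and hidden activations."""
    h = [math.tanh(w[0] * x[0] + w[1] * x[1] + b) for w, b in zip(w1, b1)]
    return sum(wj * hj for wj, hj in zip(w2, h)) + b2, h

lr = 0.1
for _ in range(2000):                 # stochastic gradient descent
    for x, y in data:
        yhat, h = forward(x)
        err = yhat - y                # gradient of 0.5 * err^2 w.r.t. yhat
        for j in range(4):
            grad_h = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
            w2[j] -= lr * err * h[j]
            w1[j][0] -= lr * grad_h * x[0]
            w1[j][1] -= lr * grad_h * x[1]
            b1[j] -= lr * grad_h
        b2 -= lr * err
```

In QbD practice the trained mapping is then explored over the input region to delineate the design space in which predicted quality attributes stay within acceptable limits.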

  6. A physical description of fission product behavior fuels for advanced power reactors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaganas, G.; Rest, J.; Nuclear Engineering Division

    2007-10-18

    The Global Nuclear Energy Partnership (GNEP) is considering a list of reactors and nuclear fuels as part of its chartered initiative. Because many of the candidate materials have not been explored experimentally under the conditions of interest, and in order to economize on program costs, analytical support in the form of combined first principle and mechanistic modeling is highly desirable. The present work is a compilation of mechanistic models developed in order to describe the fission product behavior of irradiated nuclear fuel. The mechanistic nature of the model development allows for the possibility of describing a range of nuclear fuels under varying operating conditions. Key sources include the FASTGRASS code with an application to UO{sub 2} power reactor fuel and the Dispersion Analysis Research Tool (DART) with an application to uranium-silicide and uranium-molybdenum research reactor fuel. Described behavior mechanisms are divided into subdivisions treating fundamental materials processes under normal operation as well as the effect of transient heating conditions on these processes. Model topics discussed include intra- and intergranular gas-atom and bubble diffusion, bubble nucleation and growth, gas-atom re-solution, fuel swelling and fission gas release. In addition, the effect of an evolving microstructure on these processes (e.g., irradiation-induced recrystallization) is considered. The uranium-alloy fuel, U-xPu-Zr, is investigated and behavior mechanisms are proposed for swelling in the {alpha}-, intermediate- and {gamma}-uranium zones of this fuel. The work reviews the FASTGRASS kinetic/mechanistic description of volatile fission products and, separately, the basis for the DART calculation of bubble behavior in amorphous fuels. Development areas and applications for physical nuclear fuel models are identified.

  7. Applying lean management principles to the creation of a postpartum hemorrhage care bundle.

    PubMed

    Faulkner, Beth

    2013-10-01

    A lean management process is a set of interventions, each of which creates value for the customer. Lean management is not a new concept, but is relatively new to health care. Postpartum hemorrhage (PPH) is the most common cause of maternal death worldwide in both developing and developed countries. We applied lean management principles as an innovative approach to improving outcomes in patients with PPH. Initial results using principles of lean management indicated significant improvements in response time and family-centered care. When applied rigorously and throughout the organization, lean principles can have a dramatic effect on productivity, cost and quality. © 2013 AWHONN.

  8. Operationalising the Lean principles in maternity service design using 3P methodology

    PubMed Central

    Smith, Iain

    2016-01-01

    The last half century has seen significant changes to Maternity services in England. Though rates of maternal and infant mortality have fallen to very low levels, this has been achieved largely through hospital admission. It has been argued that maternity services may have become over-medicalised and service users have expressed a preference for more personalised care. NHS England's national strategy sets out a vision for a modern maternity service that continues to deliver safe care whilst also adopting the principles of personalisation. Therefore, there is a need to develop maternity services that balance safety with personal choice. To address this challenge, a maternity unit in North East England considered improving their service through refurbishment or building new facilities. Using a design process known as the production preparation process (or 3P), the Lean principles of understanding user value, mapping value-streams, creating flow, developing pull processes and continuous improvement were applied to the design of a new maternity department. Multiple stakeholders were engaged in the design through participation in a time-out (3P) workshop in which an innovative pathway and facility for maternity services were co-designed. The team created a hybrid model that they described as "wrap around care" in which the Lean concept of pull was applied to create a service and facility design in which expectant mothers were put at the centre of care with clinicians, skills, equipment and supplies drawn towards them in line with acuity changes as needed. Applying the Lean principles using the 3P method helped stakeholders to create an innovative design in line with the aspirations and objectives of the National Maternity Review. The case provides a practical example of stakeholders applying the Lean principles to maternity services and demonstrates the potential applicability of the Lean 3P approach to design healthcare services in line with policy requirements. PMID:27933146

  9. Operationalising the Lean principles in maternity service design using 3P methodology.

    PubMed

    Smith, Iain

    2016-01-01

    The last half century has seen significant changes to Maternity services in England. Though rates of maternal and infant mortality have fallen to very low levels, this has been achieved largely through hospital admission. It has been argued that maternity services may have become over-medicalised and service users have expressed a preference for more personalised care. NHS England's national strategy sets out a vision for a modern maternity service that continues to deliver safe care whilst also adopting the principles of personalisation. Therefore, there is a need to develop maternity services that balance safety with personal choice. To address this challenge, a maternity unit in North East England considered improving their service through refurbishment or building new facilities. Using a design process known as the production preparation process (or 3P), the Lean principles of understanding user value, mapping value-streams, creating flow, developing pull processes and continuous improvement were applied to the design of a new maternity department. Multiple stakeholders were engaged in the design through participation in a time-out (3P) workshop in which an innovative pathway and facility for maternity services were co-designed. The team created a hybrid model that they described as "wrap around care" in which the Lean concept of pull was applied to create a service and facility design in which expectant mothers were put at the centre of care with clinicians, skills, equipment and supplies drawn towards them in line with acuity changes as needed. Applying the Lean principles using the 3P method helped stakeholders to create an innovative design in line with the aspirations and objectives of the National Maternity Review. The case provides a practical example of stakeholders applying the Lean principles to maternity services and demonstrates the potential applicability of the Lean 3P approach to design healthcare services in line with policy requirements.

  10. First-principles Electronic Structure Calculations for Scintillation Phosphor Nuclear Detector Materials

    NASA Astrophysics Data System (ADS)

    Canning, Andrew

    2013-03-01

    Inorganic scintillation phosphors (scintillators) are extensively employed as radiation detector materials in many fields of applied and fundamental research such as medical imaging, high energy physics, astrophysics, oil exploration and nuclear materials detection for homeland security and other applications. The ideal scintillator for gamma ray detection must have exceptional performance in terms of stopping power, luminosity, proportionality, speed, and cost. Recently, trivalent lanthanide dopants such as Ce and Eu have received greater attention for fast and bright scintillators as the optical 5d to 4f transition is relatively fast. However, crystal growth and production costs remain challenging for these new materials, so there is still a need for new higher performing scintillators that meet the needs of the different application areas. First principles calculations can provide useful insight into the chemical and electronic properties of such materials and hence can aid in the search for better new scintillators. In the past there has been little first-principles work done on scintillator materials, in part because it means modeling f electrons in lanthanides as well as complex excited state and scattering processes. In this talk I will give an overview of the scintillation process and show how first-principles calculations can be applied to such systems to gain a better understanding of the physics involved. I will also present work on a high-throughput first principles approach to select new scintillator materials for fabrication, as well as more detailed calculations of trapping processes that can limit their brightness. This work, in collaboration with experimental groups, has led to the discovery of some new bright scintillators. Work supported by the U.S. Department of Homeland Security and carried out under U.S. Department of Energy Contract no. DE-AC02-05CH11231 at Lawrence Berkeley National Laboratory.

  11. A novel approach to support formulation design on twin screw wet granulation technology: Understanding the impact of overarching excipient properties on drug product quality attributes.

    PubMed

    Willecke, N; Szepes, A; Wunderlich, M; Remon, J P; Vervaet, C; De Beer, T

    2018-04-21

    The overall objective of this work is to understand how excipient characteristics influence the drug product quality attributes and process performance of a continuous twin screw wet granulation process. The knowledge gained in this study is intended to be used for Quality by Design (QbD)-based formulation design and formulation optimization. Three principal components representing the overarching properties of 8 selected pharmaceutical fillers were used as factors, while factors 4 and 5 represented binder type and binder concentration in a design of experiments (DoE). The majority of process parameters were kept constant to minimize their influence on granule and drug product quality. 27 DoE batches consisting of binary filler/binder mixtures were processed via continuous twin screw wet granulation followed by tablet compression. Multiple linear regression models were built, providing understanding of the impact of filler and binder properties on granule and tablet quality attributes (i.e. 16 DoE responses). The impact of the fillers on the granule and tablet responses was more dominant than the impact of binder type and concentration. The filler properties had a relevant effect on granule characteristics such as particle size, friability and specific surface area. Binder type and concentration revealed a relevant influence on granule flowability and friability as well as on compactability (the compression force required during tableting to obtain target hardness). In order to evaluate the DoE models' validity, the models were verified with new formulations (i.e. new combinations of filler, binder type and binder concentration) not included in the dataset used to build the models. The combined PCA (principal component analysis)/DoE approach allowed the excipient properties to be linked with the drug product quality attributes. Copyright © 2018 Elsevier B.V. All rights reserved.
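
    The multiple-linear-regression step of such a combined PCA/DoE analysis can be sketched as follows. The data are hypothetical; in the study, the predictors would be principal-component scores of the filler properties together with the binder factors.

```python
# Hypothetical predictor pairs (e.g. two PCA scores) and a granule
# response (e.g. median particle size, scaled). Data chosen so the
# exact fit is y = 0 + 2*x1 + 1*x2.
xs = [(0.0, 1.0), (1.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
ys = [1.0, 2.0, 3.0, 6.0]

def fit_ols(xs, ys):
    """Ordinary least squares for y = b0 + b1*x1 + b2*x2 via the
    3x3 normal equations, solved with Gaussian elimination
    (pure Python, no dependencies)."""
    rows = [[1.0, x1, x2] for x1, x2 in xs]
    n = 3
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    atb = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(n)]
    m = [ata[i] + [atb[i]] for i in range(n)]  # augmented matrix
    for c in range(n):                         # forward elimination
        piv = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[piv] = m[piv], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    beta = [0.0] * n                           # back substitution
    for i in reversed(range(n)):
        beta[i] = (m[i][n] - sum(m[i][j] * beta[j]
                                 for j in range(i + 1, n))) / m[i][i]
    return beta

beta = fit_ols(xs, ys)   # -> approximately [0.0, 2.0, 1.0]
```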

  12. Public Relations for Brazilian Libraries: Process, Principles, Program Planning, Planning Techniques and Suggestions.

    ERIC Educational Resources Information Center

    Kies, Cosette N.

    A brief overview of the functions of public relations in libraries introduces this manual, which provides an explanation of the public relations (PR) process, including fact-finding, planning, communicating, evaluating, and marketing; some PR principles; a 10-step program that could serve as a model for planning a PR program; a discussion of PR…

  13. Assessment Methodology for Process Validation Lifecycle Stage 3A.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana

    2017-07-01

    The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small-molecule as well as bio-pharma products where substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: determination of the number of Stage 3A batches; evaluation of critical material attributes, critical process parameters, and critical quality attributes; in vivo-in vitro correlation; estimation of inherent process variability (IPV) and the PaCS index; process capability and quality dashboard (PCQd); and enhanced control strategy. US FDA guidance on Process Validation: General Principles and Practices, January 2011, encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. Elements of the 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.
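
    The process-capability element mentioned above can be sketched with the standard Cpk computation; batch data and specification limits below are hypothetical, and real Stage 3A assessments apply additional statistical criteria:

```python
import statistics

def cpk(values, lsl, usl):
    """Process capability index:
    Cpk = min(USL - mean, mean - LSL) / (3 * s), with s the sample
    standard deviation. Cpk >= 1.33 is a common capability target."""
    mean = statistics.mean(values)
    s = statistics.stdev(values)  # sample (n-1) standard deviation
    return min(usl - mean, mean - lsl) / (3.0 * s)

# Hypothetical assay results (% label claim), specs 95.0-105.0
assays = [99.8, 100.2, 100.0, 99.6, 100.4, 100.1, 99.9, 100.0]
capability = cpk(assays, 95.0, 105.0)
```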

  14. The NASA planning process, appendix D. [as useful planning approach for solving urban problems

    NASA Technical Reports Server (NTRS)

    Annett, H. A.

    1973-01-01

    The planning process is outlined which NASA used in making some fundamental post-Apollo decisions concerning the reusable space shuttle and the orbiting laboratory. It is suggested that the basic elements and principles of the process, when combined, form a useful planning approach for solving urban problems. These elements and principles are defined along with the basic strengths of the planning model.

  15. Continuous production of fenofibrate solid lipid nanoparticles by hot-melt extrusion technology: a systematic study based on a quality by design approach.

    PubMed

    Patil, Hemlata; Feng, Xin; Ye, Xingyou; Majumdar, Soumyajit; Repka, Michael A

    2015-01-01

    This contribution describes a continuous process for the production of solid lipid nanoparticles (SLN) as drug-carrier systems via hot-melt extrusion (HME). To date, HME technology has not been used for the manufacture of SLN. Generally, SLN are prepared by a batch process, which is time consuming and may result in variability of end-product quality attributes. In this study, using Quality by Design (QbD) principles, we were able to achieve continuous production of SLN by combining two processes: HME technology for melt-emulsification and high-pressure homogenization (HPH) for size reduction. Fenofibrate (FBT), a poorly water-soluble model drug, was incorporated into SLN using the HME-HPH method. The developed novel platform demonstrated better process control and size reduction compared to the conventional process of hot homogenization (a batch process). Varying the process parameters enabled the production of SLN below 200 nm. The dissolution profile of the FBT SLN prepared by the novel HME-HPH method was faster than that of the crude FBT and a micronized marketed FBT formulation. At the end of a 5-h in vitro dissolution study, the SLN formulation had released 92-93% of the drug, whereas drug release was approximately 65 and 45% for the marketed micronized formulation and crude drug, respectively. Pharmacokinetic study results also demonstrated a statistically significant increase in Cmax, Tmax, and AUC0-24h for drug absorption from the SLN formulations compared with the crude drug and the marketed micronized formulation. In summary, the present study demonstrated the potential use of hot-melt extrusion technology for continuous and large-scale production of SLN.

  16. Optimal Control Inventory Stochastic With Production Deteriorating

    NASA Astrophysics Data System (ADS)

    Affandi, Pardi

    2018-01-01

    In this paper, we use an optimal control approach to determine the optimal production rate. Most production-inventory models deal with a single item. We first build the stochastic mathematical model of the inventory, assuming that all items are held in the same store. The mathematical model of the inventory problem can be deterministic or stochastic; this research discusses how to formulate the stochastic model and how to solve it using optimal control techniques. The main tool for deriving the necessary optimality conditions is the Pontryagin maximum principle, which involves the Hamiltonian function. From these conditions we obtain the optimal production rate in a production-inventory system in which items are subject to deterioration.
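    A sketch of the standard Hamiltonian setup for such a production-inventory problem, in the usual Sethi-Thompson style (the cost form and symbols are illustrative, not necessarily the paper's exact formulation):

    ```latex
    % Inventory I(t) with production rate P(t), demand D(t), deterioration rate \theta:
    %   dI/dt = P(t) - D(t) - \theta I(t)
    % Quadratic cost around target levels \hat{I}, \hat{P}:
    \min_{P(\cdot)} \int_0^T \left[ h\,(I - \hat{I})^2 + c\,(P - \hat{P})^2 \right] dt
    % Hamiltonian with adjoint variable \lambda(t):
    H = -h\,(I - \hat{I})^2 - c\,(P - \hat{P})^2 + \lambda\,(P - D - \theta I)
    % Maximum principle: \partial H / \partial P = 0 and d\lambda/dt = -\partial H / \partial I give
    P^*(t) = \hat{P} + \frac{\lambda(t)}{2c},
    \qquad
    \frac{d\lambda}{dt} = 2h\,(I - \hat{I}) + \theta\,\lambda
    ```

    The stochastic version replaces the inventory dynamics with a stochastic differential equation, but the Hamiltonian structure of the optimality conditions is the same.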

  17. Design of freeze-drying processes for pharmaceuticals: practical advice.

    PubMed

    Tang, Xiaolin; Pikal, Michael J

    2004-02-01

    Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.
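    The target-product-temperature guidelines rest on a steady-state coupling of vial heat transfer and sublimation mass transfer. A sketch in common freeze-drying notation (symbols follow general usage, not necessarily the paper's exact notation):

    ```latex
    % Sublimation rate limited by the product resistance R_p (A_p = product area;
    % P_{ice}(T_p) = vapor pressure of ice at product temperature, P_c = chamber pressure):
    \frac{dm}{dt} = \frac{A_p \left( P_{ice}(T_p) - P_c \right)}{R_p}
    % Heat delivered through the vial (K_v = vial heat-transfer coefficient):
    \frac{dQ}{dt} = A_v\, K_v\, (T_{shelf} - T_p)
    % Steady state: heat input balances sublimation cooling (\Delta H_s = heat of sublimation):
    \Delta H_s \,\frac{dm}{dt} = \frac{dQ}{dt},
    \qquad \text{with } T_p \text{ held safely below } T_c \text{ (or } T_g' \text{)}
    ```

    Shelf temperature and chamber pressure are the two handles available to satisfy this balance while keeping the product temperature at its target.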

  18. A new method of search design of refrigerating systems containing a liquid and gaseous working medium based on the graph model of the physical operating principle

    NASA Astrophysics Data System (ADS)

    Yakovlev, A. A.; Sorokin, V. S.; Mishustina, S. N.; Proidakova, N. V.; Postupaeva, S. G.

    2017-01-01

    The article describes a new method for the search design of refrigerating systems, based on a graph model of the physical operating principle derived from a thermodynamic description of physical processes. The mathematical model of the physical operating principle is substantiated, and the basic abstract theorems concerning the semantic load assigned to the nodes and edges of the graph are presented. The necessity and sufficiency of the physical operating principle for the given model and the considered device class are demonstrated using the example of a vapour-compression refrigerating plant. An example of deriving a set of engineering solutions for a vapour-compression refrigerating plant is also considered.

  19. On the analysis of complex biological supply chains: From Process Systems Engineering to Quantitative Systems Pharmacology.

    PubMed

    Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P

    2017-12-05

    The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems and for quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.

  20. GOBF-ARMA based model predictive control for an ideal reactive distillation column.

    PubMed

    Seban, Lalu; Kirubakaran, V; Roy, B K; Radhakrishnan, T K

    2015-11-01

    This paper discusses the control of an ideal reactive distillation column (RDC) using model predictive control (MPC) based on a combination of deterministic generalized orthonormal basis filter (GOBF) and stochastic autoregressive moving average (ARMA) models. Reactive distillation (RD) integrates reaction and distillation in a single process, resulting in process and energy integration and promoting green chemistry principles. Improved selectivity of products, increased conversion, better utilization and control of reaction heat, scope for difficult separations and the avoidance of azeotropes are some of the advantages that reactive distillation offers over the conventional arrangement of a distillation column downstream of a reactor. The introduction of an in situ separation in the reaction zone leads to complex interactions between vapor-liquid equilibrium, mass transfer rates, diffusion and chemical kinetics. RD, with its high-order, nonlinear dynamics and multiple steady states, is a good candidate for testing and verification of new control schemes. Here a combination of GOBF-ARMA models is used to capture and represent the dynamics of the RDC. This GOBF-ARMA model is then used to design an MPC scheme for the control of product purity of the RDC under different operating constraints and conditions. The performance of the proposed modeling and control using GOBF-ARMA based MPC is simulated and analyzed. The proposed controller is found to perform satisfactorily for reference tracking and disturbance rejection in the RDC. Copyright © 2015 Elsevier Inc. All rights reserved.
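    Setting the GOBF-ARMA identification aside, the receding-horizon idea behind MPC can be illustrated with a scalar linear model. The model coefficients, setpoint, and one-step horizon below are purely illustrative, not the paper's formulation:

    ```python
    def mpc_step(x, r, a=0.9, b=0.5, lam=0.1):
        """One-step-ahead MPC for the model x[k+1] = a*x[k] + b*u[k]:
        minimize (x[k+1] - r)**2 + lam*u**2 over u, which has the
        closed-form minimizer returned below."""
        return b * (r - a * x) / (b**2 + lam)

    # Closed-loop simulation toward a purity setpoint r = 1.0: at each step
    # the optimization is re-solved from the current state (receding horizon).
    x, r = 0.0, 1.0
    for _ in range(50):
        u = mpc_step(x, r)
        x = 0.9 * x + 0.5 * u
    print(round(x, 3))
    ```

    The control penalty `lam` leaves a small steady-state offset from the setpoint; a full MPC adds a longer horizon, constraints, and (as in the paper) a disturbance model such as the ARMA term.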

  1. Principles and processes for effecting change in environmental management in New Zealand.

    PubMed

    Valentine, Ian; Hurley, Evelyn; Reid, Janet; Allen, Will

    2007-02-01

    In New Zealand, environmental management is essentially the responsibility of land managers. Management decisions affect both production/productivity and the environment. However, responsibility for ensuring positive environmental outcomes falls on both local (Regional) and Central Government, and both they and international agencies such as the OECD wish to monitor and report on changes. In terms of policy, strong links have been established via Central and Regional Government to land managers. Consumers in the marketplace are also increasingly demanding responsibility for positive environmental outcomes from those who purchase and process primary products. Strong links for responsibility have been established between our international markets and processing businesses, and there is a noticeable strengthening of the links from the processors to the land manager/producer. In New Zealand, a range of initiatives has been developed and implemented in recent times, whereby land managers are taking increasing responsibility for accounting for the environmental outcomes of their production activities. The range covers the spectrum from voluntary to compulsory (e.g., in order to meet market requirements) and from those initiated by customers to processor and/or producer initiatives. This paper follows the evolution of the principles that drove the predominant activities of the period and the processes that initiated the changes in environmental management. As the focus of agriculturalists changed from pioneering in a new world, to establishing a production base, to economic reality, and finally to environmental responsibility, the processes of extension adapted to meet the new challenge.

  2. Aeroacoustic and aerodynamic applications of the theory of nonequilibrium thermodynamics

    NASA Technical Reports Server (NTRS)

    Horne, W. Clifton; Smith, Charles A.; Karamcheti, Krishnamurty

    1991-01-01

    Recent developments in the field of nonequilibrium thermodynamics associated with viscous flows are examined and related to the understanding of specific phenomena in aerodynamics and aeroacoustics. A key element of the nonequilibrium theory is the principle of minimum entropy production rate for steady dissipative processes near equilibrium, and variational calculus is used to apply this principle to several examples of viscous flow. A review of nonequilibrium thermodynamics and its role in fluid motion is presented. Several formulations of the local entropy production rate and the local energy dissipation rate, two quantities that are of central importance to the theory, are presented. These expressions and the principle of minimum entropy production rate for steady viscous flows are used to identify parallel-wall channel flow and irrotational flow as having minimally dissipative velocity distributions. Features of irrotational, steady, viscous flow near an airfoil, such as the effect of trailing-edge radius on circulation, are also found to be compatible with the minimum principle. Finally, the minimum principle is used to interpret the stability of infinitesimal and finite amplitude disturbances in an initially laminar, parallel shear flow, with results that are consistent with experiment and linearized hydrodynamic stability theory. These results suggest that a thermodynamic approach may be useful in unifying the understanding of many diverse phenomena in aerodynamics and aeroacoustics.
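    For the parallel-wall channel case mentioned above, the minimum-entropy-production argument can be sketched as a constrained variational problem (a schematic reconstruction, not the paper's exact derivation):

    ```latex
    % Entropy production rate per unit length for a parallel flow u(y) at temperature T:
    \dot{S} = \int \frac{\mu}{T} \left( \frac{du}{dy} \right)^{2} dy
    % Minimize subject to a fixed volume flow rate Q = \int u \, dy,
    % introducing a Lagrange multiplier \Lambda:
    \delta \int \left[ \frac{\mu}{T} \left( \frac{du}{dy} \right)^{2} - \Lambda\, u \right] dy = 0
    % Euler--Lagrange equation (constant \mu and T):
    \frac{2\mu}{T} \frac{d^{2} u}{dy^{2}} = -\Lambda
    % A constant second derivative gives a parabolic (Poiseuille) profile:
    % the minimally dissipative velocity distribution for the given flow rate.
    ```

    The same stationarity argument, applied to the full dissipation functional, underlies the irrotational-flow result cited in the abstract.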

  3. Equipment characterization to mitigate risks during transfers of cell culture manufacturing processes.

    PubMed

    Sieblist, Christian; Jenzsch, Marco; Pohlscheidt, Michael

    2016-08-01

    The production of monoclonal antibodies by mammalian cell culture in bioreactors up to 25,000 L is state-of-the-art technology in the biotech industry. During the lifecycle of a product, several scale-up activities and technology transfers are typically executed to enable the supply chain strategy of a global pharmaceutical company. Given the sensitivity of mammalian cells to physicochemical culture conditions, process and equipment knowledge are critical to avoid impacts on timelines, product quantity and quality. In particular, the fluid dynamics of large-scale bioreactors versus small-scale models need to be described, and similarity demonstrated, in light of the Quality by Design approach promoted by the FDA. This approach comprises an associated design space which is established during process characterization and validation in bench-scale bioreactors. Therefore, the establishment of predictive models and simulation tools for the major operating conditions of stirred vessels (mixing, mass transfer, and shear force), based on fundamental engineering principles, has experienced a renaissance in recent years. This work illustrates the systematic characterization of a large variety of bioreactor designs deployed in a global manufacturing network, ranging from small bench-scale equipment to large-scale production equipment (25,000 L). Several traditional methods to determine power input, mixing, mass transfer and shear force have been used to create a database and identify differences between various impeller types and configurations in the operating ranges typically applied in cell culture processes at manufacturing scale. In addition, extrapolation of different empirical models, e.g. Cooke et al. (Paper presented at the proceedings of the 2nd international conference of bioreactor fluid dynamics, Cranfield, UK, 1988), has been assessed for validity in these operational ranges. Results for selected designs are shown and serve as examples of structured characterization to enable fast and agile process transfers, scale up and troubleshooting.
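    The "fundamental engineering principles" referred to here are typically the standard stirred-tank relations. A sketch in conventional notation (not quoted from the paper):

    ```latex
    % Ungassed power draw of an impeller
    % (N_P = power number, \rho = broth density, N = stirrer speed, D = impeller diameter):
    P = N_P\, \rho\, N^{3} D^{5}
    % Common scale-up and comparison quantities:
    \text{specific power } P/V, \qquad \text{tip speed } v_{tip} = \pi N D
    % Oxygen mass transfer is often correlated empirically with specific power
    % and superficial gas velocity v_s (C, \alpha, \beta fitted per vessel):
    k_L a = C \left( \frac{P}{V} \right)^{\alpha} v_s^{\beta}
    ```

    Holding one of these quantities constant across scales (most often P/V) while measuring the others is the usual basis for the cross-scale comparisons described in the abstract.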

  4. 77 FR 70991 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-28

    ... laws and principles underlying the basic problems of agriculture in its broadest aspects, including but... methods of the production, marketing, distribution, processing, and utilization of plant and animal...

  5. Faculty Opinions Regarding the Philosophical Principles of Total Quality Management (TQM).

    ERIC Educational Resources Information Center

    Aliff, John Vincent

    The 14 points of the Total Quality Management (TQM) model can be distilled into the following 5 main guiding principles: establish a moral purpose for the institution, use cooperative efforts instead of individual efforts, stop the use of inspection (testing) to improve students and teachers, continuously improve the system and its products, and…

  6. Using Participatory Action Research to Develop a Working Model That Enhances Psychiatric Nurses' Professionalism: The Architecture of Stability.

    PubMed

    Salzmann-Erikson, Martin

    2017-11-01

    Ward rules in psychiatric care aim to promote safety for both patients and staff. Simultaneously, ward rules are associated with increased patient violence, leading to neither a safe work environment nor a safe caring environment. Although ward rules are routinely used, few studies have explicitly accounted for their impact. To describe the process of a team development project considering ward rule issues, and to develop a working model to empower staff in their daily in-patient psychiatric nursing practices. The design of this study is explorative and descriptive. Participatory action research methodology was applied to understand ward rules. Data consists of audio-recorded group discussions, observations and field notes, together creating a data set of 556 text pages. More than 100 specific ward rules were identified. In this process, the word rules was relinquished in favor of adopting the term principles, since rules are inconsistent with a caring ideology. A linguistic transition led to the development of a framework embracing the (1) Principle of Safety, (2) Principle of Structure and (3) Principle of Interplay. The principles were linked to normative guidelines and applied ethical theories: deontology, consequentialism and ethics of care. The work model reminded staff about the principles, empowered their professional decision-making, decreased collegial conflicts because of increased acceptance for individual decisions, and, in general, improved well-being at work. Furthermore, the work model also empowered staff to find support for their decisions based on principles that are grounded in the ethics of totality.

  7. Integrating MRP (materiel requirements planning) II and JIT to achieve world-class status.

    PubMed

    Titone, R C

    1994-05-01

    The concepts and principles of using manufacturing resource planning (MRP II) for planning are not new. Their success has been proven in numerous manufacturing companies in America. The concepts and principles of using just-in-time (JIT) inventory for execution, while more recent, have also been available for some time, and their success in Japan is well documented. However, it is the effective integration of these two powerful tools that opens the way to achieving world-class manufacturing status. This article utilizes a newly developed world-class manufacturing model, which reviews the aspects of planning, beginning with a business plan, moving through the production planning process, and culminating with a master schedule that drives a materiel/capacity plan. The importance and interrelationship of these functions are reviewed. The model then illustrates the important aspects of executing these plans, beginning with people issues and continuing through total quality control (TQC) and pull systems. We then utilize this new functional model to demonstrate the relationship between these various functions and the importance of integrating them with a comprehensive manufacturing strategy that will lead to world-class manufacturing and profits.

  8. Spectral Generation from the Ames Mars GCM for the Study of Martian Clouds

    NASA Astrophysics Data System (ADS)

    Klassen, David R.; Kahre, Melinda A.; Wolff, Michael J.; Haberle, Robert; Hollingsworth, Jeffery L.

    2017-10-01

    Studies of martian clouds come from two distinct groups of researchers: those modeling the martian system from first principles and those observing Mars from ground-based and orbital platforms. The model view begins with global circulation models (GCMs) or mesoscale models to track a multitude of state variables over a prescribed set of spatial and temporal resolutions. The state variables can then be processed into distinct maps of derived product variables, such as integrated optical depth of aerosol (e.g., water ice cloud, dust) or column-integrated water vapor, for comparison to observational results. The observer view begins, typically, with spectral images or imaging spectra, calibrated to some form of absolute units and then run through some form of radiative transfer model to also produce distinct maps of derived product variables. Both groups of researchers work to adjust model parameters and assumptions until some level of agreement in derived product variables is achieved. While this system appears to work well, it is in some sense only an implicit confirmation of the model assumptions that underlie the work on both sides. We have begun a project of testing the NASA Ames Mars GCM and key aerosol model assumptions more directly by taking the model output and creating synthetic TES spectra from it for comparison to actual raw-reduced TES spectra. We will present some preliminary generated GCM spectra and TES comparisons.

  9. Lexical access in sign language: a computational model.

    PubMed

    Caselli, Naomi K; Cohen-Goldberg, Ariel M

    2014-01-01

    Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: how many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition.
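    The handshape/location neighborhood effect described above lends itself to a toy spreading-activation sketch. The lexicon, feature names, and weights below are invented for illustration and are not taken from Chen and Mirman's model or the authors':

    ```python
    # Minimal spreading-activation sketch: a target sign activates its
    # sub-lexical units (handshape, location), which feed activation back
    # to every sign sharing them. All entries are invented examples.
    LEXICON = {
        "APPLE": {"handshape": "X", "location": "cheek"},
        "ONION": {"handshape": "X", "location": "eye"},
        "CANDY": {"handshape": "1", "location": "cheek"},
    }

    def activations(target, unit_weight=0.5):
        """One spreading cycle: target sign -> its units -> back to all signs."""
        units = set(LEXICON[target].values())
        act = {}
        for sign, feats in LEXICON.items():
            shared = units & set(feats.values())
            act[sign] = (1.0 if sign == target else 0.0) + unit_weight * len(shared)
        return act

    act = activations("APPLE")
    # Signs sharing a handshape or location with the target receive feedback,
    # so they compete during recognition (the neighborhood density effect).
    print(sorted(act, key=act.get, reverse=True))
    ```

    Weighting the feedback by sub-lexical unit frequency, or staggering when handshape versus location becomes available over time, is the kind of elaboration the abstract describes.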

  10. "Emancipatory Disability Research": Project or Process?

    ERIC Educational Resources Information Center

    Barnes, Colin

    2002-01-01

    This article provides an overview of the core principles and implications of emancipatory disability research. It suggests the emancipatory research paradigm has begun to transform the material and social relations of research production and concludes by suggesting that emancipatory disability should be perceived as a process rather than a…

  11. Template for success: using a resident-designed sign-out template in the handover of patient care.

    PubMed

    Clark, Clancy J; Sindell, Sarah L; Koehler, Richard P

    2011-01-01

    Report our implementation of a standardized handover process in a general surgery residency program. The standardized handover process, sign-out template, method of implementation, and continuous quality improvement process were designed by general surgery residents with support of faculty and senior hospital administration using standard work principles and business models of the Virginia Mason Production System and the Toyota Production System. Nonprofit, tertiary referral teaching hospital. General surgery residents, residency faculty, patient care providers, and hospital administration. After instruction in quality improvement initiatives, a team of general surgery residents designed a sign-out process using an electronic template and standard procedures. The initial implementation phase resulted in 73% compliance. Using resident-driven continuous quality improvement processes, real-time feedback enabled residents to modify and improve this process, eventually attaining 100% compliance and acceptance by residents. The creation of a standardized template and protocol for patient handovers might eliminate communication failures. Encouraging residents to participate in this process can establish the groundwork for successful implementation of a standardized handover process. Integrating a continuous quality-improvement process into such an initiative can promote active participation of busy general surgery residents and lead to successful implementation of standard procedures. Copyright © 2011 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  12. A new theoretical approach to terrestrial ecosystem science based on multiscale observations and eco-evolutionary optimality principles

    NASA Astrophysics Data System (ADS)

    Prentice, Iain Colin; Wang, Han; Cornwell, William; Davis, Tyler; Dong, Ning; Evans, Bradley; Keenan, Trevor; Peng, Changhui; Stocker, Benjamin; Togashi, Henrique; Wright, Ian

    2016-04-01

    Ecosystem science focuses on biophysical interactions of organisms and their abiotic environment, and comprises vital aspects of Earth system function such as the controls of carbon, water and energy exchanges between ecosystems and the atmosphere. Global numerical models of these processes have proliferated, and have been incorporated as standard components of Earth system models whose ambitious goal is to predict the coupled behaviour of the oceans, atmosphere and land on time scales from minutes to millennia. Unfortunately, however, the performance of most current terrestrial ecosystem models is highly unsatisfactory. Models typically fail the most basic observational benchmarks, and diverge greatly from one another when called upon to predict the response of ecosystem function and composition to environmental changes beyond the narrow range for which they were developed. This situation seems to have arisen for two inter-related reasons. First, general principles underlying many basic terrestrial biogeochemical processes have been neither clearly formulated nor adequately tested. Second, extensive observational data sets that could be used to test process formulations have become available only quite recently, long postdating the emergence of the current modelling paradigm. But the situation has changed now and ecosystem science needs to change too, to reflect both recent theoretical advances and the vast increase in the availability of relevant data sets at scales from the leaf to the globe. This presentation will outline an emerging mathematical theory that links biophysical plant and ecosystem processes through testable hypotheses derived from the principle of optimization by natural selection. The development and testing of this theory has depended on the availability of extensive data sets on climate, leaf traits (including δ13C measurements), and ecosystem properties including green vegetation cover and land-atmosphere CO2 fluxes. 
Achievements to date include unified explanations for observed climate and elevation effects on leaf CO2 drawdown (the ci:ca ratio) and photosynthetic capacity (Vcmax), growth temperature effects on the Jmax:Vcmax ratio, the adaptive nature of acclimation to enhanced CO2 concentration, the controls of leaf versus sapwood respiration, the controls of leaf N content (Narea), the relative constancy of the light use efficiency of gross primary production, and the relative conservatism of leaf dark respiration with climate. These findings call into question many assumptions in supposed "state-of-the-art" terrestrial ecosystem models, and provide a foundation for next-generation global ecosystem models that will rest on a greatly strengthened theoretical and empirical basis.
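    The ci:ca result mentioned above comes from a least-cost optimality criterion. A sketch of the form published in closely related work by this group (symbols as commonly defined in that literature; reproduced here as an illustration, not quoted from this abstract):

    ```latex
    % Optimal ratio of leaf-internal to ambient CO2 (\chi = c_i / c_a):
    \chi = \frac{\xi}{\xi + \sqrt{D}},
    \qquad
    \xi = \sqrt{\frac{\beta \,(K + \Gamma^{*})}{1.6\,\eta^{*}}}
    % D        : vapour pressure deficit
    % K        : effective Michaelis--Menten coefficient of Rubisco (rises with temperature)
    % \Gamma^* : photorespiratory CO2 compensation point
    % \eta^*   : viscosity of water relative to 25\,^{\circ}\mathrm{C} (declines with temperature)
    % \beta    : ratio of the unit costs of carboxylation and transpiration capacity
    ```

    Temperature and elevation enter only through K, Gamma* and eta*, which is how a single optimality criterion yields the observed climate and elevation effects on CO2 drawdown.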

  13. Practical research on the teaching of Optical Design

    NASA Astrophysics Data System (ADS)

    Fan, Changjiang; Ren, Zhijun; Ying, Chaofu; Peng, Baojin

    2017-08-01

    Optical design, together with applied optics, forms a complete system from basic theory to application, and it plays a very important role in professional education. In order to improve senior undergraduates' understanding of optical design, this course is divided into three parts: theoretical knowledge, software design and product processing. Through learning theoretical knowledge, students can master aberration theory and the design principles of typical optical systems. By using ZEMAX (an imaging design software), TRACEPRO (an illumination design software), and SOLIDWORKS or PROE (mechanical design software), students can establish a complete model of an optical system. Students can then use the carving machine located in the lab, or cooperative units, to fabricate the model. Through these three parts, students acquire the necessary practical knowledge and improve their learning and analytical abilities; the practice also fosters their creative abilities, so that they gradually develop from learners of scientific theory into optical engineers.

  14. A Model to Measure Bombardier/Navigator Performance during Radar Navigation in Device 2F114, A-6E Weapon System Trainer.

    DTIC Science & Technology

    1981-03-01

    systems, subsystems, equipment, weapons, tactics, missions, etc. Concepts and Principles - Fundamental truths, ideas, opinions and thoughts formed from...verification, etc. Grasping the meaning of concepts and principles, i.e., understanding the basic principles of infrared and radar detection. Understanding...concepts, principles, procedures, etc.). Analysis - A demonstration of a learned process of breaking down material (i.e., data, other information) into

  15. Meaningful questions: The acquisition of auxiliary inversion in a connectionist model of sentence production.

    PubMed

    Fitz, Hartmut; Chang, Franklin

    2017-09-01

    Nativist theories have argued that language involves syntactic principles which are unlearnable from the input children receive. A paradigm case of these innate principles is the structure dependence of auxiliary inversion in complex polar questions (Chomsky, 1968, 1975, 1980). Computational approaches have focused on the properties of the input in explaining how children acquire these questions. In contrast, we argue that messages are structured in a way that supports structure dependence in syntax. We demonstrate this approach within a connectionist model of sentence production (Chang, 2009) which learned to generate a range of complex polar questions from a structured message without positive exemplars in the input. The model also generated different types of error in development that were similar in magnitude to those in children (e.g., auxiliary doubling, Ambridge, Rowland, & Pine, 2008; Crain & Nakayama, 1987). Through model comparisons we trace how meaning constraints and linguistic experience interact during the acquisition of auxiliary inversion. Our results suggest that auxiliary inversion rules in English can be acquired without innate syntactic principles, as long as it is assumed that speakers who ask complex questions express messages that are structured into multiple propositions. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Formulaic Language in Alzheimer’s Disease

    PubMed Central

    Bridges, Kelly Ann; Van Lancker Sidtis, Diana

    2013-01-01

    Background: Studies of productive language in Alzheimer’s disease (AD) have focused on formal testing of syntax and semantics but have directed less attention to naturalistic discourse and formulaic language. Clinical observations suggest that individuals with AD retain the ability to produce formulaic language long after other cognitive abilities have deteriorated. Aims: This study quantifies production of formulaic expressions in the spontaneous speech of individuals with AD. Persons with early- and late-onset forms of the disease were compared. Methods & Procedures: Conversational language samples of individuals with early- (n = 5) and late-onset (n = 6) AD and healthy controls (n = 5) were analyzed to determine whether formulaic language, as measured by the number of words in formulaic expressions, differs between groups. Outcomes & Results: Results indicate that individuals with AD, regardless of age of onset, used significantly more formulaic expressions than healthy controls. The early- and late-onset AD groups did not differ on formulaic language measures. Conclusions: These findings contribute to a dual process model of cerebral function, which proposes differing processing principles for formulaic and novel expressions. In this model, subcortical areas, which remain intact until late in the progression of Alzheimer’s disease, play an important role in the production of formulaic language. Applications to clinical practice include identifying preserved formulaic language and providing informed counseling to patient and family. PMID:24187417

  17. Deception and Cognitive Load: Expanding Our Horizon with a Working Memory Model

    PubMed Central

    Sporer, Siegfried L.

    2016-01-01

    Recently, studies on deception and its detection have increased dramatically. Many of these studies rely on the “cognitive load approach” as the sole explanatory principle to understand deception. These studies have been exclusively on lies about negative actions (usually lies of suspects of [mock] crimes). Instead, we need to re-focus more generally on the cognitive processes involved in generating both lies and truths, not just on manipulations of cognitive load. Using Baddeley’s (2000, 2007, 2012) working memory model, which integrates verbal and visual processes in working memory with retrieval from long-term memory and control of action, not only verbal content cues but also nonverbal, paraverbal, and linguistic cues can be investigated within a single framework. The proposed model considers long-term semantic, episodic and autobiographical memory and their connections with working memory and action. It also incorporates ironic processes of mental control (Wegner, 1994, 2009), the role of scripts and schemata and retrieval cues and retrieval processes. Specific predictions of the model are outlined and support from selective studies is presented. The model is applicable to different types of reports, particularly about lies and truths about complex events, and to different modes of production (oral, hand-written, typed). Predictions regarding several moderator variables and methods to investigate them are proposed. PMID:27092090

  19. Inactivated polio vaccine development for technology transfer using attenuated Sabin poliovirus strains to shift from Salk-IPV to Sabin-IPV.

    PubMed

    Bakker, Wilfried A M; Thomassen, Yvonne E; van't Oever, Aart G; Westdijk, Janny; van Oijen, Monique G C T; Sundermann, Lars C; van't Veld, Peter; Sleeman, Eelco; van Nimwegen, Fred W; Hamidi, Ahd; Kersten, Gideon F A; van den Heuvel, Nico; Hendriks, Jan T; van der Pol, Leo A

    2011-09-22

    Industrial-scale inactivated polio vaccine (IPV) production dates back to the 1960s, when a process based on micro-carrier technology and primary monkey kidney cells was developed at the Rijks Instituut voor de Volksgezondheid (RIV) in Bilthoven. This technology was freely shared with several pharmaceutical companies and institutes worldwide. In this contribution, the history of one of the first cell-culture based large-scale biological production processes is summarized, together with recent developments and the anticipated shift from regular IPV to Sabin-IPV. Responding to a call by the World Health Organization (WHO) for new polio vaccines, the development of Sabin-IPV was continued at the Netherlands Vaccine Institute (NVI), after proof of principle had been demonstrated in the 1990s. Development of Sabin-IPV plays an important role in the WHO polio eradication strategy, as biocontainment will be critical in the post-OPV cessation period. The use of attenuated Sabin strains instead of wild-type Salk polio strains will provide additional safety during vaccine production. Initially, the Sabin-IPV production process will be based on a scale-down model of the current, well-established Salk-IPV process. In parallel to clinical trial material production, process development, optimization and formulation research is being carried out to further optimize the process and reduce the cost per dose. Results are also shown from the large-scale generation of Master and Working virus seed lots (in preparation for future technology transfer) and from the production of clinical trial material for phase I studies. Finally, the planned technology transfer to vaccine manufacturers in low- and middle-income countries is discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. The lean service machine.

    PubMed

    Swank, Cynthia Karen

    2003-10-01

    Jefferson Pilot Financial, a life insurance and annuities firm, like many U.S. service companies at the end of the 1990s was looking for new ways to grow. Its top managers recognized that JPF needed to differentiate itself in the eyes of its customers, the independent life-insurance advisers who sell and service policies. To establish itself as these advisers' preferred partner, it set out to reduce the turnaround time on policy applications, simplify the submission process, and reduce errors. JPF's managers looked to the "lean production" practices that U.S. manufacturers adopted in response to competition from Japanese companies. Lean production is built around the concept of continuous-flow processing--a departure from traditional production systems, in which large batches are processed at each step. JPF appointed a "lean team" to reengineer its New Business unit's operations, beginning with the creation of a "model cell"--a fully functioning microcosm of JPF's entire process. This approach allowed managers to experiment and smooth out the kinks while working toward an optimal design. The team applied lean-manufacturing practices, including placing linked processes near one another, balancing employees' workloads, posting performance results, and measuring performance and productivity from the customer's perspective. Customer-focused metrics helped erode the employees' "My work is all that matters" mind-set. The results were so impressive that JPF is rolling out similar systems across many of its operations. To convince employees of the value of lean production, the lean team introduced a simulation in which teams compete to build the best paper airplane based on invented customer specifications. This game drives home lean production's basic principles, establishing a foundation for deep and far-reaching changes in the production system.

  1. Lean management systems: creating a culture of continuous quality improvement.

    PubMed

    Clark, David M; Silvester, Kate; Knowles, Simon

    2013-08-01

    This is the first in a series of articles describing the application of Lean management systems to Laboratory Medicine. Lean is the term used to describe a principle-based continuous quality improvement (CQI) management system based on the Toyota production system (TPS) that has been evolving for over 70 years. Its origins go back much further and are heavily influenced by the work of W Edwards Deming and the scientific method that forms the basis of most quality management systems. Lean has two fundamental elements--a systematic approach to process improvement by removing waste in order to maximise value for the end-user of the service and a commitment to respect, challenge and develop the people who work within the service to create a culture of continuous improvement. Lean principles have been applied to a growing number of Healthcare systems throughout the world to improve the quality and cost-effectiveness of services for patients and a number of laboratories from all the pathology disciplines have used Lean to shorten turnaround times, improve quality (reduce errors) and improve productivity. Increasingly, models used to plan and implement large scale change in healthcare systems, including the National Health Service (NHS) change model, have evidence-based improvement methodologies (such as Lean CQI) as a core component. Consequently, a working knowledge of improvement methodology will be a core skill for Pathologists involved in leadership and management.

  2. Knowledge Co-production Strategies for Water Resources Modeling and Decision Making

    NASA Astrophysics Data System (ADS)

    Gober, P.

    2016-12-01

    The limited impact of scientific information on policy making and climate adaptation in North America has raised awareness of the need for new modeling strategies and knowledge transfer processes. This paper outlines the rationale for a new paradigm in water resources modeling and management, using examples from the USA and Canada. Principles include anticipatory modeling, complex system dynamics, decision making under uncertainty, visualization, capacity to represent and manipulate critical trade-offs, stakeholder engagement, local knowledge, context-specific activities, social learning, vulnerability analysis, iterative and collaborative modeling, and the concept of a boundary organization. In this framework, scientists and stakeholders are partners in the production and dissemination of knowledge for decision making, and local knowledge is fused with scientific observation and methodology. Discussion draws from experience in building long-term collaborative boundary organizations in Phoenix, Arizona in the USA and the Saskatchewan River Basin (SRB) in Canada. Examples of boundary spanning activities include the use of visualization, the concept of a decision theater, infrastructure to support social learning, social networks, and reciprocity, simulation modeling to explore "what if" scenarios of the future, surveys to elicit how water problems are framed by scientists and stakeholders, and humanistic activities (theatrical performances, art exhibitions, etc.) to draw attention to local water issues. The social processes surrounding model development and dissemination are at least as important as modeling assumptions, procedures, and results in determining whether scientific knowledge will be used effectively for water resources decision making.

  3. A quality by design study applied to an industrial pharmaceutical fluid bed granulation.

    PubMed

    Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens

    2012-06-01

    The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial scale process, combined with multivariate data analysis (MVDA) of process and PAT data to increase the process knowledge; (2) execution of scaled-down designed experiments at a pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) the definition of a process Design Space (DS) linking CPPs to Critical to Quality Attributes (CQAs), within which product quality is ensured by design and which, after scale-up, can be used at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in the variability of product CQAs, already within quality specification ranges, was achieved by a better choice of CPP values. The results of such step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.
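    The Design Space concept described above (ranges of CPP values within which the CQAs are predicted to stay inside specification) can be sketched as a simple predicate over a surrogate process model. The parameter names, linear coefficients, and specification limits below are all invented for illustration; they are not from the study.

```python
# Illustrative Design Space check: a CPP combination lies inside the
# design space if a (hypothetical) process model predicts all CQAs
# within their specification limits. All names and numbers are invented.

CQA_SPECS = {"granule_size_um": (150.0, 400.0), "moisture_pct": (0.5, 2.0)}

def predict_cqas(inlet_air_temp_c, spray_rate_g_min):
    # Toy linear surrogate model standing in for the real process model.
    return {
        "granule_size_um": 100.0 + 2.0 * spray_rate_g_min - 0.5 * inlet_air_temp_c,
        "moisture_pct": 3.0 - 0.03 * inlet_air_temp_c + 0.004 * spray_rate_g_min,
    }

def in_design_space(inlet_air_temp_c, spray_rate_g_min):
    cqas = predict_cqas(inlet_air_temp_c, spray_rate_g_min)
    return all(lo <= cqas[name] <= hi for name, (lo, hi) in CQA_SPECS.items())
```

    In practice the surrogate would be fitted from the designed experiments of step (2), and the predicate evaluated over a grid of CPP values to map the DS boundary.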

  4. Research on a dynamic workflow access control model

    NASA Astrophysics Data System (ADS)

    Liu, Yiliang; Deng, Jinxia

    2007-12-01

    In recent years, access control technology has been researched widely for workflow systems; two typical approaches are the RBAC (Role-Based Access Control) and TBAC (Task-Based Access Control) models, which have been used successfully for role authorization and assignment to a certain extent. However, as a system's structure grows more complex, these two technologies cannot enforce least privilege and separation of duties, and they are inapplicable when users frequently request changes to the workflow's process. To avoid these weaknesses, a fine-grained, dynamic role-task-view access control model for variable workflows (DRTVBAC) is constructed on the basis of the existing models. An algorithm is constructed to satisfy users' application requirements and security needs under the fine-grained principles of least privilege and dynamic separation of duties. The DRTVBAC model was implemented in an actual system; the results show that dynamic management of the roles associated with tasks makes role assignment more flexible with respect to granting and revoking authority. The model satisfies the principle of least privilege by activating a role's permissions only for a specific task; it separates authority from the process of completing duties in the workflow, prevents disclosure of sensitive information through a concise, dynamic view interface, and satisfies the requirement of frequently varying task flows.
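    The least-privilege behaviour the abstract describes, in which a role's permissions are activated only for a specific task and revoked afterwards, can be sketched roughly as follows. The classes and permission names are hypothetical, not taken from the DRTVBAC paper.

```python
# Minimal sketch of task-scoped, least-privilege access control,
# loosely following the role-task idea described in the abstract.
# All names (Role, Task, WorkflowACL) are illustrative.

class Role:
    def __init__(self, name, permissions):
        self.name = name
        self.permissions = set(permissions)

class Task:
    def __init__(self, name, required_permissions):
        self.name = name
        self.required = set(required_permissions)

class WorkflowACL:
    def __init__(self):
        self.active = {}  # user -> set of currently activated permissions

    def activate(self, user, role, task):
        # Least privilege: grant only the intersection of the role's
        # permissions and what the task actually requires.
        granted = role.permissions & task.required
        self.active[user] = granted
        return granted

    def check(self, user, permission):
        return permission in self.active.get(user, set())

    def deactivate(self, user):
        # Revoke everything once the task completes.
        self.active.pop(user, None)
```

    For example, a clerk role holding `approve_order` would still be denied that permission during a review task that only requires `read_order`, and would hold nothing at all once the task is deactivated.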

  5. [Investigating phonological planning processes in speech production through a speech-error induction technique].

    PubMed

    Nakayama, Masataka; Saito, Satoru

    2015-08-01

    The present study investigated principles of phonological planning, a common serial ordering mechanism for speech production and phonological short-term memory. Nakayama and Saito (2014) have investigated these principles by using a speech-error induction technique, in which participants were exposed to an auditory distractor word immediately before an utterance of a target word. They demonstrated within-word adjacent mora exchanges and serial position effects on error rates. These findings support, respectively, the temporal distance and the edge principles at a within-word level. As this previous study induced errors using word distractors created by exchanging adjacent morae in the target words, it is possible that the speech errors are expressions of lexical intrusions reflecting interactive activation of phonological and lexical/semantic representations. To eliminate this possibility, the present study used nonword distractors that had no lexical or semantic representations. This approach successfully replicated the error patterns identified in the abovementioned study, further confirming that the temporal distance and edge principles are organizing precepts in phonological planning.

  6. [Lean logistics management in healthcare: a case study].

    PubMed

    Aguilar-Escobar, V G; Garrido-Vega, P

    2013-01-01

    To study the applicability of the principles of Lean Production to managing the supply chain of a hospital; in particular, to determine which Lean practices and principles are applicable, the benefits obtained, and the main barriers to implementation. Managing the hospital supply chain is an important issue, both for its effect on the quality of care and its impact on costs. The study is based on a case at the Hospital Virgen Macarena in Seville over the period 2005-2010, covering the implementation of a comprehensive logistics management plan based on Lean principles and technological investments. The implementation of the comprehensive plan has reduced inventory, decreased lead times and improved service quality. There have also been other important improvements: enhanced satisfaction and increased productivity among both healthcare and logistics staff. The experience analysed shows the applicability and appropriateness of Lean principles and some of their techniques in managing hospital logistics, and identifies some of the main difficulties that may arise. Copyright © 2011 SECA. Published by Elsevier España. All rights reserved.

  7. Nondestructive methods of integrating energy harvesting systems for highway bridges

    NASA Astrophysics Data System (ADS)

    Inamdar, Sumedh; Zimowski, Krystian; Crawford, Richard; Wood, Kristin; Jensen, Dan

    2012-04-01

    Designing an attachment structure that is both novel and meets the system requirements can be a difficult task especially for inexperienced designers. This paper presents a design methodology for concept generation of a "parent/child" attachment system. The "child" is broadly defined as any device, part, or subsystem that will attach to any existing system, part, or device called the "parent." An inductive research process was used to study a variety of products, patents, and biological examples that exemplified the parent/child system. Common traits among these products were found and categorized as attachment principles in three different domains: mechanical, material, and field. The attachment principles within the mechanical domain and accompanying examples are the focus of this paper. As an example of the method, a case study of generating concepts for a bridge mounted wind energy harvester using the mechanical attachment principles derived from the methodology and TRIZ principles derived from Altshuller's matrix of contradictions is presented.

  8. Relativistic corrections to electromagnetic heavy quarkonium production

    NASA Astrophysics Data System (ADS)

    Shtabovenko, Vladyslav

    2017-03-01

    We report on the calculation [1] of the relativistic O(α_s^0 v^2) corrections to the quarkonium production process e+e- → χcJ + γ in non-relativistic QCD (NRQCD). In our work we incorporate effects from operators that contribute through the sub-leading Fock state |QQ̄g⟩, which were not taken into account by previous studies. We determine the corresponding matching coefficients that should be included in theoretical predictions for the electromagnetic production cross-section of χcJ. This process could, in principle, be measured by the Belle II experiment.

  9. Digital Media Production to Support Literacy for Secondary Students with Diverse Learning Abilities

    ERIC Educational Resources Information Center

    Leach, April Marie

    2017-01-01

    Producing digital media is a hands-on, inquiry-based mindful process that naturally embeds Universal Design for Learning (UDL) principles into literacy instruction, providing options for learning and assessment for a wide array of students with diverse learning abilities. Video production learning experiences acknowledge the cognitive talents of…

  10. The productive operating theatre and lean thinking systems.

    PubMed

    Kasivisvanathan, R; Chekairi, A

    2014-11-01

    The concept of 'lean thinking' first originated in the manufacturing industry as a means of improving productivity whilst maintaining quality through eliminating wasteful processes. The purpose of this article is to demonstrate how the principles of 'lean thinking' are relevant to healthcare and the operating theatre, with reference to our own institutional experience.

  11. Occupational ergonomics and injury prevention.

    PubMed

    Stobbe, T J

    1996-01-01

    Ergonomics is the study of people at work. The current focus is on the prevention of work-induced musculoskeletal injuries through the application of sound ergonomic principles. This chapter has briefly outlined ergonomics and its history, has described low back pain and upper extremity cumulative trauma disorders from an ergonomic perspective, and has discussed control and prevention approaches for a few scenarios. Ergonomic principles are based on a combination of science and engineering and a thorough understanding of human capabilities and limitations. When these principles are applied to the design of a job, task, process, or procedure, the incidence and severity of musculoskeletal injuries decrease. In many cases productivity and morale also improve. Workers are spared suffering, and employers are spared costs. It is hoped that this discussion will encourage more health, safety, and business professionals to learn about and apply ergonomics in their workplaces for the improvement of the worker, product, and business. Finally, many additional epidemiologic studies on the individual and joint effects of the CTD risk factors are needed. The knowledge gained from these studies will promote the more effective application of ergonomic principles to reduce worker suffering, improve products, and reduce costs.

  12. Activity-based costing: a practical model for cost calculation in radiotherapy.

    PubMed

    Lievens, Yolande; van den Bogaert, Walter; Kesteloot, Katrien

    2003-10-01

    The activity-based costing method was used to compute radiotherapy costs. This report describes the model developed, the calculated costs, and possible applications for the Leuven radiotherapy department. Activity-based costing is an advanced cost calculation technique that allocates resource costs to products based on activity consumption. In the Leuven model, a complex allocation principle with a large diversity of cost drivers was avoided by introducing an extra allocation step between activity groups and activities. A straightforward principle of time consumption, weighed by some factors of treatment complexity, was used. The model was developed in an iterative way, progressively defining the constituting components (costs, activities, products, and cost drivers). Radiotherapy costs are predominantly determined by personnel and equipment cost. Treatment-related activities consume the greatest proportion of the resource costs, with treatment delivery the most important component. This translates into products that have a prolonged total or daily treatment time being the most costly. The model was also used to illustrate the impact of changes in resource costs and in practice patterns. The presented activity-based costing model is a practical tool to evaluate the actual cost structure of a radiotherapy department and to evaluate possible resource or practice changes.
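    The allocation principle described above, in which resource costs flow to activity groups and then to products in proportion to time consumption weighted by treatment complexity, might be sketched as follows. All figures, activity names, and weights are invented; this is not the Leuven model itself.

```python
# Illustrative activity-based costing sketch: resource costs are allocated
# to activities, then to products in proportion to time consumption
# weighted by a complexity factor. All numbers are invented.

resource_costs = {"personnel": 600_000.0, "equipment": 400_000.0}

# Fraction of each resource consumed by each activity group.
activity_share = {
    "preparation": {"personnel": 0.3, "equipment": 0.2},
    "delivery":    {"personnel": 0.7, "equipment": 0.8},
}

def activity_cost(activity):
    return sum(resource_costs[r] * share
               for r, share in activity_share[activity].items())

# Products described by minutes consumed per activity and a complexity weight.
products = {
    "simple_treatment":  {"preparation": 60,  "delivery": 200, "weight": 1.0},
    "complex_treatment": {"preparation": 180, "delivery": 600, "weight": 1.5},
}

def product_costs():
    costs = {}
    for act in activity_share:
        total_minutes = sum(p[act] * p["weight"] for p in products.values())
        rate = activity_cost(act) / total_minutes  # cost per weighted minute
        for name, p in products.items():
            costs[name] = costs.get(name, 0.0) + rate * p[act] * p["weight"]
    return costs
```

    Because allocation is driven by weighted time, products with prolonged total or daily treatment time come out as the most costly, matching the abstract's observation.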

  13. A hotspot model for leaf canopies

    NASA Technical Reports Server (NTRS)

    Jupp, David L. B.; Strahler, Alan H.

    1991-01-01

    The hotspot effect, which provides important information about canopy structure, is modeled using general principles of environmental physics as driven by parameters of interest in remote sensing, such as leaf size, leaf shape, leaf area index, and leaf angle distribution. Specific examples are derived for canopies of horizontal leaves. The hotspot effect is implemented within the framework of the model developed by Suits (1972) for a canopy of leaves to illustrate what might occur in an agricultural crop. Because the hotspot effect arises from very basic geometrical principles and is scale-free, it occurs similarly in woodlands, forests, crops, rough soil surfaces, and clouds. The scaling principles advanced are also significant factors in the production of image spatial and angular variance and covariance which can be used to assess land cover structure through remote sensing.

  14. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    PubMed

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    In the article, the quality control and safety system implemented at one of the largest flight catering food production plants, serving airline passengers and flight crews, is considered. The control system is based on Hazard Analysis and Critical Control Points (HACCP) principles and on the hygienic and anti-epidemic measures developed. The identification of hazard factors at each stage of the technical process is considered, and the results of analysing monitoring data for six critical control points over a five-year period are presented. The quality control and safety system reduces the risk of food contamination during the acceptance, preparation and supply of in-flight meals, and the efficiency of the implemented system was demonstrated. Further ways of harmonizing and implementing HACCP principles at the plant are determined.

  15. 75 FR 14418 - Codex Alimentarius Commission: Meeting of the Codex Committee on Food Labeling

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-25

    ...) Proposed Draft Criteria and Principles for Legibility and Readability of Nutrition Labels (d) Discussion..., Physical Activity, and Health Guidelines for the Production, Processing, Labeling and Marketing of...

  16. A Human Factors Framework for Payload Display Design

    NASA Technical Reports Server (NTRS)

    Dunn, Mariea C.; Hutchinson, Sonya L.

    1998-01-01

    During missions to space, one charge of the astronaut crew is to conduct research experiments. These experiments, referred to as payloads, typically are controlled by computers. Crewmembers interact with payload computers by using visual interfaces or displays. To enhance the safety, productivity, and efficiency of crewmember interaction with payload displays, particular attention must be paid to the usability of these displays. Enhancing display usability requires adoption of a design process that incorporates human factors engineering principles at each stage. This paper presents a proposed framework for incorporating human factors engineering principles into the payload display design process.

  17. Root Zone Water Quality Model (RZWQM2): Model use, calibration, and validation

    USDA-ARS?s Scientific Manuscript database

    The Root Zone Water Quality Model (RZWQM2) has been used widely for simulating agricultural management effects on crop production and soil and water quality. Although it is a one-dimensional model it has many desirable features for the modeling community. This paper outlines the principles of calibr...

  18. Fused deposition of ceramics: A comprehensive experimental, analytical and computational study of material behavior, fabrication process and equipment design

    NASA Astrophysics Data System (ADS)

    Bellini, Anna

    Customer-driven product customization and continued demand for cost and time savings have generated a renewed interest in agile manufacturing based on improvements in Rapid Prototyping (RP) technologies. The advantages of RP technologies are: (1) ability to shorten the product design and development time, (2) suitability for automation and decrease in the level of human intervention, (3) ability to build many geometrically complex shapes. A shift from "prototyping" to "manufacturing" necessitates the following improvements: (1) Flexibility in choice of materials; (2) Part integrity and built-in characteristics to meet performance requirements; (3) Dimensional stability and tolerances; (4) Improved surface finish. A project funded by ONR has been undertaken to develop an agile manufacturing technology for fabrication of ceramic and multi-component parts to meet various needs of the Navy, such as transducers. The project is based on adaptation of a layered manufacturing concept, since the program required that the new technology be developed based on a commercially available RP technology. Among various RP technologies available today, Fused Deposition Modeling (FDM) has been identified as the focus of this research because of its potential versatility in the choice of materials and deposition configuration. This innovative approach allows for designing and implementing highly complex internal architectures into parts through deposition of different materials in a variety of configurations in such a way that the finished product exhibits characteristics to meet the performance requirements. This implies that, in principle, one can tailor-make the assembly of materials and structures as per the specifications of an optimum design. The program objectives can be achieved only through accurate process modeling and modeling of material behavior.
Oftentimes, process modeling is based on some type of computational approach, whereas modeling of material behavior is based on extensive experimental investigations. Studies are conducted in the following categories: (1) Flow modeling during extrusion and deposition; (2) Thermal modeling; (3) Flow control during deposition; (4) Product characterization and property determination for dimensional analysis; (5) Development of a novel technology based on a mini-extrusion system. Studies in each of these stages have involved experimental as well as analytical approaches to develop a comprehensive model.

  19. Experimental evidence of multimaterial jet formation with lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicolaie, Ph.; Stenz, C.; Tikhonchuk, V.

    2010-11-15

    Laser-produced multimaterial jets have been investigated at the Prague Asterix Laser System laser [K. Jungwirth et al., Phys. Plasmas 8, 2495 (2001)]. The method of jet production is based on the laser-plasma ablation process and proved to be easy to set up and robust. The possibility of multimaterial laboratory jet production is demonstrated and complex hydrodynamic flows in the jet body are obtained. Two complementary diagnostics in the optical ray and x-ray ranges provide detailed information about jet characteristics. The latter are in agreement with estimates and two-dimensional radiation hydrodynamic simulation results. The experiment provides a proof of principle that a velocity field could be produced and controlled in the jet body. It opens a possibility of astrophysical jet structure modeling in laboratory.

  20. Logistic Principles Application for Managing the Extraction and Transportation of Solid Minerals

    NASA Astrophysics Data System (ADS)

    Tyurin, Alexey

    2017-11-01

    Reducing resource costs in solid mineral extraction is an urgent task. To solve it, the article proposes a logistic approach to managing all resources of a mining company, including the extraction processes, transport, mineral handling and storage. Accounting for the uneven operation of mining and transport units and of the complexes for processing and loading coal into railroad cars makes it possible to identify shortcomings in the work of the entire enterprise and to reduce resource use at the planned production level. The article presents a mining planning model that takes into account the dynamics of production, transport stations and the rail export of coal to consumers, using the example of JSC «Razrez Sereul'skiy» (Nazarovo, Krasnoyarsk region). Rolling planning methods and data aggregation allow the planning horizon (a month) to be split into equal periods and the dynamic programming method to be used to build an optimal mining production programme for the month. The technique for defining the coal mining production program helps align the work of all enterprise units, optimize resources in all areas, establish a flexible relationship between producer and consumer, and take into account the irregularity of rail transport.
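    The planning approach described above, splitting the monthly horizon into equal periods and using dynamic programming to build an optimal production programme, might be sketched as follows. The cost function, capacity levels, and demand figure are invented placeholders, not the authors' data.

```python
# Illustrative dynamic-programming sketch of a rolling production plan:
# the monthly horizon is split into equal periods and an output level is
# chosen for each period so total demand is met at minimum cost.
# All numbers and the cost function are invented for illustration.
from functools import lru_cache

PERIODS = 4                  # equal parts of the monthly planning horizon
DEMAND = 100                 # total tonnage to ship over the month
LEVELS = range(0, 51, 10)    # feasible per-period output levels (tonnes)

def period_cost(output, period):
    # Hypothetical cost: quadratic in output, scaled by a per-period
    # factor standing in for uneven transport availability.
    transport_penalty = [1.0, 1.2, 0.9, 1.1][period]
    return transport_penalty * 0.5 * output ** 2

@lru_cache(maxsize=None)
def best(period, remaining):
    """Minimum cost to ship `remaining` tonnes in periods period..PERIODS-1."""
    if period == PERIODS:
        return 0.0 if remaining == 0 else float("inf")
    return min(period_cost(q, period) + best(period + 1, remaining - q)
               for q in LEVELS if q <= remaining)

def plan():
    # Walk forward, at each period picking the output level that attains
    # the optimal cost-to-go computed by `best`.
    schedule, remaining = [], DEMAND
    for t in range(PERIODS):
        q = min((q for q in LEVELS if q <= remaining),
                key=lambda q: period_cost(q, t) + best(t + 1, remaining - q))
        schedule.append(q)
        remaining -= q
    return schedule
```

    A real model would replace the toy quadratic with period-specific costs derived from aggregated mining, handling, and rail-transport data, which is where the irregularity of rail transport enters.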

  1. The Red Lotus Health Promotion Model: a new model for holistic, ecological, salutogenic health promotion practice.

    PubMed

    Gregg, Jane; O'Hara, Lily

    2007-04-01

    There is a need for a system of values and principles consistent with modern health promotion that enables practitioners to use these values and principles to understand health and to guide their needs assessment, planning, implementation and evaluation practice. Grounded theory, document analysis and the authors' own practice experience were used to systematically collect and analyse data from key health promotion literature and to develop the Red Lotus Health Promotion Model. The Red Lotus Health Promotion Model is a new model for holistic, ecological, salutogenic health promotion practice. It is distinct from other health promotion models in that it incorporates a system of values and principles that is applied across the phases of health promotion, including determining the health paradigm, needs assessment, planning, implementation and evaluation. The Red Lotus Health Promotion Model enables practitioners to proactively and purposefully put into action a connected system of values and principles across the phases of a health promotion process.

  2. Toward a comprehensive model of antisocial development: a dynamic systems approach.

    PubMed

    Granic, Isabela; Patterson, Gerald R

    2006-01-01

    The purpose of this article is to develop a preliminary comprehensive model of antisocial development based on dynamic systems principles. The model is built on the foundations of behavioral research on coercion theory. First, the authors focus on the principles of multistability, feedback, and nonlinear causality to reconceptualize real-time parent-child and peer processes. Second, they model the mechanisms by which these real-time processes give rise to negative developmental outcomes, which in turn feed back to determine real-time interactions. Third, they examine mechanisms of change and stability in early- and late-onset antisocial trajectories. Finally, novel clinical designs and predictions are introduced. The authors highlight new predictions and present studies that have tested aspects of the model.

  3. Production of long chain alkyl esters from carbon dioxide and electricity by a two-stage bacterial process.

    PubMed

    Lehtinen, Tapio; Efimova, Elena; Tremblay, Pier-Luc; Santala, Suvi; Zhang, Tian; Santala, Ville

    2017-11-01

    Microbial electrosynthesis (MES) is a promising technology for the reduction of carbon dioxide into value-added multicarbon molecules. In order to broaden the product profile of MES processes, we developed a two-stage process for microbial conversion of carbon dioxide and electricity into long chain alkyl esters. In the first stage, the carbon dioxide is reduced to organic compounds, mainly acetate, in a MES process by Sporomusa ovata. In the second stage, the liquid end-products of the MES process are converted to the final product by a second microorganism, Acinetobacter baylyi, in an aerobic bioprocess. In this proof-of-principle study, we demonstrate for the first time the bacterial production of long chain alkyl esters (wax esters) from carbon dioxide and electricity as the sole sources of carbon and energy. The process holds potential for the efficient production of carbon-neutral chemicals or biofuels. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Applying Lean/Toyota production system principles to improve phlebotomy patient satisfaction and workflow.

    PubMed

    Melanson, Stacy E F; Goonan, Ellen M; Lobo, Margaret M; Baum, Jonathan M; Paredes, José D; Santos, Katherine S; Gustafson, Michael L; Tanasijevic, Milenko J

    2009-12-01

    Our goals were to improve the overall patient experience and optimize the blood collection process in outpatient phlebotomy using Lean principles. Elimination of non-value-added steps and modifications to operational processes resulted in increased capacity to handle workload during peak times without adding staff. The result was a reduction of average patient wait time from 21 to 5 minutes, with the goal of drawing blood samples within 10 minutes of arrival at the phlebotomy station met for 90% of patients. In addition, patient satisfaction increased noticeably as assessed by a 5-question survey. The results have been sustained for 10 months with staff continuing to make process improvements.

  5. Potential application of ecological models in the European environmental risk assessment of chemicals. I. Review of protection goals in EU directives and regulations.

    PubMed

    Hommen, Udo; Baveco, J M Hans; Galic, Nika; van den Brink, Paul J

    2010-07-01

    Several European directives and regulations address the environmental risk assessment of chemicals. We used the protection of freshwater ecosystems against plant protection products, biocidal products, human and veterinary pharmaceuticals, and other chemicals and priority substances under the Water Framework Directive as examples to explore the potential of ecological effect models for a refined risk assessment. Our analysis of the directives, regulations, and related guidance documents led us to distinguish the following 5 areas for the application of ecological models in chemical risk assessment: 1) Extrapolation of organism-level effects to the population level: The protection goals are formulated in general terms, e.g., avoiding "unacceptable effects" or "adverse impact" on the environment or the "viability of exposed species." In contrast, most of the standard ecotoxicological tests provide data only on organism-level endpoints and are thus not directly linked to the protection goals, which focus on populations and communities. 2) Extrapolation of effects between different exposure profiles: Especially for plant protection products, exposure profiles can be very variable and impossible to cover in toxicological tests. 3) Extrapolation of recovery processes: As a consequence of the often short-term exposures to plant protection products, the risk assessment is based on the community recovery principle. On the other hand, assessments under the other directives assume a more or less constant exposure and are based on the ecosystem threshold principle. 4) Analysis and prediction of indirect effects: Because effects on 1 or a few taxa might have consequences for other taxa that are not directly affected by the chemical, such indirect effects on communities have to be considered. 5) Prediction of bioaccumulation within food chains: All directives take the possibility of bioaccumulation, and thus secondary poisoning within the food chain, into account. © 2010 SETAC.

  6. Testing two principles of the Health Action Process Approach in individuals with type 2 diabetes.

    PubMed

    Lippke, Sonia; Plotnikoff, Ronald C

    2014-01-01

    The Health Action Process Approach (HAPA) proposes principles that can be translated into testable hypotheses. This is one of the first studies to have explicitly tested HAPA's first 2 principles, which are (1) the health behavior change process can be subdivided into motivation and volition, and (2) volition can be grouped into intentional and action stages. The 3 stage groups are labeled preintenders, intenders, and actors. The hypotheses of the HAPA model were investigated in a sample of 1,193 individuals with Type 2 diabetes. Study participants completed a questionnaire assessing the HAPA variables. The hypotheses were evaluated by examining mean differences of test variables and by the use of multigroup structural equation modeling (MSEM). Findings support the HAPA's 2 principles and 3 distinct stages. The 3 HAPA stages were significantly different in several stage-specific variables, and discontinuity patterns were found in terms of nonlinear trends across means. In terms of predicting goals, action planning, and behavior, differences transpired between the 2 motivational stages (preintenders and intenders), and between the 2 volitional stages (intenders and actors). Results indicate implications for supporting behavior change processes, depending on the stage a person is in: All individuals should be helped to increase self-efficacy. Preintenders and intenders require interventions targeting outcome expectancies. Actors benefit from an improvement in action planning to maintain and increase their previous behavior. Overall, the first 2 principles of the HAPA were supported and some evidence for the other principles was found. Future research should experimentally test these conclusions. © 2014 APA, all rights reserved.

  7. An Information Processing Perspective on Divergence and Convergence in Collaborative Learning

    ERIC Educational Resources Information Center

    Jorczak, Robert L.

    2011-01-01

    This paper presents a model of collaborative learning that takes an information processing perspective of learning by social interaction. The collaborative information processing model provides a theoretical basis for understanding learning principles associated with social interaction and explains why peer-to-peer discussion is potentially more…

  8. Innovative model of business process reengineering at machine building enterprises

    NASA Astrophysics Data System (ADS)

    Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.

    2017-10-01

    The paper provides consideration of business process reengineering viewed as a managerial innovation accepted by present-day machine building enterprises, as well as ways to improve its procedure. A developed innovative model of reengineering measures is described and is based on the process approach and other principles of company management.

  9. Synergy as design principle for metabolic engineering of 1-propanol production in Escherichia coli.

    PubMed

    Shen, Claire R; Liao, James C

    2013-05-01

    Synthesis of a desired product can often be achieved via more than one metabolic pathway. Whether naturally evolved or synthetically engineered, these pathways often exhibit specific properties that are suitable for production under distinct conditions and host organisms. Synergy between pathways arises when the underlying pathway characteristics, such as reducing equivalent demand, ATP requirement, intermediate utilization, and cofactor preferences, are complementary to each other. Utilization of such pathways in combination leads to increased metabolite productivity and/or yield compared to using either pathway alone. This work illustrates the principle of synergy between two different pathways for 1-propanol production in Escherichia coli. A model-guided design based on maximum theoretical yield calculations identified synergy of the native threonine pathway and the heterologous citramalate pathway in terms of production yield across all flux ratios between the two pathways. Characterization of the individual pathways by host gene deletions demonstrates their distinct metabolic characteristics: the necessity of the TCA cycle for the threonine pathway and the independence from the TCA cycle for the citramalate pathway. The two pathways are also complementary in driving-force demands. Production experiments verified the synergistic effects predicted by the yield model, in which the platform with the dual pathway for 2-ketobutyrate synthesis achieved higher yield (0.15 g/g of glucose) and productivity (0.12 g/L/h) of 1-propanol than either pathway alone: the threonine pathway (0.09 g/g; 0.04 g/L/h) or the citramalate pathway (0.11 g/g; 0.04 g/L/h). Thus, incorporation of synergy into the design principle of metabolic engineering may improve the production yield and rate of the desired compound. Copyright © 2013 Elsevier Inc. All rights reserved.
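
    The scale of the synergy can be read directly off the yields and productivities reported in the abstract; the short calculation below only restates those numbers as relative gains over the better single pathway (the "gain" metric itself is our framing, not the paper's).

```python
# Yields (g 1-propanol / g glucose) and productivities (g/L/h) from the text.
pathways = {
    "threonine":   {"yield": 0.09, "rate": 0.04},
    "citramalate": {"yield": 0.11, "rate": 0.04},
    "dual":        {"yield": 0.15, "rate": 0.12},
}

best_single_yield = max(pathways["threonine"]["yield"],
                        pathways["citramalate"]["yield"])
best_single_rate = max(pathways["threonine"]["rate"],
                       pathways["citramalate"]["rate"])

yield_gain = pathways["dual"]["yield"] / best_single_yield - 1
rate_gain = pathways["dual"]["rate"] / best_single_rate - 1

print(f"yield gain over best single pathway: {yield_gain:.0%}")  # 36%
print(f"rate gain over best single pathway:  {rate_gain:.0%}")   # 200%
```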

  10. Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes

    NASA Astrophysics Data System (ADS)

    Cropper, A. E.; Wang, Z.

    1995-08-01

    Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance. The latter being a result of the traceability possibilities through a process involving mixing and splitting of material.

  11. Synthesis: Intertwining product and process

    NASA Technical Reports Server (NTRS)

    Weiss, David M.

    1990-01-01

    Synthesis is a proposed systematic process for rapidly creating different members of a program family. Family members are described by variations in their requirements. Requirements variations are mapped to variations on a standard design to generate production quality code and documentation. The approach is made feasible by using principles underlying design for change. Synthesis incorporates ideas from rapid prototyping, application generators, and domain analysis. The goals of Synthesis and the Synthesis process are discussed. The technology needed and the feasibility of the approach are also briefly discussed. The status of current efforts to implement Synthesis methodologies is presented.

  12. A study of process parameters on workpiece anisotropy in the laser engineered net shaping (LENS™) process

    NASA Astrophysics Data System (ADS)

    Chandra, Shubham; Rao, Balkrishna C.

    2017-06-01

    The process of laser engineered net shaping (LENS™) is an additive manufacturing technique that employs the coaxial flow of metallic powders with a high-power laser to form a melt pool and the subsequent deposition of the specimen on a substrate. Although research done over the past decade on the LENS™ processing of alloys of steel, titanium, nickel and other metallic materials typically reports superior mechanical properties in as-deposited specimens, when compared to the bulk material, there is anisotropy in the mechanical properties of the melt deposit. The current study involves the development of a numerical model of the LENS™ process, using the principles of computational fluid dynamics (CFD), and the subsequent prediction of the volume fraction of equiaxed grains to predict process parameters required for the deposition of workpieces with isotropy in their properties. The numerical simulation is carried out on ANSYS-Fluent, whose data on thermal gradient are used to determine the volume fraction of the equiaxed grains present in the deposited specimen. This study has been validated against earlier efforts on the experimental studies of LENS™ for alloys of nickel. Besides being applicable to the wider family of metals and alloys, the results of this study will also facilitate effective process design to improve both product quality and productivity.
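
    The qualitative rule behind such grain-morphology predictions can be sketched with the textbook ratio of thermal gradient G to solidification rate R: low G/R favours equiaxed grains, high G/R favours columnar growth. The classification function and both thresholds below are hypothetical and alloy-dependent; this is not the model used in the study, only an illustration of the idea.

```python
def morphology(G: float, R: float,
               g_over_r_equiaxed: float = 1.0e3,
               g_over_r_columnar: float = 1.0e5) -> str:
    """Classify expected grain morphology from thermal gradient G (K/m) and
    solidification-front velocity R (m/s) using the G/R rule. The two
    thresholds are hypothetical placeholders, not measured values."""
    ratio = G / R
    if ratio < g_over_r_equiaxed:
        return "equiaxed"
    if ratio > g_over_r_columnar:
        return "columnar"
    return "mixed"

# Steep gradient with slow growth favours columnar grains; a shallow
# gradient with fast growth favours the equiaxed (isotropic) structure
# the study is after.
print(morphology(G=1e6, R=0.001))  # columnar
print(morphology(G=100.0, R=0.5))  # equiaxed
```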

  13. Computer simulation for improving radio frequency (RF) heating uniformity of food products: A review.

    PubMed

    Huang, Zhi; Marra, Francesco; Subbiah, Jeyamkondan; Wang, Shaojin

    2018-04-13

    Radio frequency (RF) heating has great potential for achieving rapid and volumetric heating in foods, providing safe and high-quality food products due to deep penetration depth, moisture self-balancing effects, and leaving no chemical residues. However, the nonuniform heating problem (usually resulting in hot and cold spots in the heated product) needs to be resolved. The inhomogeneous temperature distribution not only affects the quality of the food but also raises food safety issues when microorganisms or insects may not be controlled in the cold spots. Mathematical modeling of RF heating processes has recently been studied extensively for a wide variety of agricultural products. This paper presents a comprehensive review of recent progress in computer simulation for RF heating uniformity improvement and of the solutions offered to reduce heating nonuniformity. It provides a brief introduction to the basic principle of RF heating technology, analyzes the applications of numerical simulation, and discusses the factors influencing RF heating uniformity and the possible methods to improve it. Mathematical modeling improves the understanding of RF heating of food and is essential to optimize the RF treatment protocol for pasteurization and disinfestation applications. Recommendations for future research have been proposed to further improve the accuracy of numerical models, by covering both heat and mass transfer in the model, validating these models with sample movement and mixing, and identifying the important model parameters by sensitivity analysis.

  14. The development of mathematics courseware for learning line and angle

    NASA Astrophysics Data System (ADS)

    Halim, Noor Dayana Abd; Han, Ong Boon; Abdullah, Zaleha; Yusup, Junaidah

    2015-05-01

    Learning software is a teaching aid often used in schools to increase students' motivation, attract their attention, and improve the quality of the teaching and learning process. The development of learning software should, however, follow the phases of an Instructional Design (ID) model so that the process is carried out systematically. This concept paper therefore describes the application of the ADDIE model to the development of a mathematics learning courseware for the topic of Line and Angle, named CBL-Math. The ADDIE model consists of five consecutive phases: Analysis, Design, Development, Implementation and Evaluation. Each phase must be properly planned in order to achieve the stated objectives. Besides describing the processes occurring in each phase, this paper also demonstrates how the principles of the cognitive theory of multimedia learning are integrated into the developed courseware. The principles applied in the courseware reduce students' cognitive load while learning the topic of line and angle. With a well-prepared development process and the integration of appropriate principles, the developed software is expected to help students learn effectively and to increase their achievement in the topic of Line and Angle.

  15. The four principles: Can they be measured and do they predict ethical decision making?

    PubMed Central

    2012-01-01

    Background The four principles of Beauchamp and Childress - autonomy, non-maleficence, beneficence and justice - have been extremely influential in the field of medical ethics, and are fundamental for understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and then subsequently if they are used in the decision making process when individuals are faced with ethical dilemmas. Methods The Analytic Hierarchy Process was used as a tool for the measurement of the principles. Four scenarios, which involved conflicts between the medical ethical principles, were presented to participants who then made judgments about the ethicality of the action in the scenario, and their intentions to act in the same manner if they were in the situation. Results Individual preferences for these medical ethical principles can be measured using the Analytic Hierarchy Process. This technique provides a useful tool with which to highlight individual medical ethical values. On average, individuals have a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not seem to relate to applied ethical judgements in specific ethical dilemmas. Conclusions People state they value these medical ethical principles but they do not actually seem to use them directly in the decision making process. The reasons for this are explained through the lack of a behavioural model to account for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed. PMID:22606995
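
    The Analytic Hierarchy Process measures such preferences by reducing a pairwise comparison matrix (on Saaty's 1-9 scale) to its principal eigenvector. The sketch below shows the mechanics with power iteration; the example judgments are hypothetical, chosen so that non-maleficence dominates, consistent with the study's average finding, and are not the study's data.

```python
principles = ["autonomy", "non-maleficence", "beneficence", "justice"]

# A[i][j] says how much more important principle i is judged than j.
A = [
    [1,   1 / 3, 1 / 2, 1],
    [3,   1,     2,     3],
    [2,   1 / 2, 1,     2],
    [1,   1 / 3, 1 / 2, 1],
]

def ahp_weights(M, iters=100):
    """Principal eigenvector of M via power iteration, normalized to sum 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

w = ahp_weights(A)
for name, weight in sorted(zip(principles, w), key=lambda t: -t[1]):
    print(f"{name:15s} {weight:.3f}")
```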

  16. The four principles: can they be measured and do they predict ethical decision making?

    PubMed

    Page, Katie

    2012-05-20

    The four principles of Beauchamp and Childress--autonomy, non-maleficence, beneficence and justice--have been extremely influential in the field of medical ethics, and are fundamental for understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and then subsequently if they are used in the decision making process when individuals are faced with ethical dilemmas. The Analytic Hierarchy Process was used as a tool for the measurement of the principles. Four scenarios, which involved conflicts between the medical ethical principles, were presented to participants who then made judgments about the ethicality of the action in the scenario, and their intentions to act in the same manner if they were in the situation. Individual preferences for these medical ethical principles can be measured using the Analytic Hierarchy Process. This technique provides a useful tool with which to highlight individual medical ethical values. On average, individuals have a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not seem to relate to applied ethical judgements in specific ethical dilemmas. People state they value these medical ethical principles but they do not actually seem to use them directly in the decision making process. The reasons for this are explained through the lack of a behavioural model to account for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed.

  17. FDA 2011 process validation guidance: lifecycle compliance model.

    PubMed

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  18. Lasers: A Valuable Tool for Chemists.

    ERIC Educational Resources Information Center

    Findsen, E. W.; Ondrias, M. R.

    1986-01-01

    Discusses the properties of laser light, reviews types of lasers, presents operating principles, and considers mechanical aspects of laser light production. Applications reviewed include spectroscopy, photochemical reaction initiation, and investigation of biological processes involving porphyrins. (JM)

  19. Connecting Toxicology and Chemistry to Ensure Safer Chemical Design

    EPA Science Inventory

    Designing safer, healthier and sustainable products and processes requires the engagement of toxicologists and the incorporation of twenty-first century toxicology principles and practices. Hazard reduction through molecular design benefits from trans-disciplinary collaboration, ...

  20. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive studies of PCR dynamics, the PE company found a linear relation between the initial template number and the cycle number at which the accumulating fluorescent product becomes detectable, and on this basis developed the quantitative PCR technique used in the PE7700 and PE5700. The error of this technique, however, is too great for the needs of biotechnology development and clinical research, so a better quantitative PCR technique is needed. The mathematical model submitted here draws on the achievements of related sciences and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity) and the initial template number and other reaction conditions, and accurately reflects the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this functional relation: the accumulated product quantity can be obtained from the initial template number. With this model, the error of quantitative PCR analysis depends only on the accuracy of the fluorescence intensity measurement, that is, on the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template size is between 100 and 1,000,000, the accuracy of the quantitative result will exceed 99%. Under the same conditions and on the same instrument, different analysis methods give distinctly different errors; using this model to process the data yields results about 80 times more accurate than the Ct method.
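
    The baseline that such refined models improve on is the standard exponential amplification relation, N(c) = N0·(1 + E)^c with efficiency 0 < E <= 1: given the threshold-crossing cycle Ct, the initial template N0 can be recovered. The sketch below shows that round trip; the efficiency and threshold values are illustrative, not from the article.

```python
import math

def product(n0: float, efficiency: float, cycles: float) -> float:
    """Template copies after `cycles` cycles of amplification."""
    return n0 * (1.0 + efficiency) ** cycles

def n0_from_ct(ct: float, threshold: float, efficiency: float) -> float:
    """Invert the model: initial copies from the threshold-crossing cycle."""
    return threshold / (1.0 + efficiency) ** ct

E = 0.95                # amplification efficiency (illustrative)
threshold = 1.0e10      # copies at the detection threshold (illustrative)
n0_true = 1.0e4

# Cycle at which this sample crosses the threshold, then invert it.
ct = math.log(threshold / n0_true) / math.log(1.0 + E)
n0_est = n0_from_ct(ct, threshold, E)
print(f"Ct = {ct:.2f}, estimated N0 = {n0_est:.3e}")
```

    In practice the efficiency E drifts between cycles and reactions, which is exactly the error source a fuller kinetic model of the reaction system tries to capture.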

  1. The evolution of acute burn care - retiring the split skin graft.

    PubMed

    Greenwood, J E

    2017-07-01

    The skin graft was born in 1869 and since then, surgeons have been using split skin grafts for wound repair. Nevertheless, this asset fails the big burn patient, who deserves an elastic, mobile and robust outcome but who receives the poorest possible outcome based on donor site paucity. Negating the need for the skin graft requires an autologous composite cultured skin and a material capable of temporising the burn wound for four weeks until the composite is produced. A novel, biodegradable polyurethane chemistry has been used to create two such products. This paper describes the design, production, optimisation and evaluation of several iterations of these products. The evaluation has occurred in a variety of models, both in vitro and in vivo, employing Hunterian scientific principles, and embracing Hunter's love and appreciation of comparative anatomy. The process has culminated in significant human experience in complex wounds and extensive burn injury. Used serially, the products offer robust and elastic healing in deep burns of any size within 6 weeks of injury.

  2. Scientific and Regulatory Considerations in Solid Oral Modified Release Drug Product Development.

    PubMed

    Li, Min; Sander, Sanna; Duan, John; Rosencrance, Susan; Miksinski, Sarah Pope; Yu, Lawrence; Seo, Paul; Rege, Bhagwant

    2016-11-01

    This review presents scientific and regulatory considerations for the development of solid oral modified release (MR) drug products. It includes a rationale for patient-focused development based on Quality-by-Design (QbD) principles. Product and process understanding of MR products includes identification and risk-based evaluation of critical material attributes (CMAs), critical process parameters (CPPs), and their impact on critical quality attributes (CQAs) that affect the clinical performance. The use of various biopharmaceutics tools that link the CQAs to a predictable and reproducible clinical performance for patient benefit is emphasized. Product and process understanding lead to a more comprehensive control strategy that can maintain product quality through the shelf life and the lifecycle of the drug product. The overall goal is to develop MR products that consistently meet the clinical objectives while mitigating the risks to patients by reducing the probability and increasing the detectability of CQA failures.

  3. Exploring the Dynamics and Modeling National Budget as a Supply Chain System: A Proposal for Reengineering the Budgeting Process and for Developing a Management Flight Simulator

    DTIC Science & Technology

    2012-09-01

    Elmendorf, D. W., & Mankiw, N. G. (1999). Government debt. Handbook of Macroeconomics, 1, 1615-1669. European Union. European financial stability...budget process, based on the supply chain demand management process principles of operations, and it is introduced the idea of developing a Budget... principles of systems dynamics, a proposal for the development of a Budget Management Flight Simulator, that will operate as a learning and educational

  4. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

    Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps with regard to our ability to monitor multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made for development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives. First, to review the progress that has been made in the recent years on the topic of QbD implementation in processing of food products and second, present a case study that illustrates benefits of such QbD implementation.

  5. Maximum principle for a stochastic delayed system involving terminal state constraints.

    PubMed

    Wen, Jiaqiang; Shi, Yufeng

    2017-01-01

    We investigate a stochastic optimal control problem in which the controlled system is described by a stochastic differential delayed equation and, at the terminal time, the state is constrained to lie in a convex set. We first introduce an equivalent backward delayed system, described by a time-delayed backward stochastic differential equation. A stochastic maximum principle is then obtained by virtue of Ekeland's variational principle. Finally, applications to a state-constrained stochastic delayed linear-quadratic control model and to a production-consumption choice problem are studied to illustrate the main result.
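
    Schematically, and assuming a single constant delay $\delta > 0$ (the paper's setting may be more general), the problem class described above reads:

```latex
\[
\begin{aligned}
dX(t) &= b\bigl(t, X(t), X(t-\delta), u(t)\bigr)\,dt
        + \sigma\bigl(t, X(t), X(t-\delta), u(t)\bigr)\,dW(t),
        \quad t \in [0, T],\\
X(t)  &= \xi(t), \quad t \in [-\delta, 0],
        \qquad X(T) \in K \ \text{a.s.},
\end{aligned}
\]
where $K$ is a convex set and the control $u(\cdot)$ minimizes a cost
functional
\[
J(u) = \mathbb{E}\Bigl[\int_0^T f\bigl(t, X(t), X(t-\delta), u(t)\bigr)\,dt
        + h\bigl(X(T)\bigr)\Bigr].
\]
```

    The equivalent backward formulation replaces the terminal constraint by a time-delayed backward stochastic differential equation, and Ekeland's variational principle handles the constraint when the maximum principle is derived.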

  6. Integrated methodology for standard-setting norms of innovative product in the new competitive environment

    NASA Astrophysics Data System (ADS)

    Polyakova, Marina; Rubin, Gennadiy

    2017-07-01

    Modern theory of technological and economic development is based on long-term cycles. It has been shown that the technological structure of the economy can be subdivided into groups of technological complexes that are interrelated by similar technological links, so-called technological modes. A technological mode is defined as a complex of interrelated production units of similar technological level that develop simultaneously. In order to ensure the competitiveness of products under new and changing conditions, it is necessary to make sure they meet all the regulatory requirements specified in standards. But the fast-changing situation on merchandise markets causes an imbalance between growing customer requirements and the technological capabilities of the manufacturer. This makes the development of standardization even more urgent, both for establishing current positions and for identifying promising development trends in technology. In this paper, scientific and engineering principles for developing standardization as a science are described. It is shown that further development of standardization rests on the principles of advanced standardization, the main idea of which is to set prospective requirements for the innovative product; modern approaches to advanced standardization are presented. The complexity of the negotiation procedure between customer and manufacturer as a whole, and of achieving consensus in particular, makes it necessary to find conceptually new approaches to developing mathematical models. The developed methodology depicts the process of reaching consensus between customer and manufacturer, during the development of standard norms, as a decreasing S-curve: at the end of the negotiation process, there is no difference between the customer's and the manufacturer's positions. This makes it possible to ground the assessment in a differential equation relating the rate of change of the quality assessment to the distance of the estimated parameter value from the best value toward the worst. The obtained mathematical model can be used in standardization practice to reduce the time needed to set standard norms.
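    The abstract does not state the differential equation itself, so the sketch below is purely illustrative: a logistic decay of the customer-manufacturer disagreement D(t) is one simple form that produces the decreasing S-curve described, with the rate of change depending on the remaining distance between the two positions.

```python
# Illustrative only: the paper's actual equation is not given in the abstract.
# Logistic decay of disagreement D(t): dD/dt = -k * D * (1 - D/D0).
# The decline starts slowly, accelerates mid-negotiation, then levels off
# near zero -- a decreasing S-curve.
D0, k, dt, steps = 1.0, 1.2, 0.05, 400
D = 0.99 * D0                    # start just below D0 so the decline can begin
trajectory = [D]
for _ in range(steps):
    D += dt * (-k * D * (1.0 - D / D0))   # forward-Euler step
    trajectory.append(D)

final = trajectory[-1]           # disagreement remaining at the end
```

    By the end of the simulated negotiation, the disagreement is essentially zero, matching the claim that the customer's and manufacturer's positions coincide once the standard norm is agreed.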

  7. Using an Outdoor Learning Space to Teach Sustainability and Material Processes in HE Product Design

    ERIC Educational Resources Information Center

    Firth, Richard; Stoltenberg, Einar; Jennings, Trent

    2016-01-01

    This "case study" of two jewellery workshops used outdoor learning spaces to explore their impact on learning outcomes and to introduce key principles of sustainable working methodologies and practices. Using the beach as the classroom, academics and students from a Norwegian and Scottish (HE) product design exchange programme…

  8. Principles of scientific research team formation and evolution

    PubMed Central

    Milojević, Staša

    2014-01-01

    Research teams are the fundamental social unit of science, and yet there is currently no model that describes their basic property: size. In most fields, teams have grown significantly in recent decades. We show that this is partly due to the change in the character of team size distribution. We explain these changes with a comprehensive yet straightforward model of how teams of different sizes emerge and grow. This model accurately reproduces the evolution of empirical team size distribution over the period of 50 y. The modeling reveals that there are two modes of knowledge production. The first and more fundamental mode employs relatively small, “core” teams. Core teams form by a Poisson process and produce a Poisson distribution of team sizes in which larger teams are exceedingly rare. The second mode employs “extended” teams, which started as core teams, but subsequently accumulated new members proportional to the past productivity of their members. Given time, this mode gives rise to a power-law tail of large teams (10–1,000 members), which features in many fields today. Based on this model, we construct an analytical functional form that allows the contribution of different modes of authorship to be determined directly from the data and is applicable to any field. The model also offers a solid foundation for studying other social aspects of science, such as productivity and collaboration. PMID:24591626
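    The two modes of knowledge production described above can be sketched numerically. The sketch below is a hypothetical toy version, not the authors' fitted model: core-team sizes are drawn from a (shifted) Poisson distribution, while extended teams additionally gain members with probability proportional to their current size, a simple stand-in for accumulation proportional to past productivity.

```python
import math
import random

random.seed(42)

def sample_core_size(lam=3.0):
    """Draw a team size from a shifted Poisson (Poisson(lam) + 1) using
    Knuth's multiplication algorithm, so every team has at least one member."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k

def grow_extended(size, steps=200, alpha=0.01):
    """Each step the team gains a member with probability ~ alpha * size
    (a preferential-attachment stand-in for productivity-driven growth)."""
    for _ in range(steps):
        if random.random() < min(1.0, alpha * size):
            size += 1
    return size

core = [sample_core_size() for _ in range(5000)]
extended = [grow_extended(sample_core_size()) for _ in range(5000)]
max_core, max_ext = max(core), max(extended)
```

    With the growth mode switched on, the largest extended teams far exceed anything the pure Poisson mode produces, mirroring the power-law tail of large teams the paper reports.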

  9. Principles of scientific research team formation and evolution.

    PubMed

    Milojević, Staša

    2014-03-18

    Research teams are the fundamental social unit of science, and yet there is currently no model that describes their basic property: size. In most fields, teams have grown significantly in recent decades. We show that this is partly due to the change in the character of team size distribution. We explain these changes with a comprehensive yet straightforward model of how teams of different sizes emerge and grow. This model accurately reproduces the evolution of empirical team size distribution over the period of 50 y. The modeling reveals that there are two modes of knowledge production. The first and more fundamental mode employs relatively small, "core" teams. Core teams form by a Poisson process and produce a Poisson distribution of team sizes in which larger teams are exceedingly rare. The second mode employs "extended" teams, which started as core teams, but subsequently accumulated new members proportional to the past productivity of their members. Given time, this mode gives rise to a power-law tail of large teams (10-1,000 members), which features in many fields today. Based on this model, we construct an analytical functional form that allows the contribution of different modes of authorship to be determined directly from the data and is applicable to any field. The model also offers a solid foundation for studying other social aspects of science, such as productivity and collaboration.

  10. Development and test of a model for designing interactive CD-ROMs for teaching nursing skills.

    PubMed

    Jeffries, P R

    2000-01-01

    The use of interactive multimedia is well documented in the education literature as a medium for learning. Many schools of nursing and healthcare agencies purchase commercially-made CD-ROM products, and, in other cases, educators develop their own. Since nurses are increasingly designing CD-ROMs, they must be aware of the instructional design needed to develop comprehensive and effective CD-ROMs that do not compromise the quality of education. This article describes a process for developing and testing an interactive, multimedia CD-ROM on oral medication administration, using an instructional design model based on Chickering and Gamson's Principles of Good Practices in Education. Results from testing the model are reported. The findings can be used to guide the work of nurse educators who are interested in developing educational software.

  11. Application of lean manufacturing techniques in the Emergency Department.

    PubMed

    Dickson, Eric W; Singh, Sabi; Cheung, Dickson S; Wyatt, Christopher C; Nugent, Andrew S

    2009-08-01

    "Lean" is a set of principles and techniques that drive organizations to continually add value to the product they deliver by enhancing process steps that are necessary, relevant, and valuable while eliminating those that fail to add value. Lean has been used in manufacturing for decades and has been associated with enhanced product quality and overall corporate success. To evaluate whether the adoption of Lean principles by an Emergency Department (ED) improves the value of emergency care delivered. Beginning in December 2005, we implemented a variety of Lean techniques in an effort to enhance patient and staff satisfaction. The implementation followed a six-step process of Lean education, ED observation, patient flow analysis, process redesign, new process testing, and full implementation. Process redesign focused on generating improvement ideas from frontline workers across all departmental units. Value-based and operational outcome measures, including patient satisfaction, expense per patient, ED length of stay (LOS), and patient volume were compared for calendar year 2005 (pre-Lean) and periodically after 2006 (post-Lean). Patient visits increased by 9.23% in 2006. Despite this increase, LOS decreased slightly and patient satisfaction increased significantly without raising the inflation adjusted cost per patient. Lean improved the value of the care we delivered to our patients. Generating and instituting ideas from our frontline providers have been the key to the success of our Lean program. Although Lean represents a fundamental change in the way we think of delivering care, the specific process changes we employed tended to be simple, small procedure modifications specific to our unique people, process, and place. We, therefore, believe that institutions or departments aspiring to adopt Lean should focus on the core principles of Lean rather than on emulating specific process changes made at other institutions.

  12. Using the NIATx Model to Implement User-Centered Design of Technology for Older Adults.

    PubMed

    Gustafson, David H; Maus, Adam; Judkins, Julianne; Dinauer, Susan; Isham, Andrew; Johnson, Roberta; Landucci, Gina; Atwood, Amy K

    2016-01-14

    What models can effectively guide the creation of eHealth and mHealth technologies? This paper describes the use of the NIATx model as a framework for the user-centered design of a new technology for older adults. The NIATx model is a simple framework of process improvement based on the following principles derived from an analysis of decades of research from various industries about why some projects fail and others succeed: (1) Understand and involve the customer; (2) fix key problems; (3) pick an influential change leader; (4) get ideas from outside the field; (5) use rapid-cycle testing. This paper describes the use of these principles in technology development, the strengths and challenges of using this approach in this context, and lessons learned from the process. Overall, the NIATx model enabled us to produce a user-focused technology that the anecdotal evidence available so far suggests is engaging and useful to older adults. The first and fourth principles were especially important in developing the technology; the fourth proved the most challenging to use.

  13. Using the NIATx Model to Implement User-Centered Design of Technology for Older Adults

    PubMed Central

    Maus, Adam; Judkins, Julianne; Dinauer, Susan; Isham, Andrew; Johnson, Roberta; Landucci, Gina; Atwood, Amy K

    2016-01-01

    What models can effectively guide the creation of eHealth and mHealth technologies? This paper describes the use of the NIATx model as a framework for the user-centered design of a new technology for older adults. The NIATx model is a simple framework of process improvement based on the following principles derived from an analysis of decades of research from various industries about why some projects fail and others succeed: (1) Understand and involve the customer; (2) fix key problems; (3) pick an influential change leader; (4) get ideas from outside the field; (5) use rapid-cycle testing. This paper describes the use of these principles in technology development, the strengths and challenges of using this approach in this context, and lessons learned from the process. Overall, the NIATx model enabled us to produce a user-focused technology that the anecdotal evidence available so far suggests is engaging and useful to older adults. The first and fourth principles were especially important in developing the technology; the fourth proved the most challenging to use. PMID:27025985

  14. Worrying trends in econophysics

    NASA Astrophysics Data System (ADS)

    Gallegati, Mauro; Keen, Steve; Lux, Thomas; Ormerod, Paul

    2006-10-01

    Econophysics has already made a number of important empirical contributions to our understanding of the social and economic world. These fall mainly into the areas of finance and industrial economics, where in each case there is a large amount of reasonably well-defined data. More recently, Econophysics has also begun to tackle other areas of economics where data is much more sparse and much less reliable. In addition, econophysicists have attempted to apply the theoretical approach of statistical physics to try to understand empirical findings. Our concerns are fourfold. First, a lack of awareness of work that has been done within economics itself. Second, resistance to more rigorous and robust statistical methodology. Third, the belief that universal empirical regularities can be found in many areas of economic activity. Fourth, the theoretical models which are being used to explain empirical phenomena. The latter point is of particular concern. Essentially, the models are based upon models of statistical physics in which energy is conserved in exchange processes. There are examples in economics where the principle of conservation may be a reasonable approximation to reality, such as primitive hunter-gatherer societies. But in the industrialised capitalist economies, income is most definitely not conserved. The process of production and not exchange is responsible for this. Models which focus purely on exchange and not on production cannot by definition offer a realistic description of the generation of income in the capitalist, industrialised economies.

  15. Traditional Chinese medicine on the effects of low-intensity laser irradiation on cells

    NASA Astrophysics Data System (ADS)

    Liu, Timon C.; Duan, Rui; Li, Yan; Cai, Xiongwei

    2002-04-01

    In a previous paper, process-specific times (PSTs) were defined using molecular reaction dynamics and the time quantum theory established by TCY Liu et al., and the changes of the PSTs of two weakly nonlinearly coupled bio-processes were shown to be parallel, which is called the time parallel principle (TPP). The PST of a physiological process (PP) is called its physiological time (PT). After comparing the PTs of two PPs with their Yin-Yang property in traditional Chinese medicine (TCM), the PST model of Yin and Yang (YPTM) was put forward: of two related processes, the process with the smaller PST is Yin and the other is Yang. The Yin-Yang parallel principle (YPP), the fundamental principle of TCM, was then formulated in terms of the YPTM and the TPP. In this paper, we apply it to study TCM accounts of the effects of low-intensity laser irradiation on cells and successfully explain the observed phenomena.

  16. Development of a pheromone elution rate physical model

    USDA-ARS?s Scientific Manuscript database

    A first principle modeling approach is applied to available data describing the elution of semiochemicals from pheromone dispensers. These data include field data for 27 products developed by several manufacturers, including homemade devices, as well as laboratory data collected on three semiochemi...

  17. Designing for Productive Adaptations of Curriculum Interventions

    ERIC Educational Resources Information Center

    Debarger, Angela Haydel; Choppin, Jeffrey; Beauvineau, Yves; Moorthy, Savitha

    2013-01-01

    Productive adaptations at the classroom level are evidence-based curriculum adaptations that are responsive to the demands of a particular classroom context and still consistent with the core design principles and intentions of a curriculum intervention. The model of design-based implementation research (DBIR) offers insights into complexities and…

  18. Constraint-Driven Software Design: An Escape from the Waterfall Model.

    ERIC Educational Resources Information Center

    de Hoog, Robert; And Others

    1994-01-01

    Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…

  19. [Organic waste treatment by earthworm vermicomposting and larvae bioconversion: review and perspective].

    PubMed

    Zhang, Zhi-jian; Liu, Meng; Zhu, Jun

    2013-05-01

    There is growing attention to the environmental pollution and the loss of potential resource regeneration caused by poor handling of organic wastes, while earthworm vermicomposting and larvae bioconversion are well known as two promising biotechnologies for sustainable waste treatment, in which earthworms or housefly larvae convert organic wastes into humus-like material together with value-added worm products. Taking the earthworm (Eisenia foetida) and housefly larvae (Musca domestica) as model species, this work illustrates the fundamental definitions and principles, operational processes, technical mechanisms, main factors, and biochemical features of the organisms in these two technologies. Integrating the physical and biochemical mechanisms, the processes of biomass conversion, intestinal digestion, enzyme degradation, and microflora decomposition are comprehensively reviewed for waste treatments aiming at waste reduction, value addition, and stabilization.

  20. On the Effects of Bremsstrahlung Radiation During Energetic Electron Precipitation

    NASA Astrophysics Data System (ADS)

    Xu, Wei; Marshall, Robert A.; Fang, Xiaohua; Turunen, Esa; Kero, Antti

    2018-01-01

    Precipitation of energetic particles into the Earth's atmosphere can significantly change the properties, dynamics, as well as the chemical composition of the upper and middle atmosphere. In this paper, using Monte Carlo models, we simulate, from first principles, the interaction of monoenergetic beams of precipitating electrons with the atmosphere, with particular emphasis on the process of bremsstrahlung radiation and its resultant ionization production and atmospheric effects. The pitch angle dependence of the ionization rate profile has been quantified: the altitude of peak ionization rate depends on the pitch angle by a few kilometers. We also demonstrate that the transport of precipitating electron energy in the form of bremsstrahlung photons leads to ionization at altitudes significantly lower than the direct impact ionization, as low as ˜20 km for 1 MeV precipitating electrons. Moreover, chemical modeling results suggest that the chemical effects in the atmosphere due to bremsstrahlung-induced ionization production during energetic electron precipitation are likely insignificant.

  1. Semiconductor nanocrystal quantum dot synthesis approaches towards large-scale industrial production for energy applications

    DOE PAGES

    Hu, Michael Z.; Zhu, Ting

    2015-12-04

    This study reviews the experimental synthesis and engineering developments focused on green approaches and large-scale production routes for quantum dots, illustrating fundamental process engineering principles. In contrast to the small-scale hot-injection method, our discussion focuses on the non-injection route, which can be scaled up with engineered stirred-tank reactors. In addition, applications that demand quantum dots as "commodity" chemicals are discussed, including solar cells and solid-state lighting.

  2. Authentication of Closely Related Fish and Derived Fish Products Using Tandem Mass Spectrometry and Spectral Library Matching.

    PubMed

    Nessen, Merel A; van der Zwaan, Dennis J; Grevers, Sander; Dalebout, Hans; Staats, Martijn; Kok, Esther; Palmblad, Magnus

    2016-05-11

    Proteomics methodology has seen increased application in food authentication, including tandem mass spectrometry of targeted species-specific peptides in raw, processed, or mixed food products. We have previously described an alternative principle that uses untargeted data acquisition and spectral library matching, essentially spectral counting, to compare and identify samples without the need for genomic sequence information in food species populations. Here, we present an interlaboratory comparison demonstrating how a method based on this principle performs in a realistic context. We also increasingly challenge the method by using data from different types of mass spectrometers, by trying to distinguish closely related and commercially important flatfish, and by analyzing heavily contaminated samples. The method was found to be robust in different laboratories, and 94-97% of the analyzed samples were correctly identified, including all processed and contaminated samples.
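    The matching principle can be illustrated with a toy sketch (hypothetical data, not the study's spectra or software): each species is represented by a binned intensity vector, and an unknown sample is assigned to the library entry with the highest cosine similarity.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length intensity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Invented 5-bin "spectra" standing in for real tandem mass spectra.
library = {
    "sole":   [5.0, 0.1, 3.0, 0.0, 1.0],
    "plaice": [0.2, 4.0, 0.1, 2.5, 0.3],
}
unknown = [4.5, 0.3, 2.8, 0.1, 0.9]   # noisy spectrum of one library species

best = max(library, key=lambda species: cosine(unknown, library[species]))
```

    Real spectral library matching aggregates many thousands of such spectrum-to-spectrum comparisons across a whole run (essentially spectral counting), which is what makes the method robust to processing and contamination.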

  3. Implementation of a Goal-Based Systems Engineering Process Using the Systems Modeling Language (SysML)

    NASA Technical Reports Server (NTRS)

    Breckenridge, Jonathan T.; Johnson, Stephen B.

    2013-01-01

    This paper describes the core framework used to implement a Goal-Function Tree (GFT) based systems engineering process using the Systems Modeling Language (SysML). It defines a set of principles that build upon the theoretical approach described in the InfoTech 2013 ISHM paper "Goal-Function Tree Modeling for Systems Engineering and Fault Management" presented by Dr. Stephen B. Johnson. The principles in this paper describe how the SysML language is extended as a baseline in order to hierarchically describe a system, describe that system functionally within success space, and allocate detection mechanisms to success functions for system protection.

  4. A Heat and Mass Transfer Model of a Silicon Pilot Furnace

    NASA Astrophysics Data System (ADS)

    Sloman, Benjamin M.; Please, Colin P.; Van Gorder, Robert A.; Valderhaug, Aasgeir M.; Birkeland, Rolf G.; Wegge, Harald

    2017-10-01

    The most common technological route for metallurgical silicon production is to feed quartz and a carbon source (e.g., coal, coke, or charcoal) into submerged-arc furnaces, which use electrodes as electrical conductors. We develop a mathematical model of a silicon furnace. A continuum approach is taken, and we derive from first principles the equations governing the time evolution of chemical concentrations, gas partial pressures, velocity, and temperature within a one-dimensional vertical section of a furnace. Numerical simulations are obtained for this model and are shown to compare favorably with experimental results obtained using silicon pilot furnaces. A rising interface is shown to exist at the base of the charge, with motion caused by the heating of the pilot furnace. We find that more reactive carbon reduces the silicon monoxide losses, while reducing the carbon content in the raw material mixture causes greater solid and liquid material to build up in the charge region, indicative of crust formation (which can be detrimental to the silicon production process). We also comment on how the various findings could be relevant for industrial operations.

  5. Approach of the Molten Salt Chemistry for Aluminium Production: High Temperature NMR Measurements, Molecular Dynamics and DFT Calculations

    NASA Astrophysics Data System (ADS)

    Machado, Kelly; Zanghi, Didier; Sarou-Kanian, Vincent; Cadars, Sylvian; Burbano, Mario; Salanne, Mathieu; Bessada, Catherine

    In aluminum production, the electrolyte is a molten fluoride mixture typically at around 1000°C. To better understand the industrial process, a model is needed that describes the molten salts over a wide range of compositions and temperatures, so as to accurately cover all the combinations that may be encountered in an operating electrolysis vessel. The aim of this study is to describe the speciation in the electrolyte, in terms of anionic species, in the bulk material far from the electrodes. To determine the speciation in situ at high temperature in the absence of an electric field, we develop an original approach combining experimental methods such as high-temperature Nuclear Magnetic Resonance (NMR) spectroscopy with Molecular Dynamics (MD) simulation coupled with first-principles calculations based on Density Functional Theory (DFT). This approach allows the calculation of NMR parameters and their comparison with the experimental ones, providing additional validation of, and constraints on, the model used for MD. We test this approach on the model NaF-AlF3 system.

  6. Evaluate Yourself. Evaluation: Research-Based Decision Making Series, Number 9304.

    ERIC Educational Resources Information Center

    Fetterman, David M.

    This document considers both self-examination and external evaluation of gifted and talented education programs. Principles of the self-examination process are offered, noting similarities to external evaluation models. Principles of self-evaluation efforts include the importance of maintaining a nonjudgmental orientation, soliciting views from…

  7. Speckle noise removal applied to ultrasound image of carotid artery based on total least squares model.

    PubMed

    Yang, Lei; Lu, Jun; Dai, Ming; Ren, Li-Jie; Liu, Wei-Zong; Li, Zhen-Zhou; Gong, Xue-Hao

    2016-10-06

    An ultrasound image speckle noise removal method using a total least squares model is proposed and applied to images of cardiovascular structures such as the carotid artery. Based on the least squares principle, a total least squares model is established for the speckle noise removal process in cardiac ultrasound images; orthogonal projection transformation is applied to the output of the model, realizing the denoising of speckle noise in the cardiac ultrasound image. Experimental results show that the improved algorithm can greatly improve the resolution of the image and meet the needs of clinical diagnosis and treatment of the cardiovascular system of the head and neck. Furthermore, success in imaging the carotid arteries has strong implications for neurological complications such as stroke.
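    As a minimal illustration of the total-least-squares idea (a toy sketch, not the paper's speckle filter), the leading right singular vector of centred data gives a TLS fit that accounts for noise in all coordinates, and denoising amounts to orthogonally projecting each noisy point onto that fitted subspace:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
clean = np.column_stack([t, 2.0 * t])             # points on the line y = 2x
noisy = clean + rng.normal(scale=0.05, size=clean.shape)

# TLS fit: the right singular vector with the largest singular value of the
# centred data minimizes the total orthogonal (not vertical) residuals.
mean = noisy.mean(axis=0)
centred = noisy - mean
_, _, vt = np.linalg.svd(centred, full_matrices=False)
direction = vt[0]

# Orthogonal projection of every noisy point onto the fitted TLS line.
denoised = mean + np.outer(centred @ direction, direction)

rms_before = float(np.sqrt(np.mean((noisy - clean) ** 2)))
rms_after = float(np.sqrt(np.mean((denoised - clean) ** 2)))
```

    Projection discards the noise component orthogonal to the fitted line, so the denoised points sit measurably closer to the clean signal than the noisy ones.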

  8. Capturing Knowledge In Order To Optimize The Cutting Process For Polyethylene Pipes Using Knowledge Models

    NASA Astrophysics Data System (ADS)

    Rotaru, Ionela Magdalena

    2015-09-01

    Knowledge management is a powerful instrument. Areas where knowledge-based modelling can be applied range from business and industry to government and education. Companies engage in efforts to restructure their databases according to knowledge management principles, recognizing in them a guarantee of models that consist only of relevant and sustainable knowledge able to bring value to the company. This paper presents a theoretical model of what it means to optimize the cutting of polyethylene pipes, bringing together two important engineering fields, metal cutting and the gas industry, in order to optimize the polyethylene-cutting part of the butt-fusion welding process for polyethylene pipes. The entire approach is shaped by the principles of knowledge management. The study was made in collaboration with companies operating in the field.

  9. Understanding Parental Monitoring through Analysis of Monitoring Episodes in Context

    ERIC Educational Resources Information Center

    Hayes, Louise; Hudson, Alan; Matthews, Jan

    2007-01-01

    A model of monitoring interactions was proposed that is based on behavioural principles and places episodic parent-adolescent interactions at the centre of analysis for monitoring. The process-monitoring model contends that monitoring is an interactive process between parents and their adolescents, nested within a social setting. In the model it…

  10. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    PubMed

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step in implementing the QbD concept to establish the design space and to define the proven acceptable ranges (PARs) for critical process parameters (CPPs). In this study, we present the characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and a multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study, with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approach were used to establish the region of process operating conditions where all attributes meet their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which are applied in the manufacturing control strategy. Experience from the 10,000 L manufacturing-scale process validation, including 64 continued process verification batches, indicates that the CPPs remain in a state of control and within the established PARs. The end-product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to characterize other production processes and to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers. Biotechnol. Prog., 32:799-812, 2016.
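    The Bayesian predictive idea behind defining a PAR can be sketched as follows. Everything below (the quadratic response, the coefficient posterior, and the specification limits) is invented for illustration; only the logic, keeping the settings where the posterior predictive probability of meeting all specifications is high, reflects the approach described.

```python
import numpy as np

rng = np.random.default_rng(1)

def predict_cqa(x, coeffs):
    """Hypothetical quadratic response of one CQA to a scaled CPP setting x."""
    a, b, c = coeffs
    return a + b * x + c * x ** 2

# Pretend posterior draws of the model coefficients from a Bayesian fit.
draws = rng.normal(loc=[1.0, 0.5, -2.0], scale=[0.05, 0.05, 0.1], size=(4000, 3))

spec_lo, spec_hi = 0.8, 1.2               # invented specification limits
settings = np.linspace(-1.0, 1.0, 41)     # candidate CPP settings (scaled)

prob_in_spec = []
for x in settings:
    y = predict_cqa(x, draws.T)           # predictive sample at setting x
    prob_in_spec.append(float(np.mean((y > spec_lo) & (y < spec_hi))))

# The proven acceptable range keeps settings with high predictive probability.
par = [float(x) for x, p in zip(settings, prob_in_spec) if p >= 0.95]
```

    Unlike an overlapping contour plot, which only marks where the mean prediction meets specification, this probability-based region shrinks automatically where model uncertainty is large.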

  11. Reinventing the role of consumer research in today's open innovation ecosystem.

    PubMed

    Moskowitz, Howard R; Saguy, I Sam

    2013-01-01

    Consumer research (CR) has played a key role in the food and beverage industry. Emerging from laboratory product tests, it has evolved into a corporate testing service that measures consumer reactions to products/concepts using a wide range of analyses and metrics. We propose that CR transform itself in light of accelerating knowledge expansion, mounting global and local economic pressure on corporations, and changing consumer needs. The transformation moves from traditional testing toward creating profoundly new knowledge of the product and understanding of the corporation's current and future customers. CR's tasks will involve contributing to and expanding science, applying open innovation principles, and driving consumer-centric innovation. We identify seven paradigm shifts that will change CR, namely: a different way of working--from testing to open sourcing; from good corporate citizen to change leader; an open new product development (NPD) process; new management roles and cultures; universities and industry, new education curricula, and cooperation; from a battle over control to a "sustainable sharing is winning" (SiW) model; and the central role of design. This integrative, innovative CR requires the implementation of three recommendations: start the change process now and fine-tune along the way; create a new marketing/CR department; and educate and professionalize. These recommendations provide the blueprint for jump-starting the process and call for immediate action to deal with the severity of the crises facing the CR profession.

  12. An Expert Teacher's Thinking and Teaching and Instructional Design Models and Principles: An Ethnographic Study.

    ERIC Educational Resources Information Center

    Moallem, Mahnaz

    1998-01-01

    Examines an expert teacher's thinking and teaching processes in order to link them to instructional-design procedures. Findings suggest that there were fundamental differences between the teacher's thinking and teaching processes and microinstructional design models. (Author/AEF)

  13. Electrostatic powder coating: Principles and pharmaceutical applications.

    PubMed

    Prasad, Leena Kumari; McGinity, James W; Williams, Robert O

    2016-05-30

    A majority of pharmaceutical powders are insulating materials that have a tendency to accumulate charge. This phenomenon has contributed to safety hazards and issues during powder handling and processing. However, increased understanding of this phenomenon has led to greater control of processing and product performance. More recently, the charging of pharmaceutical powders has been employed to adopt electrostatic powder coating as a pharmaceutical process. Electrostatic powder coating is a mature technology used in the finishing industry, and much of that knowledge applies to its use in pharmaceutical applications. This review will serve to summarize the principles of electrostatic powder coating and highlight some of the research conducted on its use for the preparation of pharmaceutical dosage forms. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. System-wide hybrid MPC-PID control of a continuous pharmaceutical tablet manufacturing process via direct compaction.

    PubMed

    Singh, Ravendra; Ierapetritou, Marianthi; Ramachandran, Rohit

    2013-11-01

    The next generation of QbD based pharmaceutical products will be manufactured through continuous processing. This will allow the integration of online/inline monitoring tools, coupled with efficient, advanced model-based feedback control systems, to achieve precise control of process variables, so that the predefined product quality can be achieved consistently. The direct compaction process considered in this study is highly interactive and involves time delays for a number of process variables due to sensor placements, process equipment dimensions, and the flow characteristics of the solid material. A simple feedback regulatory control system (e.g., PI(D)) by itself may not be sufficient to achieve the tight process control that is mandated by regulatory authorities. The process presented herein comprises coupled dynamics involving slow and fast responses, indicating the need for a hybrid control scheme such as a combined MPC-PID scheme. In this manuscript, an efficient system-wide hybrid control strategy for an integrated continuous pharmaceutical tablet manufacturing process via direct compaction has been designed. The designed control system is a hybrid MPC-PID scheme. An effective controller parameter tuning strategy, involving an ITAE method coupled with an optimization strategy, has been used for tuning both the MPC and PID parameters. The designed hybrid control system has been implemented in a first-principles model-based flowsheet that was simulated in gPROMS (Process System Enterprise). Results demonstrate enhanced performance of critical quality attributes (CQAs) under the hybrid control scheme compared to PID-only or MPC-only control schemes, illustrating the potential of a hybrid control scheme in improving pharmaceutical manufacturing operations. Copyright © 2013 Elsevier B.V. All rights reserved.
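    The ITAE criterion used in the tuning step is simple to state: integrate time multiplied by the absolute control error over a setpoint step, and prefer the gain set with the lower cost. A minimal sketch follows, using an invented first-order-plus-dead-time plant rather than the paper's gPROMS flowsheet; all gains, time constants, and delays are illustrative.

```python
# ITAE for a PID loop on a hypothetical first-order-plus-dead-time plant,
# dy/dt = (u_delayed - y) / tau, simulated with forward-Euler steps.
def itae_cost(kp, ki, kd, dt=0.05, t_end=20.0, delay=0.5, tau=2.0):
    n_delay = int(delay / dt)
    u_hist = [0.0] * (n_delay + 1)   # actuator history implements the dead time
    y, integ, prev_e, cost, t = 0.0, 0.0, 1.0, 0.0, 0.0
    while t < t_end:
        e = 1.0 - y                       # error against a unit setpoint step
        integ += e * dt
        deriv = (e - prev_e) / dt
        u = kp * e + ki * integ + kd * deriv
        u_hist.append(u)
        y += dt * (u_hist.pop(0) - y) / tau   # plant sees the delayed input
        cost += t * abs(e) * dt               # time-weighted absolute error
        prev_e, t = e, t + dt
    return cost

# A sluggish gain set accumulates far more time-weighted error than a
# reasonably tuned one, which is what an ITAE-based optimizer exploits.
print(itae_cost(0.5, 0.05, 0.0), itae_cost(2.0, 0.8, 0.2))
```

    An optimizer would minimize `itae_cost` over (kp, ki, kd); in the paper the same idea is applied to both the PID gains and the MPC weights.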

  15. Coproduction of healthcare service.

    PubMed

    Batalden, Maren; Batalden, Paul; Margolis, Peter; Seid, Michael; Armstrong, Gail; Opipari-Arrigan, Lisa; Hartung, Hans

    2016-07-01

    Efforts to ensure effective participation of patients in healthcare are called by many names-patient centredness, patient engagement, patient experience. Improvement initiatives in this domain often resemble the efforts of manufacturers to engage consumers in designing and marketing products. Services, however, are fundamentally different from products; unlike goods, services are always 'coproduced'. Failure to recognise this unique character of a service and its implications may limit our success in partnering with patients to improve health care. We trace a partial history of the coproduction concept, present a model of healthcare service coproduction and explore its application as a design principle in three healthcare service delivery innovations. We use the principle to examine the roles, relationships and aims of this interdependent work. We explore the principle's implications and challenges for health professional development, for service delivery system design and for understanding and measuring benefit in healthcare services. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  16. Software-engineering challenges of building and deploying reusable problem solvers.

    PubMed

    O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A

    2009-11-01

    Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.

  18. The design and scale-up of spray dried particle delivery systems.

    PubMed

    Al-Khattawi, Ali; Bayly, Andrew; Phillips, Andrew; Wilson, David

    2018-01-01

    The rising demand for pharmaceutical particles with tailored physicochemical properties has opened new markets for spray drying, especially for solubility enhancement, improving inhalation medicines and stabilization of biopharmaceuticals. Despite this, the spray drying literature is scattered and often does not address the principles underpinning robust development of pharmaceuticals. It is therefore necessary to present a clearer picture of the field and highlight the factors influencing particle design and scale-up. Areas covered: The review presents a systematic analysis of the trends in development of particle delivery systems using spray drying. This is followed by exploring the mechanisms governing particle formation in the process stages. Particle design factors, including those of equipment configurations and feed/process attributes, are highlighted. Finally, the review summarises the current industrial approaches for upscaling pharmaceutical spray drying. Expert opinion: Spray drying provides the ability to design particles of the desired functionality. This greatly benefits the pharmaceutical sector, especially as product specifications are becoming more encompassing and exacting. One of the biggest barriers to product translation remains that of scale-up/scale-down. A shift from trial-and-error approaches to model-based particle design helps to enhance control over product properties. To this end, process innovations and advanced manufacturing technologies are particularly welcomed.

  19. The Perceived Implications of an Outsourcing Model on Governance within British Columbia Provincial Parks in Canada: A Quantitative Study

    NASA Astrophysics Data System (ADS)

    Eagles, Paul; Havitz, Mark; McCutcheon, Bonnie; Buteau-Duitschaever, Windekind; Glover, Troy

    2010-06-01

    Good governance is of paramount importance to the success of parks and protected areas. This research used a questionnaire covering 10 principles of governance to evaluate the outsourcing model used by British Columbia Provincial Parks, where profit-making corporations provide all front-country visitor services. A total of 246 respondents representing five stakeholder groups evaluated the model according to each principle, using an online survey. Principal component analysis resulted in two of the 10 principles (equity and effectiveness) each being split into two categories, leading to 12 governance principles. Five of the 12 criteria received scores towards good governance: effectiveness outcome; equity general; strategic vision; responsiveness; and effectiveness process. One criterion, public participation, was at the neutral point. Six criteria received scores below neutral, towards weak governance: transparency; rule of law; accountability; efficiency; consensus orientation; and equity finance. The five stakeholder groups differed significantly on 10 of the 12 principles (P < .05); the two exceptions were efficiency and effectiveness process. Seven of the 12 criteria followed a pattern wherein government employees and contractors reported positive scores, visitors and representatives of NGOs reported more negative scores, and nearby residents reported mid-range scores. Three criteria had government employees and contractors reporting the most positive scores, residents and visitors the most negative scores, and NGO respondents reporting mid-range scores. This research found evidence that perceptions of governance related to this outsourcing model differed significantly amongst various constituent groups.
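    The component-extraction step reported above can be illustrated with a toy principal component analysis on simulated Likert-style responses: build the covariance matrix of the centred items and extract the leading eigenvector by power iteration. The four items and two latent attitudes below are invented; the study analysed real survey data with standard software.

```python
import random

random.seed(1)

# Simulated responses on four governance items: two load on a latent
# "effectiveness" attitude, two on a latent "transparency" attitude.
def respondent():
    eff = random.gauss(4.0, 0.8)
    tra = random.gauss(2.5, 0.4)
    return [eff + random.gauss(0, 0.3), eff + random.gauss(0, 0.3),
            tra + random.gauss(0, 0.3), tra + random.gauss(0, 0.3)]

data = [respondent() for _ in range(246)]   # same n as the study's sample

# Covariance matrix of the centred responses.
m = len(data[0])
means = [sum(r[j] for r in data) / len(data) for j in range(m)]
cov = [[sum((r[i] - means[i]) * (r[j] - means[j]) for r in data) / (len(data) - 1)
        for j in range(m)] for i in range(m)]

# Leading principal component by power iteration; it should load mainly on
# the two effectiveness items, the higher-variance theme.
v = [1.0] * m
for _ in range(200):
    w = [sum(cov[i][j] * v[j] for j in range(m)) for i in range(m)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]
print("first-component loadings:", [round(x, 2) for x in v])
```

    When items cluster this way, a principle that was assumed unidimensional can split into two components, which is the pattern the study reports for equity and effectiveness.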

  20. Lean Management—The Journey from Toyota to Healthcare

    PubMed Central

    Teich, Sorin T.; Faddoul, Fady F.

    2013-01-01

    The evolution of production systems is tightly linked to the story of Toyota Motor Company (TMC), which has its roots around 1918. The term “lean” was coined in 1990 following the exploration of the Toyota model that led to the “transference” thesis sustaining the concept that manufacturing problems and technologies are universal problems faced by management and that these concepts can be emulated in non-Japanese enterprises. Lean is a multi-faceted concept and requires organizations to exert effort along several dimensions simultaneously; some consider a successful implementation either achieving major strategic components of lean, implementing practices to support operational aspects, or providing evidence that the improvements are sustainable in the long term. The article explores challenges and opportunities faced by organizations that intend to incorporate lean management principles and presents the specific context of the healthcare industry. Finally, the concepts of “essential few” and customer value are illustrated through a simple example of process change following lean principles, which was implemented in a dental school in the United States. PMID:23908857

  2. GEOLAND2 global LAI, FAPAR Essential Climate Variables for terrestrial carbon modeling: principles and validation

    NASA Astrophysics Data System (ADS)

    Baret, F.; Weiss, M.; Lacaze, R.; Camacho, F.; Smets, B.; Pacholczyk, P.; Makhmara, H.

    2010-12-01

    LAI and fAPAR are recognized as Essential Climate Variables providing key information for the understanding and modeling of canopy functioning. Global remote sensing observations at medium resolution have been routinely acquired since the 1980s, mainly with the AVHRR, SEAWIFS, VEGETATION, MODIS and MERIS sensors. Several operational products have been derived and provide global maps of LAI and fAPAR at daily to monthly time steps. Inter-comparison between the MODIS, CYCLOPES, GLOBCARBON and JRC-FAPAR products showed generally consistent seasonality, while large differences in magnitude and smoothness were observed. One of the objectives of the GEOLAND2 European project is to develop such core products to be used in a range of application services, including carbon monitoring. Rather than generating an additional product from scratch, version 1 of the GEOLAND2 products capitalized on existing products, combining them to retain their strengths and limit their weaknesses. For these reasons, the MODIS and CYCLOPES products were selected, since they both include LAI and fAPAR while having relatively close temporal sampling intervals (8 to 10 days). GLOBCARBON products were not used because their monthly time step is too long, inducing large uncertainties in the description of seasonality. JRC-FAPAR was not selected either, in order to preserve better consistency between the LAI and fAPAR products. The MODIS and CYCLOPES products were then linearly combined to take advantage of the good performance of CYCLOPES products for low to medium values of LAI and fAPAR while benefiting from the better MODIS performance for the highest LAI values. A training database representative of the global variability of vegetation types and conditions was thus built. A back-propagation neural network was then calibrated to estimate the new LAI and fAPAR products from VEGETATION preprocessed observations. Similarly, the vegetation cover fraction (fCover) was also derived by scaling the original CYCLOPES fCover products. Validation results achieved following the principles proposed by CEOS-LPV show that the new product, called GEOV1, behaves as expected, with good performance over the whole range of LAI and fAPAR in a temporally smooth and spatially consistent manner. These products will be processed and delivered by VITO in near real time at 1 km spatial resolution and a 10-day frequency using a pre-operational production quality tracking system. The entire VEGETATION archive from 1999 onwards will be processed to provide a consistent time series over both VEGETATION sensors at the same spatial and temporal sampling. A climatology of products computed over the VEGETATION period will also be delivered at the same spatial and temporal sampling, showing average values, between-year variability and possible trends over the decade. Finally, the VEGETATION-derived time series starting in 1999 will be completed with consistent products at 4 km spatial resolution derived from the NOAA/AVHRR series to cover the 1981-2010 period.
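    The linear combination of products described above can be sketched as an LAI-dependent blend: weight CYCLOPES fully at low LAI, MODIS fully at high LAI, with a linear transition in between. The break points below are illustrative, not the weighting actually used for GEOV1.

```python
def fuse_lai(lai_cyclopes, lai_modis, low=2.0, high=4.0):
    """Blend two LAI products: CYCLOPES below `low`, MODIS above `high`,
    with a linear transition in between (break points are illustrative)."""
    first_guess = 0.5 * (lai_cyclopes + lai_modis)   # decide the regime
    if first_guess <= low:
        w_modis = 0.0
    elif first_guess >= high:
        w_modis = 1.0
    else:
        w_modis = (first_guess - low) / (high - low)
    return (1.0 - w_modis) * lai_cyclopes + w_modis * lai_modis

print(fuse_lai(1.0, 1.4))   # sparse canopy: the CYCLOPES value is kept
print(fuse_lai(5.0, 6.2))   # dense canopy: the MODIS value is kept
```

    In the actual production chain, the fused values served as training targets for a back-propagation neural network driven by VEGETATION observations, rather than being applied per pixel like this.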

  3. Neural reuse of action perception circuits for language, concepts and communication.

    PubMed

    Pulvermüller, Friedemann

    2018-01-01

    Neurocognitive and neurolinguistic theories make explicit statements relating specialized cognitive and linguistic processes to specific brain loci. These linking hypotheses are in need of neurobiological justification and explanation. Recent mathematical models of human language mechanisms, constrained by fundamental neuroscience principles and established knowledge about comparative neuroanatomy, offer explanations for where, when and how language is processed in the human brain. In these models, network structure and connectivity along with action- and perception-induced correlation of neuronal activity co-determine neurocognitive mechanisms. Language learning leads to the formation of action perception circuits (APCs) with specific distributions across cortical areas. Cognitive and linguistic processes such as speech production, comprehension, verbal working memory and prediction are modelled by activity dynamics in these APCs, and combinatorial and communicative-interactive knowledge is organized in the dynamics within, and connections between, APCs. The network models and, in particular, the concept of distributionally-specific circuits can account for some previously not well understood facts about the cortical 'hubs' for semantic processing and the motor system's role in language understanding and speech sound recognition. A review of experimental data evaluates predictions of the APC model and alternative theories, also providing detailed discussion of some seemingly contradictory findings. Throughout, recent disputes about the role of mirror neurons and grounded cognition in language and communication are assessed critically. Copyright © 2017 The Author. Published by Elsevier Ltd. All rights reserved.

  4. Realization of planning design of mechanical manufacturing system by Petri net simulation model

    NASA Astrophysics Data System (ADS)

    Wu, Yanfang; Wan, Xin; Shi, Weixiang

    1991-09-01

    Planning design works out an overall, long-term plan. To guarantee that a mechanical manufacturing system (MMS) obtains maximum economic benefit, it is necessary to carry out a reasonable planning design for the system. First, some principles of planning design for MMS are introduced. Problems of production scheduling and their decision rules for computer simulation are presented, and methods for realizing each production scheduling decision rule in a Petri net model are discussed. Second, the conflict-resolution rules for conflicts arising while running the Petri net are given. Third, based on the Petri net model of the MMS, which includes part flow and tool flow, and according to the principle of minimum event time advance, a computer dynamic simulation of the Petri net model, that is, of the MMS, is realized. Finally, the simulation program is applied to a simulation example, so the scheme of a planning design for the MMS can be evaluated effectively.
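    The token game at the heart of such a simulation can be sketched in a few lines: places hold tokens, a transition is enabled when all its input places are marked, and conflicts between enabled transitions are resolved by a rule. The net below (one machine processing raw parts) and its fixed-priority conflict rule are invented stand-ins for the paper's production scheduling rules.

```python
# Minimal Petri net executor: places hold token counts; a transition is
# enabled when every input place holds at least one token. The net models
# one machine working through three raw parts.
marking = {"raw": 3, "machine_free": 1, "busy": 0, "done": 0}

transitions = {
    # name: (input places, output places), listed in priority order
    "start_job": (["raw", "machine_free"], ["busy"]),
    "finish_job": (["busy"], ["done", "machine_free"]),
}

def enabled(name):
    return all(marking[p] >= 1 for p in transitions[name][0])

def fire(name):
    ins, outs = transitions[name]
    for p in ins:
        marking[p] -= 1
    for p in outs:
        marking[p] += 1

# Run until nothing is enabled; conflicts are resolved by priority order,
# a simple stand-in for the paper's scheduling decision rules.
while True:
    choices = [t for t in transitions if enabled(t)]
    if not choices:
        break
    fire(choices[0])

print(marking)   # all three parts end up in "done"
```

    A timed simulation would attach a duration to each transition and advance the clock by the minimum pending event time, the "minimum event time advance" principle the abstract refers to.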

  5. Introspective Reasoning Models for Multistrategy Case-Based and Explanation

    DTIC Science & Technology

    1997-03-10

    symptoms and diseases to causal principles about diseases and first-principle analysis grounded in basic science. Based on research in process... the symptoms of the failure... the process which posts learning goals... a causal explanation of the failure. Secondly, the learner... In the vernacular, a "jones" is a drug habit accompanied by withdrawal symptoms. The verb "to jones"... Therefore, the story can end with the faucet for water.

  6. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method will not affect the conventional design elements and can effectively connect the requirements with design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail, in the hope of providing experience for other civil jet product designs.

  7. Real time closed loop control of an Ar and Ar/O2 plasma in an ICP

    NASA Astrophysics Data System (ADS)

    Faulkner, R.; Soberón, F.; McCarter, A.; Gahan, D.; Karkari, S.; Milosavljevic, V.; Hayden, C.; Islyaikin, A.; Law, V. J.; Hopkins, M. B.; Keville, B.; Iordanov, P.; Doherty, S.; Ringwood, J. V.

    2006-10-01

    Real time closed loop control for plasma assisted semiconductor manufacturing has been the subject of academic research for over a decade. However, due to process complexity and the lack of suitable real time metrology, progress has been elusive, and genuine real time, multi-input, multi-output (MIMO) control of a plasma assisted process has yet to be successfully implemented in an industrial setting. A "plasma parameter control strategy" must be adopted, whereby process recipes defined in terms of plasma properties, such as critical species densities, rather than input variables, such as rf power and gas flow rates, may be transferable between different chamber types. While PIC simulations and multidimensional fluid models have contributed considerably to the basic understanding of plasmas and the design of process equipment, such models require a large amount of processing time and are hence unsuitable for testing control algorithms. In contrast, linear dynamical empirical models, obtained through system identification techniques, are ideal in some respects for control design, since their computational requirements are comparatively small and their structure facilitates the application of classical control design techniques. However, such models provide little process insight and are specific to an operating point of a particular machine. An ideal first-principles based, control-oriented model would exhibit the simplicity and computational requirements of an empirical model and, in addition, despite sacrificing first-principles detail, capture enough of the essential physics and chemistry of the process to provide reasonably accurate qualitative predictions. This paper will discuss the development of such a first-principles based, control-oriented model of a laboratory inductively coupled plasma chamber. The model consists of a global model of the chemical kinetics coupled to an analytical model of power deposition. Dynamics of actuators, including mass flow controllers and the exhaust throttle, are included, and sensor characteristics are also modelled. The application of this control-oriented model to achieve multivariable closed loop control of specific species (e.g. atomic oxygen and ion density) using the actuators rf power, oxygen and argon flow rates, and pressure/exhaust flow rate in an Ar/O2 ICP will be presented.
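    A global (volume-averaged) model of the kind described reduces the plasma to a handful of balance equations. The sketch below integrates a toy electron particle balance, an ionization source proportional to absorbed power versus wall loss, to show the qualitative steady-state trend such a model predicts; every constant is illustrative, not taken from the paper's chamber.

```python
# Toy volume-averaged electron particle balance: pairs created by absorbed
# power versus a fixed wall-loss frequency. All constants are illustrative.
def electron_density(power_w, t_end=2e-3, dt=1e-7):
    e_cost = 80.0 * 1.602e-19   # energy spent per electron-ion pair (J)
    volume = 0.01               # chamber volume (m^3)
    loss_rate = 5.0e3           # effective wall-loss frequency (1/s)
    n, t = 1.0e14, 0.0          # seed density (m^-3)
    gain = power_w / (e_cost * volume)   # pairs created per m^3 per second
    while t < t_end:
        n += dt * (gain - loss_rate * n)   # forward-Euler particle balance
        t += dt
    return n

# At steady state n ~ power / (e_cost * volume * loss_rate): doubling the rf
# power doubles the density, the kind of qualitative trend a control-oriented
# global model is built to capture and a controller can exploit.
n_low, n_high = electron_density(300.0), electron_density(600.0)
print(n_low, n_high)
```

    A usable control-oriented model adds similar balances for neutral species and electron temperature plus actuator dynamics, but each stays cheap enough to run inside a control loop.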

  8. Adsorption of organic molecules on mineral surfaces studied by first-principle calculations: A review.

    PubMed

    Zhao, Hongxia; Yang, Yong; Shu, Xin; Wang, Yanwei; Ran, Qianping

    2018-04-09

    First-principle calculations, especially by density functional theory (DFT) methods, are becoming a powerful technique for studying the molecular structure and properties of organic/inorganic interfaces. This review introduces some recent examples of the study of adsorption models of organic molecules or oligomers on mineral surfaces and of interfacial properties obtained from first-principles calculations. The aim of this contribution is to inspire scientists to benefit from first-principle calculations and to apply similar strategies when studying and tailoring interfacial properties at the atomistic scale, especially those interested in the design and development of new molecules and new products. Copyright © 2017. Published by Elsevier B.V.
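    The bookkeeping behind an adsorption-energy result from such calculations is a simple difference of DFT total energies, E_ads = E(surface+molecule) - E(surface) - E(molecule), with a negative value indicating favourable adsorption. The numbers below are hypothetical placeholders, not computed values.

```python
def adsorption_energy(e_complex, e_surface, e_molecule):
    """E_ads = E(surface+molecule) - E(surface) - E(molecule), in eV.
    Negative values mean adsorption is energetically favourable."""
    return e_complex - e_surface - e_molecule

# Hypothetical DFT total energies (eV); a real study would take these from
# three separately relaxed calculations.
e_ads = adsorption_energy(-1520.34, -1510.02, -9.51)
print(round(e_ads, 2))   # negative: the molecule binds to the surface
```

    The three energies must come from calculations with the same functional, basis/cutoff, and convergence settings, since E_ads is a small difference of large totals.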

  9. Chilled storage of foods - principles

    USDA-ARS's Scientific Manuscript database

    Chilled storage is the most common method for preserving perishable foods. The consumers’ increasing demand for convenient, minimally processed foods has caused food manufacturers to increase production of refrigerated foods worldwide. This book chapter reviews the development of using low tempera...

  10. Applying Humanistic Learning Theory: The "Art" of Coaching

    ERIC Educational Resources Information Center

    Connolly, Graeme J.

    2016-01-01

    The purpose of this article is to apply specific principles of psychology to the coaching process. More specifically, it is about becoming a productive and effective coach, who positively affects the athletic careers and lives of young people.

  11. Overview of the O3M SAF GOME-2 operational atmospheric composition and UV radiation data products and data availability

    NASA Astrophysics Data System (ADS)

    Hassinen, S.; Balis, D.; Bauer, H.; Begoin, M.; Delcloo, A.; Eleftheratos, K.; Gimeno Garcia, S.; Granville, J.; Grossi, M.; Hao, N.; Hedelt, P.; Hendrick, F.; Hess, M.; Heue, K.-P.; Hovila, J.; Jønch-Sørensen, H.; Kalakoski, N.; Kiemle, S.; Kins, L.; Koukouli, M. E.; Kujanpää, J.; Lambert, J.-C.; Lerot, C.; Loyola, D.; Määttä, A.; Pedergnana, M.; Pinardi, G.; Romahn, F.; van Roozendael, M.; Lutz, R.; De Smedt, I.; Stammes, P.; Steinbrecht, W.; Tamminen, J.; Theys, N.; Tilstra, L. G.; Tuinder, O. N. E.; Valks, P.; Zerefos, C.; Zimmer, W.; Zyrichidou, I.

    2015-07-01

    The three GOME-2 instruments will provide unique and long data sets for atmospheric research and applications. The complete time period will be 2007-2022, including the period of ozone depletion as well as the beginning of ozone layer recovery. Besides ozone chemistry, the GOME-2 products are important, e.g., for air quality studies, climate modeling, policy monitoring and hazard warnings. The heritage for GOME-2 is in the ERS/GOME and Envisat/SCIAMACHY instruments. The current Level 2 (L2) data cover a wide range of products such as trace gas columns (NO2, BrO, H2CO, H2O, SO2), tropospheric columns of NO2, total ozone columns and vertical ozone profiles in high and low spatial resolution, absorbing aerosol indices from the main science channels as well as from the polarization channels (AAI, AAI-PMD), a Lambertian-equivalent reflectivity database, clear-sky and cloud-corrected UV indices and surface UV fields with different weightings and photolysis rates. The Ozone Monitoring and Atmospheric Composition Satellite Application Facility (O3M SAF) processing and data dissemination is operational and running 24/7. Data quality is guaranteed by detailed review processes for the algorithms, validation of the products, and continuous quality monitoring of the products and processing. This is an overview paper providing the O3M SAF project background, current status and future plans for the utilization of the GOME-2 data. An important focus is the provision of summaries of the GOME-2 products, including product principles and validation examples, together with sample product images. Furthermore, this paper collects the references to the detailed product algorithm and validation papers.

  12. Dynamic optimization of metabolic networks coupled with gene expression.

    PubMed

    Waldherr, Steffen; Oyarzún, Diego A; Bockmayr, Alexander

    2015-01-21

    The regulation of metabolic activity by tuning enzyme expression levels is crucial to sustain cellular growth in changing environments. Metabolic networks are often studied at steady state using constraint-based models and optimization techniques. However, metabolic adaptations driven by changes in gene expression cannot be analyzed by steady state models, as these do not account for temporal changes in biomass composition. Here we present a dynamic optimization framework that integrates the metabolic network with the dynamics of biomass production and composition. An approximation by a timescale separation leads to a coupled model of quasi-steady state constraints on the metabolic reactions, and differential equations for the substrate concentrations and biomass composition. We propose a dynamic optimization approach to determine reaction fluxes for this model, explicitly taking into account enzyme production costs and enzymatic capacity. In contrast to the established dynamic flux balance analysis, our approach allows predicting dynamic changes in both the metabolic fluxes and the biomass composition during metabolic adaptations. Discretization of the optimization problems leads to a linear program that can be efficiently solved. We applied our algorithm in two case studies: a minimal nutrient uptake network, and an abstraction of core metabolic processes in bacteria. In the minimal model, we show that the optimized uptake rates reproduce the empirical Monod growth for bacterial cultures. For the network of core metabolic processes, the dynamic optimization algorithm predicted commonly observed metabolic adaptations, such as a diauxic switch with a preference ranking for different nutrients, re-utilization of waste products after depletion of the original substrate, and metabolic adaptation to an impending nutrient depletion. These examples illustrate how dynamic adaptations of enzyme expression can be predicted solely from an optimization principle. 
Copyright © 2014 Elsevier Ltd. All rights reserved.
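    The abstract notes that discretizing the optimization problem yields a linear program. As a minimal illustration (a hypothetical three-reaction toy network, not the paper's model), flux balance at quasi-steady state can be posed and solved with an off-the-shelf LP solver:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network (not the paper's model): uptake v1, conversion v2,
# biomass-producing reaction v3, with internal metabolites A and B.
S = np.array([
    [1.0, -1.0,  0.0],   # A: made by uptake v1, consumed by conversion v2
    [0.0,  1.0, -1.0],   # B: made by v2, consumed by the biomass reaction v3
])
c = np.array([0.0, 0.0, -1.0])          # linprog minimizes, so maximize v3 via -v3
bounds = [(0.0, 10.0), (0.0, None), (0.0, None)]  # uptake capacity capped at 10

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # steady state forces v1 = v2 = v3; the uptake cap pins all at 10
```

    Dynamic FBA-style approaches solve such an LP repeatedly over time steps, updating the substrate bounds from the differential equations for substrate concentration and biomass composition.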

  13. Are We Doing Ok? Developing a Generic Process to Benchmark Career Services in Educational Institutions

    ERIC Educational Resources Information Center

    McCowan, Col; McKenzie, Malcolm

    2011-01-01

    In 2007 the Career Industry Council of Australia developed the Guiding Principles for Career Development Services and Career Information Products as one part of its strategy to produce a national quality framework for career development activities in Australia. An Australian university career service undertook an assessment process against these…

  14. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure of the information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes' maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate: H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes; for sample-space-reducing (SSR) processes, which are simple history-dependent processes associated with power-law statistics; and finally for multinomial mixture processes.
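    The degenerate functional form cited in the abstract, H(p) = -∑_i p_i log p_i, is straightforward to compute; a minimal sketch:

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i (natural log); zero-probability terms contribute nothing."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ≈ 1.386 (= log 4): uniform is maximal
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # deterministic source: zero entropy
```

    It is exactly this one formula that, per the abstract, splits into three distinct functionals once the process is history-dependent or nonergodic.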

  15. Lexical access in sign language: a computational model

    PubMed Central

    Caselli, Naomi K.; Cohen-Goldberg, Ariel M.

    2014-01-01

    Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: How many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition. PMID:24860539
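    A minimal, hypothetical sketch of the kind of spreading-activation dynamics described here (mutually inhibiting lexical neighbors competing over time); the weights, inputs, and rates are invented for illustration and this is not Chen and Mirman's actual model:

```python
import numpy as np

# Three lexical units; units 0 and 1 are "neighbors" that inhibit each other,
# unit 2 is unrelated. All parameter values are hypothetical.
W = np.array([
    [ 0.0, -0.3,  0.0],
    [-0.3,  0.0,  0.0],
    [ 0.0,  0.0,  0.0],
])
inp = np.array([1.0, 0.2, 0.2])     # bottom-up perceptual evidence favors unit 0
a = np.zeros(3)                     # activations start at rest

for _ in range(50):                 # Euler steps: input + spreading - decay, clipped to [0, 1]
    a = np.clip(a + 0.1 * (inp + W @ a - 0.5 * a), 0.0, 1.0)

print(np.argmax(a))  # unit 0 wins; its inhibited neighbor (unit 1) ends below unit 2
```

    Density effects arise in such architectures because the balance of facilitation and inhibition among neighbors shifts with the number of competitors and the time course of the input.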

  16. Ten key principles for successful health systems integration.

    PubMed

    Suter, Esther; Oelke, Nelly D; Adair, Carol E; Armitage, Gail D

    2009-01-01

    Integrated health systems are considered part of the solution to the challenge of sustaining Canada's healthcare system. This systematic literature review was undertaken to guide decision-makers and others to plan for and implement integrated health systems. This review identified 10 universal principles of successfully integrated healthcare systems that may be used by decision-makers to assist with integration efforts. These principles define key areas for restructuring and allow organizational flexibility and adaptation to local context. The literature does not contain a one-size-fits-all model or process for successful integration, nor is there a firm empirical foundation for specific integration strategies and processes.

  17. Selective laser melting of Inconel super alloy-a review

    NASA Astrophysics Data System (ADS)

    Karia, M. C.; Popat, M. A.; Sangani, K. B.

    2017-07-01

    Additive manufacturing is a relatively young technology that uses the principle of layer-by-layer addition of material in solid, liquid, or powder form to develop a component or product. The quality of the additively manufactured part is one of the challenges to be addressed, and researchers are continuously working at various levels of additive manufacturing technologies. One of the significant powder bed processes for metals is Selective Laser Melting (SLM). Laser-based processes are attracting increasing attention from researchers and industry, and the potential of this technique is yet to be fully explored. Due to its very high strength and creep resistance, Inconel is an extensively used nickel-based superalloy for manufacturing components for the aerospace, automobile, and nuclear industries. Due to its low content of aluminum and titanium, it also exhibits good fabricability. The alloy is therefore ideally suited to selective laser melting for manufacturing intricate components with high strength requirements. The selection of a suitable manufacturing process for a specific component depends on geometrical complexity, production quantity, cost, and required strength. Numerous researchers are working on aspects such as metallurgical and microstructural investigations, mechanical properties, geometrical accuracy, effects of process parameters and their optimization, and mathematical modeling. The present paper presents a comprehensive overview of the selective laser melting process for the Inconel group of alloys.

  18. On the limitations of General Circulation Climate Models

    NASA Technical Reports Server (NTRS)

    Stone, Peter H.; Risbey, James S.

    1990-01-01

    General Circulation Models (GCMs) by definition calculate large-scale dynamical and thermodynamical processes and their associated feedbacks from first principles. This aspect of GCMs is widely believed to give them an advantage in simulating global-scale climate changes as compared to simpler models which do not calculate the large-scale processes from first principles. However, it is pointed out that the meridional transports of heat simulated by GCMs used in climate change experiments differ from observational analyses and from other GCMs by as much as a factor of two. It is also demonstrated that GCM simulations of the large-scale transports of heat are sensitive to the (uncertain) subgrid-scale parameterizations. This leads to the question of whether current GCMs are in fact superior to simpler models for simulating temperature changes associated with global-scale climate change.

  19. Green toxicology.

    PubMed

    Maertens, Alexandra; Anastas, Nicholas; Spencer, Pamela J; Stephens, Martin; Goldberg, Alan; Hartung, Thomas

    2014-01-01

    Historically, early identification and characterization of adverse effects of industrial chemicals was difficult because conventional toxicological test methods did not meet R&D needs for rapid, relatively inexpensive methods amenable to small amounts of test material. The pharmaceutical industry now front-loads toxicity testing, using in silico, in vitro, and less demanding animal tests at earlier stages of product development to identify and anticipate undesirable toxicological effects and optimize product development. The Green Chemistry movement embraces similar ideas for development of less toxic products, safer processes, and less waste and exposure. Further, the concept of benign design suggests ways to consider possible toxicities before the actual synthesis and to apply some structure/activity rules (SAR) and in silico methods. This requires not only scientific development but also a change in corporate culture in which synthetic chemists work with toxicologists. An emerging discipline called Green Toxicology (Anastas, 2012) provides a framework for integrating the principles of toxicology into the enterprise of designing safer chemicals, thereby minimizing potential toxicity as early in production as possible. Green Toxicology's novel utility lies in driving innovation by moving safety considerations to the earliest stage in a chemical's lifecycle, i.e., to molecular design. In principle, this field is no different than other subdisciplines of toxicology that endeavor to focus on a specific area - for example, clinical, environmental or forensic toxicology. We use the same principles and tools to evaluate an existing substance or to design a new one. The unique emphasis is in using 21st century toxicology tools as a preventative strategy to "design out" undesired human health and environmental effects, thereby increasing the likelihood of launching a successful, sustainable product.
Starting with the formation of a steering group and a series of workshops, the Green Toxicology concept is currently spreading internationally and is being refined via an iterative process.

  20. A computer model for liquid jet atomization in rocket thrust chambers

    NASA Astrophysics Data System (ADS)

    Giridharan, M. G.; Lee, J. G.; Krishnan, A.; Yang, H. Q.; Ibrahim, E.; Chuech, S.; Przekwas, A. J.

    1991-12-01

    The process of atomization has been used as an efficient means of burning liquid fuels in rocket engines, gas turbine engines, internal combustion engines, and industrial furnaces. Despite its widespread application, this complex hydrodynamic phenomenon has not been well understood, and predictive models for this process are still in their infancy. The difficulty in simulating the atomization process arises from the relatively large number of parameters that influence it, including the details of the injector geometry, liquid and gas turbulence, and the operating conditions. In this study, numerical models are developed from first principles, to quantify factors influencing atomization. For example, the surface wave dynamics theory is used for modeling the primary atomization and the droplet energy conservation principle is applied for modeling the secondary atomization. The use of empirical correlations has been minimized by shifting the analyses to fundamental levels. During applications of these models, parametric studies are performed to understand and correlate the influence of relevant parameters on the atomization process. The predictions of these models are compared with existing experimental data. The main tasks of this study were the following: development of a primary atomization model; development of a secondary atomization model; development of a model for impinging jets; development of a model for swirling jets; and coupling of the primary atomization model with a CFD code.
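    As a hedged illustration of one standard ingredient of secondary-atomization modeling (a common textbook criterion, not necessarily the one used in this study), the Weber number compares disruptive aerodynamic forces with the restoring surface tension; droplets above a commonly cited critical value of We ≈ 12 are expected to break up:

```python
def weber_number(rho_gas, rel_velocity, diameter, surface_tension):
    """We = rho * U**2 * d / sigma: disruptive aerodynamic force vs. restoring surface tension."""
    return rho_gas * rel_velocity**2 * diameter / surface_tension

# Hypothetical case: 100-micron water droplet (sigma ~ 0.072 N/m) in air (rho ~ 1.2 kg/m^3)
# with 100 m/s relative velocity.
We = weber_number(1.2, 100.0, 100e-6, 0.072)
print(We > 12)  # above the commonly cited critical We ~ 12, secondary breakup is expected
```

    Parametric studies like those described in the abstract sweep such dimensionless groups across injector geometries and operating conditions to correlate their influence on the atomization outcome.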

  1. Positioning your business in the marketplace.

    PubMed

    Lachman, V D

    1996-01-01

    Marketing the quality, cost-effective service delivered by advanced practice nurses (APNs) requires savvy in marketing principles. The basic principles are reviewed: market segmentation, target (niche) marketing, and the four Ps of the marketing mix--product, price, promotion, and place. The marketing process is presented along with examples. APNs' ability to successfully market their skills requires that they "position" themselves in the prospective buyer's mind. After a brief description of the customer's mind-set, the focus shifts specifically to promotion--marketing in action. Numerous no-cost/low-cost ideas are included.

  2. Back to first principles: a new model for the regulation of drug promotion

    PubMed Central

    Bennett, Alan; Jiménez, Freddy; Fields, Larry Eugene; Oyster, Joshua

    2015-01-01

    The US Food and Drug Administration's (‘FDA’ or the ‘Agency’) current regulatory framework for drug promotion, by significantly restricting the ability of drug manufacturers to communicate important, accurate, up-to-date scientific information about their products that is truthful and non-misleading, runs afoul of the First Amendment and actually runs counter to the Agency's public health mission. Our article proposes a New Model that represents an initial proposal for a modern, sustainable regulatory framework that comprehensively addresses drug promotion while protecting the public health, protecting manufacturers’ First Amendment rights, establishing clear and understandable rules, and maintaining the integrity of the FDA approval process. The New Model would create three categories of manufacturer communications—(1) Scientific Exchange and Other Exempt Communications, (2) Non-Core Communications, and (3) Core Communications—that would be regulated consistent with the First Amendment and according to the strength of the government's interest in regulating the specific communications included within each category. The New Model should address the FDA's concerns related to off-label speech while protecting drug manufacturers’ freedom to engage in truthful and non-misleading communications about their products. PMID:27774195

  3. Applying Constructivist and Objectivist Learning Theories in the Design of a Web-based Course: Implications for Practice.

    ERIC Educational Resources Information Center

    Moallem, Mahnaz

    2001-01-01

    Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…

  4. Technology for Evolutionary Software Development (Technologies pour le developpement de logiciels evolutifs)

    DTIC Science & Technology

    2003-06-01

    greater detail in the next section, is to achieve these principles. Besides the fact that these principles illustrate the essence of agile software...like e.g. ADLER, JASMIN, SAMOC or HEROS. In all of these projects the framework for the process model was the Vorgehensmodell (V-Model) of the...practical essence of the solutions to manage projects within the constraints of cost, schedule, functionality and quality and ways to get the

  5. Maximum Entropy Production As a Framework for Understanding How Living Systems Evolve, Organize and Function

    NASA Astrophysics Data System (ADS)

    Vallino, J. J.; Algar, C. K.; Huber, J. A.; Fernandez-Gonzalez, N.

    2014-12-01

    The maximum entropy production (MEP) principle holds that non-equilibrium systems with sufficient degrees of freedom will likely be found in a state that maximizes entropy production or, analogously, maximizes the potential energy destruction rate. The theory does not distinguish between abiotic and biotic systems; however, we will show that systems that can coordinate function over time and/or space can potentially dissipate more free energy than purely Markovian processes (such as fire or a rock rolling down a hill) that only maximize instantaneous entropy production. Biological systems have the ability to store useful information acquired via evolution and curated by natural selection in genomic sequences that allow them to execute temporal strategies and coordinate function over space. For example, circadian rhythms allow phototrophs to "predict" that sunlight will return and can orchestrate metabolic machinery appropriately before sunrise, which not only gives them a competitive advantage, but also increases the total entropy production rate compared to systems that lack such anticipatory control. Similarly, coordination over space, such as quorum sensing in microbial biofilms, can increase acquisition of spatially distributed resources and free energy and thereby enhance entropy production. In this talk we will develop a modeling framework to describe microbial biogeochemistry based on the MEP conjecture constrained by information and resource availability. Results from model simulations will be compared to laboratory experiments to demonstrate the usefulness of the MEP approach.

  6. Ultrasonically enhanced extraction of bioactive principles from Quillaja Saponaria Molina.

    PubMed

    Gaete-Garretón, L; Vargas-Hernández, Yolanda; Cares-Pacheco, María G; Sainz, Javier; Alarcón, John

    2011-07-01

    A study of ultrasonic enhancement in the extraction of bioactive principles from Quillaja Saponaria Molina (Quillay) is presented. The effects influencing the extraction process were studied through a two-level factorial design. The effects considered in the experimental design were: granulometry, extraction time, acoustic power, raw matter/solvent ratio (concentration), and acoustic impedance. It was found that for aqueous extraction the main factors affecting the ultrasonically-assisted process were granulometry, raw matter/solvent ratio, and extraction time. The extraction ratio was increased by the ultrasonic effect, and a reduction in extraction time was verified without any influence on product quality. In addition, the process can be carried out at lower temperatures than the conventional method. As the process developed uses chips from the branches of trees, and not only the bark, this research contributes to making saponin exploitation a sustainable industry. Copyright © 2010 Elsevier B.V. All rights reserved.
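    A two-level full factorial design over the five effects named in the abstract can be enumerated mechanically; a minimal sketch (factor names taken from the abstract, with -1/+1 coding the low/high level of each factor):

```python
from itertools import product

# Factor names from the abstract; -1/+1 code the low/high level of each factor.
factors = ["granulometry", "extraction time", "acoustic power",
           "raw matter/solvent ratio", "acoustic impedance"]
design = list(product([-1, +1], repeat=len(factors)))

print(len(design))  # a 2^5 full factorial requires 32 runs
for run in design[:2]:
    print(dict(zip(factors, run)))
```

    Main effects and interactions are then estimated by contrasting the measured extraction yield between the -1 and +1 halves of each column.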

  7. Chlor-Alkali Technology.

    ERIC Educational Resources Information Center

    Venkatesh, S.; Tilak, B. V.

    1983-01-01

    Chlor-alkali technology is one of the largest electrochemical industries in the world, the main products being chlorine and caustic soda (sodium hydroxide) generated simultaneously by the electrolysis of sodium chloride. This technology is reviewed in terms of electrochemical principles and manufacturing processes involved. (Author/JN)

  8. Green Chemistry and Education.

    ERIC Educational Resources Information Center

    Hjeresen, Dennis L.; Schutt, David L.; Boese, Janet M.

    2000-01-01

    Many students today are profoundly interested in the sustainability of their world. Introduces Green Chemistry and its principles with teaching materials. Green Chemistry is the use of chemistry for pollution prevention and the design of chemical products and processes that are environmentally benign. (ASK)

  9. Implementing Responsibility Centre Budgeting

    ERIC Educational Resources Information Center

    Vonasek, Joseph

    2011-01-01

    Recently, institutes of higher education (universities) have shown a renewed interest in organisational structures and operating methodologies that generate productivity and innovation; responsibility centre budgeting (RCB) is one such process. This paper describes the underlying principles constituting RCB, its origin and structural elements, and…

  10. Operator agency in process intervention: tampering versus application of tacit knowledge

    NASA Astrophysics Data System (ADS)

    Van Gestel, P.; Pons, D. J.; Pulakanam, V.

    2015-09-01

    Statistical process control (SPC) theory takes a negative view of adjustment of process settings, which is termed tampering. In contrast, quality and lean programmes actively encourage operators to intervene and exercise personal agency in the improvement of production outcomes. This creates a conflict that requires operator judgement: how does one differentiate between unnecessary tampering and needful intervention? A further difficulty is that operators apply tacit knowledge to such judgements. There is a need to determine where in a given production process the operators are applying tacit knowledge, and whether this is hindering or aiding quality outcomes. The work involved the conjoint application of systems engineering, statistics, and knowledge management principles, in the context of a case study. Systems engineering was used to create a functional model of a real plant. Actual plant data were analysed with the statistical methods of ANOVA, feature selection, and link analysis. This identified the variables to which the output quality was most sensitive. These key variables were mapped back to the functional model. Fieldwork was then directed to those areas to prospect for operator judgement activities. A natural conversational approach was used to determine where and how operators were applying judgement. This contrasts with the interrogative approach of conventional knowledge management. Data are presented for a case study of a meat rendering plant. The results identify specific areas where operators' tacit knowledge and mental models contribute to quality outcomes and untangle the motivations behind their agency. Also evident is how novice and expert operators apply their knowledge differently. Novices were focussed on meeting throughput objectives, and their incomplete understanding of the plant characteristics led them to inadvertently sacrifice quality in the pursuit of productivity in certain situations.
Operators' responses to the plant are affected by their individual mental models of the plant, which differ between operators and have variable validity. Their behaviour is also affected by differing interpretations of how their personal agency should be applied to the achievement of production objectives. The methodology developed here is an integration of systems engineering, statistical analysis, and knowledge management. It shows how to determine where in a given production process the operator intervention is occurring, how it affects quality outcomes, and what tacit knowledge operators are using. It thereby assists the continuous quality improvement processes in a different way to SPC. A second contribution is the provision of a novel methodology for knowledge management, one that circumvents the usual codification barriers to knowledge management.
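    The ANOVA-based screening step described above (identifying the variables to which output quality is most sensitive) can be sketched with synthetic stand-in data; this illustrates the statistical idea using SciPy's one-way ANOVA, not the actual plant data or analysis pipeline:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Synthetic stand-in data: quality readings grouped by the level of a setting.
# A setting that truly shifts the output gives groups with different means.
relevant   = [rng.normal(loc=m, scale=1.0, size=30) for m in (0.0, 2.0, 4.0)]
irrelevant = [rng.normal(loc=0.0, scale=1.0, size=30) for _ in range(3)]

F_rel, p_rel = f_oneway(*relevant)    # large F, tiny p: quality depends on this setting
F_irr, p_irr = f_oneway(*irrelevant)  # small F: no detectable effect
print(F_rel > F_irr)
```

    Ranking plant variables by such F-statistics is one simple way to decide where fieldwork on operator judgement should be directed first.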

  11. Integration of Palliative Care Principles into the Ongoing Care of Children with Cancer: Individualized Care Planning and Coordination

    PubMed Central

    Baker, Justin N; Hinds, Pamela S; Spunt, Sheri L; Barfield, Raymond C; Allen, Caitlin; Powell, Brent C; Anderson, Lisa H; Kane, Javier R

    2008-01-01

    Synopsis The Individualized Care Planning and Coordination Model is designed to integrate palliative care principles and practices into the ongoing care of children with cancer. Application of the model helps clinicians to generate a comprehensive individualized care plan that is implemented through Individualized Care Coordination processes as detailed here. Clinicians’ strong desire to provide compassionate, competent and sensitive care to the seriously ill child and the child’s family can be effectively translated into clinical practice through these processes. “To cure sometimes, to relieve often, to comfort always -- this is our work.” Author Unknown PMID:18242323

  12. Simulation of target interpretation based on infrared image features and psychology principle

    NASA Astrophysics Data System (ADS)

    Lin, Wei; Chen, Yu-hua; Gao, Hong-sheng; Wang, Zhan-feng; Wang, Ji-jun; Su, Rong-hua; Huang, Yan-ping

    2009-07-01

    Target feature extraction and identification is an important and complicated process in target interpretation: it directly affects the perceptual response of the interpreter to the target infrared image and ultimately determines target viability. Using statistical decision theory and psychological principles, four psychophysical experiments were designed and an interpretation model of the infrared target was established. The model obtains the target detection probability by calculating the similarity of four features between the target and background regions delineated on the infrared image. Verified against a large number of practical target interpretations, the model can effectively simulate the target interpretation and detection process and produce objective interpretation results, which can provide technical support for target extraction, identification, and decision-making.
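    As a hypothetical sketch of similarity-based scoring between target and background regions (illustrative feature names and values, not the paper's actual measures or model):

```python
import math

def similarity(f_target, f_background):
    """Cosine similarity between feature vectors of the target and background regions."""
    dot = sum(a * b for a, b in zip(f_target, f_background))
    norm_t = math.sqrt(sum(a * a for a in f_target))
    norm_b = math.sqrt(sum(b * b for b in f_background))
    return dot / (norm_t * norm_b)

# Hypothetical values for four features (e.g. contrast, edge density, texture, size)
s = similarity([0.9, 0.4, 0.7, 0.2], [0.3, 0.5, 0.6, 0.1])
# A target region that resembles its background less (lower s) would map to a
# higher detection probability in an interpretation model of this kind.
print(round(s, 3))
```

    An interpretation model would calibrate the mapping from such similarity scores to detection probability against psychophysical data, as the abstract's four experiments suggest.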

  13. Development of an ICT in IBSE course for science teachers: A design-based research

    NASA Astrophysics Data System (ADS)

    Tran, Trinh-Ba

    2018-01-01

    Integration of ICT tools for measuring with sensors, analyzing video, and modelling into Inquiry-Based Science Education (IBSE) is a globally recognized need. The challenge to teachers is how to turn manipulation of equipment and software into manipulation of ideas. We have developed a short ICT in IBSE course to prepare and support science teachers to teach inquiry-based activities with ICT tools. Within the framework of design-based research, we first defined the pedagogical principles from the literature, developed core materials for teacher learning, explored boundary conditions of the training in different countries, and elaborated set-ups of the course for the Dutch, Slovak, and Vietnamese contexts. Next, we taught and evaluated three iterative cycles of the Dutch course set-ups for pre-service science teachers from four teacher-education institutes nationwide. In each cycle, data on the teacher learning was collected via observations, questionnaires, interviews, and documents. These data were then analyzed for the questions about faithful implementation and effectiveness of the course. Following the same approach, we taught and evaluated two cycles of the Slovak course set-ups for in-service science teachers in the context of the national accreditation programme for teacher professional development. In addition, we investigated the applicability of the final Dutch course set-up in the context of the physics-education master program in Vietnam, with adaptations geared to educational and cultural differences. Through the iterations of implementation, evaluation, and revision, the course objectives were eventually achieved to a certain extent; the pedagogical principles and core materials proved to be effective and applicable in different contexts. We started this research and design project with the pedagogical principles and concluded it with these principles (i.e. complete theory-practice cycle, depth first, distributed learning, and ownership of learning) as the core of the basic design of the ICT in IBSE course. These principles can be considered as independent, validated educational products, which teacher educators can "buy into" and use for broader aims than only "ICT in IBSE" integration. Pedagogical principles establish the theoretical model underlying the course design, provide guidelines and structure to the (re)design, implementation, evaluation, and optimization process, and help to communicate the design-based research to others. The role of pedagogical principles in design-based research is indeed essential. Moreover, we incorporated a robustness test and a generalizability/transferability test as a further step in our design-based research and achieved successful outcomes with this step. Consequently, we strongly recommend the testing of the design product in routine implementation conditions and in considerably different contexts (e.g. different programmes or even countries) as part of design-based research.

  14. Research status of wave energy conversion (WEC) device of raft structure

    NASA Astrophysics Data System (ADS)

    Dong, Jianguo; Gao, Jingwei; Tao, Liang; Zheng, Peng

    2017-10-01

    This paper briefly describes the concept of wave energy generation and six typical conversion devices. For the raft structure, a detailed analysis is provided, from its development process to typical devices. Taking the design process and working principle of Pelamis as an example, the general principle of the raft structure is briefly described. After that, a variety of raft structure models are introduced. Finally, the advantages, disadvantages, and development trends of the raft structure are pointed out.

  15. Four principles for user interface design of computerised clinical decision support systems.

    PubMed

    Kanstrup, Anne Marie; Christiansen, Marion Berg; Nøhr, Christian

    2011-01-01

    The paper presents results from a design research project of a user interface (UI) for a Computerised Clinical Decision Support System (CDSS). The ambition has been to design Human-Computer Interaction (HCI) that can minimise medication errors. Through an iterative design process, a digital prototype for prescription of medicine has been developed. This paper presents results from the formative evaluation of the prototype, conducted in a simulation laboratory with ten participating physicians. Data from the simulation are analysed by use of theory on how users perceive information. The conclusion is a model which sums up four principles of interaction for the design of CDSS. The four principles for design of user interfaces for CDSS are summarised as four A's: All in one, At a glance, At hand, and Attention. The model emphasises integration of all four interaction principles in the design of user interfaces for CDSS, i.e. the model is an integrated model, which we suggest as a guide for interaction design when working to prevent medication errors.

  16. Efficiency of RNA extraction from selected bacteria in the context of biogas production and metatranscriptomics.

    PubMed

    Stark, Lucy; Giersch, Tina; Wünschiers, Röbbe

    2014-10-01

    Understanding the microbial population in anaerobic digestion is an essential task to increase efficient substrate use and process stability. The metabolic state, represented e.g. by the transcriptome, of a fermenting system can help to find markers for monitoring industrial biogas production to prevent failures or to model the whole process. Advances in next-generation sequencing make transcriptomes accessible for large-scale analyses. In order to analyze the metatranscriptome of a mixed-species sample, isolation of high-quality RNA is the first step. However, different extraction methods may yield different efficiencies in different species. Especially in mixed-species environmental samples, unbiased isolation of transcripts is important for meaningful conclusions. We applied five different RNA-extraction protocols to nine taxonomic diverse bacterial species. Chosen methods are based on various lysis and extraction principles. We found that the extraction efficiency of different methods depends strongly on the target organism. RNA isolation of gram-positive bacteria was characterized by low yield whilst from gram-negative species higher concentrations can be obtained. Transferring our results to mixed-species investigations, such as metatranscriptomics with biofilms or biogas plants, leads to the conclusion that particular microorganisms might be over- or underrepresented depending on the method applied. Special care must be taken when using such metatranscriptomics data for, e.g. process modeling. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Energy Conversion and Storage Program

    NASA Astrophysics Data System (ADS)

    Cairns, E. J.

    1993-06-01

    This report is the 1992 annual progress report for the Energy Conversion and Storage Program, a part of the Energy and Environment Division of the Lawrence Berkeley Laboratory. Work described falls into three broad areas: electrochemistry, chemical applications, and materials applications. The Energy Conversion and Storage Program applies principles of chemistry and materials science to solve problems in several areas: (1) production of new synthetic fuels, (2) development of high-performance rechargeable batteries and fuel cells, (3) development of advanced thermochemical processes for energy conversion, (4) characterization of complex chemical processes and chemical species, and (5) study and application of novel materials for energy conversion and transmission. Projects focus on transport-process principles, chemical kinetics, thermodynamics, separation processes, organic and physical chemistry, novel materials, and advanced methods of analysis. Electrochemistry research aims to develop advanced power systems for electric vehicle and stationary energy storage applications. Chemical applications research includes topics such as separations, catalysis, fuels, and chemical analyses. Included in this program area are projects to develop improved, energy-efficient methods for processing product and waste streams from synfuel plants, coal gasifiers, and biomass conversion processes. Materials applications research includes evaluation of the properties of advanced materials, as well as development of novel preparation techniques. For example, techniques such as sputtering, laser ablation, and pulsed laser deposition are being used to produce high-temperature superconducting films.

  18. Superintendents' Group Problem-Solving Processes.

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; And Others

    Findings of a study that examined the collaborative problem-solving processes used by superintendents are presented in this paper. Based on information processing theory, the study utilizes a model composed of the following components: interpretation; goals; principles and values; constraints; solution processes; and mood. Data were derived from…

  19. Irreversibility and entropy production in transport phenomena, III—Principle of minimum integrated entropy production including nonlinear responses

    NASA Astrophysics Data System (ADS)

    Suzuki, Masuo

    2013-01-01

    A new variational principle of steady states is found by introducing an integrated type of energy dissipation (or entropy production) instead of the instantaneous energy dissipation. This new principle is valid in both linear and nonlinear transport phenomena. Prigogine’s dream has now been realized by this new general principle of minimum “integrated” entropy production (or energy dissipation). This new principle does not contradict the Onsager-Prigogine principle of minimum instantaneous entropy production in the linear regime, but it is conceptually different from the latter, which does not hold in the nonlinear regime. Applications of this theory to electric conduction, heat conduction, particle diffusion and chemical reactions are presented. The irreversibility (or positive entropy production) and the long-time-tail problem in Kubo’s formula are also discussed in the Introduction and the last section. This constitutes the complementary explanation of our theory of entropy production given in the previous papers (M. Suzuki, Physica A 390 (2011) 1904 and M. Suzuki, Physica A 391 (2012) 1074) and provided the motivation for the present investigation of the variational principle.
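    The contrast between the instantaneous and integrated principles can be stated schematically; the notation below is generic and assumed for illustration, not taken from the paper itself:

```latex
% Instantaneous entropy production from thermodynamic fluxes J_i and forces X_i:
\sigma(t) = \sum_i J_i(t)\, X_i(t)

% Onsager--Prigogine (linear regime only): the steady state minimizes \sigma itself.
% Suzuki's principle: the steady state extremizes the *integrated* production,
\delta \int_0^{\infty} \sigma(t)\, \mathrm{d}t = 0,

% which is claimed to remain valid beyond the linear-response regime.
```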

  20. Cadre de planification integree de la chaine logistique pour la gestion et l'evaluation de strategies de bioraffinage forestier

    NASA Astrophysics Data System (ADS)

    Dansereau, Louis Patrick

    Biorefining is now recognized as a promising solution to transform the struggling forestry industry and to generate value-added pathways. The implementation of new products and processes will help companies to diversify revenues, but implies several strategic changes in the business model. Companies will face the dilemma of exiting or not traditional pulp and paper operations, while selecting their biorefinery product and process portfolio. As well, they will have to enter new markets and manage production to minimize the risk of market volatility. Over the past decades, both industry and academia paid a lot of attention to supply-chain management in order to increase the cost effectiveness of overall operations. The application of supply-chain management concepts could therefore greatly help the transforming North American forestry industry to compete globally. The objective of this Ph.D. project was to propose and illustrate an integrated supply-chain planning framework for the management and the evaluation of forest biorefinery strategies. This framework, named margins-based, integrates principles from revenue management, activity-based cost accounting, and manufacturing flexibility in a tactical planning model that maximizes the profit of a company. The structure of the mathematical model and of its associated cost model aims to represent, as closely as possible, the activities of a company from procurement to sales. It enables the modeling of different process configurations leading to manufacturing flexibility. The model can thus be used as a platform for evaluating various operating strategies of a company, at both production and supply-chain levels. A case study of a newsprint mill implementing a parallel biomass fractionation line producing several bioproducts was used to illustrate this margins-based approach.
Various strategic and tactical analyses were conducted to show the relevance of the approach as a decision-making tool for management problems related to the forest biorefinery. The model was used to evaluate the profitability of a company during its transformation to the biorefinery, by considering the gradual divestment in pulp and paper activities, while implementing a new biorefinery process. Results show that the tool can enhance decision-making activities by providing a better visualization and better comprehension of supply-chain and cost-related dynamics under different scenarios. Coupled with a scenario analysis, it offers the opportunity to develop a phased implementation strategy that would stabilize profitability during the transformation to the biorefinery. The planning tool was also used to study the management of a product portfolio to mitigate the risk of market volatility. One analysis focused on the exploitation of thermomechanical and deinked pulping flexibility in order to minimize the cost of raw material procurement under different market conditions. Another analysis examined the impact of feedstock and production flexibility of a fractionation process on sales and profitability. Results show that the procurement and production needed to manufacture the product mix that provides the optimum margins vary significantly. Biorefinery processes can have complex interrelations that make dynamics and trade-offs between manufacturing options not easy to identify and understand. Results thus highlight the relevance of using such planning tools to identify the best opportunities. In a context where sales can be varied to a certain level, results also show that it may be beneficial to pay more for certain types of biomass if they offer a product portfolio mix with higher revenues.

  1. Toward Higher QA: From Parametric Release of Sterile Parenteral Products to PAT for Other Pharmaceutical Dosage Forms.

    PubMed

    Hock, Sia Chong; Constance, Neo Xue Rui; Wah, Chan Lai

    2012-01-01

    Pharmaceutical products are generally subjected to end-product batch testing as a means of quality control. Due to the inherent limitations of conventional batch testing, this is not the most ideal approach for determining the pharmaceutical quality of the finished dosage form. In the case of terminally sterilized parenteral products, the limitations of conventional batch testing have been successfully addressed with the application of parametric release (the release of a product based on control of process parameters instead of batch sterility testing at the end of the manufacturing process). Consequently, there has been an increasing interest in applying parametric release to other pharmaceutical dosage forms, beyond terminally sterilized parenteral products. For parametric release to be possible, manufacturers must be capable of designing quality into the product, monitoring the manufacturing processes, and controlling the quality of intermediates and finished products in real-time. Process analytical technology (PAT) has been thought to be capable of contributing to these prerequisites. It is believed that the appropriate use of PAT tools can eventually lead to the possibility of real-time release of other pharmaceutical dosage forms, by-passing the need for end-product batch testing. Hence, this literature review attempts to present the basic principles of PAT, introduce the various PAT tools that are currently available, present their recent applications to pharmaceutical processing, and explain the potential benefits that PAT can bring to conventional ways of processing and quality assurance of pharmaceutical products. Last but not least, current regulations governing the use of PAT and the manufacturing challenges associated with PAT implementation are also discussed.

  2. Hydrologic consistency as a basis for assessing complexity of monthly water balance models for the continental United States

    NASA Astrophysics Data System (ADS)

    Martinez, Guillermo F.; Gupta, Hoshin V.

    2011-12-01

    Methods to select parsimonious and hydrologically consistent model structures are useful for evaluating dominance of hydrologic processes and representativeness of data. While information criteria (appropriately constrained to obey underlying statistical assumptions) can provide a basis for evaluating appropriate model complexity, it is not sufficient to rely upon the principle of maximum likelihood (ML) alone. We suggest that one must also call upon a "principle of hydrologic consistency," meaning that selected ML structures and parameter estimates must be constrained (as well as possible) to reproduce desired hydrological characteristics of the processes under investigation. This argument is demonstrated in the context of evaluating the suitability of candidate model structures for lumped water balance modeling across the continental United States, using data from 307 snow-free catchments. The models are constrained to satisfy several tests of hydrologic consistency, a flow space transformation is used to ensure better consistency with underlying statistical assumptions, and information criteria are used to evaluate model complexity relative to the data. The results clearly demonstrate that the principle of consistency provides a sensible basis for guiding selection of model structures and indicate strong spatial persistence of certain model structures across the continental United States. Further work to untangle reasons for model structure predominance can help to relate conceptual model structures to physical characteristics of the catchments, facilitating the task of prediction in ungaged basins.
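    The filter-then-rank logic described above can be sketched as a toy procedure. The consistency test, the candidate models, and all numbers below are invented for illustration and are not the paper's actual criteria or data:

```python
import math

def aic(rss, n, k):
    """Akaike information criterion for Gaussian errors, from the residual sum of squares."""
    return n * math.log(rss / n) + 2 * k

def hydrologically_consistent(sim, obs, tol=0.1):
    """Toy consistency check (our assumption, not one of the paper's tests):
    the simulated mean flow must lie within `tol` of the observed mean."""
    mean_obs = sum(obs) / len(obs)
    mean_sim = sum(sim) / len(sim)
    return abs(mean_sim - mean_obs) <= tol * abs(mean_obs)

def rss(sim, obs):
    return sum((s - o) ** 2 for s, o in zip(sim, obs))

# Candidate "models": (name, number of parameters k, simulated flows) -- all invented.
obs = [1.0, 2.0, 3.0, 2.5, 1.5]
candidates = [
    ("simple",  2, [1.2, 1.8, 2.9, 2.4, 1.7]),
    ("complex", 5, [1.1, 1.9, 2.9, 2.4, 1.6]),  # slightly better fit, more parameters
    ("biased",  2, [2.0, 3.0, 4.0, 3.5, 2.5]),  # good shape, but violates the water balance
]

# Step 1: keep only hydrologically consistent candidates; step 2: rank by AIC.
consistent = [(name, aic(rss(sim, obs), len(obs), k))
              for name, k, sim in candidates
              if hydrologically_consistent(sim, obs)]
best = min(consistent, key=lambda t: t[1])
print(best[0])  # -> simple
```

    The "biased" model is rejected before information criteria are even consulted, and parsimony then favors "simple" over "complex" despite the latter's better fit.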

  3. How nurse-led practices perceive implementation of the patient-centered medical home.

    PubMed

    Frasso, Rosemary; Golinkoff, A; Klusaritz, Heather; Kellom, Katherine; Kollar-McArthur, Helen; Miller-Day, Michelle; Gabbay, Robert; Cronholm, Peter F

    2017-04-01

    The Affordable Care Act (ACA) promotes the Patient-Centered Medical Home (PCMH) model as a way to improve healthcare quality and the patient experience, and identifies nurse-led primary care as a mechanism for meeting the increasing demand for quality primary care. The purpose of this study was to investigate the implementation of a PCMH model in nurse-led primary care practices and to identify facilitators of and barriers to the implementation of this model. Data were collected through in-depth interviews with providers and staff in nurse-led practices. These data suggest two categories of processes that facilitate the integration of PCMH in the nurse-led practice setting: patient-oriented facilitators and organizational facilitators. In addition, a number of barriers to implementing the PCMH model were identified. Overall, these practices creatively engaged in the transformation process by structuring themselves as a complex adaptive system and building upon the core principles of nurse-led care. Since the core principles of nurse-led care map onto many of the same principles as the PCMH model, this study discusses the possibility that nurse-led practices may experience fewer barriers when transitioning into PCMHs. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. The Principles for Successful Scientific Data Management Revisited

    NASA Astrophysics Data System (ADS)

    Walker, R. J.; King, T. A.; Joy, S. P.

    2005-12-01

    It has been 23 years since the National Research Council's Committee on Data Management and Computation (CODMAC) published its famous list of principles for successful scientific data management that have provided the framework for modern space science data management. CODMAC outlined seven principles: 1. Scientific Involvement in all aspects of space science missions. 2. Scientific Oversight of all scientific data-management activities. 3. Data Availability - Validated data should be made available to the scientific community in a timely manner. They should include appropriate ancillary data, and complete documentation. 4. Facilities - A proper balance between cost and scientific productivity should be maintained. 5. Software - Transportable well documented software should be available to process and analyze the data. 6. Scientific Data Storage - The data should be preserved in retrievable form. 7. Data System Funding - Adequate data funding should be made available at the outset of missions and protected from overruns. In this paper we will review the lessons learned in trying to apply these principles to space derived data. The Planetary Data System created the concept of data curation to carry out the CODMAC principles. Data curators are scientists and technologists who work directly with the mission scientists to create data products. The efficient application of the CODMAC principles requires that data curators and the mission team start early in a mission to plan for data access and archiving. To build the data products the planetary discipline adopted data access and documentation standards and has adhered to them. The data curators and mission team work together to produce data products and make them available. However even with early planning and agreement on standards the needs of the science community frequently far exceed the available resources. This is especially true for smaller principal investigator run missions. 
We will argue that one way to make data systems for small missions more effective is for the data curators to provide software tools to help develop the mission data system.

  5. Nature's combinatorial biosynthesis and recently engineered production of nucleoside antibiotics in Streptomyces.

    PubMed

    Chen, Shawn; Kinney, William A; Van Lanen, Steven

    2017-04-01

    Modified nucleosides produced by Streptomyces and related actinomycetes are widely used in agriculture and medicine as antibacterial, antifungal, anticancer and antiviral agents. These specialized small-molecule metabolites are biosynthesized by complex enzymatic machineries encoded within gene clusters in the genome. The past decade has witnessed a burst of reports defining the key metabolic processes involved in the biosynthesis of several distinct families of nucleoside antibiotics. Furthermore, genome sequencing of various Streptomyces species has dramatically increased over recent years. Potential biosynthetic gene clusters for novel nucleoside antibiotics are now apparent by analysis of these genomes. Here we revisit strategies for production improvement of nucleoside antibiotics that have defined mechanisms of action, and are in clinical or agricultural use. We summarize the progress for genetically manipulating biosynthetic pathways for structural diversification of nucleoside antibiotics. Microorganism-based biosynthetic examples are provided and organized under genetic principles and metabolic engineering guidelines. We show perspectives on the future of combinatorial biosynthesis, and present a working model for discovery of novel nucleoside natural products in Streptomyces.

  6. Process quality engineering for bioreactor-driven manufacturing of tissue-engineered constructs for bone regeneration.

    PubMed

    Papantoniou Ir, Ioannis; Chai, Yoke Chin; Luyten, Frank P; Schrooten Ir, Jan

    2013-08-01

    The incorporation of Quality-by-Design (QbD) principles in tissue-engineering bioprocess development toward clinical use will ensure that manufactured constructs possess prerequisite quality characteristics addressing emerging regulatory requirements and ensuring the functional in vivo behavior. In this work, the QbD principles were applied on a manufacturing process step for the in vitro production of osteogenic three-dimensional (3D) hybrid scaffolds that involves cell matrix deposition on a 3D titanium (Ti) alloy scaffold. An osteogenic cell source (human periosteum-derived cells) cultured in a bioinstructive medium was used to functionalize regular Ti scaffolds in a perfusion bioreactor, resulting in an osteogenic hybrid carrier. A two-level three-factor fractional factorial design of experiments was employed to explore a range of production-relevant process conditions by simultaneously changing value levels of the following parameters: flow rate (0.5-2 mL/min), cell culture duration (7-21 days), and cell-seeding density (1.5×10(3)-3×10(3) cells/cm(2)). This approach made it possible to evaluate the individual impact of the aforementioned process parameters upon key quality attributes of the produced hybrids, such as collagen production, mineralization level, and cell number. The use of a fractional factorial design approach helped create a design space in which hybrid scaffolds of predefined quality attributes may be robustly manufactured while minimizing the number of required experiments.
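    A two-level three-factor half fraction of the kind described can be constructed with the standard defining relation C = AB. The factor names and level values follow the abstract, but the construction itself is a generic textbook sketch, not necessarily the authors' exact design:

```python
from itertools import product

# Factor names and (low, high) levels follow the abstract.
factors = {
    "flow_rate_mL_min": (0.5, 2.0),
    "culture_days": (7, 21),
    "seeding_cells_cm2": (1.5e3, 3.0e3),
}

# Half-fraction 2^(3-1) design with the standard defining relation C = AB,
# using coded levels -1/+1.
runs = []
for a, b in product((-1, 1), repeat=2):
    coded = (a, b, a * b)  # the third factor is aliased with the AB interaction
    runs.append({name: levels[0] if x < 0 else levels[1]
                 for (name, levels), x in zip(factors.items(), coded)})

for r in runs:
    print(r)
print(len(runs))  # -> 4 (instead of the full factorial's 8 runs)
```

    The aliasing of the third factor with the two-factor interaction is the price paid for halving the experimental effort; each main effect is still estimated from a balanced set of runs.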

  7. Quality engineering as a profession.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolb, Rachel R.; Hoover, Marcey L.

    Over the course of time, the profession of quality engineering has witnessed significant change, from its original emphasis on quality control and inspection to a more contemporary focus on upholding quality processes throughout the organization and its product realization activities. This paper describes the profession of quality engineering, exploring how today's quality engineers and quality professionals are certified individuals committed to upholding quality processes and principles while working with different dimensions of product development. It also discusses the future of the quality engineering profession and the future of the quality movement as a whole.

  8. Enroute flight planning: The design of cooperative planning systems

    NASA Technical Reports Server (NTRS)

    Smith, Philip J.; Layton, Chuck; Mccoy, Elaine

    1990-01-01

    Design concepts and principles to guide the building of cooperative problem solving systems are being developed and evaluated. In particular, the design of cooperative systems for enroute flight planning is being studied. The investigation involves a three-stage process: modeling human performance in existing environments, building cognitive artifacts, and studying the performance of people working in collaboration with these artifacts. The most significant design concepts and principles identified thus far are the principal focus.

  9. Development of a Logistics Support Framework for Defense Mapping Agency (DMA) Automated Systems

    DTIC Science & Technology

    1990-09-01

    cycle of a particular system. This research identified principles of management, design or system life cycle processes, and ILS elements needed to...Delphi results gathered from DMA expert opinions. The principles of management, depicted in the Logistics Systems Management Matrix (LSMM), portrayed...review were used to form the Delphi survey questions in Chapter III. As shown in Figure 2, the LSMM is a three-dimensional model with the principles of management on

  10. Gas production in the Barnett Shale obeys a simple scaling theory

    PubMed Central

    Patzek, Tad W.; Male, Frank; Marder, Michael

    2013-01-01

    Natural gas from tight shale formations will provide the United States with a major source of energy over the next several decades. Estimates of gas production from these formations have mainly relied on formulas designed for wells with a different geometry. We consider the simplest model of gas production consistent with the basic physics and geometry of the extraction process. In principle, solutions of the model depend upon many parameters, but in practice and within a given gas field, all but two can be fixed at typical values, leading to a nonlinear diffusion problem we solve exactly with a scaling curve. Along the scaling curve, the production rate declines as 1 over the square root of time early on, and it later declines exponentially. This simple model provides a surprisingly accurate description of gas extraction from 8,294 wells in the United States’ oldest shale play, the Barnett Shale. There is good agreement with the scaling theory for 2,057 horizontal wells in which production started to decline exponentially in less than 10 y. The remaining 6,237 horizontal wells in our analysis are too young for us to predict when exponential decline will set in, but the model can nevertheless be used to establish lower and upper bounds on well lifetime. Finally, we obtain upper and lower bounds on the gas that will be produced by the wells in our sample, individually and in total. The estimated ultimate recovery from our sample of 8,294 wells is between 10 and 20 trillion standard cubic feet. PMID:24248376
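    The two asymptotic regimes described above can be sketched numerically. The crossover form below is a hypothetical stand-in for the paper's exact scaling curve, with invented parameters `q0` (initial-rate scale) and `tau` (the crossover, or interference, time):

```python
import math

def production_rate(t, q0, tau):
    """Hypothetical two-regime decline sketch: rate ~ 1/sqrt(t) before the
    crossover time tau, exponential decline after (not the paper's exact curve)."""
    if t <= 0:
        raise ValueError("t must be positive")
    if t < tau:
        return q0 / math.sqrt(t)
    # Continuous at t = tau, then exponential decay on the same timescale.
    return (q0 / math.sqrt(tau)) * math.exp(-(t - tau) / tau)

# Early-time signature: the rate halves whenever elapsed time quadruples.
r1 = production_rate(1.0, q0=100.0, tau=16.0)
r4 = production_rate(4.0, q0=100.0, tau=16.0)
print(r1 / r4)  # -> 2.0
```

    The 1/sqrt(t) regime reflects diffusion into the fractures before pressure fronts from neighboring fractures interfere; afterward the depletion becomes exponential, which is what allows lower and upper bounds on well lifetime and recovery.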

  11. Gas production in the Barnett Shale obeys a simple scaling theory.

    PubMed

    Patzek, Tad W; Male, Frank; Marder, Michael

    2013-12-03

    Natural gas from tight shale formations will provide the United States with a major source of energy over the next several decades. Estimates of gas production from these formations have mainly relied on formulas designed for wells with a different geometry. We consider the simplest model of gas production consistent with the basic physics and geometry of the extraction process. In principle, solutions of the model depend upon many parameters, but in practice and within a given gas field, all but two can be fixed at typical values, leading to a nonlinear diffusion problem we solve exactly with a scaling curve. Along the scaling curve, the production rate declines as 1 over the square root of time early on, and it later declines exponentially. This simple model provides a surprisingly accurate description of gas extraction from 8,294 wells in the United States' oldest shale play, the Barnett Shale. There is good agreement with the scaling theory for 2,057 horizontal wells in which production started to decline exponentially in less than 10 y. The remaining 6,237 horizontal wells in our analysis are too young for us to predict when exponential decline will set in, but the model can nevertheless be used to establish lower and upper bounds on well lifetime. Finally, we obtain upper and lower bounds on the gas that will be produced by the wells in our sample, individually and in total. The estimated ultimate recovery from our sample of 8,294 wells is between 10 and 20 trillion standard cubic feet.

  12. Development of generalized 3-D braiding machines for composite preforms

    NASA Technical Reports Server (NTRS)

    Huey, Cecil O., Jr.; Farley, Gary L.

    1993-01-01

    The operating principles of two prototype braiding machines for the production of generalized braid patterns are described. Both processes afford previously unachievable control of the interlacing of fibers within a textile structure that make them especially amenable to the fabrication of textile preforms for composite materials. They enable independent control of the motion of the individual fibers being woven, thereby enabling the greatest possible freedom in controlling fiber orientation within a structure. This freedom enables the designer to prescribe local fiber orientation to better optimize material performance. The processes have been implemented on a very small scale but at a level that demonstrates their practicality and the soundness of the principles governing their operation.

  13. Energy transports by ocean and atmosphere based on an entropy extremum principle. I - Zonal averaged transports

    NASA Technical Reports Server (NTRS)

    Sohn, Byung-Ju; Smith, Eric A.

    1993-01-01

    The maximum entropy production principle suggested by Paltridge (1975) is applied to separating the satellite-determined required total transports into atmospheric and oceanic components. Instead of using the excessively restrictive equal energy dissipation hypothesis as a deterministic tool for separating transports between the atmosphere and ocean fluids, the satellite-inferred required 2D energy transports are imposed on Paltridge's energy balance model, which is then solved as a variational problem using the equal energy dissipation hypothesis only to provide an initial guess field. It is suggested that Southern Ocean transports are weaker than previously reported. It is argued that a maximum entropy production principle can serve as a governing rule on macroscale global climate, and, in conjunction with conventional satellite measurements of the net radiation balance, provides a means to decompose atmosphere and ocean transports from the total transport field.

  14. Coproduction of healthcare service

    PubMed Central

    Batalden, Maren; Batalden, Paul; Margolis, Peter; Seid, Michael; Armstrong, Gail; Opipari-Arrigan, Lisa; Hartung, Hans

    2016-01-01

    Efforts to ensure effective participation of patients in healthcare are called by many names—patient centredness, patient engagement, patient experience. Improvement initiatives in this domain often resemble the efforts of manufacturers to engage consumers in designing and marketing products. Services, however, are fundamentally different than products; unlike goods, services are always ‘coproduced’. Failure to recognise this unique character of a service and its implications may limit our success in partnering with patients to improve health care. We trace a partial history of the coproduction concept, present a model of healthcare service coproduction and explore its application as a design principle in three healthcare service delivery innovations. We use the principle to examine the roles, relationships and aims of this interdependent work. We explore the principle's implications and challenges for health professional development, for service delivery system design and for understanding and measuring benefit in healthcare services. PMID:26376674

  15. System principles, mathematical models and methods to ensure high reliability of safety systems

    NASA Astrophysics Data System (ADS)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of various components designed for detection, localization, tracking, collection, and processing of information from systems for monitoring, telemetry, control, etc. They are required to be highly reliable in order to correctly perform data aggregation, processing, and analysis for subsequent decision-making support. During the design and construction phases of the manufacturing of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Different component types perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of successful task performance and eliminates common-cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used for solving problems of optimal redundancy on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.
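    A minimal sketch of the type-variety idea, cast as a brute-force redundancy-allocation search. The component catalogue, reliabilities, costs, budget, and the "at least two distinct types" rule are all invented for illustration; the paper's actual models are large-scale two-level discrete optimization problems:

```python
from itertools import combinations

# Hypothetical component catalogue: (name, type, reliability, cost) -- values invented.
catalog = [
    ("sensor_A", "optical",  0.90, 3),
    ("sensor_B", "optical",  0.92, 4),
    ("sensor_C", "acoustic", 0.85, 2),
    ("sensor_D", "radar",    0.80, 2),
]
BUDGET = 7

def system_reliability(parts):
    """Parallel redundancy: the system fails only if every component fails."""
    p_fail = 1.0
    for _name, _type, r, _cost in parts:
        p_fail *= (1.0 - r)
    return 1.0 - p_fail

best = None
for k in range(1, len(catalog) + 1):
    for combo in combinations(catalog, k):
        cost = sum(part[3] for part in combo)
        types = {part[1] for part in combo}
        # Type-variety rule (as sketched here): any redundant group must mix
        # at least two distinct component types to guard against common-cause failure.
        if cost <= BUDGET and (k == 1 or len(types) >= 2):
            rel = system_reliability(combo)
            if best is None or rel > best[0]:
                best = (rel, combo)

print(round(best[0], 4), sorted(part[0] for part in best[1]))
```

    In this toy instance, the best affordable group mixes three technologies; the individually weaker but cheaper diverse components beat doubling up on a single type, which the variety rule forbids anyway.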

  16. Principles for integrating reactive species into in vivo biological processes: Examples from exercise physiology.

    PubMed

    Margaritelis, Nikos V; Cobley, James N; Paschalis, Vassilis; Veskoukis, Aristidis S; Theodorou, Anastasios A; Kyparos, Antonios; Nikolaidis, Michalis G

    2016-04-01

    The equivocal role of reactive species and redox signaling in exercise responses and adaptations is an example clearly showing the inadequacy of current redox biology research to shed light on fundamental biological processes in vivo. Part of the answer probably relies on the extreme complexity of the in vivo redox biology and the limitations of the currently applied methodological and experimental tools. We propose six fundamental principles that should be considered in future studies to mechanistically link reactive species production to exercise responses or adaptations: 1) identify and quantify the reactive species, 2) determine the potential signaling properties of the reactive species, 3) detect the sources of reactive species, 4) locate the domain modified and verify the (ir)reversibility of post-translational modifications, 5) establish causality between redox and physiological measurements, 6) use selective and targeted antioxidants. Fulfilling these principles requires an idealized human experimental setting, which is certainly a utopia. Thus, researchers should choose to satisfy those principles, which, based on scientific evidence, are most critical for their specific research question. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Applying Toyota production system techniques for medication delivery: improving hospital safety and efficiency.

    PubMed

    Newell, Terry L; Steinmetz-Malato, Laura L; Van Dyke, Deborah L

    2011-01-01

    The inpatient medication delivery system used at a large regional acute care hospital in the Midwest had become antiquated and inefficient. The existing 24-hr medication cart-fill exchange process with delivery to the patients' bedside did not always provide ordered medications to the nursing units when they were needed. In 2007 the principles of the Toyota Production System (TPS) were applied to the system. Project objectives were to improve medication safety and reduce the time needed for nurses to retrieve patient medications. A multidisciplinary team was formed that included representatives from nursing, pharmacy, informatics, quality, and various operational support departments. Team members were educated and trained in the tools and techniques of TPS, and then designed and implemented a new pull system benchmarking the TPS Ideal State model. The newly installed process, providing just-in-time medication availability, has measurably improved delivery processes as well as patient safety and satisfaction. Other positive outcomes have included improved nursing satisfaction, reduced nursing wait time for delivered medications, and improved efficiency in the pharmacy. After a successful pilot on two nursing units, the system is being extended to the rest of the hospital. © 2010 National Association for Healthcare Quality.

  18. Universal Design: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, "is…

  19. INTEGRATED BIOPROCESS SYSTEMS FOR LOW-COST ENVIRONMENTAL REMEDIATION AND SUSTAINABLE BIOFERTILIZER PRODUCTION

    EPA Science Inventory

    Preliminary techno-economic analysis of these processes will be undertaken, utilizing the literature and including key supporting data and proof-of-principle experiments. The emphasis on low-cost bioreactors and operation greatly enhances the economic feasibility and practica...

  20. Printing--Graphic Arts--Graphic Communications

    ERIC Educational Resources Information Center

    Hauenstein, A. Dean

    1975-01-01

    Recently, "graphic arts" has shifted from printing skills to a conceptual approach of production processes. "Graphic communications" must embrace the total system of communication through graphic media, to serve broad career education purposes; students taught concepts and principles can be flexible and adaptive. The author…

  1. A Brief Survey of Modern Optimization for Statisticians

    PubMed Central

    Lange, Kenneth; Chi, Eric C.; Zhou, Hua

    2014-01-01

    Modern computational statistics is turning more and more to high-dimensional optimization to handle the deluge of big data. Once a model is formulated, its parameters can be estimated by optimization. Because model parsimony is important, models routinely include nondifferentiable penalty terms such as the lasso. This sober reality complicates minimization and maximization. Our broad survey stresses a few important principles in algorithm design. Rather than view these principles in isolation, it is more productive to mix and match them. A few well chosen examples illustrate this point. Algorithm derivation is also emphasized, and theory is downplayed, particularly the abstractions of the convex calculus. Thus, our survey should be useful and accessible to a broad audience. PMID:25242858
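    One design principle such surveys stress is that nondifferentiable penalties like the lasso are handled cleanly by proximal operators combined with gradient steps. As an illustrative sketch only (the problem size, data, and penalty weight below are invented, not taken from the survey), a minimal proximal-gradient (ISTA) solver for the lasso:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(A, b, lam, n_iter=2000):
    # Minimize 0.5 * ||Ax - b||^2 + lam * ||x||_1 by proximal gradient descent
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)         # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Synthetic sparse-recovery example
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = lasso_ista(A, b, lam=0.5)
```

    The same skeleton accepts other penalties by swapping the proximal map, which is the "mix and match" flexibility the survey advocates.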

  2. Echo State Networks for data-driven downhole pressure estimation in gas-lift oil wells.

    PubMed

    Antonelo, Eric A; Camponogara, Eduardo; Foss, Bjarne

    2017-01-01

    Process measurements are of vital importance for monitoring and control of industrial plants. When we consider offshore oil production platforms, wells that require gas-lift technology to yield oil production from low pressure oil reservoirs can become unstable under some conditions. This undesirable phenomenon is usually called slugging flow, and can be identified by an oscillatory behavior of the downhole pressure measurement. Given the importance of this measurement and the unreliability of the related sensor, this work aims to design data-driven soft-sensors for downhole pressure estimation in two contexts: one for speeding up first-principles model simulation of a vertical riser model; and another for estimating the downhole pressure using real-world data from an oil well from Petrobras based only on topside platform measurements. Both tasks are tackled by employing Echo State Networks (ESN) as an efficient technique for training Recurrent Neural Networks. We show that a single ESN is capable of robustly modeling both the slugging flow behavior and a steady state based only on a square wave input signal representing the production choke opening in the vertical riser. In addition, we compare the performance of a standard network to the performance of a multiple timescale hierarchical architecture in the second task and show that the latter architecture performs better in modeling both large irregular transients and more commonly occurring small oscillations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Neural network modeling of emotion

    NASA Astrophysics Data System (ADS)

    Levine, Daniel S.

    2007-03-01

    This article reviews the history and development of computational neural network modeling of cognitive and behavioral processes that involve emotion. The exposition starts with models of classical conditioning dating from the early 1970s, then proceeds to models of interactions between emotion and attention. Models of emotional influences on decision making are then reviewed, including some speculative (not yet simulated) models of the evolution of decision rules. Through the late 1980s, the neural networks developed to model emotional processes were mainly embodiments of significant functional principles motivated by psychological data. In the last two decades, network models of these processes have become much more detailed in their incorporation of known physiological properties of specific brain regions, while preserving many of the psychological principles from the earlier models. Most network models of emotional processes so far have dealt with positive and negative emotion in general, rather than specific emotions such as fear, joy, sadness, and anger. But a later section of this article reviews a few models relevant to specific emotions: one family of models of auditory fear conditioning in rats, and one model of induced pleasure enhancing creativity in humans. Then models of emotional disorders are reviewed. The article concludes with philosophical statements about the essential contributions of emotion to intelligent behavior and the importance of quantitative theories and models to the interdisciplinary enterprise of understanding the interactions of emotion, cognition, and behavior.

  4. Le Chatelier's principle in replicator dynamics

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.; Galstyan, Aram

    2011-10-01

    The Le Chatelier principle states that physical equilibria are not only stable, but they also resist external perturbations via short-time negative-feedback mechanisms: a perturbation induces processes tending to diminish its results. The principle has deep roots, e.g., in thermodynamics it is closely related to the second law and the positivity of the entropy production. Here we study the applicability of the Le Chatelier principle to evolutionary game theory, i.e., to perturbations of a Nash equilibrium within the replicator dynamics. We show that the principle can be reformulated as a majorization relation. This defines a stability notion that generalizes the concept of evolutionary stability. We determine criteria for a Nash equilibrium to satisfy the Le Chatelier principle and relate them to mutualistic interactions (game-theoretical anticoordination) showing in which sense mutualistic replicators can be more stable than (say) competing ones. There are globally stable Nash equilibria, where the Le Chatelier principle is violated even locally: in contrast to the thermodynamic equilibrium a Nash equilibrium can amplify small perturbations, though both types of equilibria satisfy the detailed balance condition.

  5. Le Chatelier's principle in replicator dynamics.

    PubMed

    Allahverdyan, Armen E; Galstyan, Aram

    2011-10-01

    The Le Chatelier principle states that physical equilibria are not only stable, but they also resist external perturbations via short-time negative-feedback mechanisms: a perturbation induces processes tending to diminish its results. The principle has deep roots, e.g., in thermodynamics it is closely related to the second law and the positivity of the entropy production. Here we study the applicability of the Le Chatelier principle to evolutionary game theory, i.e., to perturbations of a Nash equilibrium within the replicator dynamics. We show that the principle can be reformulated as a majorization relation. This defines a stability notion that generalizes the concept of evolutionary stability. We determine criteria for a Nash equilibrium to satisfy the Le Chatelier principle and relate them to mutualistic interactions (game-theoretical anticoordination) showing in which sense mutualistic replicators can be more stable than (say) competing ones. There are globally stable Nash equilibria, where the Le Chatelier principle is violated even locally: in contrast to the thermodynamic equilibrium a Nash equilibrium can amplify small perturbations, though both types of equilibria satisfy the detailed balance condition.
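    The setting in these two records can be made concrete with a small simulation. Below is a hedged sketch (the payoff matrix, step size, and perturbation are invented for illustration, not taken from the paper) of replicator dynamics for an anticoordination game, whose interior Nash equilibrium absorbs a perturbation in the Le Chatelier spirit:

```python
import numpy as np

# Anticoordination payoff matrix: off-diagonal payoffs dominate the diagonal,
# giving a stable interior Nash equilibrium at x* = (0.5, 0.5)
A = np.array([[0.0, 2.0],
              [1.0, 1.0]])

def replicator_step(x, dt=0.01):
    # dx_i/dt = x_i * (f_i - fbar), forward-Euler discretized
    f = A @ x            # fitness of each strategy
    fbar = x @ f         # population mean fitness
    return x + dt * x * (f - fbar)

# Perturb the equilibrium and let the dynamics relax back
x = np.array([0.6, 0.4])
for _ in range(20_000):
    x = replicator_step(x)
```

    For this mutualistic (anticoordination) game the perturbed population returns to the equilibrium, consistent with the stability criterion the paper formalizes; a coordination-game matrix would instead amplify the perturbation.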

  6. Hot-melt extrusion--basic principles and pharmaceutical applications.

    PubMed

    Lang, Bo; McGinity, James W; Williams, Robert O

    2014-09-01

    Originally adapted from the plastics industry, the use of hot-melt extrusion has gained favor in drug delivery applications both in academia and the pharmaceutical industry. Several commercial products made by hot-melt extrusion have been approved by the FDA, demonstrating its commercial feasibility for pharmaceutical processing. A significant number of research articles have reported on advances made regarding the pharmaceutical applications of the hot-melt extrusion processing; however, only limited articles have been focused on general principles regarding formulation and process development. This review provides an in-depth analysis and discussion of the formulation and processing aspects of hot-melt extrusion. The impact of physicochemical properties of drug substances and excipients on formulation development using a hot-melt extrusion process is discussed from a material science point of view. Hot-melt extrusion process development, scale-up, and the interplay of formulation and process attributes are also discussed. Finally, recent applications of hot-melt extrusion to a variety of dosage forms and drug substances have also been addressed.

  7. Pharmaceutical product development: A quality by design approach

    PubMed Central

    Pramod, Kannissery; Tahir, M. Abu; Charoo, Naseem A.; Ansari, Shahid H.; Ali, Javed

    2016-01-01

    The application of quality by design (QbD) in pharmaceutical product development is now a thrust area for the regulatory authorities and the pharmaceutical industry. International Conference on Harmonization and United States Food and Drug Administration (USFDA) emphasized the principles and applications of QbD in pharmaceutical development in their guidance for the industry. QbD attributes are addressed in question-based review, developed by USFDA for the chemistry, manufacturing, and controls section of abbreviated new drug applications. QbD principles, when implemented, lead to successful product development and prompt regulatory approval, reduce the exhaustive validation burden, and significantly reduce post-approval changes. The key elements of QbD viz., target product quality profile, critical quality attributes, risk assessments, design space, control strategy, product lifecycle management, and continual improvement are discussed to understand the performance of dosage forms within design space. Design of experiments, risk assessment tools, and process analytical technology are also discussed for their role in QbD. This review underlines the importance of QbD in inculcating a science-based approach to pharmaceutical product development. PMID:27606256

  8. Pharmaceutical product development: A quality by design approach.

    PubMed

    Pramod, Kannissery; Tahir, M Abu; Charoo, Naseem A; Ansari, Shahid H; Ali, Javed

    2016-01-01

    The application of quality by design (QbD) in pharmaceutical product development is now a thrust area for the regulatory authorities and the pharmaceutical industry. International Conference on Harmonization and United States Food and Drug Administration (USFDA) emphasized the principles and applications of QbD in pharmaceutical development in their guidance for the industry. QbD attributes are addressed in question-based review, developed by USFDA for the chemistry, manufacturing, and controls section of abbreviated new drug applications. QbD principles, when implemented, lead to successful product development and prompt regulatory approval, reduce the exhaustive validation burden, and significantly reduce post-approval changes. The key elements of QbD viz., target product quality profile, critical quality attributes, risk assessments, design space, control strategy, product lifecycle management, and continual improvement are discussed to understand the performance of dosage forms within design space. Design of experiments, risk assessment tools, and process analytical technology are also discussed for their role in QbD. This review underlines the importance of QbD in inculcating a science-based approach to pharmaceutical product development.

  9. Rethinking infant knowledge: toward an adaptive process account of successes and failures in object permanence tasks.

    PubMed

    Munakata, Y; McClelland, J L; Johnson, M H; Siegler, R S

    1997-10-01

    Infants seem sensitive to hidden objects in habituation tasks at 3.5 months but fail to retrieve hidden objects until 8 months. The authors first consider principle-based accounts of these successes and failures, in which early successes imply knowledge of principles and failures are attributed to ancillary deficits. One account is that infants younger than 8 months have the object permanence principle but lack means-ends abilities. To test this, 7-month-olds were trained on means-ends behaviors and were tested on retrieval of visible and occluded toys. Means-ends demands were the same, yet infants made more toy-guided retrievals in the visible case. The authors offer an adaptive process account in which knowledge is graded and embedded in specific behavioral processes. Simulation models that learn gradually to represent occluded objects show how this approach can account for success and failure in object permanence tasks without assuming principles and ancillary deficits.

  10. Games in the environmental context and their strategic use for environmental education.

    PubMed

    Branco, M A A; Weyermüller, A R; Müller, E F; Schneider, G T; Hupffer, H M; Delgado, J; Mossman, J B; Bez, M R; Mendes, T G

    2015-05-01

    This article aims to present the productivity of the assumptions of Philosophical Hermeneutics (Gadamer, 1996) and his discovery of the logical, ontological and structural model of the game that takes place during the experience that is the basis of comprehension. Thus, digital games are proposed as manners, methods and ways to improve the understanding, interpretation and application of the concepts of Sustainability and Environmental Principles. The attraction of the game as a pedagogic space lays in the fact that it takes over and allows the player to internalize ecological sensitivity, something that happens during the play. Finally, the results show an augment on students' motivation, when using the game versus the traditional process.

  11. A Framework for Assessing the Sustainability of Monitored Natural Attenuation

    USGS Publications Warehouse

    Chapelle, Francis H.; Novak, John; Parker, John; Campbell, Bruce G.; Widdowson, Mark A.

    2007-01-01

    The sustainability of monitored natural attenuation (MNA) over time depends upon (1) the presence of chemical/biochemical processes that transform wastes to innocuous byproducts, and (2) the availability of energy to drive these processes to completion. The presence or absence of contaminant-transforming chemical/biochemical processes can be determined by observing contaminant mass loss over time and space (mass balance). The energy available to drive these processes to completion can be assessed by measuring the pool of metabolizable organic carbon available in a system, and by tracing the flow of this energy to available electron acceptors (energy balance). For the special case of chlorinated ethenes in ground-water systems, for which a variety of contaminant-transforming biochemical processes exist, natural attenuation is sustainable when the pool of bioavailable organic carbon is large relative to the carbon flux needed to drive biodegradation to completion. These principles are illustrated by assessing the sustainability of MNA at a chlorinated ethene-contaminated site in Kings Bay, Georgia. Approximately 1,000 kilograms of perchloroethene (PCE) was released to a municipal landfill in the 1978-1980 timeframe, and the resulting plume of chlorinated ethenes migrated toward a nearby housing development. A numerical model, built using the sequential electron acceptor model code (SEAM3D), was used to quantify mass and energy balance in this system. The model considered the dissolution of non-aqueous phase liquid (NAPL) as the source of the PCE, and was designed to trace energy flow from dissolved organic carbon to available electron acceptors in the sequence oxygen > chlorinated ethenes > ferric iron > sulfate > carbon dioxide. 
The model was constrained by (1) comparing simulated and measured rates of ground-water flow, (2) reproducing the observed distribution of electron-accepting processes in the aquifer, (3) comparing observed and measured concentrations of chlorinated ethenes, and (4) reproducing the observed production and subsequent dilution of dissolved chloride, a final degradation product of chloroethene biodegradation. Simulations using the constrained model indicated that an average flux of 5 milligrams per liter per day of organic carbon (CH2O) per model cell (25 square meters) is required to support the short-term sustainability of MNA. Because this flux is small relative to the pool of renewable organic carbon (about 4.7 x 10^7 milligrams [mg] per model cell) present in the soil zone and non-renewable carbon (about 6.9 x 10^8 mg per model cell) in an organic-rich sediment layer overlying the aquifer, the long-term sustainability of MNA is similarly high. This study illustrates that the short- and long-term sustainability of MNA can be assessed by (1) estimating the time required for contaminants to dissolve, disperse, and degrade under ambient hydrologic conditions (time of remediation); (2) quantifying the organic carbon flux to the system needed to consume competing electron acceptors (oxygen) and direct electron flow toward chloroethene degradation (short-term sustainability); and (3) comparing the required flux of organic carbon to the pool of renewable and non-renewable organic carbon over the estimated time of remediation (long-term sustainability). These are general principles that can be used to assess the sustainability of MNA in any hydrologic system.

  12. Extended Producer Responsibility and Product Stewardship for Tobacco Product Waste

    PubMed Central

    Curtis, Clifton; Collins, Susan; Cunningham, Shea; Stigler, Paula; Novotny, Thomas E

    2015-01-01

    This paper reviews several environmental principles, including Extended Producer Responsibility (EPR), Product Stewardship (PS), the Polluter Pays Principle (PPP), and the Precautionary Principle, as they may apply to tobacco product waste (TPW). The review addresses specific criteria that apply in deciding whether a particular toxic product should adhere to these principles; presents three case studies of similar approaches to other toxic and/or environmentally harmful products; and describes 10 possible interventions or policy actions that may help prevent, reduce, and mitigate the effects of TPW. EPR promotes total lifecycle environmental improvements, placing economic, physical, and informational responsibilities onto the tobacco industry, while PS complements EPR, but with responsibility shared by all parties involved in the tobacco product lifecycle. Both principles focus on toxic source reduction, post-consumer take-back, and final disposal of consumer products. These principles when applied to TPW have the potential to substantially decrease the environmental and public health harms of cigarette butts and other TPW throughout the world. TPW is the most commonly littered item picked up during environmental, urban, and coastal cleanups globally. PMID:26457262

  13. Law and order: Assessing and enforcing compliance with ontological modeling principles in the Foundational Model of Anatomy

    PubMed Central

    Zhang, Songmao; Bodenreider, Olivier

    2006-01-01

    The objective of this study is to provide an operational definition of principles with which well-formed ontologies should comply. We define 15 such principles, related to classification (e.g., no hierarchical cycles are allowed; concepts have a reasonable number of children), incompatible relationships (e.g., two concepts cannot stand both in a taxonomic and partitive relation), dependence among concepts, and the co-dependence of equivalent sets of relations. Implicit relations—embedded in concept names or inferred from a combination of explicit relations—are used in this process in addition to the relations explicitly represented. As a case study, we investigate the degree to which the Foundational Model of Anatomy (FMA)—a large ontology of anatomy—complies with these 15 principles. The FMA succeeds in complying with all the principles: totally with one and mostly with the others. Reasons for non-compliance are analyzed and suggestions are made for implementing effective enforcement mechanisms in ontology development environments. The limitations of this study are also discussed. PMID:16144698
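    The first of these principles—no hierarchical cycles—is straightforward to check mechanically over an is-a graph. A minimal sketch (the mini-taxonomy below is invented for illustration, not drawn from the FMA):

```python
# A deliberately faulty mini-taxonomy: each key lists its is-a parents
is_a = {
    "Heart": ["Organ"],
    "Organ": ["AnatomicalStructure"],
    "AnatomicalStructure": ["Heart"],   # back edge creating a hierarchical cycle
    "Cell": ["AnatomicalStructure"],
}

def has_cycle(graph):
    # Three-color depth-first search: reaching a GRAY (in-progress) node again
    # means the is-a hierarchy contains a cycle
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}
    def dfs(node):
        color[node] = GRAY
        for parent in graph.get(node, []):
            c = color.get(parent, WHITE)
            if c == GRAY or (c == WHITE and dfs(parent)):
                return True
        color[node] = BLACK
        return False
    return any(color.get(n, WHITE) == WHITE and dfs(n) for n in graph)

violates_principle = has_cycle(is_a)    # True for the taxonomy above
```

    Auditing the other principles (e.g., incompatible taxonomic/partitive relation pairs) reduces to similar graph queries, which is what makes automated enforcement in ontology tooling feasible.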

  14. Beyond Sexual Assault Surveys: A Model for Comprehensive Campus Climate Assessments

    ERIC Educational Resources Information Center

    McMahon, Sarah; Stepleton, Kate; Cusano, Julia; O'Connor, Julia; Gandhi, Khushbu; McGinty, Felicia

    2018-01-01

    The White House Task Force to Protect Students from Sexual Assault identified campus climate surveys as "the first step" for addressing campus sexual violence. Through a process case study, this article presents one model for engaging in a comprehensive, action-focused campus climate assessment process. Rooted in principles of…

  15. Suffix Ordering and Morphological Processing

    ERIC Educational Resources Information Center

    Plag, Ingo; Baayen, Harald

    2009-01-01

    There is a long-standing debate about the principles constraining the combinatorial properties of suffixes. Hay 2002 and Hay & Plag 2004 proposed a model in which suffixes can be ordered along a hierarchy of processing complexity. We show that this model generalizes to a larger set of suffixes, and we provide independent evidence supporting the…

  16. Practice paper of the academy of nutrition and dietetics: principles of productivity in food and nutrition services: applications in the 21st century health care reform era.

    PubMed

    Gregoire, Mary B; Theis, Monica L

    2015-07-01

    Food and nutrition services, along with the health care organizations they serve, are becoming increasingly complex. These complexities are driven by sometimes conflicting (if not polarizing) human, department, organization, and environment factors and will require that managers shift how they think about and approach productivity in the context of the greater good of the organization and, perhaps, even society. Traditional, single-factor approaches to productivity measurements, while still valuable in the context of departmental trend analysis, are of limited value when assessing departmental performance in the context of an organization's goals and values. As health care continues to change and new models of care are introduced, food and nutrition services managers will need to consider innovative approaches to improve productivity that are consistent with their individual health care organization's vision and mission. Use of process improvement tools such as Lean and Six Sigma as strategies for evaluating and improving food and nutrition services efficiency should be considered. Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  17. Applying lean principles to continuous renal replacement therapy processes.

    PubMed

    Benfield, C Brett; Brummond, Philip; Lucarotti, Andrew; Villarreal, Maria; Goodwin, Adam; Wonnacott, Rob; Talley, Cheryl; Heung, Michael

    2015-02-01

    The application of lean principles to continuous renal replacement therapy (CRRT) processes in an academic medical center is described. A manual audit over six consecutive weeks revealed that 133 5-L bags of CRRT solution were discarded after being dispensed from pharmacy but before clinical use. Lean principles were used to examine the workflow for CRRT preparation and develop and implement an intervention. An educational program was developed to encourage and enhance direct communication between nursing and pharmacy about changes in a patient's condition or CRRT order. It was through this education program that the reordering workflow shifted from nurses to pharmacy technicians. The primary outcome was the number of CRRT solution bags delivered in the preintervention and postintervention periods. Nurses and pharmacy technicians were surveyed to determine their satisfaction with the workflow change. After implementation of lean principles, the mean number of CRRT solution bags dispensed per day of CRRT decreased substantially. Respondents' overall satisfaction with the CRRT solution preparation process increased during the postintervention period, as did satisfaction scores for each individual component of the workflow. The decreased solution waste resulted in projected annual cost savings exceeding $70,000 in product alone. The use of lean principles to identify medication waste in the CRRT workflow and implementation of an intervention to shift the workload from intensive care unit nurses to pharmacy technicians led to reduced CRRT solution waste, improved efficiency of CRRT workflow, and increased satisfaction among staff. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  18. Minimal Cohomological Model of a Scalar Field on a Riemannian Manifold

    NASA Astrophysics Data System (ADS)

    Arkhipov, V. V.

    2018-04-01

    Lagrangians of the field-theory model of a scalar field are considered as 4-forms on a Riemannian manifold. The model is constructed on the basis of the Hodge inner product, the latter being an analog of the scalar product of two functions. Including in the action terms containing tetrad basis fields makes it possible to reproduce the Klein-Gordon equation, the Maxwell equations, and the Einstein-Hilbert action. We conjecture that the principle of constructing Lagrangians as 4-forms can give a criterion restricting the possible forms of field-theory models.

  19. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies.

    PubMed

    Lorenz, Ralph D

    2010-05-12

    The 'two-box model' of planetary climate is discussed. This model has been used to demonstrate consistency of the equator-pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the observed day : night temperature contrast observed on the extrasolar planet HD 189733b.
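    The two-box MEP calculation the abstract describes can be reproduced in a few lines: balance energy in each box, then scan the meridional heat flux for the value that maximizes entropy production. The sketch below uses a linearized outgoing-longwave parameterization; the A, B, and absorbed-solar constants are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Linearized outgoing longwave radiation: OLR = A + B*T (W m^-2, T in deg C)
A_olr, B_olr = 203.0, 2.09
S_eq, S_pole = 300.0, 170.0      # assumed absorbed solar flux in each box (W m^-2)

def temperatures(F):
    # Steady-state energy balance per box for meridional heat flux F (W m^-2)
    T_eq = (S_eq - F - A_olr) / B_olr
    T_pole = (S_pole + F - A_olr) / B_olr
    return T_eq + 273.15, T_pole + 273.15    # Kelvin

def entropy_production(F):
    # Entropy produced by moving flux F from the warm box to the cold box
    T1, T2 = temperatures(F)
    return F * (1.0 / T2 - 1.0 / T1)

# Scan F: entropy production vanishes at F = 0 and where T1 = T2, peaking between
F_grid = np.linspace(0.0, 65.0, 6501)
sigma = np.array([entropy_production(F) for F in F_grid])
F_mep = F_grid[int(np.argmax(sigma))]
```

    The interior maximum is the MEP prediction for the equator-pole heat transport; the paper's point is that this zero-dimensional picture breaks down once strong feedbacks require finer resolution.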

  20. Laboratory Modelling of Volcano Plumbing Systems: a review

    NASA Astrophysics Data System (ADS)

    Galland, Olivier; Holohan, Eoghan P.; van Wyk de Vries, Benjamin; Burchardt, Steffi

    2015-04-01

    Earth scientists have, since the 19th century, tried to replicate or model geological processes in controlled laboratory experiments. In particular, laboratory modelling has been used to study the development of volcanic plumbing systems, which sets the stage for volcanic eruptions. Volcanic plumbing systems involve complex processes that act at length scales of microns to thousands of kilometres and at time scales from milliseconds to billions of years, and laboratory models appear very suitable to address them. This contribution reviews laboratory models dedicated to studying the dynamics of volcano plumbing systems (Galland et al., Accepted). The foundation of laboratory models is the choice of relevant model materials, both for rock and magma. We outline a broad range of suitable model materials used in the literature. These materials exhibit very diverse rheological behaviours, so their careful choice is a crucial first step for the proper experiment design. The second step is model scaling, which successively calls upon: (1) the principle of dimensional analysis, and (2) the principle of similarity. The dimensional analysis aims to identify the dimensionless physical parameters that govern the underlying processes. The principle of similarity states that "a laboratory model is equivalent to his geological analogue if the dimensionless parameters identified in the dimensional analysis are identical, even if the values of the governing dimensional parameters differ greatly" (Barenblatt, 2003). The application of these two steps ensures a solid understanding and geological relevance of the laboratory models. In addition, this procedure shows that laboratory models are not designed to exactly mimic a given geological system, but to understand underlying generic processes, either individually or in combination, and to identify or demonstrate physical laws that govern these processes. 
From this perspective, we review the numerous applications of laboratory models to understand the distinct key features of volcanic plumbing systems: dykes, cone sheets, sills, laccoliths, caldera-related structures, ground deformation, magma/fault interactions, and explosive vents. Barenblatt, G.I., 2003. Scaling. Cambridge University Press, Cambridge. Galland, O., Holohan, E.P., van Wyk de Vries, B., Burchardt, S., Accepted. Laboratory modelling of volcanic plumbing systems: A review, in: Breitkreuz, C., Rocchi, S. (Eds.), Laccoliths, sills and dykes: Physical geology of shallow level magmatic systems. Springer.
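    The dimensional-analysis step described above amounts to bookkeeping exponents of mass, length, and time until they cancel. A minimal sketch (the Reynolds number is used as a generic example of a dimensionless pi-group, not one of the paper's volcanic groups):

```python
import numpy as np

# Dimension vectors as integer exponents of (M, L, T);
# e.g. dynamic viscosity in Pa*s has dimensions M^1 L^-1 T^-1
dims = {
    "rho": np.array([1, -3, 0]),    # density, kg m^-3
    "v":   np.array([0, 1, -1]),    # velocity, m s^-1
    "L":   np.array([0, 1, 0]),     # characteristic length, m
    "mu":  np.array([1, -1, -1]),   # dynamic viscosity, Pa s
}

# Candidate pi group: Re = rho * v * L / mu.
# Multiplying quantities adds exponent vectors; dividing subtracts them,
# so the group is dimensionless iff the exponents sum to zero.
reynolds_exponents = dims["rho"] + dims["v"] + dims["L"] - dims["mu"]
```

    Matching such dimensionless groups between a sandbox experiment and its geological analogue is exactly what the similarity principle quoted above requires.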

  1. Possible dynamical explanations for Paltridge's principle of maximum entropy production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Virgo, Nathaniel, E-mail: nathanielvirgo@gmail.com; Ikegami, Takashi, E-mail: nathanielvirgo@gmail.com

    2014-12-05

    Throughout the history of non-equilibrium thermodynamics a number of theories have been proposed in which complex, far from equilibrium flow systems are hypothesised to reach a steady state that maximises some quantity. Perhaps the most celebrated is Paltridge's principle of maximum entropy production for the horizontal heat flux in Earth's atmosphere, for which there is some empirical support. There have been a number of attempts to derive such a principle from maximum entropy considerations. However, we currently lack a more mechanistic explanation of how any particular system might self-organise into a state that maximises some quantity. This is in contrast to equilibrium thermodynamics, in which models such as the Ising model have been a great help in understanding the relationship between the predictions of MaxEnt and the dynamics of physical systems. In this paper we show that, unlike in the equilibrium case, Paltridge-type maximisation in non-equilibrium systems cannot be achieved by a simple dynamical feedback mechanism. Nevertheless, we propose several possible mechanisms by which maximisation could occur. Showing that these occur in any real system is a task for future work. The possibilities presented here may not be the only ones. We hope that by presenting them we can provoke further discussion about the possible dynamical mechanisms behind extremum principles for non-equilibrium systems, and their relationship to predictions obtained through MaxEnt.

  2. A Community-Based Participatory Approach to Personalized, Computer-Generated Nutrition Feedback Reports: The Healthy Environments Partnership

    PubMed Central

    Kannan, Srimathi; Schulz, Amy; Israel, Barbara; Ayra, Indira; Weir, Sheryl; Dvonch, Timothy J.; Rowe, Zachary; Miller, Patricia; Benjamin, Alison

    2008-01-01

    Background Computer tailoring and personalizing recommendations for dietary health-promoting behaviors are in accordance with community-based participatory research (CBPR) principles, which emphasize research that benefits the participants and community involved. Objective To describe the CBPR process utilized to computer-generate and disseminate personalized nutrition feedback reports (NFRs) for Detroit Healthy Environments Partnership (HEP) study participants. Methods The CBPR process included discussion and feedback from HEP partners on several draft personalized reports. The nutrition feedback process included defining the feedback objectives; prioritizing the nutrients; customizing the report design; reviewing and revising the NFR template and readability; producing and disseminating the report; and participant follow-up. Lessons Learned Application of CBPR principles in designing the NFR resulted in a reader-friendly product with useful recommendations to promote heart health. Conclusions A CBPR process can enhance computer tailoring of personalized NFRs to address racial and socioeconomic disparities in cardiovascular disease (CVD). PMID:19337572

  3. The maximum entropy production principle: two basic questions.

    PubMed

    Martyushev, Leonid M

    2010-05-12

    The overwhelming majority of maximum entropy production applications to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we briefly discuss the maximum entropy production principle and raise two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics, and (ii) is it possible to 'prove' the principle? We adduce one more proof, which is the most concise available today.

  4. 36 CFR 219.2 - Principles.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sustainability of national forests and grasslands to provide for a wide variety of uses, values, products, and services. The benefits sought from these lands depend upon long-term ecological sustainability. Considering... processes and the ability of these natural resources to contribute to sustainability in the future. (1...

  5. Use of data sources, analytical techniques, and public involvement : MPO environmental justice report

    DOT National Transportation Integrated Search

    2001-01-01

    In the wake of new Federal guidelines on environmental justice that amplify Title VI of the Civil Rights Act, growing attention has been placed on the need to incorporate environmental justice principles into the processes and products of transportat...

  6. Potentials for win-win alliances among animal agriculture and forest products industries: application of the principles of industrial ecology and sustainable development.

    PubMed

    Cowling, Ellis B; Furiness, Carl S

    2005-12-01

    Commercial forests in many parts of the world are deficient in nitrogen and phosphorus. These nutrient-deficient forests often exist in close proximity to large animal feeding operations, meat processing and other food, textile, or other biomass-processing plants, and municipal waste treatment facilities. Many of these facilities produce large surpluses of nitrogen, phosphorus, and organic matter as gaseous ammonia, urea, uric acid, phosphorus compounds, bacterial sludges, and partially treated municipal wastewaters. These co-existing and substantial nutrient deficiencies and surpluses offer ready-made opportunities for discovery, demonstration, and commercial development of science-based, technology-facilitated, environmentally sound, economically viable, and socially acceptable "win-win alliances" among these major industries based on the principles of industrial ecology and sustainable development. The major challenge is to discover practical means to capture the surplus nutrients and put them to work in forest stands from which value-added products can be produced and sold at a profit.

  7. Applying Toyota Production System principles to a psychiatric hospital: making transfers safer and more timely.

    PubMed

    Young, John Q; Wachter, Robert M

    2009-09-01

    Health care organizations have increasingly embraced industrial methods, such as the Toyota Production System (TPS), to improve quality, safety, timeliness, and efficiency. However, the use of such methods in psychiatric hospitals has been limited. A psychiatric hospital applied TPS principles to patient transfers to the outpatient medication management clinics (MMCs) from all other inpatient and outpatient services within the hospital's system. Sources of error and delay were identified, and a new process was designed to improve timely access (measured by elapsed time from request for transfer to scheduling of an appointment and to the actual visit) and patient safety by decreasing communication errors (measured by number of failed transfers). Complexity was substantially reduced, with one streamlined pathway replacing five distinct and more complicated pathways. To assess sustainability, the postintervention period was divided into Period 1 (first 12 months) and Period 2 (next 24 months). Time required to process the transfer and schedule the first appointment was reduced by 74.1% in Period 1 (p < .001) and by an additional 52.7% in Period 2 (p < .0001) for an overall reduction of 87% (p < .0001). Similarly, time to the actual appointment was reduced 31.2% in Period 1 (p < .0001), but was stable in Period 2 (p = .48). The number of transfers per month successfully processed and scheduled increased 95% in the postintervention period compared with the pre-implementation period (p = .015). Finally, data for failed transfers were only available for the postintervention period, and the rate decreased 89% in Period 2 compared with Period 1 (p = .017). The application of TPS principles enhanced access and safety through marked and sustained improvements in the transfer process's timeliness and reliability. Almost all transfer processes have now been standardized.

  8. Potential of Continuous Manufacturing for Liposomal Drug Products.

    PubMed

    Worsham, Robert D; Thomas, Vaughan; Farid, Suzanne S

    2018-05-21

    Over the last several years, continuous manufacturing of pharmaceuticals has evolved from bulk APIs and solid oral dosages into the more complex realm of biologics. The development of continuous downstream processing techniques has allowed biologics manufacturing to realize the benefits (e.g. improved economics, more consistent quality) that come with continuous processing. If relevant processing techniques and principles are selected, the opportunity arises to develop continuous manufacturing designs for additional pharmaceutical products including liposomal drug formulations. Liposome manufacturing has some inherent aspects that make it favorable for a continuous process. Other aspects such as formulation refinement, materials of construction, and aseptic processing need development, but present an achievable challenge. This paper reviews the current state of continuous manufacturing technology applicable to liposomal drug product manufacturing and an assessment of the challenges and potential of this application.

  9. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    NASA Astrophysics Data System (ADS)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for the identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole anchovy production process from the receipt of raw materials to the packaging of the final product. Data were analyzed using descriptive analysis. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of the 5 initial steps and 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes, and in the final sorting step, with the significant hazard of foreign material contamination in the product. The actions taken were controlling the boiling temperature at 100–105 °C for 3–5 minutes and training the sorting process employees.
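    The CCP control described for the boiling step (100–105 °C for 3–5 minutes) amounts to checking each batch against two critical limits. A minimal monitoring sketch, with a record format invented purely for illustration:

    ```python
    # Sketch of a CCP monitoring check for the boiling step above.
    # The critical limits follow the abstract; the batch records and
    # pass/fail wording are illustrative assumptions.

    BOIL_TEMP_C = (100.0, 105.0)   # critical temperature limits (°C)
    BOIL_TIME_MIN = (3.0, 5.0)     # critical time limits (minutes)

    def ccp_boiling_ok(temp_c: float, time_min: float) -> bool:
        """Return True if the boiling step stayed within both critical limits."""
        return (BOIL_TEMP_C[0] <= temp_c <= BOIL_TEMP_C[1]
                and BOIL_TIME_MIN[0] <= time_min <= BOIL_TIME_MIN[1])

    batches = [(102.5, 4.0), (98.0, 4.0), (103.0, 6.0)]
    for temp, minutes in batches:
        status = "pass" if ccp_boiling_ok(temp, minutes) else "CORRECTIVE ACTION"
        print(f"boil {temp:.1f} C for {minutes:.1f} min -> {status}")
    ```

    In HACCP terms, any batch failing the check would trigger the corrective action and record-keeping principles rather than silently proceeding to packaging.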

  10. MPR

    PubMed Central

    Killeen, Peter R.; Sitomer, Matthew T.

    2008-01-01

    Mathematical Principles of Reinforcement (MPR) is a theory of reinforcement schedules. This paper reviews the origin of the principles constituting MPR: arousal, association and constraint. Incentives invigorate responses, in particular those preceding and predicting the incentive. The process that generates an associative bond between stimuli, responses and incentives is called coupling. The combination of arousal and coupling constitutes reinforcement. Models of coupling play a central role in the evolution of the theory. The time required to respond constrains the maximum response rates, and generates a hyperbolic relation between rate of responding and rate of reinforcement. Models of control by ratio schedules are developed to illustrate the interaction of the principles. Correlations among parameters are incorporated into the structure of the models, and assumptions that were made in the original theory are refined in light of current data. PMID:12729968
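    The hyperbolic relation between response rate and reinforcement rate mentioned in the abstract can be illustrated with a generic hyperbola whose ceiling is set by the time required to emit a response. This is the general functional form only, with made-up parameter values, not MPR's fitted parameterization:

    ```python
    # Generic hyperbolic rate relation: responding B rises with
    # reinforcement rate R toward a ceiling 1/delta, where delta is the
    # time required to emit a response. Parameter values are illustrative.

    def response_rate(R, delta=0.25, R0=20.0):
        """Hyperbola B(R) = (1/delta) * R / (R + R0); R0 is a half-saturation
        constant standing in for the coupling/arousal terms of the theory."""
        return (1.0 / delta) * R / (R + R0)

    for R in (5, 20, 80, 320):
        print(R, round(response_rate(R), 2))
    ```

    The ceiling 1/delta captures the constraint principle: no schedule can drive responding faster than responses can physically be emitted.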

  11. Integration of human factors principles in LARG organizations--a conceptual model.

    PubMed

    Figueira, Sara; Machado, V Cruz; Nunes, Isabel L

    2012-01-01

    Nowadays many companies are undergoing organizational transformations in order to meet changing market demands. Thus, in order to become more competitive, supply chains (SC) are adopting new management paradigms to improve SC performance: lean, agile, resilient and green (the LARG paradigms). The implementation of new production paradigms demands particular care with Human Factors issues, to avoid health and safety problems for workers and losses for companies. The successful introduction of these new production paradigms therefore depends, among other things, on a Human Factors-oriented approach. This work presents a conceptual framework for integrating ergonomic and safety design principles into the different implementation phases of lean, agile, resilient and green practices.

  12. Physics and the production of antibiotics

    NASA Astrophysics Data System (ADS)

    Fairbrother, Robert; Riddle, Wendy; Fairbrother, Neil

    2006-01-01

    This article is the first in a series that describe some of the physics involved in the production of antibiotics. The field is often referred to as biochemical engineering but this does not indicate the considerable part played by physics and physicists. It is a process that undergoes continual research and development. Penicillin has been selected for the focus of this article, although the engineering principles and underlying physics apply to the production of other microbial products such as amino acids (which can be used as food additives), bulk chemicals such as ethanol (used in everything from hair spray and aftershave to solvents for paints and explosives) and the well-known processes of brewing and baking. In this article the application of physics to the design of the fermenter—the giant vessel in which the production of these products occurs—is discussed.

  13. Quality by design case study: an integrated multivariate approach to drug product and process development.

    PubMed

    Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder

    2009-12-01

    To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.
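    The combination of a factorial DOE with regression modeling described above can be sketched as follows. The three factor names match the abstract (water amount, wet massing time, lubrication time), but the coded design levels, the synthetic response values, and the main-effects-plus-interactions model form are invented for illustration:

    ```python
    import numpy as np

    # Coded 2^3 full factorial in the three design factors from the
    # abstract: water amount (x1), wet massing time (x2), lubrication
    # time (x3), at coded levels -1/+1.
    design = np.array([[x1, x2, x3]
                       for x1 in (-1, 1) for x2 in (-1, 1) for x3 in (-1, 1)],
                      dtype=float)

    # Synthetic "dissolution" response: a strong water effect, a smaller
    # lubrication effect, and a water x lubrication interaction.
    y = 70 + 8 * design[:, 0] - 3 * design[:, 2] + 2 * design[:, 0] * design[:, 2]

    # Model matrix: intercept, main effects, two-factor interactions.
    X = np.column_stack([
        np.ones(len(design)),
        design,                        # x1, x2, x3
        design[:, 0] * design[:, 1],   # x1*x2
        design[:, 0] * design[:, 2],   # x1*x3
        design[:, 1] * design[:, 2],   # x2*x3
    ])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("intercept and effects:", np.round(coef, 3))
    ```

    Because the factorial columns are orthogonal, least squares recovers each effect independently; in a real QbD study the fitted surface would then be explored to define the design space for the CQAs.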

  14. Supercritical fluid extraction. Principles and practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McHugh, M.A.; Krukonis, V.J.

    This book is a presentation of the fundamentals and application of super-critical fluid solvents (SCF). The authors cover virtually every facet of SCF technology: the history of SCF extraction, its underlying thermodynamic principles, process principles, industrial applications, and analysis of SCF research and development efforts. The thermodynamic principles governing SCF extraction are covered in depth. The often complex three-dimensional pressure-temperature composition (PTx) phase diagrams for SCF-solute mixtures are constructed in a coherent step-by-step manner using the more familiar two-dimensional Px diagrams. The experimental techniques used to obtain high pressure phase behavior information are described in detail and the advantages and disadvantages of each technique are explained. Finally, the equations used to model SCF-solute mixtures are developed, and modeling results are presented to highlight the correlational strengths of a cubic equation of state.

  15. [Chapter 7. Big Data or the illusion of a synthesis by aggregation. Epistemological, ethical and political critics].

    PubMed

    Coutellec, Léo; Weil-Dubuc, Paul-Loup

    2017-10-27

    In this article, we propose a critical approach to the big data phenomenon by deconstructing the methodological principle that structures its logic: the principle of aggregation. Our hypothesis sits upstream of the critiques that treat the use of big data as a new mode of government. Aggregation, as a way of processing the heterogeneity of data, structures big data thinking; it is its very logic. Fragmenting in order to aggregate better, aggregating in order to fragment better: a dialectic resting on a presumption of generalized aggregability and on the claim that aggregation is the privileged route to producing new syntheses. We proceed in three steps to deconstruct this idea and to undo aggregation's claim to assert itself as a new way of producing knowledge, as a new synthesis of identity, and finally as a new model of solidarity. In each case we show that these attempts at aggregation fail to produce their objects: no knowledge, no identity, no solidarity can result from a process of amalgamation. In all three cases, aggregation is always accompanied by a moment of fragmentation, of which dissociation, dislocation and separation are different figures. Our wager, then, is to unsettle what presents itself as a new way of thinking about humankind and the world.

  16. Acid and alkaline solubilization (pH shift) process: a better approach for the utilization of fish processing waste and by-products.

    PubMed

    Surasani, Vijay Kumar Reddy

    2018-05-22

    Several technologies and methods have been developed over the years to address the environmental pollution and nutritional losses associated with the dumping of fish processing waste and low-cost fish and by-products. Despite continuous efforts in this field, none of these technologies succeeded in addressing the issues, owing to various technical problems. To solve the problems associated with fish processing waste and low-value fish and by-products, a process called the pH shift (acid and alkaline solubilization) process was developed. In this process, proteins are first solubilized using acid or alkali and then precipitated at their isoelectric pH to recover functional and stable protein isolates from underutilized fish species and by-products. Many studies have used the pH shift process to recover proteins from fish and fish by-products, and it has proved highly successful, recovering proteins at higher yields than the conventional surimi (three-cycle washing) process and with good functional properties. In this paper, the problems associated with conventional processing, the advantages and principle of pH shift processing, the effect of the pH shift process on the quality and storage stability of recovered isolates, and applications of protein isolates are discussed in detail.

  17. Marketing is everything.

    PubMed

    McKenna, R

    1991-01-01

    Technology is creating customer choice, and choice is altering the marketplace. Gone are the days of the marketer as salesperson. Gone as well is marketing that tries to trick the customer into buying whatever the company makes. There is a new paradigm for marketing, a model that depends on the marketer's knowledge, experience, and ability to integrate the customer and the company. Six principles are at the heart of the new marketing. The first, "Marketing is everything and everything is marketing," suggests that marketing is like quality. It is not a function but an all-pervasive way of doing business. The second, "The goal of marketing is to own the market, not just to sell the product," is a remedy for companies that adopt a limiting "market-share mentality." When you own a market, you lead the market. The third principle says that "marketing evolves as technology evolves." Programmable technology means that companies can promise customers "any thing, any way, any time." Now marketing is evolving to deliver on that promise. The fourth principle, "Marketing moves from monologue to dialogue," argues that advertising is obsolete. Talking at customers is no longer useful. The new marketing requires a feedback loop--a dialogue between company and customer. The fifth principle says that "marketing a product is marketing a service is marketing a product." The line between the categories is fast eroding: the best manufacturing companies provide great service, the best service companies think of themselves as offering high-quality products. The sixth principle, "Technology markets technology," points out the inevitable marriage of marketing and technology and predicts the emergence of marketing workstations, a marketing counterpart to engineers' CAD/CAM systems.

  18. KC-135 Simulator Systems Engineering Case Study

    DTIC Science & Technology

    2010-01-01

    performance. The utilization and misutilization of SE principles are highlighted, with special emphasis on the conditions that foster and impede...process, from the identification of the need to the development and utilization of the product, must continuously integrate and optimize system and... utilizing the Friedman-Sage framework to organize the assessment of the application of the SE process. The framework and the derived matrix can

  19. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  20. Emerging engineering principles for yield improvement in microbial cell design.

    PubMed

    Comba, Santiago; Arabolaza, Ana; Gramajo, Hugo

    2012-01-01

    Metabolic Engineering has undergone a rapid transformation in the last ten years, making real progress towards the production of a wide range of molecules and fine chemicals using a designed cellular host. However, the maximization of product yields through pathway optimization is a constant and central challenge of this field. Traditional methods used to improve the production of target compounds from engineered biosynthetic pathways in non-native hosts include: codon usage optimization, elimination of the accumulation of toxic intermediates or byproducts, enhanced production of rate-limiting enzymes, selection of appropriate promoter and ribosome binding sites, application of directed evolution of enzymes, and chassis re-circuit. Overall, these approaches tend to be specific to each engineering project rather than a systematic practice based on a more generalizable strategy. In this mini-review, we highlight some novel and extensive approaches and tools intended to improve target product formation, founded on sophisticated principles such as dynamic control, modularization of pathway genes, and flux modeling.

  1. Emerging engineering principles for yield improvement in microbial cell design

    PubMed Central

    Comba, Santiago; Arabolaza, Ana; Gramajo, Hugo

    2012-01-01

    Metabolic Engineering has undergone a rapid transformation in the last ten years, making real progress towards the production of a wide range of molecules and fine chemicals using a designed cellular host. However, the maximization of product yields through pathway optimization is a constant and central challenge of this field. Traditional methods used to improve the production of target compounds from engineered biosynthetic pathways in non-native hosts include: codon usage optimization, elimination of the accumulation of toxic intermediates or byproducts, enhanced production of rate-limiting enzymes, selection of appropriate promoter and ribosome binding sites, application of directed evolution of enzymes, and chassis re-circuit. Overall, these approaches tend to be specific to each engineering project rather than a systematic practice based on a more generalizable strategy. In this mini-review, we highlight some novel and extensive approaches and tools intended to improve target product formation, founded on sophisticated principles such as dynamic control, modularization of pathway genes, and flux modeling. PMID:24688676

  2. Guidelines for Risk-Based Changeover of Biopharma Multi-Product Facilities.

    PubMed

    Lynch, Rob; Barabani, David; Bellorado, Kathy; Canisius, Peter; Heathcote, Doug; Johnson, Alan; Wyman, Ned; Parry, Derek Willison

    2018-01-01

    In multi-product biopharma facilities, protecting products from contamination when multiple products are manufactured simultaneously is paramount to assuring product quality. To that end, traditional changeover methods (elastomer change-out, full sampling, etc.) have been widely used within the industry and accepted by regulatory agencies. However, with the endorsement of Quality Risk Management (1), risk-based approaches may be applied to assess and continuously improve established changeover processes. All processes, including changeover, can be improved with investment (money/resources), parallel activities, equipment design improvements, and standardization. However, processes can also be improved by eliminating waste. For product changeover, waste is any activity not needed for the new process or that does not provide added assurance of the quality of the subsequent product. The application of a risk-based approach to changeover aligns with the principles of Quality Risk Management. Through the use of risk assessments, appropriate changeover controls can be identified and maintained to assure product quality. Likewise, risk assessments and risk-based approaches may be used to improve operational efficiency, reduce waste, and permit concurrent manufacturing of products. © PDA, Inc. 2018.

  3. Industrial applications using BASF eco-efficiency analysis: perspectives on green engineering principles.

    PubMed

    Shonnard, David R; Kicherer, Andreas; Saling, Peter

    2003-12-01

    Life without chemicals would be inconceivable, yet the potential risks and impacts to the environment associated with chemical production and chemical products are viewed critically. Eco-efficiency analysis considers the economic and life cycle environmental effects of a product or process, giving these equal weighting. The major elements of the environmental assessment include primary energy use, raw materials utilization, emissions to all media, toxicity, safety risk, and land use. The relevance of each environmental category, and the weighting of economic versus environmental impacts, is evaluated using national emissions and economic data. The eco-efficiency analysis method of BASF is briefly presented, and results from three applications to chemical processes and products are summarized. Through these applications, the eco-efficiency analyses mostly confirm the 12 Principles listed in Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37(5), 94A), with the exception that, in one application, production systems based on bio-based feedstocks were not the most eco-efficient compared with those based on fossil resources. Over 180 eco-efficiency analyses have been conducted at BASF, and their results have been used to support strategic decision-making, marketing, research and development, and communication with external parties. Eco-efficiency analysis, as one important strategy and success factor in sustainable development, will continue to be a very strong operational tool at BASF.

  4. Modeling Spatial Dependencies and Semantic Concepts in Data Mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vatsavai, Raju

    Data mining is the process of discovering new patterns and relationships in large datasets. However, several studies have shown that general data mining techniques often fail to extract meaningful patterns and relationships from spatial data owing to the violation of fundamental geospatial principles. In this tutorial, we introduce the basic principles behind explicit modeling of spatial and semantic concepts in data mining. In particular, we focus on modeling these concepts in the widely used classification, clustering, and prediction algorithms. Classification is the process of learning a structure or model (from user-given inputs) and applying the known model to new data. Clustering is the process of discovering groups and structures in the data that are "similar," without applying any known structures in the data. Prediction is the process of finding a function that models (explains) the data with least error. One common assumption among all these methods is that the data are independent and identically distributed. Such assumptions do not hold well in spatial data, where spatial dependency and spatial heterogeneity are the norm. In addition, spatial semantics are often ignored by data mining algorithms. In this tutorial we cover recent advances in explicit modeling of spatial dependencies and semantic concepts in data mining.
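    The spatial dependency that breaks the i.i.d. assumption can be made concrete with a standard spatial autocorrelation statistic such as Moran's I, a textbook measure used here for illustration rather than one taken from the tutorial itself. Positive values indicate that nearby observations carry similar values, exactly the situation ordinary data mining algorithms ignore:

    ```python
    def morans_i(values, weights):
        """Moran's I for a list of values and a dict {(i, j): w_ij} of
        spatial weights (i != j). I > 0 indicates positive spatial
        autocorrelation: nearby observations are similar."""
        n = len(values)
        mean = sum(values) / n
        dev = [v - mean for v in values]
        w_sum = sum(weights.values())
        num = sum(w * dev[i] * dev[j] for (i, j), w in weights.items())
        den = sum(d * d for d in dev)
        return (n / w_sum) * (num / den)

    # A smooth 1-D transect (strong spatial trend) vs. a shuffled one.
    smooth = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
    rough = [4.0, 1.0, 6.0, 2.0, 5.0, 3.0]
    # Symmetric adjacency weights along the transect.
    w = {}
    for i in range(5):
        w[(i, i + 1)] = 1.0
        w[(i + 1, i)] = 1.0
    print("smooth:", round(morans_i(smooth, w), 3))   # strongly positive
    print("rough: ", round(morans_i(rough, w), 3))    # negative
    ```

    A classifier that treats the smooth transect's cells as independent samples would badly misestimate its effective sample size, which is the kind of failure the tutorial's explicit spatial models are designed to avoid.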

  5. Crop biometric maps: the key to prediction.

    PubMed

    Rovira-Más, Francisco; Sáiz-Rubio, Verónica

    2013-09-23

    The sustainability of agricultural production in the twenty-first century, both in industrialized and developing countries, benefits from the integration of farm management with information technology such that individual plants, rows, or subfields may be endowed with a singular "identity." This approach approximates the nature of agricultural processes to the engineering of industrial processes. In order to cope with the vast variability of nature and the uncertainties of agricultural production, the concept of crop biometrics is defined as the scientific analysis of agricultural observations confined to spaces of reduced dimensions and known position with the purpose of building prediction models. This article develops the idea of crop biometrics by setting its principles, discussing the selection and quantization of biometric traits, and analyzing the mathematical relationships among measured and predicted traits. Crop biometric maps were applied to the case of a wine-production vineyard, in which vegetation amount, relative altitude in the field, soil compaction, berry size, grape yield, juice pH, and grape sugar content were selected as biometric traits. The enological potential of grapes was assessed with a quality-index map defined as a combination of titratable acidity, sugar content, and must pH. Prediction models for yield and quality were developed for high and low resolution maps, showing the great potential of crop biometric maps as a strategic tool for vineyard growers as well as for crop managers in general, due to the wide versatility of the methodology proposed.
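    A quality-index map of the kind described, combining titratable acidity, sugar content, and must pH per grid cell, can be sketched as a weighted sum of min-max-normalized traits. The weights, the direction each trait enters, and the sample cell values below are all hypothetical; the paper's actual index definition may differ:

    ```python
    # Hypothetical quality index for vineyard grid cells, combining the
    # three traits named in the abstract. Weights and data are invented
    # for illustration only.

    def normalize(values):
        """Min-max normalize a trait across all grid cells to [0, 1]."""
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) for v in values]

    def quality_index(sugar_brix, acidity_gl, must_ph,
                      weights=(0.5, 0.3, 0.2)):
        """Weighted sum of normalized traits, one score per cell."""
        traits = [normalize(sugar_brix), normalize(acidity_gl), normalize(must_ph)]
        return [sum(w * t[i] for w, t in zip(weights, traits))
                for i in range(len(sugar_brix))]

    # Three grid cells of a hypothetical vineyard map.
    scores = quality_index([20.0, 24.0, 22.0], [6.5, 5.0, 5.9], [3.3, 3.6, 3.4])
    print([round(s, 3) for s in scores])
    ```

    Mapping such per-cell scores back to their known positions is what turns point measurements into the biometric map the article describes, usable for zone-wise harvest decisions.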

  6. Crop Biometric Maps: The Key to Prediction

    PubMed Central

    Rovira-Más, Francisco; Sáiz-Rubio, Verónica

    2013-01-01

    The sustainability of agricultural production in the twenty-first century, both in industrialized and developing countries, benefits from the integration of farm management with information technology such that individual plants, rows, or subfields may be endowed with a singular “identity.” This approach approximates the nature of agricultural processes to the engineering of industrial processes. In order to cope with the vast variability of nature and the uncertainties of agricultural production, the concept of crop biometrics is defined as the scientific analysis of agricultural observations confined to spaces of reduced dimensions and known position with the purpose of building prediction models. This article develops the idea of crop biometrics by setting its principles, discussing the selection and quantization of biometric traits, and analyzing the mathematical relationships among measured and predicted traits. Crop biometric maps were applied to the case of a wine-production vineyard, in which vegetation amount, relative altitude in the field, soil compaction, berry size, grape yield, juice pH, and grape sugar content were selected as biometric traits. The enological potential of grapes was assessed with a quality-index map defined as a combination of titratable acidity, sugar content, and must pH. Prediction models for yield and quality were developed for high and low resolution maps, showing the great potential of crop biometric maps as a strategic tool for vineyard growers as well as for crop managers in general, due to the wide versatility of the methodology proposed. PMID:24064605

  7. Green nanoparticle production using micro reactor technology

    NASA Astrophysics Data System (ADS)

    Kück, A.; Steinfeldt, M.; Prenzel, K.; Swiderek, P.; Gleich, A. v.; Thöming, J.

    2011-07-01

    The importance and potential of nanoparticles in daily life as well as in various industrial processes is becoming more predominant. Specifically, silver nanoparticles are increasingly applied, e.g. in clothes and wipes, due to their antibacterial properties. For applications in the liquid phase it is advantageous to produce the nanoparticles directly in suspension. This article describes a green production of silver nanoparticles using micro reactor technology, considering principles of green chemistry. The aim is to reveal the potential and constraints of this approach and to show how economic and environmental costs vary depending on process conditions. For this purpose our research compares the proposed process with water-based batch synthesis and demonstrates improvements in terms of product quality. Because of its lower energy consumption and lower demand for cleaning agents, the micro reactor is the ecologically preferable choice.

  8. Total Quality Management: Implications for Educational Assessment.

    ERIC Educational Resources Information Center

    Rankin, Stuart C.

    1992-01-01

    Deming's "System of Profound Knowledge" is even more fundamental than his 14-principle system transformation guide and is based on 4 elements: systems theory, statistical variation, a theory of knowledge, and psychology. Management should revamp total system processes so that quality of product is continually improved. Implications for…

  9. Universal Design in Postsecondary Education: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, "is…

  10. Structured decision making as a framework for linking quantitative decision support to community values

    EPA Science Inventory

    Community-level decisions can have large impacts on production and delivery of ecosystem services, which ultimately affects community well-being. But engaging stakeholders in a process to explore these impacts is a significant challenge. The principles of Structured Decision Ma...

  11. Forces Shaping the Electronic Publishing Industry of the 1990s.

    ERIC Educational Resources Information Center

    Hawkins, Donald T.; And Others

    1992-01-01

    Reviews the conventional publishing industry, and discusses a study of the electronic publishing industry and its products and processes. Discusses seven major forces affecting it--technology, economics, demographics, social trends, government policies, applications growth, and industry trends--and outlines principles to follow for success in…

  12. Bioethical Considerations of Advancing the Application of Marine Biotechnology and Aquaculture

    PubMed Central

    Harrell, Reginal M.

    2017-01-01

    Normative ethical considerations of growth of the marine biotechnology and aquaculture disciplines in biopharming, food production, and marine products commercialization from a bioethical perspective have been limited. This paucity of information begs the question of what constitutes a bioethical approach (i.e., respect for individuals or autonomy; beneficence, nonmaleficence, and justice) to marine biotechnology and aquaculture, and whether it is one that is appropriate for consideration. Currently, thoughtful discussion on the bioethical implications of use, development, and commercialization of marine organisms or their products, as well as potential environmental effects, defaults to human biomedicine as a model. One must question the validity of using human bioethical principlism moral norms for appropriating a responsible marine biotechnology and aquaculture ethic. When considering potential impacts within these disciplines, deference must be given to differing value systems in order to find common ground to advance knowledge and avoid emotive impasses that can hinder the science and its application. The import of bioethical considerations when conducting research and/or production is discussed. This discussion is directed toward applying bioethical principles toward technology used for food, biomedical development (e.g., biopharming), or as model species for advancement of knowledge for human diseases. PMID:28672802

  13. Bioethical Considerations of Advancing the Application of Marine Biotechnology and Aquaculture.

    PubMed

    Harrell, Reginal M

    2017-06-24

    Normative ethical considerations of growth of the marine biotechnology and aquaculture disciplines in biopharming, food production, and marine products commercialization from a bioethical perspective have been limited. This paucity of information begs the question of what constitutes a bioethical approach (i.e., respect for individuals or autonomy; beneficence, nonmaleficence, and justice) to marine biotechnology and aquaculture, and whether it is one that is appropriate for consideration. Currently, thoughtful discussion on the bioethical implications of use, development, and commercialization of marine organisms or their products, as well as potential environmental effects, defaults to human biomedicine as a model. One must question the validity of using human bioethical principlism moral norms for appropriating a responsible marine biotechnology and aquaculture ethic. When considering potential impacts within these disciplines, deference must be given to differing value systems in order to find common ground to advance knowledge and avoid emotive impasses that can hinder the science and its application. The import of bioethical considerations when conducting research and/or production is discussed. This discussion is directed toward applying bioethical principles toward technology used for food, biomedical development (e.g., biopharming), or as model species for advancement of knowledge for human diseases.

  14. Problem Solving Model for Science Learning

    NASA Astrophysics Data System (ADS)

    Alberida, H.; Lufri; Festiyed; Barlian, E.

    2018-04-01

    This research aims to develop a problem-solving model for science learning in junior high school. The learning model was developed using the ADDIE model. The analysis phase includes curriculum analysis, analysis of the students of SMP Kota Padang, analysis of SMP science teachers, learning analysis, and a literature review. The design phase includes planning a science-learning problem-solving model consisting of syntax, reaction principle, social system, support system, and instructional and supporting impact. The problem-solving model was implemented in science learning to improve students' science process skills. The development stage consists of three steps: a) designing a prototype, b) performing a formative evaluation, and c) revising the prototype. The implementation stage was carried out through a limited trial, conducted on 24 and 26 August 2015 in Class VII 2 of SMPN 12 Padang. The evaluation phase was conducted as experiments at SMPN 1 Padang, SMPN 12 Padang, and SMP National Padang. Based on the development research done, the syntax of the problem-solving model for science learning at junior high school consists of introduction, observation, initial problems, data collection, data organization, data analysis/generalization, and communicating.

  15. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
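
    As a minimal illustration of the kind of chemometric dimension reduction such multidimensional spectral data call for — the simulated spectra and the choice of plain PCA are assumptions of this sketch, not a specific method from the article — a principal-component decomposition can recover a latent process variable from high-dimensional spectra:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated in-line spectra: 50 samples x 200 wavelengths, driven by one
# latent process variable (e.g. a concentration) plus noise -- illustrative only.
wavelengths = np.linspace(0.0, 1.0, 200)
latent = rng.uniform(0.0, 1.0, size=50)             # hidden process parameter
peak = np.exp(-((wavelengths - 0.5) ** 2) / 0.01)   # absorption-band shape
spectra = latent[:, None] * peak[None, :] + 0.01 * rng.standard_normal((50, 200))

# PCA by SVD on mean-centered spectra: the scores on the first principal
# component should track the latent process variable.
centered = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores_pc1 = U[:, 0] * S[0]

# Sign of a principal component is arbitrary, so compare by absolute correlation.
corr = abs(np.corrcoef(scores_pc1, latent)[0, 1])
print(f"|correlation| between PC1 score and latent variable: {corr:.3f}")
```

    In practice calibrated methods such as PLS regression, rather than unsupervised PCA, are typically used to relate PAT spectra to a quality attribute.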

  16. Grades as Information

    ERIC Educational Resources Information Center

    Grant, Darren

    2007-01-01

    We determine how much observed student performance in microeconomics principles can be attributed, inferentially, to three kinds of student academic "productivity," the instructor, demographics, and unmeasurables. The empirical approach utilizes an ordered probit model that relates student performance in micro to grades in prior…

  17. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1985

    1985-01-01

    Presents 23 experiments, demonstrations, activities, and computer programs in biology, chemistry, and physics. Topics include lead in petrol, production of organic chemicals, reduction of water, enthalpy, X-ray diffraction model, nuclear magnetic resonance spectroscopy, computer simulation for additive mixing of colors, Archimedes Principle, and…

  18. Increasing Therapist Productivity: Using Lean Principles in the Rehabilitation Department of an Academic Medical Center.

    PubMed

    Johnson, Diana; Snedeker, Kristie; Swoboda, Michael; Zalieckas, Cheryl; Dorsey, Rachel; Nohe, Cassandra; Smith, Paige; Roche, Renuka

    The Department of Rehabilitation Services, within the University of Maryland Medical Center's 650-bed academic medical center, was experiencing difficulty in meeting productivity standards. Therapists in the outpatient division believed they were not spending enough time performing billable patient care activities. Therapists in the inpatient division had difficulty keeping pace with the volume of incoming referrals. Collectively, these issues caused dissatisfaction among referral sources and frustration among the staff within the rehabilitation department. The department undertook a phased approach to address these issues that included examining the evidence, using Lean process improvement principles, and employing transformational leadership strategies to drive improvements in productivity and efficiency. The lessons learned support the importance of having meaningful metrics appropriate for the patient population served, the use of Lean as an effective tool for improving productivity in rehabilitation departments, the impact of engaging staff at the grassroots level, and the importance of having commitment from leaders. The study findings have implications for not only rehabilitation and hospital leadership, but CEOs and managers of any business who need to eliminate waste or increase staff productivity.

  19. Human factors systems approach to healthcare quality and patient safety

    PubMed Central

    Carayon, Pascale; Wetterneck, Tosha B.; Rivera-Rodriguez, A. Joy; Hundt, Ann Schoofs; Hoonakker, Peter; Holden, Richard; Gurses, Ayse P.

    2013-01-01

    Human factors systems approaches are critical for improving healthcare quality and patient safety. The SEIPS (Systems Engineering Initiative for Patient Safety) model of work system and patient safety is a human factors systems approach that has been successfully applied in healthcare research and practice. Several research and practical applications of the SEIPS model are described. Important implications of the SEIPS model for healthcare system and process redesign are highlighted. Principles for redesigning healthcare systems using the SEIPS model are described. Balancing the work system and encouraging the active and adaptive role of workers are key principles for improving healthcare quality and patient safety. PMID:23845724

  20. Prospects from agroecology and industrial ecology for animal production in the 21st century.

    PubMed

    Dumont, B; Fortun-Lamothe, L; Jouven, M; Thomas, M; Tichit, M

    2013-06-01

    Agroecology and industrial ecology can be viewed as complementary means for reducing the environmental footprint of animal farming systems: agroecology mainly by stimulating natural processes to reduce inputs, and industrial ecology by closing system loops, thereby reducing demand for raw materials, lowering pollution and saving on waste treatment. Surprisingly, animal farming systems have so far been ignored in most agroecological thinking. On the basis of a study by Altieri, who identified the key ecological processes to be optimized, we propose five principles for the design of sustainable animal production systems: (i) adopting management practices aiming to improve animal health, (ii) decreasing the inputs needed for production, (iii) decreasing pollution by optimizing the metabolic functioning of farming systems, (iv) enhancing diversity within animal production systems to strengthen their resilience and (v) preserving biological diversity in agroecosystems by adapting management practices. We then discuss how these different principles combine to generate environmental, social and economic performance in six animal production systems (ruminants, pigs, rabbits and aquaculture) covering a long gradient of intensification. The two principles concerning economy of inputs and reduction of pollution emerged in nearly all the case studies, a finding that can be explained by the economic and regulatory constraints affecting animal production. Integrated management of animal health was seldom mobilized, as alternatives to chemical drugs have only recently been investigated, and the results are not yet transferable to farming practices. A number of ecological functions and ecosystem services (recycling of nutrients, forage yield, pollination, resistance to weed invasion, etc.) are closely linked to biodiversity, and their persistence depends largely on maintaining biological diversity in agroecosystems. We conclude that the development of such ecology-based alternatives for animal production implies changes in the positions adopted by technicians and extension services, researchers and policymakers. Animal production systems should not only be considered holistically, but also in the diversity of their local and regional conditions. The ability of farmers to make their own decisions on the basis of the close monitoring of system performance is most important to ensure system sustainability.

  1. Steepest entropy ascent quantum thermodynamic model of electron and phonon transport

    NASA Astrophysics Data System (ADS)

    Li, Guanchen; von Spakovsky, Michael R.; Hin, Celine

    2018-01-01

    An advanced nonequilibrium thermodynamic model for electron and phonon transport is formulated based on the steepest-entropy-ascent quantum thermodynamics framework. This framework, based on the principle of steepest entropy ascent (or the equivalent maximum entropy production principle), inherently satisfies the laws of thermodynamics and mechanics and is applicable at all temporal and spatial scales even in the far-from-equilibrium realm. Specifically, the model is proven to recover the Boltzmann transport equations in the near-equilibrium limit and the two-temperature model of electron-phonon coupling when no dispersion is assumed. The heat and mass transport at a temperature discontinuity across a homogeneous interface where the dispersion and coupling of electron and phonon transport are both considered are then modeled. Local nonequilibrium system evolution and nonquasiequilibrium interactions are predicted and the results discussed.

  2. Raman spectroscopy as a process analytical technology for pharmaceutical manufacturing and bioprocessing.

    PubMed

    Esmonde-White, Karen A; Cuellar, Maryann; Uerpmann, Carsten; Lenain, Bruno; Lewis, Ian R

    2017-01-01

    Adoption of Quality by Design (QbD) principles, regulatory support of QbD, process analytical technology (PAT), and continuous manufacturing are major factors effecting new approaches to pharmaceutical manufacturing and bioprocessing. In this review, we highlight new technology developments, data analysis models, and applications of Raman spectroscopy, which have expanded the scope of Raman spectroscopy as a process analytical technology. Emerging technologies such as transmission and enhanced reflection Raman, and new approaches to using available technologies, expand the scope of Raman spectroscopy in pharmaceutical manufacturing, and now Raman spectroscopy is successfully integrated into real-time release testing, continuous manufacturing, and statistical process control. Since the last major review of Raman as a pharmaceutical PAT in 2010, many new Raman applications in bioprocessing have emerged. Exciting reports of in situ Raman spectroscopy in bioprocesses complement a growing scientific field of biological and biomedical Raman spectroscopy. Raman spectroscopy has made a positive impact as a process analytical and control tool for pharmaceutical manufacturing and bioprocessing, with demonstrated scientific and financial benefits throughout a product's lifecycle.

  3. Creating a culture of patient-focused care through a learner-centered philosophy.

    PubMed

    Linscott, J; Spee, R; Flint, F; Fisher, A

    1999-01-01

    This paper will discuss the teaching-learning process used in the Patient-Focused Care Course at a major teaching hospital in Canada that is transforming nursing practice from a provider-driven to a patient-focused approach. The experiential and reflective nature of the course offers opportunities for nurses to link theory with practice, to think critically and reflectively about their own values and beliefs and to translate that meaning into practice. The learning process reflects principles of adult learning based on Knowles' andragogical model, which differs from the traditional pedagogical model of teaching. The essence of andragogy is a constant unfolding process of discovery based on dialogue. Utilization of adult learning principles that support critical thinking and foster transformational change presents an alternative to traditional ways of teaching and learning the art and science of nursing practice.

  4. First evaluation of endotoxins in veterinary autogenous vaccines produced in Italy by LAL assay.

    PubMed

    Antonella, Di Paolo; Katia, Forti; Lucia, Anzalone; Sara, Corneli; Martina, Pellegrini; Giulio, Severi; Monica, Cagiola

    2018-06-21

    Endotoxin contamination is a serious concern for manufacturers of biological products and vaccines in terms of not only quality but also safety parameters. We evaluated the endotoxin presence in different veterinary autogenous vaccines produced by the Pharmaceutical Unit at the Experimental Zooprophylactic Institute of Umbria and Marche "Togo Rosati" (IZSUM). According to the 3Rs principles (Replace, Reduce, Refine), which aim to progressively reduce animal use in the quality control process, we tested the vaccines obtained from gram-negative bacteria and adjuvants by the limulus amebocyte lysate (LAL) assay. The results revealed low endotoxin concentrations compared to available data in the literature and represent the first report of the application of the 3Rs principles to veterinary autogenous vaccine production in Italy.

  5. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model is described which was developed at JPL. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  6. Stereo Image Ranging For An Autonomous Robot Vision System

    NASA Astrophysics Data System (ADS)

    Holten, James R.; Rogers, Steven K.; Kabrisky, Matthew; Cross, Steven

    1985-12-01

    The principles of stereo vision for three-dimensional data acquisition are well-known and can be applied to the problem of an autonomous robot vehicle. Coincidental points in the two images are located and then the location of that point in a three-dimensional space can be calculated using the offset of the points and knowledge of the camera positions and geometry. This research investigates the application of artificial intelligence knowledge representation techniques as a means to apply heuristics to relieve the computational intensity of the low level image processing tasks. Specifically a new technique for image feature extraction is presented. This technique, the Queen Victoria Algorithm, uses formal language productions to process the image and characterize its features. These characterized features are then used for stereo image feature registration to obtain the required ranging information. The results can be used by an autonomous robot vision system for environmental modeling and path finding.
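
    The ranging step described here — computing a point's 3-D position from the offset of matched image points plus camera position and geometry — reduces, for the standard parallel-camera case, to triangulation from disparity. A minimal sketch (the focal length and baseline values are illustrative assumptions, not parameters from the paper):

```python
# Once corresponding points are registered in the left and right images,
# range follows from pinhole-model stereo triangulation: Z = f * B / d.

def stereo_depth(x_left_px, x_right_px, focal_px, baseline_m):
    """Depth of a point from its horizontal disparity between two
    parallel, horizontally offset pinhole cameras."""
    disparity = x_left_px - x_right_px   # pixels; shrinks with distance
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity

# A feature at column 420 in the left image and 400 in the right,
# with an 800-pixel focal length and a 0.3 m camera baseline:
depth = stereo_depth(420, 400, focal_px=800, baseline_m=0.3)
print(f"estimated range: {depth:.2f} m")   # 800 * 0.3 / 20 = 12.00 m
```

    The hard part, which the Queen Victoria Algorithm addresses, is producing the matched point pairs; the triangulation itself is this one line of arithmetic.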

  7. Analysis and numerical simulation research of the heating process in the oven

    NASA Astrophysics Data System (ADS)

    Chen, Yawei; Lei, Dingyou

    2016-10-01

    How to use an oven to bake delicious food is a problem of great concern to both designers and users of ovens. To this end, this paper analyzed the heat distribution in the oven based on its basic operating principles and simulated the temperature distribution on a rack cross-section. A differential-equation model of the temperature distribution in the pan during oven operation was constructed from heat radiation and heat conduction, and, drawing on the idea of using cellular automata to simulate the heat-transfer process, ANSYS software was used to carry out numerical simulations of rectangular, round-cornered rectangular, elliptical, and circular pans, yielding the instantaneous temperature distribution for each pan shape. The temperature distributions of the rectangular and circular pans show that the product overcooks easily at the corners and edges of a rectangular pan but not of a round pan.
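
    The corner effect reported above can be illustrated with a minimal explicit finite-difference sketch of 2-D heat diffusion in a rectangular pan — the grid size, diffusion number, and temperatures below are illustrative assumptions, not the article's ANSYS model:

```python
import numpy as np

# Rectangular pan: the rim is held at oven temperature, the interior
# batter starts cool, and heat diffuses inward (illustrative parameters).
nx, ny = 20, 30
T = np.full((nx, ny), 20.0)   # initial batter temperature, deg C
T_oven = 200.0
alpha = 0.2                   # dimensionless diffusion number (< 0.25 for stability)

for _ in range(200):
    T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = T_oven   # heated rim
    # 5-point Laplacian update on the interior cells
    T[1:-1, 1:-1] += alpha * (T[2:, 1:-1] + T[:-2, 1:-1] +
                              T[1:-1, 2:] + T[1:-1, :-2] - 4 * T[1:-1, 1:-1])

corner = T[1, 1]              # interior point next to a corner (heated from two sides)
edge   = T[1, ny // 2]        # interior point next to an edge midpoint
center = T[nx // 2, ny // 2]  # pan center
print(f"corner {corner:.0f} C > edge {edge:.0f} C > center {center:.0f} C")
```

    A corner cell receives heat from two boundary sides at once, so it always runs ahead of an edge cell, which in turn runs ahead of the center — the overcooking pattern the paper observes for rectangular pans.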

  8. An efficient, maintenance free and approved method for spectroscopic control and monitoring of blend uniformity: The moving F-test.

    PubMed

    Besseling, Rut; Damen, Michiel; Tran, Thanh; Nguyen, Thanh; van den Dries, Kaspar; Oostra, Wim; Gerich, Ad

    2015-10-10

    Dry powder mixing is a widespread Unit Operation in the Pharmaceutical industry. With the advent of in-line Near Infrared (NIR) Spectroscopy and Quality by Design principles, application of Process Analytical Technology to monitor Blend Uniformity (BU) is taking a more prominent role. Yet routine use of NIR for monitoring, let alone control, of blending processes is not common in the industry, despite the improved process understanding and (cost) efficiency that it may offer. Method maintenance, robustness and translation to regulatory requirements have been important barriers to implementing the method. This paper presents a qualitative NIR-BU method offering a convenient and compliant approach to apply BU control for routine operation and process understanding, without extensive calibration and method maintenance requirements. The method employs a moving F-test to detect the steady state of measured spectral variances and the endpoint of mixing. The fundamentals and performance characteristics of the method are first presented, followed by a description of the link to regulatory BU criteria, the method sensitivity and practical considerations. Applications in upscaling, tech transfer and commercial production are described, along with evaluation of the method performance by comparison with results from quantitative calibration models. A full application, in which end-point detection via the F-test controls the blending process of a low-dose product, was successfully filed in Europe and Australia, implemented in commercial production and routinely used for about five years and more than 100 batches.
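
    The endpoint logic — declaring the blend homogeneous when the measured spectral variance reaches steady state — can be sketched as a moving F-test comparing the variance of consecutive windows. The window length, critical value, and simulated variance trace below are assumptions for illustration, not the published method's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated spectral variability per blender revolution: decays during
# mixing, then fluctuates around a homogeneous plateau (illustrative data).
n_rev = 120
signal = 5.0 * np.exp(-np.arange(n_rev) / 15) + rng.normal(0.0, 0.05, n_rev)

w = 10          # moving-window length, in revolutions (assumed)
F_CRIT = 3.18   # approx. upper F critical value for df = (9, 9), alpha = 0.05

endpoint = None
for t in range(2 * w, n_rev + 1):
    v_prev = np.var(signal[t - 2 * w:t - w], ddof=1)   # earlier window
    v_curr = np.var(signal[t - w:t], ddof=1)           # latest window
    F = max(v_prev, v_curr) / min(v_prev, v_curr)      # two-sided variance ratio
    if F < F_CRIT:          # variances statistically indistinguishable:
        endpoint = t        # steady state reached, declare blending endpoint
        break

print("blend endpoint detected at revolution:", endpoint)
```

    Because the test is qualitative — it asks only whether the variance has stopped changing — no quantitative calibration model is needed, which is the maintenance advantage the paper emphasizes.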

  9. Neuro-cognitive mechanisms of conscious and unconscious visual perception: From a plethora of phenomena to general principles

    PubMed Central

    Kiefer, Markus; Ansorge, Ulrich; Haynes, John-Dylan; Hamker, Fred; Mattler, Uwe; Verleger, Rolf; Niedeggen, Michael

    2011-01-01

    Psychological and neuroscience approaches have promoted much progress in elucidating the cognitive and neural mechanisms that underlie phenomenal visual awareness during the last decades. In this article, we provide an overview of the latest research investigating important phenomena in conscious and unconscious vision. We identify general principles to characterize conscious and unconscious visual perception, which may serve as important building blocks for a unified model to explain the plethora of findings. We argue that in particular the integration of principles from both conscious and unconscious vision is advantageous and provides critical constraints for developing adequate theoretical models. Based on the principles identified in our review, we outline essential components of a unified model of conscious and unconscious visual perception. We propose that awareness refers to consolidated visual representations, which are accessible to the entire brain and therefore globally available. However, visual awareness not only depends on consolidation within the visual system, but is additionally the result of a post-sensory gating process, which is mediated by higher-level cognitive control mechanisms. We further propose that amplification of visual representations by attentional sensitization is not exclusive to the domain of conscious perception, but also applies to visual stimuli, which remain unconscious. Conscious and unconscious processing modes are highly interdependent with influences in both directions. We therefore argue that exactly this interdependence renders a unified model of conscious and unconscious visual perception valuable. Computational modeling jointly with focused experimental research could lead to a better understanding of the plethora of empirical phenomena in consciousness research. PMID:22253669

  10. Zero-Adjective Contrast in Much-less Ellipsis: The Advantage for Parallel Syntax.

    PubMed

    Carlson, Katy; Harris, Jesse A

    2018-01-01

    This paper explores the processing of sentences with a much less coordinator (I don't own a pink hat, much less a red one). This understudied ellipsis sentence, one of several focus-sensitive coordination structures, imposes syntactic and semantic conditions on the relationship between the correlate (a pink hat) and remnant (a red one). We present the case of zero-adjective contrast, in which an NP remnant introduces an adjective without an overt counterpart in the correlate (I don't own a hat, much less a red one). Although zero-adjective contrast could in principle ease comprehension by limiting the possible relationships between the remnant and correlate to entailment, we find that zero-adjective contrast is avoided in production and taxing in online processing. Results from several studies support a processing model in which syntactic parallelism is the primary guide for determining contrast in ellipsis structures, even when violating parallelism would assist in computing semantic relationships.

  11. Optimization, an Important Stage of Engineering Design

    ERIC Educational Resources Information Center

    Kelley, Todd R.

    2010-01-01

    A number of leaders in technology education have indicated that a major difference between the technological design process and the engineering design process is analysis and optimization. The analysis stage of the engineering design process is when mathematical models and scientific principles are employed to help the designer predict design…

  12. Determination of the transforming activities of adenovirus oncogenes.

    PubMed

    Nevels, Michael; Dobner, Thomas

    2007-01-01

    The last 50 yr of molecular biological investigations into human adenoviruses (Ads) have contributed enormously to our understanding of the basic principles of normal and malignant cell growth. Much of this knowledge stems from analyses of the Ad productive infection cycle in permissive host cells. Also, initial observations concerning the transforming potential of human Ads subsequently revealed decisive insights into the molecular mechanisms of the origins of cancer and established Ads as a model system for explaining virus-mediated transformation processes. Today it is well established that cell transformation by human Ads is a multistep process involving several gene products encoded in early transcription units 1A (E1A) and 1B (E1B). Moreover, a large body of evidence now indicates that alternative or additional mechanisms are engaged in Ad-mediated oncogenic transformation involving gene products encoded in early region 4 (E4) as well as epigenetic changes resulting from viral DNA integration. In particular, studies on the transforming potential of several E4 gene products have now revealed new pathways that point to novel general mechanisms of virus-mediated oncogenesis. In this chapter we describe in vitro and in vivo assays to determine the transforming and oncogenic activities of the E1A, E1B, and E4 oncoproteins in primary baby rat kidney cells and athymic nude mice.

  13. Cell transformation by human adenoviruses.

    PubMed

    Endter, C; Dobner, T

    2004-01-01

    The last 40 years of molecular biological investigations into human adenoviruses have contributed enormously to our understanding of the basic principles of normal and malignant cell growth. Much of this knowledge stems from analyses of their productive infection cycle in permissive host cells. Also, initial observations concerning the carcinogenic potential of human adenoviruses subsequently revealed decisive insights into the molecular mechanisms of the origins of cancer, and established adenoviruses as a model system for explaining virus-mediated transformation processes. Today it is well established that cell transformation by human adenoviruses is a multistep process involving several gene products encoded in early transcription units 1A (E1A) and 1B (E1B). Moreover, a large body of evidence now indicates that alternative or additional mechanisms are engaged in adenovirus-mediated oncogenic transformation involving gene products encoded in early region 4 (E4) as well as epigenetic changes resulting from viral DNA integration. In particular, detailed studies on the tumorigenic potential of subgroup D adenovirus type 9 (Ad9) E4 have now revealed a new pathway that points to a novel, general mechanism of virus-mediated oncogenesis. In this chapter, we summarize the current state of knowledge about the oncogenes and oncogene products of human adenoviruses, focusing particularly on recent findings concerning the transforming and oncogenic properties of viral proteins encoded in the E1B and E4 transcription units.

  14. Determination of the transforming activities of adenovirus oncogenes.

    PubMed

    Speiseder, Thomas; Nevels, Michael; Dobner, Thomas

    2014-01-01

    The last 50 years of molecular biological investigations into human adenoviruses (Ads) have contributed enormously to our understanding of the basic principles of normal and malignant cell growth. Much of this knowledge stems from analyses of the Ad productive infection cycle in permissive host cells. Also, initial observations concerning the transforming potential of human Ads subsequently revealed decisive insights into the molecular mechanisms of the origins of cancer and established Ads as a model system for explaining virus-mediated transformation processes. Today it is well established that cell transformation by human Ads is a multistep process involving several gene products encoded in early transcription units 1A (E1A) and 1B (E1B). Moreover, a large body of evidence now indicates that alternative or additional mechanisms are engaged in Ad-mediated oncogenic transformation involving gene products encoded in early region 4 (E4) as well as epigenetic changes resulting from viral DNA integration. In particular, studies on the transforming potential of several E4 gene products have now revealed new pathways that point to novel general mechanisms of virus-mediated oncogenesis. In this chapter we describe in vitro and in vivo assays to determine the transforming and oncogenic activities of the E1A, E1B, and E4 oncoproteins in primary baby rat kidney cells, human amniotic fluid cells and athymic nude mice.

  15. Processing of Visual Imagery by an Adaptive Model of the Visual System: Its Performance and its Significance. Final Report, June 1969-March 1970.

    ERIC Educational Resources Information Center

    Tallman, Oliver H.

    A digital simulation of a model for the processing of visual images is derived from known aspects of the human visual system. The fundamental principle of computation suggested by a biological model is a transformation that distributes information contained in an input stimulus everywhere in a transform domain. Each sensory input contributes under…

  16. Weak Galilean invariance as a selection principle for coarse-grained diffusive models.

    PubMed

    Cairoli, Andrea; Klages, Rainer; Baule, Adrian

    2018-05-29

    How does the mathematical description of a system change in different reference frames? Galilei first addressed this fundamental question by formulating the famous principle of Galilean invariance. It prescribes that the equations of motion of closed systems remain the same in different inertial frames related by Galilean transformations, thus imposing strong constraints on the dynamical rules. However, real world systems are often described by coarse-grained models integrating complex internal and external interactions indistinguishably as friction and stochastic forces. Since Galilean invariance is then violated, there is seemingly no alternative principle to assess a priori the physical consistency of a given stochastic model in different inertial frames. Here, starting from the Kac-Zwanzig Hamiltonian model generating Brownian motion, we show how Galilean invariance is broken during the coarse-graining procedure when deriving stochastic equations. Our analysis leads to a set of rules characterizing systems in different inertial frames that have to be satisfied by general stochastic models, which we call "weak Galilean invariance." Several well-known stochastic processes are invariant in these terms, except the continuous-time random walk for which we derive the correct invariant description. Our results are particularly relevant for the modeling of biological systems, as they provide a theoretical principle to select physically consistent stochastic models before a validation against experimental data.
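
    The basic frame-transformation behaviour discussed above can be checked with a minimal numerical sketch (an illustrative toy, not the authors' derivation): simulate driftless Brownian motion in a lab frame, apply the Galilean shift x' = x - uT to each trajectory endpoint, and verify that the mean displacement picks up the -uT offset while the diffusive variance is frame-independent.

```python
import math
import random

random.seed(42)

def brownian_endpoints(n_paths, n_steps, dt, sigma):
    """Euler-Maruyama endpoints of driftless Brownian motion dX = sigma*dW."""
    finals = []
    for _ in range(n_paths):
        x = 0.0
        for _ in range(n_steps):
            x += sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        finals.append(x)
    return finals

n, steps, dt = 2000, 100, 0.01
T = steps * dt                      # total simulated time
u = 1.5                             # relative velocity of the moving frame
lab = brownian_endpoints(n, steps, dt, sigma=1.0)
moving = [x - u * T for x in lab]   # Galilean transform of each endpoint

mean_lab = sum(lab) / n
mean_mov = sum(moving) / n
var_lab = sum((x - mean_lab) ** 2 for x in lab) / n
var_mov = sum((x - mean_mov) ** 2 for x in moving) / n

print(round(mean_mov - mean_lab, 6))     # drift shifts by -u*T
print(round(abs(var_mov - var_lab), 9))  # diffusive variance is unchanged
```

    For ordinary Brownian motion this consistency between frames holds; the paper's point is that it fails for processes such as the continuous-time random walk unless the description is corrected.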

  17. From Foam Rubber to Volcanoes: The Physical Chemistry of Foam Formation

    NASA Astrophysics Data System (ADS)

    Hansen, Lee D.; McCarlie, V. Wallace

    2004-11-01

    Principles of physical chemistry and physical properties are used to describe foam formation. Foams are common in nature and in consumer products. The process of foam formation can be used to understand a wide variety of phenomena from exploding volcanoes to popping popcorn and making shoe soles.

  18. The Competitive Advantage: Client Service.

    ERIC Educational Resources Information Center

    Leffel, Linda G.; DeBord, Karen B.

    The adult education literature contains a considerable amount of research on and discussion of client service in the marketing process, management and staff roles in service- and product-oriented businesses, and the importance of client service and service quality to survival in the marketplace. By applying the principles of client-oriented…

  19. 9 CFR 381.118 - Ingredients statement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ingredients of poultry products processed from other kinds of poultry. (c) The terms spice, natural flavor, natural flavoring, flavor or flavoring may be used in the following manner: (1) The term “spice” means any... portion of any volatile oil or other flavoring principle has been removed. Spices include the spices...

  20. Development of Measures of Success for Corporate Level Air Force Acquisition Initiatives

    DTIC Science & Technology

    2006-04-30

    initiative. Customer satisfaction is described as the extent to which a process or product meets a customer’s expectations (Kotler and Armstrong...ADA366787). Kotler, P. and G. Armstrong. Principles of Marketing (9th Edition). Upper Saddle River, NJ: Prentice Hall, 2001. Lambert, D. and T

  1. Development of Measures of Success for Corporate Level Air Force Acquisition Initiatives

    DTIC Science & Technology

    2004-03-01

    has failed. Customer satisfaction is described as the extent to which a process or product meets a customer’s expectations (Kotler and Armstrong...ADA366787). Kotler, P. and G. Armstrong. Principles of Marketing (9th Edition). Upper Saddle River, NJ: Prentice Hall, 2001. Lambert, D. and T

  2. A Treatise on the Application of Life Cycle Management Principles in Agricultural & Biological Engineering

    USDA-ARS?s Scientific Manuscript database

    Life Cycle Management (LCM) is a systematic approach, mindset and culture that considers economic, social, and environmental factors, among others, in the decision-making process throughout various business or organizational decisions that affect both inputs and outputs of a product or service...

  3. Product Development in Higher Education Marketing

    ERIC Educational Resources Information Center

    Durkin, Mark; Howcroft, Barry; Fairless, Craig

    2016-01-01

    Purpose: During the last 20 years or so the changing environment in which universities operate has meant that commensurately more emphasis has been placed on marketing principles. In light of this emphasis, it is perhaps a little surprising that relatively little attention has been directed towards the processes by which universities develop their…

  4. Applying principles from economics to improve the transfer of ecological production estimates in fisheries ecosystem services research

    EPA Science Inventory

    Ecosystem services (ES) offer a way to represent and quantify multiple uses and values, as well as connectivity between ecosystem processes and human well-being. Ecosystem-based fisheries management approaches may seek to quantify expected trade-offs in ecosystem services due to ...

  5. Writing Centre Tutoring Sessions: Addressing Students' Concerns

    ERIC Educational Resources Information Center

    Winder, Roger; Kathpalia, Sujata S.; Koo, Swit Ling

    2016-01-01

    The guiding principle behind university writing centres is to focus on the process of writing rather than the finished product, prioritising higher order concerns related to organisation and argumentation of texts rather than lower order concerns of grammar and punctuation. Using survey-based data, this paper examines students' concerns regarding…

  6. REDUCTION OF WATER CONSUMPTION AND POLLUTION IN THE CORN MASA PRODUCTION PROCESS

    EPA Science Inventory

    Maize (corn) is the principal food source in Mexico, accounting for approximately 70 percent of the total calorie intake and 50 percent of the total protein intake (Paredes and Saharopulos, 1983). Maize is primarily used to produce masa, a maize-based dough. In Mexico there ...

  7. Principles and practices of integrated pest management on cotton in the lower Rio Grande Valley of Texas

    USDA-ARS?s Scientific Manuscript database

    Sustainable agriculture is ecologically sound, economically viable, socially just, and humane. These four goals for sustainability can be applied to all aspects of any agricultural system, from production and marketing, to processing and consumption. Integrated Pest Management (IPM) may be conside...

  8. Shhhh! Don't Tell: Advertising Design Impacts Sales.

    ERIC Educational Resources Information Center

    Schaub, Laura; Kelsey, Roy

    2000-01-01

    Discusses the creation of an advertisement to catch the attention of the target audience: student readers. Notes the consideration of several important factors including: the product, the audience, the positioning, the principles, and the ingredients. Describes ways to get started and several points in approaching the design process. (SC)

  9. FUNDAMENTALS OF LIFE CYCLE ASSESSMENT AND OFF-THE-SHELF SOFTWARE DEMONSTRATION

    EPA Science Inventory

    As the name implies, Life Cycle Assessment (LCA) evaluates the entire life cycle of a product, process, activity, or service, not just simple economics at the time of delivery. This course on LCA covers the following issues:
    Basic principles of LCA for use in producing, des...

  10. Learning by Brewing: Beer Production Experiments in the Chemical Engineering Laboratory

    ERIC Educational Resources Information Center

    Cerretani, Colin; Kelkile, Esayas; Landry, Alexandra

    2017-01-01

    We discuss the successful creation and implementation of a biotechnology track within the chemical engineering unit operations course. The track focuses on engineering principles relevant to brewing. Following laboratory modules investigating heat transfer processes and yeast fermentation kinetics, student groups design and implement a project to…

  11. Operationalizing Space Weather Products - Process and Issues

    NASA Astrophysics Data System (ADS)

    Scro, K. D.; Quigley, S.

    2006-12-01

    Developing and transitioning operational products for any customer base is a complicated process. This is the case for operational space weather products and services for the USAF. This presentation will provide information on the current state of affairs regarding the process required to take an idea from the research field to the real-time application of 24-hour space weather operations support. General principles and specific issues are discussed and will include: customer requirements, organizations in-play, funding, product types, acquisition of engineering and validation data, security classification, version control, and various important changes that occur during the process. The author's viewpoint is as an individual developing space environmental system-impact products for the US Air Force: 1) as a member of its primary research organization (Air Force Research Laboratory), 2) working with its primary space environment technology transition organization (Technology Application Division of the Space and Missile Systems Center, SMC/WXT), and 3) delivering to the primary sponsor/customer of such system-impact products (Air Force Space Command). The experience and focus is obviously on specific military operationalization process and issues, but most of the paradigm may apply to other (commercial) enterprises as well.

  12. Principles of Classroom Management: A Professional Decision-Making Model, 7th Edition

    ERIC Educational Resources Information Center

    Levin, James; Nolan, James F.

    2014-01-01

    This text takes a decision-making model approach to classroom management. It provides teachers with a very practical system to influence students to choose to behave productively and to strive for academic success. This widely used text presents an array of decision-making options that guide teachers in developing positive, pro-social classroom…

  13. Data-based hybrid tension estimation and fault diagnosis of cold rolling continuous annealing processes.

    PubMed

    Liu, Qiang; Chai, Tianyou; Wang, Hong; Qin, Si-Zhao Joe

    2011-12-01

    The continuous annealing process line (CAPL) of cold rolling is an important unit to improve the mechanical properties of steel strips in steel making. In continuous annealing processes, strip tension is an important factor, which indicates whether the line operates steadily. Abnormal tension profile distribution along the production line can lead to strip break and roll slippage. Therefore, it is essential to estimate the whole tension profile in order to prevent the occurrence of faults. However, in real annealing processes, only a limited number of strip tension sensors are installed along the machine direction. Since the effects of strip temperature, gas flow, bearing friction, strip inertia, and roll eccentricity can lead to nonlinear tension dynamics, it is difficult to apply a first-principles model to estimate the tension profile distribution. In this paper, a novel data-based hybrid tension estimation and fault diagnosis method is proposed to estimate the unmeasured tension between two neighboring rolls. The main model is established by an observer-based method using a limited number of measured tensions, speeds, and currents of each roll, while the tension error compensation model is designed using neural-network principal component regression. The corresponding tension fault diagnosis method is designed using the estimated tensions. Finally, the proposed tension estimation and fault diagnosis method was applied to a real CAPL in a steel-making company, demonstrating the effectiveness of the proposed method.
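
    The regression component of such a data-based scheme can be sketched with plain principal component regression on synthetic data (the paper's full method also includes an observer and a neural-network error model, which are omitted here; all data below are simulated stand-ins for roll speeds, currents and measured tensions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for correlated roll measurements: 6 inputs driven by
# 2 latent factors, with a target "tension" depending on those factors.
n, p = 400, 6
latent = rng.normal(size=(n, 2))
X = latent @ rng.normal(size=(2, p)) + 0.05 * rng.normal(size=(n, p))
y = 3.0 * latent[:, 0] - 2.0 * latent[:, 1] + 0.1 * rng.normal(size=n)

def pcr_fit(X, y, k):
    """Principal component regression: project centred X onto the top-k
    principal directions, then ordinary least squares in that subspace."""
    xm, ym = X.mean(axis=0), y.mean()
    Xc = X - xm
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:k].T                  # (p, k) loading matrix
    Z = Xc @ V                    # component scores
    b, *_ = np.linalg.lstsq(Z, y - ym, rcond=None)
    return xm, ym, V, b

def pcr_predict(model, Xnew):
    xm, ym, V, b = model
    return ym + (Xnew - xm) @ V @ b

model = pcr_fit(X[:300], y[:300], k=2)
pred = pcr_predict(model, X[300:])
rmse = float(np.sqrt(np.mean((pred - y[300:]) ** 2)))
print(f"hold-out RMSE: {rmse:.3f}")
```

    Compressing the correlated sensor channels into a few components before regressing is what makes the error-compensation model robust to the collinearity typical of line measurements.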

  14. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies

    PubMed Central

    Lorenz, Ralph D.

    2010-01-01

    The ‘two-box model’ of planetary climate is discussed. This model has been used to demonstrate consistency of the equator–pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the day:night temperature contrast observed on the extrasolar planet HD 189733b. PMID:20368253
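
    The two-box MEP calculation can be reproduced in a few lines (the absorbed-flux values below are illustrative, not calibrated to any planet): each box balances absorbed sunlight against black-body emission plus a meridional heat flux F, and the MEP hypothesis selects the F that maximizes the entropy production F(1/T_cold - 1/T_warm).

```python
SIGMA = 5.67e-8                  # Stefan-Boltzmann constant, W m^-2 K^-4
S_warm, S_cold = 300.0, 150.0    # absorbed shortwave per box (illustrative)

def entropy_production(F):
    """Entropy production for heat flux F (W m^-2) from warm to cold box.

    Each box's temperature follows from its radiative balance: the warm
    box emits S_warm - F, the cold box emits S_cold + F.
    """
    T_warm = ((S_warm - F) / SIGMA) ** 0.25
    T_cold = ((S_cold + F) / SIGMA) ** 0.25
    return F * (1.0 / T_cold - 1.0 / T_warm)

# Grid search for the MEP flux; F must leave the warm box emitting (F < S_warm).
fluxes = [f * 0.1 for f in range(1, int(S_warm * 10) - 1)]
F_mep = max(fluxes, key=entropy_production)

T_warm = ((S_warm - F_mep) / SIGMA) ** 0.25
T_cold = ((S_cold + F_mep) / SIGMA) ** 0.25
print(f"MEP flux ~ {F_mep:.1f} W/m^2, gradient ~ {T_warm - T_cold:.1f} K")
```

    This single-peaked trade-off (no transport gives no production, complete equalization gives no thermodynamic force) is exactly the structure the review credits with yielding first-order estimates of planetary heat transport.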

  15. Modular Chemical Process Intensification: A Review.

    PubMed

    Kim, Yong-Ha; Park, Lydia K; Yiacoumi, Sotira; Tsouris, Costas

    2017-06-07

    Modular chemical process intensification can dramatically improve energy and process efficiencies of chemical processes through enhanced mass and heat transfer, application of external force fields, enhanced driving forces, and combinations of different unit operations, such as reaction and separation, in single-process equipment. These dramatic improvements lead to several benefits such as compactness or small footprint, energy and cost savings, enhanced safety, less waste production, and higher product quality. Because of these benefits, process intensification can play a major role in industrial and manufacturing sectors, including chemical, pulp and paper, energy, critical materials, and water treatment, among others. This article provides an overview of process intensification, including definitions, principles, tools, and possible applications, with the objective to contribute to the future development and potential applications of modular chemical process intensification in industrial and manufacturing sectors. Drivers and barriers contributing to the advancement of process intensification technologies are discussed.

  16. Modular Chemical Process Intensification: A Review

    DOE PAGES

    Kim, Yong-ha; Park, Lydia K.; Yiacoumi, Sotira; ...

    2016-06-24

    Modular chemical process intensification can dramatically improve energy and process efficiencies of chemical processes through enhanced mass and heat transfer, application of external force fields, enhanced driving forces, and combinations of different unit operations, such as reaction and separation, in single-process equipment. Dramatic improvements such as these lead to several benefits such as compactness or small footprint, energy and cost savings, enhanced safety, less waste production, and higher product quality. Because of these benefits, process intensification can play a major role in industrial and manufacturing sectors, including chemical, pulp and paper, energy, critical materials, and water treatment, among others. This article provides an overview of process intensification, including definitions, principles, tools, and possible applications, with the objective to contribute to the future development and potential applications of modular chemical process intensification in industrial and manufacturing sectors. Drivers and barriers contributing to the advancement of process intensification technologies are discussed.

  17. Speech Communication Behavior; Perspectives and Principles.

    ERIC Educational Resources Information Center

    Barker, Larry L., Ed.; Kibler, Robert J., Ed.

    Readings are included on seven topics: 1) theories and models of communication processes, 2) acquisition and performance of communication behaviors, 3) human information processing and diffusion, 4) persuasion and attitude change, 5) psychophysiological approaches to studying communication, 6) interpersonal communication within transracial…

  18. Life-Game, with Glass Beads and Molecules, on the Principles of the Origin of Life

    ERIC Educational Resources Information Center

    Eigen, Manfred; Haglund, Herman

    1976-01-01

    Discusses a theoretical model that uses a game as a base for studying processes of a stochastic nature, which involve chemical reactions, molecular systems, biological processes, cells, or people in a population. (MLH)

  19. Towards Cloud-Resolving European-Scale Climate Simulations using a fully GPU-enabled Prototype of the COSMO Regional Model

    NASA Astrophysics Data System (ADS)

    Leutwyler, David; Fuhrer, Oliver; Cumming, Benjamin; Lapillonne, Xavier; Gysi, Tobias; Lüthi, Daniel; Osuna, Carlos; Schär, Christoph

    2014-05-01

    The representation of moist convection is a major shortcoming of current global and regional climate models. State-of-the-art global models usually operate at grid spacings of 10-300 km, and therefore cannot fully resolve the relevant upscale and downscale energy cascades. Consequently, parametrization of the relevant sub-grid scale processes is required. Several studies have shown that this approach entails major uncertainties for precipitation processes, which raises concerns about the model's ability to represent precipitation statistics and associated feedback processes, as well as their sensitivities to large-scale conditions. Further refining the model resolution to the kilometer scale allows representing these processes much closer to first principles and thus should yield an improved representation of the water cycle including the drivers of extreme events. Although cloud-resolving simulations are very useful tools for climate simulations and numerical weather prediction, their high horizontal resolution and the consequently small time steps needed challenge current supercomputers to model large domains and long time scales. The recent innovations in the domain of hybrid supercomputers have led to mixed node designs with a conventional CPU and an accelerator such as a graphics processing unit (GPU). GPUs relax the necessity for cache coherency and complex memory hierarchies, but have a larger system memory-bandwidth. This is highly beneficial for low-compute-intensity codes such as atmospheric stencil-based models. However, to efficiently exploit these hybrid architectures, climate models need to be ported and/or redesigned. Within the framework of the Swiss High Performance High Productivity Computing initiative (HP2C), a project to port the COSMO model to hybrid architectures has recently come to an end. The product of these efforts is a version of COSMO with an improved performance on traditional x86-based clusters as well as hybrid architectures with GPUs. We present our redesign and porting approach as well as our experience and lessons learned. Furthermore, we discuss relevant performance benchmarks obtained on the new hybrid Cray XC30 system "Piz Daint" installed at the Swiss National Supercomputing Centre (CSCS), both in terms of time-to-solution as well as energy consumption. We will demonstrate a first set of short cloud-resolving climate simulations at the European scale using the GPU-enabled COSMO prototype and elaborate our future plans on how to exploit this new model capability.

  20. Study on Mine Emergency Mechanism based on TARP and ICS

    NASA Astrophysics Data System (ADS)

    Xi, Jian; Wu, Zongzhi

    2018-01-01

    Drawing on the experiences and practices of mine emergency response in China and abroad, especially the United States and Australia, the normative, risk-management and adaptability principles for constructing a mine emergency mechanism based on Trigger Action Response Plans (TARP) and the Incident Command System (ICS) are summarized. A classification method, framework, flow and subjects of TARP and ICS suited to the actual situation of domestic mine emergencies are proposed. A system dynamics model of TARP and ICS is established, with parameters such as evacuation ratio, response rate, per capita emergency capability and entry rate of rescuers. By simulating the operation of TARP and ICS, the impact of these parameters on the emergency process is analyzed, providing a reference and basis for building emergency capacity, formulating emergency plans and setting up action plans in the emergency process.
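
    A toy stock-and-flow version of such a system dynamics model can illustrate how the parameters named above (evacuation ratio, response rate, rescuer entry rate) shape the simulated emergency process. All parameter values below are hypothetical, chosen only for illustration; they are not from the study.

```python
def simulate(miners=200.0, rescuer_entry_rate=2.0, response_rate=0.15,
             evacuation_ratio=0.9, dt=0.5, t_end=60.0):
    """Euler-integrate a toy evacuation model: after the TARP trigger fires,
    miners underground evacuate at response_rate per time unit, up to the
    evacuation_ratio ceiling, while rescuers enter at a constant rate."""
    underground, evacuated, rescuers = miners, 0.0, 0.0
    t = 0.0
    while t < t_end:
        target = evacuation_ratio * miners
        # Outflow from the "underground" stock into the "evacuated" stock.
        flow = response_rate * underground if evacuated < target else 0.0
        evacuated += flow * dt
        underground -= flow * dt
        rescuers += rescuer_entry_rate * dt
        t += dt
    return evacuated, underground, rescuers

evacuated, underground, rescuers = simulate()
print(f"evacuated: {evacuated:.0f} of 200, still underground: {underground:.0f}, "
      f"rescuers deployed: {rescuers:.0f}")
```

    Varying the response rate or evacuation ratio and re-running the loop is the system-dynamics analogue of the parameter studies described in the abstract.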

  1. Analysis and Simulation of a Blue Energy Cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Ms. Ketki; Kim, Yong-Ha; Yiacoumi, Sotira

    The mixing process of fresh water and seawater releases a significant amount of energy and is a potential source of renewable energy. The so-called ‘blue energy’ or salinity-gradient energy can be harvested by a device consisting of carbon electrodes immersed in an electrolyte solution, based on the principle of capacitive double layer expansion (CDLE). In this study, we have investigated the feasibility of energy production based on the CDLE principle. Experiments and computer simulations were used to study the process. Mesoporous carbon materials, synthesized at the Oak Ridge National Laboratory, were used as electrode materials in the experiments. Neutron imaging of the blue energy cycle was conducted with cylindrical mesoporous carbon electrodes and 0.5 M lithium chloride as the electrolyte solution. For experiments conducted at 0.6 V and 0.9 V applied potential, a voltage increase of 0.061 V and 0.054 V was observed, respectively. From sequences of neutron images obtained for each step of the blue energy cycle, information on the direction and magnitude of lithium ion transport was obtained. A computer code was developed to simulate the process. Experimental data and computer simulations allowed us to predict energy production.

  2. Analysis and Simulation of a Blue Energy Cycle

    DOE PAGES

    Sharma, Ms. Ketki; Kim, Yong-Ha; Yiacoumi, Sotira; ...

    2016-01-30

    The mixing process of fresh water and seawater releases a significant amount of energy and is a potential source of renewable energy. The so-called ‘blue energy’ or salinity-gradient energy can be harvested by a device consisting of carbon electrodes immersed in an electrolyte solution, based on the principle of capacitive double layer expansion (CDLE). In this study, we have investigated the feasibility of energy production based on the CDLE principle. Experiments and computer simulations were used to study the process. Mesoporous carbon materials, synthesized at the Oak Ridge National Laboratory, were used as electrode materials in the experiments. Neutron imaging of the blue energy cycle was conducted with cylindrical mesoporous carbon electrodes and 0.5 M lithium chloride as the electrolyte solution. For experiments conducted at 0.6 V and 0.9 V applied potential, a voltage increase of 0.061 V and 0.054 V was observed, respectively. From sequences of neutron images obtained for each step of the blue energy cycle, information on the direction and magnitude of lithium ion transport was obtained. A computer code was developed to simulate the process. Experimental data and computer simulations allowed us to predict energy production.
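
    The energetics of an idealized CDLE cycle can be sketched from the reported numbers: charge is held constant while the cell is switched from concentrated to dilute electrolyte, the voltage rises by ΔV, and the extractable work per idealized rectangular cycle is approximately Q·ΔV. The capacitance value below is an assumption for illustration only, not a figure from the study.

```python
C = 10.0          # assumed electrode-pair capacitance in farads (illustrative)
V_charge = 0.6    # applied charging potential from the experiment, volts
dV = 0.061        # reported open-circuit voltage rise on switching to fresh water

Q = C * V_charge              # charge is held constant during the salinity switch
work_per_cycle = Q * dV       # rectangular-cycle approximation: W = Q * dV
print(f"stored charge: {Q:.1f} C, work per cycle: {work_per_cycle:.3f} J")
```

    A real cycle extracts less than this rectangular bound because of leakage and resistive losses, which is one motivation for the imaging and simulation work described above.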

  3. Framework model and principles for trusted information sharing in pervasive health.

    PubMed

    Ruotsalainen, Pekka; Blobel, Bernd; Nykänen, Pirkko; Seppälä, Antto; Sorvari, Hannu

    2011-01-01

    Trustfulness (i.e. health and wellness information is processed ethically, and privacy is guaranteed) is one of the cornerstones for future Personal Health Systems, ubiquitous healthcare and pervasive health. Trust in today's healthcare is organizational, static and predefined. Pervasive health takes place in an open and untrusted information space where a person's lifelong health and wellness information, together with contextual data, is dynamically collected and used by many stakeholders. This generates new threats that do not exist in today's eHealth systems. Our analysis shows that the way security and trust are implemented in today's healthcare cannot guarantee information autonomy and trustfulness in pervasive health. Based on a framework model of pervasive health and a risk analysis of the ubiquitous information space, we have formulated principles which enable trusted information sharing in pervasive health. These principles imply that the data subject should have the right to dynamically verify trust and to control the use of her health information, as well as the right to set situation-based, context-aware personal policies. Data collectors and processors have responsibilities including transparency of information processing, and openness of interests, policies and environmental features. Our principles create a base for successful management of privacy and information autonomy in pervasive health. They also imply that it is necessary to create new data models for personal health information and new architectures which support situation-dependent trust and privacy management.

  4. Terminating DNA Tile Assembly with Nanostructured Caps.

    PubMed

    Agrawal, Deepak K; Jiang, Ruoyu; Reinhart, Seth; Mohammed, Abdul M; Jorgenson, Tyler D; Schulman, Rebecca

    2017-10-24

    Precise control over the nucleation, growth, and termination of self-assembly processes is a fundamental tool for controlling product yield and assembly dynamics. Mechanisms for altering these processes programmatically could allow the use of simple components to self-assemble complex final products or to design processes allowing for dynamic assembly or reconfiguration. Here we use DNA tile self-assembly to develop general design principles for building complexes that can bind to a growing biomolecular assembly and terminate its growth by systematically characterizing how different DNA origami nanostructures interact with the growing ends of DNA tile nanotubes. We find that nanostructures that present binding interfaces for all of the binding sites on a growing facet can bind selectively to growing ends and stop growth when these interfaces are presented on either a rigid or floppy scaffold. In contrast, nucleation of nanotubes requires the presentation of binding sites in an arrangement that matches the shape of the structure's facet. As a result, it is possible to build nanostructures that can terminate the growth of existing nanotubes but cannot nucleate a new structure. The resulting design principles for constructing structures that direct nucleation and termination of the growth of one-dimensional nanostructures can also serve as a starting point for programmatically directing two- and three-dimensional crystallization processes using nanostructure design.

  5. Control of the dehydration process in production of intermediate-moisture meat products: a review.

    PubMed

    Chang, S F; Huang, T C; Pearson, A M

    1996-01-01

    Intermediate-moisture (IM) meat products are produced by lowering the water activity (aw) to between 0.90 and 0.60. Such products are stable at ambient temperature and humidity and are produced in nearly every country in the world, especially in developing areas where refrigeration is limited or unavailable. Traditionally, IM meats use low-cost sources of energy for drying, such as sun drying, addition of salt, or fermentation. Products produced by different processes are of interest since they do not require refrigeration during distribution and storage. Many different IM meat products can be produced by utilizing modern processing equipment and methods. Production can be achieved in a relatively short period of time, and their advantages during marketing and distribution can be exploited. Nevertheless, a better understanding of the principles involved in heat transfer and efficiency of production is still needed to increase the efficiency of processing. A basic understanding of the influence of water vapor pressure and sorption phenomena on water activity can materially improve the efficiency of drying of IM meats. Predrying treatments, such as fermentation and humidity control, can also be taken advantage of during the dehydration process. Such information can lead to process optimization and reduction of energy costs during production of IM meats. The development of sound science-based methods to assure the production of high-quality and nutritious IM meats is needed. Finally, such products also must be free of pathogenic microorganisms to assure their success in production and marketing.
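
    The sorption phenomena mentioned above are commonly described with the Guggenheim-Anderson-de Boer (GAB) isotherm, one standard model relating equilibrium moisture content to water activity. The sketch below uses illustrative parameter values, not measured meat data, to show how the drying target moves through the 0.90-0.60 aw range typical of IM products.

```python
def gab_moisture(aw, m0=0.07, C=10.0, K=0.95):
    """GAB sorption isotherm: equilibrium moisture content (dry basis)
    as a function of water activity aw. m0 is the monolayer moisture,
    C and K are energy constants; values here are illustrative only."""
    return m0 * C * K * aw / ((1 - K * aw) * (1 - K * aw + C * K * aw))

# Drying an IM product means moving down this curve toward lower aw.
for aw in (0.90, 0.75, 0.60):
    print(f"aw = {aw:.2f} -> moisture ~ {gab_moisture(aw):.3f} g/g dry solids")
```

    Fitting the three GAB constants to measured sorption data for a given product is what turns this generic curve into a process-design tool for dehydration.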

  6. Remote detection of carbon monoxide by FTIR for simulating field detection in industrial process

    NASA Astrophysics Data System (ADS)

    Gao, Qiankun; Liu, Wenqing; Zhang, Yujun; Gao, Mingguang; Xu, Liang; Li, Xiangxian; Jin, Ling

    2016-10-01

    In order to monitor carbon monoxide in industrial production, we developed a passive gas radiation measurement system based on Fourier transform infrared spectroscopy and carried out an infrared radiation measurement experiment for carbon monoxide detection in a simulated industrial production environment. The principle, conditions, apparatus and data processing method of the experiment are introduced in this paper. To address the problem of light-path jitter in the actual industrial field, we simulated the noise present in the industrial environment. We applied the strengths of the MATHEMATICA software in graph processing and symbolic computation to the data processing in order to improve the signal-to-noise ratio and suppress noise. Based on the HITRAN database, nonlinear least-squares fitting was used to calculate the concentration from the CO spectra before and after the data processing. Comparison of the calculated concentrations shows that the MATHEMATICA-based processing is reliable and necessary in the industrial production environment.
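
    The retrieval step can be illustrated with a self-contained nonlinear least-squares fit: synthesize a Beer-Lambert transmission spectrum for a single Lorentzian CO line, add noise, and recover the concentration by minimizing the residual. The line position, width and strength below are placeholders for illustration, not HITRAN values.

```python
import math
import random

random.seed(7)

def absorption_coeff(nu, nu0=2143.0, gamma=0.08, strength=0.5):
    """Lorentzian line profile standing in for a HITRAN-derived CO line."""
    return strength * (gamma ** 2) / ((nu - nu0) ** 2 + gamma ** 2)

nus = [2142.0 + 0.005 * i for i in range(400)]   # wavenumber grid, cm^-1
c_true = 1.8                                      # "true" concentration (a.u.)
# Beer-Lambert transmission plus simulated measurement noise:
meas = [math.exp(-c_true * absorption_coeff(nu)) + random.gauss(0, 0.002)
        for nu in nus]

def residual_norm(c):
    """Sum of squared residuals between modeled and measured transmission."""
    return sum((math.exp(-c * absorption_coeff(nu)) - m) ** 2
               for nu, m in zip(nus, meas))

# One-parameter nonlinear least squares via golden-section search.
lo, hi = 0.0, 5.0
invphi = (math.sqrt(5) - 1) / 2
while hi - lo > 1e-6:
    m1 = hi - invphi * (hi - lo)
    m2 = lo + invphi * (hi - lo)
    if residual_norm(m1) < residual_norm(m2):
        hi = m2
    else:
        lo = m1
c_fit = (lo + hi) / 2
print(f"retrieved concentration: {c_fit:.3f} (true {c_true})")
```

    A production retrieval fits many lines and instrument parameters simultaneously, but the principle is the same: minimize the misfit between a radiative-transfer model and the measured spectrum.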

  7. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  8. Principles of biorefineries.

    PubMed

    Kamm, B; Kamm, M

    2004-04-01

    Sustainable economic growth requires safe, sustainable resources for industrial production. Re-arranging a substance-based economy around biological raw materials will require completely new approaches in research and development, production, and the economy. Biorefineries combine the technologies needed to turn biological raw materials into industrial intermediates and final products. The principal goal in the development of biorefineries is defined by the following: (biomass) feedstock-mix + process-mix --> product-mix. Here, the combination of biotechnological and chemical conversion of substances will play a particularly important role. Currently, the "whole-crop biorefinery", "green biorefinery" and "lignocellulose-feedstock biorefinery" systems are favored in research and development.

  9. Control Systems Engineering in Continuous Pharmaceutical Manufacturing May 20-21, 2014 Continuous Manufacturing Symposium.

    PubMed

    Myerson, Allan S; Krumme, Markus; Nasr, Moheb; Thomas, Hayden; Braatz, Richard D

    2015-03-01

    This white paper provides a perspective on the challenges, research needs, and future directions for control systems engineering in continuous pharmaceutical processing. The main motivation for writing this paper is to facilitate the development and deployment of control systems technologies so as to ensure the quality of the drug product. Although the main focus is on small-molecule pharmaceutical products, most of the same statements apply to biological drug products. An introduction to continuous manufacturing and control systems is followed by a discussion of the current status and technical needs in process monitoring and control, systems integration, and risk analysis. Some key points are that: (1) the desired objective in continuous manufacturing should be the satisfaction of all critical quality attributes (CQAs), not for all variables to operate at steady-state values; (2) the design of start-up and shutdown procedures can significantly affect the economic operation of a continuous manufacturing process; (3) the traceability of material as it moves through the manufacturing facility is an important consideration that can at least in part be addressed using residence time distributions; and (4) the control systems technologies must assure quality in the presence of disturbances, dynamics, uncertainties, nonlinearities, and constraints. Direct measurement, first-principles and empirical model-based predictions, and design-space approaches are described for ensuring that CQA specifications are met. Ways are discussed for universities, regulatory bodies, and industry to facilitate working around or through barriers to the development of control systems engineering technologies for continuous drug manufacturing. Industry and regulatory bodies should work with federal agencies to create federal funding mechanisms to attract faculty to this area.
Universities should hire faculty interested in developing first-principles models and control systems technologies for drug manufacturing that are easily transportable to industry. Industry can facilitate the move to continuous manufacturing by working with universities on the conception of new continuous pharmaceutical manufacturing process unit operations that have the potential to make major improvements in product quality, controllability, or reduced capital and/or operating costs. Regulatory bodies should: (1) ensure that regulations and regulatory practices promote, and do not derail, the development and implementation of continuous manufacturing and control systems engineering approaches; (2) ensure that the individuals who approve specific regulatory filings are sufficiently trained to make good decisions regarding control systems approaches; (3) provide regulatory clarity and eliminate or reduce regulatory risks; (4) financially support the development of high-quality training materials for undergraduate students, graduate students, industrial employees, and regulatory staff; (5) enhance the training of their own technical staff by financially supporting joint research projects with universities in the development of continuous pharmaceutical manufacturing processes and the associated control systems engineering theory, numerical algorithms, and software; and (6) strongly encourage the federal agencies that support research to fund these research areas. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
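Point (3) in the abstract, on addressing material traceability with residence time distributions, can be sketched numerically. The tanks-in-series RTD below is a standard textbook model, used here only as an illustration of how an RTD bounds where a given raw-material lot can be in the line; the stage count and residence time are made-up values, not from the paper.

```python
import math

def tanks_in_series_rtd(t: float, n: int, tau: float) -> float:
    """E(t) for n ideal stirred tanks in series with total mean residence
    time tau: E(t) = t^(n-1) / ((n-1)! * (tau/n)^n) * exp(-n*t/tau)."""
    ti = tau / n  # mean residence time of each tank
    return t ** (n - 1) / (math.factorial(n - 1) * ti ** n) * math.exp(-t / ti)

def fraction_exited(T: float, n: int, tau: float, steps: int = 20000) -> float:
    """Fraction of a tracer pulse (e.g., a suspect lot) that has left the
    line by time T: midpoint-rule integral of E(t) from 0 to T."""
    dt = T / steps
    return sum(tanks_in_series_rtd((i + 0.5) * dt, n, tau) * dt
               for i in range(steps))

n, tau = 5, 30.0  # 5 stages, 30 min total mean residence time (illustrative)
print(f"fraction out after 30 min: {fraction_exited(30.0, n, tau):.3f}")
print(f"fraction out after 90 min: {fraction_exited(90.0, n, tau):.3f}")
```

Only about half of the pulse has exited after one mean residence time, while after three it is essentially all out; this spread is exactly what a traceability analysis must account for.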

  11. EcoPrinciples Connect: A Pilot Project Matching Ecological Principles with Available Data to Promote Ecosystem-Based Management

    NASA Astrophysics Data System (ADS)

    Martone, R. G.; Erickson, A.; Mach, M.; Hale, T.; McGregor, A.; Prahler, E. E.; Foley, M.; Caldwell, M.; Hartge, E. H.

    2016-02-01

    Ocean and coastal practitioners work within existing financial constraints, jurisdictions, and legislative authorities to manage coastal and marine resources while seeking to promote and maintain a healthy and productive coastal economy. Fulfilling this mandate necessitates the incorporation of best available science, including ecosystem-based management (EBM), into coastal and ocean management decisions. To do this, many agencies seek ways to apply lessons from ecological theory in their decision processes. However, making direct connections between science and management can be challenging, in part because there is no process for linking ecological principles (e.g., maintaining species diversity, habitat diversity, connectivity and populations of key species) with available data. Here we explore how incorporating emerging data and methods into resource management at a local scale can improve the overall health of our coastal and marine ecosystems. We introduce a new web-based interface, EcoPrinciples Connect, that links marine managers to scientific and geospatial information through the lens of these ecological principles, ultimately helping managers become more efficient and more consistent and advancing the integration of EBM. The EcoPrinciples Connect tool grew directly out of needs identified in response to a Center for Ocean Solutions reference guide, Incorporating Ecological Principles into California Ocean and Coastal Management: Examples from Practice. Here we illustrate how we have worked to translate the information in this guide into a co-developed, user-centric tool for agency staff. Specifically, we present a pilot project where we match publicly available data to the ecological principles for the California San Francisco Bay Conservation and Development Commission. We will share early lessons learned from pilot development and highlight opportunities for future transferability to an expanded group of practitioners.

  13. Toward the Application of the Maximum Entropy Production Principle to a Broader Range of Far From Equilibrium Dissipative Systems

    NASA Astrophysics Data System (ADS)

    Lineweaver, C. H.

    2005-12-01

    The principle of Maximum Entropy Production (MEP) is being usefully applied to a wide range of non-equilibrium processes, including flows in planetary atmospheres and the bioenergetics of photosynthesis. Our goal of applying the principle of maximum entropy production to an even wider range of Far From Equilibrium Dissipative Systems (FFEDS) depends on the reproducibility of the evolution of the system from macro-state A to macro-state B. In an attempt to apply the principle of MEP to astronomical and cosmological structures, we investigate the problematic relationship between gravity and entropy. In the context of open, non-equilibrium systems, we use a generalization of the Gibbs free energy to include the sources of free energy extracted by non-living FFEDS such as hurricanes and convection cells. Redox potential gradients and thermal and pressure gradients provide the free energy for a broad range of FFEDS, both living and non-living. However, these gradients have to lie within certain ranges: if the gradients are too weak, FFEDS do not appear; if the gradients are too strong, FFEDS disappear. Living and non-living FFEDS often have different source gradients (redox potential gradients versus thermal and pressure gradients), and when they share the same gradient, they exploit different ranges of it. In a preliminary attempt to distinguish living from non-living FFEDS, we investigate the parameter space of gradient type and gradient steepness.

  14. A study of tumour growth based on stoichiometric principles: a continuous model and its discrete analogue.

    PubMed

    Saleem, M; Agrawal, Tanuja; Anees, Afzal

    2014-01-01

    In this paper, we consider a continuous, mathematically tractable model and its discrete analogue for tumour growth. The model formulation is based on stoichiometric principles and considers tumour-immune cell interactions in a potassium (K+)-limited environment. Both our continuous and discrete models illustrate 'cancer immunoediting' as a dynamic process with all three phases, namely elimination, equilibrium and escape. The stoichiometric principles introduced into the model allow us to study its dynamics as the total potassium in the surroundings of the tumour region varies. It is found that an increase in the total potassium may help the patient fight the disease for a longer period of time. This result seems to be in line with the protective role of potassium against the risk of pancreatic cancer reported by Bravi et al. [Dietary intake of selected micronutrients and risk of pancreatic cancer: An Italian case-control study, Ann. Oncol. 22 (2011), pp. 202-206].
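The idea of a discrete analogue that reproduces the qualitative behaviour of a continuous model can be illustrated on a far simpler system than the paper's tumour-immune model. The sketch below discretizes logistic growth with a nonstandard finite-difference scheme whose fixed points and their stability match the continuous equilibria for any step size; it is a generic illustration of "dynamically consistent" discretization, not the authors' scheme.

```python
def logistic_nsfd(x0: float, r: float, K: float, h: float, steps: int) -> float:
    """Nonstandard finite-difference analogue of x' = r*x*(1 - x/K).
    The scheme x_{n+1} = (x_n + h*r*x_n) / (1 + h*r*x_n/K) shares the
    continuous model's equilibria (0 and K) and keeps solutions positive
    for any step size h > 0, unlike a naive Euler discretization, which
    can overshoot or oscillate for large h."""
    x = x0
    for _ in range(steps):
        x = (x + h * r * x) / (1 + h * r * x / K)
    return x

# Even with a large step, the discrete analogue settles on the carrying
# capacity K, mirroring the continuous dynamics.
print(logistic_nsfd(x0=0.1, r=1.0, K=10.0, h=2.0, steps=50))
```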

  16. Large-baseline InSAR for precise topographic mapping: a framework for TanDEM-X large-baseline data

    NASA Astrophysics Data System (ADS)

    Pinheiro, Muriel; Reigber, Andreas; Moreira, Alberto

    2017-09-01

    The global Digital Elevation Model (DEM) resulting from the TanDEM-X mission provides information about the world's topography with outstanding precision. In fact, performance analyses carried out with the already available data have shown that the global product is well within the requirements of 10 m absolute vertical accuracy and 2 m relative vertical accuracy for flat to moderate terrain. The mission's science phase took place from October 2014 to December 2015. During this phase, bistatic acquisitions with across-track separations between the two satellites of up to 3.6 km at the equator were commanded. Since the relative vertical accuracy of InSAR-derived elevation models is, in principle, inversely proportional to the system baseline, the TanDEM-X science phase opened the door to the generation of elevation models of improved quality with respect to the standard product. However, the interferometric processing of the large-baseline data is troublesome due to the increased volume decorrelation and the very high frequency of the phase variations. Hence, in order to fully profit from the increased baseline, sophisticated algorithms for the interferometric processing, and in particular for the phase unwrapping, have to be considered. This paper proposes a novel dual-baseline region-growing framework for the phase unwrapping of large-baseline interferograms. Results from two experiments with data from the TanDEM-X science phase are discussed, corroborating the expected increased level of detail of the large-baseline DEMs.
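The inverse relation between baseline and height sensitivity quoted above can be made concrete through the height of ambiguity, the height difference corresponding to one full interferometric phase cycle. The sketch below uses the standard expression for a bistatic single-pass across-track interferometer; the wavelength, slant range, and incidence angle are rough illustrative values, not TanDEM-X mission parameters.

```python
import math

def height_of_ambiguity(wavelength: float, slant_range: float,
                        incidence_deg: float, b_perp: float) -> float:
    """Height difference producing a 2*pi interferometric phase cycle for
    a bistatic (single-pass) across-track interferometer:
    h_amb = lambda * r * sin(theta) / B_perp."""
    return (wavelength * slant_range
            * math.sin(math.radians(incidence_deg)) / b_perp)

wavelength = 0.031   # X-band, ~3.1 cm
slant_range = 700e3  # illustrative slant range [m]
incidence = 40.0     # illustrative incidence angle [deg]

for b_perp in (150.0, 1500.0, 3600.0):  # nominal vs. large baselines [m]
    h = height_of_ambiguity(wavelength, slant_range, incidence, b_perp)
    print(f"B_perp = {b_perp:6.0f} m -> h_amb = {h:6.1f} m")
```

Going from a nominal baseline of a few hundred metres to 3.6 km shrinks the height of ambiguity by more than an order of magnitude, which is exactly why phase unwrapping becomes the critical processing step for the large-baseline data.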

  17. Critical Thinking Theory to Practice: Using the Expert's Thought Process as Guide for Learning and Assessment.

    PubMed

    Marshall, Teresa A; Marchini, Leonardo; Cowen, Howard; Hartshorn, Jennifer E; Holloway, Julie A; Straub-Morarend, Cheryl L; Gratton, David; Solow, Catherine M; Colangelo, Nicholas; Johnsen, David C

    2017-08-01

    Critical thinking skills are essential for the successful dentist, yet few explicit skillsets in critical thinking have been developed and published in peer-reviewed literature. The aims of this article are to 1) offer an assessable critical thinking teaching model with the expert's thought process as the outcome, learning guide, and assessment instrument and 2) offer three critical thinking skillsets following this model: for geriatric risk assessment, technology decision making, and situation analysis/reflections. For the objective component, the student demonstrates delivery of each step in the thought process. For the subjective component, the student is judged to have grasped the principles as applied to the patient or case. This article describes the framework and the results of pilot tests in which students in one year at this school used the model in the three areas, earning scores of 90% or above on the assessments. The model was thus judged to be successful for students to demonstrate critical thinking skillsets in the course settings. Students consistently delivered each step of the thought process and were nearly as consistent in grasping the principles behind each step. As more critical thinking skillsets are implemented, a reinforcing network develops.

  18. The principles of high voltage electric field and its application in food processing: A review.

    PubMed

    Dalvi-Isfahan, Mohsen; Hamdami, Nasser; Le-Bail, Alain; Xanthakis, Epameinondas

    2016-11-01

    Food processing is a major part of the modern global industry and will certainly remain an important sector in the future. Food processing involves several processes with different purposes, aimed at developing new products by combining and/or transforming raw materials, extending food shelf-life, and recovering, exploiting, and further using valuable compounds, among others. During the last century several new food processes have arisen and most of the traditional ones have evolved. The food factory of the future will require innovative approaches to food processing that combine increased sustainability, efficiency, and quality. Herein, the objective of this review is to explore the multiple applications of the high voltage electric field (HVEF) and its potential within the food industry. These applications include processes such as drying, refrigeration, freezing, thawing, extending food shelf-life, and the extraction of biocompounds. In addition, the principles, mechanisms of action, and influence of specific parameters are discussed comprehensively. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yong, E-mail: 83229994@qq.com; Ge, Hao, E-mail: haoge@pku.edu.cn; Xiong, Jie, E-mail: jiexiong@umac.mo

    The fluctuation theorem is one of the major achievements in the field of nonequilibrium statistical mechanics of the past two decades. Owing to technical difficulties, very few results exist on the steady-state fluctuation theorem for the sample entropy production rate, in the sense of a large deviation principle, for diffusion processes. Here we give a proof of the steady-state fluctuation theorem for a diffusion process in magnetic fields, with explicit expressions for the free energy function and the rate function. The proof is based on the Karhunen-Loève expansion of the complex-valued Ornstein-Uhlenbeck process.

  20. Evaluating mountain goat dairy systems for conversion to the organic model, using a multicriteria method.

    PubMed

    Mena, Y; Nahed, J; Ruiz, F A; Sánchez-Muñoz, J B; Ruiz-Rojas, J L; Castel, J M

    2012-04-01

    Organic farming conserves natural resources, promotes biodiversity, guarantees animal welfare and obtains healthy products from raw materials through natural processes. In order to evaluate possibilities of increasing organic animal production, this study proposes a farm-scale multicriteria method for assessing the conversion of dairy goat systems to the organic model. In addition, a case study in the Northern Sierra of Seville, southern Spain, is analysed. A consensus of expert opinions and a field survey are used to validate a list of potential indicators and issues for assessing the conversion, which consider not only the European Community regulations for organic livestock farming, but also agroecological principles. As a result, the method includes 56 variables integrated in nine indicators: Nutritional management, Sustainable pasture management, Soil fertility and contamination, Weed and pest control, Disease prevention, Breeds and reproduction, Animal welfare, Food safety and Marketing and management. The nine indicators are finally integrated in a global index named OLPI (Organic Livestock Proximity Index). Application of the method to a case study with 24 goat farms reveals an OLPI value of 46.5% for dairy goat farms located in mountain areas of southern Spain. The aspects that differ most from the agroecological model include soil management, animal nutrition and product marketing. Results of the case study indicate that the proposed method is easy to implement and is useful for quantifying the approximation of conventional farms to an organic model.
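The aggregation step described above, from variable scores through nine indicators to a single proximity index, can be sketched generically. The equal-weight averaging below is an assumption made for illustration; the paper's actual scaling and weighting of its 56 variables is not reproduced here, and the per-farm scores are invented.

```python
def indicator_score(variable_scores):
    """Average the 0-100 scores of the variables composing one indicator."""
    return sum(variable_scores) / len(variable_scores)

def olpi(indicators):
    """Organic Livestock Proximity Index, here an unweighted mean of the
    nine indicator scores, expressed as a percentage."""
    return sum(indicators.values()) / len(indicators)

farm = {  # illustrative scores: 0 = fully conventional, 100 = fully organic
    "Nutritional management":           indicator_score([40, 55, 50]),
    "Sustainable pasture management":   indicator_score([60, 45]),
    "Soil fertility and contamination": indicator_score([30, 50, 40]),
    "Weed and pest control":            indicator_score([55]),
    "Disease prevention":               indicator_score([45, 40]),
    "Breeds and reproduction":          indicator_score([70, 60]),
    "Animal welfare":                   indicator_score([65, 55, 60]),
    "Food safety":                      indicator_score([50, 40]),
    "Marketing and management":         indicator_score([20, 30]),
}
print(f"OLPI = {olpi(farm):.1f}%")
```

An index near 100% would indicate a farm already operating close to the organic model; the made-up farm above, like the case-study average of 46.5%, sits roughly halfway.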
