Applying linear programming model to aggregate production planning of coated peanut products
NASA Astrophysics Data System (ADS)
Rohmah, W. G.; Purwaningsih, I.; Santoso, EF S. M.
2018-03-01
The aim of this study was to set the overall production level for each grade of coated peanut product to meet market demand at minimum production cost. A linear programming model was applied. The proposed model was used to minimize the total production cost subject to the limited demand for coated peanuts. The demand values used in the model were forecasted beforehand with a time series method, and production capacity was used to plan aggregate production for the next 6-month period. The results indicated that production planning using the proposed model produced a pattern better fitted to customer demand than the company's policy did. The production capacity of product families A, B, and C was relatively stable for the first 3 months of the planning period, then began to fluctuate over the next 3 months. In contrast, the production capacity of product families D and E fluctuated over the whole 6-month planning period, with values in the ranges 10,864-32,580 kg and 255-5,069 kg, respectively. The total production cost for all products was 27.06% lower than the production cost calculated using the company's policy-based method.
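As a rough illustration of the kind of aggregate-planning LP the abstract describes, the sketch below minimizes production plus inventory-holding cost for a single product family over a 6-month horizon using scipy; all demand, capacity, and cost figures are invented placeholders, not the study's data.

```python
# Minimal aggregate production planning LP (single product family) with
# hypothetical demand, capacity, and cost figures.
import numpy as np
from scipy.optimize import linprog

T = 6                                    # planning horizon, months
demand = np.array([11000, 12500, 13000, 18000, 25000, 32000])  # kg (illustrative)
cap = 26000                              # production capacity per month, kg
c_prod, c_hold = 1.5, 0.2                # cost per kg produced / held one month

# Variables: x = [p_1..p_T, I_1..I_T] (monthly production, end-of-month inventory)
c = np.concatenate([np.full(T, c_prod), np.full(T, c_hold)])

# Inventory balance: I_t - I_{t-1} - p_t = -d_t  (with I_0 = 0)
A_eq = np.zeros((T, 2 * T))
for t in range(T):
    A_eq[t, t] = -1.0              # -p_t
    A_eq[t, T + t] = 1.0           # +I_t
    if t > 0:
        A_eq[t, T + t - 1] = -1.0  # -I_{t-1}
b_eq = -demand.astype(float)

bounds = [(0, cap)] * T + [(0, None)] * T    # capacity on production, inventory >= 0
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("plan (kg/month):", np.round(res.x[:T]))
print("total cost:", round(res.fun, 2))
```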
[Theoretical model study about the application risk of high risk medical equipment].
Shang, Changhao; Yang, Fenghui
2014-11-01
The objective was to establish a theoretical model for monitoring the application risk of high-risk medical equipment at the site of use. The site of use is regarded as a system containing several sub-systems, and every sub-system consists of several risk-estimating indicators. Each indicator is quantified, the quantified values are multiplied by their corresponding weights, and the products are summed, giving the risk-estimating value of each sub-system. Following the same calculation, the risk-estimating values of the sub-systems are multiplied by their corresponding weights and summed. The cumulative sum is the status indicator of the high-risk medical equipment at the site of use, and it reflects the application risk of the equipment there. The resulting theoretical model can monitor the application risk of high-risk medical equipment at the site of use dynamically and specifically.
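The two-level weighted aggregation described above can be illustrated with a few lines of Python; the sub-systems, indicator values, and weights below are hypothetical examples, not values from the study.

```python
# Illustrative two-level weighted aggregation of risk indicators, following the
# scheme described above; indicator values and weights are made up.
def weighted_sum(values, weights):
    """Multiply each quantified indicator by its weight and accumulate."""
    return sum(v * w for v, w in zip(values, weights))

# Sub-system -> (quantified indicator values, indicator weights)
subsystems = {
    "maintenance": ([0.8, 0.6, 0.9], [0.5, 0.3, 0.2]),
    "operation":   ([0.7, 0.5],      [0.6, 0.4]),
    "environment": ([0.9, 0.4, 0.6], [0.4, 0.4, 0.2]),
}
subsystem_weights = {"maintenance": 0.4, "operation": 0.35, "environment": 0.25}

# First level: risk-estimating value of each sub-system
sub_scores = {name: weighted_sum(v, w) for name, (v, w) in subsystems.items()}
# Second level: status indicator of the equipment at the site of use
status = weighted_sum([sub_scores[n] for n in subsystem_weights],
                      list(subsystem_weights.values()))
print(sub_scores, round(status, 3))
```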
Information quality-control model
NASA Technical Reports Server (NTRS)
Vincent, D. A.
1971-01-01
Model serves as graphic tool for estimating complete product objectives from limited input information, and is applied to cost estimations, product-quality evaluations, and effectiveness measurements for manpower resources allocation. Six product quality levels are defined.
Ma, Lijuan; Li, Chen; Yang, Zhenhua; Jia, Wendi; Zhang, Dongyuan; Chen, Shulin
2013-07-20
Reducing the production cost of cellulase, the key enzyme for hydrolyzing cellulose to fermentable sugars, remains a major challenge for biofuel production. Because of the complexity of cellulase production, kinetic modeling and mass balance calculation can be used as effective tools for process design and optimization. In this study, kinetic models for cell growth, substrate consumption and cellulase production in batch fermentation were developed and then applied in fed-batch fermentation to enhance cellulase production. The inhibition effect of the substrate was considered, and a modified Luedeking-Piret model was developed for cellulase production and substrate consumption according to the growth characteristics of Trichoderma reesei. The model predictions fit well with the experimental data. Simulation results showed that higher initial substrate concentrations led to lower cellulase production rates. Mass balance and kinetic simulation results were used to determine the feeding strategy. Cellulase production and the corresponding productivity increased by 82.13% after employing the proper feeding strategy in fed-batch fermentation. This approach, which combines kinetic modeling and mass balance calculation, can not only improve the cellulase fermentation process but also help in understanding it better. The model development can also provide insight into other similar fermentation processes. Copyright © 2013 Elsevier B.V. All rights reserved.
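A minimal sketch of a Luedeking-Piret-type batch model with substrate inhibition (here a Haldane-style growth term) is shown below; the parameter values are illustrative assumptions, and the rate expressions are not necessarily those fitted in the study.

```python
# Sketch of a Luedeking-Piret-type batch fermentation model with substrate
# inhibition; parameters are illustrative, not the paper's fitted values.
from scipy.integrate import solve_ivp

mu_max, Ks, Ki = 0.12, 2.0, 80.0     # 1/h, g/L, g/L (assumed)
Yxs, m_s = 0.4, 0.01                 # biomass yield, maintenance coefficient (assumed)
alpha, beta = 15.0, 0.5              # growth / non-growth associated coefficients

def rhs(t, y):
    X, S, P = y                                    # biomass, substrate, product
    S = max(S, 0.0)
    mu = mu_max * S / (Ks + S + S**2 / Ki)         # substrate-inhibited growth rate
    dX = mu * X
    dS = -(dX / Yxs) - m_s * X if S > 0 else 0.0   # consumption + maintenance
    dP = alpha * dX + (beta * X if S > 0 else 0.0) # Luedeking-Piret production
    return [dX, dS, dP]

sol = solve_ivp(rhs, (0, 120), [0.1, 20.0, 0.0])
print(sol.y[:, -1])   # final biomass, residual substrate, enzyme (arbitrary units)
```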
NASA Satellite Data for Seagrass Health Modeling and Monitoring
NASA Technical Reports Server (NTRS)
Spiering, Bruce A.; Underwood, Lauren; Ross, Kenton
2011-01-01
Time-series-derived information for coastal waters will be used to provide input data for the Fong and Harwell model. The current MODIS land mask limits where the model can be applied; this project will: a) apply MODIS data with resolution higher than the standard products (250 m vs. 1 km); b) seek to refine the land mask; and c) explore nearby areas to use as proxies for time series directly over the seagrass beds. Novel processing approaches will be leveraged from other NASA projects and customized as inputs for seagrass productivity modeling.
NASA Astrophysics Data System (ADS)
Gass, S. I.
1982-05-01
The theoretical and applied state of the art of oil and gas supply models was discussed. The following areas were addressed: the realities of oil and gas supply, prediction of oil and gas production, problems in oil and gas modeling, resource appraisal procedures, forecasting field size and production, investment and production strategies, estimating cost and production schedules for undiscovered fields, production regulations, resource data, sensitivity analysis of forecasts, econometric analysis of resource depletion, oil and gas finding rates, and various models of oil and gas supply.
NASA Astrophysics Data System (ADS)
Ismail, Edy; Samsudi; Widjanarko, Dwi; Joyce, Peter; Stearns, Roman
2018-03-01
This model integrates project-based learning by creating a product based on environmental needs. The Produktif Orientasi Lapangan 4 Tahap (POL4T) model combines technical skills and entrepreneurial elements in the learning process. This study implements the results of developing an environment-oriented technopreneurship learning model that combines technology and entrepreneurship components in a Machining Skills Program. The study applies a research and development design with an optimized experimental subject. Data were obtained from questionnaires, learning material validation, interpersonal and intrapersonal observation forms, skills assessments, products, teachers' and students' responses, and cognitive tasks. Expert validation and t-test calculations were applied to assess how effective the POL4T learning model is. The result of the study is a four-step learning model that enhances interpersonal and intrapersonal attitudes and develops practical products oriented to society and appropriate technology, so that the products can have high selling value. The model is effective based on the students' post-test results, which are better than their pre-test results. The products obtained from the POL4T model proved better than those from standard productive learning. The POL4T model is recommended for implementation with grade XI students, as it can develop entrepreneurial attitudes oriented to the environment and community needs as well as students' technical competencies.
Revised Reynolds Stress and Triple Product Models
NASA Technical Reports Server (NTRS)
Olsen, Michael E.; Lillard, Randolph P.
2017-01-01
Revised versions of Lag methodology Reynolds-stress and triple product models are applied to accepted test cases to assess the improvement, or lack thereof, in the prediction capability of the models. The Bachalo-Johnson bump flow is shown as an example for this abstract submission.
Combustion system CFD modeling at GE Aircraft Engines
NASA Technical Reports Server (NTRS)
Burrus, D.; Mongia, H.; Tolpadi, Anil K.; Correa, S.; Braaten, M.
1995-01-01
This viewgraph presentation discusses key features of current combustion system CFD modeling capabilities at GE Aircraft Engines provided by the CONCERT code; CONCERT development history; modeling applied for designing engine combustion systems; modeling applied to improve fundamental understanding; CONCERT3D results for current production combustors; CONCERT3D model of NASA/GE E3 combustor; HYBRID CONCERT CFD/Monte-Carlo modeling approach; and future modeling directions.
Scaled CMOS Technology Reliability Users Guide
NASA Technical Reports Server (NTRS)
White, Mark
2010-01-01
The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is presented, revealing a power relationship. General models describing the soft error rates across scaled product generations are presented. The analysis methodology may be applied to other scaled microelectronic products and their key parameters.
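For readers unfamiliar with the acceleration-model arithmetic referred to above, the sketch below computes a single Arrhenius thermal acceleration factor and converts stress-test failures into a use-condition FIT rate; the activation energy and test counts are invented, and the paper's multiple-failure-mechanism model is more elaborate than this.

```python
# Hedged sketch: thermal (Arrhenius) acceleration factor and a simple FIT
# estimate from a stress test; Ea and the test numbers are illustrative.
import math

k_B = 8.617e-5                      # Boltzmann constant, eV/K

def arrhenius_af(Ea, T_use_C, T_stress_C):
    """Acceleration factor between use and stress temperatures."""
    T_use, T_stress = T_use_C + 273.15, T_stress_C + 273.15
    return math.exp((Ea / k_B) * (1.0 / T_use - 1.0 / T_stress))

Ea = 0.6                            # assumed activation energy, eV
af = arrhenius_af(Ea, T_use_C=55, T_stress_C=125)

failures, devices, hours = 3, 800, 1000                 # stress-test outcome (made up)
fit_at_use = failures / (devices * hours * af) * 1e9    # failures per 1e9 device-hours
print(f"AF = {af:.1f}, use-condition failure rate = {fit_at_use:.1f} FIT")
```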
Reynolds-Stress and Triple-Product Models Applied to Flows with Rotation and Curvature
NASA Technical Reports Server (NTRS)
Olsen, Michael E.
2016-01-01
Predictions for Reynolds-stress and triple product turbulence models are compared for flows with significant rotational effects. Driver spinning cylinder flowfield and Zaets rotating pipe case are to be investigated at a minimum.
Prototype design based on NX subdivision modeling application
NASA Astrophysics Data System (ADS)
Zhan, Xianghui; Li, Xiaoda
2018-04-01
Prototype design is an important part of product design: it provides a quick and easy way to draw a three-dimensional product prototype. Combined with actual production, the prototype can be modified several times, resulting in a highly efficient and reasonable design before the formal design. Subdivision modeling is a common method for modeling product prototypes; through it, a three-dimensional model of a product prototype can be obtained in a short time with simple operations. This paper discusses the operation method of Subdivision modeling for geometry. Taking a vacuum cleaner as an example, the NX Subdivision modeling functions are applied. Finally, the development of Subdivision modeling is forecasted.
When do combinatorial mechanisms apply in the production of inflected words?
Cholin, Joana; Rapp, Brenda; Miozzo, Michele
2010-01-01
A central question for theories of inflected word processing is to determine under what circumstances compositional procedures apply. Some accounts (e.g., the Dual Mechanism Model; Clahsen, 1999) propose that compositional processes only apply to verbs that take productive affixes. For all other verbs, inflected forms are assumed to be stored in the lexicon in a non-decomposed manner. This account makes clear predictions about the consequences of disruption to the lexical access mechanisms involved in the spoken production of inflected forms. Briefly, it predicts that non-productive forms (which require lexical access) should be more affected than productive forms (which, depending on the language task, may not). We tested these predictions through the detailed analysis of the spoken production of a German-speaking individual with an acquired lexical impairment resulting from a stroke. Analyses of response accuracy, error types, and frequency effects revealed that combinatorial processes are not restricted to verbs that take productive inflections. On this basis, we propose an alternative account, the Stem-based Assembly Model (SAM) that posits that combinatorial processes may be available to all stems, and not only those that combine with productive affixes. PMID:21104479
Jennifer C. Jenkins; Richard A. Birdsey
2000-01-01
As interest grows in the role of forest growth in the carbon cycle, and as simulation models are applied to predict future forest productivity at large spatial scales, the need for reliable and field-based data for evaluation of model estimates is clear. We created estimates of potential forest biomass and annual aboveground production for the Chesapeake Bay watershed...
A modeling study examining the impact of nutrient boundaries ...
A mass balance eutrophication model, Gulf of Mexico Dissolved Oxygen Model (GoMDOM), has been developed and applied to describe nitrogen, phosphorus and primary production in the Louisiana shelf of the Gulf of Mexico. Features of this model include bi-directional boundary exchanges, an empirical site-specific light attenuation equation, estimates of 56 river loads and atmospheric loads. The model was calibrated for 2006 by comparing model output to observations in zones that represent different locations in the Gulf. The model exhibited reasonable skill in simulating the phosphorus and nitrogen field data and primary production observations. The model was applied to generate a nitrogen mass balance estimate, to perform sensitivity analysis to compare the importance of the nutrient boundary concentrations versus the river loads on nutrient concentrations and primary production within the shelf, and to provide insight into the relative importance of different limitation factors on primary production. The mass budget showed the importance of the rivers as the major external nitrogen source while the atmospheric load contributed approximately 2% of the total external load. Sensitivity analysis showed the importance of accurate estimates of boundary nitrogen concentrations on the nitrogen levels on the shelf, especially at regions further away from the river influences. The boundary nitrogen concentrations impacted primary production less than nitrogen concent
Duret, Steven; Guillier, Laurent; Hoang, Hong-Minh; Flick, Denis; Laguerre, Onrawee
2014-06-16
Deterministic models describing heat transfer and microbial growth in the cold chain are widely studied. However, they are difficult to apply in practice because of several variable parameters in the logistic supply chain (e.g., ambient temperature varying with season and product residence time in refrigeration equipment), in the product's characteristics (e.g., pH and water activity) and in the microbial characteristics (e.g., initial microbial load and lag time). This variability can lead to different bacterial growth rates in food products and has to be considered to properly predict the consumer's exposure and identify the key parameters of the cold chain. This study proposes a new approach that combines deterministic (heat transfer) and stochastic (Monte Carlo) modeling to account for the variability in the logistic supply chain and in the product's characteristics. Contrary to existing approaches that directly use a time-temperature profile, the proposed model predicts the product temperature evolution from the thermostat setting and the ambient temperature, generating a realistic product time-temperature history. The developed methodology was applied to the cold chain of cooked ham, including the display cabinet, transport by the consumer and the domestic refrigerator, to predict the evolution of state variables such as the temperature and the growth of Listeria monocytogenes. The impacts of the input factors were calculated and ranked. It was found that the product's time-temperature history and the initial contamination level are the main contributors to consumers' exposure. A refined analysis was then applied, revealing the importance of consumer behavior on Listeria monocytogenes exposure. Copyright © 2014. Published by Elsevier B.V.
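A toy version of the combined deterministic/stochastic approach is sketched below: cold-chain stage parameters are sampled, the product temperature is propagated with a first-order heat-transfer step, and Listeria growth is accumulated with a square-root-type secondary model. All distributions and coefficients are illustrative assumptions, not the calibrated values of the study.

```python
# Toy Monte Carlo sketch of the deterministic/stochastic coupling described above.
import numpy as np

rng = np.random.default_rng(0)
n_sim, dt = 2000, 0.5                        # number of simulations, time step (h)

def growth_rate(T):                          # Ratkowsky-type model (assumed values)
    b, T_min = 0.03, -1.2                    # 1/(h^0.5 degC), degC
    return (b * max(T - T_min, 0.0)) ** 2    # specific growth rate, ln units per hour

final_logs = []
for _ in range(n_sim):
    stages = [                               # (duration h, ambient degC, thermostat degC)
        (rng.uniform(24, 72), 22.0, rng.normal(4.0, 1.0)),   # retail display cabinet
        (rng.uniform(0.5, 2), rng.uniform(15, 30), None),    # transport by consumer
        (rng.uniform(24, 120), 22.0, rng.normal(6.0, 1.5)),  # domestic refrigerator
    ]
    log_N = rng.normal(1.0, 0.5)             # initial contamination, log10 CFU/g
    T_prod = 4.0                             # product temperature at the start
    for duration, T_amb, T_set in stages:
        T_target = T_amb if T_set is None else T_set
        for _ in range(int(duration / dt)):
            T_prod += dt * 0.8 * (T_target - T_prod)          # heat transfer toward target
            log_N += dt * growth_rate(T_prod) / np.log(10)    # convert ln to log10
    final_logs.append(log_N)

print("median final count: %.2f log CFU/g" % np.median(final_logs))
```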
Creating system engineering products with executable models in a model-based engineering environment
NASA Astrophysics Data System (ADS)
Karban, Robert; Dekens, Frank G.; Herzig, Sebastian; Elaasar, Maged; Jankevičius, Nerijus
2016-08-01
Applying systems engineering across the life-cycle results in a number of products built from interdependent sources of information using different kinds of system level analysis. This paper focuses on leveraging the Executable System Engineering Method (ESEM) [1] [2], which automates requirements verification (e.g. power and mass budget margins and duration analysis of operational modes) using executable SysML [3] models. The particular value proposition is to integrate requirements, and executable behavior and performance models for certain types of system level analysis. The models are created with modeling patterns that involve structural, behavioral and parametric diagrams, and are managed by an open source Model Based Engineering Environment (named OpenMBEE [4]). This paper demonstrates how the ESEM is applied in conjunction with OpenMBEE to create key engineering products (e.g. operational concept document) for the Alignment and Phasing System (APS) within the Thirty Meter Telescope (TMT) project [5], which is under development by the TMT International Observatory (TIO) [5].
A Novel Ontology Approach to Support Design for Reliability considering Environmental Effects
Sun, Bo; Li, Yu; Ye, Tianyuan
2015-01-01
Environmental effects are not considered sufficiently in product design. Reliability problems caused by environmental effects are very prominent. This paper proposes a method to apply ontology approach in product design. During product reliability design and analysis, environmental effects knowledge reusing is achieved. First, the relationship of environmental effects and product reliability is analyzed. Then environmental effects ontology to describe environmental effects domain knowledge is designed. Related concepts of environmental effects are formally defined by using the ontology approach. This model can be applied to arrange environmental effects knowledge in different environments. Finally, rubber seals used in the subhumid acid rain environment are taken as an example to illustrate ontological model application on reliability design and analysis. PMID:25821857
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-17
... decision and order must be used for all future testing for any basic models covered by the decision and... require petitioners to: (1) Specify the basic model(s) to which the waiver applies; (2) identify other manufacturers of similar products; (3) include any known alternate test procedures of the basic model, with the...
Owamah, H I; Izinyon, O C
2015-10-01
Biogas kinetic models are often used to characterize substrate degradation and prediction of biogas production potential. Most of these existing models are however difficult to apply to substrates they were not developed for since their applications are usually substrate specific. Biodegradability kinetic (BIK) model and maximum biogas production potential and stability assessment (MBPPSA) model were therefore developed in this study for better understanding of the anaerobic co-digestion of food waste and maize husk for biogas production. Biodegradability constant (k) was estimated as 0.11 d(-1) using the BIK model. The results of maximum biogas production potential (A) obtained using the MBPPSA model were found to be in good correspondence, both in value and trend with the results obtained using the popular but complex modified Gompertz model for digesters B-1, B-2, B-3, B-4, and B-5. The (If) value of MBPPSA model also showed that digesters B-3, B-4, and B-5 were stable, while B-1 and B-2 were inhibited/unstable. Similar stability observation was also obtained using the modified Gompertz model. The MBPPSA model can therefore be used as an alternative model for anaerobic digestion feasibility studies and plant design. Copyright © 2015 Elsevier Ltd. All rights reserved.
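The modified Gompertz model mentioned above is commonly written as B(t) = A·exp{−exp[Rm·e/A·(λ − t) + 1]}; the sketch below fits it to synthetic cumulative biogas data with scipy. The data points and starting values are made up for illustration.

```python
# Sketch of fitting the modified Gompertz model to cumulative biogas data.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, Rm, lam):
    """Modified Gompertz: A = max biogas potential, Rm = max rate, lam = lag."""
    return A * np.exp(-np.exp(Rm * np.e / A * (lam - t) + 1.0))

t = np.arange(0, 31, 2, dtype=float)                     # days
y = gompertz(t, 450.0, 35.0, 3.0) + np.random.default_rng(1).normal(0, 8, t.size)

popt, pcov = curve_fit(gompertz, t, y, p0=[400.0, 30.0, 2.0])
A, Rm, lam = popt
print(f"A = {A:.0f} mL, Rm = {Rm:.1f} mL/d, lag = {lam:.1f} d")
```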
NASA Astrophysics Data System (ADS)
Lynch, P.; Reid, J. S.; Westphal, D. L.; Zhang, J.; Hogan, T. F.; Hyer, E. J.; Curtis, C. A.; Hegg, D. A.; Shi, Y.; Campbell, J. R.; Rubin, J. I.; Sessions, W. R.; Turk, F. J.; Walker, A. L.
2015-12-01
While standalone satellite and model aerosol products see wide utilization, there is a significant need in numerous climate and applied applications for a fused product on a regular grid. Aerosol data assimilation is an operational reality at numerous centers, and like meteorological reanalyses, aerosol reanalyses will see significant use in the near future. Here we present a standardized 2003-2013 global 1° × 1° and 6 hourly modal aerosol optical thickness (AOT) reanalysis product. This dataset can be applied to basic and applied earth system science studies of significant aerosol events, aerosol impacts on numerical weather prediction, and electro-optical propagation and sensor performance, among other uses. This paper describes the science of how to develop and score an aerosol reanalysis product. This reanalysis utilizes a modified Navy Aerosol Analysis and Prediction System (NAAPS) at its core and assimilates quality controlled retrievals of AOT from the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra and Aqua and the Multi-angle Imaging SpectroRadiometer (MISR) on Terra. The aerosol source functions, including dust and smoke, were regionally tuned to obtain the best match between the model fine and coarse mode AOTs and the Aerosol Robotic Network (AERONET) AOTs. Other model processes, including deposition, were tuned to minimize the AOT difference between the model and satellite AOT. Aerosol wet deposition in the tropics is driven with satellite retrieved precipitation, rather than the model field. The final reanalyzed fine and coarse mode AOT at 550 nm is shown to have good agreement with AERONET observations, with global mean root mean square error around 0.1 for both fine and coarse mode AOTs. This paper includes a discussion of issues particular to aerosol reanalyses that make them distinct from standard meteorological reanalyses, considerations for extending such a reanalysis outside of the NASA A-Train era, and examples of how the aerosol reanalysis can be applied or fused with other model or remote sensing products. Finally, the reanalysis is evaluated in comparison with other available studies of aerosol trends, and the implications of this comparison are discussed.
Reynolds-Stress and Triple-Product Models Applied to a Flow with Rotation and Curvature
NASA Technical Reports Server (NTRS)
Olsen, Michael E.
2016-01-01
Turbulence models of increasing complexity, up to triple-product terms, are applied to the flow in a rotating pipe. The rotating pipe is a challenging case for turbulence models as it contains significant rotational and curvature effects. The flow field starts with the classic fully developed pipe flow, with a stationary pipe wall. This well-defined condition is then subjected to a section of pipe with a rotating wall. The rotating wall introduces a second velocity scale and creates Reynolds shear stresses in the radial-circumferential and circumferential-axial planes. Furthermore, the wall rotation introduces a flow stabilization and actually reduces the turbulent kinetic energy as the flow moves along the rotating wall section. It is shown in the present work that the Reynolds stress models are capable of predicting a significant reduction in the turbulent kinetic energy, but the triple-product model improves the predictions of the centerline turbulent kinetic energy, which is governed by convection, dissipation and transport terms, as the production terms vanish on the pipe axis.
Product Recommendation System Based on Personal Preference Model Using CAM
NASA Astrophysics Data System (ADS)
Murakami, Tomoko; Yoshioka, Nobukazu; Orihara, Ryohei; Furukawa, Koichi
Product recommendation systems are realized by applying business rules acquired through data mining techniques. Business rules, such as demographic purchase patterns, can cover groups of users with a tendency to purchase certain products, but it is difficult to recommend products adapted to individual preferences using such rules alone. In addition, it is very costly to gather the large volume of high-quality survey data needed for good recommendations based on a personal preference model. A method for collecting kansei (affective) information automatically, without questionnaire surveys, is therefore required. Constructing a personal preference model from limited preference data is also necessary, since it is costly for users to input their preferences. In this paper, we propose a product recommendation system based on kansei information extracted by text mining and on a user preference model constructed by Category-guided Adaptive Modeling (CAM for short). CAM is a feature construction method that generates new features spanning a space in which examples with the same label are close together and examples with different labels are far apart. CAM makes it possible to construct a personal preference model even with little information about liked and disliked categories. In the system, a retrieval agent gathers product specifications and a user agent manages the preference model and the user's likes and dislikes. Kansei information about the products is obtained by applying text mining to reputation documents about the products on web sites. We carry out experimental studies to confirm that the preference model obtained by our method performs effectively.
Applying Model Based Systems Engineering to NASA's Space Communications Networks
NASA Technical Reports Server (NTRS)
Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert
2013-01-01
System engineering practices for complex systems and networks now require that requirement, architecture, and concept of operations product development teams, simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In case of NASA's space communication networks, since the networks are geographically distributed, and so are its subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, Program System Engineering (PSE) team has been able to model each network from their top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its results and impact. We will highlight the insights gained by applying the Model Based System Engineering and provide recommendations for its applications and improvements.
Bilingual Language Switching: Production vs. Recognition
Mosca, Michela; de Bot, Kees
2017-01-01
This study aims at assessing how bilinguals select words in the appropriate language in production and recognition while minimizing interference from the non-appropriate language. Two prominent models are considered which assume that when one language is in use, the other is suppressed. The Inhibitory Control (IC) model suggests that, in both production and recognition, the amount of inhibition on the non-target language is greater for the stronger compared to the weaker language. In contrast, the Bilingual Interactive Activation (BIA) model proposes that, in language recognition, the amount of inhibition on the weaker language is stronger than otherwise. To investigate whether bilingual language production and recognition can be accounted for by a single model of bilingual processing, we tested a group of native speakers of Dutch (L1), advanced speakers of English (L2) in a bilingual recognition and production task. Specifically, language switching costs were measured while participants performed a lexical decision (recognition) and a picture naming (production) task involving language switching. Results suggest that while in language recognition the amount of inhibition applied to the non-appropriate language increases along with its dominance as predicted by the IC model, in production the amount of inhibition applied to the non-relevant language is not related to language dominance, but rather it may be modulated by speakers' unconscious strategies to foster the weaker language. This difference indicates that bilingual language recognition and production might rely on different processing mechanisms and cannot be accounted within one of the existing models of bilingual language processing. PMID:28638361
Aguilar-Uscanga, M G; Garcia-Alvarado, Y; Gomez-Rodriguez, J; Phister, T; Delia, M L; Strehaiano, P
2011-08-01
To study the effect of glucose concentration on the growth of a Brettanomyces bruxellensis yeast strain in batch experiments and to develop a mathematical model for analyzing the kinetic behaviour of the yeast growing in batch culture. A Matlab algorithm was developed for the estimation of the model parameters. Glucose fermentation by B. bruxellensis was studied by varying the glucose concentration (5, 9.3, 13.8, 16.5, 17.6 and 21.4%). Increasing the substrate concentration up to a certain limit was accompanied by an increase in ethanol and biomass production; at substrate concentrations of 50-138 g l(-1), ethanol and biomass production ranged from 24 to 59 g l(-1) and from 6.3 to 11.4 g l(-1), respectively. However, an increase in glucose concentration to 165 g l(-1) led to a drastic decrease in product formation and substrate utilization. The model successfully simulated the batch kinetics observed in all cases. Confidence intervals were also estimated at each phase at a 0.95 probability level using a Student's t-distribution with f degrees of freedom. The maximum ethanol and biomass yields were obtained with an initial glucose concentration of 138 g l(-1). These experiments illustrate the importance of using a mathematical model applied to the kinetic behaviour of B. bruxellensis on glucose. © 2011 The Authors. Letters in Applied Microbiology © 2011 The Society for Applied Microbiology.
Cross Sections From Scalar Field Theory
NASA Technical Reports Server (NTRS)
Norbury, John W.; Dick, Frank; Norman, Ryan B.; Nasto, Rachel
2008-01-01
A one-pion-exchange scalar model is used to calculate differential and total cross sections for pion production through nucleon-nucleon collisions. The collisions involve intermediate delta particle production and decay to nucleons and a pion. The model provides the basic theoretical framework for scalar field theory and can be applied to particle production processes where the effects of spin can be neglected.
Simulating the Composite Propellant Manufacturing Process
NASA Technical Reports Server (NTRS)
Williamson, Suzanne; Love, Gregory
2000-01-01
There is a strategic interest in understanding how the propellant manufacturing process contributes to military capabilities outside the United States. The paper will discuss how system dynamics (SD) has been applied to rapidly assess the capabilities and vulnerabilities of a specific composite propellant production complex. These facilities produce a commonly used solid propellant with military applications. The authors will explain how an SD model can be configured to match a specific production facility followed by a series of scenarios designed to analyze operational vulnerabilities. By using the simulation model to rapidly analyze operational risks, the analyst gains a better understanding of production complexities. There are several benefits of developing SD models to simulate chemical production. SD is an effective tool for characterizing complex problems, especially the production process where the cascading effect of outages quickly taxes common understanding. By programming expert knowledge into an SD application, these tools are transformed into a knowledge management resource that facilitates rapid learning without requiring years of experience in production operations. It also permits the analyst to rapidly respond to crisis situations and other time-sensitive missions. Most importantly, the quantitative understanding gained from applying the SD model lends itself to strategic analysis and planning.
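A bare-bones stock-and-flow simulation in the spirit of the SD approach described above is sketched here; the three-stage production chain, capacities, and outage scenario are invented for illustration and do not represent the facility analyzed in the paper.

```python
# Minimal stock-and-flow (system dynamics) sketch of a three-stage production
# chain with a temporary ingredient outage; all stocks and rates are invented.
dt, horizon = 1.0, 120                       # time step and horizon, days
ingredient, mixed, cast = 500.0, 0.0, 0.0    # stock levels (arbitrary units)
ingredient_supply = 5.0                      # units delivered per day
mix_cap, cast_cap = 4.0, 3.5                 # daily process capacities

for day in range(horizon):
    supply = 0.0 if 30 <= day < 60 else ingredient_supply   # 30-day supplier outage
    mix_rate = min(mix_cap, ingredient / dt)                 # limited by ingredient stock
    cast_rate = min(cast_cap, mixed / dt)                    # limited by mixed stock
    ingredient += (supply - mix_rate) * dt                   # stock updates (Euler step)
    mixed += (mix_rate - cast_rate) * dt
    cast += cast_rate * dt

print(f"propellant cast after {horizon} days: {cast:.0f} units")
```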
Incorporating Resilience into Transportation Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connelly, Elizabeth; Melaina, Marc
To aid decision making for developing transportation infrastructure, the National Renewable Energy Laboratory has developed the Scenario Evaluation, Regionalization and Analysis (SERA) model. The SERA model is a geospatially and temporally oriented model that has been applied to determine optimal production and delivery scenarios for hydrogen, given resource availability and technology cost and performance, for use in fuel cell vehicles. In addition, the SERA model has been applied to plug-in electric vehicles.
ERIC Educational Resources Information Center
Brackenbury, Tim; Zickar, Michael J.; Munson, Benjamin; Storkel, Holly L.
2017-01-01
Purpose: Item response theory (IRT) is a psychometric approach to measurement that uses latent trait abilities (e.g., speech sound production skills) to model performance on individual items that vary by difficulty and discrimination. An IRT analysis was applied to preschoolers' productions of the words on the Goldman-Fristoe Test of…
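For readers unfamiliar with IRT, the sketch below evaluates a two-parameter logistic (2PL) item response function for a few hypothetical test words; the ability levels and item parameters are illustrative, not estimates from the Goldman-Fristoe data.

```python
# Minimal 2PL IRT sketch: probability of a correct production as a function of
# ability, item difficulty, and item discrimination (illustrative values only).
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: P(correct | ability theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

items = {                       # word: (discrimination a, difficulty b)
    "cup":         (1.8, -1.5), # easy, highly discriminating
    "scissors":    (1.2,  0.8), # harder
    "thermometer": (0.9,  2.0), # hardest, less discriminating
}
for ability in (-1.0, 0.0, 1.5):   # below-average to above-average child
    probs = {w: round(p_correct(ability, a, b), 2) for w, (a, b) in items.items()}
    print(ability, probs)
```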
OHD/HL/HSMB - Hydrologic Science & Modeling Branch
apply these sciences to application software and data products developed within the HL and as a hydrologic services program. HSMB applies its scientific expertise to training material developed
NASA Astrophysics Data System (ADS)
Kimball, H.; Selmants, P. C.; Running, S. W.; Moreno, A.; Giardina, C. P.
2016-12-01
In this study we evaluate the influence of spatial data product accuracy and resolution on the application of global models for smaller scale heterogeneous landscapes. In particular, we assess the influence of locally specific land cover and high-resolution climate data products on estimates of Gross Primary Production (GPP) for the Hawaiian Islands using the MOD17 model. The MOD17 GPP algorithm uses a measure of the fraction of absorbed photosynthetically active radiation from the National Aeronautics and Space Administration's Earth Observation System. This direct measurement is combined with global land cover (500-m resolution) and climate models ( 1/2-degree resolution) to estimate GPP. We first compared the alignment between the global land cover model used in MOD17 with a Hawaii specific land cover data product. We found that there was a 51.6% overall agreement between the two land cover products. We then compared four MOD17 GPP models: A global model that used the global land cover and low-resolution global climate data products, a model produced using the Hawaii specific land cover and low-resolution global climate data products, a model with global land cover and high-resolution climate data products, and finally, a model using both Hawaii specific land cover and high-resolution climate data products. We found that including either the Hawaii specific land cover or the high-resolution Hawaii climate data products with MOD17 reduced overall estimates of GPP by 8%. When both were used, GPP estimates were reduced by 16%. The reduction associated with land cover is explained by a reduction of the total area designated as evergreen broad leaf forest and an increase in the area designated as barren or sparsely vegetated in the Hawaii land cover product as compared to the global product. The climate based reduction is explained primarily by the spatial resolution and distribution of solar radiation in the Hawaiian Islands. This study highlights the importance of accuracy and resolution when applying global models to highly variable landscapes and provides an estimate of the influence of land cover and climate data products on estimates of GPP using MOD17.
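The MOD17 algorithm is a light-use-efficiency model of the general form GPP = εmax · f(Tmin) · f(VPD) · fPAR · PAR; a hedged sketch of that calculation is given below. The ramp limits and maximum efficiency here are placeholders, not the official MOD17 biome parameters.

```python
# Hedged sketch of a MOD17-style light-use-efficiency GPP calculation.
def ramp(x, lo, hi):
    """Linear scalar: 0 at/below lo, 1 at/above hi."""
    return min(max((x - lo) / (hi - lo), 0.0), 1.0)

def gpp(sw_rad, fpar, tmin_C, vpd_Pa,
        eps_max=0.0012,                  # kgC/MJ, assumed biome maximum LUE
        tmin_lo=-8.0, tmin_hi=9.5,       # degC ramp (placeholder limits)
        vpd_lo=650.0, vpd_hi=3100.0):    # Pa ramp (placeholder limits)
    par = 0.45 * sw_rad                  # MJ/m2/day of photosynthetically active radiation
    scalar = ramp(tmin_C, tmin_lo, tmin_hi) * (1.0 - ramp(vpd_Pa, vpd_lo, vpd_hi))
    return eps_max * scalar * fpar * par # kgC/m2/day

# Example: a wet forest pixel vs. a dry, sparsely vegetated one (invented inputs)
print(gpp(sw_rad=18.0, fpar=0.85, tmin_C=16.0, vpd_Pa=900.0))
print(gpp(sw_rad=22.0, fpar=0.15, tmin_C=18.0, vpd_Pa=2500.0))
```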
40 CFR 1045.1 - Does this part apply for my products?
Code of Federal Regulations, 2011 CFR
2011-07-01
... POLLUTION CONTROLS CONTROL OF EMISSIONS FROM SPARK-IGNITION PROPULSION MARINE ENGINES AND VESSELS Overview... exhaust emissions apply to new, spark-ignition propulsion marine engines beginning with the 2010 model...
40 CFR 1045.1 - Does this part apply for my products?
Code of Federal Regulations, 2012 CFR
2012-07-01
... POLLUTION CONTROLS CONTROL OF EMISSIONS FROM SPARK-IGNITION PROPULSION MARINE ENGINES AND VESSELS Overview... exhaust emissions apply to new, spark-ignition propulsion marine engines beginning with the 2010 model...
40 CFR 1045.1 - Does this part apply for my products?
Code of Federal Regulations, 2013 CFR
2013-07-01
... POLLUTION CONTROLS CONTROL OF EMISSIONS FROM SPARK-IGNITION PROPULSION MARINE ENGINES AND VESSELS Overview... exhaust emissions apply to new, spark-ignition propulsion marine engines beginning with the 2010 model...
40 CFR 1045.1 - Does this part apply for my products?
Code of Federal Regulations, 2014 CFR
2014-07-01
... POLLUTION CONTROLS CONTROL OF EMISSIONS FROM SPARK-IGNITION PROPULSION MARINE ENGINES AND VESSELS Overview... exhaust emissions apply to new, spark-ignition propulsion marine engines beginning with the 2010 model...
40 CFR 1045.1 - Does this part apply for my products?
Code of Federal Regulations, 2010 CFR
2010-07-01
... POLLUTION CONTROLS CONTROL OF EMISSIONS FROM SPARK-IGNITION PROPULSION MARINE ENGINES AND VESSELS Overview... exhaust emissions apply to new, spark-ignition propulsion marine engines beginning with the 2010 model...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... would not apply to that relationship. The adoption of the Customer CDS Clearing Model will also require... Customer Clearing Model for CDS Products and To Amend, Clarify and Consolidate Certain Rules and Procedures... provide for a clearing model for CDS products whereby customers of ICE Clear Europe have the ability to...
Marketing for a Web-Based Master's Degree Program in Light of Marketing Mix Model
ERIC Educational Resources Information Center
Pan, Cheng-Chang
2012-01-01
The marketing mix model was applied, with a focus on Web media, to re-strategize a Web-based Master's program at a southern state university in the U.S. The program's existing marketing strategy was examined using the four components of the model: product, price, place, and promotion, in hopes of repackaging the program (product) for prospective students…
A study of the human skin-whitening effects of resveratryl triacetate.
Ryu, Ja Hyun; Seok, Jin Kyung; An, Sang Mi; Baek, Ji Hwoon; Koh, Jae Sook; Boo, Yong Chool
2015-04-01
Resveratrol has a variety of bioactivities that include anti-melanogenic effects, but its use in cosmetics has been challenging, partly because of its chemical instability. Resveratryl triacetate (RTA) is a prodrug that can enhance stability. The purpose of this study was to examine the skin safety and whitening effects of RTA in human subjects. The primary skin irritation potentials of RTA and resveratrol were tested at 0.1 and 0.5% on human subjects. Resveratrol at a concentration of 0.5% induced weak skin irritation, whereas RTA did not induce any skin responses. The skin-whitening efficacy of a cosmetic formulation containing 0.4% RTA was evaluated in two different test models. In the artificial tanning model, the test product and the control product were applied twice daily to the forearm skin of 22 human subjects after pigmentation was induced by ultraviolet irradiation. Applying the test and control products in the artificial tanning model for 8 weeks increased the individual typology angle (ITA°) by 17.06 and 13.81%, respectively, a difference that was statistically significant (p < 0.05). In the hyperpigmentation model, the test product and the control product were applied twice daily to the faces of 21 human subjects. The averaged intensity of the hyperpigmented spots decreased by 2.67% in the test group and 1.46% in the control group, a difference that was statistically significant (p < 0.05). Therefore, RTA incorporated into cosmetic formulations can whiten human skin without inducing skin irritation.
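The ITA° metric used above is derived from CIELAB colour readings as ITA° = arctan[(L* − 50)/b*] · 180/π; the sketch below computes it for invented before/after measurements.

```python
# Sketch of the individual typology angle (ITA) from CIELAB skin colour values;
# the example L* and b* readings are invented, not data from the study.
import math

def ita_degrees(L_star, b_star):
    """ITA = arctan((L* - 50) / b*) in degrees; higher ITA = lighter skin."""
    return math.degrees(math.atan2(L_star - 50.0, b_star))

baseline = ita_degrees(L_star=58.0, b_star=16.5)   # after UV-induced tanning
week8    = ita_degrees(L_star=62.5, b_star=15.0)   # after 8 weeks of product use
print(f"ITA change: {baseline:.1f} -> {week8:.1f} degrees "
      f"({(week8 - baseline) / abs(baseline) * 100:.1f}% increase)")
```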
Effect of Manure vs. Fertilizer Inputs on Productivity of Forage Crop Models
Annicchiarico, Giovanni; Caternolo, Giovanni; Rossi, Emanuela; Martiniello, Pasquale
2011-01-01
Manure produced by livestock activity is a dangerous product capable of causing serious environmental pollution. Agronomic management practices on the use of manure may transform the target from a waste to a resource product. Experiments performed on comparison of manure with standard chemical fertilizers (CF) were studied under a double cropping per year regime (alfalfa, model I; Italian ryegrass-corn, model II; barley-seed sorghum, model III; and horse-bean-silage sorghum, model IV). The total amount of manure applied in the annual forage crops of the model II, III and IV was 158, 140 and 80 m3 ha−1, respectively. The manure applied to soil by broadcast and injection procedure provides an amount of nitrogen equal to that supplied by CF. The effect of manure applications on animal feeding production and biochemical soil characteristics was related to the models. The weather condition and manures and CF showed small interaction among treatments. The number of MFU ha−1 of biomass crop gross product produced in autumn and spring sowing models under manure applications was 11,769, 20,525, 11,342, 21,397 in models I through IV, respectively. The reduction of MFU ha−1 under CF ranges from 10.7% to 13.2% those of the manure models. The effect of manure on organic carbon and total nitrogen of topsoil, compared to model I, stressed the parameters as CF whose amount was higher in models II and III than model IV. In term of percentage the organic carbon and total nitrogen of model I and treatment with manure was reduced by about 18.5 and 21.9% in model II and model III and 8.8 and 6.3% in model IV, respectively. Manure management may substitute CF without reducing gross production and sustainability of cropping systems, thus allowing the opportunity to recycle the waste product for animal forage feeding. PMID:21776208
González-Sáiz, José-María; Esteban-Díez, Isabel; Rodríguez-Tecedor, Sofía; Pizarro, Consuelo
2008-11-01
The overall purpose of the project, of which this study is a part, was to examine the feasibility of onion waste as a support-substrate for the profitable production of food-grade products. This study focused on the efficient production of ethanol from worthless onions by transforming the onion juice into onion liquor via alcoholic fermentation with the yeast Saccharomyces cerevisiae. The onion bioethanol produced could be later used as a favorable substrate for acetic fermentation to finally obtain onion vinegar. Near-infrared spectroscopy (NIRS), coupled with the multivariate curve resolution-alternating least squares (MCR-ALS) method, has been used to reveal the compositional and spectral profiles for both substrates and products of alcoholic fermentation runs, that is, total sugars, ethanol, and biomass concentration. The ambiguity associated with the ALS calculation was resolved by applying suitable inequality and equality constraints. The quality of the results provided by the NIR-based MCR-ALS methodology adopted was evaluated by several performance indicators, including the variance explained by the model, the lack of fit and the agreement between the MCR-ALS achieved solution and the results computed by applying previously validated PLS reference models. An additional fermentation run was employed to test the actual predictive ability of the ALS model developed. For all the components resolved in the fermentation system studied (i.e., total sugars, ethanol, and biomass), the final model obtained showed a high predictive ability and suitable accuracy and precision, both in calibration and external validation, confirmed by the very good agreement between the ALS responses and the reference values (the coefficient of determination was, in all cases, very close to 1, and the statistics confirmed that no significant difference was found between PLS reference models and the MCR-ALS methodology applied). Thus, the proven reliability of the MCR-ALS model presented in this study, based only on NIR measurements, makes it suitable for monitoring of the key species involved in the alcoholic fermentation of onion juice, allowing the process to be modeled and controlled in real time.
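A bare-bones alternating-least-squares resolution in the spirit of MCR-ALS is sketched below on synthetic data, using only non-negativity constraints; the study's actual implementation also applied equality and other constraints and worked on real NIR spectra.

```python
# Bare-bones MCR-ALS sketch: alternate non-negative least squares between
# concentration profiles C and spectral profiles S so that D ~= C S^T.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_times, n_wavelengths, n_components = 40, 120, 3

# Synthetic true profiles (sugars depleting, ethanol and biomass increasing)
t = np.linspace(0, 1, n_times)
C_true = np.column_stack([1 - t, t**1.5, 0.6 * t])
S_true = np.abs(rng.normal(size=(n_wavelengths, n_components)))
D = C_true @ S_true.T + 0.01 * rng.normal(size=(n_times, n_wavelengths))

C = np.abs(rng.normal(size=(n_times, n_components)))     # initial guess
for _ in range(50):
    S = np.array([nnls(C, D[:, j])[0] for j in range(n_wavelengths)])  # solve for spectra
    C = np.array([nnls(S, D[i, :])[0] for i in range(n_times)])        # solve for concentrations

lof = 100 * np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"lack of fit: {lof:.2f}%")
```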
Empirical justification of the elementary model of money circulation
NASA Astrophysics Data System (ADS)
Schinckus, Christophe; Altukhov, Yurii A.; Pokrovskii, Vladimir N.
2018-03-01
This paper proposes an elementary model describing the money circulation for a system, composed by a production system, the government, a central bank, commercial banks and their customers. A set of equations for the system determines the main features of interaction between the production and the money circulation. It is shown, that the money system can evolve independently of the evolution of production. The model can be applied to any national economy but we will illustrate our claim in the context of the Russian monetary system.
Metabolic pathway analysis and kinetic studies for production of nattokinase in Bacillus subtilis.
Unrean, Pornkamol; Nguyen, Nhung H A
2013-01-01
We have constructed a reaction network model of Bacillus subtilis. The model was analyzed using a pathway analysis tool called elementary mode analysis (EMA). The analysis tool was used to study the network capabilities and the possible effects of altered culturing conditions on the production of a fibrinolytic enzyme, nattokinase (NK) by B. subtilis. Based on all existing metabolic pathways, the maximum theoretical yield for NK synthesis in B. subtilis under different substrates and oxygen availability was predicted and the optimal culturing condition for NK production was identified. To confirm model predictions, experiments were conducted by testing these culture conditions for their influence on NK activity. The optimal culturing conditions were then applied to batch fermentation, resulting in high NK activity. The EMA approach was also applied for engineering B. subtilis metabolism towards the most efficient pathway for NK synthesis by identifying target genes for deletion and overexpression that enable the cell to produce NK at the maximum theoretical yield. The consistency between experiments and model predictions proves the feasibility of EMA being used to rationally design culture conditions and genetic manipulations for the efficient production of desired products.
Unit mechanisms of fission gas release: Current understanding and future needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonks, Michael; Andersson, David; Devanathan, Ram
Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel properties and, once the gas is released into the gap between the fuel and cladding, lowering gap thermal conductivity and increasing gap pressure. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are being applied to provide unprecedented understanding of the unit mechanisms that define the fission product behavior. In this article, existing research on the basic mechanisms behind the various stages of fission gas release during normal reactor operation is summarized and critical areas where experimental and simulation work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior during reactor operation and to design fuels that have improved fission product retention. In addition, this work can serve as a model of how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.
Development of a pheromone elution rate physical model
M.E. Teske; H.W. Thistle; B.L. Strom; H. Zhu
2015-01-01
A first principle modeling approach has been applied to available data describing the elution of semiochemicals from pheromone dispensers. These data included field data for 27 products developed by several manufacturers, including homemade devices, as well as environmental chamber data collected on three semiochemical products. The goal of this effort was to...
Modeling pure culture heterotrophic production of polyhydroxybutyrate (PHB).
Mozumder, Md Salatul Islam; Goormachtigh, Laurens; Garcia-Gonzalez, Linsey; De Wever, Heleen; Volcke, Eveline I P
2014-03-01
In this contribution a mechanistic model describing the production of polyhydroxybutyrate (PHB) through pure-culture fermentation was developed, calibrated and validated for two different substrates, namely glucose and waste glycerol. In both cases, non-growth-associated PHB production was triggered by applying nitrogen limitation. The occurrence of some growth-associated PHB production besides non-growth-associated PHB production was demonstrated, although it is inhibited in the presence of nitrogen. Other phenomena observed experimentally and described by the model included biomass growth on PHB and non-linear product inhibition of PHB production. The accumulated impurities from the waste substrate negatively affected the obtained maximum PHB content. Overall, the developed mathematical model provided an accurate prediction of the dynamic behavior of heterotrophic biomass growth and PHB production in a two-phase pure culture system. Copyright © 2013 Elsevier Ltd. All rights reserved.
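The abstract does not reproduce the model equations. The sketch below is a minimal Luedeking-Piret-style ODE system with nitrogen-limitation triggering of non-growth-associated production and simple product inhibition; all parameter values are illustrative placeholders, not the calibrated values from the study.

```python
# Hedged sketch of a nitrogen-triggered PHB production model with product inhibition.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, Ks, Kn = 0.25, 0.5, 0.05        # 1/h, g/L, g/L
Yxs, Yxn, Yps  = 0.45, 8.0, 0.35        # yields (g/g)
k_g, k_ng, Kin, Pmax = 0.1, 0.08, 0.02, 40.0

def rhs(t, y):
    X, S, N, P = y
    monod_S = S / (Ks + S)
    mu = mu_max * monod_S * N / (Kn + N)                   # growth rate
    # growth-associated term plus nitrogen-repressed, product-inhibited term
    qp = k_g * mu + k_ng * monod_S * Kin / (Kin + N) * max(1 - P / Pmax, 0)
    dX = mu * X
    dP = qp * X
    dS = -dX / Yxs - dP / Yps
    dN = -dX / Yxn
    return [dX, dS, dN, dP]

sol = solve_ivp(rhs, (0, 72), [0.1, 40.0, 1.0, 0.0])        # X, S, N, P at t=0
print("final biomass, PHB (g/L):", sol.y[0, -1], sol.y[3, -1])
```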
Enhancement of the dark matter abundance before reheating: Applications to gravitino dark matter
NASA Astrophysics Data System (ADS)
Garcia, Marcos A. G.; Mambrini, Yann; Olive, Keith A.; Peloso, Marco
2017-11-01
In the first stages of inflationary reheating, the temperature of the radiation produced by inflaton decays is typically higher than the commonly defined reheating temperature T_RH ∼ (Γ_φ M_P)^(1/2), where Γ_φ is the inflaton decay rate. We consider the effect of particle production at temperatures at or near the maximum temperature attained during reheating. We show that the impact of this early production on the final particle abundance depends strongly on the temperature dependence of the production cross section. For ⟨σv⟩ ∼ T^n/M^(n+2) and n < 6, any particle produced at T_max is diluted by the later generation of entropy near T_RH. This applies to cases such as gravitino production in low-scale supersymmetric models (n = 0) or NETDM models of dark matter (n = 2). However, for n ≥ 6 the net abundance of particles produced during reheating is enhanced by over an order of magnitude, dominating over the dilution effect. This applies, for instance, to gravitino production in high-scale supersymmetry models, where n = 6.
NASA Astrophysics Data System (ADS)
Nazir, Mohd Yusuf Mohd; Al-Shorgani, Najeeb Kaid Nasser; Kalil, Mohd Sahaid; Hamid, Aidil Abdul
2015-09-01
In this study, three factors (fructose concentration, agitation speed and monosodium glutamate (MSG) concentration) were optimized to enhance DHA production by Schizochytrium SW1 using response surface methodology (RSM). A central composite design was applied as the experimental design, and analysis of variance (ANOVA) was used to analyze the data. The experiments were conducted using 500 mL flasks with 100 mL working volume at 30°C for 96 hours. The ANOVA revealed that the process was represented significantly by the quadratic model (p<0.0001) and that two of the factors, namely agitation speed and MSG concentration, significantly affected DHA production (p<0.005). The level of influence of each variable and a quadratic polynomial equation for DHA production were obtained by multiple regression analysis. The estimated optimum conditions for maximizing DHA production by SW1 were 70 g/L fructose, 250 rpm agitation speed and 12 g/L MSG. Consequently, the quadratic model was validated by applying the estimated optimum conditions, which confirmed the model's validity, and a DHA content of 52.86% was obtained.
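A minimal sketch of the RSM step follows: fitting a second-order response surface to coded design points and locating its stationary point. Only two factors are shown for brevity (the study used three), and the response data are invented, not the study's measurements.

```python
# Hedged sketch: quadratic response-surface fit on coded central-composite points.
import numpy as np

# coded levels (x1 = agitation, x2 = MSG) and illustrative DHA responses
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41], [0, 0], [0, 0]])
y = np.array([30, 38, 35, 46, 31, 44, 33, 42, 50, 49], dtype=float)

# design matrix for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# stationary point of the fitted quadratic: solve the gradient equations
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
x_opt = np.linalg.solve(H, -b[1:3])
print("coefficients:", np.round(b, 2), "stationary point (coded):", np.round(x_opt, 2))
```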
Wright, Andrew; Hudson, Darren
2014-10-01
Studies of how carbon reduction policies would affect agricultural production have found that there is a connection between carbon emissions and irrigation. Using county-level data, we develop an optimization model that accounts for the gross carbon emitted during the production process to evaluate how carbon-reducing policies applied to agriculture would affect producers' choices of what to plant and how much to irrigate on the Texas High Plains. Carbon emissions were calculated using carbon equivalent (CE) calculations developed by researchers at the University of Arkansas. Carbon reduction was achieved in the model through a constraint, a tax, or a subsidy. Reducing carbon emissions by 15% resulted in a significant reduction in the amount of water applied to a crop; however, planted acreage changed very little due to a lack of feasible alternative crops. The results show that applying carbon restrictions to agriculture may have important implications for production choices in areas that depend on groundwater resources for agricultural production. Copyright © 2014 Elsevier Ltd. All rights reserved.
Impacts of Learning Orientation on Product Innovation Performance
ERIC Educational Resources Information Center
Calisir, Fethi; Gumussoy, Cigdem Altin; Guzelsoy, Ezgi
2013-01-01
Purpose: The present study aims to examine the effect of learning orientation (commitment to learning, shared vision, open-mindedness) on the product innovation performance (product innovation efficacy and efficiency) of companies in Turkey. Design/methodology/approach: A structural equation-modeling approach was applied to identify the variables…
Modelling exclusive meson pair production at hadron colliders
NASA Astrophysics Data System (ADS)
Harland-Lang, L. A.; Khoze, V. A.; Ryskin, M. G.
2014-04-01
We present a study of the central exclusive production of light meson pairs, concentrating on the region of lower invariant masses of the central system and/or meson transverse momentum, where perturbative QCD cannot be reliably applied. We describe in detail a phenomenological model, using the tools of Regge theory, that may be applied with some success in this regime, and we present the new, publicly available, Dime Monte Carlo (MC) implementation of this model for meson pair production. The MC implementation includes a fully differential treatment of the survival factor, which in general depends on all kinematic variables, and allows the so far reasonably unconstrained model parameters to be set by the user. We present predictions for the Tevatron and LHC, discuss and estimate the size of the proton-dissociative background, and show how future measurements may further test this Regge-based approach, as well as the soft hadronic model required to calculate the survival factor, in particular in the presence of tagged protons.
Quantitative Modeling of Cerenkov Light Production Efficiency from Medical Radionuclides
Beattie, Bradley J.; Thorek, Daniel L. J.; Schmidtlein, Charles R.; Pentlow, Keith S.; Humm, John L.; Hielscher, Andreas H.
2012-01-01
There has been recent and growing interest in applying Cerenkov radiation (CR) for biological applications. Knowledge of the production efficiency and other characteristics of the CR produced by various radionuclides would help in assessing the feasibility of proposed applications and guide the choice of radionuclides. To generate this information we developed models of CR production efficiency based on the Frank-Tamm equation and models of CR distribution based on Monte-Carlo simulations of photon and β particle transport. All models were validated against direct measurements using multiple radionuclides and then applied to a number of radionuclides commonly used in biomedical applications. We show that two radionuclides, Ac-225 and In-111, which have been reported to produce CR in water, do not in fact produce CR directly. We also propose a simple means of using this information to calibrate high sensitivity luminescence imaging systems and show evidence suggesting that this calibration may be more accurate than methods in routine current use. PMID:22363636
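The photon-yield side of such a calculation can be sketched directly from the Frank-Tamm relation, here assuming a constant refractive index for water over the visible band and a monoenergetic electron; the paper's full treatment also includes Monte Carlo transport and the β spectrum.

```python
# Hedged sketch: Cherenkov photon yield per mm of path in water (400-700 nm)
# from the Frank-Tamm relation, assuming a constant refractive index.
import numpy as np

alpha = 1 / 137.036          # fine-structure constant
m_e = 0.511                  # electron rest energy, MeV
n = 1.33                     # refractive index of water (assumed constant)

def photons_per_mm(T_MeV, lam1=400e-9, lam2=700e-9):
    gamma = 1.0 + T_MeV / m_e
    beta = np.sqrt(1.0 - 1.0 / gamma**2)
    if beta * n <= 1.0:
        return 0.0                       # below the Cherenkov threshold
    dNdx = 2 * np.pi * alpha * (1 / lam1 - 1 / lam2) * (1 - 1 / (beta * n) ** 2)
    return dNdx * 1e-3                   # photons per metre -> photons per mm

for T in (0.2, 0.5, 1.0, 2.0):           # electron kinetic energies in MeV
    print(f"{T} MeV electron: {photons_per_mm(T):.1f} photons/mm (400-700 nm)")
```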
Simulating the fate of fall- and spring-applied poultry litter nitrogen in corn production
USDA-ARS?s Scientific Manuscript database
Monitoring the fate of N derived from manures applied to fertilize crops is difficult, time consuming, and relatively expensive. But computer simulation models can help understand the interactions among various N processes in the soil-plant system and determine the fate of applied N. The RZWQM2 was ...
Mitropoulos, Panagiotis Takis; Cupido, Gerardo
2009-01-01
In construction, the challenge for researchers and practitioners is to develop work systems (production processes and teams) that can achieve high productivity and high safety at the same time. However, construction accident causation models ignore the role of work practices and teamwork. This study investigates the mechanisms by which production and teamwork practices affect the likelihood of accidents. The paper synthesizes a new model for construction safety based on the cognitive perspective (Fuller's Task-Demand-Capability Interface model, 2005) and then presents an exploratory case study. The case study investigates and compares the work practices of two residential framing crews: a 'High Reliability Crew' (HRC), that is, a crew with exceptional productivity and safety over several years, and an average-performing crew from the same company. The model explains how the production and teamwork practices generate the work situations that workers face (the task demands) and affect the workers' ability to cope (capabilities). The case study indicates that the work practices of the HRC directly influence the task demands and match them with the applied capabilities. These practices were guided by the principle of avoiding errors and rework and included work planning and preparation, work distribution, managing the production pressures, and quality and behavior monitoring. The Task Demand-Capability model links construction research to a cognitive model of accident causation and provides a new way to conceptualize safety as an emergent property of the production practices and teamwork processes. The empirical evidence indicates that the crews' work practices and team processes strongly affect the task demands, the applied capabilities, and the match between demands and capabilities. The proposed model and the exploratory case study will guide further discovery of work practices and teamwork processes that can increase both productivity and safety in construction operations. Such understanding will enable training of construction foremen and crews in these practices to systematically develop high-reliability crews.
The practice of quality-associated costing: application to transfusion manufacturing processes.
Trenchard, P M; Dixon, R
1997-01-01
This article applies the new method of quality-associated costing (QAC) to the mixture of processes that create red cell and plasma products from whole blood donations. The article compares QAC with two commonly encountered but arbitrary models and illustrates the invalidity of clinical cost-benefit analysis based on these models. The first, an "isolated" cost model, seeks to allocate each whole process cost to only one product class. The other is a "shared" cost model, and it seeks to allocate an approximately equal share of all process costs to all associated products.
Forecasting coconut production in the Philippines with ARIMA model
NASA Astrophysics Data System (ADS)
Lim, Cristina Teresa
2015-02-01
The study aimed to depict the situation of the coconut industry in the Philippines in future years by applying the Autoregressive Integrated Moving Average (ARIMA) method. Data on coconut production, one of the major industrial crops of the country, for the period 1990 to 2012 were analyzed using time-series methods. Autocorrelation (ACF) and partial autocorrelation functions (PACF) were calculated for the data. An appropriate Box-Jenkins autoregressive moving average model was fitted. The validity of the model was tested using standard statistical techniques. The forecasting power of the autoregressive moving average (ARMA) model was used to forecast coconut production for the eight leading years.
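A minimal Box-Jenkins workflow of the kind described can be sketched with statsmodels; the annual series below is synthetic, since the study's coconut production data are not reproduced here, and the (p, d, q) order would in practice be chosen from the ACF/PACF.

```python
# Hedged sketch: ARIMA fit and eight-step forecast on a synthetic annual series.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
years = pd.date_range("1990", periods=23, freq="YS")          # 1990-2012
prod = pd.Series(14.0 + 0.05 * np.arange(23) + rng.normal(0, 0.3, 23), index=years)

model = ARIMA(prod, order=(1, 1, 1))        # order chosen from ACF/PACF in practice
fit = model.fit()
print(fit.params)                           # fitted AR, MA and variance parameters
print(fit.forecast(steps=8))                # eight leading years
```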
Baehr, Arthur L.; Baker, Ronald J.
1995-01-01
A mathematical model is presented that simulates the transport and reaction of any number of gaseous phase constituents (e.g. CO2, O2, N2, and hydrocarbons) in unsaturated porous media. The model was developed as part of a method to determine rates of hydrocarbon biodegradation associated with natural cleansing at petroleum product spill sites. The one-dimensional model can be applied to analyze data from column experiments or from field sites where gas transport in the unsaturated zone is approximately vertical. A coupled, non-Fickian constitutive relation between fluxes and concentration gradients, together with the capability of incorporating heterogeneity with respect to model parameters, results in model applicability over a wide range of experimental and field conditions. When applied in a calibration mode, the model allows for the determination of constituent production/consumption rates as a function of the spatial coordinate. Alternatively, the model can be applied in a predictive mode to obtain the distribution of constituent concentrations and fluxes on the basis of assumed values of model parameters and a biodegradation hypothesis. Data requirements for the model are illustrated by analyzing data from a column experiment designed to determine the aerobic degradation rate of toluene in sediments collected from a gasoline spill site in Galloway Township, New Jersey.
An Alternative Procedure for Estimating Unit Learning Curves,
1985-09-01
the model accurately describes the real-life situation, i.e., when the model is properly applied to the data, it can be a powerful tool for...predicting unit production costs. There are, however, some unique estimation problems inherent in the model. The usual method of generating predicted unit...production costs attempts to extend properties of least squares estimators to non-linear functions of these estimators. The result is biased estimates of
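The report's subject, the unit learning curve y(x) = a·x^b, can be illustrated with the conventional log-linear fit that the report argues is biased; the sketch below shows only that usual procedure, not the proposed alternative, and the cost data are invented.

```python
# Hedged sketch: conventional log-space least-squares fit of a unit learning curve.
import numpy as np

units = np.arange(1, 21)
rng = np.random.default_rng(1)
# synthetic unit costs with an 85% learning slope plus noise
cost = 100.0 * units ** (np.log(0.85) / np.log(2)) * rng.lognormal(0, 0.03, units.size)

B = np.column_stack([np.ones_like(units, dtype=float), np.log(units)])
coef, *_ = np.linalg.lstsq(B, np.log(cost), rcond=None)
a_hat, b_hat = np.exp(coef[0]), coef[1]

print(f"first-unit cost ~ {a_hat:.1f}, learning slope ~ {2 ** b_hat:.3f}")
print("predicted cost of unit 30:", round(a_hat * 30 ** b_hat, 1))
```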
Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris
2016-01-01
This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) Background: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) Methods: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) Results: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) Conclusion: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database. PMID:27399717
NASA Astrophysics Data System (ADS)
Al-Kuhali, K.; Hussain M., I.; Zain Z., M.; Mullenix, P.
2015-05-01
Aim: This paper contributes to the flat panel display industry in terms of aggregate production planning. Methodology: To minimize the total production cost of LCD manufacturing, linear programming was applied. The decision variables cover general production costs, additional cost incurred for overtime production, additional cost incurred for subcontracting, inventory carrying cost, backorder costs and adjustments for changes in labour levels. The model was developed for a manufacturer with several product types (up to N) over a total planning horizon of T periods. Results: An industrial case study based in Malaysia is presented to test and validate the developed linear programming model for aggregate production planning. Conclusion: The developed model is suited to stable environment conditions. Overall, the proven linear programming model can be recommended for adaptation to production planning in the Malaysian flat panel display industry.
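As an illustration of the kind of cost-minimizing aggregate plan described, the sketch below solves a single-product, three-period LP with regular, overtime and subcontracted production, inventory and backorders; all costs, capacities and demands are invented, not the Malaysian case data.

```python
# Hedged sketch: single-product, three-period aggregate production planning LP.
import numpy as np
from scipy.optimize import linprog

demand = [900, 1200, 1000]
cost = {"reg": 10, "ot": 14, "sub": 18, "inv": 2, "back": 8}
cap = {"reg": 800, "ot": 200, "sub": 400}
T, NV = len(demand), 5                       # variables per period: R, O, S, I, B

c = np.tile([cost["reg"], cost["ot"], cost["sub"], cost["inv"], cost["back"]], T)
A_eq = np.zeros((T, T * NV))
b_eq = np.array(demand, dtype=float)
for t in range(T):
    # R_t + O_t + S_t + I_{t-1} - B_{t-1} - I_t + B_t = D_t
    A_eq[t, t*NV:t*NV+3] = 1
    A_eq[t, t*NV+3], A_eq[t, t*NV+4] = -1, 1
    if t > 0:
        A_eq[t, (t-1)*NV+3], A_eq[t, (t-1)*NV+4] = 1, -1

bounds = [(0, cap["reg"]), (0, cap["ot"]), (0, cap["sub"]), (0, None), (0, None)] * T
bounds[-1] = (0, 0)                          # demand must be met by end of horizon

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("total cost:", res.fun)
print("plan (R, O, S, I, B per period):\n", res.x.reshape(T, NV).round(1))
```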
O'Donnell, Michael S.; Aldridge, Cameron L.; Doherty, Kevin E.; Fedy, Bradley C.
2015-01-01
We deliver all products described herein as online geographic information system data for visualization and downloading. We outline the data properties for each model and their data inputs, describe the process of selecting appropriate data products for multifarious applications, describe all data products and software, provide newly derived model composites, and discuss how land managers may use the models to inform future sage-grouse studies and potentially refine conservation efforts. The models, software tools, and associated opportunities for novel applications of these products should provide a suite of additional, but not exclusive, tools for assessing Wyoming Greater Sage-grouse habitats, which land managers, conservationists, and scientists can apply to myriad applications.
Design of experiment analysis of CO2 dielectric barrier discharge conditions on CO production
NASA Astrophysics Data System (ADS)
Becker, Markus; Ponduri, Srinath; Engeln, Richard; van de Sanden, Richard; Loffhagen, Detlef
2016-09-01
Dielectric barrier discharges (DBD) are frequently used for the generation of CO from CO2, which is of particular interest for syngas production. It has been found by means of fluid modelling that the CO2 conversion frequency in a CO2 DBD depends linearly on the specific energy input (SEI), while the energy efficiency of CO production is only weakly dependent on the SEI. Here, the same numerical model is applied to study systematically the influence of gas pressure, applied voltage amplitude and frequency on the CO2 conversion frequency and the energy efficiency of CO production, based on a 2-level, 3-factor full factorial experimental design. It is found that the operating conditions of the CO2 DBD for CO production can be chosen to either have an optimal throughput or a better energy efficiency. This work was partly supported by the German Research Foundation within the Collaborative Research Centre Transregio 24.
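A 2-level, 3-factor full factorial design and its main-effect estimates can be sketched as follows; the response values are invented placeholders for the conversion measure, not outputs of the paper's fluid model.

```python
# Hedged sketch: main effects from a 2-level, 3-factor full factorial design.
import itertools
import numpy as np

design = np.array(list(itertools.product([-1, 1], repeat=3)))   # 8 runs, coded levels
response = np.array([2.1, 3.0, 2.4, 3.6, 2.6, 3.9, 2.9, 4.4])   # illustrative responses

factors = ["pressure", "voltage amplitude", "frequency"]
for j, name in enumerate(factors):
    effect = response[design[:, j] == 1].mean() - response[design[:, j] == -1].mean()
    print(f"main effect of {name}: {effect:+.2f}")
```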
Applying Modeling Tools to Ground System Procedures
NASA Technical Reports Server (NTRS)
Di Pasquale, Peter
2012-01-01
As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.
National economic models of industrial water use and waste treatment. [technology transfer
NASA Technical Reports Server (NTRS)
Thompson, R. G.; Calloway, J. A.
1974-01-01
The effects of air emission and solid waste restrictions on production costs and resource use by industry are investigated. A linear program is developed to analyze how resource use, production cost, and waste discharges in different types of production may be affected by resource-limiting policies of the government. The method is applied to modeling ethylene and ammonia plants at the design stage. Results show that the effects of increasingly restrictive wastewater effluent standards on increased energy use were small in both plants. Plant models were developed for other industries, and the program estimated the effects of wastewater discharge policies on industry production costs.
Measuring test productivity - The elusive dream
NASA Astrophysics Data System (ADS)
Ward, D. T.; Cross, E. J., Jr.
1983-11-01
The paper summarizes definitions and terminology relating to measurement of Test and Evaluation productivity before settling on the appropriate criteria for such a measurement model. A productivity measurement scheme suited for use by Test and Evaluation organizations is suggested. This mathematical model is a simplified version of one proposed by the American Productivity Center and applied to an aircraft maintenance facility by Fletcher. It includes only four primary variables: safety, schedule, cost, and deficiencies reported, with varying degrees of objectivity and subjectivity involved in quantifying them. A hypothetical example of a fighter aircraft flight test program is used to illustrate the application of the productivity measurement model. The proposed model is intended to serve as a first-iteration procedure and should be tested against real test programs to verify and refine it.
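A minimal sketch of a weighted four-factor productivity index of the kind described follows; the weights, scoring scale and data are illustrative assumptions, not the American Productivity Center formulation or Fletcher's adaptation.

```python
# Hedged sketch: a weighted index over the four variables named in the abstract.
weights = {"safety": 0.3, "schedule": 0.3, "cost": 0.2, "deficiencies": 0.2}

def productivity_index(scores):
    """scores: each factor rated 0-100 for the reporting period (assumed scale)."""
    return sum(weights[k] * scores[k] for k in weights)

baseline = {"safety": 90, "schedule": 70, "cost": 75, "deficiencies": 60}
current  = {"safety": 95, "schedule": 80, "cost": 70, "deficiencies": 75}
print("baseline index:", productivity_index(baseline),
      "current index:", productivity_index(current))
```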
NASA Astrophysics Data System (ADS)
Smith, S. L.; Chen, B.; Vallina, S. M.
2017-12-01
Biodiversity-Ecosystem Function (BEF) relationships, which are most commonly quantified in terms of productivity or total biomass yield, are known to depend on the timescale of the experiment or field study, both for terrestrial plants and phytoplankton, which have each been widely studied as model ecosystems. Although many BEF relationships are positive (i.e., increasing biodiversity enhances function), in some cases there is an optimal intermediate diversity level (i.e., a uni-modal relationship), and in other cases productivity decreases with certain measures of biodiversity. These differences in BEF relationships cannot be reconciled merely by differences in the timescale of experiments. We will present results from simulation experiments applying recently developed trait-based models of phytoplankton communities and ecosystems, using the 'adaptive dynamics' framework to represent continuous distributions of size and other key functional traits. Controlled simulation experiments were conducted with different levels of phytoplankton size-diversity, which through trait-size correlations implicitly represents functional-diversity. One recent study applied a theoretical box model for idealized simulations at different frequencies of disturbance. This revealed how the shapes of BEF relationships depend systematically on the frequency of disturbance and associated nutrient supply. We will also present more recent results obtained using a trait-based plankton ecosystem model embedded in a three-dimensional ocean model applied to the North Pacific. This reveals essentially the same pattern in a spatially explicit model with more realistic environmental forcing. In the relatively more variable subarctic, productivity tends to increase with the size (and hence functional) diversity of phytoplankton, whereas productivity tends to decrease slightly with increasing size-diversity in the relatively calm subtropics. Continuous trait-based models can capture essential features of BEF relationships, while requiring far fewer calculations compared to typical plankton diversity models that explicitly simulate a great many idealized species.
A comparison of economic evaluation models as applied to geothermal energy technology
NASA Technical Reports Server (NTRS)
Ziman, G. M.; Rosenberg, L. S.
1983-01-01
Several cost estimation and financial cash flow models have been applied to a series of geothermal case studies. In order to draw conclusions about relative performance and applicability of these models to geothermal projects, the consistency of results was assessed. The model outputs of principal interest in this study were net present value, internal rate of return, or levelized breakeven price. The models used were VENVAL, a venture analysis model; the Geothermal Probabilistic Cost Model (GPC Model); the Alternative Power Systems Economic Analysis Model (APSEAM); the Geothermal Loan Guarantee Cash Flow Model (GCFM); and the GEOCOST and GEOCITY geothermal models. The case studies to which the models were applied include a geothermal reservoir at Heber, CA; a geothermal electric power plant to be located at the Heber site; an alcohol fuels production facility to be built at Raft River, ID; and a direct-use, district heating system in Susanville, CA.
Probabilistic framework for product design optimization and risk management
NASA Astrophysics Data System (ADS)
Keski-Rahkonen, J. K.
2018-05-01
Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
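The core of the recommended process, a Monte Carlo estimate of failure probability from a load-resistance model, can be sketched as follows; the distributions and parameters are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: Monte Carlo failure probability from a load-resistance model.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
load = rng.normal(300.0, 40.0, n)                      # applied stress, MPa
resistance = rng.lognormal(np.log(450.0), 0.08, n)     # component strength, MPa

p_fail = np.mean(load >= resistance)                   # failure when load exceeds strength
se = np.sqrt(p_fail * (1 - p_fail) / n)                # binomial standard error
print(f"estimated failure probability: {p_fail:.2e} +/- {se:.1e}")
```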
Martínez-Trujillo, A; Arreguín-Rangel, L; García-Rivero, M; Aguilar-Osorio, G
2011-08-01
Fruit residues were used for pectinase production by two Aspergillus strains to determine the effects of several factors during fermentation and to describe enzyme production kinetics. Pectinase production on several fruit residues was compared. The effects of three factors on the production of several pectinases were evaluated with a full factorial 2^k experimental design. Higher activities were obtained on lemon peel. In both strains, acidic pH values and high carbon source concentration favoured exopectinase and endopectinase production, while higher pH values and low carbon source concentration promoted pectin lyase and rhamnogalacturonase production. Unstructured mathematical modelling provided a good description of pectinase production in a submerged batch culture. Fruit residues were very good substrates for pectinase production, and the Aspergillus strains used showed promising performance in submerged fermentation. Mathematical modelling was useful to describe growth and pectinase production. Lemon peel can be used as a substrate to obtain high pectinase titres by Aspergillus flavipes FP-500 and Aspergillus terreus FP-370. The factors that contributed to improving the yield were identified, which supports the possibility of using this substrate in the industrial production of these enzymes. © 2011 The Authors. Letters in Applied Microbiology © 2011 The Society for Applied Microbiology.
1D diffusion models may be used to estimate rates of production and consumption of dissolved metabolites in marine sediments, but are applied less often to the solid phase. Here we used a numerical inverse method to estimate solid phase Fe(III) and Fe(II) consumption and product...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhee, C.O.
1987-01-01
This paper investigates, in the framework of the firm's optimal behavior, the effect of company-funded and federally-funded R&D on productivity in selected US industries. In particular, the role of federally-funded R&D in productivity through direct as well as indirect mechanisms is analyzed. Using different model specifications, two types of R&D (federal and company), and industry-level data, no support can be found for the blanket statement that federally-funded R&D (FRD) crowds out or pulls in company-funded R&D in productivity growth. Whether crowding-out or pulling-in occurs is shown to be industry-specific as well as dependent on FRD's time dimension. Hence, the lag effect of heterogeneous R&D funds on productivity is emphasized. The classification of heterogeneous R&D funds into basic research, applied research, and development is adopted to look at the impact of each on productivity. The model of the firm's optimal behavior following such classification demonstrates that federally-funded basic research has a tremendous pulling-in impact on company-funded applied research and development.
Evaluation of Turbulence-Model Performance as Applied to Jet-Noise Prediction
NASA Technical Reports Server (NTRS)
Woodruff, S. L.; Seiner, J. M.; Hussaini, M. Y.; Erlebacher, G.
1998-01-01
The accurate prediction of jet noise is possible only if the jet flow field can be predicted accurately. Predictions for the mean velocity and turbulence quantities in the jet flowfield are typically the product of a Reynolds-averaged Navier-Stokes solver coupled with a turbulence model. To evaluate the effectiveness of solvers and turbulence models in predicting those quantities most important to jet noise prediction, two CFD codes and several turbulence models were applied to a jet configuration over a range of jet temperatures for which experimental data is available.
Managing Variation in Services in a Software Product Line Context
2010-05-01
Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-021, ADA235785). Software Engineering Institute, Carnegie Mellon University, 1990...the systems in the product line, and a plan for building the systems. Product line scope and product line analysis define the boundaries and...systems, as well as expected ways in which they may vary. Product line analysis applies established modeling techniques to engineer the common and
NASA Astrophysics Data System (ADS)
Sumaryani, Sri
2018-03-01
The purpose of this study is to develop a model of a production management unit to enhance the entrepreneurial attitude of vocational school students in the fashion department. The study focuses on developing students' entrepreneurial attitude in management, covering planning, organizing, implementing and evaluating. The study uses a Research and Development (R&D) approach with three main steps: a preliminary study, a development step, and product validation. The research subjects were vocational school teachers from fashion departments in Semarang, Salatiga and Demak. The study yields a development model of a production management unit that could enhance the entrepreneurial attitude of vocational school students in the fashion department. The results show that the research subjects understand the production management unit in vocational schools (SMK).
Study the influence factors to the adsorption process for separation of polyphenols from green tea
NASA Astrophysics Data System (ADS)
Phung, Lan Huong; Tran, Trung Kien; Van Quyet, Chu; Phi, Nguyen Thien
2017-09-01
The objective of this work is to apply an adsorption process for the separation of polyphenols from an extract solution of green tea by-products. Older leaves and stems of the green tea plant were collected from Hiep Khanh Tea Company (Hoabinh province, Vietnam). In this study, two kinds of adsorbent (silica gel and activated carbon) were applied in a batch stirred vessel. The factors affecting the process yield were investigated: temperature, solid/liquid ratio, contact time and stirring speed. The process was empirically described with statistical models obtained by Design of Experiments. The results indicated that activated carbon offers good adsorption yield (more than 95%), much more effective than silica gel (only about 20%). From the model, the solid/liquid ratio was identified as the most influential factor.
Garcia-Aloy, Mar; Llorach, Rafael; Urpi-Sarda, Mireia; Jáuregui, Olga; Corella, Dolores; Ruiz-Canela, Miguel; Salas-Salvadó, Jordi; Fitó, Montserrat; Ros, Emilio; Estruch, Ramon; Andres-Lacueva, Cristina
2015-02-01
The aim of the current study was to apply an untargeted metabolomics strategy to characterize a model of cocoa intake biomarkers in a free-living population. An untargeted HPLC-q-ToF-MS based metabolomics approach was applied to human urine from 32 consumers of cocoa or derived products (CC) and 32 matched control subjects with no consumption of cocoa products (NC). The multivariate statistical analysis (OSC-PLS-DA) showed clear differences between CC and NC groups. The discriminant biomarkers identified were mainly related to the metabolic pathways of theobromine and polyphenols, as well as to cocoa processing. Consumption of cocoa products was also associated with reduced urinary excretions of methylglutarylcarnitine, which could be related to effects of cocoa exposure on insulin resistance. To improve the prediction of cocoa consumption, a combined urinary metabolite model was constructed. ROC curves were performed to evaluate the model and individual metabolites. The AUC values (95% CI) for the model were 95.7% (89.8-100%) and 92.6% (81.9-100%) in training and validation sets, respectively, whereas the AUCs for individual metabolites were <90%. The metabolic signature of cocoa consumption in free-living subjects reveals that combining different metabolites as biomarker models improves prediction of dietary exposure to cocoa. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Land cover maps, BVOC emissions, and SOA burden in a global aerosol-climate model
NASA Astrophysics Data System (ADS)
Stanelle, Tanja; Henrot, Alexandra; Bey, Isaelle
2015-04-01
It has been reported that different land cover representations influence the emission of biogenic volatile organic compounds (BVOC) (e.g. Guenther et al., 2006), but the land cover forcing used in model simulations is quite uncertain (e.g. Jung et al., 2006). As a consequence, the simulated emission of BVOCs depends on the applied land cover map. To test the sensitivity of global and regional estimates of BVOC emissions to the applied land cover map, we applied three different land cover maps in our global aerosol-climate model ECHAM6-HAM2.2. We found a high sensitivity for tropical regions. BVOCs are a very prominent precursor for the production of Secondary Organic Aerosols (SOA). Therefore the sensitivity of BVOC emissions to land cover maps impacts the SOA burden in the atmosphere. With our model system we are able to quantify that impact. References: Guenther et al. (2006), Estimates of global terrestrial isoprene emissions using MEGAN, Atmos. Chem. Phys., 6, 3181-3210, doi:10.5194/acp-6-3181-2006. Jung et al. (2006), Exploiting synergies of global land cover products for carbon cycle modeling, Rem. Sens. Environm., 101, 534-553, doi:10.1016/j.rse.2006.01.020.
Williams, Colin F.; Reed, Marshall J.; Mariner, Robert H.
2008-01-01
The U. S. Geological Survey (USGS) is conducting an updated assessment of geothermal resources in the United States. The primary method applied in assessments of identified geothermal systems by the USGS and other organizations is the volume method, in which the recoverable heat is estimated from the thermal energy available in a reservoir. An important focus in the assessment project is on the development of geothermal resource models consistent with the production histories and observed characteristics of exploited geothermal fields. The new assessment will incorporate some changes in the models for temperature and depth ranges for electric power production, preferred chemical geothermometers for estimates of reservoir temperatures, estimates of reservoir volumes, and geothermal energy recovery factors. Monte Carlo simulations are used to characterize uncertainties in the estimates of electric power generation. These new models for the recovery of heat from heterogeneous, fractured reservoirs provide a physically realistic basis for evaluating the production potential of natural geothermal reservoirs.
Fuzzy linear model for production optimization of mining systems with multiple entities
NASA Astrophysics Data System (ADS)
Vujic, Slobodan; Benovic, Tomo; Miljanovic, Igor; Hudej, Marjan; Milutinovic, Aleksandar; Pavlovic, Petar
2011-12-01
Planning and production optimization within mining systems with multiple mines or several work sites (entities) using fuzzy linear programming (LP) was studied. LP is one of the most commonly used operations research methods in mining engineering. After an introductory review of the properties and limitations of applying LP, short reviews of the general settings of deterministic and fuzzy LP models are presented. For the purpose of comparative analysis, the application of both LP models is presented using the example of the Bauxite Basin Niksic with five mines. The assessment shows that LP is an efficient mathematical modeling tool for production planning and for solving many other single-criterion optimization problems in mining engineering. After comparing the advantages and deficiencies of the deterministic and fuzzy LP models, the conclusion presents the benefits of the fuzzy LP model, while also noting that finding the optimal production plan requires an overall analysis encompassing both LP modeling approaches.
Optimized production planning model for a multi-plant cultivation system under uncertainty
NASA Astrophysics Data System (ADS)
Ke, Shunkui; Guo, Doudou; Niu, Qingliang; Huang, Danfeng
2015-02-01
An inexact multi-constraint programming model under uncertainty was developed by incorporating a production plan algorithm into the crop production optimization framework under the multi-plant collaborative cultivation system. In the production plan, orders from the customers are assigned to a suitable plant under the constraints of plant capabilities and uncertainty parameters to maximize profit and achieve customer satisfaction. The developed model and solution method were applied to a case study of a multi-plant collaborative cultivation system to verify its applicability. As determined in the case analysis involving different orders from customers, the period of plant production planning and the interval between orders can significantly affect system benefits. Through the analysis of uncertain parameters, reliable and practical decisions can be generated using the suggested model of a multi-plant collaborative cultivation system.
NASA Astrophysics Data System (ADS)
Thomson, A. M.; Izaurralde, R. C.; Calvin, K.; Zhang, X.; Wise, M.; West, T. O.
2010-12-01
Climate change and food security are global issues increasingly linked through human decision making that takes place across all scales from on-farm management actions to international climate negotiations. Understanding how agricultural systems can respond to climate change, through mitigation or adaptation, while still supplying sufficient food to feed a growing global population, thus requires a multi-sector tool in a global economic framework. Integrated assessment models are one such tool, however they are typically driven by historical aggregate statistics of production in combination with exogenous assumptions of future trends in agricultural productivity; they are not yet capable of exploring agricultural management practices as climate adaptation or mitigation strategies. Yet there are agricultural models capable of detailed biophysical modeling of farm management and climate impacts on crop yield, soil erosion and C and greenhouse gas emissions, although these are typically applied at point scales that are incompatible with coarse resolution integrated assessment modeling. To combine the relative strengths of these modeling systems, we are using the agricultural model EPIC (Environmental Policy Integrated Climate), applied in a geographic data framework for regional analyses, to provide input to the global economic model GCAM (Global Change Assessment Model). The initial phase of our approach focuses on a pilot region of the Midwest United States, a highly productive agricultural area. We apply EPIC, a point based biophysical process model, at 60 m spatial resolution within this domain and aggregate the results to GCAM agriculture and land use subregions for the United States. GCAM is then initialized with multiple management options for key food and bioenergy crops. Using EPIC to distinguish these management options based on grain yield, residue yield, soil C change and cost differences, GCAM then simulates the optimum distribution of the available management options to meet demands for food and energy over the next century. The coupled models provide a new platform for evaluating future changes in agricultural management based on food demand, bioenergy demand, and changes in crop yield and soil C under a changing climate. This framework can be applied to evaluate the economically and biophysically optimal distribution of management under future climates.
Jeffrey P. Prestemon; James A. Turner; Joseph Buongiorno; Shushuai Zhu; Ruhong Li
2008-01-01
US policy and forest product industry decisionmakers need quantitative information about the magnitude of timber product market impacts from the possible introduction of an exotic and potentially dangerous defoliating forest pest. We applied the Global Forest Products Model to evaluate the effects on the United States of an invasion by the Asian gypsy (...
Impact of Brexit on the forest products industry of the United Kingdom and the rest of the world
Craig M. T. Johnston; Joseph Buongiorno
2016-01-01
The Global Forest Products Model was applied to forecast the effect of Brexit on the global forest products industry to 2003 under two scenarios: an optimistic and a pessimistic future storyline regarding the potential economic effect of Brexit. The forecasts integrated a range of gross domestic product growth rates using an average of the optimistic and...
Surplus N in US maize production: Informing data-driven policies using the Adapt-N model
NASA Astrophysics Data System (ADS)
Sela, Shai; van-Es, Harold; McLellan, Eileen; Margerison, Rebecca; Melkonian, Jeff
2016-04-01
Maize (Zea mays L.) production accounts for the largest share of crop land area in the U.S. and is the largest consumer of nitrogen (N) fertilizers of all US crops. Over-application of N fertilizer in excess of crop needs often leads to a surplus of N in the soil, resulting in well-documented environmental problems and social costs associated with high reactive N losses. There is potential to reduce these costs through better application timing, use of enhanced efficiency products, and more precise rate calculations. However, promoting management changes by means of environmental policies requires robust analysis of the possible environmental outcomes associated with these policies. This research gap is addressed using Adapt-N, a computational tool that combines soil, crop and management information with near-real-time weather data to estimate optimum N application rates for maize. Using results from a large synthetic dataset of 8100 simulations spanning 6 years (2010-2015), we have explored the total applied N rates, the surplus of N (total N applied minus N removed by the crop) and the environmental losses resulting from seven N management scenarios applied in the top 5 US maize production states: IL, IN, IA, MN and NE. To cover a wide range of weather and production environments, all scenarios were applied at five randomly selected locations in each state, using combinations of three soil texture classes and two organic matter contents. The results indicate that fall applications typically lead to the highest total amount of N applied, the highest N surplus and substantial environmental N losses. Nitrification inhibitors were found to have marginal benefits for fall-applied N. Spring pre-plant N applications were found to have a lower N surplus than fall applications, but could still lead to high N losses under wet spring conditions. These losses were reduced (by 12%) when nitrification and urease inhibitors were applied. Of all simulated N management scenarios, a split application of a modest starter followed by the majority of N applied at sidedress was found to have, on average, the lowest total applied N and N surplus. A split application was found to reduce environmental losses by 46% and 17% compared with fall and spring pre-plant N applications, respectively. These results could be used to inform environmental policies and business models to reduce environmental costs associated with maize production in the U.S.
Study of clusters and hypernuclei production within PHSD+FRIGA model
NASA Astrophysics Data System (ADS)
Kireyeu, Viktar; Le Fèvre, Arnaud; Bratkovskaya, Elena
2017-03-01
We report on the results on the dynamical modelling of cluster formation with the new combined PHSD+FRIGA model at Nuclotron and NICA energies. The FRIGA clusterization algorithm, which can be applied to the transport models, is based on the simulated annealing technique to obtain the most bound configuration of fragments and nucleons. The PHSD+FRIGA model is able to predict isotope yields as well as hypernucleus production. Based on present predictions of the combined model we study the possibility to detect such clusters and hypernuclei in the BM@N and MPD/NICA detectors.
Selecting statistical model and optimum maintenance policy: a case study of hydraulic pump.
Ruhi, S; Karim, M R
2016-01-01
A proper maintenance policy can play a vital role in the effective investigation of product reliability. Every engineered object such as a product, plant or infrastructure needs preventive and corrective maintenance. In this paper we look at a real case study dealing with the maintenance of hydraulic pumps used in excavators by a mining company. We obtained the data that the owner had collected and carried out an analysis, building models for pump failures. The data consist of both failure and censored lifetimes of the hydraulic pump. Different competing mixture models are applied to analyze a set of maintenance data of a hydraulic pump. Various characteristics of the mixture models, such as the cumulative distribution function, reliability function, mean time to failure, etc., are estimated to assess the reliability of the pump. The Akaike Information Criterion, adjusted Anderson-Darling test statistic, Kolmogorov-Smirnov test statistic and root mean square error are considered to select a suitable model among the set of competing models. The maximum likelihood estimation method via the EM algorithm is applied mainly for estimating the parameters of the models and reliability-related quantities. In this study, it is found that a threefold mixture model (Weibull-Normal-Exponential) fits the hydraulic pump failure data set well. This paper also illustrates how a suitable statistical model can be applied to estimate the optimum maintenance period at a minimum cost for a hydraulic pump.
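As a simplified illustration of likelihood-based fitting to failure and right-censored lifetimes, the sketch below fits a single Weibull by direct maximum likelihood; the paper fits richer mixture models via the EM algorithm, and the lifetimes below are invented.

```python
# Hedged sketch: Weibull MLE with right-censored lifetimes and the implied MTTF.
from math import gamma
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

time = np.array([410., 820., 1100., 1500., 1900., 2300., 2600., 3100., 3400., 3800.])
failed = np.array([1, 1, 1, 0, 1, 1, 0, 1, 0, 0])     # 1 = failure, 0 = censored

def neg_loglik(theta):
    shape, scale = np.exp(theta)                       # log scale keeps parameters positive
    logf = weibull_min.logpdf(time, shape, scale=scale)    # failures contribute the density
    logS = weibull_min.logsf(time, shape, scale=scale)     # censored contribute the survivor
    return -np.sum(failed * logf + (1 - failed) * logS)

fit = minimize(neg_loglik, x0=np.log([1.5, 2000.0]), method="Nelder-Mead")
shape, scale = np.exp(fit.x)
mttf = scale * gamma(1 + 1 / shape)
print(f"shape = {shape:.2f}, scale = {scale:.0f}, MTTF = {mttf:.0f} h")
```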
Evaluation of pulsed streamer corona experiments to determine the O* radical yield
NASA Astrophysics Data System (ADS)
van Heesch, E. J. M.; Winands, G. J. J.; Pemen, A. J. M.
2008-12-01
The production of O* radicals in air by a pulsed streamer plasma is studied by integrating a large set of precise experimental data with the chemical kinetics of ozone production. The measured data comprise ozone production, plasma energy, streamer volume, streamer length, streamer velocity, humidity and gas-flow rate. Instead of entering input parameters into a kinetic model to calculate the end products, the opposite strategy is followed: since the amount of end product (ozone) is known from the measurements, the model had to be applied in the reverse direction to determine the input parameters, i.e. the O* radical concentration.
Future carbon storage in harvested wood products from Ontario's Crown forests
Jiaxin Chen; Stephen J. Colombo; Michael T. Ter-Mikaelian; Linda S. Heath
2008-01-01
This analysis quantifies projected carbon (C) storage in harvested wood products (HWP) from Ontario's Crown forests. The large-scale forest C budget model, FORCARB-ON, was applied to estimate HWP C stock changes using the production approach defined by the Intergovernmental Panel on Climate Change. Harvested wood volume was converted to C mass and allocated to...
A Hybrid Approach to the Valuation of RFID/MEMS Technology Applied to Ordnance Inventory
2005-11-01
Whole-system carbon balance for a regional temperate forest in Northern Wisconsin, USA
NASA Astrophysics Data System (ADS)
Peckham, S. D.; Gower, S. T.
2010-12-01
The whole-system (biological + industrial) carbon (C) balance was estimated for the Chequamegon-Nicolet National Forest (CNNF), a temperate forest covering 600,000 ha in Northern Wisconsin, USA. The biological system was modeled using a spatially-explicit version of the ecosystem process model Biome-BGC. The industrial system was modeled using life cycle inventory (LCI) models for wood and paper products. Biome-BGC was used to estimate net primary production, net ecosystem production (NEP), and timber harvest (H) over the entire CNNF. The industrial carbon budget (Ci) was estimated by applying LCI models of CO2 emissions resulting from timber harvest and production of specific wood and paper products in the CNNF region. In 2009, simulated NEP of the CNNF averaged 3.0 tC/ha and H averaged 0.1 tC/ha. Despite model uncertainty, the CNNF region is likely a carbon sink (NEP - Ci > 0), even when CO2 emissions from timber harvest and production of wood and paper products are included in the calculation of the entire forest system C budget.
Static capacity model for sealed nickel cadmium cells
NASA Astrophysics Data System (ADS)
Lomaniec, Jacob
1989-04-01
A model was developed for calculating the capacity of nickel cadmium rechargeable sealed cells. The model applies only to the following operating conditions for a cell of capacity C ampere-hours: temperature of 20 ± 5 °C; charging for 16 h at a current of 0.1C A; discharging at 0.2C A until the terminal voltage falls to 1.0 V. The study considers the dimensional and quantitative relationships among the cell's chemical and mechanical components, and the application of these relationships to optimizing cell design in terms of energy density and electrical performance. The model comprises several components, representing the several stages of the manufacturing process: assembling the electrodes and separator in a cylindrical can; production of a porous, sintered nickel plaque on a perforated steel substrate; impregnating the plaque with active material; oxidation of the sintered nickel during impregnation; and insertion and concentration of the electrolyte solution. Results from the model were compared with those obtained from cells manufactured over a long period. Good agreement was obtained. The models were used in the plant to define the operating parameters of various production stages and contributed to a general improvement in product quality. The models were also applied to the optimization of new cell designs and to the reduction of development costs by eliminating much experimental work.
Exploring Causal Models of Educational Achievement.
ERIC Educational Resources Information Center
Parkerson, Jo Ann; And Others
1984-01-01
This article evaluates five causal models of educational productivity applied to learning science in a sample of 882 fifth through eighth graders. Each model explores the relationship between achievement and a combination of eight constructs: home environment, peer group, media, ability, social environment, time on task, motivation, and…
NASA Astrophysics Data System (ADS)
Ogawa, Tatsuhiko; Hashimoto, Shintaro; Sato, Tatsuhiko; Niita, Koji
2014-06-01
A new nuclear de-excitation model, intended for accurate simulation of isomeric transition of excited nuclei, was incorporated into PHITS and applied to various situations to clarify the impact of the model. The case studies show that precise treatment of gamma de-excitation and consideration for isomer production are important for various applications such as detector performance prediction, radiation shielding calculations and the estimation of radioactive inventory including isomers.
Integration of scheduling and discrete event simulation systems to improve production flow planning
NASA Astrophysics Data System (ADS)
Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.
2016-08-01
The increased availability of data and of computer-aided technologies such as MRP I/II, ERP and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. Integration of production scheduling with computer modelling, simulation and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology based on a semi-automatic model generation method is proposed to eliminate problems associated with model complexity and with the labour-intensive, time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied. This approach is illustrated through examples of practical implementation of the proposed method using the KbRS scheduling system and the Enterprise Dynamics simulation system.
Discrete-event system simulation on small and medium enterprises productivity improvement
NASA Astrophysics Data System (ADS)
Sulistio, J.; Hidayah, N. A.
2017-12-01
Small and medium industries in Indonesia are currently developing. The problem faced by SMEs is the difficulty of meeting the growing demand coming into the company. Therefore, SMEs need to analyse and evaluate their production processes in order to meet all orders. The purpose of this research is to increase the productivity of the SME production floor by applying discrete-event system simulation. This method is preferred because it can solve complex problems due to the dynamic and stochastic nature of the system. To increase the credibility of the simulation, the model was validated by comparing the averages of two trials, the variances of two trials, and a chi-square test. Afterwards, the Bonferroni method was applied to develop several alternatives. The article concludes that the productivity of the SME production floor increased by up to 50% by adding the capacity of the dyeing and drying machines.
Joseph Buongiorno; Ronald Raunikar; Shushuai Zhu
2011-01-01
The Global Forest Products Model (GFPM) was applied to project the consequences for the global forest sector of doubling the rate of growth of bioenergy demand relative to a base scenario, other drivers being maintained constant. The results showed that this would lead to the convergence of the price of fuelwood and industrial roundwood, raising the price of industrial...
Estimating Animal Abundance in Ground Beef Batches Assayed with Molecular Markers
Hu, Xin-Sheng; Simila, Janika; Platz, Sindey Schueler; Moore, Stephen S.; Plastow, Graham; Meghen, Ciaran N.
2012-01-01
Estimating animal abundance in industrial scale batches of ground meat is important for mapping meat products through the manufacturing process and for effectively tracing the finished product during a food safety recall. The processing of ground beef involves a potentially large number of animals from diverse sources in a single product batch, which produces a high heterogeneity in capture probability. In order to estimate animal abundance through DNA profiling of ground beef constituents, two parameter-based statistical models were developed for incidence data. Simulations were applied to evaluate the maximum likelihood estimate (MLE) of a joint likelihood function from multiple surveys, showing superiority in the presence of high capture heterogeneity with small sample sizes, or comparable estimation in the presence of low capture heterogeneity with a large sample size when compared to other existing models. Our model employs the full information on the pattern of the capture-recapture frequencies from multiple samples. We applied the proposed models to estimate animal abundance in six manufacturing beef batches, genotyped using 30 single nucleotide polymorphism (SNP) markers, from a large scale beef grinding facility. Results show that between 411∼1367 animals were present in six manufacturing beef batches. These estimates are informative as a reference for improving recall processes and tracing finished meat products back to source. PMID:22479559
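The abstract's two-parameter models are not spelled out here, so the following is only a minimal sketch of the underlying idea: estimating abundance by maximum likelihood from capture-recapture incidence data. It assumes the simplest homogeneous-capture model (often called M0), profiles out the capture probability, and scans a grid of candidate abundances; the toy counts, the grid bounds and the use of scipy are illustrative assumptions, not the published method.

```python
import numpy as np
from scipy.special import gammaln

def m0_loglik(N, capture_counts, t):
    """Log-likelihood of abundance N under model M0 (one capture probability p
    shared by all animals and samples), with p profiled out.
    capture_counts: number of times each *detected* animal appeared in t samples."""
    D = len(capture_counts)               # distinct animals detected
    n_tot = int(np.sum(capture_counts))   # total detections
    if N < D:
        return -np.inf
    p = n_tot / (N * t)                   # profile MLE of p for a given N
    return (gammaln(N + 1) - gammaln(N - D + 1)
            + n_tot * np.log(p) + (N * t - n_tot) * np.log(1.0 - p))

# toy incidence data: 3 SNP-profiling "samples", counts per detected animal
counts = np.array([1, 1, 2, 1, 3, 1, 1, 2, 1, 1])
t = 3
grid = np.arange(len(counts), 5000)
ll = np.array([m0_loglik(N, counts, t) for N in grid])
print("MLE of abundance:", grid[np.argmax(ll)])
```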
NASA Astrophysics Data System (ADS)
Yehia, Ali M.; Arafa, Reham M.; Abbas, Samah S.; Amer, Sawsan M.
2016-01-01
Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were performed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering methods are ratio-manipulating spectrophotometric methods that were satisfactorily applied for the selective determination of CFQ within a linear range of 5.0-40.0 μg mL⁻¹. Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. An experimental design of 25 synthetic mixtures with three factors at five levels was used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in the quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. These developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method.
PERSON-Personalized Expert Recommendation System for Optimized Nutrition.
Chen, Chih-Han; Karvela, Maria; Sohbati, Mohammadreza; Shinawatra, Thaksin; Toumazou, Christofer
2018-02-01
The rise of personalized diets is due to the emergence of nutrigenetics and genetic testing services. However, recommendation systems are far from mature enough to provide personalized food suggestions to consumers for daily use. The main barrier to connecting genetic information to personalized diets is the complexity of data and the scalability of the applied systems. Aiming to cross such barriers and provide direct applications, a personalized expert recommendation system for optimized nutrition is introduced in this paper, which performs direct-to-consumer personalized grocery product filtering and recommendation. A deep learning neural network model is applied to achieve automatic product categorization. The ability to scale to unknown new data is achieved through the generalized representation of word embeddings. Furthermore, the categorized products are filtered with a model based on individual genetic data with associated phenotypic information, and a case study with databases from three different sources is carried out to confirm the system.
Triple Value System Dynamics Modeling to Help Stakeholders Engage with Food-Energy-Water Problems
Triple Value (3V) Community scoping projects and Triple Value Simulation (3VS) models help decision makers and stakeholders apply systems-analysis methodology to complex problems related to food production, water quality, and energy use. 3VS models are decision support tools that...
Hybrid modeling as a QbD/PAT tool in process development: an industrial E. coli case study.
von Stosch, Moritz; Hamelink, Jan-Martijn; Oliveira, Rui
2016-05-01
Process understanding is emphasized in the process analytical technology initiative and the quality by design paradigm to be essential for manufacturing of biopharmaceutical products with consistently high quality. A typical approach to developing a process understanding is applying a combination of design of experiments with statistical data analysis. Hybrid semi-parametric modeling is investigated as an alternative method to pure statistical data analysis. The hybrid model framework provides flexibility to select model complexity based on available data and knowledge. Here, a parametric dynamic bioreactor model is integrated with a nonparametric artificial neural network that describes biomass and product formation rates as a function of varied fed-batch fermentation conditions for high cell density heterologous protein production with E. coli. Our model can accurately describe biomass growth and product formation across variations in induction temperature, pH and feed rates. The model indicates that while product expression rate is a function of early induction phase conditions, it is negatively impacted as productivity increases. This could correspond to physiological changes due to cytoplasmic product accumulation. Due to the dynamic nature of the model, rational process timing decisions can be made and the impact of temporal variations in process parameters on product formation and process performance can be assessed, which is central for process understanding.
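As a rough illustration of the hybrid semi-parametric structure described above (a parametric mass-balance backbone closed by a nonparametric rate model), the sketch below couples fed-batch balances for biomass and product with a tiny neural network that maps induction temperature, pH and feed rate to specific rates. The network weights are random placeholders rather than fitted values, and all states, units and numbers are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
# nonparametric block: 2-layer perceptron mapping process conditions
# (induction temperature, pH, feed rate) to specific rates (mu, q_p);
# the weights here are random placeholders -- in practice they are fitted to data
W1, b1 = rng.normal(size=(6, 3)), np.zeros(6)
W2, b2 = rng.normal(size=(2, 6)), np.zeros(2)

def specific_rates(conditions):
    h = np.tanh(W1 @ conditions + b1)
    mu, q_p = np.exp(W2 @ h + b2) * 0.1     # keep rates positive and small
    return mu, q_p

def simulate(T, pH, F, t_end=10.0, dt=0.05):
    """Parametric fed-batch mass balances closed by the neural-net rates."""
    X, P, V = 1.0, 0.0, 1.0                  # biomass (g/L), product (g/L), volume (L)
    for _ in np.arange(0.0, t_end, dt):
        mu, q_p = specific_rates(np.array([T, pH, F]))
        dX = mu * X - (F / V) * X            # growth minus dilution by the feed
        dP = q_p * X - (F / V) * P
        X, P, V = X + dt * dX, P + dt * dP, V + dt * F
    return X, P

print(simulate(T=30.0, pH=7.0, F=0.02))
```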
Stochastic Optimization in The Power Management of Bottled Water Production Planning
NASA Astrophysics Data System (ADS)
Antoro, Budi; Nababan, Esther; Mawengkang, Herman
2018-01-01
This paper reviews a model developed to minimize production costs in bottled water production planning through stochastic optimization. Planning is a means for management to achieve the goals that have been set, since every management level in the organization needs planning activities. The model built is a two-stage stochastic model that aims to minimize the cost of bottled water production while accounting for whether or not interruptions occur during the production process. The model was developed to minimize production cost, assuming that the availability of packaging raw materials is sufficient for each kind of bottle. The minimum cost for each kind of bottled water production is expressed as the expectation over production scenarios, each with a scenario probability. The uncertainty probabilities represent the number of production runs and the timing of power supply interruptions. This ensures that the number of interruptions that occur does not exceed the limit of the contract agreement made between the company and the power supplier.
2014-01-01
Background Metal oxide nanoparticles such as ZnO are used in sunscreens as they improve their optical properties against the UV-light that causes dermal damage and skin cancer. However, the hazardous properties of the particles used as UV-filters in the sunscreens and applied to the skin have remained uncharacterized. Methods Here we investigated whether different sized ZnO particles would be able to penetrate injured skin and injured allergic skin in the mouse atopic dermatitis model after repeated topical application of ZnO particles. Nano-sized ZnO (nZnO) and bulk-sized ZnO (bZnO) were applied to mechanically damaged mouse skin with or without allergen/superantigen sensitization. Allergen/superantigen sensitization evokes local inflammation and allergy in the skin and is used as a disease model of atopic dermatitis (AD). Results Our results demonstrate that only nZnO is able to reach into the deep layers of the allergic skin whereas bZnO stays in the upper layers of both damaged and allergic skin. In addition, both types of particles diminish the local skin inflammation induced in the mouse model of AD; however, nZnO has a higher potential to suppress the local effects. In addition, especially nZnO induces systemic production of IgE antibodies, evidence of allergy promoting adjuvant properties for topically applied nZnO. Conclusions These results provide new hazard characterization data about the metal oxide nanoparticles commonly used in cosmetic products and provide new insights into the dermal exposure and hazard assessment of these materials in injured skin. PMID:25123235
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pinches, A.; Pallent, L.J.
1986-10-01
Rate and yield information relating to biomass and product formation and to nitrogen, glucose and oxygen consumption are described for xanthan gum batch fermentations in which both chemically defined (glutamate nitrogen) and complex (peptone nitrogen) media are employed. Simple growth and product models are used for data interpretation. For both nitrogen sources, rate and yield parameter estimates are shown to be independent of initial nitrogen concentrations. For stationary phases, specific rates of gum production are shown to be independent of nitrogen source but dependent on initial nitrogen concentration. The latter is modeled empirically and suggests caution in applying simple product models to xanthan gum fermentations. 13 references.
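The "simple growth and product models" are not given explicitly in the abstract; a common choice for gum fermentations couples logistic biomass growth with a Luedeking-Piret product term, so the sketch below uses that form purely as an illustration. The parameter values and the use of scipy's solve_ivp are assumptions, not the authors' fitted model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rates(t, y, mu_max, x_max, alpha, beta):
    x, p = y
    dxdt = mu_max * x * (1.0 - x / x_max)   # logistic biomass growth
    dpdt = alpha * dxdt + beta * x          # Luedeking-Piret: growth-associated
    return [dxdt, dpdt]                     # plus non-growth-associated production

params = dict(mu_max=0.25, x_max=6.0, alpha=2.0, beta=0.05)   # illustrative values
sol = solve_ivp(rates, (0.0, 48.0), [0.1, 0.0], args=tuple(params.values()),
                t_eval=np.linspace(0.0, 48.0, 25))
print(sol.y[1][-1])   # simulated gum concentration at 48 h
```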
Target Scattering Metrics: Model-Model and Model-Data Comparisons
2017-12-13
…measured synthetic aperture sonar (SAS) data or from numerical models is investigated. Metrics are needed for quantitative comparisons for signals… Candidate metrics for model-model comparisons are examined here with a goal to consider raw data prior to its reduction to data products, which may… be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons.
Stochastic growth logistic model with aftereffect for batch fermentation process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosli, Norhayati; Ayoubi, Tawfiqullah; Bahar, Arifah
2014-06-19
In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and Luedeking-Piret equations for solvent production in a batch fermentation system are introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt optimization method of non-linear least squares. We apply the Milstein scheme for solving the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results and the experimental data of the microbial growth and solvent production in the batch system. Low values of Root Mean-Square Error (RMSE) of the stochastic models with aftereffect indicate good fits.
Stochastic growth logistic model with aftereffect for batch fermentation process
NASA Astrophysics Data System (ADS)
Rosli, Norhayati; Ayoubi, Tawfiqullah; Bahar, Arifah; Rahman, Haliza Abdul; Salleh, Madihah Md
2014-06-01
In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and Luedeking-Piret equations for solvent production in a batch fermentation system are introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt optimization method of non-linear least squares. We apply the Milstein scheme for solving the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results and the experimental data of the microbial growth and solvent production in the batch system. Low values of Root Mean-Square Error (RMSE) of the stochastic models with aftereffect indicate good fits.
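For readers unfamiliar with the Milstein scheme mentioned above, the sketch below discretises a stochastic logistic growth equation with multiplicative noise, dX = rX(1 - X/K)dt + σX dW. It omits the aftereffect (delay) term of the paper's model and uses made-up parameter values, so it illustrates the numerical scheme rather than reproducing the published results.

```python
import numpy as np

def milstein_logistic(x0, r, K, sigma, dt, n_steps, seed=1):
    """Milstein discretisation of dX = r X (1 - X/K) dt + sigma X dW."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for n in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))
        drift = r * x[n] * (1.0 - x[n] / K)
        diff = sigma * x[n]
        # Milstein correction: 0.5 * b * b' * (dW^2 - dt), with b(x) = sigma*x so b' = sigma
        x[n + 1] = x[n] + drift * dt + diff * dW + 0.5 * sigma * diff * (dW**2 - dt)
    return x

path = milstein_logistic(x0=0.05, r=0.4, K=8.0, sigma=0.1, dt=0.1, n_steps=480)
print(path[-1])
```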
Alternative approaches to predicting methane emissions from dairy cows.
Mills, J A N; Kebreab, E; Yates, C M; Crompton, L A; Cammell, S B; Dhanoa, M S; Agnew, R E; France, J
2003-12-01
Previous attempts to apply statistical models, which correlate nutrient intake with methane production, have been of limited value where predictions are obtained for nutrient intakes and diet types outside those used in model construction. Dynamic mechanistic models have proved more suitable for extrapolation, but they remain computationally expensive and are not applied easily in practical situations. The first objective of this research focused on employing conventional techniques to generate statistical models of methane production appropriate to United Kingdom dairy systems. The second objective was to evaluate these models and a model published previously using both United Kingdom and North American data sets. Thirdly, nonlinear models were considered as alternatives to the conventional linear regressions. The United Kingdom calorimetry data used to construct the linear models also were used to develop the three nonlinear alternatives that were all of modified Mitscherlich (monomolecular) form. Of the linear models tested, an equation from the literature proved most reliable across the full range of evaluation data (root mean square prediction error = 21.3%). However, the Mitscherlich models demonstrated the greatest degree of adaptability across diet types and intake level. The most successful model for simulating the independent data was a modified Mitscherlich equation with the steepness parameter set to represent dietary starch-to-ADF ratio (root mean square prediction error = 20.6%). However, when such data were unavailable, simpler Mitscherlich forms relating dry matter or metabolizable energy intake to methane production remained better alternatives relative to their linear counterparts.
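A monomolecular (Mitscherlich) curve of the general form CH4 = a - (a + b)·exp(-c·x), with x an intake-related driver, can be fitted to calorimetry data in a few lines; the sketch below shows such a fit and a root mean square prediction error calculation. The data points, starting values and exact parameterisation are illustrative assumptions and do not correspond to the equations or datasets of the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(mei, a, b, c):
    """Monomolecular (Mitscherlich) curve: methane output rises toward an
    asymptote a as metabolisable energy intake (MEI) increases."""
    return a - (a + b) * np.exp(-c * mei)

# illustrative data: MEI (MJ/d) vs methane energy output (MJ/d)
mei = np.array([80., 120., 160., 200., 240., 280.])
ch4 = np.array([8.1, 11.5, 14.2, 16.0, 17.4, 18.3])

popt, _ = curve_fit(mitscherlich, mei, ch4, p0=(25.0, 0.0, 0.005))
rmspe = np.sqrt(np.mean(((mitscherlich(mei, *popt) - ch4) / ch4) ** 2)) * 100
print(popt, f"RMSPE = {rmspe:.1f}%")
```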
MODELING OF THE FAST ORGANIC EMISSIONS FROM A WOOD-FINISHING PRODUCT -- FLOOR WAX
The paper discusses environmental chamber and full-scale residential house tests conducted to characterize the fast organic emissions from a wood finishing product, floor wax. For the environmental chamber tests, a very small amount (< 5 g/sq m) of the wax was applied to an alumi...
NASA Astrophysics Data System (ADS)
Wang, Yu; Liu, Qun
2013-01-01
Surplus-production models are widely used in fish stock assessment and fisheries management due to their simplicity and lower data demands than age-structured models such as Virtual Population Analysis. The CEDA (catch-effort data analysis) and ASPIC (a surplus-production model incorporating covariates) computer packages are data-fitting or parameter estimation tools that have been developed to analyze catch-and-effort data using non-equilibrium surplus production models. We applied CEDA and ASPIC to the hairtail (Trichiurus japonicus) fishery in the East China Sea. Both packages produced robust results and yielded similar estimates. In CEDA, the Schaefer surplus production model with log-normal error assumption produced results close to those of ASPIC. CEDA is sensitive to the choice of initial proportion, while ASPIC is not. However, CEDA produced higher R² values than ASPIC.
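For context, the Schaefer surplus-production dynamics that both CEDA and ASPIC fit can be written as B[t+1] = B[t] + rB[t](1 - B[t]/K) - C[t], with maximum sustainable yield rK/4. The short sketch below simply projects biomass under an assumed catch series; the parameter values are invented and no observation-error fitting (log-normal or otherwise) is attempted.

```python
import numpy as np

def schaefer_biomass(B0, r, K, catches):
    """Project stock biomass under the Schaefer surplus-production model:
    B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]."""
    B = [B0]
    for C in catches:
        B.append(max(B[-1] + r * B[-1] * (1.0 - B[-1] / K) - C, 1e-6))
    return np.array(B)

r, K = 0.45, 900.0          # intrinsic growth rate, carrying capacity (illustrative)
msy = r * K / 4.0           # maximum sustainable yield under the Schaefer model
print("MSY =", msy)
print(schaefer_biomass(B0=600.0, r=r, K=K, catches=[80.0] * 10))
```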
Evaluating lotion transfer from products to skin using the behind-the-knee test.
Farage, Miranda A
2010-05-01
Adding lotions or emollients to the surface of a variety of paper products confers a number of benefits to the skin of consumers. A modification of the 'behind-the-knee (BTK)' test model was used as a means of measuring the effectiveness of lotion transfer to the skin. Two series of feminine protection pads were prepared: (1) identically constructed pads differing only in the amount of lotion applied to the surface and (2) pads of various compositions to compare the influence of other product characteristics. For the first series, pads were applied for 3 h using the BTK protocol, and lotion transfer was evaluated. For the second series of products, two sample pads were applied consecutively for 3 h each, and lotion transfer was evaluated at both time points (e.g., 3 and 6 h). In addition, a clinical in-use study was used to evaluate lotion transfer for the second product series. In the BTK model using pads of identical composition, lotion transfer was a function of the amount of lotion placed on the pad. However, results from the second product series indicated that when pads were prepared using different absorbent materials (superabsorbent gelling material, or AGM, and cellulose), pads with the AGM core transferred lotion more effectively than pads with a cellulose core. Other product characteristics, i.e., pad thickness and lotion configuration, did not detectably influence lotion transfer. The results of an in-use clinical study conducted on the second series of test products were directionally similar to those from the BTK, but statistical significance was not reached. An adaptation of the BTK test method provides an effective means of evaluating the transfer of lotion formulations from feminine protection pads at a fraction of the cost of clinical in-use studies.
Building aggregate timber supply models from individual harvest choice
Maksym Polyakov; David N. Wear; Robert Huggett
2009-01-01
Timber supply has traditionally been modelled using aggregate data. In this paper, we build aggregate supply models for four roundwood products for the US state of North Carolina from a stand-level harvest choice model applied to detailed forest inventory. The simulated elasticities of pulpwood supply are much lower than reported by previous studies. Cross price...
Harvest choice and timber supply models for forest forecasting
Maksym Polyakov; David N Wear
2010-01-01
Timber supply has traditionally been modeled using aggregate data, whereas individual harvest choices have been shown to be sensitive to the vintage and condition of forest capital stocks. In this article, we build aggregate supply models for four roundwood products in a seven-state region of the US South directly from stand-level harvest choice models applied to...
Three probes for diagnosing photochemical dynamics are presented and applied to specialized ambient surface-level observations and to a numerical photochemical model to better understand rates of production and other process information in the atmosphere and in the model. Howeve...
NASA Astrophysics Data System (ADS)
Lynch, Peng; Reid, Jeffrey S.; Westphal, Douglas L.; Zhang, Jianglong; Hogan, Timothy F.; Hyer, Edward J.; Curtis, Cynthia A.; Hegg, Dean A.; Shi, Yingxi; Campbell, James R.; Rubin, Juli I.; Sessions, Walter R.; Turk, F. Joseph; Walker, Annette L.
2016-04-01
While stand alone satellite and model aerosol products see wide utilization, there is a significant need in numerous atmospheric and climate applications for a fused product on a regular grid. Aerosol data assimilation is an operational reality at numerous centers, and like meteorological reanalyses, aerosol reanalyses will see significant use in the near future. Here we present a standardized 2003-2013 global 1 × 1° and 6-hourly modal aerosol optical thickness (AOT) reanalysis product. This data set can be applied to basic and applied Earth system science studies of significant aerosol events, aerosol impacts on numerical weather prediction, and electro-optical propagation and sensor performance, among other uses. This paper describes the science of how to develop and score an aerosol reanalysis product. This reanalysis utilizes a modified Navy Aerosol Analysis and Prediction System (NAAPS) at its core and assimilates quality controlled retrievals of AOT from the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra and Aqua and the Multi-angle Imaging SpectroRadiometer (MISR) on Terra. The aerosol source functions, including dust and smoke, were regionally tuned to obtain the best match between the model fine- and coarse-mode AOTs and the Aerosol Robotic Network (AERONET) AOTs. Other model processes, including deposition, were tuned to minimize the AOT difference between the model and satellite AOT. Aerosol wet deposition in the tropics is driven with satellite-retrieved precipitation, rather than the model field. The final reanalyzed fine- and coarse-mode AOT at 550 nm is shown to have good agreement with AERONET observations, with global mean root mean square error around 0.1 for both fine- and coarse-mode AOTs. This paper includes a discussion of issues particular to aerosol reanalyses that make them distinct from standard meteorological reanalyses, considerations for extending such a reanalysis outside of the NASA A-Train era, and examples of how the aerosol reanalysis can be applied or fused with other model or remote sensing products. Finally, the reanalysis is evaluated in comparison with other available studies of aerosol trends, and the implications of this comparison are discussed.
Capacity planning for batch and perfusion bioprocesses across multiple biopharmaceutical facilities.
Siganporia, Cyrus C; Ghosh, Soumitra; Daszkowski, Thomas; Papageorgiou, Lazaros G; Farid, Suzanne S
2014-01-01
Production planning for biopharmaceutical portfolios becomes more complex when products switch between fed-batch and continuous perfusion culture processes. This article describes the development of a discrete-time mixed integer linear programming (MILP) model to optimize capacity plans for multiple biopharmaceutical products, with either batch or perfusion bioprocesses, across multiple facilities to meet quarterly demands. The model comprised specific features to account for products with fed-batch or perfusion culture processes such as sequence-dependent changeover times, continuous culture constraints, and decoupled upstream and downstream operations that permit independent scheduling of each. Strategic inventory levels were accounted for by applying cost penalties when they were not met. A rolling time horizon methodology was utilized in conjunction with the MILP model and was shown to obtain solutions with greater optimality in less computational time than the full-scale model. The model was applied to an industrial case study to illustrate how the framework aids decisions regarding outsourcing capacity to third party manufacturers or building new facilities. The impact of variations on key parameters such as demand or titres on the optimal production plans and costs was captured. The analysis identified the critical ratio of in-house to contract manufacturing organization (CMO) manufacturing costs that led the optimization results to favor building a future facility over using a CMO. The tool predicted that if titres were higher than expected then the optimal solution would allocate more production to in-house facilities, where manufacturing costs were lower. Utilization graphs indicated when capacity expansion should be considered. © 2014 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
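A heavily simplified sketch of the kind of discrete-time MILP described above is shown here, assuming the PuLP package: integer batch allocations of two products to an in-house facility and a CMO over four quarters, minimising cost subject to demand and capacity. Changeovers, perfusion-specific constraints, inventory penalties and the rolling horizon are deliberately omitted, and all numbers are placeholders.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

periods = range(4)                                   # four quarters
products, facilities = ["A", "B"], ["F1", "CMO"]
demand = {(p, t): d for p, d in [("A", 10), ("B", 6)] for t in periods}
cost = {"F1": 1.0, "CMO": 1.6}                       # relative cost per batch
cap = {"F1": 12, "CMO": 20}                          # batches per quarter

prob = LpProblem("capacity_plan", LpMinimize)
x = {(p, f, t): LpVariable(f"x_{p}_{f}_{t}", lowBound=0, cat="Integer")
     for p in products for f in facilities for t in periods}

prob += lpSum(cost[f] * x[p, f, t] for (p, f, t) in x)                 # total cost
for p in products:
    for t in periods:
        prob += lpSum(x[p, f, t] for f in facilities) >= demand[p, t]  # meet demand
for f in facilities:
    for t in periods:
        prob += lpSum(x[p, f, t] for p in products) <= cap[f]          # capacity

prob.solve()
print("total cost:", value(prob.objective))
```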
Capacity Planning for Batch and Perfusion Bioprocesses Across Multiple Biopharmaceutical Facilities
Siganporia, Cyrus C; Ghosh, Soumitra; Daszkowski, Thomas; Papageorgiou, Lazaros G; Farid, Suzanne S
2014-01-01
Production planning for biopharmaceutical portfolios becomes more complex when products switch between fed-batch and continuous perfusion culture processes. This article describes the development of a discrete-time mixed integer linear programming (MILP) model to optimize capacity plans for multiple biopharmaceutical products, with either batch or perfusion bioprocesses, across multiple facilities to meet quarterly demands. The model comprised specific features to account for products with fed-batch or perfusion culture processes such as sequence-dependent changeover times, continuous culture constraints, and decoupled upstream and downstream operations that permit independent scheduling of each. Strategic inventory levels were accounted for by applying cost penalties when they were not met. A rolling time horizon methodology was utilized in conjunction with the MILP model and was shown to obtain solutions with greater optimality in less computational time than the full-scale model. The model was applied to an industrial case study to illustrate how the framework aids decisions regarding outsourcing capacity to third party manufacturers or building new facilities. The impact of variations on key parameters such as demand or titres on the optimal production plans and costs was captured. The analysis identified the critical ratio of in-house to contract manufacturing organization (CMO) manufacturing costs that led the optimization results to favor building a future facility over using a CMO. The tool predicted that if titres were higher than expected then the optimal solution would allocate more production to in-house facilities, where manufacturing costs were lower. Utilization graphs indicated when capacity expansion should be considered. © 2013 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 30:594–606, 2014 PMID:24376262
The added value of remote sensing products in constraining hydrological models
NASA Astrophysics Data System (ADS)
Nijzink, Remko C.; Almeida, Susana; Pechlivanidis, Ilias; Capell, René; Gustafsson, David; Arheimer, Berit; Freer, Jim; Han, Dawei; Wagener, Thorsten; Sleziak, Patrik; Parajka, Juraj; Savenije, Hubert; Hrachowitz, Markus
2017-04-01
The calibration of a hydrological model still depends on the availability of streamflow data, even though additional sources of information (i.e. remotely sensed data products) have become more widely available. In this research, the model parameters of four different conceptual hydrological models (HYPE, HYMOD, TUW, FLEX) were constrained with remotely sensed products. The models were applied over 27 catchments across Europe to cover a wide range of climates, vegetation and landscapes. The fluxes and states of the models were correlated with the relevant products (e.g. MOD10A snow with modelled snow states), after which new a-posteriori parameter distributions were determined based on a weighting procedure using conditional probabilities. Briefly, each parameter was weighted with the coefficient of determination of the relevant regression between modelled states/fluxes and products. In this way, final feasible parameter sets were derived without the use of discharge time series. Initial results show that improvements in model performance, with regard to streamflow simulations, are obtained when the models are constrained with a set of remotely sensed products simultaneously. In addition, we present a more extensive analysis to assess a model's ability to reproduce a set of hydrological signatures, such as rising limb density or peak distribution. Eventually, this research will enhance our understanding and recommendations in the use of remotely sensed products for constraining conceptual hydrological modelling and improving predictive capability, especially for data-sparse regions.
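The weighting step described above can be sketched very compactly: each candidate parameter set receives a weight proportional to the coefficient of determination between its simulated series and the remote-sensing product, and the weights are normalised to form an a-posteriori distribution. The snippet below assumes a simple correlation-based R² and toy series; the actual procedure in the study may differ in detail.

```python
import numpy as np

def posterior_weights(model_series, product_series):
    """Weight each candidate parameter set by the coefficient of determination
    between its simulated state/flux series and the remote-sensing product."""
    weights = []
    for sim in model_series:                     # one series per parameter set
        r = np.corrcoef(sim, product_series)[0, 1]
        weights.append(max(r, 0.0) ** 2)         # R^2; negative correlations get zero
    w = np.array(weights)
    return w / w.sum()                           # normalised a-posteriori weights

# toy example: 3 parameter sets, 5 time steps of e.g. modelled snow storage
sims = np.array([[1.0, 2.0, 3.0, 4.0, 5.0],
                 [1.0, 1.9, 3.2, 3.8, 5.1],
                 [5.0, 4.0, 3.0, 2.0, 1.0]])
obs = np.array([1.1, 2.0, 2.9, 4.2, 5.0])
print(posterior_weights(sims, obs))
```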
A modelling methodology to assess the effect of insect pest control on agro-ecosystems.
Wan, Nian-Feng; Ji, Xiang-Yun; Jiang, Jie-Xian; Li, Bo
2015-04-23
The extensive use of chemical pesticides for pest management in agricultural systems can entail risks to the complex ecosystems consisting of economic, ecological and social subsystems. To analyze the negative and positive effects of external or internal disturbances on complex ecosystems, we proposed an ecological two-sidedness approach which has been applied to the design of pest-controlling strategies for pesticide pollution management. However, catastrophe theory had not previously been applied within this approach. Thus, we used an approach integrating ecological two-sidedness with a multi-criterion evaluation method of catastrophe theory to analyze the complexity of agro-ecosystems disturbed by insecticides and to screen out the best insect pest-controlling strategy in cabbage production. The results showed that the order of the values of the evaluation index (RCC/CP) for three strategies in cabbage production was "applying frequency vibration lamps and environment-friendly insecticides 8 times" (0.80) < "applying trap devices and environment-friendly insecticides 9 times" (0.83) < "applying common insecticides 14 times" (1.08). The treatment "applying frequency vibration lamps and environment-friendly insecticides 8 times" was considered the best insect pest-controlling strategy in cabbage production in Shanghai, China.
A modelling methodology to assess the effect of insect pest control on agro-ecosystems
Wan, Nian-Feng; Ji, Xiang-Yun; Jiang, Jie-Xian; Li, Bo
2015-01-01
The extensive use of chemical pesticides for pest management in agricultural systems can entail risks to the complex ecosystems consisting of economic, ecological and social subsystems. To analyze the negative and positive effects of external or internal disturbances on complex ecosystems, we proposed an ecological two-sidedness approach which has been applied to the design of pest-controlling strategies for pesticide pollution management. However, catastrophe theory had not previously been applied within this approach. Thus, we used an approach integrating ecological two-sidedness with a multi-criterion evaluation method of catastrophe theory to analyze the complexity of agro-ecosystems disturbed by insecticides and to screen out the best insect pest-controlling strategy in cabbage production. The results showed that the order of the values of the evaluation index (RCC/CP) for three strategies in cabbage production was “applying frequency vibration lamps and environment-friendly insecticides 8 times” (0.80) < “applying trap devices and environment-friendly insecticides 9 times” (0.83) < “applying common insecticides 14 times” (1.08). The treatment “applying frequency vibration lamps and environment-friendly insecticides 8 times” was considered the best insect pest-controlling strategy in cabbage production in Shanghai, China. PMID:25906199
NASA Astrophysics Data System (ADS)
Perez, D.; Phinn, S. R.; Roelfsema, C. M.; Shaw, E. C.; Johnston, L.; Iguel, J.; Camacho, R.
2017-12-01
Primary production and calcification are important to measure and monitor over time, because of their fundamental roles in the carbon cycling and accretion of habitat structure for reef ecosystems. However, monitoring biogeochemical processes in coastal environments has been difficult due to complications in resolving differences in water optical properties from biological productivity and other sources (sediment, dissolved organics, etc.). This complicates application of algorithms developed for satellite image data from open ocean conditions, and requires alternative approaches. This project applied a cross-disciplinary approach, using established methods for monitoring productivity in terrestrial environments to coral reef systems. Availability of regularly acquired high spatial (< 5m pixels), multispectral satellite imagery has improved mapping and monitoring capabilities for shallow, marine environments such as seagrass and coral reefs. There is potential to further develop optical models for remote sensing applications to estimate and monitor reef system processes, such as primary productivity and calcification. This project collected field measurements of spectral absorptance and primary productivity and calcification rates for two reef systems: Heron Reef, southern Great Barrier Reef and Saipan Lagoon, Commonwealth of the Northern Mariana Islands. Field data were used to parameterize a light-use efficiency (LUE) model, estimating productivity from absorbed photosynthetically active radiation. The LUE model has been successfully applied in terrestrial environments for the past 40 years, and could potentially be used in shallow, marine environments. The model was used in combination with a map of benthic community composition produced from objective based image analysis of WorldView 2 imagery. Light-use efficiency was measured for functional groups: coral, algae, seagrass, and sediment. However, LUE was overestimated for sediment, which led to overestimation of productivity for the mapped area. This was due to differences in spatial and temporal resolution of field data used in the model. The limitations and application of the LUE model to coral reef environments will be presented.
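The light-use efficiency logic referred to above reduces to production = LUE × fAPAR × PAR, aggregated over the mapped benthic classes. The sketch below applies that relation to a few illustrative pixels; the cover classes echo the abstract, but the LUE, fAPAR, PAR and area values are invented placeholders.

```python
# Benthic classes mapped from the WorldView-2 classification (illustrative values)
pixels = [
    {"cover": "coral",    "area_m2": 4.0, "fapar": 0.65, "lue": 0.022},
    {"cover": "algae",    "area_m2": 4.0, "fapar": 0.55, "lue": 0.018},
    {"cover": "sediment", "area_m2": 4.0, "fapar": 0.20, "lue": 0.004},
]
par = 35.0   # incident photosynthetically active radiation, mol photons m-2 d-1

# Light-use efficiency model: production = LUE * fAPAR * PAR, summed over area
gpp = sum(p["lue"] * p["fapar"] * par * p["area_m2"] for p in pixels)
print(f"Estimated production for the mapped area: {gpp:.2f} (units follow the chosen LUE units)")
```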
Yang, Deming; Xu, Zhenming
2011-09-15
Crushing and separating technology is widely used in the recycling of waste printed circuit boards (PCBs). An automatic line for recycling waste PCBs, without negative environmental impact, was applied at industrial scale. The cyclic grinding and classification system for crushed waste PCB particles is the most important part of the automatic production line and determines the efficiency of the whole line. In this paper, a model for computing the process of this system was established, and a matrix analysis method was adopted. The results showed that good agreement can be achieved between the simulation model and the actual production line, and that the system is robust against disturbances. This model can provide a basis for the automatic process control of waste PCB production lines. With this model, many engineering problems can be reduced, such as insufficient dissociation of metals and nonmetals, over-pulverizing of particles, incomplete comminution, material plugging and equipment overheating. Copyright © 2011 Elsevier B.V. All rights reserved.
Development of a pheromone elution rate physical model
USDA-ARS?s Scientific Manuscript database
A first principle modeling approach is applied to available data describing the elution of semiochemicals from pheromone dispensers. These data include field data for 27 products developed by several manufacturers, including homemade devices, as well as laboratory data collected on three semiochemi...
Performance evaluation of the croissant production line with reparable machines
NASA Astrophysics Data System (ADS)
Tsarouhas, Panagiotis H.
2015-03-01
In this study, analytical probability models were developed for an automated, bufferless serial production system that consists of n machines in series with a common transfer mechanism and control system. Both the time to failure and the time to repair a failure are assumed to follow exponential distributions. Applying those models, the effect of system parameters on system performance in an actual croissant production line was studied. The production line consists of six workstations with different numbers of repairable machines in series. Mathematical models of the croissant production line have been developed using a Markov process. The strength of this study is in the classification of the whole system into states representing failures of different machines. Failure and repair data from the actual production environment have been used to estimate reliability and maintainability for each machine, each workstation, and the entire line, based on the analytical models. The analysis provides useful insight into the system's behaviour, helps to find inherent design faults and suggests optimal modifications to upgrade the system and improve its performance.
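If each machine's failure and repair times are exponential and machines fail and are repaired independently, a bufferless series line is up only when every machine is up, giving the familiar product-form availability shown below. This is a simplification of the paper's state-classified Markov model, and the rates used are illustrative, not the reported plant data.

```python
# Steady-state availability of a bufferless serial line: with exponential failure
# (rate lambda_i) and repair (rate mu_i) times, each machine is up a fraction
# mu_i / (lambda_i + mu_i) of the time, and the line is up only when all are up.
failure_rates = [0.010, 0.008, 0.012, 0.006, 0.009, 0.011]   # per hour, per workstation
repair_rates  = [0.50,  0.40,  0.60,  0.45,  0.55,  0.50]    # per hour

availabilities = [m / (l + m) for l, m in zip(failure_rates, repair_rates)]
line_availability = 1.0
for a in availabilities:
    line_availability *= a

print([round(a, 4) for a in availabilities], round(line_availability, 4))
```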
Characterization and prediction of chemical functions and weight fractions in consumer products.
Isaacs, Kristin K; Goldsmith, Michael-Rock; Egeghy, Peter; Phillips, Katherine; Brooks, Raina; Hong, Tao; Wambaugh, John F
2016-01-01
Assessing exposures from the thousands of chemicals in commerce requires quantitative information on the chemical constituents of consumer products. Unfortunately, gaps in available composition data prevent assessment of exposure to chemicals in many products. Here we propose filling these gaps via consideration of chemical functional role. We obtained function information for thousands of chemicals from public sources and used a clustering algorithm to assign chemicals into 35 harmonized function categories (e.g., plasticizers, antimicrobials, solvents). We combined these functions with weight fraction data for 4115 personal care products (PCPs) to characterize the composition of 66 different product categories (e.g., shampoos). We analyzed the combined weight fraction/function dataset using machine learning techniques to develop quantitative structure property relationship (QSPR) classifier models for 22 functions and for weight fraction, based on chemical-specific descriptors (including chemical properties). We applied these classifier models to a library of 10196 data-poor chemicals. Our predictions of chemical function and composition will inform exposure-based screening of chemicals in PCPs for combination with hazard data in risk-based evaluation frameworks. As new information becomes available, this approach can be applied to other classes of products and the chemicals they contain in order to provide essential consumer product data for use in exposure-based chemical prioritization.
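A minimal stand-in for one of the QSPR function classifiers might look like the snippet below: a random-forest classifier trained on chemical descriptors to predict membership in a single function category. The descriptors, labels and model choice are synthetic assumptions for illustration; the published models use curated descriptor sets and the harmonized function categories described above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# toy chemical descriptors (e.g. logKow, molecular weight, water solubility) and a
# binary label for one functional role ("is this chemical a plasticizer?")
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic, stand-in labels

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```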
Application of Kansei engineering and data mining in the Thai ceramic manufacturing
NASA Astrophysics Data System (ADS)
Kittidecha, Chaiwat; Yamada, Koichi
2018-01-01
Ceramics are among the most competitive products in Thailand. Many Thai ceramic companies are attempting to understand customer needs and perceptions in order to make favoured products. Knowing customer needs is the target of designers, who must develop products that satisfy customers. This research applies Kansei Engineering (KE) and Data Mining (DM) to the customer-driven product design process. KE can translate customer emotions into product attributes; the method determines the relationships between customer feelings, or Kansei words, and design attributes. The J48 decision tree and class association rules, implemented through the Waikato Environment for Knowledge Analysis (WEKA) software, are used to generate a predictive model and to find appropriate rules. In this experiment, emotion scores were rated by 37 participants for the training data and 16 participants for the test data. Six Kansei words were selected, namely attractive, ease of drinking, ease of handling, quality, modern and durable, and 10 mugs were selected as product samples. The results of this study indicate that the proposed models and rules can identify the product design elements affecting customer emotions. Finally, this study provides a useful understanding of the application of DM in KE and can be applied to a variety of design cases.
Economic and environmental optimization of waste treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Münster, M.; Ravn, H.; Hedegaard, K.
2015-04-15
Highlights: • Optimizing waste treatment by incorporating LCA methodology. • Applying different objectives (minimizing costs or GHG emissions). • Prioritizing multiple objectives given different weights. • Optimum depends on objective and assumed displaced electricity production. - Abstract: This article presents the new systems engineering optimization model, OptiWaste, which incorporates a life cycle assessment (LCA) methodology and captures important characteristics of waste management systems. As part of the optimization, the model identifies the most attractive waste management options. The model renders it possible to apply different optimization objectives such as minimizing costs or greenhouse gas emissions or to prioritize several objectives given different weights. A simple illustrative case is analysed, covering alternative treatments of one tonne of residual household waste: incineration of the full amount or sorting out organic waste for biogas production for either combined heat and power generation or as fuel in vehicles. The case study illustrates that the optimal solution depends on the objective and assumptions regarding the background system – illustrated with different assumptions regarding displaced electricity production. The article shows that it is feasible to combine LCA methodology with optimization. Furthermore, it highlights the need for including the integrated waste and energy system into the model.
Consumer product chemical weight fractions from ingredient lists.
Isaacs, Kristin K; Phillips, Katherine A; Biryol, Derya; Dionisio, Kathie L; Price, Paul S
2018-05-01
Assessing human exposures to chemicals in consumer products requires composition information. However, comprehensive composition data for products in commerce are not generally available. Many consumer products have reported ingredient lists that are constructed using specific guidelines. A probabilistic model was developed to estimate quantitative weight fraction (WF) values that are consistent with the rank of an ingredient in the list, the number of reported ingredients, and labeling rules. The model provides the mean, median, and 95% upper and lower confidence limit WFs for ingredients of any rank in lists of any length. WFs predicted by the model compared favorably with those reported on Material Safety Data Sheets. Predictions for chemicals known to provide specific functions in products were also found to reasonably agree with reported WFs. The model was applied to a selection of publicly available ingredient lists, thereby estimating WFs for 1293 unique ingredients in 1123 products in 81 product categories. Predicted WFs, although less precise than reported values, can be estimated for large numbers of product-chemical combinations and thus provide a useful source of data for high-throughput or screening-level exposure assessments.
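One way to see how rank in an ingredient list constrains weight fraction is a small Monte Carlo exercise: draw many random compositions that sum to one, sort each in descending order to mimic list order, and summarise the fraction at each rank. The flat Dirichlet prior below is an assumption of this sketch; the published model additionally encodes labelling rules such as the 1% threshold and reports confidence limits per rank and list length.

```python
import numpy as np

def rank_weight_fractions(n_ingredients, n_draws=20000, seed=0):
    """Monte Carlo sketch: draw random compositions that sum to 1, sort them in
    descending order (mirroring the list-order labelling convention), and
    summarise the weight fraction implied by each list position."""
    rng = np.random.default_rng(seed)
    draws = rng.dirichlet(np.ones(n_ingredients), size=n_draws)
    draws.sort(axis=1)
    draws = draws[:, ::-1]                       # descending = ingredient-list order
    return {"median": np.median(draws, axis=0),
            "lcl95": np.percentile(draws, 2.5, axis=0),
            "ucl95": np.percentile(draws, 97.5, axis=0)}

summary = rank_weight_fractions(n_ingredients=8)
print(np.round(summary["median"], 3))   # median weight fraction for ranks 1..8
```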
Nuclear Forensics and Radiochemistry: Reaction Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rundberg, Robert S.
In the intense neutron flux of a nuclear explosion the production of isotopes may occur through successive neutron-induced reactions. The pathway to these isotopes illustrates both the complexity of the problem and the need for high-quality nuclear data. The growth and decay of radioactive isotopes can follow a similarly complex network. The Bateman equation will be described and modified to apply to the transmutation of isotopes in a high flux reactor. An alternative model of growth and decay, the GD code, which can be applied to fission products, will also be described.
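For reference, the classical Bateman solution for a linear chain N1 → N2 → … → Nn with only the first member present initially is reproduced below; the high-flux modification mentioned in the abstract can be thought of as replacing each decay constant with an effective removal rate that includes neutron capture. The capture term shown is a schematic assumption, not the specific formulation of the cited work.

```latex
% Bateman solution for the chain N_1 -> N_2 -> ... -> N_n (only N_1 present at t = 0)
N_n(t) = N_1(0)\,\Bigl(\prod_{i=1}^{n-1}\lambda_i\Bigr)\,
         \sum_{i=1}^{n}\frac{e^{-\lambda_i t}}
         {\prod_{\substack{j=1\\ j\neq i}}^{n}(\lambda_j-\lambda_i)}
% In a high neutron flux \phi, each \lambda_i is effectively replaced by
% \lambda_i + \sigma_i\phi, adding transmutation by neutron capture to decay.
```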
Inverse geothermal modelling applied to Danish sedimentary basins
NASA Astrophysics Data System (ADS)
Poulsen, Søren E.; Balling, Niels; Bording, Thue S.; Mathiesen, Anders; Nielsen, Søren B.
2017-10-01
This paper presents a numerical procedure for predicting subsurface temperatures and heat-flow distribution in 3-D using inverse calibration methodology. The procedure is based on a modified version of the groundwater code MODFLOW by taking advantage of the mathematical similarity between confined groundwater flow (Darcy's law) and heat conduction (Fourier's law). Thermal conductivity, heat production and exponential porosity-depth relations are specified separately for the individual geological units of the model domain. The steady-state temperature model includes a model-based transient correction for the long-term palaeoclimatic thermal disturbance of the subsurface temperature regime. Variable model parameters are estimated by inversion of measured borehole temperatures with uncertainties reflecting their quality. The procedure facilitates uncertainty estimation for temperature predictions. The modelling procedure is applied to Danish onshore areas containing deep sedimentary basins. A 3-D voxel-based model, with 14 lithological units from surface to 5000 m depth, was built from digital geological maps derived from combined analyses of reflection seismic lines and borehole information. Matrix thermal conductivity of model lithologies was estimated by inversion of all available deep borehole temperature data and applied together with prescribed background heat flow to derive the 3-D subsurface temperature distribution. Modelled temperatures are found to agree very well with observations. The numerical model was utilized for predicting and contouring temperatures at 2000 and 3000 m depths and for two main geothermal reservoir units, the Gassum (Lower Jurassic-Upper Triassic) and Bunter/Skagerrak (Triassic) reservoirs, both currently utilized for geothermal energy production. Temperature gradients to depths of 2000-3000 m are generally around 25-30 °C km⁻¹, locally up to about 35 °C km⁻¹. Large regions have geothermal reservoirs with characteristic temperatures ranging from ca. 40-50 °C at 1000-1500 m depth to ca. 80-110 °C at 2500-3500 m; at the deeper parts, however, permeability is most likely too low for non-stimulated production.
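The Darcy-Fourier analogy that lets a groundwater code such as MODFLOW solve heat conduction can be stated in one line; the correspondence below is the standard textbook mapping and is given here only to make the repurposing explicit.

```latex
\underbrace{\mathbf{q}_w = -K\,\nabla h}_{\text{Darcy's law (groundwater flow)}}
\quad\longleftrightarrow\quad
\underbrace{\mathbf{q}_h = -\lambda\,\nabla T}_{\text{Fourier's law (heat conduction)}}
% hydraulic head h <-> temperature T, hydraulic conductivity K <-> thermal
% conductivity \lambda, and fluid sources/sinks <-> radiogenic heat production.
```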
Quality Assurance in the Presence of Variability
NASA Astrophysics Data System (ADS)
Lauenroth, Kim; Metzger, Andreas; Pohl, Klaus
Software Product Line Engineering (SPLE) is a reuse-driven development paradigm that has been applied successfully in information system engineering and other domains. Quality assurance of the reusable artifacts of the product line (e.g. requirements, design, and code artifacts) is essential for successful product line engineering. As those artifacts are reused in several products, a defect in a reusable artifact can affect several products of the product line. A central challenge for quality assurance in product line engineering is how to consider product line variability. Since the reusable artifacts contain variability, quality assurance techniques from single-system engineering cannot directly be applied to those artifacts. Therefore, different strategies and techniques have been developed for quality assurance in the presence of variability. In this chapter, we describe those strategies and discuss in more detail one of those strategies, the so called comprehensive strategy. The comprehensive strategy aims at checking the quality of all possible products of the product line and thus offers the highest benefits, since it is able to uncover defects in all possible products of the product line. However, the central challenge for applying the comprehensive strategy is the complexity that results from the product line variability and the large number of potential products of a product line. In this chapter, we present one concrete technique that we have developed to implement the comprehensive strategy that addresses this challenge. The technique is based on model checking technology and allows for a comprehensive verification of domain artifacts against temporal logic properties.
Projected refined product balances in key Latin American countries: A preliminary examination
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-06-01
Over the years, the East-West Center (EWC) has developed considerable expertise in refinery modeling, especially in the area of forecasting product balances for countries, given planned capacity changes, changes in product demand, changes in crude slates, and changes in product specifications. This expertise has been applied on an ongoing basis to the major refiners in the Middle East and the Asia-Pacific region, along with the US West Coast as a region in its own right. Refinery modeling in these three areas has been ongoing for nearly 15 years at the Center, and the tools and information sources are now well developed. To date, the EWC has not applied these tools to Latin America. Although research on Latin America has been an ongoing area of concern at the Center in recent years, the information gathered to date is still not of the level of detail nor quality available for other areas. The modeling efforts undertaken in this report are of a "baseline" nature, designed to outline the major issues, attempt a first cut at emerging product balances, and, above all, to elicit commentary from those directly involved in the oil industry in the key countries modeled. Our experience in other regions has shown that it takes a few years' dialogue with refiners and government planners in individual countries to develop a reliable database, as well as the insights into operational constraints and practices that make accurate modeling possible. This report is no more than a first step down the road.
Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M
2009-10-15
A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for processes involving biological products, usually require the manufacturer to demonstrate that the safety and efficacy of the product remains unchanged by new or additional clinical testing. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space, which means that a manufacturer can optimize his process within the submitted ranges after the product has entered the market, which allows flexible processes. In this article, the applicability of this concept of the process design space is investigated for the cultivation process step for a vaccine against whooping cough disease. An experimental design (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data of all processes are integrated in a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and process design space can be applied for an undefined biological product such as a whole cell vaccine. The approach chosen for model development described here, allows on line monitoring and control of cultivation batches in order to assure in real time that a process is running within the process design space.
Wang, H Holly; Tan, Tih Koon; Schotzko, R Thomas
2007-02-01
Potato production and processing are very important activities in the agricultural economy of the Pacific Northwest. Part of the reason for the development of this industry has been the availability of water for both growing and processing. A great amount of water is used in processing potato products, such as frozen French fries, and the waste water is a pollutant because it contains high levels of nitrate and other nutrients. Using this waste water to irrigate the fields can be a suitable disposal method. Field application will reduce potato fertilizer costs, but it can also cause underground water contamination if over-applied to the field. In this econometric study, we used field data associated with current waste water applications in central Washington to examine the yield response as well as the soil nitrogen content response to waste water applications. Our results from the production model show that both water and nitrogen positively affect crop yields at the current levels of application, but potassium has been over applied. This implies that replacing some waste water with fresh water and nitrogen fertilizer will increase production. The environmental model results show that applying more nitrogen to the soil leads to more movement below the root zone. The results also suggest that higher crop yields lead to less nitrogen in the soil, and applying more water increases crop yields, which can reduce the nitrogen left in the soil. Therefore, relative to the current practice, waste water application rates should be reduced and supplemented with fresh water to enhance nitrogen use by plants and reduce residual nitrogen in the soil.
Effective management of construction company in terms of linguistic communication
NASA Astrophysics Data System (ADS)
Shirina, Elena; Gaybarian, Olga; Myasischev, Georg
2017-10-01
The research presented here has been carried out over the years in the field of increasing the effectiveness of management in a construction company in terms of applied linguistics. The aim of this work is to share with the scientific community some practical findings from applying process management technology in the company, in particular methods of linguistic efficiency that consider the factors of the linguistic personality of the employee. The study describes the applied linguistic and managerial models and views, and the practical results of their application, in order to assess production sustainability and minimize losses. The authors put the developed technology to practical use, and the article presents the results of this application. The authors continue research in this direction, aiming to improve the production effectiveness of the proposed technologies and to eliminate some identified drawbacks.
Transforming Multidisciplinary Customer Requirements to Product Design Specifications
NASA Astrophysics Data System (ADS)
Ma, Xiao-Jie; Ding, Guo-Fu; Qin, Sheng-Feng; Li, Rong; Yan, Kai-Yin; Xiao, Shou-Ne; Yang, Guang-Wu
2017-09-01
With the increasing complexity of mechatronic products, it is necessary to involve multidisciplinary design teams; thus, traditional customer requirements modeling for a single-discipline team becomes difficult to apply in a multidisciplinary team and project, since team members with various disciplinary backgrounds may have different interpretations of the customers' requirements. A new synthesized multidisciplinary customer requirements modeling method is provided for obtaining and describing a common understanding of customer requirements (CRs) and, more importantly, transferring them into detailed and accurate product design specifications (PDS) to interact with different team members effectively. A case study of designing a high-speed train verifies the rationality and feasibility of the proposed multidisciplinary requirement modeling method for complex mechatronic product development. This research offers guidance for realizing customer-driven personalized customization of complex mechatronic products.
A methodology for the assessment of inhalation exposure to aluminium from antiperspirant sprays.
Schwarz, Katharina; Pappa, Gerlinde; Miertsch, Heike; Scheel, Julia; Koch, Wolfgang
2018-04-01
Inhalation exposure can occur accidentally when using cosmetic spray products. Usually, a tiered approach is applied for exposure assessment, starting with rather conservative, simplistic calculation models that may be improved with measured data and more refined modelling. Here we report on an advanced methodology to mimic in-use conditions for antiperspirant spray products and provide a more accurate estimate of the amount of aluminium possibly inhaled and taken up systemically, thus contributing to the overall body burden. Four typical products were sprayed onto a skin surrogate in defined rooms. For aluminium, size-related aerosol release fractions, i.e. inhalable, thoracic and respirable, were determined by a mass balance method taking droplet maturation into account. These data were included in a simple two-box exposure model, allowing calculation of the inhaled aluminium dose over 12 min. Systemic exposure doses were calculated for exposure of the deep lung and the upper respiratory tract using the Multiple Path Particle Deposition (MPPD) model. The total systemically available dose of aluminium was in all cases found to be less than 0.5 µg per application. This study demonstrates that refining the input data of the two-box exposure model with measured data on released airborne aluminium is a valuable approach for analysing the contribution of antiperspirant spray inhalation to total aluminium exposure as part of the overall risk assessment. The methodology can also be applied to other exposure modelling approaches for spray products and can be adapted to other similar use scenarios.
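As a rough illustration of the two-box calculation described above, the following Python sketch integrates a generic near-field/far-field mass balance and accumulates the inhaled dose over the 12-minute window; all volumes, air flows, the breathing rate and the emitted aluminium mass are placeholder assumptions, not the study's measured release fractions.

import numpy as np

# Illustrative parameters (assumptions, not the study's measured values)
V_nf, V_ff = 1.0, 9.0          # near-field / far-field volumes (m^3)
Q_nf = 5.0 / 60.0              # near-field <-> far-field air exchange (m^3/s)
Q_vent = 20.0 / 3600.0         # room ventilation flow (m^3/s)
emitted_mass = 50.0            # airborne (inhalable) aluminium released per use (ug), assumed
spray_time = 10.0              # emission duration (s)
breathing_rate = 1.3e-2 / 60.0 # inhalation rate (m^3/s), ~0.78 m^3/h

dt, t_end = 0.1, 12 * 60.0     # integrate over the 12-min exposure window
c_nf = c_ff = 0.0              # box concentrations (ug/m^3)
inhaled = 0.0                  # cumulative inhaled dose (ug)

for step in range(int(t_end / dt)):
    t = step * dt
    emission = emitted_mass / spray_time if t < spray_time else 0.0
    # mass balances for the two boxes
    dc_nf = (emission + Q_nf * (c_ff - c_nf)) / V_nf
    dc_ff = (Q_nf * (c_nf - c_ff) - Q_vent * c_ff) / V_ff
    c_nf += dc_nf * dt
    c_ff += dc_ff * dt
    inhaled += breathing_rate * c_nf * dt  # person assumed to stay in the near field

print(f"Inhaled aluminium dose over 12 min: {inhaled:.2f} ug")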
Informing Aerosol Transport Models With Satellite Multi-Angle Aerosol Measurements
NASA Technical Reports Server (NTRS)
Limbacher, J.; Patadia, F.; Petrenko, M.; Martin, M. Val; Chin, M.; Gaitley, B.; Garay, M.; Kalashnikova, O.; Nelson, D.; Scollo, S.
2011-01-01
As the aerosol products from the NASA Earth Observing System's Multi-angle Imaging SpectroRadiometer (MISR) mature, we are placing greater focus on ways of using the aerosol amount and type data products, and aerosol plume heights, to constrain aerosol transport models. We have demonstrated the ability to map aerosol air-mass types regionally, and have identified product upgrades required to apply them globally, including the need for a quality flag indicating the aerosol type information content, which varies depending upon retrieval conditions. We have shown that MISR aerosol type can distinguish smoke from dust, and volcanic ash from sulfate and water particles, and can identify qualitative differences in mixtures of smoke, dust, and pollution aerosol components in urban settings. We have demonstrated the use of stereo imaging to map smoke, dust, and volcanic effluent plume injection height, and the combination of MISR and MODIS aerosol optical depth maps to constrain wildfire smoke source strength. This talk will briefly highlight where we stand on these applications, with emphasis on the steps we are taking toward applying these capabilities to constraining aerosol transport models, planet-wide.
A Model for the Creation of Human-Generated Metadata within Communities
ERIC Educational Resources Information Center
Brasher, Andrew; McAndrew, Patrick
2005-01-01
This paper considers situations for which detailed metadata descriptions of learning resources are necessary, and focuses on human generation of such metadata. It describes a model which facilitates human production of good quality metadata by the development and use of structured vocabularies. Using examples, this model is applied to single and…
Applied metrology in the production of superconducting model magnets for particle accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferradas Troitino, Jose; Bestmann, Patrick; Bourcey, Nicolas
2017-12-22
The production of superconducting magnets for particle accelerators involves high precision assemblies and tight tolerances, in order to achieve the requirements for their appropriate performance. It is therefore essential to have a strict control and traceability over the geometry of each component of the system, and also to be able to compensate possible inherent deviations coming from the production process.
Bangbang Zhang; Gary Feng; Lajpat R. Ahuja; Xiangbin Kong; Ying Ouyang; Ardeshir Adeli; Johnie N. Jenkins
2018-01-01
Crop production as a function of water use or water applied, called the crop water production function (CWPF), is a useful tool for irrigation planning, design and management. However, these functions are not only crop and variety specific, they also vary with soil types and climatic conditions (locations). Derivation of multi-year average CWPFs through field...
Software Engineering Laboratory (SEL) cleanroom process model
NASA Technical Reports Server (NTRS)
Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon
1991-01-01
The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.
NASA Astrophysics Data System (ADS)
Shao, G.; Gallion, J.; Fei, S.
2016-12-01
Sound forest aboveground biomass estimation is required to monitor diverse forest ecosystems and their impacts on the changing climate. Lidar-based regression models have provided promising biomass estimates in most forest ecosystems. However, considerable uncertainties in biomass estimates have been reported for temperate hardwood and hardwood-dominated mixed forests. Varied site productivity in temperate hardwood forests largely diversifies height and diameter growth rates, which significantly reduces the correlation between tree height and diameter at breast height (DBH) in mature and complex forests. It is, therefore, difficult to use height-based lidar metrics to predict DBH-based field-measured biomass through a simple regression model regardless of the variation in site productivity. In this study, we established a multi-dimensional nonlinear regression model incorporating lidar metrics and site productivity classes derived from soil features. In the regression model, lidar metrics provided horizontal and vertical structural information and productivity classes differentiated good and poor forest sites. The selection and combination of lidar metrics are discussed. Multiple regression models were employed and compared. Uncertainty analysis was applied to the best-fit model. The effects of site productivity on the lidar-based biomass model were addressed.
Improvements and validation of the erythropoiesis control model for bed rest simulation
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1977-01-01
The most significant improvement in the model is the explicit formulation of separate elements representing erythropoietin production and red cell production. Other modifications include bone marrow time-delays, capability to shift oxyhemoglobin affinity and an algorithm for entering experimental data as time-varying driving functions. An area of model development is suggested by applying the model to simulating onset, diagnosis and treatment of a hematologic disorder. Recommendations for further improvements in the model and suggestions for experimental application are also discussed. A detailed analysis of the hematologic response to bed rest including simulation of the recent Baylor Medical College bed rest studies is also presented.
Monitoring Crop Productivity over the U.S. Corn Belt using an Improved Light Use Efficiency Model
NASA Astrophysics Data System (ADS)
Wu, X.; Xiao, X.; Zhang, Y.; Qin, Y.; Doughty, R.
2017-12-01
Large-scale monitoring of crop yield is of great significance for forecasting food production and prices and ensuring food security. Satellite data provide temporally and spatially continuous information that, by itself or in combination with other data or models, makes it possible to monitor and understand agricultural productivity regionally. In this study, we first used an improved light use efficiency model, the Vegetation Photosynthesis Model (VPM), to simulate gross primary production (GPP). Model evaluation showed that the simulated GPP (GPPVPM) captured well the spatio-temporal variation of GPP derived from FLUXNET sites. We then applied GPPVPM to monitor crop productivity for corn and soybean over the U.S. Corn Belt and benchmarked it against county-level crop yield statistics. We found that the VPM-based approach provides good estimates (R2 = 0.88, slope = 1.03). We further show the impacts of climate extremes on crop productivity and carbon use efficiency. The study indicates the great potential of VPM for estimating crop yield and for understanding crop yield responses to climate variability and change.
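For readers unfamiliar with the light use efficiency formulation, the sketch below implements the generic VPM-style structure (GPP = eps0 x T_scalar x W_scalar x FPARchl x PAR); the scalar forms follow the commonly published VPM equations, but the constants and input values here are illustrative assumptions rather than the calibration used in this study.

import numpy as np

def vpm_gpp(par, fpar_chl, tair, lswi, lswi_max,
            eps0=0.05, t_min=-1.0, t_opt=25.0, t_max=45.0):
    # Temperature scalar (TEM-style form used in VPM), clipped to [0, 1]
    t_scalar = ((tair - t_min) * (tair - t_max)) / (
        ((tair - t_min) * (tair - t_max)) - (tair - t_opt) ** 2)
    t_scalar = np.clip(t_scalar, 0.0, 1.0)
    # Water scalar from the land surface water index (LSWI)
    w_scalar = (1.0 + lswi) / (1.0 + lswi_max)
    # Light use efficiency: GPP = eps_g * FPARchl * PAR
    eps_g = eps0 * t_scalar * w_scalar
    return eps_g * fpar_chl * par

# Toy 8-day inputs for a single pixel (illustrative numbers only)
par = np.array([95.0, 110.0, 120.0])   # incident PAR per period
fpar = np.array([0.45, 0.60, 0.70])    # fraction absorbed by chlorophyll
tair = np.array([18.0, 24.0, 29.0])    # air temperature, deg C
lswi = np.array([0.10, 0.18, 0.22])
print(vpm_gpp(par, fpar, tair, lswi, lswi_max=0.25))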
NASA Astrophysics Data System (ADS)
Sinaga, A. T.; Wangsaputra, R.
2018-03-01
The development of technology causes the needs for products and services to become increasingly complex, diverse, and fluctuating. This increases the level of inter-company dependency within production chains. To be able to compete, efficiency improvements need to be made collaboratively in the production chain network. One of the efforts to increase efficiency is to harmonize production and distribution activities in the production chain network. This paper describes the harmonization of production and distribution activities by applying a push-pull system and a supply hub in the production chain between two companies. The research methodology begins with conducting empirical and literature studies, formulating research questions, developing mathematical models, conducting trials and analyses, and drawing conclusions. The relationship between the two companies is described in an MINLP mathematical model with the total cost of the production chain as the objective function. Decisions generated by the mathematical model are the size of the production lot, the size of the delivery lot, the number of kanban, the frequency of delivery, and the number of understock and overstock lots.
Aung, Hnin W.; Henry, Susan A.
2013-01-01
Genome-scale metabolic models are built using information from an organism's annotated genome and, correspondingly, information on reactions catalyzed by the set of metabolic enzymes encoded by the genome. These models have been successfully applied to guide metabolic engineering to increase production of metabolites of industrial interest. Congruity between simulated and experimental metabolic behavior is influenced by the accuracy of the representation of the metabolic network in the model. In the interest of applying the consensus model of Saccharomyces cerevisiae metabolism for increased productivity of triglycerides, we manually evaluated the representation of fatty acid, glycerophospholipid, and glycerolipid metabolism in the consensus model (Yeast v6.0). These areas of metabolism were chosen due to their tight interconnection with triglyceride synthesis. Manual curation was facilitated by custom MATLAB functions that return information contained in the model for reactions associated with genes and metabolites within the stated areas of metabolism. Through manual curation, we have identified inconsistencies between information contained in the model and literature knowledge. These inconsistencies include incorrect gene-reaction associations, improper definition of substrates/products in reactions, inappropriate assignments of reaction directionality, nonfunctional β-oxidation pathways, and missing reactions relevant to the synthesis and degradation of triglycerides. Suggestions to amend these inconsistencies in the Yeast v6.0 model can be implemented through a MATLAB script provided in the Supplementary Materials, Supplementary Data S1 (Supplementary Data are available online at www.liebertpub.com/ind). PMID:24678285
NASA Astrophysics Data System (ADS)
Abendroth, Sven; Thaler, Jan; Klump, Jens; Schicks, Judith; Uddin, Mafiz
2014-05-01
In the context of the German joint project SUGAR (Submarine Gas Hydrate Reservoirs: exploration, extraction and transport) we conducted a series of experiments in the LArge Reservoir Simulator (LARS) at the German Research Centre for Geosciences Potsdam. These experiments allow us to investigate the formation and dissociation of hydrates under large-scale laboratory conditions. We performed an experiment similar to the field-test conditions of the production test in the Mallik gas hydrate field (Mallik 2L-38) in the Beaufort Mackenzie Delta of the Canadian Arctic. The aim of this experiment was to study the transport behavior of fluids in gas hydrate reservoirs during depressurization (see also Heeschen et al. and Priegnitz et al., this volume). The experimental results from LARS are used to provide details about processes inside the pressure vessel, to validate the models through history matching, and to feed back into the design of future experiments. In the LARS experiments the amount of methane produced from gas hydrates was much lower than expected. Previously published models predict a methane production rate higher than the one observed in experiments and field studies (Uddin et al. 2010; Wright et al. 2011). The authors of the aforementioned studies point out that the current modeling approach overestimates the gas production rate when modeling gas production by depressurization. They suggest that trapping of gas bubbles inside the porous medium is responsible for the reduced gas production rate, and that this behavior of multi-phase flow is not well explained by a "residual oil" model but rather resembles a "foamy oil" model. Our study applies Uddin's (2010) "foamy oil" model and combines it with history matches of our experiments in LARS. Our results indicate better agreement between experimental and model results when using the "foamy oil" model instead of conventional models of gas flow in water. References: Uddin M., Wright J.F. and Coombe D. (2010) - Numerical study of gas evolution and transport behaviors in natural gas hydrate reservoirs; CSUG/SPE 137439. Wright J.F., Uddin M., Dallimore S.R. and Coombe D. (2011) - Mechanisms of gas evolution and transport in a producing gas hydrate reservoir: an unconventional basis for successful history matching of observed production flow data; International Conference on Gas Hydrates (ICGH 2011).
Copula-based nonlinear modeling of the law of one price for lumber products
Barry K. Goodwin; Matthew T. Holt; Gülcan Önel; Jeffrey P. Prestemon
2018-01-01
This paper proposes an alternative and potentially novel approach to analyzing the law of one price in a nonlinear fashion. Copula-based models that consider the joint distribution of prices separated by space are developed and applied to weekly...
[Analysis of the model OPM3® application and results for health area].
Augusto Dos Santos, Luis; de Fátima Marin, Heimar
2011-01-01
This research sought to analyze whether a questionnaire model created by an international project management community is applicable to health organizations. The OPM3® (Organizational Project Management Maturity Model) was created so that organizations of any area or size can identify the presence or absence of good management practices. The aim of applying this model is always to evaluate the organization, not the interviewee. This paper presents the results of employing the model in an organization that provides information technology products and services for the health area. The study verified that the model is rapidly applicable and that the analyzed organization has an expressive number of good practices.
An Evaluation of Causal Modeling Applied to Educational Productivity in Mathematics.
ERIC Educational Resources Information Center
Harnisch, Delwyn L.; Dunbar, Stephen B.
To probe a psychological theory of educational productivity, background measures along with mathematics test scores and motivational measures of over 7,000 students (9-, 13- and 17-year olds from National Assessment of Educational Progress samples) were statistically related to each other and to indicators of constructs that prior research shows…
Production ecology of Thuja occidentalis
Philip V. Hofmeyer; Robert S. Seymour; Laura S. Kenefic
2010-01-01
Equations to predict branch and tree leaf area, foliar mass, and stemwood volume were developed from 25 destructively sampled northern white-cedar (Thuja occidentalis L.) trees, a species whose production ecology has not been studied. Resulting models were applied to a large sample of 296 cored trees from 60 sites stratified across a soil gradient...
Unit Price Scaling Trends for Chemical Products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qi, Wei; Sathre, Roger; William R. Morrow, III
2015-08-01
To facilitate early-stage life-cycle techno-economic modeling of emerging technologies, here we identify scaling relations between unit price and sales quantity for a variety of chemical products in three categories - metal salts, organic compounds, and solvents. We collect price quotations for lab-scale and bulk purchases of chemicals from both U.S. and Chinese suppliers. We apply a log-log linear regression model to estimate the price discount effect. Using the median discount factor of each category, one can infer bulk prices of products for which only lab-scale prices are available. We conduct out-of-sample tests showing that most of the price proxies deviate from their actual reference prices by a factor of less than ten. We also apply the bootstrap method to determine whether a sample median discount factor should be accepted for price approximation. We find that appropriate discount factors for metal salts and for solvents are both -0.56, while that for organic compounds is -0.67 and is less representative due to the greater extent of product heterogeneity within this category.
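A minimal sketch of the log-log regression step, assuming toy price quotations rather than the collected data: fit log10(price) against log10(quantity) and use the fitted slope (the discount factor) to project a bulk price from a lab-scale quote.

import numpy as np

# Illustrative quotations (quantity in kg, unit price in $/kg); not the study's data
quantity = np.array([0.1, 0.5, 1.0, 5.0, 25.0, 100.0, 1000.0])
unit_price = np.array([420.0, 180.0, 120.0, 55.0, 22.0, 11.0, 3.5])

# log-log linear model: log10(price) = a + b * log10(quantity)
b, a = np.polyfit(np.log10(quantity), np.log10(unit_price), 1)
print(f"discount factor (slope) b = {b:.2f}")

# Infer a bulk price from a lab-scale quote using the fitted slope
lab_qty, lab_price, bulk_qty = 0.5, 180.0, 1000.0
bulk_price = lab_price * (bulk_qty / lab_qty) ** b
print(f"approximate bulk unit price: ${bulk_price:.2f}/kg")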
Atomic scale simulations for improved CRUD and fuel performance modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andersson, Anders David Ragnar; Cooper, Michael William Donald
2017-01-06
A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). Here are some results from publications in 2016 carried out using the CASL allocation at LANL.
Optimal policy for mitigating emissions in the European transport sector
NASA Astrophysics Data System (ADS)
Leduc, Sylvain; Piera, Patrizio; Sennai, Mesfun; Igor, Staritsky; Berien, Elbersen; Tijs, Lammens; Florian, Kraxner
2017-04-01
A geographically explicit techno-economic model, BeWhere (www.iiasa.ac.at/bewhere), has been developed at the European scale (Europe 28, the Balkan countries, Turkey, Moldova and Ukraine) at a 40 km grid size to assess the potential of bioenergy from non-food feedstock. Based on minimization of the supply chain cost from feedstock collection to final energy product distribution, the model identifies the optimal bioenergy production plants in terms of spatial location, technology and capacity. The feedstocks of interest are woody biomass (divided into eight types of conifers and non-conifers) and five different crop residues. For each type of feedstock, one or multiple technologies can be applied for heat, electricity or biofuel production. The model is run for different policy tools such as a carbon cost, biofuel support, or subsidies, and the mix of technologies and biomass is optimized to reach a production cost competitive with the current fossil-fuel-based reference system. From this approach, the optimal mix of policy tools that can be applied country-wide in Europe will be identified. Preliminary results show that a high carbon tax and biofuel support contribute to the development of large-scale biofuel production based on woody biomass plants mainly located in the northern part of Europe. Finally, the highest emission reduction is reached with low biofuel support and a high carbon tax evenly distributed across Europe.
Dietary Impact of Adding Potassium Chloride to Foods as a Sodium Reduction Technique.
van Buren, Leo; Dötsch-Klerk, Mariska; Seewi, Gila; Newson, Rachel S
2016-04-21
Potassium chloride is a leading reformulation technology for reducing sodium in food products. As sodium intake globally exceeds guidelines, this technology is beneficial; however, its potential impact on potassium intake is unknown. Therefore, a modeling study was conducted using Dutch National Food Survey data to examine the dietary impact of reformulation (n = 2106). Product-specific sodium criteria, designed to enable a maximum daily sodium chloride intake of 5 grams/day, were applied to all foods consumed in the survey. The impact of replacing 20%, 50% and 100% of sodium chloride in each product with potassium chloride was modeled. At baseline, median potassium intake was 3334 mg/day. An increase in median potassium intake of 453 mg/day was seen when a 20% replacement was applied, 674 mg/day with a 50% replacement scenario and 733 mg/day with a 100% replacement scenario. Reformulation had the largest impact on bread, processed fruit and vegetables, snacks and processed meat. Replacement of sodium chloride by potassium chloride, particularly in key contributing product groups, would result in better compliance with potassium intake guidelines (3510 mg/day). Moreover, it could be considered safe for the general adult population, as intake remains compliant with EFSA guidelines. Based on the current modeling, potassium chloride presents as a valuable, safe replacer for sodium chloride in food products.
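The per-product arithmetic behind such replacement scenarios can be sketched as below; the potassium mass fraction of KCl is standard chemistry, while the product-group sodium chloride intakes and the 1:1 weight-for-weight replacement are illustrative assumptions, not the survey values.

# Fraction of potassium by mass in KCl and of sodium in NaCl
K_IN_KCL = 39.10 / 74.55      # ~0.524
NA_IN_NACL = 22.99 / 58.44    # ~0.393

def added_potassium_mg(nacl_mg_per_day, replacement_fraction):
    """Potassium added when a fraction of a product's NaCl is replaced
    1:1 by weight with KCl (a common reformulation assumption)."""
    kcl_mg = nacl_mg_per_day * replacement_fraction
    return kcl_mg * K_IN_KCL

# Illustrative daily NaCl intake from reformulated product groups (mg/day)
nacl_from_products = {"bread": 2000, "processed meat": 900, "snacks": 500}
for scenario in (0.2, 0.5, 1.0):
    extra_k = sum(added_potassium_mg(m, scenario) for m in nacl_from_products.values())
    print(f"{int(scenario * 100)}% replacement: +{extra_k:.0f} mg potassium/day")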
Yehia, Ali M; Arafa, Reham M; Abbas, Samah S; Amer, Sawsan M
2016-01-15
Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were developed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering methods are ratio-manipulating spectrophotometric methods that were satisfactorily applied for the selective determination of CFQ within a linear range of 5.0-40.0 μg mL(-1). Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Twenty-five experimentally designed synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in the quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. The developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method. Copyright © 2015 Elsevier B.V. All rights reserved.
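As an illustration of the benchmark chemometric step, the sketch below calibrates a Partial Least Squares model on synthetic mixture spectra and predicts an unknown; the spectra, concentrations and number of latent variables are invented stand-ins for the designed mixtures.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic calibration set: 25 mixture spectra (200 wavelengths) and the
# corresponding CFQ concentrations -- stand-ins for the designed mixtures
n_mix, n_wl = 25, 200
concentrations = rng.uniform(5.0, 40.0, size=(n_mix, 1))   # ug/mL
pure_spectrum = np.exp(-0.5 * ((np.arange(n_wl) - 90) / 15.0) ** 2)
spectra = concentrations * pure_spectrum + rng.normal(0, 0.05, (n_mix, n_wl))

pls = PLSRegression(n_components=3)
pls.fit(spectra, concentrations)

# Predict an "unknown" sample
unknown = 22.0 * pure_spectrum + rng.normal(0, 0.05, n_wl)
print(pls.predict(unknown.reshape(1, -1)))   # expected close to 22 ug/mL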
Operator agency in process intervention: tampering versus application of tacit knowledge
NASA Astrophysics Data System (ADS)
Van Gestel, P.; Pons, D. J.; Pulakanam, V.
2015-09-01
Statistical process control (SPC) theory takes a negative view of adjustment of process settings, which is termed tampering. In contrast, quality and lean programmes actively encourage operators to intervene and exercise personal agency in the improvement of production outcomes. This creates a conflict that requires operator judgement: How does one differentiate between unnecessary tampering and needful intervention? A further difficulty is that operators apply tacit knowledge to such judgements. There is a need to determine where in a given production process the operators are applying tacit knowledge, and whether this is hindering or aiding quality outcomes. The work involved the conjoint application of systems engineering, statistics, and knowledge management principles, in the context of a case study. Systems engineering was used to create a functional model of a real plant. Actual plant data were analysed with the statistical methods of ANOVA, feature selection, and link analysis. This identified the variables to which the output quality was most sensitive. These key variables were mapped back to the functional model. Fieldwork was then directed to those areas to prospect for operator judgement activities. A natural conversational approach was used to determine where and how operators were applying judgement, in contrast with the interrogative approach of conventional knowledge management. Data are presented for a case study of a meat rendering plant. The results identify specific areas where operators' tacit knowledge and mental models contribute to quality outcomes and untangle the motivations behind their agency. Also evident is how novice and expert operators apply their knowledge differently. Novices were focussed on meeting throughput objectives, and their incomplete understanding of the plant characteristics led them to inadvertently sacrifice quality in the pursuit of productivity in certain situations. Operators' responses to the plant are affected by their individual mental models of the plant, which differ between operators and have variable validity. Their behaviour is also affected by differing interpretations of how their personal agency should be applied to the achievement of production objectives. The methodology developed here is an integration of systems engineering, statistical analysis, and knowledge management. It shows how to determine where in a given production process operator intervention is occurring, how it affects quality outcomes, and what tacit knowledge operators are using. It thereby assists continuous quality improvement processes in a different way to SPC. A second contribution is the provision of a novel methodology for knowledge management, one that circumvents the usual codification barriers to knowledge management.
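A minimal sketch of the variable-ranking step (identifying which process variables the output quality is most sensitive to), using ANOVA F-scores and mutual information on simulated plant data; the variables and the response are placeholders, not the rendering plant's measurements.

import numpy as np
from sklearn.feature_selection import f_regression, mutual_info_regression

rng = np.random.default_rng(3)

# Stand-in for plant process variables (columns) and an output quality measure
n_obs = 500
process = rng.normal(size=(n_obs, 6))
quality = 2.0 * process[:, 0] - 1.5 * process[:, 3] + rng.normal(0, 0.5, n_obs)

f_scores, _ = f_regression(process, quality)
mi_scores = mutual_info_regression(process, quality, random_state=0)

for i, (f, mi) in enumerate(zip(f_scores, mi_scores)):
    print(f"variable {i}: F={f:8.1f}  MI={mi:.3f}")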
Metabolic Network Modeling of Microbial Interactions in Natural and Engineered Environmental Systems
Perez-Garcia, Octavio; Lear, Gavin; Singhal, Naresh
2016-01-01
We review approaches to characterize metabolic interactions within microbial communities using Stoichiometric Metabolic Network (SMN) models for applications in environmental and industrial biotechnology. SMN models are computational tools used to evaluate the metabolic engineering potential of various organisms. They have successfully been applied to design and optimize the microbial production of antibiotics, alcohols and amino acids by single strains. To date, however, such models have rarely been applied to analyze and control the metabolism of more complex microbial communities. This is largely attributed to the diversity of microbial community functions, metabolisms, and interactions. Here, we firstly review different types of microbial interaction and describe their relevance for natural and engineered environmental processes. Next, we provide a general description of the essential methods of the SMN modeling workflow including the steps of network reconstruction, simulation through Flux Balance Analysis (FBA), experimental data gathering, and model calibration. Then we broadly describe and compare four approaches to model microbial interactions using metabolic networks, i.e., (i) lumped networks, (ii) compartment per guild networks, (iii) bi-level optimization simulations, and (iv) dynamic-SMN methods. These approaches can be used to integrate and analyze diverse microbial physiology, ecology and molecular community data. All of them (except the lumped approach) are suitable for incorporating species abundance data but so far they have been used only to model simple communities of two to eight different species. Interactions based on substrate exchange and competition can be directly modeled using the above approaches. However, interactions based on metabolic feedbacks, such as product inhibition and syntrophy, require extensions to current models, incorporating gene regulation and compound accumulation mechanisms. SMN models of microbial interactions can be used to analyze complex “omics” data and to infer and optimize metabolic processes. Thereby, SMN models are suitable to capitalize on advances in high-throughput molecular and metabolic data generation. SMN models are starting to be applied to describe microbial interactions during wastewater treatment, in-situ bioremediation, microalgae blooms, methanogenic fermentation, and bioplastic production. Despite their current challenges, we envisage that SMN models have future potential for the design and development of novel growth media, biochemical pathways and synthetic microbial associations. PMID:27242701
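To make the Flux Balance Analysis step concrete, the sketch below solves a toy three-reaction network as a linear programme (maximize the biomass flux subject to steady-state mass balance and flux bounds); the stoichiometry and bounds are invented for illustration, and any real SMN would be far larger.

import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions)
# r1: -> A         (uptake)
# r2: A -> B
# r3: B -> biomass (objective)
S = np.array([
    [1, -1,  0],   # metabolite A
    [0,  1, -1],   # metabolite B
])

n_rxns = S.shape[1]
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 mmol/gDW/h

# FBA: maximize biomass flux (r3) subject to steady state S v = 0
c = np.zeros(n_rxns)
c[2] = -1.0                                 # linprog minimizes, so negate
res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal flux distribution:", res.x)  # expected [10, 10, 10]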
Unit mechanisms of fission gas release: Current understanding and future needs
Tonks, Michael; Andersson, David; Devanathan, Ram; ...
2018-03-01
Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation are summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model on how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.
Unit mechanisms of fission gas release: Current understanding and future needs
NASA Astrophysics Data System (ADS)
Tonks, Michael; Andersson, David; Devanathan, Ram; Dubourg, Roland; El-Azab, Anter; Freyss, Michel; Iglesias, Fernando; Kulacsy, Katalin; Pastore, Giovanni; Phillpot, Simon R.; Welland, Michael
2018-06-01
Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation are summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model on how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.
ERIC Educational Resources Information Center
Chang, Franklin; Dell, Gary S.; Bock, Kathryn
2006-01-01
Psycholinguistic research has shown that the influence of abstract syntactic knowledge on performance is shaped by particular sentences that have been experienced. To explore this idea, the authors applied a connectionist model of sentence production to the development and use of abstract syntax. The model makes use of (a) error-based learning to…
FUNDAMENTAL MASS TRANSFER MODEL FOR INDOOR AIR EMISSIONS FROM SURFACE COATINGS
Emissions from freshly applied paints and other coatings can cause elevated indoor concentrations of vapor-phase organics. Methods are needed to determine the emission rates over time for these products. Some success has been achieved using simple first-order decay models to eval...
Bromochloromethane (BCM) is a volatile compound and a by-product of disinfection of water by chlorination. Physiologically based pharmacokinetic (PBPK) models are used in risk assessment applications. An updated PBPK model for BCM is generated and applied to hypotheses testing c...
Bourrinet, P; Conduzorgues, J P; Dutertre, H; Macabies, J; Masson, P; Maurin, J; Mercier, O
1995-02-01
An interlaboratory study was carried out to determine the feasibility and reliability of a method using the hamster cheek pouch as a model for assessing the potential irritative properties of substances intended to be applied to the lips or other mucous membranes. The test substances were applied once daily to both pouches for 14 consecutive days. Local and general tolerance was appraised throughout the study. At the end of the study, histologic examination of the pouches and the main organs was performed. Results of the feasibility study, conducted on various types of commercial products, indicated that this model is suitable for preparations of various consistencies and compositions. Results of the reliability study, carried out on gel-type preparations containing various concentrations of a known irritant, sodium lauryl sulfate, indicated that the method elicits a dose-dependent reaction for this compound. The hamster cheek pouch method was reproducible for the various parameters under consideration: local tolerance, general tolerance, and histologic examination. For all products, results were in good agreement among the various laboratories participating in the study. The French regulatory authorities of the Fraud Repression Department have accepted it as an official method for the evaluation of the potential irritative properties of cosmetics and hygiene products intended to be applied to the lips or other mucous membranes.
NASA Astrophysics Data System (ADS)
Djuwendah, E.; Priyatna, T.; Kusno, K.; Deliana, Y.; Wulandari, E.
2018-03-01
Building an agribusiness model of LEISA is needed as a prototype of sustainable regional and economic development (SRRED) in the watersheds (DAS) of West Java Province. The LEISA agribusiness model is a sustainable agribusiness system applying low external input. The system was developed in the framework of optimizing local-based productive resources including soil, water, vegetation, microclimate, renewable energy, appropriate technology, social capital, environment and human resources by combining various subsystems, including integrated production subsystems for crops, livestock and fish to provide a maximum synergy effect, a post-harvest and processing subsystem, a marketing subsystem and supporting subsystems. In this study, the ecological boundary is the Cipunegara sub-watershed ecosystem and the administrative boundary is Surian Subdistrict in Sumedang. The purposes of this study are to identify the potential of natural resources and local agricultural technologies that could support the LEISA model in Surian and to identify the potential of internal and external inputs in the LEISA model. The research used a qualitative descriptive method and technical action research. Data were obtained through interviews, documentation, and observation. The results showed that natural resources in the form of agricultural land, water resources, livestock resources, and human labor are sufficient to support the LEISA agribusiness model. The LEISA agribusiness model that has been applied at the research location is the integration of beef cattle, agroforestry, and agrosilvopasture. By building the LEISA model, agribusiness can optimize the utilization of locally based productive resources, reduce dependence on external resources, and support sustainable food security.
Assessing the impact of climate change upon hydrology and agriculture in the Indrawati Basin, Nepal.
NASA Astrophysics Data System (ADS)
Palazzoli, Irene; Bocchiola, Daniele; Nana, Ester; Maskey, Shreedhar; Uhlenbrook, Stefan
2014-05-01
Agriculture is sensitive to climate change, especially to temperature and precipitation changes. The purpose of this study was to evaluate climate change impacts on rain-fed crop production in the Indrawati river basin, Nepal. The Soil and Water Assessment Tool (SWAT) model was used to model hydrology and cropping systems in the catchment, and to predict the influence of different climate change scenarios therein. Daily weather data collected from about 13 weather stations over 4 decades were used to constrain the SWAT model, and data from two hydrometric stations were used to calibrate/validate it. Management practices (crop calendar) were then applied to specific Hydrological Response Units (HRUs) for the main crops of the region: rice, corn and wheat. Manual calibration of crop production was also carried out against literature values of crop yield in the area. The calibrated and validated model was further applied to assess the impact of three future climate change scenarios (RCPs) on crop productivity in the region. Three climate models (GCMs) were adopted, each with three RCPs (2.5, 4.5, 8.5). Impacts of climate change were assessed considering three time windows, namely a baseline period (1995-2004), the middle of the century (2045-2054) and the end of the century (2085-2094). For each GCM and RCP, future hydrology and yield were compared to the baseline scenario. The results displayed a slightly modified hydrological cycle and rather small variations in crop production, varying with models, RCPs and crop type, the largest being for wheat. Keywords: Climate Change, Nepal, hydrological cycle, crop yield.
Parametric inference for biological sequence analysis.
Pachter, Lior; Sturmfels, Bernd
2004-11-16
One of the major successes in computational biology has been the unification, by using the graphical model formalism, of a multitude of algorithms for annotating and comparing biological sequences. Graphical models that have been applied to these problems include hidden Markov models for annotation, tree models for phylogenetics, and pair hidden Markov models for alignment. A single algorithm, the sum-product algorithm, solves many of the inference problems that are associated with different statistical models. This article introduces the polytope propagation algorithm for computing the Newton polytope of an observation from a graphical model. This algorithm is a geometric version of the sum-product algorithm and is used to analyze the parametric behavior of maximum a posteriori inference calculations for graphical models.
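As a concrete instance of the sum-product idea on the simplest of these graphical models, the sketch below runs the forward recursion of a hidden Markov model to compute the probability of an observation sequence; the transition, emission and initial probabilities are toy values.

import numpy as np

# Toy 2-state HMM: transition T, emission E, initial pi (illustrative numbers)
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])
E = np.array([[0.7, 0.2, 0.1],    # emission probabilities for state 0 over 3 symbols
              [0.1, 0.3, 0.6]])   # emission probabilities for state 1
pi = np.array([0.5, 0.5])

obs = [0, 1, 2, 2]                # observed symbol sequence

# Forward recursion: alpha_t(s) = E[s, obs_t] * sum_s' alpha_{t-1}(s') * T[s', s]
alpha = pi * E[:, obs[0]]
for o in obs[1:]:
    alpha = E[:, o] * (alpha @ T)

print("P(observations) =", alpha.sum())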
Nazir, Yusuf; Shuib, Shuwahida; Kalil, Mohd Sahaid; Song, Yuanda; Hamid, Aidil Abdul
2018-06-11
In this study, optimization of growth, lipid and DHA production of Aurantiochytrium SW1 was carried out using response surface methodology (RSM), optimizing initial fructose concentration, agitation speed and monosodium glutamate (MSG) concentration. A central composite design was applied as the experimental design and analysis of variance (ANOVA) was used to analyze the data. ANOVA revealed that the process, which was adequately represented by a quadratic model, was significant (p < 0.0001) for all responses. All three factors were significant (p < 0.005) in influencing the biomass and lipid data, while only two factors (agitation speed and MSG) gave a significant effect on DHA production (p < 0.005). The estimated optimal conditions for enhanced growth, lipid and DHA production were 70 g/L fructose, 250 rpm agitation speed and 10 g/L MSG. Consequently, the quadratic model was validated by applying the estimated optimum conditions, which confirmed the model validity: 19.0 g/L biomass, 9.13 g/L lipid and 4.75 g/L of DHA were produced. Growth, lipid and DHA were 28, 36 and 35% higher, respectively, than those produced in the original medium prior to optimization.
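A minimal sketch of fitting the second-order response surface used in such RSM studies: build the full quadratic design matrix in the three coded factors and solve by least squares; the factor settings and DHA responses below are simulated stand-ins, not the paper's central composite design data.

import numpy as np

def quadratic_design_matrix(X):
    """Full second-order RSM model in three factors:
    1, x1, x2, x3, x1^2, x2^2, x3^2, x1x2, x1x3, x2x3."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

# Coded factor settings (fructose, agitation, MSG) and a toy DHA response
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(20, 3))
y = 4.5 - 0.8*X[:, 0]**2 - 0.5*X[:, 1]**2 - 0.3*X[:, 2]**2 \
    + 0.4*X[:, 1] + rng.normal(0, 0.05, 20)

coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print("fitted second-order coefficients:", np.round(coef, 2))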
Characterization and Prediction of Chemical Functions and ...
Assessing exposures from the thousands of chemicals in commerce requires quantitative information on the chemical constituents of consumer products. Unfortunately, gaps in available composition data prevent assessment of exposure to chemicals in many products. Here we propose filling these gaps via consideration of chemical functional role. We obtained function information for thousands of chemicals from public sources and used a clustering algorithm to assign chemicals into 35 harmonized function categories (e.g., plasticizers, antimicrobials, solvents). We combined these functions with weight fraction data for 4115 personal care products (PCPs) to characterize the composition of 66 different product categories (e.g., shampoos). We analyzed the combined weight fraction/function dataset using machine learning techniques to develop quantitative structure property relationship (QSPR) classifier models for 22 functions and for weight fraction, based on chemical-specific descriptors (including chemical properties). We applied these classifier models to a library of 10196 data-poor chemicals. Our predictions of chemical function and composition will inform exposure-based screening of chemicals in PCPs for combination with hazard data in risk-based evaluation frameworks. As new information becomes available, this approach can be applied to other classes of products and the chemicals they contain in order to provide essential consumer product data for use in exposure-b
An integrated new product development framework - an application on green and low-carbon products
NASA Astrophysics Data System (ADS)
Lin, Chun-Yu; Lee, Amy H. I.; Kang, He-Yau
2015-03-01
Companies need to be innovative to survive in today's competitive market; thus, new product development (NPD) has become very important. This research constructs an integrated NPD framework for developing new products. In stage one, customer attributes (CAs) and engineering characteristics (ECs) for developing products are collected, and fuzzy interpretive structural modelling (FISM) is applied to understand the relationships among these critical factors. Based on quality function deployment (QFD), a house of quality is then built, and fuzzy analytic network process (FANP) is adopted to calculate the relative importance of ECs. In stage two, fuzzy failure mode and effects analysis (FFMEA) is applied to understand the potential failures of the ECs and to determine the importance of ECs with respect to risk control. In stage three, a goal programming (GP) model is constructed to consider the outcome from the FANP-QFD, FFMEA and other objectives, in order to select the most important ECs. Due to pollution and global warming, environmental protection has become an important topic. With both governments and consumers developing environmental consciousness, successful green and low-carbon NPD provides an important competitive advantage, enabling the survival or renewal of firms. The proposed framework is implemented in a panel manufacturing firm for designing a green and low-carbon product.
Coupled thermal–hydrological–mechanical modeling of CO2-enhanced coalbed methane recovery
Ma, Tianran; Rutqvist, Jonny; Oldenburg, Curtis M.; ...
2017-05-22
CO2-enhanced coalbed methane recovery, also known as CO2-ECBM, is a potential win-win approach for enhanced methane production while simultaneously sequestering injected anthropogenic CO2 to decrease CO2 emissions into the atmosphere. Here, CO2-ECBM is simulated using a coupled thermal–hydrological–mechanical (THM) numerical model that considers multiphase (gas and water) flow and solubility, multicomponent (CO2 and CH4) diffusion and adsorption, heat transfer and coal deformation. The coupled model is based on the TOUGH-FLAC simulator, which is applied here for the first time to model CO2-ECBM. The capacity of the simulator for modeling methane production is verified by a code-to-code comparison with the general-purpose finite-element solver COMSOL. Then, the TOUGH-FLAC simulator is applied in an isothermal simulation to study the variations in permeability evolution during a CO2-ECBM operation while considering four different stress-dependent permeability models that have been implemented into the simulator. Finally, the TOUGH-FLAC simulator is applied in non-isothermal simulations to model THM responses during a CO2-ECBM operation. Our simulations show that the permeability evolution, mechanical stress, and deformation are all affected by changes in pressure, temperature and adsorption swelling, with adsorption swelling having the largest effect. The calculated stress changes do not induce any mechanical failure in the coal seam, except near the injection well in one case of a very unfavorable stress field.
Hossain, Md. Kamrul; Kamil, Anton Abdulbasah; Baten, Md. Azizul; Mustafa, Adli
2012-01-01
The objective of this paper is to apply the Translog Stochastic Frontier production model (SFA) and Data Envelopment Analysis (DEA) to estimate efficiencies over time and the Total Factor Productivity (TFP) growth rate for Bangladeshi rice crops (Aus, Aman and Boro) using the most recent data available, covering the period 1989–2008. Results indicate that technical efficiency was higher for Boro among the three types of rice, but the overall technical efficiency of rice production was found to be around 50%. Although positive changes exist in TFP for the sample analyzed, the average growth rate of TFP for rice production was estimated at almost the same level by both the Translog SFA with half-normal distribution and DEA. The TFP estimated from SFA is forecasted with an ARIMA(2, 0, 0) model. An ARIMA(1, 0, 0) model is used to forecast the TFP of Aman from the DEA estimation. PMID:23077500
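The ARIMA forecasting step can be sketched as follows, assuming an illustrative TFP index series in place of the estimated values; statsmodels is used to fit an AR(2) model and produce a five-period-ahead forecast.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Illustrative annual TFP index series (1989-2008); not the paper's estimates
tfp = np.array([1.00, 1.01, 1.03, 1.02, 1.05, 1.06, 1.08, 1.07, 1.10, 1.11,
                1.12, 1.14, 1.13, 1.16, 1.18, 1.17, 1.20, 1.22, 1.21, 1.24])

model = ARIMA(tfp, order=(2, 0, 0))   # AR(2), no differencing, no MA terms
fit = model.fit()
print(fit.forecast(steps=5))          # five-year-ahead TFP forecast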
NASA Astrophysics Data System (ADS)
Battaïa, Olga; Dolgui, Alexandre; Guschinsky, Nikolai; Levin, Genrikh
2014-10-01
Solving equipment selection and line balancing problems together allows better line configurations to be reached and avoids local optimal solutions. This article considers jointly these two decision problems for mass production lines with serial-parallel workplaces. This study was motivated by the design of production lines based on machines with rotary or mobile tables. Nevertheless, the results are more general and can be applied to assembly and production lines with similar structures. The designers' objectives and the constraints are studied in order to suggest a relevant mathematical model and an efficient optimization approach to solve it. A real case study is used to validate the model and the developed approach.
Shao, Q; Rowe, R C; York, P
2007-06-01
This study investigated an artificial intelligence technology - model trees - as a modelling tool applied to an immediate release tablet formulation database. The modelling performance was compared with artificial neural networks, which are well established and widely applied in pharmaceutical product formulation. The predictive ability of the generated models was validated on unseen data and judged by the correlation coefficient R2. Output from the model tree analyses produced multivariate linear equations which predicted tablet tensile strength, disintegration time, and drug dissolution profiles with similar quality to neural network models. However, additional and valuable knowledge hidden in the formulation database was extracted from these equations. It is concluded that, as a transparent technology, model trees are useful tools for formulators.
NASA Astrophysics Data System (ADS)
Barthel, Thomas; De Bacco, Caterina; Franz, Silvio
2018-01-01
We introduce and apply an efficient method for the precise simulation of stochastic dynamical processes on locally treelike graphs. Networks with cycles are treated in the framework of the cavity method. Such models correspond, for example, to spin-glass systems, Boolean networks, neural networks, or other technological, biological, and social networks. Building upon ideas from quantum many-body theory, our approach is based on a matrix product approximation of the so-called edge messages—conditional probabilities of vertex variable trajectories. Computation costs and accuracy can be tuned by controlling the matrix dimensions of the matrix product edge messages (MPEM) in truncations. In contrast to Monte Carlo simulations, the algorithm has a better error scaling and works for both single instances as well as the thermodynamic limit. We employ it to examine prototypical nonequilibrium Glauber dynamics in the kinetic Ising model. Because of the absence of cancellation effects, observables with small expectation values can be evaluated accurately, allowing for the study of decay processes and temporal correlations.
Simulating the effects of fire disturbance and vegetation recovery on boreal ecosystem carbon fluxes
NASA Astrophysics Data System (ADS)
Yi, Y.; Kimball, J. S.; Jones, L. A.; Zhao, M.
2011-12-01
Fire related disturbance and subsequent vegetation recovery has a major influence on carbon storage and land-atmosphere CO2 fluxes in boreal ecosystems. We applied a synthetic approach combining tower eddy covariance flux measurements, satellite remote sensing and model reanalysis surface meteorology within a terrestrial carbon model framework to estimate fire disturbance and recovery effects on boreal ecosystem carbon fluxes including gross primary production (GPP), ecosystem respiration and net CO2 exchange (NEE). A disturbance index based on MODIS land surface temperature and NDVI was found to coincide with vegetation recovery status inferred from tower chronosequence sites. An empirical algorithm was developed to track ecosystem recovery status based on the disturbance index and used to nudge modeled net primary production (NPP) and surface soil organic carbon stocks from baseline steady-state conditions. The simulations were conducted using a satellite based terrestrial carbon flux model driven by MODIS NDVI and MERRA reanalysis daily surface meteorology inputs. The MODIS (MCD45) burned area product was then applied for mapping recent (post 2000) regional disturbance history, and used with the disturbance index to define vegetation disturbance and recovery status. The model was then applied to estimate regional patterns and temporal changes in terrestrial carbon fluxes across the entire northern boreal forest and tundra domain. A sensitivity analysis was conducted to assess the relative importance of fire disturbance and recovery on regional carbon fluxes relative to assumed steady-state conditions. The explicit representation of disturbance and recovery effects produces more accurate NEE predictions than the baseline steady-state simulations and reduces uncertainty regarding the purported missing carbon sink in the high latitudes.
A convolution model for computing the far-field directivity of a parametric loudspeaker array.
Shi, Chuang; Kajikawa, Yoshinobu
2015-02-01
This paper describes a method to compute the far-field directivity of a parametric loudspeaker array (PLA), whereby a steerable parametric loudspeaker can be implemented when phased array techniques are applied. The convolution of the product directivity and the Westervelt directivity is suggested, substituting for the past practice of using the product directivity only. The computed directivity of a PLA using the proposed convolution model achieves significantly better agreement with measured directivity, at a negligible computational cost.
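A sketch of the computation under stated assumptions: both directivity patterns are represented here by placeholder Gaussian curves sampled on a common angle grid (standing in for the array's product directivity and the Westervelt directivity), and the proposed model convolves them instead of multiplying them.

import numpy as np

theta = np.linspace(-90, 90, 361)            # angle grid (degrees), 0.5 deg step

# Placeholder directivity patterns sampled on the same angle grid.
# In the paper these would be the product directivity of the ultrasonic
# array and the Westervelt directivity of the parametric source.
product_dir = np.exp(-(theta / 10.0) ** 2)       # assumed array pattern
westervelt_dir = np.exp(-(theta / 25.0) ** 2)    # assumed Westervelt pattern

# Past practice: product directivity only
d_product_only = product_dir / product_dir.max()

# Proposed model: angular convolution of the two patterns
d_conv = np.convolve(product_dir, westervelt_dir, mode="same")
d_conv /= d_conv.max()

half_max = theta[d_conv >= 0.5]
print(f"half-maximum beamwidth (convolution model): {half_max[-1] - half_max[0]:.1f} deg")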
Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model
NASA Astrophysics Data System (ADS)
Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung
2017-12-01
This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied across the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency performance of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of the current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. The analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
Tozer, Sarah A; Kelly, Seamus; O'Mahony, Cian; Daly, E J; Nash, J F
2015-09-01
Realistic estimates of chemical aggregate exposure are needed to ensure consumer safety. As exposure estimates are a critical part of the equation used to calculate acceptable "safe levels" and conduct quantitative risk assessments, methods are needed to produce realistic exposure estimations. To this end, a probabilistic aggregate exposure model was developed to estimate consumer exposure from several rinse off personal cleansing products containing the anti-dandruff preservative zinc pyrithione. The model incorporates large habits and practices surveys, containing data on frequency of use, amount applied, co-use along with market share, and combines these data at the level of the individual based on subject demographics to better estimate exposure. The daily-applied exposure (i.e., amount applied to the skin) was 3.79 mg/kg/day for the 95th percentile consumer. The estimated internal dose for the 95th percentile exposure ranged from 0.01-1.29 μg/kg/day after accounting for retention following rinsing and dermal penetration of ZnPt. This probabilistic aggregate exposure model can be used in the human safety assessment of ingredients in multiple rinse-off technologies (e.g., shampoo, bar soap, body wash, and liquid hand soap). In addition, this model may be used in other situations where refined exposure assessment is required to support a chemical risk assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.
48 CFR 2442.1107 - Contract clause.
Code of Federal Regulations, 2014 CFR
2014-10-01
... requires services of an analytical nature (e.g., applied social science research); and (3) The contract requires the delivery of an overall end product (e.g., evaluation, study, model). (b) The Contracting...
48 CFR 2442.1107 - Contract clause.
Code of Federal Regulations, 2013 CFR
2013-10-01
... requires services of an analytical nature (e.g., applied social science research); and (3) The contract requires the delivery of an overall end product (e.g., evaluation, study, model). (b) The Contracting...
Machine Learning Methods for Production Cases Analysis
NASA Astrophysics Data System (ADS)
Mokrova, Nataliya V.; Mokrov, Alexander M.; Safonova, Alexandra V.; Vishnyakov, Igor V.
2018-03-01
An approach to the analysis of events occurring during the production process is proposed. The described machine learning system is able to solve classification tasks related to production control and hazard identification at an early stage. Descriptors of the internal production network data were used for training and testing of the applied models. k-Nearest Neighbors and Random Forest methods were used to illustrate and analyze the proposed solution. The quality of the developed classifiers was estimated using standard statistical metrics, such as precision, recall and accuracy.
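As a generic illustration of the described workflow (not the authors' data or descriptors), scikit-learn's k-Nearest Neighbors and Random Forest classifiers can be trained on event descriptors and scored with precision, recall and accuracy; the synthetic dataset below is an assumption.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for descriptors of production-network events (hypothetical).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("Random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    print(name,
          "precision=%.3f" % precision_score(y_test, pred),
          "recall=%.3f" % recall_score(y_test, pred),
          "accuracy=%.3f" % accuracy_score(y_test, pred))
```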
Burckel, E; Ashraf, T; de Sousa Filho, J P; Forleo Neto, E; Guarino, H; Yauti, C; Barreto F de, B; Champion, L
1999-11-01
To develop and apply a model to assess the economic value of a workplace influenza programme from the perspective of the employer. The model calculated the avoided costs of influenza, including treatment costs, lost productivity, lost worker added value and the cost of replacing workers. Subtracted from this benefit were the costs associated with a vaccination programme, including administrative costs, the time to give the vaccine, and lost productivity due to adverse reactions. The framework of the model can be applied to any company to estimate the cost-benefit of an influenza immunisation programme. The model developed was applied to 4030 workers in the core divisions of a Brazilian pharma-chemical company. The model determined a net benefit of $US121,441 [129,335 Brazilian reals ($Brz)], or $US35.45 ($Brz37.75) per vaccinated employee (1997 values). The cost-benefit ratio was 1:2.47. The calculations were subjected to a battery of 1-way and 2-way sensitivity analyses that determined that net benefit would be retained as long as the vaccine cost remained below $US45.40 ($Brz48.40) or the vaccine was at least 32.5% effective. Other alterations would retain a net benefit as well, including several combinations of incidence rate and vaccine effectiveness. The analysis suggests that providing an influenza vaccination programme can yield a substantial net benefit for an employer, although the size of the benefit will depend upon who normally absorbs the costs of treating influenza and compensating workers for lost work time due to illness, as well as the type of company in which the immunisation programme is applied.
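The cost-benefit arithmetic can be sketched in a few lines; every input figure below is a hypothetical placeholder, not the Brazilian case-study data.

```python
# Hypothetical workplace influenza cost-benefit sketch (not the study's actual inputs).
n_vaccinated = 4000
attack_rate = 0.12            # expected influenza attack rate without vaccination
vaccine_effectiveness = 0.70
cost_per_case = 350.0         # treatment + lost productivity + replacement, per case

vaccine_cost_per_worker = 10.0
admin_cost_per_worker = 2.0
lost_time_cost_per_worker = 3.0   # time to vaccinate + adverse-reaction downtime

avoided_cases = n_vaccinated * attack_rate * vaccine_effectiveness
benefit = avoided_cases * cost_per_case
programme_cost = n_vaccinated * (vaccine_cost_per_worker + admin_cost_per_worker
                                 + lost_time_cost_per_worker)

net_benefit = benefit - programme_cost
print(f"net benefit: {net_benefit:,.0f}  ({net_benefit / n_vaccinated:.2f} per vaccinated worker)")
print(f"benefit-cost ratio: 1:{benefit / programme_cost:.2f}")
```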
Expectations, Performance, and Citizen Satisfaction with Urban Services
ERIC Educational Resources Information Center
Van Ryzin, Gregg G.
2004-01-01
The expectancy disconfirmation model has dominated private-sector research on customer satisfaction for several decades, yet it has not been applied to citizen satisfaction with urban services. The model views satisfaction judgments as determined--not just by product or service performance--but by a process in which consumers compare performance…
Measuring Misinformation in Repeat Trial Pick 1 of 2 Tests.
ERIC Educational Resources Information Center
Henderson, Pamela W.; Buchanan, Bruce
1992-01-01
An extension is described to a product-testing model to account for misinformation among subjects that would lead them to perform incorrectly on "pick one of two" tests. The model is applied to a data set of 367 subjects picking 1 of 2 colas. Misinformation does exist. (SLD)
Applying the Achievement Orientation Model to the Job Satisfaction of Teachers of the Gifted
ERIC Educational Resources Information Center
Siegle, Del; McCoach, D. Betsy; Shea, Kelly
2014-01-01
Factors associated with motivation and satisfaction aid in understanding the processes that enhance achievement and productivity. Siegle and McCoach (2005) proposed a motivational model for understanding student achievement and underachievement that included self-perceptions in three areas (meaningfulness [goal valuation], self-efficacy, and…
L. Linsen; B.J. Karis; E.G. McPherson; B. Hamann
2005-01-01
In computer graphics, models describing the fractal branching structure of trees typically exploit the modularity of tree structures. The models are based on local production rules, which are applied iteratively and simultaneously to create a complex branching system. The objective is to generate three-dimensional scenes of often many realistic-looking and non-...
Modeling Interdependence: Productive Parenting for Gifted Adolescents.
ERIC Educational Resources Information Center
Rhodes, Celeste
1994-01-01
This article offers a theoretical framework of the parenting role as applied to the unique needs and characteristics of gifted adolescents. In addition, a theory of parental modeling of interdependence is presented. Composite examples are made from informal observations of parents who have been successful in promoting the growth of their highly…
Collaborative Development: A New Culture Affects an Old Organization
ERIC Educational Resources Information Center
Phelps, Jim; Ruzicka, Terry
2008-01-01
At the University of Wisconsin (UW)-Madison, the Registrar's Office and the Division of Information Technology (DoIT) apply a collaborative development process to joint projects. This model differs from a "waterfall" model in that technical and functional staff work closely to develop requirements, prototypes, and the product throughout…
High throughput screening (HTS) models are being developed and applied to prioritize chemicals for more comprehensive exposure and risk assessment. Dermal pathways are possible exposure routes to humans for thousands of chemicals found in personal care products and the indoor env...
NASA Astrophysics Data System (ADS)
Garcia, V.; Cooter, E. J.
2013-12-01
The Renewable Fuel Standard (RFS) requires oil refiners to reach a target of 15 billion gallons of corn-based ethanol by 2022. However, there are concerns that the broad-scale use of corn as a source of ethanol may lead to unintended economic and environmental consequences. This study applies the geophysical relationships captured with linked meteorological, air quality and agriculture models to examine the impact of corn production before enactment of the RFS in 2002 and at the height of the RFS targets in 2022. In particular, we investigate the probability of high levels of nitrate in groundwater resulting from increased corn production and then relate this vulnerability to the potential for infants to acquire Methemoglobinemia, or 'Blue Baby Syndrome'. Blue Baby Syndrome (BBS) is a potentially fatal condition that occurs when the hemoglobin (Fe2+) in an infant's red blood cells is oxidized to methemoglobin (Fe3+), preventing the uptake of oxygen from the baby's blood. Exposure to high levels of nitrate in groundwater occurs near the intersection of areas where surface water can more readily leach into shallow aquifers, wells are the main source of drinking water, and high nitrogen inputs exist. We use a coupled meteorological, agricultural and air quality model to identify areas vulnerable to increased nitrate contamination and the associated risk of acquiring BBS. We first verify the relationship between predictive variables (e.g., nitrogen deposition and fertilization rates, landcover, soils and aquifer type) and nitrate groundwater levels by applying a regression model to over 800 nitrate measurements taken from wells located throughout the US (Figure 1). We then apply the regression coefficients to the coupled model output to identify areas that are at an increased risk for high nitrate groundwater levels in 2022. Finally, we examine the potential change in risk for acquiring BBS resulting from increased corn production by applying an Oral Reference Dose (RfD) factor from the US EPA Integrated Risk Information System.
Zelić, B; Bolf, N; Vasić-Racki, D
2006-06-01
Three different models: the unstructured mechanistic black-box model, the input-output neural network-based model and the externally recurrent neural network model were used to describe the pyruvate production process from glucose and acetate using the genetically modified Escherichia coli YYC202 ldhA::Kan strain. The experimental data were used from the recently described batch and fed-batch experiments [ Zelić B, Study of the process development for Escherichia coli-based pyruvate production. PhD Thesis, University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb, Croatia, July 2003. (In English); Zelić et al. Bioproc Biosyst Eng 26:249-258 (2004); Zelić et al. Eng Life Sci 3:299-305 (2003); Zelić et al Biotechnol Bioeng 85:638-646 (2004)]. The neural networks were built out of the experimental data obtained in the fed-batch pyruvate production experiments with the constant glucose feed rate. The model validation was performed using the experimental results obtained from the batch and fed-batch pyruvate production experiments with the constant acetate feed rate. Dynamics of the substrate and product concentration changes was estimated using two neural network-based models for biomass and pyruvate. It was shown that neural networks could be used for the modeling of complex microbial fermentation processes, even in conditions in which mechanistic unstructured models cannot be applied.
Development of mpi_EPIC model for global agroecosystem modeling
Kang, Shujiang; Wang, Dali; Jeff A. Nichols; ...
2014-12-31
Models that address policy-maker concerns about multi-scale effects of food and bioenergy production systems are computationally demanding. We integrated the message passing interface algorithm into the process-based EPIC model to accelerate computation of ecosystem effects. Simulation performance was further enhanced by applying the Vampir framework. When this enhanced mpi_EPIC model was tested, total execution time for a global 30-year simulation of a switchgrass cropping system was shortened to less than 0.5 hours on a supercomputer. The results illustrate that mpi_EPIC using parallel design can balance simulation workloads and facilitate large-scale, high-resolution analysis of agricultural production systems, management alternatives and environmental effects.
ECUT (Energy Conversion and Utilization Technologies) program: Biocatalysis Project
NASA Technical Reports Server (NTRS)
1988-01-01
Fiscal year 1987 research activities and accomplishments for the Biocatalysis Project of the U.S. Department of Energy, Energy Conversion and Utilization Technologies (ECUT) Division are presented. The project's technical activities were organized into three work elements. The Molecular Modeling and Applied Genetics work element includes modeling and simulation studies to verify a dynamic model of the enzyme carboxypeptidase; plasmid stabilization by chromosomal integration; growth and stability characteristics of plasmid-containing cells; and determination of optimal production parameters for hyper-production of polyphenol oxidase. The Bioprocess Engineering work element supports efforts in novel bioreactor concepts that are likely to lead to substantially higher levels of reactor productivity, product yields, and lower separation energetics. The Bioprocess Design and Assessment work element attempts to develop procedures (via user-friendly computer software) for assessing the economics and energetics of a given biocatalyst process.
Production of furfural from palm oil empty fruit bunches: kinetic model comparison
NASA Astrophysics Data System (ADS)
Panjaitan, J. R. H.; Monica, S.; Gozan, M.
2017-05-01
Furfural is a chemical compound used in pharmaceuticals, cosmetics, resins and cleaning compounds, and it can be produced by acid hydrolysis of biomass. Indonesia's demand for furfural in 2010 reached 790 tons, of which about 72% was still imported from China. In this study, reaction kinetic models of furfural production from oil palm empty fruit bunches, with the acid catalyst added at the beginning of the experiment, were determined. Kinetic data were obtained from hydrolysis of empty oil palm bunches using a 3% sulfuric acid catalyst at temperatures of 170°C, 180°C and 190°C for 20 minutes. The kinetic model that best describes furfural production is one in which the acid-catalyzed hydrolysis of hemicellulose and the degradation of furfural both yield the same decomposition product, formic acid, via different reaction pathways. The activation energies obtained for the formation of furfural, the formation of decomposition products from furfural and the formation of decomposition products from hemicellulose are 8.240 kJ/mol, 19.912 kJ/mol and -39.267 kJ/mol, respectively.
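The reaction scheme implied by the abstract (hemicellulose to furfural, with parallel decomposition of both to degradation products) can be sketched as first-order kinetics with Arrhenius rate constants; the pre-exponential factors and the third activation energy below are hypothetical placeholders, not the fitted values of the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# First-order kinetic scheme of the type discussed: hemicellulose (H) -> furfural (F),
# with parallel decomposition of both H and F to degradation products (D).
# Rate constants follow Arrhenius behaviour; all constants below are hypothetical.
R = 8.314  # J/mol/K

def arrhenius(A, Ea, T):
    return A * np.exp(-Ea / (R * T))

T = 443.15                               # 170 degC
k1 = arrhenius(2.0e-2, 8.240e3, T)       # H -> F
k2 = arrhenius(1.0,    19.912e3, T)      # F -> D
k3 = arrhenius(5.0e-2, 15.0e3, T)        # H -> D (hypothetical Ea)

def rhs(t, y):
    H, F, D = y
    return [-(k1 + k3) * H, k1 * H - k2 * F, k3 * H + k2 * F]

# Integrate over a 20-minute hydrolysis, starting from pure hemicellulose.
sol = solve_ivp(rhs, (0.0, 20.0 * 60.0), [1.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0, 20 * 60, 5)
print(np.round(sol.sol(t).T, 3))  # columns: H, F, D fractions over time
```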
48 CFR 2442.1107 - Contract clause.
Code of Federal Regulations, 2011 CFR
2011-10-01
... analytical nature (e.g., applied social science research); and (3) The contract requires the delivery of an overall end product (e.g., evaluation, study, model). (b) The Contracting Officer shall use the basic...
48 CFR 2442.1107 - Contract clause.
Code of Federal Regulations, 2012 CFR
2012-10-01
... analytical nature (e.g., applied social science research); and (3) The contract requires the delivery of an overall end product (e.g., evaluation, study, model). (b) The Contracting Officer shall use the basic...
48 CFR 2442.1107 - Contract clause.
Code of Federal Regulations, 2010 CFR
2010-10-01
... analytical nature (e.g., applied social science research); and (3) The contract requires the delivery of an overall end product (e.g., evaluation, study, model). (b) The Contracting Officer shall use the basic...
Vegetation Continuous Fields--Transitioning from MODIS to VIIRS
NASA Astrophysics Data System (ADS)
DiMiceli, C.; Townshend, J. R.; Sohlberg, R. A.; Kim, D. H.; Kelly, M.
2015-12-01
Measurements of fractional vegetation cover are critical for accurate and consistent monitoring of global deforestation rates. They also provide important parameters for land surface, climate and carbon models and vital background data for research into fire, hydrological and ecosystem processes. MODIS Vegetation Continuous Fields (VCF) products provide four complementary layers of fractional cover: tree cover, non-tree vegetation, bare ground, and surface water. MODIS VCF products are currently produced globally and annually at 250m resolution for 2000 to the present. Additionally, annual VCF products at 1/20° resolution derived from AVHRR and MODIS Long-Term Data Records are in development to provide Earth System Data Records of fractional vegetation cover for 1982 to the present. In order to provide continuity of these valuable products, we are extending the VCF algorithms to create Suomi NPP/VIIRS VCF products. This presentation will highlight the first VIIRS fractional cover product: global percent tree cover at 1 km resolution. To create this product, phenological and physiological metrics were derived from each complete year of VIIRS 8-day surface reflectance products. A supervised regression tree method was applied to the metrics, using training derived from Landsat data supplemented by high-resolution data from Ikonos, RapidEye and QuickBird. The regression tree model was then applied globally to produce fractional tree cover. In our presentation we will detail our methods for creating the VIIRS VCF product. We will compare the new VIIRS VCF product to our current MODIS VCF products and demonstrate continuity between instruments. Finally, we will outline future VIIRS VCF development plans.
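A rough sketch of the supervised regression-tree step (not the operational VCF code), assuming placeholder arrays: rows are pixels, columns are phenological/physiological metrics, and the target is percent tree cover interpreted from Landsat and high-resolution imagery; the fitted tree is then applied to unlabelled pixels.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Placeholder training data (assumptions): metrics derived from VIIRS 8-day
# reflectance and a percent-tree-cover target from the training imagery.
n_train, n_metrics = 5000, 12
X_train = rng.random((n_train, n_metrics))
y_train = np.clip(100 * X_train[:, 0] - 30 * X_train[:, 1]
                  + rng.normal(0, 5, n_train), 0, 100)

tree = DecisionTreeRegressor(max_depth=12, min_samples_leaf=20).fit(X_train, y_train)

# "Apply globally": predict percent tree cover from each unlabelled pixel's metrics.
X_global = rng.random((10, n_metrics))
print(np.round(tree.predict(X_global), 1))  # predicted percent tree cover per pixel
```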
Model of the Product Development Lifecycle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Sunny L.; Roe, Natalie H.; Wood, Evan
2015-10-01
While the increased use of Commercial Off-The-Shelf information technology equipment has presented opportunities for improved cost effectiveness and flexibility, the corresponding loss of control over the product's development creates unique vulnerabilities and security concerns. Of particular interest is the possibility of a supply chain attack. A comprehensive model for the lifecycle of hardware and software products is proposed based on a survey of existing literature from academic, government, and industry sources. Seven major lifecycle stages are identified and defined: (1) Requirements, (2) Design, (3) Manufacturing for hardware and Development for software, (4) Testing, (5) Distribution, (6) Use and Maintenance, and (7) Disposal. The model is then applied to examine the risk of attacks at various stages of the lifecycle.
NASA Astrophysics Data System (ADS)
Friedl, L. A.; Cox, L.
2008-12-01
The NASA Applied Sciences Program collaborates with organizations to discover and demonstrate applications of NASA Earth science research and technology to decision making. The desired outcome is for public and private organizations to use NASA Earth science products in innovative applications for sustained, operational uses to enhance their decisions. In addition, the program facilitates the end-user feedback to Earth science to improve products and demands for research. The Program thus serves as a bridge between Earth science research and technology and the applied organizations and end-users with management, policy, and business responsibilities. Since 2002, the Applied Sciences Program has sponsored over 115 applications-oriented projects to apply Earth observations and model products to decision making activities. Projects have spanned numerous topics - agriculture, air quality, water resources, disasters, public health, aviation, etc. The projects have involved government agencies, private companies, universities, non-governmental organizations, and foreign entities in multiple types of teaming arrangements. The paper will examine this set of applications projects and present specific examples of successful use of Earth science in decision making. The paper will discuss scientific, organizational, and management factors that contribute to or impede the integration of the Earth science research in policy and management. The paper will also present new methods the Applied Sciences Program plans to implement to improve linkages between science and end users.
Improved Monitoring of Vegetation Productivity using Continuous Assimilation of Radiometric Data
NASA Astrophysics Data System (ADS)
Baret, F.; Lauvernet, C.; Weiss, M.; Prevot, L.; Rochdi, N.
Canopy functioning models describe crop production from meteorological and soil inputs. However, because of the large number of variables and parameters used, and the poor knowledge of the actual values of some of them, the time course of the canopy and thus the final production simulated by these models is often not very accurate. Satellite observations allow the simulations to be controlled through assimilation of radiometric data within radiative transfer models coupled to canopy functioning models. An assimilation scheme is presented with application to wheat crops. The coupling between radiative transfer models and canopy functioning models is described. The assimilation scheme is then applied to an experiment carried out within the ReSeDA project. Several issues relevant to the assimilation process are discussed. They concern the type of canopy functioning model used, the possibility to assimilate biophysical products rather than radiances, and the use of ancillary information. Further, considerations associated with the problems of high spatial and temporal resolution data are listed and illustrated by preliminary results acquired within the ADAM project. The required temporal sampling for space observations is also discussed.
A seawater desalination scheme for global hydrological models
NASA Astrophysics Data System (ADS)
Hanasaki, Naota; Yoshikawa, Sayaka; Kakinuma, Kaoru; Kanae, Shinjiro
2016-10-01
Seawater desalination is a practical technology for providing fresh water to coastal arid regions. Indeed, the use of desalination is rapidly increasing due to growing water demand in these areas and decreases in production costs due to technological advances. In this study, we developed a model to estimate the areas where seawater desalination is likely to be used as a major water source and the likely volume of production. The model was designed to be incorporated into global hydrological models (GHMs) that explicitly include human water usage. The model requires spatially detailed information on climate, income levels, and industrial and municipal water use, which represent standard input/output data in GHMs. The model was applied to a specific historical year (2005) and showed fairly good reproduction of the present geographical distribution and national production of desalinated water in the world. The model was applied globally to two periods in the future (2011-2040 and 2041-2070) under three distinct socioeconomic conditions, i.e., SSP (shared socioeconomic pathway) 1, SSP2, and SSP3. The results indicate that the usage of seawater desalination will have expanded considerably in geographical extent, and that production will have increased 1.4- to 2.1-fold in 2011-2040 compared to the present (from 2.8 × 10⁹ m³ yr⁻¹ in 2005 to 4.0-6.0 × 10⁹ m³ yr⁻¹), and 6.7- to 17.3-fold in 2041-2070 (to 18.7-48.6 × 10⁹ m³ yr⁻¹). The estimated global costs for production for each period are USD 1.1-10.6 × 10⁹ (0.002-0.019 % of the total global GDP), USD 1.6-22.8 × 10⁹ (0.001-0.020 %), and USD 7.5-183.9 × 10⁹ (0.002-0.100 %), respectively. The large spreads in these projections are primarily attributable to variations within the socioeconomic scenarios.
Jenkin, Gabrielle; Wilson, Nick; Hermanson, Nicole
2009-05-01
To evaluate the feasibility of the UK Nutrient Profile (NP) model for identifying 'unhealthy' food advertisements using a case study of New Zealand television advertisements. Four weeks of weekday television from 15.30 hours to 18.30 hours was videotaped from a state-owned (free-to-air) television channel popular with children. Food advertisements were identified and their nutritional information collected in accordance with the requirements of the NP model. Nutrient information was obtained from a variety of sources including food labels, company websites and a national nutritional database. From the 60 h sample of weekday afternoon television, there were 1893 advertisements, of which 483 were for food products or retailers. After applying the NP model, 66 % of these were classified as advertising high-fat, high-salt and high-sugar (HFSS) foods; 28 % were classified as advertising non-HFSS foods; and the remaining 2 % were unclassifiable. More than half (53 %) of the HFSS food advertisements were for 'mixed meal' items promoted by major fast-food franchises. The advertising of non-HFSS food was sparse, covering a narrow range of food groups, with no advertisements for fresh fruit or vegetables. Despite the NP model having some design limitations in classifying real-world televised food advertisements, it was easily applied to this sample and could clearly identify HFSS products. Policy makers who do not wish to restrict food advertising to children outright should consider using this NP model for regulating food advertising.
Guo, Changning; Doub, William H; Kauffman, John F
2010-08-01
Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, with an initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard the initial assumption, and extended the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
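A generic sketch of the error-propagation idea (not the authors' nasal spray DOE): input settings and response measurements are perturbed within assumed uncertainties, the DOE model is refit many times, and the spread of the fitted coefficients is examined. The two-factor model, design, and noise levels below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-factor DOE with a first-order + interaction model in coded units:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2
X_design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0]], float)
true_b = np.array([10.0, 2.0, -1.0, 0.5])

def design_matrix(X):
    return np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])

sigma_x, sigma_y = 0.05, 0.2   # assumed input-setting and response-measurement uncertainties
coefs = []
for _ in range(5000):
    X_real = X_design + rng.normal(0, sigma_x, X_design.shape)   # actual (uncertain) settings
    y = design_matrix(X_real) @ true_b + rng.normal(0, sigma_y, len(X_design))
    # The analyst fits against the nominal design, as in practice.
    b_hat, *_ = np.linalg.lstsq(design_matrix(X_design), y, rcond=None)
    coefs.append(b_hat)

print("coefficient std devs from Monte Carlo:", np.round(np.std(coefs, axis=0), 3))
```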
Extension of the Haseman-Elston regression model to longitudinal data.
Won, Sungho; Elston, Robert C; Park, Taesung
2006-01-01
We propose an extension to longitudinal data of the Haseman and Elston regression method for linkage analysis. The proposed model is a mixed model having several random effects. As response variable, we investigate the sibship sample mean corrected cross-product (smHE) and the BLUP-mean corrected cross product (pmHE), comparing them with the original squared difference (oHE), the overall mean corrected cross-product (rHE), and the weighted average of the squared difference and the squared mean-corrected sum (wHE). The proposed model allows for the correlation structure of longitudinal data. Also, the model can test for gene x time interaction to discover genetic variation over time. The model was applied in an analysis of the Genetic Analysis Workshop 13 (GAW13) simulated dataset for a quantitative trait simulating systolic blood pressure. Independence models did not preserve the test sizes, while the mixed models with both family and sibpair random effects tended to preserve size well. Copyright 2006 S. Karger AG, Basel.
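The core of the original Haseman-Elston idea can be sketched numerically (the longitudinal mixed-model extension adds family and sib-pair random effects on top of this): regress the squared sib-pair trait difference on the estimated proportion of alleles shared identical-by-descent, with a negative slope suggesting linkage. The simulated sib pairs below are an assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pairs = 500

# Simulated sib pairs (assumption): proportion of alleles shared IBD at the marker.
pi_ibd = rng.choice([0.0, 0.5, 1.0], size=n_pairs, p=[0.25, 0.5, 0.25])

# Trait values: a QTL linked to the marker plus independent noise.
qtl = rng.normal(0, 1, (n_pairs, 2))
shared = np.where(pi_ibd[:, None] >= 0.5, qtl[:, [0]], qtl)   # crude sharing of QTL effects
trait = 2.0 * shared + rng.normal(0, 1, (n_pairs, 2))

# Original Haseman-Elston: regress squared sib-pair difference on IBD sharing.
y = (trait[:, 0] - trait[:, 1]) ** 2
X = np.column_stack([np.ones(n_pairs), pi_ibd])
(beta0, beta1), *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"slope = {beta1:.2f}  (negative slope suggests linkage)")
```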
Evaluation of the energy efficiency of enzyme fermentation by mechanistic modeling.
Albaek, Mads O; Gernaey, Krist V; Hansen, Morten S; Stocks, Stuart M
2012-04-01
Modeling biotechnological processes is key to obtaining increased productivity and efficiency. Particularly crucial to successful modeling of such systems is the coupling of the physical transport phenomena and the biological activity in one model. We have applied a model for the expression of cellulosic enzymes by the filamentous fungus Trichoderma reesei and found excellent agreement with experimental data. The most influential factor was demonstrated to be viscosity and its influence on mass transfer. Not surprisingly, the biological model is also shown to have high influence on the model prediction. At different rates of agitation and aeration as well as headspace pressure, we can predict the energy efficiency of oxygen transfer, a key process parameter for economical production of industrial enzymes. An inverse relationship between the productivity and energy efficiency of the process was found. This modeling approach can be used by manufacturers to evaluate the enzyme fermentation process for a range of different process conditions with regard to energy efficiency. Copyright © 2011 Wiley Periodicals, Inc.
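A generic sketch of the physics the abstract points at, with hypothetical coefficients and exponents (not the calibrated T. reesei model): a power-law kLa correlation in which viscosity penalises mass transfer, the resulting oxygen transfer rate, and an energy-efficiency figure expressed as kg O2 transferred per kWh.

```python
# Generic oxygen-transfer / energy-efficiency sketch (hypothetical coefficients).
P_over_V = 2.0      # specific power input, kW/m^3
u_gas = 0.02        # superficial gas velocity, m/s
mu = 0.05           # apparent broth viscosity, Pa*s (rises as the fungus grows)
C_star, C_L = 7e-3, 2e-3   # saturation and actual dissolved O2, kg/m^3

# Power-law kLa correlation; viscosity lowers mass transfer (exponents assumed).
kla = 0.02 * P_over_V ** 0.5 * u_gas ** 0.4 * mu ** -0.4     # 1/s
OTR = kla * (C_star - C_L)                                   # kg O2 / m^3 / s

# Energy efficiency of oxygen transfer: kg O2 transferred per kWh of agitation power.
kg_O2_per_kWh = OTR * 3600.0 / P_over_V
print(f"kLa = {kla:.3f} 1/s, OTR = {OTR*3600:.3f} kg O2/m^3/h, "
      f"efficiency = {kg_O2_per_kWh:.3f} kg O2/kWh")
```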
Ethylene dynamics in the CELSS biomass production chamber
NASA Technical Reports Server (NTRS)
Rakow, Allen L.
1994-01-01
A material balance model for ethylene was developed and applied retrospectively to data obtained in the Biomass Production Chamber of CELSS in order to calculate true plant production rates of ethylene. Four crops were analyzed: wheat, lettuce, soybean, and potato. The model represents an effort to account for each and every source and sink for ethylene in the system. The major source of ethylene is the plant biomass and the major sink is leakage to the surroundings. The results, expressed in units of ppd/day, were converted to nl of ethylene per gram of plant dry mass per hour and compare favorably with recent glasshouse to belljar experiments.
NASA Astrophysics Data System (ADS)
Claverie, M.; Franch, B.; Vermote, E.; Becker-Reshef, I.; Justice, C. O.
2015-12-01
Wheat is one of the key cereal crops grown worldwide. Thus, accurate and timely forecasts of its production are critical for informing agricultural policies and investments, as well as increasing market efficiency and stability. Becker-Reshef et al. (2010) used an empirical generalized model for forecasting winter wheat production using combined BRDF-corrected daily surface reflectance from the Moderate Resolution Imaging Spectroradiometer (MODIS) Climate Modeling Grid (CMG) with detailed official crop statistics and crop type masks. It is based on the relationship between the Normalized Difference Vegetation Index (NDVI) at the peak of the growing season, percent wheat within the CMG pixel, and the final yields. This method predicts the yield approximately one month to six weeks prior to harvest. Recently, Franch et al. (2015) included Growing Degree Day (GDD) information extracted from NCEP/NCAR reanalysis data in order to improve the winter wheat production forecast by advancing the forecasts to between a month and a month and a half prior to peak NDVI (i.e. 1-2.5 months prior to harvest), while conserving the accuracy of the original model. In this study, we apply these methods to historical data from the Advanced Very High Resolution Radiometer (AVHRR). We apply both the original and the modified model to the United States of America from 1990 to 2014 and inter-compare the AVHRR results to MODIS from 2000 to 2014.
NASA Astrophysics Data System (ADS)
Liu, Qiong; Wang, Wen-xi; Zhu, Ke-ren; Zhang, Chao-yong; Rao, Yun-qing
2014-11-01
Mixed-model assembly line sequencing is significant in reducing the production time and overall cost of production. To improve production efficiency, a mathematical model aiming simultaneously to minimize overtime, idle time and total set-up costs is developed. To obtain high-quality and stable solutions, an advanced scatter search approach is proposed. In the proposed algorithm, a new diversification generation method based on a genetic algorithm is presented to generate a set of potentially diverse and high-quality initial solutions. Many methods, including reference set update, subset generation, solution combination and improvement methods, are designed to maintain the diversification of populations and to obtain high-quality ideal solutions. The proposed model and algorithm are applied and validated in a case company. The results indicate that the proposed advanced scatter search approach is significant for mixed-model assembly line sequencing in this company.
Using fuzzy rule-based knowledge model for optimum plating conditions search
NASA Astrophysics Data System (ADS)
Solovjev, D. S.; Solovjeva, I. A.; Litovka, Yu V.; Arzamastsev, A. A.; Glazkov, V. P.; L’vov, A. A.
2018-03-01
The paper discusses existing approaches to plating process modeling aimed at reducing the unevenness of coating thickness over the plated surface. However, these approaches do not take into account the experience, knowledge, and intuition of the decision-makers when searching for the optimal conditions of the electroplating process. An original approach to searching for the optimal conditions for applying electroplated coatings is proposed; it uses a rule-based knowledge model and allows one to reduce the uneven product thickness distribution. The block diagrams of a conventional control system for a galvanic process as well as of a system based on the production model of knowledge are considered. It is shown that the fuzzy production model of knowledge in the control system makes it possible to obtain galvanic coatings of a given thickness unevenness with a high degree of adequacy to the experimental data. The described experimental results confirm the theoretical conclusions.
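An illustrative Mamdani-style sketch of a fuzzy rule-based recommendation, written in plain numpy; the input variable, membership breakpoints, and the two rules are invented for illustration and are not the authors' knowledge base.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Universe of the output variable: current-density correction, A/dm^2 (hypothetical).
u = np.linspace(-1.0, 1.0, 201)

def recommend_correction(distance_cm):
    # Input memberships (hypothetical breakpoints): "near" and "far" from the anode.
    near = trimf(distance_cm, -10.0, 0.0, 10.0)
    far = trimf(distance_cm, 5.0, 15.0, 25.0)

    # Rule 1: IF near THEN decrease current density (output "negative").
    # Rule 2: IF far  THEN increase current density (output "positive").
    out_negative = np.minimum(near, trimf(u, -1.0, -0.5, 0.0))
    out_positive = np.minimum(far, trimf(u, 0.0, 0.5, 1.0))

    aggregated = np.maximum(out_negative, out_positive)
    if aggregated.sum() == 0.0:
        return 0.0
    return float((u * aggregated).sum() / aggregated.sum())   # centroid defuzzification

for d in (2.0, 8.0, 14.0):
    print(f"distance {d:4.1f} cm -> correction {recommend_correction(d):+.2f} A/dm^2")
```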
Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P
2013-05-05
Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which could be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a bench-top laboratory analytical method and usually not implemented in the production process. Concerning application in the production process, many scientific approaches stop at the feasibility-study stage and never make the step to production-scale process applications. The present work focuses on the scale-up of an active coating process, a step of highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing could be shown. Finally, this method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
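The calibration step can be sketched with scikit-learn's PLSRegression on synthetic stand-ins for the in-line Raman spectra; the spectra, band shape, and the split into calibration and validation sets below are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)

# Placeholder Raman spectra: 200 in-line measurements x 500 wavenumber channels.
# The coated API amount grows during the spray run; an API band grows with it.
n_obs, n_channels = 200, 500
api_amount = np.linspace(0.0, 10.0, n_obs)                      # mg API coated per tablet
band = np.exp(-0.5 * ((np.arange(n_channels) - 250) / 15.0) ** 2)
spectra = np.outer(api_amount, band) + rng.normal(0, 0.3, (n_obs, n_channels))

# Calibrate on every other spectrum, validate on the rest.
pls = PLSRegression(n_components=3).fit(spectra[::2], api_amount[::2])
pred = pls.predict(spectra[1::2]).ravel()
rmse = np.sqrt(np.mean((pred - api_amount[1::2]) ** 2))
print(f"RMSE of prediction: {rmse:.3f} mg")
```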
Mathematical models of cell factories: moving towards the core of industrial biotechnology.
Cvijovic, Marija; Bordel, Sergio; Nielsen, Jens
2011-09-01
Industrial biotechnology involves the utilization of cell factories for the production of fuels and chemicals. Traditionally, the development of highly productive microbial strains has relied on random mutagenesis and screening. The development of predictive mathematical models provides a new paradigm for the rational design of cell factories. Instead of selecting among a set of strains resulting from random mutagenesis, mathematical models allow the researchers to predict in silico the outcomes of different genetic manipulations and engineer new strains by performing gene deletions or additions leading to a higher productivity of the desired chemicals. In this review we aim to summarize the main modelling approaches of biological processes and illustrate the particular applications that they have found in the field of industrial microbiology. © 2010 The Authors. Journal compilation © 2010 Society for Applied Microbiology and Blackwell Publishing Ltd.
Statistical Method to Overcome Overfitting Issue in Rational Function Models
NASA Astrophysics Data System (ADS)
Alizadeh Moghaddam, S. H.; Mokhtarzade, M.; Alizadeh Naeini, A.; Alizadeh Moghaddam, S. A.
2017-09-01
Rational function models (RFMs) are among the most appealing models extensively applied in geometric correction of satellite images and map production. Overfitting is a common issue in the case of terrain-dependent RFMs that degrades the accuracy of RFM-derived geospatial products. This issue, resulting from the high number of RFM parameters, leads to ill-posedness of the RFMs. To tackle this problem, a fast and robust statistical approach is proposed in this study and compared to the Tikhonov regularization (TR) method, a frequently used solution to RFM overfitting. In the proposed method, a statistical test, namely a significance test, is applied to search for the RFM parameters that are resistant to the overfitting issue. The performance of the proposed method was evaluated on two real data sets of Cartosat-1 satellite images. The obtained results demonstrate the efficiency of the proposed method in terms of the achievable level of accuracy. This technique shows an improvement of 50-80% over TR.
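The contrast between the two routes can be illustrated on a small least-squares toy problem standing in for a terrain-dependent RFM: Tikhonov regularization shrinks all coefficients, while the significance-test route keeps only terms whose t-statistics exceed a threshold. The data, threshold, and number of candidate terms below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy stand-in for an overparameterized RFM: many candidate terms, few truly needed.
n, p = 40, 15
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + rng.normal(0, 0.2, n)

# Tikhonov regularization (ridge): shrink all parameters.
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Significance-test selection: fit OLS, keep terms with |t| above a threshold, refit.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_ols
sigma2 = resid @ resid / (n - p)
cov = sigma2 * np.linalg.inv(X.T @ X)
t_stats = beta_ols / np.sqrt(np.diag(cov))
keep = np.abs(t_stats) > 2.0                      # ~5% two-sided threshold (assumption)
beta_sig = np.zeros(p)
beta_sig[keep] = np.linalg.lstsq(X[:, keep], y, rcond=None)[0]

print("terms kept by significance test:", np.flatnonzero(keep))
print("ridge coefficients (first 5):", np.round(beta_ridge[:5], 2))
```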
Metabolomics for organic food authentication: Results from a long-term field study in carrots.
Cubero-Leon, Elena; De Rudder, Olivier; Maquet, Alain
2018-01-15
Increasing demand for organic products and their premium prices make them an attractive target for fraudulent malpractices. In this study, a large-scale comparative metabolomics approach was applied to investigate the effect of the agronomic production system on the metabolite composition of carrots and to build statistical models for prediction purposes. Orthogonal projections to latent structures-discriminant analysis (OPLS-DA) was applied successfully to predict the origin of the agricultural system of the harvested carrots on the basis of features determined by liquid chromatography-mass spectrometry. When the training set used to build the OPLS-DA models contained samples representative of each harvest year, the models were able to classify unknown samples correctly (100% correct classification). If a harvest year was left out of the training sets and used for predictions, the correct classification rates achieved ranged from 76% to 100%. The results therefore highlight the potential of metabolomic fingerprinting for organic food authentication purposes. Copyright © 2017 The Author(s). Published by Elsevier Ltd.. All rights reserved.
Some Results of Weak Anticipative Concept Applied in Simulation Based Decision Support in Enterprise
NASA Astrophysics Data System (ADS)
Kljajić, Miroljub; Kofjač, Davorin; Kljajić Borštnar, Mirjana; Škraba, Andrej
2010-11-01
Simulation models are used for decision support and learning in enterprises and in schools. Cases of successful applications demonstrate the usefulness of weak anticipative information. Job shop scheduling with a makespan criterion presents a real case of customized flexible furniture production optimization; a genetic algorithm for job shop scheduling optimization is presented. Simulation-based inventory control describes inventory optimization for products with stochastic lead time and demand; dynamic programming and fuzzy control algorithms reduce the total cost without producing stock-outs in most cases. The value of decision-making information based on simulation is also discussed. The cases are discussed from the optimization, modeling and learning points of view.
Development and application of an acceptance testing model
NASA Technical Reports Server (NTRS)
Pendley, Rex D.; Noonan, Caroline H.; Hall, Kenneth R.
1992-01-01
The process of acceptance testing large software systems for NASA has been analyzed, and an empirical planning model of the process constructed. This model gives managers accurate predictions of the staffing needed, the productivity of a test team, and the rate at which the system will pass. Applying the model to a new system shows a high level of agreement between the model and actual performance. The model also gives managers an objective measure of process improvement.
Applying an Information Problem-Solving Model to Academic Reference Work: Findings and Implications.
ERIC Educational Resources Information Center
Cottrell, Janet R.; Eisenberg, Michael B.
2001-01-01
Examines the usefulness of the Eisenberg-Berkowitz Information Problem-Solving model as a categorization for academic reference encounters. Major trends in the data include a high proportion of questions about location and access of sources, lack of synthesis or production activities, and consistent presence of system problems that impede the…
NASA Astrophysics Data System (ADS)
Zhong, H.; Sun, L.; Tian, Z.; Liang, Z.; Fischer, G.
2014-12-01
China is one of the most populous and fastest-developing countries and faces great pressure on grain production and food security. Multi-cropping systems are widely applied in China to fully utilize agro-climatic resources and increase land productivity. As heat resources keep improving under climate warming, multi-cropping systems will also shift northward and benefit crop production. However, water shortage in the North China Plain will constrain the adoption of new multi-cropping systems. The effectiveness of multi-cropping system adaptation to climate change will greatly depend on future hydrological change and agricultural water management, so it is necessary to quantitatively express the water demand of different multi-cropping systems under climate change. In this paper, we propose an integrated climate-cropping system-crop adaptation framework and specifically focus on: 1) precipitation and hydrological change under future climate change in China; 2) the best multi-cropping system and corresponding crop rotation sequence, and water demand under future agro-climatic resources; 3) attainable crop production under water constraints; and 4) future water management. To obtain climate projections and precipitation distributions, a global climate change scenario from HADCAM3 is downscaled with a regional climate model (PRECIS), and historical climate data (1960-1990) were interpolated from more than 700 meteorological observation stations. The regional Agro-ecological Zone (AEZ) model is applied to simulate the best multi-cropping system and crop rotation sequence under the projected climate change scenario. Finally, we use the site process-based DSSAT model to estimate attainable crop production and the water deficiency. Our findings indicate that annual land productivity may increase and China can benefit from climate change if new multi-cropping systems are adopted. This study provides a macro-scale view of agricultural adaptation and gives suggestions for national agricultural adaptation strategy decisions.
Safari, Parviz; Danyali, Syyedeh Fatemeh; Rahimi, Mehdi
2018-06-02
Drought is the main abiotic stress seriously influencing wheat production. Information about the inheritance of drought tolerance is necessary to determine the most appropriate strategy to develop tolerant cultivars and populations. In this study, generation means analysis to identify the genetic effects controlling grain yield inheritance under water-deficit and normal conditions was considered as a model selection problem in a Bayesian framework. Stochastic search variable selection (SSVS) was applied to identify the most important genetic effects and the best-fitted models using different generations obtained from two crosses under two water regimes in two growing seasons. The SSVS is used to evaluate the effect of each variable on the dependent variable via posterior variable inclusion probabilities. The model with the highest posterior probability is selected as the best model. In this study, grain yield was controlled by the main effects (additive and non-additive) and epistatic effects. The results demonstrate that breeding methods such as recurrent selection followed by the pedigree method, as well as hybrid production, can be useful to improve grain yield.
Real‐time monitoring and control of the load phase of a protein A capture step
Rüdt, Matthias; Brestrich, Nina; Rolinger, Laura
2016-01-01
ABSTRACT The load phase in preparative Protein A capture steps is commonly not controlled in real‐time. The load volume is generally based on an offline quantification of the monoclonal antibody (mAb) prior to loading and on a conservative column capacity determined by resin‐life time studies. While this results in a reduced productivity in batch mode, the bottleneck of suitable real‐time analytics has to be overcome in order to enable continuous mAb purification. In this study, Partial Least Squares Regression (PLS) modeling on UV/Vis absorption spectra was applied to quantify mAb in the effluent of a Protein A capture step during the load phase. A PLS model based on several breakthrough curves with variable mAb titers in the HCCF was successfully calibrated. The PLS model predicted the mAb concentrations in the effluent of a validation experiment with a root mean square error (RMSE) of 0.06 mg/mL. The information was applied to automatically terminate the load phase, when a product breakthrough of 1.5 mg/mL was reached. In a second part of the study, the sensitivity of the method was further increased by only considering small mAb concentrations in the calibration and by subtracting an impurity background signal. The resulting PLS model exhibited a RMSE of prediction of 0.01 mg/mL and was successfully applied to terminate the load phase, when a product breakthrough of 0.15 mg/mL was achieved. The proposed method has hence potential for the real‐time monitoring and control of capture steps at large scale production. This might enhance the resin capacity utilization, eliminate time‐consuming offline analytics, and contribute to the realization of continuous processing. Biotechnol. Bioeng. 2017;114: 368–373. © 2016 The Authors. Biotechnology and Bioengineering published by Wiley Periodicals, Inc. PMID:27543789
Learning to apply models of materials while explaining their properties
NASA Astrophysics Data System (ADS)
Karpin, Tiia; Juuti, Kalle; Lavonen, Jari
2014-09-01
Background: Applying structural models is important to chemistry education at the upper secondary level, but it is considered one of the most difficult topics to learn. Purpose: This study analyses to what extent, in the designed lessons, students learned to apply structural models in explaining the properties and behaviours of various materials. Sample: The experimental group comprised 27 Finnish upper secondary school students and the control group included 18 students from the same school. Design and methods: In a quasi-experimental setting, students were guided through predict, observe, explain activities in four practical work situations. It was intended that the structural models would encourage students to learn how to identify and apply appropriate models when predicting and explaining situations. The lessons, organised over a one-week period, began with a teacher's demonstration and continued with student experiments in which they described the properties and behaviours of six household products representing three different materials. Results: Most students in the experimental group learned to apply the models correctly, as demonstrated by post-test scores that were significantly higher than pre-test scores. The control group showed no significant difference between pre- and post-test scores. Conclusions: The findings indicate that the intervention, in which students engage in predict, observe, explain activities while several materials and models are confronted at the same time, had a positive effect on learning outcomes.
Large-scale Modeling of Nitrous Oxide Production: Issues of Representing Spatial Heterogeneity
NASA Astrophysics Data System (ADS)
Morris, C. K.; Knighton, J.
2017-12-01
Nitrous oxide is produced from the biological processes of nitrification and denitrification in terrestrial environments and contributes to the greenhouse effect that warms Earth's climate. Large-scale modeling can be used to determine how global rates of nitrous oxide production and consumption will shift under future climates. However, accurate modeling of nitrification and denitrification is made difficult by highly parameterized, nonlinear equations. Here we show that how spatial heterogeneity in inputs, specifically soil moisture, is represented determines the accuracy of estimates of average nitrous oxide production in soils. We demonstrate that when soil moisture is averaged over a spatially heterogeneous surface, net nitrous oxide production is under-predicted. We apply this general result in a test of a widely used global land surface model, the Community Land Model v4.5. The challenges presented by nonlinear controls on nitrous oxide are highlighted here to provide a wider context for the problem of extraordinary denitrification losses in CLM. We hope that these findings will inform future researchers on the possibilities for model improvement of the global nitrogen cycle.
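The under-prediction can be reproduced with any convex response of N2O production to soil moisture (Jensen's inequality); the exponential response and beta-distributed sub-grid moisture below are hypothetical stand-ins, not CLM4.5 parameterizations.

```python
import numpy as np

rng = np.random.default_rng(5)

def n2o_production(theta):
    """Hypothetical convex response of N2O production to soil moisture (0-1)."""
    return np.exp(6.0 * theta) / np.exp(6.0)   # normalised, strongly nonlinear

# Spatially heterogeneous soil moisture within one model grid cell (assumption).
theta_subgrid = rng.beta(2.0, 2.0, size=100_000)

mean_of_fluxes = n2o_production(theta_subgrid).mean()   # resolve heterogeneity, then average
flux_of_mean = n2o_production(theta_subgrid.mean())     # average moisture first (grid-cell mean)

print(f"mean of sub-grid fluxes : {mean_of_fluxes:.4f}")
print(f"flux at mean moisture   : {flux_of_mean:.4f}  (under-prediction from averaging)")
```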
ERIC Educational Resources Information Center
Huempfner, Lisa; Kopf, Dennis A.
2017-01-01
Higher education administrators are often faced with difficult choices in allocating limited resources for the creation of new programs. The purpose of this article is to explore the suitability of a new product, an integrated business Spanish major, by applying stakeholder marketing. In so doing, it provides a framework for the application of…
Measuring water and sediment discharge from a road plot with a settling basin and tipping bucket
Thomas A. Black; Charles H. Luce
2013-01-01
A simple empirical method quantifies water and sediment production from a forest road surface, and is well suited for calibration and validation of road sediment models. To apply this quantitative method, the hydrologic technician installs bordered plots on existing typical road segments and measures coarse sediment production in a settling tank. When a tipping bucket...
Yuan, Heyang; Lu, Yaobin; Abu-Reesh, Ibrahim M; He, Zhen
2015-01-01
While microbial electrolysis cells (MECs) can simultaneously produce bioelectrochemical hydrogen and treat wastewater, they consume considerable energy to overcome the unfavorable thermodynamics, which is not sustainable or economically feasible in practical applications. This study presents a proof-of-concept system in which hydrogen can be produced in an MEC powered by theoretically predicted energy from pressure-retarded osmosis (PRO). The system consists of a PRO unit that extracts high-quality water and generates electricity from water osmosis, and an MEC for organic removal and hydrogen production. The feasibility of the system was demonstrated using simulated PRO performance (in terms of energy production and effluent quality) and experimental MEC results (e.g., hydrogen production and organic removal). The PRO and MEC models were proven to be valid. The model predicted that the PRO unit could produce 485 mL of clean water and 579 J of energy with 600 mL of draw solution (0.8 M of NaCl). The predicted amount of energy was applied to the MEC by a power supply, which drove the MEC to remove 93.7 % of the organic compounds and produce 32.8 mL of H2 experimentally. Increasing the PRO influent volume and draw concentration could produce more energy for the MEC operation, and correspondingly increase the MEC hydraulic retention time (HRT) and total hydrogen production. The models predicted that at an external voltage of 0.9 V, the MEC energy consumption reached the maximum PRO energy production. With a higher external voltage, the MEC energy consumption would exceed the PRO energy production, leading to negative effects on both organic removal and hydrogen production. The PRO-MEC system holds great promise in addressing the water-energy nexus through organic removal, hydrogen production, and water recovery: (1) the PRO unit can reduce the volume of wastewater and extract clean water; (2) the PRO effluents can be further treated by the MEC; and (3) the osmotic energy harvested from the PRO unit can be applied to the MEC for sustainable bioelectrochemical hydrogen production.
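As an order-of-magnitude check (not the authors' model), the electrical energy an MEC draws to generate a given hydrogen volume can be estimated from Faraday's law (two electrons per H2 molecule) at the applied voltage; the molar gas volume and the idealised cathodic recovery below are assumptions.

```python
# Order-of-magnitude check: electrical energy to produce a hydrogen volume in an MEC.
F = 96485.0          # C/mol, Faraday constant
V_H2_mL = 32.8       # hydrogen volume produced (value reported in the study)
V_molar = 24.0       # L/mol, approximate molar gas volume near 25 degC (assumption)
E_applied = 0.9      # V, upper-bound external voltage considered in the study
cathodic_recovery = 1.0   # fraction of current captured as H2 (idealised assumption)

n_H2 = (V_H2_mL / 1000.0) / V_molar          # mol H2
charge = 2.0 * F * n_H2 / cathodic_recovery  # C, two electrons per H2
energy_J = E_applied * charge                # J of electrical energy supplied

print(f"{n_H2*1e3:.2f} mmol H2 -> {charge:.0f} C -> {energy_J:.0f} J at {E_applied} V")
# Compare with the ~579 J predicted to be available from the PRO unit.
```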
Generalised additive modelling approach to the fermentation process of glutamate.
Liu, Chun-Bo; Li, Yun; Pan, Feng; Shi, Zhong-Ping
2011-03-01
In this work, generalised additive models (GAMs) were used for the first time to model the fermentation of glutamate (Glu). It was found that three fermentation parameters, fermentation time (T), dissolved oxygen (DO) and oxygen uptake rate (OUR), could capture 97% of the variance in Glu production during the fermentation process through a GAM calibrated using online data from 15 fermentation experiments. This model was applied to investigate the individual and combined effects of T, DO and OUR on the production of Glu. Conditions to optimize the fermentation process were proposed based on a simulation study with this model. Results suggested that the production of Glu can reach a high level by controlling the levels of DO and OUR to the proposed optimization conditions during the fermentation process. The GAM approach therefore provides an alternative way to model and optimize the fermentation process of Glu. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.
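A minimal sketch of the modelling step, assuming the pygam package and simulated stand-ins for the online fermentation records: one smooth term per fermentation parameter is fitted to the Glu concentration.

```python
import numpy as np
from pygam import LinearGAM, s

rng = np.random.default_rng(6)

# Simulated stand-ins for online records: time (h), DO (%), OUR (mmol/L/h) -> Glu (g/L).
n = 600
T = rng.uniform(0, 40, n)
DO = rng.uniform(5, 50, n)
OUR = rng.uniform(10, 120, n)
glu = 1.5 * T + 20 * np.log1p(OUR / 20.0) - 0.01 * (DO - 25) ** 2 + rng.normal(0, 2, n)

# Generalised additive model: one smooth term per fermentation parameter.
X = np.column_stack([T, DO, OUR])
gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, glu)

# Explained variance of the fitted additive model.
pred = gam.predict(X)
r2 = 1.0 - np.sum((glu - pred) ** 2) / np.sum((glu - glu.mean()) ** 2)
print(f"variance captured by T, DO and OUR smooths: {r2:.3f}")
```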
Schutyser, M A I; Straatsma, J; Keijzer, P M; Verschueren, M; De Jong, P
2008-11-30
In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. It can be applied to existing products and processes but also to reduce time to market for new products. Important aspects of the tool are its user-friendliness and its specifications customised to the needs of small dairy companies. To challenge the web-based tool it was applied for optimisation of thermal treatments in 16 dairy companies producing yoghurt, fresh cream, chocolate milk and cheese. Optimisation with WebSim-MILQ resulted in concrete improvements with respect to risk of microbial contamination, cheese yield, fouling and production costs. In this paper we illustrate the use of WebSim-MILQ for optimisation of a cheese milk pasteurisation process in which we could increase the cheese yield (1 extra cheese for each 100 cheeses produced from the same amount of milk) and reduce the risk of contamination of pasteurised cheese milk with thermoresistant streptococci from critical to negligible. In another case we demonstrate the advantage of changing from an indirect to a direct heating method for a UHT process, resulting in 80% less fouling while improving product quality and maintaining product safety.
Assessing the Linguistic Productivity of Unsupervised Deep Neural Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, Lawrence A.; Hodas, Nathan O.
Increasingly, cognitive scientists have demonstrated interest in applying tools from deep learning. One use for deep learning is in language acquisition where it is useful to know if a linguistic phenomenon can be learned through domain-general means. To assess whether unsupervised deep learning is appropriate, we first pose a smaller question: Can unsupervised neural networks apply linguistic rules productively, using them in novel situations? We draw from the literature on determiner/noun productivity by training an unsupervised autoencoder network and measuring its ability to combine nouns with determiners. Our simple autoencoder creates combinations it has not previously encountered, displaying a degree of overlap similar to that of actual children. While this preliminary work does not provide conclusive evidence for productivity, it warrants further investigation with more complex models. Further, this work helps lay the foundations for future collaboration between the deep learning and cognitive science communities.
Macroergonomics' contribution to the effectiveness of collaborative supply chains.
Herrera, Sandra Mejias; Huatuco, Luisa Huaccho
2012-01-01
This article presents a conceptual model that combines macroergonomics and supply chain management. The authors combine their expertise on these individual topics, building on their previous research. The argument of the paper is that human factors are key to achieving effective supplier-customer collaboration. A conceptual model is presented, and its elements and their interactions are explained. The Content-Context-Process framework is applied as the point of departure for this model. The macroergonomics aspects considered are: a systemic approach, participatory ergonomics, formation of ergonomics teams and evaluation of ergonomics projects. The expected outcomes are: (a) improvement of production and productivity levels, (b) improvement of product quality, (c) reduction of absenteeism, (d) improvement in the quality of work life (from the employees' perspective), and (e) increase in the employees' contribution rate of ideas for improvement. A case study was carried out at a vitroplant production organisation, incorporating environmental aspects to obtain sustainable benefits.
Conception of Self-Construction Production Scheduling System
NASA Astrophysics Data System (ADS)
Xue, Hai; Zhang, Xuerui; Shimizu, Yasuhiro; Fujimura, Shigeru
With the high-speed innovation of information technology, many production scheduling systems have been developed. However, a lot of customization according to the individual production environment is required, and a large investment for development and maintenance is then indispensable. Therefore, the direction in which scheduling systems are constructed should be changed. The final objective of this research is to develop a system that builds itself by automatically extracting scheduling techniques from daily production scheduling work, so that the required investment is reduced. This extraction mechanism should be applicable to various production processes to ensure interoperability. Using the master information extracted by the system, production scheduling operators can be supported to carry out the production scheduling work easily and accurately, without any restriction on scheduling operations. By installing this extraction mechanism, it is easy to introduce a scheduling system without a lot of expense for customization. In this paper, a model for expressing a scheduling problem is first proposed. Then the guideline for extracting the scheduling information and using the extracted information is shown, and some applied functions are also proposed based on it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olea, Ricardo A., E-mail: olea@usgs.gov; Cook, Troy A.; Coleman, James L.
2010-12-15
The Greater Natural Buttes tight natural gas field is an unconventional (continuous) accumulation in the Uinta Basin, Utah, that began production in the early 1950s from the Upper Cretaceous Mesaverde Group. Three years later, production was extended to the Eocene Wasatch Formation. With the exclusion of 1100 non-productive ('dry') wells, we estimate that the final recovery from the 2500 producing wells existing in 2007 will be about 1.7 trillion standard cubic feet (TSCF) (48.2 billion cubic meters (BCM)). The use of estimated ultimate recovery (EUR) per well is common in assessments of unconventional resources, and it is one of the main sources of information to forecast undiscovered resources. Each calculated recovery value has an associated drainage area that generally varies from well to well and that can be mathematically subdivided into elemental subareas of constant size and shape called cells. Recovery per 5-acre cell at Greater Natural Buttes shows spatial correlation; hence, statistical approaches that ignore this correlation when inferring EUR values for untested cells do not take full advantage of all the information contained in the data. More critically, the resulting models do not match the style of spatial EUR fluctuations observed in nature. This study takes a new approach by applying spatial statistics to model geographical variation of cell EUR, taking into account spatial correlation and the influence of fractures. We applied sequential indicator simulation to model non-productive cells, while spatial mapping of cell EUR was obtained by applying sequential Gaussian simulation to provide multiple versions of reality (realizations) having equal chances of being the correct model. For each realization, summation of EUR in cells not drained by the existing wells allowed preparation of a stochastic prediction of undiscovered resources, which range between 2.6 and 3.4 TSCF (73.6 and 96.3 BCM) with a mean of 2.9 TSCF (82.1 BCM) for Greater Natural Buttes. A second approach illustrates the application of multiple-point simulation to assess a hypothetical frontier area for which there is no production information but which is regarded as being similar to Greater Natural Buttes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vimmerstedt, Laura; Newes, Emily
The Federal Aviation Administration promotes the development of an aviation biofuel market, and has pursued a goal of 1 billion gallons of production annually by 2018. Although this goal is unlikely to be met, this analysis applies the Biomass Scenario Model to explore conditions affecting market growth, and identifies policy incentive and oil price conditions under which this level of production might occur, and by what year. Numerous combinations of conditions that are more favorable than current conditions can reach the goal before 2030.
Romero-Fernández, Ma Mar; Royo-Bordonada, Miguel Ángel; Rodríguez-Artalejo, Fernando
2013-07-01
To evaluate the nutritional quality of products advertised on television (TV) during children's viewing time in Spain, applying the UK nutrient profile model (UKNPM). We recorded 80 h of four general TV station broadcasts during children's time in May and June 2008, and identified all advertisements for foods and beverages. Nutritional information was obtained from the product labels or websites and from food composition tables. Each product was classified as healthy (e.g. gazpacho, a vegetable juice) or less healthy (e.g. potato crisp snacks) according to the UKNPM criteria. Four free-of-charge TV channels in Spain: two national channels and two regional ones. TV commercials of food and beverages. A total of 486 commercials were broadcast for ninety-six different products, with a mean frequency of 5.1 advertisements per product. Some 61.5% of the ninety-six products were less healthy, and the percentage was higher for foods (74.1%). All (100%) of the breakfast cereals and 80% of the non-alcoholic drinks and soft drinks were less healthy. Of the total sample of commercials, 59.7% were for less healthy products, a percentage that rose to 71.2% during children's reinforced protection viewing time. Over half the commercials were for less healthy products, a proportion that rose to over two-thirds during the hours of special protection for children. This suggests that applying the UKNPM to regulate food advertising during this slot would entail the withdrawal of most food commercials in Spain. TV advertising of products with low nutritional quality should be restricted.
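A highly simplified sketch of the UKNPM scoring logic follows; the real model uses banded point tables per 100 g and a protein-capping rule that are omitted here, and the thresholds below are quoted from memory and should be checked against the official UKNPM guidance.

```python
# 'A' points cover energy, saturated fat, total sugars and sodium; 'C' points cover
# fruit/veg/nuts, fibre and protein. A higher net score means a less healthy product.
def points(value, cutoffs):
    """Points = number of band cutoffs exceeded by the per-100 g value (band tables omitted)."""
    return sum(value > c for c in cutoffs)

def classify(a_points, c_points, is_drink=False):
    score = a_points - c_points
    threshold = 1 if is_drink else 4      # commonly cited UKNPM thresholds (assumption)
    return "less healthy" if score >= threshold else "healthy"

print(classify(a_points=14, c_points=2))   # -> 'less healthy'
```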
Increasing operating room productivity by duration categories and a newsvendor model.
Lehtonen, Juha-Matti; Torkki, Paulus; Peltokorpi, Antti; Moilanen, Teemu
2013-01-01
Previous studies approach surgery scheduling mainly from the mathematical modeling perspective, which is often hard to apply in a practical environment. The aim of this study is to develop a practical scheduling system that brings the advantages of both surgery categorization and the newsvendor model to surgery scheduling. The research was carried out in a Finnish orthopaedic specialist centre that performs only joint replacement surgery. Four surgery categorization scenarios were defined and their productivity analyzed by simulation and the newsvendor model. Detailed analyses of surgery durations and the use of more accurate case categories and their combinations in scheduling improved OR productivity by 11.3 percent compared to the base case. Planning for one OR team to work longer led to a remarkable decrease in scheduling inefficiency. In surgical services, productivity and cost-efficiency can be improved by utilizing historical data in case scheduling and by increasing flexibility in personnel management. The study increases the understanding of practical scheduling methods used to improve efficiency in surgical services.
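A short sketch of the newsvendor logic in this setting: reserve enough session time that the probability of overrun matches the critical ratio of overtime and idle-time costs. The duration statistics and cost weights below are hypothetical.

```python
from scipy.stats import norm

def planned_session_length(mean_total, sd_total, overtime_cost, idle_cost):
    # Newsvendor critical fractile: F(Q*) = c_overtime / (c_overtime + c_idle)
    critical_ratio = overtime_cost / (overtime_cost + idle_cost)
    return norm.ppf(critical_ratio, loc=mean_total, scale=sd_total)

# e.g. three joint replacements from one duration category scheduled in one room
length = planned_session_length(mean_total=3 * 95.0,           # minutes
                                sd_total=(3 ** 0.5) * 20.0,
                                overtime_cost=3.0, idle_cost=1.0)
print(round(length))   # planned minutes, above the mean because overtime costs more
```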
NASA Astrophysics Data System (ADS)
Shair, Syazreen Niza; Yusof, Aida Yuzi; Asmuni, Nurin Haniah
2017-05-01
Coherent mortality forecasting models have recently received increasing attention, particularly in their application to sub-populations. The advantage of coherent models over independent models is the ability to forecast non-divergent mortality for two or more sub-populations. One of the coherent models was recently developed by [1] and is known as the product-ratio model. This model is an extension of the functional independent model from [2]. The product-ratio model has been applied in a developed country, Australia [1], and has been extended to a developing nation, Malaysia [3]. While [3] accounted for coherency of mortality rates between gender and ethnic groups, the coherency between states in Malaysia has never been explored. This paper forecasts the mortality rates of Malaysian sub-populations according to states using the product-ratio coherent model and its independent version, the functional independent model. The forecast accuracies of the two models are evaluated using out-of-sample error measurements: the mean absolute forecast error (MAFE) for age-specific death rates and the mean forecast error (MFE) for life expectancy at birth. We employ Malaysian mortality time series data from 1991 to 2014, segregated by age, gender and state.
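A toy numerical illustration of the product-ratio decomposition for two states is given below; the actual model works on smoothed functional mortality curves over all ages, and all rates here are hypothetical.

```python
import numpy as np

rates = np.array([[0.0120, 0.0113, 0.0107, 0.0102],    # state A, one age group, 4 years
                  [0.0150, 0.0146, 0.0141, 0.0138]])   # state B

product = np.exp(np.log(rates).mean(axis=0))   # geometric mean across sub-populations
ratios = rates / product                        # state-specific ratios

# forecast: random walk with drift for the product, ratios held stationary,
# which keeps the state forecasts from diverging (coherence)
drift = np.diff(np.log(product)).mean()
product_fcst = product[-1] * np.exp(drift * np.arange(1, 6))
coherent_fcst = ratios[:, -1:] * product_fcst   # shape (2 states, 5 horizon years)
```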
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bandaru, Varaprasad; Izaurralde, Roberto C.; Manowitz, David H.
2013-12-01
The use of marginal lands (MLs) for biofuel production has been contemplated as a promising solution for meeting biofuel demands. However, there have been concerns with the spatial location of MLs, their inherent biofuel potential, and the possible environmental consequences of cultivating energy crops. Here, we developed a new quantitative approach that integrates high-resolution land cover and land productivity maps and uses conditional probability density functions for analyzing land use patterns as a function of land productivity to classify agricultural lands. We subsequently applied this method to determine available productive croplands (P-CLs) and non-crop marginal lands (NC-MLs) in a nine-county region of southern Michigan. Furthermore, a Spatially Explicit Integrated Modeling Framework (SEIMF) using EPIC (Environmental Policy Integrated Climate) was used to understand the net energy (NE) and soil organic carbon (SOC) implications of cultivating different annual and perennial production systems.
NASA Astrophysics Data System (ADS)
Tohidnia, S.; Tohidi, G.
2018-02-01
The current paper develops three different ways to measure multi-period global cost efficiency for homogeneous networks of processes when the prices of exogenous inputs are known at all time periods. A multi-period network data envelopment analysis model is presented to measure the minimum cost of the network system based on the global production possibility set. We show that there is a relationship between the multi-period global cost efficiency of the network system, that of its subsystems, and that of its processes. The proposed model is applied to compute the global cost Malmquist productivity index for measuring the productivity change of the network system and of each of its processes between two time periods. This index is circular. Furthermore, we show that the productivity change of the network system can be defined as a weighted average of the process productivity changes. Finally, a numerical example is presented to illustrate the proposed approach.
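For orientation, a minimal single-period cost-efficiency DEA sketch (constant returns to scale) solved with scipy is shown below; the multi-period global model in the paper pools the technology over all periods, which this sketch does not do, and the data are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def cost_efficiency(X, Y, w, o):
    """X: K x m observed inputs, Y: K x s outputs, w: input prices, o: index of the unit."""
    K, m = X.shape
    n_out = Y.shape[1]
    c = np.concatenate([w, np.zeros(K)])                # minimise w'x over [x, lambda]
    A_in = np.hstack([-np.eye(m), X.T])                  # sum_k lam_k x_kj <= x_j
    A_out = np.hstack([np.zeros((n_out, m)), -Y.T])      # sum_k lam_k y_kr >= y_or
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                  bounds=[(0, None)] * (m + K), method="highs")
    return res.fun / float(w @ X[o])                     # minimum cost / observed cost

X = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0]])       # 3 units, 2 inputs
Y = np.array([[1.0], [1.0], [1.0]])                       # 1 output
print(cost_efficiency(X, Y, w=np.array([1.0, 1.0]), o=2))  # ~0.56 for the third unit
```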
Population Modeling Approach to Optimize Crop Harvest Strategy. The Case of Field Tomato.
Tran, Dinh T; Hertog, Maarten L A T M; Tran, Thi L H; Quyen, Nguyen T; Van de Poel, Bram; Mata, Clara I; Nicolaï, Bart M
2017-01-01
In this study, the aim is to develop a population model based approach to optimize fruit harvesting strategies with regard to fruit quality and its derived economic value. This approach was applied to the case of tomato fruit harvesting under Vietnamese conditions. Fruit growth and development of tomato (cv. "Savior") was monitored in terms of fruit size and color during both the Vietnamese winter and summer growing seasons. A kinetic tomato fruit growth model was applied to quantify biological fruit-to-fruit variation in terms of their physiological maturation. This model was successfully calibrated. Finally, the model was extended to translate the fruit-to-fruit variation at harvest into the economic value of the harvested crop. It can be concluded that a model based approach to the optimization of harvest date and harvest frequency with regard to economic value of the crop as such is feasible. This approach allows growers to optimize their harvesting strategy by harvesting the crop at more uniform maturity stages meeting the stringent retail demands for homogeneous high quality product. The total farm profit would still depend on the impact a change in harvesting strategy might have on related expenditures. This model based harvest optimisation approach can be easily transferred to other fruit and vegetable crops improving homogeneity of the postharvest product streams.
Structure-reactivity modeling using mixture-based representation of chemical reactions.
Polishchuk, Pavel; Madzhidov, Timur; Gimadiev, Timur; Bodrov, Andrey; Nugmanov, Ramil; Varnek, Alexandre
2017-09-01
We describe a novel approach of reaction representation as a combination of two mixtures: a mixture of reactants and a mixture of products. In turn, each mixture can be encoded using an earlier reported approach involving simplex descriptors (SiRMS). The feature vector representing these two mixtures results from either concatenated product and reactant descriptors or the difference between descriptors of products and reactants. This reaction representation does not need an explicit labeling of a reaction center. A rigorous "product-out" cross-validation (CV) strategy has been suggested. Unlike the naïve "reaction-out" CV approach based on a random selection of items, the proposed one provides a more realistic estimation of prediction accuracy for reactions resulting in novel products. The new methodology has been applied to model rate constants of E2 reactions. It has been demonstrated that the use of the fragment control domain applicability approach significantly increases the prediction accuracy of the models. The models obtained with the new "mixture" approach performed better than those requiring either explicit (Condensed Graph of Reaction) or implicit (reaction fingerprints) reaction center labeling.
The Application of Architecture Frameworks to Modelling Exploration Operations Costs
NASA Technical Reports Server (NTRS)
Shishko, Robert
2006-01-01
Developments in architectural frameworks and system-of-systems thinking have provided useful constructs for systems engineering. DoDAF concepts, language, and formalisms, in particular, provide a natural way of conceptualizing an operations cost model applicable to NASA's space exploration vision. Not all DoDAF products have meaning or apply to a DoDAF-inspired operations cost model, but this paper describes how such DoDAF concepts as nodes, systems, and operational activities relate to the development of a model to estimate exploration operations costs. The paper discusses the specific implementation for the Mission Operations Directorate (MOD) operational functions/activities currently being developed and presents an overview of how this powerful representation can apply to robotic space missions as well.
An effective model for ergonomic optimization applied to a new automotive assembly line
NASA Astrophysics Data System (ADS)
Duraccio, Vincenzo; Elia, Valerio; Forcina, Antonio
2016-06-01
An efficient ergonomic optimization can lead to a significant improvement in production performance and a considerable reduction of costs. In the present paper, a new model for ergonomic optimization is proposed. The new approach is based on the criteria defined by the National Institute for Occupational Safety and Health, adapted to Italian legislation. The proposed model provides ergonomic optimization by analyzing the ergonomic relations of manual work under correct conditions. The model includes a schematic and systematic analysis method of the operations, and identifies all possible ergonomic aspects to be evaluated. The proposed approach has been applied to an automotive assembly line, where the repeatability of operations makes optimization fundamental. The application clearly demonstrates the effectiveness of the new approach.
Jacques-Jamin, Carine; Jeanjean-Miquel, Corinne; Domergue, Anaïs; Bessou-Touya, Sandrine; Duplan, Hélène
2017-01-01
Information is lacking on the dermal penetration of topically applied formulations on in vitro skin models under conditions where the stratum corneum (SC) is damaged. Therefore, we have developed a standardized in vitro barrier-disrupted skin model using tape stripping. Different tape stripping conditions were evaluated using histology, transepidermal water loss, infrared densitometry, and caffeine absorption. The effects of tape stripping were comparable using pig and human skin. Optimized conditions were used to test the effect of SC damage and UV irradiation on the absorption of a UV filter combination present in a sunscreen. The bioavailability of the filters was extremely low regardless of the extent of skin damage, suggesting that bioavailability would not be increased if the consumer applied the sunscreen to sun-damaged skin. This standardized in vitro methodology using pig or human skin for damaged skin will add valuable information for the safety assessment of topically applied products. © 2017 S. Karger AG, Basel.
Vukić, Dajana V; Vukić, Vladimir R; Milanović, Spasenija D; Ilicić, Mirela D; Kanurić, Katarina G
2018-06-01
Three different fermented dairy products obtained with conventional and non-conventional starter cultures were investigated in this paper. Textural and rheological characteristics as well as chemical composition during 21 days of storage were analysed, and subsequent data processing was performed by principal component analysis. The analysis of the samples' flow behaviour was focused on their time-dependent properties. The parameters of the power-law model describing the flow behaviour of the samples depended on the starter culture used and on the day of storage. The power-law model was applied successfully to describe the flow of the fermented milk, which showed shear-thinning, non-Newtonian fluid behaviour.
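A minimal sketch of fitting the power-law (Ostwald-de Waele) model to flow-curve data follows; the shear-rate and shear-stress values are hypothetical, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

gamma_dot = np.array([1, 5, 10, 50, 100, 300], dtype=float)   # shear rate, 1/s
tau = np.array([0.9, 2.8, 4.3, 12.1, 18.0, 35.5])             # shear stress, Pa

def power_law(gamma_dot, K, n):
    # tau = K * gamma_dot**n : K = consistency coefficient, n = flow behaviour index
    return K * gamma_dot ** n

(K, n), _ = curve_fit(power_law, gamma_dot, tau, p0=(1.0, 0.5))
print(f"K = {K:.3f} Pa.s^n, n = {n:.3f}")   # n < 1 indicates shear-thinning behaviour
```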
Olsen, Kim Rose; Gyrd-Hansen, Dorte; Sørensen, Torben Højmark; Kristensen, Troels; Vedsted, Peter; Street, Andrew
2013-04-01
A shortage of general practitioners (GPs) and an increased political focus on primary care have reinforced the interest in efficiency analysis in the Danish primary care sector. This paper assesses the association between organisational factors of general practices and their production and efficiency. We assume that production and efficiency can be modelled using a behavioural production function. We apply the Battese and Coelli (Empir Econ 20:325-332, 1995) estimator to accomplish a decomposition of exogenous variables into those determining the production frontier and those determining the individual GP's distance to this frontier. Two different measures of practice output (number of office visits and total production) were applied and the results compared. The results indicate that nurses do not substitute for GPs in production. The production function exhibited constant returns to scale. The mean level of efficiency was between 0.79 and 0.84, and list size was the most important determinant of variation in efficiency levels. Nurses currently undertake tasks other than those of GPs, and larger practices do not lead to increased production per GP. However, a relative increase in list size increased efficiency. This indicates that organisational changes aiming to increase capacity in general practice should be carefully designed and tested.
Evaluation of globally available precipitation data products as input for water balance models
NASA Astrophysics Data System (ADS)
Lebrenz, H.; Bárdossy, A.
2009-04-01
The subject of this study is the evaluation of globally available precipitation data products, which are intended to be used as input variables for water balance models in ungauged basins. The selected data sources are (a) the Global Precipitation Climatology Centre (GPCC), (b) the Global Precipitation Climatology Project (GPCP) and (c) the Climate Research Unit (CRU), resulting in twelve globally available data products. The data products rely on different data bases, different derivation routines and varying resolutions in time and space. For validation purposes, the ground data from South Africa were screened for homogeneity and consistency by various tests, and outlier detection using multi-linear regression was performed. External drift kriging was subsequently applied to the ground data, and the resulting precipitation arrays were compared to the different products with respect to quantity and variance.
NASA Technical Reports Server (NTRS)
Stark, Michael; Hennessy, Joseph F. (Technical Monitor)
2002-01-01
My assertion is that not only are product lines a relevant research topic, but that the tools used by empirical software engineering researchers can address observed practical problems. Our experience at NASA has been that there are often externally proposed solutions available, but that we have had difficulties applying them in our particular context. We have also focused on return-on-investment issues when evaluating product lines, and while these are important, one cannot attain objective data on success or failure until several applications from a product family have been deployed. The use of the Quality Improvement Paradigm (QIP) can address these issues: (1) planning an adoption path from an organization's current state to a product line approach; (2) constructing a development process to fit the organization's adoption path; (3) evaluating product line development processes as the project is being developed. The QIP consists of the following six steps: (1) Characterize the project and its environment; (2) Set quantifiable goals for successful project performance; (3) Choose the appropriate process models, supporting methods, and tools for the project; (4) Execute the process, analyze interim results, and provide real-time feedback for corrective action; (5) Analyze the results of completed projects and recommend improvements; and (6) Package the lessons learned as updated and refined process models. A figure shows the QIP in detail. The iterative nature of the QIP supports an incremental development approach to product lines, and the project learning and feedback provide the necessary early evaluations.
Cornelisse, C J; Hermens, W T; Joe, M T; Duijndam, W A; van Duijn, P
1976-11-01
A numerical method was developed for computing the steady-state concentration gradient of a diffusible enzyme reaction product in a membrane-limited compartment of a simplified theoretical cell model. In cytochemical enzyme reactions proceeding according to the metal-capture principle, the local concentration of the primary reaction product is an important factor in the onset of the precipitation process and in the distribution of the final reaction product. The following variables were incorporated into the model: enzyme activity, substrate concentration, Km, diffusion coefficient of substrate and product, particle radius and cell radius. The method was applied to lysosomal acid phosphatase. Numerical values for the variables were estimated from experimental data in the literature. The results show that the calculated phosphate concentrations inside lysosomes are several orders of magnitude lower than the critical concentrations for efficient phosphate capture found in a previous experimental model study. Reasons for this apparent discrepancy are discussed.
Oncel, S; Sabankay, M
2012-10-01
This study focuses on a scale-up procedure considering two vital parameters, light energy and mixing, for microalgae cultivation, taking Chlamydomonas reinhardtii as the model microorganism. Applying the two-stage hydrogen production protocol to 1 L flat-type and 2.5 L tank-type photobioreactors, hydrogen production was investigated with constant light energy and mixing time. The conditions providing the shortest transfer time to an anaerobic culture (light energy, 2.96 kJ s(-1)m(-3); mixing time, 1 min) and the highest hydrogen production rate (light energy, 1.22 kJ s(-1)m(-3); mixing time, 2.5 min) were applied to a 5 L photobioreactor. The final hydrogen production for the 5 L system after 192 h was measured as 195 ± 10 mL, which is comparable with the other systems and provides good validation of the scale-up procedure. Copyright © 2012 Elsevier Ltd. All rights reserved.
Nitrous Oxide Production in a Granule-based Partial Nitritation Reactor: A Model-based Evaluation
NASA Astrophysics Data System (ADS)
Peng, Lai; Sun, Jing; Liu, Yiwen; Dai, Xiaohu; Ni, Bing-Jie
2017-04-01
Sustainable wastewater treatment has been attracting increasing attention over the past decades. However, the production of nitrous oxide (N2O), a potent GHG, from energy-efficient granule-based autotrophic nitrogen removal is largely unknown. This study applied a previously established N2O model, which incorporates two N2O production pathways of ammonia-oxidizing bacteria (AOB): AOB denitrification and hydroxylamine (NH2OH) oxidation. The two-pathway model was used to describe N2O production from a granule-based partial nitritation (PN) reactor and to provide insights into the N2O distribution inside granules. The model was evaluated by comparing simulation results with N2O monitoring profiles as well as isotopic measurement data from the PN reactor. The model demonstrated good predictive ability against N2O dynamics and provided useful information about the shift of N2O production pathways inside granules for the first time. The simulation results indicated that increases in oxygen concentration and granule size would significantly enhance N2O production. The results further revealed a linear relationship between N2O production and ammonia oxidation rate (AOR) (R2 = 0.99) under conditions of varying oxygen levels and granule diameters, suggesting that bulk oxygen and granule size may exert an indirect effect on N2O production by causing a change in AOR.
Xu, Li-Jian; Liu, Yuan-Shuai; Zhou, Li-Gang; Wu, Jian-Yong
2011-09-01
Beauvericin (BEA) is a cyclic hexadepsipeptide mycotoxin with notable phytotoxic and insecticidal activities. Fusarium redolens Dzf2 is a highly BEA-producing fungus isolated from a medicinal plant. The aim of the current study was to develop a simple and valid kinetic model for F. redolens Dzf2 mycelial growth and the optimal fed-batch operation for efficient BEA production. A modified Monod model with substrate (glucose) and product (BEA) inhibition was constructed based on the culture characteristics of F. redolens Dzf2 mycelia in a liquid medium. Model parameters were derived by simulation of the experimental data from batch culture. The model fitted closely to the experimental data over the 20-50 g l(-1) glucose concentration range in batch fermentation. The kinetic model, together with the stoichiometric relationships for biomass, substrate and product, was applied to predict the optimal feeding scheme for fed-batch fermentation, leading to a 54% higher BEA yield (299 mg l(-1)) than in the batch culture (194 mg l(-1)). The modified Monod model incorporating substrate and product inhibition was proven adequate for describing the growth kinetics of the F. redolens Dzf2 mycelial culture at suitable but not excessive initial glucose levels in batch and fed-batch cultures.
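A minimal sketch of integrating such a kinetic model is given below, assuming a Haldane-type substrate-inhibition term, a linear product-inhibition term and Luedeking-Piret product formation; the exact expression and parameter values of the published model may differ, and the numbers here are illustrative only.

```python
import numpy as np
from scipy.integrate import solve_ivp

# illustrative parameters (1/h, g/L, g/L, g/g, g/g, 1/h, g/L), not the fitted values
mu_max, Ks, Ki = 0.12, 5.0, 80.0
Y_xs, alpha, beta, P_max = 0.4, 2.0e-3, 1.0e-4, 0.35

def rates(t, y):
    X, S, P = y                                   # biomass, glucose, BEA
    mu = mu_max * S / (Ks + S + S**2 / Ki) * max(0.0, 1.0 - P / P_max)
    dX = mu * X
    dS = -dX / Y_xs                               # substrate consumption
    dP = (alpha * mu + beta) * X                  # Luedeking-Piret product formation
    return [dX, dS, dP]

sol = solve_ivp(rates, (0, 120), [0.1, 30.0, 0.0], dense_output=True)  # 120 h batch
```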
Analysis of an algae-based CELSS. I - Model development
NASA Technical Reports Server (NTRS)
Holtzapple, Mark T.; Little, Frank E.; Makela, Merry E.; Patterson, C. O.
1989-01-01
A steady state chemical model and computer program have been developed for a life support system and applied to trade-off studies. The model is based on human demand for food and oxygen determined from crew metabolic needs. The model includes modules for water recycle, waste treatment, CO2 removal and treatment, and food production. The computer program calculates rates of use and material balance for food, O2, the recycle of human waste and trash, H2O, N2, and food production/supply. A simple noniterative solution for the model has been developed using the steady state rate equations for the chemical reactions. The model and program have been used in system sizing and subsystem trade-off studies of a partially closed life support system.
Prediction of the properties of anhydrite construction mixtures based on a neural network approach
NASA Astrophysics Data System (ADS)
Fedorchuk, Y. M.; Zamyatin, N. V.; Smirnov, G. V.; Rusina, O. N.; Sadenova, M. A.
2017-08-01
The article considers the application of a neural network modelling mechanism that predicts properties from the components of anhydrite mixtures, for use in managing the technological processes of producing construction products based on fluoranhydrite.
Lakshmikanthan, P; Sughosh, P; White, James; Sivakumar Babu, G L
2017-07-01
The performance of an anaerobic bioreactor in treating mechanically biologically treated municipal solid waste was investigated using experimental and modelling techniques. The key parameters measured during the experimental test period included the gas yield, leachate generation and settlement under applied load. Modelling of the anaerobic bioreactor was carried out using the University of Southampton landfill degradation and transport model. The model was used to simulate the actual gas production and settlement. A sensitivity analysis showed that the most influential model parameters are the Monod growth rate and moisture. In this case, pH had no effect on the total gas production and waste settlement, and only a small variation in the gas production was observed when the heat transfer coefficient of the waste was varied from 20 to 100 kJ (m d K)-1. The anaerobic bioreactor contained 1.9 kg (dry) of mechanically biologically treated waste, producing 10 L of landfill gas over 125 days.
Tobacco industry responsibility for butts: a Model Tobacco Waste Act
Curtis, Clifton; Novotny, Thomas E; Lee, Kelley; Freiberg, Mike; McLaughlin, Ian
2017-01-01
Cigarette butts and other postconsumer products from tobacco use are the most common waste elements picked up worldwide each year during environmental cleanups. Under the environmental principle of Extended Producer Responsibility, tobacco product manufacturers may be held responsible for collection, transport, processing and safe disposal of tobacco product waste (TPW). Legislation has been applied to other toxic and hazardous postconsumer waste products such as paints, pesticide containers and unused pharmaceuticals, to reduce, prevent and mitigate their environmental impacts. Additional product stewardship (PS) requirements may be necessary for other stakeholders and beneficiaries of tobacco product sales and use, especially suppliers, retailers and consumers, in order to ensure effective TPW reduction. This report describes how a Model Tobacco Waste Act may be adopted by national and subnational jurisdictions to address the environmental impacts of TPW. Such a law will also reduce tobacco use and its health consequences by raising attention to the environmental hazards of TPW, increasing the price of tobacco products, and reducing the number of tobacco product retailers. PMID:26931480
Fine bakery wares with label claims in Europe and their categorisation by nutrient profiling models.
Trichterborn, J; Harzer, G; Kunz, C
2011-03-01
This study assesses a range of commercially available fine bakery wares with nutrition or health related on-pack communication against the criteria of selected nutrient profiling models. Different purposes of the application of nutrient profiles were considered, including front-of-pack signposting and the regulation of claims or advertising. More than 200 commercially available fine bakery wares carrying claims were identified in Germany, France, Spain, Sweden and United Kingdom and evaluated against five nutrient profiling models. All models were assessed regarding their underlying principles, generated results and inter-model agreement levels. Total energy, saturated fatty acids, sugars, sodium and fibre were critical parameters for the categorisation of products. The Choices Programme was the most restrictive model in this category, while the Food and Drug Administration model allowed the highest number of products to qualify. According to all models, more savoury than sweet products met the criteria. On average, qualifying products contained less than half the amounts of nutrients to limit and more than double the amount of fibre compared with all the products in the study. None of the models had a significant impact on the average energy contents. Nutrient profiles can be applied to identify fine bakery wares with a significantly better nutritional composition than the average range of products positioned as healthier. Important parameters to take into account include energy, saturated fatty acids, sugars, sodium and fibre. Different criteria sets for subcategories of fine bakery wares do not seem necessary.
Aswad, Miran; Rayan, Mahmoud; Abu-Lafi, Saleh; Falah, Mizied; Raiyn, Jamal; Abdallah, Ziyad; Rayan, Anwar
2018-01-01
The aim was to index natural products as candidates for less expensive preventive or curative anti-inflammatory therapeutic drugs. A set of 441 anti-inflammatory drugs representing the active domain and 2892 natural products representing the inactive domain was used to construct a predictive model for bioactivity-indexing purposes. The model for indexing the natural products for potential anti-inflammatory activity was constructed using the iterative stochastic elimination algorithm (ISE). ISE is capable of differentiating between active and inactive anti-inflammatory molecules. By applying the prediction model to a mixed set of (active/inactive) substances, we managed to capture 38% of the anti-inflammatory drugs in the top 1% of the screened set of chemicals, yielding an enrichment factor of 38. Ten natural products that scored highly as potential anti-inflammatory drug candidates are disclosed. Searching PubMed revealed that only three molecules (Moupinamide, Capsaicin, and Hypaphorine) out of the ten have been tested and reported as anti-inflammatory. The other seven phytochemicals await evaluation of their anti-inflammatory activity in the wet lab. The proposed anti-inflammatory model can be utilized for the virtual screening of large chemical databases and for indexing natural products for potential anti-inflammatory activity.
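For reference, the enrichment-factor calculation behind the "38% of actives in the top 1%, EF = 38" statement can be sketched as below; the scores and activity labels would come from the ISE model, and none of the data here are from the study.

```python
def enrichment_factor(scores, is_active, top_fraction=0.01):
    """EF = (fraction of all actives recovered in the top x% of the ranked library) / x."""
    ranked = sorted(zip(scores, is_active), key=lambda t: t[0], reverse=True)
    n_top = max(1, round(top_fraction * len(ranked)))
    actives_in_top = sum(active for _, active in ranked[:n_top])
    recall_in_top = actives_in_top / sum(is_active)
    return recall_in_top / top_fraction

# recovering 38% of all actives within the top 1% gives 0.38 / 0.01 = 38
```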
Accuracy assessment for a multi-parameter optical calliper in on line automotive applications
NASA Astrophysics Data System (ADS)
D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.
2017-08-01
In this work, a methodological approach based on the evaluation of measurement uncertainty is applied to an experimental test case from the automotive sector. The uncertainty model for different measurement procedures of a high-accuracy optical gauge is discussed in order to identify the best measuring performance of the system for on-line applications and for increasingly stringent measurement requirements. In particular, with reference to the industrial production and control strategies of high-performing turbochargers, two uncertainty models to be used with the optical calliper are proposed, discussed and compared. The models are based on an integrated approach between measurement methods and production best practices to emphasize their mutual coherence. The paper shows the possible advantages deriving from measurement uncertainty modelling, in order to keep the uncertainty propagation under control for all the indirect measurements useful for production statistical control, on which further improvements can be based.
Population and prehistory II: Space-limited human populations in constant environments
Puleston, Cedric O.; Tuljapurkar, Shripad
2010-01-01
We present a population model to examine the forces that determined the quality and quantity of human life in early agricultural societies where cultivable area is limited. The model is driven by the non-linear and interdependent relationships between the age distribution of a population, its behavior and technology, and the nature of its environment. The common currency in the model is the production of food, on which age-specific rates of birth and death depend. There is a single nontrivial equilibrium population at which productivity balances caloric needs. One of the most powerful controls on equilibrium hunger level is fertility control. Gains against hunger are accompanied by decreases in population size. Increasing worker productivity does increase equilibrium population size but does not improve welfare at equilibrium. As a case study we apply the model to the population of a Polynesian valley before European contact. PMID:18598711
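A schematic, age-aggregated sketch of the equilibrium idea follows (the published model is age-structured and more detailed); every functional form and number below is an illustrative placeholder, chosen only to show how the equilibrium population can be located numerically.

```python
import numpy as np
from scipy.optimize import brentq

A = 1.0e3                            # cultivable area (ha), placeholder
kcal_need = 2.1e3 * 365.0            # annual caloric need per person, placeholder

def food(N):                         # total annual production, saturating in labour
    return 3.0e6 * A * (1.0 - np.exp(-1.5 * N / A))

def growth_rate(N):                  # births minus deaths as a function of the food ratio E
    E = min(1.0, food(N) / (kcal_need * N))
    birth = 0.045 * E                # fertility falls with hunger
    death = 0.012 / max(E, 1e-6)     # mortality rises with hunger
    return birth - death

N_star = brentq(growth_rate, 10.0, 1.0e6)                     # equilibrium population
E_star = min(1.0, food(N_star) / (kcal_need * N_star))        # equilibrium hunger level
print(N_star, E_star)
```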
The Market Responses to the Government Regulation of Chlorinated Solvents: A Policy Analysis
1988-10-01
in the process of statistical estimation of model parameters. The results of the estimation process applied to chlorinated solvent markets show the ... policy context for this research. Section III provides analysis necessary to understand the chemicals involved, their production processes and costs, and
NASA Astrophysics Data System (ADS)
Ziemba, Alexander; El Serafy, Ghada
2016-04-01
Ecological modeling and water quality investigations are complex processes which can require a high level of parameterization and a multitude of varying data sets in order to properly execute the model in question. Since models are generally complex, their calibration and validation can benefit from the application of data and information fusion techniques. The data applied to ecological models come from a wide range of sources such as remote sensing, earth observation, and in-situ measurements, resulting in high variability in the temporal and spatial resolution of the various data sets available to water quality investigators. It is proposed that effective fusion into a comprehensive singular set will provide a more complete and robust data resource with which models can be calibrated, validated, and driven. Each individual product carries its own error, resulting from the method of measurement and the pre-processing techniques applied. The uncertainty and error are further compounded when the data being fused are of varying temporal and spatial resolution. In order to have a reliable fusion-based model and data set, the uncertainty of the results and the confidence interval of the data being reported must be effectively communicated to those who would utilize the data product or model outputs in a decision-making process [2]. Here we review an array of data fusion techniques applied to various remote sensing, earth observation, and in-situ data sets whose domains vary in spatial and temporal resolution. The data sets examined are combined so that the various classifications of data (complementary, redundant, and cooperative) are all assessed to determine the impact of classification on the propagation and compounding of error. In order to assess the error of the fused data products, a comparison is conducted with data sets having a known confidence interval and quality rating. We conclude with a quantification of the performance of the data fusion techniques and a recommendation on the feasibility of applying the fused products in operational forecast systems and modeling scenarios. The error bands and confidence intervals derived can be used to clarify the error and confidence of water quality variables produced by prediction and forecasting models. References: [1] F. Castanedo, "A Review of Data Fusion Techniques", The Scientific World Journal, vol. 2013, pp. 1-19, 2013. [2] T. Keenan, M. Carbone, M. Reichstein and A. Richardson, "The model-data fusion pitfall: assuming certainty in an uncertain world", Oecologia, vol. 167, no. 3, pp. 587-597, 2011.
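A minimal example of fusing redundant observations of the same water-quality variable by inverse-variance weighting is shown below; the values and variances are hypothetical (e.g. a satellite retrieval and an in-situ sample of chlorophyll-a) and only illustrate how a fused value carries its own, smaller uncertainty.

```python
import numpy as np

def fuse(values, variances):
    """Inverse-variance weighted fusion of redundant observations of one quantity."""
    w = 1.0 / np.asarray(variances, dtype=float)
    fused = float(np.sum(w * np.asarray(values, dtype=float)) / np.sum(w))
    fused_variance = float(1.0 / np.sum(w))   # smaller than any single input variance
    return fused, fused_variance

value, var = fuse([4.2, 3.6], [0.8 ** 2, 0.3 ** 2])
print(value, var ** 0.5)   # fused estimate and its standard deviation
```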
Coupling sensing to crop models for closed-loop plant production in advanced life support systems
NASA Astrophysics Data System (ADS)
Cavazzoni, James; Ling, Peter P.
1999-01-01
We present a conceptual framework for coupling sensing to crop models for closed-loop analysis of plant production for NASA's program in advanced life support. Crop status may be monitored through non-destructive observations, while models may be independently applied to crop production planning and decision support. To achieve coupling, environmental variables and observations are linked to model inputs and outputs, and monitoring results are compared with model predictions of plant growth and development. The information thus provided may be useful in diagnosing problems with the plant growth system, or as feedback to the model for evaluation of plant scheduling and potential yield. In this paper, we demonstrate this coupling using machine vision sensing of canopy height and top projected canopy area, and the CROPGRO crop growth model. Model simulations and scenarios are used for illustration. We also compare model predictions of the machine vision variables with data from soybean experiments conducted at the New Jersey Agriculture Experiment Station Horticulture Greenhouse Facility, Rutgers University. Model simulations produce reasonable agreement with the available data, supporting our illustration.
Validation of Infrared Azimuthal Model as Applied to GOES Data Over the ARM SGP
NASA Technical Reports Server (NTRS)
Gambheer, Arvind V.; Doelling, David R.; Spangenberg, Douglas A.; Minnis, Patrick
2004-01-01
The goal of this research is to identify and reduce the GOES-8 IR temperature biases, induced by a fixed geostationary position, during the course of a day. In this study, the same CERES LW window channel model is applied to GOES-8 IR temperatures during clear days over the Atmospheric Radiation Measurement-Southern Great Plains Central Facility (SCF). The model-adjusted and observed IR temperatures are compared with top-of-the-atmosphere (TOA) estimated temperatures derived from a radiative transfer algorithm based on the atmospheric profile and surface radiometer measurements. This algorithm can then be incorporated to derive more accurate Ts from real-time satellite operational products.
ERIC Educational Resources Information Center
Dodd, Carol Ann
This study explores a technique for evaluating teacher education programs in terms of teaching competencies, as applied to the Indiana University Mathematics Methods Program (MMP). The evaluation procedures formulated for the study include a process product design in combination with a modification of Pophan's performance test paradigm and Gage's…
Coupling Computer-Aided Process Simulation and ...
A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa™ process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable
Optimization Control of the Color-Coating Production Process for Model Uncertainty
He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong
2016-01-01
Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results. PMID:27247563
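A sketch of the partial least squares step described above, using scikit-learn, is given below; the process variables, targets and data are hypothetical placeholders for the mechanistic-model outputs used in the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.random((200, 6))        # e.g. line speed, oven temperatures, coating viscosity, ...
Y = np.column_stack([X @ rng.random(6), X @ rng.random(6)])   # film thickness, efficiency index

pls = PLSRegression(n_components=3).fit(X, Y)
Y_hat = pls.predict(X[:5])       # predictions consumed by the optimisation layer
```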
Designers workbench: toward real-time immersive modeling
NASA Astrophysics Data System (ADS)
Kuester, Falko; Duchaineau, Mark A.; Hamann, Bernd; Joy, Kenneth I.; Ma, Kwan-Liu
2000-05-01
This paper introduces the Designers Workbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing, and computer-aided engineering systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The Designers Workbench aims at closing this technology or 'digital' gap experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps of finite element analysis tasks applied to the created models.
Antiproton-proton annihilation into light neutral meson pairs within an effective meson theory
NASA Astrophysics Data System (ADS)
Wang, Ying; Bystritskiy, Yury M.; Ahmadov, Azad I.; Tomasi-Gustafsson, Egle
2017-08-01
Antiproton-proton annihilation into light neutral mesons in the few-GeV energy domain is investigated in view of a global description of the existing data and predictions for future work at the Antiproton Annihilation at Darmstadt (PANDA) experiment at the Facility for Antiproton and Ion Research (FAIR). An effective meson model developed earlier, with mesonic and baryonic degrees of freedom in s, t, and u channels, is applied here to π0π0 production. Form factors with logarithmic s and t (u) dependencies are applied. A fair agreement with the existing angular distributions is obtained. Applying SU(3) symmetry, it is straightforward to recover the angular distributions for π0η and ηη production in the same energy range. A good agreement is generally obtained with all existing data.
A user-friendly model for spray drying to aid pharmaceutical product development.
Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L J; Frijlink, Henderik W
2013-01-01
The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach.
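A minimal steady-state heat balance in the spirit of such a spray dryer model is sketched below; the published model is more detailed (it also predicts outlet relative humidity), all numbers are illustrative, and complete evaporation of the feed water is assumed.

```python
def outlet_temperature(T_in, m_air, feed_rate, solids_frac,
                       cp_air=1.01e3, dH_vap=2.26e6, heat_loss=500.0):
    """T_in in C, m_air in kg/s dry air, feed_rate in kg/s, heat_loss in W (all illustrative)."""
    water_evaporated = feed_rate * (1.0 - solids_frac)     # kg/s, assume full drying
    q_evap = water_evaporated * dH_vap                      # W used to evaporate water
    return T_in - (q_evap + heat_loss) / (m_air * cp_air)   # energy balance on the drying air

# hypothetical lab-scale settings: 150 C inlet, ~0.01 kg/s air, 3 mL/min of 10% solids feed
T_out = outlet_temperature(T_in=150.0, m_air=0.01, feed_rate=5.0e-5, solids_frac=0.10)
print(round(T_out, 1))
```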
Multiple resource use efficiency (mRUE): A new concept for ecosystem production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Juanjuan; Chen, Jiquan; Miao, Yuan
The resource-driven concept, which is an important school for investigating ecosystem production, has been applied for decades. However, the regulatory mechanisms of production by multiple resources remain unclear. We formulated a new algorithm model that integrates multiple resource uses to study ecosystem production and tested its applications on a water-availability gradient in semi-arid grassland. The results of our experiment showed that changes in water availability significantly affected the resources of light and nitrogen, and altered the relationships among the multiple resource absorption rate (ε), multiple resource use efficiency (mRUE), and available resource (R_avail). Increased water availability suppressed ecosystem mRUE (i.e., “declining marginal returns”); the changes in mRUE had a negative effect on ε (i.e., “inverse feedback”). These two processes jointly regulated that the stimulated single resource availability would promote ecosystem production rather than suppress it, even when mRUE was reduced. This study illustrated the use of the mRUE model in exploring the coherent relationships among the key parameters regulating ecosystem production for future modeling, and evaluated the sensitivity of this conceptual model under different dataset properties. Furthermore, this model needs extensive validation by the ecological community before the method can be extrapolated to other ecosystems in the future.
Yoshikawa, Katsunori; Aikawa, Shimpei; Kojima, Yuta; Toya, Yoshihiro; Furusawa, Chikara; Kondo, Akihiko; Shimizu, Hiroshi
2015-01-01
Arthrospira (Spirulina) platensis is a promising feedstock and host strain for bioproduction because of its high accumulation of glycogen and superior characteristics for industrial production. Metabolic simulation using a genome-scale metabolic model and flux balance analysis is a powerful method that can be used to design metabolic engineering strategies for the improvement of target molecule production. In this study, we constructed a genome-scale metabolic model of A. platensis NIES-39 including 746 metabolic reactions and 673 metabolites, and developed novel strategies to improve the production of valuable metabolites, such as glycogen and ethanol. The simulation results obtained using the metabolic model showed high consistency with experimental results for growth rates under several trophic conditions and growth capabilities on various organic substrates. The metabolic model was further applied to design a metabolic network to improve the autotrophic production of glycogen and ethanol. Decreased flux of reactions related to the TCA cycle and phosphoenolpyruvate reaction were found to improve glycogen production. Furthermore, in silico knockout simulation indicated that deletion of genes related to the respiratory chain, such as NAD(P)H dehydrogenase and cytochrome-c oxidase, could enhance ethanol production by using ammonium as a nitrogen source. PMID:26640947
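The flux balance analysis step described above can be illustrated with a toy example. The sketch below is not the A. platensis NIES-39 model (746 reactions, 673 metabolites); it solves the same kind of linear program, maximizing a biomass flux subject to steady-state mass balances and an assumed uptake bound, using SciPy.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B, B -> biomass
# Rows: internal metabolites A, B; columns: reactions R_uptake, R_conv, R_biomass
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

c = [0, 0, -1]                             # maximise biomass flux (linprog minimises)
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at an assumed 10 mmol/gDW/h

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal flux distribution:", res.x)  # -> [10, 10, 10]
```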
Multiple Resource Use Efficiency (mRUE): A New Concept for Ecosystem Production.
Han, Juanjuan; Chen, Jiquan; Miao, Yuan; Wan, Shiqiang
2016-11-21
The resource-driven concept, which is an important school for investigating ecosystem production, has been applied for decades. However, the regulatory mechanisms of production by multiple resources remain unclear. We formulated a new algorithm model that integrates multiple resource uses to study ecosystem production and tested its applications on a water-availability gradient in semi-arid grassland. The results of our experiment showed that changes in water availability significantly affected the resources of light and nitrogen, and altered the relationships among the multiple resource absorption rate (ε), multiple resource use efficiency (mRUE), and available resource (R_avail). Increased water availability suppressed ecosystem mRUE (i.e., "declining marginal returns"); the changes in mRUE had a negative effect on ε (i.e., "inverse feedback"). These two processes jointly regulated that the stimulated single resource availability would promote ecosystem production rather than suppress it, even when mRUE was reduced. This study illustrated the use of the mRUE model in exploring the coherent relationships among the key parameters regulating ecosystem production for future modeling, and evaluated the sensitivity of this conceptual model under different dataset properties. However, this model needs extensive validation by the ecological community before the method can be extrapolated to other ecosystems in the future.
How do strategic decisions and operative practices affect operating room productivity?
Peltokorpi, Antti
2011-12-01
Surgical operating rooms are cost-intensive parts of health service production. Managing operating units efficiently is essential when hospitals and healthcare systems aim to maximize health outcomes with limited resources. Previous research on operating room management has focused on studying the effect of management practices and decisions on efficiency, mainly using modeling approaches or before-after analyses in single-hospital cases. The purpose of this research is to analyze the synergistic effect of strategic decisions and operative management practices on operating room productivity, using a multiple case study method that enables statistical hypothesis testing with empirical data. Eleven hypotheses proposing connections between the use of strategic and operative practices and productivity were tested in a multi-hospital study that included 26 units. The results indicate that operative practices, such as personnel management, case scheduling and performance measurement, affect productivity more markedly than strategic decisions relating to, e.g., a unit's size, scope or academic status. Units with different strategic positions should apply different operative practices: focused hospital units benefit most from sophisticated case scheduling and parallel processing, whereas central and ambulatory units should apply flexible working hours, incentives and multi-skilled personnel. Operating units should be more active in applying management practices that are adequate for their strategic orientation.
Applied behavior analysis: New directions from the laboratory
Epling, W. Frank; Pierce, W. David
1983-01-01
Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior, which occurs as a by-product of contingencies of reinforcement, is discussed. Possible difficulties in the treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for the modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574
Cheirsilp, B; Shimizu, H; Shioya, S
2001-12-01
A mathematical model for kefiran production by Lactobacillus kefiranofaciens was established, in which the effects of pH, substrate and product on cell growth, exopolysaccharide formation and substrate assimilation were considered. The model gave a good representation both of the formation of exopolysaccharides (which are not only attached to cells but also released into the medium) and of the time courses of the production of galactose and glucose in the medium (which are produced and consumed by the cells). Since pH and both lactose and lactic acid concentrations differently affected production and growth activity, the model included the effects of pH and the concentrations of lactose and lactic acid. Based on the mathematical model, an optimal pH profile for the maximum production of kefiran in batch culture was obtained. In this study, a simplified optimization method was developed, in which the optimal pH profile was determined at a particular final fermentation time. This was based on the principle that, at a certain time, switching from the maximum specific growth rate to the critical one (which yields the maximum specific production rate) results in maximum production. Maximum kefiran production was obtained, which was 20% higher than that obtained in the constant-pH control fermentation. A genetic algorithm (GA) was also applied to obtain the optimal pH profile; and it was found that practically the same solution was obtained using the GA.
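As an illustration of the kind of kinetic model described (not the authors' fitted model, and omitting the pH and lactic acid inhibition terms), a minimal Monod growth model with Luedeking-Piret product formation can be integrated as follows; all parameter values are assumed placeholders.

```python
from scipy.integrate import solve_ivp

# Illustrative parameters (not fitted to L. kefiranofaciens data)
mu_max, Ks = 0.25, 2.0        # 1/h, g/L
alpha, beta = 0.05, 0.002     # growth- and non-growth-associated product coefficients
Yxs, ms = 0.4, 0.01           # biomass yield (g/g), maintenance (g/g/h)

def rhs(t, y):
    X, S, P = y               # biomass, lactose, kefiran (g/L)
    S = max(S, 0.0)
    mu = mu_max * S / (Ks + S)
    dX = mu * X
    dP = alpha * dX + beta * X                      # Luedeking-Piret kinetics
    dS = -(dX / Yxs + ms * X) if S > 0 else 0.0
    return [dX, dS, dP]

sol = solve_ivp(rhs, (0, 48), [0.1, 50.0, 0.0])
X, S, P = sol.y[:, -1]
print(f"after 48 h: biomass {X:.1f} g/L, lactose {S:.1f} g/L, kefiran {P:.2f} g/L")
```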
Modelling and analysis of FMS productivity variables by ISM, SEM and GTMA approach
NASA Astrophysics Data System (ADS)
Jain, Vineet; Raj, Tilak
2014-09-01
Productivity has often been cited as a key factor in flexible manufacturing system (FMS) performance, and actions to increase it are said to improve profitability and the wage-earning capacity of employees. Improving productivity is seen as a key issue for the long-term survival and success of a manufacturing system. The purpose of this paper is to model and analyze the productivity variables of FMS. The study was performed using several approaches, viz. interpretive structural modelling (ISM), structural equation modelling (SEM), the graph theory and matrix approach (GTMA), and a cross-sectional survey of manufacturing firms in India. ISM was used to develop a model of productivity variables, which was then analyzed. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are powerful statistical techniques; CFA is carried out by SEM. EFA was applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20), and these factors were confirmed by CFA using Analysis of Moment Structures (AMOS 20). Twenty productivity variables were identified from the literature and four factors were extracted that bear on the productivity of FMS: people, quality, machine and flexibility. SEM using AMOS 20 was used to fit the first-order four-factor structure. GTMA is a multiple attribute decision making (MADM) methodology used to quantify the intensity of productivity variables in an organization. An FMS productivity index is proposed to quantify the factors that affect FMS.
Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond
Falenski, Alexander; Weiser, Armin A.; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias
2015-01-01
In case of contamination in the food chain, fast action is required in order to reduce the numbers of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations. PMID:26247028
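A minimal sketch of the predictive-microbiology style calculation described above is shown below: a log-linear (first-order) inactivation model with a Bigelow-type temperature dependence, chained over process steps. The D-value, z-value and process steps are illustrative assumptions, not ricin data and not the PMM-Lab/FoodProcess-Lab models.

```python
# Log-linear inactivation with Bigelow temperature dependence; values illustrative.
D_ref, T_ref, z = 30.0, 70.0, 10.0     # D-value (min) at T_ref (degC), z-value (degC)

def residual_fraction(steps):
    """steps: list of (temperature degC, time min) for the process chain."""
    log_reduction = 0.0
    for T, t in steps:
        D_T = D_ref * 10 ** ((T_ref - T) / z)   # D-value at this temperature
        log_reduction += t / D_T
    return 10 ** (-log_reduction)

process = [(55.0, 60.0), (72.0, 15.0)]          # e.g. a drying step, then a heating step
print("residual active fraction:", residual_fraction(process))
```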
Prigent, Sylvain; Nielsen, Jens Christian; Frisvad, Jens Christian; Nielsen, Jens
2018-06-05
Modelling of metabolism at the genome scale has proved to be an efficient method for explaining observed phenotypic traits in living organisms. Further, it can be used as a means of predicting the effect of genetic modifications, e.g. for the development of microbial cell factories. With the increasing amount of genome sequencing data available, a need exists to accurately and efficiently generate such genome-scale metabolic models (GEMs) for non-model organisms, for which data are sparse. In this study, we present an automatic reconstruction approach applied to 24 Penicillium species, which have potential for production of pharmaceutical secondary metabolites or are used in the manufacturing of food products such as cheeses. The models were based on the MetaCyc database and a previously published Penicillium GEM, and gave rise to comprehensive genome-scale metabolic descriptions. The models showed that while central carbon metabolism is highly conserved, secondary metabolic pathways represent the main diversity among the species. The automatic reconstruction approach presented in this study can be applied to generate GEMs of other understudied organisms, and the developed GEMs are a useful resource for the study of Penicillium metabolism, for example with the scope of developing novel cell factories. This article is protected by copyright. All rights reserved.
Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond.
Falenski, Alexander; Weiser, Armin A; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias
2015-01-01
In case of contamination in the food chain, fast action is required in order to reduce the numbers of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations.
Schmidt, Andres; Law, Beverly E.; Göckede, Mathias; ...
2016-09-15
Here, the vast forests and natural areas of the Pacific Northwest comprise one of the most productive ecosystems in the northern hemisphere. The heterogeneous landscape of Oregon poses a particular challenge to ecosystem models. We present a framework using a scaling factor Bayesian inversion to improve the modeled atmosphere-biosphere exchange of carbon dioxide. Observations from 5 CO/CO2 towers, eddy covariance towers, and airborne campaigns were used to constrain the Community Land Model CLM4.5 simulated terrestrial CO2 exchange at a high spatial and temporal resolution (1/24°, 3-hourly). To balance aggregation errors and the degrees of freedom in the inverse modeling system, we applied an unsupervised clustering approach for the spatial structuring of our model domain. Data from flight campaigns were used to quantify the uncertainty introduced by the Lagrangian particle dispersion model that was applied for the inversions. The average annual statewide net ecosystem productivity (NEP) was increased by 32% to 29.7 TgC per year by assimilating the tropospheric mixing ratio data. The associated uncertainty was decreased by 28.4% to 29%, on average over the entire Oregon model domain with the lowest uncertainties of 11% in western Oregon. The largest differences between posterior and prior CO2 fluxes were found for the Coast Range ecoregion of Oregon that also exhibits the highest availability of atmospheric observations and associated footprints. In this area, covered by highly productive Douglas-fir forest, the differences between the prior and posterior estimate of NEP averaged 3.84 TgC per year during the study period from 2012 through 2014.
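The scaling-factor Bayesian inversion amounts to a Gaussian update of prior flux scaling factors using observed mixing-ratio enhancements. The sketch below shows the analytic update for a synthetic three-region, two-tower case; the footprint matrix, covariances and observations are assumed values, not the CLM4.5/Lagrangian setup of the study.

```python
import numpy as np

s_prior = np.ones(3)                          # prior scaling factors (one per region)
S_prior = np.diag([0.5**2, 0.5**2, 0.5**2])   # prior error covariance of the factors

# H[i, j]: modelled CO2 enhancement at tower i per unit scaling of region j
# (footprint times prior flux; illustrative numbers in ppm)
H = np.array([[2.0, 0.5, 0.1],
              [0.3, 1.5, 0.8]])
y = np.array([1.6, 2.9])                      # observed enhancements (ppm)
R = np.diag([0.2**2, 0.2**2])                 # model-data mismatch covariance

G = S_prior @ H.T @ np.linalg.inv(H @ S_prior @ H.T + R)   # gain matrix
s_post = s_prior + G @ (y - H @ s_prior)
S_post = S_prior - G @ H @ S_prior

print("posterior scaling factors:", s_post)
print("uncertainty reduction (%):",
      100 * (1 - np.sqrt(np.diag(S_post)) / np.sqrt(np.diag(S_prior))))
```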
NASA Astrophysics Data System (ADS)
Vinod Kumar, A.; Sitaraman, V.; Oza, R. B.; Krishnamoorthy, T. M.
A one-dimensional numerical planetary boundary layer (PBL) model is developed and applied to study the vertical distribution of radon and its daughter products in the atmosphere. The meteorological model contains a parameterization of the vertical diffusion coefficient based on turbulent kinetic energy and energy dissipation (E-ɛ model). The increased vertical resolution and the realistic concentrations of radon and its daughter products from the time-dependent PBL model are compared with the steady-state model results and field observations. The ratio of radon concentration at higher levels to that at the surface has been studied to see the effects of atmospheric stability. The significant change in the vertical profile of concentration due to decoupling of the upper portion of the boundary layer from the shallow lower stable layer is explained by the PBL model. The disequilibrium ratio of 214Bi/214Pb broadly agrees with the observed field values. The sharp decrease in the ratio during the transition from unstable to stable atmospheric conditions is also reproduced by the model.
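The core of such a 1-D model is a vertical diffusion equation with radioactive decay and a surface emission flux. A minimal explicit finite-difference sketch is given below; a prescribed diffusivity profile stands in for the E-ɛ closure, daughter products are not tracked, and all parameter values are illustrative.

```python
import numpy as np

nz, dz = 50, 20.0                      # 50 layers of 20 m (1 km column)
z = (np.arange(nz) + 0.5) * dz
K = 1.0 + 10.0 * np.exp(-z / 300.0)    # illustrative eddy diffusivity profile (m^2/s)
lam = np.log(2.0) / (3.82 * 86400.0)   # Rn-222 decay constant (1/s)
F_surf = 0.02                          # surface emission flux (Bq m^-2 s^-1), illustrative
C = np.zeros(nz)                       # radon concentration (Bq m^-3)

dt = 0.4 * dz**2 / K.max()             # explicit stability limit
Kface = 0.5 * (K[:-1] + K[1:])         # diffusivity at layer interfaces
for _ in range(int(6 * 3600 / dt)):    # integrate 6 h of emission and mixing
    flux = np.empty(nz + 1)
    flux[0] = F_surf                   # upward flux into the lowest layer
    flux[1:-1] = -Kface * np.diff(C) / dz
    flux[-1] = 0.0                     # no flux through the column top
    C += dt * ((flux[:-1] - flux[1:]) / dz - lam * C)

print("surface vs mid-column concentration:", C[0], C[nz // 2])
```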
Lübken, Manfred; Wichern, Marc; Schlattmann, Markus; Gronauer, Andreas; Horn, Harald
2007-10-01
Knowledge of the net energy production of anaerobic fermenters is important for reliable modelling of the efficiency of anaerobic digestion processes. By using the Anaerobic Digestion Model No. 1 (ADM1) the simulation of biogas production and composition is possible. This paper shows the application and modification of ADM1 to simulate energy production of the digestion of cattle manure and renewable energy crops. The paper additionally presents an energy balance model, which enables the dynamic calculation of the net energy production. The model was applied to a pilot-scale biogas reactor. It was found in a simulation study that a continuous feeding and splitting of the reactor feed into smaller heaps do not generally have a positive effect on the net energy yield. The simulation study showed that the ratio of co-substrate to liquid manure in the inflow determines the net energy production when the inflow load is split into smaller heaps. Mathematical equations are presented to calculate the increase of biogas and methane yield for the digestion of liquid manure and lipids for different feeding intervals. Calculations of different kinds of energy losses for the pilot-scale digester showed high dynamic variations, demonstrating the significance of using a dynamic energy balance model.
Liu, Yiwen; Sun, Jing; Peng, Lai; Wang, Dongbo; Dai, Xiaohu; Ni, Bing-Jie
2016-01-01
Anaerobic ammonium oxidation (anammox) bacteria are known to autotrophically convert ammonium to dinitrogen gas with nitrite as the electron acceptor, but little is known about the microbial products they release and how these relate to heterotrophic growth in anammox systems. In this work, we applied a mathematical model to assess the heterotrophic growth supported by three key microbial products produced by bacteria in anammox biofilms (utilization-associated products (UAP), biomass-associated products (BAP), and decay-released substrate). Both one-dimensional and two-dimensional numerical biofilm models were developed to describe the development of the anammox biofilm as a function of the multiple bacteria–substrate interactions. Model simulations show that anammox UAP is the main organic carbon source for heterotrophs. Heterotrophs are mainly dominant at the surface of the anammox biofilm, with a small fraction inside the biofilm. The 1-D model is sufficient to describe the main substrate concentrations/fluxes within the anammox biofilm, while the 2-D model can give a more detailed biomass distribution. Heterotrophic growth on UAP is mainly present at the outside of the anammox biofilm, growth on BAP (HetB) is present throughout the biofilm, and growth on decay-released substrate (HetD) is mainly located in the inner layers of the biofilm. PMID:27273460
NASA Astrophysics Data System (ADS)
Fitriana, R.; Saragih, J.; Luthfiana, N.
2017-12-01
R Bakery is a company that produces bread every day, in many different types: sweet breads and wheat breads with different flavours. During production, some products are defective and must be rejected. The defect types include burnt bread, sodden bread and shapeless bread. To obtain information about the defects produced, a business intelligence system model was designed to create a database and a data warehouse. Using this business intelligence system model, useful information can be generated, such as how many defects are produced for each bakery product. To make such information easier to obtain, a data mining method is used to explore the data in depth; the method applied is k-means clustering. The result of this business intelligence model is three groups: cluster 1 with a low number of defects, cluster 2 with a medium number of defects and cluster 3 with a high number of defects. The OLAP cube shows that 96,744 defective pieces were produced over the 7-month period.
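A minimal sketch of the k-means clustering step, grouping products by their defect counts into the three clusters mentioned above, could look as follows; the defect matrix is invented for illustration, not the company's data.

```python
import numpy as np
from sklearn.cluster import KMeans

# Rows: bread products; columns: defect counts (burnt, sodden, shapeless). Illustrative data.
X = np.array([[120,  80,  40],
              [300, 210, 150],
              [ 35,  20,  10],
              [500, 420, 380],
              [ 90,  60,  55],
              [260, 180, 120]])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for product, label in enumerate(km.labels_):
    print(f"product {product}: cluster {label}")
print("cluster centres (mean defect counts):", km.cluster_centers_.round(1))
```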
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beverly Law; David Turner; Warren Cohen
2008-05-22
The goal is to quantify and explain the carbon (C) budget for Oregon and N. California. The research compares "bottom-up" and "top-down" methods, and develops prototype analytical systems for regional analysis of the carbon balance that are potentially applicable to other continental regions, and that can be used to explore climate, disturbance and land-use effects on the carbon cycle. Objectives are: 1) Improve, test and apply a bottom-up approach that synthesizes a spatially nested hierarchy of observations (multispectral remote sensing, inventories, flux and extensive sites), and the Biome-BGC model to quantify the C balance across the region; 2) Improve, test and apply a top-down approach for regional and global C flux modeling that uses a model-data fusion scheme (MODIS products, AmeriFlux, atmospheric CO2 concentration network), and a boundary layer model to estimate net ecosystem production (NEP) across the region and partition it among GPP, R(a) and R(h). 3) Provide critical understanding of the controls on regional C balance (how NEP and carbon stocks are influenced by disturbance from fire and management, land use, and interannual climate variation). The key science questions are, "What are the magnitudes and distributions of C sources and sinks on seasonal to decadal time scales, and what processes are controlling their dynamics? What are regional spatial and temporal variations of C sources and sinks? What are the errors and uncertainties in the data products and results (i.e., in situ observations, remote sensing, models)?"
Numerical study on turbulence modulation in gas-particle flows
NASA Astrophysics Data System (ADS)
Yan, F.; Lightstone, M. F.; Wood, P. E.
2007-01-01
A mathematical model is proposed based on the Eulerian/Lagrangian approach to account for both the particle crossing trajectory effect and the extra turbulence production due to particle wake effects. The resulting model, together with existing models from the literature, is applied to two different particle-laden flow configurations, namely a vertical pipe flow and axisymmetric downward jet flow. The results show that the proposed model is able to provide improved predictions of the experimental results.
Assessment of Required Accuracy of Digital Elevation Data for Hydrologic Modeling
NASA Technical Reports Server (NTRS)
Kenward, T.; Lettenmaier, D. P.
1997-01-01
The effect of the vertical accuracy of Digital Elevation Models (DEMs) on hydrologic models is evaluated by comparing three DEMs and the resulting hydrologic model predictions applied to a 7.2 sq km USDA-ARS watershed at Mahantango Creek, PA. The high-resolution (5 m) DEM was resampled to a 30 m resolution using a method that constrained the spatial structure of the elevations to be comparable with the USGS and SIR-C DEMs. The resulting 30 m DEM was used as the reference product for subsequent comparisons. Spatial fields of directly derived quantities, such as elevation differences, slope, and contributing area, were compared to the reference product, as were hydrologic model output fields derived using each of the three DEMs at the common 30 m spatial resolution.
Evaluation of a Soil Moisture Data Assimilation System Over West Africa
NASA Astrophysics Data System (ADS)
Bolten, J. D.; Crow, W.; Zhan, X.; Jackson, T.; Reynolds, C.
2009-05-01
A crucial requirement of global crop yield forecasts by the U.S. Department of Agriculture (USDA) International Production Assessment Division (IPAD) is the regional characterization of surface and sub-surface soil moisture. However, due to the spatial heterogeneity and dynamic nature of precipitation events and the resulting soil moisture, accurate estimation of regional land surface-atmosphere interactions based on sparse ground measurements is difficult. IPAD estimates global soil moisture using daily estimates of minimum and maximum temperature and precipitation applied to a modified Palmer two-layer soil moisture model, which calculates the daily amount of soil moisture withdrawn by evapotranspiration and replenished by precipitation. We attempt to improve upon the existing system by applying an Ensemble Kalman filter (EnKF) data assimilation system to integrate surface soil moisture retrievals from the NASA Advanced Microwave Scanning Radiometer (AMSR-E) into the USDA soil moisture model. This work aims at evaluating the utility of merging satellite-retrieved soil moisture estimates with the IPAD two-layer soil moisture model used within the DBMS. We present a quantitative analysis of the assimilated soil moisture product over West Africa (9°N-20°N; 20°W-20°E). This region contains many key agricultural areas and has a high agrometeorological gradient from desert and semi-arid vegetation in the north to grassland, trees and crops in the south, thus providing an ideal location for evaluating the assimilated soil moisture product over multiple land cover types and conditions. A data-denial experimental approach is utilized to isolate the added utility of integrating remotely sensed soil moisture by comparing assimilated soil moisture results obtained using (relatively) low-quality precipitation products from real-time satellite imagery to baseline model runs forced with higher-quality rainfall. An analysis of root-zone anomalies for each model simulation suggests that the assimilation of AMSR-E surface soil moisture retrievals can add significant value to USDA root-zone predictions derived from real-time satellite precipitation products.
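The EnKF analysis step used in such an assimilation system can be sketched compactly: a surface-layer retrieval updates both layers of an ensemble of two-layer soil moisture states through the ensemble covariance. The numbers below are illustrative, not the AMSR-E/Palmer-model configuration of the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens = 50
# Ensemble of model states: [surface layer, root zone] soil moisture (m3/m3), shape (2, n_ens)
X = np.column_stack([rng.normal(0.20, 0.04, n_ens),
                     rng.normal(0.25, 0.03, n_ens)]).T

H = np.array([[1.0, 0.0]])       # the satellite retrieval observes the surface layer only
y_obs = 0.27                     # surface retrieval (illustrative)
R = np.array([[0.03**2]])        # observation error variance

A = X - X.mean(axis=1, keepdims=True)         # ensemble anomalies
P = A @ A.T / (n_ens - 1)                     # sample covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain

y_pert = y_obs + rng.normal(0.0, np.sqrt(R[0, 0]), n_ens)   # perturbed observations
X_upd = X + K @ (y_pert[None, :] - H @ X)     # updated ensemble

print("prior mean    :", X.mean(axis=1))
print("posterior mean:", X_upd.mean(axis=1))
```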
Requirement analysis for the one-stop logistics management of fresh agricultural products
NASA Astrophysics Data System (ADS)
Li, Jun; Gao, Hongmei; Liu, Yuchuan
2017-08-01
Issues and concerns about food safety, agro-processing, and the environmental and ecological impact of food production have attracted much research interest. Traceability and logistics management of fresh agricultural products faces technological challenges including food product labelling and identification, activity/process characterization, and information systems for the supply chain, i.e., from farm to table. The application of a one-stop logistics service focusing on whole supply chain process integration for fresh agricultural products is studied. A collaborative research project on the supply and logistics of fresh agricultural products in Tianjin was performed, and the requirements for the one-stop logistics management information system are analyzed. Model-driven business transformation, an approach that uses formal models to explicitly define the structure and behavior of a business, is applied in the review and analysis process. Specific requirements for the logistics management solutions are proposed. This research is crucial for developing an integrated one-stop logistics management information system platform for fresh agricultural products.
Horbowy, Jan; Tomczak, Maciej T
2017-01-01
Biomass reconstructions to pre-assessment periods for commercially important and exploitable fish species are important tools for understanding long-term processes and fluctuations at the stock and ecosystem level. For some stocks, only fisheries statistics and fishery-dependent data are available for periods before surveys were conducted. Methods for the backward extension of analytical biomass assessments to years for which only total catch volumes are available were developed and tested in this paper. Two of the approaches developed apply the concept of the surplus production rate (SPR), which is shown to be stock-density dependent if stock dynamics are governed by classical stock-production models. The other approach uses a modified form of the Schaefer production model that allows backward biomass estimation. The performance of the methods was tested on the Arctic cod and North Sea herring stocks, for which analytical biomass estimates extend back to the late 1940s. Next, the methods were applied to extend biomass estimates of the North-east Atlantic mackerel from the 1970s (when analytical biomass estimates become available) back to the 1950s, for which only total catch volumes were available. For comparison, a method that employs a constant SPR, estimated as an average of the observed values, was also applied. The analyses showed that the performance of the methods is stock and data specific; methods that work well for one stock may fail for others. The constant SPR method is not recommended when the SPR is relatively high and the catch volumes in the reconstructed period are low.
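One way to make a Schaefer-type production model step backwards in time (the paper uses a modified form, so this is only a sketch of the idea) is to invert the biomass recursion year by year, which reduces to solving a quadratic for the earlier biomass; the growth parameters, catches and starting biomass below are assumed values.

```python
import numpy as np

# Invert the Schaefer recursion B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]
r, K = 0.4, 4.0e6            # intrinsic growth rate and carrying capacity (illustrative)

def biomass_backwards(B_next, C_t):
    # (r/K)*B^2 - (1 + r)*B + (B_next + C_t) = 0; take the smaller (physical) root
    a, b, c = r / K, -(1 + r), B_next + C_t
    return (-b - np.sqrt(b * b - 4 * a * c)) / (2 * a)

catches = [2.1e5, 1.8e5, 2.5e5]        # total catch in the reconstructed years (tonnes)
B = 1.5e6                              # first biomass estimate from the assessment period
series = [B]
for C in reversed(catches):
    B = biomass_backwards(B, C)
    series.append(B)
print("reconstructed biomass (earliest to latest):", [round(x) for x in reversed(series)])
```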
Horbowy, Jan
2017-01-01
Biomass reconstructions to pre-assessment periods for commercially important and exploitable fish species are important tools for understanding long-term processes and fluctuations at the stock and ecosystem level. For some stocks, only fisheries statistics and fishery-dependent data are available for periods before surveys were conducted. Methods for the backward extension of analytical biomass assessments to years for which only total catch volumes are available were developed and tested in this paper. Two of the approaches developed apply the concept of the surplus production rate (SPR), which is shown to be stock-density dependent if stock dynamics are governed by classical stock-production models. The other approach uses a modified form of the Schaefer production model that allows backward biomass estimation. The performance of the methods was tested on the Arctic cod and North Sea herring stocks, for which analytical biomass estimates extend back to the late 1940s. Next, the methods were applied to extend biomass estimates of the North-east Atlantic mackerel from the 1970s (when analytical biomass estimates become available) back to the 1950s, for which only total catch volumes were available. For comparison, a method that employs a constant SPR, estimated as an average of the observed values, was also applied. The analyses showed that the performance of the methods is stock and data specific; methods that work well for one stock may fail for others. The constant SPR method is not recommended when the SPR is relatively high and the catch volumes in the reconstructed period are low. PMID:29131850
NASA Technical Reports Server (NTRS)
Hall, Callie; Arnone, Robert
2006-01-01
The NASA Applied Sciences Program seeks to transfer NASA data, models, and knowledge into the hands of end-users by forming links with partner agencies and associated decision support tools (DSTs). Through the NASA REASoN (Research, Education and Applications Solutions Network) Cooperative Agreement, the Oceanography Division of the Naval Research Laboratory (NRLSSC) is developing new products through the integration of data from NASA Earth-Sun System assets with coastal ocean forecast models and other available data to enhance coastal management in the Gulf of Mexico. The recipient federal agency for this research effort is the National Oceanic and Atmospheric Administration (NOAA). The contents of this report detail the effort to further the goals of the NASA Applied Sciences Program by demonstrating the use of NASA satellite products combined with data-assimilating ocean models to provide near real-time information to maritime users and coastal managers of the Gulf of Mexico. This effort provides new and improved capabilities for monitoring, assessing, and predicting the coastal environment. Coastal managers can exploit these capabilities through enhanced DSTs at federal, state and local agencies. The project addresses three major issues facing coastal managers: 1) Harmful Algal Blooms (HABs); 2) hypoxia; and 3) freshwater fluxes to the coastal ocean. A suite of ocean products capable of describing Ocean Weather is assembled on a daily basis as the foundation for this semi-operational multiyear effort. This continuous real-time capability brings decision makers a new ability to monitor both normal and anomalous coastal ocean conditions with a steady flow of satellite and ocean model conditions. Furthermore, as the baseline data sets are used more extensively and the customer list increases, customer feedback is obtained and additional customized products are developed and provided to decision makers. Continual customer feedback and responses with new, improved products are required between the researcher and the customer. This document details the methods by which these coastal ocean products are produced, including the data flow, distribution, and verification. Product applications and the degree to which these products are used successfully within NOAA and coordinated with the Mississippi Department of Marine Resources (MDMR) are benchmarked.
Grilo, A; Santos, J
2015-01-01
Business incubators can play a major role in helping to turn a business idea into a technology-based organization that is economically efficient. However, there is a shortage in the literature regarding the efficiency evaluation and productivity evolution of the new technology-based firms (NTBFs) in the incubation scope. This study develops a model based on the data envelopment analysis (DEA) methodology, which allows the incubated NTBFs to evaluate and improve the efficiency of their management. Moreover, the Malmquist index is used to examine productivity change. The index is decomposed into multiple components to give insights into the root sources of productivity change. The proposed model was applied in a case study with 13 NTBFs incubated. From that study, we conclude that inefficient firms invest excessively in research and development (R&D), and, on average, firms have a productivity growth in the period of study.
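The DEA efficiency scores referred to above come from solving one linear program per firm. A minimal input-oriented CCR envelopment sketch with invented data (two inputs, one output) is shown below; the Malmquist productivity decomposition is not included.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data for four incubated firms. Inputs: R&D spend, staff; output: sales.
X = np.array([[100.0, 5], [150.0, 8], [80.0, 4], [200.0, 12]])   # inputs, one row per firm
Y = np.array([[300.0], [330.0], [280.0], [360.0]])               # outputs

def ccr_efficiency(o):
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]            # decision variables: [theta, lambda_1..lambda_n]
    A_ub, b_ub = [], []
    for i in range(m):                     # sum_j lambda_j * x_ij - theta * x_io <= 0
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):                     # -sum_j lambda_j * y_rj <= -y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun                         # theta: 1 = efficient, < 1 = inefficient

for o in range(len(X)):
    print(f"firm {o}: efficiency {ccr_efficiency(o):.3f}")
```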
Grilo, A.; Santos, J.
2015-01-01
Business incubators can play a major role in helping to turn a business idea into a technology-based organization that is economically efficient. However, there is a shortage in the literature regarding the efficiency evaluation and productivity evolution of the new technology-based firms (NTBFs) in the incubation scope. This study develops a model based on the data envelopment analysis (DEA) methodology, which allows the incubated NTBFs to evaluate and improve the efficiency of their management. Moreover, the Malmquist index is used to examine productivity change. The index is decomposed into multiple components to give insights into the root sources of productivity change. The proposed model was applied in a case study with 13 NTBFs incubated. From that study, we conclude that inefficient firms invest excessively in research and development (R&D), and, on average, firms have a productivity growth in the period of study. PMID:25874266
Zhu, Tong; Moussa, Ehab M; Witting, Madeleine; Zhou, Deliang; Sinha, Kushal; Hirth, Mario; Gastens, Martin; Shang, Sherwin; Nere, Nandkishor; Somashekar, Shubha Chetan; Alexeenko, Alina; Jameel, Feroz
2018-07-01
Scale-up and technology transfer of lyophilization processes remains a challenge that requires thorough characterization of the laboratory and larger scale lyophilizers. In this study, computational fluid dynamics (CFD) was employed to develop computer-based models of both laboratory and manufacturing scale lyophilizers in order to understand the differences in equipment performance arising from distinct designs. CFD coupled with steady state heat and mass transfer modeling of the vial were then utilized to study and predict independent variables such as shelf temperature and chamber pressure, and response variables such as product resistance, product temperature and primary drying time for a given formulation. The models were then verified experimentally for the different lyophilizers. Additionally, the models were applied to create and evaluate a design space for a lyophilized product in order to provide justification for the flexibility to operate within a certain range of process parameters without the need for validation. Published by Elsevier B.V.
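The steady-state vial heat and mass transfer model mentioned above is commonly written as a balance between shelf-to-vial heat flow and the sublimation it drives. The sketch below solves that balance for product temperature and a rough primary drying time; Kv, Rp, the fill mass and the cycle settings are assumed illustrative values, not those fitted in the study.

```python
import numpy as np
from scipy.optimize import brentq

# Steady-state vial model for primary drying; all parameters illustrative.
Av, Ap = 3.8e-4, 3.3e-4        # vial outer and inner (product) cross-sections (m^2)
Kv = 20.0                      # vial heat-transfer coefficient (W m^-2 K^-1)
Rp = 2.5e5                     # dried-layer resistance to vapour flow (Pa m^2 s kg^-1)
dHs = 2.84e6                   # heat of ice sublimation (J kg^-1)
T_shelf, P_ch = 263.15, 10.0   # shelf temperature (K) and chamber pressure (Pa)

p_ice = lambda T: np.exp(-6144.96 / T + 28.9074)   # vapour pressure of ice (Pa)

def energy_balance(Tp):
    sublimation = Ap * (p_ice(Tp) - P_ch) / Rp      # sublimation rate (kg/s)
    return Kv * Av * (T_shelf - Tp) - dHs * sublimation

Tp = brentq(energy_balance, 220.0, T_shelf)         # product temperature (K)
dm_dt = Ap * (p_ice(Tp) - P_ch) / Rp
ice_mass = 2.85e-3                                  # kg of ice per vial (assumed fill)
print(f"Tp = {Tp - 273.15:.1f} C, primary drying ~ {ice_mass / dm_dt / 3600:.1f} h")
```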
NASA Astrophysics Data System (ADS)
Deng, Bo; Shi, Yaoyao
2017-11-01
Tape winding is an effective technology for fabricating rotationally symmetric composite products. Nevertheless, some unavoidable defects seriously influence the performance of wound products. One of the crucial ways to assess the quality of fiber-reinforced composite products is to examine their void content; significant improvement in the products' mechanical properties can be achieved by minimizing void defects. Two methods were applied in this study, finite element analysis and experimental testing, to investigate the mechanism of void formation during composite tape winding. Based on the theories of interlayer intimate contact and the Domain Superposition Technique (DST), a three-dimensional model of prepreg tape voids was built in SolidWorks. ABAQUS simulation software was then used to simulate how the void content changes with pressure and temperature. Finally, a series of experiments was performed to determine the accuracy of the model-based predictions. The results showed that the model is effective for predicting the void content in the composite tape winding process.
NASA Astrophysics Data System (ADS)
Septiani, Eka Lutfi; Widiyastuti, W.; Winardi, Sugeng; Machmudah, Siti; Nurtono, Tantular; Kusdianto
2016-02-01
Flame-assisted spray dryers are widely used for large-scale production of nanoparticles because of their capability. A numerical approach is needed to predict combustion and particle production during scale-up and optimization, because experimental observation is difficult and relatively costly. Computational Fluid Dynamics (CFD) can resolve the momentum, energy and mass transfer, making it more efficient than experiments in terms of time and cost. Here, two turbulence models, k-ɛ and Large Eddy Simulation, were compared and applied to a flame-assisted spray dryer system. The energy source for particle drying was the combustion of LPG as fuel with air as oxidizer and carrier gas, modelled as non-premixed combustion in the simulation. Silica particles produced from a silica sol precursor were used for the particle modelling. From several comparisons, i.e. flame contour, temperature distribution and particle size distribution, the Large Eddy Simulation turbulence model provided results closest to the experimental data.
An effective model for ergonomic optimization applied to a new automotive assembly line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duraccio, Vincenzo; Elia, Valerio; Forcina, Antonio
2016-06-08
An efficient ergonomic optimization can lead to a significant improvement in production performance and a considerable reduction of costs. In the present paper a new model for ergonomic optimization is proposed. The new approach is based on the criteria defined by the National Institute for Occupational Safety and Health, adapted to Italian legislation. The proposed model provides an ergonomic optimization by analyzing the ergonomic relations of manual work under correct conditions. The model includes a schematic and systematic analysis method for the operations and identifies all the ergonomic aspects to be evaluated. The proposed approach has been applied to an automotive assembly line, where the repeatability of operations makes the optimization fundamental. The application clearly demonstrates the effectiveness of the new approach.
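The NIOSH criteria referenced above are usually applied through the revised lifting equation, which multiplies a load constant by task multipliers to obtain a recommended weight limit (RWL). A sketch follows; the task geometry and the frequency and coupling multipliers (normally taken from the NIOSH tables) are assumed values, and the paper's adaptation to Italian legislation is not reflected.

```python
def niosh_rwl(H, V, D, A, FM, CM, LC=23.0):
    """Revised NIOSH lifting equation (metric form).
    H: horizontal distance (cm), V: vertical origin height (cm),
    D: vertical travel distance (cm), A: asymmetry angle (deg),
    FM, CM: frequency and coupling multipliers from the NIOSH tables."""
    HM = min(1.0, 25.0 / H)
    VM = 1.0 - 0.003 * abs(V - 75.0)
    DM = 0.82 + 4.5 / D
    AM = 1.0 - 0.0032 * A
    return LC * HM * VM * DM * AM * FM * CM

# Hypothetical workstation: 12 kg part, multipliers assumed for illustration
rwl = niosh_rwl(H=40, V=60, D=50, A=30, FM=0.85, CM=0.95)
lifting_index = 12.0 / rwl
print(f"RWL = {rwl:.1f} kg, lifting index = {lifting_index:.2f}")
```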
Analytical method to estimate waterflood performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cremonini, A.S.
A method to predict oil production resulting from the injection of immiscible fluids is described. The method is based on two models: one of them considers the vertical and displacement efficiencies, assuming unit areal efficiency and, therefore, linear flow. It is a layered model without crossflow in which Buckley-Leverett displacement theory is used for each layer. The results obtained in the linear model are applied to a stream-channel model similar to the one used by Higgins and Leighton. In this way, areal efficiency is taken into account. The principal innovation is the possibility of applying different relative permeability curves to each layer. A numerical example in a five-spot pattern, which uses relative permeability data obtained from reservoir core samples, is presented.
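The per-layer displacement calculation rests on the Buckley-Leverett fractional-flow construction. A sketch with assumed Corey-type relative permeability curves and viscosities is shown below, including the Welge tangent used to locate the shock-front saturation.

```python
import numpy as np

# Fractional-flow (Buckley-Leverett) sketch; all rock and fluid parameters illustrative.
Swc, Sor = 0.2, 0.2                  # connate water and residual oil saturations
krw0, kro0, nw, no = 0.3, 0.8, 2.0, 2.0
mu_w, mu_o = 1.0, 5.0                # water and oil viscosities (cp)

Sw = np.linspace(Swc + 1e-4, 1 - Sor, 400)
Se = (Sw - Swc) / (1 - Swc - Sor)    # normalised saturation
krw = krw0 * Se**nw                  # Corey relative permeabilities
kro = kro0 * (1 - Se)**no
fw = 1.0 / (1.0 + (kro / krw) * (mu_w / mu_o))

# Welge tangent from (Swc, 0) gives the shock-front saturation
slope = fw / (Sw - Swc)
i_front = np.argmax(slope)
print(f"front saturation Swf = {Sw[i_front]:.3f}, fw at front = {fw[i_front]:.3f}")
```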
Dynamic metabolic modeling for a MAb bioprocess.
Gao, Jianying; Gorenflo, Volker M; Scharer, Jeno M; Budman, Hector M
2007-01-01
Production of monoclonal antibodies (MAb) for diagnostic or therapeutic applications has become an important task in the pharmaceutical industry. The efficiency of high-density reactor systems can be potentially increased by model-based design and control strategies. Therefore, a reliable kinetic model for cell metabolism is required. A systematic procedure based on metabolic modeling is used to model nutrient uptake and key product formation in a MAb bioprocess during both the growth and post-growth phases. The approach combines the key advantages of stoichiometric and kinetic models into a complete metabolic network while integrating the regulation and control of cellular activity. This modeling procedure can be easily applied to any cell line during both the cell growth and post-growth phases. Quadratic programming (QP) has been identified as a suitable method to solve the underdetermined constrained problem related to model parameter identification. The approach is illustrated for the case of murine hybridoma cells cultivated in stirred spinners.
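The underdetermined, constrained flux-identification step can be illustrated with a bounded least-squares surrogate for the QP described (SciPy's lsq_linear); the toy linear map, measured rates and bounds below are assumptions, not the hybridoma model of the study.

```python
import numpy as np
from scipy.optimize import lsq_linear

# S maps four unknown internal fluxes to three measured specific rates (illustrative).
S = np.array([[ 1.0, -1.0, -1.0,  0.0],
              [ 0.0,  1.0,  0.0, -1.0],
              [ 0.0,  0.0,  1.0,  0.0]])
q = np.array([0.50, 0.35, 0.10])        # measured rates (mmol/gDW/h), illustrative

lb = np.zeros(4)                        # irreversibility constraints
ub = np.full(4, 10.0)
res = lsq_linear(S, q, bounds=(lb, ub)) # bounded least-squares stand-in for the QP step
print("estimated fluxes:", res.x.round(3))
```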
ECUT (Energy Conversion and Utilization Technologies) program: Biocatalysis project
NASA Technical Reports Server (NTRS)
Baresi, Larry
1989-01-01
The Annual Report presents the fiscal year (FY) 1988 research activities and accomplishments, for the Biocatalysis Project of the U.S. Department of Energy, Energy Conversion and Utilization Technologies (ECUT) Division. The ECUT Biocatalysis Project is managed by the Jet Propulsion Laboratory, California Institute of Technology. The Biocatalysis Project is a mission-oriented, applied research and exploratory development activity directed toward resolution of the major generic technical barriers that impede the development of biologically catalyzed commercial chemical production. The approach toward achieving project objectives involves an integrated participation of universities, industrial companies and government research laboratories. The Project's technical activities were organized into three work elements: (1) The Molecular Modeling and Applied Genetics work element includes research on modeling of biological systems, developing rigorous methods for the prediction of three-dimensional (tertiary) protein structure from the amino acid sequence (primary structure) for designing new biocatalysis, defining kinetic models of biocatalyst reactivity, and developing genetically engineered solutions to the generic technical barriers that preclude widespread application of biocatalysis. (2) The Bioprocess Engineering work element supports efforts in novel bioreactor concepts that are likely to lead to substantially higher levels of reactor productivity, product yields and lower separation energetics. Results of work within this work element will be used to establish the technical feasibility of critical bioprocess monitoring and control subsystems. (3) The Bioprocess Design and Assessment work element attempts to develop procedures (via user-friendly computer software) for assessing the energy-economics of biocatalyzed chemical production processes, and initiation of technology transfer for advanced bioprocesses.
ECUT (Energy Conversion and Utilization Technologies) program: Biocatalysis project
NASA Astrophysics Data System (ADS)
Baresi, Larry
1989-03-01
The Annual Report presents the fiscal year (FY) 1988 research activities and accomplishments, for the Biocatalysis Project of the U.S. Department of Energy, Energy Conversion and Utilization Technologies (ECUT) Division. The ECUT Biocatalysis Project is managed by the Jet Propulsion Laboratory, California Institute of Technology. The Biocatalysis Project is a mission-oriented, applied research and exploratory development activity directed toward resolution of the major generic technical barriers that impede the development of biologically catalyzed commercial chemical production. The approach toward achieving project objectives involves an integrated participation of universities, industrial companies and government research laboratories. The Project's technical activities were organized into three work elements: (1) The Molecular Modeling and Applied Genetics work element includes research on modeling of biological systems, developing rigorous methods for the prediction of three-dimensional (tertiary) protein structure from the amino acid sequence (primary structure) for designing new biocatalysis, defining kinetic models of biocatalyst reactivity, and developing genetically engineered solutions to the generic technical barriers that preclude widespread application of biocatalysis. (2) The Bioprocess Engineering work element supports efforts in novel bioreactor concepts that are likely to lead to substantially higher levels of reactor productivity, product yields and lower separation energetics. Results of work within this work element will be used to establish the technical feasibility of critical bioprocess monitoring and control subsystems. (3) The Bioprocess Design and Assessment work element attempts to develop procedures (via user-friendly computer software) for assessing the energy-economics of biocatalyzed chemical production processes, and initiation of technology transfer for advanced bioprocesses.
Diffractive heavy quark production in AA collisions at the LHC at NLO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Machado, M. M.; Ducati, M. B. Gay; Machado, M. V. T.
2011-07-15
The single and double diffractive cross sections for heavy quark production are evaluated at NLO accuracy for hadronic and heavy-ion collisions at the LHC. Diffractive charm and bottom production is the main subject of this work, providing predictions for CaCa, PbPb and pPb collisions. The hard diffraction formalism is considered using the Ingelman-Schlein model, where a recent parametrization of the Pomeron structure function (DPDF) is applied. Absorptive corrections are taken into account as well. The diffractive ratios are estimated and theoretical uncertainties are discussed. A comparison with competing production channels is also presented.
Diffractive heavy quark production in AA collisions at the LHC at NLO
NASA Astrophysics Data System (ADS)
Machado, M. M.; Ducati, M. B. Gay; Machado, M. V. T.
2011-07-01
The single and double diffractive cross sections for heavy quark production are evaluated at NLO accuracy for hadronic and heavy-ion collisions at the LHC. Diffractive charm and bottom production is the main subject of this work, providing predictions for CaCa, PbPb and pPb collisions. The hard diffraction formalism is considered using the Ingelman-Schlein model, where a recent parametrization of the Pomeron structure function (DPDF) is applied. Absorptive corrections are taken into account as well. The diffractive ratios are estimated and theoretical uncertainties are discussed. A comparison with competing production channels is also presented.
A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.
Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko
2015-10-30
Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.
A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems
Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko
2015-01-01
Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982
ECUT: Energy Conversion and Utilization Technologies program. Biocatalysis project
NASA Technical Reports Server (NTRS)
1990-01-01
The Biocatalysis Project is a mission-oriented, applied research and exploratory development activity directed toward resolution of the major generic technical barriers that impede the development of biologically catalyzed commercial chemical production. The approach toward achieving project objectives involves an integrated participation of Universities, Industrial Companies and Government Research Laboratories. The Project's technical activities were organized into three work elements: molecular modeling and applied genetics; bioprocess engineering; and bioprocess design and assessment.
2008-09-30
[Truncated record] Application of Earth Sciences Products for use in Next Generation Numerical Aerosol ... Through this system, we will be able to advance a number of US Navy Applied Science needs in the areas of improved ... Related reference: Geophysical Research Abstracts, Vol. 10, EGU2008-A-11193, SRef-ID: 1607-7962/gra/EGU2008-A-11193, EGU General Assembly 2008.
Barbour R. James.; Xiaoping Zhou; Jeffrey P. Prestemon
2008-01-01
This study reports the results from a 5 year simulation of forest thinning intended to reduce fire hazard on publicly managed lands in the western United States. A state simulation model of interrelated timber markets was used to evaluate the timber product outputs. Approximately 84 million acres (34 million hectares), or 66% of total timberland in the western United...
NASA Astrophysics Data System (ADS)
Widowati, A.; Anjarsari, P.; Zuhdan, K. P.; Dita, A.
2018-03-01
The challenges of the 21st century require innovative solutions. Education must be able to build an understanding of science learning that leads to the formation of scientifically literate learners. This research was conducted to produce a prototype science worksheet based on the Nature of Science (NoS) within an inquiry approach and to assess the effectiveness of this product for developing scientific literacy. The research followed a development and research design based on the Four D model and the Borg & Gall model. There were four main phases (define, design, develop, disseminate) and additional phases (preliminary field testing, main product revision, main field testing, and operational product revision). The research subjects were junior high school students in Yogyakarta. The instruments used included a product validation questionnaire sheet and a scientific literacy test. The validation data were analyzed descriptively, and the test results were analyzed using N-gain scores. The results showed that the worksheet applying NoS within an inquiry-based learning approach was judged appropriate by experts and teachers, and that students' scientific literacy improved, with a high-category N-gain score of 0.71, when students used the worksheet.
Material Testing and Initial Pavement Design Modeling: Minnesota Road Research Project
DOT National Transportation Integrated Search
1996-09-01
Between January 1990 and December 1994, a study verified and applied a Corps of Engineers-developed mechanistic design and evaluation method for pavements in seasonal frost areas as part of a Construction Productivity Advancement Research (CPAR) proj...
Agroecosystems & Environment | National Agricultural Library
Agricultural products in useful formats (maps, tables, graphs) ... Useful to Usable: Developing usable ... integrated expertise in applied climatology, crop modeling, agronomy, cyber-technology, agricultural ...
Applications of AVHRR-Derived Ice Motions for the Arctic and Antarctic
NASA Technical Reports Server (NTRS)
Maslanik, James; Emery, William
1998-01-01
Characterization and diagnosis of sea ice/atmosphere/ocean interactions require a synthesis of observations and modeling to identify the key mechanisms controlling the ice/climate system. In this project, we combined product generation, observational analyses, and modeling to define and interpret variability in ice motion in conjunction with thermodynamic factors such as surface temperature and albedo. The goals of this work were twofold: (1) to develop and test procedures to produce an integrated set of polar products from remotely-sensed and supporting data; and (2) to apply these data to understand processes at work in controlling sea ice distribution.
Parameters modelling of amaranth grain processing technology
NASA Astrophysics Data System (ADS)
Derkanosova, N. M.; Shelamova, S. A.; Ponomareva, I. N.; Shurshikova, G. V.; Vasilenko, O. A.
2018-03-01
The article presents a technique for calculating the composition of a multicomponent bakery mixture for the production of enriched products, taking into account the instability of nutrient content, ensuring the fulfilment of technological requirements and, at the same time, considering consumer preferences. The results of modelling and the analysis of optimal solutions are illustrated by the example of calculating the composition of a three-component mixture of wheat and rye flour with an enriching component, whole-hulled amaranth flour, applied to the technology of bread made from a mixture of rye and wheat flour on a liquid leaven.
ERIC Educational Resources Information Center
Kim, Sun Hee; Kim, Soojin
2010-01-01
What should we do to educate the mathematically gifted and how should we do it? In this research, to satisfy diverse mathematical and cognitive demands of the gifted who have excellent learning ability and task tenacity in mathematics, we sought to apply mathematical modeling. One of the objectives of the gifted education in Korea is cultivating…
Contrast of degraded and restored stream habitat using an individual-based salmon model
S. F. Railsback; M. Gard; Bret Harvey; Jason White; J.K.H. Zimmerman
2013-01-01
Stream habitat restoration projects are popular, but can be expensive and difficult to evaluate. We describe inSALMO, an individual-based model designed to predict habitat effects on freshwater life stages (spawning through juvenile out-migration) of salmon. We applied inSALMO to Clear Creek, California, simulating the production of total and large (>5 cm FL)...
Learning CAD at University through Summaries of the Rules of Design Intent
ERIC Educational Resources Information Center
Barbero, Basilio Ramos; Pedrosa, Carlos Melgosa; Samperio, Raúl Zamora
2017-01-01
The ease with which 3D CAD models may be modified and the ease with which they may be reused are two key aspects that improve design intent and can significantly shorten the development timeline of a product. A set of rules is gathered from various authors that take different 3D modelling strategies into account. These rules are then applied to CAD…
Layout design-based research on optimization and assessment method for shipbuilding workshop
NASA Astrophysics Data System (ADS)
Liu, Yang; Meng, Mei; Liu, Shuang
2013-06-01
This study examines a three-dimensional visualization program, with an emphasis on improving genetic algorithms for the layout optimization of a standard, discrete shipbuilding workshop. Using a steel processing workshop as an example, the principle of minimum logistics cost is applied to obtain an idealized equipment layout and a mathematical model whose objective is to minimize the total travel distance between machines. An improved control operator is introduced to raise the iterative efficiency of the genetic algorithm and to yield the relevant parameters. The Computer Aided Tri-Dimensional Interface Application (CATIA) software is applied to establish the manufacturing resource base and a parametric model of the steel processing workshop. Based on the results of the optimized planar logistics, a visual parametric model of the steel processing workshop is constructed, and qualitative and quantitative adjustments are then applied to the model. A method for evaluating the layout results is subsequently established using AHP. The optimized discrete production workshop provides a useful reference for the optimization and layout of digitalized production workshops.
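As a purely illustrative sketch of the optimization idea described above (not the authors' CATIA/AHP workflow), the following genetic algorithm searches machine orderings that minimize flow-weighted travel distance; the flow matrix, machine spacing and GA settings are hypothetical placeholders.

```python
import random

# Hypothetical material-flow matrix: parts moved per day between machines i and j.
FLOW = [
    [0, 30, 5, 10],
    [30, 0, 20, 5],
    [5, 20, 0, 25],
    [10, 5, 25, 0],
]
SPACING = 6.0  # metres between adjacent machine slots (hypothetical)


def cost(order):
    """Total flow * distance for a linear arrangement of machines."""
    pos = {m: i * SPACING for i, m in enumerate(order)}
    return sum(FLOW[a][b] * abs(pos[a] - pos[b])
               for a in range(len(order)) for b in range(a + 1, len(order)))


def crossover(p1, p2):
    """Order crossover (OX): keep a slice of p1, fill the rest in p2's order."""
    i, j = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[i:j] = p1[i:j]
    fill = [m for m in p2 if m not in child]
    for k, gene in enumerate(child):
        if gene is None:
            child[k] = fill.pop(0)
    return child


def mutate(order, rate=0.2):
    """Swap two machines with a small probability."""
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order


def optimize(pop_size=30, generations=200):
    pop = [random.sample(range(len(FLOW)), len(FLOW)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)                # rank by logistics cost
        elite = pop[: pop_size // 2]      # keep the better half
        pop = elite + [mutate(crossover(*random.sample(elite, 2))) for _ in elite]
    return min(pop, key=cost)


best = optimize()
print("best machine order:", best, "logistics cost:", cost(best))
```

In practice the layout is two- or three-dimensional and constrained by the workshop geometry; the permutation encoding and order crossover shown here are only a common starting point for such problems.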
NASA Astrophysics Data System (ADS)
Vaicberg, H.; Palmeira, A. C. P. A.; Nunes, A.
2017-12-01
Studies of South Atlantic cyclones are hampered mainly by the scarcity of observations; therefore, remote sensing and global (re)analysis products are usually employed in investigations of their evolution. However, the frequent use of global reanalyses can hinder the assessment of the characteristics of the cyclones found in the South Atlantic. In that regard, studies of "subtropical" cyclones have been performed using the 25-km-resolution Satellite-enhanced Regional Downscaling for Applied Studies (SRDAS), a product developed at the Federal University of Rio de Janeiro in Brazil. In SRDAS, the Regional Spectral Model assimilates precipitation estimates from environmental satellites while dynamically downscaling a global reanalysis, using the spectral nudging technique to keep the large-scale features of the regional solution in agreement with the driving reanalysis. The use of regional models in the downscaling of general circulation models provides more detailed information on weather and climate. As a way of illustrating the usefulness of SRDAS in the study of subtropical South Atlantic cyclones, the subtropical cyclone Anita was selected because of its intensity. Anita developed near the Brazilian south/southeast coast, causing damage to local communities. Comparisons with available observations demonstrated the skill of SRDAS in simulating such an extreme event.
NASA Technical Reports Server (NTRS)
Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David
1990-01-01
This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.
Quantum corrections of the truncated Wigner approximation applied to an exciton transport model.
Ivanov, Anton; Breuer, Heinz-Peter
2017-04-01
We modify the path integral representation of exciton transport in open quantum systems such that an exact description of the quantum fluctuations around the classical evolution of the system is possible. As a consequence, the time evolution of the system observables is obtained by calculating the average of a stochastic difference equation which is weighted with a product of pseudoprobability density functions. From the exact equation of motion one can clearly identify the terms that are also present if we apply the truncated Wigner approximation. This description of the problem is used as a basis for the derivation of a new approximation, whose validity goes beyond the truncated Wigner approximation. To demonstrate this we apply the formalism to a donor-acceptor transport model.
Hamidi, Ahd; Kreeftenberg, Hans; V D Pol, Leo; Ghimire, Saroj; V D Wielen, Luuk A M; Ottens, Marcel
2016-05-01
Vaccination is one of the most successful public health interventions and a cost-effective tool for preventing deaths among young children. The earliest vaccines were developed following empirical methods, creating vaccines by trial and error. New process development tools (for example mathematical modeling), as well as new regulatory initiatives requiring better understanding of both the product and the process, are being applied to well-characterized biopharmaceuticals (for example recombinant proteins); the vaccine industry still lags behind these industries. A production process for a new Haemophilus influenzae type b (Hib) conjugate vaccine, including related quality control (QC) tests, was developed and transferred to a number of emerging vaccine manufacturers. This contributed to a sustainable global supply of affordable Hib conjugate vaccines, as illustrated by the market launch of the first Hib vaccine based on this technology in 2007 and the concomitant price reduction of Hib vaccines. This paper describes the development approach followed for this Hib conjugate vaccine as well as the mathematical modeling tool applied recently in order to indicate options for further improvements of the initial Hib process. The strategy followed during the process development of this Hib conjugate vaccine was a targeted and integrated approach based on prior knowledge and experience with similar products, using multi-disciplinary expertise. Mathematical modeling was used to develop a predictive model for the initial Hib process (the 'baseline' model) as well as an 'optimized' model, by proposing a number of process changes which could lead to further reduction in price. © 2016 American Institute of Chemical Engineers. Biotechnol. Prog., 32:568-580, 2016.
NASA Technical Reports Server (NTRS)
Carey, Lawrence; Koshak, William; Peterson, Harold; Matthee, Retha; Bain, Lamont
2013-01-01
The Deep Convective Clouds and Chemistry (DC3) experiment seeks to quantify the relationship between storm physics, lightning characteristics and the production of nitrogen oxides via lightning (LNOx). The focus of this study is to investigate the kinematic and microphysical control of lightning properties, particularly those that may govern LNOx production, such as flash rate, type and extent across Alabama during DC3. Prior studies have demonstrated that lightning flash rate and type is correlated to kinematic and microphysical properties in the mixed-phase region of thunderstorms such as updraft volume and graupel mass. More study is required to generalize these relationships in a wide variety of storm modes and meteorological conditions. Less is known about the co-evolving relationship between storm physics, morphology and three-dimensional flash extent, despite its importance for LNOx production. To address this conceptual gap, the NASA Lightning Nitrogen Oxides Model (LNOM) is applied to North Alabama Lightning Mapping Array (NALMA) and Vaisala National Lightning Detection Network(TM) (NLDN) observations following ordinary convective cells through their lifecycle. LNOM provides estimates of flash rate, flash type, channel length distributions, lightning segment altitude distributions (SADs) and lightning NOx production profiles. For this study, LNOM is applied in a Lagrangian sense to multicell thunderstorms over Northern Alabama on two days during DC3 (21 May and 11 June 2012) in which aircraft observations of NOx are available for comparison. The LNOM lightning characteristics and LNOX production estimates are compared to the evolution of updraft and precipitation properties inferred from dual-Doppler and polarimetric radar analyses applied to observations from a nearby radar network, including the UAH Advanced Radar for Meteorological and Operational Research (ARMOR). Given complex multicell evolution, particular attention is paid to storm morphology, cell mergers and possible dynamical, microphysical and electrical interaction of individual cells when testing various hypotheses.
NASA Astrophysics Data System (ADS)
Ichii, K.; Kondo, M.; Wang, W.; Hashimoto, H.; Nemani, R. R.
2012-12-01
Various satellite-based spatial products such as evapotranspiration (ET) and gross primary productivity (GPP) are now produced by integration of ground and satellite observations. Effective use of these multiple satellite-based products in terrestrial biosphere models is an important step toward better understanding of terrestrial carbon and water cycles. However, due to the complexity of terrestrial biosphere models and their large number of model parameters, applying these spatial data sets in such models is difficult. In this study, we established an effective but simple framework to refine a terrestrial biosphere model, Biome-BGC, using multiple satellite-based products as constraints. We tested the framework in the monsoon Asia region covered by AsiaFlux observations. The framework is based on hierarchical analysis (Wang et al. 2009) with model parameter optimization constrained by satellite-based spatial data. The Biome-BGC model is separated into several tiers to minimize the freedom of model parameter selection and to maximize independence from the whole model. For example, the snow sub-model is first optimized using the MODIS snow cover product, followed by the soil water sub-model optimized with satellite-based ET (estimated by an empirical upscaling method, the Support Vector Regression (SVR) method; Yang et al. 2007), the photosynthesis model optimized with satellite-based GPP (also based on the SVR method), and the respiration and residual carbon cycle models optimized with biomass data. In an initial assessment, we found that most of the default sub-models (e.g. snow, water cycle and carbon cycle) showed large deviations from remote sensing observations. However, these biases were removed by applying the proposed framework. For example, gross primary productivities were initially underestimated in boreal and temperate forests and overestimated in tropical forests, but the parameter optimization scheme successfully reduced these biases. Our analysis shows that terrestrial carbon and water cycle simulations in monsoon Asia were greatly improved, and that the use of multiple satellite observations within this framework is an effective way to improve terrestrial biosphere models.
Jiang, Xia; Jin, Xiang-can; Yan, Chang-zhou; Yediler, Ayfer; Ou, Zi-qing; Kettrup, Antonius
2004-01-01
An advanced closed chamber system was used to study the fate of phenanthrene (a 3-ring PAH) in the presence of linear alkylbenzene sulphonates (LAS). The results showed that mineralization and metabolism of phenanthrene were fast in the "culture solution-lava-plant-air" model ecological system. The distribution proportions of the applied 14C-activity in this simulated ecological system were 41%-45% in the plant, 14%-10% in the lava and 1% in the culture solution, with 18%-29% and 11%-8% recovered in the forms of VOCs and CO2, respectively. The main part of the applied 14C-activity existed in two forms: polar metabolites (25%), which were mainly distributed in the root (23%), and an unextractable fraction (23%), which had been incorporated into the plant root (8.98%) and shoot (0.53%) or bound to the lava (13.2%). The main metabolites of phenanthrene were polar compounds (25% of applied 14C-activity), and a small portion of the 14C-activity was identified as non-polar metabolites (6% of applied 14C-activity) and apparent phenanthrene (1.91% of applied 14C-activity). Phenanthrene and its metabolites can be taken up through plant roots and translocated to plant shoots. The presence of LAS at a concentration of 200 mg/L significantly increased the concentration of 14C-activity in the plant and the production of VOCs, while decreasing the phenanthrene level in the plant and the production of CO2.
NASA Astrophysics Data System (ADS)
Blyverket, J.; Hamer, P.; Bertino, L.; Lahoz, W. A.
2017-12-01
The European Space Agency Climate Change Initiative for soil moisture (ESA CCI SM) was initiated in 2012 for a period of six years, with the objective of producing the most complete and consistent global soil moisture data record based on both active and passive sensors. The ESA CCI SM products consist of three surface soil moisture datasets: the ACTIVE product and the PASSIVE product, created by fusing scatterometer and radiometer soil moisture data, respectively, and the COMBINED product, a blended product based on the former two datasets. In this study we assimilate both the ACTIVE and PASSIVE products globally at a 25 km spatial resolution. The different satellite platforms have different overpass times; an observation is mapped to the hours 00:00, 06:00, 12:00 or 18:00 if it falls within a 3-hour window centred at these times. We use the SURFEX land surface model with the ISBA diffusion scheme for the soil hydrology. For the assimilation routine we apply the Ensemble Transform Kalman Filter (ETKF). The land surface model is driven by perturbed MERRA-2 atmospheric forcing data, which has a temporal resolution of one hour and is mapped to the SURFEX model grid. Bias between the land surface model and the ESA CCI product is removed by cumulative distribution function (CDF) matching. This work is a step towards creating a global root zone soil moisture product from the most comprehensive satellite surface soil moisture product available. As a first step we consider the period from 2010 to 2016, which allows for comparison against other global root zone soil moisture products (such as SMAP Level 4, which is independent of the ESA CCI SM product).
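The CDF matching step mentioned above can be sketched as a simple quantile-mapping operation; the snippet below uses synthetic stand-ins for the ESA CCI retrievals and the SURFEX model climatology, so the array names and distributions are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
model_sm = rng.beta(4, 6, size=5000) * 0.45      # model soil moisture climatology (m3/m3)
satellite_sm = rng.beta(2, 5, size=5000) * 0.60  # biased satellite retrievals (synthetic)


def cdf_match(values, source, reference):
    """Map each value through the source CDF onto the reference quantiles."""
    src_sorted = np.sort(source)
    ref_sorted = np.sort(reference)
    # empirical non-exceedance probability of each value in the source climatology
    p = np.searchsorted(src_sorted, values, side="right") / len(src_sorted)
    return np.quantile(ref_sorted, np.clip(p, 0.0, 1.0))


rescaled = cdf_match(satellite_sm, source=satellite_sm, reference=model_sm)
print("means (satellite, rescaled, model):",
      satellite_sm.mean().round(3), rescaled.mean().round(3), model_sm.mean().round(3))
```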
New product forecasting with limited or no data
NASA Astrophysics Data System (ADS)
Ismai, Zuhaimy; Abu, Noratikah; Sufahani, Suliadi
2016-10-01
In the real world, forecasts are usually based on historical data, with the assumption that past behaviour will continue into the future. But how do we forecast when no such data are available? New products or new technologies normally have only a limited amount of data available. Knowing that forecasting is valuable for decision making, this paper presents forecasting of new products or new technologies using aggregate diffusion models and a modified Bass model. A newly launched Proton car and its market penetration were chosen to demonstrate the possibility of forecasting sales demand when limited or no data are available. The model was developed to forecast the diffusion of a new vehicle, or an innovation, in Malaysian society. It represents the level of spread of the new vehicle among a given segment of society as a simple mathematical function of the time elapsed since the introduction of the new product, and is used to forecast car sales volume. A procedure for the proposed diffusion model was designed and its parameters were estimated. Results obtained by applying the proposed diffusion model and numerical calculation show that the model is robust and effective for forecasting demand for the new vehicle. The results reveal that the newly developed modified Bass diffusion demand function contributes significantly to forecasting the diffusion of the new Proton car or other new products.
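For readers unfamiliar with the Bass diffusion model referred to above, the closed-form adoption curve can be evaluated directly; the market potential m and the innovation/imitation coefficients p and q below are illustrative values, not the parameters estimated in the study.

```python
import numpy as np


def bass_adoptions(m, p, q, periods):
    """Per-period adoptions from the closed-form Bass cumulative curve F(t)."""
    t = np.arange(1, periods + 1)
    F = (1 - np.exp(-(p + q) * t)) / (1 + (q / p) * np.exp(-(p + q) * t))
    cumulative = m * F
    return np.diff(np.concatenate(([0.0], cumulative)))


# m: market potential, p: coefficient of innovation, q: coefficient of imitation
sales = bass_adoptions(m=120_000, p=0.03, q=0.38, periods=24)  # 24 months, hypothetical
for month, s in enumerate(sales[:6], start=1):
    print(f"month {month}: forecast sales ~ {s:,.0f} units")
```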
Estimating Top-of-Atmosphere Thermal Infrared Radiance Using MERRA-2 Atmospheric Data
NASA Astrophysics Data System (ADS)
Kleynhans, Tania
Spaceborne thermal infrared sensors have been extensively used for environmental research as well as cross-calibration of other thermal sensing systems. Thermal infrared data from satellites such as Landsat and Terra/MODIS have limited temporal resolution (with a repeat cycle of 1 to 2 days for Terra/MODIS, and 16 days for Landsat). Thermal instruments with finer temporal resolution on geostationary satellites have limited utility for cross-calibration due to their large view angles. Reanalysis atmospheric data are available on a global spatial grid at three-hour intervals, making them a potential alternative to existing satellite image data. This research explores using the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) reanalysis data product to predict top-of-atmosphere (TOA) thermal infrared radiance globally at time scales finer than available satellite data. The MERRA-2 data product provides global atmospheric data every three hours from 1980 to the present. Due to the high temporal resolution of the MERRA-2 data product, opportunities for novel research and applications are presented. While MERRA-2 has been used in renewable energy and hydrological studies, this work seeks to leverage the model to predict TOA thermal radiance. Two approaches have been followed, namely a physics-based approach and a supervised learning approach, using Terra/MODIS band 31 thermal infrared data as reference. The first physics-based model uses forward modeling to predict TOA thermal radiance. The second model infers the presence of clouds from the MERRA-2 atmospheric data before applying an atmospheric radiative transfer model. The last physics-based model parameterizes the previous model to minimize computation time. The second approach applied four different supervised learning algorithms to the atmospheric data: a linear least squares regression model, a non-linear support vector regression (SVR) model, a multi-layer perceptron (MLP), and a convolutional neural network (CNN). This research found that the multi-layer perceptron model produced the lowest error rates overall, with an RMSE of 1.22 W/(m^2 sr µm) when compared to actual Terra/MODIS band 31 image data. This research further aimed to characterize the errors associated with each method so that any potential user will have the best information available should they wish to apply these methods towards their own application.
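A minimal sketch of the supervised-learning branch of this approach, assuming a scikit-learn environment: a multi-layer perceptron regresses a band radiance on a few atmospheric predictors. The feature names, synthetic data and network size are placeholders, not the MERRA-2 inputs or the architecture actually used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.uniform(230, 320, n),   # surface skin temperature (K)
    rng.uniform(0, 6, n),       # total column water vapour (cm)
    rng.uniform(0, 1, n),       # cloud fraction
])
# synthetic "radiance" target standing in for MODIS band 31 matchups
y = 0.05 * X[:, 0] - 0.8 * X[:, 1] - 3.0 * X[:, 2] + rng.normal(0, 0.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"hold-out RMSE: {rmse:.2f} (same units as the target radiance)")
```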
Monitoring and Simulating Water, Carbon and Nitrogen Dynamics over Catchments in Eastern Asia
NASA Astrophysics Data System (ADS)
Wang, Q.; Xiao, Q.; Liu, C.; Watanabe, M.
2006-05-01
There is an urgent need to support decision-making in water environment management in Eastern Asia. For sound management and decision making toward sustainable water use, catchment ecosystem assessment, emphasizing biophysical and biogeochemical processes and human interactions, is a key task. For this task, an integrated ecosystem model has been developed to estimate the spatial and temporal distributions of the water, carbon and nutrient cycles at catchment scales. The model integrates a distributed hydrologic model (Nakayama and Watanabe, 2004) and an ecosystem model, BIOME-BGC (Running and Coughlan, 1988), which has been modified and validated for various ecosystems using the APEIS-FLUX datasets in China (Wang and Watanabe, 2005). The model has been applied to catchments in China, such as the Changjiang River and the Yellow River. MODIS satellite data products, such as leaf area index (LAI), vegetation index (VI) and land surface temperature (LST), were used as input parameters. Using the integrated model, future changes in the water, carbon and nitrogen cycles can be predicted based on scenarios such as a decrease in crop production due to water shortage, increases in temperature and CO2 concentration, and land use/cover changes. The model was validated against measured values of soil moisture and river flow discharge throughout the year, showing that it achieves fairly high accuracy. As an example, we applied the integrated model to simulate the daily water vapor, carbon and nitrogen fluxes over the Changjiang River Basin. The Changjiang River is ranked third in length and is the largest river in terms of water discharge on the Euro-Asian continent. The drainage basin of the Changjiang supplies 5-10% of the total world population with water resources and nutrition and irrigates 40% of China's national crop production. Moreover, the materials carried by the Changjiang River have a significant influence on the coastal environment. Simulation results showed that enhanced atmospheric CO2 concentrations and, especially, increased nitrogen application had a marked effect on the simulated water and carbon sequestration capacity and played a prominent role in increasing this capacity. Finally, the model has been applied to evaluate the impact of land cover change from 1980 to 2000 on water, carbon and nitrogen fluxes over larger river basins in China.
Prakash Vincent, Samuel Gnana
2014-01-01
Production of a fibrinolytic enzyme by a newly isolated Paenibacillus sp. IND8 was optimized using wheat bran in solid-state fermentation. A 2^5 full factorial design (first-order model) was applied to screen the key factors: moisture, pH, sucrose, yeast extract, and sodium dihydrogen phosphate. Statistical analysis of the results showed that moisture, sucrose, and sodium dihydrogen phosphate had the most significant effects on fibrinolytic enzyme production (P < 0.05). A central composite design (CCD) was used to determine the optimal concentrations of these three components, and the experimental results were fitted with a second-order polynomial model at the 95% level (P < 0.05). Overall, a 4.5-fold increase in fibrinolytic enzyme production was achieved in the optimized medium compared with the unoptimized medium. PMID:24523635
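The second-order (response-surface) fit used in such CCD optimizations can be reproduced with ordinary least squares; the coded design points below follow a standard three-factor central composite layout, but the response values are invented for illustration and are not the study's measurements.

```python
import numpy as np

# coded factor levels (full factorial corners, axial points, centre points)
X_coded = np.array([
    [-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
    [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
    [-1.68, 0, 0], [1.68, 0, 0], [0, -1.68, 0], [0, 1.68, 0],
    [0, 0, -1.68], [0, 0, 1.68], [0, 0, 0], [0, 0, 0],
])
# invented enzyme activities for the three coded factors:
# x1 = moisture, x2 = sucrose, x3 = sodium dihydrogen phosphate
y = np.array([52, 61, 58, 70, 55, 66, 62, 75, 48, 68, 50, 72, 54, 64, 78, 77.0])


def quadratic_design_matrix(X):
    """Intercept, linear, two-way interaction and squared terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])


beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X_coded), y, rcond=None)
print("fitted coefficients (intercept, linear, interactions, quadratic):")
print(np.round(beta, 2))
```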
NASA Technical Reports Server (NTRS)
Paudel, Krishna P.; Limaye, Ashutosh; Hatch, Upton; Cruise, James; Musleh, Fuad
2005-01-01
We developed a dynamic model to optimize irrigation application for three major crops (corn, cotton and peanuts) grown in the Southeast USA. The water supply amount is generated from an engineering model and is then combined with economic models to find the optimal amount of irrigation water to apply to each crop field during the six critical water-deficit weeks in summer. Results indicate that water is applied to the crop with the highest marginal value product of irrigation. A decision-making tool such as the one developed here would help farmers and policy makers find the most profitable solution when water shortage is a serious concern.
NASA Astrophysics Data System (ADS)
Oriani, F.; Stisen, S.
2016-12-01
Rainfall amount is one of the most sensitive inputs to distributed hydrological models. Its spatial representation is of primary importance for correctly studying the uncertainty of basin recharge and its propagation to the surface and underground circulation. We consider here the 10-km-grid rainfall product provided by the Danish Meteorological Institute as input to the National Water Resources Model of Denmark. Due to a drastic reduction in the rain gauge network in recent years (from approximately 500 stations in the period 1996-2006 to 250 in the period 2007-2014), the grid rainfall product, based on the interpolation of these data, is much less reliable. Consequently, the related hydrological model shows a significantly lower prediction power. To give a better estimation of spatial rainfall at the grid points far from ground measurements, we use the direct sampling technique (DS) [1], belonging to the family of multiple-point geostatistics. DS, already applied to rainfall and spatial variable estimation [2, 3], simulates a grid value by sampling a training data set where a similar data neighborhood occurs. In this way, complex statistical relations are preserved by generating spatial patterns similar to the ones found in the training data set. Using the reliable grid product from the period 1996-2006 as the training data set, we first test the technique by simulating part of this data set; we then apply the technique to the grid product of the period 2007-2014 and subsequently analyze the uncertainty propagation to the hydrological model. We show that DS can improve the reliability of the rainfall product by generating more realistic rainfall patterns, with a significant repercussion on the hydrological model. The reduction of rain gauge networks is a global phenomenon which has huge implications for hydrological model performance and the uncertainty assessment of water resources. Therefore, the presented methodology can potentially be used in many regions where historical records can act as training data. [1] G. Mariethoz et al. (2010), Water Resour. Res., 10.1029/2008WR007621. [2] F. Oriani et al. (2014), Hydrol. Earth Syst. Sc., 10.5194/hessd-11-3213-2014. [3] G. Mariethoz et al. (2012), Water Resour. Res., 10.1029/2012WR012115.
NASA Astrophysics Data System (ADS)
Chinnayakanahalli, K.; Adam, J. C.; Stockle, C.; Nelson, R.; Brady, M.; Rajagopalan, K.; Barber, M. E.; Dinesh, S.; Malek, K.; Yorgey, G.; Kruger, C.; Marsh, T.; Yoder, J.
2011-12-01
For better management and decision making in the face of climate change, earth system models must explicitly account for natural resource and agricultural management activities. Incorporating crop-system, water-management, and economic models into an earth system modeling framework can help answer questions related to the impacts of climate change on irrigation water and crop productivity, how agricultural producers can adapt to anticipated climate change, and how agricultural practices can mitigate climate change. Herein we describe the coupling of the Variable Infiltration Capacity (VIC) land surface model, which solves the water and energy balances of the hydrologic cycle at regional scales, with a crop-growth model, CropSyst. This new model, VIC-CropSyst, is the land surface model that will be used in a new regional-scale model development project focused on the Pacific Northwest, termed BioEarth. Here we describe the VIC-CropSyst coupling process and its application over the Columbia River basin (CRB) using agriculture-specific land cover information. The Washington State Department of Agriculture (WSDA) and U.S. Department of Agriculture (USDA) cropland data layers were used to identify agricultural land use patterns, in which both irrigated and dryland crops were simulated. The VIC-CropSyst model was applied over the CRB for the historical period of 1976-2006 to establish a baseline for surface water availability, irrigation demand, and crop production. The model was then applied under future (2030s) climate change scenarios derived from statistically downscaled global circulation model output under two emission scenarios (A1B and B1). Differences between simulated future and historical irrigation demand, irrigation water availability, and crop production were used in an economics model to identify the most economically viable future cropping pattern. The economics model was run under varying scenarios of regional growth, trade, water pricing, and water capacity, providing a spectrum of possible future cropping patterns. The resulting cropping patterns were then used in VIC-CropSyst to quantify the impacts of climate change, economic, and water management scenarios on crop production and water resources availability. This modeling framework provides opportunities to study the interactions between human activities and complex natural processes and is a valuable tool for inclusion in an earth system model with the goal of informing land use and water management.
Electrolytic hydrogen production: An analysis and review
NASA Technical Reports Server (NTRS)
Evangelista, J.; Phillips, B.; Gordon, L.
1975-01-01
The thermodynamics of water electrolysis cells is presented, followed by a review of current and future technology of commercial cells. The irreversibilities involved are analyzed and the resulting equations assembled into a computer simulation model of electrolysis cell efficiency. The model is tested by comparing predictions based on the model to actual commercial cell performance, and a parametric investigation of operating conditions is performed. Finally, the simulation model is applied to a study of electrolysis cell dynamics through consideration of an ideal pulsed electrolyzer.
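A minimal sketch of the kind of efficiency bookkeeping such a simulation model rests on, assuming the thermoneutral voltage of water electrolysis (about 1.48 V) as the reference; the cell voltage, current and Faradaic efficiency below are illustrative inputs, not values from the report.

```python
F = 96485.0             # Faraday constant, C per mole of electrons
V_THERMONEUTRAL = 1.48  # V, thermoneutral (higher-heating-value) voltage of water electrolysis


def electrolyser_performance(cell_voltage, current_a, faradaic_eff=1.0):
    """Return (voltage efficiency, kg H2 per hour, kWh per kg H2) for one cell."""
    voltage_eff = V_THERMONEUTRAL / cell_voltage
    mol_h2_per_s = faradaic_eff * current_a / (2 * F)    # two electrons per H2 molecule
    kg_per_hour = mol_h2_per_s * 2.016e-3 * 3600
    kwh_per_kg = (cell_voltage * current_a / 1000) / kg_per_hour
    return voltage_eff, kg_per_hour, kwh_per_kg


eff, rate, sec = electrolyser_performance(cell_voltage=1.9, current_a=1000)
print(f"voltage efficiency {eff:.1%}, {rate:.3f} kg H2/h, {sec:.1f} kWh/kg")
```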
A lower trophic ecosystem model including iron effects in the Okhotsk Sea
NASA Astrophysics Data System (ADS)
Okunishi, Takeshi; Kishi, Michio J.; Ono, Yukiko; Yamashita, Toshihiko
2007-09-01
We applied a three-dimensional ecosystem-physical coupled model that includes the effect of iron to the Okhotsk Sea. In order to clarify the sources of iron, four dissolved iron compartments, based on the sources of supply, were added to the model of Kawamiya et al. [1995, An ecological-physical coupled model applied to Station Papa. Journal of Oceanography, 51, 635-664] (KKYS) to create our ecosystem model (KKYS-Fe). We hypothesized that four processes supply iron to sea water: atmospheric loadings from Northeastern Asia, input from the Amur River, dissolution from sediments, and regeneration by zooplankton and bacteria. We simulated one year, from 1 January 2001 to 31 December 2001, using both KKYS-Fe and KKYS. KKYS could not reproduce the surface nitrate distribution after the spring bloom, whereas KKYS-Fe agreed well with observations in the northwestern Pacific because it includes iron limitation of phytoplankton growth. During the spring bloom, the main source of iron at the sea surface is the atmosphere. The contribution of riverine iron to the total iron utilized for primary production is small in the Okhotsk Sea. Atmospheric deposition, the iron flux from sediments and the regeneration of iron in the water column play important roles in maintaining high primary production in the Okhotsk Sea.
High-throughput screening of chemicals as functional ...
Identifying chemicals that provide a specific function within a product, yet have minimal impact on the human body or environment, is the goal of most formulation chemists and engineers practicing green chemistry. We present a methodology to identify potential chemical functional substitutes from large libraries of chemicals using machine learning based models. We collect and analyze publicly available information on the function of chemicals in consumer products or industrial processes to identify a suite of harmonized function categories suitable for modeling. We use structural and physicochemical descriptors for these chemicals to build 41 quantitative structure–use relationship (QSUR) models for harmonized function categories using random forest classification. We apply these models to screen a library of nearly 6400 chemicals with available structure information for potential functional substitutes. Using our Functional Use database (FUse), we could identify uses for 3121 chemicals; 4412 predicted functional uses had a probability of 80% or greater. We demonstrate the potential application of the models to high-throughput (HT) screening for “candidate alternatives” by merging the valid functional substitute classifications with hazard metrics developed from HT screening assays for bioactivity. A descriptor set could be obtained for 6356 Tox21 chemicals that have undergone a battery of HT in vitro bioactivity screening assays. By applying QSURs, we wer
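One QSUR-style classifier of the kind described above can be sketched with a random forest on physicochemical descriptors; the descriptors, labels and decision rule below are synthetic placeholders rather than the curated functional-use data behind the published 41 models.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 1200
X = np.column_stack([
    rng.normal(2.5, 1.5, n),   # logP-like descriptor
    rng.normal(300, 80, n),    # molecular weight
    rng.integers(0, 6, n),     # hydrogen-bond donor count
])
# synthetic decision rule standing in for a curated functional-use label
y = ((X[:, 0] > 3.0) & (X[:, 1] > 280)).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("5-fold ROC AUC:", scores.round(3))
```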
Stochastic Threshold Microdose Model for Cell Killing by Insoluble Metallic Nanomaterial Particles
Scott, Bobby R.
2010-01-01
This paper introduces a novel microdosimetric model for metallic nanomaterial-particles (MENAP)-induced cytotoxicity. The focus is on the engineered insoluble MENAP which represent a significant breakthrough in the design and development of new products for consumers, industry, and medicine. Increased production is rapidly occurring and may cause currently unrecognized health effects (e.g., nervous system dysfunction, heart disease, cancer); thus, dose-response models for MENAP-induced biological effects are needed to facilitate health risk assessment. The stochastic threshold microdose (STM) model presented introduces novel stochastic microdose metrics for use in constructing dose-response relationships for the frequency of specific cellular (e.g., cell killing, mutations, neoplastic transformation) or subcellular (e.g., mitochondria dysfunction) effects. A key metric is the exposure-time-dependent, specific burden (MENAP count) for a given critical target (e.g., mitochondria, nucleus). Exceeding a stochastic threshold specific burden triggers cell death. For critical targets in the cytoplasm, the autophagic mode of death is triggered. For the nuclear target, the apoptotic mode of death is triggered. Overall cell survival is evaluated for the indicated competing modes of death when both apply. The STM model can be applied to cytotoxicity data using Bayesian methods implemented via Markov chain Monte Carlo. PMID:21191483
Zhi, Zelun; Wang, Hui
2014-07-01
This paper demonstrates that biohydrogen production was enhanced by white-rot fungal pretreatment of wheat straw (WS) through simultaneous saccharification and fermentation (SSF). Wheat straw was pretreated with Phanerochaete chrysosporium at 30 °C under solid-state fermentation for 12 days, and about 28.5 ± 1.3 % of the lignin was removed. Microscopic structure observation, combined with thermal gravimetric and differential thermal gravimetric analysis, further showed that the lignocellulose structure was markedly disrupted after fungal pretreatment. Subsequently, the pretreated WS and crude cellulases prepared from Trichoderma atroviride were applied in SSF for hydrogen production using Clostridium perfringens. The maximum hydrogen yield was 78.5 ± 3.4 ml g(-1) of pretreated WS, about 1.8 times that of the unpretreated group. Furthermore, the modified Gompertz model was applied to describe the progress of cumulative H(2) production. This work developed a novel bio-approach to improve fermentative H(2) yield from lignocellulosic biomass.
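The modified Gompertz fit mentioned above is commonly done by nonlinear least squares; in the sketch below the cumulative-H2 data points are invented for illustration, and the fitted parameters are the hydrogen potential (Hmax), the maximum production rate (Rm) and the lag time.

```python
import numpy as np
from scipy.optimize import curve_fit


def gompertz(t, h_max, r_max, lag):
    """Modified Gompertz curve: cumulative H2 (ml/g) versus time (h)."""
    return h_max * np.exp(-np.exp(r_max * np.e / h_max * (lag - t) + 1))


t = np.array([0, 6, 12, 18, 24, 36, 48, 60, 72.0])    # hours (invented)
h = np.array([0, 2, 10, 28, 45, 65, 74, 77, 78.0])    # ml H2 per g substrate (invented)

popt, _ = curve_fit(gompertz, t, h, p0=[80.0, 3.0, 8.0], maxfev=10000)
h_max, r_max, lag = popt
print(f"Hmax = {h_max:.1f} ml/g, Rm = {r_max:.2f} ml/(g h), lag = {lag:.1f} h")
```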
Shin, Wonkyoung; Park, Minyong
2017-01-01
Background/Study Context: The increasing longevity and health of older users, as well as aging populations, have created the need to develop senior-oriented product interfaces. This study aims to find user interface (UI) priorities for older user groups based on their lifestyles and to develop quality-of-UI (QUI) models for large electronic home appliances and mobile products. A segmentation table designed to show how older users can be categorized was created through a review of the literature, and 252 subjects were surveyed with a questionnaire. Factor analysis was performed to extract six preliminary lifestyle factors, which were then used for a subsequent cluster analysis that resulted in four groups. Cross-analysis was carried out to investigate which characteristics were included in each group. Analysis of variance was then applied to investigate the differences in UI priorities among the user groups for various electronic devices. Finally, QUI models were developed and applied to those electronic devices. Differences in UI priorities were found among the four lifestyles ("money-oriented," "innovation-oriented," "stability- and simplicity-oriented," and "innovation- and intellectual-oriented"). Twelve QUI models were developed for the four lifestyle groups and their associated products. Three washers and three smartphones were used as examples for testing the QUI models. The UI differences among older user groups found with the segmentation in this study, which uses several key variables (i.e., demographic, socioeconomic, and physical-cognitive), are distinct from earlier studies based on a single variable. The differences in responses clearly indicate the benefits of integrating various factors of older users, rather than a single variable, in order to design and develop more innovative and better consumer products in the future. The results of this study showed that older users, who will potentially have high buying power in the future, are likely to have higher satisfaction when selecting products customized for their lifestyle. Designers could also use the results of the UI evaluation for older users based on their lifestyle before developing products, through QUI modeling. This approach would save time and costs.
Strategizing for Intense Competition.
ERIC Educational Resources Information Center
Hahn, William; Bourgeois, Ernest J., Jr.
1999-01-01
Examines the trend toward more aggressive student recruiting strategies by colleges and universities, applying a model that assesses five competitive forces (the cause and effect of competition, the expanding marketplace, substitute products, buyer power, and supplier power), and examines various strategies for dealing with these competitive forces, such as…
Neale, Patrick J; Thomas, Brian C
2017-01-01
Phytoplankton photosynthesis is often inhibited by ultraviolet (UV) and intense photosynthetically available radiation (PAR), but the effects on ocean productivity have received little consideration aside from polar areas subject to periodic enhanced UV-B due to depletion of stratospheric ozone. A more comprehensive assessment is important for understanding the contribution of phytoplankton production to the global carbon budget, present and future. Here, we consider responses in the temperate and tropical mid-ocean regions typically dominated by picophytoplankton, including the prokaryotic lineages Prochlorococcus and Synechococcus. Spectral models of photosynthetic response for each lineage were constructed using model strains cultured at different growth irradiances and temperatures. In the model, inhibition becomes more severe once exposure exceeds a threshold (Emax) related to repair capacity. Model parameters are presented for Prochlorococcus, adding to those previously presented for Synechococcus. The models were applied to estimate midday, water-column photosynthesis based on an atmospheric model of spectral radiation, satellite-derived spectral water transparency and temperature. Based on a global survey of inhibitory exposure severity, a full-latitude section of the mid-Pacific and the near-equatorial region of the east Pacific were identified as representative regions for prediction of responses over the entire water column. Comparing predictions integrated over the water column including versus excluding inhibition, production was 7-28% lower due to inhibition depending on strain and site conditions. Inhibition was consistently greater for Prochlorococcus compared to two strains of Synechococcus. Considering only the surface mixed layer, production was inhibited 7-73%. On average, including inhibition lowered estimates of midday productivity by around 20% for the modeled region of the Pacific, with UV accounting for two-thirds of the reduction. In contrast, most other productivity models either ignore inhibition or only include PAR inhibition. Incorporation of Emax model responses into an existing spectral model of depth-integrated, daily production will enable efficient global predictions of picophytoplankton productivity including inhibition. © 2016 John Wiley & Sons Ltd.
Production scheduling and rescheduling with genetic algorithms.
Bierwirth, C; Mattfeld, D C
1999-01-01
A general model for job shop scheduling is described which applies to static, dynamic and non-deterministic production environments. Next, a Genetic Algorithm is presented which solves the job shop scheduling problem. This algorithm is tested in a dynamic environment under different workload situations. Thereby, a highly efficient decoding procedure is proposed which strongly improves the quality of schedules. Finally, this technique is tested for scheduling and rescheduling in a non-deterministic environment. It is shown by experiment that conventional methods of production control are clearly outperformed at reasonable run-time costs.
Grand canonical validation of the bipartite international trade network.
Straka, Mika J; Caldarelli, Guido; Saracco, Fabio
2017-08-01
Devising strategies for economic development in a globally competitive landscape requires a solid and unbiased understanding of countries' technological advancements and similarities among export products. Both can be addressed through the bipartite representation of the International Trade Network. In this paper, we apply the recently proposed grand canonical projection algorithm to uncover country and product communities. Contrary to past endeavors, our methodology, based on information theory, creates monopartite projections in an unbiased and analytically tractable way. Single links between countries or products represent statistically significant signals, which are not accounted for by null models such as the bipartite configuration model. We find stable country communities reflecting the socioeconomic distinction in developed, newly industrialized, and developing countries. Furthermore, we observe product clusters based on the aforementioned country groups. Our analysis reveals the existence of a complicated structure in the bipartite International Trade Network: apart from the diversification of export baskets from the most basic to the most exclusive products, we observe a statistically significant signal of an export specialization mechanism towards more sophisticated products.
Use of GPM Data Products in SERVIR Hydrological Applications
NASA Astrophysics Data System (ADS)
Limaye, A. S.; Mithieu, F.; Gurung, D. R.; Blankenship, C. B.; Crosson, W. L.; Anderson, E. R.; Flores, A.; Delgado, F.; Stanton, K.; Irwin, D.
2015-12-01
Availability of reliable precipitation data is a major challenge for SERVIR, a joint USAID-NASA project aimed at improving the environmental decision-making capacity of developing countries. GPM data products are helping to meet that challenge by providing frequent, high-spatial-resolution precipitation products over regional scales. SERVIR is using the products in different ways. First, SERVIR is using them in hydrologic modeling over Eastern Africa and in the Hindu Kush Himalaya. SERVIR's distributed hydrologic modeling capability is helping the hydrological and meteorological departments in SERVIR regions, or Hubs, identify local watersheds deserving immediate attention, such as those subject to recurring floods. Additionally, SERVIR technical implementers in the Hubs are building the capacities of the departments and ministries in their member countries to effectively use the GPM products. SERVIR also provides easy access for efficient integration of GPM products in web map services. This presentation will highlight ongoing collaborations and results generated through the collaborative partnership among the water resources and hydrometeorology departments in Kenya, Uganda, Rwanda, Namibia, and Bhutan, the SERVIR Hubs, and SERVIR Applied Sciences Team projects.
NASA Astrophysics Data System (ADS)
Rodrigo-Clavero, Maria-Elena; Rodrigo-Ilarri, Javier
2017-04-01
One of the most serious environmental problems in modern societies is the management and disposal of municipal solid waste (MSW). Despite the efforts of administrations to promote recycling and reuse policies and energy recovery technologies, most MSW is still disposed of in sanitary landfills. During the operation and post-closure maintenance phases of any solid waste disposal site, two of the most relevant problems are the production of leachate and the generation of biogas. The leachate and biogas formation processes occur simultaneously over time and are coupled through the consumption and/or production of water. However, no mathematical models that allow the evaluation of the joint production of leachate and biogas during the operational and post-closure phases of an urban waste landfill are readily available. This paper introduces BIOLEACH, a new mathematical model, programmed on a monthly scale, that evaluates the joint production of leachate and biogas by applying water balance techniques and that treats the landfill as a bioreactor. The application of such a model to real landfills allows an environmentally sustainable management that minimizes environmental impacts while also being economically more profitable.
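As a hedged illustration of the monthly water-balance idea (not the BIOLEACH formulation itself), the sketch below treats leachate as the infiltrating water that exceeds the waste's remaining storage capacity; all site parameters are hypothetical.

```python
def monthly_leachate(rain_mm, evap_mm, area_m2, waste_t, moisture, field_capacity,
                     recirculated_m3=0.0):
    """Return (leachate volume in m3, updated moisture fraction) for one month."""
    # net infiltration through the cover plus any recirculated leachate
    infiltration = max(rain_mm - evap_mm, 0.0) / 1000.0 * area_m2 + recirculated_m3
    # water the waste can still absorb before reaching field capacity
    # (1 tonne of waste is taken to hold ~1 m3 of water per unit moisture fraction)
    storage_deficit = max(field_capacity - moisture, 0.0) * waste_t
    absorbed = min(infiltration, storage_deficit)
    leachate = infiltration - absorbed
    new_moisture = moisture + absorbed / waste_t
    return leachate, new_moisture


leach, moist = monthly_leachate(rain_mm=85, evap_mm=40, area_m2=20_000,
                                waste_t=150_000, moisture=0.375, field_capacity=0.38)
print(f"leachate ~ {leach:.0f} m3, waste moisture now {moist:.3f}")
```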
A User-Friendly Model for Spray Drying to Aid Pharmaceutical Product Development
Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L. J.; Frijlink, Henderik W.
2013-01-01
The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach. PMID:24040240
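A minimal sketch of the steady-state heat and mass balance that such a spreadsheet model typically rests on: the outlet temperature follows from the enthalpy the drying air gives up to evaporate the feed water. The property values, the assumed feed density of 1 g/ml and the lumped heat-loss term are illustrative assumptions, not the fitted Büchi B-290 parameters.

```python
CP_AIR = 1.006e3        # J/(kg K), dry air
LATENT_HEAT = 2.26e6    # J/kg, evaporation of water
CP_VAPOUR = 1.9e3       # J/(kg K), water vapour (rough)


def outlet_temperature(t_inlet_c, air_kg_h, feed_ml_min, solids_frac, heat_loss_w=0.0):
    """Predict the outlet air temperature (deg C) from a simple energy balance."""
    air_kg_s = air_kg_h / 3600
    water_kg_s = feed_ml_min / 60 / 1000 * (1 - solids_frac)   # assumes feed density ~1 g/ml
    q_in = air_kg_s * CP_AIR * t_inlet_c - heat_loss_w          # enthalpy flow above 0 deg C
    q_evaporation = water_kg_s * LATENT_HEAT
    return (q_in - q_evaporation) / (air_kg_s * CP_AIR + water_kg_s * CP_VAPOUR)


# hypothetical settings: 120 deg C inlet, 35 kg/h drying air, 5 ml/min feed, 5% solids,
# 300 W lumped heat loss to the surroundings
print(f"predicted outlet T ~ {outlet_temperature(120, 35, 5, 0.05, heat_loss_w=300):.0f} deg C")
```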
NASA Astrophysics Data System (ADS)
An, Xinliang; Wong, Willie Wai Yeung
2018-01-01
Many classical results in relativity theory concerning spherically symmetric space-times have easy generalizations to warped product space-times, with a two-dimensional Lorentzian base and arbitrary dimensional Riemannian fibers. We first give a systematic presentation of the main geometric constructions, with emphasis on the Kodama vector field and the Hawking energy; the construction is signature independent. This leads to proofs of general Birkhoff-type theorems for warped product manifolds; our theorems in particular apply to situations where the warped product manifold is not necessarily Einstein, and thus can be applied to solutions with matter content in general relativity. Next we specialize to the Lorentzian case and study the propagation of null expansions under the assumption of the dominant energy condition. We prove several non-existence results relating to the Yamabe class of the fibers, in the spirit of the black-hole topology theorem of Hawking–Galloway–Schoen. Finally we discuss the effect of the warped product ansatz on matter models. In particular we construct several cosmological solutions to the Einstein–Euler equations whose spatial geometry is generally not isotropic.
Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation
NASA Astrophysics Data System (ADS)
Zhao, T.; Cai, X.; Yang, D.
2010-12-01
Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, various indices reflect the magnitude of streamflow forecast uncertainty, but few models describe how that uncertainty evolves over time. This research introduces the Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecasts as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as high as that of ESF. Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF. [Figure: schematic diagram of the increase in forecast uncertainty with forecast lead time and the dynamic updating property of real-time streamflow forecasts.]
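To make the MMFE idea concrete, the sketch below simulates how a forecast for a single future period is revised as the lead time shrinks under the additive MMFE: each revision adds an independent zero-mean innovation, so the forecast is a martingale and converges to the realization. The innovation standard deviation and the example flow value are illustrative assumptions, not values from the study.

```python
# Minimal sketch of the additive Martingale Model of Forecast Evolution (MMFE).
# Forecasts issued at successively shorter lead times differ by independent,
# zero-mean innovations, so forecast uncertainty shrinks as the target period nears.

import numpy as np

rng = np.random.default_rng(0)

def mmfe_forecast_path(true_flow, lead_times=5, sigma=10.0):
    """Return the sequence of forecasts issued from `lead_times` steps ahead down to 0."""
    # Work backwards: the lead-time-0 "forecast" equals the realized flow; earlier
    # forecasts differ from it by the sum of the innovations still to come.
    innovations = rng.normal(0.0, sigma, size=lead_times)
    forecasts = [true_flow - innovations[k:].sum() for k in range(lead_times)]
    forecasts.append(true_flow)   # lead time 0: perfect information
    return forecasts

print([round(f, 1) for f in mmfe_forecast_path(true_flow=200.0)])
```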
Present and future hydropower scheduling in Statkraft
NASA Astrophysics Data System (ADS)
Bruland, O.
2012-12-01
Statkraft produces close to 40 TWh in an average year and is one of the largest hydropower producers in Europe. For hydropower producers, the scheduling of electricity generation is the key to success, and this depends on optimal use of the water resources. The hydrologist and his forecasts, both short and long term, are crucial to this success. The hydrological forecasts in Statkraft and most hydropower companies in Scandinavia are based on lumped models and the HBV concept. But upstream of the hydrological model there is a complex system for collecting, controlling and correcting the data applied in the models and the production scheduling and, equally important, routines for surveillance of the processes and for manual intervention. Prior to forecasting, the states in the hydrological models are updated based on observations. When snow is present in the catchments, snow surveys are an important source for model updating. The meteorological forecast is another premise provider to the hydrological forecast, and to obtain as precise a meteorological forecast as possible, Statkraft hires resources from the governmental forecasting center. Their task is to interpret the meteorological situation, describe the uncertainties and, if necessary, use their knowledge and experience to manually correct the forecast in the hydropower production regions. This is one of several forecasts applied further in the scheduling process. Both to be able to compare and evaluate different forecast providers and to ensure that we get the best available forecast, forecasts from different sources are applied. Some of these forecasts have undergone statistical corrections to reduce biases. The uncertainties related to the meteorological forecast have long been approached and described by ensemble forecasts. But the observations used for updating the model also carry uncertainty, both in the observations themselves and in how well they represent the catchment. Though well known, these uncertainties have thus far been handled superficially. Statkraft has initiated a program called ENKI to approach these issues. One part of this program is to apply distributed models for hydrological forecasting. Developing methodologies to handle uncertainties in the observations, the meteorological forecasts and the model itself, and to update the model with this information, are other parts of the program. Together with energy price expectations and information about the state of the energy production system, the hydrological forecast is input to the next step in the production scheduling, both short and long term. The long-term schedule for reservoir filling is a premise provider to the short-term optimization of water use. The long-term schedule is based on the actual reservoir levels, snow storages and a long history of meteorological observations, and gives an overall schedule at a regional level. Within the regions, a more detailed tool is used for short-term optimization of the hydropower production. Each reservoir is scheduled taking into account restrictions in the water courses and the cost of starting and stopping the generating units. The value of the water is calculated for each reservoir and reflects the risk of water spillage. This value, compared to the energy price, determines whether a unit will run or not. In a gradually more complex energy system with relatively lower regulated capacity, this is an increasingly challenging task.
Hu, E; Liao, T. W.; Tiersch, T. R.
2013-01-01
Emerging commercial-level technology for aquatic sperm cryopreservation has not been modeled by computer simulation. Commercially available software (ARENA, Rockwell Automation, Inc., Milwaukee, WI) was applied to simulate high-throughput sperm cryopreservation of blue catfish (Ictalurus furcatus) based on existing processing capabilities. The goal was to develop a simulation model suitable for production planning and decision making. The objectives were to: (1) predict the maximum output for an 8-hr workday; (2) analyze the bottlenecks within the process; and (3) estimate operational costs when run for daily maximum output. High-throughput cryopreservation was divided into six major steps modeled with time, resources and logic structures. The modeled production processed 18 fish and produced 1164 ± 33 (mean ± SD) 0.5-ml straws containing one billion cryopreserved sperm. Two such production lines could support all hybrid catfish production in the US, and 15 such lines could support the entire channel catfish industry if it were to adopt artificial spawning techniques. Evaluations were made to improve efficiency, such as increasing scale, optimizing resources, and eliminating underutilized equipment. This model can serve as a template for other aquatic species and assist decision making in industrial application of aquatic germplasm in aquaculture, stock enhancement, conservation, and biomedical model fishes. PMID:25580079
Designers Workbench: Towards Real-Time Immersive Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuester, F; Duchaineau, M A; Hamann, B
2001-10-03
This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology or "digital" gap experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.
Dynamic Model for the Stocks and Release Flows of Engineered Nanomaterials.
Song, Runsheng; Qin, Yuwei; Suh, Sangwon; Keller, Arturo A
2017-11-07
Most existing life-cycle release models for engineered nanomaterials (ENM) are static, ignoring the dynamics of stocks and flows of ENMs. Our model, nanoRelease, estimates the annual releases of ENMs from manufacturing, use, and disposal of a product, explicitly taking stock and flow dynamics into account. Given the variabilities in key parameters (e.g., service life of products and annual release rate during use), nanoRelease is designed as a stochastic model. We apply nanoRelease to three ENMs (TiO2, SiO2 and FeOx) used in paints and coatings through seven product applications, including construction and building, household and furniture, and automotive, for the period from 2000 to 2020, using production volume and market projection information. We also consider model uncertainties using Monte Carlo simulation. Compared with 2016, the total annual releases of ENMs in 2020 will increase by 34-40%, and the stock will increase by 28-34%. The fraction of the end-of-life release among total release flows will increase from 11% in 2002 to 43% in 2020. As compared to static models, our dynamic model predicts about an order of magnitude lower values for the amount of ENM released from this sector in the near term while stock continues to build up in the system.
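The sketch below illustrates the general stock-and-flow logic described above, not the published nanoRelease code: each year's production enters the in-use stock, a small fraction is released during use, and cohorts retired according to an assumed service life leave the stock as end-of-life releases, with a simple Monte Carlo loop over the assumed parameters. The production series, service-life statistics and use-phase release rate are illustrative assumptions.

```python
# Minimal sketch of a dynamic stock-and-flow release model with Monte Carlo sampling.

import numpy as np

rng = np.random.default_rng(1)
years = np.arange(2000, 2021)
production = np.linspace(100.0, 400.0, len(years))        # t/yr of ENM put on market (assumed)

def one_run(use_release_rate, life_mean, life_sd):
    stock = np.zeros(len(years))
    use_release = np.zeros(len(years))
    eol_release = np.zeros(len(years))
    for i, p in enumerate(production):
        life = max(1, int(round(rng.normal(life_mean, life_sd))))   # sampled service life (yr)
        retire = min(i + life, len(years) - 1)
        stock[i:retire] += p                              # cohort sits in the stock until retirement
        eol_release[retire] += p                          # end-of-life release at retirement
        use_release[i:retire] += p * use_release_rate     # small annual loss during use
    return stock, use_release, eol_release

runs = [one_run(0.02, 8, 2) for _ in range(500)]
stock_2020 = np.mean([r[0][-1] for r in runs])
print(f"mean in-use stock in 2020: {stock_2020:.0f} t")
```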
Steam jacket dynamics in underground coal gasification
NASA Astrophysics Data System (ADS)
Otto, Christopher; Kempka, Thomas
2017-04-01
Underground coal gasification (UCG) has the potential to increase the world-wide hydrocarbon reserves by utilizing deposits not economically mineable by conventional methods. In this context, UCG involves combusting coal in situ to produce a high-calorific synthesis gas, which can be applied for electricity generation or chemical feedstock production. Apart from its high economic potential, in-situ combustion may cause environmental impacts such as groundwater pollution by by-product leakage. In order to prevent or significantly mitigate these potential environmental concerns, UCG reactors are generally operated below hydrostatic pressure to limit the outflow of UCG process fluids into overburden aquifers. This pressure difference induces groundwater inflow into the reactor and prevents the escape of product gas. In the close vicinity of the reactor, fluid flow is determined by the evolving high reactor temperatures, resulting in the build-up of a steam jacket. Numerical modeling is one of the key tools for studying coupled processes in in-situ combustion. We employed the thermo-hydraulic numerical simulator MUFITS (BINMIXT module) to address the influence of reactor pressure dynamics as well as hydro-geological coal and caprock parameters on water inflow and steam jacket dynamics. The US field trials at Hanna and Hoe Creek (Wyoming) were used for 3D model validation in terms of water inflow matching, and the good agreement between our modeling results and the field data indicates that our model reflects the hydrothermal physics of the process. In summary, our validated model allows a fast prediction of the steam jacket dynamics as well as water in- and outflows, which is required to avoid aquifer contamination during the entire life cycle of in-situ combustion operations.
Comparative study: TQ and Lean Production ownership models in health services
Eiro, Natalia Yuri; Torres-Junior, Alvair Silveira
2015-01-01
Objective: to compare the application of Total Quality (TQ) models used in the processes of a health service with cases of lean healthcare, drawing also on literature from another institution that has applied the lean model. Method: this is a qualitative study conducted through a descriptive case study. Results: through critical analysis of the institutions studied, it was possible to compare the traditional quality approach observed in one case with the theoretical and practical lean production approach used in the other; the specifics are described in the article. Conclusion: the research identified that the lean model was better suited to people who work systemically and generate flow. It also pointed towards some potential challenges in the introduction and implementation of lean methods in health. PMID:26487134
Study of the Bellman equation in a production model with unstable demand
NASA Astrophysics Data System (ADS)
Obrosova, N. K.; Shananin, A. A.
2014-09-01
A production model with allowance for a working capital deficit and a restricted maximum possible sales volume is proposed and analyzed. The study is motivated by the need to analyze well-known problems in the functioning of low-competitive macroeconomic structures. The problem is originally formulated as an infinite-horizon optimal control problem and is then formalized as a Bellman equation. It is proved that the corresponding Bellman operator is a contraction and has a unique fixed point in the chosen class of functions. A closed-form solution of the Bellman equation is found using the method of steps. The influence of the credit interest rate on the assessment of the firm's market value is analyzed by applying the developed model.
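The closed-form solution is specific to the paper, but the contraction property it relies on can be illustrated numerically: because the discount factor is below one, repeatedly applying the Bellman operator (value iteration) converges to the unique fixed point. The sketch below does this for a toy firm whose output is limited both by working capital and by a sales cap; the price, cost, cap, discount factor and grid are illustrative assumptions, not the paper's model.

```python
# Minimal sketch of value iteration converging to the fixed point of a Bellman operator.

import numpy as np

beta, price, unit_cost, sales_cap = 0.95, 1.0, 0.6, 5.0
grid = np.linspace(0.0, 20.0, 101)                      # working-capital grid

def bellman_operator(v):
    """One application of the Bellman operator T to a value function defined on `grid`."""
    v_new = np.empty_like(v)
    for i, w in enumerate(grid):
        best = -np.inf
        # feasible output is limited by working capital and by the sales cap
        for q in np.linspace(0.0, min(w / unit_cost, sales_cap), 21):
            profit = (price - unit_cost) * q
            w_next = min(w + profit, grid[-1])          # retained profit replenishes working capital
            best = max(best, profit + beta * np.interp(w_next, grid, v))
        v_new[i] = best
    return v_new

v = np.zeros_like(grid)
for it in range(600):                                   # contraction => geometric convergence
    v_next = bellman_operator(v)
    if np.max(np.abs(v_next - v)) < 1e-6:
        break
    v = v_next
print(f"stopped after {it + 1} iterations; V(10) = {np.interp(10.0, grid, v):.2f}")
```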
Organizational Linkages: Understanding the Productivity Paradox,
1994-01-01
students were asked to make a decision regarding a production scheduling. Some used a Lotus spreadsheet’s what-if capacity, which enabled them to...the degree to which managers and MBA students believed that they make better decisions using what-if spreadsheet models, despite the fact that their...for this system is Naylor et al.’s (1980) view of behavior in organizations. When Pritchard and his students (Pritchard et al., 1988) applied this
Noh, Kyungrin; Yoo, Sunyong; Lee, Doheon
2018-06-13
Natural products have been widely investigated in the drug development field. Their traditional use as medicinal agents and their resemblance to our endogenous compounds point to their potential for new drug development. Many researchers have focused on identifying the therapeutic effects of natural products, yet the resemblance between natural products and human metabolites has rarely been exploited. We propose a novel method which predicts therapeutic effects of natural products based on their similarity with human metabolites. In this study, we compare the structure, target and phenotype similarities between natural products and human metabolites to capture molecular and phenotypic properties of both compound classes. With the generated similarity features, we train a support vector machine model to identify similar natural product and human metabolite pairs. The known functions of human metabolites are then mapped to the paired natural products to predict their therapeutic effects. With our three selected feature sets (structure, target and phenotype similarities), our trained model successfully paired similar natural products and human metabolites. When applied to natural product-derived drugs, it successfully identified their indications with high specificity and sensitivity. We further validated the identified therapeutic effects of natural products with literature evidence. These results suggest that our model can match natural products to similar human metabolites and provide possible therapeutic effects of natural products. By utilizing the information on similar human metabolites, we expect to find new indications of natural products that could not be covered by previous in silico methods.
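The classification step named above can be sketched as follows: pairs of (natural product, human metabolite) are represented by three similarity features, and an SVM separates "similar" from "dissimilar" pairs. The feature values and labels below are synthetic placeholders, not the authors' data, and the beta distributions are only an assumption used to generate them.

```python
# Minimal sketch of an SVM trained on structure/target/phenotype similarity features.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 400
# columns: structure, target, phenotype similarity in [0, 1]
similar = rng.beta(5, 2, size=(n, 3))        # similar pairs skew toward high similarity (assumed)
dissimilar = rng.beta(2, 5, size=(n, 3))     # dissimilar pairs skew toward low similarity (assumed)
X = np.vstack([similar, dissimilar])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
# A predicted "similar" pair would then inherit the metabolite's known functions
# as candidate therapeutic effects of the natural product.
```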
Evolutionary model of the growth and size of firms
NASA Astrophysics Data System (ADS)
Kaldasch, Joachim
2012-07-01
The key idea of this model is that firms are the result of an evolutionary process. Based on demand and supply considerations, the evolutionary model presented here explicitly derives Gibrat's law of proportionate effects as the result of the competition between products. Applying a preferential attachment mechanism for firms, the theory establishes the size distribution of products and firms. Also established are the growth rate and price distribution of consumer goods. Taking into account the characteristic property of human activities to occur in bursts, the model also explains the size-variance relationship of the growth rate distribution of products and firms. Further, the product life cycle, the learning (experience) curve and the market size, in terms of the mean number of firms that can survive in a market, are derived. The model also suggests the existence of an invariant of a market, namely the ratio of total profit to total revenue. The relationship between a neo-classical and an evolutionary view of a market is discussed. Comparison with empirical investigations suggests that the theory is able to describe the main stylized facts concerning the size and growth of firms.
Netcher, Andrea C; Duranceau, Steven J
2016-03-01
In surface water treatment, ultrafiltration (UF) membranes are widely used because of their ability to supply safe drinking water. Although UF membranes produce high-quality water, their efficiency is limited by fouling. Improving UF filtrate productivity is economically desirable and has been attempted by incorporating sustainable biofiltration processes as pretreatment to UF, with varying success. Models that describe the effect of biofiltration on membrane mass transfer are lacking. In this work, UF water productivity was empirically modeled as a function of biofilter feed water quality using either a quadratic or a Gaussian relationship. UF membrane mass transfer variability was found to be governed by the dimensionless mass ratio between alkalinity (ALK) and dissolved organic carbon (DOC). UF membrane productivity was optimized when the biofilter feed water ALK to DOC ratio fell between 10 and 14. Copyright © 2015 Elsevier Ltd. All rights reserved.
A prototype for automation of land-cover products from Landsat Surface Reflectance Data Records
NASA Astrophysics Data System (ADS)
Rover, J.; Goldhaber, M. B.; Steinwand, D.; Nelson, K.; Coan, M.; Wylie, B. K.; Dahal, D.; Wika, S.; Quenzer, R.
2014-12-01
Landsat data records of surface reflectance provide a three-decade history of land surface processes. Due to the vast number of these archived records, development of innovative approaches for automated data mining and information retrieval was necessary. Recently, we created a prototype utilizing open source software libraries for automatically generating annual Anderson Level 1 land cover maps and information products from data acquired by the Landsat Mission for the years 1984 to 2013. The automated prototype was applied to two target areas in northwestern and east-central North Dakota, USA. The approach required the National Land Cover Database (NLCD) and two user-input target acquisition year-days. The Landsat archive was mined for scenes acquired within a 100-day window surrounding these target dates, and cloud-free pixels were chosen closest to the specified target acquisition dates. The selected pixels were then composited before completing an unsupervised classification using the NLCD. Pixels unchanged in pairs of the NLCD were used for training decision tree models in an iterative process refined with model confidence measures. The decision tree models were applied to the Landsat composites to generate a yearly land cover map and related information products. Results for the target areas captured changes associated with the recent expansion of oil shale production and with agriculture driven by economics and policy, such as the increase in biofuel production and the reduction in the Conservation Reserve Program. Changes in agriculture, grasslands, and surface water reflect the local hydrological conditions that occurred during the 29-year span. Future enhancements considered for this prototype include a web-based client, ancillary spatial datasets, trends and clustering algorithms, and the forecasting of future land cover.
Wu, Jing; Hu, Yu-Ying; Wang, Shi-Feng; Cao, Zhi-Ping; Li, Huai-Zhi; Fu, Xin-Mei; Wang, Kai-Jun; Zuo, Jian-E
2017-04-01
Anaerobic digestion (AD), a process for generating biogas, can be applied to the treatment of organic wastes. Owing to its smaller footprint, lower energy consumption, and smaller amount of digestate, high solid anaerobic digestion (HSAD) has attracted increasing attention. However, its biogas production is poor. In order to improve biogas production and decrease energy consumption, an improved thermal treatment process was proposed. Raw swine manure (>20% solid content) without any dilution was thermally treated at 70±1°C for different retention times, and its effect on HSAD was then investigated via batch AD experiments at 8.9% solid content. Results showed that the main organic components of swine manure hydrolyzed significantly during the thermal treatment, and the methane production rate of HSAD was improved by up to 39.5%. Analysis using two kinetic models confirmed that the treatment increased the biodegradable organics (especially the readily biodegradable organics) in swine manure rather than upgrading its hydrolysis rate. It is worth noting that the superimposed first-order kinetics model was applied to AD for the first time and proved a good tool for revealing the AD kinetics of substrates with complex components. Copyright © 2017 Elsevier Ltd. All rights reserved.
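The "superimposed first-order kinetics" idea mentioned above can be sketched as the sum of a readily and a slowly biodegradable pool, each degrading with its own first-order rate constant, fitted to cumulative methane data. The data points and initial guesses below are synthetic placeholders, not the swine manure measurements.

```python
# Minimal sketch: fit a two-pool (superimposed) first-order model to batch methane data.

import numpy as np
from scipy.optimize import curve_fit

def two_pool_model(t, B1, k1, B2, k2):
    """Cumulative methane (mL/g VS) from two superimposed first-order pools."""
    return B1 * (1.0 - np.exp(-k1 * t)) + B2 * (1.0 - np.exp(-k2 * t))

t_days = np.array([0, 2, 4, 7, 10, 15, 20, 30, 40])
ch4 = np.array([0, 60, 105, 150, 180, 215, 240, 270, 285])   # synthetic batch test data

popt, _ = curve_fit(two_pool_model, t_days, ch4,
                    p0=[150, 0.3, 150, 0.03], bounds=(0, np.inf))
B1, k1, B2, k2 = popt
print(f"readily degradable pool: {B1:.0f} mL/g VS (k = {k1:.2f} 1/d)")
print(f"slowly degradable pool:  {B2:.0f} mL/g VS (k = {k2:.3f} 1/d)")
```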
NASA Astrophysics Data System (ADS)
Chelibanov, V. P.; Ishanin, G. G.; Isaev, L. N.
2014-05-01
The role of nitrogen oxide in ambient air is described and analyzed. A new method for measuring nitrogen oxide concentration in the gas phase is proposed, based on measuring the ozone concentration during titration with nitrogen oxide. The chemiluminescent sensor composition was investigated on an experimental stand. A sensor based on a solid-state, non-activated chemiluminescent composition is used as the ozone sensor. The composition is deposited on the surface of a polymer matrix with a developed surface area. The sensor composition includes gallic acid with the addition of rhodamine 6G. A model of the interaction between the sensor composition and ozone has been developed, and the main products formed during the reaction have been identified. The product that determines the rate of luminescence onset was found; it belongs to the quinone class. A new chemiluminescent composition was then proposed, which has no activation period and operates with high stability. An experimental model of the gas analyzer was constructed and its operating algorithm was developed. It was demonstrated that the developed NO measuring instrument can be applied to ambient air monitoring. This work was partially financially supported by the Government of the Russian Federation, Grant 074-U01.
Acquisition Community Team Dynamics: The Tuckman Model vs. the DAU Model
2007-04-30
courses . These student teams are used to enable the generation of more complex products and to prepare the students for the ...requirement for stage discreteness was met, I developed a stage-separation test that, when applied to the data representing the experience of a... test the reliability, and validate an improved questionnaire instrument that: – Redefines “Storming” with new storming questions Less focused
Mixing of gaseous reactants in chemical generation of atomic iodine for COIL: two-dimensional study
NASA Astrophysics Data System (ADS)
Jirasek, Vit; Spalek, Otomar; Kodymova, Jarmila; Censky, Miroslav
2003-11-01
A two-dimensional CFD model was applied to study the mixing and reaction between gaseous chlorine dioxide and nitrogen monoxide diluted with nitrogen during the generation of atomic iodine. The influence of molecular diffusion on the production of atomic chlorine, a precursor of atomic iodine, was studied in particular. The results were compared with one-dimensional modeling of the system.
John Tipton; Gretchen Moisen; Paul Patterson; Thomas A. Jackson; John Coulston
2012-01-01
There are many factors that will determine the final cost of modeling and mapping tree canopy cover nationwide. For example, applying a normalization process to Landsat data used in the models is important in standardizing reflectance values among scenes and eliminating visual seams in the final map product. However, normalization at the national scale is expensive and...
Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Feng, E-mail: fwang@unu.edu; Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft; Huisman, Jaco
2013-11-15
Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to the lack of high quality data related to market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results of this study also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions. They show the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies.
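The sales/stock/lifespan logic that underlies such an Input–Output Analysis can be sketched as follows: units put on the market in a given year become waste in later years according to a lifespan distribution, and the in-use stock is whatever has been sold but not yet discarded. The sales series and Weibull lifespan parameters below are illustrative assumptions, not the Dutch case-study data.

```python
# Minimal sketch of sales -> stock -> waste accounting with a Weibull lifespan distribution.

import numpy as np
from scipy.stats import weibull_min

years = np.arange(2000, 2021)
sales = np.linspace(1.0, 2.5, len(years)) * 1e6       # units sold per year (assumed)

def waste_generated(sales, years, shape=2.0, scale=8.0):
    """Distribute each sales cohort over future years according to a Weibull lifespan."""
    waste = np.zeros(len(years))
    for i, s in enumerate(sales):
        ages = np.arange(1, len(years) - i + 1)
        # probability a unit sold in year i is discarded at each age
        p = weibull_min.cdf(ages, shape, scale=scale) - weibull_min.cdf(ages - 1, shape, scale=scale)
        waste[i + 1:] += s * p[:len(years) - i - 1]
    return waste

w = waste_generated(sales, years)
stock = np.cumsum(sales) - np.cumsum(w)               # units still in use
print(f"e-waste generated in 2020: {w[-1]:.0f} units; in-use stock: {stock[-1]:.0f} units")
```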
An application of the transtheoretical model to becoming vegan.
Mendes, Elisabeth
2013-01-01
This article applies the transtheoretical model (TM) to veganism. The TM is a model of behavioral change that incorporates different stages to describe how an individual moves from an unhealthy behavior to a healthy one. The TM construes change as a five-stage process. The five stages of change are (a) precontemplation, (b) contemplation, (c) preparation, (d) action, and (e) maintenance. In this analysis, the model is applied to a person's determination to become vegan. A person becomes a vegan by eliminating all animal products from his or her diet; he or she does this by progressing through the stages, as prescribed by the model. The different changes people make to their lives are described in detail. It is also possible to measure the success of a person's progression based on the positive health changes that he or she experiences.
Ruiz Estrada, Mario Arturo; Yap, Su Fei; Park, Donghyun
2014-07-01
Natural hazards have a potentially large impact on economic growth, but measuring their economic impact is subject to a great deal of uncertainty. The central objective of this paper is to demonstrate a model, the natural disasters vulnerability evaluation (NDVE) model, that can be used to evaluate the impact of natural hazards on gross national product growth. The model is based on five basic indicators: natural hazards growth rates (αi), the national natural hazards vulnerability rate (ΩT), the natural disaster devastation magnitude rate (Π), the economic desgrowth rate (δ), i.e. the shrinkage of the economy, and the NHV surface. In addition, we apply the NDVE model to the north-east Japan earthquake and tsunami of March 2011 to evaluate its impact on the Japanese economy. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.
Mangaraj, S; K Goswami, T; Mahajan, P V
2015-07-01
Modified atmosphere packaging (MAP) is a dynamic system in which respiration of the packaged product and gas permeation through the packaging film take place simultaneously. The desired levels of O2 and CO2 in a package are achieved by matching the film permeation rates for O2 and CO2 with the respiration rate of the packaged product. A mathematical model for MAP of fresh fruits was developed by applying an enzyme kinetics-based respiration equation coupled with an Arrhenius-type model. The model was solved numerically using a MATLAB program. It was used to determine the time to reach the equilibrium concentration inside the MA package and the levels of O2 and CO2 concentration at the equilibrium state. The developed model for prediction of equilibrium O2 and CO2 concentrations was validated using experimental data for MA packaging of apple, guava and litchi.
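The governing balance the abstract describes can be sketched as a pair of ordinary differential equations: headspace O2 and CO2 change through film permeation and product respiration, with respiration described by Michaelis-Menten (enzyme) kinetics in O2. The parameter values (permeances, Vm, Km, RQ, package geometry) are illustrative assumptions, not the fitted values for apple, guava or litchi, and the temperature dependence (Arrhenius term) is omitted for brevity.

```python
# Minimal sketch of headspace O2/CO2 dynamics in a modified atmosphere package.

import numpy as np
from scipy.integrate import solve_ivp

P_O2, P_CO2 = 3.3, 13.0        # film permeances, mL/(m2 h %) (assumed)
A, V, W = 0.05, 1.0, 0.5       # film area (m2), headspace volume (L), produce mass (kg)
Vm, Km, RQ = 10.0, 3.0, 0.9    # respiration: mL O2/(kg h), Km in %, respiratory quotient

def package(t, y):
    o2, co2 = y                                        # gas concentrations in % (v/v)
    r_o2 = Vm * o2 / (Km + o2)                         # Michaelis-Menten O2 uptake
    d_o2 = (P_O2 * A * (20.9 - o2) - r_o2 * W) / (10.0 * V)
    d_co2 = (P_CO2 * A * (0.03 - co2) + RQ * r_o2 * W) / (10.0 * V)
    return [d_o2, d_co2]

sol = solve_ivp(package, (0.0, 200.0), [20.9, 0.03])
print(f"headspace after 200 h (near equilibrium): O2 = {sol.y[0, -1]:.1f}%, CO2 = {sol.y[1, -1]:.1f}%")
```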
Ferreira, Iuri E P; Zocchi, Silvio S; Baron, Daniel
2017-11-01
Reliable fertilizer recommendations depend on the correctness of the crop production models fitted to the data, but crop models are generally built empirically, neglecting important physiological aspects related to the response to fertilizers, or they are based on laws of plant mineral nutrition that many authors see as conflicting theories: Liebig's Law of the Minimum and Mitscherlich's Law of Diminishing Returns. We developed a new approach to modelling the crop response to fertilizers that reconciles these laws. In this study, Liebig's Law is applied at the cellular level to explain plant production and, as a result, crop models compatible with the Law of Diminishing Returns are derived. Some classical crop models appear here as special cases of our methodology, and a new interpretation of Mitscherlich's Law is also provided. Copyright © 2017 Elsevier Inc. All rights reserved.
Application of overlay modeling and control with Zernike polynomials in an HVM environment
NASA Astrophysics Data System (ADS)
Ju, JaeWuk; Kim, MinGyu; Lee, JuHan; Nabeth, Jeremy; Jeon, Sanghuck; Heo, Hoyoung; Robinson, John C.; Pierson, Bill
2016-03-01
Shrinking technology nodes and smaller process margins require improved photolithography overlay control. Generally, overlay measurement results are modeled with Cartesian polynomial functions for both intra-field and inter-field models, and the model coefficients are sent to an advanced process control (APC) system operating in an XY Cartesian basis. Dampened overlay corrections, typically via an exponentially or linearly weighted moving average in time, are then retrieved from the APC system and applied on the scanner in XY Cartesian form for subsequent lot exposure. The goal of the above method is to process lots with corrections that target the least possible overlay misregistration in steady state as well as in change-point situations. In this study, we model overlay errors on product using Zernike polynomials with the same fitting capability as the process of reference (POR) to represent the wafer-level terms, and use the standard Cartesian polynomials to represent the field-level terms. APC calculations for wafer-level correction are performed in the Zernike basis, while field-level calculations use the standard XY Cartesian basis. Finally, weighted wafer-level correction terms are converted to XY Cartesian space in order to be applied on the scanner, along with field-level corrections, for future wafer exposures. Since Zernike polynomials are orthogonal on the unit disk, we are able to reduce the amount of collinearity between terms and improve overlay stability. Our real-time Zernike modeling and feedback evaluation was performed on a 20-lot dataset in a high volume manufacturing (HVM) environment. The measured on-product results were compared to the POR and showed a 7% reduction in overlay variation, including a 22% reduction in terms variation. This led to an on-product raw overlay Mean + 3Sigma X&Y improvement of 5% and resulted in a 0.1% yield improvement.
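The wafer-level fitting step named above amounts to a least-squares fit of overlay errors in a Zernike basis. The sketch below does this with a few low-order (unnormalized) Zernike terms on synthetic overlay "measurements"; the chosen terms, coefficients and noise level are illustrative assumptions, not the APC implementation described in the study.

```python
# Minimal sketch: least-squares fit of wafer-level overlay-x errors to low-order Zernike terms.

import numpy as np

def zernike_terms(x, y):
    """A few low-order Zernike polynomials evaluated on the unit disk (x, y normalized)."""
    r2 = x**2 + y**2
    return np.column_stack([
        np.ones_like(x),        # piston
        x,                      # tilt x
        y,                      # tilt y
        2.0 * x * y,            # oblique astigmatism
        x**2 - y**2,            # vertical astigmatism
        2.0 * r2 - 1.0,         # defocus-like radial term
    ])

rng = np.random.default_rng(3)
theta = rng.uniform(0, 2 * np.pi, 300)
r = np.sqrt(rng.uniform(0, 1, 300))                    # uniform sampling over the wafer (unit disk)
x, y = r * np.cos(theta), r * np.sin(theta)

true_coef = np.array([1.0, 3.0, -2.0, 0.5, 1.5, -0.8])              # nm, assumed
dx = zernike_terms(x, y) @ true_coef + rng.normal(0, 0.3, x.size)   # noisy overlay-x errors

coef, *_ = np.linalg.lstsq(zernike_terms(x, y), dx, rcond=None)
print(np.round(coef, 2))   # recovered wafer-level correction terms
```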
A mathematical approach to HIV infection dynamics
NASA Astrophysics Data System (ADS)
Ida, A.; Oharu, S.; Oharu, Y.
2007-07-01
In order to obtain a comprehensive form of mathematical models describing nonlinear phenomena such as the HIV infection process and AIDS disease progression, it is efficient to introduce a general class of time-dependent evolution equations in such a way that the associated nonlinear operator is decomposed into the sum of a differential operator and a perturbation which is nonlinear in general and satisfies no global continuity condition. An attempt is then made to combine the implicit approach (usually adopted for convective diffusion operators) and the explicit approach (more suited to continuous-type operators representing various physiological interactions), resulting in a semi-implicit product formula. Decomposing the operators in this way and considering their individual properties, it is seen that approximation-solvability of the original model is verified under suitable conditions. Once appropriate terms are formulated to describe treatment by antiretroviral therapy, the reaction terms become time-dependent, and such a product formula is useful for generating approximate numerical solutions to the governing equations. With this knowledge, a continuous model for HIV disease progression is formulated and physiological interpretations are provided. The abstract theory is then applied to show existence of unique solutions to the continuous model describing the behavior of the HIV virus in the human body and its reaction to treatment by antiretroviral therapy. The product formula suggests appropriate discrete models describing the dynamics of host-pathogen interactions with HIV-1 and is applied to perform numerical simulations based on the model of the HIV infection process and disease progression. Finally, the results of our numerical simulations are visualized, and they are observed to agree with medical and physiological aspects.
[Logistic and production process in a regional blood center: modeling and analysis].
Baesler, Felipe; Martínez, Cristina; Yaksic, Eduardo; Herrera, Claudia
2011-09-01
The blood supply chain is a complex system comprising different interconnected elements that have to be synchronized correctly to satisfy, in quality and quantity, the final patient requirements. The aims were to determine the blood center's maximum production capacity and the changes necessary for a future expansion of production capacity. This work was developed at the Blood Center of Concepción, Chile; operations management tools were applied to model the center and to propose improvement alternatives for the production process. The use of simulation is highlighted, which permitted the replication of the center's behavior and the evaluation of expansion alternatives. It is possible to absorb a 100% increase in blood demand without making major changes or investments in the production process. It was also possible to determine the subsequent steps, in terms of investments in equipment and human resources, for a future expansion of the center's coverage. The techniques used to model the production process of the blood center of Concepción, Chile, allowed us to analyze how it operates, to detect bottlenecks, and to support the decision-making process for a future expansion of its capacity.
NASA Astrophysics Data System (ADS)
Dimitrov, Dimitre D.; Grant, Robert F.; Lafleur, Peter M.; Humphreys, Elyn R.
2011-12-01
The ecosys model was applied to investigate the effects of water table and subsurface hydrology changes on carbon dioxide exchange at the ombrotrophic Mer Bleue peatland, Ontario, Canada. It was hypothesized that (1) water table drawdown would not affect vascular canopy water potential, hence vascular productivity, because roots would penetrate deeper to compensate for near-surface dryness, (2) moss canopy water potential and productivity would be severely reduced because rhizoids occupy the uppermost peat that is subject to desiccation with water table decline, and (3) given that in a previous study of Mer Bleue, ecosystem respiration showed little sensitivity to water table drawdown, gross primary productivity would mainly determine the net ecosystem productivity through these vegetation-subsurface hydrology linkages. Model output was compared with literature reports and hourly eddy-covariance measurements during 2000-2004. Our findings suggest that late-summer water table drawdown in 2001 had only a minor impact on vascular canopy water potential but greatly impacted hummock moss water potential, where midday values declined to -250 MPa on average in the model. As a result, simulated moss productivity was reduced by half, which largely explained a reduction of 2-3 μmol CO2 m-2 s-1 in midday simulated and measurement-derived gross primary productivity and an equivalent reduction in simulated and measured net ecosystem productivity. The water content of the near-surface peat (top 5-10 cm) was found to be the most important driver of interannual variability of annual net ecosystem productivity through its effects on hummock moss productivity and on ecosystem respiration.
SMERGE: A multi-decadal root-zone soil moisture product for CONUS
NASA Astrophysics Data System (ADS)
Crow, W. T.; Dong, J.; Tobin, K. J.; Torres, R.
2017-12-01
Multi-decadal root-zone soil moisture products are of value for a range of water resource and climate applications. The NASA-funded root-zone soil moisture merging project (SMERGE) seeks to develop such products through the optimal merging of land surface model predictions with surface soil moisture retrievals acquired from multi-sensor remote sensing products. This presentation will describe the creation and validation of a daily, multi-decadal (1979-2015), vertically integrated (both surface to 40 cm and surface to 100 cm), 0.125-degree root-zone product over the contiguous United States (CONUS). The modeling backbone of the system is based on hourly root-zone soil moisture simulations generated by the Noah model (v3.2) operating within the North American Land Data Assimilation System (NLDAS-2). Remotely sensed surface soil moisture retrievals are taken from the multi-sensor European Space Agency Climate Change Initiative soil moisture data set (ESA CCI SM). In particular, the talk will detail: (1) the exponential smoothing approach used to convert surface ESA CCI SM retrievals into root-zone soil moisture estimates, (2) the averaging technique applied to merge (temporally sporadic) remotely sensed estimates with (continuous) NLDAS-2 land surface model estimates of root-zone soil moisture into the unified SMERGE product, and (3) the validation of the SMERGE product using long-term, ground-based soil moisture datasets available within CONUS.
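The exponential smoothing step named in point (1) is commonly implemented as a recursive exponential filter that propagates sporadic surface retrievals to a root-zone proxy (soil water index). The sketch below shows that general approach; the characteristic time T_char and the synthetic input series are illustrative assumptions, not SMERGE's operational configuration.

```python
# Minimal sketch of the recursive exponential filter: surface series -> root-zone proxy.

import numpy as np

def exp_filter(surface_sm, times, T_char=15.0):
    """Recursive exponential filter producing a soil water index from sporadic surface retrievals."""
    swi = np.full_like(surface_sm, np.nan, dtype=float)
    gain, prev_t, prev_swi = 1.0, None, None
    for i, (t, sm) in enumerate(zip(times, surface_sm)):
        if np.isnan(sm):
            continue                                     # skip days with no retrieval
        if prev_swi is None:
            swi[i] = sm                                  # initialize with the first retrieval
        else:
            gain = gain / (gain + np.exp(-(t - prev_t) / T_char))
            swi[i] = prev_swi + gain * (sm - prev_swi)
        prev_t, prev_swi = t, swi[i]
    return swi

days = np.arange(60, dtype=float)
surface = 0.25 + 0.05 * np.sin(days / 7.0) + np.random.default_rng(4).normal(0, 0.02, 60)
surface[::3] = np.nan                                   # temporally sporadic retrievals
print(np.round(exp_filter(surface, days)[-5:], 3))
```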
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cirac, J. Ignacio; Sierra, German; Instituto de Fisica Teorica, UAM-CSIC, Madrid
We generalize the matrix product states method using the chiral vertex operators of conformal field theory and apply it to study the ground states of the XXZ spin chain, the J1-J2 model and random Heisenberg models. We compute the overlap with the exact wave functions, spin-spin correlators, and the Renyi entropy, showing that critical systems can be described by this method. For rotationally invariant ansätze we construct an inhomogeneous extension of the Haldane-Shastry model with long-range exchange interactions.
ERIC Educational Resources Information Center
Zink, J.
1976-01-01
The design and implementation of a cablevision delivered course on Grand Opera is described. The unique aspect of this project is that both the viewers and the production teams experienced active learning situations. (Author)
Some Thoughts on Treasure-Keeping.
ERIC Educational Resources Information Center
O'Brien, Thomas C.
1989-01-01
Instead of studying children's knowing, American educators have applied policies and procedures from factories and assembly lines of the early 1900s. Three factory-oriented themes are paramount: mass production, cost effectiveness, and efficiency. This article suggests a Piagetian alternative to the present mechanistic model. Includes seven…
Modeling Pacific Northwest carbon and water cycling using CARAIB Dynamic Vegetation Model
NASA Astrophysics Data System (ADS)
Dury, M.; Kim, J. B.; Still, C. J.; Francois, L. M.; Jiang, Y.
2015-12-01
While uncertainties remain regarding projected temperature and precipitation changes, climate warming is already affecting ecosystems in the Pacific Northwest (PNW). Decreases in ecosystem productivity as well as increases in the mortality of some plant species, induced by drought and disturbance, have been reported. Here, we applied the process-based dynamic vegetation model CARAIB to the PNW to simulate the response of water and carbon cycling to current and future climate change projections. The vegetation model has already been successfully applied to Europe to simulate the plant physiological response to climate change. We calibrated CARAIB for the PNW using global Plant Functional Types. For calibration, the model is driven with the gridded surface meteorological dataset UIdaho MACA METDATA at 1/24-degree (~4-km) resolution and a daily time step for the period 1979-2014. The model's ability to reproduce the current spatial and temporal variations of carbon stocks and fluxes was evaluated using a variety of available datasets, including eddy covariance and satellite observations. We focused particularly on past severe drought and fire episodes. We then simulated future conditions using the UIdaho MACAv2-METDATA dataset, which includes downscaled CMIP5 projections from 28 GCMs for RCP4.5 and RCP8.5. We evaluated the future ecosystem carbon balance resulting from changes in drought frequency as well as in fire risk. We also simulated future productivity and drought-induced mortality of several key PNW tree species.
Oppenheim, Gary M; Dell, Gary S; Schwartz, Myrna F
2010-02-01
Naming a picture of a dog primes the subsequent naming of a picture of a dog (repetition priming) and interferes with the subsequent naming of a picture of a cat (semantic interference). Behavioral studies suggest that these effects derive from persistent changes in the way that words are activated and selected for production, and some have claimed that the findings are only understandable by positing a competitive mechanism for lexical selection. We present a simple model of lexical retrieval in speech production that applies error-driven learning to its lexical activation network. This model naturally produces repetition priming and semantic interference effects. It predicts the major findings from several published experiments, demonstrating that these effects may arise from incremental learning. Furthermore, analysis of the model suggests that competition during lexical selection is not necessary for semantic interference if the learning process is itself competitive. Copyright 2009 Elsevier B.V. All rights reserved.
Modelling and analysis of solar cell efficiency distributions
NASA Astrophysics Data System (ADS)
Wasmer, Sven; Greulich, Johannes
2017-08-01
We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by requiring only two common measurements on finished cells. The presented approaches can be especially helpful when ramping up production, but can also be applied to enhance established manufacturing.
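The Monte Carlo step described above can be sketched as follows: a cheap metamodel (here a simple response surface standing in for device simulations) maps process parameters to cell efficiency, and sampling the parameter scatter yields the efficiency distribution plus a crude variance-based sensitivity measure. The metamodel form, its coefficients and the parameter spreads are illustrative assumptions, not the authors' fitted metamodel.

```python
# Minimal sketch of Monte Carlo propagation of process scatter through a metamodel.

import numpy as np

rng = np.random.default_rng(5)

def metamodel(emitter_rs, bulk_lifetime, finger_width):
    """Assumed response surface: efficiency (%) as a function of three process parameters."""
    return (18.5
            - 0.010 * (emitter_rs - 90.0)              # emitter sheet resistance deviation (ohm/sq)
            + 0.40 * np.log(bulk_lifetime / 100.0)     # bulk lifetime (us)
            - 0.015 * (finger_width - 40.0))           # finger width (um)

n = 100_000
eff = metamodel(rng.normal(90.0, 8.0, n),
                rng.lognormal(np.log(100.0), 0.25, n),
                rng.normal(40.0, 3.0, n))
print(f"mean = {eff.mean():.2f} %, std = {eff.std():.2f} %")

# Crude sensitivity check: freeze one parameter at its mean and see how much the output
# variance drops, hinting at which process step to control most tightly.
eff_fixed_rs = metamodel(90.0, rng.lognormal(np.log(100.0), 0.25, n), rng.normal(40.0, 3.0, n))
print(f"variance share of emitter Rs: {1 - eff_fixed_rs.var() / eff.var():.2f}")
```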
Plumes and Blooms: Observations, Analysis and Modeling for SIMBIOS
NASA Technical Reports Server (NTRS)
Maritorena, S.; Siegel, D. A.; Nelson, N. B.
2004-01-01
The goal of the Plumes and Blooms (PnB) project is to develop, validate and apply to imagery state-of-the-art ocean color algorithms for quantifying sediment plumes and phytoplankton blooms in the Case II environment of the Santa Barbara Channel. We conduct monthly to twice-monthly transect observations across the Santa Barbara Channel to build an algorithm development and product validation data set. A primary goal is to use the PnB field data set to objectively tune semi-analytical models of ocean color for this site and apply them using available satellite imagery (SeaWiFS and MODIS). However, the comparison between PnB field observations and satellite estimates of primary products has been disappointing. We find that field estimates of water-leaving radiance correspond poorly to satellite estimates for both SeaWiFS and MODIS local area coverage imagery. We believe this is due to poor atmospheric correction caused by the complex mixtures of aerosol types found in these near-coastal regions.
Basic characteristics and realization of production system control
NASA Astrophysics Data System (ADS)
Cheng, Shaopeng; Shell, Richard; Hall, Ernest L.
1992-11-01
This paper analyzes the issues involved in developing an intelligent production control system. It describes the basic characteristics of a production control system and an effective design methodology to realize the production control functions. Petri net, subsystem and hierarchical control concepts are applied to a computer integrated material handling system (MHS). Some communication and interface requirements of the MHS are also considered in this paper. The control system solution is illustrated with an actual MHS operation case which indicates that a truly flexible and integrated production system can be realized with a Petri net operation model and a hierarchical control structure. The significance of this work is related to the different operation testing and evaluation requirements encountered in manufacturing.
Dussan, K J; Cardona, C A; Giraldo, O H; Gutiérrez, L F; Pérez, V H
2010-12-01
Magnetic nanoparticles were prepared by coprecipitating Fe(2+) and Fe(3+) ions in a sodium hydroxide solution and used as support for lipase. The lipase-coated particles were applied in a reactive extraction process that allowed separation of the products formed during transesterification. Kinetics data for triolein and ethanol consumption during biodiesel (ethyl oleate) synthesis together with a thermodynamic phase equilibrium model (liquid-liquid) were used for simulation of batch and continuous processes. The analysis demonstrated the possibility of applying this biocatalytic system in the reactive zone using external magnetic fields. This approach implies new advantages in efficient location and use of lipases in column reactors for producing biodiesel. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
Three essays on energy and environmental economics: Empirical, applied, and theoretical
NASA Astrophysics Data System (ADS)
Karney, Daniel Houghton
Energy and environmental economics are closely related fields, as nearly all forms of energy production generate pollution and thus nearly all forms of environmental policy affect energy production and consumption. The three essays in this dissertation are related by their common themes of energy and environmental economics, but they differ in their methodologies. The first chapter is an empirical exercise that looks at the relationship between electricity price deregulation and maintenance outages at nuclear power plants. The second chapter is an applied theory paper that investigates environmental regulation in a multiple-pollutants setting. The third chapter develops a new methodology regarding the construction of analytical general equilibrium models that can be used to study topics in energy and environmental economics.
Lee, Jee Bum; Li, Ying; Choi, Ji Suk; Lee, Hyo Seok
2016-01-01
Purpose. To investigate the therapeutic effects of topical administration of antioxidant medicinal plant extracts in a mouse model of experimental dry eye (EDE). Methods. Eye drops containing balanced salt solution (BSS) or 0.001%, 0.01%, and 0.1% extracts were applied for the treatment of EDE. Tear volume, tear film break-up time (BUT), and corneal fluorescein staining scores were measured 10 days after desiccating stress. In addition, we evaluated the levels of interleukin- (IL-) 1β, tumor necrosis factor- (TNF-) α, IL-6, interferon- (IFN-) γ, and IFN-γ associated chemokines, percentage of CD4+C-X-C chemokine receptor type 3 positive (CXCR3+) T cells, goblet cell density, number of 4-hydroxy-2-nonenal (4-HNE) positive cells, and extracellular reactive oxygen species (ROS) production. Results. Compared to the EDE and BSS control groups, the mice treated with topical application of the 0.1% extract showed significant improvements in all clinical parameters, IL-1β, IL-6, TNF-α, and IFN-γ levels, percentage of CD4+CXCR3+ T cells, goblet cell density, number of 4-HNE-positive cells, and extracellular ROS production (P < 0.05). Conclusions. Topical application of 0.1% medicinal plant extracts improved clinical signs, decreased inflammation, and ameliorated oxidative stress marker and ROS production on the ocular surface of the EDE model mice. PMID:27313829
Kramer, Annemarie; Beck, Hans Christian; Kumar, Abhishek; Kristensen, Lars Peter; Imhoff, Johannes F.; Labes, Antje
2015-01-01
The marine fungus Microascus brevicaulis strain LF580 is a non-model secondary metabolite producer with high yields of the two secondary metabolites scopularides A and B, which exhibit distinct activities against tumour cell lines. A mutant strain was obtained using UV mutagenesis, showing faster growth and differences in pellet formation besides higher production levels. Here, we show the first proteome study of a marine fungus. Comparative proteomics were applied to gain deeper understanding of the regulation of production and of the physiology of the wild type strain and its mutant. For this purpose, an optimised protein extraction protocol was established. In total, 4759 proteins were identified. The central metabolic pathway of strain LF580 was mapped using the KEGG pathway analysis and GO annotation. Employing iTRAQ labelling, 318 proteins were shown to be significantly regulated in the mutant strain: 189 were down- and 129 upregulated. Proteomics are a powerful tool for the understanding of regulatory aspects: The differences on proteome level could be attributed to limited nutrient availability in the wild type strain due to a strong pellet formation. This information can be applied for optimisation on strain and process level. The linkage between nutrient limitation and pellet formation in the non-model fungus M. brevicaulis is in consensus with the knowledge on model organisms like Aspergillus niger and Penicillium chrysogenum. PMID:26460745
Age structure and capital dilution effects in neo-classical growth models.
Blanchet, D
1988-01-01
Economists often overestimate capital dilution effects when applying neoclassical growth models that use an age-structured population and depreciation of the capital stock. This occurs because the capital stock is improperly characterized. A standard model which assumes a constant depreciation rate of capital implies that a population growth rate equal to a negative constant savings ratio is preferable to any higher growth rate. Growth rates lower than a negative constant savings ratio suggest an ever-growing capital/labor ratio and an ever-growing standard of living, even if people do not save. This is suggested because the natural reduction of the capital stock through depreciation is slower than the population decrease, which is simply unrealistic. This model overlooks the fact that low or negative growth rates result in an ageing of the capital stock, and this ageing subsequently results in an increase in the overall rate of capital depreciation. In that overly simplistic model, depreciation was assumed to be independent of the age of the capital stock. Incorporating depreciation as a variable into a model allows a more symmetric treatment of capital. Using models with heterogeneous capital, this article explores what occurs when more than one kind of capital good is involved in production and when these various capital goods have different lengths of life. Applying economic models, it also examines what occurs when the length of life of capital may vary.
Biases in simulation of the rice phenology models when applied in warmer climates
NASA Astrophysics Data System (ADS)
Zhang, T.; Li, T.; Yang, X.; Simelton, E.
2015-12-01
Current model inter-comparison studies highlight differences in projections between crop models when they are applied to warmer climates, but these studies do not show how the accuracy of the models would change in such projections, because adequate observations under widely varying growing season temperature (GST) are often unavailable. Here, we investigate the potential changes in the accuracy of rice phenology models when these models are applied to a significantly warmer climate. We collected phenology data from 775 trials with 19 cultivars in 5 Asian countries (China, India, the Philippines, Bangladesh and Thailand). Each cultivar encompasses phenology observations under diverse GST regimes. For a given rice cultivar in different trials, the GST difference reaches 2.2 to 8.2°C, which allows us to calibrate the models under lower GST and validate them under higher GST (i.e., warmer climates). Four common phenology models, representing the major algorithms for simulating rice phenology, were tested in three model calibration experiments. The results suggest that the bilinear and beta models produced a gradually increasing phenology bias, with roughly double that bias in yield for each percent increase in phenology bias, whereas the growing-degree-day (GDD) and exponential models maintained a comparatively constant bias when applied to warmer climates. Moreover, the phenology bias of the bilinear and beta models did not decrease with increasing GST even when all data were used to calibrate the models. This suggests that variations in phenology bias are primarily attributable to intrinsic properties of the respective phenology model rather than to the calibration dataset. We therefore conclude that the GDD and exponential models have a better chance of correctly predicting rice phenology, and thus production, under warmer climates, supporting effective agricultural adaptation to and mitigation of climate change.
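The growing-degree-day (GDD) approach named above can be sketched simply: development toward a phenological stage is predicted by accumulating daily mean temperature above a base temperature until a cultivar-specific thermal requirement is met. The base temperature, GDD requirement and synthetic temperature series below are illustrative assumptions, not values calibrated in the study.

```python
# Minimal sketch of a growing-degree-day phenology model and its response to warming.

import numpy as np

def days_to_heading(daily_tmean, t_base=8.0, gdd_required=1400.0):
    """Return the day on which accumulated GDD first reaches the thermal requirement."""
    gdd = np.cumsum(np.clip(daily_tmean - t_base, 0.0, None))
    if gdd[-1] < gdd_required:
        return None                                # requirement not met within the season
    return int(np.argmax(gdd >= gdd_required)) + 1

rng = np.random.default_rng(6)
season = 26.0 + 4.0 * np.sin(np.linspace(0, np.pi, 150)) + rng.normal(0, 1.5, 150)
warmer = season + 2.0                              # a uniformly warmer season shortens the duration

print("baseline :", days_to_heading(season), "days")
print("+2 deg C :", days_to_heading(warmer), "days")
```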
Mpinga, Emmanuel Kabengele; Frey, Conrad; Chastonay, Philippe
2014-01-01
Torture is an important social and political problem worldwide that affects millions of people. Many host countries give victims of torture refugee status and provide for their basic needs: health care, professional reinsertion, and education. Little is known about the costs of torture. Yet this knowledge could serve as an additional argument for prevention and social mobilization against torture and provide a powerful basis of advocacy for rehabilitation programs and judiciary claims. The objective was to develop a model for estimating the economic costs of torture and to apply the model to a specific country. The estimation of the possible prevalence of torture victims was based on a review of the literature. The identification of the socioeconomic factors to be considered was done by analogy with various health problems. The loss of productivity and the economic burden of disease related to torture were estimated through the human capital approach and the component technique. The model was applied to the estimated number of torture victims with which Switzerland is confronted. When applied to this case study, the direct costs, such as housing, food, and clothing, represent roughly 130 million Swiss francs (CHF) per year, whereas health care costs amount to 16 million CHF per year and the costs related to the education of young people to 34 million CHF per year. Indirect costs, namely those related to the lost productivity of direct survivors of torture, have been estimated at one-third of a billion CHF per year; this rises to 10,073,419,200 CHF if 30 years of lost productivity per survivor are considered. Our study shows that a rough estimation of the costs related to torture is possible with some prerequisites, such as access to social and economic indicators at the country level.
Value of Pediatric Orthopaedic Surgery.
Kocher, Mininder S
2015-01-01
Value has become the buzzword of contemporary health care reform. Value is defined as outcomes relative to costs. Orthopaedic surgery has come under increasing scrutiny due to high procedural costs. However, orthopaedic surgery may actually be a great value given the benefits of treatment. The American Academy of Orthopaedic Surgeons (AAOS) Value Project team was tasked with developing a model for assessing the benefits of orthopaedic surgery, including indirect costs related to productivity and health-related quality of life. This model was applied to 5 orthopaedic conditions, demonstrating robust societal and economic value. In all cost-effectiveness models, younger patients demonstrated greater cost-effectiveness given their increased lifespan and productivity. This has tremendous implications within the field of pediatric orthopaedic surgery. Pediatric orthopaedics may be the best value in medicine!
A simulation-based approach for solving assembly line balancing problem
NASA Astrophysics Data System (ADS)
Wu, Xiaoyu
2017-09-01
Assembly line balancing is directly related to production efficiency; the problem has been discussed since the last century and many people are still studying this topic. In this paper, the assembly line problem is studied by establishing a mathematical model and by simulation. Firstly, the model for determining the smallest production beat (takt) for a given number of workstations is analysed. Based on this model, the exponential smoothing approach is applied to improve the algorithm's efficiency. After this basic work, the gas Stirling engine assembly line balancing problem is discussed as a case study. Both algorithms are implemented in the Lingo programming environment and the simulation results demonstrate the validity of the new methods.
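Simple exponential smoothing, the device mentioned for improving the algorithm's efficiency, replaces each observation with a weighted average of the newest value and the previous estimate. A minimal Python sketch; the smoothing constant and the cycle-time data are illustrative assumptions, not values from the paper:

    def exponential_smoothing(series, alpha=0.3):
        """Return smoothed values: s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
        smoothed = [series[0]]  # initialise with the first observation
        for x in series[1:]:
            smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
        return smoothed

    # Example: smoothing noisy cycle-time observations for a workstation.
    cycle_times = [42, 47, 40, 45, 50, 43]
    print(exponential_smoothing(cycle_times))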
Relation between scattering and production amplitude—Case of intermediate σ-particle in ππ-system—
NASA Astrophysics Data System (ADS)
Ishida, Muneyuki; Ishida, Shin; Ishida, Taku
1998-05-01
The relation between scattering and production amplitudes is investigated, using a simple field-theoretical model, from the general viewpoint of unitarity and the applicability of the final-state interaction (FSI) theorem. The IA-method and VMW-method, which are applied in our phenomenological analyses [2,3] suggesting the σ-existence, are obtained as the physical-state representations of the scattering and production amplitudes, respectively. Moreover, the VMW-method is shown to be an effective method for obtaining the resonance properties from general production processes, while the conventional analyses based on the "universality" of the ππ-scattering amplitude are powerless for this purpose.
Brackman, G; De Meyer, L; Nelis, H J; Coenye, T
2013-06-01
Although several factors contribute to wound healing, bacterial infections and the presence of biofilm can significantly affect healing. While this clearly indicates that therapies should address biofilm in wounds, only a few wound care products have been evaluated for their antibiofilm effect. For this reason, we developed a rapid quantification approach to investigate the efficacy of wound care products on wounds infected with Staphylococcus spp. An in vitro chronic wound infection model was employed in which a fluorescent Staph. aureus strain allowed rapid quantification of the bacterial burden after treatment. A good correlation was observed between the fluorescence signal and the bacterial counts. When evaluated in this model, several commonly used wound dressings and wound care products inhibited biofilm formation, resulting in a decrease of between one and seven log CFU per biofilm compared with biofilm formed in the absence of products. In contrast, most dressings only moderately affected mature biofilms. Our model allowed the rapid quantification of the bacterial burden after treatment. However, the efficacy of treatment varied between the different types of dressings and/or wound care products. Our model can be used to compare the efficacy of wound care products to inhibit biofilm formation and/or eradicate mature biofilms. In addition, the results indicate that treatment of infected wounds should be started as soon as possible and that novel products with more potent antibiofilm activity are needed. © 2013 The Society for Applied Microbiology.
NASA Astrophysics Data System (ADS)
DeLonge, M. S.; Ryals, R.; Silver, W. L.
2011-12-01
Soil amendments, such as compost and manure, can be applied to grasslands to improve soil conditions and enhance aboveground net primary productivity. Applying such amendments can also lead to soil carbon (C) sequestration and, when materials are diverted from waste streams (e.g., landfills, manure lagoons), can offset greenhouse gas (GHG) emissions. However, amendment production and application is also associated with GHG emissions, and the net impact of these amendments remains unclear. To investigate the potential for soil amendments to reduce net GHG emissions, we developed a comprehensive, field-scale life cycle assessment (LCA) model. The LCA includes GHG (i.e., CO2, CH4, N2O) emissions of soil amendment production, application, and ecosystem response. Emissions avoided by diverting materials from landfills or manure management systems are also considered. We developed the model using field observations from grazed annual grassland in northern California (e.g., soil C; above- and belowground net primary productivity; C:N ratios; trace gas emissions from soils, manure piles, and composting), CENTURY model simulations (e.g., long-term soil C and trace gas emissions from soils under various land management strategies), and literature values (e.g., GHG emissions from transportation, inorganic fertilizer production, composting, and enteric fermentation). The LCA quantifies and contrasts the potential net GHG impacts of applying compost, manure, and commercial inorganic fertilizer to grazing lands. To estimate the LCA uncertainty, sensitivity tests were performed on the most widely ranging or highly uncertain parameters (e.g., compost materials, landfill emissions, manure management system emissions). Finally, our results are scaled-up to assess the feasibility and potential impacts of large-scale adoption of soil amendment application as a land-management strategy in California. Our base case results indicate that C sinks and emissions offsets associated with compost production and application can exceed life cycle emissions, potentially leading to a net reduction in GHG emissions of over 20 Mg CO2e per hectare of treated land. If similar results could be obtained in only 5% of California's 2,550,000 ha of rangeland, compost amendment application could offset the annual emissions of the California agriculture and forestry industries (> 28.25 million Mg CO2e, California Air Resources Board, 2008). Our findings indicate that application of compost amendments to grasslands may be an effective, beneficial, and relatively inexpensive strategy to contribute to climate change mitigation.
Choi, Eunhee; Tang, Fengyan; Kim, Sung-Geun; Turk, Phillip
2016-10-01
This study examined the longitudinal relationships between functional health in later years and three types of productive activities: volunteering, full-time work, and part-time work. Using data from five waves (2000-2008) of the Health and Retirement Study, we applied multivariate latent growth curve modeling to examine the longitudinal relationships among individuals aged 50 or over. Functional health was measured by limitations in activities of daily living. Individuals who volunteered or worked either full time or part time exhibited a slower decline in functional health than nonparticipants. Significant associations were also found between initial functional health and longitudinal changes in productive activity participation. This study provides additional support for the benefits of productive activities later in life; engagement in volunteering and employment is indeed associated with better functional health in middle and old age. © The Author(s) 2016.
Buongiorno, J.; Raunikar, R.; Zhu, S.
2011-01-01
The Global Forest Products Model (GFPM) was applied to project the consequences for the global forest sector of doubling the rate of growth of bioenergy demand relative to a base scenario, other drivers being maintained constant. The results showed that this would lead to the convergence of the price of fuelwood and industrial roundwood, raising the price of industrial roundwood by nearly 30% in 2030. The price of sawnwood and panels would be 15% higher. The price of paper would be 3% higher. Concurrently, the demand for all manufactured wood products would be lower in all countries, but the production would rise in countries with competitive advantage. The global value added in wood processing industries would be 1% lower in 2030. The forest stock would be 2% lower for the world and 4% lower for Asia. These effects varied substantially by country. ?? 2011 Department of Forest Economics, SLU Ume??, Sweden.
Reflectance model for quantifying chlorophyll a in the presence of productivity degradation products
NASA Technical Reports Server (NTRS)
Carder, K. L.; Hawes, S. K.; Steward, R. G.; Baker, K. A.; Smith, R. C.; Mitchell, B. G.
1991-01-01
A reflectance model developed to estimate chlorophyll a concentrations in the presence of marine colored dissolved organic matter, pheopigments, detritus, and bacteria is presented. Nomograms and lookup tables are generated to describe the effects of different mixtures of chlorophyll a and these degradation products on the R(412):R(443) and R(443):R(565) remote-sensing reflectance or irradiance reflectance ratios. These are used to simulate the accuracy of potential ocean color satellite algorithms, assuming that atmospheric effects have been removed. For the California Current upwelling and offshore regions, with chlorophyll a not greater than 1.3 mg/cu m, the average error for chlorophyll a retrievals derived from irradiance reflectance data for degradation product-rich areas was reduced from +/-61 percent to +/-23 percent by application of an algorithm using two reflectance ratios rather than the commonly used algorithm applying a single reflectance ratio.
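Band-ratio algorithms of the kind compared here typically relate a reflectance ratio to chlorophyll a through an empirical power law, with the two-ratio variant adding a second ratio to separate pigment from degradation products. A minimal single-ratio Python sketch; the coefficients are hypothetical placeholders, not the study's nomograms or lookup tables:

    def chlorophyll_from_ratio(r443, r565, a=0.30, b=-2.5):
        """Empirical band-ratio estimate: chl = a * (R(443)/R(565))**b.
        The coefficients a and b are placeholders; real algorithms fit them to field data."""
        return a * (r443 / r565) ** b

    # A lower blue/green reflectance ratio implies higher chlorophyll a.
    print(chlorophyll_from_ratio(0.010, 0.005))  # clearer water, lower chlorophyll
    print(chlorophyll_from_ratio(0.006, 0.005))  # greener water, higher chlorophyll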
Examining the impacts of increased corn production on ...
This study demonstrates the value of a coupled chemical transport modeling system for investigating groundwater nitrate contamination responses associated with nitrogen (N) fertilizer application and increased corn production. The coupled Community Multiscale Air Quality Bidirectional and Environmental Policy Integrated Climate modeling system incorporates agricultural management practices and N exchange processes between the soil and atmosphere to estimate levels of N that may volatilize into the atmosphere, re-deposit, and seep or flow into surface and groundwater. Simulated values from this modeling system were used in a land-use regression model to examine associations between groundwater nitrate-N measurements and a suite of factors related to N fertilizer and groundwater nitrate contamination. Multi-variable modeling analysis revealed that the N-fertilizer rate (versus total) applied to irrigated (versus rainfed) grain corn (versus other crops) was the strongest N-related predictor variable of groundwater nitrate-N concentrations. Application of this multi-variable model considered groundwater nitrate-N concentration responses under two corn production scenarios. Findings suggest that increased corn production between 2002 and 2022 could result in 56% to 79% increase in areas vulnerable to groundwater nitrate-N concentrations ≥ 5 mg/L. These above-threshold areas occur on soils with a hydraulic conductivity 13% higher than the rest of the domain. Additio
Vallejo-Torres, Laura; Steuten, Lotte; Parkinson, Bonny; Girling, Alan J; Buxton, Martin J
2011-01-01
The probability of reimbursement is a key factor in determining whether to proceed with or abandon a product during its development. The purpose of this article is to illustrate how the methods of iterative Bayesian economic evaluation proposed in the literature can be incorporated into the development process of new medical devices, adapting them to face the relative scarcity of data and time that characterizes the process. A 3-stage economic evaluation was applied: an early phase in which simple methods allow for a quick prioritization of competing products; a mid-stage in which developers synthesize the data into a decision model, identify the parameters for which more information is most valuable, and explore uncertainty; and a late stage, in which all relevant information is synthesized. A retrospective analysis was conducted of the case study of absorbable pins, compared with metallic fixation, in osteotomy to treat hallux valgus. The results from the early analysis suggest absorbable pins to be cost-effective under the beliefs and assumptions applied. The outputs from the models at the mid-stage analyses show the device to be cost-effective with a high probability. Late-stage analysis synthesizes evidence from a randomized controlled trial and informative priors, which are based on previous evidence. It also suggests that absorbable pins are the most cost-effective strategy, although the uncertainty in the model output increased considerably. This example illustrates how the method proposed allows decisions in the product development cycle to be based on the best knowledge that is available at each stage.
Vatankhah, Soudabeh; Alirezaei, Samira; Khosravizadeh, Omid; Mirbahaeddin, Seyyed Elmira; Alikhani, Mahtab; Alipanah, Mobarakeh
2017-08-01
In today's transforming world, increased productivity and efficient use of existing facilities are no longer a choice but a necessity. In this line, attention to change and transformation is one of the factors affecting the growth of productivity in organizations, especially in hospitals. The aim was to examine the effect of transformational leadership on the productivity of employees in teaching hospitals affiliated to Iran University of Medical Sciences. This cross-sectional study was conducted on 254 participants from educational and medical centers affiliated to Iran University of Medical Sciences (Tehran, Iran) in 2016. The standard questionnaires of Bass & Avolio and of Hersi & Goldsmith were used to assess transformational leadership and level of productivity, respectively. The research assumptions were tested at a significance level of 0.05 by applying descriptive statistics and structural equation modeling (SEM) using SPSS 19 and Amos 24. The fit indices of the assessed model after modification include a chi-square to degrees of freedom ratio of 2.756, a CFI of 0.95, an IFI of 0.92, and a root mean square error of approximation (RMSEA) of 0.10. These results indicate that the model fits well after modification. Analysis of the model's assumptions and the final research model also reveals an effect of transformational leadership on employees' productivity with a coefficient of 0.83 (p=0.001). This research indicates that the more the leadership and decision-making style in hospitals leans towards the transformational mode, the more positive outcomes it brings for employees and the organization through increased productivity. Therefore, it is essential to pay focused attention to training/educational programs in organizations to create and encourage transformational leadership behaviors, which hopefully lead to more productive employees.
Economic analysis and assessment of syngas production using a modeling approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hakkwan; Parajuli, Prem B.; Yu, Fei
Economic analysis and modeling are essential for the development of current feedstock and process technology for bio-gasification. The objective of this study was to develop an economic model and apply it to predict the unit cost of syngas production from a micro-scale bio-gasification facility. The economic model was programmed in the C++ programming language and developed using a parametric cost approach, which included processes to calculate the total capital costs and the total operating costs. The model used measured economic data from the bio-gasification facility at Mississippi State University. The modeling results showed that the unit cost of syngas production was $1.217 for a 60 Nm3 h-1 capacity bio-gasifier. The operating cost was the major part of the total production cost. The equipment purchase cost and the labor cost were the largest parts of the total capital cost and the total operating cost, respectively. Sensitivity analysis indicated that labor cost ranks at the top, followed by equipment cost, loan life, feedstock cost, interest rate, utility cost, and waste treatment cost. The unit cost of syngas production increased with an increase in all parameters with the exception of loan life. The annual costs of equipment, labor, feedstock, waste treatment, and utilities showed a linear relationship with percent changes, while loan life and annual interest rate showed a non-linear relationship. This study provides useful information for the economic analysis and assessment of syngas production using a modeling approach.
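The parametric cost approach described above reduces to annualising the capital cost, adding the annual operating cost, and dividing by annual syngas output. A minimal Python sketch with hypothetical inputs, not the measured Mississippi State data:

    def unit_cost_of_syngas(capital_cost, operating_cost_per_year,
                            annual_output_nm3, interest_rate=0.06, loan_life_years=10):
        """Cost per Nm3 of syngas: annualised capital plus operating cost over output."""
        # Capital recovery factor converts a present capital cost into equal annual payments.
        crf = (interest_rate * (1 + interest_rate) ** loan_life_years) / \
              ((1 + interest_rate) ** loan_life_years - 1)
        return (capital_cost * crf + operating_cost_per_year) / annual_output_nm3

    # Illustrative inputs for a small 60 Nm3/h gasifier running 6,000 h per year.
    print(round(unit_cost_of_syngas(250_000, 320_000, 60 * 6_000), 3))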
HPC simulations of grain-scale spallation to improve thermal spallation drilling
NASA Astrophysics Data System (ADS)
Walsh, S. D.; Lomov, I.; Wideman, T. W.; Potter, J.
2012-12-01
Thermal spallation drilling and related hard-rock hole opening techniques are transformative technologies with the potential to dramatically reduce the costs associated with EGS well drilling and improve the productivity of new and existing wells. In contrast to conventional drilling methods that employ mechanical means to penetrate rock, thermal spallation methods fragment rock into small pieces ("spalls") without contact via the rapid transmission of heat to the rock surface. State-of-the-art constitutive models of thermal spallation employ Weibull statistical failure theory to represent the relationship between rock heterogeneity and its propensity to produce spalls when heat is applied to the rock surface. These models have been successfully used to predict such factors as penetration rate, spall-size distribution and borehole radius from drilling jet velocity and applied heat flux. A properly calibrated Weibull model would permit design optimization of thermal spallation drilling under geothermal field conditions. However, although useful for predicting system response in a given context, Weibull models are by their nature empirically derived. In the past, the parameters used in these models were carefully determined from laboratory tests, and thus model applicability was limited by experimental scope. This becomes problematic, for example, if simulating spall production at depths relevant for geothermal energy production, or modeling thermal spallation drilling in new rock types. Nevertheless, with sufficient computational resources, Weibull models could be validated in the absence of experimental data by explicit small-scale simulations that fully resolve rock grains. This presentation will discuss how high-fidelity simulations can be used to inform Weibull models of thermal spallation, and what these simulations reveal about the processes driving spallation at the grain-scale - in particular, the role that inter-grain boundaries and micro-pores play in the onset and extent of spallation. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
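Weibull statistical failure theory, named here as the basis of the constitutive models, expresses the probability that a stressed rock volume fails (spalls) as a function of stress relative to a reference strength, with the Weibull modulus controlling the scatter tied to heterogeneity. A minimal Python sketch with placeholder parameters, not the calibrated values used in these simulations:

    import math

    def weibull_failure_probability(stress, reference_strength=100.0e6,
                                    weibull_modulus=10.0, volume_ratio=1.0):
        """P_f = 1 - exp(-(V/V0) * (sigma/sigma0)**m); a higher modulus means less scatter."""
        if stress <= 0.0:
            return 0.0
        return 1.0 - math.exp(-volume_ratio * (stress / reference_strength) ** weibull_modulus)

    # Thermally induced stress approaching the reference strength raises the
    # spallation probability sharply.
    for sigma in (60e6, 90e6, 110e6):
        print(sigma, round(weibull_failure_probability(sigma), 4))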
On the computation of molecular surface correlations for protein docking using fourier techniques.
Sakk, Eric
2007-08-01
The computation of surface correlations using a variety of molecular models has been applied to the unbound protein docking problem. Because of the computational complexity involved in examining all possible molecular orientations, the fast Fourier transform (FFT) (a fast numerical implementation of the discrete Fourier transform (DFT)) is generally applied to minimize the number of calculations. This approach is rooted in the convolution theorem which allows one to inverse transform the product of two DFTs in order to perform the correlation calculation. However, such a DFT calculation results in a cyclic or "circular" correlation which, in general, does not lead to the same result as the linear correlation desired for the docking problem. In this work, we provide computational bounds for constructing molecular models used in the molecular surface correlation problem. The derived bounds are then shown to be consistent with various intuitive guidelines previously reported in the protein docking literature. Finally, these bounds are applied to different molecular models in order to investigate their effect on the correlation calculation.
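The circular-versus-linear distinction drawn above is usually handled by zero-padding both grids to at least the sum of their supports minus one before transforming, so that wrap-around terms vanish. A minimal one-dimensional NumPy sketch (the docking problem itself is three-dimensional; the arrays here are illustrative):

    import numpy as np

    def linear_correlation_via_fft(a, b):
        """Linear cross-correlation via FFTs, zero-padded to len(a) + len(b) - 1."""
        n = len(a) + len(b) - 1                    # padding bound that avoids wrap-around
        fa = np.fft.rfft(a, n)
        fb = np.fft.rfft(b, n)
        return np.fft.irfft(fa * np.conj(fb), n)   # correlation = inverse FFT of the product

    a = np.array([1.0, 2.0, 3.0, 0.0])
    b = np.array([0.0, 1.0, 0.5, 0.0])
    fft_result = linear_correlation_via_fft(a, b)
    direct = np.correlate(a, b, mode="full")       # reference linear correlation
    # The FFT result holds the same lag values, with negative lags wrapped to the end.
    print(np.allclose(np.sort(fft_result), np.sort(direct)))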
Petroleum refinery operational planning using robust optimization
NASA Astrophysics Data System (ADS)
Leiras, A.; Hamacher, S.; Elkamel, A.
2010-12-01
In this article, the robust optimization methodology is applied to deal with uncertainties in the prices of saleable products, operating costs, product demand, and product yield in the context of refinery operational planning. A numerical study demonstrates the effectiveness of the proposed robust approach. The benefits of incorporating uncertainty in the different model parameters were evaluated in terms of the cost of ignoring uncertainty in the problem. The calculations suggest that this benefit is equivalent to 7.47% of the deterministic solution value, which indicates that the robust model may offer advantages to those involved with refinery operational planning. In addition, the probability bounds of constraint violation are calculated to help the decision-maker adopt a more appropriate parameter to control robustness and judge the tradeoff between conservatism and total profit.
Mechanisms and kinetics models for ultrasonic waste activated sludge disintegration.
Wang, Fen; Wang, Yong; Ji, Min
2005-08-31
Ultrasonic energy can be applied as a pre-treatment to disintegrate sludge flocs and disrupt bacterial cell walls, improving hydrolysis and thereby the rate of sludge digestion and methane production. In this paper, by adding NaHCO3 to mask the oxidizing effect of OH radicals, the mechanisms of disintegration are investigated. In addition, kinetics models for ultrasonic sludge disintegration are established by applying a multi-variable linear regression method. It was found that hydro-mechanical shear forces are predominantly responsible for the disintegration, and that the contribution of the oxidizing effect of OH radicals increases with ultrasonic density and ultrasonic intensity. It was also inferred from the kinetics model whose dependent variable is SCOD+ that both sludge pH and sludge concentration significantly affect the disintegration.
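The multi-variable linear regression step can be reproduced with ordinary least squares, regressing the disintegration response (e.g., SCOD release) on candidate predictors such as ultrasonic density, sonication time, sludge pH and solids concentration. A minimal NumPy sketch with fabricated toy numbers used only to show the fitting step, not data from the study:

    import numpy as np

    # Columns: ultrasonic density (W/mL), sonication time (min), sludge pH, solids (g/L).
    X = np.array([
        [0.5, 10, 6.8, 10.0],
        [1.0, 15, 7.0, 12.0],
        [1.0, 20, 6.9, 15.0],
        [1.5, 25, 7.3, 14.0],
        [1.5, 30, 7.1, 20.0],
        [2.0, 35, 7.5, 18.0],
    ])
    y = np.array([180.0, 300.0, 420.0, 560.0, 700.0, 860.0])  # SCOD increase (mg/L), toy values

    # Ordinary least squares with an intercept column.
    X_design = np.column_stack([np.ones(len(X)), X])
    coeffs, *_ = np.linalg.lstsq(X_design, y, rcond=None)
    print("intercept and coefficients:", np.round(coeffs, 2))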
Greenhouse Gas Emissions of Beef Cattle Production in the Southern Great Plains
NASA Astrophysics Data System (ADS)
Kannan, N.; Niraula, R.; Saleh, A.; Osei, E.; Cole, A.; Todd, R.; Waldrip, H.; Aljoe, H.
2017-12-01
A five-year USDA-funded study titled "Resilience and vulnerability of beef cattle production in the Southern Great Plains under changing climate, land use, and markets" was initiated as a multi-institutional collaboration involving the Texas Institute for Applied Environmental Research (TIAER) at Tarleton State University, the United States Department of Agriculture (USDA) Agricultural Research Service (ARS) in El Reno, Oklahoma, USDA-ARS in Bushland, Texas, Kansas State University, Oklahoma State University, the University of Oklahoma, and the Noble Research Institute in Ardmore, Oklahoma. The project goal is to safeguard and promote regional beef production while mitigating its environmental footprint. Conducting a full Life Cycle Analysis (LCA) is one of the major objectives of the study, in addition to field experiments, extension, outreach, and education. Estimation of all resource use and greenhouse gas emissions forms part of the LCA. A computer model titled the Animal Production Life Cycle Analysis Tool (APLCAT) was developed and applied to conduct the LCA of beef cattle production in the study region. The model estimates water use, energy requirements, and emissions of enteric methane, manure methane, nitrous oxide, and carbon dioxide. Also included in the LCA analysis are land-atmosphere exchanges of methane, nitrous oxide, and carbon dioxide, and the global warming potential. Our study is focused on the cow-calf and stocker phases of beef cattle production. The animal production system in the study region is predominantly forage-based, with protein and energy supplements when needed. Spring calving is typical of the study region. In the cow-calf phase animals typically graze native prairie, although introduced pasture grazing is also prevalent. Stockers use winter pasture as the major feed. The results of greenhouse gas emissions, summarized per kg of hot carcass weight or per animal fed, will be presented.
Holder, Christopher T; Cleland, Joshua C; LeDuc, Stephen D; Andereck, Zac; Hogan, Chris; Martin, Kristen M
2016-04-01
The potential environmental effects of increased U.S. biofuel production often vary depending upon the location and type of land used to produce biofuel feedstocks. However, complete, annual data are generally lacking regarding feedstock production by specific location. Corn is the dominant biofuel feedstock in the U.S., so here we present methods for estimating where bioethanol corn feedstock is grown annually and how much is used by U.S. ethanol biorefineries. We use geospatial software and publicly available data to map locations of biorefineries, estimate their corn feedstock requirements, and estimate the feedstock production locations and quantities. We combined these data and estimates into a Bioethanol Feedstock Geospatial Database (BFGD) for years 2005-2010. We evaluated the performance of the methods by assessing how well the feedstock geospatial model matched our estimates of locally-sourced feedstock demand. On average, the model met approximately 89 percent of the total estimated local feedstock demand across the studied years-within approximately 25-to-40 kilometers of the biorefinery in the majority of cases. We anticipate that these methods could be used for other years and feedstocks, and can be subsequently applied to estimate the environmental footprint of feedstock production. Methods used to develop the Bioethanol Feedstock Geospatial Database (BFGD) provide a means of estimating the amount and location of U.S. corn harvested for use as U.S. bioethanol feedstock. Such estimates of geospatial feedstock production may be used to evaluate environmental impacts of bioethanol production and to identify conservation priorities. The BFGD is available for 2005-2010, and the methods may be applied to additional years, locations, and potentially other biofuels and feedstocks.
An Evaporative Cooling Model for Teaching Applied Psychrometrics
ERIC Educational Resources Information Center
Johnson, Donald M.
2004-01-01
Evaporative cooling systems are commonly used in controlled environment plant and animal production. These cooling systems operate based on well defined psychrometric principles. However, students often experience considerable difficulty in learning these principles when they are taught in an abstract, verbal manner. This article describes an…
40 CFR 86.085-37 - Production vehicles and engines.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... (d) The following definitions apply to this section: (1) Model type means a unique combination of car..., inertia weight, and transmission class. (3) Vehicle configuration means a unique combination of basic engine, engine code, inertia weight, transmission configuration, and axle ratio within a base level. [48...
40 CFR 86.085-37 - Production vehicles and engines.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... (d) The following definitions apply to this section: (1) Model type means a unique combination of car..., inertia weight, and transmission class. (3) Vehicle configuration means a unique combination of basic engine, engine code, inertia weight, transmission configuration, and axle ratio within a base level. [48...
Analysing Amazonian forest productivity using a new individual and trait-based model (TFS v.1)
NASA Astrophysics Data System (ADS)
Fyllas, N. M.; Gloor, E.; Mercado, L. M.; Sitch, S.; Quesada, C. A.; Domingues, T. F.; Galbraith, D. R.; Torre-Lezama, A.; Vilanova, E.; Ramírez-Angulo, H.; Higuchi, N.; Neill, D. A.; Silveira, M.; Ferreira, L.; Aymard C., G. A.; Malhi, Y.; Phillips, O. L.; Lloyd, J.
2014-07-01
Repeated long-term censuses have revealed large-scale spatial patterns in Amazon basin forest structure and dynamism, with some forests in the west of the basin having rates of aboveground biomass production and tree recruitment up to twice as high as forests in the east. Possible causes for this variation could be the climatic and edaphic gradients across the basin and/or the spatial distribution of tree species composition. To help understand the causes of this variation, a new individual-based model of tropical forest growth, designed to take full advantage of the forest census data available from the Amazonian Forest Inventory Network (RAINFOR), has been developed. The model allows for within-stand variations in tree size distribution and key functional traits and between-stand differences in climate and soil physical and chemical properties. It runs at the stand level with four functional traits - leaf dry mass per area (Ma), leaf nitrogen (NL) and phosphorus (PL) content and wood density (DW) - varying from tree to tree in a way that replicates the observed continua found within each stand. We first applied the model to validate canopy-level water fluxes at three eddy covariance flux measurement sites. For all three sites the canopy-level water fluxes were adequately simulated. We then applied the model at seven plots where intensive measurements of carbon allocation are available. Tree-by-tree multi-annual growth rates generally agreed well with observations for small trees, but deviations were identified for larger trees. At the stand level, simulations at 40 plots were used to explore the influence of climate and soil nutrient availability on the gross (ΠG) and net (ΠN) primary production rates as well as the carbon use efficiency (CU). Simulated ΠG, ΠN and CU were not associated with temperature. On the other hand, all three measures of stand-level productivity were positively related to both mean annual precipitation and soil nutrient status. Sensitivity studies showed the clear importance of accurate parameterisation of within- and between-stand trait variability for the fidelity of model predictions. For example, when functional tree diversity was not included in the model (i.e. with just a single plant functional type with mean basin-wide trait values), the predictive ability of the model was reduced. This was also the case when basin-wide (as opposed to site-specific) trait distributions were applied within each stand. We conclude that models of tropical forest carbon, energy and water cycling should strive to accurately represent observed variations in functionally important traits across the range of relevant scales.
Mead, Emma J; Chiverton, Lesley M; Spurgeon, Sarah K; Martin, Elaine B; Montague, Gary A; Smales, C Mark; von der Haar, Tobias
2012-01-01
Monoclonal antibodies are commercially important, high value biotherapeutic drugs used in the treatment of a variety of diseases. These complex molecules consist of two heavy chain and two light chain polypeptides covalently linked by disulphide bonds. They are usually expressed as recombinant proteins from cultured mammalian cells, which are capable of correctly modifying, folding and assembling the polypeptide chains into the native quaternary structure. Such recombinant cell lines often vary in the amounts of product produced and in the heterogeneity of the secreted products. The biological mechanisms of this variation are not fully defined. Here we have utilised experimental and modelling strategies to characterise and define the biology underpinning product heterogeneity in cell lines exhibiting varying antibody expression levels, and then experimentally validated these models. In undertaking these studies we applied and validated biochemical (rate-constant based) and engineering (nonlinear) models of antibody expression to experimental data from four NS0 cell lines with different IgG4 secretion rates. The models predict that export of the full antibody and its fragments are intrinsically linked, and cannot therefore be manipulated individually at the level of the secretory machinery. Instead, the models highlight strategies for the manipulation at the precursor species level to increase recombinant protein yields in both high and low producing cell lines. The models also highlight cell line specific limitations in the antibody expression pathway.
NASA Technical Reports Server (NTRS)
Woodward, W. A.; Gray, H. L.
1983-01-01
Efforts in support of the development of multicrop production monitoring capability are reported. In particular, segment level proportion estimation techniques based upon a mixture model were investigated. Efforts have dealt primarily with evaluation of current techniques and development of alternative ones. A comparison of techniques is provided on both simulated and LANDSAT data along with an analysis of the quality of profile variables obtained from LANDSAT data.
2013-03-20
Wakefield of the University of Michigan as Co-PI. This extended activity produced a large number of products and accomplishments; however, this report...speech communication will be expanded to provide a robust modeling and prediction capability for tasks involving speech production and speech and non...preparations made to move to the newer Cocoa API instead of the previous Carbon API. In the following sections, an extended treatment will be
Clinical Development of Cell Therapies: Setting the Stage for Academic Success.
Abou-El-Enein, M; Volk, H-D; Reinke, P
2017-01-01
Cellular therapies have potential to treat a wide range of diseases with autologous immunotherapies showing unprecedented therapeutic promise in clinical trials. Such therapies are mainly developed by academic researchers applying small-scale production, targeting rare and unmet medical needs. Here, we highlight the clinical translation of immunotherapy product in an academic setting, which may serve as a success model for early academic development of cell-based therapeutics. © 2016 American Society for Clinical Pharmacology and Therapeutics.
Ghosh, Amit; Ando, David; Gin, Jennifer; ...
2016-10-05
Efficient redirection of microbial metabolism into the abundant production of desired bioproducts remains non-trivial. Here, we used flux-based modeling approaches to improve yields of fatty acids in Saccharomyces cerevisiae. We combined 13C labeling data with comprehensive genome-scale models to shed light onto microbial metabolism and improve metabolic engineering efforts. We concentrated on studying the balance of acetyl-CoA, a precursor metabolite for the biosynthesis of fatty acids. A genome-wide acetyl-CoA balance study showed ATP citrate lyase from Yarrowia lipolytica as a robust source of cytoplasmic acetyl-CoA and malate synthase as a desirable target for downregulation in terms of acetyl-CoA consumption. These genetic modifications were applied to S. cerevisiae WRY2, a strain that is capable of producing 460 mg/L of free fatty acids. With the addition of ATP citrate lyase and downregulation of malate synthase, the engineered strain produced 26% more free fatty acids. Further increases in free fatty acid production of 33% were obtained by knocking out the cytoplasmic glycerol-3-phosphate dehydrogenase, which flux analysis had shown was competing for carbon flux upstream with the carbon flux through the acetyl-CoA production pathway in the cytoplasm. In total, the genetic interventions applied in this work increased fatty acid production by ~70%.
Application of target costing in machining
NASA Astrophysics Data System (ADS)
Gopalakrishnan, Bhaskaran; Kokatnur, Ameet; Gupta, Deepak P.
2004-11-01
In today's intensely competitive and highly volatile business environment, consistent development of low-cost, high-quality products that meet functionality requirements is key to a company's survival. Companies continuously strive to reduce costs while still producing quality products to stay ahead of the competition. Many companies have turned to target costing to achieve this objective. Target costing is a structured approach to determine the cost at which a proposed product, meeting the quality and functionality requirements, must be produced in order to generate the desired profits. It subtracts the desired profit margin from the company's selling price to establish the manufacturing cost of the product. An extensive literature review revealed that companies in the automotive, electronics and process industries have reaped the benefits of target costing. However, the target costing approach has not been applied in the machining industry; instead, other techniques based on Geometric Programming, Goal Programming, and Lagrange Multipliers have been proposed for application in this industry. These models follow a forward approach, first selecting a set of machining parameters and then determining the machining cost. Hence, in this study we have developed an algorithm to apply the concepts of target costing, which is a backward approach that selects the machining parameters based on the required machining cost and is therefore more suitable for practical applications in process improvement and cost reduction. A target costing model was developed for the turning operation and was successfully validated using practical data.
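The target-costing arithmetic stated above can be made concrete: subtract the required profit from the market-set selling price to obtain the allowable manufacturing cost, then search machining parameters backwards until that cost is met. A minimal Python sketch; the prices, the Taylor-type tool-life constants and the cost function are hypothetical, not the validated turning model of the paper:

    def allowable_cost(selling_price, profit_margin):
        """Target cost = selling price minus required profit (margin as a fraction of price)."""
        return selling_price * (1.0 - profit_margin)

    def machining_cost(cutting_speed, cost_per_minute=1.2, tool_cost=4.0,
                       work_constant=600.0, tool_life_constant=240.0, taylor_exponent=0.25):
        """Illustrative cost model: machining time falls with speed, tool wear rises with it."""
        time_min = work_constant / cutting_speed
        tool_life = (tool_life_constant / cutting_speed) ** (1.0 / taylor_exponent)
        return time_min * cost_per_minute + tool_cost * (time_min / tool_life)

    target = allowable_cost(selling_price=25.0, profit_margin=0.30)
    # Backward step: find the lowest-cost cutting speed, then check it meets the target.
    best_speed = min(range(60, 241, 5), key=machining_cost)
    print(target, best_speed, round(machining_cost(best_speed), 2))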
Economic risk assessment of drought impacts on irrigated agriculture
NASA Astrophysics Data System (ADS)
Lopez-Nicolas, A.; Pulido-Velazquez, M.; Macian-Sorribes, H.
2017-07-01
In this paper we present an innovative framework for an economic risk analysis of drought impacts on irrigated agriculture. It consists of the integration of three components: stochastic time series modelling for prediction of inflows and future reservoir storages at the beginning of the irrigation season; statistical regression for the evaluation of water deliveries based on projected inflows and storages; and econometric modelling for the economic assessment of the production value of agriculture based on irrigation water deliveries and crop prices. Therefore, the effect of price volatility can be isolated from the losses due to water scarcity in the assessment of drought impacts. Monte Carlo simulations are applied to generate probability functions of inflows, which are translated into probabilities of storages, deliveries and, finally, production value of agriculture. The framework also allows the assessment of the value of mitigation measures as a reduction of economic losses during droughts. The approach was applied to the Jucar river basin, a complex system affected by multiannual severe droughts, with irrigated agriculture as the main consumptive demand. Probability distributions of deliveries and production value were obtained for each irrigation season. In the majority of the irrigation districts, drought causes a significant economic impact. The increase in crop prices can partially offset the losses from the reduction in production due to water scarcity in some districts. Emergency wells contribute to mitigating drought impacts on the Jucar river system.
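The chain described above (sample inflows, map them to storages and deliveries, then to production value) can be sketched as a Monte Carlo loop in which each link is a simple fitted relationship. All distributions and coefficients below are placeholders, not the Jucar basin fits:

    import numpy as np

    rng = np.random.default_rng(seed=1)
    n_sims = 10_000

    # 1) Stochastic seasonal inflow (hm3); a lognormal is used as a placeholder distribution.
    inflow = rng.lognormal(mean=5.0, sigma=0.4, size=n_sims)

    # 2) Regression-style links: storage from inflow, deliveries from storage (capped).
    storage = np.clip(0.6 * inflow + 20.0, 0.0, 300.0)
    deliveries = np.clip(0.8 * storage - 10.0, 0.0, 200.0)

    # 3) Econometric link: production value rises with deliveries, scaled by crop price.
    crop_price_index = rng.normal(loc=1.0, scale=0.1, size=n_sims)
    production_value = crop_price_index * (2.0 * deliveries)  # million EUR, illustrative

    # Risk summary: probability of the production value falling below a threshold.
    threshold = 150.0
    print("P(value < threshold) =", round(float(np.mean(production_value < threshold)), 3))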
Single stage queueing/manufacturing system model that involves emission variable
NASA Astrophysics Data System (ADS)
Murdapa, P. S.; Pujawan, I. N.; Karningsih, P. D.; Nasution, A. H.
2018-04-01
Queueing commonly occurs in every industry. The basic models of queueing theory provide a foundation for modeling a manufacturing system. Nowadays, carbon emission is an important and unavoidable issue due to its huge impact on the environment. However, existing queueing models applied to the analysis of single-stage manufacturing systems have not taken carbon emissions into consideration; if such models are applied in a manufacturing context, they may lead to improper decisions. By taking emission variables into account in queueing models, not only does the model become more comprehensive, but it also creates awareness of the issue among the many parties involved in the system. This paper discusses a single-stage M/M/1 queueing model that involves an emission variable, and it is hoped to be a starting point for more complex models. Its main objective is to determine how carbon emissions can fit into basic queueing theory. It turns out that introducing emission variables modifies the traditional single-stage queue model into a calculation model for the production lot quantity allowed per period.
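The underlying M/M/1 relationships can be written down directly, with an emission term attached per job processed and per job held in the system as one possible way of adding the emission variable; the emission factors below are illustrative assumptions, not those of the paper:

    def mm1_with_emissions(arrival_rate, service_rate,
                           emission_per_job=0.8, emission_per_job_hour_in_system=0.1):
        """Standard M/M/1 metrics plus an illustrative emission estimate per hour."""
        if arrival_rate >= service_rate:
            raise ValueError("queue is unstable: arrival rate must be below service rate")
        rho = arrival_rate / service_rate                 # utilisation
        l_system = rho / (1.0 - rho)                      # average number in system
        w_system = 1.0 / (service_rate - arrival_rate)    # average time in system (hours)
        emissions_per_hour = (arrival_rate * emission_per_job
                              + l_system * emission_per_job_hour_in_system)
        return {"utilisation": rho, "L": l_system, "W": w_system,
                "emissions_per_hour": emissions_per_hour}

    print(mm1_with_emissions(arrival_rate=8.0, service_rate=10.0))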
Evaluating Predictive Models of Software Quality
NASA Astrophysics Data System (ADS)
Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.
2014-06-01
Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposing aims is to use a software quality model to approximate the risk before releasing a program, so that only software with a risk lower than an agreed threshold is delivered. In this article we evaluated two predictive quality models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parini, Mauro; Acuna, Jorge A.; Laudiano, Michele
1996-01-24
The first 55 MW power plant at Miravalles started operation in March 1994. During the first few months of production, a gradual increase in chloride content was observed in some production wells. The cause was assumed to be a rapid return of injectate from two injection wells located fairly near the main production area. A tracer test was performed and showed a relatively rapid breakthrough, confirming this assumption. Numerical modeling was then carried out to try to reproduce the observed behavior. The reservoir was modelled with an idealized three-dimensional network of fractures embedded in a low-permeability matrix. The "two waters" feature of the TOUGH2 simulator was used. The numerical simulation showed good agreement with observations. A "porous medium" model with equivalent hydraulic characteristics was unable to reproduce the observations. The fractured model, when applied to investigate the expected mid- and long-term behavior, indicated a reservoir cooling risk associated with the present injection scheme. Work is currently underway to modify this scheme.
Du, Wei; Jongbloets, Joeri A; van Boxtel, Coco; Pineda Hernández, Hugo; Lips, David; Oliver, Brett G; Hellingwerf, Klaas J; Branco Dos Santos, Filipe
2018-01-01
Microbial bioengineering has the potential to become a key contributor to the future development of human society by providing sustainable, novel, and cost-effective production pipelines. However, the sustained productivity of genetically engineered strains is often a challenge, as spontaneous non-producing mutants tend to grow faster and take over the population. Novel strategies to prevent this issue of strain instability are urgently needed. In this study, we propose a novel strategy applicable to all microbial production systems for which a genome-scale metabolic model is available that aligns the production of native metabolites to the formation of biomass. Based on well-established constraint-based analysis techniques such as OptKnock and FVA, we developed an in silico pipeline-FRUITS-that specifically 'Finds Reactions Usable in Tapping Side-products'. It analyses a metabolic network to identify compounds produced in anabolism that are suitable to be coupled to growth by deletion of their re-utilization pathway(s), and computes their respective biomass and product formation rates. When applied to Synechocystis sp. PCC6803, a model cyanobacterium explored for sustainable bioproduction, a total of nine target metabolites were identified. We tested our approach for one of these compounds, acetate, which is used in a wide range of industrial applications. The model-guided engineered strain shows an obligatory coupling between acetate production and photoautotrophic growth as predicted. Furthermore, the stability of acetate productivity in this strain was confirmed by performing prolonged turbidostat cultivations. This work demonstrates a novel approach to stabilize the production of target compounds in cyanobacteria that culminated in the first report of a photoautotrophic growth-coupled cell factory. The method developed is generic and can easily be extended to any other modeled microbial production system.
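The coupling test underlying this kind of pipeline can be illustrated with a toy flux-balance model: maximise growth subject to steady-state mass balance, then check whether the side-product flux is forced to be nonzero at maximal growth. The sketch below uses SciPy's linear programming on an invented three-reaction network; it is a generic illustration, not the FRUITS code or the Synechocystis genome-scale model:

    import numpy as np
    from scipy.optimize import linprog

    # Toy stoichiometric matrix S (metabolites x reactions); steady state requires S @ v = 0.
    # Reactions: r0 uptake, r1 biomass, r2 secretion of an anabolic by-product (e.g. acetate).
    S = np.array([
        [1.0, -1.0, -0.5],   # internal metabolite consumed by biomass and by secretion
        [0.0,  0.5, -1.0],   # by-product made alongside biomass, removed only by secretion
    ])
    bounds = [(0.0, 10.0), (0.0, None), (0.0, None)]

    # Maximise biomass flux r1 (linprog minimises, hence the -1).
    growth = linprog(c=[0.0, -1.0, 0.0], A_eq=S, b_eq=np.zeros(2), bounds=bounds)

    # Fix biomass at its optimum and find the minimum possible secretion flux (FVA-style check).
    fixed = np.vstack([S, [0.0, 1.0, 0.0]])
    rhs = np.append(np.zeros(2), -growth.fun)
    min_secretion = linprog(c=[0.0, 0.0, 1.0], A_eq=fixed, b_eq=rhs, bounds=bounds)
    print("max growth:", round(-growth.fun, 3),
          "minimum by-product flux at max growth:", round(min_secretion.x[2], 3))

A strictly positive minimum secretion flux at maximal growth is the signature of growth-coupled production that such a pipeline searches for.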
Leopold, Christine; Mantel-Teeuwisse, Aukje Katja; Seyfang, Leonhard; Vogler, Sabine; de Joncheere, Kees; Laing, Richard Ogilvie; Leufkens, Hubert
2012-01-01
Objectives: This study aims to examine the impact of external price referencing (EPR) on on-patent medicine prices, adjusting for other factors that may affect price levels such as sales volume, exchange rates, gross domestic product (GDP) per capita, total pharmaceutical expenditure (TPE), and size of the pharmaceutical industry. Methods: Price data of 14 on-patent products, in 14 European countries in 2007 and 2008 were obtained from the Pharmaceutical Price Information Service of the Austrian Health Institute. Based on the unit ex-factory prices in EURO, scaled ranks per country and per product were calculated. For the regression analysis the scaled ranks per country and product were weighted; each country had the same sum of weights but within a country the weights were proportional to its sales volume in the year (data obtained from IMS Health). Taking the scaled ranks, several statistical analyses were performed by using the program “R”, including a multiple regression analysis (including variables such as GDP per capita and national industry size). Results: This study showed that on average EPR as a pricing policy leads to lower prices. However, the large variation in price levels among countries using EPR confirmed that the price level is not only driven by EPR. The unadjusted linear regression model confirms that applying EPR in a country is associated with a lower scaled weighted rank (p=0.002). This interaction persisted after inclusion of total pharmaceutical expenditure per capita and GDP per capita in the final model. Conclusions: The study showed that for patented products, prices are in general lower in case the country applied EPR. Nevertheless substantial price differences among countries that apply EPR could be identified. Possible explanations could be found through a correlation between pharmaceutical industry and the scaled price ranks. In conclusion, we found that implementing external reference pricing could lead to lower prices. PMID:23532710
Petersen, Mark D.; Frankel, Arthur D.; Harmsen, Stephen C.; Mueller, Charles S.; Boyd, Oliver S.; Luco, Nicolas; Wheeler, Russell L.; Rukstales, Kenneth S.; Haller, Kathleen M.
2012-01-01
In this paper, we describe the scientific basis for the source and ground-motion models applied in the 2008 National Seismic Hazard Maps, the development of new products that are used for building design and risk analyses, relationships between the hazard maps and design maps used in building codes, and potential future improvements to the hazard maps.
NASA Astrophysics Data System (ADS)
Minaya, Veronica; Corzo, Gerald; van der Kwast, Johannes; Galarraga, Remigio; Mynett, Arthur
2014-05-01
Simulations of carbon cycling are prone to uncertainties from different sources, which in general are related to the input data, the parameters and the representational capacity of the model itself. The gross carbon uptake in the cycle is represented by the gross primary production (GPP), which reflects the spatio-temporal variability of precipitation and soil moisture dynamics. This variability, together with parameter uncertainty, can be modelled with multivariate probability distributions. Our study presents a novel methodology that uses multivariate copula analysis to assess GPP. Multi-species and elevation variables are included in a first scenario of the analysis. Hydro-meteorological conditions that might generate a change in the next 50 or more years are included in a second scenario. The biogeochemical model BIOME-BGC was applied in the Ecuadorian Andean region at elevations greater than 4000 masl with the presence of typical páramo vegetation. The change of GPP over time is crucial for climate scenarios of carbon cycling in this type of ecosystem. The results help to improve our understanding of ecosystem function and clarify the dynamics and the relationship with changing climate variables. Keywords: multivariate analysis, Copula, BIOME-BGC, NPP, páramos
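A copula separates the marginal distributions of the variables (e.g., precipitation and GPP) from their dependence structure, which is what allows the scenarios described above to recombine them. A minimal Gaussian-copula sampling sketch with SciPy; the marginals and the correlation are illustrative placeholders, not the fitted páramo values:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(seed=7)

    # Assumed dependence between seasonal precipitation and GPP, as a correlation matrix.
    corr = np.array([[1.0, 0.6],
                     [0.6, 1.0]])

    # 1) Sample correlated standard normals and map them to uniforms (the copula step).
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=5_000)
    u = stats.norm.cdf(z)

    # 2) Apply illustrative marginals: gamma-distributed precipitation, normal GPP.
    precipitation = stats.gamma.ppf(u[:, 0], a=2.0, scale=400.0)   # mm per season
    gpp = stats.norm.ppf(u[:, 1], loc=900.0, scale=150.0)          # g C m-2 yr-1

    # The dependence is preserved even though the marginals differ.
    rho, _ = stats.spearmanr(precipitation, gpp)
    print("rank correlation:", round(float(rho), 2))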
Tobacco industry responsibility for butts: a Model Tobacco Waste Act.
Curtis, Clifton; Novotny, Thomas E; Lee, Kelley; Freiberg, Mike; McLaughlin, Ian
2017-01-01
Cigarette butts and other postconsumer products from tobacco use are the most common waste elements picked up worldwide each year during environmental cleanups. Under the environmental principle of Extended Producer Responsibility, tobacco product manufacturers may be held responsible for collection, transport, processing and safe disposal of tobacco product waste (TPW). Legislation has been applied to other toxic and hazardous postconsumer waste products such as paints, pesticide containers and unused pharmaceuticals, to reduce, prevent and mitigate their environmental impacts. Additional product stewardship (PS) requirements may be necessary for other stakeholders and beneficiaries of tobacco product sales and use, especially suppliers, retailers and consumers, in order to ensure effective TPW reduction. This report describes how a Model Tobacco Waste Act may be adopted by national and subnational jurisdictions to address the environmental impacts of TPW. Such a law will also reduce tobacco use and its health consequences by raising attention to the environmental hazards of TPW, increasing the price of tobacco products, and reducing the number of tobacco product retailers. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Establishment of a center of excellence for applied mathematical and statistical research
NASA Technical Reports Server (NTRS)
Woodward, W. A.; Gray, H. L.
1983-01-01
The state of the art was assessed with regard to efforts in support of the crop production estimation problem, and alternative generic proportion estimation techniques were investigated. Topics covered include modeling the greenness profile (Badhwar's model), parameter estimation using mixture models such as CLASSY, and minimum distance estimation as an alternative to maximum likelihood estimation. Approaches to the problem of obtaining proportion estimates when the underlying distributions are asymmetric are examined, including the properties of the Weibull distribution.
NASA Astrophysics Data System (ADS)
González, M. R.; Torres, F.; Yoldi, V.; Arcega, F.; Plaza, I.
2012-04-01
An integrated management model for an organization is proposed. This model is based on the continuous-improvement Plan-Do-Check-Act cycle and is intended to integrate the environmental, risk-prevention and ethical aspects, as well as the management of research, development and innovation projects, into the general quality management structure proposed by ISO 9001:2008. It aims to fulfill the standards ISO 9001, ISO 14001, OHSAS 18001, SGE 21 and UNE 166002.
On DSS Implementation in the Dynamic Model of the Digital Oil field
NASA Astrophysics Data System (ADS)
Korovin, Iakov S.; Khisamutdinov, Maksim V.; Kalyaev, Anatoly I.
2018-02-01
Decision support systems (DSS), especially those based on artificial intelligence (AI) techniques, are now widely applied in different domains. In this paper we describe an approach to implementing a DSS within the Digital Oil Field (DOF) dynamic model structure in order to reduce the influence of the human factor, with the automation of all production processes considered the key element of the DOF model. As the basic data-handling tool we propose a hybrid application of artificial neural networks and evolutionary algorithms.
Continuum-level modelling of cellular adhesion and matrix production in aggregates.
Geris, Liesbet; Ashbourn, Joanna M A; Clarke, Tim
2011-05-01
Key regulators in tissue-engineering processes such as cell culture and cellular organisation are the cell-cell and cell-matrix interactions. As mathematical models are increasingly applied to investigate biological phenomena in the biomedical field, it is important, for some applications, that these models incorporate an adequate description of cell adhesion. This study describes the development of a continuum model that represents a cell-in-gel culture system used in bone-tissue engineering, namely that of a cell aggregate embedded in a hydrogel. Cell adhesion is modelled through the use of non-local (integral) terms in the partial differential equations. The simulation results demonstrate that the effects of cell-cell and cell-matrix adhesion are particularly important for the survival and growth of the cell population and the production of extracellular matrix by the cells, concurring with experimental observations in the literature.
Logan, Jennifer A; Beatty, Maile; Woliver, Renee; Rubinstein, Eric P; Averbach, Abigail R
2005-12-01
Over time, improvements in HIV/AIDS surveillance and service utilization data have increased their usefulness for planning programs, targeting resources, and otherwise informing HIV/AIDS policy. However, community planning groups, service providers, and health department staff often have difficulty in interpreting and applying the wide array of data now available. We describe the development of the Bridging Model, a technical assistance model for overcoming barriers to the use of data for program planning. Through the use of an iterative feedback loop in the model, HIV/AIDS data products constantly are evolving to better inform the decision-making tasks of their multiple users. Implementation of this model has led to improved data quality and data products and to a greater willingness and ability among stakeholders to use the data for planning purposes.
Modelling of influential parameters on a continuous evaporation process by Doehlert shells
Porte, Catherine; Havet, Jean-Louis; Daguet, David
2003-01-01
The modelling of the parameters that influence the continuous evaporation of an alcoholic extract was considered using Doehlert matrices. The work was performed with a wiped falling film evaporator that allowed us to study the influence of the pressure, temperature, feed flow and dry matter of the feed solution on the dry matter contents of the resulting concentrate, and the productivity of the process. The Doehlert shells were used to model the influential parameters. The pattern obtained from the experimental results was checked allowing for some dysfunction in the unit. The evaporator was modified and a new model applied; the experimental results were then in agreement with the equations. The model was finally determined and successfully checked in order to obtain an 8% dry matter concentrate with the best productivity; the results fit in with the industrial constraints of subsequent processes. PMID:18924887
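For readers unfamiliar with Doehlert designs, the sketch below is a purely illustrative Python example (not the authors' code): it lists the standard two-factor Doehlert design in coded units and fits a full quadratic response-surface model to invented responses. The study itself varied four factors, so this is only a scaled-down illustration of the calculation.

```python
import numpy as np

# Standard two-factor Doehlert design in coded units:
# a centre point plus six points on a hexagon of unit radius.
doehlert_2f = np.array([
    [ 0.0,  0.0],
    [ 1.0,  0.0],
    [ 0.5,  0.866],
    [-0.5,  0.866],
    [-1.0,  0.0],
    [-0.5, -0.866],
    [ 0.5, -0.866],
])

# Hypothetical responses (e.g. dry-matter content of the concentrate)
# measured at each design point -- illustrative numbers only.
y = np.array([8.1, 7.2, 8.6, 8.9, 7.8, 6.9, 6.5])

# Full quadratic model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = doehlert_2f[:, 0], doehlert_2f[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted quadratic coefficients:", np.round(coeffs, 3))
```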
In vitro experimental investigation of voice production
Horáček, Jaromír; Brücker, Christoph; Becker, Stefan
2012-01-01
The process of human phonation involves a complex interaction between the physical domains of structural dynamics, fluid flow, and acoustic sound production and radiation. Given the high degree of nonlinearity of these processes, even small anatomical or physiological disturbances can significantly affect the voice signal. In the worst cases, patients can lose their voice and hence the normal mode of speech communication. To improve medical therapies and surgical techniques it is very important to understand better the physics of the human phonation process. Due to the limited experimental access to the human larynx, alternative strategies, including artificial vocal folds, have been developed. The following review gives an overview of experimental investigations of artificial vocal folds within the last 30 years. The models are sorted into three groups: static models, externally driven models, and self-oscillating models. The focus is on the different models of the human vocal folds and on the ways in which they have been applied. PMID:23181007
Detecting Inconsistencies in Multi-View Models with Variability
NASA Astrophysics Data System (ADS)
Lopez-Herrejon, Roberto Erick; Egyed, Alexander
Multi-View Modeling (MVM) is a common modeling practice that advocates the use of multiple, different and yet related models to represent the needs of diverse stakeholders. Of crucial importance in MVM is consistency checking - the description and verification of semantic relationships amongst the views. Variability is the capacity of software artifacts to vary, and its effective management is a core tenet of the research in Software Product Lines (SPL). MVM has proven useful for developing one-of-a-kind systems; however, to reap the potential benefits of MVM in SPL it is vital to provide consistency checking mechanisms that cope with variability. In this paper we describe how to address this need by applying Safe Composition - the guarantee that all programs of a product line are type safe. We evaluate our approach with a case study.
Investigating Galactic Structure with COBE/DIRBE and Simulation
NASA Technical Reports Server (NTRS)
Cohen, Martin
1999-01-01
In this work I applied the current version of the SKY model of the point-source sky to the interpretation of the diffuse all-sky emission observed by COBE/DIRBE (Cosmic Background Explorer satellite/Diffuse Infrared Background Experiment). The goal was to refine the SKY model using the all-sky DIRBE maps of the Galaxy, so that a search could be made for an isotropic cosmic background. A "Faint Source Model" (FSM) was constructed to remove Galactic foreground stars from the ZSMA products. The FSM mimics SKY version 1, but it was inadequate for seeking cosmic background emission because of the sizeable residual emission remaining in the ZSMA products after this starlight subtraction. At this point I can only conclude that such models are currently inadequate to reveal a cosmic background. Even SKY5 yields the same disappointing result.
NASA Astrophysics Data System (ADS)
Bascetin, A.
2007-04-01
The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives in order to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located in the Seyitomer region of Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies optimal specifications. It is also found that the decision process is systematic and that using the proposed model can reduce the time taken to select an optimal method.
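As a concrete illustration of the AHP calculation behind such a model (not taken from the paper), the Python sketch below derives priority weights for three hypothetical reclamation criteria from an invented pairwise-comparison matrix via the principal eigenvector and checks Saaty's consistency ratio.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three reclamation criteria
# (entries on Saaty's 1-9 scale; reciprocal by construction).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lam_max = eigvals[k].real
ci = (lam_max - n) / (n - 1)
ri = 0.58            # Saaty's random index for n = 3
cr = ci / ri
print("priorities:", np.round(w, 3), " CR =", round(cr, 3))  # CR < 0.1 => acceptable
```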
NASA Astrophysics Data System (ADS)
Hao, Y.; Settgast, R. R.; Fu, P.; Tompson, A. F. B.; Morris, J.; Ryerson, F. J.
2016-12-01
It has long been recognized that multiphase flow and transport in fractured porous media is very important for various subsurface applications. Hydrocarbon fluid flow and production from hydraulically fractured shale reservoirs is an important and complicated example of multiphase flow in fractured formations. The combination of horizontal drilling and hydraulic fracturing is able to create extensive fracture networks in low permeability shale rocks, leading to increased formation permeability and enhanced hydrocarbon production. However, unconventional wells experience a much faster production decline than conventional hydrocarbon recovery. Maintaining sustainable and economically viable shale gas/oil production requires additional wells and re-fracturing. Excessive fracturing fluid loss during hydraulic fracturing operations may also drive up operation costs and raise potential environmental concerns. Understanding and modeling processes that contribute to decreasing productivity and fracturing fluid loss represent a critical component for unconventional hydrocarbon recovery analysis. Towards this effort we develop a discrete fracture model (DFM) in GEOS (LLNL multi-physics computational code) to simulate multiphase flow and transfer in hydraulically fractured reservoirs. The DFM model is able to explicitly account for both individual fractures and their surrounding rocks, therefore allowing for an accurate prediction of impacts of fracture-matrix interactions on hydrocarbon production. We apply the DFM model to simulate three-phase (water, oil, and gas) flow behaviors in fractured shale rocks as a result of different hydraulic stimulation scenarios. Numerical results show that multiphase flow behaviors at the fracture-matrix interface play a major role in controlling both hydrocarbon production and fracturing fluid recovery rates. The DFM model developed in this study will be coupled with the existing hydro-fracture model to provide a fully integrated geomechanical and reservoir simulation capability for an accurate prediction and assessment of hydrocarbon production and hydraulic fracturing performance. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Tropical geometry of statistical models.
Pachter, Lior; Sturmfels, Bernd
2004-11-16
This article presents a unified mathematical framework for inference in graphical models, building on the observation that graphical models are algebraic varieties. From this geometric viewpoint, observations generated from a model are coordinates of a point in the variety, and the sum-product algorithm is an efficient tool for evaluating specific coordinates. Here, we address the question of how the solutions to various inference problems depend on the model parameters. The proposed answer is expressed in terms of tropical algebraic geometry. The Newton polytope of a statistical model plays a key role. Our results are applied to the hidden Markov model and the general Markov model on a binary tree.
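To make the tropical viewpoint concrete, the toy Python sketch below (not from the paper; all parameters invented) evaluates a two-state hidden Markov model in ordinary arithmetic, where the sum-product (forward) recursion yields the observation probability, and in the max-plus (tropical) semiring over log-parameters, where the same recursion becomes the Viterbi computation of the best-path log-probability.

```python
import numpy as np

# Toy 2-state HMM: initial, transition, and emission probabilities (invented).
pi = np.array([0.6, 0.4])
T = np.array([[0.7, 0.3],
              [0.2, 0.8]])
E = np.array([[0.9, 0.1],    # emission probs for symbols {0, 1} in state 0
              [0.3, 0.7]])   # ... and in state 1
obs = [0, 1, 1, 0]

# Ordinary (sum-product) forward recursion: probability of the observations.
alpha = pi * E[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ T) * E[:, o]
print("P(observations) =", alpha.sum())

# Tropical (max-plus) version over log-parameters: best-path log-probability.
# Sums become maxima, products become additions of logs (the Viterbi recursion).
log_pi, log_T, log_E = np.log(pi), np.log(T), np.log(E)
delta = log_pi + log_E[:, obs[0]]
for o in obs[1:]:
    delta = np.max(delta[:, None] + log_T, axis=0) + log_E[:, o]
print("max-path log-probability =", delta.max())
```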
Optimum cooking conditions for shrimp and Atlantic salmon.
Brookmire, Lauren; Mallikarjunan, P; Jahncke, M; Grisso, R
2013-02-01
The quality and safety of a cooked food product depends on many variables, including the cooking method and time-temperature combinations employed. The overall heating profile of the food can be useful in predicting the quality changes and microbial inactivation occurring during cooking. Mathematical modeling can be used to attain the complex heating profile of a food product during cooking. Studies were performed to monitor the product heating profile during the baking and boiling of shrimp and the baking and pan-frying of salmon. Product color, texture, moisture content, mass loss, and pressed juice were evaluated during the cooking processes as the products reached the internal temperature recommended by the FDA. Studies were also performed on the inactivation of Salmonella cocktails in shrimp and salmon. To effectively predict inactivation during cooking, the Bigelow, Fermi distribution, and Weibull distribution models were applied to the Salmonella thermal inactivation data. Minimum cooking temperatures necessary to destroy Salmonella in shrimp and salmon were determined. The heating profiles of the 2 products were modeled using the finite difference method. Temperature data directly from the modeled heating profiles were then used in the kinetic modeling of quality change and Salmonella inactivation during cooking. The optimum cooking times for a 3-log reduction of Salmonella and maintaining 95% of quality attributes are 100, 233, 159, 378, 1132, and 399 s for boiling extra jumbo shrimp, baking extra jumbo shrimp, boiling colossal shrimp, baking colossal shrimp, baking Atlantic salmon, and pan frying Atlantic Salmon, respectively. © 2013 Institute of Food Technologists®
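As an illustration of the kind of inactivation kinetics involved (a sketch under assumed, not fitted, parameter values, and for isothermal holding rather than the study's modeled heating profiles), the Python snippet below uses a Weibull-type survival model to compute the holding time needed for a 3-log reduction.

```python
import numpy as np

# Weibull-type (Mafart) survival model: log10(N/N0) = -(t / delta)**p,
# where delta is the time for the first decimal reduction and p the shape.
delta = 45.0   # seconds -- assumed illustrative value, not a fitted parameter
p = 0.8        # shape parameter -- assumed illustrative value

def log_reduction(t):
    """Decimal (log10) reduction achieved after t seconds of isothermal holding."""
    return (t / delta) ** p

# Time required for a 3-log reduction: invert the model analytically.
t_3log = delta * 3.0 ** (1.0 / p)
print("time for 3-log reduction: %.0f s" % t_3log)
print("check: log reduction at that time =", round(log_reduction(t_3log), 2))
```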
NASA Astrophysics Data System (ADS)
Tarnavsky, E.
2016-12-01
The water resources satisfaction index (WRSI) model is widely used in drought early warning and food security analyses, as well as in agro-meteorological risk management through weather index-based insurance. Key driving data for the model is provided from satellite-based rainfall estimates such as ARC2 and TAMSAT over Africa and CHIRPS globally. We evaluate the performance of these rainfall datasets for detecting onset and cessation of rainfall and estimating crop production conditions for the WRSI model. We also examine the sensitivity of the WRSI model to different satellite-based rainfall products over maize growing regions in Tanzania. Our study considers planting scenarios for short-, medium-, and long-growing cycle maize, and we apply these for 'regular' and drought-resistant maize, as well as with two different methods for defining the start of season (SOS). Simulated maize production estimates are compared against available reported production figures at the national and sub-national (province) levels. Strengths and weaknesses of the driving rainfall data, insights into the role of the SOS definition method, and phenology-based crop yield coefficient and crop yield reduction functions are discussed in the context of space-time drought characteristics. We propose a way forward for selecting skilled rainfall datasets and discuss their implication for crop production monitoring and the design and structure of weather index-based insurance products as risk transfer mechanisms implemented across scales for smallholder farmers to national programmes.
Mbuthia, Jackson M; Rewe, Thomas O; Kahi, Alexander K
2015-02-01
A deterministic bio-economic model was developed and applied to evaluate biological and economic variables that characterize smallholder pig production systems in Kenya. Two pig production systems were considered namely, semi-intensive (SI) and extensive (EX). The input variables were categorized into biological variables including production and functional traits, nutritional variables, management variables and economic variables. The model factored the various sow physiological systems including gestation, farrowing, lactation, growth and development. The model was developed to evaluate a farrow to finish operation, but the results were customized to account for a farrow to weaner operation for a comparative analysis. The operations were defined as semi-intensive farrow to finish (SIFF), semi-intensive farrow to weaner (SIFW), extensive farrow to finish (EXFF) and extensive farrow to weaner (EXFW). In SI, the profits were the highest at KES. 74,268.20 per sow per year for SIFF against KES. 4026.12 for SIFW. The corresponding profits for EX were KES. 925.25 and KES. 626.73. Feed costs contributed the major part of the total costs accounting for 67.0, 50.7, 60.5 and 44.5 % in the SIFF, SIFW, EXFF and EXFW operations, respectively. The bio-economic model developed could be extended with modifications for use in deriving economic values for breeding goal traits for pigs under smallholder production systems in other parts of the tropics.
Rodrigues Neves, Charlotte; Gibbs, Susan
2018-06-23
Contact with the skin is inevitable or desirable for daily life products such as cosmetics, hair dyes, perfumes, drugs, household products, and industrial and agricultural products. Whereas the majority of these products are harmless, a number can become metabolized and/or activate the immunological defense via innate and adaptive mechanisms resulting in sensitization and allergic contact dermatitis upon following exposures to the same substance. Therefore, strict safety (hazard) assessment of actives and ingredients in products and drugs applied to the skin is essential to determine I) whether the chemical is a potential sensitizer and if so II) what is the safe concentration for human exposure to prevent sensitization from occurring. Ex vivo skin is a valuable model for skin penetration studies but due to logistical and viability limitations the development of in vitro alternatives is required. The aim of this review is to give a clear overview of the organotypic in vitro skin models (reconstructed human epidermis, reconstructed human skin, immune competent skin models incorporating Langerhans Cells and T-cells, skin-on-chip) that are currently commercially available or which are being used in a laboratory research setting for hazard assessment of potential sensitizers and for investigating the mechanisms (sensitization key events 1-4) related to allergic contact dermatitis. The limitations of the models, their current applications, and their future potential in replacing animals in allergy-related science are discussed.
Khan, Mohammad Jakir Hossain; Hussain, Mohd Azlan; Mujtaba, Iqbal Mohammed
2014-01-01
Polypropylene is a type of plastic that is widely used in everyday life. This study focuses on the identification and justification of the optimum process parameters for polypropylene production in a novel pilot-plant-based fluidized bed reactor. This first-of-its-kind statistical modeling with experimental validation for the process parameters of polypropylene production was conducted by applying the ANOVA (analysis of variance) method within Response Surface Methodology (RSM). Three important process variables, i.e., reaction temperature, system pressure and hydrogen percentage, were considered as the important input factors for polypropylene production in the analysis performed. In order to examine the effect of the process parameters and their interactions, the ANOVA method was utilized along with a range of other statistical diagnostic tools, such as the correlation between actual and predicted values, the residuals and predicted response, the outlier t plot, and 3D response surface and contour analysis plots. The statistical analysis showed that the proposed quadratic model fitted the experimental results well. At optimum conditions, with a temperature of 75°C, a system pressure of 25 bar and a hydrogen percentage of 2%, the highest polypropylene production obtained is 5.82% per pass. Hence it is concluded that the developed experimental design and proposed model can be successfully employed with over a 95% confidence level for optimum polypropylene production in a fluidized bed catalytic reactor (FBCR). PMID:28788576
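The statistical core of such an RSM analysis is a full quadratic model in the coded factors; the Python sketch below (illustrative only, with invented responses rather than the pilot-plant data) builds and fits that model for three factors by ordinary least squares.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Coded levels (-1, 0, +1) for temperature, pressure and hydrogen fraction,
# and invented yield responses standing in for the pilot-plant data.
X = rng.choice([-1.0, 0.0, 1.0], size=(20, 3))
y = 5.0 + 0.4 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2] \
    - 0.3 * X[:, 0] ** 2 + 0.1 * X[:, 0] * X[:, 1] + rng.normal(0, 0.05, 20)

# Build the full quadratic design matrix: intercept, linear, squared, interactions.
cols = [np.ones(len(X))] + [X[:, i] for i in range(3)] \
     + [X[:, i] ** 2 for i in range(3)] \
     + [X[:, i] * X[:, j] for i, j in combinations(range(3), 2)]
D = np.column_stack(cols)

beta, *_ = np.linalg.lstsq(D, y, rcond=None)
print("fitted quadratic coefficients:", np.round(beta, 3))
```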
Sea Ice Biogeochemistry: A Guide for Modellers
Tedesco, Letizia; Vichi, Marcello
2014-01-01
Sea ice is a fundamental component of the climate system and plays a key role in polar trophic food webs. Nonetheless, sea ice biogeochemical dynamics at large temporal and spatial scales are still rarely described. Numerical models may potentially contribute by integrating among sparse observations, but available models of sea ice biogeochemistry are still scarce, even though their relevance for properly describing the current and future state of the polar oceans has recently been recognized. A general methodology to develop a sea ice biogeochemical model is presented, deriving it from an existing validated model application by extension of generic pelagic biogeochemistry model parameterizations. The described methodology is flexible and considers different levels of ecosystem complexity and vertical representation, while adopting a coupling strategy that ensures mass conservation. We show how to apply this methodology step by step, building an intermediate-complexity model from a published realistic application and applying it to analyze theoretically a typical season of first-year sea ice in the Arctic, the system currently in most urgent need of understanding. The aims are to (1) introduce sea ice biogeochemistry and address its relevance to ocean modelers of polar regions, supporting them in adding a new sea ice component to their modelling framework for a more adequate representation of the sea ice-covered ocean ecosystem as a whole, and (2) extend our knowledge of the relevant controlling factors of sea ice algal production, showing that beyond light and nutrient availability, the duration of the sea ice season may play a key role in shaping algal production during the ongoing and upcoming projected changes. PMID:24586604
Measuring Involvement with Social Issues.
ERIC Educational Resources Information Center
Nowak, Glen J.; Salmon, Charles T.
A study applied research concepts from consumer product involvement to test a model for research on involvement with social issues. Issue involvement was defined as the state or level of perceived importance and/or interest evoked by a stimulus (issue) within a specific situation. Attitudes on four social issues--abortion, pornography, the…
Productive Failure in STEM Education
ERIC Educational Resources Information Center
Trueman, Rebecca J.
2014-01-01
Science education is criticized because it often fails to support problem-solving skills in students. Instead, the instructional methods primarily emphasize didactic models that fail to engage students and reveal how the material can be applied to solve real problems. To overcome these limitations, this study asked participants in a general…
ERIC Educational Resources Information Center
Tsai, Bor-sheng
1994-01-01
Describes the use of infometry, or informational geometry, to meet the challenges of information service businesses. Highlights include theoretical models for cognitive coordination and genetic programming; electronic information packaging; marketing electronic information products, including cost-benefit analyses; and recapitalization, including…
Code of Federal Regulations, 2013 CFR
2013-10-01
... configuration. (A) A manufacturer may group together subconfigurations that have the same test weight (ETW... section apply for vehicles complying in model year 2013. If some test groups are certified by EPA after... production that occurs after all test groups are certified in accordance with 40 CFR 1037.150 (a)(2). (vi...
Code of Federal Regulations, 2014 CFR
2014-10-01
... configuration. (A) A manufacturer may group together subconfigurations that have the same test weight (ETW... section apply for vehicles complying in model year 2013. If some test groups are certified by EPA after... production that occurs after all test groups are certified in accordance with 40 CFR 1037.150 (a)(2). (vi...
A probabilistic model framework for evaluating year-to-year variation in crop productivity
NASA Astrophysics Data System (ADS)
Yokozawa, M.; Iizumi, T.; Tao, F.
2008-12-01
Most models describing the relation between crop productivity and weather conditions have so far focused on mean changes in crop yield. For maintaining a stable food supply in the face of abnormal weather as well as climate change, evaluating the year-to-year variations in crop productivity, rather than the mean changes, is essential. We here propose a new probabilistic model framework based on Bayesian inference and Monte Carlo simulation. As an example, we first introduce a model of paddy rice production in Japan, called PRYSBI (Process-based Regional rice Yield Simulator with Bayesian Inference; Iizumi et al., 2008). The model structure is the same as that of SIMRIW, which was developed and is used widely in Japan. The model includes three sub-models describing phenological development, biomass accumulation and maturing of the rice crop. These processes are formulated to capture the response of the rice plant to weather conditions. The model was originally developed to predict rice growth and yield at the paddy-plot scale. We applied it to evaluate large-scale rice production while keeping the same model structure; instead, we treated the parameters as stochastic variables. To allow the model to reproduce actual yields at larger scales, the model parameters were determined from the agricultural statistical data of each prefecture of Japan together with weather data averaged over the region. The posterior probability distribution functions (PDFs) of the parameters included in the model were obtained using Bayesian inference, with a Markov chain Monte Carlo (MCMC) algorithm used to solve Bayes' theorem numerically. For evaluating the year-to-year changes in rice growth and yield under this framework, we first iterate simulations with sets of parameter values sampled from the estimated posterior PDF of each parameter and then take the ensemble mean weighted by the posterior PDFs. We will also present another example for maize productivity in China. The framework proposed here provides information on uncertainties, as well as on the possibilities and limitations of future improvements in crop models.
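The sketch below is a minimal, generic Python version of that workflow rather than the PRYSBI code: a random-walk Metropolis sampler draws a single model parameter from its posterior given observed yields, and the posterior ensemble of simulated yields is then summarized. The one-parameter toy "crop model", the data and the error settings are all invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "crop model": yield responds linearly to seasonal mean temperature
# through a single unknown sensitivity parameter theta (invented setup).
temps = np.array([24.5, 25.1, 23.8, 26.0, 24.9, 25.4])
obs_yield = np.array([5.1, 5.3, 4.8, 5.7, 5.2, 5.4])   # t/ha, invented
sigma = 0.15                                            # observation error (assumed)

def simulate(theta):
    return 2.0 + theta * (temps - 20.0)

def log_posterior(theta):
    # Gaussian likelihood plus a weak Gaussian prior on theta.
    resid = obs_yield - simulate(theta)
    return -0.5 * np.sum((resid / sigma) ** 2) - 0.5 * (theta / 10.0) ** 2

# Random-walk Metropolis sampler.
theta, lp = 0.5, log_posterior(0.5)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.05)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5000:])            # discard burn-in

# Posterior ensemble of simulated yields and its mean.
ensemble = np.array([simulate(t) for t in samples])
print("posterior mean of theta: %.3f" % samples.mean())
print("ensemble-mean yields:", np.round(ensemble.mean(axis=0), 2))
```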
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brink, Willem van den; Emerenciana, Annette
Increased incidence of C-cell carcinogenicity has been observed for glucagon-like-protein-1 receptor (GLP-1r) agonists in rodents. It is suggested that the duration of exposure is an indicator of carcinogenic potential in rodents for the different products on the market. Furthermore, the role of GLP-1-related mechanisms in the induction of C-cell carcinogenicity has gained increased attention from regulatory agencies. This study proposes an integrative pharmacokinetic/pharmacodynamic (PKPD) framework to identify explanatory factors and characterize differences in carcinogenic potential of the GLP-1r agonist products. PK models for four products (exenatide QW (once weekly), exenatide BID (twice daily), liraglutide and lixisenatide) were developed using nonlinear mixed effects modelling. Predicted exposure was subsequently linked to GLP-1r stimulation using in vitro GLP-1r potency data. A logistic regression model was then applied to exenatide QW and liraglutide data to assess the relationship between GLP-1r stimulation and thyroid C-cell hyperplasia incidence as a pre-neoplastic predictor of a carcinogenic response. The model showed a significant association between predicted GLP-1r stimulation and C-cell hyperplasia after 2 years of treatment. The predictive performance of the model was evaluated using lixisenatide, for which hyperplasia data were accurately described during the validation step. The use of a model-based approach provided insight into the relationship between C-cell hyperplasia and GLP-1r stimulation for all four products, which is not possible with traditional data analysis methods. It can be concluded that both pharmacokinetic (exposure) and pharmacodynamic (potency for GLP-1r) factors determine C-cell hyperplasia incidence in rodents. Our work highlights the pharmacological basis for GLP-1r agonist-induced C-cell carcinogenicity. The concept is promising for application to other drug classes. - Highlights: • An integrative PKPD model is applied to study GLP-1r agonist carcinogenicity. • C-cell carcinogenicity is impacted by both pharmacokinetics and pharmacodynamics. • The relation of GLP-1r stimulation and C-cell hyperplasia appears drug-independent. • Understanding carcinogenic risk needs a pharmacological basis.
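A minimal illustration of the final step of such a framework is a logistic regression of hyperplasia incidence on predicted GLP-1r stimulation; the Python sketch below is purely illustrative, with invented stimulation values and outcomes rather than the study's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Invented per-animal predicted GLP-1r stimulation (arbitrary units) and
# hyperplasia outcome after 2 years (1 = hyperplasia observed).
stim = rng.uniform(0.0, 1.0, 200)
p_true = 1.0 / (1.0 + np.exp(-(4.0 * stim - 2.0)))   # assumed "true" relation
hyperplasia = rng.binomial(1, p_true)

# Logistic regression linking stimulation to hyperplasia probability.
model = LogisticRegression()
model.fit(stim.reshape(-1, 1), hyperplasia)

grid = np.linspace(0, 1, 5).reshape(-1, 1)
print("predicted hyperplasia probability vs stimulation:")
for s, p in zip(grid.ravel(), model.predict_proba(grid)[:, 1]):
    print("  stimulation %.2f -> p = %.2f" % (s, p))
```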
NASA Astrophysics Data System (ADS)
Iwabuchi, Hironobu; Saito, Masanori; Tokoro, Yuka; Putri, Nurfiena Sagita; Sekiguchi, Miho
2016-12-01
Satellite remote sensing of the macroscopic, microphysical, and optical properties of clouds is useful for studying spatial and temporal variations of clouds at various scales and constraining cloud physical processes in climate and weather prediction models. Instead of using separate independent algorithms for different cloud properties, a unified, optimal estimation-based cloud retrieval algorithm is developed and applied to Moderate Resolution Imaging Spectroradiometer (MODIS) observations using ten thermal infrared bands. The model considers sensor configurations, background surface and atmospheric profile, and microphysical and optical models of ice and liquid cloud particles and radiative transfer in a plane-parallel, multilayered atmosphere. Measurement and model errors are thoroughly quantified from direct comparisons of clear-sky observations over the ocean with model calculations. Performance tests by retrieval simulations show that ice cloud properties are retrieved with high accuracy when cloud optical thickness (COT) is between 0.1 and 10. Cloud-top pressure is inferred with uncertainty lower than 10% when COT is larger than 0.3. Applying the method to a tropical cloud system and comparing the results with the MODIS Collection 6 cloud product shows good agreement for ice cloud optical thickness when COT is less than about 5. Cloud-top height agrees well with estimates obtained by the CO2 slicing method used in the MODIS product. The present algorithm can detect optically thin parts at the edges of high clouds well in comparison with the MODIS product, in which these parts are recognized as low clouds by the infrared window method. The cloud thermodynamic phase in the present algorithm is constrained by cloud-top temperature, which tends not to produce ice clouds that are too warm or liquid clouds that are too cold.
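For readers less familiar with optimal estimation, the Python sketch below shows the standard update used in such retrievals, x_hat = x_a + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - K x_a), applied to an invented linear toy forward model; none of the matrices correspond to the actual MODIS forward model.

```python
import numpy as np

# Toy linear forward model y = K x for a 2-element state (e.g. log COT and
# cloud-top pressure) observed in 3 "bands" -- all numbers invented.
K = np.array([[1.0, 0.2],
              [0.5, 0.8],
              [0.1, 1.1]])
x_true = np.array([1.2, 0.7])
S_e = np.diag([0.05, 0.05, 0.05]) ** 2       # measurement-error covariance
S_a = np.diag([1.0, 1.0]) ** 2               # prior (a priori) covariance
x_a = np.array([0.0, 0.0])                   # prior state

rng = np.random.default_rng(3)
y = K @ x_true + rng.multivariate_normal(np.zeros(3), S_e)

# Optimal-estimation update (for a linear model a single step is exact):
Se_inv, Sa_inv = np.linalg.inv(S_e), np.linalg.inv(S_a)
S_hat = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv)    # posterior covariance
x_hat = x_a + S_hat @ K.T @ Se_inv @ (y - K @ x_a)
print("retrieved state:", np.round(x_hat, 3))
print("1-sigma uncertainty:", np.round(np.sqrt(np.diag(S_hat)), 3))
```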
Bett, R C; Kosgey, I S; Bebe, B O; Kahi, A K
2007-10-01
A deterministic model was developed and applied to evaluate biological and economic variables that characterize smallholder production systems utilizing the Kenya Dual Purpose goat (KDPG) in Kenya. The systems were defined as: smallholder low-potential (SLP), smallholder medium-potential (SMP) and smallholder high-potential (SHP). The model was able to predict revenues and costs to the system. Revenues were from sale of milk, surplus yearlings and cull-forage animals, while costs included those incurred for feeds, husbandry, marketing and fixed asset (fixed costs). Of the total outputs, revenue from meat and milk accounted for about 55% and 45%, respectively, in SMP and 39% and 61% in SHP. Total costs comprised mainly variable costs (98%), with husbandry costs being the highest in both SMP and SLP. The total profit per doe per year was KSh 315.48 in SMP, KSh -1352.75 in SLP and KSh -80.22 in SHP. Results suggest that the utilization of the KDPG goat in Kenya is more profitable in the smallholder medium-potential production system. The implication for the application of the model to smallholder production systems in Kenya is discussed.
Bai, Hong; Kong, Wen-Wen; Shao, Chang-Lun; Li, Yun; Liu, Yun-Zhang; Liu, Min; Guan, Fei-Fei; Wang, Chang-Yun
2016-04-01
Marine organisms often protect themselves against their predators by a chemical defensive strategy. The secondary metabolites isolated from marine organisms and their symbiotic microbes have been proven to play a vital role in marine chemical ecology, for example in ichthyotoxicity, allelopathy, and antifouling. Microscale models for marine chemoecology assessment are urgently needed given the trace quantities of marine natural products available. The zebrafish model has been widely used as a microscale model in the fields of environmental and ecological evaluation and drug safety evaluation, but has seldom been reported for marine chemoecology assessment. In this work, a zebrafish embryo toxicity microscale model was established for the ichthyotoxicity evaluation of marine natural products, using a 24-well microplate format based on zebrafish embryos. Ichthyotoxicity was evaluated by observation of multiple toxicological endpoints, including egg coagulation, death, abnormal heartbeat, absence of spontaneous movement, delayed hatching, and malformation of the different organs during zebrafish embryogenesis at 24, 48, and 72 h post-fertilization (hpf). 3,4-Dichloroaniline was used as the positive control for method validation. Subsequently, the established model was applied to test the ichthyotoxic activity of the compounds isolated from corals and their symbiotic microbes and to isolate the bioactive secondary metabolites from the gorgonian Subergorgia mollis under bioassay guidance. The results suggest that the zebrafish embryo toxicity microscale model is suitable for bioassay-guided isolation and preliminary bioactivity screening of marine natural products.
NASA Astrophysics Data System (ADS)
Pickett, Derek Kyle
Due to an increased interest in sustainable energy, biodiesel has become much more widely used in the last several years. Glycerin, one major waste component in biodiesel production, can be converted into a hydrogen rich synthesis gas to be used in an engine generator to recover energy from the biodiesel production process. This thesis contains information detailing the production, testing, and analysis of a unique synthesis generator rig at the University of Kansas. Chapter 2 gives a complete background of all major components, as well as how they are operated. In addition to component descriptions, methods for operating the system on pure propane, reformed propane, reformed glycerin along with the methodology of data acquisition is described. This chapter will serve as a complete operating manual for future students to continue research on the project. Chapter 3 details the literature review that was completed to better understand fuel reforming of propane and glycerin. This chapter also describes the numerical model produced to estimate the species produced during reformation activities. The model was applied to propane reformation in a proof of concept and calibration test before moving to glycerin reformation and its subsequent combustion. Chapter 4 first describes the efforts to apply the numerical model to glycerin using the calibration tools from propane reformation. It then discusses catalytic material preparation and glycerin reformation tests. Gas chromatography analysis of the reformer effluent was completed to compare to theoretical values from the numerical model. Finally, combustion of reformed glycerin was completed for power generation. Tests were completed to compare emissions from syngas combustion and propane combustion.
Bamford, Adrian; Nation, Andy; Durrell, Susie; Andronis, Lazaros; Rule, Ellen; McLeod, Hugh
2017-02-03
The Keele stratified care model for management of low back pain comprises use of the prognostic STarT Back Screening Tool to allocate patients into one of three risk-defined categories leading to associated risk-specific treatment pathways, such that high-risk patients receive enhanced treatment and more sessions than medium- and low-risk patients. The Keele model is associated with economic benefits and is being widely implemented. The objective was to assess the use of the stratified model following its introduction in an acute hospital physiotherapy department setting in Gloucestershire, England. Physiotherapists recorded data on 201 patients treated using the Keele model in two audits in 2013 and 2014. To assess whether implementation of the stratified model was associated with the anticipated range of treatment sessions, regression analysis of the audit data was used to determine whether high- or medium-risk patients received significantly more treatment sessions than low-risk patients. The analysis controlled for patient characteristics, year, physiotherapists' seniority and physiotherapist. To assess the physiotherapists' views on the usefulness of the stratified model, audit data on this were analysed using framework methods. To assess the potential economic consequences of introducing the stratified care model in Gloucestershire, published economic evaluation findings on back-related National Health Service (NHS) costs, quality-adjusted life years (QALYs) and societal productivity losses were applied to audit data on the proportion of patients by risk classification and estimates of local incidence. When the Keele model was implemented, patients received significantly more treatment sessions as the risk-rating increased, in line with the anticipated impact of targeted treatment pathways. Physiotherapists were largely positive about using the model. The potential annual impact of rolling out the model across Gloucestershire is a gain in approximately 30 QALYs, a reduction in productivity losses valued at £1.4 million and almost no change to NHS costs. The Keele model was implemented and risk-specific treatment pathways successfully used for patients presenting with low back pain. Applying published economic evidence to the Gloucestershire locality suggests that substantial health and productivity outcomes would be associated with rollout of the Keele model while being cost-neutral for the NHS.
Multispectral processing without spectra.
Drew, Mark S; Finlayson, Graham D
2003-07-01
It is often the case that multiplications of whole spectra, component by component, must be carried out, for example when light reflects from or is transmitted through materials. This leads to particularly taxing calculations, especially in spectrally based ray tracing or radiosity in graphics, making a full-spectrum method prohibitively expensive. Nevertheless, using full spectra is attractive because of the many important phenomena that can be modeled only by using all the physics at hand. We apply to the task of spectral multiplication a method previously used in modeling RGB-based light propagation. We show that we can often multiply spectra without carrying out spectral multiplication. In previous work [J. Opt. Soc. Am. A 11, 1553 (1994)] we developed a method called spectral sharpening, which took camera RGBs to a special sharp basis that was designed to render illuminant change simple to model. Specifically, in the new basis, one can effectively model illuminant change by using a diagonal matrix rather than the 3 x 3 linear transform that results from a three-component finite-dimensional model [G. Healey and D. Slater, J. Opt. Soc. Am. A 11, 3003 (1994)]. We apply this idea of sharpening to the set of principal-component vectors derived from a representative set of spectra that might reasonably be encountered in a given application. With respect to the sharp spectral basis, we show that spectral multiplications can be modeled as the multiplication of the basis coefficients. These new product coefficients applied to the sharp basis serve to accurately reconstruct the spectral product. Although the method is quite general, we show how to use spectral modeling by taking advantage of metameric surfaces, ones that match under one light but not another, for tasks such as volume rendering. The use of metamers allows a user to pick out or merge different volume structures in real time simply by changing the lighting.
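A small numerical sketch of the mechanics (not the authors' code) is given below: synthetic spectra are expressed in a low-dimensional principal-component basis, and their product is approximated by componentwise multiplication of the basis coefficients followed by reconstruction. Note that the paper additionally sharpens the basis so that this approximation becomes accurate; the raw PCA basis used here only illustrates the bookkeeping, and all spectra are generated synthetically.

```python
import numpy as np

rng = np.random.default_rng(4)
wl = np.linspace(400, 700, 31)               # wavelengths, nm

# Synthetic set of smooth "representative" spectra (Gaussian bumps).
def bump(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

training = np.array([bump(c, w) for c in (450, 500, 550, 600, 650)
                                for w in (30, 60, 90)])

# Low-dimensional basis from the principal components of the training set.
_, _, Vt = np.linalg.svd(training - training.mean(axis=0), full_matrices=False)
B = Vt[:3].T                                  # 31 x 3 basis (columns)

# Two spectra to be multiplied (e.g. illuminant and reflectance), and their
# least-squares coefficients in the basis.
s1, s2 = bump(480, 50) + 0.2, bump(580, 70) + 0.1
c1, *_ = np.linalg.lstsq(B, s1, rcond=None)
c2, *_ = np.linalg.lstsq(B, s2, rcond=None)

# Approximate the spectral product by multiplying coefficients componentwise
# and reconstructing -- the step that a sharpened basis makes accurate.
approx = B @ (c1 * c2)
exact = s1 * s2
print("relative RMS error of the coefficient-product approximation: %.3f"
      % (np.linalg.norm(approx - exact) / np.linalg.norm(exact)))
```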
Enhancing e-waste estimates: improving data quality by multivariate Input-Output Analysis.
Wang, Feng; Huisman, Jaco; Stevels, Ab; Baldé, Cornelis Peter
2013-11-01
Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to the lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input-Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
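A minimal numerical sketch of the sales-lifespan part of such an Input-Output calculation is given below (illustrative only; the sales figures and Weibull lifespan parameters are invented): units put on the market in each year are spread over future discard years according to a lifespan profile, and yearly e-waste generation is the sum of the resulting discards.

```python
import numpy as np

# Invented yearly product sales (units put on market), years 2000-2012.
years = np.arange(2000, 2013)
sales = np.array([100, 120, 150, 170, 200, 230, 260, 280, 300, 310, 320, 330, 340.0])

# Weibull lifespan profile: probability that a unit sold in year t is
# discarded exactly a years later (illustrative shape/scale parameters).
shape, scale = 2.0, 7.0
ages = np.arange(0, 25)
cdf = 1.0 - np.exp(-(ages / scale) ** shape)
discard_prob = np.diff(np.concatenate([[0.0], cdf]))   # P(discard at age a)

# E-waste generated per calendar year = sum over sales cohorts of
# units sold times the probability of discard at the corresponding age.
horizon = np.arange(2000, 2031)
ewaste = np.zeros_like(horizon, dtype=float)
for y, s in zip(years, sales):
    for a, p in enumerate(discard_prob):
        idx = (y + a) - horizon[0]
        if idx < len(ewaste):
            ewaste[idx] += s * p

print(dict(zip(horizon[:15], np.round(ewaste[:15], 1))))
```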
Zhen, Chen; Brissette, Ian F.; Ruff, Ryan R.
2014-01-01
The obesity epidemic and excessive consumption of sugar-sweetened beverages have led to proposals of economics-based interventions to promote healthy eating in the United States. Targeted food and beverage taxes and subsidies are prominent examples of such potential intervention strategies. This paper examines the differential effects of taxing sugar-sweetened beverages by calories and by ounces on beverage demand. To properly measure the extent of substitution and complementarity between beverage products, we developed a fully modified distance metric model of differentiated product demand that endogenizes the cross-price effects. We illustrated the proposed methodology in a linear approximate almost ideal demand system, although other flexible demand systems can also be used. In the empirical application using supermarket scanner data, the product-level demand model consists of 178 beverage products with combined market share of over 90%. The novel demand model outperformed the conventional distance metric model in non-nested model comparison tests and in terms of the economic significance of model predictions. In the fully modified model, a calorie-based beverage tax was estimated to cost $1.40 less in compensating variation than an ounce-based tax per 3,500 beverage calories reduced. This difference in welfare cost estimates between two tax strategies is more than three times as much as the difference estimated by the conventional distance metric model. If applied to products purchased from all sources, a 0.04-cent per kcal tax on sugar-sweetened beverages is predicted to reduce annual per capita beverage intake by 5,800 kcal. PMID:25414517
Birdsall, Margaux
2011-01-01
This paper examines the definition of the terms "food" and "drug" as used in the Food, Drug and Cosmetic Act through the lens of biopharmed products. The paper uses the so-called "banana vaccine" as a case study to highlight the problems that occur when attempting to regulate a product that could be safely used as a food or as a drug. Specifically, the examination of this model illustrates the problems in the current definitional scheme. The paper considers how a product that straddles the definitional line between food and drug could be regulated and proposes a reformation to how the definitions are applied to products to better suit new technology in food and drugs.
Oguz, Temel; Macias, Diego; Tintore, Joaquin
2015-01-01
Buoyancy-induced unstable boundary currents and the accompanying retrograde density fronts are often the sites of pronounced mesoscale activity, ageostrophic frontal processes, and associated high biological production in marginal seas. Biophysical model simulations of the Catalano-Balearic Sea (Western Mediterranean) illustrated that the unstable and nonlinear southward frontal boundary current along the Spanish coast resulted in a strain-driven frontogenesis mechanism. High upwelling velocities of up to 80 m d-1 injected nutrients into the photic layer and promoted enhanced production on the less dense, onshore side of the front characterized by negative relative vorticity. Additional down-front wind stress and heat flux (cooling) intensified boundary current instabilities and thus ageostrophic cross-frontal circulation and augmented production. Specifically, entrainment of nutrients by relatively strong buoyancy-induced vertical mixing gave rise to a more widespread phytoplankton biomass distribution within the onshore side of the front. Mesoscale cyclonic eddies contributed to production through an eddy pumping mechanism, but it was less effective and more limited regionally than the frontal processes. The model was configured for the Catalano-Balearic Sea, but the mechanisms and model findings apply to other marginal seas with similar unstable frontal boundary current systems. PMID:26065688
Operational seasonal forecasting of crop performance.
Stone, Roger C; Meinke, Holger
2005-11-29
Integrated, interdisciplinary crop performance forecasting systems, linked with appropriate decision and discussion support tools, could substantially improve operational decision making in agricultural management. Recent developments in connecting numerical weather prediction models and general circulation models with quantitative crop growth models offer the potential for development of integrated systems that incorporate components of long-term climate change. However, operational seasonal forecasting systems have little or no value unless they are able to change key management decisions. Changed decision making through incorporation of seasonal forecasting ultimately has to demonstrate improved long-term performance of the cropping enterprise. Simulation analyses conducted on specific production scenarios are especially useful in improving decisions, particularly if this is done in conjunction with development of decision-support systems and associated facilitated discussion groups. Improved management of the overall crop production system requires an interdisciplinary approach, where climate scientists, agricultural scientists and extension specialists are intimately linked with crop production managers in the development of targeted seasonal forecast systems. The same principle applies in developing improved operational management systems for commodity trading organizations, milling companies and agricultural marketing organizations. Application of seasonal forecast systems across the whole value chain in agricultural production offers considerable benefits in improving overall operational management of agricultural production.
SHEDS-HT: An Integrated Probabilistic Exposure Model for ...
United States Environmental Protection Agency (USEPA) researchers are developing a strategy for highthroughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologically relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. Based on probabilistic methods and algorithms developed for The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals (SHEDS-MM), a new mechanistic modeling approach has been developed to accommodate high-throughput (HT) assessment of exposure potential. In this SHEDS-HT model, the residential and dietary modules of SHEDS-MM have been operationally modified to reduce the user burden, input data demands, and run times of the higher-tier model, while maintaining critical features and inputs that influence exposure. The model has been implemented in R; the modeling framework links chemicals to consumer product categories or food groups (and thus exposure scenarios) to predict HT exposures and intake doses. Initially, SHEDS-HT has been applied to 2507 organic chemicals associated with consumer products and agricultural pesticides. These evaluations employ data from recent USEPA efforts to characterize usage (prevalence, frequency, and magnitude), chemical composition, and exposure scenarios for a wide range of consumer products. In modeling indirec
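A highly simplified Monte Carlo sketch of this kind of high-throughput exposure calculation is given below; it is not the SHEDS-HT code (which is implemented in R), and all distributions and parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000   # simulated individuals

# Invented input distributions for one chemical in one consumer-product category.
use_freq = rng.poisson(lam=2.0, size=n)                              # uses per day
mass_per_use = rng.lognormal(mean=np.log(1.5), sigma=0.5, size=n)    # g per use
weight_fraction = 0.02                                               # chemical fraction in product
dermal_transfer = rng.uniform(0.01, 0.10, size=n)                    # fraction reaching the body
body_weight = rng.normal(70.0, 12.0, size=n).clip(40, 120)           # kg

# Simulated intake dose in mg per kg body weight per day.
dose = use_freq * mass_per_use * 1000.0 * weight_fraction * dermal_transfer / body_weight

print("median dose: %.3f mg/kg/day" % np.median(dose))
print("95th percentile: %.3f mg/kg/day" % np.percentile(dose, 95))
```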
A novel two-level dielectric barrier discharge reactor for methyl orange degradation.
Tao, Xumei; Wang, Guowei; Huang, Liang; Ye, Qingguo; Xu, Dongyan
2016-12-15
A novel pilot two-level dielectric barrier discharge (DBD) reactor has been proposed and applied to the continuous degradation of model wastewater. The two-level DBD reactor was skillfully realized with high space utilization efficiency and a large contact area between plasma and wastewater. The effects of various conditions, such as applied voltage, initial concentration and initial pH value, on the degradation of methyl orange (MO) model wastewater were investigated. The results showed that the appropriate applied voltage was 13.4 kV; a low initial concentration and a low initial pH value were conducive to MO degradation. The percentage removal of 4 L of MO with a concentration of 80 mg/L reached 94.1% after plasma treatment for 80 min. Based on ultraviolet (UV) spectra, infrared (IR) spectra and liquid chromatography-mass spectrometry (LC-MS) analysis of the degradation intermediates and products, insights into the degradation pathway of MO were proposed. Copyright © 2016 Elsevier Ltd. All rights reserved.
A computational cognitive model of syntactic priming.
Reitter, David; Keller, Frank; Moore, Johanna D
2011-01-01
The psycholinguistic literature has identified two syntactic adaptation effects in language production: rapidly decaying short-term priming and long-lasting adaptation. To explain both effects, we present an ACT-R model of syntactic priming based on a wide-coverage, lexicalized syntactic theory that explains priming as facilitation of lexical access. In this model, two well-established ACT-R mechanisms, base-level learning and spreading activation, account for long-term adaptation and short-term priming, respectively. Our model simulates incremental language production and in a series of modeling studies, we show that it accounts for (a) the inverse frequency interaction; (b) the absence of a decay in long-term priming; and (c) the cumulativity of long-term adaptation. The model also explains the lexical boost effect and the fact that it only applies to short-term priming. We also present corpus data that verify a prediction of the model, that is, that the lexical boost affects all lexical material, rather than just heads. Copyright © 2011 Cognitive Science Society, Inc.
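As an aid to readers unfamiliar with ACT-R, the Python snippet below evaluates the standard base-level learning equation, B = ln(sum_j t_j^(-d)), which underlies the long-term adaptation component described above; the presentation times are invented and d = 0.5 is the conventional ACT-R default.

```python
import numpy as np

def base_level_activation(presentation_times, now, d=0.5):
    """ACT-R base-level learning: B = ln(sum_j (now - t_j)**(-d)).

    presentation_times: times (s) at which the syntactic construction was used.
    d: decay parameter (0.5 is the conventional ACT-R default).
    """
    lags = now - np.asarray(presentation_times, dtype=float)
    return np.log(np.sum(lags ** (-d)))

# A construction used recently and often has higher activation: recent uses
# boost it strongly but decay quickly (short-term priming), while many past
# uses accumulate into a long-lasting elevation (adaptation).
print(base_level_activation([10.0, 200.0, 600.0], now=620.0))
print(base_level_activation([600.0], now=620.0))
print(base_level_activation([10.0, 200.0], now=620.0))
```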
Advanced optical modeling of TiN metal hard mask for scatterometric critical dimension metrology
NASA Astrophysics Data System (ADS)
Ebersbach, Peter; Urbanowicz, Adam M.; Likhachev, Dmitriy; Hartig, Carsten
2017-03-01
The majority of scatterometric production control models assume constant optical properties of the materials, with only dimensional parameters allowed to vary. However, this assumption, especially in the case of thin metal films, negatively impacts model precision and accuracy. In this work we focus on optical modeling of the TiN metal hardmask for scatterometry applications. Since the dielectric function of TiN exhibits thickness dependence, we had to take this fact into account. Moreover, the presence of highly absorbing films influences the extracted thicknesses of dielectric layers underneath the metal films. The latter phenomenon is often not reflected in the goodness of fit. We show that accurate optical modeling of the metal is essential to achieving the desired scatterometric model quality for automatic process control in microelectronic production. The presented modeling methodology can be applied to other TiN applications such as diffusion barriers and metal gates, as well as to other metals used in microelectronic manufacturing for all technology nodes.
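To illustrate the kind of parameterization commonly used for metallic films (a generic sketch, not the specific optical model of this work), the Python snippet below evaluates a Drude-Lorentz dielectric function whose Drude parameters are made thickness-dependent; all parameter values are invented.

```python
import numpy as np

def tin_epsilon(E_eV, thickness_nm, eps_inf=2.5):
    """Illustrative Drude + single Lorentz oscillator dielectric function.

    The plasma energy and Drude broadening are made (artificially)
    thickness-dependent to mimic the thickness dependence discussed in the text.
    """
    E = np.asarray(E_eV, dtype=float)
    # Invented thickness dependence: thinner films -> lower plasma energy,
    # larger broadening (more scattering).
    Ep = 7.0 * (1.0 - 2.0 / max(thickness_nm, 2.0))      # eV
    gamma_d = 0.3 + 3.0 / max(thickness_nm, 2.0)          # eV
    drude = -Ep**2 / (E**2 + 1j * gamma_d * E)
    # One Lorentz oscillator standing in for an interband transition (invented).
    A, E0, gamma_l = 30.0, 4.5, 1.5
    lorentz = A / (E0**2 - E**2 - 1j * gamma_l * E)
    return eps_inf + drude + lorentz

energies = np.linspace(1.0, 5.0, 5)            # photon energies, eV
for t in (5.0, 20.0):                          # film thicknesses, nm
    print("d = %4.1f nm:" % t, np.round(tin_epsilon(energies, t), 2))
```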
Improvement of water transport mechanisms during potato drying by applying ultrasound.
Ozuna, César; Cárcel, Juan A; García-Pérez, José V; Mulet, Antonio
2011-11-01
The drying rate of vegetables is limited by internal moisture diffusion and convective transport mechanisms. Increasing the drying-air temperature leads to faster water mobility; however, it causes quality loss in the product and demands more energy. Therefore, the search for new strategies to improve water mobility during convective drying is a topic of relevant research. The aim of this work was to evaluate the use of power ultrasound to improve the convective drying of potato and to quantify the influence of the applied power on the water transport mechanisms. The drying kinetics of potato cubes were accelerated by ultrasonic application. The influence of power ultrasound depended on the applied power (from 0 to 37 kW m⁻³): the higher the applied power, the faster the drying kinetics. A diffusion model considering external resistance to mass transfer provided a good fit to the drying kinetics. The modelling showed a proportional and significant (P < 0.05) influence of the applied ultrasonic power on the identified kinetic parameters: the effective moisture diffusivity and the mass transfer coefficient. Ultrasonic application during drying represents an interesting alternative to traditional convective drying by shortening the drying time, which may involve energy savings in industrial applications. In addition, the ultrasonic effect on water transport is based on mechanical phenomena with a low heating capacity, which is highly relevant for drying heat-sensitive materials and for obtaining high-quality dried products. Copyright © 2011 Society of Chemical Industry.
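To make the "diffusion model with external resistance to mass transfer" concrete, here is a minimal 1-D sketch: Fick's second law solved by an explicit finite-difference scheme in a slab with a convective surface condition. The cube geometry of the actual study is simplified to a slab, and every parameter value below is an assumption chosen only for illustration:

```python
import numpy as np

# 1-D diffusion with external resistance to mass transfer (illustrative only).
De   = 1.0e-9          # effective moisture diffusivity, m^2/s (assumed)
k    = 5.0e-7          # external mass-transfer coefficient, m/s (assumed)
L    = 0.005           # slab half-thickness, m (assumed)
W0, Weq = 4.0, 0.05    # initial and equilibrium moisture, kg/kg d.b. (assumed)

N  = 51
dx = L / (N - 1)
dt = 0.4 * dx**2 / De                  # satisfies the explicit stability limit
W  = np.full(N, W0)

def step(W):
    Wn = W.copy()
    Wn[1:-1] = W[1:-1] + De * dt / dx**2 * (W[2:] - 2 * W[1:-1] + W[:-2])
    Wn[0]  = Wn[1]                                   # symmetry at the centre
    # convective surface condition: -De dW/dx = k (W_surface - W_eq)
    Wn[-1] = (De / dx * Wn[-2] + k * Weq) / (De / dx + k)
    return Wn

t = 0.0
for i in range(1, 27001):
    W, t = step(W), t + dt
    if i % 4500 == 0:
        MR = (W.mean() - Weq) / (W0 - Weq)           # average moisture ratio
        print(f"t = {t/3600:5.1f} h   moisture ratio = {MR:.3f}")
```

In a fit to experimental kinetics, De and k would be identified by least squares, which is how the abstract's effective moisture diffusivity and mass transfer coefficient are obtained for each ultrasonic power level.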
Model-based reasoning in the physics laboratory: Framework and initial results
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.
2015-12-01
[This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.
Vastrad, B. M.; Neelagund, S. E.
2014-01-01
Neomycin production by Streptomyces fradiae NCIM 2418 was optimized using response surface methodology (RSM), a powerful mathematical approach comprehensively applied to the optimization of solid-state fermentation processes. In the first step of the optimization, a Plackett-Burman design established ammonium chloride, sodium nitrate, L-histidine, and ammonium nitrate as the crucial nutritional factors significantly affecting neomycin production. In the second step, a 2⁴ full factorial central composite design and RSM were applied to determine the optimal concentrations of the significant variables. A second-order polynomial was determined by multiple regression analysis of the experimental data. The optimum values of the important nutrients for maximum production were as follows: ammonium chloride 2.00%, sodium nitrate 1.50%, L-histidine 0.250%, and ammonium nitrate 0.250%, with a predicted maximum neomycin production of 20,000 g kg−1 dry coconut oil cake. Under these optimal conditions, the actual neomycin production was 19,642 g kg−1 dry coconut oil cake. The coefficient of determination (R²) was 0.9232, which indicates an acceptable adequacy of the model. PMID:25009746
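A minimal sketch of fitting the kind of second-order RSM polynomial described here, y = b0 + Σbi·xi + Σbii·xi² + Σbij·xi·xj, for four coded factors. Only the number of factors and the general yield scale echo the abstract; the design points and response values are simulated:

```python
import numpy as np
from itertools import combinations

# Simulated design and response (illustrative, not the actual neomycin data).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 4))        # coded levels of the 4 factors
true_opt = np.array([0.2, -0.3, 0.5, 0.1])  # arbitrary "true" optimum (coded)
y = 20000 - 3000 * ((X - true_opt) ** 2).sum(axis=1) + rng.normal(0, 200, 30)

def design_matrix(X):
    """Columns: 1, xi, xi^2, xi*xj (full quadratic response-surface model)."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

A = design_matrix(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)     # multiple regression fit
y_hat = A @ beta
R2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"fitted coefficients: {np.round(beta, 1)}")
print(f"R^2 of the quadratic model: {R2:.4f}")
```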
Evaluation of Wind Energy Production in Texas using Geographic Information Systems (GIS)
NASA Astrophysics Data System (ADS)
Ferrer, L. M.
2017-12-01
Texas has the highest installed wind capacity in the United States. The purpose of this research was to estimate the theoretical wind turbine energy production and the utilization ratio of wind turbines in Texas. Wind farm data were combined using Geographic Information System (GIS) methodology to create an updated GIS wind turbine database, including locations and technical specifications. Applying diverse GIS tools, the wind farm data were spatially joined with National Renewable Energy Laboratory (NREL) wind data to calculate the wind speed at each turbine hub. The power output of each turbine at its hub wind speed was then evaluated in the GIS system according to the respective turbine model's power curve. In total, over 11,700 turbines are installed in Texas, with an estimated energy output of 60 GWh per year and an average utilization ratio of 0.32. This research indicates that applying GIS methodologies will be crucial to the growth of wind energy and its efficiency in Texas.
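The per-turbine evaluation described (hub wind speed → power curve → energy and utilization ratio) can be sketched as below. The power curve, hub wind speeds, and the simplification of treating each hub speed as an annual mean are all assumptions for illustration, not the study's actual data or GIS workflow:

```python
import numpy as np

# Hypothetical power curve for a 1.6 MW class turbine (wind speed -> power).
curve_ws = np.array([3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 25])    # m/s
curve_kw = np.array([0, 60, 170, 330, 560, 850, 1150, 1400,
                     1550, 1600, 1600, 1600])                      # kW

hub_wind = np.array([6.8, 7.4, 8.1, 7.0, 9.2])   # m/s at 5 example turbine hubs
rated_kw = 1600.0

power_kw = np.interp(hub_wind, curve_ws, curve_kw)   # power at each hub speed
annual_mwh = power_kw * 8760 / 1000                  # treat hub speed as annual mean
utilization = power_kw / rated_kw                    # utilization (capacity) ratio

for ws, p, e, u in zip(hub_wind, power_kw, annual_mwh, utilization):
    print(f"hub wind {ws:4.1f} m/s -> {p:6.0f} kW, {e:7.0f} MWh/yr, ratio {u:.2f}")
print(f"fleet average utilization ratio: {utilization.mean():.2f}")
```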