ERIC Educational Resources Information Center
Zillesen, Pieter G. van Schaick
This paper introduces a hardware- and software-independent model for producing educational computer simulation environments. The model, which is based on the results of 32 studies of educational computer simulation program production, implies that educational computer simulation environments are specified, constructed, tested, implemented, and…
Modelling milk production from feed intake in dairy cattle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clarke, D.L.
1985-05-01
Predictive models were developed for both Holstein and Jersey cows. Since Holsteins comprised eighty-five percent of the data, the predictive models developed for Holsteins were used for the development of a user-friendly computer model. Predictive models included: milk production (squared multiple correlation .73), natural log (ln) of milk production (.73), four percent fat-corrected milk (.67), ln four percent fat-corrected milk (.68), fat-free milk (.73), ln fat-free milk (.73), dry matter intake (.61), ln dry matter intake (.60), milk fat (.52), and ln milk fat (.56). The predictive models for ln milk production, ln fat-free milk and ln dry matter intake were incorporated into a computer model. The model was written in standard Fortran for use on mainframe or micro-computers. Daily milk production, fat-free milk production, and dry matter intake were predicted on a daily basis with the previous day's dry matter intake serving as an independent variable in the prediction of the daily milk and fat-free milk production. 21 refs.
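A minimal sketch of the recursive daily prediction the abstract describes, with the previous day's dry matter intake driving the next day's ln-scale predictions. The regression coefficients below are placeholders, since the abstract reports model forms and fit statistics rather than parameter values:

```python
import math

# Hypothetical coefficients: the abstract gives ln-scale model forms, not values.
B0, B1 = 2.1, 0.055          # ln(milk, kg) ~ B0 + B1 * DMI(day - 1)
C0, C1 = 1.4, 0.045          # ln(DMI, kg)  ~ C0 + C1 * DMI(day - 1)

def simulate(days, dmi_start=18.0):
    """Roll the daily predictions forward, feeding each day's predicted
    dry matter intake into the next day's milk prediction."""
    dmi = dmi_start
    out = []
    for d in range(days):
        milk = math.exp(B0 + B1 * dmi)   # back-transform the ln(milk) prediction
        dmi = math.exp(C0 + C1 * dmi)    # next day's dry matter intake prediction
        out.append((d + 1, milk, dmi))
    return out

for day, milk, dmi in simulate(5):
    print(f"day {day}: milk {milk:.1f} kg, next-day DMI {dmi:.1f} kg")
```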
NASA Astrophysics Data System (ADS)
Divayana, D. G. H.; Adiarta, A.; Abadi, I. B. G. S.
2018-01-01
The aim of this research was to create an initial design of the CSE-UCLA evaluation model modified with the Weighted Product method for evaluating digital library services at Computer Colleges in Bali. The method used was the developmental research method, following the Borg and Gall design model. The result obtained from the research conducted so far is a rough sketch of the Weighted Product-based CSE-UCLA evaluation model; the design provides a general overview of the stages of the Weighted Product-based CSE-UCLA evaluation model used to optimize digital library services at the Computer Colleges in Bali.
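For reference, a minimal sketch of the Weighted Product scoring step that such a modification presumably relies on: alternatives are ranked by the product of criterion scores raised to normalized weights. The criteria, scores, and weights below are illustrative, not from the study.

```python
# Weighted Product method: S_i = prod_j x_ij ** w_j, with weights normalized
# to sum to 1; cost-type criteria take negative exponents.
def weighted_product(scores, weights, cost_criteria=()):
    total = sum(weights)
    w = [wi / total for wi in weights]                   # normalize weights
    ranked = []
    for name, xs in scores.items():
        s = 1.0
        for j, x in enumerate(xs):
            exp = -w[j] if j in cost_criteria else w[j]  # penalize cost criteria
            s *= x ** exp
        ranked.append((s, name))
    return sorted(ranked, reverse=True)

# Illustrative service-quality scores on three criteria for two alternatives.
scores = {"library A": [4, 3, 5], "library B": [5, 4, 2]}
print(weighted_product(scores, weights=[3, 2, 5]))
```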
The application of virtual reality systems as a support of digital manufacturing and logistics
NASA Astrophysics Data System (ADS)
Golda, G.; Kampa, A.; Paprocka, I.
2016-08-01
Modern trends in the development of computer-aided techniques are heading toward integrating the design of competitive products with so-called "digital manufacturing and logistics", supported by computer simulation software. All phases of the product lifecycle, from the design of a new product, through planning and control of manufacturing, assembly, internal logistics and repairs, quality control, distribution to customers and after-sale service, up to recycling or disposal, should be aided and managed by advanced product lifecycle management software packages. This paper describes key problems in providing an efficient flow of materials in supply chain management across the whole product lifecycle using computer simulation. The authors pay particular attention to the processes of acquiring the relevant information and correct data necessary for virtual modeling and computer simulation of integrated manufacturing and logistics systems. The article describes possible applications of virtual reality software for modeling and simulating production and logistics processes in an enterprise across different aspects of product lifecycle management. The authors demonstrate an effective method of creating computer simulations for digital manufacturing and logistics, present modeled and programmed examples and solutions, and note development trends and application options that extend beyond a single enterprise.
ERIC Educational Resources Information Center
Stumpf, Mark R.
This report presents an instructional design model that was developed for use by the End-Users Computing department of a large pharmaceutical company in developing effective--but not lengthy--microcomputer training seminars to train office workers and executives in the proper use of computers and thus increase their productivity. The 14 steps of…
Biological production models as elements of coupled, atmosphere-ocean models for climate research
NASA Technical Reports Server (NTRS)
Platt, Trevor; Sathyendranath, Shubha
1991-01-01
Process models of phytoplankton production are discussed with respect to their suitability for incorporation into global-scale numerical ocean circulation models. Exact solutions are given for integrals of analytic, wavelength-independent models of primary production over the mixed layer and over the day. Within this class of model, the bias incurred by using a triangular approximation (rather than a sinusoidal one) to the variation of surface irradiance through the day is computed. Efficient computational algorithms are given for the nonspectral models. More exact calculations require a spectrally sensitive treatment. Such models exist but must be integrated numerically over depth and time. For these integrations, resolution in wavelength, depth, and time is considered and recommendations are made for efficient computation. The extrapolation of the one-(spatial)-dimension treatment to large horizontal scale is discussed.
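A numerical illustration of the triangular-versus-sinusoidal bias at the surface (the paper derives it analytically and over the mixed layer; the photosynthesis-irradiance parameters below are illustrative assumptions, not the paper's values):

```python
import numpy as np

P_max, I_k, I_noon, D = 5.0, 100.0, 800.0, 12.0   # illustrative P-I parameters; D = daylength (h)

t = np.linspace(0.0, D, 1001)
I_sin = I_noon * np.sin(np.pi * t / D)                # sinusoidal surface irradiance
I_tri = I_noon * (1.0 - np.abs(2.0 * t / D - 1.0))    # triangular approximation

def daily_integral(I):
    P = P_max * np.tanh(I / I_k)                      # saturating P-I response
    return np.sum(0.5 * (P[1:] + P[:-1]) * np.diff(t))  # trapezoidal rule over the day

p_sin, p_tri = daily_integral(I_sin), daily_integral(I_tri)
print(f"sinusoidal: {p_sin:.1f}  triangular: {p_tri:.1f}  "
      f"bias: {100 * (p_tri / p_sin - 1):+.1f}%")
```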
Product placement of computer games in cyberspace.
Yang, Heng-Li; Wang, Cheng-Shu
2008-08-01
Computer games are considered an emerging medium and are even regarded as an advertising channel. In a three-phase experiment, this study investigated the advertising effectiveness of computer games for different product placement forms, product types, and their combinations. As the statistical results revealed, computer games are appropriate for placement advertising. Additionally, different product types and placement forms produced different advertising effectiveness, and optimum combinations of product types and placement forms existed. An advertisement design model is proposed for use in game design environments. Some suggestions are given for advertisers and game companies, respectively.
A convolution model for computing the far-field directivity of a parametric loudspeaker array.
Shi, Chuang; Kajikawa, Yoshinobu
2015-02-01
This paper describes a method to compute the far-field directivity of a parametric loudspeaker array (PLA), whereby a steerable parametric loudspeaker can be implemented when phased array techniques are applied. The convolution of the product directivity with Westervelt's directivity is suggested, substituting for the past practice of using the product directivity only. The computed directivity of a PLA using the proposed convolution model agrees significantly better with measured directivity, at a negligible computational cost.
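A sketch of the proposed combination step under simplifying assumptions: the far-field response is taken as the convolution, over angle, of the product directivity with Westervelt's directivity. The Gaussian stand-in for the product directivity and the parameter values are illustrative, not from the paper.

```python
import numpy as np

theta = np.linspace(-30.0, 30.0, 601)             # angle grid (degrees)
d_theta = theta[1] - theta[0]

def westervelt(theta_deg, alpha0=10.0, k=180.0):
    """|D_W| = 1 / |1 + j (k/alpha0) sin^2(theta/2)|, illustrative k and alpha0."""
    th = np.radians(theta_deg)
    return np.abs(1.0 / (1.0 + 1j * (k / alpha0) * np.sin(th / 2.0) ** 2))

product_dir = np.exp(-(theta / 4.0) ** 2)         # stand-in for the product directivity
conv = np.convolve(product_dir, westervelt(theta), mode="same") * d_theta
conv /= conv.max()                                # normalize the on-axis response

half_width = theta[conv >= 10 ** (-3 / 20)].ptp() / 2.0   # angles within -3 dB
print(f"-3 dB half-beamwidth of the convolved model: {half_width:.1f} deg")
```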
Mateo, Jordi; Pla, Lluis M; Solsona, Francesc; Pagès, Adela
2016-01-01
Production planning models are attracting growing interest for use in the primary sector of the economy. The proposed model relies on the formulation of a location model representing a set of farms eligible to be selected by a grocery shop brand to supply local fresh products under seasonal contracts. The main aim is to minimize overall procurement costs and meet future demand. This kind of problem is rather common in fresh vegetable supply chains, where producers are located in proximity either to processing plants or to retailers. The proposed two-stage stochastic model determines which suppliers should be selected for production contracts to ensure high-quality products and minimal time from farm to table. Moreover, Lagrangian relaxation and parallel computing algorithms are proposed to solve these instances efficiently in reasonable computational time. The results show computational gains from the algorithmic proposals compared with the plain CPLEX solver. Furthermore, the results confirm the competitive advantages of the proposed model for purchase managers in the fresh vegetables industry.
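A toy enumeration of the two-stage structure described above: the first stage selects farms and pays their contract costs, the second stage purchases under demand scenarios, with shortfalls covered at a spot-market penalty. All names, costs, capacities, and scenarios are invented; the paper solves far larger instances with Lagrangian relaxation and parallel algorithms rather than enumeration.

```python
from itertools import combinations

fixed = {"farm1": 50.0, "farm2": 60.0, "farm3": 40.0}  # first-stage contract cost
cap = {"farm1": 80.0, "farm2": 100.0, "farm3": 60.0}   # seasonal capacity
var = {"farm1": 1.0, "farm2": 0.8, "farm3": 1.2}       # unit supply cost
spot = 3.0                                             # penalty for unmet demand
scenarios = [(120.0, 0.3), (160.0, 0.5), (200.0, 0.2)]  # (demand, probability)

def expected_cost(selection):
    total = sum(fixed[f] for f in selection)           # pay contracts up front
    for demand, prob in scenarios:
        rest, cost = demand, 0.0
        for f in sorted(selection, key=var.get):       # recourse: cheapest farms first
            take = min(cap[f], rest)
            cost += var[f] * take
            rest -= take
        cost += spot * rest                            # buy any shortfall on the spot market
        total += prob * cost
    return total

best = min((s for r in range(1, 4) for s in combinations(fixed, r)),
           key=expected_cost)
print("select:", best, "expected cost:", round(expected_cost(best), 1))
```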
The DIVA model: A neural theory of speech acquisition and production
Tourville, Jason A.; Guenther, Frank H.
2013-01-01
The DIVA model of speech production provides a computationally and neuroanatomically explicit account of the network of brain regions involved in speech acquisition and production. An overview of the model is provided along with descriptions of the computations performed in the different brain regions represented in the model. The latest version of the model, which contains a new right-lateralized feedback control map in ventral premotor cortex, will be described, and experimental results that motivated this new model component will be discussed. Application of the model to the study and treatment of communication disorders will also be briefly described. PMID:23667281
Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne
2015-01-01
Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers differs from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. A child-specific model can assist in understanding risk factors for children's computer use and support the development of recommendations that encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that help young people use information technology for school, home and leisure safely and productively.
Progress in computational toxicology.
Ekins, Sean
2014-01-01
Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade; this progress in computational toxicology is reviewed here. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity, and various publications that use machine learning methods are highlighted. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that are increasingly used for predictions. Across many different models, Bayesian and SVM methods perform similarly based on cross-validation data. Considerable progress has been made in computational toxicology in a decade, both in model development and in the availability of larger scale or 'big data' models. Future efforts in toxicology data generation will likely provide hundreds of thousands of compounds that are readily accessible for machine learning models, covering relevant chemistry space for pharmaceutical, consumer product and environmental applications.
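A minimal sketch of the kind of comparison reported: a Bayesian classifier and an SVM scored by cross-validation on fingerprint-like binary features. The synthetic data stand in for the curated toxicity datasets used in the review.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import BernoulliNB
from sklearn.svm import SVC

# Synthetic stand-in for a toxicity endpoint dataset with binary fingerprints.
X, y = make_classification(n_samples=400, n_features=64, random_state=0)
X = (X > 0).astype(int)   # binarize, mimicking structural fingerprint bits

for name, clf in [("Bayesian", BernoulliNB()), ("SVM", SVC())]:
    scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean 5-fold ROC AUC = {scores.mean():.3f}")
```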
Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images
NASA Technical Reports Server (NTRS)
Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.
1999-01-01
Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient specific CAD solids from medical images in a matter of minutes.
Dayglow and night glow of the Venusian upper atmosphere. Modelling and observations
NASA Astrophysics Data System (ADS)
Gronoff, G.; Lilensten, J.; Simon, C.; Barthélemy, M.; Leblanc, F.
2007-08-01
Aims. We present the modelling of the production of excited states of O, CO and N2 in the Venusian upper atmosphere, which allows us to compute the nightglow emissions. On the dayside, we also compute several emissions, taking advantage of the small influence of resonant scattering for forbidden transitions. Methods. We compute the photoionisation and photodissociation mechanisms, and thus the photoelectron production. We compute electron impact excitation and ionisation through a multi-stream stationary kinetic transport code. Finally, we compute ion recombination with a stationary chemical model. Results. We predict altitude density profiles for the O(1S) and O(1D) states and the emissions corresponding to their different transitions. They are found to be very comparable to the observations without the need for stratospheric emissions. On the nightside, we highlight the role of the N + O2+ reaction in the creation of the O(1S) state. This reaction was suggested by Rees in 1975 (Frederick, 1976); it has been discussed several times since and, in spite of different studies, is still controversial. However, when we take it into consideration for Venus, it is shown to be the cause of almost 90% of the production of this state. We calculate the production intensities of the O(3S) and O(5S) states, which are needed for radiative transfer models. For CO we compute the Cameron band and fourth positive band emissions. For N2 we compute the LBH and the first and second positive bands. All these values compare successfully to experiment where data are available. Conclusions. For the first time, a comprehensive model is proposed to compute dayglow and nightglow emissions of the Venusian upper atmosphere. It relies on previous work with noticeable improvements on both the transport and the chemical aspects. In the near future, a radiative transfer model will be used to compute optically thick lines in the dayglow, and a fluid model will be added to compute ionospheric densities and temperatures. We will present the first observational results from the Pic du Midi telescope in June 2007 for comparison with our modelling.
Development of a model and computer code to describe solar grade silicon production processes
NASA Technical Reports Server (NTRS)
Srivastava, R.; Gould, R. K.
1979-01-01
Mathematical models, and computer codes based on these models were developed which allow prediction of the product distribution in chemical reactors in which gaseous silicon compounds are converted to condensed phase silicon. The reactors to be modeled are flow reactors in which silane or one of the halogenated silanes is thermally decomposed or reacted with an alkali metal, H2 or H atoms. Because the product of interest is particulate silicon, processes which must be modeled, in addition to mixing and reaction of gas-phase reactants, include the nucleation and growth of condensed Si via coagulation, condensation, and heterogeneous reaction.
FORBEEF: A Forage-Livestock System Computer Model Used as a Teaching Aid for Decision Making.
ERIC Educational Resources Information Center
Stringer, W. C.; And Others
1987-01-01
Describes the development of a computer simulation model of forage-beef production systems, which is intended to incorporate soil, forage, and animal decisions into an enterprise scenario. Produces a summary of forage production and livestock needs. Cites positive assessment of the program's value by participants in inservice training workshops.…
SAMICS marketing and distribution model
NASA Technical Reports Server (NTRS)
1978-01-01
SAMICS (Solar Array Manufacturing Industry Costing Standards) was formulated as a computer simulation model. Given a proper description of the manufacturing technology as input, this model computes the manufacturing price of solar arrays for a broad range of production levels. This report presents a model for computing the associated marketing and distribution costs, the end point of the model being the loading dock of the final manufacturer.
First Order Fire Effects Model: FOFEM 4.0, user's guide
Elizabeth D. Reinhardt; Robert E. Keane; James K. Brown
1997-01-01
A First Order Fire Effects Model (FOFEM) was developed to predict the direct consequences of prescribed fire and wildfire. FOFEM computes duff and woody fuel consumption, smoke production, and fire-caused tree mortality for most forest and rangeland types in the United States. The model is available as a computer program for PC or Data General computer.
Designers workbench: toward real-time immersive modeling
NASA Astrophysics Data System (ADS)
Kuester, Falko; Duchaineau, Mark A.; Hamann, Bernd; Joy, Kenneth I.; Ma, Kwan-Liu
2000-05-01
This paper introduces the Designers Workbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing, and computer-aided engineering systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The Designers Workbench aims at closing this technology or 'digital gap' experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.
1995-08-01
This report describes the primary physical models that form the basis of the DART mechanistic computer model for calculating fission-product-induced swelling of aluminum dispersion fuels; the calculated results are compared with test data. In addition, DART calculates irradiation-induced changes in the thermal conductivity of the dispersion fuel, as well as fuel restructuring due to the aluminum-fuel reaction, amorphization, and recrystallization. Input instructions for execution on mainframe, workstation, and personal computers are provided, as is a description of DART output. The theory of fission gas behavior and its effect on fuel swelling is discussed. The behavior of these fission products in both crystalline and amorphous fuel and in the presence of irradiation-induced recrystallization and crystalline-to-amorphous-phase change phenomena is presented, as are models for these irradiation-induced processes.
ERIC Educational Resources Information Center
Bucks, Gregory Warren
2010-01-01
Computers have become an integral part of how engineers complete their work, allowing them to collect and analyze data, model potential solutions and aiding in production through automation and robotics. In addition, computers are essential elements of the products themselves, from tennis shoes to construction materials. An understanding of how…
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies the design requirements defined in DREAMS and incorporates enabling computational technologies.
Pharmaceutical industry and trade liberalization using computable general equilibrium model.
Barouni, M; Ghaderi, H; Banouei, Aa
2012-01-01
Computable general equilibrium (CGE) models are known as a powerful instrument for economic analyses and have been widely used to evaluate the effects of trade liberalization. The purpose of this study was to assess the impacts of trade openness on the pharmaceutical industry using a CGE model. The effects of tariff reductions, as a symbol of trade liberalization, on key variables of Iranian pharmaceutical products were studied. Simulation was performed under two scenarios. The first scenario was the effect of decreasing tariffs on pharmaceutical products by 10, 30, 50, and 100 percent on key drug variables; the second was the effect of tariff decreases in all sectors except pharmaceutical products on vital economic variables of pharmaceutical products. The required data were obtained and the model parameters were calibrated according to the social accounting matrix of Iran in 2006. The simulation results demonstrated that the first scenario increased imports, exports, drug supply to markets and household consumption, while imports, exports, supply of product to market, and household consumption of pharmaceutical products would on average decrease in the second scenario. Ultimately, societal welfare would improve in all scenarios. We present and synthesize a CGE model that can be used to analyze trade liberalization policy issues in developing countries such as Iran, thus providing information that policymakers can use to improve pharmaceutical economics.
Computer model for economic study of unbleached kraft paperboard production
Peter J. Ince
1984-01-01
Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...
Integration of scheduling and discrete event simulation systems to improve production flow planning
NASA Astrophysics Data System (ADS)
Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.
2016-08-01
The increased availability of data and of computer-aided technologies such as MRP I/II, ERP and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. Integration of production scheduling with computer modelling, simulation and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology based on a semi-automatic model generation method is proposed to eliminate problems associated with model complexity and with the labour-intensive, time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied. The approach is illustrated through examples of practical implementation using the KbRS scheduling system and the Enterprise Dynamics simulation system.
Babbitt, Callie W; Kahhat, Ramzy; Williams, Eric; Babbitt, Gregory A
2009-07-01
Product lifespan is a fundamental variable in understanding the environmental impacts associated with the life cycle of products. Existing life cycle and materials flow studies of products, almost without exception, consider lifespan to be constant over time. To determine the validity of this assumption, this study provides an empirical documentation of the long-term evolution of personal computer lifespan, using a major U.S. university as a case study. Results indicate that over the period 1985-2000, computer lifespan (purchase to "disposal") decreased steadily from a mean of 10.7 years in 1985 to 5.5 years in 2000. The distribution of lifespan also evolved, becoming narrower over time. Overall, however, lifespan distribution was broader than normally considered in life cycle assessments or materials flow forecasts of electronic waste management for policy. We argue that these results suggest that at least for computers, the assumption of constant lifespan is problematic and that it is important to work toward understanding the dynamics of use patterns. We modify an age-structured model of population dynamics from biology as a modeling approach to describe product life cycles. Lastly, the purchase share and generation of obsolete computers from the higher education sector is estimated using different scenarios for the dynamics of product lifespan.
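A sketch of the age-structured bookkeeping such a model implies: each purchase cohort is retired according to an age-at-disposal distribution whose mean declines over time, as observed (10.7 years in 1985 to 5.5 years in 2000). The purchase counts and the Weibull-shaped distribution are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

years = np.arange(1985, 2001)
purchases = np.linspace(100, 400, len(years))          # assumed growth in units bought per year
mean_life = np.linspace(10.7, 5.5, len(years))         # observed decline in mean lifespan

def lifespan_pmf(mean, shape=2.0, max_age=25):
    """Discretized Weibull age-at-disposal distribution with the given mean."""
    scale = mean / 0.8862                              # Weibull mean = scale * Gamma(1.5) for shape 2
    ages = np.arange(1, max_age + 1)
    cdf = 1.0 - np.exp(-(ages / scale) ** shape)
    pmf = np.diff(np.concatenate(([0.0], cdf)))
    return pmf / pmf.sum()

obsolete = {}
for yr, n, m in zip(years, purchases, mean_life):      # spread each cohort over disposal years
    for age, prob in enumerate(lifespan_pmf(m), start=1):
        obsolete[yr + age] = obsolete.get(yr + age, 0.0) + n * prob

for yr in range(1995, 2006):
    print(yr, round(obsolete.get(yr, 0.0)))            # obsolete units generated per year
```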
A review of ocean chlorophyll algorithms and primary production models
NASA Astrophysics Data System (ADS)
Li, Jingwen; Zhou, Song; Lv, Nan
2015-12-01
This paper introduces five ocean chlorophyll concentration inversion algorithms and three main models for computing ocean primary production from ocean chlorophyll concentration. Through a comparison of the five inversion algorithms, it sums up their advantages and disadvantages, and briefly analyzes trends in ocean primary production modelling.
A primer for biomedical scientists on how to execute model II linear regression analysis.
Ludbrook, John
2012-04-01
1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program SMATR gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, SMATR can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs SYSTAT and STATISTICA for those who regularly undertake linear regression analysis, and I give step-by-step instructions in the Supplementary Information as to how to use loss functions.
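A minimal sketch of OLP (geometric mean) regression with a bootstrap 95% CI, in the spirit of the recommendations above; this generic resampling is not SMATR's exact procedure, and the data are simulated.

```python
import numpy as np

def olp(x, y):
    """Ordinary least products: slope = sign(r) * sd(y)/sd(x)."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    return slope, np.mean(y) - slope * np.mean(x)

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 40)
y = 2.0 * x + 1.0 + rng.normal(0, 2.0, 40)        # both variables subject to error

slope, intercept = olp(x, y)
boots = np.array([olp(x[i], y[i]) for i in        # resample (x, y) pairs together
                  (rng.integers(0, len(x), len(x)) for _ in range(2000))])
lo, hi = np.percentile(boots[:, 0], [2.5, 97.5])
print(f"OLP slope {slope:.2f} (95% CI {lo:.2f} to {hi:.2f}), intercept {intercept:.2f}")
```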
Computational Methods to Assess the Production Potential of Bio-Based Chemicals.
Campodonico, Miguel A; Sukumara, Sumesh; Feist, Adam M; Herrgård, Markus J
2018-01-01
Elevated costs and long implementation times of bio-based processes for producing chemicals represent a bottleneck for moving to a bio-based economy. A prospective analysis able to elucidate economically and technically feasible product targets at early research phases is mandatory. Computational tools can be implemented to explore the biological and technical spectrum of feasibility, while constraining the operational space for desired chemicals. In this chapter, two different computational tools for assessing potential for bio-based production of chemicals from different perspectives are described in detail. The first tool is GEM-Path: an algorithm to compute all structurally possible pathways from one target molecule to the host metabolome. The second tool is a framework for Modeling Sustainable Industrial Chemicals production (MuSIC), which integrates modeling approaches for cellular metabolism, bioreactor design, upstream/downstream processes, and economic impact assessment. Integrating GEM-Path and MuSIC will play a vital role in supporting early phases of research efforts and guide the policy makers with decisions, as we progress toward planning a sustainable chemical industry.
Designers Workbench: Towards Real-Time Immersive Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuester, F; Duchaineau, M A; Hamann, B
2001-10-03
This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology or 'digital gap' experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.
NASA Astrophysics Data System (ADS)
Wang, Yu; Liu, Qun
2013-01-01
Surplus-production models are widely used in fish stock assessment and fisheries management due to their simplicity and lower data demands than age-structured models such as Virtual Population Analysis. The CEDA (catch-effort data analysis) and ASPIC (a surplus-production model incorporating covariates) computer packages are data-fitting or parameter estimation tools that have been developed to analyze catch-and-effort data using non-equilibrium surplus production models. We applied CEDA and ASPIC to the hairtail (Trichiurus japonicus) fishery in the East China Sea. Both packages produced robust results and yielded similar estimates. In CEDA, the Schaefer surplus production model with a log-normal error assumption produced results close to those of ASPIC. CEDA is sensitive to the choice of initial proportion, while ASPIC is not. However, CEDA produced higher R² values than ASPIC.
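A minimal sketch of the non-equilibrium Schaefer fit with log-normal observation error that packages such as CEDA and ASPIC perform. The catch and effort series, starting values, and the initial-proportion parameter p0 are synthetic stand-ins for the hairtail data.

```python
import numpy as np
from scipy.optimize import minimize

catch = np.array([120, 150, 180, 210, 230, 240, 235, 225, 210, 200], float)
effort = np.array([10, 14, 19, 25, 31, 37, 42, 46, 49, 52], float)
cpue = catch / effort                                  # observed index of abundance

def neg_loglik(params):
    r, K, q, p0 = params
    if min(params) <= 0:
        return 1e9                                     # keep the search feasible
    B = p0 * K                                         # initial biomass as a fraction of K
    sse = 0.0
    for C, U in zip(catch, cpue):
        sse += (np.log(U) - np.log(q * B)) ** 2        # log-normal observation error
        B = max(B + r * B * (1.0 - B / K) - C, 1e-6)   # Schaefer biomass dynamics
    return sse

fit = minimize(neg_loglik, x0=[0.5, 1200.0, 0.01, 0.8], method="Nelder-Mead")
r, K, q, p0 = fit.x
print(f"r = {r:.3f}  K = {K:.0f}  MSY = {r * K / 4:.0f}")   # Schaefer MSY = rK/4
```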
Rapid prototyping and stereolithography in dentistry
Nayar, Sanjna; Bhuminathan, S.; Bhat, Wasim Manzoor
2015-01-01
The word rapid prototyping (RP) was first used in the mechanical engineering field in the early 1980s to describe the act of producing a prototype, a unique product, the first product, or a reference model. In the past, prototypes were handmade by sculpting or casting, and their fabrication demanded a long time. Any and every prototype should undergo evaluation, correction of defects, and approval before the beginning of its mass or large scale production. Prototypes may also be used for specific or restricted purposes, in which case they are usually called a preseries model. With the development of information technology, three-dimensional models can be devised and built based on virtual prototypes. Computers can now be used to create accurately detailed projects that can be assessed from different perspectives in a process known as computer aided design (CAD). To materialize virtual objects using CAD, a computer aided manufacture (CAM) process has been developed. To transform a virtual file into a real object, CAM operates using a machine connected to a computer, similar to a printer or peripheral device. In 1987, Brix and Lambrecht used, for the first time, a prototype in health care. It was a three-dimensional model manufactured using a computer numerical control device, a type of machine that was the predecessor of RP. In 1991, human anatomy models produced with a technology called stereolithography were first used in a maxillofacial surgery clinic in Vienna. PMID:26015715
Gilet, Estelle; Diard, Julien; Bessière, Pierre
2011-01-01
In this paper, we study the collaboration of perception and action representations involved in cursive letter recognition and production. We propose a mathematical formulation for the whole perception–action loop, based on probabilistic modeling and Bayesian inference, which we call the Bayesian Action–Perception (BAP) model. Being a model of both perception and action processes, the purpose of this model is to study the interaction of these processes. More precisely, the model includes a feedback loop from motor production, which implements an internal simulation of movement. Motor knowledge can therefore be involved during perception tasks. In this paper, we formally define the BAP model and show how it solves the following six varied cognitive tasks using Bayesian inference: i) letter recognition (purely sensory), ii) writer recognition, iii) letter production (with different effectors), iv) copying of trajectories, v) copying of letters, and vi) letter recognition (with internal simulation of movements). We present computer simulations of each of these cognitive tasks, and discuss experimental predictions and theoretical developments. PMID:21674043
Kutlak, Roman; van Deemter, Kees; Mellish, Chris
2016-01-01
This article presents a computational model of the production of referring expressions under uncertainty over the hearer's knowledge. Although situations where the hearer's knowledge is uncertain have seldom been addressed in the computational literature, they are common in ordinary communication, for example when a writer addresses an unknown audience, or when a speaker addresses a stranger. We propose a computational model composed of three complementary heuristics based on, respectively, an estimation of the recipient's knowledge, an estimation of the extent to which a property is unexpected, and the question of what is the optimum number of properties in a given situation. The model was tested in an experiment with human readers, in which it was compared against the Incremental Algorithm and human-produced descriptions. The results suggest that the new model outperforms the Incremental Algorithm in terms of the proportion of correctly identified entities and in terms of the perceived quality of the generated descriptions. PMID:27630592
Generation and use of human 3D-CAD models
NASA Astrophysics Data System (ADS)
Grotepass, Juergen; Speyer, Hartmut; Kaiser, Ralf
2002-05-01
Individualized products are one of the ten megatrends of the 21st century, with human modeling as the key issue for tomorrow's design and product development. The use of human modeling software for computer-based ergonomic simulations within the production process increases quality while reducing costs by 30-50 percent and shortening production time. This presentation focuses on the use of human 3D-CAD models for both the ergonomic design of working environments and made-to-measure garment production. Today, the entire production chain can be designed, and individualized models generated and analyzed, in 3D computer environments. Anthropometric design for ergonomics is matched to human needs, thus preserving health. Ergonomic simulation includes topics such as human vision, reachability, kinematics, force and comfort analysis and international design capabilities. In Germany, more than 17 billion marks flow to other industries because clothes do not fit. Individual clothing tailored to the customer's preference means surplus value, pleasure and perfect fit. Body scanning technology is the key to the generation and use of human 3D-CAD models for both the ergonomic design of working environments and made-to-measure garment production.
NASA Technical Reports Server (NTRS)
2001-01-01
Howmet Research Corporation was the first to commercialize an innovative cast metal technology developed at Auburn University, Auburn, Alabama. With funding assistance from NASA's Marshall Space Flight Center, Auburn University's Solidification Design Center (a NASA Commercial Space Center) developed accurate nickel-based superalloy data for casting molten metals. Through a contract agreement, Howmet used the data to develop computer model predictions of molten metals and molding materials in cast metal manufacturing. Howmet Metal Mold (HMM), part of Howmet Corporation Specialty Products, of Whitehall, Michigan, utilizes metal molds to manufacture net shape castings in various alloys and amorphous metal (metallic glass). By implementing the thermophysical property data developed by Auburn researchers, Howmet employs its newly developed computer model predictions to offer customers high-quality, low-cost products with significantly improved mechanical properties. Components fabricated with this new process replace components originally made from forgings or billet. Compared with products manufactured through traditional casting methods, Howmet's computer-modeled castings come out on top.
Some Useful Cost-Benefit Criteria for Evaluating Computer-Based Test Delivery Models and Systems
ERIC Educational Resources Information Center
Luecht, Richard M.
2005-01-01
Computer-based testing (CBT) is typically implemented using one of three general test delivery models: (1) multiple fixed testing (MFT); (2) computer-adaptive testing (CAT); or (3) multistage testing (MST). This article reviews some of the real cost drivers associated with CBT implementation--focusing on item production costs, the costs…
A New Model that Generates Lotka's Law.
ERIC Educational Resources Information Center
Huber, John C.
2002-01-01
Develops a new model for a process that generates Lotka's Law. Topics include measuring scientific productivity through the number of publications; rate of production; career duration; randomness; Poisson distribution; computer simulations; goodness-of-fit; theoretical support for the model; and future research. (Author/LRW)
Articulatory speech synthesis and speech production modelling
NASA Astrophysics Data System (ADS)
Huang, Jun
This dissertation addresses the problem of speech synthesis and speech production modelling based on the fundamental principles of human speech production. Unlike the conventional source-filter model, which assumes the independence of the excitation and the acoustic filter, we treat the entire vocal apparatus as one system consisting of a fluid dynamic aspect and a mechanical part. We model the vocal tract by a three-dimensional moving geometry. We also model the sound propagation inside the vocal apparatus as a three-dimensional nonplane-wave propagation inside a viscous fluid described by the Navier-Stokes equations. In our work, we first propose a combined minimum energy and minimum jerk criterion to estimate the dynamic vocal tract movements during speech production. Both theoretical error bound analysis and experimental results show that this method can achieve a very close match at the target points and avoid abrupt changes in the articulatory trajectory at the same time. Second, a mechanical vocal fold model is used to compute the excitation signal of the vocal tract. The advantage of this model is that it is closely coupled with the vocal tract system based on fundamental aerodynamics. As a result, we can obtain an excitation signal with much more detail than the conventional parametric vocal fold excitation model. Furthermore, strong evidence of source-tract interaction is observed. Finally, we propose a computational model of the fricative and stop types of sounds based on the physical principles of speech production. The advantage of this model is that it uses an exogenous process to model the additional nonsteady and nonlinear effects due to the flow mode, which are ignored by the conventional source-filter speech production model. A recursive algorithm is used to estimate the model parameters. Experimental results show that this model is able to synthesize good quality fricative and stop types of sounds. Based on our dissertation work, we carefully argue that the articulatory speech production model has the potential to flexibly synthesize natural-quality speech sounds and to provide a compact computational model for speech production that can be beneficial to a wide range of areas in speech signal processing.
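A sketch of the minimum-jerk half of the combined criterion: the closed-form quintic that minimizes integrated squared jerk between two targets with zero boundary velocity and acceleration. Target values and movement time are illustrative; the dissertation combines this term with a minimum-energy criterion.

```python
import numpy as np

def min_jerk(x0, xf, T, t):
    """Minimum-jerk trajectory: x(t) = x0 + (xf - x0)(10 tau^3 - 15 tau^4 + 6 tau^5)."""
    tau = t / T
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

t = np.linspace(0.0, 0.2, 5)                    # 200 ms articulator movement
print(min_jerk(x0=0.0, xf=1.0, T=0.2, t=t))     # smooth approach to the target
```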
Proposed standards for peer-reviewed publication of computer code
USDA-ARS's Scientific Manuscript database
Computer simulation models are mathematical abstractions of physical systems. In the area of natural resources and agriculture, these physical systems encompass selected interacting processes in plants, soils, animals, or watersheds. These models are scientific products and have become important i...
Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools
NASA Astrophysics Data System (ADS)
Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.
2015-12-01
Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software used to execute analyses and models, and derived data and products that arise from these computations. This provenance is vital to the interpretation and understanding of science, and provides an audit trail that researchers can use to understand and replicate computational workflows in the geosciences.
Increasing productivity of the McAuto CAD/CAE system by user-specific applications programming
NASA Technical Reports Server (NTRS)
Plotrowski, S. M.; Vu, T. H.
1985-01-01
Significant improvements in the productivity of the McAuto Computer-Aided Design/Computer-Aided Engineering (CAD/CAE) system were achieved by applications programming using the system's own Graphics Interactive Programming language (GRIP) and the interface capabilities with the main computer on which the system resides. The GRIP programs for creating springs, bar charts, finite element model representations and aiding management planning are presented as examples.
Computer simulation models as tools for identifying research needs: A black duck population model
Ringelman, J.K.; Longcore, J.R.
1980-01-01
Existing data on the mortality and production rates of the black duck (Anas rubripes) were used to construct a WATFIV computer simulation model. The yearly cycle was divided into 8 phases: hunting, wintering, reproductive, molt, post-molt, and juvenile dispersal mortality, and production from original and renesting attempts. The program computes population changes for sex and age classes during each phase. After completion of a standard simulation run with all variable default values in effect, a sensitivity analysis was conducted by changing each of 50 input variables, 1 at a time, to assess the responsiveness of the model to changes in each variable. Thirteen variables resulted in a substantial change in population level. Adult mortality factors were important during hunting and wintering phases. All production and mortality associated with original nesting attempts were sensitive, as was juvenile dispersal mortality. By identifying those factors which invoke the greatest population change, and providing an indication of the accuracy required in estimating these factors, the model helps to identify those variables which would be most profitable topics for future research.
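A skeleton of the one-at-a-time sensitivity procedure described: perturb each input in turn, re-run the simulation, and record the population change. The toy three-parameter model below stands in for the 8-phase, 50-variable black duck model.

```python
# Illustrative default values for a toy annual-cycle model (placeholders,
# not the black duck model's inputs).
defaults = {"hunt_mort": 0.25, "winter_mort": 0.10, "young_per_pair": 3.2}

def simulate(p, n0=1000.0, years=10):
    n = n0
    for _ in range(years):
        n *= (1 - p["hunt_mort"]) * (1 - p["winter_mort"])  # mortality phases
        n += (n / 2) * p["young_per_pair"] * 0.5            # production phase (toy recruitment)
    return n

base = simulate(defaults)
for name in defaults:                                       # one-at-a-time perturbation
    p = dict(defaults, **{name: defaults[name] * 1.10})     # +10% on one input
    print(f"{name}: {100 * (simulate(p) / base - 1):+.1f}% population change")
```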
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vigil, Benny Manuel; Ballance, Robert; Haskell, Karen
Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, is included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.
Networking DEC and IBM computers
NASA Technical Reports Server (NTRS)
Mish, W. H.
1983-01-01
Local area networking of DEC and IBM computers within the structure of the ISO-OSI Seven Layer Reference Model, at a raw signaling speed of 1 Mbps or greater, is discussed. After an introduction to the ISO-OSI Reference Model and the IEEE-802 Draft Standard for Local Area Networks (LANs), there follows a detailed discussion and comparison of the products available from a variety of manufacturers to perform this networking task. A summary of these products is presented in a table.
The Use of High Performance Computing (HPC) to Strengthen the Development of Army Systems
2011-11-01
accurately predicting the supersonic Magnus effect about spinning cones, ogive-cylinders, and boat-tailed afterbodies. This work led to the successful...successful computer model of the proposed product or system, one can then build prototypes on the computer and study the effects on the performance of...needed. The NRC report discusses the requirements for effective use of such computing power. One needs “models, algorithms, software, hardware
Ionospheric Slant Total Electron Content Analysis Using Global Positioning System Based Estimation
NASA Technical Reports Server (NTRS)
Komjathy, Attila (Inventor); Mannucci, Anthony J. (Inventor); Sparks, Lawrence C. (Inventor)
2017-01-01
A method, system, apparatus, and computer program product provide the ability to analyze ionospheric slant total electron content (TEC) using global navigation satellite systems (GNSS)-based estimation. Slant TEC is estimated for a given set of raypath geometries by fitting historical GNSS data to a specified delay model. The accuracy of the specified delay model is estimated by computing delay estimate residuals and plotting a behavior of the delay estimate residuals. An ionospheric threat model is computed based on the specified delay model. Ionospheric grid delays (IGDs) and grid ionospheric vertical errors (GIVEs) are computed based on the ionospheric threat model.
USDA-ARS's Scientific Manuscript database
Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...
Parametric inference for biological sequence analysis.
Pachter, Lior; Sturmfels, Bernd
2004-11-16
One of the major successes in computational biology has been the unification, by using the graphical model formalism, of a multitude of algorithms for annotating and comparing biological sequences. Graphical models that have been applied to these problems include hidden Markov models for annotation, tree models for phylogenetics, and pair hidden Markov models for alignment. A single algorithm, the sum-product algorithm, solves many of the inference problems that are associated with different statistical models. This article introduces the polytope propagation algorithm for computing the Newton polytope of an observation from a graphical model. This algorithm is a geometric version of the sum-product algorithm and is used to analyze the parametric behavior of maximum a posteriori inference calculations for graphical models.
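As a concrete reference, the forward pass below is the sum-product algorithm on a toy two-state hidden Markov model; the paper's polytope propagation replaces these sum and product operations with the corresponding polytope operations on the model parameters. All probabilities are illustrative.

```python
import numpy as np

T = np.array([[0.9, 0.1],    # transition probabilities between hidden states
              [0.2, 0.8]])
E = np.array([[0.7, 0.3],    # emission probabilities for symbols 0 and 1
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])    # initial state distribution

obs = [0, 0, 1, 1, 0]
alpha = pi * E[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ T) * E[:, o]   # sum-product step: marginalize states, then emit
print("P(observations) =", alpha.sum())
```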
NASA Astrophysics Data System (ADS)
Chang, Yung-Chia; Li, Vincent C.; Chiang, Chia-Ju
2014-04-01
Make-to-order or direct-order business models that require close interaction between production and distribution activities have been adopted by many enterprises in order to be competitive in demanding markets. This article considers an integrated production and distribution scheduling problem in which jobs are first processed by one of the unrelated parallel machines and then distributed to corresponding customers by capacitated vehicles without intermediate inventory. The objective is to find a joint production and distribution schedule so that the weighted sum of total weighted job delivery time and the total distribution cost is minimized. This article presents a mathematical model for describing the problem and designs an algorithm using ant colony optimization. Computational experiments illustrate that the algorithm developed is capable of generating near-optimal solutions. The computational results also demonstrate the value of integrating production and distribution in the model for the studied problem.
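A compact sketch of the ant-colony scheme described, with the vehicle routing collapsed into a per-machine distribution cost for brevity: ants assign jobs to one of two unrelated parallel machines guided by pheromone, and trails are evaporated and reinforced by the best solution found. All processing times, costs, and weights are invented.

```python
import random

random.seed(3)
p = [[4, 6], [3, 5], [5, 2], [6, 4]]   # p[j][m]: time of job j on machine m
dist_cost = [10.0, 14.0]               # distribution cost if a machine's batch ships
w = 0.7                                # weight on delivery time vs. distribution cost
tau = [[1.0, 1.0] for _ in p]          # pheromone on each (job, machine) choice

def cost(assign):
    load = [0.0, 0.0]
    for j, m in enumerate(assign):
        load[m] += p[j][m]
    total_time = sum(load[m] for m in assign)      # each job leaves when its machine finishes
    total_dist = sum(dist_cost[m] for m in set(assign))
    return w * total_time + (1 - w) * total_dist   # weighted-sum objective

best, best_cost = None, float("inf")
for _ in range(50):                    # colony iterations
    for _ in range(10):                # ants per iteration
        assign = [random.choices([0, 1], weights=tau[j])[0] for j in range(len(p))]
        c = cost(assign)
        if c < best_cost:
            best, best_cost = assign, c
    for j in range(len(p)):            # evaporate, then reinforce the best solution
        tau[j] = [0.9 * t for t in tau[j]]
        tau[j][best[j]] += 1.0 / best_cost
print("best assignment:", best, "objective:", round(best_cost, 2))
```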
An efficient formulation of robot arm dynamics for control and computer simulation
NASA Astrophysics Data System (ADS)
Lee, C. S. G.; Nigam, R.
This paper describes an efficient formulation of the dynamic equations of motion of industrial robots based on the Lagrange formulation of d'Alembert's principle. This formulation, as applied to a PUMA robot arm, results in a set of closed-form second-order differential equations with cross-product terms. They are not as efficient in computation as those formulated by the Newton-Euler method, but provide a better analytical model for control analysis and computer simulation. Computational complexities of this dynamic model, together with other models, are tabulated for discussion.
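For reference, the general closed form that such Lagrangian derivations yield; the paper derives the PUMA-specific coefficients, while this is the standard structure with inertial, velocity cross-product, and gravity terms:

```latex
% tau_i: joint torque; D: inertia matrix; h: Coriolis/centrifugal (cross-product)
% coefficients; c: gravity loading; q: joint coordinates.
\tau_i \;=\; \sum_{j=1}^{n} D_{ij}(\mathbf{q})\,\ddot{q}_j
      \;+\; \sum_{j=1}^{n}\sum_{k=1}^{n} h_{ijk}(\mathbf{q})\,\dot{q}_j\,\dot{q}_k
      \;+\; c_i(\mathbf{q}), \qquad i = 1,\dots,n
```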
Jover, Jesús
2017-11-08
DFT calculations are widely used for computing properties, reaction mechanisms and energy profiles in organometallic reactions. A qualitative agreement between the experimental and the calculated results usually seems to be enough to validate a computational methodology, but recent advances in computation indicate that a nearly quantitative agreement should be possible if an appropriate DFT study is carried out. Final percent product concentrations, often reported as yields, are by far the most commonly reported properties in experimental metal-mediated synthesis studies, but reported DFT studies have not focused on predicting absolute product amounts. The recently reported stoichiometric pentafluoroethylation of benzoic acid chlorides (R-C6H4COCl) with [(phen)Cu(PPh3)C2F5] (phen = 1,10-phenanthroline, PPh3 = triphenylphosphine) has been used as a case study to check whether the experimental product concentrations can be reproduced by any of the most popular DFT approaches with high enough accuracy. To this end, the Gibbs energy profile for the pentafluoroethylation of benzoic acid chloride has been computed using 14 different DFT methods. These computed Gibbs energy profiles have been employed to build kinetic models predicting the final product concentration in solution. The best results are obtained with the D3-dispersion-corrected B3LYP functional, which has been successfully used afterwards to model the reaction outcomes of other simple (R = o-Me, p-Me, p-Cl, p-F, etc.) benzoic acid chlorides. The product concentrations of more complex reaction networks in which more than one position of the substrate may be activated by the copper catalyst (R = o-Br and p-I) are also predicted appropriately.
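A sketch of the profile-to-concentration step under stated assumptions: barriers from a computed Gibbs energy profile are converted to rate constants with the Eyring equation and integrated in a toy two-step A -> I -> P kinetic model. The barrier heights, concentration, and reaction time are placeholders, not the paper's B3LYP-D3 values.

```python
import numpy as np
from scipy.integrate import solve_ivp

R, kB_h, T = 8.314, 2.084e10, 298.15         # J/(mol K); kB/h in 1/(s K); temperature (K)

def eyring(dG_kJ):
    """Eyring equation: k = (kB T / h) exp(-dG / RT)."""
    return kB_h * T * np.exp(-dG_kJ * 1e3 / (R * T))

k1, k2 = eyring(95.0), eyring(90.0)          # assumed barriers (kJ/mol) for the two steps

def rhs(t, c):                               # c = [A, I, P] concentrations
    return [-k1 * c[0], k1 * c[0] - k2 * c[1], k2 * c[1]]

sol = solve_ivp(rhs, (0.0, 24 * 3600.0), [0.05, 0.0, 0.0])  # 24 h from 0.05 M substrate
print(f"product concentration after 24 h: {sol.y[2, -1]:.4f} M "
      f"({100 * sol.y[2, -1] / 0.05:.0f}% yield)")
```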
USDA-ARS?s Scientific Manuscript database
A model for the evolution of pyrolysis products in a fluidized bed has been developed. In this study the unsteady constitutive transport equations for inert gas flow and decomposition kinetics were modeled using the commercial computational fluid dynamics (CFD) software FLUENT-12. The model system d...
Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...
Computable general equilibrium model fiscal year 2013 capability development report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, Brian Keith; Rivera, Michael Kelly; Boero, Riccardo
This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine the simulation properties of the model in more detail.
Framework Resources Multiply Computing Power
NASA Technical Reports Server (NTRS)
2010-01-01
As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.
Computer Technology-Integrated Projects Should not Supplant Craft Projects in Science Education
NASA Astrophysics Data System (ADS)
Klopp, Tabatha J.; Rule, Audrey C.; Suchsland Schneider, Jean; Boody, Robert M.
2014-03-01
The current emphasis on computer technology integration and narrowing of the curriculum has displaced arts and crafts. However, the hands-on, concrete nature of craft work in science modeling enables students to understand difficult concepts and to be engaged and motivated while learning spatial, logical, and sequential thinking skills. Analogy use is also helpful in understanding unfamiliar, complex science concepts. This study of 28 academically advanced elementary to middle-school students examined student work and perceptions during a science unit focused on four fossil organisms: crinoid, brachiopod, horn coral and trilobite. The study compared: (1) analogy-focused instruction to independent Internet research and (2) computer technology-rich products to crafts-based products. Findings indicate student products were more creative after analogy-based instruction and when made using technology. However, students expressed a strong desire to engage in additional craft work after making craft products and enjoyed making crafts more after analogy-focused instruction. Additionally, more science content was found in the craft products than the technology-rich products. Students expressed a particular liking for two of the fossil organisms because they had been modeled with crafts. The authors recommend that room should be retained for crafts in the science curriculum to model science concepts.
Computational modeling of the generation and propagation of distortion products in the inner ear
NASA Astrophysics Data System (ADS)
Bowling, Thomas; Wen, Haiqi; Meaud, Julien
2018-05-01
Distortion product otoacoustic emissions are used in both clinical and research settings to assess cochlear function, although there are still questions about how the distortion products propagate in the cochlea from their generation location to the middle ear. Here, a physiologically based computational model of the gerbil ear is used to investigate distortion product propagation. The fluid is modeled in three dimensions and includes two ducts. Simulations of the distortion products in the cochlear fluid pressure and basilar membrane are compared with published experimental data. Model results are consistent with measurements from Ren and colleagues, which indicated that the intracochlear distortion product is dominated by a forward traveling wave at a low primary frequency ratio, although backward traveling waves become apparent when other ratios are considered. The magnitude and phase of both the basilar membrane response and the spatial variations of the distortion product fluid pressure are qualitatively similar to the expected response of a slowly propagating backward traveling wave. Together, these results suggest that distortion products propagate primarily as a slow wave, both when the cochlea is driven by intracochlear sources and when it is driven by an acoustic stimulus in the ear canal.
Development of mpi_EPIC model for global agroecosystem modeling
Kang, Shujiang; Wang, Dali; Jeff A. Nichols; ...
2014-12-31
Models that address policy-maker concerns about multi-scale effects of food and bioenergy production systems are computationally demanding. We integrated the message passing interface algorithm into the process-based EPIC model to accelerate computation of ecosystem effects. Simulation performance was further enhanced by applying the Vampir framework. When this enhanced mpi_EPIC model was tested, total execution time for a global 30-year simulation of a switchgrass cropping system was shortened to less than 0.5 hours on a supercomputer. The results illustrate that mpi_EPIC using parallel design can balance simulation workloads and facilitate large-scale, high-resolution analysis of agricultural production systems, management alternatives and environmental effects.
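The message-passing pattern described, distributing independent grid-cell simulations across ranks and gathering the results, can be sketched with mpi4py. The cell list, the interleaved chunking used for load balancing, and the run_epic stand-in function below are illustrative assumptions, not the mpi_EPIC code itself.

    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    def run_epic(cell):
        """Stand-in for a 30-year EPIC simulation of one grid cell."""
        return {"cell": cell, "yield": 0.0}

    if rank == 0:
        cells = list(range(10000))                      # hypothetical global grid cells
        chunks = [cells[i::size] for i in range(size)]  # interleave to balance workloads
    else:
        chunks = None

    my_cells = comm.scatter(chunks, root=0)
    my_results = [run_epic(c) for c in my_cells]
    results = comm.gather(my_results, root=0)
    if rank == 0:
        print(sum(len(r) for r in results), "cells simulated")

Run under an MPI launcher, e.g. "mpirun -n 64 python script.py"; because each cell is independent, the speedup is close to linear until I/O dominates.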
Challenges facing developers of CAD/CAM models that seek to predict human working postures
NASA Astrophysics Data System (ADS)
Wiker, Steven F.
2005-11-01
This paper outlines the need for development of human posture prediction models for Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) design applications in product, facility and work design. Challenges facing developers of posture prediction algorithms are presented and discussed.
NASA Astrophysics Data System (ADS)
von Kamp, Axel; Klamt, Steffen
2017-06-01
Computational modelling of metabolic networks has become an established procedure in the metabolic engineering of production strains. One key principle that is frequently used to guide the rational design of microbial cell factories is the stoichiometric coupling of growth and product synthesis, which makes production of the desired compound obligatory for growth. Here we show that the coupling of growth and production is feasible under appropriate conditions for almost all metabolites in genome-scale metabolic models of five major production organisms. These organisms comprise eukaryotes and prokaryotes as well as heterotrophic and photoautotrophic organisms, which shows that growth coupling as a strain design principle has a wide applicability. The feasibility of coupling is proven by calculating appropriate reaction knockouts, which enforce the coupling behaviour. The study presented here is the most comprehensive computational investigation of growth-coupled production so far and its results are of fundamental importance for rational metabolic engineering.
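The coupling test the authors describe can be pictured with a toy flux-balance calculation: maximize growth subject to steady-state stoichiometry, then check the minimum product flux attainable at that optimal growth. The three-reaction network below, in which a growth byproduct X can only be consumed by the production pathway, is invented for illustration; genome-scale models have thousands of reactions and the knockouts are chosen algorithmically.

    import numpy as np
    from scipy.optimize import linprog

    # Toy network (illustrative): uptake -> A; growth: A -> biomass + X; product: A + X -> P
    S = np.array([[1, -1, -1],    # metabolite A balance
                  [0,  1, -1]])   # byproduct X balance: couples production to growth
    bounds = [(0, 10), (0, None), (0, None)]  # v_uptake, v_growth, v_product

    # Step 1: maximize growth (linprog minimizes, so negate the growth coefficient).
    res = linprog(c=[0, -1, 0], A_eq=S, b_eq=[0, 0], bounds=bounds)
    v_growth_max = res.x[1]

    # Step 2: minimum product flux when growth is fixed at its maximum.
    res2 = linprog(c=[0, 0, 1], A_eq=np.vstack([S, [0, 1, 0]]),
                   b_eq=[0, 0, v_growth_max], bounds=bounds)
    print(f"max growth {v_growth_max:.2f}, guaranteed product flux {res2.x[2]:.2f}")

A strictly positive minimum product flux at maximal growth is the signature of growth-coupled production.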
NASA Astrophysics Data System (ADS)
Yu, Jonas C. P.; Wee, H. M.; Yang, P. C.; Wu, Simon
2016-06-01
One of the supply chain risks for hi-tech products is the result of rapid technological innovation; it leads to a significant decline in selling price and demand after the initial launch period. Hi-tech products include computers and consumer communication products. From a practical standpoint, a more realistic replenishment policy must consider the impact of such risks, especially when some portion of shortages is lost. In this paper, suboptimal and optimal order policies with partial backordering are developed for a buyer when the component cost, the selling price, and the demand rate decline at a continuous rate. Two mathematical models are derived and discussed: one model gives a suboptimal solution with a fixed replenishment interval and a simpler computational process; the other gives an optimal solution with a varying replenishment interval and a more complicated computational process. The second model yields more profit. Numerical examples are provided to illustrate the two replenishment models. Sensitivity analysis is carried out to investigate the relationship between the parameters and the net profit.
Computational Modeling of Fluid–Structure–Acoustics Interaction during Voice Production
Jiang, Weili; Zheng, Xudong; Xue, Qian
2017-01-01
This paper presented a three-dimensional, first-principle-based fluid–structure–acoustics interaction computer model of voice production, which employed more realistic human laryngeal and vocal tract geometries. Self-sustained vibrations, the important convergent–divergent vibration pattern of the vocal folds, and entrainment of the two dominant vibratory modes were captured. Voice quality-associated parameters including the frequency, open quotient, skewness quotient, and flow rate of the glottal flow waveform were found to be well within the normal physiological ranges. The analogy between the vocal tract and a quarter-wave resonator was demonstrated. The acoustic perturbed flux and pressure inside the glottis were found to be of the same order as their incompressible counterparts, suggesting strong source–filter interactions during voice production. Such a high-fidelity computational model will be useful for investigating a variety of pathological conditions that involve complex vibrations, such as vocal fold paralysis, vocal nodules, and vocal polyps. The model is also an important step toward a patient-specific surgical planning tool that can serve as a no-risk trial-and-error platform for different procedures, such as injection of biomaterials and thyroplastic medialization. PMID:28243588
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2016-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
Automatic Generation of Just-in-Time Online Assessments from Software Design Models
ERIC Educational Resources Information Center
Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.
2009-01-01
Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…
The impact of pharmacophore modeling in drug design.
Guner, Osman F
2005-07-01
With the reliable use of computer simulations in scientific research, it is possible to achieve significant increases in productivity as well as a reduction in research costs compared with experimental approaches. For example, computer simulation can substantially enhance productivity by guiding the scientist toward better, more informed choices, while also driving the 'fail-early' concept that results in a significant reduction in cost. Pharmacophore modeling is a reliable computer-aided design tool used in the discovery of new classes of compounds for a given therapeutic category. This commentary will briefly review the benefits and applications of this technology in drug discovery and design, and will also highlight its historical evolution. The two most commonly used approaches for pharmacophore model development will be discussed, and several examples of how this technology was successfully applied to identify new potent leads will be provided. The article concludes with a brief outline of the controversial issue of patentability of pharmacophore models.
Understanding Emergency Care Delivery Through Computer Simulation Modeling.
Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L
2018-02-01
In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
The production and escape of nitrogen atoms on Mars
NASA Technical Reports Server (NTRS)
Fox, J. L.
1993-01-01
Updated rate coefficients and a revised ionosphere-thermosphere model are used to compute the production rates and densities of odd nitrogen species in the Martian atmosphere. Computed density profiles for N(4S), N(2D), N(2P), and NO are presented. The model NO densities are found to be about a factor of 2-3 less than those measured by the Viking 1 mass spectrometer. Revised values for the escape rates of N atoms from dissociative recombination and ionospheric reactions are also computed. Dissociative recombination is found to be comparable in importance to photodissociation at low solar activity, but it is still the most important escape mechanism for N-14 at high solar activity.
Analysis of a Multi-Fidelity Surrogate for Handling Real Gas Equations of State
NASA Astrophysics Data System (ADS)
Ouellet, Frederick; Park, Chanyoung; Rollin, Bertrand; Balachandar, S.
2017-06-01
The explosive dispersal of particles is a complex multiphase and multi-species fluid flow problem. In these flows, the detonation products of the explosive must be treated as real gas while the ideal gas equation of state is used for the surrounding air. As the products expand outward from the detonation point, they mix with ambient air and create a mixing region where both state equations must be satisfied. One of the most accurate, yet computationally expensive, methods to handle this problem is an algorithm that iterates between both equations of state until pressure and thermal equilibrium are achieved inside of each computational cell. This work aims to use a multi-fidelity surrogate model to replace this process. A Kriging model is used to produce a curve fit which interpolates selected data from the iterative algorithm using Bayesian statistics. We study the model performance with respect to the iterative method in simulations using a finite volume code. The model's (i) computational speed, (ii) memory requirements and (iii) computational accuracy are analyzed to show the benefits of this novel approach. Also, optimizing the combination of model accuracy and computational speed through the choice of sampling points is explained. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program as a Cooperative Agreement under the Predictive Science Academic Alliance Program under Contract No. DE-NA0002378.
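A bare-bones Kriging (Gaussian-process) interpolator of the kind used to replace an expensive solver can be sketched in a few lines: fit once to sampled outputs of the costly routine, then evaluate the cheap surrogate everywhere else. The squared-exponential kernel, the one-dimensional input, and the toy "expensive" function below stand in for the real multi-fidelity mixture-equilibrium data and are not the authors' model.

    import numpy as np

    def kernel(a, b, length=0.5):
        """Squared-exponential covariance between two sets of 1-D points."""
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

    def expensive_equilibrium(x):
        """Stand-in for the iterative two-EOS pressure/temperature equilibration."""
        return np.sin(3 * x) + 0.5 * x

    x_train = np.linspace(0.0, 2.0, 12)          # sampled mixture states
    y_train = expensive_equilibrium(x_train)

    K = kernel(x_train, x_train) + 1e-10 * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)          # fit once, reuse many times

    def surrogate(x_new):
        return kernel(np.atleast_1d(x_new), x_train) @ alpha

    print(surrogate(1.234), expensive_equilibrium(np.atleast_1d(1.234)))

The trade-off analyzed in the abstract, speed and memory versus accuracy, then reduces to how many training samples are taken and where they are placed.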
NASA Astrophysics Data System (ADS)
Barber, Duncan Henry
During some postulated accidents at nuclear power stations, fuel cooling may be impaired. In such cases, the fuel heats up and the subsequent increased fission-gas release from the fuel to the gap may result in fuel sheath failure. After fuel sheath failure, the barrier between the coolant and the fuel pellets is lost or impaired; gases and vapours from the fuel-to-sheath gap and other open voids in the fuel pellets can be vented. Gases and steam from the coolant can enter the broken fuel sheath and interact with the fuel pellet surfaces and the fission-product inclusions on the fuel surface (including material at the surface of the fuel matrix). The chemistry of this interaction is an important mechanism to model in order to assess fission-product releases from fuel. Starting in 1995, the computer program SOURCE 2.0 was developed by the Canadian nuclear industry to model fission-product release from fuel during such accidents. SOURCE 2.0 employed an early thermochemical model of irradiated uranium dioxide fuel developed at the Royal Military College of Canada. To overcome the limitations of computers of that time, the implementation of the RMC model employed lookup tables of pre-calculated equilibrium conditions. In the intervening years, the RMC model has been improved, the power of computers has increased significantly, and thermodynamic subroutine libraries have become available. This thesis is the result of extensive work based on these three factors. A prototype computer program (referred to as SC11) has been developed that uses a thermodynamic subroutine library to calculate thermodynamic equilibria by Gibbs energy minimization. The Gibbs energy minimization requires the system temperature (T) and pressure (P), and the inventory of chemical elements (n) in the system. In order to calculate the inventory of chemical elements in the fuel, the list of nuclides and nuclear isomers modelled in SC11 had to be expanded from the list used by SOURCE 2.0. A benchmark calculation demonstrates the improved agreement, relative to an ORIGEN-S calculation, of the total inventory of those chemical elements included in the RMC fuel model. ORIGEN-S is the Oak Ridge isotope generation and depletion computer program. The Gibbs energy minimizer requires a chemical database containing coefficients from which the Gibbs energy of pure compounds, gas and liquid mixtures, and solid solutions can be calculated. The RMC model of irradiated uranium dioxide fuel has been converted into the required format. The Gibbs energy minimizer has been incorporated into a new model of fission-product vaporization from the fuel surface. Calculated release fractions using the new code have been compared to results calculated with SOURCE IST 2.0P11 and to results of tests used in the validation of SOURCE 2.0. The new code shows improved agreement with experimental releases for a number of nuclides. Of particular significance is the better agreement between experimental and calculated release fractions for 140La. The improved agreement reflects the inclusion in the RMC model of the solubility of lanthanum(III) oxide (La2O3) in the fuel matrix. Calculated lanthanide release fractions from earlier computer programs were a challenge to environmental qualification analysis of equipment for some accident scenarios. The new prototype computer program would alleviate this concern.
Keywords: Nuclear Engineering; Material Science; Thermodynamics; Radioactive Material; Gibbs Energy Minimization; Actinide Generation and Depletion; Fission-Product Generation and Depletion.
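The core computation, minimizing total Gibbs energy subject to element-balance constraints, can be sketched for an ideal gas mixture. The H2/O2/H2O system, the standard chemical potential values, and the element inventory below are illustrative placeholders, not the RMC fuel model or its database.

    import numpy as np
    from scipy.optimize import minimize

    R, T = 8.314, 1500.0                    # J/(mol K), K
    species = ["H2", "O2", "H2O"]
    mu0 = np.array([0.0, 0.0, -164.0e3])    # assumed standard chemical potentials, J/mol
    A = np.array([[2, 0, 2],                # H atoms per molecule
                  [0, 2, 1]])               # O atoms per molecule
    b = np.array([2.0, 1.0])                # element inventory: 2 mol H, 1 mol O

    def gibbs(n):
        """Ideal-mixture Gibbs energy: G = sum_i n_i (mu0_i + RT ln(n_i / n_tot))."""
        n = np.clip(n, 1e-12, None)
        return np.sum(n * (mu0 + R * T * np.log(n / n.sum())))

    cons = {"type": "eq", "fun": lambda n: A @ n - b}
    res = minimize(gibbs, x0=np.array([0.5, 0.4, 0.3]), method="SLSQP",
                   constraints=[cons], bounds=[(1e-12, None)] * 3)
    print(dict(zip(species, res.x.round(4))))

Production codes use dedicated Gibbs minimizers with mixture and solid-solution models rather than a general-purpose optimizer, but the constrained-minimization structure is the same.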
Synfuel program analysis. Volume 2: VENVAL users manual
NASA Astrophysics Data System (ADS)
Muddiman, J. B.; Whelan, J. W.
1980-07-01
This volume is intended for program analysts and is a user's manual for the VENVAL model. It contains specific explanations of input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in the evaluation of prospective private-sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public-sector and other external costs and revenues if unit costs are furnished.
A Four-Stage Model for Planning Computer-Based Instruction.
ERIC Educational Resources Information Center
Morrison, Gary R.; Ross, Steven M.
1988-01-01
Describes a flexible planning process for developing computer based instruction (CBI) in which the CBI design is implemented on paper between the lesson design and the program production. A four-stage model is explained, including (1) an initial flowchart, (2) storyboards, (3) a detailed flowchart, and (4) an evaluation. (16 references)…
Computer Simulation of a Hardwood Processing Plant
D. Earl Kline; Philip A. Araman
1990-01-01
The overall purpose of this paper is to introduce computer simulation as a decision support tool that can be used to provide managers with timely information. A simulation/animation modeling procedure is demonstrated for wood products manufacuring systems. Simulation modeling techniques are used to assist in identifying and solving problems. Animation is used for...
ERIC Educational Resources Information Center
Darabi, Aubteen; Nelson, David W.; Meeker, Richard; Liang, Xinya; Boulware, Wilma
2010-01-01
In a diagnostic problem solving operation of a computer-simulated chemical plant, chemical engineering students were randomly assigned to two groups: one studying product-oriented worked examples, the other practicing conventional problem solving. Effects of these instructional strategies on the progression of learners' mental models were examined…
Interactive computer aided technology, evolution in the design/manufacturing process
NASA Technical Reports Server (NTRS)
English, C. H.
1975-01-01
A powerful computer-operated three-dimensional graphic system and associated auxiliary computer equipment used in advanced design, production design, and manufacturing was described. This system has made these activities more productive than older and more conventional methods of designing and building aerospace vehicles. With this graphic system, designers are able to define parts using a wide variety of geometric entities, and to define parts as fully surfaced 3-dimensional models as well as "wire-frame" models. Once a part is geometrically defined, the designer is able to take section cuts of the surfaced model and automatically determine all of the section properties of the planar cut, light-pen detect all of the surface patches, and automatically determine the volume and weight of the part. Further, the designs are defined mathematically at a degree of accuracy never before achievable.
Computationally efficient algorithm for Gaussian Process regression in case of structured samples
NASA Astrophysics Data System (ADS)
Belyaev, M.; Burnaev, E.; Kapushev, Y.
2016-04-01
Surrogate modeling is widely used in many engineering problems. Data sets often have Cartesian product structure (for instance, factorial designs of experiments with missing points). In such cases the size of the data set can be very large, and one of the most popular approximation algorithms, Gaussian Process regression, can hardly be applied due to its computational complexity. In this paper a computationally efficient approach for constructing Gaussian Process regression for data sets with Cartesian product structure is presented. Efficiency is achieved by exploiting the special structure of the data set and operations with tensors. The proposed algorithm has low computational as well as memory complexity compared to existing algorithms. We also introduce a regularization procedure that takes into account the anisotropy of the data set and avoids degeneracy of the regression model.
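The key trick for Cartesian-product designs is that the full kernel matrix factors as a Kronecker product of small per-factor kernels, so the big linear system is solved through two small solves and the full matrix is never formed. The grid sizes, kernel, and test function below are illustrative, and this basic version assumes a complete grid with no missing points (handling missing points is part of what the paper addresses).

    import numpy as np

    def se_kernel(x, length=0.3):
        d = x[:, None] - x[None, :]
        return np.exp(-0.5 * (d / length) ** 2)

    # Factorial design: the full grid is the Cartesian product of two 1-D designs.
    x1, x2 = np.linspace(0, 1, 50), np.linspace(0, 1, 40)
    K1 = se_kernel(x1) + 1e-8 * np.eye(50)
    K2 = se_kernel(x2) + 1e-8 * np.eye(40)

    # Responses on the 2000-point grid; with row-major flattening, K = kron(K1, K2).
    X1, X2 = np.meshgrid(x1, x2, indexing="ij")
    y = np.sin(4 * X1) * np.cos(3 * X2)

    # Solve kron(K1, K2) alpha = vec(y) with two small solves (O(n_i^3) each)
    # instead of one O((n1*n2)^3) solve on the full 2000 x 2000 matrix.
    alpha = np.linalg.solve(K2, np.linalg.solve(K1, y).T).T

    # Sanity check via the identity kron(K1, K2) vec(A) = vec(K1 A K2^T).
    y_hat = K1 @ alpha @ K2.T
    print(np.max(np.abs(y_hat - y)))   # ~1e-8, limited only by the nugget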
Analysis of an algae-based CELSS. I - Model development
NASA Technical Reports Server (NTRS)
Holtzapple, Mark T.; Little, Frank E.; Makela, Merry E.; Patterson, C. O.
1989-01-01
A steady state chemical model and computer program have been developed for a life support system and applied to trade-off studies. The model is based on human demand for food and oxygen determined from crew metabolic needs. The model includes modules for water recycle, waste treatment, CO2 removal and treatment, and food production. The computer program calculates rates of use and material balance for food, O2, the recycle of human waste and trash, H2O, N2, and food production/supply. A simple noniterative solution for the model has been developed using the steady state rate equations for the chemical reactions. The model and program have been used in system sizing and subsystem trade-off studies of a partially closed life support system.
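The noniterative solution the abstract mentions amounts to propagating crew demand through steady-state rate equations. The short sketch below balances oxygen against algal food production; the per-person demand figures and the CH2O photosynthesis stoichiometry are rough textbook-style assumptions, not the paper's coefficients.

    crew = 4
    o2_demand = 0.84      # kg O2 per person-day (assumed)
    food_demand = 0.62    # kg dry food per person-day (assumed)

    # Photosynthesis approximated as CO2 + H2O -> CH2O + O2:
    # producing 1 kg of dry biomass releases roughly 32/30 kg of O2.
    o2_per_food = 32.0 / 30.0

    food_rate = crew * food_demand
    o2_from_food = food_rate * o2_per_food
    o2_makeup = max(0.0, crew * o2_demand - o2_from_food)
    print(f"algae food production: {food_rate:.2f} kg/day, "
          f"O2 shortfall to supply: {o2_makeup:.2f} kg/day")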
Self-, other-, and joint monitoring using forward models.
Pickering, Martin J; Garrod, Simon
2014-01-01
In the psychology of language, most accounts of self-monitoring assume that it is based on comprehension. Here we outline and develop the alternative account proposed by Pickering and Garrod (2013), in which speakers construct forward models of their upcoming utterances and compare them with the utterance as they produce them. We propose that speakers compute inverse models derived from the discrepancy (error) between the utterance and the predicted utterance and use that to modify their production command or (occasionally) begin anew. We then propose that comprehenders monitor other people's speech by simulating their utterances using covert imitation and forward models, and then comparing those forward models with what they hear. They use the discrepancy to compute inverse models and modify their representation of the speaker's production command, or realize that their representation is incorrect and may develop a new production command. We then discuss monitoring in dialogue, paying attention to sequential contributions, concurrent feedback, and the relationship between monitoring and alignment.
1990-12-31
…health hazards from weapons combustion products, to include rockets and missiles, became evident. Research to elucidate significant health effects of… CO/CO2 ratios was low for all but one of the formulations. In general, if the model were to be used in its present state for health risk assessments… (Part 2 of the report covers modeling for health hazard prediction.)
Sharing Research Models: Using Software Engineering Practices for Facilitation
Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.
2011-01-01
Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780
Damsel: A Data Model Storage Library for Exascale Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koziol, Quincey
The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. We will accomplish this through three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community.
Numerical Modeling of Nonlinear Thermodynamics in SMA Wires
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reynolds, D R; Kloucek, P
We present a mathematical model describing the thermodynamic behavior of shape memory alloy wires, as well as a computational technique to solve the resulting system of partial differential equations. The model consists of conservation equations based on a new Helmholtz free energy potential. The computational technique introduces a viscosity-based continuation method, which allows the model to handle dynamic applications where the temporally local behavior of solutions is desired. Computational experiments document that this combination of modeling and solution techniques appropriately predicts the thermally- and stress-induced martensitic phase transitions, as well as the hysteretic behavior and production of latent heat associated with such materials.
NASA Technical Reports Server (NTRS)
Liever, Peter A.; West, Jeffrey S.
2016-01-01
A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.
Computational Features of Flow Modeling in Nanostructured Sensors
NASA Astrophysics Data System (ADS)
Ionescu, Adela; Savu, Dan; Savu, Sorin; Coman, Daniela
2009-04-01
The productivity of welding processes now represents an important economic factor. The technologies being developed by researchers are oriented toward increasing the productivity of welding processes and improving product quality.
Zhu, Tong; Moussa, Ehab M; Witting, Madeleine; Zhou, Deliang; Sinha, Kushal; Hirth, Mario; Gastens, Martin; Shang, Sherwin; Nere, Nandkishor; Somashekar, Shubha Chetan; Alexeenko, Alina; Jameel, Feroz
2018-07-01
Scale-up and technology transfer of lyophilization processes remains a challenge that requires thorough characterization of the laboratory and larger scale lyophilizers. In this study, computational fluid dynamics (CFD) was employed to develop computer-based models of both laboratory and manufacturing scale lyophilizers in order to understand the differences in equipment performance arising from distinct designs. CFD coupled with steady-state heat and mass transfer modeling of the vial was then utilized to study and predict independent variables such as shelf temperature and chamber pressure, and response variables such as product resistance, product temperature and primary drying time for a given formulation. The models were then verified experimentally for the different lyophilizers. Additionally, the models were applied to create and evaluate a design space for a lyophilized product in order to provide justification for the flexibility to operate within a certain range of process parameters without the need for validation. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Septiani, Eka Lutfi; Widiyastuti, W.; Winardi, Sugeng; Machmudah, Siti; Nurtono, Tantular; Kusdianto
2016-02-01
Flame-assisted spray dryers are widely used for large-scale production of nanoparticles because of their capability. A numerical approach is needed to predict combustion and particle production during scale-up and optimization, because experimental observation is difficult and relatively costly. Computational Fluid Dynamics (CFD) can provide the momentum, energy and mass transfer, so CFD is more efficient than experiment in terms of time and cost. Here, two turbulence models, k-ɛ and Large Eddy Simulation, were compared and applied to a flame-assisted spray dryer system. The energy source for particle drying was obtained from combustion between LPG as fuel and air as oxidizer and carrier gas, modelled as non-premixed combustion in the simulation. Silica particles formed from a silica sol precursor were used for the particle modelling. From several comparisons of the results, i.e., flame contour, temperature distribution and particle size distribution, the Large Eddy Simulation turbulence model provides data closest to the experimental results.
Light attenuation in a shallow, turbid reservoir, Lake Houston, Texas
Lee, Roger W.; Rast, Walter
1997-01-01
with an average error between the computed coefficients and measured values of ±13 percent. The model can be useful in computing the thickness of the euphotic zone to determine primary productivity in the reservoir.
C-Language Integrated Production System, Version 6.0
NASA Technical Reports Server (NTRS)
Riley, Gary; Donnell, Brian; Ly, Huyen-Anh Bebe; Ortiz, Chris
1995-01-01
C Language Integrated Production System (CLIPS) computer programs are specifically intended to model human expertise or other knowledge. CLIPS is designed to enable research on, and development and delivery of, artificial intelligence on conventional computers. CLIPS 6.0 provides a cohesive software tool for handling a wide variety of knowledge, with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming: representation of knowledge as heuristics - essentially, rules of thumb that specify a set of actions performed in a given situation. Object-oriented programming: modeling of complex systems comprised of modular components easily reused to model other systems or create new components. Procedural programming: representation of knowledge in ways similar to those of such languages as C, Pascal, Ada, and LISP. The version of CLIPS 6.0 for IBM PC-compatible computers requires DOS v3.3 or later and/or Windows 3.1 or later.
Rapid prototyping modelling in oral and maxillofacial surgery: A two year retrospective study.
Suomalainen, Anni; Stoor, Patricia; Mesimäki, Karri; Kontio, Risto K
2015-12-01
The use of rapid prototyping (RP) models in medicine to construct bony models is increasing. The aim of the study was to evaluate retrospectively the indications for the use of RP models in oral and maxillofacial surgery at Helsinki University Central Hospital during 2009-2010. The computed tomography (CT) examination method used, multislice CT (MSCT) or cone beam CT (CBCT), was also evaluated. In total 114 RP models were fabricated for 102 patients. The mean age of the patients at the time of the production of the model was 50.4 years. The indications for the modelling included malignant lesions (29%), secondary reconstruction (25%), prosthodontic treatment (22%), orthognathic surgery or asymmetry (13%), benign lesions (8%), and TMJ disorders (4%). MSCT examination was used in 92 and CBCT examination in 22 cases. Most of the models (75%) were conventional hard tissue models. Models with a colored tumour or other structure(s) of interest were ordered in 24% of cases. Two of the 114 models were soft tissue models. The main benefit of the models was in treatment planning and in connection with the production of pre-bent plates or custom-made implants. The RP models both facilitate and improve treatment planning and intraoperative efficiency. Keywords: rapid prototyping, radiology, computed tomography, cone beam computed tomography.
Realization of planning design of mechanical manufacturing system by Petri net simulation model
NASA Astrophysics Data System (ADS)
Wu, Yanfang; Wan, Xin; Shi, Weixiang
1991-09-01
Planning design works out a comprehensive long-term plan. In order to guarantee that a mechanical manufacturing system (MMS) obtains maximum economic benefit, it is necessary to carry out a reasonable planning design for the system. First, some principles of planning design for MMS are introduced; problems of production scheduling and their decision rules for computer simulation are presented, together with a method for realizing each production scheduling decision rule in a Petri net model. Second, conflict-resolution rules for conflicts arising while the Petri net runs are given. Third, based on the Petri net model of an MMS that includes both part flow and tool flow, and according to the principle of minimum event time advance, a computer dynamic simulation of the Petri net model, that is, a computer dynamic simulation of the MMS, is realized. Finally, the simulation program is applied to an example, so that a planning design scheme for an MMS can be evaluated effectively.
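A compact sketch of a timed Petri net executed under the minimum-event-time-advance rule reads as follows: fire every enabled transition (reserving its input tokens), then always advance the clock to the earliest pending completion. The one-machine, five-part net below is invented for illustration and is far simpler than a full MMS model with part and tool flows.

    import heapq

    # Timed Petri net: a transition consumes/produces tokens with a fixed delay.
    places = {"raw": 5, "machine": 1, "done": 0}
    transitions = [
        {"name": "process", "take": {"raw": 1, "machine": 1},
         "give": {"done": 1, "machine": 1}, "delay": 4.0},
    ]

    clock, pending = 0.0, []   # pending completions as (finish_time, transition name)
    while True:
        # Fire every enabled transition; input tokens are reserved immediately.
        for t in transitions:
            while all(places[p] >= n for p, n in t["take"].items()):
                for p, n in t["take"].items():
                    places[p] -= n
                heapq.heappush(pending, (clock + t["delay"], t["name"]))
        if not pending:
            break
        # Minimum event time advance: jump straight to the earliest completion.
        clock, name = heapq.heappop(pending)
        t = next(tr for tr in transitions if tr["name"] == name)
        for p, n in t["give"].items():
            places[p] += n

    print(f"all parts done at t = {clock}")   # 20.0 for 5 parts at 4.0 each

Conflict resolution enters where several transitions compete for the same tokens; here the firing order in the list plays that role, whereas the paper's decision rules would choose deliberately.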
[Modeling developmental aspects of sensorimotor control of speech production].
Kröger, B J; Birkholz, P; Neuschaefer-Rube, C
2007-05-01
Detailed knowledge of the neurophysiology of speech acquisition is important for understanding the developmental aspects of speech perception and production and for understanding developmental disorders of speech perception and production. A computer-implemented neural model of sensorimotor control of speech production was developed. The model is capable of demonstrating in detail the neural functions of different cortical areas during speech production. (i) Two sensory and two motor maps or neural representations and the appertaining neural mappings or projections establish the sensorimotor feedback control system. These maps and mappings are already formed and trained during the prelinguistic phase of speech acquisition. (ii) The feedforward sensorimotor control system comprises the lexical map (representations of sounds, syllables, and words of the first language) and the mappings from lexical to sensory and to motor maps. The training of the appertaining mappings forms the linguistic phase of speech acquisition. (iii) Three prelinguistic learning phases, i.e., silent mouthing, quasi-stationary vocalic articulation, and realisation of articulatory protogestures, can be defined on the basis of our simulation studies using the computational neural model. These learning phases can be associated with temporal phases of prelinguistic speech acquisition obtained from natural data. The neural model illuminates the detailed function of specific cortical areas during speech production. In particular, it can be shown that developmental disorders of speech production may result from a delayed or incorrect process within one of the prelinguistic learning phases defined by the neural model.
Zittrain, Jonathan
2008-10-28
Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.
Computed Tomography Scanner Productivity and Entry-Level Models in the Global Market
Almeida, R. M. V. R.
2017-01-01
Objective: This study evaluated the productivity of computed tomography (CT) models and characterized the supply of their simplest (entry-level) models in the world market. Methods: CT exam times were measured in eight health facilities in the state of Rio de Janeiro, Brazil. Exams were divided into six stages: (1) arrival of patient records to the examination room; (2) patient arrival; (3) patient positioning; (4) data input prior to exam; (5) image acquisition; and (6) patient departure. CT exam productivity was calculated by dividing the total weekly working time by the total exam time for each model. Additionally, an internet search identified full-body CT manufacturers and their offered entry-level models. Results: The durations of 111 CT exams were obtained. Differences among average exam times were not large, and they were mainly due to stages not directly related to data acquisition or image reconstruction. The survey identified that most manufacturers offer 2- to 4-slice models for Asia, South America, and Africa, and one offers single-slice models (Asia). In the USA, two manufacturers offer models with fewer than 16 slices. Conclusion: Productivity gains are not linearly related to “slice” number. It is suggested that the use of “shareable platforms” could make CTs cheaper, increasing their availability. PMID:29093804
Space-filling designs for computer experiments: A review
Joseph, V. Roshan
2016-01-29
Improving the quality of a product/process using a computer simulator is a much less expensive option than the real physical testing. However, simulation using computationally intensive computer models can be time consuming and therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given for a recently developed space-filling design called maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
Assessing product image quality for online shopping
NASA Astrophysics Data System (ADS)
Goswami, Anjan; Chung, Sung H.; Chittar, Naren; Islam, Atiq
2012-01-01
Assessing product-image quality is important in the context of online shopping. A high quality image that conveys more information about a product can boost the buyer's confidence and can get more attention. However, the notion of image quality for product-images is not the same as that in other domains. The perception of quality of product-images depends not only on various photographic quality features but also on various high-level features such as clarity of the foreground or goodness of the background. In this paper, we define a notion of product-image quality based on various such features. We conduct a crowd-sourced experiment to collect user judgments on thousands of eBay's images. We formulate a multi-class classification problem for modeling image quality by classifying images into good, fair and poor quality based on the guided perceptual notions from the judges. We also conduct experiments with regression using average crowd-sourced human judgments as the target. We compute a pseudo-regression score as the expected average of the predicted classes, and also compute a score from the regression technique. We design many experiments with various sampling and voting schemes with crowd-sourced data and construct various experimental image quality models. Most of our models have reasonable accuracies (greater than or equal to 70%) on the test data set. We observe that our computed image quality score has a high (0.66) rank correlation with the average votes from the crowd-sourced human judgments.
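The pseudo-regression score described, an expectation over the predicted classes, can be computed directly from a classifier's class probabilities. The probabilities below are made-up outputs for four hypothetical images, and the good/fair/poor coding of 2/1/0 is an assumed convention, not necessarily the paper's.

    import numpy as np

    # Assumed numeric coding of the three quality classes: good, fair, poor.
    class_values = np.array([2.0, 1.0, 0.0])

    # Hypothetical predicted class probabilities for four product images.
    proba = np.array([[0.7, 0.2, 0.1],
                      [0.1, 0.3, 0.6],
                      [0.4, 0.5, 0.1],
                      [0.2, 0.6, 0.2]])

    pseudo_score = proba @ class_values    # expectation over predicted classes
    print(pseudo_score)                    # e.g. 1.6 for the first image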
A method for three-dimensional modeling of wind-shear environments for flight simulator applications
NASA Technical Reports Server (NTRS)
Bray, R. S.
1984-01-01
A computational method for modeling severe wind shears of the type that have been documented during severe convective atmospheric conditions is offered for use in research and training flight simulation. The procedure was developed with the objectives of operational flexibility and minimum computer load. From one to five simple downburst wind models can be configured and located to produce the wind field desired for specific simulated flight scenarios. A definition of related turbulence parameters is offered as an additional product of the computations. The use of the method to model several documented examples of severe wind shear is demonstrated.
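A toy version of the superposition just described sums the horizontal outflow of up to five axisymmetric downburst cells at the aircraft position. The radial profile used here (linear growth to a peak, then inverse-square decay) is a common simple choice, not necessarily the paper's, and the cell parameters are invented.

    import math

    def downburst_wind(x, y, cx, cy, u_max, r_max):
        """Horizontal outflow of one axisymmetric downburst cell (simple profile)."""
        dx, dy = x - cx, y - cy
        r = math.hypot(dx, dy)
        if r < 1e-6:
            return 0.0, 0.0
        # Speed grows linearly to u_max at r_max, then decays with distance.
        speed = u_max * (r / r_max) if r <= r_max else u_max * (r_max / r) ** 2
        return speed * dx / r, speed * dy / r

    # One to five cells, each (center_x, center_y, peak outflow m/s, core radius m).
    cells = [(0.0, 0.0, 18.0, 1200.0), (3000.0, 500.0, 10.0, 900.0)]

    u = v = 0.0
    for c in cells:
        du, dv = downburst_wind(1500.0, 200.0, *c)
        u, v = u + du, v + dv
    print(f"wind at aircraft: u = {u:.1f} m/s, v = {v:.1f} m/s")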
ERIC Educational Resources Information Center
School Science Review, 1985
1985-01-01
Presents 23 experiments, demonstrations, activities, and computer programs in biology, chemistry, and physics. Topics include lead in petrol, production of organic chemicals, reduction of water, enthalpy, X-ray diffraction model, nuclear magnetic resonance spectroscopy, computer simulation for additive mixing of colors, Archimedes Principle, and…
Applications products of aviation forecast models
NASA Technical Reports Server (NTRS)
Garthner, John P.
1988-01-01
A service called the Optimum Path Aircraft Routing System (OPARS) supplies products based on output data from the Naval Oceanographic Global Atmospheric Prediction System (NOGAPS), a model run on a Cyber-205 computer. Temperatures and winds are extracted from the surface to 100 mb, approximately 55,000 ft. Forecast winds are available in six-hour time steps.
Design of a Production System for Cognitive Modeling #1. Technical Report 77-2.
ERIC Educational Resources Information Center
Anderson, John R.; Kline, Paul J.
This report describes several of the design decisions underlying ACT, a production system model of human cognition. ACT can be considered a high level computer programming language as well as a theory of the cognitive mechanisms underlying human information processing. ACT design decisions were based on both psychological and artificial…
NASA Astrophysics Data System (ADS)
Iglesias, A.; Quiroga, S.; Garrote, L.; Cunningham, R.
2012-04-01
This paper provides monetary estimates of the effects of agricultural adaptation to climate change in Europe. The model computes spatial crop productivity changes as a response to climate change, linking biophysical and socioeconomic components. It combines available data sets of crop productivity changes under climate change (Iglesias et al 2011, Ciscar et al 2011), statistical functions of productivity response to water and nitrogen inputs, catchment-level water availability, and environmental policy scenarios. Future global change scenarios are derived from several socio-economic futures of representative concentration pathways and regional climate models. The economic valuation is conducted using the GTAP general equilibrium model. The marginal productivity changes have been used as an input to the economic general equilibrium model in order to analyse the worldwide economic impact of the agricultural changes induced by climate change. The study also includes the analysis of an adaptive capacity index computed using the socio-economic results of GTAP. The results are combined to prioritize agricultural adaptation policy needs in Europe.
The study of early human embryos using interactive 3-dimensional computer reconstructions.
Scarborough, J; Aiton, J F; McLachlan, J C; Smart, S D; Whiten, S C
1997-07-01
Tracings of serial histological sections from 4 human embryos at different Carnegie stages were used to create 3-dimensional (3D) computer models of the developing heart. The models were constructed using commercially available software developed for graphic design and the production of computer generated virtual reality environments. They are available as interactive objects which can be downloaded via the World Wide Web. This simple method of 3D reconstruction offers significant advantages for understanding important events in morphological sciences.
Advances in mixed-integer programming methods for chemical production scheduling.
Velez, Sara; Maravelias, Christos T
2014-01-01
The goal of this paper is to critically review advances in the area of chemical production scheduling over the past three decades and then present two recently proposed solution methods that have led to dramatic computational enhancements. First, we present a general framework and problem classification and discuss modeling and solution methods with an emphasis on mixed-integer programming (MIP) techniques. Second, we present two solution methods: (a) a constraint propagation algorithm that allows us to compute parameters that are then used to tighten MIP scheduling models and (b) a reformulation that introduces new variables, thus leading to effective branching. We also present computational results and an example illustrating how these methods are implemented, as well as the resulting enhancements. We close with a discussion of open research challenges and future research directions.
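The following minimal sketch illustrates the flavor of method (a): a fixed-point constraint propagation over precedence constraints that computes earliest start times, the kind of parameter that can be used to tighten big-M coefficients in a MIP scheduling model. The task data are illustrative and the code is not the authors' algorithm.

```python
# Illustrative task data: processing times and precedence constraints.
tasks = {"A": 3.0, "B": 2.0, "C": 4.0}          # processing times
precedes = {"A": ["B", "C"], "B": ["C"]}         # A before B and C, B before C
release = {"A": 0.0, "B": 0.0, "C": 0.0}         # initial release times

# Fixed-point propagation: earliest start of j >= earliest finish of i
# for every precedence i -> j.
changed = True
while changed:
    changed = False
    for i, succs in precedes.items():
        for j in succs:
            bound = release[i] + tasks[i]
            if bound > release[j]:
                release[j] = bound
                changed = True

print(release)   # {'A': 0.0, 'B': 3.0, 'C': 5.0}
```

The tightened earliest start times can replace loose lower bounds in the MIP, shrinking the feasible region without cutting off any optimal schedule.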
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.; Zawadzki, S.A.
The primary physical/chemical models that form the basis of the FASTGRASS mechanistic computer model for calculating fission-product release from nuclear fuel are described. Calculated results are compared with test data and the major mechanisms affecting the transport of fission products during steady-state and accident conditions are identified.
O'Donnell, Michael
2015-01-01
State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need to model larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher-spatial-resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) showed an approximately 96.6% decrease in computing time. With a single multicore compute node (the bottom result), computing time decreased by 81.8% relative to serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
Effect of turbulence on the disintegration rate of flushable consumer products.
Karadagli, Fatih; Rittmann, Bruce E; McAvoy, Drew C; Richardson, John E
2012-05-01
A previously developed model for the physical disintegration of flushable consumer products is expanded by investigating the effects of turbulence on the rate of physical disintegration. Disintegration experiments were conducted with cardboard tampon applicators at 100, 150, and 200 rotations per minute, corresponding to Reynolds numbers of 25,900, 39,400, and 52,900, respectively, which were estimated by using computational fluid dynamics modeling. The experiments were simulated with the disintegration model to obtain best-fit values of the kinetic and distribution parameters. Computed rate coefficients (k_i) for all solid sizes (i.e., greater than 8, 4 to 8, 2 to 4, and 1 to 2 mm) increased strongly with Reynolds number or rotational speed. Thus, turbulence strongly affected the disintegration rate of flushable products, and the relationship of the k_i values to Reynolds number can be included in mathematical representations of physical disintegration.
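A minimal sketch of how such a rate-versus-Reynolds-number relationship might be fitted, assuming a power-law form and made-up rate coefficients (the paper does not specify this functional form):

```python
import numpy as np
from scipy.optimize import curve_fit

# Reynolds numbers from the experiments and illustrative disintegration
# rate coefficients k_i for one solid-size class (values are invented).
Re = np.array([25900.0, 39400.0, 52900.0])
k  = np.array([0.012, 0.031, 0.065])     # 1/min, made-up values

# One plausible functional form: a power law k = a * Re^b.
def power_law(Re, a, b):
    return a * Re**b

(a, b), _ = curve_fit(power_law, Re, k, p0=(1e-8, 1.5))
print(a, b, power_law(45000.0, a, b))    # fitted parameters and a prediction
```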
Fast Inference with Min-Sum Matrix Product.
Felzenszwalb, Pedro F; McAuley, Julian J
2011-12-01
The MAP inference problem in many graphical models can be solved efficiently using a fast algorithm for computing min-sum products of n × n matrices. The class of models in question includes cyclic and skip-chain models that arise in many applications. Although the worst-case complexity of the min-sum product operation is not known to be much better than O(n^3), an O(n^2.5) expected-time algorithm was recently given, subject to some constraints on the input matrices. In this paper, we give an algorithm that runs in O(n^2 log n) expected time, assuming that the entries in the input matrices are independent samples from a uniform distribution. We also show that two variants of our algorithm are quite fast for inputs that arise in several applications. This leads to significant performance gains over previous methods in applications within computer vision and natural language processing.
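For reference, the baseline operation that the paper's O(n^2 log n) expected-time algorithm accelerates is the naive O(n^3) min-sum (tropical) matrix product; the sketch below is the textbook definition, not the authors' algorithm.

```python
import numpy as np

def min_sum_product(A, B):
    """Naive O(n^3) min-sum (tropical) matrix product:
    C[i, j] = min over k of A[i, k] + B[k, j]."""
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        # Broadcast row i of A against all of B, then minimize over k.
        C[i, :] = np.min(A[i, :, None] + B, axis=0)
    return C

A = np.array([[0.0, 2.0], [5.0, 1.0]])
B = np.array([[1.0, 3.0], [0.0, 4.0]])
print(min_sum_product(A, B))   # [[1. 3.] [1. 5.]]
```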
NASA geometry data exchange specification for computational fluid dynamics (NASA IGES)
NASA Technical Reports Server (NTRS)
Blake, Matthew W.; Kerr, Patricia A.; Thorp, Scott A.; Jou, Jin J.
1994-01-01
This document specifies a subset of an existing product data exchange specification that is widely used in industry and government. The existing document is called the Initial Graphics Exchange Specification. This document, a subset of IGES, is intended for engineers analyzing product performance using tools such as computational fluid dynamics (CFD) software. This document specifies how to define mathematically and exchange the geometric model of an object. The geometry is represented utilizing nonuniform rational B-splines (NURBS) curves and surfaces. Only surface models are represented; no solid model representation is included. This specification does not include most of the other types of product information available in IGES (e.g., no material properties or surface finish properties) and does not provide all the specific file format details of IGES. The data exchange protocol specified in this document is fully conforming to the American National Standard (ANSI) IGES 5.2.
NASA Astrophysics Data System (ADS)
Lavrov, V. V.; Spirin, N. A.
2016-09-01
Advances in modern science and technology are inherently connected with the development, implementation, and widespread use of computer systems based on mathematical modeling. Algorithms and computer systems that solve a range of process tasks at the MES level (Manufacturing Execution Systems, which control industrial processes) of modern automated information systems are gaining practical significance in metallurgy at the largest iron and steel enterprises in Russia. This fact determines the necessity to develop information-modeling systems based on mathematical models that take into account the physics of the process, the basics of heat and mass exchange, the laws of energy conservation, and the peculiarities of the impact of technological and standard characteristics of raw materials on the manufacturing process data. Special attention in this set of operations for metallurgical production is devoted to blast-furnace production, as it consumes the greatest amount of energy, up to 50% of the fuel used in ferrous metallurgy. The paper deals with the requirements, structure, and architecture of the BF Process Engineer's Automated Workstation (AWS), a computer decision-support system of MES level implemented in the ICS of the Blast Furnace Plant at Magnitogorsk Iron and Steel Works. It presents a brief description of the main model subsystems as well as assumptions made in the process of mathematical modelling. Application of the developed system allows the engineering and process staff to analyze online production situations in the blast-furnace plant, to solve a number of process tasks related to control of heat, gas dynamics, and slag conditions of blast-furnace smelting, and to calculate the optimal composition of blast-furnace slag, which eventually results in improved technical and economic performance of blast-furnace production.
1981-06-01
design of manufacturing systems, validation and verification of ICAM modules, integration of ICAM modules, and the orderly transition of ICAM modules into... Function Model of "Manufacture Product" (MFGO); VIII - Composite Function Model of "Design Product" (DESIGNO); IX - Composite Information Model of... User Interface Requirements; and the Architecture of Design. This work was performed during the period of 29 September 1978 through 10
NASA Astrophysics Data System (ADS)
Ponomarev, A. A.; Mamadaliev, R. A.; Semenova, T. V.
2016-10-01
The article presents a brief overview of the current state of computed tomography in oil and gas production in Russia and worldwide. The operation of the Skyscan 1172 computed microtomograph is also described, together with the authors' own examples of its application in solving geological problems.
McLaren, D G; Buchanan, D S; Williams, J E
1987-10-01
A static, deterministic computer model, programmed in Microsoft Basic for IBM PC and Apple Macintosh computers, was developed to calculate production efficiency (cost per kg of product) for nine alternative types of crossbreeding system involving four breeds of swine. The model simulates efficiencies for four purebred and 60 alternative two-, three- and four-breed rotation, rotaterminal, backcross and static cross systems. Crossbreeding systems were defined as including all purebred, crossbred and commercial matings necessary to maintain a total of 10,000 farrowings. Driving variables for the model are mean conception rate at first service and for an 8-wk breeding season, litter size born, preweaning survival rate, postweaning average daily gain, feed-to-gain ratio and carcass backfat. Predictions are computed using breed direct genetic and maternal effects for the four breeds, plus individual, maternal and paternal specific heterosis values, input by the user. Inputs required to calculate the number of females farrowing in each sub-system include the proportion of males and females replaced each breeding cycle in purebred and crossbred populations, the proportion of male and female offspring in seedstock herds that become breeding animals, and the number of females per boar. Inputs required to calculate the efficiency of terminal production (cost-to-product ratio) for each sub-system include breeding herd feed intake, gilt development costs, feed costs and labor and overhead costs. Crossbreeding system efficiency is calculated as the weighted average of sub-system cost-to-product ratio values, weighting by the number of females farrowing in each sub-system.
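A minimal sketch of the final aggregation step described above, with illustrative numbers: system efficiency computed as the average of sub-system cost-to-product ratios, weighted by the number of females farrowing in each sub-system.

```python
# Illustrative sub-systems for a hypothetical crossbreeding system totaling
# 10,000 farrowings; the numbers are not from the published model.
subsystems = [
    # (females farrowing, cost per kg of product)
    (8000, 1.42),   # commercial terminal matings
    (1500, 1.60),   # crossbred-female multiplier herd
    (500,  1.85),   # purebred nucleus herd
]

total = sum(n for n, _ in subsystems)
# Weighted average cost-to-product ratio across all sub-systems.
efficiency = sum(n * ratio for n, ratio in subsystems) / total
print(round(efficiency, 3))   # system-level cost per kg of product
```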
Damsel: A Data Model Storage Library for Exascale Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok; Liao, Wei-keng
Computational science applications have been described as having one of seven motifs (the “seven dwarfs”), each having a particular pattern of computation and communication. From a storage and I/O perspective, these applications can also be grouped into a number of data model motifs describing the way data is organized and accessed during simulation, analysis, and visualization. Major storage data models developed in the 1990s, such as the Network Common Data Format (netCDF) and Hierarchical Data Format (HDF) projects, created support for more complex data models. Development of both netCDF and HDF5 was influenced by multi-dimensional dataset storage requirements, but their access models and formats were designed with sequential storage in mind (e.g., a POSIX I/O model). Although these and other high-level I/O libraries have had a beneficial impact on large parallel applications, they do not always attain a high percentage of peak I/O performance due to fundamental design limitations, and they do not address the full range of current and future computational science data models. The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. The project consists of three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community. The product of this project, the Damsel library, is openly available for download from http://cucis.ece.northwestern.edu/projects/DAMSEL. Several case studies and an application programming interface reference are also available to assist new users in learning to use the library.
Improving waterfowl production estimates: Results of a test in the prairie pothole region
Arnold, P.M.; Cowardin, L.M.
1985-01-01
The U.S. Fish and Wildlife Service, in an effort to improve and standardize methods for estimating waterfowl production, tested a new technique in the four-county Arrowwood Wetland Management District (WMD) for three years (1982-1984). On 14 randomly selected 10.36-km² plots, upland and wetland habitat was mapped, classified, and digitized. Waterfowl breeding pairs were counted twice each year, and the proportion of wetland basins containing water was determined. Pair numbers and habitat conditions were entered into a computer model developed by the Northern Prairie Wildlife Research Center. That model estimates production on small federally owned wildlife tracts, federal wetland easements, and private land. Results indicate that production estimates were most accurate for mallards (Anas platyrhynchos), the species for which the computer model and data base were originally designed. Predictions for the pintail (Anas acuta), gadwall (A. strepera), blue-winged teal (A. discors), and northern shoveler (A. clypeata) were believed to be less accurate. Modeling breeding-period dynamics of a waterfowl species and making credible production estimates for a geographic area are possible if the data used in the model are adequate. The process of modeling the breeding period of a species aids in locating areas of insufficient biological knowledge. This process will help direct future research efforts and permit more efficient gathering of field data.
Automatic control algorithm effects on energy production
NASA Technical Reports Server (NTRS)
Mcnerney, G. M.
1981-01-01
A computer model was developed using actual wind time series and turbine performance data to simulate the power produced by the Sandia 17-m VAWT operating in automatic control. The model was used to investigate the influence of starting algorithms on annual energy production. The results indicate that, depending on turbine and local wind characteristics, a bad choice of a control algorithm can significantly reduce overall energy production. The model can be used to select control algorithms and threshold parameters that maximize long term energy production. The results from local site and turbine characteristics were generalized to obtain general guidelines for control algorithm design.
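The sketch below illustrates the idea of the study, not the Sandia model itself: a start/stop control algorithm with a threshold wind speed is applied to a synthetic wind time series and a toy power curve, so that annual energy production can be compared across threshold choices. All numbers are illustrative, not Sandia 17-m VAWT data.

```python
import numpy as np

rng = np.random.default_rng(0)
wind = np.clip(rng.normal(7.0, 3.0, 8760), 0.0, None)   # hourly wind, m/s

def power(v):
    # Toy power curve: cubic between cut-in and rated, flat to cut-out.
    return np.where(v < 4.0, 0.0,
           np.where(v < 12.0, 20.0 * ((v - 4.0) / 8.0) ** 3,
           np.where(v < 25.0, 20.0, 0.0)))               # kW

def annual_energy(start_threshold):
    running = False
    energy = 0.0
    for v in wind:
        if not running and v >= start_threshold:
            running = True                # start when wind exceeds threshold
        elif running and v < 4.0:
            running = False               # stop below cut-in
        if running:
            energy += power(v)            # kWh per hourly step
    return energy

for thr in (4.0, 5.0, 6.0):
    print(thr, round(annual_energy(thr)))  # energy vs. starting threshold
```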
NASA Technical Reports Server (NTRS)
Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.
2016-01-01
A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.
Wauchope, R Don; Ahuja, Lajpat R; Arnold, Jeffrey G; Bingner, Ron; Lowrance, Richard; van Genuchten, Martinus T; Adams, Larry D
2003-01-01
We present an overview of USDA Agricultural Research Service (ARS) computer models and databases related to pest-management science, emphasizing current developments in environmental risk assessment and management simulation models. The ARS has a unique national interdisciplinary team of researchers in surface and sub-surface hydrology, soil and plant science, systems analysis and pesticide science, who have networked to develop empirical and mechanistic computer models describing the behavior of pests, pest responses to controls and the environmental impact of pest-control methods. Historically, much of this work has been in support of production agriculture and in support of the conservation programs of our 'action agency' sister, the Natural Resources Conservation Service (formerly the Soil Conservation Service). Because we are a public agency, our software/database products are generally offered without cost, unless they are developed in cooperation with a private-sector cooperator. Because ARS is a basic and applied research organization, with development of new science as our highest priority, these products tend to be offered on an 'as-is' basis with limited user support except within cooperative R&D relationships with other scientists. However, rapid changes in the technology for information analysis and communication continually challenge our way of doing business.
Modeling Code Is Helping Cleveland Develop New Products
NASA Technical Reports Server (NTRS)
1998-01-01
Master Builders, Inc., is a 350-person company in Cleveland, Ohio, that develops and markets specialty chemicals for the construction industry. Developing new products involves creating many potential samples and running numerous tests to characterize the samples' performance. Company engineers enlisted NASA's help to replace cumbersome physical testing with computer modeling of the samples' behavior. Since the NASA Lewis Research Center's Structures Division develops mathematical models and associated computation tools to analyze the deformation and failure of composite materials, its researchers began a two-phase effort to modify Lewis' Integrated Composite Analyzer (ICAN) software for Master Builders' use. Phase I has been completed, and Master Builders is pleased with the results. The company is now working to begin implementation of Phase II.
Technology for the product and process data base
NASA Technical Reports Server (NTRS)
Barnes, R. D.
1984-01-01
The computerized product and process data base is increasingly recognized as the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer-integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap, with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces, and comprehensive tools for managing the resulting computerized product definition and process data base.
A Multi-Fidelity Surrogate Model for the Equation of State for Mixtures of Real Gases
NASA Astrophysics Data System (ADS)
Ouellet, Frederick; Park, Chanyoung; Koneru, Rahul; Balachandar, S.; Rollin, Bertrand
2017-11-01
The explosive dispersal of particles is a complex multiphase and multi-species fluid flow problem. In these flows, the products of detonated explosives must be treated as real gases while the ideal gas equation of state is used for the ambient air. As the products expand outward, they mix with the air and create a region where both state equations must be satisfied. One of the most accurate, yet expensive, methods to handle this problem is an algorithm that iterates between both state equations until both pressure and thermal equilibrium are achieved inside of each computational cell. This work creates a multi-fidelity surrogate model to replace this process. This is achieved by using a Kriging model to produce a curve fit which interpolates selected data from the iterative algorithm. The surrogate is optimized for computing speed and model accuracy by varying the number of sampling points chosen to construct the model. The performance of the surrogate with respect to the iterative method is tested in simulations using a finite volume code. The model's computational speed and accuracy are analyzed to show the benefits of this novel approach. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA00023.
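A minimal sketch of the surrogate idea, using scikit-learn's Gaussian-process regressor as a stand-in for a Kriging implementation; `expensive_solver` is a placeholder for the iterative pressure/thermal equilibrium solve (the real model is multi-fidelity and higher-dimensional).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_solver(x):
    # Placeholder standing in for the iterative equilibrium computation.
    return np.sin(3.0 * x) + 0.5 * x

X_train = np.linspace(0.0, 2.0, 8)[:, None]      # chosen sampling points
y_train = expensive_solver(X_train).ravel()

# Kriging-style interpolation through the sampled solver outputs.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
gp.fit(X_train, y_train)

X_test = np.linspace(0.0, 2.0, 5)[:, None]
y_pred, y_std = gp.predict(X_test, return_std=True)
print(y_pred, y_std)   # surrogate estimate and its uncertainty
```

Varying the number of training points trades construction and evaluation cost against accuracy, which mirrors the optimization described in the abstract.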
Rapid prototyping modelling in oral and maxillofacial surgery: A two year retrospective study
Stoor, Patricia; Mesimäki, Karri; Kontio, Risto K.
2015-01-01
Background: The use of rapid prototyping (RP) models in medicine to construct bony models is increasing. Material and Methods: The aim of the study was to evaluate retrospectively the indications for the use of RP models in oral and maxillofacial surgery at Helsinki University Central Hospital during 2009-2010. The computed tomography (CT) examination method used, multislice CT (MSCT) or cone beam CT (CBCT), was also evaluated. Results: In total, 114 RP models were fabricated for 102 patients. The mean age of the patients at the time of the production of the model was 50.4 years. The indications for the modelling included malignant lesions (29%), secondary reconstruction (25%), prosthodontic treatment (22%), orthognathic surgery or asymmetry (13%), benign lesions (8%), and TMJ disorders (4%). MSCT examination was used in 92 cases and CBCT examination in 22 cases. Most of the models (75%) were conventional hard tissue models. Models with colored tumour or other structure(s) of interest were ordered in 24% of cases. Two of the 114 models were soft tissue models. Conclusions: The main benefit of the models was in treatment planning and in connection with the production of pre-bent plates or custom-made implants. The RP models both facilitate and improve treatment planning and intraoperative efficiency. Key words: Rapid prototyping, radiology, computed tomography, cone beam computed tomography. PMID:26644837
Computational fluid model incorporating liver metabolic activities in perfusion bioreactor.
Hsu, Myat Noe; Tan, Guo-Dong Sean; Tania, Marshella; Birgersson, Erik; Leo, Hwa Liang
2014-05-01
The importance of in vitro hepatotoxicity testing during early stages of drug development in the pharmaceutical industry demands effective bioreactor models with optimized conditions. While perfusion bioreactors have been proven to enhance mass transfer and liver-specific functions over a long period of culture, the flow-induced shear stress has less desirable effects on the hepatocytes' liver-specific functions. In this paper, a two-dimensional human liver hepatocellular carcinoma (HepG2) cell culture flow model, under a specified flow rate of 0.03 mL/min, was investigated. Besides computing the distribution of shear stresses acting on the surface of the cell culture, our numerical model also investigated the cell culture's metabolic functions, such as oxygen consumption, glucose consumption, glutamine consumption, and ammonia production, to provide a fuller analysis of the interaction among the various metabolites within the cell culture. The computed albumin production of our 2D flow model was verified against the experimental HepG2 culture results obtained over 3 days of culture. The results showed good agreement between our experimental data and numerical predictions, with corresponding cumulative albumin production of 2.9 × 10⁻⁵ and 3.0 × 10⁻⁵ mol/m³, respectively. The results are of importance in making rational design choices for development of future bioreactors with more complex geometries. © 2013 Wiley Periodicals, Inc.
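A minimal sketch of the kind of metabolite balance such a model solves: Michaelis-Menten consumption of two substrates with a perfusion feed, integrated with SciPy. All kinetic constants are invented, and this lumped ODE is only a caricature of the paper's spatially resolved 2D computation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative constants (not the paper's values).
V_max = np.array([2.0e-4, 5.0e-4])     # max uptake rates, mol/(m^3 s)
K_m   = np.array([0.5, 0.05])          # half-saturation constants, mol/m^3
D     = 1.0e-4                          # perfusion dilution rate, 1/s
C_in  = np.array([5.0, 0.2])           # feed concentrations (glucose, O2)

def rhs(t, C):
    uptake = V_max * C / (K_m + C)      # Michaelis-Menten kinetics
    return D * (C_in - C) - uptake      # feed/washout minus consumption

sol = solve_ivp(rhs, (0.0, 3 * 24 * 3600), C_in, rtol=1e-8)
print(sol.y[:, -1])    # substrate concentrations after 3 days of culture
```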
Recent technology products from Space Human Factors research
NASA Technical Reports Server (NTRS)
Jenkins, James P.
1991-01-01
The goals of the NASA Space Human Factors program and the research carried out concerning human factors are discussed, with emphasis given to the development of human performance models, data, and tools. The major products from this program are described, including the Laser Anthropometric Mapping System; a model of the human body for evaluating the kinematics and dynamics of human motion and strength in the microgravity environment; an operational experience database for verifying and validating the data repository of manned space flights; the Operational Experience Database Taxonomy; and a human-computer interaction laboratory whose products include display software and requirements as well as guideline documents and standards for human-computer interaction applications. Special attention is given to the 'Convolvotron', a prototype version of a signal processor for synthesizing head-related transfer functions.
Computer Models Used to Support Cleanup Decision Making at Hazardous and Radioactive Waste Sites
This report, a product of the Interagency Environmental Pathway Modeling Workgroup, will help bring a uniform approach to solving environmental modeling problems common to site remediation and restoration efforts.
NASA Technical Reports Server (NTRS)
Sozen, Mehmet
2003-01-01
In what follows, the model used for combustion of liquid hydrogen (LH2) with liquid oxygen (LOX) under the chemical-equilibrium assumption is described, together with the novel computational method developed for determining the equilibrium composition and temperature of the combustion products by application of the first and second laws of thermodynamics. The modular FORTRAN code, developed as a subroutine that can be incorporated into any flow network code with little effort, has been successfully implemented in GFSSP, as preliminary runs indicate. The code provides the capability of modeling the heat transfer rate to the coolants for parametric analysis in system design.
A Multi-Fidelity Surrogate Model for Handling Real Gas Equations of State
NASA Astrophysics Data System (ADS)
Ouellet, Frederick; Park, Chanyoung; Rollin, Bertrand; Balachandar, S. "Bala"
2016-11-01
The explosive dispersal of particles is an example of a complex multiphase and multi-species fluid flow problem. This problem has many engineering applications including particle-laden explosives. In these flows, the detonation products of the explosive cannot be treated as a perfect gas so a real gas equation of state is used to close the governing equations (unlike air, which uses the ideal gas equation for closure). As the products expand outward from the detonation point, they mix with ambient air and create a mixing region where both of the state equations must be satisfied. One of the more accurate, yet computationally expensive, methods to deal with this is a scheme that iterates between the two equations of state until pressure and thermal equilibrium are achieved inside of each computational cell. This work strives to create a multi-fidelity surrogate model of this process. We then study the performance of the model with respect to the iterative method by performing both gas-only and particle laden flow simulations using an Eulerian-Lagrangian approach with a finite volume code. Specifically, the model's (i) computational speed, (ii) memory requirements and (iii) computational accuracy are analyzed to show the benefits of this novel modeling approach. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA00023.
Structural Acoustic Physics Based Modeling of Curved Composite Shells
2017-09-19
Results show that the finite element computational models accurately match analytical calculations, and that the composite material studied in this... Subject terms: Finite Element Analysis, Structural Acoustics, Fiber-Reinforced Composites, Physics-Based Modeling.
Ihekwaba, Adaoha E C; Mura, Ivan; Walshaw, John; Peck, Michael W; Barker, Gary C
2016-11-01
Clostridium botulinum produces botulinum neurotoxins (BoNTs), highly potent substances responsible for botulism. Currently, mathematical models of C. botulinum growth and toxigenesis are largely aimed at risk assessment and do not include explicit genetic information beyond group level but integrate many component processes, such as signalling, membrane permeability and metabolic activity. In this paper we present a scheme for modelling neurotoxin production in C. botulinum Group I type A1, based on the integration of diverse information coming from experimental results available in the literature. Experiments show that production of BoNTs depends on the growth-phase and is under the control of positive and negative regulatory elements at the intracellular level. Toxins are released as large protein complexes and are associated with non-toxic components. Here, we systematically review and integrate those regulatory elements previously described in the literature for C. botulinum Group I type A1 into a population dynamics model, to build the very first computational model of toxin production at the molecular level. We conduct a validation of our model against several items of published experimental data for different wild type and mutant strains of C. botulinum Group I type A1. The result of this process underscores the potential of mathematical modelling at the cellular level, as a means of creating opportunities in developing new strategies that could be used to prevent botulism; and potentially contribute to improved methods for the production of toxin that is used for therapeutics.
Modelling and Optimization of Nannochloropsis and Chlorella Growth for Various Locations and Seasons
NASA Astrophysics Data System (ADS)
Gharagozloo, P. E.
2014-12-01
Efficient production of algal biofuels could reduce dependence on foreign oil while providing domestic renewable energy. Algae-based biofuels are attractive for their large oil-yield potential and their reduced land-use and natural-resource requirements compared to terrestrial energy crops. Important factors controlling algal-lipid productivity include temperature, nutrient availability, salinity, pH, and the light-to-biomass conversion rate. Computational approaches allow for inexpensive predictions of algae-growth kinetics for various bioreactor sizes and geometries without multiple, expensive measurement systems. In this work, we parameterize our physics-based computational algae growth model for the marine Nannochloropsis oceanica and freshwater Chlorella species. We then compare modelling results with experiments conducted in identical raceway ponds at six geographical locations in the United States (Hawaii, California, Arizona, Ohio, Georgia, and Florida) and in three seasons through the Algae Testbed Public Private Partnership - Unified Field Studies. Results show that the computational model effectively predicts algae growth in systems across varying environments and identifies the causes of reductions in algal productivity. The model is then used to identify improvements to the cultivation system to produce higher biomass yields. This model could be used to study the effects of scale-up, including the effects of predation, depth decay of light (light extinction), and optimized nutrient and CO2 delivery. As more multifactorial data are accumulated for a variety of algal strains, the model could be used to select appropriate algal species for various geographic and climatic locations and seasons. Applying the model facilitates optimization of pond designs based on location and season.
Biomanufacturing: a US-China National Science Foundation-sponsored workshop.
Sun, Wei; Yan, Yongnian; Lin, Feng; Spector, Myron
2006-05-01
A recent US-China National Science Foundation-sponsored workshop on biomanufacturing reviewed the state-of-the-art of an array of new technologies for producing scaffolds for tissue engineering, providing precision multi-scale control of material, architecture, and cells. One broad category of such techniques has been termed solid freeform fabrication. The techniques in this category include: stereolithography, selected laser sintering, single- and multiple-nozzle deposition and fused deposition modeling, and three-dimensional printing. The precise and repetitive placement of material and cells in a three-dimensional construct at the micrometer length scale demands computer control. These novel computer-controlled scaffold production techniques, when coupled with computer-based imaging and structural modeling methods for the production of the templates for the scaffolds, define an emerging field of computer-aided tissue engineering. In formulating the questions that remain to be answered and discussing the knowledge required to further advance the field, the Workshop provided a basis for recommendations for future work.
A stellar audit: the computation of encounter rates for 47 Tucanae and omega Centauri
NASA Astrophysics Data System (ADS)
Davies, Melvyn B.; Benz, Willy
1995-10-01
Using King-Mitchie models, we compute encounter rates between the various stellar species in the globular clusters omega Cen and 47 Tuc. We also compute event rates for encounters between single stars and a population of primordial binaries. Using these rates, and what we have learnt from hydrodynamical simulations of encounters performed earlier, we compute the production rates of objects such as low-mass X-ray binaries (LMXBs), smothered neutron stars and blue stragglers (massive main-sequence stars). If 10 per cent of the stars are contained in primordial binaries, the production rate of interesting objects from encounters involving these binaries is as large as that from encounters between single stars. For example, encounters involving binaries produce a significant number of blue stragglers in both globular cluster models. The number of smothered neutron stars may exceed the number of LMXBs by a factor of 5-20, which may help to explain why millisecond pulsars are observed to outnumber LMXBs in globular clusters.
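For orientation, a minimal sketch of a single-single encounter rate n·σ·v with a gravitationally focused cross-section; the cluster parameters are illustrative, not the King-Mitchie model values used in the paper.

```python
import numpy as np

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30       # kg
R_sun = 6.957e8        # m

# Illustrative core conditions for a dense globular cluster.
n = 1.0e5 / (3.086e16) ** 3        # stars per m^3 (1e5 pc^-3)
v = 10.0e3                          # velocity dispersion, m/s
r_min = 3.0 * R_sun                 # closest approach counted as an encounter
M_tot = 2.0 * M_sun                 # combined mass of the two stars

# Cross-section with gravitational focusing: pi*r_min^2*(1 + v_esc^2/v^2),
# where v_esc^2 = 2*G*M_tot/r_min.
sigma = np.pi * r_min**2 * (1.0 + 2.0 * G * M_tot / (r_min * v**2))
rate = n * sigma * v                # encounters per second per star
print(rate * 3.156e16)              # encounters per Gyr per star
```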
Thermodynamics of quasideterministic digital computers
NASA Astrophysics Data System (ADS)
Chu, Dominique
2018-02-01
A central result of stochastic thermodynamics is that irreversible state transitions of Markovian systems entail a cost in terms of an infinite entropy production. A corollary of this is that strictly deterministic computation is not possible. Using a thermodynamically consistent model, we show that quasideterministic computation can be achieved at finite, and indeed modest, cost with accuracies that are indistinguishable from deterministic behavior for all practical purposes. Concretely, we consider the entropy production of stochastic (Markovian) systems that behave like AND and NOT gates. Combinations of these gates can implement any logical function. We require that these gates return the correct result with a probability that is very close to 1 and, additionally, that they do so within finite time. The central component of the model is a machine that can read and write binary tapes. We find that the error probability of the computation of these gates falls as a power of the system size, whereas the cost only increases linearly with the system size.
Andreozzi, Stefano; Chakrabarti, Anirikh; Soh, Keng Cher; Burgard, Anthony; Yang, Tae Hoon; Van Dien, Stephen; Miskovic, Ljubisa; Hatzimanikatis, Vassily
2016-05-01
Rational metabolic engineering methods are increasingly employed in designing the commercially viable processes for the production of chemicals relevant to pharmaceutical, biotechnology, and food and beverage industries. With the growing availability of omics data and of methodologies capable to integrate the available data into models, mathematical modeling and computational analysis are becoming important in designing recombinant cellular organisms and optimizing cell performance with respect to desired criteria. In this contribution, we used the computational framework ORACLE (Optimization and Risk Analysis of Complex Living Entities) to analyze the physiology of recombinant Escherichia coli producing 1,4-butanediol (BDO) and to identify potential strategies for improved production of BDO. The framework allowed us to integrate data across multiple levels and to construct a population of large-scale kinetic models despite the lack of available information about kinetic properties of every enzyme in the metabolic pathways. We analyzed these models and we found that the enzymes that primarily control the fluxes leading to BDO production are part of central glycolysis, the lower branch of tricarboxylic acid (TCA) cycle and the novel BDO production route. Interestingly, among the enzymes between the glucose uptake and the BDO pathway, the enzymes belonging to the lower branch of TCA cycle have been identified as the most important for improving BDO production and yield. We also quantified the effects of changes of the target enzymes on other intracellular states like energy charge, cofactor levels, redox state, cellular growth, and byproduct formation. Independent earlier experiments on this strain confirmed that the computationally obtained conclusions are consistent with the experimentally tested designs, and the findings of the present studies can provide guidance for future work on strain improvement. Overall, these studies demonstrate the potential and effectiveness of ORACLE for the accelerated design of microbial cell factories. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Pavlichin, Dmitri S.; Mabuchi, Hideo
2014-06-01
Nanoscale integrated photonic devices and circuits offer a path to ultra-low-power computation at the few-photon level. Here we propose an optical circuit that performs a ubiquitous operation: the controlled, random-access readout of a collection of stored memory phases or, equivalently, the computation of the inner product of a vector of phases with a binary "selector" vector, where the arithmetic is done modulo 2π and the result is encoded in the phase of a coherent field. This circuit, a collection of cascaded interferometers driven by a coherent input field, demonstrates the use of coherence as a computational resource, and the use of recently developed mathematical tools for modeling optical circuits with many coupled parts. The construction extends in a straightforward way to the computation of matrix-vector and matrix-matrix products and, with the inclusion of an optical feedback loop, to the computation of a "weighted" readout of stored memory phases. We note some applications of these circuits for error correction and for computing tasks requiring fast vector inner products, e.g. statistical classification and some machine learning algorithms.
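The operation itself is simple to state; the sketch below computes the readout the circuit would encode, the inner product of stored phases with a binary selector vector modulo 2π (the optics is the paper's contribution, not this arithmetic). The values are illustrative.

```python
import numpy as np

phases = np.array([0.7, 2.1, 5.5, 1.3])      # stored memory phases (rad)
selector = np.array([1, 0, 1, 1])            # binary "selector" vector

# Inner product of phases with the selector, reduced modulo 2*pi.
readout = np.dot(phases, selector) % (2.0 * np.pi)
print(readout)   # phase carried by the output coherent field
```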
A flexible tool for diagnosing water, energy, and entropy budgets in climate models
NASA Astrophysics Data System (ADS)
Lembo, Valerio; Lucarini, Valerio
2017-04-01
We have developed new, flexible software for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent, and sensible energy fluxes, with the requirement that the variable names are in agreement with the Climate and Forecast (CF) conventions for the production of NetCDF datasets. Annual mean maps, meridional sections, and time series are computed by means of the Climate Data Operators (CDO) collection of command-line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and oceans. Depending on the user's choice, the program also calls MATLAB to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program for inclusion in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.
NASA Astrophysics Data System (ADS)
Siegel, J.; Siegel, Edward Carl-Ludwig
2011-03-01
Cook-Levin computational-"complexity" (C-C) algorithmic-equivalence reduction-theorem reducibility equivalence to renormalization-(semi)-group phase-transitions critical-phenomena statistical-physics universality-classes fixed-points is exploited with a Gauss modular/clock-arithmetic/model congruences = signal × noise PRODUCT reinterpretation. Siegel-Baez FUZZYICS=CATEGORYICS (SON of "TRIZ"): Category-Semantics (C-S) tabular list-format truth-table matrix analytics predicts and implements "noise"-induced phase-transitions (NITs) to accelerate versus decelerate Harel [Algorithmics (1987)]-Sipser [Intro. Theory of Computation (1997)] algorithmic C-C: "NIT-picking" to optimize optimization-problems optimally (OOPO). Versus iso-"noise" power-spectrum quantitative-only amplitude/magnitude-only variation stochastic-resonance, this "NIT-picking" is "noise" power-spectrum QUALitative-type variation via quantitative critical-exponents variation. Computer-"science" algorithmic C-C models (Turing machine, finite-state models/automata) are identified as early-days once-workable but now only limiting crutches impeding latter-day new insights.
Multi-objective group scheduling optimization integrated with preventive maintenance
NASA Astrophysics Data System (ADS)
Liao, Wenzhu; Zhang, Xiufang; Jiang, Min
2017-11-01
This article proposes a single-machine-based integration model to meet the requirements of production scheduling and preventive maintenance in group production. To describe production for identical/similar and different jobs, the integrated model considers learning and forgetting effects. Based on machine degradation, the deterioration effect is also considered. Moreover, perfect maintenance and minimal repair are adopted in the integrated model. A multi-objective formulation minimizing total completion time and maintenance cost is adopted to meet the dual requirements of delivery date and cost. Finally, a genetic algorithm is developed to solve the optimization model, and the computational results demonstrate that the integrated model is effective and reliable.
NASA Astrophysics Data System (ADS)
Neverov, V. V.; Kozhukhov, Y. V.; Yablokov, A. M.; Lebedev, A. A.
2017-08-01
Nowadays, optimization using computational fluid dynamics (CFD) plays an important role in the design process of turbomachines. However, for successful and productive optimization it is necessary to define a simulation model correctly and rationally. The article deals with the choice of grid and computational-domain parameters for optimization of centrifugal compressor impellers using computational fluid dynamics. Searching for and applying optimal parameters of the grid model, the computational domain, and the solver settings allows engineers to carry out high-accuracy modelling and to use computational capability effectively. The presented research was conducted using the Numeca Fine/Turbo package with the Spalart-Allmaras and Shear Stress Transport turbulence models. Two radial impellers were investigated: a high-pressure impeller at ψT=0.71 and a low-pressure impeller at ψT=0.43. The following parameters of the computational model were considered: the location of the inlet and outlet boundaries, the type of mesh topology, the mesh size, and the mesh parameter y+. Results of the investigation demonstrate that the choice of optimal parameters leads to a significant reduction of computational time. Optimal parameters, in comparison with non-optimal but visually similar parameters, can reduce the calculation time by up to a factor of 4. Besides, it is established that some parameters have a major impact on the result of modelling.
Multiple regression technique for Pth-degree polynomials with and without linear cross products
NASA Technical Reports Server (NTRS)
Davis, J. W.
1973-01-01
A multiple regression technique was developed by which the nonlinear behavior of specified independent variables can be related to a given dependent variable. The polynomial expression can be of Pth degree and can incorporate N independent variables. Two cases are treated such that mathematical models can be studied both with and without linear cross products. The resulting surface fits can be used to summarize trends for a given phenomenon and provide a mathematical relationship for subsequent analysis. To implement this technique, separate computer programs were developed for the case without linear cross products and for the case incorporating such cross products; these programs evaluate the various constants in the model regression equation. In addition, the significance of the estimated regression equation is considered, and the standard deviation, the F statistic, the maximum absolute percent error, and the average of the absolute values of the percent error are evaluated. The computer programs and their manner of utilization are described. Sample problems are included to illustrate the use and capability of the technique; they show the output formats and typical plots comparing computer results to each set of input data.
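A minimal sketch of the two cases, assuming ordinary least squares on a design matrix built from powers of each variable with an optional linear cross-product term; the data and degree are illustrative, and this is not the original programs' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)
y = 1.0 + 2.0 * x1 - 3.0 * x2**2 + 0.5 * x1 * x2 + rng.normal(0, 0.01, 50)

def design_matrix(x1, x2, P, cross_products=False):
    cols = [np.ones_like(x1)]                # intercept
    for p in range(1, P + 1):                # powers of each variable
        cols += [x1**p, x2**p]
    if cross_products:
        cols.append(x1 * x2)                 # linear cross-product term
    return np.column_stack(cols)

for cross in (False, True):
    X = design_matrix(x1, x2, 2, cross)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(cross, np.round(coef, 3))          # fitted regression constants
```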
NASA Astrophysics Data System (ADS)
Mohsin, Muhammad; Mu, Yongtong; Memon, Aamir Mahmood; Kalhoro, Muhammad Talib; Shah, Syed Baber Hussain
2017-07-01
Pakistani marine waters are under an open-access regime. Due to poor management and policy implications, blind fishing continues, which may result in ecological as well as economic losses. Thus, it is of utmost importance to estimate fishery resources before harvesting. In this study, catch and effort data, 1996-2009, for the Kiddi shrimp Parapenaeopsis stylifera fishery in Pakistani marine waters were analyzed using specialized fishery software in order to assess the stock status of this commercially important shrimp. Maximum, minimum, and average capture production of P. stylifera were observed as 15 912 metric tons (mt) (1997), 9 438 mt (2009), and 11 667 mt/a, respectively. Two stock assessment tools, CEDA (catch and effort data analysis) and ASPIC (a stock production model incorporating covariates), were used to compute the MSY (maximum sustainable yield) of this organism. In CEDA, three surplus production models, Fox, Schaefer, and Pella-Tomlinson, along with three error assumptions, log, log-normal, and gamma, were used. For initial proportion (IP) 0.8, the Fox model computed MSY as 6 858 mt (CV=0.204, R²=0.709) and 7 384 mt (CV=0.149, R²=0.72) for the log and log-normal error assumptions, respectively. Here, the gamma error assumption produced minimization failure. Estimated MSY using the Schaefer and Pella-Tomlinson models remained the same across the log, log-normal, and gamma error assumptions, i.e. 7 083 mt, 8 209 mt, and 7 242 mt, correspondingly. The Schaefer results showed the highest goodness-of-fit R² value (0.712). ASPIC computed the MSY, CV, R², F_MSY and B_MSY parameters for the Fox model as 7 219 mt, 0.142, 0.872, 0.111, and 65 280, while for the Logistic model the computed values were 7 720 mt, 0.148, 0.868, 0.107, and 72 110, correspondingly. The results obtained show that P. stylifera has been overexploited. Immediate steps are needed to conserve this fishery resource for the future, and research on other species of commercial importance is urgently needed.
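For reference, the standard Schaefer surplus-production quantities that such software reports follow directly from the intrinsic growth rate r and carrying capacity K; the values below are illustrative, not the fitted P. stylifera parameters.

```python
# Schaefer surplus-production reference points, with illustrative inputs.
r = 0.45        # intrinsic population growth rate, 1/year
K = 65000.0     # carrying capacity, metric tons

MSY = r * K / 4.0       # maximum sustainable yield, mt/year
B_MSY = K / 2.0         # biomass giving MSY, mt
F_MSY = r / 2.0         # fishing mortality at MSY, 1/year
print(MSY, B_MSY, F_MSY)
```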
Winter Simulation Conference, Miami Beach, Fla., December 4-6, 1978, Proceedings. Volumes 1 & 2
NASA Technical Reports Server (NTRS)
Highland, H. J. (Editor); Nielsen, N. R.; Hull, L. G.
1978-01-01
The papers report on the various aspects of simulation such as random variate generation, simulation optimization, ranking and selection of alternatives, model management, documentation, data bases, and instructional methods. Simulation studies in a wide variety of fields are described, including system design and scheduling, government and social systems, agriculture, computer systems, the military, transportation, corporate planning, ecosystems, health care, manufacturing and industrial systems, computer networks, education, energy, production planning and control, financial models, behavioral models, information systems, and inventory control.
A STELLA model to estimate water and nitrogen dynamics in a short-rotation woody crop plantation
Ying Ouyang; Jiaen Zhang; Theodor D. Leininger; Brent R. Frey
2015-01-01
Although short-rotation woody crop biomass production technology has demonstrated a promising potential to supply feedstocks for bioenergy production, the water and nutrient processes in the woody crop plantation ecosystem are poorly understood. In this study, a computer model was developed to estimate the dynamics of water and nitrogen (N) species (e.g., NH4…
Advanced Collaborative Environments Supporting Systems Integration and Design
2003-03-01
These environments allow multiple individuals to concurrently view a virtual system or product model while maintaining natural human communication. These virtual systems operate within a computer-generated... As a result, TARDEC researchers and system developers are using this advanced high-end visualization technology to develop future
The Evolution of a Connectionist Model of Situated Human Language Understanding
NASA Astrophysics Data System (ADS)
Mayberry, Marshall R.; Crocker, Matthew W.
The Adaptive Mechanisms in Human Language Processing (ALPHA) project features both experimental and computational tracks designed to complement each other in the investigation of the cognitive mechanisms that underlie situated human utterance processing. The models developed in the computational track replicate results obtained in the experimental track and, in turn, suggest further experiments by virtue of behavior that arises as a by-product of their operation.
Impact of scaffold rigidity on the design and evolution of an artificial Diels-Alderase
Preiswerk, Nathalie; Beck, Tobias; Schulz, Jessica D.; Milovník, Peter; Mayer, Clemens; Siegel, Justin B.; Baker, David; Hilvert, Donald
2014-01-01
By combining targeted mutagenesis, computational refinement, and directed evolution, a modestly active, computationally designed Diels-Alderase was converted into the most proficient biocatalyst for [4+2] cycloadditions known. The high stereoselectivity and minimal product inhibition of the evolved enzyme enabled preparative scale synthesis of a single product diastereomer. X-ray crystallography of the enzyme–product complex shows that the molecular changes introduced over the course of optimization, including addition of a lid structure, gradually reshaped the pocket for more effective substrate preorganization and transition state stabilization. The good overall agreement between the experimental structure and the original design model with respect to the orientations of both the bound product and the catalytic side chains contrasts with other computationally designed enzymes. Because design accuracy appears to correlate with scaffold rigidity, improved control over backbone conformation will likely be the key to future efforts to design more efficient enzymes for diverse chemical reactions. PMID:24847076
Computational Analysis of Stereospecificity in the Cope Rearrangement
ERIC Educational Resources Information Center
Glish, Laura; Hanks, Timothy W.
2007-01-01
The Cope rearrangement is a highly stereospecific, concerted reaction of considerable synthetic utility. Experimental product distributions from the reaction of disubstituted 1,5-hexadienes can be readily understood by computer modeling of the various possible transition states. Semi-empirical methods give relative energies of transition states…
Estimating and validating ground-based timber harvesting production through computer simulation
Jingxin Wang; Chris B. LeDoux
2003-01-01
Estimating the production of ground-based timber harvesting systems with an object-oriented methodology was investigated. The estimation model developed generates stands of trees; simulates chain saw, drive-to-tree feller-buncher, and swing-to-tree single-grip harvester felling, as well as grapple skidder and forwarder extraction activities; and analyzes costs and productivity. It also...
The SMAP level 4 carbon product for monitoring ecosystem land-atmosphere CO2 exchange
USDA-ARS?s Scientific Manuscript database
The NASA Soil Moisture Active Passive (SMAP) mission Level 4 Carbon (L4C) product provides model estimates of Net Ecosystem CO2 exchange (NEE) incorporating SMAP soil moisture information. The L4C product includes NEE, computed as total ecosystem respiration less gross photosynthesis, at a daily ti...
Ugulu, Rex Asibuodu; Allen, Stephen
2017-12-01
The data presented in this article are original data on "Investigating the role of onsite learning in the optimisation of craft gang's productivity in the construction industry". This article describes the constraints influencing craft gangs' productivity and the influence of onsite learning on blockwork craft gangs' productivity. It also presents the method of data collection, which used semi-structured interviews and observation to collect data from construction organisations. We provide statistics on the most important constraints affecting craft gangs' productivity using 3-D bar charts. In addition, we compute the correlation coefficients and the regression model for the influence of onsite learning on craft gangs' productivity, using man-hours as the dependent variable. The relationship between blockwork inputs and cycle numbers was determined at the 5% significance level. Finally, we present information on the application of learning curve theory using the unit straight-line model equations and compute the learning rate of the observed craft gangs' repetitive blockwork.
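A minimal sketch of the unit straight-line learning-curve computation described above: fitting y_n = a·n^b in log-log space and converting the slope to a learning rate. The observed man-hours are illustrative, not the article's data.

```python
import numpy as np

cycles = np.array([1, 2, 3, 4, 5, 6])                 # repetition number
hours  = np.array([10.0, 8.3, 7.5, 6.9, 6.5, 6.2])    # man-hours per cycle

# Unit model y_n = a * n^b is linear in log-log space: log y = b*log n + log a.
b, log_a = np.polyfit(np.log(cycles), np.log(hours), 1)
a = np.exp(log_a)
learning_rate = 2.0 ** b       # e.g. 0.83 means 17% saving per doubling
print(a, b, learning_rate)
```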
Gronert, Scott; Garver, John M; Nichols, Charles M; Worker, Benjamin B; Bierbaum, Veronica M
2014-11-21
The gas-phase reactions of carbon- and nitrogen-centered nucleophiles with polyfluorobromobenzenes were examined in a selected-ion flow tube (SIFT) and modeled computationally at the MP2/6-31+G(d,p)//MP2/6-31+G(d) level. In the gas-phase experiments, rate constants and branching ratios were determined. The carbon nucleophiles produce expected nucleophilic aromatic substitution (SNAr) and proton transfer products along with unexpected products that result from SN2 reactions at the bromine center (polyfluorophenide leaving group). With nitrogen nucleophiles, the SN2 at bromine channel is suppressed. In the SNAr channels, the "element effect" is observed, and fluoride loss competes with bromide loss. The computational modeling indicates that all the substitution barriers are well below the entrance channel and that entropy and dynamics effects control the product distributions.
Spin-neurons: A possible path to energy-efficient neuromorphic computers
NASA Astrophysics Data System (ADS)
Sharad, Mrigank; Fan, Deliang; Roy, Kaushik
2013-12-01
Recent years have witnessed growing interest in the field of brain-inspired computing based on neural-network architectures. In order to translate the related algorithmic models into powerful, yet energy-efficient cognitive-computing hardware, computing devices beyond CMOS may need to be explored. The suitability of such devices to this field of computing would strongly depend upon how closely their physical characteristics match the essential computing primitives employed in such models. In this work, we discuss the rationale of applying emerging spin-torque devices for bio-inspired computing. Recent spin-torque experiments have shown the path to low-current, low-voltage, and high-speed magnetization switching in nano-scale magnetic devices. Such magneto-metallic, current-mode spin-torque switches can mimic the analog summing and "thresholding" operation of an artificial neuron with high energy-efficiency. Comparison with a CMOS-based analog circuit model of a neuron shows that "spin-neurons" (spin-based circuit models of neurons) can achieve more than two orders of magnitude lower energy and beyond three orders of magnitude reduction in energy-delay product. The application of spin-neurons can therefore be an attractive option for neuromorphic computers of the future.
Computer Programs For Automated Welding System
NASA Technical Reports Server (NTRS)
Agapakis, John E.
1993-01-01
Computer programs developed for use in controlling automated welding system described in MFS-28578. Together with control computer, computer input and output devices and control sensors and actuators, provide flexible capability for planning and implementation of schemes for automated welding of specific workpieces. Developed according to macro- and task-level programming schemes, which increases productivity and consistency by reducing amount of "teaching" of system by technician. System provides for three-dimensional mathematical modeling of workpieces, work cells, robots, and positioners.
Estimators of primary production for interpretation of remotely sensed data on ocean color
NASA Technical Reports Server (NTRS)
Platt, Trevor; Sathyendranath, Shubha
1993-01-01
The theoretical basis is explained for some commonly used estimators of daily primary production in a vertically uniform water column. These models are recast into a canonical form, with dimensionless arguments, to facilitate comparison with each other and with an analytic solution. The limitations of each model are examined. The values of the photoadaptation parameter I(k) observed in the ocean are analyzed, and I(k) is used as a scale to normalize the surface irradiance. The range of this scaled irradiance is presented. An equation is given for estimation of I(k) from recent light history. It is shown how the models for water column production can be adapted for estimation of the production in finite layers. The distinctions between model formulation, model implementation and model evaluation are discussed. Recommendations are given on the choice of algorithm for computation of daily production according to the degree of approximation acceptable in the result.
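As a concrete illustration of the canonical form discussed above, the sketch below computes water-column production from a saturating photosynthesis-irradiance curve, with the surface irradiance normalized by the photoadaptation parameter I(k). The Webb-type exponential P-I form and all parameter values are assumptions for illustration, not the paper's specific estimators:

```python
import numpy as np

# Minimal water-column production estimator: a saturating P-I curve
# P(I) = Pm * (1 - exp(-I / Ik)) with exponential light attenuation
# I(z) = I0 * exp(-k * z). All parameter values are illustrative.
Pm = 5.0    # assimilation number, mg C m^-3 h^-1
Ik = 50.0   # photoadaptation parameter, W m^-2
I0 = 400.0  # surface irradiance, W m^-2
k = 0.1     # diffuse attenuation coefficient, m^-1

z = np.linspace(0.0, 100.0, 1001)          # depth grid, m
I = I0 * np.exp(-k * z)                    # irradiance profile
P = Pm * (1.0 - np.exp(-I / Ik))           # local production

# Column-integrated production (trapezoidal rule), mg C m^-2 h^-1;
# restricting the integral to a finite layer gives layer production.
P_column = np.trapz(P, z)
I_star = I0 / Ik                           # dimensionless scaled irradiance
print(f"scaled surface irradiance I* = {I_star:.1f}")
print(f"water-column production = {P_column:.1f} mg C m^-2 h^-1")
```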
Factors Influencing F/OSS Cloud Computing Software Product Success: A Quantitative Study
ERIC Educational Resources Information Center
Letort, D. Brian
2012-01-01
Cloud Computing introduces a new business operational model that allows an organization to shift information technology consumption from traditional capital expenditure to operational expenditure. This shift introduces challenges from both the adoption and creation vantage. This study evaluates factors that influence Free/Open Source Software…
When Computer Writers Compose by Hand.
ERIC Educational Resources Information Center
Collier, Richard; Werier, Clifford
1995-01-01
Reviews videotapes of three professional writers composing several essays from start to finish, both by hand and by computer. Discusses similarities and differences among the completed essays. Finds that writing appears to be governed by deep cognitive models that are little influenced by the mode of text production or by the writer's preference…
Teaching Pulmonary Gas Exchange Physiology Using Computer Modeling
ERIC Educational Resources Information Center
Kapitan, Kent S.
2008-01-01
Students often have difficulty understanding the relationship of O2 consumption, CO2 production, cardiac output, and distribution of ventilation-perfusion ratios in the lung to the final arterial blood gas composition. To overcome this difficulty, I have developed an interactive computer simulation of pulmonary gas exchange…
Impacts of cloud water droplets on the OH production rate from peroxide photolysis.
Martins-Costa, M T C; Anglada, J M; Francisco, J S; Ruiz-López, Manuel F
2017-12-06
Understanding the difference between observed and modeled concentrations of HOx radicals in the troposphere is a current major issue in atmospheric chemistry. It is widely believed that existing atmospheric models miss a source of such radicals, and several potential new sources have been proposed. In recent years, interest has increased in the role played by cloud droplets and organic aerosols. Computer modeling of ozone photolysis, for instance, has shown that atmospheric aqueous interfaces accelerate the associated OH production rate by as much as 3-4 orders of magnitude. Since methylhydroperoxide is a main source and sink of HOx radicals, especially at low NOx concentrations, it is fundamental to assess the influence of clouds on its chemistry and photochemistry. In this study, computer simulations of the photolysis of methylhydroperoxide at the air-water interface have been carried out, showing that the OH production rate is severely enhanced, reaching a level comparable to ozone photolysis.
Spatial Variation of Pressure in the Lyophilization Product Chamber Part 1: Computational Modeling.
Ganguly, Arnab; Varma, Nikhil; Sane, Pooja; Bogner, Robin; Pikal, Michael; Alexeenko, Alina
2017-04-01
The flow physics in the product chamber of a freeze dryer involves coupled heat and mass transfer at different length and time scales. The low-pressure environment and the relatively small flow velocities make it difficult to quantify the flow structure experimentally. The current work presents three-dimensional computational fluid dynamics (CFD) modeling of vapor flow in a laboratory-scale freeze dryer, validated with experimental data and theory. The model accounts for the presence of a non-condensable gas such as nitrogen or air using a continuum multi-species model. The flow structure at different sublimation rates, chamber pressures, and shelf-gaps is systematically investigated. Emphasis has been placed on accurately predicting the pressure variation across the subliming front. At a chamber set pressure of 115 mtorr and a sublimation rate of 1.3 kg/h/m², the pressure variation reaches about 9 mtorr. The pressure variation increased linearly with sublimation rate in the range of 0.5 to 1.3 kg/h/m². The dependence of pressure variation on the shelf-gap was also studied both computationally and experimentally. The CFD modeling results are found to agree within 10% with the experimental measurements. The computational model was also compared to an analytical solution valid for small shelf-gaps. Thus, the current work presents a validation study motivating broader use of CFD in optimizing freeze-drying process and equipment design.
Modelling parallel programs and multiprocessor architectures with AXE
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Fineman, Charles E.
1991-01-01
AXE, An Experimental Environment for Parallel Systems, was designed to model and simulate parallel systems at the process level. It provides an integrated environment for specifying computation models, multiprocessor architectures, data collection, and performance visualization. AXE is being used at NASA-Ames for developing resource management strategies, parallel problem formulation, multiprocessor architectures, and operating system issues related to the High Performance Computing and Communications Program. AXE's simple, structured user interface enables the user to model parallel programs and machines precisely and efficiently. Its quick turn-around time keeps the user interested and productive. AXE models multicomputers. The user may easily modify various architectural parameters including the number of sites, connection topologies, and overhead for operating system activities. Parallel computations in AXE are represented as collections of autonomous computing objects known as players. Their use and behavior are described. Performance data of the multiprocessor model can be observed on a color screen. These include CPU and message routing bottlenecks, and the dynamic status of the software.
Cognitive Modeling of Individual Variation in Reference Production and Comprehension
Hendriks, Petra
2016-01-01
A challenge for most theoretical and computational accounts of linguistic reference is the observation that language users vary considerably in their referential choices. Part of the variation observed among and within language users and across tasks may be explained from variation in the cognitive resources available to speakers and listeners. This paper presents a computational model of reference production and comprehension developed within the cognitive architecture ACT-R. Through simulations with this ACT-R model, it is investigated how cognitive constraints interact with linguistic constraints and features of the linguistic discourse in speakers’ production and listeners’ comprehension of referring expressions in specific tasks, and how this interaction may give rise to variation in referential choice. The ACT-R model of reference explains and predicts variation among language users in their referential choices as a result of individual and task-related differences in processing speed and working memory capacity. Because of limitations in their cognitive capacities, speakers sometimes underspecify or overspecify their referring expressions, and listeners sometimes choose incorrect referents or are overly liberal in their interpretation of referring expressions. PMID:27092101
1986-08-01
...publication by Ms. Jessica S. Ruff, Information Products Division, WES. This manual is published in loose-leaf format for convenience in periodic updating. ... transfer computations ... variety of output options. Background: This manual is a product of a program of evaluation and refinement of mathematical water quality models ... zooplankton and higher order herbivores. However, these groups are presently not included in the model. Macrophyte production may also have an impact upon...
Prediction of Chemical Function: Model Development and Application
The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (...
Development of a Model and Computer Code to Describe Solar Grade Silicon Production Processes
NASA Technical Reports Server (NTRS)
Srivastava, R.; Gould, R. K.
1979-01-01
Mathematical models, and computer codes based on these models, were developed to allow prediction of the product distribution in chemical reactors for converting gaseous silicon compounds to condensed-phase silicon. The following tasks were accomplished: (1) formulation of a model for silicon vapor separation/collection from the developing turbulent flow stream within reactors of the Westinghouse type; (2) modification of an available general parabolic code to achieve solutions to the governing partial differential equations (boundary layer type) which describe migration of the vapor to the reactor walls; (3) a parametric study using the boundary layer code to optimize the performance characteristics of the Westinghouse reactor; (4) calculations relating to the collection efficiency of the new AeroChem reactor; and (5) final testing of the modified LAPP code for use as a method of predicting Si(l) droplet sizes in these reactors.
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
2012-08-01
1.1.1 Mass production/destruction terms; 1.1.2 Energy exchange terms ... US3D does not currently model electronic energy. If the US3D solution is post-processed to account for electronic energy modes, the computed ... nonequilibrium model for electronic energy ... Figure 9: (left) jet profile solution.
A Computational Model of Active Vision for Visual Search in Human-Computer Interaction
2010-08-01
...processors that interact with the production rules to produce behavior, and (c) parameters that constrain the behavior of the model (e.g., the ... velocity of a saccadic eye movement). While the parameters can be task-specific, the majority of the parameters are usually fixed across a wide variety ... previously estimated durations. Hooge and Erkelens (1996) review these four explanations of fixation duration control. A variety of research...
Distributed Computing for the Pierre Auger Observatory
NASA Astrophysics Data System (ADS)
Chudoba, J.
2015-12-01
The Pierre Auger Observatory operates the largest system of detectors for ultra-high-energy cosmic ray measurements. Comparison of theoretical models of interactions with recorded data requires thousands of computing cores for Monte Carlo simulations. Since 2007, distributed resources connected via the EGI grid have been successfully used. The first and second versions of the production system, based on bash scripts and a MySQL database, were able to submit jobs to all reliable sites supporting the Virtual Organization auger. For many years, VO auger has been among the top ten EGI users by total computing time. Migration of the production system to the DIRAC interware started in 2014. Pilot jobs improve the efficiency of computing jobs and eliminate problems with small and less reliable sites used for bulk production. The new system can also use available cloud resources. The Dirac File Catalog replaced the LFC for new files, which are organized in datasets defined via metadata. CVMFS has been used for software distribution since 2014. In the presentation, we compare the old and new production systems and report on the experience of migrating to the new system.
I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison
NASA Technical Reports Server (NTRS)
Somawardhana, Ruwan
2011-01-01
CAD/CAE packages change on a continuous basis as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements. Parallel processing coupled with appropriate hardware can minimize computation time. Users of Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences there are when changing software packages; we are looking for consistency in results.
Integrated computer-aided design using minicomputers
NASA Technical Reports Server (NTRS)
Storaasli, O. O.
1980-01-01
Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), highly interactive software, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational data base management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum of 4800 bits/sec transfer rate to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides the design/drafting and finite element analysis capability, CAD/CAM provides options to produce an automatic program tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.
Quantum Iterative Deepening with an Application to the Halting Problem
Tarrataca, Luís; Wichert, Andreas
2013-01-01
Classical models of computation traditionally resort to halting schemes in order to enquire about the state of a computation. In such schemes, a computational process is responsible for signaling an end of a calculation by setting a halt bit, which needs to be systematically checked by an observer. The capacity of quantum computational models to operate on a superposition of states requires an alternative approach. From a quantum perspective, any measurement of an equivalent halt qubit would have the potential to inherently interfere with the computation by provoking a random collapse amongst the states. This issue is exacerbated by undecidable problems such as the Entscheidungsproblem which require universal computational models, e.g. the classical Turing machine, to be able to proceed indefinitely. In this work we present an alternative view of quantum computation based on production system theory in conjunction with Grover's amplitude amplification scheme that allows for (1) a detection of halt states without interfering with the final result of a computation; (2) the possibility of non-terminating computation and (3) an inherent speedup to occur during computations susceptible of parallelization. We discuss how such a strategy can be employed in order to simulate classical Turing machines. PMID:23520465
Walshaw, John; Peck, Michael W.; Barker, Gary C.
2016-01-01
Clostridium botulinum produces botulinum neurotoxins (BoNTs), highly potent substances responsible for botulism. Currently, mathematical models of C. botulinum growth and toxigenesis are largely aimed at risk assessment and do not include explicit genetic information beyond group level but integrate many component processes, such as signalling, membrane permeability and metabolic activity. In this paper we present a scheme for modelling neurotoxin production in C. botulinum Group I type A1, based on the integration of diverse information coming from experimental results available in the literature. Experiments show that production of BoNTs depends on the growth-phase and is under the control of positive and negative regulatory elements at the intracellular level. Toxins are released as large protein complexes and are associated with non-toxic components. Here, we systematically review and integrate those regulatory elements previously described in the literature for C. botulinum Group I type A1 into a population dynamics model, to build the very first computational model of toxin production at the molecular level. We conduct a validation of our model against several items of published experimental data for different wild type and mutant strains of C. botulinum Group I type A1. The result of this process underscores the potential of mathematical modelling at the cellular level, as a means of creating opportunities in developing new strategies that could be used to prevent botulism; and potentially contribute to improved methods for the production of toxin that is used for therapeutics. PMID:27855161
Vandersall, Jennifer A.; Gardner, Shea N.; Clague, David S.
2010-05-04
A computational method and computer-based system of modeling DNA synthesis for the design and interpretation of PCR amplification, parallel DNA synthesis, and microarray chip analysis. The method and system include modules that address the bioinformatics, kinetics, and thermodynamics of DNA amplification and synthesis. Specifically, the steps of DNA selection, as well as the kinetics and thermodynamics of DNA hybridization and extensions, are addressed, which enable the optimization of the processing and the prediction of the products as a function of DNA sequence, mixing protocol, time, temperature and concentration of species.
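As a toy stand-in for the hybridization-thermodynamics module described in the abstract, the sketch below estimates an oligonucleotide melting temperature with the simple Wallace rule; a production system of the kind described would use full nearest-neighbor thermodynamics and kinetics:

```python
# Toy stand-in for the hybridization-thermodynamics module: the Wallace
# rule Tm = 2(A+T) + 4(G+C) (deg C), a rough estimate valid only for
# short oligonucleotides (~14-20 nt). A production code like the one in
# the patent would use nearest-neighbor thermodynamic parameters.
def wallace_tm(primer: str) -> float:
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2.0 * at + 4.0 * gc

print(wallace_tm("ATGCGTACCTGA"))  # 36.0 for this illustrative 12-mer
```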
Measurement of Productivity and Quality in Non-Marketable Services: With Application to Schools
ERIC Educational Resources Information Center
Fare, R.; Grosskopf, S.; Forsund, F. R.; Hayes, K.; Heshmati, A.
2006-01-01
Purpose: This paper seeks to model and compute productivity, including a measure of quality, of a service which does not have marketable outputs--namely public education at the micro level. This application is a case study for Sweden public schools. Design/methodology/approach: A Malmquist productivity index is employed which allows for multiple…
The application of cloud computing to scientific workflows: a study of cost and performance.
Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S
2013-01-28
The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.
NASA Astrophysics Data System (ADS)
Kettle, Helen; Merchant, Chris J.
2008-08-01
Modeling the vertical penetration of photosynthetically active radiation (PAR) through the ocean, and its utilization by phytoplankton, is fundamental to simulating marine primary production. The variation of attenuation and absorption of light with wavelength suggests that photosynthesis should be modeled at high spectral resolution, but this is computationally expensive. To model primary production in global 3D models, a balance between computer time and accuracy is necessary. We investigate the effects of varying the spectral resolution of the underwater light field and the photosynthetic efficiency of phytoplankton (α∗) on primary production using a 1D coupled ecosystem ocean turbulence model. The model is applied at three sites in the Atlantic Ocean (CIS (∼60°N), PAP (∼50°N) and ESTOC (∼30°N)) to include the effect of different meteorological forcing and parameter sets. We also investigate three different methods for modeling α∗: as a fixed constant; varying with both wavelength and chlorophyll concentration [Bricaud, A., Morel, A., Babin, M., Allali, K., Claustre, H., 1998. Variations of light absorption by suspended particles with chlorophyll a concentration in oceanic (case 1) waters. Analysis and implications for bio-optical models. J. Geophys. Res. 103, 31033-31044]; and using a non-spectral parameterization [Anderson, T.R., 1993. A spectrally averaged model of light penetration and photosynthesis. Limnol. Oceanogr. 38, 1403-1419]. After selecting the appropriate ecosystem parameters for each of the three sites, we vary the spectral resolution of light and α∗ from 1 to 61 wavebands and study the results in conjunction with the three different α∗ estimation methods. The results show modeled estimates of ocean primary productivity are highly sensitive to the degree of spectral resolution and α∗. For accurate simulations of primary production and chlorophyll distribution, we recommend a spectral resolution of at least six wavebands if α∗ is a function of wavelength and chlorophyll, and three wavebands if α∗ is a fixed value.
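The trade-off described above can be illustrated with a minimal sketch comparing column-integrated irradiance computed with six wavebands against a single broadband value; the band energies and attenuation coefficients are illustrative, not the paper's optical data:

```python
import numpy as np

# Sketch of the spectral-resolution question: depth-integrated light
# computed with N wavebands versus a single broadband value. The band
# energies and attenuation coefficients are illustrative placeholders.
def column_irradiance(E0, k, z):
    """Depth integral of E0*exp(-k*z) over [0, z] for each band,
    i.e. E0/k * (1 - exp(-k*z)), summed over bands."""
    E0, k = np.atleast_1d(E0), np.atleast_1d(k)
    return np.sum(E0 / k * (1.0 - np.exp(-k * z)))

z = 50.0                                   # layer depth, m
# Six-band version: blue light penetrates deeper (small k) than red.
E0_bands = np.full(6, 100.0 / 6.0)         # W m^-2 per band
k_bands = np.array([0.03, 0.05, 0.08, 0.15, 0.30, 0.55])  # m^-1

six_band = column_irradiance(E0_bands, k_bands, z)
one_band = column_irradiance(100.0, k_bands.mean(), z)  # broadband mean k
print(f"6-band column irradiance: {six_band:.0f} W m^-1")
print(f"1-band column irradiance: {one_band:.0f} W m^-1")
```

With these illustrative coefficients the broadband calculation underestimates the column-integrated light by roughly a factor of two, which is the kind of sensitivity the paper quantifies for primary production.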
Ganguly, Arnab; Alexeenko, Alina A; Schultz, Steven G; Kim, Sherry G
2013-10-01
A physics-based model for the sublimation-transport-condensation processes occurring in pharmaceutical freeze-drying, coupling product attributes and equipment capabilities into a unified simulation framework, is presented. The system-level model is used to determine the effect of operating conditions such as shelf temperature, chamber pressure, and load size on the occurrence of choking for a production-scale dryer. Several data sets corresponding to production-scale runs with a load from 120 to 485 L have been compared with simulations. A subset of data is used for calibration, whereas another data set corresponding to a load of 150 L is used for model validation. The model predictions for both the onset and extent of choking as well as for the measured product temperature agree well with the production-scale measurements. Additionally, we study the effect of resistance to vapor transport presented by the duct with a valve and a baffle in the production-scale freeze-dryer. Computational Fluid Dynamics (CFD) techniques augmented with a system-level unsteady heat and mass transfer model allow prediction of dynamic process conditions taking into consideration specific dryer design. CFD modeling of flow structure in the duct, presented here for a production-scale freeze-dryer, quantifies the benefit of reducing the obstruction to the flow through several design modifications. It is found that the use of a combined valve-baffle system can increase vapor flow rate by a factor of 2.2. Moreover, minor design changes such as moving the baffle downstream by about 10 cm can increase the flow rate by 54%. The proposed design changes can increase drying rates, improve efficiency, and reduce cycle times due to fewer obstructions in the vapor flow path. The comprehensive simulation framework combining the system-level model and the detailed CFD computations can provide a process analytical tool for more efficient and robust freeze-drying of bio-pharmaceuticals.
Morrison, Adrian F; Herbert, John M
2017-06-14
Recently, we introduced an ab initio version of the Frenkel-Davydov exciton model for computing excited-state properties of molecular crystals and aggregates. Within this model, supersystem excited states are approximated as linear combinations of excitations localized on molecular sites, and the electronic Hamiltonian is constructed and diagonalized in a direct-product basis of non-orthogonal configuration state functions computed for isolated fragments. Here, we derive and implement analytic derivative couplings for this model, including nuclear derivatives of the natural transition orbital and symmetric orthogonalization transformations that are part of the approximation. Nuclear derivatives of the exciton Hamiltonian's matrix elements, required in order to compute the nonadiabatic couplings, are equivalent to the "Holstein" and "Peierls" exciton/phonon couplings that are widely discussed in the context of model Hamiltonians for energy and charge transport in organic photovoltaics. As an example, we compute the couplings that modulate triplet exciton transport in crystalline tetracene, which is relevant in the context of carrier diffusion following singlet exciton fission.
Computer Software Management and Information Center
NASA Technical Reports Server (NTRS)
1983-01-01
Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PRON for the Motorola exorcisor, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.
Impact of office productivity cloud computing on energy consumption and greenhouse gas emissions.
Williams, Daniel R; Tang, Yinshan
2013-05-07
Cloud computing is usually regarded as being energy efficient and thus emitting less greenhouse gases (GHG) than traditional forms of computing. When the energy consumption of Microsoft's cloud computing Office 365 (O365) and traditional Office 2010 (O2010) software suites were tested and modeled, some cloud services were found to consume more energy than the traditional form. The developed model in this research took into consideration the energy consumption at the three main stages of data transmission: data center, network, and end user device. Comparable products from each suite were selected, and activities were defined for each product to represent a different computing type. Microsoft provided highly confidential data for the data center stage, while the networking and user device stages were measured directly. A new measurement and software apportionment approach was defined and utilized, allowing the power consumption of cloud services to be directly measured for the user device stage. Results indicated that cloud computing is more energy efficient for Excel and Outlook, which consumed less energy and emitted less GHG than their standalone counterparts. The power consumption of the cloud-based Outlook (8%) and Excel (17%) was lower than that of their traditional counterparts. However, the power consumption of the cloud version of Word was 17% higher than its traditional equivalent. A third, mixed access method was also measured for Word, which emitted 5% more GHG than the traditional version. It is evident that cloud computing may not provide a unified way forward to reduce energy consumption and GHG. Direct conversion from the standalone package into the cloud provision platform can now consider energy and GHG emissions at the software development and cloud service design stage using the methods described in this research.
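A minimal sketch of the three-stage accounting used in the study: total energy per task as the sum of data-centre, network, and end-user-device consumption. All numeric values are illustrative placeholders, not the study's measured data:

```python
# Minimal sketch of the paper's three-stage model: energy per task =
# data-center + network + end-user-device consumption. All values below
# are illustrative placeholders, not the study's measurements.
def task_energy_kwh(dc_kwh, data_gb, network_kwh_per_gb, device_w, hours):
    network = data_gb * network_kwh_per_gb   # transmission energy
    device = device_w * hours / 1000.0       # device energy, kWh
    return dc_kwh + network + device

cloud = task_energy_kwh(dc_kwh=0.002, data_gb=0.05,
                        network_kwh_per_gb=0.05, device_w=30, hours=1.0)
local = task_energy_kwh(dc_kwh=0.0, data_gb=0.0,
                        network_kwh_per_gb=0.05, device_w=45, hours=1.0)
print(f"cloud: {cloud*1000:.1f} Wh, standalone: {local*1000:.1f} Wh")
```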
NASA Technical Reports Server (NTRS)
Pandya, Abhilash; Maida, James; Hasson, Scott; Greenisen, Michael; Woolford, Barbara
1993-01-01
As manned exploration of space continues, analytical evaluation of human strength characteristics is critical. These extraterrestrial environments will spawn issues of human performance which will impact the designs of tools, work spaces, and space vehicles. Computer modeling is an effective method of correlating human biomechanical and anthropometric data with models of space structures and human work spaces. The aim of this study is to provide biomechanical data from isolated joints to be utilized in a computer modeling system for calculating torque resulting from any upper extremity motions: in this study, the ratchet wrench push-pull operation (a typical extravehicular activity task). Established here are mathematical relationships used to calculate maximum torque production of isolated upper extremity joints. These relationships are a function of joint angle and joint velocity.
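A minimal sketch of the kind of relationship described: maximum joint torque as a function of joint angle and angular velocity. The quadratic-in-angle, linear-in-velocity surface and its coefficients are assumptions for illustration; the actual regression form comes from the isolated-joint measurements:

```python
# Generic form of the study's strength relationships: maximum joint
# torque as a function of joint angle and angular velocity. The surface
# shape and coefficients below are illustrative assumptions, not the
# fitted values from the isolated-joint measurements.
def max_torque(theta_deg, omega_deg_s, c=(40.0, 0.6, -0.004, -0.08)):
    c0, c1, c2, c3 = c
    # Quadratic in joint angle, linear decrease with concentric velocity.
    return c0 + c1 * theta_deg + c2 * theta_deg**2 + c3 * omega_deg_s

# Example: elbow-like joint at 90 deg, concentric motion at 60 deg/s.
print(f"{max_torque(90.0, 60.0):.1f} N·m")
```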
Computational Fluid Dynamics of Whole-Body Aircraft
NASA Astrophysics Data System (ADS)
Agarwal, Ramesh
1999-01-01
The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.
Richard Haynes; Darius Adams; Peter Ince; John Mills; Ralph Alig
2006-01-01
The United States has a century of experience with the development of models that describe markets for forest products and trends in resource conditions. In the last four decades, increasing rigor in policy debates has stimulated the development of models to support policy analysis. Increasingly, research has evolved (often relying on computer-based models) to increase...
Computer Training for Entrepreneurial Meteorologists.
NASA Astrophysics Data System (ADS)
Koval, Joseph P.; Young, George S.
2001-05-01
Computer applications of increasing diversity form a growing part of the undergraduate education of meteorologists in the early twenty-first century. The advent of the Internet economy, as well as a waning demand for traditional forecasters brought about by better numerical models and statistical forecasting techniques, has greatly increased the need for operational and commercial meteorologists to acquire computer skills beyond the traditional techniques of numerical analysis and applied statistics. Specifically, students with the skills to develop data distribution products are in high demand in the private sector job market. Meeting these demands requires greater breadth, depth, and efficiency in computer instruction. The authors suggest that computer instruction for undergraduate meteorologists should include three key elements: a data distribution focus, emphasis on the techniques required to learn computer programming on an as-needed basis, and a project orientation to promote management skills and support student morale. In an exploration of this approach, the authors have reinvented the Applications of Computers to Meteorology course in the Department of Meteorology at The Pennsylvania State University to teach computer programming within the framework of an Internet product development cycle. Because the computer skills required for data distribution programming change rapidly, specific languages are valuable for only a limited time. A key goal of this course was therefore to help students learn how to retrain efficiently as technologies evolve. The crux of the course was a semester-long project during which students developed an Internet data distribution product. As project management skills are also important in the job market, the course teamed students in groups of four for this product development project. The successes, failures, and lessons learned from this experiment are discussed, and conclusions are drawn concerning undergraduate instructional methods for computer applications in meteorology.
NASA Technical Reports Server (NTRS)
Cariapa, Vikram
1993-01-01
The trend in the modern global economy towards free market policies has motivated companies to use rapid prototyping technologies to not only reduce product development cycle time but also to maintain their competitive edge. A rapid prototyping technology is one which combines computer aided design with computer controlled tracking of a focussed high energy source (e.g., lasers, heat) on modern ceramic powders, metallic powders, plastics or photosensitive liquid resins in order to produce prototypes or models. At present, except for the process of shape melting, most rapid prototyping processes generate products that are only dimensionally similar to those of the desired end product. There is an urgent need, therefore, to enhance the understanding of the characteristics of these processes in order to realize their potential for production. Currently, the commercial market is dominated by four rapid prototyping processes, namely selective laser sintering, stereolithography, fused deposition modelling and laminated object manufacturing. This phase of the research has focussed on the selective laser sintering and stereolithography rapid prototyping processes. A theoretical model for these processes is under development. Different rapid prototyping sites supplied test specimens (based on ASTM 638-84, Type I) that have been measured and tested to provide a data base on surface finish, dimensional variation and ultimate tensile strength. Further plans call for developing and verifying the theoretical models by carefully designed experiments. This will be a joint effort between NASA and other prototyping centers to generate a larger database, thus encouraging more widespread usage by product designers.
NASA Astrophysics Data System (ADS)
Shah, Rahul H.
Production costs account for the largest share of the overall cost of manufacturing facilities. With the U.S. industrial sector becoming more and more competitive, manufacturers are looking for more cost- and resource-efficient working practices. Operations management and production planning have shown their capability to dramatically reduce manufacturing costs and increase system robustness. When implementing operations-related decision making and planning, two fields that have shown to be most effective are maintenance and energy. Unfortunately, the current research that integrates both is limited. Additionally, these studies fail to consider parameter domains and optimization on joint energy- and maintenance-driven production planning. Accordingly, a production planning methodology that considers maintenance and energy is investigated. Two models are presented to achieve a well-rounded operating strategy. The first is a joint energy and maintenance production scheduling model. The second is a cost-per-part model considering maintenance, energy, and production. The proposed methodology involves a Time-of-Use electricity demand response program, buffer and holding capacity, station reliability, production rate, station rated power, and more. In practice, the scheduling problem can be used to determine a joint energy, maintenance, and production schedule. Meanwhile, the cost-per-part model can be used to: (1) test the sensitivity of the obtained optimal production schedule and its corresponding savings by varying key production system parameters; and (2) determine optimal system parameter combinations when using the joint energy, maintenance, and production planning model. Additionally, a factor analysis on the system parameters is conducted, and the corresponding performance of the production schedule under variable parameter conditions is evaluated. Parameter optimization guidelines that incorporate maintenance and energy parameter decision making in the production planning framework are also discussed. A modified Particle Swarm Optimization solution technique is adopted to solve the proposed scheduling problem. The algorithm is described in detail and compared to a Genetic Algorithm. Case studies are presented to illustrate the benefits of using the proposed model and the effectiveness of the Particle Swarm Optimization approach. Numerical experiments are implemented and analyzed to test the effectiveness of the proposed model. The proposed scheduling strategy can achieve savings of around 19 to 27% in cost per part when compared to the baseline scheduling scenarios. By optimizing key production system parameters from the cost-per-part model, the baseline scenarios can obtain around 20 to 35% savings in cost per part. These savings increase further, by 42 to 55%, when system parameter optimization is integrated with the proposed scheduling problem. Using this method, the most influential parameters on the cost per part are the rated power from production, the production rate, and the initial machine reliabilities. The modified Particle Swarm Optimization algorithm allows greater diversity and exploration than the Genetic Algorithm for the proposed joint model, making it more computationally efficient in determining the optimal schedule: the Genetic Algorithm achieved a solution quality of 2,279.63 at an expense of 2,300 seconds of computational effort, whereas the proposed Particle Swarm Optimization algorithm achieved a solution quality of 2,167.26 in less than half that computational effort.
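A minimal particle swarm optimizer of the kind adapted in the thesis is sketched below on a stand-in objective; the actual modified algorithm adds diversity mechanisms and operates on the joint scheduling decision variables rather than the sphere function used here:

```python
import numpy as np

# Minimal particle swarm optimizer. The sphere function is a stand-in
# for the joint energy/maintenance/production cost-per-part objective;
# swarm parameters are conventional defaults, not the thesis's values.
rng = np.random.default_rng(0)

def cost(x):                      # stand-in objective to minimize
    return np.sum(x**2, axis=-1)

n_particles, dim, iters = 30, 5, 200
w, c1, c2 = 0.7, 1.5, 1.5         # inertia, cognitive, social weights

x = rng.uniform(-10, 10, (n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), cost(x)
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = cost(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(f"best cost after {iters} iterations: {pbest_val.min():.3e}")
```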
Automated plant, production management system
NASA Astrophysics Data System (ADS)
Aksenova, V. I.; Belov, V. I.
1984-12-01
The development of a complex of tasks for the operational management of production (OUP) within the framework of an automated system for production management (ASUP) shows that it is impossible to have effective computations without reliable initial information. The influence of many factors involving the production and economic activity of the entire enterprise upon the plan and course of production is considered. It is suggested that an adequate model should be available which covers all levels of the hierarchical system: workplace, section (brigade), shop, and enterprise; that the model should be incorporated into the technological sequence of performance; and that there should be provisions for an adequate man-machine system.
NASA Technical Reports Server (NTRS)
Jackman, Charles H.; Meade, Paul E.
1988-01-01
Daily average solar proton flux data for 1978 and 1979 are used in a proton energy degradation scheme to derive ion pair production rates and atomic nitrogen production rates. The latter are computed in a form suitable for inclusion in an atmospheric, two-dimensional, time-dependent photochemical model. Odd nitrogen distributions are computed from the model, including atomic nitrogen production from solar protons, and are compared with baseline distributions. The comparisons show that the average effect of the solar protons in 1978 and 1979 was to cause changes in odd nitrogen only above 10 mbar and only at latitudes above about 50 deg in both hemispheres. The influence of the solar proton-produced odd nitrogen on the local abundance of odd nitrogen depends primarily on the background odd nitrogen abundance as well as the altitude and season.
Basic Requirements for Systems Software Research and Development
NASA Technical Reports Server (NTRS)
Kuszmaul, Chris; Nitzberg, Bill
1996-01-01
Our success over the past ten years evaluating and developing advanced computing technologies has been due to a simple research and development (R/D) model. Our model has three phases: (a) evaluating the state-of-the-art, (b) identifying problems and creating innovations, and (c) developing solutions, improving the state-of-the-art. This cycle has four basic requirements: a large production testbed with real users, a diverse collection of state-of-the-art hardware, facilities for evaluation of emerging technologies and development of innovations, and control over system management on these testbeds. Future research will be irrelevant and future products will not work if any of these requirements is eliminated. In order to retain our effectiveness, the numerical aerospace simulator (NAS) must replace out-of-date production testbeds in as timely a fashion as possible, and cannot afford to ignore innovative designs such as new distributed shared memory machines, clustered commodity-based computers, and multi-threaded architectures.
EOS MLS Level 2 Data Processing Software Version 3
NASA Technical Reports Server (NTRS)
Livesey, Nathaniel J.; VanSnyder, Livesey W.; Read, William G.; Schwartz, Michael J.; Lambert, Alyn; Santee, Michelle L.; Nguyen, Honghanh T.; Froidevaux, Lucien; wang, Shuhui; Manney, Gloria L.;
2011-01-01
This software accepts the EOS MLS calibrated measurements of microwave radiance products and operational meteorological data, and produces a set of estimates of atmospheric temperature and composition. This version has been designed to be as flexible as possible. The software is controlled by a Level 2 Configuration File that controls all aspects of the software: defining the contents of state and measurement vectors, defining the configurations of the various forward models available, reading appropriate a priori spectroscopic and calibration data, performing retrievals, post-processing results, computing diagnostics, and outputting results in appropriate files. In production mode, the software operates in a parallel form, with one instance of the program acting as a master, coordinating the work of multiple slave instances on a cluster of computers, each computing the results for individual chunks of data. In addition to performing conventional retrieval calculations and producing geophysical products, the Level 2 Configuration File can instruct the software to produce files of simulated radiances based on a state vector formed from a set of geophysical product files taken as input. Combining both the retrieval and simulation tasks in a single piece of software makes it far easier to ensure that identical forward model algorithms and parameters are used in both tasks. This also dramatically reduces the complexity of the code maintenance effort.
Williams, Kent E; Voigt, Jeffrey R
2004-01-01
The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.
Bańkowski, Robert; Wiadrowska, Bozena; Beresińska, Martyna; Ludwicki, Jan K; Noworyta-Głowacka, Justyna; Godyń, Artur; Doruchowski, Grzegorz; Hołownicki, Ryszard
2013-01-01
Faulty but still operating agricultural pesticide sprayers may pose an unacceptable health risk for operators. The computerized models designed to calculate exposure and risk for pesticide sprayers, used as an aid in the evaluation and further authorisation of plant protection products, may also be applied to assess the health risk for operators when faulty sprayers are used. The objective was to evaluate the impact of different exposure scenarios on the health risk for operators using faulty agricultural spraying equipment by means of computer modelling. The exposure modelling was performed for 15 pesticides (5 insecticides, 7 fungicides and 3 herbicides). The critical parameter, i.e. toxicological end-point, on which the risk assessment was based was the no observable adverse effect level (NOAEL). This enabled risk to be estimated under various exposure conditions such as pesticide concentration in the plant protection product and type of the sprayed crop, as well as the number of treatments. Computer modelling was based on the UK POEM model, including determination of the acceptable operator exposure level (AOEL). Thus the degree of operator exposure could be defined during pesticide treatment whether or not personal protection equipment had been employed by individuals. Data used for computer modelling were obtained from simulated, pesticide-substitute treatments using variously damaged knapsack sprayers. These substitute preparations consisted of markers that allowed computer simulations to be made, analogous to real-life exposure situations, in a dose-dependent fashion. Exposures were estimated according to operator dosimetry exposure under 'field' conditions for low-level, medium and high target field crops. The exposure modelling in the high target field crops demonstrated exceedance of the AOEL in all simulated treatment cases (100%) using damaged sprayers, irrespective of the type of damage or whether individual protective measures had been adopted. For low-level and medium field crops, exceedances ranged between 40-80% of cases. The computer modelling may be considered a practical tool for hazard assessment when faulty agricultural sprayers are used. It may also be applied for programming the quality checks and maintenance systems of this equipment.
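A generic POEM-style calculation of the sort the study automates is sketched below: systemic operator exposure from dermal and inhalation routes compared against the AOEL. All unit-exposure values, the absorption fraction, and the AOEL are illustrative placeholders, not values from the study or the actual POEM tables:

```python
# Generic operator-exposure estimate compared against the AOEL. All
# unit-exposure values, absorption fractions, and the AOEL below are
# illustrative placeholders, not values from the study or from the
# actual UK POEM tables.
def systemic_exposure_mg_per_kg(handled_kg_as, dermal_mg_per_kg_handled,
                                inhal_mg_per_kg_handled, dermal_abs,
                                body_weight_kg=60.0):
    dermal = handled_kg_as * dermal_mg_per_kg_handled * dermal_abs
    inhalation = handled_kg_as * inhal_mg_per_kg_handled  # 100% absorbed
    return (dermal + inhalation) / body_weight_kg

exposure = systemic_exposure_mg_per_kg(
    handled_kg_as=0.5,               # kg active substance handled per day
    dermal_mg_per_kg_handled=20.0,   # elevated for a leaking sprayer
    inhal_mg_per_kg_handled=0.5,
    dermal_abs=0.1)
aoel = 0.05                           # mg/kg bw/day, illustrative
print(f"exposure = {exposure:.3f} mg/kg bw/day "
      f"({exposure / aoel:.0%} of AOEL)")
```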
[Computer simulation of thyroid regulatory mechanisms in health and malignancy].
Abduvaliev, A A; Gil'dieva, M S; Khidirov, B N; Saĭdalieva, M; Saatov, T S
2010-07-01
The paper describes a computer model for regulation of the number of thyroid follicular cells in health and malignancy. The authors' computer program for mathematical simulation of the regulatory mechanisms of a thyroid follicular cellular community cannot yet be regarded as a finished commercial product. For commercialization of this product, it is necessary to establish a direct relationship between the introduced corrected values and the actually existing normal values, such as the peripheral blood concentrations of thyroid hormones or the mean values of endocrine tissue mitotic activity. However, the described computer program has also been used by our scientific group in the study of thyroid cancer. The available biological experimental data and theoretical provisions on thyroid structural and functional organization at the cellular level allow one to construct mathematical models for quantitative analysis of the regulation of the size of a cellular community of a thyroid follicle in health and abnormalities, by using the method for simulation of the regulatory mechanisms of living systems and the equations of cellular community regulation.
Cladé, Thierry; Snyder, Joshua C.
2010-01-01
Clinical trials which use imaging typically require data management and workflow integration across several parties. We identify opportunities for all parties involved to realize benefits with a modular interoperability model based on service-oriented architecture and grid computing principles. We discuss middleware products for implementation of this model, and propose caGrid as an ideal candidate due to its healthcare focus; free, open source license; and mature developer tools and support. PMID:20449775
PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.
1979-10-01
The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for, with provision for one to twenty resident batches. The effect of exposure of each of the batches to the same neutron flux is determined.
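A toy version of the point-model nuclide bookkeeping is sketched below: one-group depletion of a single actinide and a lumped fission-product absorber under constant flux. PREMOR's actual two-group model and its batch accounting are more elaborate, and the cross-sections here are illustrative:

```python
# Toy version of the point-model nuclide bookkeeping: densities under a
# constant one-group flux, dN/dt = production - sigma * phi * N. A real
# two-group code tracks ~13 actinides and 14 fission products per feed
# batch; all cross-sections and yields below are illustrative.
phi = 3e13                    # neutron flux, n/cm^2/s
sigma_f_u235 = 400e-24        # fission cross-section, cm^2 (illustrative)
sigma_a_fp = 50e-24           # lumped fission-product absorption, cm^2
yield_fp = 2.0                # lumped FP yield per fission

n_u235, n_fp = 1e21, 0.0      # initial densities, atoms/cm^3
dt, days = 86400.0, 365
for _ in range(days):         # forward-Euler depletion over one year
    fission_rate = sigma_f_u235 * phi * n_u235
    n_u235 += -fission_rate * dt
    n_fp += (yield_fp * fission_rate - sigma_a_fp * phi * n_fp) * dt

print(f"U-235 remaining: {n_u235:.3e}, lumped FP: {n_fp:.3e} atoms/cm^3")
```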
2012-11-21
...examination of some of the aromatics shows that the model captures well benzene from toluene decomposition in BF, but underpredicts styrene and ethylbenzene ... critical toluene pyrolysis products and stable soot precursors were compared with computational models using two semi-detailed chemical mechanisms ... ethylbenzene, which at least one of the mechanisms reproduces quite well. The largest measured species in the incipiently sooting flame is indene, whose...
Norton, Tomás; Sun, Da-Wen; Grant, Jim; Fallon, Richard; Dodd, Vincent
2007-09-01
The application of computational fluid dynamics (CFD) in the agricultural industry is becoming ever more important. Over the years, the versatility, accuracy and user-friendliness offered by CFD has led to its increased take-up by the agricultural engineering community. Now CFD is regularly employed to solve environmental problems of greenhouses and animal production facilities. However, due to a combination of increased computer efficacy and advanced numerical techniques, the realism of these simulations has only been enhanced in recent years. This study provides a state-of-the-art review of CFD, its current applications in the design of ventilation systems for agricultural production systems, and the outstanding challenging issues that confront CFD modellers. The current status of greenhouse CFD modelling was found to be at a higher standard than that of animal housing, owing to the incorporation of user-defined routines that simulate crop biological responses as a function of local environmental conditions. Nevertheless, the most recent animal housing simulations have addressed this issue and in turn have become more physically realistic.
Formation of algae growth constitutive relations for improved algae modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gharagozloo, Patricia E.; Drewry, Jessica Louise.
This SAND report summarizes research conducted as a part of a two year Laboratory Directed Research and Development (LDRD) project to improve our abilities to model algal cultivation. Algae-based biofuels have generated much excitement due to their potentially large oil yield from relatively small land use and without interfering with the food or water supply. Algae mitigate atmospheric CO2 through metabolism. Efficient production of algal biofuels could reduce dependence on foreign oil by providing a domestic renewable energy source. Important factors controlling algal productivity include temperature, nutrient concentrations, salinity, pH, and the light-to-biomass conversion rate. Computational models allow for inexpensive predictions of algae growth kinetics in these non-ideal conditions for various bioreactor sizes and geometries without the need for multiple expensive measurement setups. However, these models need to be calibrated for each algal strain. In this work, we conduct a parametric study of key marine algae strains and apply the findings to a computational model.
Methodology for extracting local constants from petroleum cracking flows
Chang, Shen-Lin; Lottes, Steven A.; Zhou, Chenn Q.
2000-01-01
A methodology provides for the extraction of local chemical kinetic model constants for use in a reacting flow computational fluid dynamics (CFD) computer code with chemical kinetic computations to optimize the operating conditions or design of the system, including retrofit design improvements to existing systems. The coupled CFD and kinetics computer code is used in combination with data obtained from a matrix of experimental tests to extract the kinetic constants. Local fluid dynamic effects are implicitly included in the extracted local kinetic constants for each particular application system to which the methodology is applied. The extracted local kinetic model constants work well over a fairly broad range of operating conditions for specific and complex reaction sets in specific and complex reactor systems. While disclosed in terms of use in a Fluid Catalytic Cracking (FCC) riser, the inventive methodology has application to virtually any reaction set, to extract constants for any particular application and reaction set formulation. The methodology includes the steps of: (1) selecting the test data sets for various conditions; (2) establishing the general trend of the parametric effect on the measured product yields; (3) calculating product yields for the selected test conditions using coupled computational fluid dynamics and chemical kinetics; (4) adjusting the local kinetic constants to match calculated product yields with experimental data; and (5) validating the determined set of local kinetic constants by comparing the calculated results with experimental data from additional test runs at different operating conditions.
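Step (4) is essentially a nonlinear least-squares fit of the kinetic constants to measured yields. The sketch below illustrates the idea with a toy stand-in for the coupled CFD/kinetics computation (a first-order series reaction A -> B -> C at a fixed residence time); the reaction set, function names, and numbers are hypothetical, not the patented code.

```python
import numpy as np
from scipy.optimize import least_squares

# "Measured" product yields of B and C from one test condition (synthetic).
measured_yields = np.array([0.465, 0.400])

def predicted_yields(k, tau=2.0):
    """Toy stand-in for the coupled CFD/kinetics code: yields of B and C
    from a first-order series reaction A -> B -> C after residence time tau."""
    k1, k2 = k
    yA = np.exp(-k1 * tau)
    yB = k1 / (k2 - k1) * (np.exp(-k1 * tau) - np.exp(-k2 * tau))
    yC = 1.0 - yA - yB
    return np.array([yB, yC])

def residuals(k):
    return predicted_yields(k) - measured_yields

# adjust the local kinetic constants until calculated yields match the data
fit = least_squares(residuals, x0=[0.4, 0.8], bounds=(1e-6, 10.0))
print("extracted local kinetic constants:", fit.x)
```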
DOT National Transportation Integrated Search
2008-01-01
Computer simulations are often used in aviation studies. These simulation tools may require complex, high-fidelity aircraft models. Since many of the flight models used are third-party developed products, independent validation is desired prior to im...
Improving Metal Casting Process
NASA Technical Reports Server (NTRS)
1998-01-01
Don Sirois, an Auburn University research associate, and Bruce Strom, a mechanical engineering Co-Op Student, are evaluating the dimensional characteristics of an aluminum automobile engine casting. More accurate metal casting processes may reduce the weight of some cast metal products used in automobiles, such as engines. Research in low gravity has taken an important first step toward making metal products used in homes, automobiles, and aircraft less expensive, safer, and more durable. Auburn University and industry are partnering with NASA to develop one of the first accurate computer model predictions of molten metals and molding materials used in a manufacturing process called casting. Ford Motor Company's casting plant in Cleveland, Ohio is using NASA-sponsored computer modeling information to improve the casting process of automobile and light-truck engine blocks.
The Five Key Questions of Human Performance Modeling.
Wu, Changxu
2018-01-01
By building computational (typically mathematical and computer simulation) models, human performance modeling (HPM) quantifies, predicts, and maximizes human performance and human-machine system productivity and safety. This paper describes and summarizes the five key questions of human performance modeling: 1) Why we build models of human performance; 2) What the expectations of a good human performance model are; 3) What the procedures and requirements in building and verifying a human performance model are; 4) How we integrate a human performance model with system design; and 5) What the possible future directions of human performance modeling research are. Recent and classic HPM findings are addressed in the five questions to provide new thinking on HPM's motivations, expectations, procedures, system integration and future directions.
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.
2014-12-01
The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.
García-Betances, Rebeca I.; Cabrera-Umpiérrez, María Fernanda; Ottaviano, Manuel; Pastorino, Matteo; Arredondo, María T.
2016-01-01
Despite the speedy evolution of Information and Computer Technology (ICT), and the growing recognition of the importance of the concept of universal design in all domains of daily living, mainstream ICT-based product designers and developers still work without any truly structured tools, guidance or support to effectively adapt their products and services to users' real needs. This paper presents the approach used to define and evaluate parametric cognitive models that describe the interaction with and usage of ICT by people with aging- and disability-derived functional impairments. A multisensorial training platform was used to train, based on real user measurements in real conditions, the virtual parameterized user models that act as test-bed subjects during all stages of the design of disability-friendly ICT-based products. An analytical study was carried out to identify the relevant cognitive functions involved, together with their corresponding parameters as related to aging- and disability-derived functional impairments. Evaluation of the final cognitive virtual user models in a real application has confirmed that the use of these models produces concrete, valuable benefits for the design and testing of accessible ICT-based applications and services. Parameterization of cognitive virtual user models allows cognitive and perceptual aspects to be incorporated during the design process. PMID:26907296
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1978-01-01
A FORTRAN computer program is described for predicting the flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuel of varying end point and hydrogen content specifications. The program has provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case.
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1978-01-01
The FORTRAN computer program predicts flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuels of varying end point and hydrogen content specifications. The program has a provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case.
The CREp program, a fully parameterizable program to compute exposure ages (3He, 10Be)
NASA Astrophysics Data System (ADS)
Martin, L.; Blard, P. H.; Lave, J.; Delunel, R.; Balco, G.
2015-12-01
Over the last decades, cosmogenic exposure dating has permitted major advances in Earth surface sciences, particularly in paleoclimatology. Yet exposure age calculation is an involved procedure: it requires numerous parameterization choices and the use of an appropriate production rate. Earth surface scientists may either calculate exposure ages on their own or use the available programs. However, these programs do not incorporate all the most recent advances in Cosmic Ray Exposure (CRE) dating. Notably, they do not offer the most recent production rate datasets, and they offer only a few possibilities to test the impact of the atmosphere model and the geomagnetic model on the computed ages. We present the CREp program, a Matlab© code that computes CRE ages for 3He and 10Be over the last 2 million years. The CREp program includes the Lal-Stone scaling model in the "Lal modified" version (Balco et al., 2008; Lal, 1991; Stone, 2000) and the LSD model (Lifton et al., 2014). For either model, CREp allows choosing between the ERA-40 atmosphere model (Uppala et al., 2005) and the standard atmosphere (National Oceanic and Atmospheric Administration, 1976). Regarding the geomagnetic database, users can opt for one of three proposed datasets: Muscheler et al. (2005), GLOPIS-75 (Laj et al., 2004), or the geomagnetic framework proposed in the LSD model (Lifton et al., 2014); they may also import their own geomagnetic database. Importantly, the reference production rate can be chosen from a large variety of possibilities. We have made an effort to propose a wide and homogeneous calibration database in order to promote the use of local calibration rates: CREp includes all calibration data published until July 2015 and will be able to access an updated online database including newly published production rates, which is crucial for improving age accuracy. Users may also choose a global production rate, or use their own data either to calibrate a production rate or to directly input a Sea Level High Latitude value. The program can rapidly calculate a large number of ages and export the final probability density function associated with each age to an Excel© spreadsheet.
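For orientation, the core age computation behind any such program reduces, in the simplest no-erosion case, to inverting N = (P/λ)(1 − e^(−λt)) for t. A minimal sketch with assumed inputs and a generic site scaling factor; these numbers and the simple interface are illustrative, not CREp's:

```python
import math

# Hypothetical inputs, not CREp's actual interface.
N = 350000.0        # measured 10Be concentration (atoms / g quartz)
P_slhl = 4.0        # assumed sea-level high-latitude production rate (atoms / g / yr)
scaling = 8.0       # site scaling factor from a Lal/Stone- or LSD-type model
lam = math.log(2) / 1.387e6   # 10Be decay constant (half-life ~1.387 Myr)

P = P_slhl * scaling
# steady production with decay, no erosion: N = (P/lam) * (1 - exp(-lam*t))
t = -math.log(1.0 - lam * N / P) / lam
print(f"exposure age ~ {t:.0f} yr")
```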
Production Management System for AMS Computing Centres
NASA Astrophysics Data System (ADS)
Choutko, V.; Demakov, O.; Egorov, A.; Eline, A.; Shan, B. S.; Shi, R.
2017-10-01
The Alpha Magnetic Spectrometer [1] (AMS) has collected over 95 billion cosmic ray events since it was installed on the International Space Station (ISS) on May 19, 2011. To cope with this enormous flux of events, AMS uses 12 computing centers in Europe, Asia and North America, which have different hardware and software configurations. The centers participate in data reconstruction and Monte-Carlo (MC) simulation [2] (data and MC production) as well as in physics analysis. A data production management system has been developed to facilitate data and MC production tasks in the AMS computing centers, including job acquiring, submitting, monitoring, transferring, and accounting. It was designed to be modular, lightweight, and easy to deploy. The system is based on a Deterministic Finite Automaton [3] model, and is implemented in the scripting languages Python and Perl with the built-in sqlite3 database on Linux operating systems. Different batch management systems, file system storage, and transfer protocols are supported. The details of the integration with the Open Science Grid are presented as well.
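A deterministic finite automaton of this kind can be sketched as a transition table keyed by (state, event) pairs; the states and events below are illustrative placeholders, not the actual AMS job model.

```python
# Minimal DFA sketch of a production-job lifecycle; states/events are invented.
TRANSITIONS = {
    ("acquired",     "submit"): "submitted",
    ("submitted",    "start"):  "running",
    ("running",      "finish"): "transferring",
    ("running",      "fail"):   "acquired",      # failed jobs go back for resubmission
    ("transferring", "verify"): "accounted",
}

def step(state, event):
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"no transition from {state!r} on {event!r}")

state = "acquired"
for event in ["submit", "start", "finish", "verify"]:
    state = step(state, event)
print(state)  # -> accounted
```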
A Decision-making Model for a Two-stage Production-delivery System in SCM Environment
NASA Astrophysics Data System (ADS)
Feng, Ding-Zhong; Yamashiro, Mitsuo
A decision-making model is developed for an optimal production policy in a two-stage production-delivery system that incorporates a fixed-quantity supply of finished goods to a buyer at a fixed interval of time. First, a general cost model is formulated considering both the supplier (of raw materials) and buyer (of finished products) sides. An optimal solution to the problem is then derived on the basis of the cost model. Using the proposed model and its optimal solution, one can determine the optimal production lot size for each stage, the optimal number of transports of semi-finished goods, and the optimal quantity of semi-finished goods transported each time to meet the lumpy demand of consumers. We also examine the sensitivity of raw material ordering and production lot size to changes in ordering cost, transportation cost and manufacturing setup cost. A pragmatic computational approach for operational situations is proposed to obtain integer approximate solutions. Finally, we give some numerical examples.
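Because the number of transports must be an integer, such solutions are typically obtained by searching over n with the cost model. A minimal sketch using a generic EOQ-style cost structure (an assumption for illustration, not the paper's exact formulation):

```python
# Illustrative integer search for the number of semi-finished-goods transports
# per production cycle; the cost structure and values are generic stand-ins.
def total_cost(n, A=50.0, F=8.0, h1=0.4, h2=1.0, Q=300.0):
    """Cost per cycle of delivering lot Q in n sub-batches: setup cost A,
    fixed transport cost F per shipment, holding costs at the two stages
    (average inventory shrinks as Q is split into more sub-batches)."""
    return A + F * n + h1 * Q / (2 * n) + h2 * Q / (2 * n)

best_n = min(range(1, 51), key=total_cost)
print(best_n, total_cost(best_n))
```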
Two computational methods are proposed for estimation of the emission rate of volatile organic compounds (VOCs) from solvent-based indoor coating materials based on the knowledge of product formulation. The first method utilizes two previously developed mass transfer models with ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, L.A.
1980-06-01
In the Department of Energy test of the Edna Delcambre No. 1 well for recovery of natural gas from geopressured-geothermal brine, part of the test produced gas in excess of the amount that could be dissolved in the brine. Where this excess gas originated was unknown, and several theories were proposed to explain the source. This annual report describes IGT's work to match the observed gas/water production with computer simulation. Two different theoretical models were calculated in detail using available reservoir simulators. One model considered the excess gas to be dispersed as small bubbles in pores. The other model considered the excess gas as a nearby free gas cap above the aquifer. Reservoir engineering analysis of the flow test data was used to determine the basic reservoir characteristics. The computer studies revealed that the dispersed gas model gave a characteristically wrong shape for plots of the gas/water ratio, and no reasonable match of the calculated values to the experimental results could be made. The free gas cap model gave characteristically better shapes for the gas/water ratio plots if the initial edge of the free gas was only about 400 feet from the well. Because there were two other wells at approximately this distance (the Delcambre No. 4 and No. 4A wells) which had a history of down-hole blowouts and mechanical problems, it appears that the source of the excess free gas is a separate horizon connected to the Delcambre No. 1 sand via these nearby wells. This conclusion is corroborated by the changes in gas composition when the excess gas occurs and by geological studies indicating the nearest free gas cap to be several thousand feet away. The occurrence of this excess free gas can thus be explained by known reservoir characteristics, and no new model for gas entrapment or production is needed.
Stochastic model simulation using Kronecker product analysis and Zassenhaus formula approximation.
Caglar, Mehmet Umut; Pal, Ranadip
2013-01-01
Probabilistic Models are regularly applied in Genetic Regulatory Network modeling to capture the stochastic behavior observed in the generation of biological entities such as mRNA or proteins. Several approaches including Stochastic Master Equations and Probabilistic Boolean Networks have been proposed to model the stochastic behavior in genetic regulatory networks. It is generally accepted that Stochastic Master Equation is a fundamental model that can describe the system being investigated in fine detail, but the application of this model is computationally enormously expensive. On the other hand, Probabilistic Boolean Network captures only the coarse-scale stochastic properties of the system without modeling the detailed interactions. We propose a new approximation of the stochastic master equation model that is able to capture the finer details of the modeled system including bistabilities and oscillatory behavior, and yet has a significantly lower computational complexity. In this new method, we represent the system using tensors and derive an identity to exploit the sparse connectivity of regulatory targets for complexity reduction. The algorithm involves an approximation based on Zassenhaus formula to represent the exponential of a sum of matrices as product of matrices. We derive upper bounds on the expected error of the proposed model distribution as compared to the stochastic master equation model distribution. Simulation results of the application of the model to four different biological benchmark systems illustrate performance comparable to detailed stochastic master equation models but with considerably lower computational complexity. The results also demonstrate the reduced complexity of the new approach as compared to commonly used Stochastic Simulation Algorithm for equivalent accuracy.
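The Zassenhaus idea can be seen in a few lines of linear algebra: truncating e^(A+B) ≈ e^A e^B, then improving the approximation with the commutator correction e^(−[A,B]/2). The matrices below are small random stand-ins, not a genetic-network model:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = 0.1 * rng.standard_normal((4, 4))
B = 0.1 * rng.standard_normal((4, 4))

exact = expm(A + B)
order1 = expm(A) @ expm(B)                       # first Zassenhaus truncation
comm = A @ B - B @ A
order2 = expm(A) @ expm(B) @ expm(-0.5 * comm)   # keep the commutator term

for name, approx in [("e^A e^B", order1), ("with commutator", order2)]:
    print(name, np.linalg.norm(exact - approx))  # the correction shrinks the error
```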
Validation of the thermal code of RadTherm-IR, IR-Workbench, and F-TOM
NASA Astrophysics Data System (ADS)
Schwenger, Frédéric; Grossmann, Peter; Malaplate, Alain
2009-05-01
System assessment by image simulation requires synthetic scenarios that can be viewed by the device to be simulated. In addition to physical modeling of the camera, reliable modeling of scene elements is necessary. Software products for modeling target data in the IR should be capable of (i) predicting surface temperatures of scene elements over a long period of time and (ii) computing sensor views of the scenario. For such applications, FGAN-FOM acquired the software products RadTherm-IR (ThermoAnalytics Inc., Calumet, USA) and IR-Workbench (OKTAL-SE, Toulouse, France). Inspection of the accuracy of simulation results by validation is necessary before using these products for applications. In the first step of validation, the performance of both "thermal solvers" was determined through comparison of the computed diurnal surface temperatures of a simple object with the corresponding values from measurements. CUBI is a rather simple geometric object with well-known material parameters, which makes it suitable for testing and validating object models in the IR; it was used in this study as a test body. Comparison of calculated and measured surface temperature values will be presented, together with the results from the FGAN-FOM thermal object code F-TOM. In the second validation step, radiances of the simulated sensor views computed by RadTherm-IR and IR-Workbench will be compared with radiances retrieved from images recorded by the sensor being simulated. Strengths and weaknesses of the models RadTherm-IR, IR-Workbench and F-TOM will be discussed.
Agricultural production and water use scenarios in Cyprus under global change
NASA Astrophysics Data System (ADS)
Bruggeman, Adriana; Zoumides, Christos; Camera, Corrado; Pashiardis, Stelios; Zomeni, Zomenia
2014-05-01
In many countries of the world, food demand exceeds the total agricultural production. In semi-arid countries, agricultural water demand often also exceeds the sustainable supply of water resources. These water-stressed countries are expected to become even drier as a result of global climate change, which will have a significant impact on the future of the agricultural sector and on food security. The aim of the AGWATER project consortium is to provide recommendations for climate change adaptation for the agricultural sector in Cyprus and the wider Mediterranean region. Gridded climate data sets with 1-km horizontal resolution were prepared for Cyprus for 1980-2010. Regional Climate Model results were statistically downscaled with the help of spatial weather generators. A new soil map was prepared using a predictive modelling and mapping technique and a large spatial database with soil and environmental parameters. Stakeholder meetings with agriculture and water stakeholders were held to develop future water prices based on energy scenarios and to identify climate-resilient production systems; greenhouses (including hydroponic systems), grapes, potatoes, cactus pears and carob trees were the most frequently identified production systems. The green-blue-water model, based on the FAO-56 dual crop coefficient approach, has been set up to compute agricultural water demand and yields for all crop fields in Cyprus under selected future scenarios. A set of agricultural production and water use performance indicators is computed by the model, including green and blue water use, crop yield, crop water productivity, net value of crop production and economic water productivity. This work is part of the AGWATER project - AEIFORIA/GEOGRO/0311(BIE)/06 - co-financed by the European Regional Development Fund and the Republic of Cyprus through the Research Promotion Foundation.
A Machine Learning Framework to Forecast Wave Conditions
NASA Astrophysics Data System (ADS)
Zhang, Y.; James, S. C.; O'Donncha, F.
2017-12-01
Recently, significant effort has been undertaken to quantify and extract wave energy because it is renewable, environmentally friendly, abundant, and often close to population centers. However, a major challenge is the ability to accurately and quickly predict energy production, especially across a 48-hour cycle. Accurate forecasting of wave conditions is a challenging undertaking that typically involves solving the spectral action-balance equation on a discretized grid with high spatial resolution, and the nature of the computations typically demands high-performance computing infrastructure. Using a case-study site at Monterey Bay, California, a machine learning framework was trained to replicate numerically simulated wave conditions at a fraction of the typical computational cost. Specifically, the physics-based Simulating WAves Nearshore (SWAN) model, driven by measured wave conditions, nowcast ocean currents, and wind data, was used to generate training data for machine learning algorithms. The model was run between April 1st, 2013 and May 31st, 2017, generating forecasts at three-hour intervals and yielding 11,078 distinct model outputs. SWAN-generated fields of 3,104 wave heights and a characteristic period could be replicated through simple matrix multiplications using the mapping matrices from the machine learning algorithms. In fact, wave-height RMSEs from the machine learning algorithms (9 cm) were less than those for the SWAN model-verification exercise, where those simulations were compared to buoy wave data within the model domain (>40 cm). The validated machine learning approach, which acts as an accurate surrogate for the SWAN model, can now be used to perform real-time forecasts of wave conditions for the next 48 hours using available forecasted boundary wave conditions, ocean currents, and winds. This solution has obvious applications to wave-energy generation, as accurate wave conditions can be forecasted with an over three-order-of-magnitude reduction in computational expense. The low computational cost (and by association low computer-power requirement) means that the machine learning algorithms could be installed on a wave-energy converter as a form of "edge computing", where a device could forecast its own 48-hour energy production.
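The "simple matrix multiplications" step can be sketched as an ordinary least-squares mapping from forcing inputs to the wave-height field. The synthetic data below only mimics the dimensions quoted above (3,104 grid points) and is not the study's dataset or its actual learning algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-ins: 500 "SWAN runs", 6 forcing inputs (winds, currents,
# boundary waves), 3104 grid wave heights per run.
X = rng.standard_normal((500, 6))
true_map = rng.standard_normal((6, 3104))
H = X @ true_map + 0.01 * rng.standard_normal((500, 3104))

# Train the mapping matrix by least squares; a forecast is then one
# matrix multiplication of a new forcing vector with M.
M, *_ = np.linalg.lstsq(X, H, rcond=None)
forecast = X[:1] @ M    # wave-height field for one forcing vector
print(forecast.shape)   # (1, 3104)
```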
DOT National Transportation Integrated Search
1977-01-01
Auto production and operation consume energy, material, capital and labor resources. Numerous substitution possibilities exist within and between resource sectors, corresponding to the broad spectrum of potential design technologies. Alternative auto...
CELSS scenario analysis: Breakeven calculations
NASA Technical Reports Server (NTRS)
Mason, R. M.
1980-01-01
A model of the relative mass requirements of food production components in a controlled ecological life support system (CELSS) based on regenerative concepts is described. Included are a discussion of model scope, structure, and example calculations. Computer programs for cultivar and breakeven calculations are also included.
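A breakeven calculation of this kind compares the large fixed mass of a regenerative food-production system against the steadily accumulating mass of resupplied stored food. A toy sketch with invented numbers, not the report's values:

```python
# Toy breakeven estimate: after how many days does a regenerative food
# system (large fixed mass, small daily consumables) beat stored food?
m_fixed = 4000.0      # kg, regenerative production hardware (assumed)
m_daily_regen = 1.0   # kg/day, consumables for the regenerative system (assumed)
m_daily_stored = 5.0  # kg/day, stored food plus packaging (assumed)

days = m_fixed / (m_daily_stored - m_daily_regen)
print(f"breakeven after ~{days:.0f} days")
```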
Computer Model Of Fragmentation Of Atomic Nuclei
NASA Technical Reports Server (NTRS)
Wilson, John W.; Townsend, Lawrence W.; Tripathi, Ram K.; Norbury, John W.; Khan, Ferdous; Badavi, Francis F.
1995-01-01
The High Charge and Energy Semiempirical Nuclear Fragmentation Model (HZEFRG1) computer program was developed to be a computationally efficient, user-friendly, physics-based program for generating databases on the fragmentation of atomic nuclei. The databases generated are used in calculations pertaining to such radiation-transport applications as shielding against radiation in outer space, radiation dosimetry in outer space, cancer therapy in laboratories with beams of heavy ions, and simulation studies for designing detectors for experiments in nuclear physics. The program provides cross sections for the production of individual elements and isotopes in breakups of high-energy heavy ions by the combined nuclear and Coulomb fields of interacting nuclei. Written in ANSI FORTRAN 77.
ERIC Educational Resources Information Center
Wick, Michael R.; Kleine, Patricia A.; Nelson, Andrew J.
2011-01-01
This article presents the development, testing, and application of an enrollment model. The model incorporates incoming freshman enrollment class size and historical persistence, transfer, and graduation rates to predict a six-year enrollment window and associated annual graduate production. The model predicts six-year enrollment to within 0.67…
Van Bockstal, Pieter-Jan; Mortier, Séverine Thérèse F C; De Meyer, Laurens; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas
2017-05-01
Conventional pharmaceutical freeze-drying is an inefficient and expensive batch-wise process, associated with several disadvantages leading to uncontrolled end-product variability. The proposed continuous alternative, based on spinning the vials during freezing and on optimal energy supply during drying, strongly increases process efficiency and improves product quality (uniformity). The heat transfer during continuous drying of the spin-frozen vials is provided via non-contact infrared (IR) radiation. The energy transfer to the spin-frozen vials should be optimised to maximise the drying efficiency while avoiding cake collapse. Therefore, a mechanistic model was developed which allows computing the optimal, dynamic IR heater temperature as a function of the primary drying progress and which hence also allows predicting the primary drying endpoint based on the applied dynamic IR heater temperature. The model was validated by drying spin-frozen vials containing the model formulation (3.9 mL in 10R vials) according to the computed IR heater temperature profile. In total, 6 validation experiments were conducted. The primary drying endpoint was experimentally determined via in-line near-infrared (NIR) spectroscopy and compared with the endpoint predicted by the model (50 min). The mean ratio of the experimental drying time to the predicted value was 0.91, indicating good agreement between the model predictions and the experimental data. The end product had an elegant appearance (visual inspection) and an acceptable residual moisture content (Karl Fischer).
Huang, Zhi; Marra, Francesco; Subbiah, Jeyamkondan; Wang, Shaojin
2018-04-13
Radio frequency (RF) heating has great potential for achieving rapid and volumetric heating in foods, providing safe and high-quality food products thanks to its deep penetration depth, moisture self-balancing effects, and freedom from chemical residues. However, the nonuniform heating problem (usually resulting in hot and cold spots in the heated product) needs to be resolved. The inhomogeneous temperature distribution not only affects the quality of the food but also raises food-safety issues, since microorganisms or insects may not be controlled in the cold spots. Mathematical modeling of RF heating processes has recently been studied extensively for a wide variety of agricultural products. This paper presents a comprehensive review of recent progress in computer simulation for RF heating uniformity improvement and the solutions offered to reduce heating nonuniformity. It provides a brief introduction to the basic principle of RF heating technology, analyzes the applications of numerical simulation, and discusses the factors influencing RF heating uniformity and possible methods to improve it. Mathematical modeling improves the understanding of RF heating of food and is essential for optimizing RF treatment protocols for pasteurization and disinfestation applications. Recommendations for future research are proposed to further improve the accuracy of numerical models: covering both heat and mass transfer in the model, validating models with sample movement and mixing, and identifying the important model parameters by sensitivity analysis.
Mathematical Modeling of Resonant Processes in Confined Geometry of Atomic and Atom-Ion Traps
NASA Astrophysics Data System (ADS)
Melezhik, Vladimir S.
2018-02-01
We discuss computational aspects of the mathematical models we have developed for resonant processes in the confined geometry of atomic and atom-ion traps. The main attention is paid to the formulation, in the nondirect-product discrete-variable representation (npDVR), of the multichannel scattering problem with a nonseparable angular part in confining traps as a boundary-value problem. The computational efficiency of this approach is demonstrated in application to the atomic and atom-ion confinement-induced resonances we predicted recently.
Middle Rio Grande Cooperative Water Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tidwell, Vince; Passell, Howard
2005-11-01
This is a computer simulation model built in a commercial modeling product called Studio Expert, developed by Powersim, Inc. The simulation model is built in a system dynamics environment, allowing simulation of the interactions among multiple systems that are all changing over time. The model focuses on the hydrology, ecology, demography, and economy of the Middle Rio Grande, with water as the unifying feature.
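A system-dynamics model of this type is, at its core, a set of stocks integrated forward in time under interacting flows. A toy stock-and-flow sketch in that spirit; the stocks, flows, and numbers are invented, not the model's actual structure:

```python
# Toy system-dynamics loop: a water stock coupled to a growing demand stock.
storage = 1.0e6          # acre-feet in a reservoir stock (assumed)
population = 6.0e5       # people in the demand stock (assumed)
for year in range(2006, 2016):
    inflow = 9.0e5                       # acre-feet/yr river inflow
    demand = 0.5 * population            # acre-feet/yr municipal use
    evap = 0.05 * storage                # simple evaporation loss
    storage += inflow - demand - evap    # integrate the stock
    population *= 1.01                   # growth feeds back on demand
    print(year, round(storage), round(population))
```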
Modeling and control for closed environment plant production systems
NASA Technical Reports Server (NTRS)
Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)
2002-01-01
A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
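The controller's central step, minimizing a cost built from weighted control effort plus squared tracking error, can be sketched as a small quadratic minimization. The response matrix, weights, and reference below are illustrative assumptions, not the paper's fitted MPR models:

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch of the controller idea: choose set-point adjustments u
# (light, temperature, CO2) minimizing squared tracking error plus
# weighted control effort. G, r, and R are invented for illustration.
G = np.array([[0.8, 0.1, 0.3],     # linearized crop response to [light, temp, CO2]
              [0.2, 0.6, 0.4]])
r = np.array([1.0, 0.5])           # desired deviation of the reference signal
R = 0.1 * np.eye(3)                # control-effort weighting

def cost(u):
    e = G @ u - r                  # tracking error of the system response
    return e @ e + u @ R @ u       # squared error + weighted effort

u_opt = minimize(cost, np.zeros(3)).x
print("set-point adjustments:", u_opt)
```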
Parametric identification of the process of preparing ceramic mixture as an object of control
NASA Astrophysics Data System (ADS)
Galitskov, Stanislav; Nazarov, Maxim; Galitskov, Konstantin
2017-10-01
The manufacture of ceramic materials and products largely depends on the preparation of the clay raw materials. The main process here is mixing, which in industrial production is mostly done in cross-compound continuous clay mixers with steam humidification. The authors identified features of the dynamics of this technological stage, which is itself a nonlinear control object with distributed parameters. When solving practical automation tasks for the production of a certain class of ceramic materials, it is important to perform parametric identification of the moving clay. In this paper the task is solved with the use of computational models approximated to a particular section of a clay mixer along its length. The research introduces a methodology of computational experiments as applied to the designed computational model. Parametric identification of the dynamic links was carried out from transient characteristics. The experiments showed that the control object in question is highly non-stationary. The obtained results are problem-oriented toward synthesizing a multidimensional automatic control system for the preparation of ceramic mixture with specified humidity and temperature values under the major disturbances of the technological process.
Computational Studies for Underground Coal Gasification (UCG) Process
NASA Astrophysics Data System (ADS)
Chatterjee, Dipankar
2017-07-01
Underground coal gasification (UCG) is a well-proven technology for accessing coal that lies too deep underground, or is otherwise too costly to extract using conventional mining methods. UCG product gas is commonly used as a chemical feedstock or as fuel for power generation. During the UCG process, a cavity is formed in the coal seam as the coal is converted to gaseous products, and the cavity grows in a three-dimensional fashion as gasification proceeds. The UCG process results from complex interactions of various geo-thermo-mechanical processes such as fluid flow, heat and mass transfer, chemical reactions, water influx, thermo-mechanical failure, and other geological aspects. The rate of growth of this cavity and its shape have a significant impact on the gas flow patterns, chemical kinetics, temperature distributions, and ultimately the quality of the product gas. Insufficient information is available in the literature to provide clear insight into these issues, which leaves a great opportunity to investigate the UCG process from both experimental and theoretical perspectives. In the development and exploration of new research, experiment is undoubtedly very important; however, owing to the excessive cost of experimentation, it is not always feasible for a complicated process like UCG. With the advent of high-performance computational facilities, many such physically involved problems can now be studied numerically using computational tools such as CFD (computational fluid dynamics). To gain a comprehensive understanding of the underlying physical phenomena, modeling strategies have frequently been utilized for the UCG process. In view of the above, the various modeling strategies commonly deployed for mathematical modeling of the UCG process are described here in a concise manner. The available strategies are categorized into several groups and their salient features discussed in order to give a good understanding of the underlying physical phenomena. This should be valuable documentation for understanding the physical process of UCG and should pave the way for formulating new modeling and simulation techniques for the UCG process.
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.
2016-12-01
The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are used in NRT to improve forecast models and aircraft icing warnings and to support aircraft field campaigns. Next-generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to their increased temporal and spatial resolution: the volume of data is expected to increase approximately 10-fold. This increase in data volume will require additional IT resources to keep up with the processing demands of the NRT requirements, and these resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow to ingest, process, and distribute SatCORPS products and the technologies used. Lessons learned from working on both the AWS Cloud and GovCloud will be discussed: benefits, similarities, and differences that could impact the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout will be discussed for GOES-R.
Caprock Integrity during Hydrocarbon Production and CO2 Injection in the Goldeneye Reservoir
NASA Astrophysics Data System (ADS)
Salimzadeh, Saeed; Paluszny, Adriana; Zimmerman, Robert
2016-04-01
Carbon Capture and Storage (CCS) is a key technology for addressing climate change and maintaining security of energy supplies, while potentially offering important economic benefits. Offshore UK, depleted hydrocarbon reservoirs have the potential capacity to store significant quantities of the carbon dioxide produced during power generation from fossil fuels. The Goldeneye depleted gas condensate field, located offshore in the UK North Sea at a depth of ~2600 m, is a candidate for the storage of at least 10 million tons of CO2. In this research, a fully coupled, full-scale model (50×20×8 km), based on the Goldeneye reservoir, is built and used for hydrocarbon production and CO2 injection simulations. The model accounts for fluid flow, heat transfer, and deformation of the fractured reservoir. Flow through fractures is defined as two-dimensional laminar flow within the three-dimensional poroelastic medium. Local thermal non-equilibrium between the injected CO2 and the host reservoir has been considered, with conductive and advective heat transfer. The numerical model has been developed using the standard finite element method with Galerkin spatial discretisation and finite difference temporal discretisation. The geomechanical model has been implemented in the object-oriented Imperial College Geomechanics Toolkit, in close interaction with the Complex Systems Modelling Platform (CSMP), and validated against several benchmark examples. Fifteen major faults are mapped from the Goldeneye field into the model. Modal stress intensity factors, for the three modes of fracture opening during the hydrocarbon production and CO2 injection phases, are computed at the tips of the faults by evaluating the I-integral over a virtual disk. Contact stresses (normal and shear) on the fault surfaces are iteratively computed using a gap-based augmented Lagrangian-Uzawa method. Results show fault activation during the production phase, which may affect a fault's hydraulic conductivity and its connection to the reservoir rocks. The direction of growth is downward during production and is expected to be upward during injection. Elevated fluid pressures inside faults during CO2 injection may further facilitate fault activation by reducing normal effective stresses. Activated faults can act as permeable conduits and potentially jeopardise caprock integrity for CO2 storage purposes.
ERIC Educational Resources Information Center
Faiola, Anthony; Matei, Sorin Adam
2010-01-01
The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…
2007-06-01
CFD in the AFSEO production environment includes the requirement for rapid ... The SDB, designated GBU-39/B, contains a 250-pound warhead and measures 6 feet ... "Take Off (RATO) Separation." ITEA Conference, Apr 2006. [Figure captions: Figure 1. GBU-39/B small diameter bomb computational model; Figure 3. B-52/MOP; Figure 4. MOP.]
NASA Astrophysics Data System (ADS)
Nurjanah; Dahlan, J. A.; Wibisono, Y.
2017-02-01
This paper aims to design and develop computer-based e-learning teaching materials for improving the mathematical understanding ability and spatial sense of junior high school students. The particular aims are (1) to produce a teaching material design, an evaluation model, and instruments to measure the mathematical understanding ability and spatial sense of junior high school students; (2) to conduct trials of the computer-based e-learning teaching material model, assessment, and instruments; (3) to complete the teaching material models of computer-based e-learning and assessment for developing mathematical understanding ability and spatial sense; and (4) to deliver the resulting research product: computer-based e-learning teaching materials in the form of an interactive learning disc. The research method used in this study is developmental research, conducted through thought experiments and instruction experiments. The results showed that the teaching materials could be used very well; this is based on validation of the computer-based e-learning teaching materials by 5 multimedia experts. The five validators gave consistent judgements of the face and content validity of each test item for mathematical understanding ability and spatial sense. The reliability coefficients of the mathematical understanding ability and spatial sense tests are 0.929 and 0.939, respectively, which is very high, and the validity of both tests meets high and very high criteria.
Development of a Model and Computer Code to Describe Solar Grade Silicon Production Processes
NASA Technical Reports Server (NTRS)
Srivastava, R.; Gould, R. K.
1979-01-01
The program aims at developing mathematical models and computer codes based on these models, which allow prediction of the product distribution in chemical reactors for converting gaseous silicon compounds to condensed-phase silicon. The major interest is in collecting silicon as a liquid on the reactor walls and other collection surfaces. Two reactor systems are of major interest, a SiCl4/Na reactor in which Si(l) is collected on the flow tube reactor walls and a reactor in which Si(l) droplets formed by the SiCl4/Na reaction are collected by a jet impingement method. During this quarter the following tasks were accomplished: (1) particle deposition routines were added to the boundary layer code; and (2) Si droplet sizes in SiCl4/Na reactors at temperatures below the dew point of Si are being calculated.
Numerical Simulations of Thermobaric Explosions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, A L; Bell, J B; Beckner, V E
2007-05-04
A model of the energy evolution in thermobaric explosions is presented. It is based on a two-phase formulation: conservation laws for the gas and particle phases along with inter-phase interaction terms. It incorporates a combustion model based on the mass conservation laws for fuel, air and products; source/sink terms are treated in the fast-chemistry limit appropriate for such gas dynamic fields. The model takes into account both the afterburning of the detonation products of the booster with air, and the combustion of the fuel (Al or TNT detonation products) with air. Numerical simulations were performed for 1.5-g thermobaric explosions in five different chambers (volumes ranging from 6.6 to 40 liters and length-to-diameter ratios from 1 to 12.5). Computed pressure waveforms were very similar to measured waveforms in all cases, thereby demonstrating that the model correctly predicts the energy evolution in such explosions. The computed global fuel consumption μ(t) behaved as an exponential life function. Its derivative μ̇(t) represents the global rate of fuel consumption; it depends on the rate of turbulent mixing, which controls the rate of energy release in thermobaric explosions.
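An exponential life function and its derivative can be written down directly. The specific form and all numbers below are assumptions for illustration, since the abstract does not give the fitted expression:

```python
import numpy as np

# Assumed exponential life function mu(t) = mu_inf * (1 - exp(-t/tau)),
# so that dmu/dt = (mu_inf - mu)/tau; values are illustrative only.
mu_inf, tau = 1.5e-3, 0.5e-3        # kg of fuel consumed, time constant in s
t = np.linspace(0.0, 3e-3, 7)
mu = mu_inf * (1.0 - np.exp(-t / tau))
rate = (mu_inf - mu) / tau          # global rate of fuel consumption
for ti, mi, ri in zip(t, mu, rate):
    print(f"t={ti:.1e} s  mu={mi:.2e} kg  dmu/dt={ri:.2e} kg/s")
```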
Bioproducts and environmental quality: Biofuels, greenhouse gases, and water quality
NASA Astrophysics Data System (ADS)
Ren, Xiaolin
Promoting bio-based products is one oft-proposed solution to reduce GHG emissions because the feedstocks capture carbon, offsetting at least partially the carbon discharges resulting from use of the products. However, several life cycle analyses point out that while biofuels may emit less life cycle net carbon emissions than fossil fuels, they may exacerbate other parts of biogeochemical cycles, notably nutrient loads in the aquatic environment. In three essays, this dissertation explores the tradeoff between GHG emissions and nitrogen leaching associated with biofuel production using general equilibrium models. The first essay develops a theoretical general equilibrium model to calculate the second-best GHG tax with the existence of a nitrogen leaching distortion. The results indicate that the second-best GHG tax could be higher or lower than the first-best tax rates depending largely on the elasticity of substitution between fossil fuel and biofuel. The second and third essays employ computable general equilibrium models to further explore the tradeoff between GHG emissions and nitrogen leaching. The computable general equilibrium models also incorporate multiple biofuel pathways, i.e., biofuels made from different feedstocks using different processes, to identify the cost-effective combinations of biofuel pathways under different policies, and the corresponding economic and environmental impacts.
Detonation product EOS studies: Using ISLS to refine CHEETAH
NASA Astrophysics Data System (ADS)
Zaug, Joseph; Fried, Larry; Hansen, Donald
2001-06-01
Knowledge of an effective interatomic potential function underlies any effort to predict or rationalize the properties of solids and liquids. The experiments we undertake are directed towards determination of equilibrium and dynamic properties of simple fluids at densities sufficiently high that traditional computational methods and semi-empirical forms successful at ambient conditions may require reconsideration. In this paper we present high-pressure and temperature experimental sound speed data on a suite of non-ideal simple fluids and fluid mixtures. Impulsive Stimulated Light Scattering conducted in the diamond-anvil cell offers an experimental approach to determine cross-pair potential interactions through equation of state determinations. In addition the kinetics of structural relaxation in fluids can be studied. We compare our experimental results with our thermochemical computational model CHEETAH. Computational models are systematically improved with each addition of experimental data. Experimentally grounded computational models provide a good basis to confidently understand the chemical nature of reactions at extreme conditions.
OpenFOAM: Open source CFD in research and industry
NASA Astrophysics Data System (ADS)
Jasak, Hrvoje
2009-12-01
The current focus of development in industrial Computational Fluid Dynamics (CFD) is the integration of CFD into computer-aided product development, geometrical optimisation, robust design and the like. CFD research, on the other hand, aims to extend the boundaries of practical engineering use into "non-traditional" areas. The requirements of computational flexibility and code integration are contradictory: a change of coding paradigm, with object orientation, library components and equation mimicking, is proposed as a way forward. This paper describes OpenFOAM, a C++ object-oriented library for Computational Continuum Mechanics (CCM) developed by the author. Efficient and flexible implementation of complex physical models is achieved by mimicking the form of partial differential equations in software, with code functionality provided in library form. The open-source deployment and development model allows the user to achieve the desired versatility in physical modeling without sacrificing complex geometry support or execution efficiency.
NASA Astrophysics Data System (ADS)
Carozza, D. A.; Bianchi, D.; Galbraith, E. D.
2015-12-01
Environmental change and the exploitation of marine resources have had profound impacts on marine communities, with potential implications for ocean biogeochemistry and food security. In order to study such global-scale problems, it is helpful to have computationally efficient numerical models that predict the first-order features of fish biomass production as a function of the environment, based on empirical and mechanistic understandings of marine ecosystems. Here we describe the ecological module of the BiOeconomic mArine Trophic Size-spectrum (BOATS) model, which takes an Earth-system approach to modeling fish biomass at the global scale. The ecological model is designed to be used on an Earth System model grid, and determines size spectra of fish biomass by explicitly resolving life history as a function of local temperature and net primary production. Biomass production is limited by the availability of photosynthetic energy to upper trophic levels, following empirical trophic efficiency scalings, and by well-established empirical temperature-dependent growth rates. Natural mortality is calculated using an empirical size-based relationship, while reproduction and recruitment depend on both the food availability to larvae from net primary production and the production of eggs by mature adult fish. We describe predicted biomass spectra and compare them to observations, and conduct a sensitivity study to determine how they change as a function of net primary production and temperature. The model relies on a limited number of parameters compared to similar modeling efforts, while retaining realistic representations of biological and ecological processes, and is computationally efficient, allowing extensive parameter-space analyses even when implemented globally. As such, it enables the exploration of the linkages between ocean biogeochemistry, climate, and upper trophic levels at the global scale, as well as a representation of fish biomass for idealized studies of fisheries.
NASA Astrophysics Data System (ADS)
Carozza, David Anthony; Bianchi, Daniele; Galbraith, Eric Douglas
2016-04-01
Environmental change and the exploitation of marine resources have had profound impacts on marine communities, with potential implications for ocean biogeochemistry and food security. In order to study such global-scale problems, it is helpful to have computationally efficient numerical models that predict the first-order features of fish biomass production as a function of the environment, based on empirical and mechanistic understandings of marine ecosystems. Here we describe the ecological module of the BiOeconomic mArine Trophic Size-spectrum (BOATS) model, which takes an Earth-system approach to modelling fish biomass at the global scale. The ecological model is designed to be used on an Earth-system model grid, and determines size spectra of fish biomass by explicitly resolving life history as a function of local temperature and net primary production. Biomass production is limited by the availability of photosynthetic energy to upper trophic levels, following empirical trophic efficiency scalings, and by well-established empirical temperature-dependent growth rates. Natural mortality is calculated using an empirical size-based relationship, while reproduction and recruitment depend on both the food availability to larvae from net primary production and the production of eggs by mature adult fish. We describe predicted biomass spectra and compare them to observations, and conduct a sensitivity study to determine how they change as a function of net primary production and temperature. The model relies on a limited number of parameters compared to similar modelling efforts, while retaining reasonably realistic representations of biological and ecological processes, and is computationally efficient, allowing extensive parameter-space analyses even when implemented globally. As such, it enables the exploration of the linkages between ocean biogeochemistry, climate, and upper trophic levels at the global scale, as well as a representation of fish biomass for idealized studies of fisheries.
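Two of the empirical ingredients named above, temperature-dependent growth and size-based natural mortality, are commonly written as allometric power laws with a Boltzmann-Arrhenius temperature factor. The sketch below uses such generic forms with invented constants; it is not BOATS's actual parameterization:

```python
import numpy as np

# Generic allometric forms (assumed, not BOATS's equations).
def growth(m, T, a=0.5, Ea=0.45, k=8.617e-5, Tref=283.15):
    """dm/dt ~ a * m^(3/4), scaled by a Boltzmann-Arrhenius factor in T (kelvin)."""
    return a * m ** 0.75 * np.exp(-Ea / k * (1.0 / T - 1.0 / Tref))

def natural_mortality(m, m0=0.3, b=-0.25):
    """Empirical size-based mortality: larger fish die at lower specific rates."""
    return m0 * m ** b

for m in (0.01, 0.1, 1.0, 10.0):   # fish mass in kg
    print(m, growth(m, T=288.15), natural_mortality(m))
```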
Toward a unified account of comprehension and production in language development.
McCauley, Stewart M; Christiansen, Morten H
2013-08-01
Although Pickering & Garrod (P&G) argue convincingly for a unified system for language comprehension and production, they fail to explain how such a system might develop. Using a recent computational model of language acquisition as an example, we sketch a developmental perspective on the integration of comprehension and production. We conclude that only through development can we fully understand the intertwined nature of comprehension and production in adult processing.
AGIS: The ATLAS Grid Information System
NASA Astrophysics Data System (ADS)
Anisenkov, A.; Di Girolamo, A.; Klimentov, A.; Oleynik, D.; Petrosyan, A.; Atlas Collaboration
2014-06-01
ATLAS, a particle physics experiment at the Large Hadron Collider at CERN, produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm and a high degree of decentralization, with computing resources able to meet the ATLAS requirements of petabyte-scale data operations. In this paper we describe the ATLAS Grid Information System (AGIS), designed to integrate configuration and status information about the resources, services and topology of the computing infrastructure used by the ATLAS Distributed Computing applications and services.
Turbulent reacting flow computations including turbulence-chemistry interactions
NASA Technical Reports Server (NTRS)
Narayan, J. R.; Girimaji, S. S.
1992-01-01
A two-equation (k-epsilon) turbulence model has been extended to be applicable for compressible reacting flows. A compressibility correction model based on modeling the dilatational terms in the Reynolds stress equations has been used. A turbulence-chemistry interaction model is outlined. In this model, the effects of temperature and species mass concentrations fluctuations on the species mass production rates are decoupled. The effect of temperature fluctuations is modeled via a moment model, and the effect of concentration fluctuations is included using an assumed beta-pdf model. Preliminary results obtained using this model are presented. A two-dimensional reacting mixing layer has been used as a test case. Computations are carried out using the Navier-Stokes solver SPARK using a finite rate chemistry model for hydrogen-air combustion.
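The assumed-pdf step can be illustrated directly: build a beta distribution from the resolved mean and variance of a fluctuating scalar, then average an instantaneous rate law over it. The toy rate law and moments below are illustrative stand-ins, not the SPARK chemistry model:

```python
import numpy as np
from scipy.stats import beta
from scipy.integrate import quad

# Assumed-beta-pdf closure sketch: mean rate = integral of w(c) * P(c) dc,
# with beta parameters fixed by the resolved mean and variance of c.
c_mean, c_var = 0.3, 0.02
g = c_mean * (1 - c_mean) / c_var - 1.0
a, b = c_mean * g, (1 - c_mean) * g            # beta shape parameters from moments

w = lambda c: c * (1 - c) ** 2                 # toy instantaneous reaction-rate law
w_mean, _ = quad(lambda c: w(c) * beta.pdf(c, a, b), 0.0, 1.0)
print("mean rate with fluctuations:", w_mean)
print("rate at the mean (no closure):", w(c_mean))
```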
ERIC Educational Resources Information Center
Wee, Loo Kang
2012-01-01
We develop an Easy Java Simulation (EJS) model for students to experience the physics of idealized one-dimensional collision carts. The physics model is described and simulated by both continuous dynamics and discrete transition during collision. In designing the simulations, we discuss briefly three pedagogical considerations namely (1) a…
EX.MAIN. Expert System Model for Maintenance and Staff Training.
ERIC Educational Resources Information Center
Masturzi, Elio R.
EX.MAIN, a model for maintenance and staff training which combines knowledge based expert systems and computer based training, was developed jointly by the Department of Production Engineering of the University of Naples and CIRCUMVESUVIANA, the largest private railroad in Italy. It is a global model in the maintenance field which contains both…
ZIMOD: A Simple Computer Model of the Zimbabwean Economy.
ERIC Educational Resources Information Center
Knox, Jon; And Others
1988-01-01
This paper describes a rationale for the construction and use of a simple consistency model of the Zimbabwean economy that incorporates an input-output matrix. The model is designed to investigate alternative industrial strategies and their consequences for the balance of payments, consumption, and overall gross domestic product growth for a…
NASA Astrophysics Data System (ADS)
Pi, E. I.; Siegel, E.
2010-03-01
Siegel[AMS Natl.Mtg.(2002)-Abs.973-60-124] digits logarithmic-law inversion to ONLY BEQS BEC:Quanta/Bosons=#: EMP-like SEVERE VULNERABILITY of ONLY #-networks (VS. ANALOG INvulnerability) via Barabasi NP (VS. dynamics[Not.AMS(5/2009)] critique); (so called) ``quantum-computing''(QC) = simple-arithmetic (sans division); algorithmic complexities: INtractibility/UNdecidability/INefficiency/NONcomputability/HARDNESS (so MIScalled) ``noise''-induced-phase-transition (NIT) ACCELERATION: Cook-Levin theorem Reducibility = RG fixed-points; #-Randomness DEFINITION via WHAT? Query (VS. Goldreich[Not.AMS(2002)] How? mea culpa) = ONLY MBCS hot-plasma v #-clumping NON-random BEC; Modular-Arithmetic Congruences = Signal x Noise PRODUCTS = clock-model; NON-Shor[Physica A,341,586(04)] BEC logarithmic-law inversion factorization: Watkins #-theory U statistical-physics); P=/=NP C-S TRIVIAL Proof: Euclid!!! [(So Miscalled) computational-complexity J-O obviation (3 millennia AGO geometry: NO:CC,``CS''; ``Feet of Clay!!!'']; Query WHAT?: Definition: (so MIScalled) ``complexity'' = UTTER-SIMPLICITY!! v COMPLICATEDNESS MEASURE(S).
Exploiting data representation for fault tolerance
Hoemmen, Mark Frederick; Elliott, J.; Sandia National Lab.; ...
2015-01-06
Incorrect computer hardware behavior may corrupt intermediate computations in numerical algorithms, possibly resulting in incorrect answers. Prior work models misbehaving hardware by randomly flipping bits in memory. We start by accepting this premise, and present an analytic model for the error introduced by a bit flip in an IEEE 754 floating-point number. We then relate this finding to the linear algebra concepts of normalization and matrix equilibration. In particular, we present a case study illustrating that normalizing both vector inputs of a dot product minimizes the probability of a single bit flip causing a large error in the dot product's result. Moreover, the absolute error is either less than one or very large, which allows detection of large errors. Then, we apply this to the GMRES iterative solver. We count all possible errors that can be introduced through faults in arithmetic in the computationally intensive orthogonalization phase of GMRES, and show that when the matrix is equilibrated, the absolute error is bounded above by one.
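The premise of the analytic model can be reproduced in a few lines: flip one bit of an IEEE 754 double inside a dot product of vectors whose entries are bounded by one, and observe that the absolute error is either small or enormous. A minimal sketch (not the authors' code):

```python
import random
import struct

def flip_bit(x: float, bit: int) -> float:
    """Return x with one bit (0..63) of its IEEE 754 binary64 encoding flipped."""
    (u,) = struct.unpack("<Q", struct.pack("<d", x))
    (y,) = struct.unpack("<d", struct.pack("<Q", u ^ (1 << bit)))
    return y

def dot(xs, ys, corrupt_at=None, bit=None):
    """Dot product; optionally corrupt one operand with a single bit flip."""
    s = 0.0
    for i, (a, b) in enumerate(zip(xs, ys)):
        if i == corrupt_at:
            a = flip_bit(a, bit)
        s += a * b
    return s

n = 64
xs = [random.uniform(-1, 1) for _ in range(n)]  # "normalized": |x_i| <= 1
ys = [random.uniform(-1, 1) for _ in range(n)]
ref = dot(xs, ys)
for bit in (2, 30, 52, 62):  # low mantissa, high mantissa, low/high exponent
    err = abs(dot(xs, ys, corrupt_at=0, bit=bit) - ref)
    print(f"bit {bit:2d}: |error| = {err:.3e}")
```

Mantissa flips and the lowest exponent bit perturb the result by at most about one (since both factors are bounded by one), while a high exponent bit produces an error many orders of magnitude too large to miss, consistent with the detectability claim above.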
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, A.L.
This report presents a summary of the status of research activities associated with fission product behavior (release and transport) under severe accident conditions within the primary systems of water-moderated and water-cooled nuclear reactors. For each of the areas of fission product release and fission product transport, the report summarizes relevant information on important phenomena, major experiments performed, relevant computer models and codes, comparisons of computer code calculations with experimental results, and general conclusions on the overall state of the art. Finally, the report provides an assessment of the overall importance and knowledge of primary system release and transport phenomena and presents major conclusions on the state of the art.
Acute and chronic environmental effects of clandestine methamphetamine waste.
Kates, Lisa N; Knapp, Charles W; Keenan, Helen E
2014-09-15
The illicit manufacture of methamphetamine (MAP) produces substantial amounts of hazardous waste that is dumped illegally. This study presents the first environmental evaluation of waste produced from illicit MAP manufacture. Chemical oxygen demand (COD) was measured to assess immediate oxygen depletion effects. A mixture of five waste components (10 mg/L per chemical) was found to have a COD (130 mg/L) exceeding the European Union wastewater discharge limit (125 mg/L). Two environmental partition coefficients, K(OW) and K(OC), were measured for several chemicals identified in MAP waste. Experimental values were input into a computer fugacity model (EPI Suite™) to estimate environmental fate. Experimental log K(OW) values ranged from -0.98 to 4.91, which were in accordance with computer-estimated values. Experimental K(OC) values ranged from 11 to 72, which were much lower than the default computer values. The experimental fugacity model for discharge to water estimates that waste components will remain in the water compartment for 15 to 37 days. Using a combination of laboratory experimentation and computer modelling, the environmental fate of MAP waste products was estimated. While fugacity models using experimental and computational values were very similar, default computer models should not take the place of laboratory experimentation. Copyright © 2014 Elsevier B.V. All rights reserved.
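For readers unfamiliar with K(OC), a minimal illustration of how an organic carbon-water partition coefficient translates into the fraction of a chemical that stays dissolved in a water body follows. This is not the EPI Suite fugacity model; the water-body parameters are hypothetical.

```python
# Illustrative sorption partitioning sketch: estimate the dissolved fraction
# of a waste component from a measured K_OC. Parameter values are hypothetical.

def dissolved_fraction(koc, f_oc=0.01, solids_g_per_l=0.05):
    """Fraction of chemical remaining dissolved in the water column.

    koc            -- organic carbon-water partition coefficient (L/kg)
    f_oc           -- fraction organic carbon in suspended solids
    solids_g_per_l -- suspended solids concentration (g/L)
    """
    kd = koc * f_oc                          # solids-water coefficient (L/kg)
    solids_kg_per_l = solids_g_per_l / 1000.0
    return 1.0 / (1.0 + kd * solids_kg_per_l)

for koc in (11, 72):  # the experimental K_OC range reported above
    print(f"K_OC = {koc:2d}: {dissolved_fraction(koc):.4f} dissolved")
```

With K(OC) values this low, essentially all of the chemical stays in the water compartment, consistent with the 15- to 37-day water-column residence estimated above.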
An, Gary; Bartels, John; Vodovotz, Yoram
2011-03-01
The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and -content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will be ultimately used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism.
Electron Impact Ionization: A New Parameterization for 100 eV to 1 MeV Electrons
NASA Technical Reports Server (NTRS)
Fang, Xiaohua; Randall, Cora E.; Lummerzheim, Dirk; Solomon, Stanley C.; Mills, Michael J.; Marsh, Daniel; Jackman, Charles H.; Wang, Wenbin; Lu, Gang
2008-01-01
Low, medium and high energy electrons can penetrate to the thermosphere (90-400 km; 55-240 miles) and mesosphere (50-90 km; 30-55 miles). These precipitating electrons ionize that region of the atmosphere, creating positively charged atoms and molecules and knocking off other negatively charged electrons. The precipitating electrons also create nitrogen-containing compounds along with other constituents. Since the electron precipitation amounts change within minutes, it is necessary to have a rapid method of computing the ionization and production of nitrogen-containing compounds for inclusion in computationally-demanding global models. A new methodology has been developed, which has parameterized a more detailed model computation of the ionizing impact of precipitating electrons over the very large range of 100 eV up to 1,000,000 eV. This new parameterization method is more accurate than a previous parameterization scheme, when compared with the more detailed model computation. Global models at the National Center for Atmospheric Research will use this new parameterization method in the near future.
Higgs pair production at NLO QCD for CP-violating Higgs sectors
NASA Astrophysics Data System (ADS)
Gröber, R.; Mühlleitner, M.; Spira, M.
2017-12-01
Higgs pair production through gluon fusion is an important process at the LHC to test the dynamics underlying electroweak symmetry breaking. Higgs sectors beyond the Standard Model (SM) can substantially modify this cross section through novel couplings not present in the SM or the on-shell production of new heavy Higgs bosons that subsequently decay into Higgs pairs. CP violation in the Higgs sector is important for the explanation of the observed matter-antimatter asymmetry through electroweak baryogenesis. In this work we compute the next-to-leading order (NLO) QCD corrections in the heavy top quark limit, including the effects of CP violation in the Higgs sector. We choose the effective theory (EFT) approach, which provides a rather model-independent way to explore New Physics (NP) effects by adding dimension-6 operators, both CP-conserving and CP-violating ones, to the SM Lagrangian. Furthermore, we perform the computation within a specific UV-complete model and choose as benchmark model the general 2-Higgs-Doublet Model with CP violation, the C2HDM. Depending on the dimension-6 coefficients, the relative NLO QCD corrections are affected by several per cent through the new CP-violating operators. This is also the case for SM-like Higgs pair production in the C2HDM, while the relative QCD corrections in the production of heavier C2HDM Higgs boson pairs deviate more strongly from the SM case. The absolute cross sections both in the EFT and the C2HDM can be modified by more than an order of magnitude. In particular, in the C2HDM the resonant production of Higgs pairs can far exceed the SM cross section.
Schrodt, Fabian; Kneissler, Jan; Ehrenfeld, Stephan; Butz, Martin V
2017-04-01
In line with Allen Newell's challenge to develop complete cognitive architectures, and motivated by a recent proposal for a unifying subsymbolic computational theory of cognition, we introduce the cognitive control architecture SEMLINCS. SEMLINCS models the development of an embodied cognitive agent that learns discrete production rule-like structures from its own, autonomously gathered, continuous sensorimotor experiences. Moreover, the agent uses the developing knowledge to plan and control environmental interactions in a versatile, goal-directed, and self-motivated manner. Thus, in contrast to several well-known symbolic cognitive architectures, SEMLINCS is not provided with production rules and the involved symbols, but it learns them. In this paper, the actual implementation of SEMLINCS causes learning and self-motivated, autonomous behavioral control of the game figure Mario in a clone of the computer game Super Mario Bros. Our evaluations highlight the successful development of behavioral versatility as well as the learning of suitable production rules and the involved symbols from sensorimotor experiences. Moreover, knowledge- and motivation-dependent individualizations of the agents' behavioral tendencies are shown. Finally, interaction sequences can be planned on the sensorimotor-grounded production rule level. Current limitations directly point toward the need for several further enhancements, which may be integrated into SEMLINCS in the near future. Overall, SEMLINCS may be viewed as an architecture that allows the functional and computational modeling of embodied cognitive development, whereby the current main focus lies on the development of production rules from sensorimotor experiences. Copyright © 2017 Cognitive Science Society, Inc.
SAMICS Validation. SAMICS Support Study, Phase 3
NASA Technical Reports Server (NTRS)
1979-01-01
SAMICS provides a consistent basis for estimating array costs and compares production technology costs. A review and a validation of the SAMICS model are reported. The review had the following purposes: (1) to test the computational validity of the computer model by comparison with preliminary hand calculations based on conventional cost estimating techniques; (2) to review and improve the accuracy of the cost relationships being used by the model; and (3) to provide an independent verification to users of the model's value in decision making for allocation of research and development funds and for investment in manufacturing capacity. It is concluded that the SAMICS model is a flexible, accurate, and useful tool for managerial decision making.
ERIC Educational Resources Information Center
Carroll, Margaret Aby; Chandler, Yvonne J.
This study examines whether an analysis of characteristics of libraries or information centers and librarians in highly productive companies yields operational models and standards that can improve their efficiency and effectiveness and their parent organization's productivity. Data was collected using an e-mail survey instrument sent to 500 large…
Editorial: Computational Creativity, Concept Invention, and General Intelligence
NASA Astrophysics Data System (ADS)
Besold, Tarek R.; Kühnberger, Kai-Uwe; Veale, Tony
2015-12-01
Over the last decade, computational creativity as a field of scientific investigation and computational systems engineering has seen growing popularity. Still, the levels of development of projects aiming at systems for artistic production or performance and of endeavours addressing creative problem-solving or models of creative cognitive capacities are diverging. While the former have already seen several great successes, the latter still remain in their infancy. This volume collects reports on work trying to close the accrued gap.
1978-01-17
...approach to designing computers: formal mathematical methods were applied and computers themselves began to be widely used in designing other... capital, labor resources and the funds of consumers. Analysis of the model indicates that at the present time the average complexity of production of... ALGORITHMIC COMPLETENESS AND COMPLEXITY OF MICROPROGRAMS. Kiev KIBERNETIKA in Russian No 3, May/Jun 77 pp 1-15, manuscript received 22 Dec 76, GOLUNKOV
Electrostatic Precipitator (ESP) TRAINING MANUAL
The manual assists engineers in using a computer program, the ESPVI 4.0W, that models all elements of an electrostatic precipitator (ESP). The program is a product of the Electric Power Research Institute and runs in the Windows environment. Once an ESP is accurately modeled, the...
A Symbolic Model of the Nonconscious Acquisition of Information.
ERIC Educational Resources Information Center
Ling, Charles X.; Marinov, Marin
1994-01-01
Challenges Smolensky's theory that human intuitive/nonconscious cognitive processes can only be accurately explained in terms of subsymbolic computations in artificial neural networks. Symbolic learning models of two cognitive tasks involving nonconscious acquisition of information are presented: learning production rules and artificial finite…
A micro-epidemic model for primary dengue infection
NASA Astrophysics Data System (ADS)
Mishra, Arti; Gakkhar, Sunita
2017-06-01
In this paper, a micro-epidemic non-linear dynamical model is proposed and analyzed for primary dengue infection. The model incorporates the effects of the T cell immune response as well as the humoral response during the pathogenesis of dengue infection. A time delay accounts for the production of antibodies by B cells. The basic reproduction number (R0) is computed. Three equilibrium states are obtained. Existence and stability conditions for the infection-free and ineffective cellular immune response states are discussed, and conditions for the existence of the endemic state are obtained. Further, the parametric region where the system exhibits complex behavior is identified. The threshold value of the time delay that is critical for a change in stability of the endemic state is computed. A threshold antibody production rate is obtained above which the infection dies out even though R0 > 1. The model is in line with the clinical observation that viral load decreases within 7-14 days of the onset of primary infection.
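A minimal within-host sketch in the spirit of such models (not the authors' equations; the antibody production delay is omitted for simplicity, and all parameter values are hypothetical) might look like:

```python
# Target cells T, infected cells I, virus V, antibodies A. The delay in
# antibody production is dropped here; parameters are hypothetical.
import numpy as np
from scipy.integrate import odeint

beta, delta, p, c, k, a, d = 3e-9, 0.5, 1e3, 3.0, 1e-3, 0.05, 0.1

def rhs(y, t):
    T, I, V, A = y
    dT = -beta * T * V                 # infection of target cells
    dI = beta * T * V - delta * I      # infected-cell turnover
    dV = p * I - c * V - k * V * A     # virion production and clearance
    dA = a * V - d * A                 # antibody response (no delay here)
    return [dT, dI, dV, dA]

t = np.linspace(0, 21, 2101)           # three weeks, in days
sol = odeint(rhs, [1e7, 0.0, 10.0, 0.0], t)
print("viral load peaks near day", round(t[np.argmax(sol[:, 2])], 1))

# Basic reproduction number of this simplified sketch:
T0 = 1e7
print("R0 =", beta * p * T0 / (c * delta))
```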
Equations of state for explosive detonation products: The PANDA model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerley, G.I.
1994-05-01
This paper discusses a thermochemical model for calculating equations of state (EOS) for the detonation products of explosives. This model, which was first presented at the Eighth Detonation Symposium, is available in the PANDA code and is referred to here as "the PANDA model". The basic features of the PANDA model are as follows. (1) Statistical-mechanical theories are used to construct EOS tables for each of the chemical species that are to be allowed in the detonation products. (2) The ideal mixing model is used to compute the thermodynamic functions for a mixture of these species, and the composition of the system is determined from assumption of chemical equilibrium. (3) For hydrocode calculations, the detonation product EOS are used in tabular form, together with a reactive burn model that allows description of shock-induced initiation and growth or failure as well as ideal detonation wave propagation. This model has been implemented in the three-dimensional Eulerian code, CTH.
Neural computation of arithmetic functions
NASA Technical Reports Server (NTRS)
Siu, Kai-Yeung; Bruck, Jehoshua
1990-01-01
An area of application of neural networks is considered. A neuron is modeled as a linear threshold gate, and the network architecture considered is the layered feedforward network. It is shown how common arithmetic functions such as multiplication and sorting can be efficiently computed in a shallow neural network. Some known results are improved by showing that the product of two n-bit numbers and sorting of n n-bit numbers can be computed by a polynomial-size neural network using only four and five unit delays, respectively. Moreover, the weights of each threshold element in the neural networks require O(log n)-bit (instead of n-bit) accuracy. These results can be extended to more complicated functions such as multiple products, division, rational functions, and approximation of analytic functions.
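To make the threshold-gate model concrete, here is a classic depth-2 threshold circuit, sketched in Python, that computes the parity of n bits with n hidden gates. This illustrates shallow threshold computation in general; it is not the paper's multiplication or sorting construction.

```python
# A neuron as a linear threshold gate, and a depth-2 threshold circuit for
# parity: n "at least k ones" hidden gates plus one output gate.

def gate(weights, bias, inputs):
    """Linear threshold gate: fire iff the weighted sum reaches the bias."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= bias else 0

def parity(bits):
    n = len(bits)
    # Hidden layer: h_k fires iff at least k of the inputs are 1.
    hidden = [gate([1] * n, k, bits) for k in range(1, n + 1)]
    # Output: alternating +1/-1 weights turn the "at least k" counts into parity.
    weights = [(-1) ** (k + 1) for k in range(1, n + 1)]
    return gate(weights, 1, hidden)

assert all(parity(b) == sum(b) % 2
           for b in [[0, 0, 0], [1, 0, 0], [1, 1, 0], [1, 1, 1], [1, 0, 1, 1]])
print("parity via a depth-2 threshold circuit: OK")
```

If s inputs are 1, the first s hidden gates fire and the alternating sum equals 1 for odd s and 0 for even s, so a single output gate with bias 1 recovers the parity.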
Alloy Design Workbench-Surface Modeling Package Developed
NASA Technical Reports Server (NTRS)
Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.
2003-01-01
NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials, through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this level. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomena and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform-independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.
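The Monte Carlo mode can be illustrated generically with a Metropolis search for a low-energy arrangement of two atom types on a toy one-dimensional lattice. The pair energies below are hypothetical stand-ins for the BFS energetics, which are not reproduced here.

```python
import math
import random

# Toy Metropolis search for a low-energy arrangement of two atom types on a
# 1-D lattice with nearest-neighbour pair energies (hypothetical values, eV).
PAIR_E = {("A", "A"): -0.10, ("B", "B"): -0.12,
          ("A", "B"): 0.05, ("B", "A"): 0.05}

def energy(cfg):
    return sum(PAIR_E[(cfg[i], cfg[i + 1])] for i in range(len(cfg) - 1))

def metropolis(cfg, kT=0.025, steps=20000):
    e = energy(cfg)
    for _ in range(steps):
        i, j = random.sample(range(len(cfg)), 2)   # propose swapping two sites
        cfg[i], cfg[j] = cfg[j], cfg[i]
        e_new = energy(cfg)
        if e_new <= e or random.random() < math.exp(-(e_new - e) / kT):
            e = e_new                              # accept the swap
        else:
            cfg[i], cfg[j] = cfg[j], cfg[i]        # reject: undo the swap
    return cfg, e

cfg, e = metropolis(list("AB" * 20))
print("final energy:", round(e, 3))  # like-atom clustering lowers the energy
```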
NASA Astrophysics Data System (ADS)
Gulamali, M. Y.; Saunders, J. H.; Jackson, M. D.; Pain, C. C.
2009-04-01
We present results from a new computational multi-fluid dynamics code, designed to model the transport of heat, mass and chemical species during flow of single or multiple immiscible fluid phases through porous media, including gravitational effects and compressibility. The model also captures the electrical phenomena which may arise through electrokinetic, electrochemical and electrothermal coupling. Building on the advanced computational technology of the Imperial College Ocean Model, this new development leads the way towards a complex multiphase code using arbitrary unstructured and adaptive meshes, and domains decomposed to run in parallel over a cluster of workstations or a dedicated parallel computer. These facilities will allow efficient and accurate modelling of multiphase flows which capture large- and small-scale transport phenomena, while preserving the important geology and/or surface topology to make the results physically meaningful and realistic. Applications include modelling of contaminant transport in aquifers, multiphase flow during hydrocarbon production, migration of carbon dioxide during sequestration, and evaluation of the design and safety of nuclear reactors. Simulations of the streaming potential resulting from multiphase flow in laboratory- and field-scale models demonstrate that streaming potential signals originate at fluid fronts, and at geologic boundaries where fluid saturation changes. This suggests that downhole measurements of streaming potential may be used to inform production strategies in oil and gas reservoirs. As water encroaches on an oil production well, the streaming-potential signal associated with the water front encompasses the well even when the front is up to 100 m away, so the potential measured at the well starts to change significantly relative to a distant reference electrode. Variations in the geometry of the encroaching water front could be characterized using an array of electrodes positioned along the well, but a good understanding of the local reservoir geology will be required to identify signals caused by the front. The streaming potential measured at a well will be maximized in low-permeability reservoirs produced at a high rate, and in thick reservoirs with low shale content.
POPEYE: A production rule-based model of multitask supervisory control (POPCORN)
NASA Technical Reports Server (NTRS)
Townsend, James T.; Kadlec, Helena; Kantowitz, Barry H.
1988-01-01
Recent studies of relationships between subjective ratings of mental workload, performance, and human operator and task characteristics have indicated that these relationships are quite complex. In order to study the various relationships and place subjective mental workload within a theoretical framework, we developed a production system model for the performance component of the complex supervisory task called POPCORN. The production system model is represented by a hierarchical structure of goals and subgoals, and the information flow is controlled by a set of condition-action rules. The implementation of this production system, called POPEYE, generates computer-simulated data under different task difficulty conditions which are comparable to those of human operators performing the task. This model is the performance aspect of an overall dynamic psychological model which we are developing to examine and quantify relationships between performance and psychological aspects in a complex environment.
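A minimal production-system interpreter of the same general kind (working memory plus prioritized condition-action rules; the goal hierarchy and task content of POPEYE itself are not reproduced) can be sketched as:

```python
# Working memory is a set of facts; each rule pairs a condition over working
# memory with an action that updates it. Conflict resolution: first match wins.

rules = [
    ("acknowledge-alert",
     lambda wm: "alert" in wm,
     lambda wm: wm.discard("alert") or wm.add("alert-handled")),
    ("resume-main-task",
     lambda wm: "alert-handled" in wm and "main-task" not in wm,
     lambda wm: wm.add("main-task")),
]

def run(wm, max_cycles=10):
    for _ in range(max_cycles):
        fired = next((r for r in rules if r[1](wm)), None)
        if fired is None:          # no rule matches: quiescence, stop
            break
        print("firing:", fired[0])
        fired[2](wm)
    return wm

print(run({"alert"}))
```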
NASA Technical Reports Server (NTRS)
Frazier, John M.; Mattie, D. R.; Hussain, Saber; Pachter, Ruth; Boatz, Jerry; Hawkins, T. W.
2000-01-01
The development of quantitative structure-activity relationships (QSAR) is essential for reducing the chemical hazards of new weapon systems. The current collaboration between HEST (toxicology research and testing), MLPJ (computational chemistry) and PRS (computational chemistry, new propellant synthesis) is focusing R&D efforts on basic research goals that will rapidly transition to useful products for propellant development. Computational methods are being investigated that will assist in forecasting cellular toxicological endpoints. Models developed from these chemical structure-toxicity relationships are useful for the prediction of the toxicological endpoints of new related compounds. Research is focusing on the evaluation of tools to be used for the discovery of such relationships and the development of models of the mechanisms of action. Combinations of computational chemistry techniques, in vitro toxicity methods, and statistical correlations will be employed to develop and explore potential predictive relationships; results for series of molecular systems that demonstrate the viability of this approach are reported. A number of hydrazine salts have been synthesized for evaluation. Computational chemistry methods are being used to elucidate the mechanism of action of these salts. Toxicity endpoints such as viability (LDH) and changes in enzyme activity (glutathione peroxidase and catalase) are being experimentally measured as indicators of cellular damage. Extrapolation from computational/in vitro studies to human toxicity is the ultimate goal. The product of this program will be a predictive tool to assist in the development of new, less toxic propellants.
A generalized spatiotemporal covariance model for stationary background in analysis of MEG data.
Plis, S M; Schmidt, D M; Jun, S C; Ranken, D M
2006-01-01
Using a noise covariance model based on a single Kronecker product of spatial and temporal covariance in the spatiotemporal analysis of MEG data was demonstrated to provide improvement in the results over that of the commonly used diagonal noise covariance model. In this paper we present a model that is a generalization of all of the above models. It describes models based on a single Kronecker product of spatial and temporal covariance as well as more complicated multi-pair models together with any intermediate form expressed as a sum of Kronecker products of spatial component matrices of reduced rank and their corresponding temporal covariance matrices. The model provides a framework for controlling the tradeoff between the described complexity of the background and computational demand for the analysis using this model. Ways to estimate the value of the parameter controlling this tradeoff are also discussed.
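The structure of this covariance family, and the computational saving it allows, can be sketched as follows; the component matrices here are random stand-ins for estimated spatial and temporal covariances.

```python
import numpy as np

# A sum of Kronecker products of spatial components S_k with temporal
# components T_k. One pair (K=1) recovers the single-Kronecker model; adding
# pairs increases the described complexity of the background.

rng = np.random.default_rng(0)
n_space, n_time, K = 8, 16, 2

def spd(n):
    """Random symmetric positive-definite stand-in for a covariance block."""
    a = rng.standard_normal((n, n))
    return a @ a.T + n * np.eye(n)

S = [spd(n_space) for _ in range(K)]
T = [spd(n_time) for _ in range(K)]
C = sum(np.kron(S_k, T_k) for S_k, T_k in zip(S, T))  # full covariance

# Key saving: (S ⊗ T) vec(X) = vec(T X S^T), so C never has to be formed
# explicitly when applying it to spatiotemporal data X (time x space).
X = rng.standard_normal((n_time, n_space))
y_full = C @ X.reshape(-1, order="F")
y_kron = sum(T_k @ X @ S_k.T for S_k, T_k in zip(S, T)).reshape(-1, order="F")
print(np.allclose(y_full, y_kron))  # True
```

The identity in the comment is what makes the tradeoff practical: storage and matrix-vector cost scale with the component sizes rather than with their product.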
A hydrological emulator for global applications - HE v1.0.0
NASA Astrophysics Data System (ADS)
Liu, Yaling; Hejazi, Mohamad; Li, Hongyi; Zhang, Xuesong; Leng, Guoyong
2018-03-01
While global hydrological models (GHMs) are very useful in exploring water resources and interactions between the Earth and human systems, their use often requires numerous model inputs, complex model calibration, and high computation costs. To overcome these challenges, we construct an efficient open-source and ready-to-use hydrological emulator (HE) that can mimic complex GHMs at a range of spatial scales (e.g., basin, region, globe). More specifically, we construct both a lumped and a distributed scheme of the HE based on the monthly abcd model to explore the tradeoff between computational cost and model fidelity. Model predictability and computational efficiency are evaluated in simulating global runoff from 1971 to 2010 with both the lumped and distributed schemes. The results are compared against the runoff product from the widely used Variable Infiltration Capacity (VIC) model. Our evaluation indicates that the lumped and distributed schemes present comparable results regarding annual total quantity, spatial pattern, and temporal variation of the major water fluxes (e.g., total runoff, evapotranspiration) across the global 235 basins (e.g., correlation coefficient r between the annual total runoff from either of these two schemes and the VIC is > 0.96), except for several cold (e.g., Arctic, interior Tibet), dry (e.g., North Africa) and mountainous (e.g., Argentina) regions. Compared against the monthly total runoff product from the VIC (aggregated from daily runoff), the global mean Kling-Gupta efficiencies are 0.75 and 0.79 for the lumped and distributed schemes, respectively, with the distributed scheme better capturing spatial heterogeneity. Notably, the computation efficiency of the lumped scheme is 2 orders of magnitude higher than the distributed one and 7 orders more efficient than the VIC model. A case study of uncertainty analysis for the world's 16 basins with top annual streamflow is conducted using 100 000 model simulations, and it demonstrates the lumped scheme's extraordinary advantage in computational efficiency. Our results suggest that the revised lumped abcd model can serve as an efficient and reasonable HE for complex GHMs and is suitable for broad practical use, and the distributed scheme is also an efficient alternative if spatial heterogeneity is of more interest.
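For orientation, a minimal lumped implementation of the monthly abcd water balance (after Thomas, 1981), the model underlying the HE described above, is sketched below; the parameters and forcings are illustrative, not the calibrated HE values.

```python
import numpy as np

def abcd(precip, pet, a=0.98, b=250.0, c=0.3, d=0.2, s0=100.0, g0=10.0):
    """Monthly streamflow (mm) from precipitation and PET series (mm)."""
    s, g, q = s0, g0, []
    for p, e0 in zip(precip, pet):
        w = p + s                                  # available water
        y = (w + b) / (2 * a) - np.sqrt(((w + b) / (2 * a)) ** 2 - w * b / a)
        s = y * np.exp(-e0 / b)                    # end-of-month soil storage
        g = (g + c * (w - y)) / (1 + d)            # groundwater storage
        q.append((1 - c) * (w - y) + d * g)        # direct runoff + baseflow
        # evapotranspiration for the month is y - s (not tracked further here)
    return np.array(q)

precip = np.full(24, 80.0)   # mm/month, hypothetical forcing
pet = np.full(24, 60.0)
print(abcd(precip, pet).round(1))
```

The distributed scheme described above would, in essence, run such a balance per grid cell or sub-basin and route the results, which is where the two-orders-of-magnitude cost difference against the lumped scheme comes from.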
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trahan, Christine Alexandra; Garcia, Omar Fidel; Martino, Anthony A.
2010-08-01
The search is on for new renewable energy and algal-derived biofuel is a critical piece in the multi-faceted renewable energy puzzle. It has 30x more oil than any terrestrial oilseed crop, ideal composition for biodiesel, no competition with food crops, can be grown in waste water, and is cleaner than petroleum based fuels. This project discusses these three goals: (1) Conduct fundamental research into the effects that dynamic biotic and abiotic stressors have on algal growth and lipid production - Genomics/Transcriptomics, Bioanalytical spectroscopy/Chemical imaging; (2) Discover spectral signatures for algal health at the benchtop and greenhouse scale - Remote sensing, Bioanalytical spectroscopy; and (3) Develop computational model for algal growth and productivity at the raceway scale - Computational modeling.
NASA Astrophysics Data System (ADS)
Mahalov, M. S.; Blumenstein, V. Yu
2017-10-01
The relevance of research into the mechanical state and residual stresses (RS), and of developing computational algorithms for complex loading modes across the stages of the product lifecycle, is shown. A finite element model of the formation of the mechanical state and RS during strengthening machining by surface plastic deformation, including the effect of technological inheritance, is presented. A distinguishing feature of the model is that it accounts for the transformation of properties acquired at previous production stages, as well as the evolution of these properties as metal particles move through the deformation space during the current loading step.
Code of Federal Regulations, 2014 CFR
2014-07-01
... into Mexico. (b) Cost of production of a car line shall mean the aggregate of the products of: (1) The average U.S. dealer wholesale price for such car line as computed from each official dealer price list effective during the course of a model year, and (2) The number of automobiles within the car line produced...
Code of Federal Regulations, 2012 CFR
2012-07-01
... into Mexico. (b) Cost of production of a car line shall mean the aggregate of the products of: (1) The average U.S. dealer wholesale price for such car line as computed from each official dealer price list effective during the course of a model year, and (2) The number of automobiles within the car line produced...
Code of Federal Regulations, 2013 CFR
2013-07-01
... into Mexico. (b) Cost of production of a car line shall mean the aggregate of the products of: (1) The average U.S. dealer wholesale price for such car line as computed from each official dealer price list effective during the course of a model year, and (2) The number of automobiles within the car line produced...
Estimating and validating harvesting system production through computer simulation
John E. Baumgras; Curt C. Hassler; Chris B. LeDoux
1993-01-01
A Ground Based Harvesting System Simulation model (GB-SIM) has been developed to estimate stump-to-truck production rates and multiproduct yields for conventional ground-based timber harvesting systems in Appalachian hardwood stands. Simulation results reflect inputs that define harvest site and timber stand attributes, wood utilization options, and key attributes of...
40 CFR 600.502-81 - Definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Cost of production of a car line shall mean the aggregate of the products of: (i) The average U.S. dealer wholesale price for such car line as computed from each official dealer price list effective during the course of a model year, and (ii) The number of automobiles within the car line produced during...
Fang, Jiansong; Wu, Zengrui; Cai, Chuipu; Wang, Qi; Tang, Yun; Cheng, Feixiong
2017-11-27
Natural products with diverse chemical scaffolds have been recognized as an invaluable source of compounds in drug discovery and development. However, systematic identification of drug targets for natural products at the human proteome level via various experimental assays is highly expensive and time-consuming. In this study, we proposed a systems pharmacology infrastructure to predict new drug targets and anticancer indications of natural products. Specifically, we reconstructed a global drug-target network with 7,314 interactions connecting 751 targets and 2,388 natural products and built predictive network models via a balanced substructure-drug-target network-based inference approach. A high area under the receiver operating characteristic curve of 0.96 was achieved for predicting new targets of natural products during cross-validation. The newly predicted targets of natural products (e.g., resveratrol, genistein, and kaempferol) with high scores were validated by various literature studies. We further built statistical network models for the identification of new anticancer indications of natural products through integration of both experimentally validated and computationally predicted drug-target interactions of natural products with known cancer proteins. We showed that the significantly predicted anticancer indications of multiple natural products (e.g., naringenin, disulfiram, and metformin) with new mechanisms of action were validated by various published experimental evidence. In summary, this study offers powerful computational systems pharmacology approaches and tools for the development of novel targeted cancer therapies by exploiting the polypharmacology of natural products.
The use of computational ecological models to inform environmental management and policy has proliferated in the past 25 years. These models have become essential tools as linkages and feedbacks between human actions and ecological responses can be complex, and as funds for sampl...
Patricia K. Lebow; Henry Spelter; Peter J. Ince
2003-01-01
This report provides documentation and user information for FPL-PELPS, a personal computer price endogenous linear programming system for economic modeling. Originally developed to model the North American pulp and paper industry, FPL-PELPS follows its predecessors in allowing the modeling of any appropriate sector to predict consumption, production and capacity by...
Model documentation Renewable Fuels Module of the National Energy Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-01-01
This report documents the objectives, analytical approach and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1996 Annual Energy Outlook forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described.
Yang, Deming; Xu, Zhenming
2011-09-15
Crushing and separating technology is widely used in the recycling of waste printed circuit boards (PCBs). An automatic production line for recycling waste PCBs, with no negative environmental impact, was applied at industrial scale. The cyclic grinding and classification system for crushed waste PCB particles is the most important part of the automatic production line, and it determines the efficiency of the whole line. In this paper, a model for computing the process of the system was established using a matrix analysis method. The results showed good agreement between the simulation model and the actual production line, and that the system is robust to disturbances. This model can provide a basis for the automatic process control of a waste PCB production line. With this model, many engineering problems can be reduced, such as insufficient dissociation of metals and nonmetals, over-pulverizing of particles, incomplete comminution, material plugging, and equipment overheating. Copyright © 2011 Elsevier B.V. All rights reserved.
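The matrix treatment of a closed grinding-classification loop can be illustrated generically: a mill matrix built from selection and breakage, a classifier matrix returning coarse material, and a steady-state solve for the circulating load. This is a structural sketch, not the authors' calibrated model; all matrix entries are hypothetical.

```python
import numpy as np

n = 4                                  # particle size classes, coarse -> fine
S = np.diag([0.7, 0.5, 0.3, 0.0])      # selection: fraction broken per pass
B = np.zeros((n, n))                   # breakage: where broken mass lands
B[1, 0], B[2, 0], B[3, 0] = 0.5, 0.3, 0.2
B[2, 1], B[3, 1] = 0.6, 0.4
B[3, 2] = 1.0
K = B @ S + (np.eye(n) - S)            # mill: broken plus unbroken material
C = np.diag([0.9, 0.6, 0.2, 0.0])      # classifier: fraction recycled to mill

f = np.array([1.0, 0.0, 0.0, 0.0])     # fresh feed, all in the coarsest class
m = np.linalg.solve(np.eye(n) - C @ K, f)  # total mill feed at steady state
p = (np.eye(n) - C) @ (K @ m)          # product leaving the circuit
print("circulating load:", (m.sum() / f.sum() - 1).round(2))
print("product size distribution:", p.round(3))
```

Because the columns of B sum to one, K conserves mass, and the product stream p recovers exactly the fresh feed tonnage at steady state, which is a quick consistency check on any such model.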
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, Michael; Lethin, Richard
Programming models and environments play the essential roles in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability since our codes have lifespans measured in decades, so the advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.
Predicting neutron damage using TEM with in situ ion irradiation and computer modeling
NASA Astrophysics Data System (ADS)
Kirk, Marquis A.; Li, Meimei; Xu, Donghua; Wirth, Brian D.
2018-01-01
We have constructed a computer model of irradiation defect production closely coordinated with TEM and in situ ion irradiation of Molybdenum at 80 °C over a range of dose, dose rate and foil thickness. We have reexamined our previous ion irradiation data to assign appropriate error and uncertainty based on more recent work. The spatially dependent cascade cluster dynamics model is updated with recent Molecular Dynamics results for cascades in Mo. After a careful assignment of both ion and neutron irradiation dose values in dpa, TEM data are compared for both ion and neutron irradiated Mo from the same source material. Using the computer model of defect formation and evolution based on the in situ ion irradiation of thin foils, the defect microstructure, consisting of densities and sizes of dislocation loops, is predicted for neutron irradiation of bulk material at 80 °C and compared with experiment. Reasonable agreement between model prediction and experimental data demonstrates a promising direction in understanding and predicting neutron damage using a closely coordinated program of in situ ion irradiation experiment and computer simulation.
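A far simpler relative of such cluster dynamics models, mean-field rate theory for vacancy and interstitial concentrations, conveys the basic structure (generation, mutual recombination, loss to sinks); all parameter values below are hypothetical and the spatial dependence of the actual model is omitted.

```python
import numpy as np
from scipy.integrate import odeint

G = 1e-6                  # defect generation rate (arbitrary units per time)
R = 1e2                   # vacancy-interstitial recombination coefficient
k2_v, k2_i = 1e-2, 5e-2   # sink strength times diffusivity, for v and i

def rates(c, t):
    cv, ci = c
    recomb = R * cv * ci
    return [G - recomb - k2_v * cv,   # vacancies
            G - recomb - k2_i * ci]   # self-interstitials

t = np.logspace(-2, 6, 200)
cv, ci = odeint(rates, [0.0, 0.0], t).T
print(f"steady state: cv ~ {cv[-1]:.2e}, ci ~ {ci[-1]:.2e}")
```

The spatially dependent model referenced above adds diffusion toward the foil surfaces and explicit cascade-produced cluster size distributions, which is what lets it connect thin-foil ion data to bulk neutron behavior.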
The distributed production system of the SuperB project: description and results
NASA Astrophysics Data System (ADS)
Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.
2011-12-01
The SuperB experiment needs large samples of MonteCarlo simulated events in order to finalize the detector design and to estimate the data analysis performances. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via SuperB GANGA interface, to all available remote sites; output files transfer to CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from 2010 official productions are reported.
Exploiting Self-organization in Bioengineered Systems: A Computational Approach.
Davis, Delin; Doloman, Anna; Podgorski, Gregory J; Vargis, Elizabeth; Flann, Nicholas S
2017-01-01
The productivity of bioengineered cell factories is limited by inefficiencies in nutrient delivery and waste and product removal. Current solution approaches explore changes in the physical configurations of the bioreactors. This work investigates the possibilities of exploiting self-organizing vascular networks to support producer cells within the factory. A computational model simulates de novo vascular development of endothelial-like cells and the resultant network functioning to deliver nutrients and extract product and waste from the cell culture. Microbial factories with vascular networks are evaluated for their scalability, robustness, and productivity compared to the cell factories without a vascular network. Initial studies demonstrate that at least an order of magnitude increase in production is possible, the system can be scaled up, and the self-organization of an efficient vascular network is robust. The work suggests that bioengineered multicellularity may offer efficiency improvements difficult to achieve with physical engineering approaches.
ERIC Educational Resources Information Center
Sander, Ian M.; McGoldrick, Matthew T.; Helms, My N.; Betts, Aislinn; van Avermaete, Anthony; Owers, Elizabeth; Doney, Evan; Liepert, Taimi; Niebur, Glen; Liepert, Douglas; Leevy, W. Matthew
2017-01-01
Advances in three-dimensional (3D) printing allow for digital files to be turned into a "printed" physical product. For example, complex anatomical models derived from clinical or pre-clinical X-ray computed tomography (CT) data of patients or research specimens can be constructed using various printable materials. Although 3D printing…
Computer technique for simulating the combustion of cellulose and other fuels
Andrew M. Stein; Brian W. Bauske
1971-01-01
A computer method has been developed for simulating the combustion of wood and other cellulosic fuels. The products of combustion are used as input for a convection model that simulates real fires. The method allows the chemical process to proceed to equilibrium and then examines the effects of mass addition and repartitioning on the fluid mechanics of the convection...
Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling
NASA Technical Reports Server (NTRS)
Mog, Robert A.
1997-01-01
Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.
MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control
NASA Astrophysics Data System (ADS)
Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming
2017-09-01
Increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under dynamic production conditions. An integrated Bayesian network and big data analytics approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification and reasoning, are embedded into the Reduce process. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and on the parallel computing power of MapReduce, Bayesian networks of impact factors on quality are built from prior probability distributions and modified with posterior probability distributions. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed increases almost in direct proportion to the number of computing nodes. It is also shown that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision making for precision problem solving. The integration of big data analytics and the BN method offers a whole new perspective on manufacturing quality control.
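The map/reduce pattern applied to the counting step of Bayesian network parameter learning can be sketched without Hadoop; the process variables below are hypothetical.

```python
# Map partitions of process data to local counts, reduce by summation, then
# normalize into a conditional probability table P(quality | machine state).
from collections import Counter
from functools import reduce

def map_counts(partition):
    """Local counts of (machine_state, quality) pairs in one data partition."""
    return Counter((row["machine"], row["quality"]) for row in partition)

def reduce_counts(a, b):
    return a + b  # Counter addition merges partial counts

partitions = [
    [{"machine": "ok", "quality": "pass"}, {"machine": "worn", "quality": "fail"}],
    [{"machine": "ok", "quality": "pass"}, {"machine": "ok", "quality": "fail"}],
    [{"machine": "worn", "quality": "fail"}, {"machine": "worn", "quality": "pass"}],
]
totals = reduce(reduce_counts, map(map_counts, partitions))

for m in sorted({m for m, _ in totals}):
    n = sum(c for (mm, _), c in totals.items() if mm == m)
    print(m, {q: totals[(m, q)] / n for q in ("pass", "fail")})
```

In a real deployment the map and reduce functions run on distributed workers over data shards, which is why the case study above sees near-linear speedup with node count.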
Development of student's skills of 3D modeling of assembly units
NASA Astrophysics Data System (ADS)
Chepur, P. V.; Boshhenko, T. V.
2018-03-01
The article demonstrates that modern design methods, advanced from the point of view of automation, are the basis for the successful implementation of any production task. The advantages of presenting a product as an assembly consisting of 3D models of its details are described. The great importance of high-quality preparation of engineering students for work in computer-aided design programs such as AutoCAD, Compass 3D, Inventor, Solid Edge, Solid Works, Revit and ANSYS is considered. It is established that academic competitions and contests on modeling and prototyping of products are among the most effective ways of raising the level of students' computer graphics preparation. The stages of creating assembly unit models in the AutoCAD and Compass 3D software suites, generally accepted both in industrial design practice and in the training of specialists, are considered. The developed 3D models of assembly units, prepared for academic competitions (called Academic Olympics in Russia) by students of the 2nd-5th years of study and first-year master's students in engineering, are presented. Conclusions and recommendations on the development of three-dimensional design in higher education are given.
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1978-01-01
The FORTRAN computing program predicts the flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuel of varying end point and hydrogen content specifications. The program has provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case. The report has sufficient detail for the information of most readers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klopman, G.; Tu, M.
1997-09-01
It is shown that a combination of two programs, MultiCASE and META, can help assess the biodegradability of industrial organic materials in the ecosystem. MultiCASE is an artificial intelligence computer program that had been trained to identify molecular substructures believed to cause or inhibit biodegradation and META is an expert system trained to predict the aerobic biodegradation products of organic molecules. These two programs can be used to help evaluate the fate of disposed chemicals by estimating their biodegradability and the nature of their biodegradation products under conditions that may model the environment.
Miller, Robert T.; Delin, G.N.
1994-01-01
A three-dimensional, anisotropic, nonisothermal, ground-water-flow, and thermal-energy-transport model was constructed to simulate the four short-term test cycles. The model was used to simulate the entire short-term testing period of approximately 400 days. The only model properties varied during model calibration were longitudinal and transverse thermal dispersivities, which, for final calibration, were simulated as 3.3 and 0.33 meters, respectively. The model was calibrated by comparing model-computed results to (1) measured temperatures at selected altitudes in four observation wells, (2) measured temperatures at the production well, and (3) calculated thermal efficiencies of the aquifer. Model-computed withdrawal-water temperatures were within an average of about 3 percent of measured values and model-computed aquifer-thermal efficiencies were within an average of about 5 percent of calculated values for the short-term test cycles. These data indicate that the model accurately simulated thermal-energy storage within the Franconia-Ironton-Galesville aquifer.
Spin wave Feynman diagram vertex computation package
NASA Astrophysics Data System (ADS)
Price, Alexander; Javernick, Philip; Datta, Trinanjan
Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time consuming. Hence, to improve productivity and have another means to check the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions of a nearest neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model where non-collinear terms contribute to the vertex interactions.
The IEA/ORAU Long-Term Global Energy- CO2 Model: Personal Computer Version A84PC
Edmonds, Jae A.; Reilly, John M.; Boden, Thomas A. [CDIAC; Reynolds, S. E. [CDIAC; Barns, D. W.
1995-01-01
The IBM A84PC version of the Edmonds-Reilly model has the capability to calculate both CO2 and CH4 emission estimates by source and region. Population, labor productivity, end-use energy efficiency, income effects, price effects, resource base, technological change in energy production, environmental costs of energy production, market-penetration rate of energy-supply technology, solar and biomass energy costs, synfuel costs, and the number of forecast periods may be interactively inspected and altered producing a variety of global and regional CO2 and CH4 emission scenarios for 1975 through 2100. Users are strongly encouraged to see our instructions for downloading, installing, and running the model.
Progress report on LBL's numerical modeling studies on Cerro Prieto
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halfman-Dooley, S.E.; Lippman, M.J.; Bodvarsson, G.S.
1989-04-01
An exploitation model of the Cerro Prieto geothermal system is needed to assess the energy capacity of the field, estimate its productive lifetime and develop an optimal reservoir management plan. The model must consider the natural state (i.e., pre-exploitation) conditions of the system and be able to predict changes in the reservoir thermodynamic conditions (and fluid chemistry) in response to fluid production (and injection). This paper discusses the results of a three-dimensional numerical simulation of the natural state conditions of the Cerro Prieto field and compares computed and observed pressure and temperature/enthalpy changes for the 1973-1987 production period. 16 refs., 24 figs., 2 tabs.
Computer Simulation in Predicting Biochemical Processes and Energy Balance at WWTPs
NASA Astrophysics Data System (ADS)
Drewnowski, Jakub; Zaborowska, Ewa; Hernandez De Vega, Carmen
2018-02-01
Nowadays, the use of mathematical models and computer simulation allows the analysis of many different technological solutions, as well as the testing of various scenarios, in a short time and on a low financial budget, in order to simulate a scenario under conditions typical of the real system and help find the best solution in the design or operation process. The aim of the study was to evaluate different concepts of biochemical process and energy balance modelling using the simulation platform GPS-x and the comprehensive model Mantis2. The paper presents an example of the calibration and validation process for the biological reactor, as well as scenarios showing the influence of operational parameters on the WWTP energy balance. The results of batch tests and a full-scale campaign obtained in earlier work were used to predict biochemical and operational parameters in a newly developed plant model. The model was extended with sludge treatment devices, including an anaerobic digester. Primary sludge removal efficiency was found to be a significant factor determining biogas production and hence renewable energy production in cogeneration. Water and wastewater utilities, which run and control WWTPs, are interested in optimizing the process in order to protect the environment, save their budget and decrease pollutant emissions to water and air. In this context, computer simulation can be the easiest and a very useful tool to improve efficiency without interfering with the actual process performance.
A review of predictive nonlinear theories for multiscale modeling of heterogeneous materials
NASA Astrophysics Data System (ADS)
Matouš, Karel; Geers, Marc G. D.; Kouznetsova, Varvara G.; Gillman, Andrew
2017-02-01
Since the beginning of the industrial age, material performance and design have been in the midst of innovation of many disruptive technologies. Today's electronics, space, medical, transportation, and other industries are enriched by development, design and deployment of composite, heterogeneous and multifunctional materials. As a result, materials innovation is now considerably outpaced by other aspects from component design to product cycle. In this article, we review predictive nonlinear theories for multiscale modeling of heterogeneous materials. Deeper attention is given to multiscale modeling in space and to computational homogenization in addressing challenging materials science questions. Moreover, we discuss a state-of-the-art platform in predictive image-based, multiscale modeling with co-designed simulations and experiments that executes on the world's largest supercomputers. Such a modeling framework consists of experimental tools, computational methods, and digital data strategies. Once fully completed, this collaborative and interdisciplinary framework can be the basis of Virtual Materials Testing standards and aids in the development of new material formulations. Moreover, it will decrease the time to market of innovative products.
Presenting Thin Media Models Affects Women's Choice of Diet or Normal Snacks
ERIC Educational Resources Information Center
Krahe, Barbara; Krause, Christina
2010-01-01
Our study explored the influence of thin- versus normal-size media models and of self-reported restrained eating behavior on women's observed snacking behavior. Fifty female undergraduates saw a set of advertisements for beauty products showing either thin or computer-altered normal-size female models, allegedly as part of a study on effective…
40 CFR 86.1865-12 - How to comply with the fleet average CO2 standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) Calculating the fleet average carbon-related exhaust emissions. (1) Manufacturers must compute separate production-weighted fleet average carbon-related exhaust emissions at the end of the model year for passenger... for sale, and certifying model types to standards as defined in § 86.1818-12. The model type carbon...
Modeling of Word Translation: Activation Flow from Concepts to Lexical Items
ERIC Educational Resources Information Center
Roelofs, Ardi; Dijkstra, Ton; Gerakaki, Svetlana
2013-01-01
Whereas most theoretical and computational models assume a continuous flow of activation from concepts to lexical items in spoken word production, one prominent model assumes that the mapping of concepts onto words happens in a discrete fashion (Bloem & La Heij, 2003). Semantic facilitation of context pictures on word translation has been taken to…
The use of computational ecological models to inform environmental management and policy has proliferated in the past 25 years. These models have become essential tools as linkages and feedbacks between human actions and ecological responses can be complex, and as funds for sampl...
Zhang, Yuan; Yu, Guangren; Yu, Liang; Siddhu, Muhammad Abdul Hanan; Gao, Mengjiao; Abdeltawab, Ahmed A; Al-Deyab, Salem S; Chen, Xiaochun
2016-03-01
Computational fluid dynamics (CFD) was applied to investigate mixing mode and power consumption in anaerobic mono- and co-digestion. Cattle manure (CM) and corn stover (CS) were used as feedstock, and a stirred tank reactor (STR) was used as the digester. Power numbers obtained from the CFD simulation were compared with those from the experimental correlation. Results showed that the standard k-ε model was more appropriate than other turbulence models. A new index, net power production instead of gas production, was proposed to optimize the feedstock ratio for anaerobic co-digestion. Results showed that the flow field and power consumption changed significantly in co-digestion of CM and CS compared with mono-digestion of either CM or CS. For different mixing modes, the optimum feedstock ratio for co-digestion changed with net power production. The best CM/CS ratios for continuous mixing, intermittent mixing I, and intermittent mixing II were 1:1, 1:1 and 1:3, respectively. Copyright © 2016. Published by Elsevier Ltd.
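For orientation, the mixing power that enters such a net-power index is conventionally recovered from the dimensionless power number, N_P = P/(ρN³D⁵). A minimal sketch of the resulting energy bookkeeping follows; every numeric value is an illustrative assumption, not a figure from this study.

```python
# Hedged sketch of a net-power index for digester mixing.
# All numbers are illustrative assumptions, not values from the study.

rho = 1020.0       # slurry density, kg/m^3 (assumed)
N = 90.0 / 60.0    # impeller speed, rev/s (assumed)
D = 0.3            # impeller diameter, m (assumed)
Np = 4.5           # power number, e.g. from a CFD run or correlation (assumed)

P_mix = Np * rho * N**3 * D**5               # mixing power draw, W

biogas_rate = 0.8 / 3600.0                   # biogas production, m^3/s (assumed)
heating_value = 21.5e6                       # biogas lower heating value, J/m^3 (assumed)
eta_el = 0.35                                # electrical efficiency of a CHP unit (assumed)

P_net = biogas_rate * heating_value * eta_el - P_mix   # net power production, W
print(f"mixing power {P_mix:.1f} W, net power {P_net:.1f} W")
```

Comparing P_net across continuous and intermittent mixing schedules (scaling P_mix by the mixing duty cycle) is the kind of trade-off the proposed index captures.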
Ranganathan, Panneerselvam; Savithri, Sivaraman
2018-06-01
The Computational Fluid Dynamics (CFD) technique is used in this work to simulate the hydrothermal liquefaction (HTL) of Nannochloropsis sp. microalgae in a lab-scale continuous plug-flow reactor, in order to understand the fluid dynamics, heat transfer, and reaction kinetics in an HTL reactor under hydrothermal conditions. The temperature profile in the reactor and the yield of HTL products obtained from the present simulation are validated against experimental data available in the literature. Furthermore, a parametric study is carried out on the effect of slurry flow rate, reactor temperature, and external heat transfer coefficient on the product yields. Though the model predictions are satisfactory in comparison with the experimental results, the model still needs improvement for better prediction of the product yields. The improved model can serve as a baseline for the design and scale-up of a large-scale HTL reactor. Copyright © 2018 Elsevier Ltd. All rights reserved.
Heather L. Kimball; Paul C. Selmants; Alvaro Moreno; Steve W. Running; Christian P. Giardina; Benjamin Poulter
2017-01-01
Gross primary production (GPP) is the Earth's largest carbon flux into the terrestrial biosphere and plays a critical role in regulating atmospheric chemistry and global climate. The Moderate Resolution Imaging Spectroradiometer (MODIS) MOD17 data product is a widely used remote sensing-based model that provides global estimates of spatiotemporal trends in GPP. When the…
Reactions of Free Radicals with Nitro-Compounds and Nitrates
1981-03-31
[OCR-damaged abstract; recoverable text:] The fragment derived from the nitrates, but not from the nitro-compounds, could undergo exothermic rearrangement. Product analyses and computer modelling were undertaken; these provided a clear explanation of why the… [Report contents include: …Nitrate; Reaction of Oxygen Atoms with Nitromethane; Reaction of Oxygen Atoms with Nitroethane; Products from Nitrocompounds; Effect of Carbon…]
Xu, Jingjie; Xie, Yan; Lu, Benzhuo; Zhang, Linbo
2016-08-25
The Debye-Hückel limiting law is used to study the binding kinetics of a substrate-enzyme system as well as to estimate the reaction rate of an electrostatically steered diffusion-controlled reaction process. It is based on a linearized Poisson-Boltzmann model and is known for its accurate predictions in dilute solutions. However, the substrate and product particles are in nonequilibrium states and are possibly charged, and their contributions to the total electrostatic field cannot be explicitly studied in the Poisson-Boltzmann model. Hence the influences of substrate and product on the reaction rate coefficient were not known. In this work, we consider all the charged species, including the charged substrate, product, and mobile salt ions, in a Poisson-Nernst-Planck model, and then compare the results with previous work. The results indicate that both the charged substrate and product can significantly influence the reaction rate coefficient, with different behaviors under different setups of computational conditions. It is interesting to find that when substrate and product are both considered, under an overall neutral boundary condition for all the bulk charged species, the computed reaction rate kinetics recovers a similar Debye-Hückel limiting law again. This phenomenon implies that the charged product counteracts the influence of the charged substrate on the reaction rate coefficient. Our analysis discloses the fact that the total charge concentration of substrate and product, though each is in a nonequilibrium state individually, obeys an equilibrium Boltzmann distribution, and therefore contributes as a normal charged ion species to ionic strength. This explains why the Debye-Hückel limiting law still works over a considerable range of conditions even though the effects of charged substrate and product particles are not specifically and explicitly considered in the theory.
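For reference, the limiting-law behaviour invoked here is usually written in the Brønsted-Bjerrum form; this is the textbook expression, not an equation quoted from the paper:

```latex
\log_{10} k \;=\; \log_{10} k_0 \;+\; 2 A\, z_A z_B \sqrt{I},
\qquad A \approx 0.509\ \mathrm{L^{1/2}\,mol^{-1/2}}\ \text{(water, 25\,^{\circ}\mathrm{C})}
```

where z_A and z_B are the charges of the reacting species, I is the ionic strength, and k_0 is the rate constant at infinite dilution. The paper's finding that substrate plus product together follow a Boltzmann distribution explains why they can still be folded into I as an ordinary ionic species.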
Construction of energy-stable Galerkin reduced order models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalashnikova, Irina; Barone, Matthew Franklin; Arunajatesan, Srinivasan
2013-05-01
This report aims to unify several approaches for building stable projection-based reduced order models (ROMs). Attention is focused on linear time-invariant (LTI) systems. The model reduction procedure consists of two steps: the computation of a reduced basis, and the projection of the governing partial differential equations (PDEs) onto this reduced basis. Two kinds of reduced bases are considered: the proper orthogonal decomposition (POD) basis and the balanced truncation basis. The projection step of the model reduction can be done in two ways: via continuous projection or via discrete projection. First, an approach for building energy-stable Galerkin ROMs for linear hyperbolic or incompletely parabolic systems of PDEs using continuous projection is proposed. The idea is to apply to the set of PDEs a transformation induced by the Lyapunov function for the system, and to build the ROM in the transformed variables. The resulting ROM will be energy-stable for any choice of reduced basis. It is shown that, for many PDE systems, the desired transformation is induced by a special weighted L2 inner product, termed the "symmetry inner product". Attention is then turned to building energy-stable ROMs via discrete projection. A discrete counterpart of the continuous symmetry inner product, a weighted L2 inner product termed the "Lyapunov inner product", is derived. The weighting matrix that defines the Lyapunov inner product can be computed in a black-box fashion for a stable LTI system arising from the discretization of a system of PDEs in space. It is shown that a ROM constructed via discrete projection using the Lyapunov inner product will be energy-stable for any choice of reduced basis. Connections between the Lyapunov inner product and the inner product induced by the balanced truncation algorithm are made. Comparisons are also made between the symmetry inner product and the Lyapunov inner product. The performance of ROMs constructed using these inner products is evaluated on several benchmark test cases.
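A minimal sketch of the discrete-projection route, assuming a stable LTI system dx/dt = Ax (the operator, weighting and basis below are synthetic stand-ins, not matrices from the report):

```python
# Hedged sketch: Galerkin ROM built in a Lyapunov inner product.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(0)
n, k = 50, 5
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))  # assumed-stable full-order operator

# Weighting matrix of the Lyapunov inner product: solve A^T P + P A = -Q with Q > 0.
Q = np.eye(n)
P = solve_continuous_lyapunov(A.T, -Q)

# Any reduced basis will do for the stability property; take a random orthonormal one.
Phi, _ = np.linalg.qr(rng.standard_normal((n, k)))

# Galerkin projection in the weighted inner product <x, y>_P = x^T P y.
Ar = np.linalg.solve(Phi.T @ P @ Phi, Phi.T @ P @ A @ Phi)

# Energy check: the ROM energy q^T (Phi^T P Phi) q decays, so Re(eig) <= 0.
print(np.max(np.linalg.eigvals(Ar).real))
```

The printed maximum real part is non-positive regardless of how Phi is chosen, which is the "energy-stable for any choice of reduced basis" property in miniature.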
Mathematical Description of Complex Chemical Kinetics and Application to CFD Modeling Codes
NASA Technical Reports Server (NTRS)
Bittker, D. A.
1993-01-01
A major effort in combustion research at the present time is devoted to the theoretical modeling of practical combustion systems. These include turbojet and ramjet air-breathing engines as well as ground-based gas-turbine power generating systems. The ability to use computational modeling extensively in designing these products not only saves time and money, but also helps designers meet the quite rigorous environmental standards that have been imposed on all combustion devices. The goal is to combine the very complex solution of the Navier-Stokes flow equations with realistic turbulence and heat-release models into a single computer code. Such a computational fluid-dynamic (CFD) code simulates the coupling of fluid mechanics with the chemistry of combustion to describe the practical devices. This paper will focus on the task of developing a simplified chemical model which can predict realistic heat-release rates as well as species composition profiles, and is also computationally rapid. We first discuss the mathematical techniques used to describe a complex, multistep fuel oxidation chemical reaction and develop a detailed mechanism for the process. We then show how this mechanism may be reduced and simplified to give an approximate model which adequately predicts heat release rates and a limited number of species composition profiles, but is computationally much faster than the original one. Only such a model can be incorporated into a CFD code without adding significantly to long computation times. Finally, we present some of the recent advances in the development of these simplified chemical mechanisms.
NASA Astrophysics Data System (ADS)
Barreiro, F. H.; Borodin, M.; De, K.; Golubkov, D.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Padolski, S.; Wenaus, T.; ATLAS Collaboration
2017-10-01
The second generation of the ATLAS Production System, called ProdSys2, is a distributed workload manager that runs hundreds of thousands of jobs daily, from dozens of different ATLAS-specific workflows, across more than a hundred heterogeneous sites. It achieves high utilization by combining dynamic job definition based on many criteria, such as input and output size, memory requirements and CPU consumption, with manageable scheduling policies, and by supporting different kinds of computational resources, such as GRID, clouds, supercomputers and volunteer computers. The system dynamically assigns a group of jobs (a task) to a group of geographically distributed computing resources. Dynamic assignment and resource utilization is one of the major features of the system; it did not exist in the earliest versions of the production system, where the Grid resources topology was predefined using national and/or geographical patterns. The Production System has a sophisticated job fault-recovery mechanism, which efficiently allows multi-terabyte tasks to run without human intervention. We have implemented a "train" model and open-ended production, which allow tasks to be submitted automatically as soon as a new set of data is available, and physics-group data processing and analysis to be chained with central production by the experiment. We present an overview of the ATLAS Production System and the features and architecture of its major components: task definition, web user interface and monitoring. We describe the important design decisions and lessons learned from operational experience during the first year of LHC Run2. We also report the performance of the designed system and how various workflows, such as data (re)processing, Monte Carlo and physics group production, and user analysis, are scheduled and executed within one production system on heterogeneous computing resources.
NASA Astrophysics Data System (ADS)
Starikov, A. I.; Nekrasov, R. Yu; Teploukhov, O. J.; Soloviev, I. V.; Narikov, K. A.
2016-10-01
Machinery and equipment improve constructively as science and technology advance, and requirements for quality and longevity rise accordingly. That is, the requirements for the surface quality and manufacturing precision of oil and gas equipment parts are constantly increasing. Production of oil and gas engineering products on modern machine tools with computer numerical control is a complex synthesis of the mechanical and electrical parts of the equipment as well as the processing procedure. The mechanical part of a machine wears during operation, and mathematical errors accumulate in the electrical part. Thus, the above-mentioned shortcomings in any part of the metalworking equipment affect the manufacturing process as a whole and, as a result, lead to defects.
Tominaga, Maki; Asakura, Takashi; Akiyama, Tsuyoshi
2007-06-01
To investigate the effect of micro and macro stressors in the work environment on the subjective health status and productive behavior of computer professionals, we conducted a web-based investigation with Japanese IT-related company employees in 53 company unions. The questionnaire consisted of individual attributes, employment characteristics, working hour characteristics, company size and profitability, personal characteristics (i.e., Growth Need Strength), micro and macro stressor scales, and four outcome scales concerning subjective health status and productive behavior. We obtained data from 1,049 Japanese IT-related company employees (response rate: 66%), and analyzed the data of computer engineers (80%; n=871). The results of hierarchical multiple regressions showed that the full models explained 23% of the variance in psychological distress, 20% in cumulative fatigue, 44% in job dissatisfaction, and 35% in intention to leave, respectively. Among micro stressors, "quantitative and qualitative work overload" had the strongest influence on both subjective health status and intention to leave. Furthermore, among macro stressors, "career and future ambiguity" was the most important predictor of subjective health status, and "insufficient evaluation systems" and "poor supervisor's support" were important predictors of productive behavior as well. These findings suggest that improving not only micro stressors but also macro stressors will enhance the subjective health status and increase the productive behavior of computer professionals in Japan.
Donato, David I.
2013-01-01
A specialized technique is used to compute weighted ordinary least-squares (OLS) estimates of the parameters of the National Descriptive Model of Mercury in Fish (NDMMF) in less time using less computer memory than general methods. The characteristics of the NDMMF allow the two products X'X and X'y in the normal equations to be filled out in a second or two of computer time during a single pass through the N data observations. As a result, the matrix X does not have to be stored in computer memory and the computationally expensive matrix multiplications generally required to produce X'X and X'y do not have to be carried out. The normal equations may then be solved to determine the best-fit parameters in the OLS sense. The computational solution based on this specialized technique requires O(8p2+16p) bytes of computer memory for p parameters on a machine with 8-byte double-precision numbers. This publication includes a reference implementation of this technique and a Gaussian-elimination solver in preliminary custom software.
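A minimal sketch of the single-pass idea on synthetic data (generic weighted OLS; this is not the NDMMF reference implementation mentioned above):

```python
# Hedged sketch: fill X'X and X'y in one pass, never storing the full X matrix.
import numpy as np

p = 3
XtX = np.zeros((p, p))   # running X'X accumulator, O(p^2) memory
Xty = np.zeros(p)        # running X'y accumulator, O(p) memory

def observation_stream(n=100_000, seed=0):
    """Synthetic stand-in for streaming N observations from disk."""
    rng = np.random.default_rng(seed)
    beta_true = np.array([1.0, -2.0, 0.5])
    for _ in range(n):
        x = rng.standard_normal(p)
        y = x @ beta_true + 0.1 * rng.standard_normal()
        w = 1.0                      # per-observation weight (assumed uniform here)
        yield x, y, w

for x, y, w in observation_stream():
    XtX += w * np.outer(x, x)        # rank-1 update
    Xty += w * y * x

beta = np.linalg.solve(XtX, Xty)     # solve the normal equations
print(beta)                          # recovers beta_true up to noise
```

Memory use is dominated by the p-by-p accumulator plus two length-p vectors, consistent with the O(8p² + 16p)-byte figure quoted above for 8-byte doubles.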
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model is developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Simulation of charge exchange plasma propagation near an ion thruster propelled spacecraft
NASA Technical Reports Server (NTRS)
Robinson, R. S.; Kaufman, H. R.; Winder, D. R.
1981-01-01
A model describing the charge exchange plasma and its propagation is discussed, along with a computer code based on the model. The geometry of an idealized spacecraft having an ion thruster is outlined, with attention given to the assumptions used in modeling the ion beam. Also presented is the distribution function describing charge exchange production. The barometric equation is used in relating the variation in plasma potential to the variation in plasma density. The numerical methods and approximations employed in the calculations are discussed, and comparisons are made between the computer simulation and experimental data. An analytical solution of a simple configuration is also used in verifying the model.
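The barometric relation mentioned above is commonly written, for isothermal electrons, as (generic symbols, not the paper's notation):

```latex
n \;=\; n_0 \exp\!\left(\frac{e\,(V - V_0)}{k\,T_e}\right)
```

so that a drop in plasma potential V below the reference potential V_0 implies an exponential decrease in plasma density n, with T_e the electron temperature.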
NASA Technical Reports Server (NTRS)
Kim, Sang-Wook; Chen, Yen-Sen
1988-01-01
An algebraic stress turbulence model and a computational procedure for turbulent boundary layer flows which is based on the semidiscrete Galerkin FEM are discussed. In the algebraic stress turbulence model, the eddy viscosity expression is obtained from the Reynolds stress turbulence model, and the turbulent kinetic energy dissipation rate equation is improved by including a production range time scale. Good agreement with experimental data is found for the examples of a fully developed channel flow, a fully developed pipe flow, a flat plate boundary layer flow, a plane jet exhausting into a moving stream, a circular jet exhausting into a moving stream, and a wall jet flow.
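For context, algebraic stress models generalize the standard k-ε eddy-viscosity relation (textbook form, not the paper's exact expression):

```latex
\nu_t \;=\; C_\mu\,\frac{k^2}{\varepsilon}, \qquad C_\mu \approx 0.09
```

with the constant C_μ replaced by a function of the local stress state, e.g. of the production-to-dissipation ratio, which is what lets such models respond to effects a constant-coefficient formulation misses.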
Computer-Mediated Assessment of Higher-Order Thinking Development
ERIC Educational Resources Information Center
Tilchin, Oleg; Raiyn, Jamal
2015-01-01
Solving complicated problems in a contemporary knowledge-based society requires higher-order thinking (HOT). The most productive way to encourage development of HOT in students is through use of the Problem-based Learning (PBL) model. This model organizes learning by solving corresponding problems relative to study courses. Students are directed…
L. Linsen; B.J. Karis; E.G. McPherson; B. Hamann
2005-01-01
In computer graphics, models describing the fractal branching structure of trees typically exploit the modularity of tree structures. The models are based on local production rules, which are applied iteratively and simultaneously to create a complex branching system. The objective is to generate three-dimensional scenes of often many realistic- looking and non-...
Problem Solving Under Time-Constraints.
ERIC Educational Resources Information Center
Richardson, Michael; Hunt, Earl
A model of how automated and controlled processing can be mixed in computer simulations of problem solving is proposed. It is based on previous work by Hunt and Lansman (1983), who developed a model of problem solving that could reproduce the data obtained with several attention and performance paradigms, extending production-system notation to…
CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner
Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold
1991-01-01
Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...
BROJA-2PID: A Robust Estimator for Bivariate Partial Information Decomposition
NASA Astrophysics Data System (ADS)
Makkeh, Abdullah; Theis, Dirk; Vicente, Raul
2018-04-01
Makkeh, Theis, and Vicente found in [8] that the Cone Programming model is the most robust way to compute the Bertschinger et al. partial information decomposition (BROJA PID) measure [1]. We developed production-quality robust software that computes the BROJA PID measure based on the Cone Programming model. In this paper, we prove the important property of strong duality for the Cone Program and prove an equivalence between the Cone Program and the original convex problem. We then describe our software in detail and explain how to use it.
Design, processing and testing of LSI arrays, hybrid microelectronics task
NASA Technical Reports Server (NTRS)
Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.; Rothrock, C. W.
1979-01-01
Mathematical cost models previously developed for hybrid microelectronic subsystems were refined and expanded. Rework terms related to substrate fabrication, nonrecurring developmental and manufacturing operations, and prototype production are included. Sample computer programs were written to demonstrate hybrid microelectronic applications of these cost models. Computer programs were generated to calculate and analyze values for the total microelectronics costs. Large scale integrated (LSI) chips utilizing tape chip carrier technology were studied. The feasibility of interconnecting arrays of LSI chips utilizing tape chip carrier and semiautomatic wire bonding technology was demonstrated.
NASA Technical Reports Server (NTRS)
1993-01-01
Jack is an advanced human factors software package that provides a three dimensional model for predicting how a human will interact with a given system or environment. It can be used for a broad range of computer-aided design applications. Jack was developed by the computer Graphics Research Laboratory of the University of Pennsylvania with assistance from NASA's Johnson Space Center, Ames Research Center and the Army. It is the University's first commercial product. Jack is still used for academic purposes at the University of Pennsylvania. Commercial rights were given to Transom Technologies, Inc.
Automated method for structural segmentation of nasal airways based on cone beam computed tomography
NASA Astrophysics Data System (ADS)
Tymkovych, Maksym Yu.; Avrunin, Oleg G.; Paliy, Victor G.; Filzow, Maksim; Gryshkov, Oleksandr; Glasmacher, Birgit; Omiotek, Zbigniew; Dzierżak, Róża; Smailova, Saule; Kozbekova, Ainur
2017-08-01
This work addresses the problem of segmenting human nasal airways from Cone Beam Computed Tomography. We propose a specialized approach to structured segmentation of the nasal airways that uses spatial information and symmetrisation of the structures. The proposed stages can be used to construct a virtual three-dimensional model of the nasal airways and to produce full-scale personalized atlases. In the course of this research we built a virtual model of the nasal airways, which can be used for constructing specialized medical atlases and for aerodynamics research.
Future Approach to tier-0 extension
NASA Astrophysics Data System (ADS)
Jones, B.; McCance, G.; Cordeiro, C.; Giordano, D.; Traylen, S.; Moreno García, D.
2017-10-01
The current tier-0 processing at CERN is done on two managed sites, the CERN computer centre and the Wigner computer centre. With the proliferation of public cloud resources at increasingly competitive prices, we have been investigating how to transparently increase our compute capacity to include these providers. The approach taken has been to integrate these resources using our existing deployment and computer management tools and to provide them in a way that exposes them to users as part of the same site. The paper will describe the architecture, the toolset and the current production experiences of this model.
Computer analysis of railcar vibrations
NASA Technical Reports Server (NTRS)
Vlaminck, R. R.
1975-01-01
Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effect on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation are presented. Ride quality vibration data collected on the state-of-the-art car and standard light rail vehicle is compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.
Gebraad, P. M. O.; Teeuwisse, F. W.; van Wingerden, J. W.; ...
2016-01-01
This article presents a wind plant control strategy that optimizes the yaw settings of wind turbines for improved energy production of the whole wind plant by taking into account wake effects. The optimization controller is based on a novel internal parametric model for wake effects, called the FLOw Redirection and Induction in Steady-state (FLORIS) model. The FLORIS model predicts the steady-state wake locations and the effective flow velocities at each turbine, and the resulting turbine electrical energy production levels, as a function of the axial induction and the yaw angle of the different rotors. The FLORIS model has a limited number of parameters that are estimated based on turbine electrical power production data. In high-fidelity computational fluid dynamics simulations of a small wind plant, we demonstrate that the optimization control based on the FLORIS model increases the energy production of the wind plant, with a reduction of loads on the turbines as an additional effect.
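A toy version of the wake-steering trade-off (deliberately not the FLORIS model: the cosine power-loss exponent, wake deficit and deflection coefficient below are illustrative assumptions):

```python
# Hedged sketch: optimize the yaw of an upstream turbine to help a waked one.
import numpy as np
from scipy.optimize import minimize_scalar

def plant_power(gamma, deficit=0.3, k_deflect=0.4, p_exp=1.88):
    """Total power (arbitrary units) of an upstream/downstream turbine pair.

    gamma     : upstream yaw angle in radians (decision variable)
    deficit   : fractional wake velocity deficit at the downstream rotor (assumed)
    k_deflect : strength with which yaw deflects the wake off the rotor (assumed)
    p_exp     : cosine exponent for the yawed rotor's own power loss (assumed)
    """
    p_up = np.cos(gamma) ** p_exp                      # yawing costs the upstream turbine
    overlap = max(0.0, 1.0 - k_deflect * abs(gamma))   # ...but steers the wake away
    u_down = 1.0 - deficit * overlap                   # downstream inflow speed
    return p_up + u_down ** 3                          # power scales with velocity cubed

res = minimize_scalar(lambda g: -plant_power(g), bounds=(0.0, 0.5), method="bounded")
print(f"optimal yaw ~ {np.degrees(res.x):.1f} deg, "
      f"plant gain {plant_power(res.x) / plant_power(0.0) - 1:.1%}")
```

Even this caricature reproduces the qualitative result: a small upstream power sacrifice buys a larger downstream recovery, so the plant optimum sits at a nonzero yaw.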
NASA Astrophysics Data System (ADS)
Herdağdelen, Amaç; Bingol, Haluk
Social interactions and personal tastes shape our consumption behavior of cultural products. In this study, we present a computational model of a cultural market and we aim to analyze the behavior of the consumer population as an emergent phenomena. Our results suggest that the final market shares of cultural products dramatically depend on consumer heterogeneity and social interaction pressure. Furthermore, the relation between the resulting market shares and social interaction is robust with respect to a wide range of variation in the parameter values and the type of topology.
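A minimal agent-based sketch of such a market (the update rule and parameters are assumptions for illustration, not the authors' exact model):

```python
# Hedged sketch: consumers weigh private taste against market share (social pressure).
import numpy as np

rng = np.random.default_rng(1)
n_consumers, n_products, steps = 1000, 10, 200
beta = 0.6                                       # social-pressure weight (assumed)
taste = rng.random((n_consumers, n_products))    # heterogeneous private preferences

shares = np.full(n_products, 1.0 / n_products)   # start from equal market shares
for _ in range(steps):
    utility = (1 - beta) * taste + beta * shares    # shares broadcast to every consumer
    choices = utility.argmax(axis=1)                # each consumer picks one product
    shares = np.bincount(choices, minlength=n_products) / n_consumers

print(np.sort(shares)[::-1])   # skewed final shares emerge as beta grows
```

Sweeping beta and the spread of the taste matrix in a model like this is one way to see the reported dependence of final market shares on social interaction pressure and consumer heterogeneity.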
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowery, P.S.; Lessor, D.L.
Waste glass melter and in situ vitrification (ISV) processes represent the combination of electrical, thermal, and fluid flow phenomena to produce a stable waste-form product. Computational modeling of the thermal and fluid flow aspects of these processes provides a useful tool for assessing the potential performance of proposed system designs. These computations can be performed at a fraction of the cost of experiment. Consequently, computational modeling of vitrification systems can also provide an economical means of assessing the suitability of a proposed process application. The computational model described in this paper employs finite difference representations of the basic continuum conservation laws governing the thermal, fluid flow, and electrical aspects of the vitrification process -- i.e., conservation of mass, momentum, energy, and electrical charge. The resulting code is a member of the TEMPEST family of codes developed at the Pacific Northwest Laboratory (operated by Battelle for the US Department of Energy). This paper provides an overview of the numerical approach employed in TEMPEST. In addition, results from several TEMPEST simulations of sample waste glass melter and ISV processes are provided to illustrate the insights to be gained from computational modeling of these processes. 3 refs., 13 figs.
Modal response of a computational vocal fold model with a substrate layer of adipose tissue.
Jones, Cameron L; Achuthan, Ajit; Erath, Byron D
2015-02-01
This study demonstrates the effect of a substrate layer of adipose tissue on the modal response of the vocal folds, and hence, on the mechanics of voice production. Modal analysis is performed on the vocal fold structure with a lateral layer of adipose tissue. A finite element model is employed, and the first six mode shapes and modal frequencies are studied. The results show significant changes in modal frequencies and substantial variation in mode shapes depending on the strain rate of the adipose tissue. These findings highlight the importance of considering adipose tissue in computational vocal fold modeling.
Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact
NASA Astrophysics Data System (ADS)
Abadjiev, Valentin; Kawasaki, Haruhisa
2014-09-01
Computer-aided design has advanced, producing various types of software for scientific research in the field of gearing theory as well as providing adequate scientific support for gear drive manufacture. Computer programs based on mathematical models resulting from this research are presented here. Modern gear transmissions require the construction of new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters. The study is dedicated to the accepted methodology in the creation of software for the synthesis of a class of high-reduction hyperboloid gears - Spiroid and Helicon ones (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill). The developed basic computer products are software based on original mathematical models. They rely on two mathematical models for the synthesis: "upon a pitch contact point" and "upon a mesh region". Computer programs are worked out on the basis of the described mathematical models, and the relations between them are shown. The application of these approaches to the synthesis of the gear drives in question is illustrated.
Single Top Production at Next-to-Leading Order in the Standard Model Effective Field Theory.
Zhang, Cen
2016-04-22
Single top production processes at hadron colliders provide information on the relation between the top quark and the electroweak sector of the standard model. We compute the next-to-leading order QCD corrections to the three main production channels: t-channel, s-channel, and tW associated production, in the standard model including operators up to dimension six. The calculation can be matched to parton shower programs and can therefore be directly used in experimental analyses. The QCD corrections are found to significantly impact the extraction of the current limits on the operators, because of both the improved accuracy and the better precision of the theoretical predictions. In addition, the distributions of some of the key discriminating observables are modified in a nontrivial way, which could change the interpretation of measurements in terms of UV complete models.
The Design of Case Products’ Shape Form Information Database Based on NURBS Surface
NASA Astrophysics Data System (ADS)
Liu, Xing; Liu, Guo-zhong; Xu, Nuo-qi; Zhang, Wei-she
2017-07-01
In order to improve computer-aided design of product shapes, applying Non-Uniform Rational B-Splines (NURBS) curves and surfaces to the representation of product shape helps designers design products effectively. On the basis of contour extraction from typical product images, and using Pro/Engineer (Pro/E) to extract the geometric features of scanned molds so as to structure an information database system of value points, control points and knot vector parameters, this paper puts forward a unified method that uses NURBS curves and surfaces to describe the geometric shape of products with the same or similar function, with simulation in matrix laboratory (MATLAB). A case study of an electric vehicle's front cover illustrates the process of retrieving the geometric shape information of a case product. This method can not only greatly reduce the required information storage capacity, but also improve the effectiveness of computer-aided geometric innovation modeling.
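The underlying representation is the standard rational B-spline curve (textbook definition, not a formula quoted from the paper):

```latex
C(u) \;=\; \frac{\sum_{i=0}^{n} N_{i,p}(u)\, w_i\, P_i}{\sum_{i=0}^{n} N_{i,p}(u)\, w_i}
```

where the P_i are control points, the w_i their weights, and the N_{i,p} are degree-p B-spline basis functions over a knot vector; the value points, control points and knot vectors stored in the database are exactly the data this formula consumes.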
On the Performance of Alternate Conceptual Ecohydrological Models for Streamflow Prediction
NASA Astrophysics Data System (ADS)
Naseem, Bushra; Ajami, Hoori; Cordery, Ian; Sharma, Ashish
2016-04-01
A merging of a lumped conceptual hydrological model with two conceptual dynamic vegetation models is presented to assess the performance of these models for simultaneous simulations of streamflow and leaf area index (LAI). Two conceptual dynamic vegetation models with differing representations of ecological processes are merged with a lumped conceptual hydrological model (HYMOD) to predict catchment-scale streamflow and LAI. The merged RR-LAI-I model computes relative leaf biomass based on transpiration rates, while the RR-LAI-II model computes above-ground green and dead biomass based on net primary productivity and water use efficiency in response to soil moisture dynamics. To assess the performance of these models, daily discharge and the 8-day MODIS LAI product for 27 catchments of 90-1600 km² in size located in the Murray-Darling Basin in Australia are used. Our results illustrate that when single-objective optimisation focussed on maximizing the objective function for streamflow or LAI, the other, un-calibrated predicted outcome (LAI if streamflow is the focus) was consistently compromised. Thus, single-objective optimization cannot take into account the essence of all processes in the conceptual ecohydrological models. However, multi-objective optimisation showed great strength for streamflow and LAI predictions. Both response outputs were better simulated by RR-LAI-II than RR-LAI-I due to better representation of physical processes such as net primary productivity (NPP) in RR-LAI-II. Our results highlight that simultaneous calibration of streamflow and LAI using a multi-objective algorithm proves to be an attractive tool for improved streamflow predictions.
Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling
NASA Astrophysics Data System (ADS)
Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.
2017-12-01
Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
The open XXX spin chain in the SoV framework: scalar product of separate states
NASA Astrophysics Data System (ADS)
Kitanine, N.; Maillet, J. M.; Niccoli, G.; Terras, V.
2017-06-01
We consider the XXX open spin-1/2 chain with the most general non-diagonal boundary terms, which we solve by means of the quantum separation of variables (SoV) approach. We compute the scalar products of separate states, a class of states which notably contains all the eigenstates of the model. As usual for models solved by SoV, these scalar products can be expressed as some determinants with a non-trivial dependence on the inhomogeneity parameters that have to be introduced for the method to be applicable. We show that these determinants can be transformed into alternative ones in which the homogeneous limit can easily be taken. These new representations can be considered as generalizations of the well-known determinant representation for the scalar products of the Bethe states of the periodic chain. In the particular case where a constraint is applied on the boundary parameters, such that the transfer matrix spectrum and eigenstates can be characterized in terms of polynomial solutions of a usual T-Q equation, the scalar product that we compute here corresponds to the scalar product between two off-shell Bethe-type states. If in addition one of the states is an eigenstate, the determinant representation can be simplified, hence leading in this boundary case to direct analogues of algebraic Bethe ansatz determinant representations of the scalar products for the periodic chain.
40 CFR 86.079-31 - Separate certification.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-Duty Engines, and for 1985 and Later Model Year New Gasoline Fueled, Natural Gas-Fueled, Liquefied... certification of part of his product line. The selection of test vehicles (or test engines) and the computation...
40 CFR 86.079-31 - Separate certification.
Code of Federal Regulations, 2011 CFR
2011-07-01
...-Duty Engines, and for 1985 and Later Model Year New Gasoline Fueled, Natural Gas-Fueled, Liquefied... certification of part of his product line. The selection of test vehicles (or test engines) and the computation...
Dynamic optimization of CELSS crop photosynthetic rate by computer-assisted feedback control
NASA Astrophysics Data System (ADS)
Chun, C.; Mitchell, C. A.
1997-01-01
A procedure for dynamic optimization of net photosynthetic rate (Pn) for crop production in Controlled Ecological Life-Support Systems (CELSS) was developed using leaf lettuce as a model crop. Canopy Pn was measured in real time and fed back for environmental control. Setpoints of photosynthetic photon flux (PPF) and CO_2 concentration for each hour of the crop-growth cycle were decided by computer to reach a targeted Pn each day. Decision making was based on empirical mathematical models combined with rule sets developed from recent experimental data. Comparisons showed that dynamic control resulted in better yield per unit energy input to the growth system than did static control. With comparable productivity parameters and potential for significant energy savings, dynamic control strategies will contribute greatly to the sustainability of space-deployed CELSS.
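A minimal sketch of such a feedback loop (a simple proportional rule on PPF; the response surface, gain and limits are illustrative assumptions, not the authors' empirical models or rule sets):

```python
# Hedged sketch: hourly PPF setpoint adjustment to track a target canopy Pn.

def predicted_pn(ppf, co2):
    """Toy empirical response surface, saturating in both inputs (assumed)."""
    return 30.0 * (ppf / (ppf + 400.0)) * (co2 / (co2 + 600.0))

target_pn = 12.0          # target net photosynthetic rate (assumed units)
ppf, co2 = 300.0, 800.0   # initial setpoints (assumed)
gain = 25.0               # proportional gain (assumed)

for hour in range(10):
    pn = predicted_pn(ppf, co2)           # in the real system: measured canopy Pn
    ppf += gain * (target_pn - pn)        # raise light when below target
    ppf = min(max(ppf, 100.0), 1000.0)    # actuator limits (assumed)
    print(f"hour {hour}: Pn = {pn:5.2f}, next PPF = {ppf:6.1f}")
```

The energy-saving argument is visible even here: light is added only while measured Pn lags the target, rather than being held at a static worst-case setpoint.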
Integrating a Single Tablet PC in Chemistry, Engineering, and Physics Courses
ERIC Educational Resources Information Center
Rogers, James W.; Cox, James R.
2008-01-01
A tablet PC is a versatile computer that combines the computing power of a notebook with the pen functionality of a PDA (Cox and Rogers 2005b). The authors adopted tablet PC technology in order to improve the process and product of the lecture format in their chemistry, engineering, and physics courses. In this high-tech model, a single tablet PC…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busbey, A.B.
Seismic Processing Workshop, a program by Parallel Geosciences of Austin, TX, is discussed in this column. The program is a high-speed, interactive seismic processing and computer analysis system for the Apple Macintosh II family of computers. Also reviewed in this column are three products from Wilkerson Associates of Champaign, IL. SubSide is an interactive program for basin subsidence analysis; MacFault and MacThrustRamp are programs for modeling faults.
S.J. Chang; Rodney L. Busby; P.R. Pasala; Daniel J. Leduc
2005-01-01
A Visual Basic computer model that can be used to estimate the harvest value of loblolly pine plantations in the west gulf region is presented. The model uses a dynamic programming algorithm to convert stand tables predicted by COMPUTE_P-LOB into a listing of seven products that maximizes the harvested value of the stand.
S.J. Chang; Rodney L. Busby; P.R. Pasala; Jeffrey C. Goelz
2005-01-01
A Visual Basic computer model that can be used to estimate the harvest value of slash pine plantations in the west gulf region is presented. The model uses a dynamic programming algorithm to convert stand tables predicted by COMPUTE_P-SLASH into a listing of seven products that maximizes the harvested value of the stand.
Ledford, Chelsea; McMahon, Monica; Whitesell, Ashley; Khan, Ghalib; Kandagatla, Suneel K; Hurst, Dow P; Reggio, Patricia H; Raner, Gregory M
2017-02-01
To develop a model for binding and catalysis associated with the stimulation of 4-fluorophenol (4-FP) oxidation in the presence of long-chain aldehydes by the enzymatic catalyst cytochrome P450 BM3-F87G. A variation of the Michaelis-Menten kinetic model was employed to describe interactions at the active site of the enzyme, along with computer-aided modeling approaches. In addition to the hydroquinone product arising from de-fluorination of 4-FP, a second product (p-fluorocatechol) was also observed and, like the hydroquinone, its rate of formation increased in the presence of the aldehyde. When only aldehyde was present with the enzyme, BM3-F87G catalyzed its oxidation to the corresponding carboxylic acid; however, this activity was inhibited when 4-FP was added to the reaction. A 3D computer model of the active site containing both aldehyde and 4-FP was generated, guided by these kinetic observations. Finally, partitioning between the two phenolic products was examined with an emphasis on the conditions directing the initial epoxidation to either the 2,3- or 3,4-positions on the substrate. Temperature, reaction time, substrate concentration, and the structure of the aldehyde had no substantial effect on the overall product ratios; however, the NADPH coupling efficiency decreased when unsaturated aldehydes were included, or when the temperature of the reaction was reduced. The unsaturated aldehyde, trans-2-decenal, stimulates BM3-F87G-catalyzed oxidation of 4-fluorophenol through a cooperative active-site binding mode that does not influence product distributions or coupling efficiencies, while 4-fluorophenol acts as a competitive inhibitor of aldehyde oxidation.
Diagnostic analysis of two-dimensional monthly average ozone balance with Chapman chemistry
NASA Technical Reports Server (NTRS)
Stolarski, Richard S.; Jackman, Charles H.; Kaye, Jack A.
1986-01-01
Chapman chemistry has been used in a two-dimensional model to simulate ozone balance phenomenology. The similarity between regions of ozone production and loss calculated using Chapman chemistry and those computed using LIMS and SAMS data with a photochemical equilibrium model indicate that such simplified chemistry is useful in studying gross features in stratospheric ozone balance. Net ozone production or loss rates are brought about by departures from the photochemical equilibrium (PCE) condition. If transport drives ozone above its PCE condition, then photochemical loss dominates production. If transport drives ozone below its PCE condition, then photochemical production dominates loss. Gross features of ozone loss/production (L/P) inferred for the real atmosphere from data are also simulated using only eddy diffusion. This indicates that one must be careful in assigning a transport scheme for a two-dimensional model that mimics only behavior of the observed ozone L/P.
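For reference, the Chapman reactions referred to above are, in their standard form:

```latex
\begin{aligned}
\mathrm{O_2} + h\nu &\rightarrow 2\,\mathrm{O},\\
\mathrm{O} + \mathrm{O_2} + \mathrm{M} &\rightarrow \mathrm{O_3} + \mathrm{M},\\
\mathrm{O_3} + h\nu &\rightarrow \mathrm{O_2} + \mathrm{O},\\
\mathrm{O} + \mathrm{O_3} &\rightarrow 2\,\mathrm{O_2},
\end{aligned}
```

where M is any third body; the odd-oxygen production and loss terms discussed in the text are built from the first and last of these reactions.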
Computational models for the analysis of three-dimensional internal and exhaust plume flowfields
NASA Technical Reports Server (NTRS)
Dash, S. M.; Delguidice, P. D.
1977-01-01
This paper describes computational procedures developed for the analysis of three-dimensional supersonic ducted flows and multinozzle exhaust plume flowfields. The models/codes embodying these procedures cater to a broad spectrum of geometric situations via the use of multiple reference plane grid networks in several coordinate systems. Shock capturing techniques are employed to trace the propagation and interaction of multiple shock surfaces while the plume interface, separating the exhaust and external flows, and the plume external shock are discretely analyzed. The computational grid within the reference planes follows the trace of streamlines to facilitate the incorporation of finite-rate chemistry and viscous computational capabilities. Exhaust gas properties consist of combustion products in chemical equilibrium. The computational accuracy of the models/codes is assessed via comparisons with exact solutions, results of other codes and experimental data. Results are presented for the flows in two-dimensional convergent and divergent ducts, expansive and compressive corner flows, flow in a rectangular nozzle and the plume flowfields for exhausts issuing out of single and multiple rectangular nozzles.
ERIC Educational Resources Information Center
Oakes, G. L.; Felton, A. J.; Garner, K. B.
2006-01-01
The BSc in computer aided product design (CAPD) course at the University of Wolverhampton was conceived as a collaborative venture in 1989 between the School of Engineering and the School of Art and Design. The award was at the forefront of forging interdisciplinary collaboration at undergraduate level in the field of product design. It has…
NASA Astrophysics Data System (ADS)
Kwak, Minjung; Kim, Harrison
2015-01-01
Remanufacturing is emerging as a promising solution for achieving green, profitable businesses. This article considers a manufacturer that produces new products and also remanufactured versions of the new products that become available at the end of their life cycle. For such a manufacturer, design decisions at the initial design stage determine both the current profit from manufacturing and future profit from remanufacturing. To maximize the total profit, design decisions must carefully consider both ends of product life cycle, i.e. manufacturing and end-of-life stages. This article proposes a decision-support model for the life-cycle design using mixed-integer nonlinear programming. With an aim to maximize the total life-cycle profit, the proposed model searches for an (at least locally) optimal product design (i.e. design specifications and the selling price) for the new and remanufactured products. It optimizes both the initial design and design upgrades at the end-of-life stage and also provides corresponding production strategies, including production quantities and take-back rate. The model is extended to a multi-objective model that maximizes both economic profit and environmental-impact saving. To illustrate, the developed model is demonstrated with an example of a desktop computer.
Accelerating Climate and Weather Simulations through Hybrid Computing
NASA Technical Reports Server (NTRS)
Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark
2011-01-01
Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.
QCD next-to-leading-order predictions matched to parton showers for vector-like quark models.
Fuks, Benjamin; Shao, Hua-Sheng
2017-01-01
Vector-like quarks are featured by a wealth of beyond-the-Standard-Model theories and are consequently an important goal of many LHC searches for new physics. Those searches, as well as most related phenomenological studies, however, rely on predictions evaluated at leading-order accuracy in QCD and consider well-defined simplified benchmark scenarios. Adopting an effective bottom-up approach, we compute next-to-leading-order predictions for vector-like-quark pair production and single production in association with jets, with a weak boson or with a Higgs boson in a general new physics setup. We additionally compute vector-like-quark contributions to the production of a pair of Standard Model bosons at the same level of accuracy. For all processes under consideration, we focus both on total cross sections and on differential distributions, most of these calculations being performed for the first time in our field. As a result, our work paves the way to precise extraction of experimental limits on vector-like quarks thanks to an accurate control of the shapes of the relevant observables, and emphasises the extra handles that could be provided by novel vector-like-quark probes never envisaged so far.
Lackmann, J-W; Wende, K; Verlackt, C; Golda, J; Volzke, J; Kogelheide, F; Held, J; Bekeschus, S; Bogaerts, A; Schulz-von der Gathen, V; Stapelmann, K
2018-05-16
Reactive oxygen and nitrogen species released by cold physical plasma are being proposed as effectors in various clinical conditions connected to inflammatory processes. As these plasmas can be tailored over a wide range, models to compare and control their biochemical footprint are desired, both to infer the molecular mechanisms underlying the observed effects and to enable discrimination between different plasma sources. Here, an improved model to trace short-lived reactive species is presented. Using FTIR, high-resolution mass spectrometry, and molecular dynamics computational simulation, covalent modifications of cysteine treated with different plasmas were deciphered and the respective product patterns used to generate a fingerprint of each plasma source. As such, our experimental model allows a fast and reliable grading of the chemical potential of plasmas used for medical purposes. Major reaction products were identified to be cysteine sulfonic acid, cystine, and cysteine fragments. Less-abundant products, such as oxidized cystine derivatives or S-nitrosylated cysteines, were unique to different plasma sources or operating conditions. The data collected point to hydroxyl radicals, atomic O, and singlet oxygen as the major contributing species that enable an impact on cellular thiol groups when applying cold plasma in vitro or in vivo.
ERIC Educational Resources Information Center
Dell, Gary S.; Martin, Nadine; Schwartz, Myrna F.
2007-01-01
Lexical access in language production, and particularly pathologies of lexical access, are often investigated by examining errors in picture naming and word repetition. In this article, we test a computational approach to lexical access, the two-step interactive model, by examining whether the model can quantitatively predict the repetition-error…
Modeling Study of the Low-Temperature Oxidation of Large Methyl Esters from C11 to C19
Herbinet, Olivier; Biet, Joffrey; Hakka, Mohammed Hichem; Warth, Valérie; Glaude, Pierre Alexandre; Nicolle, André; Battin-Leclerc, Frédérique
2013-01-01
The low-temperature oxidation of large saturated methyl esters representative of those found in biodiesel fuels has been investigated through modeling. Models have been developed for these species, with detailed kinetic mechanisms generated automatically using a new extended version of the EXGAS software, which includes reactions specific to the chemistry of esters. A model generated for a binary mixture of n-decane and methyl palmitate was used to simulate experimental results obtained in a jet-stirred reactor for this fuel. This model predicts very well the reactivity of the fuel and the mole fraction profiles of most reaction products. This work also shows that a model for a middle-size methyl ester such as methyl decanoate predicts fairly well the reactivity and the mole fractions of most species, with a substantial decrease in computational time. Large n-alkanes such as n-hexadecane are also good surrogates for reproducing the reactivity of methyl esters, with a further gain in computational time, but they cannot account for the formation of specific products such as unsaturated esters or cyclic ethers with an ester function. PMID:23814504
NASA Astrophysics Data System (ADS)
Memon, Aamir Mahmood; Liu, Qun; Memon, Khadim Hussain; Baloch, Wazir Ali; Memon, Asfandyar; Baset, Abdul
2015-07-01
Catch and effort data were analyzed to estimate the maximum sustainable yield (MSY) of King Soldier Bream, Argyrops spinifer (Forsskål, 1775, Family: Sparidae), and to evaluate the present status of the fish stocks exploited in Pakistani waters. The catch and effort data for the 25-year period 1985-2009 were analyzed using two computer software packages, CEDA (catch and effort data analysis) and ASPIC (a surplus production model incorporating covariates). The maximum catch of 3 458 t was observed in 1988 and the minimum catch of 1 324 t in 2005, while the average annual catch of A. spinifer over the 25 years was 2 500 t. The CEDA package implements the surplus production models of Fox, Schaefer, and Pella-Tomlinson under three error assumptions (normal, log-normal, and gamma), while the ASPIC package implements the Fox and logistic surplus production models. In CEDA, the MSY was estimated by applying an initial proportion (IP) of 0.8, because the starting catch was approximately 80% of the maximum catch. Because the gamma error assumption showed maximization failures, results are reported for the other two error assumptions only: the MSY estimates from CEDA with the Fox model were 1 692.08 t (R²=0.572) and 1 694.09 t (R²=0.606), and those from the Schaefer and Pella-Tomlinson models were 2 390.95 t (R²=0.563) and 2 380.06 t (R²=0.605), respectively. The MSY estimated by the Fox model was conservative compared to the Schaefer and Pella-Tomlinson models, whose MSY values were practically the same. The MSY values computed with the ASPIC package were 1 498 t (R²=0.917) for the Fox surplus production model and 2 488 t (R²=0.897) for the logistic model. Overall, the estimated MSY values were about 1 700-2 400 t from CEDA and 1 500-2 500 t from ASPIC. The estimates output by the CEDA and ASPIC packages indicate that the stock is overfished and that effective management is needed to reduce fishing effort on this species in Pakistani waters.
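Since the abstract centers on fitting surplus production models to a catch-effort series, a minimal sketch of that procedure may help. This is not the CEDA or ASPIC implementation; the catch and effort numbers, starting values, and the initial-depletion guess of 0.8 (echoing the study's IP) are illustrative placeholders only.

```python
import numpy as np
from scipy.optimize import minimize

def project_biomass(r, K, B0, catches):
    """Project biomass forward with the Schaefer surplus production model."""
    B = np.empty(len(catches))
    B[0] = B0
    for t in range(len(catches) - 1):
        surplus = r * B[t] * (1.0 - B[t] / K)     # Schaefer production term
        B[t + 1] = max(B[t] + surplus - catches[t], 1e-6)
    return B

def objective(params, catches, effort):
    r, K, q, p0 = params                # p0: initial depletion, B0 = p0 * K
    if min(params) <= 0 or p0 > 1.5:
        return 1e12                     # keep the search in a sane region
    B = project_biomass(r, K, p0 * K, catches)
    resid = np.log(catches / effort) - np.log(q * B)   # log-normal errors
    return np.sum(resid ** 2)

# hypothetical catch (t) and effort series standing in for the 1985-2009 data
catches = np.array([2800., 3458., 3100., 2900., 2600., 2400., 2200., 2000.,
                    1800., 1700., 1600., 1500., 1400., 1324., 1500., 1700.])
effort = np.linspace(1.0, 2.5, len(catches))

res = minimize(objective, x0=[0.3, 30000., 1e-4, 0.8],
               args=(catches, effort), method="Nelder-Mead")
r, K, q, p0 = res.x
print(f"Schaefer MSY = rK/4 = {r * K / 4:.0f} t")
```

For the Fox model one would replace the production term with r*B*ln(K/B) and report MSY = rK/e, which is why Fox estimates tend to come out more conservative, as the abstract notes.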
On the computation of molecular surface correlations for protein docking using Fourier techniques.
Sakk, Eric
2007-08-01
The computation of surface correlations using a variety of molecular models has been applied to the unbound protein docking problem. Because of the computational complexity involved in examining all possible molecular orientations, the fast Fourier transform (FFT) (a fast numerical implementation of the discrete Fourier transform (DFT)) is generally applied to minimize the number of calculations. This approach is rooted in the convolution theorem which allows one to inverse transform the product of two DFTs in order to perform the correlation calculation. However, such a DFT calculation results in a cyclic or "circular" correlation which, in general, does not lead to the same result as the linear correlation desired for the docking problem. In this work, we provide computational bounds for constructing molecular models used in the molecular surface correlation problem. The derived bounds are then shown to be consistent with various intuitive guidelines previously reported in the protein docking literature. Finally, these bounds are applied to different molecular models in order to investigate their effect on the correlation calculation.
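A small one-dimensional numpy sketch can make the circular-versus-linear distinction concrete; the docking problem uses 3D grids, but the wrap-around artifact and its zero-padding remedy, which the paper's bounds formalize, are the same. The arrays below are arbitrary illustrative values.

```python
import numpy as np

def linear_correlation_fft(a, b):
    """Linear cross-correlation of two signals via FFT.

    Zero-padding both signals to at least len(a) + len(b) - 1 samples
    removes the wrap-around ("circular") artifact of the plain DFT
    correlation, yielding all sliding-window products with no mixing.
    """
    n = len(a) + len(b) - 1                       # minimum padded length
    A = np.fft.fft(a, n)                          # fft(x, n) zero-pads to n
    B = np.fft.fft(b, n)
    return np.real(np.fft.ifft(A * np.conj(B)))   # correlation theorem

a = np.array([1.0, 2.0, 3.0, 0.0])
b = np.array([0.0, 1.0, 0.5, 0.0])

# Unpadded DFT correlation: wrap-around mixes opposite ends of the grid.
circular = np.real(np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))))
linear = linear_correlation_fft(a, b)
print(circular)   # circular correlation (length 4, lags aliased)
print(linear)     # linear correlation (length 7, no wrap-around)
```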
New 3D model for dynamics modeling
NASA Astrophysics Data System (ADS)
Perez, Alain
1994-05-01
The wrist articulation represents one of the most complex mechanical systems of the human body. It is composed of eight bones rolling and sliding along their surfaces and along the faces of the five metacarpals of the hand and the two bones of the forearm. Wrist dynamics are fundamental to hand movement, yet the joint is so complex that it remains incompletely explored. This work is part of a new concept of computer-assisted surgery, which consists in developing computer models to improve surgical acts by predicting their consequences. The modeling of wrist dynamics is based first on a static 3D model of its bones. This 3D model must optimize the collision detection procedure, the necessary step for estimating the physical contact constraints. As many other available computer vision models do not fit this problem with enough precision, a new 3D model has been developed based on the medial axis of the digital distance map of the reconstructed bone volumes. The collision detection procedure is then simplified, since contacts are detected between spheres. Experiments with this original 3D dynamic model produce realistic computer animation images of solids in contact. It is now necessary to detect ligaments on digital medical images and to model them in order to complete the wrist model.
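As an illustration of why medial-axis spheres simplify collision detection, here is a minimal sketch: contact between two bones reduces to pairwise sphere-overlap tests, which are just distance comparisons. The sphere centers and radii below are hypothetical, not taken from the paper's wrist data.

```python
import numpy as np

def spheres_collide(c1, r1, c2, r2):
    """Two spheres intersect iff the centre distance <= sum of radii."""
    return np.linalg.norm(np.asarray(c1) - np.asarray(c2)) <= r1 + r2

def bones_in_contact(bone_a, bone_b):
    """Detect contact between two bones, each approximated by a set of
    (centre, radius) spheres placed along its medial axis."""
    return any(spheres_collide(ca, ra, cb, rb)
               for ca, ra in bone_a for cb, rb in bone_b)

# hypothetical medial-axis sphere sets for two carpal bones (mm)
scaphoid = [((0.0, 0.0, 0.0), 2.0), ((1.5, 0.5, 0.0), 1.8)]
lunate   = [((3.0, 0.5, 0.0), 1.6)]
print(bones_in_contact(scaphoid, lunate))  # True: distance 1.5 <= 1.8 + 1.6
```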
Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis
Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven
2011-01-01
Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320
An, Gary; Bartels, John; Vodovotz, Yoram
2011-01-01
The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and high-content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will ultimately be used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism. PMID:21552346
Computer-aided design of microvasculature systems for use in vascular scaffold production.
Mondy, William Lafayette; Cameron, Don; Timmermans, Jean-Pierre; De Clerck, Nora; Sasov, Alexander; Casteleyn, Christophe; Piegl, Les A
2009-09-01
In vitro biomedical engineering of intact, functional vascular networks, which include capillary structures, is a prerequisite for adequate vascular scaffold production. Capillary structures are necessary since they provide the elements and compounds for the growth, function and maintenance of 3D tissue structures. Computer-aided modeling of stereolithographic (STL) micro-computed tomographic (micro-CT) 3D models is a technique that enables us to mimic the design of vascular tree systems containing capillary beds, found in tissues. In our first paper (Mondy et al 2009 Tissue Eng. at press), using micro-CT, we studied the possibility of using vascular tissues to produce data capable of aiding the design of vascular tree scaffolding, which would help in the reverse engineering of a complete vascular tree system including capillary bed structures. In this paper, we used STL models of large datasets of computer-aided design (CAD) data of vascular structures which contained capillary structures that mimic those in the dermal layers of rabbit skin. Using CAD software, we created a bio-CAD design from 3D STL models for the development of capillary-containing vascular tree scaffolding for skin. This method is designed to enhance a variety of therapeutic protocols including, but not limited to, organ and tissue repair, systemic disease mediation and cell/tissue transplantation therapy. Our successful approach to in vitro vasculogenesis will allow the bioengineering of various other types of 3D tissue structures, and as such greatly expands the potential applications of biomedical engineering technology into the fields of biomedical research and medicine.
Metabolic flexibility of mitochondrial respiratory chain disorders predicted by computer modelling.
Zieliński, Łukasz P; Smith, Anthony C; Smith, Alexander G; Robinson, Alan J
2016-11-01
Mitochondrial respiratory chain dysfunction causes a variety of life-threatening diseases affecting about 1 in 4300 adults. These diseases are genetically heterogeneous, but have the same outcome: reduced activity of mitochondrial respiratory chain complexes causing decreased ATP production and potentially toxic accumulation of metabolites. The severity and tissue specificity of these effects vary between patients by unknown mechanisms, and treatment options are limited. So far, most research has focused on the complexes themselves, and the impact on overall cellular metabolism is largely unclear. To illustrate how computer modelling can be used to better understand the potential impact of these disorders and inspire new research directions and treatments, we simulated them using a computer model of human cardiomyocyte mitochondrial metabolism containing over 300 characterised reactions and transport steps with experimental parameters taken from the literature. Overall, simulations were consistent with patient symptoms, supporting their biological and medical significance. These simulations predicted: complex I deficiencies could be compensated using multiple pathways; complex II deficiencies had less metabolic flexibility due to impacting both the TCA cycle and the respiratory chain; and complex III and IV deficiencies caused the greatest decreases in ATP production, with metabolic consequences that parallel hypoxia. Our study demonstrates how results from computer models can be compared to a clinical phenotype and used as a tool for hypothesis generation for subsequent experimental testing. These simulations can enhance understanding of dysfunctional mitochondrial metabolism and suggest new avenues for research into treatment of mitochondrial disease and other areas of mitochondrial dysfunction. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Susmikanti, Mike; Dewayatna, Winter; Sulistyo, Yos
2014-09-01
One of the research activities in support of the commercial radioisotope production program is safety research on FPM (Fission Product Molybdenum) target irradiation. FPM targets are tubes made of stainless steel containing nuclear-grade high-enrichment uranium. The FPM tube is irradiated to obtain fission products. Fission products such as Mo-99 are widely used in the form of kits in the medical world. Mo isotopes have relatively long half-lives, about 3 days (66 hours), so delivery of radioisotopes to consumer centers and storage is possible, though still limited; the production of this isotope therefore has potentially significant economic value. The neutronics problem is solved using first-order perturbation theory derived from the four-group diffusion equation. The criticality and flux in the multigroup diffusion model were calculated for various irradiation positions and uranium contents. This model involves complex computation with a large and sparse matrix system, and several parallel algorithms have been developed for solving such systems. In this paper, a successive over-relaxation (SOR) algorithm was implemented for the calculation of reactivity coefficients, which can be done in parallel; previous work performed the reactivity calculations serially with Gauss-Seidel iterations. The parallel method can be used to solve the multigroup diffusion equation system and calculate the criticality and reactivity coefficients. In this research a computer code was developed that exploits parallel processing to perform the reactivity calculations used in safety analysis; parallel processing on a multicore computer system allows the calculation to be performed more quickly. This code was applied to the safety-limit calculations for irradiated FPM targets containing highly enriched uranium. The results of the neutron calculations show that for uranium contents of 1.7676 g and 6.1866 g (×10^6 cm^-1) in a tube, the delta reactivities are still within safety limits; however, for 7.9542 g and 8.838 g (×10^6 cm^-1) the limits were exceeded.
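To make the SOR step concrete, below is a minimal serial sketch of the iteration on a small dense stand-in for the sparse multigroup diffusion matrix; the paper's code parallelizes this step, and the matrix, relaxation factor, and tolerance here are illustrative assumptions only.

```python
import numpy as np

def sor(A, b, omega=1.5, tol=1e-8, max_iter=10_000):
    """Successive over-relaxation for Ax = b (dense here for clarity;
    the same sweep applies to sparse storage). omega in (1, 2) typically
    accelerates plain Gauss-Seidel, which is the omega = 1 special case."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # use freshly updated x[:i] and old values x_old[i+1:]
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

# small diagonally dominant system standing in for a diffusion matrix
A = np.array([[4., -1., 0.], [-1., 4., -1.], [0., -1., 4.]])
b = np.array([1., 2., 3.])
print(sor(A, b))   # matches np.linalg.solve(A, b)
```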
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manteuffel, T.A.
The objective of this project is the development of numerical solution techniques for deterministic models of the transport of neutral and charged particles and the demonstration of their effectiveness in both a production environment and on advanced-architecture computers. The primary focus is on various versions of the linear Boltzmann equation. These equations are fundamental in many important applications. This project is an attempt to integrate the development of numerical algorithms with the process of developing production software. A major thrust of this project will be the implementation of these algorithms on advanced-architecture machines that reside at the Advanced Computing Laboratory (ACL) at Los Alamos National Laboratory (LANL).
Monte Carlo Simulations and Generation of the SPI Response
NASA Technical Reports Server (NTRS)
Sturner, S. J.; Shrader, C. R.; Weidenspointner, G.; Teegarden, B. J.; Attie, D.; Diehl, R.; Ferguson, C.; Jean, P.; vonKienlin, A.
2003-01-01
In this paper we discuss the methods developed for the production of the INTEGRAL/SPI instrument response. The response files were produced using a suite of Monte Carlo simulation software developed at NASA/GSFC based on the GEANT-3 package available from CERN. The production of the INTEGRAL/SPI instrument response also required the development of a detailed computer mass model for SPI. We discuss our extensive investigations into methods to reduce both the computation time and storage requirements for the SPI response. We also discuss corrections to the simulated response based on our comparison of ground and in-flight calibration data with MGEANT simulations.
ECUT (Energy Conversion and Utilization Technologies) program: Biocatalysis Project
NASA Technical Reports Server (NTRS)
1988-01-01
Fiscal year 1987 research activities and accomplishments for the Biocatalysis Project of the U.S. Department of Energy, Energy Conversion and Utilization Technologies (ECUT) Division are presented. The project's technical activities were organized into three work elements. The Molecular Modeling and Applied Genetics work element includes modeling and simulation studies to verify a dynamic model of the enzyme carboxypeptidase; plasmid stabilization by chromosomal integration; growth and stability characteristics of plasmid-containing cells; and determination of optimal production parameters for hyper-production of polyphenol oxidase. The Bioprocess Engineering work element supports efforts in novel bioreactor concepts that are likely to lead to substantially higher levels of reactor productivity, product yields, and lower separation energetics. The Bioprocess Design and Assessment work element attempts to develop procedures (via user-friendly computer software) for assessing the economics and energetics of a given biocatalyst process.
Numerous urban canopy schemes have recently been developed for mesoscale models in order to approximate the drag and turbulent production effects of a city on the air flow. However, little data exists by which to evaluate the efficacy of the schemes since "area-averaged"…
A New Model of Sensorimotor Coupling in the Development of Speech
ERIC Educational Resources Information Center
Westermann, Gert; Miranda, Eduardo Reck
2004-01-01
We present a computational model that learns a coupling between motor parameters and their sensory consequences in vocal production during a babbling phase. Based on the coupling, preferred motor parameters and prototypically perceived sounds develop concurrently. Exposure to an ambient language modifies perception to coincide with the sounds from…
A Theory for the Neural Basis of Language Part 2: Simulation Studies of the Model
ERIC Educational Resources Information Center
Baron, R. J.
1974-01-01
Computer simulation studies of the proposed model are presented. Processes demonstrated are (1) verbally directed recall of visual experience; (2) understanding of verbal information; (3) aspects of learning and forgetting; (4) the dependence of recognition and understanding in context; and (5) elementary concepts of sentence production. (Author)
USDA-ARS?s Scientific Manuscript database
Accurate estimates of terrestrial carbon sequestration are essential for evaluating changes in the carbon cycle due to global climate change. In a recent assessment of 26 carbon assimilation models at 39 FLUXNET tower sites across the United States and Canada, all models failed to adequately compute...
NASA Technical Reports Server (NTRS)
Mulligan, Jeffrey B.
2017-01-01
A color algebra refers to a system for computing sums and products of colors, analogous to additive and subtractive color mixtures. The difficulty addressed here is the fact that, because of metamerism, we cannot know with certainty the spectrum that produced a particular color solely on the basis of sensory data. Knowledge of the spectrum is not required to compute additive mixture of colors, but is critical for subtractive (multiplicative) mixture. Therefore, we cannot predict with certainty the multiplicative interactions between colors based solely on sensory data. There are two potential applications of a color algebra: first, to aid modeling phenomena of human visual perception, such as color constancy and transparency; and, second, to provide better models of the interactions of lights and surfaces for computer graphics rendering.
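As a toy illustration of the asymmetry this abstract describes, the sketch below shows that additive mixture is computable from sensor (cone) responses alone because sensing is linear, while multiplicative (subtractive) mixture is not: two metamers, constructed here from the null space of a made-up sensitivity matrix, give identical cone responses yet differ behind the same filter. All spectra and sensitivities are random placeholders, not calibrated colorimetric data.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
S = rng.random((3, 31))              # hypothetical cone sensitivities (3 x 31)

def to_sensor(spectrum):
    """Map a 31-band spectrum to three cone responses (a linear operation)."""
    return S @ spectrum

light = rng.random(31)               # an arbitrary light spectrum
other = rng.random(31)
filt = rng.random(31)                # spectral transmittance of a filter

# Additive mixture is linear, so it is computable from sensor data alone.
assert np.allclose(to_sensor(light + other),
                   to_sensor(light) + to_sensor(other))

# A metamer: same cone responses, different spectrum (add a null-space vector).
metamer = light + 0.1 * null_space(S)[:, 0]
assert np.allclose(to_sensor(light), to_sensor(metamer))

# Multiplicative (subtractive) mixture: the two metamers diverge behind the
# same filter, so the product cannot be predicted from sensor data alone.
print(to_sensor(light * filt) - to_sensor(metamer * filt))   # nonzero
```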
Computational methods for a three-dimensional model of the petroleum-discovery process
Schuenemeyer, J.H.; Bawiec, W.J.; Drew, L.J.
1980-01-01
A discovery-process model devised by Drew, Schuenemeyer, and Root can be used to predict the amount of petroleum to be discovered in a basin from some future level of exploratory effort: the predictions are based on historical drilling and discovery data. Because marginal costs of discovery and production are a function of field size, the model can be used to make estimates of future discoveries within deposit size classes. The modeling approach is a geometric one in which the area searched is a function of the size and shape of the targets being sought. A high correlation is assumed between the surface-projection area of the fields and the volume of petroleum. To predict how much oil remains to be found, the area searched must be computed, and the basin size and discovery efficiency must be estimated. The basin is assumed to be explored randomly rather than by pattern drilling. The model may be used to compute independent estimates of future oil at different depth intervals for a play involving multiple producing horizons. We have written FORTRAN computer programs that are used with Drew, Schuenemeyer, and Root's model to merge the discovery and drilling information and perform the necessary computations to estimate undiscovered petroleum. These programs may be modified easily for the estimation of remaining quantities of commodities other than petroleum.
A nonlinear bi-level programming approach for product portfolio management.
Ma, Shuang
2016-01-01
Product portfolio management (PPM) is a critical decision-making activity for companies across various industries in today's competitive environment. Traditional studies on the PPM problem have been motivated by engineering feasibility and marketing, and pay relatively little attention to competitors' actions and competitive relations, especially in the mathematical optimization domain. The key challenge lies in how to construct a mathematical optimization model that describes this Stackelberg game-based leader-follower PPM problem and the competitive relations between the players. The primary contribution of this paper is a decision framework and an optimization model that leverage the PPM problems of the leader and the follower. A nonlinear, integer bi-level programming model is developed based on the decision framework, and a bi-level nested genetic algorithm is put forward to solve this model for the leader-follower PPM problem. A case study of notebook computer product portfolio optimization is reported. Results and analyses reveal that the leader-follower bi-level optimization model is robust and can empower product portfolio optimization.
Optimisation of assembly scheduling in VCIM systems using genetic algorithm
NASA Astrophysics Data System (ADS)
Dao, Son Duy; Abhary, Kazem; Marian, Romeo
2017-09-01
Assembly plays an important role in any production system as it constitutes a significant portion of the lead time and cost of a product. The virtual computer-integrated manufacturing (VCIM) system is a modern production system being conceptually developed to extend the application of the traditional computer-integrated manufacturing (CIM) system to the global level. Assembly scheduling in VCIM systems is quite different from that in traditional production systems because of the difference in the working principles of the two systems. In this article, the assembly scheduling problem in VCIM systems is modeled and an integrated approach based on a genetic algorithm (GA) is proposed to search for a globally optimised solution to the problem. Because of the dynamic nature of the scheduling problem, a novel GA with a unique chromosome representation and modified genetic operations is developed herein. The robustness of the proposed approach is verified by a numerical example.
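The paper's own chromosome encoding and modified operators are not reproduced here; as a generic stand-in, the sketch below evolves a permutation chromosome with order crossover and swap mutation against a toy makespan that adds a fixed transport delay whenever consecutive tasks sit at different partner sites, loosely echoing the distributed VCIM setting. All job data and GA settings are hypothetical.

```python
import random

def makespan(order, jobs):
    """Toy fitness: completion time of an assembly sequence, with a fixed
    transport delay between tasks assigned to different sites."""
    t, site = 0.0, None
    for task in order:
        duration, task_site = jobs[task]
        if site is not None and task_site != site:
            t += 2.0                              # hypothetical transport delay
        t += duration
        site = task_site
    return t

def order_crossover(p1, p2):
    """OX crossover: copy a slice from one parent, fill the rest in the
    other parent's order, preserving a valid permutation."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child]
    return [fill.pop(0) if g is None else g for g in child]

def ga(jobs, pop_size=40, gens=200, pmut=0.2):
    tasks = list(jobs)
    pop = [random.sample(tasks, len(tasks)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda o: makespan(o, jobs))
        elite = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            c = order_crossover(*random.sample(elite, 2))
            if random.random() < pmut:            # swap mutation
                i, j = random.sample(range(len(c)), 2)
                c[i], c[j] = c[j], c[i]
            children.append(c)
        pop = elite + children
    return min(pop, key=lambda o: makespan(o, jobs))

jobs = {"A": (3.0, 1), "B": (2.0, 2), "C": (4.0, 1), "D": (1.0, 3)}
print(ga(jobs))   # a low-makespan task sequence
```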
Four PPPPerspectives on computational creativity in theory and in practice
NASA Astrophysics Data System (ADS)
Jordanous, Anna
2016-04-01
Computational creativity is the modelling, simulating or replicating of creativity computationally. In examining and learning from these "creative systems", from what perspective should the creativity of a system be considered? Are we interested in the creativity of the system's output? Or of its creative processes? Features of the system? Or how it operates within its environment? Traditionally computational creativity has focused more on creative systems' products or processes, though this focus has widened recently. Creativity research offers the Four Ps of creativity: Person/Producer, Product, Process and Press/Environment. This paper presents the Four Ps, explaining each in the context of creativity research and how it relates to computational creativity. To illustrate the usefulness of the Four Ps in taking broader perspectives on creativity in its computational treatment, the concepts of novelty and value are explored using the Four Ps, highlighting aspects of novelty and value that may otherwise be overlooked. Analysis of recent research in computational creativity finds that although each of the Four Ps appears in the body of computational creativity work, individual pieces of work often do not acknowledge all Four Ps, missing opportunities to widen their work's relevance. We can see, though, that high-status computational creativity papers do typically address all Four Ps. This paper argues that the broader views of creativity afforded by the Four Ps are vital in guiding us towards more comprehensively useful computational investigations of creativity.
NASA Astrophysics Data System (ADS)
King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.
2015-12-01
The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.
Correlation between solar flare productivity and photospheric vector magnetic fields
NASA Astrophysics Data System (ADS)
Cui, Yanmei; Wang, Huaning
2008-11-01
Studying the statistical correlation between solar flare productivity and photospheric magnetic fields is important and necessary: it helps to set up a practical flare forecast model based on magnetic properties and improves the physical understanding of solar flare eruptions. In a previous study ([Cui, Y.M., Li, R., Zhang, L.Y., He, Y.L., Wang, H.N. Correlation between solar flare productivity and photospheric magnetic field properties 1. Maximum horizontal gradient, length of neutral line, number of singular points. Sol. Phys. 237, 45-59, 2006]; from now on referred to as 'Paper I'), three measures, the maximum horizontal gradient, the length of the neutral line, and the number of singular points, were computed from 23990 SOHO/MDI longitudinal magnetograms. The statistical relationship between solar flare productivity and these three measures is well fitted with sigmoid functions. In the current work, three further measures, the length of the strong-shear neutral line, the total unsigned current, and the total unsigned current helicity, are computed from 1353 vector magnetograms observed at Huairou Solar Observing Station. The relationship between solar flare productivity and these three measures can also be well fitted with sigmoid functions. These results are expected to be beneficial to future operational flare forecasting models.
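For concreteness, a sigmoid fit of the kind described can be reproduced in a few lines with scipy. The measure values and productivity fractions below are fabricated placeholders, not the Huairou or MDI data, and the two-parameter logistic form is one plausible choice of sigmoid.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, a, b):
    """Logistic curve: flare productivity vs. a normalized magnetic measure."""
    return 1.0 / (1.0 + np.exp(-(x - a) / b))

# hypothetical normalized measure (e.g., length of strong-shear neutral line)
x = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
productivity = np.array([0.02, 0.05, 0.12, 0.30, 0.50, 0.70, 0.88, 0.95, 0.98])

(a, b), _ = curve_fit(sigmoid, x, productivity, p0=[0.5, 0.1])
print(f"midpoint a = {a:.2f}, slope scale b = {b:.2f}")
```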
Chen, Xiaodong; Sadineni, Vikram; Maity, Mita; Quan, Yong; Enterline, Matthew; Mantri, Rao V
2015-12-01
Lyophilization is an approach commonly undertaken to formulate drugs that are too unstable to be commercialized as ready-to-use (RTU) solutions. One of the important aspects of commercializing a lyophilized product is transferring the process parameters developed on a lab-scale lyophilizer to commercial scale without a loss in product quality. This is often accomplished by costly engineering runs or through an iterative process at the commercial scale. Here, we highlight a combined computational and experimental approach to predict commercial process parameters for the primary drying phase of lyophilization. Heat and mass transfer coefficients are determined experimentally, either by manometric temperature measurement (MTM) or by sublimation tests, and used as inputs for the finite element model (FEM)-based software PASSAGE, which computes various primary drying parameters such as primary drying time and product temperature. The heat and mass transfer coefficients vary across lyophilization scales; hence, we present an approach that applies appropriate factors when scaling up from lab scale to commercial scale. As a result, one can predict commercial-scale primary drying time based on these parameters. Additionally, the model-based approach presented in this study provides a process to monitor pharmaceutical product robustness and accidental process deviations during lyophilization to support commercial supply chain continuity. The approach presented here provides a robust lyophilization scale-up strategy; because of its simple and minimalistic nature, it is also a less capital-intensive path with minimal use of expensive drug substance/active material.
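PASSAGE itself is a FEM code, but the quasi-steady heat/mass balance underlying primary drying can be sketched in a few lines: heat delivered through the vial (coefficient Kv) must match the heat consumed by sublimation through the dried-cake resistance (Rp). All parameter values below, including the rough vapor-pressure fit for ice, are textbook-style assumptions for illustration, not the paper's measured MTM values.

```python
import numpy as np
from scipy.optimize import brentq

def p_ice(T):
    """Rough Clausius-Clapeyron fit for vapor pressure over ice (Pa), T in K."""
    return 3.62e12 * np.exp(-6145.0 / T)

# hypothetical vial and cycle parameters (the kind MTM/sublimation tests yield)
Kv = 20.0         # vial heat transfer coefficient, W/(m^2 K)
Rp = 1.0e5        # dried-cake resistance, Pa s m^2 / kg
Av = 3.8e-4       # vial outer area, m^2
Ap = 3.3e-4       # product (sublimation) area, m^2
dHs = 2.8e6       # heat of sublimation of ice, J/kg
T_shelf = 263.15  # shelf temperature, K
P_ch = 10.0       # chamber pressure, Pa

def balance(Tp):
    """Heat delivered through the vial minus heat consumed by sublimation."""
    m_dot = Ap * (p_ice(Tp) - P_ch) / Rp          # sublimation rate, kg/s
    return Kv * Av * (T_shelf - Tp) - dHs * m_dot

Tp = brentq(balance, 200.0, T_shelf)              # steady-state product temp.
m_dot = Ap * (p_ice(Tp) - P_ch) / Rp
print(f"T_product = {Tp - 273.15:.1f} C, sublimation = {m_dot * 3.6e6:.2f} g/h")
```

Dividing the ice mass in a vial by the steady sublimation rate gives a first estimate of primary drying time, which is the quantity being scaled between lab and commercial dryers.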
FACTOR - FACTOR II. Departmental Program and Model Documentation 71-3.
ERIC Educational Resources Information Center
Wilson, Stanley; Billingsley, Ray
This computer program is designed to optimize a Cobb-Douglas type of production function. The user of this program may choose isoquants and/or the expansion path for a Cobb-Douglas type of production function with up to nine resources. An expansion path is the combination of quantities of each resource that minimizes the cost at each production…
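For a Cobb-Douglas technology the expansion path has a closed form: the cost-minimizing input bundle keeps each input proportional to alpha_i / w_i, so a small sketch can show the kind of computation the FACTOR program performs. The technology parameters, prices, and output level below are invented for illustration.

```python
import numpy as np

def expansion_path(A, alphas, prices, Q):
    """Least-cost input bundle for a Cobb-Douglas technology
    Q = A * prod(x_i ** alpha_i) at input prices w_i.

    From the first-order conditions, x_i is proportional to alpha_i / w_i;
    the scalar t sets the scale needed to reach the output level Q.
    """
    alphas, prices = np.asarray(alphas), np.asarray(prices)
    ratios = alphas / prices
    t = (Q / (A * np.prod(ratios ** alphas))) ** (1.0 / alphas.sum())
    x = ratios * t
    return x, prices @ x          # the bundle and its minimum cost

# hypothetical 3-input technology (the program supports up to nine resources)
x, cost = expansion_path(A=1.0, alphas=[0.3, 0.5, 0.2],
                         prices=[2.0, 1.0, 4.0], Q=10.0)
print(x.round(2), round(cost, 2))
```

Sweeping Q over a grid of output levels traces out the expansion path, and fixing Q while varying one input traces an isoquant, the two objects the program description mentions.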
Incremental Lexical Learning in Speech Production: A Computational Model and Empirical Evaluation
ERIC Educational Resources Information Center
Oppenheim, Gary Michael
2011-01-01
Naming a picture of a dog primes the subsequent naming of a picture of a dog (repetition priming) and interferes with the subsequent naming of a picture of a cat (semantic interference). Behavioral studies suggest that these effects derive from persistent changes in the way that words are activated and selected for production, and some have…
ERIC Educational Resources Information Center
Khattab, Ali-Maher; And Others
1982-01-01
A causal modeling system, using confirmatory maximum likelihood factor analysis with the LISREL IV computer program, evaluated the construct validity underlying the higher order factor structure of a given correlation matrix of 46 structure-of-intellect tests emphasizing the product of transformations. (Author/PN)
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1979-01-01
Program predicts production volumes of petroleum refinery products, with particular emphasis on aircraft-turbine fuel blends and their key properties. It calculates capital and operating costs for refinery and its margin of profitability. Program also includes provisions for processing of synthetic crude oils from oil shale and coal liquefaction processes and contains highly-detailed blending computations for alternative jet-fuel blends of varying endpoint specifications.
Wan, Yue; Yang, Hongwei; Masui, Toshihiko
2005-01-01
At the present time, ambient air pollution is a serious public health problem in China. Based on concentration-response relationships provided by international and domestic epidemiologic studies, the authors estimated the mortality and morbidity induced by the ambient air pollution of 2000. To address the mechanism of the health impact on the national economy, the authors applied a computable general equilibrium (CGE) model, named AIM/Material China, containing 39 production sectors and 32 commodities. AIM/Material analyzes changes in gross domestic product (GDP), final demand, and production activity originating from health damages. If ambient air quality had met Grade II of China's air quality standard in 2000, the avoidable GDP loss would have been 0.38‰ of the national total, of which 95% was led by labor loss. Comparatively, medical expenditure had less impact on the national economy, which is explained in terms of final demand by commodity and production activity by sector. The authors conclude that the CGE model is a suitable tool for assessing health impacts from the point of view of the national economy, as shown through the discussion of its applicability.
Shekhawat, Lalita Kanwar; Sarkar, Jayati; Gupta, Rachit; Hadpe, Sandeep; Rathore, Anurag S
2018-02-10
Centrifugation continues to be one of the most commonly used unit operations for achieving efficient harvest of the product from mammalian cell culture broth during production of therapeutic monoclonal antibodies (mAbs). Since mammalian cells are known to be shear sensitive, optimal performance of the centrifuge requires a balance between productivity and shear. In this study, Computational Fluid Dynamics (CFD) has been successfully used as a tool to facilitate efficient optimization. A multiphase Eulerian-Eulerian model coupled with the Gidaspow drag model, along with an Eulerian-Eulerian k-ε mixture turbulence model, has been used to quantify the complex hydrodynamics of the centrifuge and thus evaluate the turbulent stresses generated by the centrifugal forces. An empirical model has been developed by statistical analysis of experimentally observed cell lysis data as a function of turbulent stresses. An operating window that offers the optimal balance between high productivity, high separation efficiency, and low cell damage has been identified by use of CFD modeling. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Combs, L. P.
1974-01-01
A computer program for analyzing rocket engine performance was developed. The program is concerned with the formation, distribution, flow, and combustion of liquid sprays and combustion product gases in conventional rocket combustion chambers. The capabilities of the program to determine the combustion characteristics of the rocket engine are described. Sample data code sheets show the correct sequence and formats for variable values and include notes concerning options to bypass the input of certain data. A separate list defines the variables and indicates their required dimensions.
AGIS: The ATLAS Grid Information System
NASA Astrophysics Data System (ADS)
Anisenkov, Alexey; Belov, Sergey; Di Girolamo, Alessandro; Gayazov, Stavro; Klimentov, Alexei; Oleynik, Danila; Senchenko, Alexander
2012-12-01
ATLAS is a particle physics experiment at the Large Hadron Collider at CERN. The experiment produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm and a high degree of decentralization, with computing resources able to meet the ATLAS requirements of petabyte-scale data operations. In this paper we present the ATLAS Grid Information System (AGIS), designed to integrate configuration and status information about the resources, services and topology of the whole ATLAS Grid needed by ATLAS Distributed Computing applications and services.
Statistical models of lunar rocks and regolith
NASA Technical Reports Server (NTRS)
Marcus, A. H.
1973-01-01
The mathematical, statistical, and computational approaches used in the investigation of the interrelationship of lunar fragmental material, regolith, lunar rocks, and lunar craters are described. The first two phases of the work explored the sensitivity of the production model of fragmental material to mathematical assumptions, and then completed earlier studies on the survival of lunar surface rocks with respect to competing processes. The third phase combined earlier work into a detailed statistical analysis and probabilistic model of regolith formation by lithologically distinct layers, interpreted as modified crater ejecta blankets. The fourth phase of the work dealt with problems encountered in combining the results of the entire project into a comprehensive, multipurpose computer simulation model for the craters and regolith. Highlights of each phase of research are given.
Creation of system of computer-aided design for technological objects
NASA Astrophysics Data System (ADS)
Zubkova, T. M.; Tokareva, M. A.; Sultanov, N. Z.
2018-05-01
Due to competition in the process equipment market, production should be flexible, re-tuning to various product configurations, raw materials and productivity levels depending on current market needs. This is not possible without CAD (computer-aided design). The formation of a CAD system begins with planning. Synthesizing, analyzing, evaluating and converting operations, as well as visualization and decision-making operations, can be automated. Based on a formal description of the design procedures, the design route is constructed in the form of an oriented graph. The decomposition of the design process, represented by the formalized description of the design procedures, makes it possible to make an informed choice of CAD components for the task at hand. The object-oriented approach allows the CAD system to be considered as an independent system whose properties are inherited from its components. The first step determines the range of tasks to be performed by the system and a set of components for their implementation; the second is the configuration of the selected components. The interaction between the selected components is carried out using the CALS standards. The chosen CAD/CAE-oriented approach allows creating a single model, which is stored in the database of the subject area. Each of the integration stages is implemented as a separate functional block. The transformation of the CAD model into the internal representation model is realized by a block that searches for the geometric parameters of the technological machine, in which an XML model of the construction is obtained on the basis of the feature method from the theory of image recognition. The configuration of integrated components is divided into three consecutive steps: configuring tasks, components, and interfaces. The configuration of the components is realized using the theory of "soft computing" with the Mamdani fuzzy inference algorithm.
NASA Technical Reports Server (NTRS)
Scott, Carl D.
2004-01-01
Chemical kinetic models for the nucleation and growth of clusters and single-walled carbon nanotube (SWNT) growth are developed for numerical simulations of the production of SWNTs. Two models that involve evaporation and condensation of carbon and metal catalysts are discussed: a full model involving all carbon clusters up to C80, and a reduced model. The full model is based on a fullerene model, but nickel and carbon/nickel cluster reactions are added to form SWNTs from soot and fullerenes. The full model has so many species that incorporating it into a flow-field computation for simulating laser ablation and arc processes requires simplification. The model is reduced by defining large clusters that represent many clusters of various sizes. Comparisons are given between these models for cases that may be applicable to arc and laser ablation production. Solutions to the system of chemical rate equations of these models for a ramped temperature profile show that production of various species, including SWNTs, agrees to within about 50% for a fast ramp, and within 10% for a slower temperature decay time.
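The abstract's rate equations with a ramped temperature profile can be mimicked with a deliberately tiny stand-in system: one atomic-carbon species, one small-cluster species, and one lumped large-cluster species, integrated with a stiff ODE solver over a decaying temperature. The reactions, rate constants, and temperature profile are all invented for illustration and are far simpler than the paper's C80-scale model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def k_evap(t, T0=3500.0, tau=100e-6, k0=1e4, Ea_over_R=5e4):
    """Arrhenius-style evaporation rate under a decaying temperature ramp."""
    T = T0 * np.exp(-t / tau)       # hypothetical plume cooling profile, K
    return k0 * np.exp(-Ea_over_R / T)

def rates(t, y, k_nuc, k_grow, k_ev):
    """Toy reduced cluster model: C1 (atomic carbon), Cs (small clusters),
    CL (one lumped species standing in for many large-cluster sizes)."""
    c1, cs, cl = y
    nuc = k_nuc * c1 ** 2           # 2 C1 -> Cs
    grow = k_grow * c1 * cs         # C1 + Cs -> CL
    evap = k_ev(t) * cl             # CL -> C1 + Cs, faster while hot
    return [-2 * nuc - grow + evap,
            nuc - grow + evap,
            grow - evap]

sol = solve_ivp(rates, (0.0, 1e-3), [1.0, 0.0, 0.0],
                args=(1e3, 1e4, k_evap), method="LSODA")
print(sol.y[:, -1])                 # final C1, Cs, CL fractions
```

Faster or slower values of tau play the role of the fast ramp versus the slower decay time compared in the abstract.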
GEOS Atmospheric Model: Challenges at Exascale
NASA Technical Reports Server (NTRS)
Putman, William M.; Suarez, Max J.
2017-01-01
The Goddard Earth Observing System (GEOS) model at NASA's Global Modeling and Assimilation Office (GMAO) is used to simulate the multi-scale variability of the Earth's weather and climate, and is used primarily to assimilate conventional and satellite-based observations for weather forecasting and reanalysis. In addition, assimilations coupled to an ocean model are used for longer-term forecasting (e.g., El Nino) on seasonal to interannual time-scales. The GMAO's research activities, including system development, focus on numerous time and space scales, as detailed on the GMAO website, where they are tabbed under five major themes: Weather Analysis and Prediction; Seasonal-Decadal Analysis and Prediction; Reanalysis; Global Mesoscale Modeling; and Observing System Science. A brief description of the GEOS systems can also be found at the GMAO website. GEOS executes as a collection of earth system components connected through the Earth System Modeling Framework (ESMF). The ESMF layer is supplemented with the MAPL (Modeling, Analysis, and Prediction Layer) software toolkit developed at the GMAO, which facilitates the organization of the computational components into a hierarchical architecture. GEOS systems run in parallel using a horizontal decomposition of the Earth's sphere into processing elements (PEs). Communication between PEs is primarily through a message passing framework, using the Message Passing Interface (MPI), and through explicit use of node-level shared memory access via the SHMEM (Symmetric Hierarchical Memory access) protocol. Production GEOS weather prediction systems currently run at 12.5-kilometer horizontal resolution with 72 vertical levels, decomposed into PEs associated with 5,400 MPI processes. Research GEOS systems run at resolutions as fine as 1.5 kilometers globally, using as many as 30,000 MPI processes. Looking forward, these systems can be expected to see a twofold increase in horizontal resolution every two to three years, as well as less frequent increases in vertical resolution. Coupling these resolution changes with increases in complexity, the computational demands on the GEOS production and research systems should easily increase 100-fold over the next five years. Currently, our 12.5-kilometer weather prediction system narrowly meets the time-to-solution demands of a near-real-time production system. Work is now in progress to take advantage of a hybrid MPI-OpenMP parallelism strategy, in an attempt to achieve a modest twofold speed-up to accommodate an immediate demand due to increased scientific complexity and an increase in vertical resolution. Pursuing demands for 10- to 100-fold increases or more, however, would require a detailed exploration of the computational profile of GEOS, as well as targeted solutions using more advanced high-performance computing technologies. Computing demands 100-fold higher will arise within five years based on anticipated changes in the GEOS production systems, and increases of 1000-fold can be anticipated over the next ten years.
A hydrological emulator for global applications – HE v1.0.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yaling; Hejazi, Mohamad; Li, Hongyi
While global hydrological models (GHMs) are very useful in exploring water resources and interactions between the Earth and human systems, their use often requires numerous model inputs, complex model calibration, and high computation costs. To overcome these challenges, we construct an efficient open-source and ready-to-use hydrological emulator (HE) that can mimic complex GHMs at a range of spatial scales (e.g., basin, region, globe). More specifically, we construct both a lumped and a distributed scheme of the HE based on the monthly abcd model to explore the tradeoff between computational cost and model fidelity. Model predictability and computational efficiency are evaluated in simulating global runoff from 1971 to 2010 with both the lumped and distributed schemes. The results are compared against the runoff product from the widely used Variable Infiltration Capacity (VIC) model. Our evaluation indicates that the lumped and distributed schemes present comparable results regarding annual total quantity, spatial pattern, and temporal variation of the major water fluxes (e.g., total runoff, evapotranspiration) across the global 235 basins (e.g., the correlation coefficient r between the annual total runoff from either of these two schemes and the VIC is > 0.96), except for several cold (e.g., Arctic, interior Tibet), dry (e.g., North Africa) and mountainous (e.g., Argentina) regions. Compared against the monthly total runoff product from the VIC (aggregated from daily runoff), the global mean Kling-Gupta efficiencies are 0.75 and 0.79 for the lumped and distributed schemes, respectively, with the distributed scheme better capturing spatial heterogeneity. Notably, the computation efficiency of the lumped scheme is 2 orders of magnitude higher than the distributed one and 7 orders more efficient than the VIC model. A case study of uncertainty analysis for the 16 world basins with the highest annual streamflow is conducted using 100 000 model simulations, and it demonstrates the lumped scheme's extraordinary advantage in computational efficiency. Lastly, our results suggest that the revised lumped abcd model can serve as an efficient and reasonable HE for complex GHMs and is suitable for broad practical use, and the distributed scheme is also an efficient alternative if spatial heterogeneity is of more interest.
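Since the HE's lumped scheme is the monthly abcd model, a compact sketch of that water balance is easy to give. The parameter values, initial storages, and forcing series below are illustrative guesses, not calibrated values from the paper.

```python
import numpy as np

def abcd(P, PET, a=0.98, b=250.0, c=0.5, d=0.1, S0=100.0, G0=50.0):
    """Lumped monthly abcd water-balance model (Thomas, 1981).
    P, PET in mm/month; returns total runoff in mm/month.

    a: runoff propensity (0-1), b: soil moisture capacity scale (mm),
    c: groundwater recharge fraction, d: groundwater discharge rate.
    """
    S, G = S0, G0
    Q = np.empty(len(P))
    for t in range(len(P)):
        W = P[t] + S                                  # available water
        WB = (W + b) / (2.0 * a)
        Y = WB - np.sqrt(WB ** 2 - W * b / a)         # ET "opportunity"
        S = Y * np.exp(-PET[t] / b)                   # soil moisture carryover
        G = (G + c * (W - Y)) / (1.0 + d)             # groundwater storage
        Q[t] = (1.0 - c) * (W - Y) + d * G            # direct runoff + baseflow
    return Q

# hypothetical monthly forcing for a single basin (mm)
P = np.array([80., 95., 110., 70., 40., 20., 10., 15., 30., 60., 85., 100.])
PET = np.array([20., 25., 40., 60., 90., 120., 140., 130., 90., 50., 30., 20.])
print(abcd(P, PET).round(1))
```

The distributed scheme described in the abstract amounts to running this same balance per grid cell or sub-basin and aggregating, which is where its higher cost and better spatial heterogeneity both come from.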
NASA Technical Reports Server (NTRS)
Kim, S.-W.; Chen, C.-P.
1987-01-01
A multiple-time-scale turbulence model of a single point closure and a simplified split-spectrum method is presented. In the model, the effect of the ratio of the production rate to the dissipation rate on eddy viscosity is modeled by use of the multiple-time-scales and a variable partitioning of the turbulent kinetic energy spectrum. The concept of a variable partitioning of the turbulent kinetic energy spectrum and the rest of the model details are based on the previously reported algebraic stress turbulence model. Example problems considered include: a fully developed channel flow, a plane jet exhausting into a moving stream, a wall jet flow, and a weakly coupled wake-boundary layer interaction flow. The computational results compared favorably with those obtained by using the algebraic stress turbulence model as well as experimental data. The present turbulence model, as well as the algebraic stress turbulence model, yielded significantly improved computational results for the complex turbulent boundary layer flows, such as the wall jet flow and the wake boundary layer interaction flow, compared with available computational results obtained by using the standard kappa-epsilon turbulence model.
The Search for Efficiency in Arboreal Ray Tracing Applications
NASA Astrophysics Data System (ADS)
van Leeuwen, M.; Disney, M.; Chen, J. M.; Gomez-Dans, J.; Kelbe, D.; van Aardt, J. A.; Lewis, P.
2016-12-01
Forest structure significantly impacts a range of abiotic conditions, including humidity and the radiation regime, all of which affect the rate of net and gross primary productivity. Current forest productivity models typically consider abstract media to represent the transfer of radiation within the canopy. Examples include the representation of forest structure via a layered canopy model, where leaf area and inclination angles are stratified with canopy depth, or as turbid media where leaves are randomly distributed within space or within confined geometric solids such as blocks, spheres or cones. While these abstract models are known to produce accurate estimates of primary productivity at the stand level, their limited geometric resolution restricts applicability at fine spatial scales, such as the cell, leaf or shoot levels, and so does not realize the full potential of assimilating data from laboratory and field measurements with remote sensing technology. Recent research efforts have explored the use of laser scanning to capture detailed tree morphology at millimeter accuracy. These data can subsequently be used to combine ray tracing with primary productivity models, providing an ability to explore trade-offs among different morphological traits or assimilate data across spatial scales, spanning the leaf to the stand level. Ray tracing has the major advantage of allowing the most accurate structural description of the canopy, and can directly exploit new 3D structural measurements, e.g., from laser scanning. However, the biggest limitation of ray tracing models is their high computational cost, which currently limits their use for large-scale applications. In this talk, we explore ways to exploit ray tracing simulations more efficiently and capture this information in a readily computable form for future evaluation, thus potentially enabling large-scale first-principles forest growth modelling applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, M.V.
1989-01-01
A numerical model was developed to simulate the operation of an integrated system for the production of methane and single-cell algal protein from a variety of biomass energy crops or waste streams. Economic analysis was performed at the end of each simulation. The model was capable of assisting in the determination of design parameters by providing relative economic information for various strategies. Three configurations of anaerobic reactors were simulated: fixed-bed reactors, conventional stirred tank reactors, and continuously expanding reactors. A generic anaerobic digestion process model, using lumped substrate parameters, was developed for use by type-specific reactor models. The generic anaerobic digestion model provided a tool for testing conversion efficiencies and kinetic parameters for a wide range of substrate types and reactor designs. Dynamic growth models were used to model the growth of algae and of Eichhornia crassipes as a function of daily incident radiation and temperature. The growth of Eichhornia crassipes was modeled for the production of biomass as a substrate for digestion. Computer simulations with the system model indicated that tropical or subtropical locations offered the most promise for a viable system. The availability of large quantities of digestible waste and low land prices were found to be desirable in order to take advantage of economies of scale. Other simulations indicated that poultry and swine manure produced larger biogas yields than cattle manure. The model was created in a modular fashion to allow for testing of a wide variety of unit operations. Coding was performed in the Pascal language for use on personal computers.
NASA Astrophysics Data System (ADS)
Weres, Jerzy; Kujawa, Sebastian; Olek, Wiesław; Czajkowski, Łukasz
2016-04-01
Knowledge of physical properties of biomaterials is important in understanding and designing agri-food and wood processing industries. In the study presented in this paper computational methods were developed and combined with experiments to enhance identification of agri-food and forest product properties, and to predict heat and water transport in such products. They were based on the finite element model of heat and water transport and supplemented with experimental data. Algorithms were proposed for image processing, geometry meshing, and inverse/direct finite element modelling. The resulting software system was composed of integrated subsystems for 3D geometry data acquisition and mesh generation, for 3D geometry modelling and visualization, and for inverse/direct problem computations for the heat and water transport processes. Auxiliary packages were developed to assess performance, accuracy and unification of data access. The software was validated by identifying selected properties and using the estimated values to predict the examined processes, and then comparing predictions to experimental data. The geometry, thermal conductivity, specific heat, coefficient of water diffusion, equilibrium water content and convective heat and water transfer coefficients in the boundary layer were analysed. The estimated values, used as an input for simulation of the examined processes, enabled reduction in the uncertainty associated with predictions.
New Vistas in Chemical Product and Process Design.
Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul
2016-06-07
Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.
Incorporating the gas analyzer response time in gas exchange computations.
Mitchell, R R
1979-11-01
A simple method for including the gas analyzer response time in the breath-by-breath computation of gas exchange rates is described. The method uses a difference equation form of a model for the gas analyzer in the computation of oxygen uptake and carbon dioxide production and avoids a numerical differentiation required to correct the gas fraction wave forms. The effect of not accounting for analyzer response time is shown to be a 20% underestimation in gas exchange rate. The present method accurately measures gas exchange rate, is relatively insensitive to measurement errors in the analyzer time constant, and does not significantly increase the computation time.
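A minimal sketch of the idea, assuming a first-order analyzer model written in difference-equation form; the paper's exact formulation and constants are not reproduced here, and the function and signal below are hypothetical:

    # Sketch: first-order gas analyzer as a difference equation (assumed model form).
    # y[n+1] = y[n] + (dt / tau) * (x[n] - y[n]); output y lags the true fraction x.

    def analyzer_response(x, dt, tau):
        """Forward model of a first-order analyzer applied to a gas-fraction waveform."""
        y = [x[0]]
        a = dt / tau
        for xn in x[1:]:
            y.append(y[-1] + a * (xn - y[-1]))
        return y

    # Example: square-wave breath signal, 100 ms time constant, 10 ms sampling.
    dt, tau = 0.01, 0.10
    true_x = [0.05 if (i // 200) % 2 == 0 else 0.03 for i in range(1000)]
    measured = analyzer_response(true_x, dt, tau)
    print(true_x[205], round(measured[205], 4))   # measured value lags the step change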
Advances in multi-scale modeling of solidification and casting processes
NASA Astrophysics Data System (ADS)
Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang
2011-04-01
The development of the aviation, energy and automobile industries requires advanced integrated product/process R&D systems that can optimize both product and process design. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, simulated using phase-field and cellular automaton methods; mathematical models of segregation in large steel ingots; and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies are discussed, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry.
Method of locating underground mine fires
Laage, Linneas; Pomroy, William
1992-01-01
An improved method of locating an underground mine fire by comparing the pattern of measured combustion product arrival times at detector locations with a real-time computer-generated array of simulated patterns. A number of electronic fire detection devices are linked through telemetry to a control station on the surface. The mine's ventilation is modeled on a digital computer using network analysis software. The time required to locate a fire consists of the time required to model the mine's ventilation, generate the arrival time array, scan the array, and match measured arrival time patterns to the simulated patterns.
A roadmap to computational social neuroscience.
Tognoli, Emmanuelle; Dumas, Guillaume; Kelso, J A Scott
2018-02-01
To complement experimental efforts toward understanding human social interactions at both neural and behavioral levels, two computational approaches are presented: (1) a fully parameterizable mathematical model of a social partner, the Human Dynamic Clamp which, by virtue of experimentally controlled interactions between Virtual Partners and real people, allows for emergent behaviors to be studied; and (2) a multiscale neurocomputational model of social coordination that enables exploration of social self-organization at all levels-from neuronal patterns to people interacting with each other. These complementary frameworks and the cross product of their analysis aim at understanding the fundamental principles governing social behavior.
76 FR 44888 - Privacy Act of 1974, System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-27
...'') cloud computing model. The suite is composed of Gmail for e-mail, Google Docs for office productivity... 22202. The request must include the requestor's full name, his/her current address and a return address...
Econ's optimal decision model of wheat production and distribution-documentation
NASA Technical Reports Server (NTRS)
1977-01-01
The report documents the computer programs written to implement the ECON optimal decision model. The programs were written in APL, an extremely compact and powerful language particularly well suited to this model, which makes extensive use of matrix manipulations. The algorithms used are presented, and listings of and descriptive information on the APL programs are given. Possible changes in input data are also given.
Spherical harmonic modelling to ultra-high degree of Bouguer and isostatic anomalies
NASA Astrophysics Data System (ADS)
Balmino, G.; Vales, N.; Bonvalot, S.; Briais, A.
2012-07-01
The availability of high-resolution global digital elevation data sets has raised a growing interest in the feasibility of obtaining their spherical harmonic representation at matching resolution, and from there in the modelling of induced gravity perturbations. We have therefore estimated spherical Bouguer and Airy isostatic anomalies whose spherical harmonic models are derived from the Earth's topography harmonic expansion. These spherical anomalies differ from the classical planar ones and may be used in the context of new applications. We succeeded in meeting a number of challenges to build spherical harmonic models with no theoretical limitation on the resolution. A specific algorithm was developed to enable the computation of associated Legendre functions to any degree and order. It was successfully tested up to degree 32,400. All analyses and syntheses were performed in 64-bit arithmetic, with semi-empirical control of the significant terms to prevent underflow and overflow within IEEE limits, while preserving the speed of a specific regular-grid processing scheme. Finally, the continuation from the reference ellipsoid's surface to the Earth's surface was performed by high-order Taylor expansion, with all grids of required partial derivatives being computed in parallel. The main application was the production of a 1' × 1' equiangular global Bouguer anomaly grid, computed by spherical harmonic analysis of the Earth's topography-bathymetry ETOPO1 data set up to degree and order 10,800, taking into account the precise boundaries and densities of major lakes and inner seas, with their own altitude, polar caps with bedrock information, and land areas below sea level. The harmonic coefficients for each entity were derived by analyzing the corresponding ETOPO1 part, and free surface data when required, at one arc minute resolution. The following approximations were made: the land, ocean and ice cap gravity spherical harmonic coefficients were computed up to the third degree of the altitude, and the harmonics of the other, smaller parts up to the second degree. Their sum constitutes what we call ETOPG1, the Earth's TOPography derived Gravity model at 1' resolution (half-wavelength). The EGM2008 gravity field model and ETOPG1 were then used to rigorously compute 1' × 1' point values of surface gravity anomalies and disturbances, respectively, worldwide, at the real Earth's surface, i.e. at the lower limit of the atmosphere. The disturbance grid is the most interesting product of this study and can be used in various contexts. The surface gravity anomaly grid is an accurate product associated with EGM2008 and ETOPO1, but its gravity information content is that of EGM2008. Our method was validated by comparison with a direct numerical integration approach applied to a test area in Morocco-South of Spain (Kuhn, private communication 2011), and the agreement was satisfactory. Finally, isostatic corrections according to the Airy model, but in spherical geometry, with harmonic coefficients derived from the sets of the different ETOPO1 parts, were computed with a uniform depth of compensation of 30 km. The new world Bouguer and isostatic gravity maps and grids produced here will be made available through the Commission for the Geological Map of the World. Since gravity values are those of the EGM2008 model, geophysical interpretation from these products should not be done for spatial scales below 5 arc minutes (half-wavelength).
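For orientation, the standard forward column recursion for fully normalized associated Legendre functions looks like the sketch below. It is reliable only to moderate degree; the paper's specific algorithm adds the scaling and semi-empirical term control needed to reach degree 32,400, which is not reproduced here.

    import math

    def legendre_col(nmax, m, theta):
        """Fully normalized P_nm(cos theta) for fixed order m, n = m..nmax,
        via the standard forward column recursion (no extended-range scaling)."""
        t, u = math.cos(theta), math.sin(theta)
        pmm = 1.0                                  # sectoral seed P_mm
        if m > 0:
            pmm = math.sqrt(3.0) * u
            for k in range(2, m + 1):
                pmm *= u * math.sqrt((2 * k + 1) / (2.0 * k))
        vals = {m: pmm}
        if nmax == m:
            return vals
        p_prev, p = pmm, math.sqrt(2 * m + 3) * t * pmm    # P_{m+1,m}
        vals[m + 1] = p
        for n in range(m + 2, nmax + 1):
            a = math.sqrt((2 * n - 1) * (2 * n + 1) / ((n - m) * (n + m)))
            b = math.sqrt((2 * n + 1) * (n + m - 1) * (n - m - 1)
                          / ((n - m) * (n + m) * (2 * n - 3.0)))
            p, p_prev = a * t * p - b * p_prev, p
            vals[n] = p
        return vals

    print(legendre_col(4, 2, 0.7)[4])   # e.g. P_42 at colatitude 0.7 rad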
Arnela, Marc; Guasch, Oriol; Alías, Francesc
2013-10-01
One of the key effects to model in voice production is the acoustic radiation of sound waves emanating from the mouth. Three-dimensional numerical simulations make it possible to account for this effect naturally, as well as to consider all geometrical head details, by extending the computational domain out of the vocal tract. Despite this advantage, many approximations to the head geometry are often made for simplicity, and impedance load models are still used to reduce the computational cost. In this work, the impact of some of these simplifications on radiation effects is examined for vowel production in the frequency range 0-10 kHz by means of comparison with radiation from a realistic head. As a result, recommendations are given on their validity, depending on whether high-frequency energy (above 5 kHz) should be taken into account or not.
Experimental and computational fluid dynamics studies of mixing of complex oral health products
NASA Astrophysics Data System (ADS)
Cortada-Garcia, Marti; Migliozzi, Simona; Weheliye, Weheliye Hashi; Dore, Valentina; Mazzei, Luca; Angeli, Panagiota; ThAMes Multiphase Team
2017-11-01
Highly viscous non-Newtonian fluids are largely used in the manufacturing of specialized oral care products. Mixing often takes place in mechanically stirred vessels where the flow fields and mixing times depend on the geometric configuration and the fluid physical properties. In this research, we study the mixing performance of complex non-Newtonian fluids using Computational Fluid Dynamics models and validate them against experimental laser-based optical techniques. To this aim, we developed a scaled-down version of an industrial mixer. As test fluids, we used mixtures of glycerol and a Carbomer gel. The viscosities of the mixtures against shear rate at different temperatures and phase ratios were measured and found to be well described by the Carreau model. The numerical results were compared against experimental measurements of velocity fields from Particle Image Velocimetry (PIV) and concentration profiles from Planar Laser Induced Fluorescence (PLIF).
Gluon-fusion Higgs production in the Standard Model Effective Field Theory
NASA Astrophysics Data System (ADS)
Deutschmann, Nicolas; Duhr, Claude; Maltoni, Fabio; Vryonidou, Eleni
2017-12-01
We provide the complete set of predictions needed to achieve NLO accuracy in the Standard Model Effective Field Theory at dimension six for Higgs production in gluon fusion. In particular, we compute for the first time the contribution of the chromomagnetic operator (Q̄_L σ^{μν} T^A q_R) Φ G^A_{μν} at NLO in QCD, which entails two-loop virtual and one-loop real contributions, as well as renormalisation and mixing with the Yukawa operator Φ†Φ (Q̄_L Φ q_R) and the gluon-fusion operator Φ†Φ G^A_{μν} G^{A μν}. Focusing on the top-quark-Higgs couplings, we consider the phenomenological impact of the NLO corrections in constraining the three relevant operators by implementing the results into the MadGraph5_aMC@NLO framework. This allows us to compute total cross sections as well as to perform event generation at NLO that can be directly employed in experimental analyses.
MoPCoM Methodology: Focus on Models of Computation
NASA Astrophysics Data System (ADS)
Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent
Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we have presented our Model Based Engineering methodology addressing those issues. In this paper, we make a focus on Models of Computation design and analysis. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).
Dynamic regulation of erythropoiesis: A computer model of general applicability
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1979-01-01
A mathematical model for the control of erythropoiesis was developed based on the balance between oxygen supply and demand at a renal oxygen detector which controls erythropoietin release and red cell production. Feedback regulation of tissue oxygen tension is accomplished by adjustments of hemoglobin levels resulting from the output of a renal-bone marrow controller. Special consideration was given to the determinants of tissue oxygenation, including evaluation of the influence of blood flow, capillary diffusivity, oxygen uptake and oxygen-hemoglobin affinity. A theoretical analysis of the overall control system is presented. Computer simulations of altitude hypoxia, red cell infusion, hyperoxia, and hemolytic anemia demonstrate the validity of the model for general human application in health and disease.
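The negative feedback loop described here (oxygen deficit drives erythropoietin, which drives red cell production, which raises oxygen delivery) can be sketched in a few lines; the gains, lifespan and setpoint below are illustrative assumptions, not the paper's parameterization:

    # Minimal erythropoiesis feedback sketch (illustrative constants).

    def simulate(days, dt=0.1, o2_supply=1.0):
        H = 1.0            # normalized hemoglobin mass
        setpoint = 1.0     # desired tissue oxygenation
        for _ in range(int(days / dt)):
            tissue_o2 = o2_supply * H                            # delivery scales with Hb
            epo = max(0.0, 1.0 + 5.0 * (setpoint - tissue_o2))   # detector + controller gain
            production = epo / 120.0        # marrow output, scaled so epo = 1 is steady state
            destruction = H / 120.0         # ~120-day mean red cell lifespan
            H += (production - destruction) * dt
        return H

    # Altitude hypoxia: reduced supply raises EPO and hemoglobin toward a new equilibrium.
    print(simulate(300, o2_supply=0.8))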
A potential method for lift evaluation from velocity field data
NASA Astrophysics Data System (ADS)
de Guyon-Crozier, Guillaume; Mulleners, Karen
2017-11-01
Computing forces from velocity field measurements is one of the challenges in experimental aerodynamics. This work focuses on low Reynolds flows, where the dynamics of the leading and trailing edge vortices play a major role in lift production. Recent developments in 2D potential flow theory, using discrete vortex models, have shown good results for unsteady wing motions. A method is presented to calculate lift from experimental velocity field data using a discrete vortex potential flow model. The model continuously adds new point vortices at leading and trailing edges whose circulations are set directly from vorticity measurements. Forces are computed using the unsteady Blasius equation and compared with measured loads.
Taste CREp: the Cosmic-Ray Exposure program
NASA Astrophysics Data System (ADS)
Martin, Léo; Blard, Pierre-Henri; Balco, Greg; Lavé, Jérôme; Delunel, Romain; Lifton, Nathaniel
2017-04-01
We present here the CREp program and the ICE-D production rate database, an online system to compute Cosmic Ray Exposure (CRE) ages with cosmogenic 3He and 10Be (crep.crpg.cnrs-nancy.fr). The CREp calculator is designed to automatically reflect the current state of the global production rate calibration database stored in ICE-D (http://calibration.ice-d.org). ICE-D will be regularly updated in order to incorporate new calibration data and reflect the current state of the available literature. The CREp program allows ages to be calculated in a flexible way: 1) Two scaling models are available, i.e. i) the empirical Lal-Stone time-dependent model (Balco et al., 2008; Lal, 1991; Stone, 2000) with the muon parameters of Braucher et al. (2011), and ii) the Lifton-Sato-Dunai (LSD) theoretical model (Lifton et al., 2014). 2) Users may also test the impact of the atmosphere model, using either i) the ERA-40 database (Uppala et al., 2005), or ii) the standard atmosphere (N.O.A.A., 1976). 3) For the time-dependent correction, users may choose among the three proposed geomagnetic datasets (Lifton, 2016; Lifton et al., 2014; Muscheler et al., 2005) or import their own database. 4) For the important choice of the production rate, CREp is linked to a database of production rate calibration data, ICE-D. This database includes published empirical calibration rate studies that are publicly available at present, including those of the CRONUS-Earth and CRONUS-EU projects, as well as studies from other projects. Users may select the production rates either: i) using a worldwide mean value, ii) a regionally averaged value (not available in regions with no data), iii) a local unique value, which can be chosen among the existing dataset or imported by the user, or iv) any combination of single or multiple calibration data. We tested the efficacy of the different scaling models by looking at the statistical dispersion of the computed Sea Level High Latitude (SLHL) calibrated production rates. The Lal/Stone and LSD models have comparable efficacies, and the impact of the tested atmospheric model and the geomagnetic database is also limited. If a global mean is chosen, the 1σ uncertainty arising from the production rate is about 5% for 10Be and 10% for 3He. If a regional production rate is picked, these uncertainties are potentially lower.
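For the simplest case (constant production rate, no erosion) the underlying age computation reduces to closed-form expressions; the sketch below uses the commonly adopted 10Be half-life of 1.387 Myr and is a simplification, not CREp's actual implementation, which additionally applies the time-dependent scaling described above.

    import math

    LAMBDA_BE10 = math.log(2) / 1.387e6   # 10Be decay constant (1/yr)

    def age_be10(N, P):
        """N: 10Be concentration (atoms/g); P: local production rate (atoms/g/yr)."""
        x = 1.0 - N * LAMBDA_BE10 / P
        if x <= 0:
            raise ValueError("concentration at or above secular equilibrium")
        return -math.log(x) / LAMBDA_BE10

    def age_he3(N, P):
        """3He is stable, so the no-erosion age is simply N/P."""
        return N / P

    print(f"{age_be10(5.0e5, 10.0):,.0f} yr")   # ~51 kyr for these illustrative numbers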
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habte, A.; Sengupta, M.; Wilcox, S.
Models to compute Global Horizontal Irradiance (GHI) and Direct Normal Irradiance (DNI) have been in development over the last three decades. These models can be classified as empirical or physical, based on the approach. Empirical models relate ground-based observations with satellite measurements and use these relations to compute surface radiation. Physical models consider the radiation received from the earth at the satellite and create retrievals to estimate surface radiation. While empirical methods have traditionally been used for computing surface radiation for the solar energy industry, the advent of faster computing has made operational physical models viable. The Global Solar Insolation Project (GSIP) is an operational physical model from NOAA that computes GHI using the visible and infrared channel measurements from the GOES satellites. GSIP uses a two-stage scheme that first retrieves cloud properties and uses those properties in a radiative transfer model to calculate surface radiation. NREL, the University of Wisconsin and NOAA have recently collaborated to adapt GSIP to create a 4 km GHI and DNI product every 30 minutes. This paper presents an outline of the methodology and a comprehensive validation using high-quality ground-based solar data from the National Oceanic and Atmospheric Administration (NOAA) Surface Radiation (SURFRAD) (http://www.srrb.noaa.gov/surfrad/sitepage.html) and Integrated Surface Insolation Study (ISIS) (http://www.srrb.noaa.gov/isis/isissites.html) networks, the Solar Radiation Research Laboratory (SRRL) at the National Renewable Energy Laboratory (NREL), and Sun Spot One (SS1) stations.
Challenging Density Functional Theory Calculations with Hemes and Porphyrins.
de Visser, Sam P; Stillman, Martin J
2016-04-07
In this paper we review recent advances in computational chemistry, focusing specifically on the chemical description of heme proteins and synthetic porphyrins that act as mimics of natural processes and serve technological uses. These are challenging biochemical systems involved in electron transfer as well as biocatalysis processes. In recent years computational tools have improved considerably and can now reproduce experimental spectroscopic and reactivity studies within a reasonable error margin (several kcal·mol(-1)). This paper gives recent examples from our groups, where we investigated heme and synthetic metal-porphyrin systems. The four case studies highlight how computational modelling can correctly reproduce experimental product distributions, predict reactivity trends and guide interpretation of electronic structures of complex systems. The case studies focus on the calculations of a variety of spectroscopic features of porphyrins and show how computational modelling gives important insight that explains the experimental spectra and can lead to the design of porphyrins with tuned properties.
Singh, Ramesh K.; Liu, Shu-Guang; Tieszen, Larry L.; Suyker, Andrew E.; Verma, Shashi B.
2012-01-01
Gross primary production (GPP) is a key indicator of ecosystem performance and helps in many decision-making processes related to the environment. We used the eddy covariance-light use efficiency (EC-LUE) model for estimating GPP in the Great Plains, United States, in order to evaluate the performance of this model. We developed a novel algorithm for computing the photosynthetically active radiation (PAR) based on net radiation. A strong correlation (R² = 0.94, N = 24) was found between daily PAR and Landsat-based mid-day instantaneous net radiation. Though the Moderate Resolution Imaging Spectroradiometer (MODIS) based instantaneous net radiation was in better agreement (R² = 0.98, N = 24) with the daily measured PAR, there was no statistically significant difference between Landsat-based PAR and MODIS-based PAR. The EC-LUE model validation also confirms the need to consider biological attributes (C3 versus C4 plants) for potential light use efficiency. A universal potential light use efficiency is unable to capture the spatial variation of GPP. A C3-versus-C4-based land use/land cover map is therefore necessary when using the EC-LUE model to estimate the spatiotemporal distribution of GPP.
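In outline, the two steps the abstract describes can be sketched as below; the regression coefficients and scalar values are invented placeholders, not the paper's fitted values.

    # Sketch of (1) daily PAR regressed on instantaneous net radiation and
    # (2) an EC-LUE-style GPP estimate (hypothetical coefficients throughout).

    def daily_par(net_radiation, slope=0.42, intercept=1.1):
        """Daily PAR (MJ/m^2/day) from mid-day instantaneous net radiation (toy fit)."""
        return slope * net_radiation + intercept

    def gpp_ec_lue(par, fpar, eps_max, t_scalar, w_scalar):
        """EC-LUE form: potential LUE down-regulated by the more limiting of T and water."""
        return par * fpar * eps_max * min(t_scalar, w_scalar)

    # C3 and C4 plants take different eps_max values, which is why a single
    # universal potential LUE fails (per the abstract).
    print(gpp_ec_lue(daily_par(10.0), fpar=0.7, eps_max=2.0, t_scalar=0.9, w_scalar=0.8))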
A characterization of workflow management systems for extreme-scale applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia
The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
Use of agents to implement an integrated computing environment
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both systems and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that they be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.
Tabe-Bordbar, Shayan; Marashi, Sayed-Amir
2013-12-01
Elementary modes (EMs) are steady-state metabolic flux vectors with a minimal set of active reactions. Each EM corresponds to a metabolic pathway, so studying EMs is helpful for analyzing the production of biotechnologically important metabolites. However, memory requirements for computing EMs may hamper their applicability; in most genome-scale metabolic models, no EM can be computed because memory runs out. In this study, we present a method for computing randomly sampled EMs. In this approach, a network reduction algorithm based on flux balance methods is used for EM computation. We show that this approach can be used to recover the EMs in medium- and genome-scale metabolic network models, while the EMs are sampled in an unbiased way. The applicability of such results is shown by computing "estimated" control-effective flux values in the Escherichia coli metabolic network.
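The flux-balance building block that such methods rely on can be sketched as a linear program over a toy network; this is not the paper's models or its sampling algorithm, just the standard formulation:

    import numpy as np
    from scipy.optimize import linprog

    # Toy flux balance analysis. Metabolites A, B; reactions:
    # R1: -> A, R2: A -> B, R3: B -> (taken as the objective flux).
    S = np.array([[ 1, -1,  0],    # mass balance for A
                  [ 0,  1, -1]])   # mass balance for B
    bounds = [(0, 10), (0, 10), (0, 10)]

    # Maximize v3 subject to S v = 0 (linprog minimizes, hence the minus sign).
    res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds,
                  method="highs")
    print("optimal objective flux:", -res.fun, "flux vector:", res.x)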
NASA Astrophysics Data System (ADS)
Ramotar, Lokendra; Rohrauer, Greg L.; Filion, Ryan; MacDonald, Kathryn
2017-03-01
The development of a dynamic thermal battery model for hybrid and electric vehicles is realized. A thermal equivalent circuit model is created that aims to capture and explain heat propagation from the cells through the entire pack and to the environment, with a production vehicle battery pack used for model validation. The inclusion of production hardware and the liquid battery thermal management system components in the model considers physical and geometric properties to calculate thermal resistances of components (conduction, convection and radiation) along with their associated heat capacities. Various heat sources and sinks comprise the remaining model elements. Analog equivalent circuit simulations using PSpice are compared to experimental results to validate internal temperature nodes and heat rates measured through various elements, which are then employed to refine the model further. Agreement with experimental results indicates the proposed method allows for a comprehensive real-time battery pack analysis at little computational expense when compared to other types of computer-based simulations. Elevated road and ambient conditions in Mesa, Arizona are simulated on a parked vehicle with varying quiescent cooling rates to examine the effect on the diurnal battery temperature during longer-term static exposure. A typical daily driving schedule is also simulated and examined.
Two-Equation Turbulence Models for Prediction of Heat Transfer on a Transonic Turbine Blade
NASA Technical Reports Server (NTRS)
Garg, Vijay K.; Ameri, Ali A.; Gaugler, R. E. (Technical Monitor)
2001-01-01
Two versions of the two-equation k-ω model and a shear stress transport (SST) model are used in a three-dimensional, multi-block Navier-Stokes code to compare with detailed heat transfer measurements on a transonic turbine blade. It is found that the SST model resolves the passage vortex better on the suction side of the blade, thus yielding a better comparison with the experimental data than either of the k-ω models. However, the comparison is still deficient on the suction side of the blade. Use of the SST model does require the computation of distance from a wall, which for a multi-block grid, such as in the present case, can be complicated. However, a relatively easy fix for this problem was devised. Also addressed are issues such as (1) computation of the production term in the turbulence equations for aerodynamic applications, and (2) the relation between the computational and experimental values for the turbulence length scale, and its influence on the passage vortex on the suction side of the turbine blade.
Crustal thickness of Antarctica estimated using data from gravimetric satellites
NASA Astrophysics Data System (ADS)
Llubes, Muriel; Seoane, Lucia; Bruinsma, Sean; Rémy, Frédérique
2018-04-01
Computing a better crustal thickness model is still a necessary improvement in Antarctica. In this remote continent where almost all the bedrock is covered by the ice sheet, seismic investigations do not reach a sufficient spatial resolution for geological and geophysical purposes. Here, we present a global map of Antarctic crustal thickness computed from space gravity observations. The DIR5 gravity field model, built from GOCE and GRACE gravimetric data, is inverted with the Parker-Oldenburg iterative algorithm. The BEDMAP products are used to estimate the gravity effect of the ice and the rocky surface. Our result is compared to crustal thickness calculated from seismological studies and the CRUST1.0 and AN1 models. Although the CRUST1.0 model shows a very good agreement with ours, its spatial resolution is larger than the one we obtain with gravimetric data. Finally, we compute a model in which the crust-mantle density contrast is adjusted to fit the Moho depth from the CRUST1.0 model. In East Antarctica, the resulting density contrast clearly shows higher values than in West Antarctica.
PLM in the context of the maritime virtual education
NASA Astrophysics Data System (ADS)
Raicu, Alexandra; Oanta, Emil M.
2016-12-01
This paper presents new approaches to using the Product Lifecycle Management concept to achieve knowledge integration of the academic disciplines in the maritime education context. The philosophy of the educational system is changing rapidly worldwide and is in continuous development. There is a demand for modern educational facilities for CAD/CAE/CAM training of future maritime engineers, facilities that offer collaborative environments between the academic disciplines and the teachers. It is well known that students must understand the importance of the connectivity between the academic disciplines and the computer-aided methods that interface them. Thus, besides the basic knowledge and competences acquired from the CAD courses, students learn how to increase design productivity, create parametric designs, use original instruments of automatic design and 3D printing methods, and interface the CAD/CAE/CAM applications. As an example, the Strength of Materials discipline briefly presents alternate computer-aided methods to compute the geometrical characteristics of cross sections using the CAD geometry, to create free-body diagrams, and to present the deflected shapes of various educational models, including the rotational effect when the forces are not applied at the shear center, using the results of the FEM applications. During the computer-aided engineering academic disciplines, after the students design and analyze a virtual 3D model, they can convert it into a physical object using 3D printing. Constanta Maritime University offers a full understanding of the concept of Product Lifecycle Management: collaborative creation, management and dissemination.
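As a concrete instance of the Strength of Materials example, the geometrical characteristics of a cross section can be computed directly from CAD polygon vertices with the standard shoelace-type formulas; a minimal sketch:

    # Area, centroid and second moment of area of a polygon given by its
    # counter-clockwise vertices (e.g. exported from CAD geometry).

    def section_properties(pts):
        A = Cx = Cy = Ixx = 0.0
        n = len(pts)
        for i in range(n):
            x0, y0 = pts[i]
            x1, y1 = pts[(i + 1) % n]
            cross = x0 * y1 - x1 * y0
            A += cross
            Cx += (x0 + x1) * cross
            Cy += (y0 + y1) * cross
            Ixx += (y0 * y0 + y0 * y1 + y1 * y1) * cross
        A *= 0.5
        Cx /= 6.0 * A
        Cy /= 6.0 * A
        Ixx = Ixx / 12.0 - A * Cy * Cy   # shift to the centroidal axis
        return A, (Cx, Cy), Ixx

    # Unit square: area 1, centroid (0.5, 0.5), Ixx = 1/12 about the centroid.
    print(section_properties([(0, 0), (1, 0), (1, 1), (0, 1)]))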
A Production System Model of Capturing Reactive Moving Targets. M.S. Thesis
NASA Technical Reports Server (NTRS)
Jagacinski, R. J.; Plamondon, B. D.; Miller, R. A.
1984-01-01
Subjects manipulated a control stick to position a cursor over a moving target that reacted with a computer-generated escape strategy. The cursor movements were described at two levels of abstraction. At the upper level, a production system described transitions among four modes of activity: rapid acquisition, close following, a predictive mode, and herding. Within each mode, differential equations described trajectory-generating mechanisms. A simulation of this two-level model captures the targets in a manner resembling the episodic time histories of human subjects.
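A toy version of the two-level structure (production rules selecting a mode, each mode with its own trajectory law) might look like this; the rule thresholds and gains are invented, not the thesis's fitted values:

    # Upper level: production rules choose a mode from the current situation.
    def choose_mode(distance, target_speed):
        if distance > 5.0:
            return "rapid_acquisition"
        if target_speed > 2.0:
            return "predictive"
        if distance > 1.0:
            return "close_following"
        return "herding"

    # Lower level: each mode has its own trajectory-generating law
    # (here, simple proportional control with a mode-specific gain).
    def cursor_velocity(mode, distance):
        gains = {"rapid_acquisition": 3.0, "close_following": 1.0,
                 "predictive": 1.8, "herding": 0.4}
        return gains[mode] * distance

    d, ts = 8.0, 1.0
    for step in range(5):
        mode = choose_mode(d, ts)
        d -= cursor_velocity(mode, d) * 0.1
        print(step, mode, round(d, 2))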
A Conceptual Model for Analysing Collaborative Work and Products in Groupware Systems
NASA Astrophysics Data System (ADS)
Duque, Rafael; Bravo, Crescencio; Ortega, Manuel
Collaborative work using groupware systems is a dynamic process in which many tasks, in different application domains, are carried out. Currently, one of the biggest challenges in the field of CSCW (Computer-Supported Cooperative Work) research is to establish conceptual models which allow for the analysis of collaborative activities and their resulting products. In this article, we propose an ontology that conceptualizes the required elements which enable an analysis to infer a set of analysis indicators, thus evaluating both the individual and group work and the artefacts which are produced.
Computational neuroanatomy of speech production.
Hickok, Gregory
2012-01-05
Speech production has been studied predominantly from within two traditions, psycholinguistics and motor control. These traditions have rarely interacted, and the resulting chasm between these approaches seems to reflect a level of analysis difference: whereas motor control is concerned with lower-level articulatory control, psycholinguistics focuses on higher-level linguistic processing. However, closer examination of both approaches reveals a substantial convergence of ideas. The goal of this article is to integrate psycholinguistic and motor control approaches to speech production. The result of this synthesis is a neuroanatomically grounded, hierarchical state feedback control model of speech production.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagopian, C.R.; Lewis, P.J.; McDonald, J.J.
1983-02-01
Improvements and innovations in styrene production since 1966 are outlined, with rigorous process models credited for many of the changes. Such models are used to evaluate the effects of changing raw material costs, utility costs, and available catalyst choices. The process model can also evaluate the best operating configuration and catalyst choice for a plant. All specified innovations are incorporated in the Mobil/Badger ethylbenzene and the Cosden/Badger styrene processes (both of which are shown schematically). Badger's training programs are reviewed. Badger's Styrenics Business Team converts information into a plant design basis. A reaction model, with input derived from isothermal and adiabatic pilot plant units, is at the heart of the complete computer simulation of the ethylbenzene and styrene processes.
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
NASA Astrophysics Data System (ADS)
Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.
2011-12-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and we conclude that it is most cost-effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage occur.
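The paper's conclusion amounts to a break-even calculation: dedicated capacity for baseline load, cloud bursting for peaks. A back-of-envelope sketch with placeholder prices (not the paper's figures):

    # Hybrid capacity cost sketch: dedicated slots cover the baseline,
    # cloud slots are rented only for peak hours. All prices are invented.

    def annual_cost(baseline_slots, peak_slots, peak_hours,
                    dedicated_per_slot_year=400.0, cloud_per_slot_hour=0.10):
        dedicated = baseline_slots * dedicated_per_slot_year
        burst = max(0, peak_slots - baseline_slots) * peak_hours * cloud_per_slot_hour
        return dedicated + burst

    # 1000 dedicated slots, bursting to 1500 for 500 hours a year.
    print(annual_cost(baseline_slots=1000, peak_slots=1500, peak_hours=500))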
NASA Technical Reports Server (NTRS)
Dalee, Robert C.; Bacskay, Allen S.; Knox, James C.
1990-01-01
An overview of the CASE/A-ECLSS series modeling package is presented. CASE/A is an analytical tool that has delivered engineering productivity gains during ECLSS design activities. A component verification program was performed to assure component modeling validity, based on test data from the Phase II comparative test program completed at the Marshall Space Flight Center. An integrated plotting feature has been added to the program, which allows the operator to analyze on-screen data trends or produce hard-copy plots from within the CASE/A operating environment. New command features in the areas of schematic, output, and model management, and component data editing, have been incorporated to enhance the engineer's productivity during a modeling program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacKinnon, R.J.; Sullivan, T.M.; Kinsey, R.R.
1997-05-01
The BLT-EC computer code has been developed, implemented, and tested. BLT-EC is a two-dimensional finite element computer code capable of simulating the time-dependent release and reactive transport of aqueous phase species in a subsurface soil system. BLT-EC contains models to simulate the processes (container degradation, waste-form performance, transport, chemical reactions, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is provided through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste-form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, and solubility. Transport considers the processes of advection, dispersion, diffusion, chemical reaction, radioactive production and decay, and sources (waste form releases). Chemical reactions accounted for include complexation, sorption, dissolution-precipitation, oxidation-reduction, and ion exchange. Radioactive production and decay in the waste form is simulated. To improve the usefulness of BLT-EC, a pre-processor, ECIN, which assists in the creation of chemistry input files, and a post-processor, BLTPLOT, which provides a visual display of the data, have been developed. BLT-EC also includes an extensive database of thermodynamic data that is also accessible to ECIN. This document reviews the models implemented in BLT-EC and serves as a guide to creating input files and applying BLT-EC.
Morrison, Tina M.; Dreher, Maureen L.; Nagaraja, Srinidhi; Angelone, Leonardo M.; Kainz, Wolfgang
2018-01-01
The total product life cycle (TPLC) of medical devices has been defined by four stages: discovery and ideation, regulatory decision, product launch, and postmarket monitoring. Manufacturers of medical devices intended for use in the peripheral vasculature, such as stents, inferior vena cava (IVC) filters, and stent-grafts, mainly use computational modeling and simulation (CM&S) to aid device development and design optimization, supplement bench testing for regulatory decisions, and assess postmarket changes or failures. For example, computational solid mechanics and fluid dynamics enable the investigation of design limitations in the ideation stage. To supplement bench data in regulatory submissions, manufacturers can evaluate the effects of anatomical characteristics and the expected in vivo loading environment on device performance. Manufacturers might also harness CM&S to aid root-cause analyses that are necessary when failures occur postmarket, when the device is exposed to broad clinical use. Once identified, CM&S tools can then be used for redesign to address the failure mode and re-establish the performance profile with the appropriate models. The Center for Devices and Radiological Health (CDRH) wants to advance the use of CM&S for medical devices and supports the development of virtual physiological patients, clinical trial simulations, and personalized medicine. Thus, the purpose of this paper is to describe specific examples of how CM&S is currently used to support regulatory submissions at different phases of the TPLC and to present some of the stakeholder-led initiatives for advancing CM&S for regulatory decision-making.
ERIC Educational Resources Information Center
Guidera, Stan; MacPherson, D. Scot
2008-01-01
This paper presents the results of a study that was conducted to identify and document student perceptions of the effectiveness of computer modeling software introduced in a design foundations course that had previously utilized only conventional manually-produced representation techniques. Rather than attempt to utilize a production-oriented CAD…
Kumar, Anup; Guria, Chandan; Chitres, G; Chakraborty, Arunangshu; Pathak, A K
2016-10-01
A comprehensive mathematical model involving NPK-10:26:26 fertilizer, NaCl, NaHCO3, light and temperature operating variables for Dunaliella tertiolecta cultivation is formulated to predict microalgae biomass and lipid productivity. The proposed model combines Monod/Andrews kinetics for the absorption of essential nutrients into algal biomass with a Droop model involving internal nutrient cell quota for microalgae growth, assuming the algal biomass is composed of sugar, a functional pool and neutral lipid. Biokinetic model parameters are determined by minimizing the residual sum of square errors between experimental and computed microalgae biomass and lipid productivity using a genetic algorithm. The developed model is validated against experiments of Dunaliella tertiolecta cultivation in an air-agitated sintered-disk chromatographic glass bubble column, and the effects of the operating variables on microalgae biomass and lipid productivity are investigated. Finally, parametric sensitivity analysis is carried out to assess the sensitivity of the results to the model parameters over the input parameter space. The proposed model may be helpful in scale-up studies and in implementing model-based control strategies in large-scale algal cultivation.
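The model structure the abstract names (Monod-type uptake feeding a Droop internal quota that drives growth) can be sketched as a small Euler integration; all constants below are illustrative only, not the paper's fitted biokinetic parameters:

    # Minimal Monod-uptake / Droop-growth sketch (illustrative constants).

    def simulate(hours, dt=0.1):
        X, S, Q = 0.1, 50.0, 0.05    # biomass (g/L), nutrient (mg/L), cell quota (mg/g)
        MU_MAX, Q_MIN = 0.08, 0.02   # max growth rate (1/h), subsistence quota (mg/g)
        V_MAX, K_S = 0.01, 5.0       # max uptake (mg/g/h), half-saturation (mg/L)
        for _ in range(int(hours / dt)):
            uptake = V_MAX * S / (K_S + S)      # Monod absorption of the nutrient
            mu = MU_MAX * (1.0 - Q_MIN / Q)     # Droop growth on the internal quota
            S = max(S - uptake * X * dt, 0.0)
            Q += (uptake - mu * Q) * dt         # quota gained by uptake, diluted by growth
            X += mu * X * dt
        return X, S, Q

    print(simulate(200))   # growth stalls once the external nutrient is exhausted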
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, L.A.; Randolph, P.L.
1979-01-01
A paper presented by the Institute of Gas Technology (IGT) at the Third Geopressured-Geothermal Energy Conference hypothesized that the high ratio of produced gas to produced water from the No. 1 sand in the Edna Delcambre No. 1 well was due to free gas trapped in pores by imbibition over geological time. This hypothesis was examined in relation to preliminary test data which reported only average gas-to-water ratios over the roughly 2-day steps in flow rate. Subsequent public release of detailed test data revealed substantial departures from the previously reported computer simulation results. Also, data now in the public domain reveal the existence of a gas cap on the aquifer tested. This paper describes IGT's efforts to match the observed gas/water production with computer simulation. Two models for the occurrence and production of gas in excess of that dissolved in the brine have been used. One model considers the gas to be dispersed in pores by imbibition, and the other considers the gas as a nearby free gas cap above the aquifer. The studies revealed that the dispersed gas model characteristically gave the wrong shape to plots of gas production on the gas/water ratio plots, such that no reasonable match to the flow data could be achieved. The free gas cap model gave a characteristically better shape to the production plots and could provide an approximate fit to the data if the edge of the free gas cap is only about 400 feet from the well. Because the geological structure maps indicate the free gas cap to be several thousand feet away and the computer simulation results match the distance to the nearby Delcambre Nos. 4 and 4A wells, it appears that the source of the excess free gas in the test of the No. 1 sand may be these nearby wells. The gas source is probably a separate gas zone brought into contact with the No. 1 sand via a conduit around the No. 4 well.
NASA Astrophysics Data System (ADS)
Reid, J. S.; Westphal, D. L.; Christopher, S. A.; Prins, E. M.; Gasso, S.; Reid, E.; Theisen, M.; Schmidt, C. C.; Hunter, J.; Eck, T.
2002-05-01
The Fire Locating and Modeling of Burning Emissions (FLAMBE') project is a joint Navy, NOAA, NASA and university project to integrate satellite products with numerical aerosol models to produce a real-time fire and emissions inventory. At the center of the program is the Wildfire Automated Biomass Burning Algorithm (WF ABBA), which provides real-time fire products, and the NRL Aerosol Analysis and Prediction System (NAAPS), which models smoke transport. In this presentation we give a brief overview of the system and methods, but emphasize new estimates of smoke coverage and emission fluxes from the South American continent. Temporal and spatial smoke patterns compare reasonably well with AERONET and MODIS aerosol optical depth products for the 2000 and 2001 fire seasons. Fluxes are computed by relating NAAPS output fields and MODIS optical depth maps with modeled wind fields. Smoke emissions and transport fluxes out of the continent can then be estimated by perturbing the modeled emissions to gain agreement with the satellite and wind products. Regional smoke emissions are also presented for grass and forest burning.
Gutierrez, Jahir M; Lewis, Nathan E
2015-07-01
Eukaryotic cell lines, including Chinese hamster ovary cells, yeast, and insect cells, are invaluable hosts for the production of many recombinant proteins. With the advent of genomic resources, one can now leverage genome-scale computational modeling of cellular pathways to rationally engineer eukaryotic host cells. Genome-scale models of metabolism include all known biochemical reactions occurring in a specific cell. By describing these mathematically and using tools such as flux balance analysis, the models can simulate cell physiology and provide targets for cell engineering that could lead to enhanced cell viability, titer, and productivity. Here we review examples in which metabolic models in eukaryotic cell cultures have been used to rationally select targets for genetic modification, improve cellular metabolic capabilities, design media supplementation, and interpret high-throughput omics data. As more comprehensive models of metabolism and other cellular processes are developed for eukaryotic cell culture, these will enable further exciting developments in cell line engineering, thus accelerating recombinant protein production and biotechnology in the years to come.
Carbon solids in oxygen-deficient explosives (LA-UR-13-21151)
NASA Astrophysics Data System (ADS)
Peery, Travis
2013-06-01
The phase behavior of excess carbon in oxygen-deficient explosives has a significant effect on detonation properties and product equations of state. Mixtures of fuel oil in ammonium nitrate (ANFO) above a stoichiometric ratio demonstrate that even small amounts of graphite, on the order of 5% by mole fraction, can substantially alter the Chapman-Jouguet (CJ) state properties, a central ingredient in modeling the product equation of state. Similar effects can be seen for Composition B, which borders the carbon phase boundary between graphite and diamond. Nano-diamond formation adds complexity to the product modeling because of surface adsorption effects. I will discuss these carbon phase issues in our equation-of-state modeling of detonation products, including our statistical mechanics description of carbon clustering and surface chemistry to properly treat solid carbon formation. This work is supported by the Advanced Simulation and Computing Program, under the NNSA.
Modeling Production Plant Forming Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhee, M; Becker, R; Couch, R
2004-09-22
Engineering has simulation tools and experience in modeling forming processes. Y-12 personnel have expressed interest in validating our tools and experience against their manufacturing process activities, such as rolling, casting, and forging. We have demonstrated numerical capabilities in a collaborative DOE/OIT project with ALCOA that is nearing successful completion. The goal was to use ALE3D to model Alcoa's slab rolling process in order to demonstrate a computational tool that would allow Alcoa to define a rolling schedule that minimizes the probability of ingot fracture, thus reducing waste and energy consumption. The project is intended to lead to long-term collaboration with Y-12 and perhaps involvement with other components of the weapons production complex. Using simulations to aid in the design of forming processes can decrease time to production, reduce forming trials and associated expenses, and guide development of products with greater uniformity and less scrap.
Pricing strategy in a dual-channel and remanufacturing supply chain system
NASA Astrophysics Data System (ADS)
Jiang, Chengzhi; Xu, Feng; Sheng, Zhaohan
2010-07-01
This article addresses pricing strategy problems in a supply chain system where the manufacturer sells original products and remanufactured products via indirect retailer channels and direct Internet channels. Due to the complexity of that system, agent technologies, which provide a new way of analysing complex systems, are used for modelling. Meanwhile, in order to reduce the computational load of the search for optimal prices and profits, a learning search algorithm is designed and implemented within the multi-agent supply chain model. The simulation results show that the proposed model can find optimal prices for original products and remanufactured products in both channels, which lead to optimal profits for the manufacturer and the retailer. It is also found that the optimal profits are increased by introducing the direct channel and remanufacturing. Furthermore, the effects of customer preference, direct channel cost and remanufactured unit cost on optimal prices and profits are examined.
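As a stand-in for the learning search algorithm, a brute-force scan over the two channel prices conveys the structure of the optimization; the demand model, preference weight and cost below are invented, and the problem is simplified to one product class:

    # Scan retail and direct-channel prices for the profit-maximizing pair
    # (toy linear demand with channel substitution; all coefficients invented).

    def profits(p_r, p_d, pref=0.6):
        d_r = max(0.0, pref * (100 - 1.5 * p_r + 0.5 * p_d))        # retail demand
        d_d = max(0.0, (1 - pref) * (100 - 1.5 * p_d + 0.5 * p_r))  # direct demand
        c = 20.0                                                    # unit cost
        return (p_r - c) * d_r + (p_d - c) * d_d

    best = max((profits(pr, pd), pr, pd)
               for pr in range(20, 81) for pd in range(20, 81))
    print("profit %.0f at retail price %d, direct price %d" % best)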
Computational inference of the structure and regulation of the lignin pathway in Panicum virgatum
Faraji, Mojdeh; Fonseca, Luis L.; Escamilla-Treviño, Luis; ...
2015-09-17
Switchgrass is a prime target for biofuel production from inedible plant parts and has been the subject of numerous investigations in recent years. Yet one of the main obstacles to effective biofuel production remains the major problem of recalcitrance. Recalcitrance emerges in part from the 3-D structure of lignin as a polymer in the secondary cell wall. Lignin limits accessibility of the sugars in the cellulose and hemicellulose polymers to enzymes and ultimately decreases ethanol yield. Monolignols, the building blocks of lignin polymers, are synthesized in the cytosol and translocated to the plant cell wall, where they undergo polymerization. The biosynthetic pathway leading to monolignols in switchgrass is not completely known, and difficulties associated with in vivo measurements of these intermediates pose a challenge for a true understanding of the functioning of the pathway. In this study, a systems biological modeling approach is used to address this challenge and to elucidate the structure and regulation of the lignin pathway through a computational characterization of alternate candidate topologies. The analysis is based on experimental data characterizing stem and tiller tissue of four transgenic lines (knock-downs of genes coding for key enzymes in the pathway) as well as wild-type switchgrass plants. These data consist of the observed content and composition of monolignols. The possibility of a G-lignin specific metabolic channel associated with the production and degradation of coniferaldehyde is examined, and the results support previous findings from another plant species. The computational analysis suggests regulatory mechanisms of product inhibition and enzyme competition, which are well known in biochemistry but had so far not been reported in switchgrass. By including these mechanisms, the pathway model is able to represent all observations. In conclusion, the results show that the presence of the coniferaldehyde channel is necessary and that product inhibition and competition over cinnamoyl-CoA-reductase (CCR1) are essential for matching the model to observed increases in H-lignin levels in 4-coumarate:CoA-ligase (4CL) knockdowns. Moreover, competition for 4-coumarate:CoA-ligase (4CL) is essential for matching the model to observed increases in the pathway metabolites in caffeic acid O-methyltransferase (COMT) knockdowns. As far as possible, the model was validated with independent data.
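The regulatory mechanisms the analysis points to have standard rate-law forms; a sketch with invented constants (not the paper's parameterization):

    # Standard rate laws for the two mechanisms named above.

    def product_inhibition(S, P, Vmax=1.0, Km=0.5, Ki=0.2):
        """Michaelis-Menten rate with competitive inhibition by the product P."""
        return Vmax * S / (Km * (1.0 + P / Ki) + S)

    def shared_enzyme_competition(S1, S2, Vmax=1.0, Km1=0.5, Km2=0.8):
        """Two substrates competing for one enzyme (as posited for 4CL and CCR1)."""
        denom = 1.0 + S1 / Km1 + S2 / Km2
        return (Vmax * (S1 / Km1) / denom, Vmax * (S2 / Km2) / denom)

    print(product_inhibition(1.0, 0.4))      # rate drops as product accumulates
    print(shared_enzyme_competition(1.0, 0.3))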
Mechanics of airflow in the human nasal airways.
Doorly, D J; Taylor, D J; Schroter, R C
2008-11-30
The mechanics of airflow in the human nasal airways is reviewed, drawing on the findings of experimental and computational model studies. Modelling inevitably requires simplifications and assumptions, particularly given the complexity of the nasal airways. The processes entailed in modelling the nasal airways (from defining the model, to its production and, finally, validating the results) are critically examined, both for physical models and for computational simulations. Uncertainty still surrounds the appropriateness of the various assumptions made in modelling, particularly with regard to the nature of flow. New results are presented in which high-speed particle image velocimetry (PIV) and direct numerical simulation are applied to investigate the development of flow instability in the nasal cavity. These illustrate some of the improved capabilities afforded by technological developments for future model studies. The need for further improvements in characterising airway geometry and flow, together with promising new methods, is briefly discussed.
Task Decomposition Model for Dispatchers in Dynamic Scheduling of Demand Responsive Transit Systems
DOT National Transportation Integrated Search
2000-06-01
Since the passage of the ADA, demand for paratransit service has been steadily increasing. Paratransit companies are relying on computer automation to streamline dispatch operations, increase productivity, and reduce operator stress and error. Little resear...
Lipoprotein metabolism indicators improve cardiovascular risk prediction
USDA-ARS?s Scientific Manuscript database
Background: Cardiovascular disease risk increases when lipoprotein metabolism is dysfunctional. We have developed a computational model able to derive indicators of lipoprotein production, lipolysis, and uptake processes from a single lipoprotein profile measurement. This is the first study to inves...
AGIS: Integration of new technologies used in ATLAS Distributed Computing
NASA Astrophysics Data System (ADS)
Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria
2017-10-01
The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Having been in production throughout LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services are continuously evolving to fit newer requirements from the ADC community. In this note, we describe the evolution and the recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as flexible utilization of opportunistic Cloud and HPC resources, ObjectStore services integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, and the unified storage protocol declaration required for PanDA Pilot site movers. The improvements of the information model and general updates are also shown; in particular, we explain how other collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.
NASA Astrophysics Data System (ADS)
Iungo, Giacomo Valerio; Camarri, Simone; Ciri, Umberto; El-Asha, Said; Leonardi, Stefano; Rotea, Mario A.; Santhanagopalan, Vignesh; Viola, Francesco; Zhan, Lu
2016-11-01
Site conditions, such as topography and local climate, as well as wind farm layout strongly affect the performance of a wind power plant. Therefore, predictions of wake interactions and their effects on power production still remain a great challenge in wind energy. For this study, an onshore wind turbine array was monitored through lidar measurements, SCADA and met-tower data. Power losses due to wake interactions were estimated to be approximately 4% and 2% of the total power production under stable and convective conditions, respectively. This dataset was then leveraged for the calibration of a data-driven RANS (DDRANS) solver, which is a compelling tool for the prediction of wind turbine wakes and power production. DDRANS is characterized by a computational cost as low as that of engineering wake models, with adequate accuracy achieved through data-driven tuning of the turbulence closure model. DDRANS is based on a parabolic formulation and axisymmetry and boundary layer approximations, which allow achieving low computational costs. The turbulence closure model consists of a mixing-length model, which is optimally calibrated with the experimental dataset. Assessment of DDRANS is then performed through lidar and SCADA data for different atmospheric conditions. This material is based upon work supported by the National Science Foundation under the I/UCRC WindSTAR, NSF Award IIP 1362033.
Yamamoto, Takehiro; Ueda, Shuya
2013-01-01
Biofilm is a slime-like complex aggregate of microorganisms and their products, extracellular polymer substances, that grows on a solid surface. The growth phenomenon of biofilm is relevant to the corrosion and clogging of water pipes, the chemical processes in a bioreactor, and bioremediation. In these phenomena, the behavior of the biofilm under flow has an important role. Therefore, controlling the biofilm behavior in each process is important. To provide a computational tool for analyzing biofilm growth, the present study proposes a computational model for the simulation of biofilm growth in flows. This model accounts for the growth, decay, detachment and adhesion of biofilms. The proposed model couples the computation of the surrounding fluid flow, using the finite volume method, with the simulation of biofilm growth, using the cellular automaton approach, a relatively low-computational-cost method. Furthermore, a stochastic approach for considering the adhesion process is proposed. Numerical simulations for the biofilm growth on a planar wall and that in an L-shaped rectangular channel were carried out. A variety of biofilm structures were observed depending on the strength of the flow. Moreover, the importance of the detachment and adhesion processes was confirmed.
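The following is a minimal sketch of the cellular-automaton ingredient of such a model: cells on a lattice grow and stochastically detach, with the flow coupling omitted and a fixed detachment probability standing in for shear. All probabilities and grid sizes are illustrative assumptions.

```python
# Toy cellular-automaton sketch of biofilm growth with stochastic
# detachment, in the spirit of the coupled model described above.
# The finite-volume flow coupling is omitted for brevity.
import numpy as np

rng = np.random.default_rng(0)
nx, ny = 40, 20
grid = np.zeros((ny, nx), dtype=bool)
grid[0, :] = rng.random(nx) < 0.3  # initial colonisers on the wall (row 0)

p_grow, p_detach = 0.3, 0.05  # hypothetical per-step probabilities
for step in range(100):
    new = grid.copy()
    ys, xs = np.nonzero(grid)
    for y, x in zip(ys, xs):
        if rng.random() < p_grow and y + 1 < ny:
            new[y + 1, x] = True            # growth away from the wall
        if rng.random() < p_detach and y > 0:
            new[y, x] = False               # stochastic detachment
    grid = new
print("occupied cells after 100 steps:", grid.sum())
```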
NASA Astrophysics Data System (ADS)
Tohidnia, S.; Tohidi, G.
2018-02-01
The current paper develops three different ways to measure the multi-period global cost efficiency for homogeneous networks of processes when the prices of exogenous inputs are known at all time periods. A multi-period network data envelopment analysis model is presented to measure the minimum cost of the network system based on the global production possibility set. We show that there is a relationship between the multi-period global cost efficiency of the network system and that of its subsystems and processes. The proposed model is applied to compute the global cost Malmquist productivity index, which measures the productivity change of the network system and of each of its processes between two time periods. This index is circular. Furthermore, we show that the productivity change of the network system can be defined as a weighted average of the process productivity changes. Finally, a numerical example is presented to illustrate the proposed approach.
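The circularity property claimed for the index can be stated compactly. The following is a hedged sketch of the structure only; the symbols are illustrative and not necessarily the paper's exact notation.

```latex
% Hedged sketch of a global cost Malmquist index and its circularity;
% CE^G denotes cost efficiency measured against the global frontier,
% with inputs x, outputs y, and input prices w at each period.
\[
  GCM_{t,t+1} \;=\; \frac{CE^{G}\!\left(x^{t+1}, y^{t+1}, w^{t+1}\right)}
                         {CE^{G}\!\left(x^{t}, y^{t}, w^{t}\right)},
  \qquad
  GCM_{t,t+2} \;=\; GCM_{t,t+1}\cdot GCM_{t+1,t+2}.
\]
```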
King, W. E.; Anderson, A. T.; Ferencz, R. M.; ...
2015-12-29
The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.
NASA Astrophysics Data System (ADS)
Gong, Y.; Yang, Y.; Yang, X.
2018-04-01
To effectively extract the productions of specific branching plants and realize their 3D reconstruction, terrestrial LiDAR data was used as the source for production extraction, and a 3D reconstruction method based on terrestrial LiDAR technologies combined with the L-system is proposed in this article. The topology of the plant architecture was extracted from the point cloud data of the target plant using a space-level segmentation mechanism. Subsequently, L-system productions were obtained, and the structural parameters and production rules of the branches that fit the given plant were generated. Finally, a three-dimensional simulation model of the target plant was established in combination with a computer visualization algorithm. The results suggest that the method can effectively extract the topology of a given branching plant and describe its productions, realizing the extraction of topology structure by computer algorithm and simplifying the extraction of branching plant productions, which would otherwise be complex and time-consuming with the L-system alone. It improves the degree of automation in the L-system extraction of productions of specific branching plants, providing a new way to extract branching plant production rules.
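For readers unfamiliar with L-system productions, the sketch below shows the rewriting step that such extracted rules feed into; the bracketed rule is a textbook example of a branching production, not one derived from LiDAR data.

```python
# Minimal L-system rewriting sketch: expanding a production into a
# branching string. '[' pushes the turtle state, ']' pops it,
# '+'/'-' turn; 'F' draws a branch segment.
def expand(axiom: str, rules: dict, n: int) -> str:
    s = axiom
    for _ in range(n):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

rules = {"F": "F[+F]F[-F]F"}  # classic illustrative branching production
print(expand("F", rules, 2))
```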
Economic lot sizing in a production system with random demand
NASA Astrophysics Data System (ADS)
Lee, Shine-Der; Yang, Chin-Ming; Lan, Shu-Chuan
2016-04-01
An extended economic production quantity model that copes with random demand is developed in this paper. A unique feature of the proposed study is the consideration of transient shortage during the production stage, which has not been explicitly analysed in the existing literature. The considered costs include the set-up cost for the batch production, the inventory carrying cost during the production and depletion stages in one replenishment cycle, and the shortage cost when demand cannot be satisfied from the shop floor immediately. Based on a renewal reward process, a per-unit-time expected cost model is developed and analysed. Under some mild conditions, it can be shown that the approximate cost function is convex. Computational experiments have demonstrated that the average reduction in total cost is significant when the proposed lot sizing policy is compared with those assuming deterministic demand.
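A hedged sketch of the renewal-reward construction behind the per-unit-time expected cost follows; the symbols are illustrative, not the paper's exact notation.

```latex
% Renewal-reward sketch: with set-up cost K, carrying cost C_h(Q),
% shortage cost C_s(Q) over one replenishment cycle of length T(Q),
% the long-run expected cost per unit time for lot size Q is
\[
  \mathrm{ECT}(Q) \;=\;
  \frac{\mathbb{E}\!\left[\,K + C_h(Q) + C_s(Q)\,\right]}
       {\mathbb{E}\!\left[\,T(Q)\,\right]} .
\]
```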
Okeno, Tobias O; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J
2013-01-01
A bio-economic model was developed to evaluate the utilisation of indigenous chickens (IC) under different production systems accounting for the risk attitude of the farmers. The model classified the production systems into three categories based on the level of management: free-range system (FRS), where chickens were left to scavenge for feed resources with no supplementation and healthcare; intensive system (IS), where the chickens were permanently confined and supplied with rationed feed and healthcare; and semi-intensive system (SIS), a hybrid of FRS and IS, where the chickens were partially confined, supplemented with rationed feeds, provided with healthcare and allowed to scavenge within the homestead or in runs. The model allows prediction of the live weights and feed intake at different stages in the life cycle of the IC and can compute the profitability of each production system using both traditional and risk-rated profit models. The input parameters used in the model represent a typical IC production system in developing countries but are flexible and therefore can be modified to suit specific situations and simulate profitability and costs of other poultry species production systems. The model has the capability to derive the economic values as changes in the genetic merit of the biological parameter results in marginal changes in profitability and costs of the production systems. The results suggested that utilisation of IC in their current genetic merit and production environment is more profitable under FRS and SIS but not economically viable under IS.
Benchmarking of Neutron Production of Heavy-Ion Transport Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence
Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.
Rostami, Ali A; Pithawalla, Yezdi B; Liu, Jianmin; Oldham, Michael J; Wagner, Karl A; Frost-Pineda, Kimberly; Sarkar, Mohamadi A
2016-08-16
Concerns have been raised in the literature for the potential of secondhand exposure from e-vapor product (EVP) use. It would be difficult to experimentally determine the impact of various factors on secondhand exposure including, but not limited to, room characteristics (indoor space size, ventilation rate), device specifications (aerosol mass delivery, e-liquid composition), and use behavior (number of users and usage frequency). Therefore, a well-mixed computational model was developed to estimate the indoor levels of constituents from EVPs under a variety of conditions. The model is based on physical and thermodynamic interactions between aerosol, vapor, and air, similar to indoor air models referred to by the Environmental Protection Agency. The model results agree well with measured indoor air levels of nicotine from two sources: smoking machine-generated aerosol and aerosol exhaled from EVP use. Sensitivity analysis indicated that increasing air exchange rate reduces room air level of constituents, as more material is carried away. The effect of the amount of aerosol released into the space due to variability in exhalation was also evaluated. The model can estimate the room air level of constituents as a function of time, which may be used to assess the level of non-user exposure over time.
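A minimal sketch of a well-mixed mass balance of the kind described follows, assuming a single constituent, an emission source averaged over puff events, and hypothetical room parameters.

```python
# Well-mixed room sketch: constituent concentration under an averaged
# emission source and ventilation. All parameter values are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

V = 50.0              # room volume, m^3
ach = 1.0             # air changes per hour (air exchange rate)
E = 0.1               # emission per puff event, mg
puffs_per_hour = 10.0

def dcdt(t, c):
    source = E * puffs_per_hour / V   # mg / (m^3 h), averaged emission
    return source - ach * c[0]        # ventilation removes material

sol = solve_ivp(dcdt, (0.0, 8.0), [0.0])  # 8-hour exposure window
print("end-of-window concentration (mg/m^3):", sol.y[0, -1])
# Doubling `ach` halves the steady level E*puffs/(V*ach), matching the
# sensitivity to air exchange rate reported above.
```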
Analysis hierarchical model for discrete event systems
NASA Astrophysics Data System (ADS)
Ciortea, E. M.
2015-11-01
This paper presents a hierarchical model based on discrete event networks for robotic systems. Following the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper using the Visual Object Net ++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models map readily onto general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete event systems are a pragmatic tool for modelling industrial systems, and Petri nets suit the system considered here because its events are discrete. To capture timing, the timed Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. The proposed simulation of the robotic system using timed Petri nets offers the opportunity to view the robot's timing behaviour: transport and transmission times obtained by spot measurements yield graphics showing the average time for each transport activity, using the parameter sets of individual finished products.
Cornelisse, C J; Hermens, W T; Joe, M T; Duijndam, W A; van Duijn, P
1976-11-01
A numerical method was developed for computing the steady-state concentration gradient of a diffusible enzyme reaction product in a membrane-limited compartment of a simplified theoretical cell model. In cytochemical enzyme reactions proceeding according to the metal-capture principle, the local concentration of the primary reaction product is an important factor in the onset of the precipitation process and in the distribution of the final reaction product. The following variables were incorporated into the model: enzyme activity, substrate concentration, Km, diffusion coefficient of substrate and product, particle radius and cell radius. The method was applied to lysosomal acid phosphatase. Numerical values for the variables were estimated from experimental data in the literature. The results show that the calculated phosphate concentrations inside lysosomes are several orders of magnitude lower than the critical concentrations for efficient phosphate capture found in a previous experimental model study. Reasons for this apparent discrepancy are discussed.
Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min
2016-12-20
Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpieces, the TIF has a nonlinear removal behavior, which causes a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate for the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs with different overhang ratios. The relative error of the new edge model can be reduced to 15%.
Match probabilities in a finite, subdivided population
Malaspinas, Anna-Sapfo; Slatkin, Montgomery; Song, Yun S.
2011-01-01
We generalize a recently introduced graphical framework to compute the probability that haplotypes or genotypes of two individuals drawn from a finite, subdivided population match. As in the previous work, we assume an infinite-alleles model. We focus on the case of a population divided into two subpopulations, but the underlying framework can be applied to a general model of population subdivision. We examine the effect of population subdivision on the match probabilities and the accuracy of the product rule, which approximates multi-locus match probabilities as a product of one-locus match probabilities. We quantify the deviation from predictions of the product rule by R, the ratio of the multi-locus match probability to the product of the one-locus match probabilities. We carry out the computation for two loci and find that ignoring subdivision can lead to underestimation of the match probabilities if the population under consideration actually has subdivision structure and the individuals originate from the same subpopulation. On the other hand, under a given model of population subdivision, we find that the ratio R for two loci is only slightly greater than 1 for a large range of symmetric and asymmetric migration rates. Keeping in mind that the infinite-alleles model is not the appropriate mutation model for STR loci, we conclude that, for two loci and biologically reasonable parameter values, population subdivision may lead to results that disfavor innocent suspects because of an increase in identity-by-descent in finite populations. On the other hand, for the same range of parameters, population subdivision does not lead to a substantial increase in linkage disequilibrium between loci. Those results are consistent with established practice. PMID:21266180
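In hedged notation, the deviation statistic described above is simply the following ratio.

```latex
% P_{12} is the two-locus match probability and P_1, P_2 the
% single-locus match probabilities under the same model;
% R = 1 exactly when the product rule is exact.
\[
  R \;=\; \frac{P_{12}}{P_1\,P_2}.
\]
```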
Socio-economic and climate change impacts on agriculture: an integrated assessment, 1990–2080
Fischer, Günther; Shah, Mahendra; N. Tubiello, Francesco; van Velhuizen, Harrij
2005-01-01
A comprehensive assessment of the impacts of climate change on agro-ecosystems over this century is developed, up to 2080 and at a global level, albeit with significant regional detail. To this end an integrated ecological–economic modelling framework is employed, encompassing climate scenarios, agro-ecological zoning information, socio-economic drivers, as well as world food trade dynamics. Specifically, global simulations are performed using the FAO/IIASA agro-ecological zone model, in conjunction with IIASA's global food system model, using climate variables from five different general circulation models, under four different socio-economic scenarios from the intergovernmental panel on climate change. First, impacts of different scenarios of climate change on bio-physical soil and crop growth determinants of yield are evaluated on a 5′×5′ latitude/longitude global grid; second, the extent of potential agricultural land and related potential crop production is computed. The detailed bio-physical results are then fed into an economic analysis, to assess how climate impacts may interact with alternative development pathways; key trends expected over this century for food demand, production, and trade, as well as key composite indices such as risk of hunger and malnutrition, are computed. This modelling approach connects the relevant bio-physical and socio-economic variables within a unified and coherent framework to produce a global assessment of food production and security under climate change. The results from the study suggest that critical impact asymmetries due to both climate and socio-economic structures may deepen current production and consumption gaps between the developed and developing world; it is suggested that adaptation of agricultural techniques will be central to limiting potential damages under climate change. PMID:16433094
Short time ahead wind power production forecast
NASA Astrophysics Data System (ADS)
Sapronova, Alla; Meissner, Catherine; Mana, Matteo
2016-09-01
An accurate prediction of wind power output is crucial for efficient coordination of cooperative energy production from different sources. Longer-horizon prediction (6 to 24 hours ahead) of wind power for onshore parks can be achieved by using a coupled model that bridges mesoscale weather prediction data and computational fluid dynamics. When a forecast for a shorter time horizon (less than one hour ahead) is needed, the accuracy of a predictive model that utilizes hourly weather data decreases, because the higher frequency fluctuations of the wind speed are lost when data is averaged over an hour. Since the wind speed can vary by up to 50% in magnitude over a period of 5 minutes, the higher frequency variations of wind speed and direction have to be taken into account for an accurate short-term energy production forecast. In this work a new model for wind power production forecasts 5 to 30 minutes ahead is presented. The model is based on machine learning techniques and a categorization approach, using historical park production time series and hourly numerical weather forecasts.
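A minimal sketch of this kind of short-horizon model follows, assuming synthetic 5-minute production data and an off-the-shelf regressor; the actual model also uses a categorization approach and weather features, which are omitted here.

```python
# Hedged sketch: forecast 30 minutes ahead from one hour of 5-minute
# production history (lagged values as features). Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
power = np.abs(np.sin(np.arange(2000) / 40.0)) + 0.1 * rng.standard_normal(2000)

lags, horizon = 12, 6   # 12 x 5-min lags as inputs, 6 steps (30 min) ahead
X = np.stack([power[i : i + lags] for i in range(len(power) - lags - horizon)])
y = power[lags + horizon - 1 : -1]  # value `horizon` steps after each window

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:1500], y[:1500])                 # train on the earlier history
print("held-out R^2:", model.score(X[1500:], y[1500:]))
```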
Analysis of dynamics and fit of diving suits
NASA Astrophysics Data System (ADS)
Mahnic Naglic, M.; Petrak, S.; Gersak, J.; Rolich, T.
2017-10-01
This paper presents research on the dynamic behaviour and fit analysis of customised diving suits. Diving suit models are developed using the 3D flattening method, which enables the construction of a garment model directly on the 3D computer body model and the separation of discrete 3D surfaces as well as their transformation into 2D cutting parts. 3D body scanning of male and female test subjects was performed for the purpose of body measurement analysis in static and dynamic postures, and the processed body models were used for the construction and simulation of diving suit prototypes. All parameters necessary for 3D simulation were applied to the obtained cutting parts, as well as the parameter values for the mechanical properties of the neoprene material. The developed computer diving suit prototypes were used for stretch analysis on areas relevant for body dimensional changes according to dynamic anthropometrics. Garment pressure against the body in static and dynamic conditions was also analysed. The garment patterns for which the computer prototype verification was conducted were used for real prototype production. The real prototypes were also used for stretch and pressure analysis in static and dynamic conditions. Based on the obtained results, a correlation analysis between body changes in dynamic positions and dynamic stress, determined on computer and real prototypes, was performed.
Computational methods for diffusion-influenced biochemical reactions.
Dobrzynski, Maciej; Rodríguez, Jordi Vidal; Kaandorp, Jaap A; Blom, Joke G
2007-08-01
We compare stochastic computational methods accounting for space and the discrete nature of reactants in biochemical systems. Implementations based on Brownian dynamics (BD) and the reaction-diffusion master equation are applied to a simplified gene expression model and to a signal transduction pathway in Escherichia coli. In the regime where the number of molecules is small and reactions are diffusion-limited, predicted fluctuations in the product number vary between the methods, while the average is the same. Computational approaches at the level of the reaction-diffusion master equation compute the same fluctuations as the reference result obtained from the particle-based method if the size of the sub-volumes is comparable to the diameter of the reactants. Using numerical simulations of reversible binding of a pair of molecules, we argue that the disagreement in predicted fluctuations is due to different modeling of inter-arrival times between reaction events. Simulations for a more complex biological study show that the different approaches lead to different results due to modeling issues. Finally, we present the physical assumptions behind the mesoscopic models for the reaction-diffusion systems. Input files for the simulations and the source code of GMP can be found under the following address: http://www.cwi.nl/projects/sic/bioinformatics2007/
Modeling methods for merging computational and experimental aerodynamic pressure data
NASA Astrophysics Data System (ADS)
Haderlie, Jacob C.
This research describes a process to model surface pressure data sets as a function of wing geometry from computational and wind tunnel sources and then merge them into a single predicted value. The described merging process will enable engineers to integrate these data sets with the goal of utilizing the advantages of each data source while overcoming the limitations of both; this provides a single, combined data set to support analysis and design. The main challenge with this process is accurately representing each data source everywhere on the wing. Additionally, this effort demonstrates methods to model wind tunnel pressure data as a function of angle of attack as an initial step towards a merging process that uses both location on the wing and flow conditions (e.g., angle of attack, flow velocity or Reynolds number) as independent variables. This surrogate model of pressure as a function of angle of attack can be useful for engineers who need to predict the location of zero-order discontinuities, e.g., flow separation or normal shocks. Because, to the author's best knowledge, there is no published, well-established merging method for aerodynamic pressure data (here, the coefficient of pressure Cp), this work identifies promising modeling and merging methods, and then makes a critical comparison of these methods. Surrogate models represent the pressure data for both data sets. Cubic B-spline surrogate models represent the computational simulation results. Machine learning and multi-fidelity surrogate models represent the experimental data. This research compares three surrogates for the experimental data (sequential--a.k.a. online--Gaussian processes, batch Gaussian processes, and multi-fidelity additive corrector) on the merits of accuracy and computational cost. The Gaussian process (GP) methods employ cubic B-spline CFD surrogates as a model basis function to build a surrogate model of the WT data, and this usage of the CFD surrogate in building the WT data could serve as a "merging" because the resulting WT pressure prediction uses information from both sources. In the GP approach, this model basis function concept seems to place more "weight" on the Cp values from the wind tunnel (WT) because the GP surrogate uses the CFD to approximate the WT data values. Conversely, the computationally inexpensive additive corrector method uses the CFD B-spline surrogate to define the shape of the spanwise distribution of the Cp while minimizing prediction error at all spanwise locations for a given arc length position; this, too, combines information from both sources to make a prediction of the 2-D WT-based Cp distribution, but the additive corrector approach gives more weight to the CFD prediction than to the WT data. Three surrogate models of the experimental data as a function of angle of attack are also compared for accuracy and computational cost. These surrogates are a single Gaussian process model (a single "expert"), product of experts, and generalized product of experts. The merging approach provides a single pressure distribution that combines experimental and computational data. The batch Gaussian process method provides a relatively accurate surrogate that is computationally acceptable, and can receive wind tunnel data from port locations that are not necessarily parallel to a variable direction.
On the other hand, the sequential Gaussian process and additive corrector methods must receive a sufficient number of data points aligned with one direction, e.g., from pressure port bands (tap rows) aligned with the freestream. The generalized product of experts best represents wind tunnel pressure as a function of angle of attack, but at higher computational cost than the single expert approach. The format of the application data from computational and experimental sources in this work precluded the merging process from including flow condition variables (e.g., angle of attack) among the independent variables, so the merging process is only conducted in the wing geometry variables of arc length and span. The merging process of Cp data allows a more "hands-off" approach to aircraft design and analysis (i.e., not as many engineers needed to debate the Cp distribution shape) and generates Cp predictions at any location on the wing. However, the costs that come with these benefits are engineer time (learning how to build surrogates), computational time in constructing the surrogates, and surrogate accuracy (surrogates introduce error into data predictions). This dissertation effort used the Trap Wing / First AIAA CFD High-Lift Prediction Workshop as a relevant transonic wing with a multi-element high-lift system, and this work identified that the batch GP model for the WT data and the B-spline surrogate for the CFD might best be combined using expert belief weights to describe Cp as a function of location on the wing element surface. (Abstract shortened by ProQuest.)
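A hedged sketch of the additive-corrector flavour of this merging: treat a smooth stand-in for the CFD B-spline surrogate as the trend and fit a Gaussian process to the wind-tunnel residuals, so the merged prediction follows the CFD shape while honouring the WT data. The surrogate function, data, and kernel below are illustrative assumptions, not the dissertation's actual surrogates.

```python
# Merge sketch: CFD surrogate as trend + GP fit to WT residuals.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def cfd_surrogate(s):                 # stand-in for the B-spline CFD fit
    return -1.5 * np.exp(-8.0 * (s - 0.3) ** 2)

s_wt = np.linspace(0.0, 1.0, 15)[:, None]        # WT port arc-lengths
cp_wt = cfd_surrogate(s_wt.ravel()) + 0.05 + 0.02 * np.sin(12 * s_wt.ravel())

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), alpha=1e-4)
gp.fit(s_wt, cp_wt - cfd_surrogate(s_wt.ravel()))  # model the residual only

s = np.linspace(0.0, 1.0, 200)
cp_merged = cfd_surrogate(s) + gp.predict(s[:, None])  # merged Cp estimate
```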
NASA Astrophysics Data System (ADS)
Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin
Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated, using open chain glucose (O-Glu)/closed chain glucose (A-Glu and B-Glu) and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu more efficient than O-Glu in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.
Challenges in Managing Trustworthy Large-scale Digital Science
NASA Astrophysics Data System (ADS)
Evans, B. J. K.
2017-12-01
The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, far in excess of the raw instrument data outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support how the information is managed reliably across distributed resources. Users necessarily rely on these underlying "black boxes" so that they can be productive and produce new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems, ranging from the fundamental reliability of the compute hardware, through system software stacks and libraries, to the model software itself. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of approach and robustness of methods over full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and the reliability with which previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway that are being used to address these issues.
Shabestary, Kiyan; Hudson, Elton P
2016-12-01
Chemical and fuel production by photosynthetic cyanobacteria is a promising technology but to date has not reached competitive rates and titers. Genome-scale metabolic modeling can reveal limitations in cyanobacteria metabolism and guide genetic engineering strategies to increase chemical production. Here, we used constraint-based modeling and optimization algorithms on a genome-scale model of Synechocystis PCC6803 to find ways to improve productivity of fermentative, fatty-acid, and terpene-derived fuels. OptGene and MOMA were used to find heuristics for knockout strategies that could increase biofuel productivity. OptKnock was used to find a set of knockouts that led to coupling between biofuel and growth. Our results show that high productivity of fermentation or reversed beta-oxidation derived alcohols such as 1-butanol requires elimination of NADH sinks, while terpenes and fatty-acid based fuels require creating imbalances in intracellular ATP and NADPH production and consumption. The FBA-predicted productivities of these fuels are at least 10-fold higher than those reported so far in the literature. We also discuss the physiological and practical feasibility of implementing these knockouts. This work gives insight into how cyanobacteria could be engineered to reach competitive biofuel productivities.
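A minimal sketch of the flux-balance step underlying such analyses follows, using the cobrapy package; the model file, objective reaction, and gene ID below are placeholders, and the paper's OptGene/OptKnock/MOMA runs are considerably more involved.

```python
# Hedged FBA sketch with cobrapy: maximise a product flux, then knock out
# a gene and re-optimise. Model file and identifiers are placeholders.
import cobra

model = cobra.io.read_sbml_model("synechocystis_placeholder.xml")
model.objective = "EX_butanol_e"          # hypothetical 1-butanol exchange
baseline = model.optimize().objective_value

with model:                                # changes revert on exiting the block
    model.genes.get_by_id("nadh_sink_gene").knock_out()  # hypothetical gene ID
    knocked = model.optimize().objective_value

print(f"baseline {baseline:.3f} vs knockout {knocked:.3f}")
```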
Computational Models Used to Assess US Tobacco Control Policies.
Feirman, Shari P; Glasser, Allison M; Rose, Shyanika; Niaura, Ray; Abrams, David B; Teplitskaya, Lyubov; Villanti, Andrea C
2017-11-01
Simulation models can be used to evaluate existing and potential tobacco control interventions, including policies. The purpose of this systematic review was to synthesize evidence from computational models used to project population-level effects of tobacco control interventions. We provide recommendations to strengthen simulation models that evaluate tobacco control interventions. Studies were eligible for review if they employed a computational model to predict the expected effects of a non-clinical US-based tobacco control intervention. We searched five electronic databases on July 1, 2013 with no date restrictions and synthesized studies qualitatively. Six primary non-clinical intervention types were examined across the 40 studies: taxation, youth prevention, smoke-free policies, mass media campaigns, marketing/advertising restrictions, and product regulation. Simulation models demonstrated the independent and combined effects of these interventions on decreasing projected future smoking prevalence. Taxation effects were the most robust, as studies examining other interventions exhibited substantial heterogeneity with regard to the outcomes and specific policies examined across models. Models should project the impact of interventions on overall tobacco use, including nicotine delivery product use, to estimate preventable health and cost-saving outcomes. Model validation, transparency, more sophisticated models, and modeling policy interactions are also needed to inform policymakers to make decisions that will minimize harm and maximize health. In this systematic review, evidence from multiple studies demonstrated the independent effect of taxation on decreasing future smoking prevalence, and models for other tobacco control interventions showed that these strategies are expected to decrease smoking, benefit population health, and are reasonable to implement from a cost perspective. Our recommendations aim to help policymakers and researchers minimize harm and maximize overall population-level health benefits by considering the real-world context in which tobacco control interventions are implemented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, P.; Johannes, J.; Kudriavtsev, V.
The use of computational modeling to improve equipment and process designs for chemical vapor deposition (CVD) reactors is becoming increasingly common. Commercial codes are available that facilitate the modeling of chemically-reacting flows, but chemical reaction mechanisms must be separately developed for each system of interest. One of the products of the Watkins-Johnson Company (WJ) is a reactor marketed to semiconductor manufacturers for the atmospheric-pressure chemical vapor deposition (APCVD) of silicon oxide films. In this process, TEOS (tetraethoxysilane, Si(OC{sub 2}H{sub 5}){sub 4}) and ozone (O{sub 3}) are injected (in nitrogen and oxygen carrier gases) over hot silicon wafers that are being carried through the system on a moving belt. As part of their equipment improvement process, WJ is developing computational models of this tool. In this effort, they are collaborating with Sandia National Laboratories (SNL) to draw on Sandia's experience base in understanding and modeling the chemistry of CVD processes.
Quantum Computation using Arrays of N Polar Molecules in Pendular States.
Wei, Qi; Cao, Yudong; Kais, Sabre; Friedrich, Bretislav; Herschbach, Dudley
2016-11-18
We investigate several aspects of realizing quantum computation using entangled polar molecules in pendular states. Quantum algorithms typically start from a product state |00⋯0⟩ and we show that up to a negligible error, the ground states of polar molecule arrays can be considered as the unentangled qubit basis state |00⋯0⟩. This state can be prepared by simply allowing the system to reach thermal equilibrium at low temperature (<1 mK). We also evaluate entanglement, characterized by concurrence of pendular state qubits in dipole arrays as governed by the external electric field, dipole-dipole coupling and number N of molecules in the array. In the parameter regime that we consider for quantum computing, we find that qubit entanglement is modest, typically no greater than 10^-4, confirming the negligible entanglement in the ground state. We discuss methods for realizing quantum computation in the gate model, measurement-based model, instantaneous quantum polynomial time circuits and the adiabatic model using polar molecules in pendular states. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
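As a hedged aside, the concurrence of a two-qubit pure state mentioned above can be evaluated directly from its standard formula; the state below is a synthetic nearly-product state, qualitatively like the regime described.

```python
# Concurrence of a two-qubit pure state: C = |<psi| sigma_y⊗sigma_y |psi*>|.
import numpy as np

sy = np.array([[0, -1j], [1j, 0]])

def concurrence(psi: np.ndarray) -> float:
    psi = psi / np.linalg.norm(psi)
    return abs(psi.conj() @ np.kron(sy, sy) @ psi.conj())

eps = 1e-2
psi = np.array([1.0, 0.0, 0.0, eps])  # nearly unentangled |00> + eps|11>
print(concurrence(psi))  # ~2*eps/(1+eps^2): small, as in the regime above
```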
A Primer on High-Throughput Computing for Genomic Selection
Wu, Xiao-Lin; Beissinger, Timothy M.; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J. M.; Weigel, Kent A.; Gatti, Natalia de Leon; Gianola, Daniel
2011-01-01
High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin–Madison, which can be leveraged for genomic selection in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized genetic gain). Eventually, HTC may change our view of data analysis as well as decision-making in the post-genomic era of selection programs in animals and plants, or in the study of complex diseases in humans. PMID:22303303
NASA Astrophysics Data System (ADS)
Hadi, M. Z.; Djatna, T.; Sugiarto
2018-04-01
This paper develops a dynamic storage assignment model to solve the storage assignment problem (SAP) for beverage order picking in a drive-in rack warehousing system, determining the appropriate storage location and space for each beverage product dynamically so that the performance of the system can be improved. The study constructs a graph model to represent the drive-in rack storage positions and then combines association rules mining, class-based storage policies and an arrangement rule algorithm to determine an appropriate storage location and arrangement of the products according to dynamic orders from customers. The performance of the proposed model is measured in terms of rule adjacency accuracy, travel distance (for the picking process) and the probability that a product becomes expired, using a Last Come First Serve (LCFS) queue approach. Finally, the proposed model is implemented through computer simulation and the performance of different storage assignment methods is compared as well. The results indicate that the proposed model outperforms the other storage assignment methods.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...
Code of Federal Regulations, 2012 CFR
2012-10-01
... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...
Palmer, Tim N.; O’Shea, Michael
2015-01-01
How is the brain configured for creativity? What is the computational substrate for ‘eureka’ moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal (ultimately quantum decoherent) noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete. PMID:26528173
Establishing an IERS Sub-Center for Ocean Angular Momentum
NASA Technical Reports Server (NTRS)
Ponte, Rui M.
2001-01-01
With the objective of establishing the Special Bureau for the Oceans (SBO), a new archival center for ocean angular momentum (OAM) products, we have computed and analyzed a number of OAM products from several ocean models, with and without data assimilation. All three components of OAM (axial term related to length of day variations and equatorial terms related to polar motion) have been examined in detail, in comparison to the respective Earth rotation parameters. An 11+ year time series of OAM given at 5-day intervals has been made publicly available. Other OAM products spanning longer periods and with higher temporal resolution, as well as products calculated from ocean model/data assimilation systems, have been prepared and should become part of the SBO archives in the near future.
Application of mathematical models and computation in plant metabolomics
USDA-ARS?s Scientific Manuscript database
The investigation and reporting of plants’ chemical constituents has greatly evolved over the centuries of natural products and phytochemical research. Starting from the extraction and identification of plant-based bioactive components, such as historical salicin or more recent paclitaxel, phytochem...
Proceedings of the Thirteenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1988-01-01
Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.
Model building techniques for analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald
2009-09-01
The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on the overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques that come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.
Towards a C2 Poly-Visualization Tool: Leveraging the Power of Social-Network Analysis and GIS
2011-06-01
from Magsino. AutoMap, a product of CASOS at Carnegie Mellon University, is a text-mining tool that enables the extraction of network data from...enables community leaders to prepare for biological attacks using computational models. BioWar is a CASOS package that combines many factors into a...models, demographically accurate agent models, wind dispersion models, and an error-diagnostic model. Construct, also developed by CASOS, is a
The FITS model office ergonomics program: a model for best practice.
Chim, Justine M Y
2014-01-01
An effective office ergonomics program can predict positive results in reducing musculoskeletal injury rates, enhancing productivity, and improving staff well-being and job satisfaction. Its objective is to provide a systematic solution to manage the potential risk of musculoskeletal disorders among computer users in an office setting. To this end, the FITS Model Office Ergonomics Program has been developed, drawing on the legislative requirements for promoting the health and safety of workers using computers for extended periods as well as on previous research findings. The Model is developed according to practical industrial knowledge in ergonomics, occupational health and safety management, and human resources management in Hong Kong and overseas. This paper proposes a comprehensive office ergonomics program, the FITS Model, which considers (1) Furniture Evaluation and Selection; (2) Individual Workstation Assessment; (3) Training and Education; (4) Stretching Exercises and Rest Breaks as elements of an effective program. An experienced ergonomics practitioner should be included in the program design and implementation. Through the FITS Model Office Ergonomics Program, the risk of musculoskeletal disorders among computer users can be eliminated or minimized, and workplace health and safety and employees' wellness enhanced.
A tree-parenchyma coupled model for lung ventilation simulation.
Pozin, Nicolas; Montesantos, Spyridon; Katz, Ira; Pichelin, Marine; Vignon-Clementel, Irene; Grandmont, Céline
2017-11-01
In this article, we develop a lung ventilation model. The parenchyma is described as an elastic homogenized medium. It is irrigated by a space-filling dyadic resistive pipe network, which represents the tracheobronchial tree. In this model, the tree and the parenchyma are strongly coupled. The tree induces an extra viscous term in the system constitutive relation, which leads, in the finite element framework, to a full matrix. We consider an efficient algorithm that takes advantage of the tree structure to enable a fast matrix-vector product computation. This framework can be used to model both free and mechanically induced respiration, in health and disease. Patient-specific lung geometries acquired from computed tomography scans are considered. Realistic Dirichlet boundary conditions can be deduced from surface registration on computed tomography images. The model is compared to a more classical exit compartment approach. Results illustrate the coupling between the tree and the parenchyma, at global and regional levels, and how conditions for the purely 0D model can be inferred. Different types of boundary conditions are tested, including a nonlinear Robin model of the surrounding lung structures. Copyright © 2017 John Wiley & Sons, Ltd.
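A hedged sketch of the fast matrix-vector product such a tree permits: assuming the coupling matrix has the generic form A_ij = (sum of edge resistances shared by the root paths of leaves i and j), which may differ from the paper's exact operator, one upward and one downward tree pass replace the dense O(N^2) product with O(N) work.

```python
# Fast tree matvec: (A x)_i = sum over edges e on the root path of leaf i
# of R_e * (sum of x over leaves below e). Perfect dyadic tree assumed.
import numpy as np

depth = 10                       # perfect binary tree, 2**depth leaves
n = 2 ** depth
R = [np.ones(2 ** d) for d in range(1, depth + 1)]  # per-edge resistances

def tree_matvec(x):
    # upward pass: leaf sums below every edge, level by level
    sums, s = [], x.copy()
    for d in range(depth, 0, -1):
        sums.append((d, s))
        s = s.reshape(-1, 2).sum(axis=1)     # merge sibling subtrees
    # downward pass: accumulate R_e * subtree_sum along each root path
    y = np.zeros(n)
    for d, s in sums:
        contrib = R[d - 1] * s               # one value per edge at level d
        y += np.repeat(contrib, n // 2 ** d) # broadcast to the leaves below
    return y

x = np.random.default_rng(2).standard_normal(n)
y = tree_matvec(x)               # O(N) evaluation of the full matrix action
```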
Using Reconstructed POD Modes as Turbulent Inflow for LES Wind Turbine Simulations
NASA Astrophysics Data System (ADS)
Nielson, Jordan; Bhaganagar, Kiran; Juttijudata, Vejapong; Sirisup, Sirod
2016-11-01
Currently, in order to capture realistic atmospheric turbulence effects, wind turbine LES simulations require computationally expensive precursor simulations. At times, the precursor simulation is more computationally expensive than the wind turbine simulation itself. The precursor simulations are important because they capture turbulence in the atmosphere and, as stated above, turbulence impacts the power production estimation. On the other hand, POD analysis has been shown to be capable of capturing turbulent structures. The current study was performed to determine the plausibility of using lower dimension models from POD analysis of LES simulations as turbulent inflow to wind turbine LES simulations. The study will aid the wind energy community by lowering the computational cost of full scale wind turbine LES simulations, while maintaining a high level of turbulent information and being able to quickly apply the turbulent inflow to multi-turbine wind farms. This is done by comparing a pure LES precursor wind turbine simulation with simulations that use reduced POD mode inflow conditions. The study shows the feasibility of using lower dimension models as turbulent inflow for LES wind turbine simulations. Overall, the power production estimation and velocity field of the wind turbine wake are well captured with small errors.
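A minimal sketch of the POD step behind "reduced POD mode inflow": assemble snapshots, subtract the mean, take an SVD, and keep the leading modes. The data below are synthetic.

```python
# POD via SVD: snapshot matrix -> modes -> truncated reconstruction.
import numpy as np

rng = np.random.default_rng(3)
n_points, n_snapshots = 500, 200
U_flow = rng.standard_normal((n_points, 3)) @ rng.standard_normal((3, n_snapshots))
U_flow += 0.01 * rng.standard_normal((n_points, n_snapshots))  # noise

mean = U_flow.mean(axis=1, keepdims=True)
phi, sigma, vt = np.linalg.svd(U_flow - mean, full_matrices=False)

r = 3                                                # retain r energetic modes
U_reduced = mean + phi[:, :r] * sigma[:r] @ vt[:r]   # low-order inflow field
energy = (sigma[:r] ** 2).sum() / (sigma ** 2).sum()
print(f"captured energy fraction with {r} modes: {energy:.4f}")
```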
Computer-Aided Sensor Development Focused on Security Issues.
Bialas, Andrzej
2016-05-26
The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.
Plis, Sergey M; George, J S; Jun, S C; Paré-Blagoev, J; Ranken, D M; Wood, C C; Schmidt, D M
2007-01-01
We propose a new model to approximate spatiotemporal noise covariance for use in neural electromagnetic source analysis, which better captures temporal variability in background activity. As with other existing formalisms, our model employs a Kronecker product of matrices representing temporal and spatial covariance. In our model, spatial components are allowed to have differing temporal covariances. Variability is represented as a series of Kronecker products of spatial component covariances and corresponding temporal covariances. Unlike previous attempts to model covariance through a sum of Kronecker products, our model is designed to have a computationally manageable inverse. Despite increased descriptive power, inversion of the model is fast, making it useful in source analysis. We have explored two versions of the model. One is estimated based on the assumption that spatial components of background noise have uncorrelated time courses. Another version, which gives a closer approximation, is based on the assumption that the time courses are statistically independent. The accuracy of the structural approximation is compared to an existing model, based on a single Kronecker product, using both the Frobenius norm of the difference between the spatiotemporal sample covariance and each model, and scatter plots. The performance of our model and of previous models is compared in source analysis of a large number of single-dipole problems with simulated time courses and with background from authentic magnetoencephalography data.
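The structural idea behind a computationally manageable inverse can be illustrated with a small sketch. Assuming (as one plausible reading of the abstract, not the authors' stated construction) that the spatial components form a complete orthonormal set, a covariance of the form C = Σ_i (u_i u_iᵀ) ⊗ T_i has the same-shaped inverse C⁻¹ = Σ_i (u_i u_iᵀ) ⊗ T_i⁻¹, so only the small temporal blocks ever need inverting. All dimensions and names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_space, n_time = 4, 3
n_comp = n_space  # complete set of spatial components

# Orthonormal spatial components: columns of a random orthogonal matrix.
Q, _ = np.linalg.qr(rng.standard_normal((n_space, n_space)))

def random_spd(n):
    """Random symmetric positive-definite matrix (a stand-in temporal covariance)."""
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

temporal = [random_spd(n_time) for _ in range(n_comp)]

# Covariance as a sum of Kronecker products of spatial projectors
# with per-component temporal covariances.
C = sum(np.kron(np.outer(Q[:, i], Q[:, i]), temporal[i]) for i in range(n_comp))

# Structured inverse: same spatial projectors, inverted temporal blocks.
C_inv = sum(np.kron(np.outer(Q[:, i], Q[:, i]), np.linalg.inv(temporal[i]))
            for i in range(n_comp))

# Verify that the structured inverse really inverts C.
assert np.allclose(C @ C_inv, np.eye(n_space * n_time))
```

The cost of the inverse is n_comp inversions of n_time x n_time matrices instead of one inversion of an (n_space * n_time) x (n_space * n_time) matrix, which is what makes such a model fast enough for source analysis.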
Development of climate data storage and processing model
NASA Astrophysics Data System (ADS)
Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.
2016-11-01
We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes on local and regional scales. The model is based on a "shared nothing" distributed computing architecture and assumes a computing network where each computing node is independent and self-sufficient. Each node hosts dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data is represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for development of a high-performance virtual research environment for climatic and environmental monitoring.
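As a hedged sketch of the third speed-up approach (caching and reuse of previously obtained products), the snippet below computes a derived product from a netCDF file and reuses a cached copy on subsequent calls; the cache location, the monthly_mean function, and the hashing scheme are hypothetical illustrations, not components of the system described in the paper.

```python
import os
import hashlib
import numpy as np
import netCDF4  # standard Python bindings for netCDF files

CACHE_DIR = "/tmp/vre_cache"  # illustrative cache location

def monthly_mean(nc_path, var_name):
    """Return the time mean of a variable, reusing a cached result if present."""
    key = hashlib.sha1(f"{nc_path}:{var_name}:mean".encode()).hexdigest()
    cache_file = os.path.join(CACHE_DIR, key + ".npy")
    if os.path.exists(cache_file):            # reuse a previously obtained product
        return np.load(cache_file)
    with netCDF4.Dataset(nc_path) as ds:      # read the source netCDF file
        result = np.asarray(ds.variables[var_name][:]).mean(axis=0)
    os.makedirs(CACHE_DIR, exist_ok=True)
    np.save(cache_file, result)               # store the product for later reuse
    return result
```

In a shared-nothing deployment each node would keep its own cache of the products it computes, with the metadata database recording which node holds which product.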
NASA Technical Reports Server (NTRS)
Vorosmarty, C.; Grace, A.; Moore, B.; Choudhury, B.; Willmott, C. J.
1990-01-01
A strategy is presented for integrating scanning multichannel microwave radiometer data from the Nimbus-7 satellite with meteorological station records and computer simulations of land surface hydrology, terrestrial nutrient cycling, and trace gas emission. Analysis of the observations, together with radiative transfer modeling, shows that in the tropics the temporal and spatial variations of the polarization difference are determined primarily by the structure and phenology of vegetation and by seasonal inundations of major rivers and wetlands. It is concluded that the proposed surface hydrology model, along with climatological records and, potentially, 37-GHz data for phenology, will provide inputs to a terrestrial ecosystem model that predicts regional net primary production and CO2 gas exchange.
New statistical scission-point model to predict fission fragment observables
NASA Astrophysics Data System (ADS)
Lemaître, Jean-François; Panebianco, Stefano; Sida, Jean-Luc; Hilaire, Stéphane; Heinrich, Sophie
2015-09-01
The development of high-performance computing facilities makes massive production of nuclear data possible within a fully microscopic framework. Taking advantage of individual potential calculations for more than 7000 nuclei, a new statistical scission-point model, called SPY, has been developed. It gives access to the absolute available energy at the scission point, which allows a parameter-free microcanonical statistical description to be used to calculate the distributions and mean values of all fission observables. SPY exploits the richness of microscopic inputs within a rather simple theoretical framework, with no parameter other than the scission-point definition, to draw clear answers based on complete knowledge of the ingredients involved in the model, at very limited computing cost.
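As a hedged illustration of what a microcanonical scission-point description entails (the abstract does not give the precise SPY expressions), the probability of a given fragmentation is typically taken proportional to the number of states available to the two nascent fragments at the absolute available energy:

\[
P(A_1, Z_1; A_2, Z_2) \;\propto\; \int_0^{E_{\mathrm{avail}}} \rho_1(\varepsilon)\,\rho_2\!\left(E_{\mathrm{avail}} - \varepsilon\right)\,\mathrm{d}\varepsilon ,
\]

where \(\rho_{1,2}\) are the level densities of the two fragments and \(E_{\mathrm{avail}}\) is the available energy at scission. With \(E_{\mathrm{avail}}\) fixed by the microscopic potential calculations, no free parameter enters the statistical weights, which is the sense in which the description is parameter-free.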
A thermodynamic approach to obtain materials properties for engineering applications
NASA Technical Reports Server (NTRS)
Chang, Y. Austin
1993-01-01
With the ever-increasing capabilities of computers for numerical computation, we are on the verge of using these tools to model manufacturing processes, improving both the efficiency of the processes and the quality of the products. One such process is casting for the production of metals. However, in order to model metal casting processes in a meaningful way, it is essential to have the basic properties of these materials in the molten state, the solid state, and the mixed solid-liquid state. Some of the properties needed may be considered intrinsic, such as the density, heat capacity, or enthalpy of freezing of a pure metal, while others are not. For instance, the enthalpy of solidification of an alloy is not a defined thermodynamic quantity; its value depends on the micro-segregation of the phases during the course of solidification. The objective of the present study is to present a thermodynamic approach to obtain some of the intrinsic properties, and to combine thermodynamics with kinetic models to estimate quantities such as the enthalpy of solidification of an alloy.
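To illustrate why the enthalpy of solidification depends on micro-segregation, consider one classical kinetic idealization, the Scheil-Gulliver model (named here only as an example; the abstract does not specify which kinetic model is combined with the thermodynamics). It assumes no diffusion in the solid and complete mixing in the liquid, giving the solid composition at the solid-liquid interface as

\[
C_s = k\,C_0\,(1 - f_s)^{\,k-1},
\]

where \(C_0\) is the nominal alloy composition, \(f_s\) the solid fraction, and \(k\) the partition coefficient. Different micro-segregation assumptions (for example, the equilibrium lever rule versus Scheil) predict different solidification paths, and the enthalpy released, obtained by integrating over the predicted path, differs accordingly; hence it is not a single defined thermodynamic quantity.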
NASA Astrophysics Data System (ADS)
Myatt, J. F.; Shaw, J. G.; Solodov, A. A.; Maximov, A. V.; Short, R. W.; Seka, W.; Follett, R. K.; Edgell, D. H.; Froula, D. H.; Goncharov, V. N.
2015-11-01
Hot-electron preheat, caused by laser-plasma instabilities, can impair the performance of inertial confinement fusion implosions. It is therefore imperative to understand the processes that generate hot electrons and to design mitigation strategies should preheat prove excessive at the ignition scale (laser-plasma interactions do not follow hydrodynamic scaling). For this purpose, a new 3-D model [the laser-plasma simulation environment (LPSE)] has been constructed that computes hot-electron generation in direct-drive plasmas based on the assumption that two-plasmon decay (TPD) is the dominant hot-electron-producing instability. It uses an established model of TPD-driven turbulence together with a new GPU-based hybrid particle method for hot-electron production. The time-dependent hot-electron power, total energy, and energy spectrum are computed and compared with data from recent OMEGA implosion experiments that have sought to mitigate TPD through the use of multilayered (mid-Z) ablators. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.
ERIC Educational Resources Information Center
Porter, Lon A., Jr.; Chapman, Cole A.; Alaniz, Jacob A.
2017-01-01
In this work, a versatile and user-friendly selection of stereolithography (STL) files and computer-aided design (CAD) models is shared to assist educators and students in the production of simple and inexpensive 3D printed filter fluorometer instruments. These devices are effective resources for supporting active learners in the exploration of…
ERIC Educational Resources Information Center
Tian, Wei; Cai, Li; Thissen, David; Xin, Tao
2013-01-01
In item response theory (IRT) modeling, the item parameter error covariance matrix plays a critical role in statistical inference procedures. When item parameters are estimated using the EM algorithm, the parameter error covariance matrix is not an automatic by-product of item calibration. Cai proposed the use of the Supplemented EM algorithm for…